WorldWideScience

Sample records for emsl bgc modeling

  1. BOREAS RSS-08 BIOME-BGC Model Simulations at Tower Flux Sites in 1994

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: BIOME-BGC is a general ecosystem process model designed to simulate biogeochemical and hydrologic processes across multiple scales. BIOME-BGC is used to...

  2. BOREAS RSS-08 BIOME-BGC Model Simulations at Tower Flux Sites in 1994

    Data.gov (United States)

    National Aeronautics and Space Administration — BIOME-BGC is a general ecosystem process model designed to simulate biogeochemical and hydrologic processes across multiple scales. BIOME-BGC is used to estimate...

  3. PnET-BGC: Modeling Biogeochemical Processes in a Northern Hardwood Forest Ecosystem

    Data.gov (United States)

    National Aeronautics and Space Administration — This archived model product contains the directions, executables, and procedures for running PnET-BGC to recreate the results of: Gbondo-Tugbawa, S.S., C.T. Driscoll...

  4. PnET-BGC: Modeling Biogeochemical Processes in a Northern Hardwood Forest Ecosystem

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This archived model product contains the directions, executables, and procedures for running PnET-BGC to recreate the results of: Gbondo-Tugbawa, S.S.,...

  5. Biome-BGC: Modeling Effects of Disturbance and Climate (Thornton et al. 2002)

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This archived model product contains the directions, executables, and procedures for running Biome-BGC, Version 4.1.1, to recreate the results of the...

  6. Biome-BGC: Modeling Carbon Dynamics in Ponderosa Pine Stands (Law et al. 2003)

    Data.gov (United States)

    National Aeronautics and Space Administration — This archived model product contains the directions, executables, and procedures for running Biome-BGC, Version 4.1.2, to recreate the results of the following...

  7. Biome-BGC: Modeling Effects of Disturbance and Climate (Thornton et al. 2002)

    Data.gov (United States)

    National Aeronautics and Space Administration — This archived model product contains the directions, executables, and procedures for running Biome-BGC, Version 4.1.1, to recreate the results of the following...

  8. Biome-BGC: Modeling Carbon Dynamics in Ponderosa Pine Stands (Law et al. 2003)

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This archived model product contains the directions, executables, and procedures for running Biome-BGC, Version 4.1.2, to recreate the results of the...

  9. NACP Biome-BGC Modeled Ecosystem Carbon Balance, Pacific Northwest, USA, 1986-2010

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides Biome-BGC modeled estimates of carbon stocks and fluxes in the U.S. Pacific Northwest for the years 1986-2010. Fluxes include net ecosystem...

  10. The UK Earth System Models Marine Biogeochemical Evaluation Toolkit, BGC-val

    Science.gov (United States)

    de Mora, Lee

    2017-04-01

    The Biogeochemical Validation toolkit, BGC-val, is a model- and grid-independent, Python-based marine model evaluation framework that automates much of the validation of the marine component of an Earth System Model. BGC-val was initially developed as a flexible and extensible system to evaluate the spin-up of the marine UK Earth System Model (UKESM). However, its grid independence and flexibility make it straightforward to adapt the BGC-val framework to evaluate other marine models. In addition to the marine component of the UKESM, this toolkit has been adapted to compare multiple models, including models from the CMIP5 and iMarNet inter-comparison projects. The BGC-val toolkit produces multiple levels of analysis, which are presented in an easy-to-use interactive HTML5 document. Level 1 contains time series analyses showing the development over time of many important biogeochemical and physical ocean metrics, such as global primary production or the Drake Passage current. The second level of BGC-val is an in-depth spatial analysis at a single point in time: a series of point-to-point comparisons of model and data in various regions, such as a comparison of surface nitrate in the model against data from the World Ocean Atlas. The third-level analyses are specialised ad hoc packages that go in depth on a specific question, such as the development of oxygen minimum zones in the Equatorial Pacific. In addition to the three levels, the HTML5 document opens with a Level 0 table showing a summary of the status of the model run. The beta version of this toolkit is available via the Plymouth Marine Laboratory GitLab server and uses the BSD 3-clause license.
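
    At its core, a Level 2 point-to-point comparison of this kind reduces to matching model values with observations and summarising the mismatch. The short Python sketch below is only an illustration of that idea (the arrays, units and function names are hypothetical, not BGC-val's actual API), showing the kind of bias/RMSE/correlation summary such a comparison produces for, say, surface nitrate against World Ocean Atlas values.

      # Minimal sketch of a point-to-point model/data comparison
      # (hypothetical arrays and names; not the BGC-val API itself).
      import numpy as np

      def point_to_point_stats(model, obs):
          """Return bias, RMSE and correlation for matched model/obs values."""
          model = np.asarray(model, dtype=float)
          obs = np.asarray(obs, dtype=float)
          mask = np.isfinite(model) & np.isfinite(obs)   # drop missing pairs
          m, o = model[mask], obs[mask]
          bias = np.mean(m - o)
          rmse = np.sqrt(np.mean((m - o) ** 2))
          corr = np.corrcoef(m, o)[0, 1]
          return bias, rmse, corr

      # Example: surface nitrate (mmol N m-3) at a handful of matched points.
      model_no3 = [0.1, 2.4, 5.8, 11.2, 0.3]
      woa_no3 = [0.2, 2.0, 6.5, 10.4, 0.1]
      print(point_to_point_stats(model_no3, woa_no3))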

  11. Modelling the carbon budget of intensive forest monitoring sites in Germany using the simulation model BIOME-BGC

    OpenAIRE

    Jochheim, H.; Puhlmann, M.; Beese, F.; Berthold, D.; Einert, P.; Kallweit, R.; Konopatzky, A.; Meesenburg, H.; Meiwes, K.-J.; Raspe, S.; Schulte-Bisping, H.; Schulz, C.

    2008-01-01

    It is shown that by calibrating the simulation model BIOME-BGC with mandatory and optional Level II data within the ICP Forests programme, a well-founded calculation of the carbon budget of forest stands is achievable and that, following successful calibration, the modified BIOME-BGC model is a useful tool for assessing the effect of climate change on forest ecosystems.

  12. Development of the BIOME-BGC model for the simulation of managed Moso bamboo forest ecosystems.

    Science.gov (United States)

    Mao, Fangjie; Li, Pingheng; Zhou, Guomo; Du, Huaqiang; Xu, Xiaojun; Shi, Yongjun; Mo, Lufeng; Zhou, Yufeng; Tu, Guoqing

    2016-05-01

    Numerical models are the most appropriate instruments for analysing the carbon balance of terrestrial ecosystems and their interactions with changing environmental conditions. The process-based model BIOME-BGC is widely used to simulate the carbon balance within the vegetation, litter and soil of unmanaged ecosystems. For Moso bamboo forests, however, simulations with BIOME-BGC are inaccurate in terms of the growing season and the carbon allocation, owing to the oversimplified representation of phenology. Our aim was to improve the applicability of BIOME-BGC to managed Moso bamboo forest ecosystems by implementing several new modules covering phenology, carbon allocation, and management. Instead of the simple phenology and carbon allocation representations of the original version, a periodic Moso bamboo phenology and carbon allocation module was implemented, which handles the processes of Moso bamboo shooting and high growth during "on-years" and "off-years". Four management modules (digging of bamboo shoots, selective cutting, obtruncation, fertilization) were integrated in order to quantify the functioning of managed ecosystems. The improved model was calibrated and validated using eddy covariance measurements collected at a managed Moso bamboo forest site (Anji) during 2011-2013. As a result of these developments and calibrations, the performance of the model was substantially improved: for the measured and modeled fluxes (gross primary production, total ecosystem respiration, net ecosystem exchange), relative errors decreased by 42.23%, 103.02% and 18.67%, respectively.
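
    The calibration described above is judged by how far the simulated fluxes depart from the eddy covariance observations. The exact relative-error formulation used in the paper is not reproduced here; the following is a hedged Python sketch of one common definition, with invented values.

      # Illustrative relative-error metric for model evaluation
      # (invented values; not the paper's data or exact formula).
      import numpy as np

      def relative_error(simulated, observed):
          """Mean absolute simulation error as a percentage of the mean
          observed magnitude (one of several possible definitions)."""
          sim = np.asarray(simulated, dtype=float)
          obs = np.asarray(observed, dtype=float)
          return 100.0 * np.mean(np.abs(sim - obs)) / np.mean(np.abs(obs))

      # Daily GPP (g C m-2 d-1): observed vs. modelled, purely illustrative.
      obs_gpp = np.array([3.1, 4.0, 5.2, 6.8, 7.5])
      sim_gpp = np.array([2.8, 4.4, 5.0, 7.6, 7.1])
      print(f"relative error: {relative_error(sim_gpp, obs_gpp):.1f}%")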

  13. EMSL Operations Manual

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Nancy S.

    2009-06-18

    This manual is a general resource tool to assist EMSL users and Laboratory staff in locating official policy, practice, and subject matter experts within EMSL. It is not intended to replace or amend any formal Battelle policy or practice. Users of this manual should rely only on Battelle’s Standard Based Management System (SBMS) for official policy. No contractual commitment or right of any kind is created by this manual. Battelle management reserves the right to alter, change, or delete any information contained within this manual without prior notice.

  14. Structural development and web service based sensitivity analysis of the Biome-BGC MuSo model

    Science.gov (United States)

    Hidy, Dóra; Balogh, János; Churkina, Galina; Haszpra, László; Horváth, Ferenc; Ittzés, Péter; Ittzés, Dóra; Ma, Shaoxiu; Nagy, Zoltán; Pintér, Krisztina; Barcza, Zoltán

    2014-05-01

    Studying greenhouse gas exchange, and in particular the carbon dioxide sink and source character of ecosystems, remains a highly relevant research topic in biogeochemistry. During the past few years research has focused on managed ecosystems, because human intervention plays an important role in shaping the land surface through agricultural management, land use change, and other practices. In spite of considerable developments, current biogeochemical models still carry uncertainties that limit their ability to adequately quantify greenhouse gas exchange processes of managed ecosystems. It is therefore an important task to develop and test process-based biogeochemical models. Biome-BGC is a widely used, popular biogeochemical model that simulates the storage and flux of water, carbon, and nitrogen between the ecosystem and the atmosphere, and within the components of terrestrial ecosystems. Biome-BGC was originally developed by the Numerical Terradynamic Simulation Group (NTSG) of the University of Montana (http://www.ntsg.umt.edu/project/biome-bgc), and several other researchers have used and modified it in the past. Our research group developed Biome-BGC version 4.1.1 further to substantially improve the ability of the model to simulate the carbon and water cycles of real managed ecosystems. The modifications included structural improvements of the model (e.g., implementation of a multilayer soil module and drought-related plant senescence, and improved model phenology). Besides these improvements, management modules and annually varying options were introduced (simulating mowing, grazing, planting, harvest, ploughing, fertilizer application, and forest thinning). Dynamic (annually varying) whole-plant mortality was also enabled in the model to support more realistic simulation of forest stand development and natural disturbances. In the most recent model version, separate pools have been defined for fruit. The model version that contains all former and new developments is referred to as Biome-BGC MuSo (Biome-BGC

  15. EMSL Outlook Review 2005

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Allison A.

    2005-04-01

    The William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) is a national user facility that contains state-of-the-art instrumentation and expert resources available for use by researchers from academia, industry, and the national laboratory system. The facility is supported by the U.S. Department of Energy’s (DOE) Biological and Environmental Research Program, but the research conducted within the facility benefits many funding agencies, including other branches of DOE, the National Institutes of Health, the National Science Foundation, and the Department of Defense. EMSL requires the continued funding and support of its stakeholders and clients to continue to grow its mission, build its reputation as a sought-after national user facility with cutting-edge capabilities, and attract high-profile users who will work to solve the most critical scientific challenges that affect DOE and the nation. In this vein, this document has been compiled to provide these stakeholders and clients with a review document that provides an abundance of information on EMSL’s history, current research activities, and proposed future direction.

  16. EMSL 2008 Operational Review

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Allison A.

    2008-08-12

    The William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) is a national user facility that contains state-of-the-art instrumentation and expert resources available for use by researchers from academia, industry, and the national laboratory system. The facility is supported by the U.S. Department of Energy’s (DOE) Biological and Environmental Research Program, but the research conducted within the facility benefits many funding agencies, including other branches of DOE, the National Institutes of Health, the National Science Foundation, and the Department of Defense. EMSL requires the continued funding and support of its stakeholders and clients to continue to grow its mission, build its reputation as a sought-after national user facility with cutting-edge capabilities, and attract high-profile users who will work to solve the most critical scientific challenges that affect DOE and the nation. In this vein, this document has been compiled to provide these stakeholders and clients with a review document that provides an abundance of information on EMSL’s history, current research activities, and proposed future direction.

  17. Biome-BGC: Terrestrial Ecosystem Process Model, Version 4.1.1

    Data.gov (United States)

    National Aeronautics and Space Administration — Biome-BGC is a computer program that estimates fluxes and storage of energy, water, carbon, and nitrogen for the vegetation and soil components of terrestrial...

  18. Biome-BGC: Terrestrial Ecosystem Process Model, Version 4.1.1

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: Biome-BGC is a computer program that estimates fluxes and storage of energy, water, carbon, and nitrogen for the vegetation and soil components of...

  19. Assessing the protection function of Alpine forest ecosystems using BGC modelling theory

    Science.gov (United States)

    Pötzelsberger, E.; Hasenauer, H.; Petritsch, R.; Pietsch, S. A.

    2009-04-01

    The purpose of this study was to assess the protection function of forests in Alpine areas by modelling the flux dynamics (water, carbon, nutrients) within a watershed as they may depend on the vegetation pattern and forest management impacts. The application case for this study was the Schmittenbach catchment, located in the province of Salzburg. The available data covered hydrology (rainfall measurements from 1981 to 1998 and runoff measurements on the river Schmittenbach from 1981 to 2005) and vegetation dynamics (currently 69% forest, predominantly Norway spruce). The method of simulating forest growth and water outflow was validated. For simulations of the key ecosystem processes (e.g., photosynthesis, carbon and nitrogen allocation to the different plant parts, litter fall, mineralisation, tree water uptake, transpiration, rainfall interception, evaporation, snow accumulation and snow melt, outflow of excess water) the biogeochemical ecosystem model Biome-BGC was applied. Relevant model extensions were the tree-species-specific parameter sets and the improved thinning regime. The model is sensitive to site characteristics and needs daily weather data and information on atmospheric composition, which makes it responsive to higher CO2 levels and climate change. For model validation, 53 plots were selected covering the full range of site quality and stand age. Tree volume and soil were measured and compared with the respective model results. The outflow for the watershed was predicted by combining the simulated forest outflow (derived from plot outflow) with the outflow from the non-forest area (calculated with a fixed outflow/rainfall coefficient (OC)). The analysis of production- and water-related model outputs indicated that mechanistic modelling can be used as a tool to assess the performance of Alpine protection forests. The Water Use Efficiency (WUE), the ratio of net primary production (NPP) to transpiration, was found to be highest for juvenile stands (
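
    Two of the derived quantities mentioned above are simple ratios, and a brief sketch may help fix the definitions: WUE as NPP divided by transpiration, and total watershed outflow as an area-weighted combination of the simulated forest outflow and a fixed outflow/rainfall coefficient (OC) applied to the non-forest area. The numbers and the weighting scheme below are illustrative assumptions, not values from the study (only the 69% forest share is taken from the abstract).

      # Illustrative calculations only; the weighting and values are assumed.
      def water_use_efficiency(npp_g_c_m2, transpiration_mm):
          """WUE = net primary production per unit of transpired water."""
          return npp_g_c_m2 / transpiration_mm

      def watershed_outflow(forest_outflow_mm, rainfall_mm,
                            forest_fraction=0.69, outflow_coeff=0.5):
          """Combine simulated forest outflow with a fixed outflow/rainfall
          coefficient (OC) for the non-forest area, weighted by area share."""
          non_forest = (1.0 - forest_fraction) * outflow_coeff * rainfall_mm
          return forest_fraction * forest_outflow_mm + non_forest

      print(water_use_efficiency(450.0, 300.0))   # g C per mm of water
      print(watershed_outflow(400.0, 1200.0))     # mm over the catchment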

  20. [Parameter sensitivity of simulating net primary productivity of Larix olgensis forest based on BIOME-BGC model].

    Science.gov (United States)

    He, Li-hong; Wang, Hai-yan; Lei, Xiang-dong

    2016-02-01

    Models based on vegetation ecophysiological processes contain many parameters, and reasonable parameter values greatly improve simulation ability. Sensitivity analysis, an important method for screening out sensitive parameters, can comprehensively analyze how model parameters affect the simulation results. In this paper, we conducted a parameter sensitivity analysis of the BIOME-BGC model with a case study simulating the net primary productivity (NPP) of a Larix olgensis forest in Wangqing, Jilin Province. First, through a comparison of field measurement data with the simulation results, we tested the BIOME-BGC model's capability to simulate the NPP of L. olgensis forest. Then, the Morris and EFAST sensitivity methods were used to screen the sensitive parameters that strongly influence NPP. On this basis, we quantitatively estimated the sensitivity of the screened parameters and calculated the global, first-order and second-order sensitivity indices. The results showed that the BIOME-BGC model could simulate the NPP of L. olgensis forest in the sample plot well. The Morris method provided a reliable parameter sensitivity analysis under the condition of a relatively small sample size, while the EFAST method could quantitatively measure the impact of a single parameter on the simulation result as well as the interactions between parameters in the BIOME-BGC model. The most influential sensitive parameters for L. olgensis forest NPP were the new stem carbon to new leaf carbon allocation ratio and the leaf carbon to nitrogen ratio; the effect of their interaction was significantly greater than that of the other parameters' interactions.
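
    As a rough illustration of how a Morris screening of BIOME-BGC parameters can be set up, the Python sketch below uses the SALib package with a stand-in model function; the parameter names echo the two sensitive parameters reported above, but the bounds, sample size and response surface are placeholders, not the study's settings.

      # Hedged sketch of Morris screening with SALib; "run_model" stands in
      # for a wrapper that would run BIOME-BGC and return simulated NPP.
      import numpy as np
      from SALib.sample.morris import sample as morris_sample
      from SALib.analyze.morris import analyze as morris_analyze

      problem = {
          "num_vars": 2,
          "names": ["new_stem_to_new_leaf_C", "leaf_C_to_N"],  # placeholders
          "bounds": [[1.0, 3.0], [20.0, 60.0]],                # placeholders
      }

      def run_model(params):
          """Toy response surface in place of a real BIOME-BGC run."""
          stem_leaf, cn = params
          return 500.0 + 40.0 * stem_leaf - 2.0 * cn

      X = morris_sample(problem, N=50, num_levels=4)
      Y = np.array([run_model(row) for row in X])
      Si = morris_analyze(problem, X, Y, num_levels=4)
      print(Si["mu_star"], Si["sigma"])  # elementary-effect screening measures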

  1. Modeling the grazing effect on dry grassland carbon cycling with modified Biome-BGC grazing model

    Science.gov (United States)

    Luo, Geping; Han, Qifei; Li, Chaofan; Yang, Liao

    2014-05-01

    Identifying the factors that determine the carbon source/sink strength of ecosystems is important for reducing uncertainty in the global carbon cycle. Arid grassland ecosystems are a widely distributed biome type in Xinjiang, Northwest China, covering approximately one-fourth of the country's land surface. These grasslands are the habitat of many endemic and rare plant and animal species and are also used as pastoral land for livestock. Using the modified Biome-BGC grazing model, we modeled carbon dynamics in Xinjiang for grasslands that varied in grazing intensity. Overall, this regional simulation estimated that the grassland ecosystems in Xinjiang acted as a net carbon source, with a value of 0.38 Pg C over the period 1979-2007. Grazing had significant effects on carbon dynamics. An over-compensatory effect in net primary productivity (NPP) and vegetation carbon (C) stock was observed when grazing intensity was lower than 0.40 head/ha. Grazing resulted in a net carbon source of 23.45 g C m-2 yr-1, which equaled 0.37 Pg in Xinjiang over the last 29 years. In general, grazing decreased the vegetation C stock, although an increasing trend was observed under low grazing intensity. The soil C increased significantly (17%) with long-term grazing, while the soil C stock remained steady without grazing. These findings have implications for grassland ecosystem management as it relates to carbon sequestration and climate change mitigation; e.g., removal of grazing should be considered in strategies that aim to increase terrestrial carbon sequestration at local and regional scales. One of the greatest limitations in quantifying the effects of herbivores on carbon cycling is identifying the grazing systems and intensities within a given region. We hope our study emphasizes the need for large-scale assessments of how grazing impacts carbon cycling. Most terrestrial ecosystems in Xinjiang have been affected by disturbances to a greater or lesser extent in the past

  2. EMSL Contribution Plan

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Allison A.

    2008-12-01

    This Contribution Plan is EMSL’s template for achieving our vision of simultaneous excellence in all aspects of our mission as a national scientific user facility. It reflects our understanding of the long-term stewardship we must work toward to meet the scientific challenges faced by the Department of Energy (DOE) and the nation. During the next decade, we will implement the strategies contained in this Plan, working closely with the scientific community, our advisory committees, DOE’s Office of Biological and Environmental Research, and other key stakeholders. This Plan is fully aligned with the strategic plans of DOE, its Office of Science, and the Pacific Northwest National Laboratory (PNNL). We recognize that shifts in science and technology, national priorities, and resources made available through the Federal budget process create planning uncertainties and, ultimately, a highly dynamic planning environment. Accordingly, this Plan should be viewed as a living document: we continually evaluate the changing needs and opportunities posed by our stakeholders (i.e., DOE, users, staff, advisory committees), work closely with them to understand and respond to those changes, and align our strategy accordingly. This Plan is organized into two sections. Section 1 describes our vision and four strategic outcomes: 1) Scientific Innovation, 2) Capabilities that Transform Science, 3) Outstanding Management and Operations, and 4) Engaged and Proactive Users. These outcomes provide the framework for seven critical actions we must take during the next 3 to 5 years: 1) Establishing leadership in EMSL science themes, 2) building and deploying transformational capabilities, 3) integrating computation with experiment, 4) ensuring EMSL’s workforce meets the scientific challenges of the future, 5) creating partnerships, 6) attracting and engaging users in EMSL’s long-term strategy, and 7) building a research infrastructure that meets emerging scientific needs. Section 2

  3. Modeling the Ecosystem Services Provided by Trees in Urban Ecosystems: Using Biome-BGC to Improve i-Tree Eco

    Science.gov (United States)

    Brown, Molly E.; McGroddy, Megan; Spence, Caitlin; Flake, Leah; Sarfraz, Amna; Nowak, David J.; Milesi, Cristina

    2012-01-01

    As the world becomes increasingly urban, the need to quantify the effect of trees in urban environments on energy usage, air pollution, local climate and nutrient run-off has increased. By identifying, quantifying and valuing the ecological activity that provides services in urban areas, stronger policies and improved quality of life for urban residents can be obtained. Here we focus on two radically different models that can be used to characterize urban forests. The i-Tree Eco model (formerly UFORE model) quantifies ecosystem services (e.g., air pollution removal, carbon storage) and values derived from urban trees based on field measurements of trees and local ancillary data sets. Biome-BGC (Biome BioGeoChemistry) is used to simulate the fluxes and storage of carbon, water, and nitrogen in natural environments. This paper compares i-Tree Eco's methods to those of Biome-BGC, which estimates the fluxes and storage of energy, carbon, water and nitrogen for vegetation and soil components of the ecosystem. We describe the two models and their differences in the way they calculate similar properties, with a focus on carbon and nitrogen. Finally, we discuss the implications of further integration of these two communities for land managers such as those in Maryland.

  4. The Accelerator Facility at the Environmental Molecular Sciences Laboratory (EMSL)

    Science.gov (United States)

    Thevuthasan, S.; Peden, C. H. F.; Engelhard, M. H.; Baer, D. R.; Herman, G. S.; Liang, Y.

    1997-03-01

    The EMSL, a new Department of Energy (DOE) user facility located at PNNL, will have several state-of-the-art systems, including an accelerator facility that can be used by scientists from around the world. The accelerator facility at EMSL consists of a model 9SDH-2 Pelletron 3.4 MV electrostatic tandem ion accelerator with three beam lines. These beam lines are dedicated to UHV ion scattering capabilities, implantation capabilities, and HV ion scattering capabilities, respectively. The end station attached to the UHV beam line has several electron spectroscopies such as low energy electron diffraction (LEED) and Auger electron spectroscopy (AES) in addition to the ion scattering capabilities. This end station will be interfaced with the EMSL transfer capability that allows a sample to be synthesized, processed, and characterized in several surface science UHV systems. We will discuss the accelerator facility and the capabilities along with some initial results. (Work supported by the DOE/ER/OHER)

  5. Optimizing selective cutting strategies for maximum carbon stocks and yield of Moso bamboo forest using BIOME-BGC model.

    Science.gov (United States)

    Mao, Fangjie; Zhou, Guomo; Li, Pingheng; Du, Huaqiang; Xu, Xiaojun; Shi, Yongjun; Mo, Lufeng; Zhou, Yufeng; Tu, Guoqing

    2017-04-15

    The selective cutting method currently used in Moso bamboo forests has resulted in a reduction of stand productivity and carbon sequestration capacity. Given the time and labor expense involved in addressing this problem manually, simulation with an ecosystem model is the most suitable approach. The BIOME-BGC model was improved to represent managed Moso bamboo forests, adapted to include the age structure, specific ecological processes, and management measures of Moso bamboo forests. A field selective cutting experiment was conducted in nine plots with three cutting intensities (high, moderate and low) during 2010-2013, and the biomass of these plots was measured for model validation. Four selective cutting scenarios were then simulated with the improved BIOME-BGC model to optimize the selective cutting timings, intervals, retained ages and intensities. The improved model matched the observed aboveground carbon density and yield of the different plots, with relative errors ranging from 9.83% to 15.74%. The results of the different selective cutting scenarios suggested that the optimal selective cutting measure is to cut 30% of the culms of age 6, 80% of the culms of age 7, and all culms thereafter (above age 8), in winter, every other year. The vegetation carbon density and harvested carbon density under this selective cutting method can increase by 74.63% and 21.5%, respectively, compared with the current selective cutting practice. The optimized selective cutting measure developed in this study can significantly increase carbon density, yield, and carbon sink capacity in Moso bamboo forests.

  6. The pharmacological effect of BGC20-1531, a novel prostanoid EP4 receptor antagonist, in the prostaglandin E2 human model of headache

    DEFF Research Database (Denmark)

    Antonova, Maria; Wienecke, Troels; Maubach, Karen

    2011-01-01

    Using a human Prostaglandin E(2) (PGE(2)) model of headache, we examined whether a novel potent and selective EP(4) receptor antagonist, BGC20-1531, may prevent headache and dilatation of the middle cerebral artery (MCA) and superficial temporal artery (STA). In a three-way cross-over trial, eight healt...

  7. Belowground Carbon Cycling Processes at the Molecular Scale: An EMSL Science Theme Advisory Panel Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Hess, Nancy J.; Brown, Gordon E.; Plata, Charity

    2014-02-21

    As part of the Belowground Carbon Cycling Processes at the Molecular Scale workshop, an EMSL Science Theme Advisory Panel meeting held in February 2013, attendees discussed critical biogeochemical processes that regulate carbon cycling in soil. The meeting attendees determined that as a national scientific user facility, EMSL can provide the tools and expertise needed to elucidate the molecular foundation that underlies mechanistic descriptions of biogeochemical processes that control carbon allocation and fluxes at the terrestrial/atmospheric interface in landscape and regional climate models. Consequently, the workshop's goal was to identify the science gaps that hinder either development of mechanistic description of critical processes or their accurate representation in climate models. In part, this report offers recommendations for future EMSL activities in this research area. The workshop was co-chaired by Dr. Nancy Hess (EMSL) and Dr. Gordon Brown (Stanford University).

  8. EMSL Bimonthly Report: June 2007 through July 2007

    Energy Technology Data Exchange (ETDEWEB)

    Showalter, Mary Ann

    2007-10-03

    The W.R. Wiley Environmental Molecular Sciences Laboratory (EMSL) is a U.S. Department of Energy (DOE) national scientific user facility located at Pacific Northwest National Laboratory (PNNL) in Richland, Washington. PNNL operates EMSL for the DOE Office of Biological and Environmental Research. At one location, EMSL offers a comprehensive array of leading-edge resources in six research facilities. Access to the capabilities and instrumentation in EMSL facilities is obtained through a peer-reviewed proposal process. The Bimonthly Report documents research activities and accomplishments of EMSL users and staff. Topics covered in the Bimonthly Report include Research Highlights of EMSL user projects, Scientific Grand Challenge Highlights, Awards and Recognition, Professional/Community Service, Major Facility Upgrades, News Coverage, Visitors and Users, New EMSL Staff, Publications, Presentations, Patents, and Journal Covers featuring EMSL user research.

  9. Copula Multivariate analysis of Gross primary production and its hydro-environmental driver; A BIOME-BGC model applied to the Antisana páramos

    Science.gov (United States)

    Minaya, Veronica; Corzo, Gerald; van der Kwast, Johannes; Galarraga, Remigio; Mynett, Arthur

    2014-05-01

    Simulations of carbon cycling are prone to uncertainties from different sources, which in general are related to the input data, the parameters, and the representation capacity of the model itself. The gross carbon uptake in the cycle is represented by the gross primary production (GPP), which reflects the spatio-temporal variability of precipitation and soil moisture dynamics. This variability, together with parameter uncertainty, can be modelled with multivariate probability distributions. Our study presents a novel methodology that uses multivariate copula analysis to assess GPP. Multi-species and elevation variables are included in the first scenario of the analysis; hydro-meteorological conditions that might change over the next 50 or more years are included in a second scenario. The biogeochemical model BIOME-BGC was applied in the Ecuadorian Andean region at elevations greater than 4000 m a.s.l. with the presence of typical páramo vegetation. The change of GPP over time is crucial for climate scenarios of carbon cycling in this type of ecosystem. The results help to improve our understanding of ecosystem function and clarify its dynamics and relationship with changing climate variables. Keywords: multivariate analysis, copula, BIOME-BGC, NPP, páramos
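
    The copula step itself can be sketched in a few lines: rank-transform each margin to uniform scores, map them to standard normal scores, and estimate the Gaussian copula correlation between the variables. The Python sketch below uses synthetic GPP and precipitation values and a plain Gaussian copula; it illustrates the general technique, not the study's data or full methodology.

      # Minimal Gaussian-copula sketch with synthetic data (illustrative only).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      precip = rng.gamma(shape=2.0, scale=50.0, size=200)       # mm / period
      gpp = 2.0 * precip + rng.normal(0.0, 30.0, size=200)      # g C m-2

      def normal_scores(x):
          """Rank-transform to (0, 1), then map to standard normal quantiles."""
          u = (stats.rankdata(x) - 0.5) / len(x)
          return stats.norm.ppf(u)

      z_gpp, z_precip = normal_scores(gpp), normal_scores(precip)
      rho = np.corrcoef(z_gpp, z_precip)[0, 1]   # Gaussian copula parameter
      print(f"copula correlation: {rho:.2f}")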

  10. Applying EMSL Capabilities to Biogeochemistry and Environmental Research

    Energy Technology Data Exchange (ETDEWEB)

    Felmy, Andy

    2007-04-19

    The Environmental Molecular Sciences laboratory (EMSL) is a national scientific user facility operated by the Pacific Northwest National Laboratory (PNNL) for the U.S. Department of Energy's Office of Biological and Environmental Research. Located in Richland, Washington, EMSL offers researchers a comprehensive array of cutting-edge capabilities unmatched anywhere else in the world and access to the expertise of over 300 resident users--all at one location. EMSL's resources are available on a peer-reviewed proposal basis and are offered at no cost if research results are shared in the open literature. Researchers are encouraged to submit a proposal centered around one of EMSL's four Science Themes, which represent growing areas of research: (1) Geochemistry/Biogeochemistry and Subsurface Science; (2) Atmospheric Aerosol Chemistry; (3) Biological Interactions and Dynamics; and (4) Science of Interfacial Phenomena. To learn more about EMSL, visit www.emsl.pnl.gov.

  11. A spatial implementation of the BIOME-BGC to model grassland GPP production and water budgets in the Ecuadorian Andean Region

    Science.gov (United States)

    Minaya, Veronica; Corzo, Gerald; van der Kwast, Johannes; Mynett, Arthur

    2016-04-01

    Many terrestrial biogeochemistry process models have been applied around the world at different scales and for a large range of ecosystems. Grasslands, and in particular those located in the Andean region, are essential ecosystems that sustain important ecological processes; however, only a few efforts have been made to estimate the gross primary production (GPP) and the hydrological budgets of this specific ecosystem along an altitudinal gradient. A previous study, one of the few available in the region, considered the heterogeneity of the main properties of the páramo vegetation and showed significant differences in plant functional types, site/soil parameters and daily meteorology. This study extends that work and uses spatio-temporal analysis of the BIOME-BGC model results to simulate the GPP and the water fluxes in space and time through an altitudinal analysis. A catchment on the southwestern slope of the Antisana volcano in Ecuador was selected as a representative area of the Andean páramos, and for its hydrological importance as one of the main sources of a water supply reservoir in the region. An accurate estimation of temporal changes in GPP in the region is important for carbon budget assessments, evaluation of the impact of climate change, and biomass productivity. This complex yet interesting problem was addressed with the ecosystem process model BIOME-BGC; the results were evaluated and associated with a land cover map in which the growth forms of vegetation were identified. The responses of GPP and the water fluxes depended not only on the environmental drivers but also on the ecophysiology and the site-specific parameters. The model estimated that the GPP at lower elevations is double that estimated at higher elevations, which may have large implications for extrapolation to larger spatio-temporal scales. The outcomes of the stand hydrological processes demonstrated a wrong

  12. Accounting for age Structure in Ponderosa Pine Ecosystem Analyses: Integrating Management, Disturbance Histories and Observations with the BIOME-BGC Model

    Science.gov (United States)

    Hibbard, K. A.; Law, B.; Thornton, P.

    2003-12-01

    Disturbance and management regimes in forested ecosystems have recently been highlighted as important factors in the quantification of carbon stocks and fluxes. Disturbance events, such as stand-replacing fires, and current management regimes that emphasize understory and tree thinning are primary suspects influencing ecosystem processes, including net ecosystem productivity (NEP), in forests of the Pacific Northwest. Several recent analyses have compared simulated and measured component stocks and fluxes of carbon in ponderosa pine (Pinus ponderosa var. Laws) at 12 sites ranging from 9 to 300 years in age in central Oregon (Law et al. 2001, Law et al. 2003) using the BIOME-BGC model. Major emphases in ecosystem model development include improving allocation logic, integrating ecosystem processes with disturbances such as fire, and including nitrogen in biogeochemical cycling. In Law et al. (2001, 2003), field observations prompted BIOME-BGC improvements, including dynamic allocation of carbon to fine root mass through the life of a stand. A sequence of simulations was also designed to represent both the management and disturbance histories of each site; however, the current age structure of each site was not addressed. Age structure, or cohort management, has largely been ignored by ecosystem models, although some studies have sought to incorporate stand age with disturbance and management (e.g. Hibbard et al. 2003). In this analysis, we regressed tree ages against height (R2 = 0.67) to develop a proportional distribution of age structure for each site. To preserve the integrity of the comparison between Law et al. (2003) and this study, we maintained the same timing of harvest; however, based on the distribution of age structures, we manipulated the amount of removal. Harvest in Law et al. (2003) was set at stand-replacement (99%) levels to simulate clear-cutting, reflecting the average of the top 10% of ages in each plot. For the young sites, we set removal at 73%, 51% and
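
    The age-height regression used to build such a proportional age distribution is a standard least-squares fit. The small Python sketch below shows the fit and its R2 with made-up tree measurements (the Oregon data themselves are not reproduced here).

      # Illustrative age-vs-height regression with synthetic measurements.
      import numpy as np

      height_m = np.array([4.0, 7.5, 11.0, 16.0, 21.5, 27.0, 33.0])
      age_yr = np.array([12.0, 25.0, 38.0, 70.0, 110.0, 180.0, 260.0])

      slope, intercept = np.polyfit(height_m, age_yr, deg=1)
      predicted = slope * height_m + intercept
      ss_res = np.sum((age_yr - predicted) ** 2)
      ss_tot = np.sum((age_yr - np.mean(age_yr)) ** 2)
      r_squared = 1.0 - ss_res / ss_tot
      print(f"age ~ {slope:.1f}*height + {intercept:.1f}, R2 = {r_squared:.2f}")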

  13. The BGC Feedbacks Scientific Focus Area 2016 Annual Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Forrest M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Riley, William J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Randerson, James T. [Univ. of California, Irvine, CA (United States)

    2016-06-01

    The BGC Feedbacks Project will identify and quantify the feedbacks between biogeochemical cycles and the climate system, and quantify and reduce the uncertainties in Earth System Models (ESMs) associated with those feedbacks. The BGC Feedbacks Project will contribute to the integration of the experimental and modeling science communities, providing researchers with new tools to compare measurements and models, thereby enabling DOE to contribute more effectively to future climate assessments by the U.S. Global Change Research Program (USGCRP) and the Intergovernmental Panel on Climate Change (IPCC).

  14. EMSL Quarterly Highlights Report: 1st Quarter, FY08

    Energy Technology Data Exchange (ETDEWEB)

    Showalter, Mary Ann

    2008-01-28

    The EMSL Quarterly Highlights Report covers the science, staff and user recognition, and publication activities that occurred during the 1st quarter (October 2007 - December 2007) of Fiscal Year 2008.

  15. EMSL Quarterly Highlights Report: 1st Quarter, Fiscal Year 2009

    Energy Technology Data Exchange (ETDEWEB)

    Showalter, Mary Ann; Kathmann, Loel E.; Manke, Kristin L.

    2009-02-02

    The EMSL Quarterly Highlights Report covers the science, staff and user recognition, and publication activities that occurred during the 1st quarter (October 2008 - December 2008) of Fiscal Year 2009.

  16. EMSL Quarterly Highlights Report: FY 2008, 3rd Quarter

    Energy Technology Data Exchange (ETDEWEB)

    Showalter, Mary Ann

    2008-09-16

    The EMSL Quarterly Highlights Report covers the science, staff and user recognition, and publication activities that occurred during the 3rd quarter (April 2008 - June 2008) of Fiscal Year 2008.

  17. Applications of neural networks to real-time data processing at the Environmental and Molecular Sciences Laboratory (EMSL)

    International Nuclear Information System (INIS)

    Keller, P.E.; Kouzes, R.T.; Kangas, L.J.

    1993-06-01

    Detailed design of the Environmental and Molecular Sciences Laboratory (EMSL) at the Pacific Northwest Laboratory (PNL) is nearing completion and construction is scheduled to begin later this year. This facility will assist in the environmental restoration and waste management mission at the Hanford Site. This paper identifies several real-time data processing applications within the EMSL where neural networks can potentially be beneficial. These applications include real-time sensor data acquisition and analysis, spectral analysis, process control, theoretical modeling, and data compression

  18. Simulating carbon and water cycles of larch forests in East Asia by the BIOME-BGC model with AsiaFlux data

    Directory of Open Access Journals (Sweden)

    M. Ueyama

    2010-03-01

    Larch forests are widely distributed across many cool-temperate and boreal regions, and they are expected to play an important role in global carbon and water cycles. Model parameterizations for larch forests still contain large uncertainties owing to a lack of validation. In this study, a process-based terrestrial biosphere model, BIOME-BGC, was tested for larch forests at six AsiaFlux sites and used to identify important environmental factors that affect the carbon and water cycles at both temporal and spatial scales.

    The model simulation performed with the default deciduous conifer parameters produced results that had large differences from the observed net ecosystem exchange (NEE), gross primary productivity (GPP), ecosystem respiration (RE), and evapotranspiration (ET). Therefore, we adjusted several model parameters in order to reproduce the observed rates of carbon and water cycle processes. This model calibration, performed using the AsiaFlux data, substantially improved the model performance. The simulated annual GPP, RE, NEE, and ET from the calibrated model were highly consistent with observed values.

    The observed and simulated GPP and RE across the six sites were positively correlated with the annual mean air temperature and annual total precipitation. On the other hand, the simulated carbon budget was partly explained by the stand disturbance history in addition to the climate. The sensitivity study indicated that spring warming enhanced the carbon sink, whereas summer warming decreased it across the larch forests. Summer radiation was the most important factor controlling the carbon fluxes at the temperate site, whereas VPD and water conditions were the limiting factors at the boreal sites. One model parameter, the allocation ratio of carbon between belowground and aboveground, was site-specific, and it was negatively correlated with annual mean air temperature and total precipitation

  19. Recapitalizing EMSL: Meeting Future Science and Technology Challenges

    Energy Technology Data Exchange (ETDEWEB)

    Felmy, Andrew R.

    2008-07-01

    EMSL, located in Richland, Washington, is a national scientific user facility operated for the U.S. Department of Energy (DOE) by the Pacific Northwest National Laboratory. The vision that directed the development of EMSL as a problem-solving environment for environmental molecular science has led to significant scientific progress in many areas ranging from subsurface science to atmospheric sciences, and from biochemistry to catalysis. Our scientific staff and users are recognized nationally and internationally for their significant contributions to solving challenging scientific problems. We have explored new scientific frontiers and organized a vibrant and diverse user community in support of our mission as a national scientific user facility that provides integrated experimental and computational resources in the environmental molecular sciences. Users from around the world - from academia to industry and national laboratories to international research organizations - use the resources of EMSL because of the quality of science that we enable.

  20. EMSL and Institute for Integrated Catalysis (IIC) Catalysis Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Charles T.; Datye, Abhaya K.; Henkelman, Graeme A.; Lobo, Raul F.; Schneider, William F.; Spicer, Leonard D.; Tysoe, Wilfred T.; Vohs, John M.; Baer, Donald R.; Hoyt, David W.; Thevuthasan, Suntharampillai; Mueller, Karl T.; Wang, Chong M.; Washton, Nancy M.; Lyubinetsky, Igor; Teller, Raymond G.; Andersen, Amity; Govind, Niranjan; Kowalski, Karol; Kabius, Bernd C.; Wang, Hongfei; Campbell, Allison A.; Shelton, William A.; Bylaska, Eric J.; Peden, Charles HF; Wang, Yong; King, David L.; Henderson, Michael A.; Rousseau, Roger J.; Szanyi, Janos; Dohnalek, Zdenek; Mei, Donghai; Garrett, Bruce C.; Ray, Douglas; Futrell, Jean H.; Laskin, Julia; DuBois, Daniel L.; Kuprat, Laura R.; Plata, Charity

    2011-05-24

    Within the context of significantly accelerating scientific progress in research areas that address important societal problems, a workshop was held in November 2010 at EMSL to identify specific and topically important areas of research and capability needs in catalysis-related science.

  1. EMSL Quarterly Highlights Report Second Quarter, Fiscal Year 2010 (January 1, 2010 through March 31, 2010)

    Energy Technology Data Exchange (ETDEWEB)

    West, Staci A.; Showalter, Mary Ann; Manke, Kristin L.; Carper, Ross R.; Wiley, Julie G.; Beckman, Mary T.

    2010-04-20

    The Environmental Molecular Sciences Laboratory (EMSL) is a U.S. Department of Energy (DOE) national scientific user facility located at Pacific Northwest National Laboratory (PNNL) in Richland, Washington. EMSL is operated by PNNL for the DOE-Office of Biological and Environmental Research. At one location, EMSL offers a comprehensive array of leading-edge resources and expertise. Access to the instrumentation and expertise is obtained on a peer-reviewed proposal basis. Staff members work with researchers to expedite access to these capabilities. The "EMSL Quarterly Highlights Report" documents current research and activities of EMSL staff and users.

  2. [Responses of Pinus tabulaeformis forest ecosystem in North China to climate change and elevated CO2: a simulation based on BIOME-BGC model and tree-ring data].

    Science.gov (United States)

    He, Jun-Jie; Peng, Xing-Yuan; Chen, Zhen-Ju; Cui, Ming-Xing; Zhang, Xian-Liang; Zhou, Chang-Hong

    2012-07-01

    Based on the BIOME-BGC model and tree-ring data, a modeling study was conducted to estimate the dynamic changes in the net primary productivity (NPP) of the Pinus tabulaeformis forest ecosystem in North China over 1952-2008, and to explore the responses of radial growth and NPP to regional climate warming as well as the dynamics of NPP under future climate change scenarios. The simulation results indicated that the annual NPP of the P. tabulaeformis ecosystem in 1952-2008 fluctuated from 244.12 to 645.31 g C m-2 a-1, with a mean value of 418.6 g C m-2 a-1. The mean air temperature in May-June and the precipitation from the previous August to the current July were the main factors limiting the radial growth of P. tabulaeformis and the NPP of the P. tabulaeformis ecosystem. Over the study period, both radial growth and NPP showed a decreasing trend due to the warming and drying regional climate. Under the future climate scenarios, NPP would respond positively to increases in air temperature, precipitation, and their combination. Elevated CO2 would also increase NPP, by about 16.1% through CO2 fertilization. At both ecosystem and regional scales, tree-ring data would be an ideal proxy for predicting ecosystem dynamic change, and could be used to validate and calibrate process-based ecosystem models, including BIOME-BGC.

  3. FIFE data analysis: Testing BIOME-BGC predictions for grasslands

    Science.gov (United States)

    Hunt, E. Raymond, Jr.

    1994-01-01

    The First International Satellite Land Surface Climatology Project (ISLSCP) Field Experiment (FIFE) was conducted in a 15 km by 15 km research area located 8 km south of Manhattan, Kansas. The site consists primarily of native tallgrass prairie mixed with gallery oak forests and croplands. The objectives of FIFE are to better understand the role of biology in controlling the interactions between the land and the atmosphere, and to determine the value of remotely sensed data for estimating climatological parameters. The goals of FIFE are twofold: the upscale integration of models, and algorithm development for satellite remote sensing. The specific objectives of the field campaigns carried out in 1987 and 1989 were the simultaneous acquisition of satellite, atmospheric, and surface data, and the understanding of the processes controlling surface energy and mass exchange. The collected data were used to study the dynamics of various ecosystem processes (photosynthesis, evaporation and transpiration, autotrophic and heterotrophic respiration, etc.). Modelling terrestrial ecosystems at scales larger than that of a homogeneous plot led to the development of simple, generalized models of biogeochemical cycles that can be accurately applied to different biomes through the use of remotely sensed data. A model called BIOME-BGC (for BioGeochemical Cycles) was developed from a coniferous forest ecosystem model, FOREST-BGC, where a biome is considered a combination of life forms in a specified climate. A predominantly C4-photosynthetic grassland is probably as different from a coniferous forest as possible, hence the FIFE site was an excellent study area for testing BIOME-BGC. The transition from an essentially one-dimensional calculation to three-dimensional, landscape-scale simulations requires the introduction of such factors as meteorology, climatology, and geomorphology. By using remotely sensed geographic information data for important model inputs, process

  4. Biisofraxidin on Apoptosis of Human Gastric Cancer BGC-823 Cells

    African Journals Online (AJOL)

    Conclusion: 3,3′-Biisofraxidin significantly induces the apoptosis of BGC-823 cells in vitro and in vivo ... Plant material: Sarcandrae Herba (SH) was obtained from the Chengdu international trade city in 2013 and identified by Jian-Tao Wu. A voucher specimen ...

  5. Comment on Chiesi et al. (2011): “Use of BIOME-BGC to simulate Mediterranean forest carbon stocks”

    OpenAIRE

    Eastaugh CS

    2011-01-01

    The mechanistic forest growth model BIOME-BGC utilizes a “spin-up” procedure to estimate site parameters for forests in a steady-state condition, as they may have been expected to be prior to anthropogenic influence. Forests in this condition have no net growth, as living biomass accumulation is balanced by mortality. To simulate current ecosystems it is necessary to reset the model to reflect a forest of the correct development stage. The alternative approach of simply post-adjus...

  6. Testing the applicability of BIOME-BGC to simulate beech gross primary production in Europe using a new continental weather dataset

    DEFF Research Database (Denmark)

    Chiesi, Marta; Chirici, Gherardo; Marchetti, Marco

    2016-01-01

    A daily 1-km Pan-European weather dataset can drive the BIOME-BGC model for the estimation of current and future beech gross primary production (GPP). Annual beech GPP is affected primarily by spring temperature and more irregularly by summer water stress.The spread of beech forests in Europe...... forest ecosystems having different climatic conditions where the eddy covariance technique is used to measure water and carbon fluxes. The experiment is in three main steps. First, the accuracy of BIOME-BGC GPP simulations is assessed through comparison with flux observations. Second, the influence...

  7. Inhibitory effects of CP on the growth of human gastric adenocarcinoma BGC-823 tumours in nude mice.

    Science.gov (United States)

    Wang, Hai-Jun; Liu, Yu; Zhou, Bao-Jun; Zhang, Zhan-Xue; Li, Ai-Ying; An, Ran; Yue, Bin; Fan, Li-Qiao; Li, Yong

    2018-01-01

    Objective To investigate the potential antitumour effects of [2-(6-amino-purine-9-yl)-1-hydroxy-phosphine acyl ethyl] phosphonic acid (CP) against gastric adenocarcinoma. Methods Human BGC-823 xenotransplants were established in nude mice. Animals were randomly divided into control and CP groups, which were administered NaHCO3 vehicle alone or CP dissolved in NaHCO3 (200 µg/kg body weight) daily, respectively. Tumour volume was measured weekly for 6 weeks. Resected tumours were assayed for proliferative activity with anti-Ki-67 or anti-proliferating cell nuclear antigen (PCNA) antibodies. Cell apoptosis was examined using terminal deoxynucleotidyl transferase-mediated dUTP nick end labelling (TUNEL) assays and with caspase-3 immunostaining. Proteins were measured by Western blotting. Results There was a significant reduction in tumour volume and a reduced percentage of Ki-67-positive or PCNA-positive cells in the CP group compared with the control group. The percentage of TUNEL-positive or caspase-3-positive cells significantly increased following CP treatment compared with the control group. Tumours from the CP group had higher levels of phosphorylated extracellular signal-regulated kinase (p-ERK) and phosphorylated AKT (p-AKT) compared with control tumours. Conclusion CP treatment inhibited tumour growth and induced tumour cell apoptosis in a nude mouse model of BGC-823 gastric adenocarcinoma. Activation of the AKT and ERK signalling pathways may mediate this antitumour activity.

  8. [Effect of andrographolide on proliferation and apoptosis of gastric cancer BGC-823 cells].

    Science.gov (United States)

    Li, Shu-guang; Shao, Qin-shu; Wang, Yuan-yu; Peng, Tao; Zhao, Yi-feng; Yang, Yong-jiang; Huang, Di

    2013-07-01

    To investigate the effect of andrographolide (AD) on the proliferation, cell cycle and apoptosis of the human gastric cancer cell line BGC-823. MTT assay, flow cytometry and Annexin-V/PI double-staining flow cytometry were used to evaluate the effects of AD on the proliferation, cell cycle and apoptosis of BGC-823 cells, respectively. Optical microscopy and transmission electron microscopy were used to observe cell morphological changes. A time- and concentration-dependent inhibition of proliferation by AD was demonstrated in BGC-823 cells. AD concentrations lower than 7.5 mg/L had a weak inhibitory effect, while concentrations between 15.0 and 60.0 mg/L had a stronger inhibitory effect; concentrations higher than 60.0 mg/L produced no significant further increase in inhibition. The IC50 of AD at 24, 48 and 72 h was (35.3±4.3), (25.5±3.5) and (18.2±2.7) mg/L, respectively. Compared with the negative control group, the number of G0/G1 phase cells increased significantly. Andrographolide can inhibit BGC-823 cell proliferation, arrest BGC-823 cells in the G0/G1 phase and induce apoptosis, and may be a potential traditional Chinese medicine with anti-cancer effect.
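
    IC50 values such as these are typically obtained by fitting a sigmoidal dose-response curve to the MTT data. The Python sketch below fits a four-parameter logistic curve with invented viability values; it illustrates the general procedure only and does not use the paper's measurements.

      # Illustrative IC50 estimation by fitting a 4-parameter logistic curve.
      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(conc, bottom, top, ic50, hill):
          """Four-parameter logistic dose-response curve."""
          return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

      conc_mg_l = np.array([3.75, 7.5, 15.0, 30.0, 60.0, 120.0])  # AD, mg/L
      viability = np.array([0.95, 0.88, 0.70, 0.52, 0.30, 0.22])  # invented

      popt, _ = curve_fit(four_pl, conc_mg_l, viability,
                          p0=[0.1, 1.0, 30.0, 1.0], maxfev=10000)
      print(f"estimated IC50 = {popt[2]:.1f} mg/L")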

  9. Proliferative and apoptotic effects of andrographolide on the BGC-823 human gastric cancer cell line.

    Science.gov (United States)

    Li, Shu-Guang; Wang, Yuan-yu; Ye, Zai-yuan; Shao, Qing-shu; Tao, Hou-quan; Shu, Li-sha; Zhao, Yi-feng; Yang, Yong-jiang; Yang, Jing; Peng, Tao; Han, Bo; Huang, Di

    2013-01-01

    Andrographolide has been shown to have anticancer activity on diverse cancer cell lines representing different types of human cancer. The aim of this research was to investigate the anticancer and apoptotic effects of andrographolide on the BGC-823 human gastric cancer cell line. Cell proliferation and IC50 were evaluated using the MTT assay, cell cycle with flow cytometry, apoptotic effects with the Annexin-V/propidium iodide double-staining assay, and morphologic structure with transmission electron microscopy. Immunohistochemistry and reverse-transcription PCR were used to analyze Bcl-2, Bax, and caspase-3 expression. Andrographolide showed time- and concentration-dependent inhibitory effects on BGC-823 cell growth. Compared to controls, the number of cells in the G0-G1 phase increased significantly and S- and G2-M-phase cells decreased after 48 hours of treatment with andrographolide, and both early and late apoptotic rates increased significantly compared to the controls, all in a concentration-dependent manner. Bax and caspase-3 expression was markedly increased, and Bcl-2 expression was decreased. Andrographolide inhibits BGC-823 cell growth and induces BGC-823 cell apoptosis by up-regulating Bax and caspase-3 expression and down-regulating Bcl-2 expression. Andrographolide may be useful as a potent and selective agent in the treatment of human gastric cancers.

  10. Characterization of a novel nitrilase, BGC4, from Paraburkholderia graminis showing wide-spectrum substrate specificity, a potential versatile biocatalyst for the degradation of nitriles.

    Science.gov (United States)

    Fan, Haiyang; Chen, Lifeng; Sun, Huihui; Wang, Hualei; Liu, Qinghai; Ren, Yuhong; Wei, Dongzhi

    2017-11-01

    To investigate the biodegradation of nitriles via the nitrilase-mediated pathway, a novel nitrilase, BGC4, was identified from the proteobacterium Paraburkholderia graminis CD41M and its potential for use in the biodegradation of toxic nitriles in industrial effluents was studied. BGC4 was overexpressed in Escherichia coli BL21 (DE3), the recombinant protein was purified, and its enzymatic properties were analysed. Maximum activity of BGC4 nitrilase occurred at 30 °C and pH 7.6. BGC4 has broad substrate activity towards aliphatic, heterocyclic, and aromatic nitriles, as well as arylacetonitriles. Iminodiacetonitrile, an aliphatic nitrile, was the optimal substrate, but comparable activities were also observed with phenylacetonitrile and indole-3-acetonitrile. BGC4-expressing cells degraded industrial nitriles, such as acrylonitrile, adiponitrile, benzonitrile, mandelonitrile, and 3-cyanopyridine, showing good tolerance and conversion rates. BGC4 nitrilase has wide-spectrum substrate specificity and is suitable for efficient biodegradation of toxic nitriles.

  11. EMSL Science Theme Advisory Panel Workshop - Atmospheric Aerosol Chemistry, Climate Change, and Air Quality

    Energy Technology Data Exchange (ETDEWEB)

    Baer, Donald R.; Finlayson-Pitts, Barbara J.; Allen, Heather C.; Bertram, Allan K.; Grassian, Vicki H.; Martin, Scot T.; Penner, Joyce E.; Prather, Kimberly; Rasch, Philip J.; Signorell, Ruth; Smith, James N.; Wyslouzil, Barbara; Ziemann, Paul; Dabdub, Donald; Furche, Filipp; Nizkorodov, Sergey; Tobias, Douglas J.; Laskin, Julia; Laskin, Alexander

    2013-07-01

    This report contains the workshop scope and the attendees' recommendations identifying scientific gaps in new particle formation, in the growth and properties of particles, and in reactions in and on particles, as well as the laboratory-focused capabilities, field-deployable capabilities, and modeling/theory tools needed to address them, including the linking of models to fundamental data.

  12. EMSL Strategic Plan 2008

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Allison A.

    2008-08-15

    This Strategic Plan is EMSL’s template for achieving our vision of simultaneous excellence in all aspects of our mission as a national scientific user facility. It reflects our understanding of the long-term stewardship we must work toward to meet the scientific challenges of the Department of Energy and the nation. During the next decade, we will implement the strategies contained in this Plan, working closely with the scientific community, our advisory committees, DOE’s Office of Biological and Environmental Research, and other key stakeholders. This Strategic Plan is fully aligned with the strategic plans of DOE and its Office of Science. We recognize that shifts in science and technology, national priorities, and resources made available through the Federal budget process create planning uncertainties and, ultimately, a highly dynamic planning environment. Accordingly, this Strategic Plan should be viewed as a living document for which we will continually evaluate changing needs and opportunities posed by our stakeholders (i.e., DOE, users, staff, advisory committees), work closely with them to understand and respond to those changes, and align our strategy accordingly.

  13. Two databases derived from BGC-Argo float measurements for marine biogeochemical and bio-optical applications

    Directory of Open Access Journals (Sweden)

    E. Organelli

    2017-11-01

    Since 2012, an array of 105 Biogeochemical-Argo (BGC-Argo) floats has been deployed across the world's oceans to assist in filling observational gaps that are required for characterizing open-ocean environments. Profiles of biogeochemical (chlorophyll and dissolved organic matter) and optical (single-wavelength particulate optical backscattering, downward irradiance at three wavelengths, and photosynthetically available radiation) variables are collected in the upper 1000 m every 1 to 10 days. The database of 9837 vertical profiles collected up to January 2016 is presented and its spatial and temporal coverage is discussed. Each variable is quality controlled with specifically developed procedures and its time series is quality-assessed to identify issues related to biofouling and/or instrument drift. A second database of 5748 profile-derived products within the first optical depth (i.e., the layer of interest for satellite remote sensing) is also presented and its spatiotemporal distribution discussed. This database, devoted to field and remote ocean color applications, includes diffuse attenuation coefficients for downward irradiance at three narrow wavebands and one broad waveband (photosynthetically available radiation), calibrated chlorophyll and fluorescent dissolved organic matter concentrations, and single-wavelength particulate optical backscattering. To demonstrate the applicability of these databases, data within the first optical depth are compared with previously established bio-optical models and used to validate remotely derived bio-optical products. The quality-controlled databases are publicly available from the SEANOE (SEA scieNtific Open data Edition) publisher at https://doi.org/10.17882/49388 and https://doi.org/10.17882/47142 for vertical profiles and products within the first optical depth, respectively.

  14. Two databases derived from BGC-Argo float measurements for marine biogeochemical and bio-optical applications

    Science.gov (United States)

    Organelli, Emanuele; Barbieux, Marie; Claustre, Hervé; Schmechtig, Catherine; Poteau, Antoine; Bricaud, Annick; Boss, Emmanuel; Briggs, Nathan; Dall'Olmo, Giorgio; D'Ortenzio, Fabrizio; Leymarie, Edouard; Mangin, Antoine; Obolensky, Grigor; Penkerc'h, Christophe; Prieur, Louis; Roesler, Collin; Serra, Romain; Uitz, Julia; Xing, Xiaogang

    2017-11-01

    Since 2012, an array of 105 Biogeochemical-Argo (BGC-Argo) floats has been deployed across the world's oceans to assist in filling observational gaps that are required for characterizing open-ocean environments. Profiles of biogeochemical (chlorophyll and dissolved organic matter) and optical (single-wavelength particulate optical backscattering, downward irradiance at three wavelengths, and photosynthetically available radiation) variables are collected in the upper 1000 m every 1 to 10 days. The database of 9837 vertical profiles collected up to January 2016 is presented and its spatial and temporal coverage is discussed. Each variable is quality controlled with specifically developed procedures and its time series is quality-assessed to identify issues related to biofouling and/or instrument drift. A second database of 5748 profile-derived products within the first optical depth (i.e., the layer of interest for satellite remote sensing) is also presented and its spatiotemporal distribution discussed. This database, devoted to field and remote ocean color applications, includes diffuse attenuation coefficients for downward irradiance at three narrow wavebands and one broad waveband (photosynthetically available radiation), calibrated chlorophyll and fluorescent dissolved organic matter concentrations, and single-wavelength particulate optical backscattering. To demonstrate the applicability of these databases, data within the first optical depth are compared with previously established bio-optical models and used to validate remotely derived bio-optical products. The quality-controlled databases are publicly available from the SEANOE (SEA scieNtific Open data Edition) publisher at https://doi.org/10.17882/49388 and https://doi.org/10.17882/47142 for vertical profiles and products within the first optical depth, respectively.
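
    A minimal sketch of one product in these databases: the diffuse attenuation coefficient Kd can be estimated from a float's downward-irradiance profile by log-linear regression, with the first optical depth taken as 1/Kd. The profile values below are invented for illustration, not BGC-Argo data:

    ```python
    # Hypothetical sketch: estimating the diffuse attenuation coefficient Kd(490)
    # from a downward-irradiance profile by log-linear regression, and taking the
    # first optical depth as 1/Kd. Profile values are illustrative, not float data.
    import numpy as np

    depth = np.array([1.0, 3.0, 5.0, 7.0, 10.0, 15.0, 20.0])       # m
    ed490 = np.array([1.20, 1.05, 0.90, 0.78, 0.62, 0.44, 0.31])   # W m-2 nm-1

    # Beer-Lambert decay: Ed(z) = Ed(0-) * exp(-Kd * z)  =>  ln Ed is linear in z
    slope, intercept = np.polyfit(depth, np.log(ed490), 1)
    kd = -slope                        # m-1
    first_optical_depth = 1.0 / kd     # m, the layer seen by ocean-colour satellites

    print(f"Kd(490) ~ {kd:.3f} m-1, first optical depth ~ {first_optical_depth:.1f} m")
    ```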

  15. Enhancement of Radiation Effects by Ursolic Acid in BGC-823 Human Adenocarcinoma Gastric Cancer Cell Line.

    Directory of Open Access Journals (Sweden)

    Yang Yang

    Recent research has suggested that certain plant-derived polyphenols, such as ursolic acid (UA), which are reported to have antitumor activities, might be used to sensitize tumor cells to radiation therapy by inhibiting pathways leading to radiation therapy resistance. This experiment was designed to investigate the effects and possible mechanism of radiosensitization by UA in the BGC-823 human gastric adenocarcinoma cell line in vitro. UA caused cytotoxicity in a dose-dependent manner, and a sub-cytotoxic concentration of UA was used to test radioenhancement efficacy in gastric cancer. Radiosensitivity was determined by a clonogenic survival assay. The surviving fraction of the group treated with irradiation combined with sub-cytotoxic UA decreased significantly compared with the irradiation-only group. The improved radiosensitization efficacy was associated with enhanced G2/M arrest, increased reactive oxygen species (ROS), a down-regulated Ki-67 level and increased apoptosis. In conclusion, as UA demonstrated a potent antiproliferative effect and a synergistic effect with irradiation, it could be used as a potential drug sensitizer in the application of radiotherapy.

  16. Plankton Assemblage Estimated with BGC-Argo Floats in the Southern Ocean: Implications for Seasonal Successions and Particle Export

    Science.gov (United States)

    Rembauville, Mathieu; Briggs, Nathan; Ardyna, Mathieu; Uitz, Julia; Catala, Philippe; Penkerc'h, Cristophe; Poteau, Antoine; Claustre, Hervé; Blain, Stéphane

    2017-10-01

    The Southern Ocean (SO) hosts plankton communities that impact the biogeochemical cycles of the global ocean. However, weather conditions in the SO largely restrict in situ observations of plankton communities to spring and summer, preventing the description of biological successions at an annual scale. Here, we use shipboard observations collected in the Indian sector of the SO to develop a multivariate relationship between physical and bio-optical data and the composition and carbon content of the plankton community. We then apply this multivariate relationship to five biogeochemical Argo (BGC-Argo) floats deployed within the same biogeographical zone as the shipboard observations to describe spatial and seasonal changes in the plankton assemblage. The floats reveal a high contribution of bacteria below the mixed layer, an overall low abundance of picoplankton and a seasonal succession from nano- to microplankton during the spring bloom. The naturally iron-fertilized waters downstream of the Crozet and Kerguelen Plateaus both show elevated phytoplankton biomass in spring and summer, but they differ in being dominated by nanoplankton at Crozet and microplankton at Kerguelen. The estimated plankton group successions appear consistent with independent estimations of particle diameter based on the optical signals. Furthermore, the comparison of the plankton community composition in the surface layer with the presence of large mesopelagic particles diagnosed by spikes of optical signals provides insight into the nature and temporal changes of ecological vectors that drive particle export. This study emphasizes the power of BGC-Argo floats for investigating important biogeochemical processes at high temporal and spatial resolution.
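
    The transfer of the shipboard relationship to float profiles can be illustrated, very loosely, with an ordinary multiple regression; the predictors, plankton group and all values below are synthetic placeholders rather than the study's actual statistical model:

    ```python
    # Hypothetical sketch: relating bio-optical and physical predictors to the
    # carbon content of one plankton group with a multivariate linear model, in
    # the spirit of the shipboard-to-float transfer described above. All arrays
    # are synthetic.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n = 60
    X = np.column_stack([
        rng.uniform(0.05, 2.0, n),    # chlorophyll fluorescence (mg m-3)
        rng.uniform(1e-4, 3e-3, n),   # particulate backscattering bbp(700) (m-1)
        rng.uniform(1.0, 12.0, n),    # temperature (deg C)
    ])
    # Synthetic "shipboard" carbon of one plankton group (mg C m-3)
    y = 8.0 * X[:, 0] + 2500.0 * X[:, 1] - 0.3 * X[:, 2] + rng.normal(0, 0.5, n)

    model = LinearRegression().fit(X, y)
    print("R^2 on training data:", round(model.score(X, y), 2))

    # Apply the fitted relationship to a new float observation (one sample here)
    float_obs = np.array([[0.8, 1.2e-3, 4.5]])
    print("Predicted group carbon:", model.predict(float_obs))
    ```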

  17. Simulation of Forest Carbon Fluxes Using Model Incorporation and Data Assimilation

    OpenAIRE

    Min Yan; Xin Tian; Zengyuan Li; Erxue Chen; Xufeng Wang; Zongtao Han; Hong Sun

    2016-01-01

    This study improved simulation of forest carbon fluxes in the Changbai Mountains with a process-based model (Biome-BGC) using incorporation and data assimilation. Firstly, the original remote sensing-based MODIS MOD_17 GPP (MOD_17) model was optimized using refined input data and biome-specific parameters. The key ecophysiological parameters of the Biome-BGC model were determined through the Extended Fourier Amplitude Sensitivity Test (EFAST) sensitivity analysis. Then the optimized MOD_17 mo...

  18. Simulation of Forest Carbon Fluxes Using Model Incorporation and Data Assimilation

    Directory of Open Access Journals (Sweden)

    Min Yan

    2016-07-01

    This study improved simulation of forest carbon fluxes in the Changbai Mountains with a process-based model (Biome-BGC) using incorporation and data assimilation. Firstly, the original remote sensing-based MODIS MOD_17 GPP (MOD_17) model was optimized using refined input data and biome-specific parameters. The key ecophysiological parameters of the Biome-BGC model were determined through the Extended Fourier Amplitude Sensitivity Test (EFAST) sensitivity analysis. Then the optimized MOD_17 model was used to calibrate the Biome-BGC model by adjusting the sensitive ecophysiological parameters. Once the best match was found for the 10 selected forest plots for the 8-day GPP estimates from the optimized MOD_17 and from the Biome-BGC, the values of the sensitive ecophysiological parameters were determined. The calibrated Biome-BGC model agreed better with the eddy covariance (EC) measurements (R2 = 0.87, RMSE = 1.583 gC·m−2·d−1) than the original model did (R2 = 0.72, RMSE = 2.419 gC·m−2·d−1). To provide a best estimate of the true state of the model, the Ensemble Kalman Filter (EnKF) was used to assimilate five years (of eight-day periods between 2003 and 2007) of Global LAnd Surface Satellite (GLASS) LAI products into the calibrated Biome-BGC model. The results indicated that LAI simulated through the assimilated Biome-BGC agreed well with GLASS LAI. GPP performances obtained from the assimilated Biome-BGC were further improved and verified by EC measurements at the Changbai Mountains forest flux site (R2 = 0.92, RMSE = 1.261 gC·m−2·d−1).
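
    The assimilation step can be illustrated with a toy Ensemble Kalman Filter update of a single LAI state; the ensemble size, error variances and observation below are placeholders, not the configuration used in the study:

    ```python
    # Hypothetical sketch: a single Ensemble Kalman Filter (EnKF) analysis step
    # nudging an ensemble of modeled LAI states toward one GLASS-like LAI
    # observation. Numbers are illustrative; the real system updates Biome-BGC
    # state variables every 8 days.
    import numpy as np

    rng = np.random.default_rng(42)
    n_ens = 30
    lai_forecast = rng.normal(loc=3.2, scale=0.4, size=n_ens)   # ensemble forecast of LAI
    obs = 3.8                                                   # observed LAI
    obs_err_var = 0.2 ** 2                                      # observation error variance

    # Kalman gain from ensemble forecast variance (observation operator H = identity)
    forecast_var = np.var(lai_forecast, ddof=1)
    gain = forecast_var / (forecast_var + obs_err_var)

    # Perturbed-observation EnKF update of each member
    obs_perturbed = obs + rng.normal(0.0, np.sqrt(obs_err_var), size=n_ens)
    lai_analysis = lai_forecast + gain * (obs_perturbed - lai_forecast)

    print(f"forecast mean {lai_forecast.mean():.2f} -> analysis mean {lai_analysis.mean():.2f}")
    ```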

  19. Anti-Proliferation and Anti-Invasion Effects of Diosgenin on Gastric Cancer BGC-823 Cells with HIF-1α shRNAs

    Directory of Open Access Journals (Sweden)

    Yuan-Neng Chou

    2012-05-01

    Drug resistance is a major factor in the limited efficacy of chemotherapy in gastric cancer treatment. Hypoxia-inducible factor-1α (HIF-1α), a central transcriptional factor in hypoxia, is suggested to participate in this resistance. Here, we identified a hypoxia-mimic (cobalt chloride)-sensitive gastric cell line, BGC-823, to explore whether diosgenin, an aglycone of steroidal saponins, can inhibit cancer cell invasion and solid tumor survival in a hypoxia-mimicking microenvironment. We have shown that diosgenin is a potent candidate for decreasing the ability of cobalt chloride-treated BGC-823 cells to invade and survive. In addition, when combined with HIF-1α-specific short hairpin RNA (shRNA), diosgenin can inhibit BGC-823 cells more effectively. The anti-invasion role of diosgenin may be related to E-cadherin, integrin α5 and integrin β6. These results suggest that diosgenin may be a useful compound for controlling gastric cancer cells under hypoxic conditions, especially when combined with down-regulated HIF-1α.

  20. BOREAS RSS-08 BIOME-BGC SSA Simulations of Annual Water and Carbon Fluxes

    Data.gov (United States)

    National Aeronautics and Space Administration — Derived maps of landcover type and crown and stem biomass as model inputs to determine annual evapotranspiration, gross primary production, autotrophic respiration...

  1. Hydrography and biogeochemistry dedicated to the Mediterranean BGC-Argo network during a cruise with RV Tethys 2 in May 2015

    Directory of Open Access Journals (Sweden)

    V. Taillandier

    2018-03-01

    We report on data from an oceanographic cruise, covering western, central and eastern parts of the Mediterranean Sea, on the French research vessel Tethys 2 in May 2015. This cruise was fully dedicated to the maintenance and the metrological verification of a biogeochemical observing system based on a fleet of BGC-Argo floats. During the cruise, a comprehensive data set of parameters sensed by the autonomous network was collected. The measurements include ocean currents, seawater salinity and temperature, and concentrations of inorganic nutrients, dissolved oxygen and chlorophyll pigments. The analytical protocols and data processing methods are detailed, together with a first assessment of the calibration state for all the sensors deployed during the cruise. Data collected at stations are available at https://doi.org/10.17882/51678 and data collected along the ship track are available at https://doi.org/10.17882/51691.

  2. Hydrography and biogeochemistry dedicated to the Mediterranean BGC-Argo network during a cruise with RV Tethys 2 in May 2015

    Science.gov (United States)

    Taillandier, Vincent; Wagener, Thibaut; D'Ortenzio, Fabrizio; Mayot, Nicolas; Legoff, Hervé; Ras, Joséphine; Coppola, Laurent; Pasqueron de Fommervault, Orens; Schmechtig, Catherine; Diamond, Emilie; Bittig, Henry; Lefevre, Dominique; Leymarie, Edouard; Poteau, Antoine; Prieur, Louis

    2018-03-01

    We report on data from an oceanographic cruise, covering western, central and eastern parts of the Mediterranean Sea, on the French research vessel Tethys 2 in May 2015. This cruise was fully dedicated to the maintenance and the metrological verification of a biogeochemical observing system based on a fleet of BGC-Argo floats. During the cruise, a comprehensive data set of parameters sensed by the autonomous network was collected. The measurements include ocean currents, seawater salinity and temperature, and concentrations of inorganic nutrients, dissolved oxygen and chlorophyll pigments. The analytical protocols and data processing methods are detailed, together with a first assessment of the calibration state for all the sensors deployed during the cruise. Data collected at stations are available at https://doi.org/10.17882/51678 and data collected along the ship track are available at https://doi.org/10.17882/51691.

  3. Effect of orexin A on apoptosis in BGC-823 gastric cancer cells via OX1R through the AKT signaling pathway.

    Science.gov (United States)

    Wen, Jing; Zhao, Yuyan; Shen, Yang; Guo, Lei

    2015-05-01

    Orexins are a class of peptides involved in the regulation of food intake, energy homeostasis, the sleep-wake cycle and gastrointestinal function. Recent studies have demonstrated that orexin A may influence apoptosis and proliferation in numerous types of cancer cells. However, the effect of orexin A on gastric cancer cells and its mechanisms of action remain elusive. In the present study, BGC-823 gastric cancer cells were treated with orexin A (10^-10 to 10^-6 M) in vitro and the expression levels of orexin receptor 1 (OX1R) protein in the cells were then determined. The proliferation, viability and apoptosis of BGC-823 cells were detected. In addition, BGC-823 cells were treated with the AKT inhibitor PF-04691502 or the OX1R-specific antagonist SB334867 in combination with orexin A, in order to examine the activation of AKT and caspase-3. The results showed that orexin A (10^-10 to 10^-6 M) stimulated OX1R protein expression in BGC-823 cells, which improved the proliferation and viability of the cells as well as protected them from apoptosis. Phosphorylated AKT protein was significantly increased in BGC-823 cells following treatment with orexin A. Moreover, 10^-8 M orexin A reduced the proapoptotic activity of caspase-3 (by ≤30%). The OX1R antagonist SB334867 (10^-6 M) and the AKT antagonist PF-04691502 (10^-6 M), when used individually or in combination, abolished the effect of orexin A (10^-8 M) on BGC-823 cells. In conclusion, the results of the present study demonstrated that orexin A inhibited gastric cancer cell apoptosis via OX1R through the AKT signaling pathway.

  4. Modeling Gross Primary Production of Agro-Forestry Ecosystems by Assimilation of Satellite-Derived Information in a Process-Based Model

    Directory of Open Access Journals (Sweden)

    Guenther Seufert

    2009-02-01

    In this paper we present results obtained in the framework of a regional-scale analysis of the carbon budget of poplar plantations in Northern Italy. We explored the ability of the process-based model BIOME-BGC to estimate the gross primary production (GPP) using an inverse modeling approach exploiting eddy covariance and satellite data. We firstly present a version of BIOME-BGC coupled with the radiative transfer models PROSPECT and SAILH (named PROSAILH-BGC) with the aims of (i) improving the BIOME-BGC description of the radiative transfer regime within the canopy and (ii) allowing the assimilation of remotely sensed vegetation index time series, such as MODIS NDVI, into the model. Secondly, we present a two-step model inversion for optimization of model parameters. In the first step, some key ecophysiological parameters were optimized against data collected by an eddy covariance flux tower. In the second step, important information about phenological dates and about standing biomass was optimized against MODIS NDVI. Results obtained showed that PROSAILH-BGC allowed simulation of MODIS NDVI with good accuracy and that the canopy radiation regime was better described. The inverse modeling approach was demonstrated to be useful for the optimization of ecophysiological model parameters, phenological dates and parameters related to the standing biomass, allowing good accuracy of daily and annual GPP predictions. In summary, this study showed that assimilation of eddy covariance and remote sensing data in a process model may provide important information for modeling gross primary production at regional scale.
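
    The second inversion step can be sketched as a minimization of the mismatch between simulated and observed NDVI; here a simple logistic green-up curve stands in for the full PROSAILH-BGC simulation, and all parameter values are illustrative:

    ```python
    # Hypothetical sketch of the inverse step: adjust phenology-like parameters of a
    # toy NDVI model so that it matches a MODIS-like NDVI series, minimizing RMSE.
    # The logistic green-up curve stands in for the full PROSAILH-BGC simulation.
    import numpy as np
    from scipy.optimize import minimize

    doy = np.arange(1, 366, 8.0)                                  # 8-day composites
    ndvi_obs = 0.25 + 0.5 / (1.0 + np.exp(-0.08 * (doy - 130)))   # synthetic "MODIS" NDVI

    def toy_ndvi(params, doy):
        base, amplitude, rate, onset = params
        return base + amplitude / (1.0 + np.exp(-rate * (doy - onset)))

    def rmse(params):
        return np.sqrt(np.mean((toy_ndvi(params, doy) - ndvi_obs) ** 2))

    result = minimize(rmse, x0=[0.2, 0.4, 0.05, 150.0], method="Nelder-Mead")
    print("optimized (base, amplitude, rate, onset):", np.round(result.x, 3))
    print("final RMSE:", round(result.fun, 4))
    ```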

  5. Attribution of changes in global wetland methane emissions from pre-industrial to present using CLM4.5-BGC

    International Nuclear Information System (INIS)

    Paudel, Rajendra; Mahowald, Natalie M; Hess, Peter G M; Meng, Lei; Riley, William J

    2016-01-01

    An understanding of potential factors controlling methane emissions from natural wetlands is important to accurately project future atmospheric methane concentrations. Here, we examine the relative contributions of climatic and environmental factors, such as precipitation, temperature, atmospheric CO2 concentration, nitrogen deposition, wetland inundation extent, and land-use and land-cover change, to changes in wetland methane emissions from preindustrial to present day (i.e., 1850–2005). We apply a mechanistic methane biogeochemical model integrated in the Community Land Model version 4.5 (CLM4.5), the land component of the Community Earth System Model. The methane model explicitly simulates methane production, oxidation, ebullition, transport through aerenchyma of plants, and aqueous and gaseous diffusion. We conduct a suite of model simulations from 1850 to 2005, with all changes in environmental factors included, and sensitivity studies isolating each factor. Globally, we estimate that preindustrial methane emissions were 10% higher than present-day emissions from natural wetlands, with emission changes from preindustrial to present of +15%, −41%, and −11% for the high latitudes, temperate regions, and tropics, respectively. The most important change is due to the estimated change in wetland extent, caused by the conversion of wetland areas to drylands by humans. This effect alone leads to preindustrial global methane fluxes 33% higher than present, with the largest change in temperate regions (+80%). These increases were partially offset by lower preindustrial emissions due to lower CO2 levels (10%), shifts in precipitation (7%), lower nitrogen deposition (3%), and changes in land use and land cover (2%). Cooler preindustrial temperatures resulted, in our simulations, in an increase in global methane emissions of 6% relative to present day. Much of the sensitivity to these perturbations is mediated in the model by
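
    The attribution logic, differencing single-factor sensitivity runs against a control, can be sketched as simple bookkeeping; the fluxes below are placeholders, not CLM4.5-BGC output or the percentages quoted above:

    ```python
    # Hypothetical sketch of factor attribution by differencing paired simulations:
    # each sensitivity run holds one driver at its preindustrial value while the
    # control uses present-day drivers. Fluxes are placeholders, not model output.
    control_flux = 180.0   # Tg CH4 yr-1, "all present-day drivers" run (placeholder)

    single_factor_runs = {           # same placeholder units
        "wetland_extent_preindustrial": 230.0,
        "co2_preindustrial": 165.0,
        "precipitation_preindustrial": 170.0,
    }

    for name, flux in single_factor_runs.items():
        contribution = flux - control_flux
        print(f"{name}: {contribution:+.1f} Tg CH4 yr-1 "
              f"({100.0 * contribution / control_flux:+.1f}% of control)")
    ```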

  6. Assessing the ability of three land ecosystem models to simulate gross carbon uptake of forests from boreal to Mediterranean climate in Europe

    NARCIS (Netherlands)

    Jung, M.; Le Maire, Guerric; Zaehle, S.; Luyssaert, S.; Vetter, M.; Churkina, G.; Ciais, P.; Viovy, N.; Reichstein, M.

    2007-01-01

    Three terrestrial biosphere models (LPJ, Orchidee, Biome-BGC) were evaluated with respect to their ability to simulate large-scale climate related trends in gross primary production (GPP) across European forests. Simulated GPP and leaf area index (LAI) were compared with GPP estimates based on flux

  7. Theory, Modeling and Simulation: Research progress report 1994--1995

    Energy Technology Data Exchange (ETDEWEB)

    Garrett, B.C.; Dixon, D.A.; Dunning, T.H.

    1997-01-01

    The Pacific Northwest National Laboratory (PNNL) has established the Environmental Molecular Sciences Laboratory (EMSL). In April 1994, construction began on the new EMSL, a collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation (TM and S) program will play a critical role in understanding molecular processes important in restoring DOE's research, development, and production sites, including understanding the migration and reactions of contaminants in soils and ground water, developing processes for isolation and processing of pollutants, developing improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TM and S program are fivefold: to apply available electronic structure and dynamics techniques to study fundamental molecular processes involved in the chemistry of natural and contaminated systems; to extend current electronic structure and dynamics techniques to treat molecular systems of future importance and to develop new techniques for addressing problems that are computationally intractable at present; to apply available molecular modeling techniques to simulate molecular processes occurring in the multi-species, multi-phase systems characteristic of natural and polluted environments; to extend current molecular modeling techniques to treat ever more complex molecular systems and to improve the reliability and accuracy of such simulations; and to develop technologies for advanced parallel architectural computer systems. Research highlights of 82 projects are given.

  8. Using a spatially-distributed hydrologic biogeochemistry model to study the spatial variation of carbon processes in a Critical Zone Observatory

    Science.gov (United States)

    Shi, Y.; Eissenstat, D. M.; Davis, K. J.; He, Y.

    2016-12-01

    Forest carbon processes are affected by, among other factors, soil moisture, soil temperature, soil nutrients and solar radiation. Most of the current biogeochemical models are 1-D and represent one point in space. Therefore, they cannot resolve the topographically driven hill-slope land surface heterogeneity or the spatial pattern of nutrient availability. A spatially distributed forest ecosystem model, Flux-PIHM-BGC, has been developed by coupling a 1-D mechanistic biogeochemical model, Biome-BGC (BBGC), with a spatially distributed land surface hydrologic model, Flux-PIHM. Flux-PIHM is a coupled physically based model, which incorporates a land-surface scheme into the Penn State Integrated Hydrologic Model (PIHM). The land surface scheme is adapted from the Noah land surface model. Flux-PIHM is able to represent the link between groundwater and the surface energy balance, as well as the land surface heterogeneities caused by topography. In the coupled Flux-PIHM-BGC model, each Flux-PIHM model grid couples a 1-D BBGC model, while soil nitrogen is transported among model grids via subsurface water flow. In each grid, Flux-PIHM provides BBGC with soil moisture, soil temperature, and solar radiation information, while BBGC provides Flux-PIHM with leaf area index. The coupled Flux-PIHM-BGC model has been implemented at the Susquehanna/Shale Hills critical zone observatory (SSHCZO). Model results suggest that the vegetation and soil carbon distribution is primarily constrained by nitrogen availability (affected by nitrogen transport via topographically driven subsurface flow), and also constrained by solar radiation and root zone soil moisture. The predicted vegetation and soil carbon distribution generally agrees with the macro pattern observed within the watershed. The coupled ecosystem-hydrologic model provides an important tool to study the impact of topography on watershed carbon processes, as well as the impact of climate change on water resources.
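
    The per-grid exchange described above (abiotic drivers passed to the biogeochemical column, leaf area index returned, and mineral nitrogen moved laterally between grids) can be sketched with toy classes; none of the names or numbers below correspond to the actual Flux-PIHM or Biome-BGC interfaces:

    ```python
    # Minimal runnable sketch of a Flux-PIHM <-> BBGC style coupling: the hydrologic
    # side supplies abiotic drivers, the biogeochemical side returns LAI, and mineral
    # nitrogen is mixed laterally between cells as a stand-in for advective transport.
    from dataclasses import dataclass

    @dataclass
    class HydroCell:
        """One Flux-PIHM-like grid cell (abiotic state)."""
        soil_moisture: float      # m3 m-3
        soil_temperature: float   # deg C
        solar_radiation: float    # W m-2
        leaf_area_index: float = 0.0

    @dataclass
    class BGCColumn:
        """One Biome-BGC-like column (toy carbon/nitrogen state)."""
        leaf_carbon: float        # kg C m-2
        soil_mineral_n: float     # kg N m-2

        def step(self, soil_moisture, soil_temperature, solar_radiation):
            # Toy growth: wetter, warmer, brighter cells fix more carbon,
            # limited by the mineral nitrogen available in this column.
            n_limit = min(self.soil_mineral_n / 0.01, 1.0)
            growth = 1e-5 * soil_moisture * max(soil_temperature, 0.0) * solar_radiation * n_limit
            self.leaf_carbon += growth
            self.soil_mineral_n = max(self.soil_mineral_n - 0.1 * growth, 0.0)
            return 30.0 * self.leaf_carbon   # crude LAI from leaf carbon

    hydro = [HydroCell(0.30, 12.0, 250.0), HydroCell(0.20, 10.0, 300.0)]
    bgc = [BGCColumn(0.05, 0.012), BGCColumn(0.05, 0.004)]

    for _ in range(10):                                  # ten coupling steps
        for cell, column in zip(hydro, bgc):
            # Flux-PIHM -> BBGC: abiotic drivers; BBGC -> Flux-PIHM: leaf area index
            cell.leaf_area_index = column.step(cell.soil_moisture,
                                               cell.soil_temperature,
                                               cell.solar_radiation)
        # Stand-in for lateral nitrogen transport: mix mineral N between cells
        mean_n = sum(c.soil_mineral_n for c in bgc) / len(bgc)
        for column in bgc:
            column.soil_mineral_n += 0.2 * (mean_n - column.soil_mineral_n)

    print([round(cell.leaf_area_index, 2) for cell in hydro])
    ```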

  9. Theory, modeling and simulation: Annual report 1993

    International Nuclear Information System (INIS)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  10. Theory, modeling and simulation: Annual report 1993

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, T.H. Jr.; Garrett, B.C.

    1994-07-01

    Developing the knowledge base needed to address the environmental restoration issues of the US Department of Energy requires a fundamental understanding of molecules and their interactions in isolation and in liquids, on surfaces, and at interfaces. To meet these needs, the PNL has established the Environmental and Molecular Sciences Laboratory (EMSL) and will soon begin construction of a new, collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation program (TMS), which is one of seven research directorates in the EMSL, will play a critical role in understanding molecular processes important in restoring DOE's research, development and production sites, including understanding the migration and reactions of contaminants in soils and groundwater, the development of separation processes for isolation of pollutants, the development of improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TMS program are to apply available techniques to study fundamental molecular processes involved in natural and contaminated systems; to extend current techniques to treat molecular systems of future importance and to develop techniques for addressing problems that are computationally intractable at present; to apply molecular modeling techniques to simulate molecular processes occurring in the multispecies, multiphase systems characteristic of natural and polluted environments; and to extend current molecular modeling techniques to treat complex molecular systems and to improve the reliability and accuracy of such simulations. The program contains three research activities: Molecular Theory/Modeling, Solid State Theory, and Biomolecular Modeling/Simulation. Extended abstracts are presented for 89 studies.

  11. Using a spatially-distributed hydrologic biogeochemistry model with nitrogen transport to study the spatial variation of carbon stocks and fluxes in a Critical Zone Observatory

    Science.gov (United States)

    Shi, Y.; Eissenstat, D. M.; He, Y.; Davis, K. J.

    2017-12-01

    Most current biogeochemical models are 1-D and represent one point in space. Therefore, they cannot resolve topographically driven land surface heterogeneity (e.g., lateral water flow, soil moisture, soil temperature, solar radiation) or the spatial pattern of nutrient availability. A spatially distributed forest biogeochemical model with nitrogen transport, Flux-PIHM-BGC, has been developed by coupling a 1-D mechanistic biogeochemical model Biome-BGC (BBGC) with a spatially distributed land surface hydrologic model, Flux-PIHM, and adding an advection dominated nitrogen transport module. Flux-PIHM is a coupled physically based model, which incorporates a land-surface scheme into the Penn State Integrated Hydrologic Model (PIHM). The land surface scheme is adapted from the Noah land surface model, and is augmented by adding a topographic solar radiation module. Flux-PIHM is able to represent the link between groundwater and the surface energy balance, as well as land surface heterogeneities caused by topography. In the coupled Flux-PIHM-BGC model, each Flux-PIHM model grid couples a 1-D BBGC model, while nitrogen is transported among model grids via surface and subsurface water flow. In each grid, Flux-PIHM provides BBGC with soil moisture, soil temperature, and solar radiation, while BBGC provides Flux-PIHM with spatially-distributed leaf area index. The coupled Flux-PIHM-BGC model has been implemented at the Susquehanna/Shale Hills Critical Zone Observatory. The model-predicted aboveground vegetation carbon and soil carbon distributions generally agree with the macro patterns observed within the watershed. The importance of abiotic variables (including soil moisture, soil temperature, solar radiation, and soil mineral nitrogen) in predicting aboveground carbon distribution is calculated using a random forest. The result suggests that the spatial pattern of aboveground carbon is controlled by the distribution of soil mineral nitrogen. A Flux-PIHM-BGC simulation
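
    The random-forest importance ranking mentioned above can be illustrated with scikit-learn on synthetic per-grid data, generated so that mineral nitrogen dominates; the variables and values are placeholders, not the watershed data:

    ```python
    # Hypothetical sketch: ranking abiotic drivers of aboveground carbon with a
    # random forest, as in the variable-importance analysis described above.
    # The per-grid "data" are synthetic.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)
    n = 300
    soil_moisture = rng.uniform(0.1, 0.45, n)
    soil_temperature = rng.uniform(6.0, 14.0, n)
    solar_radiation = rng.uniform(120.0, 280.0, n)
    soil_mineral_n = rng.uniform(0.001, 0.02, n)

    aboveground_c = (900.0 * soil_mineral_n           # dominant driver in this toy setup
                     + 0.02 * solar_radiation
                     + 5.0 * soil_moisture
                     + rng.normal(0.0, 0.5, n))

    X = np.column_stack([soil_moisture, soil_temperature, solar_radiation, soil_mineral_n])
    names = ["soil_moisture", "soil_temperature", "solar_radiation", "soil_mineral_n"]

    rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, aboveground_c)
    for name, imp in sorted(zip(names, rf.feature_importances_), key=lambda p: -p[1]):
        print(f"{name:>16s}: {imp:.2f}")
    ```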

  12. Comparing Productivity Simulated with Inventory Data Using Different Modelling Technologies

    Science.gov (United States)

    Klopf, M.; Pietsch, S. A.; Hasenauer, H.

    2009-04-01

    The Lime Stone National Park in Austria was established in 1997 to protect sensitive limestone soils from degradation due to heavy forest management. Since 1997 the management activities have been successively reduced, standing volume and coarse woody debris (CWD) have increased, and degraded soils have begun to recover. One option to study the rehabilitation process towards a natural virgin forest state is the use of modelling technology. In this study we test two different modelling approaches for their applicability to Lime Stone National Park. We compare standing tree volume simulated with (i) the individual tree growth model MOSES, and (ii) the species- and management-sensitive adaptation of the biogeochemical-mechanistic model Biome-BGC. The results from the two models are compared with field observations from repeated permanent forest inventory plots of the Lime Stone National Park in Austria. The simulated CWD predictions of the BGC model were compared with dead wood measurements (standing and lying dead wood) recorded at the permanent inventory plots. The inventory was established between 1994 and 1996 and remeasured from 2004 to 2005. For this analysis 40 plots of this inventory were selected which comprise the required dead wood components and are dominated by a single tree species. First we used the distance-dependent individual tree growth model MOSES to derive the standing timber and the amount of mortality per hectare. MOSES is initialized with the inventory data at plot establishment and each sampling plot is treated as a forest stand. Biome-BGC is a process-based biogeochemical model with extensions for Austrian tree species, a self-initialization and a forest management tool. The initialization for the actual simulations with the BGC model was done as follows: We first used spin-up runs to derive a balanced forest vegetation, similar to an undisturbed forest. Next we considered the management history of the past centuries (heavy clear cuts

  13. Moderate forest disturbance as a stringent test for gap and big-leaf models

    Science.gov (United States)

    Bond-Lamberty, B.; Fisk, J. P.; Holm, J. A.; Bailey, V.; Bohrer, G.; Gough, C. M.

    2015-01-01

    Disturbance-induced tree mortality is a key factor regulating the carbon balance of a forest, but tree mortality and its subsequent effects are poorly represented processes in terrestrial ecosystem models. It is thus unclear whether models can robustly simulate moderate (non-catastrophic) disturbances, which tend to increase biological and structural complexity and are increasingly common in aging US forests. We tested whether three forest ecosystem models - Biome-BGC (BioGeochemical Cycles), a classic big-leaf model, and the ZELIG and ED (Ecosystem Demography) gap-oriented models - could reproduce the resilience to moderate disturbance observed in an experimentally manipulated forest (the Forest Accelerated Succession Experiment in northern Michigan, USA, in which 38% of canopy dominants were stem girdled and compared to control plots). Each model was parameterized, spun up, and disturbed following similar protocols and run for 5 years post-disturbance. The models replicated observed declines in aboveground biomass well. Biome-BGC captured the timing and rebound of observed leaf area index (LAI), while ZELIG and ED correctly estimated the magnitude of LAI decline. None of the models fully captured the observed post-disturbance C fluxes, in particular gross primary production or net primary production (NPP). Biome-BGC NPP was correctly resilient but for the wrong reasons, and could not match the absolute observational values. ZELIG and ED, in contrast, exhibited large, unobserved drops in NPP and net ecosystem production. The biological mechanisms proposed to explain the observed rapid resilience of the C cycle are typically not incorporated by these or other models. It is thus an open question whether most ecosystem models will simulate correctly the gradual and less extensive tree mortality characteristic of moderate disturbances.

  14. Biogeochemical protocols and diagnostics for the CMIP6 Ocean Model Intercomparison Project (OMIP

    Directory of Open Access Journals (Sweden)

    J. C. Orr

    2017-06-01

    The Ocean Model Intercomparison Project (OMIP) focuses on the physics and biogeochemistry of the ocean component of Earth system models participating in the sixth phase of the Coupled Model Intercomparison Project (CMIP6). OMIP aims to provide standard protocols and diagnostics for ocean models, while offering a forum to promote their common assessment and improvement. It also offers to compare solutions of the same ocean models when forced with reanalysis data (OMIP simulations) vs. when integrated within fully coupled Earth system models (CMIP6). Here we detail simulation protocols and diagnostics for OMIP's biogeochemical and inert chemical tracers. These passive-tracer simulations will be coupled to ocean circulation models, initialized with observational data or output from a model spin-up, and forced by repeating the 1948–2009 surface fluxes of heat, fresh water, and momentum. These so-called OMIP-BGC simulations include three inert chemical tracers (CFC-11, CFC-12, SF6) and biogeochemical tracers (e.g., dissolved inorganic carbon, carbon isotopes, alkalinity, nutrients, and oxygen). Modelers will use their preferred prognostic BGC model but should follow common guidelines for gas exchange and carbonate chemistry. Simulations include both natural and total carbon tracers. The required forced simulation (omip1) will be initialized with gridded observational climatologies. An optional forced simulation (omip1-spunup) will be initialized instead with BGC fields from a long model spin-up, preferably for 2000 years or more, and forced by repeating the same 62-year meteorological forcing. That optional run will also include abiotic tracers of total dissolved inorganic carbon and radiocarbon, CTabio and 14CTabio, to assess deep-ocean ventilation and distinguish the role of physics vs. biology. These simulations will be forced by observed atmospheric histories of the three inert gases and CO2 as well as carbon isotope ratios of CO2. OMIP-BGC simulation

  15. Biogeochemical protocols and diagnostics for the CMIP6 Ocean Model Intercomparison Project (OMIP)

    Science.gov (United States)

    Orr, James C.; Najjar, Raymond G.; Aumont, Olivier; Bopp, Laurent; Bullister, John L.; Danabasoglu, Gokhan; Doney, Scott C.; Dunne, John P.; Dutay, Jean-Claude; Graven, Heather; Griffies, Stephen M.; John, Jasmin G.; Joos, Fortunat; Levin, Ingeborg; Lindsay, Keith; Matear, Richard J.; McKinley, Galen A.; Mouchet, Anne; Oschlies, Andreas; Romanou, Anastasia; Schlitzer, Reiner; Tagliabue, Alessandro; Tanhua, Toste; Yool, Andrew

    2017-06-01

    The Ocean Model Intercomparison Project (OMIP) focuses on the physics and biogeochemistry of the ocean component of Earth system models participating in the sixth phase of the Coupled Model Intercomparison Project (CMIP6). OMIP aims to provide standard protocols and diagnostics for ocean models, while offering a forum to promote their common assessment and improvement. It also offers to compare solutions of the same ocean models when forced with reanalysis data (OMIP simulations) vs. when integrated within fully coupled Earth system models (CMIP6). Here we detail simulation protocols and diagnostics for OMIP's biogeochemical and inert chemical tracers. These passive-tracer simulations will be coupled to ocean circulation models, initialized with observational data or output from a model spin-up, and forced by repeating the 1948-2009 surface fluxes of heat, fresh water, and momentum. These so-called OMIP-BGC simulations include three inert chemical tracers (CFC-11, CFC-12, SF6) and biogeochemical tracers (e.g., dissolved inorganic carbon, carbon isotopes, alkalinity, nutrients, and oxygen). Modelers will use their preferred prognostic BGC model but should follow common guidelines for gas exchange and carbonate chemistry. Simulations include both natural and total carbon tracers. The required forced simulation (omip1) will be initialized with gridded observational climatologies. An optional forced simulation (omip1-spunup) will be initialized instead with BGC fields from a long model spin-up, preferably for 2000 years or more, and forced by repeating the same 62-year meteorological forcing. That optional run will also include abiotic tracers of total dissolved inorganic carbon and radiocarbon, CTabio and 14CTabio, to assess deep-ocean ventilation and distinguish the role of physics vs. biology. These simulations will be forced by observed atmospheric histories of the three inert gases and CO2 as well as carbon isotope ratios of CO2. OMIP-BGC simulation protocols are
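
    The common gas-exchange guideline can be illustrated with a bulk flux formula using a quadratic wind-speed (Wanninkhof-type) transfer velocity; the coefficients below are indicative textbook values, not the exact constants prescribed by the OMIP-BGC protocol:

    ```python
    # Hypothetical sketch of a bulk air-sea CO2 flux with a quadratic wind-speed
    # (Wanninkhof-type) gas transfer velocity. Coefficients are indicative only.
    def schmidt_number_co2(sst_celsius):
        """Polynomial fit of the CO2 Schmidt number in seawater (approximate)."""
        t = sst_celsius
        return 2116.8 - 136.25 * t + 4.7353 * t**2 - 0.092307 * t**3 + 0.0007555 * t**4

    def co2_flux(wind_speed, sst_celsius, pco2_ocean, pco2_atm, solubility):
        """Flux (mol m-2 s-1), positive out of the ocean.

        wind_speed : 10 m wind speed (m s-1)
        pco2_*     : partial pressures (uatm)
        solubility : CO2 solubility K0 (mol m-3 uatm-1), assumed given
        """
        sc = schmidt_number_co2(sst_celsius)
        k_cm_per_hr = 0.251 * wind_speed**2 * (660.0 / sc) ** 0.5   # transfer velocity
        k_m_per_s = k_cm_per_hr / 100.0 / 3600.0
        return k_m_per_s * solubility * (pco2_ocean - pco2_atm)

    print(co2_flux(wind_speed=7.0, sst_celsius=15.0,
                   pco2_ocean=380.0, pco2_atm=400.0, solubility=3.3e-5))
    ```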

  16. Comparison of phenology models for predicting the onset of growing season over the Northern Hemisphere.

    Directory of Open Access Journals (Sweden)

    Yang Fu

    Vegetation phenology models are important for examining the impact of climate change on the length of the growing season and carbon cycles in terrestrial ecosystems. However, large uncertainties in present phenology models make accurate assessment of the beginning of the growing season (BGS) a challenge. In this study, based on the satellite-based phenology product (i.e. the V005 MODIS Land Cover Dynamics (MCD12Q2) product), we calibrated four phenology models, compared their relative strength to predict vegetation phenology; and assessed the spatial pattern and interannual variability of BGS in the Northern Hemisphere. The results indicated that parameter calibration significantly influences the models' accuracy. All models showed good performance in cool regions but poor performance in warm regions. On average, they explained about 67% (the Growing Degree Day model), 79% (the Biome-BGC phenology model), 73% (the Number of Growing Days model) and 68% (the Number of Chilling Days-Growing Degree Day model) of the BGS variations over the Northern Hemisphere. There were substantial differences in BGS simulations among the four phenology models. Overall, the Biome-BGC phenology model performed best in predicting the BGS, and showed low biases in most boreal and cool regions. Compared with the other three models, the two-phase phenology model (NCD-GDD) showed the lowest correlation and largest biases with the MODIS phenology product, although it could catch the interannual variations well for some vegetation types. Our study highlights the need for further improvements by integrating the effects of water availability, especially for plants growing in low latitudes, and the physiological adaptation of plants into phenology models.

  17. Comparison of phenology models for predicting the onset of growing season over the Northern Hemisphere.

    Science.gov (United States)

    Fu, Yang; Zhang, Haicheng; Dong, Wenjie; Yuan, Wenping

    2014-01-01

    Vegetation phenology models are important for examining the impact of climate change on the length of the growing season and carbon cycles in terrestrial ecosystems. However, large uncertainties in present phenology models make accurate assessment of the beginning of the growing season (BGS) a challenge. In this study, based on the satellite-based phenology product (i.e. the V005 MODIS Land Cover Dynamics (MCD12Q2) product), we calibrated four phenology models, compared their relative strength to predict vegetation phenology; and assessed the spatial pattern and interannual variability of BGS in the Northern Hemisphere. The results indicated that parameter calibration significantly influences the models' accuracy. All models showed good performance in cool regions but poor performance in warm regions. On average, they explained about 67% (the Growing Degree Day model), 79% (the Biome-BGC phenology model), 73% (the Number of Growing Days model) and 68% (the Number of Chilling Days-Growing Degree Day model) of the BGS variations over the Northern Hemisphere. There were substantial differences in BGS simulations among the four phenology models. Overall, the Biome-BGC phenology model performed best in predicting the BGS, and showed low biases in most boreal and cool regions. Compared with the other three models, the two-phase phenology model (NCD-GDD) showed the lowest correlation and largest biases with the MODIS phenology product, although it could catch the interannual variations well for some vegetation types. Our study highlights the need for further improvements by integrating the effects of water availability, especially for plants growing in low latitudes, and the physiological adaptation of plants into phenology models.
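
    The simplest of the compared schemes, the Growing Degree Day model, can be sketched as an accumulation rule in which BGS is the first day the heat sum above a base temperature crosses a forcing threshold; the base temperature, threshold and temperature series below are illustrative, not the calibrated values from the study:

    ```python
    # Hypothetical sketch of a Growing Degree Day (GDD) onset rule: the beginning of
    # the growing season (BGS) is the first day on which heat accumulated above a
    # base temperature exceeds a forcing threshold. All parameter values are toys.
    import numpy as np

    def bgs_from_gdd(daily_tmean, base_temp=0.0, start_doy=1, threshold=150.0):
        """Return the day of year at which cumulative GDD first exceeds the threshold."""
        gdd = 0.0
        for doy, t in enumerate(daily_tmean, start=1):
            if doy < start_doy:
                continue
            gdd += max(t - base_temp, 0.0)
            if gdd >= threshold:
                return doy
        return None   # threshold never reached (e.g., a very cold site)

    # Synthetic annual temperature cycle for a cool site
    doy = np.arange(1, 366)
    tmean = -8.0 + 18.0 * np.sin(2.0 * np.pi * (doy - 105) / 365.0)
    print("Predicted BGS (day of year):", bgs_from_gdd(tmean, base_temp=0.0, threshold=150.0))
    ```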

  18. Use of remote-sensing reflectance to constrain a data assimilating marine biogeochemical model of the Great Barrier Reef

    Science.gov (United States)

    Jones, Emlyn M.; Baird, Mark E.; Mongin, Mathieu; Parslow, John; Skerratt, Jenny; Lovell, Jenny; Margvelashvili, Nugzar; Matear, Richard J.; Wild-Allen, Karen; Robson, Barbara; Rizwi, Farhan; Oke, Peter; King, Edward; Schroeder, Thomas; Steven, Andy; Taylor, John

    2016-12-01

    Skillful marine biogeochemical (BGC) models are required to understand a range of coastal and global phenomena such as changes in nitrogen and carbon cycles. The refinement of BGC models through the assimilation of variables calculated from observed in-water inherent optical properties (IOPs), such as phytoplankton absorption, is problematic. Empirically derived relationships between IOPs and variables such as chlorophyll-a concentration (Chl a), total suspended solids (TSS) and coloured dissolved organic matter (CDOM) have been shown to have errors that can exceed 100 % of the observed quantity. These errors are greatest in shallow coastal regions, such as the Great Barrier Reef (GBR), due to the additional signal from bottom reflectance. Rather than assimilate quantities calculated using IOP algorithms, this study demonstrates the advantages of assimilating quantities calculated directly from the less error-prone satellite remote-sensing reflectance (RSR). To assimilate the observed RSR, we use an in-water optical model to produce an equivalent simulated RSR and calculate the mismatch between the observed and simulated quantities to constrain the BGC model with a deterministic ensemble Kalman filter (DEnKF). The traditional assumption that simulated surface Chl a is equivalent to the remotely sensed OC3M estimate of Chl a resulted in a forecast error of approximately 75 %. We show this error can be halved by instead using simulated RSR to constrain the model via the assimilation system. When the analysis and forecast fields from the RSR-based assimilation system are compared with the non-assimilating model, a comparison against independent in situ observations of Chl a, TSS and dissolved inorganic nutrients (NO3, NH4 and DIP) showed that errors are reduced by up to 90 %. In all cases, the assimilation system improves the simulation compared to the non-assimilating model. Our approach allows for the incorporation of vast quantities of remote-sensing observations
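
    The core of the approach is to work in observation space: the in-water optical model maps the BGC state to a simulated RSR, and the filter acts on the observed-minus-simulated mismatch. A minimal sketch of forming those innovations is given below; the band set, reflectances and error levels are placeholders, not the Great Barrier Reef configuration:

    ```python
    # Hypothetical sketch: forming the observation-space mismatch (innovation)
    # between observed and simulated multi-band remote-sensing reflectance, the
    # quantity a DEnKF-style scheme uses to update the BGC state. All values are
    # placeholders.
    import numpy as np

    bands_nm = np.array([443, 488, 531, 547, 667])
    rsr_observed = np.array([0.0062, 0.0071, 0.0065, 0.0058, 0.0009])    # sr-1
    rsr_simulated = np.array([0.0055, 0.0068, 0.0070, 0.0061, 0.0012])   # from optical model
    obs_error_sd = np.full_like(rsr_observed, 0.0005)

    innovation = rsr_observed - rsr_simulated
    normalized = innovation / obs_error_sd

    for b, d, z in zip(bands_nm, innovation, normalized):
        print(f"{b} nm: innovation {d:+.4f} sr-1  ({z:+.1f} obs-error units)")

    # A scalar summary of the misfit (observation-space cost) that the filter reduces
    print("J_obs =", round(0.5 * float(np.sum(normalized ** 2)), 2))
    ```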

  19. Calibration of two complex ecosystem models with different likelihood functions

    Science.gov (United States)

    Hidy, Dóra; Haszpra, László; Pintér, Krisztina; Nagy, Zoltán; Barcza, Zoltán

    2014-05-01

    The biosphere is a sensitive carbon reservoir. Terrestrial ecosystems were approximately carbon neutral during the past centuries, but they became net carbon sinks due to climate-change-induced environmental change and the associated CO2 fertilization effect of the atmosphere. Model studies and measurements indicate that the biospheric carbon sink can saturate in the future due to ongoing climate change, which can act as a positive feedback. Robustness of carbon cycle models is a key issue when trying to choose the appropriate model for decision support. The input parameters of process-based models are decisive regarding the model output. At the same time there are several input parameters for which accurate values are hard to obtain directly from experiments or for which no local measurements are available. Due to the uncertainty associated with the unknown model parameters, significant bias can be experienced if the model is used to simulate the carbon and nitrogen cycle components of different ecosystems. In order to improve model performance the unknown model parameters have to be estimated. We developed a multi-objective, two-step calibration method based on a Bayesian approach in order to estimate the unknown parameters of the PaSim and Biome-BGC models. Biome-BGC and PaSim are widely used biogeochemical models that simulate the storage and flux of water, carbon, and nitrogen between the ecosystem and the atmosphere, and within the components of terrestrial ecosystems (in this research the developed version of Biome-BGC, referred to as BBGC MuSo, is used). Both models were calibrated regardless of the simulated processes and the type of model parameters. The calibration procedure is based on the comparison of measured data with simulated results via calculating a likelihood function (degree of goodness-of-fit between simulated and measured data). In our research different likelihood function formulations were used in order to examine the effect of the different model
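
    The role of the likelihood function can be sketched with two common formulations, Gaussian and Laplace errors; the flux series below are synthetic, and neither function is claimed to be the exact formulation used with BBGC MuSo or PaSim:

    ```python
    # Hypothetical sketch: a Gaussian log-likelihood comparing simulated and measured
    # fluxes, the kind of goodness-of-fit function varied in the calibration study.
    # Swapping the formulation (e.g., Laplace errors) changes only this one function.
    import numpy as np

    def gaussian_loglik(simulated, observed, sigma):
        """i.i.d. Gaussian log-likelihood of the observations given the simulation."""
        resid = observed - simulated
        n = resid.size
        return -0.5 * n * np.log(2.0 * np.pi * sigma**2) - 0.5 * np.sum(resid**2) / sigma**2

    def laplace_loglik(simulated, observed, scale):
        """Alternative formulation with heavier-tailed (Laplace) errors."""
        resid = np.abs(observed - simulated)
        return -resid.size * np.log(2.0 * scale) - np.sum(resid) / scale

    # Synthetic daily NEE-like series (gC m-2 d-1), purely illustrative
    rng = np.random.default_rng(3)
    observed = rng.normal(-1.0, 1.5, 365)
    simulated = observed + rng.normal(0.2, 0.8, 365)   # a biased, noisy "model"

    print("Gaussian logL:", round(gaussian_loglik(simulated, observed, sigma=1.0), 1))
    print("Laplace  logL:", round(laplace_loglik(simulated, observed, scale=0.8), 1))
    ```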

  20. Integrating Remote Sensing, Field Observations, and Models to Understand Disturbance and Climate Effects on the Carbon Balance of the West Coast U.S.

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, Warren [USDA Forest Service

    2014-07-03

    As an element of NACP research, the proposed investigation is a two pronged approach that derives and evaluates a regional carbon (C) budget for Oregon, Washington, and California. Objectives are (1) Use multiple data sources, including AmeriFlux data, inventories, and multispectral remote sensing data to investigate trends in carbon storage and exchanges of CO2 and water with variation in climate and disturbance history; (2) Develop and apply regional modeling that relies on these multiple data sources to reduce uncertainty in spatial estimates of carbon storage and NEP, and relative contributions of terrestrial ecosystems and anthropogenic emissions to atmospheric CO2 in the region; (3) Model terrestrial carbon processes across the region, using the Biome-BGC terrestrial ecosystem model, and an atmospheric inverse modeling approach to estimate variation in rate and timing of terrestrial uptake and feedbacks to the atmosphere in response to climate and disturbance.

  1. Integrating Remote Sensing, Field Observations, and Models to Understand Disturbance and Climate Effects on the Carbon Balance of the West Coast U.S., Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Beverly E. Law

    2011-10-05

    As an element of NACP research, the proposed investigation is a two pronged approach that derives and evaluates a regional carbon (C) budget for Oregon, Washington, and California. Objectives are (1) Use multiple data sources, including AmeriFlux data, inventories, and multispectral remote sensing data to investigate trends in carbon storage and exchanges of CO2 and water with variation in climate and disturbance history; (2) Develop and apply regional modeling that relies on these multiple data sources to reduce uncertainty in spatial estimates of carbon storage and NEP, and relative contributions of terrestrial ecosystems and anthropogenic emissions to atmospheric CO2 in the region; (3) Model terrestrial carbon processes across the region, using the Biome-BGC terrestrial ecosystem model, and an atmospheric inverse modeling approach to estimate variation in rate and timing of terrestrial uptake and feedbacks to the atmosphere in response to climate and disturbance.

  2. Development of an advanced eco-hydrologic and biogeochemical coupling model aimed at clarifying the missing role of inland water in the global biogeochemical cycle

    Science.gov (United States)

    Nakayama, Tadanobu

    2017-04-01

    Recent research showed that inland water, including rivers, lakes, and groundwater, may play some role in carbon cycling, although its contribution has remained uncertain due to the limited amount of reliable data available. In this study, the author developed an advanced model coupling eco-hydrology and the biogeochemical cycle (National Integrated Catchment-based Eco-hydrology (NICE)-BGC). This new model incorporates complex coupling of the hydrologic and carbon cycles in terrestrial-aquatic linkages and the interplay between inorganic and organic carbon during the whole process of carbon cycling. The model could simulate both horizontal transports (export from land to inland water 2.01 ± 1.98 Pg C/yr and transport to the ocean 1.13 ± 0.50 Pg C/yr) and vertical fluxes (degassing 0.79 ± 0.38 Pg C/yr, and sediment storage 0.20 ± 0.09 Pg C/yr) in major rivers in good agreement with previous research, which is an improved estimate of carbon flux relative to previous studies. The model results also showed that the global net land flux simulated by NICE-BGC (-1.05 ± 0.62 Pg C/yr) represents a somewhat smaller carbon sink than the revised Lund-Potsdam-Jena Wetland Hydrology and Methane model (-1.79 ± 0.64 Pg C/yr) and previous estimates (-2.8 to -1.4 Pg C/yr). This is attributable to CO2 evasion and lateral carbon transport explicitly included in the model, and the result suggests that most previous studies have generally overestimated the accumulation of terrestrial carbon and underestimated the potential for lateral transport. The results further implied that the difference between inverse techniques and budget estimates can be explained to some extent by a net source from inland water. NICE-BGC would play an important role in the reevaluation of the greenhouse gas budget of the biosphere, the quantification of hot spots, and bridging the gap between top-down and bottom-up approaches to the global carbon budget.

  3. Integrating Remote Sensing, Field Observations, and Models to Understand Disturbance and Climate Effects on the Carbon Balance of the West Coast U.S.

    Energy Technology Data Exchange (ETDEWEB)

    B.E. Law; D. Turner; M. Goeckede

    2010-06-01

    GOAL: To develop and apply an approach to quantify and understand the regional carbon balance of the west coast states for the North American Carbon Program. OBJECTIVE: As an element of NACP research, the proposed investigation is a two pronged approach that derives and evaluates a regional carbon (C) budget for Oregon, Washington, and California. Objectives are (1) Use multiple data sources, including AmeriFlux data, inventories, and multispectral remote sensing data to investigate trends in carbon storage and exchanges of CO2 and water with variation in climate and disturbance history; (2) Develop and apply regional modeling that relies on these multiple data sources to reduce uncertainty in spatial estimates of carbon storage and NEP, and relative contributions of terrestrial ecosystems and anthropogenic emissions to atmospheric CO2 in the region; (3) Model terrestrial carbon processes across the region, using the Biome-BGC terrestrial ecosystem model, and an atmospheric inverse modeling approach to estimate variation in rate and timing of terrestrial uptake and feedbacks to the atmosphere in response to climate and disturbance. APPROACH: In performing the regional analysis, the research plan for the bottom-up approach uses a nested hierarchy of observations that include AmeriFlux data (i.e., net ecosystem exchange (NEE) from eddy covariance and associated biometric data), intermediate intensity inventories from an extended plot array partially developed from the PI's previous research, Forest Service FIA and CVS inventory data, time since disturbance, disturbance type, and cover type from Landsat developed in this study, and productivity estimates from MODIS algorithms. The BIOME-BGC model is used to integrate information from these sources and quantify C balance across the region. The inverse modeling approach assimilates flux data from AmeriFlux sites, high precision CO2 concentration data from AmeriFlux towers and four new calibrated CO2 sites

  4. Carbon Impacts of Fire- and Bark Beetle-Caused Tree Mortality across the Western US using the Community Land Model (Invited)

    Science.gov (United States)

    Meddens, A. J.; Hicke, J. A.; Edburg, S. L.; Lawrence, D. M.

    2013-12-01

    Wildfires and bark beetle outbreaks cause major forest disturbances in the western US, affecting ecosystem productivity and thereby impacting forest carbon cycling and future climate. Despite the large spatial extent of tree mortality, quantifying carbon flux dynamics following fires and bark beetle outbreaks over larger areas is challenging because of forest heterogeneity, varying disturbance severities, and field observation limitations. The objective of our study is to estimate these dynamics across the western US using the Community Land Model (version CLM4.5-BGC). CLM4.5-BGC is a land ecosystem model that mechanistically represents the exchanges of energy, water, carbon, and nitrogen with the atmosphere. The most recent iteration of the model has been expanded to include vertically resolved soil biogeochemistry, improved nitrogen cycle representations including nitrification, denitrification, and biological fixation, and improved canopy processes including photosynthesis. Prior to conducting simulations, we modified CLM4.5-BGC to include the effects of bark beetle-caused tree mortality on carbon and nitrogen stocks and fluxes. Once modified, we conducted paired simulations (with and without fire- and bark beetle-caused tree mortality) by using regional data sets of observed mortality as inputs. Bark beetle-caused tree mortality was prescribed from a data set derived from US Forest Service aerial surveys from 1997 to 2010. Annual tree mortality area was produced from observed tree mortality caused by bark beetles and was adjusted for underestimation. Fires were prescribed using the Monitoring Trends in Burn Severity (MTBS) database from 1984 to 2010. Annual tree mortality area was produced from forest cover maps and inclusion of moderate- and high-severity burned areas. Simulations show that the maximum yearly reduction of net ecosystem productivity (NEP) caused by bark beetles is approximately 20 Tg C for the western US. Fires cause similar reductions
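
    The paired-simulation design described above reduces, in post-processing, to differencing the disturbed and control runs. The sketch below illustrates that bookkeeping with hypothetical gridded NEP output and an assumed grid-cell size; it is not CLM4.5-BGC code, and the numbers are synthetic.

        import numpy as np

        # Hypothetical annual NEP (g C m-2 yr-1) from paired runs on the same grid:
        # one with prescribed bark beetle/fire mortality, one without.
        years = np.arange(1997, 2011)
        nep_control   = np.random.normal(150.0, 20.0, (years.size, 100, 100))
        nep_disturbed = nep_control - np.random.uniform(0.0, 30.0, nep_control.shape)
        cell_area_m2  = 4000.0 ** 2                    # assumed ~4 km grid cells

        # Regional NEP reduction per year, converted from g C to Tg C (1 Tg = 1e12 g).
        delta = (nep_control - nep_disturbed) * cell_area_m2      # g C yr-1 per cell
        regional_reduction_tg = delta.sum(axis=(1, 2)) / 1e12     # Tg C yr-1
        print("maximum yearly NEP reduction: %.1f Tg C" % regional_reduction_tg.max())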

  5. A Spectral Evaluation of Models Performances in Mediterranean Oak Woodlands

    Science.gov (United States)

    Vargas, R.; Baldocchi, D. D.; Abramowitz, G.; Carrara, A.; Correia, A.; Kobayashi, H.; Papale, D.; Pearson, D.; Pereira, J.; Piao, S.; Rambal, S.; Sonnentag, O.

    2009-12-01

    Ecosystem processes are influenced by climatic trends at multiple temporal scales, including diel patterns and longer-term climatic modes such as seasonal and interannual variability. Because interactions between the biophysical components of ecosystem processes are complex, it is important to test how models perform in the frequency domain (e.g., hours, days, weeks, months, years) and the time domain (i.e., day of the year) in addition to traditional tests of annual or monthly sums. Here we present a spectral evaluation, using wavelet time series analysis, of model performance in seven Mediterranean oak woodlands that encompass three deciduous and four evergreen sites. We tested the performance of five models (CABLE, ORCHIDEE, BEPS, Biome-BGC, and JULES) against measured gross primary production (GPP) and evapotranspiration (ET). In general, model performance fails at intermediate periods (e.g., weeks to months), likely because these models do not represent the water pulse dynamics that influence GPP and ET in these Mediterranean systems. To improve a model's performance it is critical first to identify where and when the model fails. Only then can model performance be improved so that models can be used as prognostic tools and to generate further hypotheses that can be tested by new experiments and measurements.
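
    As a simplified illustration of the frequency-domain evaluation described above, the sketch below compares modeled and observed GPP through the power spectrum of their residuals. It uses a plain FFT periodogram rather than the wavelet analysis applied in the study, and all variable names and data are synthetic placeholders.

        import numpy as np

        def residual_spectrum(gpp_obs, gpp_mod, dt_days=1.0):
            """Periodogram of model-data GPP residuals: peaks reveal the time
            scales (e.g., weeks to months) at which the model fails."""
            resid = np.asarray(gpp_mod, dtype=float) - np.asarray(gpp_obs, dtype=float)
            resid -= resid.mean()
            power = np.abs(np.fft.rfft(resid)) ** 2
            freqs = np.fft.rfftfreq(resid.size, d=dt_days)   # cycles per day
            with np.errstate(divide="ignore"):
                periods = 1.0 / freqs                        # days (inf at zero frequency)
            return periods, power

        # Synthetic daily series for one year: the "model" misses short-term pulses.
        t = np.arange(365)
        gpp_obs = 5 + 3 * np.sin(2 * np.pi * t / 365) + np.random.normal(0, 0.5, t.size)
        gpp_mod = 5 + 3 * np.sin(2 * np.pi * t / 365)
        periods, power = residual_spectrum(gpp_obs, gpp_mod)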

  6. Biogeochemical modelling vs. tree-ring data - comparison of forest ecosystem productivity estimates

    Science.gov (United States)

    Zorana Ostrogović Sever, Maša; Barcza, Zoltán; Hidy, Dóra; Paladinić, Elvis; Kern, Anikó; Marjanović, Hrvoje

    2017-04-01

    Forest ecosystems are sensitive to environmental changes as well as human-induced disturbances; therefore, process-based models with integrated management modules represent a valuable tool for estimating and forecasting forest ecosystem productivity under changing conditions. The biogeochemical model Biome-BGC simulates carbon, nitrogen, and water fluxes and is widely used for different terrestrial ecosystems. It has been modified and parameterised by many researchers in the past to meet specific local conditions. In this research, we used the recently published improved version of the model, Biome-BGCMuSo (BBGCMuSo), with a multilayer soil module and an integrated management module. The aim of our research is to validate modelled forest ecosystem productivity (NPP) from the BBGCMuSo model against observed productivity estimated from an extensive dataset of tree-rings. The research was conducted in two distinct forest complexes of managed Pedunculate oak in SE Europe (Croatia), namely the Pokupsko basin and the Spačva basin. First, we parameterized the BBGCMuSo model at a local level using eddy-covariance (EC) data from the Jastrebarsko EC site. The parameterized model was then used for the assessment of productivity on a larger scale. Results of the NPP assessment with BBGCMuSo are compared with NPP estimated from tree-ring data taken from trees on over 100 plots in both forest complexes. Keywords: Biome-BGCMuSo, forest productivity, model parameterization, NPP, Pedunculate oak

  7. Bioenergy crop models: Descriptions, data requirements and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Kang, Shujiang [ORNL; Zhang, Xuesong [Pacific Northwest National Laboratory (PNNL); Miguez, Fernando [Iowa State University; Izaurralde, Dr. R. Cesar [Pacific Northwest National Laboratory (PNNL); Post, Wilfred M [ORNL; Dietze, Michael [University of Illinois, Urbana-Champaign; Lynd, L. [Dartmouth College; Wullschleger, Stan D [ORNL

    2012-01-01

    Field studies that address the production of lignocellulosic biomass as a source of renewable energy provide critical data for the development of bioenergy crop models. A literature survey revealed that 14 models have been used for simulating bioenergy crops, including herbaceous, woody, and crassulacean acid metabolism (CAM) crops. These models simulate field-scale production of biomass for switchgrass (ALMANAC, EPIC, and Agro-BGC), miscanthus (MISCANFOR, MISCANMOD, and WIMOVAC), sugarcane (APSIM, AUSCANE, and CANEGRO), and poplar and willow (SECRETS and 3PG). Two models are adaptations of dynamic global vegetation models and simulate biomass yields of miscanthus and sugarcane at regional scales (Agro-IBIS and LPJmL). Although it lacks the complexity of other bioenergy crop models, the environmental productivity index (EPI) is the only model used to estimate biomass production of CAM (Agave and Opuntia) plants. Except for the EPI model, all models include representations of leaf area dynamics, phenology, radiation interception and utilization, biomass production, and partitioning of biomass to roots and shoots. A few models simulate soil water, nutrient, and carbon cycle dynamics, making them especially useful for assessing the environmental consequences (e.g., erosion and nutrient losses) associated with the large-scale deployment of bioenergy crops. The rapid increase in the use of models for energy crop simulation is encouraging; however, detailed information on the influence of climate, soils, and crop management practices on biomass production is scarce. Thus, considerable work remains regarding the parameterization and validation of process-based models for bioenergy crops; the generation and distribution of high-quality field data for model development and validation; and the implementation of an integrated framework for efficient, high-resolution simulations of biomass production for use in planning sustainable bioenergy systems.

  8. Investigating and Modeling Ecosystem Response to an Experimental and a Natural Ice Storm

    Science.gov (United States)

    Fakhraei, H.; Driscoll, C. T.; Rustad, L.; Campbell, J. L.; Groffman, P.; Fahey, T.; Likens, G.; Swaminathan, R.

    2017-12-01

    the biogeochemical model, PnET-BGC. The model was calibrated to the study watersheds using observations from the natural and experimental ice storms. Future projections for ice storm events were estimated from an advanced climate model and applied to the calibrated PnET-BGC model to simulate future impacts of ice storms on the northern hardwood forests.

  9. A Virtual Inertia Control Strategy for DC Microgrids Analogized with Virtual Synchronous Machines

    DEFF Research Database (Denmark)

    Wu, Wenhua; Chen, Yandong; Luo, An

    2017-01-01

    synchronous machine (VSM) is proposed to enhance the inertia of the DC-MG, and to restrain the dc bus voltage fluctuation. The small-signal model of the BGC system is established, and the small-signal transfer function between the dc bus voltage and the dc output current of the BGC is deduced. The dynamic...

  10. Understanding the Effect of Land Cover Classification on Model Estimates of Regional Carbon Cycling in the Boreal Forest Biome

    Science.gov (United States)

    Kimball, John; Kang, Sinkyu

    2003-01-01

    The original objectives of this proposed 3-year project were to: 1) quantify the respective contributions of land cover and disturbance (i.e., wildfire) to the uncertainty associated with regional carbon source/sink estimates produced by a variety of boreal ecosystem models; 2) identify the model processes responsible for differences in simulated carbon source/sink patterns for the boreal forest; 3) validate model outputs using tower and field-based estimates of NEP and NPP; and 4) recommend/prioritize improvements to boreal ecosystem carbon models, which will better constrain regional source/sink estimates for atmospheric CO2. These original objectives were subsequently distilled to fit within the constraints of a 1-year study. This revised study involved a regional model intercomparison over the BOREAS study region involving the Biome-BGC and TEM (A.D. McGuire, UAF) ecosystem models. The major focus of these revised activities involved quantifying the sensitivity of regional model predictions to land cover classification uncertainties. We also evaluated the individual and combined effects of historical fire activity, historical atmospheric CO2 concentrations, and climate change on carbon and water flux simulations within the BOREAS study region.

  11. Hydrological model uncertainty due to spatial evapotranspiration estimation methods

    Czech Academy of Sciences Publication Activity Database

    Yu, X.; Lamačová, Anna; Duffy, Ch.; Krám, P.; Hruška, Jakub

    2016-01-01

    Vol. 90, part B (2016), pp. 90-101. ISSN 0098-3004. R&D Projects: GA MŠk(CZ) LO1415. Institutional support: RVO:67179843. Keywords: Uncertainty; Evapotranspiration; Forest management; PIHM; Biome-BGC. Subject RIV: DA - Hydrology; Limnology. OECD field: Hydrology. Impact factor: 2.533, year: 2016

  12. Modelling

    CERN Document Server

    Spädtke, P

    2013-01-01

    Modeling of technical machines became a standard technique once computers became powerful enough to handle the amount of data relevant to the specific system. Simulation of an existing physical device requires knowledge of all relevant quantities. Electric fields given by the surrounding boundary as well as magnetic fields caused by coils or permanent magnets have to be known. Internal sources of both fields are sometimes taken into account, such as space charge forces or the internal magnetic field of a moving bunch of charged particles. The solver routines used are briefly described, and some benchmarking is shown to estimate the necessary computing times for different problems. Different types of charged particle sources are presented together with suitable models to describe their underlying physics. Electron guns are covered as well as different ion sources (volume ion sources, laser ion sources, Penning ion sources, electron resonance ion sources, and H$^-$-sources) together with some remarks on beam transport.

  13. A BiomeBGC-based Evaluation of Dryness Stress of Central European Forests

    OpenAIRE

    Buddenbaum, H.; Hientgen, J.; Dotzler, S.; Werner, W.; Hill, J.

    2015-01-01

    Dryness stress is expected to become a more common problem in central European forests due to the predicted regional climate change. Forest management has to adapt to climate change in time and think ahead several decades in decisions on which tree species to plant at which locations. The summer of 2003 was the most severe dryness event in recent time, but more periods like this are expected. Since forests on different sites react quite differently to drought conditions, we used the ...

  14. Literature-Derived Parameters for the BIOME-BGC Terrestrial Ecosystem Model

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: Various aspects of primary production of a variety of plant species found in natural temperate biomes were compiled from literature and presented for use...

  15. Environmental Molecular Sciences Laboratory Arrows

    Energy Technology Data Exchange (ETDEWEB)

    2016-06-24

    Arrows is a software package that combines NWChem, SQL and NoSQL databases, email, and social networks (e.g., Twitter, Tumblr) to simplify molecular and materials modeling and make these modeling capabilities accessible to all scientists and engineers. EMSL Arrows is very simple to use: the user just emails chemical reactions to arrows@emsl.pnnl.gov, and an email is sent back with thermodynamic, reaction pathway (kinetic), spectroscopy, and other results. EMSL Arrows parses the email and then searches the database for the compounds in the reactions. If a compound isn't there, an NWChem calculation is set up and submitted to calculate it. Once the calculation is finished, the results are entered into the database and emailed back.
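
    A minimal sketch of the email workflow described above is given below. The sender address, SMTP host, and reaction string are placeholders (this record does not specify the exact reaction syntax Arrows expects), so the snippet is illustrative only.

        import smtplib
        from email.message import EmailMessage

        # Hypothetical submission of a reaction to EMSL Arrows by email.
        msg = EmailMessage()
        msg["From"] = "researcher@example.org"            # placeholder sender
        msg["To"] = "arrows@emsl.pnnl.gov"                # address given in the record
        msg["Subject"] = "reaction request"
        msg.set_content("CCO + O2 --> CC(=O)O + H2O")     # placeholder reaction string

        with smtplib.SMTP("smtp.example.org") as server:  # placeholder SMTP host
            server.send_message(msg)
        # Thermodynamic, kinetic, and spectroscopy results are returned by reply
        # email once the database lookup or NWChem calculation completes.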

  16. Molecular Science Research Center, 1991 annual report

    Energy Technology Data Exchange (ETDEWEB)

    Knotek, M.L.

    1992-03-01

    During 1991, the Molecular Science Research Center (MSRC) experienced solid growth and accomplishment, and the Environmental and Molecular Sciences Laboratory (EMSL) construction project moved forward. We began with strong programs in chemical structure and dynamics and in theory, modeling, and simulation, and both of these programs continued to thrive. We also made significant advances in the development of programs in materials and interfaces and in macromolecular structure and dynamics, largely as a result of the key staff recruited to lead these efforts. If there was one pervasive activity for the past year, however, it was to strengthen the role of the EMSL in the overall environmental restoration and waste management (ER/WM) mission at Hanford. These extended activities involved not only MSRC and EMSL staff but all PNL scientific and technical staff engaged in ER/WM programs.

  17. Modeling effects of hydrological changes on the carbon and nitrogen balance of oak in floodplains.

    Science.gov (United States)

    Pietsch, Stephan A; Hasenauer, Hubert; Kucera, Jiŕi; Cermák, Jan

    2003-08-01

    We extended the applicability of the ecosystem model BIOME-BGC to floodplain ecosystems to study the effects of hydrological changes on Quercus robur L. stands. The extended model accounts for floodplain peculiarities, i.e., seasonal flooding and water infiltration from the groundwater table. Our interest was the tradeoff between (a) maintaining regional applicability with respect to available model input information, (b) incorporating the necessary mechanistic detail, and (c) keeping the computational effort at an acceptable level. An evaluation based on observed transpiration, timber volume, soil carbon, and soil nitrogen content showed that the extended model produced unbiased results. We also investigated the impact of hydrological changes on our oak stands as a result of the completion of an artificial canal network in 1971, which has stopped regular springtime flooding. A comparison of the 11 years before versus the 11 years after 1971 demonstrated that the hydrological changes affected mainly the annual variation across years in leaf area index (LAI) and in soil carbon and nitrogen sequestration, leading to stagnation of carbon and nitrogen stocks but to an increase in the variance across years. However, carbon sequestration to timber was unaffected and exhibited no significant change in cross-year variation. Finally, we investigated how drawdown of the water table, a general problem in the region, affects modeled ecosystem behavior. We found a further amplification of cross-year LAI fluctuations, but the variance in soil carbon and nitrogen stocks decreased. Volume increment was unaffected, suggesting a stabilization of the ecosystem two decades after implementation of water management measures.

  18. Use of BIOME-BGC to simulate Mediterranean forest carbon stocks

    OpenAIRE

    Chirici G; Barbati A; Salvati R; Maselli F; Chiesi M

    2011-01-01

    BIOME-BGC is a bio-geochemical model capable of estimating the water, carbon and nitrogen fluxes and storages of terrestrial ecosystems. Previous research demonstrated that, after proper calibration of its ecophysiological parameters, the model can reproduce the main processes of Mediterranean forest types. The same investigations, however, indicated a model tendency to overestimate woody biomass accumulation. The current paper aims at modifying BIOME-BGC ecophysiological settings to improve ...

  19. Data Assimilation Tools for CO2 Reservoir Model Development – A Review of Key Data Types, Analyses, and Selected Software

    Energy Technology Data Exchange (ETDEWEB)

    Rockhold, Mark L.; Sullivan, E. C.; Murray, Christopher J.; Last, George V.; Black, Gary D.

    2009-09-30

    Pacific Northwest National Laboratory (PNNL) has embarked on an initiative to develop world-class capabilities for performing experimental and computational analyses associated with geologic sequestration of carbon dioxide. The ultimate goal of this initiative is to provide science-based solutions for helping to mitigate the adverse effects of greenhouse gas emissions. This Laboratory-Directed Research and Development (LDRD) initiative currently has two primary focus areas—advanced experimental methods and computational analysis. The experimental methods focus area involves the development of new experimental capabilities, supported in part by the U.S. Department of Energy’s (DOE) Environmental Molecular Science Laboratory (EMSL) housed at PNNL, for quantifying mineral reaction kinetics with CO2 under high temperature and pressure (supercritical) conditions. The computational analysis focus area involves numerical simulation of coupled, multi-scale processes associated with CO2 sequestration in geologic media, and the development of software to facilitate building and parameterizing conceptual and numerical models of subsurface reservoirs that represent geologic repositories for injected CO2. This report describes work in support of the computational analysis focus area. The computational analysis focus area currently consists of several collaborative research projects. These are all geared towards the development and application of conceptual and numerical models for geologic sequestration of CO2. The software being developed for this focus area is referred to as the Geologic Sequestration Software Suite or GS3. A wiki-based software framework is being developed to support GS3. This report summarizes work performed in FY09 on one of the LDRD projects in the computational analysis focus area. The title of this project is Data Assimilation Tools for CO2 Reservoir Model Development. Some key objectives of this project in FY09 were to assess the current state

  20. Mapping and modeling the biogeochemical cycling of turf grasses in the United States.

    Science.gov (United States)

    Milesi, Cristina; Running, Steven W; Elvidge, Christopher D; Dietz, John B; Tuttle, Benjamin T; Nemani, Ramakrishna R

    2005-09-01

    Turf grasses are ubiquitous in the urban landscape of the United States and are often associated with various types of environmental impacts, especially on water resources, yet there have been limited efforts to quantify their total surface area and ecosystem functioning, such as their total impact on the continental water budget and potential net ecosystem exchange (NEE). In this study, relating turf grass area to an estimate of fractional impervious surface area, it was calculated that potentially 163,800 km2 (+/- 35,850 km2) of land are cultivated with turf grasses in the continental United States, an area three times larger than that of any irrigated crop. Using the Biome-BGC ecosystem process model, the growth of warm-season and cool-season turf grasses was modeled at a number of sites across the 48 conterminous states under different management scenarios, simulating potential carbon and water fluxes as if the entire turf surface were managed like a well-maintained lawn. The results indicate that well-watered and fertilized turf grasses act as a carbon sink. The potential NEE that could derive from the total surface potentially under turf (up to 17 Tg C/yr in the simulated scenarios) would require 695 to 900 liters of water per person per day, depending on the modeled irrigation practices, suggesting that outdoor water conservation practices such as xeriscaping and irrigation with recycled wastewater may need to be extended as many municipalities continue to face increasing pressures on freshwater.
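
    The headline numbers above imply a modest per-area carbon sink, which can be recovered with a one-line unit conversion; the sketch below uses only the central values quoted in the abstract, and the per-area rate is a derived illustration rather than a figure reported in the record.

        # Implied mean carbon sink per unit turf area (central values from the abstract)
        turf_area_km2   = 163_800    # estimated turf area, continental United States
        nee_tg_c_per_yr = 17         # potential sink under the simulated scenarios

        per_area = (nee_tg_c_per_yr * 1e12) / (turf_area_km2 * 1e6)   # g C m-2 yr-1
        print(f"implied mean sink: {per_area:.0f} g C m-2 yr-1")      # roughly 100 g C m-2 yr-1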

  1. Assessing the ability of three land ecosystem models to simulate gross carbon uptake of forests from boreal to Mediterranean climate in Europe

    Directory of Open Access Journals (Sweden)

    M. Jung

    2007-08-01

    Three terrestrial biosphere models (LPJ, Orchidee, Biome-BGC) were evaluated with respect to their ability to simulate large-scale, climate-related trends in gross primary production (GPP) across European forests. Simulated GPP and leaf area index (LAI) were compared with GPP estimates based on flux-separated eddy covariance measurements of net ecosystem exchange and with LAI measurements along a temperature gradient ranging from the boreal to the Mediterranean region. The three models capture qualitatively the pattern suggested by the site data: an increase in GPP from boreal to temperate climates and a subsequent decline from temperate to Mediterranean climates. The models consistently predict higher GPP than the site data for boreal forests and lower GPP for Mediterranean forests. Based on a decomposition of GPP into absorbed photosynthetically active radiation (APAR) and radiation use efficiency (RUE), the overestimation of GPP for the boreal coniferous forests appears to be primarily related to too high simulated LAI, and thus light absorption (APAR), rather than too high radiation use efficiency. We cannot attribute the tendency of the models to underestimate GPP in the water-limited region to model structural deficiencies with confidence. A likely dry bias of the input meteorological data in southern Europe may create this pattern.

    On average, the models compare similarly well to the site GPP data (RMSE of ~30%, or 420 gC/m2/yr), but differences are apparent for different ecosystem types. In terms of absolute values, we find the agreement between site-based GPP estimates and simulations acceptable when we consider uncertainties about the accuracy of the model drivers, a potential representation bias of the eddy covariance sites, and uncertainties related to the method of deriving GPP from eddy covariance measurement data. Continental to global data-model comparison studies should be fostered in the future since they are necessary to identify consistent model bias along environmental
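
    The APAR/RUE decomposition used above can be made concrete with a short sketch. The Beer-Lambert form of light absorption below is a standard assumption added for illustration (the abstract does not state which absorption formulation each model uses), and the parameter values are arbitrary.

        import numpy as np

        # GPP = RUE * APAR, APAR = FPAR * PAR, FPAR ~ 1 - exp(-k * LAI):
        # a positive LAI bias inflates APAR and hence GPP even if RUE is correct.
        def gpp_from_lai(lai, par=10.0, rue=1.5, k=0.5):
            fpar = 1.0 - np.exp(-k * lai)
            return rue * fpar * par        # e.g., g C m-2 d-1 if RUE is in g C per MJ APAR

        print(gpp_from_lai(lai=3.0))       # nominal LAI
        print(gpp_from_lai(lai=5.0))       # overestimated LAI -> higher simulated GPP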

  2. Monitoring and modeling human interactions with ecosystems

    Science.gov (United States)

    Milesi, Cristina

    With rapidly increasing consumption rates and global population, there is a growing interest in understanding how to balance human activities with the other components of the Earth system. Humans alter ecosystem functioning through land cover changes, greenhouse gas emissions, and overexploitation of natural resources. On the other hand, climate and its inherent interannual variability drive global Net Primary Productivity (NPP), the base of energy for all trophic levels, shaping humans' distribution on the land surface and their sensitivity to natural and accelerated patterns of variation in ecosystem processes. In this thesis, I analyzed anthropogenic influences on ecosystems and ecosystem impacts on humans through a multi-scale approach. Anthropogenic influences were analyzed with a special focus on urban ecosystems, the living environment of nearly half of the global population and almost 90% of the population in industrialized countries. A poorly quantified aspect of urban ecosystems is the biogeochemistry of urban vegetation, which is intensively managed through fertilization and irrigation. In chapter 1, adapting the ecosystem model Biome-BGC, I simulated the growth of turf grasses across the United States and estimated their potential impact on the continental water and carbon budget. Using a remote sensing-based approach, I also developed a methodology to estimate the impact of land cover changes due to urbanization on regional photosynthetic capacity (chapter 2), finding that low-density urbanization can retain high levels of net primary productivity, although at the expense of inefficient sprawl. One of the feedbacks of urbanization is the urban heat island effect, which I analyzed in conjunction with a remote sensing-based estimate of fractional impervious surface area, showing how this is related to increases in land surface temperatures, independently of geographic location and population density (chapter 3). Finally, in chapter 4, I described the

  3. Using LIDAR and Quickbird Data to Model Plant Production and Quantify Uncertainties Associated with Wetland Detection and Land Cover Generalizations

    Science.gov (United States)

    Cook, Bruce D.; Bolstad, Paul V.; Naesset, Erik; Anderson, Ryan S.; Garrigues, Sebastian; Morisette, Jeffrey T.; Nickeson, Jaime; Davis, Kenneth J.

    2009-01-01

    Spatiotemporal data from satellite remote sensing and surface meteorology networks have made it possible to continuously monitor global plant production, and to identify global trends associated with land cover/use and climate change. Gross primary production (GPP) and net primary production (NPP) are routinely derived from the MOderate Resolution Imaging Spectroradiometer (MODIS) onboard satellites Terra and Aqua, and estimates generally agree with independent measurements at validation sites across the globe. However, the accuracy of GPP and NPP estimates in some regions may be limited by the quality of model input variables and heterogeneity at fine spatial scales. We developed new methods for deriving model inputs (i.e., land cover, leaf area, and photosynthetically active radiation absorbed by plant canopies) from airborne laser altimetry (LiDAR) and Quickbird multispectral data at resolutions ranging from about 30 m to 1 km. In addition, LiDAR-derived biomass was used as a means for computing carbon-use efficiency. Spatial variables were used with temporal data from ground-based monitoring stations to compute a six-year GPP and NPP time series for a 3600 ha study site in the Great Lakes region of North America. Model results compared favorably with independent observations from a 400 m flux tower and a process-based ecosystem model (BIOME-BGC), but only after removing vapor pressure deficit as a constraint on photosynthesis from the MODIS global algorithm. Fine-resolution inputs captured more of the spatial variability, but estimates were similar to coarse-resolution data when integrated across the entire landscape. Failure to account for wetlands had little impact on landscape-scale estimates, because vegetation structure, composition, and conversion efficiencies were similar to upland plant communities. Plant productivity estimates were noticeably improved using LiDAR-derived variables, while uncertainties associated with land cover generalizations and wetlands in this largely forested landscape were considered less important.
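
    The adjustment mentioned above (removing vapor pressure deficit as a constraint) refers to the light use efficiency logic of the MODIS GPP algorithm, GPP = eps_max * f(Tmin) * f(VPD) * FPAR * PAR. The sketch below shows that logic with a switch for the VPD scalar; the ramp limits and eps_max value are illustrative stand-ins, not the operational MOD17 coefficients.

        import numpy as np

        def ramp(x, x_min, x_max):
            """Linear 0-1 scalar: 0 at or below x_min, 1 at or above x_max."""
            return np.clip((x - x_min) / (x_max - x_min), 0.0, 1.0)

        def gpp_lue(par_mj, fpar, tmin_c, vpd_pa, eps_max=0.0012, use_vpd=True):
            """Simplified MODIS-style GPP (kg C m-2 d-1). Setting use_vpd=False
            reproduces the adjustment described above, i.e., dropping the VPD
            constraint on photosynthesis."""
            f_t = ramp(tmin_c, -8.0, 10.0)                          # illustrative limits
            f_v = 1.0 - ramp(vpd_pa, 650.0, 2500.0) if use_vpd else 1.0
            return eps_max * f_t * f_v * fpar * par_mj

        gpp_with    = gpp_lue(par_mj=10.0, fpar=0.8, tmin_c=5.0, vpd_pa=1800.0)
        gpp_without = gpp_lue(par_mj=10.0, fpar=0.8, tmin_c=5.0, vpd_pa=1800.0, use_vpd=False)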

  4. A Model-based Approach to Scaling GPP and NPP in Support of MODIS Land Product Validation

    Science.gov (United States)

    Turner, D. P.; Cohen, W. B.; Gower, S. T.; Ritts, W. D.

    2003-12-01

    Global products from the Earth-orbiting MODIS sensor include land cover, leaf area index (LAI), FPAR, 8-day gross primary production (GPP), and annual net primary production (NPP) at 1 km spatial resolution. The BigFoot Project was designed specifically to validate MODIS land products and has initiated ground measurements at 9 sites representing a wide array of vegetation types. An ecosystem process model (Biome-BGC) is used to generate estimates of GPP and NPP for each 5 km x 5 km BigFoot site. Model inputs include land cover and LAI (from Landsat ETM+), daily meteorological data (from a centrally located eddy covariance flux tower), and soil characteristics. Model-derived outputs are validated against field-measured NPP and flux tower-derived GPP. The resulting GPP and NPP estimates are then aggregated to the 1 km resolution for direct spatial comparison with the corresponding MODIS products. At the high-latitude sites (tundra and boreal forest), the MODIS GPP phenology closely tracks the BigFoot GPP, but there is a high bias in the MODIS GPP. At the temperate zone sites, problems with the timing and magnitude of the MODIS FPAR introduce differences in MODIS GPP compared to the validation data at some sites. However, the MODIS LAI/FPAR data are currently being reprocessed (Collection 4) and new comparisons will be made for 2002. The BigFoot scaling approach permits precise overlap in spatial and temporal resolution between the MODIS products and BigFoot products, and thus permits the evaluation of specific components of the MODIS NPP algorithm. These components include meteorological inputs from the NASA Data Assimilation Office, LAI and FPAR from other MODIS algorithms, and biome-specific parameters for base respiration rate and light use efficiency.
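
    Aggregating the fine-resolution model output to the 1 km MODIS grid, as described above, amounts to block averaging; a minimal sketch is shown below. The grid sizes and variable names are illustrative (a 25 m cell size is assumed), and a real workflow would also handle map projections and missing data.

        import numpy as np

        def aggregate_to_1km(field_25m, block=40):
            """Block-average a 25 m field onto a 1 km grid (40 x 40 cells of 25 m = 1 km)."""
            ny, nx = field_25m.shape
            ny_c, nx_c = ny // block, nx // block
            trimmed = field_25m[:ny_c * block, :nx_c * block]
            return trimmed.reshape(ny_c, block, nx_c, block).mean(axis=(1, 3))

        # A 5 km x 5 km site at 25 m resolution is a 200 x 200 grid -> 5 x 5 at 1 km.
        gpp_fine = np.random.uniform(0.0, 12.0, (200, 200))   # synthetic GPP field
        gpp_1km = aggregate_to_1km(gpp_fine)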

  5. Transforming Ocean Observations of the Carbon Budget, Acidification, Hypoxia, Nutrients, and Biological Productivity: a Global Array of Biogeochemical Argo Floats

    Science.gov (United States)

    Talley, L. D.; Johnson, K. S.; Claustre, H.; Boss, E.; Emerson, S. R.; Westberry, T. K.; Sarmiento, J. L.; Mazloff, M. R.; Riser, S.; Russell, J. L.

    2017-12-01

    Our ability to detect changes in biogeochemical (BGC) processes in the ocean that may be driven by increasing atmospheric CO2, as well as by natural climate variability, is greatly hindered by undersampling in vast areas of the open ocean. Argo is a major international program that measures ocean heat content and salinity with about 4000 floats distributed throughout the ocean, profiling to 2000 m every 10 days. Extending this approach to a global BGC-Argo float array, using recent, proven sensor technology, and in close synergy with satellite systems, will drive a transformative shift in observing and predicting the effects of climate change on ocean metabolism, carbon uptake, acidification, deoxygenation, and living marine resource management. BGC-Argo will add sensors for pH, oxygen, nitrate, chlorophyll, suspended particles, and downwelling irradiance, with sufficient accuracy for climate studies. Observing System Simulation Experiments (OSSEs) using BGC models indicate that 1000 BGC floats would provide sufficient coverage, hence equipping 1/4 of the Argo array. BGC-Argo (http://biogeochemical-argo.org) will enhance current sustained observational programs such as Argo, GO-SHIP, and long-term ocean time series. BGC-Argo will benefit from deployments on GO-SHIP vessels, which provide sensor verification. Empirically derived algorithms that relate the observed BGC float parameters to the carbon system parameters will provide global information on seasonal ocean-atmosphere carbon exchange. BGC Argo measurements could be paired with other emerging technology, such as pCO2 measurements from ships of opportunity and wave gliders, to extend and validate exchange estimates. BGC-Argo prototype programs already show the potential of a global observing system that can measure seasonal to decadal variability. Various countries have developed regional BGC arrays: Southern Ocean (SOCCOM), North Atlantic Subpolar Gyre (remOcean), Mediterranean (NAOS), the Kuroshio (INBOX

  6. Synthesis of Remote Sensing and Field Observations to Model and Understand Disturbance and Climate Effects on the Carbon Balance of Oregon & Northern California

    Energy Technology Data Exchange (ETDEWEB)

    Beverly Law; David Turner; Warren Cohen; Mathias Goeckede

    2008-05-22

    The goal is to quantify and explain the carbon (C) budget for Oregon and N. California. The research compares "bottom-up" and "top-down" methods, and develops prototype analytical systems for regional analysis of the carbon balance that are potentially applicable to other continental regions and that can be used to explore climate, disturbance, and land-use effects on the carbon cycle. The objectives are: 1) Improve, test, and apply a bottom-up approach that synthesizes a spatially nested hierarchy of observations (multispectral remote sensing, inventories, flux and extensive sites) and the Biome-BGC model to quantify the C balance across the region; 2) Improve, test, and apply a top-down approach for regional and global C flux modeling that uses a model-data fusion scheme (MODIS products, AmeriFlux, atmospheric CO2 concentration network) and a boundary layer model to estimate net ecosystem production (NEP) across the region and partition it among GPP, R(a), and R(h); 3) Provide critical understanding of the controls on the regional C balance (how NEP and carbon stocks are influenced by disturbance from fire and management, land use, and interannual climate variation). The key science questions are: "What are the magnitudes and distributions of C sources and sinks on seasonal to decadal time scales, and what processes are controlling their dynamics? What are regional spatial and temporal variations of C sources and sinks? What are the errors and uncertainties in the data products and results (i.e., in situ observations, remote sensing, models)?"
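
    The partitioning referred to above follows the standard flux bookkeeping, NEP = GPP - Ra - Rh, with eddy-covariance NEE approximately equal to -NEP; the snippet below simply states that identity with illustrative numbers that are not results from this project.

        def nep(gpp, r_auto, r_hetero):
            """Net ecosystem production from the identity NEP = GPP - Ra - Rh;
            a positive value denotes a net terrestrial carbon sink."""
            return gpp - r_auto - r_hetero

        # Illustrative annual values (g C m-2 yr-1) only.
        print(nep(gpp=1500.0, r_auto=800.0, r_hetero=500.0))   # 200 -> net sink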

  7. Development of a tropical ecological forecasting strategy for ENSO based on the ACME modeling framework

    Science.gov (United States)

    Hoffman, F. M.; Xu, M.; Collier, N.; Xu, C.; Christoffersen, B. O.; Luo, Y.; Ricciuto, D. M.; Levine, P. A.; Randerson, J. T.

    2016-12-01

    The El Niño Southern Oscillation (ENSO) is an irregular periodic climate fluctuation, recurring every two to seven years, that is driven by variations in sea surface temperatures (SSTs) over the tropical eastern Pacific Ocean and extending westward across the equatorial Pacific. El Niño, the warming phase of ENSO, has strong effects on the global carbon cycle. Strong drying conditions in the Asia-Pacific region and western South America during El Niño lead to reduced ecosystem productivity and increased mortality and fire risk. The intensity of the 2015-2016 ENSO event rivaled or exceeded that of the 1997-1998 event, which was the strongest well-observed El Niño on record. We performed a set of simulations using the U.S. Department of Energy's Accelerated Climate Modeling for Energy (ACMEv0.3) model, forced with prescribed sea surface temperatures, to study the responses and feedbacks of terrestrial ecosystems to the droughts induced by both of these events. The ACME model was configured to run with active atmosphere and land models alongside the "data" ocean and thermodynamic sea ice models. The Community Atmosphere Model used the Spectral Element dynamical core (CAM-SE) operating on the ne30 (~1°) grid, and the ACME Land Model (ALM) was equivalent to the Community Land Model with prognostic biogeochemistry (CLM4.5-BGC). Using Optimal Interpolation SSTs (OISSTv2) and predicted SST anomalies from NCEP's Climate Forecast System (CFSv2) as forcing, we conducted a transient simulation from 1995 to 2020, following a spin-up simulation, and analyzed the ENSO impacts on tropical terrestrial ecosystems for the 5-year periods centered on these two strong ENSO events. During the transient simulation, we saved the resulting atmospheric forcing, which included prognostic biosphere-atmosphere interactions, every three hours for use in future offline simulations for model development and testing. We will present simulation results, focusing on hydroclimatic anomalies as

  8. The effects of climate downscaling technique and observational data set on modeled ecological responses.

    Science.gov (United States)

    Pourmokhtarian, Afshin; Driscoll, Charles T; Campbell, John L; Hayhoe, Katharine; Stoner, Anne M K

    2016-07-01

    Assessments of future climate change impacts on ecosystems typically rely on multiple climate model projections but often utilize only one downscaling approach trained on one set of observations. Here, we explore the extent to which modeled biogeochemical responses to changing climate are affected by the selection of the climate downscaling method and the training observations used at the montane landscape of the Hubbard Brook Experimental Forest, New Hampshire, USA. We evaluated three downscaling methods: the delta method (or change factor method), monthly quantile mapping (Bias Correction-Spatial Disaggregation, or BCSD), and daily quantile regression (Asynchronous Regional Regression Model, or ARRM). Additionally, we trained outputs from four atmosphere-ocean general circulation models (AOGCMs) (CCSM3, HadCM3, PCM, and GFDL-CM2.1), driven by higher (A1fi) and lower (B1) future emissions scenarios, on two sets of observations (a 1/8° resolution grid vs. an individual weather station) to generate the high-resolution climate input for the forest biogeochemical model PnET-BGC (eight ensembles of six runs). The choice of downscaling approach and the spatial resolution of the observations used to train the downscaling model affected modeled soil moisture and streamflow, which in turn affected forest growth, net N mineralization, net soil nitrification, and stream chemistry. All three downscaling methods were highly sensitive to the observations used, resulting in projections that were significantly different between station-based and grid-based observations. The choice of downscaling method also slightly affected the results, although not as much as the choice of observations. Using spatially smoothed gridded observations and/or methods that do not resolve sub-monthly shifts in the distribution of temperature and/or precipitation can produce biased results in model applications run at greater temporal and/or spatial resolutions. These results underscore the importance of
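
    Of the three downscaling approaches evaluated above, the delta (change factor) method is the simplest, adding the GCM-projected monthly change to an observed station or grid record. The sketch below shows that logic for temperature with toy numbers; BCSD and ARRM additionally adjust the full distribution (monthly or daily quantiles) rather than only the mean.

        import numpy as np

        def delta_downscale_temperature(obs_daily_t, obs_month, gcm_hist_monthly, gcm_fut_monthly):
            """Delta-method downscaling for temperature: add the monthly GCM change
            (future minus historical climatology) to each observed day of that month.

            obs_daily_t : observed daily temperature at the station (deg C)
            obs_month   : month index (1-12) for each observed day
            """
            change = np.asarray(gcm_fut_monthly) - np.asarray(gcm_hist_monthly)
            return np.asarray(obs_daily_t) + change[np.asarray(obs_month) - 1]

        # Toy example: a +3 deg C January change applied to observed January days.
        obs_t = np.array([-5.0, -4.0, -6.0])
        months = np.array([1, 1, 1])
        future_t = delta_downscale_temperature(obs_t, months, [0.0] * 12, [3.0] + [0.0] * 11)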

  9. Asia-MIP: Multi Model-data Synthesis of Terrestrial Carbon Cycles in Asia

    Science.gov (United States)

    Ichii, K.; Kondo, M.; Ito, A.; Kang, M.; Sasai, T.; SATO, H.; Ueyama, M.; Kobayashi, H.; Saigusa, N.; Kim, J.

    2013-12-01

    Asia, which is characterized by a monsoon climate and intense human activities, is among the most understudied regions in terms of terrestrial carbon budgets and mechanisms of carbon exchange. To better understand the terrestrial carbon cycle in Asia, we initiated a multi-model and data intercomparison project in Asia (Asia-MIP). We analyzed outputs from multiple approaches: satellite-based observations (AVHRR and MODIS) and related products, empirically upscaled estimates (Support Vector Regression) using the eddy-covariance observation network in Asia (AsiaFlux, CarboEastAsia, FLUXNET), ~10 terrestrial biosphere models (e.g., BEAMS, Biome-BGC, LPJ, SEIB-DGVM, TRIFFID, VISIT), and atmospheric inversion analyses (e.g., TransCom models). We focused on two different temporal coverages: long-term (30 years; 1982-2011) and decadal (10 years; 2001-2010; the data-intensive period) scales. The region covering Siberia, Far East Asia, East Asia, Southeast Asia, and South Asia (60-80E, 10S-80N) was analyzed in this study to assess the magnitudes, interannual variability, and key driving factors of carbon cycles. We will report the progress of this synthesis effort to quantify the terrestrial carbon budget in Asia. First, we analyzed the recent trends in gross primary productivity (GPP) using satellite-based observations (AVHRR) and multiple terrestrial biosphere models. We found that both the model outputs and the satellite-based observations consistently show an increasing trend in GPP in most regions of Asia. The mechanisms of the GPP increase were analyzed using the models; changes in temperature and precipitation play dominant roles in the GPP increase in boreal and temperate regions, whereas changes in atmospheric CO2 and precipitation are important in tropical regions. However, their relative contributions differed. Second, in the decadal analysis (2001-2010), we found that the negative GPP and carbon uptake anomalies in the summer of 2003 in Far East Asia are among the largest

  10. Using LiDAR and quickbird data to model plant production and quantify uncertainties associated with wetland detection and land cover generalizations

    Science.gov (United States)

    Cook, B.D.; Bolstad, P.V.; Naesset, E.; Anderson, R. Scott; Garrigues, S.; Morisette, J.T.; Nickeson, J.; Davis, K.J.

    2009-01-01

    Spatiotemporal data from satellite remote sensing and surface meteorology networks have made it possible to continuously monitor global plant production, and to identify global trends associated with land cover/use and climate change. Gross primary production (GPP) and net primary production (NPP) are routinely derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) onboard satellites Terra and Aqua, and estimates generally agree with independent measurements at validation sites across the globe. However, the accuracy of GPP and NPP estimates in some regions may be limited by the quality of model input variables and heterogeneity at fine spatial scales. We developed new methods for deriving model inputs (i.e., land cover, leaf area, and photosynthetically active radiation absorbed by plant canopies) from airborne laser altimetry (LiDAR) and Quickbird multispectral data at resolutions ranging from about 30 m to 1 km. In addition, LiDAR-derived biomass was used as a means for computing carbon-use efficiency. Spatial variables were used with temporal data from ground-based monitoring stations to compute a six-year GPP and NPP time series for a 3600 ha study site in the Great Lakes region of North America. Model results compared favorably with independent observations from a 400 m flux tower and a process-based ecosystem model (BIOME-BGC), but only after removing vapor pressure deficit as a constraint on photosynthesis from the MODIS global algorithm. Fine-resolution inputs captured more of the spatial variability, but estimates were similar to coarse-resolution data when integrated across the entire landscape. Failure to account for wetlands had little impact on landscape-scale estimates, because vegetation structure, composition, and conversion efficiencies were similar to upland plant communities. Plant productivity estimates were noticeably improved using LiDAR-derived variables, while uncertainties associated with land cover generalizations and

  11. A statistical light use efficiency model explains 85% variations in global GPP

    Science.gov (United States)

    Jiang, C.; Ryu, Y.

    2016-12-01

    Photosynthesis is a complicated process whose modeling requires different levels of assumptions, simplification, and parameterization. Among models, the light use efficiency (LUE) model is highly compact but powerful for monitoring gross primary production (GPP) from satellite data. Most LUE models adopt a multiplicative form of maximum LUE, absorbed photosynthetically active radiation (APAR), and temperature and water stress functions. However, maximum LUE is a fitting parameter with large spatial variations, yet most studies only use a few biome-dependent constants. In addition, the stress functions in the literature are empirical and somewhat arbitrary. Moreover, the meteorological data used are usually coarse resolution, e.g., 1°, which can cause large errors. Finally, sunlit and shade canopies have completely different light responses, but this is rarely considered. Targeting these issues, we derived a new statistical LUE model from a process-based and satellite-driven model, the Breathing Earth System Simulator (BESS). We had already derived a set of global radiation (5-km resolution) and carbon and water flux (1-km resolution) products for 2000 to 2015 from BESS. By exploring these datasets, we found a strong correlation between APAR and GPP for sunlit (R2=0.84) and shade (R2=0.96) canopy, respectively. A simple model, driven only by sunlit and shade APAR, was thus built based on these linear relationships. The slopes of the linear functions act as effective LUEs of the global ecosystem, with values of 0.0232 and 0.0128 umol C/umol quanta for sunlit and shade canopy, respectively. When compared with the MPI-BGC GPP product, a global proxy of FLUXNET data, BESS-LUE achieved an overall accuracy of R2 = 0.85, whereas the original BESS was R2 = 0.83 and the MODIS GPP product was R2 = 0.76. We investigated spatiotemporal variations of the effective LUE. Spatially, the ratio of sunlit to shade values ranged from 0.1 (wet tropics) to 4.5 (dry inland). By using maps of sunlit and shade effective LUE the accuracy of
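
    The two-slope model described above can be stated directly; the sketch below uses the effective LUE values quoted in the abstract (0.0232 and 0.0128 umol C per umol quanta for sunlit and shade canopy), while the APAR inputs are placeholders.

        def gpp_two_leaf_lue(apar_sunlit, apar_shade,
                             lue_sunlit=0.0232, lue_shade=0.0128):
            """Statistical two-leaf LUE model: GPP (umol C m-2 s-1) as a linear
            function of sunlit and shade APAR (umol quanta m-2 s-1), with slopes
            equal to the global effective LUE values reported above."""
            return lue_sunlit * apar_sunlit + lue_shade * apar_shade

        gpp = gpp_two_leaf_lue(apar_sunlit=800.0, apar_shade=300.0)   # placeholder APAR values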

  12. Assessing the impacts of climate change and nitrogen deposition on Norway spruce growth in Austria with BIOME-BGC

    Energy Technology Data Exchange (ETDEWEB)

    Eastaugh, Chris S.; Potzelsberger, Elisabeth; Hasenaueur, Hubert

    2011-03-15

    The purpose of this study is to determine whether climate change has had an apparent impact on Austrian forests. The research was conducted on Norway spruce forests, as this is the predominant species in Austria. Growth data from regions with different temperature and precipitation trends were then compared, with results showing increased productivity in all regions, implying that forest growth is driven by factors other than climate. This conclusion is consistent with previous studies suggesting that forest growth is mainly driven by increasing nitrogen deposition.

  13. Generating daily weather data for ecosystem modelling in the Congo River Basin

    Science.gov (United States)

    Petritsch, Richard; Pietsch, Stephan A.

    2010-05-01

    based on the ratio of values on rainy days and days without rain, respectively. For assessing the impact of our correction, we simulated the ecosystem behaviour using the climate data from Lastourville, Moanda and Mouilla with the mechanistic ecosystem model Biome-BGC. Differences in terms of the carbon, nitrogen and water cycle were subsequently analysed and discussed.

  14. Simulation and sensitivity analysis of carbon storage and fluxes in the New Jersey Pinelands

    Science.gov (United States)

    Zewei Miao; Richard G. Lathrop; Ming Xu; Inga P. La Puma; Kenneth L. Clark; John Hom; Nicholas Skowronski; Steve. Van Tuyl

    2011-01-01

    A major challenge in modeling the carbon dynamics of vegetation communities is the proper parameterization and calibration of eco-physiological variables that are critical determinants of the ecosystem process-based model behavior. In this study, we improved and calibrated a biochemical process-based WxBGC model by using in situ AmeriFlux eddy covariance tower...

  15. Multiscale Computation. Needs and Opportunities for BER Science

    Energy Technology Data Exchange (ETDEWEB)

    Scheibe, Timothy D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, Jeremy C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-01

    The Environmental Molecular Sciences Laboratory (EMSL), a scientific user facility managed by Pacific Northwest National Laboratory for the U.S. Department of Energy, Office of Biological and Environmental Research (BER), conducted a one-day workshop on August 26, 2014 on the topic of "Multiscale Computation: Needs and Opportunities for BER Science." Twenty invited participants, from various computational disciplines within the BER program research areas, were charged with the following objectives: identify BER-relevant models and their potential cross-scale linkages that could be exploited to better connect molecular-scale research to BER research at larger scales; and identify critical science directions that will motivate EMSL decisions regarding future computational (hardware and software) architectures.

  16. Regional carbon cycle responses to 25 years of variation in climate and disturbance in the US Pacific Northwest

    Science.gov (United States)

    David P. Turner; William D. Ritts; Robert E. Kennedy; Andrew N. Gray; Zhiqiang Yang

    2016-01-01

    Variation in climate, disturbance regime, and forest management strongly influence terrestrial carbon sources and sinks. Spatially distributed, process-based, carbon cycle simulation models provide a means to integrate information on these various influences to estimate carbon pools and flux over large domains. Here we apply the Biome-BGC model over the four-state...

  17. Policy, Protectionism and the Competent Child.

    Science.gov (United States)

    Wyness, Michael G.

    1996-01-01

    Examines the way recent childhood policy initiatives in Britain have generated contradictory models of child competence and adults' role, and attempts to locate policy within broader understandings of social change. Draws from case material on two dominant child care issues: child sex abuse and school deviance. (BGC)

  18. Estimating climate change effects on net primary production of rangelands in the United States

    Science.gov (United States)

    Matthew C. Reeves; Adam L. Moreno; Karen E. Bagne; Steven W. Running

    2014-01-01

    The potential effects of climate change on net primary productivity (NPP) of U.S. rangelands were evaluated using estimated climate regimes from the A1B, A2 and B2 global change scenarios imposed on the biogeochemical cycling model, Biome-BGC from 2001 to 2100. Temperature, precipitation, vapor pressure deficit, day length, solar radiation, CO2 enrichment and nitrogen...

  19. Assessing the effect of climate change on carbon sequestration in a Mexican dry forest in the Yucatan Peninsula

    Science.gov (United States)

    Z. Dai; K.D. Johnson; R.A. Birdsey; J.L. Hernandez-Stefanoni; J.M. Dupuy

    2015-01-01

    Assessing the effect of climate change on carbon sequestration in tropical forest ecosystems is important to inform monitoring, reporting, and verification (MRV) for reducing deforestation and forest degradation (REDD), and to effectively assess forest management options under climate change. Two process-based models, Forest-DNDC and Biome-BGC, with different spatial...

  20. Cytotoxicity assessment of functionalized CdSe, CdTe and InP quantum dots in two human cancer cell models

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Jing [Institute of Gerontology and Geriatrics & Beijing Key Lab of Aging and Geriatrics, Chinese PLA General Hospital, Beijing 100853 (China); Hu, Rui [School of Electrical and Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore 639798 (Singapore); Liu, Jianwei [Institute of Gerontology and Geriatrics & Beijing Key Lab of Aging and Geriatrics, Chinese PLA General Hospital, Beijing 100853 (China); Zhang, Butian; Wang, Yucheng [School of Electrical and Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore 639798 (Singapore); Liu, Xin [Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Law, Wing-Cheung [Department of Industrial and System Engineering, The Hong Kong Polytechnic University, Hung Hom (Hong Kong); Liu, Liwei [School of Science, Changchun University of Science and Technology, Changchun 130022 (China); Ye, Ling, E-mail: lye_301@163.com [Institute of Gerontology and Geriatrics & Beijing Key Lab of Aging and Geriatrics, Chinese PLA General Hospital, Beijing 100853 (China); Yong, Ken-Tye, E-mail: ktyong@ntu.edu.sg [School of Electrical and Electronic Engineering, Nanyang Technological University, 50 Nanyang Avenue, Singapore 639798 (Singapore)

    2015-12-01

    The toxicity of quantum dots (QDs) has been extensively studied over the past decade. Some common factors that give rise to QD toxicity include the release of heavy metal ions from degraded QDs and the generation of reactive oxygen species on the QD surface. In addition to these factors, we should also carefully examine other potential causes of QD toxicity that play crucial roles in the overall biological response. In this contribution, we have performed a cytotoxicity assessment of four types of QD formulations in two different human cancer cell models. The four types of QD formulations, namely mercaptopropionic acid modified CdSe/CdS/ZnS QDs (CdSe-MPA), PEGylated phospholipid encapsulated CdSe/CdS/ZnS QDs (CdSe-Phos), PEGylated phospholipid encapsulated InP/ZnS QDs (InP-Phos), and Pluronic F127 encapsulated CdTe/ZnS QDs (CdTe-F127), are representative of the QD formulations commonly used in biomedical applications. Both the core materials and the surface modifications have been taken into consideration as the key factors in the cytotoxicity assessment. Through side-by-side comparison and careful evaluation, we have found that the toxicity of QDs does not depend solely on a single factor but rather on a combination of elements of the particle formulation. More importantly, our toxicity assessment shows different cytotoxicity trends for the prepared formulations tested on gastric adenocarcinoma (BGC-823) and neuroblastoma (SH-SY5Y) cell lines. We further propose that the cellular uptake of these nanocrystals plays an important role in determining the final fate of the formulation's toxicity impact. The results suggest that the toxicity of QDs is rather complex and cannot be generalized under the few assumptions reported previously. We suggest that QD toxicity has to be evaluated on a case-by-case basis, which indicates that standard procedures and comprehensive

  2. Model(ing) Law

    DEFF Research Database (Denmark)

    Carlson, Kerstin

    The International Criminal Tribunal for the former Yugoslavia (ICTY) was the first and most celebrated of a wave of international criminal tribunals (ICTs) built in the 1990s designed to advance liberalism through international criminal law. Model(ing) Justice examines the case law of the ICTY...

  3. Models and role models

    NARCIS (Netherlands)

    ten Cate, J.M.

    2015-01-01

    Developing experimental models to understand dental caries has been the theme in our research group. Our first, the pH-cycling model, was developed to investigate the chemical reactions in enamel or dentine, which lead to dental caries. It aimed to leverage our understanding of the fluoride mode of

  4. Terrestrial biogeochemical feedbacks in the climate system: from past to future

    Energy Technology Data Exchange (ETDEWEB)

    Arneth, A.; Harrison, S. P.; Zaehle, S.; Tsigaridis, K; Menon, S; Bartlein, P.J.; Feichter, J; Korhola, A; Kulmala, M; O' Donnell, D; Schurgers, G; Sorvari, S; Vesala, T

    2010-01-05

    The terrestrial biosphere plays a major role in the regulation of atmospheric composition, and hence climate, through multiple interlinked biogeochemical cycles (BGC). Ice-core and other palaeoenvironmental records show a fast response of vegetation cover and exchanges with the atmosphere to past climate change, although the phasing of these responses reflects spatial patterning and complex interactions between individual biospheric feedbacks. Modern observations show a similar responsiveness of terrestrial biogeochemical cycles to anthropogenically-forced climate changes and air pollution, with equally complex feedbacks. For future conditions, although carbon cycle-climate interactions have been a major focus, other BGC feedbacks could be as important in modulating climate changes. The additional radiative forcing from terrestrial BGC feedbacks other than those conventionally attributed to the carbon cycle is in the range of 0.6 to 1.6 Wm{sup -2}; all taken together we estimate a possible maximum of around 3 Wm{sup -2} towards the end of the 21st century. There are large uncertainties associated with these estimates but, given that the majority of BGC feedbacks result in a positive forcing because of the fundamental link between metabolic stimulation and increasing temperature, improved quantification of these feedbacks and their incorporation in earth system models is necessary in order to develop coherent plans to manage ecosystems for climate mitigation.

  5. Understanding the interaction between wild fire and vegetation distribution within the NCAR CESM framework

    Science.gov (United States)

    Seo, H.; Kim, Y.; Kim, H. J.

    2017-12-01

    Every year, wildfire burns about 400 Mha of land and releases about 2 Pg of carbon from the surface. Fire therefore affects not only the carbon cycle but also the terrestrial ecosystems themselves. This study aims to understand the role of fire in the geographic vegetation distribution and the terrestrial carbon balance within the NCAR CESM framework, specifically with the CLM-BGC and CLM-BGC-DV. Global climate data from the Climatic Research Unit (CRU)-National Centers for Environmental Prediction (NCEP) dataset, spanning 1901 to 2010, are used to drive the land models. First, by comparing fire-on and fire-off simulations with the CLM-BGC-DV, the fire impacts on dynamic vegetation are quantified through the fractional land areas of the different plant functional types. In addition, we examine how changes in vegetation distribution affect the total burned area and the carbon balance. This study will identify the limitations of, and provide suggestions for, the fire and dynamic vegetation modules of the CLM-BGC. Acknowledgements: This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (2015R1C1A2A01054800) and by the Korea Meteorological Administration R&D Program under Grant KMIPA 2015-6180. This work was also supported by the Yonsei University Future-leading Research Initiative of 2015 (2016-22-0061).
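
    A hypothetical post-processing sketch of the fire-on/fire-off comparison described above is given below (Python). The variable names, array shapes and the use of area-weighted global means are illustrative assumptions, not the actual CLM-BGC-DV output conventions.

        # Sketch: quantify the fire impact on vegetation distribution as the change in
        # global, area-weighted fractional land area per plant functional type (PFT)
        # between a fire-on and a fire-off run. All names and shapes are assumed.
        import numpy as np

        def pft_fraction_change(frac_fire_on, frac_fire_off, area):
            """Area-weighted change in PFT fractions (fire-on minus fire-off).

            frac_fire_on, frac_fire_off : arrays of shape (n_pft, n_gridcell)
            area : array of shape (n_gridcell,) holding grid-cell land areas
            """
            w = area / area.sum()                    # grid-cell weights
            on = (frac_fire_on * w).sum(axis=1)      # global mean fraction per PFT, fire on
            off = (frac_fire_off * w).sum(axis=1)    # global mean fraction per PFT, fire off
            return on - off                          # positive values: PFT gains area under fire

        # Toy usage with random numbers standing in for model output
        rng = np.random.default_rng(0)
        n_pft, n_cell = 15, 1000
        frac_on = rng.dirichlet(np.ones(n_pft), size=n_cell).T
        frac_off = rng.dirichlet(np.ones(n_pft), size=n_cell).T
        area = rng.uniform(1.0, 2.0, size=n_cell)
        print(pft_fraction_change(frac_on, frac_off, area))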

  6. Cognitive modeling

    OpenAIRE

    Zandbelt, Bram

    2017-01-01

    Introductory presentation on cognitive modeling for the course ‘Cognitive control’ of the MSc program Cognitive Neuroscience at Radboud University. It addresses basic questions, such as 'What is a model?', 'Why use models?', and 'How to use models?'

  7. Modelling the models

    CERN Multimedia

    Anaïs Schaeffer

    2012-01-01

    By analysing the production of mesons in the forward region of LHC proton-proton collisions, the LHCf collaboration has provided key information needed to calibrate extremely high-energy cosmic ray models. (Figure: average transverse momentum (pT) as a function of rapidity loss ∆y; black dots: LHCf data; red diamonds: SPS experiment UA7 results; open boxes: sibyll 2.1; open circles: qgsjet II-03; open triangles: epos 1.99. Among these models, epos 1.99 shows the best overall agreement with the LHCf data.) LHCf is dedicated to the measurement of neutral particles emitted at extremely small angles in the very forward region of LHC collisions. Two imaging calorimeters – Arm1 and Arm2 – take data 140 m either side of the ATLAS interaction point. “The physics goal of this type of analysis is to provide data for calibrating the hadron interaction models – the well-known…

  8. Modelling Practice

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    This chapter deals with the practicalities of building, testing, deploying and maintaining models. It gives specific advice for each phase of the modelling cycle. To do this, a modelling framework is introduced which covers: problem and model definition; model conceptualization; model data...... requirements; model construction; model solution; model verification; model validation and finally model deployment and maintenance. Within the adopted methodology, each step is discussed through the consideration of key issues and questions relevant to the modelling activity. Practical advice, based on many...... years of experience, is provided to direct the reader in their activities. Traps and pitfalls are discussed and strategies are also given to improve model development towards “fit-for-purpose” models. The emphasis in this chapter is the adoption and exercise of a modelling methodology that has proven very...

  9. STRUCTURAL MODELLING

    Directory of Open Access Journals (Sweden)

    Tea Ya. Danelyan

    2014-01-01

    Full Text Available The article states the general principles of structural modeling in aspect of the theory of systems and gives the interrelation with other types of modeling to adjust them to the main directions of modeling. Mathematical methods of structural modeling, in particular method of expert evaluations are considered.

  10. (HEV) Model

    African Journals Online (AJOL)

    Moatez Billah HARIDA

    The use of the simulator “Hybrid Electrical Vehicle Model Balances Fidelity and Speed (HEVMBFS)” and the global control strategy make it possible to achieve encouraging results. Key words: series-parallel hybrid vehicle - nonlinear model - linear model - Diesel engine - Engine modelling - HEV simulator - Predictive ...

  11. Constitutive Models

    DEFF Research Database (Denmark)

    Sales-Cruz, Mauricio; Piccolo, Chiara; Heitzig, Martina

    2011-01-01

    This chapter presents various types of constitutive models and their applications. There are 3 aspects dealt with in this chapter, namely: creation and solution of property models, the application of parameter estimation and finally application examples of constitutive models. A systematic...... procedure is introduced for the analysis and solution of property models. Models that capture and represent the temperature dependent behaviour of physical properties are introduced, as well as equation of state models (EOS) such as the SRK EOS. Modelling of liquid phase activity coefficients are also...

  12. Model theory

    CERN Document Server

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  13. Galactic models

    International Nuclear Information System (INIS)

    Buchler, J.R.; Gottesman, S.T.; Hunter, J.H. Jr.

    1990-01-01

    Various papers on galactic models are presented. Individual topics addressed include: observations relating to galactic mass distributions; the structure of the Galaxy; mass distribution in spiral galaxies; rotation curves of spiral galaxies in clusters; grand design, multiple arm, and flocculent spiral galaxies; observations of barred spirals; ringed galaxies; elliptical galaxies; the modal approach to models of galaxies; self-consistent models of spiral galaxies; dynamical models of spiral galaxies; N-body models. Also discussed are: two-component models of galaxies; simulations of cloudy, gaseous galactic disks; numerical experiments on the stability of hot stellar systems; instabilities of slowly rotating galaxies; spiral structure as a recurrent instability; model gas flows in selected barred spiral galaxies; bar shapes and orbital stochasticity; three-dimensional models; polar ring galaxies; dynamical models of polar rings

  14. Interface models

    DEFF Research Database (Denmark)

    Ravn, Anders P.; Staunstrup, Jørgen

    1994-01-01

    This paper proposes a model for specifying interfaces between concurrently executing modules of a computing system. The model does not prescribe a particular type of communication protocol and is aimed at describing interfaces between both software and hardware modules or a combination of the two....... The model describes both functional and timing properties of an interface...

  15. Hydrological models are mediating models

    Science.gov (United States)

    Babel, L. V.; Karssenberg, D.

    2013-08-01

    Despite the increasing role of models in hydrological research and decision-making processes, only a few accounts of the nature and function of models exist in hydrology. Earlier considerations have traditionally been conducted while making a clear distinction between physically-based and conceptual models. A new philosophical account, primarily based on the fields of physics and economics, transcends classes of models and scientific disciplines by considering models as "mediators" between theory and observations. The core of this approach lies in identifying models as (1) being only partially dependent on theory and observations, (2) integrating non-deductive elements in their construction, and (3) carrying the role of instruments of scientific enquiry about both theory and the world. The applicability of this approach to hydrology is evaluated in the present article. Three widely used hydrological models, each showing a different degree of apparent physicality, are confronted with the main characteristics of the "mediating models" concept. We argue that irrespective of their kind, hydrological models depend on both theory and observations, rather than merely on one of these two domains. Their construction additionally involves a large number of miscellaneous, external ingredients, such as past experiences, model objectives, knowledge and preferences of the modeller, as well as hardware and software resources. We show that hydrological models convey the role of instruments in scientific practice by mediating between theory and the world. It follows from these considerations that the traditional distinction between physically-based and conceptual models is necessarily too simplistic and refers at best to the stage at which theory and observations are steering model construction. The large variety of ingredients involved in model construction deserves closer attention, as it is rarely explicitly presented in the peer-reviewed literature. We believe that devoting

  16. ICRF modelling

    International Nuclear Information System (INIS)

    Phillips, C.K.

    1985-12-01

    This lecture provides a survey of the methods used to model fast magnetosonic wave coupling, propagation, and absorption in tokamaks. The validity and limitations of three distinct types of modelling codes, which will be contrasted, include discrete models which utilize ray tracing techniques, approximate continuous field models based on a parabolic approximation of the wave equation, and full field models derived using finite difference techniques. Inclusion of mode conversion effects in these models and modification of the minority distribution function will also be discussed. The lecture will conclude with a presentation of time-dependent global transport simulations of ICRF-heated tokamak discharges obtained in conjunction with the ICRF modelling codes. 52 refs., 15 figs

  17. Modelling in Business Model design

    NARCIS (Netherlands)

    Simonse, W.L.

    2013-01-01

    It appears that business model design might not always produce a design or model as the expected result. However, when designers are involved, a visual model or artefact is produced. To assist strategic managers in thinking about how they can act, the designer's challenge is to combine strategy and

  18. Ventilation Model

    International Nuclear Information System (INIS)

    Yang, H.

    1999-01-01

    The purpose of this analysis and model report (AMR) for the Ventilation Model is to analyze the effects of pre-closure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts and provide heat removal data to support EBS design. It will also provide input data (initial conditions, and time varying boundary conditions) for the EBS post-closure performance assessment and the EBS Water Distribution and Removal Process Model. The objective of the analysis is to develop, describe, and apply calculation methods and models that can be used to predict thermal conditions within emplacement drifts under forced ventilation during the pre-closure period. The scope of this analysis includes: (1) Provide a general description of effects and heat transfer process of emplacement drift ventilation. (2) Develop a modeling approach to simulate the impacts of pre-closure ventilation on the thermal conditions in emplacement drifts. (3) Identify and document inputs to be used for modeling emplacement ventilation. (4) Perform calculations of temperatures and heat removal in the emplacement drift. (5) Address general considerations of the effect of water/moisture removal by ventilation on the repository thermal conditions. The numerical modeling in this document will be limited to heat-only modeling and calculations. Only a preliminary assessment of the heat/moisture ventilation effects and modeling method will be performed in this revision. Modeling of moisture effects on heat removal and emplacement drift temperature may be performed in the future

  19. Turbulence modelling

    International Nuclear Information System (INIS)

    Laurence, D.

    1997-01-01

    This paper is an introductory course in modelling turbulent thermohydraulics, aimed at computational fluid dynamics users. No specific knowledge other than the Navier-Stokes equations is required beforehand. Chapter I (which non-beginners can skip) provides basic ideas on turbulence physics and is taken up in a textbook prepared by the teaching team of the ENPC (Benque, Viollet). Chapter II describes turbulent-viscosity modelling and the two-equation k-ε model. It provides details of the channel flow case and the boundary conditions. Chapter III describes the 'standard' (R_ij-ε) Reynolds stress transport model and introduces more recent models called 'realizable'. A second paper deals with heat transfer and the effects of gravity, and returns to the Reynolds stress transport model. (author)
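
    For orientation, the eddy-viscosity closure behind the k-ε model mentioned above is conventionally written as $\nu_t = C_\mu k^2 / \varepsilon$ with $C_\mu \approx 0.09$, and the Boussinesq hypothesis relates the Reynolds stresses to the mean strain rate through $-\overline{u_i' u_j'} = \nu_t (\partial U_i/\partial x_j + \partial U_j/\partial x_i) - \tfrac{2}{3} k \delta_{ij}$. These are the standard textbook forms, quoted here as a reminder rather than taken from the course material itself.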

  20. Mathematical modelling

    CERN Document Server

    2016-01-01

    This book provides a thorough introduction to the challenge of applying mathematics in real-world scenarios. Modelling tasks rarely involve well-defined categories, and they often require multidisciplinary input from mathematics, physics, computer sciences, or engineering. In keeping with this spirit of modelling, the book includes a wealth of cross-references between the chapters and frequently points to the real-world context. The book combines classical approaches to modelling with novel areas such as soft computing methods, inverse problems, and model uncertainty. Attention is also paid to the interaction between models, data and the use of mathematical software. The reader will find a broad selection of theoretical tools for practicing industrial mathematics, including the analysis of continuum models, probabilistic and discrete phenomena, and asymptotic and sensitivity analysis.

  1. Modelling Overview

    DEFF Research Database (Denmark)

    Larsen, Lars Bjørn; Vesterager, Johan

    This report provides an overview of the existing models of global manufacturing, describes the required modelling views and associated methods and identifies tools which can provide support for this modelling activity. The model adopted for global manufacturing is that of an extended enterprise sharing many of the characteristics of a virtual enterprise. This extended enterprise will have the following characteristics: the extended enterprise is focused on satisfying the current customer requirement, so that it has a limited life expectancy but should be capable of being recreated to deal...... One or more units from beyond the network may complement the extended enterprise. The common reference model for this extended enterprise will utilise GERAM (Generalised Enterprise Reference Architecture and Methodology) to provide an architectural framework for the modelling carried out within......

  2. Mathematical modelling

    DEFF Research Database (Denmark)

    Blomhøj, Morten

    2004-01-01

    Developing competences for setting up, analysing and criticising mathematical models is normally seen as relevant only from and above upper secondary level. The general belief among teachers is that modelling activities presuppose conceptual understanding of the mathematics involved. Mathematical modelling, however, can be seen as a practice of teaching that places the relation between real life and mathematics at the centre of teaching and learning mathematics, and this is relevant at all levels. Modelling activities may motivate the learning process and help the learner to establish cognitive roots for the construction of important mathematical concepts. In addition, competences for setting up, analysing and criticising modelling processes and the possible use of models is a formative aim in its own right for mathematics teaching in general education. The paper presents a theoretical...

  3. Event Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2001-01-01

    The purpose of this chapter is to discuss conceptual event modeling within a context of information modeling. Traditionally, information modeling has been concerned with the modeling of a universe of discourse in terms of information structures. However, most interesting universes of discourse...... are dynamic and we present a modeling approach that can be used to model such dynamics. We characterize events as both information objects and change agents (Bækgaard 1997). When viewed as information objects events are phenomena that can be observed and described. For example, borrow events in a library can...... be characterized by their occurrence times and the participating books and borrowers. When we characterize events as information objects we focus on concepts like information structures. When viewed as change agents events are phenomena that trigger change. For example, when borrow event occurs books are moved...

  4. Model : making

    OpenAIRE

    Bottle, Neil

    2013-01-01

    The Model : making exhibition was curated by Brian Kennedy in collaboration with Allies & Morrison in September 2013. For the London Design Festival, the Model : making exhibition looked at the increased use of new technologies by both craft-makers and architectural model makers. In both practices traditional ways of making by hand are increasingly being combined with the latest technologies of digital imaging, laser cutting, CNC machining and 3D printing. This exhibition focussed on ...

  5. Spherical models

    CERN Document Server

    Wenninger, Magnus J

    2012-01-01

    Well-illustrated, practical approach to creating star-faced spherical forms that can serve as basic structures for geodesic domes. Complete instructions for making models from circular bands of paper with just a ruler and compass. Discusses tessellation, or tiling, and how to make spherical models of the semiregular solids and concludes with a discussion of the relationship of polyhedra to geodesic domes and directions for building models of domes. ". . . very pleasant reading." - Science. 1979 edition.

  6. Modeling Documents with Event Model

    Directory of Open Access Journals (Sweden)

    Longhui Wang

    2015-08-01

    Full Text Available Currently, deep learning has made great breakthroughs in visual and speech processing, mainly because it draws lessons from the hierarchical way in which the brain deals with images and speech. In the field of NLP, topic models are one of the important ways of modeling documents. Topic models are built on a generative model that clearly does not match the way humans write. In this paper, we propose Event Model, which is unsupervised and based on the language processing mechanisms of neurolinguistics, to model documents. In Event Model, documents are descriptions of concrete or abstract events seen, heard, or sensed by people, and words are objects in the events. Event Model has two stages: word learning and dimensionality reduction. Word learning learns the semantics of words using deep learning. Dimensionality reduction represents a document as a low-dimensional vector through a linear map that is completely different from topic models. Event Model achieves state-of-the-art results on document retrieval tasks.

  7. Didactical modelling

    DEFF Research Database (Denmark)

    Højgaard, Tomas; Hansen, Rune

    The purpose of this paper is to introduce Didactical Modelling as a research methodology in mathematics education. We compare the methodology with other approaches and argue that Didactical Modelling has its own specificity. We discuss the methodological “why” and explain why we find it useful...... to construct this approach in mathematics education research....

  8. Virtual modeling

    NARCIS (Netherlands)

    Flores, J.; Kiss, S.; Cano, P.; Nijholt, Antinus; Zwiers, Jakob

    2003-01-01

    We concentrate our efforts on building virtual modelling environments where the content creator uses controls (widgets) as an interactive adjustment modality for the properties of the edited objects. Besides the advantage of being an on-line modelling approach (visualised just like any other on-line

  9. Animal models

    DEFF Research Database (Denmark)

    Gøtze, Jens Peter; Krentz, Andrew

    2014-01-01

    In this issue of Cardiovascular Endocrinology, we are proud to present a broad and dedicated spectrum of reviews on animal models in cardiovascular disease. The reviews cover most aspects of animal models in science from basic differences and similarities between small animals and the human...

  10. Education models

    NARCIS (Netherlands)

    Poortman, Sybilla; Sloep, Peter

    2006-01-01

    Educational models describes a case study on a complex learning object. Possibilities are investigated for using this learning object, which is based on a particular educational model, outside of its original context. Furthermore, this study provides advice that might lead to an increase in

  11. Modeling Sunspots

    Science.gov (United States)

    Oh, Phil Seok; Oh, Sung Jin

    2013-01-01

    Modeling in science has been studied by education researchers for decades and is now being applied broadly in school. It is among the scientific practices featured in the "Next Generation Science Standards" ("NGSS") (Achieve Inc. 2013). This article describes modeling activities in an extracurricular science club in a high…

  12. Battery Modeling

    NARCIS (Netherlands)

    Jongerden, M.R.; Haverkort, Boudewijn R.H.M.

    2008-01-01

    The use of mobile devices is often limited by the capacity of the employed batteries. The battery lifetime determines how long one can use a device. Battery modeling can help to predict, and possibly extend this lifetime. Many different battery models have been developed over the years. However,

  13. Nonlinear Interactions between Climate and Atmospheric Carbon Dioxide Drivers of Terrestrial and Marine Carbon Cycle Changes

    Science.gov (United States)

    Hoffman, F. M.; Randerson, J. T.; Moore, J. K.; Goulden, M.; Fu, W.; Koven, C.; Swann, A. L. S.; Mahowald, N. M.; Lindsay, K. T.; Munoz, E.

    2017-12-01

    Quantifying interactions between global biogeochemical cycles and the Earth system is important for predicting future atmospheric composition and informing energy policy. We applied a feedback analysis framework to three sets of Historical (1850-2005), Representative Concentration Pathway 8.5 (2006-2100), and its extension (2101-2300) simulations from the Community Earth System Model version 1.0 (CESM1(BGC)) to quantify drivers of terrestrial and ocean responses of carbon uptake. In the biogeochemically coupled simulation (BGC), the effects of CO2 fertilization and nitrogen deposition influenced marine and terrestrial carbon cycling. In the radiatively coupled simulation (RAD), the effects of rising temperature and circulation changes due to radiative forcing from CO2, other greenhouse gases, and aerosols were the sole drivers of carbon cycle changes. In the third, fully coupled simulation (FC), both the biogeochemical and radiative coupling effects acted simultaneously. We found that climate-carbon sensitivities derived from RAD simulations produced a net ocean carbon storage climate sensitivity that was weaker and a net land carbon storage climate sensitivity that was stronger than those diagnosed from the FC and BGC simulations. For the ocean, this nonlinearity was associated with warming-induced weakening of ocean circulation and mixing that limited exchange of dissolved inorganic carbon between surface and deeper water masses. For the land, this nonlinearity was associated with strong gains in gross primary production in the FC simulation, driven by enhancements in the hydrological cycle and increased nutrient availability. We developed and applied a nonlinearity metric to rank model responses and driver variables. The climate-carbon cycle feedback gain at 2300 was 42% higher when estimated from climate-carbon sensitivities derived from the difference between FC and BGC than when derived from RAD. We re-analyzed other CMIP5 model results to quantify the
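
    The feedback bookkeeping implied by the BGC/RAD/FC experiment design can be sketched in a few lines (Python). The function below, the toy numbers and the relative nonlinearity metric are illustrative assumptions and are not taken from the CESM1(BGC) analysis itself.

        # Sketch: carbon-concentration (beta) and carbon-climate (gamma) sensitivities
        # estimated from the biogeochemically coupled (BGC) and radiatively coupled (RAD)
        # runs, with a simple nonlinearity metric comparing the fully coupled (FC)
        # response to the sum of the two partial responses. Inputs are changes in land or
        # ocean carbon storage (PgC), CO2 (ppm) and temperature (K) relative to control.
        def carbon_cycle_sensitivities(dC_bgc, dC_rad, dC_fc, dCO2, dT_rad):
            beta = dC_bgc / dCO2                        # PgC per ppm: response to CO2 alone
            gamma = dC_rad / dT_rad                     # PgC per K: response to climate alone
            nonlinearity = dC_fc - (dC_bgc + dC_rad)    # PgC unexplained by adding the two
            rel_nonlinearity = nonlinearity / dC_fc if dC_fc else float("nan")
            return beta, gamma, nonlinearity, rel_nonlinearity

        # Toy numbers (illustrative only, not CESM1(BGC) output)
        print(carbon_cycle_sensitivities(dC_bgc=450.0, dC_rad=-120.0, dC_fc=300.0,
                                         dCO2=1200.0, dT_rad=4.0))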

  14. Prostaglandins and prostaglandin receptor antagonism in migraine

    DEFF Research Database (Denmark)

    Antonova, Maria

    2013-01-01

    Human models of headache may contribute to understanding of prostaglandins' role in migraine pathogenesis. The current thesis investigated the migraine triggering effect of prostaglandin E2 (PGE2) in migraine patients without aura, the efficacy of a novel EP4 receptor antagonist, BGC20....... The infusion of PGE2 caused the immediate migraine-like attacks and vasodilatation of the middle cerebral artery in migraine patients without aura. The highly specific and potent EP4 receptor antagonist, BGC20-1531, was not able to attenuate PGE2-induced headache and vasodilatation of both intra- and extra......-cerebral arteries. The intravenous infusion of PGF2α did not induce headache or statistically significant vasoconstriction of cerebral arteries in healthy volunteers. Novel data on PGE2-provoked immediate migraine-like attacks suggest that PGE2 may be one of the important final products in the pathogenesis...

  15. VENTILATION MODEL

    International Nuclear Information System (INIS)

    V. Chipman

    2002-01-01

    The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of the heat produced by radionuclide decay that is carried away by the ventilation air. One minus the heat removal is called the wall heat fraction, i.e., the remaining fraction of heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the "Multiscale Thermohydrologic Model" (BSC 2001), use the wall heat fractions output by the Ventilation Model to initialize their postclosure analyses
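
    The wall heat fraction defined above can be illustrated with a minimal energy-balance sketch (Python). The constant, symbols and toy numbers are assumptions for illustration and do not reproduce the AMR's numerical model.

        # Sketch: ventilation heat-removal efficiency as the enthalpy gained by the air
        # stream divided by the decay heat generated over the same drift segment; the
        # wall heat fraction is one minus that efficiency.
        CP_AIR = 1006.0   # J/(kg K), approximate specific heat of air

        def wall_heat_fraction(m_dot, t_in, t_out, decay_heat):
            """m_dot: air mass flow (kg/s); t_in, t_out: inlet/outlet air temperature (C);
            decay_heat: heat generated by the waste over the drift segment (W)."""
            q_air = m_dot * CP_AIR * (t_out - t_in)   # heat carried away by the air (W)
            removal = q_air / decay_heat              # fraction removed by ventilation
            return 1.0 - removal                      # fraction conducted into the rock

        # Toy usage
        print(wall_heat_fraction(m_dot=15.0, t_in=25.0, t_out=35.0, decay_heat=2.0e5))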

  16. Modelling Constructs

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2009-01-01

    , these notations have been extended in order to increase expressiveness and to be more competitive. This resulted in an increasing number of notations and formalisms for modelling business processes and in an increase of the different modelling constructs provided by modelling notations, which makes it difficult......There are many different notations and formalisms for modelling business processes and workflows. These notations and formalisms have been introduced with different purposes and objectives. Later, influenced by other notations, comparisons with other tools, or by standardization efforts...... to compare modelling notations and to make transformations between them. One of the reasons is that, in each notation, the new concepts are introduced in a different way by extending the already existing constructs. In this chapter, we go the opposite direction: We show that it is possible to add most...

  17. OSPREY Model

    Energy Technology Data Exchange (ETDEWEB)

    Veronica J. Rutledge

    2013-01-01

    The absence of industrial scale nuclear fuel reprocessing in the U.S. has precluded the necessary driver for developing the advanced simulation capability now prevalent in so many other countries. Thus, it is essential to model complex series of unit operations to simulate, understand, and predict inherent transient behavior and feedback loops. A capability of accurately simulating the dynamic behavior of advanced fuel cycle separation processes will provide substantial cost savings and many technical benefits. The specific fuel cycle separation process discussed in this report is the off-gas treatment system. The off-gas separation consists of a series of scrubbers and adsorption beds to capture constituents of interest. Dynamic models are being developed to simulate each unit operation involved so each unit operation can be used as a stand-alone model and in series with multiple others. Currently, an adsorption model has been developed within Multi-physics Object Oriented Simulation Environment (MOOSE) developed at the Idaho National Laboratory (INL). Off-gas Separation and REcoverY (OSPREY) models the adsorption of off-gas constituents for dispersed plug flow in a packed bed under non-isothermal and non-isobaric conditions. Inputs to the model include gas, sorbent, and column properties, equilibrium and kinetic data, and inlet conditions. The simulation outputs component concentrations along the column length as a function of time from which breakthrough data is obtained. The breakthrough data can be used to determine bed capacity, which in turn can be used to size columns. It also outputs temperature along the column length as a function of time and pressure drop along the column length. Experimental data and parameters were input into the adsorption model to develop models specific for krypton adsorption. The same can be done for iodine, xenon, and tritium. The model will be validated with experimental breakthrough curves. Customers will be given access to
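
    As a rough illustration of how breakthrough output of this kind can be turned into a bed capacity, the sketch below integrates an idealized breakthrough curve (Python). The S-shaped curve, symbols and units are placeholders, not OSPREY data structures.

        # Sketch: estimate adsorption-bed loading from a breakthrough curve by integrating
        # the captured concentration (inlet minus outlet) over time.
        import numpy as np

        def bed_capacity(time_s, c_out, c_in, flow_m3_s, sorbent_kg):
            """Loading (mol adsorbed per kg sorbent) accumulated over the record."""
            y = c_in - c_out                                              # mol/m^3 captured
            captured = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(time_s))   # trapezoidal integral
            return flow_m3_s * captured / sorbent_kg

        # Toy breakthrough curve: outlet concentration rises toward the inlet value
        t = np.linspace(0.0, 3600.0, 200)                         # s
        c_in = 0.5                                                # mol/m^3
        c_out = c_in * (1.0 - np.exp(-(t / 1800.0) ** 4))         # idealized S-shaped curve
        print(bed_capacity(t, c_out, c_in, flow_m3_s=1.0e-3, sorbent_kg=2.0))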

  18. Molecular Science Computing: 2010 Greenbook

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Cowley, David E.; Dunning, Thom H.; Vorpagel, Erich R.

    2010-04-02

    This 2010 Greenbook outlines the science drivers for performing integrated computational environmental molecular research at EMSL and defines the next-generation HPC capabilities that must be developed at the MSC to address this critical research. The EMSL MSC Science Panel used EMSL’s vision and science focus and white papers from current and potential future EMSL scientific user communities to define the scientific direction and resulting HPC resource requirements presented in this 2010 Greenbook.

  19. Modeling Applications.

    Science.gov (United States)

    McMEEKIN, Thomas A; Ross, Thomas

    1996-12-01

    The concept of predictive microbiology has developed rapidly through the initial phases of experimental design and model development and the subsequent phase of model validation. A fully validated model represents a general rule which may be brought to bear on particular cases. For some microorganism/food combinations, sufficient confidence now exists to indicate substantial benefits to the food industry from use of predictive models. Several types of devices are available to monitor and record environmental conditions (particularly temperature). These "environmental histories" can be interpreted, using predictive models, in terms of microbial proliferation. The current challenge is to provide systems for the collection and interpretation of environmental information which combine ease of use, reliability, and security, providing the industrial user with the ability to make informed and precise decisions regarding the quality and safety of foods. Many specific applications for predictive modeling can be developed from a basis of understanding the inherent qualities of a fully validated model. These include increased precision and confidence in predictions based on accumulation of quantitative data, objective and rapid assessment of the effect of environmental conditions on microbial proliferation, and flexibility in monitoring the relative contribution of component parts of processing, distribution, and storage systems for assurance of shelf life and safety.
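
    As an illustration of interpreting an "environmental history" with a predictive model, the sketch below accumulates predicted growth over a logged temperature record using a square-root (Ratkowsky-type) secondary model. The coefficients and the choice of secondary model are assumptions made for illustration, not values from the article.

        # Sketch: integrate a square-root secondary growth model over a temperature log to
        # obtain the predicted log10 increase in microbial numbers.
        import math

        B = 0.02      # assumed slope of sqrt(growth rate) versus temperature
        T_MIN = -2.0  # assumed notional minimum growth temperature (degC)

        def log10_growth(temp_history_degC, step_h):
            """Predicted log10 increase in counts over an evenly sampled temperature record."""
            total = 0.0
            for temp in temp_history_degC:
                if temp > T_MIN:
                    mu = (B * (temp - T_MIN)) ** 2   # specific growth rate (ln units per hour)
                    total += mu * step_h
            return total / math.log(10.0)            # convert ln increase to log10 increase

        # Toy cold-chain record: 2 h at 4 degC, a 1 h abuse at 15 degC, then 4 h at 6 degC
        history = [4.0] * 2 + [15.0] * 1 + [6.0] * 4
        print(log10_growth(history, step_h=1.0))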

  20. A Model for Math Modeling

    Science.gov (United States)

    Lin, Tony; Erfan, Sasan

    2016-01-01

    Mathematical modeling is an open-ended research subject where no definite answers exist for any problem. Math modeling enables thinking outside the box to connect different fields of studies together including statistics, algebra, calculus, matrices, programming and scientific writing. As an integral part of society, it is the foundation for many…

  1. Modeling complexes of modeled proteins.

    Science.gov (United States)

    Anishchenko, Ivan; Kundrotas, Petras J; Vakser, Ilya A

    2017-03-01

    Structural characterization of proteins is essential for understanding life processes at the molecular level. However, only a fraction of known proteins have experimentally determined structures. This fraction is even smaller for protein-protein complexes. Thus, structural modeling of protein-protein interactions (docking) primarily has to rely on modeled structures of the individual proteins, which typically are less accurate than the experimentally determined ones. Such "double" modeling is the Grand Challenge of structural reconstruction of the interactome. Yet it remains so far largely untested in a systematic way. We present a comprehensive validation of template-based and free docking on a set of 165 complexes, where each protein model has six levels of structural accuracy, from 1 to 6 Å Cα RMSD. Many template-based docking predictions fall into acceptable quality category, according to the CAPRI criteria, even for highly inaccurate proteins (5-6 Å RMSD), although the number of such models (and, consequently, the docking success rate) drops significantly for models with RMSD > 4 Å. The results show that the existing docking methodologies can be successfully applied to protein models with a broad range of structural accuracy, and the template-based docking is much less sensitive to inaccuracies of protein models than the free docking. Proteins 2017; 85:470-478. © 2016 Wiley Periodicals, Inc.

  2. RNICE Model

    DEFF Research Database (Denmark)

    Pedersen, Mogens Jin; Stritch, Justin Michael

    2018-01-01

    ... research. Recently, scholars have issued calls for more replication, but academic reflections on when replication adds substantive value to public administration and management research are needed. This concise article presents a conceptual model, RNICE, for assessing when and how a replication study contributes knowledge about a social phenomenon and advances knowledge in the public administration and management literatures. The RNICE model provides a vehicle for researchers who seek to evaluate or demonstrate the value of a replication study systematically. We illustrate the practical application of the model using two previously published replication studies as examples.

  3. Correct Models

    OpenAIRE

    Blacher, René

    2010-01-01

    This report complements the two previous reports and gives a simpler explanation of the earlier results, namely a proof that the sequences obtained are random. In previous reports, we have shown how to transform a text $y_n$ into a random sequence by using Fibonacci functions $T_q$. Now, in this report, we obtain a clearer result by proving that $T_q(y_n)$ has the IID model as correct model. But it is necessary to define correctly what a correct model is. Then, we also study this pro...

  4. Paleoclimate Modeling

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Computer simulations of past climate. Variables provided as model output are described by parameter keyword. In some cases the parameter keywords are a subset of all...

  5. Anchor Modeling

    Science.gov (United States)

    Regardt, Olle; Rönnbäck, Lars; Bergholtz, Maria; Johannesson, Paul; Wohed, Petia

    Maintaining and evolving data warehouses is a complex, error prone, and time consuming activity. The main reason for this state of affairs is that the environment of a data warehouse is in constant change, while the warehouse itself needs to provide a stable and consistent interface to information spanning extended periods of time. In this paper, we propose a modeling technique for data warehousing, called anchor modeling, that offers non-destructive extensibility mechanisms, thereby enabling robust and flexible management of changes in source systems. A key benefit of anchor modeling is that changes in a data warehouse environment only require extensions, not modifications, to the data warehouse. This ensures that existing data warehouse applications will remain unaffected by the evolution of the data warehouse, i.e. existing views and functions will not have to be modified as a result of changes in the warehouse model.
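
    The non-destructive extension idea can be caricatured in a few lines (Python). Anchor modeling is defined over relational tables, so the dictionary layout and names below are loose stand-ins chosen purely for illustration.

        # Toy illustration: an anchor holds identities only, and each attribute lives in its
        # own table keyed by the anchor. Adding an attribute adds a table; nothing existing
        # is modified, so queries over the old tables keep working unchanged.
        warehouse = {
            "anchor_customer": {1, 2},                        # identities only
            "attr_customer_name": {1: "Ada", 2: "Grace"},     # one attribute, one table
        }

        def add_attribute(wh, name):
            """Non-destructive schema change: register a new, empty attribute table."""
            wh[name] = {}

        add_attribute(warehouse, "attr_customer_email")       # extension, not modification
        warehouse["attr_customer_email"][1] = "ada@example.org"

        # A pre-existing "view" over the old tables still works unchanged:
        names = {cid: warehouse["attr_customer_name"].get(cid)
                 for cid in warehouse["anchor_customer"]}
        print(names)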

  6. Linear Models

    CERN Document Server

    Searle, Shayle R

    2012-01-01

    This 1971 classic on linear models is once again available--as a Wiley Classics Library Edition. It features material that can be understood by any statistician who understands matrix algebra and basic statistical methods.

  7. Environmental Modeling

    Science.gov (United States)

    EPA's modeling community is working to gain insights into certain parts of a physical, biological, economic, or social system by conducting environmental assessments that inform Agency decision making on complex environmental issues.

  8. Quark models

    International Nuclear Information System (INIS)

    Rosner, J.L.

    1981-01-01

    This paper invites experimenters to consider the wide variety of tests suggested by the new aspects of quark models since the discovery of charm and beauty, and nonrelativistic models. Colors and flavours are counted and combined into hadrons. The current quark zoo is summarized. Models and theoretical background are studied under: qualitative QCD: strings and bags, potential models, relativistic effects, electromagnetic transitions, gluon emissions, and single quark transition descriptions. Hadrons containing quarks known before 1974 (i.e. that can be made of ''light'' quarks u, d, and s) are treated in Section III, while those containing charmed quarks and beauty (b) quarks are discussed in Section IV. Unfolding the properties of the sixth quark from information on its hadrons is seen as a future application of the methods used in this study

  9. Model theory

    CERN Document Server

    Hodges, Wilfrid

    1993-01-01

    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  10. Numerical models

    Digital Repository Service at National Institute of Oceanography (India)

    Unnikrishnan, A.S.; Manoj, N.T.

    developed most of the above models. This is a good approximation to simulate horizontal distribution of active and passive variables. The future challenge lies in developing capability to simulate the distribution in the vertical....

  11. Composite models

    International Nuclear Information System (INIS)

    Peccei, R.D.

    If quarks and leptons are composite, it should be possible eventually to calculate their mass spectrum and understand the reasons for the observed family replications, questions which lie beyond the standard model. Alas, all experimental evidence to date points towards quark and lepton elementarity, with the typical momentum scale Λsub(comp), beyond which effects of inner structure may be seen, probably being greater than 1 TeV. One supersymmetric preon model, explained here, provides a new dynamical alternative for obtaining light fermions, which is that these states are quasi Goldstone fermions. This, and similar models, are discussed. Although quasi Goldstone fermions provide an answer to the 0sup(th)-order question of composite models, the questions of how masses and families are generated remain unanswered. (U.K.)

  12. Ventilation models

    Science.gov (United States)

    Skaaret, Eimund

    Calculation procedures used in the design of ventilating systems are addressed, with particular attention to displacement ventilation and its relation to mixing ventilation. The two-zone flow model is considered, and its steady-state and transient solutions are addressed. Different methods of supplying air are discussed, and different types of air flow are considered: piston flow, plane flow and radial flow. An evaluation model for ventilation systems is presented.
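
    A minimal transient version of the two-zone balance referred to above might look as follows (Python sketch). The flow arrangement, symbols and explicit Euler time stepping are assumptions chosen for illustration, not the evaluation model presented in the paper.

        # Sketch: two-zone contaminant balance for displacement ventilation. Clean supply
        # air enters the lower zone, a plume carries air to the upper zone, extract is taken
        # from the upper zone, and the excess plume flow recirculates downward.
        import numpy as np

        def two_zone_transient(q_supply, q_plume, source, v_low, v_up, t_end, dt=1.0):
            """Return lower/upper zone concentration histories for a source in the upper zone."""
            n = int(t_end / dt)
            c_low = np.zeros(n)
            c_up = np.zeros(n)
            for k in range(1, n):
                dc_low = (-q_plume * c_low[k - 1]
                          + (q_plume - q_supply) * c_up[k - 1]) / v_low
                dc_up = (q_plume * c_low[k - 1] - q_supply * c_up[k - 1]
                         - (q_plume - q_supply) * c_up[k - 1] + source) / v_up
                c_low[k] = c_low[k - 1] + dt * dc_low
                c_up[k] = c_up[k - 1] + dt * dc_up
            return c_low, c_up

        # Toy usage: the extract concentration tends to source / q_supply at steady state
        c_low, c_up = two_zone_transient(q_supply=0.05, q_plume=0.08, source=1.0e-5,
                                         v_low=30.0, v_up=20.0, t_end=3600.0)
        print(c_up[-1], 1.0e-5 / 0.05)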

  13. Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester Allan; Vatrapu, Ravi

    2016-01-01

    effects, unicausal reduction, and case specificity. Based on the developments in set theoretical thinking in social sciences and employing methods like Qualitative Comparative Analysis (QCA), Necessary Condition Analysis (NCA), and set visualization techniques, in this position paper, we propose...... and demonstrate a new approach to maturity models in the domain of Information Systems. This position paper describes the set-theoretical approach to maturity models, presents current results and outlines future research work....

  14. Accelerated life models modeling and statistical analysis

    CERN Document Server

    Bagdonavicius, Vilijandas

    2001-01-01

    Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Varia

  15. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  16. Mechanistic models

    Energy Technology Data Exchange (ETDEWEB)

    Curtis, S.B.

    1990-09-01

    Several models and theories are reviewed that incorporate the idea of radiation-induced lesions (repairable and/or irreparable) that can be related to molecular lesions in the DNA molecule. Usually the DNA double-strand or chromatin break is suggested as the critical lesion. In the models, the shoulder on the low-LET survival curve is hypothesized as being due to one (or more) of the following three mechanisms: (1) "interaction" of lesions produced by statistically independent particle tracks; (2) nonlinear (i.e., linear-quadratic) increase in the yield of initial lesions, and (3) saturation of repair processes at high dose. Comparisons are made between the various approaches. Several significant advances in model development are discussed; in particular, a description of the matrix formulation of the Markov versions of the RMR and LPL models is given. The more advanced theories have incorporated statistical fluctuations in various aspects of the energy-loss and lesion-formation process. An important direction is the inclusion of physical and chemical processes into the formulations by incorporating relevant track structure theory (Monte Carlo track simulations) and chemical reactions of radiation-induced radicals. At the biological end, identification of repair genes and how they operate as well as a better understanding of how DNA misjoinings lead to lethal chromosome aberrations are needed for appropriate inclusion into the theories. More effort is necessary to model the complex end point of radiation-induced carcinogenesis.
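
    The low-LET "shoulder" referred to here is conventionally summarized by the linear-quadratic survival expression $S(D) = \exp[-(\alpha D + \beta D^2)]$, in which the $\alpha D$ term describes lesions produced by single tracks and the $\beta D^2$ term, attributed in these models to the interaction of lesions from statistically independent tracks, produces the downward-curving shoulder. This is the standard textbook form, quoted for orientation rather than taken from the report itself.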

  18. Bayesian integration of flux tower data into a process-based simulator for quantifying uncertainty in simulated output

    Directory of Open Access Journals (Sweden)

    R. Raj

    2018-01-01

    Full Text Available Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained using calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP data. We used GPP partitioned from flux tower measurements of a net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT, ratio of fine root carbon to leaf carbon (FRC : LC, ratio of carbon to nitrogen in leaf (C : Nleaf, canopy water interception coefficient (Wint, fraction of leaf nitrogen in RuBisCO (FLNR, and effective soil rooting depth (SD characterize the photosynthesis and carbon and nitrogen allocation in the forest. The calibration improved the root mean square error and enhanced Nash–Sutcliffe efficiency between simulated and flux tower daily GPP compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle for flux tower GPP was not reproduced exactly and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating the Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.

  19. Bayesian integration of flux tower data into a process-based simulator for quantifying uncertainty in simulated output

    Science.gov (United States)

    Raj, Rahul; van der Tol, Christiaan; Hamm, Nicholas Alexander Samuel; Stein, Alfred

    2018-01-01

    Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained using calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP) data. We used GPP partitioned from flux tower measurements of a net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT), ratio of fine root carbon to leaf carbon (FRC : LC), ratio of carbon to nitrogen in leaf (C : Nleaf), canopy water interception coefficient (Wint), fraction of leaf nitrogen in RuBisCO (FLNR), and effective soil rooting depth (SD) characterize the photosynthesis and carbon and nitrogen allocation in the forest. The calibration improved the root mean square error and enhanced Nash-Sutcliffe efficiency between simulated and flux tower daily GPP compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle for flux tower GPP was not reproduced exactly and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating the Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.
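
    The calibration workflow described in this record (and the preceding one) can be sketched with a toy example (Python). A simple light-use-efficiency stand-in replaces Biome-BGC inside the likelihood, and the prior, observation-error model and parameter are illustrative assumptions, not those used in the study.

        # Sketch: Metropolis sampling of one parameter of a toy GPP simulator against
        # synthetic "flux tower" GPP; running the real process-based simulator inside the
        # likelihood is the expensive step this sketch glosses over.
        import numpy as np

        rng = np.random.default_rng(42)

        def toy_gpp(lue, par):
            """Toy simulator: GPP = light-use efficiency * absorbed PAR."""
            return lue * par

        def log_posterior(lue, par, gpp_obs, sigma=1.0, lo=0.0, hi=5.0):
            if not (lo < lue < hi):                     # flat prior on the parameter
                return -np.inf
            resid = gpp_obs - toy_gpp(lue, par)
            return -0.5 * np.sum((resid / sigma) ** 2)  # Gaussian observation error

        def metropolis(par, gpp_obs, n_iter=5000, step=0.05):
            chain = np.empty(n_iter)
            current = 1.0
            lp_current = log_posterior(current, par, gpp_obs)
            for i in range(n_iter):
                proposal = current + step * rng.normal()
                lp_prop = log_posterior(proposal, par, gpp_obs)
                if np.log(rng.uniform()) < lp_prop - lp_current:
                    current, lp_current = proposal, lp_prop
                chain[i] = current
            return chain

        # Synthetic daily data generated with a known efficiency of 1.8
        par = rng.uniform(5.0, 25.0, size=365)
        gpp_obs = toy_gpp(1.8, par) + rng.normal(0.0, 1.0, size=365)
        chain = metropolis(par, gpp_obs)
        print(chain[1000:].mean(), chain[1000:].std())   # posterior mean and spread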

  20. Reflectance Modeling

    Science.gov (United States)

    Smith, J. A.; Cooper, K.; Randolph, M.

    1984-01-01

    A classical description of the one dimensional radiative transfer treatment of vegetation canopies was completed and the results were tested against measured prairie (blue grama) and agricultural canopies (soybean). Phase functions are calculated in terms of directly measurable biophysical characteristics of the canopy medium. While the phase functions tend to exhibit backscattering anisotropy, their exact behavior is somewhat more complex and wavelength dependent. A Monte Carlo model was developed that treats soil surfaces with large periodic variations in three dimensions. A photon-ray tracing technology is used. Currently, the rough soil surface is described by analytic functions and appropriate geometric calculations performed. A bidirectional reflectance distribution function is calculated and, hence, available for other atmospheric or canopy reflectance models as a lower boundary condition. This technique is used together with an adding model to calculate several cases where Lambertian leaves possessing anisotropic leaf angle distributions yield non-Lambertian reflectance; similar behavior is exhibited for simulated soil surfaces.
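
    In its simplest one-layer, hemispherically averaged form, the "adding model" step mentioned here combines canopy reflectance $R_c$ and transmittance $T_c$ with soil reflectance $R_s$ through the multiple-reflection series $R = R_c + T_c^2 R_s / (1 - R_c R_s)$. This generic adding formula is quoted for orientation only; in the bidirectional case the scalar quantities become angular distributions, with the Monte Carlo soil BRDF supplying the lower boundary condition.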

  1. Mathematical modeling

    CERN Document Server

    Eck, Christof; Knabner, Peter

    2017-01-01

    Mathematical models are the decisive tool to explain and predict phenomena in the natural and engineering sciences. With this book readers will learn to derive mathematical models which help to understand real world phenomena. At the same time a wealth of important examples for the abstract concepts treated in the curriculum of mathematics degrees are given. An essential feature of this book is that mathematical structures are used as an ordering principle and not the fields of application. Methods from linear algebra, analysis and the theory of ordinary and partial differential equations are thoroughly introduced and applied in the modeling process. Examples of applications in the fields electrical networks, chemical reaction dynamics, population dynamics, fluid dynamics, elasticity theory and crystal growth are treated comprehensively.

  2. Modelling language

    CERN Document Server

    Cardey, Sylviane

    2013-01-01

    In response to the need for reliable results from natural language processing, this book presents an original way of decomposing a language(s) in a microscopic manner by means of intra/inter‑language norms and divergences, going progressively from languages as systems to the linguistic, mathematical and computational models, which being based on a constructive approach are inherently traceable. Languages are described with their elements aggregating or repelling each other to form viable interrelated micro‑systems. The abstract model, which contrary to the current state of the art works in int

  3. Molecular modeling

    Directory of Open Access Journals (Sweden)

    Aarti Sharma

    2009-01-01

    The use of computational chemistry in the development of novel pharmaceuticals is becoming an increasingly important tool. In the past, drugs were simply screened for effectiveness. The recent advances in computing power and the exponential growth of the knowledge of protein structures have made it possible for organic compounds to be tailored to decrease the harmful side effects and increase the potency. This article provides a detailed description of the techniques employed in molecular modeling. Molecular modeling is a rapidly developing discipline, and has been supported by the dramatic improvements in computer hardware and software in recent years.

  4. Supernova models

    International Nuclear Information System (INIS)

    Woosley, S.E.; Weaver, T.A.

    1980-01-01

    Recent progress in understanding the observed properties of Type I supernovae as a consequence of the thermonuclear detonation of white dwarf stars and the ensuing decay of the ⁵⁶Ni produced therein is reviewed. Within the context of this model for Type I explosions and the 1978 model for Type II explosions, the expected nucleosynthesis and gamma-line spectra from both kinds of supernovae are presented. Finally, a qualitatively new approach to the problem of massive star death and Type II supernovae, based upon a combination of rotation and thermonuclear burning, is discussed.

  5. Cadastral Modeling

    DEFF Research Database (Denmark)

    Stubkjær, Erik

    2005-01-01

    to the modeling of an industrial sector, as it aims at rendering the basic concepts that relate to the domain of real estate and the pertinent human activities. The palpable objects are pieces of land and buildings, documents, data stores and archives, as well as persons in their diverse roles as owners, holders...... to land. The paper advances the position that cadastral modeling has to include not only the physical objects, agents, and information sets of the domain, but also the objectives or requirements of cadastral systems....

  6. Efficacy of Balloon-Guiding Catheter for Mechanical Thrombectomy in Patients with Anterior Circulation Ischemic Stroke.

    Science.gov (United States)

    Oh, Jae-Sang; Yoon, Seok-Mann; Shim, Jai-Joon; Doh, Jae-Won; Bae, Hack-Gun; Lee, Kyeong-Seok

    2017-03-01

    To evaluate the efficacy of a balloon-guiding catheter (BGC) during thrombectomy in anterior circulation ischemic stroke. Sixty-two patients with acute anterior circulation ischemic stroke were treated with thrombectomy using a Solitaire stent from 2011 to 2016. Patients were divided into the BGC group (n=24, 39%) and the non-BGC group (n=38, 61%). The number of retrievals, procedure time, thrombolysis in cerebral infarction (TICI) grade, presence of distal emboli, and clinical outcomes at 3 months were evaluated. Successful recanalization was more frequent in BGC than in non-BGC (83% vs. 66%, p = 0.13). Distal emboli occurred less often in BGC than in non-BGC (23.1% vs. 57.1%, p = 0.02). Good clinical outcome was more frequent in BGC than in non-BGC (50% vs. 16%, p = 0.03). The multivariate analysis showed that use of a BGC was the only independent predictor of good clinical outcome (odds ratio, 5.19; 95% confidence interval, 1.07-25.11). More patients in BGC were successfully recanalized in internal carotid artery (ICA) occlusion with small retrieval numbers (<3) than in non-BGC (70% vs. 24%, p = 0.005). In successfully recanalized ICA occlusion, distal emboli did not occur in BGC, whereas nine patients had distal emboli in non-BGC (0% vs. 75%, p = 0.001), and good clinical outcome was superior in BGC compared with non-BGC (55.6% vs. 8.3%, p = 0.01). A BGC significantly reduces the number of retrievals and the occurrence of distal emboli, thereby resulting in better clinical outcomes in patients with anterior circulation ischemic stroke, particularly with ICA occlusion.

  7. Institutional Plan FY 1999-2003

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, P.J.

    1999-02-08

    Computational science is becoming an increasingly important component of Pacific Northwest's support to DOE's major missions. The advanced parallel computing systems in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL), coupled with new modeling and simulation software, data management tools, and user interfaces, are providing solutions to previously intractable problems. Research focuses on developing software and other tools to address computational challenges in molecular science, environmental management, global climate change, advanced materials and manufacturing processes, molecular biology, and information management. The Graphics and Visualization Laboratory is part of EMSL's Molecular Science Computing Facility (MSCF). The MSCF contains a 512-processor IBM RISC System/6000 scalable power parallel computer system that provides the advanced computing capability needed to address "Grand Challenge" environmental research problems. The MSCF provides an integrated computing environment with links to facilities in the DOE complex, universities, and industry. The image inserts are graphical representations of simulations performed with software developed at the Laboratory.

  8. (SSE) model

    African Journals Online (AJOL)

    Simple analytic polynomials have been proposed for estimating solar radiation in the traditional Northern, Central and Southern regions of Malawi. There is a strong agreement between the polynomials and the SSE model with R2 values of 0.988, 0.989 and 0.989 and root mean square errors of 0.061, 0.057 and 0.062 ...

  9. Lens Model

    DEFF Research Database (Denmark)

    Nash, Ulrik William

    2014-01-01

    Firms consist of people who make decisions to achieve goals. How do these people develop the expectations which underpin the choices they make? The lens model provides one answer to this question. It was developed by cognitive psychologist Egon Brunswik (1952) to illustrate his theory...

  10. Markov model

    Indian Academy of Sciences (India)

    pattern of the watershed LULC, leading to an accretive linear growth of agricultural and settlement areas. The annual rate of ... thereby advocates for better agricultural practices with additional energy subsidy to arrest further forest loss and LULC ...... automaton model and GIS: Long-term urban growth prediction for San ...

  11. Cheating models

    DEFF Research Database (Denmark)

    Arnoldi, Jakob

    The article discusses the use of algorithmic models for so-called High Frequency Trading (HFT) in finance. HFT is controversial yet widespread in modern financial markets. It is a form of automated trading technology which critics among other things claim can lead to market manipulation. Drawing ...

  12. Entrepreneurship Models.

    Science.gov (United States)

    Finger Lakes Regional Education Center for Economic Development, Mount Morris, NY.

    This guide describes seven model programs that were developed by the Finger Lakes Regional Center for Economic Development (New York) to meet the training needs of female and minority entrepreneurs to help their businesses survive and grow and to assist disabled and dislocated workers and youth in beginning small businesses. The first three models…

  13. The Model

    DEFF Research Database (Denmark)

    About the reconstruction of Palle Nielsen's (b. 1942) work The Model from 1968: a gigantic playground for children in the museum, where they can freely romp about, climb in ropes, crawl on wooden structures, work with tools, jump in foam rubber, paint with finger paints and dress up in costumes....

  14. Model Checking

    Indian Academy of Sciences (India)

    Model Checking – Automated Verification of Computational Systems. Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  15. Successful modeling?

    Science.gov (United States)

    Lomnitz, Cinna

    Tichelaar and Ruff [1989] propose to “estimate model variance in complicated geophysical problems,” including the determination of focal depth in earthquakes, by means of unconventional statistical methods such as bootstrapping. They are successful insofar as they are able to duplicate the results from more conventional procedures.

  16. Molecular Modeling

    Indian Academy of Sciences (India)

    Molecular Modeling: A Powerful Tool for Drug Design and Molecular Docking. Rama Rao Nadendla. General Article, Resonance – Journal of Science Education, Volume 9, Issue 5, May 2004, pp. 51-60.

  17. Turbulence Model

    DEFF Research Database (Denmark)

    Nielsen, Mogens Peter; Shui, Wan; Johansson, Jens

    2011-01-01

    term with stresses depending linearly on the strain rates. This term takes into account the transfer of linear momentum from one part of the fluid to another. Besides there is another term, which takes into account the transfer of angular momentum. Thus the model implies a new definition of turbulence...

  18. Model-reduced inverse modeling

    NARCIS (Netherlands)

    Vermeulen, P.T.M.

    2006-01-01

    Although faster computers have been developed in recent years, they tend to be used to solve even more detailed problems. In many cases this will yield enormous models that can not be solved within acceptable time constraints. Therefore, there is a need for alternative methods that simulate such

  19. Building Models and Building Modelling

    DEFF Research Database (Denmark)

    Jørgensen, Kaj; Skauge, Jørn

    2008-01-01

    The introductory chapter of the report describes the primary concepts relating to building models and sets out some fundamental aspects of computer-based modelling. In addition, the difference between drawing programs and building modelling programs is described. Important aspects of comp...

  20. Molecular Modelling

    Directory of Open Access Journals (Sweden)

    Aarti Sharma

    2009-12-01

    The use of computational chemistry in the development of novel pharmaceuticals is becoming an increasingly important tool. In the past, drugs were simply screened for effectiveness. The recent advances in computing power and the exponential growth of the knowledge of protein structures have made it possible for organic compounds to be tailored to decrease harmful side effects and increase the potency. This article provides a detailed description of the techniques employed in molecular modelling. Molecular modelling is a rapidly developing discipline, and has been supported by the dramatic improvements in computer hardware and software in recent years.

  1. Acyclic models

    CERN Document Server

    Barr, Michael

    2002-01-01

    Acyclic models is a method heavily used to analyze and compare various homology and cohomology theories appearing in topology and algebra. This book is the first attempt to put together in a concise form this important technique and to include all the necessary background. It presents a brief introduction to category theory and homological algebra. The author then gives the background of the theory of differential modules and chain complexes over an abelian category to state the main acyclic models theorem, generalizing and systemizing the earlier material. This is then applied to various cohomology theories in algebra and topology. The volume could be used as a text for a course that combines homological algebra and algebraic topology. Required background includes a standard course in abstract algebra and some knowledge of topology. The volume contains many exercises. It is also suitable as a reference work for researchers.

  2. RNICE Model

    DEFF Research Database (Denmark)

    Pedersen, Mogens Jin; Stritch, Justin Michael

    2018-01-01

    Replication studies relate to the scientific principle of replicability and serve the significant purpose of providing supporting (or contradicting) evidence regarding the existence of a phenomenon. However, replication has never been an integral part of public administration and management...... research. Recently, scholars have issued calls for more replication, but academic reflections on when replication adds substantive value to public administration and management research are needed. This concise article presents a conceptual model, RNICE, for assessing when and how a replication study...... contributes knowledge about a social phenomenon and advances knowledge in the public administration and management literatures. The RNICE model provides a vehicle for researchers who seek to evaluate or demonstrate the value of a replication study systematically. We illustrate the practical application...

  3. Persistent Modelling

    DEFF Research Database (Denmark)

    2012-01-01

    on this subject, this book makes essential reading for anyone considering new ways of thinking about architecture. In drawing upon both historical and contemporary perspectives this book provides evidence of the ways in which relations between representation and the represented continue to be reconsidered......The relationship between representation and the represented is examined here through the notion of persistent modelling. This notion is not novel to the activity of architectural design if it is considered as describing a continued active and iterative engagement with design concerns – an evident...... characteristic of architectural practice. But the persistence in persistent modelling can also be understood to apply in other ways, reflecting and anticipating extended roles for representation. This book identifies three principle areas in which these extensions are becoming apparent within contemporary...

  5. Modeling Minds

    DEFF Research Database (Denmark)

    Michael, John

    others' minds. Then (2), in order to bring to light some possible justifications, as well as hazards and criticisms of the methodology of looking time tests, I will take a closer look at the concept of folk psychology and will focus on the idea that folk psychology involves using oneself as a model...... of other people in order to predict and understand their behavior. Finally (3), I will discuss the historical location and significance of the emergence of looking time tests...

  6. Hydroballistics Modeling

    Science.gov (United States)

    1975-01-01

    detailed rendered visible in his photographs by streams of photographs of spheres entering the water small bubbles from electrolysis. So far as is...of the cavity is opaque or, brined while the sphere was still in the oil. At if translucent, the contrast between the jet and about the time the...and brass, for example) should be so model velocity scale according to Equation 1.18, selected that electrolysis is not a problem. the addition of

  7. Biomimetic modelling.

    OpenAIRE

    Vincent, Julian F V

    2003-01-01

    Biomimetics is seen as a path from biology to engineering. The only path from engineering to biology in current use is the application of engineering concepts and models to biological systems. However, there is another pathway: the verification of biological mechanisms by manufacture, leading to an iterative process between biology and engineering in which the new understanding that the engineering implementation of a biological system can bring is fed back into biology, allowing a more compl...

  8. Modelling Behaviour

    DEFF Research Database (Denmark)

    This book reflects and expands on the current trend in the building industry to understand, simulate and ultimately design buildings by taking into consideration the interlinked elements and forces that act on them. This approach overcomes the traditional, exclusive focus on building tasks, while....... The chapter authors were invited speakers at the 5th Symposium "Modelling Behaviour", which took place at the CITA in Copenhagen in September 2015....

  9. Combustor Modelling

    Science.gov (United States)

    1980-02-01

    [Figure residue: experimental points compared with a gray-medium model prediction.] Cited (in Italian): "Possibilità di valutazione dello scambio termico in focolai di caldaie per riscaldamento" (on the possibility of evaluating heat transfer in heating-boiler furnaces), Atti e Rassegna Tecnica, Società Ingegneri e Architetti in Torino.

  10. Persistent Modelling

    DEFF Research Database (Denmark)

    practice: the duration of active influence that representation can hold in relation to the represented; the means, methods and media through which representations are constructed and used; and what it is that is being represented. Featuring contributions from some of the world’s most advanced thinkers....... It also provides critical insight into the use of contemporary modelling tools and methods, together with an examination of the implications their use has within the territories of architectural design, realisation and experience....

  11. Ozone modeling

    International Nuclear Information System (INIS)

    McIllvaine, C.M.

    1994-01-01

    Exhaust gases from power plants that burn fossil fuels contain concentrations of sulfur dioxide (SO₂), nitric oxide (NO), particulate matter, hydrocarbon compounds and trace metals. Estimated emissions from the operation of a hypothetical 500 MW coal-fired power plant are given. Ozone is considered a secondary pollutant, since it is not emitted directly into the atmosphere but is formed from other air pollutants, specifically nitrogen oxides (NOₓ) and non-methane organic compounds (NMOC), in the presence of sunlight. (NMOC are sometimes referred to as hydrocarbons, HC, or volatile organic compounds, VOC, and they may or may not include methane.) Additionally, ozone formation is a function of the ratio of NMOC concentrations to NOₓ concentrations. A typical ozone isopleth is shown, generated with the Empirical Kinetic Modeling Approach (EKMA) option of the Environmental Protection Agency's (EPA) Ozone Isopleth Plotting Mechanism (OZIPM-4) model. Ozone isopleth diagrams, originally generated with smog chamber data, are more commonly generated with photochemical reaction mechanisms and tested against smog chamber data. The shape of the isopleth curves is a function of the region (i.e., background conditions) where ozone concentrations are simulated. The location of an ozone concentration on the isopleth diagram is defined by the ratio of the NMOC and NOₓ coordinates of the point, known as the NMOC/NOₓ ratio. Results obtained by the described model are presented.

  12. Modeling biomembranes.

    Energy Technology Data Exchange (ETDEWEB)

    Plimpton, Steven James; Heffernan, Julieanne; Sasaki, Darryl Yoshio; Frischknecht, Amalie Lucile; Stevens, Mark Jackson; Frink, Laura J. Douglas

    2005-11-01

    Understanding the properties and behavior of biomembranes is fundamental to many biological processes and technologies. Microdomains in biomembranes or ''lipid rafts'' are now known to be an integral part of cell signaling, vesicle formation, fusion processes, protein trafficking, and viral and toxin infection processes. Understanding how microdomains form, how they depend on membrane constituents, and how they act not only has biological implications, but also will impact Sandia's effort in development of membranes that structurally adapt to their environment in a controlled manner. To provide such understanding, we created physically-based models of biomembranes. Molecular dynamics (MD) simulations and classical density functional theory (DFT) calculations using these models were applied to phenomena such as microdomain formation, membrane fusion, pattern formation, and protein insertion. Because lipid dynamics and self-organization in membranes occur on length and time scales beyond atomistic MD, we used coarse-grained models of double tail lipid molecules that spontaneously self-assemble into bilayers. DFT provided equilibrium information on membrane structure. Experimental work was performed to further help elucidate the fundamental membrane organization principles.

  13. Object Modeling and Building Information Modeling

    OpenAIRE

    Auråen, Hege; Gjemdal, Hanne

    2016-01-01

    The main part of this thesis is an online course (Small Private Online Course) entitled "Introduction to Object Modeling and Building Information Modeling". This supplementary report clarifies the choices made in the process of developing the course. The course examines the basic concepts of object modeling, modeling techniques and a modeling language (UML). Further, building information modeling (BIM) is presented as a modeling process, and the object modeling concepts in the BIM softw...

  14. DTN Modeling in OPNET Modeler

    Directory of Open Access Journals (Sweden)

    PAPAJ Jan

    2014-05-01

    Traditional wireless networks use the concept of point-to-point forwarding inherited from reliable wired networks, which is not ideal for the wireless environment. New emerging applications and networks operate mostly disconnected. So-called delay-tolerant networks (DTNs) are receiving increasing attention from both academia and industry. DTNs introduced a store-carry-and-forward concept to solve the problem of intermittent connectivity. The behavior of such networks is verified with real models, computer simulation, or a combination of both approaches. Computer simulation has become the primary and cost-effective tool for evaluating the performance of DTNs. OPNET Modeler is our target simulation tool, and we wanted to extend OPNET's simulation capabilities towards DTNs. We implemented the bundle protocol in OPNET Modeler, allowing simulation of cases based on the bundle concept, such as epidemic forwarding, which relies on flooding the network with messages, and a forwarding algorithm based on the history of past encounters (PRoPHET). The implementation details are provided in the article.
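
    To make the two forwarding strategies named above concrete, the sketch below implements the delivery-predictability bookkeeping that PRoPHET layers on top of epidemic-style copying. It is a minimal stand-alone illustration, not the OPNET Modeler bundle-protocol implementation described in the article; the constants are the commonly quoted PRoPHET defaults, and the class and method names are invented for this example.

```python
from collections import defaultdict

P_INIT, BETA, GAMMA = 0.75, 0.25, 0.98   # commonly quoted PRoPHET defaults

class ProphetNode:
    """Tracks delivery predictability P(self, other) for PRoPHET forwarding."""

    def __init__(self, name):
        self.name = name
        self.p = defaultdict(float)          # peer id -> delivery predictability

    def age(self, elapsed_units):
        """Decay all predictabilities while no encounters happen."""
        factor = GAMMA ** elapsed_units
        for peer in self.p:
            self.p[peer] *= factor

    def encounter(self, other):
        """Direct update on meeting another node, plus the transitive rule."""
        self.p[other.name] += (1.0 - self.p[other.name]) * P_INIT
        for dest, p_bc in other.p.items():
            if dest == self.name:
                continue
            transitive = self.p[other.name] * p_bc * BETA
            self.p[dest] = max(self.p[dest], transitive)

    def should_forward(self, bundle_dest, other):
        """Copy a bundle only if the peer is a better custodian for its destination."""
        return other.p.get(bundle_dest, 0.0) > self.p.get(bundle_dest, 0.0)

# tiny usage example: A meets B, after B has previously met C
a, b, c = ProphetNode("A"), ProphetNode("B"), ProphetNode("C")
b.encounter(c)
a.encounter(b)
print(a.p["B"], a.p["C"])        # direct and transitive predictabilities
print(a.should_forward("C", b))  # True: B is a better relay toward C
```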

  15. Model integration and a theory of models

    OpenAIRE

    Dolk, Daniel R.; Kottemann, Jeffrey E.

    1993-01-01

    Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: Organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...

  16. Model Checking Algorithms for Markov Reward Models

    NARCIS (Netherlands)

    Cloth, Lucia; Cloth, L.

    2006-01-01

    Model checking Markov reward models unites two different approaches of model-based system validation. On the one hand, Markov reward models have a long tradition in model-based performance and dependability evaluation. On the other hand, a formal method like model checking allows for the precise

  17. Modelling Defiguration

    DEFF Research Database (Denmark)

    Bork Petersen, Franziska

    2013-01-01

    focus centres on how the catwalk scenography evokes a ‘defiguration’ of the walking models and to what effect. Vibskov’s mobile catwalk draws attention to the walk, which is a key element of models’ performance but which usually functions in fashion shows merely to present clothes in the most...... catwalks. Vibskov’s catwalk induces what the dance scholar Gabriele Brandstetter has labelled a ‘defigurative choreography’: a straying from definitions, which exist in ballet as in other movement-based genres, of how a figure should move and appear (1998). The catwalk scenography in this instance...

  18. Students' Models of Curve Fitting: A Models and Modeling Perspective

    Science.gov (United States)

    Gupta, Shweta

    2010-01-01

    The Models and Modeling Perspectives (MMP) has evolved out of research that began 26 years ago. MMP researchers use Model Eliciting Activities (MEAs) to elicit students' mental models. In this study MMP was used as the conceptual framework to investigate the nature of students' models of curve fitting in a problem-solving environment consisting of…

  19. ALEPH model

    CERN Multimedia

    1989-01-01

    A wooden model of the ALEPH experiment and its cavern. ALEPH was one of 4 experiments at CERN's 27km Large Electron Positron collider (LEP) that ran from 1989 to 2000. During 11 years of research, LEP's experiments provided a detailed study of the electroweak interaction. Measurements performed at LEP also proved that there are three – and only three – generations of particles of matter. LEP was closed down on 2 November 2000 to make way for the construction of the Large Hadron Collider in the same tunnel. The cavern and detector are in separate locations - the cavern is stored at CERN and the detector is temporarily on display in Glasgow physics department. Both are available for loan.

  20. The civil-society crystal ball has spoken: the future group's lessons for coping with the future

    Index Scriptorium Estoniae

    2014-01-01

    At an inspiration seminar on future trends in civil society, organized by the Union of Nonprofit Organizations and Foundations (EMSL) and Praxis, the project of the civil-society future group was summed up. Advice from EMSL and Praxis for organizations that want to put future trends to work for their own organization.

  1. Status Report on the Development of Research Campaigns

    Energy Technology Data Exchange (ETDEWEB)

    Baer, Donald R.; Baker, Scott E.; Washton, Nancy M.; Linggi, Bryan E.

    2013-06-30

    Research campaigns were conceived as a means to focus EMSL research on specific scientific questions. Campaigns will help fulfill the Environmental Molecular Sciences Laboratory (EMSL) strategic vision to develop and integrate, for use by the scientific community, world-leading capabilities that transform understanding in the environmental molecular sciences and accelerate discoveries relevant to the Department of Energy's (DOE's) missions. Campaigns are multi-institutional, multidisciplinary projects with a scope beyond that of normal EMSL user projects. The goal of research campaigns is to have EMSL scientists and users team up on projects in an effort to accelerate progress and increase impact in specific scientific areas by focusing user research, EMSL resources, and expertise in those areas. This report gives a history of the campaigns and an update on their progress.

  2. The IMACLIM model; Le modele IMACLIM

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This document provides annexes to the IMACLIM model, offering an updated description of IMACLIM, a model for designing a tool to evaluate greenhouse gas reduction policies. The model is described in a version coupled with POLES, a technical and economic model of the energy industry. Notations, equations, sources, processing and specifications are proposed and detailed. (A.L.B.)

  3. Building Mental Models by Dissecting Physical Models

    Science.gov (United States)

    Srivastava, Anveshna

    2016-01-01

    When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions require greater supervision to…

  4. Atmospheric Models/Global Atmospheric Modeling

    Science.gov (United States)

    1998-09-30

    Timothy F. Hogan, Naval Research Laboratory, Monterey, CA 93943-5502. [Report documentation-page fields removed.] ...(initialization of increments, improved cloud prediction, and improved surface fluxes) have been transitioned to 6.4 (Global Atmospheric Models, PE 0603207N, X-0513).

  5. Models in architectural design

    OpenAIRE

    Pauwels, Pieter

    2017-01-01

    Whereas architects and construction specialists used to rely mainly on sketches and physical models as representations of their own cognitive design models, they rely now more and more on computer models. Parametric models, generative models, as-built models, building information models (BIM), and so forth, they are used daily by any practitioner in architectural design and construction. Although processes of abstraction and the actual architectural model-based reasoning itself of course rema...

  6. Rotating universe models

    International Nuclear Information System (INIS)

    Tozini, A.V.

    1984-01-01

    A review is made of some properties of rotating Universe models. Gödel's model is identified as a generalized tilted model. Some properties of new solutions of Einstein's equations, which are rotating non-stationary Universe models, are presented and analyzed. These models have Gödel's model as a particular case. Non-stationary cosmological models are found which generalize Gödel's metric in a way analogous to that in which Friedmann's models generalize Einstein's model. (L.C.) [pt]

  7. Wake Expansion Models

    DEFF Research Database (Denmark)

    Branlard, Emmanuel Simon Pierre

    2017-01-01

    Different models of wake expansion are presented in this chapter: the 1D momentum theory model, the cylinder analog model and Theodorsen's model. Far wake models such as the ones from Frandsen or Rathmann are only briefly mentioned. The different models are compared to each other. Results from...
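
    As a concrete reference point for the first of these models, the sketch below evaluates the standard 1D momentum-theory relation between the axial induction factor, the thrust coefficient, and the far-wake diameter; this is the textbook result, not necessarily the exact formulation or notation used in the chapter.

```python
import math

def far_wake_diameter_ratio(a):
    """1D momentum theory: D_wake / D_rotor as a function of axial induction a.

    Mass conservation between the rotor plane, where U = U0*(1 - a), and the
    far wake, where U = U0*(1 - 2a), gives A_w/A_r = (1 - a)/(1 - 2a).
    Only meaningful for 0 <= a < 0.5 (below the turbulent-wake state).
    """
    if not 0.0 <= a < 0.5:
        raise ValueError("momentum theory requires 0 <= a < 0.5")
    return math.sqrt((1.0 - a) / (1.0 - 2.0 * a))

def thrust_coefficient(a):
    """Classical momentum-theory thrust coefficient CT = 4a(1 - a)."""
    return 4.0 * a * (1.0 - a)

for a in (0.1, 0.2, 1.0 / 3.0):
    print(f"a = {a:.3f}  CT = {thrust_coefficient(a):.3f}  "
          f"D_wake/D_rotor = {far_wake_diameter_ratio(a):.3f}")
```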

  8. Model Manipulation for End-User Modelers

    DEFF Research Database (Denmark)

    Acretoaie, Vlad

    End-user modelers are domain experts who create and use models as part of their work. They are typically not Software Engineers, and have little or no programming and meta-modeling experience. However, using model manipulation languages developed in the context of Model-Driven Engineering often...... of these proposals. To achieve its first goal, the thesis presents the findings of a Systematic Mapping Study showing that human factors topics are scarcely and relatively poorly addressed in model transformation research. Motivated by these findings, the thesis explores the requirements of end-user modelers......, and transformations using their modeling notation and editor of choice. The VM* languages are implemented via a single execution engine, the VM* Runtime, built on top of the Henshin graph-based transformation engine. This approach combines the benefits of flexibility, maturity, and formality. To simplify model editor...

  9. Model-to-model interface for multiscale materials modeling

    Energy Technology Data Exchange (ETDEWEB)

    Antonelli, Perry Edward [Iowa State Univ., Ames, IA (United States)

    2017-12-17

    A low-level model-to-model interface is presented that will enable independent models to be linked into an integrated system of models. The interface is based on a standard set of functions that contain appropriate export and import schemas that enable models to be linked with no changes to the models themselves. These ideas are presented in the context of a specific multiscale material problem that couples atomistic-based molecular dynamics calculations to continuum calculations of fluid flow. These simulations will be used to examine the influence of interactions of the fluid with an adjacent solid on the fluid flow. The interface will also be examined by adding it to an already existing modeling code, Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS), and comparing it with our own molecular dynamics code.
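
    The abstract does not spell out the actual function set, so the sketch below is only a hypothetical rendering of the idea in Python: each model keeps its internals unchanged and exposes advance/export/import entry points, while a small driver exchanges named field arrays between an atomistic and a continuum model every step. All names and the "schema" representation are invented for illustration.

```python
from typing import Dict, List, Protocol

# A "schema" here is just a named field -> array-of-values mapping; the real
# interface would define this more rigorously (units, layouts, ghost regions).
State = Dict[str, List[float]]

class CoupledModel(Protocol):
    """Hypothetical low-level contract each model exposes, unchanged internally."""

    def advance(self, dt: float) -> None: ...            # run the model for one step
    def export_state(self) -> State: ...                 # publish fields it can provide
    def import_state(self, fields: State) -> None: ...   # accept fields from a peer

def couple(atomistic: CoupledModel, continuum: CoupledModel,
           dt: float, n_steps: int) -> None:
    """Naive staggered coupling loop: exchange boundary data every step."""
    for _ in range(n_steps):
        atomistic.advance(dt)
        continuum.import_state(atomistic.export_state())   # e.g. wall shear from MD
        continuum.advance(dt)
        atomistic.import_state(continuum.export_state())   # e.g. far-field velocity
```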

  10. Concept Modeling vs. Data modeling in Practice

    DEFF Research Database (Denmark)

    Madsen, Bodil Nistrup; Erdman Thomsen, Hanne

    2015-01-01

    This chapter shows the usefulness of terminological concept modeling as a first step in data modeling. First, we introduce terminological concept modeling with terminological ontologies, i.e. concept systems enriched with characteristics modeled as feature specifications. This enables a formal...... account of the inheritance of characteristics and allows us to introduce a number of principles and constraints which render concept modeling more coherent than earlier approaches. Second, we explain how terminological ontologies can be used as the basis for developing conceptual and logical data models...

  11. Cognitive models embedded in system simulation models

    International Nuclear Information System (INIS)

    Siegel, A.I.; Wolf, J.J.

    1982-01-01

    If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition; (2) What is a model. Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context

  12. Business Model Innovation

    OpenAIRE

    Dodgson, Mark; Gann, David; Phillips, Nelson; Massa, Lorenzo; Tucci, Christopher

    2014-01-01

    The chapter offers a broad review of the literature at the nexus between Business Models and innovation studies, and examines the notion of Business Model Innovation in three different situations: Business Model Design in newly formed organizations, Business Model Reconfiguration in incumbent firms, and Business Model Innovation in the broad context of sustainability. Tools and perspectives to make sense of Business Models and support managers and entrepreneurs in dealing with Business Model ...

  13. Air Quality Dispersion Modeling - Alternative Models

    Science.gov (United States)

    Models, not listed in Appendix W, that can be used in regulatory applications with case-by-case justification to the Reviewing Authority as noted in Section 3.2, Use of Alternative Models, in Appendix W.

  14. Wake modelling combining mesoscale and microscale models

    DEFF Research Database (Denmark)

    Badger, Jake; Volker, Patrick; Prospathospoulos, J.

    2013-01-01

    parameterizations are demonstrated in the Weather Research and Forecasting mesoscale model (WRF) in an idealized atmospheric flow. The model framework is the Horns Rev I wind farm experiencing a 7.97 m/s wind from 269.4°. Three of the four parameterizations use thrust output from the CRESflow-NS microscale model......In this paper the basis for introducing thrust information from microscale wake models into mesoscale model wake parameterizations will be described. A classification system for the different types of mesoscale wake parameterizations is suggested and outlined. Four different mesoscale wake....The characteristics of the mesoscale wake that developed from the four parameterizations are examined. In addition, the mesoscale model wakes are compared to measurement data from Horns Rev I. Overall it is seen as an advantage to incorporate microscale model data in mesoscale model wake parameterizations.

  15. A Model of Trusted Measurement Model

    OpenAIRE

    Ma Zhili; Wang Zhihao; Dai Liang; Zhu Xiaoqin

    2017-01-01

    A model of trusted measurement supporting behavior measurement based on the trusted connection architecture (TCA) with three entities and three levels is proposed, and a frame to illustrate the model is given. The model synthesizes three trusted measurement dimensions, namely trusted identity, trusted status and trusted behavior; satisfies the essential requirements of trusted measurement; and unifies the TCA with three entities and three levels.

  16. Molecular Models: Construction of Models with Magnets

    Directory of Open Access Journals (Sweden)

    Kalinovčić P.

    2015-07-01

    Molecular models are indispensable tools in teaching chemistry. Besides their high price, commercially available models are generally too small for classroom demonstration. This paper suggests how to make space-filling (calotte) models from Styrofoam, with magnetic balls as connectors and disc magnets for showing molecular polarity.

  17. Dextran hydrogels incorporated with bioactive glass-ceramic: Nanocomposite scaffolds for bone tissue engineering.

    Science.gov (United States)

    Nikpour, Parisa; Salimi-Kenari, Hamed; Fahimipour, Farahnaz; Rabiee, Sayed Mahmood; Imani, Mohammad; Dashtimoghadam, Erfan; Tayebi, Lobat

    2018-06-15

    A series of nanocomposite scaffolds composed of dextran (Dex) and sol-gel derived bioactive glass-ceramic nanoparticles (nBGC: 0-16 wt%) were fabricated as bioactive scaffolds for bone tissue engineering. Scanning electron microscopy showed that the Dex/nBGC scaffolds consisted of a porous 3D microstructure with an average pore size of 240 μm. Energy-dispersive X-ray spectroscopy showed that the nBGC nanoparticles were homogeneously distributed within the Dex matrix at low nBGC content (2 wt%), while agglomeration was observed at higher nBGC contents. It was found that the osmotic pressure and nBGC agglomeration at higher nBGC contents lead to increased water uptake and a reduction of the compressive modulus. Bioactivity of the Dex/nBGC scaffolds was validated through apatite formation after immersion in simulated body fluid. The Dex/nBGC composite scaffolds showed improved human osteoblast (HOB) proliferation and alkaline phosphatase (ALP) activity with increasing nBGC content up to 16 wt% over two weeks. Owing to their favorable physicochemical and bioactivity properties, the Dex/nBGC composite hydrogels can be offered as promising bioactive scaffolds for bone tissue engineering applications. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Forest M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bochev, Pavel B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cameron-Smith, Philip J.. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Easter, Richard C [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Elliott, Scott M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ghan, Steven J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Xiaohong [Univ. of Wyoming, Laramie, WY (United States); Lowrie, Robert B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lucas, Donald D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ma, Po-lun [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sacks, William J. [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Shrivastava, Manish [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Singh, Balwinder [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tautges, Timothy J. [Argonne National Lab. (ANL), Argonne, IL (United States); Taylor, Mark A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Vertenstein, Mariana [National Center for Atmospheric Research (NCAR), Boulder, CO (United States); Worley, Patrick H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-01-15

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  19. Target Scattering Metrics: Model-Model and Model Data comparisons

    Science.gov (United States)

    2017-12-13

    ...be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons. INTRODUCTION: Metrics for... stainless steel replica of an artillery shell (Table 7: targets used in the TIER simulations for the metrics study). Four potential metrics: ...Four metrics were investigated. The metric based on 2D cross-correlation is typically used in classification algorithms. Model-model comparisons...

  20. Modelling binary data

    CERN Document Server

    Collett, David

    2002-01-01

    INTRODUCTION: Some Examples; The Scope of this Book; Use of Statistical Software. STATISTICAL INFERENCE FOR BINARY DATA: The Binomial Distribution; Inference about the Success Probability; Comparison of Two Proportions; Comparison of Two or More Proportions. MODELS FOR BINARY AND BINOMIAL DATA: Statistical Modelling; Linear Models; Methods of Estimation; Fitting Linear Models to Binomial Data; Models for Binomial Response Data; The Linear Logistic Model; Fitting the Linear Logistic Model to Binomial Data; Goodness of Fit of a Linear Logistic Model; Comparing Linear Logistic Models; Linear Trend in Proportions; Comparing Stimulus-Response Relationships; Non-Convergence and Overfitting; Some Other Goodness of Fit Statistics; Strategy for Model Selection; Predicting a Binary Response Probability. BIOASSAY AND SOME OTHER APPLICATIONS: The Tolerance Distribution; Estimating an Effective Dose; Relative Potency; Natural Response; Non-Linear Logistic Regression Models; Applications of the Complementary Log-Log Model. MODEL CHECKING: Definition of Re...
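
    The book's central object is the linear logistic model for binomial response data, logit(p_i) = b0 + b1*x_i. As an illustration of the kind of fit it covers, the sketch below fits that model to invented dose-response counts with statsmodels; the data, variable names, and the choice of a log-dose covariate are assumptions made purely for this example, not material from the book.

```python
import numpy as np
import statsmodels.api as sm

# invented dose-response data: dose, number responding, group size
dose = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
responded = np.array([2, 6, 12, 20, 27, 29])
n = np.full_like(responded, 30)

# linear logistic model: logit(p_i) = b0 + b1 * log(dose_i)
X = sm.add_constant(np.log(dose))
y = np.column_stack([responded, n - responded])   # successes, failures
fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()

print(fit.params)                                 # b0, b1 on the logit scale
print(fit.deviance)                               # goodness of fit vs. the saturated model
ed50 = np.exp(-fit.params[0] / fit.params[1])     # dose giving p = 0.5
print(f"estimated ED50 ~ {ed50:.2f}")
```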

  1. Modelling of Hydraulic Robot

    DEFF Research Database (Denmark)

    Madsen, Henrik; Zhou, Jianjun; Hansen, Lars Henrik

    1997-01-01

    This paper describes a case study of identifying the physical model (or the grey box model) of a hydraulic test robot. The obtained model is intended to provide a basis for model-based control of the robot. The physical model is formulated in continuous time and is derived by application...

  2. Automated data model evaluation

    International Nuclear Information System (INIS)

    Kazi, Zoltan; Kazi, Ljubica; Radulovic, Biljana

    2012-01-01

    The modeling process is an essential phase within information systems development and implementation. This paper presents methods and techniques for the analysis and evaluation of data model correctness. Recent methodologies and development results regarding automation of the model correctness analysis process and its relations with ontology tools have been presented. Key words: Database modeling, Data model correctness, Evaluation

  3. Elastic Appearance Models

    DEFF Research Database (Denmark)

    Hansen, Mads Fogtmann; Fagertun, Jens; Larsen, Rasmus

    2011-01-01

    This paper presents a fusion of the active appearance model (AAM) and the Riemannian elasticity framework which yields a non-linear shape model and a linear texture model – the active elastic appearance model (EAM). The non-linear elasticity shape model is more flexible than the usual linear subs...

  4. Forest-fire models

    Science.gov (United States)

    Haiganoush Preisler; Alan Ager

    2013-01-01

    For applied mathematicians forest fire models refer mainly to a non-linear dynamic system often used to simulate spread of fire. For forest managers forest fire models may pertain to any of the three phases of fire management: prefire planning (fire risk models), fire suppression (fire behavior models), and postfire evaluation (fire effects and economic models). In...

  5. "Bohr's Atomic Model."

    Science.gov (United States)

    Willden, Jeff

    2001-01-01

    "Bohr's Atomic Model" is a small interactive multimedia program that introduces the viewer to a simplified model of the atom. This interactive simulation lets students build an atom using an atomic construction set. The underlying design methodology for "Bohr's Atomic Model" is model-centered instruction, which means the central model of the…

  6. From Numeric Models to Granular System Modeling

    Directory of Open Access Journals (Sweden)

    Witold Pedrycz

    2015-03-01

    To make this study self-contained, we briefly recall the key concepts of granular computing and demonstrate how this conceptual framework and its algorithmic fundamentals give rise to granular models. We discuss several representative formal setups used in describing and processing information granules including fuzzy sets, rough sets, and interval calculus. Key architectures of models dwell upon relationships among information granules. We demonstrate how information granularity and its optimization can be regarded as an important design asset to be exploited in system modeling and giving rise to granular models. With this regard, an important category of rule-based models along with their granular enrichments is studied in detail.
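
    Two of the granule formalisms listed above, intervals and fuzzy sets, are easy to make concrete. The sketch below is a toy illustration only (the numbers and the "about 20 degrees" granule are invented): it shows an information granule represented as an interval with simple interval arithmetic, and a triangular fuzzy membership function.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    """An information granule represented as a closed interval [lo, hi]."""
    lo: float
    hi: float

    def __add__(self, other):                 # interval arithmetic: [a,b] + [c,d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def contains(self, x):
        return self.lo <= x <= self.hi

def triangular_membership(x, a, b, c):
    """Fuzzy granule: triangular membership with support [a, c] and core at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# toy usage: a granular "about 20 +/- 2 degrees" reading combined with an offset granule
reading = Interval(18.0, 22.0) + Interval(-0.5, 0.5)
print(reading, reading.contains(21.7))
print(triangular_membership(21.0, 18.0, 20.0, 22.0))   # membership of 21 in "about 20"
```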

  7. Geologic Framework Model Analysis Model Report

    International Nuclear Information System (INIS)

    Clayton, R.

    2000-01-01

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M and O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure l), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and

  8. Geologic Framework Model Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure l), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  9. Mathematical Modeling Using MATLAB

    National Research Council Canada - National Science Library

    Phillips, Donovan

    1998-01-01

    .... Mathematical Modeling Using MATLAB acts as a companion resource to A First Course in Mathematical Modeling, with the goal of guiding the reader to a fuller understanding of the modeling process...

  10. Energy modelling software

    CSIR Research Space (South Africa)

    Osburn, L

    2010-01-01

    The construction industry has turned to energy modelling in order to assist it in reducing the amount of energy consumed by buildings. However, while the energy loads of buildings can be accurately modelled, energy models often under...

  11. Multivariate GARCH models

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Teräsvirta, Timo

    This article contains a review of multivariate GARCH models. Most common GARCH models are presented and their properties considered. This also includes nonparametric and semiparametric models. Existing specification and misspecification tests are discussed. Finally, there is an empirical example...

  12. N-Gram models

    NARCIS (Netherlands)

    Hiemstra, Djoerd; Liu, Ling; Tamer Özsu, M.

    2017-01-01

    In language modeling, n-gram models are probabilistic models of text that use some limited amount of history, or word dependencies, where n refers to the number of words that participate in the dependence relation.
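
    To make the definition concrete, the sketch below builds a maximum-likelihood bigram model (n = 2) over a toy corpus; smoothing, sentence boundaries, and real training data are deliberately left out, and the tiny corpus is invented for the example.

```python
from collections import Counter

corpus = "the cat sat on the mat the cat ate the rat".split()

# count bigrams and the unigram histories they condition on
bigrams = Counter(zip(corpus, corpus[1:]))
history = Counter(corpus[:-1])

def p_bigram(word, prev):
    """Maximum-likelihood estimate P(word | prev) = c(prev, word) / c(prev)."""
    return bigrams[(prev, word)] / history[prev] if history[prev] else 0.0

print(p_bigram("cat", "the"))   # 0.5: two of the four occurrences of "the" precede "cat"
print(p_bigram("mat", "the"))   # 0.25
```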

  13. Business Model Canvas

    OpenAIRE

    Souza, D', Austin

    2013-01-01

    Presentation given on 13 May 2013 at the meeting "Business Model Canvas Challenge Assen". The Business Model Canvas was designed by Alex Osterwalder. The model is very clearly organized and consists of nine building blocks.

  14. Wildfire Risk Main Model

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The model combines three modeled fire behavior parameters (rate of spread, flame length, crown fire potential) and one modeled ecological health measure (fire regime...

  15. Lapse Rate Modeling

    DEFF Research Database (Denmark)

    De Giovanni, Domenico

    prepayment models for mortgage backed securities, this paper builds a Rational Expectation (RE) model describing the policyholders' behavior in lapsing the contract. A market model with stochastic interest rates is considered, and the pricing is carried out through numerical approximation...

  16. Lapse rate modeling

    DEFF Research Database (Denmark)

    De Giovanni, Domenico

    2010-01-01

    prepayment models for mortgage backed securities, this paper builds a Rational Expectation (RE) model describing the policyholders' behavior in lapsing the contract. A market model with stochastic interest rates is considered, and the pricing is carried out through numerical approximation...

  17. Quintessence Model Building

    OpenAIRE

    Brax, P.; Martin, J.; Riazuelo, A.

    2001-01-01

    A short review of some of the aspects of quintessence model building is presented. We emphasize the role of tracking models and their possible supersymmetric origin.

  18. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  19. Overuse Injury Assessment Model

    National Research Council Canada - National Science Library

    Stuhmiller, James H; Shen, Weixin; Sih, Bryant

    2005-01-01

    .... Previously, we developed a preliminary model that predicted the stress fracture rate and used biomechanical modeling, nonlinear optimization for muscle force, and bone structural analysis to estimate...

  20. Multilevel modeling using R

    CERN Document Server

    Finch, W Holmes; Kelley, Ken

    2014-01-01

    A powerful tool for analyzing nested designs in a variety of fields, multilevel/hierarchical modeling allows researchers to account for data collected at multiple levels. Multilevel Modeling Using R provides you with a helpful guide to conducting multilevel data modeling using the R software environment.After reviewing standard linear models, the authors present the basics of multilevel models and explain how to fit these models using R. They then show how to employ multilevel modeling with longitudinal data and demonstrate the valuable graphical options in R. The book also describes models fo

  1. Cosmological models without singularities

    International Nuclear Information System (INIS)

    Petry, W.

    1981-01-01

    A previously studied theory of gravitation in flat space-time is applied to homogeneous and isotropic cosmological models. There exist two different classes of models without singularities: (i) ever-expanding models, (ii) oscillating models. The first class contains models with a hot big bang. For these models there exist at the beginning of the universe, in contrast to Einstein's theory, very high but finite densities of matter and radiation, with a big bang of very short duration. After a short time these models pass into the homogeneous and isotropic models of Einstein's theory with spatial curvature equal to zero and cosmological constant α ≥ 0. (author)

  2. Validated dynamic flow model

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2011-01-01

    model structure suggested by the University of Lund, the WP4 leader. This particular model structure has the advantage that it fits better into the control design framework used by WP3-4 compared to the model structures previously developed in WP2. The different model structures are first summarised....... Then issues dealing with optimal experimental design are considered. Finally the parameters are estimated in the chosen static and dynamic models and a validation is performed. Two of the static models, one of them the additive model, explain the data well. In the case of dynamic models the suggested additive...

  3. Environmental Modeling Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Environmental Modeling Center provides the computational tools to perform geostatistical analysis, to model ground water and atmospheric releases for comparison...

  4. TRACKING CLIMATE MODELS

    Data.gov (United States)

    National Aeronautics and Space Administration — CLAIRE MONTELEONI*, GAVIN SCHMIDT, AND SHAILESH SAROHA* Climate models are complex mathematical models designed by meteorologists, geophysicists, and climate...

  5. Regularized Structural Equation Modeling

    Science.gov (United States)

    Jacobucci, Ross; Grimm, Kevin J.; McArdle, John J.

    2016-01-01

    A new method is proposed that extends the use of regularization in both lasso and ridge regression to structural equation models. The method is termed regularized structural equation modeling (RegSEM). RegSEM penalizes specific parameters in structural equation models, with the goal of creating simpler, easier-to-understand models. Although regularization has gained wide adoption in regression, very little has transferred to models with latent variables. By adding penalties to specific parameters in a structural equation model, researchers gain a high level of flexibility in reducing model complexity, overcoming poorly fitting models, and creating models that are more likely to generalize to new samples. The proposed method was evaluated through a simulation study, two illustrative examples involving a measurement model, and one empirical example involving the structural part of the model to demonstrate RegSEM's utility. PMID:27398019
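
    As a hedged aside, the core idea of penalizing parameters can be seen on a much simpler objective than a structural equation model; the sketch below adds a lasso-type L1 term to an ordinary least-squares fit and is only a conceptual illustration, not RegSEM itself.

        # Conceptual illustration of an L1 (lasso-type) penalty shrinking parameters toward zero.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        X = rng.normal(size=(100, 5))
        beta_true = np.array([1.5, 0.0, 0.0, -2.0, 0.0])
        y = X @ beta_true + rng.normal(scale=0.5, size=100)

        lam = 5.0                                  # penalty weight (arbitrary)
        def penalized_loss(beta):
            return np.sum((y - X @ beta) ** 2) + lam * np.sum(np.abs(beta))

        fit = minimize(penalized_loss, np.zeros(5), method="Powell")
        print(fit.x.round(2))                      # near-zero true coefficients are shrunk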

  6. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  7. Better models are more effectively connected models

    Science.gov (United States)

    Nunes, João Pedro; Bielders, Charles; Darboux, Frederic; Fiener, Peter; Finger, David; Turnbull-Lloyd, Laura; Wainwright, John

    2016-04-01

    The concept of hydrologic and geomorphologic connectivity describes the processes and pathways which link sources (e.g. rainfall, snow and ice melt, springs, eroded areas and barren lands) to accumulation areas (e.g. foot slopes, streams, aquifers, reservoirs), and the spatial variations thereof. There are many examples of hydrological and sediment connectivity on a watershed scale; in consequence, a process-based understanding of connectivity is crucial to help managers understand their systems and adopt adequate measures for flood prevention, pollution mitigation and soil protection, among others. Modelling is often used as a tool to understand and predict fluxes within a catchment by complementing observations with model results. Catchment models should therefore be able to reproduce the linkages, and thus the connectivity of water and sediment fluxes within the systems under simulation. In modelling, a high level of spatial and temporal detail is desirable to ensure taking into account a maximum number of components, which then enables connectivity to emerge from the simulated structures and functions. However, computational constraints and, in many cases, lack of data prevent the representation of all relevant processes and spatial/temporal variability in most models. In most cases, therefore, the level of detail selected for modelling is too coarse to represent the system in a way in which connectivity can emerge; a problem which can be circumvented by representing fine-scale structures and processes within coarser scale models using a variety of approaches. This poster focuses on the results of ongoing discussions on modelling connectivity held during several workshops within COST Action Connecteur. It assesses the current state of the art of incorporating the concept of connectivity in hydrological and sediment models, as well as the attitudes of modellers towards this issue. The discussion will focus on the different approaches through which connectivity

  8. Modelling bankruptcy prediction models in Slovak companies

    Directory of Open Access Journals (Sweden)

    Kovacova Maria

    2017-01-01

    Full Text Available Intensive research by academics and practitioners has addressed models for bankruptcy prediction and credit risk management. In spite of numerous studies forecasting bankruptcy using traditional statistical techniques (e.g. discriminant analysis and logistic regression) and early artificial intelligence models (e.g. artificial neural networks), there is a trend towards machine learning models (support vector machines, bagging, boosting, and random forest) to predict bankruptcy one year prior to the event. Comparing the performance of this unconventional approach with results obtained by discriminant analysis, logistic regression, and neural networks, it has been found that bagging, boosting, and random forest models outperform the other techniques, and that prediction accuracy in the testing sample improves when additional variables are included. On the other hand, the prediction accuracy of older, well-known bankruptcy prediction models is quite high. Therefore, we aim to analyse these older models on a dataset of Slovak companies to validate their prediction ability under specific conditions. Furthermore, these models will be modelled according to new trends by calculating the influence of eliminating selected variables on their overall prediction ability.
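
    A minimal sketch of the kind of comparison described above, assuming a feature matrix of financial ratios and one-year-ahead bankruptcy labels (random placeholders here); the estimators and scoring come from scikit-learn.

        # Compare a classical model with an ensemble model by cross-validated accuracy.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        X = np.random.rand(500, 10)                # placeholder financial ratios
        y = np.random.randint(0, 2, 500)           # placeholder bankruptcy labels
        models = [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("random forest", RandomForestClassifier(n_estimators=200))]
        for name, clf in models:
            acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
            print(name, round(acc, 3))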

  9. Generalized latent variable modeling multilevel, longitudinal, and structural equation models

    CERN Document Server

    Skrondal, Anders; Rabe-Hesketh, Sophia

    2004-01-01

    This book unifies and extends latent variable models, including multilevel or generalized linear mixed models, longitudinal or panel models, item response or factor models, latent class or finite mixture models, and structural equation models.

  10. Biosphere Model Report

    International Nuclear Information System (INIS)

    M. A. Wasiolek

    2003-01-01

    The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7)

  11. Biosphere Model Report

    Energy Technology Data Exchange (ETDEWEB)

    D. W. Wu

    2003-07-16

    The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).

  12. Biosphere Model Report

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-10-27

    The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).

  13. Lumped Thermal Household Model

    DEFF Research Database (Denmark)

    Biegel, Benjamin; Andersen, Palle; Stoustrup, Jakob

    2013-01-01

    a lumped model approach as an alternative to the individual models. In the lumped model, the portfolio is seen as baseline consumption superimposed with an ideal storage of limited power and energy capacity. The benefit of such a lumped model is that the computational effort of flexibility optimization...
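
    A minimal sketch of the "baseline plus ideal storage" idea described above, with arbitrary numbers; the storage respects a power limit and an energy limit, and the portfolio consumption is the baseline plus the storage power.

        # Lumped-portfolio sketch: baseline consumption plus an ideal, limited storage.
        import numpy as np

        p_max, e_max, dt = 50.0, 200.0, 1.0        # power limit [kW], energy limit [kWh], step [h]
        baseline = 300 + 50 * np.sin(np.linspace(0, 2 * np.pi, 24))   # placeholder baseline [kW]
        request = 30 * np.sin(np.linspace(0, 4 * np.pi, 24))          # requested flexibility [kW]
        energy, consumption = 0.0, np.zeros(24)
        for t in range(24):
            p = np.clip(request[t], -p_max, p_max)                      # power limit
            p = np.clip(p, (0.0 - energy) / dt, (e_max - energy) / dt)  # energy limits
            energy += p * dt                                            # storage state
            consumption[t] = baseline[t] + p
        print(consumption.round(1))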

  14. The Moody Mask Model

    DEFF Research Database (Denmark)

    Larsen, Bjarke Alexander; Andkjær, Kasper Ingdahl; Schoenau-Fog, Henrik

    2015-01-01

    This paper proposes a new relation model, called "The Moody Mask model", for Interactive Digital Storytelling (IDS), based on Francesco Osborne's "Mask Model" from 2011. This, mixed with some elements from Chris Crawford's Personality Models, is a system designed for dynamic interaction between ch...

  15. The Model Confidence Set

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Lunde, Asger; Nason, James M.

    The paper introduces the model confidence set (MCS) and applies it to the selection of models. An MCS is a set of models that is constructed such that it will contain the best model with a given level of confidence. The MCS is in this sense analogous to a confidence interval for a parameter. The MCS...

  16. AIDS Epidemiological models

    Science.gov (United States)

    Rahmani, Fouad Lazhar

    2010-11-01

    The aim of this paper is to present mathematical modelling of the spread of infection in the context of the transmission of the human immunodeficiency virus (HIV) and the acquired immune deficiency syndrome (AIDS). These models are based in part on the models suggested in the field of AIDS mathematical modelling, as reported by Isham [6].
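
    For orientation only, a minimal susceptible-infected transmission model of the general kind such HIV/AIDS models build on; the parameters are arbitrary and this is not the model of the paper.

        # Minimal SI-type transmission sketch solved with scipy's odeint (illustrative only).
        import numpy as np
        from scipy.integrate import odeint

        def si_model(y, t, beta, mu):
            S, I = y
            N = S + I
            dS = -beta * S * I / N                 # new infections
            dI = beta * S * I / N - mu * I         # infections minus removals
            return [dS, dI]

        t = np.linspace(0, 50, 501)                # years
        sol = odeint(si_model, [9900.0, 100.0], t, args=(0.3, 0.1))
        print(sol[-1])                             # susceptible and infected after 50 years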

  17. Numerical Modelling of Streams

    DEFF Research Database (Denmark)

    Vestergaard, Kristian

    In recent years there has been a sharp increase in the use of numerical water quality models. Numerical water quality modelling can be divided into three steps: hydrodynamic modelling for the determination of stream flow and water levels; modelling of transport and dispersion of a conservative...
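
    The transport and dispersion step can be illustrated with a simple explicit finite-difference solution of the 1-D advection-dispersion equation for a conservative tracer; this is a generic sketch with made-up parameters, not the thesis' own scheme.

        # Explicit upwind/central scheme for dc/dt + u*dc/dx = D*d2c/dx2 (conservative tracer).
        import numpy as np

        nx, dx, dt = 200, 10.0, 1.0                # cells, cell size [m], time step [s]
        u, D = 0.5, 2.0                            # velocity [m/s], dispersion [m2/s]
        c = np.zeros(nx)
        c[:5] = 100.0                              # tracer slug at the upstream end
        for _ in range(1000):
            adv = -u * (c[1:-1] - c[:-2]) / dx                    # upwind advection
            disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2     # central dispersion
            c[1:-1] += dt * (adv + disp)
        print(round(c.max(), 2), c.argmax() * dx)  # peak concentration and its location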

  18. A Model for Conversation

    DEFF Research Database (Denmark)

    Ayres, Phil

    2012-01-01

    This essay discusses models. It examines what models are, the roles models perform and suggests various intentions that underlie their construction and use. It discusses how models act as a conversational partner, and how they support various forms of conversation within the conversational activity...

  19. Generic Market Models

    NARCIS (Netherlands)

    R. Pietersz (Raoul); M. van Regenmortel

    2005-01-01

    textabstractCurrently, there are two market models for valuation and risk management of interest rate derivatives, the LIBOR and swap market models. In this paper, we introduce arbitrage-free constant maturity swap (CMS) market models and generic market models featuring forward rates that span

  20. Modeling the Accidental Deaths

    Directory of Open Access Journals (Sweden)

    Mariyam Hafeez

    2008-01-01

    Full Text Available The model for accidental deaths in the city of Lahore has been developed by using a class of Generalized Linear Models. Various link functions have been used in developing the model. The diagnostic checks have been carried out to see the validity of the fitted model.
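
    A hedged sketch of a count model of this general kind: a Poisson generalized linear model with its default log link, fitted with statsmodels; the data file and column names are hypothetical.

        # Poisson GLM for monthly accident-death counts (log link is the Poisson default).
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        df = pd.read_csv("accidental_deaths.csv")  # hypothetical monthly counts
        model = smf.glm("deaths ~ month + year", data=df, family=sm.families.Poisson())
        result = model.fit()
        print(result.summary())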

  1. Cultural Resource Predictive Modeling

    Science.gov (United States)

    2017-10-01

    refining formal, inductive predictive models is the quality of the archaeological and environmental data. To build models efficiently, relevant...geomorphology, and historic information. Lessons Learned: The original model was focused on the identification of prehistoric resources. This...system but uses predictive modeling informally. For example, there is no probability for buried archaeological deposits on the Burton Mesa, but there is

  2. Modelling Railway Interlocking Systems

    DEFF Research Database (Denmark)

    Lindegaard, Morten Peter; Viuf, P.; Haxthausen, Anne Elisabeth

    2000-01-01

    In this report we present a model of interlocking systems, and describe how the model may be validated by simulation. Station topologies are modelled by graphs in which the nodes denote track segments, and the edges denote connectivity for train traffic. Points and signals are modelled by annotatio...

  3. Lumped-parameter models

    Energy Technology Data Exchange (ETDEWEB)

    Ibsen, Lars Bo; Liingaard, M.

    2006-12-15

    A lumped-parameter model represents the frequency-dependent soil-structure interaction of a massless foundation placed on or embedded into an unbounded soil domain. In this technical report the steps of establishing a lumped-parameter model are presented. The following sections are included in this report: Static and dynamic formulation, Simple lumped-parameter models and Advanced lumped-parameter models. (au)

  4. Comparing Active Vision Models

    NARCIS (Netherlands)

    Croon, G.C.H.E. de; Sprinkhuizen-Kuyper, I.G.; Postma, E.O.

    2009-01-01

    Active vision models can simplify visual tasks, provided that they can select sensible actions given incoming sensory inputs. Many active vision models have been proposed, but a comparative evaluation of these models is lacking. We present a comparison of active vision models from two different

  5. Comparing active vision models

    NARCIS (Netherlands)

    Croon, G.C.H.E. de; Sprinkhuizen-Kuyper, I.G.; Postma, E.O.

    2009-01-01

    Active vision models can simplify visual tasks, provided that they can select sensible actions given incoming sensory inputs. Many active vision models have been proposed, but a comparative evaluation of these models is lacking. We present a comparison of active vision models from two different

  6. White Paper on Modelling

    NARCIS (Netherlands)

    Van Bloemendaal, Karen; Dijkema, Gerard P.J.; Woerdman, Edwin; Jong, Mattheus

    2015-01-01

    This White Paper provides an overview of the modelling approaches adopted by the project partners in the EDGaR project 'Understanding Gas Sector Intra- and Inter- Market interactions' (UGSIIMI). The paper addresses three types of models: complementarity modelling, agent-based modelling and property

  7. Environmental assessment for the resiting, construction, and operation of the Environmental and Molecular Sciences Laboratory at the Hanford Site, Richland, Washington

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    This environmental assessment (EA) presents estimated environmental impacts from the resiting, construction, and operation of the US Department of Energy's (DOE's) Environmental and Molecular Sciences Laboratory (EMSL), which is proposed to be constructed and operated on land near the south boundary of the Hanford Site near Richland, Washington. The EMSL, if constructed, would be a modern research facility in which experimental, theoretical, and computational techniques can be focused on environmental restoration problems, such as the chemical and transport behavior of complex mixtures of contaminants in the environment. The EMSL design includes approximately 18,500 square meters (200,000 square feet) of floor space on a 12-hectare (30-acre) site. The proposed new site is located within the city limits of Richland in north Richland, at the south end of DOE's 300 Area, on land to be deeded to the US by the Battelle Memorial Institute. Approximately 200 persons are expected to be employed in the EMSL and approximately 60 visiting scientists may be working in the EMSL at any given time. State-of-the-art equipment is expected to be installed and used in the EMSL. Small amounts of hazardous substances (chemicals and radionuclides) are expected to be used in experimental work in the EMSL.

  8. A Model for Conversation

    DEFF Research Database (Denmark)

    Ayres, Phil

    2012-01-01

    This essay discusses models. It examines what models are, the roles models perform and suggests various intentions that underlie their construction and use. It discusses how models act as a conversational partner, and how they support various forms of conversation within the conversational activity...... of design. Three distinctions are drawn through which to develop this discussion of models in an architectural context. An examination of these distinctions serves to nuance particular characteristics and roles of models, the modelling activity itself and those engaged in it....

  9. Wastewater treatment models

    DEFF Research Database (Denmark)

    Gernaey, Krist; Sin, Gürkan

    2011-01-01

    The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise...

  10. Wastewater Treatment Models

    DEFF Research Database (Denmark)

    Gernaey, Krist; Sin, Gürkan

    2008-01-01

    The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... the practice of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise...

  11. Applied stochastic modelling

    CERN Document Server

    Morgan, Byron JT; Tanner, Martin Abba; Carlin, Bradley P

    2008-01-01

    Introduction and Examples Introduction Examples of data sets Basic Model Fitting Introduction Maximum-likelihood estimation for a geometric model Maximum-likelihood for the beta-geometric model Modelling polyspermy Which model? What is a model for? Mechanistic models Function Optimisation Introduction MATLAB: graphs and finite differences Deterministic search methods Stochastic search methods Accuracy and a hybrid approach Basic Likelihood ToolsIntroduction Estimating standard errors and correlations Looking at surfaces: profile log-likelihoods Confidence regions from profiles Hypothesis testing in model selectionScore and Wald tests Classical goodness of fit Model selection biasGeneral Principles Introduction Parameterisation Parameter redundancy Boundary estimates Regression and influence The EM algorithm Alternative methods of model fitting Non-regular problemsSimulation Techniques Introduction Simulating random variables Integral estimation Verification Monte Carlo inference Estimating sampling distributi...

  12. The Hospitable Meal Model

    DEFF Research Database (Denmark)

    Justesen, Lise; Overgaard, Svend Skafte

    2017-01-01

    -ended approach towards meal experiences. The underlying purpose of The Hospitable Meal Model is to provide the basis for creating value for the individuals involved in institutional meal services. The Hospitable Meal Model was developed on the basis of an empirical study on hospital meal experiences explored......This article presents an analytical model that aims to conceptualize how meal experiences are framed when taking into account a dynamic understanding of hospitality: the meal model is named The Hospitable Meal Model. The idea behind The Hospitable Meal Model is to present a conceptual model...... that can serve as a frame for developing hospitable meal competencies among professionals working within the area of institutional foodservices as well as a conceptual model for analysing meal experiences. The Hospitable Meal Model transcends and transforms existing meal models by presenting a more open...

  13. Calibrated Properties Model

    Energy Technology Data Exchange (ETDEWEB)

    C. Ahlers; H. Liu

    2000-03-12

    The purpose of this Analysis/Model Report (AMR) is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Yucca Mountain Site Characterization Project (YMP). This work was performed in accordance with the "AMR Development Plan for U0035 Calibrated Properties Model REV00". These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, drift-scale and mountain-scale coupled-processes models, and Total System Performance Assessment (TSPA) models as well as Performance Assessment (PA) and other participating national laboratories and government agencies. These process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and predict flow and transport behavior under a variety of climatic and thermal-loading conditions.

  14. Calibrated Properties Model

    Energy Technology Data Exchange (ETDEWEB)

    C.F. Ahlers, H.H. Liu

    2001-12-18

    The purpose of this Analysis/Model Report (AMR) is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Yucca Mountain Site Characterization Project (YMP). This work was performed in accordance with the AMR Development Plan for U0035 Calibrated Properties Model REV00 (CRWMS M&O 1999c). These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, drift-scale and mountain-scale coupled-processes models, and Total System Performance Assessment (TSPA) models as well as Performance Assessment (PA) and other participating national laboratories and government agencies. These process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and predict flow and transport behavior under a variety of climatic and thermal-loading conditions.

  15. The Hospitable Meal Model

    DEFF Research Database (Denmark)

    Justesen, Lise; Overgaard, Svend Skafte

    2017-01-01

    This article presents an analytical model that aims to conceptualize how meal experiences are framed when taking into account a dynamic understanding of hospitality: the meal model is named The Hospitable Meal Model. The idea behind The Hospitable Meal Model is to present a conceptual model...... that can serve as a frame for developing hospitable meal competencies among professionals working within the area of institutional foodservices as well as a conceptual model for analysing meal experiences. The Hospitable Meal Model transcends and transforms existing meal models by presenting a more open......-ended approach towards meal experiences. The underlying purpose of The Hospitable Meal Model is to provide the basis for creating value for the individuals involved in institutional meal services. The Hospitable Meal Model was developed on the basis of an empirical study on hospital meal experiences explored...

  16. MulensModel: Microlensing light curves modeling

    Science.gov (United States)

    Poleski, Radoslaw; Yee, Jennifer

    2018-03-01

    MulensModel calculates light curves of microlensing events. Both single and binary lens events are modeled and various higher-order effects can be included: extended source (with limb-darkening), annual microlensing parallax, and satellite microlensing parallax. The code is object-oriented and written in Python3, and requires AstroPy (ascl:1304.002).
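
    Rather than guessing the package's exact interface, the sketch below computes the basic quantity such codes produce: the point-source point-lens (Paczynski) magnification curve, with illustrative parameters.

        # Point-source point-lens magnification A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4)).
        import numpy as np

        def pspl_magnification(t, t0, u0, tE):
            u = np.sqrt(u0**2 + ((t - t0) / tE)**2)        # lens-source separation
            return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

        t = np.linspace(-50, 50, 1001)                     # days relative to the peak
        A = pspl_magnification(t, t0=0.0, u0=0.1, tE=20.0)
        print(round(A.max(), 2))                           # peak magnification, roughly 1/u0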

  17. Business Models and Business Model Innovation

    DEFF Research Database (Denmark)

    Foss, Nicolai J.; Saebi, Tina

    2018-01-01

    While research on business models and business model innovation continues to exhibit growth, the field is still, even after more than two decades of research, characterized by a striking lack of cumulative theorizing and an opportunistic borrowing of more or less related ideas from neighbouring

  18. EMC Simulation and Modeling

    Science.gov (United States)

    Takahashi, Takehiro; Schibuya, Noboru

    EMC simulation is now widely used in the design stage of electronic equipment to reduce electromagnetic noise. As the electromagnetic behavior calculated by an EMC simulator depends on the EMC model supplied as input, the modeling technique is important for obtaining useful results. In this paper, a simple outline of the EMC simulator and the EMC model is given. Some modeling techniques for EMC simulation are also described, with an example EMC model of a shield box with an aperture.

  19. Phenomenology of inflationary models

    Science.gov (United States)

    Olyaei, Abbas

    2018-01-01

    There are many inflationary models compatible with observational data. One can investigate inflationary models by looking at their general features, which are common to most of the models. Here we have investigated some single-field models, without considering their origin, in order to explore their phenomenology. We have shown how to adjust the simple harmonic oscillator model so that it is in good agreement with observational data.
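
    For orientation, the standard slow-roll expressions for the quadratic ("simple harmonic oscillator") potential are reproduced below in LaTeX; these are textbook relations, not necessarily the adjusted model discussed in the paper.

        % Slow-roll quantities for V(phi) = (1/2) m^2 phi^2, with N e-folds before the end of inflation
        \begin{align}
          \epsilon &= \frac{M_{\rm P}^2}{2}\left(\frac{V'}{V}\right)^2 = \frac{2M_{\rm P}^2}{\phi^2}, &
          \eta &= M_{\rm P}^2\,\frac{V''}{V} = \frac{2M_{\rm P}^2}{\phi^2}, \\
          n_s &\simeq 1 - 6\epsilon + 2\eta \simeq 1 - \frac{2}{N}, &
          r &\simeq 16\epsilon \simeq \frac{8}{N}.
        \end{align}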

  20. Multilevel statistical models

    CERN Document Server

    Goldstein, Harvey

    2011-01-01

    This book provides a clear introduction to this important area of statistics. The author provides wide coverage of different kinds of multilevel models, and of how to interpret different statistical methodologies and algorithms applied to such models. This 4th edition reflects the growth and interest in this area and is updated to include new chapters on multilevel models with mixed response types, smoothing and multilevel data, models with correlated random effects and modeling with variance.

  1. Latent classification models

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre

    2005-01-01

    parametric family of distributions. In this paper we propose a new set of models for classification in continuous domains, termed latent classification models. The latent classification model can roughly be seen as combining the Naive Bayes (NB) model with a mixture of factor analyzers, thereby relaxing the assumptions...... classification model, and we demonstrate empirically that the accuracy of the proposed model is significantly higher than the accuracy of other probabilistic classifiers....

  2. Geochemistry Model Validation Report: External Accumulation Model

    Energy Technology Data Exchange (ETDEWEB)

    K. Zarrabi

    2001-09-27

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations; where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC, is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water; thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC, are processed to produce mass of accumulation

  3. Geochemistry Model Validation Report: External Accumulation Model

    International Nuclear Information System (INIS)

    Zarrabi, K.

    2001-01-01

    The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations; where the source term is defined as the composition versus time of the water flowing out of a breached waste package (WP). Next, PHREEQC, is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water; thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC, are processed to produce mass of accumulation

  4. Crop rotation modelling - A European model intercomparison

    DEFF Research Database (Denmark)

    Kollas, Chris; Kersebaum, Kurt C; Nendel, Claas

    2015-01-01

    crop growth simulation models to predict yields in crop rotations at five sites across Europe under minimal calibration. Crop rotations encompassed 301 seasons of ten crop types common to European agriculture and a diverse set of treatments (irrigation, fertilisation, CO2 concentration, soil types...... accurately than main crops (cereals). The majority of models performed better for the treatments of increased CO2 and nitrogen fertilisation than for irrigation and soil-related treatments. The yield simulation of the multi-model ensemble reduced the error compared to single-model simulations. The low degree...... representation of crop rotations, further research is required to synthesise existing knowledge of the physiology of intermediate crops and of carry-over effects from the preceding to the following crop, and to implement/improve the modelling of processes that condition these effects....

  5. Modelling of an homogeneous equilibrium mixture model

    International Nuclear Information System (INIS)

    Bernard-Champmartin, A.; Poujade, O.; Mathiaud, J.; Mathiaud, J.; Ghidaglia, J.M.

    2014-01-01

    We present here a model for two-phase flows which is simpler than the 6-equation models (with two densities, two velocities, two temperatures) but more accurate than the standard 4-equation mixture models (with two densities, one velocity and one temperature). We are interested in the case when the two phases have been interacting long enough for the drag force to be small but still not negligible. The so-called Homogeneous Equilibrium Mixture (HEM) model that we present deals with both mixture and relative quantities, allowing in particular both a mixture velocity and a relative velocity to be followed. This relative velocity is not tracked by a conservation law but by a closure law (drift relation), whose expression is related to the drag force terms of the two-phase flow. After the derivation of the model, a stability analysis and numerical experiments are presented. (authors)
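
    For reference, the generic mixture variables that homogeneous-mixture two-phase models are built on are listed below; these are standard definitions, not the specific drift closure of the paper.

        % Generic mixture variables for a two-phase flow with volume fractions alpha_1 + alpha_2 = 1
        \begin{align}
          \rho    &= \alpha_1\rho_1 + \alpha_2\rho_2               && \text{(mixture density)} \\
          \rho\,u &= \alpha_1\rho_1 u_1 + \alpha_2\rho_2 u_2       && \text{(mixture momentum, defining the mixture velocity } u\text{)} \\
          u_r     &= u_1 - u_2                                     && \text{(relative velocity, closed by a drag-based drift relation)}
        \end{align}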

  6. Model Reduction in Groundwater Modeling and Management

    Science.gov (United States)

    Siade, A. J.; Kendall, D. R.; Putti, M.; Yeh, W. W.

    2008-12-01

    Groundwater management requires the development and implementation of mathematical models that, through simulation, evaluate the effects of anthropogenic impacts on an aquifer system. To obtain high levels of accuracy, one must incorporate high levels of complexity, resulting in computationally demanding models. This study provides a methodology for solving groundwater management problems with reduced computational effort by replacing the large, complex numerical model with a significantly smaller, simpler approximation. This is achieved via Proper Orthogonal Decomposition (POD), where the goal is to project the larger model solution space onto a smaller or reduced subspace in which the management problem will be solved, achieving reductions in computation time of up to three orders of magnitude. Once the solution is obtained in the reduced space with acceptable accuracy, it is then projected back to the full model space. A major challenge when using this method is the definition of the reduced solution subspace. In POD, this subspace is defined based on samples or snapshots taken at specific times from the solution of the full model. In this work we determine when snapshots should be taken on the basis of the exponential behavior of the governing partial differential equation. This selection strategy is then generalized for any groundwater model by obtaining and using the optimal snapshot selection for a simplified, dimensionless model. Protocols are developed to allow the snapshot selection results of the simplified, dimensionless model to be transferred to that of a complex, heterogeneous model with any geometry. The proposed methodology is finally applied to a basin in the Oristano Plain located in the Sardinia Island, Italy.
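
    A minimal sketch of the snapshot-based POD step described above: collect full-model snapshots as columns, take a singular value decomposition, and keep the leading modes as the reduced basis (all sizes and data are placeholders).

        # Snapshot-based Proper Orthogonal Decomposition via the SVD.
        import numpy as np

        n_nodes, n_snapshots, r = 5000, 60, 10
        S = np.random.rand(n_nodes, n_snapshots)       # placeholder snapshot matrix
        U, sigma, Vt = np.linalg.svd(S, full_matrices=False)
        Phi = U[:, :r]                                 # reduced basis (leading POD modes)
        energy = sigma[:r].sum() / sigma.sum()         # fraction of singular-value "energy" kept
        h_full = np.random.rand(n_nodes)               # a full-model state (placeholder)
        h_reduced = Phi.T @ h_full                     # project onto the reduced subspace
        h_back = Phi @ h_reduced                       # lift back to the full space
        print(round(energy, 3), round(np.linalg.norm(h_full - h_back), 2))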

  7. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  8. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  9. Modeling for Battery Prognostics

    Science.gov (United States)

    Kulkarni, Chetan S.; Goebel, Kai; Khasin, Michael; Hogge, Edward; Quach, Patrick

    2017-01-01

    For any battery-powered vehicle (be it an unmanned aerial vehicle, a small passenger aircraft, or an asset in exoplanetary operations) to operate at maximum efficiency and reliability, it is critical to monitor battery health as well as performance and to predict end of discharge (EOD) and end of useful life (EOL). To fulfil these needs, it is important to capture the battery's inherent characteristics as well as operational knowledge in the form of models that can be used by monitoring, diagnostic, and prognostic algorithms. Several battery modeling methodologies have been developed in the last few years as the understanding of the underlying electrochemical mechanics has advanced. The models can generally be classified as empirical models, electrochemical engineering models, multi-physics models, and molecular/atomistic models. Empirical models are based on fitting certain functions to past experimental data, without making use of any physicochemical principles; electrical-circuit-equivalent models are an example of such empirical models. Electrochemical engineering models are typically continuum models that include electrochemical kinetics and transport phenomena. Each model type has its advantages and disadvantages. The former has the advantage of being computationally efficient, but has limited accuracy and robustness due to the approximations used in the model and, as a result of those approximations, cannot represent aging well. The latter has the advantage of being very accurate, but is often computationally inefficient, having to solve complex sets of partial differential equations, and is thus not well suited for online prognostic applications. In addition, both multi-physics and atomistic models are computationally expensive and hence even less suited to online application. An electrochemistry-based model of Li-ion batteries has been developed that captures crucial electrochemical processes, captures the effects of aging, and is computationally efficient
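
    A minimal sketch of the simplest class mentioned above (an equivalent-circuit, or Thevenin, discharge model with one RC pair); all parameters are arbitrary and this is not the electrochemistry-based model the abstract describes.

        # Thevenin equivalent-circuit discharge with coulomb counting and an EOD threshold.
        dt, capacity = 1.0, 2.2 * 3600          # time step [s], capacity [A*s] (2.2 Ah)
        R0, R1, C1 = 0.05, 0.02, 2000.0         # ohmic resistance and RC-branch parameters
        current = 2.0                           # constant discharge current [A]
        soc, v_rc = 1.0, 0.0                    # state of charge, RC-branch voltage

        def ocv(s):                             # crude linear open-circuit-voltage curve
            return 3.0 + 1.2 * s

        for k in range(5000):
            soc -= current * dt / capacity                    # coulomb counting
            v_rc += dt * (current / C1 - v_rc / (R1 * C1))    # RC-branch dynamics
            v_term = ocv(soc) - current * R0 - v_rc           # terminal voltage
            if v_term < 3.0:                                  # end-of-discharge threshold
                print("predicted end of discharge after", k * dt, "seconds")
                break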

  10. Dimension of linear models

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar

    1996-01-01

    Determination of the proper dimension of a given linear model is one of the most important tasks in applied modeling work. We consider here eight criteria that can be used to determine the dimension of the model, or equivalently, the number of components to use in the model. Four of these criteria...... the basic problems in determining the dimension of linear models. Then each of the eight measures is treated. The results are illustrated by examples....

  11. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models....... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety...... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners....

  12. Modeling volatility using state space models.

    Science.gov (United States)

    Timmer, J; Weigend, A S

    1997-08-01

    In time series problems, noise can be divided into two categories: dynamic noise which drives the process, and observational noise which is added in the measurement process, but does not influence future values of the system. In this framework, we show that empirical volatilities (the squared relative returns of prices) exhibit a significant amount of observational noise. To model and predict their time evolution adequately, we estimate state space models that explicitly include observational noise. We obtain relaxation times for shocks in the logarithm of volatility ranging from three weeks (for foreign exchange) to three to five months (for stock indices). In most cases, a two-dimensional hidden state is required to yield residuals that are consistent with white noise. We compare these results with ordinary autoregressive models (without a hidden state) and find that autoregressive models underestimate the relaxation times by about two orders of magnitude since they do not distinguish between observational and dynamic noise. This new interpretation of the dynamics of volatility in terms of relaxators in a state space model carries over to stochastic volatility models and to GARCH models, and is useful for several problems in finance, including risk management and the pricing of derivative securities. Data sets used: Olsen & Associates high frequency DEM/USD foreign exchange rates (8 years). Nikkei 225 index (40 years). Dow Jones Industrial Average (25 years).
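
    A minimal sketch of the modelling idea (a hidden state driven by dynamic noise, observed through additional observational noise) using a scalar Kalman filter; the AR(1) coefficient and variances are made-up numbers, not the paper's estimates.

        # Scalar linear-Gaussian state-space model: x_t = phi*x_{t-1} + w_t, y_t = x_t + v_t.
        import numpy as np

        rng = np.random.default_rng(0)
        T, phi, q, r = 500, 0.98, 0.02, 0.5        # length, AR(1) coefficient, dynamic and obs. variances
        x, y = np.zeros(T), np.zeros(T)
        for t in range(1, T):                      # simulate hidden log-volatility and noisy observation
            x[t] = phi * x[t - 1] + rng.normal(0, np.sqrt(q))
            y[t] = x[t] + rng.normal(0, np.sqrt(r))

        m, P, est = 0.0, 1.0, np.zeros(T)          # filter mean, variance, stored estimates
        for t in range(T):
            m, P = phi * m, phi**2 * P + q         # predict
            K = P / (P + r)                        # Kalman gain
            m, P = m + K * (y[t] - m), (1 - K) * P # update
            est[t] = m
        print(round(np.corrcoef(x, est)[0, 1], 3)) # how well the filter tracks the hidden state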

  13. Empirical Model Building Data, Models, and Reality

    CERN Document Server

    Thompson, James R

    2011-01-01

    Praise for the First Edition "This...novel and highly stimulating book, which emphasizes solving real problems...should be widely read. It will have a positive and lasting effect on the teaching of modeling and statistics in general." - Short Book Reviews This new edition features developments and real-world examples that showcase essential empirical modeling techniques Successful empirical model building is founded on the relationship between data and approximate representations of the real systems that generated that data. As a result, it is essential for researchers who construct these m

  14. Zephyr - the prediction models

    DEFF Research Database (Denmark)

    Nielsen, Torben Skov; Madsen, Henrik; Nielsen, Henrik Aalborg

    2001-01-01

    utilities as partners and users. The new models are evaluated for five wind farms in Denmark as well as one wind farm in Spain. It is shown that the predictions based on conditional parametric models are superior to the predictions obtained by state-of-the-art parametric models.......This paper briefly describes new models and methods for predicting the wind power output from wind farms. The system is being developed in a project which has the research organization Risø and the department of Informatics and Mathematical Modelling (IMM) as the modelling team and all the Danish

  15. Models of light nuclei

    International Nuclear Information System (INIS)

    Harvey, M.; Khanna, F.C.

    1975-01-01

    The general problem of what constitutes a physical model and what is known about the free nucleon-nucleon interaction are considered. A time independent formulation of the basic equations is chosen. Construction of the average field in which particles move in a general independent particle model is developed, concentrating on problems of defining the average spherical single particle field for any given nucleus, and methods for construction of effective residual interactions and other physical operators. Deformed shell models and both spherical and deformed harmonic oscillator models are discussed in detail, and connections between spherical and deformed shell models are analyzed. A section on cluster models is included. 11 tables, 21 figures

  16. Building Thermal Models

    Science.gov (United States)

    Peabody, Hume L.

    2017-01-01

    This presentation is meant to be an overview of the model building process. It is based on typical techniques (Monte Carlo Ray Tracing for radiation exchange, Lumped Parameter, Finite Difference for thermal solution) used by the aerospace industry. This is not intended to be a "How to Use ThermalDesktop" course. It is intended to be a "How to Build Thermal Models" course and the techniques will be demonstrated using the capabilities of ThermalDesktop (TD). Other codes may or may not have similar capabilities. The General Model Building Process can be broken into four top level steps: 1. Build Model; 2. Check Model; 3. Execute Model; 4. Verify Results.

  17. Microsoft tabular modeling cookbook

    CERN Document Server

    Braak, Paul te

    2013-01-01

    This book follows a cookbook style with recipes explaining the steps for developing analytic data using Business Intelligence Semantic Models. This book is designed for developers who wish to develop powerful and dynamic models for users as well as those who are responsible for the administration of models in corporate environments. It is also targeted at analysts and users of Excel who wish to advance their knowledge of Excel through the development of tabular models or who wish to analyze data through tabular modeling techniques. We assume no prior knowledge of tabular modeling

  18. Five models of capitalism

    Directory of Open Access Journals (Sweden)

    Luiz Carlos Bresser-Pereira

    2012-03-01

    Full Text Available Besides analyzing capitalist societies historically and thinking of them in terms of phases or stages, we may compare different models or varieties of capitalism. In this paper I survey the literature on this subject, and distinguish the classifications that take a production or business approach from those that use a mainly political criterion. I identify five forms of capitalism: among the rich countries, the liberal democratic or Anglo-Saxon model, the social or European model, and the endogenous social integration or Japanese model; among developing countries, I distinguish the Asian developmental model from the liberal-dependent model that characterizes most other developing countries, including Brazil.

  19. Holographic twin Higgs model.

    Science.gov (United States)

    Geller, Michael; Telem, Ofri

    2015-05-15

    We present the first realization of a "twin Higgs" model as a holographic composite Higgs model. Uniquely among composite Higgs models, the Higgs potential is protected by a new standard model (SM) singlet elementary "mirror" sector at the sigma model scale f and not by the composite states at m_{KK}, naturally allowing for m_{KK} beyond the LHC reach. As a result, naturalness in our model cannot be constrained by the LHC, but may be probed by precision Higgs measurements at future lepton colliders, and by direct searches for Kaluza-Klein excitations at a 100 TeV collider.

  20. Biosphere Model Report

    Energy Technology Data Exchange (ETDEWEB)

    D.W. Wu; A.J. Smith

    2004-11-08

    The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), TSPA-LA. The ERMYN provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs) (Section 6.2), the reference biosphere (Section 6.1.1), the human receptor (Section 6.1.2), and approximations (Sections 6.3.1.4 and 6.3.2.4); (3) Building a mathematical model using the biosphere conceptual model (Section 6.3) and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); (8) Validating the ERMYN by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).

  2. Modelling of Innovation Diffusion

    Directory of Open Access Journals (Sweden)

    Arkadiusz Kijek

    2010-01-01

    Since the publication of the Bass model in 1969, research on modelling the diffusion of innovation has resulted in a vast body of scientific literature consisting of articles, books, and studies of real-world applications of this model. The main objective of a diffusion model is to describe the pattern of spread of an innovation among potential adopters as a mathematical function of time. This paper assesses the state of the art in mathematical models of innovation diffusion and procedures for estimating their parameters. Moreover, the theoretical issues related to the models presented are supplemented with empirical research. The purpose of the research is to explore the extent to which the diffusion of broadband Internet in 29 OECD countries can be adequately described by three diffusion models, i.e. the Bass model, the logistic model and a dynamic model. The results are ambiguous and do not indicate which model best describes the diffusion pattern of broadband Internet users; in most cases, however, the dynamic model proves inappropriate for describing the diffusion pattern. Issues related to the further development of innovation diffusion models are discussed and some recommendations are given. (original abstract)
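
    To make the Bass model referred to above concrete, the sketch below evaluates the standard closed-form Bass adoption curve for an assumed innovation coefficient p, imitation coefficient q, and market potential M; the parameter values are illustrative and are not the estimates obtained in the paper.

      # Closed-form Bass diffusion curve; p, q and the market potential M are
      # illustrative values, not the parameters estimated in the paper.
      import numpy as np

      def bass_cumulative(t, p, q, M):
          """Cumulative adopters at time t for innovation coefficient p,
          imitation coefficient q and market potential M."""
          e = np.exp(-(p + q) * t)
          return M * (1.0 - e) / (1.0 + (q / p) * e)

      t = np.arange(0, 15)                                  # years since launch
      F = bass_cumulative(t, p=0.03, q=0.4, M=100.0)        # e.g. subscribers per 100 inhabitants
      new_adopters = np.diff(F, prepend=0.0)                # adoptions per year

      for year, cum, new in zip(t, F, new_adopters):
          print(f"year {year:2d}: cumulative {cum:6.2f}, new {new:5.2f}")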

  3. Lie Markov models.

    Science.gov (United States)

    Sumner, J G; Fernández-Sánchez, J; Jarvis, P D

    2012-04-07

    Recent work has discussed the importance of multiplicative closure for the Markov models used in phylogenetics. For continuous-time Markov chains, a sufficient condition for multiplicative closure of a model class is ensured by demanding that the set of rate-matrices belonging to the model class form a Lie algebra. It is the case that some well-known Markov models do form Lie algebras and we refer to such models as "Lie Markov models". However it is also the case that some other well-known Markov models unequivocally do not form Lie algebras (GTR being the most conspicuous example). In this paper, we will discuss how to generate Lie Markov models by demanding that the models have certain symmetries under nucleotide permutations. We show that the Lie Markov models include, and hence provide a unifying concept for, "group-based" and "equivariant" models. For each of two and four character states, the full list of Lie Markov models with maximal symmetry is presented and shown to include interesting examples that are neither group-based nor equivariant. We also argue that our scheme is pleasing in the context of applied phylogenetics, as, for a given symmetry of nucleotide substitution, it provides a natural hierarchy of models with increasing number of parameters. We also note that our methods are applicable to any application of continuous-time Markov chains beyond the initial motivations we take from phylogenetics. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
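
    The closure condition discussed above can be illustrated numerically. The sketch below takes the three generator matrices of the Kimura 3ST model (a group-based, and hence Lie Markov, model) and checks that their pairwise commutators vanish, so the set of K3ST rate matrices is trivially closed under the Lie bracket; this is only an illustrative check, not the paper's construction.

      # Generators of the Kimura 3ST model (states ordered A, G, C, T), written as
      # permutation-minus-identity matrices; a general K3ST rate matrix is a
      # non-negative combination of them. Their pairwise Lie brackets vanish.
      import numpy as np
      from itertools import combinations

      K_alpha = np.array([[-1, 1, 0, 0], [1, -1, 0, 0], [0, 0, -1, 1], [0, 0, 1, -1]], float)
      K_beta  = np.array([[-1, 0, 1, 0], [0, -1, 0, 1], [1, 0, -1, 0], [0, 1, 0, -1]], float)
      K_gamma = np.array([[-1, 0, 0, 1], [0, -1, 1, 0], [0, 1, -1, 0], [1, 0, 0, -1]], float)

      def bracket(A, B):
          """Lie bracket (commutator) of two rate matrices."""
          return A @ B - B @ A

      generators = [("K_alpha", K_alpha), ("K_beta", K_beta), ("K_gamma", K_gamma)]
      for (name_a, A), (name_b, B) in combinations(generators, 2):
          print(f"[{name_a}, {name_b}] = 0 ?", np.allclose(bracket(A, B), 0.0))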

  4. Integrated Medical Model – Chest Injury Model

    Data.gov (United States)

    National Aeronautics and Space Administration — The Exploration Medical Capability (ExMC) Element of NASA's Human Research Program (HRP) developed the Integrated Medical Model (IMM) to forecast the resources...

  5. Traffic & safety statewide model and GIS modeling.

    Science.gov (United States)

    2012-07-01

    Several steps have been taken over the past two years to advance the Utah Department of Transportation (UDOT) safety initiative. Previous research projects began the development of a hierarchical Bayesian model to analyze crashes on Utah roadways. De...

  6. Nonlinear Modeling by Assembling Piecewise Linear Models

    Science.gov (United States)

    Yao, Weigang; Liou, Meng-Sing

    2013-01-01

    To preserve the nonlinearity of a full order system over a parameter range of interest, we propose a simple modeling approach by assembling a set of piecewise local solutions, including the first-order Taylor series terms expanded about some sampling states. The work by Rewienski and White inspired our use of piecewise linear local solutions. The assembly of these local approximations is accomplished by assigning nonlinear weights, through radial basis functions in this study. The efficacy of the proposed procedure is validated for a two-dimensional airfoil moving at different Mach numbers and pitching motions, under which the flow exhibits prominent nonlinear behaviors. All results confirm that our nonlinear model is accurate and stable for predicting not only aerodynamic forces but also detailed flowfields. Moreover, the model remains accurate and robust for inputs considerably different from the base trajectory in form and magnitude. This modeling preserves the nonlinearity of the problems considered in a rather simple and accurate manner.
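
    The sketch below is a hedged, one-dimensional illustration of the assembly idea described above: local first-order (Taylor) models anchored at a few sampling states are blended with normalised Gaussian radial basis function weights. The target function, number of centres, and kernel width are assumptions chosen for illustration, not the aerodynamic case in the paper.

      # One-dimensional toy version of the assembly idea: local first-order models
      # blended with normalised Gaussian RBF weights. Function and width are assumed.
      import numpy as np

      f, df = np.sin, np.cos                          # "full order" response and its derivative
      centers = np.linspace(0.0, 2 * np.pi, 8)        # sampling states
      width = 0.6                                     # RBF kernel width (assumed)

      def blended_model(x):
          x = np.atleast_1d(x)[:, None]
          local = f(centers) + df(centers) * (x - centers)     # local Taylor models
          w = np.exp(-((x - centers) / width) ** 2)            # Gaussian RBF weights
          w /= w.sum(axis=1, keepdims=True)                    # normalise the blend
          return (w * local).sum(axis=1)

      x_test = np.linspace(0.0, 2 * np.pi, 5)
      print("true:   ", np.round(f(x_test), 3))
      print("blended:", np.round(blended_model(x_test), 3))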

  7. Solid Waste Projection Model: Model user's guide

    International Nuclear Information System (INIS)

    Stiles, D.L.; Crow, V.L.

    1990-08-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address solid waste management issues at the Hanford Central Waste Complex (HCWC). This document, one of six documents supporting the SWPM system, contains a description of the system and instructions for preparing to use SWPM and operating Version 1 of the model. 4 figs., 1 tab

  8. Modeling of System Families

    National Research Council Canada - National Science Library

    Feiler, Peter

    2007-01-01

    .... The Society of Automotive Engineers (SAE) Architecture Analysis & Design Language (AADL) is an industry-standard, architecture-modeling notation specifically designed to support a component- based approach to modeling embedded systems...

  9. Models in Action

    DEFF Research Database (Denmark)

    Juhl, Joakim

    This thesis is about mathematical modelling and technology development. While mathematical modelling has become widely deployed within a broad range of scientific practices, it has also gained a central position within technology development. The intersection of mathematical modelling......-efficiency project, this thesis presents an analysis of the central practices that materialised representative physical modelling and implemented operational regulation models. In order to show how the project’s representative modelling and technology development connected physical theory with concrete problems...... theoretical outset, the existing literature on simulation models, and the study’s methodological and empirical approach. The purpose of this thesis is to describe the central practices that developed regulation technology for industrial production processes and to analyse how mathematical modelling...

  10. CCF model comparison

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    2004-04-01

    The report describes a simple comparison of two CCF models, the ECLM and the Beta-model. The objective of the comparison is to identify differences in the results of the models by applying them to some simple test data cases. The comparison focuses mainly on theoretical aspects of the above-mentioned CCF models. The properties of the model parameter estimates in the data cases are also discussed. The practical aspects of using and estimating CCF models in a real PSA context (e.g. the data interpretation, properties of computer tools, the model documentation) are not discussed in the report. Similarly, the qualitative CCF analyses needed in using the models are not discussed in the report. (au)
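
    As a hedged illustration of the simpler of the two models named above, the sketch below applies the textbook beta-factor parameterisation to a small redundant group and compares the all-components-fail probability with the value obtained if common cause failures were ignored. The failure rate, beta value, mission time, and group size are illustrative test numbers, not data from the report, and the ECLM is not reproduced here.

      # Textbook beta-factor CCF calculation for an n-component redundant group;
      # all numbers are illustrative test data, not values from the report.
      lam_total = 1.0e-3          # total failure rate of one component, per hour
      beta = 0.1                  # fraction of failures that are common cause
      mission_time = 100.0        # hours
      n = 3                       # redundant components

      q_total = lam_total * mission_time        # per-component failure probability
      q_ccf = beta * q_total                    # common cause part
      q_indep = (1.0 - beta) * q_total          # independent part

      p_all_fail_beta = q_ccf + q_indep ** n    # beta-factor approximation
      p_all_fail_no_ccf = q_total ** n          # if CCF were ignored

      print(f"all-{n}-fail, beta-factor model: {p_all_fail_beta:.3e}")
      print(f"all-{n}-fail, independence only: {p_all_fail_no_ccf:.3e}")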

  11. Modeling EERE deployment programs

    Energy Technology Data Exchange (ETDEWEB)

    Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2007-11-01

    The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge for future research.

  12. Controlling Modelling Artifacts

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    the possible configurations of the system (for example, by counting the number of components in a certain state). We motivate our methodology with a case study of the LMAC protocol for wireless sensor networks. In particular, we investigate the accuracy of a recently proposed high-level model of LMAC......When analysing the performance of a complex system, we typically build abstract models that are small enough to analyse, but still capture the relevant details of the system. But it is difficult to know whether the model accurately describes the real system, or if its behaviour is due to modelling...... artifacts that were inadvertently introduced. In this paper, we propose a novel methodology to reason about modelling artifacts, given a detailed model and a highlevel (more abstract) model of the same system. By a series of automated abstraction steps, we lift the detailed model to the same state space...

  13. Modeling Fluid Structure Interaction

    National Research Council Canada - National Science Library

    Benaroya, Haym

    2000-01-01

    The principal goal of this program is on integrating experiments with analytical modeling to develop physics-based reduced-order analytical models of nonlinear fluid-structure interactions in articulated naval platforms...

  14. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of MDA is to produce software systems from abstract models in a way that restricts human interaction to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language, so verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. This verification is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  15. Protein solubility modeling

    Science.gov (United States)

    Agena, S. M.; Pusey, M. L.; Bogle, I. D.

    1999-01-01

    A thermodynamic framework (UNIQUAC model with temperature dependent parameters) is applied to model the salt-induced protein crystallization equilibrium, i.e., protein solubility. The framework introduces a term for the solubility product describing protein transfer between the liquid and solid phase and a term for the solution behavior describing deviation from ideal solution. Protein solubility is modeled as a function of salt concentration and temperature for a four-component system consisting of a protein, pseudo solvent (water and buffer), cation, and anion (salt). Two different systems, lysozyme with sodium chloride and concanavalin A with ammonium sulfate, are investigated. Comparison of the modeled and experimental protein solubility data results in an average root mean square deviation of 5.8%, demonstrating that the model closely follows the experimental behavior. Model calculations and model parameters are reviewed to examine the model and protein crystallization process. Copyright 1999 John Wiley & Sons, Inc.

  16. Model and code development

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    Progress in model and code development for reactor physics calculations is summarized. The codes included CINDER-10, PHROG, RAFFLE GAPP, DCFMR, RELAP/4, PARET, and KENO. Kinetics models for the PBF were developed

  17. LAT Background Models

    Data.gov (United States)

    National Aeronautics and Space Administration — The Galactic model is a spatial and spectral template. The model for the Galactic diffuse emission was developed using spectral line surveys of HI and CO (as a...

  18. World Magnetic Model 2015

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The World Magnetic Model is the standard model used by the U.S. Department of Defense, the U.K. Ministry of Defence, the North Atlantic Treaty Organization (NATO)...

  19. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model: the ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example, the Event Summary Data (ESD) stores calorimeter cells and tracking system hits, thereby permitting many calibration and alignment tasks, but will be accessible only at particular computing sites with potentially large latency. In contrast, the Analysis...

  20. World Magnetic Model 2010

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The World Magnetic Model is the standard model used by the U.S. Department of Defense, the U.K. Ministry of Defence, the North Atlantic Treaty Organization (NATO)...

  1. ASC Champ Orbit Model

    DEFF Research Database (Denmark)

    Riis, Troels; Jørgensen, John Leif

    1999-01-01

    This document describes a test of the implementation of the ASC orbit model for the Champ satellite.

  2. Laboratory of Biological Modeling

    Data.gov (United States)

    Federal Laboratory Consortium — The Laboratory of Biological Modeling is defined by both its methodologies and its areas of application. We use mathematical modeling in many forms and apply it to a...

  3. Bounding species distribution models

    Directory of Open Access Journals (Sweden)

    Thomas J. STOHLGREN, Catherine S. JARNEVICH, Wayne E. ESAIAS, Jeffrey T. MORISETTE

    2011-10-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].

  4. Bounding Species Distribution Models

    Science.gov (United States)

    Stohlgren, Thomas J.; Jarnevich, Cahterine S.; Morisette, Jeffrey T.; Esaias, Wayne E.

    2011-01-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].
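
    The "clamping" step described in the two records above amounts to restricting new predictor values to the range seen during model development before a fitted model is asked to extrapolate. The sketch below shows that step in isolation; the predictor values and bounds are invented for illustration and no CART or Maxent model is fitted.

      # Clamp new predictor values to the environmental bounds of the training data
      # before extrapolating a fitted habitat model. Values are invented; no CART or
      # Maxent model is fitted here.
      import numpy as np

      # training predictors, e.g. [temperature (deg C), annual precipitation (mm)]
      X_train = np.array([[12.0, 300.0], [18.0, 550.0], [22.0, 800.0], [25.0, 650.0]])
      lo, hi = X_train.min(axis=0), X_train.max(axis=0)      # environmental bounds

      def clamp_to_training_bounds(X_new):
          """Bound extrapolations to the min/max of the primary predictors."""
          return np.clip(X_new, lo, hi)

      X_new = np.array([[30.0, 100.0],     # hotter and drier than anything in training
                        [20.0, 600.0]])    # inside the training envelope
      print(clamp_to_training_bounds(X_new))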

  5. Modeling Complex Time Limits

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2013-01-01

    In this paper we analyze the complexity of the time limits found especially in regulated processes of public administration. First we review the most popular process modeling languages. We then define an example scenario, based on current Czech legislation, and capture it in the discussed process modeling languages. The analysis shows that contemporary process modeling languages support the capture of time limits only partially, which causes trouble for analysts and unnecessary complexity in the models. Given these unsatisfying results, we analyze the complexity of time limits in greater detail and outline the lifecycles of a time limit using the multiple dynamic generalizations pattern. As an alternative to the popular process modeling languages we present the PSD process modeling language, which supports the defined time-limit lifecycles natively and therefore keeps the models simple and easy to understand.

  6. Emissions Modeling Clearinghouse

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Emissions Modeling Clearinghouse (EMCH) supports and promotes emissions modeling activities both internal and external to the EPA. Through this site, the EPA...

  7. Differential models in ecology

    International Nuclear Information System (INIS)

    Barco Gomez, Carlos; Barco Gomez, German

    2002-01-01

    Mathematical models written as differential equations are used to describe the population behavior of animal species over time. These models can be linear or nonlinear. The differential models for a single species include the exponential model of Malthus and the logistic model of Verhulst. The differential models describing the interaction between two species include competition, predation, and symbiosis relationships.
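
    As a hedged, minimal illustration of the two single-species models named above, the sketch below integrates the Malthusian (exponential) and Verhulst (logistic) growth equations; the growth rate, carrying capacity, and initial population are illustrative values only.

      # Malthusian (exponential) and Verhulst (logistic) growth for one species;
      # growth rate, carrying capacity and initial size are illustrative.
      import numpy as np
      from scipy.integrate import solve_ivp

      r, K, N0 = 0.5, 1000.0, 10.0        # growth rate (1/yr), carrying capacity, initial size
      t_eval = np.linspace(0.0, 10.0, 6)

      malthus = solve_ivp(lambda t, N: r * N, (0.0, 10.0), [N0], t_eval=t_eval)
      verhulst = solve_ivp(lambda t, N: r * N * (1.0 - N / K), (0.0, 10.0), [N0], t_eval=t_eval)

      for t, Ne, Nl in zip(t_eval, malthus.y[0], verhulst.y[0]):
          print(f"t = {t:4.1f} yr: exponential {Ne:9.1f}, logistic {Nl:7.1f}")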

  8. Hierarchical Bass model

    Science.gov (United States)

    Tashiro, Tohru

    2014-03-01

    We propose a new model of the diffusion of a product which includes a memory of how many adopters or advertisements a non-adopter has met, where (non-)adopters are people (not) possessing the product. This effect is lacking in the Bass model. As an application, we use the model to fit iPod sales data and obtain better agreement than with the Bass model.

  9. GARCH Modelling of Cryptocurrencies

    Directory of Open Access Journals (Sweden)

    Jeffrey Chu

    2017-10-01

    With the exception of Bitcoin, there appears to be little or no literature on GARCH modelling of cryptocurrencies. This paper provides the first GARCH modelling of the seven most popular cryptocurrencies. Twelve GARCH models are fitted to each cryptocurrency, and their fits are assessed in terms of five criteria. Conclusions are drawn on the best fitting models, forecasts and acceptability of value at risk estimates.
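
    The sketch below shows the conditional-variance recursion of a GARCH(1,1) model, the simplest member of the family fitted in the paper, together with a crude one-day Gaussian value-at-risk figure. The simulated return series and the parameter values are assumptions for illustration; no cryptocurrency data or fitted estimates from the paper are used.

      # GARCH(1,1) conditional-variance recursion on a simulated return series;
      # omega, alpha and beta are assumed values, not estimates from the paper.
      import numpy as np

      rng = np.random.default_rng(0)
      returns = 0.02 * rng.standard_normal(500)      # stand-in for daily log returns

      omega, alpha, beta = 1e-6, 0.08, 0.90          # assumed GARCH(1,1) parameters
      sigma2 = np.empty_like(returns)
      sigma2[0] = returns.var()

      for t in range(1, len(returns)):
          sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]

      var_95 = 1.645 * np.sqrt(sigma2[-1])           # crude one-day Gaussian VaR
      print(f"last conditional volatility: {np.sqrt(sigma2[-1]):.4f}")
      print(f"95% one-day VaR estimate   : {var_95:.4f}")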

  10. Cloud Model Bat Algorithm

    OpenAIRE

    Yongquan Zhou; Jian Xie; Liangliang Li; Mingzhi Ma

    2014-01-01

    Bat algorithm (BA) is a novel stochastic global optimization algorithm. Cloud model is an effective tool in transforming between qualitative concepts and their quantitative representation. Based on the bat echolocation mechanism and excellent characteristics of cloud model on uncertainty knowledge representation, a new cloud model bat algorithm (CBA) is proposed. This paper focuses on remodeling echolocation model based on living and preying characteristics of bats, utilizing the transformati...

  11. Optimization modeling with spreadsheets

    CERN Document Server

    Baker, Kenneth R

    2015-01-01

    An accessible introduction to optimization analysis using spreadsheets Updated and revised, Optimization Modeling with Spreadsheets, Third Edition emphasizes model building skills in optimization analysis. By emphasizing both spreadsheet modeling and optimization tools in the freely available Microsoft® Office Excel® Solver, the book illustrates how to find solutions to real-world optimization problems without needing additional specialized software. The Third Edition includes many practical applications of optimization models as well as a systematic framework that il

  12. Artificial neural network modelling

    CERN Document Server

    Samarasinghe, Sandhya

    2016-01-01

    This book covers theoretical aspects as well as recent innovative applications of Artificial Neural Networks (ANNs) in natural, environmental, biological, social, industrial and automated systems. It presents recent results of ANNs in modelling small, large and complex systems under three categories, namely, 1) Networks, Structure Optimisation, Robustness and Stochasticity 2) Advances in Modelling Biological and Environmental Systems and 3) Advances in Modelling Social and Economic Systems. The book aims at serving undergraduates, postgraduates and researchers in ANN computational modelling.

  13. TENCompetence Domain Model

    NARCIS (Netherlands)

    2006-01-01

    This is the version 1.1 of the TENCompetence Domain Model (version 1.0 released at 19-6-2006; version 1.1 at 9-11-2008). It contains several files: a) a pdf with the model description, b) three jpg files with class models (also in the pdf), c) a MagicDraw zip file with the model itself, d) a release

  14. Photovoltaic sources modeling

    CERN Document Server

    Petrone, Giovanni; Spagnuolo, Giovanni

    2016-01-01

    This comprehensive guide surveys all available models for simulating a photovoltaic (PV) generator at different levels of granularity, from cell to system level, in uniform as well as in mismatched conditions. Providing a thorough comparison among the models, engineers have all the elements needed to choose the right PV array model for specific applications or environmental conditions matched with the model of the electronic circuit used to maximize the PV power production.

  16. Calibrated Properties Model

    Energy Technology Data Exchange (ETDEWEB)

    J. Wang

    2003-06-24

    The purpose of this Model Report is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Office of Repository Development (ORD). The UZ contains the unsaturated rock layers overlying the repository and host unit, which constitute a natural barrier to flow, and the unsaturated rock layers below the repository which constitute a natural barrier to flow and transport. This work followed, and was planned in, ''Technical Work Plan (TWP) for: Performance Assessment Unsaturated Zone'' (BSC 2002 [160819], Section 1.10.8 [under Work Package (WP) AUZM06, Climate Infiltration and Flow], and Section I-1-1 [in Attachment I, Model Validation Plans]). In Section 4.2, four acceptance criteria (ACs) are identified for acceptance of this Model Report; only one of these (Section 4.2.1.3.6.3, AC 3) was identified in the TWP (BSC 2002 [160819], Table 3-1). These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, and drift-scale and mountain-scale coupled-process models from the UZ Flow, Transport and Coupled Processes Department in the Natural Systems Subproject of the Performance Assessment (PA) Project. The Calibrated Properties Model output will also be used by the Engineered Barrier System Department in the Engineering Systems Subproject. The Calibrated Properties Model provides input through the UZ Model and other process models of natural and engineered systems to the Total System Performance Assessment (TSPA) models, in accord with the PA Strategy and Scope in the PA Project of the Bechtel SAIC Company, LLC (BSC). The UZ process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and predict flow and transport behavior under a variety of climatic and thermal-loading conditions. UZ flow is a TSPA model component.

  17. Making business models

    DEFF Research Database (Denmark)

    Gudiksen, Sune Klok; Poulsen, Søren Bolvig; Buur, Jacob

    2014-01-01

    Well-established companies are currently struggling to secure profits due to the pressure from new players' business models as they take advantage of communication technology and new business-model configurations. Because of this, the business model research field flourishes currently; however, t...... illustrates how the application of participatory business model design toolsets can open up discussions on alternative scenarios through improvisation, mock-up making and design game playing, before qualitative judgment on the most promising scenario is carried out....

  18. Model Checking Feature Interactions

    DEFF Research Database (Denmark)

    Le Guilly, Thibaut; Olsen, Petur; Pedersen, Thomas

    2015-01-01

    This paper presents an offline approach to analyzing feature interactions in embedded systems. The approach consists of a systematic process to gather the necessary information about system components and their models. The model is first specified in terms of predicates, before being refined to timed automata. The consistency of the model is verified at different development stages, and the correct linkage between the predicates and their semantic model is checked. The approach is illustrated on a use case from home automation.

  19. Modelling of Corrosion Cracks

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    Modelling of corrosion cracking of reinforced concrete structures is complicated, as a great number of uncertain factors are involved. To achieve reliable modelling, a physical and mechanical understanding of the process behind corrosion is needed.

  20. Brucellosis, genital campylobacteriosis and other factors affecting calving rate of cattle in three states of Northern Nigeria.

    Science.gov (United States)

    Mai, Hassan M; Irons, Peter C; Thompson, Peter N

    2015-01-20

    Reproductive diseases limit the productivity of cattle worldwide and represent an important obstacle to profitable cattle enterprise. In this study, herd brucellosis and bovine genital campylobacteriosis (BGC) status, and demographic and management variables were determined and related to predicted calving rate (PrCR) of cattle herds in Adamawa, Kaduna and Kano states, Nigeria. Serum samples, preputial scrapings, questionnaire data, trans-rectal palpation and farm records were used from 271 herds. The Rose-Bengal plate test and competitive enzyme-linked immunosorbent assay were used for Brucella serology and culture and identification from preputial samples for BGC. A herd was classified as positive if one or more animals tested positive. The PrCR was determined as the number of calvings expected during the previous 6 and next 6 months as a percentage of the number of postpubertal heifers and cows in the herd. A multilevel linear regression model was used to estimate the herd-level effect of Brucella abortus seropositivity, Campylobacter fetus infection and other factors on calculated PrCR. The reproductive performance of the cattle herds was generally poor: Only 6.5% of the nursing cows were pregnant and 51.1% were non-pregnant and acyclic; the mean annual PrCR was 51.4%. Brucella abortus and C. fetus infection of herds were independently associated with absolute reduction in PrCR of 14.9% and 8.4%, respectively. There was also a strong negative association between within-herd Brucella seroprevalence and PrCR. Presence of small ruminants, animal introduction without quarantine and the presence of handling facilities were associated with lower PrCR, whereas larger herd size, supplementary feeding, routine mineral supplementation and care during parturition were associated with higher PrCR. Brucellosis and BGC may be largely responsible for the poor reproductive performance of indigenous Nigerian cattle. Farmer education and measures to improve the fertility of
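
    The multilevel linear regression mentioned above can be sketched as follows with synthetic data: herd-level predicted calving rate regressed on herd Brucella and BGC status, with a random intercept for an assumed grouping level (labelled "lga" here). The variable names, grouping structure, and all numbers are illustrative assumptions and do not reproduce the authors' dataset or estimates; only the reported effect sizes (-14.9 and -8.4 percentage points) are borrowed to generate the toy response.

      # Mixed-effects (multilevel) linear model on synthetic herd-level data; the
      # grouping variable "lga", all variable names and the simulated numbers are
      # assumptions made for illustration only.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(1)
      n_herds = 271
      df = pd.DataFrame({
          "lga": rng.choice([f"lga_{i}" for i in range(12)], size=n_herds),
          "brucella_pos": rng.integers(0, 2, size=n_herds),
          "bgc_pos": rng.integers(0, 2, size=n_herds),
      })
      df["PrCR"] = (60.0 - 14.9 * df["brucella_pos"] - 8.4 * df["bgc_pos"]
                    + rng.normal(0.0, 10.0, size=n_herds))

      model = smf.mixedlm("PrCR ~ brucella_pos + bgc_pos", data=df, groups=df["lga"])
      print(model.fit().summary())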

  1. Model description and evaluation of model performance: DOSDIM model

    International Nuclear Information System (INIS)

    Lewyckyj, N.; Zeevaert, T.

    1996-01-01

    DOSDIM was developed to assess the impact on man of routine and accidental atmospheric releases. It is a compartmental, deterministic, radiological model. For an accidental release, dynamic transfer factors are used, as opposed to a routine release, for which equilibrium transfer factors are used. Parameter values were chosen to be conservative. Transfers between compartments are described by first-order differential equations. 2 figs
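
    As a hedged illustration of the kind of first-order compartment system described above, the sketch below integrates a generic two-compartment transfer chain (a highly simplified soil-to-plant step). The rate constants, decay constant, and initial inventory are illustrative assumptions, not DOSDIM parameter values.

      # Generic two-compartment first-order transfer chain (simplified soil -> plant);
      # all rate constants and the initial inventory are illustrative, not DOSDIM values.
      import numpy as np
      from scipy.integrate import solve_ivp

      k_12 = 0.05       # transfer soil -> plant, 1/day (assumed)
      k_2out = 0.10     # loss from plant (harvest, weathering), 1/day (assumed)
      lam = 0.002       # radioactive decay constant, 1/day (assumed)

      def compartments(t, y):
          soil, plant = y
          d_soil = -(k_12 + lam) * soil
          d_plant = k_12 * soil - (k_2out + lam) * plant
          return [d_soil, d_plant]

      sol = solve_ivp(compartments, (0.0, 365.0), [1.0e3, 0.0],      # 1000 Bq initially in soil
                      t_eval=np.linspace(0.0, 365.0, 6))
      for t, soil, plant in zip(sol.t, sol.y[0], sol.y[1]):
          print(f"day {t:5.0f}: soil {soil:8.1f} Bq, plant {plant:7.1f} Bq")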

  2. Urban tree growth modeling

    Science.gov (United States)

    E. Gregory McPherson; Paula J. Peper

    2012-01-01

    This paper describes three long-term tree growth studies conducted to evaluate tree performance because repeated measurements of the same trees produce critical data for growth model calibration and validation. Several empirical and process-based approaches to modeling tree growth are reviewed. Modeling is more advanced in the fields of forestry and...

  3. Classifying variability modeling techniques

    NARCIS (Netherlands)

    Sinnema, Marco; Deelstra, Sybren

    Variability modeling is important for managing variability in software product families, especially during product derivation. In the past few years, several variability modeling techniques have been developed, each using its own concepts to model the variability provided by a product family. The

  4. Climate models and scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Fortelius, C.; Holopainen, E.; Kaurola, J.; Ruosteenoja, K.; Raeisaenen, J. [Helsinki Univ. (Finland). Dept. of Meteorology

    1996-12-31

    In recent years the modelling of interannual climate variability, the atmospheric energy and water cycles, and climate simulations with the ECHAM3 model have been studied. In addition, the climate simulations of several models have been compared, with special emphasis on the area of northern Europe

  5. Rock Properties Model

    International Nuclear Information System (INIS)

    Lum, C.

    2004-01-01

    The purpose of this model report is to document the Rock Properties Model version 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties model provides mean matrix and lithophysae porosity, and the cross-correlated mean bulk density as direct input to the ''Saturated Zone Flow and Transport Model Abstraction'', MDL-NBS-HS-000021, REV 02 (BSC 2004 [DIRS 170042]). The constraints, caveats, and limitations associated with this model are discussed in Section 6.6 and 8.2. Model validation accomplished by corroboration with data not cited as direct input is discussed in Section 7. The revision of this model report was performed as part of activities being conducted under the ''Technical Work Plan for: The Integrated Site Model, Revision 05'' (BSC 2004 [DIRS 169635]). The purpose of this revision is to bring the report up to current procedural requirements and address the Regulatory Integration Team evaluation comments. The work plan describes the scope, objectives, tasks, methodology, and procedures for this process

  6. THE SLX MODEL

    NARCIS (Netherlands)

    Vega, Solmaria Halleck; Elhorst, J. Paul

    We provide a comprehensive overview of the strengths and weaknesses of different spatial econometric model specifications in terms of spillover effects. Based on this overview, we advocate taking the SLX model as point of departure in case a well-founded theory indicating which model is most

  7. Understandings of 'Modelling'

    DEFF Research Database (Denmark)

    Andresen, Mette

    2007-01-01

    -authentic modelling is also linked with the potentials of exploration of ready-made models as a forerunner for more authentic modelling processes. The discussion includes analysis of an episode of students' work in the classroom, which serves to illustrate how concept formation may be linked to explorations of a non...

  8. Modeling EERE Deployment Programs

    Energy Technology Data Exchange (ETDEWEB)

    Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2007-11-01

    This report compiles information and conclusions gathered as part of the “Modeling EERE Deployment Programs” project. The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge in which future research is needed.

  9. Models of Business Internationalisation

    Directory of Open Access Journals (Sweden)

    Jurgita Vabinskaitė

    2011-04-01

    The study deals with the theoretical models of business internationalisation: the "Uppsala" Internationalisation Model, the modified "Uppsala" model, the Eclectic Paradigm and analysis of transaction costs, the Industrial Network approach, the Advantage Package and the Advantage Cycle. (Article in Lithuanian)

  10. Modelling: Nature and Use

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    Engineering of products and processes is increasingly "model-centric". Models in their multitudinous forms are ubiquitous, being heavily used for a range of decision-making activities across all life cycle phases. This chapter gives an overview of what a model is, the principal activities in the ...

  11. Ecological modeling for all

    OpenAIRE

    Christensen, V.; Pauly, D.

    1996-01-01

    A brief review of the status of the ECOPATH modeling approach and software is presented, with emphasis on the recent release of a Windows version (ECOPATH 3.0), which enables consideration of uncertainties, and sets the stage for simulation modeling using ECOSIM. Modeling of coral reefs is emphasized.

  12. Models selection and fitting

    International Nuclear Information System (INIS)

    Martin Llorente, F.

    1990-01-01

    Models of atmospheric pollutant dispersion are based on mathematical algorithms that describe the transport, diffusion, elimination and chemical reactions of atmospheric contaminants. These models operate on contaminant emission data and produce an estimate of air quality in the area. Such models can be applied to several aspects of atmospheric contamination

  13. Loglinear Rasch model tests

    NARCIS (Netherlands)

    Kelderman, Hendrikus

    1984-01-01

    Existing statistical tests for the fit of the Rasch model have been criticized, because they are only sensitive to specific violations of its assumptions. Contingency table methods using loglinear models have been used to test various psychometric models. In this paper, the assumptions of the Rasch

  14. Modeling Epidemic Network Failures

    DEFF Research Database (Denmark)

    Ruepp, Sarah Renée; Fagertun, Anna Manolova

    2013-01-01

    This paper presents the implementation of a failure propagation model for transport networks in which multiple failures occur and result in an epidemic. We implement the Susceptible Infected Disabled (SID) epidemic model and validate it by comparing it to analytical solutions. Furthermore, we evaluate...... to evaluate multiple epidemic scenarios in various network types.

  15. The 5C Model

    DEFF Research Database (Denmark)

    Friis, Silje Alberthe Kamille; Gelting, Anne Katrine Gøtzsche

    2014-01-01

    the approaches and reach a new level of conscious action when designing? Informed by theories of design thinking, knowledge production, and learning, we have developed a model, the 5C model, accompanied by 62 method cards. Examples of how the model has been applied in an educational setting are provided...

  16. The nontopological soliton model

    International Nuclear Information System (INIS)

    Wilets, L.

    1988-01-01

    The nontopological soliton model introduced by Friedberg and Lee, and variations of it, provide a method for modeling QCD which can effectively include the dynamics of hadronic collisions as well as spectra. Absolute color confinement is effected by the assumed dielectric properties of the medium. A recently proposed version of the model is chirally invariant. 32 refs., 5 figs., 1 tab

  17. The cloudy bag model

    International Nuclear Information System (INIS)

    Thomas, A.W.

    1981-01-01

    Recent developments in the bag model, in which the constraints of chiral symmetry are explicitly included are reviewed. The model leads to a new understanding of the Δ-resonance. The connection of the theory with current algebra is clarified and implications of the model for the structure of the nucleon are discussed

  18. Flexible survival regression modelling

    DEFF Research Database (Denmark)

    Cortese, Giuliana; Scheike, Thomas H; Martinussen, Torben

    2009-01-01

    Regression analysis of survival data, and more generally event history data, is typically based on Cox's regression model. We here review some recent methodology, focusing on the limitations of Cox's regression model. The key limitation is that the model is not well suited to represent time-varyi...
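
    A hedged sketch of the baseline the review starts from, a standard Cox proportional hazards fit, is given below, assuming the Python lifelines package and its bundled Rossi recidivism dataset; the flexible extensions discussed in the paper are not reproduced here.

      # Standard Cox proportional hazards fit (assumes the lifelines package and its
      # bundled Rossi recidivism data); the review's extensions are not shown here.
      from lifelines import CoxPHFitter
      from lifelines.datasets import load_rossi

      rossi = load_rossi()                  # duration column 'week', event column 'arrest'
      cph = CoxPHFitter()
      cph.fit(rossi, duration_col="week", event_col="arrest")
      cph.print_summary()

      # The proportional hazards assumption is exactly what time-varying effects violate;
      # lifelines provides a diagnostic for it.
      cph.check_assumptions(rossi, p_value_threshold=0.05)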

  19. Automated Simulation Model Generation

    NARCIS (Netherlands)

    Huang, Y.

    2013-01-01

    One of today's challenges in the field of modeling and simulation is to model increasingly larger and more complex systems. Complex models take long to develop and incur high costs. With the advances in data collection technologies and more popular use of computer-aided systems, more data has become

  20. Genome-Scale Models

    DEFF Research Database (Denmark)

    Bergdahl, Basti; Sonnenschein, Nikolaus; Machado, Daniel

    2016-01-01

    An introduction to genome-scale models, how to build and use them, will be given in this chapter. Genome-scale models have become an important part of systems biology and metabolic engineering, and are increasingly used in research, both in academia and in industry, both for modeling chemical pr...

  1. Models and Indicators.

    Science.gov (United States)

    Land, Kenneth C.

    2001-01-01

    Examines the definition, construction, and interpretation of social indicators. Shows how standard classes of formalisms used to construct models in contemporary sociology are derived from the general theory of models. Reviews recent model building and evaluation related to active life expectancy among the elderly, fertility rates, and indicators…

  2. Intersection of Feature Models

    NARCIS (Netherlands)

    van den Broek, P.M.

    In this paper, we present an algorithm for the construction of the intersection of two feature models. The feature models are allowed to have "requires" and "excludes" constraints, and should be parent-compatible. The algorithm is applied to the problem of combining feature models from stakeholders

  3. Merging Feature Models

    NARCIS (Netherlands)

    van den Broek, P.M.; Galvao, I.; Noppen, J.A.R.

    2010-01-01

    In this paper, we consider the problem of merging feature models which consist of trees with "requires" and "excludes" constraints. For any two such feature models which are parent-compatible, their merge is defined to be the smallest parent-compatible feature model which has all products of the

  4. Modeling Natural Selection

    Science.gov (United States)

    Bogiages, Christopher A.; Lotter, Christine

    2011-01-01

    In their research, scientists generate, test, and modify scientific models. These models can be shared with others and demonstrate a scientist's understanding of how the natural world works. Similarly, students can generate and modify models to gain a better understanding of the content, process, and nature of science (Kenyon, Schwarz, and Hug…

  5. Model Breaking Points Conceptualized

    Science.gov (United States)

    Vig, Rozy; Murray, Eileen; Star, Jon R.

    2014-01-01

    Current curriculum initiatives (e.g., National Governors Association Center for Best Practices and Council of Chief State School Officers 2010) advocate that models be used in the mathematics classroom. However, despite their apparent promise, there comes a point when models break, a point in the mathematical problem space where the model cannot,…

  6. Scalability of human models

    NARCIS (Netherlands)

    Rodarius, C.; Rooij, L. van; Lange, R. de

    2007-01-01

    The objective of this work was to create a scalable human occupant model that allows adaptation of human models with respect to size, weight and several mechanical parameters. Therefore, for the first time two scalable facet human models were developed in MADYMO. First, a scalable human male was

  7. Dynamic term structure models

    DEFF Research Database (Denmark)

    Andreasen, Martin Møller; Meldrum, Andrew

    pricing factors using the sequential regression approach. Our findings suggest that the two models largely provide the same in-sample fit, but loadings from ordinary and risk-adjusted Campbell-Shiller regressions are generally best matched by the shadow rate models. We also find that the shadow rate...... models perform better than the QTSMs when forecasting bond yields out of sample....

  9. Modeling agriculture in the Community Land Model

    Science.gov (United States)

    Drewniak, B.; Song, J.; Prell, J.; Kotamarthi, V. R.; Jacob, R.

    2013-04-01

    The potential impact of climate change on agriculture is uncertain. In addition, agriculture could influence above- and below-ground carbon storage. Development of models that represent agriculture is necessary to address these impacts. We have developed an approach to integrate agriculture representations for three crop types - maize, soybean, and spring wheat - into the coupled carbon-nitrogen version of the Community Land Model (CLM), to help address these questions. Here we present the new model, CLM-Crop, validated against observations from two AmeriFlux sites in the United States, planted with maize and soybean. Seasonal carbon fluxes compared well with field measurements for soybean, but not as well for maize. CLM-Crop yields were comparable with observations in countries such as the United States, Argentina, and China, although the generality of the crop model and its lack of technology and irrigation made direct comparison difficult. CLM-Crop was compared against the standard CLM3.5, which simulates crops as grass. The comparison showed improvement in gross primary productivity in regions where crops are the dominant vegetation cover. Crop yields and productivity were negatively correlated with temperature and positively correlated with precipitation, in agreement with other modeling studies. In case studies with the new crop model looking at impacts of residue management and planting date on crop yield, we found that increased residue returned to the litter pool increased crop yield, while reduced residue returns resulted in yield decreases. Using climate controls to signal planting date caused different responses in different crops. Maize and soybean had opposite reactions: when low temperature threshold resulted in early planting, maize responded with a loss of yield, but soybean yields increased. Our improvements in CLM demonstrate a new capability in the model - simulating agriculture in a realistic way, complete with fertilizer and residue management

  10. Transgenesis for pig models.

    Science.gov (United States)

    Yum, Soo-Young; Yoon, Ki-Young; Lee, Choong-Il; Lee, Byeong-Chun; Jang, Goo

    2016-09-30

    Animal models, particularly pigs, have come to play an important role in translational biomedical research. Many pig models with genetic modifications have been produced via somatic cell nuclear transfer (SCNT). However, because most transgenic pigs have to date been produced by random integration, the need has arisen for more precise gene-modified models using recombinase-based conditional gene expression, as in mice. Currently, advanced genome-editing technologies enable us to generate specific gene-deleted and gene-inserted pig models. In the future, pig models developed with gene-editing technologies could be a valuable resource for biomedical research.

  11. Mathematical modelling techniques

    CERN Document Server

    Aris, Rutherford

    1995-01-01

    ""Engaging, elegantly written."" - Applied Mathematical ModellingMathematical modelling is a highly useful methodology designed to enable mathematicians, physicists and other scientists to formulate equations from a given nonmathematical situation. In this elegantly written volume, a distinguished theoretical chemist and engineer sets down helpful rules not only for setting up models but also for solving the mathematical problems they pose and for evaluating models.The author begins with a discussion of the term ""model,"" followed by clearly presented examples of the different types of mode

  12. SAS "New" Business Model

    OpenAIRE

    Sörhammar, David; Bengtson, Anna

    2006-01-01

    In September 2005 SAS introduced a new business model. Where did the model come from and what influenced it? This paper’s focus is on the making of the model where we study the making of a business model as a dynamic process through time. In concrete terms, traces of today’s model can be found and examined from the SAS group’s embryonic attempts starting in 1946, through the financially good years during the 1980s, to the market re-regulation in contemporary time. During these years several c...

  13. The interacting boson model

    International Nuclear Information System (INIS)

    Iachello, F.; Arima, A.

    1987-01-01

    The book gives an account of some of the properties of the interacting boson model. The model was introduced in 1974 to describe in a unified way the collective properties of nuclei. The book presents the mathematical techniques used to analyse the structure of the model. The mathematical framework of the model is discussed in detail. The book also contains all the formulae that have been developed throughout the years to account for collective properties of nuclei. These formulae can be used by experimentalists to compare their data with the predictions of the model. (U.K.)

  14. CRAC2 model description

    International Nuclear Information System (INIS)

    Ritchie, L.T.; Alpert, D.J.; Burke, R.P.; Johnson, J.D.; Ostmeyer, R.M.; Aldrich, D.C.; Blond, R.M.

    1984-03-01

    The CRAC2 computer code is a revised version of CRAC (Calculation of Reactor Accident Consequences) which was developed for the Reactor Safety Study. This document provides an overview of the CRAC2 code and a description of each of the models used. Significant improvements incorporated into CRAC2 include an improved weather sequence sampling technique, a new evacuation model, and new output capabilities. In addition, refinements have been made to the atmospheric transport and deposition model. Details of the modeling differences between CRAC2 and CRAC are emphasized in the model descriptions
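
    As a hedged illustration of the generic building block behind atmospheric transport and deposition models of this kind, the sketch below evaluates a ground-level Gaussian plume concentration with reflection at the ground. It is not the CRAC2 implementation; the release rate, wind speed, effective release height, and dispersion parameters are illustrative assumptions.

      # Ground-level Gaussian plume concentration with ground reflection; all inputs
      # (release rate, wind speed, stack height, dispersion parameters) are assumed.
      import numpy as np

      Q = 1.0e10                        # release rate, Bq/s
      u = 5.0                           # wind speed, m/s
      H = 30.0                          # effective release height, m
      sigma_y, sigma_z = 200.0, 80.0    # dispersion parameters at the distance of interest, m

      def concentration(y, z=0.0):
          """Concentration (Bq/m^3) at crosswind offset y and height z."""
          crosswind = np.exp(-y**2 / (2.0 * sigma_y**2))
          vertical = (np.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                      + np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))
          return Q / (2.0 * np.pi * u * sigma_y * sigma_z) * crosswind * vertical

      for y in (0.0, 100.0, 500.0):
          print(f"y = {y:5.0f} m: {concentration(y):.3e} Bq/m^3")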

  15. Model-independent differences

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

    Computing differences (diffs) and merging different versions is well known for text files, but for models it is a very young field - in particular, patches for models are still a matter of research. Text-based and model-based diffs have different starting points because the semantics of their structure...... is fundamentally different. This paper reports on our ongoing work on model-independent diffs, i.e. a diff that does not directly refer to the models it was created from. Based on that, we present an idea of how the diff could be generalized, e.g. many atomic diffs are merged into a new, generalized diff. One use...

  16. Models of human operators

    International Nuclear Information System (INIS)

    Knee, H.E.; Schryver, J.C.

    1991-01-01

    Models of human behavior and cognition (HB and C) are necessary for understanding the total response of complex systems. Many such models have become available over the past thirty years for various applications. Unfortunately, many potential model users remain skeptical about their practicality, acceptability, and usefulness. Such hesitancy stems in part from disbelief in the ability to model complex cognitive processes, and a belief that relevant human behavior can be adequately accounted for through the use of commonsense heuristics. This paper will highlight several models of HB and C and identify existing and potential applications in an attempt to dispel such notions. (author)

  17. Patterns of data modeling

    CERN Document Server

    Blaha, Michael

    2010-01-01

    Best-selling author and database expert with more than 25 years of experience modeling application and enterprise data, Dr. Michael Blaha provides tried and tested data model patterns, to help readers avoid common modeling mistakes and unnecessary frustration on their way to building effective data models. Unlike the typical methodology book, "Patterns of Data Modeling" provides advanced techniques for those who have mastered the basics. Recognizing that database representation sets the path for software, determines its flexibility, affects its quality, and influences whether it succeeds ...

  18. UZ Colloid Transport Model

    International Nuclear Information System (INIS)

    McGraw, M.

    2000-01-01

    The UZ Colloid Transport model development plan states that the objective of this Analysis/Model Report (AMR) is to document the development of a model for simulating unsaturated colloid transport. This objective includes the following: (1) use of a process level model to evaluate the potential mechanisms for colloid transport at Yucca Mountain; (2) Provide ranges of parameters for significant colloid transport processes to Performance Assessment (PA) for the unsaturated zone (UZ); (3) Provide a basis for development of an abstracted model for use in PA calculations

  19. A Model for Information

    Directory of Open Access Journals (Sweden)

    Paul Walton

    2014-09-01

    This paper uses an approach drawn from the ideas of computer systems modelling to produce a model for information itself. The model integrates evolutionary, static and dynamic views of information and highlights the relationship between symbolic content and the physical world. The model includes what information technology practitioners call “non-functional” attributes, which, for information, include information quality and information friction. The concepts developed in the model enable a richer understanding of Floridi’s questions “what is information?” and “the informational circle: how can information be assessed?” (which he numbers P1 and P12).

  20. Complex matrix model duality

    Energy Technology Data Exchange (ETDEWEB)

    Brown, T.W.

    2010-11-15

    The same complex matrix model calculates both tachyon scattering for the c=1 non-critical string at the self-dual radius and certain correlation functions of half-BPS operators in N=4 super-Yang-Mills. It is dual to another complex matrix model where the couplings of the first model are encoded in the Kontsevich-like variables of the second. The duality between the theories is mirrored by the duality of their Feynman diagrams. Analogously to the Hermitian Kontsevich-Penner model, the correlation functions of the second model can be written as sums over discrete points in subspaces of the moduli space of punctured Riemann surfaces. (orig.)

  1. Graphical Models with R

    DEFF Research Database (Denmark)

    Højsgaard, Søren; Edwards, David; Lauritzen, Steffen L.

    ... of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In addition......, the book provides examples of how more advanced aspects of graphical modeling can be represented and handled within R. Topics covered in the seven chapters include graphical models for contingency tables, Gaussian and mixed graphical models, Bayesian networks and modeling high dimensional data...

  2. Mechanics of materials model

    Science.gov (United States)

    Meister, Jeffrey P.

    1987-01-01

    The Mechanics of Materials Model (MOMM) is a three-dimensional inelastic structural analysis code for use as an early design stage tool for hot section components. MOMM is a stiffness method finite element code that uses a network of beams to characterize component behavior. The MOMM contains three material models to account for inelastic material behavior. These include the simplified material model, which assumes a bilinear stress-strain response; the state-of-the-art model, which utilizes the classical elastic-plastic-creep strain decomposition; and Walker's viscoplastic model, which accounts for the interaction between creep and plasticity that occurs under cyclic loading conditions.

  3. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experience author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  4. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    ... leaving students. It is a probabilistic model. In the next part of this article, two more models - an 'input/output model' used for production systems or economic studies and a 'discrete event simulation model' - are introduced. Aircraft Performance Model.

  5. Making ecological models adequate

    Science.gov (United States)

    Getz, Wayne M.; Marshall, Charles R.; Carlson, Colin J.; Giuggioli, Luca; Ryan, Sadie J.; Romañach, Stephanie; Boettiger, Carl; Chamberlain, Samuel D.; Larsen, Laurel; D'Odorico, Paolo; O'Sullivan, David

    2018-01-01

    Critical evaluation of the adequacy of ecological models is urgently needed to enhance their utility in developing theory and enabling environmental managers and policymakers to make informed decisions. Poorly supported management can have detrimental, costly or irreversible impacts on the environment and society. Here, we examine common issues in ecological modelling and suggest criteria for improving modelling frameworks. An appropriate level of process description is crucial to constructing the best possible model, given the available data and understanding of ecological structures. Model details unsupported by data typically lead to over-parameterisation and poor model performance. Conversely, a lack of mechanistic details may limit a model's ability to predict ecological systems’ responses to management. Ecological studies that employ models should follow a set of model adequacy assessment protocols that include: asking a series of critical questions regarding state and control variable selection, the determinacy of data, and the sensitivity and validity of analyses. We also need to improve model elaboration, refinement and coarse graining procedures to better understand the relevancy and adequacy of our models and the role they play in advancing theory, improving hind- and forecasting, and enabling problem solving and management.

  6. Modelling Farm Animal Welfare

    Science.gov (United States)

    Collins, Lisa M.; Part, Chérie E.

    2013-01-01

    Simple Summary In this review paper we discuss the different modeling techniques that have been used in animal welfare research to date. We look at what questions they have been used to answer, the advantages and pitfalls of the methods, and how future research can best use these approaches to answer some of the most important upcoming questions in farm animal welfare. Abstract The use of models in the life sciences has greatly expanded in scope and advanced in technique in recent decades. However, the range, type and complexity of models used in farm animal welfare is comparatively poor, despite the great scope for use of modeling in this field of research. In this paper, we review the different modeling approaches used in farm animal welfare science to date, discussing the types of questions they have been used to answer, the merits and problems associated with the method, and possible future applications of each technique. We find that the most frequently published types of model used in farm animal welfare are conceptual and assessment models; two types of model that are frequently (though not exclusively) based on expert opinion. Simulation, optimization, scenario, and systems modeling approaches are rarer in animal welfare, despite being commonly used in other related fields. Finally, common issues such as a lack of quantitative data to parameterize models, and model selection and validation are discussed throughout the review, with possible solutions and alternative approaches suggested. PMID:26487411

  7. Calibrated Properties Model

    Energy Technology Data Exchange (ETDEWEB)

    T. Ghezzehej

    2004-10-04

    The purpose of this model report is to document the calibrated properties model that provides calibrated property sets for unsaturated zone (UZ) flow and transport process models (UZ models). The calibration of the property sets is performed through inverse modeling. This work followed, and was planned in, "Technical Work Plan (TWP) for: Unsaturated Zone Flow Analysis and Model Report Integration" (BSC 2004 [DIRS 169654], Sections 1.2.6 and 2.1.1.6). Direct inputs to this model report were derived from the following upstream analysis and model reports: "Analysis of Hydrologic Properties Data" (BSC 2004 [DIRS 170038]); "Development of Numerical Grids for UZ Flow and Transport Modeling" (BSC 2004 [DIRS 169855]); "Simulation of Net Infiltration for Present-Day and Potential Future Climates" (BSC 2004 [DIRS 170007]); "Geologic Framework Model" (GFM2000) (BSC 2004 [DIRS 170029]). Additionally, this model report incorporates errata of the previous version and closure of the Key Technical Issue agreement TSPAI 3.26 (Section 6.2.2 and Appendix B), and it is revised for improved transparency.
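
    As a hedged illustration of the inverse-modeling step described above (and not of the UZ flow models themselves), the sketch below calibrates two parameters of a made-up forward model against synthetic observations with scipy.optimize.least_squares; the forward model, parameter names and data are all assumptions for illustration.

        # Minimal illustration of calibration by inverse modeling (not the UZ model itself).
        # A hypothetical forward model predicts observations from two property parameters;
        # scipy.optimize.least_squares adjusts the parameters to minimize the misfit.
        import numpy as np
        from scipy.optimize import least_squares

        def forward_model(params, depths):
            """Hypothetical forward model: a saturation-like profile from two parameters."""
            log_permeability, alpha = params
            return 1.0 / (1.0 + np.exp(alpha * depths + log_permeability))

        # Synthetic "field" observations generated from assumed true parameters plus noise.
        rng = np.random.default_rng(0)
        depths = np.linspace(0.0, 50.0, 25)
        true_params = np.array([-2.0, 0.08])
        observations = forward_model(true_params, depths) + 0.01 * rng.standard_normal(depths.size)

        def residuals(params):
            return forward_model(params, depths) - observations

        fit = least_squares(residuals, x0=np.array([0.0, 0.01]))
        print("calibrated parameters:", fit.x)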

  8. Programming Models in HPC

    Energy Technology Data Exchange (ETDEWEB)

    Shipman, Galen M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-13

    These are the slides for a presentation on programming models in HPC, at the Los Alamos National Laboratory's Parallel Computing Summer School. The following topics are covered: Flynn's Taxonomy of computer architectures; single instruction single data; single instruction multiple data; multiple instruction multiple data; address space organization; definition of Trinity (Intel Xeon-Phi is a MIMD architecture); single program multiple data; multiple program multiple data; ExMatEx workflow overview; definition of a programming model, programming languages, runtime systems; programming model and environments; MPI (Message Passing Interface); OpenMP; Kokkos (Performance Portable Thread-Parallel Programming Model); Kokkos abstractions, patterns, policies, and spaces; RAJA, a systematic approach to node-level portability and tuning; overview of the Legion Programming Model; mapping tasks and data to hardware resources; interoperability: supporting task-level models; Legion S3D execution and performance details; workflow, integration of external resources into the programming model.
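
    To give a concrete flavour of the message-passing model listed in the presentation, the sketch below uses mpi4py (an assumed choice; the slides discuss MPI generically) to run the same Python program on every rank, exchange a message between ranks 0 and 1, and perform a reduction.

        # SPMD message passing with MPI: every process runs this same script.
        # Run with e.g. "mpiexec -n 2 python mpi_sketch.py" (mpi4py assumed installed).
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        size = comm.Get_size()

        if rank == 0:
            payload = {"greeting": "hello from rank 0", "world_size": size}
            comm.send(payload, dest=1, tag=11)   # point-to-point send to rank 1
        elif rank == 1:
            payload = comm.recv(source=0, tag=11)
            print("rank 1 received:", payload)

        # A collective operation: every rank contributes its rank number, root gets the sum.
        total = comm.reduce(rank, op=MPI.SUM, root=0)
        if rank == 0:
            print("sum of ranks:", total)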

  9. CREDIT RISK. DETERMINATION MODELS

    Directory of Open Access Journals (Sweden)

    MIHAELA GRUIESCU

    2012-01-01

    The internationalization of financial flows and banking and the rapid development of markets have changed the financial sector, causing it to respond with force and imagination. Under these conditions, the concerns of financial and banking institutions and rating institutions are increasingly turning to finding the best solutions to hedge risks and maximize profits. This paper aims to present a number of advantages, but also the limits, of the Merton model, the first structural model for modeling credit risk. It also discusses some extensions of the model, some of the related empirical research and known performance, and other approaches such as state-dependent models (SDM) which, together with liquidation process models (LPM), are two recent efforts among structural models to capture different phenomena in real life.
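
    For readers who want the mechanics behind the Merton structural model discussed above, the sketch below values equity as a call option on firm assets and computes the risk-neutral default probability from the standard distance-to-default formula; the numerical inputs are illustrative assumptions, not data from the paper.

        # Merton structural credit risk model: default occurs if the asset value falls
        # below the face value of debt at horizon T. Input values below are illustrative.
        import math
        from scipy.stats import norm

        def merton_default_probability(asset_value, debt_face_value, asset_vol, risk_free_rate, horizon):
            """Risk-neutral probability that assets end below the debt face value at time T."""
            d2 = (math.log(asset_value / debt_face_value)
                  + (risk_free_rate - 0.5 * asset_vol ** 2) * horizon) / (asset_vol * math.sqrt(horizon))
            return norm.cdf(-d2)   # P(V_T < D) under the risk-neutral measure

        def merton_equity_value(asset_value, debt_face_value, asset_vol, risk_free_rate, horizon):
            """Equity valued as a European call on firm assets with strike = debt face value."""
            sqrt_t = math.sqrt(horizon)
            d1 = (math.log(asset_value / debt_face_value)
                  + (risk_free_rate + 0.5 * asset_vol ** 2) * horizon) / (asset_vol * sqrt_t)
            d2 = d1 - asset_vol * sqrt_t
            return (asset_value * norm.cdf(d1)
                    - debt_face_value * math.exp(-risk_free_rate * horizon) * norm.cdf(d2))

        print("default probability:", merton_default_probability(120.0, 100.0, 0.25, 0.03, 1.0))
        print("equity value:       ", merton_equity_value(120.0, 100.0, 0.25, 0.03, 1.0))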

  10. Modelling of wastewater systems

    DEFF Research Database (Denmark)

    Bechmann, Henrik

    In this thesis, models of pollution fluxes in the inlet to 2 Danish wastewater treatment plants (WWTPs) as well as of suspended solids (SS) concentrations in the aeration tanks of an alternating WWTP and in the effluent from the aeration tanks are developed. The latter model is furthermore used ... to analyze and quantify the effect of the Aeration Tank Settling (ATS) operating mode, which is used during rain events. Furthermore, the model is used to propose a control algorithm for the phase lengths during ATS operation. The models are mainly formulated as state space models in continuous time ... The inlet models describe the COD (Chemical Oxygen Demand) flux and SS flux in the inlet to the WWTP; COD is measured by means of a UV absorption sensor while SS is measured by a turbidity sensor. These models include a description of the deposit of COD and SS amounts, respectively, in the sewer system, and the models can thus be used to quantify ...

  11. Foam process models.

    Energy Technology Data Exchange (ETDEWEB)

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses an homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  12. Hydrological land surface modelling

    DEFF Research Database (Denmark)

    Ridler, Marc-Etienne Francois

    Recent advances in integrated hydrological and soil-vegetation-atmosphere transfer (SVAT) modelling have led to improved water resource management practices, greater crop production, and better flood forecasting systems. However, uncertainty is inherent in all numerical models, ultimately leading ... The objective of this study is to develop and investigate methods to reduce hydrological model uncertainty by using supplementary data sources. The data is used either for model calibration or for model updating using data assimilation. Satellite estimates of soil moisture and surface ... hydrological and tested by assimilating synthetic hydraulic head observations in a catchment in Denmark. Assimilation led to a substantial reduction of model prediction error, and better model forecasts. Also, a new assimilation scheme is developed to downscale and bias-correct coarse satellite derived soil ...

  13. Hydrological land surface modelling

    DEFF Research Database (Denmark)

    Ridler, Marc-Etienne Francois

    Recent advances in integrated hydrological and soil-vegetation-atmosphere transfer (SVAT) modelling have led to improved water resource management practices, greater crop production, and better flood forecasting systems. However, uncertainty is inherent in all numerical models ultimately leading...... and disaster management. The objective of this study is to develop and investigate methods to reduce hydrological model uncertainty by using supplementary data sources. The data is used either for model calibration or for model updating using data assimilation. Satellite estimates of soil moisture and surface...... hydrological and tested by assimilating synthetic hydraulic head observations in a catchment in Denmark. Assimilation led to a substantial reduction of model prediction error, and better model forecasts. Also, a new assimilation scheme is developed to downscale and bias-correct coarse satellite derived soil...

  14. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    ... solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that are often occurring in activated sludge tanks are initially......-process models, the last part of the thesis, where the integrated process tank model is tested on three examples of activated sludge systems, is initiated. The three case studies are introduced with an increasing degree of model complexity. All three cases are based on Danish municipal wastewater treatment plants. The first case study involves the modeling of an activated sludge tank undergoing a special controlling strategy with the intention of minimizing the sludge loading on the subsequent secondary settlers during storm events. The applied model is a two-phase model, where the sedimentation of sludge...

  15. Aware design models

    DEFF Research Database (Denmark)

    Tamke, Martin

    2015-01-01

    Appearing almost alive, a novel set of computational design models can become an active counterpart for architects in the design process. The ability to loop, sense and query and the integration of near real-time simulation provide these models with a depth and agility that allows for instant and informed feedback. Introducing the term "Aware models", the paper investigates how computational models become an enabler for a better informed architectural design practice, through the embedding of knowledge about constraints, behaviour and processes of formation and making into generative design models ... The inspection of several computational design projects in architectural research highlights three different types of awareness a model can possess and devises strategies to establish and finally design with aware models. This design practice is collaborative in nature and characterized by a bidirectional flow...

  16. MODERN MEDIA EDUCATION MODELS

    Directory of Open Access Journals (Sweden)

    Alexander Fedorov

    2011-03-01

    The author supposed that media education models can be divided into the following groups: - educational-information models (the study of the theory, history, language of media culture, etc., based on the cultural, aesthetic, semiotic, socio-cultural theories of media education); - educational-ethical models (the study of moral, religious, philosophical problems, relying on the ethic, religious, ideological, ecological, protectionist theories of media education); - pragmatic models (practical media technology training, based on the uses and gratifications and ‘practical’ theories of media education); - aesthetical models (aimed above all at the development of the artistic taste and enriching the skills of analysis of the best media culture examples; relies on the aesthetical (art and cultural studies) theory); - socio-cultural models (socio-cultural development of a creative personality as to the perception, imagination, visual memory, interpretation analysis, autonomic critical thinking, relying on the cultural studies, semiotic, ethic models of media education).

  17. Untangling RFID Privacy Models

    Directory of Open Access Journals (Sweden)

    Iwen Coisel

    2013-01-01

    The rise of wireless applications based on RFID has brought up major concerns on privacy. Indeed nowadays, when such an application is deployed, informed customers yearn for guarantees that their privacy will not be threatened. One formal way to perform this task is to assess the privacy level of the RFID application with a model. However, if the chosen model does not reflect the assumptions and requirements of the analyzed application, it may misevaluate its privacy level. Therefore, selecting the most appropriate model among all the existing ones is not an easy task. This paper investigates the eight most well-known RFID privacy models and thoroughly examines their advantages and drawbacks in three steps. Firstly, five RFID authentication protocols are analyzed with these models. This discloses a main worry: although these protocols intuitively ensure different privacy levels, no model is able to accurately distinguish them. Secondly, these models are grouped according to their features (e.g., tag corruption ability). This classification reveals the most appropriate candidate model(s) to be used for a privacy analysis when one of these features is especially required. Furthermore, it points out that none of the models are comprehensive. Hence, some combinations of features may not match any model. Finally, the privacy properties of the eight models are compared in order to provide an overall view of their relations. This part highlights that no model globally outclasses the other ones. Considering the required properties of an application, the thorough study provided in this paper aims to assist system designers to choose the best suited model.

  18. Evaluating the Terrestrial Biogeochemical Responses and Feedbacks of Stratospheric Geoengineering Strategies

    Science.gov (United States)

    Yang, C. E.; Hoffman, F. M.; Fu, J. S.

    2017-12-01

    Stratospheric aerosol geoengineering options, involving injection of sulfur dioxide (SO2) aerosols into the stratosphere, are being proposed to reduce the heating effects of increasing anthropogenic atmospheric carbon dioxide (CO2). While the impacts of stratospheric aerosol geoengineering on climate changes, such as stratospheric ozone depletion and weakened monsoons, have been extensively investigated in the past few decades, few studies have considered the biogeochemical (BGC) responses and feedbacks on land. Previous Earth system model (ESM) simulations incorporating stratospheric aerosol geoengineering scenarios primarily focused on the atmospheric radiative forcing and temperature response in the absence of ocean and land responses. The land model setup in these simulations did not incorporate the carbon-nitrogen cycles and effects on the hydrological cycle considering vegetation responses. Since ESMs simulated very different aerosol distributions for the G3 and G4 scenarios in the Geoengineering Model Intercomparison Project (GeoMIP), we instead adopted the G4SSA scenario to simulate the BGC responses and feedbacks on land due to stratospheric aerosol geoengineering using the Community Earth System Model with active biogeochemical dynamic variations enabled. Implications for the terrestrial carbon cycle and hydrological responses will be presented.

  19. Constitutive models in LAME.

    Energy Technology Data Exchange (ETDEWEB)

    Hammerand, Daniel Carl; Scherzinger, William Mark

    2007-09-01

    The Library of Advanced Materials for Engineering (LAME) provides a common repository for constitutive models that can be used in computational solid mechanics codes. A number of models including both hypoelastic (rate) and hyperelastic (total strain) constitutive forms have been implemented in LAME. The structure and testing of LAME are described in Scherzinger and Hammerand ([3] and [4]). The purpose of the present report is to describe the material models which have already been implemented into LAME. The descriptions are designed to give useful information to both analysts and code developers. Thus far, 33 non-ITAR/non-CRADA protected material models have been incorporated. These include everything from the simple isotropic linear elastic models to a number of elastic-plastic models for metals to models for honeycomb, foams, potting epoxies and rubber. A complete description of each model is outside the scope of the current report. Rather, the aim here is to delineate the properties, state variables, functions, and methods for each model. However, a brief description of some of the constitutive details is provided for a number of the material models. Where appropriate, the SAND reports available for each model have been cited. Many models have state variable aliases for some or all of their state variables. These alias names can be used for outputting desired quantities. The state variable aliases available for results output have been listed in this report. However, not all models use these aliases. For those models, no state variable names are listed. Nevertheless, the number of state variables employed by each model is always given. Currently, there are four possible functions for a material model. This report lists which of these four methods are employed in each material model. As far as analysts are concerned, this information is included only for awareness purposes. The analyst can take confidence in the fact that each model has been properly implemented.

  20. Systemic resilience model

    International Nuclear Information System (INIS)

    Lundberg, Jonas; Johansson, Björn JE

    2015-01-01

    It has been realized that resilience as a concept involves several contradictory definitions, for instance resilience as agile adjustment and resilience as robust resistance to situations. Our analysis of resilience concepts and models suggests that beyond simplistic definitions, it is possible to draw up a systemic resilience model (SyRes) that maintains these opposing characteristics without contradiction. We outline six functions in a systemic model, drawing primarily on resilience engineering and disaster response: anticipation, monitoring, response, recovery, learning, and self-monitoring. The model consists of four areas: Event-based constraints, Functional Dependencies, Adaptive Capacity and Strategy. The paper describes dependencies between constraints, functions and strategies. We argue that models such as SyRes should be useful both for envisioning new resilience methods and metrics, as well as for engineering and evaluating resilient systems. - Highlights: • The SyRes model resolves contradictions between previous resilience definitions. • SyRes is a core model for envisioning and evaluating resilience metrics and models. • SyRes describes six functions in a systemic model. • They are anticipation, monitoring, response, recovery, learning, self-monitoring. • The model describes dependencies between constraints, functions and strategies

  1. Modeling Quantum Well Lasers

    Directory of Open Access Journals (Sweden)

    Dan Alexandru Anghel

    2012-01-01

    In semiconductor laser modeling, a good mathematical model gives near-reality results. Three methods of modeling solutions from the rate equations are presented and analyzed. A method based on the rate equations modeled in Simulink to describe quantum well lasers was presented. For different signal types like step function, saw tooth and sinus used as input, a good response of the used equations is obtained. A circuit model resulting from one of the rate equation models is presented and simulated in SPICE. Results show a good modeling behavior. Numerical simulation in MathCad gives satisfactory results for the study of the transitory and dynamic operation at small levels of the injection current. The obtained numerical results show the specific limits of each model, according to theoretical analysis. Based on these results, software can be built that integrates circuit simulation and other modeling methods for quantum well lasers, giving a tool that models and analyzes these devices from all points of view.
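
    Outside Simulink, SPICE or MathCad, the same rate-equation approach can be sketched with a general-purpose ODE solver; the single-mode carrier/photon rate equations below are integrated with scipy, and every device parameter is an illustrative assumption rather than a value from the paper.

        # Single-mode semiconductor laser rate equations (carrier density N, photon density S),
        # integrated for a step in the injection current. All parameter values are illustrative.
        from scipy.integrate import solve_ivp

        Q = 1.602e-19        # electron charge [C]
        VOL = 1.0e-16        # active region volume [m^3]
        TAU_N = 2.0e-9       # carrier lifetime [s]
        TAU_P = 2.0e-12      # photon lifetime [s]
        G0 = 1.0e-12         # differential gain coefficient [m^3/s]
        N_TR = 1.0e24        # transparency carrier density [1/m^3]
        GAMMA = 0.3          # optical confinement factor
        BETA = 1.0e-4        # spontaneous emission coupling factor

        def rate_equations(t, y, current):
            n, s = y
            gain = G0 * (n - N_TR)
            dn_dt = current / (Q * VOL) - n / TAU_N - gain * s
            ds_dt = GAMMA * gain * s - s / TAU_P + BETA * GAMMA * n / TAU_N
            return [dn_dt, ds_dt]

        # Step response: 30 mA drive current, starting near transparency with few photons.
        sol = solve_ivp(rate_equations, (0.0, 5e-9), [N_TR, 1.0e15],
                        args=(30e-3,), method="LSODA", max_step=1e-12)
        print("final photon density [1/m^3]:", sol.y[1, -1])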

  2. Geochemical modeling: a review

    International Nuclear Information System (INIS)

    Jenne, E.A.

    1981-06-01

    Two general families of geochemical models presently exist. The ion speciation-solubility group of geochemical models contain submodels to first calculate a distribution of aqueous species and to secondly test the hypothesis that the water is near equilibrium with particular solid phases. These models may or may not calculate the adsorption of dissolved constituents and simulate the dissolution and precipitation (mass transfer) of solid phases. Another family of geochemical models, the reaction path models, simulates the stepwise precipitation of solid phases as a result of reacting specified amounts of water and rock. Reaction path models first perform an aqueous speciation of the dissolved constituents of the water, test solubility hypotheses, then perform the reaction path modeling. Certain improvements in the present versions of these models would enhance their value and usefulness to applications in nuclear-waste isolation, etc. Mass-transfer calculations of limited extent are certainly within the capabilities of state-of-the-art models. However, the reaction path models require an expansion of their thermodynamic data bases and systematic validation before they are generally accepted

  3. The Tanaka model

    International Nuclear Information System (INIS)

    Tanaka, G.

    1998-01-01

    The data presented here includes male and female models for Asian populations in the age groups: Newborn, 1 year, 5 years, 10 years, 15 years and adult. The model for adult male was presented at the 3rd Research Coordination Meeting held in Tianjin, October 1993. At that time, the CRP participants requested Dr. Tanaka to continue development of a female model. The adult female model was developed together with models for five younger age groups. It is intended to provide useful data for radiation protection, and has been submitted to ICRP for use in developing revised models for internal dosimetry. The model is based on normal organ masses as well as physical measurements obtained primarily from Chinese, Indian and Japanese populations. These are believed to be the most extensive data sets available. The data presented here also takes into account the variations found in the data reported by other CRP participants. It should be stressed that the model is, at the same time, based on the approach used by the ICRP Reference Man Task Group in development of their Reference Man. As noted above, the adult male model was presented at the RCM Meeting in Tianjin and approved by the participants as the "Tanaka Model" that would be convenient for use in internal dosimetry studies for subjects from Asian populations. It is also the essential part of a publication which is a revised edition of the previous work.

  4. Modeling environmental policy

    International Nuclear Information System (INIS)

    Martin, W.E.; McDonald, L.A.

    1997-01-01

    The eight book chapters demonstrate the link between the physical models of the environment and the policy analysis in support of policy making. Each chapter addresses an environmental policy issue using a quantitative modeling approach. The volume addresses three general areas of environmental policy - non-point source pollution in the agricultural sector, pollution generated in the extractive industries, and transboundary pollutants from burning fossil fuels. The book concludes by discussing the modeling efforts and the use of mathematical models in general. Chapters are entitled: modeling environmental policy: an introduction; modeling nonpoint source pollution in an integrated system (agri-ecological); modeling environmental and trade policy linkages: the case of EU and US agriculture; modeling ecosystem constraints in the Clean Water Act: a case study in Clearwater National Forest (subject to discharge from metal mining waste); costs and benefits of coke oven emission controls; modeling equilibria and risk under global environmental constraints (discussing energy and environmental interrelations); relative contribution of the enhanced greenhouse effect on the coastal changes in Louisiana; and the use of mathematical models in policy evaluations: comments. The paper on coke area emission controls has been abstracted separately for the IEA Coal Research CD-ROM

  5. Drought modeling - A review

    Science.gov (United States)

    Mishra, Ashok K.; Singh, Vijay P.

    2011-06-01

    In recent years droughts have been occurring frequently, and their impacts are being aggravated by the rise in water demand and the variability in hydro-meteorological variables due to climate change. As a result, drought hydrology has been receiving much attention. A variety of concepts have been applied to modeling droughts, ranging from simplistic approaches to more complex models. It is important to understand different modeling approaches as well as their advantages and limitations. This paper, supplementing the previous paper (Mishra and Singh, 2010) where different concepts of droughts were highlighted, reviews different methodologies used for drought modeling, which include drought forecasting, probability based modeling, spatio-temporal analysis, use of Global Climate Models (GCMs) for drought scenarios, land data assimilation systems for drought modeling, and drought planning. It is found that there have been significant improvements in modeling droughts over the past three decades. Hybrid models, incorporating large scale climate indices, seem to be promising for long lead-time drought forecasting. Further research is needed to understand the spatio-temporal complexity of droughts under climate change due to changes in spatio-temporal variability of precipitation. Applications of copula based models for multivariate drought characterization seem to be promising for better drought characterization. Research on decision support systems should be advanced for issuing warnings, assessing risk, and taking precautionary measures, and the effective ways for the flow of information from decision makers to users need to be developed. Finally, some remarks are made regarding the future outlook for drought research.

  6. Models as Relational Categories

    Science.gov (United States)

    Kokkonen, Tommi

    2017-11-01

    Model-based learning (MBL) has an established position within science education. It has been found to enhance conceptual understanding and provide a way for engaging students in authentic scientific activity. Despite ample research, few studies have examined the cognitive processes regarding learning scientific concepts within MBL. On the other hand, recent research within cognitive science has examined the learning of so-called relational categories. Relational categories are categories whose membership is determined on the basis of the common relational structure. In this theoretical paper, I argue that viewing models as relational categories provides a well-motivated cognitive basis for MBL. I discuss the different roles of models and modeling within MBL (using ready-made models, constructive modeling, and generative modeling) and discern the related cognitive aspects brought forward by the reinterpretation of models as relational categories. I will argue that relational knowledge is vital in learning novel models and in the transfer of learning. Moreover, relational knowledge underlies the coherent, hierarchical knowledge of experts. Lastly, I will examine how the format of external representations may affect the learning of models and the relevant relations. The nature of the learning mechanisms underlying students' mental representations of models is an interesting open question to be examined. Furthermore, the ways in which the expert-like knowledge develops and how to best support it is in need of more research. The discussion and conceptualization of models as relational categories allows discerning students' mental representations of models in terms of evolving relational structures in greater detail than previously done.

  7. The Protein Model Portal.

    Science.gov (United States)

    Arnold, Konstantin; Kiefer, Florian; Kopp, Jürgen; Battey, James N D; Podvinec, Michael; Westbrook, John D; Berman, Helen M; Bordoli, Lorenza; Schwede, Torsten

    2009-03-01

    Structural Genomics has been successful in determining the structures of many unique proteins in a high throughput manner. Still, the number of known protein sequences is much larger than the number of experimentally solved protein structures. Homology (or comparative) modeling methods make use of experimental protein structures to build models for evolutionary related proteins. Thereby, experimental structure determination efforts and homology modeling complement each other in the exploration of the protein structure space. One of the challenges in using model information effectively has been to access all models available for a specific protein in heterogeneous formats at different sites using various incompatible accession code systems. Often, structure models for hundreds of proteins can be derived from a given experimentally determined structure, using a variety of established methods. This has been done by all of the PSI centers, and by various independent modeling groups. The goal of the Protein Model Portal (PMP) is to provide a single portal which gives access to the various models that can be leveraged from PSI targets and other experimental protein structures. A single interface allows all existing pre-computed models across these various sites to be queried simultaneously, and provides links to interactive services for template selection, target-template alignment, model building, and quality assessment. The current release of the portal consists of 7.6 million model structures provided by different partner resources (CSMP, JCSG, MCSG, NESG, NYSGXRC, JCMM, ModBase, SWISS-MODEL Repository). The PMP is available at http://www.proteinmodelportal.org and from the PSI Structural Genomics Knowledgebase.

  8. Simulating the Dependence of Aspen on Redistributed Snow

    Science.gov (United States)

    Soderquist, B.; Kavanagh, K.; Link, T. E.; Seyfried, M. S.; Winstral, A. H.

    2013-12-01

    In mountainous regions across the western USA, the distribution of aspen (Populus tremuloides) is often directly related to heterogeneous soil moisture subsidies resulting from redistributed snow. With decades of climate and precipitation data across elevational and precipitation gradients, the Reynolds Creek Experimental Watershed (RCEW) in southwest Idaho provides a unique opportunity to study the relationship between aspen and redistributed snow. Within the RCEW, the total amount of precipitation has not changed in the past 50 years, but there are sharp declines in the percentage of the precipitation falling as snow. As shifts in the distribution of available moisture continue, future trends in aspen net primary productivity (NPP) remain uncertain. In order to assess the importance of snowdrift subsidies, NPP of three aspen stands was simulated at sites spanning elevational and precipitation gradients using the biogeochemical process model BIOME-BGC. At the aspen site experiencing the driest climate and lowest amount of precipitation from snow, approximately 400 mm of total precipitation was measured from November to March of 2008. However, peak measured snow water equivalent (SWE) held in drifts directly upslope of this stand was approximately 2100 mm, 5 times more moisture than the uniform winter precipitation layer initially assumed by BIOME-BGC. BIOME-BGC simulations in dry years forced by adjusted precipitation data resulted in NPP values approximately 30% higher than simulations assuming a uniform precipitation layer. Using BIOME-BGC and climate data from 1985-2011, the relationship between simulated NPP and measured basal area increments (BAI) improved after accounting for redistributed snow, indicating increased simulation representation. In addition to improved simulation capabilities, soil moisture data, diurnal branch water potential, and stomatal conductance observations at each site detail the use of soil moisture in the rooting zone and the onset

  9. Modelling cointegration in the vector autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren

    2000-01-01

    A survey is given of some results obtained for the cointegrated VAR. The Granger representation theorem is discussed and the notions of cointegration and common trends are defined. The statistical model for cointegrated I(1) variables is defined, and it is shown how hypotheses on the cointegrating ...
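
    As a hedged, self-contained illustration of testing cointegration rank in a VAR (not code from the survey itself), the sketch below simulates two I(1) series sharing a common stochastic trend and applies the Johansen trace test as implemented in statsmodels.

        # Johansen cointegration test on two simulated I(1) series that share a common trend.
        import numpy as np
        from statsmodels.tsa.vector_ar.vecm import coint_johansen

        rng = np.random.default_rng(42)
        n_obs = 500
        common_trend = np.cumsum(rng.standard_normal(n_obs))    # shared random walk
        y1 = common_trend + rng.standard_normal(n_obs)          # cointegrated with y2
        y2 = 0.5 * common_trend + rng.standard_normal(n_obs)
        data = np.column_stack([y1, y2])

        # det_order=0: constant term; k_ar_diff=1: one lagged difference in the VECM.
        result = coint_johansen(data, det_order=0, k_ar_diff=1)
        print("trace statistics:", result.lr1)            # one statistic per rank hypothesis
        print("critical values (90/95/99%):", result.cvt)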

  10. Developing mathematical modelling competence

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Jensen, Tomas Højgaard

    2003-01-01

    In this paper we introduce the concept of mathematical modelling competence, by which we mean being able to carry through a whole mathematical modelling process in a certain context. Analysing the structure of this process, six sub-competences are identified. Mathematical modelling competence...... cannot be reduced to these six sub-competences, but they are necessary elements in the development of mathematical modelling competence. Experience from the development of a modelling course is used to illustrate how the different nature of the sub-competences can be used as a tool for finding...... the balance between different kinds of activities in a particular educational setting. Obstacles of social, cognitive and affective nature for the students' development of mathematical modelling competence are reported and discussed in relation to the sub-competences....

  11. Cloud model bat algorithm.

    Science.gov (United States)

    Zhou, Yongquan; Xie, Jian; Li, Liangliang; Ma, Mingzhi

    2014-01-01

    Bat algorithm (BA) is a novel stochastic global optimization algorithm. Cloud model is an effective tool in transforming between qualitative concepts and their quantitative representation. Based on the bat echolocation mechanism and excellent characteristics of cloud model on uncertainty knowledge representation, a new cloud model bat algorithm (CBA) is proposed. This paper focuses on remodeling echolocation model based on living and preying characteristics of bats, utilizing the transformation theory of cloud model to depict the qualitative concept: "bats approach their prey." Furthermore, Lévy flight mode and population information communication mechanism of bats are introduced to balance the advantage between exploration and exploitation. The simulation results show that the cloud model bat algorithm has good performance on functions optimization.
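
    The cloud model extension itself is not reproduced here, but the baseline bat algorithm that CBA remodels can be sketched as follows; the objective function and all parameter values are assumptions chosen only for illustration.

        # Minimal standard bat algorithm (the baseline that the cloud model variant builds on).
        import numpy as np

        def sphere(x):
            """Illustrative objective: minimum 0 at the origin."""
            return float(np.sum(x ** 2))

        def bat_algorithm(objective, dim=10, n_bats=30, n_iter=500, f_min=0.0, f_max=2.0,
                          alpha=0.9, gamma=0.9, lower=-5.0, upper=5.0, seed=0):
            rng = np.random.default_rng(seed)
            pos = rng.uniform(lower, upper, (n_bats, dim))
            vel = np.zeros((n_bats, dim))
            loudness = np.ones(n_bats)          # A_i, decreases when a bat improves
            pulse_rate = np.zeros(n_bats)       # r_i, increases when a bat improves
            fit = np.array([objective(p) for p in pos])
            best_idx = int(np.argmin(fit))
            best, best_fit = pos[best_idx].copy(), fit[best_idx]

            for t in range(n_iter):
                for i in range(n_bats):
                    freq = f_min + (f_max - f_min) * rng.random()
                    vel[i] += (pos[i] - best) * freq
                    cand = np.clip(pos[i] + vel[i], lower, upper)
                    # With probability (1 - r_i), do a local random walk around the best bat.
                    if rng.random() > pulse_rate[i]:
                        cand = np.clip(best + 0.01 * loudness.mean() * rng.standard_normal(dim),
                                       lower, upper)
                    cand_fit = objective(cand)
                    # Accept improving moves with probability A_i, then adapt A_i and r_i.
                    if cand_fit < fit[i] and rng.random() < loudness[i]:
                        pos[i], fit[i] = cand, cand_fit
                        loudness[i] *= alpha
                        pulse_rate[i] = 1.0 - np.exp(-gamma * (t + 1))
                    if cand_fit < best_fit:
                        best, best_fit = cand.copy(), cand_fit
            return best, best_fit

        best_x, best_val = bat_algorithm(sphere)
        print("best objective value found:", best_val)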

  12. Cloud Model Bat Algorithm

    Directory of Open Access Journals (Sweden)

    Yongquan Zhou

    2014-01-01

    Bat algorithm (BA is a novel stochastic global optimization algorithm. Cloud model is an effective tool in transforming between qualitative concepts and their quantitative representation. Based on the bat echolocation mechanism and excellent characteristics of cloud model on uncertainty knowledge representation, a new cloud model bat algorithm (CBA is proposed. This paper focuses on remodeling echolocation model based on living and preying characteristics of bats, utilizing the transformation theory of cloud model to depict the qualitative concept: “bats approach their prey.” Furthermore, Lévy flight mode and population information communication mechanism of bats are introduced to balance the advantage between exploration and exploitation. The simulation results show that the cloud model bat algorithm has good performance on functions optimization.

  13. Elements of modeling

    International Nuclear Information System (INIS)

    Bozoki, E.

    1987-01-01

    There is burgeoning interest in modeling-based accelerator control. With more and more stringent requirements on the performance, the importance of knowing, controlling, and predicting the behavior of the accelerator system is growing. Modeling means two things: (1) the development of programs and data which predict the outcome of a measurement, and (2) devising and performing measurements to find the machine physics parameters and their behavior under different conditions. These two sides should be tied together in an iterative process. With knowledge gained on the real system, the model will be modified, calibrated, and fine-tuned. The model of a system consists of data and the modeling program. The Modeling Based Control Programs (MBC) should, in the on-line mode, control, optimize, and correct the machine. In the off-line mode, the MBC is used to simulate the machine as well as explore and study its behavior and responses under a wide variety of circumstances. 15 refs., 3 figs

  14. Inverse and Predictive Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Syracuse, Ellen Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-27

    The LANL Seismo-Acoustic team has a strong capability in developing data-driven models that accurately predict a variety of observations. These models range from the simple – one-dimensional models that are constrained by a single dataset and can be used for quick and efficient predictions – to the complex – multidimensional models that are constrained by several types of data and result in more accurate predictions. While team members typically build models of geophysical characteristics of Earth and source distributions at scales of 1 to 1000s of km, the techniques used are applicable to other types of physical characteristics at an even greater range of scales. The following cases provide a snapshot of some of the modeling work done by the Seismo-Acoustic team at LANL.

  15. Spatial cluster modelling

    CERN Document Server

    Lawson, Andrew B

    2002-01-01

    Research has generated a number of advances in methods for spatial cluster modelling in recent years, particularly in the area of Bayesian cluster modelling. Along with these advances has come an explosion of interest in the potential applications of this work, especially in epidemiology and genome research. In one integrated volume, this book reviews the state-of-the-art in spatial clustering and spatial cluster modelling, bringing together research and applications previously scattered throughout the literature. It begins with an overview of the field, then presents a series of chapters that illuminate the nature and purpose of cluster modelling within different application areas, including astrophysics, epidemiology, ecology, and imaging. The focus then shifts to methods, with discussions on point and object process modelling, perfect sampling of cluster processes, partitioning in space and space-time, spatial and spatio-temporal process modelling, nonparametric methods for clustering, and spatio-temporal ...

  16. The IIR evaluation model

    DEFF Research Database (Denmark)

    Borlund, Pia

    2003-01-01

    An alternative approach to evaluation of interactive information retrieval (IIR) systems, referred to as the IIR evaluation model, is proposed. The model provides a framework for the collection and analysis of IR interaction data. The aim of the model is two-fold: 1) to facilitate the evaluation...... of IIR systems as realistically as possible with reference to actual information searching and retrieval processes, though still in a relatively controlled evaluation environment; and 2) to calculate the IIR system performance taking into account the non-binary nature of the assigned relevance...... assessments. The IIR evaluation model is presented as an alternative to the system-driven Cranfield model (Cleverdon, Mills & Keen, 1966; Cleverdon & Keen, 1966) which still is the dominant approach to the evaluation of IR and IIR systems. Key elements of the IIR evaluation model are the use of realistic...

  17. Essentials of econophysics modelling

    CERN Document Server

    Slanina, Frantisek

    2014-01-01

    This book is a course in methods and models rooted in physics and used in modelling economic and social phenomena. It covers the discipline of econophysics, which creates an interface between physics and economics. Besides the main theme, it touches on the theory of complex networks and simulations of social phenomena in general. After a brief historical introduction, the book starts with a list of basic empirical data and proceeds to thorough investigation of mathematical and computer models. Many of the models are based on hypotheses of the behaviour of simplified agents. These comprise strategic thinking, imitation, herding, and the gem of econophysics, the so-called minority game. At the same time, many other models view the economic processes as interactions of inanimate particles. Here, the methods of physics are especially useful. Examples of systems modelled in such a way include books of stock-market orders, and redistribution of wealth among individuals. Network effects are investigated in the inter...
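
    The minority game mentioned above is simple enough to simulate directly; the following bare-bones sketch (agent counts, memory length and other parameters are arbitrary choices, not taken from the book) lets an odd number of agents, each holding a few random strategies over the recent history, repeatedly pick a side, with the minority side winning.

        # Bare-bones minority game: N agents, each with S random strategies mapping the
        # last M outcomes to a choice in {-1, +1}; the side in the minority scores a point.
        import numpy as np

        N_AGENTS, N_STRATEGIES, MEMORY, ROUNDS = 101, 2, 3, 2000
        rng = np.random.default_rng(1)

        n_histories = 2 ** MEMORY
        # strategies[agent, strategy, history] is a fixed choice in {-1, +1}
        strategies = rng.choice([-1, 1], size=(N_AGENTS, N_STRATEGIES, n_histories))
        scores = np.zeros((N_AGENTS, N_STRATEGIES))   # virtual scores of each strategy
        history = rng.integers(0, n_histories)        # last M outcomes encoded as an integer
        attendance = []

        for _ in range(ROUNDS):
            # Each agent plays its currently best-scoring strategy on the current history.
            best_strategy = np.argmax(scores, axis=1)
            actions = strategies[np.arange(N_AGENTS), best_strategy, history]
            total = actions.sum()
            minority_side = -np.sign(total) if total != 0 else rng.choice([-1, 1])
            attendance.append(total)
            # Reward every strategy that would have chosen the minority side.
            scores += (strategies[:, :, history] == minority_side)
            # Update the public history (bit is 1 if +1 was the minority side).
            winning_bit = 1 if minority_side == 1 else 0
            history = ((history << 1) | winning_bit) % n_histories

        print("volatility sigma^2 / N:", np.var(attendance) / N_AGENTS)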

  18. Brain Network Modelling

    DEFF Research Database (Denmark)

    Andersen, Kasper Winther

    Three main topics are presented in this thesis. The first and largest topic concerns network modelling of functional Magnetic Resonance Imaging (fMRI) and Diffusion Weighted Imaging (DWI). In particular nonparametric Bayesian methods are used to model brain networks derived from resting state f...... for their ability to reproduce node clustering and predict unseen data. Comparing the models on whole brain networks, BCD and IRM showed better reproducibility and predictability than IDM, suggesting that resting state networks exhibit community structure. This also points to the importance of using models, which...... allow for complex interactions between all pairs of clusters. In addition, it is demonstrated how the IRM can be used for segmenting brain structures into functionally coherent clusters. A new nonparametric Bayesian network model is presented. The model builds upon the IRM and can be used to infer...

  19. Identification of physical models

    DEFF Research Database (Denmark)

    Melgaard, Henrik

    1994-01-01

    The problem of identification of physical models is considered within the frame of stochastic differential equations. Methods for estimation of parameters of these continuous time models based on discrete time measurements are discussed. The important algorithms of a computer program for ML or MAP...... design of experiments, which is for instance the design of an input signal that are optimal according to a criterion based on the information provided by the experiment. Also model validation is discussed. An important verification of a physical model is to compare the physical characteristics...... of the model with the available prior knowledge. The methods for identification of physical models have been applied in two different case studies. One case is the identification of thermal dynamics of building components. The work is related to a CEC research project called PASSYS (Passive Solar Components...

  20. Diaspora Business Model Innovation

    Directory of Open Access Journals (Sweden)

    Aki Harima

    2015-01-01

    This paper explores how diasporans achieve business model innovation by using their unique resources. The hypothesis underlying the paper is that the unique backgrounds and resources of diaspora businesses, due to different sources of information and experiences as well as multiple networks, contribute to business model innovation in a distinctive manner. We investigate the English school market in the Philippines, which has been established by East Asian diaspora who innovate on the business model of conventional English schools. Two case studies were conducted with Japanese diaspora English schools. Their business is analyzed using a business model canvas (Osterwalder & Pigneur, 2010) and contrasted with the conventional business model. The empirical cases show that diaspora businesses use knowledge about their country of origin and engage with their country of residence and multiple networks in different locations and constellations to identify unique opportunities, leading to business model innovation.

  1. Electricity market modeling trends

    International Nuclear Information System (INIS)

    Ventosa, Mariano; Baillo, Alvaro; Ramos, Andres; Rivier, Michel

    2005-01-01

    The trend towards competition in the electricity sector has led to efforts by the research community to develop decision and analysis support models adapted to the new market context. This paper focuses on electricity generation market modeling. Its aim is to help to identify, classify and characterize the somewhat confusing diversity of approaches that can be found in the technical literature on the subject. The paper presents a survey of the most relevant publications regarding electricity market modeling, identifying three major trends: optimization models, equilibrium models and simulation models. It introduces a classification according to their most relevant attributes. Finally, it identifies the most suitable approaches for conducting various types of planning studies or market analysis in this new context

  2. Alternative tsunami models

    Energy Technology Data Exchange (ETDEWEB)

    Tan, A; Lyatskaya, I [Department of Physics, Alabama A and M University, Normal, AL 35762 (United States)], E-mail: arjun.tan@aamu.edu

    2009-01-15

    The interesting papers by Margaritondo (2005 Eur. J. Phys. 26 401) and by Helene and Yamashita (2006 Eur. J. Phys. 27 855) analysed the great Indian Ocean tsunami of 2004 using a simple one-dimensional canal wave model, which was appropriate for undergraduate students in physics and related fields of discipline. In this paper, two additional, easily understandable models, suitable for the same level of readership, are proposed: one, a two-dimensional model in flat space, and two, the same on a spherical surface. The models are used to study the tsunami produced by the central Kuril earthquake of November 2006. It is shown that the two alternative models, especially the latter one, give better representations of the wave amplitude, especially at far-flung locations. The latter model further demonstrates the enhancing effect on the amplitude due to the curvature of the Earth for far-reaching tsunami propagation.
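
    The physical ingredient shared by the canal, flat and spherical models is the shallow-water wave speed c = sqrt(g h); the sketch below combines that speed with a great-circle distance to give a rough travel-time estimate on the spherical Earth, using illustrative coordinates and a uniform assumed ocean depth rather than values from the paper.

        # Shallow-water tsunami travel-time estimate on a spherical Earth:
        # speed c = sqrt(g * h) over a great-circle path of assumed uniform ocean depth.
        import math

        EARTH_RADIUS_KM = 6371.0
        G = 9.81  # m/s^2

        def great_circle_km(lat1, lon1, lat2, lon2):
            """Haversine great-circle distance between two points given in degrees."""
            p1, p2 = math.radians(lat1), math.radians(lat2)
            dphi = math.radians(lat2 - lat1)
            dlmb = math.radians(lon2 - lon1)
            a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
            return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

        def travel_time_hours(distance_km, mean_depth_m):
            speed_ms = math.sqrt(G * mean_depth_m)   # long-wave (shallow-water) speed
            return distance_km * 1000.0 / speed_ms / 3600.0

        # Illustrative example: a central Kuril source to the US west coast,
        # assuming a uniform 4 km deep Pacific (coordinates and depth are assumptions).
        dist = great_circle_km(46.6, 153.2, 41.8, -124.2)
        print(f"distance: {dist:.0f} km, travel time: {travel_time_hours(dist, 4000.0):.1f} h")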

  3. Energy balance climate models

    Science.gov (United States)

    North, G. R.; Cahalan, R. F.; Coakley, J. A., Jr.

    1981-01-01

    An introductory survey of the global energy balance climate models is presented with an emphasis on analytical results. A sequence of increasingly complicated models involving ice cap and radiative feedback processes are solved, and the solutions and parameter sensitivities are studied. The model parameterizations are examined critically in light of many current uncertainties. A simple seasonal model is used to study the effects of changes in orbital elements on the temperature field. A linear stability theorem and a complete nonlinear stability analysis for the models are developed. Analytical solutions are also obtained for the linearized models driven by stochastic forcing elements. In this context the relation between natural fluctuation statistics and climate sensitivity is stressed.
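
    A minimal, hedged companion to the models surveyed here is the zero-dimensional energy balance model with a crude ice-albedo feedback sketched below; the parameter values are textbook-style assumptions, not those of the reviewed models, and the two runs simply illustrate how the feedback permits multiple equilibria.

        # Zero-dimensional energy balance climate model with a crude ice-albedo feedback:
        # C dT/dt = S0 * (1 - albedo(T)) / 4 - epsilon * sigma * T^4. Values are illustrative.
        import numpy as np

        S0 = 1361.0          # solar constant [W/m^2]
        SIGMA = 5.67e-8      # Stefan-Boltzmann constant [W/m^2/K^4]
        EPSILON = 0.61       # effective emissivity (crude greenhouse parameterization)
        HEAT_CAP = 4.0e8     # effective heat capacity of the surface layer [J/m^2/K]

        def albedo(temp_k):
            """Warm planet: 0.3; ice-covered planet: 0.7; linear ramp between 250 and 280 K."""
            return float(np.clip(0.7 - 0.4 * (temp_k - 250.0) / 30.0, 0.3, 0.7))

        def integrate(temp0_k, years=200, dt_days=1.0):
            dt = dt_days * 86400.0
            temp = temp0_k
            for _ in range(int(years * 365 / dt_days)):
                absorbed = S0 * (1.0 - albedo(temp)) / 4.0
                emitted = EPSILON * SIGMA * temp ** 4
                temp += dt * (absorbed - emitted) / HEAT_CAP
            return temp

        # Different initial states settle into different equilibria because of the feedback.
        print("equilibrium from 290 K start:", round(integrate(290.0), 1), "K")
        print("equilibrium from 230 K start:", round(integrate(230.0), 1), "K")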

  4. Multiscale Cancer Modeling

    Science.gov (United States)

    Macklin, Paul; Cristini, Vittorio

    2013-01-01

    Simulating cancer behavior across multiple biological scales in space and time, i.e., multiscale cancer modeling, is increasingly being recognized as a powerful tool to refine hypotheses, focus experiments, and enable more accurate predictions. A growing number of examples illustrate the value of this approach in providing quantitative insight on the initiation, progression, and treatment of cancer. In this review, we introduce the most recent and important multiscale cancer modeling works that have successfully established a mechanistic link between different biological scales. Biophysical, biochemical, and biomechanical factors are considered in these models. We also discuss innovative, cutting-edge modeling methods that are moving predictive multiscale cancer modeling toward clinical application. Furthermore, because the development of multiscale cancer models requires a new level of collaboration among scientists from a variety of fields such as biology, medicine, physics, mathematics, engineering, and computer science, an innovative Web-based infrastructure is needed to support this growing community. PMID:21529163

  5. Los Alamos Programming Models

    Energy Technology Data Exchange (ETDEWEB)

    Bergen, Benjamin Karl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-07

    This is the PDF of a PowerPoint presentation from a teleconference on Los Alamos programming models. It starts by listing the assumptions behind the programming models and then details a hierarchical programming model at the System Level and Node Level. It then details how this maps onto their internal nomenclature. Finally, it lists what they are currently doing in this regard.

  6. Modelling Meat Quality Attributes.

    OpenAIRE

    Farrell, Terence C.

    2001-01-01

    Recent meat demand models incorporate demand functions for cuts of meat rather than whole carcasses. However, parameters for “meat quality” are seldom included in such models. Modelling difficulty arises as meat cuts are heterogeneous in their quality attributes. Meat quality may be assessed by measurement of attributes including tenderness, juiciness and flavour. Cooking method and cooking time are the two primary factors that affect meat-eating quality. The purpose of this paper is to show ...

  7. AREST model description

    Energy Technology Data Exchange (ETDEWEB)

    Engel, D.W.; McGrail, B.P.

    1993-11-01

    The Office of Civilian Radioactive Waste Management and the Power Reactor and Nuclear Fuel Development Corporation of Japan (PNC) have supported the development of the Analytical Repository Source-Term (AREST) model at Pacific Northwest Laboratory. AREST is a computer model developed to evaluate radionuclide release from an underground geologic repository. The AREST code can be used to calculate/estimate the amount and rate of each radionuclide that is released from the engineered barrier system (EBS) of the repository. The EBS is the man-made or disrupted area of the repository. AREST was designed as a system-level model to simulate the behavior of the total repository by combining process-level models for the release from an individual waste package or container. AREST contains primarily analytical models for calculating the release/transport of radionuclides to the host rock that surrounds each waste package. Analytical models were used because of their small computational overhead, which allows all the input parameters to be derived from statistical distributions. Recently, a one-dimensional numerical model was also incorporated into AREST to allow for more detailed modeling of the transport process with arbitrary-length decay chains. The next step in modeling the EBS is to develop a model that couples the probabilistic capabilities of AREST with a more detailed process model. This model will need to consider the reactive coupling of the processes that are involved in the release, including: (1) the dissolution of the waste form, (2) the geochemical modeling of the groundwater, (3) the corrosion of the container overpacking, and (4) the backfill material, to name a few. Several of these coupled processes are already incorporated in the current version of AREST.
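
    The mention of arbitrary-length decay chains can be illustrated with a short, generic sketch of the Bateman equations integrated numerically. The chain length, decay constants and initial inventory below are hypothetical placeholders; this is neither AREST code nor AREST data.

      import numpy as np
      from scipy.integrate import solve_ivp

      lam = np.array([1.0e-5, 5.0e-4, 2.0e-3])   # hypothetical decay constants, 1/yr
      N0 = np.array([1.0, 0.0, 0.0])             # initial inventory, arbitrary units

      def bateman(t, N):
          """dN_i/dt = lambda_(i-1) N_(i-1) - lambda_i N_i (Bateman equations)."""
          dN = -lam * N
          dN[1:] += lam[:-1] * N[:-1]
          return dN

      sol = solve_ivp(bateman, (0.0, 1.0e5), N0, t_eval=np.linspace(0.0, 1.0e5, 6))
      for t, row in zip(sol.t, sol.y.T):
          print(f"t = {t:8.0f} yr  inventories = {np.round(row, 4)}")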

  8. Conceptual IT model

    Science.gov (United States)

    Arnaoudova, Kristina; Stanchev, Peter

    2015-11-01

    Business processes are a key asset for every organization. The design of business process models is a foremost concern and target among an organization's functions. Business processes and their proper management depend heavily on the performance of software applications and technology solutions. The paper is an attempt to define a new conceptual model of an IT service provider, which can be viewed as an IT-focused enterprise model, part of the Enterprise Architecture (EA) school.

  9. The Protein Model Portal

    OpenAIRE

    Arnold, Konstantin; Kiefer, Florian; Kopp, Jürgen; Battey, James N. D.; Podvinec, Michael; Westbrook, John D.; Berman, Helen M.; Bordoli, Lorenza; Schwede, Torsten

    2008-01-01

    Structural Genomics has been successful in determining the structures of many unique proteins in a high throughput manner. Still, the number of known protein sequences is much larger than the number of experimentally solved protein structures. Homology (or comparative) modeling methods make use of experimental protein structures to build models for evolutionary related proteins. Thereby, experimental structure determination efforts and homology modeling complement each other in the exploratio...

  10. Modeling Frequency Comb Sources

    Directory of Open Access Journals (Sweden)

    Li Feng

    2016-06-01

    Full Text Available Frequency comb sources have revolutionized metrology and spectroscopy and found applications in many fields. Stable, low-cost, high-quality frequency comb sources are important to these applications. Modeling of frequency comb sources helps in understanding their operation mechanism and in optimizing their design. In this paper, we review the theoretical models used and the recent progress in the modeling of frequency comb sources.

  11. Garch-EVT Model

    OpenAIRE

    Altun, Emrah; Tatlidil, Hüseyin

    2016-01-01

    In this study, a wavelet-based GARCH-Extreme Value Theory (EVT) model is proposed to model financial return series and forecast daily value-at-risk. The wavelet-based GARCH-EVT is a hybrid model combining wavelet analysis and EVT. The proposed model consists of three stages. In the first stage, the return series is decomposed into wavelet series and an approximation series by applying the maximal overlap discrete wavelet transform. In the second stage, the detrended return series and the approximation series are obtained by using wave...
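
    A much reduced sketch of the GARCH-EVT part of such a pipeline is given below. The wavelet stage is deliberately omitted, the data are synthetic, and the use of the 'arch' package, the 10% tail threshold and the GARCH(1,1) order are assumptions of this illustration rather than choices taken from the paper. The sketch fits a GARCH(1,1), fits a generalised Pareto distribution to the lower tail of the standardised residuals, and combines the two into a one-day conditional value-at-risk.

      import numpy as np
      from scipy.stats import genpareto
      from arch import arch_model               # assumes the 'arch' package is installed

      def garch_evt_var(returns, alpha=0.01, tail_frac=0.10):
          """Reduced GARCH-EVT value-at-risk: GARCH(1,1) filter, then a generalised
          Pareto fit to the lower tail of the standardised residuals."""
          res = arch_model(returns * 100, vol="GARCH", p=1, q=1).fit(disp="off")
          z = res.resid / res.conditional_volatility     # standardised residuals
          losses = -z                                    # work with the loss tail
          u = np.quantile(losses, 1.0 - tail_frac)       # tail threshold
          xi, _, beta = genpareto.fit(losses[losses > u] - u, floc=0.0)
          z_q = u + (beta / xi) * ((tail_frac / alpha) ** xi - 1.0)   # GPD tail quantile
          sigma_next = res.conditional_volatility[-1]    # crude one-step volatility proxy
          return sigma_next * z_q / 100.0                # VaR as a positive loss fraction

      rng = np.random.default_rng(0)
      fake_returns = rng.standard_t(df=5, size=2000) * 0.01     # synthetic data only
      print(f"1% one-day VaR ~ {garch_evt_var(fake_returns):.4f}")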

  12. Liftoff Model for MELCOR.

    Energy Technology Data Exchange (ETDEWEB)

    Young, Michael F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    Aerosol particles that deposit on surfaces may be subsequently resuspended by air flowing over the surface. A review of models for this liftoff process is presented and compared to available data. Based on this review, a model that agrees with existing data and is readily computed is presented for incorporation into a system level code such as MELCOR.

  13. Beyond the standard model

    International Nuclear Information System (INIS)

    Wilczek, F.

    1993-01-01

    The standard model of particle physics is highly successful, although it is obviously not a complete or final theory. In this presentation the author argues that the structure of the standard model gives some quite concrete, compelling hints regarding what lies beyond. Essentially, this presentation is a record of the author's own judgement of what the central clues for physics beyond the standard model are, and also it is an attempt at some pedagogy. 14 refs., 6 figs

  14. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  15. AREST model description

    International Nuclear Information System (INIS)

    Engel, D.W.; McGrail, B.P.

    1993-11-01

    The Office of Civilian Radioactive Waste Management and the Power Reactor and Nuclear Fuel Development Corporation of Japan (PNC) have supported the development of the Analytical Repository Source-Term (AREST) model at Pacific Northwest Laboratory. AREST is a computer model developed to evaluate radionuclide release from an underground geologic repository. The AREST code can be used to calculate/estimate the amount and rate of each radionuclide that is released from the engineered barrier system (EBS) of the repository. The EBS is the man-made or disrupted area of the repository. AREST was designed as a system-level model to simulate the behavior of the total repository by combining process-level models for the release from an individual waste package or container. AREST contains primarily analytical models for calculating the release/transport of radionuclides to the host rock that surrounds each waste package. Analytical models were used because of their small computational overhead, which allows all the input parameters to be derived from statistical distributions. Recently, a one-dimensional numerical model was also incorporated into AREST to allow for more detailed modeling of the transport process with arbitrary-length decay chains. The next step in modeling the EBS is to develop a model that couples the probabilistic capabilities of AREST with a more detailed process model. This model will need to consider the reactive coupling of the processes that are involved in the release, including: (1) the dissolution of the waste form, (2) the geochemical modeling of the groundwater, (3) the corrosion of the container overpacking, and (4) the backfill material, to name a few. Several of these coupled processes are already incorporated in the current version of AREST.

  16. Vacuum inhomogeneous cosmological models

    International Nuclear Information System (INIS)

    Hanquin, J.-L.

    1984-01-01

    The author presents some results concerning the vacuum cosmological models which admit a 2-dimensional Abelian group of isometries: classifications of these space-times based on the topological nature of their space-like hypersurfaces and on their time evolution, and analysis of the asymptotic behaviour at spatial infinity for hyperbolic models as well as in the neighbourhood of the singularity for the models possessing a time singularity during their evolution. (Auth.)

  17. Beyond the standard model

    International Nuclear Information System (INIS)

    Pleitez, V.

    1994-01-01

    The search for physics laws beyond the standard model is discussed in a general way, together with some topics on supersymmetry theories. Recent possibilities arising in the leptonic sector are then addressed. Finally, models with SU(3)_C x SU(2)_L x U(1)_Y symmetry are considered as alternatives for extensions of the standard model of elementary particles. 36 refs., 1 fig., 4 tabs

  18. Transgenesis for pig models

    OpenAIRE

    Yum, Soo-Young; Yoon, Ki-Young; Lee, Choong-Il; Lee, Byeong-Chun; Jang, Goo

    2016-01-01

    Animal models, particularly pigs, have come to play an important role in translational biomedical research. Many pig models with genetic modifications have been produced via somatic cell nuclear transfer (SCNT). However, because most transgenic pigs have to date been produced by random integration, the need has been raised for more exact gene-mutated models using recombinase-based conditional gene expression, as in mice. Currently, advanced genome-editing technologies enable us to generat...

  19. Wind power prediction models

    Science.gov (United States)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
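
    One standard way to produce hourly speed samples that honour both a target marginal distribution and hour-to-hour correlation is to drive an AR(1) Gaussian process through the normal CDF and an inverse Weibull CDF. The sketch below uses this generic translation-process approach with made-up shape, scale and correlation parameters; it is not the Goldstone model developed in the paper.

      import numpy as np
      from scipy.stats import norm, weibull_min

      def simulate_wind(n_hours, k=2.0, c=7.0, rho=0.85, seed=0):
          """Correlated hourly wind speeds: AR(1) Gaussian driver mapped through
          the normal CDF and the inverse Weibull CDF (shape k, scale c in m/s).
          Parameter values are illustrative, not site data."""
          rng = np.random.default_rng(seed)
          z = np.empty(n_hours)
          z[0] = rng.standard_normal()
          for t in range(1, n_hours):
              z[t] = rho * z[t - 1] + np.sqrt(1.0 - rho**2) * rng.standard_normal()
          u = norm.cdf(z)                        # uniform marginals, correlation kept
          return weibull_min.ppf(u, k, scale=c)  # target Weibull marginal distribution

      speeds = simulate_wind(24 * 30)
      print(f"mean {speeds.mean():.2f} m/s, lag-1 autocorrelation "
            f"{np.corrcoef(speeds[:-1], speeds[1:])[0, 1]:.2f}")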

  20. Models of molecular geometry.

    Science.gov (United States)

    Gillespie, Ronald J; Robinson, Edward A

    2005-05-01

    Although the structure of almost any molecule can now be obtained by ab initio calculations chemists still look for simple answers to the question "What determines the geometry of a given molecule?" For this purpose they make use of various models such as the VSEPR model and qualitative quantum mechanical models such as those based on the valence bond theory. The present state of such models, and the support for them provided by recently developed methods for analyzing calculated electron densities, are reviewed and discussed in this tutorial review.

  1. Croatian Cadastre Database Modelling

    Directory of Open Access Journals (Sweden)

    Zvonko Biljecki

    2013-04-01

    Full Text Available The Cadastral Data Model has been developed as a part of a larger programme to improve the products and production environment of the Croatian Cadastral Service of the State Geodetic Administration (SGA). The goal of the project was to create a cadastral data model conforming to the relevant standards and specifications in the field of geoinformation (GI) adopted by the international organisations for standardisation competent for GI (ISO TC211 and OpenGIS) and their implementations. The main guidelines during the project have been object-oriented conceptual modelling of the updated users' requests and a "new" cadastral data model designed by the SGA - Faculty of Geodesy - Geofoto LLC project team. The UML conceptual model is given for all feature categories and is described only at the class level. The next step was the UML technical model, which was developed from the UML conceptual model. The technical model integrates different UML schemas in one united schema. XML (eXtensible Markup Language) was applied for the XML description of the UML models, and then the XML schema was transferred into a GML (Geography Markup Language) application schema. With this procedure we have completely described the behaviour of each cadastral feature and the rules for the transfer and storage of cadastral features into the database.

  2. Graphical Models with R

    CERN Document Server

    Højsgaard, Søren; Lauritzen, Steffen

    2012-01-01

    Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In add

  3. Models of Reality.

    Energy Technology Data Exchange (ETDEWEB)

    Brown-VanHoozer, S. A.

    1999-06-02

    Conscious awareness of our environment is based on a feedback loop comprised of sensory input transmitted to the central nervous system leading to construction of our "model of the world" (Lewis et al., 1982). We then assimilate the neurological model at the unconscious level into information we can later consciously consider useful in identifying belief systems and behaviors for designing diverse systems. Thus, we can avoid potential problems based on our open-to-error perceived reality of the world. By understanding how our model of reality is organized, we allow ourselves to transcend content and develop insight into how effective choices and belief systems are generated through sensory derived processes. These are the processes which provide the designer the ability to meta model (build a model of a model) the user; consequently, matching the mental model of the user with that of the designer and, coincidentally, forming rapport between the two participants. The information shared between the participants is neither assumed nor generalized, it is closer to equivocal; thus minimizing error through a sharing of each other's model of reality. How to identify individual mental mechanisms or processes, how to organize the individual strategies of these mechanisms into useful patterns, and how to formulate these into models for success and knowledge based outcomes is the subject of the discussion that follows.

  4. Dynamic Latent Classification Model

    DEFF Research Database (Denmark)

    Zhong, Shengtong; Martínez, Ana M.; Nielsen, Thomas Dyhre

    Monitoring a complex process often involves keeping an eye on hundreds or thousands of sensors to determine whether or not the process is under control. We have been working with dynamic data from an oil production facility in the North Sea, where unstable situations should be identified as soon as possible. Motivated by this problem setting, we propose a generative model for dynamic classification in continuous domains. At each time point the model can be seen as combining a naive Bayes model with a mixture of factor analyzers (FA). The latent variables of the FA are used to capture the dynamics in the process as well as modeling dependences between attributes.

  5. Diffeomorphic Statistical Deformation Models

    DEFF Research Database (Denmark)

    Hansen, Michael Sass; Hansen, Mads/Fogtman; Larsen, Rasmus

    2007-01-01

    In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al. The modifications ensure that no boundary restriction has to be enforced on the parameter space to prevent folds or tears in the deformation field. For straightforward statistical analysis, principal component analysis and sparse methods, we assume that the parameters for a class of deformations lie on a linear...

  6. Inside - Outside Model Viewing

    DEFF Research Database (Denmark)

    Nikolov, Ivan Adriyanov

    2016-01-01

    components of the model, their proportions compared to each other and the overall design. A variety of augmented reality (AR) applications have been created for overall visualization of large scale models. For tours inside 3D renderings of models many immersive virtual reality (VR) applications exist. Both types of applications have their limitations, omitting either important details in the AR case or the full picture in the case of VR. This paper presents a low-cost way to demonstrate models using a hybrid virtual environment system (HVE), combining virtual reality and augmented reality visualization...

  7. Ground water modelling

    International Nuclear Information System (INIS)

    Leino-Forsman, H.; Olin, M.

    1991-01-01

    The first Seminar on Groundwater Modelling was arranged by VTT (Reactor Laboratory) in Espoo, Finland, in May 1991. The one-day seminar dealt with the modelling of groundwater geochemistry and transport, as well as with mathematical methods for modelling. The seminar concentrated on giving a broad picture of the applications of groundwater modelling, e.g. nuclear waste, groundwater resources (including artificial groundwater) and pollution. The participants came from research institutes and universities as well as engineering companies. Articles are published in Finnish with English abstracts.

  8. Business Model Alternations

    OpenAIRE

    Zagorsek, Branislav

    2013-01-01

    The purpose of this paper is to present possible ways in which a company could maintain or even gain competitive advantage in a highly dynamic business environment from the perspective of business models. After a short introduction on the evolution of innovation, the paper is divided into three parts. The first part discusses the business model itself, how to design a business model and how to deal with it. The second part discusses business model innovations. When and how to innovate or reinvent your bus...

  9. Computer Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Pronskikh, V. S. [Fermilab

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model’s general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  10. Particle bed reactor modeling

    Science.gov (United States)

    Sapyta, Joe; Reid, Hank; Walton, Lew

    The topics are presented in viewgraph form and include the following: particle bed reactor (PBR) core cross section; PBR bleed cycle; fuel and moderator flow paths; PBR modeling requirements; characteristics of PBR and nuclear thermal propulsion (NTP) modeling; challenges for PBR and NTP modeling; thermal hydraulic computer codes; capabilities for PBR/reactor application; thermal/hydraulic codes; limitations; physical correlations; comparison of predicted friction factor and experimental data; frit pressure drop testing; cold frit mask factor; decay heat flow rate; startup transient simulation; and philosophy of systems modeling.

  11. MARKETING MODELS APPLICATION EXPERIENCE

    Directory of Open Access Journals (Sweden)

    A. Yu. Rymanov

    2011-01-01

    Full Text Available Marketing models are used for the assessment of such marketing elements as sales volume, market share, market attractiveness, advertising costs, product pushing and selling, profit and profitability. A classification of buying-process decision-taking models is presented. SWOT- and GAP-based models are best for selling assessments. Lately, there is a tendency to shift from assessment on the basis of financial indices to assessment on the basis of non-financial ones. From the marketing viewpoint, most important are long-term company activities and consumer-drawing models as well as market attractiveness operative models.

  12. Multifamily Envelope Leakage Model

    Energy Technology Data Exchange (ETDEWEB)

    Faakye, O. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States); Griffiths, D. [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-05-01

    The objective of the 2013 research project was to develop the model for predicting fully guarded test results (FGT), using unguarded test data and specific building features of apartment units. The model developed has a coefficient of determination R2 value of 0.53 with a root mean square error (RMSE) of 0.13. Both statistical metrics indicate that the model is relatively strong. When tested against data that was not included in the development of the model, prediction accuracy was within 19%, which is reasonable given that seasonal differences in blower door measurements can vary by as much as 25%.
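
    For readers unfamiliar with the quoted metrics, the sketch below shows how a multivariate linear model and its R2 and RMSE could be computed with scikit-learn. The predictors, the synthetic data and the resulting numbers are entirely made up and will not reproduce the 0.53 and 0.13 figures from the report.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.metrics import mean_squared_error, r2_score

      rng = np.random.default_rng(1)
      n = 120
      X = np.column_stack([
          rng.uniform(0.2, 1.5, n),     # unguarded leakage per unit area (made up)
          rng.uniform(1, 4, n),         # number of exposed facade sides (made up)
          rng.uniform(500, 1500, n),    # unit floor area, sq ft (made up)
      ])
      y = 0.6 * X[:, 0] - 0.05 * X[:, 1] + 1e-4 * X[:, 2] + rng.normal(0, 0.1, n)

      model = LinearRegression().fit(X, y)
      pred = model.predict(X)
      print(f"R^2  = {r2_score(y, pred):.2f}")
      print(f"RMSE = {mean_squared_error(y, pred) ** 0.5:.2f}")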

  13. Linear models with R

    CERN Document Server

    Faraway, Julian J

    2014-01-01

    A Hands-On Way to Learning Data Analysis. Part of the core of statistics, linear models are used to make predictions and explain the relationship between the response and the predictors. Understanding linear models is crucial to a broader competence in the practice of statistics. Linear Models with R, Second Edition explains how to use linear models in physical science, engineering, social science, and business applications. The book incorporates several improvements that reflect how the world of R has greatly expanded since the publication of the first edition. New to the Second Edition: Reorganiz

  14. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  15. Electrical load modeling

    Energy Technology Data Exchange (ETDEWEB)

    Valgas, Helio Moreira; Pinto, Roberto del Giudice R.; Franca, Carlos [Companhia Energetica de Minas Gerais (CEMIG), Belo Horizonte, MG (Brazil); Lambert-Torres, Germano; Silva, Alexandre P. Alves da; Pires, Robson Celso; Costa Junior, Roberto Affonso [Escola Federal de Engenharia de Itajuba, MG (Brazil)

    1994-12-31

    Accurate dynamic load models allow more precise calculations of power system controls and stability limits, which are critical mainly in the operation planning of power systems. This paper describes the development of a computer program (software) for static and dynamic load model studies using the measurement approach for the CEMIG system. Two dynamic load model structures are developed and tested. A procedure for applying a set of measured data from an on-line transient recording system to develop load models is described. (author) 6 refs., 17 figs.
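
    As a pointer to what a static load model looks like in practice, here is the common ZIP (constant impedance, constant current, constant power) formulation. The structure is textbook material and the fractions in the example are arbitrary, so this is not the model identified for the CEMIG system, and the dynamic model structures developed in the paper are not shown.

      def zip_load(V, P0, a_z, a_i, a_p, V0=1.0):
          """Static ZIP load model: constant-impedance, constant-current and
          constant-power fractions (a_z + a_i + a_p = 1), voltages in per unit."""
          v = V / V0
          return P0 * (a_z * v**2 + a_i * v + a_p)

      # Active power of a 100 MW load at 0.95 pu voltage, illustrative fractions
      print(f"{zip_load(0.95, 100.0, 0.4, 0.3, 0.3):.1f} MW")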

  16. Modeling Asset Price Dynamics

    Directory of Open Access Journals (Sweden)

    Ranasinghe P. K. C. Malmini

    2008-09-01

    Full Text Available We model price prediction in the Sri Lankan stock market using the Ising model and some recent developments in statistical physics techniques. In contrast to usual agent models, the influence does not flow inward from the surrounding neighbors to the centre, but spreads outward from the center to the neighbors. Monte Carlo simulations were used to study this problem. The analysis was based on the All Share Price Index and the Milanka Price Index of the Colombo Stock Exchange and a simulated price process. The monthly and daily influences of the above indices on the Sri Lankan economy were also investigated. The model thus describes the spread of opinions among traders.
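
    To make the Ising analogy concrete, the sketch below runs a standard Metropolis simulation of a small two-dimensional lattice of buy/sell "opinions" and lets the net magnetisation drive a log-price update. It is a generic spin-market toy: the outward-spreading influence rule that distinguishes the paper's model is not reproduced, and the lattice size, temperature and price coupling are arbitrary choices.

      import numpy as np

      def metropolis_sweep(spins, beta, rng):
          """One Metropolis sweep of a 2-D Ising lattice of +/-1 'opinions'."""
          n = spins.shape[0]
          for _ in range(n * n):
              i, j = rng.integers(0, n, size=2)
              nb = (spins[(i + 1) % n, j] + spins[(i - 1) % n, j]
                    + spins[i, (j + 1) % n] + spins[i, (j - 1) % n])
              dE = 2.0 * spins[i, j] * nb
              if dE <= 0 or rng.random() < np.exp(-beta * dE):
                  spins[i, j] *= -1
          return spins

      rng = np.random.default_rng(2)
      spins = rng.choice([-1, 1], size=(32, 32))
      log_price, lam = 0.0, 0.05
      for _ in range(200):
          spins = metropolis_sweep(spins, beta=0.4, rng=rng)
          log_price += lam * spins.mean()        # excess demand ~ net magnetisation
      print(f"final simulated price: {np.exp(log_price):.3f}")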

  17. VBR video traffic models

    CERN Document Server

    Tanwir, Savera

    2014-01-01

    There has been a phenomenal growth in video applications over the past few years. An accurate traffic model of Variable Bit Rate (VBR) video is necessary for performance evaluation of a network design and for generating synthetic traffic that can be used for benchmarking a network. A large number of models for VBR video traffic have been proposed in the literature for different types of video in the past 20 years. Here, the authors have classified and surveyed these models and have also evaluated the models for H.264 AVC and MVC encoded video and discussed their findings.
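
    Among the model families such surveys cover, autoregressive processes are the simplest; the sketch below generates correlated frame sizes from an AR(1) process with illustrative parameters. It is a generic example of the class, not a model fitted to H.264 AVC or MVC traces.

      import numpy as np

      def ar1_frame_sizes(n_frames, mean=30000.0, std=8000.0, rho=0.9, seed=3):
          """AR(1) frame-size generator, one of the classical VBR source models;
          parameters here are illustrative and given in bytes."""
          rng = np.random.default_rng(seed)
          sizes = np.empty(n_frames)
          sizes[0] = mean
          for t in range(1, n_frames):
              eps = rng.normal(0.0, std * np.sqrt(1.0 - rho**2))
              sizes[t] = mean + rho * (sizes[t - 1] - mean) + eps
          return np.clip(sizes, 1000.0, None)    # keep frame sizes positive

      print(ar1_frame_sizes(5).round())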

  18. Coronal Magnetic Field Models

    Science.gov (United States)

    Wiegelmann, Thomas; Petrie, Gordon J. D.; Riley, Pete

    2017-09-01

    Coronal magnetic field models use photospheric field measurements as the boundary condition to model the solar corona. We review in this paper the most common model assumptions, starting from MHD models, through magnetohydrostatics and force-free models, and finally potential field models. Each model in this list is somewhat less complex than the previous one and makes more restrictive assumptions by neglecting physical effects. The magnetohydrostatic approach neglects time-dependent phenomena and plasma flows; the force-free approach additionally neglects the gradient of the plasma pressure and the gravity force. This leads to the assumption of a vanishing Lorentz force, and electric currents are parallel (or anti-parallel) to the magnetic field lines. Finally, the potential field approach neglects these currents as well. We outline the main assumptions, benefits and limitations of these models both from a theoretical viewpoint (how realistic are the models?) and a practical one (which computer resources do we need?). Finally we address the important problem of noisy and inconsistent photospheric boundary conditions and the possibility of using chromospheric and coronal observations to improve the models.
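
    The simplest member of the hierarchy, the potential (current-free) field, can be extrapolated from a photospheric boundary with a few lines of Fourier analysis: with vanishing currents the vertical field satisfies Laplace's equation, so each horizontal Fourier mode of Bz decays with height as exp(-|k| z). The sketch below applies this standard construction to a toy bipolar "magnetogram"; the grid, units and field values are made up and no real solar data or production extrapolation code is involved.

      import numpy as np

      def potential_field_bz(bz0, z_heights, dx=1.0):
          """Current-free (potential) extrapolation of the vertical field above a
          doubly periodic boundary: each Fourier mode of Bz decays as exp(-|k| z)."""
          ny, nx = bz0.shape
          kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
          ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
          k = np.sqrt(kx[None, :]**2 + ky[:, None]**2)
          bz_hat = np.fft.fft2(bz0)
          return [np.real(np.fft.ifft2(bz_hat * np.exp(-k * z))) for z in z_heights]

      # Toy "magnetogram": a bipole on a 64x64 grid (not solar data)
      x = np.linspace(-5, 5, 64)
      X, Y = np.meshgrid(x, x)
      bz0 = np.exp(-((X - 1)**2 + Y**2)) - np.exp(-((X + 1)**2 + Y**2))
      for z, bz in zip([0.0, 1.0, 3.0], potential_field_bz(bz0, [0.0, 1.0, 3.0])):
          print(f"z = {z:3.1f}: max |Bz| = {np.abs(bz).max():.3f}")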

  19. Selected System Models

    Science.gov (United States)

    Schmidt-Eisenlohr, F.; Puñal, O.; Klagges, K.; Kirsche, M.

    Apart from the general issue of modeling the channel, the PHY and the MAC of wireless networks, there are specific modeling assumptions that are considered for different systems. In this chapter we consider three specific wireless standards and highlight modeling options for them. These are IEEE 802.11 (as example for wireless local area networks), IEEE 802.16 (as example for wireless metropolitan networks) and IEEE 802.15 (as example for body area networks). Each section on these three systems discusses also at the end a set of model implementations that are available today.

  20. Chiral bag model

    International Nuclear Information System (INIS)

    Musakhanov, M.M.

    1980-01-01

    The chiral bag model is considered. It is suggested that pions interact only with the surface of a quark "bag" and do not penetrate inside. In the case of a large bag the pion field is rather weak, and one is led to the linearized chiral bag model. Within that model the baryon mass spectrum, the β-decay axial constant, the magnetic moments of baryons, and the pion-baryon coupling constants and their form factors are calculated. It is shown that pion corrections to the calculations within the chiral bag model are essential. The obtained results are found to be in reasonable agreement with the experimental data.

  1. SME International Business Models

    DEFF Research Database (Denmark)

    Child, John; Hsieh, Linda; Elbanna, Said

    2017-01-01

    This paper addresses two questions through a study of 180 SMEs located in contrasting industry and home country contexts. First, which business models for international markets prevail among SMEs and do they configure into different types? Second, which factors predict the international business models that SMEs follow? Three distinct international business models (traditional market-adaptive, technology exploiter, and ambidextrous explorer) are found among the SMEs studied. The likelihood of SMEs adopting one business model rather than another is to a high degree predictable with reference to a small set of factors: industry, level of home economy development, and decision-maker international experience.

  2. Modeling Optical Lithography Physics

    Science.gov (United States)

    Neureuther, Andrew R.; Rubinstein, Juliet; Chin, Eric; Wang, Lynn; Miller, Marshal; Clifford, Chris; Yamazoe, Kenji

    2010-06-01

    Key physical phenomena associated with resists, illumination, lenses and masks are used to show the progress in models and algorithms for modeling optical projection printing as well as current simulation challenges in managing process complexity for manufacturing. The amazing current capability and challenges for projection printing are discussed using the 22 nm device generation. A fundamental foundation for modeling resist exposure, partial coherent imaging and defect printability is given. The technology innovations of resolution enhancement and chemically amplified resist systems and their modeling challenges are overviewed. Automated chip-level applications in pattern pre-compensation and design-anticipation of residual process variations require new simulation approaches.

  3. Turbine stage model

    International Nuclear Information System (INIS)

    Kazantsev, A.A.

    2009-01-01

    A model of a turbine stage for calculations of NPP turbine department dynamics in real time was developed. The simulation results were compared with manufacturer calculations for NPP low-speed and fast turbines. The comparison results have shown that the model is valid for real-time simulation of all modes of turbine operation. The model allows calculating turbine stage parameters with 1% accuracy. It was shown that the developed turbine stage model meets the accuracy requirements if the blade setting angles for all turbine stages are available.

  4. Mixed models for predictive modeling in actuarial science

    NARCIS (Netherlands)

    Antonio, K.; Zhang, Y.

    2012-01-01

    We start with a general discussion of mixed (also called multilevel) models and continue with illustrating specific (actuarial) applications of this type of models. Technical details on (linear, generalized, non-linear) mixed models follow: model assumptions, specifications, estimation techniques

  5. Model Based Temporal Reasoning

    Science.gov (United States)

    Rabin, Marla J.; Spinrad, Paul R.; Fall, Thomas C.

    1988-03-01

    Systems that assess the real world must cope with evidence that is uncertain, ambiguous, and spread over time. Typically, the most important function of an assessment system is to identify when activities are occurring that are unusual or unanticipated. Model based temporal reasoning addresses both of these requirements. The differences among temporal reasoning schemes lie in the methods used to avoid computational intractability. If we had n pieces of data and we wanted to examine how they were related, the worst case would be where we had to examine every subset of these points to see if that subset satisfied the relations. This would be 2^n, which is intractable. Models compress this; if several data points are all compatible with a model, then that model represents all those data points. Data points are then considered related if they lie within the same model or if they lie in models that are related. Models thus address the intractability problem. They also address the problem of determining unusual activities: if the data do not agree with models that are indicated by earlier data, then something out of the norm is taking place. The models can summarize what we know up to that time, so when they are not predicting correctly, either something unusual is happening or we need to revise our models. The model based reasoner developed at Advanced Decision Systems is thus both intuitive and powerful. It is currently being used on one operational system and several prototype systems. It has enough power to be used in domains spanning the spectrum from manufacturing engineering and project management to low-intensity conflict and strategic assessment.

  6. Why business models matter.

    Science.gov (United States)

    Magretta, Joan

    2002-05-01

    "Business model" was one of the great buzz-words of the Internet boom. A company didn't need a strategy, a special competence, or even any customers--all it needed was a Web-based business model that promised wild profits in some distant, ill-defined future. Many people--investors, entrepreneurs, and executives alike--fell for the fantasy and got burned. And as the inevitable counterreaction played out, the concept of the business model fell out of fashion nearly as quickly as the .com appendage itself. That's a shame. As Joan Magretta explains, a good business model remains essential to every successful organization, whether it's a new venture or an established player. To help managers apply the concept successfully, she defines what a business model is and how it complements a smart competitive strategy. Business models are, at heart, stories that explain how enterprises work. Like a good story, a robust business model contains precisely delineated characters, plausible motivations, and a plot that turns on an insight about value. It answers certain questions: Who is the customer? How do we make money? What underlying economic logic explains how we can deliver value to customers at an appropriate cost? Every viable organization is built on a sound business model, but a business model isn't a strategy, even though many people use the terms interchangeably. Business models describe, as a system, how the pieces of a business fit together. But they don't factor in one critical dimension of performance: competition. That's the job of strategy. Illustrated with examples from companies like American Express, EuroDisney, WalMart, and Dell Computer, this article clarifies the concepts of business models and strategy, which are fundamental to every company's performance.

  7. Comparing root architectural models

    Science.gov (United States)

    Schnepf, Andrea; Javaux, Mathieu; Vanderborght, Jan

    2017-04-01

    Plant roots play an important role in several soil processes (Gregory 2006). Root architecture development determines the sites in soil where roots provide input of carbon and energy and take up water and solutes. However, root architecture is difficult to determine experimentally when grown in opaque soil. Thus, root architectural models have been widely used and been further developed into functional-structural models that are able to simulate the fate of water and solutes in the soil-root system (Dunbabin et al. 2013). Still, a systematic comparison of the different root architectural models is missing. In this work, we focus on discrete root architecture models where roots are described by connected line segments. These models differ (a) in their model concepts, such as the description of distance between branches based on a prescribed distance (inter-nodal distance) or based on a prescribed time interval. Furthermore, these models differ (b) in the implementation of the same concept, such as the time step size, the spatial discretization along the root axes or the way stochasticity of parameters such as root growth direction, growth rate, branch spacing, branching angles are treated. Based on the example of two such different root models, the root growth module of R-SWMS and RootBox, we show the impact of these differences on simulated root architecture and aggregated information computed from this detailed simulation results, taking into account the stochastic nature of those models. References Dunbabin, V.M., Postma, J.A., Schnepf, A., Pagès, L., Javaux, M., Wu, L., Leitner, D., Chen, Y.L., Rengel, Z., Diggle, A.J. Modelling root-soil interactions using three-dimensional models of root growth, architecture and function (2013) Plant and Soil, 372 (1-2), pp. 93 - 124. Gregory (2006) Roots, rhizosphere and soil: the route to a better understanding of soil science? European Journal of Soil Science 57: 2-12.
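
    To give a flavour of the connected-line-segment representation the comparison is about, here is a deliberately tiny toy: a single primary axis grown segment by segment with a stochastic growth direction, spawning laterals at a prescribed inter-nodal distance. All parameters are invented; the sketch implements neither R-SWMS nor RootBox, and lateral axes are only recorded, not grown.

      import numpy as np

      def grow_root(n_segments=40, seg_len=0.5, branch_every=8, sigma=0.2, seed=4):
          """Toy discrete root model: connected 2-D line segments with a random
          change of growth direction per segment and a distance-based branching rule."""
          rng = np.random.default_rng(seed)
          segments, laterals = [], []
          pos, angle = np.zeros(2), -np.pi / 2          # start at origin, grow downward
          for i in range(1, n_segments + 1):
              angle += rng.normal(0.0, sigma)           # stochastic direction change
              new = pos + seg_len * np.array([np.cos(angle), np.sin(angle)])
              segments.append((pos.copy(), new))
              if i % branch_every == 0:                 # prescribed inter-nodal distance
                  laterals.append((new.copy(), angle + rng.choice([-1, 1]) * np.pi / 3))
              pos = new
          return segments, laterals

      segs, lats = grow_root()
      print(f"{len(segs)} primary segments, {len(lats)} lateral branch points")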

  8. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  9. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  10. Walking technicolor models

    International Nuclear Information System (INIS)

    King, S.F.

    1989-01-01

    Recent work on technicolor theories with small β-functions has shown that the flavour changing neutral current problem which besets any realistic extended technicolor model might be solved by Holdom's original suggestion of raising the extended technicolor scales. In this paper we apply these field theoretic ideas to the problem of constructing a realistic model of the quark and lepton mass spectrum. We discuss two closely related models: (1) an extended technicolor model based on the gauge group SO(10)_ETC x SO(10)_GUT; (2) a composite/elementary extended technicolor model based on the gauge group SO(10)_MC x SO(10)_ETC x SU(5)_GUT. Model (1) is relatively simple, and contains three families of quarks and leptons plus an SO(7)_TC family of technifermions. The technicolor sector corresponds to one of the examples of walking technicolor discussed by Appelquist et al. The model is fully discussed with particular emphasis on the resulting quark and lepton mass spectrum. Charged lepton masses are adequately described, but the quark masses are degenerate in pairs with zero mixing angles. Model (2) shares the desirable low energy spectrum of Model (1) but in addition provides a mechanism for enhancing the mass of u-type quarks relative to d-type quarks, based on non-perturbative compositeness corrections. We discuss these compositeness corrections, as far as a perturbative treatment allows, and develop techniques for calculating quark masses and mixing angles. We apply these techniques to the first two families of quarks, and are encouraged to find that we can reproduce the observed features of u-d mass inversion for the first family, and Cabibbo mixing. Model (2) leads to the prediction of D^0 - anti-D^0 mixing, K_L → e^± μ^∓, K^+ → π^+ e^- μ^+, all at rates close to current experimental limits. The model also predicts three families and a top quark mass m_t ≅ 50 GeV. (orig.)

  11. Existing Model Metrics and Relations to Model Quality

    OpenAIRE

    Mohagheghi, Parastoo; Dehlen, Vegard

    2009-01-01

    This paper presents quality goals for models and provides a state-of-the-art analysis regarding model metrics. While model-based software development often requires assessing the quality of models at different abstraction and precision levels and developed for multiple purposes, existing work on model metrics does not reflect this need. Model size metrics are descriptive and may be used for comparing models, but their relation to model quality is not well-defined. Code metrics are proposed to be ...

  12. Multiple Model Approaches to Modelling and Control,

    DEFF Research Database (Denmark)

    appeal in building systems which operate robustly over a wide range of operating conditions by decomposing them into a number of simpler linear modelling or control problems, even for nonlinear modelling or control problems. This appeal has been a factor in the development of increasingly popular `local... to problems in the process industries, biomedical applications and autonomous systems. The successful application of the ideas to demanding problems is already encouraging, but creative development of the basic framework is needed to better allow the integration of human knowledge with automated learning... The underlying question is `How should we partition the system - what is `local'?'. This book presents alternative ways of bringing submodels together, which lead to varying levels of performance and insight. Some are further developed for autonomous learning of parameters from data, while others have focused...

  13. Spiral model pilot project information model

    Science.gov (United States)

    1991-01-01

    The objective was an evaluation of the Spiral Model (SM) development approach to allow NASA Marshall to develop an experience base of that software management methodology. A discussion is presented of the Information Model (IM) that was used as part of the SM methodology. A key concept of the SM is the establishment of an IM to be used by management to track the progress of a project. The IM is the set of metrics that is to be measured and reported throughout the life of the project. These metrics measure both the product and the process to ensure the quality of the final delivery item and to ensure the project met programmatic guidelines. The beauty of the SM, along with the IM, is the ability to measure not only the correctness of the specification and implementation of the requirements but to also obtain a measure of customer satisfaction.

  14. Modeling Water Filtration

    Science.gov (United States)

    Parks, Melissa

    2014-01-01

    Model-eliciting activities (MEAs) are not new to those in engineering or mathematics, but they were new to Melissa Parks. Model-eliciting activities are simulated real-world problems that integrate engineering, mathematical, and scientific thinking as students find solutions for specific scenarios. During this process, students generate solutions…

  15. Model Children's Code.

    Science.gov (United States)

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  16. Archaeological predictive model set.

    Science.gov (United States)

    2015-03-01

    This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...

  17. Tools for Model Evaluation

    DEFF Research Database (Denmark)

    Olesen, H. R.

    1998-01-01

    Proceedings of the Twenty-Second NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held June 6-10, 1997, in Clermont-Ferrand, France.

  18. Modeling volcanic ash dispersal

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    The assessment of volcanic fallout hazard is an important scientific, economic, and political issue, especially in densely populated areas. From a scientific point of view, considerable progress has been made during the last two decades through the use of increasingly powerful computational models and capabilities. Nowadays, models are used to quantify hazard...

  19. (Non) linear regression modelling

    NARCIS (Netherlands)

    Cizek, P.; Gentle, J.E.; Hardle, W.K.; Mori, Y.

    2012-01-01

    We will study causal relationships of a known form between random variables. Given a model, we distinguish one or more dependent (endogenous) variables Y = (Y1,…,Yl), l ∈ N, which are explained by a model, and independent (exogenous, explanatory) variables X = (X1,…,Xp),p ∈ N, which explain or

  20. Symbol: artefact and model

    DEFF Research Database (Denmark)

    Galle, Per

    2000-01-01

    In preparation of an analysis of product modelling in terms of communication, this report presents a brief analysis of symbols; that is, the entities by means of which communication takes place. Symbols are defined in such a way as to admit artefacts and models (the latter including linguistic...