WorldWideScience

Sample records for model representing quality

  1. Quality Reporting of Multivariable Regression Models in Observational Studies: Review of a Representative Sample of Articles Published in Biomedical Journals.

    Science.gov (United States)

    Real, Jordi; Forné, Carles; Roso-Llorach, Albert; Martínez-Sánchez, Jose M

    2016-05-01

    Controlling for confounders is a crucial step in analytical observational studies, and multivariable models are widely used as statistical adjustment techniques. However, the validation of the assumptions of the multivariable regression models (MRMs) should be made clear in scientific reporting. The objective of this study is to review the quality of statistical reporting of the most commonly used MRMs (logistic, linear, and Cox regression) that were applied in analytical observational studies published between 2003 and 2014 by journals indexed in MEDLINE. We reviewed a representative sample of articles indexed in MEDLINE (n = 428) with an observational design and use of MRMs (logistic, linear, and Cox regression). We assessed the quality of reporting of: model assumptions and goodness-of-fit, interactions, sensitivity analysis, crude and adjusted effect estimates, and specification of more than one adjusted model. The tests of underlying assumptions or goodness-of-fit of the MRMs used were described in 26.2% (95% CI: 22.0-30.3) of the articles, and 18.5% (95% CI: 14.8-22.1) reported the interaction analysis. Reporting of all items assessed was higher in articles published in journals with a higher impact factor. A low percentage of articles indexed in MEDLINE that used multivariable techniques provided information demonstrating rigorous application of the model selected as an adjustment method. Given the importance of these methods to the final results and conclusions of observational studies, greater rigor is required in reporting the use of MRMs in the scientific literature.
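
    The reported confidence intervals are consistent with a normal-approximation (Wald) interval for a proportion with n = 428. A minimal sketch of that check (point estimates taken from the abstract):

        # Sketch: reproduce the reported 95% CIs with a Wald interval (n = 428).
        from math import sqrt

        def wald_ci(p, n, z=1.96):
            """Normal-approximation 95% CI for a proportion."""
            half_width = z * sqrt(p * (1.0 - p) / n)
            return p - half_width, p + half_width

        for label, p in [("assumptions/goodness-of-fit reported", 0.262),
                         ("interaction analysis reported", 0.185)]:
            lo, hi = wald_ci(p, 428)
            print(f"{label}: {p:.1%} (95% CI: {lo:.1%} to {hi:.1%})")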

  2. Representativeness of air quality monitoring networks

    NARCIS (Netherlands)

    Duyzer, J.; Hout, D. van den; Zandveld, P.; Ratingen, S. van

    2015-01-01

    The suitability of European networks to check compliance with air quality standards and to assess exposure of the population was investigated. An air quality model (URBIS) was applied to estimate and compare the spatial distribution of the concentration of nitrogen dioxide (NO2) in ambient air in

  3. Quality modelling

    NARCIS (Netherlands)

    Tijskens, L.M.M.

    2003-01-01

    For modelling product behaviour with respect to quality for users and consumers, it is essential to have at least a fundamental notion of what quality really is, and of which product properties determine the quality assigned by the consumer to a product. In other words: what is allowed and what is to be

  4. Representing uncertainty on model analysis plots

    Science.gov (United States)

    Smith, Trevor I.

    2016-12-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.

  5. Representing Context in Hypermedia Data Models

    DEFF Research Database (Denmark)

    Hansen, Frank Allan

    2005-01-01

    As computers and software systems move beyond the desktop and into the physical environments we live and work in, the systems are required to adapt to these environments and the activities taking place within them. Making applications context-aware and representing context information alongside application data can be a challenging task. This paper describes how digital context traditionally has been represented in hypermedia data models and how this representation can scale to also represent physical context. The HyCon framework and data model, designed for the development of mobile context...

  6. STATISTICAL MODELS OF REPRESENTING INTELLECTUAL CAPITAL

    Directory of Open Access Journals (Sweden)

    Andreea Feraru

    2016-07-01

    Full Text Available This article, entitled Statistical Models of Representing Intellectual Capital, approaches and analyses the concept of intellectual capital, as well as the main models which can support entrepreneurs/managers in evaluating and quantifying the advantages of intellectual capital. Most authors examine intellectual capital from a static perspective and focus on the development of its various evaluation models. In this chapter we surveyed the classical static models: Sveiby, Edvisson, Balanced Scorecard, as well as the canonical model of intellectual capital. Among the group of static models for evaluating organisational intellectual capital, the canonical model stands out. This model structures organisational intellectual capital into human capital, structural capital and relational capital. Although the model is widespread, it is a static one and can thus introduce a series of errors in the process of evaluation, because the three entities mentioned above are not independent from the viewpoint of their contents, as any logic of structuring complex entities would require.

  7. Do regional climate models represent regional climate?

    Science.gov (United States)

    Maraun, Douglas; Widmann, Martin

    2014-05-01

    When using climate change scenarios - either from global climate models or further downscaled - to assess localised real-world impacts, one has to ensure that the local simulation indeed correctly represents the real-world local climate. Representativeness has so far mainly been discussed as a scale issue: simulated meteorological variables in general represent grid-box averages, whereas real weather is often expressed by means of point values. As a result, simulated extreme values in particular are not directly comparable with observed local extreme values. Here we argue that the issue of representativeness is more general. To illustrate this point, consider the following situations: first, the (GCM or RCM) simulated large-scale weather, e.g., the mid-latitude storm track, might be systematically distorted compared to observed weather. If such a distortion at the synoptic scale is strong, the simulated local climate might be completely different from the observed one. Second, the orography even of high-resolution RCMs is only a coarse model of the true orography. In particular in mountain ranges, the simulated mesoscale flow might therefore deviate considerably from the observed flow, leading to systematically displaced local weather. In both cases, the simulated local climate does not represent the observed local climate. Thus, representativeness also encompasses representing a particular location. We propose to measure this aspect of representativeness for RCMs driven with perfect boundary conditions as the correlation between observations and simulations at the inter-annual scale. In doing so, random variability generated by the RCMs is largely averaged out. As an example, we assess how well KNMI's RACMO2 RCM at 25 km horizontal resolution represents winter precipitation in the gridded E-OBS data set over the European domain. At a given grid box, RCM precipitation might not be representative of observed precipitation, in particular in the rain shadow of major mountain ranges
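
    The representativeness measure proposed above is an inter-annual correlation between simulated and observed series at each grid box. A minimal sketch of that diagnostic on synthetic data (the array shapes and values are illustrative, not from RACMO2 or E-OBS):

        # Sketch: representativeness as the inter-annual correlation between RCM and
        # observed winter precipitation at each grid box (synthetic stand-in data).
        import numpy as np

        n_years, ny, nx = 20, 50, 60
        rng = np.random.default_rng(0)
        obs = rng.gamma(shape=2.0, scale=50.0, size=(n_years, ny, nx))  # "observed" winter totals
        sim = obs * rng.normal(1.0, 0.3, size=obs.shape)                # "simulated" counterpart

        obs_a = obs - obs.mean(axis=0)
        sim_a = sim - sim.mean(axis=0)
        corr = (obs_a * sim_a).sum(axis=0) / (
            np.sqrt((obs_a ** 2).sum(axis=0)) * np.sqrt((sim_a ** 2).sum(axis=0)))
        print("median inter-annual correlation over the domain:", round(float(np.median(corr)), 2))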

  8. Stream Water Quality Model

    Data.gov (United States)

    U.S. Environmental Protection Agency — QUAL2K (or Q2K) is a river and stream water quality model that is intended to represent a modernized version of the QUAL2E (or Q2E) model (Brown and Barnwell 1987).

  9. How Are Feedbacks Represented in Land Models?

    Directory of Open Access Journals (Sweden)

    Yang Chen

    2016-09-01

    Full Text Available Land systems are characterised by many feedbacks that can result in complex system behaviour. We defined feedbacks as the two-way influences between the land use system and a related system (e.g., climate, soils and markets), both of which are encompassed by the land system. Land models that include feedbacks thus probably more accurately mimic how land systems respond to, e.g., policy or climate change. However, representing feedbacks in land models is a challenge. We reviewed articles incorporating feedbacks into land models and analysed each with predefined indicators. We found that (1) most modelled feedbacks couple land use systems with transport, soil and market systems, while only a few include feedbacks between land use and social systems or climate systems; (2) equation-based land use models that follow a top-down approach prevail; and (3) feedbacks' effects on system behaviour remain relatively unexplored. We recommend that land system modellers (1) consider feedbacks between land use systems and social systems; (2) adopt (bottom-up) approaches suited to incorporating spatial heterogeneity and better representing land use decision-making; and (3) pay more attention to nonlinear system behaviour and its implications for land system management and policy.

  10. Decadal application of WRF/Chem for regional air quality and climate modeling over the U.S. under the representative concentration pathways scenarios. Part 1: Model evaluation and impact of downscaling

    Science.gov (United States)

    Yahya, Khairunnisa; Wang, Kai; Campbell, Patrick; Chen, Ying; Glotfelty, Timothy; He, Jian; Pirhalla, Michael; Zhang, Yang

    2017-03-01

    An advanced online-coupled meteorology-chemistry model, i.e., the Weather Research and Forecasting Model with Chemistry (WRF/Chem), is applied for the current (2001-2010) and future (2046-2055) decades under the representative concentration pathways (RCP) 4.5 and 8.5 scenarios to examine changes in future climate, air quality, and their interactions. In this Part I paper, a comprehensive model evaluation is carried out for the current decade to assess the performance of WRF/Chem and WRF under both scenarios and the benefits of downscaling North Carolina State University's (NCSU) version of the Community Earth System Model (CESM_NCSU) using WRF/Chem. The evaluation of WRF/Chem shows an overall good performance for most meteorological and chemical variables on a decadal scale. Temperature at 2-m is overpredicted by WRF (by ∼0.2-0.3 °C) but underpredicted by WRF/Chem (by ∼0.3-0.4 °C), due to higher radiation from WRF. Both WRF and WRF/Chem show large overpredictions for precipitation, indicating limitations in their microphysics or convective parameterizations. WRF/Chem with prognostic chemical concentrations, however, performs much better than WRF with prescribed chemical concentrations for radiation variables, illustrating the benefit of predicting gases and aerosols and representing their feedbacks into meteorology in WRF/Chem. WRF/Chem performs much better than CESM_NCSU for most surface meteorological variables and O3 hourly mixing ratios. In addition, WRF/Chem better captures observed temporal and spatial variations than CESM_NCSU. CESM_NCSU performance for radiation variables is comparable to or better than WRF/Chem performance because of the model tuning in CESM_NCSU that is routinely made in global models.

  11. Cadmium phytoavailability to rice (Oryza sativa L.) grown in representative Chinese soils. A model to improve soil environmental quality guidelines for food safety.

    Science.gov (United States)

    Rafiq, Muhammad T; Aziz, Rukhsanda; Yang, Xiaoe; Xiao, Wendan; Rafiq, Muhammad K; Ali, Basharat; Li, Tingqiang

    2014-05-01

    Food chain contamination by cadmium (Cd) is globally a serious health concern resulting in chronic abnormalities. Rice is a major staple food for the majority of the world's population; it is therefore imperative to understand the relationship between the bioavailability of Cd in soils and its accumulation in rice grain. The objectives of this study were to establish environmental quality standards for seven different textured soils based on human dietary toxicity, total Cd content in soils and the bioavailable portion of Cd in soil. Cadmium concentrations in polished rice grain were best related to total Cd content in Mollisols and Udic Ferrisols, with threshold levels of 0.77 and 0.32 mg kg(-1), respectively. In contrast, Mehlich-3-extractable Cd thresholds were more suitable for Calcaric Regosols, Stagnic Anthrosols, Ustic Cambosols, Typic Haplustalfs and Periudic Argosols, with threshold values of 0.36, 0.22, 0.17, 0.08 and 0.03 mg kg(-1), respectively. Stepwise multiple regression analysis indicated that phytoavailability of Cd to rice grain was strongly correlated with Mehlich-3-extractable Cd and soil pH. The empirical model developed in this study explains the combined effects of soil properties and extractable soil Cd content on the phytoavailability of Cd to polished rice grain. This study indicates that accumulation of Cd in rice is influenced greatly by soil type, which should be considered in assessment of soil safety for Cd contamination in rice. This investigation concluded that the selection of a proper soil type for food crop production can help to avoid Cd toxicity in the daily diet.
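
    The study's empirical model is a stepwise multiple regression of grain Cd on Mehlich-3-extractable soil Cd and soil pH. A minimal sketch of such a fit (the coefficients, ranges and synthetic data are illustrative assumptions, not values from the study):

        # Sketch: regress log10(polished-grain Cd) on log10(Mehlich-3 Cd) and soil pH
        # using synthetic data; the actual study fitted soil-type-specific thresholds.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 60
        m3_cd = rng.uniform(0.02, 0.5, n)        # Mehlich-3-extractable Cd, mg/kg (assumed range)
        ph = rng.uniform(4.5, 8.0, n)            # soil pH (assumed range)
        log_grain_cd = 0.8 * np.log10(m3_cd) - 0.25 * ph + 1.5 + rng.normal(0, 0.1, n)

        # Ordinary least squares: log10(grain Cd) ~ log10(M3-Cd) + pH + intercept
        X = np.column_stack([np.log10(m3_cd), ph, np.ones(n)])
        coef, *_ = np.linalg.lstsq(X, log_grain_cd, rcond=None)
        print("fitted coefficients [log10(M3-Cd), pH, intercept]:", np.round(coef, 3))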

  12. Representing Turbulence Model Uncertainty with Stochastic PDEs

    Science.gov (United States)

    Oliver, Todd; Moser, Robert

    2012-11-01

    Validation of and uncertainty quantification for extrapolative predictions of RANS turbulence models are necessary to ensure that the models are not used outside of their domain of applicability and to properly inform decisions based on such predictions. In previous work, we have developed and calibrated statistical models for these purposes, but it has been found that incorporating all the knowledge of a domain expert--e.g., realizability, spatial smoothness, and known scalings--in such models is difficult. Here, we explore the use of stochastic PDEs for this purpose. The goal of this formulation is to pose the uncertainty model in a setting where it is easier for physical modelers to express what is known. To explore the approach, multiple stochastic models describing the error in the Reynolds stress are coupled with multiple deterministic turbulence models to make uncertain predictions of channel flow. These predictions are compared with DNS data to assess their credibility. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  13. SPECIFIC MODELS OF REPRESENTING THE INTELLECTUAL CAPITAL

    Directory of Open Access Journals (Sweden)

    Andreea Feraru

    2014-12-01

    Full Text Available Various scientists in the modern age of management have proposed different models for evaluating intellectual capital, and some of these models are analysed critically in this study, too. Most authors examine intellectual capital from a static perspective and focus on the development of its various evaluation models. In this chapter we surveyed the classical static models: Sveiby, Edvisson, Balanced Scorecard, as well as the canonical model of intellectual capital. In a spectral dynamic analysis, organisational intellectual capital is structured into organisational knowledge, organisational intelligence and organisational values, and their value is built on certain mechanisms called integrators, whose chief constitutive elements are individual knowledge, individual intelligence and individual cultural values. Organizations, as employers, must especially reconsider the work of those employees who hold valuable knowledge, because such employees are free to choose how, and especially where, they invest their own energy, skills and time, and they can be treated as freelancers or as small entrepreneurs.

  14. Representing Practice: Practice Models, Patterns, Bundles

    Science.gov (United States)

    Falconer, Isobel; Finlay, Janet; Fincher, Sally

    2011-01-01

    This article critiques learning design as a representation for sharing and developing practice, based on synthesis of three projects. Starting with the findings of the Mod4L Models of Practice project, it argues that the technical origins of learning design, and the consequent focus on structure and sequence, limit its usefulness for sharing…

  15. EPANET water quality model

    Energy Technology Data Exchange (ETDEWEB)

    Rossman, L.A.

    1993-01-01

    EPANET represents a third generation of water quality modeling software developed by the U.S. EPA's Drinking Water Research Division, offering significant advances in the state of the art for network water quality analysis. EPANET performs extended period simulation of hydraulic and water quality behavior within water distribution systems. In addition to substance concentration, water age and source tracing can also be simulated. EPANET includes a full featured hydraulic simulation model that can handle various types of pumps, valves, and their control rules. The water quality module is equipped to handle constituent reactions within the bulk pipe flow and at the pipe wall. It also features an efficient computational scheme that automatically determines optimal time steps and pipe segmentation for accurate tracking of material transport over time. EPANET is currently being used in the US to study such issues as loss of chlorine residual, source blending and trihalomethane (THM) formation, how altered tank operation affects water age, and total dissolved solids (TDS) control for an irrigation network.

  16. Assessment Model of Water Quality Represented with Normalized Index Values Based on Projection Pursuit Regression

    Institute of Scientific and Technical Information of China (English)

    李祚泳; 张正健; 余春雪

    2012-01-01

    Traditional projection pursuit regression (PPR) in its matrix representation, when applied to multi-index water quality evaluation, suffers from low learning efficiency in optimizing the parameter matrix elements and from degraded optimization results as the number of indices grows. In the present work, appropriate reference values and normalizing transformations were set for each index of three types of water bodies (surface water, groundwater and eutrophic water), so that after transformation the differences between same-grade standard values of different indices become small and the normalized values of different indices can be regarded as equivalent to a single normalized index. It is therefore only necessary to construct and optimize an NV-PPR(2) model for two index variables and an NV-PPR(3) model for three index variables that apply to all normalized index values; an NV-PPR model with more index variables can then be expressed as a combination of NV-PPR(2) and/or NV-PPR(3) models. The parameter matrix elements were optimized iteratively with a monkey-king genetic algorithm, and the practicality of the models was verified. The results show that the water quality evaluation model based on projection pursuit regression with normalized index transformation is simple in form, convenient to compute and widely applicable.

  17. Model parameters for representative wetland plant functional groups

    Science.gov (United States)

    Williams, Amber S.; Kiniry, James R.; Mushet, David M.; Smith, Loren M.; McMurry, Scott T.; Attebury, Kelly; Lang, Megan; McCarty, Gregory W.; Shaffer, Jill A.; Effland, William R.; Johnson, Mari-Vaughn V.

    2017-01-01

    Wetlands provide a wide variety of ecosystem services including water quality remediation, biodiversity refugia, groundwater recharge, and floodwater storage. Realistic estimation of ecosystem service benefits associated with wetlands requires reasonable simulation of the hydrology of each site and realistic simulation of the upland and wetland plant growth cycles. Objectives of this study were to quantify leaf area index (LAI), light extinction coefficient (k), and plant nitrogen (N), phosphorus (P), and potassium (K) concentrations in natural stands of representative plant species for some major plant functional groups in the United States. Functional groups in this study were based on these parameters and plant growth types to enable process-based modeling. We collected data at four locations representing some of the main wetland regions of the United States. At each site, we collected on-the-ground measurements of fraction of light intercepted, LAI, and dry matter within the 2013–2015 growing seasons. Maximum LAI and k variables showed noticeable variations among sites and years, while overall averages and functional group averages give useful estimates for multisite simulation modeling. Variation within each species gives an indication of what can be expected in such natural ecosystems. For P and K, the concentrations from highest to lowest were spikerush (Eleocharis macrostachya), reed canary grass (Phalaris arundinacea), smartweed (Polygonum spp.), cattail (Typha spp.), and hardstem bulrush (Schoenoplectus acutus). Spikerush had the highest N concentration, followed by smartweed, bulrush, reed canary grass, and then cattail. These parameters will be useful for the actual wetland species measured and for the wetland plant functional groups they represent. These parameters and the associated process-based models offer promise as valuable tools for evaluating environmental benefits of wetlands and for evaluating impacts of various agronomic practices in

  18. Assessing temporal representativeness of water quality monitoring data.

    Science.gov (United States)

    Anttila, Saku; Ketola, Mirva; Vakkilainen, Kirsi; Kairesalo, Timo

    2012-02-01

    The effectiveness of different monitoring methods in detecting temporal changes in water quality depends on the achievable sampling intervals, and how these relate to the extent of temporal variation. However, water quality sampling frequencies are rarely adjusted to the actual variation of the monitoring area. Manual sampling, for example, is often limited by the level of funding and not by the optimal timing to take samples. Restrictions in monitoring methods therefore often determine their ability to estimate the true mean and variance values for a certain time period or season. Consequently, we estimated how different sampling intervals determine the mean and standard deviation in a specific monitoring area by using high-frequency data from in situ automated monitoring stations. Raw fluorescence measurements of chlorophyll a for three automated monitoring stations were calibrated by using phycocyanin fluorescence measurements and chlorophyll a analyzed from manual water samples in a laboratory. A moving block bootstrap simulation was then used to estimate the standard errors of the mean and standard deviations for different sample sizes. Our results showed that in a temperate, meso-eutrophic lake, relatively high errors in seasonal statistics can be expected from monthly sampling. Moreover, weekly sampling yielded relatively small accuracy benefits compared to fortnightly sampling. The presented method for temporal representativeness analysis can be used as a tool in sampling design by adjusting the sampling interval to suit the actual temporal variation in the monitoring area, in addition to being used for estimating the usefulness of previously collected data.
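
    The moving block bootstrap mentioned above resamples contiguous blocks of a time series to preserve autocorrelation when estimating standard errors of seasonal statistics. A minimal sketch on synthetic data (series length, block lengths and sampling intervals are illustrative assumptions):

        # Sketch: moving block bootstrap standard error of a seasonal mean from a
        # high-frequency series, compared across coarser sampling intervals.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 24 * 150                                    # ~one growing season of hourly values
        chl = 10 + np.cumsum(rng.normal(0, 0.05, n))    # autocorrelated stand-in for chlorophyll a

        def block_bootstrap_se(x, block_len, n_boot=1000):
            """SE of the mean using overlapping (moving) blocks of length block_len."""
            n_blocks = int(np.ceil(len(x) / block_len))
            means = np.empty(n_boot)
            for b in range(n_boot):
                starts = rng.integers(0, len(x) - block_len + 1, size=n_blocks)
                sample = np.concatenate([x[s:s + block_len] for s in starts])[:len(x)]
                means[b] = sample.mean()
            return means.std(ddof=1)

        for label, step in [("hourly", 1), ("weekly", 24 * 7), ("monthly", 24 * 30)]:
            sub = chl[::step]
            se = block_bootstrap_se(sub, block_len=max(2, len(sub) // 10))
            print(f"{label:8s} n = {len(sub):5d}  bootstrap SE of seasonal mean = {se:.3f}")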

  19. GIS based assessment of the spatial representativeness of air quality monitoring stations using pollutant emissions data

    Science.gov (United States)

    Righini, G.; Cappelletti, A.; Ciucci, A.; Cremona, G.; Piersanti, A.; Vitali, L.; Ciancarella, L.

    2014-11-01

    Spatial representativeness of air quality monitoring stations is a critical parameter when choosing the location of sites and assessing the effects on the population of long-term exposure to air pollution. According to the literature, the spatial representativeness of a monitoring site is related to the variability of pollutant concentrations around the site. As the spatial distribution of primary pollutant concentrations is strongly correlated to the allocation of the corresponding emissions, in this work a methodology is presented to preliminarily assess the spatial representativeness of a monitoring site by analysing the spatial variation of emissions around it. An analysis of the horizontal variability of several pollutant emissions was carried out by means of a Geographic Information System using a neighbourhood statistic function; the rationale is that if the variability of emissions around a site is low, the spatial representativeness of the site is consequently high. The methodology was applied to detect the spatial representativeness of selected Italian monitoring stations, located in Northern and Central Italy and classified as urban background or rural background. Spatialized emission data produced by the national air quality model MINNI, covering the entire Italian territory at a spatial resolution of 4 × 4 km², were processed and analysed. The methodology has shown significant capability for quick detection of areas with the highest emission variability. This approach could be useful to plan new monitoring networks and to approximately estimate the horizontal spatial representativeness of existing monitoring sites. Major constraints arise from the limited spatial resolution of the analysis, controlled by the resolution of the emission input data (cell size of 4 × 4 km²), and from the applicability to primary pollutants only.
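
    The neighbourhood statistic described above amounts to a focal (moving-window) variability measure computed over the gridded emissions. A minimal sketch, assuming a 4 km emission raster and using the coefficient of variation as the variability measure (window size and station cell are illustrative):

        # Sketch: focal (neighbourhood) variability of a gridded emission field as a
        # proxy for the spatial representativeness of a monitoring site.
        import numpy as np
        from scipy.ndimage import uniform_filter

        rng = np.random.default_rng(3)
        emis = rng.lognormal(mean=0.0, sigma=1.0, size=(200, 250))  # stand-in for 4x4 km2 emissions

        win = 5                                          # 5x5 cells, roughly a 20x20 km window
        local_mean = uniform_filter(emis, size=win)
        local_mean_sq = uniform_filter(emis ** 2, size=win)
        local_std = np.sqrt(np.maximum(local_mean_sq - local_mean ** 2, 0.0))
        cv = local_std / np.maximum(local_mean, 1e-12)   # coefficient of variation

        site_row, site_col = 120, 80                     # hypothetical station cell
        print("local emission CV at the site:", round(float(cv[site_row, site_col]), 2),
              "(lower CV -> higher spatial representativeness)")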

  20. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as represent

  1. Selection of Representative Models for Decision Analysis Under Uncertainty

    Science.gov (United States)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that need to be analyzed so that an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.

  2. Health related quality of life in a nationally representative sample of haematological patients

    DEFF Research Database (Denmark)

    Johnsen, Anna T; Tholstrup, Dorte; Petersen, Morten Aa;

    2009-01-01

    Knowledge of health related quality of life of haematological patients is limited. This study aimed at investigating the prevalence and predictors of symptoms and problems in a representative sample of haematological patients in Denmark.

  3. Assessment of parameters describing representativeness of air quality in-situ measurement sites

    Directory of Open Access Journals (Sweden)

    S. Henne

    2010-04-01

    Full Text Available The atmospheric layer closest to the ground is strongly influenced by variable surface fluxes (emissions, surface deposition and can therefore be very heterogeneous. In order to perform air quality measurements that are representative of a larger domain or a certain degree of pollution, observatories are placed away from population centres or within areas of specific population density. Sites are often categorised based on subjective criteria that are not uniformly applied by the atmospheric community within different administrative domains yielding an inconsistent global air quality picture. A novel approach for the assessment of parameters reflecting site representativeness is presented here, taking emissions, deposition and transport towards 34 sites covering Western and Central Europe into account. These parameters are directly inter-comparable among the sites and can be used to select sites that are, on average, more or less suitable for data assimilation and comparison with satellite and model data. Advection towards these sites was simulated by backward Lagrangian Particle Dispersion Modelling (LPDM to determine the sites' average catchment areas for the year 2005 and advection times of 12, 24 and 48 h. Only variations caused by emissions and transport during these periods were considered assuming that these dominate the short-term variability of most but especially short lived trace gases. The derived parameters describing representativeness were compared between sites and a novel, uniform and observation-independent categorisation of the sites based on a clustering approach was established. Six groups of European background sites were identified ranging from generally remote to more polluted agglomeration sites. These six categories explained 50 to 80% of the inter-site variability of median mixing ratios and their standard deviation for NO2 and O3, while differences between group means of the longer

  4. Decadal application of WRF/chem for regional air quality and climate modeling over the U.S. under the representative concentration pathways scenarios. Part 2: Current vs. future simulations

    Science.gov (United States)

    Yahya, Khairunnisa; Campbell, Patrick; Zhang, Yang

    2017-03-01

    Following a comprehensive model evaluation, this Part II paper presents projected changes in future (2046-2055) climate, air quality, and their interactions under the RCP4.5 and RCP8.5 scenarios using the Weather Research and Forecasting model with Chemistry (WRF/Chem). In general, both WRF/Chem RCP4.5 and RCP8.5 simulations predict similar increases on average (∼2 °C) for 2-m temperature (T2) but different spatial distributions of the projected changes in T2, 2-m relative humidity, 10-m wind speed, precipitation, and planetary boundary layer height, due to differences in the spatial distributions of projected emissions, and their feedbacks into climate. Future O3 mixing ratios will decrease for most parts of the U.S. under the RCP4.5 scenario but increase for all areas under the RCP8.5 scenario due to higher projected temperature, greenhouse gas concentrations and biogenic volatile organic compound (VOC) emissions, higher O3 values for boundary conditions, and the disbenefit of NOx reduction and decreased NO titration over VOC-limited O3 chemistry regions. Future PM2.5 concentrations will decrease for both RCP4.5 and RCP8.5 scenarios, with different trends in the projected concentrations of individual PM species. Total cloud amounts decrease under both scenarios in the future due to decreases in PM and cloud droplet number concentration and thus increased radiation. These results illustrate the impacts of carbon policies with different degrees of emission reductions on future climate and air quality. The WRF/Chem and WRF simulations show different spatial patterns for the projected changes in T2 for the future decade, indicating different impacts of prognostic and prescribed gas/aerosol concentrations, respectively, on climate change.

  5. Representing vegetation processes in hydrometeorological simulations using the WRF model

    DEFF Research Database (Denmark)

    Nielsen, Joakim Refslund

    For accurate predictions of weather and climate, it is important that the land surface and its processes are well represented. In a mesoscale model the land surface processes are calculated in a land surface model (LSM). These processes include exchanges of energy, water and momentum between the land surface components, such as vegetation and soil, and their interactions with the atmosphere. The land surface processes are complex and vary in time and space. Significant effort by the land surface community has therefore been invested in improving the LSMs over the recent decades. However, improvements are still needed in the representation of the land surface variability and of some key land surface processes. This thesis explores two possibilities for improving the near-surface model predictions using the mesoscale Weather Research and Forecasting (WRF) model. In the first approach, data from satellite...

  6. Representing the environment 3.0. Maps, models, networks.

    Directory of Open Access Journals (Sweden)

    Letizia Bollini

    2014-05-01

    Full Text Available Web 3.0 is changing the way we live in and perceive the anthropomorphized environment, creating a stratification of levels of experience mediated by devices. If the urban landscape is designed, shaped and planned space, there is a social landscape that overwrites the territory with values, shared representations and images, and narratives of personal and collective history. Mobile technology introduces an additional parameter, a kind of non-place, which allows the coexistence of the here and the elsewhere in a sort of digital landscape. Maps, mental models and the system of social networks then become the way to present, be represented and represent oneself in a kind of ideal coring of the co-present levels of physical, cognitive and collective space.

  7. A BRIEF REVIEW OF MODELS REPRESENTING CREEP OF ALLOY 617

    Energy Technology Data Exchange (ETDEWEB)

    Swindeman, Robert W [ORNL; Swindeman, Michael [University of Dayton Research Institute; Ren, Weiju [ORNL

    2005-01-01

    Alloy 617 is being considered for the construction of components to operate in the Next Generation Nuclear Plant (NGNP). Service temperatures will range from 650 to 1000 °C. To meet the needs of the conceptual designers of this plant, a materials handbook is being developed that will provide information on alloy 617, as well as other materials of interest. The database for alloy 617 to be incorporated into the handbook was produced in the 1970s and 1980s, while creep and damage models were developed from the database for use in the design of high-temperature gas-cooled reactors. In the work reported here, the US database and creep models are briefly reviewed. The work reported represents progress toward a useful model of the behavior of this material in the temperature range of 650 to 1000 °C.

  8. Quality of reproductive healthcare for adolescents: A nationally representative survey of providers in Mexico

    Science.gov (United States)

    De Castro, Filipa; Barrientos-Gutiérrez, Tonatiuh; Leyva-López, Ahideé

    2017-01-01

    Objective Adolescents need sexual and reproductive health services but little is known about quality-of-care in lower- and middle-income countries where most of the world’s adolescents reside. Quality-of-care has important implications as lower quality may be linked to higher unplanned pregnancy and sexually transmitted infection rates. This study sought to generate evidence about quality-of-care in public sexual and reproductive health services for adolescents. Methods This cross-sectional study had a complex, probabilistic, stratified sampling design, representative at the national, regional and rural/urban level in Mexico, collecting provider questionnaires at 505 primary care units in 2012. A sexual and reproductive quality-of-healthcare index was defined and multinomial logistic regression was utilized in 2015. Results At the national level 13.9% (95%CI: 6.9–26.0) of healthcare units provide low quality, 68.6% (95%CI: 58.4–77.3) medium quality and 17.5% (95%CI: 11.9–25.0) high quality reproductive healthcare services to adolescents. Urban or metropolitan primary care units were at least 10 times more likely to provide high quality care than those in rural areas. Units with a space specifically for counseling adolescents were at least 8 times more likely to provide high quality care. Ministry of Health clinics provided the lowest quality of service, while those from Social Security for the Underserved provided the best. Conclusions The study indicates higher quality sexual and reproductive healthcare services are needed. In Mexico and other middle- to low-income countries where quality-of-care has been shown to be a problem, incorporating adolescent-friendly, gender-equity and rights-based perspectives could contribute to improvement. Setting and disseminating standards for care in guidelines and providing tools such as algorithms could help healthcare personnel provide higher quality care. PMID:28273129

  9. Quality of reproductive healthcare for adolescents: A nationally representative survey of providers in Mexico.

    Science.gov (United States)

    Villalobos, Aremis; Allen-Leigh, Betania; Salazar-Alberto, Javier; De Castro, Filipa; Barrientos-Gutiérrez, Tonatiuh; Leyva-López, Ahideé; Rojas-Martínez, Rosalba

    2017-01-01

    Adolescents need sexual and reproductive health services but little is known about quality-of-care in lower- and middle-income countries where most of the world's adolescents reside. Quality-of-care has important implications as lower quality may be linked to higher unplanned pregnancy and sexually transmitted infection rates. This study sought to generate evidence about quality-of-care in public sexual and reproductive health services for adolescents. This cross-sectional study had a complex, probabilistic, stratified sampling design, representative at the national, regional and rural/urban level in Mexico, collecting provider questionnaires at 505 primary care units in 2012. A sexual and reproductive quality-of-healthcare index was defined and multinomial logistic regression was utilized in 2015. At the national level 13.9% (95%CI: 6.9-26.0) of healthcare units provide low quality, 68.6% (95%CI: 58.4-77.3) medium quality and 17.5% (95%CI: 11.9-25.0) high quality reproductive healthcare services to adolescents. Urban or metropolitan primary care units were at least 10 times more likely to provide high quality care than those in rural areas. Units with a space specifically for counseling adolescents were at least 8 times more likely to provide high quality care. Ministry of Health clinics provided the lowest quality of service, while those from Social Security for the Underserved provided the best. The study indicates higher quality sexual and reproductive healthcare services are needed. In Mexico and other middle- to low-income countries where quality-of-care has been shown to be a problem, incorporating adolescent-friendly, gender-equity and rights-based perspectives could contribute to improvement. Setting and disseminating standards for care in guidelines and providing tools such as algorithms could help healthcare personnel provide higher quality care.

  10. Representing plants as rigid cylinders in experiments and models

    Science.gov (United States)

    Vargas-Luna, Andrés; Crosato, Alessandra; Calvani, Giulio; Uijttewaal, Wim S. J.

    2016-07-01

    Simulating the morphological adaptation of water systems often requires including the effects of plants on water and sediment dynamics. Physical and numerical models need to represent vegetation in a schematic, easily quantifiable way despite the variety of sizes, shapes and flexibility of real plants. Common approaches represent plants as rigid cylinders, but the ability of these schematizations to reproduce the effects of vegetation on morphodynamic processes has never been analyzed systematically. This work focuses on the consequences of representing plants as rigid cylinders in laboratory tests and numerical simulations. New experiments show that the flow resistance decreases for increasing element Reynolds numbers for both plants and rigid cylinders. Cylinders on river banks can qualitatively reproduce vegetation effects on channel width and bank-related processes. A comparative review of numerical simulations shows that Baptist's method, which sums the contributions of bed shear stress and vegetation drag, underestimates bed erosion within sparse vegetation in real rivers and overestimates the mean flow velocity in laboratory experiments. This is due to assuming uniform flow among plants and to an overestimation of the role of the submergence ratio.
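
    Baptist's method referred to above combines a bed shear stress term with a vegetation drag term into an equivalent roughness. A minimal sketch of the commonly cited form of that formula for submerged rigid cylinders (quoted from memory with illustrative parameter values, so treat it as an assumption rather than the exact expression used in the reviewed simulations):

        # Sketch: Baptist-type equivalent Chezy coefficient for flow over submerged
        # rigid-cylinder vegetation (bed shear stress + vegetation drag contributions).
        import math

        def chezy_baptist(h, hv, m, D, Cd=1.0, Cb=60.0, kappa=0.41, g=9.81):
            """Equivalent Chezy coefficient (m^0.5/s) for submerged vegetation.

            h: water depth (m), hv: vegetation height (m),
            m: stem density (stems/m^2), D: stem diameter (m),
            Cb: Chezy coefficient of the unvegetated bed.
            """
            through_canopy = math.sqrt(1.0 / (1.0 / Cb ** 2 + Cd * m * D * hv / (2.0 * g)))
            above_canopy = (math.sqrt(g) / kappa) * math.log(h / hv)
            return through_canopy + above_canopy

        # Example: 1.0 m deep flow over 0.3 m high stems, 200 stems/m2, 5 mm diameter
        print("C =", round(chezy_baptist(h=1.0, hv=0.3, m=200, D=0.005), 1), "m^0.5/s")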

  11. Exploratory piloted simulator study of the effects of winglets on handling qualities of a representative agricultural airplane

    Science.gov (United States)

    Ogburn, M. E.; Brown, P. W.

    1980-01-01

    The effects on handling qualities of adding winglets to a representative agricultural aircraft configuration during swath-run maneuvering were evaluated. Aerodynamic data used in the simulation were based on low-speed wind tunnel tests of a full scale airplane and a subscale model. The Cooper-Harper handling qualities rating scale, supplementary pilot comments, and pilot vehicle performance data were used to describe the handling qualities of the airplane with the different wing-tip configurations. Results showed that the lateral-directional handling qualities of the airplane were greatly affected by the application of winglets and winglet cant angle. The airplane with winglets canted out 20 deg exhibited severely degraded lateral directional handling qualities in comparison to the basic airplane. When the winglets were canted inward 10 deg, the flying qualities of the configuration were markedly improved over those of the winglet-canted-out configuration or the basic configuration without winglets, indicating that proper tailoring of the winglet design may afford a potential benefit in the area of handling qualities.

  12. Fuzzy Based Evaluation of Software Quality Using Quality Models and Goal Models

    Directory of Open Access Journals (Sweden)

    Arfan Mansoor

    2015-09-01

    Full Text Available Software quality requirements are an essential part of the success of software development. Defined and guaranteed quality in software development requires identifying, refining, and predicting quality properties by appropriate means. Goal models of goal-oriented requirements engineering (GORE) and quality models are useful for modelling functional goals as well as quality goals. Once the goal models are obtained, representing the functional requirements and the integrated quality goals, each functional requirement arising from the functional goals and each quality requirement arising from the quality goals needs to be evaluated. The process consists of two main parts. In the first part, the goal models are used to evaluate functional goals. The leaf-level goals are used to establish the evaluation criteria. Stakeholders are also involved to contribute their opinions about the importance of each goal (functional and/or quality goal). Stakeholder opinions are then converted into quantifiable numbers using triangular fuzzy numbers (TFN). After applying the defuzzification process to the TFN, the scores (weights) are obtained for each goal. In the second part, specific quality goals are identified and refined/tailored based on existing quality models, and their evaluation is performed similarly using TFN and the defuzzification process. The two-step process helps to evaluate each goal based on stakeholder opinions, to evaluate the impact of quality requirements, and to evaluate the relationships among functional goals and quality goals. The process is described and applied to the 'cyclecomputer' case study.
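
    Triangular fuzzy numbers are typically aggregated across stakeholders and then defuzzified (for example with a centroid rule) to obtain crisp goal weights. A minimal sketch of that step (the linguistic scale, the goals and the ratings are illustrative assumptions, not taken from the paper):

        # Sketch: aggregate stakeholder opinions as triangular fuzzy numbers (TFNs)
        # and defuzzify them to a crisp weight per goal (centroid method).
        LINGUISTIC = {            # illustrative TFN scale: (low, mid, high)
            "very low":  (0.00, 0.00, 0.25),
            "low":       (0.00, 0.25, 0.50),
            "medium":    (0.25, 0.50, 0.75),
            "high":      (0.50, 0.75, 1.00),
            "very high": (0.75, 1.00, 1.00),
        }

        def aggregate(tfns):
            """Average the TFNs component-wise across stakeholders."""
            n = len(tfns)
            return tuple(sum(t[i] for t in tfns) / n for i in range(3))

        def defuzzify(tfn):
            """Centroid of a triangular fuzzy number: (l + m + u) / 3."""
            l, m, u = tfn
            return (l + m + u) / 3.0

        opinions = {  # hypothetical ratings of two goals by three stakeholders
            "functional goal: accurate speed display": ["high", "very high", "medium"],
            "quality goal: responsiveness":            ["medium", "high", "high"],
        }
        for goal, ratings in opinions.items():
            weight = defuzzify(aggregate([LINGUISTIC[r] for r in ratings]))
            print(f"{goal}: weight = {weight:.3f}")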

  13. Representing plant hydraulics in a global Earth system model.

    Science.gov (United States)

    Kennedy, D.; Gentine, P.

    2015-12-01

    Earth system models need improvement to reproduce observed seasonal and diurnal cycles of photosynthesis and respiration. Model water stress parameterizations lag behind the plant physiology literature. A plant hydraulics model is developed and deployed in a global Earth system model (NCAR CESM 1.2.2 with CLM 4.5). Assimilation and transpiration are attenuated according to cavitation curves from the literature. Water stress is evaluated from plant functional type hydraulic parameters forced by soil moisture and atmospheric conditions. Resolving the plant water status allows divergent strategies for water stress to be modelled. The case of isohydric versus anisohydric species is presented, showing that including plant hydraulic traits alters modelled photosynthesis and transpiration.
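
    Cavitation (vulnerability) curves of the kind referenced above are commonly modelled as sigmoidal functions of xylem water potential that attenuate conductance and hence transpiration. A minimal sketch, assuming a P50-based form and illustrative parameter values (not the parameters used in the paper):

        # Sketch: attenuate stomatal conductance with a sigmoidal xylem vulnerability
        # curve; p50 is the water potential at 50% loss of conductance (per plant
        # functional type in a real model). All values here are illustrative.
        def conductance_fraction(psi, p50=-2.0, a=3.0):
            """Fraction of maximum conductance remaining at water potential psi (MPa)."""
            return 0.5 ** ((psi / p50) ** a)

        g_max = 0.2  # illustrative maximum stomatal conductance, mol m-2 s-1
        for psi in [-0.5, -1.0, -2.0, -3.0, -4.0]:
            g = g_max * conductance_fraction(psi)
            print(f"psi = {psi:5.1f} MPa  ->  g_s = {g:.3f} mol m-2 s-1")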

  14. REPRESENTING AEROSOL DYNAMICS AND PROPERTIES IN CHEMICAL TRANSPORT MODELS BY THE METHOD OF MOMENTS.

    Energy Technology Data Exchange (ETDEWEB)

    SCHWARTZ, S.E.; MCGRAW, R.; BENKOVITZ, C.M.; WRIGHT, D.L.

    2001-04-01

    Atmospheric aerosols, suspensions of solid or liquid particles, are an important multi-phase system. Aerosols scatter and absorb shortwave (solar) radiation, affecting climate (Charlson et al., 1992; Schwartz, 1996) and visibility; nucleate cloud droplet formation, modifying the reflectivity of clouds (Twomey et al., 1984; Schwartz and Slingo, 1996) as well as contributing to composition of cloudwater and to wet deposition (Seinfeld and Pandis, 1998); and affect human health through inhalation (NRC, 1998). Existing and prospective air quality regulations impose standards on concentrations of atmospheric aerosols to protect human health and welfare (EPA, 1998). Chemical transport and transformation models representing the loading and geographical distribution of aerosols and precursor gases are needed to permit development of effective and efficient strategies for meeting air quality standards, and for examining aerosol effects on climate retrospectively and prospectively for different emissions scenarios. Important aerosol properties and processes depend on their size distribution: light scattering, cloud nucleating properties, dry deposition, and penetration into airways of lungs. The evolution of the mass loading itself depends on particle size because of the size dependence of growth and removal processes. For these reasons it is increasingly recognized that chemical transport and transformation models must represent not just the mass loading of atmospheric particulate matter but also the aerosol microphysical properties and the evolution of these properties if aerosols are to be accurately represented in these models. If the size distribution of the aerosol is known, a given property can be evaluated as the integral of the appropriate kernel function over the size distribution. This has motivated the approach of determining aerosol size distribution, and of explicitly representing this distribution and its evolution in chemical transport models.
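
    As a concrete illustration of evaluating an aerosol property as the integral of a kernel function over the size distribution, the sketch below integrates a surface-area kernel over a lognormal number distribution and compares the result with the analytic second moment (the distribution parameters are illustrative, not values from the article):

        # Sketch: aerosol property as an integral of a kernel over the size
        # distribution n(D); here, total surface area of a lognormal aerosol.
        import numpy as np

        N_tot, Dg, sigma_g = 1.0e9, 0.1e-6, 1.8   # number (m-3), geometric mean diameter (m), GSD

        def n_lognormal(D):
            """Lognormal number size distribution dN/dD."""
            return (N_tot / (D * np.log(sigma_g) * np.sqrt(2 * np.pi)) *
                    np.exp(-0.5 * (np.log(D / Dg) / np.log(sigma_g)) ** 2))

        def kernel_surface(D):
            """Property kernel: surface area of a particle of diameter D."""
            return np.pi * D ** 2

        D = np.logspace(-8, -5, 2000)             # 10 nm to 10 um
        surface_by_quadrature = np.trapz(kernel_surface(D) * n_lognormal(D), D)

        # Moment view: the same quantity is pi times the 2nd moment of the distribution,
        # which for a lognormal equals N_tot * Dg**2 * exp(2 * ln(sigma_g)**2).
        surface_by_moments = np.pi * N_tot * Dg ** 2 * np.exp(2.0 * np.log(sigma_g) ** 2)
        print(surface_by_quadrature, surface_by_moments)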

  15. Explicitly representing soil microbial processes in Earth system models

    Science.gov (United States)

    Wieder, William R.; Allison, Steven D.; Davidson, Eric A.; Georgiou, Katerina; Hararuk, Oleksandra; He, Yujie; Hopkins, Francesca; Luo, Yiqi; Smith, Matthew J.; Sulman, Benjamin; Todd-Brown, Katherine; Wang, Ying-Ping; Xia, Jianyang; Xu, Xiaofeng

    2015-10-01

    Microbes influence soil organic matter decomposition and the long-term stabilization of carbon (C) in soils. We contend that by revising the representation of microbial processes and their interactions with the physicochemical soil environment, Earth system models (ESMs) will make more realistic global C cycle projections. Explicit representation of microbial processes presents considerable challenges due to the scale at which these processes occur. Thus, applying microbial theory in ESMs requires a framework to link micro-scale process-level understanding and measurements to macro-scale models used to make decadal- to century-long projections. Here we review the diversity, advantages, and pitfalls of simulating soil biogeochemical cycles using microbial-explicit modeling approaches. We present a roadmap for how to begin building, applying, and evaluating reliable microbial-explicit model formulations that can be applied in ESMs. Drawing from experience with traditional decomposition models, we suggest the following: (1) guidelines for common model parameters and output that can facilitate future model intercomparisons; (2) development of benchmarking and model-data integration frameworks that can be used to effectively guide, inform, and evaluate model parameterizations with data from well-curated repositories; and (3) the application of scaling methods to integrate microbial-explicit soil biogeochemistry modules within ESMs. With contributions across scientific disciplines, we feel this roadmap can advance our fundamental understanding of soil biogeochemical dynamics and more realistically project likely soil C response to environmental change at global scales.

  16. Quantum turing machine and brain model represented by Fock space

    Science.gov (United States)

    Iriyama, Satoshi; Ohya, Masanori

    2016-05-01

    The adaptive dynamics is known as a new mathematics for treating complex phenomena, for example chaos, quantum algorithms and psychological phenomena. In this paper, we briefly review the notion of adaptive dynamics, and explain the definition of the generalized Turing machine (GTM) and the recognition process represented by the Fock space. Moreover, we show that there exists a quantum channel, described by the GKSL master equation, which achieves the Chaos Amplifier used in [M. Ohya and I. V. Volovich, J. Opt. B 5(6) (2003) 639; M. Ohya and I. V. Volovich, Rep. Math. Phys. 52(1) (2003) 25].

  17. A time fractional model to represent rainfall process

    Directory of Open Access Journals (Sweden)

    Jacques GOLDER

    2014-01-01

    Full Text Available This paper deals with a stochastic representation of the rainfall process. The analysis of a rainfall time series shows that the cumulative representation of a rainfall time series can be modeled as a non-Gaussian random walk with a log-normal jump distribution and a waiting-time distribution following a tempered α-stable probability law. Based on the random walk model, a fractional Fokker-Planck equation (FFPE) with tempered α-stable waiting times was obtained. Through the comparison of observed data and simulated results from the random walk model and the FFPE model with tempered α-stable waiting times, it can be concluded that the behavior of the rainfall process is globally reproduced, and the FFPE model with tempered α-stable waiting times is more efficient in reproducing the observed behavior.
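
    The random-walk picture above is a continuous-time random walk: log-normal rainfall increments occurring after random waiting times. A minimal sketch of simulating such a process (for simplicity the tempered α-stable waiting-time law is replaced by a heavy-tailed Pareto stand-in, so this illustrates the construction only, not the paper's calibrated model):

        # Sketch: cumulative rainfall as a continuous-time random walk with log-normal
        # jump sizes and heavy-tailed waiting times (simplified stand-in for the
        # tempered alpha-stable waiting-time distribution used in the paper).
        import numpy as np

        rng = np.random.default_rng(4)
        n_events = 5000
        jumps = rng.lognormal(mean=0.5, sigma=1.0, size=n_events)    # rainfall increments (mm)
        waits = (rng.pareto(a=1.5, size=n_events) + 1.0) * 0.1       # waiting times (days)

        event_times = np.cumsum(waits)
        cumulative_rain = np.cumsum(jumps)

        # Sample the cumulative process on a regular daily grid
        t_grid = np.arange(0.0, event_times[-1], 1.0)
        idx = np.searchsorted(event_times, t_grid, side="right") - 1
        series = np.where(idx >= 0, cumulative_rain[np.clip(idx, 0, None)], 0.0)
        print("record length (days):", len(t_grid), " total rainfall (mm):", round(float(series[-1]), 1))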

  18. Representing Microbial Processes in Environmental Reactive Transport Models

    Science.gov (United States)

    van Cappellen, P.

    2009-04-01

    Microorganisms play a key role in the biogeochemical functioning of the earth's surface and shallow subsurface. In the context of reactive transport modeling, a major challenge is to derive, parameterize, calibrate and verify mathematical expressions for microbially mediated reactions in the environment. This is best achieved by combining field observations, laboratory experiments, theoretical principles and modeling. Here, I will illustrate such an integrated approach for the case of microbial respiration processes in aquatic sediments. Important issues that will be covered include experimental design, model consistency and performance, as well as the bioenergetics and transient behavior of geomicrobial reaction systems.

  19. Representing and managing uncertainty in qualitative ecological models

    NARCIS (Netherlands)

    Nuttle, T.; Bredeweg, B.; Salles, P.; Neumann, M.

    2009-01-01

    Ecologists and decision makers need ways to understand systems, test ideas, and make predictions and explanations about systems. However, uncertainty about causes and effects of processes and parameter values is pervasive in models of ecological systems. Uncertainty associated with incomplete

  20. An experiment in representative ground-water sampling for water- quality analysis

    Science.gov (United States)

    Huntzinger, T.L.; Stullken, L.E.

    1988-01-01

    Obtaining a sample of groundwater that accurately represents the concentration of a chemical constituent in an aquifer is an important aspect of groundwater-quality studies. Varying aquifer and constituent properties may cause chemical constituents to move within selectively separate parts of the aquifer. An experiment was conducted in an agricultural region in south-central Kansas to address questions related to representative sample collection. Concentrations of selected constituents in samples taken from observation wells completed in the upper part of the aquifer were compared to concentrations in samples taken from irrigation wells to determine if there was a significant difference. Water in all wells sampled was a calcium bicarbonate type with more than 200 mg/L hardness and about 200 mg/L alkalinity. Sodium concentrations were also quite large (about 40 mg/L). There was a significant difference in the nitrite-plus-nitrate concentrations between samples from observation and irrigation wells. The median concentration of nitrite plus nitrate in water from observation wells was 5.7 mg/L compared to 3.4 mg/L in water from irrigation wells. The differences in concentrations of calcium, magnesium, and sodium (larger in water from irrigation wells) were significant at the 78% confidence level but not at the 97% confidence level. Concentrations of the herbicide, atrazine, were less than the detection limit of 0.1 micrograms/L in all but one well. (USGS)

  1. A Topic Model Approach to Representing and Classifying Football Plays

    KAUST Repository

    Varadarajan, Jagannadan

    2013-09-09

    We address the problem of modeling and classifying American Football offense teams' plays in video, a challenging example of group activity analysis. Automatic play classification will allow coaches to infer patterns and tendencies of opponents more efficiently, resulting in better strategy planning in a game. We define a football play as a unique combination of player trajectories. To this end, we develop a framework that uses player trajectories as inputs to MedLDA, a supervised topic model. The joint maximization of both likelihood and inter-class margins of MedLDA in learning the topics allows us to learn semantically meaningful play type templates, as well as classify different play types with 70% average accuracy. Furthermore, this method is extended to analyze individual player roles in classifying each play type. We validate our method on a large dataset comprising 271 play clips from real-world football games, which will be made publicly available for future comparisons.

  2. Representing spatial information in a computational model for network management

    Science.gov (United States)

    Blaisdell, James H.; Brownfield, Thomas F.

    1994-01-01

    While currently available relational database management systems (RDBMS) allow inclusion of spatial information in a data model, they lack tools for presenting this information in an easily comprehensible form. Computer-aided design (CAD) software packages provide adequate functions to produce drawings, but still require manual placement of symbols and features. This project has demonstrated a bridge between the data model of an RDBMS and the graphic display of a CAD system. It is shown that the CAD system can be used to control the selection of data with spatial components from the database and then quickly plot that data on a map display. It is shown that the CAD system can be used to extract data from a drawing and then control the insertion of that data into the database. These demonstrations were successful in a test environment that incorporated many features of known working environments, suggesting that the techniques developed could be adapted for practical use.

  3. Model and observed seismicity represented in a two dimensional space

    Directory of Open Access Journals (Sweden)

    M. Caputo

    1976-06-01

    Full Text Available In recent years theoretical seismology has introduced some formulae relating the magnitude and the seismic moment of earthquakes to the size of the fault and the stress drop which generated the earthquake. In the present paper we introduce a model for the statistics of earthquakes based on these formulae. The model gives formulae which show internal consistency and are also confirmed by observations. For intermediate magnitudes the formulae also reproduce the linear trend of the magnitude and moment statistics observed in all the seismic regions of the world. This linear trend changes into a curve with increasing slope for large magnitudes and moments. When a catalogue of the magnitudes and/or the seismic moments of the earthquakes of a seismic region is available, the model allows estimation of the maximum magnitude possible in the region.

  4. Physically representative atomistic modeling of atomic-scale friction

    Science.gov (United States)

    Dong, Yalin

    Nanotribology is a research field that studies friction, adhesion, wear and lubrication occurring between two sliding interfaces at the nanoscale. This study is motivated by the demand for miniaturized mechanical components in Micro Electro Mechanical Systems (MEMS), improved durability in magnetic storage systems, and other industrial applications. Overcoming tribological failure and finding ways to control friction at small scales have become key to commercializing MEMS with sliding components as well as to stimulating the technological innovation associated with the development of MEMS. In addition to the industrial applications, such research is also scientifically fascinating because it opens a door to understanding macroscopic friction from the atomic level up, and therefore serves as a bridge between science and engineering. This thesis focuses on solid/solid atomic friction and its associated energy dissipation through theoretical analysis, atomistic simulation, transition state theory, and close collaboration with experimentalists. Reduced-order models have many advantages owing to their simplicity and their capacity to simulate long-time events. We apply Prandtl-Tomlinson models and their extensions to interpret dry atomic-scale friction. We begin with the fundamental equations and build on them step-by-step from the simple quasistatic one-spring, one-mass model for predicting transitions between friction regimes to the two-dimensional and multi-atom models for describing the effect of contact area. Theoretical analysis, numerical implementation, and predicted physical phenomena are all discussed. In the process, we demonstrate the significant potential for this approach to yield new fundamental understanding of atomic-scale friction. Atomistic modeling can never be overemphasized in the investigation of atomic friction, in which each single atom could play a significant role but is hard to capture experimentally. In atomic friction, the
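
    A minimal numerical sketch of the quasistatic/overdamped one-spring, one-mass Prandtl-Tomlinson model mentioned above: a tip coupled by a spring to a support moving at constant velocity over a sinusoidal substrate potential. The parameter values are illustrative and are not taken from the thesis.

```python
# Minimal sketch of the one-dimensional, overdamped Prandtl-Tomlinson model:
# a tip dragged by a spring over a sinusoidal substrate potential.
# Parameter values are illustrative, not taken from the thesis.
import numpy as np

U0 = 4.0e-20      # corrugation amplitude (J), roughly 0.25 eV
a = 2.5e-10       # lattice constant (m)
k = 5.0           # spring stiffness (N/m)
gamma = 2.0e-5    # damping coefficient (kg/s)
v = 1.0e-6        # support velocity (m/s)

dt = 1.0e-7       # time step (s), small enough for stable explicit integration
n_steps = 200000
x = 0.0
forces = []

for i in range(n_steps):
    t = i * dt
    # force on the tip: substrate corrugation + pulling spring
    f_sub = -(2.0 * np.pi * U0 / a) * np.sin(2.0 * np.pi * x / a)
    f_spring = k * (v * t - x)
    x += dt * (f_sub + f_spring) / gamma      # overdamped (no inertia) update
    forces.append(f_spring)                   # lateral force measured at the spring

forces = np.array(forces)
print("mean friction force (nN):", 1e9 * forces[n_steps // 2:].mean())
```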

  5. Manipulating Models and Grasping the Ideas They Represent

    Science.gov (United States)

    Bryce, T. G. K.; Blown, E. J.

    2016-03-01

    This article notes the convergence of recent thinking in neuroscience and grounded cognition regarding the way we understand mental representation and recollection: ideas are dynamic and multi-modal, actively created at the point of recall. Also, neurophysiologically, re-entrant signalling among cortical circuits allows non-conscious processing to support our deliberative thoughts and actions. The qualitative research we describe examines the exchanges occurring during semi-structured interviews with 360 children age 3-13, including 294 from New Zealand (158 boys, 136 girls) and 66 from China (34 boys, 32 girls) concerning their understanding of the shape and motion of the Earth, Sun and Moon (ESM). We look closely at the relationships between what is revealed as children manipulate their own play-dough models and their apparent understandings of ESM concepts. In particular, we focus on the switching taking place between what is said, what is drawn and what is modelled. The evidence is supportive of Edelman's view that memory is non-representational and that concepts are the outcome of perceptual mappings, a view which is also in accord with Barsalou's notion that concepts are simulators or skills which operate consistently across several modalities. Quantitative data indicate that the dynamic structure of memory/concept creation is similar in both genders and common to the cultures/ethnicities compared (New Zealand European and Māori; Chinese Han) and that repeated interviews in this longitudinal research lead to more advanced modelling skills and/or more advanced shape and motion concepts, the results supporting hypotheses (Kolmogorov-Smirnov alpha levels .05; r_s: p < .001).

  6. Studying Effective Factors on Corporate Entrepreneurship: Representing a Model

    Directory of Open Access Journals (Sweden)

    Maryam Soleimani

    2013-02-01

    Full Text Available Development and advancement of current organizations depends considerably on Corporate Entrepreneurship (CE) and its antecedents. Therefore, the purpose of this survey is to study factors affecting corporate entrepreneurship (personal characteristics of entrepreneurship, human resource practices, organizational culture and employees' satisfaction). This survey was conducted using a descriptive field methodology. The statistical population included managers and experts of Hexa Consulting Engineers Company (Tehran/Iran), and the sample consisted of forty-seven of them. A questionnaire was the data collection tool. Data were collected in cross-sectional form in July-August 2011. Descriptive and inferential (Spearman correlation) statistical methods were used for data analysis. According to the results, there is a positive significant relationship between each of the factors (personal characteristics of entrepreneurship, human resource practices, organizational culture and employees' satisfaction) and corporate entrepreneurship. In other words, the proposed variables as factors affecting corporate entrepreneurship were confirmed in the conceptual model of the survey.

  7. Modeling and Representing National Climate Assessment Information using Linked Data

    Science.gov (United States)

    Zheng, J.; Tilmes, C.; Smith, A.; Zednik, S.; Fox, P. A.

    2012-12-01

    Every four years, earth scientists work together on a National Climate Assessment (NCA) report which integrates, evaluates, and interprets the findings of climate change and its impacts on affected industries such as agriculture, the natural environment, energy production and use, etc. Given the amount of information presented in each report, and the wide range of information sources and topics, it can be difficult for users to find and identify desired information. To ease the user effort of information discovery, well-structured metadata is needed that describes the report's key statements and conclusions and provides traceable provenance of the data sources used. We present an assessment ontology developed to describe the terms, concepts and relations required for the NCA metadata. Wherever possible, the assessment ontology reuses terms from well-known vocabularies such as the Semantic Web for Earth and Environmental Terminology (SWEET) ontology and the Dublin Core (DC) vocabulary. We have generated sample National Climate Assessment metadata conforming to our assessment ontology and exposed it publicly via a SPARQL endpoint and website. We have also modeled provenance information for the NCA writing activities using the W3C candidate-recommendation PROV-O ontology. Using this provenance, the user will be able to trace the sources of information used in the assessment and therefore make trust decisions. In the future, we are planning to implement a faceted browser over the metadata to enhance metadata traversal and information discovery.
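
    A small sketch of how one assessment finding and its provenance might be expressed as RDF triples with rdflib. The example URIs, the finding, and the placeholder namespace are hypothetical; only the PROV-O and Dublin Core terms are standard vocabularies, and the actual NCA assessment ontology is not reproduced here.

```python
# Sketch of encoding assessment metadata and provenance as RDF triples with rdflib.
# The URIs, the example finding, and the placeholder namespace are hypothetical;
# only the PROV-O and Dublin Core terms are standard vocabularies.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, DCTERMS

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/nca#")        # placeholder namespace

g = Graph()
g.bind("prov", PROV)
g.bind("dcterms", DCTERMS)

finding = EX["finding-sea-level-rise"]
chapter = EX["chapter-coastal-impacts"]
dataset = EX["tide-gauge-dataset"]

g.add((finding, RDF.type, PROV.Entity))
g.add((finding, DCTERMS.title, Literal("Example key finding on sea level rise")))
g.add((finding, DCTERMS.isPartOf, chapter))
g.add((finding, PROV.wasDerivedFrom, dataset))   # traceable data source

print(g.serialize(format="turtle"))
```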

  8. Molecular Simulation towards Efficient and Representative Subsurface Reservoirs Modeling

    KAUST Repository

    Kadoura, Ahmad

    2016-09-01

    This dissertation focuses on the application of Monte Carlo (MC) molecular simulation and Molecular Dynamics (MD) in modeling the thermodynamics and flow of subsurface reservoir fluids. At first, MC molecular simulation is proposed as a promising method to replace correlations and equations of state in subsurface flow simulators. In order to accelerate MC simulations, a set of early rejection schemes (conservative, hybrid, and non-conservative) in addition to extrapolation methods through reweighting and reconstruction of pre-generated MC Markov chains were developed. Furthermore, an extensive study was conducted to investigate sorption and transport processes of methane, carbon dioxide, water, and their mixtures in the inorganic part of shale using both MC and MD simulations. These simulations covered a wide range of thermodynamic conditions, pore sizes, and fluid compositions, shedding light on several interesting findings, for example the possibility of adsorbing more carbon dioxide at higher preadsorbed water concentrations at relatively large basal spacings. The dissertation is divided into four chapters. The first chapter corresponds to the introductory part, where a brief background about molecular simulation and motivations are given. The second chapter is devoted to discussing the theoretical aspects and methodology of the proposed MC speed-up techniques, in addition to the corresponding results, leading to the successful multi-scale simulation of the compressible single-phase flow scenario. In chapter 3, the results of our extensive study on shale gas at laboratory conditions are reported. In the fourth and last chapter, we end the dissertation with a few concluding remarks highlighting the key findings and summarizing the future directions.

  9. Surface Flux Modeling for Air Quality Applications

    Directory of Open Access Journals (Sweden)

    Limei Ran

    2011-08-01

    Full Text Available For many gasses and aerosols, dry deposition is an important sink of atmospheric mass. Dry deposition fluxes are also important sources of pollutants to terrestrial and aquatic ecosystems. The surface fluxes of some gases, such as ammonia, mercury, and certain volatile organic compounds, can be upward into the air as well as downward to the surface and therefore should be modeled as bi-directional fluxes. Model parameterizations of dry deposition in air quality models have been represented by simple electrical resistance analogs for almost 30 years. Uncertainties in surface flux modeling in global to mesoscale models are being slowly reduced as more field measurements provide constraints on parameterizations. However, at the same time, more chemical species are being added to surface flux models as air quality models are expanded to include more complex chemistry and are being applied to a wider array of environmental issues. Since surface flux measurements of many of these chemicals are still lacking, resistances are usually parameterized using simple scaling by water or lipid solubility and reactivity. Advances in recent years have included bi-directional flux algorithms that require a shift from pre-computation of deposition velocities to fully integrated surface flux calculations within air quality models. Improved modeling of the stomatal component of chemical surface fluxes has resulted from improved evapotranspiration modeling in land surface models and closer integration between meteorology and air quality models. Satellite-derived land use characterization and vegetation products and indices are improving model representation of spatial and temporal variations in surface flux processes. This review describes the current state of chemical dry deposition modeling, recent progress in bi-directional flux modeling, synergistic model development research with field measurements, and coupling with meteorological land surface models.
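
    The resistance analogy mentioned in the review reduces, in its simplest form, to a deposition velocity computed from aerodynamic, quasi-laminar and surface resistances in series. The sketch below shows that calculation with illustrative resistance values; real parameterizations vary by chemical species, land use and meteorology.

```python
# Minimal sketch of the electrical resistance analogy mentioned in the review:
# dry deposition velocity as the inverse of aerodynamic, quasi-laminar boundary
# layer, and surface (canopy) resistances in series. Values are illustrative.
def deposition_velocity(r_a, r_b, r_c):
    """Return dry deposition velocity (m/s) for resistances given in s/m."""
    return 1.0 / (r_a + r_b + r_c)

# Example: moderately unstable daytime conditions over a crop canopy (assumed values)
r_a = 30.0    # aerodynamic resistance, s/m
r_b = 15.0    # quasi-laminar sublayer resistance, s/m
r_c = 100.0   # bulk surface/canopy resistance (stomatal + cuticular + soil), s/m

v_d = deposition_velocity(r_a, r_b, r_c)
flux = -v_d * 10.0    # downward flux for a 10 ug/m3 ambient concentration (ug m-2 s-1)
print(f"v_d = {v_d*100:.2f} cm/s, flux = {flux:.3f} ug m-2 s-1")
```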

  10. Using rain-on-snow events to evaluate the quality of bias correction to represent complex inter-variable dependencies

    Science.gov (United States)

    Rössler, Ole; Bosshard, Thomas; Weingartner, Rolf

    2016-04-01

    the time period of 1990-2010. The hydrological model performance in terms of rain-on-snow representation indicates some room for improvement. Still, the comparison of the different climate-data-driven model runs revealed an overestimation of the occurrence frequency of rain-on-snow events when applying ERA.RCA4.5, which can partly be corrected using DBS. The study finally discusses the potential of this framework, based on a new indicator, to evaluate the quality of representing complex inter-variable dependencies in future downscaling technique validations.

  11. Comparison of Statistical Multifragmentation Model simulations with Canonical Thermodynamical Model results: a few representative cases

    CERN Document Server

    Botvina, A; Gupta, S Das; Mishustin, I

    2008-01-01

    The statistical multifragmentation model (SMM) has been widely used to explain experimental data of intermediate energy heavy ion collisions. A later entrant in the field is the canonical thermodynamic model (CTM) which is also being used to fit experimental data. The basic physics of both the models is the same, namely that fragments are produced according to their statistical weights in the available phase space. However, they are based on different statistical ensembles, and the methods of calculation are different: while the SMM uses Monte-Carlo simulations, the CTM solves recursion relations. In this paper we compare the predictions of the two models for a few representative cases.

  12. A box model for representing estuarine physical processes in Earth system models

    Science.gov (United States)

    Sun, Qiang; Whitney, Michael M.; Bryan, Frank O.; Tseng, Yu-heng

    2017-04-01

    Appropriately treating riverine freshwater discharge into the oceans in Earth system models is a challenging problem. Commonly, the river runoff is discharged into the ocean models with zero salinity and arbitrarily distributed either horizontally or vertically over several grid cells. Those approaches entirely neglect estuarine physical processes that modify river inputs before they reach the open ocean. In order to realistically represent riverine freshwater inputs in Earth system models, a physically based Estuary Box Model (EBM) is developed to parameterize the mixing processes in estuaries. The EBM represents the estuary exchange circulation with a two-layer box structure. It takes as input the river volume flux from the land surface model and the subsurface salinity at the estuary mouth from the ocean model. It delivers the estuarine outflow salinity and net volume flux into and out of the estuary to the ocean model. An offline test of the EBM forced with observed conditions for the Columbia River system shows good agreement with observations of outflow salinity and high-resolution simulations of the exchange flow volume flux. To illustrate the practicality of use of the EBM in an Earth system model, the EBM is implemented for all coastal grid cells with river runoff in the Community Earth System Model (CESM). Compared to the standard version of CESM, which treats runoff as an augmentation to precipitation, the EBM increases sea surface salinity and reduces stratification near river mouths. The EBM also leads to significant regional and remote changes in CESM ocean surface salinities.
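
    The EBM's actual exchange-flow parameterization is not given in the abstract, but the bookkeeping such a box model performs can be illustrated with the classical two-layer Knudsen volume and salt balances. The sketch below is a hedged stand-in with assumed, loosely Columbia-River-like numbers, not the EBM itself.

```python
# Hedged sketch of a two-layer estuary box balance (Knudsen relations).
# This illustrates the kind of volume/salt bookkeeping an Estuary Box Model does;
# it is NOT the EBM's actual parameterization of the exchange flow, which the
# abstract does not give. Numbers are loosely Columbia-River-like but assumed.
def estuary_outflow(q_river, q_in, s_ocean):
    """Steady two-layer balance: returns (Q_out, S_out).

    q_river : river volume flux into the upper layer (m3/s), salinity 0
    q_in    : lower-layer inflow from the shelf (m3/s) at salinity s_ocean (psu)
    """
    q_out = q_river + q_in              # volume conservation
    s_out = q_in * s_ocean / q_out      # salt conservation (river salinity = 0)
    return q_out, s_out

q_out, s_out = estuary_outflow(q_river=7000.0, q_in=10000.0, s_ocean=32.0)
print(f"Q_out = {q_out:.0f} m3/s, S_out = {s_out:.1f} psu")
```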

  13. Model quality and safety studies

    DEFF Research Database (Denmark)

    Petersen, K.E.

    1997-01-01

    The paper describes the EC initiative on model quality assessment and emphasizes some of the problems encountered in the selection of data from field tests used in the evaluation process. Further, it discusses the impact of model uncertainties in safety studies of industrial plants. The model...... that most of these have never been through a procedure of evaluation, but nonetheless are used to assist in making decisions that may directly affect the safety of the public and the environment. As a major funder of European research on major industrial hazards, DGXII is conscious of the importance......-tain model is appropriate for use in solving a given problem. Further, the findings from the REDIPHEM project related to dense gas dispersion will be highlighted. Finally, the paper will discuss the need for model quality assessment in safety studies....

  14. Representing ozone extremes in European megacities: the importance of resolution in a global chemistry climate model

    Directory of Open Access Journals (Sweden)

    Z. S. Stock

    2013-10-01

    Full Text Available The continuing growth of the world's urban population has led to an increasing number of cities with more than 10 million inhabitants. The higher emissions of pollutants, coupled with higher population density, make predictions of air quality in these megacities of particular importance from both a science and a policy perspective. Global climate models are typically run at coarse resolution to enable both the efficient running of long time integrations, and the ability to run multiple future climate scenarios. However, when considering surface ozone concentrations at the local scale, coarse resolution can lead to inaccuracies arising from the highly non-linear ozone chemistry and the sensitivity of ozone to the distribution of its precursors on smaller scales. In this study, we use UM-UKCA, a global atmospheric chemistry model, coupled to the UK Met Office Unified Model, to investigate the impact of model resolution on tropospheric ozone, ranging from global to local scales. We focus on the model's ability to represent the probability of high ozone concentrations in the summer and low ozone concentrations, associated with polluted megacity environments, in the winter, and how this varies with horizontal resolution. We perform time-slice integrations with two model configurations at typical climate resolution (CR, ~150 km) and at a higher resolution (HR, ~40 km). The CR configuration leads to overestimation of ozone concentrations on both regional and local scales, while it gives broadly similar results to the HR configuration on the global scale. The HR configuration is found to produce a more realistic diurnal cycle of ozone concentrations and to give a better representation of the probability density function of ozone values in urban areas such as the megacities of London and Paris. We discuss the possible causes for the observed difference in model behaviour between CR and HR configurations and estimate the relative contribution of chemical and

  15. Healthcare quality maturity assessment model based on quality drivers.

    Science.gov (United States)

    Ramadan, Nadia; Arafeh, Mazen

    2016-04-18

    Purpose - Healthcare providers differ in their readiness and maturity levels regarding quality and quality management systems applications. The purpose of this paper is to serve as a useful quantitative quality maturity-level assessment tool for healthcare organizations. Design/methodology/approach - The model proposes five quality maturity levels (chaotic, primitive, structured, mature and proficient) based on six quality drivers: top management, people, operations, culture, quality focus and accreditation. Findings - Healthcare managers can apply the model to identify the status quo, quality shortcomings and evaluating ongoing progress. Practical implications - The model has been incorporated in an interactive Excel worksheet that visually displays the quality maturity-level risk meter. The tool has been applied successfully to local hospitals. Originality/value - The proposed six quality driver scales appear to measure healthcare provider maturity levels on a single quality meter.

  16. River water quality modelling: II

    DEFF Research Database (Denmark)

    Shanahan, P.; Henze, Mogens; Koncsos, L.

    1998-01-01

    The U.S. EPA QUAL2E model is currently the standard for river water quality modelling. While QUAL2E is adequate for the regulatory situation for which it was developed (the U.S. wasteload allocation process), there is a need for a more comprehensive framework for research and teaching. Moreover......, and to achieve robust model calibration. Mass balance problems arise from failure to account for mass in the sediment as well as in the water column and due to the fundamental imprecision of BOD as a state variable. (C) 1998 IAWQ Published by Elsevier Science Ltd. All rights reserved....

  17. Uncertainty in Air Quality Modeling.

    Science.gov (United States)

    Fox, Douglas G.

    1984-01-01

    Under the direction of the AMS Steering Committee for the EPA Cooperative Agreement on Air Quality Modeling, a small group of scientists convened to consider the question of uncertainty in air quality modeling. Because the group was particularly concerned with the regulatory use of models, its discussion focused on modeling tall stack, point source emissions. The group agreed that air quality model results should be viewed as containing both reducible error and inherent uncertainty. Reducible error results from improper or inadequate meteorological and air quality data inputs, and from inadequacies in the models. Inherent uncertainty results from the basic stochastic nature of the turbulent atmospheric motions that are responsible for transport and diffusion of released materials. Modelers should acknowledge that all their predictions to date contain some associated uncertainty and strive also to quantify uncertainty. How can the uncertainty be quantified? There was no consensus from the group as to precisely how uncertainty should be calculated. One subgroup, which addressed statistical procedures, suggested that uncertainty information could be obtained from comparisons of observations and predictions. Following recommendations from a previous AMS workshop on performance evaluation (Fox, 1981), the subgroup suggested construction of probability distribution functions from the differences between observations and predictions. Further, they recommended that relatively new computer-intensive statistical procedures be considered to improve the quality of uncertainty estimates for the extreme value statistics of interest in regulatory applications. A second subgroup, which addressed the basic nature of uncertainty in a stochastic system, also recommended that uncertainty be quantified by consideration of the differences between observations and predictions. They suggested that the average of the difference squared was appropriate to isolate the inherent uncertainty that
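
    A minimal sketch of the subgroup's suggestion to characterize uncertainty from observation-minus-prediction differences: build the empirical distribution of residuals and summary statistics such as the mean squared difference. The data below are synthetic.

```python
# Sketch of the subgroup's suggestion: characterize model uncertainty from the
# distribution of observation-minus-prediction differences. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
observed = rng.lognormal(mean=3.0, sigma=0.5, size=500)            # e.g. ug/m3
predicted = observed * rng.lognormal(mean=0.0, sigma=0.3, size=500)

residuals = observed - predicted
mse = np.mean(residuals ** 2)            # mean squared difference
quantiles = np.percentile(residuals, [5, 25, 50, 75, 95])

print(f"RMSE = {np.sqrt(mse):.2f}")
print("residual quantiles (5,25,50,75,95%):", np.round(quantiles, 2))
```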

  18. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    Science.gov (United States)

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…

  20. Do air quality targets really represent safe limits for lung cancer risk?

    Science.gov (United States)

    Buonanno, G; Stabile, L; Morawska, L; Giovinco, G; Querol, X

    2017-02-15

    In order to estimate the lung cancer risk associated with airborne particles, exposure and risk-assessment studies ordinarily use particle mass concentration as the dosimetry parameter. Consequently, the corresponding air quality targets are based on this metric, neglecting the potential impact of ultrafine particles (UFPs) due to their negligible mass. The main purpose of this study was to evaluate the reliability of air quality targets in protecting Italian non-smoking people from the lung cancer risk due to exposure to polycyclic aromatic hydrocarbons and some heavy metals associated with particle inhalation. A modified risk-assessment scheme was applied to estimate the cancer risk contribution from both sub-micron (mainly UFPs) and super-micron particles. We found a very high lung cancer risk related to the actual target levels due to the contribution of UFPs, in particular from indoor microenvironments. Therefore, as possible actions to reduce the lung cancer risk, we have hypothesized and tested three different scenarios: a) a reduction of the concentration of carcinogenic chemicals condensed onto particles in agreement with the current EU air pollution policy; b) the use of local ventilation systems to mitigate the exposure to cooking-generated particles; c) the improvement of the overall indoor air quality by considering a mechanical ventilation system instead of the widespread natural ventilation in order to increase the air exchange rates. Even with the simultaneous application of specific actions, performed with the best technologies available, the corresponding estimated lifetime lung cancer risk (ELCR) values for the Italian population over the entire lifetime were equal to 1.25×10⁻⁴ and 1.23×10⁻⁴ for males and females, respectively, well above the maximum tolerable lifetime cancer risk of 1×10⁻⁵.

  1. The EPANET water quality model

    Energy Technology Data Exchange (ETDEWEB)

    Rossman, L.A. [Environmental Protection Agency, Cincinnati, OH (United States)

    1995-10-01

    EPANET is a software package developed by US EPA's Drinking Water Research Division for modeling hydraulic and water quality behavior within water distribution systems. Starting with a geometric description of the pipe network, a set of initial conditions, estimates of water usage, and a set of rules for how the system is operated, EPANET predicts all flows, pressures, and water quality levels throughout the network during an extended period of operation. In addition to substance concentration, water age and source tracing can also be simulated. EPANET offers a number of advanced features including: modular, highly portable C language code with no pre-set limits on network size; a simple data input format based on a problem oriented language; a full-featured hydraulic simulator; improved water quality algorithms; analysis of water quality reactions both within the bulk flow and at the pipe wall; an optional graphical user interface running under Microsoft® Windows™. The Windows user interface allows one to edit EPANET input files, run a simulation, and view the results all within a single program. Simulation output can be visualized through: color-coded maps of the distribution system with full zooming, panning and labeling capabilities and a slider control to move forward or backward through time; spreadsheet-like tables that can be searched for entries meeting a specified criterion; and time series graphs of both predicted and observed values for any variable at any location in the network. EPANET is currently being used to analyze a number of water quality issues in different distribution systems across the country. These include: chlorine decay dynamics, raw water source blending, altered tank operation, and integration with real-time monitoring and control systems.
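
    As a simple illustration of the bulk-flow reaction kinetics behind issues such as chlorine decay, the sketch below evaluates first-order decay over the travel time of a single pipe. The decay coefficient and pipe properties are assumed values, and the snippet does not use EPANET itself.

```python
# Sketch of the kind of bulk-flow reaction EPANET's water quality engine solves:
# first-order chlorine decay along a pipe, evaluated at the pipe outlet.
# The decay coefficient, pipe dimensions, and flow are illustrative values,
# not EPANET defaults.
import math

def outlet_concentration(c_in, k_bulk, length, velocity):
    """First-order decay C(t) = C_in * exp(-k t) over the pipe travel time."""
    travel_time = length / velocity           # seconds
    return c_in * math.exp(-k_bulk * travel_time)

c_in = 1.0            # inlet chlorine, mg/L
k_bulk = 0.5 / 86400  # bulk decay rate, 0.5 per day expressed in 1/s
length = 1500.0       # pipe length, m
velocity = 0.3        # flow velocity, m/s

print(f"outlet chlorine = {outlet_concentration(c_in, k_bulk, length, velocity):.3f} mg/L")
```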

  2. Developing a TQM quality management method model

    OpenAIRE

    Zhang, Zhihai

    1997-01-01

    From an extensive review of total quality management literature, the external and internal environment affecting an organization's quality performance and the eleven primary elements of TQM are identified. Based on the primary TQM elements, a TQM quality management method model is developed. This model describes the primary quality management methods which may be used to assess an organization's present strengths and weaknesses with regard to its use of quality management methods. This model ...

  3. Representing humans in system security models: An actor-network approach

    NARCIS (Netherlands)

    Pieters, Wolter

    2011-01-01

    System models to assess the vulnerability of information systems to security threats typically represent a physical infrastructure (buildings) and a digital infrastructure (computers and networks), in combination with an attacker traversing the system while acquiring credentials. Other humans are ge

  6. Use of CFD modeling for estimating spatial representativeness of urban air pollution monitoring sites and suitability of their locations

    Energy Technology Data Exchange (ETDEWEB)

    Santiago, J.L.; Martin, F.

    2015-07-01

    A methodology to estimate the spatial representativeness of air pollution monitoring sites is applied to two urban districts. This methodology is based on high resolution maps of air pollution computed by using Computational Fluid Dynamics (CFD) modelling tools. Traffic-emitted NO2 dispersion is simulated for several meteorological conditions taking into account the effect of the buildings on air flow and pollutant dispersion and using a steady state CFD-RANS approach. From these results, maps of average pollutant concentrations for January–May 2011 are computed as a combination of the simulated scenarios. Two urban districts of Madrid City were simulated. Spatial representativeness areas for 32 different sites within the same district (including the sites of the operative air quality stations) have been estimated by computing the portion of the domains with average NO2 concentration differing by less than 20% from the concentration at each candidate monitoring site. New parameters, such as the ratio AR between the representativeness area and the whole domain area and the representativeness index (IR), have been proposed to discuss and compare the representativeness areas. Significant differences between the spatial representativeness of the candidate sites of the two studied districts have been found. The sites of the Escuelas Aguirre district generally have smaller representativeness areas than those of Plaza de Castilla. More stations are needed to cover the Escuelas Aguirre district than the Plaza de Castilla one. The operative air quality station of the Escuelas Aguirre district is less representative than the station of the Plaza de Castilla district. The cause of these differences seems to be the differences in the urban structure of the two districts, which leads to different ventilation. (Author)
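
    The representativeness-area calculation described above can be sketched as the fraction of grid cells whose period-average concentration lies within 20% of the value at a candidate site. The concentration field below is synthetic; in the study it comes from the CFD-RANS simulations.

```python
# Sketch of the representativeness-area calculation described in the abstract:
# the fraction of the model domain whose period-average NO2 concentration differs
# from the concentration at a candidate site by less than 20%. The concentration
# field here is synthetic; in the study it comes from CFD-RANS simulations.
import numpy as np

rng = np.random.default_rng(2)
conc = rng.gamma(shape=4.0, scale=10.0, size=(200, 200))   # fake NO2 map, ug/m3

site_ij = (100, 100)                 # grid indices of a candidate monitoring site
c_site = conc[site_ij]

within_20pct = np.abs(conc - c_site) / c_site < 0.20
area_ratio = within_20pct.mean()     # AR: representativeness area / domain area

print(f"site concentration = {c_site:.1f} ug/m3, AR = {area_ratio:.2f}")
```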

  7. Quality of Malaria Case Management in Malawi: Results from a Nationally Representative Health Facility Survey

    Science.gov (United States)

    Steinhardt, Laura C.; Chinkhumba, Jobiba; Wolkon, Adam; Luka, Madalitso; Luhanga, Misheck; Sande, John; Oyugi, Jessica; Ali, Doreen; Mathanga, Don; Skarbinski, Jacek

    2014-01-01

    Background Malaria is endemic throughout Malawi, but little is known about quality of malaria case management at publicly-funded health facilities, which are the major source of care for febrile patients. Methods In April–May 2011, we conducted a nationwide, geographically-stratified health facility survey to assess the quality of outpatient malaria diagnosis and treatment. We enrolled patients presenting for care and conducted exit interviews and re-examinations, including reference blood smears. Moreover, we assessed health worker readiness (e.g., training, supervision) and health facility capacity (e.g. availability of diagnostics and antimalarials) to provide malaria case management. All analyses accounted for clustering and unequal selection probabilities. We also used survey weights to produce estimates of national caseloads. Results At the 107 facilities surveyed, most of the 136 health workers interviewed (83%) had received training on malaria case management. However, only 24% of facilities had functional microscopy, 15% lacked a thermometer, and 19% did not have the first-line artemisinin-based combination therapy (ACT), artemether-lumefantrine, in stock. Of 2,019 participating patients, 34% had clinical malaria (measured fever or self-reported history of fever plus a positive reference blood smear). Only 67% (95% confidence interval (CI): 59%, 76%) of patients with malaria were correctly prescribed an ACT, primarily due to missed malaria diagnosis. Among patients without clinical malaria, 31% (95% CI: 24%, 39%) were prescribed an ACT. By our estimates, 1.5 million of the 4.4 million malaria patients seen in public facilities annually did not receive correct treatment, and 2.7 million patients without clinical malaria were inappropriately given an ACT. Conclusions Malawi has a high burden of uncomplicated malaria but nearly one-third of all patients receive incorrect malaria treatment, including under- and over-treatment. To improve malaria case

  9. Hydrograms for representing groundwater quality and contamination; Hidrogramas para la representacion de la calidad y contaminacion de las aguas subterraneas

    Energy Technology Data Exchange (ETDEWEB)

    Queralt, R. [Dept. Medi Ambient, Generalitat de Catalunya (Spain)

    2000-07-01

    A new groundwater hydrogram called Roda is defined. It represents the quality and contamination of the water in the form of a wheel, or clock, in which the circumference is equivalent to the legal or defined concentration limit for each of the radii corresponding to the 12 parameters involved: chlorides, sulphates, bicarbonates, nitrates, manganese, TOC, iron, potassium, magnesium, calcium, sodium and conductivity. Its practical application to the groundwater in various Catalan aquifers is reported. This involved a trial with a graphic representation hydrogram that complements already existing indices such as the ISQA for physicochemical quality, the BMWPC for biological quality and the star system for inshore seawater. It is hoped to devise a simpler representation system than RODA in the form of an index or equivalent. (Author) 10 refs.
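
    A hedged sketch of how such a wheel could be drawn: each parameter is normalized by its legal or defined limit and plotted on its own radius, with the unit circle marking the limit. The parameter values and limits below are illustrative and are not the thresholds used in the Roda hydrogram.

```python
# Sketch of a Roda-style wheel: each of the 12 parameters is drawn on its own
# radius, normalized by its legal or defined limit so that the unit circle marks
# the limit. Parameter values and limits below are illustrative only.
import numpy as np
import matplotlib.pyplot as plt

params = ["Cl", "SO4", "HCO3", "NO3", "Mn", "TOC",
          "Fe", "K", "Mg", "Ca", "Na", "Cond."]
values = np.array([180, 220, 310, 35, 0.02, 3.0, 0.1, 8, 40, 140, 45, 1100])
limits = np.array([250, 250, 400, 50, 0.05, 5.0, 0.2, 12, 50, 200, 200, 2500])

ratios = values / limits                      # 1.0 means "at the limit"
angles = np.linspace(0, 2 * np.pi, len(params), endpoint=False)

ax = plt.subplot(projection="polar")
ax.plot(np.append(angles, angles[0]), np.append(ratios, ratios[0]), marker="o")
ax.plot(np.linspace(0, 2 * np.pi, 200), np.ones(200), linestyle="--")  # limit circle
ax.set_xticks(angles)
ax.set_xticklabels(params)
ax.set_title("Roda-style groundwater quality wheel (illustrative data)")
plt.show()
```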

  10. BIB-SEM of representative area clay structures paving towards an alternative model of porosity

    Science.gov (United States)

    Desbois, G.; Urai, J. L.; Houben, M.; Hemes, S.; Klaver, J.

    2012-04-01

    A major contribution to understanding the sealing capacity, coupled flow, capillary processes and associated deformation in clay-rich geomaterials is based on detailed investigation of the rock microstructures. However, the direct characterization of pores in representative elementary area (REA) and below µm-scale resolution remains challenging. To investigate directly the mm- to nm-scale porosity, SEM is certainly the most direct approach, but it is limited by the poor quality of the investigated surfaces. The recent development of ion milling tools (BIB and FIB; Desbois et al, 2009, 2011; Heath et al., 2011; Keller et al., 2011) and cryo-SEM allows respectively producing exceptional high quality polished cross-sections suitable for high resolution porosity SEM-imaging at nm-scale and investigating samples under wet conditions by cryogenic stabilization. This contribution focuses mainly on the SEM description of pore microstructures in 2D BIB-polished cross-sections of Boom (Mol site, Belgium) and Opalinus (Mont Terri, Switzerland) clays down to the SEM resolution. Pores detected in images are statistically analyzed to perform porosity quantification in REA. On the one hand, BIB-SEM results allow retrieving MIP measurements obtained from larger sample volumes. On the other hand, the BIB-SEM approach allows characterizing porosity-homogeneous and -predictable islands, which form the elementary components of an alternative concept of porosity/permeability model based on pore microstructures. Desbois G., Urai J.L. and Kukla P.A. (2009) Morphology of the pore space in claystones - evidence from BIB/FIB ion beam sectioning and cryo-SEM observations. E-Earth, 4, 15-22. Desbois G., Urai J.L., Kukla P.A., Konstanty J. and Baerle C. (2011). High-resolution 3D fabric and porosity model in a tight gas sandstone reservoir: a new approach to investigate microstructures from mm- to nm-scale combining argon beam cross-sectioning and SEM imaging . Journal of Petroleum Science

  11. Global modelling of river water quality under climate change

    Science.gov (United States)

    van Vliet, Michelle T. H.; Franssen, Wietse H. P.; Yearsley, John R.

    2017-04-01

    Climate change will pose challenges on the quality of freshwater resources for human use and ecosystems for instance by changing the dilution capacity and by affecting the rate of chemical processes in rivers. Here we assess the impacts of climate change and induced streamflow changes on a selection of water quality parameters for river basins globally. We used the Variable Infiltration Capacity (VIC) model and a newly developed global water quality module for salinity, temperature, dissolved oxygen and biochemical oxygen demand. The modelling framework was validated using observed records of streamflow, water temperature, chloride, electrical conductivity, dissolved oxygen and biochemical oxygen demand for 1981-2010. VIC and the water quality module were then forced with an ensemble of bias-corrected General Circulation Model (GCM) output for the representative concentration pathways RCP2.6 and RCP8.5 to study water quality trends and identify critical regions (hotspots) of water quality deterioration for the 21st century.

  12. Representing hybrid compensatory non-compensatory choice set formation in semi-compensatory models

    DEFF Research Database (Denmark)

    Kaplan, Sigal; Bekhor, Shlomo; Shigtan, Yoram

    2012-01-01

    Semi-compensatory models represent a choice process consisting of an elimination-based choice set formation upon satisfying criteria thresholds and a utility-based choice. Current semi-compensatory models assume a purely non-compensatory choice set formation and hence do not support multinomial c...

  13. Modeling Water Quality in Rivers

    Directory of Open Access Journals (Sweden)

    Liren Yu

    2005-01-01

    Full Text Available This study reports PC software, for a Windows-based environment, developed based on the first-order reaction of Biological Oxygen Demand (BOD) and a modified Streeter and Phelps equation, in order to simulate and determine the variations of Dissolved Oxygen (DO) and of BOD along the studied river reaches. The software considers the impacts of many environmental factors, such as different types of discharges (concentrated or point source, tributary contribution, distributed source), nitrogenous BOD, BOD sedimentation, photosynthetic production and benthic oxygen demand, and so on. The software has been used to model the DO profile along one river, with the aim of improving the water quality through suitable engineering measures.
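
    The classical Streeter-Phelps dissolved-oxygen sag underlying the software can be written down directly; the sketch below evaluates it for illustrative rate constants and initial conditions, without the additional source and sink terms the software includes.

```python
# Sketch of the classical Streeter-Phelps dissolved-oxygen sag on which the
# described software is based (the software adds further source/sink terms not
# reproduced here). Rate constants and initial conditions are illustrative.
import numpy as np

def do_deficit(t, L0, D0, kd, ka):
    """Oxygen deficit D(t) for first-order BOD decay kd and reaeration ka (1/day)."""
    return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
        + D0 * np.exp(-ka * t)

DO_sat = 9.0     # saturation DO, mg/L
L0 = 15.0        # ultimate BOD just below the discharge, mg/L
D0 = 1.0         # initial deficit, mg/L
kd, ka = 0.3, 0.6

t = np.linspace(0.0, 10.0, 101)            # travel time downstream, days
DO = DO_sat - do_deficit(t, L0, D0, kd, ka)

t_crit = t[np.argmin(DO)]
print(f"minimum DO = {DO.min():.2f} mg/L at about {t_crit:.1f} days of travel")
```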

  14. Testing the validity of a Cd soil quality standard in representative Mediterranean agricultural soils under an accumulator crop

    Energy Technology Data Exchange (ETDEWEB)

    Recatala, L., E-mail: luis.recatala@uv.es [Departamento de Planificacion Territorial, Centro de Investigaciones sobre Desertificacion-CIDE (CSIC-Universitat de Valencia-Generalitat Valenciana), Cami de la Marjal S/N, 46470 Albal (Valencia) (Spain); Sanchez, J. [Departamento de Planificacion Territorial, Centro de Investigaciones sobre Desertificacion-CIDE (CSIC-Universitat de Valencia-Generalitat Valenciana), Cami de la Marjal S/N, 46470 Albal (Valencia) (Spain); Arbelo, C. [Departamento de Edafologia y Geologia, Facultad de Biologia, Universidad de La Laguna, 38206 La Laguna (Tenerife), Islas Canarias (Spain); Sacristan, D. [Departamento de Planificacion Territorial, Centro de Investigaciones sobre Desertificacion-CIDE (CSIC-Universitat de Valencia-Generalitat Valenciana), Cami de la Marjal S/N, 46470 Albal (Valencia) (Spain)

    2010-12-01

    The validity of a quality standard for cadmium (Cd) in representative agricultural Mediterranean soils under an accumulator crop (Lactuca sativa L.) is evaluated in this work considering both its effect on crop growth (biomass production) and the metal accumulation in the edible part of the plant. Four soils with different properties relevant to regulating the behaviour of heavy metals were selected from the Valencian Region, a representative area of the European Mediterranean Region. For all soils, the effective concentration of added Cd causing 50% inhibition (EC50) of biomass production was much higher than the minimum legal concentration used to declare soils as contaminated by cadmium in Spain, i.e. 100 times the baseline value for Cd (Spanish Royal Decree 9/2005). As expected, Cd toxicity in the crop was higher in the soils with lower carbonate content. On the other hand, for all soils, from the second dose on, which represents 10 times the baseline value for Cd, the metal content in the crop exceeded the maximum level established for leaf crops by European legislation (Regulation EC no. 466/2001). Soil salinity and coarse textures make the accumulation of Cd in the edible part of the plant easier. Therefore, the legal baseline soil cadmium content established by the Spanish legislation seems valid neither from the point of view of the effect on crop growth nor from the point of view of metal accumulation in the edible part of the plant. In order to realistically declare soils contaminated by heavy metals, soil quality standards should be proposed taking into account the soil properties. Further research in other agricultural areas of the region would improve the basis for proposing adequate soil quality standards for heavy metals, as highlighted by the European Thematic Strategy for Soil Protection.

  15. Attention modeling for video quality assessment

    DEFF Research Database (Denmark)

    You, Junyong; Korhonen, Jari; Perkis, Andrew

    2010-01-01

    averaged spatiotemporal pooling. The local quality is derived from visual attention modeling and quality variations over frames. Saliency, motion, and contrast information are taken into account in modeling visual attention, which is then integrated into IQMs to calculate the local quality of a video frame...

  16. Modeling and Depletion Simulations for a High Flux Isotope Reactor Cycle with a Representative Experiment Loading

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Betzler, Ben [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Hirtz, Gregory John [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Ilas, Germina [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Sunny, Eva [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division

    2016-09-01

    The purpose of this report is to document a high-fidelity VESTA/MCNP High Flux Isotope Reactor (HFIR) core model that features a new, representative experiment loading. This model, which represents the current, high-enriched uranium fuel core, will serve as a reference for low-enriched uranium conversion studies, safety-basis calculations, and other research activities. A new experiment loading model was developed to better represent current, typical experiment loadings, in comparison to the experiment loading included in the model for Cycle 400 (operated in 2004). The new experiment loading model for the flux trap target region includes full length 252Cf production targets, 75Se production capsules, 63Ni production capsules, a 188W production capsule, and various materials irradiation targets. Fully loaded 238Pu production targets are modeled in eleven vertical experiment facilities located in the beryllium reflector. Other changes compared to the Cycle 400 model are the high-fidelity modeling of the fuel element side plates and the material composition of the control elements. Results obtained from the depletion simulations with the new model are presented, with a focus on time-dependent isotopic composition of irradiated fuel and single cycle isotope production metrics.

  17. A Novel Elastographic Frame Quality Indicator and its use in Automatic Representative-Frame Selection from a Cine Loop.

    Science.gov (United States)

    Chintada, Bhaskara Rao; Subramani, Adhitya Vikraman; Raghavan, Bagyam; Thittai, Arun Kumar

    2017-01-01

    This study was aimed at developing a method for automatically selecting a few representative frames from several hundred axial-shear strain elastogram frames typically obtained during freehand compression elastography of the breast in vivo. This may also alleviate some inter-observer variations that arise at least partly because of differences in selection of representative frames from a cine loop for evaluation and feature extraction. In addition to the correlation coefficient and frame-average axial strain that have been previously used as quality indicators for axial strain elastograms, we incorporated the angle of compression, which has unique effects on axial-shear strain elastogram interpretation. These identified quality factors were computed for every frame in the elastographic cine loop. The algorithm identifies the section having N contiguous frames (N = 10) that possess the highest cumulative quality scores from the cine loop as the one containing representative frames. Data for a total of 40 biopsy-proven malignant or benign breast lesions in vivo were part of this study. The performance of the automated algorithm was evaluated by comparing its selection against that by trained radiologists. The observer-identified frame that consisted of a sonogram, axial strain elastogram and axial-shear strain elastogram was compared with the respective images in the frames of the algorithm-identified section using cross-correlation as a similarity measure. It was observed that there was, on average (±standard deviation), 82.2% (±2.2%), 83.4% (±3.8%) and 78.4% (±3.6%) correlation between corresponding images of the observer-selected and algorithm-selected frames, respectively. The results indicate that the automatic frame selection method described here may provide an objective way to select a representative frame while saving time for the radiologist. Furthermore, the frame quality metric described and used here can be displayed in real time as feedback
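
    The selection step described above (scoring every frame and picking the N contiguous frames with the highest cumulative score) can be sketched as a rolling-window maximization. The way the three quality factors are weighted below is an assumption; the abstract does not specify the combination rule.

```python
# Sketch of the representative-frame selection step: per-frame quality factors are
# combined into a score and the N-frame window with the highest cumulative score is
# chosen. The weighting of the three factors is an assumption; the abstract does
# not specify it.
import numpy as np

rng = np.random.default_rng(3)
n_frames, N = 300, 10

correlation = rng.uniform(0.5, 1.0, n_frames)      # inter-frame correlation coeff.
axial_strain = rng.uniform(0.0, 0.03, n_frames)    # frame-average axial strain
angle_penalty = rng.uniform(0.0, 1.0, n_frames)    # 0 = pure axial compression

# Hypothetical per-frame quality score (equal weights, angle acts as a penalty)
score = correlation + axial_strain / 0.03 + (1.0 - angle_penalty)

window_scores = np.convolve(score, np.ones(N), mode="valid")  # cumulative over N frames
start = int(np.argmax(window_scores))
print(f"selected representative frames: {start}..{start + N - 1}")
```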

  18. A general mathematical framework for representing soil organic matter dynamics in biogeochemistry models

    Science.gov (United States)

    Sierra, C. A.; Mueller, M.

    2013-12-01

    Recent work has highlighted the importance of nonlinear interactions in representing the decomposition of soil organic matter (SOM). It is unclear, however, how to integrate these concepts into larger biogeochemical models or into a more general mathematical description of the decomposition process. Here we present a mathematical framework that generalizes both previous decomposition models and recent ideas about nonlinear microbial interactions. The framework is based on a set of four basic principles: 1) mass balance, 2) heterogeneity in the decomposability of SOM, 3) transformations in the decomposability of SOM over time, 4) energy limitation of decomposers. This framework generalizes a large majority of SOM decomposition models proposed to date. We illustrate the application of this framework to the development of a continuous model that includes the ideas in the Dual Arrhenius Michaelis-Menten Model (DAMM) for explicitly representing temperature-moisture limitations of enzyme activity in the decomposition of heterogeneous substrates.
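
    A minimal sketch of a DAMM-style rate expression of the kind the continuous model builds on: an Arrhenius maximum velocity modulated by Michaelis-Menten terms for soluble substrate and oxygen. The parameter values are illustrative, not those of the published DAMM model.

```python
# Hedged sketch of a dual Arrhenius / Michaelis-Menten (DAMM-style) decomposition
# rate: Arrhenius temperature dependence of Vmax combined with Michaelis-Menten
# limitation by soluble substrate and oxygen. Parameter values are illustrative,
# not those of the published DAMM model.
import numpy as np

R_GAS = 8.314e-3     # kJ mol-1 K-1

def decomposition_rate(temp_k, substrate, oxygen,
                       alpha=5.0e8, ea=60.0, km_s=0.1, km_o2=0.12):
    """Decomposition rate (arbitrary units) at temperature temp_k (K)."""
    vmax = alpha * np.exp(-ea / (R_GAS * temp_k))       # Arrhenius term
    mm_substrate = substrate / (km_s + substrate)        # Michaelis-Menten in substrate
    mm_oxygen = oxygen / (km_o2 + oxygen)                # Michaelis-Menten in O2
    return vmax * mm_substrate * mm_oxygen

for temp_c in (5.0, 15.0, 25.0):
    r = decomposition_rate(temp_c + 273.15, substrate=0.5, oxygen=0.2)
    print(f"T = {temp_c:4.1f} C -> rate = {r:.4f}")
```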

  19. [Spatial representativeness of monitoring stations for air quality in Florence (Tuscany Region, Central Italy) according to ARPAT e LaMMA. Critical observations].

    Science.gov (United States)

    Grechi, Daniele

    2016-01-01

    In March 2015, the Environmental Protection Agency of Tuscany Region (Central Italy) and the Laboratory of monitoring and environmental modelling published a Report on the spatial representativeness of monitoring stations for Tuscan air quality, in which they supported the decommissioning of monitoring stations located in the Florentine Plain. The stations of Signa, Scandicci, and Firenze-Bassi, located in an area further south, were considered representative. Believing that the air quality of the Plain can be evaluated by these stations is a stretch. In this text the author shows the inconsistency of the Report's conclusion through correlation graphs comparing daily means of PM10 detected at the decommissioned stations and at the active ones, showing relevant differences between the reported values and the days on which the limits are exceeded. The discrepancy is due to the fact that the uncertainty of the theoretical estimates is greater than the differences recorded between the stations considered as a reference and the areas they are supposed to represent. The Plain has a population of 150,000 individuals and is subject to heavy environmental pressure, which will change with the urban works planned for the coming years. The population's legitimate request for analytical monitoring of air pollution could be met through the organization of participatory monitoring based on the use of low-cost innovative tools.

  20. Combining 3d Volume and Mesh Models for Representing Complicated Heritage Buildings

    Science.gov (United States)

    Tsai, F.; Chang, H.; Lin, Y.-W.

    2017-08-01

    This study developed a simple but effective strategy to combine 3D volume and mesh models for representing complicated heritage buildings and structures. The idea is to seamlessly integrate 3D parametric or polyhedral models and mesh-based digital surfaces to generate a hybrid 3D model that can take advantages of both modeling methods. The proposed hybrid model generation framework is separated into three phases. Firstly, after acquiring or generating 3D point clouds of the target, these 3D points are partitioned into different groups. Secondly, a parametric or polyhedral model of each group is generated based on plane and surface fitting algorithms to represent the basic structure of that region. A "bare-bones" model of the target can subsequently be constructed by connecting all 3D volume element models. In the third phase, the constructed bare-bones model is used as a mask to remove points enclosed by the bare-bones model from the original point clouds. The remaining points are then connected to form 3D surface mesh patches. The boundary points of each surface patch are identified and these boundary points are projected onto the surfaces of the bare-bones model. Finally, new meshes are created to connect the projected points and original mesh boundaries to integrate the mesh surfaces with the 3D volume model. The proposed method was applied to an open-source point cloud data set and point clouds of a local historical structure. Preliminary results indicated that the reconstructed hybrid models using the proposed method can retain both fundamental 3D volume characteristics and accurate geometric appearance with fine details. The reconstructed hybrid models can also be used to represent targets in different levels of detail according to user and system requirements in different applications.

  1. MODELS-3 COMMUNITY MULTISCALE AIR QUALITY (CMAQ) MODEL AEROSOL COMPONENT 1: MODEL DESCRIPTION

    Science.gov (United States)

    The aerosol component of the Community Multiscale Air Quality (CMAQ) model is designed to be an efficient and economical depiction of aerosol dynamics in the atmosphere. The approach taken represents the particle size distribution as the superposition of three lognormal subdis...

  2. Select strengths and biases of models in representing the Arctic winter boundary layer over sea ice

    NARCIS (Netherlands)

    Pithan, Felix; Ackerman, Andrew; Angevine, Wayne M.; Hartung, Kerstin; Ickes, Luisa; Kelley, Maxwell; Medeiros, Brian; Sandu, Irina; Steeneveld, Gert Jan; Sterk, H.A.M.

    2016-01-01

    Weather and climate models struggle to represent lower tropospheric temperature and moisture profiles and surface fluxes in Arctic winter, partly because they lack or misrepresent physical processes that are specific to high latitudes. Observations have revealed two preferred states of the Arctic

  4. Representing tissue mass and morphology in mechanistic models of digestive function in ruminants

    NARCIS (Netherlands)

    Bannink, A.; Dijkstra, J.; France, J.

    2011-01-01

    Representing changes in the morphological and histological characteristics of epithelial tissue in the rumen and intestine, and evaluating their implications for absorption and tissue mass in models of digestive function, requires a quantitative approach. The aim of the present study was to quantify tiss

  5. Data Acquisition for Quality Loss Function Modelling

    DEFF Research Database (Denmark)

    Pedersen, Søren Nygaard; Howard, Thomas J.

    2016-01-01

    Quality loss functions can be a valuable tool when assessing the impact of variation on product quality. Typically, the input for the quality loss function would be a measure of the varying product performance and the output would be a measure of quality. While the unit of the input is given by the product function in focus, the quality output can be measured and quantified in a number of ways. In this article a structured approach for acquiring stakeholder satisfaction data for use in quality loss function modelling is introduced.
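
    As an illustration of the kind of function the article builds on, a minimal Python sketch of the classic Taguchi quadratic loss is given below; the stakeholder-satisfaction-based quality measure introduced in the article itself is not reproduced, and the numbers are purely hypothetical.

        def quality_loss(y, target, k):
            # Classic quadratic (Taguchi) loss: cost grows with the squared
            # deviation of the performance measure y from its target value.
            return k * (y - target) ** 2

        # Hypothetical calibration: a 0.5 mm deviation is assumed to cost 40 units,
        # so k = A / d**2 = 40 / 0.5**2
        k = 40.0 / 0.5 ** 2
        for y in (10.0, 10.2, 10.5):
            print(f"performance {y:.1f} -> loss {quality_loss(y, 10.0, k):.1f}")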

  7. Developing a TQM quality management method model

    NARCIS (Netherlands)

    Zhang, Zhihai

    1997-01-01

    From an extensive review of total quality management literature, the external and internal environment affecting an organization's quality performance and the eleven primary elements of TQM are identified. Based on the primary TQM elements, a TQM quality management method model is developed. This mo

  10. Feasibility of Representing Data from Published Nursing Research Using the OMOP Common Data Model.

    Science.gov (United States)

    Kim, Hyeoneui; Choi, Jeeyae; Jang, Imho; Quach, Jimmy; Ohno-Machado, Lucila

    2016-01-01

    We explored the feasibility of representing nursing research data with the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) to understand the challenges and opportunities in representing various types of health data not limited to diseases and drug treatments. We collected 1,431 unique data items from 256 nursing articles and mapped them to the OMOP CDM. A deeper level of mapping was explored by simulating 10 data search use cases. Although the majority of the data could be represented in the OMOP CDM, potential information loss was identified in content related to patient-reported outcomes, socio-economic information, and locally developed nursing intervention protocols. These areas will be further investigated in a follow-up study. We will use lessons learned in this study to inform the metadata development efforts for data discovery.

  11. Representing time-varying cyclic dynamics using multiple-subject state-space models.

    Science.gov (United States)

    Chow, Sy-Miin; Hamaker, Ellen L; Fujita, Frank; Boker, Steven M

    2009-11-01

    Over the last few decades, researchers have become increasingly aware of the need to consider intraindividual variability in the form of cyclic processes. In this paper, we review two contemporary cyclic state-space models: Young and colleagues' dynamic harmonic regression model and Harvey and colleagues' stochastic cycle model. We further derive the analytic equivalence between the two models, discuss their unique strengths and propose multiple-subject extensions. Using data from a study on human postural dynamics and a daily affect study, we demonstrate the use of these models to represent within-person non-stationarities in cyclic dynamics and interindividual differences therein. The use of diagnostic tools for evaluating model fit is also illustrated.

  12. Representing Operational Knowledge of PWR Plant by Using Multilevel Flow Modelling

    DEFF Research Database (Denmark)

    Zhang, Xinxin; Lind, Morten; Jørgensen, Sten Bay

    2014-01-01

    The aim of this paper is to explore the capability of representing operational knowledge by using the Multilevel Flow Modelling (MFM) methodology. The paper demonstrates how operational knowledge can be inserted into MFM models and used to evaluate the plant state, identify the current situation and support operational decisions. The paper provides a general MFM model of the primary side of a standard Westinghouse Pressurized Water Reactor (PWR) system, including the sub-systems of the Reactor Coolant System, the Rod Control System, the Chemical and Volume Control System and the emergency heat removal systems. The sub-systems' functions are decomposed into sub-models according to different operational situations. An operational model is developed based on the operating procedure by using MFM symbols, and this model can be used to implement coordination rules for organizing the utilizati...

  13. Dependability breakeven point mathematical model for production - quality strategy support

    Science.gov (United States)

    Vilcu, Adrian; Verzea, Ion; Chaib, Rachid

    2016-08-01

    This paper connects the field of dependability with production-quality strategies through a new mathematical model based on breakeven points. The novelties consist in identifying the parameters of the dependability system which, in safety control, represents the degree to which an item is capable of performing its required function at any randomly chosen time during its specified operating period, disregarding non-operation-related influences; analysing the production-quality strategies; defining a mathematical model based on a new concept, the dependability breakeven point; and validating the model on datasets, showing the practical applicability of this new approach.
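
    For readers unfamiliar with breakeven analysis, the sketch below shows the classic breakeven quantity that the "dependability breakeven point" concept extends; it is illustrative only, with hypothetical numbers, and does not reproduce the authors' dependability model.

        def breakeven_quantity(fixed_cost, unit_price, unit_variable_cost):
            # Quantity at which total revenue equals total cost:
            # Q* = F / (p - v), defined only when the unit margin is positive.
            margin = unit_price - unit_variable_cost
            if margin <= 0:
                raise ValueError("unit price must exceed unit variable cost")
            return fixed_cost / margin

        # Hypothetical example: 50,000 of fixed cost, price 25 per unit, variable cost 15 per unit
        print(breakeven_quantity(50_000, 25.0, 15.0))  # -> 5000.0 units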

  14. Can Geostatistical Models Represent Nature's Variability? An Analysis Using Flume Experiments

    Science.gov (United States)

    Scheidt, C.; Fernandes, A. M.; Paola, C.; Caers, J.

    2015-12-01

    The lack of understanding of the Earth's geological and physical processes governing sediment deposition renders subsurface modeling subject to large uncertainty. Geostatistics is often used to model uncertainty because of its capability to stochastically generate spatially varying realizations of the subsurface. These methods can generate a range of realizations of a given pattern - but how representative are these of the full natural variability? And how can we identify the minimum set of images that represent this natural variability? Here we use this minimum set to define the geostatistical prior model: a set of training images that represent the range of patterns generated by autogenic variability in the sedimentary environment under study. The proper definition of the prior model is essential in capturing the variability of the depositional patterns. This work starts with a set of overhead images from an experimental basin that showed ongoing autogenic variability. We use the images to analyze the essential characteristics of this suite of patterns. In particular, our goal is to define a prior model (a minimal set of selected training images) such that geostatistical algorithms, when applied to this set, can reproduce the full measured variability. A necessary prerequisite is to define a measure of variability. In this study, we measure variability using a dissimilarity distance between the images. The distance indicates whether two snapshots contain similar depositional patterns. To reproduce the variability in the images, we apply an MPS algorithm to the set of selected snapshots of the sedimentary basin that serve as training images. The training images are chosen from among the initial set by using the distance measure to ensure that only dissimilar images are chosen. Preliminary investigations show that MPS can reproduce fairly accurately the natural variability of the experimental depositional system. Furthermore, the selected training images provide
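
    A possible reading of the training-image selection step is a greedy max-min choice over a precomputed dissimilarity matrix, sketched below in Python; the study's actual dissimilarity distance and MPS algorithm are not reproduced, and the random feature vectors only stand in for the overhead snapshots.

        import numpy as np

        def select_training_images(dist, k):
            # Greedy max-min selection: start from one image involved in the largest
            # pairwise distance, then repeatedly add the image farthest from the set.
            chosen = [int(np.unravel_index(dist.argmax(), dist.shape)[0])]
            while len(chosen) < k:
                d_to_set = dist[:, chosen].min(axis=1)
                d_to_set[chosen] = -np.inf      # never re-select an image
                chosen.append(int(d_to_set.argmax()))
            return chosen

        rng = np.random.default_rng(1)
        features = rng.random((20, 5))          # stand-in descriptors of 20 snapshots
        dist = np.linalg.norm(features[:, None] - features[None], axis=-1)
        print(select_training_images(dist, k=4))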

  15. Application of the generalized vertical coordinate ocean model for better representing satellite data

    Science.gov (United States)

    Song, Y. T.

    2002-01-01

    It is found that two adaptive parametric functions can be introduced into the basic ocean equations for utilizing the optimal or hybrid features of commonly used z-level, terrain-following, isopycnal, and pressure coordinates in numerical ocean models. The two parametric functions are formulated by combining three techniques: the arbitrary vertical coordinate system of Kasahara (1974), the Jacobian pressure gradient formulation of Song (1998), and a newly developed metric factor that permits both compressible (non-Boussinesq) and incompressible (Boussinesq) approximations. Based on the new formulation, an adaptive modeling strategy is proposed and a staggered finite volume method is designed to ensure conservation of important physical properties and numerical accuracy. Implementation of the combined techniques in SCRUM (Song and Haidvogel, 1994) shows that the adaptive modeling strategy can be applied to any existing ocean model without incurring computational expense or altering the original numerical schemes. Such a generalized coordinate model is expected to benefit diverse ocean modelers by making it easy to choose optimal vertical structures and share modeling resources based on a common model platform. Several representative oceanographic problems with different scales and characteristics, such as coastal canyons, basin-scale circulation, and global ocean circulation, are used to demonstrate the model's capability for multiple applications. New results show that the model is capable of simultaneously resolving both Boussinesq and non-Boussinesq, and both small- and large-scale processes well. This talk will focus on applications to multiple satellite sensing data in eddy-resolving simulations of the Asian Marginal Seas and the Kuroshio. Attention will be given to how TOPEX/Poseidon SSH, TRMM SST, and GRACE ocean bottom pressure can be correctly represented in a non-Boussinesq model.

  16. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as

  17. Representative Model of the Learning Process in Virtual Spaces Supported by ICT

    Directory of Open Access Journals (Sweden)

    José CAPACHO

    2015-01-01

    This paper shows the results of research activities for building the representative model of the learning process in virtual spaces (e-Learning). The formal basis of the model is supported by the analysis of models of learning assessment in virtual spaces, specifically Dembo's teaching-learning model, the systemic approach to evaluating virtual learning by Badrul H. Khan, and the Cybernetic model for evaluating virtual learning environments. The e-Learning model is systemic and feedback-based by nature. The model integrates society, the educational institution, the virtual training platform, the virtual teacher and students, and finally the assessment of student learning in virtual learning spaces supported by ICT. The model consists of fourteen processes. Processes are defined taking into account the following dimensions: identification, academic, pedagogical, educational, formative, evaluative, assessment of virtual learning and technological. The model is fundamental to the management of e-learning supported by ICT, justified by the fact that it is an operative model of the teaching-learning process in virtual spaces. The importance of having an operative model in virtual education is that it supports management and decision-making in virtual education. The operational, administrative and decision phases will then allow the creation of a set of indicators. These indicators will assess the process of virtual education not only for students but also for the virtual institution.

  18. REPRESENTATIVE MODEL OF THE LEARNING PROCESS IN VIRTUAL SPACES SUPPORTED BY ICT

    Directory of Open Access Journals (Sweden)

    José CAPACHO

    2014-10-01

    This paper shows the results of research activities for building the representative model of the learning process in virtual spaces (e-Learning). The formal basis of the model is supported by the analysis of models of learning assessment in virtual spaces, specifically Dembo's teaching-learning model, the systemic approach to evaluating virtual learning by Badrul H. Khan, and the Cybernetic model for evaluating virtual learning environments. The e-Learning model is systemic and feedback-based by nature. The model integrates society, the educational institution, the virtual training platform, the virtual teacher and students, and finally the assessment of student learning in virtual learning spaces supported by ICT. The model consists of fourteen processes. Processes are defined taking into account the following dimensions: identification, academic, pedagogical, educational, formative, evaluative, assessment of virtual learning and technological. The model is fundamental to the management of e-learning supported by ICT, justified by the fact that it is an operative model of the teaching-learning process in virtual spaces. The importance of having an operative model in virtual education is that it supports management and decision-making in virtual education. The operational, administrative and decision phases will then allow the creation of a set of indicators. These indicators will assess the process of virtual education not only for students but also for the virtual institution.

  19. Emotion as a thermostat: representing emotion regulation using a damped oscillator model.

    Science.gov (United States)

    Chow, Sy-Miin; Ram, Nilam; Boker, Steven M; Fujita, Frank; Clore, Gerald

    2005-06-01

    The authors present in this study a damped oscillator model that provides a direct mathematical basis for testing the notion of emotion as a self-regulatory thermostat. Parameters from this model reflect individual differences in emotional lability and the ability to regulate emotion. The authors discuss concepts such as intensity, rate of change, and acceleration in the context of emotion, and they illustrate the strengths of this approach in comparison with spectral analysis and growth curve models. The utility of this modeling approach is illustrated using daily emotion ratings from 179 college students over 52 consecutive days. Overall, the damped oscillator model provides a meaningful way of representing emotion regulation as a dynamic process and helps identify the dominant periodicities in individuals' emotions.
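
    The core of the model is the linear damped oscillator equation d²x/dt² + ζ·dx/dt + η·x = 0, where η sets the cycle frequency and ζ the damping; the short Python sketch below integrates it numerically to show a slowly damped affect-like cycle (parameter values are illustrative, not estimates from the study).

        import numpy as np

        def simulate_oscillator(eta, zeta, x0=1.0, v0=0.0, dt=0.1, steps=500):
            # Semi-implicit Euler integration of x'' = -zeta*x' - eta*x
            x, v = x0, v0
            trajectory = []
            for _ in range(steps):
                v += (-zeta * v - eta * x) * dt
                x += v * dt
                trajectory.append(x)
            return np.array(trajectory)

        affect = simulate_oscillator(eta=1.0, zeta=0.1)   # weak damping: slow return to baseline
        print(affect[:5])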

  20. Quality Evaluation Model for Map Labeling

    Institute of Scientific and Technical Information of China (English)

    FAN Hong; ZHANG Zuxun; DU Daosheng

    2005-01-01

    This paper discusses and sums up the basic criteria for guaranteeing labeling quality and abstracts four basic factors: the conflict between a label and other labels, the overlap between a label and map features, the priority of candidate positions, and the association between a label and its feature. By establishing a scoring system, a formalized four-factor quality evaluation model is constructed. Finally, the paper presents experimental results from applying the quality evaluation model to the automatic map labeling system MapLabel.
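
    One simple way to turn the four factors into a single score is a weighted sum, sketched below in Python; the weights, scales and function names are hypothetical and are not the scoring system actually used in MapLabel.

        def label_quality(conflict, overlay, position_priority, association,
                          weights=(0.3, 0.3, 0.2, 0.2)):
            # Each factor is assumed pre-normalised to [0, 1], where 1 means
            # "no problem" (no conflict, no overlap, best position, clear association).
            factors = (conflict, overlay, position_priority, association)
            return sum(w * f for w, f in zip(weights, factors))

        print(label_quality(conflict=1.0, overlay=0.8, position_priority=0.6, association=0.9))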

  1. Microtomographic imaging of multiphase flow in porous media: Validation of image analysis algorithms, and assessment of data representativeness and quality

    Science.gov (United States)

    Wildenschild, D.; Porter, M. L.

    2009-04-01

    Significant strides have been made in recent years in imaging fluid flow in porous media using x-ray computerized microtomography (CMT) with 1-20 micron resolution; however, difficulties remain in combining representative sample sizes with optimal image resolution and data quality; and in precise quantification of the variables of interest. Tomographic imaging was for many years focused on volume rendering and the more qualitative analyses necessary for rapid assessment of the state of a patient's health. In recent years, many highly quantitative CMT-based studies of fluid flow processes in porous media have been reported; however, many of these analyses are made difficult by the complexities in processing the resulting grey-scale data into reliable applicable information such as pore network structures, phase saturations, interfacial areas, and curvatures. Yet, relatively few rigorous tests of these analysis tools have been reported so far. The work presented here was designed to evaluate the effect of image resolution and quality, as well as the validity of segmentation and surface generation algorithms as they were applied to CMT images of (1) a high-precision glass bead pack and (2) gas-fluid configurations in a number of glass capillary tubes. Interfacial areas calculated with various algorithms were compared to actual interfacial geometries and we found very good agreement between actual and measured surface and interfacial areas. (The test images used are available for download at the website listed below). http://cbee.oregonstate.edu/research/multiphase_data/index.html

  2. How large-scale energy-environment models represent technology and technological change

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-01-01

    In selecting measures against global warming, it is important to consider how technological innovation is introduced into the models, and studies were made in this connection. An induced technical change model has to be an economy-wide model that represents the various incentives for innovation: profits derived from cost functions, research-and-development production functions, and abstract profits from empirical estimates, as well as the dimensions along which technological change is assumed to progress. The Stanford Energy Modeling Forum is studying how to represent various technological assumptions and developments, which is necessary to predict the cost of dealing with global warming. At the February 2001 conference, 10 preliminary model scenarios were discussed. In one case, for instance, a carbon tax of $25/ton in 2010 is raised by $25 every decade, reaching $100/ton in 2040. Three working groups are engaged in the study of long-run economy/technology baseline scenarios, characterization of current and potential future technologies, and ways of modeling technological change. (NEDO)

  3. Pharmaceutical sales representatives and patient safety: a comparative prospective study of information quality in Canada, France and the United States.

    Science.gov (United States)

    Mintzes, Barbara; Lexchin, Joel; Sutherland, Jason M; Beaulieu, Marie-Dominique; Wilkes, Michael S; Durrieu, Geneviève; Reynolds, Ellen

    2013-10-01

    The information provided by pharmaceutical sales representatives has been shown to influence prescribing. To enable safe prescribing, medicines information must include harm as well as benefits. Regulation supports this aim, but relative effectiveness of different approaches is not known. The United States (US) and France directly regulate drug promotion; Canada relies on industry self-regulation. France has the strictest information standards. This is a prospective cohort study in Montreal, Vancouver, Sacramento and Toulouse. We recruited random samples of primary care physicians from May 2009 to June 2010 to report on consecutive sales visits. The primary outcome measure was "minimally adequate safety information" (mention of at least one indication, serious adverse event, common adverse event, and contraindication, and no unqualified safety claims or unapproved indications). Two hundred and fifty-five physicians reported on 1,692 drug-specific promotions. "Minimally adequate safety information" did not differ: 1.7 % of promotions; range 0.9-3.0 % per site. Sales representatives provided some vs. no information on harm more often in Toulouse than in Montreal and Vancouver: 61 % vs. 34 %, OR = 4.0; 95 % CI 2.8-5.6, or Sacramento (39 %), OR = 2.4; 95 % CI 1.7-3.6. Serious adverse events were rarely mentioned (5-6 % of promotions in all four sites), although 45 % of promotions were for drugs with US Food and Drug Administration (FDA) "black box" warnings of serious risks. Nevertheless, physicians judged the quality of scientific information to be good or excellent in 901 (54 %) of promotions, and indicated readiness to prescribe 64 % of the time. "Minimally adequate safety information" did not differ in the US and Canadian sites, despite regulatory differences. In Toulouse, consistent with stricter standards, more harm information was provided. However, in all sites, physicians were rarely informed about serious adverse events, raising questions about

  4. A transferable coarse-grained model for diphenylalanine: How to represent an environment driven conformational transition

    OpenAIRE

    Dalgıçdir, Cahit; Şensoy, Özge; Sayar, Mehmet; Peter, Christine

    2013-01-01

    Published in The Journal of Chemical Physics 139, 234115 (2013); doi: 10.1063/1.4848675.

  5. Explicitly representing soil microbial processes in Earth system models: Soil microbes in earth system models

    Energy Technology Data Exchange (ETDEWEB)

    Wieder, William R. [Climate and Global Dynamics Division, National Center for Atmospheric Research, Boulder Colorado USA; Allison, Steven D. [Department of Ecology and Evolutionary Biology, University of California, Irvine California USA; Department of Earth System Science, University of California, Irvine California USA; Davidson, Eric A. [Appalachian Laboratory, University of Maryland Center for Environmental Science, Frostburg Maryland USA; Georgiou, Katerina [Department of Chemical and Biomolecular Engineering, University of California, Berkeley California USA; Earth Sciences Division, Lawrence Berkeley National Laboratory, Berkeley California USA; Hararuk, Oleksandra [Natural Resources Canada, Canadian Forest Service, Pacific Forestry Centre, Victoria British Columbia Canada; He, Yujie [Department of Earth System Science, University of California, Irvine California USA; Department of Earth, Atmospheric and Planetary Sciences, Purdue University, West Lafayette Indiana USA; Hopkins, Francesca [Department of Earth System Science, University of California, Irvine California USA; Jet Propulsion Laboratory, California Institute of Technology, Pasadena California USA; Luo, Yiqi [Department of Microbiology & Plant Biology, University of Oklahoma, Norman Oklahoma USA; Smith, Matthew J. [Computational Science Laboratory, Microsoft Research, Cambridge UK; Sulman, Benjamin [Department of Biology, Indiana University, Bloomington Indiana USA; Todd-Brown, Katherine [Department of Microbiology & Plant Biology, University of Oklahoma, Norman Oklahoma USA; Pacific Northwest National Laboratory, Richland Washington USA; Wang, Ying-Ping [CSIRO Ocean and Atmosphere Flagship, Aspendale Victoria Australia; Xia, Jianyang [Department of Microbiology & Plant Biology, University of Oklahoma, Norman Oklahoma USA; Tiantong National Forest Ecosystem Observation and Research Station, School of Ecological and Environmental Sciences, East China Normal University, Shanghai China; Xu, Xiaofeng [Department of Biological Sciences, University of Texas at El Paso, Texas USA

    2015-10-01

    Microbes influence soil organic matter (SOM) decomposition and the long-term stabilization of carbon (C) in soils. We contend that by revising the representation of microbial processes and their interactions with the physicochemical soil environment, Earth system models (ESMs) may make more realistic global C cycle projections. Explicit representation of microbial processes presents considerable challenges due to the scale at which these processes occur. Thus, applying microbial theory in ESMs requires a framework to link micro-scale process-level understanding and measurements to macro-scale models used to make decadal- to century-long projections. Here, we review the diversity, advantages, and pitfalls of simulating soil biogeochemical cycles using microbial-explicit modeling approaches. We present a roadmap for how to begin building, applying, and evaluating reliable microbial-explicit model formulations that can be applied in ESMs. Drawing from experience with traditional decomposition models we suggest: (1) guidelines for common model parameters and output that can facilitate future model intercomparisons; (2) development of benchmarking and model-data integration frameworks that can be used to effectively guide, inform, and evaluate model parameterizations with data from well-curated repositories; and (3) the application of scaling methods to integrate microbial-explicit soil biogeochemistry modules within ESMs. With contributions across scientific disciplines, we feel this roadmap can advance our fundamental understanding of soil biogeochemical dynamics and more realistically project likely soil C response to environmental change at global scales.

  6. Flow-Shop Scheduling Models with Parameters Represented by Rough Variables

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In reality, processing times are often imprecise and this imprecision is critical for the scheduling procedure. This research deals with flow-shop scheduling in rough environment. In this type of scheduling problem, we employ the rough sets to represent the job parameters. The job processing times are assumed to be rough variables, and the problem is to minimize the makespan. Three novel types of rough scheduling models are presented. A rough simulation-based genetic algorithm is designed to solve these models and its effectiveness is well illustrated by numerical experiments.
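
    For reference, the makespan of a permutation flow shop can be computed with the standard recursion C(j, m) = max(C(j-1, m), C(j, m-1)) + p(j, m); the Python sketch below uses crisp processing times, whereas in the rough models above the matrix entries would be rough variables sampled inside the rough simulation.

        import numpy as np

        def makespan(proc, order):
            # proc[j, m] = processing time of job j on machine m; order = job sequence.
            completion = np.zeros(proc.shape[1])
            for j in order:
                for m in range(proc.shape[1]):
                    start = completion[m] if m == 0 else max(completion[m], completion[m - 1])
                    completion[m] = start + proc[j, m]
            return completion[-1]

        proc = np.array([[3, 2, 4],    # job 0 on machines 0..2
                         [1, 5, 2],    # job 1
                         [4, 1, 3]])   # job 2
        print(makespan(proc, order=[0, 2, 1]))   # -> 15.0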

  7. Dynamic viscosity modeling of methane plus n-decane and methane plus toluene mixtures: Comparative study of some representative models

    DEFF Research Database (Denmark)

    Baylaucq, A.; Boned, C.; Canet, X.;

    2005-01-01

    .15 and for several methane compositions. Although very far from real petroleum fluids, these mixtures are interesting in order to study the potential of extending various models to the simulation of complex fluids with asymmetrical components (light/heavy hydrocarbon). These data (575 data points) have been discussed in the framework of recent representative models (hard sphere scheme, friction theory, and free volume model) and with mixing laws and two empirical models (particularly the LBC model, which is commonly used in petroleum engineering, and the self-referencing model). This comparative study shows

  8. A computational model of the hippocampus that represents environmental structure and goal location, and guides movement.

    Science.gov (United States)

    Matsumoto, Jumpei; Makino, Yoshinari; Miura, Haruki; Yano, Masafumi

    2011-08-01

    Hippocampal place cells (PCs) are believed to represent environmental structure. However, it is unclear how and which brain regions represent goals and guide movements. Recently, another type of cells that fire around a goal was found in rat hippocampus (we designate these cells as goal place cells, GPCs). This suggests that the hippocampus is also involved in goal representation. Assuming that the activities of GPCs depend on the distance to a goal, we propose an adaptive navigation model. By monitoring the population activity of GPCs, the model navigates to shorten the distance to the goal. To achieve the distance-dependent activities of GPCs, plastic connections are assumed between PCs and GPCs, which are modified depending on two reward-triggered activities: activity propagation through PC-PC network representing the topological environmental structure, and the activity of GPCs with different durations. The former activity propagation is regarded as a computational interpretation of "reverse replay" phenomenon found in rat hippocampus. Simulation results confirm that after reaching a goal only once, the model can navigate to the goal along almost the shortest path from arbitrary places in the environment. This indicates that the hippocampus might play a primary role in the representation of not only the environmental structure but also the goal, in addition to guiding the movement. This navigation strategy using the population activity of GPCs is equivalent to the taxis strategy, the simplest and most basic for biological systems. Our model is unique because this simple strategy allows the model to follow the shortest path in the topological map of the environment.
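
    The taxis strategy described above can be caricatured in a few lines of Python: an agent on a grid repeatedly moves to whichever neighbouring cell has the higher value of a scalar "goal signal", standing in for the summed population activity of the goal place cells (the network model itself is not reproduced; names and the signal are hypothetical).

        import numpy as np

        def taxis_step(pos, goal_signal):
            # Greedy move (including "stay") toward higher goal_signal on a grid.
            moves = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]
            candidates = [(pos[0] + dx, pos[1] + dy) for dx, dy in moves
                          if 0 <= pos[0] + dx < goal_signal.shape[0]
                          and 0 <= pos[1] + dy < goal_signal.shape[1]]
            return max(candidates, key=lambda p: goal_signal[p])

        xs, ys = np.meshgrid(np.arange(10), np.arange(10), indexing="ij")
        signal = -np.hypot(xs - 8, ys - 8)     # signal grows as the goal at (8, 8) gets closer
        pos = (0, 0)
        for _ in range(20):
            pos = taxis_step(pos, signal)
        print(pos)                              # reaches the goal cell (8, 8)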

  9. A proposed-standard format to represent and distribute tomographic models and other earth spatial data

    Science.gov (United States)

    Postpischl, L.; Morelli, A.; Danecek, P.

    2009-04-01

    Formats used to represent (and distribute) tomographic earth models differ considerably and are rarely self-consistent. In fact, each earth scientist, or research group, uses specific conventions to encode the various parameterizations used to describe, e.g., seismic wave speed or density in three dimensions, and complete information is often found in related documents or publications (if available at all) only. As a consequence, use of various tomographic models from different authors requires considerable effort, is more cumbersome than it should be and prevents widespread exchange and circulation within the community. We propose a format, based on modern web standards, able to represent different (grid-based) model parameterizations within the same simple text-based environment, easy to write, to parse, and to visualise. The aim is the creation of self-describing data-structures, both human and machine readable, that are automatically recognised by general-purpose software agents, and easily imported in the scientific programming environment. We think that the adoption of such a representation as a standard for the exchange and distribution of earth models can greatly ease their usage and enhance their circulation, both among fellow seismologists and among a broader non-specialist community. The proposed solution uses semantic web technologies, fully fitting the current trends in data accessibility. It is based on Json (JavaScript Object Notation), a plain-text, human-readable lightweight computer data interchange format, which adopts a hierarchical name-value model for representing simple data structures and associative arrays (called objects). Our implementation allows integration of large datasets with metadata (authors, affiliations, bibliographic references, units of measure etc.) into a single resource. It is equally suited to represent other geo-referenced volumetric quantities — beyond tomographic models — as well as (structured and unstructured
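
    To make the idea concrete, the Python snippet below builds a small self-describing structure of the kind the proposal advocates and serialises it with the standard json module; the field names and grid layout are invented for illustration and are not the authors' published schema.

        import json

        model = {
            "metadata": {
                "authors": ["A. Example"],
                "reference": "Example et al. (2009)",
                "quantity": "P-wave speed perturbation",
                "units": "percent",
            },
            "grid": {
                "type": "regular",
                "latitude_deg": [-90.0, 90.0, 2.0],    # start, stop, step
                "longitude_deg": [0.0, 360.0, 2.0],
                "depth_km": [50.0, 2850.0, 100.0],
            },
            "values": [[[0.0]]],                        # nested lat x lon x depth array (truncated)
        }

        print(json.dumps(model, indent=2)[:120])        # human-readable, machine-parsable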

  10. Using McDaniel's model to represent non-Rayleigh active sonar reverberation

    Science.gov (United States)

    Gu, Ming

    Reverberation in active sonar systems has often been observed to follow non-Rayleigh distributions. Current statistical models tend to be either too restrictive, leading to significant mismatch error, or too general, leading to large estimation error. McDaniel's model has shown promise as having reasonably tight representation in terms of skewness and kurtosis for reverberation from a variety of sonar systems. This dissertation intensively explores capability and effectiveness of the generalized McDaniel's model in representing non-Rayleigh reverberation when minimal data are available. Three major topics are covered in this dissertation. First, derivation and computation of the cumulative distribution function of McDaniel's model is addressed. Two approaches, one based on direct integration and the other via characteristic function inversion, are both shown to achieve adequate precision with the former leading to a closed-form solution and the latter requiring significantly less computational effort. Second, parameter estimators using both method of moments (MM) and maximum likelihood (ML) algorithms are developed. The MM estimator has the advantage of a simple and rapid implementation, but the disadvantage of a non- zero probability of a solution not existing. Bootstrap/pruning techniques are proposed to partially deal with the failure of this method. The ML estimator will always provide a solution; however, it requires multivariate optimization. The expectation-maximization (EM) algorithm iteration is also derived for obtaining the ML estimates and compared with the simplex method and quasi-Newton multivariate optimization routines. Furthermore, the ability of various statistical models to represent the probability of false alarm is evaluated as a function of sample size. It is demonstrated that when minimal data are available, McDaniel's model can more accurately represent non-Rayleigh reverberation than the K or Rayleigh mixture models. Third, detection

  11. A model-driven approach for representing clinical archetypes for Semantic Web environments.

    Science.gov (United States)

    Martínez-Costa, Catalina; Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás; Maldonado, José Alberto

    2009-02-01

    The life-long clinical information of any person supported by electronic means configures his Electronic Health Record (EHR). This information is usually distributed among several independent and heterogeneous systems that may be syntactically or semantically incompatible. There are currently different standards for representing and exchanging EHR information among different systems. In advanced EHR approaches, clinical information is represented by means of archetypes. Most of these approaches use the Archetype Definition Language (ADL) to specify archetypes. However, ADL has some drawbacks when attempting to perform semantic activities in Semantic Web environments. In this work, Semantic Web technologies are used to specify clinical archetypes for advanced EHR architectures. The advantages of using the Ontology Web Language (OWL) instead of ADL are described and discussed in this work. Moreover, a solution combining Semantic Web and Model-driven Engineering technologies is proposed to transform ADL into OWL for the CEN EN13606 EHR architecture.

  12. Data Structure Analysis to Represent Basic Models of Finite State Automation

    Directory of Open Access Journals (Sweden)

    V. V. Gurenko

    2015-01-01

    Complex system engineering based on automaton models requires a reasoned selection of the data structures used to implement them. The problem of automaton representation and of selecting the data structure to be used for it has been understudied. An arbitrary choice of data structure for the software implementation of an automaton model leads to unnecessary computational burden and reduces the efficiency of the developed system. This article proposes an approach to the reasoned selection of data structures to represent the basic models of finite algorithmic automata and gives practical considerations based on it. Static and dynamic data structures are proposed for the three main ways of specifying Mealy and Moore automata: a transition table, a coupling matrix and a transition graph. A three-dimensional array, a rectangular matrix and a matrix of lists are the static structures. The dynamic structures are list-oriented: two-level and three-level Ayliff vectors and a multi-linked list. These structures allow all the required information about the components of a finite-state automaton model to be stored - the cardinalities of its characteristic sets and the data of its transition and output functions. A criterion system is proposed for the comparative evaluation of data structures with respect to the algorithmic features of automata-theory problems. The criteria focus on the space and time computational complexity of the operations performed in tasks such as equivalent automaton conversions, proving automaton equivalence and isomorphism, and automaton minimization. A comparative analysis of both the static and the dynamic data structures was carried out on the basis of this criterion system. The analysis showed the advantages of the three-dimensional array, the matrix and the two-level Ayliff vector, i.e. the structures that specify an automaton by its transition table. For these structures an experiment was done to measure the execution time of the automaton operations included in the criterion system. The analysis of the experiment results showed that a dynamic structure - two
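
    Two of the representations compared above can be sketched in Python as follows: a dense transition/output table (the array-like static structure) and a sparse dict-of-dicts (a list-oriented dynamic structure); the example automaton, its symbols and outputs are illustrative only.

        # Dense table: rows = states, columns = input symbols, cells = (next_state, output)
        table = [
            [(1, "a"), (0, "b")],   # transitions of state 0 on inputs 0 and 1
            [(1, "b"), (0, "a")],   # transitions of state 1 on inputs 0 and 1
        ]

        # Sparse dict-of-dicts: convenient when the transition function is partial
        sparse = {0: {0: (1, "a"), 1: (0, "b")},
                  1: {0: (1, "b"), 1: (0, "a")}}

        def run(delta, state, word):
            # Run a Mealy automaton given either representation of its transition function.
            outputs = []
            for symbol in word:
                state, out = delta[state][symbol]
                outputs.append(out)
            return state, "".join(outputs)

        print(run(table, 0, [0, 1, 1]))    # both structures give the same result
        print(run(sparse, 0, [0, 1, 1]))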

  13. Regional climate models' performance in representing precipitation and temperature over selected Mediterranean areas

    Directory of Open Access Journals (Sweden)

    R. Deidda

    2013-12-01

    This paper discusses the relative performance of several climate models in providing reliable forcing for hydrological modeling in six representative catchments in the Mediterranean region. We consider 14 Regional Climate Models (RCMs) from the EU-FP6 ENSEMBLES project, run for the A1B emission scenario on a common 0.22° (about 24 km) rotated grid over Europe and the Mediterranean region. In the validation period (1951 to 2010) we consider daily precipitation and surface temperatures from the observed data fields (E-OBS data set, available from the ENSEMBLES project and the data providers in the ECA&D project). Our primary objective is to rank the 14 RCMs for each catchment and select the four best-performing ones to use as common forcing for hydrological models in the six Mediterranean basins considered in the EU-FP7 CLIMB project. Using a common suite of four RCMs for all studied catchments reduces the (epistemic) uncertainty when evaluating trends and climate change impacts in the 21st century. We present and discuss the validation setting, as well as the obtained results and, in some detail, the difficulties we experienced when processing the data. In doing so we also provide useful information and advice for researchers not directly involved in climate modeling, but interested in the use of climate model outputs for hydrological modeling and, more generally, climate change impact studies in the Mediterranean region.
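
    A much-simplified version of the ranking step is sketched below in Python: each RCM is scored against observations with a single statistic (here RMSE) and the models are sorted by it; the papers combine several precipitation and temperature statistics, which are not reproduced here, and the data are synthetic stand-ins.

        import numpy as np

        def rank_models(obs, simulations):
            # simulations: dict of model name -> array aligned with obs; lower RMSE ranks first.
            rmse = {name: float(np.sqrt(np.mean((sim - obs) ** 2)))
                    for name, sim in simulations.items()}
            return sorted(rmse, key=rmse.get)

        rng = np.random.default_rng(2)
        obs = rng.gamma(2.0, 2.0, size=365)    # stand-in daily series for one catchment
        sims = {f"RCM{i:02d}": obs + rng.normal(0.0, 0.5 + 0.2 * i, 365) for i in range(14)}
        print(rank_models(obs, sims)[:4])      # the four best-performing models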

  14. Climate model validation and selection for hydrological applications in representative Mediterranean catchments

    Directory of Open Access Journals (Sweden)

    R. Deidda

    2013-07-01

    This paper discusses the relative performance of several climate models in providing reliable forcing for hydrological modeling in six representative catchments in the Mediterranean region. We consider 14 Regional Climate Models (RCMs) from the EU-FP6 ENSEMBLES project, run for the A1B emission scenario on a common 0.22-degree (about 24 km) rotated grid over Europe and the Mediterranean. In the validation period (1951 to 2010) we consider daily precipitation and surface temperatures from the E-OBS dataset, available from the ENSEMBLES project and the data providers in the ECA&D project. Our primary objective is to rank the 14 RCMs for each catchment and select the four best-performing ones to use as common forcing for hydrological models in the six Mediterranean basins considered in the EU-FP7 CLIMB project. Using a common suite of four RCMs for all studied catchments reduces the (epistemic) uncertainty when evaluating trends and climate change impacts in the 21st century. We present and discuss the validation setting, as well as the obtained results and, in some detail, the difficulties we experienced when processing the data. In doing so we also provide useful information and hints for an audience of researchers not directly involved in climate modeling, but interested in the use of climate model outputs for hydrological modeling and, more generally, climate change impact studies in the Mediterranean.

  15. A model for representing the Italian energy system. The NEEDS-TIMES experience

    Energy Technology Data Exchange (ETDEWEB)

    Cosmi, C.; Pietrapertosa, F.; Salvia, M. [National Research Council, Institute of Methodologies for Environmental Analysis, C.da S. Loja, I-85050 Tito Scalo (PZ) (Italy)]|[Federico II University, Department of Physical Sciences, Via Cintia, I-80126 Naples (Italy); Di Leo, S. [National Research Council, National Institute for the Physics of Matter, Via Cintia, I-80126 Naples (Italy)]|[University of Basilicata, Department of Environmental Engineering and Physics, C.da Macchia Romana, I-85100 Potenza (Italy); Loperte, S.; Cuomo, V. [National Research Council, Institute of Methodologies for Environmental Analysis, C.da S. Loja, I-85050 Tito Scalo (PZ) (Italy); Macchiato, M. [Federico II University, Department of Physical Sciences, Via Cintia, I-80126 Naples (Italy)]|[National Research Council, National Institute for the Physics of Matter, Via Cintia, I-80126 Naples (Italy)

    2009-05-15

    Sustainability of energy systems has a strategic role in the current energy-environmental policies as it involves key issues such as security of energy supply, mitigation of environmental impact (with special regard to air quality improvement) and energy affordability. In this framework modelling activities are more than ever a strategic issue in order to manage the large complexity of energy systems as well as to support the decision-making process at different stages and spatial scales (regional, national, Pan-European, etc.). The aim of this article is to present a new model for the Italian energy system implemented with a common effort in the framework of an integrated project under the Sixth Framework Programme. In particular, the main features of the common methodology are briefly recalled and the modelling structure, the main data and assumptions, sector by sector, are presented. Moreover the main results obtained for the baseline (BAU) scenario are fully described. (author)

  16. A two-layer flow model to represent ice-ocean interactions beneath Antarctic ice shelves

    Science.gov (United States)

    Lee, V.; Payne, A. J.; Gregory, J. M.

    2011-01-01

    We develop a two-dimensional two-layer flow model that can calculate melt rates beneath ice shelves from ocean temperature and salinity fields at the shelf front. The cavity motion is split into two layers where the upper plume layer represents buoyant meltwater-rich water rising along the underside of the ice to the shelf front, while the lower layer represents the ambient water connected to the open ocean circulating beneath the plume. Conservation of momentum has been reduced to a frictional geostrophic balance, which when linearized provides algebraic equations for the plume velocity. The turbulent exchange of heat and salt between the two layers is modelled through an entrainment rate which is directed into the faster flowing layer. The numerical model is tested using an idealized geometry based on the dimensions of Pine Island Ice Shelf. We find that the spatial distribution of melt rates is fairly robust. The rates are at least 2.5 times higher than the mean in fast flowing regions corresponding to the steepest section of the underside of the ice shelf close to the grounding line and to the converged geostrophic flow along the rigid lateral boundary. Precise values depend on a combination of entrainment and plume drag coefficients. The flow of the ambient is slow and the spread of ocean scalar properties is dominated by diffusion.

  17. A two-layer flow model to represent ice-ocean interactions beneath Antarctic ice shelves

    Directory of Open Access Journals (Sweden)

    V. Lee

    2011-01-01

    We develop a two-dimensional two-layer flow model that can calculate melt rates beneath ice shelves from ocean temperature and salinity fields at the shelf front. The cavity motion is split into two layers where the upper plume layer represents buoyant meltwater-rich water rising along the underside of the ice to the shelf front, while the lower layer represents the ambient water connected to the open ocean circulating beneath the plume. Conservation of momentum has been reduced to a frictional geostrophic balance, which when linearized provides algebraic equations for the plume velocity. The turbulent exchange of heat and salt between the two layers is modelled through an entrainment rate which is directed into the faster flowing layer.

    The numerical model is tested using an idealized geometry based on the dimensions of Pine Island Ice Shelf. We find that the spatial distribution of melt rates is fairly robust. The rates are at least 2.5 times higher than the mean in fast flowing regions corresponding to the steepest section of the underside of the ice shelf close to the grounding line and to the converged geostrophic flow along the rigid lateral boundary. Precise values depend on a combination of entrainment and plume drag coefficients. The flow of the ambient is slow and the spread of ocean scalar properties is dominated by diffusion.

  18. A Proposed Model for Assessing Defendant Competence to Self-Represent.

    Science.gov (United States)

    White, Mitzi M S; Gutheil, Thomas G

    2016-12-01

    The increasing number of criminal defendants who are choosing to self-represent poses special challenges for legal systems with regard to the types of limits that should be placed on a defendant's basic human right to defend himself without the assistance of counsel. While courts strive to respect the dignity and autonomy of the defendant that are encompassed in this right, they also want to ensure that justice is delivered and the dignity of the courtroom is maintained. The Supreme Court of the United States, in its opinion in Indiana v. Edwards (2008), held that while the right to self-represent recognized in Faretta v. California (1975) remains, states and trial judges can place limits on a defendant's right to self-representation when a defendant lacks the mental capacities needed to prepare and conduct an adequate defense. Following the court's lead, we first examine the types and range of tasks that a defendant who chooses to self-represent must perform. Based on this analysis, we propose a five-part model that forensic practitioners can use as a conceptual framework for assessing whether a defendant has deficits that would affect his competence to perform critical self-representation tasks. The five areas that the model recommends practitioners assess are whether a defendant can engage in goal-directed behaviors, has sufficient communication skills, can engage in constructive social intercourse, can control his emotions in an adversarial arena, and has the cognitive abilities needed to argue his case adequately. It is recommended that practitioners use the model in their testimony to provide the trier of fact with a comprehensive report of the areas in which a defendant has deficits that will prevent him from protecting his interests in receiving a fair and equitable trial. © 2016 American Academy of Psychiatry and the Law.

  19. Representing Resources in Petri Net Models: Hardwiring or Soft-coding?

    OpenAIRE

    2011-01-01

    This paper presents an interesting design problem in developing a new tool for discrete-event dynamic systems (DEDS). A new tool known as GPenSIM was developed for modeling and simulation of DEDS; GPenSIM is based on Petri Nets. The design issue this paper talks about is whether to represent resources in DEDS hardwired as a part of the Petri net structure (which is the widespread practice) or to soft code as common variables in the program code. This paper shows that soft coding resources giv...

  20. Representing environment-induced helix-coil transitions in a coarse grained peptide model

    Science.gov (United States)

    Dalgicdir, Cahit; Globisch, Christoph; Sayar, Mehmet; Peter, Christine

    2016-10-01

    Coarse grained (CG) models are widely used in studying peptide self-assembly and nanostructure formation. One of the recurrent challenges in CG modeling is the problem of limited transferability, for example to different thermodynamic state points and system compositions. Understanding transferability is generally a prerequisite to knowing for which problems a model can be reliably used and predictive. For peptides, one crucial transferability question is whether a model reproduces the molecule's conformational response to a change in its molecular environment. This is of particular importance since CG peptide models often have to resort to auxiliary interactions that aid secondary structure formation. Such interactions take care of properties of the real system that are per se lost in the coarse graining process such as dihedral-angle correlations along the backbone or backbone hydrogen bonding. These auxiliary interactions may then easily overstabilize certain conformational propensities and therefore destroy the ability of the model to respond to stimuli and environment changes, i.e. they impede transferability. In the present paper we have investigated a short peptide with amphiphilic EALA repeats which undergoes conformational transitions between a disordered and a helical state upon a change in pH value or due to the presence of a soft apolar/polar interface. We designed a base CG peptide model that does not carry a specific (backbone) bias towards a secondary structure. This base model was combined with two typical approaches of ensuring secondary structure formation, namely a Cα-Cα-Cα-Cα pseudo-dihedral angle potential or a virtual site interaction that mimics hydrogen bonding. We have investigated the ability of the two resulting CG models to represent the environment-induced conformational changes in the helix-coil equilibrium of EALA. We show that with both approaches a CG peptide model can be obtained that is environment-transferable and that
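
    The quantity the pseudo-dihedral bias acts on can be computed as below (a generic dihedral-angle routine in Python, not the published force field; the bead coordinates are arbitrary).

        import numpy as np

        def pseudo_dihedral(p0, p1, p2, p3):
            # Dihedral angle (degrees) defined by four consecutive bead positions.
            b1, b2, b3 = p1 - p0, p2 - p1, p3 - p2
            n1, n2 = np.cross(b1, b2), np.cross(b2, b3)
            m1 = np.cross(n1, b2 / np.linalg.norm(b2))
            return np.degrees(np.arctan2(np.dot(m1, n2), np.dot(n1, n2)))

        beads = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0],
                          [2.0, 1.4, 0.0], [3.2, 1.6, 1.1]])
        print(pseudo_dihedral(*beads))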

  1. Impact of Land-Use and Land-Cover Change on urban air quality in representative cities of China

    Science.gov (United States)

    Sun, L.; Wei, J.; Duan, D. H.; Guo, Y. M.; Yang, D. X.; Jia, C.; Mi, X. T.

    2016-05-01

    The atmospheric particulate pollution in China is getting worse. Land-Use and Land-Cover Change (LUCC) is a key factor that affects atmospheric particulate pollution. Understanding the response of particulate pollution to LUCC is necessary for environmental protection. Eight representative cities in China, Qingdao, Jinan, Zhengzhou, Xi'an, Lanzhou, Zhangye, Jiuquan, and Urumqi were selected to analyze the relationship between particulate pollution and LUCC. The MODIS (MODerate-resolution Imaging Spectroradiometer) aerosol product (MOD04) was used to estimate atmospheric particulate pollution for nearly 10 years, from 2001 to 2010. Six land-use types, water, woodland, grassland, cultivated land, urban, and unused land, were obtained from the MODIS land cover product (MOD12), where the LUCC of each category was estimated. The response of particulate pollution to LUCC was analyzed from the above mentioned two types of data. Moreover, the impacts of time-lag and urban type changes on particulate pollution were also considered. Analysis results showed that due to natural factors, or human activities such as urban sprawl or deforestation, etc., the response of particulate pollution to LUCC shows obvious differences in different areas. The correlation between particulate pollution and LUCC is lower in coastal areas but higher in inland areas. The dominant factor affecting urban air quality in LUCC changes from ocean, to woodland, to urban land, and eventually into grassland or unused land when moving from the coast to inland China.

  2. SOIL QUALITY ASSESSMENT USING FUZZY MODELING

    Science.gov (United States)

    Maintaining soil productivity is essential if agriculture production systems are to be sustainable, thus soil quality is an essential issue. However, there is a paucity of tools for measurement for the purpose of understanding changes in soil quality. Here the possibility of using fuzzy modeling t...

  3. Quality model for semantic IS standards

    NARCIS (Netherlands)

    Folmer, E.J.A.

    2011-01-01

    Semantic IS (Information Systems) standards are essential for achieving interoperability between organizations. However a recent survey suggests that not the full benefits of standards are achieved, due to the quality issues. This paper presents a quality model for semantic IS standards, that should

  4. WATER QUALITY MODELING OF SUZHOU CREEK

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Water-quality models are important tools for improving the river environment. In this paper, the project "Water Quality Modeling of the Suzhou Creek" was briefly described, including the choice and the principle of the model, the model study and methods, and the calibration and verification of the stream model. A set of parameters characterizing the water environment of the Suzhou Creek was put forward during the third water dispatch experiment in 1999. It is necessary to point out that these parameters will change with the rehabilitation and construction of the Suzhou Creek.

  5. Lightweight Expression of Granular Objects (LEGO) Content Modeling Using the SNOMED CT Observables Model to Represent Nursing Assessment Data.

    Science.gov (United States)

    Johnson, Christie

    2016-01-01

    This poster presentation presents a content modeling strategy using the SNOMED CT Observable Model to represent large amounts of detailed clinical data in a consistent and computable manner that can support multiple use cases. Lightweight Expression of Granular Objects (LEGOs) represent question/answer pairs on clinical data collection forms, where a question is modeled by a (usually) post-coordinated SNOMED CT expression. LEGOs transform electronic patient data into a normalized consumable, which means that the expressions can be treated as extensions of the SNOMED CT hierarchies for the purpose of performing subsumption queries and other analytics. Utilizing the LEGO approach for modeling clinical data obtained from a nursing admission assessment provides a foundation for data exchange across disparate information systems and software applications. Clinical data exchange of computable LEGO patient information enables the development of more refined data analytics, data storage and clinical decision support.

  6. Can we trust climate models to realistically represent severe European windstorms?

    Science.gov (United States)

    Trzeciak, Tomasz M.; Knippertz, Peter; Pirret, Jennifer S. R.; Williams, Keith D.

    2016-06-01

    Cyclonic windstorms are one of the most important natural hazards for Europe, but robust climate projections of the position and the strength of the North Atlantic storm track are not yet possible, posing significant risks to European societies and the (re)insurance industry. Previous studies addressing the problem of climate model uncertainty through statistical comparisons of simulations of the current climate with (re-)analysis data show large disagreement between different climate models, different ensemble members of the same model and observed climatologies of intense cyclones. One weakness of such evaluations lies in the difficulty of separating influences of the climate model's basic state from the influence of fast processes on the development of the most intense storms, which could create compensating effects and therefore suggest higher reliability than there really is. This work aims to shed new light into this problem through a cost-effective "seamless" approach of hindcasting 20 historical severe storms with two global climate models, ECHAM6 and the GA4 configuration of the Met Office Unified Model, run in a numerical weather prediction mode using different lead times, and horizontal and vertical resolutions. These runs are then compared to re-analysis data. The main conclusions from this work are: (a) objectively identified cyclone tracks are represented satisfactorily by most hindcasts; (b) sensitivity to vertical resolution is low; (c) cyclone depth is systematically under-predicted for a coarse resolution of T63 by both climate models; (d) no systematic bias is found for the higher resolution of T127 out to about three days, demonstrating that climate models are in fact able to represent the complex dynamics of explosively deepening cyclones well, if given the correct initial conditions; (e) an analysis using a recently developed diagnostic tool based on the surface pressure tendency equation points to too weak diabatic processes, mainly latent

  7. Air Quality Dispersion Modeling - Alternative Models

    Science.gov (United States)

    Models, not listed in Appendix W, that can be used in regulatory applications with case-by-case justification to the Reviewing Authority as noted in Section 3.2, Use of Alternative Models, in Appendix W.

  8. Representing winter wheat in the Community Land Model (version 4.5)

    Science.gov (United States)

    Lu, Yaqiong; Williams, Ian N.; Bagley, Justin E.; Torn, Margaret S.; Kueppers, Lara M.

    2017-05-01

    Winter wheat is a staple crop for global food security, and is the dominant vegetation cover for a significant fraction of Earth's croplands. As such, it plays an important role in carbon cycling and land-atmosphere interactions in these key regions. Accurate simulation of winter wheat growth is not only crucial for future yield prediction under a changing climate, but also for accurately predicting the energy and water cycles for winter wheat dominated regions. We modified the winter wheat model in the Community Land Model (CLM) to better simulate winter wheat leaf area index, latent heat flux, net ecosystem exchange of CO2, and grain yield. These modifications included schemes to represent vernalization as well as frost tolerance and damage. We calibrated three key parameters (minimum planting temperature, maximum crop growth days, and initial value of leaf carbon allocation coefficient) and modified the grain carbon allocation algorithm for simulations at the US Southern Great Plains ARM site (US-ARM), and validated the model performance at eight additional sites across North America. We found that the new winter wheat model improved the prediction of monthly variation in leaf area index and reduced the root mean square error (RMSE) of latent heat flux and net ecosystem exchange by 41 and 35 % during the spring growing season. The model accurately simulated the interannual variation in yield at the US-ARM site, but underestimated yield by 35 % at sites and in regions (northwestern and southeastern US) with historically greater yields.

  9. Is the mental wellbeing of young Australians best represented by a single, multidimensional or bifactor model?

    Science.gov (United States)

    Hides, Leanne; Quinn, Catherine; Stoyanov, Stoyan; Cockshaw, Wendell; Mitchell, Tegan; Kavanagh, David J

    2016-07-30

    Internationally there is a growing interest in the mental wellbeing of young people. However, it is unclear whether mental wellbeing is best conceptualized as a general wellbeing factor or a multidimensional construct. This paper investigated whether mental wellbeing, measured by the Mental Health Continuum-Short Form (MHC-SF), is best represented by: (1) a single-factor general model; (2) a three-factor multidimensional model or (3) a combination of both (bifactor model). 2220 young Australians aged between 16 and 25 years completed an online survey including the MHC-SF and a range of other wellbeing and mental ill-health measures. Exploratory factor analysis supported a bifactor solution, comprised of a general wellbeing factor, and specific group factors of psychological, social and emotional wellbeing. Confirmatory factor analysis indicated that the bifactor model had a better fit than competing single and three-factor models. The MHC-SF total score was more strongly associated with other wellbeing and mental ill-health measures than the social, emotional or psychological subscale scores. Findings indicate that the mental wellbeing of young people is best conceptualized as an overarching latent construct (general wellbeing) to which emotional, social and psychological domains contribute. The MHC-SF total score is a valid and reliable measure of this general wellbeing factor.

  10. Using ecosystem services to represent the environment in hydro-economic models

    Science.gov (United States)

    Momblanch, Andrea; Connor, Jeffery D.; Crossman, Neville D.; Paredes-Arquiola, Javier; Andreu, Joaquín

    2016-07-01

    Demand for water is expected to grow in line with global human population growth, but opportunities to augment supply are limited in many places due to resource limits and expected impacts of climate change. Hydro-economic models are often used to evaluate water resources management options, commonly with a goal of understanding how to maximise water use value and reduce conflicts among competing uses. The environment is now an important factor in decision making, which has resulted in its inclusion in hydro-economic models. We reviewed 95 studies applying hydro-economic models, and documented how the environment is represented in them and the methods they use to value environmental costs and benefits. We also sought out key gaps and inconsistencies in the treatment of the environment in hydro-economic models. We found that representation of environmental values of water is patchy in most applications, and there should be systematic consideration of the scope of environmental values to include and how they should be valued. We argue that the ecosystem services framework offers a systematic approach to identify the full range of environmental costs and benefits. The main challenges to more holistic representation of the environment in hydro-economic models are the current limits to understanding of ecological functions which relate physical, ecological and economic values and critical environmental thresholds; and the treatment of uncertainty.

  11. EXPERIMENTAL EVALUATION OF NUMERICAL MODELS TO REPRESENT THE STIFFNESS OF LAMINATED ROTOR CORES IN ELECTRICAL MACHINES

    Directory of Open Access Journals (Sweden)

    HIDERALDO L. V. SANTOS

    2013-08-01

    Full Text Available Usually, electrical machines have a metallic cylinder made up of a compacted stack of thin metal plates (referred to as the laminated core) assembled with an interference fit on the shaft. The laminated structure is required to improve the electrical performance of the machine and, besides adding inertia, also enhances the stiffness of the system. Inadequate characterization of this element may lead to errors when assessing the dynamic behavior of the rotor. The aim of this work was therefore to evaluate three beam models used to represent the laminated core of rotating electrical machines. The following finite element beam models are analyzed: (i) an “equivalent diameter model”, (ii) an “unbranched model” and (iii) a “branched model”. To validate the numerical models, experiments are performed with nine different electrical rotors so that the first non-rotating natural frequencies and corresponding vibration modes in a free-free support condition are obtained experimentally. The models are evaluated by comparing the natural frequencies and corresponding vibration mode shapes obtained experimentally with those obtained numerically. Finally, a critical discussion of the behavior of the beam models studied is presented. The results show that for the majority of the rotors tested, the “branched model” is the most suitable

  12. Representing life in the Earth system with soil microbial functional traits in the MIMICS model

    Science.gov (United States)

    Wieder, W. R.; Grandy, A. S.; Kallenbach, C. M.; Taylor, P. G.; Bonan, G. B.

    2015-06-01

    Projecting biogeochemical responses to global environmental change requires multi-scaled perspectives that consider organismal diversity, ecosystem processes, and global fluxes. However, microbes, the drivers of soil organic matter decomposition and stabilization, remain notably absent from models used to project carbon (C) cycle-climate feedbacks. We used a microbial trait-based soil C model with two physiologically distinct microbial communities, and evaluated how this model represents soil C storage and response to perturbations. Drawing from the application of functional traits used to model other ecosystems, we incorporate copiotrophic and oligotrophic microbial functional groups in the MIcrobial-MIneral Carbon Stabilization (MIMICS) model; these functional groups are akin to "gleaner" vs. "opportunist" plankton in the ocean, or r- vs. K-strategists in plant and animal communities. Here we compare MIMICS to a conventional soil C model, DAYCENT (the daily time-step version of the CENTURY model), in cross-site comparisons of nitrogen (N) enrichment effects on soil C dynamics. MIMICS more accurately simulates C responses to N enrichment; moreover, it raises important hypotheses involving the roles of substrate availability, community-level enzyme induction, and microbial physiological responses in explaining various soil biogeochemical responses to N enrichment. In global-scale analyses, we show that MIMICS projects much slower rates of soil C accumulation than a conventional soil biogeochemistry model in response to increasing C inputs with elevated carbon dioxide (CO2) - a finding that would reduce the size of the land C sink estimated by the Earth system model. Our findings illustrate that tradeoffs between theory and utility can be overcome to develop soil biogeochemistry models that evaluate and advance our theoretical understanding of microbial dynamics and soil biogeochemical responses to environmental change.
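    To illustrate the general idea of resolving physiologically distinct microbial functional groups, the sketch below lets a copiotrophic ("r") and an oligotrophic ("K") group decompose a common litter pool with Michaelis-Menten kinetics. The structure and all rate parameters are illustrative assumptions and are not the published MIMICS equations.

```python
# Two microbial functional groups decomposing a shared litter pool (all values illustrative)
VMAX = {"r": 0.8, "K": 0.2}     # maximum uptake rates (1/day)
KM   = {"r": 50.0, "K": 10.0}   # half-saturation constants (mg C / g soil)
CUE  = {"r": 0.4, "K": 0.6}     # carbon use efficiencies
TURN = {"r": 0.05, "K": 0.01}   # microbial turnover rates (1/day)

def step(litter, mic, dt=1.0):
    """One explicit Euler step of Michaelis-Menten litter decomposition by both groups."""
    new_mic = {}
    for g in ("r", "K"):
        uptake = VMAX[g] * mic[g] * litter / (KM[g] + litter)
        growth = CUE[g] * uptake               # the rest of the uptake is respired
        death = TURN[g] * mic[g]
        new_mic[g] = mic[g] + dt * (growth - death)
        litter += dt * (death - uptake)        # dead biomass is recycled to the litter pool
    return litter, new_mic

litter, mic = 100.0, {"r": 1.0, "K": 1.0}
for day in range(365):
    litter, mic = step(litter, mic)
print(round(litter, 1), {g: round(v, 2) for g, v in mic.items()})
```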

  13. How to Represent 100-meter Spatial Heterogeneity in Earth System Models

    Science.gov (United States)

    Chaney, Nathaniel; Shevliakova, Elena; Malyshev, Sergey

    2016-04-01

    Terrestrial ecosystems play a pivotal role in the Earth system; they have a profound impact on the global climate, food and energy production, freshwater resources, and biodiversity. One of the most fascinating yet challenging aspects of characterizing terrestrial ecosystems is their field-scale (~100 m) spatial heterogeneity. It has been observed repeatedly that the water, energy, and biogeochemical cycles at multiple temporal and spatial scales have deep ties to an ecosystem's spatial structure. Current Earth system models largely disregard this important relationship leading to an inadequate representation of ecosystem dynamics. In this presentation, we will show how existing hyperresolution environmental datasets can be harnessed to explicitly represent field-scale spatial heterogeneity in Earth system models. For each macroscale grid cell, these environmental data are clustered according to their field-scale soil and topographic attributes to define unique sub-grid tiles or hydrologic response units (HRUs). The novel Geophysical Fluid Dynamics Laboratory (GFDL) LM3-TiHy-PPA land model is then used to simulate these HRUs and their spatial interactions via the exchange of water, energy, and nutrients along explicit topographic gradients. Using historical simulations over the contiguous United States, we will show how a robust representation of field-scale spatial heterogeneity impacts modeled ecosystem dynamics including the water, energy, and biogeochemical cycles as well as vegetation composition and distribution.
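    The clustering step described above can be illustrated with a toy example: fine-scale soil and topographic attributes within one macroscale grid cell are grouped into sub-grid tiles (HRUs) whose area fractions become tile weights. The synthetic attributes and the cluster count are assumptions for illustration only, not the GFDL LM3-TiHy-PPA configuration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_pixels = 10_000                                    # ~100 m pixels inside one macroscale grid cell

# Synthetic field-scale attributes standing in for real hyperresolution datasets
elevation = rng.normal(300.0, 40.0, n_pixels)        # m
slope = rng.gamma(2.0, 2.0, n_pixels)                # degrees
sand_frac = rng.uniform(0.1, 0.9, n_pixels)          # soil sand fraction
twi = rng.normal(8.0, 2.0, n_pixels)                 # topographic wetness index

X = StandardScaler().fit_transform(np.column_stack([elevation, slope, sand_frac, twi]))

# Group pixels into a small number of hydrologic response units (sub-grid tiles)
n_hrus = 8
labels = KMeans(n_clusters=n_hrus, n_init=10, random_state=0).fit_predict(X)

# The area fraction of each HRU becomes the weight of the corresponding tile
fractions = np.bincount(labels, minlength=n_hrus) / n_pixels
print(fractions.round(3))
```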

  14. An air quality model for Central Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Jazcilevich, D. Aron; Garcia, R. Agustin; Suarez, Gerardo Ruiz; Magana, R. Victor; Perez, L. Jose Luis [Universidad Nacional Autonoma de Mexico, Centro de Ciencias de la Atmosfera, Mexico City (Mexico); Fuentes-Gea, Vicente [Universidad Nacional Autonoma de Mexico, Div. de Estudios del Posgrado, Mexico City (Mexico)

    1999-07-01

    A computational air quality model for Central Mexico, covering the Basin of the Valley of Mexico and the Valleys of Toluca, Puebla and Cuernavaca, and already in experimental operation, is presented. The meteorology of the region is obtained by combining two non-hydrostatic models: a model designed for synoptic scales, MM5, provides initial and boundary data to a model specially designed for urban environments and scales, MEMO. The transport model uses numerical techniques developed by the authors that eliminate numerical diffusion and dispersion. For the photochemical model, several ODE integrators were tested. The emissions model developed uses the latest inventory data gathered in the region. (Author)

  15. TOTAL QUALITY CUSTOMER SATISFACTION MODEL

    OpenAIRE

    Jesús Cruz Álvarez; Jesús Fabián López; Carlos Monge Perry

    2014-01-01

    In today’s business environment, all organizations are required to focus on their customers in order to fully understand their needs. There is a need to drive and engage strategic actions in order to close any potential gaps between customers' expectations and manufacturers' deliverables. Current customer satisfaction theory appears to lack a holistic model that broadly covers the extent of the customer satisfaction concept. This article emphasizes the need for an integrated customer sa...

  16. Fault detection in processes represented by PLS models using an EWMA control scheme

    KAUST Repository

    Harrou, Fouzi

    2016-10-20

    Fault detection is important for effective and safe process operation. Partial least squares (PLS) has been used successfully in fault detection for multivariate processes with highly correlated variables. However, the conventional PLS-based detection metrics, such as Hotelling's T and the Q statistics, are not well suited to detecting small faults because they only use information about the process in the most recent observation. The exponentially weighted moving average (EWMA), however, has been shown to be more sensitive to small shifts in the mean of process variables. In this paper, a PLS-based EWMA fault detection method is proposed for monitoring processes represented by PLS models. The performance of the proposed method is compared with that of the traditional PLS-based fault detection method through a simulated example involving various fault scenarios that could be encountered in real processes. The simulation results clearly show the effectiveness of the proposed method over the conventional PLS method.
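    A minimal sketch of the general PLS-plus-EWMA idea described above: fit a PLS model on fault-free data, then monitor the prediction residuals of new data with an EWMA chart. The simulated data, smoothing factor and control-limit constant are illustrative and are not taken from the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
W = rng.normal(size=(2, 6))                      # fixed latent-variable loadings

def make_data(n, shift=0.0):
    """Correlated process variables X and a quality variable y (+ optional mean shift)."""
    t = rng.normal(size=(n, 2))
    X = t @ W + 0.1 * rng.normal(size=(n, 6))
    y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=n) + shift
    return X, y

# Fit the PLS model on fault-free data and estimate the residual spread
X0, y0 = make_data(500)
pls = PLSRegression(n_components=2).fit(X0, y0)
sigma = (y0 - pls.predict(X0).ravel()).std(ddof=1)

# New data with a small mean shift (the "fault") injected in the second half
Xa, ya = make_data(100)
Xb, yb = make_data(100, shift=2.5 * sigma)
e = np.concatenate([ya, yb]) - pls.predict(np.vstack([Xa, Xb])).ravel()

# EWMA chart on the PLS prediction residuals
lam, L = 0.2, 3.0
limit = L * sigma * np.sqrt(lam / (2.0 - lam))
z, alarms = 0.0, []
for i, ei in enumerate(e):
    z = lam * ei + (1.0 - lam) * z
    if abs(z) > limit:
        alarms.append(i)
print("first alarm at sample:", alarms[0] if alarms else "none")
```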

  17. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

    Full Text Available Business process modelling and analysing is undoubtedly one of the most important parts of Applied (Business) Informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst’s work is to create generally understandable, explicit and error-free models. If a process is properly described, the created models can be used as an input into deep analysis and optimization. It can be assumed that properly designed business process models (similarly as in the case of correctly written algorithms) contain characteristics that can be mathematically described. Besides, it will be possible to create a tool that will help process analysts to design proper models. As part of this review, a systematic literature review was conducted in order to find and analyse business process model design and quality measures. It was found that this area had already been the subject of research investigation in the past. Thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed scientific publications and existing quality measures do not reflect all important attributes of business process model clarity, simplicity and completeness. Therefore it would be appropriate to add new measures of quality.

  18. Representing northern peatland microtopography and hydrology within the Community Land Model

    Directory of Open Access Journals (Sweden)

    X. Shi

    2015-02-01

    Full Text Available Predictive understanding of northern peatland hydrology is a necessary precursor to understanding the fate of massive carbon stores in these systems under the influence of present and future climate change. Current models have begun to address microtopographic controls on peatland hydrology, but none have included a prognostic calculation of peatland water table depth for a vegetated wetland, independent of prescribed regional water tables. We introduce here a new configuration of the Community Land Model (CLM) which includes a fully prognostic water table calculation for a vegetated peatland. Our structural and process changes to CLM focus on modifications needed to represent the hydrologic cycle of bog environments with perched water tables, as well as distinct hydrologic dynamics and vegetation communities of the raised hummock and sunken hollow microtopography characteristic of peatland bogs. The modified model was parameterized and independently evaluated against observations from an ombrotrophic raised-dome bog in northern Minnesota (S1-Bog), the site for the Spruce and Peatland Responses Under Climatic and Environmental Change experiment (SPRUCE). Simulated water table levels compared well with site-level observations. The new model predicts significant hydrologic changes in response to planned warming at the SPRUCE site. At present, standing water is commonly observed in bog hollows after large rainfall events during the growing season, but simulations suggest a sharp decrease in water table levels due to increased evapotranspiration under the most extreme warming level, nearly eliminating the occurrence of standing water in the growing season. Simulated soil energy balance was strongly influenced by reduced winter snowpack under warming simulations, with the warming influence on soil temperature partly offset by the loss of insulating snowpack in early and late winter. The new model provides improved predictive capacity for seasonal

  19. Modelling of Buckingham Canal water quality.

    Science.gov (United States)

    Abbasi, S A; Khan, F I; Sentilvelan, K; Shabudeen, A

    2002-10-01

    The paper presents a case study of the modelling of the water quality of a canal situated in a petrochemical industrial complex, which receives wastewaters from Madras Refineries Limited (MRL) and Madras Fertilizers Limited (MFL). The canal, the well-known Buckingham Canal, which passes through Chennai (Madras), India, has been modelled using the software QUAL2E-UNCAS. After testing and validation of the model, simulations have been carried out. The exercise enables forecasting the impacts of different seasons, base flows, and waste water inputs on the water quality of the Buckingham Canal. It also enables development of water management strategies.

  20. Teaching-Family Model: Insuring Quality Practice

    Science.gov (United States)

    McElgunn, Peggy

    2012-01-01

    The Teaching-Family Model was one of the earliest approaches to be supported by an extensive research base. As it has evolved over four decades, it retains the focus on teaching and learning but incorporates a strength- and relationship-based orientation. The model is also unique in gathering ongoing practice-based evidence to insure quality.

  1. New challenges in integrated water quality modelling

    NARCIS (Netherlands)

    Rode, M.; Arhonditsis, G.; Balin, D.; Kebede, T.; Krysanova, V.; Griensven, A.; Zee, van der S.E.A.T.M.

    2010-01-01

    There is an increasing pressure for development of integrated water quality models that effectively couple catchment and in-stream biogeochemical processes. This need stems from increasing legislative requirements and emerging demands related to contemporary climate and land use changes. Modelling w

  2. Model based monitoring of stormwater runoff quality

    DEFF Research Database (Denmark)

    Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen

    2012-01-01

    Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combination of model with field sampling) affect the information obtained about MPs discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by volume-proportional and passive sampling in a storm drainage system in the outskirts of Copenhagen (Denmark) and a 10-year rain series was used to find annual... ...) for calibration of the model resulted in the same predicted level but narrower model prediction bounds than calibrations based on volume-proportional samples, allowing a better exploitation of the resources allocated for stormwater quality management.

  3. DEVELOPMENT OF TWO-DIMENSIONAL HYDRODYNAMIC AND WATER QUALITY MODEL FOR HUANGPU RIVER

    Institute of Scientific and Technical Information of China (English)

    Xu Zu-xin; Yin Hai-long

    2003-01-01

    Based on the numerical computation models RMA2 and RMA4 with open source code, finite element meshes representing the study domain are created; then the finite element hydrodynamic and water quality model for the Huangpu River is developed and calibrated, and the simulation results are analyzed. The developed hydrodynamic and water quality model is used to analyze the influence of wastewater discharged from the planned Wastewater Treatment Plant (WWTP) on the Huangpu River's water quality.

  4. Impact of urban parameterization on high resolution air quality forecast with the GEM - AQ model

    National Research Council Canada - National Science Library

    J. Struzewska; J. W. Kaminski

    2012-01-01

    ... island and pollutant concentrations. In this study we used the Town Energy Balance (TEB) parameterization to represent urban effects on modelled meteorological and air quality parameters at the final nesting level with horizontal resolution...

  5. Representing dispositions

    Directory of Open Access Journals (Sweden)

    Röhl Johannes

    2011-08-01

    Full Text Available Dispositions and tendencies feature significantly in the biomedical domain and therefore in representations of knowledge of that domain. They are not only important for specific applications like an infectious disease ontology, but also as part of a general strategy for modelling knowledge about molecular interactions. But the task of representing dispositions in some formal ontological systems is fraught with several problems, which are partly due to the fact that Description Logics can only deal well with binary relations. The paper will discuss some of the results of the philosophical debate about dispositions, in order to see whether the formal relations needed to represent dispositions can be broken down to binary relations. Finally, we will discuss problems arising from the possibility of the absence of realizations, of multi-track or multi-trigger dispositions and offer suggestions on how to deal with them.

  6. Design of a Representative Low Earth Orbit Satellite to Improve Existing Debris Models

    Science.gov (United States)

    Clark, S.; Dietrich, A.; Werremeyer, M.; Fitz-Coy, N.; Liou, J.-C.

    2012-01-01

    This paper summarizes the process and methodologies used in the design of a small satellite, DebriSat, that represents materials and construction methods used in modern-day Low Earth Orbit (LEO) satellites. This satellite will be used in a future hypervelocity impact test with the overall purpose of investigating the physical characteristics of modern LEO satellites after an on-orbit collision. The major ground-based satellite impact experiment used by DoD and NASA in their development of satellite breakup models was conducted in 1992. The target used for that experiment was a Navy Transit satellite (40 cm, 35 kg) fabricated in the 1960s. Modern satellites are very different in materials and construction techniques from a satellite built 40 years ago. Therefore, there is a need to conduct a similar experiment using a modern target satellite to improve the fidelity of the satellite breakup models. The design of DebriSat will focus on designing and building a next-generation satellite to more accurately portray modern satellites. The design of DebriSat included a comprehensive study of historical LEO satellite designs and missions within the past 15 years for satellites ranging from 10 kg to 5000 kg. This study identified modern trends in hardware, material, and construction practices utilized in recent LEO missions, and helped direct the design of DebriSat.

  7. Water quality modelling of Lis River, Portugal.

    Science.gov (United States)

    Vieira, Judite; Fonseca, André; Vilar, Vítor J P; Boaventura, Rui A R; Botelho, Cidália M S

    2013-01-01

    The aim of the study was to predict the impact of flow conditions, discharges and tributaries on the water quality of Lis River using the QUAL2Kw model. Calibration of the model was performed, based on data obtained in field surveys carried out in July 2004 and November 2006. Generally the model fitted the experimental data quite well. The results indicated a decrease in water quality in the downstream area of Lis River, after the confluence of the Lena, Milagres and Amor tributaries, as a result of discharges of wastewaters containing degradable organics, nutrients and pathogenic organisms from cattle-raising wastewaters, domestic effluents and agricultural runoff. The water quality criteria were exceeded in these areas for dissolved oxygen, biochemical oxygen demand, total nitrogen and faecal coliforms. Water quality modelling in different scenarios showed that the impact of tributaries on the quality of Lis River water was quite negligible and mainly depends on discharges, which are responsible for an increase of almost 45, 13 and 44 % of ultimate carbonaceous biochemical oxygen demand (CBOD(u)), ammonium nitrogen and faecal coliforms, for winter simulation, and 23, 33 and 36 % for summer simulation, respectively, when compared to the real case scenario.

  8. 8760-Based Method for Representing Variable Generation Capacity Value in Capacity Expansion Models: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cole, Wesley J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sun, Yinong [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mai, Trieu T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Richards, James [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-01

    Capacity expansion models (CEMs) are widely used to evaluate the least-cost portfolio of electricity generators, transmission, and storage needed to reliably serve demand over the evolution of many years or decades. Various CEM formulations are used to evaluate systems ranging in scale from states or utility service territories to national or multi-national systems. CEMs can be computationally complex, and to achieve acceptable solve times, key parameters are often estimated using simplified methods. In this paper, we focus on two of these key parameters associated with the integration of variable generation (VG) resources: capacity value and curtailment. We first discuss common modeling simplifications used in CEMs to estimate capacity value and curtailment, many of which are based on a representative subset of hours that can miss important tail events or which require assumptions about the load and resource distributions that may not match actual distributions. We then present an alternate approach that captures key elements of chronological operation over all hours of the year without the computationally intensive economic dispatch optimization typically employed within more detailed operational models. The updated methodology characterizes the (1) contribution of VG to system capacity during high load and net load hours, (2) the curtailment level of VG, and (3) the potential reductions in curtailments enabled through deployment of storage and more flexible operation of select thermal generators. We apply this alternate methodology to an existing CEM, the Regional Energy Deployment System (ReEDS). Results demonstrate that this alternate approach provides more accurate estimates of capacity value and curtailments by explicitly capturing system interactions across all hours of the year. This approach could be applied more broadly to CEMs at many different scales where hourly resource and load data is available, greatly improving the representation of challenges
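    A toy illustration of the 8760-hour idea: with full hourly load and variable-generation profiles, a capacity credit can be estimated from output during the highest net-load hours and curtailment from generation in excess of load. The synthetic profiles, the top-100-hours convention and the no-storage assumption are illustrative and do not reproduce the ReEDS implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(8760)

# Synthetic hourly load (MW) and a crude solar-like VG profile (MW), for illustration only
load = 1000 + 200 * np.sin(2 * np.pi * hours / 24) + 50 * rng.normal(size=8760)
vg = np.clip(1200 * np.sin(2 * np.pi * (hours % 24 - 6) / 24), 0, None)

# Capacity value: average VG output during the top-N net-load hours, relative to peak output
net_load = load - vg
top = np.argsort(net_load)[-100:]                  # 100 highest net-load hours (a common convention)
capacity_value = vg[top].mean() / vg.max()

# Curtailment: VG in excess of load (toy system with no storage, exports or must-run constraints)
curtailed = np.clip(vg - load, 0.0, None).sum()
curtailment_rate = curtailed / vg.sum()

print(f"capacity value ~ {capacity_value:.2f}, curtailment ~ {curtailment_rate:.1%}")
```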

  9. Priming and substrate quality interactions in soil organic matter models

    Directory of Open Access Journals (Sweden)

    T. Wutzler

    2012-12-01

    Full Text Available Interactions between different qualities of soil organic matter (SOM) affecting their turnover are rarely represented in models. In this study we propose three mathematical strategies at different levels of abstraction for representing those interactions. Implementing these strategies into the Introductory Carbon Balance Model (ICBM) and applying them to several scenarios of litter input show that the different levels of abstraction are applicable on different time scales. We present a simple one-parameter equation of substrate limitation applicable at the decadal time scale that is straightforward to implement into other models of SOM dynamics. We show how substrate quality interactions can explain priming effects, acceleration of turnover times in FACE experiments, and the slowdown of decomposition in long-term bare fallow experiments as an effect of energy limitation of microbial biomass. The mechanisms of those interactions need to be further scrutinized empirically for a more complete understanding. Overall, substrate quality interactions offer a valuable way of understanding and quantitatively modelling SOM dynamics.
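    A sketch of how a substrate-limitation term can be bolted onto the two-pool ICBM structure. The standard ICBM equations are used, but the limitation factor below is a generic saturating placeholder (with a hypothetical parameter K_s), not the one-parameter equation proposed in the paper; it merely reproduces the qualitative behaviour of old-pool decomposition slowing when fresh substrate is exhausted (e.g. bare fallow).

```python
# Standard ICBM two-pool structure (young Y, old O); parameter values are illustrative
K_Y, K_O = 0.8, 0.006    # decomposition rate constants (1/yr)
H = 0.13                 # humification fraction
R = 1.0                  # external (climate) factor

def run(years, litter_input, K_s=None, Y0=1.0, O0=40.0, dt=0.01):
    """Integrate ICBM with explicit Euler. If K_s is given, old-pool turnover is
    scaled by a hypothetical saturating substrate-limitation factor Y/(Y+K_s) --
    a placeholder for the paper's one-parameter equation, not the equation itself."""
    Y, O = Y0, O0
    for _ in range(int(years / dt)):
        f = 1.0 if K_s is None else Y / (Y + K_s)
        dY = litter_input - K_Y * R * Y
        dO = H * K_Y * R * Y - K_O * R * f * O
        Y += dt * dY
        O += dt * dO
    return round(Y, 2), round(O, 2)

print("no limitation:  ", run(100, litter_input=2.0))
print("with limitation:", run(100, litter_input=2.0, K_s=1.0))
print("bare fallow:    ", run(100, litter_input=0.0, K_s=1.0))  # old-pool decay slows as Y vanishes
```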

  10. Model based monitoring of stormwater runoff quality

    DEFF Research Database (Denmark)

    Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen

    2012-01-01

    Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combination of model with field sampling) affect the information obtained about MPs discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by volume-proportional and passive sampling in a storm drainage system in the outskirts of Copenhagen (Denmark) and a 10-year rain series was used to find annual...

  11. Design and Fabrication of DebriSat - A Representative LEO Satellite for Improvements to Standard Satellite Breakup Models

    Science.gov (United States)

    Clark, S.; Dietrich, A.; Fitz-Coy, N.; Weremeyer, M.; Liou, J.-C.

    2012-01-01

    This paper discusses the design and fabrication of DebriSat, a 50 kg satellite developed to be representative of a modern low Earth orbit satellite in terms of its components, materials used, and fabrication procedures. DebriSat will be the target of a future hypervelocity impact experiment to determine the physical characteristics of debris generated after an on-orbit collision of a modern LEO satellite. The major ground-based satellite impact experiment used by DoD and NASA in their development of satellite breakup models was SOCIT, conducted in 1992. The target used for that experiment was a Navy transit satellite (40 cm, 35 kg) fabricated in the 1960s. Modern satellites are very different in materials and construction techniques from those built 40 years ago. Therefore, there is a need to conduct a similar experiment using a modern target satellite to improve the fidelity of the satellite breakup models. To ensure that DebriSat is truly representative of typical LEO missions, a comprehensive study of historical LEO satellite designs and missions within the past 15 years for satellites ranging from 1 kg to 5000 kg was conducted. This study identified modern trends in hardware, material, and construction practices utilized in recent LEO missions. Although DebriSat is an engineering model, specific attention is placed on the quality, type, and quantity of the materials used in its fabrication to ensure the integrity of the outcome. With the exception of software, all other aspects of the satellite's design, fabrication, and assembly integration and testing will be as rigorous as that of an actual flight vehicle. For example, to simulate survivability of launch loads, DebriSat will be subjected to a vibration test. In addition, the satellite will undergo thermal vacuum tests to verify that the components and overall systems meet typical environmental standards. Proper assembly and integration techniques will involve comprehensive joint analysis, including the precise

  12. Evaluation of video quality models for multimedia

    Science.gov (United States)

    Brunnström, Kjell; Hands, David; Speranza, Filippo; Webster, Arthur

    2008-02-01

    The Video Quality Experts Group (VQEG) is a group of experts from industry, academia, government and standards organizations working in the field of video quality assessment. Over the last 10 years, VQEG has focused its efforts on the evaluation of objective video quality metrics for digital video. Objective video metrics are mathematical models that predict the picture quality as perceived by an average observer. VQEG has completed validation tests for full reference objective metrics for the Standard Definition Television (SDTV) format. From this testing, two ITU Recommendations were produced. This standardization effort is of great relevance to the video industries because objective metrics can be used for quality control of the video at various stages of the delivery chain. Currently, VQEG is undertaking several projects in parallel. The most mature project is concerned with objective measurement of multimedia content. This project is probably the largest coordinated set of video quality testing ever embarked upon. The project will involve the collection of a very large database of subjective quality data. About 40 subjective assessment experiments and more than 160,000 opinion scores will be collected. These will be used to validate the proposed objective metrics. This paper describes the test plan for the project, its current status, and one of the multimedia subjective tests.

  13. Transit times and age distributions for reservoir models represented as nonlinear non-autonomous systems

    Science.gov (United States)

    Müller, Markus; Meztler, Holger; Glatt, Anna; Sierra, Carlos

    2016-04-01

    We present theoretical methods to compute dynamic residence and transit time distributions for non-autonomous systems of pools governed by coupled nonlinear differential equations. Although transit time and age distributions have been used to describe reservoir models for a long time, a closer look at their assumptions reveals two major restrictions of generality in previous studies. First, the systems are assumed to be in equilibrium; and second, the equations under consideration are assumed to be linear. While both these assumptions greatly ease the computation and interpretation of transit time and age distributions, they are not applicable to a wide range of problems. Moreover, the transfer of previous results learned from linear systems in steady state to the more complex nonlinear non-autonomous systems that do not even need to have equilibria can be dangerously misleading. Fortunately, the topic of time-dependent age and transit time distributions has recently received some attention in hydrology; here we aim to compute these distributions for systems of multiple reservoirs. We will discuss how storage selection functions can augment the information represented in an ODE system describing a system of reservoirs. We will present analytical and numerical algorithms and a Monte Carlo simulator to compute solutions for system transit time and age distributions for system-wide storage selection functions, including the most simple, but important, case of well-mixed pools.
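    A minimal Monte Carlo sketch in the spirit of the particle-tracking simulator mentioned above: individual particles are followed through a two-pool system with time-varying exit rates, assuming well-mixed pools. The pool structure, rates and transfer fraction are illustrative assumptions, not the systems analysed in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def k1(t): return 0.5 + 0.3 * np.sin(2 * np.pi * t)    # time-varying exit rate, pool 1 (1/yr)
def k2(t): return 0.2 + 0.1 * np.cos(2 * np.pi * t)    # time-varying exit rate, pool 2 (1/yr)
ALPHA = 0.6                                            # fraction passed from pool 1 to pool 2

def transit_time(t_in, dt=1e-3):
    """Follow one particle entering pool 1 at time t_in until it leaves the system."""
    t, pool = t_in, 1
    while True:
        rate = k1(t) if pool == 1 else k2(t)
        if rng.random() < rate * dt:          # particle leaves its current pool in this step
            if pool == 1 and rng.random() < ALPHA:
                pool = 2                      # transferred onward instead of exiting
            else:
                return t - t_in               # exits the system: record the transit time
        t += dt

samples = np.array([transit_time(t_in=0.25) for _ in range(500)])
print(f"mean transit time {samples.mean():.2f} yr, median {np.median(samples):.2f} yr")
```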

  14. Using EARTH Model to Estimate Groundwater Recharge at Five Representative Zones in the Hebei Plain, China

    Institute of Scientific and Technical Information of China (English)

    Bingguo Wang; Menggui Jin; Xing Liang

    2015-01-01

    Accurate estimation of groundwater recharge is essential for efficient and sustainable groundwater management in many semi-arid regions. In this paper, a lumped parameter model (EARTH) was established to simulate the recharge rate and recharge process in typical areas from combined observation data of weather, soil water and groundwater, and the spatial and temporal variation of groundwater recharge in the Hebei Plain was revealed. The mean annual recharge rates at the LQ, LC, HS, DZ and CZ representative zones are 220.1, 196.7, 34.1, 141.0 and 188.0 mm/a and the recharge coefficients are 26.5%, 22.3%, 7.2%, 20.4%, and 22.0%, respectively. The recharge rate and recharge coefficient gradually decrease from the piedmont plain to the coastal plain. Groundwater recharge appears only as yearly waves, with higher-frequency components of the input series filtered out by the deep, complicated unsaturated zone (such as at LC). At the other zones, the groundwater recharge series depends strongly on daily rainfall and irrigation because of the shallow water table or coarse lithology.

  15. Representative parameter estimation for hydrological models using a lexicographic calibration strategy

    Science.gov (United States)

    Gelleszun, Marlene; Kreye, Phillip; Meon, Günter

    2017-10-01

    We introduce a newly developed lexicographic calibration strategy to circumvent the imbalance that arises when sophisticated hydrological models are combined with complex optimisation algorithms. The criteria for the evaluation of the approach were (i) robustness and transferability of the resulting parameters, (ii) goodness-of-fit criteria in calibration and validation and (iii) time-efficiency. An order of preference was determined prior to the calibration and the parameters were separated into groups for a stepwise calibration to reduce the search space. A comparison with the global optimisation method SCE-UA showed that only 6% of the calculation time was needed; the conditions on total volume, seasonality and shape of the hydrograph were successfully met for the calibration and cross-validation periods. Furthermore, the parameter sets obtained by the lexicographic calibration strategy for different time periods were much more similar to each other than the parameters obtained by SCE-UA. Besides the similarities of the parameter sets, the goodness-of-fit criteria for the cross-validation were better for the lexicographic approach and the water balance components were also more similar. Thus, we concluded that the resulting parameters were more representative for the corresponding catchments and therefore more suitable for transferability.

  16. Identifiability analysis of the CSTR river water quality model.

    Science.gov (United States)

    Chen, J; Deng, Y

    2006-01-01

    Conceptual river water quality models are widely known to lack identifiability. The causes can be model structure errors, observational errors and infrequent sampling. Although significant efforts have been directed towards better identification of river water quality models, it is not clear whether a given model is structurally identifiable. Information is also limited regarding the contribution of different unidentifiability sources. Taking the widely applied CSTR river water quality model as an example, this paper presents a theoretical proof that the CSTR model is indeed structurally identifiable. Its uncertainty thus stems mainly from observational errors and infrequent sampling. Given the current monitoring accuracy and sampling frequency, the unidentifiability from sampling frequency is found to be more significant than that from observational errors. It is also noted that there is a crucial sampling frequency between 0.1 and 1 day, over which the simulated river system could be represented by different illusions and the model application could be far less reliable.
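    For reference, the CSTR reach model itself is a single mass balance; the sketch below integrates it and sub-samples the output at two frequencies to illustrate how coarser sampling thins the information available for identification. Flow, volume and decay parameters are illustrative.

```python
import numpy as np

def cstr(c_in, c0, Q, V, k, dt):
    """CSTR reach model: dC/dt = (Q/V)*(C_in - C) - k*C, integrated with explicit Euler."""
    c = np.empty(len(c_in))
    c[0] = c0
    for i in range(1, len(c_in)):
        c[i] = c[i - 1] + dt * ((Q / V) * (c_in[i - 1] - c[i - 1]) - k * c[i - 1])
    return c

# 10 days of a BOD-like pollutant with a daily cycle upstream (illustrative values)
dt = 0.05                                  # time step (days)
t = np.arange(0, 10, dt)
c_in = 10 + 4 * np.sin(2 * np.pi * t)      # upstream concentration (mg/L)
c = cstr(c_in, c0=8.0, Q=5.0, V=20.0, k=0.3, dt=dt)

# "Observations" at two sampling intervals spanning the 0.1-1 day range noted above
print("sampled every 0.1 d:", c[::2][:5].round(2))    # 2 steps  = 0.1 d
print("sampled every 1.0 d:", c[::20][:5].round(2))   # 20 steps = 1.0 d
```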

  17. Kidney transplantation process in Brazil represented in business process modeling notation.

    Science.gov (United States)

    Peres Penteado, A; Molina Cohrs, F; Diniz Hummel, A; Erbs, J; Maciel, R F; Feijó Ortolani, C L; de Aguiar Roza, B; Torres Pisa, I

    2015-05-01

    Kidney transplantation is considered to be the best treatment for people with chronic kidney failure, because it improves the patients' quality of life and increases their length of survival compared with patients undergoing dialysis. The kidney transplantation process in Brazil is defined through laws, decrees, ordinances, and resolutions, but there is no visual representation of this process. The aim of this study was to analyze official documents to construct a representation of the kidney transplantation process in Brazil with the use of business process modeling notation (BPMN). The methodology for this study was based on an exploratory observational study, document analysis, and construction of process diagrams with the use of BPMN. Two rounds of validations by specialists were conducted. The result includes a representation of the kidney transplantation process in Brazil in BPMN. We analyzed 2 digital documents that resulted in 2 processes with a total of 45 activities and events, 6 organizations involved, and 6 different stages of the process. The constructed representation makes it easier to understand the rules for the business of kidney transplantation and can be used by the health care professionals involved in the various activities within this process. Construction of a representation with language appropriate for the Brazilian lay public is underway. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. From representing to modelling knowledge: Proposing a two-step training for excellence in concept mapping

    Directory of Open Access Journals (Sweden)

    Joana G. Aguiar

    2017-09-01

    Full Text Available Training users in the concept mapping technique is critical for ensuring a high-quality concept map in terms of graphical structure and content accuracy. However, assessing excellence in concept mapping through structural and content features is a complex task. This paper proposes a two-step sequential training in concept mapping. The first step requires the fulfilment of low-order cognitive objectives (remember, understand and apply) to facilitate novices’ development into good Cmappers by honing their knowledge representation skills. The second step requires the fulfilment of high-order cognitive objectives (analyse, evaluate and create) to grow good Cmappers into excellent ones through the development of knowledge modelling skills. Based on Bloom’s revised taxonomy and cognitive load theory, this paper presents theoretical accounts to (1) identify the criteria distinguishing good and excellent concept maps, (2) inform instructional tasks for concept map elaboration and (3) propose a prototype for training users on concept mapping combining online and face-to-face activities. The proposed training application and the institutional certification are the next steps for the mature use of concept maps for educational as well as business purposes.

  19. Sequential box models for indoor air quality: Application to airliner cabin air quality

    Science.gov (United States)

    Ryan, P. Barry; Spengler, John D.; Halfpenny, Paul F.

    In this paper we present the development and application of a model for indoor air quality. The model represents a departure from the standard box models typically used for indoor environments, which have applicability in residences and office buildings. The model has been developed for a physical system consisting of sequential compartments which communicate only with adjacent compartments. Each compartment may contain various source and sink terms for a pollutant as well as leakage, and air transfer from adjacent compartments. The mathematical derivation affords rapid calculation of equilibrium concentrations in an essentially unlimited number of compartments. The model has been applied to air quality in the passenger cabin of three commercial aircraft. Simulations have been performed for environmental tobacco smoke (ETS) under two scenarios, CO2 and water vapor. Additionally, concentrations in one aircraft have been simulated under conditions different from the standard configuration. Results of the simulations suggest the potential for elevated concentrations of ETS in smoking sections of non-air-recirculating aircraft and throughout the aircraft when air is recirculated. Concentrations of CO2 and water vapor are consistent with expected results. We conclude that this model may be a useful tool in understanding indoor air quality in general and on aircraft in particular.
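    A sketch of the steady-state balance for such a chain of compartments that exchange air only with their neighbours, solved as a linear system. The ventilation rates, inter-compartment exchange flow and source terms are illustrative values, not those used in the aircraft simulations.

```python
import numpy as np

# Five cabin sections in a row (illustrative): clean supply air, neighbour exchange, CO2 sources
n = 5
vent = np.full(n, 300.0)                          # supply/exhaust per section (m^3/h)
exch = 100.0                                      # exchange flow with each adjacent section (m^3/h)
source = np.array([0.1, 0.6, 0.6, 0.4, 0.1])      # CO2 emitted by occupants per section (m^3/h)

# Steady state: (vent_i + sum of exchanges) * C_i - exch * C_neighbours = source_i
A = np.zeros((n, n))
for i in range(n):
    neighbours = [j for j in (i - 1, i + 1) if 0 <= j < n]
    A[i, i] = vent[i] + exch * len(neighbours)
    for j in neighbours:
        A[i, j] = -exch
conc = np.linalg.solve(A, source)                 # excess volume fraction above supply air
print((conc * 1e6).round(0), "ppm above the supply-air level")
```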

  20. A formal model for total quality management

    NARCIS (Netherlands)

    S.C. van der Made-Potuijt; H.B. Bertsch (Boudewijn); L.P.J. Groenewegen

    1996-01-01

    Total Quality Management (TQM) is a systematic approach to managing a company. TQM is systematic in the sense that it uses facts through observation, analysis and measurable goals. There are theoretical descriptions of this management concept, but there is no formal model of it. A for

  1. Potential of mathematical modeling in fruit quality

    African Journals Online (AJOL)

    ONOS

    2010-01-18

    Jan 18, 2010 ... estimate seasonal changes in quality traits as fruit size, dry matter, water content and the concentration of sugars and ... The global goodness-of-fit of a model is computed by averaging the ... into account climate variables such as radiation, salinity, .... and on exponential light extinction (Beer-Lambert Law).
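    The exponential light extinction mentioned in the snippet follows the Beer-Lambert law, I(L) = I0·exp(−k·L); a minimal sketch with an illustrative extinction coefficient is given below.

```python
import numpy as np

def light_interception(par_above, lai, k=0.6):
    """Beer-Lambert extinction through a canopy: I(L) = I0 * exp(-k * L).
    Returns the PAR transmitted below the canopy and the fraction intercepted.
    k = 0.6 is a typical but illustrative extinction coefficient."""
    transmitted = par_above * np.exp(-k * lai)
    intercepted_fraction = 1.0 - np.exp(-k * lai)
    return transmitted, intercepted_fraction

for lai in (0.5, 2.0, 4.0):
    below, frac = light_interception(par_above=1500.0, lai=lai)  # PAR in umol m-2 s-1
    print(f"LAI {lai}: {frac:.0%} intercepted, {below:.0f} transmitted")
```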

  2. A formal model for total quality management

    NARCIS (Netherlands)

    S.C. van der Made-Potuijt; H.B. Bertsch (Boudewijn); L.P.J. Groenewegen

    1996-01-01

    Total Quality Management (TQM) is a systematic approach to managing a company. TQM is systematic in the sense that it uses facts through observation, analysis and measurable goals. There are theoretical descriptions of this management concept, but there is no formal model of it. A

  3. Evaluating the Quality of the Learning Outcome in Healthcare Sector: The Expero4care Model

    Science.gov (United States)

    Cervai, Sara; Polo, Federica

    2015-01-01

    Purpose: This paper aims to present the Expero4care model. Considering the growing need for a training evaluation model that does not simply fix processes, the Expero4care model represents the first attempt at a "quality model" dedicated to the learning outcomes of healthcare training. Design/Methodology/Approach: Created as development…

  4. Uncertainty in Regional Air Quality Modeling

    Science.gov (United States)

    Digar, Antara

    Effective pollution mitigation is the key to successful air quality management. Although states invest millions of dollars to predict future air quality, the regulatory modeling and analysis process to inform pollution control strategy remains uncertain. Traditionally deterministic ‘bright-line’ tests are applied to evaluate the sufficiency of a control strategy to attain an air quality standard. A critical part of regulatory attainment demonstration is the prediction of future pollutant levels using photochemical air quality models. However, because models are uncertain, they yield a false sense of precision that pollutant response to emission controls is perfectly known and may eventually mislead the selection of control policies. These uncertainties in turn affect the health impact assessment of air pollution control strategies. This thesis explores beyond the conventional practice of deterministic attainment demonstration and presents novel approaches to yield probabilistic representations of pollutant response to emission controls by accounting for uncertainties in regional air quality planning. Computationally-efficient methods are developed and validated to characterize uncertainty in the prediction of secondary pollutant (ozone and particulate matter) sensitivities to precursor emissions in the presence of uncertainties in model assumptions and input parameters. We also introduce impact factors that enable identification of model inputs and scenarios that strongly influence pollutant concentrations and sensitivity to precursor emissions. We demonstrate how these probabilistic approaches could be applied to determine the likelihood that any control measure will yield regulatory attainment, or could be extended to evaluate probabilistic health benefits of emission controls, considering uncertainties in both air quality models and epidemiological concentration-response relationships. Finally, ground-level observations for pollutant (ozone) and precursor
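    A toy illustration of the probabilistic attainment idea described above: instead of a single deterministic prediction, the uncertain sensitivity of the design value to emission reductions is sampled, yielding a probability of attainment. The linear response form, distribution and numbers are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)

standard = 70.0          # hypothetical design-value standard (ppb)
baseline = 78.0          # modelled baseline design value (ppb)
reduction = 0.30         # planned 30 % cut in precursor emissions

# Uncertain first-order sensitivity of the design value to emissions (ppb per 100 % cut),
# represented by a distribution rather than a single deterministic value
sensitivity = rng.normal(loc=30.0, scale=8.0, size=100_000)

future = baseline - reduction * sensitivity
prob_attain = np.mean(future <= standard)
print(f"probability of attainment: {prob_attain:.2f}")
```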

  5. Aspect-Oriented Software Quality Model: The AOSQ Model

    Directory of Open Access Journals (Sweden)

    Pankaj Kumar

    2012-04-01

    Full Text Available Nowadays, software development has become more complex and dynamic; software systems are expected to be more flexible, scalable and reusable. Under the umbrella of aspects, Aspect-Oriented Software Development (AOSD) is a relatively modern programming paradigm to improve modularity in software development. An Aspect-Oriented Programming (AOP) language implements crosscutting concerns through the introduction of a new construct, the Aspect, which, like a Class, is defined as a modular unit of crosscutting behavior affecting multiple classes that can be packaged into reusable modules. Several quality models to measure the quality of software are available in the literature. However, continued software development and the acceptance of a new environment (i.e. AOP) give rise to the issue of evolvability. After the evolution of the system, we have to find out: how does the new system need to be extensible? What is its configurable status? Are the design patterns stable for the new environment and technology? How sustainable is the new system? The objective of this paper is to propose a new quality model for AOSD by integrating some new quality attributes into the AOSQUAMO Model, which is based on the ISO/IEC 9126 Quality Model; the result is called the Aspect-Oriented Software Quality (AOSQ) Model. The Analytic Hierarchy Process (AHP) is used to evaluate the improved hierarchical quality model for AOSD.
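    A minimal sketch of the AHP step mentioned above: priority weights for quality attributes are derived from a pairwise comparison matrix via its principal eigenvector, with a consistency check. The three attributes and the judgement matrix are illustrative, not those of the AOSQ model.

```python
import numpy as np

# Pairwise comparisons (Saaty 1-9 scale) for three illustrative attributes:
# evolvability, reusability, modularity -- not the judgements used in the paper
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, i].real)
weights /= weights.sum()

# Consistency ratio (random index 0.58 for a 3x3 matrix)
ci = (eigvals.real[i] - len(A)) / (len(A) - 1)
cr = ci / 0.58
print("weights:", weights.round(3), "consistency ratio:", round(cr, 3))
```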

  6. Source apportionment of population representative samples of PM(2.5) in three European cities using structural equation modelling.

    Science.gov (United States)

    Ilacqua, Vito; Hänninen, Otto; Saarela, Kristina; Katsouyanni, Klea; Künzli, Nino; Jantunen, Matti

    2007-10-01

    Apportionment of urban particulate matter (PM) to sources is central for air quality management and efficient reduction of the substantial public health risks associated with fine particles (PM(2.5)). Traffic is an important source of combustion particles, but also a significant source of resuspended particles that chemically resemble Earth's crust and that are not affected by development of cleaner motor technologies. A substantial fraction of urban ambient PM originates from long-range transport outside the immediate urban environment, including secondary particles formed from gaseous emissions of mainly sulphur, nitrogen oxides and ammonia. Most source apportionment studies are based on a small number of fixed monitoring sites and capture well population exposures to regional and long-range transported particles. However, concentrations from local sources are very unevenly distributed and the results from such studies are therefore poorly representative of the actual exposures. The current study uses PM(2.5) data observed at population-based, randomly sampled residential locations in Athens, Basle and Helsinki with 17 elemental constituents, selected VOCs (xylenes, trimethylbenzenes, nonane and benzene) and light absorbance (black smoke). The major sources identified across the three cities included crustal, salt, long-range transported inorganic and traffic sources. Traffic was separately associated with source categories of crustal (especially Athens and Helsinki) and long-range transported chemical composition (all cities). Remarkably high fractions of the variability of elemental (R(2)>0.6 except for Ca in Basle 0.38) and chemical concentrations (R(2)>0.5 except benzene in Basle 0.22 and nonane in Athens 0.39) are explained by the source factors of an SEM model. The RAINS model that is currently used as the main tool in developing European air quality management policies seems to capture the local urban fraction (the city delta term) quite well, but underestimates

  7. Mathematical human body models representing a mid size male and a small female for frontal, lateral and rearward impact loading

    NARCIS (Netherlands)

    Happee, R.; Morsink, P.L.J.; Lange, R. de; Bours, R.; Ridella, S.; Nayef, A.; Hoof, J. van

    2000-01-01

    A human body model representing a mid size male has been presented at the 1998 STAPP conference. A combination of modeling techniques was applied using rigid bodies for most segments, but describing the thorax as a deformable structure. In this paper, this modeling strategy was employed to also deve

  8. Model validation of channel zapping quality

    OpenAIRE

    Kooij, R.; Nicolai, F.; Ahmed, K.; Brunnström, K.

    2009-01-01

    In an earlier paper we showed that the perceived quality of channel zapping is related to the perceived quality of download time in web browsing, as suggested by ITU-T Rec. G.1030. We showed this by performing subjective tests, resulting in an excellent fit with a 0.99 correlation. This was what we call a lean-forward experiment and gave the rule-of-thumb result that the zapping time must be less than 0.43 sec to be good (> 3.5 on the MOS scale). To validate the model we have done new subjective ...

  9. Future ozone air quality and radiative forcing over China owing to future changes in emissions under the Representative Concentration Pathways (RCPs)

    Science.gov (United States)

    Zhu, Jia; Liao, Hong

    2016-02-01

    We apply the nested grid version of the Goddard Earth Observing System (GEOS) chemical transport model (GEOS-Chem) to assess 2000-2050 changes in O3 air quality and associated radiative forcing in China owing to future changes in emissions under the Representative Concentration Pathways (RCP2.6, RCP4.5, RCP6.0, and RCP8.5). Changes in surface layer O3 concentrations, numbers of O3 exceedance days (days with maximum daily 8 h average (MDA8) O3 exceeding 74.7 ppbv), and tropospheric O3 radiative forcing (RF) are simulated for 2000-2050. Over China, RCP8.5 is the worst scenario for near future (2020-2030) and RCP6.0 is the worst scenario over 2040-2050; the maximum increases in annual mean surface layer O3 concentrations of 6-12 ppbv relative to present day (year 2000) are found over southern China in 2020 and 2030 under RCP8.5 and in 2040 and 2050 under RCP6.0. The numbers of MDA8 O3 exceedance days are simulated to be 10, 0, 0, and 2 days over Beijing-Tianjin-Tanggu (BTT), Yangtze River Delta (YRD), Pearl River Delta (PRD), and Sichuan Basin (SCB), respectively, in the present day (year 2000). No exceedance days are simulated in year 2050 for all the four regions under RCP2.6 and RCP4.5, but extremely high numbers of exceedance days are found in 2050 under RCP6.0 (with 102, 75, 57, and 179 days in BTT, YRD, PRD, and SCB, respectively) and in 2030 under RCP8.5 (with 94, 60, 34, and 162 days in BTT, YRD, PRD, and SCB, respectively). The tropospheric O3 RF in 2050 relative to 2000 averaged over eastern China (18°-45°N, 95°-125°E) is simulated to be -0.11, 0.0, 0.01, and 0.14 W m-2 under RCP2.6, RCP4.5, RCP6.0, and RCP8.5, respectively. When we consider both the health and climate impacts of tropospheric O3 over China in 2050, RCP2.6 is a significantly improving scenario for both air quality and climate, RCP4.5 is a significantly improving scenario for air quality but has small consequences for climate, RCP6.0 is a significantly worsening scenario for air quality

  10. Integrating microbial physiology and enzyme traits in the quality model

    Science.gov (United States)

    Sainte-Marie, Julien; Barrandon, Matthieu; Martin, Francis; Saint-André, Laurent; Derrien, Delphine

    2017-04-01

    Microbe activity plays an undisputable role in soil carbon storage and there have been many calls to integrate microbial ecology in soil carbon (C) models. With regard to this challenge, a few trait-based microbial models of C dynamics have emerged during the past decade. They parameterize specific traits related to decomposer physiology (substrate use efficiency, growth and mortality rates...) and enzyme properties (enzyme production rate, catalytic properties of enzymes…). But these models are built on the premise that organic matter (OM) can be represented as one single entity or are divided into a few pools, while organic matter exists as a continuum of many different compounds spanning from intact plant molecules to highly oxidised microbial metabolites. In addition, a given molecule may also exist in different forms, depending on its stage of polymerization or on its interactions with other organic compounds or mineral phases of the soil. Here we develop a general theoretical model relating the evolution of soil organic matter, as a continuum of progressively decomposing compounds, with decomposer activity and enzyme traits. The model is based on the notion of quality developed by Agren and Bosatta (1998), which is a measure of molecule accessibility to degradation. The model integrates three major processes: OM depolymerisation by enzyme action, OM assimilation and OM biotransformation. For any enzyme, the model reports the quality range where this enzyme selectively operates and how the initial quality distribution of the OM subset evolves into another distribution of qualities under the enzyme action. The model also defines the quality range where the OM can be uptaken and assimilated by microbes. It finally describes how the quality of the assimilated molecules is transformed into another quality distribution, corresponding to the decomposer metabolites signature. Upon decomposer death, these metabolites return to the substrate. We explore here the how

  11. College Teaching Quality Evaluation Based on System Dynamics Model

    Directory of Open Access Journals (Sweden)

    Zheng Kang-ning

    2016-01-01

    Full Text Available This paper analyzes the main factors that influence teaching quality and the cause-and-effect relationships between them, using system dynamics to establish an evaluation model of teaching quality in colleges and universities. Taking college A as an example, the model is applied to simulate teaching quality; the change process of teaching quality and the feedback mechanisms among the influencing factors are then presented, as well as how teaching quality changes under different policy parameters. With the purpose of improving teaching quality, some measures are put forward concerning staff quality, management quality, organization quality and environment quality.

  12. Air quality modeling's brave new world

    Energy Technology Data Exchange (ETDEWEB)

    Appleton, E.L.

    1996-05-01

    Since 1992, EPA has been creating a new generation of software - Models-3 - that is widely regarded as the next-generation air quality modeling system. The system has a modular framework that allows users to integrate a broad variety of air quality models. In the future, users will also be able to plug in economic decision support tools. A prototype version of Models-3 already exists in the Atmospheric Modeling Division of EPA's National Exposure Research Laboratory in Research Triangle Park. EDSS was developed as a rapid prototype of Models-3 under a three-year, $7.8 million cooperative agreement with EPA. An operational version of Models-3 may be in the hands of scientists and state air quality regulators by late 1997. Developers hope the new, more user-friendly system will make it easier to run models and present information to policy makers in graphical ways that are easy to understand. In addition, Models-3 will ultimately become a so-called 'comprehensive modeling system' that enables users to simulate pollutants in other media, such as water. EPA also plans to include models that simulate health effects and other pollution consequences. 6 refs.

  13. Evaluation of the Swat Model in a Small Watershed Representative of the Atlantic Forest Biome in Southern Brazil

    Science.gov (United States)

    Marcon, I. R.; Cauduro Dias de Paiva, E. M.; Dias de Paiva, J.; Beling, F. A.; Heatwole, C.

    2011-12-01

    This study presents the results of simulations with the SWAT (Soil and Water Assessment Tool) model in a small watershed in Southern Brazil (latitude 29°38'37.5" and longitude 53°48'2.2"), representative of the Atlantic Forest Biome. This area was monitored by two sequential stations, each with one rain gauge and one stage gauge, having contributing areas of 4.5 km2 and 12 km2 respectively. The altitudes in the basins range from 316 m to 431 m and vegetation is predominantly composed of native forest (55%) and native pasture (39%). The simulated period was from August 2007 to July 2011, corresponding to the period of monitoring. The temperature ranged from -2.2°C to 39.2°C, and annual rainfall ranged between 2005 mm and 2250 mm. For this application, a modification in the SWAT 2000 model algorithm was made, as proposed by Paiva and Paiva (2006), to adjust the rate of leaf area during the winter season of the region. The quality of the results was characterized by the Nash-Sutcliffe efficiency index (NSE) and by the coefficient of determination (R2). The model was evaluated at monthly and daily scales. At the monthly scale, the values obtained for NSE in the calibration phase were 0.73 and 0.81, respectively, for the two sections. The values obtained for R2 were 0.77 and 0.83 in the same sections. At the daily scale, in the calibration phase NSE values were -0.44 and -0.31, respectively, for the two sections, while for R2 the values were 0.27 and 0.38 in the same sections. These results show that the fit was good for monthly values, but for daily values a proper adjustment was not possible. Due to the short period of monitoring, the validation of the model results was made with the observed data from the first station, with an area of 4.5 km2. The values obtained for the NSE in the validation phase were 0.73 and -0.33 for the monthly and daily scales respectively, and for R2, 0.77 and 0.27 for the monthly and daily values, thus confirming the quality of the fit
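    For readers unfamiliar with the goodness-of-fit statistics quoted in this record, a minimal sketch of how the Nash-Sutcliffe efficiency (NSE) and the coefficient of determination (R2) can be computed from observed and simulated discharge series might look as follows; the numbers are placeholders, not SWAT output.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    """Squared Pearson correlation between observed and simulated series."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Placeholder monthly discharge values (m3/s), purely illustrative.
observed = [1.2, 0.8, 2.5, 3.1, 1.9, 0.7]
simulated = [1.0, 0.9, 2.2, 3.4, 1.6, 0.9]
print(round(nash_sutcliffe(observed, simulated), 2), round(r_squared(observed, simulated), 2))
```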

  14. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  15. Technical Note—Why Does the NBD Model Work? Robustness in Representing Product Purchases, Brand Purchases and Imperfectly Recorded Purchases

    OpenAIRE

    David C. Schmittlein; Albert C. Bemmaor; Donald G. Morrison

    1985-01-01

    One of the most managerially useful constructs that emerge from the stochastic modelling of brand choice is that of conditional expectations. In this paper the conditional expectations are derived for a generalization of the NBD model, called the beta binomial/negative binomial distribution (BB/NBD) model, first described by Jeuland, Bass and Wright. The model, developed to jointly represent the product class purchase and brand selection processes, is also particularly appropriate for analyzi...

  16. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. The ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  17. Development of a Future Representative Concentration Pathway for Use in the IPCC 5th Assessment Earth System Model Simulations

    Energy Technology Data Exchange (ETDEWEB)

    None

    2010-12-29

    The representative concentration pathway to be delivered is a scenario of atmospheric concentrations of greenhouse gases and other radiatively important atmospheric species, along with land-use changes, derived from the Global Change Assessment Model (GCAM). The particular representative concentration pathway (RCP) that the Joint Global Change Research Institute (JGCRI) has been responsible for is a not-to-exceed pathway that stabilizes at a radiative forcing of 4.5 W m-2 in the year 2100.

  18. A Sufficient Condition for a Wire-Frame Representing a Solid Modeling Uniquely

    Institute of Scientific and Technical Information of China (English)

    WANG Jiaye; CHEN Hui; WANG Wenping

    2001-01-01

    Generally speaking, it is impossible for a wire-frame to define a 3D object uniquely. But the wire-frame as a graphics medium is still applied in some industrial areas. A sufficient condition is presented in this paper. If this condition is satisfied by a wire-frame, then the wire-frame can represent a 3D object uniquely. The result is applied to the manufacturing of progressive strips.

  19. Latent variable indirect response modeling of categorical endpoints representing change from baseline.

    Science.gov (United States)

    Hu, Chuanpu; Xu, Zhenhua; Mendelsohn, Alan M; Zhou, Honghui

    2013-02-01

    Accurate exposure-response modeling is important in drug development. Methods are still evolving in the use of mechanistic, e.g., indirect response (IDR) models to relate discrete endpoints, mostly of the ordered categorical form, to placebo/co-medication effect and drug exposure. When the discrete endpoint is derived using change-from-baseline measurements, a mechanistic exposure-response modeling approach requires adjustment to maintain appropriate interpretation. This manuscript describes a new modeling method that integrates a latent-variable representation of IDR models with standard logistic regression. The new method also extends to general link functions that cover probit regression or continuous clinical endpoint modeling. Compared to an earlier latent variable approach that constrained the baseline probability of response to be 0, placebo effect parameters in the new model formulation are more readily interpretable and can be separately estimated from placebo data, thus allowing convenient and robust model estimation. A general inherent connection of some latent variable representations with baseline-normalized standard IDR models is derived. For describing clinical response endpoints, Type I and Type III IDR models are shown to be equivalent, therefore there are only three identifiable IDR models. This approach was applied to data from two phase III clinical trials of intravenously administered golimumab for the treatment of rheumatoid arthritis, where 20, 50, and 70% improvement in the American College of Rheumatology disease severity criteria were used as efficacy endpoints. Likelihood profiling and visual predictive checks showed reasonable parameter estimation precision and model performance.
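    The following is only a generic sketch of the idea described here, driving a binary clinical endpoint through a latent variable governed by a Type I indirect response model with a logistic link; the parameter values, concentration profile, and link coefficients are invented and do not reproduce the golimumab analysis.

```python
import numpy as np
from scipy.integrate import odeint
from scipy.special import expit

# Latent variable R(t) follows a Type I indirect response model,
# dR/dt = kin * (1 - Imax*C/(IC50 + C)) - kout * R,
# and a logistic link maps the change from baseline to a response probability.
def latent_idr(R, t, kin, kout, imax, ic50, conc):
    c = conc(t)
    return kin * (1.0 - imax * c / (ic50 + c)) - kout * R

conc = lambda t: 5.0 * np.exp(-0.1 * t)   # hypothetical drug concentration profile (mg/L)
t = np.linspace(0.0, 100.0, 200)
kin, kout = 1.0, 0.1                       # invented turnover parameters
baseline = kin / kout
R = odeint(latent_idr, baseline, t, args=(kin, kout, 0.9, 2.0, conc)).ravel()

alpha, beta = -2.0, 0.5                    # invented intercept and slope of the logistic link
p_response = expit(alpha + beta * (baseline - R))  # probability of a binary response over time
print(p_response[[0, 50, 199]].round(3))
```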

  20. Bianchi VI cosmological models representing perfect fluid and radiation with electric-type free gravitational fields

    Science.gov (United States)

    Roy, S. R.; Banerjee, S. K.

    1992-11-01

    A homogeneous Bianchi type VIh cosmological model filled with perfect fluid, null electromagnetic field and streaming neutrinos is obtained for which the free gravitational field is of the electric type. The barotropic equation of state p = (γ - 1)ε is imposed in the particular case of Bianchi VI0 string models. Various physical and kinematical properties of the models are discussed.

  1. Selecting representative climate models for climate change impact studies : An advanced envelope-based selection approach

    NARCIS (Netherlands)

    Lutz, Arthur F.; ter Maat, Herbert W.; Biemans, Hester; Shrestha, Arun B.; Wester, Philippus; Immerzeel, Walter W.

    2016-01-01

    Climate change impact studies depend on projections of future climate provided by climate models. The number of climate models is large and increasing, yet limitations in computational capacity make it necessary to compromise the number of climate models that can be included in a climate change

  2. Selecting representative climate models for climate change impact studies: an advanced envelope-based selection approach

    NARCIS (Netherlands)

    Lutz, Arthur F.; Maat, ter Herbert W.; Biemans, Hester; Shrestha, Arun B.; Wester, Philippus; Immerzeel, Walter W.

    2016-01-01

    Climate change impact studies depend on projections of future climate provided by climate models. The number of climate models is large and increasing, yet limitations in computational capacity make it necessary to compromise the number of climate models that can be included in a climate change

  3. Representing Micro–Macro Linkages by Actor-based Dynamic Network Models

    NARCIS (Netherlands)

    Snijders, Thomas; Steglich, Christian

    2015-01-01

    Stochastic actor-based models for network dynamics have the primary aim of statistical inference about processes of network change, but may be regarded as a kind of agent-based models. Similar to many other agent-based models, they are based on local rules for actor behavior. Different from many oth

  6. Identifying sustainable foods: the relationship between environmental impact, nutritional quality, and prices of foods representative of the French diet.

    Science.gov (United States)

    Masset, Gabriel; Soler, Louis-Georges; Vieux, Florent; Darmon, Nicole

    2014-06-01

    Sustainable diets, as defined by the Food and Agriculture Organization, need to combine environment, nutrition, and affordability dimensions. However, it is unknown whether these dimensions are compatible, and no guidance is available in the official recommendations. The objective was to identify foods with compatible sustainability dimensions. For 363 of the most commonly consumed foods in the Second French Individual and National Study on Food Consumption, environmental impact indicators (ie, greenhouse gas [GHG] emissions, acidification, and eutrophication) and prices were collected. The nutritional quality of the foods was assessed by calculating the ratio of the score for the nutritional adequacy of individual foods (SAIN) to the score for disqualifying nutrients (LIM). A sustainability score based on the median GHG emissions, price, and SAIN:LIM was calculated for each food; the foods with the best values for all three variables received the highest score. The environmental indicators were strongly and positively correlated. Meat, fish, and eggs and dairy products had the strongest influence on the environment; starchy foods, legumes, and fruits and vegetables had the least influence. GHG emissions were inversely correlated with SAIN:LIM (r=-0.37) and positively correlated with price per kilogram (r=0.59); the correlation with price per kilocalorie was null. This showed that foods with a heavy environmental impact tend to have lower nutritional quality and a higher price per kilogram but not a lower price per kilocalorie. Using price per kilogram, 94 foods had a maximum sustainability score, including most plant-based foods and excluding all foods with animal ingredients except milk, yogurt, and soups. Using price per kilocalorie restricted the list to 42 foods, including 52% of all starchy foods and legumes but only 11% of fruits and vegetables (mainly 100% fruit juices). Overall, the sustainability dimensions seemed to be compatible when considering price per kilogram of food. However

  7. A general method to select representative models for decision making and optimization under uncertainty

    Science.gov (United States)

    Shirangi, Mehrdad G.; Durlofsky, Louis J.

    2016-11-01

    The optimization of subsurface flow processes under geological uncertainty technically requires flow simulation to be performed over a large set of geological realizations for each function evaluation at every iteration of the optimizer. Because flow simulation over many permeability realizations (only permeability is considered to be uncertain in this study) may entail excessive computation, simulations are often performed for only a subset of 'representative' realizations. It is however challenging to identify a representative subset that provides flow statistics in close agreement with those from the full set, especially when the decision parameters (e.g., time-varying well pressures, well locations) are unknown a priori, as they are in optimization problems. In this work, we introduce a general framework, based on clustering, for selecting a representative subset of realizations for use in simulations involving 'new' sets of decision parameters. Prior to clustering, each realization is represented by a low-dimensional feature vector that contains a combination of permeability-based and flow-based quantities. Calculation of flow-based features requires the specification of a (base) flow problem and simulation over the full set of realizations. Permeability information is captured concisely through use of principal component analysis. By computing the difference between the flow response for the subset and the full set, we quantify the performance of various realization-selection methods. The impact of different weightings for flow and permeability information in the cluster-based selection procedure is assessed for a range of examples involving different types of decision parameters. These decision parameters are generated either randomly, in a manner that is consistent with the solutions proposed in global stochastic optimization procedures such as GA and PSO, or through perturbation around a base case, consistent with the solutions considered in pattern search
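    A hypothetical, simplified version of such a cluster-based selection could look like the following; it uses plain k-means on PCA-compressed permeability features combined with a base-case flow response, whereas the paper's actual feature construction, weighting, and clustering choices may differ.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_real, n_cells = 200, 500
log_perm = rng.normal(size=(n_real, n_cells))   # placeholder log-permeability fields
flow_resp = rng.normal(size=(n_real, 1))        # placeholder base-case flow response per realization

# Feature vector per realization: flow response plus PCA-compressed permeability, weighted by w.
perm_feat = PCA(n_components=5).fit_transform(log_perm)
w = 0.5
features = np.hstack([w * flow_resp, (1.0 - w) * perm_feat])

# Cluster and keep the realization closest to each cluster centroid as the representative subset.
k = 10
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(features)
subset = [int(np.argmin(np.linalg.norm(features - c, axis=1))) for c in km.cluster_centers_]
print("representative realizations:", sorted(subset))
```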

  8. FIRESTORM: Modelling the water quality risk of wildfire.

    Science.gov (United States)

    Mason, C. I.; Sheridan, G. J.; Smith, H. G.; Jones, O.; Chong, D.; Tolhurst, K.

    2012-04-01

    Following wildfire, loss of vegetation and changes to soil properties may result in decreases in infiltration rates, less rainfall interception, and higher overland flow velocities. Rainfall events affecting burn areas before vegetation recovers can cause high magnitude erosion events that impact on downstream water quality. For cities and towns that rely upon fire-prone forest catchments for water supply, wildfire impacts on water quality represent a credible risk to water supply security. Quantifying the risk associated with the occurrence of wildfires and the magnitude of water quality impacts has important implications for managing water supplies. At present, no suitable integrative model exists that considers the probabilistic nature of system inputs as well as the range of processes and scales involved in this problem. We present FIRESTORM, a new model currently in development that aims to determine the range of sediment and associated contaminant loads that may be delivered to water supply reservoirs from the combination of wildfire and subsequent rainfall events. This Monte Carlo model incorporates the probabilistic nature of fire ignition, fire weather and rainfall, and includes deterministic models for fire behaviour and locally dominant erosion processes. FIRESTORM calculates the magnitude and associated annual risk of catchment-scale sediment loads associated with the occurrence of wildfire and rainfall generated by two rain event types. The two event types are localised, high intensity, short-duration convective storms, and widespread, longer duration synoptic-scale rainfall events. Initial application and testing of the model will focus on the two main reservoirs supplying water to Melbourne, Australia, both of which are situated in forest catchments vulnerable to wildfire. Probabilistic fire ignition and weather scenarios have been combined using 40 years of fire records and weather observations. These are used to select from a dataset of over 80

  9. Regional air quality modeling: North American and European perspectives

    NARCIS (Netherlands)

    Steyn, D.; Builtjes, P.; Schaap, M.; Yarwood, G.

    2013-01-01

    An overview of regional-scale air quality modeling practices and perspectives in North America and Europe, highlighting the differences and commonalities in how regional-scale air quality modeling systems are being used and evaluated across both continents.

  10. Involvement of patients or their representatives in quality management functions in EU hospitals: implementation and impact on patient-centred care strategies

    Science.gov (United States)

    Groene, Oliver; Sunol, Rosa; Klazinga, Niek S.; Wang, Aolin; Dersarkissian, Maral; Thompson, Caroline A.; Thompson, Andrew; Arah, Onyebuchi A.; Klazinga, N; Kringos, DS; Lombarts, MJMH; Plochg, T; Lopez, MA; Secanell, M; Sunol, R; Vallejo, P; Bartels, P; Kristensen, S; Michel, P; Saillour-Glenisson, F; Vlcek, F; Car, M; Jones, S; Klaus, E; Bottaro, S; Garel, P; Saluvan, M; Bruneau, C; Depaigne-Loth, A; Shaw, C; Hammer, A; Ommen, O; Pfaff, H; Groene, O; Botje, D; Wagner, C; Kutaj-Wasikowska, H; Kutryba, B; Escoval, A; Lívio, A; Eiras, M; Franca, M; Leite, I; Almeman, F; Kus, H; Ozturk, K; Mannion, R; Arah, OA; DerSarkissian, M; Thompson, CA; Wang, A; Thompson, A

    2014-01-01

    Objective The objective of this study was to describe the involvement of patients or their representatives in quality management (QM) functions and to assess associations between levels of involvement and the implementation of patient-centred care strategies. Design A cross-sectional, multilevel study design that surveyed quality managers and department heads and data from an organizational audit. Setting Randomly selected hospitals (n = 74) from seven European countries (The Czech Republic, France, Germany, Poland, Portugal, Spain and Turkey). Participants Hospital quality managers (n = 74) and heads of clinical departments (n = 262) in charge of four patient pathways (acute myocardial infarction, stroke, hip fracture and deliveries) participated in the data collection between May 2011 and February 2012. Main Outcome Measures Four items reflecting essential patient-centred care strategies based on an on-site hospital visit: (1) formal survey seeking views of patients and carers, (2) written policies on patients' rights, (3) patient information literature including guidelines and (4) fact sheets for post-discharge care. The main predictors were patient involvement in QM at the (i) hospital level and (ii) pathway level. Results Current levels of involving patients and their representatives in QM functions in European hospitals are low at hospital level (mean score 1.6 on a scale of 0 to 5, SD 0.7), but even lower at departmental level (mean 0.6, SD 0.7). We did not detect associations between levels of involving patients and their representatives in QM functions and the implementation of patient-centred care strategies; however, the smallest hospitals were more likely to have implemented patient-centred care strategies. Conclusions There is insufficient evidence that involving patients and their representatives in QM leads to establishing or implementing strategies and procedures that facilitate patient-centred care; however, lack of evidence should not be

  11. Involvement of patients or their representatives in quality management functions in EU hospitals: implementation and impact on patient-centred care strategies.

    Science.gov (United States)

    Groene, Oliver; Sunol, Rosa; Klazinga, Niek S; Wang, Aolin; Dersarkissian, Maral; Thompson, Caroline A; Thompson, Andrew; Arah, Onyebuchi A

    2014-04-01

    The objective of this study was to describe the involvement of patients or their representatives in quality management (QM) functions and to assess associations between levels of involvement and the implementation of patient-centred care strategies. A cross-sectional, multilevel study design that surveyed quality managers and department heads and data from an organizational audit. Randomly selected hospitals (n = 74) from seven European countries (The Czech Republic, France, Germany, Poland, Portugal, Spain and Turkey). Hospital quality managers (n = 74) and heads of clinical departments (n = 262) in charge of four patient pathways (acute myocardial infarction, stroke, hip fracture and deliveries) participated in the data collection between May 2011 and February 2012. Four items reflecting essential patient-centred care strategies based on an on-site hospital visit: (1) formal survey seeking views of patients and carers, (2) written policies on patients' rights, (3) patient information literature including guidelines and (4) fact sheets for post-discharge care. The main predictors were patient involvement in QM at the (i) hospital level and (ii) pathway level. Current levels of involving patients and their representatives in QM functions in European hospitals are low at hospital level (mean score 1.6 on a scale of 0 to 5, SD 0.7), but even lower at departmental level (mean 0.6, SD 0.7). We did not detect associations between levels of involving patients and their representatives in QM functions and the implementation of patient-centred care strategies; however, the smallest hospitals were more likely to have implemented patient-centred care strategies. There is insufficient evidence that involving patients and their representatives in QM leads to establishing or implementing strategies and procedures that facilitate patient-centred care; however, lack of evidence should not be interpreted as evidence of no effect.

  12. Predictive modeling using a nationally representative database to identify patients at risk of developing microalbuminuria.

    Science.gov (United States)

    Villa-Zapata, Lorenzo; Warholak, Terri; Slack, Marion; Malone, Daniel; Murcko, Anita; Runger, George; Levengood, Michael

    2016-02-01

    Predictive models allow clinicians to identify higher- and lower-risk patients and make targeted treatment decisions. Microalbuminuria (MA) is a condition whose presence is understood to be an early marker for cardiovascular disease. The aims of this study were to develop a patient data-driven predictive model and a risk-score assessment to improve the identification of MA. The 2007-2008 National Health and Nutrition Examination Survey (NHANES) was utilized to create a predictive model. The dataset was split into thirds; one-third was used to develop the model, while the other two-thirds were utilized for internal validation. The 2012-2013 NHANES was used as an external validation database. Multivariate logistic regression was performed to create the model. Performance was evaluated using three criteria: (1) receiver operating characteristic curves; (2) pseudo-R(2) values; and (3) goodness of fit (Hosmer-Lemeshow). The model was then used to develop a risk-score chart. A model was developed using variables for which there was a significant relationship. Variables included were systolic blood pressure, fasting glucose, C-reactive protein, blood urea nitrogen, and alcohol consumption. The model performed well, and no significant differences were observed when utilized in the validation datasets. The predictive model provides new evidence about variables related to MA and may be used by clinicians to identify at-risk patients and to tailor treatment. The risk score developed may allow clinicians to measure a patient's MA risk.
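    As a generic illustration of how logistic-regression coefficients can be turned into an integer risk score of the kind described, the sketch below uses invented coefficients and variable names; it is not the NHANES-derived model.

```python
import numpy as np

# Invented coefficients for illustration only (these are not the NHANES estimates).
coefs = {"sbp_per_10mmHg": 0.20, "glucose_per_10mg_dl": 0.15,
         "crp_elevated": 0.35, "bun_per_5mg_dl": 0.10, "heavy_alcohol_use": 0.25}
intercept = -4.0

# Framingham-style points: each coefficient scaled by the smallest one and rounded.
base = min(coefs.values())
points = {name: round(value / base) for name, value in coefs.items()}

def predicted_risk(profile):
    """profile: covariate values expressed in the same units as the coefficients."""
    logit = intercept + sum(coefs[name] * profile[name] for name in coefs)
    return 1.0 / (1.0 + np.exp(-logit))

patient = {"sbp_per_10mmHg": 2, "glucose_per_10mg_dl": 1,
           "crp_elevated": 1, "bun_per_5mg_dl": 1, "heavy_alcohol_use": 0}
print(points, round(predicted_risk(patient), 3))
```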

  13. RHydro - Hydrological models and tools to represent and analyze hydrological data in R

    Science.gov (United States)

    Reusser, D. E.; Buytaert, W.; Vitolo, C.

    2012-04-01

    In hydrology, basic equations and procedures keep being implemented from scratch by scientists, with the potential for errors and inefficiency. The use of libraries can overcome these problems. As an example, hydrological libraries could contain: 1. Major representations of hydrological processes such as infiltration, sub-surface runoff and routing algorithms; 2. Scaling functions, for instance to combine remote sensing precipitation fields with rain gauge data; 3. Data consistency checks; 4. Performance measures. Here we present a beginning for such a library implemented in the high-level data programming language R. Currently, TOPMODEL, the abc model, HBV, a multi-model ensemble called FUSE, and data import routines for WaSiM-ETH, as well as basic visualization and evaluation tools, are implemented. Care is taken to make functions and models compatible with other existing frameworks in hydrology, such as, for example, Hydromad.

  14. Data assimilation for air quality models

    DEFF Research Database (Denmark)

    Silver, Jeremy David

    2014-01-01

    The chemical composition of the Earth’s atmosphere has major ramifications for not only human health, but also biodiversity and the climate; hence there are scientific, environmental and societal interests in accurate estimates of atmospheric chemical composition and in understanding the governing......-transport models (CTMs). Each of these methods has their limitations: direct measurements provide only data at point locations and may not be representative of a wider area, remotely-sensed data from polar-orbiting satellites cannot investigate diurnal variation, and CTM simulations are often associated...... with higher uncertainties. It is possible, however, to combine information from measurements and models to more accurately estimate the state of the atmosphere using a statistically consistent framework known as “data assimilation”. In this study, three data assimilation schemes are implemented and evaluated...

  15. Polar ozone depletion and trends as represented by the Whole Atmospheric Community Climate Model (WACCM)

    Science.gov (United States)

    Kinnison, Douglas; Solomon, Susan; Ivy, Diane; Mills, Michael; Neely, Ryan, III; Schmidt, Anja; Garcia, Rolando; Smith, Anne

    2016-04-01

    The Whole Atmosphere Community Climate Model, Version 4 (WACCM4) is a comprehensive numerical model, spanning the range of altitude from the Earth's surface to the lower thermosphere [Garcia et al., JGR, 2007; Kinnison et al., JGR, 2007; Marsh et al., J. of Climate, 2013]. WACCM4 is based on the framework of the NCAR Community Atmosphere Model, version 4 (CAM4), and includes all of the physical parameterizations of CAM4 and a finite volume dynamical core for the tracer advection. This version has a detailed representation of tropospheric and middle atmosphere chemical and physical processes. Simulations completed for the SPARC Chemistry Climate Model Initiative (CCMI), REFC1, REFC2, SENSC2, and REFC1SD scenarios are examined (see Eyring et al., SPARC Newsletter, 2013). Recent improvements in model representation of orographic gravity wave processes strongly impact temperature and therefore polar ozone depletion as well as its subsequent recovery. Model representation of volcanic events will also be shown to be important for ozone loss. Evaluation of polar ozone depletion processes (e.g., dehydration, denitrification, chemical activation) with key observations will be performed and the impact on future ozone recovery will be identified.

  16. Representing general theoretical concepts in structural equation models: The role of composite variables

    Science.gov (United States)

    Grace, J.B.; Bollen, K.A.

    2008-01-01

    Structural equation modeling (SEM) holds the promise of providing natural scientists the capacity to evaluate complex multivariate hypotheses about ecological systems. Building on its predecessors, path analysis and factor analysis, SEM allows for the incorporation of both observed and unobserved (latent) variables into theoretically-based probabilistic models. In this paper we discuss the interface between theory and data in SEM and the use of an additional variable type, the composite. In simple terms, composite variables specify the influences of collections of other variables and can be helpful in modeling heterogeneous concepts of the sort commonly of interest to ecologists. While long recognized as a potentially important element of SEM, composite variables have received very limited use, in part because of a lack of theoretical consideration, but also because of difficulties that arise in parameter estimation when using conventional solution procedures. In this paper we present a framework for discussing composites and demonstrate how the use of partially-reduced-form models can help to overcome some of the parameter estimation and evaluation problems associated with models containing composites. Diagnostic procedures for evaluating the most appropriate and effective use of composites are illustrated with an example from the ecological literature. It is argued that an ability to incorporate composite variables into structural equation models may be particularly valuable in the study of natural systems, where concepts are frequently multifaceted and the influence of suites of variables is often of interest. © Springer Science+Business Media, LLC 2007.

  17. Advanced microscale bioreactor system: a representative scale-down model for bench-top bioreactors.

    Science.gov (United States)

    Hsu, Wei-Ting; Aulakh, Rigzen P S; Traul, Donald L; Yuk, Inn H

    2012-12-01

    In recent years, several automated scale-down bioreactor systems have been developed to increase efficiency in cell culture process development. ambr™ is an automated workstation that provides individual monitoring and control of culture dissolved oxygen and pH in single-use, stirred-tank bioreactors at a working volume of 10-15 mL. To evaluate the ambr™ system, we compared the performance of four recombinant Chinese hamster ovary cell lines in a fed-batch process in parallel ambr™, 2-L bench-top bioreactors, and shake flasks. Cultures in ambr™ matched 2-L bioreactors in controlling the environment (temperature, dissolved oxygen, and pH) and in culture performance (growth, viability, glucose, lactate, Na(+), osmolality, titer, and product quality). However, cultures in shake flasks did not show comparable performance to the ambr™ and 2-L bioreactors.

  18. Representing life in the Earth system with soil microbial functional traits in the MIMICS model

    Directory of Open Access Journals (Sweden)

    W. R. Wieder

    2015-02-01

    Full Text Available Projecting biogeochemical responses to global environmental change requires multi-scaled perspectives that consider organismal diversity, ecosystem processes and global fluxes. However, microbes, the drivers of soil organic matter decomposition and stabilization, remain notably absent from models used to project carbon cycle–climate feedbacks. We used a microbial trait-based soil carbon (C) model, with two physiologically distinct microbial communities, to improve current estimates of soil C storage and their likely response to perturbations. Drawing from the application of functional traits used to model other ecosystems, we incorporate copiotrophic and oligotrophic microbial functional groups in the MIcrobial-MIneral Carbon Stabilization (MIMICS) model; these functional groups are akin to "gleaner" vs. "opportunist" plankton in the ocean, or r vs. K strategists in plant and animal communities. Here we compare MIMICS to a conventional soil C model, DAYCENT, in cross-site comparisons of nitrogen (N) enrichment effects on soil C dynamics. MIMICS more accurately simulates C responses to N enrichment; moreover, it raises important hypotheses involving the roles of substrate availability, community-level enzyme induction, and microbial physiological responses in explaining various soil biogeochemical responses to N enrichment. In global-scale analyses, we show that current projections from Earth system models likely overestimate the strength of the land C sink in response to increasing C inputs with elevated carbon dioxide (CO2). Our findings illustrate that tradeoffs between theory and utility can be overcome to develop soil biogeochemistry models that evaluate and advance our theoretical understanding of microbial dynamics and soil biogeochemical responses to environmental change.

  19. Measuring Quality Satisfaction with Servqual Model

    Directory of Open Access Journals (Sweden)

    Dan Păuna

    2012-05-01

    Full Text Available The orientation to customer satisfaction is not a recent phenomenon; many very successful businesspeople from the beginning of the 20th century, such as Sir Henry Royce, a name synonymous with Rolls-Royce vehicles, stated the first principle regarding customer satisfaction: "Our interest in the Rolls-Royce cars does not end at the moment when the owner pays for and takes delivery of the car. Our interest in the car never wanes. Our ambition is that every purchaser of the Rolls-Royce car shall continue to be more than satisfied" (Rolls-Royce). The following paper deals with the important qualities of the concept for measuring the gap between expected customer service satisfaction and perceived service, as a routine customer feedback process, by means of a relatively new model, the Servqual model.

  20. Dynamic evaluation of air quality models over European regions

    NARCIS (Netherlands)

    Thunis, P.; Pisoni, E.; Degraeuwe, B.; Kranenburg, R.; Schaap, M.; Clappier, A.

    2015-01-01

    Chemistry-transport models are increasingly used in Europe for estimating air quality or forecasting changes in pollution levels. With this increased use of modeling arises the need to harmonize the methodologies used to determine the quality of air quality model applications. This is complex for p

  1. Evaluation of CASP8 model quality predictions

    KAUST Repository

    Cozzetto, Domenico

    2009-01-01

    The model quality assessment problem consists in the a priori estimation of the overall and per-residue accuracy of protein structure predictions. Over the past years, a number of methods have been developed to address this issue and CASP established a prediction category to evaluate their performance in 2006. In 2008 the experiment was repeated and its results are reported here. Participants were invited to infer the correctness of the protein models submitted by the registered automatic servers. Estimates could apply to both whole models and individual amino acids. Groups involved in the tertiary structure prediction categories were also asked to assign local error estimates to each predicted residue in their own models and their results are also discussed here. The correlation between the predicted and observed correctness measures was the basis of the assessment of the results. We observe that consensus-based methods still perform significantly better than those accepting single models, similarly to what was concluded in the previous edition of the experiment. © 2009 WILEY-LISS, INC.

  2. Modelling bacterial water quality in streams draining pastoral land.

    Science.gov (United States)

    Collins, Rob; Rutherford, Kit

    2004-02-01

    A model has been developed to predict concentrations of the faecal bacteria indicator E. coli in streams draining grazed hill-country in New Zealand. The long-term aim of the modelling is to assess effects of land management upon faecal contamination and, in the short term, to provide a framework for field-based research. A daily record of grazing livestock is used to estimate E. coli inputs to a catchment, and transport of bacteria to the stream network is simulated within surface and subsurface flows. Deposition of E. coli directly to streams is incorporated where cattle have access to them, and areas of permanent saturation ('seepage zones') are also represented. Bacteria are routed down the stream network and in-stream processes of deposition and entrainment are simulated. Die-off, both on land and in water, is simulated as a function of temperature and solar radiation. The model broadly reproduces observed E. coli concentrations in a hill-country catchment grazed by sheep and beef cattle, although uncertainty exists with a number of the processes represented. The model is sensitive to the distance over which surface runoff delivers bacteria to a stream and the amount of excretion direct to streams and onto seepage zones. Scenario analysis suggests that riparian buffer strips may improve bacterial water quality both by eliminating livestock defaecation in and near streams, and by trapping of bacteria by the riparian vegetation.
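    A minimal sketch of the kind of die-off formulation described, with temperature-corrected dark decay plus a solar radiation term, might look like this; the rate coefficients are placeholders rather than the calibrated values used in the model.

```python
import numpy as np

def die_off_rate(temp_c, solar_w_m2, k20=0.5, theta=1.07, alpha=0.001):
    """First-order daily die-off rate: temperature-corrected dark decay plus a sunlight term."""
    return k20 * theta ** (temp_c - 20.0) + alpha * solar_w_m2

def surviving(n0, days, temp_c, solar_w_m2):
    """E. coli numbers remaining after exponential die-off over the given number of days."""
    return n0 * np.exp(-die_off_rate(temp_c, solar_w_m2) * days)

print(f"{surviving(1e6, days=3, temp_c=15.0, solar_w_m2=200.0):.3e}")
```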

  3. Total Variation Based Perceptual Image Quality Assessment Modeling

    Directory of Open Access Journals (Sweden)

    Yadong Wu

    2014-01-01

    Full Text Available Visual quality measurement is one of the fundamental and important issues in numerous applications of image and video processing. In this paper, based on the assumption that the human visual system is sensitive to image structures (edges) and image local luminance (light stimulation), we propose a new perceptual image quality assessment (PIQA) measure based on the total variation (TV) model (TVPIQA) in the spatial domain. The proposed measure compares TVs between a distorted image and its reference image to represent the loss of image structural information. Because of the good performance of the TV model in describing edges, the proposed TVPIQA measure can represent image structure information very well. In addition, the energy of enclosed regions in a difference image between the reference image and its distorted image is used to measure the missing luminance information, which is sensitive to the human visual system. Finally, we validate the performance of the TVPIQA measure with the Cornell-A57, IVC, TID2008, and CSIQ databases and show that the TVPIQA measure outperforms recent state-of-the-art image quality assessment measures.
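    A bare-bones illustration of the total-variation comparison underlying such a measure (not the authors' TVPIQA implementation, which also includes the luminance-energy term and pooling) could be:

```python
import numpy as np

def total_variation(img):
    """Anisotropic total variation: sum of absolute horizontal and vertical differences."""
    img = np.asarray(img, float)
    return np.abs(np.diff(img, axis=1)).sum() + np.abs(np.diff(img, axis=0)).sum()

# Placeholder images; the structural term compares the TV of the reference and distorted images.
rng = np.random.default_rng(1)
reference = rng.random((64, 64))
distorted = reference + 0.05 * rng.standard_normal((64, 64))
structural_loss = abs(total_variation(reference) - total_variation(distorted))
print(round(structural_loss, 2))
```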

  4. 8760-Based Method for Representing Variable Generation Capacity Value in Capacity Expansion Models

    Energy Technology Data Exchange (ETDEWEB)

    Frew, Bethany A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-03

    Capacity expansion models (CEMs) are widely used to evaluate the least-cost portfolio of electricity generators, transmission, and storage needed to reliably serve load over many years or decades. CEMs can be computationally complex and are often forced to estimate key parameters using simplified methods to achieve acceptable solve times or for other reasons. In this paper, we discuss one of these parameters -- capacity value (CV). We first provide a high-level motivation for and overview of CV. We next describe existing modeling simplifications and an alternate approach for estimating CV that utilizes hourly '8760' data of load and VG resources. We then apply this 8760 method to an established CEM, the National Renewable Energy Laboratory's (NREL's) Regional Energy Deployment System (ReEDS) model (Eurek et al. 2016). While this alternative approach for CV is not itself novel, it contributes to the broader CEM community by (1) demonstrating how a simplified 8760 hourly method, which can be easily implemented in other power sector models when data is available, more accurately captures CV trends than a statistical method within the ReEDS CEM, and (2) providing a flexible modeling framework from which other 8760-based system elements (e.g., demand response, storage, and transmission) can be added to further capture important dynamic interactions, such as curtailment.
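    One simple variant of such an 8760-based capacity value estimate, the mean variable-generation output during the highest-load hours of an hourly year, can be sketched as follows; the data are synthetic and the exact definition used in ReEDS may differ (for instance, net-load-based top hours).

```python
import numpy as np

def capacity_value_top_hours(load, vg_gen, vg_capacity, n_hours=100):
    """Mean VG output per unit of installed capacity during the n_hours highest-load hours."""
    load, vg_gen = np.asarray(load, float), np.asarray(vg_gen, float)
    top = np.argsort(load)[-n_hours:]        # indices of the highest-load hours of the year
    return vg_gen[top].mean() / vg_capacity

# Synthetic hourly data for one year (8760 values), purely illustrative.
rng = np.random.default_rng(2)
load = 50.0 + 20.0 * rng.random(8760)        # system load, GW
wind = 10.0 * rng.random(8760)               # output of 10 GW of wind, GWh per hour
print(round(capacity_value_top_hours(load, wind, vg_capacity=10.0), 3))
```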

  5. Conclusions on motor control depend on the type of model used to represent the periphery.

    Science.gov (United States)

    Pinter, Ilona J; van Soest, Arthur J; Bobbert, Maarten F; Smeets, Jeroen B J

    2012-10-01

    Within the field of motor control, there is no consensus on which kinematic and kinetic aspects of movements are planned or controlled. Perturbing goal-directed movements is a frequently used tool to answer this question. To be able to draw conclusions about motor control from kinematic responses to perturbations, a model of the periphery (i.e., the skeleton, muscle-tendon complexes, and spinal reflex circuitry) is required. The purpose of the present study was to determine to what extent such conclusions depend on the level of simplification with which the dynamical properties of the periphery are modeled. For this purpose, we simulated fast goal-directed single-joint movement with four existing types of models. We tested how three types of perturbations affected movement trajectory if motor commands remained unchanged. We found that the four types of models of the periphery showed different robustness to the perturbations, leading to different predictions on how accurate motor commands need to be, i.e., how accurate the knowledge of external conditions needs to be. This means that when interpreting kinematic responses obtained in perturbation experiments the level of error correction attributed to adaptation of motor commands depends on the type of model used to describe the periphery.

  6. Modeling Fluid’s Dynamics with Master Equations in Ultrametric Spaces Representing the Treelike Structure of Capillary Networks

    OpenAIRE

    Andrei Khrennikov; Klaudia Oleschko; María de Jesús Correa López

    2016-01-01

    We present a new conceptual approach for modeling of fluid flows in random porous media based on explicit exploration of the treelike geometry of complex capillary networks. Such patterns can be represented mathematically as ultrametric spaces and the dynamics of fluids by ultrametric diffusion. The images of p-adic fields, extracted from the real multiscale rock samples and from some reference images, are depicted. In this model the porous background is treated as the environment contributin...

  7. Air quality modeling in Warsaw Metropolitan Area

    Directory of Open Access Journals (Sweden)

    Piotr Holnicki

    2013-04-01

    Full Text Available Decision support for air quality management needs to connect several categories of input data with the analytical process of air pollution dispersion. The aim of the respective air pollution model is to provide a quantitative assessment of the environmental impact of emission sources in the form of spatial/temporal maps of pollutant concentration or deposition in the domain. These results are in turn used in assessing environmental risk and supporting respective planning actions. However, due to the complexity of the forecasting system and the required input data, such environmental prognoses and related decisions contain many potential sources of imprecision and uncertainty. The meteorological and emission input data are commonly considered the main sources of uncertainty. This paper addresses the problem of emission uncertainty and the impact of this uncertainty on the forecasted air pollution concentrations and adverse health effects. The computational experiment implemented for the Warsaw Metropolitan Area, Poland, encompasses a one-year forecast with the year 2005 meteorological dataset. The annual mean concentrations of the main urban pollutants are computed. The impact of uncertainty in the emission inventory is also considered. Uncertainty assessment is based on the Monte Carlo technique, where the regional-scale CALPUFF model is the main forecasting tool used in the air quality analysis.

  8. MODELLING PURCHASING PROCESSES FROM QUALITY ASPECTS

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2008-12-01

    Full Text Available Management has a fundamental task to identify and direct the primary and specific processes within the purchasing function, applying an up-to-date information infrastructure. ISO 9001:2000 defines a process as a set of interrelated or interacting activities transforming inputs into outputs, and the "process approach" as the systematic identification and management of the processes employed within the organization and, particularly, of the relationships among those processes. To direct a quality management system using the process approach, the organization has to determine the map of its general (basic) processes. Primary processes are determined on the grounds of their interrelationships and their impact on satisfying customers' needs. To make a proper choice of general business processes, it is necessary to determine the entire business flow, beginning with the customer demand and ending with the delivery of the products or services provided. In the next step the process model is converted into a data model, which is essential for the implementation of an information system enabling automation, monitoring, measuring, inspection, analysis and improvement of key purchasing processes. This paper presents a methodology and some results of an investigation into the development of an information system for the purchasing process from the quality perspective.

  9. A Hidden Markov Model Representing the Spatial and Temporal Correlation of Multiple Wind Farms

    DEFF Research Database (Denmark)

    Fang, Jiakun; Su, Chi; Hu, Weihao

    2015-01-01

    Accommodating increasing amounts of wind energy, with its stochastic nature, is becoming a major issue for power system reliability. This paper proposes a methodology to characterize the spatiotemporal correlation of multiple wind farms. First, a hierarchical clustering method based on self-organizing maps...... is adopted to categorize the similar output patterns of several wind farms into joint states. Then a hidden Markov model (HMM) is designed to describe the temporal correlations among these joint states. Unlike the conventional Markov chain model, the accumulated wind power is taken into consideration....... The proposed statistical modeling framework is compatible with sequential power system reliability analysis. A case study on optimal sizing and location of fast-response regulation sources is presented....
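    A toy version of the state-clustering and transition-probability estimation described above might look like the following; it substitutes plain k-means and an observed-state Markov chain for the paper's self-organizing maps and hidden Markov model, purely to show the idea on synthetic data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
power = rng.random((5000, 3))                # hourly output of three farms, placeholder data

# Cluster the joint outputs into discrete states (the paper uses self-organizing maps).
n_states = 8
states = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit_predict(power)

# Estimate state-to-state transition probabilities from the state sequence.
trans = np.zeros((n_states, n_states))
for a, b in zip(states[:-1], states[1:]):
    trans[a, b] += 1.0
trans /= trans.sum(axis=1, keepdims=True)
print(trans.round(2))
```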

  10. Effects of various representations of temporally and spatially variable agricultural processes in air quality dispersion modeling

    Science.gov (United States)

    Agricultural activities that are both temporally and spatially variable, such as tillage and harvesting, can be challenging to represent as sources in air quality dispersion modeling. Existing models were mainly developed to predict concentrations resulting from a stationary and continuous source wi...

  11. Labour Quality Model for Organic Farming Food Chains

    OpenAIRE

    Gassner, B.; Freyer, B.; Leitner, H.

    2008-01-01

    The debate on labour quality is controversial in science as well as in the organic agriculture community. We therefore reviewed the literature on different labour quality models and definitions, and conducted key informant interviews on labour quality issues with stakeholders in a regionally oriented organic agriculture bread food chain. We developed a labour quality model with nine quality categories and discussed linkages to labour satisfaction, ethical values and IFOAM principles.

  12. Novel Diagnostic Model for the Deficient and Excess Pulse Qualities

    Directory of Open Access Journals (Sweden)

    Jaeuk U. Kim

    2012-01-01

    Full Text Available The deficient and excess pulse qualities (DEPs are the two representatives of the deficiency and excess syndromes, respectively. Despite its importance in the objectification of pulse diagnosis, a reliable classification model for the DEPs has not been reported to date. In this work, we propose a classification method for the DEPs based on a clinical study. First, through factor analysis and Fisher's discriminant analysis, we show that all the pulse amplitudes obtained at various applied pressures at Chon, Gwan, and Cheok contribute on equal orders of magnitude in the determination of the DEPs. Then, we discuss that the pulse pressure or the average pulse amplitude is appropriate for describing the collective behaviors of the pulse amplitudes and a simple and reliable classification can be constructed from either quantity. Finally, we propose an enhanced classification model that combines the two complementary variables sequentially.
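
    A minimal sketch of the kind of discriminant classification described above, using synthetic stand-ins for the clinical pulse data (the feature values and class separations are assumed, not taken from the study):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

# Synthetic stand-in for the clinical data: two collective variables per
# pulse (average pulse amplitude, pulse pressure) for deficient (0) and
# excess (1) pulse qualities.  Class means and spreads are assumptions.
n = 100
deficient = np.column_stack([rng.normal(8.0, 2.0, n), rng.normal(25.0, 5.0, n)])
excess = np.column_stack([rng.normal(14.0, 2.0, n), rng.normal(40.0, 6.0, n)])
X = np.vstack([deficient, excess])
y = np.repeat([0, 1], n)

lda = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", lda.score(X, y))
print("predicted class for pulse [10, 30]:", int(lda.predict([[10.0, 30.0]])[0]))
```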

  13. Representing northern peatland microtopography and hydrology within the Community Land Model

    Science.gov (United States)

    X. Shi; P.E. Thornton; D.M. Ricciuto; P J. Hanson; J. Mao; Stephen Sebestyen; N.A. Griffiths; G. Bisht

    2015-01-01

    Predictive understanding of northern peatland hydrology is a necessary precursor to understanding the fate of massive carbon stores in these systems under the influence of present and future climate change. Current models have begun to address microtopographic controls on peatland hydrology, but none have included a prognostic calculation of peatland water table depth...

  14. An Equivalent Mechanical Model for Representing the Entropy Generation in Heat Exchangers. Application to Power Cycles

    Directory of Open Access Journals (Sweden)

    Federico Ramírez

    2011-07-01

    Full Text Available

    One of the most common difficulties students face in learning Thermodynamics lies in grasping the physical meaning of concepts such as lost availability and entropy generation. This explains the quest for new approaches for explaining and comprehending these quantities, as suggested by diagrams from different authors. The difficulties worsen in the case of irreversibilities associated with heat transfer processes driven by a finite temperature difference, where no work transfer takes place. An equivalent mechanical model is proposed in this paper. Heat exchangers are modelled by means of Carnot heat engines and mechanical transmissions; the use of mechanical models allows an easy visualization of thermal irreversibilities. The proposed model is further applied to a power cycle, thus obtaining an “equivalent arrangement” where irreversibilities become clearly apparent.

  15. A mathematical model representing cellular immune development and response to Salmonella of chicken intestinal tissue

    NARCIS (Netherlands)

    Schokker, D.; Bannink, A.; Smits, M.A.; Rebel, J.M.J.

    2013-01-01

    The aim of this study was to create a dynamic mathematical model of the development of the cellular branch of the intestinal immune system of poultry during the first 42 days of life and of its response towards an oral infection with Salmonella enterica serovar Enteritidis. The system elements were

  16. A comparison of methods for representing random taste heterogeneity in discrete choice models

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Hess, Stephane

    2009-01-01

    This paper reports the findings of a systematic study using Monte Carlo experiments and a real dataset aimed at comparing the performance of various ways of specifying random taste heterogeneity in a discrete choice model. Specifically, the analysis compares the performance of two recent advanced...

  17. Towards a self-organizing pre-symbolic neural model representing sensorimotor primitives

    Directory of Open Access Journals (Sweden)

    Junpei eZhong

    2014-02-01

    Full Text Available The acquisition of symbolic and linguistic representations of sensorimotor behavior is a cognitive process performed by an agent when it is executing and/or observing its own and others' actions. According to Piaget's theory of cognitive development, these representations develop during the sensorimotor stage and the pre-operational stage. We propose a model that relates the conceptualization of higher-level information from visual stimuli to the development of the ventral/dorsal visual streams. This model employs a neural network architecture incorporating a predictive sensory module based on an RNNPB (Recurrent Neural Network with Parametric Biases) and a horizontal product model. We exemplify this model through a robot passively observing an object to learn its features and movements. During the learning process of observing sensorimotor primitives, i.e. observing a set of trajectories of arm movements and the oriented object features, the pre-symbolic representation is self-organized in the parametric units. These representational units act as bifurcation parameters, guiding the robot to recognize and predict various learned sensorimotor primitives. The pre-symbolic representation also accounts for the learning of sensorimotor primitives in a latent learning context.

  18. What Happens when Representations Fail to Represent? Graduate Students' Mental Models of Organic Chemistry Diagrams

    Science.gov (United States)

    Strickland, Amanda M.; Kraft, Adam; Bhattacharyya, Gautam

    2010-01-01

    As part of our investigations into the development of representational competence, we report results from a study in which we elicited sixteen graduate students' expressed mental models of commonly-used terms for describing organic reactions--functional group, nucleophile/electrophile, acid/base--and for diagrams of transformations and their…

  19. Representativeness errors in comparing chemistry transport and chemistry climate models with satellite UV-Vis tropospheric column retrievals

    NARCIS (Netherlands)

    Boersma, K.F.; Vinken, G.C.M.; Eskes, H.J.

    2016-01-01

    Ultraviolet-visible (UV-Vis) satellite retrievals of trace gas columns of nitrogen dioxide (NO2), sulfur dioxide (SO2), and formaldehyde (HCHO) are useful to test and improve models of atmospheric composition, for data assimilation, air quality hindcasting and forecasting, a

  20. A critical study of quality parameters in health care establishment: developing an integrated quality model

    NARCIS (Netherlands)

    Azam, M.; Rahman, Z.; Talib, F.; Singh, K.J.

    2012-01-01

    PURPOSE: The purpose of this article is to identify and critically analyze healthcare establishment (HCE) quality parameters described in the literature. It aims to propose an integrated quality model that includes technical quality and associated supportive quality parameters to achieve optimum

  1. Final Technical Report: "Representing Endogenous Technological Change in Climate Policy Models: General Equilibrium Approaches"

    Energy Technology Data Exchange (ETDEWEB)

    Ian Sue Wing

    2006-04-18

    The research supported by this award pursued three lines of inquiry: (1) The construction of dynamic general equilibrium models to simulate the accumulation and substitution of knowledge, which has resulted in the preparation and submission of several papers: (a) A submitted pedagogic paper which clarifies the structure and operation of computable general equilibrium (CGE) models (C.2), and a review article in press which develops a taxonomy for understanding the representation of technical change in economic and engineering models for climate policy analysis (B.3). (b) A paper which models knowledge directly as a homogeneous factor, and demonstrates that inter-sectoral reallocation of knowledge is the key margin of adjustment which enables induced technical change to lower the costs of climate policy (C.1). (c) An empirical paper which estimates the contribution of embodied knowledge to aggregate energy intensity in the U.S. (C.3), followed by a companion article which embeds these results within a CGE model to understand the degree to which autonomous energy efficiency improvement (AEEI) is attributable to technical change as opposed to sub-sectoral shifts in industrial composition (C.4) (d) Finally, ongoing theoretical work to characterize the precursors and implications of the response of innovation to emission limits (E.2). (2) Data development and simulation modeling to understand how the characteristics of discrete energy supply technologies determine their succession in response to emission limits when they are embedded within a general equilibrium framework. This work has produced two peer-reviewed articles which are currently in press (B.1 and B.2). (3) Empirical investigation of trade as an avenue for the transmission of technological change to developing countries, and its implications for leakage, which has resulted in an econometric study which is being revised for submission to a journal (E.1). As work commenced on this topic, the U.S. withdrawal

  2. Aeromechanical stability analysis of a multirotor vehicle model representing a hybrid heavy lift airship (HHLA)

    Science.gov (United States)

    Venkatesan, C.; Friedmann, P. P.

    1984-01-01

    Hybrid Heavy Lift Airship (HHLA) is a proposed candidate vehicle aimed at providing heavy lift capability at low cost. This vehicle consists of a buoyant envelope attached to a supporting structure to which four rotor systems, taken from existing helicopters are attached. Nonlinear equations of motion capable of modelling the dynamics of this coupled multi-rotor/support frame/vehicle system have been developed. Using these equations of motion the aeroelastic and aeromechanical stability analysis is performed aimed at identifying potential instabilities which could occur for this type of vehicle. The coupling between various blade, supporting structure and rigid body modes is identified. Furthermore, the effects of changes in buoyancy ratio (Buoyant lift/total weight) on the dynamic characteristics of the vehicle are studied. The dynamic effects found are of considerable importance for the design of such vehicles. The analytical model developed is also useful for studying the aeromechanical stability of single rotor and tandem rotor coupled rotor/fuselage systems.

  3. A General Model for Representing Arbitrary Unsymmetries in Various Types of Network Analysis

    DEFF Research Database (Denmark)

    Rønne-Hansen, Jan

    1997-01-01

    When dealing with unsymmetric faults various proposals have been put forward. In general they have been characterized by specific treatment of the single fault in accordance with the structure and impedances involved. The model presented is based on node equations and was originally developed for...... complicated fault situation which has not been treated before for traditional transient stability analysis...... for transient stability studies in order to allow for an arbitrary fault representation as seen from the positive sequence network. The method results in impedances -or admittances-combining the negative sequence and zero sequence representation for the symmetrical network with the structure and electrical...... constants of the unsymmetry involving one or more buses. These impedances are introduced in the positive sequence network in the nodes involved in the unsymmetrical conditions. In addition the model can be used for static fault current analysis and presents also in this connection a general method...

  4. The Adaptive Co-Management Process: an Initial Synthesis of Representative Models and Influential Variables

    Directory of Open Access Journals (Sweden)

    Ryan Plummer

    2009-12-01

    Full Text Available Collaborative and adaptive approaches to environmental management have captured the attention of administrators, resource users, and scholars. Adaptive co-management builds upon these approaches to create a novel governance strategy. This paper investigates the dynamics of the adaptive co-management process and the variables that influence it. The investigation begins by summarizing analytical and causal models relevant to the adaptive co-management process. Variables that influence this process are then synthesized from diverse literatures, categorized as being exogenous or endogenous, and developed into respective analytical frameworks. In identifying commonalities among models of the adaptive co-management process and discerning influential variables, this paper provides initial insights into understanding the dynamic social process of adaptive co-management. From these insights, conjectures for future inquiries are offered in the conclusion.

  5. Lattice-Boltzmann modeling of micromodel experiments representing a CO2-brine system

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Mark L [Los Alamos National Laboratory; Kang, Qinjun [Los Alamos National Laboratory; Tarimala, Sowmitri [Los Alamos National Laboratory; Abdel - Fattah, Amr I [Los Alamos National Laboratory; Backhaus, Scott [Los Alamos National Laboratory; Carey, James W [Los Alamos National Laboratory

    2010-12-21

    Successful sequestration of CO{sub 2} into deep saline aquifers presents an enormous challenge that requires fundamental understanding of reactive multiphase flow and transport across many temporal and spatial scales. Of critical importance is accurately predicting the efficiency of CO{sub 2} trapping mechanisms. At the pore scale (e.g., microns to millimeters) the interfacial area between CO{sub 2} and brine, as well as CO{sub 2} and the solid phase, directly influences the amount of CO{sub 2} trapped due to capillary forces, dissolution and mineral precipitation. In this work, we model immiscible displacement micromodel experiments using the lattice-Boltzmann (LB) method. We focus on quantifying interfacial area as a function of capillary numbers and viscosity ratios typically encountered in CO{sub 2} sequestration operations. We show that the LB model adequately predicts the steady-state experimental flow patterns and interfacial area measurements. Based on the steady-state agreement, we use the LB model to investigate interfacial dynamics (e.g., fluid-fluid interfacial velocity and the rate of production of fluid-fluid interfacial area). In addition, we quantify the amount of interfacial area and the interfacial dynamics associated with the capillary-trapped nonwetting phase. This is expected to be important for predicting the amount of nonwetting phase subsequently trapped due to dissolution and mineral precipitation.

  7. Modeling the subjective quality of highly contrasted videos displayed on LCD with local backlight dimming.

    Science.gov (United States)

    Mantel, Claire; Bech, Søren; Korhonen, Jari; Forchhammer, Søren; Pedersen, Jesper Melgaard

    2015-02-01

    Local backlight dimming is a technology aiming at both saving energy and improving visual quality on television sets. As the rendition of the image is specified locally, the numerical signal corresponding to the displayed image needs to be computed through a model of the display. This simulated signal can then be used as input to objective quality metrics. The focus of this paper is on determining which characteristics of locally backlit displays influence quality assessment. A subjective experiment assessing the quality of highly contrasted videos displayed with various local backlight-dimming algorithms is set up. Subjective results are then compared with both objective measures and objective quality metrics using different display models. The first analysis indicates that the most significant objective features are temporal variations, power consumption (probably representing leakage), and a contrast measure. The second analysis shows that modeling of leakage is necessary for objective quality assessment of sequences displayed with local backlight dimming.

  8. Advances in Application of Models in Soil Quality Evaluation

    Institute of Scientific and Technical Information of China (English)

    SI Zhi-guo; WANG Ji-jie; YU Yuan-chun; LIANG Guan-feng; CHEN Chang-ren; SHU Hong-lan

    2012-01-01

    Soil quality is a comprehensive reflection of soil properties. Since the soil quality concept was put forward in the 1970s, the quality of different types of soils in different regions has been evaluated through a variety of evaluation methods, but universal soil quality evaluation models and methods are still lacking. In this paper, the applications and prospects of the grey relevancy comprehensive evaluation model, attribute hierarchical model, fuzzy comprehensive evaluation model, matter-element model, RAGA-based PPC/PPE model and GIS model in soil quality evaluation are reviewed.
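
    One of the listed approaches, the fuzzy comprehensive evaluation model, can be sketched in a few lines; the indicator weights and membership degrees below are assumed example numbers only.

```python
import numpy as np

# Three soil indicators (e.g. organic matter, pH, available N) with assumed
# weights, and assumed membership degrees of each indicator in three quality
# grades (rows: indicators, columns: grades I-III).
weights = np.array([0.5, 0.2, 0.3])
membership = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.4, 0.4, 0.2],
])

# Weighted-average composition operator B = W . R
grade_membership = weights @ membership
print("grade membership:", np.round(grade_membership, 2))
print("assigned quality grade:", ["I", "II", "III"][int(np.argmax(grade_membership))])
```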

  9. Model validation of channel zapping quality

    Science.gov (United States)

    Kooij, Robert; Nicolai, Floris; Ahmed, Kamal; Brunnström, Kjell

    2009-02-01

    In an earlier paper we showed that the perceived quality of channel zapping is related to the perceived quality of download time in web browsing, as suggested by ITU-T Rec. G.1030. We showed this by performing subjective tests resulting in an excellent fit with a 0.99 correlation. This was what we call a lean-forward experiment and gave the rule-of-thumb result that the zapping time must be less than 0.43 sec to be rated good (> 3.5 on the MOS scale). To validate the model we have carried out new subjective experiments. These experiments included lean-backward zapping, i.e. sitting on a sofa with a remote control. The subjects are more forgiving in this case and the requirement could be relaxed to 0.67 sec. We also conducted subjective experiments in which the zapping times vary, and found that the MOS rating decreases if zapping delay times are varying. In these experiments we assumed uniformly distributed delays, where the variance cannot be larger than the mean delay. We found that, in order to obtain a MOS rating of at least 3.5, the maximum allowed variance, and thus also the maximum allowed mean zapping delay, is 0.46 sec.
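
    The reported rules of thumb can be collected into a small helper that checks whether a zapping-delay profile is expected to reach a MOS of at least 3.5; the function below simply encodes the thresholds quoted in the abstract (0.43 s lean-forward, 0.67 s lean-backward, 0.46 s for varying delays) and is not the underlying ITU-T model.

```python
def zapping_meets_mos_target(mean_delay_s, delay_variance=0.0, lean_back=False):
    """Rule-of-thumb check for a MOS of at least 3.5, using only the
    threshold values quoted in the abstract above."""
    if delay_variance > 0.0:
        # Varying (uniformly distributed) zapping delays: both the mean delay
        # and its variance must stay at or below 0.46 s.
        return mean_delay_s <= 0.46 and delay_variance <= 0.46
    limit = 0.67 if lean_back else 0.43   # lean-backward vs lean-forward viewing
    return mean_delay_s <= limit


print(zapping_meets_mos_target(0.40))                        # True, lean forward
print(zapping_meets_mos_target(0.60, lean_back=True))        # True, lean backward
print(zapping_meets_mos_target(0.50, delay_variance=0.30))   # False, mean too high
```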

  10. Representing icebergs in the iLOVECLIM model (version 1.0) – a sensitivity study

    Directory of Open Access Journals (Sweden)

    M. Bügelmayer

    2014-07-01

    Full Text Available Recent modelling studies have indicated that icebergs alter the ocean's state, the thickness of sea ice and the prevailing atmospheric conditions, in short play an active role in the climate system. The icebergs' impact is due to their slowly released melt water which freshens and cools the ocean. The spatial distribution of the icebergs and thus their melt water depends on the forces (atmospheric and oceanic) acting on them as well as on the icebergs' size. The studies conducted so far have in common that the icebergs were moved by reconstructed or modelled forcing fields and that the initial size distribution of the icebergs was prescribed according to present day observations. To address these shortcomings, we used the climate model iLOVECLIM, which includes actively coupled ice-sheet and iceberg modules, to conduct 15 sensitivity experiments to analyse (1) the impact of the forcing fields (atmospheric vs. oceanic) on the icebergs' distribution and melt flux, and (2) the effect of the used initial iceberg size on the resulting Northern Hemisphere climate and ice sheet under different climate conditions (pre-industrial, strong/weak radiative forcing). Our results show that, under equilibrated pre-industrial conditions, the oceanic currents cause the bergs to stay close to the Greenland and North American coast, whereas the atmospheric forcing quickly distributes them further away from their calving site. These different characteristics strongly affect the lifetime of icebergs, since the wind-driven icebergs melt up to two years faster as they are quickly distributed into the relatively warm North Atlantic waters. Moreover, we find that local variations in the spatial distribution due to different iceberg sizes do not result in different climate states and Greenland ice sheet volume, independent of the prevailing climate conditions (pre-industrial, warming or cooling climate). Therefore, we conclude that local differences in the distribution of their

  11. Evaluating NOx emission inventories for regulatory air quality modeling using satellite and air quality model data

    Science.gov (United States)

    Kemball-Cook, Susan; Yarwood, Greg; Johnson, Jeremiah; Dornblaser, Bright; Estes, Mark

    2015-09-01

    The purpose of this study was to assess the accuracy of NOx emissions in the Texas Commission on Environmental Quality's (TCEQ) State Implementation Plan (SIP) modeling inventories of the southeastern U.S. We used retrieved satellite tropospheric NO2 columns from the Ozone Monitoring Instrument (OMI) together with NO2 columns from the Comprehensive Air Quality Model with Extensions (CAMx) to make top-down NOx emissions estimates using the mass balance method. Two different top-down NOx emissions estimates were developed using the KNMI DOMINO v2.0 and NASA SP2 retrievals of OMI NO2 columns. Differences in the top-down NOx emissions estimates made with these two operational products derived from the same OMI radiance data were sufficiently large that they could not be used to constrain the TCEQ NOx emissions in the southeast. The fact that the two available operational NO2 column retrievals give such different top-down NOx emissions results is important because these retrievals are increasingly being used to diagnose air quality problems and to inform efforts to solve them. These results reflect the fact that NO2 column retrievals are a blend of measurements and modeled data and should be used with caution in analyses that will inform policy development. This study illustrates both benefits and challenges of using satellite NO2 data for air quality management applications. Comparison with OMI NO2 columns pointed the way toward improvements in the CAMx simulation of the upper troposphere, but further refinement of both regional air quality models and the NO2 column retrievals is needed before the mass balance and other emission inversion methods can be used to successfully constrain NOx emission inventories used in U.S. regulatory modeling.
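
    The simple mass-balance inversion mentioned above can be sketched as a cell-by-cell scaling of the a priori emissions by the ratio of observed to modelled NO2 columns; the numbers below are made up, and the study's actual top-down estimates involve additional corrections.

```python
import numpy as np

# Hypothetical 2x2 grid: a priori NOx emissions (arbitrary units), modelled
# NO2 columns from the air quality model and retrieved satellite NO2 columns
# (molecules/cm2).  The local mass-balance estimate scales the a priori
# emissions by the observed-to-modelled column ratio, cell by cell.
e_apriori = np.array([[2.0, 1.5],
                      [0.8, 3.1]])
col_model = np.array([[4.0e15, 3.0e15],
                      [1.5e15, 6.5e15]])
col_obs = np.array([[5.2e15, 2.7e15],
                    [1.8e15, 5.9e15]])

scaling = col_obs / col_model
e_topdown = e_apriori * scaling

print("scaling factors:\n", np.round(scaling, 2))
print("top-down emission estimate:\n", np.round(e_topdown, 2))
```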

  12. The SHOCT domain: a widespread domain under-represented in model organisms.

    Directory of Open Access Journals (Sweden)

    Ruth Y Eberhardt

    Full Text Available We have identified a new protein domain, which we have named the SHOCT domain (Short C-terminal domain). This domain is widespread in bacteria, with over a thousand examples, but we found it is missing from the most commonly studied model organisms, despite being present in closely related species. Its predominantly C-terminal location, co-occurrence with numerous other domains and short size are reminiscent of the Gram-positive anchor motif; however, it is present in a much wider range of species. We suggest several hypotheses about the function of SHOCT, including oligomerisation and nucleic acid binding. Our initial experiments do not support its role as an oligomerisation domain.

  13. Value of information estimation using geologic representative models; Estimativa de valor de informacao usando modelos geologicos representativos

    Energy Technology Data Exchange (ETDEWEB)

    Xavier, Alexandre M. [PETROBRAS, Rio de Janeiro, RJ (Brazil). Unidade de Negocio de Exploracao e Producao da Bacia de Santos. Gerencia de Reservatorio], e-mail: amxavier@petrobras.com.br; Ligero, Eliana L. [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Mecanica. Laboratorio de Pesquisa em Simulacao e Gerenciamento de Reservatorios], e-mail: eligero@dep.fem.unicamp.br

    2006-12-15

    Petroleum field development occurs under geological, economic, technological and political uncertainties. Risk arising from geological uncertainties can be mitigated by additional information or operational flexibility. In petroleum field development, especially for offshore fields, where investment and information costs are high and flexibility is low, it is necessary to use a probabilistic methodology in the decision analysis, mainly in the production strategy definition. The employment of probabilistic methodologies in risk analysis requires some simplifications due to the complexity of the process, the high number of decision possibilities and the high cost of flow simulation - the tool used to evaluate alternatives. A possible simplification is the use of geological representative models (GRM), which are models able to represent the reservoir's geological uncertainties. In a risk methodology, the GRM are used to integrate the geological, economic, technological and production strategies. A methodology to determine the Value of Information, based on the geological representative models, has been developed in order to minimize the risks involved in the project. The methodology has been validated and applied to an offshore field. (author)

  14. Community Multi-scale Air Quality (CMAQ) Modeling System for Air Quality Management

    Science.gov (United States)

    CMAQ simultaneously models multiple air pollutants including ozone, particulate matter and a variety of air toxics to help air quality managers determine the best air quality management scenarios for their communities, regions and states.

  15. Earth radiation balance as observed and represented in CMIP5 models

    Science.gov (United States)

    Wild, Martin; Folini, Doris; Schär, Christoph; Loeb, Norman; König-Langlo, Gert

    2014-05-01

    The genesis and evolution of Earth's climate is largely regulated by the Earth radiation balance. Despite its key role in the context of climate change, substantial uncertainties still exist in the quantification of the magnitudes of its different components and in its representation in climate models. While the net radiative energy flows in and out of the climate system at the top of the atmosphere are now known with considerable accuracy from new satellite programs such as CERES and SORCE, the energy distribution within the climate system and at the Earth's surface is less well determined. Accordingly, the magnitudes of the components of the surface energy balance have recently been controversially disputed, and potential inconsistencies between the estimated magnitudes of the global energy and water cycles have been emphasized. Here we summarize this discussion as presented in Chapter 2.3 of the 5th IPCC assessment report (AR5). In this context we made an attempt to better constrain the magnitudes of the surface radiative components with the largest uncertainties. In addition to satellite observations, we thereby made extensive use of the growing number of surface observations to constrain the radiation balance not only from space, but also from the surface. We combined these observations with the latest modeling efforts performed for AR5 (CMIP5) to infer best estimates for the global mean surface radiative components. Our analyses favor global mean values of downward surface solar and thermal radiation near 185 and 342 Wm-2, respectively, which are most compatible with surface observations (Wild et al. 2013). These estimates are on the order of 10 Wm-2 lower and higher, respectively, than in some of the previous global energy balance assessments, including those presented in previous IPCC reports. It is encouraging that these estimates, which make full use of the information contained in the surface networks, coincide within 2 Wm-2 with the latest satellite

  16. TEMPORAL SIGNATURES OF AIR QUALITY OBSERVATIONS AND MODEL OUTPUTS: DO TIME SERIES DECOMPOSITION METHODS CAPTURE RELEVANT TIME SCALES?

    Science.gov (United States)

    Time series decomposition methods were applied to meteorological and air quality data and their numerical model estimates. Decomposition techniques express a time series as the sum of a small number of independent modes which hypothetically represent identifiable forcings, thereb...

  17. [Total quality management in healthcare. The European Foundation for Quality Management Model].

    Science.gov (United States)

    Parente, S; Loureiro, R

    1998-11-01

    After presenting the quality model adapted by the health authorities, the authors review the development of quality methodology to the present day and the importance of self-assessment of quality, focusing on self-assessment according to the European Foundation for Quality Management (EFQM) model as adapted by healthcare organisations. Each criterion of the EFQM model is presented and its potential as a motivation for change by means of self-assessment is discussed.

  18. Modelling representative and coherent Danish farm types based on farm accountancy data for use in environmental assessments

    DEFF Research Database (Denmark)

    Dalgaard, Randi; Halberg, Niels; Kristensen, Ib Sillebak

    2006-01-01

    -oriented environmental assessment (e.g. greenhouse gas emissions per kg pork). The objective of this study was to establish a national agricultural model for estimating data on resource use, production and environmentally important emissions for a set of representative farm types. Every year a sample of farm accounts...... is established in order to report Danish agro-economical data to the ‘Farm Accountancy Data Network’ (FADN), and to produce ‘The annual Danish account statistics for agriculture’. The farm accounts are selected and weighted to be representative for the Danish agricultural sector, and similar samples of farm...... accounts are collected in most of the European countries. Based on a sample of 2138 farm accounts from year 1999 a national agricultural model, consisting of 31 farm types, was constructed. The farm accounts were grouped according to the major soil types, the number of working hours, the most important...

  19. Representing Causation

    Science.gov (United States)

    Wolff, Phillip

    2007-01-01

    The dynamics model, which is based on L. Talmy's (1988) theory of force dynamics, characterizes causation as a pattern of forces and a position vector. In contrast to counterfactual and probabilistic models, the dynamics model naturally distinguishes between different cause-related concepts and explains the induction of causal relationships from…

  20. Indoor Air Quality Building Education and Assessment Model Forms

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM) is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  1. Indoor Air Quality Building Education and Assessment Model

    Science.gov (United States)

    The Indoor Air Quality Building Education and Assessment Model (I-BEAM), released in 2002, is a guidance tool designed for use by building professionals and others interested in indoor air quality in commercial buildings.

  2. Impact of rainfall temporal resolution on urban water quality modelling performance and uncertainties.

    Science.gov (United States)

    Manz, Bastian Johann; Rodríguez, Juan Pablo; Maksimović, Cedo; McIntyre, Neil

    2013-01-01

    A key control on the response of an urban drainage model is how well the observed rainfall records represent the real rainfall variability. Particularly in urban catchments with fast response flow regimes, the selection of temporal resolution in rainfall data collection is critical. Furthermore, the impact of the rainfall variability on the model response is amplified for water quality estimates, as uncertainty in rainfall intensity affects both the rainfall-runoff and pollutant wash-off sub-models, thus compounding uncertainties. A modelling study was designed to investigate the impact of altering rainfall temporal resolution on the magnitude and behaviour of uncertainties associated with the hydrological modelling compared with water quality modelling. The case study was an 85-ha combined sewer sub-catchment in Bogotá (Colombia). Water quality estimates showed greater sensitivity to the inter-event variability in rainfall hyetograph characteristics than to changes in the rainfall input temporal resolution. Overall, uncertainties from the water quality model were two- to five-fold those of the hydrological model. However, owing to the intrinsic scarcity of observations in urban water quality modelling, total model output uncertainties, especially from the water quality model, were too large to make recommendations for particular model structures or parameter values with respect to rainfall temporal resolution.

  3. Nonlinear and Nonparametric Stochastic Model to Represent Uncertainty of Renewable Generation in Operation and Expansion Planning Studies of Electrical Energy Systems

    Science.gov (United States)

    Martins, T. M.; Alberto, J.

    2015-12-01

    The uncertainties of wind and solar generation patterns tend to be a critical factor in operation and expansion planning studies of electrical energy systems, as these generation sources are highly dependent on atmospheric variables which are difficult to predict. Traditionally, the uncertainty of renewable generation has been represented through scenarios generated by autoregressive parametric models (ARMA, PAR(p), SARIMA, etc.), which have been widely used for simulating the uncertainty of inflows and electrical demand. These methods have three disadvantages: (i) it is assumed that the random variables can be modelled through a known probability distribution, usually Weibull, log-normal, or normal, which is not always adequate; (ii) the temporal and spatial coupling of the represented variables is generally constructed from the Pearson correlation, strictly requiring the hypothesis of data normality, which in the case of wind and solar generation is not met; (iii) there is an exponential increase in model complexity due to its dimensionality. This work proposes the use of a stochastic model built from the combination of a non-parametric estimate of the probability density function (the kernel density estimation method) with a dynamic Bayesian network framework. The kernel density estimation method is used to obtain the probability density function of the random variables directly from historical records, eliminating the need to choose prior distributions. The Bayesian network allows the representation of nonlinearities in the temporal coupling of the time series, since it can reproduce a compact probability distribution of a variable conditioned on preceding stages. The proposed model was used to generate wind power scenarios in long-term operation studies of the Brazilian Electric System, in which inflows of major rivers were also represented. The results show a considerable quality gain when compared to scenarios generated by traditional approaches.
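
    A crude sketch of the non-parametric scenario-generation idea: historical transitions are re-weighted with a Gaussian kernel around the current value, giving an approximate kernel-density conditional draw of the next step (a stand-in for the dynamic Bayesian network machinery described above); the series below is synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "historical" hourly wind generation (per unit).
hist = np.clip(0.5 + 0.3 * np.sin(np.arange(3000) / 40.0)
               + 0.1 * rng.standard_normal(3000), 0.0, 1.0)
x_prev, x_next = hist[:-1], hist[1:]


def sample_next(current, bandwidth=0.05):
    """Draw the next-hour value given the current one by kernel-weighting the
    historical transitions around `current` (no parametric distribution)."""
    w = np.exp(-0.5 * ((x_prev - current) / bandwidth) ** 2)
    w /= w.sum()
    idx = rng.choice(x_next.size, p=w)
    # Add kernel noise so samples are not restricted to observed values.
    return float(np.clip(x_next[idx] + bandwidth * rng.standard_normal(), 0.0, 1.0))


# One 24-hour scenario starting from the last observed value.
scenario = [float(hist[-1])]
for _ in range(24):
    scenario.append(sample_next(scenario[-1]))
print(np.round(scenario, 2))
```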

  4. Impact of Domain Modeling Techniques on the Quality of Domain Model: An Experiment

    Directory of Open Access Journals (Sweden)

    Hiqmat Nisa

    2016-10-01

    Full Text Available The unified modeling language (UML) is widely used to analyze and design different software development artifacts in object-oriented development. The domain model is a significant artifact that models the problem domain and visually represents real-world objects and the relationships among them. It facilitates the comprehension process by identifying the vocabulary and key concepts of the business world. The category list technique identifies concepts and associations with the help of predefined categories which are important to business information systems, whereas the noun phrasing technique performs grammatical analysis of the use case description to recognize concepts and associations. Both of these techniques are used for the construction of the domain model; however, no empirical evidence exists that evaluates the quality of the resulting domain model constructed via these two basic techniques. A controlled experiment was performed to investigate the impact of the category list and noun phrasing techniques on the quality of the domain model. The constructed domain model is evaluated for completeness, correctness and the effort required for its design. The obtained results show that the category list technique is better than the noun phrasing technique for the identification of concepts, as it avoids generating unnecessary elements, i.e. extra concepts, associations and attributes in the domain model. The noun phrasing technique produces a comprehensive domain model and requires less effort compared to the category list. There is no statistically significant difference between the two techniques in the case of correctness.

  6. Impacts on quality of life related to dental caries in a national representative sample of Thai 12- and 15-year-olds.

    Science.gov (United States)

    Krisdapong, S; Prasertsom, P; Rattanarangsima, K; Sheiham, A

    2013-01-01

    Dental caries is generally given the highest priority in national oral health services for school-aged populations. Yet, there is no study exploring the impacts on quality of life specifically related to dental caries in national samples of school-aged children. This study assessed the prevalence and characteristics of oral impacts attributed to dental caries on quality of life and compared them with overall oral health impacts. In addition, associations between oral impacts attributed to dental caries and dental caries status were investigated. A nationally representative sample of 1,063 12- and 811 15-year-olds completed a sociodemographic and behavioural questionnaire, and were orally examined and interviewed about oral health-related quality of life using the Child-OIDP or OIDP indexes, respectively. Associations of condition-specific impacts (CS impacts) attributed to dental caries with components of DMF were investigated using χ² tests and multivariate logistic regressions. CS impacts attributed to dental caries were reported by nearly half the children, and such impacts accounted for half of the overall oral impacts from all oral conditions. The majority of impacts were of little intensity and affected only 1-2 daily performances, particularly performances on Eating, Emotional stability and Cleaning teeth. CS impacts were significantly positively associated with the number of decayed teeth, and strongly associated with severe decay.

  7. DATA QUALITY TOOLS FOR DATAWAREHOUSE MODELS

    Directory of Open Access Journals (Sweden)

    JASPREETI SINGH

    2015-05-01

    Full Text Available Data quality tools aim at detecting and correcting data problems that influence the accuracy and efficiency of data analysis applications. Data warehousing activities require data quality tools to ready the data and ensure that clean data populates the warehouse, thus raising the usability of the warehouse. This research targets the problems in the data that are addressed by data quality tools. We classify data quality tools based on data warehouse stages and on the features of each tool that address data quality problems, and we examine their functionalities.

  8. The Influence of Syntactic Quality on Pragmatic Quality of Enterprise Process Models

    Directory of Open Access Journals (Sweden)

    Merethe Heggset

    2015-12-01

    Full Text Available As approaches and tools for process and enterprise modelling mature, these techniques are being taken into use on a large scale in an increasing number of organizations. In this paper we report on the use of process modelling in connection with the quality system of Statoil, a large Norwegian oil company, and in particular on the aspects that must be emphasized to achieve appropriate quality of the models in this organization. Based on the investigation of usage statistics and user feedback on models, we have identified that there are problems in comprehending some of the models. Some of these models have poorer syntactic quality than the average syntactic quality of models of the same size. An experiment with improving the syntactic quality of some of these models has given mixed results, and it appears that certain syntactic errors hinder comprehension more than others.

  9. The Interaction Network Ontology-supported modeling and mining of complex interactions represented with multiple keywords in biomedical literature.

    Science.gov (United States)

    Özgür, Arzucan; Hur, Junguk; He, Yongqun

    2016-01-01

    hierarchical display of these 34 interaction types and their ancestor terms in INO resulted in the identification of specific gene-gene interaction patterns from the LLL dataset. The phenomenon of having multi-keyword interaction types was also frequently observed in the vaccine dataset. By modeling and representing multiple textual keywords for interaction types, the extended INO enabled the identification of complex biological gene-gene interactions represented with multiple keywords.

  10. ARMA-GM combined forewarning model for the quality control

    Institute of Scientific and Technical Information of China (English)

    Wang Xingyuan; Yang Xu

    2005-01-01

    Three forecasting models are set up: the auto-regressive moving average model, the grey forecasting model for the rate of qualified products Pt, and the grey forecasting model for the time intervals between quality catastrophes. A combined forewarning system for product quality is then established, which contains the three models, judgment rules and a forewarning state illustration. Finally, using an example from practical production, the modeling system is shown to be fairly effective.
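
    The grey forecasting component can be illustrated with a standard GM(1,1) implementation; the qualified-rate series below is an invented example, not data from the paper.

```python
import numpy as np

def gm11_forecast(x0, steps=3):
    """GM(1,1) grey forecasting model: fits the whitened first-order equation
    to the accumulated series and returns forecasts of the original series
    (standard textbook formulation)."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                  # accumulated series
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]         # development coefficient, grey input
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # accumulated forecast
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)]) # restore by differencing
    return x0_hat[len(x0):]

# Example: weekly rate of qualified products (percent), synthetic values.
qualified_rate = [96.2, 95.8, 96.5, 95.1, 94.7, 94.9, 94.2]
print(np.round(gm11_forecast(qualified_rate, steps=3), 2))
```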

  11. A linear programming approach to reconstructing subcellular structures from confocal images for automated generation of representative 3D cellular models.

    Science.gov (United States)

    Wood, Scott T; Dean, Brian C; Dean, Delphine

    2013-04-01

    This paper presents a novel computer vision algorithm to analyze 3D stacks of confocal images of fluorescently stained single cells. The goal of the algorithm is to create representative in silico model structures that can be imported into finite element analysis software for mechanical characterization. Segmentation of cell and nucleus boundaries is accomplished via standard thresholding methods. Using novel linear programming methods, a representative actin stress fiber network is generated by computing a linear superposition of fibers having minimum discrepancy compared with an experimental 3D confocal image. Qualitative validation is performed through analysis of seven 3D confocal image stacks of adherent vascular smooth muscle cells (VSMCs) grown in 2D culture. The presented method is able to automatically generate 3D geometries of the cell's boundary, nucleus, and representative F-actin network based on standard cell microscopy data. These geometries can be used for direct importation and implementation in structural finite element models for analysis of the mechanics of a single cell to potentially speed discoveries in the fields of regenerative medicine, mechanobiology, and drug discovery.
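
    The fiber-selection step can be sketched on a toy 2D grid using non-negative least squares in place of the paper's linear programming formulation; candidate fibers, weights and noise levels are all assumed for illustration.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(5)

# Toy stand-in for a confocal slice: candidate "fibers" are horizontal lines
# rasterised on a small grid; a non-negative superposition of candidates is
# fitted to the observed image (NNLS instead of the paper's LP formulation).
size = 16


def fiber_image(row):
    img = np.zeros((size, size))
    img[row, :] = 1.0
    return img.ravel()


candidates = np.column_stack([fiber_image(r) for r in range(size)])
truth = 0.8 * fiber_image(3) + 0.5 * fiber_image(10)
observed = truth + 0.05 * rng.standard_normal(truth.size)

weights, _ = nnls(candidates, observed)
selected = np.nonzero(weights > 0.1)[0]
print("selected fiber rows:", selected)
print("fitted weights:", np.round(weights[selected], 2))
```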

  12. This year's model: Geochemical modeling and groundwater quality

    Energy Technology Data Exchange (ETDEWEB)

    Tuchfeld, H.A.; Simmons, S.P.; Jesionek, K.S. [GeoSyntec Consultants, Walnut Creek, CA (United States)]|[GeoSyntec Consultants, Atlanta, GA (United States); Romito, A.A. [Browning-Ferris Industries, Inc., Houston, TX (United States)

    1998-07-01

    It has been determined that landfill gas migration is a source of volatile organic compounds (VOCs) in groundwater. This can occur through: direct partitioning of migrating gas constituents into the groundwater; alteration of the physiochemical properties of the groundwater; and by indirect means (such as migration of landfill gas condensate and vadose zone water contaminated by landfill gas). This article examines the use of geochemical modeling as a useful tool for differentiating the effects of municipal solid waste (MSW) landfill gas versus leachate on groundwater quality at MSW landfill sites.

  13. Performance evaluation of quality monitor models in spot welding

    Institute of Scientific and Technical Information of China (English)

    Zhang Zhongdian; Li Dongqing; Wang Kai

    2005-01-01

    Performance of quality monitor models in spot welding directly determines the monitoring precision, so it is crucial to evaluate it. Previously, the mean square error (MSE) was often used to evaluate the performance of models, but it can only show the total error over a finite set of specimens and cannot show whether the quality information inferred from the models is accurate and reliable enough. For this reason, by means of measurement error theory, a new way to evaluate the performance of models according to their error distributions is developed: only if the error distribution of a model is sufficiently correct and precise can the quality information inferred from the model be considered accurate and reliable.
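
    The difference between judging a model by MSE alone and by its error distribution can be illustrated with two synthetic monitor models that predict weld nugget diameter; all numbers below are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical weld nugget diameters (mm): measured values and predictions
# from two quality monitor models with different error behaviour.
measured = rng.normal(6.0, 0.3, 200)
model_a = measured + rng.normal(0.00, 0.10, 200)   # unbiased, wider spread
model_b = measured + rng.normal(0.08, 0.06, 200)   # biased, tighter spread

for name, pred in [("model A", model_a), ("model B", model_b)]:
    err = pred - measured
    mse = np.mean(err ** 2)
    lo, hi = np.percentile(err, [2.5, 97.5])
    # The error distribution (bias and interval) says more about whether the
    # inferred quality information is reliable than the single MSE figure.
    print(f"{name}: MSE={mse:.4f}  bias={err.mean():+.3f}  "
          f"95% error interval=[{lo:+.3f}, {hi:+.3f}]")
```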

  14. The role of subcutaneous tissue stiffness on microneedle performance in a representative in vitro model of skin.

    Science.gov (United States)

    Moronkeji, K; Todd, S; Dawidowska, I; Barrett, S D; Akhtar, R

    2016-11-10

    There has been growing interest in the mechanical behaviour of skin due to the rapid development of microneedle devices for drug delivery applications into skin. However, most in vitro experimentation studies that are used to evaluate microneedle performance do not consider the biomechanical properties of skin or that of the subcutaneous layers. In this study, a representative experimental model of skin was developed which was comprised of subcutaneous and muscle mimics. Neonatal porcine skin from the abdominal and back regions was used, with gelatine gels of differing water content (67, 80, 88 and 96%) to represent the subcutaneous tissue, and a type of ballistic gelatine, Perma-Gel®, as a muscle mimic. Dynamic nanoindentation was used to characterize the mechanical properties of each of these layers. A custom-developed impact test rig was used to apply dense polymethylmethacrylate (PMMA) microneedles to the skin models in a controlled and repeatable way with quantification of the insertion force and velocity. Image analysis methods were used to measure penetration depth and area of the breach caused by microneedle penetration following staining and optical imaging. The nanoindentation tests demonstrated that the tissue mimics matched expected values for subcutaneous and muscle tissue, and that the compliance of the subcutaneous mimics increased linearly with water content. The abdominal skin was thinner and less stiff as compared to back skin. The maximum force decreased with gel water content in the abdominal skin but not in the back skin. Overall, larger and deeper perforations were found in the skin models with increasing water content. These data demonstrate the importance of subcutaneous tissue on microneedle performance and the need for representative skin models in microneedle technology development.

  15. Impact of inherent meteorology uncertainty on air quality model predictions

    Science.gov (United States)

    It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is impor...

  16. Quality Assurance Model for Digital Adult Education Materials

    Science.gov (United States)

    Dimou, Helen; Kameas, Achilles

    2016-01-01

    Purpose: This paper aims to present a model for the quality assurance of digital educational material that is appropriate for adult education. The proposed model adopts the software quality standard ISO/IEC 9126 and takes into account adult learning theories, Bloom's taxonomy of learning objectives and two instructional design models: Kolb's model…

  17. Representing spatial and temporal complexity in ecohydrological models: a meta-analysis focusing on groundwater - surface water interactions

    Science.gov (United States)

    McDonald, Karlie; Mika, Sarah; Kolbe, Tamara; Abbott, Ben; Ciocca, Francesco; Marruedo, Amaia; Hannah, David; Schmidt, Christian; Fleckenstein, Jan; Karuse, Stefan

    2016-04-01

    Sub-surface hydrologic processes are highly dynamic, varying spatially and temporally with strong links to the geomorphology and hydrogeologic properties of an area. This spatial and temporal complexity is a critical regulator of biogeochemical and ecological processes within the groundwater - surface water (GW-SW) ecohydrological interface and adjacent ecosystems. Many GW-SW models have attempted to capture this spatial and temporal complexity with varying degrees of success. The incorporation of spatial and temporal complexity within GW-SW model configuration is important for investigating interactions with transient storage and subsurface geology, infiltration and recharge, and the mass balance of exchange fluxes at the GW-SW ecohydrological interface. Additionally, characterising spatial and temporal complexity in GW-SW models is essential to derive predictions under realistic environmental conditions. In this paper we conduct a systematic Web of Science meta-analysis of conceptual, hydrodynamic, and reactive and heat transport models of the GW-SW ecohydrological interface published since 2004 to explore how these models handled spatial and temporal complexity. The freshwater - groundwater ecohydrological interface was the most commonly represented in publications between 2004 and 2014, with 91% of papers, followed by marine (6%) and estuarine (3%) systems. Of the GW-SW models published since 2004, 52% focused on hydrodynamic processes and heat and reactive transport. Within the hydrodynamic subset, 25% of models focused on a vertical depth of limitations of incorporating spatial and temporal variability into GW-SW models are identified as the inclusion of woody debris, carbon sources, subsurface geological structures and bioclogging into model parameterization. The technological limitations influence the types of models applied, such as hydrostatic coupled models and fully intrinsic saturated and unsaturated models, and the assumptions or

  18. Representing Development

    DEFF Research Database (Denmark)

    Representing Development presents the different social representations that have formed the idea of development in Western thinking over the past three centuries. Offering an acute perspective on the current state of developmental science and providing constructive insights into future pathways...... and development, addressing their contemporary enactments and reflecting on future theoretical and empirical directions. The first section of the book provides an historical account of early representations of development that, having come from life science, has shaped the way in which developmental science has...... approached development. Section two focuses upon the contemporary issues of developmental psychology, neuroscience and developmental science at large. The final section offers a series of commentaries pointing to the questions opened by the previous chapters, looking to outline the future lines...

  19. Representing Microbial Dormancy in Soil Decomposition Models Improves Model Performance and Reveals Key Ecosystem Controls on Microbial Activity

    Science.gov (United States)

    He, Y.; Yang, J.; Zhuang, Q.; Wang, G.; Liu, Y.

    2014-12-01

    Climate feedbacks from soils can result from environmental change and subsequent responses of plant and microbial communities and nutrient cycling. Explicit consideration of microbial life-history traits and strategies may be necessary to predict climate feedbacks due to microbial physiology and community changes and their associated effects on carbon cycling. In this study, we developed an explicit microbial-enzyme decomposition model and examined model performance with and without representation of dormancy at six temperate forest sites with observed soil efflux records ranging from 4 to 10 years across different forest types. We then extrapolated the model to all temperate forests in the Northern Hemisphere (25-50°N) to investigate spatial controls on microbial and soil C dynamics. Both models captured the observed soil heterotrophic respiration (RH), yet the no-dormancy model consistently exhibited a large seasonal amplitude and overestimation of microbial biomass. Spatially, the total RH from temperate forests amounts to 6.88 Pg C/yr based on the dormancy model and 7.99 Pg C/yr based on the no-dormancy model. However, the no-dormancy model notably overestimated the ratio of microbial biomass to SOC. Spatial correlation analysis revealed key controls of the soil C:N ratio on the active proportion of microbial biomass, whereas local dormancy is primarily controlled by soil moisture and temperature, indicating scale-dependent environmental and biotic controls on microbial and SOC dynamics. These developments should provide essential support for modeling future soil carbon dynamics and enhance collaboration between empirical soil experiments and modeling, in the sense that more microbial physiological measurements are needed to better constrain and evaluate the models.
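
    A highly simplified two-pool sketch of the dormancy idea (assumed functional forms and rate constants, not the study's model): active microbes go dormant as soil moisture drops, dormant microbes reactivate as it recovers, and heterotrophic respiration is dominated by the active pool.

```python
import numpy as np


def simulate_respiration(moisture, b_active=1.0, b_dormant=1.0, dt=1.0):
    """Toy active/dormant microbial pools: transfer between pools is driven by
    relative soil moisture (0..1); respiration comes mainly from the active
    pool.  All rate constants are arbitrary illustrative values."""
    resp = []
    for w in moisture:
        to_dormant = 0.3 * (1.0 - w) * b_active * dt   # drying pushes microbes dormant
        to_active = 0.3 * w * b_dormant * dt           # wetting reactivates them
        b_active += to_active - to_dormant
        b_dormant += to_dormant - to_active
        resp.append(0.05 * b_active * w + 0.001 * b_dormant)
    return np.array(resp)


moisture = np.clip(0.5 + 0.4 * np.sin(np.linspace(0.0, 4.0 * np.pi, 100)), 0.0, 1.0)
print(np.round(simulate_respiration(moisture)[:10], 3))
```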

  20. The EDEN-IW ontology model for sharing knowledge and water quality data between heterogenous databases

    DEFF Research Database (Denmark)

    Stjernholm, M.; Poslad, S.; Zuo, L.;

    2004-01-01

    The Environmental Data Exchange Network for Inland Water (EDEN-IW) project's main aim is to develop a system for making disparate and heterogeneous databases of Inland Water quality more accessible to users. The core technology is based upon a combination of: ontological model to represent a Sema...

  1. Model validation of channel zapping quality

    NARCIS (Netherlands)

    Kooij, R.E; Nicolai, F.; Ahmed, K.; Brunnström, K.

    2009-01-01

    In an earlier paper we showed that perceived quality of channel zapping is related to the perceived quality of download time of web browsing, as suggested by ITU-T Rec. G.1030. We showed this by performing subjective tests resulting in an excellent fit with a 0.99 correlation. This was what we call

  2. Model validation of channel zapping quality

    NARCIS (Netherlands)

    Kooij, R.E; Nicolai, F.; Ahmed, K.; Brunnström, K.

    2009-01-01

    In an earlier paper we showed that perceived quality of channel zapping is related to the perceived quality of download time of web browsing, as suggested by ITU-T Rec. G.1030. We showed this by performing subjective tests resulting in an excellent fit with a 0.99 correlation. This was what we call

  3. Discovering the Future: Modelling Quality Matters

    NARCIS (Netherlands)

    Tijskens, L.M.M.

    2004-01-01

    Quality of agricultural products becomes increasingly important to consumers, and hence to producers and retailers. Quality, however, is ill defined, and is perceived and evaluated differently by different individuals, or by groups of individuals. Based on problem decomposition, a working theory on qual

  4. Assessment of the Quality Management Models in Higher Education

    Science.gov (United States)

    Basar, Gulsun; Altinay, Zehra; Dagli, Gokmen; Altinay, Fahriye

    2016-01-01

    This study involves the assessment of the quality management models in Higher Education by explaining the importance of quality in higher education and by examining the higher education quality assurance system practices in other countries. The qualitative study was carried out with the members of the Higher Education Planning, Evaluation,…

  5. Modeling the Subjective Quality of Highly Contrasted Videos Displayed on LCD With Local Backlight Dimming

    DEFF Research Database (Denmark)

    Mantel, Claire; Bech, Søren; Korhonen, Jari

    2015-01-01

    Local backlight dimming is a technology aiming at both saving energy and improving visual quality on television sets. As the rendition of the image is specified locally, the numerical signal corresponding to the displayed image needs to be computed through a model of the display. This simulated...... signal can then be used as input to objective quality metrics. The focus of this paper is on determining which characteristics of locally backlit displays influence quality assessment. A subjective experiment assessing the quality of highly contrasted videos displayed with various local backlight......-dimming algorithms is set up. Subjective results are then compared with both objective measures and objective quality metrics using different display models. The first analysis indicates that the most significant objective features are temporal variations, power consumption (probably representing leakage...

  6. Modeling data quality for risk assessment of GIS

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper presents a methodology to determine three data quality (DQ) risk characteristics: accuracy, comprehensiveness and nonmembership. The methodology provides a set of quantitative models to confirm the information quality risks for the database of the geographical information system (GIS). Four quantitative measures are introduced to examine how the quality risks of source information affect the quality of information outputs produced using the relational algebra operations Selection, Projection, and...

  7. [Aquatic ecosystem modelling approach: temperature and water quality models applied to Oualidia and Nador lagoons].

    Science.gov (United States)

    Idrissi, J Lakhdar; Orbi, A; Hilmi, K; Zidane, F; Moncef, M

    2005-07-01

    The objective of this work is to develop an aquatic ecosystem model and apply it to Moroccan lagoon systems. This model will keep us abreast of the yearly development of the main parameters that characterize these ecosystems while integrating all the data that have so far been acquired. Within this framework, a simulation model of the thermal system and a model of the water quality have been elaborated. These models, which have been simulated on the lagoon of Oualidia (North of Morocco) and validated on the lagoon of Nador (North West Mediterranean), make it possible to predict the cycles of surface temperature and the water quality parameters (dissolved oxygen and phytoplankton biomass) by using meteorological information, specific features of the sites and in situ measurements at the studied sites. The elaborated model, called zero-dimensional, simulates the average behaviour of the site over time through state variables that are representative of the studied ecosystem. This model provides answers for the studied phenomena and is an adequate working tool thanks to its numerical simplicity.

  8. Pharmacodynamic modelling of in vitro activity of tetracycline against a representative, naturally occurring population of porcine Escherichia coli

    DEFF Research Database (Denmark)

    Ahmad, Amais; Zachariasen, Camilla; Christiansen, Lasse Engbo;

    2015-01-01

    ... between susceptible and resistant strains in the absence of a drug was not different. EC50 increased linearly with MIC on a log-log scale, and γ was different between susceptible and resistant strains. The in vitro model parameters described the inhibition effect of tetracycline on E. coli when...... of Escherichia coli representative of those found in the Danish pig population, we compared the growth of 50 randomly selected strains. The observed net growth rates were used to describe the in vitro pharmacodynamic relationship between drug concentration and net growth rate based on an Emax model with three...... parameters: maximum net growth rate (αmax); concentration for a half-maximal response (EC50); and the Hill coefficient (γ). The net growth rate in the absence of antibiotic did not differ between susceptible and resistant isolates (P = 0.97). The net growth rate decreased with increasing tetracycline...
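
    The three-parameter sigmoidal Emax relationship referred to above can be written out explicitly; the following sketch uses hypothetical parameter values rather than the fitted values reported in the study:

```python
import numpy as np

def net_growth_rate(conc, alpha_max, alpha_min, ec50, gamma):
    """Sigmoidal Emax-type model: net growth rate as a function of drug concentration.

    alpha_max : net growth rate without drug (h^-1)
    alpha_min : net growth rate at very high drug concentration (may be negative)
    ec50      : concentration giving a half-maximal response (mg/L)
    gamma     : Hill coefficient (steepness of the concentration-response curve)
    """
    inhibition = conc**gamma / (ec50**gamma + conc**gamma)
    return alpha_max - (alpha_max - alpha_min) * inhibition

# Illustrative comparison of a "susceptible" and a "resistant" strain
concs = np.array([0.0, 0.25, 1.0, 4.0, 16.0])
print(net_growth_rate(concs, alpha_max=1.0, alpha_min=-0.5, ec50=1.0, gamma=2.0))
print(net_growth_rate(concs, alpha_max=1.0, alpha_min=-0.5, ec50=8.0, gamma=1.5))
```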

  9. NewsPaperBox - Online News Space: a visual model for representing the social space of a website

    Directory of Open Access Journals (Sweden)

    Selçuk Artut

    2010-02-01

    Full Text Available NewsPaperBox * propounds an alternative visual model utilizing the treemap algorithm to represent the collective use of a website that evolves in response to user interaction. While the technology currently exists to track various user behaviors such as number of clicks, duration of stay on a given web site, these statistics are not yet employed to influence the visual representation of that site's design in real time. In that sense, this project propounds an alternative modeling of a representational outlook of a website that is developed by collaborations and competitions of its global users. This paper proposes the experience of cyberspace as a generative process driven by its effective user participation.

  10. Representing nursing guideline with unified modeling language to facilitate development of a computer system: a case study.

    Science.gov (United States)

    Choi, Jeeyae; Choi, Jeungok E

    2014-01-01

    To provide best recommendations at the point of care, guidelines have been implemented in computer systems. As a prerequisite, guidelines are translated into a computer-interpretable guideline format. Since there are no specific tools to translate nursing guidelines, only a few nursing guidelines have been translated and implemented in computer systems. Unified modeling language (UML) is a software modeling language known to represent end-users' perspectives well and accurately, owing to its expressive characteristics. In order to facilitate the development of computer systems for nurses' use, the UML was used to translate a paper-based nursing guideline, and its ease of use and usefulness were tested through a case study of a genetic counseling guideline. The UML was found to be a useful tool for nurse informaticians and a sufficient tool to model a guideline in a computer program.

  11. NewsPaperBox - Online News Space: a visual model for representing the social space of a website

    Directory of Open Access Journals (Sweden)

    Selçuk Artut

    2010-02-01

    Full Text Available NewsPaperBox * propounds an alternative visual model utilizing the treemap algorithm to represent the collective use of a website that evolves in response to user interaction. While the technology currently exists to track various user behaviors such as number of clicks, duration of stay on a given web site, these statistics are not yet employed to influence the visual representation of that site's design in real time. In that sense, this project propounds an alternative modeling of a representational outlook of a website that is developed by collaborations and competitions of its global users. This paper proposes the experience of cyberspace as a generative process driven by its effective user participation.

  12. Parametric packet-based audiovisual quality model for IPTV services

    CERN Document Server

    Garcia, Marie-Neige

    2014-01-01

    This volume presents a parametric packet-based audiovisual quality model for Internet Protocol TeleVision (IPTV) services. The model is composed of three quality modules for the respective audio, video and audiovisual components. The audio and video quality modules take as input a parametric description of the audiovisual processing path, and deliver an estimate of the audio and video quality. These outputs are sent to the audiovisual quality module which provides an estimate of the audiovisual quality. Estimates of perceived quality are typically used both in the network planning phase and as part of the quality monitoring. The same audio quality model is used for both these phases, while two variants of the video quality model have been developed for addressing the two application scenarios. The addressed packetization scheme is MPEG2 Transport Stream over Real-time Transport Protocol over Internet Protocol. In the case of quality monitoring, that is the case for which the network is already set-up, the aud...

  13. Three-Dimensional Algebraic Models of the tRNA Code and 12 Graphs for Representing the Amino Acids

    Science.gov (United States)

    José, Marco V.; Morgado, Eberto R.; Guimarães, Romeu Cardoso; Zamudio, Gabriel S.; de Farías, Sávio Torres; Bobadilla, Juan R.; Sosa, Daniela

    2014-01-01

    Three-dimensional algebraic models, also called Genetic Hotels, are developed to represent the Standard Genetic Code, the Standard tRNA Code (S-tRNA-C), and the Human tRNA code (H-tRNA-C). New algebraic concepts are introduced to be able to describe these models, to wit, the generalization of the 2n-Klein Group and the concept of a subgroup coset with a tail. We found that the H-tRNA-C displayed broken symmetries in regard to the S-tRNA-C, which is highly symmetric. We also show that there are only 12 ways to represent each of the corresponding phenotypic graphs of amino acids. The averages of statistical centrality measures of the 12 graphs for each of the three codes are carried out and they are statistically compared. The phenotypic graphs of the S-tRNA-C display a common triangular prism of amino acids in 10 out of the 12 graphs, whilst the corresponding graphs for the H-tRNA-C display only two triangular prisms. The graphs exhibit disjoint clusters of amino acids when their polar requirement values are used. We contend that the S-tRNA-C is in a frozen-like state, whereas the H-tRNA-C may be in an evolving state. PMID:25370377

  14. INTERCOMPARISON OF ALTERNATIVE VEGETATION DATABASES FOR REGIONAL AIR QUALITY MODELING

    Science.gov (United States)

    Vegetation cover data are used to characterize several regional air quality modeling processes, including the calculation of heat, moisture, and momentum fluxes with the Mesoscale Meteorological Model (MM5) and the estimate of biogenic volatile organic compound and nitric oxide...

  15. Islamic Banks Service Innovation Quality: Conceptual Model

    Directory of Open Access Journals (Sweden)

    Tahreem Noor Khan

    2016-07-01

    Full Text Available Customer perspectives and satisfaction level are considered important for analysing the performance of Islamic bank service quality. Sufficient research has been done to explore customer perception of and satisfaction with Islamic banking service quality; however, there is a lack of data to compare and find the similarity in understanding the main determinant attributes needed for Islamic banking service quality. The purpose of this paper is to describe and integrate the results of the existing wealth of research on service quality in Islamic banks. After weighing up all the views from existing research, common findings and concerns will be discussed. This research did not find much information or many studies pointing toward innovation in Islamic banking service quality. Thus, based on a review of the literature, this paper suggests main key attributes of service for Islamic banks (RIBA Service IQ). This research strongly asserts that sincere motivation, truthful intention, and dynamic and practical service innovation of quality approaches can uplift the Islamic financial brand. DOI: 10.15408/aiq.v8i2.3161

  16. Image quality in CT: From physical measurements to model observers.

    Science.gov (United States)

    Verdun, F R; Racine, D; Ott, J G; Tapiovaara, M J; Toroi, P; Bochud, F O; Veldkamp, W J H; Schegerer, A; Bouwman, R W; Giron, I Hernandez; Marshall, N W; Edyvean, S

    2015-12-01

    Evaluation of image quality (IQ) in Computed Tomography (CT) is important to ensure that diagnostic questions are correctly answered, whilst keeping radiation dose to the patient as low as is reasonably possible. The assessment of individual aspects of IQ is already a key component of routine quality control of medical x-ray devices. These values together with standard dose indicators can be used to give rise to 'figures of merit' (FOM) to characterise the dose efficiency of the CT scanners operating in certain modes. The demand for clinically relevant IQ characterisation has naturally increased with the development of CT technology (detectors efficiency, image reconstruction and processing), resulting in the adaptation and evolution of assessment methods. The purpose of this review is to present the spectrum of various methods that have been used to characterise image quality in CT: from objective measurements of physical parameters to clinically task-based approaches (i.e. model observer (MO) approach) including pure human observer approach. When combined together with a dose indicator, a generalised dose efficiency index can be explored in a framework of system and patient dose optimisation. We will focus on the IQ methodologies that are required for dealing with standard reconstruction, but also for iterative reconstruction algorithms. With this concept the previously used FOM will be presented with a proposal to update them in order to make them relevant and up to date with technological progress. The MO that objectively assesses IQ for clinically relevant tasks represents the most promising method in terms of radiologist sensitivity performance and therefore of most relevance in the clinical environment.

  17. Quality assurance of weather data for agricultural system model input

    Science.gov (United States)

    It is well known that crop production and hydrologic variation on watersheds are weather related. Rarely, however, are meteorological data quality checks reported for agricultural systems model research. We present quality assurance procedures for agricultural system model weather data input. Problems...

  18. Quality control of geological voxel models using experts' gaze

    NARCIS (Netherlands)

    Maanen, van Peter-Paul; Busschers, Freek S.; Brouwer, Anne-Marie; Meulendijk, van der Michiel J.; Erp, van Jan B.F.

    2015-01-01

    Due to an expected increase in geological voxel model data-flow and user demands, the development of improved quality control for such models is crucial. This study explores the potential of a new type of quality control that improves the detection of errors by just using gaze behavior of 12 geologi

  19. Air Quality Modelling and the National Emission Database

    DEFF Research Database (Denmark)

    Jensen, S. S.

    The project focuses on institutional strengthening to enable national air emission inventories to be carried out based on the CORINAIR methodology. The present report describes the link between emission inventories and air quality modelling to ensure that the new national air emission...... inventory is able to take into account the data requirements of air quality models...

  20. Dynamic heart model for the mathematical cardiac torso (MCAT) phantom to represent the invariant total heart volume

    Science.gov (United States)

    Pretorius, P. H.; King, Michael A.; Tsui, Benjamin M.; LaCroix, Karen; Xia, Weishi

    1998-07-01

    This manuscript documents the alteration of the heart model of the MCAT phantom to better represent cardiac motion. The objective of the inclusion of motion was to develop a digital simulation of the heart such that the impact of cardiac motion on single photon emission computed tomography (SPECT) imaging could be assessed and methods of quantitating cardiac function could be investigated. The motion of the dynamic MCAT's heart is modeled by a 128 time frame volume curve. Eight time frames are averaged together to obtain a gated perfusion acquisition of 16 time frames and ensure motion within every time frame. The position of the MCAT heart was changed during contraction to rotate back and forth around the long axis through the center of the left ventricle (LV) using the end systolic time frame as turning point. Simple respiratory motion was also introduced by changing the orientation of the heart model in a 2 dimensional (2D) plane with every time frame. The averaging effect of respiratory motion in a specific time frame was modeled by randomly selecting multiple heart locations between two extreme orientations. Non-gated perfusion phantoms were also generated by averaging over all time frames. Maximal chamber volumes were selected to fit a profile of a normal healthy person. These volumes were changed during contraction of the ventricles such that the increase in volume in the atria compensated for the decrease in volume in the ventricles. The myocardium were modeled to represent shortening of muscle fibers during contraction with the base of the ventricles moving towards a static apex. The apical region was modeled with moderate wall thinning present while myocardial mass was conserved. To test the applicability of the dynamic heart model, myocardial wall thickening was measured using maximum counts and full width half maximum measurements, and compared with published trends. An analytical 3D projector, with attenuation and detector response included, was used
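
    The gating scheme described above, in which a 128-time-frame volume curve is collapsed into 16 gated frames by averaging eight consecutive frames, reduces to a few lines; the sketch below uses a synthetic volume curve and is not the MCAT code itself:

```python
import numpy as np

# Synthetic 128-frame left-ventricular volume curve over one cardiac cycle (mL)
t = np.linspace(0.0, 1.0, 128, endpoint=False)
lv_volume = 120.0 - 50.0 * np.sin(np.pi * np.clip(t / 0.4, 0.0, 1.0)) ** 2  # crude systole/diastole

# Average consecutive groups of 8 frames to obtain a 16-frame gated acquisition,
# so that every gated frame contains motion (as in the dynamic MCAT heart model).
gated = lv_volume.reshape(16, 8).mean(axis=1)

ef = (gated.max() - gated.min()) / gated.max()   # simple ejection-fraction estimate
print("gated volumes (mL):", np.round(gated, 1))
print("ejection fraction: %.2f" % ef)
```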

  1. QUALITY IMPROVEMENT MODEL AT THE MANUFACTURING PROCESS PREPARATION LEVEL

    Directory of Open Access Journals (Sweden)

    Dusko Pavletic

    2009-12-01

    Full Text Available The paper presents the basis for an operational quality improvement model at the manufacturing process preparation level. Numerous appropriate related quality assurance and improvement methods and tools are identified. Main manufacturing process principles are investigated in order to scrutinize one general model of the manufacturing process and to define a manufacturing process preparation level. Development and introduction of the operational quality improvement model are based on research conducted and the results of investigating the application possibilities of methods and tools in real manufacturing processes in the shipbuilding and automotive industries. The basic model structure is described and presented by an appropriate general algorithm. The operational quality improvement model developed lays down main guidelines for practical and systematic application of quality improvement methods and tools.

  2. Assessment of Climate Change Impacts on Water Resources in Three Representative Ukrainian Catchments Using Eco-Hydrological Modelling

    Directory of Open Access Journals (Sweden)

    Iulii Didovets

    2017-03-01

    Full Text Available The information about climate change impact on river discharge is vitally important for planning adaptation measures. The future changes can affect different water-related sectors. The main goal of this study was to investigate the potential water resource changes in Ukraine, focusing on three mesoscale river catchments (Teteriv, Upper Western Bug, and Samara) characteristic of different geographical zones. The catchment-scale watershed model—Soil and Water Integrated Model (SWIM)—was set up, calibrated, and validated for the three catchments under consideration. A set of seven GCM-RCM (General Circulation Model-Regional Climate Model) coupled climate scenarios corresponding to RCPs (Representative Concentration Pathways) 4.5 and 8.5 were used to drive the hydrological catchment model. The climate projections used in the study were considered as three combinations of low, intermediate, and high-end scenarios. Our results indicate shifts in the seasonal distribution of runoff in all three catchments. The spring high flow occurs earlier as a result of temperature increases and earlier snowmelt. A fairly robust trend is an increase in river discharge in the winter season, and most of the scenarios show a potential decrease in river discharge in the spring.

  3. Application of Service Quality Model in Education Environment

    Directory of Open Access Journals (Sweden)

    Ting Ding Hooi

    2016-02-01

    Full Text Available Most of the ideas on service quality stem from the West. The massive developments in research in the West are undeniably important. This leads to the generation and development of new ideas. These ideas were subsequently channeled to developing countries. The ideas obtained were then formulated and used by these developing countries in order to obtain a better approach to channeling service quality. There is ample to be learnt from the service quality model SERVQUAL, which attained high acceptance in the West. Service quality in the education system is important to guarantee the effectiveness and quality of education. Effective and quality education will be able to offer quality graduates, which will contribute to the development of the nation. This paper will discuss the application of the SERVQUAL model to the education environment.

  4. Quality of peas modelled by a structural equation system

    DEFF Research Database (Denmark)

    Bech, Anne C.; Juhl, Hans Jørn; Martens, Magni

    2000-01-01

    The quality of peas has been studied in a joint project between a pea producing company in Denmark and several research institutions. The study included quality from a consumer point of view based on market research and quality from more internal company points of view based on measurement...... expressed by consumers as a function of the objective measurements of quality, e.g. the physical/chemical variables? (3) Which of the measured objective variables are most important for further product development? In the paper we describe consumer evaluations as a function of physical/chemical variables...... in a PLS structural model with the Total Food Quality Model as starting point. The results show that texture and flavour do have approximately the same effect on consumers' perception of overall quality. Quality development goals for plant breeders would be to optimise perceived flavour directly...

  5. [Review on HSPF model for simulation of hydrology and water quality processes].

    Science.gov (United States)

    Li, Zhao-fu; Liu, Hong-Yu; Li, Yan

    2012-07-01

    Hydrological Simulation Program-FORTRAN (HSPF), written in FORTRAN, is one of the best semi-distributed hydrology and water quality models, which was first developed based on the Stanford Watershed Model. Many studies on HSPF model application have been conducted. It can represent the contributions of sediment, nutrients, pesticides, conservatives and fecal coliforms from agricultural areas, continuously simulate water quantity and quality processes, as well as the effects of climate change and land use change on water quantity and quality. HSPF consists of three basic application components: PERLND (Pervious Land Segment), IMPLND (Impervious Land Segment), and RCHRES (free-flowing reach or mixed reservoirs). In general, HSPF has extensive application in the modeling of hydrology or water quality processes and the analysis of climate change and land use change. However, it has had limited use in China. The main problems with HSPF include: (1) some algorithms and procedures still need to be revised, (2) due to the high standard for input data, the accuracy of the model is limited by spatial and attribute data, (3) the model is only applicable for the simulation of well-mixed rivers, reservoirs and one-dimensional water bodies, and it must be integrated with other models to solve more complex problems. At present, studies on HSPF model development are still ongoing, such as revision of the model platform, extension of model functions, method development for model calibration, and analysis of parameter sensitivity. With the accumulation of basic data and improvement of data sharing, the HSPF model will be applied more extensively in China.

  6. Quality Model of Foodstuff in a Refrigerated Display Cabinet

    DEFF Research Database (Denmark)

    Cai, Junping; Risum, Jørgen; Thybo, Claus

    2006-01-01

    as what is the optimal defrost scheme from a food quality point of view are answered. This will serve as a prerequisite for designing an optimal control scheme for the commercial refrigeration system, aiming at optimizing a weighted cost function of both food quality and overall energy consumption of the system....... happens to the food inside during this period, when we look at the quality factor? This paper discusses the quality model of foodstuff; different scenarios of the defrost scheme are simulated, and questions such as how the defrost temperature and duration influence the food temperature, and thus the food quality, as well

  7. Effects of modeling means on properties of monitoring models of spot welding quality

    Institute of Scientific and Technical Information of China (English)

    张忠典; 李冬青; 赵洪运; 于燕

    2002-01-01

    Analyzing and modeling the relation between monitoring information during welding and quality information of the joints is the foundation of monitoring resistance spot welding quality. According to the means of modeling, the known models can be divided into three large categories: single linear regression models, multiple linear regression models and multiple non-linear models. By modeling the relations between dynamic resistance information and welding quality parameters with different means, this paper analyzes effects of modeling means on performances of monitoring models of resistance spot welding quality. From the test results, the following conclusions can be drawn: By comparison with two other kinds of models, artificial neural network (ANN) model can describe non-linear and high coupling relationship between monitoring information and quality information more reasonably, improve performance of monitoring model remarkably, and make the estimated values of welding quality parameters more accurate and reliable.

  8. A review of hydrological/water-quality models

    Directory of Open Access Journals (Sweden)

    Liangliang GAO,Daoliang LI

    2014-12-01

    Full Text Available Water quality models are important in predicting the changes in surface water quality for environmental management. A range of water quality models are widely used, but every model has its advantages and limitations for specific situations. The aim of this review is to provide a guide for researchers in selecting a suitable water quality model. Eight well known water quality models were selected for this review: SWAT, WASP, QUALs, MIKE 11, HSPF, CE-QUAL-W2, ELCOM-CAEDYM and EFDC. Each model is described according to its intended use, development, simulation elements, basic principles and applicability (e.g., for rivers, lakes and reservoirs, and estuaries). Currently, the most important trends for future model development are: (1) combination models─individual models cannot completely solve the complex situations so combined models are needed to obtain the most appropriate results, (2) application of artificial intelligence and mechanistic models combined with non-mechanistic models will provide more accurate results because of the realistic parameters derived from non-mechanistic models, and (3) integration with remote sensing, geographical information and global position systems (3S)─3S can solve problems requiring large amounts of data.

  9. Collaborative problem solving with a total quality model.

    Science.gov (United States)

    Volden, C M; Monnig, R

    1993-01-01

    A collaborative problem-solving system committed to the interests of those involved complies with the teachings of the total quality management movement in health care. Deming espoused that any quality system must become an integral part of routine activities. A process that is used consistently in dealing with problems, issues, or conflicts provides a mechanism for accomplishing total quality improvement. The collaborative problem-solving process described here results in quality decision-making. This model incorporates Ishikawa's cause-and-effect (fishbone) diagram, Moore's key causes of conflict, and the steps of the University of North Dakota Conflict Resolution Center's collaborative problem solving model.

  10. Modeling and Evaluation of Multimodal Perceptual Quality

    DEFF Research Database (Denmark)

    Petersen, Kim T; Hansen, Steffen Duus; Sørensen, John Aasted

    1997-01-01

    The increasing performance requirements of multimedia modalities, carrying speech, audio, video, image, and graphics emphasize the need for assessment methods of the total quality of a multimedia system and methods for simultaneous analysis of the system components. It is important to take into a...

  11. Data assimilation for air quality models

    DEFF Research Database (Denmark)

    Silver, Jeremy David

    2014-01-01

    -dimensional optimal interpolation procedure (OI), an Ensemble Kalman Filter (EnKF), and a three-dimensional variational scheme (3D-var). The three assimilation procedures are described and tested. A multi-faceted approach is taken for the verification, using independent measurements from surface air-quality...

  12. Dynamic neuronal ensembles: Issues in representing structure change in object-oriented, biologically-based brain models

    Energy Technology Data Exchange (ETDEWEB)

    Vahie, S.; Zeigler, B.P.; Cho, H. [Univ. of Arizona, Tucson, AZ (United States)

    1996-12-31

    This paper describes the structure of dynamic neuronal ensembles (DNEs). DNEs represent a new paradigm for learning, based on biological neural networks that use variable structures. We present a computational neural element that demonstrates biological neuron functionality such as neurotransmitter feedback, absolute refractory period and multiple output potentials. More specifically, we will develop a network of neural elements that have the ability to dynamically strengthen, weaken, add and remove interconnections. We demonstrate that the DNE is capable of performing dynamic modifications to neuron connections and exhibiting biological neuron functionality. In addition to its applications for learning, DNEs provide an excellent environment for testing and analysis of biological neural systems. An example of habituation and hyper-sensitization in biological systems, using a neural circuit from a snail, is presented and discussed. This paper provides an insight into the DNE paradigm using models developed and simulated in DEVS.
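
    One of the biological behaviours listed above, the absolute refractory period, can be illustrated with a minimal integrate-and-fire element (a generic sketch, not the DEVS-based DNE implementation used by the authors):

```python
import numpy as np

def integrate_and_fire(inputs, threshold=1.0, leak=0.9, refractory=3):
    """Leaky integrate-and-fire neuron with an absolute refractory period.

    During the refractory period the neuron ignores its inputs and cannot spike,
    one of the biological behaviours the DNE paradigm aims to capture.
    """
    potential, cooldown, spikes = 0.0, 0, []
    for x in inputs:
        if cooldown > 0:          # absolute refractory period: no integration, no spike
            cooldown -= 1
            spikes.append(0)
            continue
        potential = leak * potential + x
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0
            cooldown = refractory
        else:
            spikes.append(0)
    return spikes

print(integrate_and_fire(np.full(20, 0.4)))
```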

  13. Modeling Fluid’s Dynamics with Master Equations in Ultrametric Spaces Representing the Treelike Structure of Capillary Networks

    Directory of Open Access Journals (Sweden)

    Andrei Khrennikov

    2016-07-01

    Full Text Available We present a new conceptual approach for modeling of fluid flows in random porous media based on explicit exploration of the treelike geometry of complex capillary networks. Such patterns can be represented mathematically as ultrametric spaces and the dynamics of fluids by ultrametric diffusion. The images of p-adic fields, extracted from the real multiscale rock samples and from some reference images, are depicted. In this model the porous background is treated as the environment contributing to the coefficients of evolutionary equations. For the simplest trees, these equations are essentially less complicated than those with fractional differential operators which are commonly applied in geological studies looking for some fractional analogs to conventional Euclidean space but with anomalous scaling and diffusion properties. It is possible to solve the former equation analytically and, in particular, to find stationary solutions. The main aim of this paper is to attract the attention of researchers working on modeling of geological processes to the novel ultrametric approach and to show some examples from the petroleum reservoir static and dynamic characterization, able to integrate the p-adic approach with multifractals, thermodynamics and scaling. We also present a non-mathematician friendly review of trees and ultrametric spaces and pseudo-differential operators on such spaces.
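
    The ultrametric diffusion referred to above is governed by a master equation on the tree of capillary states, du_i/dt = sum_j (q_ji u_j - q_ij u_i); the sketch below integrates it on an arbitrary small binary tree with hypothetical rates:

```python
import numpy as np

# Master equation du/dt = Q^T u for a random walk on a small binary tree.
# States 0..6: node 0 is the root, nodes 1-2 its children, nodes 3-6 the leaves.
edges = [(0, 1), (0, 2), (1, 3), (1, 4), (2, 5), (2, 6)]
n = 7
Q = np.zeros((n, n))
for a, b in edges:
    Q[a, b] = Q[b, a] = 0.5               # hypothetical symmetric transition rates
np.fill_diagonal(Q, -Q.sum(axis=1))       # conservation: each row of Q sums to zero

u = np.zeros(n)
u[3] = 1.0                                # all "fluid" initially at one leaf
dt = 0.01
for _ in range(5000):
    u = u + dt * Q.T @ u                  # explicit Euler step of the master equation

print(np.round(u, 3))                     # approaches the uniform stationary solution
```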

  14. A Model of Housing Quality Determinants (HQD) for Affordable Housing

    Directory of Open Access Journals (Sweden)

    Afaq Hyder Chohan

    2015-01-01

    Full Text Available This research identifies the design quality determinants and parameters for affordable housing in a developing metropolis, Karachi, Pakistan. The absence of quality housing in Karachi has resulted from a variety of factors including policy failure, violation of bylaws, housing scarcity and more low quality housing. The combination of these factors has resulted in poor housing design and construction and has lowered the overall quality of housing. Homeowners (end-users) experience unplanned maintenance and repairs. This study provides a design quality model for use as a survey tool among professionals and end-users. This study resulted in a table of 24 quality determinants marked as Housing Quality Determinants (HQD), grouped into eight sections. This research concludes that the existing design quality of affordable housing in Karachi could be enhanced by resolving problems related to design, construction, services, site development, neighbourhood and sustainability. The HQD model provides a platform for developing quality indicators of housing design and an opportunity for local and international design and construction professionals to rethink design in the context of housing quality. This article presents the development of the HQD framework (model).

  15. Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.

    Science.gov (United States)

    Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald

    2017-07-01

    The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. A linked hydrodynamic and water quality model for the Salton Sea

    Science.gov (United States)

    Chung, E.G.; Schladow, S.G.; Perez-Losada, J.; Robertson, D.M.

    2008-01-01

    A linked hydrodynamic and water quality model was developed and applied to the Salton Sea. The hydrodynamic component is based on the one-dimensional numerical model, DLM. The water quality model is based on a new conceptual model for nutrient cycling in the Sea, and simulates temperature, total suspended sediment concentration, nutrient concentrations, including PO4^3-, NO3^- and NH4^+, DO concentration and chlorophyll a concentration as functions of depth and time. Existing water temperature data from 1997 were used to verify that the model could accurately represent the onset and breakup of thermal stratification. 1999 is the only year with a near-complete dataset for water quality variables for the Salton Sea. The linked hydrodynamic and water quality model was run for 1999, and by adjustment of rate coefficients and other water quality parameters, a good match with the data was obtained. In this article, the model is fully described and the model results for reductions in external phosphorus load on chlorophyll a distribution are presented. © 2008 Springer Science+Business Media B.V.
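
    The link between external phosphorus load and chlorophyll a that the model explores can be caricatured with a two-variable mass balance (a generic sketch with hypothetical rate constants and a 1:1 chlorophyll-to-phosphorus yield, not the DLM-based model itself):

```python
def simulate(p_load, days=365, dt=0.1):
    """Toy phosphorus-phytoplankton mass balance for a closed lake.

    p   : dissolved phosphorus (mg/m^3), chl : chlorophyll a (mg/m^3).
    All rate constants are illustrative only.
    """
    p, chl = 50.0, 5.0
    mu_max, k_p, loss, settle = 1.0, 10.0, 0.15, 0.02    # d^-1, mg/m^3, d^-1, d^-1
    for _ in range(int(days / dt)):
        growth = mu_max * p / (k_p + p) * chl             # Monod-limited phytoplankton growth
        p = max(p + dt * (p_load - growth - settle * p), 0.0)
        chl = max(chl + dt * (growth - loss * chl), 0.0)
    return chl

for load in (0.5, 1.0, 2.0):                              # external P load (mg m^-3 d^-1)
    print(load, "->", round(simulate(load), 1), "mg chl a / m^3")
```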

  17. Evaluation of model quality predictions in CASP9

    KAUST Repository

    Kryshtafovych, Andriy

    2011-01-01

    CASP has been assessing the state of the art in the a priori estimation of accuracy of protein structure prediction since 2006. The inclusion of model quality assessment category in CASP contributed to a rapid development of methods in this area. In the last experiment, 46 quality assessment groups tested their approaches to estimate the accuracy of protein models as a whole and/or on a per-residue basis. We assessed the performance of these methods predominantly on the basis of the correlation between the predicted and observed quality of the models on both global and local scales. The ability of the methods to identify the models closest to the best one, to differentiate between good and bad models, and to identify well modeled regions was also analyzed. Our evaluations demonstrate that even though global quality assessment methods seem to approach perfection point (weighted average per-target Pearson's correlation coefficients are as high as 0.97 for the best groups), there is still room for improvement. First, all top-performing methods use consensus approaches to generate quality estimates, and this strategy has its own limitations. Second, the methods that are based on the analysis of individual models lag far behind clustering techniques and need a boost in performance. The methods for estimating per-residue accuracy of models are less accurate than global quality assessment methods, with an average weighted per-model correlation coefficient in the range of 0.63-0.72 for the best 10 groups.
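
    The headline statistic, a weighted average per-target Pearson correlation between predicted and observed model quality, can be computed as in the sketch below (made-up target identifiers and scores, not the CASP assessment code):

```python
import numpy as np

def per_target_pearson(predicted, observed):
    """Weighted average of per-target Pearson correlations.

    predicted/observed: dicts mapping a target id to arrays of quality scores,
    one entry per model submitted for that target. Targets are weighted by
    their number of models.
    """
    total_weight, weighted_sum = 0, 0.0
    for target, pred in predicted.items():
        obs = observed[target]
        r = np.corrcoef(pred, obs)[0, 1]
        weighted_sum += len(pred) * r
        total_weight += len(pred)
    return weighted_sum / total_weight

rng = np.random.default_rng(0)
pred = {t: rng.random(50) for t in ("T0001", "T0002")}                 # hypothetical targets
obs = {t: 0.8 * pred[t] + 0.2 * rng.random(50) for t in pred}           # correlated "observed" quality
print(round(per_target_pearson(pred, obs), 3))
```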

  18. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    Science.gov (United States)

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose graph theoretic approach for quantification of quality-enabled…

  19. A Model to Improve the Quality Products

    Directory of Open Access Journals (Sweden)

    Hasan GOKKAYA

    2010-08-01

    Full Text Available The topic of this paper is to present a solution that can improve product quality, following the idea: "Unlike people who have verbal skills, machines use 'sign language' to communicate what hurts or what has invaded their system". Recognizing the "signs" or symptoms that the machine conveys is a required skill for those who work with machines and are responsible for their care and feeding. The acoustic behavior of technical products is predominantly defined in the design stage, although the acoustic characteristics of machine structures can be analyzed to give a solution for the actual products and create a new generation of products. The paper describes the steps in the technological process for a product and the solution that will reduce the costs associated with product non-quality and improve the quality of management.

  20. Quality of peas modelled by a structural equation system

    DEFF Research Database (Denmark)

    Bech, Anne C.; Juhl, Hans Jørn; Martens, Magni

    2000-01-01

    The quality of peas has been studied in a joint project between a pea producing company in Denmark and several research institutions. The study included quality from a consumer point of view based on market research and quality from more internal company points of view based on measurement...... in a PLS structural model with the Total Food Quality Model as starting point. The results show that texture and flavour do have approximately the same effect on consumers' perception of overall quality. Quality development goals for plant breeders would be to optimise perceived flavour directly...... by increasing the amount of sugars and more indirectly by improving the perception of colour through darker and less yellow peas. Perceived texture can be optimised by focusing on selected texture measurements. Publication date: JUL...

  1. THE RELATIONSHIP BETWEEN MODELS OF QUALITY MANAGEMENT AND CSR

    Directory of Open Access Journals (Sweden)

    CĂTĂLINA SITNIKOV

    2015-03-01

    Full Text Available Lately, the quality management has integrated more and more among its components Corporate Social Responsibility (CSR. With strong roots in the foundation for sustainable development, protection of the environment, issues of social justness and economic growth, CSR raises numerous issues related to obtaining profits, business performance and firms and activities based on the quality of management. From the point of view of the last issues, the models of quality management built on the fundamental principles of quality become the foundation and catalyst for the effective implementation of CSR in organizations. This is the reason why it is necessary to investigate the extent to which quality management models provide frameworks and guidelines for integrating CSR in the management of quality and, moreover, in the management of the organization, with a clear focus on the extent to which the concept can be institutionalized and operated by the organization.

  2. A Model to Improve the Quality Products

    OpenAIRE

    2010-01-01

    The topic of this paper is to present a solution who can improve product quality following the idea: “Unlike people who have verbal skills, machines use "sign language" to communicate what hurts or what has invaded their system’. Recognizing the "signs" or symptoms that the machine conveys is a required skill for those who work with machines and are responsible for their care and feeding. The acoustic behavior of technical products is predominantly defined in the design stage, although the ac...

  3. Negative symptoms and the failure to represent the expected reward value of actions: behavioral and computational modeling evidence.

    Science.gov (United States)

    Gold, James M; Waltz, James A; Matveeva, Tatyana M; Kasanova, Zuzana; Strauss, Gregory P; Herbener, Ellen S; Collins, Anne G E; Frank, Michael J

    2012-02-01

    Negative symptoms are a core feature of schizophrenia, but their pathogenesis remains unclear. Negative symptoms are defined by the absence of normal function. However, there must be a productive mechanism that leads to this absence. The objective was to test a reinforcement learning account suggesting that negative symptoms result from a failure in the representation of the expected value of rewards coupled with preserved loss-avoidance learning. Participants performed a probabilistic reinforcement learning paradigm involving stimulus pairs in which choices resulted in reward or in loss avoidance. Following training, participants indicated their valuation of the stimuli in a transfer test phase. Computational modeling was used to distinguish between alternative accounts of the data. The setting was a tertiary care research outpatient clinic. In total, 47 clinically stable patients with a diagnosis of schizophrenia or schizoaffective disorder and 28 healthy volunteers participated in the study. Patients were divided into a high-negative symptom group and a low-negative symptom group. The main outcome measures were the number of choices leading to reward or loss avoidance, as well as performance in the transfer test phase. Quantitative fits from 3 different models were examined. Patients in the high-negative symptom group demonstrated impaired learning from rewards but intact loss-avoidance learning and failed to distinguish rewarding stimuli from loss-avoiding stimuli in the transfer test phase. Model fits revealed that patients in the high-negative symptom group were better characterized by an "actor-critic" model, learning stimulus-response associations, whereas control subjects and patients in the low-negative symptom group incorporated the expected value of their actions ("Q learning") into the selection process. Negative symptoms in schizophrenia are associated with a specific reinforcement learning abnormality: patients with high-negative symptoms do not represent the expected value of rewards when making decisions but learn
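
    The contrast between the two computational accounts, an actor-critic that reinforces stimulus-response weights from prediction errors versus a Q-learner that tracks the expected value of each action, can be sketched with simplified update rules (hypothetical parameters and random action selection during training; not the authors' fitted models):

```python
import random
from collections import defaultdict

OUTCOMES = {                               # (frequent outcome, rare outcome); P(frequent) = 0.8
    ("reward_pair", "best"): (1, 0),       # frequent win
    ("reward_pair", "worst"): (0, 1),
    ("avoid_pair", "best"): (0, -1),       # frequent loss-avoidance
    ("avoid_pair", "worst"): (-1, 0),
}

def outcome(stim, act):
    frequent, rare = OUTCOMES[(stim, act)]
    return frequent if random.random() < 0.8 else rare

q = defaultdict(float)                     # Q-learner: expected value of each action
v = defaultdict(float)                     # critic: value of each stimulus pair
w = defaultdict(float)                     # actor: stimulus-response weights
alpha = 0.1

for _ in range(200):
    for stim in ("reward_pair", "avoid_pair"):
        act = random.choice(("best", "worst"))
        r = outcome(stim, act)
        q[(stim, act)] += alpha * (r - q[(stim, act)])   # Q-learning update
        delta = r - v[stim]                              # prediction error
        v[stim] += alpha * delta                         # critic update
        w[(stim, act)] += alpha * delta                  # actor update

print({k: round(x, 2) for k, x in q.items()})   # Q separates roughly +0.8 from -0.2
print({k: round(x, 2) for k, x in w.items()})   # both "best" options acquire similar positive actor weights
```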

  4. The influence of socio-demographic, psychological and knowledge-related variables alongside perceived cooking and food skills abilities in the prediction of diet quality in adults: a nationally representative cross-sectional study.

    Science.gov (United States)

    McGowan, Laura; Pot, Gerda K; Stephen, Alison M; Lavelle, Fiona; Spence, Michelle; Raats, Monique; Hollywood, Lynsey; McDowell, Dawn; McCloat, Amanda; Mooney, Elaine; Caraher, Martin; Dean, Moira

    2016-10-26

    Interventions to increase cooking skills (CS) and food skills (FS) as a route to improving overall diet are popular within public health. This study tested a comprehensive model of diet quality by assessing the influence of socio-demographic, knowledge- and psychological-related variables alongside perceived CS and FS abilities. The correspondence of two measures of diet quality further validated the Eating Choices Index (ECI) for use in quantitative research. A cross-sectional survey was conducted in a quota-controlled nationally representative sample of 1049 adults aged 20-60 years drawn from the Island of Ireland. Surveys were administered in participants' homes via computer-assisted personal interviewing (CAPI) assessing a range of socio-demographic, knowledge- and psychological-related variables alongside perceived CS and FS abilities. Regression models were used to model factors influencing diet quality. Correspondence between 2 measures of diet quality was assessed using chi-square and Pearson correlations. ECI score was significantly negatively correlated with DINE Fat intake (r = -0.24, p socio-demographic, knowledge, psychological variables and CS and FS abilities on dietary outcomes varied, with regression models explaining 10-20 % of diet quality variance. CS ability exerted the strongest relationship with saturated fat intake (β = -0.296, p food choices (ECI) (β = 0.04, p > 0.05). Greater CS and FS abilities may not lead directly to healthier dietary choices given the myriad of other factors implicated; however, CS appear to have differential influences on aspects of the diet, most notably in relation to lowering saturated fat intake. Findings suggest that CS and FS should not be singular targets of interventions designed to improve diet; but targeting specific sub-groups of the population e.g. males, younger adults, those with limited education might be more fruitful. A greater understanding of the interaction of factors

  5. MEASURING THE DATA MODEL QUALITY IN THE ESUPPLY CHAINS

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2012-03-01

    Full Text Available The implementation of Internet technology in business has enabled the development of e-business supply chains with large-scale information integration among all partners. The development of information systems (IS) is based on the established business objectives whose achievement, among other things, directly depends on the quality of IS development and design. In the process of analysing the key elements of company operations in the supply chain, a process model and a corresponding data model are designed, which should enable the selection of an appropriate information system architecture. The quality of the implemented information system, which supports the e-supply chain, directly depends on the level of data model quality. One of the serious limitations of the data model is its complexity. With a large number of entities, a data model is difficult to analyse, monitor and maintain. The problem gets bigger when looking at an integrated data model at the level of participating partners in the supply chain, where the data model usually consists of hundreds or even thousands of entities. The paper will analyse the key elements affecting the quality of data models and show their interactions and factors of significance. In addition, the paper presents various measures for assessing the quality of the data model, which make it possible to easily locate problems and focus efforts on specific parts of a complex data model where it is not economically feasible to review every detail of the model.

  6. Model Driven Manufacturing Process Design and Managing Quality

    OpenAIRE

    Lundgren, Magnus; Hedlind, Mikael; Kjellberg, Torsten

    2016-01-01

    Besides decisions in design, decisions made in process planning determine the conditions for manufacturing the right quality. Hence systematic process planning is a key enabler for robust product realization from design through manufacturing. Current work methods for process planning and quality assurance lack efficient system integration. As a consequence, companies spend an unnecessarily large amount of non-value-adding time on managing quality. This paper presents a novel model-based approach to integrat...

  7. Representing and Performing Businesses

    DEFF Research Database (Denmark)

    Boll, Karen

    2014-01-01

    and MacKenzie’s idea of performativity. Based on these two approaches, the article demonstrates that the segmentation model represents and performs the businesses as it makes up certain new ways to be a business and as the businesses can be seen as moving targets. Inspired by MacKenzie the argument......This article investigates a segmentation model used by the Danish Tax and Customs Administration to classify businesses’ motivational postures. The article uses two different conceptualisations of performativity to analyse what the model’s segmentations do: Hacking’s notion of making up people...... is that the segmentation model embodies cleverness in that it simultaneously alters what it represents and then represents this altered reality to confirm the accuracy of its own model of the businesses’ postures. Despite the cleverness of the model, it also has a blind spot. The model assumes a world wherein everything...

  8. The education quality model: Saudi and British perspectives on pillars of quality in education

    OpenAIRE

    Abaalkhail, Mohammed

    2013-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Research Purpose: This study aims to build a new model of quality for education based on a Saudi-British consensus regarding the major factors contributing to education quality and after considering other models (such as EFQM) and other authors’ perspectives. Research Methodology: The research relies on realism philosophy and as a multiple case study with 15 cases, it utilises a mainly qualit...

  9. Why do global climate models struggle to represent low-level clouds in the West African summer monsoon?

    Science.gov (United States)

    Knippertz, Peter; Hannak, Lisa; Fink, Andreas H.; Kniffka, Anke; Pante, Gregor

    2017-04-01

    Climate models struggle to realistically represent the West African monsoon (WAM), which hinders reliable future projections and the development of adequate adaptation measures. Low-level clouds over southern West Africa (5-10°N, 8°W-8°E) during July-September are an integral part of the WAM through their effect on the surface energy balance and precipitation, but their representation in climate models has so far received little attention. These clouds usually form during the night near the level of the nocturnal low-level jet (~950 hPa), thicken and spread until the mid-morning (~09 UTC), and then break up and rise in the course of the day, typically to about 850 hPa. The low thermal contrast to the surface and the frequent presence of obscuring higher-level clouds make detection of the low-level clouds from space rather challenging. Here we use 30 years of output from 18 models participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5) as well as 20 years of output from 8 models participating in the Year of Tropical Convection (YoTC) experiments to identify cloud biases and their causes. A great advantage of the YoTC dataset is the 6-hourly output frequency, which allows an analysis of the diurnal cycle, and the availability of temperature and moisture tendencies from parameterized processes such as convection, radiation and boundary-layer turbulence. A comparison to earlier analyses based on CMIP3 output reveals rather limited improvements with regard to the representation of low-level cloud and winds. Compared to ERA-Interim re-analyses, which show satisfactory agreement with surface observations, many of the CMIP5 and YoTC models still have large biases in low-level cloudiness of both signs and a tendency toward too high cloud elevation and too weak diurnal cycles. At the same time, these models tend to have too strong low-level jets, the impact of which is unclear due to concomitant effects on temperature and moisture advection as well as turbulent

  10. A Model for Assessing the Quality of Websites.

    Science.gov (United States)

    von Dran, Gisela; Zhang, Ping

    2000-01-01

    Uses Kano's Model of Quality to develop a conceptual framework regarding the quality expectations and needs of Website users and reports on empirical investigations of features in the Web environment that satisfy basic, performance, and excitement needs of customers. Suggests implications for research and Website design. (Contains 12 references.)…

  11. Kinetic Modeling of Food Quality: A Critical Review

    NARCIS (Netherlands)

    Boekel, van T.

    2008-01-01

    ABSTRACT: This article discusses the possibilities to study relevant quality aspects of food, such as color, nutrient content, and safety, in a quantitative way via mathematical models. These quality parameters are governed by chemical, biochemical, microbial, and physical changes. It is argued that

  12. A Rotational Blended Learning Model: Enhancement and Quality Assurance

    Science.gov (United States)

    Ghoul, Said

    2013-01-01

    Research on blended learning theory and practice is growing nowadays with a focus on the development, evaluation, and quality assurance of case studies. However, the enhancement of blended learning existing models, the specification of their online parts, and the quality assurance related specifically to them have not received enough attention.…

  13. The pyramid model as a structured way of quality management

    Directory of Open Access Journals (Sweden)

    van der Tuuk Adriani Willem

    2008-01-01

    Full Text Available Three quality systems that can be used in blood establishments are briefly explained. The Pyramid model is described as a tool to manage the quality systems. Finally, some experiences in other countries are given to prove the validity of the system.

  14. Power quality analyzer device modeling by real time SIMULINK MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Martins, C.H.N.; Silva, L.R.M.; Fabri, D.F.; Duque, C.A. [Federal University of Juiz de Fora (UFJF), MG (Brazil)], Emails: chnmartins@yahoo.com.br, leandro.manso@engenharia.ufjf.br, Diego.fabri@engenharia.ufjf.br, Carlos.duque@ufjf.br; Ribeiro, P.F. [Calvin College, Grand Rapids, MI (United States)], E-mail: pfribeiro@ieee.org

    2009-07-01

    The expansion of electronic devices has increased the number of non-linear loads. The effect is high levels of electric disturbances and EMC and EMI interference. The control of power quality parameters is of paramount importance to ensure a minimum level of power quality. This paper deals with the modeling, simulation and development of a device capable of measuring electrical events. (author)

  15. Hydrologic and Water Quality Model Development Using Simulink

    Directory of Open Access Journals (Sweden)

    James D. Bowen

    2014-11-01

    Full Text Available A stormwater runoff model based on the Soil Conservation Service (SCS) method and a finite-volume based water quality model have been developed to investigate the use of Simulink in teaching and research. Simulink, a MATLAB extension, is a graphically based model development environment for system modeling and simulation. Widely used for mechanical and electrical systems, Simulink has had less use for modeling of hydrologic systems. The watershed model is being considered for use in teaching graduate-level courses in hydrology and/or stormwater modeling. Simulink’s block (data process) and arrow (data transfer) object model, the copy and paste user interface, the large number of existing blocks, and the absence of computer code allow students to become model developers almost immediately. The visual depiction of systems, their component subsystems, and the flow of data through the systems are ideal attributes for hands-on teaching of hydrologic and mass balance processes to today’s computer-savvy visual learners. Model development with Simulink for research purposes is also investigated. A finite volume, multi-layer pond model using the water quality kinetics present in CE-QUAL-W2 has been developed using Simulink. The model is one of the first uses of Simulink for modeling eutrophication dynamics in stratified natural systems. The model structure and a test case are presented. One use of the model for teaching a graduate-level water quality modeling class is also described.
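    The SCS curve number relationship at the heart of such runoff models is compact enough to sketch outside Simulink. The following Python fragment is a minimal illustration of the standard SCS-CN runoff equation only; the curve number and rainfall depth are hypothetical, and the abstract's Simulink implementation is not reproduced here.

```python
def scs_runoff(p_in, cn):
    """SCS curve number runoff depth (inches) for a storm depth p_in (inches).

    S is the potential maximum retention; the initial abstraction is taken
    as the conventional 0.2 * S.
    """
    s = 1000.0 / cn - 10.0   # potential maximum retention (inches)
    ia = 0.2 * s             # initial abstraction
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in + 0.8 * s)

# Hypothetical example: 2.5 inches of rain on a watershed with CN = 80
print(round(scs_runoff(2.5, 80), 2))
```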

  16. Modelling of the Quality Management of the Human Resource Training

    Directory of Open Access Journals (Sweden)

    Bucur Amelia

    2015-12-01

    Full Text Available It is known that models pertaining to mathematical statistics, probability theory, information theory, fuzzy systems, graphic methods, time series, and algebraic and numerical methods have been applied for the scientific substantiation of quality management.

  17. A FEDERATED PARTNERSHIP FOR URBAN METEOROLOGICAL AND AIR QUALITY MODELING

    Science.gov (United States)

    Recently, applications of urban meteorological and air quality models have been performed at resolutions on the order of km grid sizes. This necessitated development and incorporation of high resolution landcover data and additional boundary layer parameters that serve to descri...

  18. A checklist for quality assistance in environmental modelling

    NARCIS (Netherlands)

    Risbey, James S.; Sluijs, J.P. van der; Ravetz, Jerome R.; Janssen, P.

    2006-01-01

    The goal of this checklist is to assist in the quality control process for environmental modelling. The point of the checklist is not that a model can be classified as 'good' or 'bad', but that there are 'better' and 'worse' forms of modelling practice. We believe that one should guard against poor

  19. A SIMPLIFIED WATER QUALITY MODEL FOR WETLANDS

    Institute of Scientific and Technical Information of China (English)

    Jan-Tai KUO; Jihn-Sung LAI; Wu-Seng LUNG; Chou-Ping YANG

    2004-01-01

    The purpose of this study is to develop a simplified mathematical model to simulate suspended solids and total phosphorus concentrations in a wetland or detention pond. Field data collected from a wet detention pond during storms were used to demonstrate the application of this model. Favorable agreement between the model results and data was achieved. The ratio of average outlet method and the summary of loads method were used to quantify the removal efficiency of pollutants, showing that the two efficiencies are very close. The results of this study can be used for nonpoint source pollution control, wastewater treatment or best management practices (BMPs) through the wetland.

  20. Quality Management in Hospital Departments : Empirical Studies of Organisational Models

    OpenAIRE

    Kunkel, Stefan

    2008-01-01

    The general aim of this thesis was to empirically explore the organisational characteristics of quality systems of hospital departments, to develop and empirically test models for the organisation and implementation of quality systems, and to discuss the clinical implications of the findings. Data were collected from hospital departments through interviews (n=19) and a nation-wide survey (n=386). The interviews were analysed thematically and organisational models were developed. Relationships...

  1. A reaction-based river/stream water quality model: Model development and numerical schemes

    Science.gov (United States)

    Zhang, Fan; Yeh, Gour-Tsyh; Parker, Jack C.; Jardine, Philip M.

    2008-01-01

    Summary: This paper presents the conceptual and mathematical development of a numerical model of sediment and reactive chemical transport in rivers and streams. The distribution of mobile suspended sediments and immobile bed sediments is controlled by hydrologic transport as well as erosion and deposition processes. The fate and transport of water quality constituents involving a variety of chemical and physical processes is mathematically described by a system of reaction equations for immobile constituents and advective-dispersive-reactive transport equations for mobile constituents. To circumvent stiffness associated with equilibrium reactions, matrix decomposition is performed via Gauss-Jordan column reduction. After matrix decomposition, the system of water quality constituent reactive transport equations is transformed into a set of thermodynamic equations representing equilibrium reactions and a set of transport equations involving no equilibrium reactions. The decoupling of equilibrium and kinetic reactions enables robust numerical integration of the partial differential equations (PDEs) for non-equilibrium-variables. Solving non-equilibrium-variable transport equations instead of individual water quality constituent transport equations also reduces the number of PDEs. A variety of numerical methods are investigated for solving the mixed differential and algebraic equations. Two verification examples are compared with analytical solutions to demonstrate the correctness of the code and to illustrate the importance of employing application-dependent numerical methods to solve specific problems.
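    As a rough illustration of the kind of advective-dispersive-reactive transport equation solved for mobile constituents, the sketch below integrates a single kinetically decaying species on a 1D grid with an explicit upwind/central scheme. It is not the paper's numerical scheme (which decouples equilibrium and kinetic reactions via Gauss-Jordan column reduction); all parameters are hypothetical.

```python
import numpy as np

# Minimal 1D advection-dispersion-reaction sketch for one mobile constituent:
#   dc/dt = -u dc/dx + D d2c/dx2 - k c
nx, dx, dt = 100, 10.0, 1.0   # grid cells, cell size (m), time step (s)
u, D, k = 0.5, 2.0, 1e-4      # velocity (m/s), dispersion (m2/s), decay rate (1/s)
c = np.zeros(nx)
c[0] = 1.0                    # constant upstream concentration

for _ in range(500):
    adv = -u * (c[1:-1] - c[:-2]) / dx                    # upwind advection
    disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx ** 2   # central dispersion
    c[1:-1] += dt * (adv + disp - k * c[1:-1])
    c[-1] = c[-2]                                         # simple outflow boundary

print(c[::10].round(3))
```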

  2. Approaches to verification of two-dimensional water quality models

    Energy Technology Data Exchange (ETDEWEB)

    Butkus, S.R. (Tennessee Valley Authority, Chattanooga, TN (USA). Water Quality Dept.)

    1990-11-01

    The verification of a water quality model is the procedure most needed by decision makers evaluating model predictions, but it is often inadequate or not done at all. The results of a properly conducted verification provide the decision makers with an estimate of the uncertainty associated with model predictions. Several statistical tests are available for quantifying the performance of a model. Six methods of verification were evaluated using an application of the BETTER two-dimensional water quality model for Chickamauga reservoir. Model predictions for ten state variables were compared to observed conditions from 1989. Spatial distributions of the verification measures showed the model predictions were generally adequate, except at a few specific locations in the reservoir. The most useful statistic was the mean standard error of the residuals. Quantifiable measures of model performance should be calculated during calibration and verification of future applications of the BETTER model. 25 refs., 5 figs., 7 tabs.
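    A minimal sketch of the kind of residual statistics such a verification relies on is given below; the paired values are hypothetical, and "mean standard error of the residuals" is read here simply as the bias, RMSE and standard error of paired model-observation residuals.

```python
import numpy as np

# Hypothetical paired model predictions and observations for one state variable
predicted = np.array([8.1, 7.6, 6.9, 7.2, 8.4])
observed = np.array([7.8, 7.9, 6.5, 7.5, 8.0])

residuals = predicted - observed
bias = residuals.mean()                                        # mean error
rmse = np.sqrt((residuals ** 2).mean())                        # root-mean-square error
std_error = residuals.std(ddof=1) / np.sqrt(len(residuals))    # standard error of residuals

print(f"bias={bias:.3f}  rmse={rmse:.3f}  se={std_error:.3f}")
```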

  3. The Three Estates Model: Represented and Satirised in Chaucer’s General Prologue to the Canterbury Tales

    Directory of Open Access Journals (Sweden)

    Sadenur Doğan

    2013-07-01

    Full Text Available This paper presents an investigation of the ‘Three Estates Model’ of the English medieval society in Chaucer’s General Prologue to the Canterbury Tales. Based upon the descriptions and illustrations of the characters, it aims to explore the hierarchical structure of the medieval society, which is divided into three main groups or ‘estates’: the ones who pray, the ones who rule and govern, and the ones who work. In the General Prologue, Chaucer gives a series of sketches of the characters that are the representatives of the three estates, and through these depictions he investigates the social characteristics and roles of the medieval people, who are expected to speak and behave in accordance with what their social group requires. While presenting the Three Estates Model, he employs the tradition of ‘estates satire’ by criticising the social vices resulting from the corruption in this model. Through the characteristics and virtues of the ‘Knight’, the ‘Parson’, and the ‘Plowman’, he demonstrates the perfect integration of the people who belong to chivalry, clergy and the commoners in the medieval English society. Also, by offering contrasting views to these positive traits in the portrayal of almost all of the other characters, as illustrated in the portrayal of the ‘Monk’, the ‘Reeve’, and the ‘Wife of Bathe’ in this paper, he criticises the vices and sins (that mainly result from the religious, financial and moral corruption) of the people belonging to the social classes of the Middle Ages.

  4. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three of the characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects how well the customers’ requirements are fulfilled by the quality characteristics. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.

  5. COST OF QUALITY MODELS AND THEIR IMPLEMENTATION IN MANUFACTURING FIRMS

    Directory of Open Access Journals (Sweden)

    N.M. Vaxevanidis

    2009-03-01

    Full Text Available In order to improve quality, an organization must take into account the costs associated with achieving quality, since the objective of continuous improvement programs is not only to meet customer requirements, but also to do it at the lowest possible cost. This can only be obtained by reducing the costs needed to achieve quality, and the reduction of these costs is only possible if they are identified and measured. Therefore, measuring and reporting the cost of quality (CoQ) should be considered an important issue for achieving quality excellence. To collect quality costs an organization needs to adopt a framework to classify costs; however, there is no general agreement on a single broad definition of quality costs. CoQ is usually understood as the sum of conformance plus non-conformance costs, where the cost of conformance is the price paid for prevention of poor quality (for example, inspection and quality appraisal) and the cost of non-conformance is the cost of poor quality caused by product and service failure (for example, rework and returns). The objective of this paper is to give a survey of research articles on the topic of CoQ; it opens with a literature review focused on existing CoQ models; then, it briefly presents the most common CoQ parameters and the metrics (indices) used for monitoring CoQ. Finally, the use of CoQ models in practice, i.e., the implementation of a quality costing system and cost of quality reporting in companies, is discussed, with emphasis on cases concerning manufacturing firms.
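    The conformance/non-conformance split described above reduces to simple arithmetic once the cost categories have been collected; the following sketch uses entirely illustrative figures.

```python
# Hypothetical cost-of-quality breakdown (all figures illustrative)
prevention = 12_000        # training, process planning
appraisal = 18_000         # inspection, testing
internal_failure = 25_000  # scrap, rework
external_failure = 9_000   # returns, warranty claims

conformance = prevention + appraisal
non_conformance = internal_failure + external_failure
coq = conformance + non_conformance

sales = 400_000
print(f"CoQ = {coq} ({coq / sales:.1%} of sales)")
```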

  6. Feature Analysis for Modeling Game Content Quality

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Togelius, Julian

    2011-01-01

    ’ preferences, and by defining the smallest game session size for which the model can still predict reported emotion with acceptable accuracy. Neuroevolutionary preference learning is used to approximate the function from game content to reported emotional preferences. The experiments are based on a modified...

  7. An Overview of Atmospheric Chemistry and Air Quality Modeling

    Science.gov (United States)

    Johnson, Matthew S.

    2017-01-01

    This presentation will include my personal research experience and an overview of atmospheric chemistry and air quality modeling for the participants of the NASA Student Airborne Research Program (SARP 2017). The presentation will also provide examples of ways to apply airborne observations for chemical transport model (CTM) and air quality (AQ) model evaluation. CTM and AQ models are important tools in understanding tropospheric-stratospheric composition, atmospheric chemistry processes, meteorology, and air quality. This presentation will focus on how NASA scientists currently apply CTM and AQ models to better understand these topics. Finally, the importance of airborne observations in evaluating these topics and how in situ and remote sensing observations can be used to evaluate and improve CTM and AQ model predictions will be highlighted.

  8. QUALITY OF AN ACADEMIC STUDY PROGRAMME - EVALUATION MODEL

    Directory of Open Access Journals (Sweden)

    Mirna Macur

    2016-01-01

    Full Text Available The quality of an academic study programme is evaluated by many: by employees (internal evaluation) and by external evaluators: experts, agencies and organisations. Internal and external evaluation of an academic programme follow a written structure that resembles one of the quality models. We believe the quality models (mostly derived from the EFQM excellence model) don’t fit very well into non-profit activities, policies and programmes, because these are much more complex than the environments from which the quality models derive (for example, an assembly line). The quality of an academic study programme is very complex and understood differently by various stakeholders, so we present dimensional evaluation in this article. Dimensional evaluation, as opposed to component and holistic evaluation, is a form of analytical evaluation in which the quality or value of the evaluand is determined by looking at its performance on multiple dimensions of merit or evaluation criteria. First, the stakeholders of a study programme and their views, expectations and interests are presented, followed by the evaluation criteria. They are both joined into the evaluation model, revealing which evaluation criteria can and should be evaluated by which stakeholder. Main research questions are posed and the research method for each dimension is listed.

  9. A New Empirical Sewer Water Quality Model for the Prediction of WWTP Influent Quality

    NARCIS (Netherlands)

    Langeveld, J.G.; Schilperoort, R.P.S.; Rombouts, P.M.M.; Benedetti, L.; Amerlinck, Y.; de Jonge, J.; Flameling, T.; Nopens, I.; Weijers, S.

    2014-01-01

    Modelling of the integrated urban water system is a powerful tool to optimise wastewater system performance or to find cost-effective solutions for receiving water problems. One of the challenges of integrated modelling is the prediction of water quality at the inlet of a WWTP. Recent applications

  10. Climate change forecasting in a mountainous data scarce watershed using CMIP5 models under representative concentration pathways

    Science.gov (United States)

    Aghakhani Afshar, A.; Hasanzadeh, Y.; Besalatpour, A. A.; Pourreza-Bilondi, M.

    2016-09-01

    The hydrological cycle of river basins and the available water resources in arid and semi-arid regions are highly affected by climate change. In recent years, the increase in temperature due to excessively increased emissions of greenhouse gases has led to abnormalities in the climate system of the earth. The main objective of this study is to survey the future climate changes in one of the biggest mountainous watersheds in the northeast of Iran (i.e., Kashafrood). In this research, considering precipitation and temperature as two important climatic parameters in watersheds, 14 general circulation models (GCMs) of the newest generation, taking part in the Coupled Model Intercomparison Project Phase 5 (CMIP5), were used to forecast the future climate changes in the study area. For the historical period of 1992-2005, four evaluation criteria including Nash-Sutcliffe efficiency (NS), percent bias (PBIAS), coefficient of determination (R²) and the ratio of the root-mean-square error to the standard deviation of the measured data (RSR) were used to compare the simulated and observed data for assessing the goodness-of-fit of the models. In the primary results, four climate models, namely GFDL-ESM2G, IPSL-CM5A-MR, MIROC-ESM, and NorESM1-M, were selected among the abovementioned 14 models due to their higher prediction accuracy with respect to the investigated evaluation criteria. Thereafter, climate changes of the future periods (near-century, 2006-2037; mid-century, 2037-2070; and late-century, 2070-2100) were investigated and compared under four representative concentration pathways (RCPs) of the new emission scenarios: RCP2.6, RCP4.5, RCP6.0, and RCP8.5. In order to assess the trend of annual and seasonal changes of climatic components, the Mann-Kendall non-parametric test (MK) was also employed. The results of the Mann-Kendall test revealed that the precipitation has significant variable trends of both positive and negative alterations. Furthermore, the mean, maximum, and minimum temperature values had significant
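    The four evaluation criteria named above (NS, PBIAS, R² and RSR) follow standard definitions and can be computed directly from paired observed and simulated series; the sketch below uses hypothetical monthly values, not the study's data.

```python
import numpy as np

def gof_metrics(obs, sim):
    """Common goodness-of-fit criteria used to rank climate/hydrology models."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    resid = obs - sim
    ns = 1.0 - np.sum(resid ** 2) / np.sum((obs - obs.mean()) ** 2)  # Nash-Sutcliffe
    pbias = 100.0 * resid.sum() / obs.sum()                          # percent bias
    rsr = np.sqrt(np.mean(resid ** 2)) / obs.std(ddof=1)             # RMSE / std of obs
    r2 = np.corrcoef(obs, sim)[0, 1] ** 2                            # coeff. of determination
    return {"NS": ns, "PBIAS": pbias, "RSR": rsr, "R2": r2}

# Hypothetical monthly precipitation (observed vs. one GCM), for illustration only
obs = [42.0, 55.0, 30.0, 12.0, 5.0, 1.0, 0.5, 0.8, 4.0, 18.0, 35.0, 48.0]
sim = [38.0, 60.0, 27.0, 15.0, 6.0, 0.5, 0.2, 1.0, 5.5, 14.0, 40.0, 45.0]
print(gof_metrics(obs, sim))
```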

  11. Climate change forecasting in a mountainous data scarce watershed using CMIP5 models under representative concentration pathways

    Science.gov (United States)

    Aghakhani Afshar, A.; Hasanzadeh, Y.; Besalatpour, A. A.; Pourreza-Bilondi, M.

    2017-07-01

    The hydrological cycle of river basins and the available water resources in arid and semi-arid regions are highly affected by climate change. In recent years, the increase in temperature due to excessively increased emissions of greenhouse gases has led to abnormalities in the climate system of the earth. The main objective of this study is to survey the future climate changes in one of the biggest mountainous watersheds in the northeast of Iran (i.e., Kashafrood). In this research, considering precipitation and temperature as two important climatic parameters in watersheds, 14 general circulation models (GCMs) of the newest generation, taking part in the Coupled Model Intercomparison Project Phase 5 (CMIP5), were used to forecast the future climate changes in the study area. For the historical period of 1992-2005, four evaluation criteria including Nash-Sutcliffe efficiency (NS), percent bias (PBIAS), coefficient of determination (R²) and the ratio of the root-mean-square error to the standard deviation of the measured data (RSR) were used to compare the simulated and observed data for assessing the goodness-of-fit of the models. In the primary results, four climate models, namely GFDL-ESM2G, IPSL-CM5A-MR, MIROC-ESM, and NorESM1-M, were selected among the abovementioned 14 models due to their higher prediction accuracy with respect to the investigated evaluation criteria. Thereafter, climate changes of the future periods (near-century, 2006-2037; mid-century, 2037-2070; and late-century, 2070-2100) were investigated and compared under four representative concentration pathways (RCPs) of the new emission scenarios: RCP2.6, RCP4.5, RCP6.0, and RCP8.5. In order to assess the trend of annual and seasonal changes of climatic components, the Mann-Kendall non-parametric test (MK) was also employed. The results of the Mann-Kendall test revealed that the precipitation has significant variable trends of both positive and negative alterations. Furthermore, the mean, maximum, and minimum temperature values had

  12. A method to represent ozone response to large changes in precursor emissions using high-order sensitivity analysis in photochemical models

    OpenAIRE

    G. Yarwood; Emery, C; Jung, J.; U. Nopmongcol; T. Sakulyanotvittaya

    2013-01-01

    Photochemical grid models (PGMs) are used to simulate tropospheric ozone and quantify its response to emission changes. PGMs are often applied for annual simulations to provide both maximum concentrations for assessing compliance with air quality standards and frequency distributions for assessing human exposure. Efficient methods for computing ozone at different emission levels can improve the quality of ozone air quality management efforts. This study demonstrates the feasibility of using t...

  13. A method to represent ozone response to large changes in precursor emissions using high-order sensitivity analysis in photochemical models

    OpenAIRE

    G. Yarwood; Emery, C; Jung, J.; U. Nopmongcol; Sakulyanontvittaya, T.

    2013-01-01

    Photochemical grid models (PGMs) are used to simulate tropospheric ozone and quantify its response to emission changes. PGMs are often applied for annual simulations to provide both maximum concentrations for assessing compliance with air quality standards and frequency distributions for assessing human exposure. Efficient methods for computing ozone at different emission levels can improve the quality of ozone air quality management efforts. This study demonstrates the feas...

  14. Feature Analysis for Modeling Game Content Quality

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Togelius, Julian

    2011-01-01

    entertainment for individual game players is to tailor player experience in real-time via automatic game content generation. Modeling the relationship between game content and player preferences or affective states is an important step towards this type of game personalization. In this paper we...... analyse the relationship between level design parameters of platform games and player experience. We introduce a method to extract the most useful information about game content from short game sessions by investigating the size of game session that yields the highest accuracy in predicting players......’ preferences, and by defining the smallest game session size for which the model can still predict reported emotion with acceptable accuracy. Neuroevolutionary preference learning is used to approximate the function from game content to reported emotional preferences. The experiments are based on a modified...

  15. QAM: PROPOSED MODEL FOR QUALITY ASSURANCE IN CBSS

    Directory of Open Access Journals (Sweden)

    Latika Kharb

    2015-08-01

    Full Text Available Component-based software engineering (CBSE) / Component-Based Development (CBD) lays emphasis on decomposition of the engineered systems into functional or logical components with well-defined interfaces used for communication across the components. The component-based software development approach is based on the idea of developing software systems by selecting appropriate off-the-shelf components and then assembling them with a well-defined software architecture. Because the new software development paradigm is much different from the traditional approach, quality assurance for component-based software development is a new topic in the software engineering research community. Because component-based software systems are developed on an underlying process different from that of traditional software, their quality assurance model should address both the process of the components and the process of the overall system. Quality assurance for component-based software systems during the life cycle is used to analyze the components for the achievement of high-quality component-based software systems. Although some quality assurance techniques and component-based approaches to software engineering have been studied, there is still no clear and well-defined standard or set of guidelines for component-based software systems. Therefore, identification of quality assurance characteristics, models, tools and metrics is urgently needed. As a major contribution of this paper, I have proposed QAM: a Quality Assurance Model for component-based software development, which covers component requirement analysis, component development, component certification, component architecture design, integration, testing, and maintenance.

  16. Prediction of Local Quality of Protein Structure Models Considering Spatial Neighbors in Graphical Models

    Science.gov (United States)

    Shin, Woong-Hee; Kang, Xuejiao; Zhang, Jian; Kihara, Daisuke

    2017-01-01

    Protein tertiary structure prediction methods have matured in recent years. However, some proteins defy accurate prediction due to factors such as inadequate template structures. While existing model quality assessment methods predict global model quality relatively well, there is substantial room for improvement in local quality assessment, i.e. assessment of the error at each residue position in a model. Local quality is very important information for practical applications of structure models such as interpreting/designing site-directed mutagenesis of proteins. We have developed a novel local quality assessment method for protein tertiary structure models. The method, named Graph-based Model Quality assessment method (GMQ), explicitly considers the predicted quality of spatially neighboring residues using a graph representation of a query protein structure model. GMQ uses a conditional random field as the core of its algorithm, and performs a binary prediction of the quality of each residue in a model, indicating if a residue position is likely to be within an error cutoff or not. The accuracy of GMQ was improved by considering larger graphs to include quality information of more surrounding residues. Moreover, we found that using different edge weights in graphs reflecting different secondary structures further improves the accuracy. GMQ showed competitive performance on a benchmark for quality assessment of structure models from the Critical Assessment of Techniques for Protein Structure Prediction (CASP). PMID:28074879

  17. Determination of a new uniform thorax density representative of the living population from 3D external body shape modeling.

    Science.gov (United States)

    Amabile, Celia; Choisne, Julie; Nérot, Agathe; Pillet, Hélène; Skalli, Wafa

    2016-05-03

    Body segment parameters (BSP) for each body segment are needed for biomechanical analysis. To provide population-specific BSP, precise estimation of each segment's volume and density is needed. The widely used uniform densities, provided by cadaver studies, did not consider the air present in the lungs when determining the thorax density. The purpose of this study was to propose a new uniform thorax density representative of the living population from 3D external body shape modeling. Bi-planar X-ray radiographies were acquired on 58 participants, allowing 3D reconstructions of the spine, rib cage and human body shape. Three methods of computing the thorax mass were compared for 48 subjects: (1) the Dempster Uniform Density Method, currently in use for BSP calculation, using Dempster density data, (2) the Personalized Method, using a full description of the thorax based on 3D reconstruction of the rib cage and spine, and (3) the Improved Uniform Density Method, using a uniform thorax density resulting from the Personalized Method. For 10 participants, a comparison was made between the body mass obtained from a force-plate and the body mass computed with each of the three methods. The Dempster Uniform Density Method presented a mean error of 4.8% in the total body mass compared to the force-plate vs 0.2% for the Personalized Method and 0.4% for the Improved Uniform Density Method. The adjusted thorax density found from the 3D reconstruction was 0.74 g/cm³ for men and 0.73 g/cm³ for women instead of the one provided by Dempster (0.92 g/cm³), leading to a better estimate of the thorax mass and body mass. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Terminology and methodology in modelling for water quality management

    DEFF Research Database (Denmark)

    Carstensen, J.; Vanrolleghem, P.; Rauch, W.

    1997-01-01

    There is a widespread need for a common terminology in modelling for water quality management. This paper points out sources of confusion in the communication between researchers due to misuse of existing terminology or use of unclear terminology. The paper attempts to clarify the context...... of the most widely used terms for characterising models and within the process of model building. It is essential to the ever growing society of researchers within water quality management, that communication is eased by establishing a common terminology. This should not be done by giving broader definitions...

  19. Hydrologic and water quality terminology as applied to modeling

    Science.gov (United States)

    A survey of literature and examination in particular of terminology use in a previous special collection of modeling calibration and validation papers has been conducted to arrive at a list of consistent terminology recommended for writing about hydrologic and water quality model calibration and val...

  20. A Cost-of-Quality Model at El Camino College.

    Science.gov (United States)

    Miller, Timothy D.

    1994-01-01

    This article describes the cost-of-quality model used at El Camino College (California), which formally measures the costs associated with process improvement activities. A case study illustrates use of the model in evaluating a new process for reimbursing faculty and staff attending conferences. (DB)

  1. Modeling the leakage of LCD displays with local backlight for quality assessment

    DEFF Research Database (Denmark)

    Mantel, Claire; Korhonen, Jari; Pedersen, Jesper M.

    2014-01-01

    The recent technique of local backlight dimming has a significant impact on the quality of images displayed on an LCD screen with LED local dimming. Therefore it represents a necessary step in the quality assessment chain, independently from the other processes applied to images. This paper investigates the modeling of one of the major spatial artifacts produced by local dimming: leakage. Leakage appears in dark areas when the backlight level is too high for LC cells to block sufficiently and the final displayed brightness is higher than it should be. A subjective quality experiment was run on videos displayed on an LCD TV with local backlight dimming, viewed from 0° and 15° angles. The subjective results are then compared to objective data using different leakage models: constant over the whole display or horizontally varying, and three leakage factors (no leakage, measured at 0° and at 15...

  2. Development of a water quality loading index based on water quality modeling.

    Science.gov (United States)

    Song, Tao; Kim, Kyehyun

    2009-03-01

    Water quality modeling is an ideal tool for simulating physical, chemical, and biological changes in aquatic systems. It has been utilized in a number of GIS-based water quality management and analysis applications. However, there is considerable need for a decision-making process to translate the modeling result into an understandable form and thereby help users to make relevant judgments and decisions. This paper introduces a water quality index termed QUAL2E water quality loading index (QWQLI). This new WQI is based on water quality modeling by QUAL2E, which is a popular steady-state model for the water quality of rivers and streams. An experiment applying the index to the Sapgyo River in Korea was implemented. Unlike other WQIs, the proposed index is specifically used for simulated water quality using QUAL2E to mainly reflect pollutant loading levels. Based on the index, an iterative modeling-judgment process was designed to make decisions to decrease input pollutants from pollutant sources. Furthermore, an indexing and decision analysis can be performed in a GIS framework, which can provide various spatial analyses. This can facilitate the decision-making process under various scenarios considering spatial variability. The result shows that the index can evaluate and classify the simulation results using QUAL2E and that it can effectively identify the elements that should be improved in the decision-making process. In addition, the results imply that further study should be carried out to automate algorithms and subsidiary programs supporting the decision-making process.

  3. Model-based monitoring of stormwater runoff quality

    DEFF Research Database (Denmark)

    Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen

    2013-01-01

    Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combining a model with field sampling) affect...... the information obtained about MP discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by automatic volume-proportional sampling and passive sampling in a storm drainage system on the outskirts of Copenhagen (Denmark) and a 10-year rain series was used......) for calibration of the model, resulted in the same predicted level but with narrower model prediction bounds than by using volume-proportional samples for calibration. This shows that passive sampling allows for a better exploitation of the resources allocated for stormwater quality monitoring....

  4. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  5. Embryo quality predictive models based on cumulus cells gene expression

    Directory of Open Access Journals (Sweden)

    Devjak R

    2016-06-01

    Full Text Available Since the introduction of in vitro fertilization (IVF) into the clinical practice of infertility treatment, indicators of high-quality embryos have been investigated. Cumulus cells (CC) have a specific gene expression profile according to the developmental potential of the oocyte they are surrounding, and therefore, specific gene expression could be used as a biomarker. The aim of our study was to combine more than one biomarker to observe improvement in the prediction of embryo development. In this study, 58 CC samples from 17 IVF patients were analyzed. This study was approved by the Republic of Slovenia National Medical Ethics Committee. Gene expression analysis [quantitative real-time polymerase chain reaction (qPCR)] for five genes, analyzed according to embryo quality level, was performed. Two prediction models were tested for embryo quality prediction: a binary logistic model and a decision tree model. As the main outcome, gene expression levels for the five genes were taken and the area under the curve (AUC) for the two prediction models was calculated. Among the tested genes, AMHR2 and LIF showed a significant expression difference between high-quality and low-quality embryos. These two genes were used for the construction of two prediction models: the binary logistic model yielded an AUC of 0.72 ± 0.08 and the decision tree model yielded an AUC of 0.73 ± 0.03. The two different prediction models yielded similar predictive power to differentiate high- and low-quality embryos. In terms of eventual clinical decision making, the decision tree model resulted in easy-to-interpret rules that are highly applicable in clinical practice.
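    The two classifiers compared in the abstract (binary logistic regression and a decision tree, scored by AUC) can be prototyped with scikit-learn as sketched below; the expression values and labels are synthetic stand-ins for the AMHR2/LIF qPCR data, which are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for qPCR expression of two genes (columns) and embryo
# quality labels (1 = high quality) for 58 samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(58, 2))
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=0.7, size=58) > 0).astype(int)

for name, model in [("logistic", LogisticRegression()),
                    ("decision tree", DecisionTreeClassifier(max_depth=3, random_state=0))]:
    prob = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
    print(name, "AUC =", round(roc_auc_score(y, prob), 2))
```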

  6. Quality quantification model of basic raw materials

    Directory of Open Access Journals (Sweden)

    Š. Vilamová

    2016-07-01

    Full Text Available Basic raw materials belong to the key input sources in the production of pig iron. The properties of basic raw materials can be evaluated using a variety of criteria. The essential ones include the physical and chemical properties. Current competitive pressures, however, force the producers of iron more and more often to include cost and logistic criteria into the decision-making process. In this area, however, they are facing a problem of how to convert a variety of vastly different parameters into one evaluation indicator in order to compare the available raw materials. This article deals with the analysis of a model created to evaluate the basic raw materials, which was designed as part of the research.

  7. Quality control of the RMS US flood model

    Science.gov (United States)

    Jankowfsky, Sonja; Hilberts, Arno; Mortgat, Chris; Li, Shuangcai; Rafique, Farhat; Rajesh, Edida; Xu, Na; Mei, Yi; Tillmanns, Stephan; Yang, Yang; Tian, Ye; Mathur, Prince; Kulkarni, Anand; Kumaresh, Bharadwaj Anna; Chaudhuri, Chiranjib; Saini, Vishal

    2016-04-01

    The RMS US flood model predicts the flood risk in the US at a 30 m resolution for different return periods. The model is designed for the insurance industry to estimate the cost of flood risk for a given location. Different statistical, hydrological and hydraulic models are combined to develop the flood maps for different return periods. A rainfall-runoff and routing model, calibrated with observed discharge data, is run with 10 000 years of stochastically simulated precipitation to create time series of discharge and surface runoff. The 100, 250 and 500 year events are extracted from these time series as forcing for a two-dimensional pluvial and fluvial inundation model. The coupling of all the different models, which are run over the large area of the US, introduces a certain amount of uncertainty. Therefore, special attention is paid to the final quality control of the flood maps. First of all, a thorough quality analysis of the Digital Terrain Model (DTM) and the river network was done, as the final quality of the flood maps depends heavily on the DTM quality. Secondly, the simulated 100 year discharge in the major river network (600 000 km) is compared to the 100 year discharge derived using an extreme value distribution for all USGS gauges with more than 20 years of peak values (around 11 000 gauges). Thirdly, for each gauge the modelled flood depth is compared to the depth derived from the USGS rating curves. Fourthly, the modelled flood depth is compared to the base flood elevation given in the FEMA flood maps. Fifthly, the flood extent is compared to the FEMA flood extent. Then, for historic events we compare flood extents and flood depths at given locations. Finally, all the data and spatial layers are uploaded to a geoserver to facilitate the manual investigation of outliers. The feedback from the quality control is used to improve the model and estimate its uncertainty.
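    One of the checks described above, deriving a 100-year discharge from gauge peak-flow records via an extreme value distribution, can be sketched with SciPy as follows; the annual peaks are hypothetical and the GEV choice is an assumption, since the abstract does not state which extreme value distribution was used.

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical annual peak discharges (m3/s) at one gauge; a real check would use
# more than 20 years of USGS peak-flow records, as described above.
peaks = np.array([320, 410, 275, 510, 390, 450, 300, 620, 355, 480,
                  295, 530, 405, 365, 440, 585, 330, 470, 395, 515], float)

shape, loc, scale = genextreme.fit(peaks)
q100 = genextreme.ppf(1 - 1.0 / 100, shape, loc=loc, scale=scale)  # 100-year quantile
print(f"estimated 100-year discharge: {q100:.0f} m3/s")
```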

  8. Developing a fuzzy model for Tehran's air quality

    Directory of Open Access Journals (Sweden)

    Nafiseh Tokhmehchi

    2015-01-01

    Full Text Available This research aims to offer a fuzzy approach for calculating Tehran's air pollution index. The method is based on a fuzzy analysis model and uses the information about the air quality index (AQI) included on the website of Tehran’s Air Quality Monitoring And Supervision Bureau. Fuzzy logic is considered a powerful tool for representing information associated with uncertainty. In the end, several graphs visualize this inferential system at various levels of pollution.

  9. IMPROVED SOFTWARE QUALITY ASSURANCE TECHNIQUES USING SAFE GROWTH MODEL

    Directory of Open Access Journals (Sweden)

    M.Sangeetha

    2010-09-01

    Full Text Available Our lives are governed by large, complex systems with increasingly complex software, and the safety, security, and reliability of these systems has become a major concern. As the software in today’s systems grows larger, it has more defects, and these defects adversely affect the safety, security, and reliability of the systems. Software engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software. Software quality divides into two pieces: internal and external quality characteristics. External quality characteristics are those parts of a product that face its users, whereas internal quality characteristics are those that do not. Quality is conformance to product requirements and should be free. This research concerns the role of software quality. Software reliability is an important facet of software quality. It is the probability of failure-free operation of a computer program in a specified environment for a specified time. In software reliability modeling, the parameters of the model are typically estimated from the test data of the corresponding component. However, the widely used point estimators are subject to random variations in the data, resulting in uncertainties in these estimated parameters. This research describes a new approach to the problem of software testing. The approach is based on Bayesian graphical models and presents formal mechanisms for the logical structuring of the software testing problem, the probabilistic and statistical treatment of the uncertainties to be addressed, the test design and analysis process, and the incorporation and implication of test results. Once constructed, the models produced are dynamic representations of the software testing problem. It explains why the common test-and-fix software quality strategy is no longer adequate, and characterizes the properties of the quality strategy.

  10. Validating a model that predicts daily growth and feed quality of New Zealand dairy pastures.

    Science.gov (United States)

    Woodward, S J

    2001-09-01

    The Pasture Quality (PQ) model is a simple, mechanistic, dynamical system model that was designed to capture the essential biological processes in grazed grass-clover pasture, and to be optimised to derive improved grazing strategies for New Zealand dairy farms. While the individual processes represented in the model (photosynthesis, tissue growth, flowering, leaf death, decomposition, worms) were based on experimental data, this did not guarantee that the assembled model would accurately predict the behaviour of the system as a whole (i.e., pasture growth and quality). Validation of the whole model was thus a priority, since any strategy derived from the model could impact a farm business in the order of thousands of dollars per annum if adopted. This paper describes the process of defining performance criteria for the model, obtaining suitable data to test the model, and carrying out the validation analysis. The validation process highlighted a number of weaknesses in the model, which will lead to the model being improved. As a result, the model's utility will be enhanced. Furthermore, validation was found to have an unexpected additional benefit, in that despite the model's poor initial performance, support was generated for the model among field scientists involved in the wider project.

  11. Towards benchmarking an in-stream water quality model

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available A method of model evaluation is presented which utilises a comparison with a benchmark model. The proposed benchmarking concept is one that can be applied to many hydrological models but, in this instance, is implemented in the context of an in-stream water quality model. The benchmark model is defined in such a way that it is easily implemented within the framework of the test model, i.e. the approach relies on two applications of the same model code rather than the application of two separate model codes. This is illustrated using two case studies from the UK, the Rivers Aire and Ouse, with the objective of simulating a water quality classification, general quality assessment (GQA), which is based on dissolved oxygen, biochemical oxygen demand and ammonium. Comparisons between the benchmark and test models are made based on GQA, as well as a step-wise assessment against the components required in its derivation. The benchmarking process yields a great deal of important information about the performance of the test model and raises issues about a priori definition of the assessment criteria.

  12. Air Quality Model System For The Vienna/bratislava Region

    Science.gov (United States)

    Krüger, B. C.; Schmittner, W.; Kromp-Kolb, H.

    A model system has been built up, consisting of the mesoscale meteorological forecast model MM5 and the chemical air-quality model CAMx. The coarse grid covers central Europe. By nesting, a spatial resolution of 3 km is reached for the core area, which includes the cities of Vienna (Austria) and Bratislava (Slovakia). In a first approach, the model system has been applied to a 6-day period in February 1997, which was characterized by stagnant meteorological conditions. During this episode, primary pollutants like CO and NO2 have been compared with ambient measurements for the validation of the new model system. In the future it is foreseen to improve the spatial resolution, to apply the model system also to ozone and particulates, and to utilize it for short-term forecasting of air-quality parameters.

  13. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    Science.gov (United States)

    Mog, Robert A.

    1997-01-01

    Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  14. Extracellular and intraneuronal HMW-AbetaOs represent a molecular basis of memory loss in Alzheimer's disease model mouse

    Directory of Open Access Journals (Sweden)

    Yamamoto Naoki

    2011-03-01

    Full Text Available Abstract Background Several lines of evidence indicate that memory loss represents a synaptic failure caused by soluble amyloid β (Aβ) oligomers. However, the pathological relevance of Aβ oligomers (AβOs) as the trigger of synaptic or neuronal degeneration, and the possible mechanism underlying the neurotoxic action of endogenous AβOs, remain to be determined. Results To specifically target toxic AβOs in vivo, monoclonal antibodies (1A9 and 2C3) specific to them were generated using a novel design method. 1A9 and 2C3 specifically recognize soluble AβOs larger than 35-mers and pentamers on Blue native polyacrylamide gel electrophoresis, respectively. Biophysical and structural analysis by atomic force microscopy (AFM) revealed that neurotoxic 1A9 and 2C3 oligomeric conformers displayed a non-fibrillar, relatively spherical structure. Of note, such AβOs were taken up by neuroblastoma (SH-SY5Y) cells, resulting in neuronal death. In humans, immunohistochemical analysis employing 1A9 or 2C3 revealed that 1A9 and 2C3 stain intraneuronal granules accumulated in the perikaryon of pyramidal neurons and some diffuse plaques. A Fluoro Jade-B binding assay also revealed 1A9- or 2C3-stained neurons, indicating their impending degeneration. In a long-term low-dose prophylactic trial using active 1A9 or 2C3 antibody, we found that passive immunization protected a mouse model of Alzheimer's disease (AD) from memory deficits, synaptic degeneration, promotion of intraneuronal AβOs, and neuronal degeneration. Because the primary antitoxic action of 1A9 and 2C3 occurs outside neurons, our results suggest that extracellular AβOs initiate the AD toxic process and intraneuronal AβOs may worsen neuronal degeneration and memory loss. Conclusion Now, we have evidence that HMW-AβOs are among the earliest manifestations of the AD toxic process in mice and humans. We are certain that our studies move us closer to our goal of finding a therapeutic target and/or confirming the

  15. Adaptive quality prediction of batch processes based on PLS model

    Institute of Scientific and Technical Information of China (English)

    LI Chun-fu; ZHANG Jie; WANG Gui-zeng

    2006-01-01

    There are usually no on-line product quality measurements in batch and semi-batch processes, which makes the process control task very difficult. In this paper, a model for predicting the end-product quality from the available on-line process variables at the early stage of a batch is developed using the partial least squares (PLS) method. Furthermore, some available mid-course quality measurements are used to rectify the final prediction results. To deal with the problem that the process may change with time, a recursive PLS (RPLS) algorithm is used to update the model based on the new batch data and the old model parameters after each batch. An application to a simulated batch MMA polymerization process demonstrates the effectiveness of the proposed method.
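    A minimal scikit-learn sketch of the basic (non-recursive) PLS prediction step described above is given below; the early-batch measurements and quality values are synthetic, and the recursive RPLS update is not shown.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic early-batch process measurements (X) and end-product quality (y)
# standing in for historical batch records.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 6))                 # 40 past batches, 6 on-line variables
true_w = np.array([1.5, -0.8, 0.0, 0.6, 0.0, 0.3])
y = X @ true_w + rng.normal(scale=0.2, size=40)

pls = PLSRegression(n_components=3)
pls.fit(X, y)

new_batch = rng.normal(size=(1, 6))          # early measurements of a new batch
print("predicted end-product quality:", round(pls.predict(new_batch).item(), 2))
```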

  16. Measures of quality of process models created in BPMN

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-12-01

    Full Text Available Description, documentation, evaluation and redesign of key processes during their execution should be an essential part of the strategic management of any organization. All organizations live in a dynamically changing environment and must therefore adapt their internal processes to market changes. These processes must be described, and a suitable way of describing them is BPMN notation. Right after describing processes via BPMN, the processes should be controlled to ensure their expected quality. A system (which could be automated), based on the mathematical expression of the qualitative characteristics of process models (i.e. measures of the quality of process models), can support such process control. The research team is trying to design such a tool and bring it into practical use. The aim of this publication is to describe this system, based on measures of the quality of process models, and to answer the associated scientific questions.

  17. An Analysis of the Propulsion Experiments Performed on a Model Representing the Stretched PONCE DE LEON (SPDL) Class RO/RO Ship Fitted with Two Sets of Design Contrarotating Propellers (Model 5362; Propellers 4731 & 4732 and 9019 & 9020).

    Science.gov (United States)

    1981-01-01

    A series of propulsion experiments were performed on Model 5362, representing a Stretched PONCE DE LEON Class RO/RO ship. The model was fitted

  18. Select strengths and biases of models in representing the Arctic winter boundary layer over sea ice: the Larcform 1 single column model intercomparison

    Science.gov (United States)

    Pithan, Felix; Ackerman, Andrew; Angevine, Wayne M.; Hartung, Kerstin; Ickes, Luisa; Kelley, Maxwell; Medeiros, Brian; Sandu, Irina; Steeneveld, Gert-Jan; Sterk, H. A. M.; Svensson, Gunilla; Vaillancourt, Paul A.; Zadra, Ayrton

    2016-09-01

    Weather and climate models struggle to represent lower tropospheric temperature and moisture profiles and surface fluxes in Arctic winter, partly because they lack or misrepresent physical processes that are specific to high latitudes. Observations have revealed two preferred states of the Arctic winter boundary layer. In the cloudy state, cloud liquid water limits surface radiative cooling, and temperature inversions are weak and elevated. In the radiatively clear state, strong surface radiative cooling leads to the build-up of surface-based temperature inversions. Many large-scale models lack the cloudy state, and some substantially underestimate inversion strength in the clear state. Here, the transformation from a moist to a cold dry air mass is modeled using an idealized Lagrangian perspective. The trajectory includes both boundary layer states, and the single-column experiment is the first Lagrangian Arctic air formation experiment (Larcform 1) organized within GEWEX GASS (Global atmospheric system studies). The intercomparison reproduces the typical biases of large-scale models: some models lack the cloudy state of the boundary layer due to the representation of mixed-phase microphysics or to the interaction between micro- and macrophysics. In some models, high emissivities of ice clouds or the lack of an insulating snow layer prevent the build-up of surface-based inversions in the radiatively clear state. Models substantially disagree on the amount of cloud liquid water in the cloudy state and on turbulent heat fluxes under clear skies. Observations of air mass transformations including both boundary layer states would allow for a tighter constraint of model behavior.

  19. Modeling water quality in an urban river using hydrological factors--data driven approaches.

    Science.gov (United States)

    Chang, Fi-John; Tsai, Yu-Hsuan; Chen, Pin-An; Coynel, Alexandra; Vachaud, Georges

    2015-03-15

    Contrasting seasonal variations occur in river flow and water quality as a result of short-duration, severe-intensity storms and typhoons in Taiwan. Sudden changes in river flow caused by impending extreme events may impose serious degradation on river water quality and fateful impacts on ecosystems. Water quality is measured on a monthly/quarterly scale, and therefore an estimation of water quality on a daily scale would be of great help for timely river pollution management. This study proposes a systematic analysis scheme (SAS) to assess the spatio-temporal interrelation of water quality in an urban river and construct water quality estimation models using two static and one dynamic artificial neural networks (ANNs) coupled with the Gamma test (GT), based on water quality, hydrological and economic data. The Dahan River basin in Taiwan is the study area. Ammonia nitrogen (NH3-N) is considered as the representative parameter, a correlative indicator for judging the contamination level throughout the study. The key factors most closely related to the representative parameter (NH3-N) are extracted by the Gamma test for modeling NH3-N concentration, and as a result, four hydrological factors (discharge, days w/o discharge, water temperature and rainfall) are identified as model inputs. The modeling results demonstrate that the nonlinear autoregressive with exogenous input (NARX) network furnished with recurrent connections can accurately estimate NH3-N concentration with a very high coefficient of efficiency value (0.926) and a low RMSE value (0.386 mg/l). In addition, the NARX network can suitably capture peak values that mainly occur in dry periods (September-April in the study area), which is particularly important to water pollution treatment. The proposed SAS suggests a promising approach to reliably modeling the spatio-temporal NH3-N concentration based solely on hydrological data, without using water quality sampling data. It is worth noticing that such estimation can be
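    The NARX structure described above, feeding a lagged value of NH3-N back in alongside exogenous hydrological inputs, can be illustrated with a small neural network regression; the sketch below uses synthetic data and a generic scikit-learn MLP rather than the study's network.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic daily series standing in for the hydrological inputs and NH3-N.
rng = np.random.default_rng(2)
n = 300
discharge = 5 + 2 * np.sin(np.arange(n) / 20) + rng.normal(scale=0.3, size=n)
rainfall = np.clip(rng.normal(1.0, 1.0, size=n), 0, None)
nh3 = np.zeros(n)
for t in range(1, n):
    nh3[t] = 0.7 * nh3[t - 1] + 0.5 / discharge[t] + 0.1 * rainfall[t] + rng.normal(scale=0.02)

# NARX-style inputs: one lag of the output (feedback) plus current exogenous inputs.
X = np.column_stack([nh3[:-1], discharge[1:], rainfall[1:]])
y = nh3[1:]

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X[:250], y[:250])
print("held-out R^2:", round(model.score(X[250:], y[250:]), 3))
```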

  20. Frequency analysis of river water quality using integrated urban wastewater models.

    Science.gov (United States)

    Fu, Guangtao; Butler, David

    2012-01-01

    In recent years integrated models have been developed to simulate the entire urban wastewater system, including urban drainage systems, wastewater treatment plants, and receiving waterbodies. This paper uses such an integrated urban wastewater model to analyze the frequency of receiving water quality in an urban wastewater system with the aim of assessing the overall system performance during rainfall events. The receiving water quality is represented by two indicators: event mean dissolved oxygen (DO) concentration and event mean ammonium concentration. The compliance probability of the water quality indicators satisfying a specific threshold is used to represent the system performance, and is derived using the rainfall events from a series of 10 years' rainfall data. A strong correlation between the depth of each rainfall event and the associated volume of combined sewer overflow (CSO) discharges is revealed for the case study catchment, while there is a low correlation between the intensity/duration of the rainfall event and the volume of the CSO discharges. The frequency analysis results obtained suggest that the event mean DO and ammonium concentrations have very different characteristics in terms of compliance probabilities at two discharging points for CSO and wastewater treatment plant effluent, respectively. In general, the simulation results provide an understanding of the performance of the integrated urban wastewater system and can provide useful information to support water quality management.
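
    A minimal sketch of the frequency analysis described above, under assumed synthetic outputs: for each rainfall event the integrated model would yield event-mean DO and ammonium concentrations, and the compliance probability is simply the fraction of events meeting an illustrative threshold.

```python
# A minimal sketch (hypothetical data) of the frequency analysis described above:
# for each rainfall event, the integrated model would yield event-mean DO and
# ammonium; the compliance probability is the fraction of events meeting a threshold.
import numpy as np

rng = np.random.default_rng(1)
n_events = 200                                   # events extracted from ~10 years of rainfall
event_mean_do = rng.normal(6.0, 1.5, n_events)   # mg/l, placeholder model output
event_mean_nh4 = rng.lognormal(0.0, 0.6, n_events)  # mg/l, placeholder model output

do_threshold, nh4_threshold = 4.0, 2.5           # illustrative water quality thresholds
p_do = np.mean(event_mean_do >= do_threshold)    # probability DO stays above threshold
p_nh4 = np.mean(event_mean_nh4 <= nh4_threshold) # probability ammonium stays below threshold
print(f"DO compliance probability: {p_do:.2f}")
print(f"NH4 compliance probability: {p_nh4:.2f}")
```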

  1. Comparison of most adaptive meta model With newly created Quality Meta-Model using CART Algorithm

    Directory of Open Access Journals (Sweden)

    Jasbir Malik

    2012-09-01

    Full Text Available To ensure that the software developed is of high quality, it is now widely accepted that the various artifacts generated during the development process should be rigorously evaluated using a domain-specific quality model. However, a domain-specific quality model should be derived from a generic quality model which is time-proven, well-validated and widely accepted. This thesis lays down a clear definition of a quality meta-model and then identifies the various quality meta-models existing in research and practice. The thesis then compares the existing quality meta-models to identify which model is the most adaptable to various domains, using a set of criteria for the comparison. The comparison also specifies categories: since the CART algorithm is a tree architecture that makes binary (true/false) decisions, an item found in a given category falls under the true branch and otherwise under the false branch.
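
    The sketch below illustrates the CART idea referred to above with scikit-learn's decision-tree implementation: hypothetical true/false criteria for a handful of quality meta-models are used to grow a small tree. The criteria names, data and labels are invented for illustration and are not taken from the thesis.

```python
# A minimal sketch of using a CART-style decision tree to classify quality
# meta-models by whether criteria items are present (True) or absent (False).
# The criteria names and data are hypothetical placeholders, not the thesis's.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

criteria = ["domain_independent", "well_validated", "tool_supported", "extensible"]
# Each row: presence/absence of the criteria for one meta-model (illustrative only).
X = np.array([[1, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 0, 1, 0],
              [1, 1, 1, 1]])
adaptable = np.array([1, 1, 0, 0, 1])   # label: judged adaptable to many domains

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, adaptable)
print(export_text(tree, feature_names=criteria))
```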

  2. Good manufacturing practice for modelling air pollution: Quality criteria for computer models to calculate air pollution

    Science.gov (United States)

    Dekker, C. M.; Sliggers, C. J.

    To spur on quality assurance for models that calculate air pollution, quality criteria for such models have been formulated. By satisfying these criteria the developers of these models and producers of the software packages in this field can assure and account for the quality of their products. In this way critics and users of such (computer) models can gain a clear understanding of the quality of the model. Quality criteria have been formulated for the development of mathematical models, for their programming—including user-friendliness, and for the after-sales service, which is part of the distribution of such software packages. The criteria have been introduced into national and international frameworks to obtain standardization.

  3. A study of V79 cell survival after proton and carbon ion beams as represented by the parameters of Katz' track structure model

    DEFF Research Database (Denmark)

    Grzanka, Leszek; Waligórski, M. P. R.; Bassler, Niels

    Katz’s theory of cellular track structure (1) is an amorphous analytical model which applies a set of four cellular parameters representing survival of a given cell line after ion irradiation. Usually the values of these parameters are best fitted to a full set of experimentally measured survival...... curves available for a variety of ions. Once fitted, using these parameter values and the analytical formulae of the model calculations, cellular survival curves and RBE may be predicted for that cell line after irradiation by any ion, including mixed ion fields. While it is known that the Katz model...... of the proton response. This suggests that for increased accuracy of a therapy planning system based on Katz’s model, different sets of parameters may need to be used to represent cell survival after proton irradiation from those representing survival of this cell line after heavier ions, up to and including...

  4. Assessing the fit of the Dysphoric Arousal model across two nationally representative epidemiological surveys: The Australian NSMHWB and the United States NESARC

    DEFF Research Database (Denmark)

    Armour, C.; Carragher, N.; Elhai, J. D.

    2013-01-01

    samples. Results revealed that the Dysphoric Arousal model provided superior fit to the data compared to the alternative models. In conclusion, these findings suggest that items D1-D3 (sleeping difficulties; irritability; concentration difficulties) represent a separate, fifth factor within PTSD's latent...

  5. Association of elevated blood pressure with low distress and good quality of life: results from the nationwide representative German Health Interview and Examination Survey for Children and Adolescents.

    Science.gov (United States)

    Berendes, Angela; Meyer, Thomas; Hulpke-Wette, Martin; Herrmann-Lingen, Christoph

    2013-05-01

    Quality of life is often impaired in patients with known hypertension, but it is less reduced, or not at all, in people unaware of their elevated blood pressure. Some studies have even shown less self-rated distress in adults with elevated blood pressure. In this substudy of the nationwide German Health Interview and Examination Survey for Children and Adolescents (KIGGS), we addressed the question of whether hypertensive blood pressure is also linked to levels of distress and quality of life in adolescents. Study participants aged 11 to 17 years (N = 7688) received standardized measurements of blood pressure, quality of life (using the Children's Quality of Life Questionnaire), and distress (Strengths and Difficulties Questionnaire). Elevated blood pressure was twice as frequent as expected, with 10.7% (n = 825) above published age-, sex- and height-adjusted 95th percentiles. Hypertensive participants were more likely to be obese and to report adverse health behaviors, but they showed better academic success than did normotensive participants. Elevated blood pressure was significantly and positively associated with higher self- and parent-rated quality of life (for both, p ≤ .006) and less hyperactivity (for both, p < …). The association of elevated blood pressure with better well-being and low distress can partly be explained by the absence of confounding physical comorbidity and the unawareness of being hypertensive. It also corresponds to earlier research suggesting a bidirectional relationship, with repressed emotions leading to elevated blood pressure and, furthermore, elevated blood pressure serving as a potential stress buffer.

  6. River water quality model no. 1 (RWQM1): I. Modelling approach

    DEFF Research Database (Denmark)

    Shanahan, P.; Borchardt, D.; Henze, Mogens

    2001-01-01

    Successful river water quality modelling requires the specification of an appropriate model structure and process formulation. Both must be related to the compartment structure of running water ecosystems including their longitudinal, vertical, and lateral zonation patterns. Furthermore, the temp...

  7. River water quality model no. 1 (RWQM1): I. Modelling approach

    DEFF Research Database (Denmark)

    Shanahan, P.; Borchardt, D.; Henze, Mogens

    2001-01-01

    Successful river water quality modelling requires the specification of an appropriate model structure and process formulation. Both must be related to the compartment structure of running water ecosystems including their longitudinal, vertical, and lateral zonation patterns. Furthermore...

  8. Impact of urban parameterization on high resolution air quality forecast with the GEM – AQ model

    Directory of Open Access Journals (Sweden)

    J. Struzewska

    2012-11-01

    Full Text Available The aim of this study is to assess the impact of urban cover on high-resolution air quality forecast simulations with the GEM-AQ (Global Environmental Multiscale and Air Quality) model. The impact of urban area on the ambient atmosphere is non-stationary, and short-term variability of meteorological conditions may result in significant changes of the observed intensity of urban heat island and pollutant concentrations. In this study we used the Town Energy Balance (TEB) parameterization to represent urban effects on modelled meteorological and air quality parameters at the final nesting level with horizontal resolution of ~5 km over Southern Poland. Three one-day cases representing different meteorological conditions were selected and the model was run with and without the TEB parameterization. Three urban cover categories were used in the TEB parameterization: mid-high buildings, very low buildings and low density suburbs. Urban cover layers were constructed based on an area fraction of towns in a grid cell. To analyze the impact of urban parameterization on modelled meteorological and air quality parameters, anomalies in the lowest model layer for the air temperature, wind speed and pollutant concentrations were calculated. Anomalies of the specific humidity fields indicate that the use of the TEB parameterization leads to a systematic reduction of moisture content in the air. Comparison with temperature and wind speed measurements taken at urban background monitoring stations shows that application of urban parameterization improves model results. For primary pollutants the impact of urban areas is most significant in regions characterized with high emissions. In most cases the anomalies of NO2 and CO concentrations were negative. This reduction is most likely caused by an enhanced vertical mixing due to elevated surface temperature and modified vertical stability.

  9. Impact of urban parameterization on high resolution air quality forecast with the GEM - AQ model

    Science.gov (United States)

    Struzewska, J.; Kaminski, J. W.

    2012-11-01

    The aim of this study is to assess the impact of urban cover on high-resolution air quality forecast simulations with the GEM-AQ (Global Environmental Multiscale and Air Quality) model. The impact of urban area on the ambient atmosphere is non-stationary, and short-term variability of meteorological conditions may result in significant changes of the observed intensity of urban heat island and pollutant concentrations. In this study we used the Town Energy Balance (TEB) parameterization to represent urban effects on modelled meteorological and air quality parameters at the final nesting level with horizontal resolution of ~5 km over Southern Poland. Three one-day cases representing different meteorological conditions were selected and the model was run with and without the TEB parameterization. Three urban cover categories were used in the TEB parameterization: mid-high buildings, very low buildings and low density suburbs. Urban cover layers were constructed based on an area fraction of towns in a grid cell. To analyze the impact of urban parameterization on modelled meteorological and air quality parameters, anomalies in the lowest model layer for the air temperature, wind speed and pollutant concentrations were calculated. Anomalies of the specific humidity fields indicate that the use of the TEB parameterization leads to a systematic reduction of moisture content in the air. Comparison with temperature and wind speed measurements taken at urban background monitoring stations shows that application of urban parameterization improves model results. For primary pollutants the impact of urban areas is most significant in regions characterized with high emissions. In most cases the anomalies of NO2 and CO concentrations were negative. This reduction is most likely caused by an enhanced vertical mixing due to elevated surface temperature and modified vertical stability.

  10. Advanced quality prediction model for software architectural knowledge sharing

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris; Tang, Antony; Xu, Lai

    2011-01-01

    In the field of software architecture, a paradigm shift is occurring from describing the outcome of architecting process to describing the Architectural Knowledge (AK) created and used during architecting. Many AK models have been defined to represent domain concepts and their relationships, and the

  11. Application of artificial intelligence models in water quality forecasting.

    Science.gov (United States)

    Yeon, I S; Kim, J H; Jun, K W

    2008-06-01

    The real-time data of the continuous water quality monitoring station at the Pyeongchang River were analyzed separately for the rainy period and the non-rainy period. Total organic carbon data observed during the rainy period showed a greater mean value, maximum value and standard deviation than the data observed during the non-rainy period. Dissolved oxygen values during the rainy period were lower than those observed during the non-rainy period. The analysis showed that discharge due to rainfall from the basin affects changes in water quality. A model for forecasting water quality was constructed and applied using neural network models and the adaptive neuro-fuzzy inference system. The Levenberg-Marquardt neural network, modular neural network and adaptive neuro-fuzzy inference system models all showed good results for the simulation of total organic carbon. The Levenberg-Marquardt neural network and modular neural network models showed better results than the adaptive neuro-fuzzy inference system model in the forecasting of dissolved oxygen. The modular neural network model, which was applied with qualitative time data in addition to quantitative data, showed the least error.

  12. Conceptual Model As The Tool For Managing Bank Services Quality

    Directory of Open Access Journals (Sweden)

    Kornelija Severović

    2009-07-01

    Full Text Available Quality has become a basic factor of economic efficiency and a basic principle of the business activities of successful organizations. Its consequence is a revolution in the area of quality that has encompassed all kinds of products and services, including bank services. Understanding the present and future needs of clients, and knowing how to meet and try to exceed their expectations, is the task of every efficient economy. Therefore, banks in developed economies try to reorient their business organizationally, technologically and informationally, placing the client at the core of the business. Significant indicators of the quality of services that banks offer are the time clients wait for the desired service and the number of clients who give up entering the bank because of long waiting queues. A dissatisfied client is the worst outcome of a bank's work and business activity. Consequently, great effort is made to improve service quality, which means professionalism and good communication by the personnel with whom clients come into contact, punctual and clear information, and short waiting times in line. The aim of this work is to present and describe the functioning of the banking system under the conditions of establishing quality in offering services to clients, and to identify basic guidelines for increasing quality in the work of branch offices. Since banking is a very dynamic and complex system, a conceptual model is developed to optimize the stated quality parameters of bank business activity; in further research, this model will serve as the basis for developing a simulation model.

  13. Pediatric health-related quality of life: a structural equation modeling approach.

    Directory of Open Access Journals (Sweden)

    Ester Villalonga-Olives

    Full Text Available OBJECTIVES: One of the most referenced theoretical frameworks to measure Health Related Quality of Life (HRQoL) is the Wilson and Cleary framework. With some adaptations this framework has been validated in the adult population, but it has not been tested in pediatric populations. Our goal was to empirically investigate it in children. METHODS: The contributory factors to Health Related Quality of Life that we included were symptom status (presence of chronic disease or hospitalizations), functional status (developmental status), developmental aspects of the individual (social-emotional behavior), and characteristics of the social environment (socioeconomic status and area of education). Structural equation modeling was used to assess the measurement structure of the model in 214 German children (3-5 years old) participating in a follow-up study that investigates pediatric health outcomes. RESULTS: Model fit was χ2 = 5.5; df = 6; p = 0.48; SRMR = 0.01. The variance explained of Health Related Quality of Life was 15%. Health Related Quality of Life was affected by the area of education (i.e., where kindergartens were located) and developmental status. Developmental status was affected by the area of education, socioeconomic status and individual behavior. Symptoms did not affect the model. CONCLUSIONS: The goodness of fit and the overall variance explained were good. However, results from children's and adults' tests differed, denoting a conceptual gap between adult and child measures. Indeed, there is a lot of variety in pediatric Health Related Quality of Life measures, which reflects the lack of a common definition of pediatric Health Related Quality of Life. We recommend that researchers invest time in the development of pediatric Health Related Quality of Life theory and theory-based evaluations.
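
    For readers unfamiliar with the SRMR statistic reported above, the sketch below shows how it can be computed: it is the root mean square of the standardized residuals between the observed and model-implied covariance matrices. The matrices here are small invented examples, not the study's data.

```python
# A minimal sketch of how the SRMR fit index reported above is computed:
# the root mean square of standardized residuals between the observed and the
# model-implied covariance matrices. Matrices here are small illustrative values.
import numpy as np

obs = np.array([[1.00, 0.40, 0.30],
                [0.40, 1.00, 0.25],
                [0.30, 0.25, 1.00]])        # observed covariance (hypothetical)
implied = np.array([[1.00, 0.38, 0.32],
                    [0.38, 1.00, 0.27],
                    [0.32, 0.27, 1.00]])    # model-implied covariance (hypothetical)

d = np.sqrt(np.diag(obs))
std_resid = (obs - implied) / np.outer(d, d)     # standardize by observed SDs
idx = np.tril_indices_from(obs)                  # unique elements incl. diagonal
srmr = np.sqrt(np.mean(std_resid[idx] ** 2))
print(f"SRMR = {srmr:.3f}")                      # values near 0 indicate good fit
```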

  14. Towards image quality assessment in mammography using model observers: detection of a calcification like object.

    Science.gov (United States)

    Bouwman, Ramona W; Mackenzie, Alistair; van Engen, Ruben E; M Broeders, Mireille J; Young, Kenneth C; Dance, David R; den Heeten, Gerard J; Veldkamp, Wouter J H

    2017-08-24

    Model observers (MOs) are of interest in the field of medical imaging for assessing image quality. However, before procedures using MOs can be proposed in quality control guidelines for mammography systems, we need to know whether MOs are sensitive to changes in image quality and correlations in background structure. Therefore, as a proof of principle, in this study human and model observer (MO) performance are compared for the detection of calcification-like objects using different background structures and image quality levels of unprocessed mammography images. Three different phantoms, homogeneous polymethyl methacrylate, BR3D slabs with swirled patterns (CIRS, Norfolk, USA) and a prototype anthropomorphic breast phantom (Institute of Medical Physics and Radiation Protection, Technische Hochschule Mittelhessen, Germany), were imaged on an Amulet Innovality (FujiFilm, Tokyo, Japan) mammographic X-ray unit. Because the complexities of the structures of these three phantoms were different and not optimized to match the characteristics of real mammographic images, image processing was not applied in this study. Additionally, real mammograms were acquired on the same system. Regions of interest (ROIs) were extracted from each image. In half of the ROIs a 0.25 mm diameter disk was inserted at four different contrast levels to represent a calcification-like object. Each ROI was then modified so that four image qualities relevant for mammography were simulated. The signal-present and signal-absent ROIs were evaluated by a non-prewhitening model observer with eye filter (NPWE) and a channelized Hotelling observer (CHO) using dense-difference of Gaussian channels. The ROIs were also evaluated by human observers in a 2-alternative forced choice (2-AFC) experiment. Detectability results for the human and model observer experiments were correlated using a mixed effect regression model. Threshold disk contrasts for human and predicted human observer performance based on the NPWE MO and CHO
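
    A minimal sketch of the model observer scoring logic mentioned above, on synthetic images: a template (here crudely standing in for an eye-filtered expected signal) is correlated with signal-present and signal-absent ROIs, a detectability index d' is computed from the two score distributions, and the corresponding 2-AFC proportion correct follows. This is only an illustration under invented noise and contrast, not the authors' implementation.

```python
# A minimal sketch (synthetic images, simplified filter) of how a template-based
# model observer such as NPWE scores detectability: ROIs are scored by correlation
# with a template, then d' and the expected 2-AFC percent correct are derived.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import norm

rng = np.random.default_rng(3)
size, n_roi = 32, 400
signal = np.zeros((size, size))
signal[14:18, 14:18] = 1.0                        # crude calcification-like disk
template = gaussian_filter(signal, sigma=1.0)     # stand-in for an eye-filtered template

def score(roi):
    return float(np.sum(template * roi))          # template cross-correlation at the known location

absent = [score(rng.normal(0, 1, (size, size))) for _ in range(n_roi)]
present = [score(signal * 0.8 + rng.normal(0, 1, (size, size))) for _ in range(n_roi)]

d_prime = (np.mean(present) - np.mean(absent)) / np.sqrt(0.5 * (np.var(present) + np.var(absent)))
pc_2afc = norm.cdf(d_prime / np.sqrt(2))          # expected 2-AFC proportion correct
print(f"d' = {d_prime:.2f}, predicted 2-AFC percent correct = {100 * pc_2afc:.1f}%")
```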

  15. Brief Report: The KIDSCREEN Follow-Up Study on Health-Related Quality of Life (HRQoL) in Spanish Children and Adolescents. Pilot Test and Representativeness

    Science.gov (United States)

    Palacio-Vieira, J. A.; Villalonga-Olives, E.; Alonso, J.; Valderas, J. M.; Herdman, M.; Espallargues, M.; Berra, S.; Rajmil, L.

    2010-01-01

    The Spanish KIDSCREEN follow-up study reassessed the Spanish baseline sample (n = 840) of the European KIDSCREEN study 3 years later (2006). The aims of this paper were to describe the KIDSCREEN follow-up study and the pilot test, and to analyze participation rates and representativeness. Instruments included the KIDSCREEN-52 HRQoL measure and a…

  16. Model quality assessment using distance constraints from alignments

    DEFF Research Database (Denmark)

    Paluszewski, Martin; Karplus, Kevin

    2008-01-01

    Given a set of alternative models for a specific protein sequence, the model quality assessment (MQA) problem asks for an assignment of scores to each model in the set. A good MQA program assigns these scores such that they correlate well with real quality of the models, ideally scoring best...... with the best MQA methods that were assessed at CASP7. We also propose a new evaluation measure, Kendall's tau, that is more interpretable than conventional measures used for evaluating MQA methods (Pearson's r and Spearman's rho). We show clear examples where Kendall's tau agrees much more with our intuition...... of a correct MQA, and we therefore propose that Kendall's tau be used for future CASP MQA assessments. Proteins 2009. (c) 2008 Wiley-Liss, Inc....
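
    A minimal sketch of the evaluation question raised above: comparing an MQA method's scores against true model qualities with Pearson's r, Spearman's rho and Kendall's tau. The scores are invented so that the ranking is perfect but one score is wildly off, a case where the rank-based measures arguably match intuition better than Pearson's r.

```python
# A minimal sketch comparing correlation measures for MQA evaluation.
# The "true" qualities and MQA scores below are invented for illustration.
import numpy as np
from scipy.stats import kendalltau, pearsonr, spearmanr

true_quality = np.array([0.20, 0.35, 0.40, 0.55, 0.70, 0.90])   # e.g. real quality of each model
mqa_scores = np.array([0.10, 0.30, 0.35, 0.50, 0.60, 3.00])     # one wild but correctly ranked score

print("Pearson r :", round(pearsonr(true_quality, mqa_scores)[0], 3))
print("Spearman  :", round(spearmanr(true_quality, mqa_scores)[0], 3))
print("Kendall   :", round(kendalltau(true_quality, mqa_scores)[0], 3))
```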

  17. A Dependent Hidden Markov Model of Credit Quality

    Directory of Open Access Journals (Sweden)

    Małgorzata Wiktoria Korolkiewicz

    2012-01-01

    Full Text Available We propose a dependent hidden Markov model of credit quality. We suppose that the "true" credit quality is not observed directly but only through noisy observations given by posted credit ratings. The model is formulated in discrete time with a Markov chain observed in martingale noise, where "noise" terms of the state and observation processes are possibly dependent. The model provides estimates for the state of the Markov chain governing the evolution of the credit rating process and the parameters of the model, where the latter are estimated using the EM algorithm. The dependent dynamics allow for the so-called "rating momentum" discussed in the credit literature and also provide a convenient test of independence between the state and observation dynamics.
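
    A minimal sketch of the filtering idea behind the model described above: a discrete hidden Markov chain for the "true" credit quality is updated from noisy posted ratings with the standard forward recursion. The transition and emission matrices are invented for illustration; in the paper they would be estimated with the EM algorithm, and the state and observation noises may be dependent.

```python
# A minimal sketch (illustrative parameters, not the paper's) of filtering a
# hidden credit-quality state from noisy posted ratings with a discrete HMM.
import numpy as np

A = np.array([[0.90, 0.08, 0.02],     # transition matrix of "true" credit quality
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])
B = np.array([[0.85, 0.10, 0.05],     # emission: P(observed rating | true state)
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])
pi = np.array([0.6, 0.3, 0.1])        # initial distribution over states

observed_ratings = [0, 0, 1, 1, 2, 1]  # e.g. 0 = high, 1 = medium, 2 = low

belief = pi * B[:, observed_ratings[0]]
belief /= belief.sum()
for obs in observed_ratings[1:]:
    belief = (A.T @ belief) * B[:, obs]   # predict with the chain, then correct with the rating
    belief /= belief.sum()
print("Filtered probabilities of the true credit state:", np.round(belief, 3))
```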

  18. Fuzzy model for determination and assessment of groundwater quality in the city of Zrenjanin, Serbia

    Directory of Open Access Journals (Sweden)

    Kiurski-Milosević Jelena Ž.

    2015-01-01

    Full Text Available The application of fuzzy logic for the determination and assessment of the chemical quality of groundwater for drinking purposes in the city of Zrenjanin is presented. Handling degrees of certainty and uncertainty is one of the problems of the most commonly used methods for assessing water quality; fuzzy logic can successfully handle these problems. The fuzzy model was evaluated on samples from two representative wells located at the depths of the two aquifers from which water is taken to supply the population with drinking water. The samples were analyzed for 8 different chemical water quality parameters. In this research the arsenic concentration (As3+, As5+) is considered the dominant parameter because of its suspected carcinogenic effects on human health. This type of research has been conducted for the first time in the city of Zrenjanin, middle Banat region. [Project of the Ministry of Science of the Republic of Serbia, No. MNTR174009 and No. TR34014]
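
    A minimal sketch (invented membership breakpoints, hypothetical sample value) of the fuzzy-set idea behind such a model: a measured arsenic concentration is mapped onto overlapping fuzzy classes by membership functions instead of a single crisp pass/fail judgement. Only the 10 ug/l reference value reflects common drinking-water limits; everything else is illustrative.

```python
# A minimal sketch of fuzzy membership evaluation for one groundwater parameter.
# Breakpoints and the sample value are illustrative, not the study's calibration.
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function on points a <= b <= c <= d."""
    return np.clip(np.minimum((x - a) / (b - a + 1e-12),
                              (d - x) / (d - c + 1e-12)), 0.0, 1.0)

as_conc = 12.0                                   # measured arsenic in ug/l (hypothetical sample)
mu_safe = trapezoid(as_conc, -1, 0, 5, 10)       # "safe" below the ~10 ug/l limit
mu_marginal = trapezoid(as_conc, 5, 10, 15, 20)  # around the limit
mu_unsafe = trapezoid(as_conc, 15, 20, 500, 501) # clearly above the limit

quality = {"safe": mu_safe, "marginal": mu_marginal, "unsafe": mu_unsafe}
print({k: round(float(v), 2) for k, v in quality.items()})
print("Dominant class:", max(quality, key=quality.get))
```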

  19. Identification of water quality degradation hotspots in developing countries by applying large scale water quality modelling

    Science.gov (United States)

    Malsy, Marcus; Reder, Klara; Flörke, Martina

    2014-05-01

    Decreasing water quality is one of the main global issues posing risks to food security, the economy and public health, and it is consequently crucial for ensuring environmental sustainability. During the last decades access to clean drinking water increased, but 2.5 billion people still do not have access to basic sanitation, especially in Africa and parts of Asia. In this context not only connection to a sewage system but also treatment is of high importance, as an increasing connection rate will lead to higher loadings and therefore higher pressure on water resources. Furthermore, poor people in developing countries use local surface waters for daily activities, e.g. bathing and washing. It is thus clear that water utilization and water sewerage are inseparably connected. In this study, large-scale water quality modelling is used to point out hotspots of water pollution and to gain insight into potential environmental impacts, in particular in regions with a low observation density and data gaps in measured water quality parameters. We applied the global water quality model WorldQual to calculate biological oxygen demand (BOD) loadings from point and diffuse sources, as well as in-stream concentrations. The regional focus in this study is on developing countries, i.e. Africa, Asia, and South America, as they are most affected by water pollution. Model runs were conducted for the year 2010 to depict the recent status of surface water quality and to identify hotspots and the main causes of pollution. First results show that hotspots mainly occur in highly agglomerated regions where population density is high. Large urban areas are the initial loading hotspots, and pollution prevention and control become increasingly important as point sources are subject to connection rates and treatment levels. Furthermore, river discharge plays a crucial role due to dilution potential, especially in terms of seasonal variability. Highly varying shares of BOD sources across
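
    A minimal sketch of the kind of mass-balance arithmetic such a large-scale model performs for one river reach: a domestic point-source BOD load is derived from connected population, per-capita load and treatment removal, a diffuse load is added, and an in-stream concentration follows from dilution by river discharge. All numbers are illustrative placeholders, not WorldQual parameters.

```python
# A minimal sketch of a point-source BOD loading and dilution calculation.
# All numbers are illustrative placeholders, not WorldQual parameters.
population_connected = 2_000_000        # people connected to sewers
per_capita_bod = 54.0                   # g BOD per person per day (a common textbook value)
treatment_removal = 0.30                # fraction of BOD removed by treatment
diffuse_load_kg_day = 5_000.0           # additional diffuse-source load

point_load_kg_day = population_connected * per_capita_bod / 1000.0 * (1 - treatment_removal)
total_load_kg_day = point_load_kg_day + diffuse_load_kg_day

discharge_m3_s = 80.0                   # river discharge at the reach
discharge_l_day = discharge_m3_s * 1000.0 * 86400.0
concentration_mg_l = total_load_kg_day * 1e6 / discharge_l_day  # kg -> mg
print(f"In-stream BOD concentration: {concentration_mg_l:.1f} mg/l")
```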

  20. Optimum profit model considering production, quality and sale problem

    Science.gov (United States)

    Chen, Chung-Ho; Lu, Chih-Lun

    2011-12-01

    Chen and Liu ['Procurement Strategies in the Presence of the Spot Market-an Analytical Framework', Production Planning and Control, 18, 297-309] presented an optimum profit model between producers and purchasers for a supply chain system with a pure procurement policy. However, their model, with a simple manufacturing cost, did not consider the customer's cost of use. In this study, a modified Chen and Liu model is addressed for determining the optimum product and process parameters. The authors propose a modified Chen and Liu model under a two-stage screening procedure. A surrogate variable having a high correlation with the measurable quality characteristic is measured directly in the first stage; the measurable quality characteristic itself is measured in the second stage when the product decision cannot be made in the first stage. The customer's cost of use is measured by adopting Taguchi's quadratic quality loss function. The optimum purchaser's order quantity, the producer's product price and the process quality level are jointly determined by maximizing the expected profit between them.
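
    A minimal sketch of how Taguchi's quadratic loss enters such a profit calculation: for a normally distributed quality characteristic the expected loss is k(sigma^2 + (mu - m)^2), and the process mean can be chosen to maximize expected profit. All prices, costs and the loss coefficient are invented, and only the process mean is optimized here, not the full set of decision variables in Chen and Liu's model.

```python
# A minimal sketch of maximizing expected profit under Taguchi's quadratic
# quality loss. All figures are illustrative, not those of the cited model.
import numpy as np
from scipy.optimize import minimize_scalar

price = 20.0           # unit selling price
unit_cost = 8.0        # base manufacturing cost
target = 10.0          # target value m of the quality characteristic
k_loss = 0.5           # Taguchi loss coefficient: L(y) = k (y - m)^2
sigma = 1.2            # process standard deviation (assumed fixed here)
cost_per_mean = 0.3    # extra manufacturing cost per unit increase of the mean

def expected_profit(mu):
    # E[L(Y)] = k * (sigma^2 + (mu - target)^2) for Y ~ N(mu, sigma^2)
    expected_loss = k_loss * (sigma ** 2 + (mu - target) ** 2)
    return price - (unit_cost + cost_per_mean * mu) - expected_loss

res = minimize_scalar(lambda mu: -expected_profit(mu), bounds=(5.0, 15.0), method="bounded")
print(f"Optimal process mean: {res.x:.2f}, expected profit: {expected_profit(res.x):.2f}")
```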

  1. Guiding and Modelling Quality Improvement in Higher Education Institutions

    Science.gov (United States)

    Little, Daniel

    2015-01-01

    The article considers the process of creating quality improvement in higher education institutions from the point of view of current organisational theory and social-science modelling techniques. The author considers the higher education institution as a functioning complex of rules, norms and other organisational features and reviews the social…

  2. Total Quality Management, a New Culture Model of the Enterprise

    Directory of Open Access Journals (Sweden)

    Constantin Dumitrescu

    2006-10-01

    Full Text Available The paper clarifies the definition of the concept and the basic principles of TQM, presenting the critical factors encountered during the implementation of those fundamentals. Several models for presenting Total Quality Management are also proposed, together with an account of its evolution.

  3. A Statistical Quality Model for Data-Driven Speech Animation.

    Science.gov (United States)

    Ma, Xiaohan; Deng, Zhigang

    2012-11-01

    In recent years, data-driven speech animation approaches have achieved significant successes in terms of animation quality. However, how to automatically evaluate the realism of novel synthesized speech animations has been an important yet unsolved research problem. In this paper, we propose a novel statistical model (called SAQP) to automatically predict the quality of on-the-fly synthesized speech animations by various data-driven techniques. Its essential idea is to construct a phoneme-based, Speech Animation Trajectory Fitting (SATF) metric to describe speech animation synthesis errors and then build a statistical regression model to learn the association between the obtained SATF metric and the objective speech animation synthesis quality. Through delicately designed user studies, we evaluate the effectiveness and robustness of the proposed SAQP model. To the best of our knowledge, this work is the first-of-its-kind, quantitative quality model for data-driven speech animation. We believe it is the important first step to remove a critical technical barrier for applying data-driven speech animation techniques to numerous online or interactive talking avatar applications.

  4. Heuristic Model Of The Composite Quality Index Of Environmental Assessment

    Science.gov (United States)

    Khabarov, A. N.; Knyaginin, A. A.; Bondarenko, D. V.; Shepet, I. P.; Korolkova, L. N.

    2017-01-01

    The goal of the paper is to present a heuristic model of a composite environmental quality index based on the integrated application of elements of utility theory, multidimensional scaling, expert evaluation and decision-making. The composite index is synthesized in linear-quadratic form; it provides better adequacy of the assessment results to the preferences of experts and decision-makers.

  5. Quality Assurance Based on Descriptive and Parsimonious Appearance Models

    DEFF Research Database (Denmark)

    Nielsen, Jannik Boll; Eiríksson, Eyþór Rúnar; Kristensen, Rasmus Lyngby

    2015-01-01

    In this positional paper, we discuss the potential benefits of using appearance models in additive manufacturing, metal casting, wind turbine blade production, and 3D content acquisition. Current state of the art in acquisition and rendering of appearance cannot easily be used for quality assurance...

  6. Guiding and Modelling Quality Improvement in Higher Education Institutions

    Science.gov (United States)

    Little, Daniel

    2015-01-01

    The article considers the process of creating quality improvement in higher education institutions from the point of view of current organisational theory and social-science modelling techniques. The author considers the higher education institution as a functioning complex of rules, norms and other organisational features and reviews the social…

  7. A deterministic aggregate production planning model considering quality of products

    Science.gov (United States)

    Madadi, Najmeh; Yew Wong, Kuan

    2013-06-01

    Aggregate Production Planning (APP) is medium-term planning concerned with the lowest-cost method of production planning to meet customers' requirements and to satisfy fluctuating demand over a planning time horizon. The APP problem has been studied widely since it was introduced and formulated in the 1950s. However, most studies in the APP area have concentrated on common objectives such as minimization of cost, of fluctuation in the number of workers, and of inventory level. Specifically, maintaining quality at a desirable level as an objective while minimizing cost has not been considered in previous studies. In this study, an attempt has been made to develop a multi-objective mixed integer linear programming model that serves companies aiming to incur the minimum level of operational cost while maintaining quality at an acceptable level. In order to solve the multi-objective model, the Fuzzy Goal Programming approach and the max-min operator of Bellman-Zadeh were applied to the model. In the final step, IBM ILOG CPLEX Optimization Studio software was used to obtain the experimental results based on data collected from an automotive parts manufacturing company. The results show that incorporating quality in the model imposes some costs; however, a trade-off should be made between the cost of producing products with higher quality and the cost that the firm may incur due to customer dissatisfaction and lost sales.
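
    A minimal sketch of the kind of production planning optimization that underlies such a model, stripped down to a tiny two-period, single-objective linear program solved with scipy: choose production and carried inventory to meet demand at minimum cost subject to a capacity limit. The costs, demands and capacity are invented, and the quality objective and fuzzy goal programming layer of the study are deliberately omitted.

```python
# A minimal sketch of a two-period aggregate production planning LP.
# Costs, demands and capacity are illustrative placeholders.
from scipy.optimize import linprog

demand = [120.0, 180.0]           # units required in periods 1 and 2
capacity = 160.0                  # production capacity per period
c = [5.0, 5.0, 1.0, 1.0]          # costs: production p1, p2 and inventory i1, i2

# Inventory balance: p1 - i1 = d1;  i1 + p2 - i2 = d2
A_eq = [[1, 0, -1, 0],
        [0, 1,  1, -1]]
b_eq = demand
bounds = [(0, capacity), (0, capacity), (0, None), (0, None)]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
p1, p2, i1, i2 = res.x
print(f"Production plan: {p1:.0f}, {p2:.0f} units; carried inventory: {i1:.0f}, {i2:.0f}")
```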

  8. Innovations in projecting emissions for air quality modeling ...

    Science.gov (United States)

    Air quality modeling is used in setting air quality standards and in evaluating their costs and benefits. Historically, modeling applications have projected emissions and the resulting air quality only 5 to 10 years into the future. Recognition that the choice of air quality management strategy has climate change implications is encouraging longer modeling time horizons. However, for multi-decadal time horizons, many questions about future conditions arise. For example, will current population, economic, and land use trends continue, or will we see shifts that may alter the spatial and temporal pattern of emissions? Similarly, will technologies such as building-integrated solar photovoltaics, battery storage, electric vehicles, and CO2 capture emerge as disruptive technologies - shifting how we produce and use energy - or will these technologies achieve only niche markets and have little impact? These are some of the questions that are being evaluated by researchers within the U.S. EPA’s Office of Research and Development. In this presentation, Dr. Loughlin will describe a range of analytical approaches that are being explored. These include: (i) the development of alternative scenarios of the future that can be used to evaluate candidate management strategies over wide-ranging conditions, (ii) the application of energy system models to project emissions decades into the future and to assess the environmental implications of new technologies, (iii) and methodo

  9. Towards a quality model for semantic IS standards

    NARCIS (Netherlands)

    Folmer, E.J.A.; Soest, J. van

    2011-01-01

    This research focuses on developing a quality model for semantic Information System (IS) standards. A lot of semantic IS standards are available in different industries. Often these standards are developed by a dedicated organization. While these organizations have the goal of increasing interoperab

  10. Towards a quality model for semantic IS standards

    NARCIS (Netherlands)

    Folmer, E.J.A.; Soest, J. van

    2012-01-01

    This research focuses on developing a quality model for semantic information system (IS) standards. A lot of semantic IS standards are available in different industries. Often these standards are developed by a dedicated organisation. While these organisations have the goal of increasing interoperab

  11. Graded Response Modeling of the Quality of Life Interview.

    Science.gov (United States)

    Uttaro, Thomas; Lehman, Anthony

    1999-01-01

    Outlined a graded response model and applied it to an aggregated data set from four studies involving subjective items from the Quality of Life Interview (QOLI) (A. Lehman, 1988). Used the results to create customized QOLI scales. Discusses the use of this methodology for scales involving ordered, graded categories. (SLD)
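
    A minimal sketch of the graded response model applied above to ordered QOLI categories: the probability of responding in category k or above follows a two-parameter logistic curve in the latent trait, and category probabilities are differences between adjacent curves. The discrimination and threshold values are invented for illustration, not estimates from the study.

```python
# A minimal sketch of Samejima's graded response model for one ordered item.
# Parameter values are illustrative, not estimates from the QOLI data.
import numpy as np

def grm_category_probs(theta, a, thresholds):
    """P(X = k | theta) for an item with discrimination a and ordered thresholds b_k."""
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(thresholds))))  # P(X >= k)
    cum = np.concatenate(([1.0], p_star, [0.0]))
    return cum[:-1] - cum[1:]

theta = 0.5                            # latent subjective quality of life
a = 1.4                                # item discrimination
thresholds = [-1.5, -0.3, 0.8, 2.0]    # boundaries for a 5-category item

probs = grm_category_probs(theta, a, thresholds)
print("Category probabilities:", np.round(probs, 3), "sum =", probs.sum().round(3))
```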

  12. Association of fast food consumption with energy intake, diet quality, body mass index and the risk of obesity in a representative Mediterranean population.

    Science.gov (United States)

    Schröder, Helmut; Fitó, Montserrat; Covas, Maria Isabel

    2007-12-01

    The aim of the present study was to describe the association of fast food consumption with BMI, energy intake and diet quality in a Mediterranean population. The subjects were Spanish men (n = 1491) and women (n = 1563) aged 25-74 years who were examined in 1999-2000, in a population-based cross-sectional survey in northeast Spain (Girona). Dietary intake was assessed using a FFQ that included four typical fast food items. Two dietary-quality indices, the Mediterranean diet score and the healthy eating index, were created. Height and weight were measured. Within the population studied, 10.1 % reported eating fast food at least once per month. Dietary energy intake and energy density were directly associated with frequency of fast food consumption. Multivariate logistic regression analysis adjusted for lifestyle and educational level showed an inverse association of frequency of fast food consumption with meeting the dietary reference intake (DRI) for energy (P = 0.001). The consumption of fast food more than once per week increased the risk of overall low diet quality (P < …), as did fast food consumption expressed in g/d (P = 0.025) and in kJ/d (P = 0.017). The risk of being obese increased with the frequency of fast food consumption (P = 0.046). Fast food consumption was associated with higher energy intakes, poor diet quality and higher BMI. The likelihood of not meeting the DRI for energy, and of being obese, increased with the frequency of fast food consumption.

  13. Applying revised gap analysis model in measuring hotel service quality.

    Science.gov (United States)

    Lee, Yu-Cheng; Wang, Yu-Che; Chien, Chih-Hung; Wu, Chia-Huei; Lu, Shu-Chiung; Tsai, Sang-Bing; Dong, Weiwei

    2016-01-01

    The number of tourists coming to Taiwan has grown by 10-20 % since 2010, driven by an increasing number of foreign tourists, particularly after deregulation allowed the admission of tourist groups, and later on individual tourists, from mainland China. The purpose of this study is to propose a revised gap model to evaluate and improve service quality in the Taiwanese hotel industry. Service quality can thus be measured clearly through gap analysis, which is more effective for offering direction in developing and improving service quality. The HOLSERV instrument was used to identify and analyze service gaps from the perceptions of internal and external customers. The sample for this study included three main categories of respondents: tourists, employees, and managers. The results show that five gaps influenced tourists' evaluations of service quality. In particular, the study revealed that Gap 1 (management perceptions vs. customer expectations) and Gap 9 (service provider perceptions of management perceptions vs. service delivery) were more critical than the others in affecting perceived service quality, making service delivery the main area of improvement. This study contributes toward an evaluation of the service quality of the Taiwanese hotel industry from the perspectives of customers, service providers, and managers, which is considerably valuable for hotel managers. It was the aim of this study to explore all of these together in order to better understand the possible gaps in the hotel industry in Taiwan.

  14. Robust Multiscale Modelling Of Two-Phase Steels On Heterogeneous Hardware Infrastructures By Using Statistically Similar Representative Volume Element

    Directory of Open Access Journals (Sweden)

    Rauch Ł.

    2015-09-01

    Full Text Available Coupled finite element multiscale simulations (FE2) require costly numerical procedures in both macro and micro scales. Attempts to improve numerical efficiency are focused mainly on two areas of development, i.e. parallelization/distribution of numerical procedures and simplification of the virtual material representation. One representative of both areas is the idea of the Statistically Similar Representative Volume Element (SSRVE). It aims at reducing the number of finite elements in the micro scale as well as at parallelization of the calculations in the micro scale, which can be performed without barriers. The simplification of the computational domain is realized by transforming sophisticated images of the material microstructure into artificially created simple objects characterized by features similar to their original equivalents. In existing solutions for two-phase steels, the SSRVE is created on the basis of the analysis of shape coefficients of the hard phase in the real microstructure and a search for a representative simple structure with similar shape coefficients. Optimization techniques were used to solve this task. In the present paper local strains and stresses are added to the cost function in the optimization. Various forms of the objective function composed of different elements were investigated and used in the optimization procedure for the creation of the final SSRVE. The results are compared in terms of the efficiency of the procedure and the uniqueness of the solution. The best objective function, composed of shape coefficients as well as of strains and stresses, was proposed. Examples of SSRVEs determined for the investigated two-phase steel using that objective function are demonstrated in the paper. Each step of SSRVE creation is investigated from a computational efficiency point of view. The proposition of implementation of the whole computational procedure on modern High Performance Computing (HPC
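
    A minimal sketch of the optimization idea described above: parameters of a simplified inclusion geometry are sought so that its shape coefficients and homogenized mechanical response match target values measured on the real microstructure, by minimizing a weighted objective. The feature mapping below is a toy surrogate and all numbers are invented; the paper's actual shape coefficients, FE-based response evaluation and optimization method are far more elaborate.

```python
# A minimal sketch of an SSRVE-style objective: match shape and response
# features of a simplified structure to targets from the real microstructure.
# The surrogate feature mapping and all numbers are invented for illustration.
import numpy as np
from scipy.optimize import minimize

target = {"shape": np.array([0.62, 1.35]),      # e.g. circularity, aspect ratio
          "response": np.array([410.0, 0.021])} # e.g. mean stress [MPa], mean strain

def candidate_features(params):
    """Map SSRVE design parameters to the same feature vector (toy surrogate model)."""
    a, b = params
    shape = np.array([a / (a + b), b / a])
    response = np.array([380.0 + 60.0 * a, 0.015 + 0.01 * b])
    return shape, response

def objective(params, w_shape=1.0, w_resp=1.0):
    shape, resp = candidate_features(params)
    return (w_shape * np.sum((shape - target["shape"]) ** 2)
            + w_resp * np.sum(((resp - target["response"]) / target["response"]) ** 2))

res = minimize(objective, x0=np.array([0.5, 0.5]), bounds=[(0.1, 2.0), (0.1, 2.0)])
print("Optimal SSRVE parameters:", np.round(res.x, 3), "objective:", round(res.fun, 5))
```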

  15. Statistical properties of fluctuations of time series representing appearances of words in nationwide blog data and their applications: An example of modeling fluctuation scalings of nonstationary time series

    Science.gov (United States)

    Watanabe, Hayafumi; Sano, Yukie; Takayasu, Hideki; Takayasu, Misako

    2016-11-01

    To elucidate the nontrivial empirical statistical properties of fluctuations of a typical nonsteady time series representing the appearance of words in blogs, we investigated approximately 3 × 10^9 Japanese blog articles over a period of six years and analyzed corresponding mathematical models. First, we introduce a solvable nonsteady extension of the random diffusion model, which can be deduced by modeling the behavior of heterogeneous random bloggers. Next, we deduce theoretical expressions for both the temporal and ensemble fluctuation scalings of this model, and demonstrate that these expressions can reproduce all empirical scalings over eight orders of magnitude. Furthermore, we show that the model can reproduce other statistical properties of time series representing the appearance of words in blogs, such as functional forms of the probability density and correlations in the total number of blogs. As an application, we quantify the abnormality of special nationwide events by measuring the fluctuation scalings of 1771 basic adjectives.
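
    A minimal sketch of what estimating a temporal fluctuation scaling means in practice: for each word, the standard deviation of its daily count is related to its mean count, and a power law sigma ∝ mean^alpha is fitted on log-log axes. The synthetic Poisson counts below only illustrate the procedure (and recover alpha ≈ 0.5, the pure-noise case); the empirical blog data in the paper show richer behavior.

```python
# A minimal sketch of fitting a temporal fluctuation scaling (Taylor's law)
# exponent from synthetic daily word counts. Data are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
n_words, n_days = 500, 365
mean_rates = rng.lognormal(mean=2.0, sigma=1.5, size=n_words)        # word-specific rates
counts = rng.poisson(mean_rates[:, None], size=(n_words, n_days))    # daily appearance counts

means = counts.mean(axis=1)
stds = counts.std(axis=1)
mask = (means > 0) & (stds > 0)
alpha, log_c = np.polyfit(np.log(means[mask]), np.log(stds[mask]), 1)
print(f"Fluctuation scaling exponent alpha ~= {alpha:.2f}")   # ~0.5 for pure Poisson noise
```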

  16. Muscular weakness represents the main limiting factor of gait, functional independence and quality of life in patients with HTLV-1-associated myelopathy

    Directory of Open Access Journals (Sweden)

    Renata Costa Caiafa

    2016-04-01

    Full Text Available ABSTRACT HTLV-1-associated myelopathy is a progressive disabling disease associated with gait abnormalities. Objective To identify and quantify the main muscles affected by weakness and spasticity, and their impact on gait, functional capacity and quality of life in HTLV-1-associated myelopathy patients. Method We evaluated lower limb muscle strength according to the Medical Research Council scale, spasticity according to the modified Ashworth scale, daily activities according to the Barthel Index and quality of life according to the Short-Form Health Survey-36 in 26 HTLV-1-associated myelopathy patients. Results The muscles most affected by weakness included the dorsal flexors and knee flexors. Spasticity predominated in the hip adductor muscles and in the plantar flexors. Assistance for locomotion, minimal dependence in daily activities, and limitations in functional capacity and physical aspects were the most common findings. Conclusion The impairment of gait, functional dependence and quality of life were predominantly a consequence of intense muscle weakness in HTLV-1-associated myelopathy patients.

  17. Software Defect Prediction Models for Quality Improvement: A Literature Study

    Directory of Open Access Journals (Sweden)

    Mrinal Singh Rawat

    2012-09-01

    Full Text Available In spite of meticulous planning, thorough documentation and proper process control during software development, the occurrence of certain defects is inevitable. These software defects may lead to degradation of quality, which might be the underlying cause of failure. In today's cutting-edge competition it is necessary to make conscious efforts to control and minimize defects in software engineering. However, these efforts cost money, time and resources. This paper identifies causative factors which in turn suggest remedies to improve software quality and productivity. The paper also shows how various defect prediction models are implemented, resulting in a reduced magnitude of defects.

  18. A Framework for Conceptual Modeling of Geographic Data Quality

    DEFF Research Database (Denmark)

    Friis-Christensen, Anders; Christensen, J.V.; Jensen, Christian Søndergaard

    2004-01-01

    Sustained advances in wireless communications, geo-positioning, and consumer electronics pave the way to a kind of location-based service that relies on the tracking of the continuously changing positions of an entire population of service users. This type of service is characterized by large...... determined by how "good" the data is, as different applications of geographic data require different qualities of the data are met. Such qualities concern the object level as well as the attribute level of the data. This paper presents a systematic and integrated approach to the conceptual modeling...

  19. Discharge Water Quality Models of Storm Runoff in a Catchment

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The relationships between nitrogen and phosphorus contents in the discharge water and the discharge of storm runoff are analyzed for an experimental catchment that includes terraced paddy fields, based on experimental results from the catchment. After summarizing related research on water quality models, water quality models for different components of the catchment's storm runoff are presented and verified with water quality analyses and the corresponding discharges of storm runoff during 3 storms. By estimating the specific discharge of storm runoff, the models can forecast the specific load of different components of nitrogen and phosphorus in the discharge water of the catchment. It is found that linear regression methods are very useful for analyzing the relationship between the concentrations of nitrogen and phosphorus and the water discharge of storm runoff. It is also found that most of the nitrogen (75%) in the discharge water is organic, while about half of the phosphorus (49%) in the discharge water is inorganic.
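
    A minimal sketch of the linear-regression step described above, on invented numbers: measured total nitrogen concentration is regressed on storm runoff discharge, and the fitted line is then used to estimate the concentration and load for a forecast discharge.

```python
# A minimal sketch of regressing nutrient concentration on storm runoff
# discharge and using the fit to estimate a load. Data are hypothetical.
import numpy as np

discharge = np.array([0.05, 0.12, 0.30, 0.55, 0.80, 1.10])   # m3/s during storm runoff
tn_conc = np.array([1.8, 2.1, 2.9, 3.6, 4.2, 5.0])           # total nitrogen, mg/l

slope, intercept = np.polyfit(discharge, tn_conc, 1)
q_forecast = 0.65                                             # forecast discharge
conc_forecast = slope * q_forecast + intercept
load_g_s = conc_forecast * q_forecast                         # mg/l * m3/s = g/s
print(f"Estimated TN concentration: {conc_forecast:.2f} mg/l, load: {load_g_s:.2f} g/s")
```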

  20. Food quality, effects on health and sustainability today: a model case report.

    Science.gov (United States)

    Borroni, Vittorio Natale; Fargion, Silvia; Mazzocchi, Alessandra; Giachetti, Marco; Lanzarini, Achille; Dall'Asta, Margherita; Scazzina, Francesca; Agostoni, Carlo

    2017-02-01

    The Fondazione IRCCS Ca' Granda Ospedale Maggiore Policlinico is a five-century-old institution that, besides its unique clinical role in the center of Milan, relies on benefactor donations such as fields and farmhouses not far from the city, for a total of 8500 ha, all managed by the "Sviluppo Ca' Granda" Foundation. Presently, the main products of these fields are rice and cow's milk. In recent years, farmers and managers have developed a model of sustainable food production, with great attention to product quality based on compositional analysis and functional nutritional characteristics. This experience represents a new holistic model of food production and consumption, taking great care of both sustainability and health.

  1. Modeling air quality over China: Results from the Panda project

    Science.gov (United States)

    Katinka Petersen, Anna; Bouarar, Idir; Brasseur, Guy; Granier, Claire; Xie, Ying; Wang, Lili; Wang, Xuemei

    2015-04-01

    China faces strong air pollution problems related to rapid economic development in the past decade and increasing demand for energy. Air quality monitoring stations often report high levels of particulate matter and ozone all over the country. Given its long-term health impacts, air pollution has become a pressing problem not only in China but also in other Asian countries. The PANDA project is the result of cooperation between scientists from Europe and China who joined efforts to better understand the processes controlling air pollution in China, improve methods for monitoring air quality and elaborate indicators in support of European and Chinese policies. A modeling system for air pollution is being set up within the PANDA project and includes advanced global (MACC, EMEP) and regional (WRF-Chem, EMEP) meteorological and chemical models to analyze and monitor air quality in China. The poster describes the accomplishments obtained within the first year of the project. Model simulations for January and July 2010 are evaluated with satellite measurements (SCIAMACHY NO2 and MOPITT CO) and in-situ data (O3, CO, NOx, PM10 and PM2.5) observed at several surface stations in China. Using the WRF-Chem model, we investigate the sensitivity of the model performance to emissions (MACCity, HTAPv2), horizontal resolution (60km, 20km) and choice of initial and boundary conditions.

  2. A model of quality assurance and quality improvement for post-graduate medical education in Europe.

    Science.gov (United States)

    Da Dalt, Liviana; Callegaro, Silvia; Mazzi, Anna; Scipioni, Antonio; Lago, Paola; Chiozza, Maria L; Zacchello, Franco; Perilongo, Giorgio

    2010-01-01

    Since the quality of medical education is intimately related to the quality of health care, the issue of quality assurance (QA) and quality improvement (QI) is becoming of paramount importance worldwide. To describe a model for implementing a system of internal QA and QI within a post-graduate paediatric training programme based on the ISO 9001:2000 standard. Following the ISO 9001:2000 standard, the curriculum was managed as a series of interrelated processes whose level of function was monitored by objective indicators elaborated ad hoc. The training programme was broken down into 19 interlinked processes, 15 related procedures and 24 working instructions. All these materials, along with the quality policy, the mission, the strategies and the values, were made publicly available. Based on the measurable indicators developed to monitor some of the processes, areas of weakness of the system were objectively identified and QI actions consequently implemented. The appropriateness of all this allowed the programme to finally achieve official ISO 9001:2000 certification. The application of the ISO 9001:2000 standard served to develop an internal QA and QI system and to meet most of the standards developed for QA in higher and medical education.

  3. Representing the acquisition and use of energy by individuals in agent-based models of animal populations

    DEFF Research Database (Denmark)

    Sibly, RS; Grimm, Volker; Johnston, Alice S.A.;

    2013-01-01

    Agent-based models (ABMs) are widely used to predict how populations respond to changing environments. As the availability of food varies in space and time, individuals should have their own energy budgets, but there is no consensus as to how these should be modelled. Here, we use knowledge...... of physiological ecology to identify major issues confronting the modeller and to make recommendations about how energy budgets for use in ABMs should be constructed. Our proposal is that modelled animals forage as necessary to supply their energy needs for maintenance, growth and reproduction......, and these can be used to obtain estimates of background mortality rate. If parameter values cannot be obtained directly, then values may provisionally be obtained by parameter borrowing, pattern-oriented modelling, artificial evolution or from allometric equations. The development of ABMs incorporating...

  4. Source Apportionment of Fine Particulate Matter in North China Plain based on Air Quality Modeling

    Science.gov (United States)

    Xing, J.; Wu, W.; Chang, X.; Wang, S.; Hao, J.

    2016-12-01

    Most Chinese cities in the North China Plain are suffering from serious air pollution. To develop regional air pollution control policies, we need to identify the major source contributions to such pollution and to design control policies that are accurate, efficient and effective. This study used an air quality model with several advanced techniques, including ISAM and ERSM, to assess the source contributions from individual pollutants (incl. SO2, NOx, VOC, NH3, primary PM), sectors (incl. power plants, industry, transportation and domestic), and regions (Beijing, Hebei, Tianjin and surrounding provinces). The modeling periods are January and July 2012, representing winter and summer, respectively. The non-linear relationship between air pollutant emissions and air quality will be addressed, and the integrated control of multiple pollutants and multiple regions in China will be suggested.

  5. Joint space-time geostatistical model for air quality surveillance

    Science.gov (United States)

    Russo, A.; Soares, A.; Pereira, M. J.

    2009-04-01

    Air pollution and people's generalized concern about air quality are nowadays considered a global problem. Although the introduction of strict air pollution regulations has reduced pollution from industry and power stations, the growing number of cars on the road poses a new pollution problem. Considering the characteristics of the atmospheric circulation and the residence times of certain pollutants in the atmosphere, a generalized and growing interest in air quality issues has led to intensified research and the publication of several articles with quite different levels of scientific depth. Like most natural phenomena, air quality can be seen as a space-time process, where space-time relationships usually have quite different characteristics and levels of uncertainty. As a result, the simultaneous integration of space and time is not an easy task to perform. This problem is overcome by a variety of methodologies. The use of stochastic models and neural networks to characterize the space-time dispersion of air quality is becoming common practice. The main objective of this work is to produce an air quality model which allows forecasting critical concentration episodes of a certain pollutant by means of a hybrid approach, based on the combined use of neural network models and stochastic simulations. A stochastic simulation of the spatial component with a space-time trend model is proposed to characterize critical situations, taking into account data from the past and a space-time trend from the recent past. To identify near-future critical episodes, predicted values from neural networks are used at each monitoring station. In this paper, we describe the design of a hybrid forecasting tool for ambient NO2 concentrations in Lisbon, Portugal.

  6. Water quality modeling using geographic information system (GIS) data

    Science.gov (United States)

    Engel, Bernard A

    1992-01-01

    Protection of the environment and natural resources at the Kennedy Space Center (KSC) is of great concern. The potential for surface and ground water quality problems resulting from non-point sources of pollution was examined using models. Since spatial variation of parameters required was important, geographic information systems (GIS) and their data were used. The potential for groundwater contamination was examined using the SEEPAGE (System for Early Evaluation of the Pollution Potential of Agricultural Groundwater Environments) model. A watershed near the VAB was selected to examine potential for surface water pollution and erosion using the AGNPS (Agricultural Non-Point Source Pollution) model.

  7. Combining catchment and instream modelling to assess physical habitat quality

    DEFF Research Database (Denmark)

    Olsen, Martin

    Observations showed that juvenile trout in stream Ledreborg preferred lower water depths and water velocities than juvenile trout in larger Danish streams, e.g. River Gudenå. Repeated electrofishing in the stream revealed large differences in the temporal and spatial distribution of the trout on the four reaches … and abundance of trout on the reaches. • Comparison of reference-condition minimum run-off and WUA curves suggested that summer low flow was not a limiting factor on the physical habitat quality for juvenile trout under reference conditions. • Habitat hydraulic modelling suggested that stream Ledreborg had … the best potential physical habitat quality for trout fry and juvenile trout and the lowest potential physical habitat quality for adult trout. This finding supports previous evaluations of the stream as a trout habitat, concluding that stream Ledreborg has very few suitable habitats for adult trout…

  8. MODELS OF QUALITY MANAGEMENT SYSTEM: CONTENT AND SCOPE

    Directory of Open Access Journals (Sweden)

    Awny ZREKAT

    2015-12-01

    Full Text Available In this article, methods developed to prevent the loss of most of the benefit from the production process are analyzed: JIT, Value Engineering and Constructability. These methods were developed in parallel with the development of quality control, quality assurance and total quality management.

  9. Importance of demand modelling in network water quality models: a review

    Directory of Open Access Journals (Sweden)

    J. C. van Dijk

    2008-09-01

    Full Text Available Today, there is a growing interest in network water quality modelling. The water quality issues of interest relate to both dissolved and particulate substances. For dissolved substances the main interest is in residual chlorine and (microbiological contaminant propagation; for particulate substances it is in sediment leading to discolouration. There is a strong influence of flows and velocities on transport, mixing, production and decay of these substances in the network. This imposes a different approach to demand modelling which is reviewed in this article.

    For the large diameter lines that comprise the transport portion of a typical municipal pipe system, a skeletonised network model with a top-down approach of demand pattern allocation, a hydraulic time step of 1 h, and a pure advection-reaction water quality model will usually suffice. For the smaller diameter lines that comprise the distribution portion of a municipal pipe system, an all-pipes network model with a bottom-up approach of demand pattern allocation, a hydraulic time step of 1 min or less, and a water quality model that considers dispersion and transients may be needed.

    Demand models that provide stochastic residential demands per individual home and on a one-second time scale are available. A stochastic demands based network water quality model needs to be developed and validated with field measurements. Such a model will be probabilistic in nature and will offer a new perspective for assessing water quality in the drinking water distribution system.

  10. Validation of mathematical models for Salmonella growth in raw ground beef under dynamic temperature conditions representing loss of refrigeration.

    Science.gov (United States)

    McConnell, Jennifer A; Schaffner, Donald W

    2014-07-01

    Temperature is a primary factor in controlling the growth of microorganisms in food. The current U.S. Food and Drug Administration Model Food Code guidelines state that food can be kept out of temperature control for up to 4 h without qualifiers, or up to 6 h if the food product starts at an initial temperature of 41 °F (5 °C) and does not exceed 70 °F (21 °C) at 6 h. This project validates existing ComBase computer models for Salmonella growth under changing temperature conditions, using raw ground beef as a model system. A cocktail of Salmonella serovars isolated from different meat products (Salmonella Copenhagen, Salmonella Montevideo, Salmonella Typhimurium, Salmonella Saintpaul, and Salmonella Heidelberg) was made rifampin resistant and used for all experiments. Inoculated samples were held in a programmable water bath at 4.4 °C (40 °F) and subjected to linear temperature changes to different final temperatures over various lengths of time and then returned to 4.4 °C (40 °F). Maximum temperatures reached were 15.6, 26.7, or 37.8 °C (60, 80, or 100 °F), and the temperature increases took place over 4, 6, and 8 h, with varying cooling times. Our experiments show that when maximum temperatures were lower (15.6 or 26.7 °C), there was generally good agreement between the ComBase models and experiments: when temperature increases of 15.6 or 26.7 °C occurred over 8 h, experimental data were within 0.13 log CFU of the model predictions. When maximum temperatures were 37.8 °C, predictive models were fail-safe. Overall bias of the models was 1.11 and accuracy was 2.11. Our experiments show the U.S. Food and Drug Administration Model Food Code guidelines for holding food out of temperature control are quite conservative. Our research also shows that the ComBase models for Salmonella growth are accurate or fail-safe for dynamic temperature conditions as might be observed due to power loss from natural disasters or during transport out of…
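
    The bias (1.11) and accuracy (2.11) figures quoted above are of the kind conventionally computed as Ross-style bias and accuracy factors over paired predictions and observations. As a rough illustration of how such factors are obtained (the exact quantities compared in the study, e.g. growth rates or log counts, are an assumption here, and the values below are placeholders, not the study's data), a minimal sketch:

```python
import numpy as np

def bias_accuracy_factors(predicted, observed):
    """Ross-style bias and accuracy factors for paired, positive-valued
    predictions and observations, compared on a log10 scale.

    A bias factor above 1 means the model over-predicts on average
    (fail-safe when the predicted quantity is growth); the accuracy
    factor measures the average spread regardless of sign.
    """
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    log_ratio = np.log10(predicted / observed)
    bias_factor = 10 ** np.mean(log_ratio)
    accuracy_factor = 10 ** np.mean(np.abs(log_ratio))
    return bias_factor, accuracy_factor

# Illustrative usage with placeholder values only:
bf, af = bias_accuracy_factors([1.2, 2.5, 3.1], [1.0, 2.0, 3.5])
print(f"bias factor = {bf:.2f}, accuracy factor = {af:.2f}")
```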

  11. Representing the acquisition and use of energy by individuals in agent-based models of animal populations

    Science.gov (United States)

    Sibly, Richard M.; Grimm, Volker; Martin, Benjamin T.; Johnston, Alice S.A.; Kulakowska, Katarzyna; Topping, Christopher J.; Calow, Peter; Nabe-Nielsen, Jacob; Thorbek, Pernille; DeAngelis, Donald L.

    2013-01-01

    1. Agent-based models (ABMs) are widely used to predict how populations respond to changing environments. As the availability of food varies in space and time, individuals should have their own energy budgets, but there is no consensus as to how these should be modelled. Here, we use knowledge of physiological ecology to identify major issues confronting the modeller and to make recommendations about how energy budgets for use in ABMs should be constructed. 2. Our proposal is that modelled animals forage as necessary to supply their energy needs for maintenance, growth and reproduction. If there is sufficient energy intake, an animal allocates the energy obtained in the order: maintenance, growth, reproduction, energy storage, until its energy stores reach an optimal level. If there is a shortfall, the priorities for maintenance and growth/reproduction remain the same until reserves fall to a critical threshold below which all are allocated to maintenance. Rates of ingestion and allocation depend on body mass and temperature. We make suggestions for how each of these processes should be modelled mathematically. 3. Mortality rates vary with body mass and temperature according to known relationships, and these can be used to obtain estimates of background mortality rate. 4. If parameter values cannot be obtained directly, then values may provisionally be obtained by parameter borrowing, pattern-oriented modelling, artificial evolution or from allometric equations. 5. The development of ABMs incorporating individual energy budgets is essential for realistic modelling of populations affected by food availability. Such ABMs are already being used to guide conservation planning of nature reserves and shell fisheries, to assess environmental impacts of building proposals including wind farms and highways and to assess the effects on nontarget organisms of chemicals for the control of agricultural pests.
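
    The allocation rule in point 2 is essentially a fixed-priority scheme over energy demands. The sketch below is one possible reading of that rule, not the authors' implementation; only the priority order, the storage target and the critical reserve threshold are taken from the abstract, and the function name, data structure and all numeric values are illustrative assumptions.

```python
def allocate_energy(intake, demands, reserves, reserve_target, critical_reserve):
    """Allocate energy intake in priority order: maintenance, growth,
    reproduction, then storage, as sketched in the abstract.

    `demands` maps 'maintenance', 'growth' and 'reproduction' to the
    energy each requires. If reserves have fallen below the critical
    threshold, everything goes to maintenance. Returns the allocation
    and the updated reserves.
    """
    allocation = {"maintenance": 0.0, "growth": 0.0, "reproduction": 0.0, "storage": 0.0}
    available = intake

    if reserves < critical_reserve:
        # Severe shortfall: all available energy goes to maintenance.
        allocation["maintenance"] = available
        return allocation, reserves

    for item in ("maintenance", "growth", "reproduction"):
        spent = min(available, demands[item])
        allocation[item] = spent
        available -= spent

    # Any surplus tops up the reserves until they reach the optimal level.
    storable = min(available, max(0.0, reserve_target - reserves))
    allocation["storage"] = storable
    reserves += storable
    return allocation, reserves

# Illustrative call with made-up values:
alloc, new_reserves = allocate_energy(
    intake=10.0,
    demands={"maintenance": 4.0, "growth": 3.0, "reproduction": 2.0},
    reserves=5.0, reserve_target=8.0, critical_reserve=2.0)
```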

  12. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-05-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from ordering a book online until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  13. OCEANFILMS-2: Representing coadsorption of saccharides in marine films and potential impacts on modeled marine aerosol chemistry

    Science.gov (United States)

    Burrows, Susannah M.; Gobrogge, Eric; Fu, Li; Link, Katie; Elliott, Scott M.; Wang, Hongfei; Walker, Rob

    2016-08-01

    Here we show that the addition of chemical interactions between soluble monosaccharides and an insoluble lipid surfactant monolayer improves agreement of modeled sea spray chemistry with observed marine aerosol chemistry. In particular, the alkane:hydroxyl mass ratio in modeled sea spray organic matter is reduced from a median of 2.73 to a range of 0.41-0.69, reducing the discrepancy with previous Fourier transform infrared spectroscopy (FTIR) observations of clean marine aerosol (ratio: 0.24-0.38). The overall organic fraction of submicron sea spray also increases, allowing organic mass fractions in the range 0.5-0.7 for submicron sea spray particles over highly active phytoplankton blooms. Sum frequency generation experiments support the modeling approach by demonstrating that soluble monosaccharides can strongly adsorb to a lipid monolayer likely via Coulomb interactions under appropriate conditions. These laboratory findings motivate further research to determine the relevance of coadsorption mechanisms for real-world, sea spray aerosol production.

  14. OCEANFILMS-2: Representing coadsorption of saccharides in marine films and potential impacts on modeled marine aerosol chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Burrows, Susannah M. [Atmospheric Science and Global Change Division, Pacific Northwest National Laboratory, Richland Washington USA; Gobrogge, Eric [Department of Chemistry and Biochemistry, Montana State University, Bozeman Montana USA; Fu, Li [Environmental and Molecular Sciences Laboratory, Pacific Northwest National Laboratory, Richland Washington USA; Link, Katie [Department of Chemistry and Biochemistry, Montana State University, Bozeman Montana USA; Elliott, Scott M. [Climate, Ocean, and Sea Ice Modelling Group, Los Alamos National Laboratory, Los Alamos New Mexico USA; Wang, Hongfei [Environmental and Molecular Sciences Laboratory, Pacific Northwest National Laboratory, Richland Washington USA; Walker, Rob [Department of Chemistry and Biochemistry, Montana State University, Bozeman Montana USA

    2016-08-10

    Here we show that the addition of chemical interactions of soluble polysaccharides with a surfactant monolayer improves agreement of modeled sea spray chemistry with observed marine aerosol chemistry. In particular, the fraction of hydroxyl functional groups in modeled sea spray organic matter is increased, improving agreement with FTIR observations of marine aerosol composition. The overall organic fraction of submicron sea spray also increases, allowing organic mass fractions in the range 0.5-0.7 for submicron sea spray particles over highly active phytoplankton blooms. We show results from Sum Frequency Generation (SFG) experiments that support the modeling approach by demonstrating that soluble polysaccharides can strongly adsorb to a lipid monolayer via coulombic interactions under appropriate conditions.

  15. Representing soakaways in a physically distributed urban drainage model – Upscaling individual allotments to an aggregated scale

    DEFF Research Database (Denmark)

    Roldin, Maria Kerstin; Mark, Ole; Kuczera, George;

    2012-01-01

    The increased load on urban stormwater systems due to climate change and growing urbanization can be partly alleviated by using soakaways and similar infiltration techniques. However, while soakaways are usually small-scale structures, most urban drainage network models operate on a larger spatial scale … of individual soakaways well. Six upscaling methods to aggregate individual soakaway units with varying saturated hydraulic conductivity (K) in the surrounding soil have been investigated. In the upscaled model, the weighted geometric mean hydraulic conductivity of individual allotments is found to provide…
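
    The aggregation found to work best above is a weighted geometric mean of the allotment-scale hydraulic conductivities. A minimal sketch of that calculation follows; using allotment area as the weighting variable is an assumption, since the excerpt does not state which weights were applied, and all values are illustrative.

```python
import numpy as np

def weighted_geometric_mean(values, weights):
    """Weighted geometric mean: exp( sum(w_i * ln(v_i)) / sum(w_i) )."""
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return float(np.exp(np.sum(weights * np.log(values)) / np.sum(weights)))

# Saturated hydraulic conductivities (m/s) of individual allotments and
# illustrative allotment areas (m^2) used as weights:
K = [1e-6, 5e-6, 2e-5, 8e-7]
area = [400.0, 350.0, 500.0, 300.0]
K_eff = weighted_geometric_mean(K, area)
```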

  16. Modelling the Cost and Quality of Preservation Imaging and Archiving

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2009-01-01

    … materials held by national cultural heritage institutions in Denmark, a study was undertaken to provide a generic cost model for digital preservation. The outcome of the study is an activity-based cost model, which accounts for full economic costs. It is structured around the functional descriptions … investigated and specifications based on best practice and testing established. Also, the image quality parameters, which influence the long-term preservation costs, were identified. In addition, the suitability for preservation of different image file formats and compression algorithms was evaluated … in the OAIS Reference Model. The cost model divides the OAIS functions into a hierarchy of cost-critical activities and measurable components, which are implemented as formulas in a spreadsheet. So far the model has only been completed for activities relating to preservation planning and digital migrations…

  17. A stochastic physical system approach to modeling river water quality

    Science.gov (United States)

    Curi, W. F.; Unny, T. E.; Kay, J. J.

    1995-06-01

    In this paper, concepts of network thermodynamics are applied to a river water quality model, which is based on the Streeter-Phelps equations, to identify the corresponding physical components and their topology. Then, the randomness in the parameters, input coefficients and initial conditions is modeled by Gaussian white noises. From the stochastic components of the physical system description of the problem and the concepts of physical system theory, a set of stochastic differential equations can be automatically generated in a computer, and recent developments on the automatic formulation of the moment equations based on Ito calculus can be used. This procedure is illustrated through the solution of an example stochastic river water quality problem, and it is also shown how other related problems with different configurations can be automatically solved in a computer using a single software package.
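
    The deterministic core of such a model is the Streeter-Phelps pair dL/dt = -kd*L and dD/dt = kd*L - ka*D for BOD L and oxygen deficit D. The sketch below shows one simple way to perturb the rate coefficients with Gaussian white noise and integrate with an Euler-Maruyama scheme; it illustrates the general idea only, not the network-thermodynamic formulation or the automatically generated moment equations of the paper, and all parameter values are placeholders.

```python
import numpy as np

def stochastic_streeter_phelps(L0, D0, kd, ka, sigma_kd, sigma_ka, t_end, dt, seed=0):
    """Euler-Maruyama integration of the Streeter-Phelps equations
    dL/dt = -kd*L,  dD/dt = kd*L - ka*D,
    with Gaussian white noise added to the rate coefficients."""
    rng = np.random.default_rng(seed)
    n = int(t_end / dt)
    L, D = np.empty(n + 1), np.empty(n + 1)
    L[0], D[0] = L0, D0
    for i in range(n):
        # White-noise increments scaled by sqrt(dt)
        dW_kd = rng.normal(0.0, np.sqrt(dt))
        dW_ka = rng.normal(0.0, np.sqrt(dt))
        L[i + 1] = L[i] - kd * L[i] * dt - sigma_kd * L[i] * dW_kd
        D[i + 1] = D[i] + (kd * L[i] - ka * D[i]) * dt \
                   + sigma_kd * L[i] * dW_kd - sigma_ka * D[i] * dW_ka
        L[i + 1] = max(L[i + 1], 0.0)
    return L, D

# Example run with illustrative values (rates in 1/day, concentrations in mg/L):
L, D = stochastic_streeter_phelps(L0=10.0, D0=2.0, kd=0.3, ka=0.6,
                                  sigma_kd=0.05, sigma_ka=0.05,
                                  t_end=10.0, dt=0.01)
```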

  18. Modeling and Simulating of Uncertain Quality Abnormity Diagnosis

    Directory of Open Access Journals (Sweden)

    Shiwang Hou

    2013-05-01

    Full Text Available There is much fuzzy, uncertain information in the diagnosis of quality abnormity. An effective model for utilizing this information can provide important decision-making support. In this study, we consider three main types of fuzzy production rules, which can be used in the fuzzy quality abnormity diagnosis problem, and their representation models are constructed by use of Fuzzy Reasoning Petri Nets (FRPNs). Considering the graphic representation and logic structure of FRPNs, we propose a method for simulating the model using the Matlab Stateflow toolbox. By establishing a correspondence between FRPN rules and Stateflow block diagrams, three simulation models for the three corresponding basic FRPN structures are developed. Finally, we give an application case of the proposed model. Taking the place truth degree data of the FRPNs as input, the diagnosis process and results can be shown dynamically in the Stateflow simulation model under the Matlab environment. The results illustrate that the proposed method can give reliable information for process maintenance and the location of abnormal causes.

  19. Representing the acquisition and use of energy by individuals in agent-based models of animal populations

    DEFF Research Database (Denmark)

    Sibly, RS; Grimm, Volker; Johnston, Alice S.A.

    2013-01-01

    Mortality rates vary with body mass and temperature according to known relationships, and these can be used to obtain estimates of background mortality rate. If parameter values cannot be obtained directly, then values may provisionally be obtained by parameter borrowing, pattern-oriented modelling, artificial evolution or from allometric equations. The development of ABMs incorporating individual energy budgets is essential for realistic modelling of populations affected by food availability.

  20. Contribution to Experimental Validation of Linear and Non-Linear Dynamic Models for Representing Rotor-Blade Parametric Coupled Vibrations

    DEFF Research Database (Denmark)

    Santos, Ilmar; Saracho, C.M.; Smith, J.T.

    2004-01-01

    This work gives a theoretical and experimental contribution to the problem of rotor-blade dynamic interaction. A validation procedure for mathematical models is carried out with the help of a simple test rig, built from a mass-spring system attached to four flexible rotating blades. With this test rig,…

  1. Importance of demand modelling in network water quality models: a review

    NARCIS (Netherlands)

    Blokker, E.J.M.; Vreeburg, J.H.G.; Buchberger, S.G.; Van Dijk, J.C.

    2008-01-01

    Today, there is a growing interest in network water quality modelling. The water quality issues of interest relate to both dissolved and particulate substances. For dissolved substances the main interest is in residual chlorine and (microbiological) contaminant propagation; for particulate substance

  3. Towards a more representative parametrisation of hydrological models via synthesizing the strengths of particle swarm optimisation and robust parameter estimation

    Directory of Open Access Journals (Sweden)

    T. Krauße

    2011-03-01

    Full Text Available The development of methods for estimating the parameters of hydrological models considering uncertainties has been of high interest in hydrological research over the last years. In particular, methods which understand the estimation of hydrological model parameters as a geometric search for a set of robustly performing parameter vectors by application of the concept of data depth have found growing research interest. Bárdossy and Singh (2008) presented a first proposal and applied it for the calibration of a conceptual rainfall-runoff model with daily time step. Krauße and Cullmann (2011) further developed this method and applied it in a case study to calibrate a process-oriented hydrological model with hourly time step, focussing on flood events in a fast-responding catchment. The results of both studies showed the potential of the application of the principle of data depth. However, the weak point of the presented approach also became obvious. The algorithm identifies a set of model parameter vectors with high model performance and subsequently generates a set of parameter vectors with high data depth with respect to the first set. Both steps are repeated iteratively until a stopping criterion is met. In the first step, the estimation of the good parameter vectors is based on the Monte Carlo method. The major shortcoming of this method is that it requires a number of samples that grows exponentially with the dimensionality of the problem. In this paper we present another robust parameter estimation strategy which applies an established search strategy for high-dimensional parameter spaces, particle swarm optimisation, in order to identify a set of good parameter vectors with given uncertainty bounds. The generation of deep parameters follows Krauße and Cullmann (2011). The method was compared to the Monte Carlo based robust parameter estimation algorithm on the example of a case study in Krauße and Cullmann (2011) to…
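
    The first step of the proposed strategy replaces the Monte Carlo search for well-performing parameter vectors with particle swarm optimisation. The sketch below shows a generic box-bounded PSO loop minimising a placeholder objective (for a hydrological model this would typically be something like 1 minus the Nash-Sutcliffe efficiency); it illustrates the search strategy only and is not the authors' ROPE code, and all parameter values are assumptions.

```python
import numpy as np

def pso_minimise(objective, bounds, n_particles=30, n_iter=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimisation over box-bounded parameters.

    bounds: sequence of (lower, upper) limits, one pair per parameter.
    Returns the best parameter vector found and its objective value.
    """
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    dim = bounds.shape[0]
    lo, hi = bounds[:, 0], bounds[:, 1]

    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    gbest_val = pbest_val.min()

    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Inertia + cognitive pull towards personal best + social pull towards global best
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        if vals.min() < gbest_val:
            gbest, gbest_val = pos[vals.argmin()].copy(), vals.min()
    return gbest, gbest_val

# Placeholder objective standing in for a model-performance misfit:
best, best_val = pso_minimise(lambda p: np.sum((p - 0.3) ** 2), bounds=[[0, 1]] * 4)
```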

  4. Application of Extended Kalman Filter to the Modeling of Electric Arc Furnace for Power Quality Issues

    Institute of Scientific and Technical Information of China (English)

    JIN Zhi-jian; WANG Feng-hua; ZHU Zi-shu

    2007-01-01

    Electric arc furnaces (EAFs) represent one of the most disturbing loads in subtransmission or transmission electric power systems. Therefore, it is necessary to build a practical model to describe the behavior of an EAF when simulating power systems for power quality issues. This paper deals with the modeling of an EAF based on the combination of an extended Kalman filter, used to identify the parameters of the arc current, with the power balance equation, used to obtain the dynamic, multi-valued u-i characteristics of the EAF load. The whole EAF system is simulated by means of the Power System Blockset in Matlab to validate the proposed EAF model. The model can also be used to assess the impact of new plants or highly varying nonlinear loads that exhibit chaos in power systems.
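
    The parameter-identification step mentioned above relies on an extended Kalman filter. As a generic reminder of what one EKF predict/update cycle looks like (this is not the specific arc-current state model of the paper; the state, measurement functions and noise covariances below are placeholders), a minimal sketch:

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One predict/update cycle of an extended Kalman filter.

    x, P        : prior state estimate and covariance
    z           : new measurement
    f, h        : state-transition and measurement functions
    F_jac, H_jac: their Jacobians evaluated at the current estimate
    Q, R        : process and measurement noise covariances
    """
    # Predict
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # Update
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Illustrative use: track a slowly varying scalar parameter from noisy readings.
x, P = np.array([0.0]), np.eye(1)
f = lambda x: x                      # random-walk state model (placeholder)
F_jac = lambda x: np.eye(1)
h = lambda x: x                      # direct measurement (placeholder)
H_jac = lambda x: np.eye(1)
Q, R = 1e-4 * np.eye(1), 0.01 * np.eye(1)
for z in [0.9, 1.1, 1.05, 0.95]:
    x, P = ekf_step(x, P, np.array([z]), f, F_jac, h, H_jac, Q, R)
```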

  5. Underground water quality model inversion of genetic algorithm

    Institute of Scientific and Technical Information of China (English)

    MA Ruijie; LI Xin

    2009-01-01

    The non-linear inversion problem for the underground water quality model is ill-posed and reduces to finding the minimum of a nonlinear function. A genetic algorithm searches iteratively over a population of individuals, taking the encoded strings as its operational objects and carrying out the iterative calculations through genetic operators, to find the optimal solution of the problem. It is an effective method for groundwater inverse problems, with notable advantages and practical significance.
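
    The inversion described reduces to minimising a misfit function over encoded parameter vectors with a genetic algorithm. The sketch below is a generic real-coded GA (tournament selection, arithmetic crossover, Gaussian mutation) applied to a placeholder objective; the encoding, operators and parameter values are illustrative assumptions, not those of the paper.

```python
import numpy as np

def ga_minimise(objective, bounds, pop_size=40, n_gen=100,
                crossover_rate=0.9, mutation_rate=0.1, seed=0):
    """Minimal real-coded genetic algorithm with tournament selection,
    arithmetic crossover and Gaussian mutation over box-bounded parameters."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))

    def tournament(fitness):
        # Pick the fitter of two random individuals.
        i, j = rng.integers(pop_size, size=2)
        return i if fitness[i] < fitness[j] else j

    best, best_val = None, np.inf
    for _ in range(n_gen):
        fitness = np.array([objective(ind) for ind in pop])
        if fitness.min() < best_val:
            best_val = fitness.min()
            best = pop[fitness.argmin()].copy()
        children = []
        while len(children) < pop_size:
            a, b = pop[tournament(fitness)], pop[tournament(fitness)]
            if rng.random() < crossover_rate:
                alpha = rng.random()
                a, b = alpha * a + (1 - alpha) * b, alpha * b + (1 - alpha) * a
            for child in (a, b):
                mask = rng.random(dim) < mutation_rate
                child = child + mask * rng.normal(0.0, 0.1 * (hi - lo), dim)
                children.append(np.clip(child, lo, hi))
        pop = np.array(children[:pop_size])
    return best, best_val

# Placeholder misfit standing in for simulated-minus-observed concentrations:
best, val = ga_minimise(lambda p: np.sum((p - 0.5) ** 2), bounds=[[0, 1]] * 3)
```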

  6. Quality of Methods Reporting in Animal Models of Colitis

    Science.gov (United States)

    Bramhall, Michael; Flórez-Vargas, Oscar; Stevens, Robert; Brass, Andy

    2015-01-01

    Background: Current understanding of the onset of inflammatory bowel diseases relies heavily on data derived from animal models of colitis. However, the omission of information concerning the method used makes the interpretation of studies difficult or impossible. We assessed the current quality of methods reporting in 4 animal models of colitis that are used to inform clinical research into inflammatory bowel disease: dextran sulfate sodium, interleukin-10−/−, CD45RBhigh T cell transfer, and 2,4,6-trinitrobenzene sulfonic acid (TNBS). Methods: We performed a systematic review based on PRISMA guidelines, using a PubMed search (2000–2014) to obtain publications that used a microarray to describe gene expression in colitic tissue. Methods reporting quality was scored against a checklist of essential and desirable criteria. Results: Fifty-eight articles were identified and included in this review (29 dextran sulfate sodium, 15 interleukin-10−/−, 5 T cell transfer, and 16 TNBS; some articles use more than 1 colitis model). A mean of 81.7% (SD = ±7.038) of criteria were reported across all models. Only 1 of the 58 articles reported all essential criteria on our checklist. Animal age, gender, housing conditions, and mortality/morbidity were all poorly reported. Conclusions: Failure to include all essential criteria is a cause for concern; this failure can have a large impact on the quality and replicability of published colitis experiments. We recommend adoption of our checklist as a requirement for publication to improve the quality, comparability, and standardization of colitis studies, which will make interpretation and translation of data to human disease more reliable. PMID:25989337

  7. The EDEN-IW ontology model for sharing knowledge and water quality data between heterogenous databases

    DEFF Research Database (Denmark)

    Stjernholm, M.; Poslad, S.; Zuo, L.

    2004-01-01

    The Environmental Data Exchange Network for Inland Water (EDEN-IW) project's main aim is to develop a system for making disparate and heterogeneous databases of inland water quality more accessible to users. The core technology is based upon a combination of an ontological model to represent … successfully demonstrated the use of our systems to semantically integrate two main database resources from IOW and NERI; these are available on-line. We are in the process of adding further databases and supporting a wider variety of user queries, such as Decision Support System queries.

  8. Benchmarking consensus model quality assessment for protein fold recognition

    Directory of Open Access Journals (Sweden)

    McGuffin Liam J

    2007-09-01

    Full Text Available Abstract Background Selecting the highest quality 3D model of a protein structure from a number of alternatives remains an important challenge in the field of structural bioinformatics. Many Model Quality Assessment Programs (MQAPs) have been developed which adopt various strategies in order to tackle this problem, ranging from the so-called "true" MQAPs capable of producing a single energy score based on a single model, to methods which rely on structural comparisons of multiple models or additional information from meta-servers. However, it is clear that no current method can separate the highest accuracy models from the lowest consistently. In this paper, a number of the top performing MQAP methods are benchmarked in the context of the potential value that they add to protein fold recognition. Two novel methods are also described: ModSSEA, which is based on the alignment of predicted secondary structure elements, and ModFOLD, which combines several true MQAP methods using an artificial neural network. Results The ModSSEA method is found to be an effective model quality assessment program for ranking multiple models from many servers, however further accuracy can be gained by using the consensus approach of ModFOLD. The ModFOLD method is shown to significantly outperform the true MQAPs tested and is competitive with methods which make use of clustering or additional information from multiple servers. Several of the true MQAPs are also shown to add value to most individual fold recognition servers by improving model selection, when applied as a post filter in order to re-rank models. Conclusion MQAPs should be benchmarked appropriately for the practical context in which they are intended to be used. Clustering based methods are the top performing MQAPs where many models are available from many servers; however, they often do not add value to individual fold recognition servers when limited models are available. Conversely, the true MQAP methods…
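
    Clustering- or consensus-based MQAPs of the kind benchmarked here typically score each candidate model by its average structural similarity to all other models in the set. The sketch below is a generic illustration of that scoring, not the ModSSEA or ModFOLD algorithm; the pairwise similarity function is left as a placeholder (in practice something like a TM-score or GDT-TS comparison).

```python
import numpy as np

def consensus_scores(models, pairwise_similarity):
    """Score each model by its mean similarity to every other model.

    `pairwise_similarity(a, b)` should return a structural similarity
    in [0, 1]; models with higher consensus scores rank higher.
    """
    n = len(models)
    sim = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            sim[i, j] = sim[j, i] = pairwise_similarity(models[i], models[j])
    # Average over the other n-1 models.
    return sim.sum(axis=1) / (n - 1)
```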

  9. AQA - Air Quality model for Austria - Evaluation and Developments

    Science.gov (United States)

    Hirtl, M.; Krüger, B. C.; Baumann-Stanzer, K.; Skomorowski, P.

    2009-04-01

    The regional weather forecast model ALADIN of the Central Institute for Meteorology and Geodynamics (ZAMG) is used in combination with the chemical transport model CAMx (www.camx.com) to conduct forecasts of gaseous and particulate air pollution over Europe. The forecasts, which are done in cooperation with the University of Natural Resources and Applied Life Sciences in Vienna (BOKU), have been supported by the regional governments since 2005, with the main interest being the prediction of tropospheric ozone. The daily ozone forecasts are evaluated for summer 2008 against the observations of about 150 air quality stations in Austria. In 2008 the emission model SMOKE was integrated into the modelling system to calculate the biogenic emissions. The anthropogenic emissions are based on the newest EMEP data set as well as on regional inventories for the core domain. The performance of SMOKE is shown for a summer period in 2007. Within the framework of the COST action 728 „Enhancing mesoscale meteorological modelling capabilities for air pollution and dispersion applications", multi-model ensembles are used to conduct an international model evaluation. The model calculations of meteorological and concentration fields are compared to measurements on the ensemble platform at the Joint Research Centre (JRC) in Ispra. The results for 2 episodes in 2006 show the performance of the different models as well as of the model ensemble.

  10. Towards a more representative parametrisation of hydrologic models via synthesizing the strengths of Particle Swarm Optimisation and Robust Parameter Estimation

    Directory of Open Access Journals (Sweden)

    T. Krauße

    2012-02-01

    Full Text Available The development of methods for estimating the parameters of hydrologic models considering uncertainties has been of high interest in hydrologic research over the last years. In particular, methods which understand the estimation of hydrologic model parameters as a geometric search for a set of robustly performing parameter vectors by application of the concept of data depth have found growing research interest. Bárdossy and Singh (2008) presented a first Robust Parameter Estimation method (ROPE) and applied it for the calibration of a conceptual rainfall-runoff model with daily time step. The basic idea of this algorithm is to identify a set of model parameter vectors with high model performance, called good parameters, and subsequently generate a set of parameter vectors with high data depth with respect to the first set. Both steps are repeated iteratively until a stopping criterion is met. The results estimated in this case study show the high potential of the principle of data depth to be used for the estimation of hydrologic model parameters. In this paper we present some further developments that address the most important shortcomings of the original ROPE approach. We developed a stratified depth-based sampling approach that improves the sampling from non-elliptic and multi-modal distributions. It provides a higher efficiency for the sampling of deep points in parameter spaces with higher dimensionality. Another modification addresses the problem of excessive shrinking of the estimated set of robust parameter vectors, which might lead to overfitting when the model is calibrated with a small amount of calibration data. This would contradict the principle of robustness. Therefore, we suggest splitting the available calibration data into two sets and using one set to control the overfitting. All modifications were implemented into a further developed ROPE approach that is called Advanced Robust Parameter Estimation (AROPE). However, in this approach the estimation of…

  11. [Prediction of PCBs uptake by vegetable in a representative area and evaluation of the human health risk by Trapp model].

    Science.gov (United States)

    Deng, Shao-Po; Luo, Yong-Ming; Song, Jing; Teng, Ying; Chen, Yong-Shan

    2010-12-01

    Air, soil and vegetable samples were collected from an e-waste disassembly site and analyzed for the characteristic contaminants, PCBs. Based on the measured PCB concentrations in soil and air, PCB concentrations in leafy vegetables were predicted by the Trapp model, and the sources, composition of PCBs in vegetables and influencing factors were analyzed. Using the human health risk assessment model of the USEPA, the risk to human health from consumption of vegetables that take up PCBs from the environment was evaluated. The results showed that the Trapp model could give good predictions of PCB concentrations in leafy vegetables based on the PCB concentrations in the soil and air. For instance, the measured sum of seven PCBs in vegetables was 51.2 microg x kg(-1) and the predicted value was 39.9 microg x kg(-1), so the predicted value agrees well with the measured value. Gaseous PCBs were the main source of PCBs in leafy vegetables, and the model prediction results indicated that the contribution rate was as high as 98.8%. The uptake pathway, the n-octanol/water partition coefficient (K(ow)) and the n-octanol/air partition coefficient (K(oa)) of PCBs determine the concentration and composition of PCBs in vegetables. The duration needed for PCB uptake to reach equilibrium was well correlated with lgK(ow) and lgK(oa), and multiple linear regression analysis indicated that lgK(oa) was more important. The carcinogenic risk from consumption of PCB-contaminated vegetables was 10 000 times higher than that from gaseous PCBs, and the non-carcinogenic risk was increased by approximately 200 times. The main reasons are, firstly, that the vegetables take up and accumulate more toxic, highly chlorinated PCBs, so the oral toxicity factors of the PCBs increase dramatically, and secondly, that an adult takes in 71 times more PCBs via consumption of vegetables than via inhalation of air.

  12. Development and Implementation of a Transversely Isotropic Hyperelastic Constitutive Model With Two Fiber Families to Represent Anisotropic Soft Biological Tissues

    Science.gov (United States)

    2014-06-01

    [Only figure-caption fragments are available for this record: vertebrae are grouped by spinal region (cervical, thoracic or lumbar) and numbered consecutively starting from the most superior vertebra in that region; fiber angles in the plane of the intervertebral disc have been used by researchers to model the fibers of the annulus fibrosus (1, 18-20); the figure panels show vertebrae color-coded by location classification and a not-to-scale illustration.]

  13. A NEW COMBINED LOCAL AND NON-LOCAL PBL MODEL FOR METEOROLOGY AND AIR QUALITY MODELING

    Science.gov (United States)

    A new version of the Asymmetric Convective Model (ACM) has been developed to describe sub-grid vertical turbulent transport in both meteorology models and air quality models. The new version (ACM2) combines the non-local convective mixing of the original ACM with local eddy diff...

  15. Mathematical models of magnetite desliming for automated quality control systems

    Science.gov (United States)

    Olevska, Yu.; Mishchenko, V.; Olevskyi, V.

    2016-10-01

    The aim of the study is to provide multifactor mathematical models suitable for use in automatic control systems for the desliming process. For this purpose we described the motion of a two-phase medium with regard to the shape of the desliming machine and the technological parameters of the enrichment process. We created a method for deriving the dependence of enrichment process quality on the technological and design parameters. To automate the process we constructed mathematical models to justify intensive technological modes and optimal parameters for the design of the desliming machine.

  16. Boundary-layer turbulent processes and mesoscale variability represented by numerical weather prediction models during the BLLAST campaign

    Science.gov (United States)

    Couvreux, Fleur; Bazile, Eric; Canut, Guylaine; Seity, Yann; Lothon, Marie; Lohou, Fabienne; Guichard, Françoise; Nilsson, Erik

    2016-07-01

    This study evaluates the ability of three operational models, with resolution varying from 2.5 to 16 km, to predict the boundary-layer turbulent processes and mesoscale variability observed during the Boundary Layer Late-Afternoon and Sunset Turbulence (BLLAST) field campaign. We analyse the representation of the vertical profiles of temperature and humidity and the time evolution of near-surface atmospheric variables and the radiative and turbulent fluxes over a total of 12 intensive observing periods (IOPs), each lasting 24 h. Special attention is paid to the evolution of the turbulent kinetic energy (TKE), which was sampled by a combination of independent instruments. For the first time, this variable, a central one in the turbulence scheme used in AROME and ARPEGE, is evaluated with observations. In general, the 24 h forecasts succeed in reproducing the variability from one day to another in terms of cloud cover, temperature and boundary-layer depth. However, they exhibit some systematic biases, in particular a cold bias within the daytime boundary layer for all models. An overestimation of the sensible heat flux is noted for two points in ARPEGE and is found to be partly related to an inaccurate simplification of surface characteristics. AROME shows a moist bias within the daytime boundary layer, which is consistent with overestimated latent heat fluxes. ECMWF presents a dry bias at 2 m above the surface and also overestimates the sensible heat flux. The high-resolution model AROME resolves the vertical structures better, in particular the strong daytime inversion and the thin evening stable boundary layer. This model is also able to capture some specific observed features, such as the orographically driven subsidence and a well-defined maximum that arises during the evening of the water vapour mixing ratio in the upper part of the residual layer due to fine-scale advection. The model reproduces the order of magnitude of spatial variability observed at…

  17. Water quality modelling for ephemeral rivers: Model development and parameter assessment

    Science.gov (United States)

    Mannina, Giorgio; Viviani, Gaspare

    2010-11-01

    River water quality models can be valuable tools for the assessment and management of receiving water body quality. However, such water quality models require accurate model calibration in order to specify model parameters. Reliable model calibration requires an extensive array of water quality data that are generally rare and resource-intensive, both economically and in terms of human resources, to collect. In the case of small rivers, such data are scarce due to the fact that these rivers are generally considered too insignificant, from a practical and economic viewpoint, to justify the investment of such considerable time and resources. As a consequence, the literature contains very few studies on the water quality modelling for small rivers, and such studies as have been published are fairly limited in scope. In this paper, a simplified river water quality model is presented. The model is an extension of the Streeter-Phelps model and takes into account the physico-chemical and biological processes most relevant to modelling the quality of receiving water bodies (i.e., degradation of dissolved carbonaceous substances, ammonium oxidation, algal uptake and denitrification, dissolved oxygen balance, including depletion by degradation processes and supply by physical reaeration and photosynthetic production). The model has been applied to an Italian case study, the Oreto river (IT), which has been the object of an Italian research project aimed at assessing the river's water quality. For this reason, several monitoring campaigns have been previously carried out in order to collect water quantity and quality data on this river system. In particular, twelve river cross sections were monitored, and both flow and water quality data were collected for each cross section. The results of the calibrated model show satisfactory agreement with the measured data and results reveal important differences between the parameters used to model small rivers as compared to…

  18. Biofuels and water quality: challenges and opportunities for simulation modeling

    Energy Technology Data Exchange (ETDEWEB)

    Engel, Bernard A. [Purdue University; Chaubey, Indrajeet [Purdue University; Thomas, Mark [Purdue University; Saraswat, Dharmendra [University of Arkansas; Murphy, Patrick [Purdue University; Bhaduri, Budhendra L [ORNL

    2010-01-01

    Quantification of the various impacts of biofuel feedstock production on hydrology and water quality is complex. Mathematical models can be used to efficiently evaluate various 'what if' scenarios related to biofeedstock production and their impacts on hydrology and water quality at various spatial and temporal scales. Currently available models, although having the potential to serve such purposes, have many limitations. In this paper, we review the strengths and weaknesses of such models in light of short- and long-term biofeedstock production scenarios. The representation of processes in the currently available models, and how these processes need to be modified to fully evaluate various complex biofeedstock production scenarios, are discussed. Similarly, issues related to the availability of data that are needed to parameterize and evaluate these models are presented. We have presented a vision for the development of decision support tools and ecosystem services that can be used to make watershed management decisions to minimize any potentially adverse environmental impacts while meeting biofeedstock demands. We also discuss a case study of biofeedstock impact simulation in relation to watershed management policy implications for various state and federal agencies in the USA.

  19. Full mtGenome reference data: development and characterization of 588 forensic-quality haplotypes representing three U.S. populations.

    Science.gov (United States)

    Just, Rebecca S; Scheible, Melissa K; Fast, Spence A; Sturk-Andreaggi, Kimberly; Röck, Alexander W; Bush, Jocelyn M; Higginbotham, Jennifer L; Peck, Michelle A; Ring, Joseph D; Huber, Gabriela E; Xavier, Catarina; Strobl, Christina; Lyons, Elizabeth A; Diegoli, Toni M; Bodner, Martin; Fendt, Liane; Kralj, Petra; Nagl, Simone; Niederwieser, Daniela; Zimmermann, Bettina; Parson, Walther; Irwin, Jodi A

    2015-01-01

    Though investigations into the use of massively parallel sequencing technologies for the generation of complete mitochondrial genome (mtGenome) profiles from difficult forensic specimens are well underway in multiple laboratories, the high quality population reference data necessary to support full mtGenome typing in the forensic context are lacking. To address this deficiency, we have developed 588 complete mtGenome haplotypes, spanning three U.S. population groups (African American, Caucasian and Hispanic) from anonymized, randomly-sampled specimens. Data production utilized an 8-amplicon, 135 sequencing reaction Sanger-based protocol, performed in semi-automated fashion on robotic instrumentation. Data review followed an intensive multi-step strategy that included a minimum of three independent reviews of the raw data at two laboratories; repeat screenings of all insertions, deletions, heteroplasmies, transversions and any additional private mutations; and a check for phylogenetic feasibility. For all three populations, nearly complete resolution of the haplotypes was achieved with full mtGenome sequences: 90.3-98.8% of haplotypes were unique per population, an improvement of 7.7-29.2% over control region sequencing alone, and zero haplotypes overlapped between populations. Inferred maternal biogeographic ancestry frequencies for each population and heteroplasmy rates in the control region were generally consistent with published datasets. In the coding region, nearly 90% of individuals exhibited length heteroplasmy in the 12418-12425 adenine homopolymer; and despite a relatively high rate of point heteroplasmy (23.8% of individuals across the entire molecule), coding region point heteroplasmies shared by more than one individual were notably absent, and transversion-type heteroplasmies were extremely rare. The ratio of nonsynonymous to synonymous changes among point heteroplasmies in the protein-coding genes (1:1.3) and average pathogenicity scores in

  20. Modelling the impacts of agricultural management practices on river water quality in Eastern England.

    Science.gov (United States)

    Taylor, Sam D; He, Yi; Hiscock, Kevin M

    2016-09-15

    Agricultural diffuse water pollution remains a notable global pressure on water quality, posing risks to aquatic ecosystems, human health and water resources and as a result legislation has been introduced in many parts of the world to protect water bodies. Due to their efficiency and cost-effectiveness, water quality models have been increasingly applied to catchments as Decision Support Tools (DSTs) to identify mitigation options that can be introduced to reduce agricultural diffuse water pollution and improve water quality. In this study, the Soil and Water Assessment Tool (SWAT) was applied to the River Wensum catchment in eastern England with the aim of quantifying the long-term impacts of potential changes to agricultural management practices on river water quality. Calibration and validation were successfully performed at a daily time-step against observations of discharge, nitrate and total phosphorus obtained from high-frequency water quality monitoring within the Blackwater sub-catchment, covering an area of 19.6 km(2). A variety of mitigation options were identified and modelled, both singly and in combination, and their long-term effects on nitrate and total phosphorus losses were quantified together with the 95% uncertainty range of model predictions. Results showed that introducing a red clover cover crop to the crop rotation scheme applied within the catchment reduced nitrate losses by 19.6%. Buffer strips of 2 m and 6 m width represented the most effective options to reduce total phosphorus losses, achieving reductions of 12.2% and 16.9%, respectively. This is one of the first studies to quantify the impacts of agricultural mitigation options on long-term water quality for nitrate and total phosphorus at a daily resolution, in addition to providing an estimate of the uncertainties of those impacts. The results highlighted the need to consider multiple pollutants, the degree of uncertainty associated with model predictions and the risk of

  1. NEW MODEL FOR EVALUATION OF THE PERCEIVED IMAGE QUALITY BY SMARTPHONE USERS

    Directory of Open Access Journals (Sweden)

    Pinchas ZOREA

    2015-12-01

    Full Text Available Mobile devices, like smartphones and tablet computers, have become an essential part of our life. Image quality assessment plays an important role in various image processing applications. A great deal of effort has been made in recent years to develop "objective" image quality metrics that correlate with perceived quality measurement. Unfortunately, only limited success has been achieved. In this paper, I provide a quantitative method to evaluate the perceived image quality of color images on mobile displays. Five image quality factors - Vividness, Brightness, Clarity, Sharpness and Contrast - were chosen to represent perceived image quality. Image quality assessment models are constructed based on results of human visual experiments compared with image analysis by a software tool. Values of the parameters of the image quality assessment models are estimated from the results of the human visual experiments, and a new model is proposed based on the human visual tests and computer image analysis.

  2. An annual assessment of air quality with the CALIOPE modeling system over Spain.

    Science.gov (United States)

    Baldasano, J M; Pay, M T; Jorba, O; Gassó, S; Jiménez-Guerrero, P

    2011-05-01

    The CALIOPE project, funded by the Spanish Ministry of the Environment, aims at establishing an air quality forecasting system for Spain. With this goal, CALIOPE modeling system was developed and applied with high resolution (4km×4km, 1h) using the HERMES emission model (including emissions of resuspended particles from paved roads) specifically built up for Spain. The present study provides an evaluation and the assessment of the modeling system, coupling WRF-ARW/HERMES/CMAQ/BSC-DREAM8b for a full-year simulation in 2004 over Spain. The evaluation focuses on the capability of the model to reproduce the temporal and spatial distribution of gas phase species (NO(2), O(3), and SO(2)) and particulate matter (PM10) against ground-based measurements from the Spanish air quality monitoring network. The evaluation of the modeling results on an hourly basis shows a strong dependency of the performance of the model on the type of environment (urban, suburban and rural) and the dominant emission sources (traffic, industrial, and background). The O(3) chemistry is best represented in summer, when mean hourly variability and high peaks are generally well reproduced. The mean normalized error and bias meet the recommendations proposed by the United States Environmental Protection Agency (US-EPA) and the European regulations. Modeled O(3) shows higher performance for urban than for rural stations, especially at traffic stations in large cities, since stations influenced by traffic emissions (i.e., high-NO(x) environments) are better characterized with a more pronounced daily variability. NO(x)/O(3) chemistry is better represented under non-limited-NO(2) regimes. SO(2) is mainly produced from isolated point sources (power generation and transformation industries) which generate large plumes of high SO(2) concentration affecting the air quality on a local to national scale where the meteorological pattern is crucial. The contribution of mineral dust from the Sahara desert through
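
    The US-EPA performance goals referred to above are commonly checked with the mean normalized bias and mean normalized gross error over paired hourly model and observation values. A minimal sketch of these two statistics follows; the observation cut-off threshold is an assumption here, as the exact screening criteria used in the study are not stated, and the example values are placeholders rather than data from the study.

```python
import numpy as np

def mean_normalized_bias_and_error(model, obs, threshold=0.0):
    """Mean normalized bias (MNB) and mean normalized gross error (MNGE), in %.

    Pairs with observations at or below `threshold` are excluded, mirroring
    the common practice of applying a concentration cut-off for ozone.
    """
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    mask = obs > threshold
    rel = (model[mask] - obs[mask]) / obs[mask]
    mnb = 100.0 * rel.mean()
    mnge = 100.0 * np.abs(rel).mean()
    return mnb, mnge

# Illustrative hourly O3 values (placeholders only):
mnb, mnge = mean_normalized_bias_and_error([80, 95, 110], [85, 90, 120], threshold=60)
```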

  3. Constraining Water Quality Models With Electrical Resistivity Tomography (ERT)

    Science.gov (United States)

    Bentley, L. R.; Gharibi, M.; Mrklas, O.; Lunn, S. D.

    2001-12-01

    Water quality models are difficult to constrain with piezometer data alone because the data are spatially sparse. Since the electrical conductivity (EC) of water is often correlated with water quality, geophysical measurements of electrical conductivity may provide densely sampled secondary data for constraining water quality models. We present a quantitative protocol for interpreting EC derived from surface ERT results. A standard temperature is selected that is in the range of the in situ field temperatures, and laboratory measurements establish a functional relationship between water EC and temperature. Total meq/l of charge is often strongly correlated with water EC at the standard temperature. Laboratory data are used to develop a correlation model between indicator parameters or water chemistry evolution and total meq/l of charge. Since the solid phase may contain a conductive clay fraction, a site-specific calibrated Waxman-Smits rock physics model is used to estimate groundwater EC from bulk EC derived from ERT inversions. The groundwater EC at in situ temperature is converted to EC at the standard temperature, and the total meq/l is estimated using the laboratory-established correlation. The estimated meq/l can be used as soft information to map the distribution of water quality or to estimate changes to water chemistry with time. We apply the analysis to a decommissioned sour gas plant undergoing remediation. Background bulk EC is high (50 to 100 mS/m) due to the clay content of tills. The highest values of groundwater EC are mainly due to acetic acid, which is a degradation product of amines and glycols. Acetic acid degrades readily under aerobic conditions, lowering the EC of pore waters. The calibrated Waxman-Smits model predicts that a reduction of groundwater EC from 1600 mS/m to 800 mS/m will result in a reduction of bulk EC from 150 mS/m to 110 mS/m. Groundwater EC values both increase and decrease with time due to site heterogeneity, and
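
    The conversion chain described above, from ERT-derived bulk EC to groundwater EC referred to a standard temperature, can be illustrated with a simplified, fully saturated Waxman-Smits relation, sigma_bulk = (sigma_w + B*Qv)/F, plus a linear temperature compensation. The sketch below is a generic illustration under those simplifying assumptions; the calibrated site-specific model in the study may differ in form, and all parameter values shown are placeholders.

```python
def water_ec_from_bulk(sigma_bulk, F, B_Qv):
    """Invert a simplified, fully saturated Waxman-Smits relation
    sigma_bulk = (sigma_w + B*Qv) / F  for the pore-water conductivity.
    F is the formation factor; B_Qv lumps the clay surface-conduction term."""
    return F * sigma_bulk - B_Qv

def ec_to_standard_temperature(sigma_w, temp_c, temp_ref=25.0, alpha=0.02):
    """Linear temperature compensation of water EC (roughly 2 % per degC is a
    common default); returns EC referred to `temp_ref`."""
    return sigma_w / (1.0 + alpha * (temp_c - temp_ref))

# Illustrative values (mS/m), not those of the site study:
sigma_w_insitu = water_ec_from_bulk(sigma_bulk=120.0, F=8.0, B_Qv=150.0)
sigma_w_std = ec_to_standard_temperature(sigma_w_insitu, temp_c=8.0)
```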

  4. Identifying the representative flow unit for capillary dominated two-phase flow in porous media using morphology-based pore-scale modeling

    Science.gov (United States)

    Mu, Yaoming; Sungkorn, Radompon; Toelke, Jonas

    2016-09-01

    In this paper, we extend pore-morphology-based methods proposed by Hazlett (1995) and Hilpert and Miller (2001) to simulate drainage and imbibition in uniformly wetting porous media and add an (optional) entrapment of the (non-)wetting phase. By improving implementation, this method allows us to identify the statistical representative elementary volume and estimate uncertainty by computing fluid flow properties and saturation distributions of hundreds of subsamples within a reasonable time-frame. The method was utilized to study three different porous medium systems and results demonstrate that morphology-based pore-scale modeling is a viable approach to assess the representative elementary volume with respect to capillary dominated two-phase flow. The focus of this paper is the determination of the representative elementary volume for multiphase-flow properties for a digital representation of a rock.

  5. The QOL-DASS Model to Estimate Overall Quality of Life and General Health

    Directory of Open Access Journals (Sweden)

    Mehrdad Mazaheri

    2011-01-01

    Full Text Available "n Objective: In order to find how rating the WHOQOL-BREF and DASS scales are combined to produce an overall measure of quality of life and satisfaction with health rating, a QOL-DASS model was designed ; and the strength of this hypothesized model was examined using the structural equation modeling "n "nMethod: Participants included a sample of 103 voluntary males who were divided into two groups of unhealthy (N=55 and healthy (N=48. To assess satisfaction and negative  emotions of depression, anxiety and stress among the participants, they were asked to fill out the WHOQOLBREF and The Depression Anxiety Stress Scale (DASS-42. "nResults: Our findings on running the hypothesized model of QOL-DASS indicated that the proposed model of QOL-DASS fitted the data well for the both healthy and unhealthy groups "nConclusion: Our findings with CFA to evaluate the hypothesized model of QOL-DASS indicated that the different satisfaction domain ratings and the negative emotions of depression, anxiety and stress as the observed variables can represent the underlying constructs of general health and quality of life on both healthy and unhealthy groups.

  6. Errors in the Bag Model of Strings, and Regge Trajectories Represent the Conservation of Angular Momentum in Hyperbolic Space

    CERN Document Server

    Lavenda, B H

    2011-01-01

    The MIT bag model is shown to be wrong because the bag pressure cannot be held constant, and the volume can be fixed in terms of it. The bag derivation of Regge's trajectories is invalidated by an integration of the energy and angular momentum over all values of the radius up to $r_0=c/\omega$. This gives the absurd result that "total" angular momentum decreases as the frequency increases. The correct expression for the angular momentum is obtained from hyperbolic geometry of constant negative curvature $r_0$. When the square of the relativistic mass is introduced, it gives a negative intercept which is the Euclidean value of the angular momentum. Regge trajectories are simply statements of the conservation of angular momentum in hyperbolic space. The frequencies and values of the angular momentum are in remarkable agreement with experiment.

  7. Towards a mechanical failure model for degrading permafrost rock slopes representing changes in rock toughness and infill

    Science.gov (United States)

    Mamot, Philipp; Krautblatter, Michael; Scandroglio, Riccardo

    2016-04-01

    The climate-induced degradation of permafrost in mountain areas can reduce the stability of rock slopes. An increasing number of rockfalls and rockslides originate from permafrost-affected rock faces. Discontinuity patterns and their geometrical and mechanical properties play a decisive role in controlling rock slope stability. Under thawing conditions the shear resistance of rock is reduced due to lower friction along rock-rock contacts, decreasing fracture toughness of rock-ice contacts, diminishing fracture toughness of cohesive rock bridges and altered creep or fracture of the ice itself. Compressive strength is reduced by 20 to 50 % and tensile strength decreases by 15 to 70 % when intact saturated rock thaws (Krautblatter et al. 2013). Elevated water pressures in fractures can lead to reduced effective normal stresses and thus to lower shear strengths of fractures. However, the impact of degrading permafrost on the mechanical properties of intact or fractured rock still remains poorly understood. In this study, we develop a new approach for modeling the influence of degrading permafrost on the stability of high mountain rock slopes. Here, we focus on the effect of rock- and ice-mechanical changes along striking discontinuities on the whole rock slope. We aim at contributing to a better rock-ice mechanical process understanding of degrading permafrost rocks. For parametrisation and subsequent calibration of our model, we chose a test site (2885 m a.s.l.) close to the Zugspitze summit in Germany. It reveals i) a potential rockslide at the south face involving 10^4 m³ of rock and ii) permafrost occurrence evidenced by ice-filled caves and fractures. Here we combine kinematic, geotechnical and thermal monitoring in the field with rock-mechanical laboratory tests and 2D numerical failure modeling. To date, the following results underline the potential effects of thawing rock and fracture infill on the stability of steep rock slopes in theory and practice: i. ERT and

  8. The effect of winglets on the static aerodynamic stability characteristics of a representative second generation jet transport model

    Science.gov (United States)

    Jacobs, P. F.; Flechner, S. G.

    1976-01-01

    A baseline wing and a version of the same wing fitted with winglets were tested. The longitudinal aerodynamic characteristics were determined through an angle-of-attack range from -1 deg to 10 deg at an angle of sideslip of 0 deg for Mach numbers of 0.750, 0.800, and 0.825. The lateral aerodynamic characteristics were determined through the same angle-of-attack range at fixed sideslip angles of 2.5 deg and 5 deg. Both configurations were investigated at Reynolds numbers of 13,000,000 per meter (4,000,000 per foot) and approximately 20,000,000 per meter (6,000,000 per foot). The winglet configuration showed slight increases over the baseline wing in static longitudinal and lateral aerodynamic stability throughout the test Mach number range for a model design lift coefficient of 0.53. Reynolds number variation had very little effect on stability.

  9. Representative-Sandwich Model for Mechanical-Crush and Short-Circuit Simulation of Lithium-ion Batteries

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Chao; Santhanagopalan, Shriram; Sprague, Michael A.; Pesaran, Ahmad A.

    2015-07-28

    Lithium-ion batteries are currently the state-of-the-art power sources for a variety of applications, from consumer electronic devices to electric-drive vehicles (EDVs). Because the battery is an energized component, its failure is an essential concern, as it can result in rupture, smoke, fire, or venting. The failure of lithium-ion batteries can be due to a number of external abusive conditions (impact/crush, overcharge, thermal ramp, etc.) or internal conditions (internal short circuits, excessive heating due to resistance build-up, etc.), of which the mechanical-abuse-induced short circuit is a very practical problem. In order to better understand the behavior of lithium-ion batteries under mechanical abuse, a coupled modeling methodology encompassing the mechanical, thermal and electrical response has been developed for predicting short circuit under external crush.

  10. Multi-criteria assessment of the Representative Elementary Watershed approach on the Donga catchment (Benin using a downward approach of model complexity

    Directory of Open Access Journals (Sweden)

    N. Varado

    2005-11-01

    Full Text Available This study is part of the AMMA (African Multidisciplinary Monsoon Analysis) project and aims at a better understanding and modelling of the behaviour of the Donga catchment (580 km2, Benin). For this purpose, we applied the REW concept proposed by Reggiani et al. (1998, 1999), which allows the description of the main local processes at the sub-watershed scale. Such distributed hydrological models, which represent hydrological processes at various scales, should be evaluated not only on the discharge at the outlet but also on each of the represented processes and in several points of the catchment. This kind of multi-criteria evaluation is of importance in order to assess the global behaviour of the models. We applied such a multi-criteria strategy to the Donga catchment (586 km2) in Benin. The work is supported by an observation strategy, undertaken since 1998, consisting of a network of 20 rain gauges, an automatic meteorological station, 6 discharge stations and 18 wells.

    The first goal of this study is to assess the model's ability to reproduce the discharge at the outlet, the water table dynamics in several points of the catchment and the vadose zone dynamics at the sub-catchment scale. We tested two spatial discretisations of increasing resolution. To test the internal structure of the model, we also looked at its ability to represent the discharge at intermediate stations. After adjustment of soil parameters, the model is shown to accurately represent discharge down to a drainage area of 100 km2, whereas poorer simulation is achieved on smaller catchments. We introduced the spatial variability of rainfall by distributing the daily rainfall over the REW and obtained a very low sensitivity of the model response to this variability. Our results suggest that processes in the unsaturated zone should first be improved, in order to better simulate soil water dynamics and represent perched water tables which

  11. Multi-criteria assessment of the Representative Elementary Watershed approach on the Donga catchment (Benin using a downward approach of model complexity

    Directory of Open Access Journals (Sweden)

    N. Varado

    2006-01-01

    Full Text Available This study is part of the AMMA (African Multidisciplinary Monsoon Analysis) project and aims at a better understanding and modelling of the behaviour of the Donga catchment (580 km2, Benin) in order to determine its spatially distributed water balance. For this purpose, we applied the REW concept proposed by Reggiani et al. (1998, 1999), which allows the description of the main local processes at the sub-watershed scale. Such distributed hydrological models, which represent hydrological processes at various scales, should be evaluated not only on the discharge at the outlet but also on each of the represented processes and in several points of the catchment. This multi-criteria approach is required in order to assess the global behaviour of hydrological models. We applied such a multi-criteria strategy to the Donga catchment (586 km2) in Benin. The work was supported by an observation set-up, undertaken since 1998, consisting of a network of 20 rain gauges, an automatic meteorological station, 6 discharge stations and 18 wells. The main goal of this study was to assess the model's ability to reproduce the discharge at the outlet, the water table dynamics in several points of the catchment and the vadose zone dynamics at the sub-catchment scale. We tested two spatial discretisations of increasing resolution. To test the internal structure of the model, we also looked at its ability to represent the discharge at intermediate stations. After adjustment of soil parameters, the model is shown to accurately represent discharge down to a drainage area of 100 km2, whereas poorer simulation is achieved on smaller catchments. We introduced the spatial variability of rainfall by distributing the daily rainfall over the REW and obtained a very low sensitivity of the model response to this variability. Simulation of groundwater levels was poor and our results, in conjunction with new data available at the local scale, suggest that the representation of the processes

  12. Dimension-based quality modeling of transmitted speech

    CERN Document Server

    Wältermann, Marcel

    2013-01-01

    In this book, speech transmission quality is modeled on the basis of perceptual dimensions. The author identifies those dimensions that are relevant for today's public-switched and packet-based telecommunication systems, regarding the complete transmission path from the mouth of the speaker to the ear of the listener. Both narrowband (300-3400 Hz) as well as wideband (50-7000 Hz) speech transmission is taken into account. A new analytical assessment method is presented that allows the dimensions to be rated by non-expert listeners in a direct way. Due to the efficiency of the test method, a relatively large number of stimuli can be assessed in auditory tests. The test method is applied in two auditory experiments. The book gives the evidence that this test method provides meaningful and reliable results. The resulting dimension scores together with respective overall quality ratings form the basis for a new parametric model for the quality estimation of transmitted speech based on the perceptual dimensions. I...

  13. Estimating Spoken Dialog System Quality with User Models

    CERN Document Server

    Engelbrecht, Klaus-Peter

    2013-01-01

    Spoken dialog systems have the potential to offer highly intuitive user interfaces, as they allow systems to be controlled using natural language. However, the complexity inherent in natural language dialogs means that careful testing of the system must be carried out from the very beginning of the design process.   This book examines how user models can be used to support such early evaluations in two ways:  by running simulations of dialogs, and by estimating the quality judgments of users. First, a design environment supporting the creation of dialog flows, the simulation of dialogs, and the analysis of the simulated data is proposed.  How the quality of user simulations may be quantified with respect to their suitability for both formative and summative evaluation is then discussed. The remainder of the book is dedicated to the problem of predicting quality judgments of users based on interaction data. New modeling approaches are presented, which process the dialogs as sequences, and which allow knowl...

  14. Site S-7 Representative Model and Application for the Vadose Zone Monitoring System (VZMS) McClellan AFB - 1998 Semi-Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    James, A.L.; Oldenburg, C.M.

    1998-12-01

    Vadose zone data collection and enhanced data analysis are continuing for the Vadose Zone Monitoring System (VZMS) installed at site S-7 in IC 34 at McClellan AFB. Data from core samples from boreholes drilled in 1998 and from VZMS continuous monitoring are evaluated and compared to previously collected data and analyses. The suite of data collected to date is used to develop and constrain a spatially averaged, one-dimensional site S-7 representative model that is implemented in T2VOC. Testing of the conceptual model under conditions of recharge of 100 mm/yr produces plausible moisture contents relative to data from several sources. Further scoping calculations involving gas-phase TCE transport in the representative model were undertaken. We investigate the role of recharge on TCE transport as well as the role of ion- and gas-phase flow driven by density and barometric pumping effects. This report provides the first example of the application of the site S-7 representative model in the investigation of subsurface VOC movement.

  15. Modeling Water Clarity and Light Quality in Oceans

    Directory of Open Access Journals (Sweden)

    Mohamed A. Abdelrhman

    2016-11-01

    Full Text Available Phytoplankton is a primary producer of organic compounds, and it forms the base of the food chain in ocean waters. The concentration of phytoplankton in the water column controls water clarity and the amount and quality of light that penetrates through it. The availability of adequate light intensity is a major factor in the health of algae and phytoplankton. There is a strong negative coupling between light intensity and phytoplankton concentration (e.g., through self-shading by the cells), which reduces available light and in turn affects the growth rate of the cells. Proper modeling of this coupling is essential to understand primary productivity in the oceans. This paper provides the methodology to model light intensity in the water column, which can be included in relevant water quality models. The methodology implements relationships from bio-optical models, which use phytoplankton chlorophyll a (chl-a) concentration as a surrogate for light attenuation, including absorption and scattering by other attenuators. The presented mathematical methodology estimates the reduction in light intensity due to absorption by pure seawater, chl-a pigment, non-algae particles (NAPs) and colored dissolved organic matter (CDOM), as well as backscattering by pure seawater, phytoplankton particles and NAPs. The methods presented facilitate the prediction of the effects of various environmental and management scenarios (e.g., global warming, altered precipitation patterns, greenhouse gases) on the wellbeing of phytoplankton communities in the oceans as temperature-driven chl-a changes take place.
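
    The attenuation relationships referred to above can be illustrated with a minimal Beer-Lambert profile in which the diffuse attenuation coefficient is the sum of contributions from pure seawater, chl-a and a lumped NAP/CDOM term. The sketch below is only illustrative: the coefficient values are placeholders, not the bio-optical coefficients used in the paper.

```python
import numpy as np

# Hedged sketch of depth-dependent light attenuation; coefficients are placeholders.

def diffuse_attenuation(chl_a_mg_m3, k_water=0.04, k_chl=0.0166, exp_chl=0.893, k_other=0.02):
    """Total diffuse attenuation coefficient Kd (1/m) as a sum of water, chl-a,
    and other attenuators (NAP + CDOM lumped into one constant here)."""
    return k_water + k_chl * chl_a_mg_m3 ** exp_chl + k_other

def light_profile(surface_irradiance, depths_m, chl_a_mg_m3):
    """Irradiance at each depth: I(z) = I0 * exp(-Kd * z)."""
    kd = diffuse_attenuation(chl_a_mg_m3)
    return surface_irradiance * np.exp(-kd * np.asarray(depths_m))

if __name__ == "__main__":
    depths = np.arange(0, 51, 10)                             # m
    print(light_profile(1500.0, depths, chl_a_mg_m3=2.0))     # e.g. umol photons m-2 s-1
```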

  16. Forest fire forecasting tool for air quality modelling systems

    Energy Technology Data Exchange (ETDEWEB)

    San Jose, R.; Perez, J. L.; Perez, L.; Gonzalez, R. M.; Pecci, J.; Palacios, M.

    2015-07-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to implement smoke prediction within an air quality modelling system, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena due to both the number of physico-chemical factors involved and the nonlinear relationships between variables. WRF-Fire was employed to simulate the spread and behavior of some real fires that occurred in the south-east of Spain and the north of Portugal. The use of fire behavior models requires the availability of high-resolution environmental and fuel data. A new custom fuel moisture content model has been developed. The new module allows the fuel moisture content of dead and live fuels to be calculated at each time step. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial to obtain precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emissions inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results of the forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)

  17. Forest fire forecasting tool for air quality modelling systems

    Energy Technology Data Exchange (ETDEWEB)

    San Jose, R.; Perez, J.L.; Perez, L.; Gonzalez, R.M.; Pecci, J.; Palacios, M.

    2015-07-01

    Adverse effects of smoke on air quality are of great concern; however, even today the estimation of atmospheric fire emissions is a key issue. It is necessary to implement smoke prediction within an air quality modelling system, and in this work a first attempt towards creating a system of this type is presented. Wildland fire spread and behavior are complex phenomena due to both the number of physico-chemical factors involved and the nonlinear relationships between variables. WRF-Fire was employed to simulate the spread and behavior of some real fires that occurred in the south-east of Spain and the north of Portugal. The use of fire behavior models requires the availability of high-resolution environmental and fuel data. A new custom fuel moisture content model has been developed. The new module allows the fuel moisture content of dead and live fuels to be calculated at each time step. The results confirm that the use of accurate meteorological data and a custom fuel moisture content model is crucial to obtain precise simulations of fire behavior. To simulate air pollution over Europe, we use the regional meteorological-chemistry transport model WRF-Chem. In this contribution, we show the impact of using two different fire emissions inventories (FINN and IS4FIRES) and how the coupled WRF-Fire-Chem model improves the results of the forest fire emissions and smoke concentrations. The impact of the forest fire emissions on concentrations is evident, and it is quite clear from these simulations that the choice of emission inventory is very important. We conclude that using the WRF-Fire behavior model produces better results than using forest fire emission inventories, although the required computational power is much higher. (Author)

  18. Quality Management Model Embodying Circular Economy and Sustainable Manufacture—— Quality Concept Models Based on a Chinese Traditional Cultural Principle

    Institute of Scientific and Technical Information of China (English)

    CHEN Xiang-yu; LIANG Gong-qian; MA Shi-ning

    2005-01-01

    Based on circular economy and sustainable manufacture theories, the drawbacks of existing quality management models are analyzed. The requirements for satisfying the larger "environment-society-economy" system are summarized in order to build up a model. A traditional Chinese cultural principle is applied, and the relevant content is distilled into the platform on which the model is set up. New quality management concept models are put forward: a "T-D-R" three-dimensional structure and an "ecological quality loop" model, from which new quality concepts are formed. The paper elaborates the contents and the process of setting up the models. Quality problems of the larger system can be handled by the new quality concepts and models, which have been validated in practice.

  19. Quality guaranteed aggregation based model predictive control and stability analysis

    Institute of Scientific and Technical Information of China (English)

    LI DeWei; XI YuGeng

    2009-01-01

    The input aggregation strategy can reduce the online computational burden of a model predictive controller, but in general an aggregation-based MPC controller may lead to poor control quality. Therefore, a new concept, equivalent aggregation, is proposed to guarantee the control quality of aggregation-based MPC. From the general framework of input linear aggregation, design methods for equivalent aggregation are developed for unconstrained and terminal zero-constrained MPC, which guarantee that the actual control inputs are exactly equal to those of the original MPC. For constrained MPC, quasi-equivalent aggregation strategies are also discussed, aiming to make the difference between the control inputs of aggregation-based MPC and the original MPC as small as possible. Stability conditions are given for the quasi-equivalent aggregation-based MPC as well.

  20. Importance of demand modelling in network water quality models: a review

    Directory of Open Access Journals (Sweden)

    J. C. van Dijk

    2008-01-01

    Full Text Available Today, there is a growing interest in network water quality modelling. The water quality issues of interest relate to both dissolved and particulate substances, with the main interest in residual chlorine and (micro)biological contaminant propagation, and in sediment leading to discolouration, respectively. There is a strong influence of flows and velocities on transport, mixing, production and decay of these substances in the network. This calls for a different approach to demand modelling, which is reviewed in this article.

    For transport systems the current hydraulic models suffice; for the more detailed distribution system a network water quality model is needed that is based on short-time-scale demands and that considers the effects of dispersion and transients. Demand models that provide stochastic residential demands per individual home and on a one-second time scale are available. A stochastic-demands-based network water quality model needs to be developed and validated with field measurements. Such a model will be probabilistic in nature and will offer a new perspective for assessing water quality in the DWDS.

  1. Ontologies for the Integration of Air Quality Models and 3D City Models

    CERN Document Server

    Métral, Claudine; Karatzas, Kostas

    2012-01-01

    The holistic approach to sustainable urban planning implies using different models in an integrated way that is capable of simulating the urban system. As the interconnection of such models is not a trivial task, one of the key elements that may be applied is the description of the urban geometric properties in an "interoperable" way. Focusing on air quality as one of the most pronounced urban problems, the geometric aspects of a city may be described by objects such as those defined in CityGML, so that an appropriate air quality model can be applied for estimating the quality of the urban air on the basis of atmospheric flow and chemistry equations. In this paper we first present theoretical background and motivations for the interconnection of 3D city models and other models related to sustainable development and urban planning. Then we present a practical experiment based on the interconnection of CityGML with an air quality model. Our approach is based on the creation of an ontology of air quality models ...

  2. Rainfall-runoff modelling in a catchment with a complex groundwater flow system: application of the Representative Elementary Watershed (REW) approach

    Science.gov (United States)

    Zhang, G. P.; Savenije, H. H. G.

    2005-09-01

    Based on the Representative Elementary Watershed (REW) approach, the modelling tool REWASH (Representative Elementary WAterShed Hydrology) has been developed and applied to the Geer river basin. REWASH is deterministic, semi-distributed, physically based and can be directly applied to the watershed scale. In applying REWASH, the river basin is divided into a number of sub-watersheds, so called REWs, according to the Strahler order of the river network. REWASH describes the dominant hydrological processes, i.e. subsurface flow in the unsaturated and saturated domains, and overland flow by the saturation-excess and infiltration-excess mechanisms. The coupling of surface and subsurface flow processes in the numerical model is realised by simultaneous computation of flux exchanges between surface and subsurface domains for each REW. REWASH is a parsimonious tool for modelling watershed hydrological response. However, it can be modified to include more components to simulate specific processes when applied to a specific river basin where such processes are observed or considered to be dominant. In this study, we have added a new component to simulate interception using a simple parametric approach. Interception plays an important role in the water balance of a watershed although it is often disregarded. In addition, a refinement for the transpiration in the unsaturated zone has been made. Finally, an improved approach for simulating saturation overland flow by relating the variable source area to both the topography and the groundwater level is presented. The model has been calibrated and verified using a 4-year data set, which has been split into two for calibration and validation. The model performance has been assessed by multi-criteria evaluation. This work represents a complete application of the REW approach to watershed rainfall-runoff modelling in a real watershed. The results demonstrate that the REW approach provides an alternative blueprint for physically

  3. Integration of the Hydrologic Simulation Program--FORTRAN (HSPF) Watershed Water Quality Model into the Watershed Modeling System (WMS)

    National Research Council Canada - National Science Library

    Deliman, Patrick

    1999-01-01

    ...) into the Watershed Modeling System (WMS) was initiated as part of an overall goal of the Water Quality Research Program to provide water quality capabilities within the framework of a comprehensive graphical modeling environment...

  4. Unintentional Interpersonal Synchronization Represented as a Reciprocal Visuo-Postural Feedback System: A Multivariate Autoregressive Modeling Approach.

    Directory of Open Access Journals (Sweden)

    Shuntaro Okazaki

    Full Text Available People's behaviors synchronize. It is difficult, however, to determine whether synchronized behaviors occur in a mutual direction--two individuals influencing one another--or in one direction--one individual leading the other, and what the underlying mechanism for synchronization is. To answer these questions, we hypothesized a non-leader-follower postural sway synchronization, caused by a reciprocal visuo-postural feedback system operating on pairs of individuals, and tested that hypothesis both experimentally and via simulation. In the behavioral experiment, 22 participant pairs stood face to face either 20 or 70 cm away from each other wearing glasses with or without vision blocking lenses. The existence and direction of visual information exchanged between pairs of participants were systematically manipulated. The time series data for the postural sway of these pairs were recorded and analyzed with cross correlation and causality. Results of cross correlation showed that postural sway of paired participants was synchronized, with a shorter time lag when participant pairs could see one another's head motion than when one of the participants was blindfolded. In addition, there was less of a time lag in the observed synchronization when the distance between participant pairs was smaller. As for the causality analysis, noise contribution ratio (NCR), the measure of influence using a multivariate autoregressive model, was also computed to identify the degree to which one's postural sway is explained by that of the other's and how visual information (sighted vs. blindfolded) interacts with paired participants' postural sway. It was found that for synchronization to take place, it is crucial that paired participants be sighted and exert equal influence on one another by simultaneously exchanging visual information. Furthermore, a simulation for the proposed system with a wider range of visual input showed a pattern of results similar to the

  5. Unintentional Interpersonal Synchronization Represented as a Reciprocal Visuo-Postural Feedback System: A Multivariate Autoregressive Modeling Approach.

    Science.gov (United States)

    Okazaki, Shuntaro; Hirotani, Masako; Koike, Takahiko; Bosch-Bayard, Jorge; Takahashi, Haruka K; Hashiguchi, Maho; Sadato, Norihiro

    2015-01-01

    People's behaviors synchronize. It is difficult, however, to determine whether synchronized behaviors occur in a mutual direction--two individuals influencing one another--or in one direction--one individual leading the other, and what the underlying mechanism for synchronization is. To answer these questions, we hypothesized a non-leader-follower postural sway synchronization, caused by a reciprocal visuo-postural feedback system operating on pairs of individuals, and tested that hypothesis both experimentally and via simulation. In the behavioral experiment, 22 participant pairs stood face to face either 20 or 70 cm away from each other wearing glasses with or without vision blocking lenses. The existence and direction of visual information exchanged between pairs of participants were systematically manipulated. The time series data for the postural sway of these pairs were recorded and analyzed with cross correlation and causality. Results of cross correlation showed that postural sway of paired participants was synchronized, with a shorter time lag when participant pairs could see one another's head motion than when one of the participants was blindfolded. In addition, there was less of a time lag in the observed synchronization when the distance between participant pairs was smaller. As for the causality analysis, noise contribution ratio (NCR), the measure of influence using a multivariate autoregressive model, was also computed to identify the degree to which one's postural sway is explained by that of the other's and how visual information (sighted vs. blindfolded) interacts with paired participants' postural sway. It was found that for synchronization to take place, it is crucial that paired participants be sighted and exert equal influence on one another by simultaneously exchanging visual information. Furthermore, a simulation for the proposed system with a wider range of visual input showed a pattern of results similar to the behavioral results.
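
    The lag analysis described in the two records above can be illustrated with a small normalized cross-correlation between two synthetic sway signals. This is a hedged sketch only; the sampling rate, signal model and lag are made up, and the multivariate autoregressive (NCR) step of the study is not reproduced.

```python
import numpy as np

# Hedged sketch: estimating the synchronization lag between two postural-sway time
# series with a normalized cross-correlation. Synthetic data only.

def cross_correlation_lag(x, y, fs_hz):
    """Return the lag (s, positive if y lags x) that maximizes the cross-correlation."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    corr = np.correlate(y, x, mode="full") / len(x)
    lags = np.arange(-len(x) + 1, len(x))
    best = lags[np.argmax(corr)]
    return best / fs_hz, corr.max()

if __name__ == "__main__":
    fs = 100.0                                   # Hz, hypothetical sampling rate
    t = np.arange(0, 30, 1 / fs)
    sway_a = np.sin(2 * np.pi * 0.3 * t) + 0.1 * np.random.randn(t.size)
    sway_b = np.roll(sway_a, 25)                 # partner delayed by 0.25 s
    lag_s, peak = cross_correlation_lag(sway_a, sway_b, fs)
    print(f"estimated lag: {lag_s:.2f} s (peak correlation {peak:.2f})")
```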

  6. Modelling the Cost and Quality of Preservation Imaging and Archiving

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2009-01-01

    … fire and other risks. In this PhD thesis it is examined how one may evaluate the long-term costs and benefits to cultural heritage institutions of different preservation strategies for digital copies. The investigated alternatives are preserving the copies in a digital repository, and printing the files out on microfilm and preserving them in a non-digital repository. In order to obtain empirical data and to understand the decisive cost factors in preservation copying, a case study was set up in which degrading sheet-film negatives were digitised. Requirements for image quality and metadata were … in the OAIS Reference Model. The cost model divides the OAIS functions in a hierarchy of cost critical activities and measurable components, which are implemented as formulas in a spreadsheet. So far the model has only been completed for activities relating to preservation planning and digital migrations …

  7. Relevant Criteria for Testing the Quality of Turbulence Models

    DEFF Research Database (Denmark)

    Frandsen, Sten; Jørgensen, Hans E.; Sørensen, John Dalsgaard

    2007-01-01

    Seeking relevant criteria for testing the quality of turbulence models, the scale of turbulence and the gust factor have been estimated from data and compared with predictions from first-order models of these two quantities. It is found that the mean of the measured length scales is approx. 10% smaller than the IEC model, for wind turbine hub height levels. The mean is only marginally dependent on trends in time series. It is also found that the coefficient of variation of the measured length scales is about 50%. 3 sec and 10 sec pre-averaging of wind speed data are relevant for MW-size wind turbines when seeking wind characteristics that correspond to one blade and the entire rotor, respectively. For heights exceeding 50-60 m the gust factor increases with wind speed. For heights larger than 60-80 m, present assumptions on the value of the gust factor are significantly conservative, both for 3…

  8. A Technology Enhanced Learning Model for Quality Education

    Science.gov (United States)

    Sherly, Elizabeth; Uddin, Md. Meraj

    The Technology Enhanced Learning and Teaching (TELT) model provides learning through collaboration and interaction, with a framework for content development and a collaborative knowledge-sharing system as a supplement to learning that improves the quality of the education system. TELT offers a pedagogy model for a technology-enhanced learning system that includes a course management system, a digital library, multimedia-enriched content and video lectures, an open content management system, and collaboration and knowledge-sharing systems. Open-source tools such as Moodle and Wiki for content development, a video-on-demand solution running on a low-cost mid-range system, and an exhaustive digital library are provided in a portal system. The paper presents a case study of e-learning initiatives with the TELT model at IIITM-K and how it was effectively implemented.

  9. A stochastic dynamic programming model for stream water quality management

    Indian Academy of Sciences (India)

    P P Mujumdar; Pavan Saxena

    2004-10-01

    This paper deals with development of a seasonal fraction-removal policy model for waste load allocation in streams addressing uncertainties due to randomness and fuzziness. A stochastic dynamic programming (SDP) model is developed to arrive at the steady-state seasonal fraction-removal policy. A fuzzy decision model (FDM) developed by us in an earlier study is used to compute the system performance measure required in the SDP model. The state of the system in a season is defined by streamflows at the headwaters during the season and the initial DO deficit at some pre-specified checkpoints. The random variation of streamflows is included in the SDP model through seasonal transitional probabilities. The decision vector consists of seasonal fraction-removal levels for the effluent dischargers. Uncertainty due to imprecision (fuzziness) associated with water quality goals is addressed using the concept of fuzzy decision. Responses of pollution control agencies to the resulting end-of-season DO deficit vector and that of dischargers to the fraction-removal levels are treated as fuzzy, and modelled with appropriate membership functions. Application of the model is illustrated with a case study of the Tungabhadra river in India.
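
    A schematic backward-recursion sketch of the kind of stochastic dynamic program described above follows. It collapses the state to a discrete streamflow class only (the DO-deficit component of the state and the fuzzy decision model are not reproduced), uses a discount factor for convergence, and all transition probabilities, reward values and removal levels are invented for illustration.

```python
import numpy as np

# Hedged, schematic SDP sketch: discrete streamflow classes as states, discrete
# fraction-removal levels as actions, invented seasonal transition matrices and a
# stand-in reward in place of the paper's fuzzy system-performance measure.

n_flow = 3                                   # streamflow classes per season
removals = np.array([0.35, 0.55, 0.75])      # candidate fraction-removal levels
seasons = 4
beta = 0.9                                   # discount factor (simplification)

rng = np.random.default_rng(0)
# P[s][i, j]: probability of flow class j in season s+1 given class i in season s
P = [rng.dirichlet(np.ones(n_flow), size=n_flow) for _ in range(seasons)]

def reward(flow_class, removal):
    # stand-in performance measure: low-flow seasons favour higher removal levels
    target = 0.8 - 0.15 * flow_class
    return 1.0 - abs(target - removal)

V = np.zeros(n_flow)                         # value function over flow classes
for _ in range(200):                         # iterate the season cycle until values settle
    for s in reversed(range(seasons)):
        Q = np.array([[reward(i, r) + beta * (P[s][i] @ V) for r in removals]
                      for i in range(n_flow)])
        V = Q.max(axis=1)

policy = Q.argmax(axis=1)                    # removal level per flow class, first season
print("fraction-removal policy by flow class:", removals[policy])
```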

  10. A generic model for keeping quality of vegetable produce during storage and distribution

    NARCIS (Netherlands)

    Tijskens, L.M.M.; Polderdijk, J.J.

    1996-01-01

    A generic model on the keeping quality of perishable produce was formulated, based on the kinetics of the decrease of individual quality attributes. The model includes the effects of temperature, chilling injury and different levels of initial quality and of quality acceptance limits. Keeping qualit
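
    A simplified, single-attribute reading of such a keeping-quality model is sketched below: the time until the acceptance limit is reached shrinks in proportion to an Arrhenius-type increase of the quality-decay rate with temperature. The reference keeping quality, activation energy and temperatures are placeholders, and the chilling-injury and initial-quality effects of the generic model are not included.

```python
import math

# Hedged sketch: keeping quality scaled with temperature via an Arrhenius rate ratio.
# All parameter values are illustrative placeholders.

R = 8.314  # J/(mol K)

def arrhenius_rate_ratio(temp_c, ref_temp_c=15.0, activation_energy_j_mol=60000.0):
    """k(T) / k(T_ref) for a first-order quality-decay reaction."""
    t, t_ref = temp_c + 273.15, ref_temp_c + 273.15
    return math.exp(activation_energy_j_mol / R * (1.0 / t_ref - 1.0 / t))

def keeping_quality_days(temp_c, kq_ref_days=10.0, **kwargs):
    """Keeping quality shortens in proportion to the faster decay rate."""
    return kq_ref_days / arrhenius_rate_ratio(temp_c, **kwargs)

if __name__ == "__main__":
    for t in (4, 10, 15, 20, 25):
        print(f"{t:2d} C -> {keeping_quality_days(t):4.1f} days")
```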

  11. Exploring the critical quality attributes and models of smart homes.

    Science.gov (United States)

    Ted Luor, Tainyi; Lu, Hsi-Peng; Yu, Hueiju; Lu, Yinshiu

    2015-12-01

    Research on smart homes has significantly increased in recent years owing to their considerably improved affordability and simplicity. However, the challenge is that people have different needs (or attitudes toward smart homes), and provision should be tailored to individuals. A few studies have classified the functions of smart homes. Therefore, the Kano model is first adopted as a theoretical base to explore whether the functional classifications of smart homes are attractive or necessary, or both. Second, three models are proposed and user attitudes toward three function types of smart homes are tested. Based on the Kano model, the principal results, namely two "Attractive Quality" and nine "Indifferent Quality" items, are found. Verification of the hypotheses also indicates that the entertainment, security, and automation functions are significantly correlated with the variables "perceived usefulness" and "attitude." Cost consideration is negatively correlated with attitudes toward entertainment and automation. Results suggest that smart home providers should survey user needs for their products instead of merely producing smart homes based on the design of the builder or engineer.

  12. Applying fuzzy analytic network process in quality function deployment model

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Afsharkazemi

    2012-08-01

    Full Text Available In this paper, we propose an empirical study of QFD implementation when fuzzy numbers are used to handle the uncertainty associated with different components of the proposed model. We implement a fuzzy analytic network process to find the relative importance of various criteria, and using fuzzy numbers we calculate the relative importance of these factors. The proposed model of this paper uses a fuzzy matrix and the house of quality to study product development in QFD, as well as the second phase, i.e. part deployment. In most research, the primary focus has been only on CRs when implementing quality function deployment, and some other criteria such as production costs, manufacturing costs, etc. were disregarded. The results of using the fuzzy analytic network process based on the QFD model in the Daroupat packaging company to develop PVDC show that the most important indexes are being waterproof, resistant pill packages, and production cost. In addition, the PVDC coating is the most important index from the company experts' point of view.

  13. Temporal variations analyses and predictive modeling of microbiological seawater quality.

    Science.gov (United States)

    Lušić, Darija Vukić; Kranjčević, Lado; Maćešić, Senka; Lušić, Dražen; Jozić, Slaven; Linšak, Željko; Bilajac, Lovorka; Grbčić, Luka; Bilajac, Neiro

    2017-08-01

    Bathing water quality is a major public health issue, especially for tourism-oriented regions. Currently used methods within the EU allow at least a 2.2-day period for obtaining the analytical results, making the information forwarded to the public outdated. The obtained results and beach assessment are influenced by the temporal and spatial characteristics of sample collection and numerous environmental parameters, as well as by differences in official water standards. This paper examines the temporal variation of microbiological parameters during the day, as well as the influence of the sampling hour, on decision processes in the management of the beach. Apart from the fecal indicators stipulated by the EU Bathing Water Directive (E. coli and enterococci), additional fecal (C. perfringens) and non-fecal (S. aureus and P. aeruginosa) parameters were analyzed. Moreover, the effects of applying different evaluation criteria (national, EU and U.S. EPA) to beach ranking were studied, and the most common reasons for exceeding water-quality standards were investigated. In order to upgrade routine monitoring, a predictive statistical model was developed. The highest concentrations of fecal indicators were recorded early in the morning (6 AM) due to the lack of solar radiation during the night period. When compared to enterococci, E. coli criteria appear to be more stringent for the detection of fecal pollution. In comparison to EU and U.S. EPA criteria, Croatian national evaluation criteria provide stricter public health standards. Solar radiation and precipitation were the predominant environmental parameters affecting beach water quality, and these parameters were included in the predictive model setup. Predictive models revealed great potential for the monitoring of recreational water bodies, and with further development they can become a useful tool for the improvement of public health protection. Copyright © 2017 Elsevier Ltd. All rights reserved.
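
    As a hedged illustration of the kind of predictive statistical model mentioned above, the sketch below fits a multiple linear regression of log-transformed E. coli concentrations on solar radiation and antecedent precipitation, the two dominant environmental predictors identified in the study. The data are synthetic, and the fitted form and coefficients are not those of the published model.

```python
import numpy as np

# Hedged sketch: ordinary least squares on synthetic bathing-water data.
rng = np.random.default_rng(1)
n = 200
solar = rng.uniform(0, 900, n)            # W/m2
rain = rng.exponential(2.0, n)            # mm in the preceding 24 h (assumed predictor)
log_ecoli = 2.0 - 0.0015 * solar + 0.12 * rain + rng.normal(0, 0.3, n)

X = np.column_stack([np.ones(n), solar, rain])
coef, *_ = np.linalg.lstsq(X, log_ecoli, rcond=None)
intercept, b_solar, b_rain = coef
print(f"log10(E. coli) = {intercept:.2f} + {b_solar:.4f}*solar + {b_rain:.2f}*rain")

# predicted concentration (CFU/100 ml) for a sunny dry day vs. a rainy overcast day
for s, r in ((800, 0.0), (100, 15.0)):
    pred = 10 ** (intercept + b_solar * s + b_rain * r)
    print(f"solar={s:3d} W/m2, rain={r:4.1f} mm -> ~{pred:.0f} CFU/100 ml")
```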

  14. A New Model for Software Engineering Systems Quality Improvement

    Directory of Open Access Journals (Sweden)

    Ahmad A. Al-Rababah

    2014-04-01

    Full Text Available In the continuing effort to improve the system analysis and design process, several different approaches have been developed. This study proposes a new process methodology that solves some problems found in traditional system development methodologies. It studies the strengths and limitations of existing system development methodologies, from the traditional waterfall to iterative models (including Prototyping, Spiral, Rapid Application Development, XP and RUP) to Agility. The proposed methodology focuses on producing a high-quality product and is suitable for all kinds of projects. The new methodology is compared with others to show the features that differentiate it from previous methodologies.

  15. Linkage between an advanced air quality model and a mechanistic watershed model

    Directory of Open Access Journals (Sweden)

    K. Vijayaraghavan

    2010-09-01

    Full Text Available An offline linkage between two advanced multi-pollutant air quality and watershed models is presented. The models linked are (1) the Advanced Modeling System for Transport, Emissions, Reactions and Deposition of Atmospheric Matter (AMSTERDAM) (a three-dimensional Eulerian plume-in-grid model derived from the Community Multiscale Air Quality (CMAQ) model) and (2) the Watershed Analysis Risk Management Framework (WARMF). The pollutants linked include gaseous and particulate nitrogen, sulfur and mercury compounds. The linkage may also be used to obtain meteorological fields such as precipitation and air temperature required by WARMF from the outputs of the meteorology chemistry interface processor (MCIP) that processes meteorology simulated by the fifth generation Mesoscale Model (MM5) or the Weather Research and Forecast (WRF) model for input to AMSTERDAM. The linkage is tested in the Catawba River basin of North and South Carolina for ammonium, nitrate and sulfate. Modeled air quality and meteorological fields transferred by the linkage can supplement the conventional measurements used to drive WARMF and may be used to help predict the impact of changes in atmospheric emissions on water quality.

  16. Linkage between an advanced air quality model and a mechanistic watershed model

    Science.gov (United States)

    Vijayaraghavan, K.; Herr, J.; Chen, S.-Y.; Knipping, E.

    2010-09-01

    An offline linkage between two advanced multi-pollutant air quality and watershed models is presented. The models linked are (1) the Advanced Modeling System for Transport, Emissions, Reactions and Deposition of Atmospheric Matter (AMSTERDAM) (a three-dimensional Eulerian plume-in-grid model derived from the Community Multiscale Air Quality (CMAQ) model) and (2) the Watershed Analysis Risk Management Framework (WARMF). The pollutants linked include gaseous and particulate nitrogen, sulfur and mercury compounds. The linkage may also be used to obtain meteorological fields such as precipitation and air temperature required by WARMF from the outputs of the meteorology chemistry interface processor (MCIP) that processes meteorology simulated by the fifth generation Mesoscale Model (MM5) or the Weather Research and Forecast (WRF) model for input to AMSTERDAM. The linkage is tested in the Catawba River basin of North and South Carolina for ammonium, nitrate and sulfate. Modeled air quality and meteorological fields transferred by the linkage can supplement the conventional measurements used to drive WARMF and may be used to help predict the impact of changes in atmospheric emissions on water quality.

  17. Mathematical models for predicting indoor air quality from smoking activity.

    Science.gov (United States)

    Ott, W R

    1999-05-01

    Much progress has been made over four decades in developing, testing, and evaluating the performance of mathematical models for predicting pollutant concentrations from smoking in indoor settings. Although largely overlooked by the regulatory community, these models provide regulators and risk assessors with practical tools for quantitatively estimating the exposure level that people receive indoors for a given level of smoking activity. This article reviews the development of the mass balance model and its application to predicting indoor pollutant concentrations from cigarette smoke and derives the time-averaged version of the model from the basic laws of conservation of mass. A simple table is provided of computed respirable particulate concentrations for any indoor location for which the active smoking count, volume, and concentration decay rate (deposition rate combined with air exchange rate) are known. Using the indoor ventilatory air exchange rate causes slightly higher indoor concentrations and therefore errs on the side of protecting health, since it excludes particle deposition effects, whereas using the observed particle decay rate gives a more accurate prediction of indoor concentrations. This table permits easy comparisons of indoor concentrations with air quality guidelines and indoor standards for different combinations of active smoking counts and air exchange rates. The published literature on mathematical models of environmental tobacco smoke also is reviewed and indicates that these models generally give good agreement between predicted concentrations and actual indoor measurements.
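
    A minimal steady-state version of the well-mixed mass-balance model discussed above is sketched here. The emission rate per active smoker, room volume and removal rates are placeholder values for illustration, not figures from the article, and the article's time-averaged formulation is reduced to its steady-state limit.

```python
# Hedged sketch of a single-compartment, well-mixed mass balance for indoor
# respirable particles from smoking. All parameter values are illustrative.

def steady_state_concentration(active_smokers, emission_mg_per_h=10.0,
                               volume_m3=150.0, air_exchange_per_h=1.0,
                               deposition_per_h=0.1):
    """C = source strength / (volume * total removal rate), in mg/m3."""
    source_mg_per_h = active_smokers * emission_mg_per_h
    removal_per_h = air_exchange_per_h + deposition_per_h
    return source_mg_per_h / (volume_m3 * removal_per_h)

if __name__ == "__main__":
    c = steady_state_concentration(active_smokers=2)
    print(f"steady-state RSP concentration: {1000 * c:.0f} ug/m3")
```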

  18. Gulf of Mexico dissolved oxygen model (GoMDOM) research and quality assurance project plan

    Science.gov (United States)

    An integrated high resolution mathematical modeling framework is being developed that will link hydrodynamic, atmospheric, and water quality models for the northern Gulf of Mexico. This Research and Quality Assurance Project Plan primarily focuses on the deterministic Gulf of Me...

  19. Economic Integration and Quality Standards in a Duopoly Model with Horizontal and Vertical Product Differentiation

    DEFF Research Database (Denmark)

    Hansen, Jørgen Drud; Nielsen, Jørgen Ulff-Møller

    2006-01-01

    This paper examines the effects of trade barriers on quality levels in a duopoly model for two countries with one producer in each country. The products are both vertically and horizontally differentiated. In the absence of quality regulation, the two producers determine prices and quality levels … the quality levels in favour of the small country. Furthermore, in the case of implementation of a minimum quality standard, which forces the low-quality producer from the small country to increase the quality level, the producer from the large country reacts strategically by lowering the quality level of his … standards are also ambiguous, depending on the parameters of the model …

  20. Selection of a Representative Subset of Global Climate Models that Captures the Profile of Regional Changes for Integrated Climate Impacts Assessment

    Science.gov (United States)

    Ruane, Alex C.; Mcdermid, Sonali P.

    2017-01-01

    We present the Representative Temperature and Precipitation (T&P) GCM Subsetting Approach developed within the Agricultural Model Intercomparison and Improvement Project (AgMIP) to select a practical subset of global climate models (GCMs) for regional integrated assessment of climate impacts when resource limitations do not permit the full ensemble of GCMs to be evaluated given the need to also focus on impacts sector and economics models. Subsetting inherently leads to a loss of information but can free up resources to explore important uncertainties in the integrated assessment that would otherwise be prohibitive. The Representative T&P GCM Subsetting Approach identifies five individual GCMs that capture a profile of the full ensemble of temperature and precipitation change within the growing season while maintaining information about the probability that basic classes of climate changes (relatively cool/wet, cool/dry, middle, hot/wet, and hot/dry) are projected in the full GCM ensemble. We demonstrate the selection methodology for maize impacts in Ames, Iowa, and discuss limitations and situations when additional information may be required to select representative GCMs. We then classify 29 GCMs over all land areas to identify regions and seasons with characteristic diagonal skewness related to surface moisture as well as extreme skewness connected to snow-albedo feedbacks and GCM uncertainty. Finally, we employ this basic approach to recognize that GCM projections demonstrate coherence across space, time, and greenhouse gas concentration pathway. The Representative T&P GCM Subsetting Approach provides a quantitative basis for the determination of useful GCM subsets, provides a practical and coherent approach where previous assessments selected solely on availability of scenarios, and may be extended for application to a range of scales and sectoral impacts.
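
    The subsetting idea can be illustrated with a toy classification: place each GCM's growing-season temperature and precipitation change within the ensemble, label the five classes (cool/wet, cool/dry, middle, hot/wet, hot/dry), and keep the member closest to each class centre. The thresholds, synthetic changes and nearest-to-centre rule below are illustrative assumptions, not the exact percentile rules of the AgMIP procedure.

```python
import numpy as np

# Hedged sketch of selecting one representative GCM per temperature/precipitation class.
rng = np.random.default_rng(2)
names = [f"GCM{i:02d}" for i in range(29)]
d_temp = rng.normal(2.5, 0.7, 29)          # deg C change (synthetic)
d_prec = rng.normal(0.0, 8.0, 29)          # % precipitation change (synthetic)

t_lo, t_hi = np.percentile(d_temp, [33, 67])
p_lo, p_hi = np.percentile(d_prec, [33, 67])

def classify(t, p):
    if t <= t_lo and p >= p_hi: return "cool/wet"
    if t <= t_lo and p <= p_lo: return "cool/dry"
    if t >= t_hi and p >= p_hi: return "hot/wet"
    if t >= t_hi and p <= p_lo: return "hot/dry"
    return "middle"

labels = [classify(t, p) for t, p in zip(d_temp, d_prec)]
for cls in ("cool/wet", "cool/dry", "middle", "hot/wet", "hot/dry"):
    idx = [i for i, lab in enumerate(labels) if lab == cls]
    if not idx:
        continue
    centre = np.array([d_temp[idx].mean(), d_prec[idx].mean()])
    dist = [np.hypot(d_temp[i] - centre[0], d_prec[i] - centre[1]) for i in idx]
    pick = idx[int(np.argmin(dist))]
    print(f"{cls:8s}: {len(idx):2d} members, representative {names[pick]}")
```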

  2. A multi-model assessment of the co-benefits of climate mitigation for global air quality

    Science.gov (United States)

    Rao, Shilpa; Klimont, Zbigniew; Leitao, Joana; Riahi, Keywan; van Dingenen, Rita; Aleluia Reis, Lara; Calvin, Katherine; Dentener, Frank; Drouet, Laurent; Fujimori, Shinichiro; Harmsen, Mathijs; Luderer, Gunnar; Heyes, Chris; Strefler, Jessica; Tavoni, Massimo; van Vuuren, Detlef P.

    2016-12-01

    We present a model comparison study that combines multiple integrated assessment models with a reduced-form global air quality model to assess the potential co-benefits of global climate mitigation policies in relation to the World Health Organization (WHO) goals on air quality and health. We include in our assessment, a range of alternative assumptions on the implementation of current and planned pollution control policies. The resulting air pollution emission ranges significantly extend those in the Representative Concentration Pathways. Climate mitigation policies complement current efforts on air pollution control through technology and fuel transformations in the energy system. A combination of stringent policies on air pollution control and climate change mitigation results in 40% of the global population exposed to PM levels below the WHO air quality guideline; with the largest improvements estimated for India, China, and Middle East. Our results stress the importance of integrated multisector policy approaches to achieve the Sustainable Development Goals.

  3. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    Science.gov (United States)

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative) with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) Interviews to elicit mental models; (2) Cognitive maps to represent and analyse individual and group mental models; (3) Time-sequence diagrams to chronologically structure the decision making process; (4) All-encompassing conceptual model of decision making, and (5) computational (in this case agent-based) Model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weakness-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an Agent Based Model.

  4. Spatiotemporal Variability of Lake Water Quality in the Context of Remote Sensing Models

    Directory of Open Access Journals (Sweden)

    Carly Hyatt Hansen

    2017-04-01

    Full Text Available This study demonstrates a number of methods for using field sampling and observed lake characteristics and patterns to improve techniques for development of algae remote sensing models and applications. As satellite and airborne sensors improve and their data are more readily available, applications of models to estimate water quality via remote sensing are becoming more practical for local water quality monitoring, particularly of surface algal conditions. Despite the increasing number of applications, there are significant concerns associated with remote sensing model development and application, several of which are addressed in this study. These concerns include: (1) selecting sensors which are suitable for the spatial and temporal variability in the water body; (2) determining appropriate uses of near-coincident data in empirical model calibration; and (3) recognizing potential limitations of remote sensing measurements which are biased toward surface and near-surface conditions. We address these issues in three lakes in the Great Salt Lake surface water system (namely the Great Salt Lake, Farmington Bay, and Utah Lake) through sampling at scales that are representative of commonly used sensors, repeated sampling, and sampling at both near-surface depths and throughout the water column. The variability across distances representative of the spatial resolutions of Landsat, SENTINEL-2 and MODIS sensors suggests that these sensors are appropriate for this lake system. We also use observed temporal variability in the system to evaluate sensors. These relationships proved to be complex, and observed temporal variability indicates the revisit time of Landsat may be problematic for detecting short events in some lakes, while it may be sufficient for other areas of the system with lower short-term variability. Temporal variability patterns in these lakes are also used to assess near-coincident data in empirical model development. Finally, relationships

  5. A Reaction-Based River/Stream Water Quality Model: Reaction Network Decomposition and Model Application

    OpenAIRE

    2012-01-01

    This paper describes details of an automatic matrix decomposition approach for a reaction-based stream water quality model. The method yields a set of equilibrium equations, a set of kinetic-variable transport equations involving kinetic reactions only, and a set of component transport equations involving no reactions. Partial decomposition of the system of water quality constituent transport equations is performed via Gauss-Jordan column reduction of the reaction network by pivoting on equil...
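
    The decomposition sketched in this abstract can be illustrated with generic linear algebra. The toy script below performs a Gauss-Jordan column reduction of a small stoichiometric matrix, pivoting first on a column flagged as an equilibrium reaction; the matrix, the reaction labels, and all variable names are illustrative assumptions, not material from the paper.

```python
# Illustrative sketch, not the authors' code: Gauss-Jordan column reduction
# of a species-by-reaction stoichiometric matrix, pivoting first on
# equilibrium-reaction columns and then on the remaining (kinetic) columns.
import numpy as np

def decompose(S, eq_cols):
    """Return T (row operations), R = T @ S, and the pivot bookkeeping."""
    n_sp, n_rx = S.shape
    R = S.astype(float).copy()
    T = np.eye(n_sp)                      # accumulates the row operations
    pivots = {}                           # column index -> pivot row index
    order = list(eq_cols) + [c for c in range(n_rx) if c not in eq_cols]
    for j in order:
        cand = [i for i in range(n_sp)
                if i not in pivots.values() and abs(R[i, j]) > 1e-12]
        if not cand:
            continue                      # column already eliminated
        p = cand[0]
        piv = R[p, j]
        R[p] /= piv
        T[p] /= piv
        for i in range(n_sp):             # eliminate column j from other rows
            if i != p and abs(R[i, j]) > 1e-12:
                f = R[i, j]
                R[i] -= f * R[p]
                T[i] -= f * T[p]
        pivots[j] = p
    return T, R, pivots

# Toy network: 3 species, 2 reactions; reaction 0 is treated as equilibrium.
S = np.array([[-1.0,  0.0],
              [ 1.0, -1.0],
              [ 0.0,  1.0]])
T, R, pivots = decompose(S, eq_cols=[0])
# In R = T @ S: the row pivoted on the equilibrium column corresponds to an
# equilibrium equation, the row pivoted on a kinetic column to a
# kinetic-variable equation, and an all-zero row to a reaction-free component.
print(R)
print(T)
```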

  6. A Neural Network Model for Prediction of Sound Quality

    DEFF Research Database (Denmark)

    Nielsen, Lars Bramsløw

    ...... obtained in subjective sound quality rating experiments based on input data from an auditory model. Various types of input data and data representations from the auditory model were used as input data for the chosen network structure, which was a three-layer perceptron. This network was trained by means...... was evaluated for two types of test set extracted from the complete data set. With a test set consisting of mixed stimuli, the prediction error was only slightly larger than the statistical error in the training data itself. Using a particular group of stimuli for the test set, there was a systematic prediction...... error on the test set. The overall concept proved functional, but further testing with data obtained from a new rating experiment is necessary to better assess the utility of this measure. The weights in the trained neural networks were analyzed to qualitatively interpret the relation between......
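
    As a rough illustration of the approach described in this abstract, the sketch below fits a three-layer perceptron (input, one hidden layer, output) to predict quality ratings from feature vectors. The features and ratings are synthetic placeholders rather than auditory-model data, and scikit-learn's MLPRegressor stands in for whatever training scheme the authors used.

```python
# Minimal sketch: a three-layer perceptron regressor predicting a quality
# rating from feature vectors.  Features and ratings are synthetic
# placeholders, not the paper's auditory-model data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))            # e.g. 8 auditory-model features per stimulus (assumed)
y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=200)   # fake subjective ratings

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
net.fit(X_tr, y_tr)

print("test R^2:", net.score(X_te, y_te))
# Inspecting the trained weights, as the abstract describes, gives a rough
# picture of which input features drive the predicted rating.
print("input-to-hidden weight matrix shape:", net.coefs_[0].shape)
```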

  7. Eye Model for Inspecting the Image Quality of IOLs

    Institute of Scientific and Technical Information of China (English)

    Zhenping Huang; Renfeng Xu; Chunyan Xue; Yong Wu; Huachun Wang; Degao Zhao

    2007-01-01

    Purpose: To inspect and compare the image quality of an aspheric intraocular lens (IQ, Alcon) with those of conventional monofocal silicone and acrylic intraocular lenses and a multifocal intraocular lens (Array). Methods: The IOLs were tested in an eye model designed to be optically equivalent to the theoretical eye model. The eye model is a combination of a spherical photographic lens with 35 mm focal length (with the IOL placed in a water cell) and a charge-coupled device (CCD) camera. The images formed by the lenses are observed on a personal computer monitor, and the contrasts of the images are analyzed using commercial image processing software. The SHARP value is used to measure and estimate image definition. Results: Images were constructed while varying the diameter of the aperture stop and the IOL. As observed with this eye model, the image definition of the aspheric intraocular lens (IQ, Alcon) was better than that of the others. Discussion: The proposed eye model is useful for testing functional vision and for inspecting the differences between intraocular lenses.

  8. Integrated hydro-bacterial modelling for predicting bathing water quality

    Science.gov (United States)

    Huang, Guoxian; Falconer, Roger A.; Lin, Binliang

    2017-03-01

    In recent years health risks associated with the non-compliance of bathing water quality have received increasing worldwide attention. However, it is particularly challenging to establish the source of any non-compliance, due to the complex nature of the sources of faecal indicator organisms, the fate and delivery processes, and the scarcity of field-measured data in many catchments and estuaries. In the current study an integrated hydro-bacterial model, linking catchment, 1-D river and 2-D estuarine/coastal models, was developed to simulate the adsorption-desorption processes of faecal bacteria to and from sediment particles in river, estuarine and coastal waters, respectively. The model was then validated using hydrodynamic, sediment and faecal bacteria concentration data, measured in 2012, in the Ribble river and estuary, and along the Fylde coast, UK. Particular emphasis has been placed on the mechanism of faecal bacteria transport and decay through the deposition and resuspension of suspended sediments. The results showed that by coupling the E. coli concentration with the sediment transport processes, the accuracy of the predicted E. coli levels was improved. A series of scenario runs were then carried out to investigate the impacts of different management scenarios on the E. coli concentration levels in the coastal bathing water sites around Liverpool Bay, UK. The model results show that the level of compliance with the new EU bathing water standards can be improved significantly by extending outfalls and/or reducing urban sources by typically 50%.
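
    A highly simplified, hypothetical box model can convey the coupling the abstract describes: free and sediment-attached E. coli exchange through adsorption and desorption, attached bacteria settle to and resuspend from a bed store, and all phases decay. The rate constants below are invented for illustration and are not values from the study.

```python
# Hypothetical box-model sketch of bacteria-sediment coupling: adsorption/
# desorption between free and attached phases, settling/resuspension to and
# from a bed store, and first-order die-off.  All rates are assumptions.
def step(free, attached, bed, dt,
         k_ads=0.5, k_des=0.2,        # adsorption / desorption rates (1/day)
         k_set=1.0, k_res=0.05,       # settling and resuspension rates (1/day)
         k_dec=0.8):                  # die-off rate (1/day)
    d_free = -k_ads * free + k_des * attached - k_dec * free
    d_att  = (k_ads * free - k_des * attached - k_set * attached
              + k_res * bed - k_dec * attached)
    d_bed  = k_set * attached - k_res * bed - k_dec * bed
    return (free + dt * d_free,
            attached + dt * d_att,
            bed + dt * d_bed)

free, attached, bed = 1e4, 0.0, 0.0      # cfu per 100 ml, arbitrary start
for _ in range(int(10 / 0.01)):          # 10 days of explicit 0.01-day steps
    free, attached, bed = step(free, attached, bed, dt=0.01)
print(f"after 10 days: free={free:.1f}, attached={attached:.1f}, bed={bed:.1f}")
```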

  9. Assessment and modeling of groundwater quality using WQI and GIS in Upper Egypt area.

    Science.gov (United States)

    Rabeiy, Ragab ElSayed

    2017-04-04

    The continuous growth and development of the population increase the demand for fresh water for drinking, irrigation, and domestic use in arid countries like Egypt. Evaluating groundwater quality is therefore essential to ensure its suitability for different purposes. In this study, 812 groundwater samples were taken within the middle area of Upper Egypt (Sohag Governorate) to assess the quality of groundwater for drinking and irrigation purposes. Eleven water parameters were analyzed for each groundwater sample (Na(+), K(+), Ca(2+), Mg(2+), HCO3(-), SO4(2-), Fe(2+), Mn(2+), Cl(-), electrical conductivity, and pH) for use in the water quality evaluation. Classical statistics were applied to the raw data to examine the distribution of physicochemical parameters in the investigated area. The relationship between groundwater parameters was tested using the correlation coefficient, and a strong relationship was found between several water parameters such as Ca(2+) and Cl(-). The water quality index (WQI) is a mathematical model used to transform many water parameters into a single indicator value which represents the water quality level. Results of the WQI showed that 20% of groundwater samples are excellent, 75% are good for drinking, and 7% are very poor, while only 1% of samples are unsuitable for drinking. To test the suitability of groundwater for irrigation, three indices are used: the sodium adsorption ratio (SAR), sodium percentage (Na%), and permeability index (PI). For irrigation suitability, the study showed that most sampling sites are suitable while less than 3% are unsuitable for irrigation. The spatial distribution of the estimated values of WQI, SAR, Na%, PI, and each groundwater parameter was modeled using GIS.
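
    The indices named in this abstract have widely used textbook forms: a weighted-arithmetic WQI, the sodium adsorption ratio, and the soluble sodium percentage. The sketch below implements those common forms; the weights, standards, and sample values are illustrative assumptions and may differ from the paper's exact scheme.

```python
# Commonly used forms of the water quality indices named above.  Weights,
# standards and sample values are illustrative, not the paper's scheme.
import math

def wqi(conc, standard, weight):
    """Weighted-arithmetic water quality index; dicts keyed by parameter."""
    num = sum(weight[p] * 100.0 * conc[p] / standard[p] for p in conc)
    den = sum(weight[p] for p in conc)
    return num / den

def sar(na, ca, mg):
    """Sodium adsorption ratio; inputs in meq/L."""
    return na / math.sqrt((ca + mg) / 2.0)

def sodium_percent(na, k, ca, mg):
    """Soluble sodium percentage; inputs in meq/L."""
    return 100.0 * (na + k) / (na + k + ca + mg)

# Hypothetical sample (mg/L for the WQI inputs, meq/L for the cations below).
sample   = {"Na": 230.0, "Cl": 250.0, "SO4": 200.0}
standard = {"Na": 200.0, "Cl": 250.0, "SO4": 250.0}   # e.g. drinking-water limits (assumed)
weight   = {"Na": 2.0,   "Cl": 3.0,   "SO4": 4.0}     # assumed relative weights

print("WQI =", round(wqi(sample, standard, weight), 1))
print("SAR =", round(sar(na=4.5, ca=3.0, mg=2.0), 2))
print("Na% =", round(sodium_percent(na=4.5, k=0.2, ca=3.0, mg=2.0), 1))
```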

  10. A quality metric for homology modeling: the H-factor

    Science.gov (United States)

    2011-01-01

    Background The analysis of protein structures provides fundamental insight into most biochemical functions and consequently into the cause and possible treatment of diseases. As the structures of most known proteins cannot be solved experimentally, owing to technical or sometimes simply time constraints, in silico protein structure prediction is expected to step in and generate a more complete picture of the protein structure universe. Molecular modeling of protein structures is a fast growing field and a tremendous amount of work has been done since the publication of the very first model. The growth of modeling techniques, and more specifically of those that rely on the existing experimental knowledge of protein structures, is intimately linked to the development of high-resolution experimental techniques such as NMR, X-ray crystallography and electron microscopy. This strong connection between experimental and in silico methods is however not devoid of criticisms and concerns among modelers as well as among experimentalists. Results In this paper, we focus on homology modeling and, more specifically, we review how it is perceived by the structural biology community and what can be done to impress on the experimentalists that it can be a valuable resource to them. We review the common practices and provide a set of guidelines for building better models. For that purpose, we introduce the H-factor, a new indicator for assessing the quality of homology models, mimicking the R-factor in X-ray crystallography. The method for computing the H-factor is fully described and validated on a series of test cases. Conclusions We have developed a web service for computing the H-factor for models of a protein structure. This service is freely accessible at http://koehllab.genomecenter.ucdavis.edu/toolkit/h-factor. PMID:21291572

  11. A variation reduction allocation model for quality improvement to minimize investment and quality costs by considering suppliers’ learning curve

    Science.gov (United States)

    Rosyidi, C. N.; Jauhari, WA; Suhardi, B.; Hamada, K.

    2016-02-01

    Quality improvement must be performed in a company to maintain the competitiveness of its products in the market. The goal of such improvement is to increase customer satisfaction and the profitability of the company. In current practice, a company needs several suppliers to provide the components used in the assembly of a final product. Hence, quality improvement of the final product must involve the suppliers. In this paper, an optimization model for allocating variance reduction is developed. Variation reduction is an important element of quality improvement for both the manufacturer and the suppliers. To improve the quality of the suppliers' components, the manufacturer must invest part of its financial resources in the suppliers' learning processes. The objective function of the model is to minimize the total cost, which consists of the investment cost and both internal and external quality costs. The learning curve determines how the suppliers' employees respond to the learning processes in reducing component variance.
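
    A much-simplified, hypothetical version of this allocation problem can be written as a small numerical optimization: choose the fraction of variance each supplier removes so that a convex investment cost, scaled down for suppliers that learn faster, plus a quality-loss cost on the remaining assembled variance is minimized. All coefficients below are invented; the paper's cost structure (separate internal and external quality costs, an explicit learning curve) is richer.

```python
# Hypothetical sketch of a variance-reduction allocation problem.  All
# coefficients are made up for illustration; this is not the paper's model.
import numpy as np
from scipy.optimize import minimize

sigma2_0 = np.array([4.0, 2.5, 3.0])      # initial component variances
learn    = np.array([0.9, 0.7, 0.8])      # cost multiplier after learning (faster learners are cheaper)
invest_c = np.array([40.0, 50.0, 45.0])   # base investment cost scale per supplier
loss_k   = 8.0                            # quality-loss coefficient on the assembled variance

def total_cost(x):
    """x[i] = fraction of supplier i's variance to remove (0..0.95)."""
    investment = np.sum(invest_c * learn * x ** 2)   # convex: deeper reductions cost ever more
    remaining  = np.sum(sigma2_0 * (1.0 - x))        # component variances add in the assembly
    return investment + loss_k * remaining           # quality costs collapsed into one loss term

res = minimize(total_cost, x0=np.full(3, 0.5),
               bounds=[(0.0, 0.95)] * 3, method="L-BFGS-B")
print("variance-reduction fractions per supplier:", np.round(res.x, 3))
print("minimum total cost:", round(res.fun, 2))
```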

  12. A manufacturing quality assessment model based-on two stages interval type-2 fuzzy logic

    Science.gov (United States)

    Purnomo, Muhammad Ridwan Andi; Helmi Shintya Dewi, Intan

    2016-01-01

    This paper presents the development of an assessment model for manufacturing quality using Interval Type-2 Fuzzy Logic (IT2-FL). The proposed model is developed based on one of the building blocks of sustainable supply chain management (SSCM), namely the benefits of SCM, and focuses on quality. The proposed model can be used to predict the quality level of the production chain in a company. The quality of production affects the quality of the product. In practice, the quality of production is unique to every type of production system. Hence, expert opinion plays a major role in developing the assessment model. The model becomes more complicated when the data contain ambiguity and uncertainty. In this study, IT2-FL is used to model this ambiguity and uncertainty. A case study taken from a company in Yogyakarta shows that the proposed manufacturing quality assessment model works well in determining the quality level of production.

  13. Examination of the Five Comparable Component Scores of the Diet Quality Indexes HEI-2005 and RC-DQI Using a Nationally Representative Sample of 2–18 Year Old Children: NHANES 2003–2006

    Directory of Open Access Journals (Sweden)

    Sibylle Kranz

    2013-01-01

    Full Text Available Obesity has been associated with low diet quality and the suboptimal intake of food groups and nutrients. Two composite diet quality measurement tools are appropriate for Americans 2–18 years old: the Healthy Eating Index (HEI-2005) and the Revised Children’s Diet Quality Index (RC-DQI). The five components included in both indexes are fruits, vegetables, total grains, whole grains, and milk/dairy. Component scores ranged from 0 to 5 or 0 to 10 points, with lower scores indicating suboptimal intake. To allow direct comparisons, one component was rescaled by dividing it by 2; then, all components ranged from 0 to 5 points. The aim of this study was to directly compare the scoring results of these five components using dietary data from a nationally representative sample of children (NHANES 2003–2006). Correlation coefficients within and between indexes showed less internal consistency in the HEI; age- and ethnic-group stratified analyses indicated higher sensitivity of the RC-DQI. HEI scoring was likely to dichotomize the population into two groups (those with 0 and those with 5 points), while RC-DQI scores resulted in a larger distribution of scores. The scoring scheme of diet quality indexes for children results in great variation of the outcomes, and researchers must be aware of those effects.

  14. Understanding the State of Quality of Software on the basis of Time Gap, Quality Gap and Difference with Standard Model

    Directory of Open Access Journals (Sweden)

    Ekbal Rashid

    2013-06-01

    Full Text Available This paper tries to introduce a new mathematical model for understanding the state of quality of software by calculating parameters such as the time gap and the quality gap in relation to some predefined standard software quality or to some chalked-out software quality plan. The paper also suggests methods to calculate the difference in quality between the software being developed and the model software that has been decided upon as the criterion for comparison. These methods can be employed to better understand the state of quality as compared to other standards. In order to obtain a graphical representation of the data, we have used Microsoft Office 2007 charts, which facilitate easy simulation of the time and quality gaps.

  15. Improved first-order uncertainty method for water-quality modeling

    Science.gov (United States)

    Melching, C.S.; Anmangandla, S.

    1992-01-01

    Uncertainties are unavoidable in water-quality modeling and the subsequent management decisions. Monte Carlo simulation and first-order uncertainty analysis (involving linearization at central values of the uncertain variables) have been frequently used to estimate probability distributions for water-quality model output because of their simplicity. Each method has its drawbacks: for Monte Carlo simulation, mainly computational time; for first-order analysis, mainly questions of accuracy and representativeness, especially for nonlinear systems and extreme conditions. An improved (advanced) first-order method is presented, in which the linearization point varies to match the output level whose exceedance probability is sought. The advanced first-order method is tested on the Streeter-Phelps equation to estimate the probability distribution of the critical dissolved-oxygen deficit and the critical dissolved oxygen using two hypothetical examples from the literature. The advanced first-order method provides a close approximation of the exceedance probability for the Streeter-Phelps model output estimated by Monte Carlo simulation while using two orders of magnitude less computer time, regardless of the probability distributions assumed for the uncertain model parameters.
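
    The comparison the abstract makes can be sketched numerically. The script below propagates parameter uncertainty through the Streeter-Phelps critical dissolved-oxygen deficit with Monte Carlo sampling and with ordinary first-order analysis linearized at the central values; the parameter means and standard deviations are invented, and the paper's advanced variant would instead re-linearize at the output level whose exceedance probability is sought.

```python
# Illustrative comparison of Monte Carlo and ordinary first-order uncertainty
# propagation for the Streeter-Phelps critical DO deficit.  Parameter values
# are assumptions; this is not the paper's advanced method.
import numpy as np

def critical_deficit(kd, ka, L0):
    """Streeter-Phelps critical deficit (initial deficit taken as zero)."""
    tc = np.log(ka / kd) / (ka - kd)          # critical time
    return (kd * L0 / ka) * np.exp(-kd * tc)  # deficit at the critical time

mean = np.array([0.3, 0.7, 20.0])             # kd (1/d), ka (1/d), L0 (mg/L)
std  = np.array([0.03, 0.07, 2.0])

# --- Monte Carlo ---
rng = np.random.default_rng(1)
samples = rng.normal(mean, std, size=(20000, 3))
mc = critical_deficit(samples[:, 0], samples[:, 1], samples[:, 2])

# --- First-order (central-difference sensitivities at the means) ---
grad = np.empty(3)
for i in range(3):
    h = 1e-4 * mean[i]
    up, dn = mean.copy(), mean.copy()
    up[i] += h; dn[i] -= h
    grad[i] = (critical_deficit(*up) - critical_deficit(*dn)) / (2 * h)
fo_std = np.sqrt(np.sum((grad * std) ** 2))

print(f"Monte Carlo: mean={mc.mean():.2f} mg/L, std={mc.std():.2f}")
print(f"First-order: mean={critical_deficit(*mean):.2f} mg/L, std={fo_std:.2f}")
```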

  16. Developing a Total Quality Management Model for Health Care Systems

    Directory of Open Access Journals (Sweden)

    AM Mosadegh Rad

    2005-10-01

    Full Text Available Background: Total quality management (TQM) is a managerial practice to improve the effectiveness, efficiency, flexibility, and competitiveness of a business as a whole. However, in practice, these TQM benefits are not easy to achieve. Despite its theoretical promise and the enthusiastic response to TQM, recent evidence suggests that attempts to implement it are often unsuccessful. Many of these TQM programmes have been cancelled, or are in the process of being cancelled, as a result of the negative impact on profits. Therefore, there is a pressing need for a clinical approach to establishing TQM. Method: The aim of this article is therefore “to identify the strengths and weaknesses of TQM and the logical steps towards TQM, and to develop a model so that health care organizations aiming to use TQM to achieve excellence can follow it easily”. Based on the research questions proposed in this study, the research strategies of a literature review, a questionnaire survey, semi-structured interviews, and participatory action research were adopted. To determine the successes and barriers of TQM in health care organizations, a questionnaire survey was conducted in 90 health care organizations in Isfahan Province that implement TQM. The results of this survey were used to introduce a new model of TQM. This model will be developed further via semi-structured interviews with at least 10 health care and quality managers. Then, through participatory action research, the model will be implemented in 3 sites. At this stage, the questionnaire survey has been completed and the model introduced; development of the model and its implementation will therefore be carried out later. Results: In this survey, the mean score of TQM success was 3.48±0.68 (medium) on a 5-point scale. Implementation of TQM was rated very low, low, medium, high, and very high in success in 3.6%, 10.9%, 21.8%, 56.4%, and 7.3% of health care organizations, respectively. TQM had the most effect on

  17. A method to represent ozone response to large changes in precursor emissions using high-order sensitivity analysis in photochemical models

    Directory of Open Access Journals (Sweden)

    G. Yarwood

    2013-09-01

    Full Text Available Photochemical grid models (PGMs) are used to simulate tropospheric ozone and quantify its response to emission changes. PGMs are often applied for annual simulations to provide both maximum concentrations for assessing compliance with air quality standards and frequency distributions for assessing human exposure. Efficient methods for computing ozone at different emission levels can improve the quality of ozone air quality management efforts. This study demonstrates the feasibility of using the decoupled direct method (DDM) to calculate first- and second-order sensitivity of ozone to anthropogenic NOx and VOC emissions in annual PGM simulations at continental scale. Algebraic models are developed that use Taylor series to produce complete annual frequency distributions of hourly ozone at any location and any anthropogenic emission level between zero and 100%, adjusted independently for NOx and VOC. We recommend computing the sensitivity coefficients at the midpoint of the emissions range over which they are intended to be applied, in this case with 50% anthropogenic emissions. The algebraic model predictions can be improved by combining sensitivity coefficients computed at 10 and 50% anthropogenic emissions. Compared to brute force simulations, algebraic model predictions tend to be more accurate in summer than winter, at rural than urban locations, and with 100% than zero anthropogenic emissions. Equations developed to combine sensitivity coefficients computed with 10 and 50% anthropogenic emissions are able to reproduce brute force simulation results with zero and 100% anthropogenic emissions with a mean bias of less than 2 ppb and mean error of less than 3 ppb averaged over 22 US cities.

  18. A method to represent ozone response to large changes in precursor emissions using high-order sensitivity analysis in photochemical models

    Directory of Open Access Journals (Sweden)

    G. Yarwood

    2013-04-01

    Full Text Available Photochemical grid models (PGMs) are used to simulate tropospheric ozone and quantify its response to emission changes. PGMs are often applied for annual simulations to provide both maximum concentrations for assessing compliance with air quality standards and frequency distributions for assessing human exposure. Efficient methods for computing ozone at different emission levels can improve the quality of ozone air quality management efforts. This study demonstrates the feasibility of using the decoupled direct method (DDM) to calculate first- and second-order sensitivity of ozone to anthropogenic NOx and VOC emissions in annual PGM simulations at continental scale. Algebraic models are developed that use Taylor series to produce complete annual frequency distributions of hourly ozone at any location and any anthropogenic emission level between zero and 100%, adjusted independently for NOx and VOC. We recommend computing the sensitivity coefficients at the mid-point of the emissions range over which they are intended to be applied, in this case with 50% anthropogenic emissions. The algebraic model predictions can be improved by combining sensitivity coefficients computed at 10% and 50% anthropogenic emissions. Compared to brute force simulations, algebraic model predictions tend to be more accurate in summer than winter, at rural than urban locations, and with 100% than zero anthropogenic emissions. Equations developed to combine sensitivity coefficients computed with 10% and 50% anthropogenic emissions are able to reproduce brute force simulation results with zero and 100% anthropogenic emissions with mean bias less than 2 ppb and mean error less than 3 ppb averaged over 22 US cities.
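
    The algebraic model described in the two records above amounts to a Taylor expansion of ozone about a reference emission level. The sketch below evaluates such an expansion using first- and second-order sensitivity coefficients taken at the recommended midpoint (50% anthropogenic emissions); the coefficient values are invented for illustration and are not model output.

```python
# Sketch of the Taylor-series reconstruction of hourly ozone at an arbitrary
# anthropogenic emission fraction e, from sensitivities computed at e0 = 0.5.
# The numbers below are invented for illustration, not DDM/PGM output.
def ozone_at(e, o3_ref, s1, s2, e0=0.5):
    """Second-order Taylor expansion about emission level e0 (0..1)."""
    de = e - e0
    return o3_ref + s1 * de + 0.5 * s2 * de ** 2

# Hypothetical values for one hour at one location, NOx dimension only:
o3_ref = 62.0   # ppb, simulated with 50% anthropogenic NOx emissions
s1     = 18.0   # ppb per unit emission fraction (first-order coefficient)
s2     = -6.0   # ppb per unit emission fraction squared (second-order coefficient)

for e in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"emissions at {e:4.0%}: O3 ~ {ozone_at(e, o3_ref, s1, s2):5.1f} ppb")
```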

  19. Understanding Flow Pathways, Mixing and Transit Times for Water Quality Modelling

    Science.gov (United States)

    Dunn, S. M.; Bacon, J. R.; Soulsby, C.; Tetzlaff, D.

    2007-12-01

    Water quality modelling requires representation of the physical processes controlling the movement of solutes and particulates at a level of detail appropriate to the objective of the model simulations. To understand and develop mitigation strategies for diffuse pollution at catchment scales, models must be able to represent the sources and age of water reaching rivers at different times. Experimental and modelling studies undertaken on several catchments in the north east of Scotland have used natural hydrochemical and isotopic tracers as a means of obtaining spatially integrated information about mixing processes. Methods for obtaining and integrating appropriate data are considered, together with the implications of neglecting them. The tracer data have been incorporated in a conceptual hydrological model to study the sensitivity of the modelled tracer response to factors that may not affect runoff simulations but do affect mixing and transit times of the water. Results from the studies have shown how model structural and parameter uncertainties can lead to errors in the representation of: the flow pathways of water; the degree to which these flow pathways have mixed; and the length of time for which water has been stored within the soil/groundwater system. It has been found to be difficult to eliminate structural uncertainty regarding the mechanisms of mixing, and parameter uncertainty regarding the role of groundwater. Simulations of nitrate pollution, resulting from the application of agricultural fertilisers, have been undertaken to demonstrate the sensitivity of water quality simulations to the potential errors in physical transport mechanisms inherent in models that fail to account correctly for flow pathways, mixing and transit times.
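
    One concrete example of the spatially integrated information that conservative tracers provide is the classical two-component end-member mixing calculation sketched below, which apportions streamflow between pre-event (groundwater) and event water. The concentrations used are hypothetical and are not data from the Scottish catchments.

```python
# Standard two-component end-member mixing with a conservative tracer
# (e.g. chloride or an isotope ratio).  Concentrations below are assumptions.
def two_component_mixing(c_stream, c_old, c_new):
    """Fraction of streamflow supplied by the 'old' (pre-event) end member."""
    if c_old == c_new:
        raise ValueError("end members must differ for the tracer to separate them")
    return (c_stream - c_new) / (c_old - c_new)

f_old = two_component_mixing(c_stream=12.0, c_old=18.0, c_new=4.0)  # mg/L chloride
print(f"old-water fraction: {f_old:.2f}, event-water fraction: {1 - f_old:.2f}")
```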

  20. Representing the effects of alpine grassland vegetation cover on the simulation of soil thermal dynamics by ecosystem models applied to the Qinghai-Tibetan Plateau

    Science.gov (United States)

    Yi, S.; Li, N.; Xiang, B.; Wang, X.; Ye, B.; McGuire, A.D.

    2013-01-01

    Soil surface temperature is a critical boundary condition for the simulation of soil temperature by environmental models. It is influenced by atmospheric and soil conditions and by vegetation cover. In sophisticated land surface models, it is simulated iteratively by solving surface energy budget equations. In ecosystem, permafrost, and hydrology models, the consideration of soil surface temperature is generally simple. In this study, we developed a methodology for representing the effects of vegetation cover and atmospheric factors on the estimation of soil surface temperature for alpine grassland ecosystems on the Qinghai-Tibetan Plateau. Our approach integrated measurements from meteorological stations with simulations from a sophisticated land surface model to develop an equation set for estimating soil surface temperature. After implementing this equation set into an ecosystem model and evaluating the performance of the ecosystem model in simulating soil temperature at different depths in the soil profile, we applied the model to simulate interactions among vegetation cover, freeze-thaw cycles, and soil erosion to demonstrate potential applications made possible through the implementation of the methodology developed in this study. Results showed that (1) to properly estimate daily soil surface temperature, algorithms should use air temperature, downward solar radiation, and vegetation cover as independent variables; (2) the equation set developed in this study performed better than soil surface temperature algorithms used in other models; and (3) the ecosystem model performed well in simulating soil temperature throughout the soil profile using the equation set developed in this study. Our application of the model indicates that the representation in ecosystem models of the effects of vegetation cover on the simulation of soil thermal dynamics has the potential to substantially improve our understanding of the vulnerability of alpine grassland ecosystems to