WorldWideScience

Sample records for modeling requires assumptions

  1. Assessing moderated mediation in linear models requires fewer confounding assumptions than assessing mediation.

    Science.gov (United States)

    Loeys, Tom; Talloen, Wouter; Goubert, Liesbet; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2016-11-01

    It is well known from the mediation analysis literature that the identification of direct and indirect effects relies on strong assumptions of no unmeasured confounding. Even in randomized studies the mediator may still be correlated with unobserved prognostic variables that affect the outcome, in which case the mediator's role in the causal process may not be inferred without bias. In the behavioural and social science literature very little attention has been given so far to the causal assumptions required for moderated mediation analysis. In this paper we focus on the index for moderated mediation, which measures by how much the mediated effect is larger or smaller for varying levels of the moderator. We show that in linear models this index can be estimated without bias in the presence of unmeasured common causes of the moderator, mediator and outcome under certain conditions. Importantly, one can thus use the test for moderated mediation to support evidence for mediation under less stringent confounding conditions. We illustrate our findings with data from a randomized experiment assessing the impact of being primed with social deception upon observer responses to others' pain, and from an observational study of individuals who ended a romantic relationship assessing the effect of attachment anxiety during the relationship on mental distress 2 years after the break-up. © 2016 The British Psychological Society.
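
    In the linear setting, the index for moderated mediation is the product of the moderation coefficient on the exposure-mediator path (a3) and the mediator-outcome coefficient (b). A minimal simulation sketch of its estimation follows; the variable names, effect sizes, and two-regression setup are illustrative, not taken from the paper.

```python
# Minimal sketch of estimating the index of moderated mediation in linear
# models. Variable names and effect sizes are illustrative, not the paper's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 5000
X = rng.binomial(1, 0.5, n)                                 # randomized exposure
W = rng.normal(size=n)                                      # moderator
M = 0.5 * X + 0.3 * W + 0.4 * X * W + rng.normal(size=n)    # mediator
Y = 0.6 * M + 0.2 * X + rng.normal(size=n)                  # outcome
df = pd.DataFrame(dict(X=X, W=W, M=M, Y=Y))

a3 = smf.ols("M ~ X * W", data=df).fit().params["X:W"]   # moderated X -> M path
b = smf.ols("Y ~ M + X * W", data=df).fit().params["M"]  # M -> Y path
print(f"index of moderated mediation: {a3 * b:.3f} (true 0.4 * 0.6 = 0.24)")
```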

  2. Limiting assumptions in molecular modeling: electrostatics.

    Science.gov (United States)

    Marshall, Garland R

    2013-02-01

    Molecular mechanics attempts to represent intermolecular interactions in terms of classical physics. Initial efforts assumed a point charge located at the atom center and coulombic interactions. It has been recognized over multiple decades that simply representing electrostatics with a charge on each atom failed to reproduce the electrostatic potential surrounding a molecule as estimated by quantum mechanics. Molecular orbitals are not spherically symmetrical, an implicit assumption of monopole electrostatics. This perspective reviews recent evidence that requires the use of multipole electrostatics and polarizability in molecular modeling.
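
    A toy calculation makes the limitation concrete: a neutral pair of point charges has zero net monopole, so a single site-centered charge predicts no potential at all, while the actual (and dipole-approximated) potential is nonzero. The geometry and values below are invented for illustration.

```python
# Toy comparison: point-monopole vs dipole electrostatic potential.
# Geometry and charges are illustrative only.
import numpy as np

k = 8.9875517873681764e9  # Coulomb constant, N m^2 / C^2
q = 1.602176634e-19       # elementary charge, C
d = 1.0e-11               # charge separation, m (fraction of a bond length)

# A +q/-q pair straddling the origin along z: net monopole charge = 0.
charges = [(+q, np.array([0.0, 0.0, +d / 2])),
           (-q, np.array([0.0, 0.0, -d / 2]))]

r_obs = np.array([0.0, 0.0, 3.0e-10])  # observation point 3 Angstrom along z

exact = sum(k * qi / np.linalg.norm(r_obs - ri) for qi, ri in charges)

# Monopole approximation: total charge collapsed to the midpoint (zero here).
monopole = k * sum(qi for qi, _ in charges) / np.linalg.norm(r_obs)

# Ideal point-dipole approximation: V = k * (p . r_hat) / r^2.
p = sum(qi * ri for qi, ri in charges)          # dipole moment vector
r = np.linalg.norm(r_obs)
dipole = k * np.dot(p, r_obs / r) / r**2

print(f"exact: {exact:.3e} V, monopole: {monopole:.3e} V, dipole: {dipole:.3e} V")
```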

  3. Evolution of Requirements and Assumptions for Future Exploration Missions

    Science.gov (United States)

    Anderson, Molly; Sargusingh, Miriam; Perry, Jay

    2017-01-01

    NASA programs are maturing technologies, systems, and architectures to enable future exploration missions. To increase fidelity as technologies mature, developers must make assumptions that represent the requirements of a future program. Multiple efforts have begun to define these requirements, including team-internal assumptions, planning system integration for early demonstrations, and discussions between international partners planning future collaborations. For many detailed life support system requirements, existing NASA documents set limits of acceptable values, but a future vehicle may be constrained in other ways and select a limited range of conditions. Other requirements are effectively set by interfaces or operations, and may be different for the same technology depending on whether the hardware is a demonstration system on the International Space Station or a critical component of a future vehicle. This paper highlights key assumptions representing potential life support requirements and explanations of the driving scenarios, constraints, or other issues that drive them.

  4. Deep Borehole Field Test Requirements and Controlled Assumptions.

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2015-07-01

    This document presents design requirements and controlled assumptions intended for use in the engineering development and testing of: 1) prototype packages for radioactive waste disposal in deep boreholes; 2) a waste package surface handling system; and 3) a subsurface system for emplacing and retrieving packages in deep boreholes. Engineering development and testing is being performed as part of the Deep Borehole Field Test (DBFT; SNL 2014a). This document presents parallel sets of requirements for a waste disposal system and for the DBFT, showing the close relationship. In addition to design, it will also inform planning for drilling, construction, and scientific characterization activities for the DBFT. The information presented here follows typical preparations for engineering design. It includes functional and operating requirements for handling and emplacement/retrieval equipment, waste package design and emplacement requirements, borehole construction requirements, sealing requirements, and performance criteria. Assumptions are included where they could impact engineering design. Design solutions are avoided in the requirements discussion.

  5. Estimators for longitudinal latent exposure models: examining measurement model assumptions.

    Science.gov (United States)

    Sánchez, Brisa N; Kim, Sehee; Sammel, Mary D

    2017-06-15

    Latent variable (LV) models are increasingly being used in environmental epidemiology as a way to summarize multiple environmental exposures and thus minimize statistical concerns that arise in multiple regression. LV models may be especially useful when multivariate exposures are collected repeatedly over time. LV models can accommodate a variety of assumptions but, at the same time, present the user with many choices for model specification particularly in the case of exposure data collected repeatedly over time. For instance, the user could assume conditional independence of observed exposure biomarkers given the latent exposure and, in the case of longitudinal latent exposure variables, time invariance of the measurement model. Choosing which assumptions to relax is not always straightforward. We were motivated by a study of prenatal lead exposure and mental development, where assumptions of the measurement model for the time-changing longitudinal exposure have appreciable impact on (maximum-likelihood) inferences about the health effects of lead exposure. Although we were not particularly interested in characterizing the change of the LV itself, imposing a longitudinal LV structure on the repeated multivariate exposure measures could result in high efficiency gains for the exposure-disease association. We examine the biases of maximum likelihood estimators when assumptions about the measurement model for the longitudinal latent exposure variable are violated. We adapt existing instrumental variable estimators to the case of longitudinal exposures and propose them as an alternative to estimate the health effects of a time-changing latent predictor. We show that instrumental variable estimators remain unbiased for a wide range of data generating models and have advantages in terms of mean squared error. Copyright © 2017 John Wiley & Sons, Ltd.
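
    The instrumental-variable idea can be conveyed with a deliberately simplified cross-sectional sketch (my construction, not the authors' longitudinal estimator): with two error-prone biomarkers of the same latent exposure, naive regression on one biomarker attenuates the health effect, while using the second biomarker as an instrument removes the attenuation.

```python
# Simplified errors-in-variables sketch: two noisy biomarkers W1, W2 of a
# latent exposure L; naive OLS on W1 is attenuated, IV using W2 is not.
# This is an illustrative construction, not the paper's longitudinal estimator.
import numpy as np

rng = np.random.default_rng(0)
n, beta = 100_000, 0.5
L = rng.normal(size=n)                    # latent exposure
W1 = L + rng.normal(scale=0.8, size=n)    # biomarker 1 (measurement error)
W2 = L + rng.normal(scale=0.8, size=n)    # biomarker 2 (independent error)
Y = beta * L + rng.normal(size=n)         # health outcome

ols = np.cov(W1, Y)[0, 1] / np.var(W1)           # attenuated toward zero
iv = np.cov(W2, Y)[0, 1] / np.cov(W2, W1)[0, 1]  # consistent for beta
print(f"true {beta}, naive OLS {ols:.3f}, IV {iv:.3f}")
```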

  6. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

    waste LCA models. This review infers that some of the differences in waste LCA models are inherent to the time they were developed. It is expected that models developed later benefit from past modelling assumptions, knowledge and issues. Models developed in different countries furthermore rely...

  7. Expressing Environment Assumptions and Real-time Requirements for a Distributed Embedded System with Shared Variables

    DEFF Research Database (Denmark)

    Tjell, Simon; Fernandes, João Miguel

    2008-01-01

    In a distributed embedded system, it is often necessary to share variables among its computing nodes to allow the distribution of control algorithms. It is therefore necessary to include a component in each node that provides the service of variable sharing. For that type of component, this paper...... for the component. The CPN model can be used to validate the environment assumptions and the requirements. The validation is performed by execution of the model during which traces of events and states are automatically generated and evaluated against the requirements....

  8. Assumptions behind size-based ecosystem models are realistic

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Blanchard, Julia L.; Fulton, Elizabeth A.

    2016-01-01

    A recent publication about balanced harvesting (Froese et al., ICES Journal of Marine Science; doi:10.1093/icesjms/fsv122) contains several erroneous statements about size-spectrum models. We refute the statements by showing that the assumptions pertaining to size-spectrum models discussed by Fro...... that there is indeed a constructive role for a wide suite of ecosystem models to evaluate fishing strategies in an ecosystem context...

  9. Lightweight Graphical Models for Selectivity Estimation Without Independence Assumptions

    DEFF Research Database (Denmark)

    Tzoumas, Kostas; Deshpande, Amol; Jensen, Christian S.

    2011-01-01

    , propagated exponentially, can lead to severely sub-optimal plans. Modern optimizers typically maintain one-dimensional statistical summaries and make the attribute value independence and join uniformity assumptions for efficiently estimating selectivities. Therefore, selectivity estimation errors in today's optimizers are frequently caused by missed correlations between attributes. We present a selectivity estimation approach that does not make the independence assumptions. By carefully using concepts from the field of graphical models, we are able to factor the joint probability distribution of all
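
    A small numerical sketch (synthetic data, not from the paper) shows why the attribute value independence (AVI) assumption misses correlations: for a conjunctive predicate over two correlated attributes, the product of the marginal selectivities is far from the true selectivity, while a factorization that conditions one attribute on the other is exact here.

```python
# Selectivity of (city = 'Oslo' AND country = 'Norway') under the attribute
# value independence (AVI) assumption vs. a pairwise conditional factor.
# Synthetic data, illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
country = rng.choice(["Norway", "Denmark"], size=n)
# city is strongly correlated with country:
city = np.where(country == "Norway",
                rng.choice(["Oslo", "Bergen"], size=n),
                rng.choice(["Copenhagen", "Aarhus"], size=n))

true_sel = np.mean((city == "Oslo") & (country == "Norway"))

# AVI estimate: product of one-dimensional selectivities.
avi_sel = np.mean(city == "Oslo") * np.mean(country == "Norway")

# Graphical-model-style factorization: P(country) * P(city | country).
p_country = np.mean(country == "Norway")
p_city_given = np.mean(city[country == "Norway"] == "Oslo")
cond_sel = p_country * p_city_given

print(f"true {true_sel:.3f}, AVI {avi_sel:.3f}, conditional {cond_sel:.3f}")
```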

  10. Cost and Performance Assumptions for Modeling Electricity Generation Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Tidball, R.; Bluestein, J.; Rodriguez, N.; Knoke, S.

    2010-11-01

    The goal of this project was to compare and contrast utility scale power plant characteristics used in data sets that support energy market models. Characteristics include both technology cost and technology performance projections to the year 2050. Cost parameters include installed capital costs and operation and maintenance (O&M) costs. Performance parameters include plant size, heat rate, capacity factor or availability factor, and plant lifetime. Conventional, renewable, and emerging electricity generating technologies were considered. Six data sets, each associated with a different model, were selected. Two of the data sets represent modeled results, not direct model inputs. These two data sets include cost and performance improvements that result from increased deployment as well as resulting capacity factors estimated from particular model runs; other data sets represent model input data. For the technologies contained in each data set, the levelized cost of energy (LCOE) was also evaluated, according to published cost, performance, and fuel assumptions.
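
    For reference, a commonly used simplified form of the LCOE calculation underlying such comparisons is sketched below; all input values are placeholders, not figures from the report.

```python
# Simplified LCOE sketch; input values are placeholders, not from the report.
def lcoe(capital_per_kw, fixed_om_per_kw_yr, var_om_per_mwh,
         heat_rate_btu_per_kwh, fuel_per_mmbtu, capacity_factor,
         lifetime_yr, discount_rate):
    """Levelized cost of energy in $/MWh for 1 kW of installed capacity."""
    # Capital recovery factor annualizes the overnight capital cost.
    crf = (discount_rate * (1 + discount_rate) ** lifetime_yr /
           ((1 + discount_rate) ** lifetime_yr - 1))
    mwh_per_yr = 8760 * capacity_factor / 1000           # MWh per kW per year
    annual_fixed = capital_per_kw * crf + fixed_om_per_kw_yr
    fuel_per_mwh = heat_rate_btu_per_kwh * fuel_per_mmbtu / 1000
    return annual_fixed / mwh_per_yr + var_om_per_mwh + fuel_per_mwh

# Placeholder inputs loosely shaped like a gas combined-cycle plant:
print(f"{lcoe(1000, 15, 3, 7000, 5, 0.85, 30, 0.07):.1f} $/MWh")
```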

  11. Technoeconomic assumptions adopted for the development of a long-term electricity supply model for Cyprus.

    Science.gov (United States)

    Taliotis, Constantinos; Taibi, Emanuele; Howells, Mark; Rogner, Holger; Bazilian, Morgan; Welsch, Manuel

    2017-10-01

    The generation mix of Cyprus has been dominated by oil products for decades. In order to conform with European Union and international legislation, a transformation of the supply system is called for. Energy system models can facilitate energy planning into the future, but a large volume of data is required to populate such models. The present data article provides information on key modelling assumptions and input data adopted with the aim of representing the electricity supply system of Cyprus in a separate research article. Data regarding renewable energy technoeconomic characteristics and investment cost projections, fossil fuel price projections, storage technology characteristics and system operation assumptions are described in this article.

  12. PKreport: report generation for checking population pharmacokinetic model assumptions

    Directory of Open Access Journals (Sweden)

    Li Jun

    2011-05-01

    Background: Graphics play an important and unique role in population pharmacokinetic (PopPK) model building by exploring hidden structure among data before modeling, evaluating model fit, and validating results after modeling. Results: The work described in this paper is about a new R package called PKreport, which is able to generate a collection of plots and statistics for testing model assumptions, visualizing data and diagnosing models. The metric system is utilized as the currency for communicating between data sets and the package to generate special-purpose plots. It provides ways to match output from diverse software such as NONMEM, Monolix, the R nlme package, etc. The package is implemented with an S4 class hierarchy, and offers an efficient way to access the output from NONMEM 7. The final reports take advantage of the web browser as user interface to manage and visualize plots. Conclusions: PKreport provides (1) a flexible and efficient R class to store and retrieve NONMEM 7 output; (2) automated plots for users to visualize data and models; (3) automatically generated R scripts that are used to create the plots; (4) an archive-oriented management tool for users to store, retrieve and modify figures; (5) high-quality graphs based on the R packages lattice and ggplot2. The general architecture, running environment and statistical methods can be readily extended with the R class hierarchy. PKreport is free to download at http://cran.r-project.org/web/packages/PKreport/index.html.

  13. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

    Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user-base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and likely more applied in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus using improved data and enhanced assumptions, on model outcomes and thus ultimately, study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.

  14. On the underlying assumptions of threshold Boolean networks as a model for genetic regulatory network behavior.

    Science.gov (United States)

    Tran, Van; McCall, Matthew N; McMurray, Helene R; Almudevar, Anthony

    2013-01-01

    Boolean networks (BoN) are relatively simple and interpretable models of gene regulatory networks. Specifying these models with fewer parameters while retaining their ability to describe complex regulatory relationships is an ongoing methodological challenge. Additionally, extending these models to incorporate variable gene decay rates, asynchronous gene response, and synergistic regulation while maintaining their Markovian nature increases the applicability of these models to genetic regulatory networks (GRN). We explore a previously-proposed class of BoNs characterized by linear threshold functions, which we refer to as threshold Boolean networks (TBN). Compared to traditional BoNs with unconstrained transition functions, these models require far fewer parameters and offer a more direct interpretation. However, the functional form of a TBN does result in a reduction in the regulatory relationships which can be modeled. We show that TBNs can be readily extended to permit self-degradation, with explicitly modeled degradation rates. We note that the introduction of variable degradation compromises the Markovian property fundamental to BoN models but show that a simple state augmentation procedure restores their Markovian nature. Next, we study the effect of assumptions regarding self-degradation on the set of possible steady states. Our findings are captured in two theorems relating self-degradation and regulatory feedback to the steady state behavior of a TBN. Finally, we explore assumptions of synchronous gene response and asynergistic regulation and show that TBNs can be easily extended to relax these assumptions. Applying our methods to the budding yeast cell-cycle network revealed that although the network is complex, its steady state is simplified by the presence of self-degradation and lack of purely positive regulatory cycles.
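
    The core update rule of a threshold Boolean network is compact enough to state directly: gene i switches on when the weighted sum of its regulators' states exceeds its threshold. The sketch below iterates a small illustrative network (not one from the paper) to an attractor.

```python
# Synchronous threshold Boolean network (TBN) update: gene i switches on
# when the weighted sum of its regulators' states exceeds its threshold.
# The 3-gene network below is illustrative, not from the paper.
import numpy as np

W = np.array([[ 0, -2,  1],   # W[i, j]: regulatory weight of gene j on gene i
              [ 1,  0,  0],
              [ 0,  1, -1]])
theta = np.array([0, 0, 0])   # per-gene activation thresholds

def step(x):
    return (W @ x > theta).astype(int)

x = np.array([1, 0, 1])
seen = {}
while tuple(x) not in seen:   # iterate synchronously until a state repeats
    seen[tuple(x)] = len(seen)
    x = step(x)
print("attractor reached at state:", x)
```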

  15. On the ontological assumptions of the medical model of psychiatry: philosophical considerations and pragmatic tasks

    Directory of Open Access Journals (Sweden)

    Giordano James

    2010-01-01

    A common theme in the contemporary medical model of psychiatry is that pathophysiological processes are centrally involved in the explanation, evaluation, and treatment of mental illnesses. Implied in this perspective is that clinical descriptors of these pathophysiological processes are sufficient to distinguish underlying etiologies. Psychiatric classification requires differentiation between what counts as normality (i.e., order) and what counts as abnormality (i.e., disorder). The distinction(s) between normality and pathology entail assumptions that are often deeply presupposed, manifesting themselves in statements about what mental disorders are. In this paper, we explicate that realism, naturalism, reductionism, and essentialism are core ontological assumptions of the medical model of psychiatry. We argue that while naturalism, realism, and reductionism can be reconciled with advances in contemporary neuroscience, essentialism - as defined to date - may be conceptually problematic, and we pose an eidetic construct of bio-psychosocial order and disorder based upon complex systems' dynamics. However we also caution against the overuse of any theory, and claim that practical distinctions are important to the establishment of clinical thresholds. We opine that as we move ahead toward both a new edition of the Diagnostic and Statistical Manual, and a proposed Decade of the Mind, the task at hand is to re-visit nosologic and ontologic assumptions pursuant to a re-formulation of diagnostic criteria and practice.

  16. On the ontological assumptions of the medical model of psychiatry: philosophical considerations and pragmatic tasks

    Science.gov (United States)

    2010-01-01

    A common theme in the contemporary medical model of psychiatry is that pathophysiological processes are centrally involved in the explanation, evaluation, and treatment of mental illnesses. Implied in this perspective is that clinical descriptors of these pathophysiological processes are sufficient to distinguish underlying etiologies. Psychiatric classification requires differentiation between what counts as normality (i.e., order), and what counts as abnormality (i.e., disorder). The distinction(s) between normality and pathology entail assumptions that are often deeply presupposed, manifesting themselves in statements about what mental disorders are. In this paper, we explicate that realism, naturalism, reductionism, and essentialism are core ontological assumptions of the medical model of psychiatry. We argue that while naturalism, realism, and reductionism can be reconciled with advances in contemporary neuroscience, essentialism - as defined to date - may be conceptually problematic, and we pose an eidetic construct of bio-psychosocial order and disorder based upon complex systems' dynamics. However we also caution against the overuse of any theory, and claim that practical distinctions are important to the establishment of clinical thresholds. We opine that as we move ahead toward both a new edition of the Diagnostic and Statistical Manual, and a proposed Decade of the Mind, the task at hand is to re-visit nosologic and ontologic assumptions pursuant to a re-formulation of diagnostic criteria and practice. PMID:20109176

  17. Lifelong learning: Foundational models, underlying assumptions and critiques

    Science.gov (United States)

    Regmi, Kapil Dev

    2015-04-01

    Lifelong learning has become a catchword in almost all countries because of its growing influence on education policies in the globalised world. In the Organisation for Economic Cooperation and Development (OECD) and the European Union (EU), the promotion of lifelong learning has been a strategy to speed up economic growth and become competitive. For UNESCO and the World Bank, lifelong learning has been a novel education model to improve educational policies and programmes in developing countries. In the existing body of literature on the topic, various models of lifelong learning are discussed. After reviewing a number of relevant seminal texts by proponents of a variety of schools, this paper argues that the vast number of approaches are actually built on two foundational models, which the author calls the "human capital model" and the "humanistic model". The former aims to increase productive capacity by encouraging competition, privatisation and human capital formation so as to enhance economic growth. The latter aims to strengthen democracy and social welfare by fostering citizenship education, building social capital and expanding capability.

  18. Testing the habituation assumption underlying models of parasitoid foraging behavior

    NARCIS (Netherlands)

    Abram, Paul K.; Cusumano, Antonino; Abram, Katrina; Colazza, Stefano; Peri, Ezio

    2017-01-01

    Background. Habituation, a form of non-associative learning, has several well-defined characteristics that apply to a wide range of physiological and behavioral responses in many organisms. In classic patch time allocation models, habituation is considered to be a major mechanistic component of

  19. A note on the translation of conceptual data models into description logics: disjointness and covering assumptions

    CSIR Research Space (South Africa)

    Casini, G

    2012-10-01

    In this paper we propose two simple procedures to assist modelers with integrating these assumptions into their models, thereby allowing for a more complete translation into DLs.

  20. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    Science.gov (United States)

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.
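
    The flavor of such a power assessment can be reproduced with a small Monte Carlo sketch (my construction using the lifelines package, not the paper's simulation design): two groups are given different Weibull shapes so their hazards are non-proportional, and we count how often a PH test rejects.

```python
# Monte Carlo sketch of power to detect a PH violation (illustrative, not the
# paper's design). Requires numpy, pandas and the lifelines package.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(7)
n_sims, n, rejections = 200, 400, 0

for _ in range(n_sims):
    x = rng.integers(0, 2, n)                      # binary covariate
    # Different Weibull shapes per group => non-proportional hazards.
    t = np.where(x == 0, rng.weibull(1.0, n), 0.8 * rng.weibull(2.0, n))
    df = pd.DataFrame({"T": t, "E": 1, "x": x})    # no censoring, for brevity
    cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
    res = proportional_hazard_test(cph, df, time_transform="rank")
    if np.atleast_1d(res.p_value)[0] < 0.05:
        rejections += 1

print(f"estimated power: {rejections / n_sims:.2f}")
```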

  1. A computational model to investigate assumptions in the headturn preference procedure

    Directory of Open Access Journals (Sweden)

    Christina Bergmann

    2013-10-01

    In this paper we use a computational model to investigate four assumptions that are tacitly present in interpreting the results of studies on infants' speech processing abilities using the Headturn Preference Procedure (HPP): (1) behavioural differences originate in different processing; (2) processing involves some form of recognition; (3) words are segmented from connected speech; and (4) differences between infants should not affect overall results. In addition, we investigate the impact of two potentially important aspects in the design and execution of the experiments: (a) the specific voices used in the two parts of HPP experiments (familiarisation and test) and (b) the experimenter's criterion for what is a sufficient headturn angle. The model is designed to maximise cognitive plausibility. It takes real speech as input, and it contains a module that converts the output of internal speech processing and recognition into headturns that can yield real-time listening preference measurements. Internal processing is based on distributed episodic representations in combination with a matching procedure based on the assumption that complex episodes can be decomposed as positive weighted sums of simpler constituents. Model simulations show that the first assumptions hold under two different definitions of recognition. However, explicit segmentation is not necessary to simulate the behaviours observed in infant studies. Differences in attention span between infants can affect the outcomes of an experiment. The same holds for the experimenter's decision criterion. The speakers used in experiments affect outcomes in complex ways that require further investigation. The paper ends with recommendations for future studies using the HPP.

  2. Sensitivity of fluvial sediment source apportionment to mixing model assumptions: A Bayesian model comparison.

    Science.gov (United States)

    Cooper, Richard J; Krueger, Tobias; Hiscock, Kevin M; Rawlins, Barry G

    2014-11-01

    Mixing models have become increasingly common tools for apportioning fluvial sediment load to various sediment sources across catchments using a wide variety of Bayesian and frequentist modeling approaches. In this study, we demonstrate how different model setups can impact upon resulting source apportionment estimates in a Bayesian framework via a one-factor-at-a-time (OFAT) sensitivity analysis. We formulate 13 versions of a mixing model, each with different error assumptions and model structural choices, and apply them to sediment geochemistry data from the River Blackwater, Norfolk, UK, to apportion suspended particulate matter (SPM) contributions from three sources (arable topsoils, road verges, and subsurface material) under base flow conditions between August 2012 and August 2013. Whilst all 13 models estimate subsurface sources to be the largest contributor of SPM (median ∼76%), comparison of apportionment estimates reveal varying degrees of sensitivity to changing priors, inclusion of covariance terms, incorporation of time-variant distributions, and methods of proportion characterization. We also demonstrate differences in apportionment results between a full and an empirical Bayesian setup, and between a Bayesian and a frequentist optimization approach. This OFAT sensitivity analysis reveals that mixing model structural choices and error assumptions can significantly impact upon sediment source apportionment results, with estimated median contributions in this study varying by up to 21% between model versions. Users of mixing models are therefore strongly advised to carefully consider and justify their choice of model structure prior to conducting sediment source apportionment investigations. Key Points: an OFAT sensitivity analysis of sediment fingerprinting mixing models is conducted; Bayesian models display high sensitivity to error assumptions and structural choices; source apportionment results differ between Bayesian and frequentist approaches.
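
    At its core, a sediment fingerprinting mixing model solves for non-negative source proportions that sum to one. The sketch below is a frequentist least-squares version with invented tracer values, showing only the basic structure that the authors' Bayesian variants extend with error terms, covariance, and time-variant distributions.

```python
# Minimal frequentist sediment mixing model: find source proportions p >= 0,
# sum(p) = 1, minimizing the misfit between mixture and weighted sources.
# Tracer values are invented for illustration.
import numpy as np
from scipy.optimize import minimize

# rows: geochemical tracers; columns: arable topsoil, road verge, subsurface
S = np.array([[12.0,  5.0,  2.5],
              [ 0.8,  3.1,  0.4],
              [45.0, 30.0, 80.0],
              [ 6.2,  9.5,  1.1]])
y = np.array([5.1, 0.9, 68.0, 2.6])   # tracer signature of suspended sediment

def misfit(p):
    return np.sum((S @ p - y) ** 2)

res = minimize(misfit, x0=np.full(3, 1 / 3), method="SLSQP",
               bounds=[(0, 1)] * 3,
               constraints={"type": "eq", "fun": lambda p: p.sum() - 1})
print("estimated source proportions:", res.x.round(3))
```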

  3. Analysis of Modeling Assumptions used in Production Cost Models for Renewable Integration Studies

    Energy Technology Data Exchange (ETDEWEB)

    Stoll, Brady [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Townsend, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Bloom, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States)]

    2016-01-01

    Renewable energy integration studies have been published for many different regions exploring the question of how higher penetration of renewable energy will impact the electric grid. These studies each make assumptions about the systems they are analyzing; however, the effect of many of these assumptions has not yet been examined and published. In this paper we analyze the impact of modeling assumptions in renewable integration studies, including the optimization method used (linear or mixed-integer programming) and the temporal resolution of the dispatch stage (hourly or sub-hourly). We analyze each of these assumptions on a large and a small system and determine the impact of each assumption on key metrics including the total production cost, curtailment of renewables, CO2 emissions, and generator starts and ramps. Additionally, we identified the impact on these metrics if a four-hour-ahead commitment step is included before the dispatch step and the impact of retiring generators to reduce the degree to which the system is overbuilt. We find that the largest effect of these assumptions is at the unit level on starts and ramps, particularly for the temporal resolution, and saw a smaller impact at the aggregate level on system costs and emissions. For each fossil fuel generator type we measured the average capacity started, average run-time per start, and average number of ramps. Linear programming results saw up to a 20% difference in number of starts and average run time of traditional generators, and up to a 4% difference in the number of ramps, when compared to mixed-integer programming. Utilizing hourly dispatch instead of sub-hourly dispatch saw no difference in coal or gas CC units for either start metric, while gas CT units had a 5% increase in the number of starts and a 2% increase in the average on-time per start. The number of ramps decreased up to 44%. The smallest effect seen was on the CO2 emissions and total production cost, with a 0.8% and 0
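
    The linear-versus-mixed-integer distinction can be seen in a deliberately tiny commitment example (invented numbers): with binary on/off decisions, a unit with a minimum stable level must either stay off or start, whereas the linear relaxation may commit it fractionally, changing both cost and start counts.

```python
# Toy unit commitment: one hour, 40 MW demand, two generators with minimum
# stable levels. The MIP is solved by enumerating on/off; the LP relaxation
# allows fractional commitment. All numbers are invented for illustration.
import itertools
import numpy as np
from scipy.optimize import linprog

# generator:  (min_MW, max_MW, $/MWh)
gens = [(50.0, 100.0, 20.0),   # cheap unit with a high minimum level
        (10.0, 100.0, 40.0)]   # expensive peaker
demand = 40.0

def dispatch_cost(u):
    """Least-cost dispatch for a fixed commitment vector u (0/1 or fractional)."""
    c = [g[2] for g in gens]
    bounds = [(g[0] * ui, g[1] * ui) for g, ui in zip(gens, u)]
    res = linprog(c, A_eq=[[1, 1]], b_eq=[demand], bounds=bounds)
    return res.fun if res.success else np.inf

# MIP: enumerate binary commitments (fine for two units).
mip_u = min(itertools.product([0, 1], repeat=2), key=dispatch_cost)
print("MIP commitment:", mip_u, "cost:", dispatch_cost(mip_u))

# LP relaxation: commitment becomes continuous in [0, 1]; a coarse scan is
# enough to show that cheaper fractional solutions exist.
grid = np.linspace(0, 1, 21)
lp_u = min(itertools.product(grid, repeat=2), key=dispatch_cost)
print("relaxed commitment:", tuple(round(v, 2) for v in lp_u),
      "cost:", dispatch_cost(lp_u))
```

    Here the MIP must start the expensive peaker (the cheap unit's 50 MW minimum exceeds demand), while the relaxation commits the cheap unit at 0.8 and halves the cost, illustrating why relaxed formulations report different costs and starts.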

  4. Investigating assumptions of crown archetypes for modelling LiDAR returns

    NARCIS (Netherlands)

    Calders, K.; Lewis, P.; Disney, M.; Verbesselt, J.; Herold, M.

    2013-01-01

    LiDAR has the potential to derive canopy structural information such as tree height and leaf area index (LAI), via models of the LiDAR signal. Such models often make assumptions regarding crown shape to simplify parameter retrieval and crown archetypes are typically assumed to contain a turbid

  5. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klaauw, B.; Koning, R.H.

    2003-01-01

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  6. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klauw, B.; Koning, R.H.

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  7. Modelling carbonaceous aerosol from residential solid fuel burning with different assumptions for emissions

    Directory of Open Access Journals (Sweden)

    R. Ots

    2018-04-01

    Evidence is accumulating that emissions of primary particulate matter (PM) from residential wood and coal combustion in the UK may be underestimated and/or spatially misclassified. In this study, different assumptions for the spatial distribution and total emission of PM from solid fuel (wood and coal) burning in the UK were tested using an atmospheric chemical transport model. Modelled concentrations of the PM components were compared with measurements from aerosol mass spectrometers at four sites in central and Greater London (ClearfLo campaign, 2012), as well as with measurements from the UK black carbon network. The two main alternative emission scenarios modelled were Base4x and combRedist. For Base4x, officially reported PM2.5 from the residential and other non-industrial combustion source sector were increased by a factor of four. For the combRedist experiment, half of the baseline emissions from this same source were redistributed by residential population density to simulate the effect of allocating some emissions to the smoke control areas (that are assumed in the national inventory to have no emissions from this source). The Base4x scenario yielded better daily and hourly correlations with measurements than the combRedist scenario for year-long comparisons of the solid fuel organic aerosol (SFOA) component at the two London sites. However, the latter scenario better captured mean measured concentrations across all four sites. A third experiment, Redist – all emissions redistributed linearly to population density, is also presented as an indicator of the maximum concentrations an assumption like this could yield. The modelled elemental carbon (EC) concentrations derived from the combRedist experiments also compared well with seasonal average concentrations of black carbon observed across the network of UK sites. Together, the two model scenario simulations of SFOA and EC suggest both that residential solid fuel emissions may be higher than

  8. Modelling carbonaceous aerosol from residential solid fuel burning with different assumptions for emissions

    Science.gov (United States)

    Ots, Riinu; Heal, Mathew R.; Young, Dominique E.; Williams, Leah R.; Allan, James D.; Nemitz, Eiko; Di Marco, Chiara; Detournay, Anais; Xu, Lu; Ng, Nga L.; Coe, Hugh; Herndon, Scott C.; Mackenzie, Ian A.; Green, David C.; Kuenen, Jeroen J. P.; Reis, Stefan; Vieno, Massimo

    2018-04-01

    Evidence is accumulating that emissions of primary particulate matter (PM) from residential wood and coal combustion in the UK may be underestimated and/or spatially misclassified. In this study, different assumptions for the spatial distribution and total emission of PM from solid fuel (wood and coal) burning in the UK were tested using an atmospheric chemical transport model. Modelled concentrations of the PM components were compared with measurements from aerosol mass spectrometers at four sites in central and Greater London (ClearfLo campaign, 2012), as well as with measurements from the UK black carbon network. The two main alternative emission scenarios modelled were Base4x and combRedist. For Base4x, officially reported PM2.5 from the residential and other non-industrial combustion source sector were increased by a factor of four. For the combRedist experiment, half of the baseline emissions from this same source were redistributed by residential population density to simulate the effect of allocating some emissions to the smoke control areas (that are assumed in the national inventory to have no emissions from this source). The Base4x scenario yielded better daily and hourly correlations with measurements than the combRedist scenario for year-long comparisons of the solid fuel organic aerosol (SFOA) component at the two London sites. However, the latter scenario better captured mean measured concentrations across all four sites. A third experiment, Redist - all emissions redistributed linearly to population density, is also presented as an indicator of the maximum concentrations an assumption like this could yield. The modelled elemental carbon (EC) concentrations derived from the combRedist experiments also compared well with seasonal average concentrations of black carbon observed across the network of UK sites. Together, the two model scenario simulations of SFOA and EC suggest both that residential solid fuel emissions may be higher than inventory

  9. Requirements engineering for cross-organizational ERP implementation: Undocumented assumptions and potential mismatches

    NARCIS (Netherlands)

    Daneva, Maia; Wieringa, Roelf J.

    A key issue in Requirements Engineering (RE) for Enterprise Resource Planning (ERP) in a cross-organizational context is how to find a match between the ERP application modules and requirements for business coordination. This paper proposes a conceptual framework for analyzing coordination

  10. A computational model to investigate assumptions in the headturn preference procedure

    NARCIS (Netherlands)

    Bergmann, C.; Bosch, L.F.M. ten; Fikkert, J.P.M.; Boves, L.W.J.

    2013-01-01

    In this paper we use a computational model to investigate four assumptions that are tacitly present in interpreting the results of studies on infants' speech processing abilities using the Headturn Preference Procedure (HPP): (1) behavioral differences originate in different processing; (2)

  11. A method for the analysis of assumptions in model-based environmental assessments

    NARCIS (Netherlands)

    Kloprogge, P.; van der Sluijs, J.P.; Petersen, A.C.

    2011-01-01

    make many assumptions. This inevitably involves – to some degree – subjective judgements by the analysts. Although the potential value-ladenness of model-based assessments has been extensively problematized in literature, this has not so far led to a systematic strategy for analyzing this

  12. Assessing the skill of hydrology models at simulating the water cycle in the HJ Andrews LTER: Assumptions, strengths and weaknesses

    Science.gov (United States)

    Simulated impacts of climate on hydrology can vary greatly as a function of the scale of the input data, model assumptions, and model structure. Four models are commonly used to simulate streamflow...

  13. Camera traps and mark-resight models: The value of ancillary data for evaluating assumptions

    Science.gov (United States)

    Parsons, Arielle W.; Simons, Theodore R.; Pollock, Kenneth H.; Stoskopf, Michael K.; Stocking, Jessica J.; O'Connell, Allan F.

    2015-01-01

    Unbiased estimators of abundance and density are fundamental to the study of animal ecology and critical for making sound management decisions. Capture–recapture models are generally considered the most robust approach for estimating these parameters but rely on a number of assumptions that are often violated but rarely validated. Mark-resight models, a form of capture–recapture, are well suited for use with noninvasive sampling methods and allow for a number of assumptions to be relaxed. We used ancillary data from continuous video and radio telemetry to evaluate the assumptions of mark-resight models for abundance estimation on a barrier island raccoon (Procyon lotor) population using camera traps. Our island study site was geographically closed, allowing us to estimate real survival and in situ recruitment in addition to population size. We found several sources of bias due to heterogeneity of capture probabilities in our study, including camera placement, animal movement, island physiography, and animal behavior. Almost all sources of heterogeneity could be accounted for using the sophisticated mark-resight models developed by McClintock et al. (2009b) and this model generated estimates similar to a spatially explicit mark-resight model previously developed for this population during our study. Spatially explicit capture–recapture models have become an important tool in ecology and confer a number of advantages; however, non-spatial models that account for inherent individual heterogeneity may perform nearly as well, especially where immigration and emigration are limited. Non-spatial models are computationally less demanding, do not make implicit assumptions related to the isotropy of home ranges, and can provide insights with respect to the biological traits of the local population.

  14. Assumptions to the model of managing knowledge workers in modern organizations

    Directory of Open Access Journals (Sweden)

    Igielski Michał

    2017-05-01

    Changes in the twenty-first century arrive faster, appear suddenly, and are not always desirable for the smooth functioning of a company. This is the domain of globalization, in which new events - opportunities or threats - force the company to act continuously. More and more depends on the intangible assets of the undertaking and its strategic potential. Certain types of work require more knowledge, experience and independent thinking than others. Therefore, in this article the author takes up the subject of knowledge workers in contemporary organizations. The aim of the study is to attempt to formulate assumptions for a knowledge management model in these organizations, based on literature analysis and empirical research. In this regard, the author describes the contemporary conditions of employee management and the skills and competences of knowledge workers. In addition, he conducted research (2016) in 100 medium enterprises in the province of Pomerania, using a questionnaire and interviews. Already at the beginning of the analysis of the collected data, it turned out that it is important for all employers to recognize the emergence of a new category of managers who hold knowledge useful for the functioning of the company. Moreover, drawing on experience from a similar research process previously carried out in companies from the Baltic Sea Region, the author was aware of the positive influence of such people on creating new solutions and improving the quality of existing products and services.

  15. Comparison of 2D Finite Element Modeling Assumptions with Results From 3D Analysis for Composite Skin-Stiffener Debonding

    Science.gov (United States)

    Krueger, Ronald; Paris, Isbelle L.; OBrien, T. Kevin; Minguet, Pierre J.

    2004-01-01

    The influence of two-dimensional finite element modeling assumptions on the debonding prediction for skin-stiffener specimens was investigated. Geometrically nonlinear finite element analyses using two-dimensional plane-stress and plane-strain elements as well as three different generalized plane strain type approaches were performed. The computed skin and flange strains, transverse tensile stresses and energy release rates were compared to results obtained from three-dimensional simulations. The study showed that for strains and energy release rate computations the generalized plane strain assumptions yielded results closest to the full three-dimensional analysis. For computed transverse tensile stresses the plane stress assumption gave the best agreement. Based on this study it is recommended that results from plane stress and plane strain models be used as upper and lower bounds. The results from generalized plane strain models fall between the results obtained from plane stress and plane strain models. Two-dimensional models may also be used to qualitatively evaluate the stress distribution in a ply and the variation of energy release rates and mixed mode ratios with delamination length. For more accurate predictions, however, a three-dimensional analysis is required.
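
    The difference between the two-dimensional idealizations shows up directly in the isotropic constitutive matrices relating in-plane stress and strain. The sketch below builds both from standard textbook formulas; the material constants are illustrative, not from the paper.

```python
# Isotropic plane-stress vs plane-strain constitutive matrices (textbook
# formulas); E and nu are illustrative values, not from the paper.
import numpy as np

E, nu = 3.5e9, 0.35   # Young's modulus (Pa) and Poisson ratio, illustrative

# Plane stress: sigma_z = 0 (thin bodies).
D_stress = E / (1 - nu**2) * np.array([[1, nu, 0],
                                       [nu, 1, 0],
                                       [0, 0, (1 - nu) / 2]])

# Plane strain: eps_z = 0 (thick bodies); stiffer in-plane response.
c = E / ((1 + nu) * (1 - 2 * nu))
D_strain = c * np.array([[1 - nu, nu, 0],
                         [nu, 1 - nu, 0],
                         [0, 0, (1 - 2 * nu) / 2]])

np.set_printoptions(precision=3)
print("plane stress D (Pa):\n", D_stress)
print("plane strain D (Pa):\n", D_strain)
```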

  16. Random Regression Models Based On The Skew Elliptically Contoured Distribution Assumptions With Applications To Longitudinal Data *

    Science.gov (United States)

    Zheng, Shimin; Rao, Uma; Bartolucci, Alfred A.; Singh, Karan P.

    2011-01-01

    Bartolucci et al. (2003) extended the distribution assumption from the normal (Lyles et al., 2000) to the elliptically contoured distribution (ECD) for random regression models used in the analysis of longitudinal data accounting for both undetectable values and informative drop-outs. In this paper, the random regression models are constructed on the multivariate skew ECD. A real data set is used to illustrate that the skew ECDs can fit some unimodal continuous data better than the Gaussian distributions or more general continuous symmetric distributions when the symmetric distribution assumption is violated. Also, a simulation study is carried out to illustrate the model fit for a variety of skew ECDs. The software we used is SAS/STAT, V. 9.13. PMID:21637734
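
    A minimal univariate illustration of the underlying point (my own construction; the paper works with multivariate skew elliptically contoured models): when data are skewed, a skew-normal fit beats a normal fit by AIC.

```python
# Univariate sketch: skew-normal vs normal fit on skewed data (scipy).
# The paper works with multivariate skew elliptically contoured models;
# this only illustrates the symmetric-vs-skew comparison.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = stats.skewnorm.rvs(a=5, loc=0, scale=2, size=2000, random_state=rng)

# Fit both models by maximum likelihood and compare AIC = 2k - 2 logL.
a, loc, scale = stats.skewnorm.fit(data)
ll_skew = np.sum(stats.skewnorm.logpdf(data, a, loc, scale))
mu, sigma = stats.norm.fit(data)
ll_norm = np.sum(stats.norm.logpdf(data, mu, sigma))

print(f"AIC skew-normal: {2 * 3 - 2 * ll_skew:.1f}")
print(f"AIC normal:      {2 * 2 - 2 * ll_norm:.1f}")
```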

  17. Benchmarking biological nutrient removal in wastewater treatment plants: influence of mathematical model assumptions.

    Science.gov (United States)

    Flores-Alsina, Xavier; Gernaey, Krist V; Jeppsson, Ulf

    2012-01-01

    This paper examines the effect of different model assumptions when describing biological nutrient removal (BNR) by the activated sludge models (ASM) 1, 2d & 3. The performance of a nitrogen removal (WWTP1) and a combined nitrogen and phosphorus removal (WWTP2) benchmark wastewater treatment plant was compared for a series of model assumptions. Three different model approaches describing BNR are considered. In the reference case, the original model implementations are used to simulate WWTP1 (ASM1 & 3) and WWTP2 (ASM2d). The second set of models includes a reactive settler, which extends the description of the non-reactive TSS sedimentation and transport in the reference case with the full set of ASM processes. Finally, the third set of models is based on including electron acceptor dependency of biomass decay rates for ASM1 (WWTP1) and ASM2d (WWTP2). The results show that incorporation of a reactive settler: (1) increases the hydrolysis of particulates; (2) increases the overall plant's denitrification efficiency by reducing the S(NOx) concentration at the bottom of the clarifier; (3) increases the oxidation of COD compounds; (4) increases X(OHO) and X(ANO) decay; and, finally, (5) increases the growth of X(PAO) and formation of X(PHA,Stor) for ASM2d, which has a major impact on the whole P removal system. Introduction of electron acceptor dependent decay leads to a substantial increase of the concentration of X(ANO), X(OHO) and X(PAO) in the bottom of the clarifier. The paper ends with a critical discussion of the influence of the different model assumptions, and emphasizes the need for a model user to understand the significant differences in simulation results that are obtained when applying different combinations of 'standard' models.

  18. Single family heating and cooling requirements: Assumptions, methods, and summary results

    Energy Technology Data Exchange (ETDEWEB)

    Ritschard, R.L.; Hanford, J.W.; Sezgen, A.O. (Lawrence Berkeley Lab., CA (United States))

    1992-03-01

    The research has created a data base of hourly building loads using a state-of-the-art building simulation code (DOE-2.1D) for 8 prototypes, representing pre-1940s to 1990s building practices, in 16 US climates. The report describes the assumed modeling inputs and building operations, defines the building prototypes and selection of base cities, compares the simulation results to both surveyed and measured data sources, and discusses the results. The full data base with hourly space conditioning, water heating, and non-HVAC electricity consumption is available from GRI. In addition, the estimated loads on a per square foot basis are included, as well as the peak heating and cooling loads.

  19. Using Covert Response Activation to Test Latent Assumptions of Formal Decision-Making Models in Humans.

    Science.gov (United States)

    Servant, Mathieu; White, Corey; Montagnini, Anna; Burle, Borís

    2015-07-15

    Most decisions that we make build upon multiple streams of sensory evidence and control mechanisms are needed to filter out irrelevant information. Sequential sampling models of perceptual decision making have recently been enriched by attentional mechanisms that weight sensory evidence in a dynamic and goal-directed way. However, the framework retains the longstanding hypothesis that motor activity is engaged only once a decision threshold is reached. To probe latent assumptions of these models, neurophysiological indices are needed. Therefore, we collected behavioral and EMG data in the flanker task, a standard paradigm to investigate decisions about relevance. Although the models captured response time distributions and accuracy data, EMG analyses of response agonist muscles challenged the assumption of independence between decision and motor processes. Those analyses revealed covert incorrect EMG activity ("partial error") in a fraction of trials in which the correct response was finally given, providing intermediate states of evidence accumulation and response activation at the single-trial level. We extended the models by allowing motor activity to occur before a commitment to a choice and demonstrated that the proposed framework captured the rate, latency, and EMG surface of partial errors, along with the speed of the correction process. In return, EMG data provided strong constraints to discriminate between competing models that made similar behavioral predictions. Our study opens new theoretical and methodological avenues for understanding the links among decision making, cognitive control, and motor execution in humans. Sequential sampling models of perceptual decision making assume that sensory information is accumulated until a criterion quantity of evidence is obtained, from where the decision terminates in a choice and motor activity is engaged. The very existence of covert incorrect EMG activity ("partial error") during the evidence accumulation
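
    The modeling assumption being probed can be simulated in a few lines. The sketch below is an illustrative drift-diffusion simulation (my construction, not the authors' fitted model) in which evidence accumulates noisily to a threshold; trials where the process dips toward the wrong response before a correct decision mimic covert partial errors.

```python
# Illustrative drift-diffusion sketch (not the authors' fitted model): noisy
# evidence accumulates to +a (correct) or -a (error). A sub-threshold "motor
# leak" level marks trials where the incorrect response muscle would show
# covert activation (a partial error) before the correct response wins.
import numpy as np

rng = np.random.default_rng(11)
a, v, dt, sigma = 1.0, 0.8, 0.001, 1.0   # threshold, drift, time step, noise
leak_level = -0.5 * a                     # sub-threshold motor activation level

n_trials, partial_errors, correct = 2000, 0, 0
for _ in range(n_trials):
    x, dipped_wrong = 0.0, False
    while abs(x) < a:
        x += v * dt + sigma * np.sqrt(dt) * rng.normal()
        if x < leak_level:                # excursion toward the wrong response
            dipped_wrong = True
    if x >= a:
        correct += 1
        partial_errors += dipped_wrong    # correct trial with a covert error

print(f"accuracy: {correct / n_trials:.2f}, "
      f"partial-error rate among correct trials: {partial_errors / correct:.2f}")
```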

  20. Uncovering Implicit Assumptions: a Large-Scale Study on Students' Mental Models of Diffusion

    Science.gov (United States)

    Stains, Marilyne; Sevian, Hannah

    2015-12-01

    Students' mental models of diffusion in a gas phase solution were studied through the use of the Structure and Motion of Matter (SAMM) survey. This survey permits identification of categories of ways students think about the structure of the gaseous solute and solvent, the origin of motion of gas particles, and trajectories of solute particles in the gaseous medium. A large sample of data (N = 423) from students across grade 8 (age 13) through upper-level undergraduate was subjected to a cluster analysis to determine the main mental models present. The cluster analysis resulted in a reduced data set (N = 308), and then, mental models were ascertained from robust clusters. The mental models that emerged from analysis were triangulated through interview data and characterised according to underlying implicit assumptions that guide and constrain thinking about diffusion of a solute in a gaseous medium. Impacts of students' level of preparation in science and relationships of mental models to science disciplines studied by students were examined. Implications are discussed for the value of this approach to identify typical mental models and the sets of implicit assumptions that constrain them.

  1. Sensitivity of Population Size Estimation for Violating Parametric Assumptions in Log-linear Models

    Directory of Open Access Journals (Sweden)

    Gerritse Susanna C.

    2015-09-01

    An important quality aspect of censuses is the degree of coverage of the population. When administrative registers are available, undercoverage can be estimated via capture-recapture methodology. The standard approach uses the log-linear model that relies on the assumption that being in the first register is independent of being in the second register. In models using covariates, this assumption of independence is relaxed into independence conditional on covariates. In this article we describe, in a general setting, how sensitivity analyses can be carried out to assess the robustness of the population size estimate. We make use of log-linear Poisson regression using an offset, to simulate departure from the model. This approach can be extended to the case where we have covariates observed in both registers, and to a model with covariates observed in only one register. The robustness of the population size estimate is a function of implied coverage: as implied coverage is low the robustness is low. We conclude that it is important for researchers to investigate and report the estimated robustness of their population size estimate for quality reasons. Extensions are made to log-linear modeling in the case of more than two registers and the multiplier method.
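
    The core of such a sensitivity analysis fits in a few lines (schematic, with invented register counts): under independence the cell missed by both registers is estimated as m00 = n10*n01/n11, and departure from independence can be simulated by scaling this cell with an implied odds ratio, mirroring the offset term in the log-linear Poisson model.

```python
# Two-register capture-recapture sketch with a sensitivity parameter
# (invented counts). theta = 1 is the usual independence assumption; the
# paper implements the same idea as an offset in log-linear Poisson models.
n11, n10, n01 = 7_000, 2_000, 3_000  # in both, register 1 only, register 2 only

def population_estimate(theta):
    m00 = theta * n10 * n01 / n11    # implied count missed by both registers
    return n11 + n10 + n01 + m00

for theta in (0.8, 1.0, 1.25):
    print(f"implied odds ratio {theta:>4}: N_hat = {population_estimate(theta):,.0f}")
```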

  2. Modeling hepatitis C virus transmission among people who inject drugs: Assumptions, limitations and future challenges.

    Science.gov (United States)

    Scott, Nick; Hellard, Margaret; McBryde, Emma Sue

    2016-01-01

    The discovery of highly effective hepatitis C virus (HCV) treatments has led to discussion of elimination and intensified interest in models of HCV transmission. In developed settings, HCV disproportionally affects people who inject drugs (PWID), and models are typically used to provide an evidence base for the effectiveness of interventions such as needle and syringe programs, opioid substitution therapy and more recently treating PWID with new generation therapies to achieve specified reductions in prevalence and / or incidence. This manuscript reviews deterministic compartmental S-I, deterministic compartmental S-I-S and network-based transmission models of HCV among PWID. We detail typical assumptions made when modeling injecting risk behavior, virus transmission, treatment and re-infection and how they correspond with available evidence and empirical data.
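
    The S-I-S structure reviewed here reduces to a pair of differential equations; below is a minimal version in which treatment returns infected PWID to the susceptible (re-infectable) class. Parameter values are illustrative, not calibrated to any setting.

```python
# Minimal S-I-S model of HCV among PWID: infection at rate beta*S*I/N,
# treatment (with possible re-infection) returns people to S at rate tau.
# Parameter values are illustrative, not calibrated to any setting.
import numpy as np
from scipy.integrate import solve_ivp

beta, tau, N = 0.3, 0.1, 10_000        # per-year rates; population size

def sis(t, y):
    S, I = y
    new_inf = beta * S * I / N
    return [-new_inf + tau * I, new_inf - tau * I]

sol = solve_ivp(sis, (0, 30), [N - 2_500, 2_500], t_eval=np.linspace(0, 30, 7))
for t, I in zip(sol.t, sol.y[1]):
    print(f"year {t:4.1f}: prevalence {I / N:5.1%}")
```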

  3. Pore Formation During Solidification of Aluminum: Reconciliation of Experimental Observations, Modeling Assumptions, and Classical Nucleation Theory

    Science.gov (United States)

    Yousefian, Pedram; Tiryakioğlu, Murat

    2018-02-01

    An in-depth discussion of pore formation is presented in this paper by first reinterpreting in situ observations reported in the literature as well as assumptions commonly made to model pore formation in aluminum castings. The physics of pore formation is reviewed through theoretical fracture pressure calculations based on classical nucleation theory for homogeneous and heterogeneous nucleation, with and without dissolved gas, i.e., hydrogen. Based on the fracture pressure for aluminum, critical pore size and the corresponding probability of vacancies clustering to form that size have been calculated using thermodynamic data reported in the literature. Calculations show that it is impossible for a pore to nucleate either homogeneously or heterogeneously in aluminum, even with dissolved hydrogen. The formation of pores in aluminum castings can only be explained by inflation of entrained surface oxide films (bifilms) under reduced pressure and/or with dissolved gas, which involves only growth, avoiding any nucleation problem. This mechanism is consistent with the reinterpretations of in situ observations as well as the assumptions made in the literature to model pore formation.
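
    The scale of the problem can be conveyed with the basic Young-Laplace/classical-nucleation estimate: sustaining a pore of radius r against surface tension requires a pressure difference of 2*sigma/r. The surface tension value below is an approximate literature figure, not taken from the paper.

```python
# Order-of-magnitude sketch: pressure needed to sustain a pore of radius r
# in liquid aluminum, Delta_P = 2*sigma/r (Young-Laplace / classical
# nucleation scale). sigma is an approximate literature value.
sigma = 0.9        # surface tension of liquid Al, N/m (approximate)
atm = 101_325.0    # Pa per atmosphere

for r_nm in (1, 10, 100):
    dp = 2 * sigma / (r_nm * 1e-9)    # Pa
    print(f"r = {r_nm:>3} nm: Delta_P ~ {dp / atm:,.0f} atm")
```

    At nanometer radii the required pressure is on the order of tens of thousands of atmospheres, which is why the paper argues nucleation is impossible and growth of entrained bifilms is the only viable mechanism.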

  4. Recursive Subspace Identification of AUV Dynamic Model under General Noise Assumption

    Directory of Open Access Journals (Sweden)

    Zheping Yan

    2014-01-01

    Full Text Available A recursive subspace identification algorithm for autonomous underwater vehicles (AUVs) is proposed in this paper. Owing to its advantages in handling nonlinearities and couplings, the AUV model investigated here is, for the first time, constructed as a Hammerstein model with nonlinear feedback in the linear part. To better take environment and sensor noises into consideration, the identification problem is treated as an errors-in-variables (EIV) one, which means that the identification procedure operates under a general noise assumption. To make the algorithm recursive, a propagator-method (PM) based subspace approach is extended into the EIV framework to form the recursive identification method, called the PM-EIV algorithm. Several identification experiments carried out on an AUV simulation platform demonstrate the effectiveness and feasibility of the proposed algorithm.
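
    The PM-EIV algorithm itself is too involved for a short sketch, but the recursive idea it builds on can be illustrated with a plain recursive least-squares update on invented data (note that plain RLS, unlike the EIV treatment in the paper, ignores noise on the regressors):

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One recursive least-squares step with forgetting factor lam."""
    k = P @ phi / (lam + phi @ P @ phi)        # gain vector
    theta = theta + k * (y - phi @ theta)      # correct by prediction error
    P = (P - np.outer(k, phi @ P)) / lam       # update inverse covariance
    return theta, P

rng = np.random.default_rng(0)
true_theta = np.array([0.8, -0.3])             # parameters to be recovered
theta, P = np.zeros(2), np.eye(2) * 100.0
for _ in range(500):                           # samples arrive one at a time
    phi = rng.normal(size=2)
    y = phi @ true_theta + 0.1 * rng.normal()
    theta, P = rls_update(theta, P, phi, y)
print(theta)                                   # approaches [0.8, -0.3]
```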

  5. NONLINEAR MODELS FOR DESCRIPTION OF CACAO FRUIT GROWTH WITH ASSUMPTION VIOLATIONS

    Directory of Open Access Journals (Sweden)

    JOEL AUGUSTO MUNIZ

    2017-01-01

    Full Text Available Cacao (Theobroma cacao L.) is an important fruit crop in the Brazilian economy, mainly cultivated in the south of the State of Bahia. The optimal stage for harvesting is a major factor in fruit quality, and knowledge of its growth curves can help, especially in identifying the ideal maturation stage for harvesting. Nonlinear regression models have been widely used for the description of growth curves. However, several studies on this subject do not consider residual analysis, possible dependence between longitudinal observations, or heterogeneity of sample variances, compromising the quality of the modeling. The objective of this work was to compare the fit of nonlinear regression models, considering residual analysis and assumption violations, in the description of cacao (clone Sial-105) fruit growth. The data evaluated were extracted from Brito and Silva (1983), who conducted the experiment at the Cacao Research Center, Ilheus, State of Bahia. The variables fruit length, diameter and volume as a function of fruit age were studied. The use of weighting and the incorporation of residual dependence were effective, making the modeling more consistent and improving the model fit. Considering a first-order autoregressive structure, when needed, leads to a significant reduction in the residual standard deviation, making the estimates more reliable. The Logistic model was the most efficient for the description of cacao fruit growth.
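
    As a minimal sketch of the core fitting step (with invented age and length values), a logistic curve can be fitted by nonlinear least squares; the paper goes further by weighting observations and modelling first-order autoregressive residuals, which plain `curve_fit` does not do:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, A, k, t0):
    """Logistic growth: asymptote A, rate k, inflection point t0."""
    return A / (1.0 + np.exp(-k * (t - t0)))

age    = np.array([15, 30, 45, 60, 75, 90, 105, 120, 135, 150])   # days
length = np.array([2.1, 4.0, 7.5, 12.0, 16.5, 19.0, 20.5, 21.2, 21.5, 21.6])

(A, k, t0), _ = curve_fit(logistic, age, length, p0=[22.0, 0.05, 60.0])
print(f"asymptote={A:.1f} cm, rate={k:.3f}/day, inflection at {t0:.0f} days")
```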

  6. Does the Assumption on Innovation Process Play an Important Role for Filtered Historical Simulation Model?

    Directory of Open Access Journals (Sweden)

    Emrah Altun

    2018-01-01

    Full Text Available Most financial institutions compute the Value-at-Risk (VaR) of their trading portfolios using historical simulation-based methods. In this paper, we examine the Filtered Historical Simulation (FHS) model introduced by Barone-Adesi et al. (1999) theoretically and empirically. The main goal of this study is to answer the following question: “Does the assumption on the innovation process play an important role for the Filtered Historical Simulation model?”. To this end, we investigate the performance of the FHS model with skewed and fat-tailed innovation distributions such as the normal, skew-normal, Student's t, skew-t, generalized error, and skewed generalized error distributions. The performances of the FHS models are evaluated by means of unconditional and conditional likelihood ratio tests and loss functions. Based on the empirical results, we conclude that the FHS models with generalized error and skew-t distributions produce more accurate VaR forecasts.
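
    A sketch of one-day-ahead FHS VaR, assuming the Python `arch` package and a percent-return series `returns`; the `dist` argument is where the innovation assumption examined in the paper enters:

```python
import numpy as np
from arch import arch_model

def fhs_var(returns, alpha=0.01):
    """Filtered Historical Simulation VaR, one day ahead (sketch)."""
    am  = arch_model(returns, vol="Garch", p=1, q=1, dist="t")
    res = am.fit(disp="off")
    std_resid = res.resid / res.conditional_volatility   # filtered shocks
    sigma = np.sqrt(res.forecast(horizon=1).variance.values[-1, 0])
    # Historical quantile of the filtered shocks, rescaled by the volatility
    # forecast; the innovation distribution only affects the filtering step.
    return -np.quantile(std_resid, alpha) * sigma

# e.g. var99 = fhs_var(100 * prices.pct_change().dropna())
```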

  7. Spatial modelling of assumptions of tourism development using geographic IT

    Directory of Open Access Journals (Sweden)

    Jitka Machalová

    2010-01-01

    Full Text Available The aim of this article is to show the possibilities of spatial modelling and analysis of the assumptions of tourism development in the Czech Republic, with the objective of making decision-making processes in tourism easier and more efficient (for companies and clients as well as destination managements). The development and placement of tourism depend on factors (conditions) that influence its application in specific areas. These factors are usually divided into three groups: selective, localization and realization. Tourism is inseparably connected with space – the countryside. The countryside can be modelled and subsequently analysed by means of geographical information technologies. With the help of spatial modelling and subsequent analyses, the localization and realization conditions in the regions of the Czech Republic have been evaluated. The best localization conditions were found in the Liberecký region. The capital city of Prague has negligible natural conditions; its social conditions, however, are at a high level. The spatial analyses further showed that the best realization conditions are provided by the capital city of Prague, followed by the Central-Bohemian, South-Moravian, Moravian-Silesian and Karlovarský regions. The development of a tourism destination depends not only on localization and realization factors but is also fundamentally affected by the quality of local destination management. Spatial modelling can help destination managers in decision-making processes, enabling optimal use of a destination's potential and efficient targeting of marketing activities.

  8. Cloud-turbulence interactions: Sensitivity of a general circulation model to closure assumptions

    International Nuclear Information System (INIS)

    Brinkop, S.; Roeckner, E.

    1993-01-01

    Several approaches to parameterize the turbulent transport of momentum, heat, water vapour and cloud water for use in a general circulation model (GCM) have been tested in one-dimensional and three-dimensional model simulations. The schemes differ with respect to their closure assumptions (conventional eddy diffusivity model versus turbulent kinetic energy closure) and also regarding their treatment of cloud-turbulence interactions. The basic properties of these parameterizations are discussed first in column simulations of a stratocumulus-topped atmospheric boundary layer (ABL) under a strong subsidence inversion during the KONTROL experiment in the North Sea. It is found that the K-models tend to decouple the cloud layer from the adjacent layers because the turbulent activity is calculated from local variables. The higher-order scheme performs better in this respect because internally generated turbulence can be transported up and down through the action of turbulent diffusion. Thus, the TKE-scheme provides not only a better link between the cloud and the sub-cloud layer but also between the cloud and the inversion as a result of cloud-top entrainment. In the stratocumulus case study, where the cloud is confined by a pronounced subsidence inversion, increased entrainment favours cloud dilution through enhanced evaporation of cloud droplets. In the GCM study, however, additional cloud-top entrainment supports cloud formation because indirect cloud generating processes are promoted through efficient ventilation of the ABL, such as the enhanced moisture supply by surface evaporation and the increased depth of the ABL. As a result, tropical convection is more vigorous, the hydrological cycle is intensified, the whole troposphere becomes warmer and moister in general and the cloudiness in the upper part of the ABL is increased. (orig.)
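
    The decoupling argument can be made concrete with a toy one-dimensional column: a prognostic TKE equation contains a transport term that spreads internally generated turbulence into adjacent layers, which a purely local K-model lacks (all values below are illustrative assumptions):

```python
import numpy as np

# Toy 1D column: TKE generated in a cloud layer leaks into adjacent layers
# via the diffusion term of a prognostic TKE equation.
nz, dz, dt = 50, 20.0, 1.0                    # layers, layer depth (m), step (s)
e = np.zeros(nz)
e[20:25] = 0.5                                # TKE source in the cloud layer, m2/s2
c_l, diss = 0.1 * dz, 2e-4                    # mixing-length constant, crude sink

for _ in range(3600):                         # integrate for one hour
    K_e = c_l * np.sqrt(np.maximum(e, 0.0))   # diffusivity from TKE itself
    K_half = 0.5 * (K_e[1:] + K_e[:-1])       # diffusivity at layer interfaces
    flux = K_half * np.diff(e) / dz           # turbulent transport of TKE
    div = np.zeros(nz)
    div[1:-1] = np.diff(flux) / dz            # flux divergence in the interior
    e += dt * (div - diss * e)                # transport minus dissipation
print(e[15:30].round(3))                      # TKE has spread above and below
```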

  9. The biosphere at Laxemar. Data, assumptions and models used in the SR-Can assessment

    International Nuclear Information System (INIS)

    Karlsson, Sara; Kautsky, Ulrik; Loefgren, Anders; Soederbaeck, Bjoern

    2006-10-01

    This is essentially a compilation of a variety of reports concerning the site investigations, the research activities and information derived from other sources important for the safety assessment. The main objective is to present the prerequisites, methods and data used in the biosphere modelling for the safety assessment SR-Can at the Laxemar site. A major part of the report focuses on how site-specific data are used, recalculated or modified in order to be applicable in the safety assessment context, and on the methods and sub-models that are the basis for the biosphere modelling. Furthermore, the assumptions made as to the future states of surface ecosystems are mainly presented in this report. A similar report is provided for the Forsmark area. This report summarises the method adopted for safety assessment following a radionuclide release into the biosphere. The approach utilises the information about the site as far as possible and presents a way of calculating risk to humans. A central tool in the work is the description of the topography, for which there is a good understanding of the present conditions and whose development over time is fairly predictable. The topography affects surface hydrology, sedimentation, the size of drainage areas and the characteristics of ecosystems. Other parameters are human nutritional intake, which is assumed to be constant over time, and primary production (photosynthesis), which is also fairly constant over time. The Landscape Dose Factor approach (LDF) gives an integrated measure for the site and also resolves the issues relating to the size of the group with the highest exposure. If this approach is to be widely accepted as a method, some improvements and refinements are still necessary, e.g. collecting missing site data, reanalysing site data, reviewing radionuclide-specific data, reformulating ecosystem models and evaluating the results with further sensitivity analyses.
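
    A back-of-envelope sketch of the LDF bookkeeping (all numbers and object names are invented; the real LDFs come from the landscape models described in the report):

```python
# Release rate into each landscape object times its Landscape Dose Factor
# gives a dose rate, converted to risk with an ICRP-style coefficient.
releases = {"lake": 1.2e6, "mire": 4.0e5}      # Bq/year reaching each object
ldf = {"lake": 3.0e-13, "mire": 9.0e-13}       # Sv/year per (Bq/year)
risk_per_sievert = 0.073                       # ICRP-style risk coefficient

dose = sum(releases[obj] * ldf[obj] for obj in releases)   # Sv/year
print(f"dose {dose:.2e} Sv/yr -> annual risk {dose * risk_per_sievert:.2e}")
```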

  10. The biosphere at Forsmark. Data, assumptions and models used in the SR-Can assessment

    International Nuclear Information System (INIS)

    Karlsson, Sara; Kautsky, Ulrik; Loefgren, Anders; Soederbaeck, Bjoern

    2006-10-01

    This report summarises the method adopted for safety assessment following a radionuclide release into the biosphere. The approach utilises the information about the site as far as possible and presents a way of calculating risk to humans. A central parameter is the topography, for which there is a good understanding of the present conditions and whose development over time is fairly predictable. The topography affects surface hydrology, sedimentation, the size of drainage areas and the characteristics of ecosystems. Other parameters are human nutritional intake, which is assumed to be constant over time, and primary production (photosynthesis), which is also fairly constant over time. The Landscape Dose Factor approach (LDF) gives an integrated measure for the site and also resolves the issues relating to the size of the group with the highest exposure. If this approach is to be widely accepted as a method, some improvements and refinements are still necessary, e.g. collecting missing site data, reanalysing site data, reviewing radionuclide-specific data, reformulating ecosystem models and evaluating the results with further sensitivity analyses. The report presents descriptions and estimates not presented elsewhere, as well as summaries of important steps in the biosphere modelling that are presented in more detail in separate reports. The intention is to give the reader a coherent description of the steps taken to calculate doses to biota and humans, including a description of the data used, the rationale for a number of assumptions made during parameterisation, and of how the landscape context is applied in the modelling, and also to present the models used and the results obtained.

  11. The biosphere at Laxemar. Data, assumptions and models used in the SR-Can assessment

    Energy Technology Data Exchange (ETDEWEB)

    Karlsson, Sara; Kautsky, Ulrik; Loefgren, Anders; Soederbaeck, Bjoern (eds.)

    2006-10-15

    This is essentially a compilation of a variety of reports concerning the site investigations, the research activities and information derived from other sources important for the safety assessment. The main objective is to present the prerequisites, methods and data used in the biosphere modelling for the safety assessment SR-Can at the Laxemar site. A major part of the report focuses on how site-specific data are used, recalculated or modified in order to be applicable in the safety assessment context, and on the methods and sub-models that are the basis for the biosphere modelling. Furthermore, the assumptions made as to the future states of surface ecosystems are mainly presented in this report. A similar report is provided for the Forsmark area. This report summarises the method adopted for safety assessment following a radionuclide release into the biosphere. The approach utilises the information about the site as far as possible and presents a way of calculating risk to humans. A central tool in the work is the description of the topography, for which there is a good understanding of the present conditions and whose development over time is fairly predictable. The topography affects surface hydrology, sedimentation, the size of drainage areas and the characteristics of ecosystems. Other parameters are human nutritional intake, which is assumed to be constant over time, and primary production (photosynthesis), which is also fairly constant over time. The Landscape Dose Factor approach (LDF) gives an integrated measure for the site and also resolves the issues relating to the size of the group with the highest exposure. If this approach is to be widely accepted as a method, some improvements and refinements are still necessary, e.g. collecting missing site data, reanalysing site data, reviewing radionuclide-specific data, reformulating ecosystem models and evaluating the results with further sensitivity analyses.

  12. The biosphere at Forsmark. Data, assumptions and models used in the SR-Can assessment

    Energy Technology Data Exchange (ETDEWEB)

    Karlsson, Sara; Kautsky, Ulrik; Loefgren, Anders; Soederbaeck, Bjoern (eds.)

    2006-10-15

    This report summarises the method adopted for safety assessment following a radionuclide release into the biosphere. The approach utilises the information about the site as far as possible and presents a way of calculating risk to humans. A central parameter is the topography, for which there is a good understanding of the present conditions and whose development over time is fairly predictable. The topography affects surface hydrology, sedimentation, the size of drainage areas and the characteristics of ecosystems. Other parameters are human nutritional intake, which is assumed to be constant over time, and primary production (photosynthesis), which is also fairly constant over time. The Landscape Dose Factor approach (LDF) gives an integrated measure for the site and also resolves the issues relating to the size of the group with the highest exposure. If this approach is to be widely accepted as a method, some improvements and refinements are still necessary, e.g. collecting missing site data, reanalysing site data, reviewing radionuclide-specific data, reformulating ecosystem models and evaluating the results with further sensitivity analyses. The report presents descriptions and estimates not presented elsewhere, as well as summaries of important steps in the biosphere modelling that are presented in more detail in separate reports. The intention is to give the reader a coherent description of the steps taken to calculate doses to biota and humans, including a description of the data used, the rationale for a number of assumptions made during parameterisation, and of how the landscape context is applied in the modelling, and also to present the models used and the results obtained.

  13. TESTING THE ASSUMPTIONS AND INTERPRETING THE RESULTS OF THE RASCH MODEL USING LOG-LINEAR PROCEDURES IN SPSS

    NARCIS (Netherlands)

    TENVERGERT, E; GILLESPIE, M; KINGMA, J

    This paper shows how to use the log-linear subroutine of SPSS to fit the Rasch model. It also shows how to fit less restrictive models obtained by relaxing specific assumptions of the Rasch model. Conditional maximum likelihood estimation was achieved by including dummy variables for the total score.
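
    The same log-linear trick can be sketched outside SPSS: a Poisson regression with item main effects plus dummy variables for each total score gives conditional maximum likelihood Rasch estimates, and adding an item interaction relaxes the model for testing (made-up counts for three items; `statsmodels` assumed):

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from itertools import product

# Made-up counts of the 2^3 response patterns for a 3-item test.
df = pd.DataFrame(list(product([0, 1], repeat=3)), columns=["i1", "i2", "i3"])
df["count"] = [120, 55, 48, 40, 42, 35, 30, 60]
df["total"] = df[["i1", "i2", "i3"]].sum(axis=1)

# Rasch as a log-linear Poisson model: item main effects (item 3 as the
# reference) plus a dummy for each total score (conditional ML).
rasch = smf.glm("count ~ i1 + i2 + C(total)", data=df,
                family=sm.families.Poisson()).fit()
print(rasch.params[["i1", "i2"]])      # item parameters relative to item 3

# Relaxing an assumption, e.g. allowing an item interaction, gives a less
# restrictive model; the deviance difference tests the Rasch model.
relaxed = smf.glm("count ~ i1 + i2 + C(total) + i1:i2", data=df,
                  family=sm.families.Poisson()).fit()
print(rasch.deviance - relaxed.deviance)
```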

  14. Sensitivity of wetland methane emissions to model assumptions: application and model testing against site observations

    Directory of Open Access Journals (Sweden)

    L. Meng

    2012-07-01

    Full Text Available Methane emissions from natural wetlands and rice paddies constitute a large proportion of atmospheric methane, but the magnitude and year-to-year variation of these methane sources are still unpredictable. Here we describe and evaluate the integration of a methane biogeochemical model (CLM4Me; Riley et al., 2011) into the Community Land Model 4.0 (CLM4CN) in order to better explain spatial and temporal variations in methane emissions. We test new functions for soil pH and redox potential that impact microbial methane production in soils. We also constrain aerenchyma in plants in always-inundated areas in order to better represent wetland vegetation. The satellite inundated fraction is explicitly prescribed in the model because there are large differences between simulated fractional inundation and satellite observations; thus we do not use CLM4-simulated hydrology to predict inundated areas. A rice paddy module is also incorporated into the model, where the fraction of land used for rice production is explicitly prescribed. The model is evaluated at the site level with vegetation cover and water table prescribed from measurements. Explicit site-level evaluations of simulated methane emissions are quite different from evaluating the grid-cell-averaged emissions against available measurements. Using a baseline set of parameter values, our model-estimated average global wetland emissions for the period 1993–2004 were 256 Tg CH4 yr−1 (including the soil sink), and rice paddy emissions in the year 2000 were 42 Tg CH4 yr−1. Tropical wetlands contributed 201 Tg CH4 yr−1, or 78% of the global wetland flux. Northern-latitude (>50° N) systems contributed 12 Tg CH4 yr−1. However, sensitivity studies show a large range (150–346 Tg CH4 yr−1) in predicted global methane emissions (excluding emissions from rice paddies).
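
    A sketch of the kind of multiplicative rate modifiers described above, with a baseline CH4 production scaled by temperature (Q10), pH and redox factors; the functional forms and constants are illustrative assumptions, not the CLM4Me formulations:

```python
import numpy as np

def ch4_production(substrate_c, T, pH, Eh):
    """Baseline CH4 production scaled by environmental modifiers (sketch)."""
    f_T = 2.0 ** ((T - 25.0) / 10.0)                   # Q10 temperature response
    f_pH = np.exp(-((pH - 6.5) / 1.5) ** 2)            # optimum near pH 6.5
    f_Eh = 1.0 / (1.0 + np.exp((Eh + 100.0) / 20.0))   # needs reducing conditions
    return 0.02 * substrate_c * f_T * f_pH * f_Eh      # g C-CH4 m-2 day-1

print(ch4_production(substrate_c=100.0, T=20.0, pH=6.0, Eh=-200.0))
```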

  15. Matrix Diffusion for Performance Assessment - Experimental Evidence, Modelling Assumptions and Open Issues

    International Nuclear Information System (INIS)

    Jakob, A.

    2004-07-01

    In this report a comprehensive overview of the matrix diffusion of solutes in fractured crystalline rocks is presented. Some examples from observations in crystalline bedrock are used to illustrate that matrix diffusion indeed acts on various length scales. Fickian diffusion is discussed in detail, followed by some considerations on rock porosity. Because the dual-porosity medium model is a very common and versatile method for describing solute transport in fractured porous media, the transport equations and the fundamental assumptions, approximations and simplifications are discussed in detail. There is a variety of geometrical aspects, processes and events which could influence matrix diffusion. The most important of these (e.g. the effect of the flow-wetted fracture surface, channelling, and the limited extent of the porous rock available for matrix diffusion) are addressed. In a further section, open issues and unresolved problems related to matrix diffusion are mentioned. Since matrix diffusion is one of the key retarding processes in the geosphere transport of dissolved radionuclide species, it was consequently taken into account in past performance assessments of radioactive waste repositories in crystalline host rocks. Some issues regarding matrix diffusion are site-specific, while others are independent of the specific situation of a planned repository for radioactive wastes. Eight different performance assessments from Finland, Sweden and Switzerland were considered with the aim of finding out how matrix diffusion was addressed, and whether a consistent picture emerges regarding the varying methodology of the different radioactive waste organisations. In the final section of the report some conclusions are drawn and an outlook is given. An extensive bibliography provides the reader with the key papers and reports related to matrix diffusion. (author)
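
    For orientation, one common textbook form of the dual-porosity transport equations the report examines (notation and sign conventions assumed, not taken from the report: C_f and C_p are the fracture and matrix pore-water concentrations, z the depth into the matrix, b the fracture half-aperture):

```latex
R_f \frac{\partial C_f}{\partial t}
  = -v \frac{\partial C_f}{\partial x}
    + D_L \frac{\partial^2 C_f}{\partial x^2}
    + \frac{\varepsilon_p D_p}{b}
      \left.\frac{\partial C_p}{\partial z}\right|_{z=0},
\qquad
R_p \frac{\partial C_p}{\partial t} = D_p \frac{\partial^2 C_p}{\partial z^2}.
```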

  16. Influence of road network and population demand assumptions in evacuation modeling for distant tsunamis

    Science.gov (United States)

    Henry, Kevin; Wood, Nathan J.; Frazier, Tim G.

    2017-01-01

    Tsunami evacuation planning in coastal communities is typically focused on local events where at-risk individuals must move on foot in a matter of minutes to safety. Less attention has been placed on distant tsunamis, where evacuations unfold over several hours, are often dominated by vehicle use and are managed by public safety officials. Traditional traffic simulation models focus on estimating clearance times but often overlook the influence of varying population demand, alternative modes, background traffic, shadow evacuation, and traffic management alternatives. These factors are especially important for island communities with limited egress options to safety. We use the coastal community of Balboa Island, California (USA), as a case study to explore the range of potential clearance times prior to wave arrival for a distant tsunami scenario. We use a first-in–first-out queuing simulation environment to estimate variations in clearance times, given varying assumptions of the evacuating population (demand) and the road network over which they evacuate (supply). Results suggest clearance times are less than wave arrival times for a distant tsunami, except when we assume maximum vehicle usage for residents, employees, and tourists for a weekend scenario. A two-lane bridge to the mainland was the primary traffic bottleneck, thereby minimizing the effect of departure times, shadow evacuations, background traffic, boat-based evacuations, and traffic light timing on overall community clearance time. Reducing vehicular demand generally reduced clearance time, whereas improvements to road capacity had mixed results. Finally, failure to recognize non-residential employee and tourist populations in the vehicle demand substantially underestimated clearance time.
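
    The bridge bottleneck lends itself to a back-of-envelope check: clearance time is roughly total vehicle demand divided by bridge discharge capacity (the saturation flow and demand figures below are invented, not the study's):

```python
# Single two-lane bridge as the only egress: clearance time is approximately
# demand divided by discharge capacity, regardless of departure-time details.
vehicles_per_lane_hour = 1500          # assumed saturation flow per lane
lanes = 2
for demand in (3000, 6000, 9000):      # evacuating vehicles (scenario range)
    hours = demand / (lanes * vehicles_per_lane_hour)
    print(f"{demand} vehicles -> ~{hours:.1f} h to clear the bridge")
```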

  17. Sensitivity of tsunami evacuation modeling to direction and land cover assumptions

    Science.gov (United States)

    Schmidtlein, Mathew C.; Wood, Nathan J.

    2015-01-01

    Although anisotropic least-cost-distance (LCD) modeling is becoming a common tool for estimating pedestrian-evacuation travel times out of tsunami hazard zones, there has been insufficient attention paid to understanding model sensitivity behind the estimates. To support tsunami risk-reduction planning, we explore two aspects of LCD modeling as it applies to pedestrian evacuations and use the coastal community of Seward, Alaska, as our case study. First, we explore the sensitivity of modeling to the direction of movement by comparing standard safety-to-hazard evacuation times to hazard-to-safety evacuation times for a sample of 3985 points in Seward's tsunami-hazard zone. Safety-to-hazard evacuation times slightly overestimated hazard-to-safety evacuation times but the strong relationship to the hazard-to-safety evacuation times, slightly conservative bias, and shorter processing times of the safety-to-hazard approach make it the preferred approach. Second, we explore how variations in land cover speed conservation values (SCVs) influence model performance using a Monte Carlo approach with one thousand sets of land cover SCVs. The LCD model was relatively robust to changes in land cover SCVs with the magnitude of local model sensitivity greatest in areas with higher evacuation times or with wetland or shore land cover types, where model results may slightly underestimate travel times. This study demonstrates that emergency managers should be concerned not only with populations in locations with evacuation times greater than wave arrival times, but also with populations with evacuation times lower than but close to expected wave arrival times, particularly if they are required to cross wetlands or beaches.
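
    A sketch of the land-cover sensitivity test: travel time along a path is the sum of segment distances divided by the base speed times the segment's speed conservation value, with SCVs drawn repeatedly (base speed, land covers and SCV ranges below are illustrative, not the study's values):

```python
import numpy as np

rng = np.random.default_rng(1)
base_speed = 1.22                       # m/s, a common walking-speed baseline
path = [("road", 400), ("wetland", 150), ("beach", 250)]   # cover, metres

times = []
for _ in range(1000):                   # Monte Carlo over SCV sets
    scv = {"road": rng.uniform(0.8, 1.0),
           "wetland": rng.uniform(0.1, 0.5),
           "beach": rng.uniform(0.4, 0.8)}
    t = sum(d / (base_speed * scv[c]) for c, d in path)
    times.append(t / 60.0)              # minutes
print(f"evacuation time: {np.mean(times):.1f} +/- {np.std(times):.1f} min")
```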

  18. Stability and disease persistence in an age-structured SIS epidemic model with vertical transmission and proportionate mixing assumption

    International Nuclear Information System (INIS)

    El-Doma, M.

    2001-02-01

    The stability of the endemic equilibrium of an SIS age-structured epidemic model of a vertically as well as horizontally transmitted disease is investigated when the force of infection is of proportionate mixing assumption type. We also investigate the uniform weak disease persistence. (author)
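
    For reference, the proportionate-mixing force of infection takes the standard form below (notation assumed, not taken from the paper: k(a) is the age-specific contact rate, i(a,t) the density of infectives and p(a) the population density):

```latex
\lambda(a,t) = k(a)\,
  \frac{\int_0^{\infty} k(a')\, i(a',t)\, da'}
       {\int_0^{\infty} k(a')\, p(a')\, da'}
```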

  19. Preliminary Review of Models, Assumptions, and Key Data used in Performance Assessments and Composite Analysis at the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Arthur S. Rood; Swen O. Magnuson

    2009-07-01

    This document responds to a request by Ming Zhu (DOE-EM) for a preliminary review of existing models and data used in completed or soon-to-be-completed Performance Assessment and Composite Analysis (PA/CA) documents, identifying the codes, methodologies, main assumptions, and key data sets used.

  20. Do irregular grids make a difference? Relaxing the spatial regularity assumption in cellular models of social dynamics

    NARCIS (Netherlands)

    Flache, A; Hegselmann, R

    2001-01-01

    Three decades of CA-modelling in the social sciences have shown that the cellular automata framework is a useful tool to explore the relationship between micro assumptions and macro outcomes in social dynamics. However, virtually all CA-applications in the social sciences rely on a potentially restrictive assumption: a spatially regular grid structure.

  1. Bioclim Deliverable D6b: application of statistical down-scaling within the BIOCLIM hierarchical strategy: methods, data requirements and underlying assumptions

    International Nuclear Information System (INIS)

    2004-01-01

    The overall aim of BIOCLIM is to assess the possible long-term impacts due to climate change on the safety of radioactive waste repositories in deep formations. The coarse spatial scale of the Earth-system Models of Intermediate Complexity (EMICs) used in BIOCLIM, compared with the size of the BIOCLIM study regions and the needs of performance assessment, creates a need for down-scaling. Most of the developmental work on down-scaling methodologies undertaken by the international research community has focused on down-scaling from the general circulation model (GCM) scale (with a typical spatial resolution of 400 km by 400 km over Europe in the current generation of models) using dynamical down-scaling (i.e., regional climate models (RCMs), which typically have a spatial resolution of 50 km by 50 km for models whose domain covers the European region) or statistical methods (which can provide information at the point or station scale) in order to construct scenarios of anthropogenic climate change up to 2100. Dynamical down-scaling (with the MAR RCM) is used in BIOCLIM WP2 to down-scale from the GCM (i.e., IPSL_CM4_D) scale. In the original BIOCLIM description of work, it was proposed that UEA would apply statistical down-scaling to IPSL_CM4_D output in WP2 as part of the hierarchical strategy. Statistical down-scaling requires the identification of statistical relationships between the observed large-scale and regional/local climate, which are then applied to large-scale GCM output, on the assumption that these relationships remain valid in the future (the assumption of stationarity). Thus it was proposed that UEA would investigate the extent to which it is possible to apply relationships between the present-day large-scale and regional/local climate to the relatively extreme conditions of the BIOCLIM WP2 snapshot simulations. Potential statistical down-scaling methodologies were identified from previous work performed at UEA.
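
    The stationarity assumption can be made concrete with a minimal regression-based down-scaling sketch (all series below are invented): a relationship calibrated on present-day data is applied unchanged to model output for a different climate:

```python
import numpy as np

rng = np.random.default_rng(2)
# Present-day calibration: observed large-scale predictor vs. station series.
large_scale = rng.normal(0.0, 1.0, 360)                  # e.g. gridbox temperature
station = 1.5 + 0.8 * large_scale + rng.normal(0.0, 0.3, 360)

slope, intercept = np.polyfit(large_scale, station, 1)   # calibration step

# Application step: the same relationship is assumed to hold (stationarity)
# for model-simulated predictors under a changed climate.
gcm_future = rng.normal(3.0, 1.0, 360)
station_future = intercept + slope * gcm_future
print(f"downscaled mean: {station_future.mean():.2f}")
```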

  2. The Role Of Modeling Assumptions And Policy Instruments in Evaluating The Global Implications Of U.S. Biofuel Policies

    Energy Technology Data Exchange (ETDEWEB)

    Oladosu, Gbadebo A [ORNL; Kline, Keith L [ORNL

    2010-01-01

    The primary objective of current U.S. biofuel law, the Energy Independence and Security Act of 2007 (EISA), is to reduce dependence on imported oil, but the law also requires biofuels to meet carbon emission reduction thresholds relative to petroleum fuels. EISA created a renewable fuel standard with annual targets for U.S. biofuel use that climb gradually from 9 billion gallons per year in 2008 to 36 billion gallons (about 136 billion liters) of biofuels per year by 2022. The most controversial aspects of the biofuel policy have centered on the global social and environmental implications of its potential land use effects. In particular, there is an ongoing debate about whether indirect land use change (ILUC) makes biofuels a net source, rather than a sink, of carbon emissions. However, the ILUC induced by biofuel production and use can only be inferred through modeling. This paper evaluates how model structure, underlying assumptions, and the representation of policy instruments influence the results of U.S. biofuel policy simulations. The analysis shows that differences in these factors can lead to divergent model estimates of land use and economic effects. Estimates of the net conversion of forests and grasslands induced by U.S. biofuel policy range from 0.09 ha/1000 gallons, described in this paper, to 0.73 ha/1000 gallons from early studies in the ILUC debate. We note that several important factors governing land use change remain to be examined. Challenges that must be addressed to improve global land use change modeling are highlighted.

  3. Sensitivity to assumptions in models of generalist predation on a cyclic prey.

    Science.gov (United States)

    Matthiopoulos, Jason; Graham, Kate; Smout, Sophie; Asseburg, Christian; Redpath, Stephen; Thirgood, Simon; Hudson, Peter; Harwood, John

    2007-10-01

    Ecological theory predicts that generalist predators should damp or suppress long-term periodic fluctuations (cycles) in their prey populations and depress their average densities. However, the magnitude of these impacts is likely to vary depending on the availability of alternative prey species and the nature of ecological mechanisms driving the prey cycles. These multispecies effects can be modeled explicitly if parameterized functions relating prey consumption to prey abundance, and realistic population dynamical models for the prey, are available. These requirements are met by the interaction between the Hen Harrier (Circus cyaneus) and three of its prey species in the United Kingdom, the Meadow Pipit (Anthus pratensis), the field vole (Microtus agrestis), and the Red Grouse (Lagopus lagopus scoticus). We used this system to investigate how the availability of alternative prey and the way in which prey dynamics are modeled might affect the behavior of simple trophic networks. We generated cycles in one of the prey species (Red Grouse) in three different ways: through (1) the interaction between grouse density and macroparasites, (2) the interaction between grouse density and male grouse aggressiveness, and (3) a generic, delayed density-dependent mechanism. Our results confirm that generalist predation can damp or suppress grouse cycles, but only when the densities of alternative prey are low. They also demonstrate that diametrically opposite indirect effects between pairs of prey species can occur together in simple systems. In this case, pipits and grouse are apparent competitors, whereas voles and grouse are apparent facilitators. Finally, we found that the quantitative impacts of the predator on prey density differed among the three models of prey dynamics, and these differences were robust to uncertainty in parameter estimation and environmental stochasticity.
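
    The "parameterized functions relating prey consumption to prey abundance" can be sketched as a multispecies (Holling type III-like) functional response, in which intake of each prey depends on the abundance of all prey (parameter values invented):

```python
import numpy as np

def consumption(N, a, h):
    """Per-predator intake of each prey; N, a, h are same-length arrays."""
    num = a * N**2                         # sigmoidal (type III) numerator
    return num / (1.0 + np.sum(h * num))   # shared handling-time denominator

N = np.array([120.0, 300.0, 40.0])   # grouse, pipits, voles (densities)
a = np.array([1e-4, 2e-4, 5e-4])     # attack coefficients
h = np.array([0.01, 0.004, 0.008])   # handling times
print(consumption(N, a, h))          # intake of each prey type
```

    Because all prey share the denominator, raising one prey's density changes the predation pressure on the others, which is how apparent competition and apparent facilitation can coexist in the same simple system.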

  4. Assumptions of Multiple Regression: Correcting Two Misconceptions

    Directory of Open Access Journals (Sweden)

    Matt N. Williams

    2013-09-01

    Full Text Available In 2002, an article entitled "Four assumptions of multiple regression that researchers should always test" by Osborne and Waters was published in PARE. This article has gone on to be viewed more than 275,000 times (as of August 2013), and it is one of the first results displayed in a Google search for "regression assumptions". While Osborne and Waters' efforts in raising awareness of the need to check assumptions when using regression are laudable, we note that the original article contained at least two fairly important misconceptions about the assumptions of multiple regression: firstly, that multiple regression requires the assumption of normally distributed variables; and secondly, that measurement errors necessarily cause underestimation of simple regression coefficients. In this article, we clarify that multiple regression models estimated using ordinary least squares require the assumption of normally distributed errors in order for trustworthy inferences, at least in small samples, but not the assumption of normally distributed response or predictor variables. Secondly, we point out that regression coefficients in simple regression models will be biased (toward zero) estimates of the relationships between variables of interest when measurement error is uncorrelated across those variables, but that when correlated measurement error is present, regression coefficients may be either upwardly or downwardly biased. We conclude with a brief corrected summary of the assumptions of multiple regression when using ordinary least squares.
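
    Both corrected points are easy to check by simulation: OLS tolerates a heavily skewed predictor as long as the errors are normal, and uncorrelated measurement error in the predictor attenuates the slope by the reliability ratio (a quick sketch with invented values):

```python
import numpy as np

rng = np.random.default_rng(3)
n, beta = 10_000, 2.0
x = rng.exponential(1.0, n)                   # heavily skewed predictor is fine
y = 1.0 + beta * x + rng.normal(0.0, 1.0, n)  # errors are normal

slope_clean = np.polyfit(x, y, 1)[0]
x_noisy = x + rng.normal(0.0, 1.0, n)         # uncorrelated measurement error
slope_noisy = np.polyfit(x_noisy, y, 1)[0]

# reliability = var(x) / (var(x) + var(error)) = 1/2 here, so the slope
# estimate is attenuated toward zero by about half.
print(f"clean slope ~ {slope_clean:.2f}, attenuated slope ~ {slope_noisy:.2f}")
```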

  5. Assumptions, conjectures, and other miracles: The application of evaluative thinking to theory of change models in community development.

    Science.gov (United States)

    Archibald, Thomas; Sharrock, Guy; Buckley, Jane; Cook, Natalie

    2016-12-01

    Unexamined and unjustified assumptions are the Achilles' heel of development programs. In this paper, we describe an evaluation capacity building (ECB) approach designed to help community development practitioners work more effectively with assumptions through the intentional infusion of evaluative thinking (ET) into the program planning, monitoring, and evaluation process. We focus specifically on one component of our ET promotion approach involving the creation and analysis of theory of change (ToC) models. We describe our recent efforts to pilot this ET ECB approach with Catholic Relief Services (CRS) in Ethiopia and Zambia. The use of ToC models, plus the addition of ET, is a way to encourage individual and organizational learning and adaptive management that supports more reflective and responsive programming. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. From requirements to Java in a snap model-driven requirements engineering in practice

    CERN Document Server

    Smialek, Michal

    2015-01-01

    This book provides a coherent methodology for Model-Driven Requirements Engineering which stresses the systematic treatment of requirements within the realm of modelling and model transformations. The underlying basic assumption is that detailed requirements models are used as first-class artefacts playing a direct role in constructing software. To this end, the book presents the Requirements Specification Language (RSL), which allows precision and formality and eventually permits automation of the process of turning requirements into a working system by applying model transformations and code generation.

  7. Discrete-State and Continuous Models of Recognition Memory: Testing Core Properties under Minimal Assumptions

    Science.gov (United States)

    Kellen, David; Klauer, Karl Christoph

    2014-01-01

    A classic discussion in the recognition-memory literature concerns the question of whether recognition judgments are better described by continuous or discrete processes. These two hypotheses are instantiated by the signal detection theory model (SDT) and the 2-high-threshold model, respectively. Their comparison has almost invariably relied on…
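
    The core contrast can be made concrete: with illustrative parameter values, equal-variance SDT traces a curved ROC while the 2-high-threshold model predicts a straight line between its detect and guess limits:

```python
import numpy as np
from scipy.stats import norm

# SDT: hit and false-alarm rates from a shifted normal, swept over criteria.
criteria = np.linspace(-2.0, 2.0, 9)
d_prime = 1.0
roc_sdt = np.c_[norm.sf(criteria), norm.sf(criteria - d_prime)]   # (FA, hit)

# 2HT: detect-or-guess mixture; sweeping the guess rate g gives a line.
Do, Dn = 0.5, 0.5                       # detection probabilities (old, new)
g = np.linspace(0.05, 0.95, 9)          # guessing rates
roc_2ht = np.c_[(1 - Dn) * g, Do + (1 - Do) * g]

print(np.round(roc_sdt, 2), np.round(roc_2ht, 2), sep="\n")
```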

  8. A Memory-Based Model of Posttraumatic Stress Disorder: Evaluating Basic Assumptions Underlying the PTSD Diagnosis

    Science.gov (United States)

    Rubin, David C.; Berntsen, Dorthe; Bohni, Malene Klindt

    2008-01-01

    In the mnemonic model of posttraumatic stress disorder (PTSD), the current memory of a negative event, not the event itself, determines symptoms. The model is an alternative to the current event-based etiology of PTSD represented in the "Diagnostic and Statistical Manual of Mental Disorders" (4th ed., text rev.; American Psychiatric Association,…

  9. The reality behind the assumptions: Modelling and simulation support for the SAAF

    CSIR Research Space (South Africa)

    Naidoo, K

    2015-10-01

    Full Text Available Presentation: modelling and simulation support for the SAAF. The slides cover military aerospace trends and strategy, noting that national security now includes further dimensions such as social and economic development, the environment and energy security.

  10. Endogenous Fishing Mortality in Life History Models: Relaxing Some Implicit Assumptions

    OpenAIRE

    Smith, Martin D.

    2007-01-01

    Life history models can include a wide range of biological and ecological features that affect exploited fish populations. However, they typically treat fishing mortality as an exogenous parameter. Implicitly, this approach assumes that the supply of fishing effort is perfectly inelastic. That is, the supply curve of effort is vertical. Fishery modelers often run simulations for different values of fishing mortality, but this exercise also assumes vertical supply and simply explores a series of vertical supply curves.

  11. Nucleon deep-inelastic structure functions in a quark model with factorizability assumptions

    International Nuclear Information System (INIS)

    Linkevich, A.D.; Skachkov, N.B.

    1979-01-01

    A formula for the structure functions of deep-inelastic electron scattering on the nucleon is derived, using a dynamic model of factorizing quark amplitudes. It is found that, as the squared momentum transfer Q 2 increases, the structure-function values decrease at large values of the kinematic variable x and increase at small values of x. Comparison with experimental data shows good agreement between the model and experiment.

  12. Model of the electric energy market in Poland. Assumptions, structure and operation principles

    International Nuclear Information System (INIS)

    Kulagowski, W.

    1994-01-01

    The present state of work on a model of the electric energy market in Poland is presented, with special consideration of the bulk energy market. The designed model, based on progressive, evolutionary changes, is flexible enough that particular solutions can be verified or corrected while the general structure and fundamentals are kept. The changes in the electric energy market are considered an integral part of the ongoing restructuring of the Polish electric energy sector; the rate of those changes and the mode of their introduction influence how quickly the new solutions can be introduced. (author). 14 refs, 4 figs

  13. Multiverse Assumptions and Philosophy

    Directory of Open Access Journals (Sweden)

    James R. Johnson

    2018-02-01

    Full Text Available Multiverses are predictions based on theories. Focusing on each theory's assumptions is key to evaluating a proposed multiverse. Although accepted theories of particle physics and cosmology contain non-intuitive features, multiverse theories entertain a host of "strange" assumptions classified as metaphysical (outside objective experience, concerned with the fundamental nature of reality, ideas that cannot be proven right or wrong), on topics such as: infinity, duplicate yous, hypothetical fields, more than three space dimensions, Hilbert space, advanced civilizations, and reality established by mathematical relationships. It is easy to confuse multiverse proposals because many divergent models exist. This overview defines the characteristics of eleven popular multiverse proposals. The characteristics compared are: initial conditions, values of constants, laws of nature, number of space dimensions, number of universes, and fine-tuning explanations. Future scientific experiments may validate selected assumptions; but until they do, proposals by philosophers may be as valid as theoretical scientific theories.

  14. White Noise Assumptions Revisited : Regression Models and Statistical Designs for Simulation Practice

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2006-01-01

    Classic linear regression models and their concomitant statistical designs assume a univariate response and white noise. By definition, white noise is normally, independently, and identically distributed with zero mean. This survey tries to answer the following questions: (i) How realistic are these assumptions in simulation practice?

  15. Accurate reduction of a model of circadian rhythms by delayed quasi steady state assumptions

    Czech Academy of Sciences Publication Activity Database

    Vejchodský, Tomáš

    2014-01-01

    Roč. 139, č. 4 (2014), s. 577-585 ISSN 0862-7959 Grant - others:European Commission(XE) StochDetBioModel(328008) Program:FP7 Institutional support: RVO:67985840 Keywords : biochemical networks * gene regulatory networks * oscillating systems * periodic solution Subject RIV: BA - General Mathematics http://hdl.handle.net/10338.dmlcz/144135

  16. The reality behind the assumptions: Modelling and simulation support for the SAAF

    CSIR Research Space (South Africa)

    Naidoo, K

    2015-10-01

    Full Text Available Presentation: modelling and simulation support for the SAAF. The slides cover military aerospace trends and strategy, noting that national security now includes further dimensions such as social and economic development, the environment and energy security, and cite the Defence Review's guideline on the defence science, engineering and technology capability.

  17. Oceanographic and behavioural assumptions in models of the fate of coral and coral reef fish larvae.

    Science.gov (United States)

    Wolanski, Eric; Kingsford, Michael J

    2014-09-06

    A predictive model of the fate of coral reef fish larvae in a reef system is proposed that combines the oceanographic processes of advection and turbulent diffusion with the biological process of horizontal swimming controlled by olfactory and auditory cues within the timescales of larval development. In the model, auditory cues resulted in swimming towards the reefs when within hearing distance of the reef, whereas olfactory cues resulted in the larvae swimming towards the natal reef in open waters by swimming against the concentration gradients in the smell plume emanating from the natal reef. The model suggested that the self-seeding rate may be quite large, at least 20% for the larvae of rapidly developing reef fish species, which contrasted with a self-seeding rate less than 2% for non-swimming coral larvae. The predicted self-recruitment rate of reefs was sensitive to a number of parameters, such as the time at which the fish larvae reach post-flexion, the pelagic larval duration of the larvae, the horizontal turbulent diffusion coefficient in reefal waters and the horizontal swimming behaviour of the fish larvae in response to auditory and olfactory cues, for which better field data are needed. Thus, the model suggested that high self-seeding rates for reef fish are possible, even in areas where the 'sticky water' effect is minimal and in the absence of long-term trapping in oceanic fronts and/or large-scale oceanic eddies or filaments that are often argued to facilitate the return of the larvae after long periods of drifting at sea. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
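
    A toy version of the behavioural rules in the model, with turbulent dispersion as random steps plus directed swimming, weakly up the odour plume far from the reef and strongly toward it within hearing range (all values are illustrative assumptions, not the paper's parameterization):

```python
import numpy as np

rng = np.random.default_rng(4)
pos, reef = np.array([4000.0, 2000.0]), np.zeros(2)   # positions in metres
swim, turb, hearing = 0.10, 0.5, 1000.0               # m/s, m per step, m

for second in range(150_000):                         # 1-second time steps
    d = np.linalg.norm(pos - reef)
    if d < 50.0:                                      # close enough to settle
        break
    cue = 1.0 if d < hearing else 0.3                 # audition vs. olfaction
    pos += swim * cue * (reef - pos) / d + rng.normal(0.0, turb, 2)
print(f"t = {second} s, distance to reef = {np.linalg.norm(pos - reef):.0f} m")
```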

  18. In modelling effects of global warming, invalid assumptions lead to unrealistic projections.

    Science.gov (United States)

    Lefevre, Sjannie; McKenzie, David J; Nilsson, Göran E

    2018-02-01

    In their recent Opinion, Pauly and Cheung provide new projections of future maximum fish weight (W∞). Based on criticism by Lefevre et al. (2017) they changed the scaling exponent for anabolism, d_G. Here we find that changing both d_G and the scaling exponent for catabolism, b, leads to the projection that fish may even become 98% smaller with a 1°C increase in temperature. This unrealistic outcome indicates that the current W∞ is unlikely to be explained by the Gill-Oxygen Limitation Theory (GOLT) and, therefore, GOLT cannot be used as a mechanistic basis for model projections about fish size in a warmer world. © 2017 John Wiley & Sons Ltd.

  19. On the Empirical Importance of the Conditional Skewness Assumption in Modelling the Relationship between Risk and Return

    Science.gov (United States)

    Pipień, M.

    2008-09-01

    We present the results of an application of Bayesian inference to testing the relation between risk and return on financial instruments. On the basis of the Intertemporal Capital Asset Pricing Model proposed by Merton, we build a general sampling distribution suitable for analysing this relationship. The most important feature of our assumptions is that the skewness of the conditional distribution of returns is used as an alternative source of the relation between risk and return. This general specification relates to a Skewed Generalized Autoregressive Conditionally Heteroscedastic-in-Mean model. In order to make the conditional distribution of financial returns skewed, we considered a unified approach based on the inverse probability integral transformation. In particular, we applied the hidden truncation mechanism, inverse scale factors, the order statistics concept, Beta and Bernstein distribution transformations, and also a constructive method. Based on the daily excess returns of the Warsaw Stock Exchange Index, we checked the empirical importance of the conditional skewness assumption for the relation between risk and return on the Warsaw Stock Market. We present posterior probabilities of all competing specifications, as well as a posterior analysis of the positive sign of the tested relationship.

  20. Flawed Assumptions, Models and Decision Making: Misconceptions Concerning Human Elements in Complex System

    International Nuclear Information System (INIS)

    FORSYTHE, JAMES C.; WENNER, CAREN A.

    1999-01-01

    The history of high consequence accidents is rich with events wherein the actions, or inaction, of humans were critical to the sequence of events preceding the accident. Moreover, it has been reported that human error may contribute to 80% of accidents, if not more (Dougherty and Fragola, 1988). Within the safety community this reality is widely recognized, and there is a substantially greater awareness of the human contribution to system safety today than has ever existed in the past. Despite these facts, and some measurable reduction in accident rates, when accidents do occur there is a common lament: no matter how hard we try, we continue to have accidents. Accompanying this lament, there is often bewilderment, expressed in statements such as "There's no explanation for why he/she did what they did". It is believed that these statements are a symptom of inadequacies in how we think about humans and their role within technological systems. In particular, while there has never been a greater awareness of human factors, conceptual models of human involvement in engineered systems are often incomplete and, in some cases, inaccurate.

  1. Flawed Assumptions, Models and Decision Making: Misconceptions Concerning Human Elements in Complex System

    Energy Technology Data Exchange (ETDEWEB)

    FORSYTHE,JAMES C.; WENNER,CAREN A.

    1999-11-03

    The history of high consequence accidents is rich with events wherein the actions, or inaction, of humans were critical to the sequence of events preceding the accident. Moreover, it has been reported that human error may contribute to 80% of accidents, if not more (Dougherty and Fragola, 1988). Within the safety community this reality is widely recognized, and there is a substantially greater awareness of the human contribution to system safety today than has ever existed in the past. Despite these facts, and some measurable reduction in accident rates, when accidents do occur there is a common lament: no matter how hard we try, we continue to have accidents. Accompanying this lament, there is often bewilderment, expressed in statements such as "There's no explanation for why he/she did what they did". It is believed that these statements are a symptom of inadequacies in how we think about humans and their role within technological systems. In particular, while there has never been a greater awareness of human factors, conceptual models of human involvement in engineered systems are often incomplete and, in some cases, inaccurate.

  2. Cost effectiveness of self-monitoring of blood glucose (SMBG) for patients with type 2 diabetes and not on insulin: impact of modelling assumptions on recent Canadian findings.

    Science.gov (United States)

    Tunis, Sandra L

    2011-11-01

    Canadian patients, healthcare providers and payers share an interest in assessing the value of self-monitoring of blood glucose (SMBG) for individuals with type 2 diabetes who are not on insulin. Using the UKPDS (UK Prospective Diabetes Study) model, the Canadian Optimal Prescribing and Utilization Service (COMPUS) conducted an SMBG cost-effectiveness analysis. Based on the results, COMPUS does not recommend routine strip use for most adults with type 2 diabetes who are not on insulin. Cost-effectiveness studies require many assumptions regarding the cohort, clinical effect, complication costs, etc. The COMPUS evaluation included several conservative assumptions that negatively impacted SMBG cost effectiveness. The current objectives were to (i) review key, impactful COMPUS assumptions; (ii) illustrate how alternative inputs can lead to more favourable results for SMBG cost effectiveness; and (iii) provide recommendations for assessing its long-term value. A summary of COMPUS methods and results is followed by a review of assumptions (for the trial-based glycosylated haemoglobin [HbA(1c)] effect, patient characteristics, costs, simulation pathway) and their potential impact. The UKPDS model was used for a 40-year cost-effectiveness analysis of SMBG (1.29 strips per day) versus no SMBG in the Canadian payer setting. COMPUS assumptions for patient characteristics (e.g. HbA(1c) 8.4%), the SMBG HbA(1c) advantage (-0.25%) and costs were retained. As in the COMPUS analysis, UKPDS HbA(1c) decay curves were incorporated into the SMBG and no-SMBG pathways. An important difference was that SMBG HbA(1c) benefits in the current study could extend beyond the initial simulation period. Sensitivity analyses examined the SMBG HbA(1c) advantage, adherence, complication history and cost inputs. Outcomes (discounted at 5%) included QALYs, complication rates, total costs (year 2008 values) and incremental cost-effectiveness ratios (ICERs). The base-case ICER was $Can63 664 per QALY gained.

  3. Requirements for Medical Modeling Languages

    Science.gov (United States)

    van der Maas, Arnoud A.F.; Ter Hofstede, Arthur H.M.; Ten Hoopen, A. Johannes

    2001-01-01

    Objective: The development of tailor-made domain-specific modeling languages is sometimes desirable in medical informatics. Naturally, the development of such languages should be guided. The purpose of this article is to introduce a set of requirements for such languages and show their application in analyzing and comparing existing modeling languages. Design: The requirements arise from the practical experience of the authors and others in the development of modeling languages in both general informatics and medical informatics. The requirements initially emerged from the analysis of information modeling techniques. The requirements are designed to be orthogonal, i.e., one requirement can be violated without violation of the others. Results: The proposed requirements for any modeling language are that it be “formal” with regard to syntax and semantics, “conceptual,” “expressive,” “comprehensible,” “suitable,” and “executable.” The requirements are illustrated using both the medical logic modules of the Arden Syntax as a running example and selected examples from other modeling languages. Conclusion: Activity diagrams of the Unified Modeling Language, task structures for work flows, and Petri nets are discussed with regard to the list of requirements, and various tradeoffs are thus made explicit. It is concluded that this set of requirements has the potential to play a vital role in both the evaluation of existing domain-specific languages and the development of new ones. PMID:11230383

  4. Requirements for effective modelling strategies.

    NARCIS (Netherlands)

    Gaunt, J.L.; Riley, J.; Stein, A.; Penning de Vries, F.W.T.

    1997-01-01

    As a result of a recent BBSRC-funded workshop between soil scientists, modellers, statisticians and others to discuss issues relating to the derivation of complex environmental models, a set of modelling guidelines is presented and the required associated research areas are discussed.

  5. Early Validation of Automation Plant Control Software using Simulation Based on Assumption Modeling and Validation Use Cases

    Directory of Open Access Journals (Sweden)

    Veronika Brandstetter

    2015-10-01

    Full Text Available In automation plants, technical processes must be conducted in a way that products, substances, or services are produced reliably, with sufficient quality and with minimal strain on resources. A key driver in conducting these processes is the automation plant's control software, which controls the technical plant components and thereby affects the physical, chemical, and mechanical processes that take place in automation plants. To this end, the control software of an automation plant must adhere to strict process requirements arising from the technical processes and from the physical plant design. Currently, validation of the control software often starts late in the engineering process, in many cases once the automation plant is almost completely constructed. However, as is widely acknowledged, the later the control software of the automation plant is validated, the higher the effort for correcting revealed defects, which can lead to serious budget overruns and project delays. In this article we propose an approach that allows the early validation of automation control software against the technical plant processes and assumptions about the physical plant design by means of simulation. We demonstrate the application of our approach on an actual plant project from the automation industry and present its technical implementation.

  6. Contextuality under weak assumptions

    International Nuclear Information System (INIS)

    Simmons, Andrew W; Rudolph, Terry; Wallman, Joel J; Pashayan, Hakop; Bartlett, Stephen D

    2017-01-01

    The presence of contextuality in quantum theory was first highlighted by Bell, Kochen and Specker, who discovered that for quantum systems of three or more dimensions, measurements could not be viewed as deterministically revealing pre-existing properties of the system. More precisely, no model can assign deterministic outcomes to the projectors of a quantum measurement in a way that depends only on the projector and not the context (the full set of projectors) in which it appeared, despite the fact that the Born rule probabilities associated with projectors are independent of the context. A more general, operational definition of contextuality introduced by Spekkens, which we will term ‘probabilistic contextuality’, drops the assumption of determinism and allows for operations other than measurements to be considered contextual. Even two-dimensional quantum mechanics can be shown to be contextual under this generalised notion. Probabilistic noncontextuality represents the postulate that elements of an operational theory that cannot be distinguished from each other based on the statistics of arbitrarily many repeated experiments (they give rise to the same operational probabilities) are ontologically identical. In this paper, we introduce a framework that enables us to distinguish between different noncontextuality assumptions in terms of the relationships between the ontological representations of objects in the theory given a certain relation between their operational representations. This framework can be used to motivate and define a ‘possibilistic’ analogue, encapsulating the idea that elements of an operational theory that cannot be unambiguously distinguished operationally can also not be unambiguously distinguished ontologically. We then prove that possibilistic noncontextuality is equivalent to an alternative notion of noncontextuality proposed by Hardy. Finally, we demonstrate that these weaker noncontextuality assumptions are sufficient to prove

  7. Relevance of collisionality in the transport model assumptions for divertor detachment multi-fluid modelling on JET

    DEFF Research Database (Denmark)

    Wiesen, S.; Fundamenski, W.; Wischmeier, M.

    2011-01-01

    A revised formulation of the perpendicular diffusive transport model in 2D multi-fluid edge codes is proposed. Based on theoretical predictions and experimental observations a dependence on collisionality is introduced into the transport model of EDGE2D–EIRENE. The impact on time-dependent JET gas...... fuelled ramp-up scenario modelling of the full transient from attached divertor into the high-recycling regime, following a target flux roll over into divertor detachment, ultimately ending in a density limit is presented. A strong dependence on divertor geometry is observed which can mask features...... of the new transport model: a smoothly decaying target recycling flux roll over, an asymmetric drop of temperature and pressure along the field lines as well as macroscopic power dependent plasma oscillations near the density limit which had been previously observed also experimentally. The latter effect...

  8. Do unreal assumptions pervert behaviour?

    DEFF Research Database (Denmark)

    Petersen, Verner C.

    After conducting a series of experiments involving economics students, Miller concludes: "The experience of taking a course in microeconomics actually altered students' conceptions of the appropriateness of acting in a self-interested manner, not merely their definition of self-interest." Being...... become taken for granted and tacitly included into theories and models of management. Guiding business and management to behave in a fashion that apparently makes these assumptions become "true". Thus in fact making theories and models become self-fulfilling prophecies. The paper elucidates some...... of the basic assumptions underlying the theories found in economics. Assumptions relating to the primacy of self-interest, to resourceful, evaluative, maximising models of man, to incentive systems and to agency theory. The major part of the paper then discusses how these assumptions and theories may pervert...

  9. Formal Requirements Modeling for Reactive Systems with Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon

    . This is important because it represents the identification of what is being designed (the reactive system), and what is given and being made assumptions about (the environment). The representation of the environment is further partitioned to distinguish human actors from non-human actors. This allows the modeler...... to addressing the problem of validating formal requirements models through interactive graphical animations is presented. Executable Use Cases (EUCs) provide a framework for integrating three tiers of descriptions of specifications and environment assumptions: the lower tier is an informal description...... to distinguish the modeling artifacts describing the environment from those describing the specifications for a reactive system. The formalization allows for clear identification of interfaces between interacting domains, where the interaction takes place through an abstraction of possibly parameterized states...

  10. Hawaiian forest bird trends: using log-linear models to assess long-term trends is supported by model diagnostics and assumptions (reply to Freed and Cann 2013)

    Science.gov (United States)

    Camp, Richard J.; Pratt, Thane K.; Gorresen, P. Marcos; Woodworth, Bethany L.; Jeffrey, John J.

    2014-01-01

    Freed and Cann (2013) criticized our use of linear models to assess trends in the status of Hawaiian forest birds through time (Camp et al. 2009a, 2009b, 2010) by questioning our sampling scheme, whether we met model assumptions, and whether we ignored short-term changes in the population time series. In the present paper, we address these concerns and reiterate that our results do not support the position of Freed and Cann (2013) that the forest birds in the Hakalau Forest National Wildlife Refuge (NWR) are declining, or that the federally listed endangered birds are showing signs of imminent collapse. On the contrary, our data indicate that the 21-year long-term trends for native birds in Hakalau Forest NWR are stable to increasing, especially in areas that have received active management.
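
    As an illustration of the log-linear trend approach at issue here, a few lines suffice: a Poisson regression of annual counts on year, where the exponentiated year coefficient gives the multiplicative annual trend. This is only a sketch; the counts and variable names below are hypothetical, not the Hakalau survey data.

```python
# Minimal sketch of a log-linear (Poisson) trend test on annual count data.
# Counts below are hypothetical, not the Hakalau survey data.
import numpy as np

years = np.arange(1987, 2008)                    # a 21-year series
counts = np.array([52, 55, 49, 60, 58, 63, 61, 66, 70, 64, 69,
                   75, 72, 78, 80, 77, 85, 83, 88, 92, 90])

# Model log E[N_t] = b0 + b1 * t, with t centred for numerical stability
X = np.column_stack([np.ones_like(years, dtype=float), years - years.mean()])

beta = np.zeros(2)
for _ in range(25):                              # IRLS for the Poisson GLM
    mu = np.exp(X @ beta)
    z = X @ beta + (counts - mu) / mu            # working response
    W = np.diag(mu)                              # working weights
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ z)

se = np.sqrt(np.diag(np.linalg.inv(X.T @ np.diag(np.exp(X @ beta)) @ X)))
print(f"trend: x{np.exp(beta[1]):.4f} per year, Wald z = {beta[1] / se[1]:.2f}")
```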

  11. Linking assumptions in amblyopia

    Science.gov (United States)

    LEVI, DENNIS M.

    2017-01-01

    Over the last 35 years or so, there has been substantial progress in revealing and characterizing the many interesting and sometimes mysterious sensory abnormalities that accompany amblyopia. A goal of many of the studies has been to try to make the link between the sensory losses and the underlying neural losses, resulting in several hypotheses about the site, nature, and cause of amblyopia. This article reviews some of these hypotheses, and the assumptions that link the sensory losses to specific physiological alterations in the brain. Despite intensive study, it turns out to be quite difficult to make a simple linking hypothesis, at least at the level of single neurons, and the locus of the sensory loss remains elusive. It is now clear that the simplest notion—that reduced contrast sensitivity of neurons in cortical area V1 explains the reduction in contrast sensitivity—is too simplistic. Considerations of noise, noise correlations, pooling, and the weighting of information also play a critically important role in making perceptual decisions, and our current models of amblyopia do not adequately take these into account. Indeed, although the reduction of contrast sensitivity is generally considered to reflect “early” neural changes, it seems plausible that it reflects changes at many stages of visual processing. PMID:23879956

  12. Multiverse Assumptions and Philosophy

    OpenAIRE

    James R. Johnson

    2018-01-01

    Multiverses are predictions based on theories. Focusing on each theory’s assumptions is key to evaluating a proposed multiverse. Although accepted theories of particle physics and cosmology contain non-intuitive features, multiverse theories entertain a host of “strange” assumptions classified as metaphysical (outside objective experience, concerned with fundamental nature of reality, ideas that cannot be proven right or wrong) topics such as: infinity, duplicate yous, hypothetical fields, mo...

  13. Distributional Assumptions in Educational Assessments Analysis: Normal Distributions versus Generalized Beta Distribution in Modeling the Phenomenon of Learning

    Science.gov (United States)

    Campos, Jose Alejandro Gonzalez; Moraga, Paulina Saavedra; Del Pozo, Manuel Freire

    2013-01-01

    This paper introduces the generalized beta (GB) model as a new modeling tool in the educational assessment area and evaluation analysis, specifically. Unlike the normal model, the GB model allows us to capture some real characteristics of the data, and it is an important tool for understanding the phenomenon of learning. This paper develops a contrast with the…

  14. A test of the Olson Circumplex Model: examining its curvilinear assumption and the presence of extreme types.

    Science.gov (United States)

    Anderson, S A; Gavazzi, S M

    1990-09-01

    The debate over the usefulness of different family models continues. Recent attention has been paid to comparisons between the Olson Circumplex Model and the Beavers Systems Model. The present study seeks to contribute evidence that bears directly upon one of the most fundamental points of controversy surrounding the Olson model--the linear versus curvilinear nature of the cohesion and adaptability dimensions. A further contribution is an examination of the actual occurrence of the Circumplex Model's extreme types in a clinical population.

  15. Sensitivity Analysis Without Assumptions.

    Science.gov (United States)

    Ding, Peng; VanderWeele, Tyler J

    2016-05-01

    Unmeasured confounding may undermine the validity of causal inference with observational studies. Sensitivity analysis provides an attractive way to partially circumvent this issue by assessing the potential influence of unmeasured confounding on causal conclusions. However, previous sensitivity analysis approaches often make strong and untestable assumptions such as having an unmeasured confounder that is binary, or having no interaction between the effects of the exposure and the confounder on the outcome, or having only one unmeasured confounder. Without imposing any assumptions on the unmeasured confounder or confounders, we derive a bounding factor and a sharp inequality such that the sensitivity analysis parameters must satisfy the inequality if an unmeasured confounder is to explain away the observed effect estimate or reduce it to a particular level. Our approach is easy to implement and involves only two sensitivity parameters. Surprisingly, our bounding factor, which makes no simplifying assumptions, is no more conservative than a number of previous sensitivity analysis techniques that do make assumptions. Our new bounding factor implies not only the traditional Cornfield conditions that both the relative risk of the exposure on the confounder and that of the confounder on the outcome must satisfy but also a high threshold that the maximum of these relative risks must satisfy. Furthermore, this new bounding factor can be viewed as a measure of the strength of confounding between the exposure and the outcome induced by a confounder.
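
    The bounding factor described above has a closed form, BF = RR_UD × RR_UY / (RR_UD + RR_UY − 1), where RR_UD and RR_UY are the two sensitivity parameters. The sketch below, with hypothetical relative risks, shows how an observed estimate is discounted by it; it is an illustration, not the authors' code.

```python
# Sketch of the bounding-factor calculation for unmeasured confounding.
# rr_ud: maximal relative risk of the confounder(s) for the exposure;
# rr_uy: maximal relative risk of the confounder(s) for the outcome.

def bounding_factor(rr_ud: float, rr_uy: float) -> float:
    """Largest factor by which confounding of this strength can distort an observed RR."""
    return rr_ud * rr_uy / (rr_ud + rr_uy - 1.0)

def true_rr_lower_bound(rr_observed: float, rr_ud: float, rr_uy: float) -> float:
    """Smallest causal RR compatible with the observed RR under this confounding."""
    return rr_observed / bounding_factor(rr_ud, rr_uy)

# Hypothetical numbers: observed RR 2.5, both confounder associations equal to 2.
print(true_rr_lower_bound(2.5, 2.0, 2.0))   # 2.5 / (4/3) ≈ 1.88
```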

  16. Limitations of individual causal models, causal graphs, and ignorability assumptions, as illustrated by random confounding and design unfaithfulness.

    Science.gov (United States)

    Greenland, Sander; Mansournia, Mohammad Ali

    2015-10-01

    We describe how ordinary interpretations of causal models and causal graphs fail to capture important distinctions among ignorable allocation mechanisms for subject selection or allocation. We illustrate these limitations in the case of random confounding and designs that prevent such confounding. In many experimental designs individual treatment allocations are dependent, and explicit population models are needed to show this dependency. In particular, certain designs impose unfaithful covariate-treatment distributions to prevent random confounding, yet ordinary causal graphs cannot discriminate between these unconfounded designs and confounded studies. Causal models for populations are better suited for displaying these phenomena than are individual-level models, because they allow representation of allocation dependencies as well as outcome dependencies across individuals. Nonetheless, even with this extension, ordinary graphical models still fail to capture distinctions between hypothetical superpopulations (sampling distributions) and observed populations (actual distributions), although potential-outcome models can be adapted to show these distinctions and their consequences.

  17. Variation in estimated ozone-related health impacts of climate change due to modeling choices and assumptions.

    Science.gov (United States)

    Post, Ellen S; Grambsch, Anne; Weaver, Chris; Morefield, Philip; Huang, Jin; Leung, Lai-Yung; Nolte, Christopher G; Adams, Peter; Liang, Xin-Zhong; Zhu, Jin-Hong; Mahoney, Hardee

    2012-11-01

    Future climate change may cause air quality degradation via climate-induced changes in meteorology, atmospheric chemistry, and emissions into the air. Few studies have explicitly modeled the potential relationships between climate change, air quality, and human health, and fewer still have investigated the sensitivity of estimates to the underlying modeling choices. Our goal was to assess the sensitivity of estimated ozone-related human health impacts of climate change to key modeling choices. Our analysis included seven modeling systems in which a climate change model is linked to an air quality model, five population projections, and multiple concentration-response functions. Using the U.S. Environmental Protection Agency's (EPA's) Environmental Benefits Mapping and Analysis Program (BenMAP), we estimated future ozone (O3)-related health effects in the United States attributable to simulated climate change between the years 2000 and approximately 2050, given each combination of modeling choices. Health effects and concentration-response functions were chosen to match those used in the U.S. EPA's 2008 Regulatory Impact Analysis of the National Ambient Air Quality Standards for O3. Different combinations of methodological choices produced a range of estimates of national O3-related mortality from roughly 600 deaths avoided as a result of climate change to 2,500 deaths attributable to climate change (although the large majority produced increases in mortality). The choice of the climate change and the air quality model reflected the greatest source of uncertainty, with the other modeling choices having lesser but still substantial effects. Our results highlight the need to use an ensemble approach, instead of relying on any one set of modeling choices, to assess the potential risks associated with O3-related human health effects resulting from climate change.
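
    BenMAP-style analyses typically apply a log-linear concentration-response function to a baseline incidence rate and an exposed population. A schematic version is sketched below; every number and name is hypothetical, not taken from the study.

```python
# Schematic log-linear health impact function of the kind applied in
# BenMAP-style analyses; every number below is hypothetical.
import math

def excess_deaths(beta: float, delta_c: float, y0: float, pop: float) -> float:
    """Change in deaths for a change delta_c (ppb) in O3 concentration.

    beta: concentration-response coefficient (per ppb)
    y0:   baseline mortality rate (deaths per person over the season)
    pop:  exposed population
    """
    return (1.0 - math.exp(-beta * delta_c)) * y0 * pop

# +2 ppb of O3 from climate change, beta = 4e-4, baseline 0.005, 1e6 people
print(f"{excess_deaths(4e-4, 2.0, 0.005, 1e6):.0f} attributable deaths")
```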

  18. Violation of the Sphericity Assumption and Its Effect on Type-I Error Rates in Repeated Measures ANOVA and Multi-Level Linear Models (MLM)

    Directory of Open Access Journals (Sweden)

    Nicolas Haverkamp

    2017-10-01

    Full Text Available We investigated the effects of violations of the sphericity assumption on Type I error rates for different methodical approaches of repeated measures analysis using a simulation approach. In contrast to previous simulation studies on this topic, up to nine measurement occasions were considered. Effects of the level of inter-correlations between measurement occasions on Type I error rates were considered for the first time. Two populations with non-violation of the sphericity assumption, one with uncorrelated measurement occasions and one with moderately correlated measurement occasions, were generated. One population with violation of the sphericity assumption combines uncorrelated with highly correlated measurement occasions. A second population with violation of the sphericity assumption combines moderately correlated and highly correlated measurement occasions. From these four populations without any between-group effect or within-subject effect 5,000 random samples were drawn. Finally, the mean Type I error rates for multilevel linear models (MLM) with an unstructured covariance matrix (MLM-UN), MLM with compound symmetry (MLM-CS) and for repeated measures analysis of variance (rANOVA) models (without correction, with Greenhouse-Geisser correction, and with Huynh-Feldt correction) were computed. To examine the effect of both the sample size and the number of measurement occasions, sample sizes of n = 20, 40, 60, 80, and 100 were considered as well as measurement occasions of m = 3, 6, and 9. With respect to rANOVA, the results plead for a use of rANOVA with Huynh-Feldt correction, especially when the sphericity assumption is violated, the sample size is rather small and the number of measurement occasions is large. For MLM-UN, the results illustrate a massive progressive bias for small sample sizes (n = 20) and m = 6 or more measurement occasions. This effect could not be found in previous simulation studies with a smaller number of measurement
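
    The Greenhouse-Geisser correction referred to above rescales the ANOVA degrees of freedom by an epsilon estimated from the sample covariance of the measurement occasions (epsilon = 1 under sphericity, 1/(m−1) at worst). The sketch below illustrates the calculation on simulated data with a hypothetical covariance structure mixing weakly and highly correlated occasions; it is not the authors' simulation code.

```python
# Sketch of the Greenhouse-Geisser epsilon behind the df correction discussed
# above (epsilon = 1 under sphericity, 1/(m-1) at worst); data are simulated.
import numpy as np

def gg_epsilon(data: np.ndarray) -> float:
    """Greenhouse-Geisser epsilon for an (n subjects) x (m occasions) matrix."""
    m = data.shape[1]
    S = np.cov(data, rowvar=False)
    # Orthonormal contrasts spanning the within-subject space (orthogonal to 1)
    centering = np.eye(m) - np.full((m, m), 1.0 / m)
    C = np.linalg.qr(centering)[0][:, :m - 1].T
    A = C @ S @ C.T
    return float(np.trace(A) ** 2 / ((m - 1) * np.trace(A @ A)))

rng = np.random.default_rng(0)
m, n = 6, 40
cov = np.full((m, m), 0.9)                             # high within-half correlation
cov[:m // 2, m // 2:] = cov[m // 2:, :m // 2] = 0.1    # weak across halves
np.fill_diagonal(cov, 1.0)
eps = gg_epsilon(rng.multivariate_normal(np.zeros(m), cov, size=n))
print(f"epsilon ≈ {eps:.2f}; corrected numerator df: {eps * (m - 1):.2f}")
```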

  19. TWODEE: the Health and Safety Laboratory's shallow layer model for heavy gas dispersion. Part 1. Mathematical basis and physical assumptions.

    Science.gov (United States)

    Hankin, R K; Britter, R E

    1999-05-14

    The Major Hazard Assessment Unit of the Health and Safety Executive (HSE) provides advice to local planning authorities on land use planning in the vicinity of major hazard sites. For sites with the potential for large scale releases of toxic heavy gases such as chlorine this advice is based on risk levels and is informed by use of the computerised risk assessment tool RISKAT [C. Nussey, M. Pantony, R. Smallwood, HSE's risk assessment tool RISKAT, Major Hazards: Onshore and Offshore, October, 1992]. At present RISKAT uses consequence models for heavy gas dispersion that assume flat terrain. This paper is the first part of a three part paper. Part 1 describes the mathematical basis of TWODEE, the Health and Safety Laboratory's shallow layer model for heavy gas dispersion. The shallow layer approach used by TWODEE is a compromise between the complexity of CFD models and the simpler integral models. Motivated by the low aspect ratio of typical heavy gas clouds, shallow layer models use depth-averaged variables to describe the flow behaviour. This approach is particularly well suited to assess the effect of complex terrain because the downslope buoyancy force is easily included. Entrainment may be incorporated into a shallow layer model by the use of empirical formulae. Part 2 of this paper presents the numerical scheme used to solve the TWODEE mathematical model, which is validated against theoretical results. Part 3 compares the results of the TWODEE model with the experimental results taken at Thorney Island [J. McQuaid, B. Roebuck, The dispersion of heavier-than-air gas from a fenced enclosure. Final report to the US Coast Guard on contract with the Health and Safety Executive, Technical Report RPG 1185, Safety Engineering Laboratory, Research and Laboratory Services Division, Broad Lane, Sheffield S3 7HQ, UK, 1985]. Crown Copyright 1999 Published by Elsevier Science B.V.
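
    In schematic form, depth-averaged "shallow layer" equations of the kind the abstract describes read roughly as follows (notation assumed here, not transcribed from the paper): cloud depth h, depth-averaged velocity u, reduced gravity g' = g(ρ − ρ_a)/ρ_a, entrainment velocity u_e and ground elevation z_b. The −g'h∇z_b term is the downslope buoyancy force that makes complex terrain straightforward to include.

```latex
\begin{align}
  \frac{\partial h}{\partial t} + \nabla\cdot(h\,\mathbf{u}) &= u_e,\\
  \frac{\partial (h\,\mathbf{u})}{\partial t}
    + \nabla\cdot(h\,\mathbf{u}\otimes\mathbf{u})
    &= -\frac{1}{2}\,\nabla\!\big(g'\,h^{2}\big) - g'\,h\,\nabla z_b .
\end{align}
```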

  20. A Test of Three Basic Assumptions of Situational Leadership® II Model and Their Implications for HRD Practitioners

    Science.gov (United States)

    Zigarmi, Drea; Roberts, Taylor Peyton

    2017-01-01

    Purpose: This study aims to test the following three assertions underlying the Situational Leadership® II (SLII) Model: all four leadership styles are received by followers; all four leadership styles are needed by followers; and if there is a fit between the leadership style a follower receives and needs, that follower will demonstrate favorable…

  1. ASSESSING GOING CONCERN ASSUMPTION BY USING RATING VALUATION MODELS BASED UPON ANALYTICAL PROCEDURES IN CASE OF FINANCIAL INVESTMENT COMPANIES

    OpenAIRE

    Tatiana Danescu; Ovidiu Spatacean; Paula Nistor; Andrea Cristina Danescu

    2010-01-01

    Designing and performing analytical procedures aimed to assess the rating of the Financial Investment Companies are essential activities both in the phase of planning a financial audit mission and in the phase of issuing conclusions regarding the suitability of using by the management and other persons responsible for governance of going concern, as the basis for preparation and disclosure of financial statements. The paper aims to examine the usefulness of recognized models used in the practice o...

  2. On testing the missing at random assumption

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    Most approaches to learning from incomplete data are based on the assumption that unobserved values are missing at random (mar). While the mar assumption, as such, is not testable, it can become testable in the context of other distributional assumptions, e.g. the naive Bayes assumption. In this paper we investigate a method for testing the mar assumption in the presence of other distributional constraints. We present methods to (approximately) compute a test statistic consisting of the ratio of two profile likelihood functions. This requires the optimization of the likelihood under no assumptions on the missingness mechanism, for which we use our recently proposed AI & M algorithm. We present experimental results on synthetic data that show that our approximate test statistic is a good indicator for whether data is mar relative to the given distributional assumptions.

  3. Verification of a decision analytic model assumption using real-world practice data: implications for the cost effectiveness of cyclo-oxygenase 2 inhibitors (COX-2s).

    Science.gov (United States)

    Cox, Emily R; Motheral, Brenda; Mager, Doug

    2003-12-01

    To verify the gastroprotective agent (GPA) rate assumption used in cost-effectiveness models for cyclo-oxygenase 2 inhibitors (COX-2s) and to re-estimate model outcomes using GPA rates from actual practice. Prescription and medical claims data obtained from January 1, 1999, through May 31, 2001, from a large preferred provider organization in the Midwest, were used to estimate GPA rates within 3 groups of patients aged at least 18 years who were new to nonselective nonsteroidal anti-inflammatory drugs (NSAIDs) and COX-2 therapy: all new NSAID users, new NSAID users with a diagnosis of rheumatoid arthritis (RA) or osteoarthritis (OA), and a matched cohort of new NSAID users. Of the more than 319,000 members with at least 1 day of eligibility, 1900 met the study inclusion criteria for new NSAID users, 289 had a diagnosis of OA or RA, and 1232 were included in the matched cohort. Gastroprotective agent estimates for nonselective NSAID and COX-2 users were consistent across all 3 samples (all new NSAID users, new NSAID users with a diagnosis of OA or RA, and the matched cohort), with COX-2 GPA rates of 22%, 21%, and 20%, and nonselective NSAID GPA rates of 15%, 15%, and 18%, respectively. Re-estimation of the cost-effectiveness model increased the cost per year of life saved for COX-2s from $18,614 to more than $100,000. Contrary to COX-2 cost-effectiveness model assumptions, the rate of GPA use is positive and marginally higher among COX-2 users than among nonselective NSAID users. These findings call into question the use of expert opinion in estimating practice pattern model inputs prior to a product's use in clinical practice. A re-evaluation of COX-2 cost-effectiveness models is warranted.

  4. Modelling human resource requirements for the nuclear industry in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Roelofs, Ferry [Nuclear Research and Consultancy Group (NRG) (Netherlands); Flore, Massimo; Estorff, Ulrik von [Joint Research Center (JRC) (Netherlands)

    2017-11-15

    The European Human Resource Observatory for Nuclear (EHRO-N) provides the European Commission with essential data related to supply and demand for nuclear experts in the EU-28 and the enlargement and integration countries based on bottom-up information from the nuclear industry. The objective is to assess how the supply of experts for the nuclear industry responds to the needs for the same experts for present and future nuclear projects in the region. Complementary to the bottom-up approach taken by the EHRO-N team at JRC, a top-down modelling approach has been taken in a collaboration with NRG in the Netherlands. This top-down modelling approach focuses on the human resource requirements for operation, construction, decommissioning, and efforts for long term operation of nuclear power plants. This paper describes the top-down methodology, the model input, the main assumptions, and the results of the analyses.

  5. Meta-requirements that Model Change

    OpenAIRE

    Gouri Prakash

    2010-01-01

    One of the common problems encountered in software engineering is addressing and responding to the changing nature of requirements. While several approaches have been devised to address this issue, ranging from instilling resistance to changing requirements in order to mitigate impact to project schedules, to developing an agile mindset towards requirements, the approach discussed in this paper is one of conceptualizing the delta in requirement and modeling it, in order t...

  6. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests! [Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones] Explaining different arrival times: suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics. Intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source. Delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect. Special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong. This, too, would cause photon velocities to be energy-dependent. Delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect. If we now turn this problem around, then by measuring the arrival time delay between photons of different energies from various astrophysical sources (the further away, the better) we can provide constraints on these
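
    For the Lorentz-invariance-violation case, the leading-order (linear) time delay usually quoted is Δt ≈ (ΔE/E_QG)·(D/c). A toy calculation with hypothetical numbers, ignoring cosmological corrections, gives a feel for the scale:

```python
# Toy estimate of a linear Lorentz-invariance-violation time delay,
# dt ≈ (dE / E_QG) * (D / c); cosmological corrections are ignored.
E_QG = 1.22e19          # quantum-gravity scale, taken here as the Planck energy (GeV)
dE = 10.0               # energy difference between the two photons (GeV)
D_Mpc = 1000.0          # distance to the source (Mpc), hypothetical
c = 2.998e8             # speed of light (m/s)
Mpc = 3.086e22          # metres per megaparsec

dt = (dE / E_QG) * (D_Mpc * Mpc / c)
print(f"arrival-time delay ≈ {dt * 1e3:.1f} ms")   # tens of milliseconds
```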

  7. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling; if a certified list of tools is one day established, any tool that does not meet the essential criteria would be excluded. Recommended requirements are those more advanced requirements that may be met by tools offering a superior product or only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a high level of agreement, enabling the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support by end users. This list could also guide regulators in order to identify requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Linear regression and the normality assumption.

    Science.gov (United States)

    Schmidt, Amand F; Finan, Chris

    2017-12-16

    Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings, such transformations are often unnecessary and, worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage, i.e., the number of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10) violations of this normality assumption often do not noticeably impact results. Contrary to this, assumptions on the parametric model, the absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and, worse, may bias estimates due to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
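
    The simulation logic described above can be sketched directly: draw strongly skewed errors, fit ordinary least squares, and count how often the 95% confidence interval covers the true slope. All settings below are hypothetical, not the authors' simulation design.

```python
# Sketch of the coverage simulation: skewed (exponential) errors, large n,
# fraction of 95% CIs that contain the true slope. Settings are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
beta_true, n, reps, hits = 0.5, 1000, 2000, 0

for _ in range(reps):
    x = rng.normal(size=n)
    y = beta_true * x + rng.exponential(1.0, size=n) - 1.0   # skewed, mean-zero errors
    X = np.column_stack([np.ones(n), x])
    coef, rss = np.linalg.lstsq(X, y, rcond=None)[:2]
    se = np.sqrt(rss[0] / (n - 2) * np.linalg.inv(X.T @ X)[1, 1])
    hits += abs(coef[1] - beta_true) < 1.96 * se

print(f"coverage = {hits / reps:.3f}")   # close to 0.95 despite non-normality
```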

  9. On Assumptions in Development of a Mathematical Model of Thermo-gravitational Convection in the Large Volume Process Tanks Taking into Account Fermentation

    Directory of Open Access Journals (Sweden)

    P. M. Shkapov

    2015-01-01

    Full Text Available The paper provides a mathematical model of thermo-gravity convection in a large volume vertical cylinder. The heat is removed from the product via the cooling jacket at the top of the cylinder. We suppose that a laminar fluid motion takes place. The model is based on the Navier-Stokes equation, the equation of heat transfer through the wall, and the heat transfer equation. A peculiarity of the process in large volume tanks is the spatial distribution of the physical parameters, which was taken into account when constructing the model. The model corresponds to a process of beer wort fermentation in cylindrical-conical tanks (CCT). The CCT volume is divided into three zones, and model equations were obtained for each zone. The first zone has an annular cross-section and is bounded in height by the cooling jacket. In this zone the heat flow from the cooling jacket to the product is dominant. The model equation of the first zone describes the process of heat transfer through the wall and is presented by a linear inhomogeneous partial differential equation that is solved analytically. For the description of the second and third zones a number of engineering assumptions were made. The fluid was considered Newtonian, viscous and incompressible. Convective motion is considered in the Boussinesq approximation. The effect of viscous dissipation is not considered. The topology of the fluid motion is similar to cylindrical Poiseuille flow. The second zone model consists of the Navier-Stokes equations in cylindrical coordinates, introduced in a simplified form, and the heat equation in the liquid layer. The volume that is occupied by an upward convective flow pertains to the third zone. Convective flows do not mix and do not exchange heat. At the start of the process the medium has the same temperature and a zero initial velocity in the whole volume, which allows us to specify the initial conditions for the process. The paper shows the
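
    In the Boussinesq approximation mentioned above, the governing equations take roughly the following schematic form (notation assumed here, not transcribed from the paper): kinematic viscosity ν, thermal diffusivity α, thermal expansion coefficient β, and reference values ρ₀ and T₀.

```latex
\begin{align}
  \nabla\cdot\mathbf{u} &= 0,\\
  \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\mathbf{u}
    &= -\frac{1}{\rho_0}\,\nabla p + \nu\,\nabla^{2}\mathbf{u}
       + g\,\beta\,(T - T_0)\,\hat{\mathbf{z}},\\
  \frac{\partial T}{\partial t} + \mathbf{u}\cdot\nabla T &= \alpha\,\nabla^{2} T .
\end{align}
```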

  10. Effect of Selected Modeling Assumptions on Subsurface Radionuclide Transport Projections for the Potential Environmental Management Disposal Facility at Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    Painter, Scott L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Environmental Sciences Division

    2016-06-28

    The Department of Energy’s Office of Environmental Management recently revised a Remedial Investigation/Feasibility Study (RI/FS) that included an analysis of subsurface radionuclide transport at a potential new Environmental Management Disposal Facility (EMDF) in East Bear Creek Valley near Oak Ridge, Tennessee. The effects of three simplifying assumptions used in the RI/FS analyses are investigated using the same subsurface pathway conceptualization but with more flexible modeling tools. Neglect of vadose zone dispersion was found to be conservative or non-conservative, depending on the retarded travel time and the half-life. For a given equilibrium distribution coefficient, a relatively narrow range of half-life was identified for which neglect of vadose zone dispersion is non-conservative and radionuclide discharge into surface water is non-negligible. However, there are two additional conservative simplifications in the reference case that compensate for the non-conservative effect of neglecting vadose zone dispersion: the use of a steady infiltration rate and vadose zone velocity, and the way equilibrium sorption is used to represent transport in the fractured material of the saturated aquifer. With more realistic representations of all three processes, the RI/FS reference case was found to either provide a reasonably good approximation to the peak concentration or was significantly conservative (pessimistic) for all parameter combinations considered.

  11. A fuzzy model for exploiting customer requirements

    Directory of Open Access Journals (Sweden)

    Zahra Javadirad

    2016-09-01

    Full Text Available Nowadays, quality function deployment (QFD) is one of the total quality management tools, in which customers' views and requirements are captured and, using various techniques, production requirements and operations are improved. The QFD department, after identification and analysis of the competitors, collects customer feedback in order to meet customers' demands for the products relative to the competitors. In this study, a comprehensive model for assessing the importance of the customer requirements in the products or services of an organization is proposed. The proposed study uses linguistic variables, as a more comprehensive approach, to increase the precision of the expressed evaluations. The importance of these requirements specifies the strengths and weaknesses of the organization in meeting the requirements relative to competitors. The results of these experiments show that the proposed method performs better than the other methods.
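
    Linguistic ratings of the kind the abstract describes are commonly encoded as triangular fuzzy numbers, aggregated across customers, and then defuzzified to a crisp importance weight. A minimal sketch follows; the scale, names and centroid defuzzification are all assumed for illustration, not taken from the paper.

```python
# Minimal sketch of rating customer requirements with linguistic variables
# encoded as triangular fuzzy numbers (scale and defuzzification assumed).
import numpy as np

# Linguistic term -> triangular fuzzy number (low, mode, high)
SCALE = {
    "very low":  (0.0, 0.0, 0.25),
    "low":       (0.0, 0.25, 0.5),
    "medium":    (0.25, 0.5, 0.75),
    "high":      (0.5, 0.75, 1.0),
    "very high": (0.75, 1.0, 1.0),
}

def fuzzy_importance(ratings):
    """Average the customers' fuzzy ratings, then defuzzify by the centroid."""
    tri = np.mean([SCALE[r] for r in ratings], axis=0)
    return tri.sum() / 3.0          # centroid of a triangular fuzzy number

# Hypothetical requirement rated by four customers
print(f"{fuzzy_importance(['high', 'very high', 'medium', 'high']):.3f}")
```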

  12. How Does Temperature Impact Leaf Size and Shape in Four Woody Dicot Species? Testing the Assumptions of Leaf Physiognomy-Climate Models

    Science.gov (United States)

    McKee, M.; Royer, D. L.

    2017-12-01

    The physiognomy (size and shape) of fossilized leaves has been used to reconstruct the mean annual temperature of ancient environments. Colder temperatures often select for larger and more abundant leaf teeth—serrated edges on leaf margins—as well as a greater degree of leaf dissection. However, to be able to accurately predict paleotemperature from the morphology of fossilized leaves, leaves must be able to react quickly and in a predictable manner to changes in temperature. We examined the extent to which temperature affects leaf morphology in four tree species: Carpinus caroliniana, Acer negundo, Ilex opaca, and Ostrya virginiana. Saplings of these species were grown in two growth cabinets under contrasting temperatures (17 and 25 °C). Compared to the cool treatment, in the warm treatment Carpinus caroliniana leaves had significantly fewer leaf teeth and a lower ratio of total number of leaf teeth to internal perimeter; and Acer negundo leaves had a significantly lower feret diameter ratio (a measure of leaf dissection). In addition, a two-way ANOVA tested the influence of temperature and species on leaf physiognomy. This analysis revealed that all plants, regardless of species, tended to develop more highly dissected leaves with more leaf teeth in the cool treatment. Because the cabinets maintained equivalent moisture, humidity, and CO2 concentration between the two treatments, these results demonstrate that these species could rapidly adapt to changes in temperature. However, not all of the species reacted identically to temperature changes. For example, Acer negundo, Carpinus caroliniana, and Ostrya virginiana all had a higher number of total teeth in the cool treatment compared to the warm treatment, but the opposite was true for Ilex opaca. Our work questions a fundamental assumption common to all models predicting paleotemperature from the physiognomy of fossilized leaves: a given climate will inevitably select for the same leaf physiognomy

  13. Modeling requirements for in situ vitrification

    International Nuclear Information System (INIS)

    MacKinnon, R.J.; Mecham, D.C.; Hagrman, D.L.; Johnson, R.W.; Murray, P.E.; Slater, C.E.; Marwil, E.S.; Weaver, R.A.; Argyle, M.D.

    1991-11-01

    This document outlines the requirements for the model being developed at the INEL which will provide analytical support for the ISV technology assessment program. The model includes representations of the electric potential field, thermal transport with melting, gas and particulate release, vapor migration, off-gas combustion and process chemistry. The modeling objectives are to (1) help determine the safety of the process by assessing the air and surrounding soil radionuclide and chemical pollution hazards, the nuclear criticality hazard, and the explosion and fire hazards, (2) help determine the suitability of the ISV process for stabilizing the buried wastes involved, and (3) help design laboratory and field tests and interpret results therefrom

  14. Teaching the Pursuit of Assumptions

    Science.gov (United States)

    Gardner, Peter; Johnson, Stephen

    2015-01-01

    Within the school of thought known as Critical Thinking, identifying or finding missing assumptions is viewed as one of the principal thinking skills. Within the new subject in schools and colleges, usually called Critical Thinking, the skill of finding missing assumptions is similarly prominent, as it is in that subject's public examinations. In…

  15. The Modeling of Factors That Influence Coast Guard Manpower Requirements

    Science.gov (United States)

    2014-12-01

    Requirements Determination process. ...allowances. To help clarify the process, Phase II has guiding principles and core assumptions that direct the Phase. Three of the four guiding principles are...analyst is determining for the first time what manpower is required. The second notable guiding principle is “MRD analysts shall identify and

  16. Managerial and Organizational Assumptions in the CMM's

    DEFF Research Database (Denmark)

    Rose, Jeremy; Aaen, Ivan; Nielsen, Peter Axel

    2008-01-01

    thinking about large production and manufacturing organisations (particularly in America) in the late industrial age. Many of the difficulties reported with CMMI can be attributed to basing practice on these assumptions in organisations which have different cultures and management traditions, perhaps...... in different countries operating different economic and social models. Characterizing CMMI in this way opens the door to another question: are there other sets of organisational and management assumptions which would be better suited to other types of organisations operating in other cultural contexts?...

  17. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  18. The Axioms and Special Assumptions

    Science.gov (United States)

    Borchers, Hans-Jürgen; Sen, Rathindra Nath

    For ease of reference, the axioms, the nontriviality assumptions (3.1.10), the definition of a D-set and the special assumptions of Chaps. 5 and 6 are collected together in the following. The verbal explanations that follow the formal definitions a)-f) of (4.2.1) have been omitted. The entries below are numbered as they are in the text. Recall that βC is the subset of the cone C which, in a D-set, is seen to coincide with the boundary of C after the topology is introduced (Sects. 3.2 and 3.2.1).

  19. Classical Causal Models for Bell and Kochen-Specker Inequality Violations Require Fine-Tuning

    Directory of Open Access Journals (Sweden)

    Eric G. Cavalcanti

    2018-04-01

    Full Text Available Nonlocality and contextuality are at the root of conceptual puzzles in quantum mechanics, and they are key resources for quantum advantage in information-processing tasks. Bell nonlocality is best understood as the incompatibility between quantum correlations and the classical theory of causality, applied to relativistic causal structure. Contextuality, on the other hand, is on a more controversial foundation. In this work, I provide a common conceptual ground between nonlocality and contextuality as violations of classical causality. First, I show that Bell inequalities can be derived solely from the assumptions of no signaling and no fine-tuning of the causal model. This removes two extra assumptions from a recent result from Wood and Spekkens and, remarkably, does not require any assumption related to independence of measurement settings—unlike all other derivations of Bell inequalities. I then introduce a formalism to represent contextuality scenarios within causal models and show that all classical causal models for violations of a Kochen-Specker inequality require fine-tuning. Thus, the quantum violation of classical causality goes beyond the case of spacelike-separated systems and already manifests in scenarios involving single systems.
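
    The Bell inequalities discussed here can be checked numerically: for the quantum singlet state, E(a, b) = −cos(a − b), and the CHSH combination exceeds the bound of 2 that any classical causal model without fine-tuning must respect. A short sketch:

```python
# Quantum singlet correlations violate the CHSH bound |S| <= 2 obeyed by
# classical (no-fine-tuning) causal models; optimal settings shown below.
import numpy as np

def E(a: float, b: float) -> float:
    """Singlet-state correlation for measurement angles a and b."""
    return -np.cos(a - b)

a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # 2*sqrt(2) ≈ 2.828 > 2, the classical bound
```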

  20. Challenged assumptions and invisible effects

    DEFF Research Database (Denmark)

    Wimmelmann, Camilla Lawaetz; Vitus, Kathrine; Jervelund, Signe Smith

    2017-01-01

    of two complete intervention courses and an analysis of the official intervention documents. Findings – This case study exemplifies how the basic normative assumptions behind an immigrant-oriented intervention and the intrinsic power relations therein may be challenged and negotiated by the participants...

  1. Portfolios: Assumptions, Tensions, and Possibilities.

    Science.gov (United States)

    Tierney, Robert J.; Clark, Caroline; Fenner, Linda; Herter, Roberta J.; Simpson, Carolyn Staunton; Wiser, Bert

    1998-01-01

    Presents a discussion between two educators of the history, assumptions, tensions, and possibilities surrounding the use of portfolios in multiple classroom contexts. Includes illustrative commentaries that offer alternative perspectives from a range of other educators with differing backgrounds and interests in portfolios. (RS)

  2. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  3. Integrated modelling requires mass collaboration (Invited)

    Science.gov (United States)

    Moore, R. V.

    2009-12-01

    The need for sustainable solutions to the world’s problems is self evident; the challenge is to anticipate where, in the environment, economy or society, the proposed solution will have negative consequences. If we failed to realise that the switch to biofuels would have the seemingly obvious result of reduced food production, how much harder will it be to predict the likely impact of policies whose impacts may be more subtle? It has been clear for a long time that models and data will be important tools for assessing the impact of events and the measures for their mitigation. They are an effective way of encapsulating knowledge of a process and using it for prediction. However, most models represent a single or small group of processes. The sustainability challenges that face us now require not just the prediction of a single process but the prediction of how many interacting processes will respond in given circumstances. These processes will not be confined to a single discipline but will often straddle many. For example, the question, “What will be the impact on river water quality of the medical plans for managing a ‘flu pandemic and could they cause a further health hazard?” spans medical planning, the absorption of drugs by the body, the spread of disease, the hydraulic and chemical processes in sewers and sewage treatment works and river water quality. This question nicely reflects the present state of the art. We have models of the processes and standards, such as the Open Modelling Interface (the OpenMI), allow them to be linked together and to datasets. We can therefore answer the question but with the important proviso that we thought to ask it. The next and greater challenge is to deal with the open question, “What are the implications of the medical plans for managing a ‘flu pandemic?”. This implies a system that can make connections that may well not have occurred to us and then evaluate their probable impact. The final touch will be to

  4. 40 CFR 264.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... FACILITIES Financial Requirements § 264.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure, post-closure care, or...

  5. 40 CFR 265.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ..., STORAGE, AND DISPOSAL FACILITIES Financial Requirements § 265.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the...

  6. 40 CFR 267.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... STANDARDIZED PERMIT Financial Requirements § 267.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure care or liability...

  7. On testing the missing at random assumption

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    Most approaches to learning from incomplete data are based on the assumption that unobserved values are missing at random (mar). While the mar assumption, as such, is not testable, it can become testable in the context of other distributional assumptions, e.g. the naive Bayes assumption...

  8. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements, and to deal with the combination of such information from multiple stakeholders.

  9. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

    Quartel, Dick; Engelsman, W.; Jonkers, Henk; van Sinderen, Marten J.

    2009-01-01

    Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for

  10. PFP issues/assumptions development and management planning guide

    International Nuclear Information System (INIS)

    SINCLAIR, J.C.

    1999-01-01

    The PFP Issues/Assumptions Development and Management Planning Guide presents the strategy and process used for the identification, allocation, and maintenance of an Issues/Assumptions Management List for the Plutonium Finishing Plant (PFP) integrated project baseline. Revisions to this document will include, as attachments, the most recent version of the Issues/Assumptions Management List, both open and current issues/assumptions (Appendix A), and closed or historical issues/assumptions (Appendix B). This document is intended to be a Project-owned management tool. As such, this document will periodically require revisions resulting from improvements of the information, processes, and techniques as now described. Revisions that suggest improved processes will only require PFP management approval.

  11. Assumptions for the Annual Energy Outlook 1992

    International Nuclear Information System (INIS)

    1992-01-01

    This report serves as an auxiliary document to the Energy Information Administration (EIA) publication Annual Energy Outlook 1992 (AEO) (DOE/EIA-0383(92)), released in January 1992. The AEO forecasts were developed for five alternative cases and consist of energy supply, consumption, and price projections by major fuel and end-use sector, which are published at a national level of aggregation. The purpose of this report is to present important quantitative assumptions, including world oil prices and macroeconomic growth, underlying the AEO forecasts. The report has been prepared in response to external requests, as well as analyst requirements for background information on the AEO and studies based on the AEO forecasts.

  12. Performance of two formal tests based on martingales residuals to check the proportional hazard assumption and the functional form of the prognostic factors in flexible parametric excess hazard models.

    Science.gov (United States)

    Danieli, Coraline; Bossard, Nadine; Roche, Laurent; Belot, Aurelien; Uhry, Zoe; Charvat, Hadrien; Remontet, Laurent

    2017-07-01

    Net survival, the one that would be observed if the disease under study were the only cause of death, is an important, useful, and increasingly used indicator in public health, especially in population-based studies. Estimates of net survival and of the effects of prognostic factors can be obtained by excess hazard regression modeling. Whereas various diagnostic tools were developed for overall survival analysis, few methods are available to check the assumptions of excess hazard models. We propose here two formal tests to check the proportional hazard assumption and the validity of the functional form of the covariate effects in the context of flexible parametric excess hazard modeling. These tests were adapted from martingale residual-based tests for parametric modeling of overall survival to allow adding to the model a necessary element for net survival analysis: the population mortality hazard. We studied the size and the power of these tests through an extensive simulation study based on complex but realistic data. The new tests showed sizes close to the nominal values and satisfactory powers. The power of the proportionality test was similar or greater than that of other tests already available in the field of net survival. We illustrate the use of these tests with real data from French cancer registries. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
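
    Schematically, an excess hazard model and the martingale residuals on which tests of this kind are built can be written as follows (notation assumed here, not transcribed from the paper): λ_pop is the known population mortality hazard, N_i the counting process, and Y_i the at-risk indicator for subject i.

```latex
\begin{align}
  \lambda_{\mathrm{obs}}(t \mid \mathbf{x}) &=
      \lambda_{\mathrm{pop}}(t \mid \mathbf{z})
      + \lambda_{\mathrm{exc}}(t \mid \mathbf{x}),\\
  \widehat{M}_i(t) &= N_i(t) - \int_0^{t} Y_i(s)\,
      \widehat{\lambda}_{\mathrm{obs}}(s \mid \mathbf{x}_i)\,\mathrm{d}s .
\end{align}
```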

  13. REQUIREMENTS PARTICLE NETWORKS: AN APPROACH TO FORMAL SOFTWARE FUNCTIONAL REQUIREMENTS MODELLING

    OpenAIRE

    Wiwat Vatanawood; Wanchai Rivepiboon

    2001-01-01

    In this paper, an approach to software functional requirements modelling using requirements particle networks is presented. In our approach, a set of requirements particles is defined as an essential tool to construct a visual model of the software functional requirements specification during the software analysis phase, and the relevant formal specification is systematically generated without requiring experience in writing formal specifications. A number of algorithms are presented to perform these for...

  14. Building traceable Event-B models from requirements

    OpenAIRE

    Alkhammash, Eman; Butler, Michael; Fathabadi, Asieh Salehi; Cîrstea, Corina

    2015-01-01

    Bridging the gap between informal requirements and formal specifications is a key challenge in systems engineering. Constructing appropriate abstractions in formal models requires skill, and managing the complexity of the relationships between requirements and formal models can be difficult. In this paper we present an approach that aims to address the twin challenges of finding appropriate abstractions and managing traceability between requirements and models. Our approach is based o...

  15. Cognitive ageing on latent constructs for visual processing capacity: A novel Structural Equation Modelling framework with causal assumptions based on A Theory of Visual Attention

    Directory of Open Access Journals (Sweden)

    Simon eNielsen

    2015-01-01

    Full Text Available We examined the effects of normal ageing on visual cognition in a sample of 112 healthy adults aged 60-75. A test battery was designed to capture high-level measures of visual working memory and low-level measures of visuospatial attention and memory. To answer questions of how cognitive ageing affects specific aspects of visual processing capacity, we used confirmatory factor analyses in Structural Equation Modelling (SEM; Model 2), informed by functional structures that were modelled with path analyses in SEM (Model 1). The results show that ageing effects were selective to measures of visual processing speed compared to visual short-term memory (VSTM) capacity (Model 2). These results are consistent with some studies reporting selective ageing effects on processing speed, and inconsistent with other studies reporting ageing effects on both processing speed and VSTM capacity. In the discussion we argue that this discrepancy may be mediated by differences in age ranges and variables of demography. The study demonstrates that SEM is a sensitive method to detect cognitive ageing effects even within a narrow age range, and a useful approach to structure the relationships between measured variables and the cognitive functional foundation they supposedly represent.

  16. Climate change scenarios in Mexico from models results under the assumption of a doubling in the atmospheric CO{sub 2}

    Energy Technology Data Exchange (ETDEWEB)

    Mendoza, V.M.; Villanueva, E.E.; Garduno, R.; Adem, J. [Centro de Ciencias de la Atmosfera, Mexico (Mexico)

    1995-12-31

    General circulation models (GCMs) and energy balance models (EBMs) are the best way to simulate the complex large-scale dynamic and thermodynamic processes in the atmosphere. These models have been used to estimate the global warming due to an increase of atmospheric CO2. In Japan, Ohta and coworkers have developed a physical model based on the conservation of thermal energy applied to ponded shallow water, to compute the change in the water temperature, using the atmospheric warming and the precipitation due to the increase in the atmospheric CO2 computed by the GISS-GCM. In this work, a method similar to Ohta's is used for computing the change in ground temperature, soil moisture, evaporation, runoff and dryness index in eleven hydrological zones, using in this case the surface air temperature and precipitation due to CO2 doubling, computed by the GFDLR30-GCM and the version of the Adem thermodynamic climate model (CTM-EBM) which contains the three feedbacks (cryosphere, clouds and water vapor) and does not include water vapor in the CO2 atmospheric spectral band (12-19 μm)

  17. The theory of reasoned action as a model of marijuana use: tests of implicit assumptions and applicability to high-risk young women.

    Science.gov (United States)

    Morrison, Diane M; Golder, Seana; Keller, Thomas E; Gillmore, Mary Rogers

    2002-09-01

    The theory of reasoned action (TRA) is used to model decisions about substance use among young mothers who became premaritally pregnant at age 17 or younger. The results of structural equation modeling to test the TRA indicated that most relationships specified by the model were significant and in the predicted direction. Attitude was a stronger predictor of intention than norm, but both were significantly related to intention, and intention was related to actual marijuana use 6 months later. Outcome beliefs were bidimensional, and positive outcome beliefs, but not negative beliefs, were significantly related to attitude. Prior marijuana use was only partially mediated by the TRA variables; it also was directly related to intentions to use marijuana and to subsequent use.

  18. Mixing Formal and Informal Model Elements for Tracing Requirements

    DEFF Research Database (Denmark)

    Jastram, Michael; Hallerstede, Stefan; Ladenberger, Lukas

    2011-01-01

    Tracing between informal requirements and formal models is challenging. A method for such tracing should permit dealing efficiently with changes to both the requirements and the model. A particular challenge is posed by the persisting interplay of formal and informal elements. In this paper, we...... describe an incremental approach to requirements validation and systems modelling. Formal modelling facilitates a high degree of automation: it serves for validation and traceability. The foundation for our approach is requirements that are structured according to the WRSPM reference model. We provide...... a system for traceability with a state-based formal method that supports refinement. We do not require all specification elements to be modelled formally and support incremental incorporation of new specification elements into the formal model. Refinement is used to deal with larger amounts of requirements...

  19. Forecasting Renewable Energy Consumption under Zero Assumptions

    Directory of Open Access Journals (Sweden)

    Jie Ma

    2018-02-01

    Renewable energy, as an environmentally friendly and sustainable source of energy, is key to realizing the nationally determined contributions of the United States (US) to the December 2015 Paris agreement. Policymakers in the US rely on energy forecasts to draft and implement cost-minimizing, efficient and realistic renewable and sustainable energy policies, but the inaccuracies in past projections are considerably high. The inaccuracies and inconsistencies in forecasts are due to the numerous factors considered, massive assumptions, and modeling flaws in the underlying models. Here, we propose and apply a machine learning forecasting algorithm devoid of massive independent variables and assumptions to model and forecast renewable energy consumption (REC) in the US. We employ the forecasting technique to make projections on REC from biomass (REC-BMs) and hydroelectric (HE-EC) sources for the 2009-2016 period. We find that, relative to reference case projections in the Energy Information Administration's Annual Energy Outlook 2008, projections based on our proposed technique present an enormous improvement of up to ~138.26-fold on REC-BMs and ~24.67-fold on HE-EC, and that applying our technique saves the US ~2692.62 petajoules (PJ) on HE-EC and ~9695.09 PJ on REC-BMs over the 8-year forecast period. The achieved high accuracy is also replicable to other regions.
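
    A minimal sketch of how such fold improvements in forecast accuracy can be computed. All figures below are hypothetical placeholders, not EIA data; only the structure of the calculation follows the abstract:

    ```python
    import numpy as np

    actual = np.array([2100.0, 2150.0, 2210.0])     # observed REC in PJ (hypothetical)
    reference = np.array([2350.0, 2420.0, 2500.0])  # reference-case projection (hypothetical)
    proposed = np.array([2102.0, 2148.0, 2212.0])   # machine-learning forecast (hypothetical)

    mae_reference = np.mean(np.abs(reference - actual))
    mae_proposed = np.mean(np.abs(proposed - actual))

    # ratio of mean absolute errors: one way to express an "x-fold" improvement
    print(f"fold improvement: {mae_reference / mae_proposed:.0f}x")
    ```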

  20. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    ... associated with military training exercises. The model combines the business practice of Material Requirements Planning and the commercial spreadsheet software capabilities of Lotus 1-2-3 to calculate the requirements for food, consumable...
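
    The record is truncated, but the gross-to-net logic at the core of Material Requirements Planning is standard. A minimal Python sketch with hypothetical quantities, standing in for the thesis's Lotus 1-2-3 spreadsheet:

    ```python
    def net_requirements(gross_demand, on_hand, scheduled_receipts):
        """Per-period net requirements after drawing down available inventory."""
        available = on_hand
        net = []
        for demand, receipt in zip(gross_demand, scheduled_receipts):
            available += receipt
            net.append(max(0, demand - available))
            available = max(0, available - demand)
        return net

    # e.g. rations needed per training day versus stock on hand (hypothetical numbers)
    print(net_requirements(gross_demand=[120, 120, 200],
                           on_hand=150,
                           scheduled_receipts=[0, 100, 0]))  # -> [0, 0, 190]
    ```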

  1. Extending enterprise architecture modelling with business goals and requirements

    NARCIS (Netherlands)

    Engelsman, W.; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten J.

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling

  2. Effects of atmospheric turbulence on the single-photon receiving efficiency and the performance of quantum channel with the modified approximate elliptic-beam model assumption

    Science.gov (United States)

    Wang, Xiao-yang; Zhao, Nan; Chen, Nan; Zhu, Chang-hua; Pei, Chang-xing

    2018-01-01

    In the free-space quantum channel, with the introduction and implementation of satellite-ground link transmission, research on single-photon transmission has attracted great interest. We propose a single-photon receiving model and analyze the influence of atmospheric turbulence on single-photon transmission. We obtain the relationship between single-photon receiving efficiency and atmospheric turbulence, and analyze the influence of atmospheric turbulence on quantum channel performance via single-photon counting. Finally, we present a simulation analysis. Simulation results show that as the strength of the atmospheric fluctuations increases, the counting distribution gradually broadens and the utilization of the quantum channel drops. Furthermore, the key generation rate and transmission distance decrease sharply in the case of strong turbulence.
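
    A rough illustration of the reported broadening of the counting distribution. A simple log-normal fading channel stands in for the paper's modified elliptic-beam model (an assumption), and the mean count and sigma values are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    mean_counts = 50.0                      # mean detected counts per gate (hypothetical)

    for sigma in (0.1, 0.3, 0.6):           # increasing turbulence strength
        # log-normal channel transmittance, normalised to unit mean
        t = rng.lognormal(mean=-sigma**2 / 2, sigma=sigma, size=100_000)
        counts = rng.poisson(mean_counts * t)
        print(f"sigma={sigma}: mean={counts.mean():.1f}, std={counts.std():.1f}")
    # the standard deviation grows with sigma: the counting distribution broadens
    ```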

  3. Formal Requirements Modeling for Simulation-Based Verification

    OpenAIRE

    Otter, Martin; Thuy, Nguyen; Bouskela, Daniel; Buffoni, Lena; Elmqvist, Hilding; Fritzson, Peter; Garro, Alfredo; Jardin, Audrey; Olsson, Hans; Payelleville, Maxime; Schamai, Wladimir; Thomas, Eric; Tundis, Andrea

    2015-01-01

    This paper describes a proposal on how to model formal requirements in Modelica for simulation-based verification. The approach is implemented in the open source Modelica_Requirements library. It requires extensions to the Modelica language that have been prototypically implemented in the Dymola and OpenModelica software. The design of the library is based on the FOrmal Requirement Modeling Language (FORM-L) defined by EDF, and on industrial use cases from EDF and Dassault Aviation. It uses...

  4. Extending enterprise architecture modelling with business goals and requirements

    Science.gov (United States)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  5. Estimates of volume and magma input in crustal magmatic systems from zircon geochronology: the effect of modelling assumptions and system variables

    Directory of Open Access Journals (Sweden)

    Luca Caricchi

    2016-04-01

    Magma fluxes in the Earth's crust play an important role in regulating the relationship between the frequency and magnitude of volcanic eruptions, the chemical evolution of magmatic systems, and the distribution of geothermal energy and mineral resources on our planet. Therefore, quantifying magma productivity and the rate of magma transfer within the crust can provide valuable insights to characterise the long-term behaviour of volcanic systems and to unveil the link between the physical and chemical evolution of magmatic systems and their potential to generate resources. We performed thermal modelling to compute the temperature evolution of crustal magmatic intrusions with different final volumes assembled over a variety of timescales (i.e., at different magma fluxes). Using these results, we calculated synthetic populations of zircon ages assuming the number of zircons crystallising in a given time period is directly proportional to the volume of magma at temperatures within the zircon crystallisation range. The statistical analysis of the calculated populations of zircon ages shows that the mode, median and standard deviation of the populations vary coherently as a function of the rate of magma injection and the final volume of the crustal intrusions. Therefore, the statistical properties of the population of zircon ages can add useful constraints to quantify the rate of magma injection and the final volume of magmatic intrusions. Here, we explore the effect of different ranges of zircon saturation temperature, intrusion geometry, and wall rock temperature on the calculated distributions of zircon ages. Additionally, we determine the effect of undersampling on the variability of the mode, median and standard deviation of calculated populations of zircon ages, to estimate the minimum number of zircon analyses necessary to obtain meaningful estimates of magma flux and final intrusion volume.
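
    A sketch of the age-population construction under the stated assumption that the number of zircons crystallising in a time step is proportional to the volume of magma within the crystallisation window. The cooling curve and sample size are hypothetical, not output of the paper's thermal model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 200_000.0, 401)   # years after first injection (hypothetical)
    # hypothetical volume of magma inside the zircon saturation window through time
    volume_in_window = np.exp(-((t - 80_000.0) / 50_000.0) ** 2)

    # zircons crystallising in each step in proportion to that volume
    weights = volume_in_window / volume_in_window.sum()
    ages = rng.choice(t, size=300, p=weights)  # synthetic population of 300 dated zircons

    print(f"mode ~{t[np.argmax(weights)]:.0f} yr, median {np.median(ages):.0f} yr, "
          f"std {ages.std():.0f} yr")
    ```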

  6. Detecting and accounting for violations of the constancy assumption in non-inferiority clinical trials.

    Science.gov (United States)

    Koopmeiners, Joseph S; Hobbs, Brian P

    2018-05-01

    Randomized, placebo-controlled clinical trials are the gold standard for evaluating a novel therapeutic agent. In some instances, it may not be considered ethical or desirable to complete a placebo-controlled clinical trial and, instead, the placebo is replaced by an active comparator with the objective of showing either superiority or non-inferiority to the active comparator. In a non-inferiority trial, the experimental treatment is considered non-inferior if it retains a pre-specified proportion of the effect of the active comparator as represented by the non-inferiority margin. A key assumption required for valid inference in the non-inferiority setting is the constancy assumption, which requires that the effect of the active comparator in the non-inferiority trial is consistent with the effect that was observed in previous trials. It has been shown that violations of the constancy assumption can result in a dramatic increase in the rate of incorrectly concluding non-inferiority in the presence of ineffective or even harmful treatment. In this paper, we illustrate how Bayesian hierarchical modeling can be used to facilitate multi-source smoothing of the data from the current trial with the data from historical studies, enabling direct probabilistic evaluation of the constancy assumption. We then show how this result can be used to adapt the non-inferiority margin when the constancy assumption is violated and present simulation results illustrating that our method controls the type-I error rate when the constancy assumption is violated, while retaining the power of the standard approach when the constancy assumption holds. We illustrate our adaptive procedure using a non-inferiority trial of raltegravir, an antiretroviral drug for the treatment of HIV.
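
    A simplified illustration of multi-source smoothing: a precision-weighted normal model with a fixed between-trial variance stands in for the authors' full Bayesian hierarchical model, and all effect estimates are hypothetical:

    ```python
    import numpy as np

    historical = np.array([0.42, 0.38, 0.45])    # past comparator effects (hypothetical)
    historical_se = np.array([0.05, 0.06, 0.05])
    current, current_se = 0.20, 0.07             # current trial: inconsistent with history
    tau2 = 0.02 ** 2                             # assumed between-trial variance

    effects = np.append(historical, current)
    precision = 1.0 / (np.append(historical_se, current_se) ** 2 + tau2)
    pooled = np.sum(precision * effects) / np.sum(precision)

    # a large gap between the current and pooled estimates flags a constancy
    # violation, prompting adaptation of the non-inferiority margin
    print(f"pooled={pooled:.3f}, current={current:.3f}, gap={pooled - current:.3f}")
    ```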

  7. Using cognitive modeling for requirements engineering in anesthesiology

    NARCIS (Netherlands)

    Pott, C; le Feber, J

    2005-01-01

    Cognitive modeling is a complexity-reducing method for describing significant cognitive processes under a specified research focus. Here, a cognitive process model for decision making in anesthesiology is presented and applied in requirements engineering. Three decision making situations of

  8. Modeling Domain Variability in Requirements Engineering with Contexts

    Science.gov (United States)

    Lapouchnian, Alexei; Mylopoulos, John

    Various characteristics of the problem domain define the context in which the system is to operate and thus impact heavily on its requirements. However, most requirements specifications do not consider contextual properties and few modeling notations explicitly specify how domain variability affects the requirements. In this paper, we propose an approach for using contexts to model domain variability in goal models. We discuss the modeling of contexts, the specification of their effects on system goals, and the analysis of goal models with contextual variability. The approach is illustrated with a case study.

  9. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    D. E. Shropshire; W. H. West

    2005-01-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  10. Capabilities and requirements for modelling radionuclide transport in the geosphere

    International Nuclear Information System (INIS)

    Paige, R.W.; Piper, D.

    1989-02-01

    This report gives an overview of geosphere flow and transport models suitable for use by the Department of the Environment in the performance assessment of radioactive waste disposal sites. An outline methodology for geosphere modelling is proposed, consisting of a number of different types of model. A brief description of each of the component models is given, indicating the purpose of the model, the processes being modelled and the methodologies adopted. Areas requiring development are noted. (author)

  11. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Even...... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach....

  12. Requirements Traceability and Transformation Conformance in Model-Driven Development

    NARCIS (Netherlands)

    Andrade Almeida, João; van Eck, Pascal; Iacob, Maria Eugenia

    2006-01-01

    The variety of design artefacts (models) produced in a model-driven design process results in an intricate relationship between requirements and the various models. This paper proposes a methodological framework that simplifies management of this relationship. This framework is a basis for tracing

  13. Assumptions and Policy Decisions for Vital Area Identification Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myungsu; Bae, Yeon-Kyoung; Lee, Youngseung [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    U.S. Nuclear Regulatory Commission and IAEA guidance indicate that certain assumptions and policy questions should be addressed in a Vital Area Identification (VAI) process. Korea Hydro and Nuclear Power conducted a VAI based on the current Design Basis Threat and engineering judgement to identify APR1400 vital areas. Some of the assumptions were inherited from Probabilistic Safety Assessment (PSA), as the sabotage logic model was based on the PSA logic tree and equipment location data. This paper illustrates some important assumptions and policy decisions for the APR1400 VAI analysis. Assumptions and policy decisions can be overlooked at the beginning stage of VAI; however, they should be carefully reviewed and discussed among engineers, plant operators, and regulators. Through the APR1400 VAI process, some of the policy concerns and assumptions for analysis were applied based on document research and expert panel discussions. It was also found that there are more assumptions to define in further studies for other types of nuclear power plants. One of these assumptions is mission time, which was inherited from PSA.

  14. Technological assumptions for biogas purification.

    Science.gov (United States)

    Makareviciene, Violeta; Sendzikiene, Egle

    2015-01-01

    Biogas can be used in the engines of transport vehicles and blended into natural gas networks, but this requires the removal of carbon dioxide, hydrogen sulphide, and moisture. Biogas purification process flow diagrams have been developed for a process enabling the use of a dolomite suspension, as well as of solutions obtained by filtration of the suspension, to obtain biogas free of hydrogen sulphide and with a carbon dioxide content that does not exceed 2%. The cost of biogas purification was evaluated on the basis of data on biogas production capacity and biogas production cost obtained from local water treatment facilities. It has been found that, with the use of the dolomite suspension, the cost of biogas purification is approximately six times lower than in the case of a chemical sorbent such as monoethanolamine. The results showed that travelling costs using biogas purified with the dolomite suspension are nearly 1.5 times lower than travelling costs using gasoline, and slightly lower than travelling costs using mineral diesel fuel.

  15. On the Necessary and Sufficient Assumptions for UC Computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Nielsen, Jesper Buus; Orlandi, Claudio

    2010-01-01

    We study the necessary and sufficient assumptions for universally composable (UC) computation, both in terms of setup and computational assumptions. We look at the common reference string model, the uniform random string model and the key-registration authority model (KRA), and provide new results for all of them. Perhaps most interestingly we show that: •  For even the minimal meaningful KRA, where we only assume that the secret key is a value which is hard to compute from the public key, one can UC securely compute any poly-time functionality if there exists a passive secure oblivious-transfer protocol for the stand-alone model. Since a KRA where the secret keys can be computed from the public keys is useless, and some setup assumption is needed for UC secure computation, this establishes the best we could hope for in the KRA model: any non-trivial KRA is sufficient for UC computation. •  We show...

  16. How Symmetrical Assumptions Advance Strategic Management Research

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Hallberg, Niklas

    2014-01-01

    We develop the case for symmetrical assumptions in strategic management theory. Assumptional symmetry obtains when assumptions made about certain actors and their interactions in one of the application domains of a theory are also made about this set of actors and their interactions in other...... application domains of the theory. We argue that assumptional symmetry leads to theoretical advancement by promoting the development of theory with greater falsifiability and stronger ontological grounding. Thus, strategic management theory may be advanced by systematically searching for asymmetrical...

  17. Peacebuilding: assumptions, practices and critiques

    Directory of Open Access Journals (Sweden)

    Cravo, Teresa Almeida

    2017-05-01

    Peacebuilding has become a guiding principle of international intervention in the periphery since its inclusion in the United Nations' Agenda for Peace in 1992. The aim of creating the conditions for a self-sustaining peace in order to prevent a return to armed conflict is, however, far from easy or consensual. The conception of liberal peace has proved particularly limited, and inevitably controversial, and the reality of war-torn societies far more complex than anticipated by the international actors that today undertake activities to promote peace in post-conflict contexts. With a trajectory full of contested successes and some glaring failures, the current model has been the target of harsh criticism and widespread scepticism. This article critically examines the theoretical background and practicalities of peacebuilding, exploring its ambition as well as the weaknesses of the paradigm adopted by the international community since the 1990s.

  18. Assumptions for the Annual Energy Outlook 1993

    International Nuclear Information System (INIS)

    1993-01-01

    This report is an auxiliary document to the Annual Energy Outlook 1993 (AEO) (DOE/EIA-0383(93)). It presents a detailed discussion of the assumptions underlying the forecasts in the AEO. The energy modeling system is an economic equilibrium system, with component demand modules representing end-use energy consumption by major end-use sector. Another set of modules represents petroleum, natural gas, coal, and electricity supply patterns and pricing. A separate module generates annual forecasts of important macroeconomic and industrial output variables. Interactions among these components of energy markets generate projections of prices and quantities for which energy supply equals energy demand. This equilibrium modeling system is referred to as the Intermediate Future Forecasting System (IFFS). The supply models in IFFS for oil, coal, natural gas, and electricity determine supply and price for each fuel depending upon consumption levels, while the demand models determine consumption depending upon end-use price. IFFS solves for market equilibrium for each fuel by balancing supply and demand to produce an energy balance in each forecast year.

  19. Commonsense Psychology and the Functional Requirements of Cognitive Models

    National Research Council Canada - National Science Library

    Gordon, Andrew S

    2005-01-01

    In this paper we argue that previous models of cognitive abilities (e.g. memory, analogy) have been constructed to satisfy functional requirements of implicit commonsense psychological theories held by researchers and nonresearchers alike...

  20. Requirements engineering for cross-sectional information chain models.

    Science.gov (United States)

    Hübner, U; Cruel, E; Gök, M; Garthaus, M; Zimansky, M; Remmers, H; Rienhoff, O

    2012-01-01

    Despite the wealth of literature on requirements engineering, little is known about engineering very generic, innovative and emerging requirements, such as those for cross-sectional information chains. The IKM health project aims at building information chain reference models for the care of patients with chronic wounds, cancer-related pain and back pain. Our question therefore was how to appropriately capture information and process requirements that are both generally applicable and practically useful. To this end, we started with recommendations from clinical guidelines and put them up for discussion in Delphi surveys and expert interviews. Despite the heterogeneity we encountered in all three methods, it was possible to obtain requirements suitable for building reference models. We evaluated three modelling languages and then chose to write the models in UML (class and activity diagrams). On the basis of the current project results, the pros and cons of our approach are discussed.

  1. Spreadsheet Decision Support Model for Training Exercise Material Requirements Planning

    National Research Council Canada - National Science Library

    Tringali, Arthur

    1997-01-01

    This thesis focuses on developing a spreadsheet decision support model that can be used by combat engineer platoon and company commanders in determining the material requirements and estimated costs...

  2. Hybriding CMMI and requirement engineering maturity and capability models

    OpenAIRE

    Buglione, Luigi; Hauck, Jean Carlo R.; Gresse von Wangenheim, Christiane; Mc Caffery, Fergal

    2012-01-01

    Estimation represents one of the most critical processes for any project, and it is highly dependent on the quality of requirements elicitation and management. Therefore, the management of requirements should be prioritised in any process improvement program, because the less precise the requirements gathering, analysis and sizing, the greater the error in terms of time and cost estimation. Maturity and Capability Models (MCM) represent a good tool for assessing the status of ...

  3. Generic skills requirements (KSA model) towards future mechanical ...

    African Journals Online (AJOL)

    Generic skills are a basic requirement that engineers need to master in all areas of engineering. This study was conducted throughout peninsular Malaysia, involving small, medium and heavy industries, using the KSA Model. The objectives of this study are to study the level of requirement of Generic Skills that need to be ...

  4. Wrong assumptions in the financial crisis

    NARCIS (Netherlands)

    Aalbers, M.B.

    2009-01-01

    Purpose - The purpose of this paper is to show how some of the assumptions about the current financial crisis are wrong because they misunderstand what takes place in the mortgage market. Design/methodology/approach - The paper discusses four wrong assumptions: one related to regulation, one to

  5. Irrigation Requirement Estimation Using Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Bounoua, Lahouari; Imhoff, Marc L.; Franks, Shannon

    2010-01-01

    We explore an inverse biophysical modeling process forced by satellite and climatological data to quantify irrigation requirements in semi-arid agricultural areas. We constrain the carbon and water cycles modeled under both equilibrium (balance between vegetation and climate) and non-equilibrium (water added through irrigation) conditions. We postulate that the degree to which irrigated dry lands depart from equilibrium climate conditions is related to the amount of irrigation. The amount of water required over and above precipitation is considered the irrigation requirement. For July, results show that spray irrigation supplied an additional 1.3 mm of water per occurrence, with occurrences every 24.6 hours; in contrast, drip irrigation required only 0.6 mm every 45.6 hours, or 46% of the per-occurrence amount simulated for spray irrigation. The modeled estimates account for 87% of the total reported irrigation water use where soil salinity is not important, and 66% in saline lands.
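
    The per-occurrence figures convert to equivalent daily rates as follows; only the 1.3 mm/24.6 h and 0.6 mm/45.6 h values come from the abstract:

    ```python
    spray_mm, spray_hours = 1.3, 24.6  # mm per occurrence, hours between occurrences
    drip_mm, drip_hours = 0.6, 45.6

    spray_daily = spray_mm * 24.0 / spray_hours  # ~1.27 mm/day
    drip_daily = drip_mm * 24.0 / drip_hours     # ~0.32 mm/day

    print(f"spray: {spray_daily:.2f} mm/day, drip: {drip_daily:.2f} mm/day")
    print(f"drip applies {drip_mm / spray_mm:.0%} of the spray depth per occurrence")
    ```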

  6. Business Process Simulation: Requirements for Business and Resource Models

    OpenAIRE

    Audrius Rima; Olegas Vasilecas

    2015-01-01

    The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  7. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  8. Relaxing the zero-sum assumption in neutral biodiversity theory

    NARCIS (Netherlands)

    Haegeman, Bart; Etienne, Rampal S.

    2008-01-01

    The zero-sum assumption is one of the ingredients of the standard neutral model of biodiversity by Hubbell. It states that the community is saturated all the time, which in this model means that the total number of individuals in the community is constant over time, and therefore introduces a

  9. Building a Narrative Based Requirements Engineering Mediation Model

    Science.gov (United States)

    Ma, Nan; Hall, Tracy; Barker, Trevor

    This paper presents a narrative-based Requirements Engineering (RE) mediation model to help RE practitioners to effectively identify, define, and resolve conflicts of interest, goals, and requirements. Within the SPI community, there is a common belief that social, human, and organizational issues significantly impact the effectiveness of software process improvement in general and the requirements engineering process in particular. Conflicts among different stakeholders are an important human and social issue that needs more research attention in the SPI and RE communities. By drawing on the conflict resolution literature and the IS literature, we argue that conflict resolution in RE is a mediated process, in which a requirements engineer can act as a mediator among different stakeholders. To address the socio-psychological aspects of conflict in RE and SPI, Winslade and Monk's (2000) narrative mediation model is introduced, justified, and translated into the context of RE.

  10. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    Science.gov (United States)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

    The Space Launch System, NASA's new large launch vehicle for long range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculation of interface loads and natural frequencies and mode shapes for guidance, navigation, and control (GNC). Because of the program and schedule constraints, a single modal test of the SLS will be performed while bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be performed to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from modal test. However, the question still remains as to whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique to develop a quantitative set of validation metrics that is based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ and UP and validation in the literature, but very little on propagating the uncertainties from requirements, so most validation metrics are "rules-of-thumb;" this research seeks to come up with more reason-based metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so therefore that same uncertainty can be used in propagating the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the usage of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation
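
    A toy version of the limit-state idea on a single spring-mass mode: assumed dispersions in stiffness and mass are propagated to the natural frequency, which is then checked against a hypothetical ±5% requirement band (this is an illustration, not the SLS model):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    k = rng.normal(1.0e6, 0.05e6, n)  # stiffness in N/m, 5% dispersion (assumed)
    m = rng.normal(250.0, 10.0, n)    # mass in kg, 4% dispersion (assumed)

    f = np.sqrt(k / m) / (2.0 * np.pi)                  # natural frequency in Hz
    f_nominal = np.sqrt(1.0e6 / 250.0) / (2.0 * np.pi)

    # limit state: g < 0 means the +/-5% frequency requirement is violated
    g = 0.05 * f_nominal - np.abs(f - f_nominal)
    print(f"P(requirement violated) = {np.mean(g < 0):.3f}")
    ```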

  11. Risk patterns in drug safety study using relative times by accelerated failure time models when proportional hazards assumption is questionable: an illustrative case study of cancer risk of patients on glucose-lowering therapies.

    Science.gov (United States)

    Ng, Edmond S-W; Klungel, Olaf H; Groenwold, Rolf H H; van Staa, Tjeerd-Pieter

    2015-01-01

    Observational drug safety studies may be susceptible to confounding or protopathic bias. This bias may cause a spurious relationship between drug exposure and adverse side effect when none exists and may lead to unwarranted safety alerts. The spurious relationship may manifest itself through substantially different risk levels between exposure groups at the start of follow-up when exposure is deemed too short to have any plausible biological effect of the drug. The restrictive proportional hazards assumption with its arbitrary choice of baseline hazard function renders the commonly used Cox proportional hazards model of limited use for revealing such potential bias. We demonstrate a fully parametric approach using accelerated failure time models with an illustrative safety study of glucose-lowering therapies and show that its results are comparable against other methods that allow time-varying exposure effects. Our approach includes a wide variety of models that are based on the flexible generalized gamma distribution and allows direct comparisons of estimated hazard functions following different exposure-specific distributions of survival times. This approach lends itself to two alternative metrics, namely relative times and difference in times to event, allowing physicians more ways to communicate patient's prognosis without invoking the concept of risks, which some may find hard to grasp. In our illustrative case study, substantial differences in cancer risks at drug initiation followed by a gradual reduction towards null were found. This evidence is compatible with the presence of protopathic bias, in which undiagnosed symptoms of cancer lead to switches in diabetes medication. Copyright © 2015 John Wiley & Sons, Ltd.
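
    A sketch of the relative-time metric using exposure-specific Weibull fits, the Weibull being a special case of the generalized gamma family used in the paper. The data are simulated and uncensored; a real drug-safety analysis must handle censoring and time-varying exposure:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    t_control = rng.weibull(1.5, 500) * 10.0  # years to event, unexposed (simulated)
    t_exposed = rng.weibull(1.5, 500) * 7.0   # shorter times: events are accelerated

    shape_c, _, scale_c = stats.weibull_min.fit(t_control, floc=0)
    shape_e, _, scale_e = stats.weibull_min.fit(t_exposed, floc=0)

    # relative time: ratio of median survival times, an AFT-style effect measure
    median_c = scale_c * np.log(2.0) ** (1.0 / shape_c)
    median_e = scale_e * np.log(2.0) ** (1.0 / shape_e)
    print(f"relative time (exposed/control) = {median_e / median_c:.2f}")  # ~0.70
    ```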

  12. Modelling Security Requirements Through Extending Scrum Agile Development Framework

    OpenAIRE

    Alotaibi, Minahi

    2016-01-01

    Security is today considered as a basic foundation in software development and therefore, the modelling and implementation of security requirements is an essential part of the production of secure software systems. Information technology organisations are moving towards agile development methods in order to satisfy customers' changing requirements in light of accelerated evolution and time restrictions with their competitors in software production. Security engineering is considered difficult...

  13. Evaluating The Markov Assumption For Web Usage Mining

    DEFF Research Database (Denmark)

    Jespersen, S.; Pedersen, Torben Bach; Thorhauge, J.

    2003-01-01

    Web usage mining concerns the discovery of common browsing patterns, i.e., pages requested in sequence, from web logs. To cope with the enormous amounts of data, several aggregated structures based on statistical models of web surfing have appeared, e.g., the Hypertext Probabilistic Grammar (HPG) model [borges99data]. These techniques typically rely on the Markov assumption with history depth n, i.e., it is assumed that the next requested page depends only on the last n pages visited. This is not always valid, i.e., false browsing patterns may be discovered. However, to our...... knowledge there has been no systematic study of the validity of the Markov assumption with respect to web usage mining and the resulting quality of the mined browsing patterns. In this paper we systematically investigate the quality of browsing patterns mined from structures based on the Markov assumption. Formal...
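
    A minimal check of the depth-1 Markov assumption on toy sessions: if the assumption held, the next-page distribution conditioned on the last page would match the distributions conditioned on the last two pages:

    ```python
    from collections import Counter, defaultdict

    sessions = [["A", "B", "C"], ["A", "B", "C"], ["X", "B", "D"], ["X", "B", "D"]]

    after_one = defaultdict(Counter)  # next page given the last page
    after_two = defaultdict(Counter)  # next page given the last two pages
    for s in sessions:
        for i in range(2, len(s)):
            after_one[s[i - 1]][s[i]] += 1
            after_two[(s[i - 2], s[i - 1])][s[i]] += 1

    print(dict(after_one["B"]))          # {'C': 2, 'D': 2} -- looks like 50/50
    print(dict(after_two[("A", "B")]))   # {'C': 2} -- deeper history matters,
    print(dict(after_two[("X", "B")]))   # {'D': 2} -- so depth 1 is too shallow here
    ```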

  14. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for the need for, and a basis for developing requirements for, the next generation of HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed; due to its complexity and input requirements, it can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and the computer implementation.

  15. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    Science.gov (United States)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  16. Critical Business Requirements Model and Metrics for Intranet ROI

    OpenAIRE

    Luqi; Jacoby, Grant A.

    2005-01-01

    Journal of Electronic Commerce Research, Vol. 6, No. 1, pp. 1-30. This research provides the first theoretical model, the Intranet Efficiency and Effectiveness Model (IEEM), to measure intranet overall value contributions based on a corporation’s critical business requirements by applying a balanced baseline of metrics and conversion ratios linked to key business processes of knowledge workers, IT managers and business decision makers -- in effect, closing the gap of understanding...

  17. Price/Earnings Ratio Model through Dividend Yield and Required Yield Above Expected Inflation

    Directory of Open Access Journals (Sweden)

    Emil Mihalina

    2010-07-01

    Price/earnings ratio is the most popular and most widespread evaluation model used to assess relative capital asset value on financial markets. In functional terms, company earnings in the very long term can be described with high significance. Empirically, long-term statistics show that the demanded (required) yield on capital markets has a certain regularity. Thus, investors first require a yield above the stable inflation rate, and then a dividend yield and a capital increase caused by the growth of earnings that influences the price, under the assumption that the P/E ratio is stable. By combining the Gordon model for current dividend value with the model of market capitalization of earnings (the price/earnings ratio), and bearing in mind the influence of the general price level on company earnings, it is possible to adjust the price/earnings ratio by deriving a function of the required yield on capital markets, measured by a market index, through dividend yield and the inflation rate above the stable inflation rate increased by profit growth. The S&P 500 index, for example, has in the last 100 years grown by exactly the inflation rate above the stable inflation rate increased by profit growth. The comparison of two series of price/earnings ratios, a modelled one and an average 7-year ratio, shows a notable correlation in the movement of the two series, with a three-year deviation. Therefore, it could be hypothesized that three years of the expected inflation level, dividend yield and profit growth rate of the market index are discounted in current market prices. The conclusion is that, at the present time, the relationship between the adjusted average price/earnings ratio and its effect on the market index on the one hand, and the modelled price/earnings ratio on the other, can clearly show the expected dynamics and course in the following period.
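
    The textbook Gordon-growth relation underlying the modelled ratio, with hypothetical payout, growth and required-yield inputs (the paper's actual derivation through the inflation rate above the stable level is more elaborate):

    ```python
    payout = 0.5      # dividend payout ratio (hypothetical)
    growth = 0.045    # long-run earnings growth (hypothetical)
    required = 0.095  # required yield: stable inflation plus the premia in the abstract

    pe_modelled = payout * (1.0 + growth) / (required - growth)
    print(f"modelled P/E = {pe_modelled:.1f}")  # ~10.5 with these inputs
    ```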

  18. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  19. Models of protein and amino acid requirements for cattle

    Directory of Open Access Journals (Sweden)

    Luis Orlindo Tedeschi

    2015-03-01

    Protein supply and requirements by ruminants have been studied for more than a century. These studies led to the accumulation of a large body of scientific information about the digestion and metabolism of protein by ruminants, as well as the characterization of dietary protein in order to maximize animal performance. During the 1980s and 1990s, when computers became more accessible and powerful, scientists began to conceptualize and develop mathematical nutrition models, and to program them into computers to assist with ration balancing and formulation for domesticated ruminants, specifically dairy and beef cattle. The most commonly known nutrition models developed during this period were those of the National Research Council (NRC) in the United States, the Agricultural Research Council (ARC) in the United Kingdom, the Institut National de la Recherche Agronomique (INRA) in France, and the Commonwealth Scientific and Industrial Research Organization (CSIRO) in Australia. Others were derivative works from these models with different degrees of modification in the supply or requirement calculations, and in the modeling nature (e.g., static or dynamic, mechanistic, or deterministic). Circa the 1990s, most models adopted the metabolizable protein (MP) system over the crude protein (CP) and digestible CP systems to estimate the supply of MP, and the factorial system to calculate the MP required by the animal. The MP system included two portions of protein (i.e., the rumen-undegraded dietary CP (RUP) and the contributions of microbial CP (MCP)) as the main sources of MP for the animal. Some models would explicitly account for the impact of dry matter intake (DMI) on the MP required for maintenance (MPm; e.g., the Cornell Net Carbohydrate and Protein System (CNCPS) and the Dutch system (DVE/OEB)), while others would simply account for scurf, urinary, metabolic fecal, and endogenous contributions independently of DMI. All models included milk yield and its components in estimating MP required for lactation

  20. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)

    We model examples like reliable bit transmission and sequence transmission protocols in this framework and discuss how assumption-commitment structure facilitates compositional design of such protocols. We prove a decomposition theorem which states that every protocol specified globally as a finite state system can ...

  1. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

    Classical Respondent-Driven Sampling (RDS) estimators are based on a Markov process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for the bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.
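
    A simplified simulation of the estimator's stability: individuals are drawn with probability proportional to degree (omitting referral chains, a substantial simplification of RDS), and with- versus without-replacement draws are compared at several sampling fractions:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 2_000
    degree = rng.integers(1, 11, N).astype(float)
    trait = (rng.random(N) < 0.2 + 0.03 * degree).astype(float)  # trait tied to degree

    def estimate(idx):
        w = 1.0 / degree[idx]  # inverse-degree weights, as in RDS-style estimators
        return np.sum(w * trait[idx]) / np.sum(w)

    p = degree / degree.sum()
    for frac in (0.1, 0.2, 0.4):
        n = int(frac * N)
        with_rep = np.mean([estimate(rng.choice(N, n, replace=True, p=p))
                            for _ in range(200)])
        without = np.mean([estimate(rng.choice(N, n, replace=False, p=p))
                           for _ in range(200)])
        print(f"fraction {frac:.0%}: with={with_rep:.3f}, without={without:.3f}")
    ```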

  2. Atmospheric disturbance modelling requirements for flying qualities applications

    Science.gov (United States)

    Moorhouse, D. J.

    1978-01-01

    Flying qualities are defined as those airplane characteristics which govern the ease or precision with which the pilot can accomplish the mission. Some atmospheric disturbance modelling requirements for aircraft flying qualities applications are reviewed. It is concluded that some simplifications are justified in identifying the primary influence on aircraft response and pilot control. It is recommended that a universal environmental model be developed, which could form the reference for different applications. This model should include the latest information on winds, turbulence, gusts, visibility, icing and precipitation. A chosen model would be kept by a national agency and updated regularly by feedback from users. A user manual is believed to be an essential part of such a model.

  3. Information Models, Data Requirements, and Agile Data Curation

    Science.gov (United States)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during system development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  4. Regression assumptions in clinical psychology research practice-a systematic review of common misconceptions.

    Science.gov (United States)

    Ernst, Anja F; Albers, Casper J

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when it is inappropriate, and to employing alternative procedures with less statistical power when it is unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for a heightened awareness of, and increased transparency in, the reporting of statistical assumption checking.
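
    A short demonstration of the misconception: the normality assumption concerns the regression errors, not the variables themselves (simulated data):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    x = rng.exponential(2.0, 500)                  # heavily skewed predictor
    y = 1.0 + 0.8 * x + rng.normal(0.0, 0.5, 500)  # normal errors around the line

    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (intercept + slope * x)

    # y itself fails a normality test, but the residuals -- the actual
    # assumption -- pass comfortably
    print(f"Shapiro-Wilk p, y itself:   {stats.shapiro(y).pvalue:.4f}")
    print(f"Shapiro-Wilk p, residuals:  {stats.shapiro(residuals).pvalue:.4f}")
    ```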

  5. Extracurricular Business Planning Competitions: Challenging the Assumptions

    Science.gov (United States)

    Watson, Kayleigh; McGowan, Pauric; Smith, Paul

    2014-01-01

    Business planning competitions [BPCs] are a commonly offered yet under-examined extracurricular activity. Given the extent of sceptical comment about business planning, this paper offers what the authors believe is a much-needed critical discussion of the assumptions that underpin the provision of such competitions. In doing so it is suggested…

  6. Culturally Biased Assumptions in Counseling Psychology

    Science.gov (United States)

    Pedersen, Paul B.

    2003-01-01

    Eight clusters of culturally biased assumptions are identified for further discussion from Leong and Ponterotto's (2003) article. The presence of cultural bias demonstrates that cultural bias is so robust and pervasive that it permeates the profession of counseling psychology, even including those articles that effectively attack cultural bias…

  7. Mexican-American Cultural Assumptions and Implications.

    Science.gov (United States)

    Carranza, E. Lou

    The search for presuppositions of a people's thought is not new. Octavio Paz and Samuel Ramos have both attempted to describe the assumptions underlying the Mexican character. Paz described Mexicans as private, defensive, and stoic, characteristics taken to the extreme in the "pachuco." Ramos, on the other hand, described Mexicans as…

  8. Assumptions of Multiple Regression: Correcting Two Misconceptions

    Science.gov (United States)

    Williams, Matt N.; Gomez Grajales, Carlos Alberto; Kurkiewicz, Dason

    2013-01-01

    In 2002, an article entitled "Four assumptions of multiple regression that researchers should always test" by Osborne and Waters was published in "PARE." This article has gone on to be viewed more than 275,000 times (as of August 2013), and it is one of the first results displayed in a Google search for "regression…

  9. Categorical Judgment Scaling with Ordinal Assumptions.

    Science.gov (United States)

    Hofacker, C F

    1984-01-01

    One of the most common activities of psychologists and other researchers is to construct Likert scales and then proceed to analyze them as if the numbers constituted an equal interval scale. There are several alternatives to this procedure (Thurstone & Chave, 1929; Muthen, 1983) that make normality assumptions but which do not assume that the answer categories as used by subjects constitute an equal interval scale. In this paper a new alternative is proposed that uses additive conjoint measurement. It is assumed that subjects can report their attitudes towards stimuli in the appropriate rank order. Neither within-subject nor between-subject distributional assumptions are made. Nevertheless, interval level stimulus values, as well as response category boundaries, are extracted by the procedure. This approach is applied to three sets of attitude data. In these three cases, the equal interval assumption is clearly wrong. Despite this, arithmetic means seem to closely reflect group attitudes towards the stimuli. In one data set, the normality assumption of Thurstone and Chave (1929) and Muthen (1983) is supported, and in the two others it is supported with reservations.

  10. Critically Challenging Some Assumptions in HRD

    Science.gov (United States)

    O'Donnell, David; McGuire, David; Cross, Christine

    2006-01-01

    This paper sets out to critically challenge five interrelated assumptions prominent in the (human resource development) HRD literature. These relate to: the exploitation of labour in enhancing shareholder value; the view that employees are co-contributors to and co-recipients of HRD benefits; the distinction between HRD and human resource…

  11. Causal Mediation Analysis: Warning! Assumptions Ahead

    Science.gov (United States)

    Keele, Luke

    2015-01-01

    In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…

  12. Shattering world assumptions: A prospective view of the impact of adverse events on world assumptions.

    Science.gov (United States)

    Schuler, Eric R; Boals, Adriel

    2016-05-01

    Shattered Assumptions theory (Janoff-Bulman, 1992) posits that experiencing a traumatic event has the potential to diminish the degree of optimism in the assumptions of the world (assumptive world), which could lead to the development of posttraumatic stress disorder. Prior research assessed the assumptive world with a measure that was recently reported to have poor psychometric properties (Kaler et al., 2008). The current study had 3 aims: (a) to assess the psychometric properties of a recently developed measure of the assumptive world, (b) to retrospectively examine how prior adverse events affected the optimism of the assumptive world, and (c) to measure the impact of an intervening adverse event. An 8-week prospective design with a college sample (N = 882 at Time 1 and N = 511 at Time 2) was used to assess the study objectives. We split adverse events into those that were objectively or subjectively traumatic in nature. The new measure exhibited adequate psychometric properties. The report of a prior objective or subjective trauma at Time 1 was related to a less optimistic assumptive world. Furthermore, participants who experienced an intervening objectively traumatic event evidenced a decrease in optimistic views of the world compared with those who did not experience an intervening adverse event. We found support for Shattered Assumptions theory retrospectively and prospectively using a reliable measure of the assumptive world. We discuss future assessments of the measure of the assumptive world and clinical implications to help rebuild the assumptive world with current therapies. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  13. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to developing high quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from a UML requirements analysis model, so that we can confirm the validity of input/output data for each page and of page transitions in the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve traceability from the validated requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  14. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represents an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models…

  15. The 'revealed preferences' theory: Assumptions and conjectures

    International Nuclear Information System (INIS)

    Green, C.H.

    1983-01-01

    As a kind of intuitive psychology, approaches based on the 'revealed preferences' theory for determining acceptable risks are a useful method for the generation of hypotheses. Given that reliability engineering develops faster than methods for determining reliability targets, the revealed-preferences approach is a necessary preliminary aid. Some of the assumptions on which the 'revealed preferences' theory is based are identified and analysed, and afterwards compared with experimentally obtained results. (orig./DG) [de

  16. How to Handle Assumptions in Synthesis

    Directory of Open Access Journals (Sweden)

    Roderick Bloem

    2014-07-01

    Full Text Available The increased interest in reactive synthesis over the last decade has led to many improved solutions but also to many new questions. In this paper, we discuss the question of how to deal with assumptions on environment behavior. We present four goals that we think should be met and review several different possibilities that have been proposed. We argue that each of them falls short in at least one aspect.

  17. Towards New Probabilistic Assumptions in Business Intelligence

    OpenAIRE

    Schumann Andrew; Szelc Andrzej

    2015-01-01

    One of the main assumptions of mathematical tools in science is represented by the idea of measurability and additivity of reality. For discovering the physical universe additive measures such as mass, force, energy, temperature, etc. are used. Economics and conventional business intelligence try to continue this empiricist tradition and in statistical and econometric tools they appeal only to the measurable aspects of reality. However, a lot of important variables of economic systems cannot ...

  18. Model Penentuan Nilai Target Functional Requirement Berbasis Utilitas [A Utility-Based Model for Determining Functional Requirement Target Values]

    Directory of Open Access Journals (Sweden)

    Cucuk Nur Rosyidi

    2012-01-01

    Full Text Available In a product design and development process, a designer faces the problem of deciding functional requirement (FR) target values. That decision is made under risk, since it is taken in the early design phase using incomplete information. A utility function can be used to reflect the decision maker's attitude towards risk when making such decisions. In this research, we develop a utility-based model to determine FR target values using a quadratic utility function and information from Quality Function Deployment (QFD). A pencil design is used as a numerical example, with a quadratic utility function for each FR. The model can be applied to balance customer and designer interests in determining FR target values.
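
    As a hedged illustration of the approach described above (the utility coefficients and QFD weights below are hypothetical, not the paper's values), one can pick each FR target as the maximizer of a weighted quadratic utility:

    ```python
    # Sketch: choose FR targets maximizing a quadratic utility weighted by an
    # assumed QFD importance. All numbers are illustrative placeholders.
    import numpy as np

    def quadratic_utility(x, a, b, c):
        # U(x) = a + b*x + c*x**2 with c < 0, so utility is concave in the FR value
        return a + b * x + c * x ** 2

    # One utility function per FR (e.g. hardness and line darkness for a pencil)
    frs = {
        "hardness": dict(a=0.0, b=1.2, c=-0.15, lo=0.0, hi=10.0, weight=0.6),
        "darkness": dict(a=0.0, b=0.9, c=-0.10, lo=0.0, hi=10.0, weight=0.4),
    }

    targets = {}
    for name, p in frs.items():
        grid = np.linspace(p["lo"], p["hi"], 1001)      # candidate target values
        util = p["weight"] * quadratic_utility(grid, p["a"], p["b"], p["c"])
        targets[name] = grid[np.argmax(util)]           # utility-maximizing target

    print(targets)  # analytic optimum of a+bx+cx^2 is x* = -b/(2c): 4.0 and 4.5
    ```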

  19. Leakage-Resilient Circuits without Computational Assumptions

    DEFF Research Database (Denmark)

    Dziembowski, Stefan; Faust, Sebastian

    2012-01-01

    Physical cryptographic devices inadvertently leak information through numerous side-channels. Such leakage is exploited by so-called side-channel attacks, which often allow for a complete security breach. A recent trend in cryptography is to propose formal models that incorporate leakage into the model and to construct schemes that are provably secure within them. We design a general compiler that transforms any cryptographic scheme, e.g., a block-cipher, into a functionally equivalent scheme which is resilient to any continual leakage provided that the following three requirements are satisfied: (i) in each observation the leakage is bounded, (ii) different parts of the computation leak independently, and (iii) the randomness that is used for certain operations comes from a simple (non-uniform) distribution. In contrast to earlier work on leakage resilient circuit compilers, which relied…

  20. Requirements traceability in model-driven development: Applying model and transformation conformance

    NARCIS (Netherlands)

    Andrade Almeida, João; Iacob, Maria Eugenia; van Eck, Pascal

    The variety of design artifacts (models) produced in a model-driven design process results in an intricate relationship between requirements and the various models. This paper proposes a methodological framework that simplifies management of this relationship, which helps in assessing the quality of

  1. Data-driven smooth tests of the proportional hazards assumption

    Czech Academy of Sciences Publication Activity Database

    Kraus, David

    2007-01-01

    Roč. 13, č. 1 (2007), s. 1-16 ISSN 1380-7870 R&D Projects: GA AV ČR(CZ) IAA101120604; GA ČR(CZ) GD201/05/H007 Institutional research plan: CEZ:AV0Z10750506 Keywords : Cox model * Neyman's smooth test * proportional hazards assumption * Schwarz's selection rule Subject RIV: BA - General Mathematics Impact factor: 0.491, year: 2007
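
    The record above concerns testing the proportional hazards assumption in the Cox model. This is not the paper's data-driven Neyman smooth test, but a common practical check is the Schoenfeld-residual-based test, sketched here assuming the lifelines API:

    ```python
    # Schoenfeld-residual-style check of proportional hazards, assuming the
    # lifelines API (CoxPHFitter, proportional_hazard_test).
    from lifelines import CoxPHFitter
    from lifelines.statistics import proportional_hazard_test
    from lifelines.datasets import load_rossi

    df = load_rossi()                    # recidivism data shipped with lifelines
    cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")

    # Tests, per covariate, whether its effect drifts with (transformed) time;
    # small p-values flag violations of proportional hazards.
    result = proportional_hazard_test(cph, df, time_transform="rank")
    result.print_summary()
    ```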

  2. About tests of the "simplifying" assumption for conditional copulas

    OpenAIRE

    Derumigny, Alexis; Fermanian, Jean-David

    2016-01-01

    We discuss the so-called “simplifying assumption” of conditional copulas in a general framework. We introduce several tests of the latter assumption for non- and semiparametric copula models. Some related test procedures based on conditioning subsets instead of point-wise events are proposed. The limiting distributions of such test statistics under the null are approximated by several bootstrap schemes, most of them being new. We prove the validity of a particular semiparametric bootstrap sch...

  3. A GENERALIZATION OF TRADITIONAL KANO MODEL FOR CUSTOMER REQUIREMENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Renáta Turisová

    2015-07-01

    Full Text Available Purpose: The theory of attractiveness determines the relationship between the technically achieved and the customer-perceived quality of product attributes. The most frequently used approach in the theory of attractiveness is the implementation of Kano's model. Many generalizations of that model exist, taking into consideration various aspects and approaches focused on understanding customer preferences and identifying customer priorities for a product. The aim of this article is to outline another possible generalization of Kano's model. Methodology/Approach: The traditional Kano's model captures the nonlinear relationship between the achieved attributes of quality and customer requirements. The individual attributes of quality are divided into three main categories: must-be, one-dimensional and attractive quality, and into two side categories: indifferent and reverse quality. A well-selling product has to contain the must-be attributes. It should contain as many one-dimensional attributes as possible. If there are also supplementary attractive attributes, the attractiveness of the entire product, from the viewpoint of the customer, rises sharply and nonlinearly, which has a direct positive impact on the potential customer's purchasing decision. In this article, we show that the assignment of individual quality attributes of a product to the mentioned categories depends, among other things, on the life-cycle costs of the product and on its market price. Findings: In practice, we often encounter the placement of products into different price categories: lower, middle and upper class. For a certain type of product the category is either directly declared by the producer (especially in the automotive industry), or is determined by the customer by means of an assessment of available market prices. Different customer expectations can be assigned to each of these product groups.
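
    For readers unfamiliar with the traditional model the article generalizes, here is the standard Kano evaluation table (this is the classic classification, not the article's generalization): an attribute is categorized from a customer's answers to a functional and a dysfunctional question.

    ```python
    # Standard Kano evaluation table. Answers to "How do you feel if the product
    # HAS the attribute?" (functional) and "... if it does NOT?" (dysfunctional):
    # like, must-be, neutral, live-with, dislike.
    ANSWERS = ["like", "must-be", "neutral", "live-with", "dislike"]

    # Rows: functional answer; columns: dysfunctional answer.
    # A = attractive, O = one-dimensional, M = must-be, I = indifferent,
    # R = reverse, Q = questionable.
    KANO_TABLE = [
        ["Q", "A", "A", "A", "O"],   # functional: like
        ["R", "I", "I", "I", "M"],   # functional: must-be
        ["R", "I", "I", "I", "M"],   # functional: neutral
        ["R", "I", "I", "I", "M"],   # functional: live-with
        ["R", "R", "R", "R", "Q"],   # functional: dislike
    ]

    def classify(functional: str, dysfunctional: str) -> str:
        return KANO_TABLE[ANSWERS.index(functional)][ANSWERS.index(dysfunctional)]

    print(classify("like", "dislike"))    # O: one-dimensional quality
    print(classify("like", "neutral"))    # A: attractive quality
    print(classify("neutral", "dislike")) # M: must-be quality
    ```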

  4. Modeling the minimum enzymatic requirements for optimal cellulose conversion

    International Nuclear Information System (INIS)

    Den Haan, R; Van Zyl, W H; Van Zyl, J M; Harms, T M

    2013-01-01

    Hydrolysis of cellulose is achieved by the synergistic action of endoglucanases, exoglucanases and β-glucosidases. Most cellulolytic microorganisms produce a varied array of these enzymes and the relative roles of the components are not easily defined or quantified. In this study we have used partially purified cellulases produced heterologously in the yeast Saccharomyces cerevisiae to increase our understanding of the roles of some of these components. CBH1 (Cel7), CBH2 (Cel6) and EG2 (Cel5) were separately produced in recombinant yeast strains, allowing their isolation free of any contaminating cellulolytic activity. Binary and ternary mixtures of the enzymes at loadings ranging between 3 and 100 mg g⁻¹ Avicel allowed us to illustrate the relative roles of the enzymes and their levels of synergy. A mathematical model was created to simulate the interactions of these enzymes on crystalline cellulose, under both isolated and synergistic conditions. Laboratory results from the various mixtures at a range of loadings of recombinant enzymes allowed refinement of the mathematical model. The model can further be used to predict the optimal synergistic mixes of the enzymes. This information can subsequently be applied to help to determine the minimum protein requirement for complete hydrolysis of cellulose. Such knowledge will be greatly informative for the design of better enzymatic cocktails or processing organisms for the conversion of cellulosic biomass to commodity products. (letter)
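
    A purely illustrative stand-in for the kind of mixture model described (the published model's form and parameters differ; every rate and synergy factor below is hypothetical) might score a mixture as the sum of single-enzyme contributions scaled by pairwise synergy:

    ```python
    # Toy synergy model, NOT the paper's: hydrolysis rate of a binary/ternary
    # cellulase mixture as the sum of single-enzyme contributions scaled by a
    # pairwise synergy factor. All parameters are hypothetical.
    import itertools

    specific_rate = {"CBH1": 0.8, "CBH2": 1.0, "EG2": 1.5}    # assumed rate units
    synergy = {("CBH1", "EG2"): 1.8, ("CBH2", "EG2"): 1.6,    # endo-exo synergy
               ("CBH1", "CBH2"): 1.3}                         # exo-exo synergy

    def mixture_rate(loading_mg_per_g: dict) -> float:
        """Initial hydrolysis rate for an enzyme mixture on Avicel (toy model)."""
        base = sum(specific_rate[e] * m for e, m in loading_mg_per_g.items())
        boost = 1.0
        for e1, e2 in itertools.combinations(sorted(loading_mg_per_g), 2):
            if loading_mg_per_g[e1] > 0 and loading_mg_per_g[e2] > 0:
                boost *= synergy.get((e1, e2), 1.0)
        return base * boost

    print(mixture_rate({"CBH1": 10}))                      # single enzyme
    print(mixture_rate({"CBH1": 5, "EG2": 5}))             # binary, endo-exo synergy
    print(mixture_rate({"CBH1": 4, "CBH2": 3, "EG2": 3}))  # ternary mixture
    ```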

  5. New Assumptions to Guide SETI Research

    Science.gov (United States)

    Colombano, S. P.

    2018-01-01

    The recent Kepler discoveries of Earth-like planets offer the opportunity to focus our attention on detecting signs of life and technology in specific planetary systems, but I feel we need to become more flexible in our assumptions. The reason is that, while it is still reasonable and conservative to assume that life is most likely to have originated in conditions similar to ours, the vast time differences in potential evolutions render the likelihood of "matching" technologies very slim. In light of these challenges I propose a more "aggressive" approach to future SETI exploration in directions that until now have received little consideration.

  6. Explorations in statistics: the assumption of normality.

    Science.gov (United States)

    Curran-Everett, Douglas

    2017-09-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This twelfth installment of Explorations in Statistics explores the assumption of normality, an assumption essential to the meaningful interpretation of a t test. Although the data themselves can be consistent with a normal distribution, they need not be. Instead, it is the theoretical distribution of the sample mean or the theoretical distribution of the difference between sample means that must be roughly normal. The most versatile approach to assess normality is to bootstrap the sample mean, the difference between sample means, or t itself. We can then assess whether the distributions of these bootstrap statistics are consistent with a normal distribution by studying their normal quantile plots. If we suspect that an inference we make from a t test may not be justified-if we suspect that the theoretical distribution of the sample mean or the theoretical distribution of the difference between sample means is not normal-then we can use a permutation method to analyze our data. Copyright © 2017 the American Physiological Society.
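
    A short sketch of the bootstrap approach the abstract describes: resample the data, collect bootstrap sample means, and inspect their normal quantile plot (data and sample size here are illustrative).

    ```python
    # Bootstrap the sample mean and check normality via a Q-Q plot.
    import numpy as np
    from scipy import stats
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    x = rng.exponential(scale=2.0, size=30)       # skewed raw data, n = 30

    boot_means = np.array([rng.choice(x, size=x.size, replace=True).mean()
                           for _ in range(10_000)])

    # If the points fall close to a straight line, the theoretical distribution
    # of the sample mean is roughly normal and a t-test interpretation is
    # defensible even though the raw data are skewed.
    stats.probplot(boot_means, dist="norm", plot=plt)
    plt.title("Normal quantile plot of 10,000 bootstrap sample means")
    plt.show()
    ```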

  7. Research on Computer Aided Innovation Model of Weapon Equipment Requirement Demonstration

    Science.gov (United States)

    Li, Yong; Guo, Qisheng; Wang, Rui; Li, Liang

    Firstly, in order to overcome the shortcomings of using either AD or TRIZ alone, and to solve the problems currently existing in weapon equipment requirement demonstration, the paper constructs a methodology for weapon equipment requirement demonstration combining QFD, AD, TRIZ and FA. Then, we construct a CAI model framework for weapon equipment requirement demonstration, which includes a requirement decomposition model, a requirement mapping model and a requirement plan optimization model. Finally, we construct the computer aided innovation model of weapon equipment requirement demonstration and develop CAI software for equipment requirement demonstration.

  8. A 2nd generation static model of greenhouse energy requirements (horticern) : a comparison with dynamic models

    CERN Document Server

    Jolliet, O; Munday, G L

    1989-01-01

    Optimisation of a greenhouse and its components requires a suitable model permitting precise determination of its energy requirements. Existing static models are simple but lack precision; dynamic models, though more precise, are unsuitable for use over long periods and difficult to handle in practice. A theoretical study and measurements from the CERN trial greenhouse have allowed the development of a new static model named "HORTICERN", precise and easy to use for predicting energy consumption, which takes into account the effects of solar energy, wind and radiative loss to the sky. This paper compares the HORTICERN model with the dynamic models of Bot, Takakura, Van Bavel and Gembloux, and demonstrates that its precision is comparable; differences are on average less than 5%, and it is independent of the type of greenhouse (e.g. single or double glazing, Hortiplus, etc.) and of climate. The HORTICERN method has been developed for PC use and is proving to be a powerful tool for greenhouse optimisation by research work…
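
    In the spirit of such a static model, a minimal heat-balance sketch follows; the U-value, solar gain and sky-loss numbers are hypothetical placeholders, not the published HORTICERN constants:

    ```python
    # Minimal static greenhouse heat balance (illustrative, not HORTICERN itself).
    def annual_heating_energy(ua_w_per_k: float,   # overall loss coefficient U*A [W/K]
                              t_inside_c: float,   # greenhouse set-point [deg C]
                              t_outside_c: float,  # mean outside temperature [deg C]
                              solar_gain_w: float, # mean useful solar gain [W]
                              sky_loss_w: float,   # mean net radiative loss to sky [W]
                              hours: float = 8760.0) -> float:
        """Return heating energy demand in kWh over the given period."""
        load_w = ua_w_per_k * (t_inside_c - t_outside_c) + sky_loss_w - solar_gain_w
        return max(load_w, 0.0) * hours / 1000.0

    # 1000 m2 double-glazed house: U ~ 3 W/(m2 K) -> U*A = 3000 W/K (assumed values)
    print(annual_heating_energy(ua_w_per_k=3000, t_inside_c=18, t_outside_c=9,
                                solar_gain_w=12_000, sky_loss_w=4_000))
    ```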

  9. Challenging the assumptions for thermal sensation scales

    DEFF Research Database (Denmark)

    Schweiker, Marcel; Fuchs, Xaver; Becker, Susanne

    2016-01-01

    Scales are widely used to assess the personal experience of thermal conditions in built environments. Most commonly, thermal sensation is assessed, mainly to determine whether a particular thermal condition is comfortable for individuals. A seven-point thermal sensation scale has been used extensively, which is suitable for describing a one-dimensional relationship between physical parameters of indoor environments and subjective thermal sensation. However, human thermal comfort is not merely a physiological but also a psychological phenomenon. Thus, it should be investigated how scales for its assessment could benefit from a multidimensional conceptualization. The common assumptions related to the usage of thermal sensation scales are challenged, empirically supported by two analyses. These analyses show that the relationship between temperature and subjective thermal sensation is non…

  10. Models and data requirements for human reliability analysis

    International Nuclear Information System (INIS)

    1989-03-01

    It has been widely recognised for many years that the safety of nuclear power generation depends heavily on the human factors related to plant operation. This has been confirmed by the accidents at Three Mile Island and Chernobyl. Both these cases revealed how human actions can defeat engineered safeguards, and the need for special operator training to cover the possibility of unexpected plant conditions. The importance of the human factor also stands out in the analysis of abnormal events and in insights from probabilistic safety assessments (PSAs), which reveal a large proportion of cases having their origin in faulty operator performance. A consultants' meeting, organized jointly by the International Atomic Energy Agency (IAEA) and the International Institute for Applied Systems Analysis (IIASA), was held at IIASA in Laxenburg, Austria, December 7-11, 1987, with the aim of reviewing existing models used in Probabilistic Safety Assessment (PSA) for Human Reliability Analysis (HRA) and of identifying the data required. The report collects both the contributions offered by the members of the Expert Task Force and the findings of the extensive discussions that took place during the meeting. Refs, figs and tabs

  11. Evaluating the Effects of Ankle-Foot Orthosis Mechanical Property Assumptions on Gait Simulation Muscle Force Results.

    Science.gov (United States)

    Hegarty, Amy K; Petrella, Anthony J; Kurz, Max J; Silverman, Anne K

    2017-03-01

    Musculoskeletal modeling and simulation techniques have been used to gain insights into movement disabilities for many populations, such as ambulatory children with cerebral palsy (CP). The individuals who can benefit from these techniques are often limited to those who can walk without assistive devices, due to challenges in accurately modeling these devices. Specifically, many children with CP require the use of ankle-foot orthoses (AFOs) to improve their walking ability, and modeling these devices is important to understand their role in walking mechanics. The purpose of this study was to quantify the effects of AFO mechanical property assumptions, including rotational stiffness, damping, and equilibrium angle of the ankle and subtalar joints, on the estimation of lower-limb muscle forces during stance for children with CP. We analyzed two walking gait cycles for two children with CP while they were wearing their own prescribed AFOs. We generated 1000-trial Monte Carlo simulations for each of the walking gait cycles, resulting in a total of 4000 walking simulations. We found that AFO mechanical property assumptions influenced the force estimates for all the muscles in the model, with the ankle muscles having the largest resulting variability. Muscle forces were most sensitive to assumptions of AFO ankle and subtalar stiffness, which should therefore be measured when possible. Muscle force estimates were less sensitive to estimates of damping and equilibrium angle. When stiffness measurements are not available, limitations on the accuracy of muscle force estimates for all the muscles in the model, especially the ankle muscles, should be acknowledged.
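
    The Monte Carlo idea described above can be sketched as follows: sample the assumed AFO mechanical properties and propagate them through a simple torsional spring-damper model of the device's ankle moment (all parameter ranges here are hypothetical, not the study's):

    ```python
    # Monte Carlo sketch over assumed AFO properties (illustrative ranges only).
    import numpy as np

    rng = np.random.default_rng(42)
    n = 1000                                            # trials per gait cycle

    k      = rng.uniform(0.5, 3.0, n)                   # stiffness [N*m/deg]
    b      = rng.uniform(0.0, 0.05, n)                  # damping [N*m*s/deg]
    theta0 = rng.uniform(-5.0, 5.0, n)                  # equilibrium angle [deg]

    theta, theta_dot = 8.0, 40.0                        # one instant of stance [deg, deg/s]
    afo_moment = k * (theta - theta0) + b * theta_dot   # moment the AFO supplies

    # The larger this spread, the more the assumed AFO properties will perturb
    # the muscle forces estimated by the musculoskeletal simulation.
    print(f"AFO moment: {afo_moment.mean():.1f} +/- {afo_moment.std():.1f} N*m")
    ```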

  12. Halo-Independent Direct Detection Analyses Without Mass Assumptions

    CERN Document Server

    Anderson, Adam J.; Kahn, Yonatan; McCullough, Matthew

    2015-10-06

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the $m_\chi-\sigma_n$ plane. Recently methods which are independent of the DM halo velocity distribution have been developed which present results in the $v_{min}-\tilde{g}$ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from $v_{min}$ to nuclear recoil momentum ($p_R$), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call $\tilde{h}(p_R)$. The entire family of conventional halo-independent $\tilde{g}(v_{min})$ plots for all DM masses are directly found from the single $\tilde{h}(p_R)$ plot through a simple re…

  13. Development of an estimation model for the evaluation of the energy requirement of dilute acid pretreatments of biomass

    Science.gov (United States)

    Mafe, Oluwakemi A.T.; Davies, Scott M.; Hancock, John; Du, Chenyu

    2015-01-01

    This study aims to develop a mathematical model to evaluate the energy required by pretreatment processes used in the production of second generation ethanol. A dilute acid pretreatment process reported by National Renewable Energy Laboratory (NREL) was selected as an example for the model's development. The energy demand of the pretreatment process was evaluated by considering the change of internal energy of the substances, the reaction energy, the heat lost and the work done to/by the system based on a number of simplifying assumptions. Sensitivity analyses were performed on the solid loading rate, temperature, acid concentration and water evaporation rate. The results from the sensitivity analyses established that the solids loading rate had the most significant impact on the energy demand. The model was then verified with data from the NREL benchmark process. Application of this model on other dilute acid pretreatment processes reported in the literature illustrated that although similar sugar yields were reported by several studies, the energy required by the different pretreatments varied significantly. PMID:26109752
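
    A simplified sketch of the energy bookkeeping described above (sensible heat to bring the slurry to reaction temperature plus latent heat for evaporated water); the process numbers are round placeholder values, not the NREL benchmark inputs:

    ```python
    # Toy energy-demand model for dilute acid pretreatment (illustrative values).
    CP_WATER, CP_SOLIDS = 4.18, 1.5    # kJ/(kg K); cp of biomass solids assumed
    H_FG = 2257.0                      # kJ/kg, latent heat of vaporization of water

    def pretreatment_energy_mj(dry_biomass_kg: float,
                               solids_loading: float,   # kg solids / kg slurry
                               t_in_c: float, t_rxn_c: float,
                               evap_fraction: float) -> float:
        water_kg = dry_biomass_kg * (1.0 - solids_loading) / solids_loading
        sensible = (dry_biomass_kg * CP_SOLIDS + water_kg * CP_WATER) * (t_rxn_c - t_in_c)
        latent = evap_fraction * water_kg * H_FG
        return (sensible + latent) / 1000.0              # MJ

    # Higher solids loading -> less water to heat -> markedly lower energy demand,
    # matching the sensitivity result reported in the abstract.
    for loading in (0.10, 0.20, 0.30):
        print(loading, round(pretreatment_energy_mj(1000, loading, 25, 160, 0.05)))
    ```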

  15. Estimating the global prevalence of inadequate zinc intake from national food balance sheets: effects of methodological assumptions.

    Directory of Open Access Journals (Sweden)

    K Ryan Wessells

    Full Text Available The prevalence of inadequate zinc intake in a population can be estimated by comparing the zinc content of the food supply with the population's theoretical requirement for zinc. However, assumptions regarding the nutrient composition of foods, zinc requirements, and zinc absorption may affect prevalence estimates. These analyses were conducted to: (1) evaluate the effect of varying methodological assumptions on country-specific estimates of the prevalence of dietary zinc inadequacy and (2) generate a model considered to provide the best estimates. National food balance data were obtained from the Food and Agriculture Organization of the United Nations. Zinc and phytate contents of these foods were estimated from three nutrient composition databases. Zinc absorption was predicted using a mathematical model (Miller equation). Theoretical mean daily per capita physiological and dietary requirements for zinc were calculated using recommendations from the Food and Nutrition Board of the Institute of Medicine and the International Zinc Nutrition Consultative Group. The estimated global prevalence of inadequate zinc intake varied between 12-66%, depending on which methodological assumptions were applied. However, the country-specific rank order of the estimated prevalence of inadequate intake was conserved across all models (r = 0.57-0.99, P<0.01). A "best-estimate" model, comprised of zinc and phytate data from a composite nutrient database and IZiNCG physiological requirements for absorbed zinc, estimated the global prevalence of inadequate zinc intake to be 17.3%. Given the multiple sources of uncertainty in this method, caution must be taken in the interpretation of the estimated prevalence figures. However, the results of all models indicate that inadequate zinc intake may be fairly common globally. Inferences regarding the relative likelihood of zinc deficiency as a public health problem in different countries can be drawn based on the country…
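
    The sensitivity to requirement assumptions can be illustrated with a simple EAR cut-point calculation (this is a generic sketch, not the study's method; the mean intake, CV and EAR values below are invented):

    ```python
    # EAR cut-point sketch: fraction of the population with usual intake below
    # the estimated average requirement (EAR), under a normality approximation.
    from scipy.stats import norm

    def prevalence_inadequate(mean_intake_mg: float, cv: float, ear_mg: float) -> float:
        sd = cv * mean_intake_mg
        return norm.cdf(ear_mg, loc=mean_intake_mg, scale=sd)

    # The same food-supply estimate against several requirement assumptions shows
    # how strongly the assumed EAR (and the absorption model behind it) drives
    # the resulting prevalence figure.
    mean_intake = 10.5        # mg/day per capita, hypothetical
    for ear in (6.0, 8.0, 10.0):
        print(ear, round(prevalence_inadequate(mean_intake, cv=0.25, ear_mg=ear), 3))
    ```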

  16. Dynamic Group Diffie-Hellman Key Exchange under standard assumptions

    International Nuclear Information System (INIS)

    Bresson, Emmanuel; Chevassut, Olivier; Pointcheval, David

    2002-01-01

    Authenticated Diffie-Hellman key exchange allows two principals communicating over a public network, and each holding public-private keys, to agree on a shared secret value. In this paper we study the natural extension of this cryptographic problem to a group of principals. We begin from existing formal security models and refine them to incorporate major missing details (e.g., strong-corruption and concurrent sessions). Within this model we define the execution of a protocol for authenticated dynamic group Diffie-Hellman and show that it is provably secure under the decisional Diffie-Hellman assumption. Our security result holds in the standard model and thus provides better security guarantees than previously published results in the random oracle model
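
    For orientation, here is a toy sketch of the shared-secret arithmetic that group Diffie-Hellman builds on, for three principals; it omits everything the paper actually contributes (authentication, strong-corruption handling, dynamic membership) and uses a toy-sized prime:

    ```python
    # Unauthenticated 3-party Diffie-Hellman arithmetic (illustration only).
    import secrets

    p = 0xFFFFFFFB  # 32-bit prime for illustration; real use needs >= 2048 bits
    g = 5

    a, b, c = (secrets.randbelow(p - 2) + 1 for _ in range(3))

    # Round 1: publish g^a, g^b, g^c. Round 2: each principal exponentiates what
    # it received, chaining exponents so everyone ends with g^(abc) mod p.
    g_a, g_b, g_c = pow(g, a, p), pow(g, b, p), pow(g, c, p)
    g_ab, g_bc, g_ca = pow(g_a, b, p), pow(g_b, c, p), pow(g_c, a, p)

    k_alice = pow(g_bc, a, p)   # (g^bc)^a
    k_bob   = pow(g_ca, b, p)   # (g^ca)^b
    k_carol = pow(g_ab, c, p)   # (g^ab)^c
    assert k_alice == k_bob == k_carol
    print(hex(k_alice))
    ```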

  17. Transsexual parenthood and new role assumptions.

    Science.gov (United States)

    Faccio, Elena; Bordin, Elena; Cipolletta, Sabrina

    2013-01-01

    This study explores the parental role of transsexuals and compares this to common assumptions about transsexuality and parentage. We conducted semi-structured interviews with 14 male-to-female transsexuals and 14 men, half parents and half non-parents, in order to explore four thematic areas: self-representation of the parental role, the description of the transsexual as a parent, the common representations of transsexuals as a parent, and male and female parental stereotypes. We conducted thematic and lexical analyses of the interviews using Taltac2 software. The results indicate that social representations of transsexuality and parenthood have a strong influence on processes of self-representation. Transsexual parents accurately understood conventional male and female parental prototypes and saw themselves as competent, responsible parents. They constructed their role based on affection toward the child rather than on the complementary role of their wives. In contrast, men's descriptions of transsexual parental roles were simpler and the descriptions of their parental role coincided with their personal experiences. These results suggest that the transsexual journey toward parenthood involves a high degree of re-adjustment, because their parental role does not coincide with a conventional one.

  18. Towards New Probabilistic Assumptions in Business Intelligence

    Directory of Open Access Journals (Sweden)

    Schumann Andrew

    2015-01-01

    Full Text Available One of the main assumptions of mathematical tools in science is represented by the idea of measurability and additivity of reality. For discovering the physical universe additive measures such as mass, force, energy, temperature, etc. are used. Economics and conventional business intelligence try to continue this empiricist tradition and in statistical and econometric tools they appeal only to the measurable aspects of reality. However, a lot of important variables of economic systems cannot be observable and additive in principle. These variables can be called symbolic values or symbolic meanings and studied within symbolic interactionism, the theory developed since George Herbert Mead and Herbert Blumer. In statistical and econometric tools of business intelligence we accept only phenomena with causal connections measured by additive measures. In the paper we show that in the social world we deal with symbolic interactions which can be studied by non-additive labels (symbolic meanings or symbolic values. For accepting the variety of such phenomena we should avoid additivity of basic labels and construct a new probabilistic method in business intelligence based on non-Archimedean probabilities.

  19. Economic Growth Assumptions in Climate and Energy Policy

    Directory of Open Access Journals (Sweden)

    Nir Y. Krakauer

    2014-03-01

    Full Text Available The assumption that the economic growth seen in recent decades will continue has dominated the discussion of future greenhouse gas emissions and the mitigation of and adaptation to climate change. Given that long-term economic growth is uncertain, the impacts of a wide range of growth trajectories should be considered. In particular, slower economic growth would imply that future generations will be relatively less able to invest in emissions controls or adapt to the detrimental impacts of climate change. Taking into consideration the possibility of economic slowdown therefore heightens the urgency of reducing greenhouse gas emissions now by moving to renewable energy sources, even if this incurs short-term economic cost. I quantify this counterintuitive impact of economic growth assumptions on present-day policy decisions in a simple global economy-climate model, the Dynamic Integrated model of Climate and the Economy (DICE). In DICE, slow future growth increases the economically optimal present-day carbon tax rate and the utility of taxing carbon emissions, although the magnitude of the increase is sensitive to model parameters, including the rate of social time preference and the elasticity of the marginal utility of consumption. Future scenario development should specifically include low-growth scenarios, and the possibility of low-growth economic trajectories should be taken into account in climate policy analyses.
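
    The mechanism can be sketched with the Ramsey discounting rule that sits inside DICE-like models, r = ρ + η·g: slower growth g lowers the discount rate, raising the present value of future damages and hence the optimal present-day carbon price. The ρ and η values below match DICE-style defaults; the damage normalization is illustrative.

    ```python
    # Ramsey-rule sketch of why slow growth raises present-day climate policy value.
    def present_value(damage, years, rho=0.015, eta=1.45, g=0.02):
        r = rho + eta * g                 # consumption discount rate
        return damage / (1.0 + r) ** years

    damage_in_2100 = 1.0                  # normalized damage ~85 years ahead
    for g in (0.03, 0.02, 0.01, 0.0):     # per-capita consumption growth scenarios
        print(f"g = {g:.0%}: PV = {present_value(damage_in_2100, 85, g=g):.3f}")
    # Output ranges from ~0.008 (3% growth) to ~0.28 (zero growth): the same
    # future damage weighs ~35x more heavily under the stagnation scenario.
    ```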

  20. Validation of Power Requirement Model for Active Loudspeakers

    DEFF Research Database (Denmark)

    Schneider, Henrik; Madsen, Anders Normann; Bjerregaard, Ruben

    2015-01-01

    The actual power requirement of an active loudspeaker during playback of music has not received much attention in the literature. This is probably because no single and simple solution exists and because a complete system knowledge from input voltage to output sound pressure level is required...
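
    As a back-of-envelope illustration of why playback power is hard to pin down (this is not the paper's model; sensitivity, distance and crest-factor numbers are assumed), peak demand during music far exceeds the average because of the signal's crest factor:

    ```python
    # Rough amplifier power estimate from target SPL, driver sensitivity,
    # listening distance, and an assumed music crest factor.
    import math

    def required_power_w(target_spl_db: float, sensitivity_db_1w_1m: float,
                         distance_m: float, crest_factor_db: float) -> float:
        spl_at_1m = target_spl_db + 20 * math.log10(distance_m)   # free-field loss
        return 10 ** ((spl_at_1m + crest_factor_db - sensitivity_db_1w_1m) / 10)

    # 85 dB average at 3 m, 87 dB/W/m driver, 12 dB crest factor -> ~90 W peaks
    print(f"{required_power_w(85, 87, 3.0, 12.0):.0f} W peak")
    ```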

  1. Closed loop models for analyzing engineering requirements for simulators

    Science.gov (United States)

    Baron, S.; Muralidharan, R.; Kleinman, D.

    1980-01-01

    A closed-loop analytic model was developed, incorporating a model for the human pilot (namely, the optimal control model), that allows certain simulation design tradeoffs to be evaluated quantitatively. This model was applied to a realistic flight control problem. The resulting model is used to analyze both overall simulation effects and the effects of individual elements. The results show that, as compared to an ideal continuous simulation, the discrete simulation can result in significant performance and/or workload penalties.

  2. Comparison of listing strategies for allosensitized heart transplant candidates requiring transplant at high urgency: a decision model analysis.

    Science.gov (United States)

    Feingold, B; Webber, S A; Bryce, C L; Park, S Y; Tomko, H E; Comer, D M; Mahle, W T; Smith, K J

    2015-02-01

    Allosensitized children who require a negative prospective crossmatch have a high risk of death awaiting heart transplantation. Accepting the first suitable organ offer, regardless of the possibility of a positive crossmatch, would improve waitlist outcomes but it is unclear whether it would result in improved survival at all times after listing, including posttransplant. We created a Markov decision model to compare survival after listing with a requirement for a negative prospective donor cell crossmatch (WAIT) versus acceptance of the first suitable offer (TAKE). Model parameters were derived from registry data on status 1A (highest urgency) pediatric heart transplant listings. We assumed no possibility of a positive crossmatch in the WAIT strategy and a base-case probability of a positive crossmatch in the TAKE strategy of 47%, as estimated from cohort data. Under base-case assumptions, TAKE showed an incremental survival benefit of 1.4 years over WAIT. In multiple sensitivity analyses, including variation of the probability of a positive crossmatch from 10% to 100%, TAKE was consistently favored. While model input data were less well suited to comparing survival when awaiting transplantation across a negative virtual crossmatch, our analysis suggests that taking the first suitable organ offer under these circumstances is also favored. © Copyright 2015 The American Society of Transplantation and the American Society of Transplant Surgeons.
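
    The structure of such a decision model can be sketched as a three-state Markov cohort simulation; the monthly transition probabilities below are made up for illustration (the published model's inputs are registry-derived and differ), and the positive-crossmatch penalty is modeled crudely as higher post-transplant mortality:

    ```python
    # WAIT-vs-TAKE Markov cohort sketch with monthly cycles (toy probabilities).
    # States: waiting, alive post-transplant, dead (absorbing).
    def expected_life_years(p_tx, p_die_wait, p_die_post, months=240):
        waiting, post_tx, life_months = 1.0, 0.0, 0.0
        for _ in range(months):
            tx = waiting * p_tx
            waiting -= tx + waiting * p_die_wait
            post_tx += tx - post_tx * p_die_post
            life_months += waiting + post_tx          # person-months alive
        return life_months / 12.0

    P_OFFER, P_POS = 0.15, 0.47      # monthly offer rate; positive-crossmatch risk
    # WAIT: transplant only across a negative prospective crossmatch.
    wait = expected_life_years(p_tx=P_OFFER * (1 - P_POS),
                               p_die_wait=0.03, p_die_post=0.004)
    # TAKE: accept the first suitable offer; blend in worse assumed survival for
    # the positive-crossmatch fraction.
    take = expected_life_years(p_tx=P_OFFER, p_die_wait=0.03,
                               p_die_post=(1 - P_POS) * 0.004 + P_POS * 0.008)
    print(f"WAIT {wait:.1f} vs TAKE {take:.1f} expected life-years over 20 years")
    ```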

  3. Assumptions for well-known statistical techniques: Disturbing explanations for why they are seldom checked

    Directory of Open Access Journals (Sweden)

    Rink Hoekstra

    2012-05-01

    Full Text Available A valid interpretation of most statistical techniques requires that the criteria for one or more assumptions are met. In published articles, however, little information tends to be reported on whether the data satisfy the assumptions underlying the statistical techniques used. This could be due to self-selection: Only manuscripts with data fulfilling the assumptions are submitted. Another, more disquieting, explanation would be that violations of assumptions are hardly checked for in the first place. In this article a study is presented on whether and how 30 researchers checked fictitious data for violations of assumptions in their own working environment. They were asked to analyze the data as they would their own data, for which often used and well-known techniques like the t-procedure, ANOVA and regression were required. It was found that they hardly ever checked for violations of assumptions. Interviews afterwards revealed that mainly lack of knowledge and nonchalance, rather than more rational reasons like being aware of the robustness of a technique or unfamiliarity with an alternative, seem to account for this behavior. These data suggest that merely encouraging people to check for violations of assumptions will not lead them to do so, and that the use of statistics is opportunistic.

  4. Required Collaborative Work in Online Courses: A Predictive Modeling Approach

    Science.gov (United States)

    Smith, Marlene A.; Kellogg, Deborah L.

    2015-01-01

    This article describes a predictive model that assesses whether a student will have greater perceived learning in group assignments or in individual work. The model produces correct classifications 87.5% of the time. The research is notable in that it is the first in the education literature to adopt a predictive modeling methodology using data…

  5. User Requirements Model for University Timetable Management System

    OpenAIRE

    Ahmad Althunibat; Mohammad I. Muhairat

    2016-01-01

    Automated timetables are used to schedule courses, lectures and rooms in universities by considering some constraints. Inconvenient and ineffective timetables often waste time and money. Therefore, it is important to investigate the requirements and potential needs of users. Thus, eliciting user requirements of University Timetable Management System (TMS) and their implication becomes an important process for the implementation of TMS. On this note, this paper seeks to propose a m...

  6. A new scenario framework for climate change research: the concept of shared climate policy assumptions

    NARCIS (Netherlands)

    Kriegler, E.; Edmonds, J.; Hallegatte, S.; Ebi, K.L.; Kram, T.; Riahi, K.; Winkler, J.; van Vuuren, Detlef

    2014-01-01

    The new scenario framework facilitates the coupling of multiple socioeconomic reference pathways with climate model products using the representative concentration pathways. This will allow for improved assessment of climate impacts, adaptation and mitigation. Assumptions about climate policy play a

  7. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    Science.gov (United States)

    Ernst, Anja F.

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking. PMID:28533971
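
    The distinction the review stresses can be demonstrated in a few lines: normality is assumed for the errors, not the raw variables, so the check belongs on the residuals of the fitted model (the data here are simulated for illustration):

    ```python
    # Check normality of residuals, not of the raw variables.
    import numpy as np
    import statsmodels.api as sm
    import matplotlib.pyplot as plt
    from scipy import stats

    rng = np.random.default_rng(1)
    x = rng.exponential(2.0, 200)          # heavily skewed predictor - and that is fine
    y = 1.5 * x + rng.normal(0, 1, 200)    # errors are normal, as the model assumes

    fit = sm.OLS(y, sm.add_constant(x)).fit()
    print(stats.shapiro(x))                # "significant": x is not normal (irrelevant)
    print(stats.shapiro(fit.resid))        # non-significant: residuals look normal
    sm.qqplot(fit.resid, line="45", fit=True)   # visual check of the residuals
    plt.show()
    ```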

  8. Halo-independent direct detection analyses without mass assumptions

    International Nuclear Information System (INIS)

    Anderson, Adam J.; Fox, Patrick J.; Kahn, Yonatan; McCullough, Matthew

    2015-01-01

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the $m_\chi-\sigma_n$ plane. Recently methods which are independent of the DM halo velocity distribution have been developed which present results in the $v_{min}-\tilde{g}$ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from $v_{min}$ to nuclear recoil momentum ($p_R$), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call $\tilde{h}(p_R)$. The entire family of conventional halo-independent $\tilde{g}(v_{min})$ plots for all DM masses are directly found from the single $\tilde{h}(p_R)$ plot through a simple rescaling of axes. By considering results in $\tilde{h}(p_R)$ space, one can determine if two experiments are inconsistent for all masses and all physically possible halos, or for what range of dark matter masses the results are inconsistent for all halos, without the necessity of multiple $\tilde{g}(v_{min})$ plots for different DM masses. We conduct a sample analysis comparing the CDMS II Si events to the null results from LUX, XENON10, and SuperCDMS using our method and discuss how the results can be strengthened by imposing the physically reasonable requirement of a finite halo escape velocity
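
    The change of variables rests on standard elastic-scattering kinematics; a sketch of the relations presumably underlying it (conventions assumed, not copied from the paper):

    ```latex
    % Elastic-scattering kinematics behind the v_min -> p_R change of variables.
    \[
      v_{min} = \frac{p_R}{2\mu_{\chi N}}, \qquad
      p_R = \sqrt{2 m_N E_R}, \qquad
      \mu_{\chi N} = \frac{m_\chi m_N}{m_\chi + m_N},
    \]
    % so a single plot against p_R encodes, for every choice of m_chi, the
    % corresponding v_min axis via the mass-dependent rescaling p_R = 2 mu v_min.
    ```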

  9. Bioenergy crop models: Descriptions, data requirements and future challenges

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Kang, Shujiang [ORNL; Zhang, Xuesong [Pacific Northwest National Laboratory (PNNL); Miguez, Fernando [Iowa State University; Izaurralde, Dr. R. Cesar [Pacific Northwest National Laboratory (PNNL); Post, Wilfred M [ORNL; Dietze, Michael [University of Illinois, Urbana-Champaign; Lynd, L. [Dartmouth College; Wullschleger, Stan D [ORNL

    2012-01-01

    Field studies that address the production of lignocellulosic biomass as a source of renewable energy provide critical data for the development of bioenergy crop models. A literature survey revealed that 14 models have been used for simulating bioenergy crops including herbaceous and woody bioenergy crops, and for crassulacean acid metabolism (CAM) crops. These models simulate field-scale production of biomass for switchgrass (ALMANAC, EPIC, and Agro-BGC), miscanthus (MISCANFOR, MISCANMOD, and WIMOVAC), sugarcane (APSIM, AUSCANE, and CANEGRO), and poplar and willow (SECRETS and 3PG). Two models are adaptations of dynamic global vegetation models and simulate biomass yields of miscanthus and sugarcane at regional scales (Agro-IBIS and LPJmL). Although it lacks the complexity of other bioenergy crop models, the environmental productivity index (EPI) is the only model used to estimate biomass production of CAM (Agave and Opuntia) plants. Except for the EPI model, all models include representations of leaf area dynamics, phenology, radiation interception and utilization, biomass production, and partitioning of biomass to roots and shoots. A few models simulate soil water, nutrient, and carbon cycle dynamics, making them especially useful for assessing the environmental consequences (e.g., erosion and nutrient losses) associated with the large-scale deployment of bioenergy crops. The rapid increase in use of models for energy crop simulation is encouraging; however, detailed information on the influence of climate, soils, and crop management practices on biomass production is scarce. Thus considerable work remains regarding the parameterization and validation of process-based models for bioenergy crops; generation and distribution of high-quality field data for model development and validation; and implementation of an integrated framework for efficient, high-resolution simulations of biomass production for use in planning sustainable bioenergy systems.

  10. Philosophy of Technology Assumptions in Educational Technology Leadership

    Science.gov (United States)

    Webster, Mark David

    2017-01-01

    A qualitative study using grounded theory methods was conducted to (a) examine what philosophy of technology assumptions are present in the thinking of K-12 technology leaders, (b) investigate how the assumptions may influence technology decision making, and (c) explore whether technological determinist assumptions are present. Subjects involved…

  11. 41 CFR 60-3.9 - No assumption of validity.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 1 2010-07-01 2010-07-01 true No assumption of validity... assumption of validity. A. Unacceptable substitutes for evidence of validity. Under no circumstances will the... of its validity be accepted in lieu of evidence of validity. Specifically ruled out are: assumptions...

  12. The zero-sum assumption in neutral biodiversity theory

    NARCIS (Netherlands)

    Etienne, R.S.; Alonso, D.; McKane, A.J.

    2007-01-01

    The neutral theory of biodiversity as put forward by Hubbell in his 2001 monograph has received much criticism for its unrealistic simplifying assumptions. These are the assumptions of functional equivalence among different species (neutrality), the assumption of point mutation speciation, and the

  13. The Causes and Consequences of Differing Pensions Accounting Assumptions in UK Pension Schemes

    OpenAIRE

    Thomas, Gareth

    2006-01-01

    Anecdotal evidence and a number of empirical studies from the US suggest that the providers of corporate pension schemes may manipulate the actuarial assumptions used to estimate the value of the scheme. By manipulating the pension scheme assumptions corporations can reduce their required contribution to the scheme in order to manage their perceived performance. A sample of 92 FTSE 100 companies during the period 2002-2004 was taken and the link between corporate financial constraint and pe...

  14. Semi-Supervised Transductive Hot Spot Predictor Working on Multiple Assumptions

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-05-23

    Protein-protein interactions are critically dependent on just a few residues (“hot spots”) at the interfaces. Hot spots make a dominant contribution to the binding free energy and if mutated they can disrupt the interaction. As mutagenesis studies require significant experimental efforts, there exists a need for accurate and reliable computational hot spot prediction methods. Compared to the supervised hot spot prediction algorithms, the semi-supervised prediction methods can take into consideration both the labeled and unlabeled residues in the dataset during the prediction procedure. The transductive support vector machine has been utilized for this task and demonstrated a better prediction performance. To the best of our knowledge, however, none of the transductive semi-supervised algorithms takes all the three semisupervised assumptions, i.e., smoothness, cluster and manifold assumptions, together into account during learning. In this paper, we propose a novel semi-supervised method for hot spot residue prediction, by considering all the three semisupervised assumptions using nonlinear models. Our algorithm, IterPropMCS, works in an iterative manner. In each iteration, the algorithm first propagates the labels of the labeled residues to the unlabeled ones, along the shortest path between them on a graph, assuming that they lie on a nonlinear manifold. Then it selects the most confident residues as the labeled ones for the next iteration, according to the cluster and smoothness criteria, which is implemented by a nonlinear density estimator. Experiments on a benchmark dataset, using protein structure-based features, demonstrate that our approach is effective in predicting hot spots and compares favorably to other available methods. The results also show that our method outperforms the state-of-the-art transductive learning methods.
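
    As a generic stand-in for the graph-based propagation idea (this is not the authors' IterPropMCS; it uses scikit-learn's LabelSpreading on synthetic data purely to illustrate the manifold assumption):

    ```python
    # Graph-based semi-supervised classification: unlabeled points are marked -1.
    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.semi_supervised import LabelSpreading

    X, y = make_moons(n_samples=300, noise=0.08, random_state=0)
    y_train = np.full_like(y, -1)
    labeled = np.random.default_rng(0).choice(len(y), size=10, replace=False)
    y_train[labeled] = y[labeled]               # only 10 points carry labels

    # Labels spread along a kNN graph, following the data manifold rather than
    # straight-line (Euclidean) proximity.
    model = LabelSpreading(kernel="knn", n_neighbors=7).fit(X, y_train)
    print(f"accuracy over all points: {model.score(X, y):.2f}")
    ```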

  15. Modelling Imperfect Product Line Requirements with Fuzzy Feature Diagrams

    NARCIS (Netherlands)

    Noppen, J.A.R.; van den Broek, P.M.; Weston, Nathan; Rashid, Awais

    In this article, we identify that partial, vague and conflicting information can severely limit the effectiveness of approaches that derive feature trees from textual requirement specifications. We examine the impact such imperfect information has on feature tree extraction and we propose the use of

  16. From requirement document to formal modelling and decomposition of control systems

    OpenAIRE

    Yeganefard, Sanaz

    2014-01-01

    Formal modelling of control systems can help with identifying missing requirements and design flaws before implementing them. However, modelling using formal languages can be challenging and time consuming. Therefore intermediate steps may be required to simplify the transition from informal requirements to a formal model.In this work we firstly provide a four-stage approach for structuring and formalising requirements of a control system. This approach is based on monitored, controlled, mode...

  17. DDH-like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike

    2011-01-01

    We introduce and study a new type of DDH-like assumptions based on groups of prime order q. Whereas standard DDH is based on encoding elements of $\mathbb{F}_{q}$ ``in the exponent'' of elements in the group, we ask what happens if instead we put in the exponent elements of the extension ring $R_f = \mathbb{F}_q[X]/(f)$. We show that the resulting d-DDH assumption, just like DDH, is easy in bilinear groups. This motivates our suggestion of a different type of assumption, the d-vector DDH problems (VDDH), which are based on f(X) = X^d, but with a twist to avoid the problems with reducible polynomials. We show in the generic group model that VDDH is hard in bilinear groups and that in fact the problems become harder with increasing d and hence form an infinite hierarchy. We show that hardness of VDDH implies CCA-secure encryption, efficient Naor-Reingold style pseudorandom functions, and auxiliary input secure encryption, a strong form of leakage resilience. This can be seen…
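
    A sketch of the encoding behind these assumptions (notation may differ from the papers'): a ring element is placed "in the exponent" componentwise, and the problem asks to distinguish ring products from random:

    ```latex
    % For a = a_0 + a_1 X + ... + a_{d-1} X^{d-1} in R_f = F_q[X]/(f), encode
    % a "in the exponent" componentwise; d-DDH then asks to distinguish:
    \[
      g^{\vec a} := (g^{a_0}, \ldots, g^{a_{d-1}}), \qquad
      (g^{\vec a},\, g^{\vec b},\, g^{\vec{a \cdot b}})
      \;\stackrel{?}{\approx}\;
      (g^{\vec a},\, g^{\vec b},\, g^{\vec c}), \quad a, b, c \leftarrow R_f .
    \]
    ```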

  18. DDH-Like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike

    2012-01-01

    We introduce and study a new type of DDH-like assumptions based on groups of prime order q. Whereas standard DDH is based on encoding elements of $\mathbb{F}_{q}$ “in the exponent” of elements in the group, we ask what happens if instead we put in the exponent elements of the extension ring $R_f = \mathbb{F}_q[X]/(f)$ … and security proof but get better security and, moreover, the amortized complexity (e.g., computation per encrypted bit) is the same as when using DDH. We also show that d-DDH, just like DDH, is easy in bilinear groups. We therefore suggest a different type of assumption, the d-vector DDH problems (d-VDDH), which are based on f(X) = X^d, but with a twist to avoid problems with reducible polynomials. We show in the generic group model that d-VDDH is hard in bilinear groups and that the problems become harder with increasing d. We show that hardness of d-VDDH implies CCA-secure encryption, efficient Naor…

  19. The Benefit of Ambiguity in Understanding Goals in Requirements Modelling

    DEFF Research Database (Denmark)

    Paay, Jeni; Pedell, Sonja; Sterling, Leon

    2011-01-01

    of their research is to create technologies that support more flexible and meaningful social interactions, by combining best practice in software engineering with ethnographic techniques to model complex social interactions from their socially oriented life for the purposes of building rich socio… ambiguity in the process of elicitation and analysis through the use of empirically informed quality goals attached to functional goals. The authors demonstrate the benefit of articulating a quality goal without turning it into a functional goal. Their study shows that quality goals kept at a high level of abstraction, ambiguous and open for conversations through the modelling process, add richness to goal models, and communicate quality attributes of the interaction being modelled to the design phase, where this ambiguity is regarded as a resource for design…

  20. Strategic Reform: A Battle of Assumptions

    Science.gov (United States)

    2015-06-01

    threats to gestate in unexpected corners of the globe. All are treated as proof of uncertainty and co-opted into a narrative justifying broad...States may currently establish the rule sets that enable the globalized market space, but the market itself requires the manufacturing engine and...missions” established a new role for the Chinese military in ensuring those markets.10 That new role exposes new vulnerabilities. Countering force

  1. Requirements and Problems in Parallel Model Development at DWD

    Directory of Open Access Journals (Sweden)

    Ulrich Schättler

    2000-01-01

    Full Text Available Nearly 30 years after introducing the first computer model for weather forecasting, the Deutscher Wetterdienst (DWD) is developing the 4th generation of its numerical weather prediction (NWP) system. It consists of a global grid point model (GME) based on a triangular grid and a non-hydrostatic Lokal Modell (LM). The operational demand for running this new system is immense and can only be met by parallel computers. From the experience gained in developing earlier NWP models, several new problems had to be taken into account during the design phase of the system. Most important were portability (including efficiency of the programs on several computer architectures) and ease of code maintainability. Also the organization and administration of the work done by developers from different teams and institutions is more complex than it used to be. This paper describes the models and gives some performance results. The modular approach used for the design of the LM is explained and the effects on the development are discussed.

  2. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    Science.gov (United States)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper is focused towards carrying out a pragmatic qualitative analysis of various models and approaches to requirements negotiation (a sub-process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e., from within the communication management knowledge area). The experiential analysis encompasses two tiers; the first tier refers to the weighted scoring model, while the second tier focuses on the development of SWOT matrices on the basis of the findings of the weighted scoring model for selecting an appropriate requirements negotiation model. Finally, the results are presented with the help of statistical pie charts. On the basis of the results for the prevalent models and approaches to negotiation, a unified approach for requirements negotiation and stakeholder collaboration is proposed, where the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside some external required parameters like MBTI, opportunity analysis, etc.
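
    A minimal weighted scoring model of the kind the first tier describes looks like this (the candidate models, criteria, weights and scores below are illustrative placeholders, not the paper's data):

    ```python
    # Weighted scoring: score each candidate negotiation model against weighted
    # criteria and rank by total. All inputs are hypothetical.
    criteria_weights = {"stakeholder_coverage": 0.35, "tool_support": 0.20,
                        "scalability": 0.25, "ease_of_use": 0.20}

    candidates = {                     # criterion -> score on a 1-5 scale
        "Model A": {"stakeholder_coverage": 5, "tool_support": 4,
                    "scalability": 3, "ease_of_use": 3},
        "Model B": {"stakeholder_coverage": 4, "tool_support": 3,
                    "scalability": 4, "ease_of_use": 3},
        "Ad hoc":  {"stakeholder_coverage": 2, "tool_support": 1,
                    "scalability": 2, "ease_of_use": 5},
    }

    totals = {name: sum(criteria_weights[c] * s for c, s in scores.items())
              for name, scores in candidates.items()}
    for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(f"{name:8s} {total:.2f}")
    ```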

  3. Mathematical Formulation Requirements and Specifications for the Process Models

    International Nuclear Information System (INIS)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-01-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments naturally

  4. Public key cryptography from weaker assumptions

    DEFF Research Database (Denmark)

    Zottarel, Angela

    This dissertation is focused on the construction of public key cryptographic primitives and on the relative security analysis in a meaningful theoretic model. This work takes two orthogonal directions. In the first part, we study cryptographic constructions preserving their security properties also...... in the case the adversary is granted access to partial information about the secret state of the primitive. To do so, we work in an extension of the standard black-box model, a new framework where possible leakage from the secret state is taken into account. In particular, we give the first construction...... of signature schemes in a very general leakage model known as auxiliary input. We also study how leakage influences the notion of simulation-based security, comparing leakage tolerance to adaptive security in the UC-framework. In the second part of this dissertation, we turn our attention to hardness...

  5. Breakdown of Hydrostatic Assumption in Tidal Channel with Scour Holes

    Directory of Open Access Journals (Sweden)

    Chunyan Li

    2016-10-01

    Full Text Available The hydrostatic condition is a common assumption for tidal and subtidal motions in oceans and estuaries. Theories built on this assumption have been largely successful. However, there are no definite criteria separating the hydrostatic from the non-hydrostatic regimes in real applications, because real problems often have multiple scales. With the increased refinement of high-resolution numerical models encompassing smaller and smaller spatial scales, the need for non-hydrostatic models is increasing. To evaluate the vertical motion over bathymetric changes in tidal channels and assess the validity of the hydrostatic approximation, we conducted observations using a vessel-based acoustic Doppler current profiler (ADCP). Observations were made along a straight channel 18 times over two scour holes 25 m deep, separated by 330 m, in an otherwise flat 8 m deep tidal pass leading to Lake Pontchartrain, over a period of 8 hours covering part of the diurnal tidal cycle. Of the 18 passages over the scour holes, 11 showed strong upwelling and downwelling which resulted in the breakdown of the hydrostatic condition. The maximum observed vertical velocity was ~0.35 m/s, a high value for a tidal channel, and the estimated vertical acceleration reached a high value of 1.76×10^-2 m/s^2. Analysis demonstrated that the barotropic non-hydrostatic acceleration was dominant. The non-hydrostatic flow was caused by the flow over the steep slopes. This demonstrates that in such a system the bathymetric variation can lead to the breakdown of hydrostatic conditions. Models with hydrostatic restrictions will not be able to correctly capture the dynamics in such a system with significant bathymetric variations, particularly during strong tidal currents.
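    As a rough illustration of the diagnostic involved, the sketch below estimates the material vertical acceleration from two successive vertical-velocity profiles and compares it with gravity, whose balance with the vertical pressure gradient is what the hydrostatic approximation assumes. The profile values, depth bins, and time base are hypothetical; only the magnitudes echo the abstract (w ~ 0.35 m/s, accelerations of order 10^-2 m/s^2).

    ```python
    import numpy as np

    # Hypothetical vertical-velocity profiles from two ADCP passes (m/s).
    w1 = np.array([0.05, 0.20, 0.35, 0.25])
    w2 = np.array([0.03, 0.15, 0.28, 0.20])
    z = np.array([5.0, 10.0, 15.0, 20.0])   # depth bins over a scour hole (m)
    dt = 10.0                                # time between passes (s), assumed

    local = (w2 - w1) / dt                   # local acceleration dw/dt
    adv = w1 * np.gradient(w1, z)            # advective term w * dw/dz

    total = np.abs(local + adv)
    g = 9.81
    print(f"max vertical acceleration: {total.max():.2e} m/s^2")
    print(f"ratio to g: {total.max() / g:.2e}")  # hydrostatic balance assumes << 1
    ```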

  6. Modelling production of field crops and its requirements

    NARCIS (Netherlands)

    Wit, de C.T.; Keulen, van H.

    1987-01-01

    Simulation models are being developed that enable quantitative estimates of the growth and production of the main agricultural crops under a wide range of weather and soil conditions. For this purpose, several hierarchically ordered production situations are distinguished in such a way that the

  7. Predicting Flu Season Requirements: An Undergraduate Modeling Project

    Science.gov (United States)

    Kramlich, Gary R., II; Braunstein Fierson, Janet L.; Wright, J. Adam

    2010-01-01

    This project was designed to be used in a freshman calculus class whose students had already been introduced to logistic functions and basic data modeling techniques. It need not be limited to such an audience, however; it has also been implemented in a topics in mathematics class for college upperclassmen. Originally intended to be presented in…

  8. Measuring sound absorption using local field assumptions

    NARCIS (Netherlands)

    Kuipers, E.R.

    2013-01-01

    To more effectively apply acoustically absorbing materials, it is desirable to measure angle-dependent sound absorption coefficients, preferably in situ. Existing measurement methods are based on an overall model of the acoustic field in front of the absorber, and are therefore sensitive to

  9. Modelling of capital requirements in the energy sector: capital market access. Final memorandum

    Energy Technology Data Exchange (ETDEWEB)

    1978-04-01

    Formal modelling techniques for analyzing the capital requirements of energy industries have been employed at DOE. A survey has been undertaken of a number of models which forecast energy-sector capital requirements or which detail the interactions of the energy sector and the economy. Models are identified which can be useful as prototypes for some portion of DOE's modelling needs. The models are examined to determine any useful data bases which could serve as inputs to an original DOE model. A selected group of models that can comply with the stated capabilities is examined. The data sources used by these models are covered and a catalog of the relevant data bases is provided. The models covered are: capital markets and capital availability models (Fossil 1, Bankers Trust Co., DRI Macro Model); models of physical capital requirements (Bechtel Supply Planning Model, ICF Oil and Gas Model and Coal Model, Stanford Research Institute National Energy Model); macroeconomic forecasting models with input-output analysis capabilities (Wharton Annual Long-Term Forecasting Model, Brookhaven/University of Illinois Model, Hudson-Jorgenson/Brookhaven Model); utility models (MIT Regional Electricity Model-Baughman Joskow, Teknekron Electric Utility Simulation Model); and others (DRI Energy Model, DRI/Zimmerman Coal Model, and Oak Ridge Residential Energy Use Model).

  10. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  11. Requirements for Logical Models for Value-Added Tax Legislation

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    Enterprise resource planning (ERP) systems are ubiquitous in commercial enterprises of all sizes and invariably need to account for the notion of value-added tax (VAT). The legal and technical difficulties in handling VAT are exacerbated by spanning a broad and chaotic spectrum of intricate country...... of the Danish VAT law in Web Ontology Language (OWL) and in Configit Product Modeling Language (CPML)....

  12. Single High Fidelity Geometric Data Sets for LCM - Model Requirements

    Science.gov (United States)

    2006-11-01

    triangles (.raw) to the native triangular facet file (.facet). The software vendors recommend the use of McNeel and Associates' Rhinoceros 3D for all...surface modeling and export. Rhinoceros has the capability and precision to create highly detailed 3D surface geometry suitable for radar cross section... white before ending up at blue as the temperature increases [27]. IR radiation was discovered in 1800 but its application is still limited in

  13. Century model of soil organic matter dynamics: equations and assumptions (Modelo Century de dinâmica da matéria orgânica do solo: equações e pressupostos)

    Directory of Open Access Journals (Sweden)

    Luiz Fernando Carvalho Leite

    2003-08-01

    Full Text Available The modeling of biological processes aims at land-use planning, the establishment of environmental standards, and the estimation of the actual and potential risks of agricultural and environmental activities. Several models have been created in the last 25 years. Century is a mechanistic model that analyzes the long-term dynamics of soil organic matter and nutrients in the soil-plant system in several agroecosystems. The soil organic matter submodel has active (microbial biomass and products), slow (plant and microbial products that are physically protected or biologically resistant to decomposition), and passive (chemically recalcitrant or also physically protected) compartments with different decomposition rates. First-order equations are used to model all soil organic matter compartments, and soil temperature and moisture modify the decomposition rates. The recycling of the active compartment and the formation of the passive compartment are controlled by the soil sand and clay contents, respectively. Plant residues are divided into compartments depending on their lignin and nitrogen contents. Through the model, organic matter can be related to fertility levels and to current and future management, improving the understanding of nutrient transformations in soils of several agroecosystems.
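    The first-order pool structure described above is easy to sketch. In the fragment below the base decomposition rates and the temperature/moisture response curves are illustrative placeholders, not calibrated Century parameters.

    ```python
    import numpy as np

    # Illustrative base decomposition rates (1/yr) for the three pools.
    K_BASE = {"active": 7.3, "slow": 0.2, "passive": 0.0045}

    def abiotic_factor(temp_c, moisture):
        """Combined temperature/moisture modifier in [0, 1] (placeholder curves)."""
        f_t = min(np.exp(0.07 * (temp_c - 30.0)), 1.0)
        f_w = min(max(moisture, 0.0), 1.0)
        return f_t * f_w

    def step(pools, temp_c, moisture, dt=1.0 / 12.0):
        """One month of first-order decay dC/dt = -k * A(T, W) * C per pool."""
        a = abiotic_factor(temp_c, moisture)
        return {p: c * np.exp(-K_BASE[p] * a * dt) for p, c in pools.items()}

    pools = {"active": 0.05, "slow": 1.5, "passive": 3.0}  # kg C/m^2, assumed
    for _ in range(12):
        pools = step(pools, temp_c=25.0, moisture=0.6)
    print({p: round(c, 4) for p, c in pools.items()})
    ```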

  14. Experimental assessment of unvalidated assumptions in classical plasticity theory.

    Energy Technology Data Exchange (ETDEWEB)

    Brannon, Rebecca Moss (University of Utah, Salt Lake City, UT); Burghardt, Jeffrey A. (University of Utah, Salt Lake City, UT); Bauer, Stephen J.; Bronowski, David R.

    2009-01-01

    This report investigates the validity of several key assumptions in classical plasticity theory regarding material response to changes in the loading direction. Three metals, two rock types, and one ceramic were subjected to non-standard loading directions, and the resulting strain response increments were displayed in Gudehus diagrams to illustrate the approximation error of classical plasticity theories. A rigorous mathematical framework for fitting classical theories to the data, thus quantifying the error, is provided. Further data analysis techniques are presented that allow testing for the effect of changes in loading direction without having to use a new sample and for inferring the yield normal and flow directions without having to measure the yield surface. Though the data are inconclusive, there is indication that classical, incrementally linear, plasticity theory may be inadequate over a certain range of loading directions. This range of loading directions also coincides with loading directions that are known to produce a physically inadmissible instability for any nonassociative plasticity model.

  15. Nonlinear dynamics in work groups with Bion's basic assumptions.

    Science.gov (United States)

    Dal Forno, Arianna; Merlone, Ugo

    2013-04-01

    According to several authors, Bion's contribution has been a landmark in the thought and conceptualization of the unconscious functioning of human beings in groups. We provide a mathematical model of group behavior in which heterogeneous members may behave as if they shared, to different degrees, what in Bion's theory is a common basic assumption. Our formalization combines both individual characteristics and group dynamics. By this formalization we analyze the group dynamics as the result of the individual dynamics of the members and prove that, under some conditions, each individual reproduces the group dynamics on a different scale. In particular, we provide an example in which the chaotic behavior of the group is reflected in each member.
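    Purely to illustrate how heterogeneous members coupled through a group state can produce the kind of behaviour described, the toy map below blends a logistic-type individual dynamic with the group mean, with member-specific weights standing in for the degree to which each member shares the basic assumption. The map, weights, and parameters are assumptions of this sketch, not the authors' formalization.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 6
    w = rng.uniform(0.2, 0.9, size=n)   # degree of sharing per member (assumed)
    x = rng.uniform(0.1, 0.9, size=n)   # initial individual states
    r = 3.9                             # logistic parameter in the chaotic regime

    for _ in range(200):
        group = x.mean()
        # Each member blends its own logistic dynamics with the group state.
        x = (1 - w) * r * x * (1 - x) + w * r * group * (1 - group)

    print("member states:", np.round(x, 3))
    print("group state:  ", round(float(x.mean()), 3))
    ```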

  16. Assumptions of Customer Knowledge Enablement in the Open Innovation Process

    Directory of Open Access Journals (Sweden)

    Jokubauskienė Raminta

    2017-08-01

    Full Text Available In the scientific literature, open innovation is one of the most effective means to innovate and gain a competitive advantage. In practice, there is a variety of open innovation activities, but, nevertheless, customers stand as the cornerstone in this area, since customers' knowledge is one of the most important sources of new knowledge and ideas. When evaluating the context in which open innovation and customer knowledge enablement interact, it is necessary to take into account the importance of customer knowledge management. It is increasingly highlighted that customers' knowledge management facilitates the creation of innovations. However, other factors that influence open innovation, and at the same time customers' knowledge management, should also be examined. This article presents a theoretical model, which reveals the assumptions of the open innovation process and their impact on the firm's performance.

  17. Are Gaussian spectra a viable perceptual assumption in color appearance?

    Science.gov (United States)

    Mizokami, Yoko; Webster, Michael A

    2012-02-01

    Natural illuminant and reflectance spectra can be roughly approximated by a linear model with as few as three basis functions, and this has suggested that the visual system might construct a linear representation of the spectra by estimating the weights of these functions. However, such models do not accommodate nonlinearities in color appearance, such as the Abney effect. Previously, we found that these nonlinearities are qualitatively consistent with a perceptual inference that stimulus spectra are instead roughly Gaussian, with the hue tied to the inferred centroid of the spectrum [J. Vision 6(9), 12 (2006)]. Here, we examined to what extent a Gaussian inference provides a sufficient approximation of natural color signals. Reflectance and illuminant spectra from a wide set of databases were analyzed to test how well the curves could be fit by either a simple Gaussian with three parameters (amplitude, peak wavelength, and standard deviation) or the first three principal component analysis components of standard linear models. The resulting Gaussian fits were comparable to linear models with the same degrees of freedom, suggesting that the Gaussian model could provide a plausible perceptual assumption about stimulus spectra for a trichromatic visual system. © 2012 Optical Society of America
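    The Gaussian half of that comparison amounts to a three-parameter least-squares fit per spectrum, as in the sketch below. The spectrum here is synthetic; a full analysis would loop over a reflectance database and also fit a three-component linear (PCA) model for comparison.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    wl = np.linspace(400, 700, 61)   # wavelengths (nm)
    rng = np.random.default_rng(1)
    spectrum = (0.6 * np.exp(-0.5 * ((wl - 550) / 40) ** 2)
                + 0.02 * rng.normal(size=wl.size))   # synthetic reflectance

    def gaussian(x, amp, mu, sigma):
        return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    popt, _ = curve_fit(gaussian, wl, spectrum, p0=[0.5, 540.0, 30.0])
    rmse = np.sqrt(np.mean((spectrum - gaussian(wl, *popt)) ** 2))
    print(f"amp={popt[0]:.2f}, peak={popt[1]:.0f} nm, "
          f"sd={popt[2]:.0f} nm, RMSE={rmse:.4f}")
    ```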

  18. An Analysis of the Capital Asset Pricing Model: Assumptions, Arguments and Criticisms (Finansal Varlıkları Fiyatlama Modelinin Analizi: Varsayımlar, Bulgular ve Hakkındaki Eleştiriler)

    Directory of Open Access Journals (Sweden)

    Hakan Bilir

    2016-03-01

    Full Text Available The evaluation of investment opportunities depends on the measurement of expected return and risk. The Capital Asset Pricing Model (CAPM) has for many years been one of the cornerstones of modern finance theory. The model posits a simple linear relationship between the expected return of assets and their systematic risk. It is still used to calculate the cost of capital, to measure the performance of portfolio management, and to evaluate investments. The appeal of the CAPM comes from its strong predictive power concerning the measurement of risk and of the relationship between expected return and risk. However, this ability of the model has been questioned by academics and practitioners for more than 30 years, with the debate conducted largely at the empirical level. The empirical problems of the CAPM reflect theoretical failings arising from its many simplifying assumptions, and these numerous unrealistic assumptions render the model practically unusable. The main criticisms of the model concentrate on the risk-free interest rate, the market portfolio, and the beta coefficient.
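    The model itself is compact: E[r_i] = r_f + beta_i (E[r_m] - r_f), with beta estimated by regressing the asset's excess returns on the market's. The sketch below uses synthetic return series, so the numbers are illustrative only.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    rf = 0.02 / 252                               # daily risk-free rate (assumed)
    market = rng.normal(0.0004, 0.01, size=1000)  # synthetic daily market returns
    asset = rf + 1.3 * (market - rf) + rng.normal(0, 0.005, size=1000)

    # beta = Cov(r_i - rf, r_m - rf) / Var(r_m - rf)
    beta = (np.cov(asset - rf, market - rf)[0, 1]
            / np.var(market - rf, ddof=1))
    expected_excess = beta * (market.mean() - rf)  # CAPM expected excess return
    print(f"estimated beta: {beta:.2f}")
    print(f"CAPM expected daily excess return: {expected_excess:.6f}")
    ```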

  19. Performance Requirements Modeling and Assessment for Active Power Ancillary Services

    DEFF Research Database (Denmark)

    Bondy, Daniel Esteban Morales; Thavlov, Anders; Tougaard, Janus Bundsgaard Mosbæk

    2017-01-01

    system operation, a reliable service delivery is required, yet it may not be appropriate to apply conventional performance requirements to new technologies and methods. The service performance requirements and assessment methods therefore need to be generalized and standardized in order to include future...... ancillary service sources. This paper develops a modeling method for ancillary services performance requirements, including performance and verification indices. The use of the modeling method and the indices is exemplified in two case studies....

  20. 7 Mass casualty incidents: a review of triage severity planning assumptions.

    Science.gov (United States)

    Hunt, Paul

    2017-12-01

    Recent events involving significant numbers of casualties have emphasised the importance of appropriate preparation for receiving hospitals, especially emergency departments, during the initial response phase of a major incident. Development of a mass casualty resilience and response framework in the Northern Trauma Network included a review of existing planning assumptions in order to ensure effective resource allocation, both in local receiving hospitals and system-wide. Existing planning assumptions regarding categorisation by triage level are generally stated as a P1:P2:P3 ratio of 25%:25%:50% of the total number of injured survivors. This may significantly over- or underestimate the number in each level of severity in the case of a large-scale incident. A pilot literature review was conducted of the available evidence from historical incidents in order to gather data on the confirmed numbers of overall casualties, 'critical' cases, admitted cases, and non-urgent or discharged cases. These data were collated and grouped by mechanism in order to calculate an appropriate severity ratio for each incident type. Twelve articles on mass casualty incidents from the last two decades were identified, covering three main incident types: (1) mass transportation crash, (2) building fire, and (3) bomb and related terrorist attacks, and involving a total of 3615 injured casualties. The overall mortality rate was calculated as 12.3%. Table 1 summarises the available patient casualty data from each of the specific incidents reported and the calculated proportions of critical ('P1'), admitted ('P2'), and non-urgent or ambulatory ('P3') cases. Despite the heterogeneity of the data and the range of incident types, there is sufficient evidence to suggest that current planning assumptions are incorrect and a more refined model is required. An important finding is the variation in the proportion of critical cases depending upon the mechanism. For example, a greater than expected proportion
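    The severity-ratio calculation itself is straightforward; the sketch below shows the method with hypothetical per-incident counts (the abstract's Table 1 data are not reproduced here), so only the computation, not the numbers, reflects the study.

    ```python
    # Hypothetical casualty counts per incident type: (P1, P2, P3).
    incidents = {
        "transport crash": (45, 120, 335),
        "building fire":   (30, 60, 110),
        "bombing":         (80, 150, 370),
    }

    for kind, (p1, p2, p3) in incidents.items():
        total = p1 + p2 + p3
        pct = [round(100 * x / total) for x in (p1, p2, p3)]
        print(f"{kind:>15}: P1:P2:P3 = {pct[0]}%:{pct[1]}%:{pct[2]}% (n={total})")
    ```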

  1. An examination of the impact of care giving styles (accommodation and skilful communication and support) on the one year outcome of adolescent anorexia nervosa: Testing the assumptions of the cognitive interpersonal model in anorexia nervosa.

    Science.gov (United States)

    Salerno, Laura; Rhind, Charlotte; Hibbs, Rebecca; Micali, Nadia; Schmidt, Ulrike; Gowers, Simon; Macdonald, Pamela; Goddard, Elizabeth; Todd, Gillian; Lo Coco, Gianluca; Treasure, Janet

    2016-02-01

    The cognitive interpersonal model predicts that parental caregiving style will impact on the rate of improvement of anorexia nervosa symptoms. The study aims to examine whether the absolute levels of, and the relative congruence between, mothers' and fathers' caregiving styles influenced the rate of change of their children's symptoms of anorexia nervosa over 12 months. Triads (n=54) consisting of patients with anorexia nervosa and both of their parents were included in the study. Caregivers completed the Caregiver Skills Scale and the Accommodation and Enabling Scale at intake. Patients completed the Short Evaluation of Eating Disorders at intake and at monthly intervals for one year. Polynomial hierarchical linear modeling was used for the analysis. There is a person/dose-dependent relationship between accommodation and patients' outcome: when both mother and father are highly accommodating, outcome is poor; if either is highly accommodating, outcome is intermediate; and if both parents are low on accommodation, outcome is good. Outcome is also good if both parents, or the mother alone, have high levels of carer skills, and poor if both have low levels of skills. The inclusion of only a sub-sample of an adolescent clinical population, the omission of time spent caregiving, and the reliance on patients' self-reported outcome data limit the generalisability of the current findings. Accommodating and enabling behaviours by family members can serve to maintain eating disorder behaviours. However, skilful behaviours, particularly by mothers, can aid recovery. Clinical interventions to optimise caregiving skills and to reduce accommodation by both parents may be an important addition to treatment for anorexia nervosa. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. "Open Access" Requires Clarification: Medical Journal Publication Models Evolve.

    Science.gov (United States)

    Lubowitz, James H; Brand, Jefferson C; Rossi, Michael J; Provencher, Matthew T

    2017-03-01

    While Arthroscopy journal is a traditional subscription model journal, our companion journal Arthroscopy Techniques is "open access." We used to believe open access simply meant online and free of charge. However, while open-access journals are free to readers, in 2017 authors must make a greater sacrifice in the form of an article-processing charge (APC). Again, while this does not apply to Arthroscopy, the APC will apply to Arthroscopy Techniques. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  3. Formal Requirements Modeling with Executable Use Cases and Coloured Petri Nets

    OpenAIRE

    Jørgensen, Jens Bæk; Tjell, Simon; Fernandes, Joao Miguel

    2009-01-01

    This paper presents executable use cases (EUCs), which constitute a model-based approach to requirements engineering. EUCs may be used as a supplement to model-driven development (MDD) and can describe and link user-level requirements and more technical software specifications. In MDD, user-level requirements are not always explicitly described, since usually it is sufficient that one provides a specification, or platform-independent model, of the software that is to be developed. Th...

  4. Requirements for modeling airborne microbial contamination in space stations

    Science.gov (United States)

    Van Houdt, Rob; Kokkonen, Eero; Lehtimäki, Matti; Pasanen, Pertti; Leys, Natalie; Kulmala, Ilpo

    2018-03-01

    Exposure to bioaerosols is one of the facets that affect indoor air quality, especially for people living in densely populated or confined habitats, and is associated with a wide range of health effects. Good indoor air quality is thus vital and a prerequisite for fully confined environments such as space habitats. Bioaerosols and microbial contamination in these confined space stations can have significant health impacts, considering the unique prevailing conditions and constraints of such habitats. Therefore, biocontamination in space stations is strictly monitored and controlled to ensure crew and mission safety. However, efficient bioaerosol control measures rely on a solid understanding of how these bioaerosols are created and dispersed, and of which factors affect the survivability of the associated microorganisms. Here we review the current knowledge gained from relevant studies in this wide and multidisciplinary area of bioaerosol dispersion modeling and biological indoor air quality control, specifically taking into account the specific space conditions.

  5. Legal assumptions for private company claim for additional (supplementary payment

    Directory of Open Access Journals (Sweden)

    Šogorov Stevan

    2011-01-01

    Full Text Available The subject matter of analysis in this article is the legal assumptions that must be met in order to enable a private company to call for additional payments. After introductory remarks, the discussion focuses on the existence of provisions regarding additional payments in the formation contract, or in a general resolution of the shareholders meeting, as the starting point for the company's claim. The second assumption is a concrete resolution of the shareholders meeting which creates individual obligations for additional payments. The third assumption is defined as distinctness regarding the sum of the payment and the due date. The sending of the claim by the relevant company body is set as the fourth legal assumption for the realization of the company's right to claim additional payments from a member of the private company.

  6. A Comprehensive Energy Analysis and Related Carbon Footprint of Dairy Farms, Part 2: Investigation and Modeling of Indirect Energy Requirements

    Directory of Open Access Journals (Sweden)

    Giuseppe Todde

    2018-02-01

    Full Text Available Dairy cattle farms are continuously developing more intensive management systems, which require higher utilization of durable and non-durable inputs. These inputs are responsible for significant direct and indirect fossil energy requirements, which are related to remarkable emissions of CO2. This study focused on investigating the indirect energy requirements of 285 conventional dairy farms and the related carbon footprint. A detailed analysis of the indirect energy inputs related to farm buildings, machinery and agricultural inputs was carried out. A partial life cycle assessment approach was used to evaluate the indirect energy inputs and the carbon footprint of the farms over a period of one harvest year. The investigation highlights the importance and weight of agricultural inputs, which represent more than 80% of the total indirect energy requirements. Moreover, the analyses underline that the assumption of similarity among dairy farms in terms of indirect energy requirements and related carbon emissions is incorrect, especially when observing different farm sizes and milk production levels. A mathematical model to estimate the indirect energy requirements of dairy farms has also been developed in order to provide an instrument allowing researchers to assess the energy incorporated into farm machinery, agricultural inputs and buildings. Combining the results of this two-part series, the total energy demand (expressed in GJ per farm) is mostly due to agricultural inputs and fuel consumption, which have the largest share of the annual requirements for each milk yield class. Direct and indirect energy requirements increased from 1302 GJ·y−1 for small farms to 5109 GJ·y−1 for larger ones. However, the related carbon dioxide emissions expressed per 100 kg of milk showed a negative trend going from the <5000 class to the >9000 kg milk yield class, where larger farms were able to

  7. 40 CFR 261.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... Excluded Hazardous Secondary Materials § 261.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure or liability... 40 Protection of Environment 25 2010-07-01 2010-07-01 false State assumption of responsibility...

  8. 40 CFR 144.66 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... PROGRAMS (CONTINUED) UNDERGROUND INJECTION CONTROL PROGRAM Financial Responsibility: Class I Hazardous Waste Injection Wells § 144.66 State assumption of responsibility. (a) If a State either assumes legal... 40 Protection of Environment 22 2010-07-01 2010-07-01 false State assumption of responsibility...

  9. 40 CFR 761.2 - PCB concentration assumptions for use.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false PCB concentration assumptions for use..., AND USE PROHIBITIONS General § 761.2 PCB concentration assumptions for use. (a)(1) Any person may..., oil-filled cable, and rectifiers whose PCB concentration is not established contain PCBs at < 50 ppm...

  10. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)

    We propose a class of finite state systems of synchronizing distributed processes, where processes make assumptions at local states about the state of other processes in the system. This constrains the global states of the system to those where assumptions made by a process about another are compatible with the ...

  11. Basic assumptions in statistical analyses of data in biomedical ...

    African Journals Online (AJOL)

    If one or more assumptions are violated, an alternative procedure must be used to obtain valid results. This article aims at highlighting some basic assumptions in statistical analyses of data in biomedical sciences. Keywords: samples, independence, non-parametric, parametric, statistical analyses. Int. J. Biol. Chem. Sci. Vol.
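    A typical instance of this practice is to test the normality assumption before a two-sample t-test and to fall back to a non-parametric alternative when it fails, as in the sketch below (synthetic data, conventional 5% threshold).

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    group_a = rng.lognormal(mean=0.0, sigma=0.6, size=40)   # skewed samples
    group_b = rng.lognormal(mean=0.3, sigma=0.6, size=40)

    def is_normal(sample, alpha=0.05):
        _, p = stats.shapiro(sample)   # Shapiro-Wilk normality test
        return p > alpha

    if is_normal(group_a) and is_normal(group_b):
        _, p = stats.ttest_ind(group_a, group_b)        # parametric test
        print(f"t-test p-value: {p:.4f}")
    else:
        _, p = stats.mannwhitneyu(group_a, group_b)     # non-parametric fallback
        print(f"Mann-Whitney U p-value (normality violated): {p:.4f}")
    ```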

  12. 29 CFR 1607.9 - No assumption of validity.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false No assumption of validity. 1607.9 Section 1607.9 Labor... EMPLOYEE SELECTION PROCEDURES (1978) General Principles § 1607.9 No assumption of validity. A. Unacceptable substitutes for evidence of validity. Under no circumstances will the general reputation of a test or other...

  13. Indoor Slope and Edge Detection by Using Two-Dimensional EKF-SLAM with Orthogonal Assumption

    Directory of Open Access Journals (Sweden)

    Jixin Lv

    2015-04-01

    Full Text Available In an indoor environment, slope and edge detection is an important problem in simultaneous localization and mapping (SLAM), which is a basic requirement for mobile robot autonomous navigation. Slope detection allows the robot to find areas that are more traversable while the edge detection can prevent the robot from falling. Three-dimensional (3D) solutions usually require a large memory and high computational costs. This study proposes an efficient two-dimensional (2D) solution to combine slope and edge detection with a line-segment-based extended Kalman filter SLAM (EKF-SLAM) in a structured indoor area. The robot is designed to use two fixed 2D laser range finders (LRFs) to perform horizontal and vertical scans. With the local area orthogonal assumption, the slope and edge are modelled into line segments swiftly from each vertical scan, and then are merged into the EKF-SLAM framework. The EKF-SLAM framework features an optional prediction model that can automatically decide whether the application of iterative closest point (ICP) is necessary to compensate for the dead reckoning error. The experimental results demonstrate that the proposed algorithm is capable of building an accurate 2D map swiftly, which contains crucial information on the edges and slopes.
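    To show the flavour of the EKF machinery involved, the sketch below runs one predict/update cycle for a planar robot pose against a known wall parameterized as (alpha, rho), i.e., the normal angle and distance commonly used for line features under an orthogonal assumption. This is a didactic skeleton only: the paper's system additionally extracts line segments from scans and augments the state with the mapped features.

    ```python
    import numpy as np

    def predict(x, P, v, omega, dt, Q):
        """Unicycle motion model; state x = [px, py, theta]."""
        px, py, th = x
        x_new = np.array([px + v * dt * np.cos(th),
                          py + v * dt * np.sin(th),
                          th + omega * dt])
        F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                      [0.0, 1.0,  v * dt * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        return x_new, F @ P @ F.T + Q

    def update_line(x, P, z, line, R):
        """Update with a wall (alpha, rho) observed in the robot frame."""
        alpha, rho = line
        px, py, th = x
        h = np.array([alpha - th,
                      rho - (px * np.cos(alpha) + py * np.sin(alpha))])
        H = np.array([[0.0, 0.0, -1.0],
                      [-np.cos(alpha), -np.sin(alpha), 0.0]])
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        return x + K @ (z - h), (np.eye(3) - K @ H) @ P

    x, P = np.zeros(3), np.eye(3) * 0.01
    Q, R = np.diag([1e-4, 1e-4, 1e-5]), np.diag([1e-3, 1e-3])
    x, P = predict(x, P, v=0.5, omega=0.1, dt=0.1, Q=Q)
    # Observe the wall y = 2 (alpha = pi/2, rho = 2) with slight noise.
    x, P = update_line(x, P, z=np.array([1.56, 1.99]), line=(np.pi / 2, 2.0), R=R)
    print("pose estimate:", np.round(x, 3))
    ```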

  14. On Early Conflict Identification by Requirements Modeling of Energy System Control Structures

    DEFF Research Database (Denmark)

    Heussen, Kai; Gehrke, Oliver; Niemann, Hans Henrik

    2015-01-01

    Control systems are purposeful systems involving goal-oriented information processing (cyber) and technical (physical) structures. Requirements modeling formalizes fundamental concepts and relations of a system architecture at a high-level design stage and can be used to identify potential design...... issues early. For requirements formulation of control structures, cyber and physical aspects need to be jointly represented to express interdependencies, check for consistency and discover potentially conflicting requirements. Early identification of potential conflicts may prevent larger problems...... modeling for early requirements checking using a suitable modeling language, and illustrates how this approach enables the identification of several classes of controller conflict....

  15. A semi-automated approach for generating natural language requirements documents based on business process models

    NARCIS (Netherlands)

    Aysolmaz, Banu; Leopold, Henrik; Reijers, Hajo A.; Demirörs, Onur

    2018-01-01

    Context: The analysis of requirements for business-related software systems is often supported by using business process models. However, the final requirements are typically still specified in natural language. This means that the knowledge captured in process models must be consistently

  16. Requirements-level semantics and model checking of object-oriented statecharts

    NARCIS (Netherlands)

    Eshuis, H.; Jansen, D.N.; Wieringa, Roelf J.

    2002-01-01

    In this paper we define a requirements-level execution semantics for object-oriented statecharts and show how properties of a system specified by these statecharts can be model checked using tool support for model checkers. Our execution semantics is requirements-level because it uses the perfect

  17. Lifelong Learning: Foundational Models, Underlying Assumptions and Critiques

    Science.gov (United States)

    Regmi, Kapil Dev

    2015-01-01

    Lifelong learning has become a catchword in almost all countries because of its growing influence on education policies in the globalised world. In the Organisation for Economic Cooperation and Development (OECD) and the European Union (EU), the promotion of lifelong learning has been a strategy to speed up economic growth and become competitive.…

  18. Assessing and relaxing assumptions in quasi-simplex models

    NARCIS (Netherlands)

    Lugtig, Peter|info:eu-repo/dai/nl/304824658; Cernat, Alexandru; Uhrig, Noah; Watson, Nicole

    2014-01-01

    Panel data (repeated measures of the same individuals) has become more and more popular in research as it has a number of unique advantages such as enabling researchers to answer questions about individual change and help deal (partially) with the issues linked to causality. But this type of data

  19. Stream of consciousness: Quantum and biochemical assumptions regarding psychopathology.

    Science.gov (United States)

    Tonello, Lucio; Cocchi, Massimo; Gabrielli, Fabio; Tuszynski, Jack A

    2017-04-01

    The accepted paradigms of mainstream neuropsychiatry appear to be incompletely adequate and in various cases offer equivocal analyses. However, a growing number of new approaches are being proposed that suggest the emergence of paradigm shifts in this area. In particular, quantum theories of mind, brain and consciousness seem to offer a profound change to the current approaches. Unfortunately these quantum paradigms harbor at least two serious problems. First, they are simply models, theories, and assumptions, with no convincing experiments supporting their claims. Second, they deviate from contemporary mainstream views of psychiatric illness and do so in revolutionary ways. We suggest a possible way to integrate experimental neuroscience with quantum models in order to address outstanding issues in psychopathology. A key role is played by the phenomenon called the "stream of consciousness", which can be linked to the so-called "Gamma Synchrony" (GS), which is clearly demonstrated by EEG data. In our novel proposal, a unipolar depressed patient could be seen as a subject with an altered stream of consciousness. In particular, some clues suggest that depression is linked to an "increased power" stream of consciousness. It is additionally suggested that such an approach to depression might be extended to psychopathology in general with potential benefits to diagnostics and therapeutics in neuropsychiatry. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  1. Weak convergence of Jacobian determinants under asymmetric assumptions

    Directory of Open Access Journals (Sweden)

    Teresa Alberico

    2012-05-01

    Full Text Available Let $\\Om$ be a bounded open set in $\\R^2$ sufficiently smooth and $f_k=(u_k,v_k$ and $f=(u,v$ mappings belong to the Sobolev space $W^{1,2}(\\Om,\\R^2$. We prove that if the sequence of Jacobians $J_{f_k}$ converges to a measure $\\mu$ in sense of measures andif one allows different assumptions on the two components of $f_k$ and $f$, e.g.$$u_k \\rightharpoonup u \\;\\;\\mbox{weakly in} \\;\\; W^{1,2}(\\Om \\qquad \\, v_k \\rightharpoonup v \\;\\;\\mbox{weakly in} \\;\\; W^{1,q}(\\Om$$for some $q\\in(1,2$, then\\begin{equation}\\label{0}d\\mu=J_f\\,dz.\\end{equation}Moreover, we show that this result is optimal in the sense that conclusion fails for $q=1$.On the other hand, we prove that \\eqref{0} remains valid also if one considers the case $q=1$, but it is necessary to require that $u_k$ weakly converges to $u$ in a Zygmund-Sobolev space with a slightly higher degree of regularity than $W^{1,2}(\\Om$ and precisely$$ u_k \\rightharpoonup u \\;\\;\\mbox{weakly in} \\;\\; W^{1,L^2 \\log^\\alpha L}(\\Om$$for some $\\alpha >1$.    

  2. Meta-Model and UML Profile for Requirements Management of Software and Embedded Systems

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2011-01-01

    Full Text Available Software and embedded system companies today encounter problems related to requirements management tool integration, incorrect tool usage, and lack of traceability. This is due to the use of tools with no clear meta-model and no semantics for communicating requirements between different stakeholders. This paper presents a comprehensive meta-model for requirements management. The focus is on the software and embedded system domains. The goal is to define generic requirements management domain concepts and abstract interfaces between requirements management and system development. This leads to a portable requirements management meta-model which can be adapted to various system modeling languages. The created meta-model is prototyped by translating it into a UML profile. The profile is imported into a UML tool which is used for rapid evaluation of the meta-model concepts in practice. The developed profile is associated with a proof-of-concept report generator tool that automatically produces up-to-date documentation from the models in the form of web pages. The profile is then used to create an example model of an embedded system requirement specification, built with the profile.

  3. Vocational Didactics: Core Assumptions and Approaches from Denmark, Germany, Norway, Spain and Sweden

    Science.gov (United States)

    Gessler, Michael; Moreno Herrera, Lázaro

    2015-01-01

    The design of vocational didactics has to meet special requirements. Six core assumptions are identified: outcome orientation, cultural-historical embedding, horizontal structure, vertical structure, temporal structure, and the changing nature of work. Different approaches and discussions from school-based systems (Spain and Sweden) and dual…

  4. Testing the Assumptions of Sequential Bifurcation for Factor Screening (revision of CentER DP 2015-034)

    NARCIS (Netherlands)

    Shi, Wen; Kleijnen, J.P.C.

    2017-01-01

    Sequential bifurcation (or SB) is an efficient and effective factor-screening method; i.e., SB quickly identifies the important factors (inputs) in experiments with simulation models that have very many factors—provided the SB assumptions are valid. The specific SB assumptions are: (i) a second-order
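    The core bisection idea is easy to sketch: test the aggregate effect of a factor group and recurse only into groups whose effect exceeds a threshold. The toy simulator below hides two important factors among eight; real SB reuses runs more economically and, per the assumptions above, relies on known effect signs and a (near) first-order metamodel.

    ```python
    import numpy as np

    TRUE_EFFECTS = np.array([0.0, 0.0, 4.0, 0.0, 0.0, 0.0, 2.5, 0.0])  # hidden

    def response(high):
        """Toy simulator: factors in `high` at level +1, the rest at -1."""
        return float(TRUE_EFFECTS @ np.where(high, 1.0, -1.0))

    def group_effect(lo, hi):
        """Aggregate first-order effect of factors lo..hi-1."""
        base = np.zeros(TRUE_EFFECTS.size, dtype=bool)
        toggled = base.copy()
        toggled[lo:hi] = True
        return (response(toggled) - response(base)) / 2.0

    def bifurcate(lo, hi, threshold=0.5, found=None):
        found = [] if found is None else found
        if group_effect(lo, hi) > threshold:   # assumes known (positive) signs
            if hi - lo == 1:
                found.append(lo)
            else:
                mid = (lo + hi) // 2
                bifurcate(lo, mid, threshold, found)
                bifurcate(mid, hi, threshold, found)
        return found

    print("important factors:", bifurcate(0, TRUE_EFFECTS.size))  # -> [2, 6]
    ```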

  5. MONITORED GEOLOGIC REPOSITORY LIFE CYCLE COST ESTIMATE ASSUMPTIONS DOCUMENT

    International Nuclear Information System (INIS)

    R.E. Sweeney

    2001-01-01

    The purpose of this assumptions document is to provide general scope, strategy, technical basis, schedule and cost assumptions for the Monitored Geologic Repository (MGR) life cycle cost (LCC) estimate and schedule update incorporating information from the Viability Assessment (VA), License Application Design Selection (LADS), 1999 Update to the Total System Life Cycle Cost (TSLCC) estimate and from other related and updated information. This document is intended to generally follow the assumptions outlined in the previous MGR cost estimates and as further prescribed by DOE guidance

  6. Monitored Geologic Repository Life Cycle Cost Estimate Assumptions Document

    International Nuclear Information System (INIS)

    Sweeney, R.

    2000-01-01

    The purpose of this assumptions document is to provide general scope, strategy, technical basis, schedule and cost assumptions for the Monitored Geologic Repository (MGR) life cycle cost estimate and schedule update incorporating information from the Viability Assessment (VA), License Application Design Selection (LADS), 1999 Update to the Total System Life Cycle Cost (TSLCC) estimate and from other related and updated information. This document is intended to generally follow the assumptions outlined in the previous MGR cost estimates and as further prescribed by DOE guidance

  7. Providing security assurance in line with national DBT assumptions

    Science.gov (United States)

    Bajramovic, Edita; Gupta, Deeksha

    2017-01-01

    As worldwide energy requirements increase, simultaneously with climate change and energy security considerations, States are considering building nuclear power plants to fulfill their electricity requirements and decrease their dependence on carbon fuels. New nuclear power plants (NPPs) must have comprehensive cybersecurity measures integrated into their design, structure, and processes. In the absence of effective cybersecurity measures, the impact of nuclear security incidents can be severe. Some of the current nuclear facilities were not specifically designed and constructed to deal with the new threats, including targeted cyberattacks. Thus, newcomer countries must consider the Design Basis Threat (DBT) as one of the security fundamentals during the design of physical and cyber protection systems for nuclear facilities. IAEA NSS 10 describes the DBT as a "comprehensive description of the motivation, intentions and capabilities of potential adversaries against which protection systems are designed and evaluated". Nowadays, many threat actors, including hacktivists, insider threats, cyber criminals, and state and non-state groups (terrorists), pose security risks to nuclear facilities. Threat assumptions are made on a national level; consequently, threat assessment closely affects the design structures of nuclear facilities. Recent security incidents, e.g., the Stuxnet worm (an advanced persistent threat) and the theft of sensitive information at a South Korean nuclear power plant (an insider threat), have shown that such attacks should be considered among the top threats to nuclear facilities. Therefore, the cybersecurity context is essential for the secure and safe use of nuclear power. In addition, States should include multiple DBT scenarios in order to protect various target materials, types of facilities, and adversary objectives. Development of a comprehensive DBT is a precondition for the establishment and further improvement of domestic state nuclear-related regulations in the

  8. A statistical test of the stability assumption inherent in empirical estimates of economic depreciation.

    Science.gov (United States)

    Shriver, K A

    1986-01-01

    Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that economic depreciation rates of decline may be reasonably stable over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.
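    The constant-rate assumption being tested corresponds to geometric decay of asset value, V_t = V_0 (1 - d)^t; the snippet below simply tabulates it for an illustrative price and rate.

    ```python
    V0, d = 100_000.0, 0.12   # purchase price and annual rate (illustrative)
    for t in range(6):
        print(f"year {t}: value = {V0 * (1 - d) ** t:,.0f}")
    ```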

  9. Quasi-experimental study designs series-paper 7: assessing the assumptions.

    Science.gov (United States)

    Bärnighausen, Till; Oldenburg, Catherine; Tugwell, Peter; Bommer, Christian; Ebert, Cara; Barreto, Mauricio; Djimeu, Eric; Haber, Noah; Waddington, Hugh; Rockers, Peter; Sianesi, Barbara; Bor, Jacob; Fink, Günther; Valentine, Jeffrey; Tanner, Jeffrey; Stanley, Tom; Sierra, Eduardo; Tchetgen, Eric Tchetgen; Atun, Rifat; Vollmer, Sebastian

    2017-09-01

    Quasi-experimental designs are gaining popularity in epidemiology and health systems research-in particular for the evaluation of health care practice, programs, and policy-because they allow strong causal inferences without randomized controlled experiments. We describe the concepts underlying five important quasi-experimental designs: Instrumental Variables, Regression Discontinuity, Interrupted Time Series, Fixed Effects, and Difference-in-Differences designs. We illustrate each of the designs with an example from health research. We then describe the assumptions required for each of the designs to ensure valid causal inference and discuss the tests available to examine the assumptions. Copyright © 2017 Elsevier Inc. All rights reserved.
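    As a concrete instance of one of these designs, the sketch below computes a difference-in-differences estimate on synthetic data with a built-in treatment effect of 2.0; its key identifying assumption is parallel trends in the absence of treatment.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    treated_pre  = rng.normal(10.0, 1.0, 200)
    treated_post = rng.normal(13.0, 1.0, 200)  # time trend (+1) + effect (+2)
    control_pre  = rng.normal(9.0, 1.0, 200)
    control_post = rng.normal(10.0, 1.0, 200)  # time trend (+1) only

    did = ((treated_post.mean() - treated_pre.mean())
           - (control_post.mean() - control_pre.mean()))
    print(f"DiD estimate of the treatment effect: {did:.2f}")
    ```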

  10. Impacts of cloud overlap assumptions on radiative budgets and heating fields in convective regions

    Science.gov (United States)

    Wang, XiaoCong; Liu, YiMin; Bao, Qing

    2016-01-01

    Impacts of cloud overlap assumptions on radiative budgets and heating fields are explored with the aid of a cloud-resolving model (CRM), which provides cloud geometry as well as cloud micro and macro properties. The large-scale forcing data driving the CRM are from the TRMM Kwajalein Experiment and the Global Atmospheric Research Program's Atlantic Tropical Experiment field campaigns, during which abundant convective systems were observed. The investigated overlap assumptions include those that were traditional and widely used in the past and the one recently addressed by Hogan and Illingworth (2000), in which the vertically projected cloud fraction is expressed by a linear combination of maximum and random overlap, with the weighting coefficient depending on the so-called decorrelation length Lcf. Results show that both shortwave and longwave cloud radiative forcings (SWCF/LWCF) are significantly underestimated under the maximum (MO) and maximum-random (MRO) overlap assumptions, and remarkably overestimated under the random overlap (RO) assumption, in comparison with results using the CRM's inherent cloud geometry. These biases can reach as high as 100 W m−2 for SWCF and 60 W m−2 for LWCF. By its very nature, the general overlap (GenO) assumption exhibits an encouraging performance in both SWCF and LWCF simulations, with the biases reduced almost 3-fold compared with the traditional overlap assumptions. The superiority of the GenO assumption is also manifested in the simulation of shortwave and longwave radiative heating fields, which are either significantly overestimated or underestimated under the traditional overlap assumptions. The study also points out the deficiency of assuming a constant Lcf in the GenO formulation. Further examination indicates that the CRM-diagnosed Lcf varies among different cloud types and tends to be stratified in the vertical. The new parameterization that takes into account the variation of Lcf in the vertical well reproduces such a relationship and
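    The general overlap formulation is compact enough to sketch: for two layers separated by dz, the combined cover blends maximum and random overlap with weight alpha = exp(-dz / Lcf). The fractions and lengths below are illustrative.

    ```python
    import numpy as np

    def combined_cover(c1, c2, dz, lcf):
        """General overlap of Hogan and Illingworth (2000) for two layers."""
        c_max = max(c1, c2)             # maximum overlap limit
        c_rand = c1 + c2 - c1 * c2      # random overlap limit
        alpha = np.exp(-dz / lcf)       # decorrelation weighting
        return alpha * c_max + (1.0 - alpha) * c_rand

    c1, c2 = 0.4, 0.5                   # layer cloud fractions (illustrative)
    for lcf in (0.5, 2.0, 10.0):        # decorrelation lengths in km (illustrative)
        cover = combined_cover(c1, c2, dz=1.0, lcf=lcf)
        print(f"Lcf = {lcf:>4.1f} km -> total cover {cover:.3f}")
    ```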

  11. Linking nitrogen deposition to nitrate concentrations in groundwater below nature areas : modelling approach and data requirements

    NARCIS (Netherlands)

    Bonten, L.T.C.; Mol-Dijkstra, J.P.; Wieggers, H.J.J.; Vries, de W.; Pul, van W.A.J.; Hoek, van den K.W.

    2009-01-01

    This study determines the most suitable model and required model improvements to link atmospheric deposition of nitrogen and other elements in the Netherlands to measurements of nitrogen and other elements in the upper groundwater. The deterministic model SMARTml was found to be the most suitable

  12. A framework for the organizational assumptions underlying safety culture

    International Nuclear Information System (INIS)

    Packer, Charles

    2002-01-01

    The safety culture of the nuclear organization can be addressed at the three levels of culture proposed by Edgar Schein. The industry literature provides a great deal of insight at the artefact and espoused value levels, although as yet it remains somewhat disorganized. There is, however, an overall lack of understanding of the assumption level of safety culture. This paper describes a possible framework for conceptualizing the assumption level, suggesting that safety culture is grounded in unconscious beliefs about the nature of the safety problem, its solution and how to organize to achieve the solution. Using this framework, the organization can begin to uncover the assumptions at play in its normal operation, decisions and events and, if necessary, engage in a process to shift them towards assumptions more supportive of a strong safety culture. (author)

  13. Different Random Distributions Research on Logistic-Based Sample Assumption

    Directory of Open Access Journals (Sweden)

    Jing Pan

    2014-01-01

    Full Text Available A logistic-based sample assumption is proposed in this paper, and different random distributions are studied through this system. The paper provides an assumption system for logistic-based samples, including its sample space structure. Moreover, the influence of different random distributions of the inputs has been studied through this logistic-based sample assumption system. Three different random distributions (normal, uniform, and beta) are used for the test. The experimental simulations illustrate the relationship between inputs and outputs under the different random distributions. Numerical analysis then infers that the distribution of the outputs depends to some extent on that of the inputs, and that this assumption system is not an independent-increment process but is quasistationary.

  14. Supporting calculations and assumptions for use in WESF safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hey, B.E.

    1997-03-07

    This document provides a single location for calculations and assumptions used in support of Waste Encapsulation and Storage Facility (WESF) safety analyses. It also provides the technical details and bases necessary to justify the contained results.

  15. How much confidence do we need in animal experiments? Statistical assumptions in sample size estimation.

    Science.gov (United States)

    Richter, Veronika; Muche, Rainer; Mayer, Benjamin

    2018-01-26

    Statistical sample size calculation is a crucial part of planning nonhuman animal experiments in basic medical research. The 3R principle aims to reduce the number of animals to a sufficient minimum. When planning experiments, one may consider the impact of less rigorous assumptions during sample size determination, as this might result in a considerable reduction in the number of required animals. Sample size calculations conducted for 111 biometrical reports were repeated. The original effect size assumptions remained unchanged, but the basic properties (type 1 error 5%, two-sided hypothesis, 80% power) were varied. The analyses showed that a less rigorous assumption on the type 1 error level (one-sided 5% instead of two-sided 5%) was associated with a savings potential of 14% with respect to the original number of required animals. Animal experiments are predominantly exploratory studies. In light of the demonstrated potential reduction in the number of required animals, researchers should discuss whether less rigorous assumptions during sample size calculation may be reasonable for the purpose of optimizing the number of animals in experiments according to the 3R principle.
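    The one-sided versus two-sided comparison can be reproduced with a standard power calculation, as in the sketch below; the exact savings depends on the assumed effect size and power, so it will not match the paper's 14% average exactly.

    ```python
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    n_two = analysis.solve_power(effect_size=0.8, alpha=0.05, power=0.8,
                                 alternative="two-sided")
    n_one = analysis.solve_power(effect_size=0.8, alpha=0.05, power=0.8,
                                 alternative="larger")   # one-sided test
    print(f"two-sided: {n_two:.1f} animals per group; one-sided: {n_one:.1f}")
    print(f"reduction: {100 * (1 - n_one / n_two):.0f}%")
    ```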

  16. Operation Cottage: A Cautionary Tale of Assumption and Perceptual Bias

    Science.gov (United States)

    2015-01-01

    but they can also set a lethal trap for unsuspecting mission planners, decisionmakers, and intelligence analysts. Assumptions are extremely...the planning process, but the planning staff must not become so wedded to their assumptions that they reject or overlook information that is not in...operations specialist who had served as principal planner for the Attu invasion. Major General Charles Corlett was to command the landing force, an

  17. Developing a Predictive Model for Unscheduled Maintenance Requirements on United States Air Force Installations

    National Research Council Canada - National Science Library

    Kovich, Matthew D; Norton, J. D

    2008-01-01

    .... This paper develops one such method, using linear regression and time series analysis to build a predictive model that forecasts future-year man-hour and funding requirements for unscheduled maintenance...

  18. Personal recommender systems for learners in lifelong learning: requirements, techniques and model

    NARCIS (Netherlands)

    Drachsler, Hendrik; Hummel, Hans; Koper, Rob

    2007-01-01

    Drachsler, H., Hummel, H. G. K., & Koper, R. (2008). Personal recommender systems for learners in lifelong learning: requirements, techniques and model. International Journal of Learning Technology, 3(4), 404-423.

  19. Towards a framework for improving goal-oriented requirement models quality

    OpenAIRE

    Cares, Carlos; Franch Gutiérrez, Javier

    2009-01-01

    Goal-orientation is a widespread and useful approach to Requirements Engineering. However, quality assessment frameworks focused on goal-oriented processes are either limited or remain on the theoretical side. Requirements quality initiatives range from simple metrics applicable to requirements documents, to general-purpose quality frameworks that include syntactic, semantic and pragmatic concerns. In some recent works, we have proposed a metrics framework for goal-oriented models, b...

  20. A manpower training requirements model for new weapons systems, with applications to the infantry fighting vehicle

    OpenAIRE

    Kenehan, Douglas J.

    1981-01-01

    Approved for public release; distribution is unlimited. This thesis documents the methodology and parameters used in designing a manpower training requirements model for new weapons systems. This model provides manpower planners with the capability of testing alternative fielding policies and adjusting model parameters to improve the use of limited personnel resources. Use of the model is illustrated in a detailed analysis of the planned introduction of the Infantry Fighting Vehicle into t...

  1. Fuel requirements for experimental devices in MTR reactors. A perturbation model for reactor core analysis

    International Nuclear Information System (INIS)

    Beeckmans de West-Meerbeeck, A.

    1991-01-01

    Irradiation in neutron absorbing devices, requiring high fast neutron fluxes in the core or high thermal fluxes in the reflector and flux traps, leads to higher-density fuel and larger core dimensions. A perturbation model of the reactor core helps to estimate the fuel requirements. (orig.)

  2. Model-Based Requirements Analysis for Reactive Systems with UML Sequence Diagrams and Coloured Petri Nets

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe a formal foundation for a specialized approach to automatically checking traces against real-time requirements. The traces are obtained from simulation of Coloured Petri Net (CPN) models of reactive systems. The real-time requirements are expressed in terms of a derivat...
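
    The paper's approach is CPN-based; purely as a generic illustration of checking a simulated trace against a real-time requirement, the sketch below verifies a bounded-response property over timestamped events. The event names, trace, and deadline are invented for the example.

        # Check a timestamped trace against a bounded-response requirement:
        # every 'request' must be followed by a 'response' within `deadline`.
        def satisfies_bounded_response(trace, deadline):
            pending = []  # timestamps of requests still awaiting a response
            for t, event in trace:
                if event == "request":
                    pending.append(t)
                elif event == "response" and pending:
                    if t - pending.pop(0) > deadline:
                        return False
            return not pending  # unanswered requests also violate the requirement

        trace = [(0.0, "request"), (1.2, "response"), (2.0, "request"), (4.5, "response")]
        print(satisfies_bounded_response(trace, deadline=2.0))  # False: 4.5 - 2.0 > 2.0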

  3. The Nuremberg Code subverts human health and safety by requiring animal modeling

    OpenAIRE

    Greek, Ray; Pippus, Annalea; Hansen, Lawrence A

    2012-01-01

    Background: The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion: We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive...

  4. State of the Art : Integrated Management of Requirements in Model-Based Software Engineering

    OpenAIRE

    Thörn, Christer

    2006-01-01

    This report describes the background and future of research concerning integrated management of requirements in model-based software engineering. The focus is on describing the relevant topics and existing theoretical backgrounds that form the basis for the research. The report describes the fundamental difficulties of requirements engineering for software projects, and proposes that the results and methods of models in software engineering can help leverage those problems. Taking inspiration...

  5. A Hybrid Parallel Execution Model for Logic Based Requirement Specifications (Invited Paper

    Directory of Open Access Journals (Sweden)

    Jeffrey J. P. Tsai

    1999-05-01

    It is well known that undiscovered errors in a requirements specification are extremely expensive to fix when discovered in the software maintenance phase. Errors in the requirements phase can be reduced through validation and verification of the requirements specification, and many logic-based requirements specification languages have been developed to achieve these goals. However, executing and reasoning over a logic-based requirements specification can be very slow, and an effective way to improve performance is to do so in parallel. In this paper, we present a hybrid model to facilitate the parallel execution of a logic-based requirements specification language. A data dependency analysis technique is first applied to the logic-based specification to find all the mode combinations that exist within a specification clause. This mode information is used to support a novel hybrid parallel execution model, which combines both top-down and bottom-up evaluation strategies. The new execution model can find the failure in the deepest node of the search tree at an early stage of the evaluation, and can therefore reduce the total number of nodes searched in the tree, the total number of processes that need to be generated, and the total number of communication channels needed in the search process. A simulator has been implemented to analyze the execution behavior of the new model, and experiments show significant improvement on several criteria.
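
    As a small illustration of one of the two strategies the hybrid model combines, the following sketch performs bottom-up (forward-chaining) evaluation of ground Horn clauses to a fixpoint; the facts and rules are invented, and the paper's model additionally interleaves top-down search guided by mode information.

        # Bottom-up (forward-chaining) evaluation of ground Horn clauses to a
        # least fixpoint; rules are (set_of_body_atoms, head_atom) pairs.
        def bottom_up(facts, rules):
            known = set(facts)
            changed = True
            while changed:
                changed = False
                for body, head in rules:
                    if head not in known and body <= known:
                        known.add(head)
                        changed = True
            return known

        facts = {"mode(x,in)", "mode(y,out)"}
        rules = [({"mode(x,in)", "mode(y,out)"}, "executable(clause1)"),
                 ({"executable(clause1)"}, "schedulable(clause1)")]
        print(sorted(bottom_up(facts, rules)))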

  6. Revealing patterns of cultural transmission from frequency data: equilibrium and non-equilibrium assumptions

    Science.gov (United States)

    Crema, Enrico R.; Kandler, Anne; Shennan, Stephen

    2016-12-01

    A long tradition of cultural evolutionary studies has developed a rich repertoire of mathematical models of social learning. Early studies have laid the foundation of more recent endeavours to infer patterns of cultural transmission from observed frequencies of a variety of cultural data, from decorative motifs on potsherds to baby names and musical preferences. While this wide range of applications provides an opportunity for the development of generalisable analytical workflows, archaeological data present new questions and challenges that require further methodological and theoretical discussion. Here we examine the decorative motifs of Neolithic pottery from an archaeological assemblage in Western Germany, and argue that the widely used (and relatively undiscussed) assumption that observed frequencies are the result of a system in equilibrium conditions is unwarranted, and can lead to incorrect conclusions. We analyse our data with a simulation-based inferential framework that can overcome some of the intrinsic limitations in archaeological data, as well as handle both equilibrium conditions and instances where the mode of cultural transmission is time-variant. Results suggest that none of the models examined can produce the observed pattern under equilibrium conditions, and suggest, instead, temporal shifts in the patterns of cultural transmission.
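
    For readers unfamiliar with such simulation-based frameworks, the sketch below implements a minimal unbiased-transmission (neutral copying with innovation) model of the kind commonly fitted to cultural frequency data; population size, innovation rate, and run length are illustrative assumptions rather than values from the study.

        import random

        # Unbiased transmission (neutral copying with innovation); parameters
        # are illustrative, not estimates from the Neolithic assemblage.
        def neutral_model(pop_size=200, mu=0.01, generations=100, seed=1):
            random.seed(seed)
            population = [0] * pop_size  # everyone starts with variant 0
            next_variant = 1
            for _ in range(generations):
                new = []
                for _ in range(pop_size):
                    if random.random() < mu:  # innovation: a brand-new variant
                        new.append(next_variant)
                        next_variant += 1
                    else:  # copy a random member of the previous generation
                        new.append(random.choice(population))
                population = new
            return population

        pop = neutral_model()
        freqs = sorted((pop.count(v) for v in set(pop)), reverse=True)
        print(freqs)  # observed variant frequency spectrum after 100 generations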

  7. Controversies in psychotherapy research: epistemic differences in assumptions about human psychology.

    Science.gov (United States)

    Shean, Glenn D

    2013-01-01

    It is the thesis of this paper that differences in philosophical assumptions about the subject matter and treatment methods of psychotherapy have contributed to disagreements about the external validity of empirically supported therapies (ESTs). These differences are evident in the theories that are the basis for both the design and interpretation of recent psychotherapy efficacy studies. The natural science model, as applied to psychotherapy outcome research, transforms the constitutive features of the study subject in a reciprocal manner, so that problems, treatments, and indicators of effectiveness are limited to what can be directly observed. Meaning-based approaches to therapy emphasize processes and changes that do not lend themselves to experimental study. Hermeneutic philosophy provides a supplemental model for establishing validity in those instances where outcome indicators do not lend themselves to direct observation and measurement and require "deep" interpretation. Hermeneutics allows for a broadening of psychological study that makes it possible to establish a form of validity applicable when constructs do not refer to things that literally "exist" in nature. From a hermeneutic perspective, the changes that occur in meaning-based therapies must be understood and evaluated in terms of the manner in which they are applied to new situations, the logical ordering and harmony of the parts with the theoretical whole, and their capability of convincing experts and patients that the interpretation can stand up against other ways of understanding. Adoption of this approach is often necessary to competently evaluate the effectiveness of meaning-based therapies.

  8. Estimating required information size by quantifying diversity in random-effects model meta-analyses

    DEFF Research Database (Denmark)

    Wetterslev, Jørn; Thorlund, Kristian; Brok, Jesper

    2009-01-01

    BACKGROUND: There is increasing awareness that meta-analyses require a sufficiently large information size to detect or reject an anticipated intervention effect. The required information size in a meta-analysis may be calculated from an anticipated a priori intervention effect or from...... an intervention effect suggested by trials with low risk of bias. METHODS: Information size calculations need to consider the total model variance in a meta-analysis to control type I and type II errors. Here, we derive an adjusting factor for the required information size under any random-effects model meta......-trial variability and a sampling error estimate considering the required information size. D² is different from the intuitively obvious adjusting factor based on the common quantification of heterogeneity, the inconsistency (I²), which may underestimate the required information size. Thus, D² and I² are compared...
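
    A hedged sketch of the adjustment follows, assuming the paper's definition of diversity as D² = (vR − vF)/vR, with vF and vR the variances of the pooled estimate under the fixed-effect and random-effects models, and the adjusted information size as RIS/(1 − D²); the numbers below are illustrative only.

        # Diversity-adjusted required information size (assumed definitions:
        # D^2 = (vR - vF) / vR; adjusted size = RIS_fixed / (1 - D^2)).
        def diversity_adjusted_ris(ris_fixed, var_fixed, var_random):
            d2 = (var_random - var_fixed) / var_random
            return d2, ris_fixed / (1 - d2)

        # Illustrative numbers only.
        d2, ris = diversity_adjusted_ris(ris_fixed=2000, var_fixed=0.004, var_random=0.010)
        print(f"D^2 = {d2:.0%}, diversity-adjusted required information size = {ris:.0f}")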

  9. A Functional Model of Quality Assurance for Psychiatric Hospitals and Corresponding Staffing Requirements.

    Science.gov (United States)

    Kamis-Gould, Edna; And Others

    1991-01-01

    A model for quality assurance (QA) in psychiatric hospitals is described. Its functions (general QA, utilization review, clinical records, evaluation, management information systems, risk management, and infection control), subfunctions, and corresponding staffing requirements are reviewed. This model was designed to foster standardization in QA…

  10. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on

  11. Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    NARCIS (Netherlands)

    Di Nicolo, G.; Gamba, A.; Lucchetta, M.

    2011-01-01

    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on

  12. Knowledge Based Characterization of Cross-Models Constraints to Check Design and Modeling Requirements

    Science.gov (United States)

    Simonn Zayas, David; Monceaux, Anne; Ait-Ameur, Yamine

    2011-08-01

    Nowadays, the complexity of systems frequently implies different engineering teams handling various descriptive models. With each team having its own expertise background, domain knowledge and modeling practices, the heterogeneity of the models themselves is a logical consequence. Therefore, even when the models are individually well managed, their diversity becomes a problem when engineers need to share them to perform overall validations. One way of reducing this heterogeneity is to take into consideration the implicit knowledge which is not contained in the models but is essential to understanding them. In a first stage of our research, we defined and implemented an approach recommending the formalization of implicit knowledge to enrich models in order to ease cross-model checks. Nevertheless, to fill the gap between the specification of the system and the validation of a cross-model constraint, in this paper we suggest giving values to some relevant characteristics to reinforce the approach.

  13. Input data requirements for performance modelling and monitoring of photovoltaic plants

    DEFF Research Database (Denmark)

    Gavriluta, Anamaria Florina; Spataru, Sergiu; Sera, Dezso

    2018-01-01

    This work investigates the input data requirements in the context of performance modeling of thin-film photovoltaic (PV) systems. The analysis focuses on the PVWatts performance model, well suited for on-line performance monitoring of PV strings, due to its low number of parameters and high...... accuracy. The work aims to identify the minimum amount of input data required for parameterizing an accurate model of the PV plant. The analysis was carried out for both amorphous silicon (a-Si) and cadmium telluride (CdTe), using crystalline silicon (c-Si) as a base for comparison. In the studied cases...
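
    For context, the PVWatts DC power equation at the heart of the model takes the form sketched below (the NREL PVWatts v5 form); the nameplate rating and temperature coefficient in this sketch are illustrative values, not parameters estimated in the study.

        # PVWatts v5 DC power equation; p_dc0 and gamma below are illustrative.
        def pvwatts_dc(g_poa, t_cell, p_dc0=5000.0, gamma=-0.0035, t_ref=25.0):
            # g_poa: plane-of-array irradiance [W/m^2]; t_cell: cell temperature [C]
            # p_dc0: nameplate DC rating [W]; gamma: power temperature coefficient [1/C]
            return (g_poa / 1000.0) * p_dc0 * (1.0 + gamma * (t_cell - t_ref))

        print(pvwatts_dc(g_poa=800.0, t_cell=45.0))  # 3720.0 W for this illustrative string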

  14. The Nuremberg Code subverts human health and safety by requiring animal modeling

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-07-01

    Background: The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. Discussion: We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. Summary: We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  15. The Nuremberg Code subverts human health and safety by requiring animal modeling.

    Science.gov (United States)

    Greek, Ray; Pippus, Annalea; Hansen, Lawrence A

    2012-07-08

    The requirement that animals be used in research and testing in order to protect humans was formalized in the Nuremberg Code and subsequent national and international laws, codes, and declarations. We review the history of these requirements and contrast what was known via science about animal models then with what is known now. We further analyze the predictive value of animal models when used as test subjects for human response to drugs and disease. We explore the use of animals for models in toxicity testing as an example of the problem with using animal models. We conclude that the requirements for animal testing found in the Nuremberg Code were based on scientifically outdated principles, compromised by people with a vested interest in animal experimentation, serve no useful function, increase the cost of drug development, and prevent otherwise safe and efficacious drugs and therapies from being implemented.

  16. Workshop on Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials

    Energy Technology Data Exchange (ETDEWEB)

    Giles, GE

    2005-02-03

    The purpose of this Workshop on "Functional Requirements for the Modeling of Fate and Transport of Waterborne CBRN Materials" was to solicit functional requirements for tools that help Incident Managers plan for and deal with the consequences of industrial or terrorist releases of materials into the nation's waterways and public water utilities. Twenty representatives attended and several made presentations. Several hours of discussion elicited a set of requirements, which were summarized in a form on which the attendees voted for their highest-priority items. These votes were used to determine the prioritized requirements reported in this paper, which can be used to direct future developments.

  17. Connecting Requirements to Architecture and Analysis via Model-Based Systems Engineering

    Science.gov (United States)

    Cole, Bjorn F.; Jenkins, J. Steven

    2015-01-01

    In traditional systems engineering practice, architecture, concept development, and requirements development are related but still separate activities. Concepts for operation, key technical approaches, and related proofs of concept are developed. These inform the formulation of an architecture at multiple levels, starting with the overall system composition and functionality and progressing into more detail. As this formulation is done, a parallel activity develops a set of English statements that constrain solutions. These requirements are often called "shall statements" since they are formulated to use "shall." The separation of requirements from design is exacerbated by well-meaning tools like the Dynamic Object-Oriented Requirements System (DOORS) that remain separate from engineering design tools. With the Europa Clipper project, efforts are being made to change the requirements development approach from a separate activity to one intimately embedded in the formulation effort. This paper presents a modeling approach and related tooling to generate English requirement statements from constraints embedded in architecture definition.
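
    A hypothetical sketch of the core idea follows: a "shall" statement generated from a constraint held in a model, so that prose and design stay consistent. The data structure and wording template are assumptions for illustration, not the project's actual tooling.

        from dataclasses import dataclass

        @dataclass
        class Constraint:
            subject: str    # the architectural element the constraint binds
            quantity: str   # the constrained property
            relation: str   # e.g. "no more than", "at least"
            value: float
            unit: str

        def to_shall_statement(c: Constraint) -> str:
            return (f"The {c.subject} shall provide a {c.quantity} "
                    f"of {c.relation} {c.value} {c.unit}.")

        c = Constraint("telecom subsystem", "downlink data rate", "at least", 50, "kbps")
        print(to_shall_statement(c))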

  18. Non-monotonic modelling from initial requirements: a proposal and comparison with monotonic modelling methods

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wupper, H.; Wieringa, Roelf J.

    2008-01-01

    Researchers make a significant effort to develop new modelling languages and tools. However, they spend less effort developing methods for constructing models using these languages and tools. We are developing a method for building an embedded system model for formal verification. Our method

  19. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    DISCOURSES AND THEORETICAL ASSUMPTIONS IN IT PROJECT PORTFOLIO MANAGEMENT: A REVIEW OF THE LITERATURE These years increasing interest is put on IT project portfolio management (IT PPM). Considering IT PPM an interdisciplinary practice, we conduct a concept-based literature review of relevant...... to articulate and discuss underlying and conflicting assumptions in IT PPM, serving as a basis for adjusting organizations’ IT PPM practices. Keywords: IT project portfolio management or IT PPM, literature review, scientific discourses, underlying assumptions, unintended consequences, epistemological biases......: (1) IT PPM as the top management marketplace, (2) IT PPM as the cause of social dilemmas at the lower organizational levels (3) IT PPM as polity between different organizational interests, (4) IT PPM as power relations that suppress creativity and diversity. Our metaphors can be used by practitioners...

  20. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    articles across various research disciplines. We find and classify a stock of 107 relevant articles into four scientific discourses: the normative, the interpretive, the critical, and the dialogical discourses, as formulated by Deetz (1996). We find that the normative discourse dominates the IT PPM...... to articulate and discuss underlying and conflicting assumptions in IT PPM, serving as a basis for adjusting organizations’ IT PPM practices. Keywords: IT project portfolio management or IT PPM, literature review, scientific discourses, underlying assumptions, unintended consequences, epistemological biases......DISCOURSES AND THEORETICAL ASSUMPTIONS IN IT PROJECT PORTFOLIO MANAGEMENT: A REVIEW OF THE LITERATURE These years increasing interest is put on IT project portfolio management (IT PPM). Considering IT PPM an interdisciplinary practice, we conduct a concept-based literature review of relevant...

  1. Model-Based Requirements Management in Gear Systems Design Based On Graph-Based Design Languages

    Directory of Open Access Journals (Sweden)

    Kevin Holder

    2017-10-01

    For several decades, a wide-spread consensus concerning the enormous importance of an in-depth clarification of the specifications of a product has been observed. A weak clarification of specifications is repeatedly listed as a main cause for the failure of product development projects. Requirements, which can be defined as the purpose, goals, constraints, and criteria associated with a product development project, play a central role in the clarification of specifications. The collection of activities which ensure that requirements are identified, documented, maintained, communicated, and traced throughout the life cycle of a system, product, or service can be referred to as "requirements engineering". These activities can be supported by a collection and combination of strategies, methods, and tools which are appropriate for the clarification of specifications. Numerous publications describe the strategy and the components of requirements management. Furthermore, recent research investigates its industrial application. Simultaneously, promising developments of graph-based design languages for a holistic digital representation of the product life cycle are presented. Current developments realize graph-based languages by the diagrams of the Unified Modelling Language (UML), and allow the automatic generation and evaluation of multiple product variants. The research presented in this paper seeks to present a method in order to combine the advantages of a conscious requirements management process and graph-based design languages. Consequently, the main objective of this paper is the investigation of a model-based integration of requirements in a product development process by means of graph-based design languages. The research method is based on an in-depth analysis of an exemplary industrial product development, a gear system for so-called "Electrical Multiple Units" (EMU). Important requirements were abstracted from a gear system

  2. Changing Assumptions and Progressive Change in Theories of Strategic Organization

    DEFF Research Database (Denmark)

    Foss, Nicolai J.; Hallberg, Niklas L.

    2017-01-01

    A commonly held view is that strategic organization theories progress as a result of a Popperian process of bold conjectures and systematic refutations. However, our field also witnesses vibrant debates or disputes about the specific assumptions that our theories rely on, and although these debates...... are often decoupled from the results of empirical testing, changes in assumptions seem closely intertwined with theoretical progress. Using the case of the resource-based view, we suggest that progressive change in theories of strategic organization may come about as a result of scholarly debate and dispute...

  3. Requirements for psychological models to support design: Towards ecological task analysis

    Science.gov (United States)

    Kirlik, Alex

    1991-01-01

    Cognitive engineering is largely concerned with creating environmental designs to support skillful and effective human activity. A set of necessary conditions is proposed for psychological models capable of supporting this enterprise. An analysis of the psychological nature of the design product is used to identify a set of constraints that models must meet if they are to usefully guide design. It is concluded that cognitive engineering requires models with resources for describing the integrated human-environment system, and that these models must be capable of describing the activities underlying fluent and effective interaction. These features are required in order to predict the cognitive activity that will be required given various design concepts, and to design systems that promote the acquisition of fluent, skilled behavior. These necessary conditions suggest that an ecological approach can provide valuable resources for psychological modeling to support design. Relying heavily on concepts from Brunswik's and Gibson's ecological theories, ecological task analysis is proposed as a framework in which to predict the types of cognitive activity required to achieve productive behavior, and to suggest how interfaces can be manipulated to alleviate certain types of cognitive demands. The framework is described in general terms and illustrated with an example from previous research on modeling skilled human-environment interaction.

  4. Does Artificial Neural Network Support Connectivism's Assumptions?

    Science.gov (United States)

    AlDahdouh, Alaa A.

    2017-01-01

    Connectivism was presented as a learning theory for the digital age and connectivists claim that recent developments in Artificial Intelligence (AI) and, more specifically, Artificial Neural Network (ANN) support their assumptions of knowledge connectivity. Yet, very little has been done to investigate this brave allegation. Does the advancement…

  5. Exploring five common assumptions on Attention Deficit Hyperactivity Disorder

    NARCIS (Netherlands)

    Batstra, Laura; Nieweg, Edo H.; Hadders-Algra, Mijna

    The number of children diagnosed with attention deficit hyperactivity disorder (ADHD) and treated with medication is steadily increasing. The aim of this paper was to critically discuss five debatable assumptions on ADHD that may explain these trends to some extent. These are that ADHD (i) causes

  6. Judgment: Deductive Logic and Assumption Recognition: Grades 7-12.

    Science.gov (United States)

    Instructional Objectives Exchange, Los Angeles, CA.

    This collection of objectives and related measures deals with one side of judgment: deductive logic and assumption recognition. They are suggestive of students' ability to make judgments based on logical analysis rather than comprehensive indices of overall capacity for judgment. They include Conditional Reasoning Index, Class Reasoning Index,…

  7. Child Development Knowledge and Teacher Preparation: Confronting Assumptions.

    Science.gov (United States)

    Katz, Lilian G.

    This paper questions the widely held assumption that acquiring knowledge of child development is an essential part of teacher preparation and teaching competence, especially among teachers of young children. After discussing the influence of culture, parenting style, and teaching style on developmental expectations and outcomes, the paper asserts…

  8. Observing gravitational-wave transient GW150914 with minimal assumptions

    NARCIS (Netherlands)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Phythian-Adams, A.T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwa, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. C.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, R.D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, M.J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blackburn, L.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, A.L.S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, J.G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, T.C; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brocki, P.; Brooks, A. F.; Brown, A.D.; Brown, D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderon Bustillo, J.; Callister, T. A.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Diaz, J. Casanueva; Casentini, C.; Caudill, S.; Cavaglia, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Baiardi, L. Cerboni; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chatterji, S.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, D. S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Qian; Chua, S. E.; Chung, E.S.; Ciani, G.; Clara, F.; Clark, J. A.; Clark, M.; Cleva, F.; Coccia, E.; Cohadon, P. -F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, A.C.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J. -P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, A.L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; Debra, D.; Debreczeni, G.; Degallaix, J.; De laurentis, M.; Deleglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.A.; DeRosa, R. T.; Rosa, R.; DeSalvo, R.; Dhurandhar, S.; Diaz, M. C.; Di Fiore, L.; Giovanni, M.G.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H. 
-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, T. M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.M.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. R.; Flaminio, R.; Fletcher, M; Fournier, J. -D.; Franco, S; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritsche, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.P.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; Gonzalez, Idelmis G.; Castro, J. M. Gonzalez; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Lee-Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.M.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; de Haas, R.; Hacker, J. J.; Buffoni-Hall, R.; Hall, E. D.; Hammond, G.L.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, P.J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C. -J.; Haughian, K.; Healy, J.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hinder, I.; Hoak, D.; Hodge, K. A.; Hofman, D.; Hollitt, S. E.; Holt, K.; Holz, D. E.; Hopkins, P.; Hosken, D. J.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Idrisy, A.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J. -M.; Isi, M.; Islas, G.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, D.H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jimenez-Forteza, F.; Johnson, W.; Jones, I.D.; Jones, R.; Jonker, R. J. G.; Ju, L.; Haris, K.; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.H.; Kanner, J. B.; Karki, S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kawazoe, F.; Kefelian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.E.; Key, J. S.; Khalaidovski, A.; Khalili, F. Y.; Khan, I.; Khan., S.; Khan, Z.; Khazanov, E. A.; Kijhunchoo, N.; Kim, C.; Kim, J.; Kim, K.; Kim, Nam-Gyu; Kim, Namjun; Kim, Y.M.; King, E. J.; King, P. J.; Kinsey, M.; Kinzel, D. L.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Kokeyama, K.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krolak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Laguna, P.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Lazzarini, A.; Lazzaro, C.; Leaci, R.; Leavey, S.; Lebigot, E. O.; Lee, C.H.; Lee, K.H.; Lee, M.H.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B. M.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Logue, J.; Lombardi, A. L.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lueck, H.; Lundgren, A. P.; Luo, J.; Lynch, R.; Ma, Y.; MacDonald, T.T.; Machenschalk, B.; MacInnis, M.; Macleod, D. 
M.; Magana-Sandoval, F.; Magee, R. M.; Mageswaran, M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Marka, S.; Marka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R.M.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mende, G.; Mendoza-Gandara, D.; Mercer, R. A.; Merilh, E. L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B.C.; Moore, J.C.; Moraru, D.; Gutierrez Moreno, M.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, S.D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P.G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nedkova, K.; Nelemans, G.; Gutierrez-Neri, M.; Neunzert, A.; Newton-Howes, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J.; Oh, S. H.; Ohme, F.; Oliver, M. B.; Oppermann, P.; Oram, Richard J.; O'Reilly, B.; O'Shaughnessy, R.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Page, J.; Paris, H. R.; Parker, W.S; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Phelps, M.; Piccinni, O. J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poggiani, R.; Popolizio, P.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Premachandra, S. S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prolchorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Puerrer, M.; Qi, H.; Qin, J.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raja, S.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosinska, D.; Rowan, S.; Ruediger, A.; Ruggi, P.; Ryan, K.A.; Sachdev, P.S.; Sadecki, T.; Sadeghian, L.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J; Schmidt, P.; Schnabel, R.B.; Schofield, R. M. S.; Schoenbeck, A.; Schreiber, K.E.C.; Schuette, D.; Schutz, B. 
F.; Scott, J.; Scott, M.S.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Serna, G.; Setyawati, Y.; Sevigny, A.; Shaddock, D. A.; Shah, S.; Shithriar, M. S.; Shaltev, M.; Shao, Z.M.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sigg, D.; Silva, António Dias da; Simakov, D.; Singer, A; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, R. J. E.; Smith, N.D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stone, J.R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S. E.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sutton, P. J.; Swinkels, B. L.; Szczepanczyk, M. J.; Tacca, M.D.; Talukder, D.; Tanner, D. B.; Tapai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, W.R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Tonelli, M.; Torres, C. V.; Torrie, C. I.; Toyra, D.; Travasso, F.; Traylor, G.; Trifiro, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlhruch, H.; Vajente, G.; Valdes, G.; Van Bakel, N.; Van Beuzekom, Martin; Van den Brand, J. F. J.; Van Den Broeck, C.F.F.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasuth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, R. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Vicere, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J. -Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D. V.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, MT; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L. -W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.M.; Wessels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; White, D. J.; Whiting, B. F.; Williams, D.; Williams, D.R.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Wright, J.L.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, H.; Yvert, M.; Zadrozny, A.; Zangrando, L.; Zanolin, M.; Zendri, J. -P.; Zevin, M.; Zhang, F.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.

    2016-01-01

    The gravitational-wave signal GW150914 was first identified on September 14, 2015, by searches for short-duration gravitational-wave transients. These searches identify time-correlated transients in multiple detectors with minimal assumptions about the signal morphology, allowing them to be

  9. Origins and Traditions in Comparative Education: Challenging Some Assumptions

    Science.gov (United States)

    Manzon, Maria

    2018-01-01

    This article questions some of our assumptions about the history of comparative education. It explores new scholarship on key actors and ways of knowing in the field. Building on the theory of the social constructedness of the field of comparative education, the paper elucidates how power shapes our scholarly histories and identities.

  10. Questioning Engelhardt's assumptions in Bioethics and Secular Humanism.

    Science.gov (United States)

    Ahmadi Nasab Emran, Shahram

    2016-06-01

    In Bioethics and Secular Humanism: The Search for a Common Morality, Tristram Engelhardt examines various possibilities of finding common ground for moral discourse among people from different traditions and concludes that they are futile. In this paper I will argue that many of the assumptions on which Engelhardt bases his conclusion about the impossibility of a content-full secular bioethics are problematic. By starting with the notion of moral strangers, there is, by definition, no possibility of a content-full moral discourse among moral strangers. This means there is circularity in starting the inquiry with a definition of moral strangers, which implies that they do not share enough moral background or commitment to an authority to allow for reaching a moral agreement, and then concluding that content-full morality is impossible among moral strangers. I argue that assuming that traditions are solid and immutable structures that insulate people across their boundaries is problematic. Another questionable assumption in Engelhardt's work is the idea that religious and philosophical traditions provide content-full moralities. Turning to the cardinal assumption in Engelhardt's review of the various alternatives for a content-full moral discourse among moral strangers, I analyze his foundationalist account of moral reasoning and knowledge and indicate the possibility of other ways of moral knowledge besides the foundationalist one. Then, I examine Engelhardt's view concerning the futility of attempts at justifying a content-full secular bioethics, and indicate how these assumptions have shaped Engelhardt's critique of the alternatives for the possibility of content-full secular bioethics.

  11. Seven Assumptions of a Solution-Focused Conversational Leader.

    Science.gov (United States)

    Paull, Robert C.; McGrevin, Carol Z.

    1996-01-01

    Effective psychologists and school leaders know how to manage conversations to help clients or stakeholders move toward solutions. This article presents the assumptions of solution-focused brief therapy in a school leadership context. Key components are focusing on solutions, finding exceptions, identifying changes, starting small, listening to…

  12. The Metatheoretical Assumptions of Literacy Engagement: A Preliminary Centennial History

    Science.gov (United States)

    Hruby, George G.; Burns, Leslie D.; Botzakis, Stergios; Groenke, Susan L.; Hall, Leigh A.; Laughter, Judson; Allington, Richard L.

    2016-01-01

    In this review of literacy education research in North America over the past century, the authors examined the historical succession of theoretical frameworks on students' active participation in their own literacy learning, and in particular the metatheoretical assumptions that justify those frameworks. The authors used "motivation" and…

  13. Ontological, Epistemological and Methodological Assumptions: Qualitative versus Quantitative

    Science.gov (United States)

    Ahmed, Abdelhamid

    2008-01-01

    The review to follow is a comparative analysis of two studies conducted in the field of TESOL in Education published in "TESOL QUARTERLY." The aspects to be compared are as follows. First, a brief description of each study will be presented. Second, the ontological, epistemological and methodological assumptions underlying each study…

  14. Posttraumatic Growth and Shattered World Assumptions Among Ex-POWs

    DEFF Research Database (Denmark)

    Lahav, Y.; Bellin, Elisheva S.; Solomon, Z.

    2016-01-01

    world assumptions (WAs) and that the co-occurrence of high PTG and negative WAs among trauma survivors reflects reconstruction of an integrative belief system. The present study aimed to test these claims by investigating, for the first time, the mediating role of dissociation in the relation between...

  15. 7 CFR 1980.476 - Transfer and assumptions.

    Science.gov (United States)

    2010-01-01

    ...) PROGRAM REGULATIONS (CONTINUED) GENERAL Business and Industrial Loan Program § 1980.476 Transfer and... give to secure the debt, will be adequate to secure the balance of the total guaranteed loan owed, plus... assumption provisions if the guaranteed loan debt balance is within his/her individual loan approval...

  16. Definition of common support equipment and space station interface requirements for IOC model technology experiments

    Science.gov (United States)

    Russell, Richard A.; Waiss, Richard D.

    1988-01-01

    A study was conducted to identify the common support equipment and Space Station interface requirements for the IOC (initial operating capabilities) model technology experiments. In particular, each principal investigator for the proposed model technology experiment was contacted and visited for technical understanding and support for the generation of the detailed technical backup data required for completion of this study. Based on the data generated, a strong case can be made for a dedicated technology experiment command and control work station consisting of a command keyboard, cathode ray tube, data processing and storage, and an alert/annunciator panel located in the pressurized laboratory.

  17. Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering

    Science.gov (United States)

    Reil, Robin

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements to the mission concept. Overall experience and methodology are presented for both the MBSE and original DBSE design efforts of SporeSat.

  18. Improved Traceability of a Small Satellite Mission Concept to Requirements Using Model Based System Engineering

    Science.gov (United States)

    Reil, Robin L.

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.

  19. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    Energy Technology Data Exchange (ETDEWEB)

    St. John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program.

  20. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    International Nuclear Information System (INIS)

    St John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program

  1. Tale of Two Courthouses: A Critique of the Underlying Assumptions in Chronic Disease Self-Management for Aboriginal People

    Directory of Open Access Journals (Sweden)

    Isabelle Ellis

    2009-12-01

    This article reviews the assumptions that underpin the commonly implemented Chronic Disease Self-Management models, namely that there is a clear set of instructions for patients to comply with that all health care providers agree with, and that the health care provider and the patient agree with the chronic disease self-management plan that was developed as part of a consultation. These assumptions are evaluated for their validity in the remote health care context, particularly for Aboriginal people. The assumptions have been found to lack validity in this context; therefore an alternative model to enhance chronic disease care is proposed.

  2. Conclusions of the workshop on the ATLAS requirements on shower models

    CERN Document Server

    Bosman, M; Efthymiopoulos, I; Froidevaux, D; Gianotti, F; Kiryunin, A E; Knobloch, J; Loch, P; Osculati, B; Perini, L; Sala, P R; Seman, M

    1999-01-01

    The workshop addressed the question of shower models/packages and related issues needed for the simulation of ATLAS physics and test beam data. Part of the material discussed during the workshop is reviewed in this note. Results presented on the comparison between ATLAS test beam data and Monte Carlo predictions of various shower models are briefly summarized. The requirements put forward by the various detector communities and first attempts to quantify them are also reviewed.

  3. System Design Description and Requirements for Modeling the Off-Gas Systems for Fuel Recycling Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Daryl R. Haefner; Jack D. Law; Troy J. Tranter

    2010-08-01

    This document provides descriptions of the off-gases evolved during spent nuclear fuel processing and the systems used to capture the gases of concern. Two reprocessing techniques are discussed, namely aqueous separations and electrochemical (pyrochemical) processing. The unit operations associated with each process are described in enough detail so that computer models to mimic their behavior can be developed. The document also lists the general requirements for the desired computer models.

  4. 3D Core Model for simulation of nuclear power plants: Simulation requirements, model features, and validation

    International Nuclear Information System (INIS)

    Zerbino, H.

    1999-01-01

    In 1994-1996, Thomson Training and Simulation (TT and S) carried out the D50 Project, which involved the design and construction of optimized replica simulators for one Dutch and three German Nuclear Power Plants. It was recognized early on that the faithful reproduction of the Siemens reactor control and protection systems would impose extremely stringent demands on the simulation models, particularly the Core physics and the RCS thermohydraulics. The quality of the models, and their thorough validation, were thus essential. The present paper describes the main features of the fully 3D Core model implemented by TT and S, and its extensive validation campaign, which was defined in extremely positive collaboration with the Customer and the Core Data suppliers. (author)

  5. A Conceptual Model and Process for Client-driven Agile Requirements Prioritization

    NARCIS (Netherlands)

    Racheva, Z.; Daneva, Maia; Herrmann, Andrea; Wieringa, Roelf J.

    Continuous customer-centric requirements reprioritization is essential in successfully performing agile software development. Yet, in the agile RE literature, very little is known about how agile reprioritization happens in practice. Generic conceptual models about this process are missing, which in

  6. INTEGRATED DATA CAPTURING REQUIREMENTS FOR 3D SEMANTIC MODELLING OF CULTURAL HERITAGE: THE INCEPTION PROTOCOL

    Directory of Open Access Journals (Sweden)

    R. Di Giulio

    2017-02-01

    In order to face these challenges and to start solving the issue of the large amount of captured data and time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. The purpose is to guide the processes of digitization of cultural heritage, respecting the needs, requirements and specificities of cultural assets.

  7. Handling non-functional requirements in model-driven development: an ongoing industrial survey

    NARCIS (Netherlands)

    Ameller, David; Franch, Xavier; Gómez, Cristina; Araújo, João; Berntsson Svensson, Richard; Biffle, Stefan; Cabot, Jordi; Cortelessa, Vittorio; Daneva, Maia; Méndez Fernández, Daniel; Moreira, Ana; Muccini, Henry; Vallecillo, Antonio; Wimmer, Manuel; Amaral, Vasco; Brunelière, Hugo; Burgueño, Loli; Goulão, Miguel; Schätz, Bernard; Teufl, Sabine

    2015-01-01

    Model-Driven Development (MDD) is no longer a novel development paradigm. It has become mature from a research perspective and recent studies show its adoption in industry. Still, some issues remain a challenge. Among them, we are interested in the treatment of non-functional requirements (NFRs) in

  8. Projected irrigation requirements for upland crops using soil moisture model under climate change in South Korea

    Science.gov (United States)

    An increase in abnormal climate change patterns and unsustainable irrigation in uplands cause drought and affect agricultural water security, crop productivity, and price fluctuations. In this study, we developed a soil moisture model to project irrigation requirements (IR) for upland crops under cl...

  9. Teaching Tip: Using Rapid Game Prototyping for Exploring Requirements Discovery and Modeling

    Science.gov (United States)

    Dalal, Nikunj

    2012-01-01

    We describe the use of rapid game prototyping as a pedagogic technique to experientially explore and learn requirements discovery, modeling, and specification in systems analysis and design courses. Students have a natural interest in gaming that transcends age, gender, and background. Rapid digital game creation is used to build computer games…

  10. Understanding the relationship between Kano model's customer satisfaction scores and self-stated requirements importance.

    Science.gov (United States)

    Mkpojiogu, Emmanuel O C; Hashim, Nor Laily

    2016-01-01

    Customer satisfaction is the result of product quality and viability. The place of the perceived satisfaction of users/customers for a software product cannot be neglected, especially in today's competitive market environment, as it drives the loyalty of customers and promotes high profitability and return on investment. Therefore, understanding the importance of requirements, as it is associated with the satisfaction of users/customers when their requirements are met, is worth considering. It is necessary to know the relationship between customer satisfaction when their requirements are met (or their dissatisfaction when their requirements are unmet) and the importance of such requirements. Many works have been carried out on customer satisfaction in connection with the importance of requirements, but the relationship between the customer satisfaction scores (coefficients) of the Kano model and users/customers self-stated requirements importance has not been sufficiently explored. In this study, an attempt is made to unravel the underlying relationship between the Kano model's customer satisfaction indexes and users/customers self-reported requirements importance. The results of the study indicate some interesting associations between these considered variables. These bivariate associations reveal that the customer satisfaction index (SI) and the average satisfaction coefficient (ASC), and the customer dissatisfaction index (DI) and the average satisfaction coefficient (ASC), are highly correlated (r = 0.96), and thus ASC can be used in place of either SI or DI in representing customer satisfaction scores. Also, these Kano model customer satisfaction variables (SI, DI, and ASC) are each associated with self-stated requirements importance (IMP). Further analysis indicates that the value customers or users place on requirements that are met or on features that are incorporated into a product influences the level of satisfaction such customers derive from the product. The
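
    The satisfaction coefficients referred to here are, in common Kano-model practice, computed from questionnaire category counts following Berger et al. A minimal sketch with hypothetical counts; the averaging used for ASC is an assumption, since the record does not spell out its formula:

        # Counts of Kano questionnaire classifications for one requirement
        # (hypothetical data; A = attractive, O = one-dimensional,
        # M = must-be, I = indifferent).
        A, O, M, I = 18, 25, 30, 7
        total = A + O + M + I

        # Berger et al. customer satisfaction coefficients, the kind of
        # SI/DI indexes the study correlates with stated importance.
        SI = (A + O) / total           # satisfaction index, in [0, 1]
        DI = -(O + M) / total          # dissatisfaction index, in [-1, 0]
        ASC = (SI + abs(DI)) / 2       # assumed averaging; not defined in the record

        print(f"SI = {SI:.2f}, DI = {DI:.2f}, ASC = {ASC:.2f}")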

  11. Bion, basic assumptions, and violence: a corrective reappraisal.

    Science.gov (United States)

    Roth, Bennett

    2013-10-01

    Group psychoanalytic theory rests on many of the same psychoanalytic assumptions as individual psychoanalytic theory but has been slow in developing its own language and unique understanding of conflict within the group, as many group phenomena are not the same as individual psychic events. Regressive fantasies and alliances within and to the group are determined by group composition and the interaction of fantasies among members and leader. Bion's useful but incomplete early abstract formulation of psychic regression in groups was the initial attempt to move beyond Freud's largely sociological view. This paper explores some of the origins of Bion's neglect of murderous violence in groups as a result of his own experiences in the First World War. I then present evidence for the existence of a violent basic assumption and for Bion's avoidance of murderous and violent acts.

  12. The sufficiency assumption of the reasoned approach to action

    Directory of Open Access Journals (Sweden)

    David Trafimow

    2015-12-01

    The reasoned action approach to understanding and predicting behavior includes the sufficiency assumption. Although variables not included in the theory may influence behavior, these variables work through the variables in the theory. Once the reasoned action variables are included in an analysis, the inclusion of other variables will not increase the variance accounted for in behavioral intentions or behavior. Reasoned action researchers are very concerned with testing whether new variables account for variance (or how much variance traditional variables account for), to see whether they are important, in general or with respect to specific behaviors under investigation. But this approach tacitly assumes that accounting for variance is highly relevant to understanding the production of variance, which is what really is at issue. Based on the variance law, I question this assumption.
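
    The variance-accounting test the paper questions is, in practice, an incremental R-squared comparison. A minimal sketch with hypothetical variables standing in for the reasoned-action constructs (scikit-learn assumed available):

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        n = 200

        # Hypothetical data: attitude and norm stand in for the reasoned-action
        # variables, "extra" is a candidate new variable, intention is the outcome.
        attitude = rng.normal(size=n)
        norm = rng.normal(size=n)
        extra = 0.5 * attitude + rng.normal(size=n)
        intention = 1.0 * attitude + 0.5 * norm + rng.normal(size=n)

        base = np.column_stack([attitude, norm])
        full = np.column_stack([attitude, norm, extra])

        r2_base = LinearRegression().fit(base, intention).score(base, intention)
        r2_full = LinearRegression().fit(full, intention).score(full, intention)

        # The R^2 increment is the usual "does it account for extra variance" test;
        # Trafimow's point is that a small increment need not mean the variable
        # plays no role in *producing* the behavior.
        print(f"R2 base = {r2_base:.3f}, R2 full = {r2_full:.3f}, "
              f"delta = {r2_full - r2_base:.3f}")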

  13. Bases, Assumptions, and Results of the Flowsheet Calculations for the Decision Phase Salt Disposition Alternatives

    Energy Technology Data Exchange (ETDEWEB)

    Dimenna, R.A.; Jacobs, R.A.; Taylor, G.A.; Durate, O.E.; Paul, P.K.; Elder, H.H.; Pike, J.A.; Fowler, J.R.; Rutland, P.L.; Gregory, M.V.; Smith III, F.G.; Hang, T.; Subosits, S.G.; Campbell, S.G.

    2001-03-26

    The High Level Waste (HLW) Salt Disposition Systems Engineering Team was formed on March 13, 1998, and chartered to identify options, evaluate alternatives, and recommend a selected alternative(s) for processing HLW salt to a permitted wasteform. This requirement arises because the existing In-Tank Precipitation process at the Savannah River Site, as currently configured, cannot simultaneously meet the HLW production and Authorization Basis safety requirements. This engineering study was performed in four phases. This document provides the technical bases, assumptions, and results of this engineering study.

  14. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    DEFF Research Database (Denmark)

    Murcia Leon, Juan Pablo; Réthoré, Pierre-Elouan; Natarajan, Anand

    2015-01-01

    Wind farm flow models have advanced considerably with the use of large eddy simulations (LES) and Reynolds averaged Navier-Stokes (RANS) computations. The main limitation of these techniques is their high computational time requirements, which makes their use for wind farm annual energy production (AEP) predictions expensive. The objective of the present paper is to minimize the number of model evaluations required to capture the wind power plant's AEP using stationary wind farm flow models. Polynomial chaos techniques are proposed based on arbitrary Weibull distributed wind speed and von Mises distributed wind direction. The correlation between wind direction and wind speed is captured by defining Weibull parameters as functions of wind direction. In order to evaluate the accuracy of these methods, the expectation and variance of the wind farm power distributions are compared against...

  15. Bank stress testing under different balance sheet assumptions

    OpenAIRE

    Busch, Ramona; Drescher, Christian; Memmel, Christoph

    2017-01-01

    Using unique supervisory survey data on the impact of a hypothetical interest rate shock on German banks, we analyse price and quantity effects on banks' net interest margin components under different balance sheet assumptions. In the first year, the cross-sectional variation of banks' simulated price effect is nearly eight times as large as that of the simulated quantity effect. After five years, however, the importance of both effects converges. Large banks adjust their balance sheets mo...

  16. Baseline requirements of the proposed action for the Transportation Management Division routing models

    International Nuclear Information System (INIS)

    Johnson, P.E.; Joy, D.S.

    1995-02-01

    The potential impacts associated with the transportation of hazardous materials are important to shippers, carriers, and the general public. This is particularly true for shipments of radioactive material. The shippers are primarily concerned with safety, security, efficiency, and equipment requirements. The carriers are concerned with the potential impact that radioactive shipments may have on their operations--particularly if such materials are involved in an accident. The general public has also expressed concerns regarding the safety of transporting radioactive and other hazardous materials through their communities. Because transportation routes are a central concern in hazardous material transport, the prediction of likely routes is the first step toward resolution of these issues. In response to these routing needs, several models have been developed over the past fifteen years at Oak Ridge National Laboratory (ORNL). The HIGHWAY routing model is used to predict routes for truck transportation, the INTERLINE routing model is used to predict both rail and barge routes, and the AIRPORT locator model is used to determine airports with specified criteria near a specific location. As part of the ongoing improvement of the US Department of Energy's (DOE) Environmental Management Transportation Management Division's (EM-261) computer systems and development efforts, a Baseline Requirements Assessment Session on the HIGHWAY, INTERLINE, and AIRPORT models was held at ORNL on April 27, 1994. The purpose of this meeting was to discuss the existing capabilities of the models and data bases and to review enhancements of the models and data bases to expand their usefulness. The results of the Baseline Requirements Assessment Session are discussed in this report. The discussions pertaining to the different models are contained in separate sections.
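
    The report discusses capabilities rather than algorithms, but route-prediction models such as HIGHWAY and INTERLINE ultimately rest on shortest-path search over a weighted transport network. A minimal sketch with hypothetical nodes and edge weights, not ORNL's actual data or cost function:

        import heapq

        def dijkstra(graph, start, goal):
            """Shortest path over a weighted adjacency dict; returns (cost, path)."""
            queue = [(0.0, start, [start])]
            seen = set()
            while queue:
                cost, node, path = heapq.heappop(queue)
                if node == goal:
                    return cost, path
                if node in seen:
                    continue
                seen.add(node)
                for nxt, weight in graph.get(node, {}).items():
                    if nxt not in seen:
                        heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
            return float("inf"), []

        # Hypothetical highway network: weights could encode distance, population
        # exposure, or other criteria used for routing hazardous shipments.
        network = {
            "ORNL": {"Knoxville": 40.0, "Oak Ridge": 5.0},
            "Oak Ridge": {"Knoxville": 35.0},
            "Knoxville": {"Nashville": 290.0},
        }
        print(dijkstra(network, "ORNL", "Nashville"))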

  17. The realisation of legal protection requirements with the aid of models of nuclear facilities

    International Nuclear Information System (INIS)

    Wildberg, D.W.; Herrmann, H.J.

    1978-08-01

    In the Federal Republic of Germany, the model-based planning, construction and operation of nuclear facilities is still in its initial stages. Based on a few examples, the authors show that, with the atomic energy legislation and with the laws in the conventional sector, the legislator enacted requirements at a relatively early stage for the protection of the individual person in the facility and of the population at large in the vicinity of the facility. In the realization of these protection requirements, however, there are still problems, and these are often very basic in nature. The best solution here seems to be to tackle the problems with the help of models. This would permit subjects like serviceability, testability, use of external personnel, spatial distribution of redundancies, rescue of injured persons, fire protection measures, physical protection and the dismantling of facilities, which are multifarious in nature and have overlapping requirements, to be presented and discussed in greater depth and detail. The positive aspects of the use of models are presented, and the advantages and disadvantages of models are discussed in detail. Finally, the variety of models which can be used during the different phases of a nuclear facility is discussed, and some remarks are made regarding the costs of models. One section of the report deals with examples of the practical use of models: models have proved themselves in the past in the construction of refineries and chemical plants, and have successfully demonstrated their suitability in the field of nuclear technology. These examples are not limited to those from the Federal Republic of Germany. (orig.)

  18. Investigation of assumptions underlying current safety guidelines on EM-induced nerve stimulation

    Science.gov (United States)

    Neufeld, Esra; Vogiatzis Oikonomidis, Ioannis; Iacono, Maria Ida; Angelone, Leonardo M.; Kainz, Wolfgang; Kuster, Niels

    2016-06-01

    An intricate network of a variety of nerves is embedded within the complex anatomy of the human body. Although nerves are shielded from unwanted excitation, they can still be stimulated by external electromagnetic sources that induce strongly non-uniform field distributions. Current exposure safety standards designed to limit unwanted nerve stimulation are based on a series of explicit and implicit assumptions and simplifications. This paper demonstrates the applicability of functionalized anatomical phantoms with integrated coupled electromagnetic and neuronal dynamics solvers for investigating the impact of magnetic resonance exposure on nerve excitation within the full complexity of the human anatomy. The impact of neuronal dynamics models, temperature and local hot-spots, nerve trajectory and potential smoothing, anatomical inhomogeneity, and pulse duration on nerve stimulation was evaluated. As a result, multiple assumptions underlying current safety standards are questioned. It is demonstrated that coupled EM-neuronal dynamics modeling involving realistic anatomies is valuable to establish conservative safety criteria.

  19. Singularity free N-body simulations called 'Dynamic Universe Model' don't require dark matter

    Science.gov (United States)

    Naga Parameswara Gupta, Satyavarapu

    The calculations of the Dynamic Universe Model can be successfully applied to finding the trajectories of the Pioneer satellites (the Pioneer anomaly) and of the New Horizons satellite on its way to Pluto. No dark matter is assumed within the solar system radius. The effect on the masses around the Sun appears as though there were an extra gravitational pull toward the Sun. The model solves the dynamics of extra-solar planets such as Planet X and of satellites such as Pioneer and New Horizons, giving 3-position, 3-velocity and 3-acceleration for their masses, considering the complex situation of multiple planets, stars, galaxy parts, the galaxy centre and other galaxies, using simple Newtonian physics. It has already solved the problem of the missing mass in galaxies observed through galaxy circular velocity curves. Singularity-free Newtonian N-body simulations: historically, King Oscar II of Sweden announced a prize for a solution of the N-body problem, with advice given by Gösta Mittag-Leffler, in 1887. He announced: 'Given a system of arbitrarily many mass points that attract each other according to Newton's law, under the assumption that no two points ever collide, try to find a representation of the coordinates of each point as a series in a variable that is some known function of time and for all of whose values the series converges uniformly.' [This is taken from Wikipedia.] The announced deadline at that time was 1 June 1888. After that deadline, on 21 January 1889, the great mathematician Poincaré claimed the prize. Later he himself sent a telegram to the journal Acta Mathematica to stop printing the special issue after finding an error in his solution; for such a man of science, reputation mattered more than money. [Ref: 'Celestial Mechanics: The Waltz of the Planets' by Alessandra Celletti and Ettore Perozzi, page 27.] He realized that he had been wrong in his general stability result. To this day nobody has solved that problem or claimed that prize; later solutions by many people resulted in singularities and collisions of masses.
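
    The record appeals to direct Newtonian N-body integration without singularities. As a generic illustration only, not the Dynamic Universe Model's own procedure, here is a minimal leapfrog integrator with Plummer softening (which caps the force as separations shrink), on an assumed toy Sun-Earth configuration:

        import numpy as np

        G = 6.674e-11          # gravitational constant, SI units
        EPS = 1.0e7            # Plummer softening length (m), avoids r -> 0 singularities

        def accelerations(pos, mass):
            """Pairwise Newtonian accelerations with softening."""
            acc = np.zeros_like(pos)
            for i in range(len(mass)):
                diff = pos - pos[i]                          # vectors to every other body
                dist2 = np.sum(diff**2, axis=1) + EPS**2
                dist2[i] = np.inf                            # skip self-interaction
                acc[i] = np.sum((G * mass / dist2**1.5)[:, None] * diff, axis=0)
            return acc

        def leapfrog(pos, vel, mass, dt, steps):
            """Symplectic kick-drift-kick integration of the N-body system."""
            acc = accelerations(pos, mass)
            for _ in range(steps):
                vel += 0.5 * dt * acc
                pos += dt * vel
                acc = accelerations(pos, mass)
                vel += 0.5 * dt * acc
            return pos, vel

        # Toy Sun-Earth setup (illustrative masses and state vectors).
        mass = np.array([1.989e30, 5.972e24])
        pos = np.array([[0.0, 0.0, 0.0], [1.496e11, 0.0, 0.0]])
        vel = np.array([[0.0, 0.0, 0.0], [0.0, 2.978e4, 0.0]])
        pos, vel = leapfrog(pos, vel, mass, dt=3600.0, steps=24 * 365)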

  20. Modeling Requirements for Simulating the Effects of Extreme Acts of Terrorism: A White Paper

    Energy Technology Data Exchange (ETDEWEB)

    Allen, M.; Hiebert-Dodd, K.; Marozas, D.; Paananen, O.; Pryor, R.J.; Reinert, R.K.

    1998-10-01

    This white paper presents the initial requirements for developing a new computer model for simulating the effects of extreme acts of terrorism in the United States. General characteristics of the model are proposed and the level of effort to prepare a complete written description of the model, prior to coding, is detailed. The model would simulate the decision processes and interactions of complex U. S. systems engaged in responding to and recovering from four types of terrorist incidents. The incident scenarios span the space of extreme acts of terrorism that have the potential to affect not only the impacted area, but also the entire nation. The model would be useful to decision-makers in assessing and analyzing the vulnerability of the nation's complex infrastructures, in prioritizing resources to reduce risk, and in planning strategies for immediate response and for subsequent recovery from terrorist incidents.

  1. On meeting capital requirements with a chance-constrained optimization model.

    Science.gov (United States)

    Atta Mills, Ebenezer Fiifi Emire; Yu, Bo; Gu, Lanlan

    2016-01-01

    This paper deals with a capital to risk asset ratio chance-constrained optimization model in the presence of loans, treasury bills, fixed assets and non-interest-earning assets. To model the dynamics of loans, we introduce a modified CreditMetrics approach. This leads to the development of a deterministic convex counterpart of the capital to risk asset ratio chance constraint. We analyze our model under the worst-case scenario, i.e., loan default. The theoretical model is analyzed by applying numerical procedures, in order to derive valuable insights from a financial outlook. Our results suggest that our capital to risk asset ratio chance-constrained optimization model guarantees that banks meet the capital requirements of Basel III with a likelihood of 95% irrespective of changes in the future market value of assets.
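
    A minimal sketch of how such a chance constraint collapses to a deterministic condition, assuming, unlike the paper's CreditMetrics construction, normally distributed risk-weighted assets; all figures are hypothetical:

        from scipy.stats import norm

        # Hypothetical inputs: bank capital, moments of risk-weighted assets (RWA)
        # at the horizon, Basel III minimum ratio theta, confidence level alpha.
        capital = 12.0
        mu_rwa, sigma_rwa = 100.0, 8.0
        theta, alpha = 0.105, 0.95

        # Chance constraint: P(capital / RWA >= theta) >= alpha.
        # With Gaussian RWA this is equivalent to the deterministic condition
        #   capital / theta >= mu_rwa + z_alpha * sigma_rwa,
        # i.e. capital must cover the alpha-quantile of risk-weighted assets.
        z = norm.ppf(alpha)
        quantile_rwa = mu_rwa + z * sigma_rwa
        feasible = capital / theta >= quantile_rwa
        print(f"alpha-quantile RWA = {quantile_rwa:.2f}, constraint satisfied: {feasible}")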

  2. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    The amount of data available on the Internet is continuously increasing; consequently, there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used. In this paper we analyze requirements for a tool that supports the integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling elements; an important part of this technique is the attachment of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration...

  3. Load assumption for fatigue design of structures and components: counting methods, safety aspects, practical application

    CERN Document Server

    Köhler, Michael; Pötter, Kurt; Zenner, Harald

    2017-01-01

    Understanding the fatigue behaviour of structural components under variable load amplitude is an essential prerequisite for safe and reliable light-weight design. For designing and dimensioning, the expected stress (load) is compared with the capacity to withstand loads (fatigue strength). In this process, the safety necessary for each particular application must be ensured. A prerequisite for ensuring the required fatigue strength is a reliable load assumption. The authors describe the transformation of the stress- and load-time functions, which have been measured under operational conditions, into spectra or matrices through the application of counting methods. The aspects which must be considered for ensuring a reliable load assumption for designing and dimensioning are discussed in detail. Furthermore, the theoretical background for estimating the fatigue life of structural components is explained, and the procedures are discussed for numerous applications in practice. One of the prime intentions of the authors ...
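
    The counting step described here can be illustrated with a simplified three-point rainflow algorithm of the ASTM E1049 type; this is generic fatigue practice, not code taken from the book:

        def turning_points(series):
            """Keep the first point, interior local extrema, and the last point."""
            pts = [series[0]]
            for prev, cur, nxt in zip(series, series[1:], series[2:]):
                if (cur - prev) * (nxt - cur) < 0:
                    pts.append(cur)
            pts.append(series[-1])
            return pts

        def rainflow(series):
            """Simplified three-point rainflow count: list of (range, cycles)."""
            stack, counted = [], []
            for p in turning_points(series):
                stack.append(p)
                while len(stack) >= 3:
                    x = abs(stack[-1] - stack[-2])   # most recent range
                    y = abs(stack[-2] - stack[-3])   # previous range
                    if x < y:
                        break
                    if len(stack) == 3:
                        counted.append((y, 0.5))     # half cycle containing the start
                        stack.pop(0)
                    else:
                        counted.append((y, 1.0))     # full closed cycle
                        del stack[-3:-1]             # drop the two inner points
            for a, b in zip(stack, stack[1:]):       # leftover ranges: half cycles
                counted.append((abs(a - b), 0.5))
            return counted

        # Hypothetical measured load-time signal (arbitrary stress units).
        signal = [0, 5, 1, 4, 2, 6, -1, 3, 0]
        print(rainflow(signal))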

  4. How Many Model Evaluations Are Required To Predict The AEP Of A Wind Power Plant?

    International Nuclear Information System (INIS)

    Murcia, J P; Réthoré, P E; Natarajan, A; Sørensen, J D

    2015-01-01

    Wind farm flow models have advanced considerably with the use of large eddy simulations (LES) and Reynolds averaged Navier-Stokes (RANS) computations. The main limitation of these techniques is their high computational time requirements, which makes their use for wind farm annual energy production (AEP) predictions expensive. The objective of the present paper is to minimize the number of model evaluations required to capture the wind power plant's AEP using stationary wind farm flow models. Polynomial chaos techniques are proposed based on arbitrary Weibull distributed wind speed and von Mises distributed wind direction. The correlation between wind direction and wind speed is captured by defining Weibull parameters as functions of wind direction. In order to evaluate the accuracy of these methods, the expectation and variance of the wind farm power distributions are compared against the traditional binning method with trapezoidal and Simpson's integration rules. The wind farm flow model used in this study is the semi-empirical wake model developed by Larsen [1]. Three test cases are studied: a single turbine, a simple and a real offshore wind power plant. A reduced number of model evaluations for a general wind power plant is proposed based on the convergence of the present method for each case. (paper)
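
    The paper's point, fewer model evaluations than binning for the same accuracy, can be illustrated with quadrature in the probability-transformed space. The sketch below uses a toy power curve and an assumed Weibull climate, and plain Gauss-Legendre quadrature rather than the paper's full polynomial chaos expansion:

        import numpy as np
        from numpy.polynomial.legendre import leggauss
        from scipy.stats import weibull_min

        # Toy power curve (assumption): cut-in 3 m/s, rated at 12 m/s, cut-out 25 m/s.
        def power(ws, rated_mw=5.0):
            p = rated_mw * np.clip((ws / 12.0) ** 3, 0.0, 1.0)
            return np.where((ws >= 3.0) & (ws <= 25.0), p, 0.0)

        # Assumed Weibull wind-speed climate (shape k = 2, scale A = 9 m/s).
        dist = weibull_min(2.0, scale=9.0)

        def expected_power_quadrature(n):
            """E[P] = integral of P(F^-1(u)) du, via Gauss-Legendre on (0, 1)."""
            nodes, weights = leggauss(n)
            u = 0.5 * (nodes + 1.0)                  # map [-1, 1] -> (0, 1)
            return np.sum(0.5 * weights * power(dist.ppf(u)))

        def expected_power_binning(n):
            """Traditional binning: n equal-width speed bins, probability weighted."""
            edges = np.linspace(0.0, 30.0, n + 1)
            mids = 0.5 * (edges[:-1] + edges[1:])
            return np.sum(np.diff(dist.cdf(edges)) * power(mids))

        ref = expected_power_binning(100000)         # near-exact reference
        for n in (4, 8, 16, 32):                     # model evaluations per method
            print(n, expected_power_quadrature(n) - ref, expected_power_binning(n) - ref)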

  5. A New Rapid Simplified Model for Urban Rainstorm Inundation with Low Data Requirements

    Directory of Open Access Journals (Sweden)

    Ji Shen

    2016-11-01

    This paper proposes a new rapid simplified inundation model (NRSIM) for flood inundation caused by rainstorms in an urban setting that can simulate the urban rainstorm inundation extent and depth in a data-scarce area. Drainage basins delineated from a floodplain map according to the distribution of the inundation sources serve as the calculation cells of NRSIM. To reduce the data requirements and computational costs of the model, the internal topography of each calculation cell is simplified to a circular cone, and a mass conservation equation based on a volume spreading algorithm is established to simulate the interior water filling process. Moreover, an improved D8 algorithm is outlined for the simulation of water spilling between different cells. The performance of NRSIM is evaluated by comparing the simulated results with those from a traditional rapid flood spreading model (TRFSM) for various resolutions of digital elevation model (DEM) data. The results are as follows: (1) given high-resolution DEM data input, the TRFSM model has better performance in terms of precision than NRSIM; (2) the results from TRFSM are seriously affected by the decrease in DEM data resolution, whereas those from NRSIM are not; and (3) NRSIM always requires less computational time than TRFSM. Apparently, compared with the complex hydrodynamic or traditional rapid flood spreading model, NRSIM has much better applicability and cost-efficiency in real-time urban inundation forecasting for data-sparse areas.
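
    The spilling step builds on a D8-type scheme. As a generic illustration (the paper's improved variant is not reproduced here), basic D8 assigns each cell's outflow to the steepest downslope neighbour among the eight adjacent cells:

        import numpy as np

        def d8_flow_directions(dem):
            """Basic D8: index (0-7) of the steepest downslope neighbour, -1 for pits."""
            offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                       (0, 1), (1, -1), (1, 0), (1, 1)]
            rows, cols = dem.shape
            flow = -np.ones((rows, cols), dtype=int)
            for r in range(rows):
                for c in range(cols):
                    best, best_slope = -1, 0.0
                    for k, (dr, dc) in enumerate(offsets):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols:
                            dist = np.hypot(dr, dc)          # 1 or sqrt(2) cell lengths
                            slope = (dem[r, c] - dem[rr, cc]) / dist
                            if slope > best_slope:
                                best, best_slope = k, slope
                    flow[r, c] = best
            return flow

        # Tiny synthetic DEM (values are elevations; illustrative only).
        dem = np.array([[5.0, 4.0, 3.0],
                        [4.0, 3.0, 2.0],
                        [3.0, 2.0, 1.0]])
        print(d8_flow_directions(dem))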

  6. Oil production, oil prices, and macroeconomic adjustment under different wage assumptions

    International Nuclear Information System (INIS)

    Harvie, C.; Maleka, P.T.

    1992-01-01

    In a previous paper one of the authors developed a simple model to try to identify the possible macroeconomic adjustment processes arising in an economy experiencing a temporary period of oil production, under alternative wage adjustment assumptions, namely nominal and real wage rigidity. Certain assumptions were made regarding the characteristics of actual production, the permanent revenues generated from that oil production, and the net exports/imports of oil. The role of the price of oil, and possible changes in that price was essentially ignored. Here we attempt to incorporate the price of oil, as well as changes in that price, in conjunction with the production of oil, the objective being to identify the contribution which the price of oil, and changes in it, make to the adjustment process itself. The emphasis in this paper is not given to a mathematical derivation and analysis of the model's dynamics of adjustment or its comparative statics, but rather to the derivation of simulation results from the model, for a specific assumed case, using a numerical algorithm program, conducive to the type of theoretical framework utilized here. The results presented suggest that although the adjustment profiles of the macroeconomic variables of interest, for either wage adjustment assumption, remain fundamentally the same, the magnitude of these adjustments is increased. Hence to derive a more accurate picture of the dimensions of adjustment of these macroeconomic variables, it is essential to include the price of oil as well as changes in that price. (Author)

  7. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning

    Science.gov (United States)

    2016-01-01

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning. PMID:27310576

  8. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Science.gov (United States)

    Hsu, Anne; Griffiths, Thomas L

    2016-01-01

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.
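
    The strong/weak contrast has a standard Bayesian formalization via the "size principle", illustrated below with a hypothetical two-hypothesis toy language; the experiments' actual models are not reproduced here:

        # Two hypothetical languages (hypothesis = set of grammatical sentences).
        h_small = {"a", "b"}              # restrictive grammar
        h_large = {"a", "b", "c", "d"}    # permissive grammar
        data = ["a", "b", "a", "a"]       # observed sample; "c" and "d" never occur

        def likelihood(hypothesis, data, strong=True):
            if any(s not in hypothesis for s in data):
                return 0.0                          # an ungrammatical datum rules h out
            if strong:
                # Strong sampling: each datum is drawn uniformly from the hypothesized
                # language, so smaller languages gain likelihood (size principle), and
                # the absence of "c"/"d" acts as indirect negative evidence.
                return (1.0 / len(hypothesis)) ** len(data)
            return 1.0                              # weak sampling: membership only

        for strong in (True, False):
            ls = likelihood(h_small, data, strong)
            ll = likelihood(h_large, data, strong)
            post_small = ls / (ls + ll)             # posterior under equal priors
            print(f"strong={strong}: P(h_small | data) = {post_small:.3f}")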

  9. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Directory of Open Access Journals (Sweden)

    Anne Hsu

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.

  10. Evaluation of risk impact of changes to surveillance requirements addressing model and parameter uncertainties

    International Nuclear Information System (INIS)

    Martorell, S.; Villamizar, M.; Martón, I.; Villanueva, J.F.; Carlos, S.; Sánchez, A.I.

    2014-01-01

    This paper presents a three-step approach for the evaluation of the risk impact of changes to Surveillance Requirements, based on the use of Probabilistic Risk Assessment and addressing the identification, treatment and analysis of model and parameter uncertainties in an integrated manner. The paper also includes an example of application that focuses on the evaluation of the risk impact of a Surveillance Frequency change for the Reactor Protection System of a Nuclear Power Plant using a level 1 Probabilistic Risk Assessment. Surveillance Requirements are part of the Technical Specifications that are included in the Licensing Basis for operation of Nuclear Power Plants. Surveillance Requirements aim at limiting the risk of undetected downtime of safety-related equipment by imposing equipment operability checks, which consist of testing equipment operational parameters with an established Surveillance Frequency and Test Strategy.
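
    The risk impact of a Surveillance Frequency change is typically driven by the time-averaged unavailability of the tested equipment. A minimal sketch using the standard first-order standby approximation, with illustrative parameter values rather than those of the paper's application:

        def mean_unavailability(lam, T, rho=0.0, tau=0.0):
            """Time-averaged unavailability of a periodically tested standby component.

            lam: standby failure rate (per hour)
            T:   surveillance test interval (hours)
            rho: per-demand failure probability
            tau: mean downtime per test/repair (hours)
            Uses the first-order approximation U ~ rho + lam*T/2 + tau/T.
            """
            return rho + lam * T / 2.0 + tau / T

        lam = 1.0e-5                       # illustrative standby failure rate
        for T in (730.0, 1460.0, 2190.0):  # roughly monthly to quarterly, in hours
            U = mean_unavailability(lam, T, rho=1e-4, tau=2.0)
            print(f"T = {T:6.0f} h  ->  U = {U:.2e}")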

  11. Patient-centered care requires a patient-oriented workflow model.

    Science.gov (United States)

    Ozkaynak, Mustafa; Brennan, Patricia Flatley; Hanauer, David A; Johnson, Sharon; Aarts, Jos; Zheng, Kai; Haque, Saira N

    2013-06-01

    Effective design of health information technology (HIT) for patient-centered care requires consideration of workflow from the patient's perspective, termed 'patient-oriented workflow.' This approach organizes the building blocks of work around the patients who are moving through the care system. Patient-oriented workflow complements the more familiar clinician-oriented workflow approaches, and offers several advantages, including the ability to capture simultaneous, cooperative work, which is essential in care delivery. Patient-oriented workflow models can also provide an understanding of healthcare work taking place in various formal and informal health settings in an integrated manner. We present two cases demonstrating the potential value of patient-oriented workflow models. Significant theoretical, methodological, and practical challenges must be met to ensure adoption of patient-oriented workflow models. Patient-oriented workflow models define meaningful system boundaries and can lead to HIT implementations that are more consistent with cooperative work and its emergent features.

  12. Estimates of methionine and sulfur amino acid requirements for laying hens using different models

    Directory of Open Access Journals (Sweden)

    AA Saki

    2012-09-01

    This experiment was conducted to evaluate the effects of dietary methionine (Met) content on the performance of white commercial laying hens and to determine Met and total sulfur amino acid (TSAA) requirements. These requirements were estimated using three statistical models (broken-line regression, exponential and second-order equations) to evaluate their ability to determine amino acid requirements. A total of 216 laying hens (23 wks of age) was used in a completely randomized design (CRD) with six treatments and four replicates of nine birds each. The basal diet contained 15.25% crude protein, 2830.16 kcal/kg ME and 0.24% Met. Synthetic DL-Met was added to the deficient (basal) diet in 0.05% increments to make the other five experimental diets (0.29, 0.34, 0.39, 0.44 and 0.49% Met). Increasing the Met level from 0.24 to 0.34% significantly increased egg production, egg weight, egg mass, egg content, and feed intake and decreased the feed conversion ratio (p<0.05). However, further Met increases, from 0.34 to 0.49%, no longer influenced these parameters. Of the three models, the broken-line regression model provided the best estimates of amino acid requirements. Based on the broken-line equations, the average Met and TSAA requirements of the laying hens were 0.31 and 0.60% (245.50 and 469.25 mg/hen/day) from 22 to 36 wks of age, respectively.
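
    A minimal sketch of the broken-line (linear-plateau) fit used to locate such a requirement, with hypothetical dose-response data; the estimated breakpoint is the requirement:

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical dose-response data: dietary Met level (%) vs egg mass (g/day).
        met = np.array([0.24, 0.29, 0.34, 0.39, 0.44, 0.49])
        egg = np.array([44.0, 48.5, 52.0, 52.3, 52.1, 52.2])

        def broken_line(x, plateau, slope, breakpoint):
            """One-slope broken-line model: rises until the breakpoint, then flat."""
            return plateau + slope * np.minimum(x - breakpoint, 0.0)

        params, _ = curve_fit(broken_line, met, egg, p0=[52.0, 80.0, 0.33])
        plateau, slope, breakpoint = params
        print(f"estimated requirement (breakpoint) = {breakpoint:.3f} % Met")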

  13. Dependencies among Architectural Views Got from Software Requirements Based on a Formal Model

    Directory of Open Access Journals (Sweden)

    Osis Janis

    2014-12-01

    A system architect has software requirements and some unspecified knowledge about a problem domain (e.g., an enterprise) as source information for the assessment and evaluation of possible solutions and for arriving at the target point, a preliminary software design. The deciding factor is the architect's experience and expertise in the problem domain (“AS-IS”). The proposed approach is dedicated to assisting a system architect in making an appropriate decision on the solution (“TO-BE”). It is based on a formal mathematical model, the Topological Functioning Model (TFM). Compliant TFMs can be transformed into software architectural views. The paper demonstrates and discusses the tracing of dependency links from the requirements to and between the architectural views.

  14. Footbridge Response Predictions and Their Sensitivity to Stochastic Load Assumptions

    DEFF Research Database (Denmark)

    Pedersen, Lars; Frier, Christian

    2011-01-01

    Knowledge about footbridge response to actions of walking is important in assessments of vibration serviceability. In a number of design codes for footbridges, the vibration serviceability limit state is assessed using a walking load model in which the walking parameters (step frequency, pedestrian weight, ...) are treated as stochastic properties of pedestrians for predicting footbridge response, which is meaningful, and a step forward. Modelling walking parameters stochastically, however, requires decisions to be made in terms of their statistical distribution and the parameters describing the statistical distribution. The paper investigates the sensitivity of results of computations of bridge response to some of the decisions to be made in this respect. This is a useful approach, placing focus on which decisions (and which information) are important for sound estimation of bridge response. The studies involve estimating footbridge responses using...
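
    A common way to set up such a study is a harmonic walking load with a stochastic step frequency propagated through a single-degree-of-freedom bridge model. The sketch below uses assumed modal properties and distribution parameters, not those of the paper:

        import numpy as np

        rng = np.random.default_rng(1)

        # Assumed SDOF modal properties of the footbridge (illustrative).
        f_n, zeta, m_modal = 2.0, 0.005, 40000.0     # Hz, damping ratio, kg

        def peak_acceleration(f_step, weight=750.0, alpha=0.4):
            """Steady-state acceleration amplitude for a harmonic walking load.

            Load model F(t) = alpha * weight * sin(2*pi*f_step*t), pushed through
            the standard SDOF frequency-response magnitude at the step frequency.
            """
            beta = f_step / f_n                       # frequency ratio
            denom = np.sqrt((1 - beta**2) ** 2 + (2 * zeta * beta) ** 2)
            return (alpha * weight / m_modal) * beta**2 / denom

        # Stochastic step frequency across the pedestrian population
        # (mean and spread are assumptions of this sketch).
        f_steps = rng.normal(loc=1.95, scale=0.17, size=20000)
        acc = peak_acceleration(f_steps)
        print(f"mean peak acc = {acc.mean():.3f} m/s^2, "
              f"95th percentile = {np.percentile(acc, 95):.3f} m/s^2")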

  15. THE COMPLEX OF ASSUMPTION CATHEDRAL OF THE ASTRAKHAN KREMLIN

    Directory of Open Access Journals (Sweden)

    Savenkova Aleksandra Igorevna

    2016-08-01

    This article is devoted to an architectural and historical analysis of the constructions forming the complex of the Assumption Cathedral of the Astrakhan Kremlin, which has not previously been considered as a subject of special research. Based on archival sources, photographic materials, publications and on-site investigations of the monuments, the creation history of the complete architectural complex, sustained in one style of the Muscovite baroque and unique in its composite construction, is considered, and its interpretation in the all-Russian architectural context is offered. Typological features of the single constructions come to light. The Prechistinsky bell tower has an untypical architectural solution: a “hexagonal structure on octagonal and quadrangular structures”. The way of connecting the building of the Cathedral and the chambers by a passage was characteristic of monastic constructions and was exceedingly rare in kremlins, farmsteads and ensembles of city cathedrals. The composite scheme of the Assumption Cathedral includes the Lobnoye Mesto (“the Place of Execution”) located on an axis from the west; it is connected with the main building by a quarter-turn with landing. The only prototype of the structure is the Lobnoye Mesto on Red Square in Moscow. The article considers the version that the Place of Execution emerged on the basis of an earlier existing construction, a tower called “the Peal”, which is repeatedly mentioned in written sources in connection with S. Razin's revolt. The metropolitan Sampson, trying to preserve the importance of the Astrakhan metropolitanate, built the Assumption Cathedral and the Place of Execution directly appealing to a capital prototype, to emphasize the continuity and close connection with Moscow.

  16. HYPROLOG: A New Logic Programming Language with Assumptions and Abduction

    DEFF Research Database (Denmark)

    Christiansen, Henning; Dahl, Veronica

    2005-01-01

    HYPROLOG is a new logic programming language with assumptions and abduction. The language shows a novel flexibility in the interaction between the different paradigms, including all additional built-in predicates and constraint solvers that may be available. Assumptions and abduction are especially useful for language processing, and we can show how HYPROLOG works seamlessly together with the grammar notation provided by the underlying Prolog system. An operational semantics is given which complies with standard declarative semantics for the "pure" sublanguages, while for the full HYPROLOG language it must be taken as definition. The implementation is straightforward and seems to provide...

  17. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality an

  18. Seamless Requirements

    OpenAIRE

    Naumchev, Alexandr; Meyer, Bertrand

    2017-01-01

    Popular notations for functional requirements specifications frequently ignore developers' needs, target specific development models, or require translation of requirements into tests for verification; the results can give out-of-sync or downright incompatible artifacts. Seamless Requirements, a new approach to specifying functional requirements, contributes to developers' understanding of requirements and to software quality regardless of the process, while the process itself becomes lighter...

  19. High resolution weather data for urban hydrological modelling and impact assessment, ICT requirements and future challenges

    Science.gov (United States)

    ten Veldhuis, Marie-claire; van Riemsdijk, Birna

    2013-04-01

    Hydrological analysis of urban catchments requires high resolution rainfall and catchment information because of the small size of these catchments, high spatial variability of the urban fabric, fast runoff processes and related short response times. Rainfall information available from traditional radar and rain gauge networks does not match the relevant scales of urban hydrology. A new type of weather radar, based on X-band frequency and equipped with Doppler and dual polarimetry capabilities, promises to provide more accurate rainfall estimates at the spatial and temporal scales that are required for urban hydrological analysis. Recently, the RAINGAIN project was started to analyse the applicability of this new type of radar in the context of urban hydrological modelling. In this project, meteorologists and hydrologists work closely together in several stages of urban hydrological analysis: from the acquisition procedure of novel and high-end radar products to data acquisition and processing, rainfall data retrieval, hydrological event analysis and forecasting. The project comprises four pilot locations with various characteristics of weather radar equipment, ground stations, urban hydrological systems, modelling approaches and requirements. Access to data processing and modelling software is handled in different ways in the pilots, depending on ownership and user context. Sharing of data and software among pilots and with the outside world is an ongoing topic of discussion. The availability of high resolution weather data augments requirements with respect to the resolution of hydrological models and input data. This has led to the development of fully distributed hydrological models, the implementation of which remains limited by the unavailability of hydrological input data. On the other hand, if models are to be used in flood forecasting, hydrological models need to be computationally efficient to enable fast responses to extreme event conditions. This

  20. The metaphysics of D-CTCs: On the underlying assumptions of Deutsch's quantum solution to the paradoxes of time travel

    Science.gov (United States)

    Dunlap, Lucas

    2016-11-01

    I argue that Deutsch's model for the behavior of systems traveling around closed timelike curves (CTCs) relies implicitly on a substantive metaphysical assumption. Deutsch is employing a version of quantum theory with a significantly supplemented ontology of parallel existent worlds, which differ in kind from the many worlds of the Everett interpretation. Standard Everett does not support the existence of multiple identical copies of the world, which the D-CTC model requires. This has been obscured because he often refers to the branching structure of Everett as a "multiverse", and describes quantum interference by reference to parallel interacting definite worlds. But he admits that this is only an approximation to Everett. The D-CTC model, however, relies crucially on the existence of a multiverse of parallel interacting worlds. Since his model is supplemented by structures that go significantly beyond quantum theory, and play an ineliminable role in its predictions and explanations, it does not represent a quantum solution to the paradoxes of time travel.

  1. A critical review of the data requirements for fluid flow models through fractured rock

    International Nuclear Information System (INIS)

    Priest, S.D.

    1986-01-01

    The report is a comprehensive critical review of the data requirements for ten models of fluid flow through fractured rock, developed in Europe and North America. The first part of the report contains a detailed review of rock discontinuities and how their important geometrical properties can be quantified. This is followed by a brief summary of the fundamental principles in the analysis of fluid flow through two-dimensional discontinuity networks and an explanation of a new approach to the incorporation of variability and uncertainty into geotechnical models. The report also contains a review of the geological and geotechnical properties of anhydrite and granite. Of the ten fluid flow models reviewed, only three offer a realistic fracture network model for which it is feasible to obtain the input data. Although some of the other models have some valuable or novel features, there is a tendency to concentrate on the simulation of contaminant transport processes, at the expense of providing a realistic fracture network model. Only two of the models reviewed, neither of them developed in Europe, have seriously addressed the problem of analysing fluid flow in three-dimensional networks. (author)

  2. Minimal Requirements for Primary HIV Latency Models Based on a Systematic Review.

    Science.gov (United States)

    Bonczkowski, Pawel; De Scheerder, Marie-Angélique; De Spiegelaere, Ward; Vandekerckhove, Linos

    2016-01-01

    Due to the scarcity of HIV-1 latently infected cells in patients, in vitro primary latency models are now commonly used to study the HIV-1 reservoir. To this end, a number of experimental systems have been developed. Most of these models differ based on the nature of the primary CD4+ T-cell type, the used HIV strains, activation methods, and latency assessment strategies. Despite these differences, most models share some common characteristics. Here, we provide a systematic review covering the primary HIV latency models that have been used to date with the aim to compare these models and identify minimal requirements for such experiments. A systematic search on PubMed and Web of Science databases generated a short list of 17 unique publications that propose new in vitro latency models. Based on the described methods, we propose and discuss a generalized workflow, visualizing all the necessary steps to perform such an in vitro study, with the key choices and validation steps that need to be made; from cell type selection until the model readout.

  3. Our Electron Model vindicates Schrödinger's Incomplete Results and Requires Restatement of Heisenberg's Uncertainty Principle

    Science.gov (United States)

    McLeod, David; McLeod, Roger

    2008-04-01

    The electron model used in our other joint paper here requires revision of some foundational physics. That electron model followed from comparing the experimentally proved results of human vision models using spatial Fourier transformations, SFTs, of pincushion and Hermann grids. Visual systems detect "negative" electric field values for darker so-called "illusory" diagonals that are physical consequences of the lens SFT of the Hermann grid, distinguishing this from light "illusory" diagonals. This indicates that oppositely directed vectors of the separate illusions are discretely observable, constituting another foundational fault in quantum mechanics, QM. The SFT of human vision is merely the scaled SFT of QM. Reciprocal space results of wavelength and momentum mimic reciprocal relationships between space variable x and spatial frequency variable p, by the experiment mentioned. Nobel laureate physicist von Békésy, physiology of hearing, 1961, performed pressure input Rect x inputs that the brain always reports as truncated Sinc p, showing again that the brain is an adjunct built by sight, preserves sign sense of EMF vectors, and is hard wired as an inverse SFT. These require vindication of Schrödinger's actual, but incomplete, wave model of the electron as having physical extent over the wave, and question Heisenberg's uncertainty proposal.

  4. Simulation of fertilizer requirement for irrigated wheat in Eastern India using the QUEFTS model

    Directory of Open Access Journals (Sweden)

    Debtanu Maiti

    2006-01-01

    Crop modeling can provide us with information about fertilizer dose to achieve the target yield, crop conditions, etc. Due to conventional and imbalanced fertilizer application, nutrient use efficiency in wheat is low. Estimation of fertilizer requirements based on quantitative approaches can assist in improving yields and nutrient use efficiency. Field experiments were conducted at 20 sites in eastern India (Nadia district of West Bengal) to assess the soil supply, requirement, and internal efficiency of N, P, K, and Zn in wheat. The data were used to calibrate the QUEFTS (Quantitative Evaluation of the Fertility of Tropical Soils) model for site-specific, balanced fertilizer recommendations. The parameters of maximum accumulation (a) and maximum dilution (d) in wheat were calculated for N (35, 100), P (129, 738), K (17, 56), and Zn (21502, 140244). Grain yield of wheat showed statistically significant correlation with N (R2 = 0.937**), P (R2 = 0.901**), and K uptake (R2 = 0.801**). The NPK ratio to produce 1 tonne grain yield of wheat was calculated to be 4.9:1.0:8.9. The relationships between chemical properties and nutrient-supplying capacity of soils were also established. The model was validated using the data from four other experiments. Observed yields with different amounts of N, P, K, and Zn were in good agreement with the predicted values, suggesting that the validated QUEFTS model can be used for site-specific nutrient management of wheat.

  5. Modeling orbital relative motion to enable formation design from application requirements

    Science.gov (United States)

    Fasano, Giancarmine; D'Errico, Marco

    2009-11-01

    While trajectory design for single satellite Earth observation missions is usually performed by means of analytical and relatively simple models of orbital dynamics including the main perturbations for the considered cases, most literature on formation flying dynamics is devoted to control issues rather than mission design. This work aims at bridging the gap between mission requirements and relative dynamics in multi-platform missions by means of an analytical model that describes relative motion for satellites moving on near circular low Earth orbits. The development is based on the orbital parameters approach and both the cases of close and large formations are taken into account. Secular Earth oblateness effects are included in the derivation. Modeling accuracy, when compared to a nonlinear model with two body and J2 forces, is shown to be of the order of 0.1% of relative coordinates for timescales of hundreds of orbits. An example of formation design is briefly described shaping a two-satellite formation on the basis of geometric requirements for synthetic aperture radar interferometry.
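
    For orientation, the best-known linearized description of relative motion about a near-circular reference orbit is the Clohessy-Wiltshire (Hill) system; the paper itself works with differential orbital parameters and includes secular J2 effects, which the plain equations below neglect:

        \ddot{x} - 2n\dot{y} - 3n^2 x = 0
        \ddot{y} + 2n\dot{x} = 0
        \ddot{z} + n^2 z = 0

    Here x, y and z are the deputy's radial, along-track and cross-track offsets in the chief's rotating frame, and n is the chief's mean motion.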

  6. Antidepressant-like Effects of Electroconvulsive Seizures Require Adult Neurogenesis in a Neuroendocrine Model of Depression.

    Science.gov (United States)

    Schloesser, Robert J; Orvoen, Sophie; Jimenez, Dennisse V; Hardy, Nicholas F; Maynard, Kristen R; Sukumar, Mahima; Manji, Husseini K; Gardier, Alain M; David, Denis J; Martinowich, Keri

    2015-01-01

    Neurogenesis continues throughout life in the hippocampal dentate gyrus. Chronic treatment with monoaminergic antidepressant drugs stimulates hippocampal neurogenesis, and new neurons are required for some antidepressant-like behaviors. Electroconvulsive seizures (ECS), a laboratory model of electroconvulsive therapy (ECT), robustly stimulate hippocampal neurogenesis. ECS requires newborn neurons to improve behavioral deficits in a mouse neuroendocrine model of depression. We utilized immunohistochemistry for doublecortin (DCX), a marker of migrating neuroblasts, to assess the impact of Sham or ECS treatments (1 treatment per day, 7 treatments over 15 days) on hippocampal neurogenesis in animals receiving 6 weeks of either vehicle or chronic corticosterone (CORT) treatment in the drinking water. We conducted tests of anxiety- and depressive-like behavior to investigate the ability of ECS to reverse CORT-induced behavioral deficits. We also determined whether adult neurons are required for the effects of ECS. For these studies we utilized a pharmacogenetic model (hGFAPtk) to conditionally ablate adult born neurons. We then evaluated behavioral indices of depression after Sham or ECS treatments in CORT-treated wild-type animals and CORT-treated animals lacking neurogenesis. ECS is able to rescue CORT-induced behavioral deficits in indices of anxiety- and depressive-like behavior. ECS increases both the number and dendritic complexity of adult-born migrating neuroblasts. The ability of ECS to promote antidepressant-like behavior is blocked in mice lacking adult neurogenesis. ECS ameliorates a number of anxiety- and depressive-like behaviors caused by chronic exposure to CORT. ECS requires intact hippocampal neurogenesis for its efficacy in these behavioral indices. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. The extended evolutionary synthesis: its structure, assumptions and predictions.

    Science.gov (United States)

    Laland, Kevin N; Uller, Tobias; Feldman, Marcus W; Sterelny, Kim; Müller, Gerd B; Moczek, Armin; Jablonka, Eva; Odling-Smee, John

    2015-08-22

    Scientific activities take place within the structured sets of ideas and assumptions that define a field and its practices. The conceptual framework of evolutionary biology emerged with the Modern Synthesis in the early twentieth century and has since expanded into a highly successful research program to explore the processes of diversification and adaptation. Nonetheless, the ability of that framework satisfactorily to accommodate the rapid advances in developmental biology, genomics and ecology has been questioned. We review some of these arguments, focusing on literatures (evo-devo, developmental plasticity, inclusive inheritance and niche construction) whose implications for evolution can be interpreted in two ways—one that preserves the internal structure of contemporary evolutionary theory and one that points towards an alternative conceptual framework. The latter, which we label the 'extended evolutionary synthesis' (EES), retains the fundaments of evolutionary theory, but differs in its emphasis on the role of constructive processes in development and evolution, and reciprocal portrayals of causation. In the EES, developmental processes, operating through developmental bias, inclusive inheritance and niche construction, share responsibility for the direction and rate of evolution, the origin of character variation and organism-environment complementarity. We spell out the structure, core assumptions and novel predictions of the EES, and show how it can be deployed to stimulate and advance research in those fields that study or use evolutionary biology. © 2015 The Author(s).

  8. Basic concepts and assumptions behind the new ICRP recommendations

    International Nuclear Information System (INIS)

    Lindell, B.

    1979-01-01

    A review is given of some of the basic concepts and assumptions behind the current recommendations by the International Commission on Radiological Protection in ICRP Publications 26 and 28, which form the basis for the revision of the Basic Safety Standards jointly undertaken by IAEA, ILO, NEA and WHO. Special attention is given to the assumption of a linear, non-threshold dose-response relationship for stochastic radiation effects such as cancer and hereditary harm. The three basic principles of protection are discussed: justification of practice, optimization of protection and individual risk limitation. In the new ICRP recommendations particular emphasis is given to the principle of keeping all radiation doses as low as is reasonably achievable. A consequence of this is that the ICRP dose limits are now given as boundary conditions for the justification and optimization procedures rather than as values that should be used for purposes of planning and design. The fractional increase in total risk at various ages after continuous exposure near the dose limits is given as an illustration. The need for taking other sources, present and future, into account when applying the dose limits leads to the use of the commitment concept. This is briefly discussed as well as the new quantity, the effective dose equivalent, introduced by ICRP. (author)

  9. The contour method cutting assumption: error minimization and correction

    Energy Technology Data Exchange (ETDEWEB)

    Prime, Michael B [Los Alamos National Laboratory; Kastengren, Alan L [ANL

    2010-01-01

    The recently developed contour method can measure a 2-D, cross-sectional map of residual stress. A part is cut in two using a precise and low-stress cutting technique such as electric discharge machining. The contours of the new surfaces created by the cut, which will not be flat if residual stresses are relaxed by the cutting, are then measured and used to calculate the original residual stresses. The precise nature of the assumption about the cut is presented theoretically and evaluated experimentally. Simply assuming a flat cut is overly restrictive and misleading. The critical assumption is that the width of the cut, when measured in the original, undeformed configuration of the body, is constant. Stresses at the cut tip during cutting cause the material to deform, which causes errors. The effect of such cutting errors on the measured stresses is presented, and the important parameters are quantified. Experimental procedures for minimizing these errors are presented, and an iterative finite element procedure to correct for the errors is also presented. The correction procedure is demonstrated on experimental data from a steel beam that was plastically bent to impart a known profile of residual stresses.
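
    A minimal sketch of one standard contour-method data-reduction step, with invented geometry and noise levels (all numbers are illustrative, not from the paper): the contours measured on the two opposing cut surfaces are averaged, which cancels antisymmetric components such as shear effects, and the average is smoothed before being applied as finite-element boundary conditions.

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        x = np.linspace(0.0, 100.0, 201)           # position along the cut (mm)
        rng = np.random.default_rng(0)
        true_contour = 0.02 * np.sin(x / 15.0)     # hypothetical surface deformation (mm)
        side_a = true_contour + rng.normal(0.0, 1e-3, x.size)  # measured surface A
        side_b = true_contour + rng.normal(0.0, 1e-3, x.size)  # measured surface B

        averaged = 0.5 * (side_a + side_b)         # cancels antisymmetric error terms
        smoothed = UnivariateSpline(x, averaged, s=x.size * 1e-6)
        print(float(smoothed(50.0)))               # smoothed contour height at mid-cut (mm)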

  11. Estimation of cold extremes and the identical distribution assumption

    Science.gov (United States)

    Parey, Sylvie

    2016-04-01

    Extreme, generally unobserved, values of meteorological (or other) hazards are estimated from observed time series by applying statistical extreme value theory. This theory is based on the essential assumption that the events are independent and identically distributed. This assumption is generally not verified for meteorological hazards, firstly because these phenomena are seasonal, and secondly because climate change may induce temporal trends. These issues can be dealt with by selecting the season of occurrence or by handling trends in the extreme distribution parameters, for example. When recently updating extreme cold temperatures, we faced rather new difficulties: the threshold choice proved exceptionally difficult when applying the Peak Over Threshold (POT) approach, and when applying block maxima, different block sizes could lead to significantly different return levels. A more detailed analysis of the exceedances of different cold thresholds showed that, as the threshold becomes more extreme, the exceedances are not identically distributed across the years. This behaviour could be related to the preferred phase of the North Atlantic Oscillation (NAO) during each winter, and the return level estimation was therefore based on sub-sampling between negative and positive NAO winters. The approach and the return level estimation from the sub-samples will be illustrated with an example.
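
    A minimal sketch of the block-maxima and sub-sampling machinery on synthetic data (the winter minima, NAO indices, and the 50-year horizon are all hypothetical): a GEV distribution is fitted to negated winter minima, and return levels are compared between the full sample and NAO-phase sub-samples.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(0)
        winter_min = rng.normal(-12.0, 4.0, size=60)   # synthetic winter-minimum temps (degC)
        nao_index = rng.normal(0.0, 1.0, size=60)      # synthetic winter-mean NAO indices

        def return_level(minima, T=50):
            """T-year cold return level via GEV fit to block maxima of negated minima."""
            c, loc, scale = genextreme.fit(-minima)
            return -genextreme.isf(1.0 / T, c, loc=loc, scale=scale)

        print(return_level(winter_min))                 # all winters pooled
        print(return_level(winter_min[nao_index < 0]))  # negative-NAO winters only
        print(return_level(winter_min[nao_index >= 0])) # positive-NAO winters only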

  12. Bogen's Critique of Linear-No-Threshold Default Assumptions.

    Science.gov (United States)

    Crump, Kenny S

    2017-10-01

    In an article recently published in this journal, Bogen (1) concluded that an NRC committee's recommendations that default linear, no-threshold (LNT) assumptions be applied to dose-response assessment for noncarcinogens and nonlinear mode-of-action carcinogens are not justified. Bogen criticized two arguments used by the committee for LNT: that it holds when any new dose adds to a background dose that explains background levels of risk (additivity to background, AB), or when there is substantial interindividual heterogeneity in susceptibility (SIH) in the exposed human population. Bogen showed by examples that the SIH argument can fail. Herein, a general proof is outlined that confirms Bogen's claim. However, it is also noted that SIH leads to a non-threshold population distribution even if individual distributions all have thresholds, and that small changes to SIH assumptions can result in LNT. Bogen criticizes AB because it only applies when there is additivity to background, but offers no help in deciding when or how often AB holds. Bogen does not contradict the fact that AB can lead to LNT but notes that, even if low-dose linearity results, the response at higher doses may not be useful in predicting the amount of low-dose linearity. Although this is theoretically true, it seems reasonable to assume that generally there is some quantitative relationship between the low-dose slope and the slope suggested at higher doses. Several incorrect or misleading statements by Bogen are noted. © 2016 Society for Risk Analysis.
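
    A minimal numeric sketch of the AB argument (the sigmoidal dose-response and background dose below are hypothetical): when a small dose is added on top of a nonzero background sitting on a smooth dose-response curve, the added risk is approximately proportional to the added dose, i.e., locally linear with slope equal to the derivative at the background dose.

        import numpy as np

        def population_response(dose):
            # Hypothetical smooth, sigmoidal dose-response with nonzero background risk.
            return 1.0 / (1.0 + np.exp(-(dose - 5.0)))

        background = 2.0                        # nonzero background dose (assumed)
        for delta in (1e-1, 1e-2, 1e-3, 1e-4):
            extra = population_response(background + delta) - population_response(background)
            print(delta, extra / delta)         # ratio converges to a fixed low-dose slope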

  13. Expansion of the Kano model to identify relevant customer segments and functional requirements

    DEFF Research Database (Denmark)

    Atlason, Reynir Smari; Stefansson, Arnaldur Smari; Wietz, Miriam

    2017-01-01

    The Kano model of customer satisfaction has been widely used to analyse perceived needs of customers. The model provides product developers valuable information about if, and then how much, a given functional requirement (FR) will impact customer satisfaction if implemented within a product, system or a service. A current limitation of the Kano model is that it does not allow developers to visualise which combined sets of FRs would provide the highest satisfaction between different customer segments. In this paper, a stepwise method to address this particular shortcoming is presented. First, … more than one combined customer segment. It further shows which segments provide the highest possibility for high satisfaction of combined sets of FRs. We demonstrate the usefulness of this approach in a case study involving customers' preference for outdoor sports equipment.
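
    A minimal sketch of per-segment Kano scoring using the commonly used Better/Worse satisfaction coefficients (the segment names and classification counts are invented for illustration): Better = (A + O) / (A + O + M + I) and Worse = -(O + M) / (A + O + M + I), where A, O, M, I count Attractive, One-dimensional, Must-be and Indifferent classifications of an FR.

        def kano_coefficients(a, o, m, i):
            """Better/Worse indices from Kano classification counts for one FR."""
            total = a + o + m + i
            return (a + o) / total, -(o + m) / total

        # Hypothetical classification counts (A, O, M, I) for one FR in two segments.
        segments = {"hikers": (30, 25, 10, 35), "climbers": (10, 40, 30, 20)}
        for name, counts in segments.items():
            better, worse = kano_coefficients(*counts)
            print(name, round(better, 2), round(worse, 2))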

  14. Evaluation of olive flowering at low latitude sites in Argentina using a chilling requirement model

    Energy Technology Data Exchange (ETDEWEB)

    Aybar, V.E.; Melo-Abreu, J.P. de; Searles, P.S.; Matias, A.G.; Del Rio, C.; Caballero, C. M.; Rousseaux, M.C.

    2015-07-01

    Olive production has expanded significantly from the Mediterranean Basin into the New World over the last two decades. In some cases, cultivars of European origin have been introduced at a large commercial scale with little previous evaluation of potential productivity. The objective of this study was to evaluate whether a temperature-driven simulation model developed in the Mediterranean Basin to predict normal flowering occurrence and flowering date using cultivar-specific thermal requirements was suitable for the low latitude areas of Northwest Argentina. The model was validated at eight sites over several years and a wide elevation range (350–1200 m above mean sea level) for three cultivars (‘Arbequina’, ‘Frantoio’, ‘Leccino’) with potentially different chilling requirements. In ‘Arbequina’, normal flowering was observed at almost all sites and in all years, while normal flowering events in ‘Frantoio’ and ‘Leccino’ were uncommon. The model successfully predicted if flowering would be normal in 92% and 83% of the cases in ‘Arbequina’ and ‘Frantoio’, respectively, but was somewhat less successful in ‘Leccino’ (61%). When flowering occurred, the predicted flowering date was within ± 7 days of the observed date in 71% of the cases. Overall, the model results indicate that cultivar-specific simulation models may be used as an approximate tool to predict whether individual cultivars will be successful in new growing areas. In Northwest Argentina, the model could be used to identify cultivars to replace ‘Frantoio’ and ‘Leccino’ and to simulate global warming scenarios. (Author)
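
    A minimal sketch of a chilling-requirement check of the kind the abstract describes (the chilling band, synthetic temperatures, and cultivar thresholds are illustrative placeholders, not the paper's calibrated values): chill units are accumulated from hourly winter temperatures and compared against cultivar-specific requirements to flag whether normal flowering is expected.

        import numpy as np

        def chill_units(hourly_temp, low=0.0, high=7.2):
            """Count hours with temperature inside the chilling-effective band."""
            t = np.asarray(hourly_temp)
            return int(np.sum((t >= low) & (t <= high)))

        requirements = {"Arbequina": 600, "Frantoio": 1100, "Leccino": 1200}  # hypothetical
        winter_hours = np.random.default_rng(1).normal(8.0, 6.0, size=24 * 120)
        accumulated = chill_units(winter_hours)
        for cultivar, req in requirements.items():
            status = "normal flowering expected" if accumulated >= req else "chilling unmet"
            print(cultivar, accumulated, status)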

  15. Topographic controls on shallow groundwater levels in a steep, prealpine catchment: When are the TWI assumptions valid?

    NARCIS (Netherlands)

    Rinderer, M.; van Meerveld, H.J.; Seibert, J.

    2014-01-01

    Topographic indices like the Topographic Wetness Index (TWI) have been used to predict spatial patterns of average groundwater levels and to model the dynamics of the saturated zone during events (e.g., TOPMODEL). However, the assumptions underlying the use of the TWI in hydrological models, of
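
    For reference, a minimal sketch of the index itself, TWI = ln(a / tan(beta)), with a the specific upslope contributing area and beta the local slope angle (the sample values below are arbitrary):

        import numpy as np

        def twi(specific_area, slope_rad, min_slope=1e-3):
            """TWI from specific catchment area (m) and local slope (radians)."""
            tan_b = np.maximum(np.tan(slope_rad), np.tan(min_slope))  # avoid division by zero
            return np.log(specific_area / tan_b)

        print(twi(np.array([10.0, 100.0, 1000.0]), np.radians([30.0, 10.0, 2.0])))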

  16. A review of some critical assumptions in the relationship between economic activity and freight transport

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Kveiborg, Ole

    2004-01-01

    … national accounts. With these data we are able to check some of the assumptions that have commonly been made. Our findings thus have implications for future freight modelling exercises, in particular for what data it is necessary to collect and what relationships it is necessary to seek to model explicitly. We find that it is necessary to account for the changing composition of production across industries, but that the commodity mix within each industry can safely be regarded as constant. Changing value densities account for almost a third of transport growth; however, this is attributable to the first…

  17. Application and project portfolio valuation using enterprise architecture and business requirements modelling

    Science.gov (United States)

    Quartel, Dick; Steen, Maarten W. A.; Lankhorst, Marc M.

    2012-05-01

    This article describes an architecture-based approach to IT valuation. This approach offers organisations an instrument to valuate their application and project portfolios and to make well-balanced decisions about IT investments. The value of a software application is assessed in terms of its contribution to a selection of business goals. Based on such assessments, the value of different applications can be compared, and requirements for innovation, development, maintenance and phasing out can be identified. IT projects are proposed to realise the requirements. The value of each project is assessed in terms of the value it adds to one or more applications. This value can be obtained by relating the 'as-is' application portfolio to the 'to-be' portfolio that is being proposed by the project portfolio. In this way, projects can be ranked according to their added value, given a certain selection of business goals. The approach uses ArchiMate to model the relationship between software applications, business processes, services and products. In addition, two language extensions are used to model the relationship of these elements to business goals and requirements and to projects and project portfolios. The approach is illustrated using the portfolio method of Bedell and has been implemented in BiZZdesign Architect.
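
    A minimal sketch of the ranking idea under stated assumptions (the applications, goals, weights, and scores are all invented for illustration; the article itself works with ArchiMate models and Bedell's portfolio method): an application's value is its weighted contribution to selected business goals, and a project's value is the portfolio value added when moving from the 'as-is' to the 'to-be' portfolio.

        # Hypothetical goal contributions per application in the two portfolios.
        as_is = {"CRM": {"retention": 3, "cost": 1}, "Billing": {"cost": 2}}
        to_be = {"CRM": {"retention": 5, "cost": 1}, "Billing": {"cost": 4}}
        goal_weights = {"retention": 0.6, "cost": 0.4}   # selected business goals

        def portfolio_value(portfolio):
            """Sum of goal-weighted contributions over all applications."""
            return sum(goal_weights[goal] * score
                       for app in portfolio.values() for goal, score in app.items())

        project_value = portfolio_value(to_be) - portfolio_value(as_is)
        print(round(project_value, 2))   # added value of the proposed project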

  18. Safety Culture: A Requirement for New Business Models — Lessons Learned from Other High Risk Industries

    International Nuclear Information System (INIS)

    Kecklund, L.

    2016-01-01

    Technical development and changes on global markets affect all high risk industries, creating opportunities as well as risks related to the achievement of safety and business goals. Changes in legal and regulatory frameworks as well as in market demands create a need for major changes. Several high risk industries are facing a situation where they have to develop new business models. Within the transportation domain, e.g., aviation and railways, there is a growing concern about how the new business models may affect safety issues. New business models in aviation and railways include extensive use of outsourcing and subcontractors to reduce costs, resulting in, e.g., negative changes in working conditions, work hours, employment conditions and high turnover rates. The energy sector also faces pressure to create new business models for the transition to renewable energy production, to comply with new legal and regulatory requirements, and to make best use of new reactor designs. In addition, large-scale phase-out and decommissioning of nuclear facilities have to be managed by the nuclear industry. Some negative effects of new business models have already arisen within the transportation domain, e.g., the negative effects of extensive outsourcing and subcontractor use. In the railway domain, the infrastructure manager is required by European and national regulations to assure that all subcontractors are working according to the requirements in the infrastructure manager's SMS (Safety Management System). More than ten levels of subcontracting can be at work in a major infrastructure project, making the system highly complex and thus difficult to control. In the aviation domain, tightly coupled interacting computer networks supplying airport services, as well as air traffic control, are managed and maintained by several different companies, creating numerous interfaces which must be managed by the SMS. There are examples where a business model with several low

  19. Modelling regional variability of irrigation requirements due to climate change in Northern Germany.

    Science.gov (United States)

    Riediger, Jan; Breckling, Broder; Svoboda, Nikolai; Schröder, Winfried

    2016-01-15

    The question whether global climate change invalidates the efficiency of established land use practice cannot be answered without systemic considerations on a region-specific basis. In this context, plant water availability and irrigation requirements were investigated in Northern Germany. The regions under investigation (Diepholz, Uelzen, Fläming and Oder-Spree) represent a climatic gradient of increasing continentality from West to East. Besides regional climatic variation and climate change, soil conditions and crop management differ on the regional scale. In the model regions, seasonal droughts already influence crop success today, but at different levels of intensity depending mainly on climate conditions. By linking soil water holding capacities, crop management data and calculations of evapotranspiration and precipitation from the climate change scenario RCP 8.5, irrigation requirements for maintaining crop productivity were estimated for the years 1991 to 2070. The results suggest that water requirements for crop irrigation are likely to increase, with considerable regional variation. For some of the regions, irrigation requirements might increase to such an extent that the established regional agricultural practice might be hard to retain. Where water availability is limited, agricultural practice, such as management and the cultivated crop spectrum, has to be changed to deal with the new challenges. Copyright © 2015 Elsevier B.V. All rights reserved.
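
    A minimal sketch of the water-balance logic behind such estimates, under stated assumptions (the ETc series, rainfall model, and application efficiency are illustrative, not the study's RCP 8.5-driven values): the seasonal irrigation requirement is the accumulated positive deficit between crop evapotranspiration and effective precipitation.

        import numpy as np

        def irrigation_requirement(etc_mm, precip_mm, efficiency=0.75):
            """Seasonal irrigation need (mm), assuming a fixed application efficiency."""
            deficit = np.maximum(np.asarray(etc_mm) - np.asarray(precip_mm), 0.0)
            return deficit.sum() / efficiency

        daily_etc = np.full(180, 3.5)    # hypothetical growing-season ETc (mm/day)
        daily_rain = np.random.default_rng(2).gamma(0.4, 4.0, size=180)  # synthetic rain
        print(round(irrigation_requirement(daily_etc, daily_rain), 1), "mm per season")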

  20. Modeling traceability information and functionality requirement in export-oriented tilapia chain.

    Science.gov (United States)

    Zhang, Xiaoshuan; Feng, Jianying; Xu, Mark; Hu, Jinyou

    2011-05-01

    Tilapia has been named as the 'food fish of the 21st century' and has become the most important farmed fish. China is the world leader in tilapia production and export. Identifying information and functional requirements is critical in developing an efficient traceability system because traceability has become a fundamental prerequisite for exporting aquaculture products. This paper examines the export-oriented tilapia chains and information flow in the chains, and identifies the key actors, information requirements and information-capturing points. Unified Modeling Language (UML) technology is adopted to describe the information and functionality requirement for chain traceability. The barriers of traceability system adoption are also identified. The results show that the traceability data consist of four categories that must be recorded by each link in the chain. The functionality requirement is classified into four categories from the fundamental information record to decisive quality control; the top three barriers to the traceability system adoption are: high costs of implementing the system, lack of experienced and professional staff; and low level of government involvement and support. Copyright © 2011 Society of Chemical Industry.

  1. 78 FR 42009 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2013-07-15

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in... assumptions--for paying plan benefits under terminating single-employer plans covered by title IV of the... assumptions are intended to reflect current conditions in the financial and annuity markets. Assumptions under...

  2. 78 FR 11093 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2013-02-15

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in... assumptions--for paying plan benefits under terminating single-employer plans covered by title IV of the... assumptions are intended to reflect current conditions in the financial and annuity markets. Assumptions under...

  3. Influence of simulation assumptions and input parameters on energy balance calculations of residential buildings

    International Nuclear Information System (INIS)

    Dodoo, Ambrose; Tettey, Uniben Yao Ayikoe; Gustavsson, Leif

    2017-01-01

    In this study, we modelled the influence of different simulation assumptions on energy balances of two variants of a residential building, comprising the building in its existing state and with energy-efficient improvements. We explored how selected parameter combinations and variations affect the energy balances of the building configurations. The selected parameters encompass outdoor microclimate, building thermal envelope and household electrical equipment including technical installations. Our modelling takes into account hourly as well as seasonal profiles of different internal heat gains. The results suggest that the impact of parameter interactions on calculated space heating of buildings is somewhat small and relatively more noticeable for an energy-efficient building in contrast to a conventional building. We find that the influence of parameter combinations is more apparent as more individual parameters are varied. The simulations show that a building's calculated space heating demand is significantly influenced by how heat gains from electrical equipment are modelled. For the analyzed building versions, calculated final energy for space heating differs by 9–14 kWh/m² depending on the assumed energy efficiency level for electrical equipment. The influence of electrical equipment on calculated final space heating is proportionally more significant for an energy-efficient building compared to a conventional building. This study shows the influence of different simulation assumptions and parameter combinations when varied simultaneously. - Highlights: • Energy balances are modelled for conventional and efficient variants of a building. • Influence of assumptions and parameter combinations and variations are explored. • Parameter interactions influence is apparent as more single parameters are varied. • Calculated space heating demand is notably affected by how heat gains are modelled.
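
    A minimal sketch of why the internal-gains assumption moves the space-heating result (the UA value, degree hours, floor area, and gain levels are all invented, and the real study uses hourly dynamic simulation rather than this static balance): higher assumed gains from electrical equipment directly offset the gross heating demand.

        UA = 120.0           # hypothetical overall heat-loss coefficient (W/K)
        floor_area = 160.0   # heated floor area (m2)

        def space_heating_kwh_m2(internal_gain_w, degree_hours_kkh=80.0):
            """Annual net space heating per m2, assuming constant internal gains."""
            gross = UA * degree_hours_kkh          # (W/K) * kKh = kWh per year
            gains = internal_gain_w * 8.76         # constant W over 8760 h = kWh per year
            return max(gross - gains, 0.0) / floor_area

        for gain in (200.0, 400.0, 600.0):         # efficient vs conventional equipment
            print(gain, round(space_heating_kwh_m2(gain), 1), "kWh/m2")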

  4. Technical support document for proposed revision of the model energy code thermal envelope requirements

    Energy Technology Data Exchange (ETDEWEB)

    Conner, C.C.; Lucas, R.G.

    1993-02-01

    This report documents the development of the proposed revision of the Council of American Building Officials' (CABO) 1993 supplement to the 1992 Model Energy Code (MEC) (referred to as the 1993 MEC) building thermal envelope requirements for single-family and low-rise multifamily residences. The goal of this analysis was to develop revised guidelines based on an objective methodology that determined the most cost-effective (least total life-cycle cost [LCC]) combination of energy conservation measures (ECMs) for residences in different locations. The ECMs with the lowest LCC were used as a basis for proposing revised MEC maximum U_o-value (thermal transmittance) curves in the MEC format. The changes proposed here affect the requirements for "group R" residences. The group R residences are detached one- and two-family dwellings (referred to as single-family) and all other residential buildings three stories or less (referred to as multifamily).

  6. Model of assessment of requirements of privacy, security and quality of service for mobile medical applications

    Directory of Open Access Journals (Sweden)

    Edward Paul Guillen Pinto

    2017-08-01

    Introduction: The development of mobile technologies has facilitated the creation of mHealth applications, which are considered key tools for safe, high-quality care of patients from remote populations and areas lacking infrastructure for the provision of health services. The article presents a proposal for an evaluation model that makes it possible to determine weaknesses and vulnerabilities at the level of security and quality of service (QoS) in mHealth applications. Objective: To develop an analysis model that supports decision making concerning the use and production of safe applications, minimizing the impact and the probability of occurrence of computer security risks. Materials and methods: The research is descriptive, detailing the characteristics that mobile health applications must have in order to achieve an optimum level of safety. The methodology uses the rules that regulate applications and combines them with security analysis techniques, using the risk characterization of the Open Web Application Security Project (OWASP) and the QoS requirements of the International Telecommunication Union (ITU). Results: An effective analysis of actual current applications was obtained, showing their weaknesses and the aspects to be corrected to comply with appropriate security parameters. Conclusions: The model makes it possible to evaluate the security and quality of service (QoS) requirements of mobile health applications, and it can be used to evaluate current applications or to generate criteria before deployment.

  7. Model of an aquaponic system for minimised water, energy and nitrogen requirements.

    Science.gov (United States)

    Reyes Lastiri, D; Slinkert, T; Cappon, H J; Baganz, D; Staaks, G; Keesman, K J

    2016-01-01

    Water and nutrient savings can be established by coupling water streams between interacting processes. Wastewater from production processes contains nutrients like nitrogen (N), which can and should be recycled in order to meet future regulatory discharge demands. Optimisation of interacting water systems is a complex task. An effective way of understanding, analysing and optimising such systems is by applying mathematical models. The present modelling work aims at supporting the design of a nearly emission-free aquaculture and hydroponic system (aquaponics), thus contributing to sustainable production and to food security for the 21st century. Based on the model, a system that couples 40 m³ fish tanks and a hydroponic system of 1,000 m² can produce 5 tons of tilapia and 75 tons of tomato yearly. The system requires energy to condense and recover evaporated water, for lighting and heating, adding up to 1.3 GJ/m² every year. In the suggested configuration, the fish can provide about 26% of the N required in a plant cycle. A coupling strategy that sends water from the fish to the plants in amounts proportional to the fish feed input reduces the standard deviation of the NO₃⁻ level in the fish cycle by 35%.
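
    A minimal sketch of the nitrogen-coupling arithmetic behind the abstract's 26% figure (the feed rate and per-kg excretion and demand coefficients are illustrative placeholders chosen to land near that value, not the paper's parameters): fish-side N supply is taken proportional to feed input and compared with the plant cycle's N demand.

        feed_kg_day = 60.0          # hypothetical daily feed input (kg/day)
        n_excreted_per_feed = 0.03  # kg N excreted per kg feed (assumed)
        plant_n_demand = 7.0        # kg N/day required by the hydroponic cycle (assumed)

        n_from_fish = feed_kg_day * n_excreted_per_feed
        coverage = n_from_fish / plant_n_demand
        print(f"fish supply {coverage:.0%} of plant N demand")  # ~26% with these numbers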

  8. Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts

    Science.gov (United States)

    Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.

    1997-01-01

    ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed to this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3. The findings of 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried

  9. Simulation Modeling Requirements for Loss-of-Control Accident Prevention of Turboprop Transport Aircraft

    Science.gov (United States)

    Crider, Dennis; Foster, John V.

    2012-01-01

    … This paper addresses simulation modeling requirements that are unique to turboprop transport aircraft and highlights the growing need for aerodynamic models suitable for stall training for these configurations. A review of prominent accidents that involved aerodynamic stall is used to illustrate various modeling features unique to turboprop configurations and the impact of stall behavior on susceptibility to loss of control that has led to new training requirements. This is followed by an overview of stability and control behavior of straight-wing turboprops, the related aerodynamic characteristics, and a summary of recent experimental studies on icing effects. In addition, differences in flight dynamics behavior between swept-wing jets and straight-wing turboprop configurations are discussed to compare and contrast modeling requirements. Specific recommendations for aerodynamic models along with further research needs and data measurements are also provided.

  10. Modelling of radon control and air cleaning requirements in underground uranium mines

    International Nuclear Information System (INIS)

    El Fawal, M.; Gadalla, A.

    2014-01-01

    As part of a comprehensive study concerned with controlling workplace short-lived radon daughter concentrations in underground uranium mines to safe levels, a computer program has been developed and verified to calculate ventilation parameters, e.g. local pressures, flow rates and radon daughter concentration levels. The computer program is composed of two parts, one part for mine ventilation and the other part for radon daughter level calculations. The program has been validated in an actual case study to calculate the radon concentration levels, pressures and flow rates required to maintain acceptable radon concentrations at each point of the mine. The required fan static pressure and the approximate energy consumption were also estimated. The results of the calculations have been evaluated and compared with a similar investigation. It was found that the calculated values are in good agreement with the corresponding values obtained using the 'REDES' standard ventilation modelling software. The developed computer model can be used as an available tool to help in the evaluation of ventilation systems proposed by the mining authority, to assist the uranium mining industry in maintaining the health and safety of workers underground while efficiently achieving economic production targets. It could also be used for regulatory inspection and radiation protection assessments of workers in underground mining. With this model, one can effectively design, assess and manage underground mine ventilation systems. Values of radon decay product concentrations in units of working level, pressure drops and flow rates required to reach acceptable radon concentrations relative to the recommended levels at different extraction points in the mine, as well as fan static pressure, can be estimated; these are not available using other software. (author)
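
    A minimal physics sketch related to such calculations, under stated assumptions (the source strength, volume, and air-change rates are invented, and this is the textbook steady-state balance rather than the program's ventilation-network solver): the equilibrium radon concentration in a ventilated space follows C = S / (V · (λ + n)), with S the radon source strength, V the ventilated volume, λ radon's decay constant, and n the air-change rate.

        import math

        S = 5.0e4                          # hypothetical radon source strength (Bq/h)
        V = 2.0e3                          # ventilated volume (m3)
        lam = math.log(2) / (3.82 * 24)    # radon decay constant (1/h), half-life 3.82 d
        for ach in (0.5, 2.0, 6.0):        # air changes per hour
            c = S / (V * (lam + ach))      # steady-state balance: S = V*(lam+n)*C
            print(ach, round(c, 1), "Bq/m3")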

  11. New media in strategy – mapping assumptions in the field

    DEFF Research Database (Denmark)

    Gulbrandsen, Ib Tunby; Plesner, Ursula; Raviola, Elena

    2018-01-01

    There is plenty of empirical evidence for claiming that new media make a difference for how strategy is conceived and executed. Furthermore, there is a rapidly growing body of literature that engages with this theme and offers recommendations regarding appropriate strategic actions in relation to new media. By contrast, there is relatively little attention to the assumptions behind strategic thinking in relation to new media. This article reviews the most influential strategy journals, asking how new media are conceptualized. It is shown that strategy scholars have a tendency to place themselves in either a deterministic or a voluntaristic camp with regard to technology. Strategy is portrayed as either determined by new media or a matter of rationally using them. Additionally, most articles portray the organization as a neatly delineated entity, where new media are relevant either…

  12. Commentary: profiling by appearance and assumption: beyond race and ethnicity.

    Science.gov (United States)

    Sapién, Robert E

    2010-04-01

    In this issue, Acquaviva and Mintz highlight issues regarding racial profiling in medicine and how it is perpetuated through medical education: physicians are taught to make subjective determinations of race and/or ethnicity in case presentations, and such assumptions may affect patient care. The author of this commentary believes that the discussion should be broadened to include profiling on the basis of general appearance. The author reports personal experiences as someone who has profiled and been profiled by appearance: sometimes by skin color, sometimes by other physical attributes. In the two cases detailed here, patient care could have been affected had the author not become aware of his practices in such situations. The author advocates raising awareness of profiling in the broader sense through training.

  13. HYPROLOG: A New Logic Programming Language with Assumptions and Abduction

    DEFF Research Database (Denmark)

    Christiansen, Henning; Dahl, Veronica

    2005-01-01

    … The language shows a novel flexibility in the interaction between the different paradigms, including all additional built-in predicates and constraint solvers that may be available. Assumptions and abduction are especially useful for language processing, and we show how HYPROLOG works seamlessly together with the grammar notation provided by the underlying Prolog system. An operational semantics is given which complies with standard declarative semantics for the "pure" sublanguages, while for the full HYPROLOG language it must be taken as the definition. The implementation is straightforward and seems to provide, for abduction, the most efficient of known implementations; the price, however, is a limited use of negations. The main difference w.r.t. previous implementations of abduction is that we avoid any level of meta-interpretation by having Prolog execute the deductive steps directly and by treating abducibles (and…

  14. Deconstructing Community for Conservation: Why Simple Assumptions are Not Sufficient.

    Science.gov (United States)

    Waylen, Kerry Ann; Fischer, Anke; McGowan, Philip J K; Milner-Gulland, E J

    2013-01-01

    Many conservation policies advocate engagement with local people, but conservation practice has sometimes been criticised for a simplistic understanding of communities and social context. To counter this, this paper explores social structuring and its influences on conservation-related behaviours at the site of a conservation intervention near Pipar forest, within the Seti Khola valley, Nepal. Qualitative and quantitative data from questionnaires and Rapid Rural Appraisal demonstrate how links between groups directly and indirectly influence behaviours of conservation relevance (including existing and potential resource-use and proconservation activities). For low-status groups the harvesting of resources can be driven by others' preference for wild foods, whilst perceptions of elite benefit-capture may cause reluctance to engage with future conservation interventions. The findings reiterate the need to avoid relying on simple assumptions about 'community' in conservation, and particularly the relevance of understanding relationships between groups, in order to understand natural resource use and implications for conservation.

  15. Unconditionally Secure and Universally Composable Commitments from Physical Assumptions

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Scafuro, Alessandra

    2013-01-01

    We present a constant-round unconditional black-box compiler that transforms any ideal (i.e., statistically-hiding and statistically-binding) straight-line extractable commitment scheme into an extractable and equivocal commitment scheme, therefore yielding UC-security [9]. We exemplify the usefulness of our compiler by providing two (constant-round) instantiations of ideal straight-line extractable commitment based on (malicious) PUFs [36] and stateless tamper-proof hardware tokens [26], therefore achieving the first unconditionally UC-secure commitments with malicious PUFs and stateless tokens, respectively. Our constructions are secure against adversaries creating arbitrarily malicious stateful PUFs/tokens. Previous results with malicious PUFs used either computational assumptions to achieve UC-secure commitments or were unconditionally secure but only in the indistinguishability sense [36]. Similarly…

  16. Vitamin D Signaling in the Bovine Immune System: A Model for Understanding Human Vitamin D Requirements

    Directory of Open Access Journals (Sweden)

    Corwin D. Nelson

    2012-03-01

    The endocrine physiology of vitamin D in cattle has been rigorously investigated and has yielded information on vitamin D requirements, endocrine function in health and disease, general metabolism, and maintenance of calcium homeostasis in cattle. These results are relevant to human vitamin D endocrinology. The current debate regarding vitamin D requirements is centered on the requirements for proper intracrine and paracrine vitamin D signaling. Studies in adult and young cattle can provide valuable insight for understanding vitamin D requirements as they relate to innate and adaptive immune responses during infectious disease. In cattle, toll-like receptor recognition activates an intracrine and paracrine vitamin D signaling mechanism in the immune system that regulates innate and adaptive immune responses in the presence of adequate 25-hydroxyvitamin D. Furthermore, experiments with mastitis in dairy cattle have provided in vivo evidence for the intracrine vitamin D signaling mechanism in macrophages as well as vitamin D mediated suppression of infection. Epidemiological evidence indicates that circulating concentrations above 32 ng/mL of 25-hydroxyvitamin D are necessary for optimal vitamin D signaling in the immune system, but experimental evidence is lacking for that value. Experiments in cattle can provide that evidence, as circulating 25-hydroxyvitamin D concentrations can be experimentally manipulated within ranges that are normal for humans and cattle. Additionally, young and adult cattle can be experimentally infected with bacteria and viruses associated with significant diseases in both cattle and humans. Utilizing the bovine model to further delineate the immunomodulatory role of vitamin D will provide potentially valuable insights into the vitamin D requirements of both humans and cattle, especially as they relate to immune response capacity and infectious disease resistance.

  17. Meeting Human Reliability Requirements through Human Factors Design, Testing, and Modeling

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Boring

    2007-06-01

    In the design of novel systems, it is important for the human factors engineer to work in parallel with the human reliability analyst to arrive at the safest achievable design that meets design team safety goals and certification or regulatory requirements. This paper introduces the System Development Safety Triptych, a checklist of considerations for the interplay of human factors and human reliability through design, testing, and modeling in product development. This paper also explores three phases of safe system development, corresponding to the conception, design, and implementation of a system.

  18. Verification of voltage/ frequency requirement for emergency diesel generator in nuclear power plant using dynamic modeling

    International Nuclear Information System (INIS)

    Hur, J.S.; Roh, M.S.

    2013-01-01

    One major cause of plant shutdown is the loss of electrical power. The study examines the coping actions against station blackout, including the emergency diesel generator (EDG) and the sequential loading of safety systems, and verifies with a dynamic modeling tool that the EDG meets its requirements, especially the voltage and frequency criteria. The paper also considers changes to the sequencing time and load capacity, but only to identify electrical design margin; any revision of the load list must be verified by safety analysis. The study shows that a new load calculation is a key factor in EDG localization and in increasing in-house capability. (author)

  19. Old and New Ideas for Data Screening and Assumption Testing for Exploratory and Confirmatory Factor Analysis

    Science.gov (United States)

    Flora, David B.; LaBrish, Cathy; Chalmers, R. Philip

    2011-01-01

    We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables, and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables. PMID:22403561
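
    A minimal numeric sketch of the paper's point about product-moment correlations of categorical items, on synthetic data (the latent correlation and cut point are arbitrary): dichotomizing continuous variables attenuates the Pearson correlation relative to the latent correlation that a polychoric approach is designed to recover.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000
        latent = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=n)
        binary = (latent > 0.5).astype(float)   # Likert-style dichotomization at a cut point

        print(np.corrcoef(latent.T)[0, 1])      # ~0.60: correlation of the continuous scores
        print(np.corrcoef(binary.T)[0, 1])      # noticeably smaller after dichotomizing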

  20. Mathematically modelling the power requirement for a vertical shaft mowing machine

    Directory of Open Access Journals (Sweden)

    Jorge Simón Pérez de Corcho Fuentes

    2008-09-01

    Full Text Available This work describes a mathematical model for determining the power demand for a vertical shaft mowing machine, particularly taking into account the influence of speed on cutting power, which is different from that of other models of mowers. The influence of the apparatus’ rotation and translation speeds was simulated in determining power demand. The results showed that no chan-ges in cutting power were produced by varying the knives’ angular speed (if translation speed was constant, while cutting power became increased if translation speed was increased. Variations in angular speed, however, influenced other parameters deter-mining total power demand. Determining this vertical shaft mower’s cutting pattern led to obtaining good crop stubble quality at the mower’s lower rotation speed, hence reducing total energy requirements.

  1. Fast tracking ICT infrastructure requirements and design, based on Enterprise Reference Architecture and matching Reference Models

    DEFF Research Database (Denmark)

    Bernus, Peter; Baltrusch, Rob; Vesterager, Johan

    2002-01-01

    The Globemen Consortium has developed the virtual enterprise reference architecture and methodology (VERAM), based on GERAM, and developed reference models for virtual enterprise management and joint mission delivery. The planned virtual enterprise capability includes the areas of sales and marketing, global engineering, and customer relationship management. The reference models are the basis for the development of ICT infrastructure requirements. These in turn can be used for ICT infrastructure specification (sometimes referred to as 'ICT architecture'). Part of the ICT architecture is industry-wide, part of it is industry-specific and a part is specific to the domains of the joint activity that characterises the given Virtual Enterprise Network at hand. The article advocates a step-by-step approach to building virtual enterprise capability.

  2. The USEtox story: A survey of model developer visions and user requirements

    DEFF Research Database (Denmark)

    Westh, Torbjørn Bochsen; Hauschild, Michael Zwicky; Birkved, Morten

    2015-01-01

    … we analyzed user expectations and experiences and compared them with the developers' visions. Methods: We applied qualitative and quantitative data collection methods, including an online questionnaire, semi-structured user and developer interviews, and a review of the scientific literature. Questionnaire and interview results were analyzed from an actor-network perspective in order to understand user needs and to compare these with the developers' visions. Requirement engineering methods, more specifically function tree, system context, and activity diagrams, were iteratively applied and structured to develop … recommendations: … into LCA software and methods, (4) improve update/testing procedures, (5) strengthen communication between developers and users, and (6) extend model scope. By generalizing our recommendations to guide scientific model development in a broader context, we emphasize acknowledging different levels of user …

  3. Modeling regulatory policies associated with offshore structure removal requirements in the Gulf of Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, Mark J. [Center for Energy Studies, Louisiana State University, Energy Coast and Environment Building, Baton Rouge, LA (United States)

    2008-07-15

    Federal regulations require that a lease in the Outer Continental Shelf of the Gulf of Mexico be cleared of all structures within one year after production on the lease ceases, but in recent years, the Minerals Management Service has begun to encourage operators to remove idle (non-producing) structures on producing leases that are no longer 'economically viable'. At the end of 2003, there were 2175 producing structures, 898 idle (non-producing) structures, and 440 auxiliary (never-producing) structures on 1356 active leases; and 329 idle structures and 65 auxiliary structures on 273 inactive leases. The purpose of this paper is to model the impact of alternative regulatory policies on the removal trends of structures and the inventory of idle iron, and to provide first-order estimates of the cost of each regulatory option. A description of the modeling framework and implementation results is presented. (author)

  4. A new model for evaluating maintenance energy requirements in dogs: allometric equation from 319 pet dogs.

    Science.gov (United States)

    Divol, Guilhem; Priymenko, Nathalie

    2017-01-01

    Reports concerning maintenance energy requirements (MER) in dogs are common but most of the data cover laboratory or utility dogs. This study establishes those of healthy adult pet dogs and the factors which cause these energy requirements to vary. Within the framework of a nutrition teaching exercise, each student followed a pet from his entourage and gathered accurate records of its feeding habits. Data have been restricted to healthy adult dogs with an ideal body weight (BW) which did not vary more than 5 % during the study period. A total of 319 eligible records were analysed using multiple linear regression. Variation factors such as ownership, breed, sex and neutered status, bedding location, temperament and feeding habits were then analysed individually using a non-parametric model. Two models result from this study, one excluding age (r² = 0.813) and a more accurate one which takes into consideration the age in years (r² = 0.816). The second model was assessed with the main variation factors and shows that: MER (kcal/d) = k1 × k2 × k3 × k4 × k5 × 128 × BW^0.740 × age^-0.050 (r² = 0.836), with k1 the effect of breed, k2 the effect of sex and neutered status, k3 the effect of bedding location, k4 the effect of temperament and k5 the effect of the type of feed. The resulting model is very similar to the recommendations made by the National Research Council (2006) but a greater accuracy was obtained using age raised to a negative power, as demonstrated in human nutrition.
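
    A minimal sketch of the fitted allometric equation as stated in the abstract, MER (kcal/d) = k1·k2·k3·k4·k5 × 128 × BW^0.740 × age^-0.050 (the k-factor values below are neutral placeholders, not the paper's estimates):

        def mer_kcal_per_day(bw_kg, age_years, k_factors=(1.0, 1.0, 1.0, 1.0, 1.0)):
            """MER from body weight (kg) and age (years); k-factors multiply in
            breed, sex/neuter status, bedding location, temperament and feed type."""
            k = 1.0
            for factor in k_factors:
                k *= factor
            return k * 128.0 * bw_kg**0.740 * age_years**-0.050

        print(round(mer_kcal_per_day(20.0, 4.0)))   # e.g. a 20 kg, 4-year-old dog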

  5. Vehicle Modeling for use in the CAFE model: Process description and modeling assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Moawad, Ayman [Argonne National Lab. (ANL), Argonne, IL (United States); Kim, Namdoo [Argonne National Lab. (ANL), Argonne, IL (United States); Rousseau, Aymeric [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-06-01

    The objective of this project is to develop and demonstrate a process that, at a minimum, provides more robust information that can be used to calibrate inputs applicable under the CAFE model’s existing structure. The project will be more fully successful if a process can be developed that minimizes the need for decision trees and replaces the synergy factors by inputs provided directly from a vehicle simulation tool. The report provides a description of the process that was developed by Argonne National Laboratory and implemented in Autonomie.

  6. Infrastructure requirement of knowledge management system model of statistical learning tool (SLT) for education community

    Science.gov (United States)

    Abdullah, Rusli; Samah, Bahaman Abu; Bolong, Jusang; D'Silva, Jeffrey Lawrence; Shaffril, Hayrol Azril Mohamed

    2014-09-01

    Today, teaching and learning (T&L) using technology as a tool is becoming more important, especially in the field of statistics as part of the subject matter in the higher education environment. Even though there are many types of statistical learning tool (SLT) technology which can be used to support and enhance the T&L environment, there is a lack of a common standard knowledge management system, in the form of a knowledge portal, to provide guidance, especially in relation to the infrastructure requirements of SLT in serving the community of users (CoU), such as educators, students and other parties who are interested in using this technology as a tool for their T&L. Therefore, there is a need for a common standard infrastructure requirement for a knowledge portal that helps the CoU manage statistical knowledge: acquiring, storing, disseminating and applying it for their specific purposes. Furthermore, having this infrastructure requirement of a knowledge portal model of SLT as guidance for promoting knowledge of best practice among the CoU can also enhance the quality and productivity of their work towards excellence of statistical knowledge application in the education environment.

  7. [In-depth interviews and the Kano model to determine user requirements in a burns unit].

    Science.gov (United States)

    González-Revaldería, J; Holguín-Holgado, P; Lumbreras-Marín, E; Núñez-López, G

    To determine the healthcare requirements of patients in a Burns Unit, using qualitative techniques such as in-depth personal interviews and Kano's methodology. Qualitative methodology using in-depth personal interviews (12 patients), Kano's conceptual model, and the SERVQHOS questionnaire (24 patients). All patients had been hospitalised in the last 12 months in the Burns Unit. Using Kano's methodology, service attributes were grouped by affinity diagrams and classified as follows: must-be; attractive (unexpected, providing great satisfaction); and one-dimensional (linked to the degree of functionality of the service). The outcomes were compared with those obtained with the SERVQHOS questionnaire. From the analysis of the in-depth interviews, 11 requirements were obtained, referring to hotel aspects, information, the need for a closer staff relationship, and organisational aspects. The attributes classified as must-be were free television and automatic TV disconnection at midnight. Those classified as attractive were: an individual room for more privacy, information about dressing change times in order to avoid anxiety, and additional staff for in-patients. The results were complementary to those obtained with the SERVQHOS questionnaire. In-depth personal interviews provide extra knowledge about patient requirements, complementing the information obtained with questionnaires. With this methodology, more active patient participation is achieved and the companion's opinion is also taken into account. Copyright © 2016 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  8. Model-based Assessment for Balancing Privacy Requirements and Operational Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Knirsch, Fabian [Salzburg Univ. (Austria); Engel, Dominik [Salzburg Univ. (Austria); Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States); Prasanna, Viktor [Univ. of Southern California, Los Angeles, CA (United States)

    2015-02-17

    The smart grid changes the way energy is produced and distributed. In addition, both energy and information are exchanged bidirectionally among participating parties. Heterogeneous systems therefore have to cooperate effectively in order to achieve a common high-level use case, such as smart metering for billing or demand response for load curtailment. Furthermore, a substantial amount of personal data is often needed to achieve that goal. Capturing and processing personal data in the smart grid increases customer concerns about privacy; in addition, certain statutory and operational requirements regarding privacy-aware data processing and storage have to be met. An increase in privacy constraints, however, often limits the operational capabilities of the system. In this paper, we present an approach that automates the process of finding an optimal balance between privacy requirements and operational requirements in a smart grid use case and application scenario. This is achieved by formally describing use cases in an abstract model and by finding an algorithm that determines the optimum balance by forward-mapping privacy and operational impacts. For this optimal balancing algorithm, both a numeric approximation and, if feasible, an analytic assessment are presented and investigated. The system is evaluated by applying the tool to a real-world use case from the University of Southern California (USC) microgrid.
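
    A minimal sketch of the balancing idea, under stated assumptions (the privacy parameter, both impact curves, and the equal weighting are all invented; the paper derives its impact mappings from formal use-case models): sweep a privacy-relevant parameter, such as data aggregation granularity, and pick the setting that maximizes a combined score of operational utility and privacy.

        import numpy as np

        granularity = np.linspace(0.0, 1.0, 101)    # 0 = fully aggregated, 1 = raw data
        operational = np.sqrt(granularity)          # assumed diminishing operational returns
        privacy = 1.0 - granularity**2              # assumed accelerating privacy loss
        score = 0.5 * operational + 0.5 * privacy   # equal weighting (an assumption)

        best = granularity[np.argmax(score)]
        print(round(float(best), 2))                # granularity with the best trade-off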

  9. Integrated Data Capturing Requirements for 3d Semantic Modelling of Cultural Heritage: the Inception Protocol

    Science.gov (United States)

    Di Giulio, R.; Maietti, F.; Piaia, E.; Medici, M.; Ferrari, F.; Turillazzi, B.

    2017-02-01

    The generation of high-quality 3D models can still be very time-consuming and expensive, and the outcome of digital reconstructions is frequently provided in formats that are not interoperable and therefore cannot be easily accessed. This challenge is even more crucial for complex architectures and large heritage sites, which involve a large amount of data to be acquired, managed and enriched by metadata. In this framework, the ongoing EU-funded project INCEPTION - Inclusive Cultural Heritage in Europe through 3D semantic modelling - proposes a workflow aimed at the achievement of efficient 3D digitization methods, post-processing tools for enriched semantic modelling, and web-based solutions and applications to ensure wide access for experts and non-experts. In order to face these challenges, and to start solving the issue of the large amount of captured data and the time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. The purpose is to guide the processes of digitization of cultural heritage, respecting the needs, requirements and specificities of cultural assets.

  10. Corporate Data Network (CDN) data requirements task. Enterprise Model. Volume 1

    International Nuclear Information System (INIS)

    1985-11-01

    The NRC has initiated a multi-year program to centralize its information processing in a Corporate Data Network (CDN). The new information processing environment will include shared databases, telecommunications, office automation tools, and state-of-the-art software. Touche Ross and Company was contracted to perform a general data requirements analysis for shared databases and to develop a preliminary plan for implementation of the CDN concept. The Enterprise Model (Vol. 1) provided the NRC with agency-wide information requirements in the form of data entities and organizational demand patterns, as the basis for clustering the entities into logical groups. The Data Dictionary (Vol. 2) provided the NRC with definitions and example attributes and properties for each entity. The Data Model (Vol. 3) defined logical databases and entity relationships within and between databases. The Preliminary Strategic Data Plan (Vol. 4) prioritized the development of databases and included a workplan and approach for implementation of the shared database component of the Corporate Data Network.

  11. THE 3C COOPERATION MODEL APPLIED TO THE CLASSICAL REQUIREMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vagner Luiz Gava

    2012-08-01

    Full Text Available Aspects related to users' cooperative work are not considered in the traditional approach of software engineering, since the user is viewed independently of his/her workplace environment or group, with the individual model generalized to the study of the collective behavior of all users. This work proposes a software requirements process to address issues involving cooperative work in information systems, in which coordination of the users' actions is distributed and communication among them occurs indirectly through the data entered while using the software. To achieve this goal, this research uses ergonomics, the 3C cooperation model, awareness, and software engineering concepts. Action-research is used as the research methodology, applied in three cycles during the development of a corporate workflow system in a technological research company. This article discusses the third cycle, which corresponds to the process that deals with the refinement of the cooperative work requirements with the software in actual use in the workplace, where the inclusion of a computer system changes the users' workplace from face-to-face interaction to interaction mediated by the software. The results showed that a higher degree of users' awareness of their own activities and of other system users contributes to a decrease in their errors and in inappropriate use of the system.

  12. The effects of behavioral and structural assumptions in artificial stock market

    Science.gov (United States)

    Liu, Xinghua; Gregor, Shirley; Yang, Jianmei

    2008-04-01

    Recent literature has developed the conjecture that important statistical features of stock price series, such as the fat tails phenomenon, may depend mainly on the market microstructure. This conjecture motivated us to investigate the roles of both the market microstructure and agent behavior with respect to high-frequency returns and daily returns. We developed two simple models to investigate this issue. The first is a stochastic model with a clearing house microstructure and a population of zero-intelligence agents. The second has more behavioral assumptions, based on the Minority Game, and also has a clearing house microstructure. With the first model we found that a characteristic of the clearing house microstructure, namely the clearing frequency, can explain the fat tails, excess volatility and autocorrelation phenomena of high-frequency returns. However, this feature does not cause the same phenomena in daily returns, so the stylized facts of daily returns depend mainly on the agents' behavior. With the second model we investigated the effects of behavioral assumptions on daily returns. Our study indicates that the mechanisms responsible for generating the stylized facts of high-frequency returns and daily returns are different.
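
    The clearing-house mechanism in the first model can be illustrated with a toy zero-intelligence simulation (the agent count, linear price-impact rule, and all parameters below are assumptions made here for illustration; the authors' models are not reproduced):

      # Toy clearing-house market: random unit orders accumulate and are
      # cleared every `clearing_interval` ticks; tail weight of the cleared
      # returns is probed via excess kurtosis.
      import numpy as np
      from scipy.stats import kurtosis

      rng = np.random.default_rng(0)
      n_agents, n_ticks, clearing_interval = 100, 10_000, 5
      log_price, pending, returns = 0.0, 0.0, []

      for t in range(n_ticks):
          # zero-intelligence agents submit random unit buy/sell orders
          pending += rng.choice([-1, 1], size=n_agents).sum()
          if (t + 1) % clearing_interval == 0:      # clearing house clears
              new_log_price = log_price + 1e-4 * pending
              returns.append(new_log_price - log_price)
              log_price, pending = new_log_price, 0.0

      print("excess kurtosis of cleared returns:", kurtosis(returns))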

  13. How does the rigid-lid assumption affect LES simulation results at high Reynolds number flows?

    Science.gov (United States)

    Khosronejad, Ali; Farhadzadeh, Ali; SBU Collaboration

    2017-11-01

    This research is motivated by the work of Kara et al. (JHE, 2015), who employed LES to model flow around a model abutment at a Re number of 27,000. They showed that first-order turbulence characteristics obtained with the rigid-lid (RL) assumption compare fairly well with those of the level-set (LS) method. Concerning the second-order statistics, however, their simulation results showed a significant dependence on the method used to describe the free surface. This finding can have important implications for open channel flow modeling. The Reynolds number for typical open channel flows, however, can be much larger than that of Kara et al.'s test case. Herein, we replicate the reported study, augmenting the geometric and hydraulic scales to reach a Re number one order of magnitude larger (~200,000). The Virtual Flow Simulator (VFS-Geophysics) model in its LES mode is used to simulate the test case with both the RL and LS methods. The computational results are validated using measured flow and free-surface data from our laboratory experiments. Our goal is to investigate the effects of the RL assumption on both first-order and second-order statistics at the high Reynolds numbers that occur in natural waterways. Acknowledgment: Computational resources are provided by the Center of Excellence in Wireless & Information Technology (CEWIT) of Stony Brook University.

  14. ANALYSIS OF RAINFALL DATA TO ESTIMATE RAIN CONTRIBUTION TOWARDS CROP WATER REQUIREMENT USING CROPWAT MODEL

    Directory of Open Access Journals (Sweden)

    Tahir Saeed Laghari

    2014-12-01

    Full Text Available A study was carried out to analyse rainfall data in order to estimate its contribution towards crop water requirements. Rainfall and climatic data were collected from meteorological stations in the region, the C.P. UAF rain gauge (A), AARI (B), CAA (C) and WAPDA (D), Faisalabad, and these data were reserved for cross-validation. The test station's (A) rainfall data were subjected to the double mass curve technique to check their consistency with respect to the other rainfall stations (B, C and D) in the area. The results derived by the double mass curve technique were judged accurate for the gauge station of interest because there was no break in the curve. This consistent data set was then used to determine effective rainfall. ETo was established using the Penman-Monteith method within the CROPWAT model, and the effects of parameters such as sunshine hours, wind speed, maximum and minimum temperature, rainfall and humidity were determined. It was found that reference evapotranspiration (ETo) is higher from April to September, due to the increase in temperature, and lower in the remaining months. The data were then placed in the model to obtain the crop water requirement and irrigation for representative crops (wheat and maize) of the district. It was estimated that rainfall can contribute 7.5% of the annual irrigation requirement for wheat and 15.5% for maize, so that 92.5% and 84.5% of the requirement must be met by irrigation for wheat and maize, respectively.
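
    The rain-contribution figures above reduce to a simple water-balance calculation; the sketch below uses invented seasonal totals (chosen only to reproduce the 7.5% wheat share), not the study's Faisalabad data:

      def rain_contribution(crop_water_req_mm, effective_rain_mm):
          # net irrigation = crop water requirement minus effective rainfall
          irrigation_mm = max(crop_water_req_mm - effective_rain_mm, 0.0)
          rain_share = 100.0 * effective_rain_mm / crop_water_req_mm
          return irrigation_mm, rain_share

      # e.g. a wheat season needing 400 mm with 30 mm of effective rainfall
      irr, share = rain_contribution(400.0, 30.0)
      print(f"rain supplies {share:.1f}% of demand; irrigate {irr:.0f} mm ({100 - share:.1f}%)")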

  15. Assessing women's sexuality after cancer therapy: checking assumptions with the focus group technique.

    Science.gov (United States)

    Bruner, D W; Boyd, C P

    1999-12-01

    Cancer and cancer therapies impair sexual health in a multitude of ways. The promotion of sexual health is therefore vital for preserving quality of life and is an integral part of total or holistic cancer management. To provide holistic care, nursing requires research that is meaningful to patients as well as to the profession, in order to develop educational and interventional studies that promote sexual health and coping. To obtain meaningful research data, instruments that are reliable, valid, and pertinent to patients' needs are required. Several sexual functioning instruments were reviewed for this study and found to be lacking in either a conceptual foundation or psychometric validation. Without a defined conceptual framework, the authors of the instruments must have made certain assumptions regarding what women undergoing cancer therapy experience and what they perceive as important. To check these assumptions before assessing women's sexuality after cancer therapies in a larger study, a pilot study was designed that used the focus group technique to compare what women experience and perceive as important regarding their sexuality with what is assessed in several currently available research instruments. Based on the focus group findings, current sexual functioning questionnaires may be lacking in pertinent areas of concern for women treated for breast or gynecologic malignancies. Better conceptual foundations may help future questionnaire design. Self-regulation theory may provide an acceptable conceptual framework from which to develop a sexual functioning questionnaire.

  16. Bioenergetics model for estimating food requirements of female Pacific walruses (Odobenus rosmarus divergens)

    Science.gov (United States)

    Noren, S.R.; Udevitz, M.S.; Jay, C.V.

    2012-01-01

    Pacific walruses Odobenus rosmarus divergens use sea ice as a platform for resting, nursing, and accessing extensive benthic foraging grounds. The extent of summer sea ice in the Chukchi Sea has decreased substantially in recent decades, causing walruses to alter habitat use and activity patterns, which could affect their energy requirements. We developed a bioenergetics model to estimate the caloric demand of female walruses, accounting for maintenance, growth, activity (active in-water and hauled-out resting), molt, and reproductive costs. Estimates for non-reproductive females 0–12 yr old (65−810 kg) ranged from 16359 to 68960 kcal d−1 (74−257 kcal d−1 kg−1) for years with readily available sea ice, for which we assumed animals spent 83% of their time in water. This translated into the energy content of 3200–5960 clams per day, equivalent to 7–8% and 14–19% of body mass per day for 5–12 and 2–4 yr olds, respectively. Estimated consumption rates of 12 yr old females were minimally affected by pregnancy, but lactation had a large impact, increasing consumption rates to 15% of body mass per day. Increasing the proportion of time in water to 93%, as might happen if walruses were required to spend more time foraging during ice-free periods, increased daily caloric demand by 6–7% for non-lactating females. We provide the first bioenergetics-based estimates of energy requirements for walruses and a first step towards establishing bioenergetic linkages between demography and prey requirements that can ultimately be used in predicting this population's response to environmental change.
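
    A back-of-envelope sketch of the reported conversions (the per-clam energy content below is an assumed round figure chosen only to reproduce the order of magnitude in the abstract, not a value from the paper):

      DAILY_KCAL = 68960        # upper caloric demand reported for an 810 kg female
      KCAL_PER_CLAM = 11.6      # assumed energy content per clam (illustrative)
      BODY_MASS_KG = 810

      clams_per_day = DAILY_KCAL / KCAL_PER_CLAM
      print(f"{clams_per_day:.0f} clams/day")                    # ~5950, near the 5960 upper bound
      print(f"{DAILY_KCAL / BODY_MASS_KG:.0f} kcal/day per kg")  # within the 74-257 range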

  17. Diverse Secreted Effectors Are Required for Salmonella Persistence in a Mouse Infection Model

    Energy Technology Data Exchange (ETDEWEB)

    Kidwai, Afshan S.; Mushamiri, Ivy T.; Niemann, George; Brown, Roslyn N.; Adkins, Joshua N.; Heffron, Fred

    2013-08-12

    Salmonella enterica serovar Typhimurium causes typhoid-like disease in mice and is a model of typhoid fever in humans. One of the hallmarks of typhoid is persistence, the ability of the bacteria to survive in the host weeks after infection. Virulence factors called effectors facilitate this process by direct transfer to the cytoplasm of infected cells, thereby subverting cellular processes. Secretion of effectors to the cell cytoplasm takes place through multiple routes, including two separate type III secretion (T3SS) apparatuses as well as outer membrane vesicles. The two T3SS are encoded on separate pathogenicity islands, SPI-1 and -2, with SPI-1 more strongly associated with the intestinal phase of infection, and SPI-2 with the systemic phase. Both T3SS are required for persistence, but the effectors required have not been systematically evaluated. In this study, mutations in 48 described effectors were tested for persistence. We replaced each effector with a specific DNA barcode sequence by allelic exchange and co-infected mice with a wild-type reference to calculate the ratio of wild-type parent to mutant at different times after infection. The competitive index (CI) was determined by quantitative PCR in which primers that correspond to the barcode were used for amplification. Mutations in all but seven effectors reduced persistence, demonstrating that most effectors were required. One exception was CigR, a recently discovered effector that is widely conserved throughout enteric bacteria. Deletion of cigR increased lethality, suggesting that it may be an anti-virulence factor. The fact that almost all Salmonella effectors are required for persistence argues against redundant functions. This is different from effector repertoires in other intracellular pathogens such as Legionella.
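
    The competitive-index arithmetic is straightforward to sketch (the counts and variable names below are illustrative assumptions, not data from the study):

      def competitive_index(mutant_out, wt_out, mutant_in, wt_in):
          # output ratio normalized by input ratio; CI < 1 means the effector
          # deletion reduced persistence relative to the wild-type parent
          return (mutant_out / wt_out) / (mutant_in / wt_in)

      # equal input mix; mutant recovered 50-fold below wild type at harvest
      print(competitive_index(mutant_out=2e3, wt_out=1e5, mutant_in=5e4, wt_in=5e4))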

  18. Diverse secreted effectors are required for Salmonella persistence in a mouse infection model.

    Directory of Open Access Journals (Sweden)

    Afshan S Kidwai

    Full Text Available Salmonella enterica serovar Typhimurium causes typhoid-like disease in mice and is a model of typhoid fever in humans. One of the hallmarks of typhoid is persistence, the ability of the bacteria to survive in the host weeks after infection. Virulence factors called effectors facilitate this process by direct transfer to the cytoplasm of infected cells, thereby subverting cellular processes. Secretion of effectors to the cell cytoplasm takes place through multiple routes, including two separate type III secretion (T3SS) apparatuses as well as outer membrane vesicles. The two T3SS are encoded on separate pathogenicity islands, SPI-1 and -2, with SPI-1 more strongly associated with the intestinal phase of infection, and SPI-2 with the systemic phase. Both T3SS are required for persistence, but the effectors required have not been systematically evaluated. In this study, mutations in 48 described effectors were tested for persistence. We replaced each effector with a specific DNA barcode sequence by allelic exchange and co-infected mice with a wild-type reference to calculate the ratio of wild-type parent to mutant at different times after infection. The competitive index (CI) was determined by quantitative PCR in which primers that correspond to the barcode were used for amplification. Mutations in all but seven effectors reduced persistence, demonstrating that most effectors were required. One exception was CigR, a recently discovered effector that is widely conserved throughout enteric bacteria. Deletion of cigR increased lethality, suggesting that it may be an anti-virulence factor. The fact that almost all Salmonella effectors are required for persistence argues against redundant functions. This is different from effector repertoires in other intracellular pathogens such as Legionella.

  19. About tests of the “simplifying” assumption for conditional copulas

    Directory of Open Access Journals (Sweden)

    Derumigny Alexis

    2017-08-01

    Full Text Available We discuss the so-called “simplifying assumption” of conditional copulas in a general framework. We introduce several tests of the latter assumption for non- and semiparametric copula models. Some related test procedures based on conditioning subsets instead of point-wise events are proposed. The limiting distributions of such test statistics under the null are approximated by several bootstrap schemes, most of them being new. We prove the validity of a particular semiparametric bootstrap scheme. Some simulations illustrate the relevance of our results.

  20. Experimental evaluation of the pure configurational stress assumption in the flow dynamics of entangled polymer melts

    DEFF Research Database (Denmark)

    Rasmussen, Henrik K.; Bejenariu, Anca Gabriela; Hassager, Ole

    2010-01-01

    …to the flow in the non-linear flow regime. This has allowed highly elastic measurements within the limit of pure orientational stress, as the time of the flow was considerably smaller than the Rouse time. A Doi-Edwards [J. Chem. Soc., Faraday Trans. 2 74, 1818-1832 (1978)] type of constitutive model with the assumption of pure configurational stress was accurately able to predict the startup as well as the reversed flow behavior. This confirms that this commonly used theoretical picture for the flow of polymeric liquids is a correct physical principle to apply. © 2010 The Society of Rheology. [DOI: 10.1122/1.3496378]

  1. Summary report of a seminar on geosphere modelling requirements of deep disposal of low and intermediate level radioactive wastes

    International Nuclear Information System (INIS)

    Piper, D.; Paige, R.W.; Broyd, T.W.

    1989-02-01

    A seminar on the geosphere modelling requirements of deep disposal of low and intermediate level radioactive wastes was organised by WS Atkins Engineering Sciences as part of Her Majesty's Inspectorate of Pollution's Radioactive Waste Assessment Programme. The objectives of the seminar were to review geosphere modelling capabilities and prioritise, if possible, any requirements for model development. Summaries of the presentations and subsequent discussions are given in this report. (author)

  2. Dynamic Computational Model of Symptomatic Bacteremia to Inform Bacterial Separation Treatment Requirements.

    Directory of Open Access Journals (Sweden)

    Sinead E Miller

    Full Text Available The rise of multi-drug resistance has decreased the effectiveness of antibiotics, which has led to increased mortality rates associated with symptomatic bacteremia, or bacterial sepsis. To combat decreasing antibiotic effectiveness, extracorporeal bacterial separation approaches have been proposed to capture and separate bacteria from blood. However, bacteremia is dynamic and involves host-pathogen interactions across various anatomical sites. We developed a mathematical model that quantitatively describes the kinetics of pathogenesis and progression of symptomatic bacteremia under various conditions, including bacterial separation therapy, to better understand disease mechanisms and quantitatively assess the biological impact of bacterial separation therapy. Model validity was tested against experimental data from published studies. This is the first multi-compartment model of symptomatic bacteremia in mammals that includes extracorporeal bacterial separation and antibiotic treatment, separately and in combination. The addition of an extracorporeal bacterial separation circuit reduced the predicted time of total bacteria clearance from the blood of an immunocompromised rodent by 49%, compared to antibiotic treatment alone. Implementation of bacterial separation therapy resulted in predicted multi-drug resistant bacterial clearance from the blood of a human in 97% less time than antibiotic treatment alone. The model also proposes a quantitative correlation between time-dependent bacterial load among tissues and bacteremia severity, analogous to the well-known 'area under the curve' for characterization of drug efficacy. The engineering-based mathematical model developed may be useful for informing the design of extracorporeal bacterial separation devices. This work enables the quantitative identification of the characteristics required of an extracorporeal bacteria separation device to provide biological benefit. These devices will potentially
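
    A minimal two-compartment sketch of the kind of kinetics described above (the growth, seeding, separation and antibiotic rates are toy assumptions of mine, not the paper's calibrated multi-compartment model):

      # Blood bacteria grow, seed tissue, and are removed both by an
      # extracorporeal separation circuit (k_sep) and by antibiotics (k_abx).
      from scipy.integrate import solve_ivp

      def bacteremia(t, y, r=0.5, seed=0.1, k_sep=0.8, k_abx=0.3):
          blood, tissue = y
          d_blood = (r - seed - k_sep - k_abx) * blood   # growth minus losses
          d_tissue = seed * blood - k_abx * tissue       # seeding minus killing
          return [d_blood, d_tissue]

      sol = solve_ivp(bacteremia, (0.0, 48.0), [1e4, 0.0])  # 48 h from 1e4 CFU/mL
      print("blood load after 48 h:", sol.y[0, -1])
      print("tissue load after 48 h:", sol.y[1, -1])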

  3. Climate Change: Implications for the Assumptions, Goals and Methods of Urban Environmental Planning

    Directory of Open Access Journals (Sweden)

    Kristina Hill

    2016-12-01

    …-based quantitative models of regional system behavior that may soon be used to determine acceptable land uses. Finally, the philosophical assumptions that underlie urban environmental planning are changing to address new epistemological, ontological and ethical assumptions that support new methods and goals. The inability to use the past as a guide to the future, new prioritizations of values for adaptation, and renewed efforts to focus on intergenerational justice are provided as examples. In order to represent a genuine paradigm shift, this review argues that changes must begin to be evident across the underlying assumptions, conceptual frameworks, and methods of urban environmental planning, and be attributable to the same root cause. The examples presented here represent the early stages of a change in the overall paradigm of the discipline.

  4. Guideline for Adopting the Local Reaction Assumption for Porous Absorbers in Terms of Random Incidence Absorption Coefficients

    DEFF Research Database (Denmark)

    Jeong, Cheol-Ho

    2011-01-01

    Room surfaces have been extensively modeled as locally reacting in room acoustic predictions although such modeling could yield significant errors under certain conditions. Therefore, this study aims to propose a guideline for adopting the local reaction assumption by comparing predicted random incidence acoustical characteristics of typical building elements made of porous materials assuming extended and local reaction. For each surface reaction, five well-established wave propagation models, the Delany-Bazley, Miki, Beranek, Allard-Champoux, and Biot model, are employed. Effects of the flow resistivity and the absorber thickness on the difference between the two surface reaction models are examined and discussed. For a porous absorber backed by a rigid surface, the assumption of local reaction always underestimates the random incidence absorption coefficient and the local reaction models give…

  5. Estimating Irrigation Water Requirements using MODIS Vegetation Indices and Inverse Biophysical Modeling

    Science.gov (United States)

    Imhoff, Marc L.; Bounoua, Lahouari; Harriss, Robert; Wells, Gordon; Glantz, Michael; Dukhovny, Victor A.; Orlovsky, Leah

    2007-01-01

    An inverse-process approach using satellite-driven (MODIS) biophysical modeling was used to quantitatively assess water resource demand in semi-arid and arid agricultural lands by comparing the carbon and water flux modeled under both equilibrium (in balance with prevailing climate) and non-equilibrium (irrigated) conditions. Since satellite observations of irrigated areas show higher leaf area indices (LAI) than are supportable by local precipitation, we postulate that the degree to which irrigated lands vary from equilibrium conditions is related to the amount of irrigation water used. For an observation year, we used MODIS vegetation indices, local climate data, and the SiB2 photosynthesis-conductance model to examine the relationship between climate and the water stress function for a given grid cell and observed leaf area. To estimate the minimum amount of supplemental water required for an observed cell, we added enough precipitation to the prevailing climatology at each time step to minimize the water stress function and bring the soil to field capacity. The experiment was conducted on irrigated lands along the U.S.-Mexico border and in Central Asia, and the results were compared to estimates of irrigation water used.
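
    The inverse step, adding just enough water at each time step to return the soil to field capacity, can be sketched with a toy bucket model (field capacity, ET and rainfall values below are invented, and the real method minimizes the SiB2 water stress function rather than a bucket deficit):

      FIELD_CAPACITY_MM = 150.0
      et_demand = [5.0, 6.0, 5.5, 7.0]   # daily ET under the observed LAI (toy)
      rain = [0.0, 2.0, 0.0, 1.0]        # daily precipitation (toy)

      soil, supplemental = FIELD_CAPACITY_MM, 0.0
      for et, p in zip(et_demand, rain):
          soil = min(soil - et + p, FIELD_CAPACITY_MM)
          deficit = FIELD_CAPACITY_MM - soil
          if deficit > 0.0:              # add just enough water to refill the bucket
              supplemental += deficit
              soil = FIELD_CAPACITY_MM

      print(f"minimum supplemental water: {supplemental:.1f} mm")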

  6. On some unwarranted tacit assumptions in cognitive neuroscience.

    Science.gov (United States)

    Mausfeld, Rainer

    2012-01-01

    The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper brings to mind several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question why such conceptions are nevertheless currently prevailing in many areas of psychology. The paper argues that corresponding conceptions are rooted in four different aspects of our common-sense conception of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input-output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to turn into a largely a-theoretical and data-driven endeavor while at the same time enhancing the prospects for receiving widespread public appreciation of their empirical findings.

  7. On Some Unwarranted Tacit Assumptions in Cognitive Neuroscience†

    Science.gov (United States)

    Mausfeld, Rainer

    2011-01-01

    The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper brings to mind several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question why such conceptions are nevertheless currently prevailing in many areas of psychology. The paper argues that corresponding conceptions are rooted in four different aspects of our common-sense conception of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input–output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to turn into a largely a-theoretical and data-driven endeavor while at the same time enhancing the prospects for receiving widespread public appreciation of their empirical findings. PMID:22435062

  8. Omnibus tests of the martingale assumption in the analysis of recurrent failure time data.

    Science.gov (United States)

    Jones, C L; Harrington, D P

    2001-06-01

    The Andersen-Gill multiplicative intensity (MI) model is well-suited to the analysis of recurrent failure time data. The fundamental assumption of the MI model is that the process Mi(t) for subjects i = 1, ..., n, defined to be the difference between a subject's counting process and compensator, i.e., Mi(t) = Ni(t) - Ai(t), t > 0, is a martingale with respect to some filtration. We propose omnibus procedures for testing this assumption. The methods are based on transformations of the estimated martingale residual process Mi(t), a function of consistent estimates of the log-intensity ratios and the baseline cumulative hazard. Under a correctly specified model, the expected value of Mi(t) is approximately equal to zero, with approximately uncorrelated increments. These properties are exploited in the proposed testing procedures. We examine the effects of censoring and covariate effects on the operating characteristics of the proposed methods via simulation. The procedures are most sensitive to the omission of a time-varying continuous covariate. We illustrate use of the methods in an analysis of data from a clinical trial involving patients with chronic granulomatous disease.
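
    A sketch of the residual construction Mi(t) = Ni(t) - Ai(t) on fabricated recurrent-event data (the constant-rate cumulative hazard below is an assumed stand-in for estimates from a fitted Andersen-Gill model):

      import numpy as np

      event_times = [np.array([2.0, 5.0]), np.array([1.5])]  # events per subject
      follow_up = np.array([6.0, 4.0])

      def cum_hazard(t):
          # assumed compensator: constant intensity 0.3, accrued while at risk
          return 0.3 * t

      grid = np.linspace(0.0, 6.0, 61)
      for i, times in enumerate(event_times):
          N = np.array([(times <= t).sum() for t in grid])    # counting process
          A = cum_hazard(np.minimum(grid, follow_up[i]))      # compensator
          M = N - A                                           # martingale residual
          print(f"subject {i}: M at end of follow-up = {M[-1]:.2f}")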

  9. The Avalanche Hypothesis and Compression of Morbidity: Testing Assumptions through Cohort-Sequential Analysis.

    Directory of Open Access Journals (Sweden)

    Jordan Silberman

    Full Text Available The compression of morbidity model posits a breakpoint in the adult lifespan that separates an initial period of relative health from a subsequent period of ever-increasing morbidity. Researchers often assume that such a breakpoint exists; however, this assumption is hitherto untested. To test the assumption that a breakpoint exists (which we term a morbidity tipping point) separating a period of relative health from a subsequent deterioration in health status; an analogous tipping point for healthcare costs was also investigated. Four years of adults' (N = 55,550) morbidity and costs data were retrospectively analyzed. Data were collected in Pittsburgh, PA between 2006 and 2009; analyses were performed in Rochester, NY and Ann Arbor, MI in 2012 and 2013. Cohort-sequential and hockey stick regression models were used to characterize long-term trajectories and tipping points, respectively, for both morbidity and costs. Morbidity increased exponentially with age (P<.001). A morbidity tipping point was observed at age 45.5 (95% CI, 41.3-49.7). An exponential trajectory was also observed for costs (P<.001), with a costs tipping point occurring at age 39.5 (95% CI, 32.4-46.6). Following their respective tipping points, both morbidity and costs increased substantially (Ps<.001). Findings support the existence of a morbidity tipping point, confirming an important but untested assumption. This tipping point, however, may occur earlier in the lifespan than is widely assumed. An "avalanche of morbidity" occurred after the morbidity tipping point: an ever-increasing rate of morbidity progression. For costs, an analogous tipping point and "avalanche" were observed. The time point at which costs began to increase substantially occurred approximately 6 years before health status began to deteriorate.
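
    Hockey stick regression of the kind used here can be sketched on synthetic data (the ages, rates and noise below are invented; only the breakpoint-fitting mechanics are shown):

      # Fit a broken-stick curve: flat baseline before the breakpoint,
      # linear increase after it; the breakpoint is a free parameter.
      import numpy as np
      from scipy.optimize import curve_fit

      def hockey_stick(age, breakpoint, base, slope):
          return base + slope * np.maximum(age - breakpoint, 0.0)

      rng = np.random.default_rng(1)
      age = np.linspace(20, 80, 200)
      morbidity = hockey_stick(age, 45.0, 1.0, 0.15) + rng.normal(0, 0.1, age.size)

      (bp, base, slope), _ = curve_fit(hockey_stick, age, morbidity, p0=[50.0, 1.0, 0.1])
      print(f"estimated tipping point: age {bp:.1f}")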

  10. Local conservation scores without a priori assumptions on neutral substitution rates.

    Science.gov (United States)

    Dingel, Janis; Hanus, Pavol; Leonardi, Niccolò; Hagenauer, Joachim; Zech, Jürgen; Mueller, Jakob C

    2008-04-11

    Comparative genomics aims to detect signals of evolutionary conservation as an indicator of functional constraint. Surprisingly, results of the ENCODE project revealed that about half of the experimentally verified functional elements found in non-coding DNA were classified as unconstrained by computational predictions. Following this observation, it has been hypothesized that this may be partly explained by biased estimates of the neutral evolutionary rates used by existing sequence conservation metrics. All methods we are aware of rely on a comparison with the neutral rate, and conservation is estimated by measuring the deviation of a particular genomic region from this rate. Consequently, it is reasonable to assume that inaccurate neutral rate estimates may lead to biased conservation and constraint estimates. We propose a conservation signal that is produced by local Maximum Likelihood estimation of evolutionary parameters using an optimized sliding window, and present a Kullback-Leibler projection that allows multiple different estimated parameters to be transformed into a conservation measure. This conservation measure does not rely on assumptions about neutral evolutionary substitution rates, and few a priori assumptions about the properties of the conserved regions are imposed. We show the accuracy of our approach (KuLCons) on synthetic data and compare it to the scores generated by state-of-the-art methods (phastCons, GERP, SCONE) in an ENCODE region. We find that KuLCons is most often in agreement with the conservation/constraint signatures detected by GERP and SCONE, while qualitatively very different patterns from phastCons are observed. As opposed to standard methods, KuLCons can be extended to more complex evolutionary models, e.g. taking insertion and deletion events into account, and corresponding results show that scores obtained under this model can diverge significantly from scores using the simpler model. Our results suggest that discriminating among the
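
    A heavily simplified sketch of a windowed Kullback-Leibler conservation signal (KuLCons itself estimates phylogenetic model parameters by local Maximum Likelihood; as an assumed stand-in, local column frequencies are compared with alignment-wide frequencies here):

      import numpy as np

      def kl(p, q, eps=1e-9):
          # Kullback-Leibler divergence between two count vectors
          p = np.asarray(p, float) + eps
          q = np.asarray(q, float) + eps
          p, q = p / p.sum(), q / q.sum()
          return float(np.sum(p * np.log(p / q)))

      rng = np.random.default_rng(2)
      alignment = rng.integers(0, 4, size=(10, 500))   # toy alignment, 4 bases
      global_freq = np.bincount(alignment.ravel(), minlength=4)

      window, scores = 25, []
      for start in range(alignment.shape[1] - window):
          local = np.bincount(alignment[:, start:start + window].ravel(), minlength=4)
          scores.append(kl(local, global_freq))  # larger divergence ~ stronger signal
      print("max windowed score:", max(scores))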

  11. Local conservation scores without a priori assumptions on neutral substitution rates

    Directory of Open Access Journals (Sweden)

    Hagenauer Joachim

    2008-04-01

    Full Text Available Abstract Background Comparative genomics aims to detect signals of evolutionary conservation as an indicator of functional constraint. Surprisingly, results of the ENCODE project revealed that about half of the experimentally verified functional elements found in non-coding DNA were classified as unconstrained by computational predictions. Following this observation, it has been hypothesized that this may be partly explained by biased estimates of the neutral evolutionary rates used by existing sequence conservation metrics. All methods we are aware of rely on a comparison with the neutral rate, and conservation is estimated by measuring the deviation of a particular genomic region from this rate. Consequently, it is reasonable to assume that inaccurate neutral rate estimates may lead to biased conservation and constraint estimates. Results We propose a conservation signal that is produced by local Maximum Likelihood estimation of evolutionary parameters using an optimized sliding window, and present a Kullback-Leibler projection that allows multiple different estimated parameters to be transformed into a conservation measure. This conservation measure does not rely on assumptions about neutral evolutionary substitution rates, and few a priori assumptions about the properties of the conserved regions are imposed. We show the accuracy of our approach (KuLCons) on synthetic data and compare it to the scores generated by state-of-the-art methods (phastCons, GERP, SCONE) in an ENCODE region. We find that KuLCons is most often in agreement with the conservation/constraint signatures detected by GERP and SCONE, while qualitatively very different patterns from phastCons are observed. As opposed to standard methods, KuLCons can be extended to more complex evolutionary models, e.g. taking insertion and deletion events into account, and corresponding results show that scores obtained under this model can diverge significantly from scores using the simpler model.

  12. Research requirements for a unified approach to modelling chemical effects associated with radioactive waste disposal

    International Nuclear Information System (INIS)

    Krol, A.A.; Read, D.

    1986-09-01

    This report contains the results of a review of the current modelling, laboratory experiments and field experiments being conducted in the United Kingdom to aid understanding and improve prediction of the effects of chemistry on the disposal of radioactive wastes. The aim has been to summarise present work and derive a structure for future research effort that would support the use of probabilistic risk assessment (PRA) methods for the disposal of radioactive wastes. The review was conducted by a combination of letters and personal visits, and preliminary results were reported to a plenary meeting of participants held in April 1986. Following this meeting, copies of the report were circulated to participants at draft stage, so that the finalised report can be taken to represent, as far as possible, a consensus of opinion on research requirements. (author)

  13. Assessment for Improvement: Two Models for Assessing a Large Quantitative Reasoning Requirement

    Directory of Open Access Journals (Sweden)

    Mary C. Wright

    2015-03-01

    Full Text Available We present two models for assessment of a large and diverse quantitative reasoning (QR) requirement at the University of Michigan. These approaches address two key challenges in assessment: (1) dissemination of findings for curricular improvement and (2) resource constraints associated with measurement of large programs. Approaches we present for data collection include convergent validation of self-report surveys, as well as the use of mixed methods and learning analytics. Strategies we present for dissemination of findings include meetings with instructors to share data and best practices, sharing of results through social media, and use of easily accessible dashboards. These assessment approaches may be of particular interest to universities with large numbers of students engaging in a QR experience, projects that involve multiple courses with diverse instructional goals, or those who wish to promote evidence-based curricular improvement.

  14. Improved hidden Markov model for nosocomial infections.

    Science.gov (United States)

    Khader, Karim; Leecaster, Molly; Greene, Tom; Samore, Matthew; Thomas, Alun

    2014-12-01

    We propose a novel hidden Markov model (HMM) for parameter estimation in hospital transmission models, and show that commonly made simplifying assumptions can lead to severe model misspecification and poor parameter estimates. A standard HMM that embodies two commonly made simplifying assumptions, namely a fixed patient count and binomially distributed detections, is compared with a new alternative HMM that does not require these simplifying assumptions. Using simulated data, we demonstrate how each of the simplifying assumptions used by the standard model leads to model misspecification, whereas the alternative model results in accurate parameter estimates. © The Authors 2013. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.
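
    The "standard" simplifying assumption the authors critique, binomially distributed detections in an HMM, can be sketched with a two-state toy forward algorithm (the transition matrix, hidden colonization counts and detection probability below are all invented for illustration):

      import numpy as np
      from scipy.stats import binom

      trans = np.array([[0.9, 0.1],       # toy transitions between hidden states
                        [0.2, 0.8]])
      hidden_counts = np.array([1, 4])    # colonized patients in each hidden state
      p_detect = 0.3                      # per-patient detection probability
      observed = [0, 1, 2, 1]             # detections per surveillance round

      # scaled forward algorithm with binomial emission probabilities
      alpha = np.array([0.5, 0.5]) * binom.pmf(observed[0], hidden_counts, p_detect)
      loglik = np.log(alpha.sum())
      alpha /= alpha.sum()
      for y in observed[1:]:
          alpha = (alpha @ trans) * binom.pmf(y, hidden_counts, p_detect)
          loglik += np.log(alpha.sum())
          alpha /= alpha.sum()
      print("log-likelihood under the binomial-detection HMM:", loglik)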

  15. Optimal growth of Lactobacillus casei in a Cheddar cheese ripening model system requires exogenous fatty acids.

    Science.gov (United States)

    Tan, W S; Budinich, M F; Ward, R; Broadbent, J R; Steele, J L

    2012-04-01

    Flavor development in ripening Cheddar cheese depends on complex microbial and biochemical processes that are difficult to study in natural cheese. Thus, our group has developed Cheddar cheese extract (CCE) as a model system to study these processes. In previous work, we found that CCE supported growth of Lactobacillus casei, one of the most prominent nonstarter lactic acid bacteria (NSLAB) species found in ripening Cheddar cheese, to a final cell density of 10^8 cfu/mL at 37°C. However, when similar growth experiments were performed at 8°C in CCE derived from 4-mo-old cheese (4mCCE), the final cell densities obtained were only about 10^6 cfu/mL, which is at the lower end of the range of the NSLAB population expected in ripening Cheddar cheese. Here, we report that addition of Tween 80 to CCE resulted in a significant increase in the final cell density of L. casei during growth at 8°C and produced concomitant changes in cytoplasmic membrane fatty acid (CMFA) composition. Although the effect was not as dramatic, addition of milk fat or a monoacylglycerol (MAG) mixture based on the MAG profile of milk fat to 4mCCE also led to an increased final cell density of L. casei in CCE at 8°C and changes in CMFA composition. These observations suggest that optimal growth of L. casei in CCE at low temperature requires supplementation with a source of fatty acids (FA). We hypothesize that L. casei incorporates environmental FA into its CMFA, thereby reducing its energy requirement for growth. The exogenous FA may then be modified or supplemented with FA from de novo synthesis to arrive at a CMFA composition that yields the functionality (i.e., viscosity) required for growth in specific conditions. Additional studies utilizing the CCE model to investigate microbial contributions to cheese ripening should be conducted in CCE supplemented with 1% milk fat. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  16. Model Based User's Access Requirement Analysis of E-Governance Systems

    Science.gov (United States)

    Saha, Shilpi; Jeon, Seung-Hwan; Robles, Rosslin John; Kim, Tai-Hoon; Bandyopadhyay, Samir Kumar

    The strategic and contemporary importance of e-governance has been recognized across the world. In India too, various ministries of the Government of India and State Governments have taken e-governance initiatives to provide e-services to the citizens and businesses they serve. To achieve the mission objectives and make such e-governance initiatives successful, it is necessary to improve the trust and confidence of the stakeholders. It is assumed that the delivery of government services will share the same public network information that is being used in the community at large. In particular, the Internet will be the principal means by which public access to government and government services will be achieved. To provide security measures, the main aim is to identify the users' access requirements for the stakeholders and then analyse them according to the models of Nath's approach. Based on this analysis, the government can also set security standards based on the e-governance models. Thus there will be fewer human errors and less bias. This analysis leads to the security architecture of the specific G2C application.

  17. A clinically relevant model of osteoinduction: a process requiring calcium phosphate and BMP/Wnt signalling.

    Science.gov (United States)

    Eyckmans, J; Roberts, S J; Schrooten, J; Luyten, F P

    2010-06-01

    In this study, we investigated a clinically relevant model of in vivo ectopic bone formation utilizing human periosteum-derived cells (HPDCs) seeded in a Collagraft carrier, and explored the mechanisms by which this process is driven. Bone formation occurred after eight weeks when a minimum of one million HPDCs was loaded on Collagraft carriers and implanted subcutaneously in NMRI nu/nu mice. De novo bone matrix, mainly secreted by the HPDCs, was found juxtaproximal to the calcium phosphate (CaP) granules, suggesting that CaP may have triggered the 'osteoinductive program'. Indeed, removal of the CaP granules by ethylenediaminetetraacetic acid decalcification prior to cell seeding and implantation resulted in loss of bone formation. In addition, inhibition of endogenous bone morphogenetic protein and Wnt signalling by overexpression of the secreted antagonists Noggin and Frzb, respectively, also abrogated osteoinduction. Proliferation of the engrafted HPDCs was strongly reduced in the decalcified scaffolds or when seeded with adenovirus-Noggin/Frzb transduced HPDCs, indicating that cell division of the engrafted HPDCs is required for the direct bone formation cascade. These data suggest that this model of bone formation is similar to that observed during physiological intramembranous bone development and may be of importance when investigating tissue engineering strategies.

  18. The Design of Effective ICT-Supported Learning Activities: Exemplary Models, Changing Requirements, and New Possibilities

    Directory of Open Access Journals (Sweden)

    Cameron Richards

    2005-01-01

    Full Text Available Despite the imperatives of policy and rhetoric about their integration in formal education, Information and Communication Technologies (ICTs) are often used as an "add-on" in many classrooms and in many lesson plans. Nevertheless, many teachers find that interesting and well-planned tasks, projects, and resources provide a key to harnessing the educational potential of digital resources, Internet communications and interactive multimedia to engage the interest, interaction, and knowledge construction of young learners. To the extent that such approaches go beyond and transform traditional "transmission" models of teaching and formal lesson planning, this paper investigates the changing requirements and new possibilities represented by the challenge of integrating ICTs in education in a way that connects more effectively with both the specific contents of the curriculum and the various stages and elements of the learning process. Case studies from teacher education foundation courses provide an exemplary focus of inquiry in order to better link relevant new theories or models of learning with practice, to build upon related learner-centered strategies for integrating ICT resources and tools, and to incorporate the interdependent functions of learning as information access, communication, and applied interaction. As one possible strategy in this direction, the concept of an "ICT-supported learning activity" suggests the need for teachers to approach this increasing challenge more as "designers" of effective and integrated learning rather than as mere "transmitters" of skills or information through an add-on use of ICTs.

  19. Should scientists be required to use a model-based solution to adjust for possible distance-based detectability bias?

    Science.gov (United States)

    Hutto, Richard L

    2016-07-01

    The most popular method used to gain an understanding of population trends or of differences in bird abundance among land condition categories is to use information derived from point counts. Unfortunately, various factors can affect one's ability to detect birds, and those factors need to be controlled or accounted for so that any difference in one's index among time periods or locations is an accurate reflection of differences in bird abundance and not differences in detectability. Avian ecologists could use appropriately sized fixed-area surveys to minimize the chance that they might be deceived by distance-based detectability bias, but the current method of choice is to use a modeling approach that allows one to account for distance-based bias by modeling the effects of distance on detectability or occupancy. I challenge the idea that modeling is the best approach to account for distance-based effects on the detectability of birds because the most important distance-based modeling assumptions can never be met. The use of a fixed-area survey method to generate an index of abundance is the simplest way to control for distance-based detectability bias and should not be universally condemned or be the basis for outright rejection in the publication process. © 2016 by the Ecological Society of America.

  20. Tank waste remediation system retrieval and disposal mission key enabling assumptions

    International Nuclear Information System (INIS)

    Baldwin, J.H.

    1998-01-01

    An overall systems approach has been applied to develop action plans to support the retrieval and immobilization waste disposal mission. The review concluded that the systems and infrastructure required to support the mission are known. Required systems are either in place or plans have been developed. An analysis of the programmatic, management and technical activities necessary to declare Readiness to Proceed with execution of the mission demonstrates that the system, people, and hardware will be on line and ready to support the private contractors. The systems approach included defining the retrieval and immobilized waste disposal mission requirements and evaluating the readiness of the TWRS contractor to supply waste feed to the private contractors in June 2002. The Phase 1 feed delivery requirements from the Private Contractor Request for Proposals were reviewed, transfer piping routes were mapped out, existing systems were evaluated, and upgrade requirements were defined. Technical Basis Reviews were completed to define work scope in greater detail, and cost estimates and associated year-by-year financial analyses were completed. Personnel training, qualifications, management systems and procedures were reviewed and shown to be in place and ready to support the Phase 1B mission. Key assumptions and risks that could negatively impact mission success were evaluated, and appropriate mitigative action plans were developed and scheduled

  1. Tank waste remediation system retrieval and disposal mission key enabling assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Baldwin, J.H.

    1998-01-09

    An overall systems approach has been applied to develop action plans to support the retrieval and immobilization waste disposal mission. The review concluded that the systems and infrastructure required to support the mission are known. Required systems are either in place or plans have been developed. An analysis of the programmatic, management and technical activities necessary to declare Readiness to Proceed with execution of the mission demonstrates that the system, people, and hardware will be on line and ready to support the private contractors. The systems approach included defining the retrieval and immobilized waste disposal mission requirements and evaluating the readiness of the TWRS contractor to supply waste feed to the private contractors in June 2002. The Phase 1 feed delivery requirements from the Private Contractor Request for Proposals were reviewed, transfer piping routes were mapped out, existing systems were evaluated, and upgrade requirements were defined. Technical Basis Reviews were completed to define work scope in greater detail, and cost estimates and associated year-by-year financial analyses were completed. Personnel training, qualifications, management systems and procedures were reviewed and shown to be in place and ready to support the Phase 1B mission. Key assumptions and risks that could negatively impact mission success were evaluated, and appropriate mitigative action plans were developed and scheduled.

  2. Change Impact Analysis for SysML Requirements Models based on Semantics of Trace Relations

    NARCIS (Netherlands)

    ten Hove, David; Göknil, Arda; Ivanov, Ivan; van den Berg, Klaas; de Goede, Koos; Oldevik, J.; Olsen, G. K.; Neple, T.; Kolovos, D.

    2009-01-01

    Change impact analysis is one of the applications of requirements traceability in the software engineering community. In this paper, we focus on requirements and requirements relations from a traceability perspective. We provide formal definitions of the requirements relations in SysML for change impact

  3. Temporal Distinctiveness in Task Switching: Assessing the Mixture-Distribution Assumption

    Directory of Open Access Journals (Sweden)

    James A Grange

    2016-02-01

    Full Text Available In task switching, increasing the response-cue interval (RCI) has been shown to reduce the switch cost. This has been attributed to a time-based decay process influencing the activation of memory representations of tasks (task-sets). Recently, an alternative account based on interference rather than decay has been successfully applied to these data (Horoufchin et al., 2011). In this account, variation of the RCI is thought to influence the temporal distinctiveness (TD) of episodic traces in memory, thus affecting their retrieval probability. This can affect performance because retrieval probability influences response time: if retrieval succeeds, responding is fast due to positive priming; if retrieval fails, responding is slow, due to having to perform the task via a slow algorithmic process. This account, and a recent formal model (Grange & Cross, 2015), makes the strong prediction that all RTs are a mixture of one of two processes: a fast process when retrieval succeeds, and a slow process when retrieval fails. The present paper assesses the evidence for this mixture-distribution assumption in TD data. In a first section, statistical evidence for mixture distributions is found using the fixed-point property test. In a second section, a mathematical process model with mixture distributions at its core is fitted to the response time distribution data. Both approaches provide good evidence in support of the mixture-distribution assumption, and thus support temporal distinctiveness accounts of the data.
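
    The mixture-distribution assumption can be probed with a toy two-component fit (a plain Gaussian mixture on simulated RTs is used below as an assumed stand-in; the paper's process model and the fixed-point property test are not reproduced):

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(3)
      fast = rng.normal(450, 40, 700)    # retrieval succeeds: primed, fast RTs
      slow = rng.normal(900, 120, 300)   # retrieval fails: slow algorithmic RTs
      rts = np.concatenate([fast, slow]).reshape(-1, 1)

      # recover the two latent processes and their mixing proportions
      gm = GaussianMixture(n_components=2, random_state=0).fit(rts)
      print("component means (ms):", sorted(gm.means_.ravel()))
      print("mixing weights:", gm.weights_)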

  4. Rethinking our assumptions about the evolution of bird song and other sexually dimorphic signals

    Directory of Open Access Journals (Sweden)

    J. Jordan Price

    2015-04-01

    Full Text Available Bird song is often cited as a classic example of a sexually selected ornament, in part because historically it has been considered a primarily male trait. Recent evidence that females also sing in many songbird species, and that sexual dimorphism in song is often the result of losses in females rather than gains in males, therefore appears to challenge our understanding of the evolution of bird song through sexual selection. Here I propose that these new findings do not necessarily contradict previous research; rather, they disagree with some of our assumptions about the evolution of sexual dimorphisms in general and female song in particular. These include the misconceptions that current patterns of elaboration and diversity in each sex reflect past rates of change and that levels of sexual dimorphism necessarily reflect levels of sexual selection. Using the New World blackbirds (Icteridae) as an example, I critically evaluate these past assumptions in light of new phylogenetic evidence. Understanding the mechanisms underlying such sexually dimorphic traits requires a clear understanding of their evolutionary histories. Only then can we begin to ask the right questions.

  5. Vessel contents of leaves after excision: a test of the Scholander assumption.

    Science.gov (United States)

    Tyree, Melvin T; Cochard, Herve

    2003-09-01

    When petioles of transpiring leaves are cut in the air, according to the 'Scholander assumption' the vessels cut open should fill with air as the water is drained away by tissue rehydration and/or continued transpiration. The distribution of air-filled vessels versus distance from the cut surface should match the distribution of lengths of 'open vessels', i.e. vessels cut open when the leaf is excised. A paint perfusion method was used to estimate the length distribution of open vessels, and this was compared with the observed distribution of embolisms by the cryo-SEM method. In the cryo-SEM method, petioles are frozen in liquid nitrogen soon after the petiole is cut. The petioles are then cut at different distances from the original cut surface while frozen and examined in a cryo-SEM facility, where it is easy to distinguish vessels filled with air from those filled with ice. The Scholander assumption was also confirmed by a hydraulic method, which avoided possible freezing artefacts. In petioles of sunflower (Helianthus annuus L.) the distribution of embolized vessels agrees with expectations. This is in contrast to a previous study on sunflower in which cryo-SEM results did not agree with expectations. Reasons for this disagreement are suggested, but further study is required for a full elucidation.

  6. A qualitative readiness-requirements assessment model for enterprise big-data infrastructure investment

    Science.gov (United States)

    Olama, Mohammed M.; McNair, Allen W.; Sukumar, Sreenivas R.; Nutaro, James J.

    2014-05-01

    In the last three decades, there has been exponential growth in information technology serving the information processing needs of data-driven businesses in government, science, and private industry, in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in a big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization's and a data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association's Data Management Body of Knowledge (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.
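
    A hypothetical scoring sketch of how such a readiness tool might aggregate maturity ratings (the function names, weights and rubric levels are assumptions for illustration, not the actual ILM implementation):

      # assumed weights over key data management functions (sum to 1)
      RUBRIC = {"data_governance": 0.30, "metadata": 0.20, "quality": 0.20,
                "security": 0.15, "architecture": 0.15}

      def ilm_score(ratings):
          """ratings: dict mapping each function to a 0-5 maturity level."""
          return sum(RUBRIC[k] * ratings[k] for k in RUBRIC)

      example = {"data_governance": 3, "metadata": 2, "quality": 4,
                 "security": 3, "architecture": 2}
      print(f"weighted maturity score: {ilm_score(example):.2f} / 5")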

  7. 7 CFR 3550.163 - Transfer of security and assumption of indebtedness.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Transfer of security and assumption of indebtedness... § 3550.163 Transfer of security and assumption of indebtedness. (a) General policy. RHS mortgages contain... transferred with an assumption of the indebtedness. If it is in the best interest of the Government, RHS will...

  8. School Principals' Assumptions about Human Nature: Implications for Leadership in Turkey

    Science.gov (United States)

    Sabanci, Ali

    2008-01-01

    This article considers principals' assumptions about human nature in Turkey and the relationship between the assumptions held and the leadership style adopted in schools. The findings show that school principals hold Y-type assumptions and prefer a relationship-oriented style in their relations with assistant principals. However, both principals…

  9. Challenging Assumptions of International Public Relations: When Government Is the Most Important Public.

    Science.gov (United States)

    Taylor, Maureen; Kent, Michael L.

    1999-01-01

    Explores assumptions underlying Malaysia's and the United States' public-relations practice. Finds many assumptions guiding Western theories and practices are not applicable to other countries. Examines the assumption that the practice of public relations targets a variety of key organizational publics. Advances international public-relations…

  10. 75 FR 63380 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2010-10-15

    ...-Employer Plans to prescribe interest assumptions under the regulation for valuation dates in November 2010... title IV of the Employee Retirement Income Security Act of 1974. PBGC uses the interest assumptions in... interest assumptions are intended to reflect current conditions in the financial and annuity markets...

  11. 76 FR 2578 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2011-01-14

    ...-Employer Plans to prescribe interest assumptions under the regulation for valuation dates in February 2011... title IV of the Employee Retirement Income Security Act of 1974. PBGC uses the interest assumptions in... interest assumptions are intended to reflect current conditions in the financial and annuity markets...

  12. 77 FR 74353 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2012-12-14

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in... interest assumptions--for paying plan benefits under terminating single-employer plans covered by title IV... interest assumptions are intended to reflect current conditions in the financial and annuity markets...

  13. 78 FR 2881 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2013-01-15

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in... interest assumptions--for paying plan benefits under terminating single-employer plans covered by title IV... interest assumptions are intended to reflect current conditions in the financial and annuity markets...

  14. 77 FR 28477 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2012-05-15

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in June... title IV of the Employee Retirement Income Security Act of 1974. The interest assumptions in the... interest assumptions are intended to reflect current conditions in the financial and annuity markets...

  15. 78 FR 62426 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2013-10-22

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in... interest assumptions--for paying plan benefits under terminating single-employer plans covered by title IV... interest assumptions are intended to reflect current conditions in the financial and annuity markets...

  16. 77 FR 8730 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2012-02-15

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in... covered by title IV of the Employee Retirement Income Security Act of 1974. The interest assumptions in... same. The interest assumptions are intended to reflect current conditions in the financial and annuity...

  17. 77 FR 41270 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2012-07-13

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in... covered by title IV of the Employee Retirement Income Security Act of 1974. The interest assumptions in... same. The interest assumptions are intended to reflect current conditions in the financial and annuity...

  18. 76 FR 41689 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2011-07-15

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in... covered by title IV of the Employee Retirement Income Security Act of 1974. The interest assumptions in... same. The interest assumptions are intended to reflect current conditions in the financial and annuity...

  19. 77 FR 68685 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2012-11-16

    ... regulation for valuation dates in December 2012. The interest assumptions are used for paying benefits under... interest assumptions--for paying plan benefits under terminating single-employer plans covered by title IV... interest assumptions are intended to reflect current conditions in the financial and annuity markets...

  20. 77 FR 22215 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2012-04-13

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in May... interest assumptions--for paying plan benefits under terminating single-employer plans covered by title IV... interest assumptions are intended to reflect current conditions in the financial and annuity markets...

  1. 78 FR 49682 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2013-08-15

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in... interest assumptions--for paying plan benefits under terminating single-employer plans covered by title IV... interest assumptions are intended to reflect current conditions in the financial and annuity markets...

  2. 78 FR 68739 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2013-11-15

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in... title IV of the Employee Retirement Income Security Act of 1974. The interest assumptions in the... interest assumptions are intended to reflect current conditions in the financial and annuity markets...

  3. 75 FR 69588 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2010-11-15

    ... interest assumptions under the regulation for valuation dates in December 2010. Interest assumptions are...--for paying plan benefits under terminating single-employer plans covered by title IV of the Employee... reflect current conditions in the financial and annuity markets. Assumptions under the benefit payments...

  4. 77 FR 62433 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2012-10-15

    ... Single- Employer Plans to prescribe interest assumptions under the regulation for valuation dates in... interest assumptions--for paying plan benefits under terminating single-employer plans covered by title IV... interest assumptions are intended to reflect current conditions in the financial and annuity markets...

  5. 76 FR 8649 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2011-02-15

    ...-Employer Plans to prescribe interest assumptions under the regulation for valuation dates in March 2011... title IV of the Employee Retirement Income Security Act of 1974. PBGC uses the interest assumptions in... interest assumptions are intended to reflect current conditions in the financial and annuity markets...

  6. 77 FR 48855 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2012-08-15

    ... to prescribe interest assumptions under the regulation for valuation dates in September 2012. The... interest assumptions are intended to reflect current conditions in the financial and annuity markets... Assets in Single-Employer Plans (29 CFR part 4044) prescribes interest assumptions for valuing benefits...

  7. Exploring the Influence of Ethnicity, Age, and Trauma on Prisoners' World Assumptions

    Science.gov (United States)

    Gibson, Sandy

    2011-01-01

    In this study, the author explores world assumptions of prisoners, how these assumptions vary by ethnicity and age, and whether trauma history affects world assumptions. A random sample of young and old prisoners, matched for prison location, was drawn from the New Jersey Department of Corrections prison population. Age and ethnicity had…

  8. Impact of violated high-dose refuge assumptions on evolution of Bt resistance.

    Science.gov (United States)

    Campagne, Pascal; Smouse, Peter E; Pasquet, Rémy; Silvain, Jean-François; Le Ru, Bruno; Van den Berg, Johnnie

    2016-04-01

    Transgenic crops expressing Bacillus thuringiensis (Bt) toxins have been widely and successfully deployed for the control of target pests, while allowing a substantial reduction in insecticide use. The evolution of resistance (a heritable decrease in susceptibility to Bt toxins) can pose a threat to sustained control of target pests, but a high-dose refuge (HDR) management strategy has been key to delaying the evolution of Bt resistance. The HDR strategy relies on the mating frequency between susceptible and resistant individuals, so either partial dominance of resistance alleles or nonrandom mating in the pest population could elevate the pace of resistance evolution. Using classic Wright-Fisher genetic models, we investigated the impact of deviations from standard refuge model assumptions on resistance evolution in pest populations. We show that when Bt selection is strong, even deviations from random mating and/or strictly recessive resistance that are below the threshold of detection can yield dramatic increases in the pace of resistance evolution. Resistance evolution is hastened whenever the order of magnitude of the model violations exceeds the initial frequency of resistance alleles. We also show that a fitness cost for resistant individuals on the refuge crop cannot easily overcome the effect of violated HDR assumptions. We propose a parametrically explicit framework that enables both comparison of various field situations and model inference. Using this model, we propose novel empirical estimators of the pace of resistance evolution (and the time to loss of control), whose simple calculation relies on the observed change in resistance allele frequency.
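
    The sensitivity described above can be illustrated with a minimal Wright-Fisher sketch. The fitness scheme, refuge fraction, population size, and dominance values are illustrative assumptions, not the paper's parameterization; a small dominance h > 0 stands in for a sub-detection-threshold violation of strictly recessive resistance.

    ```python
    # Minimal Wright-Fisher sketch of Bt-resistance allele frequency under a
    # high-dose refuge. On the Bt crop, SS survives ~0 (high dose), RS survives
    # with probability h, RR survives; in the refuge there is no Bt selection.
    # All parameter values are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(p0=1e-3, refuge=0.2, h=0.0, N=100_000, generations=200):
        """Return the resistance-allele frequency trajectory over generations."""
        p = p0
        traj = [p]
        for _ in range(generations):
            q = 1.0 - p
            # Genotype fitnesses averaged over Bt crop and refuge fractions.
            w_rr = refuge + (1 - refuge) * 1.0
            w_rs = refuge + (1 - refuge) * h
            w_ss = refuge + (1 - refuge) * 0.0
            w_bar = p*p*w_rr + 2*p*q*w_rs + q*q*w_ss
            p_next = (p*p*w_rr + p*q*w_rs) / w_bar   # selection, random mating
            p = rng.binomial(2*N, p_next) / (2*N)    # binomial drift
            traj.append(p)
        return np.array(traj)

    for label, traj in [("strictly recessive (h=0)   ", simulate(h=0.0)),
                        ("slightly dominant (h=0.01)", simulate(h=0.01))]:
        gen = int(np.argmax(traj > 0.5))
        print(label, f"p > 0.5 at generation {gen}" if gen > 0
              else "p stays below 0.5 for 200 generations")
    ```

    Even this toy model reproduces the qualitative claim: heterozygote survival of 1% on the Bt crop, far below typical detection thresholds, sharply shortens the time to loss of control relative to strictly recessive resistance.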

  9. Aminoglycoside Concentrations Required for Synergy with Carbapenems against Pseudomonas aeruginosa Determined via Mechanistic Studies and Modeling.

    Science.gov (United States)

    Yadav, Rajbharan; Bulitta, Jürgen B; Schneider, Elena K; Shin, Beom Soo; Velkov, Tony; Nation, Roger L; Landersdorfer, Cornelia B

    2017-12-01

    This study aimed to systematically identify the aminoglycoside concentrations required for synergy with a carbapenem and to characterize the permeabilizing effect of aminoglycosides on the outer membrane of Pseudomonas aeruginosa. Monotherapies and combinations of four aminoglycosides and three carbapenems were studied for activity against P. aeruginosa strain AH298-GFP in 48-h static-concentration time-kill studies (SCTK) (inoculum: 10^7.6 CFU/ml). The outer membrane-permeabilizing effect of tobramycin alone and in combination with imipenem was characterized via electron microscopy, confocal imaging, and the nitrocefin assay. A mechanism-based model (MBM) was developed to simultaneously describe the time course of bacterial killing and prevention of regrowth by imipenem combined with each of the four aminoglycosides. Notably, 0.25 mg/liter of tobramycin, which was inactive in monotherapy, achieved synergy (i.e., ≥2-log10 more killing than the most active monotherapy at 24 h) combined with imipenem. Electron micrographs, confocal image analyses, and the nitrocefin uptake data showed distinct outer membrane damage by tobramycin, which was more extensive for the combination with imipenem. The MBM indicated that aminoglycosides enhanced the imipenem target site concentration up to 4.27-fold. Tobramycin was the most potent aminoglycoside at permeabilizing the outer membrane; tobramycin (0.216 mg/liter), gentamicin (0.739 mg/liter), amikacin (1.70 mg/liter), or streptomycin (5.19 mg/liter) was required for half-maximal permeabilization. In summary, our SCTK, mechanistic studies, and MBM indicated that tobramycin was highly synergistic and displayed the maximum outer membrane disruption potential among the tested aminoglycosides. These findings support the optimization of highly promising antibiotic combination dosage regimens for critically ill patients. Copyright © 2017 American Society for Microbiology.
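
    The permeabilization component lends itself to a short sketch. The half-maximal concentrations below are those reported in the abstract, but the Emax form with a Hill coefficient of 1 and the use of the 4.27-fold figure as the shared maximum are simplifying assumptions, not the fitted MBM itself.

    ```python
    # Minimal sketch of an Emax-type permeabilization relationship: map an
    # aminoglycoside concentration to an assumed fold-increase in the
    # carbapenem target-site concentration. EC50 values are from the abstract;
    # the Hill coefficient (1) and shared 4.27-fold ceiling are assumptions.
    EC50_MG_PER_L = {
        "tobramycin": 0.216,
        "gentamicin": 0.739,
        "amikacin": 1.70,
        "streptomycin": 5.19,
    }
    MAX_FOLD_INCREASE = 4.27  # maximum enhancement of target-site concentration

    def target_site_fold_increase(drug: str, conc_mg_per_l: float) -> float:
        """Fold-increase in carbapenem target-site concentration (Emax, n=1)."""
        ec50 = EC50_MG_PER_L[drug]
        fraction = conc_mg_per_l / (conc_mg_per_l + ec50)  # fractional effect
        return 1.0 + (MAX_FOLD_INCREASE - 1.0) * fraction

    # At the same 0.25 mg/liter exposure, the lower the EC50 the greater the
    # permeabilization -- consistent with tobramycin being the most potent.
    for drug in EC50_MG_PER_L:
        print(f"{drug:>12} at 0.25 mg/L: "
              f"{target_site_fold_increase(drug, 0.25):.2f}-fold")
    ```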

  10. Review of data requirements for groundwater flow and solute transport modelling and the ability of site investigation methods to meet these requirements

    International Nuclear Information System (INIS)

    McEwen, T.J.; Chapman, N.A.; Robinson, P.C.

    1990-08-01

    This report describes the data requirements of the codes that may be used to model groundwater flow and radionuclide transport during the assessment of a Nirex site for the deep disposal of low- and intermediate-level radioactive waste, together with the site investigation methods that exist to supply data for these codes. The data requirements of eight codes are reviewed, with most emphasis on three of the more significant codes: VANDAL, NAMMU and CHEMTARD. The largest part of the report describes and discusses the site investigation techniques; each technique is considered in terms of its ability to provide the data necessary to characterise the geological and hydrogeological environment around a potential repository. (author)

  11. Evaluation of Pharmacokinetic Assumptions Using a 443 Chemical Library (IVIVE)

    Science.gov (United States)

    With the increasing availability of high-throughput and in vitro data for untested chemicals, there is a need for pharmacokinetic (PK) models for in vitro to in vivo extrapolation (IVIVE). Though some PBPK models have been created for individual compounds us...

  12. Evaluation of Pharmacokinetic Assumptions Using a 443 Chemical Library (SOT)

    Science.gov (United States)

    With the increasing availability of high-throughput and in vitro data for untested chemicals, there is a need for pharmacokinetic (PK) models for in vitro to in vivo extrapolation (IVIVE). Though some PBPK models have been created for individual compounds using in vivo data, we ...

  13. Adhering to the assumptions of invitational education: a case study ...

    African Journals Online (AJOL)

    South African schools are constantly faced with evolving needs and challenges characterised by change. As in other countries, schools in South Africa encounter pressure to 'produce more for less' and at the same time to achieve certain goals and standards. Transforming schools into inviting institutions requires a ...

  14. Values and Assumptions in Contestation over School Councils Selecting Principals.

    Science.gov (United States)

    Macpherson, R. J. S.

    1983-01-01

    Many of the arguments in the Victoria Department of Education, Australia, against the involvement of school councils in the selection of principals are based on careerism, opportunism, and protectionism. To move away from the lock-step adherence to the values that lead to accession by seniority requires the application of democratic methods.…

  15. False Assumptions: The Challenges and Politics of Teaching in China

    Science.gov (United States)

    Getty, Laura J.

    2011-01-01

    Teachers in American study-abroad programs usually receive little, if any, training before the trip, because "teaching is teaching". The cultural differences between Chinese and American university classrooms, however, affect the students' ability to learn and the teacher's ability to teach in profound ways. Foreign teachers in China require at…

  16. Requirement for Serratia marcescens Cytolysin in a Murine Model of Hemorrhagic Pneumonia

    Science.gov (United States)

    González-Juarbe, Norberto; Mares, Chris A.; Hinojosa, Cecilia A.; Medina, Jorge L.; Cantwell, Angelene; Dube, Peter H.; Bergman, Molly A.

    2014-01-01

    Serratia marcescens, a member of the carbapenem-resistant Enterobacteriaceae, is an important emerging pathogen that causes a wide variety of nosocomial infections, spreads rapidly within hospitals, and has a systemic mortality rate of up to 41%. Despite multiple clinical descriptions of S. marcescens nosocomial pneumonia, little is known regarding the mechanisms of bacterial pathogenesis and the host immune response. To address this gap, we developed an oropharyngeal aspiration model of lethal and sublethal S. marcescens pneumonia in BALB/c mice and extensively characterized the latter. Lethal challenge (>4.0 × 10^6 CFU) was characterized by fulminant hemorrhagic pneumonia with rapid loss of lung function and death. Mice challenged with a sublethal dose (<2.0 × 10^6 CFU) rapidly lost weight, had diminished lung compliance, experienced lung hemorrhage, and responded to the infection with extensive neutrophil infiltration and histopathological changes in tissue architecture. Neutrophil extracellular trap formation and the expression of inflammatory cytokines occurred early after infection. Mice depleted of neutrophils were exquisitely susceptible to an otherwise nonlethal inoculum, demonstrating the requirement for neutrophils in host protection. Mutation of the genes encoding the cytolysin ShlA and its transporter ShlB resulted in attenuated S. marcescens strains that failed to cause profound weight loss, extended illness, hemorrhage, and prolonged lung pathology in mice. This study describes a model of S. marcescens pneumonia that mimics known clinical features of human illness, identifies neutrophils and the toxin ShlA as key factors in defense and infection, respectively, and provides a solid foundation for future studies of novel therapeutics for this important opportunistic pathogen. PMID:25422267

  17. Utilization of a mental health collaborative care model among patients who require interpreter services.

    Science.gov (United States)

    Njeru, Jane W; DeJesus, Ramona S; St Sauver, Jennifer; Rutten, Lila J; Jacobson, Debra J; Wilson, Patrick; Wieland, Mark L

    2016-01-01

    Immigrants and refugees to the United States have a higher prevalence of depression than the general population and are less likely to receive adequate mental health services and treatment. Those with limited English proficiency (LEP) are at an even higher risk of inadequate mental health care. Collaborative care management (CCM) models for depression are effective in achieving treatment goals among a wide range of patient populations, including patients with LEP. The purpose of this study was to assess the utilization of a statewide initiative that uses CCM for depression management among patients with LEP in a large primary care practice. This was a retrospective cohort study of patients with depression in a large primary care practice in Minnesota. Patients who met the criteria for enrollment into CCM (a provider-generated diagnosis of depression or dysthymia in the electronic medical record and a Patient Health Questionnaire-9 (PHQ-9) score ≥10) were included, and patient-identified need for interpreter services was used as a proxy for LEP. Rates of enrollment into the DIAMOND (Depression Improvement Across Minnesota, Offering A New Direction) program, a statewide initiative that uses CCM for depression management, were measured and compared between eligible patients who required interpreter services and patients who did not. Of the 7561 patients who met the criteria for enrollment into the DIAMOND program during the study interval, 3511 were enrolled. Only 18.2% of the eligible patients with LEP were enrolled into DIAMOND, compared with 47.2% of the eligible English-proficient patients. This finding persisted after adjustment for differences in age, gender, and depression severity scores (adjusted OR [95% confidence interval] = 0.43 [0.23, 0.81]). Within primary care practices, tailored interventions are needed, including those that address cultural competence and language navigation, to improve the utilization of this effective model among patients with LEP.
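
    For readers checking the arithmetic, the crude (unadjusted) odds ratio implied by the two enrollment proportions can be computed directly; the abstract's OR of 0.43 additionally adjusts for age, gender, and depression severity.

    ```python
    # Minimal sketch: crude odds ratio from the reported enrollment rates.
    # Uses only the proportions given in the abstract; the adjusted OR in the
    # paper (0.43 [0.23, 0.81]) comes from a covariate-adjusted model.
    def odds(p: float) -> float:
        return p / (1.0 - p)

    p_lep, p_english = 0.182, 0.472   # enrollment rates among eligible patients
    crude_or = odds(p_lep) / odds(p_english)
    print(f"Crude OR = {crude_or:.2f}")  # ~0.25, vs adjusted OR 0.43
    ```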

  18. The role of relevance and mutual assumption in the language of contract communication

    Directory of Open Access Journals (Sweden)

    Dave Mansergh

    1996-03-01

    Contract communication problems are common within the landscape industry. The contextual frames of reference and assumptions held by both designer and contractor affect the way information is interpreted. In order to interpret a piece of communication correctly, both parties must learn and understand the meanings and implications of the language used. This requires the formation of mutual understanding between them, through which quality is more likely to be achieved. Relevance theory offers an explanation of why contract communication problems occur and a guide for achieving successful contract communication. The importance of good communication within the landscape construction industry cannot be over-emphasised... On site problems ... usually occur due to communication failure (Mayer 1987, p.1).

  19. Semantics of trace relations in requirements models for consistency checking and inferencing

    NARCIS (Netherlands)

    Göknil, Arda; Ivanov, Ivan; van den Berg, Klaas; Veldhuis, Jan-Willem

    2009-01-01

    Requirements traceability is the ability to relate requirements back to stakeholders and forward to corresponding design artifacts, code, and test cases. Although considerable research has been devoted to relating requirements in both forward and backward directions, less attention has been paid to

  20. Assumptions behind scoring source versus item memory: Effects of age, hippocampal lesions and mild memory problems.

    Science.gov (United States)

    Cooper, Elisa; Greve, Andrea; Henson, Richard N

    2017-06-01

    Source monitoring paradigms have been used to separate: 1) the probability of recognising an item (Item memory), and 2) the probability of remembering the context in which that item was previously encountered (Source memory), conditional on it being recognised. Multinomial Processing Tree (MPT) models are an effective way to estimate these conditional probabilities. Moreover, MPTs make explicit the assumptions behind different ways to parameterise Item and Source memory. Using data from six independent groups across two different paradigms, we show that one would draw different conclusions about the effects of age, age-related memory problems and hippocampal lesions on Item and Source memory, depending on the use of: 1) standard accuracy calculation versus MPT analysis, and 2) two different MPT models. The MPT results were more consistent than standard accuracy calculations, and furnished additional parameters that can be interpreted in terms of, for example, false recollection or missed encoding. Moreover, a new MPT structure that allowed for separate memory representations (one for item information and one for item-plus-source information; the Source-Item model) fit the data better, and provided a different pattern of significant differences in parameters, than the more conventional MPT structure in which source information is a subset of item information (the Item-Source model). Nonetheless, there is no theory-neutral way of scoring data, and thus proper examination of the assumptions underlying the scoring of source monitoring paradigms is necessary before theoretical conclusions can be drawn. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
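
    The conventional Item-Source tree described above can be written out as a short sketch: an old item is detected with probability D (item memory); if detected, its source is retrieved with probability d (source memory), otherwise the source is guessed. The parameter values below are illustrative assumptions, not estimates from the study.

    ```python
    # Minimal sketch of the conventional Item-Source MPT tree, in which source
    # memory is conditional on (a subset of) item memory. Parameters:
    #   D: item detection; d: source retrieval given detection;
    #   g: probability of guessing the correct source;
    #   b: probability of guessing "old" when the item is not detected.
    def old_item_response_probs(D: float, d: float, g: float, b: float) -> dict:
        """Predicted response probabilities for a studied (old) item."""
        return {
            # detected + source retrieved, detected + lucky source guess,
            # or undetected but guessed "old" with a lucky source guess
            "old, correct source": D * d + D * (1 - d) * g + (1 - D) * b * g,
            "old, wrong source":   D * (1 - d) * (1 - g) + (1 - D) * b * (1 - g),
            "new (miss)":          (1 - D) * (1 - b),
        }

    probs = old_item_response_probs(D=0.8, d=0.6, g=0.5, b=0.3)
    print(probs, "sum =", round(sum(probs.values()), 10))  # probabilities sum to 1
    ```

    Fitting such a tree to observed response counts (rather than computing raw accuracy) is what yields the separate, assumption-explicit Item and Source parameters contrasted in the abstract.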