WorldWideScience

Sample records for model misspecification justifying

  1. Sensitivity of Fit Indices to Misspecification in Growth Curve Models

    Science.gov (United States)

    Wu, Wei; West, Stephen G.

    2010-01-01

    This study investigated the sensitivity of fit indices to model misspecification in within-individual covariance structure, between-individual covariance structure, and marginal mean structure in growth curve models. Five commonly used fit indices were examined, including the likelihood ratio test statistic, root mean square error of…
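
    For reference, the root mean square error of approximation (RMSEA) examined in studies like this has the standard closed form below (the general definition, not anything specific to this article; some software uses N in place of N - 1 in the denominator):

        $$ \mathrm{RMSEA} = \sqrt{\frac{\max(\chi^{2} - df,\; 0)}{df\,(N-1)}} $$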

  2. Linearity and Misspecification Tests for Vector Smooth Transition Regression Models

    DEFF Research Database (Denmark)

    Teräsvirta, Timo; Yang, Yukai

    The purpose of the paper is to derive Lagrange multiplier and Lagrange multiplier type specification and misspecification tests for vector smooth transition regression models. We report results from simulation studies in which the size and power properties of the proposed asymptotic tests in small...

  3. Detecting Growth Shape Misspecifications in Latent Growth Models: An Evaluation of Fit Indexes

    Science.gov (United States)

    Leite, Walter L.; Stapleton, Laura M.

    2011-01-01

    In this study, the authors compared the likelihood ratio test and fit indexes for detection of misspecifications of growth shape in latent growth models through a simulation study and a graphical analysis. They found that the likelihood ratio test, MFI, and root mean square error of approximation performed best for detecting model misspecification…

  4. Testing constancy of unconditional variance in volatility models by misspecification and specification tests

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Terasvirta, Timo

    The topic of this paper is testing the hypothesis of constant unconditional variance in GARCH models against the alternative that the unconditional variance changes deterministically over time. Tests of this hypothesis have previously been performed as misspecification tests after fitting a GARCH model. … An application to exchange rate returns is included.

  5. The problem of measurement model misspecification in behavioral and organizational research and some recommended solutions.

    Science.gov (United States)

    MacKenzie, Scott B; Podsakoff, Philip M; Jarvis, Cheryl Burke

    2005-07-01

    The purpose of this study was to review the distinction between formative- and reflective-indicator measurement models, articulate a set of criteria for deciding whether measures are formative or reflective, illustrate some commonly researched constructs that have formative indicators, empirically test the effects of measurement model misspecification using a Monte Carlo simulation, and recommend new scale development procedures for latent constructs with formative indicators. Results of the Monte Carlo simulation indicated that measurement model misspecification can inflate unstandardized structural parameter estimates by as much as 400% or deflate them by as much as 80% and lead to Type I or Type II errors of inference, depending on whether the exogenous or the endogenous latent construct is misspecified. Implications of this research are discussed. Copyright 2005 APA, all rights reserved.

  6. Explained variation and predictive accuracy in general parametric statistical models: the role of model misspecification

    DEFF Research Database (Denmark)

    Rosthøj, Susanne; Keiding, Niels

    2004-01-01

    When studying a regression model, measures of explained variation are used to assess the degree to which the covariates determine the outcome of interest. Measures of predictive accuracy are used to assess the accuracy of the predictions based on the covariates and the regression model. We give a detailed and general introduction to the two measures and the estimation procedures. The framework we set up allows for a study of the effect of misspecification on the quantities estimated. We also introduce a generalization to survival analysis.

  7. The effect of mis-specification on mean and selection between the Weibull and lognormal models

    Science.gov (United States)

    Jia, Xiang; Nadarajah, Saralees; Guo, Bo

    2018-02-01

    The lognormal and Weibull models are commonly used to analyse data. Although selection procedures have been extensively studied, it is possible that the lognormal model could be selected when the true model is Weibull, or vice versa. As the mean is important in applications, we focus on the effect of mis-specification on the mean. The effect on the lognormal mean is first considered when a lognormal sample is wrongly fitted by a Weibull model. The maximum likelihood estimate (MLE) and quasi-MLE (QMLE) of the lognormal mean are obtained based on the lognormal and Weibull models, respectively. The impact is then evaluated by computing the ratio of biases and the ratio of mean squared errors (MSEs) between the MLE and QMLE. For completeness, the theoretical results are demonstrated by simulation studies. Next, the effect of the reverse mis-specification on the Weibull mean is discussed. It is found that the ratio of biases and the ratio of MSEs are independent of the location and scale parameters of the lognormal and Weibull models. The influence can be ignored if certain special conditions hold. Finally, a model selection method is proposed by comparing the ratios concerning biases and MSEs. We also present published data to illustrate the study.
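
    As a rough illustration of the kind of comparison described (a minimal sketch with made-up parameters, not the paper's exact design; assumes numpy and scipy are available):

        import numpy as np
        from math import gamma
        from scipy import stats

        rng = np.random.default_rng(1)
        mu, sigma, n, reps = 0.0, 0.5, 50, 2000
        true_mean = np.exp(mu + sigma**2 / 2)        # lognormal mean

        mle_err, qmle_err = [], []
        for _ in range(reps):
            x = rng.lognormal(mu, sigma, n)
            # MLE of the mean under the correct lognormal model
            m, s = np.log(x).mean(), np.log(x).std()
            mle_err.append(np.exp(m + s**2 / 2) - true_mean)
            # QMLE: wrongly fit a Weibull model, then use its implied mean
            shape, _, scale = stats.weibull_min.fit(x, floc=0)
            qmle_err.append(scale * gamma(1 + 1 / shape) - true_mean)

        mle_err, qmle_err = np.array(mle_err), np.array(qmle_err)
        print("ratio of biases:", mle_err.mean() / qmle_err.mean())
        print("ratio of MSEs: ", (mle_err**2).mean() / (qmle_err**2).mean())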

  8. Expectation Maximization Algorithm for Box-Cox Transformation Cure Rate Model and Assessment of Model Misspecification Under Weibull Lifetimes.

    Science.gov (United States)

    Pal, Suvra; Balakrishnan, Narayanaswamy

    2018-05-01

    In this paper, we develop likelihood inference based on the expectation maximization algorithm for the Box-Cox transformation cure rate model assuming the lifetimes to follow a Weibull distribution. A simulation study is carried out to demonstrate the performance of the proposed estimation method. Through Monte Carlo simulations, we also study the effect of model misspecification on the estimate of cure rate. Finally, we analyze a well-known data on melanoma with the model and the inferential method developed here.

  9. A Lagrange multiplier-type test for idiosyncratic unit roots in the exact factor model under misspecification

    NARCIS (Netherlands)

    Zhou, X.; Solberger, M.

    2013-01-01

    We consider an exact factor model and derive a Lagrange multiplier-type test for unit roots in the idiosyncratic components. The asymptotic distribution of the statistic is derived under the misspecification that the differenced factors are white noise. We prove that the asymptotic distribution is

  10. A Bayesian approach to identifying and compensating for model misspecification in population models.

    Science.gov (United States)

    Thorson, James T; Ono, Kotaro; Munch, Stephan B

    2014-02-01

    State-space estimation methods are increasingly used in ecology to estimate productivity and abundance of natural populations while accounting for variability in both population dynamics and measurement processes. However, functional forms for population dynamics and density dependence often will not match the true biological process, and this may degrade the performance of state-space methods. We therefore developed a Bayesian semiparametric state-space model, which uses a Gaussian process (GP) to approximate the population growth function. This offers two benefits for population modeling. First, it allows data to update a specified "prior" on the population growth function, while reverting to this prior when data are uninformative. Second, it allows variability in population dynamics to be decomposed into random errors around the population growth function ("process error") and errors due to the mismatch between the specified prior and estimated growth function ("model error"). We used simulation modeling to illustrate the utility of GP methods in state-space population dynamics models. Results confirmed that the GP model performs similarly to a conventional state-space model when either (1) the prior matches the true process or (2) data are relatively uninformative. However, GP methods improve estimates of the population growth function when the function is misspecified. Results also demonstrated that the estimated magnitude of "model error" can be used to distinguish cases of model misspecification. We conclude with a discussion of the prospects for GP methods in other state-space models, including age and length-structured, meta-analytic, and individual-movement models.
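
    A compact sketch of the core Gaussian process idea (not the authors' full Bayesian state-space model, which also separates out measurement error; here sklearn's GP regression is used as a stand-in, and the Ricker dynamics and all parameter values are invented for illustration):

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)
        r, K, n_t = 0.8, 100.0, 60                  # true Ricker parameters
        N = np.empty(n_t); N[0] = 10.0
        for t in range(n_t - 1):                    # simulate the population
            N[t + 1] = N[t] * np.exp(r * (1 - N[t] / K) + rng.normal(0, 0.1))

        # Regress log growth rates on abundance with a GP; the WhiteKernel
        # term plays the role of process error around the growth function
        X, y = N[:-1].reshape(-1, 1), np.log(N[1:] / N[:-1])
        gp = GaussianProcessRegressor(RBF(length_scale=30.0) + WhiteKernel(),
                                      normalize_y=True).fit(X, y)

        grid = np.linspace(N.min(), N.max(), 5).reshape(-1, 1)
        mean, sd = gp.predict(grid, return_std=True)
        print(np.c_[grid, mean, sd])    # growth function estimate with uncertainty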

  11. The impact of covariance misspecification in multivariate Gaussian mixtures on estimation and inference: an application to longitudinal modeling.

    Science.gov (United States)

    Heggeseth, Brianna C; Jewell, Nicholas P

    2013-07-20

    Multivariate Gaussian mixtures are a class of models that provide a flexible parametric approach for the representation of heterogeneous multivariate outcomes. When the outcome is a vector of repeated measurements taken on the same subject, there is often inherent dependence between observations. However, a common covariance assumption is conditional independence-that is, given the mixture component label, the outcomes for subjects are independent. In this paper, we study, through asymptotic bias calculations and simulation, the impact of covariance misspecification in multivariate Gaussian mixtures. Although maximum likelihood estimators of regression and mixing probability parameters are not consistent under misspecification, they have little asymptotic bias when mixture components are well separated or if the assumed correlation is close to the truth even when the covariance is misspecified. We also present a robust standard error estimator and show that it outperforms conventional estimators in simulations and can indicate that the model is misspecified. Body mass index data from a national longitudinal study are used to demonstrate the effects of misspecification on potential inferences made in practice. Copyright © 2013 John Wiley & Sons, Ltd.
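
    The robust standard error estimator referred to is presumably of the standard sandwich form (shown generically here; \ell_i is the log-likelihood contribution of subject i and \hat{\theta} the maximum likelihood estimate):

        $$ \widehat{V} = \widehat{A}^{-1}\,\widehat{B}\,\widehat{A}^{-1}, \qquad \widehat{A} = -\frac{1}{n}\sum_{i=1}^{n}\left.\frac{\partial^{2}\ell_i}{\partial\theta\,\partial\theta^{\top}}\right|_{\hat{\theta}}, \qquad \widehat{B} = \frac{1}{n}\sum_{i=1}^{n}\left.\frac{\partial\ell_i}{\partial\theta}\frac{\partial\ell_i}{\partial\theta}^{\top}\right|_{\hat{\theta}} $$

    Under correct specification \widehat{A} \approx \widehat{B} and the sandwich collapses to the usual inverse information matrix; a large discrepancy between the two is itself a hint that the model is misspecified.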

  12. Why item parcels are (almost) never appropriate: two wrongs do not make a right--camouflaging misspecification with item parcels in CFA models.

    Science.gov (United States)

    Marsh, Herbert W; Lüdtke, Oliver; Nagengast, Benjamin; Morin, Alexandre J S; Von Davier, Matthias

    2013-09-01

    The present investigation has a dual focus: to evaluate problematic practice in the use of item parcels and to suggest exploratory structural equation models (ESEMs) as a viable alternative to the traditional independent clusters confirmatory factor analysis (ICM-CFA) model (with no cross-loadings, subsidiary factors, or correlated uniquenesses). Typically, it is ill-advised to (a) use item parcels when ICM-CFA models do not fit the data, and (b) retain ICM-CFA models when items cross-load on multiple factors. However, the combined use of (a) and (b) is widespread and often provides such misleadingly good fit indexes that applied researchers might believe that misspecification problems are resolved--that 2 wrongs really do make a right. Taking a pragmatist perspective, in 4 studies we demonstrate with responses to the Rosenberg Self-Esteem Inventory (Rosenberg, 1965), Big Five personality factors, and simulated data that even small cross-loadings seriously distort relations among ICM-CFA constructs or even decisions on the number of factors; although obvious in item-level analyses, this is camouflaged by the use of parcels. ESEMs provide a viable alternative to ICM-CFAs and a test for the appropriateness of parcels. The use of parcels with an ICM-CFA model is most justifiable when the fit of both ICM-CFA and ESEM models is acceptable and equally good, and when substantively important interpretations are similar. However, if the ESEM model fits the data better than the ICM-CFA model, then the use of parcels with an ICM-CFA model typically is ill-advised--particularly in studies that are also interested in scale development, latent means, and measurement invariance.

  13. The impact of covariance misspecification in group-based trajectory models for longitudinal data with non-stationary covariance structure.

    Science.gov (United States)

    Davies, Christopher E; Glonek, Gary Fv; Giles, Lynne C

    2017-08-01

    One purpose of a longitudinal study is to gain a better understanding of how an outcome of interest changes among a given population over time. In what follows, a trajectory will be taken to mean the series of measurements of the outcome variable for an individual. Group-based trajectory modelling methods seek to identify subgroups of trajectories within a population, such that trajectories that are grouped together are more similar to each other than to trajectories in distinct groups. Group-based trajectory models generally assume a certain structure in the covariances between measurements, for example conditional independence, homogeneous variance between groups or stationary variance over time. Violations of these assumptions could be expected to result in poor model performance. We used simulation to investigate the effect of covariance misspecification on misclassification of trajectories in commonly used models under a range of scenarios. To do this we defined a measure of performance relative to the ideal Bayesian correct classification rate. We found that the more complex models generally performed better over a range of scenarios. In particular, incorrectly specified covariance matrices could significantly bias the results but using models with a correct but more complicated than necessary covariance matrix incurred little cost.

  14. A Critical Review of Construct Indicators and Measurement Model Misspecification in Marketing and Consumer Research.

    OpenAIRE

    Jarvis, Cheryl Burke; MacKenzie, Scott B; Podsakoff, Philip M

    2003-01-01

    A review of the literature suggests that few studies use formative indicator measurement models, even though they should. Therefore, the purpose of this research is to (a) discuss the distinction between formative and reflective measurement models, (b) develop a set of conceptual criteria that can be used to determine whether a construct should be modeled as having formative or reflective indicators, (c) review the marketing literature to obtain an estimate of the extent of measurement model ...

  15. Justifying the QCD parton model

    CERN Document Server

    Veneziano, G

    2018-01-01

    I will focus my attention on the two papers I wrote with Roberto and Daniele Amati on justifying the QCD-improved parton model, a very basic tool used every day to estimate a variety of processes involving strong (as well as possibly other) interactions. While doing so, I will also touch on other occasions I had to work —or just interact— with Roberto during more than 30 years of our respective careers.

  16. Estimating the mean and standard deviation of environmental data with below detection limit observations: Considering highly skewed data and model misspecification.

    Science.gov (United States)

    Shoari, Niloofar; Dubé, Jean-Sébastien; Chenouri, Shoja'eddin

    2015-11-01

    In environmental studies, concentration measurements frequently fall below detection limits of measuring instruments, resulting in left-censored data. Some studies employ parametric methods such as the maximum likelihood estimator (MLE), robust regression on order statistic (rROS), and gamma regression on order statistic (GROS), while others suggest a non-parametric approach, the Kaplan-Meier method (KM). Using examples of real data from a soil characterization study in Montreal, we highlight the need for additional investigations that aim at unifying the existing literature. A number of studies have examined this issue; however, those considering data skewness and model misspecification are rare. These aspects are investigated in this paper through simulations. Among other findings, results show that for low skewed data, the performance of different statistical methods is comparable, regardless of the censoring percentage and sample size. For highly skewed data, the performance of the MLE method under lognormal and Weibull distributions is questionable; particularly, when the sample size is small or censoring percentage is high. In such conditions, MLE under gamma distribution, rROS, GROS, and KM are less sensitive to skewness. Related to model misspecification, MLE based on lognormal and Weibull distributions provides poor estimates when the true distribution of data is misspecified. However, the methods of rROS, GROS, and MLE under gamma distribution are generally robust to model misspecifications regardless of skewness, sample size, and censoring percentage. Since the characteristics of environmental data (e.g., type of distribution and skewness) are unknown a priori, we suggest using MLE based on gamma distribution, rROS and GROS. Copyright © 2015 Elsevier Ltd. All rights reserved.
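
    A minimal sketch of one of the parametric estimators evaluated in such studies, MLE for lognormal data with a single detection limit (illustrative only; rROS, GROS and KM are not shown, and real data may have multiple detection limits):

        import numpy as np
        from scipy import stats, optimize

        rng = np.random.default_rng(3)
        x = rng.lognormal(1.0, 1.2, 200)     # true concentrations
        dl = np.quantile(x, 0.3)             # detection limit -> ~30% censoring
        observed = x[x >= dl]
        n_cens = int((x < dl).sum())         # left-censored count

        def negloglik(params):
            mu, sigma = params
            ll = stats.lognorm.logpdf(observed, s=sigma, scale=np.exp(mu)).sum()
            ll += n_cens * stats.lognorm.logcdf(dl, s=sigma, scale=np.exp(mu))
            return -ll

        res = optimize.minimize(negloglik, x0=[0.0, 1.0], method="L-BFGS-B",
                                bounds=[(None, None), (1e-3, None)])
        mu_hat, sigma_hat = res.x
        print("estimated mean:", np.exp(mu_hat + sigma_hat**2 / 2))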

  17. Specification and misspecification of theoretical foundations and logic models for health communication campaigns.

    Science.gov (United States)

    Slater, Michael D

    2006-01-01

    While increasingly widespread use of behavior change theory is an advance for communication campaigns and their evaluation, such theories provide a necessary but not sufficient condition for theory-based communication interventions. Such interventions and their evaluations need to incorporate theoretical thinking about plausible mechanisms of message effect on health-related attitudes and behavior. Otherwise, strategic errors in message design and dissemination, and misspecified campaign logic models, insensitive to campaign effects, are likely to result. Implications of the elaboration likelihood model, attitude accessibility, attitude to the ad theory, exemplification, and framing are explored, and implications for campaign strategy and evaluation designs are briefly discussed. Initial propositions are advanced regarding a theory of campaign affect generalization derived from attitude to ad theory, and regarding a theory of reframing targeted health behaviors in those difficult contexts in which intended audiences are resistant to the advocated behavior or message.

  18. Model Justified Search Algorithms for Scheduling Under Uncertainty

    National Research Council Canada - National Science Library

    Howe, Adele; Whitley, L. D

    2008-01-01

    … We also identified plateaus as a significant barrier to superb performance of local search on scheduling and have studied several canonical discrete optimization problems to discover and model the nature of plateaus…

  19. HALL project. Justifying synthesis of the dimensioning inventory model

    International Nuclear Information System (INIS)

    Lagrange, M.H.

    2003-01-01

    This document explains the input data and the hypotheses retained for the establishment of the dimensioning inventory model (DIM). It first recalls the scenarios considered for spent fuel and reprocessing management, describes the updating of the list of families of high-activity and long-lived (HALL) waste packages, and states the hypotheses considered for quantifying them in the inventory model. It also presents the selection criteria for type-packages and the list of such packages. It specifies the grouping of package families into type-packages and the related quantitative data. Finally, it details how the radiological and chemical descriptions of the type-packages were prepared. (J.S.)

  20. The Misspecification of the Covariance Structures in Multilevel Models for Single-Case Data: A Monte Carlo Simulation Study

    Science.gov (United States)

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M.; Beretvas, S. Natasha; Van den Noortgate, Wim

    2016-01-01

    The impact of misspecifying covariance matrices at the second and third levels of the three-level model is evaluated. Results indicate that ignoring existing covariance has no effect on the treatment effect estimate. In addition, the between-case variance estimates are unbiased when covariance is either modeled or ignored. If the research interest…

  1. Detection of Q-Matrix Misspecification Using Two Criteria for Validation of Cognitive Structures under the Least Squares Distance Model

    Science.gov (United States)

    Romero, Sonia J.; Ordoñez, Xavier G.; Ponsoda, Vincente; Revuelta, Javier

    2014-01-01

    Cognitive Diagnostic Models (CDMs) aim to provide information about the degree to which individuals have mastered specific attributes that underlie their success on test items. The Q-matrix is a key element in the application of CDMs, because it contains the item-attribute links representing the cognitive structure proposed for solving…

  2. Exploratory Analyses To Improve Model Fit: Errors Due to Misspecification and a Strategy To Reduce Their Occurrence.

    Science.gov (United States)

    Green, Samuel B.; Thompson, Marilyn S.; Poirier, Jennifer

    1999-01-01

    The use of Lagrange multiplier (LM) tests in specification searches and the efforts that involve the addition of extraneous parameters to models are discussed. Presented are a rationale and strategy for conducting specification searches in two stages that involve adding parameters to LM tests to maximize fit and then deleting parameters not needed…

  3. Structural Break Tests Robust to Regression Misspecification

    Directory of Open Access Journals (Sweden)

    Alaa Abi Morshed

    2018-05-01

    Structural break tests for regression models are sensitive to model misspecification. We show—analytically and through simulations—that the sup Wald test for breaks in the conditional mean and variance of a time series process exhibits severe size distortions when the conditional mean dynamics are misspecified. We also show that the sup Wald test for breaks in the unconditional mean and variance does not have the same size distortions, yet benefits from similar power to its conditional counterpart in correctly specified models. Hence, we propose using it as an alternative and complementary test for breaks. We apply the unconditional and conditional mean and variance tests to three US series: unemployment, industrial production growth and interest rates. Both the unconditional and the conditional mean tests detect a break in the mean of interest rates. However, for the other two series, the unconditional mean test does not detect a break, while the conditional mean tests based on dynamic regression models occasionally detect a break, with the implied break-point estimator varying across different dynamic specifications. For all series, the unconditional variance does not detect a break while most tests for the conditional variance do detect a break which also varies across specifications.
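
    As a toy illustration of the unconditional-mean version of such a test (a sketch only: it assumes i.i.d. errors, ignores serial correlation and variance breaks, and the statistic must be compared with Andrews-type sup-Wald critical values rather than a chi-squared quantile):

        import numpy as np

        def sup_wald_mean_break(y, trim=0.15):
            """Sup-Wald statistic for a single break in the unconditional mean."""
            n = len(y)
            best_stat, best_k = 0.0, None
            for k in range(int(trim * n), int((1 - trim) * n)):
                m1, m2 = y[:k].mean(), y[k:].mean()
                s2 = (((y[:k] - m1)**2).sum() + ((y[k:] - m2)**2).sum()) / (n - 2)
                wald = (m1 - m2)**2 / (s2 * (1 / k + 1 / (n - k)))
                if wald > best_stat:
                    best_stat, best_k = wald, k
            return best_stat, best_k   # statistic and implied break-point estimate

        rng = np.random.default_rng(7)
        y = np.r_[rng.normal(0.0, 1, 150), rng.normal(0.8, 1, 150)]
        print(sup_wald_mean_break(y))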

  4. Animal Models in Forensic Science Research: Justified Use or Ethical Exploitation?

    Science.gov (United States)

    Mole, Calvin Gerald; Heyns, Marise

    2018-05-01

    A moral dilemma exists in biomedical research relating to the use of animal or human tissue when conducting scientific research. In human ethics, researchers need to justify why the use of humans is necessary should suitable models exist. Conversely, in animal ethics, a researcher must justify why research cannot be carried out on suitable alternatives. In the case of medical procedures or therapeutics testing, the use of animal models is often justified. However, in forensic research, the justification may be less evident, particularly when research involves the infliction of trauma on living animals. To determine how the forensic science community is dealing with this dilemma, a review of literature within major forensic science journals was conducted. The frequency and trends of the use of animals in forensic science research was investigated for the period 1 January 2012-31 December 2016. The review revealed 204 original articles utilizing 5050 animals in various forms as analogues for human tissue. The most common specimens utilized were various species of rats (35.3%), pigs (29.3%), mice (17.7%), and rabbits (8.2%) although different specimens were favored in different study themes. The majority of studies (58%) were conducted on post-mortem specimens. It is, however, evident that more needs to be done to uphold the basic ethical principles of reduction, refinement and replacement in the use of animals for research purposes.

  5. Modelling severe Staphylococcus aureus sepsis in conscious pigs: are implications for animal welfare justified?

    DEFF Research Database (Denmark)

    Olsen, Helle G; Kjelgaard-Hansen, Mads; Tveden-Nyborg, Pernille

    2016-01-01

    A porcine model of haematogenous Staphylococcus aureus sepsis has previously been established in our research group. In these studies, pigs developed severe sepsis including liver dysfunction during a 48 h study period. As pigs were awake during the study, animal welfare was challenged… Prior to euthanasia, a galactose elimination capacity test was performed to assess liver function. Pigs were euthanised 48 h post inoculation for necropsy and histopathological evaluation. While infusion times of 6.66 min and higher did not induce liver dysfunction (n = 3), the infusion time of 3…, according to humane endpoints. A usable balance between scientific purpose and animal welfare could not be achieved, and we therefore find it hard to justify further use of this conscious porcine sepsis model. In order to make a model of translational relevance for human sepsis, we suggest that future model…

  6. Are stock prices too volatile to be justified by the dividend discount model?

    Science.gov (United States)

    Akdeniz, Levent; Salih, Aslıhan Altay; Ok, Süleyman Tuluğ

    2007-03-01

    This study investigates excess stock price volatility using the variance bound framework of LeRoy and Porter [The present-value relation: tests based on implied variance bounds, Econometrica 49 (1981) 555-574] and of Shiller [Do stock prices move too much to be justified by subsequent changes in dividends? Am. Econ. Rev. 71 (1981) 421-436.]. The conditional variance bound relationship is examined using cross-sectional data simulated from the general equilibrium asset pricing model of Brock [Asset prices in a production economy, in: J.J. McCall (Ed.), The Economics of Information and Uncertainty, University of Chicago Press, Chicago (for N.B.E.R.), 1982]. Results show that the conditional variance bounds hold, hence, our hypothesis of the validity of the dividend discount model cannot be rejected. Moreover, in our setting, markets are efficient and stock prices are neither affected by herd psychology nor by the outcome of noise trading by naive investors; thus, we are able to control for market efficiency. Consequently, we show that one cannot infer any conclusions about market efficiency from the unconditional variance bounds tests.
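
    The variance bound at issue is the classic one from this literature (standard form; p_t^* denotes the ex-post rational price computed from realized dividends d_{t+k} with discount factor \delta):

        $$ \mathrm{Var}(p_t) \;\le\; \mathrm{Var}(p_t^{*}), \qquad p_t^{*} = \sum_{k=1}^{\infty} \delta^{k}\, d_{t+k} $$

    The bound follows because the market price is the conditional expectation p_t = E_t[p_t^*]; the forecast error p_t^* - p_t is then uncorrelated with p_t, so Var(p_t^*) = Var(p_t) + Var(p_t^* - p_t) >= Var(p_t).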

  7. Beyond Conflict and Spoilt Identities: How Rwandan Leaders Justify a Single Recategorization Model for Post-Conflict Reconciliation

    Directory of Open Access Journals (Sweden)

    Sigrun Marie Moss

    2014-08-01

    Since 1994, the Rwandan government has attempted to remove the division of the population into the 'ethnic' identities Hutu, Tutsi and Twa and instead make the shared Rwandan identity salient. This paper explores how leaders justify the single recategorization model, based on nine in-depth semi-structured interviews with Rwandan national leaders (politicians and bureaucrats tasked with leading unity implementation) conducted in Rwanda over three months in 2011/2012. Thematic analysis revealed this was done through a meta-narrative focusing on the shared Rwandan identity. Three frames were found in use to "sell" this narrative, in which ethnic identities are presented as (a) an alien construction, (b) a construction used to the disadvantage of the people, and (c) non-essential social constructs. The material demonstrates the identity entrepreneurship behind the single recategorization approach: the definition of the category boundaries, the category content, and the strategies for controlling and overcoming alternative narratives. Rwandan identity is presented as essential and legitimate, and as offering a potential way for people to escape spoilt subordinate identities. The interviewed leaders insist Rwandans are all one, and that the single recategorization is the right path for Rwanda, but this approach has been criticised for increasing rather than decreasing intergroup conflict due to social identity threat. The Rwandan case offers a rare opportunity to explore leaders' own narratives and framing of these 'ethnic' identities to justify the single recategorization approach.

  8. Justify your alpha

    NARCIS (Netherlands)

    Lakens, Daniel; Adolfi, Federico G.; Albers, Casper J.; Anvari, Farid; Apps, Matthew A.J.; Argamon, Shlomo E.; Baguley, Thom; Becker, Raymond B.; Benning, Stephen D.; Bradford, Daniel E.; Buchanan, Erin M.; Caldwell, Aaron R.; Van Calster, Ben; Carlsson, Rickard; Chen, Sau Chin; Chung, Bryan; Colling, Lincoln J.; Collins, Gary S.; Crook, Zander; Cross, Emily S.; Daniels, Sameera; Danielsson, Henrik; Debruine, Lisa; Dunleavy, Daniel J.; Earp, Brian D.; Feist, Michele I.; Ferrell, Jason D.; Field, James G.; Fox, Nicholas W.; Friesen, Amanda; Gomes, Caio; Gonzalez-Marquez, Monica; Grange, James A.; Grieve, Andrew P.; Guggenberger, Robert; Grist, James; Van Harmelen, Anne Laura; Hasselman, Fred; Hochard, Kevin D.; Hoffarth, Mark R.; Holmes, Nicholas P.; Ingre, Michael; Isager, Peder M.; Isotalus, Hanna K.; Johansson, Christer; Juszczyk, Konrad; Kenny, David A.; Khalil, Ahmed A.; Konat, Barbara; Lao, Junpeng; Larsen, Erik Gahner; Lodder, Gerine M.A.; Lukavský, Jiří; Madan, Christopher R.; Manheim, David; Martin, Stephen R.; Martin, Andrea E.; Mayo, Deborah G.; McCarthy, Randy J.; McConway, Kevin; McFarland, Colin; Nio, Amanda Q.X.; Nilsonne, Gustav; De Oliveira, Cilene Lino; De Xivry, Jean Jacques Orban; Parsons, Sam; Pfuhl, Gerit; Quinn, Kimberly A.; Sakon, John J.; Saribay, S. Adil; Schneider, Iris K.; Selvaraju, Manojkumar; Sjoerds, Zsuzsika; Smith, Samuel G.; Smits, Tim; Spies, Jeffrey R.; Sreekumar, Vishnu; Steltenpohl, Crystal N.; Stenhouse, Neil; Świątkowski, Wojciech; Vadillo, Miguel A.; Van Assen, Marcel A.L.M.; Williams, Matt N.; Williams, Samantha E.; Williams, Donald R.; Yarkoni, Tal; Ziano, Ignazio; Zwaan, Rolf A.

    2018-01-01

    In response to recommendations to redefine statistical significance to P ≤ 0.005, we propose that researchers should transparently report and justify all choices they make when designing a study, including the alpha level.

  9. Justifying an information system.

    Science.gov (United States)

    Neal, T

    1993-03-01

    A four-step model for the hospital pharmacist to use in justifying a computerized information system is described. In the first step, costs are identified and analyzed. Both the costs and the advantages of the existing system are evaluated. A request for information and a request for proposal are prepared and sent to vendors, who return estimates of hardware, software, and support costs. Costs can then be merged and analyzed as one-time costs, recurring annual costs, and total costs annualized over five years. In step 2, benefits are identified and analyzed. Tangible economic benefits are those that directly reduce or avoid costs or directly enhance revenues and can be measured in dollars. Intangible economic benefits are realized through a reduction in overhead and reallocation of labor and are less easily measured in dollars. Noneconomic benefits, some involving quality-of-care issues, can also be used in the justification. Step 3 consists of a formal risk assessment in which the project is broken into categories for which specific questions are answered by assigning a risk factor. In step 4, both costs and benefits are subjected to a financial analysis, the object of which is to maximize the return on investment to the institution from the capital being requested. Calculations include return on investment based on the net present value of money, internal rate of return, payback period, and profitability index. A well-designed justification for an information system not only identifies the costs, risks, and benefits but also presents a plan of action for realizing the benefits.
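
    The step-4 arithmetic is standard capital budgeting; a small sketch with hypothetical numbers (the cash flows, rate and horizon below are invented, not from the article):

        from scipy.optimize import brentq

        rate = 0.08                         # assumed cost of capital
        cash = [-250_000] + [70_000] * 5    # year-0 outlay, then annual net benefits

        npv = sum(c / (1 + rate)**t for t, c in enumerate(cash))
        irr = brentq(lambda r: sum(c / (1 + r)**t for t, c in enumerate(cash)),
                     1e-6, 1.0)             # internal rate of return

        cum, payback = 0.0, None            # simple (undiscounted) payback period
        for year, c in enumerate(cash[1:], start=1):
            cum += c
            if payback is None and cum >= -cash[0]:
                payback = year

        pi = (npv - cash[0]) / -cash[0]     # profitability index: PV of benefits / outlay
        print(npv, irr, payback, pi)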

  10. Interpretational confounding is due to misspecification, not to type of indicator: comment on Howell, Breivik, and Wilcox (2007).

    Science.gov (United States)

    Bollen, Kenneth A

    2007-06-01

    R. D. Howell, E. Breivik, and J. B. Wilcox (2007) have argued that causal (formative) indicators are inherently subject to interpretational confounding. That is, they have argued that using causal (formative) indicators leads the empirical meaning of a latent variable to be other than that assigned to it by a researcher. Their critique of causal (formative) indicators rests on several claims: (a) A latent variable exists apart from the model when there are effect (reflective) indicators but not when there are causal (formative) indicators, (b) causal (formative) indicators need not have the same consequences, (c) causal (formative) indicators are inherently subject to interpretational confounding, and (d) a researcher cannot detect interpretational confounding when using causal (formative) indicators. This article shows that each claim is false. Rather, interpretational confounding is more a problem of structural misspecification of a model combined with an underidentified model that leaves these misspecifications undetected. Interpretational confounding does not occur if the model is correctly specified whether a researcher has causal (formative) or effect (reflective) indicators. It is the validity of a model not the type of indicator that determines the potential for interpretational confounding. Copyright 2007 APA, all rights reserved.

  11. Justifying Clinical Nudges.

    Science.gov (United States)

    Gorin, Moti; Joffe, Steven; Dickert, Neal; Halpern, Scott

    2017-03-01

    The shift away from paternalistic decision-making and toward patient-centered, shared decision-making has stemmed from the recognition that in order to practice medicine ethically, health care professionals must take seriously the values and preferences of their patients. At the same time, there is growing recognition that minor and seemingly irrelevant features of how choices are presented can substantially influence the decisions people make. Behavioral economists have identified striking ways in which trivial differences in the presentation of options can powerfully and predictably affect people's choices. Choice-affecting features of the decision environment that do not restrict the range of choices or significantly alter the incentives have come to be known as "nudges." Although some have criticized conscious efforts to influence choice, we believe that clinical nudges may often be morally justified. The most straightforward justification for nudge interventions is that they help people bypass their cognitive limitations-for example, the tendency to choose the first option presented even when that option is not the best for them-thereby allowing people to make choices that best align with their rational preferences or deeply held values. However, we argue that this justification is problematic. We argue that, if physicians wish to use nudges to shape their patients' choices, the justification for doing so must appeal to an ethical and professional standard, not to patients' preferences. We demonstrate how a standard with which clinicians and bioethicists already are quite familiar-the best-interest standard-offers a robust justification for the use of nudges. © 2017 The Hastings Center.

  12. Semi-Nonparametric Estimation and Misspecification Testing of Diffusion Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis

    … of the estimators and tests under the null are derived, and the power properties are analyzed by considering contiguous alternatives. Tests directly comparing the drift and diffusion estimators under the relevant null and alternative are also analyzed. Markov Bootstrap versions of the test statistics are proposed to improve on the finite-sample approximations. The finite sample properties of the estimators are examined in a simulation study.

  13. Supplier-induced demand: re-examining identification and misspecification in cross-sectional analysis.

    Science.gov (United States)

    Peacock, Stuart J; Richardson, Jeffrey R J

    2007-09-01

    This paper re-examines criticisms of cross-sectional methods used to test for supplier-induced demand (SID) and re-evaluates the empirical evidence using data from Australian medical services. Cross-sectional studies of SID have been criticised on two grounds. First, and most important, the inclusion of the doctor supply in the demand equation leads to an identification problem. This criticism is shown to be invalid, as the doctor supply variable is stochastic and depends upon a variety of other variables including the desirability of the location. Second, cross-sectional studies of SID fail diagnostic tests and produce artefactual findings due to model misspecification. Contrary to this, the re-evaluation of cross-sectional Australian data indicate that demand equations that do not include the doctor supply are misspecified. Empirical evidence from the re-evaluation of Australian medical services data supports the notion of SID. Demand and supply equations are well specified and have very good explanatory power. The demand equation is identified and the desirability of a location is an important predictor of the doctor supply. Results show an average price elasticity of demand of 0.22 and an average elasticity of demand with respect to the doctor supply of 0.46, with the impact of SID becoming stronger as the doctor supply rises. The conclusion we draw from this paper is that two of the main criticisms of the empirical evidence supporting the SID hypothesis have been inappropriately levelled at the methods used. More importantly, SID provides a satisfactory, and robust, explanation of the empirical data on the demand for medical services in Australia.

  14. Measurement Model Specification Error in LISREL Structural Equation Models.

    Science.gov (United States)

    Baldwin, Beatrice; Lomax, Richard

    This LISREL study examines the robustness of the maximum likelihood estimates under varying degrees of measurement model misspecification. A true model containing five latent variables (two endogenous and three exogenous) and two indicator variables per latent variable was used. Measurement model misspecification considered included errors of…

  15. Journalism as Justified True Belief

    Directory of Open Access Journals (Sweden)

    Sílvia Lisboa

    2015-12-01

    If it is important to think of journalism as a form of knowledge, then how does it become knowledge? How does this process work? In order to answer this question, this article proposes a new understanding of journalism as a subject; presenting it as a justified true belief. We think of journalism being based on pillars of truth and justification, conditions necessary in order for Epistemology to grant it the status of knowledge. We address the concept of truth and show how journalistic reports are justified to the public as well as consider the central role of credibility in this process. We add to the epistemic conception by using concepts of discourse that help to understand how journalism provides evidence through its intentions, its authority and its ability. This evidence acts like a guide for the reader towards forming opinions on journalistic reports and recognizing journalism as a form of knowledge.

  16. About 'restriction', 'justified' and 'necessary'

    DEFF Research Database (Denmark)

    Werlauff, Erik

    2016-01-01

    The article is an academic fairy tale about why and how all national corporate tax protection legislation should undergo a 3-part test to ensure its consistency with EU law. Each Member State should introduce a compulsory 3-step test for each new (corporate) tax provision. The test is simple: (1) Does the tax provision constitute a restriction in the sense of EU law? (2) If the answer is yes: Is the restriction justified? (3) If the answer is yes: Is the restriction necessary?

  17. Journalism as Justified True Belief

    OpenAIRE

    Lisboa, Sílvia; Benetti, Marcia

    2015-01-01

    If it is important to think of journalism as a form of knowledge, then how does it become knowledge? How does this process work? In order to answer this question, this article proposes a new understanding of journalism as a subject; presenting it as a justified true belief. We think of journalism being based on pillars of truth and justification, conditions necessary in order for Epistemology to grant it the status of knowledge. We address the concept of truth and show how journalistic report...

  18. The Bernstein-Von-Mises theorem under misspecification

    NARCIS (Netherlands)

    Kleijn, B.J.K.; van der Vaart, A.W.

    2012-01-01

    We prove that the posterior distribution of a parameter in misspecified LAN parametric models can be approximated by a random normal distribution. We derive from this that Bayesian credible sets are not valid confidence sets if the model is misspecified. We obtain the result under conditions that

  19. Is nuclear energy ethically justifiable?

    International Nuclear Information System (INIS)

    Zuend, H.

    1988-01-01

    Nuclear technology offers the chance to provide an essential long-term contribution to the energy supply of the world population and to use the raw materials uranium and thorium, which have no other use. The use of nuclear energy is ethically justifiable provided that certain simple fundamental rules for the design of nuclear facilities are observed. Such rules were clearly violated before the reactor accident at Chernobyl. They are, however, observed in our existing nuclear power plants. Compared with other energy systems, nuclear energy has, with the exception of natural gas, the lowest risk. Any consideration of the ethical justification of nuclear energy must also include the question of withdrawal. A withdrawal would have considerable social consequences for the industrial nations as well as for the developing countries. The alarm and concern spread by the opponents of nuclear energy should also be included in the ethical assessment. 8 refs., 2 figs

  20. Is nuclear energy ethically justifiable?

    International Nuclear Information System (INIS)

    Zuend, H.

    1987-01-01

    Nuclear technology offers the chance to make an extremely long-term contribution to the energy supply of the earth. The use of nuclear energy is ethically justifiable, provided that several fundamental rules are obeyed in the technical design of nuclear installations. Such fundamental rules were unequivocally violated in the Chernobyl nuclear power plant. They are, however, fulfilled in the existing Swiss nuclear power plants. Improvements are possible in new nuclear power plants. Compared with other usable energy systems, nuclear energy is second only to natural gas in having the lowest risk per generated energy unit. The question of ethical justification may rightly be asked of the non-use of nuclear energy as well. The socially weakest members of the Swiss population would suffer most under a renunciation of nuclear energy, and future prospects for the developing countries would deteriorate considerably if the industrial nations renounced it. The widespread fear of nuclear energy among the population is a consequence of non-objective discussion. 8 refs., 2 figs

  1. Essays on model uncertainty in financial models

    NARCIS (Netherlands)

    Li, Jing

    2018-01-01

    This dissertation studies model uncertainty, particularly in financial models. It consists of two empirical chapters and one theoretical chapter. The first empirical chapter (Chapter 2) classifies model uncertainty into parameter uncertainty and misspecification uncertainty. It investigates the

  2. Multivariate GARCH models

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Teräsvirta, Timo

    This article contains a review of multivariate GARCH models. Most common GARCH models are presented and their properties considered. This also includes nonparametric and semiparametric models. Existing specification and misspecification tests are discussed. Finally, there is an empirical example...

  3. Univariate and Multivariate Specification Search Indices in Covariance Structure Modeling.

    Science.gov (United States)

    Hutchinson, Susan R.

    1993-01-01

    Simulated population data were used to compare relative performances of the modification index and C. Chou and P. M. Bentler's Lagrange multiplier test (a multivariate generalization of a modification index) for four levels of model misspecification. Both indices failed to recover the true model except at the lowest level of misspecification. (SLD)

  4. A likely-universal model of fracture density and scaling justified by both data and theory. Consequences for crustal hydro-mechanics

    Science.gov (United States)

    Davy, P.; Darcel, C.; Le Goc, R.; Bour, O.

    2011-12-01

    We discuss the parameters that control fracture density on the Earth. We argue that most fracture systems are spatially organized according to two main regimes. The smallest fractures can grow independently of each other, defining a "dilute" regime controlled by the nuclei occurrence rate and the individual fracture growth law. Above a certain length, fractures stop growing due to mechanical interactions between fractures. For this "dense" regime, we derive the fracture density distribution by acknowledging that, statistically, a fracture does not cross a larger one. This very crude rule, which expresses the inhibiting role of large fractures against smaller ones but not the reverse, actually turns out to be a very strong control on the eventual fracture density distribution, since it results in a self-similar distribution whose exponents and density term are fully determined by the fractal dimension D and a dimensionless parameter γ that encompasses the details of fracture correlations and orientations. The range of values for D and γ appears to be extremely limited, which makes this model quite universal. This theory is supported by quantitative data on both fault and joint networks. The transition between the dilute and dense regimes occurs at about a few tenths of kilometers for fault systems, and a few meters for joints. This remarkable difference between the two processes is likely due to a large-scale control (localization) of fracture growth in faulting that does not exist in jointing. Finally, we discuss the consequences of this model for both flow and mechanical properties. In the dense regime, networks appear to be very close to a critical state.
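
    As a hedged reconstruction (the exact normalization is given in Davy et al.'s published work; treat the form below as indicative only), the dense-regime density described above is self-similar, of the form

        $$ n(l)\,dl \;\approx\; \gamma\, L^{D}\, l^{-(D+1)}\, dl, $$

    where n(l) dl is the number of fractures with length in [l, l + dl] inside a region of size L, D is the fractal dimension and γ the dimensionless density term; the exponent -(D+1) is the one forced by the rule that a fracture statistically does not cross a larger one.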

  5. Modelling and forecasting WIG20 daily returns

    DEFF Research Database (Denmark)

    Amado, Cristina; Silvennoinen, Annestiina; Terasvirta, Timo

    … of the model is that the deterministic component is specified before estimating the multiplicative conditional variance component. The resulting model is subjected to misspecification tests and its forecasting performance is compared with that of commonly applied models of conditional heteroskedasticity.

  6. The Self-Justifying Desire for Happiness

    DEFF Research Database (Denmark)

    Rodogno, Raffaele

    2004-01-01

    In Happiness, Tabensky equates the notion of happiness to Aristotelian eudaimonia. I shall claim that doing so amounts to equating two concepts that moderns cannot conceptually equate, namely, the good for a person and the good person or good life. In §2 I examine the way in which Tabensky deals with this issue and claim that his idea of happiness is as problematic for us moderns as is any translation of the notion of eudaimonia in terms of happiness. Naturally, if happiness understood as eudaimonia is ambiguous, so will be the notion of a desire for happiness, which we find at the core of Tabensky's whole project. In §3 I shall be concerned with another aspect of the desire for happiness; namely, its alleged self-justifying nature. I will attempt to undermine the idea that this desire is self-justifying by undermining the criterion on which Tabensky takes self-justifiability to rest, i.e. its…

  7. Parity, Incomparability and Rationally Justified Choice

    NARCIS (Netherlands)

    Boot, Martijn

    2009-01-01

    This article discusses the possibility of a rationally justified choice between two options neither of which is better than the other while they are not equally good either (‘3NT’). Joseph Raz regards such options as incomparable and argues that reason cannot guide the choice between them. Ruth

  8. Drug companies' evidence to justify advertising.

    Science.gov (United States)

    Wade, V A; Mansfield, P R; McDonald, P J

    1989-11-25

    Ten international pharmaceutical companies were asked by letter to supply their best evidence in support of marketing claims for seventeen products. Fifteen replies were received. Seven replies cited a total of 67 references: 31 contained relevant original data and only 13 were controlled trials, all of which had serious methodological flaws. There were four reports of changes in advertising claims and one company ceased marketing nikethamide in the third world. Standards of evidence used to justify advertising claims are inadequate.

  9. Modeling Attitude Variance in Small UAS’s for Acoustic Signature Simplification Using Experimental Design in a Hardware-in-the-Loop Simulation

    Science.gov (United States)

    2015-03-26

    … response. Additionally, choosing correlated levels for multiple factors results in multicollinearity, which can cause problems such as model misspecification or large variances and covariances for the regression coefficients. A good way to avoid multicollinearity is to use orthogonal, factorial…

  10. Does uncertainty justify intensity emission caps?

    International Nuclear Information System (INIS)

    Quirion, Philippe

    2005-01-01

    Environmental policies often set 'relative' or 'intensity' emission caps, i.e. emission limits proportional to the polluting firm's output. One of the arguments put forth in favour of relative caps is based on the uncertainty on business-as-usual output: if the firm's production level is higher than expected, so will be business-as-usual emissions, hence reaching a given level of emissions will be more costly than expected. As a consequence, it is argued, a higher emission level should be allowed if the production level is more important than expected. We assess this argument with a stochastic analytical model featuring two random variables: the business-as-usual emission level, proportional to output, and the slope of the marginal abatement cost curve. We compare the relative cap to an absolute cap and to a price instrument, in terms of welfare impact. It turns out that in most plausible cases, either a price instrument or an absolute cap yields a higher expected welfare than a relative cap. Quantitatively, the difference in expected welfare is typically very small between the absolute and the relative cap but may be significant between the relative cap and the price instrument. (author)

  11. Effect of misspecification of gene frequency on the two-point LOD score.

    Science.gov (United States)

    Pal, D K; Durner, M; Greenberg, D A

    2001-11-01

    In this study, we used computer simulation of simple and complex models to ask: (1) What is the penalty in evidence for linkage when the assumed gene frequency is far from the true gene frequency? (2) If the assumed gene frequency and inheritance model are misspecified in the analysis, can this lead to a higher maximum LOD score than that obtained under the true parameters? Linkage data simulated under simple dominant, recessive, dominant and recessive with reduced penetrance, and additive models were analysed assuming a single locus with both the correct and incorrect dominance model and assuming a range of different gene frequencies. We found that misspecifying the analysis gene frequency led to little penalty in maximum LOD score in all models examined, especially if the assumed gene frequency was lower than the generating one. Analysing linkage data assuming a gene frequency of the order of 0.01 for a dominant gene, and 0.1 for a recessive gene, appears to be a reasonable tactic in the majority of realistic situations because underestimating the gene frequency, even when the true gene frequency is high, leads to little penalty in the LOD score.
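
    For concreteness, the two-point LOD score itself is simple in the phase-known case (a simplified special case; the study's analyses used full likelihood calculations with assumed penetrances and gene frequencies):

        import numpy as np

        def lod(theta, recomb, nonrecomb):
            """Two-point LOD for phase-known meioses: log10 L(theta)/L(1/2)."""
            like = theta**recomb * (1 - theta)**nonrecomb
            return np.log10(like / 0.5**(recomb + nonrecomb))

        thetas = np.linspace(0.01, 0.5, 50)
        scores = [lod(t, recomb=2, nonrecomb=18) for t in thetas]
        print(thetas[int(np.argmax(scores))], max(scores))   # ~0.10, LOD ~3.2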

  12. Electrical stimulation in dysphagia treatment: a justified controversy?

    NARCIS (Netherlands)

    Bogaardt, H. C. A.

    2008-01-01

    Electrical stimulation in dysphagia treatment: a justified controversy? Neuromuscular electrostimulation (NMES) is a method of stimulating muscles with short electrical pulses. Neuromuscular electrostimulation is frequently used in physiotherapy to strengthen healthy muscles (as in sports

  13. Withholding stereotactic radiotherapy in elderly patients with stage I non-small cell lung cancer and co-existing COPD is not justified: Outcomes of a markov model analysis

    International Nuclear Information System (INIS)

    Louie, Alexander V.; Rodrigues, George; Hannouf, Malek; Lagerwaard, Frank; Palma, David; Zaric, Gregory S.; Haasbeek, Cornelis; Senan, Suresh

    2011-01-01

    Background and purpose: To model outcomes of SBRT versus best supportive care (BSC) in elderly COPD patients with stage I NSCLC. Material and methods: A Markov model was constructed to simulate quality-adjusted and overall survival (OS) in patients ≥75 years undergoing either SBRT or BSC over a five-year timeframe. SBRT rates of local, regional and distant recurrence were obtained from 247 patients treated at the VUMC, Amsterdam. Recurrence rates were converted into transition probabilities and stratified into four groups according to T stage (1, 2) and COPD GOLD score (I-II, III-IV). Data for untreated patients were obtained from the California Cancer Registry. Tumor stage and GOLD score utilities were adapted from the literature. Results: Our model correlated closely with the source OS data for SBRT-treated and untreated patients. After SBRT, our model predicted 6.8-47.2% five-year OS and 14.9-27.4 quality-adjusted life months (QALMs). The model predicted 9.0% and 2.8% five-year OS, and 10.1 and 6.1 QALMs, for untreated T1 and T2 patients, respectively. The benefit of SBRT was least for T2, GOLD III-IV patients. Conclusion: Our model indicates that SBRT should be considered in elderly stage I NSCLC patients with COPD.
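
    A Markov cohort model of this kind reduces to repeated multiplication of a state-occupancy vector by a cycle transition matrix; a minimal sketch (the states, probabilities and utilities below are hypothetical, not the study's estimates):

        import numpy as np

        # 3 states: alive without recurrence, recurrence, dead; 6-month cycles
        P = np.array([[0.90, 0.06, 0.04],
                      [0.00, 0.75, 0.25],
                      [0.00, 0.00, 1.00]])
        utility = np.array([0.75, 0.55, 0.0])    # per-state utility weights

        state = np.array([1.0, 0.0, 0.0])        # whole cohort starts disease-free
        qalms, cycles = 0.0, 10                  # 10 cycles = 5-year horizon
        for _ in range(cycles):
            qalms += (state * utility).sum() * 6    # quality-adjusted life months
            state = state @ P                       # advance the cohort one cycle

        print("5-year overall survival:", 1 - state[2])
        print("QALMs:", qalms)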

  14. The Use of Imputed Sibling Genotypes in Sibship-Based Association Analysis: On Modeling Alternatives, Power and Model Misspecification

    NARCIS (Netherlands)

    Minica, C.C.; Dolan, C.V.; Willemsen, G.; Vink, J.M.; Boomsma, D.I.

    2013-01-01

    When phenotypic, but no genotypic data are available for relatives of participants in genetic association studies, previous research has shown that family-based imputed genotypes can boost the statistical power when included in such studies. Here, using simulations, we compared the performance of

  15. Calculation-experimental method justifies the life of wagons

    Directory of Open Access Journals (Sweden)

    Валерія Сергіївна Воропай

    2015-11-01

    The article proposes a method to evaluate the technical state of tank wagons operating in the chemical industry. An algorithm for evaluating the technical state of tank wagons was developed, which makes it possible, on the basis of diagnosis and analysis of the current condition, to justify a further period of operation. A complex of works on testing the tanks, together with mathematical models for calculating the design strength and reliability, is proposed. The article is devoted to solving the problem of effective exploitation of the working fleet of tank wagons. Opportunities for the further exploitation of the cars, a complex of works on the assessment of their technical state, and the calculation of their remaining life are proposed. Engineering research on the chemical industry's fleet has reduced the shortage of rolling stock for the transportation of ammonia. An analysis was made of the numerous faults in the chassis and the main elements of the tank wagons' supporting structure after 20 years of exploitation. An algorithm for determining the residual life of specialized tank wagons operating in an industrial plant is proposed. A procedure for resource conservation of tank wagons carrying cargo under high pressure is proposed for the first time. The improved procedure for identifying residual life proposed in the article has both theoretical and practical importance.

  16. Justifying a recommendation: tell a story or present an argument?

    NARCIS (Netherlands)

    van den Hoven, P.J.

    2017-01-01

    In the deliberative genre there is a complex ‘playground’ of choices to present a recommendation; a rhetorician has to determine his or her position. Relevant dimensions are the coerciveness of the recommendation and the strength of its justification, but also the presentation format, varying from

  17. Justifying Innovative Language Programs in an Environment of ...

    African Journals Online (AJOL)

    Justifying Innovative Language Programs in an Environment of Change: The Case ... Key words: project management, change management, educational management, .... the sustainability of the course considering that there were and continue to be problems .... language teaching in general on a sound scientific base.

  18. Suing One's Sense Faculties for Fraud: 'Justifiable Reliance' in the ...

    African Journals Online (AJOL)

    The law requires that plaintiffs in fraud cases be 'justified' in relying on a misrepresentation. I deploy the accumulated intuitions of the law to defend externalist accounts of epistemic justification and knowledge against Laurence BonJour's counterexamples involving clairvoyance. I suggest that the law can offer a ...

  19. Investigation into How Managers Justify Investments in IT Infrastructure

    Science.gov (United States)

    Ibe, Richmond Ikechukwu

    2012-01-01

    Organization leaders are dependent on information technology for corporate productivity; however, senior managers have expressed concerns about insufficient benefits from information technology investments. The problem researched was to understand how midsized businesses justify investments in information technology infrastructure. The purpose of…

  20. Thumba X-ray plant: Are radiation fears justified

    International Nuclear Information System (INIS)

    Madhvanath, U.

    1978-01-01

    Technical facts about the X-ray unit located at the Vikram Sarabhai Space Centre, Thumba (India) are set down to show that it does not pose any radiation hazard, contrary to a newspaper report, and thus that radiation fears are not justifiable. It is stated that, after thorough checking, the X-ray installations in this space centre cause negligible exposure even to the workers who handle the units, and practically no exposure at all to others. (B.G.W.)

  1. Cost-justifying usability an update for the internet age

    CERN Document Server

    Bias, Randolph G

    2005-01-01

    You just know that an improvement of the user interface will reap rewards, but how do you justify the expense, the labor, and the time, and guarantee a robust ROI, ahead of time? How do you decide how much of an investment should be funded? And what is the best way to sell usability to others? In this completely revised and new edition, Randolph G. Bias (University of Texas at Austin, with 25 years' experience as a usability practitioner and manager) and Deborah J. Mayhew (internationally recognized usability consultant and author of two other seminal books including The Usability Enginee

  2. Why Do Women Justify Violence Against Wives More Often Than Do Men in Vietnam?

    Science.gov (United States)

    Krause, Kathleen H; Gordon-Roberts, Rachel; VanderEnde, Kristin; Schuler, Sidney Ruth; Yount, Kathryn M

    2015-05-06

    Intimate partner violence (IPV) harms the health of women and their children. In Vietnam, 31% of women report lifetime exposure to physical IPV, and surprisingly, women justify physical IPV against wives more often than do men. We compare men's and women's rates of finding good reason for wife hitting and assess whether differences in childhood experiences and resources and constraints in adulthood account for observed differences. Probability samples of married men (n = 522) and women (n = 533) were surveyed in Vietnam. Ordered logit models assessed the proportional odds for women versus men of finding more "good reasons" to hit a wife (never, 1-3 situations, 4-6 situations). In all situations, women found good reason to hit a wife more often than did men. The unadjusted odds for women versus men of reporting more good reasons to hit a wife were 6.55 (95% confidence interval [CI] = [4.82, 8.91]). This gap disappeared in adjusted models that included significant interactions of gender with age, number of children ever born, and experience of physical IPV as an adult. Having children was associated with justifying wife hitting among women but not men. Exposure to IPV in adulthood was associated with justifying wife hitting among men, but was negatively associated with justification of IPV among women. Further study of the gendered effects of resources and constraints in adulthood on attitudes about IPV against women will clarify women's more frequent reporting than men's that IPV against women is justified. © The Author(s) 2015.
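
    For readers unfamiliar with the method, the sketch below fits an ordered logit (proportional odds) model of the same general form in Python with statsmodels; the predictors, effect sizes and cut points are simulated placeholders, not the Vietnam survey data.

        import numpy as np
        import pandas as pd
        from statsmodels.miscmodels.ordinal_model import OrderedModel

        rng = np.random.default_rng(0)
        n = 500
        female = rng.integers(0, 2, n)          # hypothetical covariates
        ipv_exposure = rng.integers(0, 2, n)
        # Latent endorsement propensity (coefficients are invented)
        latent = 1.2 * female + 0.5 * ipv_exposure + rng.logistic(size=n)
        # Ordinal outcome: 0 = never, 1 = 1-3 situations, 2 = 4-6 situations
        y = pd.cut(latent, [-np.inf, 0.5, 2.0, np.inf], labels=False)

        X = pd.DataFrame({"female": female, "ipv_exposure": ipv_exposure})
        res = OrderedModel(y, X, distr="logit").fit(method="bfgs", disp=False)
        # Exponentiated slopes are proportional odds ratios
        print(np.exp(res.params[:2]).round(2))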

  3. The Integration of Continuous and Discrete Latent Variable Models: Potential Problems and Promising Opportunities

    Science.gov (United States)

    Bauer, Daniel J.; Curran, Patrick J.

    2004-01-01

    Structural equation mixture modeling (SEMM) integrates continuous and discrete latent variable models. Drawing on prior research on the relationships between continuous and discrete latent variable models, the authors identify 3 conditions that may lead to the estimation of spurious latent classes in SEMM: misspecification of the structural model,…

  4. Justifying the design and selection of literacy and thinking tools

    Directory of Open Access Journals (Sweden)

    David Whitehead

    2008-10-01

    Criteria for the design and selection of literacy and thinking tools that allow educators to justify what they do are described within a wider framework of learning theory and research into best practice. Based on a meta-analysis of best practice, results from a three-year project designed to evaluate the effectiveness of a secondary school literacy initiative in New Zealand, together with recent research from cognitive and neuro-psychologists, it is argued that the design and selection of literacy and thinking tools used in elementary schools should be consistent with criteria that are (i) teaching focused, (ii) learner focused, (iii) thought linked, (iv) neurologically consistent, (v) subject specific, (vi) text linked, (vii) developmentally appropriate, and (viii) assessment linked.

  5. Improved productivity justifies world record underbalanced perforating operation

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, A. M.; Bakker, E. R. [NAM B.V. (Netherlands); Hungerford, K.

    1998-12-31

    To achieve vertical connectivity with all the layers, and thus long-term sustained productivity in a highly stratified reservoir, a one-run underbalanced perforating operation was considered necessary. Owing to coiled-tubing limitations in this deep, high-pressure well (5136 m along hole, 3700 m true vertical depth, with a maximum deviation of 89 degrees), a hydraulic workover unit (HWU) was selected to deploy and retrieve the guns. The operation is considered a world record, since this is the longest section of guns (a total gross interval of 1026 m perforated) conveyed, fired underbalanced and deployed out of a live well. It is concluded that the improved productivity more than justified the additional time, effort and expenditure; considering the full life cycle of the well, it is readily apparent that the operation was an economic and technical success. Details of the considerations leading to the selection of the perforating technique, the planning and execution of the operation, and the validation of the technique in terms of productivity gains are provided. 13 refs., 7 figs.

  6. Justifying British Advertising in War and Austerity, 1939-51.

    Science.gov (United States)

    Haughton, Philippa

    2017-09-01

    Drawing together institutional papers, the trade- and national-press, and Mass-Observation documents, this article examines the changing ways that the Advertising Association justified commercial advertising from 1939 to 1951. It argues that the ability to repeatedly re-conceptualize the social and economic purposes of advertising was central to the industry's survival and revival during the years of war and austerity. This matters because the survival and revival of commercial advertising helps to explain the composition of the post-war mixed economy and the emergence of a consumer culture that became the 'golden age' of capitalism. While commercial advertising's role in supporting periods of affluence is well documented, much less is known about its relationship with war and austerity. This omission is problematic. Advertising was only able to shape the 1950s and 1960s economy because its corporate structures remained intact during the 1940s, as the industry withstood the challenges of wartime and the difficulties presented under Attlee's government. Recognizing the deliberate attempts of advertising people to promote a role for commercial advertising invites us to reconsider the inevitability of post-war affluence, while offering fresh insight into the debate around consumer education, freedom of choice, and the centrality of advertising and communication in democratic society: issues central to the society Britain was, and hoped to become. © The Author [2017]. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. Photonic jet etching: Justifying the shape of optical fiber tip

    Science.gov (United States)

    Abdurrochman, Andri; Zelgowski, Julien; Lecler, Sylvain; Mermet, Frédéric; Tumbelaka, Bernard; Fontaine, Joël

    2016-02-01

    A photonic jet (PJ) is a low-diverging and highly concentrated beam on the shadow side of a dielectric particle (cylinder or sphere). The concentration can be more than 200 times higher than that of the incident wave. It is a non-resonant near-field phenomenon that can propagate over a few wavelengths. Many potential applications have been proposed, including PJ etching. Hence, a guided beam is considered to increase control of PJ mobility. While others have used a combination of classical optical fibers and spheres, we are concerned with a classical optical fiber with a spherical tip to generate the PJ. This waveguide-driven PJ has been realized using a Gaussian-mode beam inside the core. It has different variable parameters compared to the classical PJ, which will be discussed in relation to the etching demonstrations. The dependencies between the tip parameters and the PJ properties are complex, and the theoretical aspects of this interaction will be presented to justify the shape of the tip and of the optical fiber used in our demonstrations. Methods for fabricating such an optical fiber tip will also be described. Finally, the ability to generate a PJ from the shaped optical fiber will be experimentally demonstrated, and potential applications for material processing will be presented.

  8. Inconsistency of Bayesian Inference for Misspecified Linear Models, and a Proposal for Repairing It

    NARCIS (Netherlands)

    Grünwald, P.; van Ommen, T.

    2017-01-01

    We empirically show that Bayesian inference can be inconsistent under misspecification in simple linear regression problems, both in a model averaging/selection and in a Bayesian ridge regression setting. We use the standard linear model, which assumes homoskedasticity, whereas the data are
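
    A minimal sketch of the kind of misspecification at issue, using scikit-learn's BayesianRidge as a stand-in: the fitted Bayesian linear model assumes one constant noise variance, while the simulated data are heteroskedastic. This mirrors the flavor of the problem, not the authors' exact model-averaging experiments.

        import numpy as np
        from sklearn.linear_model import BayesianRidge

        rng = np.random.default_rng(1)
        n = 200
        x = rng.uniform(-2, 2, n)
        # Heteroskedastic noise: variance grows with |x|, violating the
        # homoskedasticity built into the standard (Bayesian) linear model.
        y = 1.0 + 2.0 * x + rng.normal(scale=0.2 + 1.5 * np.abs(x))

        model = BayesianRidge().fit(x.reshape(-1, 1), y)
        print(f"posterior mean slope: {model.coef_[0]:.3f}")
        print(f"single fitted noise precision: {model.alpha_:.3f}")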

  9. Inconsistency of Bayesian inference for misspecified linear models, and a proposal for repairing it

    NARCIS (Netherlands)

    P.D. Grünwald (Peter); T. van Ommen (Thijs)

    2017-01-01

    We empirically show that Bayesian inference can be inconsistent under misspecification in simple linear regression problems, both in a model averaging/selection and in a Bayesian ridge regression setting. We use the standard linear model, which assumes homoskedasticity, whereas the data

  10. Two independent pivotal statistics that test location and misspecification and add-up to the Anderson-Rubin statistic

    NARCIS (Netherlands)

    Kleibergen, F.R.

    2002-01-01

    We extend the novel pivotal statistics for testing the parameters in the instrumental variables regression model. We show that these statistics result from a decomposition of the Anderson-Rubin statistic into two independent pivotal statistics. The first statistic is a score statistic that tests
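
    The statistic being decomposed has a compact textbook form; the sketch below computes it for a simulated instrumental-variables problem. This illustrates the Anderson-Rubin statistic itself, not the paper's decomposition into two independent pivotal statistics.

        import numpy as np
        from scipy import stats

        def anderson_rubin(y, X, Z, beta0):
            # AR statistic for H0: beta = beta0 in y = X @ beta + u,
            # with instrument matrix Z; F(k, n-k) reference distribution.
            n, k = Z.shape
            e = y - X @ beta0
            Pz_e = Z @ np.linalg.solve(Z.T @ Z, Z.T @ e)   # projection on Z
            ar = ((n - k) / k) * (e @ Pz_e) / (e @ (e - Pz_e))
            return ar, 1 - stats.f.cdf(ar, k, n - k)

        rng = np.random.default_rng(2)
        n = 300
        Z = rng.normal(size=(n, 2))                  # two instruments
        x = Z @ np.array([1.0, 0.5]) + rng.normal(size=n)
        y = 0.7 * x + rng.normal(size=n)
        ar, p = anderson_rubin(y, x[:, None], Z, np.array([0.7]))
        print(f"AR = {ar:.2f}, p = {p:.3f}")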

  11. A Systematic Approach for Identifying Level-1 Error Covariance Structures in Latent Growth Modeling

    Science.gov (United States)

    Ding, Cherng G.; Jane, Ten-Der; Wu, Chiu-Hui; Lin, Hang-Rung; Shen, Chih-Kang

    2017-01-01

    It has been pointed out in the literature that misspecification of the level-1 error covariance structure in latent growth modeling (LGM) has detrimental impacts on the inferences about growth parameters. Since correct covariance structure is difficult to specify by theory, the identification needs to rely on a specification search, which,…

  12. Improved hidden Markov model for nosocomial infections.

    Science.gov (United States)

    Khader, Karim; Leecaster, Molly; Greene, Tom; Samore, Matthew; Thomas, Alun

    2014-12-01

    We propose a novel hidden Markov model (HMM) for parameter estimation in hospital transmission models, and show that commonly made simplifying assumptions can lead to severe model misspecification and poor parameter estimates. A standard HMM that embodies two commonly made simplifying assumptions, namely a fixed patient count and binomially distributed detections, is compared with a new alternative HMM that does not require these simplifying assumptions. Using simulated data, we demonstrate how each of the simplifying assumptions used by the standard model leads to model misspecification, whereas the alternative model results in accurate parameter estimates. © The Authors 2013. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.
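
    To make the "standard" model concrete, the sketch below runs a forward filter for an HMM with a fixed patient count and binomially distributed detections, the two simplifying assumptions under critique. The birth-death transition dynamics and every rate are invented for illustration.

        import numpy as np
        from scipy.stats import binom

        N, p_acq, p_clear, p_detect = 10, 0.08, 0.15, 0.5
        states = np.arange(N + 1)          # hidden state: number colonized

        # Simple birth-death transitions: at most one acquisition or
        # clearance per step (a further simplification, for brevity).
        P = np.zeros((N + 1, N + 1))
        for i in states:
            up = p_acq * (N - i) / N
            down = p_clear * i / N
            P[i, i] = 1.0 - up - down
            if i < N:
                P[i, i + 1] = up
            if i > 0:
                P[i, i - 1] = down

        def forward_loglik(detections):
            alpha = np.full(N + 1, 1.0 / (N + 1))        # uniform initial state
            loglik = 0.0
            for d in detections:
                alpha = alpha @ P                        # predict one step
                alpha *= binom.pmf(d, states, p_detect)  # binomial detections
                c = alpha.sum()                          # normalize, track evidence
                loglik += np.log(c)
                alpha /= c
            return loglik

        print(forward_loglik([0, 1, 2, 1, 3, 2]))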

  13. Formal structures for extracting analytically justifiable decisions from ...

    African Journals Online (AJOL)

    This paper identifies the benefits of transforming business process models into Decision Support Systems (DSS). However, the literature reveals that a business process model “should have a formal foundation” as a major requirement for transforming it into a DSS. The paper further ascertains that formal structures refer to ...

  14. Justifying gender discrimination in the workplace: The mediating role of motherhood myths.

    Science.gov (United States)

    Verniers, Catherine; Vala, Jorge

    2018-01-01

    The issue of gender equality in employment has given rise to numerous policies in advanced industrial countries, all aimed at tackling gender discrimination regarding recruitment, salary and promotion. Yet gender inequalities in the workplace persist. The purpose of this research is to document the psychosocial process involved in the persistence of gender discrimination against working women. Drawing on the literature on the justification of discrimination, we hypothesized that the myths according to which women's work threatens children and family life mediate the relationship between sexism and opposition to a mother's career. We tested this hypothesis using the Family and Changing Gender Roles module of the International Social Survey Programme. The dataset contained data collected in 1994 and 2012 from 51,632 respondents from 18 countries. Structural equation modelling confirmed the hypothesised mediation. Overall, the findings shed light on how motherhood myths justify the gender structure in countries promoting gender equality.
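
    The paper tests this with full structural equation models; as a simpler stand-in, the sketch below illustrates the same mediation logic with a regression-based (Baron-Kenny style) estimate of the indirect effect and a Sobel test, on simulated data with hypothetical variable names.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 1000
        sexism = rng.normal(size=n)
        myths = 0.6 * sexism + rng.normal(size=n)                 # X -> M (invented)
        oppose = 0.5 * myths + 0.1 * sexism + rng.normal(size=n)  # M, X -> Y

        a = sm.OLS(myths, sm.add_constant(sexism)).fit()          # a path
        b = sm.OLS(oppose, sm.add_constant(
                np.column_stack([myths, sexism]))).fit()          # b path

        ab = a.params[1] * b.params[1]                   # indirect effect
        se = np.sqrt(a.params[1]**2 * b.bse[1]**2
                     + b.params[1]**2 * a.bse[1]**2)     # Sobel standard error
        print(f"indirect effect = {ab:.3f}, z = {ab / se:.2f}")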

  15. Justifying gender discrimination in the workplace: The mediating role of motherhood myths

    Science.gov (United States)

    2018-01-01

    The issue of gender equality in employment has given rise to numerous policies in advanced industrial countries, all aimed at tackling gender discrimination regarding recruitment, salary and promotion. Yet gender inequalities in the workplace persist. The purpose of this research is to document the psychosocial process involved in the persistence of gender discrimination against working women. Drawing on the literature on the justification of discrimination, we hypothesized that the myths according to which women’s work threatens children and family life mediate the relationship between sexism and opposition to a mother’s career. We tested this hypothesis using the Family and Changing Gender Roles module of the International Social Survey Programme. The dataset contained data collected in 1994 and 2012 from 51,632 respondents from 18 countries. Structural equation modelling confirmed the hypothesised mediation. Overall, the findings shed light on how motherhood myths justify the gender structure in countries promoting gender equality. PMID:29315326

  16. Army Justified Initial Production Plan for the Paladin Integrated Management Program but Has Not Resolved Two Vehicle Performance Deficiencies (Redacted)

    Science.gov (United States)

    2016-08-05

    August 5, 2016. MEMORANDUM FOR AUDITOR GENERAL, DEPARTMENT OF THE ARMY. SUBJECT: Army Justified Initial Production Plan for the Paladin... Family of Vehicles, M109A7 Self-Propelled Howitzer and M992A3 Carrier, Ammunition, Tracked, October 2015; M109A7 AFES Overview, September 2015

  17. System-justifying ideologies and academic outcomes among first-year Latino college students.

    Science.gov (United States)

    O'Brien, Laurie T; Mars, Dustin E; Eccleston, Collette

    2011-10-01

    The present study examines the relationship between system-justifying ideologies and academic outcomes among 78 first-year Latino college students (21 men, 57 women, mean age = 18.1 years) attending a moderately selective West Coast university. Endorsement of system-justifying ideologies was negatively associated with grade point average (GPA); however it was positively associated with feelings of belonging at the university. In addition, system-justifying ideologies were negatively associated with perceptions of personal discrimination. In contrast, ethnic identity centrality was unrelated to GPA, feelings of belonging, and perceptions of personal discrimination once the relationship between system-justifying ideologies and these outcomes was statistically taken into account. The results of the present study suggest that endorsement of system-justifying ideologies may be a double-edged sword for Latino college students, involving trade-offs between academic success and feelings of belonging.

  18. Laboratory experiments cannot be utilized to justify the action of early streamer emission terminals

    International Nuclear Information System (INIS)

    Becerra, Marley; Cooray, Vernon

    2008-01-01

    The early emission of streamers in laboratory long air gaps under switching impulses has been observed to reduce the time of initiation of leader positive discharges. This fact has been arbitrarily extrapolated by the manufacturers of early streamer emission devices to the case of upward connecting leaders initiated under natural lightning conditions, in support of those non-conventional terminals that claim to perform better than Franklin lightning rods. In order to discuss the physical basis and validity of these claims, a self-consistent model based on the physics of leader discharges is used to simulate the performance of lightning rods in the laboratory and under natural lightning conditions. It is theoretically shown that the initiation of early streamers can indeed lead to the early initiation of self-propagating positive leaders in laboratory long air gaps under switching voltages. However, this is not the case for positive connecting leaders initiated from the same lightning rod under the influence of the electric field produced by a downward moving stepped leader. The time evolution of the development of positive leaders under natural conditions is different from the case in the laboratory, where the leader inception condition is closely dependent upon the initiation of the first streamer burst. Our study shows that the claimed similarity between the performance of lightning rods under switching electric fields applied in the laboratory and under the electric field produced by a descending stepped leader is not justified. Thus, the use of existing laboratory results to validate the performance of the early streamer lightning rods under natural conditions is not justified

  19. Why is the conclusion of the Gerda experiment not justified

    Science.gov (United States)

    Klapdor-Kleingrothaus, H. V.; Krivosheina, I. V.

    2013-12-01

    The first results of the GERDA double beta decay experiment at Gran Sasso were recently presented. They are fully consistent with the HEIDELBERG-MOSCOW experiment but, because of their low statistics, cannot prove anything at this moment. It is no surprise that the statistics are still far from being able to test the signal claimed by the HEIDELBERG-MOSCOW experiment. The energy resolution of the coaxial detectors is a factor of 1.5 worse than in the HEIDELBERG-MOSCOW experiment. The original goal of background reduction to 10^-2 counts/kg y keV, an order of magnitude below the HEIDELBERG-MOSCOW experiment, has not been reached. The background is only a factor of 2.3 lower if referred to the experimental line width, i.e. in units of counts/kg y energy resolution. With pulse shape analysis (PSA) the background in the HEIDELBERG-MOSCOW experiment around Q_ββ is 4 × 10^-3 counts/kg y keV [1], which is a factor of 4 (5 when referred to the line width) lower than that of GERDA with pulse shape analysis. The amount of enriched material used in the GERDA measurement is 14.6 kg, only a factor of 1.34 larger than that used in the HEIDELBERG-MOSCOW experiment. The background model is oversimplified and not yet adequate; it has not been shown that the lines of the background can be identified. GERDA has to continue the measurement for a further ~5 years, until it can responsibly present an understood background. The present half-life limit presented by GERDA, T_1/2^0ν > 2.1 × 10^25 y (90% confidence level, i.e. 1.6σ), is still lower than the half-life of T_1/2^0ν = 2.23 (+0.44/-0.31) × 10^25 y [1] determined in the HEIDELBERG-MOSCOW experiment.

  20. Quantifying Distributional Model Risk via Optimal Transport

    OpenAIRE

    Blanchet, Jose; Murthy, Karthyek R. A.

    2016-01-01

    This paper deals with the problem of quantifying the impact of model misspecification when computing general expected values of interest. The methodology that we propose is applicable in great generality, in particular, we provide examples involving path dependent expectations of stochastic processes. Our approach consists in computing bounds for the expectation of interest regardless of the probability measure used, as long as the measure lies within a prescribed tolerance measured in terms ...
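
    As a toy numerical illustration of the general idea (not the paper's path-dependent construction), the sketch below bounds a tail probability over all laws within a transport-cost tolerance of an empirical baseline, using the now-standard dual form evaluated by brute force on grids; all numbers are invented.

        import numpy as np

        # Worst-case E_P[f] over { P : W_c(P, P_n) <= delta } via the dual
        #   inf_{lam >= 0}  lam*delta + (1/n) sum_i sup_y [ f(y) - lam*c(x_i, y) ]
        rng = np.random.default_rng(4)
        x = rng.normal(size=100)                    # baseline sample
        delta = 0.05                                # model-risk tolerance

        grid_y = np.linspace(-6.0, 6.0, 601)
        f_y = (grid_y > 2.0).astype(float)          # tail probability of interest
        cost = (x[:, None] - grid_y[None, :]) ** 2  # c(x_i, y) = squared distance

        best = np.inf
        for lam in np.linspace(0.01, 50.0, 200):
            inner = (f_y[None, :] - lam * cost).max(axis=1).mean()
            best = min(best, lam * delta + inner)

        print(f"baseline   P(X > 2): {(x > 2.0).mean():.4f}")
        print(f"worst-case P(X > 2) within tolerance: {best:.4f}")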

  1. Semiparametric mixed-effects analysis of PK/PD models using differential equations.

    Science.gov (United States)

    Wang, Yi; Eskridge, Kent M; Zhang, Shunpu

    2008-08-01

    Motivated by the use of semiparametric nonlinear mixed-effects modeling on longitudinal data, we develop a new semiparametric modeling approach to address potential structural model misspecification for population pharmacokinetic/pharmacodynamic (PK/PD) analysis. Specifically, we use a set of ordinary differential equations (ODEs) with form dx/dt = A(t)x + B(t) where B(t) is a nonparametric function that is estimated using penalized splines. The inclusion of a nonparametric function in the ODEs makes identification of structural model misspecification feasible by quantifying the model uncertainty and provides flexibility for accommodating possible structural model deficiencies. The resulting model will be implemented in a nonlinear mixed-effects modeling setup for population analysis. We illustrate the method with an application to cefamandole data and evaluate its performance through simulations.
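
    A sketch of the stated model structure dx/dt = A(t)x + B(t): a one-compartment example in which B(t) is a cubic B-spline standing in for the nonparametric input that the method estimates with penalized splines. Here the spline coefficients are fixed by hand; all rates and numbers are illustrative, not the cefamandole analysis.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.interpolate import BSpline

        # B(t): cubic B-spline on [0, 6]; in the full method its coefficients
        # would be estimated with a roughness penalty.
        knots = np.array([0, 0, 0, 0, 2, 4, 6, 6, 6, 6], dtype=float)
        coef = np.array([0.0, 0.8, 1.5, 0.6, 0.2, 0.0])   # illustrative
        B = BSpline(knots, coef, k=3)

        def rhs(t, x, k_el=0.4):
            # Elimination at rate k_el plus the flexible input B(t), which
            # absorbs potential structural misspecification.
            return -k_el * x + B(t)

        sol = solve_ivp(rhs, (0.0, 6.0), y0=[0.0], dense_output=True)
        print(sol.sol(np.linspace(0.0, 6.0, 7)).round(3))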

  2. Evidence-based, ethically justified counseling for fetal bilateral renal agenesis

    Science.gov (United States)

    Thomas, Alana N.; McCullough, Laurence B.; Chervenak, Frank A.; Placencia, Frank X.

    2017-01-01

    Background Not much data are available on the natural history of bilateral renal agenesis, as the medical community does not typically offer aggressive obstetric or neonatal care, bilateral renal agenesis having been accepted as a lethal condition. Aim To provide an evidence-based, ethically justified approach to counseling pregnant women about the obstetric management of bilateral renal agenesis. Study design A systematic literature search was performed using multiple databases. We deploy an ethical analysis of the results of the literature search on the basis of the professional responsibility model of obstetric ethics. Results Eighteen articles met the inclusion criteria for review. With the exception of a single case study using serial amnioinfusion, no other case of survival following dialysis and transplantation has been documented. Liveborn babies die during the neonatal period. Counseling pregnant women about management of pregnancies complicated by bilateral renal agenesis should be guided by beneficence-based judgment informed by evidence about outcomes. Conclusions Based on the ethical analysis of the results from this review, without experimental obstetric intervention, neonatal mortality rates will continue to be 100%. Serial amnioinfusion therefore should not be offered as treatment, but only as approved innovation or research. PMID:28222038

  3. Digital and multimedia forensics justified: An appraisal on professional policy and legislation

    Science.gov (United States)

    Popejoy, Amy Lynnette

    Recent progress in professional policy and legislation at the federal level in the field of forensic science constructs a transformation of new outcomes for future experts. An exploratory and descriptive qualitative methodology was used to critique and examine Digital and Multimedia Science (DMS) as a justified forensic discipline. Chapter I summarizes Recommendations 1, 2, and 10 of the National Academy of Sciences (NAS) Report 2009 regarding disparities and challenges facing the forensic science community. Chapter I also delivers the overall foundation and framework of this thesis, specifically how it relates to DMS. Chapter II expands on Recommendation 1: "The Promotion and Development of Forensic Science," and focuses chronologically on professional policy and legislative advances through 2014. Chapter III addresses Recommendation 2: "The Standardization of Terminology in Reporting and Testimony," and the issues of legal language and terminology, model laboratory reports, and expert testimony concerning DMS case law. Chapter IV analyzes Recommendation 10: "Insufficient Education and Training," identifying legal awareness for the digital and multimedia examiner to understand the role of the expert witness, the attorney, the judge and the admission of forensic science evidence in litigation in our criminal justice system. Finally, Chapter V studies three DME specific laboratories at the Texas state, county, and city level, concentrating on current practice and procedure.

  4. Physicians and strikes: can a walkout over the malpractice crisis be ethically justified?

    Science.gov (United States)

    Fiester, Autumn

    2004-01-01

    Malpractice insurance rates have created a crisis in American medicine. Rates are rising and reimbursements are not keeping pace. In response, physicians in the states hardest hit by this crisis are feeling compelled to take political action, and the current action of choice seems to be physician strikes. While the malpractice insurance crisis is acknowledged to be severe, does it justify the extreme action of a physician walkout? Should physicians engage in this type of collective action, and what are the costs to patients and the profession when such action is taken? I will offer three related arguments against physician strikes that constitute a prima facie prohibition against such action: first, strikes are intended to cause harm to patients; second, strikes are an affront to the physician-patient relationship; and, third, strikes risk decreasing the public's respect for the medical profession. As with any prima facie obligation, there are justifying conditions that may override the moral prohibition, but I will argue that the current malpractice crisis does not rise to the level of such a justifying condition. While the malpractice crisis demands and justifies a political response on the part of the nation's physicians, strikes and slow-downs are not an ethically justified means to the legitimate end of controlling insurance costs.

  5. Twisted trees and inconsistency of tree estimation when gaps are treated as missing data - The impact of model mis-specification in distance corrections.

    Science.gov (United States)

    McTavish, Emily Jane; Steel, Mike; Holder, Mark T

    2015-12-01

    Statistically consistent estimation of phylogenetic trees or gene trees is possible if pairwise sequence dissimilarities can be converted to a set of distances that are proportional to the true evolutionary distances. Susko et al. (2004) reported some strikingly broad results about the forms of inconsistency in tree estimation that can arise if corrected distances are not proportional to the true distances. They showed that if the corrected distance is a concave function of the true distance, then inconsistency due to long branch attraction will occur. If these functions are convex, then two "long branch repulsion" trees will be preferred over the true tree - though these two incorrect trees are expected to be tied as the preferred tree. Here we extend their results, and demonstrate the existence of a tree shape (which we refer to as a "twisted Farris-zone" tree) for which a single incorrect tree topology will be guaranteed to be preferred if the corrected distance function is convex. We also report that the standard practice of treating gaps in sequence alignments as missing data is sufficient to produce non-linear corrected distance functions if the substitution process is not independent of the insertion/deletion process. Taken together, these results imply inconsistent tree inference under mild conditions. For example, if some positions in a sequence are constrained to be free of substitutions and insertion/deletion events while the remaining sites evolve with independent substitutions and insertion/deletion events, then the distances obtained by treating gaps as missing data can support an incorrect tree topology even given an unlimited amount of data. Copyright © 2015 Elsevier Inc. All rights reserved.
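
    The "gaps as missing data" practice the authors analyze can be made concrete with the standard Jukes-Cantor correction applied after pairwise deletion of gap sites; a minimal sketch, not the authors' code:

        import numpy as np

        def jc69_distance(seq1, seq2):
            # Jukes-Cantor corrected distance, treating gap positions as
            # missing data (pairwise deletion), the common practice whose
            # consequences the paper analyzes.
            pairs = [(a, b) for a, b in zip(seq1, seq2)
                     if a != "-" and b != "-"]
            p = sum(a != b for a, b in pairs) / len(pairs)  # mismatch fraction
            if p >= 0.75:                # correction undefined at saturation
                return np.inf
            return -0.75 * np.log(1.0 - 4.0 * p / 3.0)

        print(jc69_distance("ACGTAC-TGA", "ACGAACGT-A"))   # ~0.137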

  6. Compatriot partiality and cosmopolitan justice: Can we justify compatriot partiality within the cosmopolitan framework?

    Directory of Open Access Journals (Sweden)

    Rachelle Bascara

    2016-10-01

    This paper shows an alternative way in which compatriot partiality could be justified within the framework of global distributive justice. Philosophers who argue that compatriot partiality is similar to racial partiality capture something correct about compatriot partiality. However, the analogy should not lead us to comprehensively reject compatriot partiality. We can justify compatriot partiality on the same grounds that liberation movements and affirmative action have been justified. Hence, given cosmopolitan demands of justice, special consideration for the economic well-being of your nation as a whole is justified if and only if the country it identifies is an oppressed developing nation in an unjust global order. This justification is incomplete. We also need to say why Person A, qua national of Country A, is justified in helping her compatriots in Country A over similarly or slightly more oppressed non-compatriots in Country B. I argue that Person A's partiality towards her compatriots admits further vindication because it is part of an oppressed group's project of self-emancipation, which is preferable to paternalistic emancipation. Finally, I identify three benefits in my justification for compatriot partiality. First, I do not offer a blanket justification for all forms of compatriot partiality: partiality between members of oppressed groups is only a temporary effective measure designed to level an unlevel playing field. Second, because history attests that sovereign republics could arise as a collective response to colonial oppression, justifying compatriot partiality on the grounds that I have identified is conducive to the development of sovereignty and even democracy in poor countries, thereby avoiding the problems of infringement that many humanitarian poverty-alleviation efforts encounter. Finally, my justification for compatriot partiality complies with the implicit cosmopolitan commitment to the realizability of global justice

  7. Bayesian inference for partially identified models exploring the limits of limited data

    CERN Document Server

    Gustafson, Paul

    2015-01-01

    Contents: Introduction; Identification; What Is against Us?; What Is for Us?; Some Simple Examples of Partially Identified Models; The Road Ahead; The Structure of Inference in Partially Identified Models; Bayesian Inference; The Structure of Posterior Distributions in PIMs; Computational Strategies; Strength of Bayesian Updating, Revisited; Posterior Moments; Credible Intervals; Evaluating the Worth of Inference; Partial Identification versus Model Misspecification; The Siren Call of Identification Comp

  8. The Luckless and the Doomed: Contractualism on Justified Risk-Imposition

    DEFF Research Database (Denmark)

    Holm, Sune Hannibal

    2018-01-01

    Several authors have argued that contractualism faces a dilemma when it comes to justifying risks generated by socially valuable activities. At the heart of the matter is the question of whether contractualists should adopt an ex post or an ex ante perspective when assessing whether an action...... to prohibit a range of intuitively permissible and socially valuable activities....

  9. Context Based Inferences in Research Methodology: The Role of Culture in Justifying Knowledge Claims

    Science.gov (United States)

    Evers, Colin W.; Mason, Mark

    2011-01-01

    Drawing on work in epistemology and the philosophy of science, this paper seeks to provide very general reasons for why a comparative perspective needs to be applied to the inferential procedures of research methodologies where these concern the issue of justifying knowledge claims. In particular, the paper explores the role of culture on a number…

  10. Justifying the Use of Internet Sources in School Assignments on Controversial Issues

    Science.gov (United States)

    Mikkonen, Teemu

    2018-01-01

    Introduction: This study concerns students' criteria in the evaluation of Internet sources for a school assignment requiring reflections on a controversial issue. The findings are elaborated by analysing students' discursive accounts in justifying the use or non-use of sources. Method: The interview data was collected in a Finnish upper secondary…

  11. Mandatory Personal Therapy: Does the Evidence Justify the Practice? In Debate

    Science.gov (United States)

    Chaturvedi, Surabhi

    2013-01-01

    The article addresses the question of whether the practice of mandatory personal therapy, followed by several training organisations, is justified by existing research and evidence. In doing so, it discusses some implications of this training requirement from an ethical and ideological standpoint, raising questions of import for training…

  12. "Men Are Dogs": Is The Stereotype Justified? Data On the Cheating College Male

    Science.gov (United States)

    Knox, David; Vail-Smith, Karen; Zusman, Marty

    2008-01-01

    Analysis of data from 1394 undergraduates at a large southeastern university were used to assess the degree to which the stereotype that "men are dogs" (sexually-focused cheaters) is justified. Results suggest that this stereotype is unjustified since the majority of males: (1) define behaviors from kissing to anal sex as cheating; (2)…

  13. "Teach Your Children Well": Arguing in Favor of Pedagogically Justifiable Hospitality Education

    Science.gov (United States)

    Potgieter, Ferdinand J.

    2016-01-01

    This paper is a sequel to the paper which I delivered at last year's BCES conference in Sofia. Making use of hermeneutic phenomenology and constructive interpretivism as methodological apparatus, I challenge the pedagogic justifiability of the fashionable notion of religious tolerance. I suggest that we need, instead, to reflect "de…

  14. Is radiography justified for the evaluation of patients presenting with cervical spine trauma?

    Energy Technology Data Exchange (ETDEWEB)

    Theocharopoulos, Nicholas; Chatzakis, Georgios; Damilakis, John [Department of Medical Physics, Faculty of Medicine, University of Crete, P.O. Box 2208, Iraklion, 71003 Crete (Greece) and Department of Natural Sciences, Technological Education Institute of Crete, P.O. Box 140, Iraklion 71004 Crete (Greece); Department of Radiology, Faculty of Medicine, University of Crete, P.O. Box 2208, Iraklion, 71003 Crete (Greece); Department of Medical Physics, Faculty of Medicine, University of Crete, P.O. Box 2208, Iraklion, 71003 Crete (Greece)

    2009-10-15

    radiogenic lethal cancer incidents. According to the decision model calculations, the use of CT is favored over the use of radiography alone, or radiography with CT, by a factor of 13 for low-risk 20-yr-old patients, up to a factor of 23 for high-risk patients younger than 80 yr. The radiography/CT imaging strategy slightly outperforms plain radiography for high- and moderate-risk patients. Regardless of the patient age, sex, and fracture risk, the higher diagnostic accuracy obtained by the CT examination counterbalances its increased dose compared to plain radiography, or radiography followed by CT only for positive radiographs, and renders CT utilization justified and radiographic screening redundant.

  15. Heteroscedasticity as a Basis of Direction Dependence in Reversible Linear Regression Models.

    Science.gov (United States)

    Wiedermann, Wolfgang; Artner, Richard; von Eye, Alexander

    2017-01-01

    Heteroscedasticity is a well-known issue in linear regression modeling. When heteroscedasticity is observed, researchers are advised to remedy possible model misspecification of the explanatory part of the model (e.g., considering alternative functional forms and/or omitted variables). The present contribution discusses another source of heteroscedasticity in observational data: Directional model misspecifications in the case of nonnormal variables. Directional misspecification refers to situations where alternative models are equally likely to explain the data-generating process (e.g., x → y versus y → x). It is shown that the homoscedasticity assumption is likely to be violated in models that erroneously treat true nonnormal predictors as response variables. Recently, Direction Dependence Analysis (DDA) has been proposed as a framework to empirically evaluate the direction of effects in linear models. The present study links the phenomenon of heteroscedasticity with DDA and describes visual diagnostics and nine homoscedasticity tests that can be used to make decisions concerning the direction of effects in linear models. Results of a Monte Carlo simulation that demonstrate the adequacy of the approach are presented. An empirical example is provided, and applicability of the methodology in cases of violated assumptions is discussed.
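
    A minimal sketch of the diagnostic logic on simulated data (variable names and the data-generating process are invented): fit the regression in both candidate directions and compare Breusch-Pagan heteroscedasticity tests. With a nonnormal true predictor, the causally reversed model tends to show the stronger evidence of heteroscedasticity.

        import numpy as np
        import statsmodels.api as sm
        from statsmodels.stats.diagnostic import het_breuschpagan

        rng = np.random.default_rng(5)
        n = 2000
        x = rng.exponential(size=n)        # nonnormal "true" predictor
        y = 0.8 * x + rng.normal(size=n)   # true direction: x -> y

        def bp_pvalue(endog, exog):
            X = sm.add_constant(exog)
            resid = sm.OLS(endog, X).fit().resid
            return het_breuschpagan(resid, X)[1]   # LM-test p-value

        print(f"y ~ x (true direction)  BP p = {bp_pvalue(y, x):.4f}")
        print(f"x ~ y (reversed model)  BP p = {bp_pvalue(x, y):.4f}")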

  16. Letter: Can Islamic Jurisprudence Justify Procurement of Transplantable Vital Organs in Brain Death?

    Science.gov (United States)

    Rady, Mohamed Y

    2018-01-01

    In their article, "An International Legal Review of the Relationship between Brain Death and Organ Transplantation," in The Journal of Clinical Ethics 29, no. 1, Aramesh, Arima, Gardiner, and Shah reported on diverse international legislative approaches for justifying procurement of transplantable vital organs in brain death. They stated, "In Islamic traditions in particular, the notion of unstable life is a way to justify organ donation from brain-dead patients that we believe has not been fully described previously in the literature." This commentary queries the extent to which this concept is valid in accordance with the primary source of Islamic law, that is, the Quran. Copyright 2018 The Journal of Clinical Ethics. All rights reserved.

  17. Justifying decisions in social dilemmas: justification pressures and tacit coordination under environmental uncertainty.

    Science.gov (United States)

    de Kwaadsteniet, Erik W; van Dijk, Eric; Wit, Arjaan; De Cremer, David; de Rooij, Mark

    2007-12-01

    This article investigates how justification pressures influence harvesting decisions in common resource dilemmas. The authors argue that when a division rule prescribes a specific harvest level, such as under environmental certainty, people adhere more strongly to this division rule when they have to justify their decisions to fellow group members. When a division rule does not prescribe a specific harvest level, such as under environmental uncertainty, people restrict their harvests when they have to justify their decisions to fellow group members. The results of two experimental studies corroborate this line of reasoning. The findings are discussed in terms of tacit coordination. The authors specify conditions under which justification pressures may or may not facilitate efficient coordination.

  18. How arguments are justified in the media debate on climate change in the USA and France

    OpenAIRE

    Ylä-Anttila, Tuomas; Kukkonen, Anna

    2014-01-01

    This paper examines the differences in the values that are evoked to justify arguments in the media debate on climate change in USA and France from 1997 to 2011. We find that climate change is more often discussed in terms of justice, democracy, and legal regulation in France, while monetary value plays a more important role as a justification for climate policy arguments in the USA. Technological and scientific arguments are more often made in France, and ecological arguments equally in both...

  19. Influenza vaccination in Dutch nursing homes: is tacit consent morally justified?

    Science.gov (United States)

    Verweij, M F; van den Hoven, M A

    2005-01-01

    Efficient procedures for obtaining informed (proxy) consent may contribute to high influenza vaccination rates in nursing homes. Yet are such procedures justified? This study's objective was to gain insight into informed consent policies in Dutch nursing homes; to assess how these may affect influenza vaccination rates and to answer the question whether deviating from standard informed consent procedures could be morally justified. A survey among nursing home physicians. We sent a questionnaire to all (356) nursing homes in the Netherlands, to be completed by one of the physicians. We received 245 completed questionnaires. As 21 institutions appeared to be closed or merged into other institutions, the response was 73.1% (245/335). Of all respondents 81.9% reported a vaccination rate above 80%. Almost 50% reported a vaccination rate above 90%. Most respondents considered herd immunity to be an important consideration for institutional policy. Freedom of choice for residents was considered important by almost all. Nevertheless, 106 out of 245 respondents follow a tacit consent procedure, according to which vaccination will be administered unless the resident or her proxy refuses. These institutions show significantly higher vaccination rates (p < …) […] tacit consent procedures can be morally justifiable. Such procedures assume that vaccination is good for residents either as individuals or as a group. Even though this assumption may be true for most residents, there are good reasons for preferring express consent procedures.

  20. Justifiability and Animal Research in Health: Can Democratisation Help Resolve Difficulties?

    Science.gov (United States)

    2018-01-01

    Simple Summary Scientists justify animal use in medical research because the benefits to human health outweigh the costs or harms to animals. However, whether it is justifiable is controversial for many people. Even public interests are divided because an increasing proportion of people do not support animal research, while demand for healthcare that is based on animal research is also rising. The wider public should be given more influence in these difficult decisions. This could be through requiring explicit disclosure about the role of animals in drug labelling to inform the public out of respect for people with strong objections. It could also be done through periodic public consultations that use public opinion and expert advice to decide which diseases justify the use of animals in medical research. More public input will help ensure that animal research projects meet public expectations and may help to promote changes to facilitate medical advances that need fewer animals. Abstract Current animal research ethics frameworks emphasise consequentialist ethics through cost-benefit or harm-benefit analysis. However, these ethical frameworks along with institutional animal ethics approval processes cannot satisfactorily decide when a given potential benefit is outweighed by costs to animals. The consequentialist calculus should, theoretically, provide for situations where research into a disease or disorder is no longer ethical, but this is difficult to determine objectively. Public support for animal research is also falling as demand for healthcare is rising. Democratisation of animal research could help resolve these tensions through facilitating ethical health consumerism or giving the public greater input into deciding the diseases and disorders where animal research is justified. Labelling drugs to disclose animal use and providing a plain-language summary of the role of animals may help promote public understanding and would respect the ethical beliefs of

  1. Classical ethical positions and their relevance in justifying behavior: A model of prescriptive attribution.

    OpenAIRE

    Witte, E.H.

    2002-01-01

    This paper separates empirical research on ethics from classical research on morality and relates it to other central questions of social psychology and sociology, e.g., values, culture, justice, attribution. In addition, reference is made to some founding studies of ethical research and its historical development. Based on this line of tradition the development of prescriptive attribution research is introduced, which concentrates on the justification of actions by weighting the importance o...

  2. Modelling severe Staphylococcus aureus sepsis in conscious pigs: are implications for animal welfare justified?

    DEFF Research Database (Denmark)

    Olsen, Helle G; Kjelgaard-Hansen, Mads; Tveden-Nyborg, Pernille

    2016-01-01

    by the severity of induced disease, which in some cases necessitated humane euthanasia. A pilot study was therefore performed in order to establish the sufficient inoculum concentration and application protocol needed to produce signs of liver dysfunction within limits of our pre-defined humane endpoints. Four....... Prior to euthanasia, a galactose elimination capacity test was performed to assess liver function. Pigs were euthanised 48 h post inoculation for necropsy and histopathological evaluation. While infusion times of 6.66 min, and higher, did not induce liver dysfunction (n = 3), the infusion time of 3...

  3. Belief in School Meritocracy as a System-justifying Tool for Low Status Students

    Directory of Open Access Journals (Sweden)

    Virginie Wiederkehr

    2015-07-01

    The belief that, in school, success only depends on will and hard work is widespread in Western societies despite evidence showing that several factors other than merit explain school success, including group belonging (e.g., social class, gender). In the present paper, we argue that because merit is the only track for low status students to reach upward mobility, Belief in School Meritocracy (BSM) is a particularly useful system-justifying tool to help them perceive their place in society as being deserved. Consequently, for low status students (but not high status students), this belief should be related to more general system-justifying beliefs (Study 1). Moreover, low status students should be particularly prone to endorsing this belief when their place within a system on which they strongly depend to acquire status is challenged (Study 2). In Study 1, high status (boys and high SES) were compared to low status (girls and low SES) high school students. Results indicated that BSM was related to system-justifying beliefs only for low SES students and for girls, but not for high SES students or for boys. In Study 2, university students were exposed (or not) to information about an important selection process that occurs at the university, depending on the condition. Their subjective status was assessed. Although such a confrontation reduced BSM for high subjective SES students, it tended to enhance it for low subjective SES students. Results are discussed in terms of system-justification motives and the palliative function meritocratic ideology may play for low status students.

  4. Belief in school meritocracy as a system-justifying tool for low status students.

    Science.gov (United States)

    Wiederkehr, Virginie; Bonnot, Virginie; Krauth-Gruber, Silvia; Darnon, Céline

    2015-01-01

    The belief that, in school, success only depends on will and hard work is widespread in Western societies despite evidence showing that several factors other than merit explain school success, including group belonging (e.g., social class, gender). In the present paper, we argue that because merit is the only track for low status students to reach upward mobility, Belief in School Meritocracy (BSM) is a particularly useful system-justifying tool to help them perceive their place in society as being deserved. Consequently, for low status students (but not high status students), this belief should be related to more general system-justifying beliefs (Study 1). Moreover, low status students should be particularly prone to endorsing this belief when their place within a system on which they strongly depend to acquire status is challenged (Study 2). In Study 1, high status (boys and high SES) were compared to low status (girls and low SES) high school students. Results indicated that BSM was related to system-justifying beliefs only for low SES students and for girls, but not for high SES students or for boys. In Study 2, university students were exposed (or not) to information about an important selection process that occurs at the university, depending on the condition. Their subjective status was assessed. Although such a confrontation reduced BSM for high subjective SES students, it tended to enhance it for low subjective SES students. Results are discussed in terms of system justification motives and the palliative function meritocratic ideology may play for low status students.

  5. [Justifying measures to correct functional state of operators varying in personal anxiety].

    Science.gov (United States)

    2012-01-01

    Workers in operating and dispatching occupations are exposed to high nervous and emotional exertion, which results in increased personal anxiety, working stress and overstrain. This requires physiologically justified correction of hazardous psycho-physiological states via various prophylactic measures (stays in a schungite room, autogenous training, central electric analgesia, electric acupuncture). Relaxation sessions in the schungite room increased the speed of visual signal perception and of attention concentration and shifting in highly anxious individuals. Autogenous training sessions improved memory and had a significant hypotensive effect in highly anxious individuals.

  6. When is deliberate killing of young children justified? Indigenous interpretations of infanticide in Bolivia.

    Science.gov (United States)

    de Hilari, Caroline; Condori, Irma; Dearden, Kirk A

    2009-01-01

    In the Andes, as elsewhere, infanticide is a difficult challenge that remains largely undocumented and misunderstood. From January to March 2004 we used community-based vital event surveillance systems, discussions with health staff, ethnographic interviews, and focus group discussions among Aymara men and women from two geographically distinct sites in the Andes of Bolivia to provide insights into the practice of infanticide. We noted elevated mortality at both sites. In one location, suspected causes of infanticide were especially high for girls. We also observed that community members maintain beliefs that justify infanticide under certain circumstances. Among the Aymara, justification for infanticide was both biological (deformities and twinship) and social (illegitimate birth, family size and poverty). Communities generally did not condemn killing when reasons for doing so were biological, but the taking of life for social reasons was rarely justified. In this cultural context, strategies to address the challenge of infanticide should include education of community members about alternatives to infanticide. At a program level, planners and implementers should target ethnic groups with high levels of infanticide and train health care workers to detect and address multiple warning signs for infanticide (for example, domestic violence and child maltreatment) as well as proxies for infant neglect and abuse such as mother/infant separation and bottle use.

  7. Is Investment in Maize Research Balanced and Justified? An Empirical Study

    Directory of Open Access Journals (Sweden)

    Hari Krishna Shrestha

    2016-12-01

    The objective of this study was to investigate whether the investment in maize research is adequate and balanced in the Nepalese context. Resource use in maize research was studied empirically with standard congruency analysis, using the Full Time Equivalent (FTE) of researchers as a proxy measure of investment. The number of researchers involved in maize research was 61, but only 21.25 on an FTE basis, indicating that full-time researchers are very few relative to the cultivated maize area in the country. Statistical analysis revealed that investment in maize research was higher in the Tarai and lower in the Hills. The congruency index on an actual-production basis was low across the eco-zones, and even lower across the geographical regions, indicating that the investment in maize research was mismatched and not justified. Adjusting the analysis for an equity factor and a research progress factor produced no substantial difference in the congruency index. This study recommends a substantial, balanced and justified increase in investment in maize research across the eco-zones and the geographical regions. The Hills need special attention, as the maize output value is higher in this eco-zone. The eastern and western regions also need increased investment in maize according to their contribution to the output value.

  8. Can conditional health policies be justified? A policy analysis of the new NHS dental contract reforms.

    Science.gov (United States)

    Laverty, Louise; Harris, Rebecca

    2018-06-01

    Conditional policies, which emphasise personal responsibility, are becoming increasingly common in healthcare. Although used widely internationally, they are relatively new within the UK health system where there have been concerns about whether they can be justified. New NHS dental contracts include the introduction of a conditional component that restricts certain patients from accessing a full range of treatment until they have complied with preventative action. A policy analysis of published documents on the NHS dental contract reforms from 2009 to 2016 was conducted to consider how conditionality is justified and whether its execution is likely to cause distributional effects. Contractualist, paternalistic and mutualist arguments that reflect notions of responsibility and obligation are used as justification within policy. Underlying these arguments is an emphasis on preserving the finite resources of a strained NHS. We argue that the proposed conditional component may differentially affect disadvantaged patients, who do not necessarily have access to the resources needed to meet the behavioural requirements. As such, the conditional component of the NHS dental contract reform has the potential to exacerbate oral health inequalities. Conditional health policies may challenge core NHS principles and, as is the case with any conditional policy, should be carefully considered to ensure they do not exacerbate health inequities. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.

  9. Justifying molecular images in cell biology textbooks: From constructions to primary data.

    Science.gov (United States)

    Serpente, Norberto

    2016-02-01

    For scientific claims to be reliable and productive they have to be justified. However, on the one hand little is known about what justification precisely means to scientists, and on the other the position held by philosophers of science on what it entails is rather limited, for justifications customarily refer to the written form (textual expressions) of scientific claims, leaving aside images, which, as many cases from the history of science show, are relevant to this process. The fact that images can visually express scientific claims independently of text, plus their vast variety and origins, requires an assessment of the way they are currently justified and in turn used as sources to justify scientific claims in particular scientific fields. Similarly, in view of the different nature of images, analysis is required to determine on which side of the philosophical distinction between data and phenomena these different kinds of images fall. This paper historicizes and documents a particular aspect of contemporary life sciences research: the use of the molecular image as a vehicle of knowledge production in cell studies, a field that has undergone a significant shift in visual expressions from the early 1980s onwards. Focussing on textbooks as sources that have been overlooked in the historiography of contemporary biomedicine, the aim is to explore (1) whether the shift in cell studies, entailing a superseding of the optical image, traditionally conceptualised as primary data, by the molecular image, corresponds with a shift in justificatory practices, and (2) to assess the role of the molecular image as primary data. This paper also explores the dual role of images as teaching resources and as resources for the construction of knowledge in cell studies, especially in relation to discovery and justification. Finally, this paper seeks to stimulate reflection on what kind of archival resources could benefit the work of present and future epistemic

  10. How to justify enforcing a Ulysses contract when Ulysses is competent to refuse.

    Science.gov (United States)

    Davis, John K

    2008-03-01

    Sometimes the mentally ill have sufficient mental capacity to refuse treatment competently, and others have a moral duty to respect their refusal. However, those with episodic mental disorders may wish to precommit themselves to treatment, using Ulysses contracts known as "mental health advance directives." How can health care providers justify enforcing such contracts over an agent's current, competent refusal? I argue that providers respect an agent's autonomy not retrospectively (by reference to his or her past wishes), and not merely synchronically (so that the agent gets what he or she wants right now), but diachronically and prospectively, acting so that the agent can shape his or her circumstances as the agent wishes over time, for the agent will experience the consequences of providers' actions over time. Mental health directives accomplish this, so they are a way of respecting the agent's autonomy even when providers override the agent's current competent refusal.

  11. What justifies the United States ban on federal funding for nonreproductive cloning?

    Science.gov (United States)

    Cunningham, Thomas V

    2013-11-01

    This paper explores how current United States policies for funding nonreproductive cloning are justified and argues against that justification. I show that a common conceptual framework underlies the national prohibition on the use of public funds for cloning research, which I call the simple argument. This argument rests on two premises: that research harming human embryos is unethical and that embryos produced via fertilization are identical to those produced via cloning. In response to the simple argument, I challenge the latter premise. I demonstrate there are important ontological differences between human embryos (produced via fertilization) and clone embryos (produced via cloning). After considering the implications my argument has for the morality of publicly funding cloning for potential therapeutic purposes and potential responses to my position, I conclude that such funding is not only ethically permissible, but also humane national policy.

  12. Do the ends justify the means? Nursing and the dilemma of whistleblowing.

    Science.gov (United States)

    Firtko, Angela; Jackson, Debra

    2005-01-01

    Patient advocacy and a desire to rectify misconduct in the clinical setting are frequently cited reasons for whistleblowing in nursing and healthcare. This paper explores current knowledge about whistleblowing in nursing and critiques current definitions of whistleblowing. The authors draw on published perspectives on whistleblowing, including those of the media, to reflect on the role of the media in health-related whistleblowing. Whistleblowing represents a dilemma for nurses. It strikes at the heart of professional values and raises questions about the responsibilities nurses have to communities and clients, the profession, and themselves. In its most damaging forms, whistleblowing necessarily involves a breach of ethical standards, particularly confidentiality. Despite the pain that can be associated with whistleblowing, if the ends are improved professional standards, enhanced outcomes, rectification of wrongdoing, and increased safety for patients and staff in our health services, then the ends definitely justify the means.

  13. Justifying continuous sedation until death: a focus group study in nursing homes in Flanders, Belgium.

    Science.gov (United States)

    Rys, Sam; Deschepper, Reginald; Deliens, Luc; Mortier, Freddy; Bilsen, Johan

    2013-01-01

    Continuous Sedation until Death (CSD), the act of reducing or removing the consciousness of an incurably ill patient until death, has become a common practice in nursing homes in Flanders (Belgium). Quantitative research has suggested that CSD is not always properly applied. This qualitative study aims to explore and describe the circumstances under which nursing home clinicians consider CSD to be justified. Six focus groups were conducted including 10 physicians, 24 nurses, and 14 care assistants working in either Catholic or non-Catholic nursing homes of varying size. Refractory suffering, limited life expectancy and respecting patient autonomy are considered essential elements in deciding for CSD. However, multiple factors complicate the care of nursing home residents at the end of life, and often hinder clinicians from putting these elements into practice. Nursing home clinicians may benefit from more information and instruction about managing CSD in the complex care situations which typically occur in nursing homes. Copyright © 2013 Mosby, Inc. All rights reserved.

  14. Is the term "fasciculus opticus cerebralis" more justifiable than the term "optic nerve"?

    Science.gov (United States)

    Vojniković, Bojo; Bajek, Snjezana; Bajek, Goran; Strenja-Linić, Ines; Grubesić, Aron

    2013-04-01

    The terminology of the optic nerve was changed three times between 1895 and 1955, when the term "nervus opticus" was introduced in the "Terminologia Anatomica". Following our study we claim that, from the perspective of the phylogenetic evolution of binocular vision as well as of optical embryogenesis, in which the opticus is evidently a product of diencephalic structures, the addition of the term "nervus" to opticus is neither adequate nor justified. From the clinical aspect the term "nervus opticus" is also inadequate, both as a "nerve" that, unlike other cranial nerves, has no functional regenerative properties, and from the pedagogical and didactic aspect of educating future physicians. We suggest that the term "Fasciculus Opticus Cerebralis" should be used, as it much better explains the origin of the structure as well as its affiliation to the central nervous system.

  15. Is development of geothermal energy resource in Macedonia justified or not?

    International Nuclear Information System (INIS)

    Popovski, Kiril; Popovska Vasilevska, Sanja

    2007-01-01

    During the 1980s, Macedonia was one of the world leaders in the development of direct application of geothermal energy. Within a period of only 6-7 years, a share of 0.7% of the national energy balance was reached. However, the situation changed over the last 20 years: development of this energy resource not only stopped, but some of the existing projects were abandoned, leading to regression. This situation is illogical, given that direct application proved to be technically feasible and fully economically justified. A summary of the present situation of geothermal projects in Macedonia is given in the paper, together with possibilities for their improvement and the possibilities and justification for developing new resources. The final conclusion is that the development of direct application of geothermal energy in Macedonia offers (in comparison with other renewable energy resources) the best energy and economic effects. (Author)

  16. Selection Criteria in Regime Switching Conditional Volatility Models

    Directory of Open Access Journals (Sweden)

    Thomas Chuffart

    2015-05-01

    Full Text Available A large number of nonlinear conditional heteroskedastic models have been proposed in the literature, and model selection is crucial to any statistical data analysis. In this article, we investigate whether the most commonly used selection criteria lead to the choice of the right specification in a regime switching framework. We focus on two types of models: the Logistic Smooth Transition GARCH and the Markov-Switching GARCH models. Simulation experiments reveal that information criteria and loss functions can lead to misspecification; BIC sometimes indicates the wrong regime switching framework. Depending on the Data Generating Process used in the experiments, great care is needed when choosing a criterion.
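
    The selection step the experiments evaluate reduces to comparing penalized likelihoods across fitted candidates. A minimal sketch (my own illustration; the log-likelihoods and parameter counts below are hypothetical placeholders, and the paper's LST-GARCH and MS-GARCH estimators are not implemented here):

```python
# Minimal sketch of model selection by information criteria, assuming each
# candidate volatility model has already been fitted by maximum likelihood.
# AIC/BIC penalize the maximized log-likelihood by the number of parameters
# k (BIC more heavily, via log n).
import math

def aic(loglik, k):
    return -2 * loglik + 2 * k

def bic(loglik, k, n):
    return -2 * loglik + k * math.log(n)

# Hypothetical fitted candidates: (name, maximized log-likelihood, #params).
candidates = [("GARCH(1,1)", -1402.3, 4),
              ("LST-GARCH", -1390.1, 7),
              ("MS-GARCH", -1388.6, 9)]
n = 1000  # sample size (hypothetical)
for name, ll, k in candidates:
    print(f"{name:>11}: AIC={aic(ll, k):8.1f}  BIC={bic(ll, k, n):8.1f}")
# The paper's experiments show such criteria can still pick the wrong
# regime-switching specification, so selection should not rely on BIC alone.
```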

  17. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    Science.gov (United States)

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities on the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and placing no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is based only on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecification. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. Copyright © 2015 John Wiley & Sons, Ltd.
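
    As a rough illustration of the marginal idea for a single margin (events y_i out of n_i per study), the sketch below fits a beta-binomial by maximum likelihood after integrating out the study-specific probabilities; the counts are hypothetical, and the paper's composite-likelihood coupling of the two margins is not implemented here.

```python
# Minimal sketch of the marginal beta-binomial idea for one margin: the
# study-specific probabilities are integrated out, giving a beta-binomial
# marginal likelihood with no joint-distribution assumption. Data below
# are hypothetical placeholders.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import betabinom

y = np.array([18, 25, 9, 40, 31])   # events per study (hypothetical)
n = np.array([20, 30, 12, 45, 38])  # subjects per study (hypothetical)

def neg_loglik(params):
    log_a, log_b = params                  # optimize on the log scale
    a, b = np.exp(log_a), np.exp(log_b)    # beta parameters must be > 0
    return -betabinom.logpmf(y, n, a, b).sum()

res = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
a_hat, b_hat = np.exp(res.x)
print(f"estimated mean probability: {a_hat / (a_hat + b_hat):.3f}")
```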

  18. Is routine antenatal venereal disease research laboratory test still justified? Nigerian experience

    Directory of Open Access Journals (Sweden)

    Nwosu BO

    2015-01-01

    Full Text Available Betrand O Nwosu,1 George U Eleje,1 Amaka L Obi-Nwosu,2 Ita F Ahiarakwem,3 Comfort N Akujobi,4 Chukwudi C Egwuatu,4 Chukwudumebi O Onyiuke5. Affiliations: 1Department of Obstetrics and Gynecology, Nnamdi Azikiwe University, Nnewi Campus, Nnewi, Anambra State, Nigeria; 2Department of Family Medicine, Nnamdi Azikiwe University Teaching Hospital, Nnewi, Nigeria; 3Department of Medical Microbiology, Imo State University Teaching Hospital, Orlu, Imo State, Nigeria; 4Department of Medical Microbiology, Nnamdi Azikiwe University, Nnewi Campus, Nnewi, Anambra State, Nigeria; 5Department of Medical Microbiology, Nnamdi Azikiwe University Teaching Hospital, Nnewi, Anambra State, Nigeria. Objective: To determine the seroreactivity of pregnant women to syphilis in order to justify the need for routine antenatal syphilis screening. Methods: A multicenter retrospective analysis of routine antenatal venereal disease research laboratory (VDRL) test results between 1 September 2010 and 31 August 2012 at three specialist care hospitals in south-east Nigeria was done. A reactive VDRL result was subjected to confirmation using the Treponema pallidum hemagglutination assay. Analysis was by Epi Info 2008 version 3.5.1 and Stata/IC version 10. Results: Adequate records were available for 2,156 patients, which were thus reviewed. The mean age of the women was 27.4 years (±3.34) and mean gestational age was 26.4 weeks (±6.36). Only 15 cases (0.70%) were seropositive on VDRL. The confirmatory T. pallidum hemagglutination assay was positive in 4 of the 15 cases, giving an overall prevalence of 0.19% and a false-positive rate of 73.3%. There was no significant difference in the prevalence of syphilis in relation to maternal age and parity (P>0.05). Conclusion: While the prevalence of syphilis is extremely low in the antenatal care population at the three specialist care hospitals in south-east Nigeria, the false-positive rate is high and prevalence did not significantly vary with maternal age or

  19. Model format for a vaccine stability report and software solutions.

    Science.gov (United States)

    Shin, Jinho; Southern, James; Schofield, Timothy

    2009-11-01

    A session of the International Association for Biologicals Workshop on Stability Evaluation of Vaccine, a Life Cycle Approach was devoted to a model format for a vaccine stability report and to software solutions. Presentations highlighted the utility of a model format that will conform to regulatory requirements and the ICH common technical document. However, there needs to be flexibility to accommodate individual company practices. Adoption of a model format is premised upon agreement regarding content between industry and regulators, and upon ease of use. Software requirements include ease of use and protections against inadvertent misspecification of the stability design or misinterpretation of program output.

  20. Mankiw's Puzzle on Consumer Durables: A Misspecification

    OpenAIRE

    Tam Bang Vu

    2005-01-01

    Mankiw (1982) shows that consumer durables expenditures should follow a linear ARMA(1,1) process, but the data analyzed support an AR(1) process instead; thus, a puzzle. In this paper, we employ a more general utility function than Mankiw's quadratic one. Further, the disturbance and the depreciation rate are respecified as multiplicative and stochastic, respectively. The analytical consequence is a nonlinear ARMA(infinity,1) process, which implies that the linear ARMA(1,1) is a misspecification...
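
    A minimal sketch of the testable implication (my own illustration with arbitrary parameter values, not the paper's estimation): simulate a linear ARMA(1,1) and compare AR(1) and ARMA(1,1) fits by BIC.

```python
# Illustrative sketch: simulate the linear ARMA(1,1) process that Mankiw's
# framework predicts for durables spending, then compare AR(1) and
# ARMA(1,1) fits by BIC. Parameter values are arbitrary.
import numpy as np
from statsmodels.tsa.arima_process import arma_generate_sample
from statsmodels.tsa.arima.model import ARIMA

np.random.seed(0)
# Lag-polynomial convention: coefficient arrays include the leading 1.
y = arma_generate_sample(ar=[1, -0.9], ma=[1, 0.5], nsample=500)

ar1 = ARIMA(y, order=(1, 0, 0)).fit()
arma11 = ARIMA(y, order=(1, 0, 1)).fit()
print(f"AR(1)     BIC: {ar1.bic:.1f}")
print(f"ARMA(1,1) BIC: {arma11.bic:.1f}")  # lower BIC should favor ARMA(1,1) here
```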

  1. Dear Critics: Addressing Concerns and Justifying the Benefits of Photography as a Research Method

    Directory of Open Access Journals (Sweden)

    Kyle Elizabeth Miller

    2015-08-01

    Full Text Available Photography serves as an important tool for researchers to learn about the contextualized lives of individuals. This article explores the process of integrating photo elicitation interviews (PEI) into research involving children and families. Much literature is dedicated to the general debate surrounding the ethics of visual methods in research, with little attention directed at the actual process of gaining study approval and publishing one's findings. There are two main groups of critics that researchers must face in order to conduct and disseminate studies involving visual images: ethics committees and peer reviewers. In this article, I identify and discuss some of the challenges that emerged in gaining protocol approval from an ethics committee in the United States. Ethical concerns and restrictions related to the use of photography can delay data collection and create barriers to research designs. Similarly, I describe the process of responding to reviewers' concerns as part of the publication process. Peer reviewers' lack of familiarity with the use of photography as a research tool may lead to misunderstandings and inappropriate requests for manuscript changes. While many concerns are sound, the range of benefits stemming from the use of visual data helps to justify the time and energy required to defend this type of research. Implications are discussed for researchers using visual methods in their work. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1503274

  2. How to Justify Purchase of an iPad: Users of the Latest Launch

    Directory of Open Access Journals (Sweden)

    Emílio José Montero Arruda Filho

    2014-09-01

    Full Text Available Contemporary technology innovation is increasingly based on convergence and the multiple uses of products. This change is detailed in the literature about new product development, as well as that on systems integration. This article focuses on the factors that determine the justification for using advanced technology products in which the perceived value of the product is not based on its functionality, as much as on its hedonistic or social value as an “all-in-one” product. In this study, consumer behaviors toward the Apple iPad are analyzed using netnographic evidence taken from internet postings by the consumers themselves. Since Apple initially marketed the iPad as a revolutionary product, with integrated services and features, our analysis concentrates on how consumers perceived these new, innovative features, in an effort to justify their purchase of the product. Our findings indicate that consumers’ justifications are based not only on the iPad’s functionality, but also its hedonic traits, and its similarity to the previously released innovative product, the iPhone.

  3. Quadrilatero ferrifero, MG, Brazil. Regional characteristics justify application for global geoparks network

    International Nuclear Information System (INIS)

    Mantesso-Neto, V.; Azevedo, U.; Guimarães, R.; Nascimento, M.; Beato, D.; Castro, P.; Liccardo, A.

    2010-01-01

    Geopark, a concept created in 2000, is neither strictly geological nor a park in the usual sense. Geopark is a holistic concept, aimed at promoting sustainable economic development based on unique geological features (represented by "geosites", outcrops with special value from some point of view), but also having a social objective. The Global Geoparks Network (GGN), working in synergy with UNESCO, has 64 members in 19 countries. This paper presents a brief history and some characteristics of a few European Geoparks, followed by some aspects of the Quadrilátero Ferrífero. As shall be seen, this area is rich in geosites, and in historical, social and cultural attractions. On the other hand, foreseeing a decline in mineral exploitation by mid-century, it urgently seeks a good plan for regional development. In conclusion, it will be seen that its characteristics fit the Geopark concept and justify the support of the geoscientific community, and of society in general, for its application, recently submitted to UNESCO, for admission to the GGN

  4. Ethical analysis of the justifiability of labelling with COPD for smoking cessation.

    Science.gov (United States)

    Kotz, D; Vos, R; Huibers, M J H

    2009-09-01

    Spirometry for early detection of chronic obstructive pulmonary disease (COPD) and smoking cessation is criticised because of the potential negative effects of labelling with disease. To assess the opinions of smokers with mild to moderate COPD on the effectiveness of spirometry for smoking cessation, the justification of early detection of airflow limitation in smokers, and the impact of confrontation with COPD. Qualitative study with data from a randomised controlled trial. General population of Dutch and Belgian Limburg. Semistructured ethical exit interviews were conducted with 205 smokers who were motivated to quit smoking and had no prior diagnosis of COPD but were found to have airflow limitation by means of spirometry. They received either (1) counselling, including labelling with COPD, plus nortriptyline for smoking cessation, (2) counselling excluding labelling with COPD, plus nortriptyline for smoking cessation, or (3) care as usual for smoking cessation by the general practitioner, without labelling with COPD. Of the participants, 177 (86%) agreed or completely agreed that it is justified to measure lung function in heavy smokers. These participants argued that measuring lung function raises consciousness of the negative effects of smoking, helps to prevent disease or increases motivation to stop smoking. Most of the 18 participants who disagreed argued that routinely measuring lung function in smokers would interfere with freedom of choice. Labelling with disease is probably a less important issue in the discussion about the pros and cons of early detection of COPD.

  5. Can context justify an ethical double standard for clinical research in developing countries?

    Directory of Open Access Journals (Sweden)

    Landes Megan

    2005-07-01

    Full Text Available Abstract Background The design of clinical research deserves special caution so as to safeguard the rights of participating individuals. While the international community has agreed on ethical standards for the design of research, these frameworks still remain open to interpretation, revision and debate. Recently a breach in the consensus of how to apply these ethical standards to research in developing countries has occurred, notably beginning with the 1994 placebo-controlled trials to reduce maternal to child transmission of HIV-1 in Africa, Asia and the Caribbean. The design of these trials sparked intense debate with the inclusion of a placebo-control group despite the existence of a 'gold standard' and trial supporters grounded their justifications of the trial design on the context of scarcity in resource-poor settings. Discussion These 'contextual' apologetics are arguably an ethical loophole inherent in current bioethical methodology. However, this convenient appropriation of 'contextual' analysis simply fails to acknowledge the underpinnings of feminist ethical analysis upon which it must stand. A more rigorous analysis of the political, social, and economic structures pertaining to the global context of developing countries reveals that the bioethical principles of beneficence and justice fail to be met in this trial design. Conclusion Within this broader, and theoretically necessary, understanding of context, it becomes impossible to justify an ethical double standard for research in developing countries.

  6. Population pharmacokinetic/pharmacodynamic modelling of the hypothalamic-pituitary-gonadal axis

    DEFF Research Database (Denmark)

    Tornøe, Christoffer Wenzel

    2005-01-01

    model misspecification feasible by quantifying the model uncertainty, which subsequently provides the basis for systematic population PK/PD model development. To support the model building process, the SDE approach was applied to clinical PK/PD data and used as a tool for tracking unexplained ... was stimulated and inhibited by the plasma triptorelin and degarelix concentrations, respectively. Circulating LH stimulated the testosterone secretion while the delayed testosterone feedback on the non-basal LH synthesis and release was modelled through a receptor compartment where testosterone stimulates

  7. Is febrile neutropenia prophylaxis with granulocyte-colony stimulating factors economically justified for adjuvant TC chemotherapy in breast cancer?

    Science.gov (United States)

    Skedgel, Chris; Rayson, Daniel; Younis, Tallal

    2016-01-01

    Febrile neutropenia (FN) during adjuvant chemotherapy is associated with morbidity, mortality risk, and substantial cost, and subsequent chemotherapy dose reductions may result in poorer outcomes. Patients at high risk of FN, or who develop it, often receive prophylaxis with granulocyte colony-stimulating factors (G-CSF). We investigated whether different G-CSF prophylaxis strategies offered favorable value for money. We developed a decision model to estimate the short- and long-term costs and outcomes of a hypothetical cohort of women with breast cancer receiving adjuvant taxotere + cyclophosphamide (TC) chemotherapy. The short-term phase estimated upfront costs and FN risks with adjuvant TC chemotherapy without G-CSF prophylaxis (i.e., with chemotherapy dose reductions) as well as with secondary and primary G-CSF prophylaxis strategies. The long-term phase estimated the expected costs and quality-adjusted life years (QALYs) for patients who completed adjuvant TC chemotherapy with or without one or more episodes of FN. Secondary G-CSF was associated with lower costs and greater QALY gains than a no-G-CSF strategy. Primary G-CSF appears likely to be cost-effective relative to secondary G-CSF at FN rates greater than 28%, assuming some loss of chemotherapy efficacy at lower dose intensities. The cost-effectiveness of primary vs. secondary G-CSF was sensitive to FN risk and mortality, and to the loss of chemotherapy efficacy following FN. Secondary G-CSF is more effective and less costly than a no-G-CSF strategy. Primary G-CSF may be justified at higher willingness-to-pay thresholds and/or higher FN risks, but this threshold FN risk appears to be higher than the 20% rate recommended by current clinical guidelines.
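
    The strategy comparison in such a decision model boils down to an incremental cost-effectiveness ratio (ICER). A minimal sketch with hypothetical cost and QALY inputs (not the study's estimates):

```python
# Minimal cost-effectiveness sketch in the spirit of the decision model:
# compare two strategies by incremental cost-effectiveness ratio (ICER).
# All cost/QALY figures below are hypothetical placeholders.

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost per QALY gained by the new strategy."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

secondary = {"cost": 21_500.0, "qaly": 11.95}  # hypothetical
primary = {"cost": 24_000.0, "qaly": 11.97}    # hypothetical

ratio = icer(primary["cost"], primary["qaly"],
             secondary["cost"], secondary["qaly"])
threshold = 100_000.0  # willingness-to-pay per QALY (hypothetical)
verdict = "adopt" if ratio < threshold else "reject"
print(f"ICER: ${ratio:,.0f}/QALY -> {verdict} primary G-CSF")
```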

  8. What the eye doesn't see: An analysis of strategies for justifying acts by an appeal for concealing them

    NARCIS (Netherlands)

    Tellings, A.E.J.M.

    2006-01-01

    This article analyzes the moral reasoning implied in a very commonly used expression, namely, “What the eye doesn't see, the heart doesn't grieve over”, or “What you don't know won't hurt you.” It especially deals with situations in which it is used for trying to justify acts that are, in

  9. Model-checking techniques based on cumulative residuals.

    Science.gov (United States)

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
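
    A simplified sketch of the resampling idea, omitting the correction term for estimated regression parameters that the paper includes: order residuals by a covariate, form their cumulative sum, and compare its supremum against Gaussian-multiplier realizations of the null process.

```python
# Simplified sketch of the Lin-Wei-Ying (2002) idea: compare the observed
# cumulative-residual process, ordered by a covariate, against zero-mean
# Gaussian-multiplier realizations approximating its null distribution.
# The parameter-estimation correction term is omitted, so this is only a
# rough approximation; data are simulated here.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x**2 + rng.normal(0, 0.3, n)   # true model is quadratic

# Fit a (deliberately misspecified) linear model, order residuals by x.
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
order = np.argsort(x)
W_obs = np.cumsum(resid[order])                 # observed process W(x)

# Gaussian-multiplier realizations of the null process.
sup_null = np.array([np.abs(np.cumsum(resid[order] * rng.normal(size=n))).max()
                     for _ in range(1000)])
p_value = np.mean(sup_null >= np.abs(W_obs).max())
print(f"supremum test p-value: {p_value:.3f}")  # small => misspecification
```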

  10. [Cesarean birth: justifying indication or justified concern?].

    Science.gov (United States)

    Muñoz-Enciso, José Manuel; Rosales-Aujang, Enrique; Domínguez-Ponce, Guillermo; Serrano-Díaz, César Leopoldo

    2011-02-01

    Caesarean section is the most common surgery performed in second-level care hospitals in the Mexican health sector, and it is even more frequent in private hospitals. To determine the behavior of caesarean section in different health sector hospitals in the city of Aguascalientes and to analyze the indications over the same period. A descriptive, cross-sectional study in the top four secondary care hospitals of the health sector of the state of Aguascalientes, which together account for 81% of obstetric care in the state, from 1 September to 31 October 2008. The following were analyzed: indication for cesarean section and its classification, previous pregnancies, marital status, gestational age, weight and one-minute Apgar score of the newborn, and birth control provided during the event. During the study period, 2,964 pregnancies beyond 29 weeks were recorded, of which 1,195 were resolved by caesarean section, an overall rate of 40.3%. We found 45 different indications, which undoubtedly reflect the great diversity of views among institutional medical staff when scheduling a cesarean section. Although each institution has different resources and a population with different characteristics, treatment protocols should be developed by the staff of each hospital, keeping the trial of labor as a cornerstone and requesting a second opinion before performing a caesarean section, all in an effort to reduce the frequency of cesarean section.

  11. Technical Data to Justify Full Burnup Credit in Criticality Safety Licensing Analysis

    International Nuclear Information System (INIS)

    2011-01-01

    Enercon Services, Inc. (ENERCON) was requested under Task Order No. 2 to identify scientific and technical data needed to benchmark and justify Full Burnup Credit, which adds 16 fission products and 4 minor actinides to Actinide-Only burnup credit. The historical perspective for Full Burnup Credit is discussed, and interviews of organizations participating in burnup credit activities are summarized as a basis for identifying additional data needs and making recommendations. Input from burnup credit participants representing two segments of the commercial nuclear industry is provided. First, the Electric Power Research Institute (EPRI) has been very active in the development of Full Burnup Credit, representing the interests of nuclear utilities in achieving capacity gains for storage and transport casks. EPRI and its utility customers are interested in a swift resolution of the validation issues that are delaying the implementation of Full Burnup Credit [EPRI 2010b]. Second, used nuclear fuel storage and transportation Cask Vendors favor improving burnup credit beyond Actinide-Only burnup credit, although their discussion of specific burnup credit achievements and data needs was limited, citing business-sensitive and proprietary technical concerns. While Cask Vendor proprietary items are not specifically identified in this report, the needs of all nuclear industry participants are reflected in the conclusions and recommendations of this report. In addition, Oak Ridge National Laboratory (ORNL) and Sandia National Laboratory (SNL) were interviewed for their input into additional data needs to achieve Full Burnup Credit. ORNL was very open to discussions of Full Burnup Credit, with several telecoms and a visit by ENERCON to ORNL. For many years, ORNL has provided extensive support to the NRC regarding burnup credit in all of its forms. Discussions with ORNL focused on potential resolutions to the validation issues for the use of fission products. SNL was helpful in

  13. Scientific and technical conference Thermophysical experimental and calculating and theoretical studies to justify characteristics and safety of fast reactors. Thermophysics-2012. Book of abstracts

    International Nuclear Information System (INIS)

    Kalyakin, S.G.; Kukharchuk, O.F.; Sorokin, A.P.

    2012-01-01

    The collection includes abstracts of reports from the scientific and technical conference Thermophysics-2012, which took place on October 24-26, 2012 in Obninsk. The abstracts consider the following topics: experimental and computational-theoretical studies of the thermal hydraulics of liquid-metal cooled fast reactors to justify their characteristics and safety; physico-chemical processes in systems with liquid-metal coolants (LMC); physico-chemical characteristics and thermophysical properties of LMC; development of models, computational methods and calculation codes for simulating hydrodynamics and heat and mass transfer, including impurity mass transfer, in systems with LMC; methods and means for monitoring the composition and condition of LMC in fast reactor circuits with respect to impurities, and for purification; apparatus, equipment and technological processes for work with LMC, taking ecology into account, including fast reactor decommissioning; and measuring techniques, sensors and devices for experimental studies of heat and mass transfer in systems with LMC [ru]

  14. Estimation and model selection of semiparametric multivariate survival functions under general censorship.

    Science.gov (United States)

    Chen, Xiaohong; Fan, Yanqin; Pouzo, Demian; Ying, Zhiliang

    2010-07-01

    We study estimation and model selection of semiparametric models of multivariate survival functions for censored data, which are characterized by possibly misspecified parametric copulas and nonparametric marginal survivals. We obtain the consistency and root-n asymptotic normality of a two-step copula estimator of the pseudo-true copula parameter value according to KLIC, and provide a simple consistent estimator of its asymptotic variance, allowing for a first-step nonparametric estimation of the marginal survivals. We establish the asymptotic distribution of the penalized pseudo-likelihood ratio statistic for comparing multiple semiparametric multivariate survival functions subject to copula misspecification and general censorship. An empirical application is provided.
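
    A rough sketch of the two-step logic with a Clayton copula, assuming no censoring for simplicity (the paper handles censored margins): step 1 forms pseudo-observations from empirical marginals; step 2 maximizes the copula pseudo-likelihood, which converges to the KLIC pseudo-true value when the copula is misspecified.

```python
# Two-step copula estimation sketch, uncensored case. The data-generating
# copula below is Gaussian, so the fitted Clayton copula is deliberately
# misspecified, as the paper's framework allows.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
n = 500
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
t1, t2 = np.exp(z[:, 0]), np.exp(z[:, 1])   # dependent "survival times"

# Step 1: pseudo-observations from empirical marginal distributions.
def pseudo_obs(t):
    return (np.argsort(np.argsort(t)) + 1) / (len(t) + 1)

u, v = pseudo_obs(t1), pseudo_obs(t2)

# Step 2: maximize the Clayton copula log-density over theta > 0.
def neg_loglik(theta):
    s = u**(-theta) + v**(-theta) - 1
    logc = (np.log1p(theta) - (theta + 1) * (np.log(u) + np.log(v))
            - (2 + 1 / theta) * np.log(s))
    return -logc.sum()

res = minimize_scalar(neg_loglik, bounds=(1e-3, 20), method="bounded")
print(f"pseudo-true Clayton theta estimate: {res.x:.2f}")
```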

  15. Analysis of hypoglycemic events using negative binomial models.

    Science.gov (United States)

    Luo, Junxiang; Qu, Yongming

    2013-01-01

    Negative binomial regression is a standard model for analyzing hypoglycemic events in diabetes clinical trials. Adjusting for baseline covariates could potentially increase the estimation efficiency of negative binomial regression. However, adjusting for covariates raises concerns about model misspecification, to which negative binomial regression is not robust because of its strong model assumptions. Some literature has suggested correcting the standard error of the maximum likelihood estimator by introducing overdispersion, which can be estimated by the deviance or the Pearson chi-square. We propose conducting negative binomial regression using sandwich estimation of the covariance matrix of the parameter estimates together with the Pearson overdispersion correction (denoted NBSP). In this research, we compared several commonly used negative binomial model options with our proposed NBSP. Simulations and real data analyses showed that NBSP is the most robust to model misspecification, and that estimation efficiency is improved by adjusting for baseline hypoglycemia. Copyright © 2013 John Wiley & Sons, Ltd.
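
    My reading of the NBSP recipe in statsmodels terms (a sketch under that assumption, not the authors' code): a negative binomial GLM fitted with the Pearson chi-square scale for overdispersion and a robust sandwich (HC0) covariance, on simulated trial-like data.

```python
# Sketch of the NBSP idea: negative binomial GLM with Pearson-chi-square
# scale (overdispersion correction) plus a sandwich (HC0) covariance.
# Data are simulated; this is an illustration, not the authors' code.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
baseline = rng.poisson(2.0, n)                 # baseline hypoglycemia count
treat = rng.integers(0, 2, n)                  # treatment arm indicator
mu = np.exp(0.3 + 0.25 * np.log1p(baseline) - 0.4 * treat)
y = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))  # event counts, mean mu

X = sm.add_constant(np.column_stack([np.log1p(baseline), treat]))
model = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5))
# scale="X2": Pearson overdispersion correction; cov_type="HC0": sandwich.
res = model.fit(scale="X2", cov_type="HC0")
print(res.summary(xname=["const", "log1p_baseline", "treatment"]))
```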

  16. Can Defense Spending Be Justified during a Period of Continual Peace?

    Science.gov (United States)

    1991-06-07

    although this was clearly unsatisfactory from a strictly theoretical perspective. Revealed preference is a technique used to explain consumer behavior ...insurgencies therefore a case of irrational behavior? In behavioral sciences, it is usually tempting to assume away deviations from the predictions of a model as irrational behavior or an inadequacy of the model. Rationality is axiomatic. All nation-states always act according to what they perceive (as

  17. The Shifting Seasonal Mean Autoregressive Model and Seasonality in the Central England Monthly Temperature Series, 1772-2016

    DEFF Research Database (Denmark)

    He, Changli; Kang, Jian; Terasvirta, Timo

    In this paper we introduce an autoregressive model with seasonal dummy variables in which coefficients of seasonal dummies vary smoothly and deterministically over time. The error variance of the model is seasonally heteroskedastic and multiplicatively decomposed, the decomposition being similar ... temperature series. More specifically, the idea is to find out in which way and by how much the monthly temperatures are varying over time during the period of more than 240 years, if they do. Misspecification tests are applied to the estimated model and the findings discussed.
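
    An illustrative sketch of the core idea (not the authors' exact specification or data): monthly dummy coefficients that shift deterministically over time through a logistic transition in t, estimated by least squares on a simulated series.

```python
# Seasonal dummies with smoothly time-varying coefficients: the seasonal
# mean shifts from `before` toward `after` as a logistic transition g(t)
# moves from 0 to 1. All numbers are arbitrary illustration values.
import numpy as np

rng = np.random.default_rng(4)
n_years, S = 240, 12
t = np.arange(n_years * S)
month = t % S
g = 1.0 / (1.0 + np.exp(-(t / t.max() - 0.5) * 10))  # smooth transition in time

before = 5 * np.sin(2 * np.pi * np.arange(S) / S)    # initial seasonal means
after = before + 1.5                                 # shifted means (arbitrary)
y = (1 - g) * before[month] + g * after[month] + rng.normal(0, 1, len(t))

# Regressors: monthly dummies and their interaction with g(t).
D = np.eye(S)[month]
X = np.hstack([D, D * g[:, None]])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print("estimated seasonal shift by month:", np.round(beta[S:], 2))  # ~1.5 each
```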

  18. Justifying Reasons for Giving Employment Priorities to Isargaran and Veterans in Iranian and American Law

    Directory of Open Access Journals (Sweden)

    Ali Akbar Gorji Azandaryani

    2012-11-01

    Full Text Available Equality is one of the principles and fundamental rights of human beings. There has been much talk about equality and justice, but the legal aspect of this principle is still under dispute. Human beings are born equal, so their lives have equal moral value. This principle, along with the prohibition of discrimination and the rejection of bias, has a great impact on legislative and administrative decisions and is accepted in the Constitution and international norms. The important point here is the formation of a paradox in the concept of the principle of equality in today's law: a kind of discrimination exists within legal and social relationships, within the quest for equality. Privileges granted to soldiers returning from war and their descendants are an issue that arises during or immediately after every war and, because of their discriminatory nature, appear controversial at first glance; opinions on this issue vary widely. In this article, we examine the justifying reasons for giving employment priorities to veterans based on theories of permissible discrimination and equality, and we discuss the employment priority of isargaran and veterans in Iranian and United States law. We first examine the theoretical discussions and the preference given to veterans in American law. In the next part, in light of the findings of the first part, veteran and isargaran employment preference is debated in the judicial systems of the United States and Iran. Discussing this privilege, we conclude that it is granted to veterans and isargaran according to theories of permissible discrimination and equality, that none of these theories is completely accepted by the legislatures of Iran and America, and that various theories have been applied according to time and place.

  19. Are multi-paddock grazing systems economically justifiable? | M.T. ...

    African Journals Online (AJOL)

    The financial implications of few- and multi-paddock systems were modelled by a discounted cash flow analysis with the (discounted) present value as the dependent variable, and number of paddocks, farm run-down time, time horizon and discount rate as the independent variables. Present values were higher for few- ...
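
    A minimal sketch of the discounted-cash-flow comparison described above, with hypothetical cash flows: a capital outlay for extra paddocks followed by higher annual returns, evaluated at several discount rates.

```python
# Discounted-cash-flow sketch: present value of annual net cash flows under
# a given discount rate and time horizon. All cash-flow and rate values are
# hypothetical placeholders, not the study's data.

def present_value(cash_flows, rate):
    """PV of cash_flows[t] received at the end of year t+1."""
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cash_flows))

few_paddocks = [12_000] * 20                # stable annual returns (hypothetical)
multi_paddock = [-30_000] + [14_000] * 19   # fencing outlay, then higher returns

for rate in (0.04, 0.08, 0.12):
    pv_few = present_value(few_paddocks, rate)
    pv_multi = present_value(multi_paddock, rate)
    print(f"discount {rate:.0%}: few={pv_few:,.0f}  multi={pv_multi:,.0f}")
# Higher discount rates penalize the capital-heavy multi-paddock system most.
```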

  20. Can High Bandwidth and Latency Justify Large Cache Blocks in Scalable Multiprocessors?

    Science.gov (United States)

    1994-01-01

    400 MB/second. Dubnicki's work used trace-driven simulation, with traces collected on an 8-processor machine. We would expect such small-scale... [Figure 17: Miss rate of Ind Blocked LU. Figure 18: MCPR of Ind Blocked LU.] ...overall miss rate of TGauss is a factor of...easily. This approach assumes that the model parameters we collect from simulations with infinite bandwidth (such as the miss rate and the

  1. Does the Occasion Justify the Denunciation?: a Multilevel Approach for Brazilian Accountants

    Directory of Open Access Journals (Sweden)

    Bernardo de Abreu Guelber Fajardo

    2014-01-01

    Full Text Available Fraud represents large losses to the global economy, and one of the main means of containing it is denunciation within organizations: whistleblowing. This research aims to analyze whistleblowing within the Brazilian context, considering the influence of costs and intrinsic benefits as well as aspects of the individual's interaction with his/her organization, profession and society at large. Using a questionnaire answered by 124 accountants, a multilevel model was applied to analyze these aspects. The results demonstrate the importance of situational aspects as a positive influence in favor of denunciations. These results are useful for organizations and regulatory institutions in developing institutional mechanisms to encourage denunciation. Moreover, the results are also useful for teachers of professional ethics and for members of the Federal and Regional Accounting Councils, which are dedicated to the assessment of alleged deviations from the professional code of ethics.

  2. Allocation of development assistance for health: is the predominance of national income justified?

    Science.gov (United States)

    Sterck, Olivier; Roser, Max; Ncube, Mthuli; Thewissen, Stefan

    2018-02-01

    Gross national income (GNI) per capita is widely regarded as a key determinant of health outcomes. Major donors rely heavily on GNI per capita to allocate development assistance for health (DAH). This article questions this paradigm by analysing the determinants of health outcomes using cross-sectional data from 99 countries in 2012. We use disability-adjusted life years (Group I) per capita as our main indicator of health outcomes. We consider four primary variables: GNI per capita, institutional capacity, individual poverty and the epidemiological surroundings. Our empirical strategy has two innovations. First, we construct a health poverty line of 10.89 international-$ per day, which measures the minimum level of income an individual needs to have access to basic healthcare. Second, we take the contagious nature of communicable diseases into account by estimating the extent to which population health in neighbouring countries (the epidemiological surroundings) affects health outcomes. We apply a spatial two-stage least-squares model to mitigate the risks of reverse causality. Our model captures 92% of the variation in health outcomes. We emphasize four findings. First, GNI per capita is not a significant predictor of health outcomes once other factors are controlled for. Second, the poverty gap below the 10.89 health poverty line is a good measure of universal access to healthcare, as it explains 19% of the variation in health outcomes. Third, the epidemiological surroundings in which countries are embedded capture as much as 47% of the variation in health outcomes. Finally, institutional capacity explains 10% of the variation in health outcomes. Our empirical findings suggest that allocation frameworks for DAH should take into account not only national income, which remains an important indicator of countries' financial capacity, but also individual poverty, governance and epidemiological surroundings to increase impact on health outcomes. The Author(s) 2017

  3. Topics in modelling of clustered data

    CERN Document Server

    Aerts, Marc; Ryan, Louise M; Geys, Helena

    2002-01-01

    Many methods for analyzing clustered data exist, all with advantages and limitations in particular applications. Compiled from the contributions of leading specialists in the field, Topics in Modelling of Clustered Data describes the tools and techniques for modelling the clustered data often encountered in medical, biological, environmental, and social science studies. It focuses on providing a comprehensive treatment of marginal, conditional, and random effects models using, among others, likelihood, pseudo-likelihood, and generalized estimating equations methods. The authors motivate and illustrate all aspects of these models in a variety of real applications. They discuss several variations and extensions, including individual-level covariates and combined continuous and discrete outcomes. Flexible modelling with fractional and local polynomials, omnibus lack-of-fit tests, robustification against misspecification, exact, and bootstrap inferential procedures all receive extensive treatment. The application...

  4. Can the war in Iraq be justified as a case of humanitarian intervention? [La guerre en Irak peut-elle être justifiée comme un cas d'intervention humanitaire?]

    Directory of Open Access Journals (Sweden)

    Stéphane Courtois

    2006-05-01

    Full Text Available Most current criticisms of the intervention in Iraq have tackled the two justifications articulated by the members of the coalition: (1) that the United States had to neutralize the threats that Iraq posed to their own security and to political stability in the Middle East, and (2) that the war in Iraq can be justified as a necessary stage in the war against international terrorism. The principal objection against justification (1) is that it was, and remains, unfounded. Against justification (2), many have replied that the intervention in Iraq had no connection, or at best a merely indirect connection, with the fight against terrorism. In a recent text, Fernando Tesón claims that the American intervention in Iraq can nevertheless be morally justified as a case of humanitarian intervention. By "humanitarian intervention", one must understand a coercive action taken by a state or a group of states inside the sphere of jurisdiction of an independent political community, without the permission of the latter, in order to prevent or to end a massive violation of individual rights perpetrated against innocent persons who are not co-nationals within this political community. I argue in this article that the American intervention in Iraq does not satisfy the conditions of a legitimate humanitarian intervention, contrary to what Fernando Tesón claims.

  5. Hydroaerothermal investigations conducted in the USSR to justify the construction of large cooling towers

    International Nuclear Information System (INIS)

    Goncharov, V.V.

    1989-01-01

    The multi-purpose task of improving the water cooling systems of thermal and nuclear power plants is aimed at developing efficient designs of cooling towers and other types of industrial coolers, which calls for comprehensive scientific justification. Cooling towers of 60-70 thousand m³/h capacity with a chimney height of 130 m, and of 80-100 thousand m³/h capacity with a chimney height of 150 m, were developed. For the circulating water systems of large power plants, the design of a counterflow chimney cooling tower of 180 thousand m³/h capacity has recently been developed. At present, work is being conducted on a new three-cell cooling tower featuring high reliability, operational flexibility and cost-effectiveness. This cooling tower, besides having higher operating reliability than the conventional one of circular shape, allows commissioning, current repairs and overhauls of the water cooling arrangements in a cell-wise sequence, i.e. without shutting down the power generating units. Laboratory and field investigations of spray-type cooling towers having no packing (fill), studies of heat and mass exchange processes, the aerodynamics of droplet flows, and new designs of sprayers made it possible to conclude that their cooling capacity can be substantially increased and brought up to the level of cooling towers with film packings. The pilot cooling towers were designed according to counterflow, crossflow and cross-counterflow schemes. The basic investigation method remains the experimental one. On test rigs and aerodynamic models the heat and mass transfer and aerodynamic resistance coefficients are determined. These studies and subsequent calculations are based on the heat balance equation

  6. Is the use of wildlife group-specific concentration ratios justified?

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Michael D. [School of Environment and Life Sciences, University of Salford, Manchester, M4 4WT (United Kingdom); Beresford, Nicholas A. [School of Environment and Life Sciences, University of Salford, Manchester, M4 4WT (United Kingdom); Centre for Ecology and Hydrology, Bailrigg, Lancaster, LA1 4AP (United Kingdom); Copplestone, David [School of Natural Sciences, University of Stirling, Stirling, FK9 4LA (United Kingdom); Howard, Brenda J. [Centre for Ecology and Hydrology, Bailrigg, Lancaster, LA1 4AP (United Kingdom); Yankovich, Tamara L. [International Atomic Energy Agency, Vienna International Centre, 1400 Vienna (Austria)

    2014-07-01

    The international Wildlife Transfer Database (WTD; www.wildlifetransferdatabase.org/?) provides the most comprehensive international compilation of radionuclide transfer parameters (concentration ratios) for wildlife. The concentration ratio (CR_wo-media) is a constant that describes the ratio between the activity concentration of a radionuclide in the whole organism and the activity concentration of that radionuclide in a reference environmental medium (e.g. soil or filtered water). Developed to support activities of the International Atomic Energy Agency (IAEA) and the International Commission on Radiological Protection (ICRP), the WTD now contains over 100,000 CR_wo-media values. The WTD has been used to generate summary statistics for broad wildlife groups (e.g. amphibian, arthropod, mammal, reptile, shrub, tree etc.). The group-specific summary statistics include mean and standard deviation (both arithmetic and geometric) and range. These summarised CR_wo-media values (generally the arithmetic or geometric mean) are used in most of the modelling approaches currently implemented for wildlife dose assessment. Beyond the broad organism group summary statistics presented within the WTD, it is possible to generate CR_wo-media summary statistics for some organism sub-categories (e.g. carnivorous, herbivorous and omnivorous birds). However, using a statistical analysis we developed recently for the analysis of summarised datasets, we have shown that there is currently little statistical justification for the use of organism sub-category CR_wo-media values. Large variability is a characteristic of many of the organism-radionuclide datasets within the WTD, even within individual input data sets. Therefore, the statistical validity of defining different CR_wo-media values for these broad wildlife groups may also be questioned. However, no analysis has been undertaken to date to determine the statistical significance of any differences between

  7. A comparison of non-homogeneous Markov regression models with application to Alzheimer’s disease progression

    Science.gov (United States)

    Hubbard, R. A.; Zhou, X.H.

    2011-01-01

    Markov regression models are useful tools for estimating the impact of risk factors on rates of transition between multiple disease states. Alzheimer’s disease (AD) is an example of a multi-state disease process in which great interest lies in identifying risk factors for transition. In this context, non-homogeneous models are required because transition rates change as subjects age. In this report we propose a non-homogeneous Markov regression model that allows for reversible and recurrent disease states, transitions among multiple states between observations, and unequally spaced observation times. We conducted simulation studies to demonstrate performance of estimators for covariate effects from this model and compare performance with alternative models when the underlying non-homogeneous process was correctly specified and under model misspecification. In simulation studies, we found that covariate effects were biased if non-homogeneity of the disease process was not accounted for. However, estimates from non-homogeneous models were robust to misspecification of the form of the non-homogeneity. We used our model to estimate risk factors for transition to mild cognitive impairment (MCI) and AD in a longitudinal study of subjects included in the National Alzheimer’s Coordinating Center’s Uniform Data Set. Using our model, we found that subjects with MCI affecting multiple cognitive domains were significantly less likely to revert to normal cognition. PMID:22419833
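
    A toy sketch of a non-homogeneous process of this kind (illustrative intensities, not estimates from the study): three states (normal, MCI, AD) with transition intensities that rise with age, MCI allowed to revert to normal, and AD treated as absorbing for simplicity.

```python
# Non-homogeneous Markov chain sketch for normal -> MCI -> AD progression,
# with age-dependent transition intensities. Numbers are illustrative only.
import numpy as np

def transition_matrix(age, dt=1.0):
    """One-year transition probabilities at a given age (illustrative)."""
    f = np.exp(0.05 * (age - 70))                  # intensities rise with age
    q01, q10, q12 = 0.06 * f, 0.03, 0.08 * f       # normal<->MCI, MCI->AD
    Q = np.array([[-q01, q01, 0.0],
                  [q10, -(q10 + q12), q12],
                  [0.0, 0.0, 0.0]])                # AD absorbing here
    # First-order approximation P ~ I + Q*dt, valid for small intensities.
    return np.eye(3) + Q * dt

state = np.array([1.0, 0.0, 0.0])                  # cognitively normal at 65
for age in range(65, 90):
    state = state @ transition_matrix(age)
print("P(normal, MCI, AD) at age 90:", np.round(state, 3))
```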

  8. Should she be granted asylum? Examining the justifiability of the persecution criterion and nexus clause in asylum law

    Directory of Open Access Journals (Sweden)

    Noa Wirth Nogradi

    2016-10-01

    Full Text Available The current international asylum regime recognizes only persecuted persons as rightful asylum applicants. The Geneva Convention and Protocol enumerate specific grounds upon which persecution is recognized. Claimants who cannot demonstrate a real risk of persecution based on one of the recognized grounds are unlikely to be granted asylum. This paper aims to relate real-world practices to normative theories, asking whether the Convention’s restricted preference towards persecuted persons is normatively justified. I intend to show that the justifications of the persecution criterion also apply to grounds currently lacking recognition. My main concern will be persecution on the grounds of gender.The first section introduces the dominant standpoints in theories of asylum, which give different answers to the question of who should be granted asylum, based on different normative considerations. Humanitarian theories base their claims on the factual neediness of asylum-seekers, holding that whoever is in grave danger of harm or deprivation should be granted asylum. Political theories base their justifications on conceptions of legitimacy and membership, holding that whoever has been denied membership in their original state should be granted asylum. Under political theories, Matthew Price’s theory will be discussed, which provides a normative justification of the currently recognized persecution criterion. The second section provides a descriptive definition of persecution based on Kuosmanen (2014, and evaluates the normative relevance of the different elements of this definition based on the theories presented previously. The third section is devoted to the examination of the normative justifiability of the nexus clause’s exclusive list of the bases (grounds upon which persons might be persecuted. The section argues that while the clause does not recognize that persecution might be based on gender, in fact many women experience harms based on

  9. Spatial measurement error and correction by spatial SIMEX in linear regression models when using predicted air pollution exposures.

    Science.gov (United States)

    Alexeeff, Stacey E; Carroll, Raymond J; Coull, Brent

    2016-04-01

    Spatial modeling of air pollution exposures is widespread in air pollution epidemiology research as a way to improve exposure assessment. However, there are key sources of exposure model uncertainty when air pollution is modeled, including estimation error and model misspecification. We examine the use of predicted air pollution levels in linear health effect models under a measurement error framework. For the prediction of air pollution exposures, we consider a universal Kriging framework, which may include land-use regression terms in the mean function and a spatial covariance structure for the residuals. We derive the bias induced by estimation error and by model misspecification in the exposure model, and we find that a misspecified exposure model can induce asymptotic bias in the effect estimate of air pollution on health. We propose a new spatial simulation extrapolation (SIMEX) procedure, and we demonstrate that the procedure has good performance in correcting this asymptotic bias. We illustrate spatial SIMEX in a study of air pollution and birthweight in Massachusetts. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
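
    To convey the core SIMEX idea, here is a minimal sketch in the simplest classical measurement-error setting (independent errors, one slope); the spatial SIMEX of the paper extends this to correlated errors from kriged exposure predictions. All data and parameters below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta, sigma_u = 2000, 0.5, 0.8
x = rng.normal(size=n)                     # true exposure
w = x + rng.normal(scale=sigma_u, size=n)  # error-prone predicted exposure
y = beta * x + rng.normal(size=n)          # health outcome

lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
est = []
for lam in lambdas:
    # average naive slope estimates over B remeasured datasets
    b = [np.polyfit(w + rng.normal(scale=np.sqrt(lam) * sigma_u, size=n), y, 1)[0]
         for _ in range(50)]
    est.append(np.mean(b))

# quadratic extrapolation back to lambda = -1 (no measurement error)
coef = np.polyfit(lambdas, est, 2)
print("naive:", est[0], "SIMEX:", np.polyval(coef, -1.0))
```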

  10. Assessing the DICE model: uncertainty associated with the emission and retention of greenhouse gases

    International Nuclear Information System (INIS)

    Kaufmann, R.K.

    1997-01-01

    Analysis of the DICE model indicates that it contains unsupported assumptions, simple extrapolations, and misspecifications that cause it to understate the rate at which economic activity emits greenhouse gases and the rate at which the atmosphere retains greenhouse gases. The model assumes a world population that is 2 billion people lower than the 'base case' projected by demographers. The model extrapolates a decline in the quantity of greenhouse gases emitted per unit of economic activity that is possible only if there is a structural break in the economic and engineering factors that have determined this ratio over the last century. The model uses a single equation to simulate the rate at which greenhouse gases accumulate in the atmosphere. The forecast for the airborne fraction generated by this equation contradicts forecasts generated by models that represent the physical and chemical processes which determine the movement of carbon from the atmosphere to the ocean. When these unsupported assumptions, simple extrapolations, and misspecifications are remedied with simple fixes, the estimated economic impact of global climate change increases severalfold. Similarly, these remedies increase the impact of uncertainty on estimates of the economic impact of global climate change. Together, these results indicate that considerable scientific and economic research is needed before the threat of climate change can be dismissed with any degree of certainty. 23 refs., 3 figs
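
    For readers unfamiliar with the critique, the following sketch shows the style of single-equation accumulation rule at issue: a fixed airborne fraction with geometric removal, which is exactly what process-based carbon-cycle models contradict. The parameter values are illustrative assumptions, not the DICE model's.

```python
# Stylized single-equation carbon accumulation rule (invented parameters):
# M[t+1] = M_pre + beta * E[t] + (1 - delta) * (M[t] - M_pre)
M_pre, beta, delta = 590.0, 0.64, 0.008   # GtC, fixed airborne fraction, removal rate
M, E = [750.0], 8.0                       # current stock, constant annual emissions (GtC)
for t in range(100):
    M.append(M_pre + beta * E + (1 - delta) * (M[-1] - M_pre))
print(f"stock after 100 years: {M[-1]:.0f} GtC")
```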

  11. PET/CT in cancer: moderate sample sizes may suffice to justify replacement of a regional gold standard

    DEFF Research Database (Denmark)

    Gerke, Oke; Poulsen, Mads Hvid; Bouchelouche, Kirsten

    2009-01-01

    PURPOSE: For certain cancer indications, the current patient evaluation strategy is a perfect but locally restricted gold standard procedure. If positron emission tomography/computed tomography (PET/CT) can be shown to be reliable within the gold standard region and if it can be argued that PET/CT also performs well in adjacent areas, then sample sizes in accuracy studies can be reduced. PROCEDURES: Traditional standard power calculations for demonstrating sensitivities of both 80% and 90% are shown. The argument is then described in general terms and demonstrated by an ongoing study... of metastasized prostate cancer. RESULTS: An added value in accuracy of PET/CT in adjacent areas can outweigh a downsized target level of accuracy in the gold standard region, justifying smaller sample sizes. CONCLUSIONS: If PET/CT provides an accuracy benefit in adjacent regions, then sample sizes can be reduced.
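
    A minimal version of the "traditional standard power calculation" mentioned in PROCEDURES, using the usual normal-approximation sample-size formula for one proportion; the thresholds below are examples, not the study's figures.

```python
from math import ceil, sqrt
from scipy.stats import norm

def n_sensitivity(p0, p1, alpha=0.05, power=0.80):
    # subjects with disease needed to show sensitivity > p0 when truth is p1
    za, zb = norm.ppf(1 - alpha), norm.ppf(power)
    num = za * sqrt(p0 * (1 - p0)) + zb * sqrt(p1 * (1 - p1))
    return ceil((num / (p1 - p0)) ** 2)

# e.g. demonstrate sensitivity exceeds 80% when the true value is 90%:
print(n_sensitivity(0.80, 0.90))
```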

  12. A Closer Look at the Junior Doctor Crisis in the United Kingdom's National Health Services: Is Emigration Justifiable?

    Science.gov (United States)

    Teo, Wendy Zi Wei

    2018-07-01

    This article attempts to tackle the ethically and morally troubling issue of emigration of physicians from the United Kingdom, and whether it can be justified. Unlike most research that has already been undertaken in this field, which looks at migration from developing countries to developed countries, this article takes an in-depth look at the migration of physicians between developed countries, in particular from the United Kingdom (UK) to other developed countries such as Canada, Australia, New Zealand, and the United States (US). This examination was written in response to a current and critical crisis in the National Health Service (NHS), where impending contract changes may bring about a potential exodus of junior doctors.

  13. A Lacanian Reading of the Two Novels The Scarlet Letter And Private Memoirs And Confessions of A Justified Sinner

    Directory of Open Access Journals (Sweden)

    Marjan Yazdanpanahi

    2016-07-01

    Full Text Available This paper discusses two novels, The Private Memoirs and Confessions of a Justified Sinner by James Hogg and The Scarlet Letter by Nathaniel Hawthorne, from the perspective of Jacques Lacan's theories: the mirror stage, the name-of-the-father and desire. The mirror stage refers to historical value and an essential libidinal relationship with the body-image. The name-of-the-father is defined as the prohibitive role of the father as the one who lays down the incest taboo in the Oedipus complex. Desire, meanwhile, is neither the appetite for satisfaction nor the demand for love, but the difference that results from the subtraction of the first from the second.

  14. Is a Clean Development Mechanism project economically justified? Case study of an International Carbon Sequestration Project in Iran.

    Science.gov (United States)

    Katircioglu, Salih; Dalir, Sara; Olya, Hossein G

    2016-01-01

    The present study evaluates a carbon sequestration project for three plant species in arid and semiarid regions of Iran. Results show that Haloxylon performed well in the carbon sequestration process during the 6 years of the International Carbon Sequestration Project (ICSP). In addition to a high degree of carbon dioxide sequestration, Haloxylon shows high compatibility with severe environmental conditions and low maintenance costs. Financial and economic analysis demonstrated that the ICSP was justified from an economic perspective. The financial assessment showed that the net present value (NPV) (US$1,098,022.70), internal rate of return (IRR) (21.53%), and payback period (6 years) were in an acceptable range. The economic analysis suggested an NPV of US$4,407,805.15 and an IRR of 50.63%. Therefore, the results of this study suggest that there are sufficient incentives for investors to participate in such Clean Development Mechanism (CDM) projects.
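
    The NPV and IRR arithmetic underlying such an assessment is easy to reproduce; the cash flows below are invented placeholders, not the ICSP figures quoted in the abstract.

```python
def npv(rate, cashflows):
    # discounted sum of cash flows, one per year starting at t = 0
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    # bisection on the rate at which NPV crosses zero
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

flows = [-1_000_000] + [300_000] * 6     # outlay, then six annual returns
print(round(npv(0.10, flows)), f"{irr(flows):.2%}")
```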

  15. Are chest radiographs justified in pre-employment examinations. Presentation of legal position and medical evidence based on 1760 cases

    International Nuclear Information System (INIS)

    Ladd, S.C.; Krause, U.; Ladd, M.E.

    2006-01-01

    The legal and medical basis for chest radiographs as part of pre-employment examinations (PEE) at a University Hospital is evaluated. The radiographs are primarily performed to exclude infectious lung disease. A total of 1760 consecutive chest radiographs performed as a routine part of PEEs were reviewed retrospectively. Pathologic findings were categorized as "nonrelevant" or "relevant". No positive finding with respect to tuberculosis or any other infectious disease was found; 94.8% of the chest radiographs were completely normal. Only five findings were regarded as "relevant" for the individual. No employment-relevant diagnosis occurred. The performance of chest radiography as part of a PEE is most often not justified. The practice is expensive, can violate national and European law, and lacks medical justification. (orig.)

  16. Is nuclear energy justifiable?

    International Nuclear Information System (INIS)

    Roth, E.

    1988-01-01

    This is a comment on an article by Prof. Haerle, a theologian, published earlier under the same heading, in which the use of nuclear energy is rejected for ethical reasons. The comment contests the claim made by the first author that theologians, because they have general ethical competency, must therefore also be competent to decide which technique (of energy conversion) best satisfies, or could potentially satisfy, the criteria of responsible action. On this view, an ethical verdict on, for instance, nuclear energy is beyond the scope of the competency of the churches: one is entitled to object to nuclear energy only as a private person, not by virtue of one's position in the church. (HSCH)

  17. Justified Self-Esteem

    Science.gov (United States)

    Kristjansson, Kristjan

    2007-01-01

    This paper develops a thread of argument from previous contributions to this journal by Richard Smith and Ruth Cigman about the educational salience of self-esteem. It is argued--contra Smith and Cigman--that the social science conception of self-esteem does serve a useful educational function, most importantly in undermining the inflated…

  18. Justifier l’injustifiable

    Directory of Open Access Journals (Sweden)

    Olivier Jouanjan

    2006-04-01

    Full Text Available 'Law' also resides in the discourses that are held about it, notably the discourses of jurists. Analysing the discourses of the committed jurists of the Third Reich brings out a general pattern of justification, a generative grammatical principle of these discourses that may be called 'substantive decisionism'. Legal positivism, being abstract and 'Jewish', was designated the principal enemy of the science of Nazi 'law', a 'science' that could conceive of itself only as political. By analysing the ideological-legal construction of the total state, the destruction of the notion of subjective rights, and the substitution of a 'concrete' notion of 'being-a-member-of-the-community' for the concept of legal personality, and then by showing how these discourses operated in practice, this contribution brings out the double logic of incorporation and incarnation at work in the Nazi science of law, a 'science' whose 'theory' Carl Schmitt supplied in 1934 with his 'concrete order thinking'.

  19. Rethinking Recruitment in Policing in Australia: Can the Continued Use of Masculinised Recruitment Tests and Pass Standards that Limit the Number of Women be Justified?

    Directory of Open Access Journals (Sweden)

    Susan Robinson

    2015-06-01

    Full Text Available Over the past couple of decades, Australian police organisations have sought to increase the numbers of women in sworn policing roles by strictly adhering to equal treatment of men and women in the recruitment process. Unfortunately, this blind adherence to equal treatment in recruitment may inadvertently disadvantage and limit women. In particular, the emphasis on masculine attributes in recruitment, as opposed to the 'soft' attributes of communication and conflict resolution skills, and the setting of minimum pass standards according to average male performance, disproportionately disadvantage women and serve to unnecessarily limit the number of women in policing. This paper reviews studies undertaken by physiotherapists and a range of occupational experts to discuss the relevance of physical fitness and agility tests, and the pass standards applied to them, in policing. It is suggested that masculinised recruitment tests that pose an unnecessary barrier to women cannot be justified unless directly linked to the job that is to be undertaken. Utilising a policy development and review model, the paper analyses the problem posed by physical testing that is unadjusted for gender. As a result, it is recommended that police organisations objectively review recruitment processes and requirements to identify and eliminate unnecessary barriers to women's entry to policing. It is also recommended that, where fitness and agility tests are deemed essential to the job, the pass level be adjusted for gender.

  20. A Smooth Transition Logit Model of the Effects of Deregulation in the Electricity Market

    DEFF Research Database (Denmark)

    Hurn, A.S.; Silvennoinen, Annastiina; Teräsvirta, Timo

    We consider a nonlinear vector model called the logistic vector smooth transition autoregressive model. The bivariate single-transition vector smooth transition regression model of Camacho (2004) is generalised to a multivariate and multitransition one. A modelling strategy consisting of specification, including testing linearity, estimation and evaluation of these models is constructed. Nonlinear least squares estimation of the parameters of the model is discussed. Evaluation by misspecification tests is carried out using tests derived in a companion paper. The use of the modelling strategy...
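
    At the core of any logistic smooth transition model is a transition function that moves smoothly between regimes; a minimal univariate sketch (invented slope and location parameters, not the authors' multivariate specification):

```python
import numpy as np

def G(s, gamma, c):
    # gamma > 0 controls the slope, c the location; G runs smoothly from 0 to 1
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

s = np.linspace(-3, 3, 7)
print(G(s, gamma=2.0, c=0.0))   # regime weight as the transition variable moves
```

    In a two-regime smooth transition regression, G weights the coefficients of the two regimes; as gamma approaches zero the model collapses to a linear specification, which is what the linearity tests mentioned in the abstract examine.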

  1. On the Estimation of Disease Prevalence by Latent Class Models for Screening Studies Using Two Screening Tests with Categorical Disease Status Verified in Test Positives Only

    Science.gov (United States)

    Chu, Haitao; Zhou, Yijie; Cole, Stephen R.; Ibrahim, Joseph G.

    2010-01-01

    Summary To evaluate the probabilities of a disease state, ideally all subjects in a study should be diagnosed by a definitive diagnostic or gold standard test. However, since definitive diagnostic tests are often invasive and expensive, it is generally unethical to apply them to subjects whose screening tests are negative. In this article, we consider latent class models for screening studies with two imperfect binary diagnostic tests and a definitive categorical disease status measured only for those with at least one positive screening test. Specifically, we discuss one conditionally independent and three homogeneous conditionally dependent latent class models and assess the impact of misspecification of the dependence structure on the estimation of disease category probabilities using frequentist and Bayesian approaches. Interestingly, the three homogeneous dependent models can provide identical goodness-of-fit but substantively different estimates for a given study. However, the parametric form of the assumed dependence structure itself is not “testable” from the data, and thus the dependence structure modeling considered here can only be viewed as a sensitivity analysis concerning a more complicated non-identifiable model potentially involving a heterogeneous dependence structure. Furthermore, we discuss Bayesian model averaging together with its limitations as an alternative way to partially address this particularly challenging problem. The methods are applied to two cancer screening studies, and simulations are conducted to evaluate the performance of these methods. In summary, further research is needed to reduce the impact of model misspecification on the estimation of disease prevalence in such settings. PMID:20191614
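
    For concreteness, the conditionally independent member of this model family can be written out directly for two binary tests; the prevalence, sensitivities and specificities below are invented values, and the dependent variants would add association terms given the latent class.

```python
import numpy as np

prev = 0.10                      # P(D = 1), hypothetical
sens = np.array([0.85, 0.80])    # P(T_k = 1 | D = 1)
spec = np.array([0.95, 0.90])    # P(T_k = 0 | D = 0)

def cell_prob(t1, t2):
    # tests independent given disease status D (the conditional-independence model)
    t = np.array([t1, t2])
    p_d1 = np.prod(np.where(t == 1, sens, 1 - sens))
    p_d0 = np.prod(np.where(t == 1, 1 - spec, spec))
    return prev * p_d1 + (1 - prev) * p_d0

probs = {(a, b): cell_prob(a, b) for a in (0, 1) for b in (0, 1)}
print(probs, sum(probs.values()))  # the four cell probabilities sum to 1
```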

  2. Computed tomography is not justified in every pediatric blunt trauma patient with a suspicious mechanism of injury.

    Science.gov (United States)

    Hershkovitz, Yehuda; Zoarets, Itai; Stepansky, Albert; Kozer, Eran; Shapira, Zahar; Klin, Baruch; Halevy, Ariel; Jeroukhimov, Igor

    2014-07-01

    Computed tomography (CT) has become an important tool for the diagnosis of intra-abdominal and chest injuries in patients with blunt trauma. The role of CT in conscious asymptomatic patients with a suspicious mechanism of injury remains controversial. This controversy intensifies in the management of pediatric blunt trauma patients, who are much more susceptible to radiation exposure. The objective of this study was to evaluate the role of abdominal and chest CT imaging in asymptomatic pediatric patients with a suspicious mechanism of injury. Forty-two pediatric patients up to 15 years old were prospectively enrolled. All patients presented with a suspicious mechanism of blunt trauma and multisystem injury. They were neurologically intact and had no signs of injury to the abdomen or chest. Patients underwent CT imaging of the chest and abdomen as part of the initial evaluation. Thirty-one patients (74%) had a normal CT scan. Two patients of 11 with an abnormal CT scan required a change in management and were referred for observation in the Intensive Care Unit. None of the patients required surgical intervention. The routine use of CT in asymptomatic pediatric patients with a suspicious mechanism of blunt trauma injury is not justified. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Economic modelling of energy services: Rectifying misspecified energy demand functions

    International Nuclear Information System (INIS)

    Hunt, Lester C.; Ryan, David L.

    2015-01-01

    Although it is well known that energy demand is derived, since energy is required not for its own sake but for the energy services it produces – such as heating, lighting, and motive power – energy demand models, both theoretical and empirical, often fail to take account of this feature. In this paper, we highlight the misspecification that results from ignoring this aspect, and its empirical implications – biased estimates of price elasticities and other measures – and provide a relatively simple and empirically practicable way to rectify it, which has a strong theoretical grounding. To do so, we develop an explicit model of consumer behaviour in which utility derives from consumption of energy services rather than from the energy sources that are used to produce them. As we discuss, this approach opens up the possibility of examining many aspects of energy demand in a theoretically sound way that have not previously been considered on a widespread basis, although some existing empirical work could be interpreted as being consistent with this type of specification. While this formulation yields demand equations for energy services rather than for energy or particular energy sources, these are shown to be readily converted, without added complexity, into the standard type of energy demand equation(s) that is (are) typically estimated. The additional terms that the resulting energy demand equations include, compared to those that are typically estimated, highlight the misspecification that is implicit when typical energy demand equations are estimated. A simple solution for dealing with an apparent drawback of this formulation for empirical purposes, namely that information is required on typically unobserved energy efficiency, indicates how energy efficiency can be captured in the model, such as by including exogenous trends and/or including its possible dependence on past energy prices. The approach is illustrated using an empirical example that involves
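
    A stylized version of the resulting estimating equation on synthetic data, with unobserved energy efficiency proxied by a deterministic trend (one of the fixes the paper suggests); the variable names, functional form and coefficients here are assumptions for illustration only.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
T = 200
ln_p = np.cumsum(rng.normal(0, 0.05, T))             # log energy price
ln_y = np.linspace(0, 1, T) + rng.normal(0, 0.02, T) # log income
trend = np.arange(T) / T                             # efficiency proxy
ln_e = 1 - 0.6 * ln_p + 0.8 * ln_y - 0.5 * trend + rng.normal(0, 0.05, T)

# energy demand equation augmented with the efficiency term
X = sm.add_constant(np.column_stack([ln_p, ln_y, trend]))
print(sm.OLS(ln_e, X).fit().params)   # constant, price, income, efficiency trend
```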

  4. The frequency of Tay-Sachs disease causing mutations in the Brazilian Jewish population justifies a carrier screening program

    Directory of Open Access Journals (Sweden)

    Roberto Rozenberg

    Full Text Available CONTEXT: Tay-Sachs disease is an autosomal recessive disease characterized by progressive neurologic degeneration, fatal in early childhood. In the Ashkenazi Jewish population the disease incidence is about 1 in every 3,500 newborns and the carrier frequency is 1 in every 29 individuals. Carrier screening programs for Tay-Sachs disease have reduced disease incidence by 90% in high-risk populations in several countries. The Brazilian Jewish population is estimated at 90,000 individuals. Currently, there is no screening program for Tay-Sachs disease in this population. OBJECTIVE: To evaluate the importance of a Tay-Sachs disease carrier screening program in the Brazilian Jewish population by determining the frequency of heterozygotes and the acceptance of the program by the community. SETTING: Laboratory of Molecular Genetics - Institute of Biosciences - Universidade de São Paulo. PARTICIPANTS: 581 senior students from selected Jewish high schools. PROCEDURE: Molecular analysis of Tay-Sachs disease causing mutations by PCR amplification of genomic DNA, followed by restriction enzyme digestion. RESULTS: Among 581 students that attended educational classes, 404 (70%) elected to be tested for Tay-Sachs disease mutations. Of these, approximately 65% were of Ashkenazi Jewish origin. Eight carriers were detected, corresponding to a carrier frequency of 1 in every 33 individuals in the Ashkenazi Jewish fraction of the sample. CONCLUSION: The frequency of Tay-Sachs disease carriers among the Ashkenazi Jewish population of Brazil is similar to that of other countries where carrier screening programs have led to a significant decrease in disease incidence. Therefore, it is justifiable to implement a Tay-Sachs disease carrier screening program for the Brazilian Jewish population.

  5. The frequency of Tay-Sachs disease causing mutations in the Brazilian Jewish population justifies a carrier screening program.

    Science.gov (United States)

    Rozenberg, R; Pereira, L da V

    2001-07-05

    Tay-Sachs disease is an autosomal recessive disease characterized by progressive neurologic degeneration, fatal in early childhood. In the Ashkenazi Jewish population the disease incidence is about 1 in every 3,500 newborns and the carrier frequency is 1 in every 29 individuals. Carrier screening programs for Tay-Sachs disease have reduced disease incidence by 90% in high-risk populations in several countries. The Brazilian Jewish population is estimated at 90,000 individuals. Currently, there is no screening program for Tay-Sachs disease in this population. To evaluate the importance of a Tay-Sachs disease carrier screening program in the Brazilian Jewish population by determining the frequency of heterozygotes and the acceptance of the program by the community. Laboratory of Molecular Genetics--Institute of Biosciences--Universidade de São Paulo. 581 senior students from selected Jewish high schools. Molecular analysis of Tay-Sachs disease causing mutations by PCR amplification of genomic DNA, followed by restriction enzyme digestion. Among 581 students that attended educational classes, 404 (70%) elected to be tested for Tay-Sachs disease mutations. Of these, approximately 65% were of Ashkenazi Jewish origin. Eight carriers were detected corresponding to a carrier frequency of 1 in every 33 individuals in the Ashkenazi Jewish fraction of the sample. The frequency of Tay-Sachs disease carriers among the Ashkenazi Jewish population of Brazil is similar to that of other countries where carrier screening programs have led to a significant decrease in disease incidence. Therefore, it is justifiable to implement a Tay-Sachs disease carrier screening program for the Brazilian Jewish population.
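
    The reported 1-in-33 figure follows from simple arithmetic on the numbers given in the abstract (treating the roughly 65% Ashkenazi share of the 404 tested students as exact):

```python
tested = 404
ashkenazi = round(tested * 0.65)   # about 263 of the tested students
carriers = 8
print(carriers / ashkenazi)        # ~0.030, i.e. about 1 in 33
```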

  6. Sentinel lymph node biopsy in patients with a needle core biopsy diagnosis of ductal carcinoma in situ: is it justified?

    LENUS (Irish Health Repository)

    Doyle, B

    2012-02-01

    BACKGROUND: The incidence of ductal carcinoma in situ (DCIS) has increased markedly with the introduction of population-based mammographic screening. DCIS is usually diagnosed non-operatively. Although sentinel lymph node biopsy (SNB) has become the standard of care for patients with invasive breast carcinoma, its use in patients with DCIS is controversial. AIM: To examine the justification for offering SNB at the time of primary surgery to patients with a needle core biopsy (NCB) diagnosis of DCIS. METHODS: A retrospective analysis was performed of 145 patients with an NCB diagnosis of DCIS who had SNB performed at the time of primary surgery. The study focused on rates of SNB positivity and underestimation of invasive carcinoma by NCB, and sought to identify factors that might predict the presence of invasive carcinoma in the excision specimen. RESULTS: 7/145 patients (4.8%) had a positive sentinel lymph node, four macrometastases and three micrometastases. 6/7 patients had invasive carcinoma in the final excision specimen. 55/145 patients (37.9%) with an NCB diagnosis of DCIS had invasive carcinoma in the excision specimen. The median invasive tumour size was 6 mm. A radiological mass and areas of invasion <1 mm, amounting to "at least microinvasion" on NCB, were predictive of invasive carcinoma in the excision specimen. CONCLUSIONS: SNB positivity in pure DCIS is rare. In view of the high rate of underestimation of invasive carcinoma in patients with an NCB diagnosis of DCIS in this study, SNB appears justified in this group of patients.

  7. Validation of image quality in full-field digital mammography: Is the replacement of wet by dry laser printers justified?

    International Nuclear Information System (INIS)

    Schueller, Gerd; Kaindl, Elisabeth; Langenberger, Herbert; Stadler, Alfred; Schueller-Weidekamm, Claudia; Semturs, Friedrich; Helbich, Thomas H.

    2007-01-01

    Objective: Dry laser printers have replaced wet laser printers to produce hard copies of high-resolution digital images, primarily because of environmental concerns. However, no scientific research data have been published that compare the image quality of dry and wet laser printers in full-field digital mammography (FFDM). This study examines the image quality of these printers. Materials and methods: Objective image quality parameters of both printers were evaluated using a standardized printer test image, i.e., optical density and detectability of specific image elements (lines, curves, and shapes). Furthermore, mammograms of 129 patients with different breast tissue composition patterns were imaged with both printers. A total of 1806 subjective image quality parameters (brightness, contrast, and detail detection of anatomic structures), the detectability of breast lesions, and diagnostic performance according to the BI-RADS classification were evaluated. In addition, the presence of film artifacts was investigated. Results: Optical density values were equal for the dry and the wet laser printer. Detection of specific image elements on the printer test image did not differ. Ratings of subjective image quality parameters were equal, as were the detectability of breast lesions and the diagnostic performance. Dry laser printer images showed more artifacts (164 versus 27); however, these artifacts did not influence image quality. Conclusion: Based on objective and subjective parameters, a dry laser printer matches the image quality of a wet laser printer in FFDM. The replacement of wet by dry laser printers in FFDM is therefore justified, and not only for reasons of environmental preference

  8. An individual-based probabilistic model for simulating fisheries population dynamics

    Directory of Open Access Journals (Sweden)

    Jie Cao

    2016-12-01

    Full Text Available The purpose of stock assessment is to support managers in making informed decisions about removals from fish populations. Errors in assessment models may have devastating impacts on population fitness and negative impacts on the economy of the resource users. Thus, accurate estimates of population size and growth rates are critical for success. Evaluating and testing the behavior and performance of stock assessment models, and assessing the consequences of model misspecification and the impact of management strategies, requires an operating model that accurately describes the dynamics of the target species and can resolve spatial and seasonal changes. In addition, the most thorough evaluations of assessment models use an operating model that takes a different form than the assessment model. This paper presents an individual-based probabilistic model used to simulate the complex dynamics of populations and their associated fisheries. Various components of population dynamics are expressed as random Bernoulli trials in the model, and detailed life and fishery histories of each individual are tracked over their life span. The simulation model is designed to be flexible so it can be used for different species and fisheries. It can simulate mixing among multiple stocks and link stock-recruit relationships to environmental factors. Furthermore, the model allows for flexibility in sub-models (e.g., growth and recruitment) and model assumptions (e.g., age- or size-dependent selectivity). This model enables the user to conduct various simulation studies, including testing the performance of assessment models under different assumptions, assessing the impacts of model misspecification and evaluating management strategies.
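
    A toy version of the Bernoulli-trial bookkeeping described above, with capture and natural mortality drawn per individual each year; all rates and the growth rule are invented, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(n0=1000, years=10, p_nat=0.15, p_fish=0.10):
    alive = np.ones(n0, dtype=bool)
    length = rng.normal(20, 3, n0)                  # cm at recruitment
    history = []
    for _ in range(years):
        caught = alive & (rng.random(n0) < p_fish)  # Bernoulli capture trial
        died = alive & ~caught & (rng.random(n0) < p_nat)  # natural mortality
        alive &= ~(caught | died)
        length[alive] += rng.gamma(2.0, 1.0, alive.sum())  # annual growth draw
        history.append((alive.sum(), caught.sum()))
    return history

for yr, (n, c) in enumerate(simulate(), 1):
    print(yr, n, c)   # survivors and catch per year
```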

  9. Information matrix estimation procedures for cognitive diagnostic models.

    Science.gov (United States)

    Liu, Yanlou; Xin, Tao; Andersson, Björn; Tian, Wei

    2018-03-06

    Two new methods to estimate the asymptotic covariance matrix for marginal maximum likelihood estimation of cognitive diagnosis models (CDMs), the inverse of the observed information matrix and the sandwich-type estimator, are introduced. Unlike several previous covariance matrix estimators, the new methods take into account both the item and structural parameters. The relationships between the observed information matrix, the empirical cross-product information matrix, the sandwich-type covariance matrix and the two approaches proposed by de la Torre (2009, J. Educ. Behav. Stat., 34, 115) are discussed. Simulation results show that, for a correctly specified CDM and Q-matrix or with a slightly misspecified probability model, the observed information matrix and the sandwich-type covariance matrix exhibit good performance with respect to providing consistent standard errors of item parameter estimates. However, with substantial model misspecification only the sandwich-type covariance matrix exhibits robust performance. © 2018 The British Psychological Society.
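
    Generically, the sandwich-type estimator wraps the score cross-product ("filling") in inverse information matrices ("bread"); the sketch below shows only this algebra on placeholder inputs, not the CDM-specific score and Hessian computations of the paper.

```python
import numpy as np

def sandwich(scores, hess):
    # scores: n x p per-subject score vectors at the MLE;
    # hess: p x p observed information (negative log-likelihood Hessian).
    n = scores.shape[0]
    A = hess / n                      # average information ("bread")
    B = scores.T @ scores / n         # empirical cross-product ("filling")
    Ainv = np.linalg.inv(A)
    return Ainv @ B @ Ainv / n        # robust covariance of parameter estimates

# Placeholder inputs purely to exercise the algebra:
rng = np.random.default_rng(4)
scores = rng.normal(size=(500, 3))
hess = scores.T @ scores
print(np.sqrt(np.diag(sandwich(scores, hess))))  # robust standard errors
```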

  10. What justifies a hospital admission at the end of life? A focus group study on perspectives of family physicians and nurses

    NARCIS (Netherlands)

    Reyniers, T.; Houttekieri, D.; Cohen, J.; Pasman, H.R.; Deliens, L.

    2014-01-01

    Background: Despite a majority preferring not to die in hospital and health policies aimed at increasing home death, the proportion of hospital deaths remains high. Gaining insight into professional caregivers' perspectives on what justifies such admissions could be helpful in understanding the persistently

  11. 32 CFR 37.560 - Must I be able to estimate project expenditures precisely in order to justify use of a fixed...

    Science.gov (United States)

    2010-07-01

    32 National Defense 1 2010-07-01 false. Must I be able to estimate project expenditures precisely in order to justify use of a fixed-support TIA? ... For purposes of this illustration, let that minimum recipient cost sharing be 40% of the total project costs...

  12. Simple, Efficient Estimators of Treatment Effects in Randomized Trials Using Generalized Linear Models to Leverage Baseline Variables

    Science.gov (United States)

    Rosenblum, Michael; van der Laan, Mark J.

    2010-01-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation. PMID:20628636
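
    A quick synthetic check of the Poisson special case described here: the working model omits the true quadratic term, yet the fitted treatment coefficient still targets the marginal log rate ratio (0.4 in this construction, since treatment is randomized). Data and parameters are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 5000
a = rng.integers(0, 2, n)                  # randomized treatment
x = rng.normal(size=n)                     # baseline covariate
mu = np.exp(-1 + 0.4 * a + 0.3 * x ** 2)   # true model is NOT main-terms
y = rng.poisson(mu)

X = sm.add_constant(np.column_stack([a, x]))   # misspecified main-terms model
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print("marginal log rate ratio estimate:", fit.params[1])  # close to 0.4
```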

  13. Simple, efficient estimators of treatment effects in randomized trials using generalized linear models to leverage baseline variables.

    Science.gov (United States)

    Rosenblum, Michael; van der Laan, Mark J

    2010-04-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation.

  14. In lung cancer patients where a malignant pleural effusion is found at operation could resection ever still be justified?

    Science.gov (United States)

    Fiorelli, Alfonso; Santini, Mario

    2013-08-01

    A best evidence topic in thoracic surgery was written according to a structured protocol. The question addressed was whether surgery could ever be justified in non-small cell lung cancer patients with an unexpected malignant pleural effusion found at surgery. Eight papers were chosen to answer the question. The authors, journal, date and country of publication, patient group studied, study type, relevant outcomes and results of these papers were tabulated. Study limitations included the lack of prospective studies, the heterogeneous patient population and the various treatments applied. Three papers found that surgery--compared to exploratory thoracotomy--was associated with a survival advantage in cases of minimal pleural disease. One paper showed that the median survival time of 58.8 months in patients with pleural effusion was better than that of patients with more extensive pleural dissemination such as pleural nodules (10 months; P=0.0001) or pleural nodules with effusion (19.3 months; P=0.019). Another study showed that pleural effusion patients with N0-1 status had a median survival time more than 5 years longer than patients with similar or more extensive pleural dissemination but with N2-N3 status. A further study showed a better 5-year survival rate in patients with pleural effusion than in patients with pleural nodules (22.9% vs 8.9%, respectively; P=0.45). In two papers, surgery was associated with better survival than exploratory thoracotomy in cases of N0 status and of complete tumour resection, independently of pleural dissemination. Different strategies were employed to obtain freedom from macroscopic residual tumour, including pneumonectomy, lobar resection or, to a lesser extent, pleurectomy in patients with pleural dissemination. Only one paper reported a worse median survival time after pneumonectomy than after more limited resections (12.8 vs 24.1 months, respectively; P=0.0018). In the remaining papers, no comparison between the different resections was made. In all studies

  15. Brand Cigarillos: Low Price but High Particulate Matter Levels-Is Their Favorable Taxation in the European Union Justified?

    Science.gov (United States)

    Wasel, Julia; Boll, Michael; Schulze, Michaela; Mueller, Daniel; Bundschuh, Matthias; Groneberg, David A; Gerber, Alexander

    2015-08-06

    taxation of cigarillos is not justifiable.

  16. Modelling Conditional and Unconditional Heteroskedasticity with Smoothly Time-Varying Structure

    DEFF Research Database (Denmark)

    Amado, Christina; Teräsvirta, Timo

    In this paper, we propose two parametric alternatives to the standard GARCH model. They allow the conditional variance to have a smooth time-varying structure of either additive or multiplicative type. The suggested parameterizations describe both nonlinearity and structural change in the conditional and unconditional variances where the transition between regimes over time is smooth. A modelling strategy for these new time-varying parameter GARCH models is developed. It relies on a sequence of Lagrange multiplier tests, and the adequacy of the estimated models is investigated by Lagrange multiplier type misspecification tests. Finite-sample properties of these procedures and tests are examined by simulation. An empirical application to daily stock returns and another one to daily exchange rate returns illustrate the functioning and properties of our modelling strategy in practice.
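
    A minimal simulation of the multiplicative variant, in which the conditional variance is a GARCH(1,1) component multiplied by a smooth logistic function of rescaled time; all parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)
T, omega, alpha, beta = 2000, 0.05, 0.08, 0.90

def g(u, delta=1.0, gamma=10.0, c=0.5):
    # smooth upward shift in the unconditional variance around mid-sample
    return 1.0 + delta / (1.0 + np.exp(-gamma * (u - c)))

h = omega / (1 - alpha - beta)   # start GARCH component at its mean
eps = np.empty(T)
for t in range(T):
    sigma2 = g(t / T) * h                          # multiplicative decomposition
    eps[t] = np.sqrt(sigma2) * rng.standard_normal()
    h = omega + alpha * (eps[t] ** 2 / g(t / T)) + beta * h
print(eps[:5])
```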

  17. Can the benefits of physical seabed restoration justify the costs? An assessment of a disused aggregate extraction site off the Thames Estuary, UK.

    Science.gov (United States)

    Cooper, Keith; Burdon, Daryl; Atkins, Jonathan P; Weiss, Laura; Somerfield, Paul; Elliott, Michael; Turner, Kerry; Ware, Suzanne; Vivian, Chris

    2013-10-15

    Physical and biological seabed impacts can persist long after the cessation of marine aggregate dredging. Whilst small-scale experimental studies have shown that it may be possible to mitigate such impacts, it is unclear whether the costs of restoration are justified on an industrial scale. Here we explore this question using a case study off the Thames Estuary, UK. By understanding the nature and scale of persistent impacts, we identify possible techniques to restore the physical properties of the seabed, and the costs and the likelihood of success. An analysis of the ecosystem services and goods/benefits produced by the site is used to determine whether intervention is justified. Whilst a comparison of costs and benefits at this site suggests restoration would not be warranted, the analysis is site-specific. We emphasise the need to better define what is, and is not, an acceptable seabed condition post-dredging. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  18. On deciding to have a lobotomy: either lobotomies were justified or decisions under risk should not always seek to maximise expected utility.

    Science.gov (United States)

    Cooper, Rachel

    2014-02-01

    In the 1940s and 1950s thousands of lobotomies were performed on people with mental disorders. These operations were known to be dangerous, but thought to offer great hope. Nowadays, the lobotomies of the 1940s and 1950s are widely condemned. The consensus is that the practitioners who employed them were, at best, misguided enthusiasts or, at worst, evil. In this paper I employ standard decision theory to understand and assess shifts in the evaluation of lobotomy. Textbooks of medical decision making generally recommend that decisions under risk be made so as to maximise expected utility (MEU). I show that using this procedure suggests that the 1940s and 1950s practice of psychosurgery was justifiable. In making sense of this finding we have a choice: either we can accept that psychosurgery was justified, in which case condemnation of the lobotomists is misplaced; or we can conclude that the use of formal decision procedures, such as MEU, is problematic.

  19. How to define and build an effective cyber threat intelligence capability how to understand, justify and implement a new approach to security

    CERN Document Server

    Dalziel, Henry; Carnall, James

    2014-01-01

    Intelligence-Led Security: How to Understand, Justify and Implement a New Approach to Security is a concise review of the concept of Intelligence-Led Security. Protecting a business, including its information and intellectual property, physical infrastructure, employees, and reputation, has become increasingly difficult. Online threats come from all sides: internal leaks and external adversaries; domestic hacktivists and overseas cybercrime syndicates; targeted threats and mass attacks. And these threats run the gamut from targeted to indiscriminate to entirely accidental. Amo

  20. How can interventions for inhabitants be justified after a nuclear accident? An approach based on the radiological protection system of the international commission on radiological protection

    International Nuclear Information System (INIS)

    Takahara, Shogo; Homma, Toshimitsu; Yoneda, Minoru; Shimada, Yoko

    2016-01-01

    Management of radiation-induced risks in areas contaminated by a nuclear accident is characterized by three ethical issues: (1) risk trade-offs, (2) paternalistic intervention and (3) individualization of responsibilities. To deal with these issues and to clarify the requirements for justifying interventions aimed at reducing radiation-induced risks, we explored the ethical basis of the radiological protection system of the International Commission on Radiological Protection (ICRP). The ICRP's radiological protection system is built on three strands of normative ethics: utilitarianism, deontology and virtue ethics. The three ethical issues can be resolved within a decision-making framework that combines these ethical theories. In addition, interventions for inhabitants can be justified in two ways. First, when the dangers are severe and far-reaching, interventions could be justified with a sufficient explanation of the nature of the harmful effects (or beneficial consequences). Second, if the autonomy of the individuals subject to intervention can be promoted, those interventions could be justified. (author)

  1. How can health care organisations make and justify decisions about risk reduction? Lessons from a cross-industry review and a health care stakeholder consensus development process

    International Nuclear Information System (INIS)

    Sujan, Mark A.; Habli, Ibrahim; Kelly, Tim P.; Gühnemann, Astrid; Pozzi, Simone; Johnson, Christopher W.

    2017-01-01

    Interventions to reduce risk often have an associated cost. In UK industries decisions about risk reduction are made and justified within a shared regulatory framework that requires that risk be reduced as low as reasonably practicable. In health care no such regulatory framework exists, and the practice of making decisions about risk reduction is varied and lacks transparency. Can health care organisations learn from relevant industry experiences about making and justifying risk reduction decisions? This paper presents lessons from a qualitative study undertaken with 21 participants from five industries about how such decisions are made and justified in UK industry. Recommendations were developed based on a consensus development exercise undertaken with 20 health care stakeholders. The paper argues that there is a need in health care to develop a regulatory framework and an agreed process for managing explicitly the trade-off between risk reduction and cost. The framework should include guidance about a health care specific notion of acceptable levels of risk, guidance about standardised risk reduction interventions, it should include regulatory incentives for health care organisations to reduce risk, and it should encourage the adoption of an approach for documenting explicitly an organisation's risk position. - Highlights: • Empirical description of industry perceptions on making risk reduction decisions. • Health care consensus development identified five recommendations. • Risk concept should be better integrated into safety management. • Education and awareness about risk concept are required. • Health systems need to start a dialogue about acceptable levels of risk.

  2. Adjusting for Confounding in Early Postlaunch Settings: Going Beyond Logistic Regression Models.

    Science.gov (United States)

    Schmidt, Amand F; Klungel, Olaf H; Groenwold, Rolf H H

    2016-01-01

    Postlaunch data on medical treatments can be analyzed to explore adverse events or relative effectiveness in real-life settings. These analyses are often complicated by the number of potential confounders and the possibility of model misspecification. We conducted a simulation study to compare the performance of logistic regression, propensity score, disease risk score, and stabilized inverse probability weighting methods to adjust for confounding. Model misspecification was induced in the independent derivation dataset. We evaluated performance using relative bias and confidence interval coverage of the true effect, among other metrics. At low events per coefficient (1.0 and 0.5), the logistic regression estimates had a large relative bias (greater than -100%). Bias of the disease risk score estimates was at most 13.48% and 18.83%. For the propensity score model, this was 8.74% and >100%, respectively. At events per coefficient of 1.0 and 0.5, inverse probability weighting frequently failed or reduced to a crude regression, resulting in biases of -8.49% and 24.55%. Coverage of logistic regression estimates became less than the nominal level at events per coefficient ≤5. For the disease risk score, inverse probability weighting, and propensity score, coverage became less than nominal at events per coefficient ≤2.5, ≤1.0, and ≤1.0, respectively. Bias of misspecified disease risk score models was 16.55%. In settings with low events/exposed subjects per coefficient, disease risk score methods can be useful alternatives to logistic regression models, especially when propensity score models cannot be used. Despite better performance of disease risk score methods than logistic regression and propensity score models in small events-per-coefficient settings, bias and coverage still deviated from nominal.
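
    As one concrete example of the compared methods, here is a stabilized inverse-probability-weighting sketch on synthetic data (a generic illustration, not the authors' simulation design; the weights are passed as likelihood weights via freq_weights).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 2000
x = rng.normal(size=n)                        # confounder
a = rng.binomial(1, 1 / (1 + np.exp(-0.5 * x)))          # exposure
y = rng.binomial(1, 1 / (1 + np.exp(-(-1 + 0.7 * a + 0.8 * x))))  # outcome

# exposure (propensity) model, then stabilized weights
ps = sm.GLM(a, sm.add_constant(x), family=sm.families.Binomial()).fit().fittedvalues
sw = np.where(a == 1, a.mean() / ps, (1 - a.mean()) / (1 - ps))

# weighted marginal outcome model
msm = sm.GLM(y, sm.add_constant(a), family=sm.families.Binomial(),
             freq_weights=sw).fit()
print("marginal log odds ratio:", msm.params[1])
```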

  3. [Hemolytic disease of the newborn has not vanished from Finland--routine protection of RhD negative mothers during pregnancy is justifiable].

    Science.gov (United States)

    Sainio, Susanna; Kuosmanen, Malla

    2012-01-01

    Prophylaxis of RhD negative mothers with anti-D immunoglobulin after childbirth is the most important procedure reducing the immunization of the mother and the risk of severe hemolytic disease of the newborn. In spite of this, anti-D antibodies having relevance to pregnancy are later detected in 1.8% of RhD negative mothers. Half of these cases could be prevented by routine anti-D prophylaxis given to the mothers during weeks 28 to 34 of pregnancy. Convincing evidence of the effectiveness of this measure has accumulated in the last few years, and application of the treatment is justified also in Finland.

  4. On adjustment for auxiliary covariates in additive hazard models for the analysis of randomized experiments

    DEFF Research Database (Denmark)

    Vansteelandt, S.; Martinussen, Torben; Tchetgen, E. J Tchetgen

    2014-01-01

    We consider additive hazard models (Aalen, 1989) for the effect of a randomized treatment on a survival outcome, adjusting for auxiliary baseline covariates. We demonstrate that the Aalen least-squares estimator of the treatment effect parameter is asymptotically unbiased, even when the hazard's dependence on time or on the auxiliary covariates is misspecified, and even away from the null hypothesis of no treatment effect. We furthermore show that adjustment for auxiliary baseline covariates does not change the asymptotic variance of the estimator of the effect of a randomized treatment. We conclude that, in view of its robustness against model misspecification, Aalen least-squares estimation is attractive for evaluating treatment effects on a survival outcome in randomized experiments, and the primary reasons to consider baseline covariate adjustment in such settings could be interest in subgroup...
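
    An Aalen additive hazard fit of this kind can be sketched with the lifelines library on synthetic randomized data; the data-generating values are invented, and if your lifelines version exposes a different API the call may need adapting.

```python
import numpy as np
import pandas as pd
from lifelines import AalenAdditiveFitter

rng = np.random.default_rng(8)
n = 1000
treat = rng.integers(0, 2, n)
x = rng.normal(size=n)
haz = 0.10 + 0.05 * treat + 0.02 * x.clip(0)      # additive hazard (constant in time)
T = rng.exponential(1 / np.maximum(haz, 1e-6))    # event times
event = T < 10                                    # administrative censoring at t = 10
df = pd.DataFrame({"T": np.minimum(T, 10), "E": event.astype(int),
                   "treat": treat, "x": x})

aaf = AalenAdditiveFitter(coef_penalizer=0.1)
aaf.fit(df, duration_col="T", event_col="E")
print(aaf.cumulative_hazards_.tail(1))   # cumulative regression functions at the end
```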

  5. Are more restrictive food cadmium standards justifiable health safety measures or opportunistic barriers to trade? An answer from economics and public health

    International Nuclear Information System (INIS)

    Figueroa B, Eugenio

    2008-01-01

    In the past, Cd regulations have imposed trade restrictions on foodstuffs from some developing countries seeking access to markets in the developed world, and in recent years there has been a trend towards imposing more rigorous standards. This trend seems to respond more to public- and private-sector strategies in some developed countries to create disguised barriers to trade and to improve market competitiveness for their industries than to scientifically justified health precautions (sanitary and phytosanitary measures) and/or technical barriers to trade acceptable under the Uruguay Round Agreement of the WTO. Applying more rigorous Cd standards in some developed countries will not only increase production costs in developing countries but will also have a large impact on economies highly dependent on international agricultural markets. In the current literature there are large uncertainties in the cause-effect relationship between current levels of Cd intake and eventual health effects in human beings; even the risk of Cd to kidney function is under considerable debate. Recent work on the importance of the zinc:Cd ratio, rather than Cd levels alone, in determining Cd risk factors, on the one hand, and on the declining trends of Cd levels in foods and soils, on the other, also indicates a lack of scientific evidence justifying more restrictive cadmium standards. This shows that developing countries should press for changes to, and greater transparency in, the current international structures and procedures for setting sanitary and phytosanitary measures and technical barriers to trade

  6. Are more restrictive food cadmium standards justifiable health safety measures or opportunistic barriers to trade? An answer from economics and public health.

    Science.gov (United States)

    Figueroa B, Eugenio

    2008-01-15

    In the past, Cd regulations have imposed trade restrictions on foodstuffs from some developing countries seeking access to markets in the developed world, and in recent years there has been a trend towards imposing more rigorous standards. This trend seems to respond more to public- and private-sector strategies in some developed countries to create disguised barriers to trade and to improve market competitiveness for their industries than to scientifically justified health precautions (sanitary and phytosanitary measures) and/or technical barriers to trade acceptable under the Uruguay Round Agreement of the WTO. Applying more rigorous Cd standards in some developed countries will not only increase production costs in developing countries but will also have a large impact on economies highly dependent on international agricultural markets. In the current literature there are large uncertainties in the cause-effect relationship between current levels of Cd intake and eventual health effects in human beings; even the risk of Cd to kidney function is under considerable debate. Recent work on the importance of the zinc:Cd ratio, rather than Cd levels alone, in determining Cd risk factors, on the one hand, and on the declining trends of Cd levels in foods and soils, on the other, also indicates a lack of scientific evidence justifying more restrictive cadmium standards. This shows that developing countries should press for changes to, and greater transparency in, the current international structures and procedures for setting sanitary and phytosanitary measures and technical barriers to trade.

  7. Current Evidence to Justify, and the Methodological Considerations for a Randomised Controlled Trial Testing the Hypothesis that Statins Prevent the Malignant Progression of Barrett's Oesophagus

    Directory of Open Access Journals (Sweden)

    David Thurtle

    2014-12-01

    Full Text Available Barrett’s oesophagus is the predominant risk factor for oesophageal adenocarcinoma, a cancer whose incidence is increasing and which has a poor prognosis. This article reviews the latest experimental and epidemiological evidence justifying the development of a randomised controlled trial investigating the hypothesis that statins prevent the malignant progression of Barrett’s oesophagus, and explores the methodological considerations for such a trial. The experimental evidence suggests anti-carcinogenic properties of statins on oesophageal cancer cell lines, based on the inhibition of the mevalonate pathway and the production of pro-apoptotic proteins. The epidemiological evidence reports inverse associations between statin use and the incidence of oesophageal carcinoma in both general population and Barrett’s oesophagus cohorts. Such a randomised controlled trial would be a large multi-centre trial, probably investigating simvastatin, given the wide clinical experience with this drug, relatively low side-effect profile and low financial cost. As with any clinical trial, high adherence is important, which could be increased with therapy, patient, doctor and system-focussed interventions. We would suggest there is now sufficient evidence to justify a full clinical trial that attempts to prevent this aggressive cancer in a high-risk population.

  8. Relative efficiency of joint-model and full-conditional-specification multiple imputation when conditional models are compatible: The general location model.

    Science.gov (United States)

    Seaman, Shaun R; Hughes, Rachael A

    2018-06-01

    Estimating the parameters of a regression model of interest is complicated by missing data on the variables in that model. Multiple imputation is commonly used to handle these missing data. Joint model multiple imputation and full-conditional-specification multiple imputation are known to yield imputed data with the same asymptotic distribution when the conditional models of full-conditional specification are compatible with that joint model. We show that this asymptotic equivalence of imputation distributions does not imply that joint model multiple imputation and full-conditional-specification multiple imputation will also yield asymptotically equally efficient inference about the parameters of the model of interest, nor that they will be equally robust to misspecification of the joint model. When the conditional models used by full-conditional-specification multiple imputation are linear, logistic and multinomial regressions, these are compatible with a restricted general location joint model. We show that multiple imputation using the restricted general location joint model can be substantially more asymptotically efficient than full-conditional-specification multiple imputation, but this typically requires very strong associations between variables. When associations are weaker, the efficiency gain is small. Moreover, full-conditional-specification multiple imputation is shown to be potentially much more robust than joint model multiple imputation using the restricted general location model to misspecification of that model when there is substantial missingness in the outcome variable.
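
    A chained-equations (FCS-style) imputation sketch on synthetic data; scikit-learn's IterativeImputer stands in here for dedicated FCS multiple-imputation software, and the association strength and missingness rate are invented.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(9)
n = 500
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(scale=0.6, size=n)   # moderately associated variable
X = np.column_stack([x1, x2])
X[rng.random(n) < 0.3, 1] = np.nan              # 30% missing in x2

# m = 5 imputed datasets, drawing imputations from a posterior predictive
imputations = [IterativeImputer(sample_posterior=True, random_state=m)
               .fit_transform(X) for m in range(5)]
print(np.mean([imp[:, 1].mean() for imp in imputations]))  # pooled mean of x2
```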

  9. Simulation for Teaching Orthopaedic Residents in a Competency-based Curriculum: Do the Benefits Justify the Increased Costs?

    Science.gov (United States)

    Nousiainen, Markku T; McQueen, Sydney A; Ferguson, Peter; Alman, Benjamin; Kraemer, William; Safir, Oleg; Reznick, Richard; Sonnadara, Ranil

    2016-04-01

    Although simulation-based training is becoming widespread in surgical education and research supports its use, one major limitation is cost. Until now, little has been published on the costs of simulation in residency training. At the University of Toronto, a novel competency-based curriculum in orthopaedic surgery has been implemented for training selected residents, which makes extensive use of simulation. Despite the benefits of this intensive approach to simulation, there is a need to consider its financial implications and demands on faculty time. This study presents a cost and faculty work-hours analysis of implementing simulation as a teaching and evaluation tool in the University of Toronto's novel competency-based curriculum program compared with the historic costs of using simulation in the residency training program. All invoices for simulation training were reviewed to determine the financial costs before and after implementation of the competency-based curriculum. Invoice items included costs for cadavers, artificial models, skills laboratory labor, associated materials, and standardized patients. Costs related to the surgical skills laboratory rental fees and orthopaedic implants were waived as a result of special arrangements with the skills laboratory and implant vendors. Although faculty time was not reimbursed, faculty hours dedicated to simulation were also evaluated. The academic year of 2008 to 2009 was chosen to represent an academic year that preceded the introduction of the competency-based curriculum. During this year, 12 residents used simulation for teaching. The academic year of 2010 to 2011 was chosen to represent an academic year when the competency-based curriculum training program was functioning parallel but separate from the regular stream of training. In this year, six residents used simulation for teaching and assessment. The academic year of 2012 to 2013 was chosen to represent an academic year when simulation was used equally

  10. Outcome and survival of patients aged 75 years and older compared to younger patients after ruptured abdominal aortic aneurysm repair: do the results justify the effort?

    DEFF Research Database (Denmark)

    Shahidi, S; Schroeder, T Veith; Carstensen, M.

    2009-01-01

    We evaluated early mortality (preoperative variables that may be predictive of 30-day mortality in elderly patients compared to younger patients after emergency open repair of ruptured abdominal aortic aneurysm (RAAA). The survey is a retrospective analysis based...... patients compared to the younger group. Between the survivors of the two groups, there were no significant differences in the total length of stay (LOS) and the LOS in the intensive care unit. Advanced age (≥75) and the combination of this advanced age and serum creatinine of ≥0.150 mmol/L were...... the only significant (p preoperative risk factors in our single-center study. However, we believe that treatment for RAAA can be justified in elderly patients. In our experience, surgical open repair has been life-saving in 33% of patients aged 75 years and older, at a relatively low price for each...

  11. Had We But World Enough, and Time... But We Don't!: Justifying the Thermodynamic and Infinite-Time Limits in Statistical Mechanics

    Science.gov (United States)

    Palacios, Patricia

    2018-05-01

    In this paper, I compare the use of the thermodynamic limit in the theory of phase transitions with the infinite-time limit in the explanation of equilibrium statistical mechanics. In the case of phase transitions, I will argue that the thermodynamic limit can be justified pragmatically since the limit behavior (i) also arises before we get to the limit and (ii) for values of N that are physically significant. However, I will contend that the justification of the infinite-time limit is less straightforward. In fact, I will point out that even in cases where one can recover the limit behavior for finite t, i.e. before we get to the limit, one cannot recover this behavior for realistic time scales. I will claim that this leads us to reconsider the role that the rate of convergence plays in the justification of infinite limits and calls for a revision of the so-called Butterfield's principle.

  12. For Better or Worse? System-Justifying Beliefs in Sixth-Grade Predict Trajectories of Self-Esteem and Behavior Across Early Adolescence.

    Science.gov (United States)

    Godfrey, Erin B; Santos, Carlos E; Burson, Esther

    2017-06-19

    Scholars call for more attention to how marginalization influences the development of low-income and racial/ethnic minority youth and emphasize the importance of youth's subjective perceptions of contexts. This study examines how beliefs about the fairness of the American system (system justification) in sixth grade influence trajectories of self-esteem and behavior among 257 early adolescents (average age 11.4) from a diverse, low-income, middle school in an urban southwestern city. System justification was associated with higher self-esteem, less delinquent behavior, and better classroom behavior in sixth grade but worse trajectories of these outcomes from sixth to eighth grade. These findings provide novel evidence that system-justifying beliefs undermine the well-being of marginalized youth and that early adolescence is a critical developmental period for this process. © 2017 The Authors. Child Development © 2017 Society for Research in Child Development, Inc.

  13. The EU Seal Products Ban – Why Ineffective Animal Welfare Protection Cannot Justify Trade Restrictions under European and International Trade Law

    Directory of Open Access Journals (Sweden)

    Martin Hennig

    2015-03-01

    In this article, the author questions the legitimacy of the general ban on trade in seal products adopted by the European Union. It is submitted that the EU Seal Regime, which permits the marketing of Greenlandic seal products derived from Inuit hunts, but excludes Canadian and Norwegian seal products from the European market, does not ensure a satisfactory degree of animal welfare protection in order to justify the comprehensive trade restriction in place. It is argued that the current ineffective EU ban on seal products, which according to the WTO Appellate Body cannot be reconciled with the objective of protecting animal welfare, has no legal basis in EU Treaties and should be annulled.

  15. Attitudes justifying domestic violence predict endorsement of corporal punishment and physical and psychological aggression towards children: a study in 25 low- and middle-income countries.

    Science.gov (United States)

    Lansford, Jennifer E; Deater-Deckard, Kirby; Bornstein, Marc H; Putnick, Diane L; Bradley, Robert H

    2014-05-01

    The Convention on the Rights of the Child has prompted countries to protect children from abuse and exploitation. Exposure to domestic violence and corporal punishment are risk factors in children's development. This study investigated how women's attitudes about domestic violence are related to attitudes about corporal punishment and harsh behaviors toward children, and whether country-wide norms regarding domestic violence and corporal punishment are related to psychological aggression and physical violence toward children. Data were drawn from the Multiple Indicator Cluster Survey, a nationally representative and internationally comparable household survey developed by the United Nations Children's Fund. Measures of domestic violence and discipline were completed by 85 999 female caregivers of children between the ages of 2 and 14 years from families in 25 low- and middle-income countries. Mothers who believed that husbands were justified in hitting their wives were more likely to believe that corporal punishment is necessary to rear children. Mothers who believed that husbands were justified in hitting their wives and that corporal punishment is necessary to rear children were more likely to report that their child had experienced psychological aggression and physical violence. Countrywide norms regarding the acceptability of husbands hitting wives and advisability of corporal punishment moderated the links between mothers' attitudes and their behaviors toward children. Pediatricians can address parents' psychological aggression and physical violence toward children by discussing parents' attitudes and behaviors within a framework that incorporates social norms regarding the acceptability of domestic violence and corporal punishment. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. HOW TO JUSTIFY AUTOMATION PROJECTS

    OpenAIRE

    Velásquez C., José

    2014-01-01

    This article deals with the development of an automation project. Important aspects of its financial advantages are shown, with the purpose of understanding the savings that can be achieved in a variety of areas within an enterprise, such as security, quality, marketing and logistics.

  17. The Ends Justify the Memes

    OpenAIRE

    Miller, Ian D.; Cupchik, Gerald C.

    2016-01-01

    This talk presents an update on my research into memes. It begins with an introduction to memes that is suitable for any audience. It concludes with a detailed description of human research and simulation results that converge with one another. I also present a short online study on email forwarding chains.

  18. Wound healing in cell studies and animal model experiments by low level laser therapy: were clinical studies justified? A systematic review

    NARCIS (Netherlands)

    Lucas, C.; Criens-Poublon, L. J.; Cockrell, C. T.; de Haan, R. J.

    2002-01-01

    Based on results of cell studies and animal experiments, clinical trials with Low Level Laser Therapy (LLLT) were performed, which ultimately did not demonstrate a beneficial effect on the outcome of wound healing. The aim of this study was to investigate whether the evidence from cell studies and animal

  19. There are calls for a national screening programme for prostate cancer: what is the evidence to justify such a national screening programme?

    Science.gov (United States)

    Green, A; Tait, C; Aboumarzouk, O; Somani, B K; Cohen, N P

    2013-05-01

    Prostate cancer is the commonest cancer in men and a major health issue worldwide. Screening for early disease has been available for many years, but there is still no national screening programme established in the United Kingdom. Our aim was to assess the latest evidence regarding prostate cancer screening and whether it meets the necessary requirements to be established as a national programme for all men. Electronic databases and library catalogues were searched electronically and manual retrieval was performed. Only primary research results were used for the analysis. In recent years, several important randomised controlled trials have produced varied outcomes. In Europe, the largest study thus far concluded that screening reduced prostate cancer mortality by 20%. By contrast, a large American trial found no reduction in mortality after 7-10 years of follow-up. Most studies comment on the adverse effects of screening, principally overdiagnosis and subsequent overtreatment. Further information about the natural history of prostate cancer and the accuracy of screening is needed before a screening programme can be truly justified. In the interim, doctors and patients should discuss the risks, benefits and sequelae of taking part in voluntary screening for prostate cancer.

  20. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.

  1. “This is not a burning issue for me”: How citizens justify their use of wood heaters in a city with a severe air pollution problem

    International Nuclear Information System (INIS)

    Reeve, Ian; Scott, John; Hine, Donald W.; Bhullar, Navjot

    2013-01-01

    Although wood smoke pollution has been linked to health problems, wood burning remains a popular form of domestic heating in many countries across the world. In this paper, we describe the rhetoric of resistance to wood heater regulation amongst citizens in the regional Australian town of Armidale, where wood smoke levels regularly exceed national health advisory limits. We discuss how this is related to particular sources of resistance, such as affective attachment to wood heating and socio-cultural norms. The research draws on six focus groups with participants from households with and without wood heating. With reference to practice theory, we argue that citizen discourses favouring wood burning draw upon a rich suite of justifications and present this activity as a natural and traditional activity promoting comfort and cohesion. Such discourses also emphasise the identity of the town as a rural community and the supposed gemeinschaft qualities of such places. We show that, in this domain of energy policy, it is not enough to present ‘facts’ which have little emotional association or meaning for the populace. Rather, we need to understand how social scripts, often localised, inform identity and practice. - Highlights: ► The negative health effects of wood smoke from wood heaters are known by citizens. ► Continued use of wood heating is justified with a rich suite of rhetorical strategies. ► Some strategies try to negate or diminish the case for phasing out wood heaters. ► Other strategies present wood heating as a natural, traditional and social activity

   2. Is the gravity effect of radiographic anatomic features enough to justify stone clearance or fragments retention following extracorporeal shock wave lithotripsy (SWL)?

    Science.gov (United States)

    Mustafa, Mahmoud

    2012-08-01

    We determined whether the gravity effect of radiographic anatomic features on the preoperative urography (IVP) is enough to predict fragment clearance after shock wave lithotripsy (SWL). A total of 282 patients (189 male, 93 female; mean age 45.8 ± 13.2 years) who underwent SWL for renal calculi between October 2005 and August 2009 were enrolled. The mean stone burden was 155.72 ± 127.66 mm². The patients were stratified into three groups: patients with pelvic calculi (group 1); patients with upper or middle pole calculi (group 2); and patients with lower pole calculi (group 3). Three angles were measured on the pretreatment IVP: the inner angle between the lower pole infundibular axis and the ureteropelvic axis (angle I); the inner angle between the lower pole infundibular axis and the main axis of the pelvis at the ureteropelvic (UP) junction point (angle II); and the inner angle between the lower pole infundibular axis and the perpendicular line (angle III). Multivariate analysis was used to define the significant predictors of stone clearance. The overall success rate was 85.81%. All angles, the number of sessions, the number of shock waves and the stone burden were significant predictors of success in group 1. However, in group 2 only angle II, and in group 3 angles I and II, had a significant effect on stone clearance. Radiographic anatomic features have a significant role in determining the stone-free rate following satisfactory fragmentation of renal stones with SWL. Measuring the infundibulopelvic angle in different ways helps to predict stone-free status in patients with renal calculi located not only in the lower pole, but also in the renal pelvis and the upper or middle pole. The gravity effect alone is not enough to justify the significant influence of radiographic anatomic features on stone clearance and fragment retention after SWL.

  3. Modeling of correlated data with informative cluster sizes: An evaluation of joint modeling and within-cluster resampling approaches.

    Science.gov (United States)

    Zhang, Bo; Liu, Wei; Zhang, Zhiwei; Qu, Yanping; Chen, Zhen; Albert, Paul S

    2017-08-01

    Joint modeling and within-cluster resampling are two approaches that are used for analyzing correlated data with informative cluster sizes. Motivated by a developmental toxicity study, we examined the performance and validity of these two approaches in testing covariate effects in generalized linear mixed-effects models. We show that the joint modeling approach is robust to the misspecification of cluster size models in terms of Type I and Type II errors when the corresponding covariates are not included in the random effects structure; otherwise, statistical tests may be affected. We also evaluate the performance of the within-cluster resampling procedure and thoroughly investigate its validity in modeling correlated data with informative cluster sizes. We show that within-cluster resampling is a valid alternative to joint modeling for cluster-specific covariates, but it is invalid for time-dependent covariates. The two methods are applied to a developmental toxicity study that investigated the effect of exposure to diethylene glycol dimethyl ether.
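
    A rough sketch of the within-cluster resampling idea: repeatedly sample one observation per cluster, fit an ordinary GLM to each resampled data set, and average the estimates. The data-generating mechanism below, in which cluster size depends on the cluster's random effect, together with all effect sizes and the 500-resample choice, are illustrative assumptions.

```python
# Sketch of within-cluster resampling (WCR) for a cluster-specific
# binary covariate under informative cluster sizes. Simulated data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_clusters, resamples = 200, 500
rows = []
for cid in range(n_clusters):
    b = rng.normal()                          # cluster random effect
    size = 2 + rng.poisson(np.exp(0.5 * b))   # cluster size depends on b
    x = rng.integers(0, 2)                    # cluster-specific covariate
    for _ in range(size):
        logit = -0.5 + 0.8 * x + b
        y = rng.random() < 1 / (1 + np.exp(-logit))
        rows.append((cid, x, int(y)))
rows = np.array(rows)

estimates = []
for _ in range(resamples):
    # Pick one random observation per cluster and fit a plain GLM.
    idx = [rng.choice(np.where(rows[:, 0] == c)[0]) for c in range(n_clusters)]
    sub = rows[idx]
    X = sm.add_constant(sub[:, 1].astype(float))
    fit = sm.GLM(sub[:, 2], X, family=sm.families.Binomial()).fit()
    estimates.append(fit.params[1])

print(f"WCR estimate of the x effect: {np.mean(estimates):.3f}")
```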

  4. Supplemental Material, PWQ42_2_747845_Choma_and_Prusaczyk - The Effects of System Justifying Beliefs on Skin-Tone Surveillance, Skin-Color Dissatisfaction, and Skin-Bleaching Behavior

    OpenAIRE

    Choma, Becky L.; Prusaczyk, Elvira

    2018-01-01

    Supplemental Material, PWQ42_2_747845_Choma_and_Prusaczyk for The Effects of System Justifying Beliefs on Skin-Tone Surveillance, Skin-Color Dissatisfaction, and Skin-Bleaching Behavior by Becky L. Choma, and Elvira Prusaczyk in Psychology of Women Quarterly

  5. Conditional Monte Carlo randomization tests for regression models.

    Science.gov (United States)

    Parhat, Parwen; Rosenberger, William F; Diao, Guoqing

    2014-08-15

    We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
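
    The design-based idea can be made concrete with a minimal sketch of a Monte Carlo randomization test under a permuted-block design: the test statistic is built from residuals of a covariate-adjusted model, and its reference distribution comes from regenerating the randomization sequence. The outcome model, block size, and 5000 Monte Carlo draws below are illustrative assumptions, not the paper's settings.

```python
# Sketch of a design-based Monte Carlo randomization test with
# permuted blocks. All trial data are simulated placeholders.
import numpy as np

rng = np.random.default_rng(2)
n, block = 48, 4

def permuted_blocks(n, block, rng):
    """One permuted-block randomization sequence of 0/1 assignments."""
    seq = []
    for _ in range(n // block):
        b = [0] * (block // 2) + [1] * (block // 2)
        rng.shuffle(b)
        seq.extend(b)
    return np.array(seq)

t_obs = permuted_blocks(n, block, rng)
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + 0.4 * t_obs + rng.normal(size=n)
# Residuals from a model that ignores treatment play the role of the
# GLM or martingale residuals discussed in the abstract.
resid = y - np.polyval(np.polyfit(x, y, 1), x)

def statistic(resid, t):
    return resid[t == 1].mean() - resid[t == 0].mean()

s_obs = statistic(resid, t_obs)
draws = [statistic(resid, permuted_blocks(n, block, rng))
         for _ in range(5000)]
p = np.mean(np.abs(draws) >= abs(s_obs))
print(f"Monte Carlo randomization p-value: {p:.4f}")
```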

  6. The estimation of time-varying risks in asset pricing modelling using B-Spline method

    Science.gov (United States)

    Nurjannah; Solimun; Rinaldo, Adji

    2017-12-01

    Asset pricing modelling has been extensively studied in the past few decades to explore the risk-return relationship. The asset pricing literature typically assumed a static risk-return relationship. However, several studies found anomalies in asset pricing modelling which captured the presence of risk instability. Dynamic models have been proposed to offer a better alternative. The main problem highlighted in the dynamic model literature is that the set of conditioning information is unobservable and therefore some assumptions have to be made. Hence, the estimation requires additional assumptions about the dynamics of risk. To overcome this problem, nonparametric estimators can be used as an alternative for estimating risk. The flexibility of the nonparametric setting avoids the problem of misspecification derived from selecting a functional form. This paper investigates the estimation of time-varying asset pricing models using the B-spline, as one nonparametric approach. The advantages of the spline method are its computational speed and simplicity, as well as the clarity of controlling curvature directly. Three popular asset pricing models are investigated, namely the CAPM (Capital Asset Pricing Model), the Fama-French 3-factor model and the Carhart 4-factor model. The results suggest that the estimated risks are time-varying and not stable over time, which confirms the risk instability anomaly. The result is more pronounced in Carhart's 4-factor model.
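
    A minimal sketch of the B-spline approach for a single-factor (CAPM-style) model: the time-varying beta is expanded in a B-spline basis and estimated by least squares on the basis-market interaction. The simulated returns, knot count, and cubic degree are assumptions for illustration.

```python
# Sketch: time-varying beta(t) = sum_k c_k B_k(t), estimated by OLS on
# the B-spline basis interacted with the market return. Simulated data.
import numpy as np
from scipy.interpolate import splev

rng = np.random.default_rng(3)
T, degree, n_inner = 600, 3, 6
t = np.linspace(0.0, 1.0, T)
rm = rng.normal(0.0, 1.0, T)                  # market excess return
true_beta = 0.8 + 0.4 * np.sin(2 * np.pi * t) # slowly varying risk
r = true_beta * rm + rng.normal(0.0, 0.5, T)  # asset excess return

# Clamped knot vector for cubic B-splines on [0, 1].
knots = np.r_[[0.0] * (degree + 1),
              np.linspace(0, 1, n_inner + 2)[1:-1],
              [1.0] * (degree + 1)]
n_basis = len(knots) - degree - 1
# Evaluate each basis function by handing splev a unit coefficient vector.
basis = np.column_stack([
    splev(t, (knots, np.eye(n_basis)[k], degree)) for k in range(n_basis)
])

X = basis * rm[:, None]            # interact the basis with market return
coef, *_ = np.linalg.lstsq(X, r, rcond=None)
beta_hat = basis @ coef            # estimated beta(t)
print("max abs error vs true beta:", np.abs(beta_hat - true_beta).max().round(3))
```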

  7. Analysing model fit of psychometric process models: An overview, a new test and an application to the diffusion model.

    Science.gov (United States)

    Ranger, Jochen; Kuhn, Jörg-Tobias; Szardenings, Carsten

    2017-05-01

    Cognitive psychometric models embed cognitive process models into a latent trait framework in order to allow for individual differences. Due to their close relationship to the response process, the models allow for profound conclusions about the test takers. However, before such a model can be used its fit has to be checked carefully. In this manuscript we give an overview of existing tests of model fit and show their relation to the generalized moment test of Newey (Econometrica, 53, 1985, 1047) and Tauchen (J. Econometrics, 30, 1985, 415). We also present a new test, the Hausman test of misspecification (Hausman, Econometrica, 46, 1978, 1251). The Hausman test consists of a comparison of two estimates of the same item parameters, which should be similar if the model holds. The performance of the Hausman test is evaluated in a simulation study. In this study we illustrate its application to two popular models in cognitive psychometrics, the Q-diffusion model and the D-diffusion model (van der Maas, Molenaar, Maris, Kievit, & Borsboom, Psychol. Rev., 118, 2011, 339; Molenaar, Tuerlinckx, & van der Maas, J. Stat. Softw., 66, 2015, 1). We also compare the performance of the test to four alternative tests of model fit, namely the M2 test (Molenaar et al., J. Stat. Softw., 66, 2015, 1), the moment test (Ranger et al., Br. J. Math. Stat. Psychol., 2016) and the test for binned time (Ranger & Kuhn, Psychol. Test. Assess., 56, 2014b, 370). The simulation study indicates that the Hausman test is superior to the latter tests. The test closely adheres to the nominal Type I error rate and has higher power in most simulation conditions. © 2017 The British Psychological Society.
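
    The Hausman-type comparison at the heart of the proposed test can be sketched generically: two estimates of the same parameters, one efficient under the model and one robust, are contrasted through a chi-squared quadratic form. The numbers below are made-up placeholders, not output from the diffusion models discussed.

```python
# Generic Hausman-style misspecification check: H = d' (V2-V1)^{-1} d,
# where d = b1 - b2. Approximately chi-squared under correct specification.
import numpy as np
from scipy import stats

def hausman(b1, V1, b2, V2):
    """Hausman statistic and p-value for two estimators of one parameter."""
    d = np.asarray(b1) - np.asarray(b2)
    Vd = np.asarray(V2) - np.asarray(V1)
    H = float(d @ np.linalg.pinv(Vd) @ d)  # pinv guards against singular Vd
    return H, 1.0 - stats.chi2.cdf(H, d.size)

# Hypothetical estimates of two item parameters from two different
# estimation approaches (e.g., an efficient and a robust one).
b_eff = np.array([1.02, -0.48])
V_eff = np.array([[0.010, 0.002], [0.002, 0.012]])
b_rob = np.array([1.10, -0.61])
V_rob = np.array([[0.018, 0.003], [0.003, 0.021]])

H, p = hausman(b_eff, V_eff, b_rob, V_rob)
print(f"H = {H:.2f}, p = {p:.3f}")
```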

  8. Estimating inverse probability weights using super learner when weight-model specification is unknown in a marginal structural Cox model context.

    Science.gov (United States)

    Karim, Mohammad Ehsanul; Platt, Robert W

    2017-06-15

    Correct specification of the inverse probability weighting (IPW) model is necessary for consistent inference from a marginal structural Cox model (MSCM). In practical applications, researchers are typically unaware of the true specification of the weight model. Nonetheless, IPWs are commonly estimated using parametric models, such as the main-effects logistic regression model. In practice, assumptions underlying such models may not hold and data-adaptive statistical learning methods may provide an alternative. Many candidate statistical learning approaches are available in the literature. However, the optimal approach for a given dataset is impossible to predict. Super learner (SL) has been proposed as a tool for selecting an optimal learner from a set of candidates using cross-validation. In this study, we evaluate the usefulness of SL in estimating IPW in four different MSCM simulation scenarios, in which we varied the true weight model specification (linear and/or additive). Our simulations show that, in the presence of weight model misspecification, with a rich and diverse set of candidate algorithms, SL can generally offer a better alternative to the commonly used statistical learning approaches in terms of MSE as well as the coverage probabilities of the estimated effect in an MSCM. The findings from the simulation studies guided the application of the MSCM in a multiple sclerosis cohort from British Columbia, Canada (1995-2008), to estimate the impact of beta-interferon treatment in delaying disability progression. Copyright © 2017 John Wiley & Sons, Ltd.
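
    A sketch in the spirit of the approach, using scikit-learn's StackingClassifier as a stand-in for a super learner over candidate algorithms; the covariates, true treatment model, and candidate library are illustrative assumptions (a dedicated SL library would also cross-validate the ensemble weights against a user-chosen loss). This point-treatment version only shows how estimated propensities become stabilized weights.

```python
# Sketch: cross-validated stacking of candidate learners to estimate
# treatment probabilities, then stabilized inverse probability weights.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 2000
X = rng.normal(size=(n, 3))
# True assignment model is non-additive, so a main-effects logit is wrong.
p_true = 1 / (1 + np.exp(-(0.4 * X[:, 0] + 0.6 * X[:, 1] * X[:, 2])))
A = rng.random(n) < p_true

stack = StackingClassifier(
    estimators=[
        ("logit", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
)
stack.fit(X, A)
ps = stack.predict_proba(X)[:, 1]       # estimated propensity scores

# Stabilized weights: marginal treatment probability over the estimated one.
pA = A.mean()
w = np.where(A, pA / ps, (1 - pA) / (1 - ps))
print("weight summary:", w.min().round(2), w.mean().round(2), w.max().round(2))
```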

  9. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  11. Goodness-of-fit tests and model diagnostics for negative binomial regression of RNA sequencing data.

    Science.gov (United States)

    Mi, Gu; Di, Yanming; Schafer, Daniel W

    2015-01-01

    This work is about assessing model adequacy for negative binomial (NB) regression, particularly (1) assessing the adequacy of the NB assumption, and (2) assessing the appropriateness of models for NB dispersion parameters. Tools for the first are appropriate for NB regression generally; those for the second are primarily intended for RNA sequencing (RNA-Seq) data analysis. The typically small number of biological samples and large number of genes in RNA-Seq analysis motivate us to address the trade-offs between robustness and statistical power using NB regression models. One widely-used power-saving strategy, for example, is to assume some commonalities of NB dispersion parameters across genes via simple models relating them to mean expression rates, and many such models have been proposed. As RNA-Seq analysis is becoming ever more popular, it is appropriate to make more thorough investigations into power and robustness of the resulting methods, and into practical tools for model assessment. In this article, we propose simulation-based statistical tests and diagnostic graphics to address model adequacy. We provide simulated and real data examples to illustrate that our proposed methods are effective for detecting the misspecification of the NB mean-variance relationship as well as judging the adequacy of fit of several NB dispersion models.
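
    One of the simulation-based checks described above can be sketched as a parametric bootstrap: fit the NB model, simulate replicate data sets from the fitted model, and locate the observed Pearson statistic within the simulated reference distribution. The sample size, covariate, and number of replicates below are illustrative assumptions.

```python
# Sketch: parametric-bootstrap goodness-of-fit check for NB2 regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 300
x = rng.normal(size=n)
mu = np.exp(1.0 + 0.5 * x)
alpha_true = 0.4                # NB2 dispersion: Var = mu + alpha * mu^2
y = rng.negative_binomial(1 / alpha_true, 1 / (1 + alpha_true * mu))

X = sm.add_constant(x)
fit = sm.NegativeBinomial(y, X).fit(disp=0)
alpha = fit.params[-1]
mu_hat = np.exp(X @ fit.params[:-1])

def pearson(y, mu, alpha):
    return np.sum((y - mu) ** 2 / (mu + alpha * mu ** 2))

obs = pearson(y, mu_hat, alpha)
sims = []
for _ in range(200):
    # Simulate a replicate data set from the fitted model and refit.
    y_rep = rng.negative_binomial(1 / alpha, 1 / (1 + alpha * mu_hat))
    r = sm.NegativeBinomial(y_rep, X).fit(disp=0)
    m = np.exp(X @ r.params[:-1])
    sims.append(pearson(y_rep, m, r.params[-1]))

p = np.mean(np.array(sims) >= obs)
print(f"simulation-based GOF p-value: {p:.3f}")
```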

  12. Some Considerations on the Partial Credit Model

    Science.gov (United States)

    Verhelst, N. D.; Verstralen, H. H. F. M.

    2008-01-01

    The Partial Credit Model (PCM) is sometimes interpreted as a model for stepwise solution of polytomously scored items, where the item parameters are interpreted as difficulties of the steps. It is argued that this interpretation is not justified. A model for stepwise solution is discussed. It is shown that the PCM is suited to model sums of binary…

  13. Adjustment or updating of models

    Indian Academy of Sciences (India)

    25, Part 3, June 2000, pp. 235–245 ... While the model is defined in terms of these spatial parameters, ... discussed in terms of 'model order' with concern focused on whether or not the ..... In other words, it is not easy to justify what the required.

  14. Angular overlap model in actinides

    International Nuclear Information System (INIS)

    Gajek, Z.; Mulak, J.

    1991-01-01

    Quantitative foundations of the Angular Overlap Model in actinides based on ab initio calculations of the crystal field effect in the uranium (III) (IV) and (V) ions in various crystals are presented. The calculations justify some common simplifications of the model and fix up the relations between the AOM parameters. Traps and limitations of the AOM phenomenology are discussed.

  16. Interpretable inference on the mixed effect model with the Box-Cox transformation.

    Science.gov (United States)

    Maruo, K; Yamaguchi, Y; Noma, H; Gosho, M

    2017-07-10

    We derived results for inference on the parameters of the marginal model of the mixed effect model with the Box-Cox transformation based on the asymptotic theory approach. We also provided a robust variance estimator of the maximum likelihood estimator of the parameters of this model in consideration of model misspecification. Using these results, we developed an inference procedure for the difference of the model median between treatment groups at the specified occasion in the context of mixed effects models for repeated measures analysis for randomized clinical trials, which provided interpretable estimates of the treatment effect. Simulation studies showed that our proposed method controlled the type I error of the statistical test for the model median difference in almost all situations and had moderate or high power compared with the existing methods. We illustrated our method with cluster of differentiation 4 (CD4) data in an AIDS clinical trial, where the interpretability of the analysis results based on our proposed method is demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.
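
    The model-median idea can be sketched as follows: estimate the Box-Cox parameter, fit a random-intercept model on the transformed scale, and back-transform the fitted group means, which estimate group medians on the original scale. The simulated data, group coding, and use of a plain random intercept (rather than the full repeated-measures covariance structure) are simplifying assumptions.

```python
# Sketch: Box-Cox transform, mixed model on the transformed scale, and
# back-transformed group means as model medians. Simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(6)
n_subj, n_occ = 80, 4
subj = np.repeat(np.arange(n_subj), n_occ)
grp = np.repeat(rng.integers(0, 2, n_subj), n_occ)
b = np.repeat(rng.normal(0, 0.3, n_subj), n_occ)
# Skewed positive outcome with a group (treatment) effect.
y = np.exp(1.0 + 0.4 * grp + b + rng.normal(0, 0.5, subj.size))
df = pd.DataFrame({"y": y, "grp": grp, "subj": subj})

# Estimate the Box-Cox parameter, then analyze the transformed outcome.
z, lam = stats.boxcox(df["y"].to_numpy())
df["z"] = z
fit = smf.mixedlm("z ~ grp", df, groups=df["subj"]).fit()

def inv_boxcox(v, lam):
    return np.exp(v) if lam == 0 else (lam * v + 1) ** (1 / lam)

m0 = inv_boxcox(fit.params["Intercept"], lam)
m1 = inv_boxcox(fit.params["Intercept"] + fit.params["grp"], lam)
print(f"lambda = {lam:.2f}; model medians: {m0:.2f} vs {m1:.2f}; "
      f"difference = {m1 - m0:.2f}")
```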

  17. Phylogenetic tree reconstruction accuracy and model fit when proportions of variable sites change across the tree.

    Science.gov (United States)

    Shavit Grievink, Liat; Penny, David; Hendy, Michael D; Holland, Barbara R

    2010-05-01

    Commonly used phylogenetic models assume a homogeneous process through time in all parts of the tree. However, it is known that these models can be too simplistic as they do not account for nonhomogeneous lineage-specific properties. In particular, it is now widely recognized that as constraints on sequences evolve, the proportion and positions of variable sites can vary between lineages causing heterotachy. The extent to which this model misspecification affects tree reconstruction is still unknown. Here, we evaluate the effect of changes in the proportions and positions of variable sites on model fit and tree estimation. We consider 5 current models of nucleotide sequence evolution in a Bayesian Markov chain Monte Carlo framework as well as maximum parsimony (MP). We show that for a tree with 4 lineages where 2 nonsister taxa undergo a change in the proportion of variable sites tree reconstruction under the best-fitting model, which is chosen using a relative test, often results in the wrong tree. In this case, we found that an absolute test of model fit is a better predictor of tree estimation accuracy. We also found further evidence that MP is not immune to heterotachy. In addition, we show that increased sampling of taxa that have undergone a change in proportion and positions of variable sites is critical for accurate tree reconstruction.

  18. A matlab framework for estimation of NLME models using stochastic differential equations: applications for estimation of insulin secretion rates.

    Science.gov (United States)

    Mortensen, Stig B; Klim, Søren; Dammann, Bernd; Kristensen, Niels R; Madsen, Henrik; Overgaard, Rune V

    2007-10-01

    The non-linear mixed-effects model based on stochastic differential equations (SDEs) provides an attractive residual error model that is able to handle serially correlated residuals typically arising from structural misspecification of the true underlying model. The use of SDEs also opens up new tools for model development and easily allows for tracking of unknown inputs and parameters over time. An algorithm for maximum likelihood estimation of the model has previously been proposed, and the present paper presents the first general implementation of this algorithm. The implementation is done in Matlab and also demonstrates the use of parallel computing for improved estimation times. The use of the implementation is illustrated by two examples of application which focus on the ability of the model to estimate unknown inputs facilitated by the extension to SDEs. The first application is a deconvolution-type estimation of the insulin secretion rate based on a linear two-compartment model for C-peptide measurements. In the second application the model is extended to also give an estimate of the time-varying liver extraction based on both C-peptide and insulin measurements.

  19. Bayesian Sensitivity Analysis of a Nonlinear Dynamic Factor Analysis Model with Nonparametric Prior and Possible Nonignorable Missingness.

    Science.gov (United States)

    Tang, Niansheng; Chow, Sy-Miin; Ibrahim, Joseph G; Zhu, Hongtu

    2017-12-01

    Many psychological concepts are unobserved and usually represented as latent factors apprehended through multiple observed indicators. When multiple-subject multivariate time series data are available, dynamic factor analysis models with random effects offer one way of modeling patterns of within- and between-person variations by combining factor analysis and time series analysis at the factor level. Using the Dirichlet process (DP) as a nonparametric prior for individual-specific time series parameters further allows the distributional forms of these parameters to deviate from commonly imposed (e.g., normal or other symmetric) functional forms, arising as a result of these parameters' restricted ranges. Given the complexity of such models, a thorough sensitivity analysis is critical but computationally prohibitive. We propose a Bayesian local influence method that allows for simultaneous sensitivity analysis of multiple modeling components within a single fitting of the model of choice. Five illustrations and an empirical example are provided to demonstrate the utility of the proposed approach in facilitating the detection of outlying cases and common sources of misspecification in dynamic factor analysis models, as well as identification of modeling components that are sensitive to changes in the DP prior specification.

  20. Approaches for modeling within subject variability in pharmacometric count data analysis: dynamic inter-occasion variability and stochastic differential equations.

    Science.gov (United States)

    Deng, Chenhui; Plan, Elodie L; Karlsson, Mats O

    2016-06-01

    Parameter variation in pharmacometric analysis studies can be characterized as within subject parameter variability (WSV) in pharmacometric models. WSV has previously been successfully modeled using inter-occasion variability (IOV), but also stochastic differential equations (SDEs). In this study, two approaches, dynamic inter-occasion variability (dIOV) and adapted stochastic differential equations, were proposed to investigate WSV in pharmacometric count data analysis. These approaches were applied to published count models for seizure counts and Likert pain scores. Both approaches improved the model fits significantly. In addition, stochastic simulation and estimation were used to explore further the capability of the two approaches to diagnose and improve models where existing WSV is not recognized. The results of simulations confirmed the gain in introducing WSV as dIOV and SDEs when parameters vary randomly over time. Further, the approaches were also informative as diagnostics of model misspecification, when parameters changed systematically over time but this was not recognized in the structural model. The proposed approaches in this study offer strategies to characterize WSV and are not restricted to count data.
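
    A toy simulation of the inter-occasion variability notion underlying the dIOV approach: the subject-level effect is drawn once, while the occasion-level effect is redrawn at each occasion, so a model ignoring it sees inflated within-subject dispersion. All parameter values are illustrative.

```python
# Sketch: Poisson counts with inter-individual (IIV) and inter-occasion
# (IOV) variability on the log-rate scale. Simulation only.
import numpy as np

rng = np.random.default_rng(7)
n_subj, n_occ = 100, 6
omega_iiv, omega_iov = 0.4, 0.3   # SDs of IIV and IOV on the log scale
base_rate = 5.0

counts = np.empty((n_subj, n_occ), dtype=int)
for i in range(n_subj):
    eta = rng.normal(0, omega_iiv)        # subject effect, drawn once
    for j in range(n_occ):
        kappa = rng.normal(0, omega_iov)  # occasion effect, redrawn each time
        counts[i, j] = rng.poisson(base_rate * np.exp(eta + kappa))

# Ignoring IOV shows up as extra within-subject dispersion:
within_var = counts.var(axis=1, ddof=1).mean()
within_mean = counts.mean()
print(f"within-subject variance/mean: {within_var / within_mean:.2f} "
      "(1.0 expected for pure Poisson)")
```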

  1. Modelling Embedded Systems by Non-Monotonic Refinement

    NARCIS (Netherlands)

    Mader, Angelika H.; Marincic, J.; Wupper, H.

    2008-01-01

    This paper addresses the process of modelling embedded systems for formal verification. We propose a modelling process built on non-monotonic refinement and a number of guidelines. The outcome of the modelling process is a model, together with a correctness argument that justifies our modelling

  2. Identification and estimation of nonlinear models using two samples with nonclassical measurement errors

    KAUST Repository

    Carroll, Raymond J.

    2010-05-01

    This paper considers identification and estimation of a general nonlinear Errors-in-Variables (EIV) model using two samples. Both samples consist of a dependent variable, some error-free covariates, and an error-prone covariate, for which the measurement error has unknown distribution and could be arbitrarily correlated with the latent true values; and neither sample contains an accurate measurement of the corresponding true variable. We assume that the regression model of interest - the conditional distribution of the dependent variable given the latent true covariate and the error-free covariates - is the same in both samples, but the distributions of the latent true covariates vary with observed error-free discrete covariates. We first show that the general latent nonlinear model is nonparametrically identified using the two samples when both could have nonclassical errors, without either instrumental variables or independence between the two samples. When the two samples are independent and the nonlinear regression model is parameterized, we propose sieve Quasi Maximum Likelihood Estimation (Q-MLE) for the parameter of interest, and establish its root-n consistency and asymptotic normality under possible misspecification, and its semiparametric efficiency under correct specification, with easily estimated standard errors. A Monte Carlo simulation and a data application are presented to show the power of the approach.

  3. Cost Modeling for Space Telescope

    Science.gov (United States)

    Stahl, H. Philip

    2011-01-01

    Parametric cost models are an important tool for planning missions, comparing concepts and justifying technology investments. This paper presents on-going efforts to develop single-variable and multi-variable cost models for the space telescope optical telescope assembly (OTA). These models are based on data collected from historical space telescope missions. Standard statistical methods are used to derive CERs for OTA cost versus aperture diameter and mass. The results are compared with previously published models.
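
    A single-variable CER of the kind described can be sketched as a power law fitted on the log-log scale; the aperture-cost pairs below are made-up placeholders, not the historical mission data used in the paper.

```python
# Sketch: single-variable cost estimating relationship (CER),
# cost = a * D^b, fitted by OLS on the log-log scale.
import numpy as np

diameter_m = np.array([0.3, 0.85, 1.1, 2.4, 3.5])       # aperture (m)
cost_musd = np.array([40.0, 160.0, 230.0, 900.0, 1800.0])  # cost (M$)

b, log_a = np.polyfit(np.log(diameter_m), np.log(cost_musd), 1)
a = np.exp(log_a)
print(f"CER: cost ~ {a:.1f} * D^{b:.2f}  (M$, D in meters)")

# Point prediction for a hypothetical 1.5 m aperture:
print(f"predicted cost for D = 1.5 m: {a * 1.5 ** b:.0f} M$")
```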

  4. Modeling gene expression measurement error: a quasi-likelihood approach

    Directory of Open Access Journals (Sweden)

    Strimmer Korbinian

    2003-03-01

    Background: Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results: Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions: The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also

  5. Forecasting Performance of Asymmetric GARCH Stock Market Volatility Models

    Directory of Open Access Journals (Sweden)

    Hojin Lee

    2009-12-01

    We investigate the asymmetry between positive and negative returns in their effect on the conditional variance of the stock market index and incorporate these characteristics to form an out-of-sample volatility forecast. Contrary to prior evidence, however, the results in this paper suggest that no asymmetric GARCH model is superior to the basic GARCH(1,1) model. It is our prior knowledge that, for equity returns, it is unlikely that positive and negative shocks have the same impact on volatility. In order to reflect this intuition, we implement three diagnostic tests for volatility models: the Sign Bias Test, the Negative Size Bias Test, and the Positive Size Bias Test, as well as tests against the alternatives of QGARCH and GJR-GARCH. The asymmetry test results indicate that the sign and the size of the unexpected return shock do not influence current volatility differently, which contradicts our presumption that there are asymmetric effects in stock market volatility. This result is in line with various diagnostic tests designed to determine whether the GARCH(1,1) volatility estimates adequately represent the data. The diagnostic tests in section 2 indicate that the GARCH(1,1) model for weekly KOSPI returns is robust to the misspecification test. We also investigate two representative asymmetric GARCH models, the QGARCH and GJR-GARCH models, for out-of-sample forecasting performance. The out-of-sample forecasting ability test reveals that no single model clearly outperforms the others; the GJR-GARCH and QGARCH models give mixed results in forecasting ability on all four criteria across all forecast horizons considered. Also, the predictive accuracy test of Diebold and Mariano based on both absolute and squared prediction errors suggests that the forecasts from the linear and asymmetric GARCH models need not be significantly different from each other.
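
    The three asymmetry diagnostics are Engle-Ng style regressions of squared standardized residuals on sign and size terms of lagged shocks. The sketch below simulates a symmetric GARCH(1,1), so the null of no asymmetry should hold, and runs the joint regression; with real returns, the standardized residuals would come from the fitted model rather than from a simulation.

```python
# Sketch: Engle-Ng sign/size bias diagnostics on standardized residuals
# of a (here simulated) GARCH(1,1) process.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
T = 2000
omega, alpha, beta = 0.05, 0.08, 0.9
eps, h = np.empty(T), np.empty(T)
h[0] = omega / (1 - alpha - beta)        # unconditional variance
eps[0] = np.sqrt(h[0]) * rng.normal()
for t in range(1, T):
    h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
    eps[t] = np.sqrt(h[t]) * rng.normal()

z = eps / np.sqrt(h)                     # standardized residuals
neg = (z[:-1] < 0).astype(float)         # lagged negative-shock dummy
X = sm.add_constant(np.column_stack([
    neg,                                  # sign bias term
    neg * z[:-1],                         # negative size bias term
    (1 - neg) * z[:-1],                   # positive size bias term
]))
joint = sm.OLS(z[1:] ** 2, X).fit()
print("coefficients:", joint.params.round(3))
print("p-values:    ", joint.pvalues.round(3))
```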

  6. Adjusting for overdispersion in piecewise exponential regression models to estimate excess mortality rate in population-based research.

    Science.gov (United States)

    Luque-Fernandez, Miguel Angel; Belot, Aurélien; Quaresma, Manuela; Maringe, Camille; Coleman, Michel P; Rachet, Bernard

    2016-10-01

    In population-based cancer research, piecewise exponential regression models are used to derive adjusted estimates of excess mortality due to cancer within the Poisson generalized linear modelling framework. However, the assumption that the conditional mean and variance of the rate parameter given the set of covariates x_i are equal is strong and may fail to account for overdispersion given the variability of the rate parameter (the variance exceeds the mean). Using an empirical example, we aimed to describe simple methods to test and correct for overdispersion. We used a regression-based score test for overdispersion under the relative survival framework and proposed different approaches to correct for overdispersion, including a quasi-likelihood, robust standard errors estimation, negative binomial regression and flexible piecewise modelling. All piecewise exponential regression models showed the presence of significant inherent overdispersion. Flexible piecewise regression modelling, with either a quasi-likelihood or robust standard errors, was the best approach, as it deals with both overdispersion due to model misspecification and true or inherent overdispersion.
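
    The quasi-likelihood correction is simple to sketch: fit the Poisson rate model with a log person-time offset, estimate the Pearson dispersion, and scale the standard errors by its square root. The covariate, person-time, and gamma frailty used to induce overdispersion below are illustrative assumptions.

```python
# Sketch: detect overdispersion in a Poisson rate model via the Pearson
# dispersion, then apply the quasi-Poisson standard-error correction.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
n = 400
x = rng.integers(0, 2, n)                 # e.g., a binary patient group
ptime = rng.gamma(5.0, 2.0, n)            # person-time at risk
mu = ptime * np.exp(-2.0 + 0.5 * x)
# A mean-one gamma frailty induces extra-Poisson variation.
events = rng.poisson(mu * rng.gamma(2.0, 0.5, n))

X = sm.add_constant(x.astype(float))
fit = sm.GLM(events, X, family=sm.families.Poisson(),
             offset=np.log(ptime)).fit()
phi = fit.pearson_chi2 / fit.df_resid     # dispersion estimate
print(f"Pearson dispersion = {phi:.2f} (1.0 if no overdispersion)")
print("naive SEs:        ", fit.bse.round(3))
print("quasi-Poisson SEs:", (fit.bse * np.sqrt(phi)).round(3))
```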

  7. Flexible mixture modeling via the multivariate t distribution with the Box-Cox transformation: an alternative to the skew-t distribution.

    Science.gov (United States)

    Lo, Kenneth; Gottardo, Raphael

    2012-01-01

    Cluster analysis is the automated search for groups of homogeneous observations in a data set. A popular modeling approach for clustering is based on finite normal mixture models, which assume that each cluster is modeled as a multivariate normal distribution. However, the normality assumption that each component is symmetric is often unrealistic. Furthermore, normal mixture models are not robust against outliers; they often require extra components for modeling outliers and/or give a poor representation of the data. To address these issues, we propose a new class of distributions, multivariate t distributions with the Box-Cox transformation, for mixture modeling. This class of distributions generalizes the normal distribution with the more heavy-tailed t distribution, and introduces skewness via the Box-Cox transformation. As a result, this provides a unified framework to simultaneously handle outlier identification and data transformation, two interrelated issues. We describe an Expectation-Maximization algorithm for parameter estimation along with transformation selection. We demonstrate the proposed methodology with three real data sets and simulation studies. Compared with a wealth of approaches including the skew-t mixture model, the proposed t mixture model with the Box-Cox transformation performs favorably in terms of accuracy in the assignment of observations, robustness against model misspecification, and selection of the number of components.

  8. Ecologically justified regulatory provisions for riverine hydroelectric power plants and minimum instream flow requirements in diverted streams; Oekologisch begruendete, dynamische Mindestwasserregelungen bei Ausleitungskraftwerken

    Energy Technology Data Exchange (ETDEWEB)

    Jorde, K.

    1997-12-31

    The study was intended to develop a model versatile enough to permit quantification of various water demand scenarios in connection with the operation of riverine hydroelectric power plants. Specific emphasis was placed on defining the minimum instream flow to be maintained in river segments because of its elementary significance to flowing-water biocoenoses. Based on fictitious minimum water requirements, various scenarios were simulated for flow regimes depending on power plant operation, so as to establish a system for comparative analysis and evaluation of the resulting economic effects on power plant efficiency on the one hand, and the ecologic effects on the aquatic habitat on the other. The information derived was to serve as a basis for decision-making for regulatory purposes. For this study, the temporal and spatial variability of the flow regime at the river bed in a river segment was examined for the first time. Based on this information, complemented by information obtained from habitat simulations, a method was derived for determination of ecologic requirements and their incorporation into regulatory water management provisions. The field measurements were carried out with the FST hemisphere as a proven and most efficient and reliable method of assessing flow regimes at river beds. Evaluation of the measured instream flow data characterising three morphologically different segments of diverted rivers was done with the CASIMIR computer code. The ASS models derived were used for comparative assessment of existing regulatory provisions and recommended amendments determining required minimum instream flow in diverted rivers. The requirements were defined taking as a basis data obtained for three different years. (orig./CB)

  10. Modeling units of study from a pedagogical perspective: the pedagogical meta-model behind EML

    NARCIS (Netherlands)

    Koper, Rob

    2003-01-01

    This text is a short summary of the work on pedagogical analysis carried out when EML (Educational Modelling Language) was being developed. Because we address pedagogical meta-models, the consequence is that I must justify the underlying pedagogical models it describes. I have included a (far from

  11. More Precise Estimation of Lower-Level Interaction Effects in Multilevel Models.

    Science.gov (United States)

    Loeys, Tom; Josephy, Haeike; Dewitte, Marieke

    2018-01-01

    In hierarchical data, the effect of a lower-level predictor on a lower-level outcome may often be confounded by an (un)measured upper-level factor. When such confounding is left unaddressed, the effect of the lower-level predictor is estimated with bias. Separating this effect into a within- and between-component removes such bias in a linear random intercept model under a specific set of assumptions for the confounder. When the effect of the lower-level predictor is additionally moderated by another lower-level predictor, an interaction between both lower-level predictors is included into the model. To address unmeasured upper-level confounding, this interaction term ought to be decomposed into a within- and between-component as well. This can be achieved by first multiplying both predictors and centering that product term next, or vice versa. We show that while both approaches, on average, yield the same estimates of the interaction effect in linear models, the former decomposition is much more precise and robust against misspecification of the effects of cross-level and upper-level terms, compared to the latter.
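
    The two decomposition orders can be sketched directly: (a) multiply the predictors first and then split the product into within- and between-cluster components, versus (b) center each predictor within clusters first and then multiply. The simulated confounded data and effect sizes below are illustrative assumptions.

```python
# Sketch: two orders of decomposing a lower-level interaction in a
# random-intercept model, under an upper-level confounder. Simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(10)
n_grp, n_per = 100, 10
g = np.repeat(np.arange(n_grp), n_per)
u = np.repeat(rng.normal(0, 1, n_grp), n_per)   # upper-level confounder
x1 = rng.normal(size=g.size) + 0.5 * u
x2 = rng.normal(size=g.size) + 0.5 * u
y = 0.5 * x1 + 0.3 * x2 + 0.4 * x1 * x2 + u + rng.normal(size=g.size)
df = pd.DataFrame({"y": y, "x1": x1, "x2": x2, "g": g})

def within(s, g):
    """Deviation of each value from its cluster mean."""
    return s - s.groupby(g).transform("mean")

df["x1_w"], df["x2_w"] = within(df["x1"], df["g"]), within(df["x2"], df["g"])
df["x1_b"], df["x2_b"] = df["x1"] - df["x1_w"], df["x2"] - df["x2_w"]

# (a) Multiply first, then split the product into within/between parts.
df["prod"] = df["x1"] * df["x2"]
df["prod_w"] = within(df["prod"], df["g"])
df["prod_b"] = df["prod"] - df["prod_w"]
fit_a = smf.mixedlm("y ~ x1_w + x1_b + x2_w + x2_b + prod_w + prod_b",
                    df, groups=df["g"]).fit()

# (b) Center first, then multiply the within-components.
df["prod_cw"] = df["x1_w"] * df["x2_w"]
fit_b = smf.mixedlm("y ~ x1_w + x1_b + x2_w + x2_b + prod_cw",
                    df, groups=df["g"]).fit()

print("multiply-then-center interaction:", round(fit_a.params["prod_w"], 3))
print("center-then-multiply interaction:", round(fit_b.params["prod_cw"], 3))
```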

  12. Proceedings of the Workshop on Justifying the Suitability of Nuclear Licensee Organisational Structure, Resources and Competencies - Methods, Approaches and Good Practices

    International Nuclear Information System (INIS)

    2009-01-01

    The nuclear industry is currently facing a range of organisational challenges. The nuclear renaissance is resulting in renewed interest in new reactor build programmes; existing plants are being modernised; ageing plants and an ageing workforce are being replaced. The industry is developing new models of working in a competitive, and increasingly global market which has seen increased use of contractors and organisational change taking place at an unparalleled rate. It is clear that the way in which nuclear licensees' organisations are structured and resourced has a potential impact on nuclear safety. For example, nuclear safety may be challenged if organisational structures create uncertainty concerning authority and responsibilities or if nuclear safety functions are not adequately resourced. Inasmuch as this is so, then it is reasonable to expect both licensees and regulatory bodies to seek assurance that licensee organisations are suitable to manage nuclear safety and discharge the responsibilities associated with operating as a nuclear licensee. Although licensees should have the authority to organise their plant activities in different ways, they should also be able to demonstrate that they understand the potential impact that these activities may have on plant safety. They should be able to show how their organisations are designed to carry out these activities safely and effectively, and to verify that the nuclear safety functions are being delivered as expected. There is a growing interest from some nuclear regulatory bodies, as well as licensees, in methods and approaches that can be used to ensure that the licensee organisations are well structured and have sufficient resources and competencies to manage safety. To address these and other nuclear plant organisational safety-related issues a NEA/CSNI workshop was held in Uppsala (Sweden) hosted by the Swedish Radiation Safety Authority with support from the European Union's Joint Research Centre (JRC

  13. Modeling of hydrogen interactions with beryllium

    Energy Technology Data Exchange (ETDEWEB)

    Longhurst, G.R. [Lockheed Martin Idaho Technologies Co., Idaho Falls, ID (United States)

    1998-01-01

    In this paper, improved mathematical models are developed for hydrogen interactions with beryllium. This includes the saturation effect observed for high-flux implantation of ions from plasmas and the retention of tritium produced from neutronic transmutations in beryllium. Use of the models developed is justified by showing how they can replicate experimental data using the TMAP4 tritium transport code. (author)

  14. Justified requirements in private transportation and a recommendation for improving the efficiency of household energy utilisation through the use of small ecologically-friendly or 'ultralight' vehicles for mass private transportation in the 21st century

    International Nuclear Information System (INIS)

    Juravic, T.

    1999-01-01

    Needs and ownership are sociobiologically manifested in the alter-ego of Homo sapiens: both the natural progression of events (the household being the fundamental microlevel) and the social order, i.e. globalisation, are based on ownership and needs as sacred rights, which is why universal values such as energy conservation end up as casualties of the mindless worship of consumption. Justified needs are phenomena of a consumerist (egocentric, pragmatic, voluntary) social conscience and of instinctive behaviour - an unpredictable cause arising from freedom as the foundation of the quality of life and of socio-economic and political change - and they are mutually exclusive with understanding (expressing and gaining deeper and richer knowledge). Inbuilt limits and/or controls on consumption, already used in household appliances with pre-set processes (goals) for unknown consumers, achieve large energy savings in 'routine' functions and are more effective than attempts to prevent mistakes stemming from a lack of user knowledge through repression. The private vehicle, as a symbol of freedom and of the quality of life, is a mechanism for achieving 'justified' needs and represents a further form of household energy utilisation. Consumers' desires regarding private transportation are not sufficiently reconciled with intelligent microprocessors (expert systems), which achieve optimal behaviour in the process of transportation. This detailed consideration (as part of investigating the technical system) cannot be examined on a strictly logical or scientific basis; it proposes instead a method of co-agreement (not co-responsibility) between manufacturers and consumers, and an alternative way of thinking about and organising the interaction between vehicles and traffic, in order to form a judgement of genuinely justifiable needs and to move towards a robotic private vehicle, transportation and traffic. The goal of this consideration is to establish the DIVISION of energy with the help of

  15. Material Modelling - Composite Approach

    DEFF Research Database (Denmark)

    Nielsen, Lauge Fuglsang

    1997-01-01

    is successfully justified comparing predicted results with experimental data obtained in the HETEK-project on creep, relaxation, and shrinkage of very young concretes cured at a temperature of T = 20 °C and a relative humidity of RH = 100%. The model is also justified comparing predicted creep, shrinkage......, and internal stresses caused by drying shrinkage with experimental results reported in the literature on the mechanical behavior of mature concretes. It is then concluded that the model presented applies in general with respect to age at loading. From a stress analysis point of view the most important finding...... in this report is that cement paste and concrete behave practically as linear-viscoelastic materials from an age of approximately 10 hours. This is a significant age extension relative to earlier studies in the literature where linear-viscoelastic behavior is only demonstrated from ages of a few days. Thus...

  16. Evaluating remedial alternatives for an acid mine drainage stream: A model post audit

    Science.gov (United States)

    Runkel, Robert L.; Kimball, Briant A.; Walton-Day, Katherine; Verplanck, Philip L.; Broshears, Robert E.

    2012-01-01

    A post audit for a reactive transport model used to evaluate acid mine drainage treatment systems is presented herein. The post audit is based on a paired synoptic approach in which hydrogeochemical data are collected at low (existing conditions) and elevated (following treatment) pH. Data obtained under existing, low-pH conditions are used for calibration, and the resultant model is used to predict metal concentrations observed following treatment. Predictions for Al, As, Fe, H+, and Pb accurately reproduce the observed reduction in dissolved concentrations afforded by the treatment system, and the information provided in regard to standard attainment is also accurate (predictions correctly indicate attainment or nonattainment of water quality standards for 19 of 25 cases). Errors associated with Cd, Cu, and Zn are attributed to misspecification of sorbent mass (precipitated Fe). In addition to these specific results, the post audit provides insight in regard to calibration and sensitivity analysis that is contrary to conventional wisdom. Steps taken during the calibration process to improve simulations of As sorption were ultimately detrimental to the predictive results, for example, and the sensitivity analysis failed to bracket observed metal concentrations.

  17. Evaluating remedial alternatives for an acid mine drainage stream: a model post audit.

    Science.gov (United States)

    Runkel, Robert L; Kimball, Briant A; Walton-Day, Katherine; Verplanck, Philip L; Broshears, Robert E

    2012-01-03

    A post audit for a reactive transport model used to evaluate acid mine drainage treatment systems is presented herein. The post audit is based on a paired synoptic approach in which hydrogeochemical data are collected at low (existing conditions) and elevated (following treatment) pH. Data obtained under existing, low-pH conditions are used for calibration, and the resultant model is used to predict metal concentrations observed following treatment. Predictions for Al, As, Fe, H(+), and Pb accurately reproduce the observed reduction in dissolved concentrations afforded by the treatment system, and the information provided in regard to standard attainment is also accurate (predictions correctly indicate attainment or nonattainment of water quality standards for 19 of 25 cases). Errors associated with Cd, Cu, and Zn are attributed to misspecification of sorbent mass (precipitated Fe). In addition to these specific results, the post audit provides insight in regard to calibration and sensitivity analysis that is contrary to conventional wisdom. Steps taken during the calibration process to improve simulations of As sorption were ultimately detrimental to the predictive results, for example, and the sensitivity analysis failed to bracket observed metal concentrations.

  18. Justifying Compulsory Environmental Education in Liberal Democracies

    Science.gov (United States)

    Schinkel, Anders

    2009-01-01

    The need for education for (as opposed to about) sustainability is urged from many sides. Initiatives in this area tend to focus on formal education. Governmental, supra-governmental and non-governmental bodies all expect much of this kind of education, which is to transform children--and through them society--in the direction of sustainability.…

  19. Justifying Objective Bayesianism on Predicate Languages

    Directory of Open Access Journals (Sweden)

    Jürgen Landes

    2015-04-01

    Full Text Available Objective Bayesianism says that the strengths of one’s beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism to inductive logic. We show that the maximum entropy principle can be motivated largely in terms of minimising worst-case expected loss.
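    As a toy illustration of the maximum entropy principle invoked above, consider the classic finite case rather than the paper's predicate-language setting: among all distributions on a die with a prescribed mean, entropy is maximised by an exponential-family tilt, whose parameter can be solved for numerically (all numbers below are invented):

```python
# Minimal sketch of the maximum entropy principle on a finite outcome space.
import numpy as np
from scipy.optimize import brentq

x = np.arange(1, 7)            # outcomes of a die
target_mean = 4.5              # the "evidence": a calibrated expectation

def mean_given(lam):
    w = np.exp(lam * x)        # exponential-family (Gibbs) weights
    p = w / w.sum()
    return p @ x

# Solve for the Lagrange multiplier that matches the mean constraint.
lam = brentq(lambda l: mean_given(l) - target_mean, -5, 5)
p = np.exp(lam * x); p /= p.sum()
print(np.round(p, 4), p @ x)   # maximally equivocal given the constraint
```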

  20. Are segregated sports classes scientifically justified?

    OpenAIRE

    Lawson, Sian; Hall, Edward

    2014-01-01

    School sports classes are a key part of physical and mental development, yet in many countries these classes are gender segregated. Before institutionalised segregation can be condoned it is important to tackle assumptions and check for an evidence-based rationale. This presentation aims to analyse the key arguments for segregation given in comment-form response to a recent media article discussing mixed school sports (Lawson, 2013). The primary argument given was division for strength...

  1. Justifying Physical Education Based on Neuroscience Evidence

    Science.gov (United States)

    Berg, Kris

    2010-01-01

    Research has shown that exercise improves cognitive function and psychological traits that influence behavior (e.g., mood, level of motivation). The evidence in the literature also shows that physical education may enhance learning or that academic performance is at least maintained despite a reduction in classroom time in order to increase time…

  2. Three requirements for justifying an educational neuroscience.

    Science.gov (United States)

    Hruby, George G

    2012-03-01

    Over the past quarter century, efforts to bridge between research in the neurosciences and research, theory, and practice in education have grown from a mere hope to noteworthy scholarly sophistication. Many dedicated educational researchers have developed the secondary expertise in the necessary neurosciences and related fields to generate both empirical research and theoretical syntheses of noteworthy promise. Nonetheless, thoughtful and critical scholars in education have expressed concern about both the intellectual coherence and ethical dangers of this new area. It is still an open question whether educational neuroscience is for some time yet to remain only a formative study area for adventurous scholars or is already a fully fledged field of educational scholarship. In this paper, I suggest that to be a worthy field of educational research, educational neuroscience will need to address three issues: intellectual coherence, mutually informing and respected scholarly expertise, and an ethical commitment to the moral implications and obligations shared within educational research generally. I shall set forth some examples of lapses in this regard, focusing primarily on work on reading development, as that is my area of expertise, and make recommendations for due diligence. Arguments. First, intellectual coherence requires both precision in definition of technical terms (so that diverse scholars and professionals may communicate findings and insights consistently across fields), and precision in the logical warrants by which educational implications are drawn from empirical data from the neurosciences. Both needs are facilitated by careful attention to categorical boundary and avoidance of category error. Second, educational neuroscientists require focused and broad expertise in both the neurosciences and educational scholarship on teaching and learning in classrooms (and/or ancillary fields). If history is our guide, neuroscience implications for practice will prove unlikely in practice without expertise on practice. Additionally, respect for the expertise of others in this hybrid and necessarily collaborative enterprise is required. Third, educational neuroscience must take seriously the heightened moral and ethical concerns and commitments of educational professionals generally and educational researchers particularly. This means keeping a vigilant eye towards preserving the integrity of empirical and theoretical findings against rhetorical misuse by educational marketers, policy makers, and polemicists targeting the general public. I conclude that educational neuroscience is more than a hybrid patchwork of individual interests constituting a study area, and is perhaps ready to stand as a legitimate field of educational inquiry. It will not be accepted as such, however, nor should it be, unless the need to demonstrate a capacity for consistent intellectual coherence, scholarly expertise, and ethical commitment is met. ©2012 The British Psychological Society.

  3. Globalization and Employment: Is Anxiety Justified?

    Science.gov (United States)

    Lee, Eddy

    1996-01-01

    Despite concerns that globalization will increase unemployment and wage inequality, drive down wages and labor standards, and threaten national policy autonomy, it is clear that national policies still determine employment levels and labor standards. However, the need to protect those damaged by globalization still exists. (SK)

  4. Self-Esteem: Justifying Its Existence.

    Science.gov (United States)

    Street, Sue; Isaacs, Madelyn

    1998-01-01

    The role of self-esteem as a professional and personality construct has been obscured by its panacea role. Definitions of self-esteem and related terms are distinguished. Self-esteem is discussed as a developmental construct, a personality construct, and as a therapeutic goal. Therapeutic, educational, and counseling implications are discussed.…

  5. Three Requirements for Justifying an Educational Neuroscience

    Science.gov (United States)

    Hruby, George G.

    2012-01-01

    Background: Over the past quarter century, efforts to bridge between research in the neurosciences and research, theory, and practice in education have grown from a mere hope to noteworthy scholarly sophistication. Many dedicated educational researchers have developed the secondary expertise in the necessary neurosciences and related fields to…

  6. Justifying group-specific common morality.

    Science.gov (United States)

    Strong, Carson

    2008-01-01

    Some defenders of the view that there is a common morality have conceived such morality as being universal, in the sense of extending across all cultures and times. Those who deny the existence of such a common morality often argue that the universality claim is implausible. Defense of common morality must take account of the distinction between descriptive and normative claims that there is a common morality. This essay considers these claims separately and identifies the nature of the arguments for each claim. It argues that the claim that there is a universal common morality in the descriptive sense has not been successfully defended to date. It maintains that the claim that there is a common morality in the normative sense need not be understood as universalist. This paper advocates the concept of group specific common morality, including country-specific versions. It suggests that both the descriptive and the normative claims that there are country-specific common moralities are plausible, and that a country-specific normative common morality could provide the basis for a country's bioethics.

  7. Justified and unjustified use of growth hormone.

    NARCIS (Netherlands)

    A-J. van der Lely (Aart-Jan)

    2004-01-01

    textabstractGrowth hormone (GH) replacement therapy for children and adults with proven GH deficiency due to a pituitary disorder has become an accepted therapy with proven efficacy. GH is increasingly suggested, however, as a potential treatment for frailty, osteoporosis,

  8. Is the Pro-Network Bias Justified?

    Directory of Open Access Journals (Sweden)

    Rafael Pardo

    2013-07-01

    Full Text Available The academic literature, policy makers, and international organizations often emphasize the value of networks that, allegedly, may contribute to subcontractor upgrading, innovation, and economic welfare. By contrast, it is difficult to assess whether engagement in production outsourcing networks also accrues some advantages to outsourcers (contractors). To research differences between these organizations and vertically integrated organizations, we analyzed a sample of 1,031 industrial plants, statistically representative of firms with more than 50 employees in Spain’s manufacturing industry. We used t-tests, nonparametric tests, and chi-square tests, and hypotheses were tested for three subsets of companies, classified by the R&D intensity of the industry. In each set of industries, subcontracting is systematically associated with small batch production. By contrast, vertically integrated plants are more inclined to use mass production. In every type of industry, subcontracting is a form of governance especially efficient for the diffusion of new technology. Plants that subcontract production are more likely than integrated plants to adopt advanced manufacturing technology, whatever the R&D intensity of the industry. We conclude that outsourcers seem better prepared than vertically integrated organizations to meet customers’ requirements, but employing subcontracting does not necessarily lower their technology needs, as a widespread “pro-network” argument would suggest.

  9. Gastric carcinoma: when is palliative gastrectomy justified?

    Directory of Open Access Journals (Sweden)

    Hubert Scheidbach

    2011-12-01

    Full Text Available Gastric carcinoma is frequently diagnosed at an advanced stage of non-curable tumor growth, characterized by infiltration of the gastric serosa, peritoneal tumor spread and/or metastases within lymph nodes and liver. Currently, there is controversy over the value of palliative resection with regard to its safety and its benefit to patient outcome. Based on the available literature, this overview summarizes the various aspects and interprets the limited data on palliative resection of gastric carcinoma. The available study results may indicate potential for an improved quality of life and a prolongation of survival, provided that morbidity and mortality remain acceptable.

  10. Are ionic CAT contrast media still justifiable

    International Nuclear Information System (INIS)

    Witt, H.; Trempenau, B.; Dietz, G.

    1984-01-01

    The authors' clinical results revealed no statistically significant differences in tolerance between the two X-ray contrast media 'Ioxitalamat' and 'Ioglicinat'. Side-effects were found in 4.3% of the cases for both contrast media, a rate slightly below that for urography. However, it must not be overlooked that patients exposed to certain risk factors, such as relative contraindications, were as far as possible excluded from the study. (orig./WU) [de

  11. Is the public's faith in fusion justified?

    International Nuclear Information System (INIS)

    Wilkie, T.

    1991-01-01

    This paper is in the form of a near-verbatim transcript of the presentation given at SOFT. The intention of the paper was to give an outsider's view of the work of the fusion community and, within limits, to be provocative and engender discussion. The paper is therefore deliberately discursive and non-technical. (orig.)

  12. A hybrid model for combining case-control and cohort studies in systematic reviews of diagnostic tests

    Science.gov (United States)

    Chen, Yong; Liu, Yulun; Ning, Jing; Cormier, Janice; Chu, Haitao

    2014-01-01

    Systematic reviews of diagnostic tests often involve a mixture of case-control and cohort studies. The standard methods for evaluating diagnostic accuracy only focus on sensitivity and specificity and ignore the information on disease prevalence contained in cohort studies. Consequently, such methods cannot provide estimates of measures related to disease prevalence, such as population averaged or overall positive and negative predictive values, which reflect the clinical utility of a diagnostic test. In this paper, we propose a hybrid approach that jointly models the disease prevalence along with the diagnostic test sensitivity and specificity in cohort studies, and the sensitivity and specificity in case-control studies. In order to overcome the potential computational difficulties in the standard full likelihood inference of the proposed hybrid model, we propose an alternative inference procedure based on the composite likelihood. Such composite likelihood based inference does not suffer computational problems and maintains high relative efficiency. In addition, it is more robust to model mis-specifications compared to the standard full likelihood inference. We apply our approach to a review of the performance of contemporary diagnostic imaging modalities for detecting metastases in patients with melanoma. PMID:25897179
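    A rough sketch of the modelling idea, not the authors' estimator: cohort studies contribute binomial terms for prevalence as well as sensitivity and specificity, while case-control studies contribute terms for sensitivity and specificity only, and a composite log-likelihood simply adds the pieces. The data layout and all values below are invented:

```python
# Hedged sketch: composite likelihood combining cohort and case-control
# studies, with a single shared sensitivity/specificity for simplicity.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from scipy.stats import binom

# cohort: (N, n_diseased, true_pos, true_neg); case-control: (cases, TP, controls, TN)
cohort = [(500, 60, 50, 400), (800, 100, 82, 650)]
case_control = [(120, 100, 200, 170)]

def neg_cll(theta):
    prev, se, sp = expit(theta)            # logit scale keeps probs in (0,1)
    ll = 0.0
    for N, nd, tp, tn in cohort:           # cohorts also inform prevalence
        ll += binom.logpmf(nd, N, prev)
        ll += binom.logpmf(tp, nd, se) + binom.logpmf(tn, N - nd, sp)
    for nc, tp, nh, tn in case_control:    # case-control: se and sp only
        ll += binom.logpmf(tp, nc, se) + binom.logpmf(tn, nh, sp)
    return -ll

fit = minimize(neg_cll, np.zeros(3), method="Nelder-Mead")
print(expit(fit.x))                        # prevalence, sensitivity, specificity
```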

  13. Education and gender bias in the sex ratio at birth: evidence from India.

    Science.gov (United States)

    Echávarri, Rebeca A; Ezcurra, Roberto

    2010-02-01

    This article investigates the possible existence of a nonlinear link between female disadvantage in natality and education. To this end, we devise a theoretical model based on the key role of social interaction in explaining people's acquisition of preferences, which justifies the existence of a nonmonotonic relationship between female disadvantage in natality and education. The empirical validity of the proposed model is examined for the case of India, using district-level data. In this context, our econometric analysis pays particular attention to the role of spatial dependence to avoid any potential problems of misspecification. The results confirm that the relationship between the sex ratio at birth and education in India follows an inverted U-shape. This finding is robust to the inclusion of additional explanatory variables in the analysis, and to the choice of the spatial weight matrix used to quantify the spatial interdependence between the sample districts.
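    The inverted U-shape is the kind of relationship a quadratic term can capture. A toy sketch follows (plain OLS on simulated data with invented variable names; the article's analysis additionally models spatial dependence, which is omitted here):

```python
# Minimal sketch of probing a nonmonotonic (inverted-U) relationship.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
edu = rng.uniform(0, 1, 500)                         # education level
srb = 1.05 + 0.20 * edu - 0.22 * edu**2 + rng.normal(0, 0.02, 500)

X = sm.add_constant(np.column_stack([edu, edu**2]))
res = sm.OLS(srb, X).fit()
print(res.params)                 # positive linear, negative quadratic term
print(-res.params[1] / (2 * res.params[2]))          # estimated turning point
```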

  14. Bias and inference from misspecified mixed-effect models in stepped wedge trial analysis.

    Science.gov (United States)

    Thompson, Jennifer A; Fielding, Katherine L; Davey, Calum; Aiken, Alexander M; Hargreaves, James R; Hayes, Richard J

    2017-10-15

    Many stepped wedge trials (SWTs) are analysed by using a mixed-effect model with a random intercept and fixed effects for the intervention and time periods (referred to here as the standard model). However, it is not known whether this model is robust to misspecification. We simulated SWTs with three groups of clusters and two time periods; one group received the intervention during the first period and two groups in the second period. We simulated period and intervention effects that were either common-to-all or varied-between clusters. Data were analysed with the standard model or with additional random effects for period effect or intervention effect. In a second simulation study, we explored the weight given to within-cluster comparisons by simulating a larger intervention effect in the group of the trial that experienced both the control and intervention conditions and applying the three analysis models described previously. Across 500 simulations, we computed bias and confidence interval coverage of the estimated intervention effect. We found up to 50% bias in intervention effect estimates when period or intervention effects varied between clusters and were treated as fixed effects in the analysis. All misspecified models showed undercoverage of 95% confidence intervals, particularly the standard model. A large weight was given to within-cluster comparisons in the standard model. In the SWTs simulated here, mixed-effect models were highly sensitive to departures from the model assumptions, which can be explained by the high dependence on within-cluster comparisons. Trialists should consider including a random effect for time period in their SWT analysis model. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
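    A minimal sketch of the recommended analysis on simulated SWT-like data (simplified to two groups of clusters and two periods; all names, rates and effect sizes are invented): the cluster-level random period effect enters as a variance component alongside the usual random intercept:

```python
# Sketch: standard SWT mixed model plus a random effect for time period.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
clusters, periods, per_cell = 12, 2, 20
rows = []
for c in range(clusters):
    u = rng.normal(0, 1)                      # cluster random intercept
    pe = rng.normal(0, 0.5, periods)          # period effects varying by cluster
    first_treated = 0 if c < 4 else 1         # early vs late switching clusters
    for t in range(periods):
        trt = int(t >= first_treated)
        for _ in range(per_cell):
            rows.append((c, t, trt, u + pe[t] + 0.4 * trt + rng.normal()))
d = pd.DataFrame(rows, columns=["cluster", "period", "trt", "y"])

# Fixed period and treatment effects; random intercept by cluster; random
# period effect as a variance component within clusters.
md = smf.mixedlm("y ~ C(period) + trt", d, groups=d["cluster"],
                 vc_formula={"period": "0 + C(period)"})
print(md.fit().params["trt"])
```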

  15. Hedging endowment assurance products under interest rate and mortality risk

    NARCIS (Netherlands)

    Chen, A.; Mahayni, A.

    2007-01-01

    This paper analyzes how model misspecification associated with both interest rate and mortality risk influences hedging decisions of insurance companies. For this purpose, diverse risk management strategies which are risk-minimizing when model risk is ignored come into consideration. The

  16. Integrated population modeling of black bears in Minnesota: implications for monitoring and management.

    Directory of Open Access Journals (Sweden)

    John R Fieberg

    Full Text Available BACKGROUND: Wildlife populations are difficult to monitor directly because of costs and logistical challenges associated with collecting informative abundance data from live animals. By contrast, data on harvested individuals (e.g., age and sex are often readily available. Increasingly, integrated population models are used for natural resource management because they synthesize various relevant data into a single analysis. METHODOLOGY/PRINCIPAL FINDINGS: We investigated the performance of integrated population models applied to black bears (Ursus americanus in Minnesota, USA. Models were constructed using sex-specific age-at-harvest matrices (1980-2008, data on hunting effort and natural food supplies (which affects hunting success, and statewide mark-recapture estimates of abundance (1991, 1997, 2002. We compared this approach to Downing reconstruction, a commonly used population monitoring method that utilizes only age-at-harvest data. We first conducted a large-scale simulation study, in which our integrated models provided more accurate estimates of population trends than did Downing reconstruction. Estimates of trends were robust to various forms of model misspecification, including incorrectly specified cub and yearling survival parameters, age-related reporting biases in harvest data, and unmodeled temporal variability in survival and harvest rates. When applied to actual data on Minnesota black bears, the model predicted that harvest rates were negatively correlated with food availability and positively correlated with hunting effort, consistent with independent telemetry data. With no direct data on fertility, the model also correctly predicted 2-point cycles in cub production. Model-derived estimates of abundance for the most recent years provided a reasonable match to an empirical population estimate obtained after modeling efforts were completed. CONCLUSIONS/SIGNIFICANCE: Integrated population modeling provided a reasonable

  17. Integrated population modeling of black bears in Minnesota: implications for monitoring and management.

    Science.gov (United States)

    Fieberg, John R; Shertzer, Kyle W; Conn, Paul B; Noyce, Karen V; Garshelis, David L

    2010-08-12

    Wildlife populations are difficult to monitor directly because of costs and logistical challenges associated with collecting informative abundance data from live animals. By contrast, data on harvested individuals (e.g., age and sex) are often readily available. Increasingly, integrated population models are used for natural resource management because they synthesize various relevant data into a single analysis. We investigated the performance of integrated population models applied to black bears (Ursus americanus) in Minnesota, USA. Models were constructed using sex-specific age-at-harvest matrices (1980-2008), data on hunting effort and natural food supplies (which affects hunting success), and statewide mark-recapture estimates of abundance (1991, 1997, 2002). We compared this approach to Downing reconstruction, a commonly used population monitoring method that utilizes only age-at-harvest data. We first conducted a large-scale simulation study, in which our integrated models provided more accurate estimates of population trends than did Downing reconstruction. Estimates of trends were robust to various forms of model misspecification, including incorrectly specified cub and yearling survival parameters, age-related reporting biases in harvest data, and unmodeled temporal variability in survival and harvest rates. When applied to actual data on Minnesota black bears, the model predicted that harvest rates were negatively correlated with food availability and positively correlated with hunting effort, consistent with independent telemetry data. With no direct data on fertility, the model also correctly predicted 2-point cycles in cub production. Model-derived estimates of abundance for the most recent years provided a reasonable match to an empirical population estimate obtained after modeling efforts were completed. Integrated population modeling provided a reasonable framework for synthesizing age-at-harvest data, periodic large-scale abundance estimates, and

  18. Bayesian inference in an extended SEIR model with nonparametric disease transmission rate: an application to the Ebola epidemic in Sierra Leone.

    Science.gov (United States)

    Frasso, Gianluca; Lambert, Philippe

    2016-10-01

    The 2014 Ebola outbreak in Sierra Leone is analyzed using a susceptible-exposed-infectious-removed (SEIR) epidemic compartmental model. The discrete time-stochastic model for the epidemic evolution is coupled to a set of ordinary differential equations describing the dynamics of the expected proportions of subjects in each epidemic state. The unknown parameters are estimated in a Bayesian framework by combining data on the number of new (laboratory confirmed) Ebola cases reported by the Ministry of Health and prior distributions for the transition rates elicited using information collected by the WHO during the follow-up of specific Ebola cases. The time-varying disease transmission rate is modeled in a flexible way using penalized B-splines. Our framework represents a valuable stochastic tool for the study of an epidemic dynamic even when only irregularly observed and possibly aggregated data are available. Simulations and the analysis of the 2014 Sierra Leone Ebola data highlight the merits of the proposed methodology. In particular, the flexible modeling of the disease transmission rate makes the estimation of the effective reproduction number robust to the misspecification of the initial epidemic states and to underreporting of the infectious cases. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
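    To give a flavour of the flexible transmission-rate component, here is a toy deterministic SEIR with a cubic B-spline beta(t); the paper's model is stochastic and Bayesian, and every knot, coefficient and rate below is invented:

```python
# Toy SEIR simulator with a smoothly time-varying transmission rate.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import BSpline

knots = np.r_[[0.0]*4, 50, 100, 150, [200.0]*4]   # cubic spline on [0, 200] days
coef = np.array([0.9, 0.6, 0.35, 0.3, 0.25, 0.2, 0.2])
beta = BSpline(knots, coef, k=3)                  # declining transmission rate

sigma, gamma, N = 1/9.4, 1/7.4, 6e6               # latency, recovery, population

def seir(t, y):
    S, E, I, R = y
    new_inf = beta(t) * S * I / N
    return [-new_inf, new_inf - sigma * E, sigma * E - gamma * I, gamma * I]

sol = solve_ivp(seir, (0, 200), [N - 10, 0, 10, 0], dense_output=True)
print(sol.y[2].max())                             # peak infectious count
```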

  19. Preliminary Multivariable Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. Previously, the authors published two single-variable cost models based on 19 flight missions. The current paper presents the development of a multivariable space telescope cost model. The validity of previously published models is tested, cost estimating relationships which are and are not significant cost drivers are identified, and interrelationships between variables are explored.

  20. Modeling Site Heterogeneity with Posterior Mean Site Frequency Profiles Accelerates Accurate Phylogenomic Estimation.

    Science.gov (United States)

    Wang, Huai-Chun; Minh, Bui Quang; Susko, Edward; Roger, Andrew J

    2018-03-01

    Proteins have distinct structural and functional constraints at different sites that lead to site-specific preferences for particular amino acid residues as the sequences evolve. Heterogeneity in the amino acid substitution process between sites is not modeled by commonly used empirical amino acid exchange matrices. Such model misspecification can lead to artefacts in phylogenetic estimation such as long-branch attraction. Although sophisticated site-heterogeneous mixture models have been developed to address this problem in both Bayesian and maximum likelihood (ML) frameworks, their formidable computational time and memory usage severely limits their use in large phylogenomic analyses. Here we propose a posterior mean site frequency (PMSF) method as a rapid and efficient approximation to full empirical profile mixture models for ML analysis. The PMSF approach assigns a conditional mean amino acid frequency profile to each site calculated based on a mixture model fitted to the data using a preliminary guide tree. These PMSF profiles can then be used for in-depth tree-searching in place of the full mixture model. Compared with widely used empirical mixture models with k classes, our implementation of PMSF in IQ-TREE (http://www.iqtree.org) speeds up the computation by approximately k/1.5-fold and requires a small fraction of the RAM. Furthermore, this speedup allows, for the first time, full nonparametric bootstrap analyses to be conducted under complex site-heterogeneous models on large concatenated data matrices. Our simulations and empirical data analyses demonstrate that PMSF can effectively ameliorate long-branch attraction artefacts. In some empirical and simulation settings PMSF provided more accurate estimates of phylogenies than the mixture models from which they derive.
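    The core PMSF computation is a posterior-weighted average of mixture profiles. A minimal numpy sketch follows, with random stand-ins for the profiles and for the guide-tree site likelihoods:

```python
# Sketch of the PMSF idea: each site gets the posterior-mean frequency
# profile implied by its data under a fitted k-class mixture model.
import numpy as np

rng = np.random.default_rng(3)
k, states, sites = 4, 20, 100
profiles = rng.dirichlet(np.ones(states), size=k)   # k amino-acid profiles
weights = np.full(k, 1 / k)                         # mixture weights
site_lik = rng.random((sites, k))   # stand-in for L(site | class) on a guide tree

post = weights * site_lik                           # unnormalised posterior
post /= post.sum(axis=1, keepdims=True)             # class responsibilities
pmsf = post @ profiles                              # one profile per site
print(pmsf.shape, pmsf.sum(axis=1)[:3])             # each row sums to 1
```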

  1. Numerical Modeling of Rotary Kiln Productivity Increase

    NARCIS (Netherlands)

    Romero-Valle, M.A.; Pisaroni, M.; Van Puyvelde, D.; Lahaye, D.J.P.; Sadi, R.

    2013-01-01

    Rotary kilns are used in many industrial processes ranging from cement manufacturing to waste incineration. The operating conditions vary widely depending on the process. While there are many models available within the literature and industry, the wide range of operating conditions justifies

  2. Statistical models for brain signals with properties that evolve across trials.

    Science.gov (United States)

    Ombao, Hernando; Fiecas, Mark; Ting, Chee-Ming; Low, Yin Fen

    2017-12-07

    Most neuroscience cognitive experiments involve repeated presentations of various stimuli across several minutes or a few hours. It has been observed that brain responses, even to the same stimulus, evolve over the course of the experiment. These changes in brain activation and connectivity are believed to be associated with learning and/or habituation. In this paper, we present two general approaches to modeling dynamic brain connectivity using electroencephalograms (EEGs) recorded across replicated trials in an experiment. The first approach is the Markovian regime-switching vector autoregressive model (MS-VAR) which treats EEGs as realizations of an underlying brain process that switches between different states both within a trial and across trials in the entire experiment. The second is the slowly evolutionary locally stationary process (SEv-LSP) which characterizes the observed EEGs as a mixture of oscillatory activities at various frequency bands. The SEv-LSP model captures the dynamic nature of the amplitudes of the band-oscillations and cross-correlations between them. The MS-VAR model is able to capture abrupt changes in the dynamics while the SEv-LSP directly gives interpretable results. Moreover, it is nonparametric and hence does not suffer from model misspecification. For both of these models, time-evolving connectivity metrics in the frequency domain are derived from the model parameters for both functional and effective connectivity. We illustrate these two models for estimating cross-trial connectivity in selective attention using EEG data from an oddball paradigm auditory experiment where the goal is to characterize the evolution of brain responses to target stimuli and to standard tones presented randomly throughout the entire experiment. The results suggest dynamic changes in connectivity patterns over trials with inter-subject variability. Copyright © 2017. Published by Elsevier Inc.

  3. Statistical models for brain signals with properties that evolve across trials

    KAUST Repository

    Ombao, Hernando

    2017-12-07

    Most neuroscience cognitive experiments involve repeated presentations of various stimuli across several minutes or a few hours. It has been observed that brain responses, even to the same stimulus, evolve over the course of the experiment. These changes in brain activation and connectivity are believed to be associated with learning and/or habituation. In this paper, we present two general approaches to modeling dynamic brain connectivity using electroencephalograms (EEGs) recorded across replicated trials in an experiment. The first approach is the Markovian regime-switching vector autoregressive model (MS-VAR) which treats EEGs as realizations of an underlying brain process that switches between different states both within a trial and across trials in the entire experiment. The second is the slowly evolutionary locally stationary process (SEv-LSP) which characterizes the observed EEGs as a mixture of oscillatory activities at various frequency bands. The SEv-LSP model captures the dynamic nature of the amplitudes of the band-oscillations and cross-correlations between them. The MS-VAR model is able to capture abrupt changes in the dynamics while the SEv-LSP directly gives interpretable results. Moreover, it is nonparametric and hence does not suffer from model misspecification. For both of these models, time-evolving connectivity metrics in the frequency domain are derived from the model parameters for both functional and effective connectivity. We illustrate these two models for estimating cross-trial connectivity in selective attention using EEG data from an oddball paradigm auditory experiment where the goal is to characterize the evolution of brain responses to target stimuli and to standard tones presented randomly throughout the entire experiment. The results suggest dynamic changes in connectivity patterns over trials with inter-subject variability.

  4. Discrimination of Semi-Quantitative Models by Experiment Selection: Method Application in Population Biology

    NARCIS (Netherlands)

    Vatcheva, Ivayla; Bernard, Olivier; de Jong, Hidde; Gouze, Jean-Luc; Mars, Nicolaas; Nebel, B.

    2001-01-01

    Modeling an experimental system often results in a number of alternative models that are justified equally well by the experimental data. In order to discriminate between these models, additional experiments are needed. We present a method for the discrimination of models in the form of

  5. Kinematic Cosmology & a new ``Steady State'' Model of Continued Creation

    Science.gov (United States)

    Wegener, Mogens

    2006-03-01

    Only a new "steady state" model justifies the observations of fully mature galaxies at ever increasing distances. The basic idea behind the world model presented here, which is a synthesis of the cosmologies of Parmenides and Herakleitos, is that the invariant structure of the infinite contents of a universe in flux may be depicted as a finite hyperbolic pseudo-sphere.

  6. Declarative versus imperative process modeling languages : the issue of maintainability

    NARCIS (Netherlands)

    Fahland, D.; Mendling, J.; Reijers, H.A.; Weber, B.; Weidlich, M.; Zugal, S.; Rinderle-Ma, S.; Sadiq, S.; Leymann, F.

    2010-01-01

    The rise of interest in declarative languages for process modeling both justifies and demands empirical investigations into their presumed advantages over more traditional, imperative alternatives. Our concern in this paper is with the ease of maintaining business process models, for example due to

  7. Testing for Stock Market Contagion: A Quantile Regression Approach

    NARCIS (Netherlands)

    S.Y. Park (Sung); W. Wang (Wendun); N. Huang (Naijing)

    2015-01-01

    Regarding the asymmetric and leptokurtic behavior of financial data, we propose a new contagion test in the quantile regression framework that is robust to model misspecification. Unlike conventional correlation-based tests, the proposed quantile contagion test

  8. Diagnostic Value of MRI in Patients With Implanted Pacemakers and Implantable Cardioverter-Defibrillators Across a Cross Population: Does the Benefit Justify the Risk? A Proof of Concept Study.

    Science.gov (United States)

    Samar, Huma; Yamrozik, June A; Williams, Ronald B; Doyle, Mark; Shah, Moneal; Bonnet, Christopher A; Biederman, Robert W W

    2017-09-01

    The objective of this study was to assess the diagnostic usefulness of thoracic and nonthoracic magnetic resonance imaging (MRI) imaging in patients with implantable cardiac devices (permanent pacemaker or implantable cardioverter-defibrillators [ICDs]) to determine if there was a substantial benefit to patients with regard to diagnosis and/or management. MRI is infrequently performed on patients with conventional pacemakers or ICDs. Multiple studies have documented the safety of MRI scans in patients with implanted devices, yet the diagnostic value of this approach has not been established. Evaluation data were acquired in 136 patients with implanted cardiac devices who underwent MRIs during a 10-year period at a single institution. Specific criteria were followed for all patients to objectively define if the diagnosis by MRI enhanced patient care; 4 questions were answered after scan interpretation by both MRI technologists and MRI physicians who performed the scan. 1) Did the primary diagnosis change? 2) Did the MRI provide additional information to the existing diagnosis? 3) Was the pre-MRI (tentative) diagnosis confirmed? 4) Did patient management change? If "Yes" was answered to any of the preceding questions, the MRI scan was considered to be of value to patient diagnosis and/or therapy. In 97% (n = 132) of patients, MR added value to patient diagnosis and management. In 49% (n = 67) of patients, MRI added additional valuable information to the primary diagnosis, and in 30% (n = 41) of patients, MRI changed the principle diagnosis and subsequent management of the patient. No safety issues were encountered, and no adverse effects of undergoing the MRI scan were noted in any patient. MRI in patients with implanted pacemakers and defibrillators added value to patient diagnosis and management, which justified the risk of the procedure. Published by Elsevier Inc.

  9. Stransham-Ford v Minister of Justice and Correctional Services and Others: Can active voluntary euthanasia and doctor-assisted suicide be legally justified and are they consistent with the biomedical ethical principles? Some suggested guidelines for doctors

    Directory of Open Access Journals (Sweden)

    David McQuoid-Mason

    2015-11-01

    Full Text Available The recent case of Stransham-Ford v Minister of Justice and Correctional Services and Others held that voluntary active euthanasia and doctor assisted suicide may be legally justified in certain circumstances. The court observed that the distinction between ‘active’ and ‘passive’ voluntary euthanasia is not legally tenable as in both instances the doctors concerned have the ‘actual’ or ‘eventual’ intention to terminate the patient’s life and have caused or hastened the patient’s death. It is argued that as the South African Constitution is the supreme law of the country, the fundamental rights of patients guaranteed in the Constitution cannot be undermined by ethical duties imposed on health care practitioners by international and national professional bodies. The court in the Stransham-Ford case did not use ethical theories and principles to decide the matter. It simply applied the values in the Constitution and the provisions of the Bill of Rights. However, in order to assist medical practitioners with practical guidelines with which many of them are familiar - rather than complicated unfamiliar philosophical arguments - the biomedical ethical principles of patient autonomy, beneficence, non-maleficence and justice or fairness are applied to active voluntary euthanasia and doctor-assisted suicide in the context of the Stransham-Ford case. Although the case has not set a precedent or opened the floodgates to doctor-assisted voluntary active euthanasia and it is open to Parliament, the Constitutional Court or other courts to develop the concept or outlaw it, some guidelines are offered for doctors to consider should they be authorized by a court to assist with voluntary active euthanasia.

  10. HEDR modeling approach

    International Nuclear Information System (INIS)

    Shipler, D.B.; Napier, B.A.

    1992-07-01

    This report details the conceptual approaches to be used in calculating radiation doses to individuals throughout the various periods of operations at the Hanford Site. The report considers the major environmental transport pathways--atmospheric, surface water, and ground water--and projects an appropriate modeling technique for each. The modeling sequence chosen for each pathway depends on the available data on doses, the degree of confidence justified by such existing data, and the level of sophistication deemed appropriate for the particular pathway and time period being considered.

  11. Robust inference in the negative binomial regression model with an application to falls data.

    Science.gov (United States)

    Aeberhard, William H; Cantoni, Eva; Heritier, Stephane

    2014-12-01

    A popular way to model overdispersed count data, such as the number of falls reported during intervention studies, is by means of the negative binomial (NB) distribution. Classical estimating methods are well-known to be sensitive to model misspecifications, taking the form of patients falling much more than expected in such intervention studies where the NB regression model is used. We extend in this article two approaches for building robust M-estimators of the regression parameters in the class of generalized linear models to the NB distribution. The first approach achieves robustness in the response by applying a bounded function on the Pearson residuals arising in the maximum likelihood estimating equations, while the second approach achieves robustness by bounding the unscaled deviance components. For both approaches, we explore different choices for the bounding functions. Through a unified notation, we show how close these approaches may actually be as long as the bounding functions are chosen and tuned appropriately, and provide the asymptotic distributions of the resulting estimators. Moreover, we introduce a robust weighted maximum likelihood estimator for the overdispersion parameter, specific to the NB distribution. Simulations under various settings show that redescending bounding functions yield estimates with smaller biases under contamination while keeping high efficiency at the assumed model, and this for both approaches. We present an application to a recent randomized controlled trial measuring the effectiveness of an exercise program at reducing the number of falls among people suffering from Parkinson's disease to illustrate the diagnostic use of such robust procedures and their need for reliable inference. © 2014, The International Biometric Society.
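    A deliberately simplified sketch of the first approach: bound the Pearson residuals with a Huber-type function inside the estimating equations. It treats the overdispersion parameter as known and omits the Fisher-consistency correction term of the published estimator; all data are simulated:

```python
# Simplified robust NB regression via Huber-bounded Pearson residuals.
import numpy as np
from scipy.optimize import root

rng = np.random.default_rng(4)
n, alpha, c = 300, 0.5, 1.345
X = np.column_stack([np.ones(n), rng.normal(size=n)])
mu_true = np.exp(X @ np.array([1.0, 0.5]))
y = rng.negative_binomial(1/alpha, 1/(1 + alpha*mu_true)).astype(float)
y[:5] = 60.0                                # a few gross outliers

def huber(r):                               # bounded influence function
    return np.clip(r, -c, c)

def ee(beta):                               # robust estimating equations
    mu = np.exp(X @ beta)
    v = mu + alpha * mu**2                  # NB2 variance function
    r = (y - mu) / np.sqrt(v)               # Pearson residuals
    return X.T @ (huber(r) / np.sqrt(v) * mu)   # mu = d(mu)/d(eta), log link

print(root(ee, np.zeros(2)).x)              # compare with ML under outliers
```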

  12. Experiment selection for the discrimination of semi-quantitative models of dynamical systems

    NARCIS (Netherlands)

    Vatcheva, I; de Jong, H; Bernard, O; Mars, NJI

    Modeling an experimental system often results in a number of alternative models that are all justified by the available experimental data. To discriminate among these models, additional experiments are needed. Existing methods for the selection of discriminatory experiments in statistics and in

  13. Conclusion of LOD-score analysis for family data generated under two-locus models.

    Science.gov (United States)

    Dizier, M H; Babron, M C; Clerget-Darpoux, F

    1996-06-01

    The power to detect linkage by the LOD-score method is investigated here for diseases that depend on the effects of two genes. The classical strategy is, first, to detect a major-gene (MG) effect by segregation analysis and, second, to seek for linkage with genetic markers by the LOD-score method using the MG parameters. We already showed that segregation analysis can lead to evidence for a MG effect for many two-locus models, with the estimates of the MG parameters being very different from those of the two genes involved in the disease. We show here that use of these MG parameter estimates in the LOD-score analysis may lead to a failure to detect linkage for some two-locus models. For these models, use of the sib-pair method gives a non-negligible increase of power to detect linkage. The linkage-homogeneity test among subsamples differing for the familial disease distribution provides evidence of parameter misspecification, when the MG parameters are used. Moreover, for most of the models, use of the MG parameters in LOD-score analysis leads to a large bias in estimation of the recombination fraction and sometimes also to a rejection of linkage for the true recombination fraction. A final important point is that a strong evidence of an MG effect, obtained by segregation analysis, does not necessarily imply that linkage will be detected for at least one of the two genes, even with the true parameters and with a close informative marker.
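    For orientation, the LOD score in the simplest phase-known setting is a log10 likelihood ratio in the recombination fraction; a tiny single-locus sketch with invented counts:

```python
# Minimal LOD-score calculation for phase-known meioses: R recombinants
# and NR non-recombinants observed, tested against free recombination.
import numpy as np

def lod(theta, R, NR):
    # log10 likelihood ratio against theta = 0.5
    return (R * np.log10(theta) + NR * np.log10(1 - theta)
            - (R + NR) * np.log10(0.5))

thetas = np.linspace(0.01, 0.5, 50)
scores = lod(thetas, R=2, NR=18)
print(thetas[scores.argmax()], scores.max())   # MLE near R/(R+NR) = 0.1
```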

  14. Conclusions of LOD-score analysis for family data generated under two-locus models

    Energy Technology Data Exchange (ETDEWEB)

    Dizier, M.H.; Babron, M.C.; Clerget-Darpoux, F. [Unite de Recherches d'Epidemiologie Genetique, Paris (France)]

    1996-06-01

    The power to detect linkage by the LOD-score method is investigated here for diseases that depend on the effects of two genes. The classical strategy is, first, to detect a major-gene (MG) effect by segregation analysis and, second, to seek for linkage with genetic markers by the LOD-score method using the MG parameters. We already showed that segregation analysis can lead to evidence for a MG effect for many two-locus models, with the estimates of the MG parameters being very different from those of the two genes involved in the disease. We show here that use of these MG parameter estimates in the LOD-score analysis may lead to a failure to detect linkage for some two-locus models. For these models, use of the sib-pair method gives a non-negligible increase of power to detect linkage. The linkage-homogeneity test among subsamples differing for the familial disease distribution provides evidence of parameter misspecification, when the MG parameters are used. Moreover, for most of the models, use of the MG parameters in LOD-score analysis leads to a large bias in estimation of the recombination fraction and sometimes also to a rejection of linkage for the true recombination fraction. A final important point is that a strong evidence of an MG effect, obtained by segregation analysis, does not necessarily imply that linkage will be detected for at least one of the two genes, even with the true parameters and with a close informative marker. 17 refs., 3 tabs.

  15. Modelling

    CERN Document Server

    Spädtke, P

    2013-01-01

    Modeling of technical machines became a standard technique once computers became powerful enough to handle the amount of data relevant to the specific system. Simulation of an existing physical device requires knowledge of all relevant quantities. Electric fields given by the surrounding boundary, as well as magnetic fields caused by coils or permanent magnets, have to be known. Internal sources for both fields are sometimes taken into account, such as space-charge forces or the internal magnetic field of a moving bunch of charged particles. The solver routines used are briefly described, and some benchmarking is shown to estimate the computing times necessary for different problems. Different types of charged-particle sources are presented together with suitable models describing their physics. Electron guns are covered as well as different ion sources (volume ion sources, laser ion sources, Penning ion sources, electron resonance ion sources, and H⁻ sources) together with some remarks on beam transport.

  16. Islamic vs. conventional banks : Business models, efficiency and stability

    NARCIS (Netherlands)

    Beck, T.H.L.; Demirgüc-Kunt, A.; Merrouche, O.

    2013-01-01

    How different are Islamic banks from conventional banks? Does the recent crisis justify a closer look at the Sharia-compliant business model for banking? When comparing conventional and Islamic banks, controlling for time-variant country-fixed effects, we find few significant differences in business

  17. Kinetic models in spin chemistry. 1. The hyperfine interaction

    DEFF Research Database (Denmark)

    Mojaza, M.; Pedersen, J. B.

    2012-01-01

    Kinetic models for quantum systems are quite popular due to their simplicity, although they are difficult to justify. We show that the transformation from quantum to kinetic description can be done exactly for the hyperfine interaction of one nuclei with arbitrary spin; more spins are described w...... induced enhancement of the reaction yield. (C) 2012 Elsevier B.V. All rights reserved....

  18. Preliminary Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Prince, F. Andrew; Smart, Christian; Stephens, Kyle; Henrichs, Todd

    2009-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. However, great care is required. Some space telescope cost models, such as those based only on mass, lack sufficient detail to support such analysis and may lead to inaccurate conclusions. Similarly, using ground based telescope models which include the dome cost will also lead to inaccurate conclusions. This paper reviews current and historical models. Then, based on data from 22 different NASA space telescopes, this paper tests those models and presents preliminary analysis of single and multi-variable space telescope cost models.
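    Cost estimating relationships of this kind are commonly fitted as power laws on log scales. A toy multivariable fit on synthetic "missions" illustrates the mechanics; the sample size matches the paper's 22 telescopes, but every number below is invented:

```python
# Toy power-law cost model: log(cost) regressed on log-scale cost drivers.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 22                                     # synthetic "missions"
aperture = rng.uniform(0.3, 3.0, n)        # metres
mass = 200 * aperture**1.8 * rng.lognormal(0, 0.2, n)
cost = 100 * aperture**1.6 * mass**0.3 * rng.lognormal(0, 0.3, n)

X = sm.add_constant(np.column_stack([np.log(aperture), np.log(mass)]))
res = sm.OLS(np.log(cost), X).fit()
print(res.params)                          # recovered power-law exponents
```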

  19. EM Adaptive LASSO – A Multilocus Modeling Strategy for Detecting SNPs Associated With Zero-inflated Count Phenotypes

    Directory of Open Access Journals (Sweden)

    Himel eMallick

    2016-03-01

    Full Text Available Count data are increasingly ubiquitous in genetic association studies, where it is possible to observe excess zero counts as compared to what is expected based on standard assumptions. For instance, in rheumatology, data are usually collected in multiple joints within a person or multiple sub-regions of a joint, and it is not uncommon that the phenotypes contain an enormous number of zeros due to the presence of excessive zero counts in the majority of patients. Most existing statistical methods assume that the count phenotypes follow one of these four distributions with appropriate dispersion-handling mechanisms: Poisson, Zero-inflated Poisson (ZIP), Negative Binomial, and Zero-inflated Negative Binomial (ZINB). However, little is known about their implications in genetic association studies. Also, there is a relative paucity of literature on their usefulness with respect to model misspecification and variable selection. In this article, we have investigated the performance of several state-of-the-art approaches for handling zero-inflated count data along with a novel penalized regression approach with an adaptive LASSO penalty, by simulating data under a variety of disease models and linkage disequilibrium patterns. By taking into account data-adaptive weights in the estimation procedure, the proposed method provides greater flexibility in multi-SNP modeling of zero-inflated count phenotypes. A fast coordinate descent algorithm nested within an EM (expectation-maximization) algorithm is implemented for estimating the model parameters and conducting variable selection simultaneously. Results show that the proposed method has optimal performance in the presence of multicollinearity, as measured by both prediction accuracy and empirical power, which is especially apparent as the sample size increases. Moreover, the Type I error rates become more or less uncontrollable for the competing methods when a model is misspecified, a phenomenon routinely
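    As a baseline for the kind of model being penalised here, a sketch of a single-SNP zero-inflated Poisson fit on simulated dosages; statsmodels' ZeroInflatedPoisson stands in for the article's EM adaptive LASSO machinery, and all effect sizes are invented:

```python
# Sketch: zero-inflated Poisson regression of a count phenotype on a SNP.
import numpy as np
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(5)
n = 1000
snp = rng.integers(0, 3, n).astype(float)     # 0/1/2 minor-allele dosage
lam = np.exp(0.2 + 0.4 * snp)                 # count-part mean
zero = rng.random(n) < 0.3                    # structural (excess) zeros
y = np.where(zero, 0, rng.poisson(lam))

X = np.column_stack([np.ones(n), snp])
fit = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit(disp=0)
print(fit.params)             # inflation logit, then intercept and SNP effect
```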

  20. A new model of wheezing severity in young children using the validated ISAAC wheezing module: A latent variable approach with validation in independent cohorts.

    Science.gov (United States)

    Brunwasser, Steven M; Gebretsadik, Tebeb; Gold, Diane R; Turi, Kedir N; Stone, Cosby A; Datta, Soma; Gern, James E; Hartert, Tina V

    2018-01-01

    The International Study of Asthma and Allergies in Children (ISAAC) Wheezing Module is commonly used to characterize pediatric asthma in epidemiological studies, including nearly all airway cohorts participating in the Environmental Influences on Child Health Outcomes (ECHO) consortium. However, there is no consensus model for operationalizing wheezing severity with this instrument in explanatory research studies. Severity is typically measured using coarsely-defined categorical variables, reducing power and potentially underestimating etiological associations. More precise measurement approaches could improve testing of etiological theories of wheezing illness. We evaluated a continuous latent variable model of pediatric wheezing severity based on four ISAAC Wheezing Module items. Analyses included subgroups of children from three independent cohorts whose parents reported past wheezing: infants ages 0-2 in the INSPIRE birth cohort study (Cohort 1; n = 657), 6-7-year-old North American children from Phase One of the ISAAC study (Cohort 2; n = 2,765), and 5-6-year-old children in the EHAAS birth cohort study (Cohort 3; n = 102). Models were estimated using structural equation modeling. In all cohorts, covariance patterns implied by the latent variable model were consistent with the observed data, as indicated by non-significant χ2 goodness of fit tests (no evidence of model misspecification). Cohort 1 analyses showed that the latent factor structure was stable across time points and child sexes. In both cohorts 1 and 3, the latent wheezing severity variable was prospectively associated with wheeze-related clinical outcomes, including physician asthma diagnosis, acute corticosteroid use, and wheeze-related outpatient medical visits when adjusting for confounders. We developed an easily applicable continuous latent variable model of pediatric wheezing severity based on items from the well-validated ISAAC Wheezing Module. This model prospectively associates with

  1. The use of logistic regression in modelling the distributions of bird ...

    African Journals Online (AJOL)

    The method of logistic regression was used to model the observed geographical distribution patterns of bird species in Swaziland in relation to a set of environmental variables. Reporting rates derived from bird atlas data are used as an index of population densities. This is justified in part by the success of the modelling ...
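
    A minimal sketch of this kind of model, with invented covariates standing in for the Swaziland environmental variables and presence/absence generated from an assumed logistic relationship:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)

      # invented environmental covariates for 300 atlas grid cells
      n = 300
      rainfall = rng.normal(700, 150, n)      # mm per year
      altitude = rng.normal(800, 200, n)      # metres
      X = sm.add_constant(np.column_stack([rainfall, altitude]))

      # presence/absence generated from an assumed logistic relationship
      logit_p = -4 + 0.006 * rainfall - 0.001 * altitude
      present = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

      fit = sm.Logit(present, X).fit(disp=0)
      print(fit.params)    # intercept plus effects of rainfall and altitude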

  2. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify the technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends toward strengthening the general-education and worldview functions of computer science call for additional research on the…

  3. Inhomogeneous Markov Models for Describing Driving Patterns

    DEFF Research Database (Denmark)

    Iversen, Emil Banning; Møller, Jan K.; Morales, Juan Miguel

    2017-01-01

    . Specifically, an inhomogeneous Markov model that captures the diurnal variation in the use of a vehicle is presented. The model is defined by the time-varying probabilities of starting and ending a trip, and is justified due to the uncertainty associated with the use of the vehicle. The model is fitted to data...... collected from the actual utilization of a vehicle. Inhomogeneous Markov models imply a large number of parameters. The number of parameters in the proposed model is reduced using B-splines....
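
    The structure of such a model can be sketched with assumed (not fitted) time-varying probabilities; in the paper, curves like the trip-start probability below are represented with B-splines to keep the parameter count down.

      import numpy as np

      rng = np.random.default_rng(3)

      # two states: 0 = parked, 1 = driving; probabilities vary by time of day
      t = np.arange(96)                       # 15-minute slots over 24 hours
      p_start = (0.02 + 0.10 * np.exp(-((t - 34) / 6.0) ** 2)
                 + 0.08 * np.exp(-((t - 70) / 8.0) ** 2))   # morning/evening peaks
      p_end = np.full(96, 0.25)               # chance an ongoing trip ends per slot

      def simulate_day():
          state, path = 0, []
          for k in t:
              if state == 0 and rng.random() < p_start[k]:
                  state = 1                   # a trip starts
              elif state == 1 and rng.random() < p_end[k]:
                  state = 0                   # the trip ends
              path.append(state)
          return path

      days = np.array([simulate_day() for _ in range(200)])
      print("share of vehicles driving, slots 32-40:",
            np.round(days.mean(axis=0)[32:40], 2))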

  4. Proposal for product development model focused on ce certification methodology

    Directory of Open Access Journals (Sweden)

    Nathalia Marcia Goulart Pinheiro

    2015-09-01

    Full Text Available This paper presents a critical analysis comparing 21 product development models in order to identify whether these structures meet the demands Product Certification of the European Community (CE. Furthermore, it presents a product development model, comprising the steps in the models analyzed, including improvements in activities for referred product certification. The proposed improvements are justified by the growing quest for the internationalization of products and processes within companies.

  5. A thermal model of the economy

    Science.gov (United States)

    Arroyo Colon, Luis Balbino

    The motivation for this work came from an interest in Economics (particularly since the 2008 economic downturn) and a desire to use the tools of physics in a field that has not been the subject of great exploration. We propose a model of economics in analogy to thermodynamics and introduce the concept of the Value Multiplier as a fundamental addition to any such model. Firstly, we attempt to make analogies between some economic concepts and fundamental concepts of thermal physics. Then we introduce the value multiplier and justify its existence in our system; the value multiplier allows us to account for some intangible, psychological elements of the value of goods and services. We finally bring all the elements together in a qualitative system. In particular, we attempt to make an analogy with the Keynesian Multiplier that justifies the usefulness of fiscal stimulus in severe economic downturns.

  6. On-line validation of linear process models using generalized likelihood ratios

    International Nuclear Information System (INIS)

    Tylee, J.L.

    1981-12-01

    A real-time method for testing the validity of linear models of nonlinear processes is described and evaluated. Using generalized likelihood ratios, the model dynamics are continually monitored to see if the process has moved far enough away from the nominal linear model operating point to justify generation of a new linear model. The method is demonstrated using a seventh-order model of a natural circulation steam generator
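
    The core idea can be sketched as a generalized likelihood ratio test on the model residuals, here reduced to a Gaussian mean shift of unknown size and onset (a simplification of the full dynamic-model monitoring the paper describes; the noise level, drift size and threshold logic are all assumptions):

      import numpy as np

      rng = np.random.default_rng(4)

      # residuals of the nominal linear model; a bias appears when the process
      # drifts away from the operating point the model was linearised around
      resid = np.r_[rng.normal(0, 1, 200), rng.normal(1.5, 1, 100)]

      def glr_mean_shift(r, sigma=1.0):
          # generalized likelihood ratio for a mean shift of unknown size and onset
          c, n = np.cumsum(r), len(r)
          best, onset = 0.0, None
          for k in range(n - 1):              # candidate onset after sample k
              m = n - (k + 1)
              shift_mle = (c[-1] - c[k]) / m  # MLE of the shift on the tail segment
              stat = m * shift_mle ** 2 / (2 * sigma ** 2)
              if stat > best:
                  best, onset = stat, k + 1
          return best, onset

      stat, onset = glr_mean_shift(resid)
      print(f"log-GLR {stat:.1f}, estimated onset {onset}")
      # exceeding a chosen threshold would trigger generation of a new linear model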

  7. The disruption management model.

    Science.gov (United States)

    McAlister, James

    2011-10-01

    Within all organisations, business continuity disruptions present a set of dilemmas that managers may not have dealt with before in their normal daily duties. The disruption management model provides a simple but effective management tool to enable crisis management teams to stay focused on recovery in the midst of a business continuity incident. The model has four chronological primary headlines, which steer the team through a quick-time crisis decision-making process. The procedure facilitates timely, systematic, rationalised and justified decisions, which can withstand post-event scrutiny. The disruption management model has been thoroughly tested within an emergency services environment and is proven to significantly support clear and concise decision making in a business continuity context.

  8. Causal Inference and Model Selection in Complex Settings

    Science.gov (United States)

    Zhao, Shandong

    Propensity score methods have become a part of the standard toolkit for applied researchers who wish to ascertain causal effects from observational data. While they were originally developed for binary treatments, several researchers have proposed generalizations of the propensity score methodology for non-binary treatment regimes. In this article, we first review three main methods that generalize propensity scores in this direction, namely, inverse propensity weighting (IPW), the propensity function (P-FUNCTION), and the generalized propensity score (GPS), along with recent extensions of the GPS that aim to improve its robustness. We compare the assumptions, theoretical properties, and empirical performance of these methods. We propose three new methods that provide robust causal estimation based on the P-FUNCTION and GPS. While our proposed P-FUNCTION-based estimator performs well, we generally advise caution in that all available methods can be biased by model misspecification and extrapolation. In a related line of research, we consider adjustment for posttreatment covariates in causal inference. Even in a randomized experiment, observations might have different compliance performance under treatment and control assignment. This posttreatment covariate cannot be adjusted for using standard statistical methods. We review the principal stratification framework, which allows for modeling this effect as part of its Bayesian hierarchical models. We generalize the current model to add the possibility of adjusting for pretreatment covariates. We also propose a new estimator of the average treatment effect over the entire population. In a third line of research, we discuss the spectral line detection problem in high energy astrophysics. We carefully review how this problem can be statistically formulated as a precise hypothesis test with a point null hypothesis, why a usual likelihood ratio test does not apply for problems of this nature, and a workable fix to correctly…
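
    For the binary-treatment case that these methods generalize, a minimal IPW sketch (simulated data; sklearn's logistic regression as the propensity model) looks as follows, and the authors' caveat applies directly: misspecifying the propensity model biases the estimate.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(5)

      # simulated observational data with confounding; true treatment effect = 1
      n = 2000
      x = rng.normal(size=(n, 2))
      e_true = 1 / (1 + np.exp(-(0.8 * x[:, 0] - 0.5 * x[:, 1])))
      z = rng.binomial(1, e_true)
      y = z + x[:, 0] + rng.normal(size=n)

      # fit a propensity model, then weight outcomes by inverse propensities
      e_hat = LogisticRegression().fit(x, z).predict_proba(x)[:, 1]
      ate = np.mean(z * y / e_hat) - np.mean((1 - z) * y / (1 - e_hat))
      print(f"IPW estimate of the treatment effect: {ate:.2f}")
      # a misspecified propensity model would bias this estimate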

  9. Plume and Dose Modeling Performed to Assess Waste Management Enhancements Associated with Envirocare's Decision to Purchase an Engineered Rail Rollover Facility Enclosure

    International Nuclear Information System (INIS)

    Rogers, T.; Clayman, B.

    2003-01-01

    This paper describes the modeling performed on a proposed enclosure for the existing railcar rollover facility located in Clive, Utah at a radioactive waste disposal site owned and operated by Envirocare of Utah, Inc. (Envirocare). The dose and plume modeling information was used as a tool to justify the decision to make the capital purchase and realize the modeled performance enhancements

  10. A mathematical model of star formation in the Galaxy

    Directory of Open Access Journals (Sweden)

    M.A. Sharaf

    2012-06-01

    Full Text Available This paper is generally concerned with star formation in the Galaxy, especially blue stars. Blue stars are the most luminous and massive, and the largest in radius. A simple mathematical model of the formation of these stars is established and cast as a computational algorithm. This algorithm enables us to learn more about star formation. Some real and artificial examples have been used to justify this model.

  11. Method of modeling the cognitive radio using Opnet Modeler

    OpenAIRE

    Yakovenko, I. V.; Poshtarenko, V. M.; Kostenko, R. V.

    2012-01-01

    This article reviews the first wireless standard based on cognitive radio networks and motivates the need for wireless networks built on cognitive radio technology. As an example, the IEEE 802.22 standard was applied in a WiMAX network and implemented in the Opnet Modeler simulation environment. Plots verify the performance of the HTTP and FTP protocols in the CR network. Simulation results justify the use of the IEEE 802.22 standard in wireless networks.

  12. A random effects meta-analysis model with Box-Cox transformation.

    Science.gov (United States)

    Yamaguchi, Yusuke; Maruo, Kazushi; Partlett, Christopher; Riley, Richard D

    2017-07-19

    In a random effects meta-analysis model, true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption and the misspecification of the random effects distribution may result in a misleading estimate of the overall mean for the treatment effect, an inappropriate quantification of heterogeneity across studies and a wrongly symmetric prediction interval. We focus on problems caused by an inappropriate normality assumption of the random effects distribution, and propose a novel random effects meta-analysis model where a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise an overall distribution of observed treatment effect estimates, which is the sum of the within-study sampling distributions and the random effects distribution. When sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range and a prediction interval. The model can be applied to any kind of variable once the treatment effect estimate is defined from the variable. A simulation study suggested that when the overall distribution of treatment effect estimates is skewed, the overall mean and conventional I² from the normal random effects model could be inappropriate summaries, and the proposed model helped reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences in summary results, heterogeneity measures and prediction intervals from the normal random effects model. The…
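
    A toy sketch of the transformation step (invented numbers, and deliberately ignoring the within-study sampling variances that the authors' Bayesian model handles) shows how the median-and-quantiles summary arises:

      import numpy as np
      from scipy import stats, special

      # invented skewed treatment-effect estimates from eight studies
      theta = np.array([0.10, 0.18, 0.25, 0.30, 0.42, 0.60, 1.10, 2.05])
      shift = 1.0                            # Box-Cox requires positive inputs
      z, lam = stats.boxcox(theta + shift)   # ML choice of the transformation

      # crude summary on the (approximately normal) transformed scale
      mu, tau = z.mean(), z.std(ddof=1)
      q1, q3 = mu - 0.6745 * tau, mu + 0.6745 * tau

      back = lambda v: special.inv_boxcox(v, lam) - shift   # undo the transform
      print("overall median:", round(back(mu), 3))
      print("interquartile range:", round(back(q1), 3), "to", round(back(q3), 3))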

  13. A random effects meta-analysis model with Box-Cox transformation

    Directory of Open Access Journals (Sweden)

    Yusuke Yamaguchi

    2017-07-01

    Full Text Available Abstract Background In a random effects meta-analysis model, true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption and the misspecification of the random effects distribution may result in a misleading estimate of the overall mean for the treatment effect, an inappropriate quantification of heterogeneity across studies and a wrongly symmetric prediction interval. Methods We focus on problems caused by an inappropriate normality assumption of the random effects distribution, and propose a novel random effects meta-analysis model where a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise an overall distribution of observed treatment effect estimates, which is the sum of the within-study sampling distributions and the random effects distribution. When sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range and a prediction interval. The model can be applied to any kind of variable once the treatment effect estimate is defined from the variable. Results A simulation study suggested that when the overall distribution of treatment effect estimates is skewed, the overall mean and conventional I² from the normal random effects model could be inappropriate summaries, and the proposed model helped reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences in summary results, heterogeneity measures and…

  14. A computerized model for integrating the physical environmental factors into metropolitan landscape planning

    Science.gov (United States)

    Julius Gy Fabos; Kimball H. Ferris

    1977-01-01

    This paper justifies and illustrates (in simplified form) a landscape planning approach to the environmental management of the metropolitan landscape. The model utilizes a computerized assessment and mapping system, which exhibits a recent advancement in computer technology that allows for greater accuracy and the weighting of different values when mapping at the...

  15. Neimark-Sacker bifurcation for the discrete-delay Kaldor model

    International Nuclear Information System (INIS)

    Dobrescu, Loretti I.; Opris, Dumitru

    2009-01-01

    We consider a discrete-delay time Kaldor nonlinear business cycle model in income and capital. Given an investment function resembling the one discussed by Rodano, we use linear approximation analysis to establish the local stability properties and local bifurcations in the parameter space. Finally, we give some numerical examples to justify the theoretical results.

  16. Investigation on Self-Organization Processes in DC Generators by Synergetic Modeling

    Directory of Open Access Journals (Sweden)

    Ion Voncilă

    2014-09-01

    Full Text Available This paper suggests a new mathematical model that justifies the self-excitation of DC generators, with either shunt or series excitation, through self-organization phenomena that appear once threshold values are exceeded (self-excitation in these generators is an avalanche process, a positive feedback, considered at first glance uncontrollable).

  17. Investigation on Self-Organization Processes in DC Generators by Synergetic Modeling

    OpenAIRE

    Ion Voncilă; Mădălin Costin; Răzvan Buhosu

    2014-01-01

    This paper suggests a new mathematical model that justifies the self-excitation of DC generators, with either shunt or series excitation, through self-organization phenomena that appear once threshold values are exceeded (self-excitation in these generators is an avalanche process, a positive feedback, considered at first glance uncontrollable).

  18. Estimates of live-tree carbon stores in the Pacific Northwest are sensitive to model selection

    Science.gov (United States)

    Susanna L. Melson; Mark E. Harmon; Jeremy S. Fried; James B. Domingo

    2011-01-01

    Estimates of live-tree carbon stores are influenced by numerous uncertainties. One of them is model-selection uncertainty: one has to choose among multiple empirical equations and conversion factors that can be plausibly justified as locally applicable to calculate the carbon store from inventory measurements such as tree height and diameter at breast height (DBH)....

  19. Optimising the management of complex dynamic ecosystems. An ecological-economic modelling approach

    NARCIS (Netherlands)

    Hein, L.G.

    2005-01-01

    Keywords: ecological-economic modelling; ecosystem services; resource use; efficient; sustainability; wetlands, rangelands.

  20. Developing a Model for Assigning Senior Officers in the Brazilian Air Force

    Science.gov (United States)

    2015-03-01

    Bandura, Albert. 1977. Social Learning Theory. Englewood Cliffs, NJ: Prentice-Hall. Black, Gene. 2014. “Surface Warfare Officer Community Brief” … (1982, 565) use social learning theory to justify the model. According to this theory, human behavior can be explained in terms of “continuous…

  1. Tracing the Rationale Behind UML Model Change Through Argumentation

    Science.gov (United States)

    Jureta, Ivan J.; Faulkner, Stéphane

    Neglecting traceability—i.e., the ability to describe and follow the life of a requirement—is known to entail misunderstanding and miscommunication, leading to the engineering of poor-quality systems. Following the simple principles that (a) changes to UML model instances ought to be justified to the stakeholders, (b) justification should proceed in a structured manner to ensure rigor in discussions, critique, and revisions of model instances, and (c) the concept of argument instantiated in a justification process ought to be well defined and understood, the present paper introduces the UML Traceability through Argumentation Method (UML-TAM) to enable the traceability of design rationale in UML while allowing the appropriateness of model changes to be checked by analysis of the structure of the arguments provided to justify such changes.

  2. Interpretational Confounding Is Due to Misspecification, Not to Type of Indicator: Comment on Howell, Breivik, and Wilcox (2007)

    Science.gov (United States)

    Bollen, Kenneth A.

    2007-01-01

    R. D. Howell, E. Breivik, and J. B. Wilcox (2007) have argued that causal (formative) indicators are inherently subject to interpretational confounding. That is, they have argued that using causal (formative) indicators leads the empirical meaning of a latent variable to be other than that assigned to it by a researcher. Their critique of causal…

  3. Evidence for the credibility of health economic models for health policy decision-making

    DEFF Research Database (Denmark)

    Søgaard, Rikke; Lindholt, Jes S.

    2012-01-01

    OBJECTIVE: To investigate whether the credibility of health economic models of screening for abdominal aortic aneurysms for health policy decision-making has improved since 2005 when a systematic review by Campbell et al. concluded that reporting standards were poor and there was divergence between...... benefited from general advances in health economic modelling and some improvements in reporting were noted. However, the low level of agreement between studies in model structures and assumptions, and difficulty in justifying these (convergent validity), remain a threat to the credibility of health economic...... models. Decision-makers should not accept the results of a modelling study if the methods are not fully transparent and justified. Modellers should, whenever relevant, supplement a primary report of results with a technical report detailing and discussing the methodological choices made....

  4. Comment on ''Spectroscopy of samarium isotopes in the sdg interacting boson model''

    International Nuclear Information System (INIS)

    Kuyucak, S.; Lac, V.

    1993-01-01

    We point out that the data used in the sdg boson model calculations by Devi and Kota [Phys. Rev. C 45, 2238 (1992)] can be equally well described by the much simpler sd boson model. We present additional data for the Sm isotopes which cannot be explained in the sd model and hence may justify such an extension to the sdg bosons. We also comment on the form of the Hamiltonian and the transition operators used in this paper

  5. CRITICAL ANALYSIS OF EVALUATION MODEL LOMCE

    Directory of Open Access Journals (Sweden)

    José Luis Bernal Agudo

    2015-06-01

    Full Text Available The evaluation model that the LOMCE projects is rooted in neoliberal beliefs, reflecting a specific way of understanding the world. What matters is not the process but the results, with evaluation at the centre of the teaching-learning process. The planning is poor, since the theory that justifies the model is not developed into coherent proposals; there is an excessive concern for excellence, and diversity is left out. A comprehensive way of understanding education should be recovered.

  6. The algebraic collective model

    International Nuclear Information System (INIS)

    Rowe, D.J.; Turner, P.S.

    2005-01-01

    A recently proposed computationally tractable version of the Bohr collective model is developed to the extent that we are now justified in describing it as an algebraic collective model. The model has an SU(1,1)xSO(5) algebraic structure and a continuous set of exactly solvable limits. Moreover, it provides bases for mixed symmetry collective model calculations. However, unlike the standard realization of SU(1,1), used for computing beta wave functions and their matrix elements in a spherical basis, the algebraic collective model makes use of an SU(1,1) algebra that generates wave functions appropriate for deformed nuclei with intrinsic quadrupole moments ranging from zero to any large value. A previous paper focused on the SO(5) wave functions, as SO(5) (hyper-)spherical harmonics, and computation of their matrix elements. This paper gives analytical expressions for the beta matrix elements needed in applications of the model and illustrative results to show the remarkable gain in efficiency that is achieved by using such a basis in collective model calculations for deformed nuclei

  7. Model instruments of effective segmentation of the fast food market

    OpenAIRE

    Mityaeva Tetyana L.

    2013-01-01

    The article presents the results of stepwise optimisation calculations of the economic effectiveness of fast food promotion, taking into account key parameters for assessing the efficiency of the segmentation marketing strategy. It justifies the development of a mathematical model on the basis of 3D presentations and a three-dimensional system of management variables. Modern applied mathematical packages allow the formation not only of one-dimensional and two-dimensional arrays but also the analysis…

  8. Preconditions of forming of loyalty management model in pharmaceutical institution

    Directory of Open Access Journals (Sweden)

    O. O. Molodozhonova

    2013-04-01

    Full Text Available The first stage of the mechanism for implementing a two-level model of efficient loyalty management is justified. It is based on the fundamental value systems underlying the formation of consumer commitment and the institutional commitment of pharmaceutical professionals. The stage involves recruitment, selection and an adaptation period for pharmaceutical professionals, together with the preliminary use of axiological questioning of consumers of pharmaceutical goods.

  9. Selected bibliography on the modeling and control of plant processes

    Science.gov (United States)

    Viswanathan, M. M.; Julich, P. M.

    1972-01-01

    A bibliography of information pertinent to the problem of simulating plants is presented. Detailed simulations of constituent pieces are necessary to justify simple models which may be used for analysis. Thus, this area of study is necessary to support the Earth Resources Program. The report sums up the present state of the problem of simulating vegetation. This area holds the hope of major benefits to mankind through understanding the ecology of a region and in improving agricultural yield.

  10. Modelling of air-conditioned and heated spaces

    Energy Technology Data Exchange (ETDEWEB)

    Moehl, U

    1987-01-01

    A space represents a complex system involving numerous components, manipulated variables and disturbances which need to be described if the dynamic behaviour of the space air is to be determined. A justifiable amount of simulation input is determined by the application of adjusted modelling of the individual components. The determination of natural air exchange in heated spaces and of space-air flow in air-conditioned spaces is a primary source of uncertainty. (orig.).

  11. Prediction of nitrogen and phosphorus leaching to groundwater and surface waters; process descriptions of the animo4.0 model

    NARCIS (Netherlands)

    Groenendijk, P.; Renaud, L.V.; Roelsma, J.

    2005-01-01

    The fertilization reduction policy intended to pursue environmental objectives and regional water management strategies to meet Water Framework Directive objectives justify a thorough evaluation of the effectiveness of measures and reconnaissance of adverse impacts. The model aims at the evaluation and…

  12. Corporate governance and banks : How justified is the match?

    NARCIS (Netherlands)

    van der Elst, C.F.

    2015-01-01

    Banks and bank governance are different. We critically assess the arguments used to support these divergences in operational activities. We also question whether and how, in light of the specificity of banking activities, bank governance translates the operational peculiarities into different governance…

  13. What Justifies a Future with Humans in It?

    Science.gov (United States)

    Murphy, Timothy F

    2016-11-01

    Antinatalist commentators recommend that humanity bring itself to a close, on the theory that pain and suffering override the value of any possible life. Other commentators do not require the voluntary extinction of human beings, but they defend that outcome if people were to choose against having children. Against such views, Richard Kraut has defended a general moral obligation to people the future with human beings until the workings of the universe render such efforts impossible. Kraut advances this view on the grounds that we are obliged to exercise beneficence toward others and on the grounds that the goods available in human lives are morally compelling. This account ultimately succeeds in making no more than a prima facie defense of human perpetuation because considerations of beneficence could - and in some cases probably should - override any duty to perpetuate human beings. While the goods of human life may be distinctive, they cannot serve as reason-giving in regard to their own perpetuation. Ironically, the exercise of beneficence may authorize the extinction of human beings, if it becomes possible to enhance the goods available to human descendants in a way that moves them away from human nature as now given. The defense of a morally obligatory and strictly human future remains elusive, even as it becomes morally desirable to work against Fateful Catastrophes, those human-caused events that threaten to extinguish existing lives already good and enriching for their bearers. © 2016 John Wiley & Sons Ltd.

  14. Common extensor origin release in recalcitrant lateral epicondylitis - role justified?

    Directory of Open Access Journals (Sweden)

    Mukundan Cibu

    2010-05-01

    Full Text Available Abstract The aim of our study was to analyse the efficacy of operative management in recalcitrant lateral epicondylitis of the elbow. Forty patients included in this study were referred by general practitioners with a diagnosis of tennis elbow to the orthopaedic department at a district general hospital over a five-year period. All had two or more steroid injections at the tender spot, without permanent relief of pain. All subsequently underwent simple fasciotomy of the extensor origin. Of the forty patients, thirty-five had improvement in pain and function, two had persistent symptoms, and three did not perceive any improvement. Twenty-five had excellent, ten good, two fair and three poor outcomes (recurrent problems; pain at rest and at night). Two patients underwent revision surgery. The majority of patients had improvement in pain and function following operative treatment. In this study, an extensor fasciotomy was demonstrated to be an effective treatment for refractory chronic lateral epicondylitis; however, further studies are warranted.

  15. Justifying Design Decisions with Theory-based Design Principles

    OpenAIRE

    Schermann, Michael; Gehlert, Andreas; Pohl, Klaus; Krcmar, Helmut

    2014-01-01

    Although the role of theories in design research is recognized, we show that little attention has been paid to how to use theories when designing new artifacts. We introduce design principles as a new methodological approach to address this problem. Design principles extend the notion of design rationales that document how a design decision emerged. We extend the concept of design rationales by using theoretical hypotheses to support or object to design decisions. Using the example of developing…

  16. How three Narratives of Modernity justify Economic Inequality

    DEFF Research Database (Denmark)

    Larsen, Christian Albrekt

    2016-01-01

    The acceptance of income differences varies across countries. This article suggests belief in three narratives of modernity to account for this: the “tunnel effect”, related to perceptions of generational mobility; the “procedural justice effect”, related to the perceived fairness in the process ...

  17. Justifying the Ivory Tower: Higher Education and State Economic Growth

    Science.gov (United States)

    Baldwin, J. Norman; McCracken, William A., III

    2013-01-01

    As the U.S. continues to embrace a comprehensive plan for economic recovery, this article investigates the validity of the claim that investing in higher education will help restore state economic growth and prosperity. It presents the findings from a study that indicates that the most consistent predictors of state economic growth related to…

  18. Is extended biopsy protocol justified in all patients with suspected ...

    African Journals Online (AJOL)

    Objective: To determine the significance of an extended 10-core transrectal biopsy protocol in different categories of patients with suspected prostate cancer using digital guidance. Materials and Methods: We studied 125 men who were being evaluated for prostate cancer. They all had an extended 10-core digitally guided ...

  19. Is selenium supplementation in autoimmune thyroid diseases justified?

    DEFF Research Database (Denmark)

    Winther, Kristian H.; Bonnema, Steen; Hegedüs, Laszlo

    2017-01-01

    PURPOSE OF REVIEW: This review provides an appraisal of recent evidence for or against selenium supplementation in patients with autoimmune thyroid diseases, and discusses possible effect mechanisms. RECENT FINDINGS: Epidemiological data suggest an increased prevalence of autoimmune thyroid...... diseases under conditions of low dietary selenium intake. Two systematic reviews have evaluated controlled trials among patients with autoimmune thyroiditis and report that selenium supplementation decreases circulating thyroid autoantibodies. The immunomodulatory effects of selenium might involve reducing...... proinflammatory cytokine release. However, clinically relevant effects of selenium supplementation, including improvement in quality of life, are more elusive. In Graves’ disease, some, but not all, trials indicate that adjuvant selenium supplementation enhances the restoration of biochemical euthyroidism...

  20. Is selenium supplementation in autoimmune thyroid diseases justified?

    Science.gov (United States)

    Winther, Kristian H; Bonnema, Steen J; Hegedüs, Laszlo

    2017-10-01

    This review provides an appraisal of recent evidence for or against selenium supplementation in patients with autoimmune thyroid diseases, and discusses possible effect mechanisms. Epidemiological data suggest an increased prevalence of autoimmune thyroid diseases under conditions of low dietary selenium intake. Two systematic reviews have evaluated controlled trials among patients with autoimmune thyroiditis and report that selenium supplementation decreases circulating thyroid autoantibodies. The immunomodulatory effects of selenium might involve reducing proinflammatory cytokine release. However, clinically relevant effects of selenium supplementation, including improvement in quality of life, are more elusive. In Graves' disease, some, but not all, trials indicate that adjuvant selenium supplementation enhances the restoration of biochemical euthyroidism, and might benefit patients with mild Graves' orbitopathy. The use of selenium supplementation as adjuvant therapy to standard thyroid medication may be widespread, but a growing body of evidence yields equivocal results. The available evidence from trials does not support routine selenium supplementation in the standard treatment of patients with autoimmune thyroiditis or Graves' disease. However, correction of moderate to severe selenium deficiency may offer benefits in preventing, as well as treating, these disorders. Molecular mechanisms have been proposed, but further studies are needed.

  1. Is abandoning routine peritoneal cultures during appendectomy justified?

    International Nuclear Information System (INIS)

    Al-Saadi, A.; Al-Wadan, Ali H.; Hamarnah, Samir A.; Amin, H.

    2007-01-01

    The objective was to identify whether there is any advantage in taking a swab from the peritoneal fluid during appendectomy and whether it has any clinical implications for the course of disease. Records of 160 patients who underwent appendectomy at Saqr Hospital, Rak, United Arab Emirates, from 2003-2005 and had culture and sensitivity testing from the peritoneal cavity were reviewed retrospectively. The macroscopic appearance of the appendix, the microorganisms in the peritoneal cultures, the antibiotics given, and the extent to which the culture and sensitivity results were used were evaluated. Patients with a normal appendix who underwent laparoscopic appendectomy were excluded. Patients' ages ranged from 4-55 years with a male to female ratio of 4:1; all had prophylactic antibiotics and standard surgical procedures; 60% had a perforated appendix and 13% were gangrenous. The most common organisms cultured were Escherichia coli and bacteroides, and the rate of wound infection was 5%. None of the patients had their course of antibiotics adjusted in response to the result of the swab. Swabs from the peritoneal cavity during appendectomy do not have any clinical advantage, especially with the empiric use of antibiotics and the short hospital stay. (author)

  2. Program to justify life extension of older nuclear piping systems

    International Nuclear Information System (INIS)

    Burr, T.K.; Dwight, J.E. Jr.; Morton, D.K.

    1991-01-01

    The Idaho National Engineering Laboratory (INEL) has a history of more than 40 years devoted to the operation of nuclear reactors designed for research and experiments. The Advanced Test Reactor (ATR) is one such operating reactor whose mission requires continued operation for an additional 25 years or more. Since the ATR is approaching its design life of twenty years, life extension evaluations have been initiated. Of particular importance are the associated high temperature, high pressure loop piping systems supporting in-reactor experiments. Failure of this piping could challenge core safety margins. Since regulatory rules for nuclear power plant life extension are only in the formulation stage, the current technical guidance on this subject provided by the Department of Energy (DOE) or the commercial nuclear industry is incomplete. In the interim, in order to assure continued safe operation of this piping beyond its initial design life, a program has been developed to provide the necessary technical justification for life extension. This paper describes a program that establishes Section XI of the ASME Boiler and Pressure Vessel Code as the governing criteria document, retains B31.1 as the Code of record for Section XI activities, specifies additional inservice inspection requirements stricter than Section XI, and relies heavily on flaw detection and fracture mechanics evaluations. 18 refs., 2 figs

  3. Perinatal health in the Danube region - new birth cohort justified.

    Czech Academy of Sciences Publication Activity Database

    Knudsen, L. E.; Andersen, Z.J.; Šrám, Radim; Braun Kohlová, M.; Gurzau, E.S.; Fucic, A.; Gribaldo, L.; Rössner ml., Pavel; Rössnerová, Andrea; Máca, V.; Zvěřinová, I.; Gajdošová, D.; Moshammer, H.; Rudnai, P.; Ščasný, M.

    2017-01-01

    Roč. 32, 1-2 (2017), s. 9-14 ISSN 2191-0308 Institutional support: RVO:68378041 Keywords : birth cohort * child health * Danube region * environmental exposures Subject RIV: DN - Health Impact of the Environment Quality OBOR OECD: Public and environmental health

  4. Beyond Baby Doe: Does Infant Transplantation Justify Euthanasia?

    Science.gov (United States)

    Coulter, David L.

    1988-01-01

    The paper examines ethical issues in the transplantation of organs from infants with anencephaly into infants with severe heart and kidney disease. It argues that active euthanasia of infants with anencephaly should be prohibited to safeguard the rights of all persons with severe neurological disabilities. (Author/DB)

  5. 4D ultrasound imaging - ethically justifiable in India?

    Science.gov (United States)

    Indiran, Venkatraman

    2017-01-01

    Four-dimensional (4D) ultrasound (real-time volume sonography), which has been used in the West since the last decade for the determination of gender as well as for bonding and entertainment of the parents, has become widely available in India in this decade. Here, I would like to discuss the ethical issues associated with 4D ultrasonography in India. These are self-referral, the use of the technology for non-medical indications, a higher possibility of the disclosure of the foetus' gender and safety concerns.

  6. Justifying the design and selection of literacy and thinking tools

    Directory of Open Access Journals (Sweden)

    David Whitehead

    2008-10-01

    Full Text Available Criteria for the design and selection of literacy and thinking tools that allow educators to justify what they do are described within a wider framework of learning theory and research into best practice. Based on a meta-analysis of best practice, results from a three-year project designed to evaluate the effectiveness of a secondary school literacy initiative in New Zealand, together with recent research from cognitive and neuro-psychologists, it is argued that the design and selection of literacy and thinking tools used in elementary schools should be consistent with (i) teaching focused, (ii) learner focused, (iii) thought linked, (iv) neurologically consistent, (v) subject specific, (vi) text linked, (vii) developmentally appropriate, and (viii) assessment linked criteria.

  7. Justified and Ancient: Pop Music in EFL Classrooms.

    Science.gov (United States)

    Domoney, Liz; Harris, Simon

    1993-01-01

    A teacher training workshop uses linked tasks through which teachers explore the integration of pop music into Mexican secondary school English classes. Rather than being discrete, marginal items, pop music activities are worth linking, elaborating, and treating as more central in a secondary school program. (Contains 10 references.) (Author)

  8. Justifying genetics as a possible legal defence to criminal ...

    African Journals Online (AJOL)

    However, jurisprudence of many criminal cases tends to question whether a person's inherited genes predispose him to violence and further determine his criminal responsibility in law. Under the Nigerian criminal law, the legal test of criminal responsibility is mainly whether the accused person intends the consequence of ...

  9. Justified ethicality: observing desired counterfactuals modifies ethical perceptions and behavior

    NARCIS (Netherlands)

    Shalvi, S.; Dana, J.; Handgraaf, M.J.J.; de Dreu, C.K.W.

    2011-01-01

    Employing a die-under-cup paradigm, we study the extent to which people lie when it is transparently clear they cannot be caught. We asked participants to report the outcome of a private die roll and gain money according to their reports. Results suggest that the degree of lying depends on the…

  10. Legislative prohibitions on wearing a headscarf: are they justified ...

    African Journals Online (AJOL)

    In recent years the headscarf has been described as a symbol of Islam's oppression of women and simultaneously of terrorism. As the debate regarding the acceptability of the headscarf in the modern world continues, an increasing number of states have legislated to ban the wearing of the headscarf. This article critically ...

  11. Justified Illegality?: Controlled clientelism by the Chilean administration

    Directory of Open Access Journals (Sweden)

    Marcelo Moriconi Bezerra

    2011-07-01

    Full Text Available The Chilean civil service is considered one of the most efficient in Latin America. However, different studies describe the informal institutions that operate between the Legislative Power and the bureaucracy to fill positions in the public administration. Although some of these clientelistic practices are against the law, they have been accepted and defended in both the political and scientific spheres. Legality is not considered an important value if certain indexes have a positive development. In this context, it is important to study how corruption and clientelism have been ignored, or hidden, through political discourses and technical reports about the situation of bureaucracy. All of this allows a better understanding of why after 20 years of administrative reforms there are damaging practices which negatively affect democracy that have not been eradicated.

  12. Is post-operative radiation for renal cell carcinoma justified?

    International Nuclear Information System (INIS)

    Aref, Ibrahim; Bociek, R. Gregory; Salhani, Douglas

    1997-01-01

    Purpose: To identify the pattern of failure in patients with resected renal cell carcinoma (RCC). Materials and methods: The records of 116 patients with unilateral, non-hematogenous metastatic RCC who were treated with definitive surgery and referred to the Ottawa Regional Cancer Centre between 1977 and 1988 were reviewed. Distribution by stage included T1 (3 patients), T2 (42 patients) and T3 (71 patients). The median follow-up was 44 months, with a range of 4-267 months. Results: Local regional failure (LRF) developed in 8 patients. Nine patients developed local or regional recurrence plus distant failure. Fifty-eight patients had distant metastases (DM) only. The 7-year actuarial rates for LRF and DM were 12% and 67%, respectively. The overall 7-year actuarial survival rate was 35%, and cause-specific survival was 42%. Conclusions: LRF alone is rare following nephrectomy. DM is the main pattern of failure. These data do not support a role for adjuvant radiation therapy in this disease.

  13. Is the use of nuclear energy ethically justifiable?

    International Nuclear Information System (INIS)

    Feldhaus, S.

    1992-01-01

    If one wants to attain, in a responsible manner, the objective of a future energy supply, namely to meet, to a satisfactory degree, the humanly adequate energy demands of a growing population, weighing good against evil under the criteria of social and environmental tolerability, then, based on current results of reserve and risk comparisons, and taking into account the economic practicability of the different means of energy supply, the most urgent requirement turns out to be the immediate reduction of worldwide CO2 emissions. This requirement, which must be translated into immediate action, takes priority. However, at present a significant CO2 reduction combined with the securing of a sufficient worldwide energy supply can only be achieved by means of various consistent actions, which are presented in this paper in the form of a set. (orig./HSCH) [de

  14. Why Status Effects Need not Justify Egalitarian Income Policy

    NARCIS (Netherlands)

    Graafland, J.J.

    2010-01-01

    Economic research overwhelmingly shows that the utility individuals derive from their income depends on the incomes of others. Theoretical literature has proven that these status effects imply a more egalitarian income policy than in the conventional case, in which people value their income…

  15. Justifying Music Instruction in American Public Schools: A Historical Perspective.

    Science.gov (United States)

    Jorgensen, Estelle R.

    1995-01-01

    Charts the development of music education from early utilitarianism up to its current emphasis on aesthetic value. Recent attempts to pursue music education as an interdisciplinary subject have been limited due to budget cuts. Briefly discusses this financial crisis and suggests some sources of alternative funding. (MJP)

  16. Naval Aviation: F-14 Upgrades are not Adequately Justified

    Science.gov (United States)

    1994-10-01

    … aircraft will … have a radar … capability to improve crews' ability in … identifying and attacking targets … We conducted our review between June … and May … in accordance with generally accepted government auditing standards. We are …

  17. Contemporary Methods of Social Introduction: Is the Stigmatisation justified?

    Directory of Open Access Journals (Sweden)

    Lisa M. Steffek

    2009-12-01

    Full Text Available Historically, individuals in search of a romantic partner have expanded their pool of alternatives by meeting others through their personal social networks. In the last few decades, however, a growing singles population, coupled with advances in technology, has promoted the utilisation and modernization of contemporary marriage market intermediaries (MMIs, including online dating sites, social networking sites, and professional matchmaking services. Importantly, these contemporary MMIs depart from more normative methods for meeting others, making their use ripe for social stigmatization, as evidenced by myriad portrayals in the popular media. The purpose of the present research was to provide an empirical exploration of the validity of the layperson stigma towards users of contemporary MMIs by assessing the extent to which users and nonusers of these various services differ on key individual characteristics relevant to relationship initiation and progression. Specifically, we surveyed 96 individuals, all of whom were attending a singles‘ happy hour, and compared users and nonusers of contemporary MMIs on several important characteristics. Although users reported going on more dates and perceived greater attractiveness in others at the event, no differences were observed in personality (i.e., the Big 5 or adult attachment classification (i.e., secure vs. insecure. Altogether, our findings suggest that users of contemporary MMIs are not socially undesirable people (or at least any more undesirable than nonusers.

  18. Justifying the "Folate trap" in folic acid fortification programs.

    Science.gov (United States)

    Mahajan, Niraj N; Mahajan, Kshitija N; Soni, Rajani N; Gaikwad, Nilima L

    2007-01-01

    Many countries have now adopted fortification, where folic acid is added to flour, intended to benefit all through a rise in blood folate levels. During the many transformations of folate from one form to another, a proportion is accidentally converted to N(5)-methyl-THF, an inactive metabolite, the so-called "folate trap". Consideration should be given to including B(12) as well as folic acid in any program of supplementation or food fortification to prevent NTDs. This is especially applicable to developing countries like India, where the majority of women are vegetarians and have borderline levels of vitamin B(12). Administration of [6S]-5-MTHF is more effective than folic acid supplementation at improving folate status. Therefore, we urge reconsideration of the "folate trap" in folic acid fortification programs.

  19. How are pharmaceutical patent term extensions justified? Australia's evolving scheme.

    Science.gov (United States)

    Lawson, Charles

    2013-12-01

    This article examines the evolving patent term extension schemes under the Patents Act 1903 (Cth), the Patents Act 1952 (Cth) and the Patents Act 1990 (Cth). The analysis traces the change from "inadequate remuneration" to a scheme directed specifically at certain pharmaceuticals. An examination of the policy justification shows there are legitimate questions about the desirability of any extension. The article concludes that key information provisions in the Patents Act 1990 (Cth) that might assist a better policy analysis are presently not working and that any justification needs evidence demonstrating that the benefits of patent term extensions to the community as a whole outweigh the costs and that the objectives of extensions can only be achieved by restricting competition.

  20. Is it economically justifiable to decide on nuclear power phaseout?

    International Nuclear Information System (INIS)

    Scholz, L.

    1988-01-01

    The author critically comments on the various expert opinions describing and assessing the economic and other impacts of a nuclear power phaseout. One of his conclusions is that the complexity of the problems involved may lead the public controversy about the nuclear issue astray, inducing irrational speculation on the one hand and naive hopes on the other. This may create the dangerous situation in which speculation or arbitrary information is cloaked in (unjustified) assumptions made to look like scientific information. (orig.) [de

  1. Justifying Innovative Language Programs in an Environment of ...

    African Journals Online (AJOL)

    In the analysis of the literature that has been written on project management and language issues in development, it attempts to show how the Communication Skills programme could benefit from this knowledge on project management and educational change management in the third millennium. The paper concludes that ...

  2. Is extended biopsy protocol justified in all patients with suspected ...

    African Journals Online (AJOL)

    2012-01-03

    Objective: To determine the significance of an extended 10-core transrectal biopsy protocol in different categories of patients with suspected prostate cancer using digital guidance. Materials and Methods: We studied 125 men who were being evaluated for prostate cancer. They all had an extended…

  3. British media attacks on homeopathy: are they justified?

    Science.gov (United States)

    Vithoulkas, George

    2008-04-01

    Homeopathy is being attacked by the British media. These attacks draw support from irresponsible and unjustified claims by certain teachers of homeopathy. Such claims include the use of 'dream' and 'imaginative' methods for provings. For prescribing some such teachers attempt to replace the laborious process of matching symptom picture and remedy with spurious theories based on 'signatures', sensations and other methods. Other irresponsible claims have also been made. These "new ideas" risk destroying the principles, theory, and practice of homeopathy.

  4. Is Routine Ordering of Both Hemoglobin and Hematocrit Justifiable?

    Science.gov (United States)

    Addison, David J.

    1966-01-01

    In order to assess the value of routine simultaneous hemoglobin and hematocrit determinations, paired determinations in the following groups were studied: (1) 360 consecutive pairs from the hematology laboratory, (2) 95 pairs on general medical patients, (3) 43 pairs on 10 patients with upper gastrointestinal hemorrhage, and (4) 62 pairs on 10 patients with burns. These values were plotted on scatter diagrams. In the 560 pairs only three disparate determinations were found. It is concluded that, in most clinical situations, determination of the hemoglobin or the hematocrit as a screening procedure provides as much useful information as the simultaneous determination of both. PMID:5296947

  5. Social dominance and ethical ideology: the end justifies the means?

    Science.gov (United States)

    Wilson, Marc Stewart

    2003-10-01

    Although many social psychological researchers have tried to identify the antecedents of unethical or immoral behavior, investigators have little considered the content of ethical beliefs that associate with important personality variables such as authoritarianism (B. Altemeyer, 1981, 1996) and social dominance orientation (SDO; J. Sidanius, 1993). Previous studies suggest that authoritarianism is associated with the rejection of relativistic standards for moral actions and--to a lesser extent--the idealistic belief that moral actions should not harm others (J. W. McHoskey, 1996). In the present study, 160 New Zealand University students completed measures of SDO (J. Sidanius), Right Wing Authoritarianism (RWA, B. Altemeyer, 1981), and two subscales of ethical ideology: Relativism and Idealism (D. R. Forsyth, 1980). As expected, SDO showed a negative relationship with Idealism, a belief that actions should not harm others. But, contrary to expectations, SDO showed no consistent association with relativism, a belief that the moralities of actions are not comparable. On the basis of those findings, people with high SDO might be described as "ruthless" in their pursuit of desirable goals and are indifferent about whether the morality of different actions can be compared or even matter.

  6. Does Biology Justify Ideology? The Politics of Genetic Attribution

    Science.gov (United States)

    Suhay, Elizabeth; Jayaratne, Toby Epstein

    2013-01-01

    Conventional wisdom suggests that political conservatives are more likely than liberals to endorse genetic explanations for many human characteristics and behaviors. Whether and to what extent this is true has received surprisingly limited systematic attention. We examine evidence from a large U.S. public opinion survey that measured the extent to which respondents believed genetic explanations account for a variety of differences among individuals as well as groups in society. We find that conservatives were indeed more likely than liberals to endorse genetic explanations for perceived race and class differences in characteristics often associated with socioeconomic inequality (intelligence, math skills, drive, and violence). Different ideological divisions emerged, however, with respect to respondents’ explanations for sexual orientation. Here, liberals were more likely than conservatives to say that sexual orientation is due to genes and less likely to say that it is due to choice or the environment. These patterns suggest that conservative and liberal ideologues will tend to endorse genetic explanations where their policy positions are bolstered by “naturalizing” human differences. That said, debates over genetic influence may be more politicized with respect to race, class, and sexual orientation than population differences generally: We find that left/right political ideology was not significantly associated with genetic (or other) attributions for individual differences in intelligence, math skills, drive, or violence. We conclude that conceptions of the proper role of government are closely intertwined with assumptions about the causes of human difference, but that this relationship is a complex one. PMID:26379311

  7. Photovoltaic array performance simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Menicucci, D. F.

    1986-09-15

    The experience of the solar industry confirms that, despite recent cost reductions, the profitability of photovoltaic (PV) systems is often marginal and the configuration and sizing of a system is a critical problem for the design engineer. Construction and evaluation of experimental systems are expensive and seldom justifiable. A mathematical model or computer-simulation program is a desirable alternative, provided reliable results can be obtained. Sandia National Laboratories, Albuquerque (SNLA), has been studying PV-system modeling techniques in an effort to develop an effective tool to be used by engineers and architects in the design of cost-effective PV systems. This paper reviews two of the sources of error found in previous PV modeling programs, presents the remedies developed to correct these errors, and describes a new program that incorporates these improvements.

  8. Preliminary Multi-Variable Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Hendrichs, Todd

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. This paper reviews the methodology used to develop space telescope cost models; summarizes recently published single variable models; and presents preliminary results for two and three variable cost models. Some of the findings are that increasing mass reduces cost; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and technology development as a function of time reduces cost at the rate of 50% per 17 years.
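
    For intuition, here is a minimal sketch of the kind of multi-variable power-law cost relation such models take. The 50%-per-17-years technology factor is quoted from the abstract; the scaling exponents and constant are invented placeholders, not the paper's fitted values.

```python
import math

# Hypothetical multi-variable parametric cost model of the power-law form
# often used for space telescopes: cost scales with aperture diameter,
# operating wavelength, and launch year. Coefficients are placeholders,
# not the fitted values from Stahl & Hendrichs.

def telescope_cost(aperture_m, wavelength_um, years_since_ref,
                   k=1.0, a=1.3, b=-0.2):
    """Return relative cost; the absolute scale is set by the constant k."""
    technology_factor = 0.5 ** (years_since_ref / 17.0)  # 50% cheaper per 17 yr
    return k * aperture_m**a * wavelength_um**b * technology_factor

# With an aperture exponent below 2, total cost rises with diameter while
# cost per square meter of collecting area falls, as the abstract reports:
for d in (1.0, 2.0, 4.0):
    area = math.pi * (d / 2) ** 2
    c = telescope_cost(d, 0.5, 0.0)
    print(f"D={d} m: cost={c:.2f}, cost per m^2={c / area:.2f}")
```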

  9. The contribution of Skyrme Hartree-Fock calculations to the understanding of the shell model

    International Nuclear Information System (INIS)

    Zamick, L.

    1984-01-01

    The authors present a detailed comparison of Skyrme Hartree-Fock (H-F) calculations and the shell model. The H-F calculations are sensitive to the parameters chosen. The H-F results justify the use of effective charges in restricted model-space calculations by showing that the core contribution can be large. Further, the H-F results roughly justify the use of a constant E2 effective charge, but seem to yield nucleus-dependent E4 effective charges. H-F can yield results for E6 and higher multipoles, which would be zero in s-d model-space calculations. On the other side of the coin, in H-F the authors can easily consider only the lowest rotational band, whereas in the shell model one can calculate the energies and properties of many more states. In the comparison some apparent problems remain, in particular E4 transitions in the upper half of the s-d shell.

  10. A probit-log-skew-normal mixture model for repeated measures data with excess zeros, with application to a cohort study of paediatric respiratory symptoms

    Directory of Open Access Journals (Sweden)

    Johnston Neil W

    2010-06-01

    smaller for asthmatics relative to non-asthmatics throughout the year, whereas there was no difference in the severity of the symptoms between the two groups. Conclusions A positive association was observed between viral infection status and both the probability of experiencing any respiratory symptoms and their severity during the year. For the DAVIS data the random-effects probit-log-skew-normal model fits significantly better than the random-effects probit-log-normal model, endorsing our parametric choice. The simulation study indicates that our proposed model is robust to misspecification of the distribution of the positively skewed response.
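
    As a rough illustration of this model class, the sketch below simulates a two-part ("hurdle") version of the approach: a probit part for whether any symptoms occur and a log-skew-normal part for severity when they do. It omits the paper's random effects, and all parameter values (b0, b1, mu, sigma, alpha) are hypothetical.

```python
import numpy as np
from scipy.stats import norm, skewnorm

rng = np.random.default_rng(0)

# Two-part model: a probit part decides whether any symptoms occur; a
# log-skew-normal part generates severity for the non-zero observations.
# Parameter values are illustrative, not the fitted DAVIS estimates.
def simulate_two_part(x_viral, b0=-0.5, b1=1.0, mu=0.2, sigma=0.6, alpha=2.0):
    p_any = norm.cdf(b0 + b1 * x_viral)             # probit occurrence model
    any_symptom = rng.random(x_viral.shape) < p_any
    log_sev = skewnorm.rvs(alpha, loc=mu, scale=sigma,
                           size=x_viral.shape, random_state=rng)
    return np.where(any_symptom, np.exp(log_sev), 0.0)

x = rng.integers(0, 2, size=1000).astype(float)     # viral infection indicator
y = simulate_two_part(x)
print("zero proportion:", np.mean(y == 0))
print("mean severity | infected and symptomatic:", y[(x == 1) & (y > 0)].mean())
```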

  11. Theoretical Background for the Decision-Making Process Modelling under Controlled Intervention Conditions

    OpenAIRE

    Bakanauskienė Irena; Baronienė Laura

    2017-01-01

    This article is intended to theoretically justify the decision-making process model for cases when active participation of investing entities in controlling the activities of an organisation and their results is noticeable. Based on scientific literature analysis, a concept of controlled conditions is formulated, and using a rational approach to the decision-making process, a model of the 11-step decision-making process under controlled intervention is presented. Also, there have been u...

  12. Cost effectiveness of recycling: A systems model

    Energy Technology Data Exchange (ETDEWEB)

    Tonjes, David J., E-mail: david.tonjes@stonybrook.edu [Department of Technology and Society, College of Engineering and Applied Sciences, Stony Brook University, Stony Brook, NY 11794-3560 (United States); Waste Reduction and Management Institute, School of Marine and Atmospheric Sciences, Stony Brook University, Stony Brook, NY 11794-5000 (United States); Center for Bioenergy Research and Development, Advanced Energy Research and Technology Center, Stony Brook University, 1000 Innovation Rd., Stony Brook, NY 11794-6044 (United States); Mallikarjun, Sreekanth, E-mail: sreekanth.mallikarjun@stonybrook.edu [Department of Technology and Society, College of Engineering and Applied Sciences, Stony Brook University, Stony Brook, NY 11794-3560 (United States)

    2013-11-15

    Highlights: • Curbside collection of recyclables reduces overall system costs over a range of conditions. • When avoided costs for recyclables are large, even high collection costs are supported. • When avoided costs for recyclables are not great, there are reduced opportunities for savings. • For common waste compositions, maximizing curbside recyclables collection always saves money. - Abstract: Financial analytical models of waste management systems have often found that recycling costs exceed direct benefits, so that in order to economically justify recycling activities, externalities such as household expenses or environmental impacts must be invoked. Certain more empirically based studies have also found that recycling is more expensive than disposal. Other work, both through models and surveys, has found differently. Here we present an empirical systems model, largely drawn from a suburban Long Island municipality. The model accounts for changes in the distribution of effort as recycling tonnages displace disposal tonnages, and the seven different cases examined all show that curbside collection programs that manage up to between 31% and 37% of the waste stream should result in overall system savings. These savings accrue partially because of assumed cost differences in tip fees for recyclables and disposed wastes, and also because recycling can result in a more efficient, cost-effective collection program. These results imply that increases in recycling are justifiable on cost-savings grounds alone, rather than on harder-to-measure factors that may not impact program budgets.
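
    The avoided-cost arithmetic driving these results can be sketched in a few lines; the per-ton figures below are invented placeholders, not the model's Long Island inputs.

```python
# Back-of-the-envelope version of the avoided-cost logic: recycling saves
# money when the avoided disposal tip fee exceeds the extra net cost of
# collecting and processing recyclables. All figures are hypothetical.

def annual_system_cost(total_tons, recycle_fraction,
                       disposal_tip=65.0,      # $/ton to landfill/incinerator
                       recycle_tip=20.0,       # $/ton net processing fee
                       collect_disposal=90.0,  # $/ton refuse collection
                       collect_recycle=110.0): # $/ton recycling collection
    r = total_tons * recycle_fraction
    d = total_tons - r
    return (d * (collect_disposal + disposal_tip)
            + r * (collect_recycle + recycle_tip))

base = annual_system_cost(100_000, 0.10)
more = annual_system_cost(100_000, 0.35)
print(f"savings from raising recycling 10% -> 35%: ${base - more:,.0f}")
```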

  13. Improving practical atmospheric dispersion models

    International Nuclear Information System (INIS)

    Hunt, J.C.R.; Hudson, B.; Thomson, D.J.

    1992-01-01

    The new generation of practical atmospheric dispersion models (for short range, ≤ 30 km) is based on dispersion science and boundary-layer meteorology that have widespread international acceptance. In addition, recent improvements in computing and the widespread availability of small, powerful computers make it possible to have new regulatory models that are more complex than the previous generation, which was based on charts and simple formulae. This paper describes the basis of these models and how they have developed. Such models are needed to satisfy the urgent public demand for sound, justifiable and consistent environmental decisions. For example, it is preferable that the same models be used to simulate dispersion in different industries; in many countries at present different models are used for emissions from nuclear and fossil-fuel power stations. The models should not be so simple as to be suspect, but neither should they be too complex for widespread use; for example, at public inquiries in Germany, where simple models are mandatory, it is becoming usual to cite results from highly complex computational models because the simple models are not credible. This paper is written in a schematic style with an emphasis on tables and diagrams. (au) (22 refs.)

  14. Mesoscopic and continuum modelling of angiogenesis

    KAUST Repository

    Spill, F.

    2014-03-11

    Angiogenesis is the formation of new blood vessels from pre-existing ones in response to chemical signals secreted by, for example, a wound or a tumour. In this paper, we propose a mesoscopic lattice-based model of angiogenesis, in which processes that include proliferation and cell movement are considered as stochastic events. By studying the dependence of the model on the lattice spacing and the number of cells involved, we are able to derive the deterministic continuum limit of our equations and compare it to similar existing models of angiogenesis. We further identify conditions under which the use of continuum models is justified, and others for which stochastic or discrete effects dominate. We also compare different stochastic models for the movement of endothelial tip cells which have the same macroscopic, deterministic behaviour, but lead to markedly different behaviour in terms of production of new vessel cells. © 2014 Springer-Verlag Berlin Heidelberg.

  15. Mesoscopic and continuum modelling of angiogenesis

    KAUST Repository

    Spill, F.; Guerrero, P.; Alarcon, T.; Maini, P. K.; Byrne, H. M.

    2014-01-01

    Angiogenesis is the formation of new blood vessels from pre-existing ones in response to chemical signals secreted by, for example, a wound or a tumour. In this paper, we propose a mesoscopic lattice-based model of angiogenesis, in which processes that include proliferation and cell movement are considered as stochastic events. By studying the dependence of the model on the lattice spacing and the number of cells involved, we are able to derive the deterministic continuum limit of our equations and compare it to similar existing models of angiogenesis. We further identify conditions under which the use of continuum models is justified, and others for which stochastic or discrete effects dominate. We also compare different stochastic models for the movement of endothelial tip cells which have the same macroscopic, deterministic behaviour, but lead to markedly different behaviour in terms of production of new vessel cells. © 2014 Springer-Verlag Berlin Heidelberg.

  16. Uncertainty, sensitivity analysis and the role of data based mechanistic modeling in hydrology

    Science.gov (United States)

    Ratto, M.; Young, P. C.; Romanowicz, R.; Pappenberger, F.; Saltelli, A.; Pagano, A.

    2007-05-01

    calibration of mechanistic hydrological models, making their properties more transparent. It also helps to highlight possible mis-specification problems, if these are identified. The results of the exercise show that the two modelling methodologies have good synergy; combining well to produce a complete joint modelling approach that has the kinds of checks-and-balances required in practical data-based modelling of rainfall-flow systems. Such a combined approach also produces models that are suitable for different kinds of application. As such, the DBM model considered in the paper is developed specifically as a vehicle for flow and flood forecasting (although the generality of DBM modelling means that a simulation version of the model could be developed if required); while TOPMODEL, suitably calibrated (and perhaps modified) in the light of the DBM and GSA results, immediately provides a simulation model with a variety of potential applications, in areas such as catchment management and planning.

  17. A Model of the Perception of Facial Expressions of Emotion by Humans: Research Overview and Perspectives.

    Science.gov (United States)

    Martinez, Aleix; Du, Shichuan

    2012-05-01

    In cognitive science and neuroscience, there have been two leading models describing how humans perceive and classify facial expressions of emotion: the continuous and the categorical model. The continuous model defines each facial expression of emotion as a feature vector in a face space. This model explains, for example, how expressions of emotion can be seen at different intensities. In contrast, the categorical model consists of C classifiers, each tuned to a specific emotion category. This model explains, among other findings, why the images in a morphing sequence between a happy and a surprise face are perceived as either happy or surprise but not something in between. While the continuous model has a more difficult time justifying this latter finding, the categorical model is not as good when it comes to explaining how expressions are recognized at different intensities or modes. Most importantly, both models have problems explaining how one can recognize combinations of emotion categories such as happily surprised versus angrily surprised versus surprise. To resolve these issues, in the past several years, we have worked on a revised model that justifies the results reported in the cognitive science and neuroscience literature. This model consists of C distinct continuous spaces. Multiple (compound) emotion categories can be recognized by linearly combining these C face spaces. The dimensions of these spaces are shown to be mostly configural. According to this model, the major task for the classification of facial expressions of emotion is precise, detailed detection of facial landmarks rather than recognition. We provide an overview of the literature justifying the model, show how the resulting model can be employed to build algorithms for the recognition of facial expressions of emotion, and propose research directions for machine learning and computer vision researchers to keep pushing the state of the art in these areas. We also discuss how the model can

  18. Innovation Production Models

    Directory of Open Access Journals (Sweden)

    Tamam N. Guseinova

    2016-01-01

    Full Text Available The article is dedicated to the study of models of innovation production at the enterprise and state levels. The shift towards a new technology wave induces a change in systems of division of labour as well as the establishment of new forms of cooperation that are reflected both in the theory and in the practice of innovation policy and management. Within the scope of the research question we have studied different generations of the innovation process, starting with simple linear models, "technology push" and "market pull", and ending with a complex integrated model of open innovations. There are two organizational models of innovation production at the enterprise level: start-ups in the early stages of their development and ambidextrous organizations. The former are prone to linear models of the innovation process, while the latter create innovation within more sophisticated inclusive processes. Companies that effectuate reciprocal ambidexterity stand out from all the rest, since together with start-ups, research and development centres, elements of innovation infrastructure and other economic agents operating in the same value chain they constitute the core of the most advanced forms of national innovation systems, namely Triple Helix and Quadruple Helix systems. National innovation systems, the models of innovation production at the state level, evolve into systems with a more profound division of labour that enable "line production" of innovations. These tendencies are closely related to the advent and development of the concept of serial entrepreneurship, which transforms entrepreneurship into a new type of profession. International experience proves this concept to be efficient in various parts of the world. Nevertheless, the use of the above-mentioned models and concepts in a national innovation system should be justified by the socioeconomic conditions of economic regions, since they determine the efficiency of implementation of certain innovation processes and

  19. Utility Function and Optimum Consumption in the models with Habit Formation and Catching up with the Joneses

    OpenAIRE

    Naryshkin, Roman; Davison, Matt

    2009-01-01

    This paper analyzes popular time-nonseparable utility functions that describe "habit formation" consumer preferences, which compare current consumption with the time-averaged past consumption of the same individual, and "catching up with the Joneses" (CuJ) preferences, which compare individual consumption with a cross-sectional average consumption level. Few of these models give reasonable optimum consumption time series. We introduce theoretically justified utility specifications leading to a plausible cons...

  20. Modeling of the water gap in BWR fuel elements using SCALE/TRITON; Modellierung des Wasserspalts bei SWR-BE mit SCALE/TRITON

    Energy Technology Data Exchange (ETDEWEB)

    Tittelbach, S.; Chernykh, M. [WTI Wissenschaftlich-Technische Ingenieurberatung GmbH, Juelich (Germany)

    2012-11-01

    The authors show that adequate modeling of the water gap in BWR fuel element models using the TRITON code requires explicit consideration of the Dancoff factors. The analysis of three modeling options reveals that when the moderating effect of the water-gap coolant on the peripheral fuel elements is taken into account, the resulting deviations of the U-235 and Pu-239 concentrations are significantly reduced. The increased computation times are justified with respect to the burnup credits for criticality safety analyses.

  1. Real Time Updating in Distributed Urban Rainfall Runoff Modelling

    DEFF Research Database (Denmark)

    Borup, Morten; Madsen, Henrik

    that are being updated from system measurements was studied. The results showed that the fact alone that it takes time for rainfall data to travel the distance between gauges and catchments has such a big negative effect on the forecast skill of updated models, that it can justify the choice of even very...... as in a real data case study. The results confirmed that the method is indeed suitable for DUDMs and that it can be used to utilise upstream as well as downstream water level and flow observations to improve model estimates and forecasts. Due to upper and lower sensor limits many sensors in urban drainage...

  2. ANALYSIS MODEL FOR INVENTORY MANAGEMENT

    Directory of Open Access Journals (Sweden)

    CAMELIA BURJA

    2010-01-01

    Full Text Available Inventory represents an essential component of the assets of the enterprise, and economic analysis gives it special importance because accurate inventory management determines the achievement of the company's business objectives and financial results. Efficient inventory management requires ensuring an optimum inventory level, which guarantees the normal functioning of the activity with minimum inventory expenses and immobilised funds. The paper presents an analysis model for inventory management based on rotation speed and its correlation with sales volume, illustrated in an adequate study. Highlighting the factors that influence efficient inventory management provides the information needed to justify managerial decisions, which will lead to a balanced financial position and to increased company performance.
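
    The rotation-speed indicators on which such an analysis rests reduce to two standard ratios, sketched here with made-up figures.

```python
# Minimal illustration of the rotation-speed indicators the article builds on:
# inventory turnover (how many times average inventory is sold per period)
# and days of inventory. All figures are invented for the example.

def turnover(cost_of_goods_sold, avg_inventory):
    return cost_of_goods_sold / avg_inventory

def days_of_inventory(cost_of_goods_sold, avg_inventory, days=365):
    return days / turnover(cost_of_goods_sold, avg_inventory)

cogs, inv = 1_200_000.0, 150_000.0
print(f"turnover: {turnover(cogs, inv):.1f}x per year")
print(f"days of inventory: {days_of_inventory(cogs, inv):.0f} days")
```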

  3. Differentiating intraday seasonalities through wavelet multi-scaling

    NARCIS (Netherlands)

    Gençay, R.; Selçuk, F.; Whitcher, B.

    2001-01-01

    It is well documented that strong intraday seasonalities may induce distortions in the estimation of volatility models. These seasonalities are also the dominant source for the underlying misspecifications of the various volatility models. Therefore, an obvious route is to filter out the underlying

  4. Portfolio Management with Stochastic Interest Rates and Inflation Ambiguity

    DEFF Research Database (Denmark)

    Munk, Claus; Rubtsov, Alexey Vladimirovich

    2014-01-01

    prices. The investor is ambiguous about the inflation model and prefers a portfolio strategy which is robust to model misspecification. Ambiguity about the inflation dynamics is shown to affect the optimal portfolio fundamentally differently from ambiguity about the price dynamics of traded assets

  5. Statistical tests for equal predictive ability across multiple forecasting methods

    DEFF Research Database (Denmark)

    Borup, Daniel; Thyrsgaard, Martin

    We develop a multivariate generalization of the Giacomini-White tests for equal conditional predictive ability. The tests are applicable to a mixture of nested and non-nested models, incorporate estimation uncertainty explicitly, and allow for misspecification of the forecasting model as well as ...

  6. Causal Models and Exploratory Analysis in Heterogeneous Information Fusion for Detecting Potential Terrorists

    Science.gov (United States)

    2015-11-01

    independent. The PFT model is deliberately not that of a rational actor doing cost-benefit calculations. Real individuals are affected by emotions ...use the TLWS and PF methods discussed earlier. Our quasi-Bayesian method is “quasi” because we used heuristic methods to determine the weight given...are often justified heuristically on a case-by-case basis. One way to think about the structural issues around which we had to design is to think of

  7. Modeling and Analysis of Power Processing Systems (MAPPS). Volume 2: Appendices

    Science.gov (United States)

    Lee, F. C.; Radman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.

    1980-01-01

    The computer programs and derivations generated in support of the modeling and design optimization program are presented. Programs for the buck regulator, boost regulator, and buck-boost regulator are described. The computer program for the design optimization calculations is presented. Constraints for the boost and buck-boost converters were derived. Derivations of state-space equations and transfer functions are presented. Computer listings for the converters are presented, and the input parameters are justified.

  8. Efficient Work Team Scheduling: Using Psychological Models of Knowledge Retention to Improve Code Writing Efficiency

    Directory of Open Access Journals (Sweden)

    Michael J. Pelosi

    2014-12-01

    Full Text Available Development teams and programmers must retain critical information about their work during work intervals and gaps in order to improve future performance when work resumes. Despite time lapses, project managers want to maximize coding efficiency and effectiveness. By developing a mathematically justified, practically useful, and computationally tractable quantitative and cognitive model of learning and memory retention, this study establishes calculations designed to maximize scheduling payoff and optimize developer efficiency and effectiveness.
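
    A minimal sketch of the idea, assuming an Ebbinghaus-style exponential retention curve; the stability parameter s and the payoff rule are hypothetical stand-ins for the paper's actual model.

```python
import math

# Exponential retention curve and the scheduling payoff it implies: shorter
# gaps between work sessions preserve more task-specific context. The memory
# stability s (in days) is a hypothetical parameter.

def retention(gap_days, s=7.0):
    """Fraction of task-specific knowledge retained after a work gap."""
    return math.exp(-gap_days / s)

def effective_hours(hours, gap_days, s=7.0):
    """Productive hours after discounting time spent re-learning context."""
    return hours * retention(gap_days, s)

for gap in (1, 7, 30):
    print(f"gap={gap:>2} days: retention={retention(gap):.2f}, "
          f"effective hours out of 40: {effective_hours(40, gap):.1f}")
```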

  9. Modeling density-driven flow in porous media principles, numerics, software

    CERN Document Server

    Holzbecher, Ekkehard O

    1998-01-01

    Modeling of flow and transport in groundwater has become an important focus of scientific research in recent years. Most contributions to this subject deal with flow situations where density and viscosity changes in the fluid are neglected. This restriction may not always be justified. The models presented in the book demonstrate impressively that the flow pattern may be completely different when density changes are taken into account. The main applications of the models are thermal and saline convection, geothermal flow, saltwater intrusion, flow through salt formations, etc. This book not only presents the basic theory; the reader can also test his knowledge by applying the included software and can set up his own models.

  10. Semi-analytical model for a slab one-dimensional photonic crystal

    Science.gov (United States)

    Libman, M.; Kondratyev, N. M.; Gorodetsky, M. L.

    2018-02-01

    In our work we justify the applicability of a dielectric-mirror model to the description of a real photonic crystal. We demonstrate that a simple one-dimensional model of a multilayer mirror can be employed to model a slab waveguide with periodically changing width. It is shown that this width modulation can be recast as an effective refractive-index modulation. The applicability of the transfer matrix method for calculating reflection properties is demonstrated. Finally, our 1-D model is employed to analyze the reflection properties of a 2-D structure: a slab photonic crystal with a number of elliptic holes.
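
    The transfer matrix method mentioned above is standard and easy to sketch. The code below computes normal-incidence reflectance for a quarter-wave stack; the effective indices, layer count, and wavelength are arbitrary example values, not the paper's structure.

```python
import numpy as np

# Minimal transfer-matrix calculation of the reflectance of a 1-D multilayer
# (dielectric-mirror) stack at normal incidence, the textbook method applied
# here to an effective-index model of the slab crystal.

def layer_matrix(n, d, lam):
    delta = 2 * np.pi * n * d / lam          # phase thickness of the layer
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def reflectance(layers, lam, n_in=1.0, n_sub=1.5):
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ layer_matrix(n, d, lam)
    B, C = M @ np.array([1.0, n_sub])        # fields at the front interface
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

lam0 = 1.55e-6                               # design wavelength (m)
hi, lo = 2.1, 1.45                           # effective indices of the layers
stack = [(hi, lam0 / (4 * hi)), (lo, lam0 / (4 * lo))] * 10  # quarter-wave pairs
print(f"R at design wavelength: {reflectance(stack, lam0):.4f}")
```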

  11. A Numerical-Analytical Approach to Modeling the Axial Rotation of the Earth

    Science.gov (United States)

    Markov, Yu. G.; Perepelkin, V. V.; Rykhlova, L. V.; Filippova, A. S.

    2018-04-01

    A model for the non-uniform axial rotation of the Earth is studied using a celestial-mechanical approach and numerical simulations. The application of an approximate model containing a small number of parameters to predict variations of the axial rotation velocity of the Earth over short time intervals is justified. This approximate model is obtained by averaging variable parameters that are subject to small variations due to non-stationarity of the perturbing factors. The model is verified and compared with predictions over a long time interval published by the International Earth Rotation and Reference Systems Service (IERS).

  12. The Separate Spheres Model of Gendered Inequality.

    Science.gov (United States)

    Miller, Andrea L; Borgida, Eugene

    2016-01-01

    Research on role congruity theory and descriptive and prescriptive stereotypes has established that when men and women violate gender stereotypes by crossing spheres, with women pursuing career success and men contributing to domestic labor, they face backlash and economic penalties. Less is known, however, about the types of individuals who are most likely to engage in these forms of discrimination and the types of situations in which this is most likely to occur. We propose that psychological research will benefit from supplementing existing research approaches with an individual differences model of support for separate spheres for men and women. This model allows psychologists to examine individual differences in support for separate spheres as they interact with situational and contextual forces. The separate spheres ideology (SSI) has existed as a cultural idea for many years but has not been operationalized or modeled in social psychology. The Separate Spheres Model presents the SSI as a new psychological construct characterized by individual differences and a motivated system-justifying function, operationalizes the ideology with a new scale measure, and models the ideology as a predictor of some important gendered outcomes in society. As a first step toward developing the Separate Spheres Model, we develop a new measure of individuals' endorsement of the SSI and demonstrate its reliability, convergent validity, and incremental predictive validity. We provide support for the novel hypotheses that the SSI predicts attitudes regarding workplace flexibility accommodations, income distribution within families between male and female partners, distribution of labor between work and family spheres, and discriminatory workplace behaviors. Finally, we provide experimental support for the hypothesis that the SSI is a motivated, system-justifying ideology.

  13. The Separate Spheres Model of Gendered Inequality.

    Directory of Open Access Journals (Sweden)

    Andrea L Miller

    Full Text Available Research on role congruity theory and descriptive and prescriptive stereotypes has established that when men and women violate gender stereotypes by crossing spheres, with women pursuing career success and men contributing to domestic labor, they face backlash and economic penalties. Less is known, however, about the types of individuals who are most likely to engage in these forms of discrimination and the types of situations in which this is most likely to occur. We propose that psychological research will benefit from supplementing existing research approaches with an individual differences model of support for separate spheres for men and women. This model allows psychologists to examine individual differences in support for separate spheres as they interact with situational and contextual forces. The separate spheres ideology (SSI) has existed as a cultural idea for many years but has not been operationalized or modeled in social psychology. The Separate Spheres Model presents the SSI as a new psychological construct characterized by individual differences and a motivated system-justifying function, operationalizes the ideology with a new scale measure, and models the ideology as a predictor of some important gendered outcomes in society. As a first step toward developing the Separate Spheres Model, we develop a new measure of individuals' endorsement of the SSI and demonstrate its reliability, convergent validity, and incremental predictive validity. We provide support for the novel hypotheses that the SSI predicts attitudes regarding workplace flexibility accommodations, income distribution within families between male and female partners, distribution of labor between work and family spheres, and discriminatory workplace behaviors. Finally, we provide experimental support for the hypothesis that the SSI is a motivated, system-justifying ideology.

  14. The Separate Spheres Model of Gendered Inequality

    Science.gov (United States)

    Miller, Andrea L.; Borgida, Eugene

    2016-01-01

    Research on role congruity theory and descriptive and prescriptive stereotypes has established that when men and women violate gender stereotypes by crossing spheres, with women pursuing career success and men contributing to domestic labor, they face backlash and economic penalties. Less is known, however, about the types of individuals who are most likely to engage in these forms of discrimination and the types of situations in which this is most likely to occur. We propose that psychological research will benefit from supplementing existing research approaches with an individual differences model of support for separate spheres for men and women. This model allows psychologists to examine individual differences in support for separate spheres as they interact with situational and contextual forces. The separate spheres ideology (SSI) has existed as a cultural idea for many years but has not been operationalized or modeled in social psychology. The Separate Spheres Model presents the SSI as a new psychological construct characterized by individual differences and a motivated system-justifying function, operationalizes the ideology with a new scale measure, and models the ideology as a predictor of some important gendered outcomes in society. As a first step toward developing the Separate Spheres Model, we develop a new measure of individuals’ endorsement of the SSI and demonstrate its reliability, convergent validity, and incremental predictive validity. We provide support for the novel hypotheses that the SSI predicts attitudes regarding workplace flexibility accommodations, income distribution within families between male and female partners, distribution of labor between work and family spheres, and discriminatory workplace behaviors. Finally, we provide experimental support for the hypothesis that the SSI is a motivated, system-justifying ideology. PMID:26800454

  15. Modeling liquid hydrogen cavitating flow with the full cavitation model

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, X.B.; Qiu, L.M.; Qi, H.; Zhang, X.J.; Gan, Z.H. [Institute of Refrigeration and Cryogenic Engineering, Zhejiang University, Hangzhou 310027 (China)

    2008-12-15

    Cavitation is the formation of vapor bubbles within a liquid where the flow dynamics cause the local static pressure to drop below the vapor pressure. This paper strives towards developing an effective computational strategy to simulate liquid hydrogen cavitation relevant to liquid rocket propulsion applications. The aims are realized by performing a steady-state computational fluid dynamics (CFD) study of liquid hydrogen flow over a 2D hydrofoil and an axisymmetric ogive from Hord's reports with a so-called full cavitation model. The thermodynamic effect was demonstrated under the assumption of thermal equilibrium between the gas and liquid phases. Temperature-dependent fluid thermodynamic properties were specified along the saturation line from the "Gaspak 3.2" databank. Justifiable agreement between the computed surface pressure and temperature and the experimental data of Hord was obtained. Specifically, a global sensitivity analysis was performed to examine the sensitivity of the turbulent computations to the wall grid resolution, wall treatments, and changes in model parameters. A proper near-wall model and grid resolution were suggested. The full cavitation model with default model parameters provided solutions of comparable accuracy for sheet cavitation in liquid hydrogen for the two geometries. (author)
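
    For orientation, cavitation onset is commonly judged through the cavitation number; the sketch below uses rough liquid-hydrogen property values for illustration, not Hord's actual test conditions.

```python
# Quick check of where cavitation is expected, using the cavitation number
# sigma = (p_inf - p_v) / (0.5 * rho * U^2): cavitation onsets roughly where
# the local pressure coefficient satisfies -Cp > sigma. Property values are
# rough illustrative numbers.

def cavitation_number(p_inf, p_vapor, rho, U):
    return (p_inf - p_vapor) / (0.5 * rho * U**2)

rho_lh2 = 70.0        # kg/m^3, liquid hydrogen (approximate)
p_inf = 1.6e5         # Pa, freestream static pressure (example)
p_v = 1.0e5           # Pa, vapor pressure near 20 K (approximate)
for U in (10.0, 30.0, 60.0):
    print(f"U={U:>4.0f} m/s: sigma={cavitation_number(p_inf, p_v, rho_lh2, U):.3f}")
```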

  16. Comparing fixed effects and covariance structure estimators for panel data

    DEFF Research Database (Denmark)

    Ejrnæs, Mette; Holm, Anders

    2006-01-01

    In this article, the authors compare the traditional econometric fixed effects estimator with the maximum likelihood estimator implied by covariance structure models for panel data. Their finding is that the maximum likelihood estimator is remarkably robust to certain types of misspecifications...

  17. Analysis of the K-epsilon turbulence model

    International Nuclear Information System (INIS)

    Mohammadi, B.; Pironneau, O.

    1993-12-01

    This book is aimed at applied mathematicians interested in the numerical simulation of turbulent flows. The book is centered around the k-ε model but it also deals with other models such as subgrid-scale models, one-equation models and Reynolds stress models. The reader is expected to have some knowledge of numerical methods for fluids and, if possible, some understanding of fluid mechanics, the partial differential equations used and their variational formulations. This book presents the k-ε method for turbulence in a language familiar to applied mathematicians, stripped bare of all the technicalities of turbulence theory. The model is justified from a mathematical standpoint rather than from a physical one. The numerical algorithms are investigated and some theoretical and numerical results presented. This book should prove an invaluable tool for those studying a subject that is still controversial but very useful for industrial applications. (authors). 71 figs., 200 refs

  18. The British Model in Britain: Failing slowly

    International Nuclear Information System (INIS)

    Thomas, Steve

    2006-01-01

    In 1990, Britain reorganised its electricity industry to run on competitive lines. The British reforms are widely regarded as successful and the model used provides the basis for reforms of electricity industries worldwide. The main reason for this perception of success is major reductions in the real price of electricity with no reduction in service quality. This paper examines whether the reputation of the British reforms is justified. It concludes that the reputation is not justified and that serious fundamental problems are beginning to emerge. The central question is: have the British reforms resulted in the creation of efficient wholesale and retail markets? On this criterion, the reforms have failed. The wholesale market is dominated by obscure long-term contracts, privileged access to the market and self-dealing within integrated generator/retailers, leaving the spot markets with minimal liquidity and unreliable prices. The failure to develop an efficient wholesale market places the onus on consumers to impose competitive forces on electricity companies by switching regularly. Small consumers will not do this and they are paying too much for their power. For the future, there is a serious risk that the electricity industry will become a weakly regulated oligopoly with a veneer of competition

  19. Developing TIMSS-Model Reasoning Problems for Junior Secondary School (SMP) Mathematics

    Directory of Open Access Journals (Sweden)

    A. Rizta

    2013-06-01

    Full Text Available This study aimed to develop TIMSS-model reasoning problems for junior secondary school (SMP) mathematics. The subjects were the 27 students of class VIII.7 at SMP Negeri 1 Palembang. The method used was development research. The results show that 22.22% of the students obtained reasoning scores above 65%, while 77.78% scored below 65%. In more detail, achievement across the reasoning domains was 11.11% for generalize, 3.7% for justify, 29.63% for integrate, 44.45% for analyze, and 51.85% for non-routine problems. Against a minimum attainment threshold of 65%, the students' reasoning therefore remains below the minimum; in other words, their reasoning ability is still low.

  20. Multiscale modelling and analysis of collective decision making in swarm robotics.

    Science.gov (United States)

    Vigelius, Matthias; Meyer, Bernd; Pascoe, Geoffrey

    2014-01-01

    We present a unified approach to describing certain types of collective decision making in swarm robotics that bridges from a microscopic individual-based description to aggregate properties. Our approach encompasses robot swarm experiments, microscopic and probabilistic macroscopic-discrete simulations as well as an analytic mathematical model. Following up on previous work, we identify the symmetry parameter, a measure of the progress of the swarm towards a decision, as a fundamental integrated swarm property and formulate its time evolution as a continuous-time Markov process. Contrary to previous work, which justified this approach only empirically and a posteriori, we justify it from first principles and derive hard limits on the parameter regime in which it is applicable.

  1. Multiscale Modelling and Analysis of Collective Decision Making in Swarm Robotics

    Science.gov (United States)

    Vigelius, Matthias; Meyer, Bernd; Pascoe, Geoffrey

    2014-01-01

    We present a unified approach to describing certain types of collective decision making in swarm robotics that bridges from a microscopic individual-based description to aggregate properties. Our approach encompasses robot swarm experiments, microscopic and probabilistic macroscopic-discrete simulations as well as an analytic mathematical model. Following up on previous work, we identify the symmetry parameter, a measure of the progress of the swarm towards a decision, as a fundamental integrated swarm property and formulate its time evolution as a continuous-time Markov process. Contrary to previous work, which justified this approach only empirically and a posteriori, we justify it from first principles and derive hard limits on the parameter regime in which it is applicable. PMID:25369026
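
    As a toy version of the formulation described, the following Gillespie-style simulation evolves a birth-death continuous-time Markov chain for the number of robots committed to one option and reports the resulting symmetry parameter; the rate law and constants are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50  # swarm size

def simulate(k0=25, t_end=100.0, a=1.0):
    """Birth-death CTMC: k robots favour option A, N - k favour option B."""
    t, k = 0.0, k0
    times, states = [t], [k]
    while t < t_end and 0 < k < N:
        rate_up = a * k * (N - k) / N      # a B-robot is recruited to A
        rate_down = a * (N - k) * k / N    # an A-robot is recruited to B
        total = rate_up + rate_down
        t += rng.exponential(1.0 / total)  # waiting time to the next event
        k += 1 if rng.random() < rate_up / total else -1
        times.append(t)
        states.append(k)
    return np.array(times), np.array(states)

times, states = simulate()
symmetry = 2 * states / N - 1              # +1 and -1 correspond to consensus
print("final symmetry parameter:", symmetry[-1])
```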

  2. Model Based Reasoning by Introductory Students When Analyzing Earth Systems and Societal Challenges

    Science.gov (United States)

    Holder, L. N.; Herbert, B. E.

    2014-12-01

    Understanding how students use their conceptual models to reason about societal challenges involving societal issues such as natural hazard risk assessment, environmental policy and management, and energy resources can improve instructional activity design that directly impacts student motivation and literacy. To address this question, we created four laboratory exercises for an introductory physical geology course at Texas A&M University that engages students in authentic scientific practices by using real world problems and issues that affect societies based on the theory of situated cognition. Our case-study design allows us to investigate the various ways that students utilize model based reasoning to identify and propose solutions to societally relevant issues. In each of the four interventions, approximately 60 students in three sections of introductory physical geology were expected to represent and evaluate scientific data, make evidence-based claims about the data trends, use those claims to express conceptual models, and use their models to analyze societal challenges. Throughout each step of the laboratory exercise students were asked to justify their claims, models, and data representations using evidence and through the use of argumentation with peers. Cognitive apprenticeship was the foundation for instruction used to scaffold students so that in the first exercise they are given a partially completed model and in the last exercise students are asked to generate a conceptual model on their own. Student artifacts, including representation of earth systems, representation of scientific data, verbal and written explanations of models and scientific arguments, and written solutions to specific societal issues or environmental problems surrounding earth systems, were analyzed through the use of a rubric that modeled authentic expertise and students were sorted into three categories. Written artifacts were examined to identify student argumentation and

  3. Animal models of pediatric chronic kidney disease. Is adenine intake an appropriate model?

    Directory of Open Access Journals (Sweden)

    Débora Claramunt

    2015-11-01

    Full Text Available Pediatric chronic kidney disease (CKD) has peculiar features. In particular, growth impairment is a major clinical manifestation of CKD debuting in pediatric age, because it presents in a large proportion of infants and children with CKD and has a profound impact on the self-esteem and social integration of the stunted patients. Several factors associated with CKD may lead to growth retardation by interfering with the normal physiology of the growth plate, the organ where longitudinal growth takes place. The growth plate can hardly be studied in humans, which justifies the use of animal models. Young rats made uremic by 5/6 nephrectomy have been widely used as a model to investigate growth retardation in CKD. This article examines the characteristics of this model and analyzes the use of CKD induced by a high-adenine diet as an alternative research protocol.

  4. Complex groundwater flow systems as traveling agent models

    Directory of Open Access Journals (Sweden)

    Oliver López Corona

    2014-10-01

    Full Text Available Analyzing field data from pumping tests, we show that, as with many other natural phenomena, groundwater flow exhibits complex dynamics described by a 1/f power spectrum. This result is studied theoretically within an agent perspective. Using a traveling-agent model, we prove that this statistical behavior emerges when the medium is complex. Some heuristic reasoning is provided to justify both spatial and dynamic complexity as the result of the superposition of an infinite number of stochastic processes. Moreover, we show that this implies that non-Kolmogorovian probability is needed for its study, and we provide a set of new partial differential equations for groundwater flow.
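
    The 1/f diagnosis referred to here amounts to fitting the slope of the periodogram on log-log axes. A minimal sketch, run on synthetic white noise so the expected slope is near 0 rather than -1:

```python
import numpy as np

# Estimate the periodogram of a time series and fit the spectral slope on
# log-log axes; a slope near -1 would indicate 1/f-type dynamics.

rng = np.random.default_rng(2)
x = rng.standard_normal(4096)                       # synthetic white noise

freqs = np.fft.rfftfreq(x.size, d=1.0)[1:]          # drop the zero frequency
power = np.abs(np.fft.rfft(x))[1:] ** 2

slope, intercept = np.polyfit(np.log(freqs), np.log(power), 1)
print(f"fitted spectral slope: {slope:.2f}  (1/f noise would give about -1)")
```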

  5. Theoretical Background for the Decision-Making Process Modelling under Controlled Intervention Conditions

    Directory of Open Access Journals (Sweden)

    Bakanauskienė Irena

    2017-12-01

    Full Text Available This article is intended to theoretically justify the decision-making process model for cases when active participation of investing entities in controlling the activities of an organisation and their results is noticeable. Based on scientific literature analysis, a concept of controlled conditions is formulated, and using a rational approach to the decision-making process, a model of the 11-step decision-making process under controlled intervention is presented. In addition, the conditions describing the case of controlled interventions have been unified, providing preconditions to ensure the adequacy of the proposed decision-making process model.

  6. Mathematical modelling of plant transients in the PWR for simulator purposes

    International Nuclear Information System (INIS)

    Hartel, K.

    1984-01-01

    This chapter presents the results of the testing of anticipated and abnormal plant transients in pressurized water reactors (PWRs) of the type WWER 440 by means of the numerical simulation of 32 different, stationary and nonstationary, operational regimes. Topics considered include the formation of the PWR mathematical model, the physical approximation of the reactor core, the structure of the reactor core model, a mathematical approximation of the reactor model, the selection of numerical methods, and a computerized simulation system. The necessity of a PWR simulator in Czechoslovakia is justified by the present status and the outlook for the further development of the Czechoslovak nuclear power complex

  7. GARUSO - Version 1.0. Uncertainty model for multipath ultrasonic transit time gas flow meters

    Energy Technology Data Exchange (ETDEWEB)

    Lunde, Per; Froeysa, Kjell-Eivind; Vestrheim, Magne

    1997-09-01

    This report describes an uncertainty model for ultrasonic transit-time gas flow meters configured with parallel chords, and a PC program, GARUSO Version 1.0, implemented for calculation of the meter's relative expanded uncertainty. The program, which is based on the theoretical uncertainty model, is used to carry out a simplified and limited uncertainty analysis for a 12" 4-path meter, where examples of input and output uncertainties are given. The model predicts a relative expanded uncertainty for the meter at a level which further justifies today's increasing tendency to use this type of instrument for fiscal metering of natural gas. 52 refs., 15 figs., 11 tabs.
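
    The structure of such an uncertainty budget is the standard GUM-style combination of contributions; the sketch below combines invented relative standard uncertainties and expands with a coverage factor k = 2. The contribution names and values are placeholders, not the report's figures.

```python
import math

# Root-sum-square combination of uncorrelated relative standard uncertainties
# into a relative expanded uncertainty (coverage factor k = 2, ~95% coverage).

contributions = {                 # relative standard uncertainties (invented)
    "transit-time detection": 0.0008,
    "path geometry": 0.0010,
    "integration method": 0.0012,
    "flow profile effects": 0.0015,
}

combined = math.sqrt(sum(u**2 for u in contributions.values()))
expanded = 2.0 * combined         # coverage factor k = 2
print(f"relative expanded uncertainty: {100 * expanded:.2f} %")
```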

  8. On the Formal Modeling of Games of Language and Adversarial Argumentation : A Logic-Based Artificial Intelligence Approach

    OpenAIRE

    Eriksson Lundström, Jenny S. Z.

    2009-01-01

    Argumentation is a highly dynamical and dialectical process drawing on human cognition. Successful argumentation is ubiquitous to human interaction. Comprehensive formal modeling and analysis of argumentation presupposes a dynamical approach to the following phenomena: the deductive logic notion, the dialectical notion and the cognitive notion of justified belief. For each step of an argumentation these phenomena form networks of rules which determine the propositions to be allowed to make se...

  9. Overdeepening development in a glacial landscape evolution model with quarrying

    DEFF Research Database (Denmark)

    Ugelvig, Sofie Vej; Egholm, D.L.; Iverson, Neal R.

    In glacial landscape evolution models, subglacial erosion rates are often related to basal sliding or ice discharge by a power law. This relation can be justified when considering bed abrasion, where rock debris transported in the basal ice drives erosion. However, the relation is not well... supported when considering models for quarrying of rock blocks from the bed. Field observations indicate that the principal mechanism of glacial erosion is quarrying, which emphasizes the importance of a better way of implementing erosion by quarrying in glacial landscape evolution models. Iverson (2012... around the obstacles. The erosion rate is quantified by considering the likelihood of rock fracturing on topographic bumps. The model includes a statistical treatment of the bedrock weakness, which is neglected in previous quarrying models. Sliding rate, effective pressure, and average bedslope...

  10. On the empirical relevance of the transient in opinion models

    Energy Technology Data Exchange (ETDEWEB)

    Banisch, Sven, E-mail: sven.banisch@universecity.d [Mathematical Physics, Physics Department, Bielefeld University, 33501 Bielefeld (Germany); Institute for Complexity Science (ICC), 1249-078 Lisbon (Portugal); Araujo, Tanya, E-mail: tanya@iseg.utl.p [Research Unit on Complexity in Economics (UECE), ISEG, TULisbon, 1249-078 Lisbon (Portugal); Institute for Complexity Science (ICC), 1249-078 Lisbon (Portugal)

    2010-07-12

    While the number and variety of models to explain opinion exchange dynamics are huge, attempts to justify the model results using empirical data are relatively rare. As linking to real data is essential for establishing model credibility, this Letter develops an empirical confirmation experiment by which an opinion model is related to real election data. The model is based on a representation of opinions as a vector of k bits. Individuals interact according to the principle that similarity leads to interaction and interaction leads to still more similarity. In the comparison to real data we concentrate on the transient opinion profiles that form during the dynamic process. An artificial election procedure is introduced which makes it possible to relate transient opinion configurations to the electoral performance of candidates for which data are available. The election procedure, based on the well-established principle of proximity voting, is repeatedly performed during the transient period, and remarkable statistical agreement with the empirical data is observed.

  11. On the empirical relevance of the transient in opinion models

    International Nuclear Information System (INIS)

    Banisch, Sven; Araujo, Tanya

    2010-01-01

    While the number and variety of models to explain opinion exchange dynamics are huge, attempts to justify the model results using empirical data are relatively rare. As linking to real data is essential for establishing model credibility, this Letter develops an empirical confirmation experiment by which an opinion model is related to real election data. The model is based on a representation of opinions as a vector of k bits. Individuals interact according to the principle that similarity leads to interaction and interaction leads to still more similarity. In the comparison to real data we concentrate on the transient opinion profiles that form during the dynamic process. An artificial election procedure is introduced which makes it possible to relate transient opinion configurations to the electoral performance of candidates for which data are available. The election procedure, based on the well-established principle of proximity voting, is repeatedly performed during the transient period, and remarkable statistical agreement with the empirical data is observed.
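
    A toy implementation of the Letter's opinion representation is straightforward; the population size, bit-string length k, and exact update rule below are illustrative choices, not the paper's calibrated setup.

```python
import numpy as np

# Each agent holds a k-bit opinion vector; similarity makes interaction more
# likely, and an interaction copies one differing bit, increasing similarity.

rng = np.random.default_rng(3)
n_agents, k = 200, 8
opinions = rng.integers(0, 2, size=(n_agents, k))

def step(opinions):
    i, j = rng.choice(n_agents, size=2, replace=False)
    same = opinions[i] == opinions[j]
    similarity = same.mean()
    if similarity < 1.0 and rng.random() < similarity:
        bit = rng.choice(np.flatnonzero(~same))     # pick one differing bit
        opinions[i, bit] = opinions[j, bit]         # i adopts j's bit

for _ in range(50_000):
    step(opinions)

# Transient profile: how many distinct opinion clusters remain.
n_profiles = len({tuple(row) for row in opinions})
print("distinct opinion profiles remaining:", n_profiles)
```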

  12. Mechanical Impedance Modeling of Human Arm: A survey

    Science.gov (United States)

    Puzi, A. Ahmad; Sidek, S. N.; Sado, F.

    2017-03-01

    Human arm mechanical impedance plays a vital role in describing the motion ability of the upper limb. One of the impedance parameters is stiffness, defined as the ratio of an applied force to the measured deformation of the muscle. Arm mechanical impedance modeling is useful for developing better controllers for systems that interact with humans, such as automated robot-assisted rehabilitation platforms. The aim of this survey is to summarize the existing mechanical impedance models of the human upper limb and thereby justify the need for an improved arm model that facilitates the development of better controllers for such systems as their complexity increases. In particular, the paper addresses the following issues: human motor control and motor learning, constant and variable impedance models, methods for measuring mechanical impedance, and mechanical impedance modeling techniques.
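
    The second-order impedance model underlying most of the surveyed work, F = M·x'' + B·x' + K·x, can be identified from perturbation data by least squares; the sketch below does this on synthetic data, with all parameter values invented for illustration.

```python
import numpy as np

# Fit endpoint inertia M, damping B and stiffness K from imposed displacement
# and measured force, using the model F = M*x'' + B*x' + K*x.

rng = np.random.default_rng(4)
t = np.linspace(0, 2, 1000)
x = 0.01 * np.sin(2 * np.pi * 2 * t)            # imposed displacement (m)
v = np.gradient(x, t)                           # velocity
a = np.gradient(v, t)                           # acceleration

M_true, B_true, K_true = 1.5, 8.0, 300.0        # kg, N*s/m, N/m (invented)
F = M_true * a + B_true * v + K_true * x + 0.05 * rng.standard_normal(t.size)

# Least-squares estimate of [M, B, K] from force and kinematics:
A = np.column_stack([a, v, x])
(M_est, B_est, K_est), *_ = np.linalg.lstsq(A, F, rcond=None)
print(f"M={M_est:.2f} kg, B={B_est:.2f} N*s/m, K={K_est:.1f} N/m")
```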

  13. Modeling in biopharmaceutics, pharmacokinetics, and pharmacodynamics homogeneous and heterogeneous approaches

    CERN Document Server

    Macheras, Panos

    2006-01-01

    The state of the art in Biopharmaceutics, Pharmacokinetics, and Pharmacodynamics Modeling is presented in this book. It shows how advanced physical and mathematical methods can expand classical models in order to cover heterogeneous drug-biological processes and therapeutic effects in the body. The book is divided into four parts; the first deals with the fundamental principles of fractals, diffusion and nonlinear dynamics; the second with drug dissolution, release, and absorption; the third with empirical, compartmental, and stochastic pharmacokinetic models, and the fourth mainly with nonclassical aspects of pharmacodynamics. The classical models that have relevance and application to these sciences are also considered throughout. Many examples are used to illustrate the intrinsic complexity of drug administration related phenomena in the human, justifying the use of advanced modeling methods. This timely and useful book will appeal to graduate students and researchers in pharmacology, pharmaceutical scienc...

  14. Automated parameter estimation for biological models using Bayesian statistical model checking.

    Science.gov (United States)

    Hussain, Faraz; Langmead, Christopher J; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram; Jha, Sumit K

    2015-01-01

    Probabilistic models have gained widespread acceptance in the systems biology community as a useful way to represent complex biological systems. Such models are developed using existing knowledge of the structure and dynamics of the system, experimental observations, and inferences drawn from statistical analysis of empirical data. A key bottleneck in building such models is that some system variables cannot be measured experimentally. These variables are incorporated into the model as numerical parameters. Determining values of these parameters that justify existing experiments and provide reliable predictions when model simulations are performed is a key research problem. Using an agent-based model of the dynamics of acute inflammation, we demonstrate a novel parameter estimation algorithm by discovering the amount and schedule of doses of bacterial lipopolysaccharide that guarantee a set of observed clinical outcomes with high probability. We synthesized values of twenty-eight unknown parameters such that the parameterized model instantiated with these parameter values satisfies four specifications describing the dynamic behavior of the model. We have developed a new algorithmic technique for discovering parameters in complex stochastic models of biological systems given behavioral specifications written in a formal mathematical logic. Our algorithm uses Bayesian model checking, sequential hypothesis testing, and stochastic optimization to automatically synthesize parameters of probabilistic biological models.

  15. Non-exponential extinction of radiation by fractional calculus modelling

    International Nuclear Information System (INIS)

    Casasanta, G.; Ciani, D.; Garra, R.

    2012-01-01

    Possible deviations from exponential attenuation of radiation in a random medium have been studied in several recent works. These deviations from the classical Beer-Lambert law were justified from a stochastic point of view by Kostinski (2001). In his model he introduced spatial correlation among the random variables, i.e. a space memory. In this note we introduce a different approach, including a memory formalism in the classical Beer-Lambert law through fractional calculus modelling. We find a generalized Beer-Lambert law in which the exponential memoryless extinction is only a special case of non-exponential extinction solutions described by Mittag-Leffler functions. We also justify this result from a stochastic point of view, using the space-fractional Poisson process. Moreover, we discuss some concrete advantages of this approach from an experimental point of view, giving an estimate of the deviation from the exponential extinction law as the optical depth varies. This is also an interesting model for understanding the meaning of the fractional derivative as an instrument for transmitting the randomness of microscopic dynamics to the macroscopic scale.
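
    Up to notation, the generalized law described here replaces the exponential with a Mittag-Leffler function. A sketch of the relations, assuming a Caputo fractional derivative of order 0 < α ≤ 1:

```latex
% Generalized Beer--Lambert law with a Caputo fractional derivative of
% order 0 < \alpha <= 1; the classical exponential law is the \alpha = 1 case.
\[
  {}^{C}D^{\alpha}_{\tau}\, I(\tau) = -I(\tau), \qquad
  I(\tau) = I_0\, E_\alpha\!\left(-\tau^{\alpha}\right), \qquad
  E_\alpha(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + 1)},
\]
\[
  \alpha = 1:\quad E_1(-\tau) = e^{-\tau}
  \quad \text{(memoryless exponential extinction).}
\]
```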

  16. The nuisance of nuisance regression: spectral misspecification in a common approach to resting-state fMRI preprocessing reintroduces noise and obscures functional connectivity.

    Science.gov (United States)

    Hallquist, Michael N; Hwang, Kai; Luna, Beatriz

    2013-11-15

    Recent resting-state functional connectivity fMRI (RS-fcMRI) research has demonstrated that head motion during fMRI acquisition systematically influences connectivity estimates despite bandpass filtering and nuisance regression, which are intended to reduce such nuisance variability. We provide evidence that the effects of head motion and other nuisance signals are poorly controlled when the fMRI time series are bandpass-filtered but the regressors are unfiltered, resulting in the inadvertent reintroduction of nuisance-related variation into frequencies previously suppressed by the bandpass filter, as well as suboptimal correction for noise signals in the frequencies of interest. This is important because many RS-fcMRI studies, including some focusing on motion-related artifacts, have applied this approach. In two cohorts of individuals (n=117 and 22) who completed resting-state fMRI scans, we found that the bandpass-regress approach consistently overestimated functional connectivity across the brain, typically on the order of r=.10-.35, relative to a simultaneous bandpass filtering and nuisance regression approach. Inflated correlations under the bandpass-regress approach were associated with head motion and cardiac artifacts. Furthermore, distance-related differences in the association of head motion and connectivity estimates were much weaker for the simultaneous filtering approach. We recommend that future RS-fcMRI studies ensure that the frequencies of nuisance regressors and fMRI data match prior to nuisance regression, and we advocate a simultaneous bandpass filtering and nuisance regression strategy that better controls nuisance-related variability. Copyright © 2013 Elsevier Inc. All rights reserved.
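
    The recommended remedy is procedural: apply the same bandpass filter to the nuisance regressors as to the data before regressing. A minimal sketch, in which the filter band, repetition time, and synthetic data are all illustrative:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Bandpass-filter the fMRI time series AND the nuisance regressors with the
# same filter before regression, so the regressors cannot reintroduce power
# at frequencies the filter suppressed.

def bandpass(x, low=0.009, high=0.08, fs=1.0 / 2.0, order=2):
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=0)

rng = np.random.default_rng(5)
n_vols = 300
bold = rng.standard_normal((n_vols, 10))        # 10 voxel/ROI time series
motion = rng.standard_normal((n_vols, 6))       # 6 motion parameters

y = bandpass(bold)
X = bandpass(motion)                            # same filter as the data
X = np.column_stack([np.ones(n_vols), X])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta                        # cleaned series for connectivity
print("residual shape:", residuals.shape)
```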

  17. The Nuisance of Nuisance Regression: Spectral Misspecification in a Common Approach to Resting-State fMRI Preprocessing Reintroduces Noise and Obscures Functional Connectivity

    OpenAIRE

    Hallquist, Michael N.; Hwang, Kai; Luna, Beatriz

    2013-01-01

    Recent resting-state functional connectivity fMRI (RS-fcMRI) research has demonstrated that head motion during fMRI acquisition systematically influences connectivity estimates despite bandpass filtering and nuisance regression, which are intended to reduce such nuisance variability. We provide evidence that the effects of head motion and other nuisance signals are poorly controlled when the fMRI time series are bandpass-filtered but the regressors are unfiltered, resulting in the inadvertent...

  18. A Framework for Developing the Structure of Public Health Economic Models.

    Science.gov (United States)

    Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P

    2016-01-01

    A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems. Modeling public health interventions requires broader considerations than clinical ones. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of public health economic models, supporting efficient allocation of scarce resources. Copyright © 2016 International Society for Pharmacoeconomics

  19. Modeling cerebral blood flow during posture change from sitting to standing

    DEFF Research Database (Denmark)

    Ottesen, Johnny T.; Olufsen, M.; Tran, H.T.

    2004-01-01

    Hypertension, decreased cerebral blood flow, and diminished cerebral blood flow velocity regulation are among the first signs indicating the presence of cerebral vascular disease. In this paper, we will present a mathematical model that can predict blood flow and pressure during posture change from sitting to standing ... extremities, the brain, and the heart. We use physiologically based control mechanisms to describe the regulation of cerebral blood flow velocity and arterial pressure in response to orthostatic hypotension resulting from postural change. To justify the fidelity of our mathematical model and control ...

  20. Two dimensional Hall MHD modeling of a plasma opening switch with density inhomogeneities

    Energy Technology Data Exchange (ETDEWEB)

    Zabaidullin, O [Kurchatov Institute, Moscow (Russian Federation); Chuvatin, A; Etlicher, B [Ecole Polytechnique, Palaiseau (France). Laboratoire de Physique des Milieux Ionises

    1997-12-31

    The results of two-dimensional numerical modeling of the Plasma Opening Switch in the MHD framework with Hall effect are presented. An enhanced Hall diffusion coefficient was used in the simulations. Recent experiments justify the application of this approach. The result of the modeling also correlates better with the experiment than in the case of the classical diffusion coefficient. The numerically generated pictures suggest a switching scenario in which the transition between the conduction and opening phases can be explained by an abrupt 'switching on' and subsequent domination of the Hall effect at the end of the conduction phase. (author). 3 figs., 6 refs.

  1. Whole-Volume Clustering of Time Series Data from Zebrafish Brain Calcium Images via Mixture Modeling.

    Science.gov (United States)

    Nguyen, Hien D; Ullmann, Jeremy F P; McLachlan, Geoffrey J; Voleti, Venkatakaushik; Li, Wenze; Hillman, Elizabeth M C; Reutens, David C; Janke, Andrew L

    2018-02-01

    Calcium is a ubiquitous messenger in neural signaling events. An increasing number of techniques are enabling visualization of neurological activity in animal models via luminescent proteins that bind to calcium ions. These techniques generate large volumes of spatially correlated time series. A model-based functional data analysis methodology via Gaussian mixtures is proposed for the clustering of data from such visualizations. The methodology is theoretically justified and a computationally efficient approach to estimation is suggested. An example analysis of a zebrafish imaging experiment is presented.
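
    A toy version of the idea (synthetic series, basis-coefficient features, and scikit-learn's mixture implementation; the paper's actual pipeline is only followed in spirit):

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 1.0, 100)
      slow, fast = np.sin(2 * np.pi * t), np.sin(6 * np.pi * t)
      series = np.vstack(
          [a * slow + rng.normal(0, 0.1, 100) for a in rng.uniform(0.8, 1.2, 50)]
          + [a * fast + rng.normal(0, 0.1, 100) for a in rng.uniform(0.8, 1.2, 50)])

      # least-squares coordinates of every series in the two-function basis
      B = np.column_stack([slow, fast])                 # 100 x 2 basis matrix
      coords = series @ B @ np.linalg.inv(B.T @ B)      # n_series x 2 features

      labels = GaussianMixture(n_components=2, random_state=0).fit_predict(coords)
      print(np.bincount(labels[:50]), np.bincount(labels[50:]))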

  2. Identification of Super Phenix steam generator by a simple polynomial model

    International Nuclear Information System (INIS)

    Rousseau, I.

    1981-01-01

    This note proposes an identification method for the steam generator of the Super-Phenix fast neutron power plant based on simple polynomial models. This approach is justified by the choice of adaptive control. The identification algorithms presented will be applied to multivariable input-output behaviours. The results obtained with the autoregressive representation and with simple polynomial models will be compared, and the effect of perturbations on the output signal will be tested, in order to select a good identification algorithm for multivariable adaptive regulation [fr]

  3. Animal models of chronic obstructive pulmonary disease.

    Science.gov (United States)

    Pérez-Rial, Sandra; Girón-Martínez, Álvaro; Peces-Barba, Germán

    2015-03-01

    Animal models of disease have always been welcomed by the scientific community because they provide an approach to the investigation of certain aspects of the disease in question. Animal models of COPD cannot reproduce the heterogeneity of the disease and usually only manage to represent the disease in its milder stages. Moreover, airflow obstruction, the variable that determines patient diagnosis, is not always taken into account in the models. For this reason, models have focused on the development of emphysema, easily detectable by lung morphometry, and have disregarded other components of the disease, such as airway injury or associated vascular changes. Continuous, long-term exposure to cigarette smoke is considered the main risk factor for this disease, justifying the fact that the cigarette smoke exposure model is the most widely used. Some variations on this basic model, related to exposure time, the association of other inducers or inhibitors, exacerbations or the use of transgenic animals to facilitate the identification of pathogenic pathways, have been developed. Some of the variation and heterogeneity of this disease can thus be reproduced, and models can be designed to resolve researchers' questions on disease identification or treatment responses. Copyright © 2014 SEPAR. Published by Elsevier Espana. All rights reserved.

  4. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG, or from EEG in conjunction with magnetoencephalography (MEG), requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting the conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramér-Rao bounds, we demonstrate that this approach does not improve localization results, nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates.

  5. A mathematical model to determine incorporated quantities of radioactivity from the measured photometric values of tritium-autoradiographs in neuroanatomy

    International Nuclear Information System (INIS)

    Jennissen, J.J.

    1981-01-01

    The mathematical/empirical model developed in this paper helps to determine the incorporated radioactivity from the measured photometric values and the exposure time T. Possible errors of autoradiography due to the exposure time or the preparation are taken into consideration by the empirical model. It is shown that the error of approximately 400% that appears when only the measured photometric values are compared can be corrected. The model is valid for neuroanatomy, as optic nerves, i.e. neuroanatomical material, were used to develop it. Its application to other sections of the central nervous system also seems justified due to the reduction of errors thus achieved. (orig.) [de]

  6. On an elastic dissipation model as continuous approximation for discrete media

    Directory of Open Access Journals (Sweden)

    I. V. Andrianov

    2006-01-01

    Construction of an accurate continuous model for discrete media is an important topic in various fields of science. We deal with a 1D differential-difference equation governing the behavior of an n-mass oscillator with linear relaxation. It is known that a string-type approximation is justified for the low part of the frequency spectrum of a continuous model, but for free and forced vibrations the solutions of the discrete and continuous models can be quite different. The difference operator makes analysis difficult due to its nonlocal form. Approximate equations can be obtained by replacing the difference operator with a local derivative operator. Although applying a model with derivatives of order higher than two improves the continuous model, the higher order of the approximating differential equation seriously complicates the solution of the continuous problem. It is known that the accuracy of the approximation can increase dramatically using Padé approximations. In this paper, one- and two-point Padé approximations suitable for justifying the choice of structural damping models are used.
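
    A standard example of the continualization discussed here (not necessarily the specific approximants the authors use): with \(D = d/dx\) and lattice spacing \(h\), the second central difference can be written via the shift operator as

      \[ u(x+h) - 2u(x) + u(x-h) = 2\left(\cosh(hD) - 1\right)u = \left(h^{2}D^{2} + \tfrac{h^{4}}{12}D^{4} + \cdots\right)u. \]

    Truncating the series yields higher-order continuous models; the [1/1] Padé approximant in \(h^{2}D^{2}\) instead resums it as

      \[ 2\left(\cosh(hD) - 1\right) \approx \frac{h^{2}D^{2}}{1 - \tfrac{h^{2}}{12}D^{2}}, \]

    which keeps the differential order low while matching the series through the \(h^{4}\) term.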

  7. Hybrid discrete choice models: Gained insights versus increasing effort

    International Nuclear Information System (INIS)

    Mariel, Petr; Meyerhoff, Jürgen

    2016-01-01

    Hybrid choice models expand the standard models in discrete choice modelling by incorporating psychological factors as latent variables. They could therefore provide further insights into choice processes and underlying taste heterogeneity, but the costs of estimating these models often increase significantly. This paper aims at comparing the results from a hybrid choice model and a classical random parameter logit. The point of departure for this analysis is whether researchers and practitioners should add hybrid choice models to their suite of routinely estimated models. Our comparison reveals, in line with the few prior studies, that hybrid models gain in efficiency from the inclusion of additional information. Which of the two proposed approaches to use, however, depends on the objective of the analysis. If disentangling preference heterogeneity is most important, the hybrid model seems preferable. If the focus is on predictive power, a standard random parameter logit model might be the better choice. Finally, we give recommendations for an adequate use of hybrid choice models based on known principles of elementary scientific inference. - Highlights: • The paper compares performance of a Hybrid Choice Model (HCM) and a classical Random Parameter Logit (RPL) model. • The HCM indeed provides insights regarding preference heterogeneity not gained from the RPL. • The RPL has similar predictive power as the HCM in our data. • The costs of estimating HCM seem to be justified when learning more on taste heterogeneity is a major study objective.

  8. Hybrid discrete choice models: Gained insights versus increasing effort

    Energy Technology Data Exchange (ETDEWEB)

    Mariel, Petr, E-mail: petr.mariel@ehu.es [UPV/EHU, Economía Aplicada III, Avda. Lehendakari Aguire, 83, 48015 Bilbao (Spain); Meyerhoff, Jürgen [Institute for Landscape Architecture and Environmental Planning, Technical University of Berlin, D-10623 Berlin, Germany and The Kiel Institute for the World Economy, Duesternbrooker Weg 120, 24105 Kiel (Germany)

    2016-10-15

    Hybrid choice models expand the standard models in discrete choice modelling by incorporating psychological factors as latent variables. They could therefore provide further insights into choice processes and underlying taste heterogeneity, but the costs of estimating these models often increase significantly. This paper aims at comparing the results from a hybrid choice model and a classical random parameter logit. The point of departure for this analysis is whether researchers and practitioners should add hybrid choice models to their suite of routinely estimated models. Our comparison reveals, in line with the few prior studies, that hybrid models gain in efficiency from the inclusion of additional information. Which of the two proposed approaches to use, however, depends on the objective of the analysis. If disentangling preference heterogeneity is most important, the hybrid model seems preferable. If the focus is on predictive power, a standard random parameter logit model might be the better choice. Finally, we give recommendations for an adequate use of hybrid choice models based on known principles of elementary scientific inference. - Highlights: • The paper compares performance of a Hybrid Choice Model (HCM) and a classical Random Parameter Logit (RPL) model. • The HCM indeed provides insights regarding preference heterogeneity not gained from the RPL. • The RPL has similar predictive power as the HCM in our data. • The costs of estimating HCM seem to be justified when learning more on taste heterogeneity is a major study objective.

  9. Lotka-Volterra competition models for sessile organisms.

    Science.gov (United States)

    Spencer, Matthew; Tanner, Jason E

    2008-04-01

    Markov models are widely used to describe the dynamics of communities of sessile organisms, because they are easily fitted to field data and provide a rich set of analytical tools. In typical ecological applications, at any point in time, each point in space is in one of a finite set of states (e.g., species, empty space). The models aim to describe the probabilities of transitions between states. In most Markov models for communities, these transition probabilities are assumed to be independent of state abundances. This assumption is often suspected to be false and is rarely justified explicitly. Here, we start with simple assumptions about the interactions among sessile organisms and derive a model in which transition probabilities depend on the abundance of destination states. This model is formulated in continuous time and is equivalent to a Lotka-Volterra competition model. We fit this model and a variety of alternatives in which transition probabilities do not depend on state abundances to a long-term coral reef data set. The Lotka-Volterra model describes the data much better than all models we consider other than a saturated model (a model with a separate parameter for each transition at each time interval, which by definition fits the data perfectly). Our approach provides a basis for further development of stochastic models of sessile communities, and many of the methods we use are relevant to other types of community. We discuss possible extensions to spatially explicit models.
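
    The continuous-time competition model arrived at has the familiar Lotka-Volterra form (standard notation, which the abstract does not spell out):

      \[ \frac{dx_{i}}{dt} = x_{i}\Big(r_{i} - \sum_{j} \alpha_{ij} x_{j}\Big), \]

    where \(x_{i}\) is the abundance (e.g., proportional cover) of state \(i\), \(r_{i}\) its intrinsic rate of increase, and \(\alpha_{ij}\) the per-capita effect of state \(j\) on state \(i\); the dependence of transitions on destination-state abundance enters through the \(\alpha_{ij} x_{j}\) terms.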

  10. Item selection via Bayesian IRT models.

    Science.gov (United States)

    Arima, Serena

    2015-02-10

    With reference to a questionnaire that aimed to assess the quality of life for dysarthric speakers, we investigate the usefulness of a model-based procedure for reducing the number of items. We propose a mixed cumulative logit model, which is known in the psychometrics literature as the graded response model: responses to different items are modelled as a function of individual latent traits and as a function of item characteristics, such as their difficulty and their discrimination power. We jointly model the discrimination and the difficulty parameters by using a k-component mixture of normal distributions. Mixture components correspond to disjoint groups of items. Items that belong to the same groups can be considered equivalent in terms of both difficulty and discrimination power. According to decision criteria, we select a subset of items such that the reduced questionnaire is able to provide the same information that the complete questionnaire provides. The model is estimated by using a Bayesian approach, and the choice of the number of mixture components is justified according to information criteria. We illustrate the proposed approach on the basis of data that are collected for 104 dysarthric patients by local health authorities in Lecce and in Milan. Copyright © 2014 John Wiley & Sons, Ltd.
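
    For orientation, the graded response model itself takes the standard form (the mixture over item parameters is the paper's addition): the probability that subject \(i\) responds in category \(k\) or higher on item \(j\) is

      \[ P(Y_{ij} \ge k \mid \theta_{i}) = \frac{1}{1 + \exp\{-a_{j}(\theta_{i} - b_{jk})\}}, \qquad k = 1, \dots, K_{j} - 1, \]

    with latent trait \(\theta_{i}\), discrimination \(a_{j}\), and ordered difficulty thresholds \(b_{jk}\); individual category probabilities are differences of successive curves.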

  11. A hybrid mammalian cell cycle model

    Directory of Open Access Journals (Sweden)

    Vincent Noël

    2013-08-01

    Hybrid modeling provides an effective solution to cope with multiple time scale dynamics in systems biology. Among the applications of this method, one of the most important is cell cycle regulation. The machinery of the cell cycle, leading to cell division and proliferation, combines slow growth, spatio-temporal re-organisation of the cell, and rapid changes of regulatory protein concentrations induced by post-translational modifications. The advancement through the cell cycle comprises a well-defined sequence of stages, separated by checkpoint transitions. The combination of continuous and discrete changes justifies hybrid modelling approaches to cell cycle dynamics. We present a piecewise-smooth version of a mammalian cell cycle model, obtained by hybridization from a smooth biochemical model. The approximate hybridization scheme, leading to simplified reaction rates and binary event location functions, is based on learning from a training set of trajectories of the smooth model. We discuss several learning strategies for the parameters of the hybrid model.

  12. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    International Nuclear Information System (INIS)

    St John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program

  13. A Model Based on Cocitation for Web Information Retrieval

    Directory of Open Access Journals (Sweden)

    Yue Xie

    2014-01-01

    According to the relationship between authority and cocitation in HITS, we propose a new hyperlink weighting scheme to describe the strength of the relevancy between any two webpages. We then combine hyperlink weight normalization and the random surfing scheme used in PageRank to justify the new model. In the new model based on cocitation (MBCC), pages with stronger relevancy are assigned higher values, not just depending on their outlinks. This model combines features of both HITS and PageRank. Finally, we present the results of some numerical experiments, showing that the MBCC ranking agrees with the HITS ranking, especially in the top 10. Meanwhile, MBCC retains the superiority of PageRank, that is, the existence and uniqueness of the ranking vector.
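
    The PageRank-style random-surfing backbone can be sketched as a damped power iteration; the cocitation-based link weighting is the paper's contribution and is only stubbed here with uniform weights on a toy graph.

      import numpy as np

      def rank(W, d=0.85, tol=1e-10):
          # damped power iteration; damping guarantees existence and
          # uniqueness of the ranking vector (Perron-Frobenius)
          n = W.shape[0]
          r = np.full(n, 1.0 / n)
          while True:
              r_new = d * (W @ r) + (1.0 - d) / n
              if np.abs(r_new - r).sum() < tol:
                  return r_new
              r = r_new

      A = np.array([[0, 1, 1],
                    [1, 0, 1],
                    [1, 1, 0]], dtype=float)   # toy link graph (adjacency)
      W = A / A.sum(axis=0)                    # column-stochastic weights
      print(rank(W))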

  14. Liquid-drop model applied to heavy ions irradiation

    International Nuclear Information System (INIS)

    De Cicco, Hernan; Alurralde, Martin A.; Saint-Martin, Maria L. G.; Bernaola, Omar A.

    1999-01-01

    The liquid-drop model, previously applied in the study of radiation damage in metals, is used here in an energy range not covered by molecular dynamics in order to understand experimental data on particle tracks in an organic material (Makrofol E), which cannot be accurately described by the existing theoretical methods. The nuclear and electronic energy depositions are considered for each ion, and the evolution of the thermal explosion is evaluated. The experimental observation of particle tracks in a region previously considered 'prohibited' is thereby justified. Although the model has free parameters and some discrepancies with the experimental diametrical values exist, the agreement obtained is markedly better than that of other existing models. (author)

  15. Model instruments of effective segmentation of the fast food market

    Directory of Open Access Journals (Sweden)

    Mityaeva Tetyana L.

    2013-03-01

    The article presents the results of stepwise optimisation calculations of the economic effectiveness of fast food promotion, taking into account key parameters for assessing the efficiency of a segmentation-based marketing strategy. The article justifies the development of a mathematical model on the basis of 3D representations and a three-dimensional system of management variables. Modern applied mathematical packages allow the formation not only of one- and two-dimensional arrays but also of three-dimensional ones, along with analysis of the links between variables; the more links and parameters are taken into account, the more adequate and adaptive the modelling results become and, as a result, the more informative and strategically valuable they are. The article demonstrates modelling possibilities that allow strategies and reactions to be taken into account when forming the marketing strategy for entering fast food market segments.

  16. Mathematical study of mixing models

    International Nuclear Information System (INIS)

    Lagoutiere, F.; Despres, B.

    1999-01-01

    This report presents the construction and study of a class of models that describe the behavior of compressible and non-reactive Eulerian fluid mixtures. Mixture models can have two different applications. Either they are used to describe physical mixtures, in the case of a true zone of extensive mixing (but then this modelization is incomplete and must be considered only as a point of departure for the elaboration of models of mixtures actually relevant), or they are used to solve the problem of numerical mixture. This problem appears during the discretization of an interface which separates fluids having different equations of state: the zone of numerical mixing is the set of meshes which cover the interface. The attention is focused on numerical mixtures, for which the hypothesis of non-miscibility (physics) will bring two equations (the sixth and the eighth of the system). It is important to emphasize that even in the case of a purely numerical mixture, the presence in one and the same place (the same mesh) of several fluids has to be taken into account. This will be formalized by the possibility for mass fractions to take all values between 0 and 1. This is not at odds with the equations that derive from the hypothesis of non-miscibility. One way of looking at things is to consider that there are two scales of observation: the physical scale, at which one observes the separation of fluids, and the numerical scale, given by the fineness of the mesh, at which a mixture appears. In this work, mixtures are considered from the mathematical angle (both in the elaboration phase and during their study). In particular, Chapter 5 shows a result of model degeneration for a non-extended mixing zone (the case of an interface): this justifies the use of these models in the case of numerical mixing. All these models are based on the classical model of non-viscous compressible fluids recalled in Chapter 2. In Chapter 3, the central point of the elaboration of the class of models is

  17. Some variations of the Kristallin-I near-field model

    International Nuclear Information System (INIS)

    Smith, P.A.; Curti, E.

    1995-11-01

    The Kristallin-I project is an integrated analysis of the final disposal of vitrified high-level radioactive waste (HLW) in the crystalline basement of Northern Switzerland. It includes an analysis of the radiological consequences of radionuclide release from a repository. This analysis employs a chain of independent models for the near-field, geosphere and biosphere. In constructing these models, processes are incorporated that are believed to be relevant to repository safety, while other processes are neglected. In the present report, a set of simplified, steady-state models of the near-field is developed to investigate the possible effects of specific processes which are neglected in the time-dependent Kristallin-I near-field model. These processes are neglected, either because they are thought unlikely to occur to a significant degree, or because they are likely to make a positive contribution to the performance of the near-field barrier to radionuclide migration, but are insufficiently understood to justify incorporating them in a safety assessment. The aim of this report is to investigate whether the arguments for neglecting these processes in the Kristallin-I near-field model can be justified. (author) figs., tabs., refs

  18. Some variations of the Kristallin-I near-field model

    Energy Technology Data Exchange (ETDEWEB)

    Smith, P A; Curti, E [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1995-11-01

    The Kristallin-I project is an integrated analysis of the final disposal of vitrified high-level radioactive waste (HLW) in the crystalline basement of Northern Switzerland. It includes an analysis of the radiological consequences of radionuclide release from a repository. This analysis employs a chain of independent models for the near-field, geosphere and biosphere. In constructing these models, processes are incorporated that are believed to be relevant to repository safety, while other processes are neglected. In the present report, a set of simplified, steady-state models of the near-field is developed to investigate the possible effects of specific processes which are neglected in the time-dependent Kristallin-I near-field model. These processes are neglected, either because (i) they are thought unlikely to occur to a significant degree, or because (ii) they are likely to make a positive contribution to the performance of the near-field barrier to radionuclide migration, but are insufficiently understood to justify incorporating them in a safety assessment. The aim of this report is to investigate whether the arguments for neglecting these processes in the Kristallin-I near-field model can be justified. This work addresses the following topics: - radionuclide transport at the bentonite-host rock interface, - canister settlement, -chemical conditions and radionuclide transport at the glass-bentonite interface. (author) figs., tabs., refs.

  19. Some variations of the Kristallin-I near-field model

    International Nuclear Information System (INIS)

    Smith, P.A.; Curti, E.

    1995-11-01

    The Kristallin-I project is an integrated analysis of the final disposal of vitrified high-level radioactive waste (HLW) in the crystalline basement of Northern Switzerland. It includes an analysis of the radiological consequences of radionuclide release from a repository. This analysis employs a chain of independent models for the near-field, geosphere and biosphere. In constructing these models, processes are incorporated that are believed to be relevant to repository safety, while other processes are neglected. In the present report, a set of simplified, steady-state models of the near-field is developed to investigate the possible effects of specific processes which are neglected in the time-dependent Kristallin-I near-field model. These processes are neglected, either because (i) they are thought unlikely to occur to a significant degree, or because (ii) they are likely to make a positive contribution to the performance of the near-field barrier to radionuclide migration, but are insufficiently understood to justify incorporating them in a safety assessment. The aim of this report is to investigate whether the arguments for neglecting these processes in the Kristallin-I near-field model can be justified. This work addresses the following topics: - radionuclide transport at the bentonite-host rock interface, - canister settlement, -chemical conditions and radionuclide transport at the glass-bentonite interface. (author) figs., tabs., refs

  20. Mechanistic movement models to understand epidemic spread.

    Science.gov (United States)

    Fofana, Abdou Moutalab; Hurford, Amy

    2017-05-05

    An overlooked aspect of disease ecology is considering how and why animals come into contact with one another, resulting in disease transmission. Mathematical models of disease spread frequently assume mass-action transmission, justified by stating that susceptible and infectious hosts mix readily, and forgoing any detailed description of host movement. Numerous recent studies have recorded, analysed and modelled animal movement. These movement models describe how animals move with respect to resources, conspecifics and previous movement directions, and have been used to understand the conditions for the occurrence and spread of infectious diseases when hosts perform a given type of movement. Here, we summarize the effect of the different types of movement on the threshold conditions for disease spread. We identify gaps in the literature and suggest several promising directions for future research. The mechanistic inclusion of movement in epidemic models may be beneficial for the following two reasons. Firstly, the estimation of the transmission coefficient in an epidemic model is possible because animal movement data can be used to estimate the rate of contacts between conspecifics. Secondly, unsuccessful transmission events, where a susceptible host contacts an infectious host but does not become infected, can be quantified. Following an outbreak, this enables disease ecologists to identify 'near misses' and to explore possible alternative epidemic outcomes given shifts in ecological or immunological parameters. This article is part of the themed issue 'Opening the black box: re-examining the ecology and evolution of parasite transmission'. © 2017 The Author(s).
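
    The first point can be made explicit with the standard decomposition of the mass-action transmission coefficient:

      \[ \frac{dI}{dt} = \beta S I - \gamma I, \qquad \beta = c\,p, \]

    where \(c\) is the per-capita contact rate (estimable from movement trajectories) and \(p\) the probability of transmission given contact; contacts counted in \(c\) that do not convert to infections are exactly the 'near misses' mentioned above, which is what allows \(p\) to be quantified separately.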

  1. Homogenised constitutive model dedicated to reinforced concrete plates subjected to seismic solicitations

    International Nuclear Information System (INIS)

    Combescure, Christelle

    2013-01-01

    Safety reassessments are periodically performed on the EDF nuclear power plants, and the recent seismic reassessments led to the need to take into account the non-linear behaviour of materials when modeling and simulating industrial structures of these power plants under seismic solicitations. A large proportion of these infrastructures is composed of reinforced concrete buildings, including reinforced concrete slabs and walls, and the literature on plate models dedicated to seismic applications for this material appears sparse. As for the few existing models dedicated to these specific applications, they present either a lack of energy dissipation in the material behaviour or no micromechanical approach that justifies the parameters needed to properly describe the model. In order to provide a constitutive model which better represents the behaviour of reinforced concrete plates under seismic loadings and whose parameters are easier for the civil engineer to identify, a constitutive model dedicated to reinforced concrete plates under seismic solicitations is proposed: the DHRC (Dissipative Homogenised Reinforced Concrete) model. Justified by a periodic homogenisation approach, this model includes two dissipative phenomena: damage of the concrete matrix and internal sliding at the interface between steel rebar and the surrounding concrete. An original coupling term between damage and sliding, resulting from the homogenisation process, induces a better representation of energy dissipation during material degradation. The model parameters are identified from the geometric characteristics of the plate and a restricted number of material characteristics, allowing a very simple use of the model. Numerical validations of the DHRC model are presented, showing good agreement with experimental behaviour. A one-dimensional simplification of the DHRC model is proposed, allowing the representation of reinforced concrete bars and simplified models of rods and wire mesh

  2. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficiently to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  3. Conceptual model to determine maximum activity of radioactive waste in near-surface disposal facilities

    International Nuclear Information System (INIS)

    Iarmosh, I.; Olkhovyk, Yu.

    2016-01-01

    For development of the management strategy for radioactive waste to be placed in near-surface disposal facilities (NSDF), it is necessary to justify the long-term safety of such facilities. Use of mathematical modelling methods for long-term forecasts of radwaste radiation impact and assessment of radiation risks from radionuclide migration can help to resolve this issue. The purpose of the research was to develop a conceptual model for determining the maximum activity of radwaste to be safely disposed of in the NSDF and to test it in the case of the Lot 3 Vector NSDF (Chornobyl exclusion zone). This paper describes an approach to the development of such a model. The conceptual model of 90Sr migration from Lot 3 through the aeration zone and aquifer soils was developed. The results of modelling are shown. Proposals on further steps for model improvement were developed.

  4. Modeling and Simulation of Bus Dispatching Policy for Timed Transfers on Signalized Networks

    Science.gov (United States)

    Cho, Hsun-Jung; Lin, Guey-Shii

    2007-12-01

    The major work of this study is to formulate the system cost functions and to integrate the bus dispatching policy with signal control. The integrated model mainly includes the flow dispersion model for links, the signal control model for nodes, and the dispatching control model for transfer terminals. All these models are inter-related for transfer operations in a one-center transit network. The integrated model that combines dispatching policies with flexible signal control modes can be applied to assess the effectiveness of transfer operations. It is found that, if bus arrival information is reliable, an early dispatching decision made at the mean bus arrival times is preferable. The costs for coordinated operations with slack times are relatively low at the optimal common headway when applying adaptive route control. Based on these findings, a threshold function of bus headway for justifying adaptive signal route control under various time values of auto drivers is developed.

  5. IMPORTANCE OF DIFFERENT MODELS IN DECISION MAKING, EXPLAINING THE STRATEGIC BEHAVIOR IN ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Cristiano de Oliveira Maciel

    2006-11-01

    This study examines different models of the decision process in analyzing organizational strategy. The article presents strategy according to a cognitive approach. The discussion of that approach covers three models of the decision process: the rational actor model, organizational behavior, and the political model. These models, respectively, emphasize improving decision outcomes, searching for a good decision given the cognitive limitations of the administrator, and extensive negotiation in reaching a decision. According to the emphasis of each model, the possibilities for analyzing the strategy are presented. The article also shows that it is necessary to take all three ways of analysis into account. That statement is justified since both the analysis and the decision making become more complex, especially for the decisions that matter most to organizations.

  6. Agricultural and Environmental Input Parameters for the Biosphere Model

    International Nuclear Information System (INIS)

    Kaylie Rasmuson; Kurt Rautenstrauch

    2003-01-01

    This analysis is one of nine technical reports that support the Environmental Radiation Model for Yucca Mountain Nevada (ERMYN) biosphere model. It documents input parameters for the biosphere model, and supports the use of the model to develop Biosphere Dose Conversion Factors (BDCF). The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the repository at Yucca Mountain. The ERMYN provides the TSPA with the capability to perform dose assessments. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships between the major activities and their products (the analysis and model reports) that were planned in the biosphere Technical Work Plan (TWP, BSC 2003a). It should be noted that some documents identified in Figure 1-1 may be under development and therefore not available at the time this document is issued. The "Biosphere Model Report" (BSC 2003b) describes the ERMYN and its input parameters. This analysis report, ANL-MGR-MD-000006, "Agricultural and Environmental Input Parameters for the Biosphere Model", is one of the five reports that develop input parameters for the biosphere model. This report defines and justifies values for twelve parameters required in the biosphere model. These parameters are related to use of contaminated groundwater to grow crops. The parameter values recommended in this report are used in the soil, plant, and carbon-14 submodels of the ERMYN.

  7. Compartmental modeling and tracer kinetics

    CERN Document Server

    Anderson, David H

    1983-01-01

    This monograph is concerned with mathematical aspects of compartmental analysis. In particular, linear models are closely analyzed since they are fully justifiable as an investigative tool in tracer experiments. The objective of the monograph is to bring the reader up to date on some of the current mathematical problems of interest in compartmental analysis. This is accomplished by reviewing mathematical developments in the literature, especially over the last 10-15 years, and by presenting some new thoughts and directions for future mathematical research. These notes started as a series of lectures that I gave while visiting with the Division of Applied Mathematics, Brown University, 1979, and have developed into this collection of articles aimed at the reader with a beginning graduate level background in mathematics. The text can be used as a self-paced reading course. With this in mind, exercises have been appropriately placed throughout the notes. As an aid in reading the material, the end of a ...

  8. A simple model for the evolution of a non-Abelian cosmic string network

    Energy Technology Data Exchange (ETDEWEB)

    Cella, G. [Istituto Nazionale di Fisica Nucleare, sez. Pisa, Largo Bruno Pontecorvo 3, 56126 Pisa (Italy); Pieroni, M., E-mail: giancarlo.cella@pi.infn.it, E-mail: mauro.pieroni@apc.univ-paris7.fr [AstroParticule et Cosmologie, Université Paris Diderot, CNRS, CEA, Observatoire de Paris, Sorbonne Paris Cité, F-75205 Paris Cedex 13 (France)

    2016-06-01

    In this paper we present the results of numerical simulations intended to study the behavior of non-Abelian cosmic string networks. In particular, we are interested in discussing the variations in the asymptotic behavior of the system as we vary the number of generators for the topological defects. A simple model which allows for cosmic strings is presented and its lattice discretization is discussed. The evolution of the generated cosmic string networks is then studied for different values of the number of generators for the topological defects. A scaling solution appears to be approached in most cases, and we present an argument to justify the lack of scaling for the remaining cases.

  9. A model for the inverse 1-median problem on trees under uncertain costs

    Directory of Open Access Journals (Sweden)

    Kien Trung Nguyen

    2016-01-01

    We consider the problem of justifying vertex weights of a tree under uncertain costs so that a prespecified vertex becomes optimal and the total cost is optimal in the uncertainty scenario. We propose a model which delivers information about the optimal cost with respect to each confidence level \(\alpha \in [0,1]\). To reach this goal, we first define an uncertain variable with respect to the minimum cost at each confidence level. If all costs are independently linearly distributed, we present the inverse distribution function of this uncertain variable in \(O(n^{2}\log n)\) time, where \(n\) is the number of vertices in the tree.

  10. Dynamics of a Computer Virus Propagation Model with Delays and Graded Infection Rate

    Directory of Open Access Journals (Sweden)

    Zizhen Zhang

    2017-01-01

    A four-compartment computer virus propagation model with two delays and a graded infection rate is investigated in this paper. The critical values where a Hopf bifurcation occurs are obtained by analyzing the distribution of eigenvalues of the corresponding characteristic equation. Subsequently, the direction and stability of the Hopf bifurcation when the two delays are not equal are determined by using normal form theory and the center manifold theorem. Finally, some numerical simulations are carried out to justify the obtained theoretical results.

  11. A dynamic P53-MDM2 model with time delay

    Energy Technology Data Exchange (ETDEWEB)

    Mihalas, Gh.I. [Department of Biophysics and Medical Informatics, University of Medicine and Pharmacy, Piata Eftimie Murgu, nr. 3, 300041 Timisoara (Romania)]. E-mail: mihalas@medinfo.umft.ro; Neamtu, M. [Department of Forecasting, Economic Analysis, Mathematics and Statistics, West University of Timisoara, Str. Pestalozzi, nr. 14A, 300115 Timisoara (Romania)]. E-mail: mihaela.neamtu@fse.uvt.ro; Opris, D. [Department of Applied Mathematics, West University of Timisoara, Bd. V. Parvan, nr. 4, 300223 Timisoara (Romania)]. E-mail: opris@math.uvt.ro; Horhat, R.F. [Department of Biophysics and Medical Informatics, University of Medicine and Pharmacy, Piata Eftimie Murgu, nr. 3, 300041 Timisoara (Romania)]. E-mail: rhorhat@yahoo.com

    2006-11-15

    Specific activator and repressor transcription factors, which bind to specific regulatory DNA sequences, play an important role in gene activity control. Interactions between genes coding for such transcription factors should explain the different stable or sometimes oscillatory gene activities characteristic of different tissues. Starting with the P53-MDM2 model described in [Mihalas GI, Simon Z, Balea G, Popa E. Possible oscillatory behaviour in P53-MDM2 interaction computer simulation. J Biol Syst 2000;8(1):21-9] and the process described in [Kohn KW, Pommier Y. Molecular interaction map of P53 and MDM2 logic elements, which control the off-on switch of P53 in response to DNA damage. Biochem Biophys Res Commun 2005;331:816-27], we developed a new model of this interaction. Choosing the delay as a bifurcation parameter, we study the direction and stability of the bifurcating periodic solutions. Some numerical examples are finally given to justify the theoretical results.
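
    The delay-induced Hopf bifurcation studied here can be illustrated with a much simpler system than the P53-MDM2 model: the delayed logistic (Hutchinson) equation, whose equilibrium loses stability exactly when \(r\tau\) exceeds \(\pi/2\). The sketch below is that textbook example, not the paper's model.

      import numpy as np

      def simulate(r, tau, t_end=200.0, dt=0.001, x0=0.5):
          # forward-Euler integration of x'(t) = r x(t) (1 - x(t - tau))
          n, lag = int(t_end / dt), int(tau / dt)
          x = np.full(n, x0)
          for i in range(1, n):
              x_delayed = x[i - 1 - lag] if i - 1 >= lag else x0
              x[i] = x[i - 1] + dt * r * x[i - 1] * (1.0 - x_delayed)
          return x

      for tau in (1.0, 2.0):      # r*tau below and above the pi/2 threshold
          tail = simulate(r=1.0, tau=tau)[-50000:]
          print(f"tau = {tau}: late-time amplitude ~ {tail.max() - tail.min():.3f}")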

  12. A multi-criteria model for maintenance job scheduling

    Directory of Open Access Journals (Sweden)

    Sunday A. Oke

    2007-12-01

    This paper presents a multi-criteria maintenance job scheduling model, formulated using a weighted multi-criteria integer linear programming maintenance scheduling framework. Three criteria, which have a direct relationship with the primary objectives of a typical production setting, were used, namely minimization of equipment idle time, manpower idle time, and job lateness, with unit parity. The mathematical model, constrained by available equipment, manpower, and job availability times within the planning horizon, was tested on a 10-job, 8-hour time horizon problem with declared equipment and manpower availability set against the requirements. The results, analysis and illustrations justify the multi-criteria consideration. Maintenance managers are thus equipped with a tool for sound decision making that guards against errors in the accumulated data which could lead to wrong decisions. The idea presented is new, since it provides an approach that has not been documented previously in the literature.
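
    A toy instance of such a weighted multi-criteria integer program (hypothetical data and weights, far smaller than the paper's 10-job case), written with the PuLP modelling library:

      import pulp

      jobs = {"J1": 3, "J2": 2, "J3": 8}         # job -> due hour (hypothetical)
      slots = range(1, 9)                         # 8-hour planning horizon
      w_late, w_idle = 0.7, 0.3                   # criteria weights

      prob = pulp.LpProblem("maintenance_schedule", pulp.LpMinimize)
      x = pulp.LpVariable.dicts("x", (jobs, slots), cat="Binary")

      for j in jobs:                              # each job scheduled exactly once
          prob += pulp.lpSum(x[j][t] for t in slots) == 1
      for t in slots:                             # one crew: at most one job/hour
          prob += pulp.lpSum(x[j][t] for j in jobs) <= 1

      # objective: weighted sum of lateness (hours past due) and idle time
      # before a job starts; coefficients are constants, so the model stays linear
      prob += pulp.lpSum((w_late * max(0, t - jobs[j]) + w_idle * (t - 1)) * x[j][t]
                         for j in jobs for t in slots)

      prob.solve(pulp.PULP_CBC_CMD(msg=0))
      for j in sorted(jobs):
          print(j, "-> hour", next(t for t in slots if x[j][t].value() >= 0.5))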

  13. A Novel Computer Virus Propagation Model under Security Classification

    Directory of Open Access Journals (Sweden)

    Qingyi Zhu

    2017-01-01

    In reality, some computers have a specific security classification. For the sake of safety and cost, the security level of computers will be upgraded as threats in networks increase. Here we assume that there exists a threshold value which determines when countermeasures should be taken to upgrade the security of a fraction of computers with a low security level. In some specific realistic environments, the propagation network can be regarded as fully interconnected. Inspired by these facts, this paper presents a novel computer virus dynamics model considering the impact of security classification in a fully interconnected network. By using the theory of dynamic stability, the existence of equilibria and the stability conditions are analysed and proved. The optimal threshold value mentioned above is given analytically. Then, some numerical experiments are made to justify the model. Besides, some discussions and antivirus measures are given.

  14. New robust statistical procedures for the polytomous logistic regression models.

    Science.gov (United States)

    Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro

    2018-05-17

    This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are theoretically studied through the classical influence function analysis. Appropriate real-life examples are presented to justify the requirement of suitable robust statistical procedures in place of the likelihood based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article is further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justifications. © 2018, The International Biometric Society.
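
    For orientation, the density power divergence on which such estimators are built (in the form introduced by Basu et al.) between the true density \(g\) and the model density \(f_{\theta}\) is

      \[ d_{\alpha}(g, f_{\theta}) = \int \Big\{ f_{\theta}^{1+\alpha} - \big(1 + \tfrac{1}{\alpha}\big) g f_{\theta}^{\alpha} + \tfrac{1}{\alpha} g^{1+\alpha} \Big\}\, dx, \qquad \alpha > 0, \]

    minimized empirically via \(H_{n}(\theta) = \int f_{\theta}^{1+\alpha}\,dx - \big(1 + \tfrac{1}{\alpha}\big) n^{-1} \sum_{i} f_{\theta}(X_{i})^{\alpha}\). As \(\alpha \to 0\) this recovers maximum likelihood, while larger \(\alpha\) downweights observations that are implausible under the model, trading efficiency for robustness; this is the tuning parameter whose data-driven selection the article addresses.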

  15. A dynamic P53-MDM2 model with time delay

    International Nuclear Information System (INIS)

    Mihalas, Gh.I.; Neamtu, M.; Opris, D.; Horhat, R.F.

    2006-01-01

    Specific activator and repressor transcription factors, which bind to specific regulatory DNA sequences, play an important role in gene activity control. Interactions between genes coding for such transcription factors should explain the different stable or sometimes oscillatory gene activities characteristic of different tissues. Starting with the P53-MDM2 model described in [Mihalas GI, Simon Z, Balea G, Popa E. Possible oscillatory behaviour in P53-MDM2 interaction computer simulation. J Biol Syst 2000;8(1):21-9] and the process described in [Kohn KW, Pommier Y. Molecular interaction map of P53 and MDM2 logic elements, which control the off-on switch of P53 in response to DNA damage. Biochem Biophys Res Commun 2005;331:816-27], we developed a new model of this interaction. Choosing the delay as a bifurcation parameter, we study the direction and stability of the bifurcating periodic solutions. Some numerical examples are finally given to justify the theoretical results

  16. Creating Shared Mental Models: The Support of Visual Language

    Science.gov (United States)

    Landman, Renske B.; van den Broek, Egon L.; Gieskes, José F. B.

    Cooperative design involves multiple stakeholders that often hold different ideas of the problem, the ways to solve it, and its solutions (i.e., mental models; MM). These differences can result in miscommunication, misunderstanding, slower decision-making processes, and a lower chance of cooperative decisions. In order to facilitate the creation of a shared mental model (sMM), visual languages (VL) are often used. However, there is little scientific foundation behind this choice. To determine whether or not this gut feeling is justified, a study was conducted in which various stakeholders had to cooperatively redesign a process chain, with and without VL. To determine whether or not a sMM was created, scores on agreement in individual MM, communication, and cooperation were analyzed. The results confirmed the assumption that VL can indeed play an important role in the creation of sMM and, hence, can aid the processes of cooperative design and engineering.

  17. Forecasting Hong Kong economy using factor augmented vector autoregression

    OpenAIRE

    Pang, Iris Ai Jao

    2010-01-01

    This work applies the FAVAR model to forecast GDP growth rate, unemployment rate and inflation rate of the Hong Kong economy. There is no factor model forecasting literature on the Hong Kong economy. The objective is to find out whether factor forecasting of using a large dataset can improve forecast performance of the Hong Kong economy. To avoid misspecification of the number of factors in the FAVAR, combination forecasts are constructed. It is found that forecasts from FAVAR model overall o...

  18. Business capital accumulation and the user cost: is there a heterogeneity bias? JRC Working Papers in Economics and Finance, 2017/11

    OpenAIRE

    FATICA SERENA

    2017-01-01

    Empirical models of capital accumulation estimated on aggregate data series are based on the assumption that capital asset types respond in the same way to cost variables. Likewise, aggregate models do not consider potential heterogeneity in investment behavior originating on the demand side for capital, e.g. at the sector level. We show that the underlying assumption of homogeneity may indeed lead to misspecification of standard aggregate investment models. Using data from 23 sectors in 10 O...

  19. Development and validation of an elastic and inelastic calculation method for tubes, based on beam models and taking into account the thermal stresses on the wall

    International Nuclear Information System (INIS)

    Krakowiak, C.

    1989-11-01

    A simplified model for the elastic-plastic calculation of thin, flexible tubes subjected to thermal stresses is presented. The method is based on beam models and provides satisfactory results concerning the displacement of the whole tube system. These results can be justified by the fact that the modifications of the tube cross sections (from circular to elliptical), the flexibility of the elbow joints and the radial temperature profile are included in the calculations. The thermoplasticity analysis is performed by defining independent and general flow directions and determining the corresponding behavior laws. The model is limited to proportional monotonic loading; however, the results obtained are promising [fr]

  20. Data driven propulsion system weight prediction model

    Science.gov (United States)

    Gerth, Richard J.

    1994-10-01

    The objective of the research was to develop a method to predict the weight of paper engines, i.e., engines that are in the early stages of development. The impetus for the project was the Single Stage To Orbit (SSTO) project, where engineers need to evaluate alternative engine designs. Since the SSTO is a performance-driven project, the performance models for alternative designs were well understood. The next tradeoff is weight. Since it is known that engine weight varies with thrust level, a model is required that allows discrimination between engines that produce the same thrust. Above all, the model had to be rooted in data, with assumptions that could be justified based on the data. The general approach was to collect data on as many existing engines as possible and build a statistical model of engine weight as a function of various component performance parameters. This was considered a reasonable level at which to begin the project because the data would be readily available, and it would be at the level of most paper engines, prior to detailed component design.
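
    The general approach, regressing (log) weight on component performance parameters estimated from historical engines, can be sketched as follows; the data here are synthetic and the log-linear form is an assumption for illustration:

      import numpy as np

      rng = np.random.default_rng(1)
      n = 40                                      # synthetic "existing engines"
      thrust = rng.uniform(100.0, 2000.0, n)      # kN
      chamber_p = rng.uniform(5.0, 25.0, n)       # MPa
      weight = 30.0 * thrust**0.8 * chamber_p**-0.2 * rng.lognormal(0.0, 0.1, n)

      # log-linear statistical model: log W = b0 + b1 log(thrust) + b2 log(p_c)
      X = np.column_stack([np.ones(n), np.log(thrust), np.log(chamber_p)])
      coef, *_ = np.linalg.lstsq(X, np.log(weight), rcond=None)

      # discriminating between two paper engines that produce the same thrust
      for pc in (8.0, 20.0):
          w = np.exp(coef @ np.array([1.0, np.log(800.0), np.log(pc)]))
          print(f"predicted weight at 800 kN, p_c = {pc} MPa: {w:.0f}")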

  1. Recovery from schizophrenia and the recovery model.

    Science.gov (United States)

    Warner, Richard

    2009-07-01

    The recovery model refers to subjective experiences of optimism, empowerment and interpersonal support, and to a focus on collaborative treatment approaches, finding productive roles for user/consumers, peer support and reducing stigma. The model is influencing service development around the world. This review will assess whether optimism about outcome from serious mental illness and other tenets of the recovery model are borne out by recent research. Remission of symptoms has been precisely defined, but the definition of 'recovery' is a more diffuse concept that includes such factors as being productive and functioning independently. Recent research and a large, earlier body of data suggest that optimism about outcome from schizophrenia is justified. A substantial proportion of people with the illness will recover completely and many more will regain good social functioning. Outcome is better for people in the developing world. Mortality for people with schizophrenia is increasing but is lower in the developing world. Working appears to help people recover from schizophrenia, and recent advances in vocational rehabilitation have been shown to be effective in countries with differing economies and labor markets. A growing body of research supports the concept that empowerment is an important component of the recovery process. Key tenets of the recovery model - optimism about recovery from schizophrenia, the importance of access to employment and the value of empowerment of user/consumers in the recovery process - are supported by the scientific research. Attempts to reduce the internalized stigma of mental illness should enhance the recovery process.

  2. Sensitivity of fluvial sediment source apportionment to mixing model assumptions: A Bayesian model comparison.

    Science.gov (United States)

    Cooper, Richard J; Krueger, Tobias; Hiscock, Kevin M; Rawlins, Barry G

    2014-11-01

    Mixing models have become increasingly common tools for apportioning fluvial sediment load to various sediment sources across catchments using a wide variety of Bayesian and frequentist modeling approaches. In this study, we demonstrate how different model setups can impact upon resulting source apportionment estimates in a Bayesian framework via a one-factor-at-a-time (OFAT) sensitivity analysis. We formulate 13 versions of a mixing model, each with different error assumptions and model structural choices, and apply them to sediment geochemistry data from the River Blackwater, Norfolk, UK, to apportion suspended particulate matter (SPM) contributions from three sources (arable topsoils, road verges, and subsurface material) under base flow conditions between August 2012 and August 2013. Whilst all 13 models estimate subsurface sources to be the largest contributor of SPM (median ∼76%), comparison of apportionment estimates reveal varying degrees of sensitivity to changing priors, inclusion of covariance terms, incorporation of time-variant distributions, and methods of proportion characterization. We also demonstrate differences in apportionment results between a full and an empirical Bayesian setup, and between a Bayesian and a frequentist optimization approach. This OFAT sensitivity analysis reveals that mixing model structural choices and error assumptions can significantly impact upon sediment source apportionment results, with estimated median contributions in this study varying by up to 21% between model versions. Users of mixing models are therefore strongly advised to carefully consider and justify their choice of model structure prior to conducting sediment source apportionment investigations. Key points: An OFAT sensitivity analysis of sediment fingerprinting mixing models is conducted. Bayesian models display high sensitivity to error assumptions and structural choices. Source apportionment results differ between Bayesian and frequentist approaches.
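
    The following minimal sketch illustrates the kind of Bayesian mixing model the study varies: Dirichlet-prior source proportions weighted by a Gaussian likelihood of tracer data. The three source signatures, the observed mixture, and the error scales are invented for illustration and are far simpler than any of the paper's 13 model versions.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Invented tracer signatures (rows: arable topsoil, road verge, subsurface).
    sources = np.array([[10.0, 100.0],
                        [30.0, 180.0],
                        [50.0,  90.0]])
    observed = np.array([39.0, 105.0])   # tracer values measured in the SPM sample
    sigma = np.array([3.0, 10.0])        # assumed Gaussian measurement error (sd)

    # Rejection (ABC-style) sampler: Dirichlet(1,1,1) prior on the proportions,
    # Gaussian likelihood used as the acceptance weight.
    props = rng.dirichlet(np.ones(3), size=200_000)
    pred = props @ sources                         # predicted mixture signatures
    logw = -0.5 * (((pred - observed) / sigma) ** 2).sum(axis=1)
    w = np.exp(logw - logw.max())
    post = props[rng.random(len(w)) < w]           # accept in proportion to w
    print("posterior median proportions:", np.median(post, axis=0).round(2))
    ```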

  3. Modelling Practice

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    This chapter deals with the practicalities of building, testing, deploying and maintaining models. It gives specific advice for each phase of the modelling cycle. To do this, a modelling framework is introduced which covers: problem and model definition; model conceptualization; model data requirements; model construction; model solution; model verification; model validation and finally model deployment and maintenance. Within the adopted methodology, each step is discussed through the consideration of key issues and questions relevant to the modelling activity. Practical advice, based on many...

  4. From the harmonic oscillator to the A-D-E classification of conformal models

    International Nuclear Information System (INIS)

    Itzykson, C.

    1988-01-01

    Arithmetical aspects of the solution of systems arising in two-dimensional statistical models and conformal field theory are discussed. From this perspective, the analysis of the harmonic oscillator, the free particle in a box and rational billiards is carried out. Moreover, a description of the classification of minimal conformal models and Wess-Zumino-Witten models, based on the simplest affine algebra, is given. Attempts are made to interpret and justify the appearance of the A-D-E classification of algebras in the W-Z-W model. Extensions of the W-Z-W model based on SU(N) at level one, and ways to deal with rank-two Lie groups using the arithmetic of quadratic integers, are described

  5. On the validity of evolutionary models with site-specific parameters.

    Directory of Open Access Journals (Sweden)

    Konrad Scheffler

    Full Text Available Evolutionary models that make use of site-specific parameters have recently been criticized on the grounds that parameter estimates obtained under such models can be unreliable and lack theoretical guarantees of convergence. We present a simulation study providing empirical evidence that a simple version of the models in question does exhibit sensible convergence behavior and that additional taxa, despite not being independent of each other, lead to improved parameter estimates. Although it would be desirable to have theoretical guarantees of this, we argue that such guarantees would not be sufficient to justify the use of these models in practice. Instead, we emphasize the importance of taking the variance of parameter estimates into account rather than blindly trusting point estimates - this is standardly done by using the models to construct statistical hypothesis tests, which are then validated empirically via simulation studies.

  6. DSNP models used in the pebble-bed HTGR dynamic simulation. V.2

    International Nuclear Information System (INIS)

    Saphier, D.

    1984-04-01

    A detailed description is given of the components used in the DSNP simulation of the PNP-500 high temperature gas-cooled pebble-bed reactor. For each component presented in this report, the mathematical model used and the assumptions made in developing it are described in detail. Most of the models were developed from basic physical principles, with simplifications justified on the basis of the requested accuracy, and most are either one-dimensional or lumped-parameter models. The heat transfer and flow correlations, which are mostly semiempirical, were either provided by KFA or adapted from the available literature. A short description of DSNP is also given, with a comprehensive list of all the statements available in Rev. 4.1 of DSNP. (H.K.)

  7. Leadership Models.

    Science.gov (United States)

    Freeman, Thomas J.

    This paper discusses six different models of organizational structure and leadership, including the scalar chain or pyramid model, the continuum model, the grid model, the linking pin model, the contingency model, and the circle or democratic model. Each model is examined in a separate section that describes the model and its development, lists…

  8. Models and role models.

    Science.gov (United States)

    ten Cate, Jacob M

    2015-01-01

    Developing experimental models to understand dental caries has been the theme in our research group. Our first, the pH-cycling model, was developed to investigate the chemical reactions in enamel or dentine, which lead to dental caries. It aimed to leverage our understanding of the fluoride mode of action and was also utilized for the formulation of oral care products. In addition, we made use of intra-oral (in situ) models to study other features of the oral environment that drive the de/remineralization balance in individual patients. This model addressed basic questions, such as how enamel and dentine are affected by challenges in the oral cavity, as well as practical issues related to fluoride toothpaste efficacy. The observation that perhaps fluoride is not sufficiently potent to reduce dental caries in present-day society triggered us to expand our knowledge in the bacterial aetiology of dental caries. For this we developed the Amsterdam Active Attachment biofilm model. Different from studies on planktonic ('single') bacteria, this biofilm model captures bacteria in a habitat similar to dental plaque. With data from the combination of these models, it should be possible to study separate processes which together may lead to dental caries. Products and novel agents that interfere with either of the processes could also be evaluated. Having these separate models in place, a suggestion is made to design computer models to encompass the available information. Models, but also role models, are of the utmost importance in advancing and guiding research and researchers. 2015 S. Karger AG, Basel

  9. Model(ing) Law

    DEFF Research Database (Denmark)

    Carlson, Kerstin

    The International Criminal Tribunal for the former Yugoslavia (ICTY) was the first and most celebrated of a wave of international criminal tribunals (ICTs) built in the 1990s designed to advance liberalism through international criminal law. Model(ing) Justice examines the case law of the ICTY...

  10. Models and role models

    NARCIS (Netherlands)

    ten Cate, J.M.

    2015-01-01

    Developing experimental models to understand dental caries has been the theme in our research group. Our first, the pH-cycling model, was developed to investigate the chemical reactions in enamel or dentine, which lead to dental caries. It aimed to leverage our understanding of the fluoride mode of

  11. QUASI-STATIC MODEL OF MAGNETICALLY COLLIMATED JETS AND RADIO LOBES. II. JET STRUCTURE AND STABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Colgate, Stirling A.; Li, Hui [Theoretical Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Fowler, T. Kenneth [University of California, Berkeley, CA 94720 (United States); Hooper, E. Bickford [Lawrence Livermore National Laboratory, Livermore, CA 94550 (United States); McClenaghan, Joseph; Lin, Zhihong [University of California, Irvine, CA 92697 (United States)

    2015-11-10

    This is the second in a series of companion papers showing that when an efficient dynamo can be maintained by accretion disks around supermassive black holes in active galactic nuclei, it can lead to the formation of a powerful, magnetically driven and mediated helix that could explain both the observed radio jet/lobe structures and ultimately the enormous power inferred from the observed ultrahigh-energy cosmic rays. In the first paper, we showed self-consistently that minimizing viscous dissipation in the disk naturally leads to jets of maximum power with boundary conditions known to yield jets as a low-density, magnetically collimated tower, consistent with observational constraints of wire-like currents at distances far from the black hole. In this paper we show that these magnetic towers remain collimated as they grow in length at nonrelativistic velocities. Differences with relativistic jet models are explained by three-dimensional magnetic structures derived from a detailed examination of stability properties of the tower model, including a broad diffuse pinch with current profiles predicted by a detailed jet solution outside the collimated central column treated as an electric circuit. We justify our model in part by the derived jet dimensions in reasonable agreement with observations. Using these jet properties, we also discuss the implications for relativistic particle acceleration in nonrelativistically moving jets. The appendices justify the low jet densities yielding our results and speculate on how to reconcile our nonrelativistic treatment with general relativistic MHD simulations.

  12. Electricity price modeling with stochastic time change

    International Nuclear Information System (INIS)

    Borovkova, Svetlana; Schmeck, Maren Diane

    2017-01-01

    In this paper, we develop a novel approach to electricity price modeling, based on the powerful technique of stochastic time change. This technique allows us to incorporate the characteristic features of electricity prices (such as seasonal volatility, time varying mean reversion and seasonally occurring price spikes) into the model in an elegant and economically justifiable way. The stochastic time change introduces stochastic as well as deterministic (e.g., seasonal) features in the price process' volatility and in the jump component. We specify the base process as a mean reverting jump diffusion and the time change as an absolutely continuous stochastic process with seasonal component. The activity rate of the stochastic time change can be related to the factors that influence supply and demand. Here we use the temperature as a proxy for the demand and hence as the driving factor of the stochastic time change, and show that this choice leads to realistic price paths. We derive properties of the resulting price process and develop the model calibration procedure. We calibrate the model to the historical EEX power prices and apply it to generating realistic price paths by Monte Carlo simulations. We show that the simulated price process matches the distributional characteristics of the observed electricity prices in periods of both high and low demand. - Highlights: • We develop a novel approach to electricity price modeling, based on the powerful technique of stochastic time change. • We incorporate the characteristic features of electricity prices, such as seasonal volatility and spikes, into the model. • We use the temperature as a proxy for the demand and hence as the driving factor of the stochastic time change. • We derive properties of the resulting price process and develop the model calibration procedure. • We calibrate the model to the historical EEX power prices and apply it to generating realistic price paths.
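
    A minimal simulation sketch of the modeling idea, assuming a deterministic seasonal activity rate in place of the paper's stochastic, temperature-driven time change; all parameter values are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_days, dt = 365, 1.0
    kappa, mu, sigma = 0.2, 3.5, 0.15     # reversion speed, log level, volatility
    jump_prob, jump_scale = 0.02, 0.8     # base spike intensity and size

    t = np.arange(n_days)
    activity = 1.0 + 0.5 * np.cos(2 * np.pi * t / 365)  # winter clock runs faster

    x = np.empty(n_days)
    x[0] = mu
    for i in range(1, n_days):
        a = activity[i] * dt                        # time-changed step length
        drift = kappa * (mu - x[i - 1]) * a         # mean reversion
        diff = sigma * np.sqrt(a) * rng.standard_normal()
        jump = rng.exponential(jump_scale) if rng.random() < jump_prob * a else 0.0
        x[i] = x[i - 1] + drift + diff + jump       # spikes cluster in 'fast' season
    price = np.exp(x)                               # back to the price scale
    print(price[:5].round(2))
    ```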

  13. Validity of the electrical model representation of the effects of nuclear magnetic resonance (1961); Validite de la representation par modele electrique des effets de resonance magnetique nucleaire (1961)

    Energy Technology Data Exchange (ETDEWEB)

    Bonnet, G [Commissariat a l' Energie Atomique, Grenoble (France). Centre d' Etudes Nucleaires

    1961-07-01

    When studying the behaviour of a magnetic resonance transducer formed by the association of an electrical network and a set of nuclear spins, it is possible to construct an analytically equivalent representation by means of an entirely electrical model, valid for transient as well as steady-state regimes. A detailed study of the validity conditions justifies its use in most cases. A linearity criterion for Bloch's equations in the transient regime is also proposed; it is simply the extension of the well-known non-saturation condition in the steady state. (author)

  14. Visualization of logistic algorithm in Wilson model

    Science.gov (United States)

    Glushchenko, A. S.; Rodin, V. A.; Sinegubov, S. V.

    2018-05-01

    Economic order quantity (EOQ), defined by Wilson's model, is widely used at different stages of the production and distribution of various products. It is useful for making inventory management decisions, enabling more efficient business operation and thus bringing greater economic benefit. A large amount of reference material and extensive computer shells exist that help solve various logistics problems. However, the use of large computer environments is not always justified and requires special user training. A tense supply schedule in a logistics model is optimal if, and only if, the planning horizon coincides with the beginning of the next possible delivery; for all other planning horizons the plan is not optimal. Notably, when the planning horizon changes, the plan changes immediately throughout the entire supply chain. In this paper, an algorithm and a program for visualizing the optimal supply quantity and the number of supplies, as functions of the planning horizon, have been obtained. The program allows one to trace, visually and quickly, all the main parameters of the optimal plan on charts. The results form part of the authors' research on optimizing the protection and support services of ports in the Russian North.
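
    For reference, the classical Wilson lot size that balances ordering and holding costs is Q* = sqrt(2DS/H). A minimal sketch, with placeholder demand and cost figures rather than data from the paper:

    ```python
    import math

    def eoq(demand_rate, order_cost, holding_cost):
        """Classical Wilson lot size Q* = sqrt(2 D S / H)."""
        return math.sqrt(2 * demand_rate * order_cost / holding_cost)

    D, S, H = 1200.0, 50.0, 2.0          # units/yr, cost/order, cost/unit/yr
    q = eoq(D, S, H)
    print(f"Q* = {q:.1f} units, {D / q:.1f} orders/yr, "
          f"cycle = {q / D * 365:.1f} days")
    ```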

  15. A general phenomenological model for work function

    Science.gov (United States)

    Brodie, I.; Chou, S. H.; Yuan, H.

    2014-07-01

    A general phenomenological model is presented for obtaining the zero Kelvin work function of any crystal facet of metals and semiconductors, both clean and covered with a monolayer of electropositive atoms. It utilizes the known physical structure of the crystal and the Fermi energy of the two-dimensional electron gas assumed to form on the surface. A key parameter is the number of electrons donated to the surface electron gas per surface lattice site or adsorbed atom, which is taken to be an integer. Initially this is found by trial and later justified by examining the state of the valence electrons of the relevant atoms. In the case of adsorbed monolayers of electropositive atoms a satisfactory justification could not always be found, particularly for cesium, but a trial value always predicted work functions close to the experimental values. The model can also predict the variation of work function with temperature for clean crystal facets. The model is applied to various crystal faces of tungsten, aluminium, silver, and select metal oxides, and most demonstrate good fits compared to available experimental values.
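
    As a worked illustration of the model's central ingredient, the sketch below evaluates the Fermi energy of an ideal spin-degenerate two-dimensional electron gas, E_F = πħ²n/m; the lattice constant and the integer electron count per site are example values, not figures from the paper.

    ```python
    from scipy.constants import hbar, pi, electron_mass, eV

    a = 3.16e-10                 # example surface lattice constant (m)
    electrons_per_site = 1       # the model's trial integer per lattice site

    n = electrons_per_site / a**2            # areal electron density (m^-2)
    E_F = pi * hbar**2 * n / electron_mass   # 2DEG Fermi energy, E_F = pi*hbar^2*n/m
    print(f"n = {n:.2e} m^-2, E_F = {E_F / eV:.2f} eV")
    ```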

  16. Local yield stress statistics in model amorphous solids

    Science.gov (United States)

    Barbot, Armand; Lerbinger, Matthias; Hernandez-Garcia, Anier; García-García, Reinaldo; Falk, Michael L.; Vandembroucq, Damien; Patinet, Sylvain

    2018-03-01

    We develop and extend a method presented by Patinet, Vandembroucq, and Falk [Phys. Rev. Lett. 117, 045501 (2016), 10.1103/PhysRevLett.117.045501] to compute the local yield stresses at the atomic scale in model two-dimensional Lennard-Jones glasses produced via differing quench protocols. This technique allows us to sample the plastic rearrangements in a nonperturbative manner for different loading directions on a well-controlled length scale. Plastic activity upon shearing correlates strongly with the locations of low yield stresses in the quenched states. This correlation is higher in more structurally relaxed systems. The distribution of local yield stresses is also shown to strongly depend on the quench protocol: the more relaxed the glass, the higher the local plastic thresholds. Analysis of the magnitude of local plastic relaxations reveals that stress drops follow exponential distributions, justifying the hypothesis of an average characteristic amplitude often conjectured in mesoscopic or continuum models. The amplitude of the local plastic rearrangements increases on average with the yield stress, regardless of the system preparation. The local yield stress varies with the shear orientation tested and strongly correlates with the plastic rearrangement locations when the system is sheared correspondingly. It is thus argued that plastic rearrangements are the consequence of shear transformation zones encoded in the glass structure that possess weak slip planes along different orientations. Finally, we justify the length scale employed in this work and extract the yield threshold statistics as a function of the size of the probing zones. This method makes it possible to derive physically grounded models of plasticity for amorphous materials by directly revealing the relevant details of the shear transformation zones that mediate this process.

  17. Parametric Sensitivity Analysis of the WAVEWATCH III Model

    Directory of Open Access Journals (Sweden)

    Beng-Chun Lee

    2009-01-01

    Full Text Available The parameters in numerical wave models need to be calibrated before a model can be applied to a specific region. In this study, we selected the 8 most important parameters from the source term of the WAVEWATCH III model and subjected them to sensitivity analysis to evaluate the sensitivity of the WAVEWATCH III model to the selected parameters to determine how many of these parameters should be considered for further discussion, and to justify the significance priority of each parameter. After ranking each parameter by sensitivity and assessing their cumulative impact, we adopted the ARS method to search for the optimal values of those parameters to which the WAVEWATCH III model is most sensitive by comparing modeling results with observed data at two data buoys off the coast of northeastern Taiwan; the goal being to find optimal parameter values for improved modeling of wave development. The procedure adopting optimal parameters in wave simulations did improve the accuracy of the WAVEWATCH III model in comparison to default runs based on field observations at two buoys.
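
    A generic one-factor-at-a-time (OFAT) loop of the kind described, with a toy error function standing in for a WAVEWATCH III run scored against buoy data:

    ```python
    import numpy as np

    def model_error(params):
        """Placeholder for 'run the wave model, compare with buoy observations'."""
        p = np.asarray(params)
        return float(np.sum((p - 1.0) ** 2) + 0.5 * p[0] * p[1])

    defaults = np.ones(8)                    # 8 source-term parameters
    sensitivity = []
    for i in range(len(defaults)):
        lo, hi = defaults.copy(), defaults.copy()
        lo[i] *= 0.9                         # perturb one factor at a time
        hi[i] *= 1.1
        sensitivity.append(abs(model_error(hi) - model_error(lo)))

    ranking = np.argsort(sensitivity)[::-1]  # most sensitive parameter first
    print("parameters ranked by sensitivity:", ranking)
    ```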

  18. Evaluating Emulation-based Models of Distributed Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Gabert, Kasimir G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Emulytics Initiatives

    2017-08-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The various uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  19. Modelling of the behaviour of a UF_6 container in a fire

    International Nuclear Information System (INIS)

    Pinton, Eric

    1996-01-01

    This thesis is motivated by safety needs concerning the storage and transport of UF_6 containers. To characterize their behaviour under fire conditions, a model was developed. Before tackling the numerical modelling, a phenomenological interpretation of experimental results from containers held in a furnace (800 C) for a fixed period was carried out. The treatment of the internal heat transfers was considerably improved by these results. The 2D model takes into account most of the physical phenomena encountered in this type of situation (boiling, evaporation, condensation, radiant heat transfer through an absorbing gas, convection, pressurisation, thermal contact resistance, UF_6 expansion, sinking of the solid core in the liquid, elastic and plastic deformations of the steel container). The model was successfully compared with experiments. (author) [fr]

  20. APPLICATION OF IMPRECISE MODELS IN ANALYSIS OF RISK MANAGEMENT OF SOFTWARE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2017-11-01

    Full Text Available An analysis of the functional completeness of existing detection systems was conducted. It made it possible to identify information systems with similar feature sets; to assess their degree of similarity and how closely their means match the "standard" model of a risk management system, which reflects the recommended ICAO practices and standards on aviation safety; and to justify the advisability of creating a decision-making support system that uses imprecise models and imprecise logic for risk analysis in aviation activities. Imprecise models have a number of advantages: they can take experts' intuition and experience into account; they allow more adequate modelling of flight safety management processes and yield accurate decisions that are consistent with the initial data; they support the rapid development of a safety management system whose functionality can subsequently grow in complexity; and their hardware and software implementation in control and decision-making systems is less sophisticated than that of classical algorithms.
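
    A minimal sketch of imprecise (fuzzy) risk scoring in the spirit the paper advocates; the triangular membership functions and the two-rule base are invented for illustration:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function on [a, c], peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def risk(likelihood, severity):          # both rated on a 0-10 scale
        lik_high, sev_high = tri(likelihood, 4, 7, 10), tri(severity, 4, 7, 10)
        lik_low, sev_low = tri(likelihood, 0, 2, 5), tri(severity, 0, 2, 5)
        # Rule 1: high likelihood AND high severity -> high risk (min as AND)
        # Rule 2: low likelihood AND low severity  -> low risk
        r_high = min(lik_high, sev_high)
        r_low = min(lik_low, sev_low)
        # Weighted (centroid-like) defuzzification onto a 0-10 risk scale.
        return (r_high * 9 + r_low * 1) / max(r_high + r_low, 1e-9)

    print(f"risk score: {risk(likelihood=8, severity=6):.2f} / 10")
    ```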

  1. Progressive diseases study using Markov's multiple stage models

    Directory of Open Access Journals (Sweden)

    René Iral Palomino, Esp estadística

    2005-12-01

    Full Text Available Risk factors and their degree of association with a progressive disease, such as Alzheimer's disease or liver cancer, can be identified by using epidemiological models; some examples of these models include logistic and Poisson regression, log-linear, linear regression, and mixed models. Using models that take into account not only the different health states that a person could experience between visits but also his/her characteristics (i.e., age, gender, genetic traits, etc.) seems to be reasonable and justified. In this paper we discuss a methodology to estimate the effect of covariates that could be associated with a disease when its progression or regression can be idealized by means of a multi-state model that incorporates the longitudinal nature of the data. This method is based on the Markov property and is illustrated using simulated data on Alzheimer's disease. Finally, the merits and limitations of this method are discussed.
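
    A small sketch of a progressive multi-state Markov model with a covariate acting on the transition intensities; the three states and the rate values are invented, not taken from the paper's Alzheimer's illustration:

    ```python
    import numpy as np
    from scipy.linalg import expm

    def transition_matrix(age, dt=1.0):
        """P(dt) = expm(Q*dt) for states healthy -> mild -> severe (progressive)."""
        h12 = 0.05 * np.exp(0.03 * (age - 60))   # covariate (age) on the log-rate
        h23 = 0.10 * np.exp(0.02 * (age - 60))
        Q = np.array([[-h12,  h12,  0.0],
                      [ 0.0, -h23,  h23],
                      [ 0.0,  0.0,  0.0]])       # 'severe' is absorbing
        return expm(Q * dt)

    # One-year transition probabilities for a 70-year-old subject.
    print(np.round(transition_matrix(age=70), 3))
    ```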

  2. HYPERELASTIC MODELS FOR GRANULAR MATERIALS

    Energy Technology Data Exchange (ETDEWEB)

    Humrickhouse, Paul W; Corradini, Michael L

    2009-01-29

    A continuum framework for modeling of dust mobilization and transport, and the behavior of granular systems in general, has been reviewed, developed and evaluated for reactor design applications. The large quantities of micron-sized particles expected in the international fusion reactor design, ITER, will accumulate into piles and layers on surfaces, which are large relative to the individual particle size; thus, particle-particle, rather than particle-surface, interactions will determine the behavior of the material in bulk, and a continuum approach is necessary and justified in treating the phenomena of interest; e.g., particle resuspension and transport. The various constitutive relations that characterize these solid particle interactions in dense granular flows have been discussed previously, but prior to mobilization their behavior is not even fluid. Even in the absence of adhesive forces between particles, dust or sand piles can exist in static equilibrium under gravity and other forces, e.g., fluid shear. Their behavior is understood to be elastic, though not linear. The recent “granular elasticity” theory proposes a non-linear elastic model based on “Hertz contacts” between particles; the theory identifies the Coulomb yield condition as a requirement for thermodynamic stability, and has successfully reproduced experimental results for stress distributions in sand piles. The granular elasticity theory is developed and implemented in a stand-alone model and then implemented as part of a finite element model, ABAQUS, to determine the stress distributions in dust piles subjected to shear by a fluid flow. We identify yield with the onset of mobilization, and establish, for a given dust pile and flow geometry, the threshold pressure (force) conditions on the surface due to flow required to initiate it. While the granular elasticity theory applies strictly to cohesionless granular materials, attractive forces are clearly important in the interaction of

  3. Modeling Progress in AI

    OpenAIRE

    Brundage, Miles

    2015-01-01

    Participants in recent discussions of AI-related issues ranging from intelligence explosion to technological unemployment have made diverse claims about the nature, pace, and drivers of progress in AI. However, these theories are rarely specified in enough detail to enable systematic evaluation of their assumptions or to extrapolate progress quantitatively, as is often done with some success in other technological domains. After reviewing relevant literatures and justifying the need for more ...

  4. Inhalation Exposure Input Parameters for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    K. Rautenstrauch

    2004-09-10

    This analysis is one of 10 reports that support the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN) biosphere model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents development of input parameters for the biosphere model that are related to atmospheric mass loading and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the total system performance assessment (TSPA) for a Yucca Mountain repository. Inhalation Exposure Input Parameters for the Biosphere Model is one of five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the plan for development of the biosphere abstraction products for TSPA, as identified in the Technical Work Plan for Biosphere Modeling and Expert Support (BSC 2004 [DIRS 169573]). This analysis report defines and justifies values of mass loading for the biosphere model. Mass loading is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Mass loading values are used in the air submodel of ERMYN to calculate concentrations of radionuclides in air inhaled by a receptor and concentrations in air surrounding crops. Concentrations in air to which the receptor is exposed are then used in the inhalation submodel to calculate the dose contribution to the receptor from inhalation of contaminated airborne particles. Concentrations in air surrounding plants are used in the plant submodel to calculate the concentrations of radionuclides in foodstuffs contributed from uptake by foliar interception.

  5. Inhalation Exposure Input Parameters for the Biosphere Model

    International Nuclear Information System (INIS)

    K. Rautenstrauch

    2004-01-01

    This analysis is one of 10 reports that support the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN) biosphere model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents development of input parameters for the biosphere model that are related to atmospheric mass loading and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the total system performance assessment (TSPA) for a Yucca Mountain repository. Inhalation Exposure Input Parameters for the Biosphere Model is one of five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the plan for development of the biosphere abstraction products for TSPA, as identified in the Technical Work Plan for Biosphere Modeling and Expert Support (BSC 2004 [DIRS 169573]). This analysis report defines and justifies values of mass loading for the biosphere model. Mass loading is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Mass loading values are used in the air submodel of ERMYN to calculate concentrations of radionuclides in air inhaled by a receptor and concentrations in air surrounding crops. Concentrations in air to which the receptor is exposed are then used in the inhalation submodel to calculate the dose contribution to the receptor from inhalation of contaminated airborne particles. Concentrations in air surrounding plants are used in the plant submodel to calculate the concentrations of radionuclides in foodstuffs contributed from uptake by foliar interception

  6. Uncertainty-based calibration and prediction with a stormwater surface accumulation-washoff model based on coverage of sampled Zn, Cu, Pb and Cd field data

    DEFF Research Database (Denmark)

    Lindblom, Erik Ulfson; Ahlman, S.; Mikkelsen, Peter Steen

    2011-01-01

    allows identifying a range of behavioral model parameter sets. The small catchment size and nearness of the rain gauge justified excluding the hydrological model parameters from the uncertainty assessment. Uniform, closed prior distributions were heuristically specified for the dry and wet removal...... of accumulated metal available on the conceptual catchment surface. Forward Monte Carlo analysis based on the posterior parameter sets covered 95% of the observed event mean concentrations, and 95% prediction quantiles for site mean concentrations were estimated to 470 μg/l ±20% for Zn, 295 μg/l ±40% for Cu, 20...

  7. AN ACCURATE MODELING OF DELAY AND SLEW METRICS FOR ON-CHIP VLSI RC INTERCONNECTS FOR RAMP INPUTS USING BURR’S DISTRIBUTION FUNCTION

    Directory of Open Access Journals (Sweden)

    Rajib Kar

    2010-09-01

    Full Text Available This work presents an accurate and efficient model to compute the delay and slew metrics of on-chip interconnects in high-speed CMOS circuits for ramp inputs. Our metric is based on the Burr distribution function, which is used to characterize the normalized homogeneous portion of the step response. We use the PERI (Probability distribution function Extension for Ramp Inputs) technique, which extends delay and slew metrics for step inputs to the more general and realistic non-step inputs. The accuracy of our models is justified by comparison with SPICE simulations.
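
    As a hedged sketch of the idea: if the normalized step response is modeled by a Burr XII distribution, F(t) = 1 − (1 + (t/s)^c)^(−k), then the delay (50% point) and slew (10%–90% window) follow in closed form. The shape and scale values below are placeholders; in the paper they would instead come from matching the interconnect's moments.

    ```python
    def burr_quantile(p, c, k, s=1.0):
        """Invert F(t) = 1 - (1 + (t/s)^c)^(-k) for the time t at level p."""
        return s * ((1.0 - p) ** (-1.0 / k) - 1.0) ** (1.0 / c)

    c, k, s = 2.0, 1.5, 1e-9          # example shapes and scale (seconds)
    delay = burr_quantile(0.5, c, k, s)                       # 50% crossing
    slew = burr_quantile(0.9, c, k, s) - burr_quantile(0.1, c, k, s)
    print(f"50% delay = {delay:.3e} s, 10-90% slew = {slew:.3e} s")
    ```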

  8. Geometric singularities and spectra of Landau-Ginzburg models

    International Nuclear Information System (INIS)

    Greene, B.R.; Roan, S.S.; Yau, S.T.

    1991-01-01

    Some mathematical and physical aspects of superconformal string compactification in weighted projective space are discussed. In particular, we recast the path integral argument establishing the connection between Landau-Ginzburg conformal theories and Calabi-Yau string compactification in a geometric framework. We then prove that the naive expression for the vanishing of the first Chern class for a complete intersection (adopted from the smooth case) is sufficient to ensure that the resulting variety, which is generically singular, can be resolved to a smooth Calabi-Yau space. This justifies much analysis which has recently been expended on the study of Landau-Ginzburg models. Furthermore, we derive some simple formulae for the determination of the Witten index in these theories which are complementary to those derived using semiclassical reasoning by Vafa. Finally, we also comment on the possible geometrical significance of unorbifolded Landau-Ginzburg theories. (orig.)

  9. An approximation method for diffusion based leaching models

    International Nuclear Information System (INIS)

    Shukla, B.S.; Dignam, M.J.

    1987-01-01

    In connection with the fixation of nuclear waste in a glassy matrix, equations have been derived for leaching models based on a uniform concentration gradient approximation, and hence a uniform flux, therefore requiring the use of only Fick's first law. In this paper we improve on the uniform flux approximation, developing and justifying the approach. The resulting set of equations is solved, to a satisfactory approximation, for a matrix dissolving at a constant rate in a finite volume of leachant, giving analytical expressions for the time dependence of the thickness of the leached layer, the diffusional and dissolutional contributions to the flux, and the leachant composition. Families of curves are presented which cover the full range of all the physical parameters for this system. The same procedure can be readily extended to more complex systems. (author)

  10. Optimization of recurrent neural networks for time series modeling

    DEFF Research Database (Denmark)

    Pedersen, Morten With

    1997-01-01

    The present thesis is about optimization of recurrent neural networks applied to time series modeling. In particular, fully recurrent networks are considered, working from only a single external input, with one layer of nonlinear hidden units and a linear output unit, applied to prediction of discrete time series. The overall objectives are to improve training by application of second-order methods and to improve generalization ability by architecture optimization accomplished by pruning. The major topics covered in the thesis are: 1. The problem of training recurrent networks is analyzed from a numerical... of solution obtained as well as computation time required. 3. A theoretical definition of the generalization error for recurrent networks is provided. This definition justifies a commonly adopted approach for estimating generalization ability. 4. The viability of pruning recurrent networks by the Optimal...

  11. Nonlinear Model Predictive Control for Oil Reservoirs Management

    DEFF Research Database (Denmark)

    Capolei, Andrea

    expensive gradient computation by using high-order ESDIRK (Explicit Singly Diagonally Implicit Runge-Kutta) temporal integration methods and continuous adjoints. The high order integration scheme allows larger time steps and therefore faster solution times. We compare gradient computation by the continuous...... gradient-based optimization and the required gradients are computed by the adjoint method. We propose the use of efficient high order implicit time integration methods for the solution of the forward and the adjoint equations of the dynamical model. The Ensemble Kalman filter is used for data assimilation...... equivalent strategy is not justified for the particular case studied in this paper. The third contribution of this thesis is a mean-variance method for risk mitigation in production optimization of oil reservoirs. We introduce a return-risk bicriterion objective function for the profit-risk tradeoff...

  12. [Requirements imposed on model objects in microevolutionary investigations].

    Science.gov (United States)

    Mina, M V

    2015-01-01

    Extrapolation of results from investigations of a model object is justified only within the limits of a set of objects that share essential properties with the model object. Which properties are essential depends on the aim of a study. Similarity of objects that emerged in the course of their independent evolution does not prove similarity of the ways and mechanisms of their evolution. If the objects differ in their essential properties, then extrapolating the results obtained for one object to another is risky, because it may lead to wrong decisions and, moreover, to the loss of interest in alternative hypotheses. The positions formulated above are considered with reference to species flocks of fishes, the large African Barbus in particular.

  13. Electromechanical modelling of tapered ionic polymer metal composites transducers

    Directory of Open Access Journals (Sweden)

    Rakesha Chandra Dash

    2016-09-01

    Full Text Available Ionic polymer metal composites (IPMCs) are relatively new smart materials that exhibit bidirectional electromechanical coupling. IPMCs have a large number of important engineering applications, such as micro robotics, biomedical devices and biomimetic robotics. This paper presents a comparison between tapered and uniform cantilevered Nafion-based IPMC transducers. Electromechanical modelling is carried out for the tapered beam, whose thickness can be varied according to the required force and deflection. Numerical results for the force and deflection characteristics of both types of IPMC transducer are obtained. It is shown that the desired force and deflection for tapered IPMCs can be achieved for a given voltage. Different fixed-end (t0) and free-end (t1) thickness values have been used to verify the results using MATLAB.

  14. Discretization-dependent model for weakly connected excitable media

    Science.gov (United States)

    Arroyo, Pedro André; Alonso, Sergio; Weber dos Santos, Rodrigo

    2018-03-01

    Pattern formation has been widely observed in extended chemical and biological processes. Although the biochemical systems are highly heterogeneous, homogenized continuum approaches formed by partial differential equations have been employed frequently. Such approaches are usually justified by the difference of scales between the heterogeneities and the characteristic spatial size of the patterns. Under different conditions, for example, under weak coupling, discrete models are more adequate. However, discrete models may be less manageable, for instance, in terms of numerical implementation and mesh generation, than the associated continuum models. Here we study a model to approach discreteness which permits the computer implementation on general unstructured meshes. The model is cast as a partial differential equation but with a parameter that depends not only on heterogeneities sizes, as in the case of quasicontinuum models, but also on the discretization mesh. Therefore, we refer to it as a discretization-dependent model. We validate the approach in a generic excitable media that simulates three different phenomena: the propagation of action membrane potential in cardiac tissue, in myelinated axons of neurons, and concentration waves in chemical microemulsions.

  15. Formation of an Integrated Stock Price Forecast Model in Lithuania

    Directory of Open Access Journals (Sweden)

    Audrius Dzikevičius

    2016-12-01

    Full Text Available Technical and fundamental analyses are widely used to forecast stock prices, owing to limited awareness of other modern models and methods such as the Residual Income Model, ANN-APGARCH, Support Vector Machines, Probabilistic Neural Networks and Genetic Fuzzy Systems. Although stock price forecast models integrating both technical and fundamental analyses are already widely used, their integration is not justified comprehensively enough. This paper discusses theoretical one-factor and multi-factor stock price forecast models already applied by investors at a global level and determines the possibility of creating and practically applying a stock price forecast model which integrates fundamental and technical analysis with reference to the Lithuanian stock market. The research aims to determine the relationship between the stock prices of the 14 Lithuanian companies listed on the Nasdaq OMX Baltic Main List and various fundamental variables. Based on the results of correlation and regression analysis and the application of the chi-squared test and the ANOVA method, a general stock price forecast model is generated. The paper discusses practical implications of how the developed model can be used by individual investors to forecast stock prices, and suggests additional check measures.
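
    A minimal multi-factor regression of the kind such integrated models build on, with synthetic fundamentals standing in for the Lithuanian data:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 60                                       # e.g., quarterly observations
    eps = rng.normal(2, 0.5, n)                  # earnings per share (synthetic)
    bv = rng.normal(10, 2, n)                    # book value (synthetic)
    rev = rng.normal(50, 8, n)                   # revenue (synthetic)
    price = 4.0 * eps + 0.8 * bv + 0.05 * rev + rng.normal(0, 1.0, n)

    X = np.column_stack([np.ones(n), eps, bv, rev])   # intercept + fundamentals
    beta, res, *_ = np.linalg.lstsq(X, price, rcond=None)
    r2 = 1 - res[0] / ((price - price.mean()) ** 2).sum()
    print("coefficients:", beta.round(3), f" R^2 = {r2:.3f}")
    ```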

  16. CEAI: CCM-based email authorship identification model

    Directory of Open Access Journals (Sweden)

    Sarwat Nizamani

    2013-11-01

    Full Text Available In this paper we present a model for email authorship identification (EAI) by employing a Cluster-based Classification (CCM) technique. Traditionally, stylometric features have been successfully employed in various authorship analysis tasks; we extend the traditional feature set to include some more interesting and effective features for email authorship identification (e.g., the last punctuation mark used in an email, the tendency of an author to use capitalization at the start of an email, or the punctuation after a greeting or farewell). We also included content features selected via Info Gain feature selection. It is observed that the use of such features in the authorship identification process has a positive impact on the accuracy of the authorship identification task. We performed experiments to justify our arguments and compared the results with other baseline models. Experimental results reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms the state-of-the-art support vector machine (SVM)-based models, as well as the models proposed by Iqbal et al. (2010, 2013) [1,2]. The proposed model attains an accuracy rate of 94% for 10 authors, 89% for 25 authors, and 81% for 50 authors on the Enron dataset, while 89.5% accuracy has been achieved on a real email dataset constructed by the authors. The results on the Enron dataset have been achieved with quite a large number of authors compared to the models proposed by Iqbal et al. [1,2].
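
    The sketch below shows the flavor of such email-specific stylometric features; the exact feature definitions used in the paper may differ:

    ```python
    import re

    def email_features(text: str) -> dict:
        """Extract a few illustrative stylometric features from an email body."""
        lines = [ln for ln in text.strip().splitlines() if ln.strip()]
        first = lines[0] if lines else ""
        punct = re.findall(r"[.!?,;:]", text)
        words = text.split()
        return {
            "starts_capitalized": first[:1].isupper(),
            "last_punctuation": punct[-1] if punct else "",
            "avg_word_len": sum(map(len, words)) / max(len(words), 1),
            "greeting_has_comma": bool(re.match(r"^(hi|hello|dear)\b.*,", first, re.I)),
        }

    print(email_features("Hi Bob,\nPlease send the report today.\nThanks!"))
    ```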

  17. On turbulence models for rod bundle flow computations

    International Nuclear Information System (INIS)

    Hazi, Gabor

    2005-01-01

    Commercial computational fluid dynamics codes have more than one turbulence model built in, and it is the user's responsibility to choose a model suitable for the problem studied. In the last decade, several computational fluid dynamics simulations of various problems in the nuclear industry have been presented. A common feature of a number of those simulations is that they were performed using the standard k-ε turbulence model without justifying the choice of the model, and the simulation results were rarely satisfactory. In this paper, we consider the flow in a fuel rod bundle as a case study and discuss why the application of the standard k-ε model fails to give reasonable results in this situation. We also show that a turbulence model based on the Reynolds stress transport equations can provide qualitatively correct results. Our aim is pedagogical: we would like to call the reader's attention to the fact that turbulence models have to be selected on the basis of theoretical considerations and/or adequate information obtained from measurements

  18. Modelling SDL, Modelling Languages

    Directory of Open Access Journals (Sweden)

    Michael Piefel

    2007-02-01

    Full Text Available Today's software systems are too complex to implement and model using only one language. As a result, modern software engineering uses different languages for different levels of abstraction and different system aspects. Handling an increasing number of related or integrated languages has thus become the most challenging task in the development of tools. We use object-oriented metamodelling to describe languages. Object orientation allows us to derive abstract reusable concept definitions (concept classes) from existing languages. This language definition technique concentrates on semantic abstractions rather than syntactical peculiarities. We present a set of common concept classes that describe structure, behaviour, and data aspects of high-level modelling languages. Our models contain syntax modelling using the OMG MOF as well as static semantic constraints written in OMG OCL. We derive metamodels for subsets of SDL and UML from these common concepts, and we show for parts of these languages that they can be modelled and related to each other through the same abstract concepts.

  19. Nongeneric tool support for model-driven product development; Werkzeugunterstuetzung fuer die modellbasierte Produktentwicklung. Maschinenlesbare Spezifikationen selbst erstellen

    Energy Technology Data Exchange (ETDEWEB)

    Bock, C. [Technische Univ. Kaiserslautern (Germany). Lehrstuhl fuer Produktionsautomatisierung; Zuehlke, D. [Technische Univ. Kaiserslautern (Germany). Lehrstuhl fuer Produktionsautomatisierung; Deutsches Forschungszentrum fuer Kuenstliche Intelligenz (DFKI), Kaiserslautern (DE). Zentrum fuer Mensch-Maschine-Interaktion (ZMMI)

    2006-07-15

    A well-defined specification process is a central success factor in human-machine interface development. Consequently, in interdisciplinary development teams, specification documents are an important communication instrument. In order to replace today's typically paper-based specifications and to leverage the benefits of their electronic equivalents, developers demand comprehensive and applicable computer-based tool kits. Manufacturers' increasing awareness of appropriate tool support is causing alternative approaches to tool-kit creation to emerge. This article therefore introduces meta-modelling as a promising way to create non-generic tool support with justifiable effort. This enables manufacturers to take advantage of electronic specifications in product development processes.

  20. The green electricity market model. Proposal for an optional, cost-neutral direct marketing model for supplying electricity customers

    International Nuclear Information System (INIS)

    Heinemann, Ronald

    2014-01-01

    One of the main goals of the Renewable Energy Law (EEG) is the market integration of renewable energy resources. For this purpose it has introduced compulsory direct marketing on the basis of a moving market premium. At the same time the green electricity privilege, a regulation which made it possible to supply customers with electricity from EEG plants, was abolished without replacement with effect from 1 August 2014. This means that, aside from other direct marketing channels, which will not be economically viable save in a few exceptional cases, it will no longer be possible to sell electricity from EEG plants to electricity customers under the designation ''electricity from renewable energy''. The reason for this is that electricity sold under the market premium model can no longer justifiably be claimed to originate from renewable energy. As a consequence, almost all green electricity products sold in Germany carry a foreign green electricity certificate.

  1. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P.J. [VTT Electronics, Oulu (Finland). Embedded Software

    1997-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. Enhanced access mechanisms into on-line documentation are

  2. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P J [VTT Electronics, Oulu (Finland). Embedded Software

    1998-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. Enhanced access mechanisms into on-line documentation are

  3. Modelling contractor’s bidding decision

    Directory of Open Access Journals (Sweden)

    Biruk Sławomir

    2017-03-01

    The authors aim to provide a set of tools to facilitate the main stages of the competitive bidding process for construction contractors. These involve (1) deciding whether to bid, (2) calculating the total price, and (3) breaking down the total price into the items of the bill of quantities or the schedule of payments to optimise contractor cash flows. To define factors that affect the decision to bid, the authors rely upon the literature on the subject and propose that multi-criteria methods be applied to calculate a single measure of contract attractiveness (utility value). An attractive contract implies that the contractor is likely to offer a lower price to increase the chances of winning the competition. The total bid price is thus interpolated between the lowest acceptable and the highest justifiable price based on the contract attractiveness. With the total bid price established, the next step is to split it between the items of the schedule of payments. A linear programming model is proposed for this purpose. The application of the models is illustrated with a numerical example.
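
    For illustration only, a minimal Python sketch of the two pricing steps just described: interpolating the total bid price from the attractiveness utility, then splitting it across payment items with a small linear programme that front-loads cash flow. All figures, the discount rate and the plus/minus 20% item bounds are invented for the example and are not taken from the paper.

        import numpy as np
        from scipy.optimize import linprog

        def total_bid_price(utility, lowest_acceptable, highest_justifiable):
            # Higher attractiveness (utility in [0, 1]) pulls the bid
            # towards the lowest acceptable price.
            return highest_justifiable - utility * (highest_justifiable - lowest_acceptable)

        total = total_bid_price(0.7, lowest_acceptable=900.0, highest_justifiable=1100.0)

        cost = np.array([120.0, 300.0, 180.0, 250.0])  # estimated item costs (invented)
        month = np.array([1, 3, 6, 9])                 # assumed completion month per item
        disc = (1 + 0.01) ** (-month)                  # monthly discount factors

        # Maximise the present value of the payment stream, subject to the
        # fixed total and per-item bounds around the pro-rata price.
        pro_rata = cost / cost.sum() * total
        res = linprog(c=-disc,
                      A_eq=np.ones((1, len(cost))), b_eq=[total],
                      bounds=list(zip(0.8 * pro_rata, 1.2 * pro_rata)),
                      method="highs")
        print(res.x, -res.fun)  # item prices and their present value

    With a positive discount rate the optimiser pushes early items to their upper bounds, the classic unbalanced-bid pattern that a cash-flow objective of this kind encourages.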

  4. The Intercultural Danube - a European Model

    Directory of Open Access Journals (Sweden)

    Gheorghe Lateș

    2014-08-01

    The EU construction began following the logic of economics, which has in time created dysfunctions that seem to deepen and to breed a quasi-general skepticism. This paper aims at analysing the union construction and reconstruction on other conceptual premises, placing culture at the forefront of the new strategy. A multicultural Europe based on the primordial ethnicity of states is no longer current; cultural diversity does not lead to unity, but is rather a factor of dissolution. The Danubian model reunites races, languages and religions so diverse that their functional diachrony justifies the idea of reconstruction based on what the region once was, without generating tensions or conflicts. In the Danube area, ethnic identity did not harden into ethnicism, which makes it, in a synchronic approach, a model for rethinking the union: not through hierarchies and barriers, but through the opportunity of coexistence of the peoples whose history and present are connected by the river, the horizontal axis of a united Europe.

  5. On a Mathematical Model of Brain Activities

    International Nuclear Information System (INIS)

    Fichtner, K.-H.; Fichtner, L.; Freudenberg, W.; Ohya, M.

    2007-01-01

    The procedure of recognition can be described as follows: There is a set of complex signals stored in the memory. Choosing one of these signals may be interpreted as generating a hypothesis concerning an 'expected view of the world'. Then the brain compares a signal arising from our senses with the signal chosen from the memory, leading to a change of the state of both signals. Furthermore, measurements of that procedure like EEG or MEG are based on the fact that recognition of signals causes a certain loss of excited neurons, i.e. the neurons change their state from 'excited' to 'nonexcited'. For that reason a statistical model of the recognition process should reflect both the change of the signals and the loss of excited neurons. A first attempt to explain the process of recognition in terms of quantum statistics was given. In the present note it is not possible to present this approach in detail. Instead, we sketch roughly a few of the basic ideas and structures of the proposed model of the recognition process (Section). Further, we introduce the basic spaces and justify the choice of spaces used in this approach. A more elaborate presentation including all proofs will be given in a series of forthcoming papers. In this series the procedures of creation of signals from the memory, amplification, accumulation and transformation of input signals, and measurements like EEG and MEG will also be treated in detail.
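
    As a purely classical toy illustration of the recognition loop sketched above (choose a stored signal as hypothesis, compare it with the sensed signal, lose excited neurons), and emphatically not the quantum-statistical model proposed in the note, one might write:

        # Toy caricature of the recognition loop; every detail here
        # (binary signals, Hamming overlap, the flip rule) is invented
        # for illustration and does not come from the paper.
        import numpy as np

        rng = np.random.default_rng(0)
        memory = rng.integers(0, 2, size=(5, 100)).astype(bool)  # stored signals
        sensed = memory[2] ^ (rng.random(100) < 0.1)             # noisy sensed signal

        # Hypothesis: the stored signal most similar to the input.
        overlaps = (memory == sensed).sum(axis=1)
        hypothesis = memory[overlaps.argmax()]

        # Recognition flips neurons matching the hypothesis from
        # 'excited' to 'nonexcited', so the excitation count drops
        # (the effect that EEG/MEG measurements rely on).
        after = sensed & ~hypothesis
        print(int(sensed.sum()), int(after.sum()))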

  6. Modelling the models

    CERN Multimedia

    Anaïs Schaeffer

    2012-01-01

    By analysing the production of mesons in the forward region of LHC proton-proton collisions, the LHCf collaboration has provided key information needed to calibrate extremely high-energy cosmic ray models. [Figure: average transverse momentum (pT) as a function of rapidity loss Δy. Black dots represent LHCf data and red diamonds the results of the SPS experiment UA7. The predictions of hadronic interaction models are shown by open boxes (sibyll 2.1), open circles (qgsjet II-03) and open triangles (epos 1.99); among these models, epos 1.99 shows the best overall agreement with the LHCf data.] LHCf is dedicated to the measurement of neutral particles emitted at extremely small angles in the very forward region of LHC collisions. Two imaging calorimeters, Arm1 and Arm2, take data 140 m on either side of the ATLAS interaction point. “The physics goal of this type of analysis is to provide data for calibrating the hadron interaction models, the well-known ...

  7. Business capital accumulation and the user cost: is there a heterogeneity bias?

    OpenAIRE

    FATICA SERENA

    2016-01-01

    Using data from 23 market economy sectors across 10 OECD countries over the period 1984-2007 we show that the homogeneity assumption underlying empirical models for aggregate capital accumulation may lead to misspecification. Thus, we adopt a fully disaggregated approach – by asset types and sectors – to estimate the responsiveness of investment to the tax-adjusted user cost of capital. In this framework, we are able to link unobserved common factors to the nature of the shocks affecting the ...
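
    For orientation, the tax-adjusted user cost of capital in such studies is typically a Hall-Jorgenson-type expression; the following is an illustrative sketch in our own notation, not necessarily the paper's exact specification:

        \[
          UC_{j,t} \;=\; \frac{q_{j,t}}{p_t}\,\frac{1 - A_{j,t}}{1 - \tau_t}\,\left(r_t + \delta_j\right)
        \]

    where, for asset type j, q denotes the investment-good price, p the output price, A the present value of depreciation allowances per unit invested, τ the corporate tax rate, r the real interest rate, and δ the depreciation rate of the asset.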

  8. Study and discretization of kinetic models and fluid models at low Mach number

    International Nuclear Information System (INIS)

    Dellacherie, Stephane

    2011-01-01

    This thesis summarizes our work between 1995 and 2010. It concerns the analysis and the discretization of Fokker-Planck or semi-classical Boltzmann kinetic models and of Euler or Navier-Stokes fluid models at low Mach number. The studied Fokker-Planck equation models the collisions between ions and electrons in a hot plasma, and is here applied to inertial confinement fusion. The studied semi-classical Boltzmann equations are of two types. The first one models the thermonuclear reaction between a deuterium ion and a tritium ion producing an α particle and a neutron, and is also in our case used to describe inertial confinement fusion. The second one (known as the Wang-Chang and Uhlenbeck equations) models the transitions between quantised electronic energy levels of uranium and iron atoms in the AVLIS isotopic separation process. The basic properties of these two Boltzmann equations are studied, and, for the Wang-Chang and Uhlenbeck equations, a kinetic-fluid coupling algorithm is proposed. This kinetic-fluid coupling algorithm prompted us to study the relaxation concept for mixtures of gases and of immiscible fluids, and to underline connections with classical kinetic theory. Then, a diphasic low Mach number model without acoustic waves is proposed to model the deformation of the interface between two immiscible fluids induced by high heat transfers at low Mach number. In order to increase the accuracy of the results without increasing the computational cost, an AMR algorithm is studied on a simplified interface deformation model. These low Mach number studies also prompted us to analyse, on Cartesian meshes, the inaccuracy at low Mach number of Godunov schemes. Finally, the LBM algorithm applied to the heat equation is justified.
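
    To fix ideas on the last point, a textbook-level statement of the low Mach number difficulty (our formulation, not a formula quoted from the thesis): for well-prepared continuous solutions, pressure fluctuations are of second order in the Mach number M,

        \[
          p(x, t) \;=\; p_0(t) + M^2\, p_2(x, t) + o(M^2),
        \]

    whereas Godunov-type schemes on Cartesian meshes are known to generate spurious pressure fluctuations of order M, which is the kind of inaccuracy analysed above.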

  9. Is surgical intervention always justified in the treatment of varicocele in children?

    Directory of Open Access Journals (Sweden)

    L. O. Severgina

    2014-11-01

    The optimal treatment of varicocele remains unclear. After comparing clinical and morphological data (results of physical examination of the patients, resistive index values on Doppler sonography, and morphological analysis of incision biopsies obtained during operative treatment of left-sided varicocele) from 35 boys aged 10 to 15 years and 10 adults, we conclude that compensatory reactions occur in the walls of scrotal veins at all stages of varicocele development, especially in type 3 veins. In adult patients with varicocele the morphological changes in the veins were more significant: prominent sclerosis of the vein wall was found, and operative treatment therefore seems more expedient in this group of patients.

  10. The global spread of Zika virus: is public and media concern justified in regions currently unaffected?

    Institute of Scientific and Technical Information of China (English)

    Narayan Gyawali; Richard S.Bradbury; Andrew W.Taylor-Robinson

    2016-01-01

    Background: Zika virus, an Aedes mosquito-borne flavivirus, is fast becoming a worldwide public health concern following its suspected association with over 4000 recent cases of microcephaly among newborn infants in Brazil. Discussion: Prior to its emergence in Latin America in 2015-2016, Zika was known to exist at a relatively low prevalence in parts of Africa, Asia and the Pacific islands. An extension of its apparent global dispersion may be enabled by climate conditions suitable to support the population growth of A. aegypti and A. albopictus mosquitoes over an expanding geographical range. In addition, increased globalisation continues to pose a risk for the spread of infection. Further, suspicions of alternative modes of virus transmission (sexual and vertical), if proven, provide a platform for outbreaks in mosquito non-endemic regions as well. Since a vaccine or anti-viral therapy is not yet available, current means of disease prevention involve protection from mosquito bites, excluding pregnant females from travelling to Zika-endemic territories, and practicing safe sex in those countries. Importantly, in countries where Zika is reported as endemic, caution is advised in planning to conceive a baby until such time as the apparent association between infection with the virus and microcephaly is either confirmed or refuted. The question arises as to what advice is appropriate to give in more economically developed countries distant to the current epidemic and in which Zika has not yet been reported. Summary: Despite understandable concern among the general public that has been fuelled by the media, in regions where Zika is not present, such as North America, Europe and Australia, at this time any outbreak (initiated by an infected traveler returning from an endemic area) would very probably be contained locally. Since Aedes spp. has very limited spatial dispersal, overlapping high population densities of mosquitoes and humans would be needed to sustain a focus of infection. However, as A. aegypti is distinctly anthropophilic, future control strategies for Zika should be considered in tandem with the continuing threat to human wellbeing that is presented by dengue, yellow fever and Japanese encephalitis, all of which are transmitted by the same vector species.

  11. Are physicians' strikes ever morally justifiable? A call for a return to ...

    African Journals Online (AJOL)

    Though a physicians' strike provides an opportunity to generate more knowledge about the process by which the legitimacy of an organization can be restored, it meets with a great deal of resistance, not only from the public but also from within the medical profession. This paper critically examines the legitimacy of strike by medical ...

  12. Is the use of ABPM justified in patients on 1 or 2 antihypertensive medications?

    Science.gov (United States)

    Mathur, Gaurav; Prasad, Rachana; Robinson, Anne; Rodrigues, Erwin; Wong, Peter

    2008-03-28

    We studied the utility of ABPM in patients with elevated clinic BP on 1-2 antihypertensive medications (group B, N=117), compared with those on no medications (group A, N=76) and on ≥3 medications (group C, N=110). 35% of patients in group B had adequately controlled 24-h BP based on ABPM, compared with 22.4% in group A (P=0.06) and 19.1% in group C (P=0.007). Antihypertensive treatment was not escalated in patients with adequately controlled BP. This suggests that ABPM has an important role in therapeutic decision-making for patients on 1-2 antihypertensive medications.

  13. Assessing outcomes to determine whether symptoms related to hypertension justify renal artery stenting.

    Science.gov (United States)

    Modrall, J Gregory; Rosero, Eric B; Timaran, Carlos H; Anthony, Thomas; Chung, Jayer; Valentine, R James; Trimmer, Clayton

    2012-02-01

    The goal of the study was to determine the blood pressure (BP) response to renal artery stenting (RAS) for patients with hypertension urgency, hypertension emergency, and angina with congestive heart failure (angina/CHF). Patients who underwent RAS for hypertension emergencies (n = 13), hypertension urgencies (n = 25), and angina/CHF (n = 14) were included in the analysis. By convention, hypertension urgency was defined by a sustained systolic BP ≥ 180 mm Hg or diastolic BP ≥ 120 mm Hg, while the definition of hypertension emergency required the same BP parameters plus hypertension-related symptoms prompting hospitalization. Patient-specific response to RAS was defined according to modified American Heart Association reporting guidelines. The study cohort of 52 patients had a median age of 66 years (interquartile range 58-72). The BP response to RAS varied significantly according to the indication for RAS. Hypertension emergency provided the highest BP response rate (85%), while the response rate was significantly lower for hypertension urgency (52%) and angina/CHF (7%; P = .03). Only 1 of 14 patients with angina/CHF was a BP responder. Multivariate analysis showed that hypertension urgency or emergency was not an independent predictor of BP response to RAS. Instead, the only independent predictor of a favorable BP response was the number of preoperative antihypertensive medications (odds ratio 7.5; 95% confidence interval 2.5-22.9; P = .0004), which is another indicator of the severity of hypertension. Angina/CHF was an independent predictor of failure to respond to RAS (odds ratio 118.6; 95% confidence interval 2.8-999.9; P = .013). Hypertension urgency and emergency are clinical manifestations of severe hypertension, but the number of preoperative antihypertensive medications proved to be a better predictor of a favorable BP response to RAS. In contrast, angina/CHF was a predictor of failure to respond to stenting, providing further evidence against the practice of incidental stenting during coronary interventions. Copyright © 2012 Society for Vascular Surgery. All rights reserved.

  14. Can routine commercial cord blood banking be scientifically and ethically justified?

    Directory of Open Access Journals (Sweden)

    Nicholas M Fisk

    2005-02-01

    Background to the debate: Umbilical cord blood (the blood that remains in the placenta after birth) can be collected and stored frozen for years. A well-accepted use of cord blood is as an alternative to bone marrow as a source of hematopoietic stem cells for allogeneic transplantation to siblings or to unrelated recipients; women can donate cord blood for unrelated recipients to public banks. However, private banks are now open that offer expectant parents the option to pay a fee for the chance to store cord blood for possible future use by that same child (autologous transplantation).

  15. A Justified Initial Accounting Estimate as an Integral Part of the Enterprise Accounting Policy

    Directory of Open Access Journals (Sweden)

    Marenych Tetyana H

    2016-05-01

    The aim of the article is to justify the need to specify in the order on accounting policies not only the elements of the accounting policy itself but also the initial accounting estimates, which will increase the reliability of financial reporting, and to develop proposals on improving these administrative documents of the enterprise. It is noted that in recent years the importance of a high-quality accounting policy has increased significantly, not only for users of financial reports but also for the purposes of determining the object of levying the profits tax. Significant differences are revealed in how the consequences of changes in the accounting policy and in accounting estimates are reflected in accounting. Information on accounting estimates given in orders on the enterprise accounting policy is generalized. It is proposed to provide a separate section in the order presenting the list of accounting estimates adopted and describing how the company will make changes to the accounting policy and to accounting estimates, as well as correct errors.

  16. A Justified Initial Accounting Estimate as an Integral Part of the Enterprise Accounting Policy

    OpenAIRE

    Marenych Tetyana H

    2016-01-01

    The aim of the article is justification of the need to specify in the order on accounting policies not only the elements of the accounting policy itself but also the initial accounting estimates, which will increase the reliability of financial reporting and the development of proposals on improvement of the given administrative documents of the enterprise. It is noted that in recent years the importance of a high-quality accounting policy has increased significantly not onl...

  17. Is the Inclusion of Animal Source Foods in Fortified Blended Foods Justified?

    Directory of Open Access Journals (Sweden)

    Kristen E. Noriega

    2014-09-01

    Fortified blended foods (FBF) are used for the prevention and treatment of moderate acute malnutrition (MAM) in nutritionally vulnerable individuals, particularly children. A recent review of FBF recommended the addition of animal source food (ASF) in the form of whey protein concentrate (WPC), especially to corn-soy blends. The justification for this recommendation includes the potential of ASF to increase length, weight, muscle mass accretion and recovery from wasting, as well as to improve protein quality and provide essential growth factors. Evidence was collected from the following four different types of studies: (1) epidemiological; (2) ASF versus no intervention or a low-calorie control; (3) ASF versus an isocaloric non-ASF; and (4) ASF versus an isocaloric, isonitrogenous non-ASF. Epidemiological studies consistently associated improved growth outcomes with ASF consumption; however, little evidence from isocaloric and isocaloric, isonitrogenous interventions was found to support the inclusion of meat or milk in FBF. Evidence suggests that whey may benefit muscle mass accretion, but not linear growth. Overall, little evidence supports the costly addition of WPC to FBFs. Further, randomized isocaloric, isonitrogenous ASF interventions with nutritionally vulnerable children are needed.

  18. Psychiatry in the land of the Sphinx: is an overseas elective justified?

    Science.gov (United States)

    Rege, Sanil

    2008-08-01

    The aim of this paper is to provide a descriptive account of a 6-month sabbatical in Egypt to highlight the diversity of benefits in incorporating such activities within psychiatric training programs. An overseas elective offers an exciting practical opportunity to broaden one's experience of transcultural psychiatry and obtain a perspective on mental illness and its cultural variations. It also promotes an understanding of health service management in low and middle income countries and offers the opportunity to contribute to their healthcare at minimal cost. However, the elective needs to be undertaken at an optimal period of a psychiatrist's career and with minimal disruption to local services. Training schemes and employers could provide more opportunities for interested trainees, with specified projects and aims in mind, to undertake such electives so that they can begin to develop expertise in treating a particular cultural group. In turn, this would go a long way to producing culturally capable psychiatrists for the wide range of ethnic minorities in Australia.

  19. A fair range of choice: justifying maximum patient choice in the British National Health Service.

    Science.gov (United States)

    Wilmot, Stephen

    2007-06-01

    In this paper I put forward an ethical argument for the provision of extensive patient choice by the British National Health Service. I base this argument on traditional liberal rights to freedom of choice, on a welfare right to health care, and on a view of health as values-based. I argue that choice, to be ethically sustainable on this basis, must be values-based and rational. I also consider whether the British taxpayer may be persuadable with regard to the moral acceptability of patient choice, making use of Rawls' theory of political liberalism in this context. I identify issues that present problems in terms of public acceptance of choice, and also identify a boundary issue with regard to public health choices as against individual choices.

  20. Is the hype around the reproductive health claims of maca (Lepidium meyenii Walp.) justified?

    Science.gov (United States)

    Beharry, Shruti; Heinrich, Michael

    2018-01-30

    Maca (Lepidium meyenii Walp.) has been cultivated and used by Andean people in Peru for some 1300-2000 years as food and medicine. Starting in the late 1990s it has developed into an important herbal medicine in China and is now widely cultivated there, too. Aim of study: This study aims to provide an insight into the emergence of maca on the global market as an alternative remedy to treat reproductive health related problems in both men and women, and to critically assess these health claims. A search of electronic databases such as EMBASE, together with a hand-search, was done to acquire peer-reviewed articles and reports about maca. Lepidium meyenii is used traditionally as a tonic, as a fertility enhancer for both humans and cattle, and to treat a variety of ailments such as rheumatism, respiratory disorders and anaemia, among others. Maca root is cooked, baked, fermented as a drink and made into porridge. In the last twenty years, maca was introduced onto the global market and demand has grown dramatically over this time with its promotion on the internet as the 'Peruvian Ginseng' for libido and fertility enhancement. It has also been said to treat menopausal symptoms, erectile dysfunction and benign prostatic hyperplasia. The sky-rocketing demand for the plant has seen a shift from traditional cultivation methods to mass production practices with the use of fertilisers and pesticides, as maca is now grown in areas other than the Andes, such as the Yunnan province in China. This can potentially affect the phytochemistry and composition of the plant and thus the quality, safety and efficacy of maca products. Meanwhile, research into maca's medicinal properties has followed the spike in its popularity and has focused mainly on its aphrodisiac and fertility enhancing properties. So far, the in vivo studies and clinical trials conducted have yielded inconclusive results; some of the key limitations reside in methodology and sample size. Chemical profiling led to the discovery of new compounds unique to maca, such as the 'macamides', and of other active metabolites like the glucosinolates, to which the medicinal effects of maca have been ascribed but which cannot be confirmed due to lack of data. To date, the health claims of maca cannot be fully supported from a scientific standpoint and more research is needed. It appears that the indigenous local knowledge about the health benefits of maca has been dragged out of context to fit the demands of a growing market for herbal remedies. This globalisation (or hype, especially in China) has also had serious consequences for the local producers in Peru. The lack of protocols to regulate the production and marketing of maca during this rapid expansion poses a threat to both the safety of consumers and the sustainability of supply. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.