WorldWideScience

Sample records for model misspecification justifying

  1. Linearity and Misspecification Tests for Vector Smooth Transition Regression Models

    DEFF Research Database (Denmark)

    Teräsvirta, Timo; Yang, Yukai

    The purpose of the paper is to derive Lagrange multiplier and Lagrange multiplier type specification and misspecification tests for vector smooth transition regression models. We report results from simulation studies in which the size and power properties of the proposed asymptotic tests in small...

  2. The Vanishing Tetrad Test: Another Test of Model Misspecification

    Science.gov (United States)

    Roos, J. Micah

    2014-01-01

    The Vanishing Tetrad Test (VTT) (Bollen, Lennox, & Dahly, 2009; Bollen & Ting, 2000; Hipp, Bauer, & Bollen, 2005) is an extension of the Confirmatory Tetrad Analysis (CTA) proposed by Bollen and Ting (Bollen & Ting, 1993). VTT is a powerful tool for detecting model misspecification and can be particularly useful in cases in which…

  4. Explained variation and predictive accuracy in general parametric statistical models: the role of model misspecification

    DEFF Research Database (Denmark)

    Rosthøj, Susanne; Keiding, Niels

    2004-01-01

    When studying a regression model, measures of explained variation are used to assess the degree to which the covariates determine the outcome of interest. Measures of predictive accuracy are used to assess the accuracy of the predictions based on the covariates and the regression model. We give a detailed and general introduction to the two measures and the estimation procedures. The framework we set up allows for a study of the effect of misspecification on the quantities estimated. We also introduce a generalization to survival analysis.

  5. Restricted spatial regression in practice: Geostatistical models, confounding, and robustness under model misspecification

    Science.gov (United States)

    Hanks, Ephraim M.; Schliep, Erin M.; Hooten, Mevin B.; Hoeting, Jennifer A.

    2015-01-01

    In spatial generalized linear mixed models (SGLMMs), covariates that are spatially smooth are often collinear with spatially smooth random effects. This phenomenon is known as spatial confounding and has been studied primarily in the case where the spatial support of the process being studied is discrete (e.g., areal spatial data). In this case, the most common approach suggested is restricted spatial regression (RSR) in which the spatial random effects are constrained to be orthogonal to the fixed effects. We consider spatial confounding and RSR in the geostatistical (continuous spatial support) setting. We show that RSR provides computational benefits relative to the confounded SGLMM, but that Bayesian credible intervals under RSR can be inappropriately narrow under model misspecification. We propose a posterior predictive approach to alleviating this potential problem and discuss the appropriateness of RSR in a variety of situations. We illustrate RSR and SGLMM approaches through simulation studies and an analysis of malaria frequencies in The Gambia, Africa.

  6. Asymptotics for Change-Point Models under Varying Degrees of Mis-Specification.

    Science.gov (United States)

    Song, Rui; Banerjee, Moulinath; Kosorok, Michael R

    2016-02-01

    Change-point models are widely used by statisticians to model drastic changes in the pattern of observed data. Least squares/maximum likelihood based estimation of change-points leads to curious asymptotic phenomena. When the change-point model is correctly specified, such estimates generally converge at a fast rate (of order n) and are asymptotically described by minimizers of a jump process. Under complete mis-specification by a smooth curve, i.e. when a change-point model is fitted to data described by a smooth curve, the rate of convergence slows down to n^(1/3) and the limit distribution changes to that of the minimizer of a continuous Gaussian process. In this paper we provide a bridge between these two extreme scenarios by studying the limit behavior of change-point estimates under varying degrees of model mis-specification by smooth curves, which can be viewed as local alternatives. We find that the limiting regime depends on how quickly the alternatives approach a change-point model. We unravel a family of 'intermediate' limits that can transition, at least qualitatively, to the limits in the two extreme scenarios. The theoretical results are illustrated via a set of carefully designed simulations. We also demonstrate how inference for the change-point parameter can be performed in the absence of knowledge of the underlying scenario by resorting to subsampling techniques that involve estimation of the convergence rate.
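
For readers unfamiliar with the setup, least-squares change-point estimation can be sketched in a few lines. The data, jump location, and noise level below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: a jump of size 1 at x = 0.5, observed with noise.
n = 500
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.where(x < 0.5, 1.0, 2.0) + rng.normal(0.0, 0.3, n)

def fit_change_point(x, y):
    """Least-squares (stump) change-point fit: for every candidate split,
    fit a constant on each side and keep the split minimizing the RSS."""
    best_rss, best_tau = np.inf, None
    for i in range(1, len(x)):
        left, right = y[:i], y[i:]
        rss = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if rss < best_rss:
            best_rss, best_tau = rss, x[i - 1]
    return best_tau

tau_hat = fit_change_point(x, y)  # lands near the true change point 0.5
```

With a genuine jump, the estimate concentrates tightly around the true split; fitting the same stump to data generated from a smooth curve is exactly the mis-specified regime the abstract studies, where localization becomes much slower.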

  7. Why item parcels are (almost) never appropriate: two wrongs do not make a right--camouflaging misspecification with item parcels in CFA models.

    Science.gov (United States)

    Marsh, Herbert W; Lüdtke, Oliver; Nagengast, Benjamin; Morin, Alexandre J S; Von Davier, Matthias

    2013-09-01

    The present investigation has a dual focus: to evaluate problematic practice in the use of item parcels and to suggest exploratory structural equation models (ESEMs) as a viable alternative to the traditional independent clusters confirmatory factor analysis (ICM-CFA) model (with no cross-loadings, subsidiary factors, or correlated uniquenesses). Typically, it is ill-advised to (a) use item parcels when ICM-CFA models do not fit the data, and (b) retain ICM-CFA models when items cross-load on multiple factors. However, the combined use of (a) and (b) is widespread and often provides such misleadingly good fit indexes that applied researchers might believe that misspecification problems are resolved--that 2 wrongs really do make a right. Taking a pragmatist perspective, in 4 studies we demonstrate with responses to the Rosenberg Self-Esteem Inventory (Rosenberg, 1965), Big Five personality factors, and simulated data that even small cross-loadings seriously distort relations among ICM-CFA constructs or even decisions on the number of factors; although obvious in item-level analyses, this is camouflaged by the use of parcels. ESEMs provide a viable alternative to ICM-CFAs and a test for the appropriateness of parcels. The use of parcels with an ICM-CFA model is most justifiable when the fit of both ICM-CFA and ESEM models is acceptable and equally good, and when substantively important interpretations are similar. However, if the ESEM model fits the data better than the ICM-CFA model, then the use of parcels with an ICM-CFA model typically is ill-advised--particularly in studies that are also interested in scale development, latent means, and measurement invariance.

  8. The impact of covariance misspecification in group-based trajectory models for longitudinal data with non-stationary covariance structure.

    Science.gov (United States)

    Davies, Christopher E; Glonek, Gary Fv; Giles, Lynne C

    2017-08-01

    One purpose of a longitudinal study is to gain a better understanding of how an outcome of interest changes among a given population over time. In what follows, a trajectory will be taken to mean the series of measurements of the outcome variable for an individual. Group-based trajectory modelling methods seek to identify subgroups of trajectories within a population, such that trajectories that are grouped together are more similar to each other than to trajectories in distinct groups. Group-based trajectory models generally assume a certain structure in the covariances between measurements, for example conditional independence, homogeneous variance between groups or stationary variance over time. Violations of these assumptions could be expected to result in poor model performance. We used simulation to investigate the effect of covariance misspecification on misclassification of trajectories in commonly used models under a range of scenarios. To do this we defined a measure of performance relative to the ideal Bayesian correct classification rate. We found that the more complex models generally performed better over a range of scenarios. In particular, incorrectly specified covariance matrices could significantly bias the results but using models with a correct but more complicated than necessary covariance matrix incurred little cost.

  9. Multilevel Models for Intensive Longitudinal Data with Heterogeneous Autoregressive Errors: The Effect of Misspecification and Correction with Cholesky Transformation

    Science.gov (United States)

    Jahng, Seungmin; Wood, Phillip K.

    2017-01-01

    Intensive longitudinal studies, such as ecological momentary assessment studies using electronic diaries, are gaining popularity across many areas of psychology. Multilevel models (MLMs) are the most widely used analytical tools for intensive longitudinal data (ILD). Although ILD often have individually distinct patterns of serial correlation of measures over time, inferences about the fixed effects and random components in MLMs are made under the assumption that all variance and autocovariance components are homogeneous across individuals. In the present study, we introduced a multilevel model with Cholesky transformation to model ILD with individually heterogeneous covariance structures. In addition, the performance of the transformation method and the effects of misspecifying a heterogeneous covariance structure were investigated through a Monte Carlo simulation. We found that, if individually heterogeneous covariances are incorrectly assumed to be homogeneous independent or homogeneous autoregressive, MLMs produce highly biased estimates of the variance of random intercepts and of the standard errors of the fixed intercept and the fixed effect of a level-2 covariate when the average autocorrelation is high. For intensive longitudinal data with individual-specific residual covariance, the suggested transformation method showed lower bias in those estimates than the misspecified models when the number of repeated observations within individuals is 50 or more. PMID:28286490

  10. Joint modelling of longitudinal and survival data: incorporating delayed entry and an assessment of model misspecification.

    Science.gov (United States)

    Crowther, Michael J; Andersson, Therese M-L; Lambert, Paul C; Abrams, Keith R; Humphreys, Keith

    2016-03-30

    A now common goal in medical research is to investigate the inter-relationships between a repeatedly measured biomarker, measured with error, and the time to an event of interest. This form of question can be tackled with a joint longitudinal-survival model, with the most common approach combining a longitudinal mixed effects model with a proportional hazards survival model, where the models are linked through shared random effects. In this article, we look at incorporating delayed entry (left truncation), which has received relatively little attention. The extension to delayed entry requires a second set of numerical integration, beyond that required in a standard joint model. We therefore implement two sets of fully adaptive Gauss-Hermite quadrature with nested Gauss-Kronrod quadrature (to allow time-dependent association structures), conducted simultaneously, to evaluate the likelihood. We evaluate fully adaptive quadrature compared with previously proposed non-adaptive quadrature through a simulation study, showing substantial improvements, both in terms of minimising bias and reducing computation time. We further investigate, through simulation, the consequences of misspecifying the longitudinal trajectory and its impact on estimates of association. Our scenarios showed the current value association structure to be very robust, compared with the rate of change, which we found to be highly sensitive, showing that assuming a simpler trend when the truth is more complex can lead to substantial bias. With emphasis on flexible parametric approaches, we generalise previous models by proposing the use of polynomials or splines to capture the longitudinal trend and restricted cubic splines to model the baseline log hazard function. The methods are illustrated on a dataset of breast cancer patients, modelling mammographic density jointly with survival, where we show how to incorporate density measurements prior to the at-risk period, to make use of all the available data.

  11. Structural Break Tests Robust to Regression Misspecification

    NARCIS (Netherlands)

    Abi Morshed, Alaa; Andreou, E.; Boldea, Otilia

    2016-01-01

    Structural break tests developed in the literature for regression models are sensitive to model misspecification. We show - analytically and through simulations - that the sup Wald test for breaks in the conditional mean and variance of a time series process exhibits severe size distortions when the…

  12. Estimating the mean and standard deviation of environmental data with below detection limit observations: Considering highly skewed data and model misspecification.

    Science.gov (United States)

    Shoari, Niloofar; Dubé, Jean-Sébastien; Chenouri, Shoja'eddin

    2015-11-01

    In environmental studies, concentration measurements frequently fall below detection limits of measuring instruments, resulting in left-censored data. Some studies employ parametric methods such as the maximum likelihood estimator (MLE), robust regression on order statistics (rROS), and gamma regression on order statistics (GROS), while others suggest a non-parametric approach, the Kaplan-Meier method (KM). Using examples of real data from a soil characterization study in Montreal, we highlight the need for additional investigations that aim at unifying the existing literature. A number of studies have examined this issue; however, those considering data skewness and model misspecification are rare. These aspects are investigated in this paper through simulations. Among other findings, results show that for low-skewed data, the performance of the different statistical methods is comparable, regardless of the censoring percentage and sample size. For highly skewed data, the performance of the MLE method under lognormal and Weibull distributions is questionable, particularly when the sample size is small or the censoring percentage is high. In such conditions, MLE under the gamma distribution, rROS, GROS, and KM are less sensitive to skewness. Regarding model misspecification, MLE based on lognormal and Weibull distributions provides poor estimates when the true distribution of the data is misspecified. However, the methods of rROS, GROS, and MLE under the gamma distribution are generally robust to model misspecification regardless of skewness, sample size, and censoring percentage. Since the characteristics of environmental data (e.g., type of distribution and skewness) are unknown a priori, we suggest using MLE based on the gamma distribution, rROS and GROS.
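
For readers who want to experiment, maximum likelihood for left-censored (below-detection-limit) data of the kind the abstract describes can be sketched as follows. The detection limit, sample size, and lognormal parameters are made-up illustrations, not values from the study.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)

# Illustrative setting: lognormal concentrations with a single detection
# limit; values below the limit are reported only as "<DL" (non-detects).
true_mu, true_sigma = 1.0, 0.5
conc = rng.lognormal(true_mu, true_sigma, 400)
dl = 2.0
observed = conc[conc >= dl]           # detected values
n_censored = int((conc < dl).sum())   # count of non-detects

def neg_loglik(params):
    """Censored lognormal likelihood: density term for each detected
    value, CDF mass below the detection limit for each non-detect."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)         # keeps sigma > 0
    ll = stats.lognorm.logpdf(observed, s=sigma, scale=np.exp(mu)).sum()
    ll += n_censored * stats.lognorm.logcdf(dl, s=sigma, scale=np.exp(mu))
    return -ll

res = optimize.minimize(neg_loglik, x0=[0.5, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
mean_hat = np.exp(mu_hat + sigma_hat**2 / 2)  # mean of the fitted lognormal
```

Swapping the lognormal for a Weibull or gamma family in the two `stats` calls is exactly the model-misspecification experiment the paper runs: fit one family while the data come from another and compare the recovered mean and standard deviation.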

  13. Reply: New results justify open discussion of alternative models

    Science.gov (United States)

    Newman, Andrew; Stein, Seth; Weber, John; Engeln, Joseph; Mao, Aitlin; Dixon, Timothy

    A millennium ago, Jewish sages wrote that “the rivalry of scholars increases wisdom.” In contrast, Schweig et al. (Eos, this issue) demand that “great caution” be exercised in discussing alternatives to their model of high seismic hazard in the New Madrid seismic zone (NMSZ). We find this view surprising; we have no objection to their and their coworkers' extensive efforts promoting their model in a wide variety of public media, but see no reason not to explore a lower-hazard alternative based on both new data and reanalysis of data previously used to justify their model. In our view, the very purpose of collecting new data and reassessing existing data is to promote spirited testing and improvement of existing hypotheses. For New Madrid, such open reexamination seems scientifically appropriate, given the challenge of understanding intraplate earthquakes, and socially desirable because of the public policy implications.

  14. Which level of model complexity is justified by your data? A Bayesian answer

    Science.gov (United States)

    Schöniger, Anneli; Illman, Walter; Wöhling, Thomas; Nowak, Wolfgang

    2016-04-01

    When judging the plausibility and utility of a subsurface flow or transport model, the question of justifiability arises: which level of model complexity can still be justified by the available calibration data? Although it is common sense that more data are needed to reasonably constrain the parameter space of a more complex model, there is a lack of tools that can objectively quantify model justifiability as a function of the available data. We propose an approach to determine model justifiability in the context of comparing alternative conceptual models. Our approach rests on Bayesian model averaging (BMA). BMA yields posterior model probabilities that point the modeler to an optimal trade-off between model performance in reproducing a given calibration data set and model complexity. To find out which level of complexity can be justified by the available data, we disentangle the complexity component of the trade-off from its performance counterpart. Technically, we remove the performance component from the BMA analysis by replacing the actually observed data values with potential measurement values as predicted by the models. Our proposed analysis results in a "model confusion matrix". Based on this matrix, the modeler can identify the maximum level of model complexity that could possibly be justified by the available amount and type of data. As a side product, model (dis-)similarity is revealed. We have applied the model justifiability analysis to a case of aquifer characterization via hydraulic tomography. Four models of vastly different complexity have been proposed to represent the heterogeneity in hydraulic conductivity of a sandbox aquifer, ranging from a homogeneous medium to geostatistical random fields. We have used drawdown data from two to six pumping tests to condition the models and to determine model justifiability as a function of data set size. Our test case shows that a geostatistical parameterization scheme requires a substantial amount of data.
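
At its core, the BMA machinery the abstract relies on turns model evidences into posterior model probabilities. A minimal sketch, with made-up evidence values (not from the study), looks like this:

```python
import numpy as np

# Hypothetical log marginal likelihoods (evidences) p(D | M_k) for three
# competing conceptual models; the numbers are invented for illustration.
log_evidence = np.array([-120.3, -118.7, -125.0])
prior = np.ones_like(log_evidence) / log_evidence.size  # equal model priors

# Posterior model probabilities: p(M_k | D) is proportional to
# p(D | M_k) * p(M_k); subtract the max log evidence for numerical stability.
w = np.exp(log_evidence - log_evidence.max()) * prior
posterior = w / w.sum()  # the performance/complexity trade-off lives here
```

The justifiability analysis in the paper repeats this computation with model-generated data in place of the observations, so that the resulting probabilities reflect complexity alone.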

  15. The Relationship between Root Mean Square Error of Approximation and Model Misspecification in Confirmatory Factor Analysis Models

    Science.gov (United States)

    Savalei, Victoria

    2012-01-01

    The fit index root mean square error of approximation (RMSEA) is extremely popular in structural equation modeling. However, its behavior under different scenarios remains poorly understood. The present study generates continuous curves where possible to capture the full relationship between RMSEA and various "incidental parameters," such as…

  16. A Critical Review and Simulation Analysis of Measurement Model Misspecifications in Chinese Management Research

    Institute of Scientific and Technical Information of China (English)

    王念新; 仲伟俊; 梅姝娥

    2011-01-01

    In empirical research, the relationship between constructs and their observed variables is often overlooked. This paper draws a detailed distinction between reflective and formative measurement models. Reviewing the empirical papers that used structural equation modeling (SEM) as the data-analysis tool in three Chinese management journals between 2002 and 2007, we find that measurement model misspecification is widespread in Chinese management research. Monte Carlo simulation shows that misspecifying the measurement model significantly inflates or deflates the relevant path coefficients and may induce both Type I and Type II errors. Because existing SEM software cannot handle formative measurement models, we propose a model refinement method and a model decomposition method that convert formative measurement models into reflective ones. Structural Equation Modeling (SEM) is a prevalent research method in Chinese management studies. SEM can be divided into measurement models and structural models. A well-specified measurement model is a prerequisite to analyzing the structural model. Reflective and formative measurement models are two different measurement models. Measurement model misspecification has been a serious problem in the fields of Marketing, Organizational Behavior, and Management Information Systems, and can lead to invalid research results. Many approaches can facilitate the internationalization and normalization of SEM in Chinese management studies. These approaches include the discussion of the difference between reflective and formative measurement models, analysis of measurement model misspecification problems, and investigation of the consequences of measurement model misspecification. In the first part, we discuss major differences between reflective and formative measurement models. In the reflective measurement model, constructs are treated as causes of measures, and the measures are reflective manifestations of underlying constructs. In the formative measurement model, measures are specified as causes of the construct…

  17. Detection of Q-Matrix Misspecification Using Two Criteria for Validation of Cognitive Structures under the Least Squares Distance Model

    Science.gov (United States)

    Romero, Sonia J.; Ordoñez, Xavier G.; Ponsoda, Vincente; Revuelta, Javier

    2014-01-01

    Cognitive Diagnostic Models (CDMs) aim to provide information about the degree to which individuals have mastered specific attributes that underlie the success of these individuals on test items. The Q-matrix is a key element in the application of CDMs, because it contains the item-attribute links that represent the cognitive structure proposed for solving…

  18. Are stock prices too volatile to be justified by the dividend discount model?

    Science.gov (United States)

    Akdeniz, Levent; Salih, Aslıhan Altay; Ok, Süleyman Tuluğ

    2007-03-01

    This study investigates excess stock price volatility using the variance bound framework of LeRoy and Porter [The present-value relation: tests based on implied variance bounds, Econometrica 49 (1981) 555-574] and of Shiller [Do stock prices move too much to be justified by subsequent changes in dividends? Am. Econ. Rev. 71 (1981) 421-436.]. The conditional variance bound relationship is examined using cross-sectional data simulated from the general equilibrium asset pricing model of Brock [Asset prices in a production economy, in: J.J. McCall (Ed.), The Economics of Information and Uncertainty, University of Chicago Press, Chicago (for N.B.E.R.), 1982]. Results show that the conditional variance bounds hold, hence, our hypothesis of the validity of the dividend discount model cannot be rejected. Moreover, in our setting, markets are efficient and stock prices are neither affected by herd psychology nor by the outcome of noise trading by naive investors; thus, we are able to control for market efficiency. Consequently, we show that one cannot infer any conclusions about market efficiency from the unconditional variance bounds tests.

  19. Misspecification Effects in the Analysis of Panel Data

    Directory of Open Access Journals (Sweden)

    Vieira, Marcel de Toledo

    2016-06-01

    Misspecification effects (meffs) measure the effect on the sampling variance of an estimator of incorrect specification of both the sampling scheme and the model considered. We assess the effect of various features of complex sampling schemes on the inferences drawn from models for panel data using meffs. Many longitudinal social survey designs employ multistage sampling, leading to some clustering, which tends to lead to meffs greater than unity. An empirical study using data from the British Household Panel Survey is conducted, and a simulation study is performed. Our results suggest that clustering impacts are stronger for longitudinal studies than for cross-sectional studies, and that meffs for the regression coefficients increase with the number of waves analysed. Hence, estimated standard errors in the analysis of panel data can be misleading if any clustering is ignored.
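
A misspecification effect of the kind discussed here can be illustrated, for the simplest case of a sample mean under clustering, as a ratio of two variance estimates. The design below (50 clusters of 10, unit variance components) is a hypothetical example, not the British Household Panel Survey design.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical two-stage design: 50 clusters of 10 units, with a shared
# cluster effect inducing intra-cluster correlation (ICC = 0.5 here).
n_clusters, m = 50, 10
cluster_effect = rng.normal(0.0, 1.0, n_clusters)
y = (cluster_effect[:, None] + rng.normal(0.0, 1.0, (n_clusters, m))).ravel()
n = len(y)

# Naive variance of the sample mean, ignoring clustering entirely.
v_naive = y.var(ddof=1) / n

# Design-based variance via cluster totals (equal-sized clusters);
# equivalent to the variance of the mean of the cluster means.
totals = y.reshape(n_clusters, m).sum(axis=1)
v_cluster = totals.var(ddof=1) * n_clusters / n**2

meff = v_cluster / v_naive  # exceeds 1 when clustering is ignored
```

With these variance components the ratio comes out well above 1, mirroring the paper's point that naive standard errors understate uncertainty in clustered panel designs.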

  20. The Use of the l_z and l_z* Person-Fit Statistics and Problems Derived from Model Misspecification

    Science.gov (United States)

    Meijer, Rob R.; Tendeiro, Jorge N.

    2012-01-01

    We extend a recent didactic by Magis, Raîche, and Béland on the use of the l_z and l_z* person-fit statistics. We discuss a number of possibly confusing details and show that it is important to first investigate item response theory model fit before assessing person fit. Furthermore, it is argued that appropriate…

  1. Beyond Conflict and Spoilt Identities: How Rwandan Leaders Justify a Single Recategorization Model for Post-Conflict Reconciliation

    Directory of Open Access Journals (Sweden)

    Sigrun Marie Moss

    2014-08-01

    Since 1994, the Rwandan government has attempted to remove the division of the population into the ‘ethnic’ identities Hutu, Tutsi and Twa and instead make the shared Rwandan identity salient. This paper explores how leaders justify the single recategorization model, based on nine in-depth semi-structured interviews with Rwandan national leaders (politicians and bureaucrats tasked with leading unity implementation) conducted in Rwanda over three months in 2011/2012. Thematic analysis revealed this was done through a meta-narrative focusing on the shared Rwandan identity. Three frames were found in use to “sell” this narrative, in which ethnic identities are presented as (a) an alien construction; (b) a construction that was used to the disadvantage of the people; and (c) non-essential social constructs. The material demonstrates the identity entrepreneurship behind the single recategorization approach: the definition of the category boundaries, the category content, and the strategies for controlling and overcoming alternative narratives. Rwandan identity is presented as essential and legitimate, and as offering a potential way for people to escape spoilt subordinate identities. The interviewed leaders insist Rwandans are all one, and that single recategorization is the right path for Rwanda, but this approach has been criticised for increasing rather than decreasing intergroup conflict due to social identity threat. The Rwandan case offers a rare opportunity to explore leaders’ own narratives and framing of these ‘ethnic’ identities to justify the single recategorization approach.

  2. Modelling severe Staphylococcus aureus sepsis in conscious pigs: are implications for animal welfare justified?

    DEFF Research Database (Denmark)

    Olsen, Helle G; Kjelgaard-Hansen, Mads; Tveden-Nyborg, Pernille

    2016-01-01

    A porcine model of haematogenous Staphylococcus aureus sepsis has previously been established in our research group. In these studies, pigs developed severe sepsis including liver dysfunction during a 48 h study period. As pigs were awake during the study, animal welfare was challenged...

  3. Justifying Action Research

    Science.gov (United States)

    Helskog, Guro Hansen

    2014-01-01

    In this paper I use a general philosophy of science perspective in looking at the problem of justifying action research. First I try to clarify the concept of justification, by contrasting it with the concept of validity, which seems to be used almost as a synonym in some parts of the literature. I discuss the need for taking a stand in relation…

  4. Justifying departures from progressivity

    DEFF Research Database (Denmark)

    Heinemann, Trine; Steensig, Jakob

    2017-01-01

    This chapter investigates the use of the Danish particle altså in turn-initial position. Turn-initial altså can be employed for prefacing a wide range of actions, including self- and other-initiated repair, questions, second stories and answers to both yes/no and wh-questions. We show that across these actions, participants in interaction produce altså to indicate (1) that the action they will produce departs from progressivity, (2) that it will expand on something prior, (3) that the departure is, therefore, justified, and (4) that it will contribute to reinstalling the progression of the larger on-going activity. Some of the actions that altså prefaces can also be prefaced by phrases like ‘you know’ or ‘I mean’, which seem to do at least some of the work that altså does, but altså is used more frequently and across a wider range of actions. In our discussion, we raise the possibility that the usefulness…

  5. Second order pseudo-maximum likelihood estimation and conditional variance misspecification

    OpenAIRE

    Lejeune, Bernard

    1997-01-01

    In this paper, we study the behavior of second order pseudo-maximum likelihood estimators under conditional variance misspecification. We determine sufficient and essentially necessary conditions for such an estimator to be, regardless of the conditional variance (mis)specification, consistent for the mean parameters when the conditional mean is correctly specified. These conditions imply that, even if mean and variance parameters vary independently, standard PML2 estimators are generally not…

  6. The Performance of the Linear Logistic Test Model When the Q-Matrix Is Misspecified: A Simulation Study

    Science.gov (United States)

    MacDonald, George T.

    2014-01-01

    A simulation study was conducted to explore the performance of the linear logistic test model (LLTM) when the relationships between items and cognitive components were misspecified. Factors manipulated included percent of misspecification (0%, 1%, 5%, 10%, and 15%), form of misspecification (under-specification, balanced misspecification, and…

  7. IRT Model Misspecification and Measurement of Growth in Vertical Scaling

    Science.gov (United States)

    Bolt, Daniel M.; Deng, Sien; Lee, Sora

    2014-01-01

    Functional form misfit is frequently a concern in item response theory (IRT), although the practical implications of misfit are often difficult to evaluate. In this article, we illustrate how seemingly negligible amounts of functional form misfit, when systematic, can be associated with significant distortions of the score metric in vertical…

  8. Supplier-induced demand: re-examining identification and misspecification in cross-sectional analysis.

    Science.gov (United States)

    Peacock, Stuart J; Richardson, Jeffrey R J

    2007-09-01

    This paper re-examines criticisms of cross-sectional methods used to test for supplier-induced demand (SID) and re-evaluates the empirical evidence using data from Australian medical services. Cross-sectional studies of SID have been criticised on two grounds. First, and most important, the inclusion of the doctor supply in the demand equation leads to an identification problem. This criticism is shown to be invalid, as the doctor supply variable is stochastic and depends upon a variety of other variables including the desirability of the location. Second, cross-sectional studies of SID fail diagnostic tests and produce artefactual findings due to model misspecification. Contrary to this, the re-evaluation of cross-sectional Australian data indicate that demand equations that do not include the doctor supply are misspecified. Empirical evidence from the re-evaluation of Australian medical services data supports the notion of SID. Demand and supply equations are well specified and have very good explanatory power. The demand equation is identified and the desirability of a location is an important predictor of the doctor supply. Results show an average price elasticity of demand of 0.22 and an average elasticity of demand with respect to the doctor supply of 0.46, with the impact of SID becoming stronger as the doctor supply rises. The conclusion we draw from this paper is that two of the main criticisms of the empirical evidence supporting the SID hypothesis have been inappropriately levelled at the methods used. More importantly, SID provides a satisfactory, and robust, explanation of the empirical data on the demand for medical services in Australia.

  9. The mis-specification of the expected rescaled adjusted range

    Science.gov (United States)

    Ellis, Craig

    2006-05-01

Rescaled range analysis has regained popularity in the recent econophysics literature as a means of identifying long-term dependence in time-series data. Conclusions derived from the rescaled adjusted range statistic are, however, conditional upon the choice of an appropriate benchmark against which calculated results can be compared. One recent paper in Physica A by Couillard and Davison [Physica A 348 (2005) 404] concludes that the Anis and Lloyd [Biometrika 63 (1976) 111] model of the expected rescaled adjusted range is more accurate than that proposed by Peters [Fractal Market Analysis, Wiley, New York, 1994]. This finding is contrary to the evidence presented by Peters. This paper reveals significant inconsistencies in the empirical results reported by Peters which, when considered, support the conclusions of Couillard and Davison and explain the apparent contradiction between their results and those of Peters.
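For readers unfamiliar with the statistic, the empirical rescaled adjusted range and an Anis-Lloyd-style large-n benchmark can be sketched as follows (a rough illustration on simulated white noise; the exact small-sample correction terms of Anis and Lloyd's formula are not reproduced here):

```python
import math
import random

def rescaled_range(x):
    """Empirical rescaled adjusted range R/S of a series:
    range of cumulative mean-deviations divided by the standard deviation."""
    n = len(x)
    m = sum(x) / n
    dev = [xi - m for xi in x]
    z, cum = [], 0.0
    for d in dev:
        cum += d
        z.append(cum)
    r = max(z) - min(z)
    s = math.sqrt(sum(d * d for d in dev) / n)
    return r / s

def expected_rs_large_n(n):
    """Large-n approximation of E[R/S]_n in the spirit of Anis and Lloyd
    (an assumption-laden transcription, for illustration only)."""
    return (1.0 / math.sqrt(n * math.pi / 2.0)) * sum(
        math.sqrt((n - i) / i) for i in range(1, n)
    )

random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(500)]
print(rescaled_range(noise), expected_rs_large_n(500))
```

For independent Gaussian noise the empirical R/S should fluctuate around the benchmark; persistent deviations above it are what rescaled range analysis reads as long-term dependence.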

  10. Justifying the exotic Theta+ pentaquark

    CERN Document Server

    Diakonov, Dmitri

    2009-01-01

    The existence of a light S=+1 baryon resonance follows from Quantum Field Theory applied to baryons. This is illustrated in the Skyrme model (where Theta+ exists but is too strong) and in a new mean field approach where Theta+ arises as a consequence of three known resonances: Lambda(1405), N(1440) and N(1535).

  11. The Self-Justifying Desire for Happiness

    DEFF Research Database (Denmark)

    Rodogno, Raffaele

    2004-01-01

    In Happiness, Tabensky equates the notion of happiness to Aristotelian eudaimonia. I shall claim that doing so amounts to equating two concepts that moderns cannot conceptually equate, namely, the good for a person and the good person or good life. In §2 I examine the way in which Tabensky deals...... with this issue and claim that his idea of happiness is as problematic for us moderns as is any translation of the notion of eudaimonia in terms of happiness. Naturally, if happiness understood as eudaimonia is ambiguous, so will be the notion of a desire for happiness, which we find at the core of Tabensky......'s whole project. In §3 I shall be concerned with another aspect of the desire for happiness; namely, its alleged self-justifying nature. I will attempt to undermine the idea that this desire is self-justifying by undermining the criterion on which Tabensky takes self-justifiability to rest, i.e. its...

  12. Are entry criteria for cataract surgery justified?

    Directory of Open Access Journals (Sweden)

    Daniel Böhringer

Full Text Available PURPOSE: The German Ophthalmological Society (GOS) recently proposed surgical entry criteria, i.e. 300 cataract surgeries. We herein correlate surgical hands-on experience with the risk of posterior capsule ruptures in order to assess whether this number is appropriate. METHODS: We identified all cataract operations that had been performed at the University Eye Hospital Freiburg since 1995. For each surgeon, we assigned a running number to his/her procedures in the order they had been performed. Thereafter, we excluded all combined procedures and the second eyes. We then selected the 5475 surgical reports between November 2008 and November 2012 for detailed review. We additionally classified each surgery as low vs. high a priori risk for posterior capsule ruptures. We fitted a multifactorial logistic regression model to assess the GOS recommendation of 300 surgeries under supervision. In the low-risk group, we additionally visualized the 'typical' learning curve by plotting the posterior capsule ruptures against the respective rank numbers. RESULTS: The odds ratio for posterior capsule ruptures of 'learning mode' (one of the respective surgeon's first 300 procedures) vs. non-learning mode was 3.8 (p<0.0001). By contrast, classification into the low-risk group lowered the risk of posterior capsule ruptures threefold (p<0.0001). According to the low-risk plot, the surgeons started with a complication rate of 4% and continuously improved towards 0.5% after 1500 operations. Thereafter, the rate increased again and stabilized around one percent. CONCLUSION: The learning curve with respect to posterior capsule ruptures is surprisingly flat. The GOS entry criterion of 300 cataract procedures is therefore most likely justified. Careful selection of low-risk patients for the training surgeons may help in reducing the rate of posterior capsule ruptures during training.
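For intuition about the scale of the reported effect, an odds ratio can be computed from a 2x2 table. A toy sketch (the counts below are invented purely to mimic the reported effect size; the study's actual estimate came from a multifactorial logistic regression):

```python
def odds_ratio(exposed_events, exposed_nonevents, unexposed_events, unexposed_nonevents):
    """Odds ratio from a 2x2 table: odds of the event among the exposed
    group divided by the odds among the unexposed group."""
    return (exposed_events / exposed_nonevents) / (unexposed_events / unexposed_nonevents)

# Hypothetical counts: 30 ruptures in 1000 learning-mode cases vs.
# 36 ruptures in 4475 non-learning-mode cases.
or_learning = odds_ratio(30, 970, 36, 4439)
print(round(or_learning, 2))  # → 3.81
```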

  13. Modelling and forecasting WIG20 daily returns

    DEFF Research Database (Denmark)

    Amado, Cristina; Silvennoinen, Annestiina; Terasvirta, Timo

    of the model is that the deterministic component is specified before estimating the multiplicative conditional variance component. The resulting model is subjected to misspecification tests and its forecasting performance is compared with that of commonly applied models of conditional heteroskedasticity....

  14. On Three Ways to Justify Religious Beliefs

    NARCIS (Netherlands)

    Brümmer, V.

    2001-01-01

This paper compares the ways in which revealed theology, natural theology and philosophical theology justify religious belief. Revealed theology does so with an appeal to revelation and natural theology with an appeal to reason and perception. It is argued that both are inadequate. Philosophical theology…

  15. Can drug patents be morally justified?

    Science.gov (United States)

    Sterckx, Sigrid

    2005-01-01

    This paper offers a few elements of an answer to the question to what extent drug patents can be morally justified. Justifications based on natural rights, distributive justice and utilitarian arguments are discussed and criticized. The author recognizes the potential of the patents to benefit society but argues that the system is currently evolving in the wrong direction, particularly in the field of drugs. More than a third of the world's population has no access to essential drugs. The working of the patent system is an important determinant of access to drugs. This paper argues that drug patents are not easily justified and that the 'architecture' of the patent system should be rethought in view of its mission of benefiting society.

  16. Value-Based Argumentation for Justifying Compliance

    Science.gov (United States)

    Burgemeestre, Brigitte; Hulstijn, Joris; Tan, Yao-Hua

Compliance is often achieved 'by design' through a coherent system of controls consisting of information systems and procedures. This system-based control requires a new approach to auditing in which companies must demonstrate to the regulator that they are 'in control'. They must determine the relevance of a regulation for their business, justify which set of control measures they have taken to comply with it, and demonstrate that the control measures are operationally effective. In this paper we show how value-based argumentation theory can be applied to the compliance domain. Corporate values motivate the selection of control measures (actions) which aim to fulfill control objectives, i.e. adopted norms (goals). In particular, we show how to formalize the dialogue in which companies justify their compliance decisions to regulators using value-based argumentation. The approach is illustrated by a case study of the safety and security measures adopted in the context of EU customs regulation.

  17. Justifying clinical trials for porcine islet xenotransplantation.

    Science.gov (United States)

    Ellis, Cara E; Korbutt, Gregory S

    2015-01-01

    The development of the Edmonton Protocol encouraged a great deal of optimism that a cell-based cure for type I diabetes could be achieved. However, donor organ shortages prevent islet transplantation from being a widespread solution as the supply cannot possibly equal the demand. Porcine islet xenotransplantation has the potential to address these shortages, and recent preclinical and clinical trials show promising scientific support. Consequently, it is important to consider whether the current science meets the ethical requirements for moving toward clinical trials. Despite the potential risks and the scientific unknowns that remain to be investigated, there is optimism regarding the xenotransplantation of some types of tissue, and enough evidence has been gathered to ethically justify clinical trials for the most safe and advanced area of research, porcine islet transplantation. Researchers must make a concerted effort to maintain a positive image for xenotransplantation, as a few well-publicized failed trials could irrevocably damage public perception of xenotransplantation. Because all of society carries the burden of risk, it is important that the public be involved in the decision to proceed. As new information from preclinical and clinical trials develops, policy decisions should be frequently updated. If at any point evidence shows that islet xenotransplantation is unsafe, then clinical trials will no longer be justified and they should be halted. However, as of now, the expected benefit of an unlimited supply of islets, combined with adequate informed consent, justifies clinical trials for islet xenotransplantation.

  18. Misspecifications of stimulus presentation durations in experimental psychology: a systematic review of the psychophysics literature.

    Directory of Open Access Journals (Sweden)

    Tobias Elze

Full Text Available BACKGROUND: In visual psychophysics, precise display timing, particularly for brief stimulus presentations, is often required. The aim of this study was to systematically review the commonly applied methods for the computation of stimulus durations in psychophysical experiments and to contrast them with the true luminance signals of stimuli on computer displays. METHODOLOGY/PRINCIPAL FINDINGS: In a first step, we systematically scanned the citation index Web of Science for studies with experiments with stimulus presentations for brief durations. Articles which appeared between 2003 and 2009 in three different journals were taken into account if they contained experiments with stimuli presented for less than 50 milliseconds. The 79 articles that matched these criteria were reviewed for their method of calculating stimulus durations. For those 75 studies where the method was either given or could be inferred, stimulus durations were calculated by the sum of frames (SOF) method. In a second step, we describe the luminance signal properties of the two monitor technologies which were used in the reviewed studies, namely cathode ray tube (CRT) and liquid crystal display (LCD) monitors. We show that SOF is inappropriate for brief stimulus presentations on both of these technologies. In extreme cases, SOF specifications and true stimulus durations are even unrelated. Furthermore, the luminance signals of the two monitor technologies are so fundamentally different that the duration of briefly presented stimuli cannot be calculated by a single method for both technologies. Statistics over stimulus durations given in the reviewed studies are discussed with respect to different duration calculation methods. CONCLUSIONS/SIGNIFICANCE: The SOF method for duration specification which was clearly dominating in the reviewed studies leads to serious misspecifications particularly for brief stimulus presentations. We strongly discourage its use for brief stimulus presentations on CRT and LCD monitors.

  19. Misspecifications of stimulus presentation durations in experimental psychology: a systematic review of the psychophysics literature.

    Science.gov (United States)

    Elze, Tobias

    2010-09-29

    In visual psychophysics, precise display timing, particularly for brief stimulus presentations, is often required. The aim of this study was to systematically review the commonly applied methods for the computation of stimulus durations in psychophysical experiments and to contrast them with the true luminance signals of stimuli on computer displays. In a first step, we systematically scanned the citation index Web of Science for studies with experiments with stimulus presentations for brief durations. Articles which appeared between 2003 and 2009 in three different journals were taken into account if they contained experiments with stimuli presented for less than 50 milliseconds. The 79 articles that matched these criteria were reviewed for their method of calculating stimulus durations. For those 75 studies where the method was either given or could be inferred, stimulus durations were calculated by the sum of frames (SOF) method. In a second step, we describe the luminance signal properties of the two monitor technologies which were used in the reviewed studies, namely cathode ray tube (CRT) and liquid crystal display (LCD) monitors. We show that SOF is inappropriate for brief stimulus presentations on both of these technologies. In extreme cases, SOF specifications and true stimulus durations are even unrelated. Furthermore, the luminance signals of the two monitor technologies are so fundamentally different that the duration of briefly presented stimuli cannot be calculated by a single method for both technologies. Statistics over stimulus durations given in the reviewed studies are discussed with respect to different duration calculation methods. The SOF method for duration specification which was clearly dominating in the reviewed studies leads to serious misspecifications particularly for brief stimulus presentations. We strongly discourage its use for brief stimulus presentations on CRT and LCD monitors.
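The sum-of-frames calculation the review criticizes is simple enough to state in a few lines (a sketch; the function name is ours, not from the paper):

```python
def sof_duration_ms(n_frames, refresh_hz):
    """Nominal 'sum of frames' (SOF) duration: frame count x frame period.
    On a CRT each frame is a brief luminance pulse, and LCD pixels take
    time to switch, so the true luminance signal can differ markedly
    from this nominal figure for brief presentations."""
    return n_frames * 1000.0 / refresh_hz

print(sof_duration_ms(2, 100))  # nominal 20.0 ms for 2 frames at 100 Hz
```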

  20. On Model Specification and Selection of the Cox Proportional Hazards Model*

    OpenAIRE

    Lin, Chen-Yen; Halabi, Susan

    2013-01-01

Prognosis plays a pivotal role in patient management and trial design. A useful prognostic model should correctly identify important risk factors and estimate their effects. In this article, we discuss several challenges in selecting prognostic factors and estimating their effects using the Cox proportional hazards model. Although a flexible semiparametric form, the Cox model is not entirely exempt from model misspecification. To minimize possible misspecification, instead of imposing traditional...

  1. 7 CFR 48.7 - Evidence to justify dumping.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Evidence to justify dumping. 48.7 Section 48.7... Dumping § 48.7 Evidence to justify dumping. Any person, receiving produce in interstate commerce or in the..., prior to such destroying, abandoning, discarding or dumping, obtain a dumping certificate or...

  2. Justified Belief and the Topology of Evidence

    NARCIS (Netherlands)

    Baltag, A.; Bezhanishvili, N.; Özgün, A.; Smets, S.J.L.; Väänänen, J.; Hirvonen, Å.; de Queiroz, R.

    2016-01-01

    We introduce a new topological semantics for evidence, evidence-based justifications, belief and knowledge. This setting builds on the evidence model framework of van Benthem and Pacuit, as well as our own previous work on (a topological semantics for) Stalnaker’s doxastic-epistemic axioms. We prove

  4. Multivariate GARCH models

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Teräsvirta, Timo

    This article contains a review of multivariate GARCH models. Most common GARCH models are presented and their properties considered. This also includes nonparametric and semiparametric models. Existing specification and misspecification tests are discussed. Finally, there is an empirical example...... in which several multivariate GARCH models are fitted to the same data set and the results compared....

  6. Justifying genetics as a possible legal defence to criminal ...

    African Journals Online (AJOL)

    Justifying genetics as a possible legal defence to criminal responsibility in Nigeria. ... on the relationship between nature and nurture (genes versus environment). ... who commit murder due to one psychotic or hereditary mental disorders end ...

  7. Justifying Definitions in Mathematics---Going Beyond Lakatos

    OpenAIRE

    Werndl, Charlotte

    2013-01-01

    This paper addresses the actual practice of justifying definitions in mathematics. First, I introduce the main account of this issue, namely Lakatos's proof-generated definitions. Based on a case study of definitions of randomness in ergodic theory, I identify three other common ways of justifying definitions: natural-world-justification, condition-justification and redundancy-justification. Also, I clarify the interrelationships between the different kinds of justification. Finally, I point ...

  8. Screening for foot problems in children: is this practice justifiable?

    Directory of Open Access Journals (Sweden)

    Evans Angela

    2012-07-01

Full Text Available Abstract Podiatry screening of children is a common practice, which occurs largely without adequate data to support the need for such activity. Such programs may be either formalised or more ad hoc in nature, depending upon the use of guidelines or existing models. Although often not used, the well-established criteria for assessing the merits of screening programs can greatly increase the understanding as to whether such practices are actually worthwhile. This review examines the purpose of community health screening in the Australian context, as occurs for tuberculosis, breast, cervical and prostate cancers, and then examines podiatry screening practices for children with reference to the criteria of the World Health Organisation (WHO). Topically, the issue of paediatric foot posture forms the focus of this review, as it presents with great frequency to a range of clinicians. Comparison is made with developmental dysplasia of the hip, in which instance the WHO criteria are well met. Considering that the burden of the condition being screened for must be demonstrable, and that early identification must be found to be beneficial, in order to justify a screening program, there is no sound support for either continuing or establishing podiatry screenings for children.

  9. Calculation-experimental method justifies the life of wagons

    Directory of Open Access Journals (Sweden)

    Валерія Сергіївна Воропай

    2015-11-01

Full Text Available The article proposes a method for evaluating the technical state of tank wagons operating in the chemical industry. An algorithm for evaluating the technical state of tank wagons was developed that makes it possible, on the basis of diagnosis and analysis of the current condition, to justify a further period of operation. A complex of works on testing the tanks, together with mathematical models for calculating design strength and reliability, is proposed. The article is devoted to solving the problem of effective exploitation of the working fleet of tank wagons. Opportunities for further exploitation of the cars, a complex of works on the assessment of their technical state, and the calculation of their resources are proposed. Engineering research on the chemical industry's wagon park has reduced the shortage of rolling stock for the transportation of ammonia. An analysis was made of the numerous chassis faults and of the main elements of the tank wagons' supporting structure after 20 years of exploitation. An algorithm for determining the residual life of specialized tank wagons operating in an industrial plant is proposed. A procedure for resource conservation of tank wagons carrying cargo under high pressure is proposed for the first time. The improved procedure for identifying residual life proposed in the article has both theoretical and practical importance.

  10. Ascertainment-adjusted parameter estimation approach to improve robustness against misspecification of health monitoring methods

    Science.gov (United States)

    Juesas, P.; Ramasso, E.

    2016-12-01

Condition monitoring aims at ensuring system safety, which is a fundamental requirement for industrial applications and has become an inescapable social demand. This objective is attained by instrumenting the system and developing data analytics methods, such as statistical models, able to turn data into relevant knowledge. One difficulty is to correctly estimate the parameters of those methods based on time-series data. This paper suggests the use of the Weighted Distribution Theory together with the Expectation-Maximization algorithm to improve parameter estimation in statistical models with latent variables, with an application to health monitoring under uncertainty. The improvement of estimates is made possible by incorporating uncertain and possibly noisy prior knowledge on latent variables in a sound manner. The latent variables are exploited to build a degradation model of a dynamical system represented as a sequence of discrete states. Examples on Gaussian Mixture Models and Hidden Markov Models (HMM) with discrete and continuous outputs are presented on both simulated data and benchmarks using the turbofan engine datasets. A focus on the application of a discrete HMM to health monitoring under uncertainty emphasizes the interest of the proposed approach in the presence of different operating conditions and fault modes. It is shown that the proposed model exhibits high robustness in the presence of noisy and uncertain prior knowledge.
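For readers unfamiliar with the estimation machinery involved, here is a plain (unweighted) EM fit of a two-component 1-D Gaussian mixture on simulated data; the paper's ascertainment-adjusted weighting of prior knowledge on the latent variables is not reproduced in this sketch:

```python
import math
import random

def norm_pdf(x, mu, sigma):
    """Density of a 1-D Gaussian."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def em_gmm_1d(data, mu, sigma, pi, n_iter=50):
    """EM for a two-component 1-D Gaussian mixture.
    pi is the weight of component 0; mu and sigma are two-element lists."""
    for _ in range(n_iter):
        # E-step: responsibility of component 0 for each point.
        resp = []
        for x in data:
            p0 = pi * norm_pdf(x, mu[0], sigma[0])
            p1 = (1 - pi) * norm_pdf(x, mu[1], sigma[1])
            resp.append(p0 / (p0 + p1))
        # M-step: responsibility-weighted means, variances and weight.
        n0 = sum(resp)
        n1 = len(data) - n0
        mu = [sum(r * x for r, x in zip(resp, data)) / n0,
              sum((1 - r) * x for r, x in zip(resp, data)) / n1]
        sigma = [math.sqrt(sum(r * (x - mu[0]) ** 2 for r, x in zip(resp, data)) / n0),
                 math.sqrt(sum((1 - r) * (x - mu[1]) ** 2 for r, x in zip(resp, data)) / n1)]
        pi = n0 / len(data)
    return mu, sigma, pi

random.seed(1)
data = [random.gauss(0.0, 1.0) for _ in range(200)] + [random.gauss(5.0, 1.0) for _ in range(200)]
mu, sigma, pi = em_gmm_1d(data, mu=[-1.0, 6.0], sigma=[1.0, 1.0], pi=0.5)
print(mu, pi)
```

The weighted-distribution idea in the paper amounts to reweighting the E-step responsibilities with (possibly noisy) prior information on the latent states, rather than using the likelihood alone as above.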

  11. Justifying Definitions in Mathematics---Going Beyond Lakatos

    CERN Document Server

    Werndl, Charlotte

    2013-01-01

    This paper addresses the actual practice of justifying definitions in mathematics. First, I introduce the main account of this issue, namely Lakatos's proof-generated definitions. Based on a case study of definitions of randomness in ergodic theory, I identify three other common ways of justifying definitions: natural-world-justification, condition-justification and redundancy-justification. Also, I clarify the interrelationships between the different kinds of justification. Finally, I point out how Lakatos's ideas are limited: they fail to show that various kinds of justification can be found and can be reasonable, and they fail to acknowledge the interplay between the different kinds of justification.

  12. Boosting the accuracy of hedonic pricing models

    NARCIS (Netherlands)

    M.C. van Wezel (Michiel); M. Kagie (Martijn); R. Potharst (Rob)

    2005-01-01

Hedonic pricing models attempt to model a relationship between object attributes and the object's price. Traditional hedonic pricing models are often parametric models that suffer from misspecification. In this paper we create these models by means of boosted CART models. The method is
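Although the record is truncated, the core idea of boosting can be sketched with depth-1 regression trees (stumps) fitted repeatedly to residuals under squared loss; the data and function names below are illustrative only, not the authors' model:

```python
import random

def fit_stump(x, y):
    """Best single-split regression stump on 1-D data (minimizes SSE)."""
    best = None
    order = sorted(range(len(x)), key=lambda i: x[i])
    for k in range(1, len(x)):
        thr = (x[order[k - 1]] + x[order[k]]) / 2.0
        left = [y[i] for i in order[:k]]
        right = [y[i] for i in order[k:]]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = sum((v - ml) ** 2 for v in left) + sum((v - mr) ** 2 for v in right)
        if best is None or sse < best[0]:
            best = (sse, thr, ml, mr)
    _, thr, ml, mr = best
    return lambda v: ml if v <= thr else mr

def boost(x, y, n_rounds=100, lr=0.1):
    """Gradient boosting under squared loss: each round fits a stump to
    the current residuals and adds a shrunken copy to the ensemble."""
    pred = [0.0] * len(x)
    stumps = []
    for _ in range(n_rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        s = fit_stump(x, resid)
        stumps.append(s)
        pred = [pi + lr * s(xi) for pi, xi in zip(pred, x)]
    return lambda v: sum(lr * s(v) for s in stumps)

# Hypothetical hedonic data: price depends linearly on one attribute.
random.seed(2)
attr = [random.uniform(0.0, 10.0) for _ in range(200)]
price = [2.0 * a + random.gauss(0.0, 0.5) for a in attr]
model = boost(attr, price)
print(model(5.0))  # should be close to 10
```

A real boosted-CART model would use multivariate trees and typically a held-out set to choose the number of rounds; the one-dimensional stump version above just shows why boosting sidesteps the parametric-form assumption.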

  13. The use of imputed sibling genotypes in sibship-based association analysis: On modeling alternatives, power and model misspecification

    NARCIS (Netherlands)

    Minica, C.C.; Dolan, C.V.; Hottenga, J.J.; Willemsen, G.; Vink, J.M.; Boomsma, D.I.

    2013-01-01

When phenotypic, but no genotypic data are available for relatives of participants in genetic association studies, previous research has shown that family-based imputed genotypes can boost the statistical power when included in such studies. Here, using simulations, we compared the performance of two…

  14. Investigation into How Managers Justify Investments in IT Infrastructure

    Science.gov (United States)

    Ibe, Richmond Ikechukwu

    2012-01-01

    Organization leaders are dependent on information technology for corporate productivity; however, senior managers have expressed concerns about insufficient benefits from information technology investments. The problem researched was to understand how midsized businesses justify investments in information technology infrastructure. The purpose of…

  15. Connections between Generalizing and Justifying: Students' Reasoning with Linear Relationships

    Science.gov (United States)

    Ellis, Amy B.

    2007-01-01

    Research investigating algebra students' abilities to generalize and justify suggests that they experience difficulty in creating and using appropriate generalizations and proofs. Although the field has documented students' errors, less is known about what students do understand to be general and convincing. This study examines the ways in which…

  16. Nurturing towards Wisdom: Justifying Music in the Curriculum

    Science.gov (United States)

    Heimonen, Marja

    2008-01-01

    This essay considers the music curriculum from a philosophical perspective, focusing on the tension between freedom (personal autonomy) and discipline (moral and ethical principles). The approach could be characterized as hermeneutical: the aim is to deepen our understanding through discussing the basic arguments for justifying the inclusion of…

  17. Lay denial of knowledge for justified true beliefs.

    Science.gov (United States)

    Nagel, Jennifer; Juan, Valerie San; Mar, Raymond A

    2013-12-01

    Intuitively, there is a difference between knowledge and mere belief. Contemporary philosophical work on the nature of this difference has focused on scenarios known as "Gettier cases." Designed as counterexamples to the classical theory that knowledge is justified true belief, these cases feature agents who arrive at true beliefs in ways which seem reasonable or justified, while nevertheless seeming to lack knowledge. Prior empirical investigation of these cases has raised questions about whether lay people generally share philosophers' intuitions about these cases, or whether lay intuitions vary depending on individual factors (e.g. ethnicity) or factors related to specific types of Gettier cases (e.g. cases that include apparent evidence). We report an experiment on lay attributions of knowledge and justification for a wide range of Gettier Cases and for a related class of controversial cases known as Skeptical Pressure cases, which are also thought by philosophers to elicit intuitive denials of knowledge. Although participants rated true beliefs in Gettier and Skeptical Pressure cases as being justified, they were significantly less likely to attribute knowledge for these cases than for matched True Belief cases. This pattern of response was consistent across different variations of Gettier cases and did not vary by ethnicity or gender, although attributions of justification were found to be positively related to measures of empathy. These findings therefore suggest that across demographic groups, laypeople share similar epistemic concepts with philosophers, recognizing a difference between knowledge and justified true belief. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Small-Business Computing: Is Software Piracy Justified?

    Science.gov (United States)

    Immel, A. Richard

    1983-01-01

    Presents several different perspectives on the copying of computer software (discs, tapes, etc.) in an attempt to determine whether such infringement of copyright, often called "software piracy," can ever be justified. Implications for both the hardware and software firms and the users are also discussed. (EAO)

  19. Legislative Prohibitions on wearing a headscarf: Are they justified?

    Directory of Open Access Journals (Sweden)

    Fatima Osman

    2014-11-01

    Full Text Available A headscarf, a simple piece of cloth that covers the head, is a controversial garment that carries various connotations and meanings. While it may be accepted as just another item of clothing when worn by non-Muslim women, it is often the subject of much controversy when worn by Muslim women. In recent years the headscarf has been described as a symbol of Islam's oppression of women and simultaneously of terrorism. As the debate regarding the acceptability of the headscarf in the modern world continues, an increasing number of states have legislated to ban the wearing of the headscarf. This article critically examines the reasons underlying these bans and argues that these prohibitions are not justified. It does this by first analysing the place of the headscarf in Islam, its religious basis and its significance to Muslim women. It argues that the headscarf is more than just a mere religious symbol and that Muslim women wear the headscarf as a matter of religious obligation. The headscarf is considered to be an important religious practice protected by the right to freedom of religion. Thereafter the article examines legislative bans on the headscarf in France, Turkey and Switzerland in order to identify the most popular justifications advanced by states and courts for banning the headscarf. It critically evaluates the justifications for protecting secularism, preventing coercion, promoting equality and curbing religious extremism, and disputes that the reasons put forward by states and accepted by courts justify banning the headscarf. It thereafter explores how South African courts would respond to a headscarf ban and argues that schools and employers should accommodate the headscarf. While Muslim women may not have an absolute right to wear the headscarf, there has thus far been no justifiable reason for banning the headscarf.

  20. Cost-justifying usability an update for the internet age

    CERN Document Server

Bias, Randolph G; Mayhew, Deborah J

    2005-01-01

You just know that an improvement of the user interface will reap rewards, but how do you justify the expense, the labor, and the time, and guarantee a robust ROI, ahead of time? How do you decide how much of an investment should be funded? And what is the best way to sell usability to others? In this completely revised and new edition, Randolph G. Bias (University of Texas at Austin, with 25 years' experience as a usability practitioner and manager) and Deborah J. Mayhew (internationally recognized usability consultant and author of two other seminal books including The Usability Enginee

  1. Is the use of sentient animals in basic research justifiable?

    OpenAIRE

    Greek Ray; Greek Jean

    2010-01-01

    Abstract Animals can be used in many ways in science and scientific research. Given that society values sentient animals and that basic research is not goal oriented, the question is raised: "Is the use of sentient animals in basic research justifiable?" We explore this in the context of funding issues, outcomes from basic research, and the position of society as a whole on using sentient animals in research that is not goal oriented. We conclude that the use of sentient animals in basic rese...

  2. Modeling Heterogeneous Variance-Covariance Components in Two-Level Models

    Science.gov (United States)

    Leckie, George; French, Robert; Charlton, Chris; Browne, William

    2014-01-01

    Applications of multilevel models to continuous outcomes nearly always assume constant residual variance and constant random effects variances and covariances. However, modeling heterogeneity of variance can prove a useful indicator of model misspecification, and in some educational and behavioral studies, it may even be of direct substantive…

  3. Justify a Dedicated Radiology Coder-Reimbursement Specialist.

    Science.gov (United States)

    Mulaikis, Melody W

    2015-01-01

There are many opportunities to justify a dedicated staff member. We have to be able to answer the question "How does this position make money?" The bottom line is that it's crucial the facility does not forfeit appropriate reimbursement for its existing procedures. For new procedures or equipment, this individual can also ensure cost-benefit analysis/ROI is correct for equipment and/or supply purchases. The specific opportunities vary by facility so you must determine where your potential opportunities lie. There is not one answer, but this article provides you with specific areas to evaluate. Keep in mind that if you are evaluating opportunities related to specific procedures, you need to utilize outpatient numbers and assume Medicare reimbursement rates so that you calculate a conservative estimate. There is money to be found in most hospital organizations, so take the time to identify the potential benefit for your own. You can quantify the impact of a dedicated individual based on your specific case mix, which is very useful when justifying a new position. Also, it's very important to remember, you get what you pay for, so fill the new position wisely. Saving a small amount in salary may result in a large sacrifice in potential revenues.

  4. Marketing of human organs and tissues is justified and necessary.

    Science.gov (United States)

    Kevorkian, J

    1989-01-01

    The bioethical guidelines now banning commerce in human body parts to be used for transplantation manifest unrealistic and arbitrary inflexibility which perpetuates and worsens the deficit in organ supply. Instead of relying on traditionally revered but now outmoded and even irrelevant bioethical maxims, formulators of the guidelines should have concentrated on a more meaningful situational adaptation to contemporary real-life circumstances. Many unexpectedly relevant and important nuances of concepts such as property, ownership, and altruism must now be taken into account. Hypothetical examples explore the morality of a universal ban by fiat and the associated problems of organ supply and demand, of cost and affordability, and of fair equity. It is difficult to justify purely altruistic organ donation today, when the health care professions and industries are frantically pursuing commercial profits. It is concluded that the ban should be scrapped in favor of a well-organized, open, and legally regulated commercial market for human organs and tissues.

  5. Killing in Combat: Utilizing a Christian Perspective, When is a Soldier Justified in Taking a Life?

    Science.gov (United States)

    2014-06-13

The thesis investigates intent versus action within classical teaching, together with utilitarian ethics and the doctrine of double effect. [Remainder of abstract garbled in extraction; recoverable figure titles include "Legally Justified Killing – Christian", "Ethically Justified Killing", and "Ethically Justified Hesitation on Killing – Non-Christian".]

  6. Can foster care ever be justified for weight management?

    Science.gov (United States)

    Williams, G M G; Bredow, Maria; Barton, John; Pryce, Rebekah; Shield, J P H

    2014-03-01

    Article nine of the UN Convention of the Rights of the Child states that 'Children must not be separated from their parents unless it is in the best interests of the child.' We describe the impact that placing a child into care can have on long-standing and intractable obesity when this is a component of a child safeguarding strategy. Significant weight loss was documented in a male adolescent following his placement into foster care due to emotional harm and neglect within his birth family. The child's body mass index (BMI) dropped from a peak of 45.6 to 35 over 18 months. We provide brief details of two further similar cases and outcomes. Childhood obesity is often not the sole concern during safeguarding proceedings. Removal from an 'obesogenic' home environment should be considered if failure by the parents/carers to address the obesity is a major cause for concern. It is essential that all other avenues have been explored before removing a child from his birth family. However, in certain circumstances we feel it may be justified.

  7. Sample size in orthodontic randomized controlled trials: are numbers justified?

    Science.gov (United States)

    Koletsi, Despina; Pandis, Nikolaos; Fleming, Padhraig S

    2014-02-01

    Sample size calculations are advocated by the Consolidated Standards of Reporting Trials (CONSORT) group to justify sample sizes in randomized controlled trials (RCTs). This study aimed to analyse the reporting of sample size calculations in trials published as RCTs in orthodontic speciality journals. The performance of sample size calculations was assessed and calculations verified where possible. Related aspects, including number of authors; parallel, split-mouth, or other design; single- or multi-centre study; region of publication; type of data analysis (intention-to-treat or per-protocol basis); and number of participants recruited and lost to follow-up, were considered. Of 139 RCTs identified, complete sample size calculations were reported in 41 studies (29.5 per cent). Parallel designs were typically adopted (n = 113; 81 per cent), with 80 per cent (n = 111) involving two arms and 16 per cent having three arms. Data analysis was conducted on an intention-to-treat (ITT) basis in a small minority of studies (n = 18; 13 per cent). According to the calculations presented, overall, a median of 46 participants were required to demonstrate sufficient power to highlight meaningful differences (typically at a power of 80 per cent). The median number of participants recruited was 60, with a median of 4 participants being lost to follow-up. Our finding indicates good agreement between projected numbers required and those verified (median discrepancy: 5.3 per cent), although only a minority of trials (29.5 per cent) could be examined. Although sample size calculations are often reported in trials published as RCTs in orthodontic speciality journals, presentation is suboptimal and in need of significant improvement.
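The kind of calculation the authors verify can be sketched with the standard normal-approximation formula for a two-arm parallel trial: n per arm = 2(z₁₋α/₂ + z_power)² / d² for a standardized effect size d. A minimal illustration (my own sketch, not code from the review):

```python
from math import ceil
from statistics import NormalDist

def n_per_arm(d, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for comparing two means
    with a two-sided test at level alpha, standardized effect size d
    (difference in means divided by the common SD)."""
    z = NormalDist().inv_cdf
    return ceil(2 * (z(1 - alpha / 2) + z(power)) ** 2 / d ** 2)

# A 'medium' standardized effect (d = 0.5) at 80% power, 5% alpha:
print(n_per_arm(0.5))  # 63 per arm by the normal approximation
```

The exact t-based calculation used by most software gives slightly larger numbers; the approximation only illustrates why modest effect sizes drive the participant totals reported in such trials.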

  8. Four ways to justify temporal memory operators in the lossy wave equation

    CERN Document Server

    Holm, Sverre

    2015-01-01

    Attenuation of ultrasound often follows near power laws which cannot be modeled with conventional viscous or relaxation wave equations. The same is often the case for shear wave propagation in tissue also. More general temporal memory operators in the wave equation can describe such behavior. They can be justified in four ways: 1) Power laws for attenuation with exponents other than two correspond to the use of convolution operators with a temporal memory kernel which is a power law in time. 2) The corresponding constitutive equation is also a convolution, often with a temporal power law function. 3) It is also equivalent to an infinite set of relaxation processes which can be formulated via the complex compressibility. 4) The constitutive equation can also be expressed as an infinite sum of higher order derivatives. An extension to longitudinal waves in a nonlinear medium is also provided.
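Justification 1) in the list can be made concrete with the standard Caputo-type fractional time derivative, which is by definition a convolution with a power-law memory kernel (a textbook definition, not a formula quoted from the paper):

```latex
\frac{\partial^{\alpha} u}{\partial t^{\alpha}}(t)
  = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} (t-\tau)^{-\alpha}\,
    \frac{\partial u}{\partial \tau}(\tau)\, d\tau ,
  \qquad 0 < \alpha < 1 ,
```

so replacing an integer-order loss term in the wave equation with such an operator introduces exactly the temporal power-law kernel $t^{-\alpha}/\Gamma(1-\alpha)$ that the abstract describes.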

  9. Two independent pivotal statistics that test location and misspecification and add-up to the Anderson-Rubin statistic

    NARCIS (Netherlands)

    Kleibergen, F.R.

    2002-01-01

    We extend the novel pivotal statistics for testing the parameters in the instrumental variables regression model. We show that these statistics result from a decomposition of the Anderson-Rubin statistic into two independent pivotal statistics. The first statistic is a score statistic that tests loc

  10. The Impact of Misspecifying Class-Specific Residual Variances in Growth Mixture Models

    Science.gov (United States)

    Enders, Craig K.; Tofighi, Davood

    2008-01-01

    The purpose of this study was to examine the impact of misspecifying a growth mixture model (GMM) by assuming that Level-1 residual variances are constant across classes, when they do, in fact, vary in each subpopulation. Misspecification produced bias in the within-class growth trajectories and variance components, and estimates were…

  11. Explained Variation and Predictive Accuracy with an Extension to the Competing Risks Model

    DEFF Research Database (Denmark)

    Rosthøj, Susanne; Keiding, Niels

    2003-01-01

Competing risks; efficiency; explained variation; misspecification; predictive accuracy; survival analysis

  12. Is Justified True Belief Knowledge? / ¿Una creencia verdadera justificada es conocimiento?

    Directory of Open Access Journals (Sweden)

    Edmund L. Gettier

    2013-12-01

Full Text Available [ES] En este breve trabajo, se presenta una edición bilingüe de Is Justified True Belief Knowledge? (1963), de Edmund L. Gettier, donde se presentan contraejemplos a la definición de «conocimiento» como «creencia verdadera justificada». [EN] In this brief work, a bilingual edition of Edmund L. Gettier's Is Justified True Belief Knowledge? (1963) is presented, offering counterexamples to the definition of «knowledge» as «justified true belief».

  13. Are current disease-modifying therapeutics in multiple sclerosis justified on the basis of studies in experimental autoimmune encephalomyelitis?

    Science.gov (United States)

    Farooqi, Nasr; Gran, Bruno; Constantinescu, Cris S

    2010-11-01

    The precise aetio-pathology of multiple sclerosis remains elusive. However, important recent advances have been made and several therapies have been licensed for clinical use. Many of these were developed, validated or tested in the animal model, experimental autoimmune encephalomyelitis (EAE). This systematic review aims to assess whether the current disease modifying treatments and those that are the closest to the clinic are justified on the basis of the results of EAE studies. We discuss some aspects of the utility and caveats of EAE as a model for multiple sclerosis drug development.

  14. Taxonomy of literature to justify data governance as a pre-requisite for information governance

    CSIR Research Space (South Africa)

    Olaitan, O

    2016-09-01

    Full Text Available is devoted to data governance. This study chronicles extant literature to justify the position that data governance should be a prerequisite for information governance within organisations. The study argues that an information governance policy which is based...

  15. Summary Report: DoD Information Technology Contracts Awarded Without Competition Were Generally Justified

    Science.gov (United States)

    2015-09-09

AUDITOR GENERAL, DEPARTMENT OF THE ARMY. SUBJECT: Summary Report: DoD Information Technology Contracts Awarded Without Competition Were Generally Justified (Report No. DODIG-2015-167), September 9, 2015. We are providing the enclosed charts for your information and use. Contracting personnel at the Army, Navy

  16. Digital and multimedia forensics justified: An appraisal on professional policy and legislation

    Science.gov (United States)

    Popejoy, Amy Lynnette

    Recent progress in professional policy and legislation at the federal level in the field of forensic science constructs a transformation of new outcomes for future experts. An exploratory and descriptive qualitative methodology was used to critique and examine Digital and Multimedia Science (DMS) as a justified forensic discipline. Chapter I summarizes Recommendations 1, 2, and 10 of the National Academy of Sciences (NAS) Report 2009 regarding disparities and challenges facing the forensic science community. Chapter I also delivers the overall foundation and framework of this thesis, specifically how it relates to DMS. Chapter II expands on Recommendation 1: "The Promotion and Development of Forensic Science," and focuses chronologically on professional policy and legislative advances through 2014. Chapter III addresses Recommendation 2: "The Standardization of Terminology in Reporting and Testimony," and the issues of legal language and terminology, model laboratory reports, and expert testimony concerning DMS case law. Chapter IV analyzes Recommendation 10: "Insufficient Education and Training," identifying legal awareness for the digital and multimedia examiner to understand the role of the expert witness, the attorney, the judge and the admission of forensic science evidence in litigation in our criminal justice system. Finally, Chapter V studies three DME specific laboratories at the Texas state, county, and city level, concentrating on current practice and procedure.

  17. Physicians and strikes: can a walkout over the malpractice crisis be ethically justified?

    Science.gov (United States)

    Fiester, Autumn

    2004-01-01

    Malpractice insurance rates have created a crisis in American medicine. Rates are rising and reimbursements are not keeping pace. In response, physicians in the states hardest hit by this crisis are feeling compelled to take political action, and the current action of choice seems to be physician strikes. While the malpractice insurance crisis is acknowledged to be severe, does it justify the extreme action of a physician walkout? Should physicians engage in this type of collective action, and what are the costs to patients and the profession when such action is taken? I will offer three related arguments against physician strikes that constitute a prima facie prohibition against such action: first, strikes are intended to cause harm to patients; second, strikes are an affront to the physician-patient relationship; and, third, strikes risk decreasing the public's respect for the medical profession. As with any prima facie obligation, there are justifying conditions that may override the moral prohibition, but I will argue that the current malpractice crisis does not rise to the level of such a justifying condition. While the malpractice crisis demands and justifies a political response on the part of the nation's physicians, strikes and slow-downs are not an ethically justified means to the legitimate end of controlling insurance costs.

  18. Compatriot partiality and cosmopolitan justice: Can we justify compatriot partiality within the cosmopolitan framework?

    Directory of Open Access Journals (Sweden)

    Rachelle Bascara

    2016-10-01

Full Text Available This paper shows an alternative way in which compatriot partiality could be justified within the framework of global distributive justice. Philosophers who argue that compatriot partiality is similar to racial partiality capture something correct about compatriot partiality. However, the analogy should not lead us to comprehensively reject compatriot partiality. We can justify compatriot partiality on the same grounds that liberation movements and affirmative action have been justified. Hence, given cosmopolitan demands of justice, special consideration for the economic well-being of your nation as a whole is justified if and only if the country it identifies is an oppressed developing nation in an unjust global order. This justification is incomplete. We also need to say why Person A, qua national of Country A, is justified in helping her compatriots in Country A over similarly or slightly more oppressed non-compatriots in Country B. I argue that Person A’s partiality towards her compatriots admits further vindication because it is part of an oppressed group’s project of self-emancipation, which is preferable to paternalistic emancipation. Finally, I identify three benefits in my justification for compatriot partiality. First, I do not offer a blanket justification for all forms of compatriot partiality. Partiality between members of oppressed groups is only a temporary effective measure designed to level an unlevel playing field. Second, because history attests that sovereign republics could arise as a collective response to colonial oppression, justifying compatriot partiality on the grounds that I have identified is conducive to the development of sovereignty and even democracy in poor countries, thereby avoiding problems of infringement that many humanitarian poverty alleviation efforts encounter. Finally, my justification for compatriot partiality complies with the implicit cosmopolitan commitment to the realizability of global justice

  19. Justifying Blame: why free will matters and why it does not

    NARCIS (Netherlands)

    M.M.S.K. Sie (Maureen)

    2005-01-01

This book shows why we can justify blaming people for their wrong actions even if free will turns out not to exist. Contrary to most contemporary philosophizing about this issue, we do this not by denying that free will is relevant to considerations about personal desert. Instead we reco

  20. Influenza Vaccination in dutch Nursing Homes: is tacit consent morally justified?

    NARCIS (Netherlands)

    Verweij, M.F.; Hoven, M.A. van den

    2005-01-01

    Objectives: Efficient procedures for obtaining informed (proxy) consent may contribute to high influenza vaccination rates in nursing homes. Yet are such procedures justified? This study’s objective was to gain insight in informed consent policies in Dutch nursing homes; to assess how these may affe

  1. Intervention in Countries with Unsustainable Energy Policies: Is it Ever Justifiable?

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, Bruce Edward [ORNL

    2010-08-01

    This paper explores whether it is ever justifiable for the international community to forcibly intervene in countries that have unsustainable energy policies. The literature on obligations to future generations suggests, philosophically, that intervention might be justified under certain circumstances. Additionally, the world community has intervened in the affairs of other countries for humanitarian reasons, such as in Kosovo, Somalia, and Haiti. However, intervention to deal with serious energy problems is a qualitatively different and more difficult problem. A simple risk analysis framework is used to organize the discussion about possible conditions for justifiable intervention. If the probability of deaths resulting from unsustainable energy policies is very large, if the energy problem can be attributed to a relatively small number of countries, and if the risk of intervention is acceptable (i.e., the number of deaths due to intervention is relatively small), then intervention may be justifiable. Without further analysis and successful solution of several vexing theoretical questions, it cannot be stated whether unsustainable energy policies being pursued by countries at the beginning of the 21st century meet the criteria for forcible intervention by the international community.

  2. "Teach Your Children Well": Arguing in Favor of Pedagogically Justifiable Hospitality Education

    Science.gov (United States)

    Potgieter, Ferdinand J.

    2016-01-01

    This paper is a sequel to the paper which I delivered at last year's BCES conference in Sofia. Making use of hermeneutic phenomenology and constructive interpretivism as methodological apparatus, I challenge the pedagogic justifiability of the fashionable notion of religious tolerance. I suggest that we need, instead, to reflect "de…

  3. The Luckless and the Doomed: Contractualism on Justified Risk-Imposition

    DEFF Research Database (Denmark)

    Holm, Sune Hannibal

    2017-01-01

    Several authors have argued that contractualism faces a dilemma when it comes to justifying risks generated by socially valuable activities. At the heart of the matter is the question of whether contractualists should adopt an ex post or an ex ante perspective when assessing whether an action...

  4. Context Based Inferences in Research Methodology: The Role of Culture in Justifying Knowledge Claims

    Science.gov (United States)

    Evers, Colin W.; Mason, Mark

    2011-01-01

    Drawing on work in epistemology and the philosophy of science, this paper seeks to provide very general reasons for why a comparative perspective needs to be applied to the inferential procedures of research methodologies where these concern the issue of justifying knowledge claims. In particular, the paper explores the role of culture on a number…

  5. Are Score Comparisons across Language Proficiency Test Batteries Justified?: An IELTS-TOEFL Comparability Study.

    Science.gov (United States)

    Geranpayeh, Ardeshir

    1994-01-01

    This paper reports on a study conducted to determine if comparisons between scores on the Test of English as a Foreign Language (TOEFL) and the International English Language Testing Service (IELTS) are justifiable. The test scores of 216 Iranian graduate students who took the TOEFL and IELTS, as well as the Iranian Ministry of Culture and Higher…

  6. Twisted trees and inconsistency of tree estimation when gaps are treated as missing data - The impact of model mis-specification in distance corrections.

    Science.gov (United States)

    McTavish, Emily Jane; Steel, Mike; Holder, Mark T

    2015-12-01

Statistically consistent estimation of phylogenetic trees or gene trees is possible if pairwise sequence dissimilarities can be converted to a set of distances that are proportional to the true evolutionary distances. Susko et al. (2004) reported some strikingly broad results about the forms of inconsistency in tree estimation that can arise if corrected distances are not proportional to the true distances. They showed that if the corrected distance is a concave function of the true distance, then inconsistency due to long branch attraction will occur. If these functions are convex, then two "long branch repulsion" trees will be preferred over the true tree - though these two incorrect trees are expected to be tied as the preferred tree. Here we extend their results and demonstrate the existence of a tree shape (which we refer to as a "twisted Farris-zone" tree) for which a single incorrect tree topology will be guaranteed to be preferred if the corrected distance function is convex. We also report that the standard practice of treating gaps in sequence alignments as missing data is sufficient to produce non-linear corrected distance functions if the substitution process is not independent of the insertion/deletion process. Taken together, these results imply inconsistent tree inference under mild conditions. For example, if some positions in a sequence are constrained to be free of substitutions and insertion/deletion events while the remaining sites evolve with independent substitutions and insertion/deletion events, then the distances obtained by treating gaps as missing data can support an incorrect tree topology even given an unlimited amount of data.
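The concave case the abstract discusses is easy to reproduce numerically. Under the Jukes-Cantor model the expected proportion of differing sites is p(d) = (3/4)(1 − e^(−4d/3)), so treating raw p-distances as if they were corrected distances applies a concave transform of the true distance, the situation associated with long-branch attraction. A minimal check (my own illustration, not code from the paper):

```python
import math

def expected_p(d):
    """Expected p-distance under Jukes-Cantor for true distance d
    (substitutions per site)."""
    return 0.75 * (1.0 - math.exp(-4.0 * d / 3.0))

# Concavity check via second differences: f(d+h) - 2f(d) + f(d-h) < 0.
h = 0.01
grid = [0.05 + 0.01 * i for i in range(100)]
second_diffs = [expected_p(d + h) - 2 * expected_p(d) + expected_p(d - h)
                for d in grid]
print(all(s < 0 for s in second_diffs))  # True: p(d) is strictly concave
```

Since p′(d) = e^(−4d/3), the second derivative is negative everywhere, which the numerical check confirms.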

  7. Is radiography justified for the evaluation of patients presenting with cervical spine trauma?

    Energy Technology Data Exchange (ETDEWEB)

    Theocharopoulos, Nicholas; Chatzakis, Georgios; Damilakis, John [Department of Medical Physics, Faculty of Medicine, University of Crete, P.O. Box 2208, Iraklion, 71003 Crete (Greece) and Department of Natural Sciences, Technological Education Institute of Crete, P.O. Box 140, Iraklion 71004 Crete (Greece); Department of Radiology, Faculty of Medicine, University of Crete, P.O. Box 2208, Iraklion, 71003 Crete (Greece); Department of Medical Physics, Faculty of Medicine, University of Crete, P.O. Box 2208, Iraklion, 71003 Crete (Greece)

    2009-10-15

    radiogenic lethal cancer incidents. According to the decision model calculations, the use of CT is more favorable over the use of radiography alone or radiography with CT by a factor of 13, for low risk 20 yr old patients, to a factor of 23, for high risk patients younger than 80 yr old. The radiography/CT imaging strategy slightly outperforms plain radiography for high and moderate risk patients. Regardless of the patient age, sex, and fracture risk, the higher diagnostic accuracy obtained by the CT examination counterbalances the increase in dose compared to plain radiography or radiography followed by CT only for positive radiographs and renders CT utilization justified and the radiographic screening redundant.

  8. Bayesian inference for partially identified models exploring the limits of limited data

    CERN Document Server

    Gustafson, Paul

    2015-01-01

Introduction; Identification; What Is against Us?; What Is for Us?; Some Simple Examples of Partially Identified Models; The Road Ahead; The Structure of Inference in Partially Identified Models; Bayesian Inference; The Structure of Posterior Distributions in PIMs; Computational Strategies; Strength of Bayesian Updating, Revisited; Posterior Moments; Credible Intervals; Evaluating the Worth of Inference; Partial Identification versus Model Misspecification; The Siren Call of Identification; Comp

  9. Conditional likelihood inference in generalized linear mixed models.

    OpenAIRE

Sartori, Nicola; Severini, T. A.

    2002-01-01

    Consider a generalized linear model with a canonical link function, containing both fixed and random effects. In this paper, we consider inference about the fixed effects based on a conditional likelihood function. It is shown that this conditional likelihood function is valid for any distribution of the random effects and, hence, the resulting inferences about the fixed effects are insensitive to misspecification of the random effects distribution. Inferences based on the conditional likelih...

  10. Robust and efficient designs for the Michaelis-Menten model

    OpenAIRE

    Dette, Holger; Biedermann, Stefanie

    2002-01-01

    For the Michaelis-Menten model, we determine designs that maximize the minimum of the D-efficiencies over a certain interval for the nonlinear parameter. The best two point designs can be found explicitly, and a characterization is given when these designs are optimal within the class of all designs. In most cases of practical interest, the determined designs are highly efficient and robust with respect to misspecification of the nonlinear parameter. The results are illustrated and applied in...

  11. Assessment of agreement between general practitioners and radiologists as to whether a radiation exposure is justified.

    Science.gov (United States)

    Dhingsa, R; Finlay, D B L; Robinson, G D; Liddicoat, A J

    2002-02-01

    The objective of this study was to assess agreement between General Practitioners (GPs) and Consultant Radiologists as to whether a radiation exposure is justified and whether a request conforms to the Royal College of Radiologists (RCR) guidelines. Three GPs and three Consultant Radiologists were asked to review 100 requests for plain film imaging from GPs and to state whether the request justified a radiation exposure and whether the request conformed to RCR guidelines. It was discovered that there is greater agreement between radiologists than between GPs; this is a consistent pattern. The best agreement was between two Consultant Radiologists using the RCR guidelines. The poorest was between GPs using the request form details. It is suggested that the guidelines should be symptom-based to improve efficacy.

  12. Routine X-ray of the chest is not justified in staging of cutaneous melanoma patients

    DEFF Research Database (Denmark)

    Gjorup, Caroline Asirvatham; Hendel, Helle Westergren; Pilegaard, Rita Kaae

    2016-01-01

INTRODUCTION: The incidence of cutaneous melanoma is increasing in Denmark and worldwide. However, the prevalence of distant metastases at the time of diagnosis has decreased to 1%. We therefore questioned the value of routine preoperative chest X-ray (CXR) for staging asymptomatic melanoma patients and hypothesised that routine CXR is not justified. METHODS: A retrospective study was conducted on patients undergoing wide local excision and sentinel lymph node biopsy for cutaneous melanoma in the period from 2010 to 2014. RESULTS: A total of 603 patients were included. The mean time of follow... value was 8%, and the negative predictive value was 100%. CONCLUSION: Our results suggest that CXR cannot be justified in the initial staging of cutaneous melanoma patients. The guideline for the treatment of melanoma in Denmark is under revision: the use of CXR has been omitted. FUNDING: This study...

  13. Estimation of increased regional income that emanates from economically justified road construction projects

    Directory of Open Access Journals (Sweden)

    W. J. Pienaar

    2005-09-01

Full Text Available This article identifies the development benefits that can emanate from economically justified road construction projects. It shows how to estimate both the once-off increase in regional income resulting from investment in road construction projects and the recurring additional regional income resulting from the use of new or improved roads. It distinguishes between a cost-benefit analysis (to determine how economically justified a project is) and a regional economic income analysis (to estimate the general economic benefits generated by investment in and usage of a road). Procedures are proposed through which the once-off and recurring increases in regional income can be estimated by using multiplier and accelerator analyses respectively. Finally, guidelines are supplied on the appropriate usage of input variables in the calculation of the regional income multiplier.
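The once-off income estimate follows the familiar Keynesian multiplier logic: each round of respending keeps only the locally consumed share of income inside the region. A minimal sketch with invented parameter values (the article's own procedure is more detailed):

```python
def regional_multiplier(mpc, import_share):
    """Simple regional income multiplier: 1 / (1 - c(1 - m)), where c is
    the marginal propensity to consume and m the share of that spending
    that leaks out of the region as 'imports'."""
    retained = mpc * (1 - import_share)
    return 1 / (1 - retained)

def once_off_income_increase(construction_spending, mpc, import_share):
    """Once-off regional income generated by a road construction project."""
    return construction_spending * regional_multiplier(mpc, import_share)

# Illustrative only: a 100m project, c = 0.8, 25% of spending leaks out,
# giving a multiplier of 2.5 and a once-off income increase of 250m.
print(once_off_income_increase(100e6, 0.8, 0.25))
```

The recurring income from road usage would be handled by the accelerator analysis the article proposes, which this sketch does not cover.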

  14. Heteroscedasticity as a Basis of Direction Dependence in Reversible Linear Regression Models.

    Science.gov (United States)

    Wiedermann, Wolfgang; Artner, Richard; von Eye, Alexander

    2017-01-01

    Heteroscedasticity is a well-known issue in linear regression modeling. When heteroscedasticity is observed, researchers are advised to remedy possible model misspecification of the explanatory part of the model (e.g., considering alternative functional forms and/or omitted variables). The present contribution discusses another source of heteroscedasticity in observational data: Directional model misspecifications in the case of nonnormal variables. Directional misspecification refers to situations where alternative models are equally likely to explain the data-generating process (e.g., x → y versus y → x). It is shown that the homoscedasticity assumption is likely to be violated in models that erroneously treat true nonnormal predictors as response variables. Recently, Direction Dependence Analysis (DDA) has been proposed as a framework to empirically evaluate the direction of effects in linear models. The present study links the phenomenon of heteroscedasticity with DDA and describes visual diagnostics and nine homoscedasticity tests that can be used to make decisions concerning the direction of effects in linear models. Results of a Monte Carlo simulation that demonstrate the adequacy of the approach are presented. An empirical example is provided, and applicability of the methodology in cases of violated assumptions is discussed.
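The core claim, that regressing in the wrong causal direction with a nonnormal true predictor induces heteroscedasticity, can be checked with a small simulation (my own sketch of the idea, not the authors' DDA implementation; the heteroscedasticity index below is a crude stand-in for their formal tests):

```python
import math
import random

def ols_residuals(x, y):
    """Residuals from a simple least-squares regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

def corr(u, v):
    """Pearson correlation."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    su = math.sqrt(sum((ui - mu) ** 2 for ui in u))
    sv = math.sqrt(sum((vi - mv) ** 2 for vi in v))
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / (su * sv)

random.seed(7)
n = 5000
x = [random.expovariate(1.0) for _ in range(n)]      # skewed true cause
y = [0.8 * xi + random.gauss(0.0, 1.0) for xi in x]  # true model: x -> y

# Crude heteroscedasticity index: |corr(predictor, squared residuals)|.
het_true = abs(corr(x, [r * r for r in ols_residuals(x, y)]))
het_reversed = abs(corr(y, [r * r for r in ols_residuals(y, x)]))
print(het_reversed > het_true)  # the mis-specified direction shows more heteroscedasticity
```

In the correctly specified direction the squared residuals are unrelated to the predictor, while the reversed regression inherits a variance pattern from the skewed cause, which is exactly the diagnostic signal DDA exploits.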

  15. Modelling severe Staphylococcus aureus sepsis in conscious pigs: are implications for animal welfare justified?

    DEFF Research Database (Denmark)

Olsen, Helle G; Kjelgaard-Hansen, Mads; Tveden-Nyborg, Pernille

    2016-01-01

    ), and 30 mL (n = 1), corresponding to infusion durations of 3.33, 6.66, and 10 min at dose rates of 3 × 10(7), 1.5 × 10(7), and 1 × 10(7) cfu/min/kg BW, respectively. Blood samples were drawn for complete blood count, clinical chemistry, and inflammatory markers before and every 6 h after inoculation....... Prior to euthanasia, a galactose elimination capacity test was performed to assess liver function. Pigs were euthanised 48 h post inoculation for necropsy and histopathological evaluation. While infusion times of 6.66 min, and higher, did not induce liver dysfunction (n = 3), the infusion time of 3...

  16. Modelling severe Staphylococcus aureus sepsis in conscious pigs: are implications for animal welfare justified?

    DEFF Research Database (Denmark)

    Olsen, Helle G; Kjelgaard-Hansen, Mads; Tveden-Nyborg, Pernille

    2016-01-01

    by the severity of induced disease, which in some cases necessitated humane euthanasia. A pilot study was therefore performed in order to establish the sufficient inoculum concentration and application protocol needed to produce signs of liver dysfunction within limits of our pre-defined humane endpoints. Four.......33 min (n = 1) caused alterations in parameters similar to what had been seen in our previous studies, i.e., increasing bilirubin and aspartate aminotransferase, as well as histopathological occurrence of intravascular fibrin split products in the liver. This pig was however euthanised after 30 h...

  17. Belief in school meritocracy as a system-justifying tool for low status students.

    Science.gov (United States)

    Wiederkehr, Virginie; Bonnot, Virginie; Krauth-Gruber, Silvia; Darnon, Céline

    2015-01-01

    The belief that, in school, success only depends on will and hard work is widespread in Western societies despite evidence showing that several factors other than merit explain school success, including group belonging (e.g., social class, gender). In the present paper, we argue that because merit is the only track for low status students to reach upward mobility, Belief in School Meritocracy (BSM) is a particularly useful system-justifying tool to help them perceive their place in society as being deserved. Consequently, for low status students (but not high status students), this belief should be related to more general system-justifying beliefs (Study 1). Moreover, low status students should be particularly prone to endorsing this belief when their place within a system on which they strongly depend to acquire status is challenged (Study 2). In Study 1, high status (boys and high SES) were compared to low status (girls and low SES) high school students. Results indicated that BSM was related to system-justifying beliefs only for low SES students and for girls, but not for high SES students or for boys. In Study 2, university students were exposed (or not) to information about an important selection process that occurs at the university, depending on the condition. Their subjective status was assessed. Although such a confrontation reduced BSM for high subjective SES students, it tended to enhance it for low subjective SES students. Results are discussed in terms of system justification motives and the palliative function meritocratic ideology may play for low status students.

  18. Justifying molecular images in cell biology textbooks: From constructions to primary data.

    Science.gov (United States)

    Serpente, Norberto

    2016-02-01

    For scientific claims to be reliable and productive they have to be justified. However, on the one hand little is known on what justification precisely means to scientists, and on the other the position held by philosophers of science on what it entails is rather limited; for justifications customarily refer to the written form (textual expressions) of scientific claims, leaving aside images, which, as many cases from the history of science show are relevant to this process. The fact that images can visually express scientific claims independently from text, plus their vast variety and origins, requires an assessment of the way they are currently justified and in turn used as sources to justify scientific claims in the case of particular scientific fields. Similarly, in view of the different nature of images, analysis is required to determine on what side of the philosophical distinction between data and phenomena these different kinds of images fall. This paper historicizes and documents a particular aspect of contemporary life sciences research: the use of the molecular image as vehicle of knowledge production in cell studies, a field that has undergone a significant shift in visual expressions from the early 1980s onwards. Focussing on textbooks as sources that have been overlooked in the historiography of contemporary biomedicine, the aim is to explore (1) whether the shift of cell studies, entailing a superseding of the optical image traditionally conceptualised as primary data, by the molecular image, corresponds with a shift of justificatory practices, and (2) to assess the role of the molecular image as primary data. This paper also explores the dual role of images as teaching resources and as resources for the construction of knowledge in cell studies especially in its relation to discovery and justification. Finally, this paper seeks to stimulate reflection on what kind of archival resources could benefit the work of present and future epistemic

  19. ["The end justifies the means." Historical realism and causality in modernity].

    Science.gov (United States)

    Catteeuw, Laurie

    2014-01-01

    Perceived all along, from the beginning of the modern era, causality in history cannot be abstracted from a reflection on "the reason and uses of States". Such realism engages with the heart of the arts of governance, the practices which form the basis for the adage "the end justifies the means". In order to assess the implications of this maxim on the definition of causality, this article examines the modalities of the description of historical facts, its usages and censoring (understood as a necessary means to aspired ends), and the calculation of the aleatory dimensions of politics.

  20. Flexible distributions for triple-goal estimates in two-stage hierarchical models

    Science.gov (United States)

    Paddock, Susan M.; Ridgeway, Greg; Lin, Rongheng; Louis, Thomas A.

    2009-01-01

    Performance evaluations often aim to achieve goals such as obtaining estimates of unit-specific means, ranks, and the distribution of unit-specific parameters. The Bayesian approach provides a powerful way to structure models for achieving these goals. While no single estimate can be optimal for achieving all three inferential goals, the communication and credibility of results will be enhanced by reporting a single estimate that performs well for all three. Triple goal estimates [Shen and Louis, 1998. Triple-goal estimates in two-stage hierarchical models. J. Roy. Statist. Soc. Ser. B 60, 455–471] have this performance and are appealing for performance evaluations. Because triple-goal estimates rely more heavily on the entire distribution than do posterior means, they are more sensitive to misspecification of the population distribution and we present various strategies to robustify triple-goal estimates by using nonparametric distributions. We evaluate performance based on the correctness and efficiency of the robustified estimates under several scenarios and compare empirical Bayes and fully Bayesian approaches to model the population distribution. We find that when data are quite informative, conclusions are robust to model misspecification. However, with less information in the data, conclusions can be quite sensitive to the choice of population distribution. Generally, use of a nonparametric distribution pays very little in efficiency when a parametric population distribution is valid, but successfully protects against model misspecification. PMID:19603088
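The two-stage setup behind such estimates can be illustrated with a simple empirical Bayes shrinkage sketch. This is only the posterior-mean building block, not the Shen-Louis triple-goal estimator itself, and all parameters are invented for the example; it shows why posterior means, by compressing unit-level observations toward the grand mean, can serve ranks and the parameter distribution poorly.

```python
import numpy as np

rng = np.random.default_rng(1)
K = 50
sigma = rng.uniform(0.5, 2.0, K)       # unit-specific sampling s.d. (known)
theta = rng.normal(0.0, 2.0, K)        # unit-specific true means
y = rng.normal(theta, sigma)           # one observation per unit

# Empirical Bayes: estimate the population mean and variance, then shrink
# each observation toward the grand mean by a unit-specific factor.
mu_hat = y.mean()
tau2_hat = max(y.var(ddof=1) - np.mean(sigma ** 2), 0.0)  # method of moments
shrink = tau2_hat / (tau2_hat + sigma ** 2)               # per-unit, in [0, 1)
post_mean = mu_hat + shrink * (y - mu_hat)

print("shrinkage range:", round(shrink.min(), 2), "-", round(shrink.max(), 2))
print("raw spread:", round(y.var(), 2), "shrunk spread:", round(post_mean.var(), 2))
```

Because noisier units are shrunk harder, the posterior means understate the spread of the unit-specific parameters and can reorder units relative to the raw data, which is the sensitivity the robustified triple-goal estimates are meant to address.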

  1. When is deliberate killing of young children justified? Indigenous interpretations of infanticide in Bolivia.

    Science.gov (United States)

    de Hilari, Caroline; Condori, Irma; Dearden, Kirk A

    2009-01-01

In the Andes, as elsewhere, infanticide is a difficult challenge that remains largely undocumented and misunderstood. From January to March 2004 we used community-based vital event surveillance systems, discussions with health staff, ethnographic interviews, and focus group discussions among Aymara men and women from two geographically distinct sites in the Andes of Bolivia to provide insights into the practice of infanticide. We noted elevated mortality at both sites. In one location, suspected infanticide was especially high among girls. We also observed that community members maintain beliefs that justify infanticide under certain circumstances. Among the Aymara, justification for infanticide was both biological (deformities and twinship) and social (illegitimate birth, family size and poverty). Communities generally did not condemn killing when reasons for doing so were biological, but the taking of life for social reasons was rarely justified. In this cultural context, strategies to address the challenge of infanticide should include education of community members about alternatives to infanticide. At a program level, planners and implementers should target ethnic groups with high levels of infanticide and train health care workers to detect and address multiple warning signs for infanticide (for example, domestic violence and child maltreatment) as well as proxies for infant neglect and abuse such as mother/infant separation and bottle use.

  2. [Do histologic changes of the upper renal pole in double ectopic ureterocele justify a conservative approach?].

    Science.gov (United States)

    Arena, F; Nicotina, P A; Cruccetti, A; Centonze, A; Arena, S; Romeo, G

    2001-01-01

The aim of this study was to review the histology of the upper pole segment in patients with duplex ectopic ureterocele to verify if a less aggressive surgery is justified in the prenatally diagnosed patients. We reviewed the histology of the upper pole segment of 15 consecutive patients with duplex system ectopic ureterocele treated between 1991 and 1999 at the Paediatric Surgery Unit of University Hospital of Messina. The diagnosis of duplex system ectopic ureterocele was made according to the criteria of the Section on Urology of the American Academy of Paediatrics. The histology specimens were assessed for dysplastic, inflammatory and obstructive changes. All 15 patients with duplex system ectopic ureterocele were surgically treated with heminephro-ureterectomy and the surgical specimens were histologically examined. Nine of the 15 patients were prenatally diagnosed. The histology of the upper pole segment of the 9 prenatally diagnosed patients showed segmental renal microcystic dysplasia and chondroid metaplastic islands in all of them, an inflammatory tubulo-interstitial nephropathy in 6 patients (66.6%), and nephroblastomatosis in 2 (22.2%). The histology of the six postnatally diagnosed patients showed segmental multicystic renal dysplasia, inflammatory tubulo-interstitial nephropathy and segmental parenchymal scars in all patients. The upper pole histology of the patients with duplex ectopic ureterocele diagnosed prenatally did not show any evidence of reversible histological change. Considering the histology and the good outcome of patients treated with upper pole nephroureterectomy, a less aggressive surgery with preservation of the upper pole does not seem justified.

  3. Can histologic changes of the upper pole justify a conservative approach in neonatal duplex ectopic ureterocele?

    Science.gov (United States)

    Arena, F; Nicotina, A; Cruccetti, A; Centonze, A; Arena, S; Romeo, G

    2002-12-01

The aim of this study was to review the histology of the upper-pole segment in patients with duplex-system ectopic ureterocele (DEU) to determine if less aggressive surgery is justified in prenatally-diagnosed cases. The study included 15 consecutive patients with DEU treated between 1991 and 1999. The diagnosis was made according to the criteria of the Section on Urology of the American Academy of Pediatrics. The histology specimens were assessed for dysplastic, inflammatory, and obstructive changes. All 15 patients were surgically treated by heminephro-ureterectomy and the surgical specimens were histologically examined. Nine cases were diagnosed prenatally; the histology of the upper-pole segment in these patients showed segmental renal microcystic dysplasia, chondroid metaplastic islands, and an inflammatory tubulointerstitial nephropathy in 6 (66.6%) and nephroblastomatosis in 2 (22.2%). The histology of the 6 postnatally-diagnosed patients showed segmental multicystic renal dysplasia, inflammatory tubulo-interstitial nephropathy, and segmental parenchymal scars. The upper-pole histology of the prenatally-diagnosed patients did not show any evidence of reversible histologic changes. Considering these findings and the good outcome of patients treated with upper-pole nephroureterectomy, less aggressive surgery with preservation of the upper pole does not seem justified.

  4. Modelling Conditional and Unconditional Heteroskedasticity with Smoothly Time-Varying Structure

    DEFF Research Database (Denmark)

    Amado, Christina; Teräsvirta, Timo

    multiplier type misspecification tests. Finite-sample properties of these procedures and tests are examined by simulation. An empirical application to daily stock returns and another one to daily exchange rate returns illustrate the functioning and properties of our modelling strategy in practice....... The results show that the long memory type behaviour of the sample autocorrelation functions of the absolute returns can also be explained by deterministic changes in the unconditional variance....

  5. Model Selection Principles in Misspecified Models

    CERN Document Server

    Lv, Jinchi

    2010-01-01

    Model selection is of fundamental importance to high dimensional modeling featured in many contemporary applications. Classical principles of model selection include the Kullback-Leibler divergence principle and the Bayesian principle, which lead to the Akaike information criterion and Bayesian information criterion when models are correctly specified. Yet model misspecification is unavoidable when we have no knowledge of the true model or when we have the correct family of distributions but miss some true predictor. In this paper, we propose a family of semi-Bayesian principles for model selection in misspecified models, which combine the strengths of the two well-known principles. We derive asymptotic expansions of the semi-Bayesian principles in misspecified generalized linear models, which give the new semi-Bayesian information criteria (SIC). A specific form of SIC admits a natural decomposition into the negative maximum quasi-log-likelihood, a penalty on model dimensionality, and a penalty on model miss...
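The classical criteria that these semi-Bayesian principles generalize can be sketched for ordinary Gaussian regression. The following is not the SIC of the paper, just a minimal AIC/BIC comparison over polynomial candidates, with all data and parameters invented for the example:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x = rng.normal(0.0, 1.0, n)
y = 1.0 + 2.0 * x + 1.0 * x ** 2 + rng.normal(0.0, 1.0, n)  # true degree: 2

def fit_ic(degree):
    """OLS polynomial fit; Gaussian AIC/BIC up to an additive constant."""
    X = np.vander(x, degree + 1, increasing=True)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    k = degree + 1                      # number of regression coefficients
    aic = n * np.log(rss / n) + 2 * k
    bic = n * np.log(rss / n) + k * np.log(n)
    return aic, bic

aic = {d: fit_ic(d)[0] for d in range(1, 5)}
bic = {d: fit_ic(d)[1] for d in range(1, 5)}
print("AIC choice:", min(aic, key=aic.get))
print("BIC choice:", min(bic, key=bic.get))
```

Both criteria share the same lack-of-fit term and differ only in the penalty on dimensionality (2k versus k log n); the SIC described above adds a further penalty for model misspecification on top of this decomposition.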

  6. [Is a hysterectomy justifiable to prevent post-tubal ligation syndrome?].

    Science.gov (United States)

    Maheux, R; Fugère, P

    1980-12-01

Among 2057 tubal ligations performed between 1971-75 in "Hopital Saint-Luc" in Montreal, 78 patients had to be readmitted for hysterectomy. The main indication for hysterectomy among these patients was for menstrual disorders (65%). These menstrual disorders were present at the moment of the tubal ligation in about half of the patients. Among the patients who had to be reoperated for hysterectomy for menstrual disorders and who were asymptomatic at the moment of their tubal ligation, 88% were using oral contraceptives for a mean period of 5.8 years. The low incidence of hysterectomy post-tubal ligation (3.8%) does not seem to justify a total hysterectomy to prevent what has been described as the "post tubal ligation syndrome" in the patients who are asymptomatic and desire a permanent sterilization. (Author's modified)

  7. Justifying the Choice and Use of a Game and a Song in My Lesson

    Institute of Scientific and Technical Information of China (English)

    庄新月

    2013-01-01

Games and songs can bring a lot of pleasure to children. They are useful tools in children’s language learning. As English teachers, we should know how to make full use of them to stimulate children’s interest and promote their learning. In this essay, I will take one primary English lesson as an example to demonstrate my point. First I am going to talk about the advantages of using a game and a song, and then analyze the teaching or learning context and the activities in the lesson. At last I am going to focus on how and why to use the game and the song in the classroom. In a word, I am going to justify my choice and use of a game and a song in a revision lesson.

  8. Can a right to health care be justified by linkage arguments?

    Science.gov (United States)

    Nickel, James W

    2016-08-01

    Linkage arguments, which defend a controversial right by showing that it is indispensable or highly useful to an uncontroversial right, are sometimes used to defend the right to health care (RHC). This article evaluates such arguments when used to defend RHC. Three common errors in using linkage arguments are (1) neglecting levels of implementation, (2) expanding the scope of the supported right beyond its uncontroversial domain, and (3) giving too much credit to the supporting right for outcomes in its area. A familiar linkage argument for RHC focuses on its contributions to the right to life. Among the problems with this argument are that it requires a positive conception of the right to life that is not uncontroversial and that it only justifies the subset of RHC that seeks to prevent loss of life. A linkage argument for RHC with better prospects claims that a well-realized right to health care enhances the realization of a number of uncontroversial rights.

  9. Perceptions about civil war in Central Africa: Can war be justified or solve problems?

    Directory of Open Access Journals (Sweden)

    Kitambala Lumbu

    2009-09-01

Full Text Available Civil war and ethnic violence are major problems in Central Africa and have caused the death and displacement of millions of people over the years. The aim of this study was to investigate the perceptions of religious leaders, lecturers and students in theology at various tertiary institutions in Central Africa with regard to civil war in the region. A structured questionnaire was used to investigate participants’ perceptions about and attitudes towards civil war. The questionnaire was completed by 1 364 participants who originated from or lived in the Democratic Republic of the Congo (DRC and Rwanda. The results of the study illustrated the severe effect that civil wars had on the participants or their families and further indicated that Rwandans, Tutsis and males were more inclined toward justifying wars and seeing them as solutions for problems. The role of the Church in countering these perceptions is discussed.

  10. When the End (Automatically) Justifies the Means: Automatic Tendency Toward Sex Exchange for Crack Cocaine

    Science.gov (United States)

    Kopetz, Catalina E.; Collado, Anahi; Lejuez, Carl W.

    2016-01-01

The current research explores the idea that self-defeating behaviors represent means toward individuals’ goals. As such, they may be automatically initiated upon goal activation without the individual’s voluntary intention and thus exemplify the long-held idea that the end justifies the means. To investigate this notion empirically we explored one of the most problematic self-defeating behaviors: engagement in sex exchange for crack cocaine. This behavior is common among female drug users despite its well-known health and legal consequences. Although these women know and understand the consequences of such behavior, they have a hard time resisting it when the goal of drug obtainment becomes accessible. Indeed, the current study shows that when the accessibility of such a goal is experimentally increased, participants for whom sex exchange represents an instrumental means to drug obtainment are faster to approach sex-exchange targets in a joystick task despite their self-reported intentions to avoid such behavior.

  11. Can science justify regulatory decisions about the cultivation of transgenic crops?

    Science.gov (United States)

    Raybould, Alan

    2012-08-01

    Results of scientific studies are sometimes claimed to provide scientific justification for regulatory decisions about the cultivation of certain transgenic crops. A decision may be scientifically justified if objective analysis shows that the decision is more likely than alternatives to lead to the achievement of specific policy objectives. If policy objectives are not defined operationally, as is often the case, scientific justification for decisions is not possible. The search for scientific justification for decisions leads to concentration on reducing scientific uncertainty about the behaviour of transgenic crops instead of reducing uncertainty about the objectives of policies that regulate their use. Focusing on reducing scientific uncertainty at the expense of clarifying policy objectives may have detrimental effects on scientists, science and society.

  12. What justifies the United States ban on federal funding for nonreproductive cloning?

    Science.gov (United States)

    Cunningham, Thomas V

    2013-11-01

    This paper explores how current United States policies for funding nonreproductive cloning are justified and argues against that justification. I show that a common conceptual framework underlies the national prohibition on the use of public funds for cloning research, which I call the simple argument. This argument rests on two premises: that research harming human embryos is unethical and that embryos produced via fertilization are identical to those produced via cloning. In response to the simple argument, I challenge the latter premise. I demonstrate there are important ontological differences between human embryos (produced via fertilization) and clone embryos (produced via cloning). After considering the implications my argument has for the morality of publicly funding cloning for potential therapeutic purposes and potential responses to my position, I conclude that such funding is not only ethically permissible, but also humane national policy.

  13. Is routine antenatal venereal disease research laboratory test still justified? Nigerian experience.

    Science.gov (United States)

    Nwosu, Betrand O; Eleje, George U; Obi-Nwosu, Amaka L; Ahiarakwem, Ita F; Akujobi, Comfort N; Egwuatu, Chukwudi C; Onyiuke, Chukwudumebi O C

    2015-01-01

To determine the seroreactivity of pregnant women to syphilis in order to justify the need for routine antenatal syphilis screening. A multicenter retrospective analysis of routine antenatal venereal disease research laboratory (VDRL) test results between 1 September 2010 and 31 August 2012 at three specialist care hospitals in south-east Nigeria was done. Reactive VDRL results were subjected to confirmation using the Treponema pallidum hemagglutination assay. Analysis was by Epi Info 2008 version 3.5.1 and Stata/IC version 10. Adequate records were available regarding 2,156 patients and were thus reviewed. The mean age of the women was 27.4 years (±3.34), and mean gestational age was 26.4 weeks (±6.36). Only 15 cases (0.70%) were seropositive to VDRL. Confirmatory T. pallidum hemagglutination assay was positive in 4 of the 15 cases, giving an overall prevalence of 0.19% and a false-positive rate of 73.3%. There was no significant difference in the prevalence of syphilis in relation to maternal age and parity (P>0.05). While the prevalence of syphilis is extremely low in the antenatal care population at the three specialist care hospitals in south-east Nigeria, the false-positive rate is high, and prevalence did not vary significantly with maternal age or parity. Because syphilis is still a serious but preventable and curable disease, screening with VDRL alone, without confirmatory tests, may not be justified. Given the increasing demand for evidence-based medicine and the litigation encountered in medical practice, we advocate that a confirmatory test for syphilis be introduced in routine antenatal testing to reduce the problem of false positives. The government should increase the health budget to include free routine antenatal testing, including the T. pallidum hemagglutination assay.
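The reported percentages can be cross-checked from the counts given in the abstract (2,156 screened, 15 VDRL-reactive, 4 TPHA-confirmed):

```python
# Cross-check of the abstract's figures.
screened, reactive, confirmed = 2156, 15, 4

vdrl_pos = 100 * reactive / screened                  # % of all screened
prevalence = 100 * confirmed / screened               # confirmed cases
false_pos = 100 * (reactive - confirmed) / reactive   # % of reactive results

print(f"VDRL-reactive: {vdrl_pos:.2f}%")    # 0.70%
print(f"prevalence:    {prevalence:.2f}%")  # 0.19%
print(f"false positives among reactives: {false_pos:.1f}%")  # 73.3%
```

The figures are internally consistent: 11 of the 15 reactive results failed confirmation, which is the 73.3% false-positive rate that motivates the call for confirmatory testing.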

  14. Modelling volatility by variance decomposition

    DEFF Research Database (Denmark)

    Amado, Cristina; Teräsvirta, Timo

    on the multiplicative decomposition of the variance is developed. It is heavily dependent on Lagrange multiplier type misspecification tests. Finite-sample properties of the strategy and tests are examined by simulation. An empirical application to daily stock returns and another one to daily exchange rate returns...... illustrate the functioning and properties of our modelling strategy in practice. The results show that the long memory type behaviour of the sample autocorrelation functions of the absolute returns can also be explained by deterministic changes in the unconditional variance....

  15. Is routine antenatal venereal disease research laboratory test still justified? Nigerian experience

    Directory of Open Access Journals (Sweden)

    Nwosu BO

    2015-01-01

Full Text Available Betrand O Nwosu,1 George U Eleje,1 Amaka L Obi-Nwosu,2 Ita F Ahiarakwem,3 Comfort N Akujobi,4 Chukwudi C Egwuatu,4 Chukwudumebi O Onyiuke5 1Department of Obstetrics and Gynecology, Nnamdi Azikiwe University, Nnewi Campus, Nnewi, Anambra State, Nigeria; 2Department of Family Medicine, Nnamdi Azikiwe University Teaching Hospital, Nnewi, Nigeria; 3Department of Medical Microbiology, Imo State University Teaching Hospital, Orlu, Imo State, Nigeria; 4Department of Medical Microbiology, Nnamdi Azikiwe University, Nnewi Campus, Nnewi, Anambra State, Nigeria; 5Department of Medical Microbiology, Nnamdi Azikiwe University Teaching Hospital, Nnewi, Anambra State, Nigeria. Objective: To determine the seroreactivity of pregnant women to syphilis in order to justify the need for routine antenatal syphilis screening. Methods: A multicenter retrospective analysis of routine antenatal venereal disease research laboratory (VDRL) test results between 1 September 2010 and 31 August 2012 at three specialist care hospitals in south-east Nigeria was done. A reactive VDRL result was subjected to confirmation using the Treponema pallidum hemagglutination assay test. Analysis was by Epi Info 2008 version 3.5.1 and Stata/IC version 10. Results: Adequate records were available regarding 2,156 patients and were thus reviewed. The mean age of the women was 27.4 years (±3.34), and mean gestational age was 26.4 weeks (±6.36). Only 15 cases (0.70%) were seropositive to VDRL. Confirmatory T. pallidum hemagglutination assay was positive in 4 of the 15 cases, giving an overall prevalence of 0.19% and a false-positive rate of 73.3%. There was no significant difference in the prevalence of syphilis in relation to maternal age and parity (P>0.05). Conclusion: While the prevalence of syphilis is extremely low in the antenatal care population at the three specialist care hospitals in south-east Nigeria, false-positive rate is high and prevalence did not significantly vary with maternal age or

  16. A Specification Test of Stochastic Diffusion Models

    Institute of Scientific and Technical Information of China (English)

    Shu-lin ZHANG; Zheng-hong WEI; Qiu-xiang BI

    2013-01-01

In this paper, we propose a hypothesis testing approach to checking model misspecification in continuous-time stochastic diffusion models. The key idea behind the development of our test statistic is rooted in the generalized information equality in the context of martingale estimating equations. We propose a bootstrap resampling method to implement the proposed diagnostic procedure numerically. Through intensive simulation studies, we show that our approach performs well in terms of type I error control, power, and computational efficiency.
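The paper's statistic is built on martingale estimating equations; as a generic illustration of the parametric-bootstrap mechanics it describes, the sketch below tests a normal null model with an invented skewness-based statistic standing in for the paper's statistic (all names and parameters are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(3)

def skewness(z):
    z = (z - z.mean()) / z.std()
    return np.mean(z ** 3)

def bootstrap_pvalue(data, n_boot=2000):
    """Parametric bootstrap test of a N(mu, sigma^2) null via |skewness|.

    Fit the null model, simulate the statistic's null distribution, and
    return the bootstrap p-value of the observed statistic.
    """
    stat = abs(skewness(data))
    mu, sd, n = data.mean(), data.std(ddof=1), len(data)
    null = np.array([abs(skewness(rng.normal(mu, sd, n)))
                     for _ in range(n_boot)])
    return (1 + np.sum(null >= stat)) / (1 + n_boot)

p_ok = bootstrap_pvalue(rng.normal(0.0, 1.0, 500))    # well-specified model
p_bad = bootstrap_pvalue(rng.exponential(1.0, 500))   # misspecified model
print(f"normal data p = {p_ok:.3f}, exponential data p = {p_bad:.3f}")
```

Under the well-specified model the p-value is unremarkable, while the misspecified model is rejected decisively; the same resampling logic applies when the statistic comes from martingale estimating equations for a diffusion model.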

  17. Is tenure justified? An experimental study of faculty beliefs about tenure, promotion, and academic freedom.

    Science.gov (United States)

    Ceci, Stephen J; Williams, Wendy M; Mueller-Johnson, Katrin

    2006-12-01

    The behavioral sciences have come under attack for writings and speech that affront sensitivities. At such times, academic freedom and tenure are invoked to forestall efforts to censure and terminate jobs. We review the history and controversy surrounding academic freedom and tenure, and explore their meaning across different fields, at different institutions, and at different ranks. In a multifactoral experimental survey, 1,004 randomly selected faculty members from top-ranked institutions were asked how colleagues would typically respond when confronted with dilemmas concerning teaching, research, and wrong-doing. Full professors were perceived as being more likely to insist on having the academic freedom to teach unpopular courses, research controversial topics, and whistle-blow wrong-doing than were lower-ranked professors (even associate professors with tenure). Everyone thought that others were more likely to exercise academic freedom than they themselves were, and that promotion to full professor was a better predictor of who would exercise academic freedom than was the awarding of tenure. Few differences emerged related either to gender or type of institution, and behavioral scientists' beliefs were similar to scholars from other fields. In addition, no support was found for glib celebrations of tenure's sanctification of broadly defined academic freedoms. These findings challenge the assumption that tenure can be justified on the basis of fostering academic freedom, suggesting the need for a re-examination of the philosophical foundation and practical implications of tenure in today's academy.

  18. Can context justify an ethical double standard for clinical research in developing countries?

    Directory of Open Access Journals (Sweden)

    Landes Megan

    2005-07-01

    Full Text Available Abstract Background The design of clinical research deserves special caution so as to safeguard the rights of participating individuals. While the international community has agreed on ethical standards for the design of research, these frameworks still remain open to interpretation, revision and debate. Recently a breach in the consensus of how to apply these ethical standards to research in developing countries has occurred, notably beginning with the 1994 placebo-controlled trials to reduce maternal to child transmission of HIV-1 in Africa, Asia and the Caribbean. The design of these trials sparked intense debate with the inclusion of a placebo-control group despite the existence of a 'gold standard' and trial supporters grounded their justifications of the trial design on the context of scarcity in resource-poor settings. Discussion These 'contextual' apologetics are arguably an ethical loophole inherent in current bioethical methodology. However, this convenient appropriation of 'contextual' analysis simply fails to acknowledge the underpinnings of feminist ethical analysis upon which it must stand. A more rigorous analysis of the political, social, and economic structures pertaining to the global context of developing countries reveals that the bioethical principles of beneficence and justice fail to be met in this trial design. Conclusion Within this broader, and theoretically necessary, understanding of context, it becomes impossible to justify an ethical double standard for research in developing countries.

  19. What is left to justify the use of chlorhexidine in hand hygiene?

    Science.gov (United States)

    Kampf, Günter

    2008-10-01

The CDC guideline for hand hygiene describes chlorhexidine gluconate as an agent with "substantial residual activity". But not all studies support this claim. In both suspension tests (e.g. EN 13727) and tests under practical conditions (e.g. EN 1500) it is crucial to neutralize any residual activity in the sampling fluid in order to make sure that the agent does not continue to damage surviving cells after exposure. The neutralization step must also be validated. If this is not done, the efficacy may be significantly overestimated, and the healthcare professional may rely on data which do not represent the true efficacy of an agent. A review of eight studies which are cited to support "substantial residual activity" shows that none of them were performed with validated neutralization. Seven of them do not demonstrate any residual activity for chlorhexidine gluconate. Only one study describes some residual activity, but its design does not support this claim, as no neutralizing agents were used at all. The benefits of using an active agent must outweigh any risks in order to justify its use. If no real benefits are left for chlorhexidine gluconate in hand hygiene, the risks count even more: skin irritation, allergic reactions including anaphylactic shock, and acquired bacterial resistance. Unless there is new and valid evidence to clearly support a benefit of using chlorhexidine gluconate in hand hygiene, healthcare workers should prefer formulations without this agent.

  20. Developing and theoretically justifying innovative organizational practices in health information assurance

    Science.gov (United States)

    Collmann, Jeff R.

    2003-05-01

    This paper justifies and explains current efforts in the Military Health System (MHS) to enhance information assurance in light of the sociological debate between "Normal Accident" (NAT) and "High Reliability" (HRT) theorists. NAT argues that complex systems such as enterprise health information systems display multiple, interdependent interactions among diverse parts that potentially manifest unfamiliar, unplanned, or unexpected sequences that operators may not perceive or immediately understand, especially during emergencies. If the system functions rapidly with few breaks in time, space or process development, the effects of single failures ramify before operators understand or gain control of the incident thus producing catastrophic accidents. HRT counters that organizations with strong leadership support, continuous training, redundant safety features and "cultures of high reliability" contain the effects of component failures even in complex, tightly coupled systems. Building highly integrated, enterprise-wide computerized health information management systems risks creating the conditions for catastrophic breaches of data security as argued by NAT. The data security regulations of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) implicitly depend on the premises of High Reliability Theorists. Limitations in HRT thus have implications for both safe program design and compliance efforts. MHS and other health care organizations should consider both NAT and HRT when designing and deploying enterprise-wide computerized health information systems.

  1. Selection Criteria in Regime Switching Conditional Volatility Models

    Directory of Open Access Journals (Sweden)

    Thomas Chuffart

    2015-05-01

    Full Text Available A large number of nonlinear conditional heteroskedastic models have been proposed in the literature. Model selection is crucial to any statistical data analysis. In this article, we investigate whether the most commonly used selection criteria lead to the choice of the right specification in a regime switching framework. We focus on two types of models: the Logistic Smooth Transition GARCH and the Markov-Switching GARCH models. Simulation experiments reveal that information criteria and loss functions can lead to misspecification; BIC sometimes indicates the wrong regime switching framework. Depending on the Data Generating Process used in the experiments, great care is needed when choosing a criterion.
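    The role of the selection criteria in such experiments can be illustrated with a minimal, generic sketch. The AIC/BIC formulas are standard; the candidate model names, log-likelihood values and parameter counts below are hypothetical, not taken from the article:

    ```python
    import math

    def aic(loglik, k):
        """Akaike information criterion: smaller is better."""
        return 2 * k - 2 * loglik

    def bic(loglik, k, n):
        """Bayesian information criterion: penalizes each parameter by ln(n)."""
        return k * math.log(n) - 2 * loglik

    # Hypothetical fitted candidates: (name, maximized log-likelihood, #parameters)
    n = 1000  # sample size
    candidates = [("LST-GARCH", -1410.2, 7), ("MS-GARCH", -1407.9, 9)]

    scores = {name: (aic(ll, k), bic(ll, k, n)) for name, ll, k in candidates}
    best_aic = min(scores, key=lambda m: scores[m][0])
    best_bic = min(scores, key=lambda m: scores[m][1])
    # AIC and BIC can disagree: BIC's ln(n) penalty favors the smaller model,
    # which is exactly the kind of divergence the simulation study examines.
    ```

    With these illustrative numbers AIC picks the larger MS-GARCH while BIC picks the smaller LST-GARCH, showing how a criterion can point to the wrong regime switching framework.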

  2. Army Justified Initial Production Plan for the Paladin Integrated Management Program but Has Not Resolved Two Vehicle Performance Deficiencies (Redacted)

    Science.gov (United States)

    2016-08-05

    a total of 37 howitzers and deploy the first vehicles in March 2017. An extensive AFES redesign could require significant human resources and time... Department of Defense Report No. DODIG-2016-118, August 5, 2016 (Project No. D2016-D000AU-0003.000): Army Justified Initial Production Plan for the Paladin Integrated Management Program.

  3. Is routine thromboprophylaxis justified among Indian patients sustaining major orthopedic trauma? A systematic review

    Directory of Open Access Journals (Sweden)

    Ramesh K Sen

    2011-01-01

    Full Text Available Venous thromboembolism (VTE) is one of the most common preventable causes of morbidity and mortality after trauma. Though most western countries have guidelines for thromboprophylaxis in these patients, India still does not. The increasing detection of VTE among the Indian population, lack of awareness, underestimation of the risk, and fear of bleeding complications after chemical prophylaxis have made deep vein thrombosis (DVT) a serious problem; hence a standard guideline for thromboprophylaxis after trauma is essential. The present review article discusses the incidence of DVT and the role of thromboprophylaxis in Indian patients who have sustained major orthopedic trauma. A thorough search of 'PubMed' and 'Google Scholar' revealed 10 studies regarding venous thromboembolism in Indian patients after major orthopedic trauma surgery (hip or proximal femur fracture and spine injury). Most of these studies have evaluated venous thromboembolism in patients of arthroplasty and trauma. The incidence, risk factors, diagnosis and management of VTE in the subgroup of trauma patients (1049 patients) were separately evaluated after segregating them from the arthroplasty patients. Except for two studies, which were based on spinal injury, all other studies recommended screening/thromboprophylaxis in posttraumatic conditions in the Indian population. Color Doppler was used as the common diagnostic or screening tool in most of the studies (eight studies, 722 patients). The incidence of VTE in the thromboprophylaxis-receiving group was found to be 8% (10/125), whereas it was much higher (14.49%, 40/276) in patients not receiving any form of prophylaxis. Indian patients have a definite risk of venous thromboembolism after major orthopedic trauma (except spinal injury), and thromboprophylaxis by either chemical or mechanical methods seems to be justified in them.

  4. Application of Fibrin Glue Sealant After Hepatectomy Does Not Seem Justified

    Science.gov (United States)

    Figueras, Juan; Llado, Laura; Miro, Mónica; Ramos, Emilio; Torras, Jaume; Fabregat, Juan; Serrano, Teresa

    2007-01-01

    Objective: To evaluate the efficacy, amount of hemorrhage, biliary leakage, complications, and postoperative evolution after fibrin glue sealant application in patients undergoing liver resection. Summary Background Data: Fibrin sealants have become popular as a means of improving perioperative hemostasis and reducing biliary leakage after liver surgery. However, trials regarding its use in liver surgery remain limited and of poor methodologic quality. Patients and Methods: A total of 300 patients undergoing hepatic resection were randomly assigned to fibrin glue application or control groups. Characteristics and debit of drainage and postoperative complications were evaluated. The amount of blood loss, measurements of hematologic parameters liver test, and postoperative evolution (particularly involving biliary fistula and morbidity) was also recorded. Results: Postoperatively, no differences were observed in the amount of transfusion (0.15 ± 0.66 vs. 0.17 ± 0.63 PRCU; P = 0.7234) or in the patients that required transfusion (18% vs. 12%; P = 0.2), respectively, for the fibrin glue or control group. There were no differences in overall drainage volumes (1180 ± 2528 vs. 960 ± 1253 mL) or in days of postoperative drainage (7.9 ± 5 vs. 7.1 ± 4.7). Incidence of biliary fistula was similar in the fibrin glue and control groups, (10% vs. 11%). There were no differences regarding postoperative morbidity between groups (23% vs. 23%; P = 1). Conclusions: Application of fibrin sealant in the raw surface of the liver does not seem justified. Blood loss, transfusion, incidence of biliary fistula, and outcome are comparable to patients without fibrin glue. Therefore, discontinuation of routine use of fibrin sealant would result in significant cost saving. PMID:17414601

  5. Are sample sizes clear and justified in RCTs published in dental journals?

    Directory of Open Access Journals (Sweden)

    Despina Koletsi

    Full Text Available Sample size calculations are advocated by the CONSORT group to justify sample sizes in randomized controlled trials (RCTs). The aim of this study was primarily to evaluate the reporting of sample size calculations, to establish the accuracy of these calculations in dental RCTs and to explore potential predictors associated with adequate reporting. Electronic searching was undertaken in eight leading specific and general dental journals. Replication of sample size calculations was undertaken where possible. Assumed variances or odds for control and intervention groups were also compared against those observed. The relationship between parameters including journal type, number of authors, trial design, involvement of methodologist, single-/multi-center study and region and year of publication, and the accuracy of sample size reporting was assessed using univariable and multivariable logistic regression. Of 413 RCTs identified, sufficient information to allow replication of sample size calculations was provided in only 121 studies (29.3%). Recalculations demonstrated an overall median overestimation of sample size of 15.2% after provisions for losses to follow-up. There was evidence that journal, methodologist involvement (OR = 1.97, CI: 1.10, 3.53), multi-center settings (OR = 1.86, CI: 1.01, 3.43) and time since publication (OR = 1.24, CI: 1.12, 1.38) were significant predictors of adequate description of sample size assumptions. Among journals JCP had the highest odds of adequately reporting sufficient data to permit sample size recalculation, followed by AJODO and JDR, with 61% (OR = 0.39, CI: 0.19, 0.80) and 66% (OR = 0.34, CI: 0.15, 0.75) lower odds, respectively. Both assumed variances and odds were found to underestimate the observed values. Presentation of sample size calculations in the dental literature is suboptimal; incorrect assumptions may have a bearing on the power of RCTs.

  6. Are sample sizes clear and justified in RCTs published in dental journals?

    Science.gov (United States)

    Koletsi, Despina; Fleming, Padhraig S; Seehra, Jadbinder; Bagos, Pantelis G; Pandis, Nikolaos

    2014-01-01

    Sample size calculations are advocated by the CONSORT group to justify sample sizes in randomized controlled trials (RCTs). The aim of this study was primarily to evaluate the reporting of sample size calculations, to establish the accuracy of these calculations in dental RCTs and to explore potential predictors associated with adequate reporting. Electronic searching was undertaken in eight leading specific and general dental journals. Replication of sample size calculations was undertaken where possible. Assumed variances or odds for control and intervention groups were also compared against those observed. The relationship between parameters including journal type, number of authors, trial design, involvement of methodologist, single-/multi-center study and region and year of publication, and the accuracy of sample size reporting was assessed using univariable and multivariable logistic regression. Of 413 RCTs identified, sufficient information to allow replication of sample size calculations was provided in only 121 studies (29.3%). Recalculations demonstrated an overall median overestimation of sample size of 15.2% after provisions for losses to follow-up. There was evidence that journal, methodologist involvement (OR = 1.97, CI: 1.10, 3.53), multi-center settings (OR = 1.86, CI: 1.01, 3.43) and time since publication (OR = 1.24, CI: 1.12, 1.38) were significant predictors of adequate description of sample size assumptions. Among journals JCP had the highest odds of adequately reporting sufficient data to permit sample size recalculation, followed by AJODO and JDR, with 61% (OR = 0.39, CI: 0.19, 0.80) and 66% (OR = 0.34, CI: 0.15, 0.75) lower odds, respectively. Both assumed variances and odds were found to underestimate the observed values. Presentation of sample size calculations in the dental literature is suboptimal; incorrect assumptions may have a bearing on the power of RCTs.
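    The kind of calculation CONSORT asks authors to report can be sketched with the standard normal-approximation formula for comparing two proportions; the control and intervention rates and the 10% loss-to-follow-up figure below are hypothetical, purely for illustration:

    ```python
    from statistics import NormalDist
    import math

    def n_per_group(p1, p2, alpha=0.05, power=0.80):
        """Sample size per arm for comparing two proportions (normal approximation)."""
        z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance level
        z_b = NormalDist().inv_cdf(power)           # desired power
        var = p1 * (1 - p1) + p2 * (1 - p2)
        return math.ceil((z_a + z_b) ** 2 * var / (p1 - p2) ** 2)

    n = n_per_group(0.30, 0.15)      # hypothetical control vs intervention event rates
    n_inflated = math.ceil(n / 0.9)  # inflate for an assumed 10% loss to follow-up
    ```

    A trial report replicating this calculation would need to state the assumed rates, alpha, power and attrition provision, which is exactly the information the study found missing in most dental RCTs.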

  7. On emergencies and emigration: how (not) to justify compulsory medical service.

    Science.gov (United States)

    Blake, Michael

    2016-04-20

    I have argued that the best way to understand the supposed right to restrict emigration is with reference to the concept of an emergency; restrictions on emigration are permitted, if at all, only as responses to an emergency situation, and must be judged with reference to the ethics of responding to such an emergency. Eszter Kollar argues, against this, that the concept of 'emergency' fails to describe the actual situation in low/middle-income countries, in which shortages of medical personnel are long-standing problems; she also argues that there is no need to invoke the concept of an emergency, when we might simply discuss these restrictions with reference to the relative importance of the human goods and interests involved. I argue, against Kollar, that we have no reason to think that an emergency must involve novelty; if the moral stakes are significant enough, we have reason to think of a situation as an emergency, regardless of when that situation began. I argue, too, that we have reason to differentiate between restrictions of liberties undertaken as part of the process of specifying liberal freedoms and emergency restrictions of those liberties defended by liberalism itself. The latter, I suggest, ought to be recognised and defended as a distinct moral category, if only to recognise the continuing moral remainder when a liberal right is temporarily suspended under emergency circumstances. I conclude that a permission to restrict emigration is, if at all, only justifiable as an emergency response to unfavourable circumstances, and ought not to be analysed in the more conventional liberal terms Kollar deploys.

  8. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    Science.gov (United States)

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with bivariate binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities on the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is based only on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration.
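    The beta-binomial likelihood that the marginal model works with has a closed form, which can be sketched for one of the two margins. This is a minimal illustration only: the per-study counts are invented, and the paper's composite-likelihood machinery linking the two margins is not reproduced here:

    ```python
    import math

    def log_betabinom(k, n, a, b):
        """Log pmf of the beta-binomial: k events out of n with Beta(a, b) mixing."""
        def lbeta(x, y):
            return math.lgamma(x) + math.lgamma(y) - math.lgamma(x + y)
        return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                + lbeta(k + a, n - k + b) - lbeta(a, b))

    # Hypothetical per-study counts (events, size) for one margin, e.g.
    # true positives among diseased subjects across three studies:
    studies = [(45, 50), (30, 40), (70, 80)]

    def neg_loglik(a, b):
        """Negative log-likelihood for one margin; minimize over (a, b)."""
        return -sum(log_betabinom(k, n, a, b) for k, n in studies)

    # The marginal mean is a / (a + b), already on the probability scale,
    # so no transformation or link function is needed.
    ```

    Because each margin is modeled directly by this closed-form likelihood, the joint distribution of the study-specific probabilities never has to be specified, which is the robustness property the abstract emphasizes.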

  9. Upper Midwest farmer perceptions: Too much uncertainty about impacts of climate change to justify changing current agricultural practices

    Science.gov (United States)

    Lois Wright Morton; Gabrielle E. Roesch-McNally; Adam Wilke

    2017-01-01

    To be uncertain is to be unsure or have doubt. Results from a random sample survey show that the majority (89.5%) of farmers in the Upper Midwest perceived there was too much uncertainty about the impacts of climate change to justify changing their agricultural practices and strategies, despite scientific evidence regarding the causes and potential consequences of climate change....

  10. Is Carbon Motivated Border Tax Justifiable? (碳关税的合理性何在?)

    Institute of Scientific and Technical Information of China (English)

    林伯强; 李爱军

    2012-01-01

    In January 2012, the EU brought aviation into its emissions trading system, putting carbon motivated border taxes (CMBT) into practice. Given its large and growing carbon emissions, China will be affected by CMBT. CMBT could reduce China's carbon emissions, but domestic measures such as an energy tax or a carbon tax (termed here CMBT-emission-equivalent policies) could do so as well. Which policy option, then, is the more effective instrument for reducing carbon emissions? Put differently, can emission reduction justify CMBT? This paper applies a multi-country CGE model to analyze and answer these questions. The simulation results indicate significant differences between the effects of CMBT and of CMBT-emission-equivalent policies. Compared with the equivalent policies, CMBT would be more costly in reducing carbon emissions, would result in a higher carbon leakage rate, and would contribute less to the world's emission reduction. Therefore, carbon emission reduction alone does not justify CMBT. However, CMBT could

  11. Can "presumed consent" justify the duty to treat infectious diseases? An analysis

    Directory of Open Access Journals (Sweden)

    Arda Berna

    2008-03-01

    -fifth of the participants in this study either lacked adequate knowledge of the occupational risks when they chose the medical profession or were not sufficiently informed of these risks during their faculty education and training. Furthermore, in terms of the moral duty to provide care, it seems that most HCWs are more concerned about the availability of protective measures than about whether they had been informed of a particular risk beforehand. For all these reasons, the presumed consent argument is not persuasive enough, and cannot be used to justify the duty to provide care. It is therefore more useful to emphasize justifications other than presumed consent when defining the duty of HCWs to provide care, such as the social contract between society and the medical profession and the fact that HCWs have a greater ability to provide medical aid.

  12. [Cesarean birth: justifying indication or justified concern?].

    Science.gov (United States)

    Muñoz-Enciso, José Manuel; Rosales-Aujang, Enrique; Domínguez-Ponce, Guillermo; Serrano-Díaz, César Leopoldo

    2011-02-01

    Caesarean section is the most common surgery performed in all second-level hospitals of the health sector, and it is even more frequent in private hospitals in Mexico. Objective: to determine the pattern of caesarean sections in different health-sector hospitals in the city of Aguascalientes and to analyze their indications during the same period. A descriptive, cross-sectional study was conducted in the top four secondary-care hospitals of the health sector of the state of Aguascalientes, which together account for 81% of obstetric care in the state, from 1 September to 31 October 2008. The variables analyzed were: indication for caesarean section and its classification, previous pregnancies, marital status, gestational age, weight and one-minute Apgar score of the newborn, and the contraceptive method provided during the event. During the study period, 2,964 pregnancies after 29 weeks were recorded, of which 1,195 were resolved by caesarean section, an overall rate of 40.3%. We found 45 different indications, which undoubtedly reflect the great diversity of views among institutional medical staff on when to schedule a caesarean section. Although each institution has different resources and a population with different characteristics, treatment protocols should be developed by the staff of each hospital, with the trial of labor as a cornerstone, and a second opinion should be requested before any caesarean section, all in an attempt to reduce the frequency of caesarean sections.

  13. Expectation Maximization Algorithm for Box-Cox Transformation Cure Rate Model and Assessment of Model Mis-specification under Weibull Lifetimes.

    Science.gov (United States)

    Pal, Suvra; Balakrishnan, N

    2017-05-16

    In this paper, we develop likelihood inference based on the expectation maximization (EM) algorithm for the Box-Cox transformation cure rate model, assuming the lifetimes to follow a Weibull distribution. A simulation study is carried out to demonstrate the performance of the proposed estimation method. Through Monte Carlo simulations, we also study the effect of model mis-specification on the estimate of the cure rate. Finally, we analyze well-known data on melanoma with the model and the inferential method developed here.
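    The E- and M-steps for a cure rate model can be illustrated with a deliberately stripped-down version: a two-component mixture cure model with exponential (not Weibull) lifetimes and no covariates. This is not the Box-Cox transformation model of the paper, only a sketch of the EM mechanics on simulated data with invented parameter values:

    ```python
    import math
    import random

    random.seed(1)

    # Simulate: 30% cured (never fail), susceptibles ~ Exponential(0.5), censor at t=8
    true_pi, true_lam, cens = 0.3, 0.5, 8.0
    data = []
    for _ in range(2000):
        if random.random() < true_pi:
            t, d = cens, 0                      # cured: censored at study end
        else:
            t = random.expovariate(true_lam)
            t, d = (t, 1) if t < cens else (cens, 0)
        data.append((t, d))

    # EM for the simplified mixture cure model
    pi, lam = 0.5, 1.0                          # crude starting values
    for _ in range(200):
        # E-step: posterior probability of being cured (zero for observed events)
        w = [0.0 if d else pi / (pi + (1 - pi) * math.exp(-lam * t)) for t, d in data]
        # M-step: update cure fraction and exponential rate in closed form
        pi = sum(w) / len(w)
        lam = sum(d for _, d in data) / sum((1 - wi) * t for wi, (t, _) in zip(w, data))
    ```

    With this sample size the estimates land close to the true values (0.3, 0.5); the paper's model replaces the exponential with a Weibull and embeds the mixture in a Box-Cox transformation family, but the alternating E/M structure is the same.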

  14. Technical Data to Justify Full Burnup Credit in Criticality Safety Licensing Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Enercon Services, Inc.

    2011-03-14

    Enercon Services, Inc. (ENERCON) was requested under Task Order No. 2 to identify scientific and technical data needed to benchmark and justify Full Burnup Credit, which adds 16 fission products and 4 minor actinides to Actinide-Only burnup credit. The historical perspective for Full Burnup Credit is discussed, and interviews of organizations participating in burnup credit activities are summarized as a basis for identifying additional data needs and making recommendations. Input from burnup credit participants representing two segments of the commercial nuclear industry is provided. First, the Electric Power Research Institute (EPRI) has been very active in the development of Full Burnup Credit, representing the interests of nuclear utilities in achieving capacity gains for storage and transport casks. EPRI and its utility customers are interested in a swift resolution of the validation issues that are delaying the implementation of Full Burnup Credit [EPRI 2010b]. Second, used nuclear fuel storage and transportation cask vendors favor improving burnup credit beyond Actinide-Only burnup credit, although their discussion of specific burnup credit achievements and data needs was limited, citing business-sensitive and technical proprietary concerns. While cask vendor proprietary items are not specifically identified in this report, the needs of all nuclear industry participants are reflected in the conclusions and recommendations of this report. In addition, Oak Ridge National Laboratory (ORNL) and Sandia National Laboratories (SNL) were interviewed for their input into additional data needs to achieve Full Burnup Credit. ORNL was very open to discussions of Full Burnup Credit, with several telecoms and a visit by ENERCON to ORNL. For many years, ORNL has provided extensive support to the NRC regarding burnup credit in all of its forms. Discussions with ORNL focused on potential resolutions to the validation issues for the use of fission products. SNL was helpful in

  15. Model-checking techniques based on cumulative residuals.

    Science.gov (United States)

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
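    The cumulative-residual construction can be sketched for ordinary linear regression. This simplified illustration omits the correction for estimated coefficients that the paper's zero-mean Gaussian approximation carries, so it shows the idea, not the full procedure; the data-generating model (a quadratic truth fitted by a line) is invented for the example:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated data: the fitted linear model is misspecified (the truth is quadratic)
    n = 200
    x = rng.uniform(-2, 2, n)
    y = x + 1.5 * x**2 + rng.normal(0, 1, n)
    X = np.column_stack([np.ones(n), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta

    # Observed process: cumulative sum of residuals over the covariate ordering
    order = np.argsort(x)
    W_obs = np.cumsum(resid[order]) / np.sqrt(n)
    sup_obs = np.max(np.abs(W_obs))

    # Null realizations: perturb residuals with standard normal multipliers,
    # a simplified stand-in for the simulated zero-mean Gaussian processes
    sups = []
    for _ in range(500):
        g = rng.normal(0, 1, n)
        sups.append(np.max(np.abs(np.cumsum((resid * g)[order]) / np.sqrt(n))))
    p_value = np.mean(np.array(sups) >= sup_obs)  # a small p flags misspecification
    ```

    Comparing the supremum of the observed process with the simulated ones turns the subjective "does this residual plot trend?" question into an objective tail probability, which is the heart of the proposed technique.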

  16. Relational and geometric approaches to justifying the magnetic fields of astrophysical objects

    Science.gov (United States)

    Babenko, I. A.

    We propose a justification of Sutherland's hypotheses about the origin of the magnetic fields of the Earth, the Sun and other astrophysical objects within the framework of the relational theory of space-time and interactions ("binary geometrophysics") and of multidimensional geometrical models of physical interactions (in the spirit of Kaluza-Klein theories).

  17. Justifying the Gompertz curve of mortality via the generalized Polya process of shocks.

    Science.gov (United States)

    Cha, Ji Hwan; Finkelstein, Maxim

    2016-06-01

    A new probabilistic model of aging that can be applied to organisms is suggested and analyzed. Organisms are subject to shocks that follow the generalized Polya process (GPP), which has been recently introduced and characterized in the literature. Distinct from the nonhomogeneous Poisson process that has been widely used in applications, the important feature of this process is the dependence of its future behavior on the number of previous events (shocks). The corresponding survival and the mortality rate functions are derived and analyzed. The general approach is used for justification of the Gompertz law of human mortality.
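    The Gompertz law that the shock model recovers has a simple closed form for the mortality rate and the implied survival function; a small sketch follows, with parameter values that are illustrative only, not taken from the paper:

    ```python
    import math

    def gompertz_hazard(t, a, b):
        """Gompertz mortality rate: increases exponentially with age t."""
        return a * math.exp(b * t)

    def gompertz_survival(t, a, b):
        """Survival function implied by the Gompertz hazard."""
        return math.exp(-(a / b) * (math.exp(b * t) - 1))

    # Illustrative adult-mortality parameters (hypothetical values)
    a, b = 1e-4, 0.085
    # First age at which survival drops below one half
    half = next(t for t in range(121) if gompertz_survival(t, a, b) < 0.5)
    ```

    With these illustrative parameters the median lifetime falls in the mid-seventies; the paper's contribution is to derive this exponentially increasing mortality rate from the generalized Polya process of shocks rather than to posit it.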

  18. Estimation and model selection of semiparametric multivariate survival functions under general censorship.

    Science.gov (United States)

    Chen, Xiaohong; Fan, Yanqin; Pouzo, Demian; Ying, Zhiliang

    2010-07-01

    We study estimation and model selection of semiparametric models of multivariate survival functions for censored data, which are characterized by possibly misspecified parametric copulas and nonparametric marginal survivals. We obtain the consistency and root-n asymptotic normality of a two-step copula estimator to the pseudo-true copula parameter value according to KLIC, and provide a simple consistent estimator of its asymptotic variance, allowing for a first-step nonparametric estimation of the marginal survivals. We establish the asymptotic distribution of the penalized pseudo-likelihood ratio statistic for comparing multiple semiparametric multivariate survival functions subject to copula misspecification and general censorship. An empirical application is provided.
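    The two-step idea can be sketched in the uncensored case: estimate the margins nonparametrically (here via rank-based pseudo-observations), then maximize a parametric copula pseudo-likelihood that may be misspecified. The Clayton copula, the simulated data and the grid search below are illustrative choices, and the paper's general censorship setting is not handled:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    theta_true, n = 2.0, 2000

    # Simulate from a Clayton copula by the conditional-inverse method
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = (u**-theta_true * (w**(-theta_true / (1 + theta_true)) - 1) + 1) ** (-1 / theta_true)

    # Step 1 (nonparametric margins): rank-based pseudo-observations in (0, 1)
    def pseudo_obs(x):
        return (np.argsort(np.argsort(x)) + 1) / (len(x) + 1)

    pu, pv = pseudo_obs(u), pseudo_obs(v)

    # Step 2: maximize the Clayton copula log-likelihood over a parameter grid
    def clayton_loglik(t):
        s = pu**-t + pv**-t - 1
        return np.sum(np.log(1 + t) - (t + 1) * np.log(pu * pv) - (2 + 1 / t) * np.log(s))

    grid = np.linspace(0.2, 6.0, 59)
    theta_hat = grid[np.argmax([clayton_loglik(t) for t in grid])]
    ```

    When the copula family is misspecified, the same second step converges to the pseudo-true parameter value in the KLIC sense, which is exactly what the paper's asymptotic theory characterizes.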

  19. Can the War in Iraq Be Justified as a Case of Humanitarian Intervention? (La guerre en Irak peut-elle être justifiée comme un cas d'intervention humanitaire?)

    Directory of Open Access Journals (Sweden)

    Stéphane Courtois

    2006-05-01

    Full Text Available Most current criticisms of the intervention in Iraq have tackled the two justifications articulated by the members of the coalition: (1) that the United States had to neutralize the threats that Iraq posed to their own security and to political stability in the Middle East, and (2) that the war in Iraq can be justified as a necessary stage in the war against international terrorism. The principal objection against justification (1) is that it was, and remains, unfounded. Against justification (2), many have replied that the intervention in Iraq had no connection, or at best merely an indirect connection, with the fight against terrorism. In a recent text, Fernando Tesón claims that the American intervention in Iraq can nevertheless be morally justified as a case of humanitarian intervention. By "humanitarian intervention", one must understand a coercive action taken by a state or a group of states inside the sphere of jurisdiction of an independent political community, without the permission of the latter, in order to prevent or to end a massive violation of individual rights perpetrated against innocent persons who are not co-nationals inside this political community. I argue in this article that the American intervention in Iraq does not satisfy the conditions of a legitimate humanitarian intervention, contrary to what Fernando Tesón claims.

  20. Don't Lie but Don't Tell the Whole Truth: The Therapeutic Privilege - Is it Ever Justified?

    Science.gov (United States)

    Edwin, Ak

    2008-12-01

    This position paper will show that withholding information from a competent patient is a violation of the doctor's role as a fiduciary and is never justified. As a fiduciary, the doctor's relationship with his or her patient must be one of candour, since it will be impossible for the patient to trust the doctor without regular candid information regarding the patient's condition and its outcome. Although the use of the therapeutic privilege has been recognized by several courts and is supported by scientific literature, I will explore why withholding information from a competent patient is a violation of the doctor's role as a fiduciary and as such is not legally or ethically defensible. While some courts have recognized the therapeutic privilege as a way of promoting patient wellbeing and respecting the Hippocratic dictum of "primum non nocere" (or first do no harm), my position is that this is not ethically justifiable. Since information is a powerful tool for both harm and good, consciously withholding information from competent patients disempowers them and requires greater justification than patient welfare. Even though there is legal recognition of the therapeutic privilege, it is not applicable on ethical grounds. In addition to disrespecting autonomy, withholding information from competent patients does not benefit them in the long run and can actually cause more harm than good. Consequently, a doctor who withholds information from a competent patient, except in the exceptional case of patient waiver, violates the ethical principles of autonomy, beneficence and nonmaleficence.

  1. Does the Occasion Justify the Denunciation?: a Multilevel Approach for Brazilian Accountants

    Directory of Open Access Journals (Sweden)

    Bernardo de Abreu Guelber Fajardo

    2014-01-01

    Full Text Available Fraud represents large losses to the global economy, and one of the main means of containing it is through denunciations within organizations: whistleblowing. This research analyzes whistleblowing within the Brazilian context, considering the influence of costs and intrinsic benefits as well as aspects of the individual's interaction with his or her organization, profession and society at large. Based on a questionnaire answered by 124 accountants, a multilevel model was applied to analyze these aspects. The results demonstrate the importance of situational aspects as a positive influence in favor of denunciations. These results are useful for organizations and regulatory institutions in developing institutional mechanisms to encourage denunciation. Moreover, they are also useful for teachers of professional ethics and for members of the Federal and Regional Accounting Councils, which are dedicated to the assessment of alleged deviations from the professional code of ethics.

  2. Legionella pneumophila, armed to the hilt: justifying the largest arsenal of effectors in the bacterial world.

    Science.gov (United States)

    Ensminger, Alexander W

    2016-02-01

    Many bacterial pathogens use dedicated translocation systems to deliver arsenals of effector proteins to their hosts. Once inside the host cytosol, these effectors modulate eukaryotic cell biology to acquire nutrients, block microbial degradation, subvert host defenses, and enable pathogen transmission to other hosts. Among all bacterial pathogens studied to date, the gram-negative pathogen, Legionella pneumophila, maintains the largest arsenal of effectors, with over 330 effector proteins translocated by the Dot/Icm type IVB translocation system. In this review, I will discuss some of the recent work on understanding the consequences of this large arsenal. I will also present several models that seek to explain how L. pneumophila has acquired and subsequently maintained so many more effectors than its peers.

  3. Is the use of wildlife group-specific concentration ratios justified?

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Michael D. [School of Environment and Life Sciences, University of Salford, Manchester, M4 4WT (United Kingdom); Beresford, Nicholas A. [School of Environment and Life Sciences, University of Salford, Manchester, M4 4WT (United Kingdom); Centre for Ecology and Hydrology, Bailrigg, Lancaster, LA1 4AP (United Kingdom); Copplestone, David [School of Natural Sciences, University of Stirling, Stirling, FK9 4LA (United Kingdom); Howard, Brenda J. [Centre for Ecology and Hydrology, Bailrigg, Lancaster, LA1 4AP (United Kingdom); Yankovich, Tamara L. [International Atomic Energy Agency, Vienna International Centre, 1400 Vienna (Austria)

    2014-07-01

    The international Wildlife Transfer Database (WTD; www.wildlifetransferdatabase.org/?) provides the most comprehensive international compilation of radionuclide transfer parameters (concentration ratios) for wildlife. The concentration ratio (CR{sub wo-media}) is a constant that describes the ratio between the activity concentration of a radionuclide in the whole organism and the activity concentration of that radionuclide in a reference environmental medium (e.g. soil or filtered water). Developed to support activities of the International Atomic Energy Agency (IAEA) and the International Commission on Radiological Protection (ICRP), the WTD now contains over 100,000 CR{sub wo-media} values. The WTD has been used to generate summary statistics for broad wildlife groups (e.g. amphibian, arthropod, mammal, reptile, shrub, tree, etc.). The group-specific summary statistics include the mean and standard deviation (both arithmetic and geometric) and the range. These summarised CR{sub wo-media} values (generally arithmetic or geometric means) are used in most of the modelling approaches currently implemented for wildlife dose assessment. Beyond the broad organism group summary statistics presented within the WTD, it is possible to generate CR{sub wo-media} summary statistics for some organism sub-categories (e.g. carnivorous, herbivorous and omnivorous birds). However, using a statistical analysis we developed recently for the analysis of summarised datasets, we have shown that there is currently little statistical justification for the use of organism sub-category CR{sub wo-media} values. Large variability is a characteristic of many of the organism-radionuclide datasets within the WTD, even within individual input data sets. Therefore, the statistical validity of defining different CR{sub wo-media} values for these broad wildlife groups may also be questioned. However, no analysis has been undertaken to date to determine the statistical significance of any differences between

  4. Should she be granted asylum? Examining the justifiability of the persecution criterion and nexus clause in asylum law

    Directory of Open Access Journals (Sweden)

    Noa Wirth Nogradi

    2016-10-01

    The current international asylum regime recognizes only persecuted persons as rightful asylum applicants. The Geneva Convention and Protocol enumerate specific grounds upon which persecution is recognized. Claimants who cannot demonstrate a real risk of persecution based on one of the recognized grounds are unlikely to be granted asylum. This paper aims to relate real-world practices to normative theories, asking whether the Convention’s restricted preference towards persecuted persons is normatively justified. I intend to show that the justifications of the persecution criterion also apply to grounds currently lacking recognition. My main concern will be persecution on the grounds of gender. The first section introduces the dominant standpoints in theories of asylum, which give different answers to the question of who should be granted asylum, based on different normative considerations. Humanitarian theories base their claims on the factual neediness of asylum-seekers, holding that whoever is in grave danger of harm or deprivation should be granted asylum. Political theories base their justifications on conceptions of legitimacy and membership, holding that whoever has been denied membership in their original state should be granted asylum. Under political theories, Matthew Price’s theory will be discussed, which provides a normative justification of the currently recognized persecution criterion. The second section provides a descriptive definition of persecution based on Kuosmanen (2014), and evaluates the normative relevance of the different elements of this definition based on the theories presented previously. The third section is devoted to the examination of the normative justifiability of the nexus clause’s exclusive list of the bases (grounds) upon which persons might be persecuted. The section argues that while the clause does not recognize that persecution might be based on gender, in fact many women experience harms based on

  5. A Lacanian Reading of the Two Novels The Scarlet Letter And Private Memoirs And Confessions of A Justified Sinner

    Directory of Open Access Journals (Sweden)

    Marjan Yazdanpanahi

    2016-07-01

    This paper discusses two novels, The Private Memoirs and Confessions of a Justified Sinner and The Scarlet Letter, written by James Hogg and Nathaniel Hawthorne, from the perspective of Jacques Lacan's theories: the mirror stage, the name-of-the-father and desire. The mirror stage refers to historical value and an essential libidinal relationship with the body-image. The name-of-the-father is defined as the prohibitive role of the father as the one who lays down the incest taboo in the Oedipus complex. Meanwhile, desire is neither the appetite for satisfaction nor the demand for love, but the difference that results from the subtraction of the first from the second.

  6. PET/CT in cancer: moderate sample sizes may suffice to justify replacement of a regional gold standard

    DEFF Research Database (Denmark)

    Gerke, Oke; Poulsen, Mads Hvid; Bouchelouche, Kirsten

    2009-01-01

    PURPOSE: For certain cancer indications, the current patient evaluation strategy is a perfect but locally restricted gold standard procedure. If positron emission tomography/computed tomography (PET/CT) can be shown to be reliable within the gold standard region and if it can be argued that PET/CT also performs well in adjacent areas, then sample sizes in accuracy studies can be reduced. PROCEDURES: Traditional standard power calculations for demonstrating sensitivities of both 80% and 90% are shown. The argument is then described in general terms and demonstrated by an ongoing study of metastasized prostate cancer. RESULTS: An added value in accuracy of PET/CT in adjacent areas can outweigh a downsized target level of accuracy in the gold standard region, justifying smaller sample sizes. CONCLUSIONS: If PET/CT provides an accuracy benefit in adjacent regions, then sample sizes can be reduced.
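The power-calculation logic above can be sketched with the standard normal-approximation sample-size formula for estimating a sensitivity to a given confidence-interval half-width. The 80% and 90% sensitivity targets come from the abstract; the ±5% half-width and 95% confidence level are illustrative assumptions, not figures from the study.

```python
import math

def sample_size_for_sensitivity(p, half_width, z=1.96):
    """Patients with disease needed so a 95% CI for sensitivity p
    has the requested half-width (normal approximation):
    n = z^2 * p * (1 - p) / d^2."""
    return math.ceil(z ** 2 * p * (1 - p) / half_width ** 2)

# Target sensitivities of 80% and 90%, assumed +/-5% precision
print(sample_size_for_sensitivity(0.80, 0.05))  # 246
print(sample_size_for_sensitivity(0.90, 0.05))  # 139
```

Because p(1 - p) shrinks as the target sensitivity rises, demonstrating 90% sensitivity needs notably fewer confirmed-disease patients than 80% at the same precision.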

  7. Is a Clean Development Mechanism project economically justified? Case study of an International Carbon Sequestration Project in Iran.

    Science.gov (United States)

    Katircioglu, Salih; Dalir, Sara; Olya, Hossein G

    2016-01-01

    The present study evaluates a carbon sequestration project involving three plant species in arid and semiarid regions of Iran. Results show that Haloxylon performed appropriately in the carbon sequestration process during the 6 years of the International Carbon Sequestration Project (ICSP). In addition to a high degree of carbon dioxide sequestration, Haloxylon shows high compatibility with severe environmental conditions and low maintenance costs. Financial and economic analysis demonstrated that the ICSP was justified from an economic perspective. The financial assessment showed that the net present value (NPV) (US$1,098,022.70), internal rate of return (IRR) (21.53%), and payback period (6 years) were in an acceptable range. The results of the economic analysis suggested an NPV of US$4,407,805.15 and an IRR of 50.63%. Therefore, the results of this study suggest that there are sufficient incentives for investors to participate in such Clean Development Mechanism (CDM) projects.
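The NPV and IRR figures above follow from standard discounted-cash-flow formulas, sketched below. The cash flows in the example are hypothetical placeholders, not the ICSP's actual data.

```python
def npv(rate, cashflows):
    """Net present value of cashflows[t] received at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return: the discount rate at which NPV = 0.
    Found by bisection, assuming a single sign change of NPV on [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Hypothetical project: pay 100 now, receive 60 at the end of years 1 and 2
flows = [-100.0, 60.0, 60.0]
print(round(npv(0.10, flows), 2))  # 4.13
print(round(irr(flows), 4))        # 0.1307
```

A project is economically justified in this sense when NPV is positive at the chosen discount rate, i.e. when the IRR exceeds that rate, which is the criterion the abstract applies.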

  8. The protection of fundamental rights in the Netherlands and South Africa compared: can the many differences be justified?

    Directory of Open Access Journals (Sweden)

    G van der Schyff

    2008-08-01

    This contribution considers the protection of fundamental rights in the Netherlands and South Africa. Both countries strive to be constitutional democracies that respect basic rights. But both countries go about this aim in very different ways. These different paths to constitutionalism are compared, as well as the reasons for these differences and whether it can be said that these differences are justifiable. This is done by comparing the character of the rights guaranteed in the Dutch and South African legal orders, the sources of these rights and the locus or centre of protection in both systems. The conclusion is reached that no single or perfect route to attaining the desired protection of fundamental rights exists, but that one should always enquire as to the state of individual freedom and the right to make free political choices in measuring the worth of a system's protection of rights.

  9. Most Antidepressant Use in Primary Care Is Justified; Results of the Netherlands Study of Depression and Anxiety

    Science.gov (United States)

    Piek, Ellen; van der Meer, Klaas; Hoogendijk, Witte J. G.; Penninx, Brenda W. J. H.; Nolen, Willem A.

    2011-01-01

    Background: Depression is a common illness, often treated in primary care. Many studies have reported undertreatment with antidepressants in primary care. Recently, some studies have also reported overtreatment with antidepressants. The present study was designed to assess whether treatment with antidepressants in primary care is in accordance with current guidelines, with a special focus on overtreatment. Methodology: We used baseline data of primary care respondents from the Netherlands Study of Depression and Anxiety (NESDA) (n = 1610). Seventy-nine patients with treatment in secondary care were excluded. We assessed the justification for treatment with an antidepressant according to the Dutch primary care guidelines for depression and for anxiety disorders. Use of antidepressants was based on drug-container inspection or, if unavailable, on self-report. Results were recalculated to the original population of primary care patients from which the participants in NESDA were selected (n = 10,677). Principal Findings: Of the 1531 included primary care patients, 199 (13%) used an antidepressant, of whom 188 (94.5%) were (possibly) justified. After recalculating these numbers to the original population (n = 10,677), we found 908 (95% CI 823 to 994) antidepressant users. Forty-nine of them (5.4%; 95% CI 20 to 78) had no current justification for an antidepressant, but 27 of these (54.5%) had had a justified reason for an antidepressant at some earlier point in their life. Conclusions: We found that overtreatment with antidepressants in primary care is not a frequent problem. Overly long continuation of treatment, rather than inappropriate initiation, seems to explain the largest proportion of overtreatment. PMID:21479264

  11. Justifier l’injustifiable [Justifying the Unjustifiable]

    Directory of Open Access Journals (Sweden)

    Olivier Jouanjan

    2006-04-01

    "Law" also resides in the discourses held about it, notably the discourses of jurists. Analysis of the discourses of the committed jurists of the Third Reich brings out a general scheme of justification, a generative grammatical principle of these discourses that can be described as "substantive decisionism". Legal positivism, because it was abstract and "Jewish", was designated as the principal enemy of Nazi "legal" science, a "science" that could conceive of itself only as political. By analysing the ideological-legal construction of the total state, the destruction of the notion of subjective rights, and the substitution of a "concrete" notion of "being-a-member-of-the-community" for the concept of legal personality, and then by showing how these discourses functioned in practice, the present contribution brings out the double logic of incorporation and incarnation at work in the Nazi science of law, a "science" whose "theory" Carl Schmitt provided in 1934 through "concrete-order thinking".

  12. Justifying Educational Language Rights

    Science.gov (United States)

    May, Stephen

    2014-01-01

    The author of this chapter observes that post-9/11 there has been a rapid and significant retrenchment of multiculturalism as public policy, particularly within education. This apparent retrenchment of multiculturalism as public policy has been bolstered by parallel arguments for a more "cosmopolitan" approach to education within an…

  13. E-governance justified

    Directory of Open Access Journals (Sweden)

    William Akotam Agangiba

    2013-03-01

    Information and Communication Technology has become an indispensable part of our lives, gaining wide application in human activities because its use is less expensive and more secure and allows speedy information transmission and access. It serves as a good base for the development and success of today’s relatively young technologies, and it has relieved people of manual day-to-day activities in areas such as business organizations, the transport industry, teaching and research, banking, broadcasting and entertainment, among others. This paper presents an overview of e-governance, one of the most demanding applications of information and communication technology for public services. The paper summarizes the concept of e-governance, its major essence and some ongoing e-governance activities in some parts of the world.

  14. Is Sport Nationalism Justifiable?

    Directory of Open Access Journals (Sweden)

    José Luis Pérez Triviño

    2015-07-01

    The article aims to clarify the deep relationships between sport and nationalism by considering, among other factors, the instrumentalisation of sport by political elites, the political apathy of citizens, economic resources for sport, the question of violence and identitarian matters. In order to determine whether the combination of sport and nationalism is admissible, the paper defines sport nationalism and distinguishes the political use of sport for purposes of domestic and foreign policy. The first section analyses whether a causal link can be established with respect to sport's contribution to violence; with respect to sport's use in the internal politics of a state, the paper differentiates between normal political circumstances and political crises in order to properly address the question of whether there are grounds to assert that sport can distract citizens from asserting their genuine interests.

  16. Topics in modelling of clustered data

    CERN Document Server

    Aerts, Marc; Ryan, Louise M; Geys, Helena

    2002-01-01

    Many methods for analyzing clustered data exist, all with advantages and limitations in particular applications. Compiled from the contributions of leading specialists in the field, Topics in Modelling of Clustered Data describes the tools and techniques for modelling the clustered data often encountered in medical, biological, environmental, and social science studies. It focuses on providing a comprehensive treatment of marginal, conditional, and random effects models using, among others, likelihood, pseudo-likelihood, and generalized estimating equations methods. The authors motivate and illustrate all aspects of these models in a variety of real applications. They discuss several variations and extensions, including individual-level covariates and combined continuous and discrete outcomes. Flexible modelling with fractional and local polynomials, omnibus lack-of-fit tests, robustification against misspecification, exact, and bootstrap inferential procedures all receive extensive treatment. The application...

  17. Rethinking Recruitment in Policing in Australia: Can the Continued Use of Masculinised Recruitment Tests and Pass Standards that Limit the Number of Women be Justified?

    Directory of Open Access Journals (Sweden)

    Susan Robinson

    2015-06-01

    Over the past couple of decades, Australian police organisations have sought to increase the numbers of women in sworn policing roles by strictly adhering to equal treatment of men and women in the recruitment process. Unfortunately, this blind adherence to equal treatment in recruitment may inadvertently disadvantage and limit women. In particular, the emphasis on masculine attributes in recruitment, as opposed to the ‘soft’ attributes of communication and conflict-resolution skills, and the setting of minimum pass standards according to average male performance, disproportionately disadvantage women and serve to unnecessarily limit the number of women in policing. This paper reviews studies undertaken by physiotherapists and a range of occupational experts to discuss the relevance of physical fitness and agility tests, and the pass standards applied to them, in policing. It is suggested that masculinised recruitment tests that pose an unnecessary barrier to women cannot be justified unless directly linked to the job to be undertaken. Utilising a policy development and review model, the paper analyses the problem posed by physical testing that is unadjusted for gender. As a result, it is recommended that police organisations objectively review recruitment processes and requirements to identify and eliminate unnecessary barriers to women’s entry to policing. It is also recommended that, where fitness and agility tests are deemed essential to the job, the pass level be adjusted for gender.

  18. Spatial measurement error and correction by spatial SIMEX in linear regression models when using predicted air pollution exposures.

    Science.gov (United States)

    Alexeeff, Stacey E; Carroll, Raymond J; Coull, Brent

    2016-04-01

    Spatial modeling of air pollution exposures is widespread in air pollution epidemiology research as a way to improve exposure assessment. However, there are key sources of exposure model uncertainty when air pollution is modeled, including estimation error and model misspecification. We examine the use of predicted air pollution levels in linear health effect models under a measurement error framework. For the prediction of air pollution exposures, we consider a universal Kriging framework, which may include land-use regression terms in the mean function and a spatial covariance structure for the residuals. We derive the bias induced by estimation error and by model misspecification in the exposure model, and we find that a misspecified exposure model can induce asymptotic bias in the effect estimate of air pollution on health. We propose a new spatial simulation extrapolation (SIMEX) procedure, and we demonstrate that the procedure has good performance in correcting this asymptotic bias. We illustrate spatial SIMEX in a study of air pollution and birthweight in Massachusetts.
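The bias-correction idea behind SIMEX can be illustrated with classical (non-spatial) SIMEX for a single error-prone covariate: extra measurement error is added at increasing levels λ, the naive estimator is refit at each level, and a curve fit to the results is extrapolated back to λ = -1 (no measurement error). The exposure model, error variance, and λ grid below are illustrative assumptions, far simpler than the spatial SIMEX procedure the paper proposes.

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta_true, sigma_u = 5000, 2.0, 0.5

x = rng.standard_normal(n)                   # true exposure (unobserved)
w = x + sigma_u * rng.standard_normal(n)     # error-prone measurement
y = beta_true * x + rng.standard_normal(n)   # health outcome

def ols_slope(w, y):
    return np.cov(w, y, bias=True)[0, 1] / np.var(w)

# Simulation step: add extra measurement error at levels lambda >= 0
lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
est = np.array([
    np.mean([ols_slope(w + np.sqrt(lam) * sigma_u * rng.standard_normal(n), y)
             for _ in range(50)])
    for lam in lambdas
])

# Extrapolation step: fit a quadratic in lambda, evaluate at lambda = -1
beta_simex = np.polyval(np.polyfit(lambdas, est, 2), -1.0)
print(round(est[0], 2), round(beta_simex, 2))  # naive ~1.6, corrected ~1.9
```

The naive slope is attenuated toward zero by the factor σ²ₓ/(σ²ₓ + σ²ᵤ) = 0.8 here; the quadratic extrapolation recovers most, though not all, of the attenuation bias, which is the behaviour the paper's spatial variant is designed to improve on.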

  19. Catastrophic Decline of World's Largest Primate: 80% Loss of Grauer's Gorilla (Gorilla beringei graueri) Population Justifies Critically Endangered Status.

    Science.gov (United States)

    Plumptre, Andrew J; Nixon, Stuart; Kujirakwinja, Deo K; Vieilledent, Ghislain; Critchlow, Rob; Williamson, Elizabeth A; Nishuli, Radar; Kirkby, Andrew E; Hall, Jefferson S

    2016-01-01

    Grauer's gorilla (Gorilla beringei graueri), the world's largest primate, is confined to eastern Democratic Republic of Congo (DRC) and is threatened by civil war and insecurity. During the war, armed groups in mining camps relied on hunting bushmeat, including gorillas. Insecurity and the presence of several militia groups across Grauer's gorilla's range made it very difficult to assess their population size. Here we use a novel method that enables rigorous assessment of local community and ranger-collected data on gorilla occupancy to evaluate the impacts of civil war on Grauer's gorilla, which prior to the war was estimated to number 16,900 individuals. We show that gorilla numbers in their stronghold of Kahuzi-Biega National Park have declined by 87%. Encounter rate data of gorilla nests at 10 sites across its range indicate declines of 82-100% at six of these sites. Spatial occupancy analysis identifies three key areas as the most critical sites for the remaining populations of this ape and that the range of this taxon is around 19,700 km². We estimate that only 3,800 Grauer's gorillas remain in the wild, a 77% decline in one generation, justifying its elevation to Critically Endangered status on the IUCN Red List of Threatened Species.

  20. [Justifying genetic and immune markers of efficiency and sensitivity under combined exposure to risk factors in mining industry workers].

    Science.gov (United States)

    Dolgikh, O V; Zaitseva, N V; Krivtsov, A V; Gorshkova, K G; Lanin, D V; Bubnova, O A; Dianova, D G; Lykhina, T S; Vdovina, N A

    2014-01-01

    The authors evaluated and justified immunologic and genetic markers under combined exposure to risk factors in mining industry workers. The analysis covered polymorphisms of 29 genes whose variant alleles may participate in the formation of occupationally conditioned diseases and serve as markers of sensitivity to the risk of these diseases. The selected gene association demonstrates reliably altered polymorphism versus the reference group (the superoxide dismutase gene SOD2, the dopamine receptor gene ANKK1, the sulfotransferase gene SULT1A1, the methylenetetrahydrofolate reductase gene MTHFR, the vascular endothelial growth factor gene VEGF and the tumor necrosis factor gene TNF-alpha). Under combined exposure to occupational hazards (sylvinite dust, noise) in the mining industry, this association can serve as an adequate marker complex of sensitivity to the development of occupationally conditioned diseases. Production of the immune cytokine regulation markers tumor necrosis factor and vascular endothelial growth factor was increased. The genes SOD2, ANKK1, SULT1A1, VEGF and TNF-alpha are recommended as sensitivity markers, and the encoded cytokines (tumor necrosis factor and endothelial growth factor) are proposed as effect markers in evaluating health risk for workers in the mining industry.

  1. Lagrangian Time Series Models for Ocean Surface Drifter Trajectories

    CERN Document Server

    Sykulski, Adam M; Lilly, Jonathan M; Danioux, Eric

    2016-01-01

    This paper proposes stochastic models for the analysis of ocean surface trajectories obtained from freely-drifting satellite-tracked instruments. The proposed time series models are used to summarise large multivariate datasets and infer important physical parameters of inertial oscillations and other ocean processes. Nonstationary time series methods are employed to account for the spatiotemporal variability of each trajectory. Because the datasets are large, we construct computationally efficient methods through the use of frequency-domain modelling and estimation, with the data expressed as complex-valued time series. We detail how practical issues related to sampling and model misspecification may be addressed using semi-parametric techniques for time series, and we demonstrate the effectiveness of our stochastic models through application to both real-world data and to numerical model output.
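As a minimal illustration of the frequency-domain estimation strategy described above, the sketch below fits a complex-valued AR(1) process by minimising the Whittle likelihood computed from the periodogram. The AR(1) model and its parameter values are illustrative assumptions, far simpler than the paper's stochastic models for drifter trajectories.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
n, a_true = 4096, 0.8

# Simulate a complex-valued AR(1): z_t = a * z_{t-1} + eps_t
eps = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
z = np.zeros(n, dtype=complex)
for t in range(1, n):
    z[t] = a_true * z[t - 1] + eps[t]

f = np.fft.fftfreq(n)               # Fourier frequencies (cycles per sample)
I = np.abs(np.fft.fft(z)) ** 2 / n  # periodogram of the complex series

def whittle_nll(a):
    # Spectral density of a complex AR(1) with unit-variance innovations
    S = 1.0 / np.abs(1.0 - a * np.exp(-2j * np.pi * f)) ** 2
    return np.sum(np.log(S) + I / S)

a_hat = minimize_scalar(whittle_nll, bounds=(0.01, 0.99), method="bounded").x
print(round(a_hat, 2))              # close to 0.8
```

Working in the frequency domain keeps the cost at one FFT plus cheap evaluations of the model spectrum, which is why the approach scales to the large multivariate drifter datasets the paper targets.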

  2. The frequency of Tay-Sachs disease causing mutations in the Brazilian Jewish population justifies a carrier screening program

    Directory of Open Access Journals (Sweden)

    Roberto Rozenberg

    CONTEXT: Tay-Sachs disease is an autosomal recessive disease characterized by progressive neurologic degeneration, fatal in early childhood. In the Ashkenazi Jewish population the disease incidence is about 1 in every 3,500 newborns and the carrier frequency is 1 in every 29 individuals. Carrier screening programs for Tay-Sachs disease have reduced disease incidence by 90% in high-risk populations in several countries. The Brazilian Jewish population is estimated at 90,000 individuals. Currently, there is no screening program for Tay-Sachs disease in this population. OBJECTIVE: To evaluate the importance of a Tay-Sachs disease carrier screening program in the Brazilian Jewish population by determining the frequency of heterozygotes and the acceptance of the program by the community. SETTING: Laboratory of Molecular Genetics, Institute of Biosciences, Universidade de São Paulo. PARTICIPANTS: 581 senior students from selected Jewish high schools. PROCEDURE: Molecular analysis of Tay-Sachs disease causing mutations by PCR amplification of genomic DNA, followed by restriction enzyme digestion. RESULTS: Among the 581 students who attended educational classes, 404 (70%) elected to be tested for Tay-Sachs disease mutations. Of these, approximately 65% were of Ashkenazi Jewish origin. Eight carriers were detected, corresponding to a carrier frequency of 1 in every 33 individuals in the Ashkenazi Jewish fraction of the sample. CONCLUSION: The frequency of Tay-Sachs disease carriers among the Ashkenazi Jewish population of Brazil is similar to that of other countries where carrier screening programs have led to a significant decrease in disease incidence. Therefore, it is justifiable to implement a Tay-Sachs disease carrier screening program for the Brazilian Jewish population.
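The carrier-frequency and incidence figures above are linked by Hardy-Weinberg arithmetic, sketched below. The count of 263 Ashkenazi participants (65% of the 404 tested) is an approximation inferred from the abstract, and Hardy-Weinberg equilibrium is an assumption.

```python
# Carrier frequency in the Ashkenazi fraction of the sample
ashkenazi_tested = round(404 * 0.65)  # ~263 participants (inferred)
carriers = 8
carrier_freq = carriers / ashkenazi_tested
print(round(1 / carrier_freq))        # 33, i.e. 1 carrier in every 33

# Under Hardy-Weinberg equilibrium, a 1-in-29 carrier rate (2pq ~ 2q)
# implies a disease-allele frequency q of about 1/58, and an
# affected-birth rate of q^2
q = (1 / 29) / 2
print(round(1 / q ** 2))              # 3364, consistent with ~1 in 3,500
```

The observed 1-in-33 sample frequency is thus statistically compatible with the established 1-in-29 Ashkenazi figure, which is the comparison the conclusion rests on.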

  3. Sentinel lymph node biopsy in patients with a needle core biopsy diagnosis of ductal carcinoma in situ: is it justified?

    LENUS (Irish Health Repository)

    Doyle, B

    2012-02-01

    BACKGROUND: The incidence of ductal carcinoma in situ (DCIS) has increased markedly with the introduction of population-based mammographic screening. DCIS is usually diagnosed non-operatively. Although sentinel lymph node biopsy (SNB) has become the standard of care for patients with invasive breast carcinoma, its use in patients with DCIS is controversial. AIM: To examine the justification for offering SNB at the time of primary surgery to patients with a needle core biopsy (NCB) diagnosis of DCIS. METHODS: A retrospective analysis was performed of 145 patients with an NCB diagnosis of DCIS who had SNB performed at the time of primary surgery. The study focused on rates of SNB positivity and underestimation of invasive carcinoma by NCB, and sought to identify factors that might predict the presence of invasive carcinoma in the excision specimen. RESULTS: 7/145 patients (4.8%) had a positive sentinel lymph node, four macrometastases and three micrometastases. 6/7 patients had invasive carcinoma in the final excision specimen. 55/145 patients (37.9%) with an NCB diagnosis of DCIS had invasive carcinoma in the excision specimen. The median invasive tumour size was 6 mm. A radiological mass and areas of invasion <1 mm, amounting to "at least microinvasion" on NCB, were predictive of invasive carcinoma in the excision specimen. CONCLUSIONS: SNB positivity in pure DCIS is rare. In view of the high rate of underestimation of invasive carcinoma in patients with an NCB diagnosis of DCIS in this study, SNB appears justified in this group of patients.

  4. Justifying the Use of a Second Language Oral Test as an Exit Test in Hong Kong: An Application of Assessment Use Argument Framework

    Science.gov (United States)

    Jia, Yujie

    2013-01-01

    This study employed Bachman and Palmer's (2010) Assessment Use Argument framework to investigate to what extent the use of a second language oral test as an exit test in a Hong Kong university can be justified. It also aimed to help test developers of this oral test identify the most critical areas in the current test design that might need…

  5. Why Police Kill Black Males with Impunity: Applying Public Health Critical Race Praxis (PHCRP) to Address the Determinants of Policing Behaviors and "Justifiable" Homicides in the USA.

    Science.gov (United States)

    Gilbert, Keon L; Ray, Rashawn

    2016-04-01

    Widespread awareness of the recent deaths of several black males at the hands of police has revealed an unaddressed public health challenge: determining the root causes of excessive use of force by police applied to black males that may result in "justifiable homicides." The criminalization of black males has a long history in the USA, which has resulted in an increase in policing behaviors by legal authorities and created inequitable life chances for black males. Currently, the discipline of public health has not applied an intersectional approach that investigates the intersection of race and gender to understand police behaviors that lead to "justifiable homicides" of black males. This article applies the core tenets and processes of Public Health Critical Race Praxis (PHCRP) to develop a framework that can improve research and interventions to address the disparities observed in recent trend analyses of "justifiable homicides." Accordingly, we use PHCRP to offer an alternative framework on the social, legal, and health implications of violence-related incidents. We aim to move the literature in this area forward to help scholars, policymakers, and activists build the capacity of communities to address the excessive use of force by police to reduce mortality rates from "justifiable homicides."

  6. Is emergency and salvage coronary artery bypass grafting justified? The Nordic Emergency/Salvage coronary artery bypass grafting study.

    Science.gov (United States)

    Axelsson, Tomas A; Mennander, Ari; Malmberg, Markus; Gunn, Jarmo; Jeppsson, Anders; Gudbjartsson, Tomas

    2016-05-01

    According to the EuroSCORE-II criteria, patients undergoing emergency coronary artery bypass grafting (CABG) are operated on before the beginning of the next working day after the decision to operate, while salvage CABG patients require cardiopulmonary resuscitation en route to the operating theatre. The objective of this multicentre study was to investigate the efficacy of emergency and salvage CABG. We performed a retrospective analysis of all patients who underwent emergency or salvage CABG at four North-European university hospitals from 2006 to 2014. A total of 614 patients, 580 emergency and 34 salvage CABG patients (mean age 67 ± 10 years, 56% male), were included. All patients had an acute coronary syndrome: 234 (38%) had an ST segment elevation myocardial infarction (STEMI) and 289 (47%) had a non-STEMI. Haemodynamic instability requiring inotropic drugs and/or an intra-aortic balloon pump preoperatively occurred in 87 (14%) and 82 (13%) of the patients, respectively. Three hundred and thirty-one patients (54%) were transferred to the operating room immediately after angiography and 205 (33%) had a failure of an attempted percutaneous coronary intervention. Cardiopulmonary resuscitation within 1 h before the operation was performed in 49 patients (8%), and 9 patients (1%) received cardiac massage during sternotomy. Hospital mortality for emergency and salvage operations was 13 and 41%, respectively. Early complications included reoperation for bleeding (15%), postoperative stroke (6%) and de novo dialysis for acute kidney injury (6%). The overall 5-year survival rate was 79% for emergency operations and 46% for salvage operations. Only one of the 9 patients receiving cardiac massage during sternotomy survived. Early mortality in patients undergoing emergency and salvage CABG is substantial, especially in salvage patients. Long-term survival is acceptable in both emergency and salvage patients. Life-saving emergency and salvage CABG is justified in most patients but salvage patients

  7. Evaluation of model fit in nonlinear multilevel structural equation modeling

    Directory of Open Access Journals (Sweden)

    Schermelleh-Engel, Karin

    2014-03-01

    Evaluating model fit in nonlinear multilevel structural equation models (MSEM) presents a challenge, as no adequate test statistic is available. Nevertheless, using a product indicator approach, a likelihood ratio test for linear models is provided which may also be useful for nonlinear MSEM. The main problem with nonlinear models is that product variables are non-normally distributed. Although robust test statistics have been developed for linear SEM to ensure valid results under the condition of non-normality, they have not yet been investigated for nonlinear MSEM. In a Monte Carlo study, the performance of the robust likelihood ratio test was investigated for models with single-level latent interaction effects using the unconstrained product indicator approach. As overall model fit evaluation has a potential limitation in detecting the lack of fit at a single level even for linear models, level-specific model fit evaluation was also investigated using partially saturated models. Four population models were considered: a model with interaction effects at both levels, an interaction effect at the within-group level only, an interaction effect at the between-group level only, and a model with no interaction effects at either level. For these models, the number of groups, predictor correlation, and model misspecification were varied. The results indicate that the robust test statistic performed sufficiently well. Advantages of level-specific model fit evaluation for the detection of model misfit are demonstrated.

  9. On deciding to have a lobotomy: either lobotomies were justified or decisions under risk should not always seek to maximise expected utility.

    Science.gov (United States)

    Cooper, Rachel

    2014-02-01

    In the 1940s and 1950s thousands of lobotomies were performed on people with mental disorders. These operations were known to be dangerous, but thought to offer great hope. Nowadays, the lobotomies of the 1940s and 1950s are widely condemned. The consensus is that the practitioners who employed them were, at best, misguided enthusiasts, or, at worst, evil. In this paper I employ standard decision theory to understand and assess shifts in the evaluation of lobotomy. Textbooks of medical decision making generally recommend that decisions under risk are made so as to maximise expected utility (MEU). I show that using this procedure suggests that the 1940s and 1950s practice of psychosurgery was justifiable. In making sense of this finding we have a choice: either we can accept that psychosurgery was justified, in which case condemnation of the lobotomists is misplaced; or we can conclude that the use of formal decision procedures, such as MEU, is problematic.
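
    The MEU rule discussed in this abstract is simple arithmetic: weight each outcome's utility by its probability and pick the action with the larger sum. The sketch below is a toy illustration only; the probabilities and utilities are invented for the example and are not taken from the paper.

```python
# Toy illustration of the maximise-expected-utility (MEU) rule.
# All probabilities and utilities below are hypothetical.

def expected_utility(outcomes):
    """Expected utility of an action: sum of p * u over its possible outcomes."""
    return sum(p * u for p, u in outcomes)

# Action 1: operate -- small chance of death, moderate chance of improvement.
operate = [(0.04, 0.0),    # perioperative death
           (0.46, 90.0),   # marked improvement
           (0.50, 40.0)]   # little change

# Action 2: no operation -- continued severe illness.
no_operation = [(1.0, 30.0)]

# MEU recommends the action with the higher expected utility.
best = max(("operate", expected_utility(operate)),
           ("no operation", expected_utility(no_operation)),
           key=lambda pair: pair[1])
```

With these invented numbers the operation has expected utility 61.4 against 30 for doing nothing, which is exactly the structure of reasoning the paper argues can make historical psychosurgery look "justifiable".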

  10. How to define and build an effective cyber threat intelligence capability how to understand, justify and implement a new approach to security

    CERN Document Server

    Dalziel, Henry; Carnall, James

    2014-01-01

    Intelligence-Led Security: How to Understand, Justify and Implement a New Approach to Security is a concise review of the concept of Intelligence-Led Security. Protecting a business, including its information and intellectual property, physical infrastructure, employees, and reputation, has become increasingly difficult. Online threats come from all sides: internal leaks and external adversaries; domestic hacktivists and overseas cybercrime syndicates; targeted threats and mass attacks. And these threats run the gamut from targeted to indiscriminate to entirely accidental. Amo

  11. A Software Engine to Justify the Conclusions of an Expert System for Detecting Renal Obstruction on 99mTc-MAG3 Scans

    Science.gov (United States)

    Garcia, Ernest V.; Taylor, Andrew; Manatunga, Daya; Folks, Russell

    2013-01-01

    The purposes of this study were to describe and evaluate a software engine to justify the conclusions reached by a renal expert system (RENEX) for assessing patients with suspected renal obstruction, and to obtain from this evaluation new knowledge that can be incorporated into RENEX to attempt to improve diagnostic performance. Methods: RENEX consists of 60 heuristic rules extracted from the rules used by a domain expert to generate the knowledge base, and a forward-chaining inference engine to determine obstruction. The justification engine keeps track of the sequence of the rules that are instantiated to reach a conclusion. The interpreter can then request justification by clicking on the specific conclusion. The justification process then reports the English translation of all concatenated rules instantiated to reach that conclusion. The justification engine was evaluated with a prospective group of 60 patients (117 kidneys). After reviewing the standard renal mercaptoacetyltriglycine (MAG3) scans obtained before and after the administration of furosemide, a masked expert determined whether each kidney was obstructed, whether the results were equivocal, or whether the kidney was not obstructed, and identified and ranked the main variables associated with each interpretation. Two parameters were then tabulated: the frequency with which the main variables associated with obstruction by the expert were also justified by RENEX, and the frequency with which the justification rules provided by RENEX were deemed to be correct by the expert. Only when RENEX and the domain expert agreed on the diagnosis (87 kidneys) were the results used to test the justification. Results: RENEX agreed with 91% (184/203) of the rules supplied by the expert for justifying the diagnosis. RENEX provided 103 additional rules justifying the diagnosis; the expert agreed that 102 (99%) were correct, although the rules were considered to be of secondary importance. Conclusion: We have described and

  12. Proofs of the Technical Results Justifying a Biologically Inspired Algorithm for Reactive Navigation of Nonholonomic Robots in Maze-Like Environments

    CERN Document Server

    Matveev, Alexey S; Savkin, Andrey V

    2011-01-01

    We present technical results justifying a method for guidance of a Dubins-like vehicle with saturated control towards a target in a steady simply connected maze-like environment. The vehicle always has access to the target relative bearing angle and the distance to the nearest point of the maze if it is within the given sensor range. The proposed control law is composed of biologically inspired reflex-level rules. Mathematically rigorous analysis of this law is provided; its convergence and performance are confirmed by computer simulations and experiments with real robots.

  13. Outcome and survival of patients aged 75 years and older compared to younger patients after ruptured abdominal aortic aneurysm repair: do the results justify the effort?

    DEFF Research Database (Denmark)

    Shahidi, S; Schroeder, T Veith; Carstensen, M.

    2009-01-01

    on prospectively registered data. The protocol was an "all-comers" policy. Seventy-two patients, who were operated on for RAAA in our department from January 1, 2005, to December 30, 2005, are included in this study. The follow-up time of survivors was 1 year. We defined 75-year-old patients as elderly because...... the only significant (p believe that treatment for RAAA can be justified in elderly patients. In our experience, surgical open repair has been life-saving in 33% of patients aged 75 years and older, at a relatively low price for each...

  14. Thermal damage reduction associated with in vivo skin electroporation: A numerical investigation justifying aggressive pre-cooling

    Energy Technology Data Exchange (ETDEWEB)

    Becker, S.M.; Kuznetsov, A.V. [North Carolina State University, Raleigh (United States). Mechanical and Aerospace Engineering

    2007-01-15

    Electroporation is an approach used to enhance transdermal transport of large molecules in which the skin is exposed to a series of electric pulses. Electroporation temporarily destabilizes the structure of the outer skin layer, the stratum corneum, by creating microscopic pores through which agents, which ordinarily are unable to pass into the skin, are able to pass through this outer barrier. Of possible concern when exposing biological tissue to an electric field is thermal tissue damage associated with Joule heating. In order to find the electrical and transient thermal solutions associated with this process, this study develops a three-dimensional transient finite-volume composite model of in vivo skin electroporation. The electroporation process modeled consists of five 150 ms long DC square wave pulses administered at 1-s intervals with an applied voltage of 400 V. This paper finds that the minor thermal influence of the electrode plate and the presence of a small blood vessel have a large impact on thermal damage. An aggressive pre-cooling technique is presented which is shown to dramatically reduce the risk of thermal damage. (author)

  15. Through rose-colored glasses: system-justifying beliefs dampen the effects of relative deprivation on well-being and political mobilization.

    Science.gov (United States)

    Osborne, Danny; Sibley, Chris G

    2013-08-01

    Individual-based and group-based forms of relative deprivation (IRD and GRD, respectively) are linked with individual- and group-based responses to inequality, respectively. System justification theory, however, argues that we are motivated to believe that people's outcomes are equitably determined. As such, endorsement of system-justifying beliefs should dampen people's reactions to outcomes perceived to be unequal and ultimately undermine support for political mobilization. We examined these hypotheses in a national probability sample of New Zealanders (N = 6,886). As expected, IRD predicted individual-based responses to inequality (i.e., satisfaction with one's standard of living and psychological distress) better than GRD. Conversely, GRD predicted group-based responses to inequality (i.e., perceived discrimination against one's group and support for political mobilization) better than IRD. Each of these relationships was, however, notably weaker among participants who were high, relative to low, on system justification. These results demonstrate that system-justifying beliefs have a palliative effect on people's experiences with inequality.

  16. Current Evidence to Justify, and the Methodological Considerations for a Randomised Controlled Trial Testing the Hypothesis that Statins Prevent the Malignant Progression of Barrett's Oesophagus

    Institute of Scientific and Technical Information of China (English)

    David Thurtle; Leo Alexandre; Yoon K Loke; Ed Cheong; Andrew Hart

    2014-01-01

    Barrett's oesophagus is the predominant risk factor for oesophageal adenocarcinoma, a cancer whose incidence is increasing and which has a poor prognosis. This article reviews the latest experimental and epidemiological evidence justifying the development of a randomised controlled trial investigating the hypothesis that statins prevent the malignant progression of Barrett's oesophagus, and explores the methodological considerations for such a trial. The experimental evidence suggests anti-carcinogenic properties of statins on oesophageal cancer cell lines, based on the inhibition of the mevalonate pathway and the production of pro-apoptotic proteins. The epidemiological evidence reports inverse associations between statin use and the incidence of oesophageal carcinoma in both general population and Barrett's oesophagus cohorts. Such a randomised controlled trial would be a large multi-centre trial, probably investigating simvastatin, given the wide clinical experience with this drug, relatively low side-effect profile and low financial cost. As with any clinical trial, high adherence is important, which could be increased with therapy, patient, doctor and system-focussed interventions. We would suggest there is now sufficient evidence to justify a full clinical trial that attempts to prevent this aggressive cancer in a high-risk population.

  18. Justifying the Justification Hypothesis: scientific-humanism, Equilintegration (EI) Theory, and the Beliefs, Events, and Values Inventory (BEVI).

    Science.gov (United States)

    Shealy, Craig N

    2005-01-01

    The Justification Hypothesis (JH; Henriques, 2003) is a basic, general, and macro-level construct that is highly compelling. However, it needs greater specification (i.e., justification) regarding what it is, how it might be operationalized and measured, and what it does and does not predict in the real world. In the present analysis, the act of "justification" is conceptualized as the ongoing attempt to convince self and/or others that one's beliefs and values, which is to say one's "version of reality" or VOR, is correct, defensible, and good. In addressing these issues, this paper is divided into two complementary parts: (a) consideration of justification dynamics and exemplars from a scientific-humanist perspective and (b) an examination of how justification systems and processes have been studied vis-a-vis research and theory on beliefs and values, as well as an extant model, Equilintegration (EI) Theory, and method, the Beliefs, Events, and Values Inventory (BEVI).

  19. Bias analysis and the simulation-extrapolation method for survival data with covariate measurement error under parametric proportional odds models.

    Science.gov (United States)

    Yi, Grace Y; He, Wenqing

    2012-05-01

    It has been well known that ignoring measurement error may result in substantially biased estimates in many contexts including linear and nonlinear regressions. For survival data with measurement error in covariates, there has been extensive discussion in the literature with the focus on proportional hazards (PH) models. Recently, research interest has extended to accelerated failure time (AFT) and additive hazards (AH) models. However, the impact of measurement error on other models, such as the proportional odds model, has received relatively little attention, although these models are important alternatives when PH, AFT, or AH models are not appropriate to fit data. In this paper, we investigate this important problem and study the bias induced by the naive approach of ignoring covariate measurement error. To adjust for the induced bias, we describe the simulation-extrapolation method. The proposed method enjoys a number of appealing features. Its implementation is straightforward and can be accomplished with minor modifications of existing software. More importantly, the proposed method does not require modeling the covariate process, which is quite attractive in practice. As the precise values of error-prone covariates are often not observable, any modeling assumption on such covariates has the risk of model misspecification, hence yielding invalid inferences if this happens. The proposed method is carefully assessed both theoretically and empirically. Theoretically, we establish the asymptotic normality for resulting estimators. Numerically, simulation studies are carried out to evaluate the performance of the estimators as well as the impact of ignoring measurement error, along with an application to a data set arising from the Busselton Health Study. Sensitivity of the proposed method to misspecification of the error model is studied as well.
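
    The simulation-extrapolation (SIMEX) idea summarised above can be sketched in its simplest setting: a linear regression slope attenuated by covariate measurement error. The code below is a minimal illustration with simulated data and a known error variance, not the authors' survival-data implementation; extra noise is added at several levels, the naive estimate is tracked as a function of the added noise, and a quadratic fit is extrapolated back to the error-free point.

```python
# Minimal SIMEX sketch for a linear regression slope attenuated by
# covariate measurement error. Assumes the error variance sigma_u2 is
# known; all data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
n, beta, sigma_u2 = 2000, 1.0, 0.5
x = rng.normal(size=n)                                    # true covariate
w = x + rng.normal(scale=np.sqrt(sigma_u2), size=n)       # error-prone version
y = beta * x + rng.normal(scale=0.5, size=n)

def slope(w, y):
    # OLS slope of y on w: cov(w, y) / var(w)
    return np.cov(w, y, bias=True)[0, 1] / np.var(w)

# Simulation step: inflate the measurement error to (1 + lam) * sigma_u2
# and average the naive estimate over replicates.
lams = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
est = []
for lam in lams:
    reps = [slope(w + rng.normal(scale=np.sqrt(lam * sigma_u2), size=n), y)
            for _ in range(50)]
    est.append(np.mean(reps))

# Extrapolation step: fit a quadratic in lambda and evaluate at lambda = -1,
# where the total measurement error variance (1 + lambda) * sigma_u2 vanishes.
coef = np.polyfit(lams, est, 2)
beta_simex = float(np.polyval(coef, -1.0))
naive = slope(w, y)   # attenuated toward zero by the measurement error
```

The naive slope here is biased toward zero by the factor var(x) / (var(x) + sigma_u2), and the extrapolated estimate moves it back toward the true value of 1, which mirrors the attenuation-correction role SIMEX plays in the paper's survival models.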

  20. Justifiability of amniocentesis on the basis of positive findings of triple test, ultrasound scan and advanced maternal age

    Directory of Open Access Journals (Sweden)

    Dragoslav Bukvic

    2011-05-01

    Objective. To assess the effectiveness of antenatal screening for chromosomal abnormalities based on maternal age (≥35 years), positive ultrasound findings or a positive triple test. Materials and methods. Retrospective six-year study. The pregnant women routinely underwent established clinical and laboratory practice at the Department of Medical Genetics between 1997 and 2003. The women’s case notes were examined to identify indications for karyotyping, gestation period and the outcome of karyotyping and pregnancy. Results. Invasive antenatal tests were performed in 1440 cases: 1168 (81.11%) maternal age ≥35 (a), 72 (5.00%) positive triple test (b), 24 (1.67%) positive ultrasound scanning (c) and 176 (12.2%) other (psychological, personal reasons, etc.) (d). The overall positive predictive value was 1.67% (1.6% (a), 1.4% (b), 12.5% (c), 0.0% (d)). The constructed logistic regression model gave an odds-ratio of 8.647 for the “positive ultrasound result vs. maternal age ≥35” indication, while the odds-ratio for the triple test vs. maternal age ≥35 was 0.854. Conclusions. Amniocentesis and cytogenetic analysis of foetal karyotype should be presented as a diagnostic possibility to all women over 35 years. The application of biochemical markers fell far short of the expected results. If we compare results for the indication positive ultrasound scanning vs. maternal age, an odds-ratio of ~9 was obtained. These results demonstrate that the likelihood of obtaining positive results (i.e. the presence of chromosome alterations) from an amniocentesis with this indication is almost 9 times higher than from an amniocentesis performed solely for advanced maternal age.
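
    An odds-ratio like the 8.647 reported above is the ratio of the odds of an abnormal karyotype under one indication versus another, and in a logistic regression it appears as the exponential of the fitted coefficient. The sketch below uses hypothetical counts, chosen only to be roughly consistent with the positive predictive values quoted in the abstract; it is not the study's data.

```python
# Hypothetical 2x2 counts: (abnormal karyotype, normal karyotype),
# chosen to roughly match the reported positive predictive values.
import math

positive_ultrasound = (3, 21)     # 3/24   -> 12.5% positive predictive value
advanced_age        = (19, 1149)  # ~19/1168 -> ~1.6% positive predictive value

def odds(abnormal, normal):
    return abnormal / normal

# Odds-ratio of a positive result: ultrasound indication vs. age >= 35 alone
odds_ratio = odds(*positive_ultrasound) / odds(*advanced_age)

# In a logistic regression the same quantity is exp(coefficient):
log_or = math.log(odds_ratio)
```

With these invented counts the ratio comes out close to 9, matching the abstract's conclusion that a positive ultrasound indication makes an abnormal amniocentesis result almost 9 times more likely than advanced maternal age alone.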

  1. Sensitivity to imputation models and assumptions in receiver operating characteristic analysis with incomplete data.

    Science.gov (United States)

    Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M

    Modern statistical methods using incomplete data have been increasingly applied in a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used in evaluating diagnostic tests or biomarkers in medical research, has become increasingly popular in both its development and application. While missing-data methods have been applied in ROC analysis, the impact of model misspecification and/or assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. Particularly, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms.
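
    The MI-then-ROC workflow the abstract studies can be sketched as follows. This is an illustration under assumptions of this example only (normal-model imputation within disease groups, missingness depending only on the observed disease status, a Mann-Whitney AUC), not the authors' simulation design; a proper MI would also draw the imputation-model parameters from their posterior.

```python
# Sketch: multiple imputation of a partially missing diagnostic marker,
# followed by an AUC estimate pooled across imputed datasets.
import numpy as np

rng = np.random.default_rng(1)
n = 500
disease = rng.integers(0, 2, size=n)             # 1 = diseased
marker = rng.normal(loc=disease.astype(float))   # marker higher in disease

# MAR missingness: the probability of being missing depends only on the
# observed disease status, not on the (possibly missing) marker itself.
p_missing = np.where(disease == 1, 0.4, 0.1)
observed = rng.random(n) >= p_missing

def auc(pos, neg):
    """Mann-Whitney estimate of P(marker in diseased > marker in healthy)."""
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Simple normal-model imputation within each disease group.
M = 20
aucs = []
for _ in range(M):
    imp = marker.copy()
    for d in (0, 1):
        obs = observed & (disease == d)
        mis = ~observed & (disease == d)
        mu, sd = imp[obs].mean(), imp[obs].std(ddof=1)
        imp[mis] = rng.normal(mu, sd, size=mis.sum())
    aucs.append(auc(imp[disease == 1], imp[disease == 0]))

pooled_auc = float(np.mean(aucs))   # pooled point estimate across imputations
```

Because the imputation model here coincides with the data-generating mechanism and the missingness is ignorable, the pooled AUC stays close to the complete-data value, which is the "well-calibrated under coherency" behaviour the abstract reports.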

  2. On adjustment for auxiliary covariates in additive hazard models for the analysis of randomized experiments

    DEFF Research Database (Denmark)

    Vansteelandt, S.; Martinussen, Torben; Tchetgen, E. J Tchetgen

    2014-01-01

    We consider additive hazard models (Aalen, 1989) for the effect of a randomized treatment on a survival outcome, adjusting for auxiliary baseline covariates. We demonstrate that the Aalen least-squares estimator of the treatment effect parameter is asymptotically unbiased, even when the hazard's dependence on time or on the auxiliary covariates is misspecified, and even away from the null hypothesis of no treatment effect. We furthermore show that adjustment for auxiliary baseline covariates does not change the asymptotic variance of the estimator of the effect of a randomized treatment. We conclude that, in view of its robustness against model misspecification, Aalen least-squares estimation is attractive for evaluating treatment effects on a survival outcome in randomized experiments, and the primary reasons to consider baseline covariate adjustment in such settings could be interest in subgroup...

  3. The EU Seal Products Ban – Why Ineffective Animal Welfare Protection Cannot Justify Trade Restrictions under European and International Trade Law

    Directory of Open Access Journals (Sweden)

    Martin Hennig

    2015-03-01

    In this article, the author questions the legitimacy of the general ban on trade in seal products adopted by the European Union. It is submitted that the EU Seal Regime, which permits the marketing of Greenlandic seal products derived from Inuit hunts, but excludes Canadian and Norwegian seal products from the European market, does not ensure a satisfactory degree of animal welfare protection in order to justify the comprehensive trade restriction in place. It is argued that the current ineffective EU ban on seal products, which according to the WTO Appellate Body cannot be reconciled with the objective of protecting animal welfare, has no legal basis in the EU Treaties and should be annulled.

  4. Can beneficial ends justify lying? Neural responses to the passive reception of lies and truth-telling with beneficial and harmful monetary outcomes.

    Science.gov (United States)

    Yin, Lijun; Weber, Bernd

    2016-03-01

    Can beneficial ends justify morally questionable means? To investigate how monetary outcomes influence the neural responses to lying, we used a modified, cheap talk sender-receiver game in which participants were the direct recipients of lies and truthful statements resulting in either beneficial or harmful monetary outcomes. Both truth-telling (vs lying) as well as beneficial (vs harmful) outcomes elicited higher activity in the nucleus accumbens. Lying (vs truth-telling) elicited higher activity in the supplementary motor area, right inferior frontal gyrus, superior temporal sulcus and left anterior insula. Moreover, the significant interaction effect was found in the left amygdala, which showed that the monetary outcomes modulated the neural activity in the left amygdala only when truth-telling rather than lying. Our study identified a neural network associated with the reception of lies and truth, including the regions linked to the reward process, recognition and emotional experiences of being treated (dis)honestly.

  5. Testing in a Random Effects Panel Data Model with Spatially Correlated Error Components and Spatially Lagged Dependent Variables

    Directory of Open Access Journals (Sweden)

    Ming He

    2015-11-01

    We propose a random effects panel data model with both spatially correlated error components and spatially lagged dependent variables. We focus on diagnostic testing procedures and derive Lagrange multiplier (LM) test statistics for a variety of hypotheses within this model. We first construct the joint LM test for both the individual random effects and the two spatial effects (spatial error correlation and spatial lag dependence). We then provide LM tests for the individual random effects and for the two spatial effects separately. In addition, in order to guard against local model misspecification, we derive locally adjusted (robust) LM tests based on the Bera and Yoon principle (Bera and Yoon, 1993). We conduct a small Monte Carlo simulation to show the good finite sample performances of these LM test statistics and revisit the cigarette demand example in Baltagi and Levin (1992) to illustrate our testing procedures.
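
    The LM machinery described above can be illustrated with its simplest non-spatial ingredient: the classical Breusch-Pagan (1980) LM test for individual random effects, computed from pooled-OLS residuals. The sketch below omits the spatial error and spatial lag components derived in the paper, and the data are simulated under the null of no random effects.

```python
# Breusch-Pagan LM test for individual random effects in panel data,
# computed from pooled-OLS residuals. Data are simulated under H0.
import numpy as np

rng = np.random.default_rng(2)
N, T = 100, 5                                   # individuals, time periods
x = rng.normal(size=(N, T))
y = 1.0 + 2.0 * x + rng.normal(size=(N, T))     # no individual effect

# Pooled OLS of y on a constant and x, then panel residuals e_it
X = np.column_stack([np.ones(N * T), x.ravel()])
beta = np.linalg.lstsq(X, y.ravel(), rcond=None)[0]
e = (y.ravel() - X @ beta).reshape(N, T)

# LM = NT / (2(T-1)) * [ sum_i (sum_t e_it)^2 / sum_it e_it^2 - 1 ]^2,
# asymptotically chi-square(1) under H0: no individual random effects.
ratio = (e.sum(axis=1) ** 2).sum() / (e ** 2).sum()
LM = N * T / (2 * (T - 1)) * (ratio - 1) ** 2
reject = LM > 3.84    # compare to the 5% chi-square(1) critical value
```

The paper's joint and robust statistics extend this construction by adding score terms for the spatial error correlation and the spatial lag, and, for the robust versions, by adjusting each score for local misspecification in the other directions.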

  6. The Ends Justify the Memes

    OpenAIRE

    Miller, Ian D.; Cupchik, Gerald C.

    2016-01-01

    This talk presents an update on my research into memes.  It begins with an introduction to memes that is suitable for any audience.  It concludes with a detailed description of human research and simulation results that converge with one another.  I also present a short online study on email forwarding chains.

  7. About 'restriction', 'justified' and 'necessary'

    DEFF Research Database (Denmark)

    Werlauff, Erik

    2016-01-01

    The article is an academic fairy tale about why and how all national corporate tax protection legislation should undergo a 3-part test to ensure its consistency with EU law. Each Member State should introduce a compulsory 3-step test for each new (corporate) tax provision. The test is simple: (1) Does

  8. Are Vulnerability Disclosure Deadlines Justified?

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Jason L. Wright; Lawrence Wellman

    2011-09-01

    Vulnerability research organizations Rapid7, Google Security team, and Zero Day Initiative recently imposed grace periods for public disclosure of vulnerabilities. The grace periods ranged from 45 to 182 days, after which disclosure might occur with or without an effective mitigation from the affected software vendor. At this time there is indirect evidence that the shorter grace periods of 45 and 60 days may not be practical. However, there is strong evidence that the recently announced Zero Day Initiative grace period of 182 days yields benefit in speeding up the patch creation process, and may be practical for many software products. Unfortunately, there is also evidence that the 182 day grace period results in more vulnerability announcements without an available patch.

  9. Are Fuel Price Hikes Justifiable?

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    China saw its third fuel price hike this year when the National Development and Reform Commission, China’s top price regulator, hiked gasoline and diesel retail prices up by 9 percent, effective on June 30. It is the second rally in a month after the country initiated a new fuel pricing scheme in May.

  10. Creating, Naming, and Justifying Fractions

    Science.gov (United States)

    Siebert, Daniel; Gaskin, Nicole

    2006-01-01

    For students to develop meaningful conceptions of fractions and fraction operations, they need to think of fractions in terms other than as just whole-number combinations. In this article, we suggest two powerful images for thinking about fractions that move beyond whole-number reasoning. (Contains 5 figures.)

  11. A positive theory of monetary policy and robust control

    OpenAIRE

    Juha Kilponen

    2004-01-01

    This paper applies the robust control approach to a simple positive theory of monetary policy, when the central bank’s model of the economy is subject to misspecifications. It is shown that a central bank should react more aggressively to supply shocks when the model misspecifications grow larger. Moreover, the model misspecifications aggravate the inflation bias and a trade-off between output stabilisation and inflation worsens when the uncertainty surrounding the central bank’s model increa...

  12. Scrutiny Land: Scrutiny Land is the place where government needs to justify to a court its restrictions on the liberties of the people.

    Science.gov (United States)

    Barnett, Randy E

    2008-06-01

    Scrutiny Land is the place where government needs to justify to a court its restrictions on the liberties of the people. In the 1930s, the Supreme Court began limiting access to Scrutiny Land. While the New Deal Court merely shifted the burden to those challenging a law to show that a restriction of liberty is irrational, the Warren Court made the presumption of constitutionality effectively irrebuttable. After this, only one road to Scrutiny Land remained: showing that the liberty being restricted was a fundamental right. The Glucksberg Two-Step, however, limited the doctrine of fundamental rights to those (1) narrowly defined liberties that are (2) deeply rooted in tradition and history. In this Article, I explain how the ability to define accurately almost any liberty as broad or narrow improperly gives courts complete discretion to protect liberty or not as they choose. I then describe an alternative that is suggested by the approach taken by the Court in Lawrence v. Texas: a general presumption of liberty. Not only is such an approach practical, it is also more consistent with the text and original meaning of the Constitution than is the Glucksberg Two-Step.

  13. Feature Matching in Time Series Modelling

    CERN Document Server

    Xia, Yingcun

    2011-01-01

    Using a time series model to mimic an observed time series has a long history. However, with regard to this objective, conventional estimation methods for discrete-time dynamical models are frequently found to be wanting. In the absence of a true model, we prefer an alternative approach to conventional model fitting that typically involves one-step-ahead prediction errors. Our primary aim is to match the joint probability distribution of the observable time series, including long-term features of the dynamics that underpin the data, such as cycles, long memory and others, rather than short-term prediction. For want of a better name, we call this specific aim "feature matching". The challenges of model mis-specification, measurement errors and the scarcity of data are forever present in real time series modelling. In this paper, by synthesizing earlier attempts into an extended-likelihood, we develop a systematic approach to empirical time series analysis to address these challenges and to aim at achieving...

  14. Risk Factors Such as Male Sex, Smoking, Metabolic Syndrome, Obesity, and Fatty Liver Do Not Justify Screening Colonoscopies Before Age 45.

    Science.gov (United States)

    Jung, Yoon Suk; Yun, Kyung Eun; Chang, Yoosoo; Ryu, Seungho; Park, Dong Il

    2016-04-01

    Recently, many studies have reported that male sex, smoking, fatty liver, metabolic syndrome (MetS), and obesity are risk factors for colorectal neoplasia (CRN). However, current guidelines recommend that persons at average risk of colorectal cancer begin screening colonoscopy at age 50 years without consideration of those risk factors. Our aim was to investigate an appropriate time to start screening colonoscopies in persons with risk factors for CRN. We performed a cross-sectional study on 27,894 Koreans aged ≥30 years who underwent a first colonoscopy as part of a health screening program. To compare the efficacy of colonoscopic screening for the detection of advanced CRN among age groups with risk factors, we calculated the number needed to screen (NNS) to identify 1 patient with advanced CRN. The NNS for those 30-39 years old with all risk factors (male gender, smoking (≥10 pack-years), MetS, obesity, and fatty liver) was higher than that for ≥50-year-old female subjects (55.4 vs. 26.4). The NNS for those 40-44 years old with all risk factors (37.1) was also higher than that for ≥50-year-old female subjects. However, the NNS for those 45-49 years old with risk factors (16.9-22.9) was lower than that for ≥50-year-old women. The efficacy of colonoscopic screening in people 30-44 years old with multiple risk factors is lower than that in ≥50-year-old women. Risk factors such as male sex, smoking, MetS, obesity, and fatty liver do not justify starting screening colonoscopies before age 45.
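    The NNS figures above are simply the reciprocal of the detection rate in each subgroup. A minimal sketch (the cell counts below are invented so that the ratios reproduce the NNS values quoted in the abstract; the study's true counts are not given):

```python
def nns(screened: int, detected: int) -> float:
    """Number needed to screen to find one patient with advanced CRN."""
    if detected == 0:
        raise ValueError("no cases detected; NNS is undefined")
    return screened / detected

# Invented counts chosen so the ratios match the reported NNS values:
# 55.4 for 30-39-year-olds with all risk factors, 26.4 for women >= 50.
print(round(nns(554, 10), 1))  # 55.4
print(round(nns(264, 10), 1))  # 26.4
```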

  15. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Full Text Available Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.
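    DPpackage itself is written in R with compiled C/C++/Fortran back-ends. As a language-neutral sketch of the core construction behind its Dirichlet process priors (not DPpackage's actual API), truncated stick-breaking weights can be generated as follows:

```python
import random

def stick_breaking(alpha: float, n_atoms: int, rng: random.Random) -> list:
    """Truncated stick-breaking weights of a Dirichlet process prior
    (Sethuraman representation): v_k ~ Beta(1, alpha)."""
    weights, remaining = [], 1.0
    for _ in range(n_atoms):
        v = rng.betavariate(1.0, alpha)
        weights.append(remaining * v)
        remaining *= 1.0 - v
    return weights

w = stick_breaking(alpha=2.0, n_atoms=50, rng=random.Random(42))
# The truncated weights sum to just under 1; the leftover mass shrinks
# geometrically with the truncation level.
print(sum(w))
```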

  16. Modeling Error in Quantitative Macro-Comparative Research

    Directory of Open Access Journals (Sweden)

    Salvatore J. Babones

    2015-08-01

    Full Text Available Much quantitative macro-comparative research (QMCR) relies on a common set of published data sources to answer similar research questions using a limited number of statistical tools. Since all researchers have access to much the same data, one might expect quick convergence of opinion on most topics. In reality, of course, differences of opinion abound and persist. Many of these differences can be traced, implicitly or explicitly, to the different ways researchers choose to model error in their analyses. Much careful attention has been paid in the political science literature to the error structures characteristic of time-series cross-sectional (TSCS) data, but much less attention has been paid to the modeling of error in broadly cross-national research involving large panels of countries observed at limited numbers of time points. Here, and especially in the sociology literature, multilevel modeling has become a hegemonic – but often poorly understood – research tool. I argue that widely-used types of multilevel models, commonly known as fixed effects models (FEMs) and random effects models (REMs), can produce wildly spurious results when applied to trended data due to mis-specification of error. I suggest that in most commonly-encountered scenarios, difference models are more appropriate for use in QMCR.
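    The article's warning about trended data can be illustrated with a minimal simulation: two series that share nothing but a deterministic trend appear strongly related in levels, while first differences remove the artifact. All numbers here are illustrative, not from the article:

```python
import random

def corr(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

rng = random.Random(0)
# Two unrelated series that share nothing but an upward trend.
x = [0.5 * t + rng.gauss(0, 1) for t in range(100)]
y = [0.3 * t + rng.gauss(0, 1) for t in range(100)]

dx = [b - a for a, b in zip(x, x[1:])]
dy = [b - a for a, b in zip(y, y[1:])]

print(abs(corr(x, y)) > 0.9)    # True: levels look strongly "related"
print(abs(corr(dx, dy)) < 0.5)  # True: differencing removes the artifact
```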

  17. How Do Children Justify What They Know How to Do? The Concept of the Geometric Middle in the Piagetian Approach to the Formation of Reasons

    Directory of Open Access Journals (Sweden)

    IOANNA BERTHOUD-PAPANDROPOULOU

    2008-01-01

    Full Text Available The relation between knowing how to act and knowing how to justify one's action is explored within the Piagetian constructivist theoretical frame, particularly within the issue of reasons. Reasons are considered a reconstitution of the activity, contributing to the subject's understanding of it. Thirty-four children aged three to nine were given a double task: determine the middle of geometric figures and then justify the chosen location. Results show that while determination is correctly performed at all ages by efficient perceptive evaluation, justification undergoes a development leading from illustrative, to argumentative, and finally to properly founding reasons from the age of eight years on. The relationship between action and reason is discussed on the cognitive, social and educational levels.

  18. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  19. Examining the Factor Structure of the Self-Compassion Scale in Four Distinct Populations: Is the Use of a Total Scale Score Justified?

    Science.gov (United States)

    Neff, Kristin D; Whittaker, Tiffany A; Karl, Anke

    2017-01-31

    This study examined the factor structure of the Self-Compassion Scale (SCS) using a bifactor model, a higher order model, a 6-factor correlated model, a 2-factor correlated model, and a 1-factor model in 4 distinct populations: college undergraduates (N = 222), community adults (N = 1,394), individuals practicing Buddhist meditation (N = 215), and a clinical sample of individuals with a history of recurrent depression (N = 390). The 6-factor correlated model demonstrated the best fit across samples, whereas the 1- and 2-factor models had poor fit. The higher order model also showed relatively poor fit across samples, suggesting it is not representative of the relationship between subscale factors and a general self-compassion factor. The bifactor model, however, had acceptable fit in the student, community, and meditator samples. Although fit was suboptimal in the clinical sample, results suggested an overall self-compassion factor could still be interpreted with some confidence. Moreover, estimates suggested a general self-compassion factor accounted for at least 90% of the reliable variance in SCS scores across samples, and item factor loadings and intercepts were equivalent across samples. Results suggest that a total SCS score can be used as an overall measure of self-compassion.

  20. Optimal designs for discriminating between dose-response models in toxicology studies

    CERN Document Server

    Dette, Holger; Shpilev, Piter; Wong, Weng Kee; 10.3150/10-BEJ257

    2010-01-01

    We consider design issues for toxicology studies when we have a continuous response and the true mean response is only known to be a member of a class of nested models. This class of non-linear models was proposed by toxicologists who were concerned only with estimation problems. We develop robust and efficient designs for model discrimination and for estimating parameters in the selected model at the same time. In particular, we propose designs that maximize the minimum of $D$- or $D_1$-efficiencies over all models in the given class. We show that our optimal designs are efficient for determining an appropriate model from the postulated class, quite efficient for estimating model parameters in the identified model and also robust with respect to model misspecification. To facilitate the use of optimal design ideas in practice, we have also constructed a website that freely enables practitioners to generate a variety of optimal designs for a range of models and also enables them to evaluate the efficiency of ...
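    As a toy illustration of the D-efficiency criterion mentioned above (for straight-line regression rather than the authors' nested toxicology models), the efficiency of a candidate design relative to the D-optimal two-point design on [-1, 1] can be computed as:

```python
def design_det(design):
    """Determinant of the 2x2 information matrix for straight-line
    regression y = b0 + b1*x; design = [(support point, weight), ...]."""
    m00 = sum(w for _, w in design)
    m01 = sum(w * x for x, w in design)
    m11 = sum(w * x * x for x, w in design)
    return m00 * m11 - m01 * m01

def d_efficiency(design, optimal, p=2):
    """D-efficiency of `design` relative to `optimal` (p = #parameters)."""
    return (design_det(design) / design_det(optimal)) ** (1.0 / p)

# D-optimal design for a straight line on [-1, 1]: half the weight at each end.
optimal = [(-1.0, 0.5), (1.0, 0.5)]
# Candidate: weight spread uniformly over three points.
uniform3 = [(-1.0, 1 / 3), (0.0, 1 / 3), (1.0, 1 / 3)]
print(round(d_efficiency(uniform3, optimal), 3))  # 0.816
```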

  1. Multivariate models of mixed assortment: phenotypic assortment and social homogamy for education and fluid ability.

    Science.gov (United States)

    Reynolds, C A; Baker, L A; Pedersen, N L

    2000-11-01

    Phenotypic assortment is assumed to be the principal mechanism of spouse similarity in most biometrical studies. Other assortment mechanisms, such as social homogamy, may be plausible. Two models are presented that consider phenotypic assortment and social homogamy simultaneously (i.e., mixed assortment), where selective associations between social background factors (Model I) versus selective associations between total environments (Model II) distinguish the models. A series of illustrative analyses was undertaken for education and fluid ability available on a sample of 116 Swedish twin pairs and their spouses. On the basis of several fit criteria Model I was preferred over Model II. Both social homogamy and phenotypic assortment may contribute to spouse similarity for educational attainment and fluid ability. Furthermore, spouse similarity for fluid ability may arise indirectly from social homogamy and phenotypic assortment for educational attainment. Power analyses indicated greater observed power for Model I than Model II. Additional power analyses indicated that considerably more twin-spouse sets would be needed for Model II than Model I, to resolve social homogamy and phenotypic assortment. Effects of misspecification of mechanisms of spouse similarity are also briefly discussed.

  2. Goodness-of-fit tests and model diagnostics for negative binomial regression of RNA sequencing data.

    Directory of Open Access Journals (Sweden)

    Gu Mi

    Full Text Available This work is about assessing model adequacy for negative binomial (NB) regression, particularly (1) assessing the adequacy of the NB assumption, and (2) assessing the appropriateness of models for NB dispersion parameters. Tools for the first are appropriate for NB regression generally; those for the second are primarily intended for RNA sequencing (RNA-Seq) data analysis. The typically small number of biological samples and large number of genes in RNA-Seq analysis motivate us to address the trade-offs between robustness and statistical power using NB regression models. One widely-used power-saving strategy, for example, is to assume some commonalities of NB dispersion parameters across genes via simple models relating them to mean expression rates, and many such models have been proposed. As RNA-Seq analysis is becoming ever more popular, it is appropriate to make more thorough investigations into power and robustness of the resulting methods, and into practical tools for model assessment. In this article, we propose simulation-based statistical tests and diagnostic graphics to address model adequacy. We provide simulated and real data examples to illustrate that our proposed methods are effective for detecting the misspecification of the NB mean-variance relationship as well as judging the adequacy of fit of several NB dispersion models.

  3. Goodness-of-fit tests and model diagnostics for negative binomial regression of RNA sequencing data.

    Science.gov (United States)

    Mi, Gu; Di, Yanming; Schafer, Daniel W

    2015-01-01

    This work is about assessing model adequacy for negative binomial (NB) regression, particularly (1) assessing the adequacy of the NB assumption, and (2) assessing the appropriateness of models for NB dispersion parameters. Tools for the first are appropriate for NB regression generally; those for the second are primarily intended for RNA sequencing (RNA-Seq) data analysis. The typically small number of biological samples and large number of genes in RNA-Seq analysis motivate us to address the trade-offs between robustness and statistical power using NB regression models. One widely-used power-saving strategy, for example, is to assume some commonalities of NB dispersion parameters across genes via simple models relating them to mean expression rates, and many such models have been proposed. As RNA-Seq analysis is becoming ever more popular, it is appropriate to make more thorough investigations into power and robustness of the resulting methods, and into practical tools for model assessment. In this article, we propose simulation-based statistical tests and diagnostic graphics to address model adequacy. We provide simulated and real data examples to illustrate that our proposed methods are effective for detecting the misspecification of the NB mean-variance relationship as well as judging the adequacy of fit of several NB dispersion models.

  4. A one-step-ahead pseudo-DIC for comparison of Bayesian state-space models.

    Science.gov (United States)

    Millar, R B; McKechnie, S

    2014-12-01

    In the context of state-space modeling, conventional usage of the deviance information criterion (DIC) evaluates the ability of the model to predict an observation at time t given the underlying state at time t. Motivated by the failure of conventional DIC to clearly choose between competing multivariate nonlinear Bayesian state-space models for coho salmon population dynamics, and the computational challenge of alternatives, this work proposes a one-step-ahead DIC, DICp, where prediction is conditional on the state at the previous time point. Simulations revealed that DICp worked well for choosing between state-space models with different process or observation equations. In contrast, conventional DIC could be grossly misleading, with a strong preference for the wrong model. This can be explained by its failure to account for inflated estimates of process error arising from the model mis-specification. DICp is not based on a true conditional likelihood, but is shown to have interpretation as a pseudo-DIC in which the compensatory behavior of the inflated process errors is eliminated. It can be easily calculated using the DIC monitors within popular BUGS software when the process and observation equations are conjugate. The improved performance of DICp is demonstrated by application to the multi-stage modeling of coho salmon abundance in Lobster Creek, Oregon. © 2014, The International Biometric Society.
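    For orientation, the conventional DIC that DICp modifies can be sketched for a toy one-parameter normal model; exact conjugate posterior draws stand in for MCMC output here, and this illustrates standard DIC only, not the paper's one-step-ahead variant:

```python
import math
import random

def deviance(y, mu, sigma=1.0):
    """-2 x log-likelihood of i.i.d. Normal(mu, sigma) observations."""
    return sum(
        2.0 * math.log(sigma * math.sqrt(2.0 * math.pi))
        + (yi - mu) ** 2 / sigma ** 2
        for yi in y
    )

rng = random.Random(1)
y = [rng.gauss(5.0, 1.0) for _ in range(50)]

# Stand-in for MCMC: exact conjugate posterior draws of mu
# (flat prior, sigma known), keeping the sketch self-contained.
ybar = sum(y) / len(y)
draws = [rng.gauss(ybar, 1.0 / math.sqrt(len(y))) for _ in range(4000)]

dbar = sum(deviance(y, m) for m in draws) / len(draws)  # posterior mean deviance
dhat = deviance(y, sum(draws) / len(draws))             # deviance at posterior mean
p_d = dbar - dhat                                       # effective number of parameters
dic = dbar + p_d
print(p_d)  # near 1: this toy model has one free parameter
```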

  5. Hyperspectral remote sensing of plant biochemistry using Bayesian model averaging with variable and band selection

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Kaiguang; Valle, Denis; Popescu, Sorin; Zhang, Xuesong; Mallick, Bani

    2013-05-15

    Model specification remains challenging in spectroscopy of plant biochemistry, as exemplified by the availability of various spectral indices or band combinations for estimating the same biochemical. This lack of consensus in model choice across applications argues for a paradigm shift in hyperspectral methods to address model uncertainty and misspecification. We demonstrated one such method using Bayesian model averaging (BMA), which performs variable/band selection and quantifies the relative merits of many candidate models to synthesize a weighted average model with improved predictive performance. The utility of BMA was examined using a portfolio of 27 foliage spectral–chemical datasets representing over 80 species across the globe to estimate multiple biochemical properties, including nitrogen, hydrogen, carbon, cellulose, lignin, chlorophyll (a or b), carotenoid, polar and nonpolar extractives, leaf mass per area, and equivalent water thickness. We also compared BMA with partial least squares (PLS) and stepwise multiple regression (SMR). Results showed that all the biochemicals except carotenoid were accurately estimated from hyperspectral data with R2 values > 0.80.
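    As a sketch of one common ingredient of BMA (not necessarily the weighting used by these authors), posterior model probabilities are often approximated from BIC values under equal prior model odds:

```python
import math

def bma_weights(bics):
    """Approximate posterior model probabilities from BIC values,
    assuming equal prior model probabilities."""
    best = min(bics)
    unnorm = [math.exp(-0.5 * (b - best)) for b in bics]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Hypothetical BIC values for three candidate band-combination models.
w = bma_weights([100.0, 102.0, 110.0])
print([round(x, 3) for x in w])  # [0.727, 0.268, 0.005]
```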

  6. Evaluation of (241)Am deposited in different parts of the leg bones and skeleton to justify in vivo measurements of the knee for estimating total skeletal activity.

    Science.gov (United States)

    Khalaf, Majid; Brey, Richard R; Derryberry, DeWayne

    2013-01-01

    The percentage of Am deposited in different parts of leg bones relative to the total leg activity was calculated from radiochemical analysis results from six whole body donors participating in the U.S. Transuranium and Uranium Registries (USTUR). In five of these six USTUR cases, the percentage of Am deposited in the knee region as well as in the entire leg was separately calculated relative to total skeletal activity. The purpose of this study is to find a region in the leg that is both suitable for in vivo measurement of Am deposited in the bones and has a good correlation with the total skeletal Am burden. In all analyzed cases, the femur was the bone with the highest percentage of Am deposited in the leg (48.8%). In the five cases that have complete whole skeletal analysis, the percentage of Am activity in the knee relative to entire skeletal activity was 4.8%, and the average value of its coefficient of variation was 10.6%. The percentage of Am in the leg relative to total skeletal activity was 20% with an average coefficient of variation of 13.63%. The Am activity in the knee as well as in the leg was strongly correlated (R = 99.5% and R = 99.1%, respectively) with the amount of Am activity in the entire skeleton using a simple linear relationship. The highest correlation was found between the amount of Am deposited in the knee and the amount of Am deposited in the entire skeleton. This correlation is important because it might enable an accurate assessment of the total skeletal Am burden to be performed from in vivo monitoring of the knee region. In all analyzed cases, an excellent correlation (R = 99.9%) was found between the amount of Am activity in the knee and the amount of Am activity in the entire leg. The results of this study suggest three simple models: two models to predict the total skeletal activity based on either leg or knee activity, and the third model to predict the total leg activity based on knee activity. The results also suggest that the ...
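    The simple prediction models suggested above are linear relationships; the sketch below fits one by ordinary least squares. Only the 4.8% knee-to-skeleton fraction is taken from the abstract; the activity values and units are invented for illustration:

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
        (xi - mx) ** 2 for xi in x
    )
    return my - b * mx, b

# Invented activity values (arbitrary units): skeleton = knee / 0.048,
# mirroring the abstract's finding that the knee region holds about
# 4.8% of total skeletal Am activity.
knee = [1.0, 2.0, 3.0, 4.0]
skeleton = [k / 0.048 for k in knee]
a, b = fit_line(knee, skeleton)
print(round(b, 1))  # 20.8, i.e. roughly 1 / 0.048
```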

  7. Development of Probability-Linguistic Models for Vulnerability Assessment of Important Technical Facilities of Aviation Security

    National Research Council Canada - National Science Library

    2016-01-01

    ... are justified, and the problem of assessing the vulnerability of the protected object is formulated. The main advantage of the developed model is the extensive opportunity it provides for formalizing diverse information on the security status of the object...

  8. Ecologically justified regulatory provisions for riverine hydroelectric power plants and minimum instream flow requirements in diverted streams; Oekologisch begruendete, dynamische Mindestwasserregelungen bei Ausleitungskraftwerken

    Energy Technology Data Exchange (ETDEWEB)

    Jorde, K.

    1997-12-31

    The study was intended to develop a model versatile enough to permit quantification of various water demand scenarios in connection with the operation of riverine hydroelectric power plants. Specific emphasis was placed on defining the minimum instream flow to be maintained in river segments, because of its elementary significance to flowing-water biocoenoses. Based on fictitious minimum water requirements, various scenarios were simulated for flow regimes depending on power plant operation, so as to establish a system for comparative analysis and evaluation of the resulting economic effects on power plant efficiency on the one hand and the ecological effects on the aquatic habitat on the other. The information derived was to serve as a basis for decision-making for regulatory purposes. For this study, the temporal and spatial variability of the flow regime at the river bed in a river segment was examined for the first time. Based on this information, complemented by information obtained from habitat simulations, a method was derived for the determination of ecological requirements and their incorporation into regulatory water management provisions. The field measurements were carried out with the FST hemisphere, a proven, efficient and reliable method of assessing flow regimes at river beds. Evaluation of the measured instream flow data characterising three morphologically different segments of diverted rivers was done with the CASIMIR computer code. The ASS models derived were used for comparative assessment of existing regulatory provisions and recommended amendments determining the required minimum instream flow in diverted rivers. The requirements were defined taking as a basis data obtained for three different years. (orig./CB) [German: The aim of the work was to develop a model procedure that flexibly enables the quantification of different utilisation demands on run-of-river hydroelectric power plants. In particular, the preservation of a certain dynamic, which for ...]

  9. Adjusting for overdispersion in piecewise exponential regression models to estimate excess mortality rate in population-based research

    Directory of Open Access Journals (Sweden)

    Miguel Angel Luque-Fernandez

    2016-10-01

    Full Text Available Abstract Background In population-based cancer research, piecewise exponential regression models are used to derive adjusted estimates of excess mortality due to cancer using the Poisson generalized linear modelling framework. However, the assumption that the conditional mean and variance of the rate parameter given the set of covariates x i are equal is strong and may fail to account for overdispersion given the variability of the rate parameter (the variance exceeds the mean). Using an empirical example, we aimed to describe simple methods to test and correct for overdispersion. Methods We used a regression-based score test for overdispersion under the relative survival framework and proposed different approaches to correct for overdispersion, including quasi-likelihood, robust standard error estimation, negative binomial regression and flexible piecewise modelling. Results All piecewise exponential regression models showed the presence of significant inherent overdispersion (p-value <0.001). However, the flexible piecewise exponential model showed the smallest overdispersion parameter (3.2 versus 21.3 for non-flexible piecewise exponential models). Conclusion We showed that there were no major differences between methods. However, using flexible piecewise regression modelling, with either quasi-likelihood or robust standard errors, was the best approach as it deals with both overdispersion due to model misspecification and true (inherent) overdispersion.
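    A regression-based score test of the kind described in the Methods can be sketched in a few lines. The statistic below is a simple Cameron-Trivedi-style version assuming fitted Poisson means are already available; the counts are invented for illustration:

```python
import math

def overdispersion_score(y, mu):
    """Score-type statistic comparing squared residuals against the
    Poisson assumption Var(y) = mu; approximately N(0, 1) under the
    null of no overdispersion."""
    num = sum((yi - mi) ** 2 - yi for yi, mi in zip(y, mu))
    den = math.sqrt(2.0 * sum(mi ** 2 for mi in mu))
    return num / den

# Invented counts that are far more variable than their fitted means.
y = [0, 1, 9, 0, 12, 1, 0, 15, 2, 0]
mu = [4.0] * 10
print(overdispersion_score(y, mu) > 1.645)  # True: clear overdispersion
```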

  10. Modeling units of study from a pedagogical perspective: the pedagogical meta-model behind EML

    NARCIS (Netherlands)

    Koper, Rob

    2003-01-01

    This text is a short summary of the work on pedagogical analysis carried out when EML (Educational Modelling Language) was being developed. Because we address pedagogical meta-models, the consequence is that I must justify the underlying pedagogical models it describes. I have included a (far from co

  11. Approaches for modeling within subject variability in pharmacometric count data analysis: dynamic inter-occasion variability and stochastic differential equations.

    Science.gov (United States)

    Deng, Chenhui; Plan, Elodie L; Karlsson, Mats O

    2016-06-01

    Parameter variation in pharmacometric analysis studies can be characterized as within subject parameter variability (WSV) in pharmacometric models. WSV has previously been successfully modeled using inter-occasion variability (IOV), but also stochastic differential equations (SDEs). In this study, two approaches, dynamic inter-occasion variability (dIOV) and adapted stochastic differential equations, were proposed to investigate WSV in pharmacometric count data analysis. These approaches were applied to published count models for seizure counts and Likert pain scores. Both approaches improved the model fits significantly. In addition, stochastic simulation and estimation were used to explore further the capability of the two approaches to diagnose and improve models where existing WSV is not recognized. The results of simulations confirmed the gain in introducing WSV as dIOV and SDEs when parameters vary randomly over time. Further, the approaches were also informative as diagnostics of model misspecification, when parameters changed systematically over time but this was not recognized in the structural model. The proposed approaches in this study offer strategies to characterize WSV and are not restricted to count data.

  12. Relationship between Remittance and Economic Growth in Bangladesh: an Autoregressive Distributed Lag Model (ARDL)

    Directory of Open Access Journals (Sweden)

    Shapan Chandra Majumder

    2016-03-01

    Full Text Available This study examines the long-run impact of remittances on economic growth in Bangladesh. Bangladesh, being one of the top remittance-recipient countries in the world, has drawn attention to the remittance-output relationship in recent years. In 2014, remittances contributed 8.2% of the GDP of Bangladesh, while the contribution was 6.7% in 2006. The main objective of this study is to investigate the impact of remittances on economic growth (GDP). We adopted Autoregressive Distributed Lag (ARDL) models, or dynamic linear regressions, which are widely used to examine the relationship between remittances and economic growth in the country. In testing for the unit root properties of the time series data, all variables are found stationary at the first-differencing level under the ADF and PP stationarity tests. The study made use of diagnostic tests such as the residual normality test and the heteroskedasticity and serial autocorrelation tests for misspecification in order to validate the parameter estimates obtained from the estimated model. The stability of the model is also checked by the CUSUM test. The ARDL model shows that there exists a statistically significant long-run positive relationship between remittances and the economic growth of gross domestic product in Bangladesh.

  13. Identification and estimation of nonlinear models using two samples with nonclassical measurement errors

    KAUST Repository

    Carroll, Raymond J.

    2010-05-01

    This paper considers identification and estimation of a general nonlinear Errors-in-Variables (EIV) model using two samples. Both samples consist of a dependent variable, some error-free covariates, and an error-prone covariate, for which the measurement error has unknown distribution and could be arbitrarily correlated with the latent true values; and neither sample contains an accurate measurement of the corresponding true variable. We assume that the regression model of interest - the conditional distribution of the dependent variable given the latent true covariate and the error-free covariates - is the same in both samples, but the distributions of the latent true covariates vary with observed error-free discrete covariates. We first show that the general latent nonlinear model is nonparametrically identified using the two samples when both could have nonclassical errors, without either instrumental variables or independence between the two samples. When the two samples are independent and the nonlinear regression model is parameterized, we propose sieve Quasi Maximum Likelihood Estimation (Q-MLE) for the parameter of interest, and establish its root-n consistency and asymptotic normality under possible misspecification, and its semiparametric efficiency under correct specification, with easily estimated standard errors. A Monte Carlo simulation and a data application are presented to show the power of the approach.

  14. Modeling of hydrogen interactions with beryllium

    Energy Technology Data Exchange (ETDEWEB)

    Longhurst, G.R. [Lockheed Martin Idaho Technologies Co., Idaho Falls, ID (United States)

    1998-01-01

    In this paper, improved mathematical models are developed for hydrogen interactions with beryllium. This includes the saturation effect observed for high-flux implantation of ions from plasmas and retention of tritium produced from neutronic transmutations in beryllium. Use of the models developed is justified by showing how they can replicate experimental data using the TMAP4 tritium transport code. (author)

  15. A marketing plan to justify the market target, advertising and promotion

    Institute of Scientific and Technical Information of China (English)

    曹炜

    2013-01-01

    The purpose of this report is to give the marketing plan and information on BCFTCS Holidays to the Board of Directors. The writer researches a region typical as a holiday destination as the source of data and resources, and makes a marketing plan to justify the market target, advertising and promotion.

  16. Main Problems of China's Water Justified System and the Ways to Perfect It

    Institute of Scientific and Technical Information of China (English)

    冯嘉

    2012-01-01

    China's Water Justified System has the potential to enhance water resources protection as well as promote sustainable development. However, the implementation of this system is faced with many difficulties due to the low legal hierarchy of the Regulations on Water Resources Justification for Construction Projects. The main problems are as follows: firstly, some kinds of water such as rainwater, desalinated water, drained water and running water are not defined as water resources by China's Water Act, so the system cannot take them into its scope of application; secondly, agriculture, industry and city plans, which have profound impacts on water resources and at the same time are strictly limited by them, are not defined as part of the scope of the Water Justified System by the existing rule; thirdly, the legal rule which enacted the system is in violation of the Administrative Licensing Act, making it very difficult to put into enforcement; besides, the vocational qualification stipulated by the system also heavily violates upper laws, which makes it impossible to carry out; fourthly, the Administrative Punishment Act is also violated by the system in many aspects, although the

  17. Forecasting Performance of Asymmetric GARCH Stock Market Volatility Models

    Directory of Open Access Journals (Sweden)

    Hojin Lee

    2009-12-01

    Full Text Available We investigate the asymmetry between positive and negative returns in their effect on the conditional variance of the stock market index and incorporate these characteristics to form an out-of-sample volatility forecast. Contrary to prior evidence, however, the results in this paper suggest that no asymmetric GARCH model is superior to the basic GARCH(1,1) model. Our prior expectation is that, for equity returns, positive and negative shocks are unlikely to have the same impact on volatility. In order to reflect this intuition, we implement three diagnostic tests for volatility models (the Sign Bias Test, the Negative Size Bias Test, and the Positive Size Bias Test) together with tests against the alternatives of QGARCH and GJR-GARCH. The asymmetry test results indicate that the sign and the size of the unexpected return shock do not influence current volatility differently, which contradicts our presumption that there are asymmetric effects in stock market volatility. This result is in line with various diagnostic tests designed to determine whether the GARCH(1,1) volatility estimates adequately represent the data. The diagnostic tests in section 2 indicate that the GARCH(1,1) model for weekly KOSPI returns is robust to the misspecification test. We also investigate two representative asymmetric GARCH models, the QGARCH and GJR-GARCH models, for out-of-sample forecasting performance. The out-of-sample forecasting ability test reveals that no single model clearly outperforms the others. The GJR-GARCH and QGARCH models give mixed results in forecasting ability on all four criteria across all forecast horizons considered. Also, the predictive accuracy test of Diebold and Mariano, based on both absolute and squared prediction errors, suggests that the forecasts from the linear and asymmetric GARCH models are not significantly different from each other.
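The Sign Bias Test mentioned above has a compact form. This is a hedged sketch of the Engle-Ng idea on synthetic residuals, not KOSPI data: regress squared standardized residuals on a dummy for a negative lagged residual and inspect the t-statistic on the dummy.

```python
import numpy as np

def sign_bias_tstat(z):
    """Engle-Ng sign bias test: regress squared (standardized) residuals on a
    constant and a dummy for a negative lagged residual; a large t-statistic
    on the dummy signals asymmetry the volatility model has not captured."""
    y = z[1:] ** 2
    d = (z[:-1] < 0).astype(float)
    X = np.column_stack([np.ones_like(d), d])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ beta
    s2 = e @ e / (len(y) - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(1)
eps = rng.standard_normal(2001)
sym = eps[1:]                                   # symmetric residuals: no sign effect
asym = eps[1:] * (1.0 + 0.5 * (eps[:-1] < 0))   # larger shocks after negative news
t_sym, t_asym = sign_bias_tstat(sym), sign_bias_tstat(asym)
print(t_sym, t_asym)
```

The symmetric series yields an insignificant statistic (the GARCH(1,1)-adequate case in the abstract), while the asymmetric series is flagged strongly.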

  18. Can Mathematics be Justified by Natural Logic?

    Science.gov (United States)

    Schreiber, Lothar; Sommer, Hanns

    2010-11-01

    Charles Darwin claimed that the forms and the behaviour of living beings can be explained from their will to survive. But what are the consequences of this idea for human knowledge, theories of nature and mathematics? We discuss the view that even Plato's objective world of mathematical objects does not exist absolutely, without the intentions of mathematicians. Using Husserl's Phenomenological Method, cognition can be understood as a process by which meaning is deduced from empirical data relative to intentions. Thereby the essential structure of any cognition process can be detected, and this structure is mirrored in logic. A natural logic becomes the direct result of cognition. Only in a second step is mathematics obtained by abstraction from natural logic. In this way mathematics gains a well-defined foundation and is no longer part of a dubious 'a priori knowledge' (Kant). This access to mathematics offers a new look at many old problems, e.g. the Petersburg problem and the problem 'P = NP?'. We demonstrate that this new justification of mathematics also has important applications in Artificial Intelligence. Our method provides a procedure to construct an adequate logic to solve most efficiently the problems of a given problem class. Thus, heuristics can be tailor-made for the necessities of applications.

  19. Preemptive Strike: Justifying the Second Iraq War

    OpenAIRE

    MACPHERSON, Jeff

    2010-01-01

    This essay discusses several factors that influenced the Bush administration's decision to launch a preemptive strike against Iraq and the war in Iraq that followed. It examines U.S. foreign policy toward Iraq, the justification for the American invasion of Iraq, and the consequences of the war. The essay concludes by noting similarities between the Iraq War and the Vietnam War.

  20. Gastric carcinoma: when is palliative gastrectomy justified?

    Directory of Open Access Journals (Sweden)

    Hubert Scheidbach

    2011-12-01

    Full Text Available Gastric carcinoma is frequently diagnosed at an advanced stage of non-curable tumor growth, characterized by infiltration of the gastric serosa, peritoneal tumor spread and/or metastases within lymph nodes and liver. Currently, there is controversy over the value of palliative resection with regard to its safety and benefit to patient outcome. Based on the available literature, this overview summarizes the various aspects and interprets the limited data on palliative resection of gastric carcinoma. It turns out that the available study results may indicate potential for an improved quality of life and a prolongation of survival, provided morbidity and mortality are acceptable.

  1. On justifying eco-unfriendly behaviors

    NARCIS (Netherlands)

    Meijers, M.H.C.

    2014-01-01

    The climate is changing, species are about to go extinct, and mountains of garbage are ever increasing. In order to preserve the earth and provide a good living standard for all its inhabitants, it is important for people to continue making environmentally friendly choices. This dissertation, however…

  2. Three Requirements for Justifying an Educational Neuroscience

    Science.gov (United States)

    Hruby, George G.

    2012-01-01

    Background: Over the past quarter century, efforts to bridge between research in the neurosciences and research, theory, and practice in education have grown from a mere hope to noteworthy scholarly sophistication. Many dedicated educational researchers have developed the secondary expertise in the necessary neurosciences and related fields to…

  3. Justified Humanitarian Intervention: Operation ALLIED FORCE

    Science.gov (United States)

    2013-04-25

    Intervention” in J.L. Holzgrefe and Robert O. Keohane, eds. Humanitarian Intervention: Ethical, Legal and Political Dilemmas (Cambridge: Cambridge...Gamble. Annapolis, MD: Naval Institute Press, 2007. Holzgrefe, J.L. and Robert O. Keohane, eds. Humanitarian Intervention: Ethical, Legal and

  4. Justifying Physical Education Based on Neuroscience Evidence

    Science.gov (United States)

    Berg, Kris

    2010-01-01

    Research has shown that exercise improves cognitive function and psychological traits that influence behavior (e.g., mood, level of motivation). The evidence in the literature also shows that physical education may enhance learning or that academic performance is at least maintained despite a reduction in classroom time in order to increase time…

  5. Self-Esteem: Justifying Its Existence.

    Science.gov (United States)

    Street, Sue; Isaacs, Madelyn

    1998-01-01

    The role of self-esteem as a professional and personality construct has been obscured by its panacea role. Definitions of self-esteem and related terms are distinguished. Self-esteem is discussed as a developmental construct, a personality construct, and as a therapeutic goal. Therapeutic, educational, and counseling implications are discussed.…

  6. Justifying Study Abroad in Financially Difficult Times

    Science.gov (United States)

    Ludlum, Marty; Ice, Randal; Sheetz-Nguyen, Jessica

    2013-01-01

    In this paper, we will develop the justification for study abroad. We will discuss the current economic climate and its impact on budgets. Next, we will explain the many benefits of the study abroad programs. Then we will propose some less expensive alternatives to the traditional study abroad programs. We will conclude with expectations for the…

  7. Self-Esteem: Justifying Its Existence.

    Science.gov (United States)

    Street, Sue; Isaacs, Madelyn

    1998-01-01

    The role of self-esteem as a professional and personality construct has been obscured by its panacea role. Definitions of self-esteem and related terms are distinguished. Self-esteem is discussed as a developmental construct, a personality construct, and as a therapeutic goal. Therapeutic, educational, and counseling implications are discussed.…

  8. Justified and unjustified use of growth hormone.

    NARCIS (Netherlands)

    A-J. van der Lely (Aart-Jan)

    2004-01-01

    textabstractGrowth hormone (GH) replacement therapy for children and adults with proven GH deficiency due to a pituitary disorder has become an accepted therapy with proven efficacy. GH is increasingly suggested, however, as a potential treatment for frailty, osteoporosis, morbid o

  9. Justifying Physical Education Based on Neuroscience Evidence

    Science.gov (United States)

    Berg, Kris

    2010-01-01

    Research has shown that exercise improves cognitive function and psychological traits that influence behavior (e.g., mood, level of motivation). The evidence in the literature also shows that physical education may enhance learning or that academic performance is at least maintained despite a reduction in classroom time in order to increase time…

  10. Three Requirements for Justifying an Educational Neuroscience

    Science.gov (United States)

    Hruby, George G.

    2012-01-01

    Background: Over the past quarter century, efforts to bridge between research in the neurosciences and research, theory, and practice in education have grown from a mere hope to noteworthy scholarly sophistication. Many dedicated educational researchers have developed the secondary expertise in the necessary neurosciences and related fields to…

  11. Constructed criteria: redefining merit to justify discrimination.

    Science.gov (United States)

    Uhlmann, Eric Luis; Cohen, Geoffrey L

    2005-06-01

    This article presents an account of job discrimination according to which people redefine merit in a manner congenial to the idiosyncratic credentials of individual applicants from desired groups. In three studies, participants assigned male and female applicants to gender-stereotypical jobs. However, they did not view male and female applicants as having different strengths and weaknesses. Instead, they redefined the criteria for success at the job as requiring the specific credentials that a candidate of the desired gender happened to have. Commitment to hiring criteria prior to disclosure of the applicant's gender eliminated discrimination, suggesting that bias in the construction of hiring criteria plays a causal role in discrimination.

  12. Three requirements for justifying an educational neuroscience.

    Science.gov (United States)

    Hruby, George G

    2012-03-01

    Over the past quarter century, efforts to bridge between research in the neurosciences and research, theory, and practice in education have grown from a mere hope to noteworthy scholarly sophistication. Many dedicated educational researchers have developed the secondary expertise in the necessary neurosciences and related fields to generate both empirical research and theoretical syntheses of noteworthy promise. Nonetheless, thoughtful and critical scholars in education have expressed concern about both the intellectual coherence and ethical dangers of this new area. It is still an open question whether educational neuroscience is for some time yet to remain only a formative study area for adventurous scholars or is already a fully fledged field of educational scholarship. In this paper, I suggest that to be a worthy field of educational research, educational neuroscience will need to address three issues: intellectual coherence, mutually informing and respected scholarly expertise, and an ethical commitment to the moral implications and obligations shared within educational research generally. I shall set forth some examples of lapses in this regard, focusing primarily on work on reading development, as that is my area of expertise, and make recommendations for due diligence. Arguments. First, intellectual coherence requires both precision in definition of technical terms (so that diverse scholars and professionals may communicate findings and insights consistently across fields), and precision in the logical warrants by which educational implications are drawn from empirical data from the neurosciences. Both needs are facilitated by careful attention to categorical boundary and avoidance of category error. Second, educational neuroscientists require focused and broad expertise in both the neurosciences and educational scholarship on teaching and learning in classrooms (and/or ancillary fields). 
If history is our guide, neuroscience implications for practice will prove unlikely in practice without expertise on practice. Additionally, respect for the expertise of others in this hybrid and necessarily collaborative enterprise is required. Third, educational neuroscience must take seriously the heightened moral and ethical concerns and commitments of educational professionals generally and educational researchers particularly. This means keeping a vigilant eye towards preserving the integrity of empirical and theoretical findings against rhetorical misuse by educational marketers, policy makers, and polemicists targeting the general public. I conclude that educational neuroscience is more than a hybrid patchwork of individual interests constituting a study area, and is perhaps ready to stand as a legitimate field of educational inquiry. It will not be accepted as such, however, nor should it be, unless the need to demonstrate a capacity for consistent intellectual coherence, scholarly expertise, and ethical commitment is met. ©2012 The British Psychological Society.

  13. A variational data assimilation system for soil–atmosphere flux estimates for the Community Land Model (CLM3.5)

    Directory of Open Access Journals (Sweden)

    C. M. Hoppe

    2014-05-01

    Full Text Available This paper presents the development and implementation of a spatio-temporal variational data assimilation system (4D-var) for the soil–vegetation–atmosphere transfer model "Community Land Model" (CLM3.5), along with the development of the adjoint code for the core soil–atmosphere transfer scheme of energy and soil moisture. The purpose of this work is to obtain an improved estimation technique for the energy fluxes (sensible and latent heat fluxes) between the soil and the atmosphere. Optimal assessments of these fluxes are available neither from model simulations nor from measurements alone, while 4D-var data assimilation has the potential to combine both information sources by a Best Linear Unbiased Estimate (BLUE). The 4D-var method requires the adjoint model of the CLM, which is established in this work. The new data assimilation algorithm is able to assimilate soil temperature and soil moisture measurements for one-dimensional columns of the model grid. Numerical experiments were first used to test the algorithm under idealised conditions. It was found that the analysis delivers improved results whenever there is a dependence between the initial values and the assimilated quantity. Furthermore, soil temperature and soil moisture from in situ field measurements were assimilated. These calculations demonstrate the improved performance of flux estimates whenever soil property parameters of sufficient quality are available. Misspecifications could also be identified by the performance of the variational scheme.
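The 4D-var machinery is easiest to see on a toy problem. The sketch below is an assumption-laden simplification, not CLM code: it assimilates noisy observations of a scalar linear model x_{t+1} = M * x_t by minimizing the usual background-plus-observation cost, with the gradient accumulated via the (here trivial) adjoint.

```python
import numpy as np

# Toy 4D-var for a scalar linear "model": x_{t+1} = M * x_t.
# Cost: J(x0) = (x0 - xb)^2 / B + sum_t (M^t x0 - y_t)^2 / R.
# For a linear model the adjoint is multiplication by M, so the
# gradient of J can be accumulated exactly.
M, B, R, T = 0.9, 1.0, 0.25, 10
rng = np.random.default_rng(2)
x_true = 2.0
y = np.array([x_true * M**t for t in range(T)]) + rng.normal(0.0, 0.5, T)
xb = 0.0                                   # background (first-guess) state

def grad_J(x0):
    g = 2.0 * (x0 - xb) / B                # background term
    for t in range(T):                     # forward run + adjoint accumulation
        g += 2.0 * M**t * (M**t * x0 - y[t]) / R
    return g

x0 = xb
for _ in range(200):                       # steepest descent on the quadratic cost
    x0 -= 0.01 * grad_J(x0)
print(x0)                                  # analysis: pulled from background toward truth
```

The analysis ends up much closer to the truth than the background, which is the BLUE-style combination of both information sources described above.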

  14. Education and gender bias in the sex ratio at birth: evidence from India.

    Science.gov (United States)

    Echávarri, Rebeca A; Ezcurra, Roberto

    2010-02-01

    This article investigates the possible existence of a nonlinear link between female disadvantage in natality and education. To this end, we devise a theoretical model based on the key role of social interaction in explaining people's acquisition of preferences, which justifies the existence of a nonmonotonic relationship between female disadvantage in natality and education. The empirical validity of the proposed model is examined for the case of India, using district-level data. In this context, our econometric analysis pays particular attention to the role of spatial dependence to avoid any potential problems of misspecification. The results confirm that the relationship between the sex ratio at birth and education in India follows an inverted U-shape. This finding is robust to the inclusion of additional explanatory variables in the analysis, and to the choice of the spatial weight matrix used to quantify the spatial interdependence between the sample districts.
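The inverted U-shape claim corresponds to fitting a quadratic and checking the sign of the curvature. A minimal sketch on synthetic data follows; the variable names and coefficients are invented for illustration, not the article's estimates, and the spatial-dependence controls are omitted.

```python
import numpy as np

rng = np.random.default_rng(6)
edu = rng.uniform(0, 10, 400)                  # hypothetical education index
# Invented coefficients producing an inverted U with a peak at edu = 5:
bias = 1.0 + 0.8 * edu - 0.08 * edu**2 + rng.normal(0, 0.5, 400)
X = np.column_stack([np.ones_like(edu), edu, edu**2])
b = np.linalg.lstsq(X, bias, rcond=None)[0]
turning_point = -b[1] / (2.0 * b[2])           # vertex of the fitted parabola
print(b[2], turning_point)
```

A negative quadratic coefficient with an interior turning point is the regression signature of an inverted-U relationship.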

  15. Evaluating remedial alternatives for an acid mine drainage stream: A model post audit

    Science.gov (United States)

    Runkel, Robert L.; Kimball, Briant A.; Walton-Day, Katherine; Verplanck, Philip L.; Broshears, Robert E.

    2012-01-01

    A post audit for a reactive transport model used to evaluate acid mine drainage treatment systems is presented herein. The post audit is based on a paired synoptic approach in which hydrogeochemical data are collected at low (existing conditions) and elevated (following treatment) pH. Data obtained under existing, low-pH conditions are used for calibration, and the resultant model is used to predict metal concentrations observed following treatment. Predictions for Al, As, Fe, H+, and Pb accurately reproduce the observed reduction in dissolved concentrations afforded by the treatment system, and the information provided in regard to standard attainment is also accurate (predictions correctly indicate attainment or nonattainment of water quality standards for 19 of 25 cases). Errors associated with Cd, Cu, and Zn are attributed to misspecification of sorbent mass (precipitated Fe). In addition to these specific results, the post audit provides insight in regard to calibration and sensitivity analysis that is contrary to conventional wisdom. Steps taken during the calibration process to improve simulations of As sorption were ultimately detrimental to the predictive results, for example, and the sensitivity analysis failed to bracket observed metal concentrations.

  16. Estimating successive cancer risks in Lynch Syndrome families using a progressive three-state model.

    Science.gov (United States)

    Choi, Yun-Hee; Briollais, Laurent; Green, Jane; Parfrey, Patrick; Kopciuk, Karen

    2014-02-20

    Lynch Syndrome (LS) families harbor mutated mismatch repair genes, which predispose them to specific types of cancer. Because individuals within LS families can experience multiple cancers over their lifetime, we developed a progressive three-state model to estimate the disease risk from healthy (state 0) to a first cancer (state 1) and then to a second cancer (state 2). An ascertainment correction of the likelihood was made to adjust for complex sampling designs, with carrier probabilities for family members with missing genotype information estimated from their family's observed genotype and phenotype information in a one-step expectation-maximization algorithm. A sandwich variance estimator was employed to guard against possible model misspecification. The main objective of this paper is to estimate the disease risk (penetrance) for age at a second cancer after someone has experienced a first cancer that is also associated with a mutated gene. Simulation study results indicate that our approach generally provides unbiased risk estimates and low root mean squared errors across different family study designs, proportions of missing genotypes, and risk heterogeneities. An application to 12 large LS families from Newfoundland demonstrates that the risk for a second cancer was substantial and that the age at a first colorectal cancer significantly impacted the age at any subsequent LS cancer. This study provides new insights for developing more effective management of mutation carriers in LS families by providing more accurate multiple-cancer risk estimates.
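The sandwich variance estimator used above has a simple prototype in the linear-model case. The sketch below uses synthetic data, not the LS family model: it compares naive and sandwich (robust) standard errors for OLS when the error variance is misspecified (heteroskedastic), where the naive formula understates the slope's uncertainty.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
x = rng.uniform(0.0, 2.0, n)
# Error variance grows with x, so the homoskedastic OLS variance formula
# is misspecified, while the coefficient estimate itself stays consistent.
y = 1.0 + 2.0 * x + rng.normal(0.0, x**2)
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ beta

bread = np.linalg.inv(X.T @ X)
naive = bread * (e @ e / (n - 2))              # "model-based" variance
meat = X.T @ (X * (e**2)[:, None])             # sum_i e_i^2 x_i x_i'
sandwich = bread @ meat @ bread                # robust to the misspecification
se_naive = np.sqrt(np.diag(naive))
se_sandwich = np.sqrt(np.diag(sandwich))
print(se_naive[1], se_sandwich[1])
```

The point estimate is still close to the truth; only the variance formula needs the sandwich correction, which is exactly the role it plays in the three-state model.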

  17. Finding the right balance between groundwater model complexity and experimental effort via Bayesian model selection

    Science.gov (United States)

    Schöniger, Anneli; Illman, Walter A.; Wöhling, Thomas; Nowak, Wolfgang

    2015-12-01

    Groundwater modelers face the challenge of how to assign representative parameter values to the studied aquifer. Several approaches are available to parameterize spatial heterogeneity in aquifer parameters. They differ in their conceptualization and complexity, ranging from homogeneous models to heterogeneous random fields. While it is common practice to invest more effort into data collection for models with a finer resolution of heterogeneities, there is little guidance on how much data is required to justify a given level of model complexity. In this study, we propose to use concepts related to Bayesian model selection to identify this balance. We demonstrate our approach on the characterization of a heterogeneous aquifer via hydraulic tomography in a sandbox experiment (Illman et al., 2010). We consider four increasingly complex parameterizations of hydraulic conductivity: (1) effective homogeneous medium, (2) geology-based zonation, (3) interpolation by pilot points, and (4) geostatistical random fields. First, we investigate the shift in justified complexity with increasing amounts of available data by constructing a model confusion matrix. This matrix indicates the maximum level of complexity that can be justified given a specific experimental setup. Second, we determine which parameterization is most adequate given the observed drawdown data. Third, we test how the different parameterizations perform in a validation setup. The results of our test case indicate that aquifer characterization via hydraulic tomography does not necessarily require (or justify) a geostatistical description. Instead, a zonation-based model might be a more robust choice, but only if the zonation is geologically adequate.
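Bayesian model selection trades fit against complexity. As a loose stand-in for the paper's evidence-based comparison, the sketch below scores polynomial models of increasing complexity with BIC (a large-sample approximation to the log model evidence) on invented synthetic data: extra parameters are only "justified" when the data demand them.

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(-1.0, 1.0, 60)
y = 1.0 - 2.0 * x + 3.0 * x**2 + rng.normal(0.0, 0.2, x.size)   # truth: degree 2

def bic(deg):
    """Gaussian-likelihood BIC: the fit improvement must beat k*log(n)."""
    coef = np.polyfit(x, y, deg)
    rss = np.sum((y - np.polyval(coef, x)) ** 2)
    n, k = x.size, deg + 1
    return n * np.log(rss / n) + k * np.log(n)

scores = {d: bic(d) for d in range(1, 6)}
best = min(scores, key=scores.get)
print(best, scores[best])   # typically selects the quadratic, not a higher order
```

The analogy to the aquifer study: the geostatistical field plays the role of the degree-5 polynomial, and the confusion-matrix exercise asks how much data it takes before such complexity stops being penalized away.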

  18. “It’s an Issue that Must Be Addressed, Once Infertility Is Declared a Disease” Study of The Discursive Mechanisms Used by Chilean Deputies to Justify their Positions Regarding Assisted Reproduction

    Directory of Open Access Journals (Sweden)

    Yanko Pavicevic Cifuentes

    2015-10-01

    Full Text Available In Chile, the use and development of assisted reproductive technologies (ART) have been on a constant rise, with a total of 1932 cycles registered in 2009 (SOCMER, 2009). However, the legal frame that regulates these technologies is weak, so in practice the clinics that provide them are the ones that supervise their use. This is why we wanted to understand the position of those who have the faculty to legislate on the use of ART in Chile, namely the deputies of the Republic, focusing on how they justify their standpoints. Our investigation was of a qualitative nature, because it gives more space for reflexivity and flexibility in the investigation process (Mason, 2002). We collected information using semi-structured interviews conducted with 16 deputies of the two main political coalitions present in Chile. In the deputies' discourse, different positions concerning the use of ART are manifested: there are those who demand respect for human dignity and nature and those who foment scientific development, but in general there is consensus about the necessity to amplify the regulation of ART and incentivize its development, always safeguarding ethics and equity.

  19. Preliminary Multivariable Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip

    2010-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. Previously, the authors published two single-variable cost models based on 19 flight missions. The current paper presents the development of a multivariable space telescope cost model. The validity of previously published models is tested. Cost-estimating relationships that are and are not significant cost drivers are identified, and interrelationships between variables are explored.

  20. CEAI: CCM based Email Authorship Identification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah

    2013-01-01

    content features. It is observed that the use of such features in the authorship identification process has a positive impact on the accuracy of the authorship identification task. We performed experiments to justify our arguments and compared the results with other baseline models. Experimental results...

  1. Numerical Modeling of Rotary Kiln Productivity Increase

    NARCIS (Netherlands)

    Romero-Valle, M.A.; Pisaroni, M.; Van Puyvelde, D.; Lahaye, D.J.P.; Sadi, R.

    2013-01-01

    Rotary kilns are used in many industrial processes ranging from cement manufacturing to waste incineration. The operating conditions vary widely depending on the process. While there are many models available within the literature and industry, the wide range of operating conditions justifies furthe

  2. Efficient estimation and prediction for the Bayesian binary spatial model with flexible link functions.

    Science.gov (United States)

    Roy, Vivekananda; Evangelou, Evangelos; Zhu, Zhengyuan

    2016-03-01

    Spatial generalized linear mixed models (SGLMMs) are popular models for spatial data with a non-Gaussian response. Binomial SGLMMs with logit or probit link functions are often used to model spatially dependent binomial random variables. It is known that for independent binomial data, the robit regression model provides a more robust (against extreme observations) alternative to the more popular logistic and probit models. In this article, we introduce a Bayesian spatial robit model for spatially dependent binomial data. Since constructing a meaningful prior on the link function parameter as well as the spatial correlation parameters in SGLMMs is difficult, we propose an empirical Bayes (EB) approach for the estimation of these parameters as well as for the prediction of the random effects. The EB methodology is implemented by efficient importance sampling methods based on Markov chain Monte Carlo (MCMC) algorithms. Our simulation study shows that the robit model is robust against model misspecification, and our EB method results in estimates with less bias than a full Bayesian (FB) analysis. The methodology is applied to a Celastrus orbiculatus data set and a Rhizoctonia root disease data set. For the former, which is known to contain outlying observations, the robit model is shown to do better at predicting the spatial distribution of an invasive species. For the latter, our approach does as well as the classical models in predicting the severity of a root disease, as the probit link is shown to be appropriate. Though this article is written for binomial SGLMMs for brevity, the EB methodology is more general and can be applied to other types of SGLMMs. In the accompanying R package geoBayes, implementations for other SGLMMs such as Poisson and Gamma SGLMMs are provided.
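The empirical Bayes idea (estimate hyperparameters from the marginal distribution of the data, then plug them in) can be shown in its simplest normal-normal form. This is a sketch of the general EB recipe, not the spatial robit implementation in geoBayes:

```python
import numpy as np

rng = np.random.default_rng(5)
m = 500
tau2 = 0.5
theta = rng.normal(0.0, np.sqrt(tau2), m)   # latent random effects
y = theta + rng.normal(0.0, 1.0, m)         # one noisy observation each

# Empirical Bayes: marginally y_i ~ N(0, 1 + tau^2), so the hyperparameter
# is estimated by a (truncated) moment/ML step, then plugged into the
# posterior-mean shrinkage rule for predicting the random effects.
tau2_hat = max(np.mean(y**2) - 1.0, 0.0)
shrink = tau2_hat / (1.0 + tau2_hat)
theta_eb = shrink * y

mse_raw = np.mean((y - theta) ** 2)         # use y itself as the prediction
mse_eb = np.mean((theta_eb - theta) ** 2)   # EB shrinkage prediction
print(mse_raw, mse_eb)
```

The shrinkage predictions have markedly lower error than the raw observations, which is the payoff EB delivers for random-effect prediction.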

  3. Some Problems in Using Diffusion Models for New Products.

    Science.gov (United States)

    Bernhardt, Irwin; Mackenzie, Kenneth D.

    This paper analyzes some of the problems of using diffusion models to formulate marketing strategies for new products. Though future work in this area appears justified, many unresolved problems limit its application. There is no theory for adoption and diffusion processes; such a theory is outlined in this paper. The present models are too…

  4. A normative model of hospital marketing decision making.

    Science.gov (United States)

    Hammond, K L; Brown, G; Humphreys, N

    1993-01-01

    A hospital marketing model is proposed for use as a framework for applying marketing strategy and concepts to hospitals. The cells of the model, primarily summarizing the many decisions of the marketing management process as can be applied to hospitals, are justified by the health care marketing literature.

  5. Justifying knowledge, justifying method, taking action: epistemologies, methodologies, and methods in qualitative research.

    Science.gov (United States)

    Carter, Stacy M; Little, Miles

    2007-12-01

    In this article, the authors clarify a framework for qualitative research, in particular for evaluating its quality, founded on epistemology, methodology, and method. They define these elements and discuss their respective contributions and interrelationships. Epistemology determines and is made visible through method, particularly in the participant-researcher relationship, measures of research quality, and form, voice, and representation in analysis and writing. Epistemology guides methodological choices and is axiological. Methodology shapes and is shaped by research objectives, questions, and study design. Methodologies can prescribe choices of method, resonate with particular academic disciplines, and encourage or discourage the use and/or development of theory. Method is constrained by and makes visible methodological and epistemic choices. If we define good quality qualitative research as research that attends to all three elements and demonstrates internal consistency between them, standardized checklists can be transcended and innovation and diversity in qualitative research practice facilitated.

  6. Testing for Stock Market Contagion: A Quantile Regression Approach

    NARCIS (Netherlands)

    S.Y. Park (Sung); W. Wang (Wendun); N. Huang (Naijing)

    2015-01-01

    Motivated by the asymmetric and leptokurtic behavior of financial data, we propose a new contagion test in the quantile regression framework that is robust to model misspecification. Unlike conventional correlation-based tests, the proposed quantile contagion test
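The quantile regression framework underlying this contagion test is built on the check ("pinball") loss. A minimal sketch in Python (the sample data and the value of tau are illustrative, not taken from the paper):

```python
def pinball_loss(c, xs, tau):
    """Mean check (pinball) loss of candidate quantile c over the sample xs."""
    return sum((x - c) * (tau - (1 if x < c else 0)) for x in xs) / len(xs)

data = [1, 2, 3, 4, 100]  # toy sample with a heavy-tailed outlier
# Minimizing the tau = 0.5 loss over the sample points recovers the median.
best = min(data, key=lambda c: pinball_loss(c, data, 0.5))
print(best)
```

The tau = 0.5 minimizer is the median even in the presence of the extreme observation, which is the intuition behind the robustness of quantile-based tests to leptokurtic data.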

  7. SNP_NLMM: A SAS Macro to Implement a Flexible Random Effects Density for Generalized Linear and Nonlinear Mixed Models.

    Science.gov (United States)

    Vock, David M; Davidian, Marie; Tsiatis, Anastasios A

    2014-01-01

    Generalized linear and nonlinear mixed models (GLMMs and NLMMs) are commonly used to represent non-Gaussian or nonlinear longitudinal or clustered data. A common assumption is that the random effects are Gaussian. However, this assumption may be unrealistic in some applications, and misspecification of the random effects density may lead to maximum likelihood parameter estimators that are inconsistent, biased, and inefficient. Because testing if the random effects are Gaussian is difficult, previous research has recommended using a flexible random effects density. However, computational limitations have precluded widespread use of flexible random effects densities for GLMMs and NLMMs. We develop a SAS macro, SNP_NLMM, that overcomes the computational challenges to fit GLMMs and NLMMs where the random effects are assumed to follow a smooth density that can be represented by the seminonparametric formulation proposed by Gallant and Nychka (1987). The macro is flexible enough to allow for any density of the response conditional on the random effects and any nonlinear mean trajectory. We demonstrate the SNP_NLMM macro on a GLMM of the disease progression of toenail infection and on an NLMM of intravenous drug concentration over time.

  8. Experiment selection for the discrimination of semi-quantitative models of dynamical systems

    NARCIS (Netherlands)

    Vatcheva; de Jong, H; Bernard, O; Mars, NJI

    2006-01-01

    Modeling an experimental system often results in a number of alternative models that are all justified by the available experimental data. To discriminate among these models, additional experiments are needed. Existing methods for the selection of discriminatory experiments in statistics and in arti

  9. A Nonparametric Bayesian Approach to Seismic Hazard Modeling Using the ETAS Framework

    Science.gov (United States)

    Ross, G.

    2015-12-01

    The epidemic-type aftershock sequence (ETAS) model is one of the most popular tools for modeling seismicity and quantifying risk in earthquake-prone regions. Under the ETAS model, the occurrence times of earthquakes are treated as a self-exciting Poisson process where each earthquake briefly increases the probability of subsequent earthquakes occurring soon afterwards, which captures the fact that large mainshocks tend to produce long sequences of aftershocks. A triggering kernel controls the amount by which the probability increases based on the magnitude of each earthquake, and the rate at which it then decays over time. This triggering kernel is usually chosen heuristically, to match the parametric form of the modified Omori law for aftershock decay. However, recent work has questioned whether this is an appropriate choice. Since the choice of kernel has a large impact on the predictions made by the ETAS model, avoiding misspecification is crucially important. We present a novel nonparametric version of ETAS which avoids making parametric assumptions, and instead learns the correct specification from the data itself. Our approach is based on the Dirichlet process, a modern class of Bayesian prior distributions that allows for efficient inference over an infinite-dimensional space of functions. We show how our nonparametric ETAS model can be fit to data, and present results demonstrating that the fit is greatly improved compared to the standard parametric specification. Additionally, we explain how our model can be used to perform probabilistic declustering of earthquake catalogs, to classify earthquakes as either aftershocks or mainshocks, and to learn the causal relations between pairs of earthquakes.
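The standard parametric ETAS conditional intensity that this record's nonparametric approach generalizes combines a constant background rate with a modified-Omori contribution from each past event. A hedged sketch (all parameter values and the event history are illustrative, not from the paper):

```python
import math

def etas_intensity(t, history, mu=0.1, K=0.05, alpha=1.0, c=0.01, p=1.2, m0=3.0):
    """Conditional intensity of the standard parametric ETAS model at time t:
    background rate mu plus a modified-Omori term for each past event
    (t_i, m_i) with magnitude scaling exp(alpha * (m_i - m0))."""
    rate = mu
    for t_i, m_i in history:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) * (t - t_i + c) ** (-p)
    return rate

history = [(0.0, 5.0), (1.0, 4.0)]  # (time, magnitude) of past shocks
print(etas_intensity(2.0, history))
```

With no history the intensity reduces to the background rate, and the aftershock contribution decays toward it over time, which is the behavior the choice of triggering kernel controls.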

  10. Numerical Modeling of Rotary Kiln Productivity Increase

    OpenAIRE

    2013-01-01

    Rotary kilns are used in many industrial processes ranging from cement manufacturing to waste incineration. The operating conditions vary widely depending on the process. While there are many models available within the literature and industry, the wide range of operating conditions justifies further modeling work to improve the understanding of the processes taking place within the kiln. The kiln being studied in this work produces calcium aluminate cements (CAC). In a first stage of the pro...

  11. Kinetic models in spin chemistry. 1. The hyperfine interaction

    DEFF Research Database (Denmark)

    Mojaza, M.; Pedersen, J. B.

    2012-01-01

    Kinetic models for quantum systems are quite popular due to their simplicity, although they are difficult to justify. We show that the transformation from quantum to kinetic description can be done exactly for the hyperfine interaction of one nuclei with arbitrary spin; more spins are described w...

  12. Models

    DEFF Research Database (Denmark)

    Juel-Christiansen, Carsten

    2005-01-01

    The article highlights visual rotation - images, drawings, models, works - as the privileged medium in the communication of ideas between creative architects.

  13. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify the technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling arises because current trends toward strengthening the general educational and worldview functions of computer science call for additional research of the…

  14. Conditioning of the stationary kriging matrices for some well-known covariance models

    Energy Technology Data Exchange (ETDEWEB)

    Posa, D. (IRMA-CNR, Bari (Italy))

    1989-10-01

    In this paper, the condition number of the stationary kriging matrix is studied for some well-known covariance models. Indeed, the robustness of the kriging weights is strongly affected by this measure. Such an analysis can justify the choice of a covariance function among other admissible models which could fit a given experimental covariance equally well.
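The conditioning comparison described in this record can be sketched in a few lines. As an illustration (the grid and range values are assumptions, not from the paper), the very smooth Gaussian covariance model yields a far larger condition number than the exponential model at the same range:

```python
import numpy as np

# Condition number of the simple-kriging covariance matrix on a regular
# 1-D grid, for two admissible covariance models with the same range.
x = np.linspace(0.0, 1.0, 10)
h = np.abs(x[:, None] - x[None, :])     # pairwise distance matrix

cov_exponential = np.exp(-h / 0.5)      # exponential covariance model
cov_gaussian = np.exp(-(h / 0.5) ** 2)  # Gaussian covariance model

print(np.linalg.cond(cov_exponential))
print(np.linalg.cond(cov_gaussian))
```

The much larger condition number of the Gaussian model means far less robust kriging weights, which is one way conditioning can justify choosing among covariance models that fit an experimental covariance equally well.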

  15. Fitting Multilevel Models with Ordinal Outcomes: Performance of Alternative Specifications and Methods of Estimation

    Science.gov (United States)

    Bauer, Daniel J.; Sterba, Sonya K.

    2011-01-01

    Previous research has compared methods of estimation for fitting multilevel models to binary data, but there are reasons to believe that the results will not always generalize to the ordinal case. This article thus evaluates (a) whether and when fitting multilevel linear models to ordinal outcome data is justified and (b) which estimator to employ…

  16. Modelling

    CERN Document Server

    Spädtke, P

    2013-01-01

    Modeling of technical machines has become a standard technique since computers became powerful enough to handle the amount of data relevant to the specific system. Simulation of an existing physical device requires the knowledge of all relevant quantities. Electric fields given by the surrounding boundary as well as magnetic fields caused by coils or permanent magnets have to be known. Internal sources for both fields are sometimes taken into account, such as space charge forces or the internal magnetic field of a moving bunch of charged particles. The solver routines used are briefly described, and some benchmarking is shown to estimate the necessary computing times for different problems. Different types of charged particle sources will be shown together with a suitable model to describe the physical system. Electron guns are covered as well as different ion sources (volume ion sources, laser ion sources, Penning ion sources, electron resonance ion sources, and H$^-$-sources) together with some remarks on beam transport.

  17. Proposal for product development model focused on ce certification methodology

    Directory of Open Access Journals (Sweden)

    Nathalia Marcia Goulart Pinheiro

    2015-09-01

    Full Text Available This paper presents a critical analysis comparing 21 product development models in order to identify whether these structures meet the demands of European Community (CE) product certification. Furthermore, it presents a product development model comprising the steps in the models analyzed, including improvements in activities for the referred product certification. The proposed improvements are justified by the growing quest for the internationalization of products and processes within companies.

  18. Estimating treatment effect in a proportional hazards model in randomized clinical trials with all-or-nothing compliance.

    Science.gov (United States)

    Li, Shuli; Gray, Robert J

    2016-09-01

    We consider methods for estimating the treatment effect and/or the covariate by treatment interaction effect in a randomized clinical trial under noncompliance with time-to-event outcome. As in Cuzick et al. (2007), assuming that the patient population consists of three (possibly latent) subgroups based on treatment preference: the ambivalent group, the insisters, and the refusers, we estimate the effects among the ambivalent group. The parameters have causal interpretations under standard assumptions. The article contains two main contributions. First, we propose a weighted per-protocol (Wtd PP) estimator through incorporating time-varying weights in a proportional hazards model. In the second part of the article, under the model considered in Cuzick et al. (2007), we propose an EM algorithm to maximize a full likelihood (FL) as well as the pseudo likelihood (PL) considered in Cuzick et al. (2007). The E step of the algorithm involves computing the conditional expectation of a linear function of the latent membership, and the main advantage of the EM algorithm is that the risk parameters can be updated by fitting a weighted Cox model using standard software and the baseline hazard can be updated using closed-form solutions. Simulations show that the EM algorithm is computationally much more efficient than directly maximizing the observed likelihood. The main advantage of the Wtd PP approach is that it is more robust to model misspecifications among the insisters and refusers since the outcome model does not impose distributional assumptions among these two groups. © 2016, The International Biometric Society.

  19. EM Adaptive LASSO-A Multilocus Modeling Strategy for Detecting SNPs Associated with Zero-inflated Count Phenotypes.

    Science.gov (United States)

    Mallick, Himel; Tiwari, Hemant K

    2016-01-01

    Count data are increasingly ubiquitous in genetic association studies, where it is possible to observe excess zero counts as compared to what is expected based on standard assumptions. For instance, in rheumatology, data are usually collected in multiple joints within a person or multiple sub-regions of a joint, and it is not uncommon for the phenotypes to contain an enormous number of zeros because excessive zero counts are present in the majority of patients. Most existing statistical methods assume that the count phenotypes follow one of these four distributions with appropriate dispersion-handling mechanisms: Poisson, Zero-inflated Poisson (ZIP), Negative Binomial, and Zero-inflated Negative Binomial (ZINB). However, little is known about their implications in genetic association studies. Also, there is a relative paucity of literature on their usefulness with respect to model misspecification and variable selection. In this article, we have investigated the performance of several state-of-the-art approaches for handling zero-inflated count data along with a novel penalized regression approach with an adaptive LASSO penalty, by simulating data under a variety of disease models and linkage disequilibrium patterns. By taking into account data-adaptive weights in the estimation procedure, the proposed method provides greater flexibility in multi-SNP modeling of zero-inflated count phenotypes. A fast coordinate descent algorithm nested within an EM (expectation-maximization) algorithm is implemented for estimating the model parameters and conducting variable selection simultaneously. Results show that the proposed method has optimal performance in the presence of multicollinearity, as measured by both prediction accuracy and empirical power, which is especially apparent as the sample size increases. Moreover, the Type I error rates become more or less uncontrollable for the competing methods when a model is misspecified, a phenomenon routinely encountered in practice.
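The zero-inflated Poisson (ZIP) distribution referenced in this record mixes a point mass at zero with an ordinary Poisson count. A minimal sketch (the rate and zero-inflation values are illustrative):

```python
import math

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson: with probability pi the count is a structural
    zero; otherwise it is drawn from Poisson(lam)."""
    base = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi * (k == 0) + (1 - pi) * base

lam, pi = 2.0, 0.3
print(zip_pmf(0, lam, pi))   # ZIP zero probability
print(math.exp(-lam))        # plain Poisson zero probability at the same rate
```

The ZIP zero probability exceeds the plain-Poisson one at the same rate, which is exactly the "excess zeros" feature that motivates these models.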

  20. EM Adaptive LASSO – A Multilocus Modeling Strategy for Detecting SNPs Associated With Zero-inflated Count Phenotypes

    Directory of Open Access Journals (Sweden)

    Himel eMallick

    2016-03-01

    Full Text Available Count data are increasingly ubiquitous in genetic association studies, where it is possible to observe excess zero counts as compared to what is expected based on standard assumptions. For instance, in rheumatology, data are usually collected in multiple joints within a person or multiple sub-regions of a joint, and it is not uncommon for the phenotypes to contain an enormous number of zeros because excessive zero counts are present in the majority of patients. Most existing statistical methods assume that the count phenotypes follow one of these four distributions with appropriate dispersion-handling mechanisms: Poisson, Zero-inflated Poisson (ZIP), Negative Binomial, and Zero-inflated Negative Binomial (ZINB). However, little is known about their implications in genetic association studies. Also, there is a relative paucity of literature on their usefulness with respect to model misspecification and variable selection. In this article, we have investigated the performance of several state-of-the-art approaches for handling zero-inflated count data along with a novel penalized regression approach with an adaptive LASSO penalty, by simulating data under a variety of disease models and linkage disequilibrium patterns. By taking into account data-adaptive weights in the estimation procedure, the proposed method provides greater flexibility in multi-SNP modeling of zero-inflated count phenotypes. A fast coordinate descent algorithm nested within an EM (expectation-maximization) algorithm is implemented for estimating the model parameters and conducting variable selection simultaneously. Results show that the proposed method has optimal performance in the presence of multicollinearity, as measured by both prediction accuracy and empirical power, which is especially apparent as the sample size increases. Moreover, the Type I error rates become more or less uncontrollable for the competing methods when a model is misspecified, a phenomenon routinely

  1. model

    African Journals Online (AJOL)

    the neural construction of individual and communal identities in ... occurs, including models based on information processing,1 ... Applying the DSM descriptive approach to dissociation in the ... a personal, narrative path that connects personal to ethnic ..... managed the problem in the context of the community, using a.

  2. Piecewise linear and Boolean models of chemical reaction networks

    Science.gov (United States)

    Veliz-Cuba, Alan; Kumar, Ajit; Josić, Krešimir

    2014-01-01

    Models of biochemical networks are frequently complex and high-dimensional. Reduction methods that preserve important dynamical properties are therefore essential for their study. Interactions in biochemical networks are frequently modeled using Hill functions (x^n/(J^n + x^n)). Reduced ODEs and Boolean approximations of such model networks have been studied extensively when the exponent n is large. However, while the case of small constant J appears in practice, it is not well understood. We provide a mathematical analysis of this limit, and show that a reduction to a set of piecewise linear ODEs and Boolean networks can be mathematically justified. The piecewise linear systems have closed form solutions that closely track those of the fully nonlinear model. The simpler, Boolean network can be used to study the qualitative behavior of the original system. We justify the reduction using geometric singular perturbation theory and compact convergence, and illustrate the results in network models of a toggle switch and an oscillator. PMID:25412739
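The small-J limit analyzed in this record can be illustrated numerically: for a tiny threshold J the Hill function is essentially a Boolean switch away from the threshold. A sketch (the parameter values are illustrative):

```python
def hill(x, n, J):
    """Hill function x^n / (J^n + x^n)."""
    return x ** n / (J ** n + x ** n)

def boolean_switch(x, J):
    """Step approximation: off below the threshold J, on above it."""
    return 0.0 if x < J else 1.0

n, J = 2, 1e-3
for x in [0.0, 0.5, 1.0]:
    print(x, hill(x, n, J), boolean_switch(x, J))
```

Away from the tiny threshold the two responses agree closely, which is the intuition behind replacing Hill kinetics with piecewise linear or Boolean dynamics in the small-J limit.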

  3. Piecewise linear and Boolean models of chemical reaction networks.

    Science.gov (United States)

    Veliz-Cuba, Alan; Kumar, Ajit; Josić, Krešimir

    2014-12-01

    Models of biochemical networks are frequently complex and high-dimensional. Reduction methods that preserve important dynamical properties are therefore essential for their study. Interactions in biochemical networks are frequently modeled using Hill functions (x^n/(J^n + x^n)). Reduced ODEs and Boolean approximations of such model networks have been studied extensively when the exponent n is large. However, while the case of small constant J appears in practice, it is not well understood. We provide a mathematical analysis of this limit and show that a reduction to a set of piecewise linear ODEs and Boolean networks can be mathematically justified. The piecewise linear systems have closed-form solutions that closely track those of the fully nonlinear model. The simpler, Boolean network can be used to study the qualitative behavior of the original system. We justify the reduction using geometric singular perturbation theory and compact convergence, and illustrate the results in network models of a toggle switch and an oscillator.

  4. Discussing the Strehler-Mildvan model of mortality

    Directory of Open Access Journals (Sweden)

    Maxim Finkelstein

    2012-03-01

    Full Text Available BACKGROUND Half a century ago, Strehler and Mildvan (1960) published the seminal paper that, based on some assumptions (postulates), theoretically 'justified' the Gompertz law of mortality. OBJECTIVE We wish to discuss assumptions and limitations of the original Strehler-Mildvan model (as well as of the Strehler-Mildvan correlation) and consider some modifications and departures from this model. METHODS We use the framework of stochastic point processes for analyzing the original Strehler-Mildvan model. We also suggest the 'lifesaving approach' for describing the departure from rectangularization to shifts in survival curves for human mortality that has been observed in the second half of the previous century. RESULTS We show that the Strehler-Mildvan model can be justified only under the additional assumption that the process of shocks (demands for energy) follows the Poisson pattern. We also suggest a modification that accounts for the oldest-old mortality plateau.
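The Gompertz law of mortality that the Strehler-Mildvan model 'justifies' is an exponentially increasing hazard, mu(t) = R * exp(alpha * t), whose signature is a constant mortality doubling time ln(2)/alpha. A brief numerical sketch (R and alpha are illustrative, not fitted values):

```python
import math

def gompertz_hazard(t, R=1e-4, alpha=0.085):
    """Gompertz mortality rate R * exp(alpha * t); R and alpha are illustrative."""
    return R * math.exp(alpha * t)

# Under the Gompertz law the hazard doubles every ln(2)/alpha time units.
doubling_time = math.log(2) / 0.085
ratio = gompertz_hazard(40.0) / gompertz_hazard(40.0 - doubling_time)
print(doubling_time, ratio)
```

The hazard ratio across one doubling time is exactly 2 regardless of age, which is the exponential structure the Strehler-Mildvan postulates were designed to produce.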

  5. The disruption management model.

    Science.gov (United States)

    McAlister, James

    2011-10-01

    Within all organisations, business continuity disruptions present a set of dilemmas that managers may not have dealt with before in their normal daily duties. The disruption management model provides a simple but effective management tool to enable crisis management teams to stay focused on recovery in the midst of a business continuity incident. The model has four chronological primary headlines, which steer the team through a quick-time crisis decision-making process. The procedure facilitates timely, systematic, rationalised and justified decisions, which can withstand post-event scrutiny. The disruption management model has been thoroughly tested within an emergency services environment and is proven to significantly support clear and concise decision making in a business continuity context.

  6. A mathematical model of star formation in the Galaxy

    Directory of Open Access Journals (Sweden)

    M.A. Sharaf

    2012-06-01

    Full Text Available This paper is generally concerned with star formation in the Galaxy, especially blue stars. Blue stars are the most luminous, massive and the largest in radius. A simple mathematical model of the formation of these stars is established and implemented as a computational algorithm, which enables us to learn more about how such stars form. Real and artificial examples are used to justify this model.

  7. Uncertainty, sensitivity analysis and the role of data based mechanistic modeling in hydrology

    Directory of Open Access Journals (Sweden)

    M. Ratto

    2006-09-01

    Full Text Available In this paper, we discuss the problem of calibration and uncertainty estimation for hydrologic systems from two points of view: a bottom-up, reductionist approach; and a top-down, data-based mechanistic (DBM) approach. The two approaches are applied to the modelling of the River Hodder catchment in North-West England. The bottom-up approach is developed using the TOPMODEL, whose structure is evaluated by global sensitivity analysis (GSA) in order to specify the most sensitive and important parameters; and the subsequent exercises in calibration and validation are carried out in the light of this sensitivity analysis. GSA helps to improve the calibration of hydrological models, making their properties more transparent and highlighting mis-specification problems. The DBM model provides a quick and efficient analysis of the rainfall-flow data, revealing important characteristics of the catchment-scale response, such as the nature of the effective rainfall nonlinearity and the partitioning of the effective rainfall into different flow pathways. TOPMODEL calibration takes more time and it explains the flow data a little less well than the DBM model. The main differences in the modelling results are in the nature of the models and the flow decomposition they suggest. The "quick" (63%) and "slow" (37%) components of the decomposed flow identified in the DBM model show a clear partitioning of the flow, with the quick component apparently accounting for the effects of surface and near surface processes; and the slow component arising from the displacement of groundwater into the river channel (base flow). On the other hand, the two output flow components in TOPMODEL have a different physical interpretation, with a single flow component (95%) accounting for both slow (subsurface) and fast (surface) dynamics, while the other, very small component (5%) is interpreted as an instantaneous surface runoff generated by rainfall falling on areas of
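The parallel "quick"/"slow" decomposition identified by the DBM model can be sketched as two first-order stores in parallel driven by the same effective rainfall. The 63%/37% steady-state split below is taken from the abstract; the recession parameters a_q and a_s are assumptions chosen only to separate the time scales:

```python
# Two parallel first-order stores driven by constant unit effective rainfall.
# Steady-state gains 0.63 ("quick") and 0.37 ("slow") mirror the flow
# partition quoted above; the recession parameters are illustrative.
a_q, gain_q = 0.5, 0.63
a_s, gain_s = 0.95, 0.37
b_q = gain_q * (1 - a_q)   # numerator coefficient giving the desired gain
b_s = gain_s * (1 - a_s)

quick = slow = 0.0
for _ in range(400):
    quick = a_q * quick + b_q * 1.0
    slow = a_s * slow + b_s * 1.0

print(quick, slow, quick + slow)  # approaches the 0.63 / 0.37 partition
```

The quick store equilibrates within a few steps while the slow store takes tens of steps, mimicking the surface versus groundwater pathways suggested by the DBM analysis.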

  8. Causal Inference and Model Selection in Complex Settings

    Science.gov (United States)

    Zhao, Shandong

    Propensity score methods have become a part of the standard toolkit for applied researchers who wish to ascertain causal effects from observational data. While they were originally developed for binary treatments, several researchers have proposed generalizations of the propensity score methodology for non-binary treatment regimes. In this article, we first review three main methods that generalize propensity scores in this direction, namely, inverse propensity weighting (IPW), the propensity function (P-FUNCTION), and the generalized propensity score (GPS), along with recent extensions of the GPS that aim to improve its robustness. We compare the assumptions, theoretical properties, and empirical performance of these methods. We propose three new methods that provide robust causal estimation based on the P-FUNCTION and GPS. While our proposed P-FUNCTION-based estimator performs well, we generally advise caution in that all available methods can be biased by model misspecification and extrapolation. In a related line of research, we consider adjustment for posttreatment covariates in causal inference. Even in a randomized experiment, observations might have different compliance performance under treatment and control assignment. This posttreatment covariate cannot be adjusted using standard statistical methods. We review the principal stratification framework which allows for modeling this effect as part of its Bayesian hierarchical models. We generalize the current model to add the possibility of adjusting for pretreatment covariates. We also propose a new estimator of the average treatment effect over the entire population. In a third line of research, we discuss the spectral line detection problem in high energy astrophysics. We carefully review how this problem can be statistically formulated as a precise hypothesis test with point null hypothesis, why a usual likelihood ratio test does not apply for a problem of this nature, and a doable fix to correctly
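The inverse propensity weighting (IPW) estimator reviewed in this record can be illustrated on a deterministic toy dataset with known propensity scores (all numbers below are invented for illustration):

```python
# IPW for a binary treatment on a toy dataset with known propensities.
# Each row: (treated indicator, outcome, propensity of receiving treatment).
data = [(1, 10.0, 0.50),
        (0,  8.0, 0.50),
        (1,  6.0, 0.25),
        (0,  4.0, 0.75)]

n = len(data)
# Weight treated units by 1/p and control units by 1/(1-p), then contrast.
treated_mean = sum(t * y / p for t, y, p in data) / n
control_mean = sum((1 - t) * y / (1 - p) for t, y, p in data) / n
ate = treated_mean - control_mean
print(ate)
```

When the propensity model is misspecified, or propensities approach 0 or 1, these weights explode, which is the bias and extrapolation caveat the abstract raises for all of the reviewed methods.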

  9. A new notion of soundness in bare public-key model

    Institute of Scientific and Technical Information of China (English)

    ZHAO Yunlei; ZHU Hong

    2003-01-01

    A new notion of soundness in the bare public-key (BPK) model is presented. This new notion lies between one-time soundness and sequential soundness, and its reasonableness is justified in the context of resettable zero-knowledge when the resettable zero-knowledge prover is implemented by a smart card.

  10. Development, Implementation, and Evaluation of the Apollo Model of Pediatric Rehabilitation Service Delivery

    Science.gov (United States)

    Camden, Chantal; Swaine, Bonnie; Tetreault, Sylvie; Bergeron, Sophie; Lambert, Carole

    2013-01-01

    This article presents the experience of a rehabilitation program that undertook the challenge to reorganize its services to address accessibility issues and improve service quality. The context in which the reorganization process occurred, along with the relevant literature justifying the need for a new service delivery model, and an historical…

  11. Optimising the management of complex dynamic ecosystems. An ecological-economic modelling approach

    NARCIS (Netherlands)

    Hein, L.G.

    2005-01-01

    Keywords: ecological-economic modelling; ecosystem services; resource use; efficient; sustainability; wetlands, rangelands.

    Ecosyst

  12. A simple model for atomic layer doped field-effect transistor (ALD-FET) electronic states

    Energy Technology Data Exchange (ETDEWEB)

    Mora R, M.E. [Centro de Investigaciones en Optica, Unidad Aguascalientes. Juan de Montoro 207, Zona Centro, 20000 Aguascalientes (Mexico); Gaggero S, L.M. [Escuela de Fisica, Universidad Autonoma de Zacatecas, Av. Preparatoria 301, 98060 Zacatecas (Mexico)

    1998-12-31

    We propose a simple potential model based on the Thomas-Fermi approximation to reproduce the main properties of the electronic structure of an atomic layer doped field effect transistor. Preliminary numerical results for a Si-based ALD-FET justify why bound electronic states are not observed in the experiment. (Author)

  13. Excess covariance and dynamic instability in a multi-asset model

    NARCIS (Netherlands)

    Anufriev, M.; Bottazzi, G.; Marsili, M.; Pin, P.

    2011-01-01

    The presence of excess covariance in financial price returns is an accepted empirical fact: the price dynamics of financial assets tend to be more correlated than their fundamentals would justify. We propose an intertemporal equilibrium multi-assets model of financial markets with an explicit and

  14. Optimising the management of complex dynamic ecosystems. An ecological-economic modelling approach

    NARCIS (Netherlands)

    Hein, L.G.

    2005-01-01

    Keywords: ecological-economic modelling; ecosystem services; resource use; efficient; sustainability; wetlands, rangelands.

  15. First update of the International Xenotransplantation Association consensus statement on conditions for undertaking clinical trials of porcine islet products in type 1 diabetes--Chapter 4: pre-clinical efficacy and complication data required to justify a clinical trial.

    Science.gov (United States)

    Cooper, David K C; Bottino, Rita; Gianello, Pierre; Graham, Melanie; Hawthorne, Wayne J; Kirk, Allan D; Korsgren, Olle; Park, Chung-Gyu; Weber, Collin

    2016-01-01

    In 2009, the International Xenotransplantation Association (IXA) published a consensus document that provided guidelines and "recommendations" (not regulations) for those contemplating clinical trials of porcine islet transplantation. These guidelines included the IXA's opinion on what constituted "rigorous pre-clinical studies using the most relevant animal models" and were based on "non-human primate testing." We now report our discussion following a careful review of the 2009 guidelines as they relate to pre-clinical testing. In summary, we do not believe there is a need to greatly modify the conclusions and recommendations of the original consensus document. Pre-clinical studies should be sufficiently rigorous to provide optimism that a clinical trial is likely to be safe and has a realistic chance of success, but need not be so demanding that success might only be achieved by very prolonged experimentation, as this would not be in the interests of patients whose quality of life might benefit immensely from a successful islet xenotransplant. We believe these guidelines will be of benefit to both investigators planning a clinical trial and to institutions and regulatory authorities considering a proposal for a clinical trial. In addition, we suggest consideration should be given to establishing an IXA Clinical Trial Advisory Committee that would be available to advise (but not regulate) researchers considering initiating a clinical trial of xenotransplantation.

  16. Affinity and Hostility in Divided Communities: a Mathematical Model

    CERN Document Server

    Thron, Christopher

    2015-01-01

    We propose, develop, and analyze a mathematical model of intergroup attitudes in a community that is divided between two distinct social groups (which may be distinguished by religion, ethnicity, or some other socially distinguishing factor). The model is based on very simple premises that are both intuitive and justified by sociological research. We investigate the behavior of the model in various special cases, for various model configurations. We discuss the stability of the model, and the continuous or discontinuous dependence of model behavior on various parameters. Finally, we discuss possible implications for strategies to improve intergroup affinity, and to defuse tension and prevent deterioration of intergroup relationships.

  17. CERAMIC: Case-Control Association Testing in Samples with Related Individuals, Based on Retrospective Mixed Model Analysis with Adjustment for Covariates.

    Directory of Open Access Journals (Sweden)

    Sheng Zhong

    2016-10-01

    Full Text Available We consider the problem of genetic association testing of a binary trait in a sample that contains related individuals, where we adjust for relevant covariates and allow for missing data. We propose CERAMIC, an estimating equation approach that can be viewed as a hybrid of logistic regression and linear mixed-effects model (LMM) approaches. CERAMIC extends the recently proposed CARAT method to allow samples with related individuals and to incorporate partially missing data. In simulations, we show that CERAMIC outperforms existing LMM and generalized LMM approaches, maintaining high power and correct type 1 error across a wider range of scenarios. CERAMIC results in a particularly large power increase over existing methods when the sample includes related individuals with some missing data (e.g., when some individuals with phenotype and covariate information have missing genotype), because CERAMIC is able to make use of the relationship information to incorporate partially missing data in the analysis while correcting for dependence. Because CERAMIC is based on a retrospective analysis, it is robust to misspecification of the phenotype model, resulting in better control of type 1 error and higher power than that of prospective methods, such as GMMAT, when the phenotype model is misspecified. CERAMIC is computationally efficient for genomewide analysis in samples of related individuals of almost any configuration, including small families, unrelated individuals and even large, complex pedigrees. We apply CERAMIC to data on type 2 diabetes (T2D) from the Framingham Heart Study. In a genome scan, 9 of the 10 smallest CERAMIC p-values occur in or near either known T2D susceptibility loci or plausible candidates, verifying that CERAMIC is able to home in on the important loci in a genome scan.

  18. Interpretational Confounding Is Due to Misspecification, Not to Type of Indicator: Comment on Howell, Breivik, and Wilcox (2007)

    Science.gov (United States)

    Bollen, Kenneth A.

    2007-01-01

    R. D. Howell, E. Breivik, and J. B. Wilcox (2007) have argued that causal (formative) indicators are inherently subject to interpretational confounding. That is, they have argued that using causal (formative) indicators leads the empirical meaning of a latent variable to be other than that assigned to it by a researcher. Their critique of causal…

  19. A Heterogeneous Bayesian Regression Model for Cross-Sectional Data Involving a Single Observation per Response Unit

    Science.gov (United States)

    Fong, Duncan K. H.; Ebbes, Peter; DeSarbo, Wayne S.

    2012-01-01

    Multiple regression is frequently used across the various social sciences to analyze cross-sectional data. However, it can often be challenging to justify the assumption of common regression coefficients across all respondents. This manuscript presents a heterogeneous Bayesian regression model that enables the estimation of…

  20. Quasi-neutral limit of the drift-diffusion model for semiconductors with general sign-changing doping profile

    Institute of Scientific and Technical Information of China (English)

    HSIAO; Ling

    2008-01-01

    The quasi-neutral limit of the time-dependent drift-diffusion model with general sign-changing doping profile is justified rigorously in sup-norm (i.e., uniformly in space). This improves the spatial square norm limit by Wang, Xin and Markowich.

  1. CRITICAL ANALYSIS OF EVALUATION MODEL LOMCE

    Directory of Open Access Journals (Sweden)

    José Luis Bernal Agudo

    2015-06-01

    The evaluation model that the LOMCE projects is rooted in neoliberal beliefs, reflecting a specific way of understanding the world. What matters is not the process but the results, with evaluation at the center of the teaching-learning processes. The model is poorly planned, since the theory that justifies it is not developed into coherent proposals; there is an excessive concern for excellence, and diversity is left out. A comprehensive way of understanding education should be recovered.

  2. Quantisation of the string fragmentation model

    Energy Technology Data Exchange (ETDEWEB)

    Artru, X.; Bowler, M.G.

    1988-01-01

    We quantise the classical Artru-Mennessier string model with a Feynman sum-over-histories method. This procedure yields both propagation amplitudes and a Veneziano mass spectrum for mesons, with resonance poles for unstable states. Applied to the process e⁺e⁻ → hadrons, our amplitudes justify previous applications of string models, and in particular the relative amplitudes for different string configurations are in agreement with a recent conjecture of Andersson and Hofmann which can account for the observed Bose-Einstein correlations in e⁺e⁻ annihilation.

  3. The applicability of the wind compression model

    CERN Document Server

    Cariková, Zuzana

    2014-01-01

    Compression of the stellar winds from rapidly rotating hot stars is described by the wind compression model. However, it was also shown that rapid rotation leads to rotational distortion of the stellar surface, resulting in the appearance of non-radial forces acting against the wind compression. In this note we justify the wind compression model for moderately rotating white dwarfs and slowly rotating giants. The former could be conducive to understanding the density/ionization structure of the mass outflow from symbiotic stars and novae, while the latter can represent an effective mass-transfer mode in wide interacting binaries.

  4. Stochastic Optimal Control Models for Online Stores

    CERN Document Server

    Bradonjić, Milan

    2011-01-01

    We present a model for the optimal design of an online auction/store by a seller. The framework we use is a stochastic optimal control problem. In our setting, the seller wishes to maximize her average wealth level, where she can control her price per unit via her reputation level. The corresponding Hamilton-Jacobi-Bellman equation is analyzed for an introductory case. We then turn to an empirically justified model and present an introductory analysis. In both cases, "pulsing" advertising strategies are recovered for resource allocation. Further numerical and functional analysis will appear shortly.
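
    The paper's control problem is only summarized in the abstract. As a generic illustration of the dynamic-programming machinery behind a Hamilton-Jacobi-Bellman equation, the sketch below solves a toy discrete-time analogue by backward Bellman recursion, with reputation as the state and price as the control. All dynamics, horizons and numbers are invented for illustration and are not the authors' model.

```python
import numpy as np

# Toy dynamic program: state = reputation level (0..R-1), control = price index.
# Higher prices earn more per sale but lower the chance of a sale; a sale nudges
# reputation up, no sale nudges it down. Everything here is illustrative.
R, T = 10, 50
prices = np.linspace(0.5, 2.0, 8)

def sale_prob(rep, price):
    # Demand rises with reputation and falls with price (assumed functional form).
    return np.clip(0.2 + 0.06 * rep - 0.25 * (price - 0.5), 0.0, 1.0)

V = np.zeros(R)                         # terminal value: no future revenue
policy = np.zeros((T, R), dtype=int)
for t in reversed(range(T)):            # backward Bellman recursion
    V_new = np.empty(R)
    for s in range(R):
        q = np.empty(len(prices))
        for a, p in enumerate(prices):
            ps = sale_prob(s, p)
            up, down = min(s + 1, R - 1), max(s - 1, 0)
            # Expected value: revenue p plus continuation value of each outcome.
            q[a] = ps * (p + V[up]) + (1.0 - ps) * V[down]
        policy[t, s] = int(np.argmax(q))
        V_new[s] = q[policy[t, s]]
    V = V_new
print("value at lowest reputation:", V[0])
print("initial price at lowest reputation:", prices[policy[0, 0]])
```

    In continuous time and state, the same maximization over controls inside the expectation is what appears on the right-hand side of the HJB equation; the backward loop is its discrete counterpart.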

  5. Towards an Improved Performance Measure for Language Models

    CERN Document Server

    Ueberla, J P

    1997-01-01

    In this paper a first attempt at deriving an improved performance measure for language models, the probability ratio measure (PRM), is described. In a proof-of-concept experiment, it is shown that PRM correlates better with recognition accuracy and can lead to better recognition results when used as the optimisation criterion of a clustering algorithm. In spite of the approximations and limitations of this preliminary work, the results are very encouraging and should justify more work along the same lines.

  6. One-dimensional adhesion model for large scale structures

    Directory of Open Access Journals (Sweden)

    Kayyunnapara Thomas Joseph

    2010-05-01

    We discuss initial value problems and initial boundary value problems for some systems of partial differential equations appearing in the modelling of large-scale structure formation in the universe. We restrict the initial data to bounded measurable functions of locally bounded variation and use the Volpert product to justify the products which appear in the equations. For more general initial data in the class of generalized functions of Colombeau, we construct the solution in the sense of association.

  7. How robust are probabilistic models of higher-level cognition?

    Science.gov (United States)

    Marcus, Gary F; Davis, Ernest

    2013-12-01

    An increasingly popular theory holds that the mind should be viewed as a near-optimal or rational engine of probabilistic inference, in domains as diverse as word learning, pragmatics, naive physics, and predictions of the future. We argue that this view, often identified with Bayesian models of inference, is markedly less promising than widely believed, and is undermined by post hoc practices that merit wholesale reevaluation. We also show that the common equation of "probabilistic" with "rational" or "optimal" is not justified.

  8. When Is Peer Rejection Justifiable? Children's Understanding across Two Cultures

    Science.gov (United States)

    Park, Yoonjung; Killen, Melanie

    2010-01-01

    This study investigated how Korean (N = 397) and U.S. (N = 333) children and adolescents (10 and 13 years of age) evaluated personality (aggression, shyness) and group (gender, nationality) characteristics as a basis for peer rejection in three contexts (friendship rejection, group exclusion, victimization). Overall, peer rejection based on…

  9. Justifying the Ivory Tower: Higher Education and State Economic Growth

    Science.gov (United States)

    Baldwin, J. Norman; McCracken, William A., III

    2013-01-01

    As the U.S. continues to embrace a comprehensive plan for economic recovery, this article investigates the validity of the claim that investing in higher education will help restore state economic growth and prosperity. It presents the findings from a study that indicates that the most consistent predictors of state economic growth related to…

  10. Justifying Music Instruction in American Public Schools: A Historical Perspective.

    Science.gov (United States)

    Jorgensen, Estelle R.

    1995-01-01

    Charts the development of music education from early utilitarianism up to its current emphasis on aesthetic value. Recent attempts to pursue music education as an interdisciplinary subject have been limited due to budget cuts. Briefly discusses this financial crisis and suggests some sources of alternative funding. (MJP)

  11. Beyond Baby Doe: Does Infant Transplantation Justify Euthanasia?

    Science.gov (United States)

    Coulter, David L.

    1988-01-01

    The paper examines ethical issues in the transplantation of organs from infants with anencephaly into infants with severe heart and kidney disease. It argues that active euthanasia of infants with anencephaly should be prohibited to safeguard the rights of all persons with severe neurological disabilities. (Author/DB)

  12. Dialyzer reuse: justified cost saving for south Asian region.

    Science.gov (United States)

    Dhrolia, Murtaza F; Nasir, Kiran; Imtiaz, Salman; Ahmad, Aasim

    2014-08-01

    In spite of controversies, dialyzer reuse has remained an integral part of hemodialysis because of its lower cost, good overall safety record, and improved membrane biocompatibility. Reuse declined in developed countries from the beginning of this century because of the mass production of hemodialyzers at favourable prices with better biocompatible membranes. Abandoning dialyzer reuse became challenging in the South Asian region, where more than 40% of the population lives below the International Poverty Line of $1.25 per day, less than 10% of end-stage renal disease patients receive renal replacement therapy, and up to 70% of those starting dialysis stop treatment due to cost within the first 3 months. Dialyzer reuse is an efficient cost-saving method that allows the use of more efficient and expensive biocompatible synthetic membranes, thereby providing high-quality dialysis to individuals living in countries with limited medical resources without compromising the safety or effectiveness of the treatment.

  13. Justified Illegality?: Controlled clientelism by the Chilean administration

    Directory of Open Access Journals (Sweden)

    Marcelo Moriconi Bezerra

    2011-07-01

    The Chilean civil service is considered one of the most efficient in Latin America. However, different studies describe the informal institutions that operate between the legislative power and the bureaucracy to fill positions in the public administration. Although some of these clientelistic practices are against the law, they have been accepted and defended in both the political and scientific spheres. Legality is not considered an important value as long as certain indexes develop positively. In this context, it is important to study how corruption and clientelism have been ignored, or hidden, through political discourses and technical reports about the situation of the bureaucracy. All of this allows a better understanding of why, after 20 years of administrative reforms, damaging practices that negatively affect democracy have not been eradicated.

  15. British media attacks on homeopathy: are they justified?

    Science.gov (United States)

    Vithoulkas, George

    2008-04-01

    Homeopathy is being attacked by the British media. These attacks draw support from irresponsible and unjustified claims by certain teachers of homeopathy. Such claims include the use of 'dream' and 'imaginative' methods for provings. For prescribing, some such teachers attempt to replace the laborious process of matching symptom picture and remedy with spurious theories based on 'signatures', sensations and other methods. Other irresponsible claims have also been made. These "new ideas" risk destroying the principles, theory, and practice of homeopathy.

  16. Common extensor origin release in recalcitrant lateral epicondylitis - role justified?

    Directory of Open Access Journals (Sweden)

    Mukundan Cibu

    2010-05-01

    The aim of our study was to analyse the efficacy of operative management in recalcitrant lateral epicondylitis of the elbow. The forty patients included in this study were referred by general practitioners with a diagnosis of tennis elbow to the orthopaedic department at a district general hospital over a five-year period. All had received two or more steroid injections at the tender spot, without permanent relief of pain. All subsequently underwent simple fasciotomy of the extensor origin. Of the forty patients, thirty-five had improvement in pain and function, two had persistent symptoms and three did not perceive any improvement. Twenty-five had excellent, ten good, two fair and three poor outcomes (recurrent problem; pain at rest and at night). Two patients underwent revision surgery. The majority of the patients had improvement in pain and function following operative treatment. In this study, extensor fasciotomy was demonstrated to be an effective treatment for refractory chronic lateral epicondylitis; however, further studies are warranted.

  17. Antibiotics in dental practice: how justified are we.

    Science.gov (United States)

    Oberoi, Sukhvinder S; Dhingra, Chandan; Sharma, Gaurav; Sardana, Divesh

    2015-02-01

    Antibiotics are prescribed by dentists in dental practice, during dental treatment as well as for prevention of infection. Indications for the use of systemic antibiotics in dentistry are limited because most dental and periodontal diseases are best managed by operative intervention and oral hygiene measures. The use of antibiotics in dental practice is characterised by empirical prescription based on clinical and bacteriological epidemiological factors, resulting in the use of a very narrow range of broad-spectrum antibiotics for short periods of time. This has led to the development of antimicrobial resistance (AMR) in a wide range of microbes and to the consequent inefficacy of commonly used antibiotics. Dentists can make a difference by the judicious use of antimicrobials--prescribing the correct drug, at the standard dosage and appropriate regimen--only when systemic spread of infection is evident. The increasing resistance problems of recent years are probably related to the over- or misuse of broad-spectrum agents. There is a clear need for the development of prescribing guidelines and educational initiatives to encourage the rational and appropriate use of drugs in dentistry. This paper highlights the need for dentists to improve antibiotic prescribing practices in an attempt to curb the increasing incidence of antibiotic resistance and other side effects of antibiotic abuse. The literature provides evidence of inadequate prescribing practices by dentists for a number of factors, ranging from inadequate knowledge to social factors.

  18. Justifying plans to improve performance of an existing cooling system

    Energy Technology Data Exchange (ETDEWEB)

    Burns, J. [Stone & Webster Engineering Corp., Boston, MA (United States); Godard, D.; Randall, R. [Niagara-Mohawk Power Company, Syracuse, NY (United States); Cooper, J. [Cooper & Associates, P.A., Tampa, FL (United States)

    1996-08-01

    This paper discusses the kinds of quantitative justification needed to convince today's cost-conscious, informed utility management that proposed improvements to the cooling system are feasible and will be of strong economic benefit to the station. It summarizes the evaluations developed during the review of circulating water system improvement candidates that accompanied the recent 4.5% power uprate of an existing large station with a closed-cycle cooling system which utilizes a natural draft cooling tower. Presented in the paper are the capital costs and turbine performance improvements related to: air blanketing reduction by baffle plate additions to the condenser air coolers; minimizing costs of waterbox/bundle cleaning programs; cooling system performance monitoring enhancements; the prudence of tube staking after uprate; the benefits of a circulating water flow increase; better cooling tower hot water distribution; adding a layer of fill to the cooling tower; and finally the value of a helper tower. Also considered in this paper are the performance test surveys of both the condenser and cooling tower that identified the causes of performance deficiencies. The general principles discussed are applicable to all sizes and types of power plant cooling systems. The paper, however, focuses on the 1994-1995 case study of a 675,000 GPM closed cooling system with a 537 ft. counterflow natural draft cooling tower and a 670,000 sq. ft. six-bundle single-pass condenser which serves the six-flow low pressure (LP) turbine of an 1100 MW nuclear plant. One example of the outcome of the program was an approximate 20% increase in condenser cleanliness, from 55% to 75%. 9 refs., 7 figs.

  19. Why Status Effects Need not Justify Egalitarian Income Policy

    NARCIS (Netherlands)

    Graafland, J.J.

    2010-01-01

    Economic research overwhelmingly shows that the utility individuals derive from their income depends on the incomes of others. Theoretical literature has proven that these status effects imply a more egalitarian income policy than in the conventional case, in which people value their income independently.

  20. Does Biology Justify Ideology? The Politics of Genetic Attribution

    Science.gov (United States)

    Suhay, Elizabeth; Jayaratne, Toby Epstein

    2013-01-01

    Conventional wisdom suggests that political conservatives are more likely than liberals to endorse genetic explanations for many human characteristics and behaviors. Whether and to what extent this is true has received surprisingly limited systematic attention. We examine evidence from a large U.S. public opinion survey that measured the extent to which respondents believed genetic explanations account for a variety of differences among individuals as well as groups in society. We find that conservatives were indeed more likely than liberals to endorse genetic explanations for perceived race and class differences in characteristics often associated with socioeconomic inequality (intelligence, math skills, drive, and violence). Different ideological divisions emerged, however, with respect to respondents’ explanations for sexual orientation. Here, liberals were more likely than conservatives to say that sexual orientation is due to genes and less likely to say that it is due to choice or the environment. These patterns suggest that conservative and liberal ideologues will tend to endorse genetic explanations where their policy positions are bolstered by “naturalizing” human differences. That said, debates over genetic influence may be more politicized with respect to race, class, and sexual orientation than population differences generally: We find that left/right political ideology was not significantly associated with genetic (or other) attributions for individual differences in intelligence, math skills, drive, or violence. We conclude that conceptions of the proper role of government are closely intertwined with assumptions about the causes of human difference, but that this relationship is a complex one. PMID:26379311

  1. Renal transplantation between HIV-positive donors and recipients justified.

    Science.gov (United States)

    Muller, Elmi; Barday, Zunaid; Mendelson, Marc; Kahn, Delawir

    2012-03-02

    HIV infection was previously an absolute contraindication to renal transplantation. However, with the advent of highly active antiretroviral therapy (HAART), renal transplantation using HIV-negative donor kidneys has successfully been employed for HIV-infected patients with end-stage renal failure. In resource-limited countries, places on dialysis programmes are severely restricted; HIV-infected patients, like many others with co-morbidity, are often denied treatment. Kidneys (and other organs) from HIV-infected deceased donors are discarded. The transplantation of HIV-positive donor kidneys to HIV-infected recipients is now a viable alternative to chronic dialysis or transplantation of HIV-negative donor kidneys. This significantly increases the pool of donor kidneys to the advantage of HIV-positive and -negative patients. Arguments are presented that led to our initiation of renal transplantation from HIV-positive deceased donors to HIV-positive recipients at Groote Schuur Hospital, Cape Town.

  2. Is selenium supplementation in autoimmune thyroid diseases justified?

    DEFF Research Database (Denmark)

    Winther, Kristian H.; Bonnema, Steen; Hegedüs, Laszlo

    2017-01-01

    diseases under conditions of low dietary selenium intake. Two systematic reviews have evaluated controlled trials among patients with autoimmune thyroiditis and report that selenium supplementation decreases circulating thyroid autoantibodies. The immunomodulatory effects of selenium might involve reducing...

  3. Navy Officials Justified the MQ-4C Triton Procurement Quantity

    Science.gov (United States)

    2015-09-16

    This is the first in a series of audits on the Navy MQ-4C Triton (Triton) Unmanned Aircraft System (UAS) Program. Our overall objective for the series of audits was to determine whether the Navy effectively managed the Triton UAS. Naval Air Systems Command internal controls over Triton quantity requirements were effective as they applied to the audit objectives.

  4. Social dominance and ethical ideology: the end justifies the means?

    Science.gov (United States)

    Wilson, Marc Stewart

    2003-10-01

    Although many social psychological researchers have tried to identify the antecedents of unethical or immoral behavior, investigators have given little consideration to the content of the ethical beliefs that associate with important personality variables such as authoritarianism (B. Altemeyer, 1981, 1996) and social dominance orientation (SDO; J. Sidanius, 1993). Previous studies suggest that authoritarianism is associated with the rejection of relativistic standards for moral actions and--to a lesser extent--the idealistic belief that moral actions should not harm others (J. W. McHoskey, 1996). In the present study, 160 New Zealand University students completed measures of SDO (J. Sidanius), Right Wing Authoritarianism (RWA; B. Altemeyer, 1981), and two subscales of ethical ideology: Relativism and Idealism (D. R. Forsyth, 1980). As expected, SDO showed a negative relationship with Idealism, the belief that actions should not harm others. But, contrary to expectations, SDO showed no consistent association with Relativism, the belief that the moralities of actions are not comparable. On the basis of those findings, people with high SDO might be described as "ruthless" in their pursuit of desirable goals and indifferent about whether the morality of different actions can be compared or even matters.

  5. How three Narratives of Modernity justify Economic Inequality

    DEFF Research Database (Denmark)

    Larsen, Christian Albrekt

    2016-01-01

    The acceptance of income differences varies across countries. This article suggests belief in three narratives of modernity to account for this: the “tunnel effect”, related to perceptions of generational mobility; the “procedural justice effect”, related to the perceived fairness in the process...

  6. Justifying the design and selection of literacy and thinking tools

    Directory of Open Access Journals (Sweden)

    David Whitehead

    2008-10-01

    Criteria for the design and selection of literacy and thinking tools that allow educators to justify what they do are described within a wider framework of learning theory and research into best practice. Based on a meta-analysis of best practice, results from a three-year project designed to evaluate the effectiveness of a secondary school literacy initiative in New Zealand, together with recent research from cognitive and neuro-psychologists, it is argued that the design and selection of literacy and thinking tools used in elementary schools should be consistent with (i) teaching focused, (ii) learner focused, (iii) thought linked, (iv) neurologically consistent, (v) subject specific, (vi) text linked, (vii) developmentally appropriate, and (viii) assessment linked criteria.

  7. [The dark face of cosmetics. A justified or excessive diatribe?].

    Science.gov (United States)

    Piérard, G E; Piérard-Franchimont, C; Lesuisse, M; Hermanns, J-F; Hermanns-Lê, T

    2015-10-01

    In recent years, the population and producers of consumer products became aware of the deleterious effects of some substances on human health and the environment. Cosmetic products are part of such concern. What are the risks currently involved? Do the so-called "natural", "bio" or "green" products represent an ideal panacea? This is a complex issue because the documents available to the general public are of unequal quality, and objective scientific publications remain rare and prone to controversy.

  8. Justifying Torture: Explaining Democratic States' Noncompliance with International Humanitarian Law

    Science.gov (United States)

    Kanstroom, Emily

    2007-01-01

    On June 28, 1951, France ratified the 1949 Geneva Conventions, which prohibited the torture of prisoners of war. On August 2, 1955, the United States of America ratified the same document. Between 1954 and 1962, France fought a war against Algeria, which sought its independence from colonial rule. From September 11, 2001 until the present, the…

  9. Polyurethane foam-covered breast implants: a justified choice?

    Science.gov (United States)

    Scarpa, C; Borso, G F; Vindigni, V; Bassetto, F

    2015-01-01

    The safety of polyurethane prostheses has been the subject of many studies and of professional and public controversies. Nowadays, polyurethane-covered implants are very popular in plastic surgery for the treatment of capsular contracture. We identified 41 papers (one of which is a communication of the FDA) by using search browsers such as PubMed, Medline, and eMedicine. Eleven manuscripts were used for the introduction, and the remaining thirty were subdivided into three tables whose results are summarized in three main chapters: (1) capsule formation and contracture, (2) complications, (3) biodegradation and cancer risk. (1) The polyurethanic capsule is a well-defined foreign body reaction characterized by synovial metaplasia, a thin layer of disarranged collagen fibers and high vascularization. These features make possible a "young" capsule and a low occurrence of capsular contracture even over a long period (10 years); (2) polyurethane implants may be difficult to remove, but there is no evidence that they cause an increase in other complications; (3) there is no evidence of polyurethane-related cancer in long-term studies (after 5 years). Polyurethane foam-covered breast implants remain a valid choice for the treatment of capsular contracture, even if it would be very useful to verify the ease of removal of the prosthesis and to continue investigations on biodegradation products.

  10. [Implantable subcutaneous venous access. 1. Reasons that justify its use].

    Science.gov (United States)

    Postigo Mota, S; Durán Gómez, N; Lavado García, J M; Rey Sánchez, P; Canal Macías, M L; Pedrera Zamorano, J D

    2002-02-01

    Chemotherapy treatment or the administration of parenteral feeding requires permanent venous access for weeks, months or perhaps years. Having adequate venous access available while treating gravely ill patients is fundamental in order to guarantee the perfusion of fluids, blood transfusions, the administration of medicines, intravenous feeding, the drawing of blood samples, etc. In this article, which will have a follow-up concentrating on proper handling, the authors expose the reasons why subcutaneous venous access devices are implanted and used, as well as how to deal with one of their main complications: extravasation. A bibliography will accompany the follow-up article.

  12. familygoals: Family Influencers, Calibrated Amateurism, and Justifying Young Digital Labor

    National Research Council Canada - National Science Library

    Abidin, Crystal

    2017-01-01

    Following in the celebrity trajectory of mommy bloggers, global micro-microcelebrities, and reality TV families, family Influencers on social media are one genre of microcelebrity for whom the “anchor...

  13. Justifying scale type for a latent variable: Formative or reflective?

    Science.gov (United States)

    Liu, Hao; Bahron, Arsiah; Bagul, Awangku Hassanal Bahar Pengiran

    2015-12-01

    The study attempted to explore the possibility of creating a procedure at the experimental level to double-check whether a manifest variable's scale type is formative or reflective. At present, the criteria for making such a decision depend heavily on researchers' judgment at the conceptual and operational level. The study created an experimental procedure that appears to confirm the decisions from conceptual- and operational-level judgments. The experimental procedure includes the following tests: Variance Inflation Factor (VIF), Tolerance (TOL), ridge regression, Cronbach's alpha, Dillon-Goldstein's rho, and the first and second eigenvalues. The procedure considers both the multicollinearity and the consistency of the manifest variables. As a result, the procedure reached the same judgment as the carefully established decision making at the conceptual and operational level.
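
    Several of the listed diagnostics are straightforward to compute. The sketch below is a hypothetical illustration rather than the authors' procedure: it computes VIF and tolerance via auxiliary regressions, Cronbach's alpha, and the first and second eigenvalues of the item correlation matrix, on simulated reflective-style indicators (all simulation settings are assumptions).

```python
import numpy as np

def vif_and_tolerance(X):
    """VIF_j = 1/(1 - R_j^2), from regressing column j on the other columns."""
    n, k = X.shape
    vifs = np.empty(k)
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - resid.var() / y.var()
        vifs[j] = 1.0 / (1.0 - r2)
    return vifs, 1.0 / vifs                  # tolerance = 1/VIF

def cronbach_alpha(X):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = X.shape[1]
    return k / (k - 1) * (1.0 - X.var(axis=0, ddof=1).sum()
                          / X.sum(axis=1).var(ddof=1))

rng = np.random.default_rng(1)
n = 300
latent = rng.normal(size=n)
# Reflective-style items: each indicator is the latent factor plus noise,
# so items are mutually correlated and internally consistent.
X = latent[:, None] + 0.6 * rng.normal(size=(n, 4))

vifs, tol = vif_and_tolerance(X)
alpha = cronbach_alpha(X)
eigvals = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
print("VIFs:", vifs)
print("Cronbach's alpha:", alpha)
print("first two eigenvalues:", eigvals[:2])
```

    For reflective indicators like these, one expects high internal consistency (large alpha) and a dominant first eigenvalue; formative indicators need not show either pattern, which is what makes such diagnostics usable as an experimental cross-check.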

  15. Oxidative stress and antioxidants for idiopathic oligoasthenoteratospermia: Is it justified?

    Directory of Open Access Journals (Sweden)

    Ashok Agarwal

    2011-01-01

    Full Text Available Oxidative stress contributes to defective spermatogenesis and the poor quality of sperm associated with idiopathic male factor infertility. The aim of this study was to review the current literature on the effects of various types of antioxidant supplements in patients to improve fertilization and pregnancy rates in subfertile males with idiopathic oligoasthenoteratozoospermia (iOAT. Review of recent publications through PubMed and the Cochrane database. Oxidative stress is implicated in impaired spermatogenesis leading to the poor semen parameters and increased DNA damage and apoptosis in iOAT. Strategies to modulate the level of oxidative stress within the male reproductive tract include the use of oral antioxidant compounds to reinforce the body′s defence against oxidative damage. In our evaluation, carnitines were considered the most established pharmacotherapeutic agent to treat iOAT, as evidence and data concerning carnitine supplementation have been shown to be most consistent and relevant to the population of interest. Other therapies, such as combined vitamin E and C therapy, are still considered controversial as vitamin C can act as a pro-oxidant in certain instances and the results of randomized controlled trials have failed to show significant benefit to sperm parameters and pregnancy rates. There is a need for further investigation with randomized controlled studies to confirm the efficacy and safety of antioxidant supplementation in the medical treatment of idiopathic male infertility as well as the need to determine the dosage required to improve semen parameters, fertilization rates and pregnancy outcomes in iOAT.

  16. Corporate governance and banks : How justified is the match?

    NARCIS (Netherlands)

    van der Elst, C.F.

    2015-01-01

    Banks and bank governance are different. We critically assess the arguments used to justify these divergences in operational activities. We also question if and how, in light of the specificity of banking activities, bank governance translates the operational peculiarities into different governance…

  17. The management of business risk in justifying economic decisions

    National Research Council Canada - National Science Library

    G.M. Tarasyuk; D.I. Polishchuk

    2015-01-01

    .... Organizational and economic methods for reducing risks to ensure minimal damage to its economic activity, the recommended basic principles which must be followed when choosing a management strategy...

  18. Contemporary Methods of Social Introduction: Is the Stigmatisation justified?

    Directory of Open Access Journals (Sweden)

    Lisa M. Steffek

    2009-12-01

    Historically, individuals in search of a romantic partner have expanded their pool of alternatives by meeting others through their personal social networks. In the last few decades, however, a growing singles population, coupled with advances in technology, has promoted the utilisation and modernization of contemporary marriage market intermediaries (MMIs, including online dating sites, social networking sites, and professional matchmaking services. Importantly, these contemporary MMIs depart from more normative methods for meeting others, making their use ripe for social stigmatization, as evidenced by myriad portrayals in the popular media. The purpose of the present research was to provide an empirical exploration of the validity of the layperson stigma towards users of contemporary MMIs by assessing the extent to which users and nonusers of these various services differ on key individual characteristics relevant to relationship initiation and progression. Specifically, we surveyed 96 individuals, all of whom were attending a singles' happy hour, and compared users and nonusers of contemporary MMIs on several important characteristics. Although users reported going on more dates and perceived greater attractiveness in others at the event, no differences were observed in personality (i.e., the Big 5 or adult attachment classification (i.e., secure vs. insecure. Altogether, our findings suggest that users of contemporary MMIs are not socially undesirable people (or at least any more undesirable than nonusers.

  19. Quasi-hydrostatic Primitive Equations for Ocean Global Circulation Models

    Institute of Scientific and Technical Information of China (English)

    Carine LUCAS; Madalina PETCU; Antoine ROUSSEAU

    2010-01-01

    Global existence of weak and strong solutions to the quasi-hydrostatic primitive equations is studied in this paper. This model, which derives from the full non-hydrostatic model for geophysical fluid dynamics in the zero limit of the aspect ratio, is more realistic than the classical hydrostatic model, since the traditional approximation, which consists in neglecting a part of the Coriolis force, is relaxed. After justifying the derivation of the model, the authors provide a rigorous proof of global existence of weak solutions, and well-posedness for strong solutions in dimension three.

  20. Compensated kidney donation: an ethical review of the Iranian model.

    Science.gov (United States)

    Bagheri, Alireza

    2006-09-01

    Iran has had a program of compensated kidney donation from living unrelated (LUR) donors since 1997. The aim of the program was to address the increasing demand for kidney transplantation in a morally sound manner, and it has been successful in terms of increasing the number of kidneys available for transplantation. This paper presents a critical review of the program and its clinical status. Denying organ donors legitimate compensation because of the understandable fear of an organ trade is not morally justifiable, and the Iranian model of compensated LUR kidney donation offers substantial benefits that overcome these concerns. Despite its benefits, however, the program lacks secure measures to prevent a direct monetary relationship between donors and recipients, and it must be revised in order to be morally justifiable.

  1. What Is Essential in Developmental Evaluation? On Integrity, Fidelity, Adultery, Abstinence, Impotence, Long-Term Commitment, Integrity, and Sensitivity in Implementing Evaluation Models

    Science.gov (United States)

    Patton, Michael Quinn

    2016-01-01

    Fidelity concerns the extent to which a specific evaluation sufficiently incorporates the core characteristics of the overall approach to justify labeling that evaluation by its designated name. Fidelity has traditionally meant implementing a model in exactly the same way each time following the prescribed steps and procedures. The essential…

  2. The Justifiability of Punitive Damage Compensation in Tourism Law: Alternative Solutions to Mental Damage Compensation

    Institute of Scientific and Technical Information of China (English)

    杨振宏

    2014-01-01

    Punitive damage compensation is a major breakthrough in compensation for breach of tour contracts. Given the particular character of mental damage in tour contracts, punitive damage compensation has made remedies for breach of contract more comprehensive. Based on the features of tour activities, further study of the justifiability of, and alternative measures for, punitive damage compensation and mental damage compensation for breach of tour contract is of important theoretical and practical significance.

  3. Statistical mechanics of two-dimensional foams: Physical foundations of the model.

    Science.gov (United States)

    Durand, Marc

    2015-12-01

    In a recent series of papers, a statistical model that accounts for correlations between topological and geometrical properties of a two-dimensional shuffled foam has been proposed and compared with experimental and numerical data. Here, the various assumptions on which the model is based are exposed and justified: the equiprobability hypothesis of the foam configurations is argued. The range of correlations between bubbles is discussed, and the mean-field approximation that is used in the model is detailed. The two self-consistency equations associated with this mean-field description can be interpreted as the conservation laws of number of sides and bubble curvature, respectively. Finally, the use of a "Grand-Canonical" description, in which the foam constitutes a reservoir of sides and curvature, is justified.

  4. The field-space metric in spiral inflation and related models

    Science.gov (United States)

    Erlich, Joshua; Olsen, Jackson; Wang, Zhen

    2016-09-01

    Multi-field inflation models include a variety of scenarios for how inflation proceeds and ends. Models with the same potential but different kinetic terms are common in the literature. We compare spiral inflation and Dante's inferno-type models, which differ only in their field-space metric. We justify a single-field effective description in these models and relate the single-field description to a mass-matrix formalism. We note the effects of the nontrivial field-space metric on inflationary observables, and consequently on the viability of these models. We also note a duality between spiral inflation and Dante's inferno models with different potentials.

  5. Mesoscopic and continuum modelling of angiogenesis

    KAUST Repository

    Spill, F.

    2014-03-11

    Angiogenesis is the formation of new blood vessels from pre-existing ones in response to chemical signals secreted by, for example, a wound or a tumour. In this paper, we propose a mesoscopic lattice-based model of angiogenesis, in which processes that include proliferation and cell movement are considered as stochastic events. By studying the dependence of the model on the lattice spacing and the number of cells involved, we are able to derive the deterministic continuum limit of our equations and compare it to similar existing models of angiogenesis. We further identify conditions under which the use of continuum models is justified, and others for which stochastic or discrete effects dominate. We also compare different stochastic models for the movement of endothelial tip cells which have the same macroscopic, deterministic behaviour, but lead to markedly different behaviour in terms of production of new vessel cells. © 2014 Springer-Verlag Berlin Heidelberg.
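    As a toy version of the lattice-to-continuum comparison described above (not the paper's model), the sketch below moves tip-cell-like walkers on a one-dimensional lattice and checks their mean-square displacement against the unbiased random-walk prediction <x^2> = N*h^2, which is what a continuum diffusion description would also give. Lattice spacing, step counts and walker numbers are arbitrary illustrative choices.

```python
import random

def walk(steps, h, rng):
    """One unbiased walker hopping left/right on a lattice of spacing h."""
    x = 0.0
    for _ in range(steps):
        x += h if rng.random() < 0.5 else -h
    return x

rng = random.Random(1)
h, steps, walkers = 0.1, 400, 4000
msd = sum(walk(steps, h, rng) ** 2 for _ in range(walkers)) / walkers
expected = steps * h ** 2  # <x^2> = N * h^2 for an unbiased walk
assert abs(msd / expected - 1.0) < 0.15
```

    When such fluctuations average out (many walkers, fine lattice), a deterministic continuum model is justified; when cell numbers are small, the discrete noise dominates, as the abstract notes.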

  6. A Model of the Perception of Facial Expressions of Emotion by Humans: Research Overview and Perspectives.

    Science.gov (United States)

    Martinez, Aleix; Du, Shichuan

    2012-05-01

    In cognitive science and neuroscience, there have been two leading models describing how humans perceive and classify facial expressions of emotion: the continuous and the categorical model. The continuous model defines each facial expression of emotion as a feature vector in a face space. This model explains, for example, how expressions of emotion can be seen at different intensities. In contrast, the categorical model consists of C classifiers, each tuned to a specific emotion category. This model explains, among other findings, why the images in a morphing sequence between a happy and a surprise face are perceived as either happy or surprise but not something in between. While the continuous model has a more difficult time justifying this latter finding, the categorical model is not as good when it comes to explaining how expressions are recognized at different intensities or modes. Most importantly, both models have problems explaining how one can recognize combinations of emotion categories such as happily surprised versus angrily surprised versus surprise. To resolve these issues, in the past several years, we have worked on a revised model that justifies the results reported in the cognitive science and neuroscience literature. This model consists of C distinct continuous spaces. Multiple (compound) emotion categories can be recognized by linearly combining these C face spaces. The dimensions of these spaces are shown to be mostly configural. According to this model, the major task for the classification of facial expressions of emotion is precise, detailed detection of facial landmarks rather than recognition. We provide an overview of the literature justifying the model, show how the resulting model can be employed to build algorithms for the recognition of facial expression of emotion, and propose research directions for machine learning and computer vision researchers to keep pushing the state of the art in these areas. We also discuss how the model can…

  7. Expression of cause, evidence, justify and motivation rhetorical relations by causal hypotactic clauses in Brazilian Portuguese

    Directory of Open Access Journals (Sweden)

    Juliano Desiderato Antonio

    2012-07-01

    This paper aims at investigating the expression of the cause, evidence, justify and motivation rhetorical relations by means of causal hypotactic clauses in formal oral discourse (university lectures and interviews with academic researchers) in Brazilian Portuguese. The investigation is based on Rhetorical Structure Theory (RST), a theory of text organization which describes the implicit relations that arise from the combination of parts of texts. The identification of these relations was based on a parameter from Functional Discourse Grammar (FDG): the layers of the representational and interpersonal levels. From the interpersonal level, the layers move and discourse act were used; from the representational level, the layers propositional content and state of affairs were employed. Non-volitional cause relations are established by clauses conveying states of affairs, while the volitional cause relation is established by clauses conveying propositional contents. The justify and evidence relations are established by clauses conveying discourse acts (in the evidence relation, an instance of what was stated in the nucleus portion is provided in the satellite portion). Finally, the motivation relation is established by clauses conveying a motivation subsidiary discourse act.

  8. Evolution Methods of Formation of Neuronet Models of Complex Economic Systems

    Directory of Open Access Journals (Sweden)

    Khemelyov Oleksandr H.

    2014-01-01

    The article analyses the principles of forming neuronet (neural network) models of complex economic systems and justifies the promise of artificial intelligence methods for modelling such systems. It shows that evolutionary methods can be used when forming neuronet models of complex economic systems to ensure the invariance of their generalising properties, and offers an algorithm with a genome composed of operons of fixed length, in which all operons are considered from the point of view of their functional positions. A specific feature of the algorithm is that it allows anthropogenic factors to be excluded when selecting the neuronet model architecture. It proves the adequacy of the formed neuronet models of complex economic systems.

  9. Fertility and Female Employment: Problems of Causal Direction.

    Science.gov (United States)

    Cramer, James C.

    1980-01-01

    Considers multicollinearity in nonrecursive models, misspecification of models, discrepancies between attitudes and behavior, and differences between static and dynamic models as explanations for contradictory information on the causal relationship between fertility and female employment. Finds that initially fertility affects employment but that,…

  10. Jump Markov models and transition state theory: the quasi-stationary distribution approach.

    Science.gov (United States)

    Di Gesù, Giacomo; Lelièvre, Tony; Le Peutrec, Dorian; Nectoux, Boris

    2016-12-22

    We are interested in the connection between a metastable continuous state space Markov process (satisfying e.g. the Langevin or overdamped Langevin equation) and a jump Markov process in a discrete state space. More precisely, we use the notion of quasi-stationary distribution within a metastable state for the continuous state space Markov process to parametrize the exit event from the state. This approach is useful to analyze and justify methods which use the jump Markov process underlying a metastable dynamics as a support to efficiently sample the state-to-state dynamics (accelerated dynamics techniques). Moreover, it is possible by this approach to quantify the error on the exit event when the parametrization of the jump Markov model is based on the Eyring-Kramers formula. This therefore provides a mathematical framework to justify the use of transition state theory and the Eyring-Kramers formula to build kinetic Monte Carlo or Markov state models.
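    To make the idea concrete, here is a small, hypothetical kinetic Monte Carlo step driven by Eyring-Kramers-type rates k = nu * exp(-dE/kT); the prefactor, barriers and temperature below are invented for illustration and are not taken from the paper.

```python
import math
import random

def ek_rate(nu, barrier, kT):
    """Eyring-Kramers-type rate k = nu * exp(-dE / kT)."""
    return nu * math.exp(-barrier / kT)

def kmc_step(rates, rng):
    """One kinetic Monte Carlo step: pick the exit channel with probability
    proportional to its rate; the waiting time is exponential in the total rate."""
    total = sum(rates.values())
    u = rng.random() * total
    acc = 0.0
    for state, k in rates.items():
        acc += k
        if u <= acc:
            chosen = state
            break
    dt = -math.log(1.0 - rng.random()) / total  # 1 - u in (0, 1], log is safe
    return chosen, dt

rng = random.Random(0)
rates = {"A": ek_rate(1e3, 0.4, 0.1), "B": ek_rate(1e3, 0.5, 0.1)}
counts = {"A": 0, "B": 0}
for _ in range(20000):
    state, _ = kmc_step(rates, rng)
    counts[state] += 1
p_a = rates["A"] / (rates["A"] + rates["B"])  # expected branching ratio
assert abs(counts["A"] / 20000 - p_a) < 0.03
```

    The error analysis in the paper concerns precisely how faithful such Eyring-Kramers-parametrized exit events are to the underlying continuous dynamics.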

  11. "Discrete" vacuum geometry as a tool for Dirac fundamental quantization of Minkowskian Higgs model

    CERN Document Server

    Lantsman, Leonid

    2007-01-01

    We demonstrate that assuming a "discrete" vacuum geometry in the Minkowskian Higgs model with vacuum BPS monopole solutions can justify the Dirac fundamental quantization of that model. An important constituent of this quantization is obtaining various rotary effects, including collective solid rotations inside the physical BPS monopole vacuum, and assuming the "discrete" vacuum geometry appears to be what justifies these rotary effects. More precisely, assuming the "discrete" geometry for the appropriate vacuum manifold implies the presence of thread topological defects (side by side with point hedgehog topological defects and walls between different topological domains) inside this manifold in the shape of specific (rectilinear) threads: gauge and Higgs fields located in the spatial region intimately near the axis $z$ of the chosen (rest) reference frame. This serves as the source of the collective solid rotations proceeding inside the BPS monopole vacuum under the Dirac fundamental quantizat...

  12. Jump Markov models and transition state theory: the Quasi-Stationary Distribution approach

    CERN Document Server

    Di Gesù, Giacomo; Le Peutrec, Dorian; Nectoux, Boris

    2016-01-01

    We are interested in the connection between a metastable continuous state space Markov process (satisfying e.g. the Langevin or overdamped Langevin equation) and a jump Markov process in a discrete state space. More precisely, we use the notion of quasi-stationary distribution within a metastable state for the continuous state space Markov process to parametrize the exit event from the state. This approach is useful to analyze and justify methods which use the jump Markov process underlying a metastable dynamics as a support to efficiently sample the state-to-state dynamics (accelerated dynamics techniques). Moreover, it is possible by this approach to quantify the error on the exit event when the parametrization of the jump Markov model is based on the Eyring-Kramers formula. This therefore provides a mathematical framework to justify the use of transition state theory and the Eyring-Kramers formula to build kinetic Monte Carlo or Markov state models.

  13. Modelling heart rate kinetics.

    Science.gov (United States)

    Zakynthinaki, Maria S

    2015-01-01

    The objective of the present study was to formulate a simple and at the same time effective mathematical model of heart rate kinetics in response to movement (exercise). Based on an existing model, a system of two coupled differential equations which give the rate of change of heart rate and the rate of change of exercise intensity is used. The modifications introduced to the existing model are justified and discussed in detail, while models of blood lactate accumulation in respect to time and exercise intensity are also presented. The main modification is that the proposed model has now only one parameter which reflects the overall cardiovascular condition of the individual. The time elapsed after the beginning of the exercise, the intensity of the exercise, as well as blood lactate are also taken into account. Application of the model provides information regarding the individual's cardiovascular condition and is able to detect possible changes in it, across the data recording periods. To demonstrate examples of successful numerical fit of the model, constant intensity experimental heart rate data sets of two individuals have been selected and numerical optimization was implemented. In addition, numerical simulations provided predictions for various exercise intensities and various cardiovascular condition levels. The proposed model can serve as a powerful tool for a complete means of heart rate analysis, not only in exercise physiology (for efficiently designing training sessions for healthy subjects) but also in the areas of cardiovascular health and rehabilitation (including application in population groups for which direct heart rate recordings at intense exercises are not possible or not allowed, such as elderly or pregnant women).
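    The abstract does not reproduce the model's equations, so the following is only a generic sketch of the idea, not the paper's system: heart rate relaxes toward a demand level set by exercise intensity, with a single parameter standing in for the individual's cardiovascular condition. All functional forms and constants here are assumptions.

```python
def simulate_hr(condition=1.0, intensity=0.6, hr_rest=60.0, hr_max=190.0,
                minutes=10.0, dt=0.01):
    """First-order relaxation of heart rate toward an intensity-set demand."""
    hr = hr_rest
    demand = hr_rest + intensity * (hr_max - hr_rest)  # steady-state target
    for _ in range(int(minutes / dt)):
        hr += condition * (demand - hr) * dt  # one Euler step
    return hr

hr = simulate_hr()
# after 10 minutes at constant intensity, heart rate is essentially at demand
assert abs(hr - 138.0) < 1.0
```

    A larger `condition` value makes the response faster, mirroring how the model's single parameter is meant to capture overall cardiovascular fitness.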

  14. Modelling heart rate kinetics.

    Directory of Open Access Journals (Sweden)

    Maria S Zakynthinaki

    The objective of the present study was to formulate a simple and at the same time effective mathematical model of heart rate kinetics in response to movement (exercise). Based on an existing model, a system of two coupled differential equations which give the rate of change of heart rate and the rate of change of exercise intensity is used. The modifications introduced to the existing model are justified and discussed in detail, while models of blood lactate accumulation in respect to time and exercise intensity are also presented. The main modification is that the proposed model has now only one parameter which reflects the overall cardiovascular condition of the individual. The time elapsed after the beginning of the exercise, the intensity of the exercise, as well as blood lactate are also taken into account. Application of the model provides information regarding the individual's cardiovascular condition and is able to detect possible changes in it, across the data recording periods. To demonstrate examples of successful numerical fit of the model, constant intensity experimental heart rate data sets of two individuals have been selected and numerical optimization was implemented. In addition, numerical simulations provided predictions for various exercise intensities and various cardiovascular condition levels. The proposed model can serve as a powerful tool for a complete means of heart rate analysis, not only in exercise physiology (for efficiently designing training sessions for healthy subjects) but also in the areas of cardiovascular health and rehabilitation (including application in population groups for which direct heart rate recordings at intense exercises are not possible or not allowed, such as elderly or pregnant women).

  15. Modelling Heart Rate Kinetics

    Science.gov (United States)

    Zakynthinaki, Maria S.

    2015-01-01

    The objective of the present study was to formulate a simple and at the same time effective mathematical model of heart rate kinetics in response to movement (exercise). Based on an existing model, a system of two coupled differential equations which give the rate of change of heart rate and the rate of change of exercise intensity is used. The modifications introduced to the existing model are justified and discussed in detail, while models of blood lactate accumulation in respect to time and exercise intensity are also presented. The main modification is that the proposed model has now only one parameter which reflects the overall cardiovascular condition of the individual. The time elapsed after the beginning of the exercise, the intensity of the exercise, as well as blood lactate are also taken into account. Application of the model provides information regarding the individual’s cardiovascular condition and is able to detect possible changes in it, across the data recording periods. To demonstrate examples of successful numerical fit of the model, constant intensity experimental heart rate data sets of two individuals have been selected and numerical optimization was implemented. In addition, numerical simulations provided predictions for various exercise intensities and various cardiovascular condition levels. The proposed model can serve as a powerful tool for a complete means of heart rate analysis, not only in exercise physiology (for efficiently designing training sessions for healthy subjects) but also in the areas of cardiovascular health and rehabilitation (including application in population groups for which direct heart rate recordings at intense exercises are not possible or not allowed, such as elderly or pregnant women). PMID:25876164

  16. Proposed best practice for projects that involve modelling and simulation.

    Science.gov (United States)

    O'Kelly, Michael; Anisimov, Vladimir; Campbell, Chris; Hamilton, Sinéad

    2017-03-01

    Modelling and simulation has been used in many ways when developing new treatments. To be useful and credible, it is generally agreed that modelling and simulation should be undertaken according to some kind of best practice. A number of authors have suggested elements required for best practice in modelling and simulation. Elements that have been suggested include the pre-specification of goals, assumptions, methods, and outputs. However, a project that involves modelling and simulation could be simple or complex and could be of relatively low or high importance to the project. It has been argued that the level of detail and the strictness of pre-specification should be allowed to vary, depending on the complexity and importance of the project. This best practice document does not prescribe how to develop a statistical model. Rather, it describes the elements required for the specification of a project and requires that the practitioner justify in the specification the omission of any of the elements and, in addition, justify the level of detail provided about each element. This document is an initiative of the Special Interest Group for modelling and simulation. The Special Interest Group for modelling and simulation is a body open to members of Statisticians in the Pharmaceutical Industry and the European Federation of Statisticians in the Pharmaceutical Industry. Examples of a very detailed specification and a less detailed specification are included as appendices. Copyright © 2016 John Wiley & Sons, Ltd.

  17. Efficient estimation of moments in linear mixed models

    CERN Document Server

    Wu, Ping; Zhu, Li-Xing; 10.3150/10-BEJ330

    2012-01-01

    In the linear random effects model, when distributional assumptions such as normality of the error variables cannot be justified, moments may serve as alternatives to describe relevant distributions in neighborhoods of their means. Generally, estimators may be obtained as solutions of estimating equations. It turns out that there may be several equations, each of them leading to consistent estimators, in which case finding the efficient estimator becomes a crucial problem. In this paper, we systematically study estimation of moments of the errors and random effects in linear mixed models.
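    As a minimal illustration of moment-based estimation in a mixed model (not the estimating equations studied in the paper), the sketch below recovers the error and random-effect variances in a balanced one-way random-intercept model from ANOVA-type moments; the simulated data and parameter values are invented for the example.

```python
import random

def moment_estimates(groups):
    """ANOVA-type moment estimators for the balanced one-way random-effects
    model y_ij = mu + b_i + e_ij: returns (var_e_hat, var_b_hat)."""
    m = len(groups)     # number of groups
    n = len(groups[0])  # observations per group (balanced design)
    means = [sum(g) / n for g in groups]
    grand = sum(means) / m
    msw = sum((y - means[i]) ** 2
              for i, g in enumerate(groups) for y in g) / (m * (n - 1))
    msb = n * sum((mi - grand) ** 2 for mi in means) / (m - 1)
    return msw, (msb - msw) / n

rng = random.Random(2)
groups = []
for _ in range(2000):
    b = rng.gauss(0, 2)  # random effect, true var(b) = 4
    groups.append([5.0 + b + rng.gauss(0, 1) for _ in range(10)])  # var(e) = 1
var_e, var_b = moment_estimates(groups)
assert abs(var_e - 1.0) < 0.2
assert abs(var_b - 4.0) < 1.0
```

    In the paper's setting, several such moment equations may all be consistent, and the question is which combination is most efficient.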

  18. Modeling the Effects of Mergers in the Retail Sector

    DEFF Research Database (Denmark)

    Blomgren-Hansen, Niels

    2013-01-01

    by the merger and some measure of competition. Furthermore, the authorities must make their decision quickly, rendering deliberate data collection and econometric analyses infeasible in practice. The decision must be based on easily accessible data. In this paper, a simple model of the interaction between...... to the sector of more independent competitors. The competition authorities were justified in conditioning their approval on the removal of contract-based barriers to entry. Analytically, the main results of this work are the following: (1) In a linear model characterized by heterogeneous products and constant…

  19. Overdeepening development in a glacial landscape evolution model with quarrying

    DEFF Research Database (Denmark)

    Ugelvig, Sofie Vej; Egholm, D.L.; Iverson, Neal R.;

    In glacial landscape evolution models, subglacial erosion rates are often related to basal sliding or ice discharge by a power-law. This relation can be justified when considering bed abrasion, where rock debris transported in the basal ice drives erosion. However, the relation is not well...... to sudden jumps in erosion rate and fjord formation along margins that experienced periodic ice sheet configurations in the Quaternary. Egholm, D. L. et al. Modeling the flow of glaciers in steep terrains: The integrated second-order shallow ice approximation (iSOSIA). Journal of Geophysical Research, 116...

  20. Mathematical models of magnetite desliming for automated quality control systems

    Science.gov (United States)

    Olevska, Yu.; Mishchenko, V.; Olevskyi, V.

    2016-10-01

    The aim of the study is to provide multifactor mathematical models suitable for use in automatic control systems for the desliming process. For this purpose we described the motion of a two-phase medium with regard to the shape of the desliming machine and the technological parameters of the enrichment process. We created a method for deriving the dependence of enrichment quality on the technological and design parameters. To automate the process we constructed mathematical models to justify intensive technological modes and optimal design parameters of the desliming machine.

  1. An Algebraic Solution for the Kermack-McKendrick Model

    CERN Document Server

    Carvalho, Alexsandro M

    2016-01-01

    We present an algebraic solution for the Susceptible-Infective-Removed (SIR) model originally presented by Kermack and McKendrick in 1927. Starting from the differential equation for the removed subjects presented in the original paper, we rewrite it in a slightly different form in order to derive the solution formally, up to one remaining integration. Then, using algebraic techniques and some well-justified numerical assumptions, we obtain an analytic solution for the integral. Finally, we compare the numerical solution of the differential equations of the SIR model with the analytical solution proposed here, showing excellent agreement.
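    As a quick numerical companion to the comparison described in this abstract, the sketch below integrates the SIR equations with a plain Euler scheme; the parameter values (beta = 0.3, gamma = 0.1) are illustrative choices, not taken from the paper.

```python
def simulate_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, days=160, dt=0.1):
    """Euler integration of dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I."""
    s, i = s0, i0
    for _ in range(int(days / dt)):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s += ds * dt
        i += di * dt
    r = 1.0 - s - i  # removed fraction, by conservation of the population
    return s, i, r

s, i, r = simulate_sir()
assert abs(s + i + r - 1.0) < 1e-9  # population conserved
assert i < 0.05                     # epidemic has largely run its course
```

    With these rates the basic reproduction number is beta/gamma = 3, so most of the susceptible pool is depleted by the end of the run; an analytic solution of the kind the authors derive can be checked against exactly this sort of numerical baseline.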

  2. Weak-strong uniqueness for measure-valued solutions of some compressible fluid models

    Science.gov (United States)

    Gwiazda, Piotr; Świerczewska-Gwiazda, Agnieszka; Wiedemann, Emil

    2015-10-01

    We prove weak-strong uniqueness in the class of admissible measure-valued solutions for the isentropic Euler equations in any space dimension and for the Savage-Hutter model of granular flows in one and two space dimensions. For the latter system, we also show the complete dissipation of momentum in finite time, thus rigorously justifying an assumption that has been made in the engineering and numerical literature.

  3. Weak-strong uniqueness for measure-valued solutions of some compressible fluid models

    CERN Document Server

    Gwiazda, Piotr; Wiedemann, Emil

    2015-01-01

    We prove weak-strong uniqueness in the class of admissible measure-valued solutions for the isentropic Euler equations in any space dimension and for the Savage-Hutter model of granular flows in one and two space dimensions. For the latter system, we also show the complete dissipation of momentum in finite time, thus rigorously justifying an assumption that has been made in the engineering and numerical literature.

  4. Efficient Work Team Scheduling: Using Psychological Models of Knowledge Retention to Improve Code Writing Efficiency

    Directory of Open Access Journals (Sweden)

    Michael J. Pelosi

    2014-12-01

    Development teams and programmers must retain critical information about their work during work intervals and gaps in order to improve future performance when work resumes. Despite time lapses, project managers want to maximize coding efficiency and effectiveness. By developing a mathematically justified, practically useful, and computationally tractable quantitative and cognitive model of learning and memory retention, this study establishes calculations designed to maximize scheduling payoff and optimize developer efficiency and effectiveness.
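    The paper's retention model is not given in the abstract; as a stand-in, the sketch below uses the classic exponential forgetting curve R(t) = exp(-t/s) to decide when a work gap has become costly enough to schedule a refresher. The stability constant and threshold are illustrative assumptions, not the study's calibrated values.

```python
import math

def retention(days_since_work, stability):
    """Predicted fraction of project knowledge retained after a work gap."""
    return math.exp(-days_since_work / stability)

def refresh_needed(days_since_work, stability, threshold=0.5):
    """Schedule a refresher once predicted retention drops below threshold."""
    return retention(days_since_work, stability) < threshold

assert retention(0, 7) == 1.0
assert not refresh_needed(2, 7)  # exp(-2/7) is about 0.75
assert refresh_needed(10, 7)     # exp(-10/7) is about 0.24
```

    A scheduling payoff calculation of the kind the study describes would then weigh the cost of such refreshers against the efficiency lost to forgetting.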

  5. Complex Dynamics in a Nonlinear Cobweb Model for Real Estate Market

    Directory of Open Access Journals (Sweden)

    Junhai Ma

    2007-01-01

    We establish a nonlinear real estate model based on cobweb theory, where the demand and supply functions are quadratic. The stability conditions of the equilibrium are discussed. We demonstrate that, as some parameters are varied, the stability of the Nash equilibrium is lost through period-doubling bifurcation. The chaotic features are justified numerically via computing maximal Lyapunov exponents and sensitive dependence on initial conditions. The delayed feedback control (DFC) method is applied to control the chaos of the system.
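    The quadratic demand and supply functions are not specified in the abstract, so the sketch below estimates the maximal Lyapunov exponent for the logistic map, a standard one-dimensional stand-in for such price dynamics: a negative exponent indicates a stable equilibrium, while a positive one signals the chaotic regime the authors describe.

```python
import math

def lyapunov(r, x0=0.4, burn=1000, n=100000):
    """Average log-derivative along an orbit of the logistic map x -> r*x*(1-x)."""
    x = x0
    for _ in range(burn):  # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        x = r * x * (1 - x)
        acc += math.log(abs(r * (1 - 2 * x)) + 1e-300)  # guard against log(0)
    return acc / n

assert lyapunov(2.9) < 0.0  # stable equilibrium: negative exponent
assert lyapunov(4.0) > 0.5  # chaotic regime: exponent close to ln 2
```

    The same estimator applied to the cobweb price map would trace the loss of stability through the period-doubling cascade as the model's parameters are varied.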

  6. Validation of simulation strategies for the flow in a model propeller turbine during a runaway event

    Science.gov (United States)

    Fortin, M.; Houde, S.; Deschênes, C.

    2014-03-01

Recent research indicates that the useful life of a turbine can be affected by transient events. This study aims to define and validate strategies for simulating the flow within a propeller turbine model under runaway conditions. Using unsteady pressure measurements on two runner blades for validation, different strategies are compared and their results analysed in order to quantify their precision. This paper focuses on justifying the choice of simulation strategies and on the analysis of preliminary results.

  7. Weak decay constant of pseudoscalar meson in a QCD-inspired model

    CERN Document Server

    Salcedo, L A M; Hadj-Michef, D; Frederico, T

    2003-01-01

    We show that a linear scaling between the weak decay constants of pseudoscalar mesons and the vector meson masses is supported by the available experimental data. The decay constants scale as $f_m/f_{\pi}=M_V/M_{\rho}$, where $f_m$ is the pseudoscalar decay constant and $M_V$ the vector-meson ground-state mass. This simple form is justified within a renormalized light-front QCD-inspired model for quark-antiquark bound states.

  8. f(T) modified teleparallel gravity as an alternative for holographic and new agegraphic dark energy models

    Institute of Scientific and Technical Information of China (English)

    Kayoomars Karami; Asrin Abdolmaleki

    2013-01-01

    In the present work, we reconstruct different f(T)-gravity models corresponding to the original and entropy-corrected versions of the holographic and new agegraphic dark energy models. We also obtain the equation-of-state parameters of the corresponding f(T)-gravity models. We conclude that the original holographic and new agegraphic f(T)-gravity models behave like the phantom or quintessence model, whereas in the entropy-corrected models, the equation-of-state parameter can justify the transition from the quintessence state to the phantom regime indicated by recent observations.

  9. ANALYSIS MODEL FOR INVENTORY MANAGEMENT

    Directory of Open Access Journals (Sweden)

    CAMELIA BURJA

    2010-01-01

    Full Text Available Inventory represents an essential component of enterprise assets, and economic analysis attaches special importance to it because accurate inventory management determines the achievement of the activity object and the financial results. Efficient inventory management requires ensuring an optimum inventory level, which guarantees the normal functioning of the activity with minimum inventory expenses and immobilised funds. The paper presents an analysis model for inventory management based on rotation speed and its correlation with sales volume, illustrated in an adequate study. Highlighting the factors that influence efficient inventory management provides the information needed to justify managerial decisions, leading to a balanced financial position and increased company performance.
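
The rotation-speed indicators behind such an analysis are simple to compute. The sketch below uses standard turnover formulas with hypothetical figures, not data from the paper's study.

```python
def inventory_turnover(cost_of_sales, avg_inventory):
    """Rotation speed: how many times inventory is renewed in a period."""
    return cost_of_sales / avg_inventory

def days_in_inventory(cost_of_sales, avg_inventory, period_days=365):
    """Average number of days an item stays in stock."""
    return period_days / inventory_turnover(cost_of_sales, avg_inventory)

# Hypothetical figures: 600,000 cost of sales against 120,000 average inventory.
turns = inventory_turnover(600_000, 120_000)  # 5 rotations per year
days = days_in_inventory(600_000, 120_000)    # 73 days per rotation
```

A rising turnover (falling days in inventory) at constant sales is the kind of signal the paper uses to justify managerial decisions.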

  10. Towards an Encompassing Maturity Model for the Management of Hospital Information Systems.

    Science.gov (United States)

    de Carvalho, João Vidal; Rocha, Álvaro; Vasconcelos, José

    2015-09-01

    Maturity models are tools that favour the management of organizations, including their information systems management, and hospital organizations are no exception. In the present paper we put forth a preliminary investigation aimed at developing an encompassing maturity model for the management of hospital information systems. The development of this model is justified to the extent that current maturity models in the field of hospital information systems management are still at an early development stage, and especially because they are poorly detailed, do not provide tools to determine the maturity stage, and do not structure the characteristics of maturity stages according to different influencing factors.

  11. Modeling density-driven flow in porous media principles, numerics, software

    CERN Document Server

    Holzbecher, Ekkehard O

    1998-01-01

    Modeling of flow and transport in groundwater has become an important focus of scientific research in recent years. Most contributions to this subject deal with flow situations where density and viscosity changes in the fluid are neglected. This restriction may not always be justified. The models presented in the book demonstrate impressively that the flow pattern may be completely different when density changes are taken into account. The main applications of the models are thermal and saline convection, geothermal flow, saltwater intrusion, flow through salt formations, etc. This book not only presents the basic theory; the reader can also test his knowledge by applying the included software and can set up his own models.

  12. Evidence for the credibility of health economic models for health policy decision-making

    DEFF Research Database (Denmark)

    Søgaard, Rikke; Lindholt, Jes S.

    2012-01-01

extracted and the models were assessed for quality against guidelines for best practice by a multidisciplinary team. RESULTS: Seven models were identified and found to provide divergent guidance. Only three reports met 10 of the 15 quality criteria. CONCLUSIONS: Researchers in the field seem to have benefited from general advances in health economic modelling, and some improvements in reporting were noted. However, the low level of agreement between studies in model structures and assumptions, and the difficulty in justifying these (convergent validity), remain a threat to the credibility of health economic...

  13. The Multidimensional Testlet-Effect Rasch Model%多维题组效应Rasch模型

    Institute of Scientific and Technical Information of China (English)

    詹沛达; 王文中; 王立君; 李晓敏

    2014-01-01

    Testlet design has been widely adopted in educational and psychological assessment. A testlet is a cluster of items that share a common stimulus (e.g., a reading comprehension passage or a figure), and the possible local dependence among items within a testlet is called testlet-effect. Various models have been developed to take into account such testlet effect. Examples included the Rasch testlet model, two-parameter logistic Bayesian testlet model, and higher-order testlet model. However, these existing models all assume that an item is affected by only one single testlet effect. Therefore, they are essentially unidimensional testlet-effect models. In practice, multiple testlet effects may simultaneously affect item responses in a testlet. For example, in addition to common stimulus, items can be grouped according to their domains, knowledge units, or item format, such that multiple testlet effects are involved. In essence, an item measures multiple latent traits, in addition to the target latent trait(s) that the test was designed to measure. Existing unidimensional testlet-effect models become inapplicable when multiple testlet effects are involved. To account for multiple testlet effect, in this study we develop the so-called (within-item) multidimensional testlet-effect Rasch model. The parameters can be estimated with marginal maximum likelihood estimation methods or Bayesian methods with Markov chain Monte Carlo (MCMC) algorithms. In this study, a popular computer program for Rasch models, ConQuest, was used. A series of simulations were conducted to evaluate parameter recovery of the new model, consequences of model misspecification, and the effectiveness of model-data fit statistics. Results show that the parameters of the new model can be recovered fairly well; and ignoring the multiple testlet effects resulted in a biased estimation of item parameters, and an overestimation of test reliability. 
Additionally, it did little harm to parameter estimation to ...
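
The item response function underlying such a model can be sketched as follows. The additive-logit form and the parameter names are assumptions based on the abstract, not the authors' exact specification: the probability of a correct response combines the target trait, the item difficulty, and the person's effects for every testlet the item belongs to.

```python
import math

def p_correct(theta, b, gammas):
    """Rasch model with additive testlet effects (assumed notation):
    logit P(X=1) = theta - b + sum of the person's testlet effects for this item."""
    logit = theta - b + sum(gammas)
    return 1.0 / (1.0 + math.exp(-logit))

# An item belonging to two testlets at once, e.g. a passage testlet and an
# item-format testlet -- the within-item multidimensional case:
p = p_correct(theta=0.5, b=0.0, gammas=[0.3, -0.2])
```

Ignoring the `gammas` term collapses this to the ordinary Rasch model, which is the misspecification the simulations show biases item parameters and inflates reliability.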

  14. Asteroid thermal modeling in the presence of reflected sunlight

    Science.gov (United States)

    Myhrvold, Nathan

    2016-10-01

    This study addresses thermal modeling of asteroids with a new derivation of the Near-Earth Asteroid Thermal Model (NEATM) which correctly accounts for the presence of reflected sunlight in short-wave IR bands. Kirchhoff's law of thermal radiation applies to this case and has important implications. New insight is provided into the beaming parameter η in the NEATM model, and the analysis is extended to thermal models besides NEATM. The role of surface material properties on η is examined using laboratory spectra of meteorites and other asteroid compositional proxies; the common assumption that emissivity ε = 0.9 in asteroid thermal models may not be justified and can lead to misestimating physical parameters. In addition, indeterminacy in thermal modeling can limit its ability to uniquely determine temperature and other physical properties. A new curve-fitting approach allows thermal modeling to be done independently of visible-band observational parameters, such as the absolute magnitude H.
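
The role of η and ε can be illustrated with the standard NEATM subsolar-temperature formula, T_ss = [(1 − A) S / (η ε σ)]^(1/4); the albedo, beaming, and distance values below are hypothetical illustration values, not results from the study.

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0              # solar constant at 1 au, W m^-2

def subsolar_temperature(r_au, bond_albedo=0.1, eta=1.0, emissivity=0.9):
    """NEATM subsolar temperature: T_ss = [(1 - A) S / (eta * eps * sigma)]**0.25."""
    flux = (1.0 - bond_albedo) * S0 / (r_au ** 2)
    return (flux / (eta * emissivity * SIGMA)) ** 0.25

t_std = subsolar_temperature(1.0)                   # with the eps = 0.9 convention
t_alt = subsolar_temperature(1.0, emissivity=0.95)  # a different eps shifts T
```

Because T enters the emitted flux to the fourth power, even a modest change in the assumed ε or η shifts the fitted temperature and hence the inferred diameter, which is the paper's point about the ε = 0.9 convention.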

  15. An Additive-Multiplicative Restricted Mean Residual Life Model

    DEFF Research Database (Denmark)

    Mansourvar, Zahra; Martinussen, Torben; Scheike, Thomas H.

    2016-01-01

mean residual life model to study the association between the restricted mean residual life function and potential regression covariates in the presence of right censoring. This model extends the proportional mean residual life model using an additive model as its covariate-dependent baseline. ... For the suggested model, some covariate effects are allowed to be time-varying. To estimate the model parameters, martingale estimating equations are developed, and the large-sample properties of the resulting estimators are established. In addition, to assess the adequacy of the model, we investigate a goodness-of-fit test that is asymptotically justified. The proposed methodology is evaluated via simulation studies and further applied to a kidney cancer data set collected from a clinical trial....

  16. Local histograms and image occlusion models

    CERN Document Server

    Massar, Melody L; Fickus, Matthew; Kovacevic, Jelena

    2011-01-01

    The local histogram transform of an image is a data cube that consists of the histograms of the pixel values that lie within a fixed neighborhood of any given pixel location. Such transforms are useful in image processing applications such as classification and segmentation, especially when dealing with textures that can be distinguished by the distributions of their pixel intensities and colors. We, in particular, use them to identify and delineate biological tissues found in histology images obtained via digital microscopy. In this paper, we introduce a mathematical formalism that rigorously justifies the use of local histograms for such purposes. We begin by discussing how local histograms can be computed as systems of convolutions. We then introduce probabilistic image models that can emulate textures one routinely encounters in histology images. These models are rooted in the concept of image occlusion. A simple model may, for example, generate textures by randomly speckling opaque blobs of one color on ...

  17. On a Quantum Model of Brain Activities

    Science.gov (United States)

    Fichtner, K.-H.; Fichtner, L.; Freudenberg, W.; Ohya, M.

    2010-01-01

    One of the main activities of the brain is the recognition of signals. A first attempt to explain the process of recognition in terms of quantum statistics was given in [6]. Subsequently, details of the mathematical model were presented in a (still incomplete) series of papers (cf. [7, 2, 5, 10]). In the present note we want to give a general view of the principal ideas of this approach. We will introduce the basic spaces and justify the choice of spaces and operations. Further, we bring the model face to face with basic postulates any statistical model of the recognition process should fulfill. These postulates are in accordance with the opinion widely accepted in psychology and neurology.

  18. The Separate Spheres Model of Gendered Inequality.

    Directory of Open Access Journals (Sweden)

    Andrea L Miller

    Full Text Available Research on role congruity theory and descriptive and prescriptive stereotypes has established that when men and women violate gender stereotypes by crossing spheres, with women pursuing career success and men contributing to domestic labor, they face backlash and economic penalties. Less is known, however, about the types of individuals who are most likely to engage in these forms of discrimination and the types of situations in which this is most likely to occur. We propose that psychological research will benefit from supplementing existing research approaches with an individual differences model of support for separate spheres for men and women. This model allows psychologists to examine individual differences in support for separate spheres as they interact with situational and contextual forces. The separate spheres ideology (SSI has existed as a cultural idea for many years but has not been operationalized or modeled in social psychology. The Separate Spheres Model presents the SSI as a new psychological construct characterized by individual differences and a motivated system-justifying function, operationalizes the ideology with a new scale measure, and models the ideology as a predictor of some important gendered outcomes in society. As a first step toward developing the Separate Spheres Model, we develop a new measure of individuals' endorsement of the SSI and demonstrate its reliability, convergent validity, and incremental predictive validity. We provide support for the novel hypotheses that the SSI predicts attitudes regarding workplace flexibility accommodations, income distribution within families between male and female partners, distribution of labor between work and family spheres, and discriminatory workplace behaviors. Finally, we provide experimental support for the hypothesis that the SSI is a motivated, system-justifying ideology.

  19. The Separate Spheres Model of Gendered Inequality.

    Science.gov (United States)

    Miller, Andrea L; Borgida, Eugene

    2016-01-01

    Research on role congruity theory and descriptive and prescriptive stereotypes has established that when men and women violate gender stereotypes by crossing spheres, with women pursuing career success and men contributing to domestic labor, they face backlash and economic penalties. Less is known, however, about the types of individuals who are most likely to engage in these forms of discrimination and the types of situations in which this is most likely to occur. We propose that psychological research will benefit from supplementing existing research approaches with an individual differences model of support for separate spheres for men and women. This model allows psychologists to examine individual differences in support for separate spheres as they interact with situational and contextual forces. The separate spheres ideology (SSI) has existed as a cultural idea for many years but has not been operationalized or modeled in social psychology. The Separate Spheres Model presents the SSI as a new psychological construct characterized by individual differences and a motivated system-justifying function, operationalizes the ideology with a new scale measure, and models the ideology as a predictor of some important gendered outcomes in society. As a first step toward developing the Separate Spheres Model, we develop a new measure of individuals' endorsement of the SSI and demonstrate its reliability, convergent validity, and incremental predictive validity. We provide support for the novel hypotheses that the SSI predicts attitudes regarding workplace flexibility accommodations, income distribution within families between male and female partners, distribution of labor between work and family spheres, and discriminatory workplace behaviors. Finally, we provide experimental support for the hypothesis that the SSI is a motivated, system-justifying ideology.

  20. Multiscale modelling and analysis of collective decision making in swarm robotics.

    Science.gov (United States)

    Vigelius, Matthias; Meyer, Bernd; Pascoe, Geoffrey

    2014-01-01

    We present a unified approach to describing certain types of collective decision making in swarm robotics that bridges from a microscopic individual-based description to aggregate properties. Our approach encompasses robot swarm experiments, microscopic and probabilistic macroscopic-discrete simulations as well as an analytic mathematical model. Following up on previous work, we identify the symmetry parameter, a measure of the progress of the swarm towards a decision, as a fundamental integrated swarm property and formulate its time evolution as a continuous-time Markov process. Contrary to previous work, which justified this approach only empirically and a posteriori, we justify it from first principles and derive hard limits on the parameter regime in which it is applicable.

  1. Multiscale modelling and analysis of collective decision making in swarm robotics.

    Directory of Open Access Journals (Sweden)

    Matthias Vigelius

    Full Text Available We present a unified approach to describing certain types of collective decision making in swarm robotics that bridges from a microscopic individual-based description to aggregate properties. Our approach encompasses robot swarm experiments, microscopic and probabilistic macroscopic-discrete simulations as well as an analytic mathematical model. Following up on previous work, we identify the symmetry parameter, a measure of the progress of the swarm towards a decision, as a fundamental integrated swarm property and formulate its time evolution as a continuous-time Markov process. Contrary to previous work, which justified this approach only empirically and a posteriori, we justify it from first principles and derive hard limits on the parameter regime in which it is applicable.

  2. Model Based Reasoning by Introductory Students When Analyzing Earth Systems and Societal Challenges

    Science.gov (United States)

    Holder, L. N.; Herbert, B. E.

    2014-12-01

    Understanding how students use their conceptual models to reason about societal challenges involving societal issues such as natural hazard risk assessment, environmental policy and management, and energy resources can improve instructional activity design that directly impacts student motivation and literacy. To address this question, we created four laboratory exercises for an introductory physical geology course at Texas A&M University that engages students in authentic scientific practices by using real world problems and issues that affect societies based on the theory of situated cognition. Our case-study design allows us to investigate the various ways that students utilize model based reasoning to identify and propose solutions to societally relevant issues. In each of the four interventions, approximately 60 students in three sections of introductory physical geology were expected to represent and evaluate scientific data, make evidence-based claims about the data trends, use those claims to express conceptual models, and use their models to analyze societal challenges. Throughout each step of the laboratory exercise students were asked to justify their claims, models, and data representations using evidence and through the use of argumentation with peers. Cognitive apprenticeship was the foundation for instruction used to scaffold students so that in the first exercise they are given a partially completed model and in the last exercise students are asked to generate a conceptual model on their own. Student artifacts, including representation of earth systems, representation of scientific data, verbal and written explanations of models and scientific arguments, and written solutions to specific societal issues or environmental problems surrounding earth systems, were analyzed through the use of a rubric that modeled authentic expertise and students were sorted into three categories. Written artifacts were examined to identify student argumentation and

  3. Modeling liquid hydrogen cavitating flow with the full cavitation model

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, X.B.; Qiu, L.M.; Qi, H.; Zhang, X.J.; Gan, Z.H. [Institute of Refrigeration and Cryogenic Engineering, Zhejiang University, Hangzhou 310027 (China)

    2008-12-15

    Cavitation is the formation of vapor bubbles within a liquid where flow dynamics cause the local static pressure to drop below the vapor pressure. This paper strives towards developing an effective computational strategy to simulate liquid hydrogen cavitation relevant to liquid rocket propulsion applications. The aims are realized by performing a steady state computational fluid dynamic (CFD) study of liquid hydrogen flow over a 2D hydrofoil and an axisymmetric ogive in Hord's reports with a so-called full cavitation model. The thermodynamic effect was demonstrated with the assumption of thermal equilibrium between the gas phase and liquid phase. Temperature-dependent fluid thermodynamic properties were specified along the saturation line from the "Gaspak 3.2" databank. Justifiable agreement between the computed surface pressure, temperature and experimental data of Hord was obtained. Specifically, a global sensitivity analysis is performed to examine the sensitivity of the turbulent computations to the wall grid resolution, wall treatments and changes in model parameters. A proper near-wall model and grid resolution were suggested. The full cavitation model with default model parameters provided solutions with comparable accuracy to sheet cavitation in liquid hydrogen for the two geometries. (author)

  4. Holling's "hungry mantid" model for the invertebrate functional response considered as a Markov process. III. Stable satiation distribution.

    Science.gov (United States)

    Heijmans, H J

    1984-01-01

    In this paper, we study an analytical model describing predatory behaviour. It is assumed that the parameter describing the predator's behaviour is its satiation. Using semigroup methods and compactness arguments we prove that a stable satiation distribution is reached as t → ∞. Furthermore, using a Trotter-Kato theorem we justify the transition to the much simpler problem that is obtained if the prey biomass tends to zero.

  5. The mathematical models of electromagnetic field dynamics and heat transfer in closed electrical contacts including Thomson effect

    Science.gov (United States)

    Kharin, Stanislav; Sarsengeldin, Merey; Kassabek, Samat

    2016-08-01

    We represent mathematical models of electromagnetic field dynamics and heat transfer in closed symmetric and asymmetric electrical contacts including the Thomson effect, which are essentially nonlinear due to the dependence of thermal and electrical conductivities on temperature. Suggested solutions are based on the assumption of identity of equipotential and isothermal surfaces, which agrees with experimental data and is valid for both linear and nonlinear cases. The well-known Kohlrausch temperature-potential relation is analytically justified.

  6. AXIOLOGICAL MODEL OF INSTRUCTIONAL DESIGN

    Directory of Open Access Journals (Sweden)

    Takushevich I. A.

    2015-10-01

    Full Text Available The article presents instructional design as a new approach to the issue of developing a value-oriented worldview. Scientific research and analysis led the author to summarize instructional design theory, broaden the definition of instructional design, and apply it to instruction and learning in a new manner. The goal of building a pattern of instruction aimed at developing learners' value-oriented worldview required the author to study existing instructional design models, to analyse and generalize a number of monographs and articles devoted to the problem of building value systems and value orientations, and finally to investigate and apply the new knowledge in a real-life experiment. This work led the author to an axiological model of instructional design, which consists of three dimensions: a linear sequence of events from designing the instructional material to independent learning activities, interaction between teacher and learner, and the pace of learning and design. The article touches upon every dimension, level and stage of the model, describes and defines the procedures that take place in each of them, and suggests a possible way to visualize the model in the form of a sketch. The author also points out the advantages of using instructional design as an efficient tool to organize learning and justifies the use of the new instructional design model in the 21st century.

  7. A dynamic p53-mdm2 model with distributed delay

    Science.gov (United States)

    Horhat, Raluca; Horhat, Raul Florin

    2014-12-01

    Specific activator and repressor transcription factors, which bind to specific regulatory DNA sequences, play an important role in the control of gene activity. Interactions between genes coding for such transcription factors should explain the different stable, or sometimes oscillatory, gene activities characteristic of different tissues. In this paper, the dynamic p53-Mdm2 interaction model with distributed delays is investigated. Both weak and Dirac kernels are taken into consideration. For the Dirac case, the Hopf bifurcation is investigated. Some numerical examples are finally given to justify the theoretical results.

  8. Complex groundwater flow systems as traveling agent models

    CERN Document Server

    López-Corona, Oliver; Escolero, Oscar; González, Tomás; Morales-Casique, Eric

    2014-01-01

    Analyzing field data from pumping tests, we show that, as with many other natural phenomena, groundwater flow exhibits complex dynamics described by a 1/f power spectrum. This result is studied theoretically from an agent perspective. Using a traveling agent model, we prove that this statistical behavior emerges when the medium is complex. Some heuristic reasoning is provided to justify both spatial and dynamic complexity as the result of the superposition of an infinite number of stochastic processes. Moreover, we show that this implies that non-Kolmogorovian probability is needed for its study, and we provide a set of new partial differential equations for groundwater flow.
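
Detecting a 1/f signature of the kind the abstract reports usually amounts to fitting the log-log slope of a periodogram. The sketch below shows that standard recipe on synthetic data; it is a generic illustration, not the authors' analysis of the pumping-test records.

```python
import numpy as np

def spectral_slope(x, dt=1.0):
    """Least-squares slope of log periodogram power vs log frequency;
    a slope near -1 indicates 1/f (flicker) behaviour."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    freqs = np.fft.rfftfreq(len(x), d=dt)[1:]   # skip the f = 0 bin
    power = np.abs(np.fft.rfft(x))[1:] ** 2
    slope, _intercept = np.polyfit(np.log(freqs), np.log(power), 1)
    return slope

rng = np.random.default_rng(0)
s_white = spectral_slope(rng.standard_normal(4096))             # flat: slope near 0
s_brown = spectral_slope(np.cumsum(rng.standard_normal(4096)))  # ~1/f^2: slope near -2
```

A measured slope between these two benchmarks, near -1, is the flicker-noise regime associated with complex media.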

  9. A new econometric test for asymmetric price adjustment by cointegrating vector restrictions with an application to the U.S. and Dutch pork chains

    NARCIS (Netherlands)

    Kuiper, W.E.; Pennings, J.M.E.; Verhees, F.J.H.M.

    2011-01-01

    A new test of asymmetric price adjustment is proposed on the basis of the super-consistent cointegrating vector estimator in the Johansen (1995) cointegration procedure. The super-consistency makes the test robust to misspecifications in the short-run model. Application of the test to the price spre

  10. Testing for Stock Market Contagion: A Quantile Regression Approach

    NARCIS (Netherlands)

    S.Y. Park (Sung); W. Wang (Wendun); N. Huang (Naijing)

    2015-01-01

    Regarding the asymmetric and leptokurtic behavior of financial data, we propose a new contagion test in the quantile regression framework that is robust to model misspecification. Unlike conventional correlation-based tests, the proposed quantile contagion test allows

  11. DOES YOUR EVENT LOG FIT THE HIGH-LEVEL PROCESS MODEL?

    Directory of Open Access Journals (Sweden)

    A. K. Begicheva

    2015-01-01

    Full Text Available Process mining is a relatively new field of computer science which deals with process discovery and analysis based on event logs. In this paper we consider the problem of checking conformance between models and event logs. Conformance checking is intensively studied in process mining research, but the literature has considered only models and event logs of the same granularity. Here we present and justify a method for checking conformance between a high-level model (e.g. built by an expert) and a low-level log (generated by a system). The article is published in the author's wording.
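
The granularity gap the abstract describes can be bridged by abstracting low-level events to high-level activities before checking conformance. The toy sketch below is an assumption-laden stand-in for the paper's method: the event-to-activity mapping is invented, and the "model" is just a set of accepted activity sequences rather than a Petri net or BPMN model.

```python
from itertools import groupby

# Hypothetical abstraction: several low-level events realize one high-level activity.
ABSTRACTION = {
    "open_form": "Register", "fill_form": "Register", "submit_form": "Register",
    "check_docs": "Verify", "approve": "Verify",
    "send_mail": "Notify",
}

# Toy high-level model: the set of accepted activity sequences.
MODEL_RUNS = {("Register", "Verify", "Notify")}

def conforms(low_level_trace):
    """Abstract the trace, collapse consecutive repeats of the same activity,
    and check whether the result is a run of the high-level model."""
    mapped = [ABSTRACTION.get(e) for e in low_level_trace]
    if None in mapped:  # an event with no high-level counterpart is a deviation
        return False
    collapsed = tuple(k for k, _ in groupby(mapped))
    return collapsed in MODEL_RUNS

ok = conforms(["open_form", "fill_form", "submit_form",
               "check_docs", "approve", "send_mail"])
bad = conforms(["open_form", "send_mail", "check_docs"])
```

The first trace abstracts to Register → Verify → Notify and conforms; the second interleaves activities out of order and is flagged as deviant.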

  12. A comparative study of spherical and flat-Earth geopotential modeling at satellite elevations

    Science.gov (United States)

    Parrott, M. H.; Hinze, W. J.; Braile, L. W.

    1985-01-01

    Flat-Earth and spherical-Earth geopotential modeling of crustal anomaly sources at satellite elevations are compared by computing gravity and scalar magnetic anomalies perpendicular to the strike of variably dimensioned rectangular prisms at altitudes of 150, 300, and 450 km. Results indicate that the error caused by the flat-Earth approximation is less than 10% under most geometric conditions. Generally, errors increase with larger and wider anomaly sources at higher altitudes. For most crustal source modeling applications at conventional satellite altitudes, flat-Earth modeling can be justified and is numerically efficient.

  13. Animal models of pediatric chronic kidney disease. Is adenine intake an appropriate model?

    Directory of Open Access Journals (Sweden)

    Débora Claramunt

    2015-11-01

    Full Text Available Pediatric chronic kidney disease (CKD) has peculiar features. In particular, growth impairment is a major clinical manifestation of CKD debuting in pediatric age, because it presents in a large proportion of infants and children with CKD and has a profound impact on the self-esteem and social integration of stunted patients. Several factors associated with CKD may lead to growth retardation by interfering with the normal physiology of the growth plate, the organ where longitudinal growth takes place. The study of the growth plate is hardly possible in humans, which justifies the use of animal models. Young rats made uremic by 5/6 nephrectomy have been widely used as a model to investigate growth retardation in CKD. This article examines the characteristics of this model and analyzes the use of CKD induced by a high-adenine diet as an alternative research protocol.

  14. Modelling lifetime data with multivariate Tweedie distribution

    Science.gov (United States)

    Nor, Siti Rohani Mohd; Yusof, Fadhilah; Bahar, Arifah

    2017-05-01

    This study aims to measure the dependence between individual lifetimes by applying the multivariate Tweedie distribution to lifetime data. Incorporating dependence between lifetimes in the mortality model is a new idea with significant impact on the risk of an annuity portfolio, in contrast to standard actuarial methods, which assume independence between lifetimes. Hence, this paper applies the Tweedie family of distributions to a portfolio of lifetimes to induce dependence between lives. The Tweedie family is chosen since it contains symmetric and non-symmetric, as well as light-tailed and heavy-tailed, distributions. Parameter estimation is modified in order to fit the Tweedie distribution to the data; this procedure is developed using the method of moments. In addition, a comparison stage checks the adequacy between observed and expected mortality. Finally, the importance of including systematic mortality risk in the model is justified by Pearson's chi-squared test.
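
A defining feature of the Tweedie family is the power variance function Var(Y) = φ μ^p, and a simple moment-based sketch of fitting it is shown below. This is a generic illustration of method-of-moments estimation of the power parameter, not the authors' exact procedure for lifetime data; the Poisson sanity check uses p = 1, φ = 1.

```python
import numpy as np

# Tweedie variance function: Var(Y) = phi * mu**p. Regressing log sample
# variance on log sample mean across groups estimates the power p (slope)
# and the dispersion phi (exponentiated intercept).
def estimate_power(groups):
    means = np.array([np.mean(g) for g in groups])
    varis = np.array([np.var(g, ddof=1) for g in groups])
    slope, intercept = np.polyfit(np.log(means), np.log(varis), 1)
    return slope, np.exp(intercept)

# Sanity check on simulated Poisson data, a Tweedie member with p = 1, phi = 1.
rng = np.random.default_rng(1)
groups = [rng.poisson(mu, 500) for mu in (2, 5, 10, 20, 40)]
p_hat, phi_hat = estimate_power(groups)
```

The same slope diagnostic distinguishes light-tailed members (p near 0 or 1) from the heavy-tailed ones the paper exploits.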

  15. Indoor-to-outdoor particle concentration ratio model for human exposure analysis

    Science.gov (United States)

    Lee, Jae Young; Ryu, Sung Hee; Lee, Gwangjae; Bae, Gwi-Nam

    2016-02-01

    This study presents an indoor-to-outdoor particle concentration ratio (IOR) model for improved estimates of indoor exposure levels. This model is useful in epidemiological studies with large populations, because sampling indoor pollutants in all participants' houses is often necessary but impractical. As part of a study examining the association between air pollutants and atopic dermatitis in children, 16 parents agreed to measure the indoor and outdoor PM10 and PM2.5 concentrations at their homes for 48 h. Correlation analysis and multi-step multivariate linear regression analysis were performed to develop the IOR model. Temperature and floor level were found to be powerful predictors of the IOR. Despite the simplicity of the model, it demonstrated high accuracy in terms of the root mean square error (RMSE). Especially for long-term IOR estimation, the RMSE was as low as 0.064 and 0.063 for PM10 and PM2.5, respectively. When using a prediction model in an epidemiological study, understanding the consequences of the modeling error and justifying the use of the model is very important. In the last section, this paper discusses the impact of the modeling error and develops a novel methodology to justify the use of the model.
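
A multivariate linear IOR model of the kind described can be sketched with ordinary least squares. The training data and the resulting coefficients below are entirely hypothetical illustrations, not the study's measurements or fitted model.

```python
import numpy as np

# Hypothetical training data (NOT the study's measurements):
# columns are outdoor temperature (deg C) and floor level; y is the observed IOR.
X = np.array([[ 5.0, 1], [10.0, 3], [15.0, 2], [20.0, 5],
              [25.0, 4], [30.0, 1], [12.0, 6], [22.0, 2]])
y = np.array([0.55, 0.60, 0.62, 0.70, 0.72, 0.75, 0.66, 0.69])

# Ordinary least squares with an intercept: IOR ~ b0 + b1*temperature + b2*floor
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_ior(temperature, floor):
    return coef[0] + coef[1] * temperature + coef[2] * floor

rmse = float(np.sqrt(np.mean((A @ coef - y) ** 2)))  # in-sample fit error
```

Multiplying a predicted IOR by a measured outdoor concentration yields the indoor exposure estimate without in-home sampling, which is the model's intended use.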

  16. Extension of ITU IMT-A Channel Models for Elevation Domains and Line-of-Sight Scenarios

    CERN Document Server

    Zhong, Zhimeng; Li, Xin; Li, Xue

    2013-01-01

    In this contribution, the 3-dimensional (3D) channel characteristics, particularly in the elevation domains, are extracted through measurements in typical urban macro and micro environments in Xi'an, China. Stochastic channel model parameters are obtained based on the high-resolution multi-path parameter estimates. In addition, a modified spatial channel model (SCM) for the line-of-sight (LoS) scenario is proposed in which the LoS polarization matrix is parameterized in accordance with reality. Measurement results justify the validity of the proposed model. This work significantly improves the applicability of the ITU SCM models in realistic 3D channel simulations.

  17. Modeling in biopharmaceutics, pharmacokinetics, and pharmacodynamics homogeneous and heterogeneous approaches

    CERN Document Server

    Macheras, Panos

    2006-01-01

    The state of the art in Biopharmaceutics, Pharmacokinetics, and Pharmacodynamics Modeling is presented in this book. It shows how advanced physical and mathematical methods can expand classical models in order to cover heterogeneous drug-biological processes and therapeutic effects in the body. The book is divided into four parts; the first deals with the fundamental principles of fractals, diffusion and nonlinear dynamics; the second with drug dissolution, release, and absorption; the third with empirical, compartmental, and stochastic pharmacokinetic models, and the fourth mainly with nonclassical aspects of pharmacodynamics. The classical models that have relevance and application to these sciences are also considered throughout. Many examples are used to illustrate the intrinsic complexity of drug administration related phenomena in the human, justifying the use of advanced modeling methods. This timely and useful book will appeal to graduate students and researchers in pharmacology, pharmaceutical scienc...

  18. Multiaxial Temperature- and Time-Dependent Failure Model

    Science.gov (United States)

    Richardson, David; McLennan, Michael; Anderson, Gregory; Macon, David; Batista-Rodriquez, Alicia

    2003-01-01

    A temperature- and time-dependent mathematical model predicts the conditions for failure of a material subjected to multiaxial stress. The model was initially applied to a filled epoxy below its glass-transition temperature, and is expected to be applicable to other materials, at least below their glass-transition temperatures. The model is justified simply by the fact that it closely approximates the experimentally observed failure behavior of this material: the multiaxiality of the model has been confirmed (see figure), and the model has been shown to be applicable at temperatures from -20 to 115 F (-29 to 46 C) and to predict tensile failures of constant-load and constant-load-rate specimens with failure times ranging from minutes to months.

  19. Brand Value - Proposed Model Danrise

    Directory of Open Access Journals (Sweden)

    Daniel Nascimento Pereira da Silva

    2011-12-01

    Full Text Available Brands have taken on a dominant role in the strategies of enterprises because they are able to generate feelings, sensations and emotions in clients. These values, for the enterprises and for the brands themselves, are not easily measurable. A strong brand is the highest representative of an enterprise, and the brand is regarded as an asset of the enterprise. The evolution of a brand, as an intangible and strategic asset, becomes vitally important for enterprises as a way of maximizing results. This need, whether of the market or of the enterprises, justifies directing the research toward this vector: the value of the brand. The main objective of the research is to present a new model of brand evaluation. This model is supported by tangible and intangible aspects; the intangible aspect evaluates the knowledge and capacity of managers and workers to build a brand with value through the correct ordering of the priorities of the dimensions of the proposed model. The model was tested on the brand 'Blue Rise'.

  20. Persistent vs. Permanent Income Shocks in the Buffer-Stock Model

    DEFF Research Database (Denmark)

    Druedahl, Jeppe; Jørgensen, Thomas Høgholm

    2017-01-01

    misspecification. Across most parameterizations, and using the method of simulated moments, we find a relatively large estimation bias in preference parameters. For example, assuming a unit root process when the true AR(1) coefficient is 0.97, leads to an estimation bias of up to 30 percent in the constant...... relative risk aversion (CRRA) coefficient. If used for calibration, misspecified preferences could, for example, lead to a serious misjudgment in the value of social insurance mechanisms. Economic behavior, such as the marginal propensity to consume (MPC), of households simulated from the estimated...

  1. Identifying nonlinear biomechanical models by multicriteria analysis

    Science.gov (United States)

    Srdjevic, Zorica; Cveticanin, Livija

    2012-02-01

    In this study, the methodology developed by Srdjevic and Cveticanin (International Journal of Industrial Ergonomics 34 (2004) 307-318) for the nonbiased (objective) parameter identification of the linear biomechanical model exposed to vertical vibrations is extended to the identification of n-degree of freedom (DOF) nonlinear biomechanical models. The dynamic performance of the n-DOF nonlinear model is described in terms of response functions in the frequency domain, such as the driving-point mechanical impedance and seat-to-head transmissibility function. For randomly generated parameters of the model, nonlinear equations of motion are solved using the Runge-Kutta method. The appropriate data transformation from the time-to-frequency domain is performed by a discrete Fourier transformation. Squared deviations of the response functions from the target values are used as the model performance evaluation criteria, thus shifting the problem into the multicriteria framework. The objective weights of criteria are obtained by applying the Shannon entropy concept. The suggested methodology is programmed in Pascal and tested on a 4-DOF nonlinear lumped parameter biomechanical model. The identification process over the 2000 generated sets of parameters lasts less than 20 s. The model response obtained with the embedded identified parameters correlates well with the target values, thus justifying the use of the underlying concept and the mathematical instruments and numerical tools applied. It should be noted that the identified nonlinear model has an improved accuracy of the biomechanical response compared to the accuracy of a linear model.
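
    The Shannon entropy concept for objective criteria weights can be sketched as follows: criteria whose values vary more across the candidate parameter sets receive more weight. The score matrix below is hypothetical.

```python
import numpy as np

def entropy_weights(perf):
    """Objective criteria weights via the Shannon entropy concept:
    criteria with more dispersion across alternatives get more weight."""
    perf = np.asarray(perf, float)
    m, n = perf.shape
    p = perf / perf.sum(axis=0)               # normalize each criterion column
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    e = -plogp.sum(axis=0) / np.log(m)        # entropy per criterion, in [0, 1]
    d = 1.0 - e                               # degree of divergence
    return d / d.sum()

# Hypothetical matrix: rows = candidate parameter sets, cols = criteria
scores = [[0.90, 0.30, 0.50],
          [0.10, 0.31, 0.55],
          [0.50, 0.29, 0.45]]
w = entropy_weights(scores)
```

    Here the first criterion, which varies most across alternatives, receives by far the largest weight.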

  2. Thruster Modelling for Underwater Vehicle Using System Identification Method

    Directory of Open Access Journals (Sweden)

    Mohd Shahrieel Mohd Aras

    2013-05-01

    Full Text Available This paper describes a study of thruster modelling for a remotely operated underwater vehicle (ROV) by system identification using the Microbox 2000/2000C. The Microbox 2000/2000C is an xPC Target machine device that interfaces between an ROV thruster and the MATLAB 2009 software. In this project, a model of the thruster is developed first so that the system identification toolbox in MATLAB can be used. The project also presents a comparison of mathematical and empirical modelling. The experiments were carried out using a mini compressor as a dummy depth pressure applied to a pressure sensor. The thruster will thrust and submerge until it reaches a set point and then maintain that depth, with depth based on the pressure sensor measurement. A conventional proportional controller was used in this project and the results gathered justified its selection.

  4. Modeling cerebral blood flow during posture change from sitting to standing

    DEFF Research Database (Denmark)

    Ottesen, Johnny T.; Olufsen, M.; Tran, H.T.

    2004-01-01

    Abstract Hypertension, decreased cerebral blood flow, and diminished cerebral blood flow velocity regulation are among the first signs indicating the presence of cerebral vascular disease. In this paper, we will present a mathematical model that can predict blood flow and pressure during posture... extremities, the brain, and the heart. We use physiologically based control mechanisms to describe the regulation of cerebral blood flow velocity and arterial pressure in response to orthostatic hypotension resulting from postural change. To justify the fidelity of our mathematical model and control...

  5. Global Stability Analysis for an Internet Congestion Control Model with a Time-Varying Link Capacity

    CERN Document Server

    Rezaie, B; Analoui, M; Khorsandi, S

    2009-01-01

    In this paper, a global stability analysis is given for a rate-based congestion control system modeled by a nonlinear delayed differential equation. The model describes the dynamics of a single-source single-link network with a time-varying link capacity and a fixed communication delay. We obtain sufficient delay-independent conditions on the system parameters under which global asymptotic stability of the system is guaranteed. The proof is based on an extension of the Lyapunov-Krasovskii theorem for a class of nonlinear time-delay systems. Numerical simulations for a typical scenario justify the theoretical results.
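
    Scalar delay differential equations of this general kind can be simulated with a simple forward-Euler scheme. The sketch below uses the delayed logistic equation purely as an illustrative stand-in, not the paper's congestion control model; parameters are chosen so the delay-dependent stability condition holds and the trajectory settles to its equilibrium.

```python
import numpy as np

def euler_dde(f, history, tau, t_end, dt=0.01):
    """Forward-Euler integration of x'(t) = f(x(t), x(t - tau)).
    `history` supplies x on the interval [-tau, 0]."""
    lag = int(round(tau / dt))
    n = int(round(t_end / dt))
    x = np.empty(n + 1)
    x[0] = history(0.0)
    past = [history(-tau + i * dt) for i in range(lag)]  # delayed values for t < tau
    for i in range(n):
        x_del = past[i] if i < lag else x[i - lag]
        x[i + 1] = x[i] + dt * f(x[i], x_del)
    return x

# Delayed logistic (illustrative stand-in): x' = r*x*(1 - x_del/K)
r, K, tau = 0.5, 1.0, 1.0          # r*tau < pi/2 -> asymptotically stable
x = euler_dde(lambda xi, xd: r * xi * (1.0 - xd / K),
              history=lambda t: 0.2, tau=tau, t_end=60.0)
```

    With r*tau above pi/2 the same scheme instead shows sustained oscillations, which is why delay-independent stability conditions of the kind derived in the paper are valuable.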

  6. A Model to Calculate the Return on Investment After a Software Implementation

    Directory of Open Access Journals (Sweden)

    PADUAM, T. C.

    2015-06-01

    Full Text Available Organizations are increasingly concerned with analyzing the impact of IT investments. Economic pressures, combined with years of significant IT spending without demonstrably clear returns, have forced companies to improve their financial practices and to justify every penny invested more clearly. This article therefore presents a model to calculate the return on investment after deploying software. The model was generated from two experiments, one in the laboratory and one in the field, conducted in southern Brazil, which demonstrated an effective way to capture post-deployment time metrics. The model may be applied by any company wishing to calculate the temporal return on a deployment.
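
    The core ROI arithmetic can be sketched in a few lines; the figures below are hypothetical and the article's metric-capture procedure is not reproduced here.

```python
def roi(total_benefit, total_cost):
    """Return on investment as a fraction: (benefit - cost) / cost."""
    return (total_benefit - total_cost) / total_cost

# Hypothetical post-deployment figures gathered over one year
benefit = 180_000.0   # e.g. hours saved times a loaded hourly rate
cost = 120_000.0      # licenses + implementation + training
r = roi(benefit, cost)   # 0.5 -> the investment returned 50%
```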

  7. A distributed hydrological model with its application to the Jinghe watershed in the Yellow River Basin

    Institute of Scientific and Technical Information of China (English)

    WANG; Zhonggen; ZHENG; Hongxing; LIU; Changming; WU; Xian

    2004-01-01

    For the purpose of water resources management in the Yellow River Basin, which exhibits large spatial variability, a daily distributed hydrological model was proposed, in which spatially distributed parameters were determined and model inputs processed by means of GIS/RS. In the model, the computation of runoff yield was based on the topographic index method and flow routing was modeled by the Muskingum method. The model is operated by means of a "command structure" technique based upon the topography of the river network. A case study using the model was conducted for the Jinghe watershed, which is located in the middle Yellow River Basin. The simulation of the hydrological processes in 1996 showed that water balance errors were less than 5% and the Nash-Sutcliffe coefficient reached 0.7, indicating that the model structure is justifiable and that the precision of the model can satisfy the purpose of water resources management.
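
    The Nash-Sutcliffe coefficient used to judge the simulation can be computed as follows; the discharge values are hypothetical, not the Jinghe data.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe model efficiency: 1 = perfect fit,
    0 = no better than predicting the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical daily discharge (m^3/s): observed vs. simulated
q_obs = [12.0, 15.0, 30.0, 55.0, 40.0, 22.0, 16.0]
q_sim = [11.0, 16.0, 27.0, 50.0, 44.0, 20.0, 15.0]
nse = nash_sutcliffe(q_obs, q_sim)
```

    Values above roughly 0.5 to 0.7 are commonly taken to indicate a usable hydrological simulation.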

  8. Stability Analysis of a Simplified Yet Complete Model for Chronic Myelogenous Leukemia

    CERN Document Server

    Jauffret, Marie Doumic; Perthame, Benoît

    2009-01-01

    We analyze the asymptotic behavior of a partial differential equation (PDE) model for hematopoiesis. This PDE model is derived from the original agent-based model formulated by Roeder et al. (Nat. Med., 2006), and it describes the progression of blood cell development from the stem cell to the terminally differentiated state. To conduct our analysis, we start with the PDE model of Kim et al. (JTB, 2007), which coincides very well with the simulation results obtained by Roeder et al. We simplify the PDE model to make it amenable to analysis and justify our approximations using numerical simulations. Analysis of the simplified PDE model shows that it exhibits properties very similar to those of the original agent-based model, albeit for slightly different parameters. Hence, the simplified model is of value in understanding the dynamics of hematopoiesis and of chronic myelogenous leukemia, and it presents the advantage of having fewer parameters, which makes comparison with both experimental data and alternative...

  9. Significance of radiation models in investigating the flow phenomena around a Jovian entry body

    Science.gov (United States)

    Tiwari, S. N.; Subramanian, S. V.

    1978-01-01

    A formulation is presented to demonstrate the significance of a simplified radiation model in investigating the flow phenomena in the viscous radiating shock layer of a Jovian entry body. For this purpose, a nongray absorption model for the hydrogen-helium gas is developed which consists of 30 steps over the spectral range of 0-20 eV. By employing this model, results were obtained for temperature, pressure, density, and radiative flux in the shock layer and along the body surface. These are compared with results of two sophisticated radiative transport models available in the literature. Use of the present radiation model results in a significant reduction in computational time. Results of this model are found to be in general agreement with results of other models. It is concluded that use of the present model is justified in investigating the flow phenomena around a Jovian entry body because it is relatively simple, computationally fast, and yields fairly accurate results.

  10. The Nuisance of Nuisance Regression: Spectral Misspecification in a Common Approach to Resting-State fMRI Preprocessing Reintroduces Noise and Obscures Functional Connectivity

    Science.gov (United States)

    Hallquist, Michael N.; Hwang, Kai; Luna, Beatriz

    2013-01-01

    Recent resting-state functional connectivity fMRI (RS-fcMRI) research has demonstrated that head motion during fMRI acquisition systematically influences connectivity estimates despite bandpass filtering and nuisance regression, which are intended to reduce such nuisance variability. We provide evidence that the effects of head motion and other nuisance signals are poorly controlled when the fMRI time series are bandpass-filtered but the regressors are unfiltered, resulting in the inadvertent reintroduction of nuisance-related variation into frequencies previously suppressed by the bandpass filter, as well as suboptimal correction for noise signals in the frequencies of interest. This is important because many RS-fcMRI studies, including some focusing on motion-related artifacts, have applied this approach. In two cohorts of individuals (n = 117 and 22) who completed resting-state fMRI scans, we found that the bandpass-regress approach consistently overestimated functional connectivity across the brain, typically on the order of r = .10 – .35, relative to a simultaneous bandpass filtering and nuisance regression approach. Inflated correlations under the bandpass-regress approach were associated with head motion and cardiac artifacts. Furthermore, distance-related differences in the association of head motion and connectivity estimates were much weaker for the simultaneous filtering approach. We recommend that future RS-fcMRI studies ensure that the frequencies of nuisance regressors and fMRI data match prior to nuisance regression, and we advocate a simultaneous bandpass filtering and nuisance regression strategy that better controls nuisance-related variability. PMID:23747457
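
    The recommended strategy, applying the same bandpass filter to both the fMRI time series and the nuisance regressors before regression, can be sketched as follows. The data are synthetic and the cutoff frequencies and TR are illustrative assumptions, not the study's exact pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(x, low, high, fs, order=2):
    """Zero-phase Butterworth bandpass filter."""
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, x, axis=0)

def clean_timeseries(y, nuisance, low=0.009, high=0.08, fs=0.5):
    """Apply the SAME bandpass to data and nuisance regressors,
    then regress the filtered nuisance out of the filtered data."""
    yf = bandpass(y, low, high, fs)
    Xf = bandpass(nuisance, low, high, fs)
    Xf = np.column_stack([np.ones(len(yf)), Xf])
    beta, *_ = np.linalg.lstsq(Xf, yf, rcond=None)
    return yf - Xf @ beta                      # residuals = cleaned signal

# Hypothetical example: a voxel time series contaminated by a motion trace
rng = np.random.default_rng(0)
t = np.arange(240) / 0.5                       # 240 volumes at TR = 2 s
motion = rng.standard_normal(240).cumsum()     # slowly drifting nuisance
signal = np.sin(2 * np.pi * 0.03 * t)          # in-band signal of interest
voxel = signal + 0.8 * motion
cleaned = clean_timeseries(voxel, motion.reshape(-1, 1))
```

    Because the regressor is filtered with the same passband as the data, the residuals contain no nuisance variation reintroduced outside or inside the band.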

  11. On an elastic dissipation model as continuous approximation for discrete media

    Directory of Open Access Journals (Sweden)

    I. V. Andrianov

    2006-01-01

    Full Text Available Construction of an accurate continuous model for discrete media is an important topic in various fields of science. We deal with a 1D differential-difference equation governing the behavior of an n-mass oscillator with linear relaxation. It is known that a string-type approximation is justified for the low-frequency part of the spectrum of a continuous model, but for free and forced vibrations the solutions of the discrete and continuous models can be quite different. The difference operator makes analysis difficult due to its nonlocal form. Approximate equations can be obtained by replacing the difference operator with a local derivative operator. Although applying derivatives of order higher than two improves the continuous model, a higher-order approximating differential equation seriously complicates the solution of the continuous problem. It is known that the accuracy of the approximation can increase dramatically using Padé approximations. In this paper, one- and two-point Padé approximations suitable for justifying the choice of structural damping models are used.
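
    The accuracy gain from Padé approximation can be illustrated on the Taylor coefficients of exp(x); this is a generic example, not the paper's differential-difference operator.

```python
import math
from scipy.interpolate import pade

# Taylor coefficients of exp(x): 1, 1, 1/2!, 1/3!, ...
an = [1.0 / math.factorial(k) for k in range(6)]
p, q = pade(an, 2)          # [3/2] Padé approximant (numerator deg 3, denom deg 2)

x = 1.5
taylor = sum(c * x**k for k, c in enumerate(an))   # truncated Taylor series
approx = p(x) / q(x)                               # rational Padé approximant
exact = math.exp(x)
```

    Both approximations use the same six Taylor coefficients, yet the rational Padé form is noticeably closer to the exact value away from the expansion point.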

  12. Probabilistic forward model for electroencephalography source analysis

    Energy Technology Data Exchange (ETDEWEB)

    Plis, Sergey M [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); George, John S [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Jun, Sung C [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Ranken, Doug M [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Volegov, Petr L [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Schmidt, David M [MS-D454, Applied Modern Physics Group, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States)

    2007-09-07

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates.

  13. Design and experiment of data-driven modeling and flutter control of a prototype wing

    Science.gov (United States)

    Lum, Kai-Yew; Xu, Cai-Lin; Lu, Zhenbo; Lai, Kwok-Leung; Cui, Yongdong

    2017-06-01

    This paper presents an approach for data-driven modeling of aeroelasticity and its application to flutter control design of a wind-tunnel wing model. Modeling is centered on system identification of unsteady aerodynamic loads using computational fluid dynamics data, and adopts a nonlinear multivariable extension of the Hammerstein-Wiener system. The formulation is in modal coordinates of the elastic structure, and yields a reduced-order model of the aeroelastic feedback loop that is parametrized by airspeed. Flutter suppression is thus cast as a robust stabilization problem over uncertain airspeed, for which a low-order H∞ controller is computed. The paper discusses in detail parameter sensitivity and observability of the model, the former to justify the chosen model structure, and the latter to provide a criterion for physical sensor placement. Wind tunnel experiments confirm the validity of the modeling approach and the effectiveness of the control design.

  14. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    Energy Technology Data Exchange (ETDEWEB)

    St. John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program.

  15. Simulation modeling of outcomes and cost effectiveness.

    Science.gov (United States)

    Ramsey, S D; McIntosh, M; Etzioni, R; Urban, N

    2000-08-01

    Modeling will continue to be used to address important issues in clinical practice and health policy issues that have not been adequately studied with high-quality clinical trials. The apparent ad hoc nature of models belies the methodologic rigor that is applied to create the best models in cancer prevention and care. Models have progressed from simple decision trees to extremely complex microsimulation analyses, yet all are built using a logical process based on objective evaluation of the path between intervention and outcome. The best modelers take great care to justify both the structure and content of the model and then test their assumptions using a comprehensive process of sensitivity analysis and model validation. Like clinical trials, models sometimes produce results that are later found to be invalid as other data become available. When weighing the value of models in health care decision making, it is reasonable to consider the alternatives. In the absence of data, clinical policy decisions are often based on the recommendations of expert opinion panels or on poorly defined notions of the standard of care or medical necessity. Because such decision making rarely entails the rigorous process of data collection, synthesis, and testing that is the core of well-conducted modeling, it is usually not possible for external audiences to examine the assumptions and data that were used to derive the decisions. One of the modeler's most challenging tasks is to make the structure and content of the model transparent to the intended audience. The purpose of this article is to clarify the process of modeling, so that readers of models are more knowledgeable about their uses, strengths, and limitations.

  16. A MANAGEMENT MODEL FOR SUSTAINABLE DEVELOPMENT OF THE TOURIST DESTINATION

    Directory of Open Access Journals (Sweden)

    Krasimir ALEKSANDROV

    2013-01-01

    Full Text Available In recent years, Bulgaria has begun to successfully market one of the few competitive advantages that the country has as a tourist destination: its diverse and authentic nature. It is an indisputable fact that tourism, in its diversity, is closely linked to the choice of destination. Sustainable destination management is critical for tourism development, particularly through effective spatial planning and land use control and through investment decisions on infrastructure and services. The aim of this paper is to propose a management model for a tourist destination in the context of the ideas and policies for sustainable development. The thesis justified here is that a sustainable tourist destination is the result of the proper use of an appropriate governance model. The development and implementation of a specific management model make the destination suitable for all-year-round tourism in its different varieties (recreational, sports, etc.), bearing economic, social and environmental benefits to society.

  17. Study of a model equation in detonation theory: multidimensional effects

    CERN Document Server

    Faria, Luiz M; Rosales, Rodolfo R

    2015-01-01

    We extend the reactive Burgers equation presented in Kasimov et al. Phys. Rev. Lett., 110 (2013) and Faria et al. SIAM J. Appl. Maths, 74 (2014), to include multidimensional effects. Furthermore, we explain how the model can be rationally justified following the ideas of the asymptotic theory developed in Faria et al. JFM (2015). The proposed model is a forced version of the unsteady small disturbance transonic flow equations. We show that for physically reasonable choices of forcing functions, traveling wave solutions akin to detonation waves exist. It is demonstrated that multidimensional effects play an important role in the stability and dynamics of the traveling waves. Numerical simulations indicate that solutions of the model tend to form multi-dimensional patterns analogous to cells in gaseous detonations.

  18. Comparative analysis of existing models for power-grid synchronization

    CERN Document Server

    Nishikawa, Takashi

    2015-01-01

    The dynamics of power-grid networks is becoming an increasingly active area of research within the physics and network science communities. The results from such studies are typically insightful and illustrative, but are often based on simplifying assumptions that can be either difficult to assess or not fully justified for realistic applications. Here we perform a comprehensive comparative analysis of three leading models recently used to study synchronization dynamics in power-grid networks -- a fundamental problem of practical significance given that frequency synchronization of all power generators in the same interconnection is a necessary condition for a power grid to operate. We show that each of these models can be derived from first principles within a common framework based on the classical model of a generator, thereby clarifying all assumptions involved. This framework allows us to view power grids as complex networks of coupled second-order phase oscillators with both forcing and damping terms. U...
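
    The common framework of second-order phase oscillators with forcing and damping can be illustrated by a minimal swing-equation simulation on a toy three-node network; parameters are illustrative assumptions, not drawn from the paper.

```python
import numpy as np

def swing_sim(P, K, A, D=0.5, H=1.0, dt=0.001, t_end=30.0):
    """Second-order (swing-equation) phase oscillators on a network:
    H*theta_i'' = P_i - D*theta_i' + K * sum_j A_ij * sin(theta_j - theta_i)."""
    n = len(P)
    theta = np.zeros(n)
    omega = np.zeros(n)
    for _ in range(int(t_end / dt)):
        coupling = K * (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
        domega = (P - D * omega + coupling) / H
        theta = theta + dt * omega
        omega = omega + dt * domega
    return theta, omega

# Toy 3-node grid: one generator (P > 0) feeding two loads (P < 0), fully connected
P = np.array([0.4, -0.2, -0.2])      # net power injections (sum to zero)
A = np.ones((3, 3)) - np.eye(3)      # adjacency matrix
theta, omega = swing_sim(P, K=2.0, A=A)
```

    With sufficient coupling and balanced injections, all frequency deviations decay toward a common value: the frequency synchronization condition the paper identifies as necessary for grid operation.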

  19. Model instruments of effective segmentation of the fast food market

    Directory of Open Access Journals (Sweden)

    Mityaeva Tetyana L.

    2013-03-01

    Full Text Available The article presents the results of stepwise optimisation calculations of the economic effectiveness of fast food promotion, taking into account the key parameters for assessing the efficiency of a segmentation-based marketing strategy. The article justifies the development of a mathematical model on the basis of 3D presentations and a three-dimensional system of management variables. Modern applied mathematical packages allow the formation and analysis not only of one- and two-dimensional arrays of linked variables but also of three-dimensional ones; moreover, the more links and parameters are taken into account, the more adequate and adaptive the modelling results become and, as a result, the more informative and strategically valuable they are. The article shows modelling capabilities that allow strategies and reactions to be taken into account when forming a marketing strategy for entering fast food market segments.

  20. Lotka-Volterra competition models for sessile organisms.

    Science.gov (United States)

    Spencer, Matthew; Tanner, Jason E

    2008-04-01

    Markov models are widely used to describe the dynamics of communities of sessile organisms, because they are easily fitted to field data and provide a rich set of analytical tools. In typical ecological applications, at any point in time, each point in space is in one of a finite set of states (e.g., species, empty space). The models aim to describe the probabilities of transitions between states. In most Markov models for communities, these transition probabilities are assumed to be independent of state abundances. This assumption is often suspected to be false and is rarely justified explicitly. Here, we start with simple assumptions about the interactions among sessile organisms and derive a model in which transition probabilities depend on the abundance of destination states. This model is formulated in continuous time and is equivalent to a Lotka-Volterra competition model. We fit this model and a variety of alternatives in which transition probabilities do not depend on state abundances to a long-term coral reef data set. The Lotka-Volterra model describes the data much better than all models we consider other than a saturated model (a model with a separate parameter for each transition at each time interval, which by definition fits the data perfectly). Our approach provides a basis for further development of stochastic models of sessile communities, and many of the methods we use are relevant to other types of community. We discuss possible extensions to spatially explicit models.
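
    The continuous-time Lotka-Volterra competition dynamics underlying the model can be sketched with a simple Euler integration; the parameter values are hypothetical, not fitted to the coral reef data.

```python
import numpy as np

def lv_competition(x0, r, alpha, dt=0.01, t_end=200.0):
    """Continuous-time Lotka-Volterra competition, forward Euler:
    dx_i/dt = r_i * x_i * (1 - sum_j alpha_ij * x_j)."""
    x = np.asarray(x0, float)
    r = np.asarray(r, float)
    alpha = np.asarray(alpha, float)
    for _ in range(int(t_end / dt)):
        x = x + dt * r * x * (1.0 - alpha @ x)
        x = np.maximum(x, 0.0)                  # abundances stay non-negative
    return x

# Hypothetical pair of 'species' competing for space on a reef
x0 = [0.2, 0.3]
r = [0.5, 0.4]
alpha = [[1.0, 0.5],        # weak interspecific competition -> coexistence
         [0.5, 1.0]]
x_eq = lv_competition(x0, r, alpha)
```

    With interspecific competition weaker than intraspecific competition, both species settle at a stable coexistence equilibrium, unlike abundance-independent Markov transition models, whose dynamics cannot depend on the destination state's abundance.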

  1. The field-space metric in spiral inflation and related models

    Energy Technology Data Exchange (ETDEWEB)

    Erlich, Joshua [High Energy Theory Group, Department of Physics, College of William and Mary,Williamsburg, VA 23187 (United States); Olsen, Jackson [School of Physics and Astronomy, University of Minnesota,Minneapolis, MN 55455 (United States); Wang, Zhen [High Energy Theory Group, Department of Physics, College of William and Mary,Williamsburg, VA 23187 (United States)

    2016-09-22

    Multi-field inflation models include a variety of scenarios for how inflation proceeds and ends. Models with the same potential but different kinetic terms are common in the literature. We compare spiral inflation and Dante’s inferno-type models, which differ only in their field-space metric. We justify a single-field effective description in these models and relate the single-field description to a mass-matrix formalism. We note the effects of the nontrivial field-space metric on inflationary observables, and consequently on the viability of these models. We also note a duality between spiral inflation and Dante’s inferno models with different potentials.

  2. IMPORTANCE OF DIFFERENT MODELS IN DECISION MAKING, EXPLAINING THE STRATEGIC BEHAVIOR IN ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Cristiano de Oliveira Maciel

    2006-11-01

    Full Text Available This study examines different models of the decision process in analyzing organizational strategy. The article presents strategy according to a cognitive approach. The discussion covers three models of the decision process: the rational actor model, organizational behavior, and the political model. These models, respectively, emphasize improving decision-making results, the search for a good decision given the cognitive limitations of the administrator, and negotiation in reaching a decision. According to the emphasis of each model, the possibilities for analyzing strategy are presented. The article also shows that it is necessary to take all three perspectives into account. This statement is justified since the analysis as well as the decision making become more complex, mainly for those decisions which are most important for organizations.

  3. Accurate Mobility Modeling and Location Prediction Based on Pattern Analysis of Handover Series in Mobile Networks

    Directory of Open Access Journals (Sweden)

    Péter Fülöp

    2009-01-01

    Full Text Available The efficient dimensioning of cellular wireless access networks depends highly on the accuracy of the underlying mathematical models of user distribution and traffic estimations. Mobility prediction is also considered an effective method contributing to the accuracy of IP multicast based multimedia transmissions and ad hoc routing algorithms. In this paper we focus on the tradeoff between the accuracy and the complexity of the mathematical models used to describe user movements in the network. We propose a mobility model extension that utilizes the user's movement history, thus providing more accurate results than other widely used models in the literature. The new models are applicable in real-life scenarios, because they rely on additional information readily available in cellular networks (e.g. handover history). The complexity of the proposed models is analyzed, and the accuracy is justified by means of simulation.
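The history-based idea can be sketched as an order-1 Markov predictor over handover events. This is a hedged illustration with hypothetical cell IDs; the paper's models are richer than a first-order chain:

```python
# Sketch: predict the next cell from a user's handover history by counting
# observed transitions (order-1 Markov).  Cell IDs below are hypothetical.
from collections import Counter, defaultdict

def build_transition_counts(history):
    counts = defaultdict(Counter)
    for prev, nxt in zip(history, history[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, current):
    """Most frequently observed successor of `current`, or None if unseen."""
    if current not in counts:
        return None
    return counts[current].most_common(1)[0][0]

history = ["A", "B", "C", "A", "B", "C", "A", "B", "D"]
counts = build_transition_counts(history)
prediction = predict_next(counts, "A")  # "B" always follows "A" here
```

Extending the state to the last k cells instead of one gives the higher-order variants that trade accuracy against model complexity, the tradeoff the abstract describes.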

  4. A model for steady state stage III creep regime at low-high stress/temperature range

    Directory of Open Access Journals (Sweden)

    Nicola Bonora

    2008-07-01

    Full Text Available Although diffusional flow creep is often considered outside practical engineering applications, the need for a model capable of accounting for the combined action of both diffusional and dislocation creep is justified by the increasing demand for reliable creep design at very long lives (exceeding 100,000 h), in both the high-stress/low-temperature and high-temperature/low-stress regimes. In this paper, a creep model formulation is proposed in which the change of creep mechanism is accounted for through an explicit dependence of the creep exponent n on stress and temperature. An application example of the proposed approach to high-purity aluminum is given.

  5. Asymptotic models for the generation of internal waves by a moving ship, and the dead-water phenomenon

    CERN Document Server

    Duchene, Vincent

    2011-01-01

    This paper deals with the dead-water phenomenon, which occurs when a ship sails in a stratified fluid, and experiences an important drag due to waves below the surface. More generally, we study the generation of internal waves by a disturbance moving at constant speed on top of two layers of fluids of different densities. Starting from the full Euler equations, we present several nonlinear asymptotic models, in the long wave regime. These models are rigorously justified by consistency or convergence results. A careful theoretical and numerical analysis is then provided, in order to predict the behavior of the flow and in which situations the dead-water effect appears.

  6. Multi-month prescriptions, fast-track refills, and community ART groups: results from a process evaluation in Malawi on using differentiated models of care to achieve national HIV treatment goals

    Directory of Open Access Journals (Sweden)

    Margaret L. Prust

    2017-07-01

    Conclusions: MMS is being implemented nationally and has already generated cost savings and efficiencies in Malawi for patients and the health system, but could be improved by more accurate patient differentiation. While expanding FTRs and CAGs may not offer significant further cost savings in Malawi, future studies should investigate if such alternative models lead to improvements in patient satisfaction or clinical outcomes that might justify their implementation.

  7. Asteroid thermal modeling in the presence of reflected sunlight with an application to WISE/NEOWISE observational data

    CERN Document Server

    Myhrvold, Nathan

    2016-01-01

    This study addresses thermal modeling of asteroids with a new derivation of the Near-Earth Asteroid Thermal Model (NEATM) which correctly accounts for the presence of reflected sunlight in short-wave IR bands. Kirchhoff's law of thermal radiation applies to this case and has important implications. New insight is provided into the eta parameter of NEATM, and the analysis is extended to thermal models besides NEATM. The role of surface material properties on eta is examined using laboratory spectra of meteorites and other asteroid compositional proxies; the common assumption that emissivity e=0.9 in asteroid thermal models may not be justified and can lead to misestimating physical parameters. In addition, indeterminacy in thermal modeling can limit its ability to uniquely determine temperature and other physical properties. A new curve fitting approach allows thermal modeling to be done independent of visible band observational parameters such as the absolute magnitude H. These new thermal modeling techniques ...
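The role of the assumed emissivity can be seen directly in the NEATM subsolar temperature, T_ss = [(1 - A) S / (eta * eps * sigma)]^(1/4). The albedo and eta values below are illustrative, not results from the paper:

```python
# Sketch of the NEATM subsolar-point temperature, showing how the assumed
# emissivity enters: lowering eps raises the inferred surface temperature
# for the same absorbed flux.  Input values are illustrative.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S_1AU = 1361.0          # solar constant at 1 au, W m^-2

def subsolar_temp(bond_albedo, r_au, eta=1.0, emissivity=0.9):
    s = S_1AU / r_au ** 2                      # insolation at heliocentric distance r
    return ((1.0 - bond_albedo) * s / (eta * emissivity * SIGMA)) ** 0.25

t_e09 = subsolar_temp(0.1, 1.0, emissivity=0.9)  # ~394 K
t_e10 = subsolar_temp(0.1, 1.0, emissivity=1.0)  # ~383 K
```

The few-percent temperature shift between eps = 0.9 and eps = 1.0 propagates into the fitted diameter and eta, which is why the abstract flags the e = 0.9 convention.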

  8. The Stabilising Role Of The Fiscal And Budgetary Policies Within The Simplified Keynesian Model

    Directory of Open Access Journals (Sweden)

    Campeanu, Emilia Mioara

    2011-06-01

    Full Text Available The purpose of the paper is to investigate fiscal and budgetary policies using the simplified Keynesian model. It focuses on imbalance in fiscal and budgetary policies, attempting to justify the rationality of deficits: the strategic political incentive to leave a hard-to-administer legacy, and the inter-party conflict which may influence the budgetary decisions of the government. The purpose of the presentation is to show the stabilizing role of the public budget in relation to the full-employment goal.

  9. Ellsworth C. Dougherty: A Pioneer in the Selection of Caenorhabditis elegans as a Model Organism

    Science.gov (United States)

    Ferris, Howard

    2015-01-01

    Ellsworth Dougherty (1921–1965) was a man of impressive intellectual dimensions and interests; in a relatively short career he contributed enormously as researcher and scholar to the biological knowledge base for selection of Caenorhabditis elegans as a model organism in neurobiology, genetics, and molecular biology. He helped guide the choice of strains that were eventually used, and, in particular, he developed the methodology and understanding for the nutrition and axenic culture of nematodes and other organisms. Dougherty insisted upon a concise terminology for culture techniques and coined descriptive neologisms that were justified by their linguistic roots. Among other contributions, he refined the classification system for the Protista. PMID:26272995

  10. A model for the inverse 1-median problem on trees under uncertain costs

    Directory of Open Access Journals (Sweden)

    Kien Trung Nguyen

    2016-01-01

    Full Text Available We consider the problem of modifying the vertex weights of a tree under uncertain costs so that a prespecified vertex becomes optimal while the total cost is optimal under the uncertainty scenario. We propose a model which delivers information about the optimal cost with respect to each confidence level \(\alpha \in [0,1]\). To obtain this goal, we first define an uncertain variable with respect to the minimum cost at each confidence level. If all costs are independently linearly distributed, we present the inverse distribution function of this uncertain variable in \(O(n^{2}\log n)\) time, where \(n\) is the number of vertices in the tree.
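For context, the underlying (certain-cost) 1-median problem on a tree picks the vertex minimizing the weighted sum of distances; the paper treats the harder inverse problem under uncertainty. A plain O(n^2) computation, with illustrative weights and edges:

```python
# Sketch: weighted 1-median of a tree by BFS from every vertex (O(n^2)).
# On a tree, paths are unique, so BFS with edge lengths gives exact distances.
# The graph and weights below are illustrative.
from collections import deque

def tree_distances(adj, source):
    dist = {source: 0.0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v, length in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + length
                q.append(v)
    return dist

def one_median(adj, weights):
    best_vertex, best_cost = None, float("inf")
    for u in adj:
        dist = tree_distances(adj, u)
        cost = sum(weights[v] * dist[v] for v in adj)
        if cost < best_cost:
            best_vertex, best_cost = u, cost
    return best_vertex, best_cost

# Star with one heavy leaf: the heavy leaf pulls the median away from the hub.
adj = {0: [(1, 1.0), (2, 1.0), (3, 1.0)],
       1: [(0, 1.0)], 2: [(0, 1.0)], 3: [(0, 1.0)]}
weights = {0: 1.0, 1: 1.0, 2: 1.0, 3: 5.0}
median, cost = one_median(adj, weights)  # vertex 3 wins with cost 5.0
```

The inverse problem reverses this: given a target vertex, perturb the weights at minimum (here, uncertain) cost so that the target becomes the median.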

  11. Approximating response time distributions in closed queueing network models of computer performance

    Energy Technology Data Exchange (ETDEWEB)

    Salza, S.; Lavenberg, S.S.

    1981-01-01

    Hierarchical decomposition methods for approximating response time distributions in certain closed queueing network models of computer performance are investigated. The methods investigated apply whenever part of a customer's response time consists of a geometrically distributed number of successive cycles within a subnetwork. The key step involves replacing the subnetwork with parallel exponential servers having queue-size dependent service rates. Results on thinning stochastic point processes are used to justify this replacement when the mean number of cycles is large. Preliminary numerical comparisons of the approximations with simulation results indicate that the approximations are quite accurate even when the mean number of cycles is small. 17 references.
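The structural fact the approximation exploits is that a geometric number of i.i.d. exponential cycle times is again exponentially distributed (with rate p*mu), which licenses replacing the subnetwork by exponential servers. A Monte Carlo check with illustrative parameters:

```python
# Sketch: Geometric(p) cycles, each Exp(mu), sum to an Exp(p*mu) total,
# so the mean response time is 1 / (p * mu).  Parameters are illustrative.
import random

def geometric_sum_of_exponentials(p, mu, rng):
    """Total duration of a Geometric(p) number (>= 1) of Exp(mu) cycles."""
    total = rng.expovariate(mu)
    while rng.random() > p:          # another cycle with probability 1 - p
        total += rng.expovariate(mu)
    return total

rng = random.Random(42)
p, mu = 0.25, 2.0
samples = [geometric_sum_of_exponentials(p, mu, rng) for _ in range(200000)]
mean = sum(samples) / len(samples)   # theory: 1 / (p * mu) = 2.0
```

Note that `random.expovariate` takes the rate (lambda), not the mean; the empirical mean should land near 1/(p*mu) = 2.0.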

  12. Analysis of mismatched heterointerfaces by combined HREM image processing and modelling

    Energy Technology Data Exchange (ETDEWEB)

    Moebus, G.; Inkson, B.J. [Univ. of Sheffield, Dept. of Engineering Materials, Sheffield (United Kingdom); Levay, A. [Eoetvoes Univ., Dept. of Solid State Physics, Budapest (Hungary); Hytch, M.J. [Centre d' Etudes de Chimie Metallurgique, CNRS, Vitry-sur-Seine (France); Trampert, A. [Paul-Drude-Inst., Berlin (Germany); Wagner, T. [Max-Planck-Inst. fuer Metallforschung, Stuttgart (Germany)

    2003-04-01

    Lattice-mismatched heterointerfaces are classified by a simple five-parameter configuration space which allows the following properties to be quantified: gradual partial coherence, distribution and localisation of misfit dislocations, anisotropy of strain fields, and elastic dissimilarity of the lattices. HREM images are digitally processed into one-dimensional strain and Fourier-spectrum profiles along the interface at selected distances from the interface. The interpretation of these profiles as a Fourier expansion of displacement waves is justified through a link to continuum modelling approaches presented earlier. Limitations and microscope conditions for this simple direct image-interpretation approach are listed and discussed. (orig.)

  13. Introduction of the cluster model of organisation of activity of light industry enterprises of Ukraine

    Directory of Open Access Journals (Sweden)

    Filippov Mykhaylo I.

    2013-03-01

    Full Text Available The article analyses world experience with the introduction of the cluster model of enterprise activity. It considers the main prerequisites for the creation, and the prospects, of clusters in Ukraine's light industry. It justifies the application of the cluster approach in Ukraine as a necessary condition for the revival of domestic production, increased efficiency of innovation-driven regional development, and achievement of a high level of economic development and competitiveness. It offers proposals on improving the state policy for developing innovation clusters in order to increase the competitiveness of the economy and ensure Ukraine's entry into the circle of economically developed countries of the world.

  14. Absence of disorder-driven metal-insulator transitions in simple holographic models

    CERN Document Server

    Grozdanov, Sašo; Sachdev, Subir; Schalm, Koenraad

    2015-01-01

    We study electrical transport in a strongly coupled strange metal in two spatial dimensions at finite temperature and charge density, holographically dual to Einstein-Maxwell theory in an asymptotically $\\mathrm{AdS}_4$ spacetime, with arbitrary spatial inhomogeneity, up to mild assumptions. In condensed matter, these are candidate models for exotic strange metals without long-lived quasiparticles. We prove that the electrical conductivity is bounded from below by a universal minimal conductance: the quantum critical conductivity of a clean, charge-neutral plasma. Beyond non-perturbatively justifying mean-field approximations to disorder, our work demonstrates the practicality of new hydrodynamic insight into holographic transport.

  15. Stochastic nonlinear mixed effects: a metformin case study.

    Science.gov (United States)

    Matzuka, Brett; Chittenden, Jason; Monteleone, Jonathan; Tran, Hien

    2016-02-01

    In nonlinear mixed effects (NLME) modeling, the intra-individual variability is a collection of errors due to assay sensitivity, dosing, and sampling, as well as model misspecification. Utilizing stochastic differential equations (SDEs) within the NLME framework allows the measurement errors to be decoupled from the model misspecification, which makes the SDE approach a novel tool for model refinement. Using metformin clinical pharmacokinetic (PK) data, model development with SDEs in population PK modeling was carried out to study the dynamics of the absorption rate. A base model was constructed and then refined by using the system noise terms of the SDEs to track model parameters and model misspecification. This provides the unique advantage of making no underlying assumptions about the structural model for the absorption process while quantifying insufficiencies in the current model. This article focuses on implementing the extended Kalman filter and unscented Kalman filter in an NLME framework for parameter estimation and model development, comparing the methodologies, and illustrating their challenges and utility. The Kalman filter algorithms were successfully implemented in NLME models using MATLAB, with run-time differences between the ODE and SDE methods comparable to the differences found by Kakhi for their stochastic deconvolution.
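The state-tracking role that the extended/unscented Kalman filters play inside the SDE-NLME framework can be illustrated, in heavily simplified form, by a scalar Kalman filter on a one-compartment elimination model (unit volume). All numbers are illustrative, not the metformin estimates:

```python
# Hedged sketch: scalar Kalman filter tracking a state that decays at rate ke,
# observed with noise.  q is process (system) noise, r is measurement noise;
# the SDE-NLME machinery uses the same predict/update cycle on richer models.
import math

def kalman_1c(observations, dt, ke, q, r, x0, p0):
    a = math.exp(-ke * dt)              # discrete-time transition factor
    x, p = x0, p0
    estimates = []
    for y in observations:
        x, p = a * x, a * a * p + q     # predict
        k = p / (p + r)                 # Kalman gain
        x = x + k * (y - x)             # update with the innovation
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

ke, dt = 0.3, 0.5
truth = [10.0 * math.exp(-ke * dt * (k + 1)) for k in range(20)]
noisy = [c + ((-1) ** k) * 0.3 for k, c in enumerate(truth)]
est = kalman_1c(noisy, dt, ke, q=0.01, r=0.09, x0=8.0, p0=4.0)
```

In the SDE approach described above, a nonzero estimated q flags model misspecification: the filter absorbs structure the ODE model cannot explain.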

  16. Web Services-Enhanced Agile Modeling and Integrating Business Processes

    CERN Document Server

    Belouadha, Fatima-Zahra; Roudiès, Ounsa

    2012-01-01

    In a global business context with continuous changes, enterprises have to enhance their operational efficiency, react more quickly, ensure the flexibility of their business processes, and build new collaboration pathways with external partners. To achieve this goal, they must use e-business methods, mechanisms and techniques while capitalizing on the potential of new information and communication technologies. In this context, we propose a standards-, model- and Web services-based approach for modeling and integrating agile enterprise business processes. The purpose is to benefit from Web services characteristics to enhance process design and realize dynamic integration. The choice of focusing on Web services is essentially justified by their broad adoption by enterprises as well as their capability to guarantee interoperability between both intra- and inter-enterprise systems. Thereby, we propose in this chapter a metamodel for describing business processes, and discuss their dynamic in...

  17. A dynamic P53-MDM2 model with time delay

    Energy Technology Data Exchange (ETDEWEB)

    Mihalas, Gh.I. [Department of Biophysics and Medical Informatics, University of Medicine and Pharmacy, Piata Eftimie Murgu, nr. 3, 300041 Timisoara (Romania)]. E-mail: mihalas@medinfo.umft.ro; Neamtu, M. [Department of Forecasting, Economic Analysis, Mathematics and Statistics, West University of Timisoara, Str. Pestalozzi, nr. 14A, 300115 Timisoara (Romania)]. E-mail: mihaela.neamtu@fse.uvt.ro; Opris, D. [Department of Applied Mathematics, West University of Timisoara, Bd. V. Parvan, nr. 4, 300223 Timisoara (Romania)]. E-mail: opris@math.uvt.ro; Horhat, R.F. [Department of Biophysics and Medical Informatics, University of Medicine and Pharmacy, Piata Eftimie Murgu, nr. 3, 300041 Timisoara (Romania)]. E-mail: rhorhat@yahoo.com

    2006-11-15

    Specific activator and repressor transcription factors, which bind to specific regulatory DNA sequences, play an important role in gene activity control. Interactions between genes coding for such transcription factors should explain the different stable, or sometimes oscillatory, gene activities characteristic of different tissues. Starting with the P53-MDM2 model described in [Mihalas GI, Simon Z, Balea G, Popa E. Possible oscillatory behaviour in P53-MDM2 interaction computer simulation. J Biol Syst 2000;8(1):21-9] and the process described in [Kohn KW, Pommier Y. Molecular interaction map of P53 and MDM2 logic elements, which control the off-on switch of P53 in response to DNA damage. Biochem Biophys Res Commun 2005;331:816-27], we developed a new model of this interaction. Choosing the delay as a bifurcation parameter, we study the direction and stability of the bifurcating periodic solutions. Some numerical examples are finally given to justify the theoretical results.

  18. A Novel Computer Virus Propagation Model under Security Classification

    Directory of Open Access Journals (Sweden)

    Qingyi Zhu

    2017-01-01

    Full Text Available In reality, some computers have a specific security classification. For the sake of safety and cost, the security level of computers is upgraded as threats in the network increase. Here we assume that there exists a threshold value which determines when countermeasures should be taken to raise the security of a fraction of computers with a low security level. In some specific realistic environments the propagation network can be regarded as fully interconnected. Inspired by these facts, this paper presents a novel computer virus dynamics model that accounts for the impact of security classification in a fully interconnected network. Using the theory of dynamic stability, the existence of equilibria and the stability conditions are analysed and proved, and the optimal threshold value is given analytically. Then, numerical experiments are performed to justify the model. Besides, some discussions and antivirus measures are given.
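A toy version of the threshold idea can be sketched with a fully-mixed SIS-type model in which crossing the threshold triggers security upgrades that lower the effective infection rate. This is not the paper's exact system; all parameters are made up:

```python
# Sketch: SIS-style dynamics di/dt = beta_eff*i*(1-i) - gamma*i on a fully
# interconnected network.  When the infected fraction i exceeds `threshold`,
# a share of low-security computers is upgraded, scaling beta down; the
# upgrade persists once triggered.  Parameters are illustrative.
def simulate_sis(beta, gamma, threshold, upgrade_factor, i0,
                 dt=0.01, steps=50000):
    i, beta_eff = i0, beta
    for _ in range(steps):
        if i > threshold:
            beta_eff = beta * upgrade_factor
        di = beta_eff * i * (1.0 - i) - gamma * i
        i += dt * di
    return i, beta_eff

# Without countermeasures R0 = beta/gamma = 3; after the upgrade the endemic
# level drops from 1 - gamma/beta toward 1 - gamma/beta_eff.
i_final, beta_eff = simulate_sis(beta=0.3, gamma=0.1, threshold=0.2,
                                 upgrade_factor=0.5, i0=0.01)
```

Here the trajectory grows past the 0.2 threshold, the countermeasure halves beta, and the system settles at the lower endemic equilibrium 1 - gamma/beta_eff = 1/3.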

  19. A risk management model for securing virtual healthcare communities.

    Science.gov (United States)

    Chryssanthou, Anargyros; Varlamis, Iraklis; Latsiou, Charikleia

    2011-01-01

    Virtual healthcare communities aim to bring together healthcare professionals and patients, improve the quality of healthcare services and assist healthcare professionals and researchers in their everyday activities. In a secure and reliable environment, patients share their medical data with doctors, expect confidentiality and demand reliable medical consultation. Apart from a concrete policy framework, several ethical, legal and technical issues must be considered in order to build a trustful community. This research emphasises on security issues, which can arise inside a virtual healthcare community and relate to the communication and storage of data. It capitalises on a standardised risk management methodology and a prototype architecture for healthcare community portals and justifies a security model that allows the identification, estimation and evaluation of potential security risks for the community. A hypothetical virtual healthcare community is employed in order to portray security risks and the solutions that the security model provides.

  20. EOQ Model for Time-Deteriorating Items Using Penalty cost

    Directory of Open Access Journals (Sweden)

    Meenakshi Srivastava

    2009-01-01

    Full Text Available In inventory, the utility of deteriorating items decreases with time. The degree of deterioration of product utility can be treated as a penalty cost in the inventory replenishment system. In this paper, we present an EOQ model for those perishable products which do not deteriorate for some period of time and after that continuously deteriorate with time and lose their importance. This loss can be incurred as a penalty cost to the wholesaler/retailer. The prime focus of our paper is to develop the EOQ model for time-deteriorating items using penalty cost with finite and infinite production rates. For simplicity, linear and exponential penalty cost functions have been considered as a measurement of the utility of the product. Theoretical expressions are obtained for the optimum inventory level and cycle time. All the theoretical developments are numerically justified.
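For reference, the classic EOQ baseline that such models extend balances ordering and holding costs, q* = sqrt(2DK/h). The paper adds a time-dependent penalty cost on top of this; the numbers below are illustrative:

```python
# Classic EOQ sketch (no deterioration penalty): D annual demand, K fixed
# ordering cost, h holding cost per unit per year.  At the optimum, annual
# ordering cost K*D/q equals annual holding cost h*q/2.
import math

def eoq(D, K, h):
    q = math.sqrt(2.0 * D * K / h)           # optimal order quantity
    cycle_time = q / D                        # reorder interval (years)
    total_cost = math.sqrt(2.0 * D * K * h)   # combined ordering + holding cost
    return q, cycle_time, total_cost

q, t, cost = eoq(D=1200.0, K=50.0, h=6.0)     # q ~ 141.4, cost ~ 848.5
```

Adding a penalty cost that grows with time-in-stock, as the paper does, shortens the optimal cycle time relative to this baseline.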

  1. A multi-criteria model for maintenance job scheduling

    Directory of Open Access Journals (Sweden)

    Sunday A. Oke

    2007-12-01

    Full Text Available This paper presents a multi-criteria maintenance job scheduling model, which is formulated using a weighted multi-criteria integer linear programming maintenance scheduling framework. Three criteria, which have a direct relationship with the primary objectives of a typical production setting, were used: minimization of equipment idle time, of manpower idle time, and of job lateness, with unit parity. The mathematical model, constrained by available equipment, manpower and job available time within the planning horizon, was tested on a 10-job, 8-hour time horizon problem with declared equipment and manpower availability set against the requirements. The results, analysis and illustrations justify the multi-criteria consideration. Thus, maintenance managers are equipped with a tool for adequate decision making that guards against errors in the accumulated data which may lead to wrong decisions. The idea presented is new, since it provides an approach that has not been documented previously in the literature.
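The weighted-sum objective can be illustrated on a toy instance, brute-forcing job orders on one crew. The paper uses integer linear programming with equipment and manpower constraints; the jobs, durations, due times and weights below are made up:

```python
# Toy multi-criteria scheduler: minimize w_late * total lateness
# + w_makespan * completion time over all orderings of three jobs.
# Data are illustrative, not the paper's 10-job instance.
from itertools import permutations

jobs = {"J1": (3, 4), "J2": (2, 2), "J3": (4, 10)}  # (duration, due time)

def weighted_cost(order, w_late=2.0, w_makespan=1.0):
    t, lateness = 0, 0
    for j in order:
        dur, due = jobs[j]
        t += dur
        lateness += max(0, t - due)       # tardiness of job j
    return w_late * lateness + w_makespan * t

best_order = min(permutations(jobs), key=weighted_cost)
best_cost = weighted_cost(best_order)     # ("J2", "J1", "J3"), cost 11.0
```

With a single crew the makespan term is constant, so the weighted objective reduces to minimizing tardiness, and the earliest-due-date order wins; with multiple resources, as in the paper, the criteria genuinely trade off.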

  2. Compartmental modeling and tracer kinetics

    CERN Document Server

    Anderson, David H

    1983-01-01

    This monograph is concerned with mathematical aspects of compartmental analysis. In particular, linear models are closely analyzed since they are fully justifiable as an investigative tool in tracer experiments. The objective of the monograph is to bring the reader up to date on some of the current mathematical problems of interest in compartmental analysis. This is accomplished by reviewing mathematical developments in the literature, especially over the last 10-15 years, and by presenting some new thoughts and directions for future mathematical research. These notes started as a series of lectures that I gave while visiting with the Division of Applied Mathematics, Brown University, 1979, and have developed into this collection of articles aimed at the reader with a beginning graduate level background in mathematics. The text can be used as a self-paced reading course. With this in mind, exercises have been appropriately placed throughout the notes. As an aid in reading the material, the end of a ...

  3. Modeling Clinical Radiation Responses in the IMRT Era

    Science.gov (United States)

    Schwartz, J. L.; Murray, D.; Stewart, R. D.; Phillips, M. H.

    2014-03-01

    The purpose of this review is to highlight the critical issues of radiobiological models, particularly as they apply to clinical radiation therapy. Developing models of radiation responses has a long history that continues to the present time. Many different models have been proposed, but in the field of radiation oncology, the linear-quadratic (LQ) model has had the most impact on the design of treatment protocols. Questions have been raised as to the value of the LQ model given that the biological assumption underlying it has been challenged by molecular analyses of cell and tissue responses to radiation. There are also questions as to use of the LQ model for hypofractionation, especially for high dose treatments using a single fraction. While the LQ model might over-estimate the effects of large radiation dose fractions, there is insufficient information to fully justify the adoption of alternative models. However, there is increasing evidence in the literature that non-targeted and other indirect effects of radiation sometimes produce substantial deviations from LQ-like dose-response curves. As preclinical and clinical hypofractionation studies accumulate, new or refined dose-response models that incorporate high-dose/fraction non-targeted and indirect effects may be required, but for now the LQ model remains a simple, useful tool to guide the design of treatment protocols.
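The LQ surviving fraction discussed above is S = exp(-n(alpha d + beta d^2)) for n fractions of dose d, with the biologically effective dose BED = n d (1 + d/(alpha/beta)) commonly used to compare fractionation schemes. A short sketch, using conventional illustrative parameter values:

```python
# Sketch of LQ-model quantities.  alpha = 0.3 Gy^-1, beta = 0.03 Gy^-2 and
# alpha/beta = 10 Gy are conventional illustrative values, not fitted data.
import math

def surviving_fraction(n, d, alpha=0.3, beta=0.03):
    """LQ cell survival after n fractions of dose d (Gy)."""
    return math.exp(-n * (alpha * d + beta * d * d))

def bed(n, d, alpha_beta=10.0):
    """Biologically effective dose for n fractions of d Gy."""
    return n * d * (1.0 + d / alpha_beta)

# Conventional 30 x 2 Gy vs a single 20 Gy hypofraction: quite different
# predicted effect despite comparable physical dose scales.
bed_conventional = bed(30, 2.0)   # 72 Gy
bed_single = bed(1, 20.0)         # 60 Gy
```

The quadratic term dominating at large d is exactly why the review questions LQ extrapolation to high single-fraction doses: small changes in beta swing the predicted effect strongly in that regime.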

  4. A numerical study of the alpha model for two-dimensional magnetohydrodynamic turbulent flows

    CERN Document Server

    Mininni, P D; Pouquet, A G

    2004-01-01

    We explore some consequences of the ``alpha model,'' also called the ``Lagrangian-averaged'' model, for two-dimensional incompressible magnetohydrodynamic (MHD) turbulence. This model is an extension of the smoothing procedure in fluid dynamics which filters velocity fields locally while leaving their associated vorticities unsmoothed, and has proved useful for high Reynolds number turbulence computations. We consider several known effects (selective decay, dynamic alignment, inverse cascades, and the probability distribution functions of fluctuating turbulent quantities) in magnetofluid turbulence and compare the results of numerical solutions of the primitive MHD equations with their alpha-model counterparts' performance for the same flows, in regimes where available resolution is adequate to explore both. The hope is to justify the use of the alpha model in regimes that lie outside currently available resolution, as will be the case in particular in three-dimensional geometry or for magnetic Prandtl number...

  5. On the validity of evolutionary models with site-specific parameters.

    Directory of Open Access Journals (Sweden)

    Konrad Scheffler

    Full Text Available Evolutionary models that make use of site-specific parameters have recently been criticized on the grounds that parameter estimates obtained under such models can be unreliable and lack theoretical guarantees of convergence. We present a simulation study providing empirical evidence that a simple version of the models in question does exhibit sensible convergence behavior and that additional taxa, despite not being independent of each other, lead to improved parameter estimates. Although it would be desirable to have theoretical guarantees of this, we argue that such guarantees would not be sufficient to justify the use of these models in practice. Instead, we emphasize the importance of taking the variance of parameter estimates into account rather than blindly trusting point estimates - this is standardly done by using the models to construct statistical hypothesis tests, which are then validated empirically via simulation studies.

  6. Modeling and simulation of cascading contingencies

    Science.gov (United States)

    Zhang, Jianfeng

    This dissertation proposes a new approach to model and study cascading contingencies in large power systems. The most important contribution of the work involves the development and validation of a heuristic analytic model to assess the likelihood of cascading contingencies, and the development and validation of a uniform search strategy. We model the probability of cascading contingencies as a function of power flow and power flow changes. Utilizing logistic regression, the proposed model is calibrated using real industry data. This dissertation analyzes random search strategies for Monte Carlo simulations and proposes a new uniform search strategy based on the Metropolis-Hastings Algorithm. The proposed search strategy is capable of selecting the most significant cascading contingencies, and it is capable of constructing an unbiased estimator to provide a measure of system security. This dissertation makes it possible to reasonably quantify system security and justify security operations when economic concerns conflict with reliability concerns in the new competitive power market environment. It can also provide guidance to system operators about actions that may be taken to reduce the risk of major system blackouts. Various applications can be developed to take advantage of the quantitative security measures provided in this dissertation.
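The logistic form used to model outage probability as a function of line loading and loading change can be sketched as follows; the coefficients are hypothetical placeholders, not the dissertation's industry-calibrated values:

```python
# Sketch: logistic model of line-outage probability as a function of the
# normalized power flow and its change.  Coefficients b0, b1, b2 are
# illustrative; the dissertation fits them to real industry data.
import math

def cascade_probability(flow_ratio, flow_change, b0=-6.0, b1=5.0, b2=3.0):
    """P(outage) = logistic(b0 + b1*flow/limit + b2*delta_flow/limit)."""
    z = b0 + b1 * flow_ratio + b2 * flow_change
    return 1.0 / (1.0 + math.exp(-z))

p_light = cascade_probability(0.4, 0.0)   # lightly loaded line: small risk
p_heavy = cascade_probability(1.1, 0.3)   # overloaded after a redispatch
```

In a Monte Carlo cascade simulation, each tripped line's flow is redistributed and these probabilities are re-evaluated, which is where the biased-vs-uniform search strategy question arises.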

  7. Recovery from schizophrenia and the recovery model.

    Science.gov (United States)

    Warner, Richard

    2009-07-01

    The recovery model refers to subjective experiences of optimism, empowerment and interpersonal support, and to a focus on collaborative treatment approaches, finding productive roles for user/consumers, peer support and reducing stigma. The model is influencing service development around the world. This review will assess whether optimism about outcome from serious mental illness and other tenets of the recovery model are borne out by recent research. Remission of symptoms has been precisely defined, but the definition of 'recovery' is a more diffuse concept that includes such factors as being productive and functioning independently. Recent research and a large, earlier body of data suggest that optimism about outcome from schizophrenia is justified. A substantial proportion of people with the illness will recover completely and many more will regain good social functioning. Outcome is better for people in the developing world. Mortality for people with schizophrenia is increasing but is lower in the developing world. Working appears to help people recover from schizophrenia, and recent advances in vocational rehabilitation have been shown to be effective in countries with differing economies and labor markets. A growing body of research supports the concept that empowerment is an important component of the recovery process. Key tenets of the recovery model - optimism about recovery from schizophrenia, the importance of access to employment and the value of empowerment of user/consumers in the recovery process - are supported by the scientific research. Attempts to reduce the internalized stigma of mental illness should enhance the recovery process.

  8. QUASI-STATIC MODEL OF MAGNETICALLY COLLIMATED JETS AND RADIO LOBES. II. JET STRUCTURE AND STABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Colgate, Stirling A.; Li, Hui [Theoretical Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Fowler, T. Kenneth [University of California, Berkeley, CA 94720 (United States); Hooper, E. Bickford [Lawrence Livermore National Laboratory, Livermore, CA 94550 (United States); McClenaghan, Joseph; Lin, Zhihong [University of California, Irvine, CA 92697 (United States)

    2015-11-10

    This is the second in a series of companion papers showing that when an efficient dynamo can be maintained by accretion disks around supermassive black holes in active galactic nuclei, it can lead to the formation of a powerful, magnetically driven, and mediated helix that could explain both the observed radio jet/lobe structures and ultimately the enormous power inferred from the observed ultrahigh-energy cosmic rays. In the first paper, we showed self-consistently that minimizing viscous dissipation in the disk naturally leads to jets of maximum power with boundary conditions known to yield jets as a low-density, magnetically collimated tower, consistent with observational constraints of wire-like currents at distances far from the black hole. In this paper we show that these magnetic towers remain collimated as they grow in length at nonrelativistic velocities. Differences with relativistic jet models are explained by three-dimensional magnetic structures derived from a detailed examination of stability properties of the tower model, including a broad diffuse pinch with current profiles predicted by a detailed jet solution outside the collimated central column treated as an electric circuit. We justify our model in part by the derived jet dimensions in reasonable agreement with observations. Using these jet properties, we also discuss the implications for relativistic particle acceleration in nonrelativistically moving jets. The appendices justify the low jet densities yielding our results and speculate how to reconcile our nonrelativistic treatment with general relativistic MHD simulations.

  9. Flexible boosting of accelerated failure time models

    Directory of Open Access Journals (Sweden)

    Hothorn Torsten

    2008-06-01

    Full Text Available Abstract Background When boosting algorithms are used for building survival models from high-dimensional data, it is common to fit a Cox proportional hazards model or to use least squares techniques for fitting semiparametric accelerated failure time models. There are cases, however, where fitting a fully parametric accelerated failure time model is a good alternative to these methods, especially when the proportional hazards assumption is not justified. Boosting algorithms for the estimation of parametric accelerated failure time models have not been developed so far, since these models require the estimation of a model-specific scale parameter which traditional boosting algorithms are not able to deal with. Results We introduce a new boosting algorithm for censored time-to-event data which is suitable for fitting parametric accelerated failure time models. Estimation of the predictor function is carried out simultaneously with the estimation of the scale parameter, so that the negative log likelihood of the survival distribution can be used as a loss function for the boosting algorithm. The estimation of the scale parameter does not affect the favorable properties of boosting with respect to variable selection. Conclusion The analysis of a high-dimensional set of microarray data demonstrates that the new algorithm is able to outperform boosting with the Cox partial likelihood when the proportional hazards assumption is questionable. In low-dimensional settings, i.e., when classical likelihood estimation of a parametric accelerated failure time model is possible, simulations show that the new boosting algorithm closely approximates the estimates obtained from the maximum likelihood method.
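
    The simultaneous estimation of the scale parameter can be sketched in a few lines. The following is a hedged toy illustration, not the authors' implementation: componentwise boosting of a log-normal AFT model on invented, uncensored data, where the loss is the negative log-likelihood and the scale parameter sigma is re-estimated from the residuals after every boosting step.

```python
import math
import random

# Hedged sketch (not the paper's algorithm or data): componentwise
# boosting of a log-normal AFT model on toy, uncensored survival times.
# sigma is re-estimated alongside the predictor at every step.

random.seed(0)
n, p = 200, 5
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
# true model: log T = 2*x0 - 1*x2 + noise; x1, x3, x4 are irrelevant
logT = [2 * x[0] - 1 * x[2] + random.gauss(0, 0.5) for x in X]

beta = [0.0] * p
intercept = sum(logT) / n
nu = 0.1  # boosting step size

for step in range(300):
    resid = [y - intercept - sum(b * xi for b, xi in zip(beta, x))
             for x, y in zip(X, logT)]
    # componentwise base learner: pick the single covariate whose
    # least-squares fit to the residuals reduces the loss the most
    best_j, best_coef, best_score = 0, 0.0, -1.0
    for j in range(p):
        sxx = sum(x[j] * x[j] for x in X)
        sxy = sum(x[j] * r for x, r in zip(X, resid))
        coef = sxy / sxx
        score = coef * sxy  # reduction in residual sum of squares
        if score > best_score:
            best_j, best_coef, best_score = j, coef, score
    beta[best_j] += nu * best_coef
    # simultaneous scale update: ML estimate of sigma given current fit
    resid = [y - intercept - sum(b * xi for b, xi in zip(beta, x))
             for x, y in zip(X, logT)]
    sigma = math.sqrt(sum(r * r for r in resid) / n)

print([round(b, 2) for b in beta], round(sigma, 2))
```

    Note that the coefficients of the irrelevant covariates stay near zero, illustrating the abstract's point that the scale update does not spoil boosting's variable-selection behaviour.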

  10. Small is beautiful: models of small neuronal networks.

    Science.gov (United States)

    Lamb, Damon G; Calabrese, Ronald L

    2012-08-01

    Modeling has contributed a great deal to our understanding of how individual neurons and neuronal networks function. In this review, we focus on models of the small neuronal networks of invertebrates, especially rhythmically active CPG networks. Models have elucidated many aspects of these networks, from identifying key interacting membrane properties to pointing out gaps in our understanding, for example missing neurons. Even the complex CPGs of vertebrates, such as those that underlie respiration, have been reduced to small network models to great effect. Modeling of these networks spans from simplified models, which are amenable to mathematical analyses, to very complicated biophysical models. Some researchers have now adopted a population approach, where they generate and analyze many related models that differ in a few to several judiciously chosen free parameters; often these parameters show variability across animals and thus justify the approach. Models of small neuronal networks will continue to expand and refine our understanding of how neuronal networks in all animals program motor output, process sensory information and learn.

  11. Validity of the electrical model representation of the effects of nuclear magnetic resonance (1961); Validite de la representation par modele electrique des effets de resonance magnetique nucleaire (1961)

    Energy Technology Data Exchange (ETDEWEB)

    Bonnet, G. [Commissariat a l' Energie Atomique, Grenoble (France). Centre d' Etudes Nucleaires

    1961-07-01

    When studying the behaviour of a magnetic resonance transducer formed by the association of an electrical network and of a set of nuclear spins, it is possible to bring about a representation that is analytically equivalent by means of an entirely electrical model, available for transients as well as steady-state. A detailed study of the validity conditions justifies its use in most cases. Also proposed is a linearity criterion of Bloch's equations in transient state that is simply the prolongation of the well-known condition of non-saturation in the steady-state. (author) [French] L'etude du comportement d'un transducteur a resonance magnetique forme de l'association d'un reseau electrique et d'un ensemble de noyaux dotes de spin, montre qu'il est possible d'en deduire une representation analytiquement equivalente au moyen d'un modele entierement electrique utilisable pour un regime transitoire aussi bien que pour un regime permanent. Une etude detaillee des conditions de validite permet d'en justifier l'emploi dans la majorite des cas. On propose enfin un critere de linearite des equations de Bloch en regime transitoire, qui constitue un prolongement de la condition connue de non-saturation en regime stationnaire. (auteur)

  12. A General Strategy for Physics-Based Model Validation Illustrated with Earthquake Phenomenology, Atmospheric Radiative Transfer, and Computational Fluid Dynamics

    CERN Document Server

    Sornette, Didier; Kamm, James R; Ide, Kayo

    2007-01-01

    Validation is often defined as the process of determining the degree to which a model is an accurate representation of the real world from the perspective of its intended uses. Validation is crucial as industries and governments depend increasingly on predictions by computer models to justify their decisions. In this article, we survey the model validation literature and propose to formulate validation as an iterative construction process that mimics the process occurring implicitly in the minds of scientists. We thus offer a formal representation of the progressive build-up of trust in the model, and thereby replace incapacitating claims on the impossibility of validating a given model by an adaptive process of constructive approximation. This approach is better adapted to the fuzzy, coarse-grained nature of validation. Our procedure factors in the degree of redundancy versus novelty of the experiments used for validation as well as the degree to which the model predicts the observations. We illustrate the n...

  13. Viscoelastic Modelling of Road Deflections for use with the Traffic Speed Deflectometer

    DEFF Research Database (Denmark)

    Pedersen, Louis

    This Ph.D. study is at its core about how asphalt and road structures respond to dynamic loads. Existing models for the deflections under a moving load using beam equations are revisited, and it is concluded that they leave room for improvement for the particular setup and problem at hand. Then a diff... of a generalized Maxwell model. Validations by comparison to ViscoRoute simulations are also made. This justifies the use of the Laplace FEM for generating simulated data using a Huet-Sayegh model for the visco-elastic behaviour of asphalt. These simulated data, along with measured data, are then used to suggest...

  14. Dynamics of Social Group Competition: Modeling the Decline of Religious Affiliation

    Science.gov (United States)

    Abrams, Daniel M.; Yaple, Haley A.; Wiener, Richard J.

    2011-08-01

    When social groups compete for members, the resulting dynamics may be understandable with mathematical models. We demonstrate that a simple ordinary differential equation (ODE) model is a good fit for religious shift by comparing it to a new international data set tracking religious nonaffiliation. We then generalize the model to include the possibility of nontrivial social interaction networks and examine the limiting case of a continuous system. Analytical and numerical predictions of this generalized system, which is robust to polarizing perturbations, match those of the original ODE model and justify its agreement with real-world data. The resulting predictions highlight possible causes of social shift and suggest future lines of research in both physics and sociology.
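
    As a rough illustration of this kind of two-group competition ODE (the functional form and all parameters below are assumptions for illustration, not the paper's fitted model or data), a forward-Euler integration shows the predicted takeover by the group perceived as more attractive:

```python
# Hedged sketch of a two-group competition ODE (illustrative assumptions,
# not the paper's fitted model): x is the fraction in one group, u_x its
# perceived utility, and P(x, u) = c * x**a * u the switching propensity.

def simulate(x0, u_x, a=1.0, c=1.0, dt=0.01, steps=20000):
    x = x0
    for _ in range(steps):
        p_to_x = c * x**a * u_x              # propensity to join group x
        p_to_y = c * (1 - x)**a * (1 - u_x)  # propensity to leave group x
        x += dt * ((1 - x) * p_to_x - x * p_to_y)
    return x

# If the group is perceived as more attractive (u_x > 0.5), the model
# predicts it eventually absorbs the whole population.
print(round(simulate(x0=0.1, u_x=0.6), 3))  # prints 1.0
```

    With a = 1 the dynamics reduce to logistic growth at rate proportional to 2*u_x - 1, so the minority group grows whenever its perceived utility exceeds one half, and shrinks otherwise.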

  15. Geophysical models of heat and fluid flow in damageable poro-elastic continua

    Science.gov (United States)

    Roubíček, Tomáš

    2017-03-01

    A rather general model for fluid and heat transport in poro-elastic continua undergoing possibly also plastic-like deformation and damage is developed with the goal to cover various specific models of rock rheology used in geophysics of Earth's crust. Nonconvex free energy at small elastic strains, gradient theories (in particular the concept of second-grade nonsimple continua), and Biot poro-elastic model are employed, together with possible large displacement due to large plastic-like strains evolving during long time periods. Also the additive splitting is justified in stratified situations which are of interest in modelling of lithospheric crust faults. Thermodynamically based formulation includes entropy balance (in particular the Clausius-Duhem inequality) and an explicit global energy balance. It is further outlined that the energy balance can be used to ensure, under suitable data qualification, existence of a weak solution and stability and convergence of suitable approximation schemes at least in some particular situations.

  17. Analysis of discrete-to-discrete imaging models for iterative tomographic image reconstruction and compressive sensing

    CERN Document Server

    Jørgensen, Jakob H; Pan, Xiaochuan

    2011-01-01

    Discrete-to-discrete imaging models for computed tomography (CT) are becoming increasingly ubiquitous as the interest in iterative image reconstruction algorithms has heightened. Despite this trend, all the intuition for algorithm and system design derives from analysis of continuous-to-continuous models such as the X-ray and Radon transform. While the similarity between these models justifies some crossover, questions such as what are sufficient sampling conditions can be quite different for the two models. This sampling issue is addressed extensively in the first half of the article using singular value decomposition analysis for determining sufficient number of views and detector bins. The question of full sampling for CT is particularly relevant to current attempts to adapt compressive sensing (CS) motivated methods to application in CT image reconstruction. The second half goes in depth on this subject and discusses the link between object sparsity and sufficient sampling for accurate reconstruction. Par...
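
    A toy stand-in for this kind of sampling analysis (using matrix rank via Gaussian elimination rather than a full singular value decomposition, and a deliberately tiny 2x2-pixel object rather than any setup from the article): the rank of the discrete-to-discrete system matrix shows directly whether a given set of views suffices to determine the object.

```python
# Toy illustration (not the authors' setup): for a 2x2 pixel object,
# check whether a set of discrete ray sums determines all four pixels
# by computing the rank of the discrete-to-discrete system matrix.

def matrix_rank(rows, tol=1e-10):
    """Rank via Gauss-Jordan elimination with partial pivoting."""
    m = [list(r) for r in rows]
    rank, col, n_cols = 0, 0, len(m[0])
    while rank < len(m) and col < n_cols:
        pivot = max(range(rank, len(m)), key=lambda i: abs(m[i][col]))
        if abs(m[pivot][col]) < tol:
            col += 1
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        for i in range(len(m)):
            if i != rank and abs(m[i][col]) > tol:
                f = m[i][col] / m[rank][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[rank])]
        rank += 1
        col += 1
    return rank

# pixels ordered [a, b, c, d] for the object [[a, b], [c, d]]
rows_h = [[1, 1, 0, 0], [0, 0, 1, 1]]        # horizontal ray sums
rows_v = [[1, 0, 1, 0], [0, 1, 0, 1]]        # vertical ray sums
rows_d = [[1, 0, 0, 1], [0, 1, 1, 0]]        # diagonal ray sums

print(matrix_rank(rows_h))                    # 2: one view undersamples
print(matrix_rank(rows_h + rows_v))           # 3: still one null vector
print(matrix_rank(rows_h + rows_v + rows_d))  # 4: fully determined
```

    The middle case is the interesting one: horizontal plus vertical sums give four measurements but only rank three, so a null-space component remains invisible no matter how accurately the data are measured, which is exactly the kind of conclusion an SVD of the system matrix makes quantitative.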

  18. Cortes' Multicultural Empowerment Model and Generative Teaching and Learning in Science

    Science.gov (United States)

    Loving, Cathleen C.

    Using Cortes' Multicultural Empowerment Model as a guide, and a moderate rational, realist philosophical framework (somewhat broadened by a postmodern perspective), I adapt Cortes' model to science teaching and to Wittrock's Model of Generative Learning and Teaching in science. My goal is to develop and demonstrate a balanced multicultural approach to teaching children of different ethnic cultures about the nature of science - one that both values and teaches their cultures and beliefs, while moving them towards important mainstream notions of good science. I justify the Cortes model by comparing it to other major multicultural approaches. I then interweave Cortes' notion of multicultural empowerment with Wittrock's generative attributes, using a lesson about plants as an example. The intent is to succeed not only in having all children learn science, but also learn about science.

  19. AN ACCURATE MODELING OF DELAY AND SLEW METRICS FOR ON-CHIP VLSI RC INTERCONNECTS FOR RAMP INPUTS USING BURR’S DISTRIBUTION FUNCTION

    Directory of Open Access Journals (Sweden)

    Rajib Kar

    2010-09-01

    Full Text Available This work presents an accurate and efficient model to compute the delay and slew metrics of on-chip interconnects in high-speed CMOS circuits for ramp inputs. Our metric assumption is based on the Burr distribution function. The Burr distribution is used to characterize the normalized homogeneous portion of the step response. We used the PERI (Probability distribution function Extension for Ramp Inputs) technique, which extends delay and slew metrics for step inputs to the more general and realistic non-step inputs. The accuracy of our models is justified by comparing the results with those of SPICE simulations.
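
    The convenience of a Burr-based response model is that delay and slew fall out in closed form from the inverse CDF. A hedged sketch, assuming a Burr Type XII CDF for the normalized step response with hypothetical shape parameters (not values fitted to any circuit):

```python
import math

# Hedged sketch: model the normalized step response at a node by a Burr
# Type XII CDF, F(t) = 1 - (1 + (t/s)**c)**(-k). The 50% delay and the
# 10%-90% slew then follow in closed form from the inverse CDF. The
# shape parameters below are hypothetical, not fitted moment values.

def burr_inverse(F, c, k, s=1.0):
    """Time at which the modeled step response reaches fraction F."""
    return s * ((1 - F) ** (-1.0 / k) - 1) ** (1.0 / c)

c, k = 2.0, 1.5
delay = burr_inverse(0.5, c, k)                          # 50% crossing
slew = burr_inverse(0.9, c, k) - burr_inverse(0.1, c, k)  # 10%-90%
print(round(delay, 3), round(slew, 3))
```

    In a metric of this kind the shape parameters would be chosen to match the circuit moments of the RC network, which is what reduces the computation to a handful of closed-form evaluations per node.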

  20. Evaluating Emulation-based Models of Distributed Computing Systems.

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T.

    2017-10-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The various uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  1. A robust conditional approximation of marginal tail probabilities.

    OpenAIRE

    Brazzale, A. R.; Ventura, L.

    2001-01-01

    The aim of this contribution is to derive a robust approximate conditional procedure used to eliminate nuisance parameters in regression and scale models. Unlike the approximations to exact conditional solutions based on the likelihood function and on the maximum likelihood estimator, the robust conditional approximation of marginal tail probabilities does not suffer from lack of robustness to model misspecification. To assess the performance of the proposed robust conditional procedure the r...

  2. CONCEPTUAL AND METHODOLOGICAL MISTAKES IN PSYCHOLOGY AND HEALTH: A CASE STUDY ON THE USE AND ABUSE OF STRUCTURAL EQUATION MODELLING

    Directory of Open Access Journals (Sweden)

    Julio Alfonso Piña López

    2016-09-01

    Full Text Available In this article, we analyse a research paper that was justified on the basis of developmental psychopathology theory and the concepts of protective factors, self-regulation, resilience, and quality of life among individuals living with type 2 diabetes and hypertension. Structural equation modelling (SEM) was used for the data analysis. Although the authors conclude that the data are consistent with the theory tested, they commit errors of logic, concept, methodology and interpretation which, taken together, demonstrate a flagrant rupture between the theory and the data.

  3. Coordinate Reference System Metadata in Interdisciplinary Environmental Modeling

    Science.gov (United States)

    Blodgett, D. L.; Arctur, D. K.; Hnilo, J.; Danko, D. M.; Rutledge, G. K.

    2011-12-01

    For global climate modeling based on a unit sphere, the positional accuracy of transformations between "real earth" coordinates and the spherical earth coordinates is practically irrelevant due to the coarse grid and precision of global models. Consequently, many climate models are driven by data using real-earth coordinates without transforming them to the shape of the model grid. Additionally, metadata to describe the earth shape and its relationship to latitude longitude demarcations, or datum, used for model output is often left unspecified or ambiguous. Studies of weather and climate effects on coastal zones, water resources, agriculture, biodiversity, and other critical domains typically require positional accuracy on the order of several meters or less. This precision requires that a precise datum be used and accounted for in metadata. While it may be understood that climate model results using spherical earth coordinates could not possibly approach this level of accuracy, precise coordinate reference system metadata is nevertheless required by users and applications integrating climate and geographic information. For this reason, data publishers should provide guidance regarding the appropriate datum to assume for their data. Without some guidance, analysts must make assumptions they are uncomfortable or unwilling to make and may spend inordinate amounts of time researching the correct assumption to make. A consequence of the (practically justified for global climate modeling) disregard for datums is that datums are also neglected when publishing regional or local scale climate and weather data where datum information may be important. For example, observed data, like precipitation and temperature measurements, used in downscaling climate model results are georeferenced precisely. If coordinate reference system metadata are disregarded in cases like this, systematic biases in geolocation can result. Additionally, if no datum transformation was applied to
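
    The size of the earth-shape effect the passage describes is easy to bound. As a back-of-envelope illustration (not a full datum transformation), the offset between geodetic latitude on the WGS84 ellipsoid and geocentric latitude on a sphere is harmless at global-model resolution but is orders of magnitude beyond the meters-level accuracy that local and regional studies require:

```python
import math

# Back-of-envelope illustration (not a full datum transformation): the
# offset between geodetic latitude (WGS84 ellipsoid) and geocentric
# latitude (sphere) at mid-latitudes, converted to a rough ground
# distance. E2 is the WGS84 first eccentricity squared.

E2 = 0.00669437999014

def geocentric_latitude(geodetic_deg):
    phi = math.radians(geodetic_deg)
    return math.degrees(math.atan((1 - E2) * math.tan(phi)))

lat = 45.0
offset_deg = lat - geocentric_latitude(lat)
offset_km = offset_deg * 111.0  # ~111 km per degree of latitude
print(round(offset_deg, 3), "deg ~", round(offset_km, 1), "km")
```

    An offset of roughly 20 km at 45 degrees latitude makes the paragraph's point concrete: treating spherical-earth coordinates as real-earth coordinates (or vice versa) without stating the assumption can silently bias geolocation far beyond what coastal, agricultural, or hydrological applications tolerate.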

  4. Giant vesicles "colonies": a model for primitive cell communities.

    Science.gov (United States)

    Carrara, Paolo; Stano, Pasquale; Luisi, Pier Luigi

    2012-07-09

    Current research on the origin of life typically focuses on the self-organisation of molecular components in individual cell-like compartments, thereby bringing about the emergence of self-sustaining minimal cells. This is justified by the fact that single cells are the minimal forms of life. No attempts have been made to investigate the cooperative mechanisms that could derive from the assembly of individual compartments. Here we present a novel experimental approach based on vesicles "colonies" as a model of primitive cell communities. Experiments show that several advantages could have favoured primitive cell colonies when compared with isolated primitive cells. In fact there are two novel unexpected features typical of vesicle colonies, namely solute capture and vesicle fusion, which can be seen as the basic physicochemical mechanisms at the origin of life.

  5. Risk Assessment For Spreadsheet Developments: Choosing Which Models to Audit

    CERN Document Server

    Butler, Raymond J

    2008-01-01

    Errors in spreadsheet applications and models are alarmingly common (some authorities, with justification, cite spreadsheets containing errors as the norm rather than the exception). Faced with this body of evidence, the auditor confronts a huge task - the temptation may be to launch code inspections for every spreadsheet in an organisation. This can be very expensive and time-consuming. This paper describes risk assessment based on the "SpACE" audit methodology used by HM Customs & Excise's tax inspectors. This allows the auditor to target resources on the spreadsheets posing the highest risk of error, and to justify the deployment of those resources to managers and clients. Since the opposite of audit risk is audit assurance, the paper also offers an overview of some elements of good practice in the use of spreadsheets in business.

  6. Electromechanical modelling of tapered ionic polymer metal composites transducers

    Directory of Open Access Journals (Sweden)

    Rakesha Chandra Dash

    2016-09-01

    Full Text Available Ionic polymer metal composites (IPMCs) are relatively new smart materials that exhibit a bidirectional electromechanical coupling. IPMCs have a large number of important engineering applications such as micro robotics, biomedical devices, biomimetic robotics, etc. This paper presents a comparison between tapered and uniform cantilevered Nafion-based IPMC transducers. Electromechanical modelling is done for the tapered beam. Thickness can be varied according to the requirement of force and deflection. Numerical results pertaining to the force and deflection characteristics of both types of IPMC transducer are obtained. It is shown that the desired amount of force and deflection for tapered IPMCs can be achieved for a given voltage. Different fixed-end (t0) and free-end (t1) thickness values have been taken to justify the results using MATLAB.

  7. Principles of the Proposed Czech Postal Sector Price Control Model

    Directory of Open Access Journals (Sweden)

    Libor Švadlenka

    2009-01-01

    Full Text Available The paper deals with postal sector control. It starts from control theory and proves the justifiability of control in the postal sector. With regard to price control, it proceeds from the EU Directive 97/67/EC requirements on such control and states the individual types of price control, focusing on the ineffective price control currently used in the Czech postal sector (especially within domestic services), and proposes a more effective method of price control. The paper also discusses the principles of the proposed method of price control for the Czech postal sector. It describes the concrete fulfilment of the price control model resulting from the price-cap and tariff formula RPI-X and concentrates on its quantitative expression. The application of the proposed model is carried out for a hypothetical period in the past (in order to compare it with the current control system) for the letter items tariff basket.

  8. Leadership Models.

    Science.gov (United States)

    Freeman, Thomas J.

    This paper discusses six different models of organizational structure and leadership, including the scalar chain or pyramid model, the continuum model, the grid model, the linking pin model, the contingency model, and the circle or democratic model. Each model is examined in a separate section that describes the model and its development, lists…

  9. Inhalation Exposure Input Parameters for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    K. Rautenstrauch

    2004-09-10

    This analysis is one of 10 reports that support the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN) biosphere model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents development of input parameters for the biosphere model that are related to atmospheric mass loading and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the total system performance assessment (TSPA) for a Yucca Mountain repository. Inhalation Exposure Input Parameters for the Biosphere Model is one of five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the plan for development of the biosphere abstraction products for TSPA, as identified in the Technical Work Plan for Biosphere Modeling and Expert Support (BSC 2004 [DIRS 169573]). This analysis report defines and justifies values of mass loading for the biosphere model. Mass loading is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Mass loading values are used in the air submodel of ERMYN to calculate concentrations of radionuclides in air inhaled by a receptor and concentrations in air surrounding crops. Concentrations in air to which the receptor is exposed are then used in the inhalation submodel to calculate the dose contribution to the receptor from inhalation of contaminated airborne particles. Concentrations in air surrounding plants are used in the plant submodel to calculate the concentrations of radionuclides in foodstuffs contributed from uptake by foliar interception.
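
    The role a mass-loading parameter plays in an inhalation pathway can be sketched with simple arithmetic. The following is an illustrative chain only; every number is a hypothetical placeholder, not an ERMYN parameter value or Yucca Mountain data:

```python
# Illustrative sketch of how a mass-loading parameter enters an
# inhalation dose calculation. All numbers are hypothetical placeholders,
# not ERMYN parameter values or Yucca Mountain data.

mass_loading = 6.0e-5      # g/m^3, resuspended particles in air
soil_activity = 1.0e3      # Bq/g, radionuclide concentration in soil
breathing_rate = 1.2       # m^3/h
exposure_time = 2000.0     # h/yr
dose_coefficient = 5.0e-8  # Sv/Bq inhaled (nuclide-specific)

air_concentration = mass_loading * soil_activity  # Bq/m^3 in air
intake = air_concentration * breathing_rate * exposure_time  # Bq/yr
annual_dose = intake * dose_coefficient  # Sv/yr
print(f"{annual_dose:.2e} Sv/yr")
```

    The first line is the air submodel step described in the abstract: mass loading converts an activity concentration in resuspendable material into an activity concentration in breathable air, which the inhalation submodel then turns into a dose.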

  10. CEAI: CCM-based email authorship identification model

    Directory of Open Access Journals (Sweden)

    Sarwat Nizamani

    2013-11-01

    Full Text Available In this paper we present a model for email authorship identification (EAI) by employing a Cluster-based Classification (CCM) technique. Traditionally, stylometric features have been successfully employed in various authorship analysis tasks; we extend the traditional feature set to include some more interesting and effective features for email authorship identification (e.g., the last punctuation mark used in an email, the tendency of an author to use capitalization at the start of an email, or the punctuation after a greeting or farewell). We also included content features selected by Info Gain. It is observed that the use of such features in the authorship identification process has a positive impact on the accuracy of the authorship identification task. We performed experiments to justify our arguments and compared the results with other baseline models. Experimental results reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms the state-of-the-art support vector machine (SVM)-based models, as well as the models proposed by Iqbal et al. (2010, 2013) [1,2]. The proposed model attains an accuracy rate of 94% for 10 authors, 89% for 25 authors, and 81% for 50 authors on the Enron dataset, while 89.5% accuracy has been achieved on the authors' constructed real email dataset. The results on the Enron dataset have been achieved on quite a large number of authors as compared to the models proposed by Iqbal et al. [1,2].
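
    The kind of extended stylometric features the abstract mentions can be sketched in a short extraction function. This is a hedged illustration only; the feature names and exact definitions below are invented for the example, not the paper's feature specification:

```python
import string

# Hedged sketch of stylometric feature extraction of the kind the
# abstract describes (last punctuation mark, capitalization at the start
# of the email). Feature names and definitions are illustrative, not
# the paper's specification.

def extract_features(email_text: str) -> dict:
    text = email_text.strip()
    punct = [ch for ch in text if ch in string.punctuation]
    words = text.split()
    return {
        "last_punctuation": punct[-1] if punct else "",
        "starts_capitalized": bool(text) and text[0].isupper(),
        "avg_word_length": sum(map(len, words)) / len(words) if words else 0.0,
        "exclamation_rate": text.count("!") / max(len(words), 1),
    }

features = extract_features("Hi John,\nPlease send the report today!")
print(features["last_punctuation"], features["starts_capitalized"])
```

    Feature vectors like these, computed per email, would then feed the clustering step of a cluster-based classifier or any downstream learner.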

  11. Formation of an Integrated Stock Price Forecast Model in Lithuania

    Directory of Open Access Journals (Sweden)

    Audrius Dzikevičius

    2016-12-01

    Full Text Available Technical and fundamental analyses are widely used to forecast stock prices due to lack of knowledge of other modern models and methods such as the Residual Income Model, ANN-APGARCH, Support Vector Machines, Probabilistic Neural Networks and Genetic Fuzzy Systems. Although stock price forecast models integrating both technical and fundamental analyses are currently used widely, their integration is not justified comprehensively enough. This paper discusses theoretical one-factor and multi-factor stock price forecast models already applied by investors at a global level and determines the possibility of creating and applying in practice a stock price forecast model which integrates fundamental and technical analysis with reference to the Lithuanian stock market. The research aims to determine the relationship between the stock prices of the 14 Lithuanian companies listed in the Main List of Nasdaq OMX Baltic and various fundamental variables. Based on the results of correlation and regression analysis and application of the Chi-squared test and the ANOVA method, a general stock price forecast model is generated. This paper discusses practical implications of how the developed model can be used to forecast stock prices by individual investors and suggests additional check measures.

  12. [Detection of thyroid dysfunction in pregnant women: universal screening is justified].

    Science.gov (United States)

    Vila, Lluís; Velasco, Inés; González, Stella; Morales, Francisco; Sánchez, Emilia; Lailla, José Maria; Martinez-Astorquiza, Txanton; Puig-Domingo, Manel

    2012-11-03

    There is a controversy among different scientific societies in relation to the recommendations on whether universal screening for the detection of thyroid dysfunction during gestation should be performed or not. Although various studies have shown an association between subclinical hypothyroidism or hypothyroxinemia with obstetric problems and/or neurocognitive impairment in the offspring, no evidence on the possible positive effects of treatment of such conditions with thyroxin has been demonstrated so far. However, there is a general agreement about the need for treatment of clinical hypothyroidism during pregnancy and the risks of not doing so. Because it is a common, easily diagnosed and effectively treated disorder without special risk, the working Group of Iodine Deficiency Disorders and Thyroid Dysfunction of the Spanish Society of Endocrinology and Nutrition and Spanish Society of Gynaecology and Obstetrics recommends an early evaluation (before week 10) of thyroid function in all pregnant women. Given the complex physiology of thyroid function during pregnancy, hormone assessment should be performed according to reference values for each gestational trimester and generated locally in each reference laboratory. Thyrotropin determination would be sufficient for screening purposes and only if it is altered, free thyroxin or total thyroxin would be required. Adequate iodine nutrition is also highly recommended before and during pregnancy to contribute to a normal thyroid function in the pregnant women and fetus.

  13. Widespread ocular use of topical chloramphenicol: is there justifiable concern regarding idiosyncratic aplastic anaemia?

    Science.gov (United States)

    McGhee, C N; Anastas, C N

    1996-02-01

    A theoretical but as yet not conclusively proved risk of chloramphenicol induced idiosyncratic aplastic anaemia exists with topical ophthalmic therapy, with the absolute, but highly improbable, maximum risk of death (equalling that of systemic therapy) being 1 in 50,000 to 90,000. To put this in realistic perspective, one must note that the comparable risk of fatal anaphylaxis resulting from penicillin therapy, from any route, is similar at 1 in 100,000. Indeed, it has been noted recently that with more than 200 million ocular chloramphenicol products dispensed in the UK in the past 10 years, only 11 reports (all non-fatal) of suspected topical chloramphenicol induced blood dyscrasia have been reported to the Committee on the Safety of Medicines since 1966. One also has to consider that inadvertent exposure to minute quantities of chloramphenicol (ng/ml) may occur through consumption of livestock that have been treated with chloramphenicol. Broad statements condemning topical chloramphenicol need to be tempered with its proved safety, tolerance, cost, and efficacy while acknowledging an extremely remote risk of the very serious adverse effect of drug induced aplastic anaemia. Risk-benefit assessment is the duty of all prescribing physicians and a decision to prescribe or not prescribe must be made on the basis of personal judgment and an awareness of the statistics in perspective. The only known factor to be associated with vulnerability in the case of topical chloramphenicol is family history. There is no evidence to date that suggests children are any more susceptible than adults.

  14. Does the goal justify the methods? Harm and benefit in neuroscience research using animals.

    Science.gov (United States)

    Vieira de Castro, Ana Catarina; Olsson, I Anna S

    2015-01-01

    The goal of the present chapter is to open up for discussion some of the major ethical issues involved in animal-based neuroscience research. We begin by approaching the question of the moral acceptability of the use of animals in research at all, exploring the implications of three different ethical theories: contractarianism, utilitarianism, and animal rights. In the rest of this chapter, we discuss more specific issues of neuroscience research within what we argue is the mainstream framework for research animal ethics, namely one based on harm-benefit analysis. We explore issues of harms and benefits and how to balance them as well as how to reduce harm and increase benefit within neuroscience research.

  15. The global spread of Zika virus: is public and media concern justified in regions currently unaffected?

    OpenAIRE

    Gyawali, Narayan; Bradbury, Richard S.; Taylor-Robinson, Andrew W.

    2016-01-01

    Background Zika virus, an Aedes mosquito-borne flavivirus, is fast becoming a worldwide public health concern following its suspected association with over 4000 recent cases of microcephaly among newborn infants in Brazil. Discussion Prior to its emergence in Latin America in 2015–2016, Zika was known to exist at a relatively low prevalence in parts of Africa, Asia and the Pacific islands. An extension of its apparent global dispersion may be enabled by climate conditions suitable to support ...

  16. Is screening patients for antibiotic-resistant bacteria justified in the Indian context?

    Directory of Open Access Journals (Sweden)

    S Bhattacharya

    2011-01-01

    Infection with multi-antibiotic-resistant bacteria is a common clinical problem in India. In some countries and centres, screening patients to detect colonisation by these organisms is used to determine specific interventions such as decolonisation treatment, prophylactic antibiotics prior to surgical interventions or for selection of empirical antibiotic therapy, and to isolate patients so that transmission of these difficult-to-treat organisms to other patients can be prevented. In India, there is no national guideline or recommendation for screening patients for multi-drug-resistant (MDR) bacteria such as MRSA (methicillin-resistant Staphylococcus aureus), VRE (vancomycin-resistant enterococcus), ESBL (extended-spectrum beta-lactamase) or MBL (metallo-beta-lactamase) producers. The present article discusses the relevance of screening patients for multi-antibiotic-resistant bacteria in the Indian context. The literature on antibiotic resistance in India, screening methodology, and the economic debate about screening has been reviewed. The percentage of strains from various hospitals in India reported to be MRSA was between 8 and 71%, that for ESBL producers between 19 and 60%, and that for carbapenem-resistant Gram-negative bacilli between 5.3 and 59%. Culture-based technology exists for the detection of these resistant organisms from patient samples. For some pathogens, such as MRSA and VRE, polymerase chain reaction-based tests are also becoming available. Screening for MDR bacteria is an option which may be used after appraisal of the resources available, and after exploring the possibility of implementing the interventions that may be required after a positive screening test result.

  17. Is mandating elective single embryo transfer ethically justifiable in young women?

    Directory of Open Access Journals (Sweden)

    Kelton Tremellen

    2015-12-01

    Compared with natural conception, IVF is an effective form of fertility treatment associated with higher rates of obstetric complications and poorer neonatal outcomes. While some increased risk is intrinsic to the infertile population requiring treatment, the practice of multiple embryo transfer contributes to these complications and outcomes, especially concerning its role in higher-order pregnancies. As a result, several jurisdictions (e.g. Sweden, Belgium, Turkey, and Quebec) have legally mandated elective single-embryo transfer (eSET) for young women. We accept that in very high-risk scenarios (e.g. past history of preterm delivery and poor maternal health), double-embryo transfer (DET) should be prohibited due to unacceptably high risks. However, we argue that mandating eSET for all young women can be considered an unacceptable breach of patient autonomy, especially since DET offers certain women financial and social advantages. We also show that mandated eSET is inconsistent with other practices (e.g. ovulation induction and intrauterine insemination–ovulation induction) that can expose women and their offspring to risks associated with multiple pregnancies. While defending the option of DET for certain women, some recommendations are offered regarding IVF practice (e.g. preimplantation genetic screening and better support of IVF and maternity leave) to incentivise patients to choose eSET.

  18. Empathizing and systemizing (un)justified mediated violence: Psychophysiological indicators of emotional response

    NARCIS (Netherlands)

    Samson, L.; Potter, R.F.

    2016-01-01

    This article examines individual variability in empathizing and systemizing abilities (Baron-Cohen, 2003, 2009) on emotional responses to mediated violence. It is predicted that these abilities influence feelings of distress and enjoyment while processing violent media and that they interact with

  19. The Ethical and Legal Context of Justifying Anti-Doping Attitudes

    Directory of Open Access Journals (Sweden)

    Kosiewicz Jerzy

    2014-06-01

    The reflections presented in the paper are not normative (in general, it can be said that they do not create moral values and demands). The presented reflections particularly stress the sense, essence, meaning, and identity of sport in the context of moral demands. A disquisition pointing out that sports and sport-related doping can be situated beyond moral good and evil must be considered precisely as metaethical, and leads in a consciously controversial way to fully defining the identity of sport in general, as well as the identity of particular sports disciplines.

  20. The global spread of Zika virus: is public and media concern justified in regions currently unaffected?

    Institute of Scientific and Technical Information of China (English)

    Narayan Gyawali; Richard S.Bradbury; Andrew W.Taylor-Robinson

    2016-01-01

    Background: Zika virus, an Aedes mosquito-borne flavivirus, is fast becoming a worldwide public health concern following its suspected association with over 4000 recent cases of microcephaly among newborn infants in Brazil. Discussion: Prior to its emergence in Latin America in 2015-2016, Zika was known to exist at a relatively low prevalence in parts of Africa, Asia and the Pacific islands. An extension of its apparent global dispersion may be enabled by climate conditions suitable to support the population growth of A. aegypti and A. albopictus mosquitoes over an expanding geographical range. In addition, increased globalisation continues to pose a risk for the spread of infection. Further, suspicions of alternative modes of virus transmission (sexual and vertical), if proven, provide a platform for outbreaks in mosquito non-endemic regions as well. Since a vaccine or anti-viral therapy is not yet available, current means of disease prevention involve protection from mosquito bites, excluding pregnant females from travelling to Zika-endemic territories, and practicing safe sex in those countries. Importantly, in countries where Zika is reported as endemic, caution is advised in planning to conceive a baby until such time as the apparent association between infection with the virus and microcephaly is either confirmed or refuted. The question arises as to what advice is appropriate to give in more economically developed countries distant to the current epidemic and in which Zika has not yet been reported. Summary: Despite understandable concern among the general public that has been fuelled by the media, in regions where Zika is not present, such as North America, Europe and Australia, at this time any outbreak (initiated by an infected traveler returning from an endemic area) would very probably be contained locally. Since Aedes spp. has very limited spatial dispersal, overlapping high population densities of mosquitoes and humans would be needed to sustain a focus of infection. However, as A. aegypti is distinctly anthropophilic, future control strategies for Zika should be considered in tandem with the continuing threat to human wellbeing that is presented by dengue, yellow fever and Japanese encephalitis, all of which are transmitted by the same vector species.

  1. Simultaneous thoracic and abdominal transplantation: can we justify two organs for one recipient?

    Science.gov (United States)

    Wolf, J H; Sulewski, M E; Cassuto, J R; Levine, M H; Naji, A; Olthoff, K M; Shaked, A; Abt, P L

    2013-07-01

    Simultaneous thoracic and abdominal (STA) transplantation is controversial because two organs are allocated to a single individual. We studied wait-list urgency, and whether transplantation led to successful outcomes. Candidates and recipients for heart-kidney (SHK), heart-liver (SHLi), lung-liver (SLuLi) and lung-kidney (SLuK) were identified through the United Network for Organ Sharing (UNOS) and outcomes were compared to single-organ transplantation. Since 1987, there were 1801 STA candidates and 836 recipients. Wait-list survival at 1- and 3 years for SHK (67.4%, 40.8%; N = 1420), SHLi (65.7%, 43.6%; N = 218) and SLuLi (65.7%, 41.0%; N = 122), was lower than controls (p organ candidates. STA outcomes were similar to thoracic transplantation; however, outcomes were similar to abdominal transplantation for SHLi only. Although select patients benefit from STA, risk-exposure variables for decreased survival should be identified, aiming to eliminate futile transplantation.

  2. Justifying atrocities: the effect of moral-disengagement strategies on socially shared retrieval-induced forgetting.

    Science.gov (United States)

    Coman, Alin; Stone, Charles B; Castano, Emanuele; Hirst, William

    2014-06-01

    A burgeoning literature has established that exposure to atrocities committed by in-group members triggers moral-disengagement strategies. There is little research, however, on how such moral disengagement affects the degree to which conversations shape people's memories of the atrocities and subsequent justifications for those atrocities. We built on the finding that a speaker's selective recounting of past events can result in retrieval-induced forgetting of related, unretrieved memories for both the speaker and the listener. In the present study, we investigated whether American participants listening to the selective remembering of atrocities committed by American soldiers (in-group condition) or Afghan soldiers (out-group condition) resulted in the retrieval-induced forgetting of unmentioned justifications. Consistent with a motivated-recall account, results showed that the way people's memories are shaped by selective discussions of atrocities depends on group-membership status.

  3. Understanding multicultural attitudes : The role of group status, identification, friendships, and justifying ideologies

    NARCIS (Netherlands)

    Verkuyten, Maykel; Martinovic, Borja

    2006-01-01

    Questions of multiculturalism and the management of cultural diversity are much debated in many countries. The present research aims to further the understanding of people’s attitude toward multiculturalism by examining ethnic majority and minority group adolescents in the Netherlands. In two studie

  4. Stem cells from residual IVF-embryos - Continuation of life justifies isolation.

    NARCIS (Netherlands)

    Bongaerts, G.P.A.; Severijnen, R.S.V.M.

    2007-01-01

    Embryonic stem cells are undifferentiated pluripotent cells that can indefinitely grow in vitro. They are derived from the inner mass of early embryos. Because of their ability to differentiate into all three embryonic germ layers, and finally into specialized somatic cell types, human embryonic stem cells represent important material for studying developmental biology and cell replacement therapy.

  5. Detection of thyroid dysfunction in pregnant women: universal screening is justified.

    Science.gov (United States)

    Vila, Lluís; Velasco, Inés; González, Stella; Morales, Francisco; Sánchez, Emilia; Lailla, José Maria; Martinez-Astorquiza, Txanton; Puig-Domingo, Manel

    2012-11-01

    There is controversy among different scientific societies regarding whether universal screening for the detection of thyroid dysfunction should be performed during gestation. Although various studies have shown an association of subclinical hypothyroidism or hypothyroxinemia with obstetric problems and/or neurocognitive impairment in the offspring, no evidence of possible positive effects of treating such conditions with thyroxin has been demonstrated so far. However, there is general agreement about the need to treat clinical hypothyroidism during pregnancy and the risks of not doing so. Because it is a common, easily diagnosed and effectively treated disorder without special risk, the Working Group on Iodine Deficiency Disorders and Thyroid Dysfunction of the Spanish Society of Endocrinology and Nutrition and the Spanish Society of Gynaecology and Obstetrics recommends an early evaluation (before week 10) of thyroid function in all pregnant women. Given the complex physiology of thyroid function during pregnancy, hormone assessment should be performed according to reference values for each gestational trimester, generated locally in each reference laboratory. Thyrotropin determination would be sufficient for screening purposes; only if it is altered would free thyroxin or total thyroxin measurement be required. Adequate iodine nutrition is also highly recommended before and during pregnancy to contribute to normal thyroid function in the pregnant woman and fetus.

  6. Final final Reginald M.J. Oduor Justifying Non-violent Civil ...

    African Journals Online (AJOL)

    Jimmy Gitonga

    dissent, resulting in political disobedience - instances in which men and women ...... Slave Law in 1850, and still more after John Brown's raid, Thoreau defended ... Gandhi's work greatly influenced the thinking of the African-American Civil.

  7. Chemistry by Mobile Phone (or how to justify more time at the bar)

    OpenAIRE

    Robinson, Jamie M.; Frey, Jeremy G; Stanford-Clark, Andy J.; Reynolds, Andrew D.; Bedi, Bharat V.

    2005-01-01

    By combining automatic environment monitoring with Java smartphones a system has been produced for the real-time monitoring of experiments whilst away from the lab. Changes in the laboratory environment are encapsulated as simple XML messages, which are published using an MQTT compliant broker. Clients subscribe to the MQTT stream, and produce a user display. An MQTT client written for the Java MIDP platform, can be run on a smartphone with a GPRS Internet connection, freeing us from the cons...

  8. Naval Air Systems Command Lakehurst Contracts Awarded Without Competition Were Properly Justified

    Science.gov (United States)

    2012-01-20

    and Arm Adapters 1/27/2010 FFP FAR 6.302-1 $126,225.00 16 N68335-09-C-0080 Product 15 Butterfly Shutoff Valves 1/22/2009 FFP FAR 6.302-1...software and the only qualifying source on the drawings. The Government does not own the data necessary to manufacture the valve J&A and Business

  9. Paying their way? Do nonprofit hospitals justify their favorable tax treatment?

    Science.gov (United States)

    Schneider, Helen

    2007-01-01

    This study addresses the effect of hospital ownership on the delivery of services to medically indigent patients and on their communities, using two alternative definitions of community benefits. Using data from hospitals in California, the study finds that in similar markets, the amount of community benefits provided by a tax-exempt private hospital is equivalent in value to that provided by an investor-owned hospital. These results are sensitive to the definition of community benefits, indicating the need for a more explicit identification of, and a minimum standard for, the community benefits expected of nonprofit hospitals in return for their special tax treatment.

  10. Can Breast Self-Examination Continue to Be Touted Justifiably as an Optional Practice?

    Directory of Open Access Journals (Sweden)

    T. T. Fancher

    2011-01-01

    In 2003, the revised American Cancer Society guidelines recommended that breast self-examination (BSE) be optional. Of 822 women diagnosed with breast cancer in our hospital from 1994 to 2004, sixty-four (7.7%) were 40 years of age or younger. Forty-four (68.7%) of these young women discovered their breast cancers on BSE, 17 (18%) by mammography, and 3 (4.7%) by clinical breast examination by medical professionals. Of 758 women over 40 years of age diagnosed with breast cancer, 382 (49%) discovered their cancer by mammography, 278 (39%) by BSE, and 98 (14%) by a clinical breast examination. The rate of lymph node metastases in the older women was one-half that in the younger women (21% versus 42%), and a higher percentage of younger women presented with more advanced disease. In response to increasing breast cancer in young women under 41 years of age, encouragement of proper breast self-examination is warranted and should be advocated.

  11. The global spread of Zika virus: is public and media concern justified in regions currently unaffected?

    Science.gov (United States)

    Gyawali, Narayan; Bradbury, Richard S; Taylor-Robinson, Andrew W

    2016-04-19

    Zika virus, an Aedes mosquito-borne flavivirus, is fast becoming a worldwide public health concern following its suspected association with over 4000 recent cases of microcephaly among newborn infants in Brazil. Prior to its emergence in Latin America in 2015-2016, Zika was known to exist at a relatively low prevalence in parts of Africa, Asia and the Pacific islands. An extension of its apparent global dispersion may be enabled by climate conditions suitable to support the population growth of A. aegypti and A. albopictus mosquitoes over an expanding geographical range. In addition, increased globalisation continues to pose a risk for the spread of infection. Further, suspicions of alternative modes of virus transmission (sexual and vertical), if proven, provide a platform for outbreaks in mosquito non-endemic regions as well. Since a vaccine or anti-viral therapy is not yet available, current means of disease prevention involve protection from mosquito bites, excluding pregnant females from travelling to Zika-endemic territories, and practicing safe sex in those countries. Importantly, in countries where Zika is reported as endemic, caution is advised in planning to conceive a baby until such time as the apparent association between infection with the virus and microcephaly is either confirmed or refuted. The question arises as to what advice is appropriate to give in more economically developed countries distant to the current epidemic and in which Zika has not yet been reported. Despite understandable concern among the general public that has been fuelled by the media, in regions where Zika is not present, such as North America, Europe and Australia, at this time any outbreak (initiated by an infected traveler returning from an endemic area) would very probably be contained locally. Since Aedes spp. has very limited spatial dispersal, overlapping high population densities of mosquitoes and humans would be needed to sustain a focus of infection. 
However, as A. aegypti is distinctly anthropophilic, future control strategies for Zika should be considered in tandem with the continuing threat to human wellbeing that is presented by dengue, yellow fever and Japanese encephalitis, all of which are transmitted by the same vector species.

  12. Is non-directive counseling for patient choice cesarean delivery ethically justified?

    Science.gov (United States)

    Kalish, Robin B; McCullough, Laurence B; Chervenak, Frank A

    2007-01-01

    The current controversy concerning patient choice cesarean delivery potentially affects all women of child-bearing age and the physicians who care for them. The purpose of this paper is to address three salient issues within the patient choice cesarean delivery controversy. First, is performing patient choice cesarean delivery consistent with good professional medical practice? Second, how should physicians respond to or counsel patients who request patient choice cesarean delivery? And, third, should patient choice cesarean delivery be routinely offered to all pregnant women?

  13. Can wood quality justify local preferences for firewood in an area of caatinga (dryland) vegetation?

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Marcelo Alves; Medeiros, Patricia Muniz de; Almeida, Alyson Luiz Santos de; Albuquerque, Ulysses Paulino de [Laboratorio de Etnobotanica Aplicada, Departamento de Biologia, Area de Botanica, Universidade Federal Rural de Pernambuco, Av. Dom Manoel de Medeiros s/n, Dois Irmaos, CEP: 52171-900, Recife, Pernambuco (Brazil); Feliciano, Ana Licia Patriota [Departamento de Ciencia Florestal, Area de Silvicultura, Universidade Federal Rural de Pernambuco, Av. Dom Manoel de Medeiros s/n, Dois Irmaos, CEP: 52171-900, Recife, Pernambuco (Brazil)

    2008-06-15

    Studies have been undertaken in many parts of the world to evaluate the qualities of fuelwood, but rarely is this information associated with an examination of the preferences of the local populations. As such, the present study sought to address the question of whether local preferences for fuelwoods can be explained by the physical characteristics of the wood itself. To that end, the residents of 102 domiciles in a rural community in NE Brazil were interviewed and a list was compiled of all the plants used and preferred for domestic use. These woods were subsequently analyzed to determine their density, water content, and Fuel Value Index (FVI). Although a total of 67 species were identified by the residents, only 14 were described as being preferred - due to their great number of desirable attributes for cooking. The density, humidity, and FVI of 38 species used and/or preferred were determined. A significant relationship (p<0.05) was noted between plants with the highest FVIs and the most preferred fuelwood plants in the region, indicating that local preference could be explained by the physical properties that were examined. (author)

  14. Quality of life in healing diabetic wounds: does the end justify the means?

    Science.gov (United States)

    Armstrong, David G; Lavery, Lawrence A; Wrobel, James S; Vileikyte, Loretta

    2008-01-01

    The objective of this investigation was to compare the health-related quality of life (QoL) among persons participating in a randomized clinical trial of pressure-offloading modalities to heal diabetic foot wounds and diabetic neuropathic foot ulcers. In this prospective clinical trial, 63 patients with superficial noninfected, non-ischemic plantar neuropathic diabetic foot ulcers were randomized to 1 of 3 offloading modalities: total contact cast (TCC), a half-shoe, or a removable cast walker (RCW). A Short-Form 36 questionnaire (SF-36) was used to measure health-related QoL of patients before and after the 12-week study period. The overall mean baseline physical and mental summary scores for the entire population studied were 65.2 +/- 6.5 and 60.7 +/- 5.3, respectively. There were statistically significant differences between the pre- and posttreatment responses in 7 of the 8 SF-36 scales, with the nonsignificant trend in all cases signifying improvement in overall QoL. Patients' overall QoL improved regardless of the pressure-offloading device employed, although this trend was erased when the groups were dichotomized based on whether or not they healed during the study period. In conclusion, the results of this study suggest the potential moderating role of closure of a foot ulcer on the effects of the offloading modality on a patient's QoL. Specifically, the results suggest that in diabetic patients with neuropathic foot ulcers, QoL may have less to do with how an index wound is treated than it does with whether or not the wound heals.

  15. Can Fontan Conversion for Patients Without Late Fontan Complications be Justified?

    Science.gov (United States)

    Higashida, Akihiko; Hoashi, Takaya; Kagisaki, Koji; Shimada, Masatoshi; Ohuchi, Hideo; Shiraishi, Isao; Ichikawa, Hajime

    2017-06-01

    Fontan conversion from a classic Fontan operation such as atriopulmonary connection to total cavopulmonary connection with antiarrhythmia surgery is currently not indicated for patients without any late Fontan complications. Thirty-two consecutive patients who underwent Fontan conversion between 1991 and 2012 were divided into 2 groups by the presence (group 1: n = 25, atrial tachyarrhythmia [AT] in 24 and protein-losing enteropathy in 4) or absence (group 2: n = 7) of late Fontan complications, and the surgical outcomes were retrospectively compared. During the study period, heart transplantation was not indicated for patients with failed Fontan circulation in Japan. The mean follow-up period was 6.2 ± 3.7 years in group 1 and 4.6 ± 3.8 years in group 2 (p = 0.29). Overall survival rate at 10 years after conversion was 71% in group 1 and 100% in group 2 (p = 0.12). Whereas preoperative AT and protein-losing enteropathy remained after conversion in 8 patients (33%) and all 4 patients (100%), respectively, in group 1, neither was observed in group 2. Cardiac catheter examinations presurgery (n = 32), at 1 year (n = 28), and at 5 years (n = 19) after the conversion showed that the cardiac index significantly and similarly improved in both groups after the conversion and was maintained for at least 5 years. Cardiac output similarly improved after Fontan conversion in patients with or without late Fontan complications by elimination of venous blood congestion in the Fontan pathways. Although long-term follow-up is mandatory, new-onset AT was not observed after prophylactic Fontan conversion. Copyright © 2017 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  16. Female genital mutilation of minors in Italy: is a harmless and symbolic alternative justified?

    Directory of Open Access Journals (Sweden)

    Maria Luisa Di Pietro

    2012-09-01

    In 2004, Omar Abdulcadir - a gynecologist of the Centre for the prevention and therapy of female genital mutilation (FGM) at the Careggi Hospital (Florence) - proposed a “harmless and symbolic” alternative to FGM, which consists in the puncture of the clitoris under local anesthesia, in order to allow the outflow of some drops of blood (1).

    The intention behind the symbolic alternative is to avoid more severe forms of FGM while respecting cultural heritage. The proposal of this alternative procedure, which was supported by the leaders of 10 local African immigrant communities, has encountered ample criticism (1).

    However, the question is: is the puncture of the clitoris prohibited by the Italian Law n. 7/2006? If it is not, could it be considered a method of reducing the health risks caused by the more invasive forms of FGM (2)? Or could it culturally legitimize FGM, making attempts to prevent and eradicate FGM in Italy more difficult?

  17. To Vaccinate or Not to Vaccinate: How Teenagers Justified Their Decision

    Science.gov (United States)

    Lundstrom, Mats; Ekborg, Margareta; Ideland, Malin

    2012-01-01

    This article reports on a study of how teenagers made their decision on whether or not to vaccinate themselves against the new influenza. Its purpose was to identify connections between how teenagers talk about themselves and the decision they made. How do the teenagers construct their identities while talking about a specific socio-scientific…

  18. Stem cells from residual IVF-embryos - Continuation of life justifies isolation.

    NARCIS (Netherlands)

    Bongaerts, G.P.A.; Severijnen, R.S.V.M.

    2007-01-01

    Embryonic stem cells are undifferentiated pluripotent cells that can indefinitely grow in vitro. They are derived from the inner mass of early embryos. Because of their ability to differentiate into all three embryonic germ layers, and finally into specialized somatic cell types, human embryonic stem cells represent important material for studying developmental biology and cell replacement therapy.

  19. Stem cells from residual IVF-embryos - Continuation of life justifies isolation.

    Science.gov (United States)

    Bongaerts, Ger P A; Severijnen, René S V M

    2007-01-01

    Embryonic stem cells are undifferentiated pluripotent cells that can indefinitely grow in vitro. They are derived from the inner mass of early embryos. Because of their ability to differentiate into all three embryonic germ layers, and finally into specialized somatic cell types, human embryonic stem cells represent important material for studying developmental biology and cell replacement therapy. They are usually isolated from excess human IVF-embryos. Since many people regard isolation of human stem cells as intentional killing of the embryo, it poses a very difficult ethical problem. Similar concerns surround the medical or scientific use of these stem cells. Is this feeling correct, or does it arise from a sentimental view? The problem comprises two aspects: (i) use of stem cells for medical therapy and scientific research and (ii) isolation of stem cells from human IVF-embryos. Worldwide, human tissues are cultured, transplanted and used for medical and scientific research. Therefore, it may be concluded that the factual use of human embryonic stem cells cannot be a real ethical problem. The main key to the problem seems to be hidden in the exact definition of 'death'; in other words: is there nothing between 'death' and 'life'? Bacterial spores, lyophilised bacteria and other micro-organisms, micro-organisms stored in glycerol mixtures at -80 degrees C, and tissue cultures and sperm cells stored in liquid nitrogen are all neither dead nor alive, but still viable. From this point it is clear that there is more than the antithesis 'dead' versus 'alive'. In addition, we think that there is still another alternative: partial death. The present view concerning isolation of stem cells implies that residual embryos, and thus new human lives, are killed, and that therefore these embryos must be (passively) destroyed. However, it is precisely the carefully planned IVF procedure that makes passive destruction of non-implanted embryos amount to intentional killing. By isolating stem cells, the embryo is not fully killed: at least one embryonic cell, i.e., a stem cell, remains alive. The life of stem cells cannot be qualified as independent. Nevertheless, the embryo's life is not completely stopped and continues in a primitive way, and consequently the embryo is not completely dead. Against this background we feel that isolation of human embryonic stem cells is preferable to passive destruction.

  20. Double effect: a useful rule that alone cannot justify hastening death.

    Science.gov (United States)

    Billings, J Andrew

    2011-07-01

    The rule of double effect is regularly invoked in ethical discussions about palliative sedation, terminal extubation and other clinical acts that may be viewed as hastening death for imminently dying patients. Unfortunately, the literature tends to employ this useful principle in a fashion suggesting that it offers the final word on the moral acceptability of such medical procedures. In fact, the rule cannot be applied appropriately without invoking moral theories that are not explicit in the rule itself. Four tenets of the rule each require their own ethical justification. A variety of moral theories are relevant to making judgements in a pluralistic society. Much of the rich moral conversation germane to the rule has been reflected in arguments about physician-assisted suicide and voluntary active euthanasia, but the rule itself has limited relevance to these debates, and requires its own moral justifications when applied to other practices that might hasten death.

  1. Can Defense Spending Be Justified during a Period of Continual Peace?

    Science.gov (United States)

    1991-06-07

    Vilfredo Pareto, that economic welfare is increased if one person is made better off and no one is made worse off. Similarly, welfare is decreased when one...being made worse off. This is called the Pareto optimum. The rule does not prescribe how an increase in welfare is distributed. An increase in...welfare occurs so long as one person (it does not matter who) is made better off, and everyone else is at least not worse off. Nonetheless, Pareto optimality
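The rule described in this record can be stated mechanically. A minimal sketch (function name and utility vectors are illustrative, not from the report): a change is a Pareto improvement exactly when no one's welfare falls and at least one person's rises.

```python
def pareto_improvement(before, after):
    """True if moving from `before` to `after` leaves no one worse off
    and makes at least one person better off (a Pareto improvement).
    Each list holds one welfare value per person, in the same order."""
    no_one_worse = all(a >= b for a, b in zip(after, before))
    someone_better = any(a > b for a, b in zip(after, before))
    return no_one_worse and someone_better

# Welfare of three people before and after a policy change.
print(pareto_improvement([3, 5, 2], [4, 5, 2]))  # one gains, none lose -> True
print(pareto_improvement([3, 5, 2], [9, 4, 2]))  # person 2 loses -> False
```

Note that the rule is silent about distribution, as the record says: `[4, 5, 2]` and `[30, 5, 2]` are both Pareto improvements over `[3, 5, 2]`, however unequally the gain is shared.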

  2. XM25 Schedule Delays, Cost Increases, and Performance Problems Continue, and Procurement Quantity Not Justified (REDACTED)

    Science.gov (United States)

    2016-08-29

    INTEGRITY  EFFICIENCY  ACCOUNTABILITY  EXCELLENCE Inspector General U.S. Department of Defense Report ...within 30 days of the issuance of this report. We recommend that the Commanding General, U.S. Army Maneuver Center of Excellence, perform and...DEPARTMENT OF DEFENSE 4800 MARK CENTER DRIVE ALEXANDRIA, VIRGINIA 22350-1500 August 29, 2016 MEMORANDUM FOR THE AUDITOR GENERAL, DEPARTMENT OF THE ARMY

  3. Laparoscopic appendicectomy for complicated appendicitis: is it safe and justified?: A retrospective analysis.

    Science.gov (United States)

    Khiria, Lakshman S; Ardhnari, Ramesh; Mohan, Narshimhan; Kumar, Palaniappan; Nambiar, Rajesh

    2011-06-01

    Although laparoscopic appendectomy has some advantages over open appendectomy, the literature reports conflicting results regarding postoperative complications in complicated appendicitis. A retrospective review of patients with complicated appendicitis managed surgically at Meenakshi Mission Hospital and Research Center, Madurai, Tamilnadu, India was undertaken. A total of 497 patients were admitted with acute appendicitis and operated on during the 10-year study period from January 1999 to July 2009, of whom 119 (24%) had complicated appendicitis whereas 378 (76%) had uncomplicated acute appendicitis. The mean age of patients included in the study was 33.42 years (range, 4 to 80 y), with a male:female ratio of 2:1. Ninety-nine patients (83.19%) underwent laparoscopic appendicectomy and 1 patient underwent laparoscopic-assisted right hemicolectomy for a suspected mass lesion of the cecum. Eleven patients (9.24%) underwent open appendicectomy because of preoperative clinical features of peritonitis in 10 patients and a mass in 1 patient. Seven patients (5.88%) were converted from laparoscopic to open procedure. The overall mean operating time was 68 minutes (25 to 180 min): 66 minutes (25 to 180 min) for laparoscopic appendicectomy, 76 minutes (50 to 110 min) for open appendicectomy, 85 minutes (40 to 135 min) for laparoscopic-to-open conversion, and 67 minutes (60 to 75 min) for drainage procedures. A total of 28 patients developed complications in the form of wound infection (7), pneumonia (8), intra-abdominal abscess (11), and enterocutaneous fistula (2) after percutaneous drainage of intra-abdominal collection. All were managed conservatively and no mortality occurred. The morbidity rates, particularly for intra-abdominal abscesses and wound infection, were lower for laparoscopic appendectomy in complicated appendicitis than those reported in the literature for open appendectomy, whereas operating times and hospital stays were similar.

  4. Are physicians strikes ever morally justifiable? A call for a return to tradition

    Directory of Open Access Journals (Sweden)

    Munyaradzi Mawere

    2010-08-01

    Full Text Available Though a physicians' strike provides an opportunity to generate more knowledge about the process by which the legitimacy of an organization can be restored, it meets with a great deal of resistance, not only from the public but from within the medical profession itself. This paper critically examines the legitimacy of strikes by medical doctors, hereafter referred to as physicians. While reflecting critically on physicians' strikes in general, the paper places particular emphasis on Africa, where physician strikes are rampant. More importantly, the paper argues that a strike implies a failure for everyone in the organization (including the strikers themselves), not only for the responsible government or authority. This is because when a strike occurs, an organization or fraternity is subjected to questions, scrutiny and slander. It becomes difficult to decouple what is said, decided and done. Traditionally, all medical fraternities the world over are committed to conforming to external demands: guaranteeing patients' lives and public health. By paying attention to external reactions, the medical fraternity adapts and learns what ought to be done so that it is never again caught in the same mess. At the same time, the fraternity prepares itself for future strikes. When the fraternity and those outside it consider that it is living up to external expectations, its lost legitimacy is restored. When legitimacy is restored, external pressure, like once-disturbed water, returns to normal.

  5. Introduction in Indonesian Social Sciences and Humanities Research Articles: How Indonesian Writers Justify Their Research Projects

    Science.gov (United States)

    Arsyad, Safnil; Wardhana, Dian Eka Chandra

    2014-01-01

    The introductory part of a research article (RA) is very important because in this section writers must argue about the importance of their research topic and project so that they can attract their readers' attention to read the whole article. This study analyzes RA introductions written by Indonesian writers in social sciences and humanities…

  6. Is the Inclusion of Animal Source Foods in Fortified Blended Foods Justified?

    Directory of Open Access Journals (Sweden)

    Kristen E. Noriega

    2014-09-01

    Full Text Available Fortified blended foods (FBF) are used for the prevention and treatment of moderate acute malnutrition (MAM) in nutritionally vulnerable individuals, particularly children. A recent review of FBF recommended the addition of animal source food (ASF) in the form of whey protein concentrate (WPC), especially to corn-soy blends. The justification for this recommendation includes the potential of ASF to increase length, weight, muscle mass accretion and recovery from wasting, as well as to improve protein quality and provide essential growth factors. Evidence was collected from the following four different types of studies: (1) epidemiological; (2) ASF versus no intervention or a low-calorie control; (3) ASF versus an isocaloric non-ASF; and (4) ASF versus an isocaloric, isonitrogenous non-ASF. Epidemiological studies consistently associated improved growth outcomes with ASF consumption; however, little evidence from isocaloric and isocaloric, isonitrogenous interventions was found to support the inclusion of meat or milk in FBF. Evidence suggests that whey may benefit muscle mass accretion, but not linear growth. Overall, little evidence supports the costly addition of WPC to FBFs. Further, randomized isocaloric, isonitrogenous ASF interventions with nutritionally vulnerable children are needed.

  7. Is routine histological examination of mastectomy scars justified? An analysis of 619 scars.

    Science.gov (United States)

    Momeni, Arash; Tran, Pelu; Dunlap, Jonathan; Lee, Gordon K

    2013-02-01

    The increasing incidence of breast cancer is paralleled by an increasing demand for post-mastectomy breast reconstruction. At the time of breast reconstruction routine submission of mastectomy scars has been considered appropriate clinical practice to ensure that no residual cancer exists. However, this practice has been challenged by some and has become the topic of controversy. In a retrospective analysis we wished to assess whether routine submission of mastectomy scars altered treatment. Utilizing the Stanford Translational Research Integrated Database Environment (STRIDE) all patients who underwent implant-based breast reconstruction with routine histological analysis of mastectomy scars were identified. The following parameters were retrieved and analyzed: age, cancer histology, cancer stage (according to the American Joint Committee on Cancer staging system), receptor status (estrogen receptor [ER], progesterone receptor [PR], Her2neu), time interval between mastectomy and reconstruction, and scar histology. A total of 442 patients with a mean age of 45.9 years (range, 22-73 years) were included in the study. Mastectomy with subsequent reconstruction was performed for in-situ disease and invasive cancer in 83 and 359 patients, respectively. A total of 619 clinically unremarkable mastectomy scars were sent for histological analysis, with the most common finding being unremarkable scar tissue (i.e. collagen fibers). Of note, no specimen revealed the presence of carcinoma. According to published reports routine histological examination of mastectomy scars may detect early local recurrence. However, we were not able to detect this benefit in our patient population. As such, particularly in the current health-care climate the cost-effectiveness of this practice deserves further attention. 
A more selective use of histological analysis of mastectomy scars in patients with tumors that display poor prognostic indicators may be a more reasonable utilization of resources. Copyright © 2012 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  8. Is Routine Histological Examination of Mastectomy Scars Justified? – An Analysis of 619 Scars

    Science.gov (United States)

    Momeni, Arash; Tran, Pelu; Dunlap, Jonathan; Lee, Gordon K.

    2012-01-01

    Background The increasing incidence of breast cancer is paralleled by an increasing demand for post-mastectomy breast reconstruction. At the time of breast reconstruction routine submission of mastectomy scars has been considered appropriate clinical practice to ensure that no residual cancer exists. However, this practice has been challenged by some and has become the topic of controversy. In a retrospective analysis we wished to assess whether routine submission of mastectomy scars altered treatment. Methods Utilizing the Stanford Translational Research Integrated Database Environment (STRIDE) all patients who underwent implant-based breast reconstruction with routine histological analysis of mastectomy scars were identified. The following parameters were retrieved and analyzed: age, cancer histology, cancer stage (according to the American Joint Committee on Cancer staging system), receptor status (estrogen receptor [ER], progesterone receptor [PR], Her2neu), time interval between mastectomy and reconstruction, and scar histology. Results A total of 442 patients with a mean age of 45.9 years (range, 22 to 73 years) were included in the study. Mastectomy with subsequent reconstruction was performed for in-situ disease and invasive cancer in 83 and 359 patients, respectively. A total of 619 clinically unremarkable mastectomy scars were sent for histological analysis, with the most common finding being unremarkable scar tissue (i.e. collagen fibers). Of note, no specimen revealed the presence of carcinoma. Conclusion According to published reports routine histological examination of mastectomy scars may detect early local recurrence. However, we were not able to detect this benefit in our patient population. As such, particularly in the current health-care climate the cost-effectiveness of this practice deserves further attention. 
A more selective use of histological analysis of mastectomy scars in patients with tumors that display poor prognostic indicators may be a more reasonable utilization of resources. PMID:23044349

  9. Justifying Environmental Cost Allocation in a Multiple Product Firm: A Case Study

    Directory of Open Access Journals (Sweden)

    Collins C. Ngwakwe

    2009-12-01

    Full Text Available This case study examines the effect of environmental cost allocation on production cost and the outcome for environmental management decisions. Using a revised cost allocation – referred to in this paper as environmental cost allocation – the paper contrasts overhead allocation between traditional cost allocation and environmental cost allocation. In addition, production cost derived from the traditional allocation of waste cost is compared with the revised environmental cost allocation. Findings indicate that a revised environmental cost allocation discloses more accurate overhead cost and hence production cost; and that management is motivated to make informed environmental management decisions if a product related environmental cost is made to reflect in the production cost of the polluting product. The paper highlights the practical significance of objective environmental cost allocation on corporate waste management, which thus creates a valuable awareness on the part of the management and accountants of firms in developing countries for the need to fine-tune the dominant traditional costing system. It also suggests avenues for further research to examine the impact of costing systems on environmental investments.

  10. 78 FR 40479 - Declaration That Circumstances Exist Justifying Authorization of Emergency Use of All Oral...

    Science.gov (United States)

    2013-07-05

    ..., radiological, or nuclear agent or agents--in this case, Bacillus anthracis--pursuant to section 564(b)(1)(A) of... or agents--in this case, Bacillus anthracis--although there is no current domestic emergency... an imminent threat of an attack involving Bacillus anthracis. On October 1, 2008, on the basis...

  11. Justified Cross-Site Scripting Attacks Prevention from Client-Side

    Directory of Open Access Journals (Sweden)

    A.MONIKA

    2014-07-01

    Full Text Available Web applications are becoming the dominant way to provide access to web services. In parallel, web application vulnerabilities are being discovered and disclosed at an alarming rate. Web applications often make use of JavaScript code that is embedded into web pages to support dynamic client-side behavior. This script code is executed in the context of the user's web browser. To protect the user's environment from malicious JavaScript code, a sandboxing mechanism is used that limits a program to access only resources associated with its origin site. Unfortunately, these security mechanisms fail if a user can be lured into downloading malicious JavaScript code from an intermediate, trusted site. In this case, the malicious script is granted full access to all resources (for example, cookies and authentication tokens) that belong to the trusted site. Such attacks are called cross-site scripting (XSS) attacks. In general, XSS attacks are easy to execute but difficult to detect and prevent. One reason is the high flexibility of HTML encoding schemes, which offers the attacker many opportunities for circumventing server-side input filters that should prevent malicious scripts from entering trusted sites. Also, devising a client-side solution is not easy because of the difficulty of identifying JavaScript code as malicious. This paper presents Noxes, which is, to the best of our knowledge, the first client-side solution to mitigate cross-site scripting attacks. Noxes acts as a web proxy and uses both manually and automatically generated rules to mitigate possible cross-site scripting attempts. Noxes effectively protects against information leakage from the user's environment while requiring minimal user interaction and customization effort.
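The proxy-with-rules idea in this record can be sketched in a few lines. This is a hypothetical simplification, not Noxes' actual rule format or API: a request originating from a page may only fetch from the page's own host unless an explicit allow rule permits the cross-domain target, which limits where an injected script can leak stolen data.

```python
from urllib.parse import urlparse

# Hypothetical allow rules: (origin host, permitted external host).
ALLOW_RULES = {("example.com", "cdn.example-static.net")}

def allow_request(origin_url, target_url):
    """Proxy-style check: permit same-site requests always, and
    cross-domain requests only when an explicit rule allows them."""
    origin = urlparse(origin_url).hostname
    target = urlparse(target_url).hostname
    if origin == target:
        return True  # same-site requests are always allowed
    return (origin, target) in ALLOW_RULES

# A same-site image fetch is allowed; an injected script trying to
# exfiltrate a cookie to an unknown host is blocked.
print(allow_request("https://example.com/page", "https://example.com/img.png"))
print(allow_request("https://example.com/page", "https://evil.test/steal?c=SESSIONID"))
```

A real filtering proxy must also handle links the user deliberately clicks, redirects, and requests carrying no sensitive data, which is why the record mentions combining manually and automatically generated rules.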

  12. [Is the presence of an asymptomatic inguinal hernia enough to justify repair?].

    Science.gov (United States)

    Metzger, Jürg

    2015-11-11

    The risk of strangulation of an inguinal hernia is low. Patients with a symptomatic inguinal hernia should undergo an operation, and morbidity and mortality in inguinal hernia surgery are very low. There are also conservative (non-operative) options for inguinal hernias, although trusses should no longer be recommended. Watchful waiting is an option for men with minimally symptomatic or asymptomatic inguinal hernias, but patients must be informed that there is a high risk of becoming symptomatic.

  13. Paediatric and adolescent horse-related injuries: does the mechanism of injury justify a trauma response?

    Science.gov (United States)

    Craven, John A

    2008-08-01

    To identify the frequency, variety and disposition of horse-related injury presentations to the ED and to use this information to evaluate the existing institutional trauma team activation criteria following horse-related injuries. A retrospective case analysis was performed of all horse-related injury presentations to the ED of Women's and Children's Hospital, Adelaide, Australia, in the 5 year period between January 1999 and December 2003. A total of 186 children presented with horse-related injuries during the 5 year study period. The median age of injury was 9 years (range 1-17 years), with 81% of presentations female and 60% of patients hospitalized. The mechanism of injury was divided into four groups: 148 falls (79%), 28 kicks (15%), 7 tramples (4%) and 5 bites (3%). There was one death. Seven presentations rated an Injury Severity Score >15, with full trauma team activation occurring for two of these presentations. Although horse-related injury presentations are uncommon, severe injuries do occur. Patients presenting with severe horse-related injuries do not always activate a full trauma team response based on current trauma team activation criteria. These severe injury presentations are supported by a limited trauma team response, which activates on the mechanism of injury. The effectiveness of this as a contingency system needs to be evaluated.

  14. Does the distinction between killing and letting die justify some forms of euthanasia?

    Science.gov (United States)

    Dines, A

    1995-05-01

    Nurses will often be at the forefront in a health care team which decides that it is ethically acceptable to allow a particular patient to die but the possibility of actually killing another patient who requests his own death is met with heartfelt resistance. This paper explores the distinction between killing and letting die and the connection these arguments have with justifications for euthanasia. Two practical examples of killing and letting die are analysed. Various issues are examined including the active-passive distinction, the difference between causing and preventing a death and the probability of outcome. Justifications for euthanasia are then revisited and some concluding remarks made.

  15. Applied ELT: A Paradigm Justifying Complex Adaptive System of Language Teaching?

    Directory of Open Access Journals (Sweden)

    Masoud Mahmoodzadeh

    2013-12-01

    Full Text Available In an endeavor to reflect on the advent of Applied ELT paradigm pioneered by Pishghadam (2011 in the area of second language education, this article delves into the unexplored nature of this emerging paradigm via a contemporary complexity-driven voice. The crux of the argument addressed in this article suggests that Applied ELT is a pragmatic manifestation of complex adaptive system of language teaching. To set the grounds expressly for such enquiry, firstly it draws on both premises and axioms associated with complexity theory and its existing literature in the circle of second language research. It then tracks down the evolutionary course of the new developed paradigm of Applied ELT within the realm of second language education and also elaborates the cornerstone and manifold tenets of this paradigm sufficiently. Finally, the article attempts to critically elucidate and rationalize the recent emergence of Applied ELT paradigm through the lens of complexity theory. To broaden our thinking and understanding about the potential and multi-directional influence of ELT field, the article ends by calling for a reshaped educational direction for ELT position in second language education.

  16. Acute abscess with fistula: long-term results justify drainage and fistulotomy.

    Science.gov (United States)

    Benjelloun, E B; Jarrar, A; El Rhazi, K; Souiki, T; Ousadden, A; Ait Taleb, K

    2013-09-01

    Conventional treatment of anal abscess by simple drainage continues to be routine in many centers, despite retrospective and randomized data showing that primary fistulotomy at the time of abscess drainage is safe and efficient. The purpose of this study is to report the long-term results of fistulotomy in the treatment of anal abscesses. This is a prospective nonrandomized study of 165 consecutive patients treated for anal abscess in University Hospital Hassan II, Fez, Morocco, between January 2005 and December 2010. Altogether 102 patients were eligible for inclusion in the study. Among them, 52 were treated by simple drainage and 50 by drainage with fistulotomy. The results were analyzed in terms of recurrence and incontinence after a median follow-up of 3.2 years (range 2-6 years). The groups were comparable in terms of age, gender distribution, and type and size of abscess. The recurrence rate after surgery was significantly higher in the group treated by drainage alone (88%) than in the group treated by drainage and fistulotomy (4.8%). Incontinence was more frequent in the fistulotomy group (5% vs 1%), although this difference was not significant (p = 0.27). In the group treated by drainage and fistulotomy, patients with a high fistula tract were more prone to develop incontinence and recurrence, mainly within the first year. Longer follow-up did not seem to influence the results in the fistulotomy group. These findings confirm that fistulotomy is an efficient and safe treatment of anal abscess with good long-term results. An exception is a high fistula, where fistulotomy may be associated with a risk of recurrence and incontinence.

  17. Debate forum: levocarnitine therapy is rational and justified in selected dialysis patients.

    Science.gov (United States)

    Schreiber, Brian D

    2006-01-01

    Carnitine is a metabolic cofactor which is essential for normal fatty acid metabolism. Patients with chronic kidney disease on dialysis have been shown both to suffer from disordered fatty acid metabolism and to have a significant deficiency in plasma and tissue carnitine. Aberrant fatty acid metabolism has been associated with a number of cellular abnormalities such as increased mitochondrial permeability (a promoter of apoptosis), insulin resistance, and enhanced generation of free radicals. These cellular abnormalities have, in turn, been correlated with pathological clinical conditions common in dialysis patients including cardiomyopathy with attendant hypotension and resistance to the therapeutic effect of recombinant human erythropoietin (EPO). In 1999, the Food and Drug Administration approved levocarnitine injection for the prevention and treatment of carnitine deficiency in patients on dialysis based on documentation of free plasma carnitine levels in dialysis patients similar to other serious carnitine deficiency states for which treatment was required. Data analysis performed by expert panels convened by both the American Association of Kidney Patients and, subsequently, the National Kidney Foundation recommended a trial of levocarnitine therapy for specific subsets of dialysis patients including those with EPO resistance, dialysis-related hypotension, cardiomyopathy and muscle weakness. In 2003, the Centers for Medicare and Medicaid Services convened a Medical Advisory Committee which established reimbursement on a national level for carnitine-deficient dialysis patients who had either dialysis-related hypotension or EPO resistance. Recently, a large retrospective study demonstrated a correlation between levocarnitine therapy and reduced hospitalization rates in dialysis patients.
Despite data-based recommendations and national reimbursement, only a small minority of dialysis patients have been prescribed a therapeutic trial of levocarnitine. Whereas the reasons for the reluctance of nephrologists to prescribe this therapeutic trial are unclear, possible explanations include a lack of appreciation of the pivotal role played by carnitine in cellular metabolism and the strength of evidence for a substantial deficiency of carnitine in dialysis patients, an underestimation of the prognostic import of EPO resistance and dialysis-related hypotension, inadequate dissemination of the clinical trial data supporting the use of levocarnitine in dialysis patients, and the heterogeneous clinical response of dialysis patients to levocarnitine therapy. Difficulties in documenting both initial eligibility and evidence of improvement as a result of therapy may also be a contributing factor. This paper discusses the biological role of carnitine and its particular relevance to dialysis patients. Clinical trial data concerning an effect of therapy on EPO resistance and dialysis-related hypotension are summarized along with a discussion of the logic behind the use of levocarnitine in dialysis. Finally, the difficulties posed by a reimbursement policy based on clinical as opposed to laboratory endpoints and a heterogeneous response to therapy are addressed. Copyright 2006 S. Karger AG, Basel.

  18. Are therapeutic LDL goals justified? Controversies between the European and American guidelines

    Directory of Open Access Journals (Sweden)

    Vicente Bertomeu-Martínez

    2016-12-01

    Full Text Available Dyslipidemia is one of the most important risk factors for cardiovascular disease, and its treatment is therefore one of the key strategies in cardiovascular prevention. Statins have become established as the reference treatment for lowering serum cholesterol levels. There are some discrepancies between the American and European guidelines on the treatment of dyslipidemia. This narrative review discusses the key points of this controversy.

  19. FAST implementation in Bangladesh: high frequency of unsuspected tuberculosis justifies challenges of scale-up.

    Science.gov (United States)

    Nathavitharana, R R; Daru, P; Barrera, A E; Mostofa Kamal, S M; Islam, S; Ul-Alam, M; Sultana, R; Rahman, M; Hossain, Md S; Lederer, P; Hurwitz, S; Chakraborty, K; Kak, N; Tierney, D B; Nardell, E

    2017-09-01

    National Institute of Diseases of the Chest and Hospital, Dhaka; Bangladesh Institute of Research and Rehabilitation in Diabetes, Endocrine and Metabolic Disorders, Dhaka; and Chittagong Chest Disease Hospital, Chittagong, Bangladesh. To present operational data and discuss the challenges of implementing FAST (Find cases Actively, Separate safely and Treat effectively) as a tuberculosis (TB) transmission control strategy. FAST was implemented sequentially at three hospitals. Using Xpert® MTB/RIF, 733/6028 (12.2%, 95%CI 11.4-13.0) patients were diagnosed with unsuspected TB. Patients with a history of TB who were admitted with other lung diseases had more than twice the odds of being diagnosed with unsuspected TB as those with no history of TB (OR 2.6, 95%CI 2.2-3.0). FAST implementation revealed a high frequency of unsuspected TB in hospitalized patients in Bangladesh. Patients with a previous history of TB have an increased risk of being diagnosed with unsuspected TB. Ensuring financial resources, stakeholder engagement and laboratory capacity are important for sustainability and scalability.

  20. Justifying quasiparticle self-consistent schemes via gradient optimization in Baym-Kadanoff theory

    Science.gov (United States)

    Ismail-Beigi, Sohrab

    2017-09-01

    The question of which non-interacting Green’s function ‘best’ describes an interacting many-body electronic system is both of fundamental interest as well as of practical importance in describing electronic properties of materials in a realistic manner. Here, we study this question within the framework of Baym-Kadanoff theory, an approach where one locates the stationary point of a total energy functional of the one-particle Green’s function in order to find the total ground-state energy as well as all one-particle properties such as the density matrix, chemical potential, or the quasiparticle energy spectrum and quasiparticle wave functions. For the case of the Klein functional, our basic finding is that minimizing the length of the gradient of the total energy functional over non-interacting Green’s functions yields a set of self-consistent equations for quasiparticles that is identical to those of the quasiparticle self-consistent GW (QSGW) (van Schilfgaarde et al 2006 Phys. Rev. Lett. 96 226402-4) approach, thereby providing an a priori justification for such an approach to electronic structure calculations. In fact, this result is general, applies to any self-energy operator, and is not restricted to any particular approximation, e.g., the GW approximation for the self-energy. The approach also shows that, when working in the basis of quasiparticle states, solving the diagonal part of the self-consistent Dyson equation is of primary importance while the off-diagonals are of secondary importance, a common observation in the electronic structure literature of self-energy calculations. Finally, numerical tests and analytical arguments show that when the Dyson equation produces multiple quasiparticle solutions corresponding to a single non-interacting state, minimizing the length of the gradient translates into choosing the solution with largest quasiparticle weight.
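The selection criterion described in this abstract can be written schematically. The symbols below are assumed from the abstract's own description (Klein functional E of the Green's function G, trial non-interacting Green's functions G_0), not copied from the paper:

```latex
% Among all non-interacting Green's functions G_0, choose the one that
% minimizes the norm of the gradient of the Klein total-energy
% functional E[G] evaluated at G_0:
G_0^{*} \;=\; \underset{G_0}{\arg\min}\;
\Bigl\lVert \, \frac{\delta E[G]}{\delta G}\Big|_{G = G_0} \Bigr\rVert
```

Per the abstract, the stationarity conditions that follow from this minimization reproduce the QSGW self-consistency equations, for any choice of self-energy approximation.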