WorldWideScience

Sample records for model misspecification justifying

  1. Linearity and Misspecification Tests for Vector Smooth Transition Regression Models

    DEFF Research Database (Denmark)

    Teräsvirta, Timo; Yang, Yukai

    The purpose of the paper is to derive Lagrange multiplier and Lagrange multiplier type specification and misspecification tests for vector smooth transition regression models. We report results from simulation studies in which the size and power properties of the proposed asymptotic tests in small...

  2. Generalized Information Matrix Tests for Detecting Model Misspecification

    Directory of Open Access Journals (Sweden)

    Richard M. Golden

    2016-11-01

    Generalized Information Matrix Tests (GIMTs) have recently been used for detecting the presence of misspecification in regression models in both randomized controlled trials and observational studies. In this paper, a unified GIMT framework is developed for the purpose of identifying, classifying, and deriving novel model misspecification tests for finite-dimensional smooth probability models. These GIMTs include previously published as well as newly developed information matrix tests. To illustrate the application of the GIMT framework, we derived and assessed the performance of new GIMTs for binary logistic regression. Although all GIMTs exhibited good level and power performance for the larger sample sizes, GIMT statistics with fewer degrees of freedom and derived using log-likelihood third derivatives exhibited improved level and power performance.
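    The information matrix equality that these tests exploit is easy to see in code. Below is a minimal sketch of the classical White (1982) information matrix test for binary logistic regression, the special case that GIMTs generalize; the simulated data, sample size, and the simple chi-square form are illustrative assumptions, not the paper's statistics.

```python
# Sketch of a classical information matrix test for logistic regression.
# Under correct specification, E[s_i s_i' + H_i] = 0 (the information
# matrix equality); the test checks the sample analogue of this moment.
# Data and the naive covariance form are illustrative assumptions;
# published GIMTs use refined statistics with better size properties.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, k = 2000, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ np.array([-0.5, 1.0]))))

beta = np.zeros(k)                           # Newton-Raphson MLE
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    beta += np.linalg.solve((X * W[:, None]).T @ X, X.T @ (y - p))

p = 1 / (1 + np.exp(-X @ beta))
iu = np.triu_indices(k)                      # unique entries of a symmetric matrix
# d_i = vech(s_i s_i' + H_i); for the logit, s_i = (y_i - p_i) x_i and
# H_i = -p_i (1 - p_i) x_i x_i', so d_i = ((y_i-p_i)^2 - p_i(1-p_i)) vech(x_i x_i').
d = (((y - p) ** 2 - p * (1 - p))[:, None]
     * np.array([np.outer(xi, xi)[iu] for xi in X]))
dbar = d.mean(axis=0)
stat = n * dbar @ np.linalg.solve(np.cov(d.T), dbar)
print(f"IM stat = {stat:.2f}, p = {1 - stats.chi2.cdf(stat, len(dbar)):.3f}")
```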

  3. Testing constancy of unconditional variance in volatility models by misspecification and specification tests

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Teräsvirta, Timo

    The topic of this paper is testing the hypothesis of constant unconditional variance in GARCH models against the alternative that the unconditional variance changes deterministically over time. Tests of this hypothesis have previously been performed as misspecification tests after fitting a GARCH model to the original series. It is found by simulation that the positive size distortion present in these tests is a function of the kurtosis of the GARCH process. Adjusting the size by numerical methods is considered. The possibility of testing the constancy of the unconditional variance before fitting a GARCH model to the data is discussed. The power of the ensuing test is vastly superior to that of the misspecification test and the size distortion minimal. The test has reasonable power already in very short time series. It would thus serve as a test of constant variance in conditional mean...
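    As a rough illustration of the idea of testing variance constancy before any GARCH fit, here is a minimal sketch: a GARCH(1,1) series with a deterministic drift in its unconditional variance is screened by regressing the squared series on a time trend. This is a crude proxy for the spirit of the test, not the authors' LM statistic, and the simulator's parameter values are invented.

```python
# A crude pre-test of constant unconditional variance, run BEFORE any GARCH
# fit: regress the squared series on a time trend. Proxy only, not the
# authors' statistic; all parameter values are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def simulate_garch11(n, omega=0.05, alpha=0.10, beta=0.85, trend=0.0):
    """GARCH(1,1) returns; trend > 0 drifts omega upward over the sample."""
    y = np.empty(n)
    h = omega / (1 - alpha - beta)              # unconditional variance
    eps_prev = 0.0
    for t in range(n):
        omega_t = omega * (1 + trend * t / n)   # deterministic variance change
        h = omega_t + alpha * eps_prev ** 2 + beta * h
        eps_prev = y[t] = np.sqrt(h) * rng.normal()
    return y

y = simulate_garch11(2000, trend=2.0)
u, t = y ** 2, np.linspace(0, 1, 2000)
# The p-value ignores the autocorrelation in squares, so this is only a
# rough screen, echoing the size-distortion issue discussed in the paper.
slope, _, _, p, _ = stats.linregress(t, u)
print(f"trend in squared series: slope={slope:.3f}, p={p:.4f}")
```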

  4. Misspecification in Latent Change Score Models: Consequences for Parameter Estimation, Model Evaluation, and Predicting Change.

    Science.gov (United States)

    Clark, D Angus; Nuttall, Amy K; Bowles, Ryan P

    2018-01-01

    Latent change score models (LCS) are conceptually powerful tools for analyzing longitudinal data (McArdle & Hamagami, 2001). However, applications of these models typically include constraints on key parameters over time. Although practically useful, strict invariance over time in these parameters is unlikely in real data. This study investigates the robustness of LCS when invariance over time is incorrectly imposed on key change-related parameters. Monte Carlo simulation methods were used to explore the impact of misspecification on parameter estimation, predicted trajectories of change, and model fit in the dual change score model, the foundational LCS. When constraints were incorrectly applied, several parameters, most notably the slope (i.e., constant change) factor mean and autoproportion coefficient, were severely and consistently biased, as were regression paths to the slope factor when external predictors of change were included. Standard fit indices indicated that the misspecified models fit well, partly because mean level trajectories over time were accurately captured. Loosening constraints improved the accuracy of parameter estimates, but estimates were more unstable, and models frequently failed to converge. Results suggest that potentially common sources of misspecification in LCS can produce distorted impressions of developmental processes, and that identifying and rectifying the situation is a challenge.
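    A toy simulation makes the misspecification concrete. The sketch below uses the dual change score model's deterministic core, where the change dy_t = s + beta_t * y_{t-1} combines a constant-change component and an autoproportion, generates data with a time-varying autoproportion, and then wrongly pools a single beta across waves. It is an observed-variable regression stand-in for the latent SEM formulation, and all parameter values are invented.

```python
# Toy version of the misspecification studied in the article: data follow
# dy_t = s + beta_t * y_{t-1} with a time-VARYING autoproportion beta_t,
# but the analyst pools one constant beta across waves. Observed-variable
# stand-in for the latent SEM; all parameter values are invented.
import numpy as np

rng = np.random.default_rng(2)
n_people, n_waves, s = 500, 6, 2.0
beta_t = np.linspace(-0.40, -0.10, n_waves - 1)     # true, changes over time

y = np.empty((n_people, n_waves))
y[:, 0] = rng.normal(10, 2, n_people)
for t in range(1, n_waves):
    dy = s + beta_t[t - 1] * y[:, t - 1] + rng.normal(0, 0.5, n_people)
    y[:, t] = y[:, t - 1] + dy

ylag = y[:, :-1].ravel()                            # pooled (misspecified) fit
dy_obs = np.diff(y, axis=1).ravel()
A = np.column_stack([np.ones_like(ylag), ylag])
s_hat, beta_hat = np.linalg.lstsq(A, dy_obs, rcond=None)[0]
print(f"true beta in [{beta_t.min():.2f}, {beta_t.max():.2f}]; "
      f"pooled estimates: beta={beta_hat:.2f}, s={s_hat:.2f}")
```

    The pooled estimate lands somewhere inside the range of the true time-varying coefficients while the fitted mean trajectory still tracks the data, which is the flavor of the "good fit, biased parameters" result the article reports.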

  5. The effect of mis-specification on mean and selection between the Weibull and lognormal models

    Science.gov (United States)

    Jia, Xiang; Nadarajah, Saralees; Guo, Bo

    2018-02-01

    The lognormal and Weibull models are commonly used to analyse data. Although selection procedures have been extensively studied, it is possible that the lognormal model could be selected when the true model is Weibull, or vice versa. As the mean is important in applications, we focus on the effect of mis-specification on the mean. The effect on the lognormal mean is first considered when a lognormal sample is wrongly fitted by a Weibull model. The maximum likelihood estimate (MLE) and quasi-MLE (QMLE) of the lognormal mean are obtained from the lognormal and Weibull fits, respectively. The impact is then evaluated by computing the ratio of biases and the ratio of mean squared errors (MSEs) between the MLE and QMLE. For completeness, the theoretical results are demonstrated by simulation studies. Next, the effect of the reverse mis-specification on the Weibull mean is discussed. It is found that the ratio of biases and the ratio of MSEs are independent of the location and scale parameters of the lognormal and Weibull models. The influence could be ignored if certain special conditions hold. Finally, a model selection method is proposed by comparing the ratios concerning biases and MSEs. We also present a published data set to illustrate the study.
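    The ratio comparison in the abstract can be reproduced in a few lines of Monte Carlo. This is a sketch under assumed parameter values, not the paper's analytical results.

```python
# Monte Carlo sketch: lognormal data, correct lognormal MLE of the mean vs
# the quasi-MLE from a wrongly fitted Weibull model, compared via the ratio
# of biases and ratio of MSEs. Parameter values and sample size are invented.
import numpy as np
from scipy import stats
from scipy.special import gamma as gamma_fn

rng = np.random.default_rng(3)
mu, sigma, n, reps = 1.0, 0.5, 50, 2000
true_mean = np.exp(mu + sigma ** 2 / 2)              # lognormal mean

mle, qmle = np.empty(reps), np.empty(reps)
for i in range(reps):
    x = rng.lognormal(mu, sigma, n)
    m, s = np.log(x).mean(), np.log(x).std()
    mle[i] = np.exp(m + s ** 2 / 2)                  # correct-model MLE
    c, _, scale = stats.weibull_min.fit(x, floc=0)   # misspecified Weibull fit
    qmle[i] = scale * gamma_fn(1 + 1 / c)            # Weibull mean

bias = lambda e: (e - true_mean).mean()
mse = lambda e: ((e - true_mean) ** 2).mean()
# Note: the MLE's small-sample bias is tiny, so the bias ratio is noisy.
print(f"ratio of biases (QMLE/MLE): {bias(qmle) / bias(mle):.2f}")
print(f"ratio of MSEs   (QMLE/MLE): {mse(qmle) / mse(mle):.2f}")
```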

  6. A Lagrange multiplier-type test for idiosyncratic unit roots in the exact factor model under misspecification

    NARCIS (Netherlands)

    Zhou, X.; Solberger, M.

    2013-01-01

    We consider an exact factor model and derive a Lagrange multiplier-type test for unit roots in the idiosyncratic components. The asymptotic distribution of the statistic is derived under the misspecification that the differenced factors are white noise. We prove that the asymptotic distribution is

  7. A Bayesian approach to identifying and compensating for model misspecification in population models.

    Science.gov (United States)

    Thorson, James T; Ono, Kotaro; Munch, Stephan B

    2014-02-01

    State-space estimation methods are increasingly used in ecology to estimate productivity and abundance of natural populations while accounting for variability in both population dynamics and measurement processes. However, functional forms for population dynamics and density dependence often will not match the true biological process, and this may degrade the performance of state-space methods. We therefore developed a Bayesian semiparametric state-space model, which uses a Gaussian process (GP) to approximate the population growth function. This offers two benefits for population modeling. First, it allows data to update a specified "prior" on the population growth function, while reverting to this prior when data are uninformative. Second, it allows variability in population dynamics to be decomposed into random errors around the population growth function ("process error") and errors due to the mismatch between the specified prior and estimated growth function ("model error"). We used simulation modeling to illustrate the utility of GP methods in state-space population dynamics models. Results confirmed that the GP model performs similarly to a conventional state-space model when either (1) the prior matches the true process or (2) data are relatively uninformative. However, GP methods improve estimates of the population growth function when the function is misspecified. Results also demonstrated that the estimated magnitude of "model error" can be used to distinguish cases of model misspecification. We conclude with a discussion of the prospects for GP methods in other state-space models, including age and length-structured, meta-analytic, and individual-movement models.
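    In the spirit of the approach described, the sketch below fits a Gaussian process to the deviation of observed log population growth rates from a parametric "prior" growth function (here a Ricker form), so that a nonzero GP posterior plays the role of the authors' "model error". It is an illustrative stand-in using scikit-learn, not the authors' Bayesian state-space implementation, and all parameter values are assumptions.

```python
# GP approximation of a population growth function, shrinking toward a
# Ricker "prior" where data are sparse. Illustrative stand-in only; the
# true dynamics, prior parameters, and kernel choices are invented.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)

# Simulate abundance under theta-logistic growth (mismatched to the prior).
n_years, N = 60, [0.2]
for _ in range(n_years - 1):
    r = 0.8 * (1 - N[-1] ** 1.8) + rng.normal(0, 0.1)   # growth + process error
    N.append(max(N[-1] * np.exp(r), 1e-3))
N = np.array(N)
r_obs = np.log(N[1:] / N[:-1])

# Ricker prior mean for log-growth, r = a*(1 - N/K); the GP models the
# residual, i.e. the "model error" between prior and estimated function.
a, K = 0.8, 1.0
prior = a * (1 - N[:-1] / K)
gp = GaussianProcessRegressor(kernel=RBF(0.3) + WhiteKernel(0.01))
gp.fit(N[:-1, None], r_obs - prior)

grid = np.linspace(N.min(), N.max(), 5)[:, None]
resid, sd = gp.predict(grid, return_std=True)
for n_, res, s_ in zip(grid.ravel(), resid, sd):
    print(f"N={n_:.2f}: estimated 'model error' {res:+.2f} (sd {s_:.2f})")
```

    When the prior matches the truth, the GP posterior stays near zero; systematic nonzero residuals flag the kind of misspecification the article uses "model error" to diagnose.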

  8. Why item parcels are (almost) never appropriate: two wrongs do not make a right--camouflaging misspecification with item parcels in CFA models.

    Science.gov (United States)

    Marsh, Herbert W; Lüdtke, Oliver; Nagengast, Benjamin; Morin, Alexandre J S; Von Davier, Matthias

    2013-09-01

    The present investigation has a dual focus: to evaluate problematic practice in the use of item parcels and to suggest exploratory structural equation models (ESEMs) as a viable alternative to the traditional independent clusters confirmatory factor analysis (ICM-CFA) model (with no cross-loadings, subsidiary factors, or correlated uniquenesses). Typically, it is ill-advised to (a) use item parcels when ICM-CFA models do not fit the data, and (b) retain ICM-CFA models when items cross-load on multiple factors. However, the combined use of (a) and (b) is widespread and often provides such misleadingly good fit indexes that applied researchers might believe that misspecification problems are resolved--that 2 wrongs really do make a right. Taking a pragmatist perspective, in 4 studies we demonstrate with responses to the Rosenberg Self-Esteem Inventory (Rosenberg, 1965), Big Five personality factors, and simulated data that even small cross-loadings seriously distort relations among ICM-CFA constructs or even decisions on the number of factors; although obvious in item-level analyses, this is camouflaged by the use of parcels. ESEMs provide a viable alternative to ICM-CFAs and a test for the appropriateness of parcels. The use of parcels with an ICM-CFA model is most justifiable when the fit of both ICM-CFA and ESEM models is acceptable and equally good, and when substantively important interpretations are similar. However, if the ESEM model fits the data better than the ICM-CFA model, then the use of parcels with an ICM-CFA model typically is ill-advised--particularly in studies that are also interested in scale development, latent means, and measurement invariance.
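    A small numeric illustration of the camouflage claim (a sketch with invented loadings, not the article's Rosenberg or Big Five analyses): one item is given a modest cross-loading, which is plainly visible in item-level correlations but diluted once items are averaged into parcels.

```python
# Items with a small cross-loading distort the correlation structure in a
# way that is visible at the item level but diluted by parcel averaging.
# Loadings, noise levels, and the cross-loading size are invented.
import numpy as np

rng = np.random.default_rng(11)
n = 5000
f = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], n)  # true r = .30

load = np.full(6, 0.7)
items1 = f[:, [0]] * load + rng.normal(0, 0.5, (n, 6))
items2 = f[:, [1]] * load + rng.normal(0, 0.5, (n, 6))
items2[:, 0] += 0.25 * f[:, 0]          # one small cross-loading

parcel1 = items1.reshape(n, 3, 2).mean(axis=2)  # parcels of two items each
parcel2 = items2.reshape(n, 3, 2).mean(axis=2)

# Item level: the cross-loading item correlates visibly more with scale-1
# items than its siblings do.
r_item = np.corrcoef(items1.T, items2.T)[:6, 6]
print("item 1 of scale 2 vs scale-1 items:  ", np.round(r_item, 2))
# Parcel level: the anomaly is diluted across the parcel average.
r_parcel = np.corrcoef(parcel1.T, parcel2.T)[:3, 3]
print("parcel 1 of scale 2 vs scale-1 parcels:", np.round(r_parcel, 2))
```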

  9. Structural Break Tests Robust to Regression Misspecification

    NARCIS (Netherlands)

    Abi Morshed, Alaa; Andreou, E.; Boldea, Otilia

    2016-01-01

    Structural break tests developed in the literature for regression models are sensitive to model misspecification. We show - analytically and through simulations - that the sup Wald test for breaks in the conditional mean and variance of a time series process exhibits severe size distortions when the

  10. The impact of exposure model misspecification on signal detection in prospective pharmacovigilance.

    Science.gov (United States)

    van Gaalen, Rolina D; Abrahamowicz, Michal; Buckeridge, David L

    2015-05-01

    Pharmacovigilance monitors the safety of drugs after their approval and marketing. Timely detection of adverse effects is important. The true relationship between time-varying drug use and the adverse event risk is typically unknown. Yet, most current pharmacovigilance studies rely on arbitrarily chosen exposure metrics such as current exposure or use in the past 3 months. The authors used simulations to assess the impact of a misspecified exposure model on the timeliness of adverse effect detection. Prospective pharmacovigilance studies were simulated assuming different true relationships between time-varying drug use and the adverse event hazard. Simulated data were analyzed by fitting conventional parametric and more complex spline-based estimation models at multiple, pre-specified testing times. The 'signal' was generated on the basis of the corrected model-specific p-value selected to ensure a 5% probability of incorrectly rejecting the null hypothesis of no association. Results indicated that use of an estimation model that diverged substantially from the true underlying association reduced sensitivity and increased the time to detection of a clinically important association. Time to signal detection in pharmacovigilance may depend strongly on the method chosen to model the exposure. No single estimation model performed optimally across different simulated scenarios, suggesting the need for data-dependent criteria to select the model most appropriate for a given study. Copyright © 2014 John Wiley & Sons, Ltd.
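    The core of such a simulation can be sketched compactly with discrete-time logistic hazards standing in for the study's survival models. Cohort size, hazard scale, effect size, and the 90-day exposure metric are all invented for illustration.

```python
# Sketch: the true hazard depends on drug use over the past 90 days, but
# one analysis models only "current use". Discrete-time logistic hazards
# stand in for the paper's survival models; all numbers are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n, T = 600, 365
on_drug = np.zeros((n, T), dtype=bool)
start = rng.integers(0, T - 120, n)
dur = rng.integers(30, 120, n)
for i in range(n):
    on_drug[i, start[i]:start[i] + dur[i]] = True

rows = []
for i in range(n):                        # build person-day records
    for t in range(T):
        recent = on_drug[i, max(0, t - 90):t + 1].mean()  # true exposure metric
        hazard = 1 / (1 + np.exp(7.0 - 1.2 * recent))     # true daily risk
        event = rng.random() < hazard
        rows.append((float(on_drug[i, t]), recent, float(event)))
        if event:
            break
cur, rec, y = (np.array(v) for v in zip(*rows))

for name, xvar in [("current use (misspecified)", cur),
                   ("90-day exposure (true form)", rec)]:
    fit = sm.Logit(y, sm.add_constant(xvar)).fit(disp=0)
    print(f"{name}: beta={fit.params[1]:.2f}, p={fit.pvalues[1]:.2e}")
```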

  11. Estimating the mean and standard deviation of environmental data with below detection limit observations: Considering highly skewed data and model misspecification.

    Science.gov (United States)

    Shoari, Niloofar; Dubé, Jean-Sébastien; Chenouri, Shoja'eddin

    2015-11-01

    In environmental studies, concentration measurements frequently fall below detection limits of measuring instruments, resulting in left-censored data. Some studies employ parametric methods such as the maximum likelihood estimator (MLE), robust regression on order statistic (rROS), and gamma regression on order statistic (GROS), while others suggest a non-parametric approach, the Kaplan-Meier method (KM). Using examples of real data from a soil characterization study in Montreal, we highlight the need for additional investigations that aim at unifying the existing literature. A number of studies have examined this issue; however, those considering data skewness and model misspecification are rare. These aspects are investigated in this paper through simulations. Among other findings, results show that for low skewed data, the performance of different statistical methods is comparable, regardless of the censoring percentage and sample size. For highly skewed data, the performance of the MLE method under lognormal and Weibull distributions is questionable; particularly, when the sample size is small or censoring percentage is high. In such conditions, MLE under gamma distribution, rROS, GROS, and KM are less sensitive to skewness. Related to model misspecification, MLE based on lognormal and Weibull distributions provides poor estimates when the true distribution of data is misspecified. However, the methods of rROS, GROS, and MLE under gamma distribution are generally robust to model misspecifications regardless of skewness, sample size, and censoring percentage. Since the characteristics of environmental data (e.g., type of distribution and skewness) are unknown a priori, we suggest using MLE based on gamma distribution, rROS and GROS. Copyright © 2015 Elsevier Ltd. All rights reserved.
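    The parametric side of this comparison reduces to a censored likelihood: detected values contribute densities and non-detects contribute the CDF at the detection limit. A minimal sketch, assuming lognormal data fitted by both a correct lognormal and a misspecified gamma model; the values are invented, not the Montreal soil data.

```python
# Left-censored MLE: detected values contribute log f(x_i); non-detects
# contribute log F(DL). Data, censoring fraction, and starting values are
# invented; this is a sketch, not the paper's rROS/GROS implementations.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(6)
x = rng.lognormal(mean=0.0, sigma=1.2, size=200)   # skewed "concentrations"
DL = np.quantile(x, 0.3)                           # 30% non-detects
detected = x[x >= DL]
n_cens = int((x < DL).sum())

def negloglik(logp, make_dist):
    d = make_dist(*np.exp(logp))      # log-parametrization keeps params positive
    return -(np.log(d.pdf(detected)).sum() + n_cens * np.log(d.cdf(DL)))

make_lognorm = lambda s, scale: stats.lognorm(s, scale=scale)
make_gamma = lambda a, scale: stats.gamma(a, scale=scale)

for name, make in [("lognormal", make_lognorm), ("gamma", make_gamma)]:
    res = optimize.minimize(negloglik, np.log([1.0, 2.0]),
                            args=(make,), method="Nelder-Mead")
    d = make(*np.exp(res.x))
    print(f"{name:9s}: estimated mean = {d.mean():6.2f} "
          f"(true mean = {np.exp(1.2 ** 2 / 2):.2f})")
```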

  12. Wolf in sheep's clothing: Model misspecification undermines tests of the neutral theory for life histories.

    Science.gov (United States)

    Authier, Matthieu; Aubry, Lise M; Cam, Emmanuelle

    2017-05-01

    Understanding the processes behind change in reproductive state along life-history trajectories is a salient research program in evolutionary ecology. Two processes, state dependence and heterogeneity, can drive the dynamics of change among states. Both processes can operate simultaneously, begging the difficult question of how to tease them apart in practice. The Neutral Theory for Life Histories (NTLH) holds that the bulk of variations in life-history trajectories is due to state dependence and is hence neutral: Once previous (breeding) state is taken into account, variations are mostly random. Lifetime reproductive success (LRS), the number of descendants produced over an individual's reproductive life span, has been used to infer support for NTLH in natura. Support stemmed from accurate prediction of the population-level distribution of LRS with parameters estimated from a state dependence model. We show with Monte Carlo simulations that the current reliance of NTLH on LRS prediction in a null hypothesis framework easily leads to selecting a misspecified model, biased estimates and flawed inferences. Support for the NTLH can be spurious because of a systematic positive bias in estimated state dependence when heterogeneity is present in the data but ignored in the analysis. This bias can lead to spurious positive covariance between fitness components when there is in fact an underlying trade-off. Furthermore, neutrality implied by NTLH needs a clarification because of a probable disjunction between its common understanding by evolutionary ecologists and its translation into statistical models of life-history trajectories. Irrespective of what neutrality entails, testing hypotheses about the dynamics of change among states in life histories requires a multimodel framework because state dependence and heterogeneity can easily be mistaken for each other.

  13. The Impact of Model Misspecification on Parameter Estimation and Item-Fit Assessment in Log-Linear Diagnostic Classification Models

    Science.gov (United States)

    Kunina-Habenicht, Olga; Rupp, Andre A.; Wilhelm, Oliver

    2012-01-01

    Using a complex simulation study we investigated parameter recovery, classification accuracy, and performance of two item-fit statistics for correct and misspecified diagnostic classification models within a log-linear modeling framework. The basic manipulated test design factors included the number of respondents (1,000 vs. 10,000), attributes (3…

  14. Which level of model complexity is justified by your data? A Bayesian answer

    Science.gov (United States)

    Schöniger, Anneli; Illman, Walter; Wöhling, Thomas; Nowak, Wolfgang

    2016-04-01

    When judging the plausibility and utility of a subsurface flow or transport model, the question of justifiability arises: which level of model complexity can still be justified by the available calibration data? Although it is common sense that more data are needed to reasonably constrain the parameter space of a more complex model, there is a lack of tools that can objectively quantify model justifiability as a function of the available data. We propose an approach to determine model justifiability in the context of comparing alternative conceptual models. Our approach rests on Bayesian model averaging (BMA). BMA yields posterior model probabilities that point the modeler to an optimal trade-off between model performance in reproducing a given calibration data set and model complexity. To find out which level of complexity can be justified by the available data, we disentangle the complexity component of the trade-off from its performance counterpart. Technically, we remove the performance component from the BMA analysis by replacing the actually observed data values with potential measurement values as predicted by the models. Our proposed analysis results in a "model confusion matrix". Based on this matrix, the modeler can identify the maximum level of model complexity that could possibly be justified by the available amount and type of data. As a side product, model (dis-)similarity is revealed. We have applied the model justifiability analysis to a case of aquifer characterization via hydraulic tomography. Four models of vastly different complexity have been proposed to represent the heterogeneity in hydraulic conductivity of a sandbox aquifer, ranging from a homogeneous medium to geostatistical random fields. We have used drawdown data from two to six pumping tests to condition the models and to determine model justifiability as a function of data set size. Our test case shows that a geostatistical parameterization scheme requires a substantial amount of
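    The logic of the proposed model confusion matrix can be sketched with toy regression models: generate data from each candidate in turn and record the average posterior model probabilities. The sketch below approximates the marginal likelihood by BIC, a cruder evidence estimate than the full BMA of the paper, and uses invented polynomial candidates rather than the hydraulic-tomography models.

```python
# Toy "model confusion matrix": simulate from each candidate model, then see
# where the posterior model probability lands. BIC approximates the
# evidence; candidate models and noise levels are invented.
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0, 1, 25)

def fit_bic(y, degree):
    X = np.vander(x, degree + 1)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = ((y - X @ beta) ** 2).sum()
    n, k = len(y), degree + 1
    return n * np.log(rss / n) + k * np.log(n)   # BIC, sigma profiled out

degrees = [0, 1, 3]                              # constant, linear, cubic
confusion = np.zeros((len(degrees), len(degrees)))
for i, d_true in enumerate(degrees):
    for _ in range(200):
        beta = rng.normal(0, 1, d_true + 1)
        y = np.vander(x, d_true + 1) @ beta + rng.normal(0, 0.3, len(x))
        bics = np.array([fit_bic(y, d) for d in degrees])
        w = np.exp(-0.5 * (bics - bics.min()))
        confusion[i] += w / w.sum()              # average posterior model prob.
confusion /= 200
print("rows: generating model; cols: posterior weight on each candidate")
print(np.round(confusion, 2))
```

    Off-diagonal weight in a row means the data cannot justify distinguishing that generating model from a simpler candidate, which is the diagnostic the article's matrix provides.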

  15. Model Justified Search Algorithms for Scheduling Under Uncertainty

    National Research Council Canada - National Science Library

    Howe, Adele; Whitley, L. D

    2008-01-01

    … We also identified plateaus as a significant barrier to superb performance of local search on scheduling and have studied several canonical discrete optimization problems to discover and model the nature of plateaus...

  16. Are stock prices too volatile to be justified by the dividend discount model?

    Science.gov (United States)

    Akdeniz, Levent; Salih, Aslıhan Altay; Ok, Süleyman Tuluğ

    2007-03-01

    This study investigates excess stock price volatility using the variance bound framework of LeRoy and Porter [The present-value relation: tests based on implied variance bounds, Econometrica 49 (1981) 555-574] and of Shiller [Do stock prices move too much to be justified by subsequent changes in dividends? Am. Econ. Rev. 71 (1981) 421-436.]. The conditional variance bound relationship is examined using cross-sectional data simulated from the general equilibrium asset pricing model of Brock [Asset prices in a production economy, in: J.J. McCall (Ed.), The Economics of Information and Uncertainty, University of Chicago Press, Chicago (for N.B.E.R.), 1982]. Results show that the conditional variance bounds hold, hence, our hypothesis of the validity of the dividend discount model cannot be rejected. Moreover, in our setting, markets are efficient and stock prices are neither affected by herd psychology nor by the outcome of noise trading by naive investors; thus, we are able to control for market efficiency. Consequently, we show that one cannot infer any conclusions about market efficiency from the unconditional variance bounds tests.
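    The bound being tested follows from one orthogonality argument. A sketch in standard notation, assuming (as the dividend discount model asserts) that the observed price $p_t$ is the rational forecast of the ex-post rational price $p^*_t$, the discounted sum of realized future dividends:

$$
p^*_t = p_t + \varepsilon_t, \qquad \mathbb{E}[\varepsilon_t \mid I_t] = 0
\;\Longrightarrow\;
\operatorname{Var}(p^*) = \operatorname{Var}(p) + \operatorname{Var}(\varepsilon) \ge \operatorname{Var}(p).
$$

    Shiller's unconditional test asks whether this inequality holds in the data; the conditional variance bounds examined in the paper are the analogous restrictions taken conditionally on the information set $I_t$.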

  17. [On the relation between encounter rate and population density: Are classical models of population dynamics justified?].

    Science.gov (United States)

    Nedorezov, L V

    2015-01-01

    A stochastic model of migrations on a lattice with discrete time is considered. It is assumed that space is homogeneous with respect to its properties and that during one time step every individual (independently of local population numbers) can migrate to the nearest nodes of the lattice with equal probabilities. It is also assumed that population size remains constant during a given time interval of the computer experiments. The following variants of estimating the encounter rate between individuals are considered: at fixed time moments, every individual in every node of the lattice interacts with all other individuals in the node; or individuals can stay in nodes independently, or can form groups of two, three or four individuals. For each variant of interaction between individuals, the average value (with respect to space and time) is computed for various population sizes. The samples obtained were compared with the respective functions of classic models of isolated population dynamics: the Verhulst model, the Gompertz model, the Svirezhev model, and the theta-logistic model. Parameters of the functions were calculated with the least squares method. Analyses of deviations were performed using the Kolmogorov-Smirnov test, the Lilliefors test, the Shapiro-Wilk test, and other statistical tests. It is shown that, from the traditional point of view, there is no correspondence between the encounter rate and the functions describing the effects of self-regulatory mechanisms on population dynamics. The best fit was obtained with the Verhulst and theta-logistic models when using the dataset resulting from the situation in which every individual in a node interacts with all other individuals there.
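    The computer experiment is easy to caricature in code. In the sketch below (lattice size, movement rule, and the least-squares fit are assumptions; diagonal moves are allowed for simplicity), the average pairwise encounter count per time step grows roughly quadratically in population size, which is the kind of mismatch with a logistic self-regulation term a*N - b*N^2 that the article reports.

```python
# Individuals random-walk on a torus lattice; per-step pairwise encounters
# within nodes are averaged over time, then a Verhulst-style term is fitted.
# Lattice size, step rule, and densities are invented for illustration.
import numpy as np
from scipy import optimize

rng = np.random.default_rng(9)
L, steps = 20, 200                               # 20x20 lattice, 200 time steps

def mean_encounters(n_ind):
    pos = rng.integers(0, L, size=(n_ind, 2))
    total = 0.0
    for _ in range(steps):
        pos = (pos + rng.choice([-1, 0, 1], size=pos.shape)) % L
        _, counts = np.unique(pos[:, 0] * L + pos[:, 1], return_counts=True)
        total += (counts * (counts - 1) / 2).sum()  # pairwise meetings per node
    return total / steps

dens = np.arange(50, 1001, 50)
enc = np.array([mean_encounters(n) for n in dens])

verhulst = lambda N, a, b: a * N - b * N ** 2       # logistic self-regulation term
(a, b), _ = optimize.curve_fit(verhulst, dens, enc)
print(f"fit: encounters ~ {a:.3f}*N - {b:.2e}*N^2")  # b < 0 signals the mismatch
```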

  18. A computational shape-based model of anger and sadness justifies a configural representation of faces.

    Science.gov (United States)

    Neth, Donald; Martinez, Aleix M

    2010-08-06

    Research suggests that configural cues (second-order relations) play a major role in the representation and classification of face images; making faces a "special" class of objects, since object recognition seems to use different encoding mechanisms. It is less clear, however, how this representation emerges and whether this representation is also used in the recognition of facial expressions of emotion. In this paper, we show how configural cues emerge naturally from a classical analysis of shape in the recognition of anger and sadness. In particular our results suggest that at least two of the dimensions of the computational (cognitive) space of facial expressions of emotion correspond to pure configural changes. The first of these dimensions measures the distance between the eyebrows and the mouth, while the second is concerned with the height-width ratio of the face. Under this proposed model, becoming a face "expert" would mean to move from the generic shape representation to that based on configural cues. These results suggest that the recognition of facial expressions of emotion shares this expertise property with the other processes of face processing. Copyright 2010 Elsevier Ltd. All rights reserved.

  19. Justifying an information system.

    Science.gov (United States)

    Neal, T

    1993-03-01

    A four-step model for the hospital pharmacist to use in justifying a computerized information system is described. In the first step, costs are identified and analyzed. Both the costs and the advantages of the existing system are evaluated. A request for information and a request for proposal are prepared and sent to vendors, who return estimates of hardware, software, and support costs. Costs can then be merged and analyzed as one-time costs, recurring annual costs, and total costs annualized over five years. In step 2, benefits are identified and analyzed. Tangible economic benefits are those that directly reduce or avoid costs or directly enhance revenues and can be measured in dollars. Intangible economic benefits are realized through a reduction in overhead and reallocation of labor and are less easily measured in dollars. Noneconomic benefits, some involving quality-of-care issues, can also be used in the justification. Step 3 consists of a formal risk assessment in which the project is broken into categories for which specific questions are answered by assigning a risk factor. In step 4, both costs and benefits are subjected to a financial analysis, the object of which is to maximize the return on investment to the institution from the capital being requested. Calculations include return on investment based on the net present value of money, internal rate of return, payback period, and profitability index. A well-designed justification for an information system not only identifies the costs, risks, and benefits but also presents a plan of action for realizing the benefits.
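    Step 4's calculations are standard capital-budgeting arithmetic. A minimal sketch with invented cash flows; the dollar figures and discount rate are illustrative assumptions, not the article's.

```python
# NPV, IRR, payback period, and profitability index for a hypothetical
# information-system investment. All cash-flow figures are invented.
import numpy as np

outlay = 250_000                                   # one-time cost, year 0
net_benefit = [60_000, 85_000, 95_000, 95_000, 95_000]  # benefits minus recurring costs
rate = 0.08                                        # institutional discount rate

cash = np.array([-outlay] + net_benefit, dtype=float)
years = np.arange(len(cash))
npv = (cash / (1 + rate) ** years).sum()

# IRR: the rate that makes NPV zero (bisection; NPV is decreasing in rate).
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if (cash / (1 + mid) ** years).sum() > 0 else (lo, mid)
irr = (lo + hi) / 2

payback = np.argmax(np.cumsum(cash) >= 0)          # first year cumulative >= 0
pi = (npv + outlay) / outlay                       # PV of benefits per dollar invested

print(f"NPV@{rate:.0%}: ${npv:,.0f}  IRR: {irr:.1%}  "
      f"payback: year {payback}  PI: {pi:.2f}")
```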

  20. Justifying Action Research

    Science.gov (United States)

    Helskog, Guro Hansen

    2014-01-01

    In this paper I use a general philosophy of science perspective in looking at the problem of justifying action research. First I try to clarify the concept of justification, by contrasting it with the concept of validity, which seems to be used almost as a synonym in some parts of the literature. I discuss the need for taking a stand in relation…

  1. IRT Model Misspecification and Measurement of Growth in Vertical Scaling

    Science.gov (United States)

    Bolt, Daniel M.; Deng, Sien; Lee, Sora

    2014-01-01

    Functional form misfit is frequently a concern in item response theory (IRT), although the practical implications of misfit are often difficult to evaluate. In this article, we illustrate how seemingly negligible amounts of functional form misfit, when systematic, can be associated with significant distortions of the score metric in vertical…

  2. Semi-Nonparametric Estimation and Misspecification Testing of Diffusion Models

    DEFF Research Database (Denmark)

    Kristensen, Dennis

    …of the estimators and tests under the null are derived, and the power properties are analyzed by considering contiguous alternatives. Tests directly comparing the drift and diffusion estimators under the relevant null and alternative are also analyzed. Markov Bootstrap versions of the test statistics are proposed to improve on the finite-sample approximations. The finite-sample properties of the estimators are examined in a simulation study.

  3. Supplier-induced demand: re-examining identification and misspecification in cross-sectional analysis.

    Science.gov (United States)

    Peacock, Stuart J; Richardson, Jeffrey R J

    2007-09-01

    This paper re-examines criticisms of cross-sectional methods used to test for supplier-induced demand (SID) and re-evaluates the empirical evidence using data from Australian medical services. Cross-sectional studies of SID have been criticised on two grounds. First, and most important, the inclusion of the doctor supply in the demand equation leads to an identification problem. This criticism is shown to be invalid, as the doctor supply variable is stochastic and depends upon a variety of other variables including the desirability of the location. Second, cross-sectional studies of SID fail diagnostic tests and produce artefactual findings due to model misspecification. Contrary to this, the re-evaluation of cross-sectional Australian data indicate that demand equations that do not include the doctor supply are misspecified. Empirical evidence from the re-evaluation of Australian medical services data supports the notion of SID. Demand and supply equations are well specified and have very good explanatory power. The demand equation is identified and the desirability of a location is an important predictor of the doctor supply. Results show an average price elasticity of demand of 0.22 and an average elasticity of demand with respect to the doctor supply of 0.46, with the impact of SID becoming stronger as the doctor supply rises. The conclusion we draw from this paper is that two of the main criticisms of the empirical evidence supporting the SID hypothesis have been inappropriately levelled at the methods used. More importantly, SID provides a satisfactory, and robust, explanation of the empirical data on the demand for medical services in Australia.

  4. Measurement Model Specification Error in LISREL Structural Equation Models.

    Science.gov (United States)

    Baldwin, Beatrice; Lomax, Richard

    This LISREL study examines the robustness of the maximum likelihood estimates under varying degrees of measurement model misspecification. A true model containing five latent variables (two endogenous and three exogenous) and two indicator variables per latent variable was used. Measurement model misspecification considered included errors of…

  5. Justifying democracy and its authority

    Directory of Open Access Journals (Sweden)

    Mladenović Ivan

    2016-01-01

    In this paper I will discuss a recent attempt at justifying democracy and its authority. It pertains to recently published papers by Niko Kolodny, which complement each other and, taken together, practically assume the form of a monograph (Kolodny 2014a, Kolodny 2014b). It could be said that Kolodny's approach is a non-standard one, given that he avoids typical ways of justifying democracy. Namely, where a justification of democracy is concerned, Kolodny maintains that it is necessary to offer a kind of independent justification. It is not so much that he insists that the usual approaches are wrong as that an independent justification is necessary in order to discern what it is that gives them their significance. Kolodny's independent justification of democracy is based on the idea of social equality. In this paper I will try to reconstruct and critically assess Kolodny's approach by paying special attention to the question of democratic authority. [Project of the Ministry of Science of the Republic of Serbia, no. 43007]

  6. Multivariate GARCH models

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Teräsvirta, Timo

    This article contains a review of multivariate GARCH models. Most common GARCH models are presented and their properties considered. This also includes nonparametric and semiparametric models. Existing specification and misspecification tests are discussed. Finally, there is an empirical example...

  7. Univariate and Multivariate Specification Search Indices in Covariance Structure Modeling.

    Science.gov (United States)

    Hutchinson, Susan R.

    1993-01-01

    Simulated population data were used to compare relative performances of the modification index and C. Chou and P. M. Bentler's Lagrange multiplier test (a multivariate generalization of a modification index) for four levels of model misspecification. Both indices failed to recover the true model except at the lowest level of misspecification. (SLD)

  8. Does benefit justify research with children?

    Science.gov (United States)

    Binik, Ariella

    2018-01-01

    The inclusion of children in research gives rise to a difficult ethical question: What justifies children's research participation and exposure to research risks when they cannot provide informed consent? This question arises out of the tension between the moral requirement to obtain a subject's informed consent for research participation, on the one hand, and the limited capacity of most children to provide informed consent, on the other. Most agree that children's participation in clinical research can be justified. But the ethical justification for exposing children to research risks in the absence of consent remains unclear. One prevalent group of arguments aims to justify children's risk exposure by appealing to the concept of benefit. I call these 'benefit arguments'. Prominent versions of this argument defend the idea that broadening our understanding of the notion of benefit to include non-medical benefits (such as the benefit of a moral education) helps to justify children's research participation. I argue that existing benefit arguments are not persuasive and raise problems with the strategy of appealing to broader notions of benefit to justify children's exposure to research risk. © 2017 The Authors. Bioethics Published by John Wiley & Sons Ltd.

  9. The Self-Justifying Desire for Happiness

    DEFF Research Database (Denmark)

    Rodogno, Raffaele

    2004-01-01

    In Happiness, Tabensky equates the notion of happiness to Aristotelian eudaimonia. I shall claim that doing so amounts to equating two concepts that moderns cannot conceptually equate, namely, the good for a person and the good person or good life. In §2 I examine the way in which Tabensky deals with this issue and claim that his idea of happiness is as problematic for us moderns as is any translation of the notion of eudaimonia in terms of happiness. Naturally, if happiness understood as eudaimonia is ambiguous, so will be the notion of a desire for happiness, which we find at the core of Tabensky's whole project. In §3 I shall be concerned with another aspect of the desire for happiness; namely, its alleged self-justifying nature. I will attempt to undermine the idea that this desire is self-justifying by undermining the criterion on which Tabensky takes self-justifiability to rest, i.e. its...

  10. Misspecifications of stimulus presentation durations in experimental psychology: a systematic review of the psychophysics literature.

    Science.gov (United States)

    Elze, Tobias

    2010-09-29

    In visual psychophysics, precise display timing, particularly for brief stimulus presentations, is often required. The aim of this study was to systematically review the commonly applied methods for the computation of stimulus durations in psychophysical experiments and to contrast them with the true luminance signals of stimuli on computer displays. In a first step, we systematically scanned the citation index Web of Science for studies with experiments with stimulus presentations for brief durations. Articles which appeared between 2003 and 2009 in three different journals were taken into account if they contained experiments with stimuli presented for less than 50 milliseconds. The 79 articles that matched these criteria were reviewed for their method of calculating stimulus durations. For those 75 studies where the method was either given or could be inferred, stimulus durations were calculated by the sum of frames (SOF) method. In a second step, we describe the luminance signal properties of the two monitor technologies which were used in the reviewed studies, namely cathode ray tube (CRT) and liquid crystal display (LCD) monitors. We show that SOF is inappropriate for brief stimulus presentations on both of these technologies. In extreme cases, SOF specifications and true stimulus durations are even unrelated. Furthermore, the luminance signals of the two monitor technologies are so fundamentally different that the duration of briefly presented stimuli cannot be calculated by a single method for both technologies. Statistics over stimulus durations given in the reviewed studies are discussed with respect to different duration calculation methods. The SOF method for duration specification which was clearly dominating in the reviewed studies leads to serious misspecifications particularly for brief stimulus presentations. We strongly discourage its use for brief stimulus presentations on CRT and LCD monitors.

  11. On Three Ways to Justify Religious Beliefs

    NARCIS (Netherlands)

    Brümmer, V.

    2001-01-01

    This paper compares the ways in which revealed theology, natural theology and philosophical theology justify religious belief. Revealed theology does so with an appeal to revelation and natural theology with an appeal to reason and perception. It is argued that both are inadequate. Philosophical

  12. Justifiability of Littering: An Empirical Investigation

    OpenAIRE

    Benno Torgler; Maria A. Garcia-Valinas; Alison Macintyre

    2008-01-01

    This paper investigates the relationship between voluntary participation in environmental organisations and the justifiability of littering behaviour. Previous empirical work regarding determinants of littering and littering behaviour remains scarce, particularly in socio-economic analysis. We address these deficiencies, demonstrating a strong empirical link between environmental participation and reduced public littering in the European Values Survey (EVS) data for 30 Western and Eastern Eur...

  13. Justifying community benefit requirements in international research.

    Science.gov (United States)

    Hughes, Robert C

    2014-10-01

    It is widely agreed that foreign sponsors of research in low- and middle-income countries (LMICs) are morally required to ensure that their research benefits the broader host community. There is no agreement, however, about how much benefit or what type of benefit research sponsors must provide, nor is there agreement about what group of people is entitled to benefit. To settle these questions, it is necessary to examine why research sponsors have an obligation to benefit the broader host community, not only their subjects. Justifying this claim is not straightforward. There are three justifications for an obligation to benefit host communities that each apply to some research, but not to all. Each requires a different amount of benefit, and each requires benefit to be directed toward a different group. If research involves significant net risk to LMIC subjects, research must provide adequate benefit to people in LMICs to avoid an unjustified appeal to subjects' altruism. If research places significant burdens on public resources, research must provide fair compensation to the community whose public resources are burdened. If research is for profit, research sponsors must contribute adequately to the upkeep of public goods from which they benefit in order to avoid the wrong of free-riding, even if their use of these public goods is not burdensome. © Published 2012. This article is a U.S. Government work and is in the public domain in the USA.

  14. Is the selenium drinking water standard justified?

    Science.gov (United States)

    Lafond, M G; Calabrese, E J

    1979-08-01

    Four cases are presented which suggest that the present U.S.E.P.A. drinking water standard for selenium of 10 micrograms/L is inappropriate. The rationale upon which this standard is based is that selenium is carcinogenic, induces dental caries formation, and is highly toxic to animals. However, a critical assessment of this literature cannot support these claims. Case no. 1 demonstrates that there is insufficient evidence to classify selenium as a carcinogen. Data derived from the three respective groups of researchers claiming a carcinogenic effect induced by selenium are obscure due to 1) the inability to accurately identify malignancies, 2) the apparent opposite effects of different selenium compounds, and 3) the lack of proper controls. Case no. 2 reviews recent evidence that selenium reduces the incidence of cancer in laboratory animals and in man, an effect which can probably be attributed to the antioxidant properties of selenium compounds. Case no. 3 provides evidence which does not permit the classification of selenium as a cariogenic element. Epidemiological studies supporting such a claim are inadequate since they lack properly matched control groups. Animal data do not support this link as well. Case no. 4 is a review of studies which clearly demonstrate the essentiality of selenium, an aspect of selenium metabolism that was not considered when the 10 micrograms/L standard was promulgated. In light of the four cases presented and an assessment of selenium toxicity in man, it is concluded that the 10 micrograms/L standard cannot be justified. Instead, it is suggested that 50 micrograms/L selenium should provide sufficient protection from the toxic effects of this element. This is consistent with the current state of knowledge with respect to the potential adverse health effects associated with selenium.

  15. Coercion in psychiatric care: can paternalism justify coercion?

    Science.gov (United States)

    Seo, Mi Kyung; Kim, Seung Hyun; Rhee, MinKyu

    2013-05-01

    It has long been debated whether coercion can be justified as paternalism in the field of mental health and it is still a continuing issue of controversy today. This study analyses whether coercive intervention in mental health can be justified by the basic assumptions of paternalists: the assumption of incompetence, the assumption of dangerousness and the assumption of impairment. This study involved 248 patients: 158 (63.7%) were diagnosed with schizophrenia and 90 (36.3%) were diagnosed with mood disorder. In this study, experiences of coercion were divided into legal status, subjective (perceived coercion) and objective experiences (experienced coercion). The assumption of incompetence was justified in all three categories of coercion whereas the assumption of dangerousness was not justified in any. The assumption of impairment was not justified in legal status and perceived coercion, but provided a partial explanation to serve as a basis for justifying experienced coercive measures. It can be noted that mental health experts who support paternalism without question must reconsider their previous methods. Above all, the reason why the assumption of dangerousness was not justified in any of the categories of coercion was because coercive intervention used to prevent harm to oneself and others must be very carefully carried out.

  16. Does uncertainty justify intensity emission caps?

    International Nuclear Information System (INIS)

    Quirion, Philippe

    2005-01-01

    Environmental policies often set 'relative' or 'intensity' emission caps, i.e. emission limits proportional to the polluting firm's output. One of the arguments put forth in favour of relative caps is based on the uncertainty on business-as-usual output: if the firm's production level is higher than expected, so will be business-as-usual emissions, hence reaching a given level of emissions will be more costly than expected. As a consequence, it is argued, a higher emission level should be allowed if the production level is more important than expected. We assess this argument with a stochastic analytical model featuring two random variables: the business-as-usual emission level, proportional to output, and the slope of the marginal abatement cost curve. We compare the relative cap to an absolute cap and to a price instrument, in terms of welfare impact. It turns out that in most plausible cases, either a price instrument or an absolute cap yields a higher expected welfare than a relative cap. Quantitatively, the difference in expected welfare is typically very small between the absolute and the relative cap but may be significant between the relative cap and the price instrument. (author)

  17. Electrical stimulation in dysphagia treatment: a justified controversy?

    NARCIS (Netherlands)

    Bogaardt, H. C. A.

    2008-01-01

    Neuromuscular electrostimulation (NMES) is a method for stimulating muscles with short electrical pulses. Neuromuscular electrostimulation is frequently used in physiotherapy to strengthen healthy muscles (as in sports

  18. The Use of Imputed Sibling Genotypes in Sibship-Based Association Analysis: On Modeling Alternatives, Power and Model Misspecification

    NARCIS (Netherlands)

    Minica, C.C.; Dolan, C.V.; Willemsen, G.; Vink, J.M.; Boomsma, D.I.

    2013-01-01

    When phenotypic, but no genotypic data are available for relatives of participants in genetic association studies, previous research has shown that family-based imputed genotypes can boost the statistical power when included in such studies. Here, using simulations, we compared the performance of

  19. Justifying the Classical-Quantum Divide of the Copenhagen Interpretation

    OpenAIRE

    Bolotin, Arkady

    2014-01-01

    Perhaps the most significant drawback, which the Copenhagen interpretation (still the most popular interpretation of quantum theory) suffers from, is the classical-quantum divide between the large classical systems that carry out measurements and the small quantum systems that they measure. So, an "ideal" alternative interpretation of quantum theory would either eliminate this divide or justify it in some reasonable way. The present paper demonstrates that it is possible to justify the classi...

  20. Screening for foot problems in children: is this practice justifiable?

    Directory of Open Access Journals (Sweden)

    Evans Angela

    2012-07-01

    Podiatry screening of children is a common practice, which occurs largely without adequate data to support the need for such activity. Such programs may be either formalised, or more ad hoc in nature, depending upon the use of guidelines or existing models. Although often not used, the well-established criteria for assessing the merits of screening programs can greatly increase the understanding as to whether such practices are actually worthwhile. This review examines the purpose of community health screening in the Australian context, as occurs for tuberculosis, breast, cervical and prostate cancers, and then examines podiatry screening practices for children with reference to the criteria of the World Health Organisation (WHO). Topically, the issue of paediatric foot posture forms the focus of this review, as it presents with great frequency to a range of clinicians. Comparison is made with developmental dysplasia of the hip, in which instance the WHO criteria are well met. Considering that the burden of the condition being screened for must be demonstrable, and that early identification must be found to be beneficial, in order to justify a screening program, there is no sound support for either continuing or establishing podiatry screenings for children.

  1. Calculation-experimental method justifies the life of wagons

    Directory of Open Access Journals (Sweden)

    Валерія Сергіївна Воропай

    2015-11-01

    The article proposes a method to evaluate the technical state of tank wagons operating in the chemical industry. An algorithm for evaluating the technical state of tank wagons was developed, which makes it possible, on the basis of diagnosis and analysis of the current condition, to justify a further period of operation. A complex of works on testing the tanks, together with mathematical models for calculating design strength and reliability, is proposed. The article is devoted to solving the problem of effective exploitation of the working fleet of tank wagons. Opportunities for further exploitation of the cars, a complex of works on the assessment of their technical state, and the calculation of their residual resources are proposed. Engineering research on the chemical industry's fleet has reduced the shortage of rolling stock for the transportation of ammonia. An analysis was made of the numerous chassis faults and of the main elements of the tank wagons' supporting structure after 20 years of exploitation. An algorithm for determining the residual life of specialized tank wagons operating in an industrial plant is proposed. A procedure for resource conservation of tank wagons carrying cargo under high pressure is proposed for the first time. The improved procedure for identifying residual life proposed in the article has both theoretical and practical importance.

  2. Small-Business Computing: Is Software Piracy Justified?

    Science.gov (United States)

    Immel, A. Richard

    1983-01-01

    Presents several different perspectives on the copying of computer software (discs, tapes, etc.) in an attempt to determine whether such infringement of copyright, often called "software piracy," can ever be justified. Implications for both the hardware and software firms and the users are also discussed. (EAO)

  3. Suing One's Sense Faculties for Fraud: 'Justifiable Reliance' in the ...

    African Journals Online (AJOL)

    The law requires that plaintiffs in fraud cases be 'justified' in relying on a misrepresentation. I deploy the accumulated intuitions of the law to defend externalist accounts of epistemic justification and knowledge against Laurence BonJour's counterexamples involving clairvoyance. I suggest that the law can offer a ...

  4. Can fertility service providers justify discrimination against lesbians?

    Science.gov (United States)

    Saffron, Lisa

    2002-05-01

    In Britain, for the last 25 years, lesbians have been seeking access to fertility services to achieve pregnancy. The problem for this particular group of clients is that often they are refused access to fertility services because of their sexuality. The reasons for treating lesbians less favourably than heterosexual clients are twofold: (1) donor insemination is defined as a treatment for infertility; and (2) service providers have a legal requirement to take into account the child's need for a father, according to Section 13(5) of the Human Fertilisation and Embryology Act 1990. The question addressed in this paper is whether these explanations constitute objective and reasonable justifications for discriminating against lesbians. As long as donor insemination is defined and perceived as a medical treatment for infertility, the social grounds for offering this procedure are obscured and there is apparent justification for treating lesbians less favourably. Once donor insemination is taken out of its medical context and understood in its social context, then it is no longer possible to justify discrimination. The 'need for a father' is the most common justification for discriminating against lesbians. Yet this phrase is ambiguous to the point of meaninglessness. Clearly, every child needs a father for conception to occur. None of the assisted conception techniques currently in use tamper with that basic need. This paper argues that commonly held beliefs about what a child needs from a father do not stand up to close examination. There is no evidence that a child needs a father for any of the following reasons: normal development, another parent, male role model, development of heterosexuality or social acceptance. If discrimination against lesbians were considered acceptable in other fields of family policy, it might legitimate discrimination in the field of reproductive medicine. However, the opposite is the case. In the fields of psychology, social work, adoption

  5. Is Editing the Genome for Climate Change Adaptation Ethically Justifiable?

    Science.gov (United States)

    Lehmann, Lisa Soleymani

    2017-12-01

    As climate change progresses, we humans might have to inhabit a world for which we are increasingly maladapted. If we were able to identify genes that directly influence our ability to thrive in a changing climate, would it be ethically justifiable to edit the human genome to enhance our ability to adapt to this new environment? Should we use gene editing not only to prevent significant disease but also to enhance our ability to function in the world? Here I suggest a "4-S framework" for analyzing the justifiability of gene editing that includes these considerations: (1) safety, (2) significance of harm to be averted, (3) succeeding generations, and (4) social consequences. © 2017 American Medical Association. All Rights Reserved.

  6. Thumba X-ray plant: Are radiation fears justified

    International Nuclear Information System (INIS)

    Madhvanath, U.

    1978-01-01

    Technical facts about the X-ray unit located at the Vikram Sarabhai Space Centre, Thumba (India) are set down to show that it does not pose the radiation hazard reported in a newspaper, and that radiation fears are therefore not justified. It is stated that, after thorough checking, X-ray installations in this space centre cause negligible exposure even to workers who handle these units, and that others receive practically no exposure at all. (B.G.W.)

  7. Legislative Prohibitions on wearing a headscarf: Are they justified?

    Directory of Open Access Journals (Sweden)

    Fatima Osman

    2014-11-01

    A headscarf, a simple piece of cloth that covers the head, is a controversial garment that carries various connotations and meanings. While it may be accepted as just another item of clothing when worn by non-Muslim women, it is often the subject of much controversy when worn by Muslim women. In recent years the headscarf has been described as a symbol of Islam's oppression of women and simultaneously of terrorism. As the debate regarding the acceptability of the headscarf in the modern world continues, an increasing number of states have legislated to ban the wearing of the headscarf. This article critically examines the reasons underlying these bans and argues that these prohibitions are not justified. It does this by first analysing the place of the headscarf in Islam, its religious basis and its significance to Muslim women. It argues that the headscarf is more than just a mere religious symbol and that Muslim women wear the headscarf as a matter of religious obligation. The headscarf is considered to be an important religious practice protected by the right to freedom of religion. Thereafter the article examines legislative bans on the headscarf in France, Turkey and Switzerland in order to identify the most popular justifications advanced by states and courts for banning the headscarf. It critically evaluates the justifications for protecting secularism, preventing coercion, promoting equality and curbing religious extremism, and disputes that the reasons put forward by states and accepted by courts justify banning the headscarf. It thereafter explores how South African courts would respond to a headscarf ban and argues that schools and employers should accommodate the headscarf. While Muslim women may not have an absolute right to wear the headscarf, there has thus far been no justifiable reason for banning the headscarf.

  8. Cost-justifying usability an update for the internet age

    CERN Document Server

    Bias, Randolph G; Mayhew, Deborah J

    2005-01-01

    You just know that an improvement of the user interface will reap rewards, but how do you justify the expense and the labor and the time, and guarantee a robust ROI, ahead of time? How do you decide how much of an investment should be funded? And what is the best way to sell usability to others? In this completely revised and new edition, Randolph G. Bias (University of Texas at Austin, with 25 years' experience as a usability practitioner and manager) and Deborah J. Mayhew (internationally recognized usability consultant and author of two other seminal books including The Usability Enginee

  9. Why do women justify violence against wives more often than do men in Vietnam?

    Science.gov (United States)

    Krause, Kathleen Helen; Gordon-Roberts, Rachel; VanderEnde, Kristin; Schuler, Sidney Ruth; Yount, Kathryn Mary

    2015-01-01

    Background: Intimate partner violence (IPV) harms the health of women and their children. In Vietnam, 31% of women report lifetime exposure to physical IPV, and surprisingly, women justify physical IPV against wives more often than do men. Objective: We compare men's and women's rates of finding good reason for wife hitting and assess whether differences in childhood experiences and resources and constraints in adulthood account for observed differences. Methods: Probability samples of married men (N = 522) and women (N = 533) were surveyed in Vietnam. Ordered logit models assessed the proportional odds for women versus men of finding more "good reasons" to hit a wife (never, 1-3 situations, 4-6 situations). Results: In all situations, women found good reason to hit a wife more often than did men. The unadjusted odds for women versus men of reporting more good reasons to hit a wife were 6.55 (95% CI 4.82-8.91). This gap disappeared in adjusted models that included significant interactions of gender with age, number of children ever born, and experience of physical IPV as an adult. Discussion: Having children was associated with justifying wife hitting among women but not men. Exposure to IPV in adulthood was associated with justifying wife hitting among men but was negatively associated with justification of IPV among women. Further study of the gendered effects of resources and constraints in adulthood on attitudes about IPV against women will clarify women's more frequent reporting than men's that IPV against women is justified. PMID:25948647

  10. Why Do Women Justify Violence Against Wives More Often Than Do Men in Vietnam?

    Science.gov (United States)

    Krause, Kathleen H; Gordon-Roberts, Rachel; VanderEnde, Kristin; Schuler, Sidney Ruth; Yount, Kathryn M

    2015-05-06

    Intimate partner violence (IPV) harms the health of women and their children. In Vietnam, 31% of women report lifetime exposure to physical IPV, and surprisingly, women justify physical IPV against wives more often than do men. We compare men's and women's rates of finding good reason for wife hitting and assess whether differences in childhood experiences and resources and constraints in adulthood account for observed differences. Probability samples of married men (n = 522) and women (n = 533) were surveyed in Vietnam. Ordered logit models assessed the proportional odds for women versus men of finding more "good reasons" to hit a wife (never, 1-3 situations, 4-6 situations). In all situations, women found good reason to hit a wife more often than did men. The unadjusted odds for women versus men of reporting more good reasons to hit a wife were 6.55 (95% confidence interval [CI] = [4.82, 8.91]). This gap disappeared in adjusted models that included significant interactions of gender with age, number of children ever born, and experience of physical IPV as an adult. Having children was associated with justifying wife hitting among women but not men. Exposure to IPV in adulthood was associated with justifying wife hitting among men, but was negatively associated with justification of IPV among women. Further study of the gendered effects of resources and constraints in adulthood on attitudes about IPV against women will clarify women's more frequent reporting than men's that IPV against women is justified.
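    The record's headline figure (an unadjusted proportional odds ratio of 6.55) comes from an ordered logit model. The sketch below is illustrative only: it fits such a model to synthetic data with statsmodels' OrderedModel; the variable names, effect size, and category cutpoints are assumptions, not the study's.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Synthetic proportional-odds (ordered logit) example: an ordinal outcome
# coded 0 = "never", 1 = "1-3 situations", 2 = "4-6 situations" regressed
# on a binary gender indicator. All values are invented for illustration.
rng = np.random.default_rng(0)
n = 1000
female = rng.integers(0, 2, size=n)
latent = 1.0 * female + rng.logistic(size=n)     # assumed effect size
outcome = np.digitize(latent, bins=[0.5, 2.0])   # three ordered categories

df = pd.DataFrame({"outcome": outcome, "female": female})
res = OrderedModel(df["outcome"], df[["female"]], distr="logit").fit(
    method="bfgs", disp=False)

# exp(coef) is the proportional odds ratio for women vs. men of reporting
# a higher category (compare the record's unadjusted OR of 6.55)
print(np.exp(res.params["female"]))
```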

  11. Loss of Ptf1a Leads to a Widespread Cell-Fate Misspecification in the Brainstem, Affecting the Development of Somatosensory and Viscerosensory Nuclei.

    Science.gov (United States)

    Iskusnykh, Igor Y; Steshina, Ekaterina Y; Chizhikov, Victor V

    2016-03-02

    The brainstem contains diverse neuronal populations that regulate a wide range of processes vital to the organism. Proper cell-fate specification decisions are critical to achieve neuronal diversity in the CNS, but the mechanisms regulating cell-fate specification in the developing brainstem are poorly understood. Previously, it has been shown that basic helix-loop-helix transcription factor Ptf1a is required for the differentiation and survival of neurons of the inferior olivary and cochlear brainstem nuclei, which contribute to motor coordination and sound processing, respectively. In this study, we show that the loss of Ptf1a compromises the development of the nucleus of the solitary tract, which processes viscerosensory information, and the spinal and principal trigeminal nuclei, which integrate somatosensory information of the face. Combining genetic fate-mapping, birth-dating, and gene expression studies, we found that at least a subset of brainstem abnormalities in Ptf1a(-/-) mice are mediated by a dramatic cell-fate misspecification in rhombomeres 2-7, which results in the production of supernumerary viscerosensory and somatosensory neurons of the Lmx1b lineage at the expense of Pax2(+) GABAergic viscerosensory and somatosensory neurons, and inferior olivary neurons. Our data identify Ptf1a as a major regulator of cell-fate specification decisions in the developing brainstem, and as a previously unrecognized developmental regulator of both viscerosensory and somatosensory brainstem nuclei. Cell-fate specification decisions are critical for normal CNS development. Although extensively studied in the cerebellum and spinal cord, the mechanisms mediating cell-fate decisions in the brainstem, which regulates a wide range of processes vital to the organism, remain largely unknown. Here we identified mouse Ptf1a as a novel regulator of cell-fate decisions during both early and late brainstem neurogenesis, which are critical for proper development of several major

  12. The process of justifying assisted reproductive technologies in Iran.

    Science.gov (United States)

    Gooshki, Ehsan Shamsi; Allahbedashti, Neda

    2015-01-01

    Infertility is medically defined as one year of unprotected intercourse that does not result in pregnancy. Infertility is a noticeable medical problem in Iran, and about a quarter of Iranian couples experience primary infertility at some point in their lives. Since having children is a basic social value in Iran, infertility has an adverse effect on the health of the couple and affects their well-being. The various methods of assisting infertile couples raise several ethical questions and touch upon certain sensitive points. Although the present Iranian legislative system, which is based on the Shi'a school of Islam, has legalised some aspects of assisted reproductive technologies (ARTs), given the absence of a general officially ratified act (official pathway), such medical interventions are usually justified through a fatwa system (non-official pathway). Officially registered married couples can access almost all ART methods, including third-party gamete donation, if they use such pathways. The process of justifying ART interventions generally began when in vitro fertilisation was given the nod and later, Ayatollah Khamenei (the political-religious leader of the country) issued a fatwa which permitted gamete donation by third parties. This open juristic approach paved the way for the ratification of the Embryo Donation to Infertile Spouses Act in 2003.

  13. Justifying the design and selection of literacy and thinking tools

    Directory of Open Access Journals (Sweden)

    David Whitehead

    2008-10-01

    Full Text Available Criteria for the design and selection of literacy and thinking tools that allow educators to justify what they do are described within a wider framework of learning theory and research into best practice. Based on a meta-analysis of best practice, results from a three-year project designed to evaluate the effectiveness of a secondary school literacy initiative in New Zealand, together with recent research from cognitive and neuro-psychologists, it is argued that the design and selection of literacy and thinking tools used in elementary schools should be consistent with (i) teaching focused, (ii) learner focused, (iii) thought linked, (iv) neurologically consistent, (v) subject specific, (vi) text linked, (vii) developmentally appropriate, and (viii) assessment linked criteria.

  14. Justifying British Advertising in War and Austerity, 1939-51.

    Science.gov (United States)

    Haughton, Philippa

    2017-09-01

    Drawing together institutional papers, the trade- and national-press, and Mass-Observation documents, this article examines the changing ways that the Advertising Association justified commercial advertising from 1939 to 1951. It argues that the ability to repeatedly re-conceptualize the social and economic purposes of advertising was central to the industry's survival and revival during the years of war and austerity. This matters because the survival and revival of commercial advertising helps to explain the composition of the post-war mixed economy and the emergence of a consumer culture that became the 'golden age' of capitalism. While commercial advertising's role in supporting periods of affluence is well documented, much less is known about its relationship with war and austerity. This omission is problematic. Advertising was only able to shape the 1950s and 1960s economy because its corporate structures remained intact during the 1940s, as the industry withstood the challenges of wartime and the difficulties presented under Attlee's government. Recognizing the deliberate attempts of advertising people to promote a role for commercial advertising invites us to reconsider the inevitability of post-war affluence, while offering fresh insight into the debate around consumer education, freedom of choice, and the centrality of advertising and communication in democratic society: issues central to the society Britain was, and hoped to become.

  15. Two independent pivotal statistics that test location and misspecification and add-up to the Anderson-Rubin statistic

    NARCIS (Netherlands)

    Kleibergen, F.R.

    2002-01-01

    We extend the novel pivotal statistics for testing the parameters in the instrumental variables regression model. We show that these statistics result from a decomposition of the Anderson-Rubin statistic into two independent pivotal statistics. The first statistic is a score statistic that tests

  16. Posterior-Predictive Evidence on US Inflation using Phillips Curve Models with Non-Filtered Time Series

    NARCIS (Netherlands)

    N. Basturk (Nalan); C. Cakmakli (Cem); S.P. Ceyhan (Pinar); H.K. van Dijk (Herman)

    2012-01-01

    textabstractChanging time series properties of US inflation and economic activity are analyzed within a class of extended Phillips Curve (PC) models. First, the misspecification effects of mechanical removal of low frequency movements of these series on posterior inference of a basic PC model are

  17. Posterior-Predictive Evidence on US Inflation using Phillips Curve Models with Non-Filtered Time Series

    NARCIS (Netherlands)

    Basturk, N.; Cakmakli, C.; Ceyhan, P.; van Dijk, H.K.

    2013-01-01

    Changing time series properties of US inflation and economic activity are analyzed within a class of extended Phillips Curve (PC) models. First, the misspecification effects of mechanical removal of low frequency movements of these series on posterior inference of a basic PC model are analyzed using

  18. The neural correlates of justified and unjustified killing: an fMRI study

    OpenAIRE

    Molenberghs, Pascal; Ogilvie, Claudette; Louis, Winnifred R.; Decety, Jean; Bagnall, Jessica; Bain, Paul G.

    2015-01-01

    Despite moral prohibitions on hurting other humans, some social contexts allow for harmful actions such as killing of others. One example is warfare, where killing enemy soldiers is seen as morally justified. Yet, the neural underpinnings distinguishing between justified and unjustified killing are largely unknown. To improve understanding of the neural processes involved in justified and unjustified killing, participants had to imagine being the perpetrator whilst watching ‘first-person pers...

  19. Improved hidden Markov model for nosocomial infections.

    Science.gov (United States)

    Khader, Karim; Leecaster, Molly; Greene, Tom; Samore, Matthew; Thomas, Alun

    2014-12-01

    We propose a novel hidden Markov model (HMM) for parameter estimation in hospital transmission models, and show that commonly made simplifying assumptions can lead to severe model misspecification and poor parameter estimates. A standard HMM that embodies two commonly made simplifying assumptions, namely a fixed patient count and binomially distributed detections, is compared with a new alternative HMM that does not require these simplifying assumptions. Using simulated data, we demonstrate how each of the simplifying assumptions used by the standard model leads to model misspecification, whereas the alternative model results in accurate parameter estimates.
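    A minimal simulation can make the record's point concrete. The sketch below is a hypothetical ward-transmission process, not the authors' HMM: it shows how estimating the colonized fraction under the fixed-patient-count assumption degrades when the true census fluctuates. All parameter values are assumed.

```python
import numpy as np

# Hypothetical illustration: simulate colonization in a ward with a
# fluctuating census, then estimate the colonized fraction from detections
# under the "standard" fixed-count simplification vs. the true census.
rng = np.random.default_rng(0)
T = 365                  # days
beta = 0.30              # transmission parameter (assumed)
mu = 0.10                # daily clearance/discharge probability (assumed)
p_detect = 0.30          # per-patient daily detection probability (assumed)

census = rng.integers(20, 31, size=T)     # true, varying patient count
colonized = np.zeros(T, dtype=int)
detected = np.zeros(T, dtype=int)
colonized[0] = 2
for t in range(1, T):
    force = min(beta * colonized[t - 1] / census[t - 1], 1.0)
    new = rng.binomial(max(census[t - 1] - colonized[t - 1], 0), force)
    cleared = rng.binomial(colonized[t - 1], mu)
    colonized[t] = min(colonized[t - 1] + new - cleared, census[t])
    detected[t] = rng.binomial(colonized[t], p_detect)

# The fixed-count estimate is systematically off on days when occupancy
# deviates from its mean; the true-census estimate is not.
fixed_N = census.mean()
true_frac = colonized / census
print("fixed-census error:",
      np.mean(np.abs(detected / (p_detect * fixed_N) - true_frac)))
print("true-census error: ",
      np.mean(np.abs(detected / (p_detect * census) - true_frac)))
```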

  20. Formal structures for extracting analytically justifiable decisions from ...

    African Journals Online (AJOL)

    Eventually, with these models, a practically useful information system can be developed easily, driven by the logic embedded within the tables and coupled with well-planned business decisions. The contribution of the paper is to provide models that describe the internal computation mechanisms driving the DSSs at the ...

  1. Justifying gender discrimination in the workplace: The mediating role of motherhood myths

    Science.gov (United States)

    2018-01-01

    The issue of gender equality in employment has given rise to numerous policies in advanced industrial countries, all aimed at tackling gender discrimination regarding recruitment, salary and promotion. Yet gender inequalities in the workplace persist. The purpose of this research is to document the psychosocial process involved in the persistence of gender discrimination against working women. Drawing on the literature on the justification of discrimination, we hypothesized that the myths according to which women’s work threatens children and family life mediate the relationship between sexism and opposition to a mother’s career. We tested this hypothesis using the Family and Changing Gender Roles module of the International Social Survey Programme. The dataset contained data collected in 1994 and 2012 from 51,632 respondents from 18 countries. Structural equation modelling confirmed the hypothesised mediation. Overall, the findings shed light on how motherhood myths justify the gender structure in countries promoting gender equality. PMID:29315326

  2. Justifying gender discrimination in the workplace: The mediating role of motherhood myths.

    Science.gov (United States)

    Verniers, Catherine; Vala, Jorge

    2018-01-01

    The issue of gender equality in employment has given rise to numerous policies in advanced industrial countries, all aimed at tackling gender discrimination regarding recruitment, salary and promotion. Yet gender inequalities in the workplace persist. The purpose of this research is to document the psychosocial process involved in the persistence of gender discrimination against working women. Drawing on the literature on the justification of discrimination, we hypothesized that the myths according to which women's work threatens children and family life mediate the relationship between sexism and opposition to a mother's career. We tested this hypothesis using the Family and Changing Gender Roles module of the International Social Survey Programme. The dataset contained data collected in 1994 and 2012 from 51,632 respondents from 18 countries. Structural equation modelling confirmed the hypothesised mediation. Overall, the findings shed light on how motherhood myths justify the gender structure in countries promoting gender equality.
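    The mediation claim can be illustrated with a product-of-coefficients sketch. The snippet below uses synthetic data and plain OLS rather than the authors' full structural equation models on the ISSP data; all path coefficients and variable names are assumed.

```python
import numpy as np
import statsmodels.api as sm

# Minimal mediation sketch: sexism -> motherhood myths -> opposition to a
# mother's career. Synthetic data; path coefficients are invented.
rng = np.random.default_rng(3)
n = 2000
sexism = rng.normal(size=n)
myths = 0.5 * sexism + rng.normal(size=n)                      # a path
opposition = 0.6 * myths + 0.1 * sexism + rng.normal(size=n)   # b and c' paths

a = sm.OLS(myths, sm.add_constant(sexism)).fit().params[1]
fitb = sm.OLS(opposition,
              sm.add_constant(np.column_stack([myths, sexism]))).fit()
b, c_prime = fitb.params[1], fitb.params[2]

print("indirect (mediated) effect a*b =", round(a * b, 3))
print("direct effect c' =", round(c_prime, 3))
```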

  3. Justifying the WKB approximation in pure katabatic flows

    NARCIS (Netherlands)

    Grisogono, B.; Oerlemans, J.

    2002-01-01

    Pure katabatic flow is studied with a Prandtl-type model allowing eddy diffusivity / conductivity to vary with height. Recently we obtained an asymptotic solution to the katabatic flow assuming the validity of the WKB method, which solves the fourth-order governing equation coupling the momentum and

  4. Is Justified True Belief Knowledge? / ¿Una creencia verdadera justificada es conocimiento?

    Directory of Open Access Journals (Sweden)

    Edmund L. Gettier

    2013-12-01

    Full Text Available [EN] In this brief text, a bilingual edition of Is Justified True Belief Knowledge? (1963) by Edmund L. Gettier is presented, offering counterexamples to the definition of «knowledge» as «justified true belief».

  5. Evidence-based, ethically justified counseling for fetal bilateral renal agenesis

    Science.gov (United States)

    Thomas, Alana N.; McCullough, Laurence B.; Chervenak, Frank A.; Placencia, Frank X.

    2017-01-01

    Background: Not much data are available on the natural history of bilateral renal agenesis, as the medical community does not typically offer aggressive obstetric or neonatal care, since bilateral renal agenesis has been accepted as a lethal condition. Aim: To provide an evidence-based, ethically justified approach to counseling pregnant women about the obstetric management of bilateral renal agenesis. Study design: A systematic literature search was performed using multiple databases. We deploy an ethical analysis of the results of the literature search on the basis of the professional responsibility model of obstetric ethics. Results: Eighteen articles met the inclusion criteria for review. With the exception of a single case study using serial amnioinfusion, there has been no other documented case of survival following dialysis and transplantation. Liveborn babies die during the neonatal period. Counseling pregnant women about management of pregnancies complicated by bilateral renal agenesis should be guided by beneficence-based judgment informed by evidence about outcomes. Conclusions: Based on the ethical analysis of the results from this review, without experimental obstetric intervention, neonatal mortality rates will continue to be 100%. Serial amnioinfusion therefore should not be offered as treatment, but only as approved innovation or research. PMID:28222038

  6. Evidence-based, ethically justified counseling for fetal bilateral renal agenesis.

    Science.gov (United States)

    Thomas, Alana N; McCullough, Laurence B; Chervenak, Frank A; Placencia, Frank X

    2017-07-26

    Not much data are available on the natural history of bilateral renal agenesis, as the medical community does not typically offer aggressive obstetric or neonatal care, since bilateral renal agenesis has been accepted as a lethal condition. To provide an evidence-based, ethically justified approach to counseling pregnant women about the obstetric management of bilateral renal agenesis. A systematic literature search was performed using multiple databases. We deploy an ethical analysis of the results of the literature search on the basis of the professional responsibility model of obstetric ethics. Eighteen articles met the inclusion criteria for review. With the exception of a single case study using serial amnioinfusion, there has been no other documented case of survival following dialysis and transplantation. Liveborn babies die during the neonatal period. Counseling pregnant women about management of pregnancies complicated by bilateral renal agenesis should be guided by beneficence-based judgment informed by evidence about outcomes. Based on the ethical analysis of the results from this review, without experimental obstetric intervention, neonatal mortality rates will continue to be 100%. Serial amnioinfusion therefore should not be offered as treatment, but only as approved innovation or research.

  7. Semiparametric mixed-effects analysis of PK/PD models using differential equations.

    Science.gov (United States)

    Wang, Yi; Eskridge, Kent M; Zhang, Shunpu

    2008-08-01

    Motivated by the use of semiparametric nonlinear mixed-effects modeling on longitudinal data, we develop a new semiparametric modeling approach to address potential structural model misspecification for population pharmacokinetic/pharmacodynamic (PK/PD) analysis. Specifically, we use a set of ordinary differential equations (ODEs) with form dx/dt = A(t)x + B(t) where B(t) is a nonparametric function that is estimated using penalized splines. The inclusion of a nonparametric function in the ODEs makes identification of structural model misspecification feasible by quantifying the model uncertainty and provides flexibility for accommodating possible structural model deficiencies. The resulting model will be implemented in a nonlinear mixed-effects modeling setup for population analysis. We illustrate the method with an application to cefamandole data and evaluate its performance through simulations.
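    A minimal sketch of the paper's ODE form, assuming a one-compartment elimination model with A(t) = -k and a cubic B-spline standing in for the nonparametric perturbation B(t). The rate constant, knots, and spline coefficients below are hypothetical, and the mixed-effects and penalization layers of the paper are omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import BSpline

# Sketch of dx/dt = A(t) x + B(t) with A(t) = -k (one-compartment
# elimination) and B(t) a cubic spline; all numbers are illustrative.
k = 0.5                                                   # assumed rate
knots = np.array([0, 0, 0, 0, 2, 4, 6, 8, 8, 8, 8], dtype=float)
coefs = np.array([0.0, 0.3, -0.2, 0.1, 0.0, 0.0, 0.0])   # hypothetical
B = BSpline(knots, coefs, k=3)

def rhs(t, x):
    # the semiparametric right-hand side: structural part plus spline term
    return -k * x + B(t)

sol = solve_ivp(rhs, (0.0, 8.0), y0=[10.0], dense_output=True)
print(sol.y[0, -1])   # state at t = 8
```

    In the paper's setting, the spline coefficients would be estimated (with a roughness penalty) alongside the structural and random-effects parameters; a markedly nonzero fitted B(t) then signals structural misspecification.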

  8. Diagnostic Measures for the Cox Regression Model with Missing Covariates.

    Science.gov (United States)

    Zhu, Hongtu; Ibrahim, Joseph G; Chen, Ming-Hui

    2015-12-01

    This paper investigates diagnostic measures for assessing the influence of observations and model misspecification in the presence of missing covariate data for the Cox regression model. Our diagnostics include case-deletion measures, conditional martingale residuals, and score residuals. The Q-distance is proposed to examine the effects of deleting individual observations on the estimates of finite-dimensional and infinite-dimensional parameters. Conditional martingale residuals are used to construct goodness-of-fit statistics for testing possible misspecification of the model assumptions. A resampling method is developed to approximate the p-values of the goodness-of-fit statistics. Simulation studies are conducted to evaluate our methods, and a real data set is analyzed to illustrate their use.
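    The role of martingale residuals as a misspecification diagnostic can be sketched quickly. The example below uses statsmodels' PHReg on synthetic, fully observed data with a deliberately misspecified (linear) covariate effect; it does not reproduce the paper's missing-covariate machinery, Q-distance, or resampling scheme.

```python
import numpy as np
from statsmodels.duration.hazard_regression import PHReg

# Sketch: martingale residuals from a Cox fit reveal a misspecified
# functional form. Synthetic data; the true effect is quadratic in x.
rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
true_linpred = 0.8 * x**2
T = rng.exponential(scale=np.exp(-true_linpred))   # hazard rises with x^2
C = rng.exponential(scale=2.0, size=n)             # independent censoring
time = np.minimum(T, C)
event = (T <= C).astype(int)

# Deliberately misspecified fit: linear in x only
res = PHReg(time, x[:, None], status=event).fit()
mart = res.martingale_residuals

# A systematic trend of residuals in x^2 flags the misspecification
print("corr(x^2, martingale residual):", np.corrcoef(x**2, mart)[0, 1])
```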

  9. The neural correlates of justified and unjustified killing: an fMRI study.

    Science.gov (United States)

    Molenberghs, Pascal; Ogilvie, Claudette; Louis, Winnifred R; Decety, Jean; Bagnall, Jessica; Bain, Paul G

    2015-10-01

    Despite moral prohibitions on hurting other humans, some social contexts allow for harmful actions such as killing of others. One example is warfare, where killing enemy soldiers is seen as morally justified. Yet, the neural underpinnings distinguishing between justified and unjustified killing are largely unknown. To improve understanding of the neural processes involved in justified and unjustified killing, participants had to imagine being the perpetrator whilst watching 'first-person perspective' animated videos where they shot enemy soldiers ('justified violence') and innocent civilians ('unjustified violence'). When participants imagined themselves shooting civilians compared with soldiers, greater activation was found in the lateral orbitofrontal cortex (OFC). Regression analysis revealed that the more guilt participants felt about shooting civilians, the greater the response in the lateral OFC. Effective connectivity analyses further revealed an increased coupling between lateral OFC and the temporoparietal junction (TPJ) when shooting civilians. The results show that the neural mechanisms typically implicated with harming others, such as the OFC, become less active when the violence against a particular group is seen as justified. This study therefore provides unique insight into how normal individuals can become aggressors in specific situations.

  10. Reward maximization justifies the transition from sensory selection at childhood to sensory integration at adulthood.

    Science.gov (United States)

    Daee, Pedram; Mirian, Maryam S; Ahmadabadi, Majid Nili

    2014-01-01

    In a multisensory task, human adults integrate information from different sensory modalities--behaviorally in an optimal Bayesian fashion--while children mostly rely on a single sensory modality for decision making. The reason behind this change of behavior over age and the process behind learning the required statistics for optimal integration are still unclear and have not been justified by conventional Bayesian modeling. We propose an interactive multisensory learning framework without making any prior assumptions about the sensory models. In this framework, learning in every modality and in their joint space is done in parallel using a single-step reinforcement learning method. A simple statistical test on confidence intervals on the mean of reward distributions is used to select the most informative source of information among the individual modalities and the joint space. Analyses of the method and the simulation results on a multimodal localization task show that the learning system autonomously starts with sensory selection and gradually switches to sensory integration. This is because relying more on modalities--i.e., selection--at early learning steps (childhood) is more rewarding than favoring decisions learned in the joint space, since the smaller state-space in each modality results in faster learning. In contrast, after gaining sufficient experience (adulthood), the quality of learning in the joint space matures while learning in modalities suffers from insufficient accuracy due to perceptual aliasing. This results in a tighter confidence interval for the joint space and consequently causes a smooth shift from selection to integration. It suggests that sensory selection and integration are emergent behaviors and both are outputs of a single reward maximization process; i.e., the transition is not a preprogrammed phenomenon.
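    The selection-versus-integration mechanism can be caricatured in a few lines. The sketch below is a loose stand-in for the authors' framework, not their implementation: three reward streams (two single modalities and a slower-to-mature joint space) are compared by a lower confidence bound on mean reward; all learning rates and reward levels are invented.

```python
import numpy as np

# Toy model: each source's competence grows toward its ceiling at its own
# rate; the agent picks the source with the best lower confidence bound.
rng = np.random.default_rng(7)

def ci_halfwidth(rewards):
    n = len(rewards)
    return 1.96 * np.std(rewards) / np.sqrt(n) if n > 1 else np.inf

# joint space learns slower (larger state space) but tops out higher
ceiling = {"modality_A": 0.60, "modality_B": 0.55, "joint": 0.80}
rate = {"modality_A": 0.05, "modality_B": 0.05, "joint": 0.005}

history = {k: [] for k in ceiling}
for step in range(1, 5001):
    for k in ceiling:
        competence = ceiling[k] * (1 - np.exp(-rate[k] * step))
        history[k].append(rng.normal(competence, 0.1))
    if step in (100, 1000, 5000):
        lcb = {k: np.mean(v) - ci_halfwidth(v) for k, v in history.items()}
        print(step, max(lcb, key=lcb.get))   # selected source at this age
```

    Run as written, the early check favors a single modality and the later checks favor the joint space, mirroring the selection-to-integration transition the record describes.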

  11. Compatriot partiality and cosmopolitan justice: Can we justify compatriot partiality within the cosmopolitan framework?

    Directory of Open Access Journals (Sweden)

    Rachelle Bascara

    2016-10-01

    Full Text Available This paper shows an alternative way in which compatriot partiality could be justified within the framework of global distributive justice. Philosophers who argue that compatriot partiality is similar to racial partiality capture something correct about compatriot partiality. However, the analogy should not lead us to comprehensively reject compatriot partiality. We can justify compatriot partiality on the same grounds that liberation movements and affirmative action have been justified. Hence, given cosmopolitan demands of justice, special consideration for the economic well-being of your nation as a whole is justified if and only if the country it identifies is an oppressed developing nation in an unjust global order. This justification is incomplete. We also need to say why Person A, qua national of Country A, is justified in helping her compatriots in Country A over similarly or slightly more oppressed non-compatriots in Country B. I argue that Person A’s partiality towards her compatriots admits further vindication because it is part of an oppressed group’s project of self-emancipation, which is preferable to paternalistic emancipation. Finally, I identify three benefits in my justification for compatriot partiality. First, I do not offer a blanket justification for all forms of compatriot partiality. Partiality between members of oppressed groups is only a temporary effective measure designed to level an unlevel playing field. Second, because history attests that sovereign republics could arise as a collective response to colonial oppression, justifying compatriot partiality on the grounds that I have identified is conducive to the development of sovereignty and even democracy in poor countries, thereby avoiding problems of infringement that many humanitarian poverty alleviation efforts encounter. Finally, my justification for compatriot partiality complies with the implicit cosmopolitan commitment to the realizability of global justice

  12. The Luckless and the Doomed: Contractualism on Justified Risk-Imposition

    DEFF Research Database (Denmark)

    Holm, Sune Hannibal

    2018-01-01

    Several authors have argued that contractualism faces a dilemma when it comes to justifying risks generated by socially valuable activities. At the heart of the matter is the question of whether contractualists should adopt an ex post or an ex ante perspective when assessing whether an action or policy is justifiable to each person. In this paper I argue for the modest conclusion that ex post contractualism is a live option notwithstanding recent criticisms raised by proponents of the ex ante perspective. I then consider how an ex post contractualist can best respond to the problem that it seems

  13. Fit Indexes, Lagrange Multipliers, Constraint Changes and Incomplete Data in Structural Models.

    Science.gov (United States)

    Bentler, P M

    1990-04-01

    Certain aspects of model modification and evaluation are discussed, with an emphasis on some points of view that expand upon or may differ from Kaplan (1990). The usefulness of Bentler-Bonett indexes is reiterated. When the degree of misspecification can be measured by the size of the noncentrality parameter of a χ² distribution, the comparative fit index provides a useful general index of model adequacy that does not require knowledge of sources of misspecification. The dependence of the Lagrange Multiplier χ² statistic on both the estimated multiplier parameter and the estimated constraint or parameter change is discussed. A sensitivity theorem that shows the effects of unit change in constraints on model fit is developed for model modification in structural models. Recent incomplete data methods, such as those developed by Kaplan and his collaborators, are extended to be applicable in a wider range of situations.
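    For reference, the comparative fit index discussed in this record is conventionally defined through estimated noncentrality parameters; the formulas below use standard notation (M for the target model, B for the baseline model) rather than anything quoted from the article.

```latex
% Standard definition of the comparative fit index (CFI) via estimated
% noncentrality parameters; M = target model, B = baseline (null) model.
\[
  \hat{\lambda}_M = \max\left(\chi^2_M - df_M,\; 0\right), \qquad
  \hat{\lambda}_B = \max\left(\chi^2_B - df_B,\; \chi^2_M - df_M,\; 0\right)
\]
\[
  \mathrm{CFI} = 1 - \frac{\hat{\lambda}_M}{\hat{\lambda}_B}
\]
```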

  14. Bayesian inference for partially identified models exploring the limits of limited data

    CERN Document Server

    Gustafson, Paul

    2015-01-01

    Table of contents (as indexed): Introduction; Identification; What Is against Us?; What Is for Us?; Some Simple Examples of Partially Identified Models; The Road Ahead; The Structure of Inference in Partially Identified Models; Bayesian Inference; The Structure of Posterior Distributions in PIMs; Computational Strategies; Strength of Bayesian Updating, Revisited; Posterior Moments; Credible Intervals; Evaluating the Worth of Inference; Partial Identification versus Model Misspecification; The Siren Call of Identification; Comp

  15. The Luckless and the Doomed: Contractualism on Justified Risk-Imposition

    DEFF Research Database (Denmark)

    Holm, Sune Hannibal

    2018-01-01

    Several authors have argued that contractualism faces a dilemma when it comes to justifying risks generated by socially valuable activities. At the heart of the matter is the question of whether contractualists should adopt an ex post or an ex ante perspective when assessing whether an action or ... to prohibit a range of intuitively permissible and socially valuable activities.

  16. Context Based Inferences in Research Methodology: The Role of Culture in Justifying Knowledge Claims

    Science.gov (United States)

    Evers, Colin W.; Mason, Mark

    2011-01-01

    Drawing on work in epistemology and the philosophy of science, this paper seeks to provide very general reasons for why a comparative perspective needs to be applied to the inferential procedures of research methodologies where these concern the issue of justifying knowledge claims. In particular, the paper explores the role of culture on a number…

  17. Conjecturing, Generalizing and Justifying: Building Theory around Teacher Knowledge of Proving

    Science.gov (United States)

    Lesseig, Kristin

    2016-01-01

    The purpose of this study was to detail teachers' proving activity and contribute to a framework of Mathematical Knowledge for Teaching Proof (MKT for Proof). While working to justify claims about sums of consecutive numbers, teachers searched for key ideas and productively used examples to make, test and refine conjectures. Analysis of teachers'…

  18. Intervention in Countries with Unsustainable Energy Policies: Is it Ever Justifiable?

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, Bruce Edward [ORNL

    2010-08-01

    This paper explores whether it is ever justifiable for the international community to forcibly intervene in countries that have unsustainable energy policies. The literature on obligations to future generations suggests, philosophically, that intervention might be justified under certain circumstances. Additionally, the world community has intervened in the affairs of other countries for humanitarian reasons, such as in Kosovo, Somalia, and Haiti. However, intervention to deal with serious energy problems is a qualitatively different and more difficult problem. A simple risk analysis framework is used to organize the discussion about possible conditions for justifiable intervention. If the probability of deaths resulting from unsustainable energy policies is very large, if the energy problem can be attributed to a relatively small number of countries, and if the risk of intervention is acceptable (i.e., the number of deaths due to intervention is relatively small), then intervention may be justifiable. Without further analysis and successful solution of several vexing theoretical questions, it cannot be stated whether unsustainable energy policies being pursued by countries at the beginning of the 21st century meet the criteria for forcible intervention by the international community.

  19. Mandatory Personal Therapy: Does the Evidence Justify the Practice? In Debate

    Science.gov (United States)

    Chaturvedi, Surabhi

    2013-01-01

    The article addresses the question of whether the practice of mandatory personal therapy, followed by several training organisations, is justified by existing research and evidence. In doing so, it discusses some implications of this training requirement from an ethical and ideological standpoint, raising questions of import for training…

  20. Does an appeal to the common good justify individual sacrifices for genomic research?

    NARCIS (Netherlands)

    Hoedemaekers, R.H.M.V.; Gordijn, B.; Pijnenburg, M.A.M.

    2006-01-01

    In genomic research the ideal standard of free, informed, prior, and explicit consent is believed to restrict important research studies. For certain types of genomic research other forms of consent are therefore proposed which are ethically justified by an appeal to the common good. This notion is

  1. "Teach Your Children Well": Arguing in Favor of Pedagogically Justifiable Hospitality Education

    Science.gov (United States)

    Potgieter, Ferdinand J.

    2016-01-01

    This paper is a sequel to the paper which I delivered at last year's BCES conference in Sofia. Making use of hermeneutic phenomenology and constructive interpretivism as methodological apparatus, I challenge the pedagogic justifiability of the fashionable notion of religious tolerance. I suggest that we need, instead, to reflect "de…

  2. Is radiography justified for the evaluation of patients presenting with cervical spine trauma?

    Energy Technology Data Exchange (ETDEWEB)

    Theocharopoulos, Nicholas; Chatzakis, Georgios; Damilakis, John [Department of Medical Physics, Faculty of Medicine, University of Crete, P.O. Box 2208, Iraklion, 71003 Crete (Greece) and Department of Natural Sciences, Technological Education Institute of Crete, P.O. Box 140, Iraklion 71004 Crete (Greece); Department of Radiology, Faculty of Medicine, University of Crete, P.O. Box 2208, Iraklion, 71003 Crete (Greece); Department of Medical Physics, Faculty of Medicine, University of Crete, P.O. Box 2208, Iraklion, 71003 Crete (Greece)

    2009-10-15

    radiogenic lethal cancer incidents. According to the decision model calculations, the use of CT is more favorable than the use of radiography alone or radiography with CT, by a factor of 13 for low-risk 20-yr-old patients up to a factor of 23 for high-risk patients younger than 80 yr. The radiography/CT imaging strategy slightly outperforms plain radiography for high- and moderate-risk patients. Regardless of the patient age, sex, and fracture risk, the higher diagnostic accuracy obtained by the CT examination counterbalances the increase in dose compared to plain radiography or radiography followed by CT only for positive radiographs, and renders CT utilization justified and the radiographic screening redundant.

  3. Heteroscedasticity as a Basis of Direction Dependence in Reversible Linear Regression Models.

    Science.gov (United States)

    Wiedermann, Wolfgang; Artner, Richard; von Eye, Alexander

    2017-01-01

    Heteroscedasticity is a well-known issue in linear regression modeling. When heteroscedasticity is observed, researchers are advised to remedy possible model misspecification of the explanatory part of the model (e.g., considering alternative functional forms and/or omitted variables). The present contribution discusses another source of heteroscedasticity in observational data: Directional model misspecifications in the case of nonnormal variables. Directional misspecification refers to situations where alternative models are equally likely to explain the data-generating process (e.g., x → y versus y → x). It is shown that the homoscedasticity assumption is likely to be violated in models that erroneously treat true nonnormal predictors as response variables. Recently, Direction Dependence Analysis (DDA) has been proposed as a framework to empirically evaluate the direction of effects in linear models. The present study links the phenomenon of heteroscedasticity with DDA and describes visual diagnostics and nine homoscedasticity tests that can be used to make decisions concerning the direction of effects in linear models. Results of a Monte Carlo simulation that demonstrates the adequacy of the approach are presented. An empirical example is provided, and applicability of the methodology in cases of violated assumptions is discussed.
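    The direction-dependence logic can be sketched with a single homoscedasticity test standing in for the nine the paper discusses. In the snippet below (synthetic data, assumed effect size), the correctly directed model x → y should pass a Breusch-Pagan test, while the reversed model y → x tends to fail it because the true predictor is nonnormal.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

# Sketch of the DDA idea: fit both candidate directions and compare
# residual heteroscedasticity. Data and coefficients are invented.
rng = np.random.default_rng(42)
n = 1000
x = rng.exponential(size=n)            # nonnormal true predictor
y = 0.7 * x + rng.normal(size=n)       # true direction: x -> y

def bp_pvalue(resp, pred):
    X = sm.add_constant(pred)
    resid = sm.OLS(resp, X).fit().resid
    # het_breuschpagan returns (LM stat, LM p-value, F stat, F p-value)
    return het_breuschpagan(resid, X)[1]

print("x -> y  BP p-value:", bp_pvalue(y, x))   # correctly directed model
print("y -> x  BP p-value:", bp_pvalue(x, y))   # mis-directed model
```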

  4. Should Live Patient Licensing Examinations in Dentistry Be Discontinued? Two Viewpoints: Viewpoint 1: Alternative Assessment Models Are Not Yet Viable Replacements for Live Patients in Clinical Licensure Exams and Viewpoint 2: Ethical and Patient Care Concerns About Live Patient Exams Require Full Acceptance of Justifiable Alternatives.

    Science.gov (United States)

    Chu, Tien-Min Gabriel; Makhoul, Nicholas M; Silva, Daniela Rodrigues; Gonzales, Theresa S; Letra, Ariadne; Mays, Keith A

    2018-03-01

    This Point/Counterpoint article addresses a long-standing but still-unresolved debate on the advantages and disadvantages of using live patients in dental licensure exams. Two contrasting viewpoints are presented. Viewpoint 1 supports the traditional use of live patients, arguing that other assessment models have not yet been demonstrated to be viable alternatives to the actual treatment of patients in the clinical licensure process. This viewpoint also contends that the use of live patients and inherent variances in live patient treatment represent the realities of daily private practice. Viewpoint 2 argues that the use of live patients in licensure exams needs to be discontinued considering those exams' ethical dilemmas of exposing patients to potential harm, as well as their lack of reliability and validity and limited scope. According to this viewpoint, the current presence of viable alternatives means that the risk of harm inherent in live patient exams can finally be eliminated and those exams replaced with other means to confirm that candidates are qualified for licensure to practice.

  5. Routine X-ray of the chest is not justified in staging of cutaneous melanoma patients

    DEFF Research Database (Denmark)

    Gjorup, Caroline Asirvatham; Hendel, Helle Westergren; Pilegaard, Rita Kaae

    2016-01-01

    INTRODUCTION: The incidence of cutaneous melanoma is increasing in Denmark and worldwide. However, the prevalence of distant metastases at the time of diagnosis has decreased to 1%. We therefore questioned the value of routine preoperative chest X-ray (CXR) for staging asymptomatic melanoma patients and hypothesised that routine CXR is not justified. METHODS: A retrospective study was conducted on patients undergoing wide local excision and sentinel lymph node biopsy for cutaneous melanoma in the period from 2010 to 2014. RESULTS: A total of 603 patients were included. The mean time of follow-up ... The positive predictive value was 8%, and the negative predictive value was 100%. CONCLUSION: Our results suggest that CXR cannot be justified in the initial staging of cutaneous melanoma patients. The guideline for the treatment of melanoma in Denmark is under revision: the use of CXR has been omitted. FUNDING: This study

  6. Justifying decisions in social dilemmas: justification pressures and tacit coordination under environmental uncertainty.

    Science.gov (United States)

    de Kwaadsteniet, Erik W; van Dijk, Eric; Wit, Arjaan; De Cremer, David; de Rooij, Mark

    2007-12-01

    This article investigates how justification pressures influence harvesting decisions in common resource dilemmas. The authors argue that when a division rule prescribes a specific harvest level, such as under environmental certainty, people adhere more strongly to this division rule when they have to justify their decisions to fellow group members. When a division rule does not prescribe a specific harvest level, such as under environmental uncertainty, people restrict their harvests when they have to justify their decisions to fellow group members. The results of two experimental studies corroborate this line of reasoning. The findings are discussed in terms of tacit coordination. The authors specify conditions under which justification pressures may or may not facilitate efficient coordination.

  7. Estimation of increased regional income that emanates from economically justified road construction projects

    Directory of Open Access Journals (Sweden)

    W. J. Pienaar

    2005-09-01

    Full Text Available This article identifies the possible development benefits that can emanate from economically justified road construction projects. It shows how the once-off increase in regional income resulting from investment in road construction projects, and the recurring additional regional income resulting from the use of new or improved roads, can be estimated. The difference between a cost-benefit analysis (to determine how economically justified a project is) and a regional economic income analysis (to estimate the general economic benefits that will be generated by investment in and usage of a road) is shown. Procedures are proposed through which the once-off and recurring increases in regional income can be estimated by using multiplier and accelerator analyses respectively. Finally, guidelines are supplied on the appropriate usage of input variables in the calculation of the regional income multiplier.

  8. Temporal artery biopsy in the diagnosis of giant cell arteritis: Does the end justify the means?

    Directory of Open Access Journals (Sweden)

    K. Bowling

    2017-08-01

    Conclusion: Overall, 13.2% of our biopsies were positive for GCA and 87.3% of biopsy-negative patients continued prednisolone therapy on clinical grounds. In the face of new diagnostic tests (high-resolution MRI (magnetic resonance imaging), colour duplex USS (ultrasound scan) and PET (positron emission tomography)), can we justify invasive surgery for all patients on histological grounds when the results may not alter management? Further investigation is needed directly comparing newer imaging modalities to histology.

  9. Private Motive, Humanitarian Intent: A Theory of Ethically Justified Private Intervention

    Science.gov (United States)

    2013-06-01

    [Bibliographic residue from the thesis PDF; recoverable details: thesis by Edwin D. Morton III, June 2013, thesis advisor Bradley J...; the extracted fragments cite J. Carl Ficarrotta, "Just War Theory: Triumphant... and Doing More Harm than Good," in his Kantian Thinking about Military Ethics, 107-118 and 169-195.]

  10. How arguments are justified in the media debate on climate change in the USA and France

    OpenAIRE

    Ylä-Anttila, Tuomas; Kukkonen, Anna

    2014-01-01

    This paper examines the differences in the values that are evoked to justify arguments in the media debate on climate change in the USA and France from 1997 to 2011. We find that climate change is more often discussed in terms of justice, democracy, and legal regulation in France, while monetary value plays a more important role as a justification for climate policy arguments in the USA. Technological and scientific arguments are more often made in France, and ecological arguments equally in both...

  11. Influenza vaccination in Dutch nursing homes: is tacit consent morally justified?

    Science.gov (United States)

    Verweij, M F; van den Hoven, M A

    2005-01-01

    Efficient procedures for obtaining informed (proxy) consent may contribute to high influenza vaccination rates in nursing homes. Yet are such procedures justified? This study's objective was to gain insight into informed consent policies in Dutch nursing homes; to assess how these may affect influenza vaccination rates and to answer the question whether deviating from standard informed consent procedures could be morally justified. A survey among nursing home physicians. We sent a questionnaire to all (356) nursing homes in the Netherlands, to be completed by one of the physicians. We received 245 completed questionnaires. As 21 institutions appeared to be closed or merged into other institutions, the response was 73.1% (245/335). Of all respondents 81.9% reported a vaccination rate above 80%. Almost 50% reported a vaccination rate above 90%. Most respondents considered herd immunity to be an important consideration for institutional policy. Freedom of choice for residents was considered important by almost all. Nevertheless, 106 out of 245 respondents follow a tacit consent procedure, according to which vaccination will be administered unless the resident or her proxy refuses. These institutions show significantly higher vaccination rates. ... tacit consent procedures can be morally justifiable. Such procedures assume that vaccination is good for residents either as individuals or as a group. Even though this assumption may be true for most residents, there are good reasons for preferring express consent procedures.

  12. Justifiability and Animal Research in Health: Can Democratisation Help Resolve Difficulties?

    Science.gov (United States)

    2018-01-01

    Simple Summary: Scientists justify animal use in medical research because the benefits to human health outweigh the costs or harms to animals. However, whether it is justifiable is controversial for many people. Even public interests are divided because an increasing proportion of people do not support animal research, while demand for healthcare that is based on animal research is also rising. The wider public should be given more influence in these difficult decisions. This could be through requiring explicit disclosure about the role of animals in drug labelling to inform the public out of respect for people with strong objections. It could also be done through periodic public consultations that use public opinion and expert advice to decide which diseases justify the use of animals in medical research. More public input will help ensure that animal research projects meet public expectations and may help to promote changes to facilitate medical advances that need fewer animals. Abstract: Current animal research ethics frameworks emphasise consequentialist ethics through cost-benefit or harm-benefit analysis. However, these ethical frameworks along with institutional animal ethics approval processes cannot satisfactorily decide when a given potential benefit is outweighed by costs to animals. The consequentialist calculus should, theoretically, provide for situations where research into a disease or disorder is no longer ethical, but this is difficult to determine objectively. Public support for animal research is also falling as demand for healthcare is rising. Democratisation of animal research could help resolve these tensions through facilitating ethical health consumerism or giving the public greater input into deciding the diseases and disorders where animal research is justified. Labelling drugs to disclose animal use and providing a plain-language summary of the role of animals may help promote public understanding and would respect the ethical beliefs of

  13. Modelling severe Staphylococcus aureus sepsis in conscious pigs: are implications for animal welfare justified?

    DEFF Research Database (Denmark)

    Olsen, Helle G; Kjelgaard-Hansen, Mads; Tveden-Nyborg, Pernille

    2016-01-01

    by the severity of induced disease, which in some cases necessitated humane euthanasia. A pilot study was therefore performed in order to establish the sufficient inoculum concentration and application protocol needed to produce signs of liver dysfunction within limits of our pre-defined humane endpoints. Four ... Prior to euthanasia, a galactose elimination capacity test was performed to assess liver function. Pigs were euthanised 48 h post inoculation for necropsy and histopathological evaluation. While infusion times of 6.66 min, and higher, did not induce liver dysfunction (n = 3), the infusion time of 3

  14. Bivariate Random Effects Meta-analysis of Diagnostic Studies Using Generalized Linear Mixed Models

    Science.gov (United States)

    GUO, HONGFEI; ZHOU, YIJIE

    2011-01-01

    Bivariate random-effects models are currently one of the main methods recommended to synthesize diagnostic test accuracy studies. However, only the logit-transformation on sensitivity and specificity has been previously considered in the literature. In this paper, we consider a bivariate generalized linear mixed model to jointly model the sensitivities and specificities, and discuss the estimation of the summary receiver operating characteristic curve (ROC) and the area under the ROC curve (AUC). As special cases of this model, we discuss the commonly used logit, probit and complementary log-log transformations. To evaluate the impact of misspecification of the link functions on the estimation, we present two case studies and a set of simulation studies. Our study suggests that point estimation of the median sensitivity and specificity, and AUC is relatively robust to the misspecification of the link functions. However, the misspecification of link functions has a noticeable impact on the standard error estimation and the 95% confidence interval coverage, which emphasizes the importance of choosing an appropriate link function to make statistical inference. PMID:19959794
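    The three link functions compared in the record differ in how a summary estimate on the linear-predictor scale maps back to a sensitivity or specificity. The sketch below applies the inverse logit, probit, and complementary log-log links to one hypothetical random-effects mean; it does not fit the bivariate GLMM itself.

```python
import numpy as np
from scipy.stats import norm

# Inverse links for the three transformations discussed in the record,
# applied to a hypothetical summary estimate eta on the link scale.
eta = 1.2   # assumed random-effects mean, illustration only

inv_links = {
    "logit":   lambda e: 1.0 / (1.0 + np.exp(-e)),
    "probit":  norm.cdf,
    "cloglog": lambda e: 1.0 - np.exp(-np.exp(e)),
}
for name, g_inv in inv_links.items():
    print(f"{name:8s} back-transformed sensitivity = {g_inv(eta):.3f}")
```

    The back-transformed values differ across links for the same eta, which is one way to see why the link choice matters more for interval estimation than the record finds it does for point estimates.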

  15. Is Date Rape Justifiable? The Effects of Dating Activity, Who Initiated, Who Paid, and Men's Attitudes toward Women.

    Science.gov (United States)

    Muehlenhard, Charlene L.; And Others

    1985-01-01

    Examined justifiability of date rape under various circumstances among male undergraduates. Rape was rated as significantly more justifiable if the couple went to the man's apartment rather than to a religious function, if the woman asked the man out, and if the man paid all the dating expenses. Differences between traditional and nontraditional…

  16. Disability, discrimination and death: is it justified to ration life saving treatment for disabled newborn infants?

    Science.gov (United States)

    Wilkinson, Dominic; Savulescu, Julian

    2014-01-01

    Disability might be relevant to decisions about life support in intensive care in several ways. It might affect the chance of treatment being successful, or a patient's life expectancy with treatment. It may affect whether treatment is in a patient's best interests. However, even if treatment would be of overall benefit it may be unaffordable and consequently unable to be provided. In this paper we will draw on the example of neonatal intensive care, and ask whether or when it is justified to ration life-saving treatment on the basis of disability. We argue that predicted disability is relevant both indirectly and directly to rationing decisions.

  17. Justifiability and Animal Research in Health: Can Democratisation Help Resolve Difficulties?

    Directory of Open Access Journals (Sweden)

    Shaun Yon-Seng Khoo

    2018-02-01

    Full Text Available Current animal research ethics frameworks emphasise consequentialist ethics through cost-benefit or harm-benefit analysis. However, these ethical frameworks along with institutional animal ethics approval processes cannot satisfactorily decide when a given potential benefit is outweighed by costs to animals. The consequentialist calculus should, theoretically, provide for situations where research into a disease or disorder is no longer ethical, but this is difficult to determine objectively. Public support for animal research is also falling as demand for healthcare is rising. Democratisation of animal research could help resolve these tensions through facilitating ethical health consumerism or giving the public greater input into deciding the diseases and disorders where animal research is justified. Labelling drugs to disclose animal use and providing a plain-language summary of the role of animals may help promote public understanding and would respect the ethical beliefs of objectors to animal research. National animal ethics committees could weigh the competing ethical, scientific, and public interests to provide a transparent mandate for animal research to occur when it is justifiable and acceptable. Democratic processes can impose ethical limits and provide mandates for acceptable research while facilitating a regulatory and scientific transition towards medical advances that require fewer animals.

  18. When is deliberate killing of young children justified? Indigenous interpretations of infanticide in Bolivia.

    Science.gov (United States)

    de Hilari, Caroline; Condori, Irma; Dearden, Kirk A

    2009-01-01

    In the Andes, as elsewhere, infanticide is a difficult challenge that remains largely undocumented and misunderstood. From January to March 2004 we used community-based vital event surveillance systems, discussions with health staff, ethnographic interviews, and focus group discussions among Aymara men and women from two geographically distinct sites in the Andes of Bolivia to provide insights into the practice of infanticide. We noted elevated mortality at both sites. In one location, suspected causes of infanticide were especially high for girls. We also observed that community members maintain beliefs that justify infanticide under certain circumstances. Among the Aymara, justification for infanticide was both biological (deformities and twinship) and social (illegitimate birth, family size and poverty). Communities generally did not condemn killing when reasons for doing so were biological, but the taking of life for social reasons was rarely justified. In this cultural context, strategies to address the challenge of infanticide should include education of community members about alternatives to infanticide. At a program level, planners and implementers should target ethnic groups with high levels of infanticide and train health care workers to detect and address multiple warning signs for infanticide (for example, domestic violence and child maltreatment) as well as proxies for infant neglect and abuse such as mother/infant separation and bottle use.

  19. Justifying molecular images in cell biology textbooks: From constructions to primary data.

    Science.gov (United States)

    Serpente, Norberto

    2016-02-01

    For scientific claims to be reliable and productive they have to be justified. However, on the one hand little is known on what justification precisely means to scientists, and on the other the position held by philosophers of science on what it entails is rather limited; for justifications customarily refer to the written form (textual expressions) of scientific claims, leaving aside images, which, as many cases from the history of science show are relevant to this process. The fact that images can visually express scientific claims independently from text, plus their vast variety and origins, requires an assessment of the way they are currently justified and in turn used as sources to justify scientific claims in the case of particular scientific fields. Similarly, in view of the different nature of images, analysis is required to determine on what side of the philosophical distinction between data and phenomena these different kinds of images fall. This paper historicizes and documents a particular aspect of contemporary life sciences research: the use of the molecular image as vehicle of knowledge production in cell studies, a field that has undergone a significant shift in visual expressions from the early 1980s onwards. Focussing on textbooks as sources that have been overlooked in the historiography of contemporary biomedicine, the aim is to explore (1) whether the shift of cell studies, entailing a superseding of the optical image traditionally conceptualised as primary data, by the molecular image, corresponds with a shift of justificatory practices, and (2) to assess the role of the molecular image as primary data. This paper also explores the dual role of images as teaching resources and as resources for the construction of knowledge in cell studies especially in its relation to discovery and justification. Finally, this paper seeks to stimulate reflection on what kind of archival resources could benefit the work of present and future epistemic

  20. Efficient experimental designs for sigmoidal growth models

    OpenAIRE

    Dette, Holger; Pepelyshev, Andrey

    2005-01-01

    For the Weibull and Richards regression models, robust designs are determined by maximizing a minimum of D- or D1-efficiencies, taken over a certain range of the non-linear parameters. It is demonstrated that the derived designs yield a satisfactory solution of the optimal design problem for this type of model in the sense that these designs are efficient and robust with respect to misspecification of the unknown parameters. Moreover, the designs can also be used for testing the postulated for...
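
    The maximin idea behind such robust designs can be conveyed with a small sketch. The mean function below (a simple exponential-growth curve rather than the paper's Weibull or Richards models), the parameter range, and the two candidate designs are invented for illustration, and the worst-case log-determinant of the information matrix is compared directly rather than a properly normalized D-efficiency:

        import numpy as np

        def d_criterion(xs, ws, lam, b=1.0):
            # log-determinant of the Fisher information of the design {(x_i, w_i)}
            # for the mean a - b*exp(-lam*x); b only rescales the criterion,
            # so it does not change the ranking of designs.
            M = np.zeros((3, 3))
            for x, w in zip(xs, ws):
                e = np.exp(-lam * x)
                f = np.array([1.0, -e, b * x * e])   # gradient wrt (a, b, lam)
                M += w * np.outer(f, f)
            sign, logdet = np.linalg.slogdet(M)
            return logdet if sign > 0 else -np.inf

        lam_grid = np.linspace(0.5, 2.0, 16)   # assumed range of the nonlinear parameter
        designs = {                            # two hypothetical 3-point designs
            "equally spaced": ([0.0, 2.5, 5.0], [1/3, 1/3, 1/3]),
            "front loaded":   ([0.0, 1.0, 5.0], [1/3, 1/3, 1/3]),
        }
        for name, (xs, ws) in designs.items():
            worst = min(d_criterion(xs, ws, lam) for lam in lam_grid)
            print(name, "worst-case log-det over the lambda range:", round(worst, 3))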

  1. How to justify enforcing a Ulysses contract when Ulysses is competent to refuse.

    Science.gov (United States)

    Davis, John K

    2008-03-01

    Sometimes the mentally ill have sufficient mental capacity to refuse treatment competently, and others have a moral duty to respect their refusal. However, those with episodic mental disorders may wish to precommit themselves to treatment, using Ulysses contracts known as "mental health advance directives." How can health care providers justify enforcing such contracts over an agent's current, competent refusal? I argue that providers respect an agent's autonomy not retrospectively--by reference to his or her past wishes--and not merely synchronically--so that the agent gets what he or she wants right now--but diachronically and prospectively, acting so that the agent can shape his or her circumstances as the agent wishes over time, for the agent will experience the consequences of providers' actions over time. Mental health directives accomplish this, so they are a way of respecting the agent's autonomy even when providers override the agent's current competent refusal.

  2. Justifying continuous sedation until death: a focus group study in nursing homes in Flanders, Belgium.

    Science.gov (United States)

    Rys, Sam; Deschepper, Reginald; Deliens, Luc; Mortier, Freddy; Bilsen, Johan

    2013-01-01

    Continuous Sedation until Death (CSD), the act of reducing or removing the consciousness of an incurably ill patient until death, has become a common practice in nursing homes in Flanders (Belgium). Quantitative research has suggested that CSD is not always properly applied. This qualitative study aims to explore and describe the circumstances under which nursing home clinicians consider CSD to be justified. Six focus groups were conducted, including 10 physicians, 24 nurses, and 14 care assistants working in either Catholic or non-Catholic nursing homes of varying size. Refractory suffering, limited life expectancy and respect for patient autonomy are considered essential elements in deciding for CSD. However, multiple factors complicate the care of nursing home residents at the end of life, and often hinder clinicians from putting these elements into practice. Nursing home clinicians may benefit from more information and instruction about managing CSD in the complex care situations which typically occur in nursing homes. Copyright © 2013 Mosby, Inc. All rights reserved.

  3. Do the ends justify the means? Nursing and the dilemma of whistleblowing.

    Science.gov (United States)

    Firtko, Angela; Jackson, Debra

    2005-01-01

    Patient advocacy and a desire to rectify misconduct in the clinical setting are frequently cited reasons for whistleblowing in nursing and healthcare. This paper explores current knowledge about whistleblowing in nursing and critiques current definitions of whistleblowing. The authors draw on published perspectives of whistleblowing, including the media's, to reflect on the role of the media in health-related whistleblowing. Whistleblowing represents a dilemma for nurses. It strikes at the heart of professional values and raises questions about the responsibilities nurses have to communities and clients, the profession, and themselves. In its most damaging forms, whistleblowing necessarily involves a breach of ethical standards, particularly confidentiality. Despite the pain that can be associated with whistleblowing, if the ends are improved professional standards, enhanced outcomes, rectification of wrongdoings, and increased safety for patients and staff in our health services, then the ends definitely justify the means.

  4. What justifies the United States ban on federal funding for nonreproductive cloning?

    Science.gov (United States)

    Cunningham, Thomas V

    2013-11-01

    This paper explores how current United States policies for funding nonreproductive cloning are justified and argues against that justification. I show that a common conceptual framework underlies the national prohibition on the use of public funds for cloning research, which I call the simple argument. This argument rests on two premises: that research harming human embryos is unethical and that embryos produced via fertilization are identical to those produced via cloning. In response to the simple argument, I challenge the latter premise. I demonstrate there are important ontological differences between human embryos (produced via fertilization) and clone embryos (produced via cloning). After considering the implications my argument has for the morality of publicly funding cloning for potential therapeutic purposes and potential responses to my position, I conclude that such funding is not only ethically permissible, but also humane national policy.

  5. Is the term "fasciculus opticus cerebralis" more justifiable than the term "optic nerve"?

    Science.gov (United States)

    Vojniković, Bojo; Bajek, Snjezana; Bajek, Goran; Strenja-Linić, Ines; Grubesić, Aron

    2013-04-01

    The terminology of the optic nerve had already been changed three times between 1895 and 1955, when the term "nervus opticus" was introduced in the "Terminologia Anatomica". Following our study we claim that, from the aspect of the phylogenetic evolution of binocular vision as well as of optic embryogenesis, where the opticus is evidently a product of diencephalic structures, the addition of the term "nervus" to opticus is neither adequate nor justified. From the clinical aspect the term "nervus opticus" is also inadequate, both because this "nerve" has no functional regenerative properties, unlike other cranial nerves, and from the pedagogical and didactical aspect of educating future physicians. We suggest that the term "Fasciculus Opticus Cerebralis" should be used instead, as it much better explains the origin of the structure as well as its affiliation to the central nervous system.

  6. Is development of geothermal energy resource in Macedonia justified or not?

    International Nuclear Information System (INIS)

    Popovski, Kiril; Popovska Vasilevska, Sanja

    2007-01-01

    During the 1980s, Macedonia was one of the world leaders in the development of direct application of geothermal energy. Within a period of only 6-7 years, geothermal energy reached a share of 0.7% in the State energy balance. However, the situation has changed during the last 20 years: the development of this energy resource has not only stopped, but some of the existing projects have been abandoned, leading to regression. This situation is illogical, given that direct application has proved technically feasible and economically justified. A summary of the present situation with geothermal projects in Macedonia is made in the paper, and possibilities for their improvement, as well as the justification for developing new resources, are assessed. The final conclusion is that the development of direct application of geothermal energy in Macedonia offers (in comparison with other renewable energy resources) the best energy and economic effects. (Author)

  7. Longstanding hydrocele in adult Black Africans: Is preoperative scrotal ultrasound justified?

    Science.gov (United States)

    Okorie, Chukwudi O.; Pisters, Louis L.; Liu, Ping

    2011-01-01

    Background: Longstanding hydrocele is very common among adult Black Africans. Preoperative scrotal ultrasound is widely used for adult patients presenting with hydrocele, with the main aim of ruling out more serious underlying pathologies like malignancy or testicular torsion. This paper analyzes the findings and the necessity of automatic ordering of scrotal ultrasound in cases of longstanding hydrocele in adult Black Africans. Materials and Methods: 102 consecutive patients with longstanding scrotal hydrocele were investigated clinically, and all patients also had routine preoperative scrotal ultrasound. Results: Overall, none of our patients had any serious underlying pathology associated with their hydrocele. 97% of the patients had simple hydrocele on ultrasound. Hydrocele was more common on the right (P=0.04) and more often bilateral in elderly patients (P=0.0002). Conclusions: Routine preoperative scrotal ultrasound does not seem to be justified in longstanding hydroceles. This is especially important considering the fact that most hydroceles are benign in origin and nature. PMID:22083049

  8. Is routine antenatal venereal disease research laboratory test still justified? Nigerian experience.

    Science.gov (United States)

    Nwosu, Betrand O; Eleje, George U; Obi-Nwosu, Amaka L; Ahiarakwem, Ita F; Akujobi, Comfort N; Egwuatu, Chukwudi C; Onyiuke, Chukwudumebi O C

    2015-01-01

    To determine the seroreactivity of pregnant women to syphilis in order to justify the need for routine antenatal syphilis screening. A multicenter retrospective analysis of routine antenatal venereal disease research laboratory (VDRL) test results between 1 September 2010 and 31 August 2012 at three specialist care hospitals in south-east Nigeria was done. Reactive VDRL results were confirmed using the Treponema pallidum hemagglutination assay. Analysis was by Epi Info 2008 version 3.5.1 and Stata/IC version 10. Adequate records were available for 2,156 patients and were thus reviewed. The mean age of the women was 27.4 years (±3.34), and the mean gestational age was 26.4 weeks (±6.36). Only 15 cases (0.70%) were seropositive to VDRL. The confirmatory T. pallidum hemagglutination assay was positive in 4 of the 15 cases, giving an overall prevalence of 0.19% and a false-positive rate of 73.3%. There was no significant difference in the prevalence of syphilis in relation to maternal age and parity (P>0.05). While the prevalence of syphilis is extremely low in the antenatal care population at the three specialist care hospitals in south-east Nigeria, the false-positive rate is high, and prevalence did not vary significantly with maternal age or parity. Because syphilis is still a serious but preventable and curable disease, screening with VDRL alone, without confirmatory tests, may not be justified. Given the increasing demand for evidence-based medicine and the litigation encountered in medical practice, we advocate that a confirmatory test for syphilis be introduced into routine antenatal testing to reduce the problem of false positives. The government should increase the health budget to include free routine antenatal testing, including the T. pallidum hemagglutination assay.

  9. Using outcomes data to justify instituting new technology: a single institution's experience.

    Science.gov (United States)

    Starker, P M; Chinn, B

    2018-03-01

    The PILLAR II trial demonstrated that PINPOINT is safe and feasible to use, with no reported adverse events, and resulted in no anastomotic leaks in patients who had a change in surgical plan based on PINPOINT's intraoperative assessment of tissue perfusion during colorectal resection. Whether the cost savings associated with this reduction in anastomotic complications can offset the cost of investing in PINPOINT is unknown. We performed a retrospective analysis of all patients (N = 347) undergoing colectomy with primary anastomosis from January 2015 to April 2016. These patients were stratified based on whether fluorescence imaging was used intraoperatively. The clinical outcomes of these patients were then evaluated based on their development of an anastomotic leak or stricture. The direct hospital costs per case were then calculated, and the economic impact of using fluorescence imaging was examined to assess whether decreased direct costs would justify the initial expenditure to purchase the new technology (PINPOINT System, NOVADAQ, Canada). Fluorescence imaging in colorectal surgery using PINPOINT reduced the anastomotic failure rate in patients who underwent colon resection. The PINPOINT group (n = 238) had two (0.84%) anastomotic failures, while the non-PINPOINT group (n = 109) had six (5.5%) anastomotic failures. In the PINPOINT group, 11 (4.6%) patients had a change in the resection margin based on the results of the fluorescence imaging, and none of these patients experienced an anastomotic failure. Cost per case was lower in the PINPOINT group secondary to fewer direct costs associated with complications. These results validate the findings of the PILLAR II trial and confirm that the decrease in direct costs due to the reduction in anastomotic failures justified the expense of the new technology after just 143 cases.

  10. Additive Intensity Regression Models in Corporate Default Analysis

    DEFF Research Database (Denmark)

    Lando, David; Medhat, Mamdouh; Nielsen, Mads Stenbo

    2013-01-01

    We consider additive intensity (Aalen) models as an alternative to the multiplicative intensity (Cox) models for analyzing the default risk of a sample of rated, nonfinancial U.S. firms. The setting allows for estimating and testing the significance of time-varying effects. We use a variety of model checking techniques to identify misspecifications. In our final model, we find evidence of time-variation in the effects of distance-to-default and short-to-long term debt. Also we identify interactions between distance-to-default and other covariates, and the quick ratio covariate is significant. None of our macroeconomic covariates are significant.
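
    As a rough illustration of fitting an additive (Aalen) intensity model with time-varying covariate effects, the sketch below uses the lifelines package; the package choice, the tiny made-up data set, and the column names ('duration', 'default', 'dtd', 'std_ratio') are assumptions for illustration, not the authors' code or data:

        import pandas as pd
        from lifelines import AalenAdditiveFitter

        df = pd.DataFrame({
            "duration":  [2.0, 3.5, 1.2, 4.8, 0.9, 3.1],   # years under observation
            "default":   [1,   0,   1,   0,   1,   0],     # 1 = default observed
            "dtd":       [1.1, 2.5, 0.4, 3.0, 0.2, 2.2],   # distance-to-default
            "std_ratio": [0.6, 0.3, 0.8, 0.2, 0.9, 0.4],   # short-to-long term debt
        })
        aaf = AalenAdditiveFitter(coef_penalizer=0.5)
        aaf.fit(df, duration_col="duration", event_col="default")
        # The cumulative regression functions B_k(t) reveal time-variation:
        # a markedly non-linear B_k(t) suggests a time-varying covariate effect.
        print(aaf.cumulative_hazards_.head())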

  11. Selection Criteria in Regime Switching Conditional Volatility Models

    Directory of Open Access Journals (Sweden)

    Thomas Chuffart

    2015-05-01

    A large number of nonlinear conditional heteroskedastic models have been proposed in the literature. Model selection is crucial to any statistical data analysis. In this article, we investigate whether the most commonly used selection criteria lead to the choice of the right specification in a regime switching framework. We focus on two types of models: the Logistic Smooth Transition GARCH and the Markov-Switching GARCH models. Simulation experiments reveal that information criteria and loss functions can lead to misspecification; BIC sometimes indicates the wrong regime switching framework. Depending on the Data Generating Process used in the experiments, great care is needed when choosing a criterion.
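
    A minimal sketch of criterion-based selection among competing conditional-volatility specifications is shown below, using the arch package (an assumption; arch covers standard GARCH-family models, not the Markov-Switching GARCH studied in the article). The point is only the mechanics of ranking specifications by AIC/BIC; as the article's simulations warn, the specification minimizing a criterion need not be the true data generating process:

        import numpy as np
        from arch import arch_model

        rng = np.random.default_rng(0)
        returns = rng.standard_t(df=6, size=1000)  # synthetic heavy-tailed returns

        candidates = {
            "GARCH(1,1)":  dict(vol="GARCH", p=1, q=1),
            "GJR(1,1,1)":  dict(vol="GARCH", p=1, o=1, q=1),
            "EGARCH(1,1)": dict(vol="EGARCH", p=1, q=1),
        }
        for name, spec in candidates.items():
            res = arch_model(returns, mean="Zero", **spec).fit(disp="off")
            print(name, "AIC:", round(res.aic, 1), "BIC:", round(res.bic, 1))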

  12. Is routine antenatal venereal disease research laboratory test still justified? Nigerian experience

    Directory of Open Access Journals (Sweden)

    Nwosu BO

    2015-01-01

    Betrand O Nwosu,1 George U Eleje,1 Amaka L Obi-Nwosu,2 Ita F Ahiarakwem,3 Comfort N Akujobi,4 Chukwudi C Egwuatu,4 Chukwudumebi O Onyiuke5 1Department of Obstetrics and Gynecology, Nnamdi Azikiwe University, Nnewi Campus, Nnewi, Anambra State, Nigeria; 2Department of Family Medicine, Nnamdi Azikiwe University Teaching Hospital, Nnewi, Nigeria; 3Department of Medical Microbiology, Imo State University Teaching Hospital, Orlu, Imo State, Nigeria; 4Department of Medical Microbiology, Nnamdi Azikiwe University, Nnewi Campus, Nnewi, Anambra State, Nigeria; 5Department of Medical Microbiology, Nnamdi Azikiwe University Teaching Hospital, Nnewi, Anambra State, Nigeria. Objective: To determine the seroreactivity of pregnant women to syphilis in order to justify the need for routine antenatal syphilis screening. Methods: A multicenter retrospective analysis of routine antenatal venereal disease research laboratory (VDRL) test results between 1 September 2010 and 31 August 2012 at three specialist care hospitals in south-east Nigeria was done. A reactive VDRL result is subjected for confirmation using Treponema pallidum hemagglutination assay test. Analysis was by Epi Info 2008 version 3.5.1 and Stata/IC version 10. Results: Adequate records were available regarding 2,156 patients and were thus reviewed. The mean age of the women was 27.4 years (±3.34), and mean gestational age was 26.4 weeks (±6.36). Only 15 cases (0.70%) were seropositive to VDRL. Confirmatory T. pallidum hemagglutination assay was positive in 4 of the 15 cases, giving an overall prevalence of 0.19% and a false-positive rate of 73.3%. There was no significant difference in the prevalence of syphilis in relation to maternal age and parity (P>0.05). Conclusion: While the prevalence of syphilis is extremely low in the antenatal care population at the three specialist care hospitals in south-east Nigeria, false-positive rate is high and prevalence did not significantly vary with maternal age or

  13. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    Science.gov (United States)

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with the existing models such as bivariate generalized linear mixed model (Chu and Cole, 2006) and Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having closed-form expression of likelihood function, and no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. Copyright © 2015 John Wiley & Sons, Ltd.
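
    To make the composite-likelihood idea concrete, here is a small sketch in which each margin of a bivariate binary outcome gets a beta-binomial likelihood and the two margins are summed as if independent. The mean/overdispersion parameterization and the three made-up studies are illustrative assumptions, not the paper's exact formulation:

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import betaln, gammaln

        def log_betabinom(y, n, mu, rho):
            # beta-binomial log-pmf with mean mu and intra-study correlation rho
            a = mu * (1 - rho) / rho
            b = (1 - mu) * (1 - rho) / rho
            return (gammaln(n + 1) - gammaln(y + 1) - gammaln(n - y + 1)
                    + betaln(y + a, n - y + b) - betaln(a, b))

        def neg_composite_loglik(theta, y1, n1, y2, n2):
            # independence (composite) likelihood: sum of the two marginal fits
            mu1, mu2, rho = theta
            return -(log_betabinom(y1, n1, mu1, rho).sum()
                     + log_betabinom(y2, n2, mu2, rho).sum())

        # y/n pairs per study for the two outcomes (made-up data)
        y1 = np.array([12, 30, 7]);  n1 = np.array([40, 100, 25])
        y2 = np.array([20, 55, 15]); n2 = np.array([38, 95, 30])
        fit = minimize(neg_composite_loglik, x0=[0.3, 0.5, 0.1],
                       args=(y1, n1, y2, n2), bounds=[(0.01, 0.99)] * 3)
        print(fit.x)  # estimated marginal means and overdispersion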

  14. How to Justify Purchase of an iPad: Users of the Latest Launch

    Directory of Open Access Journals (Sweden)

    Emílio José Montero Arruda Filho

    2014-09-01

    Contemporary technology innovation is increasingly based on convergence and the multiple uses of products. This change is detailed in the literature about new product development, as well as that on systems integration. This article focuses on the factors that determine the justification for using advanced technology products in which the perceived value of the product is not based on its functionality, as much as on its hedonistic or social value as an “all-in-one” product. In this study, consumer behaviors toward the Apple iPad are analyzed using netnographic evidence taken from internet postings by the consumers themselves. Since Apple initially marketed the iPad as a revolutionary product, with integrated services and features, our analysis concentrates on how consumers perceived these new, innovative features, in an effort to justify their purchase of the product. Our findings indicate that consumers’ justifications are based not only on the iPad’s functionality, but also its hedonic traits, and its similarity to the previously released innovative product, the iPhone.

  15. Quadrilatero ferrifero, MG, Brazil. Regional characteristics justify application for global geoparks network

    International Nuclear Information System (INIS)

    Mantesso-Neto, V.; Azevedo, U.; Guimarães, R.; Nascimento, M.; Beato, D.; Castro, P.; Liccardo, A.

    2010-01-01

    Geopark, a concept created in 2000, is neither strictly geological nor a park in the usual sense. Geopark is a holistic concept, aimed at promoting sustainable economic development based on unique geological features (represented by “geosites”, outcrops with special value from some point of view), but also having a social objective. The Global Geoparks Network (GGN), working in synergy with UNESCO, has 64 members in 19 countries. This paper presents a brief history and some characteristics of a few European Geoparks, followed by some aspects of the Quadrilátero Ferrífero. As shall be seen, this area is rich in geosites, and in historical, social and cultural attractions. On the other hand, foreseeing a decline in mineral exploitation in mid-century, it urgently seeks a good plan for regional development. As a conclusion, it will be seen that its characteristics fit the Geopark concept and justify the support of the geoscientific community, and of society in general, for its application, recently submitted to UNESCO, for admission to the GGN.

  16. [Are in-utero interventions justified?--perspective of neonatologists. Part I. Congenital diaphragmatic hernia (CDH)].

    Science.gov (United States)

    Dabrowska, Katarzyna; Gadzinowski, Janusz

    2011-05-01

    In-utero interventions are often perceived by parents as the only hope for their unborn child. Because it is neonatologists who have to deal with a sick newborn, and sometimes with the unrealistic optimism of the parents after delivery, we have taken on the task of reviewing the current knowledge concerning fetal surgeries from the neonatologist's perspective. In the first of three parts we have analyzed the data for in-utero interventions for CDH. Our main objective was to evaluate available data and to ascertain whether performing fetal surgeries for CDH is justified. A review of available literature on the subject of in-utero interventions in fetuses with CDH was performed. Pubmed and the Cochrane library were searched for relevant publications, in particular for randomized controlled trials. In a randomized controlled trial (RCT), the in-utero intervention did not improve the outcome. The results of uncontrolled clinical trials suggest that it may be beneficial in cases with severe lung hypoplasia. An RCT testing the efficacy of the procedure performed later in pregnancy in moderately severe cases is currently under way. In-utero interventions might improve survival in a carefully selected group of patients with CDH. However, the evidence to support this claim is not strong, and until more data are available, in-utero interventions for CDH should only be performed in specialized centers as part of a controlled clinical trial.

  17. Dear Critics: Addressing Concerns and Justifying the Benefits of Photography as a Research Method

    Directory of Open Access Journals (Sweden)

    Kyle Elizabeth Miller

    2015-08-01

    Photography serves as an important tool for researchers to learn about the contextualized lives of individuals. This article explores the process of integrating photo elicitation interviews (PEI) into research involving children and families. Much literature is dedicated to the general debate surrounding the ethics of visual methods in research, with little attention directed at the actual process of gaining study approval and publishing one's findings. There are two main critiques that researchers must face in order to conduct and disseminate studies involving visual images—ethics committees and peer reviewers. In this article, I identify and discuss some of the challenges that emerged while gaining protocol approval from an ethics committee in the United States. Ethical concerns and restrictions related to the use of photography can delay data collection and create barriers to research designs. Similarly, I describe the process of responding to reviewers' concerns as part of the publication process. Peer reviewers' lack of familiarity with the use of photography as a research tool may lead to misunderstandings and inappropriate requests for manuscript changes. While many concerns are sound, the range of benefits stemming from the use of visual data helps to justify the time and energy required to defend this type of research. Implications are discussed for researchers using visual methods in their work. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1503274

  18. Is sex-selective abortion morally justified and should it be prohibited?

    Science.gov (United States)

    Rogers, Wendy; Ballantyne, Angela; Draper, Heather

    2007-11-01

    In this paper we argue that sex-selective abortion (SSA) cannot be morally justified and that it should be prohibited. We present two main arguments against SSA. First, we present reasons why the decision for a woman to seek SSA in cultures with strong son-preference cannot be regarded as autonomous on either a narrow or a broad account of autonomy. Second, we identify serious harms associated with SSA including perpetuation of discrimination against women, disruption to social and familial networks, and increased violence against women. For these reasons, SSA should be prohibited by law, and such laws should be enforced. Finally, we describe additional strategies for decreasing son-preference. Some of these strategies rely upon highlighting the disadvantages of women becoming scarce, such as lack of brides and daughters-in-law to care for elderly parents. We should, however, be cautious not to perpetuate the view that the purpose of women is to be the consorts for, and carers of, men, and the providers of children. Arguments against SSA should be located within a concerted effort to ensure greater, deeper social and cultural equality between the sexes.

  19. Ongoing Transmission of HCV: Should Cesarean Section be Justified? Data Mining Discovery.

    Science.gov (United States)

    Elrazek, Abd; Saab, Samy; Foad, Mahmoud; Elgohary, Elsayed A; Sallam, Mohammad M; Nawara, Abdallah; Ismael, Ali; Morsi, Samar S; Salah, Altaher; Alboraie, Mohamed; Bhagavathula, Akshaya Srikanth; Zayed, Marwa; Elmasry, Hossam; Salem, Tamer Z

    2017-03-01

    Over the past few decades, cesarean section (CS) rates have been steadily increasing in most middle- and high-income countries. However, most pregnant women (particularly those undergoing CS) are not screened for hepatitis C virus (HCV); hence, neonates born to HCV-positive mothers could be a source of future HCV infection. In this study, the role of CS and other surgical interventions in HCV transmission in Egypt, the country with the highest endemicity of HCV genotype 4, was investigated. From January to June 2016, a prospective cohort study was conducted among 3,836 pregnant women in both urban and rural areas across Egypt for HCV screening in both mothers and neonates born to HCV-positive mothers. All pregnant women were screened during the third trimester or just before delivery; neonates born to HCV-positive mothers were evaluated within 24 h postdelivery to record vertical transmission cases. Data mining (DM)-driven computational analysis was used to quantify the findings. Among 3,836 randomized pregnant women, HCV genotype 4 was identified in 80 women (2.08%). Of the 80 HCV-infected women, 18 had undergone surgical intervention (22.5%) and 62 CS (77.5%). HCV vertical transmission was identified in 10 neonates, 10/80 (12.5%). Screening women who have experienced surgical intervention or CS during the childbearing period and before pregnancy might prevent HCV mother-to-child transmission (MTCT). CS should be ethically justified to decrease global HCV transmission.

  20. What the eye doesn’t see: An analysis of strategies for justifying acts by an appeal for concealing them

    NARCIS (Netherlands)

    Tellings, A.E.J.M.

    2006-01-01

    This article analyzes the moral reasoning implied in a very commonly used expression, namely, “What the eye doesn't see, the heart doesn't grieve over”, or “What you don't know won't hurt you.” It especially deals with situations in which it is used for trying to justify acts that are, in

  1. Portfolios: Justify Your Job as a Library Media Specialist and the Media Budget during Times of Budget Cuts

    Science.gov (United States)

    Allen, Melissa; Bradley, Amy

    2009-01-01

    During this time of economic crisis and the projected budget cuts in education, it is more important than ever to justify one's position as a critical element in the school. One excellent way to document the need for a full-time, certified media specialist in the school is through a portfolio. Maintaining a portfolio is cheap and fairly easy,…

  2. Modelling Conditional and Unconditional Heteroskedasticity with Smoothly Time-Varying Structure

    DEFF Research Database (Denmark)

    Amado, Christina; Teräsvirta, Timo

    in the conditional and unconditional variances where the transition between regimes over time is smooth. A modelling strategy for these new time-varying parameter GARCH models is developed. It relies on a sequence of Lagrange multiplier tests, and the adequacy of the estimated models is investigated by Lagrange multiplier type misspecification tests. Finite-sample properties of these procedures and tests are examined by simulation. An empirical application to daily stock returns and another one to daily exchange rate returns illustrate the functioning and properties of our modelling strategy in practice.

  3. Can "presumed consent" justify the duty to treat infectious diseases? An analysis

    Directory of Open Access Journals (Sweden)

    Arda Berna

    2008-03-01

    -fifth of the participants in this study either lacked adequate knowledge of the occupational risks when they chose the medical profession or were not sufficiently informed of these risks during their faculty education and training. Furthermore, in terms of the moral duty to provide care, it seems that most HCWs are more concerned about the availability of protective measures than about whether they had been informed of a particular risk beforehand. For all these reasons, the presumed consent argument is not persuasive enough, and cannot be used to justify the duty to provide care. It is therefore more useful to emphasize justifications other than presumed consent when defining the duty of HCWs to provide care, such as the social contract between society and the medical profession and the fact that HCWs have a greater ability to provide medical aid.

  4. Model-checking techniques based on cumulative residuals.

    Science.gov (United States)

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
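
    The flavor of the procedure can be conveyed with a toy version: fit a deliberately misspecified logistic model, order the residuals by a covariate, take their cumulative sum, and compare its supremum with realizations in which the residuals are perturbed by standard normal multipliers. This is a simplified sketch (it omits the correction for estimated regression parameters that the paper's Gaussian-process approximation includes), with simulated data:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        x = rng.uniform(0, 3, 500)
        y = rng.binomial(1, 1 / (1 + np.exp(-(x - 1.5) ** 2 + 1)))  # quadratic truth

        X = sm.add_constant(x)  # fitted (misspecified) linear-in-x logit
        fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
        r = y - fit.fittedvalues

        order = np.argsort(x)
        W_obs = np.max(np.abs(np.cumsum(r[order]))) / np.sqrt(len(y))

        # null realizations: residuals perturbed by N(0,1) multipliers
        W_null = [np.max(np.abs(np.cumsum(r[order] * rng.standard_normal(len(y)))))
                  / np.sqrt(len(y)) for _ in range(1000)]
        print("approx p-value:", np.mean(np.array(W_null) >= W_obs))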

  5. When is the use of pacifiers justifiable in the baby-friendly hospital initiative context? A clinician's guide

    OpenAIRE

    Lubbe, Welma; ten Ham-Baloyi, Wilma

    2017-01-01

    Background The use of pacifiers is an ancient practice, but often becomes a point of debate when parents and professionals aim to protect and promote breastfeeding as most appropriately for nurturing infants. We discuss the current literature available on pacifier use to enable critical decision-making regarding justifiable use of pacifiers, especially in the Baby-Friendly Hospital Initiative context, and we provide practical guidelines for clinicians. Discussion Suck-swallow-breathe coordina...

  6. Scientific and technical conference Thermophysical experimental and calculating and theoretical studies to justify characteristics and safety of fast reactors. Thermophysics-2012. Book of abstracts

    International Nuclear Information System (INIS)

    Kalyakin, S.G.; Kukharchuk, O.F.; Sorokin, A.P.

    2012-01-01

    The collection includes abstracts of reports of the scientific and technical conference Thermophysics-2012, which took place on October 24-26, 2012 in Obninsk. The abstracts address the following topics: experimental and computational-theoretical studies of the thermal hydraulics of liquid-metal cooled fast reactors to justify their characteristics and safety; physico-chemical processes in systems with liquid-metal coolants (LMC); physico-chemical characteristics and thermophysical properties of LMC; development of models, computational methods and calculation codes for simulating processes of hydrodynamics and heat and mass transfer, including impurity mass transfer, in systems with LMC; methods and means for monitoring the composition and condition of LMC in fast reactor circuits with respect to impurities, and for purification from them; apparatuses, equipment and technological processes for work with LMC, taking ecology into account, including fast reactor decommissioning; measuring techniques, sensors and devices for experimental studies of heat and mass transfer in systems with LMC [ru]

  7. Technical Data to Justify Full Burnup Credit in Criticality Safety Licensing Analysis

    International Nuclear Information System (INIS)

    2011-01-01

    Enercon Services, Inc. (ENERCON) was requested under Task Order No. 2 to identify scientific and technical data needed to benchmark and justify Full Burnup Credit, which adds 16 fission products and 4 minor actinides to Actinide-Only burnup credit. The historical perspective for Full Burnup Credit is discussed, and interviews of organizations participating in burnup credit activities are summarized as a basis for identifying additional data needs and making recommendations. Input from burnup credit participants representing two segments of the commercial nuclear industry is provided. First, the Electric Power Research Institute (EPRI) has been very active in the development of Full Burnup Credit, representing the interests of nuclear utilities in achieving capacity gains for storage and transport casks. EPRI and its utility customers are interested in a swift resolution of the validation issues that are delaying the implementation of Full Burnup Credit [EPRI 2010b]. Second, used nuclear fuel storage and transportation Cask Vendors favor improving burnup credit beyond Actinide-Only burnup credit, although their discussion of specific burnup credit achievements and data needs was limited, citing business-sensitive and technical proprietary concerns. While Cask Vendor proprietary items are not specifically identified in this report, the needs of all nuclear industry participants are reflected in the conclusions and recommendations of this report. In addition, Oak Ridge National Laboratory (ORNL) and Sandia National Laboratory (SNL) were interviewed for their input into additional data needs to achieve Full Burnup Credit. ORNL was very open to discussions of Full Burnup Credit, with several telecoms and a visit by ENERCON to ORNL. For many years, ORNL has provided extensive support to the NRC regarding burnup credit in all of its forms. Discussions with ORNL focused on potential resolutions to the validation issues for the use of fission products. SNL was helpful in

  8. Technical Data to Justify Full Burnup Credit in Criticality Safety Licensing Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Enercon Services, Inc.

    2011-03-14

    Enercon Services, Inc. (ENERCON) was requested under Task Order No. 2 to identify scientific and technical data needed to benchmark and justify Full Burnup Credit, which adds 16 fission products and 4 minor actinides to Actinide-Only burnup credit. The historical perspective for Full Burnup Credit is discussed, and interviews of organizations participating in burnup credit activities are summarized as a basis for identifying additional data needs and making recommendations. Input from burnup credit participants representing two segments of the commercial nuclear industry is provided. First, the Electric Power Research Institute (EPRI) has been very active in the development of Full Burnup Credit, representing the interests of nuclear utilities in achieving capacity gains for storage and transport casks. EPRI and its utility customers are interested in a swift resolution of the validation issues that are delaying the implementation of Full Burnup Credit [EPRI 2010b]. Second, used nuclear fuel storage and transportation Cask Vendors favor improving burnup credit beyond Actinide-Only burnup credit, although their discussion of specific burnup credit achievements and data needs was limited, citing business-sensitive and technical proprietary concerns. While Cask Vendor proprietary items are not specifically identified in this report, the needs of all nuclear industry participants are reflected in the conclusions and recommendations of this report. In addition, Oak Ridge National Laboratory (ORNL) and Sandia National Laboratory (SNL) were interviewed for their input into additional data needs to achieve Full Burnup Credit. ORNL was very open to discussions of Full Burnup Credit, with several telecoms and a visit by ENERCON to ORNL. For many years, ORNL has provided extensive support to the NRC regarding burnup credit in all of its forms. Discussions with ORNL focused on potential resolutions to the validation issues for the use of fission products. SNL was helpful in

  9. Estimation and model selection of semiparametric multivariate survival functions under general censorship.

    Science.gov (United States)

    Chen, Xiaohong; Fan, Yanqin; Pouzo, Demian; Ying, Zhiliang

    2010-07-01

    We study estimation and model selection of semiparametric models of multivariate survival functions for censored data, which are characterized by possibly misspecified parametric copulas and nonparametric marginal survivals. We obtain the consistency and root-n asymptotic normality of a two-step copula estimator to the pseudo-true copula parameter value according to KLIC, and provide a simple consistent estimator of its asymptotic variance, allowing for a first-step nonparametric estimation of the marginal survivals. We establish the asymptotic distribution of the penalized pseudo-likelihood ratio statistic for comparing multiple semiparametric multivariate survival functions subject to copula misspecification and general censorship. An empirical application is provided.
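
    A stripped-down sketch of the two-step idea follows: estimate the margins nonparametrically (here simply by rescaled ranks, ignoring censoring entirely for brevity, unlike the paper), then maximize a parametric copula pseudo-likelihood; under copula misspecification the maximizer converges to the KLIC pseudo-true value. The Clayton family and the simulated data are illustrative choices:

        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(2)
        t1 = rng.exponential(1.0, 300)
        t2 = t1 * rng.exponential(1.0, 300) + rng.exponential(0.5, 300)  # dependent pair

        def pseudo_obs(t):
            # rescaled ranks: a simple nonparametric margin estimate
            return (np.argsort(np.argsort(t)) + 1) / (len(t) + 1)

        u, v = pseudo_obs(t1), pseudo_obs(t2)

        def neg_clayton_loglik(theta):
            # log-density of the Clayton copula, summed over pseudo-observations
            s = u ** -theta + v ** -theta - 1
            return -np.sum(np.log(1 + theta) - (theta + 1) * (np.log(u) + np.log(v))
                           - (2 + 1 / theta) * np.log(s))

        res = minimize_scalar(neg_clayton_loglik, bounds=(0.01, 20), method="bounded")
        print("pseudo-true Clayton parameter estimate:", res.x)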

  10. Analysis of hypoglycemic events using negative binomial models.

    Science.gov (United States)

    Luo, Junxiang; Qu, Yongming

    2013-01-01

    Negative binomial regression is a standard model to analyze hypoglycemic events in diabetes clinical trials. Adjusting for baseline covariates could potentially increase the estimation efficiency of negative binomial regression. However, adjusting for covariates raises concerns about model misspecification, to which negative binomial regression is not robust because of its strong model assumptions. Some literature suggests correcting the standard error of the maximum likelihood estimator by introducing overdispersion, which can be estimated by the Deviance or Pearson Chi-square. We proposed conducting the negative binomial regression using sandwich estimation to calculate the covariance matrix of the parameter estimates, together with a Pearson overdispersion correction (denoted by NBSP). In this research, we compared several commonly used negative binomial model options with our proposed NBSP. Simulations and real data analyses showed that NBSP is the most robust to model misspecification, and that estimation efficiency is improved by adjusting for baseline hypoglycemia. Copyright © 2013 John Wiley & Sons, Ltd.
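
    A minimal sketch of the same recipe (a negative binomial fit with a sandwich covariance plus a Pearson overdispersion check) is given below using statsmodels; the simulated trial data, the fixed dispersion value, and the covariate names are assumptions for illustration, not the paper's exact estimator:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 400
        treat = rng.integers(0, 2, n)
        base_hypo = rng.poisson(2, n)                 # baseline hypoglycemia count
        lam = np.exp(0.2 + 0.12 * base_hypo - 0.5 * treat)
        y = rng.negative_binomial(2, 2 / (2 + lam))   # overdispersed event counts

        X = sm.add_constant(np.column_stack([treat, base_hypo]))
        model = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5))
        fit = model.fit(cov_type="HC0")               # sandwich (robust) covariance
        pearson_phi = fit.pearson_chi2 / fit.df_resid # Pearson overdispersion estimate
        print(fit.params, fit.bse, pearson_phi)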

  11. Justifying Reasons for Giving Employment Priorities to Isargaran and Veterans in Iranian and American Law

    Directory of Open Access Journals (Sweden)

    Ali Akbar Gorji Azandaryani

    2012-11-01

    Equality is one of the principles and fundamental rights of human beings. There has been much talk about equality and justice, but the legal aspect of this principle is still under dispute. Human beings are born equal, so their lives have equal moral value. This principle, along with the prohibition of discrimination and the rejection of bias, has a great impact on legislative and administrative decisions and is accepted in the Constitution and in international norms. The important point here is the formation of a paradox within the concept of the principle of equality in today's law: there is a kind of discrimination in legal and social relationships, within the very quest for equality. Privileges granted to soldiers returning from war and to their descendants are an issue that arises during or immediately after every war and, because of their discriminatory nature, become a controversial matter at first glance, and there are widespread opinions regarding this issue. In this article, we try to examine justifying reasons for giving employment priorities to veterans based on the theories of permissible discrimination and equality, and to allude to isargaran and veterans' employment priority in Iranian and United States law. Therefore, at first, we examine the theoretical discussions and the preference given to veterans in American law. In the next part, in the light of the findings of the first part, veterans' and isargaran employment preference will be debated in the United States and Iranian judicial systems. Discussing this privilege, we conclude that it is granted to veterans and isargaran according to the theories of permissible discrimination and equality, that none of these theories is completely accepted by the legislatures of Iran and America, and that various theories have been used according to time and place.

  12. Are multi-paddock grazing systems economically justifiable? | M.T. ...

    African Journals Online (AJOL)

    The financial implications of few- and multi-paddock systems were modelled by a discounted cash flow analysis with the (discounted) present value as the dependent variable, and number of paddocks, farm run-down time, time horizon and discount rate as the independent variables. Present values were higher for few- ...
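
    The discounted cash flow comparison described reduces to a few lines of arithmetic; the cash flow figures and rates below are invented purely to show the mechanics (present value as a function of discount rate and time horizon):

        # Toy present-value comparison of two paddock systems; all figures invented.
        def present_value(cash_flows, rate):
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

        few_paddock   = [120] * 20            # steady annual net revenue
        multi_paddock = [-300] + [150] * 19   # up-front fencing cost, higher revenue
        for r in (0.05, 0.10, 0.15):
            print(r, round(present_value(few_paddock, r)),
                     round(present_value(multi_paddock, r)))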

  13. When is the use of pacifiers justifiable in the baby-friendly hospital initiative context? A clinician's guide.

    Science.gov (United States)

    Lubbe, Welma; Ten Ham-Baloyi, Wilma

    2017-04-27

    The use of pacifiers is an ancient practice, but it often becomes a point of debate when parents and professionals aim to protect and promote breastfeeding as the most appropriate way of nurturing infants. We discuss the current literature available on pacifier use to enable critical decision-making regarding justifiable use of pacifiers, especially in the Baby-Friendly Hospital Initiative context, and we provide practical guidelines for clinicians. Suck-swallow-breathe coordination is an important skill that every newborn must acquire for feeding success. In most cases the development and maintenance of the sucking reflex is not a problem, but sometimes the skill may be compromised due to factors such as mother-infant separation or medical conditions. In such situations the use of pacifiers can be considered therapeutic and can even provide medical benefits to infants, including reducing the risk of sudden infant death syndrome. The argument opposing pacifier use, however, is based on potential risks such as nipple confusion and early cessation of breastfeeding. The Ten Steps to Successful Breastfeeding as embedded in the Baby-Friendly Hospital Initiative initially prohibited the use of pacifiers in a breastfeeding-friendly environment to prevent potential associated risks. This article provides a summary of the evidence on the benefits of non-nutritive sucking, the risks associated with pacifier use, an identification of the indications regarded as 'justifiable' in the clinical use of pacifiers, and a comprehensive discussion to support the recommendations for safe pacifier use in healthy, full-term, and ill and preterm infants. The use of pacifiers is justifiable in certain situations and will support breastfeeding rather than interfere with it. Justifiable conditions have been identified as: low-birth-weight and premature infants; infants at risk for hypoglycaemia; infants in need of oral stimulation to develop, maintain and mature the sucking reflex; and

  14. PET/CT in cancer: moderate sample sizes may suffice to justify replacement of a regional gold standard

    DEFF Research Database (Denmark)

    Gerke, Oke; Poulsen, Mads Hvid; Bouchelouche, Kirsten

    2009-01-01

    If PET/CT also performs well in adjacent areas, then sample sizes in accuracy studies can be reduced. PROCEDURES: Traditional standard power calculations for demonstrating sensitivities of both 80% and 90% are shown. The argument is then described in general terms and demonstrated by an ongoing study of metastasized prostate cancer. RESULTS: An added value in accuracy of PET/CT in adjacent areas can outweigh a downsized target level of accuracy in the gold standard region, justifying smaller sample sizes. CONCLUSIONS: If PET/CT provides an accuracy benefit in adjacent regions, then sample sizes can be reduced.
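
    For orientation, a back-of-envelope version of the "traditional" power calculation mentioned above is sketched here: the number of positive cases needed for a one-sided normal-approximation test to demonstrate a target sensitivity. The thresholds and the default alpha/power values are illustrative assumptions:

        import math

        def n_positives_needed(p0, p1, z_alpha=1.645, z_beta=0.842):
            # one-sided sample size (alpha = 5%, power = 80% by default) for
            # demonstrating sensitivity p1 against the threshold p0
            num = (z_alpha * math.sqrt(p0 * (1 - p0))
                   + z_beta * math.sqrt(p1 * (1 - p1))) ** 2
            return math.ceil(num / (p1 - p0) ** 2)

        # positive cases needed to show sensitivity > 80% when it is truly 90%
        print(n_positives_needed(0.80, 0.90))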

  15. Does the Occasion Justify the Denunciation?: a Multilevel Approach for Brazilian Accountants

    Directory of Open Access Journals (Sweden)

    Bernardo de Abreu Guelber Fajardo

    2014-01-01

    Fraud represents large losses to the global economy, and one of the main means for its containment is denunciation within organizations: whistleblowing. This research aims to analyze whistleblowing within the Brazilian context, considering the influence of costs and intrinsic benefits as well as aspects of the individual's interaction with his/her organization, profession and society at large. By means of a questionnaire answered by 124 accountants, a multilevel model was applied to analyze these aspects. The results demonstrate the importance of situational aspects as a positive influence in favor of denunciations. These results are useful for organizations and regulatory institutions in developing institutional mechanisms to encourage denunciation. Moreover, the results are also useful for teachers of professional ethics and members of the Federal and Regional Accounting Councils, which are dedicated to the assessment of alleged deviations from the professional code of ethics.

  16. [Can the war in Iraq be justified as a case of humanitarian intervention?]

    Directory of Open Access Journals (Sweden)

    Stéphane Courtois

    2006-05-01

    Most current criticisms of the intervention in Iraq have tackled the two justifications articulated by the members of the coalition: (1) that the United States had to neutralize the threats that Iraq posed to their own security and to political stability in the Middle East, and (2) that the war in Iraq can be justified as a necessary stage in the war against international terrorism. The principal objection against justification (1) is that it was, and remains, unfounded. Against justification (2), many have replied that the intervention in Iraq had no connection, or at best had merely an indirect connection, with the fight against terrorism. In a recent text, Fernando Tesón claims that the American intervention in Iraq can nevertheless be morally justified as a case of humanitarian intervention. By "humanitarian intervention", one must understand a coercive action taken by a state or a group of states inside the sphere of jurisdiction of an independent political community, without the permission of the latter, in order to prevent or to end a massive violation of individual rights perpetrated against innocent persons who are not co-nationals inside this political community. I argue in this article that the American intervention in Iraq does not satisfy the conditions of a legitimate humanitarian intervention, contrary to what Fernando Tesón claims.

  17. Topics in modelling of clustered data

    CERN Document Server

    Aerts, Marc; Ryan, Louise M; Geys, Helena

    2002-01-01

    Many methods for analyzing clustered data exist, all with advantages and limitations in particular applications. Compiled from the contributions of leading specialists in the field, Topics in Modelling of Clustered Data describes the tools and techniques for modelling the clustered data often encountered in medical, biological, environmental, and social science studies. It focuses on providing a comprehensive treatment of marginal, conditional, and random effects models using, among others, likelihood, pseudo-likelihood, and generalized estimating equations methods. The authors motivate and illustrate all aspects of these models in a variety of real applications. They discuss several variations and extensions, including individual-level covariates and combined continuous and discrete outcomes. Flexible modelling with fractional and local polynomials, omnibus lack-of-fit tests, robustification against misspecification, exact, and bootstrap inferential procedures all receive extensive treatment. The application...

  18. Hydroaerothermal investigations conducted in the USSR to justify the construction of large cooling towers

    International Nuclear Information System (INIS)

    Goncharov, V.V.

    1989-01-01

    The multi-purpose task of improving the water cooling systems of thermal and nuclear power plants is aimed at the development of efficient designs of cooling towers and other types of industrial coolers, which calls for comprehensive scientific justification. Cooling towers of 60-70 thousand m³/h capacity with a chimney height of 130 m and of 80-100 thousand m³/h capacity with a chimney height of 150 m were developed. For the circulating water systems of large power plants, the design of a counterflow chimney cooling tower of 180 thousand m³/h capacity has recently been developed. At present, work is being conducted on a new three-cell cooling tower featuring high reliability, operational flexibility and cost-effectiveness. This cooling tower, besides having higher operating reliability than the conventional one of circular shape, can ensure the commissioning, current repairs and overhauls of water cooling arrangements in a cell-wise sequence, i.e. without shutting down the power generating units. Laboratory and field investigations of spray-type cooling towers having no packing (fill), studies of heat and mass exchange processes, the aerodynamics of droplet flows and new designs of sprayers made it possible to conclude that their cooling capacity can be substantially increased and brought up to the level of cooling towers with film packings. The pilot cooling towers were designed according to the counterflow, crossflow and cross-counterflow schemes. The basic investigation method remains the experimental one. On test rigs and aerodynamic models the heat and mass transfer and aerodynamic resistance coefficients are determined. These studies and subsequent calculations are based on the heat balance equation

  19. Is the use of wildlife group-specific concentration ratios justified?

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Michael D. [School of Environment and Life Sciences, University of Salford, Manchester, M4 4WT (United Kingdom); Beresford, Nicholas A. [School of Environment and Life Sciences, University of Salford, Manchester, M4 4WT (United Kingdom); Centre for Ecology and Hydrology, Bailrigg, Lancaster, LA1 4AP (United Kingdom); Copplestone, David [School of Natural Sciences, University of Stirling, Stirling, FK9 4LA (United Kingdom); Howard, Brenda J. [Centre for Ecology and Hydrology, Bailrigg, Lancaster, LA1 4AP (United Kingdom); Yankovich, Tamara L. [International Atomic Energy Agency, Vienna International Centre, 1400 Vienna (Austria)

    2014-07-01

    The international Wildlife Transfer Database (WTD; www.wildlifetransferdatabase.org/?) provides the most comprehensive international compilation of radionuclide transfer parameters (concentration ratios) for wildlife. The concentration ratio (CR_wo-media) is a constant that describes the ratio between the activity concentration of a radionuclide in the whole-organism and the activity concentration of that radionuclide in a reference environmental medium (e.g. soil or filtered water). Developed to support activities of the International Atomic Energy Agency (IAEA) and the International Commission on Radiological Protection (ICRP), the WTD now contains over 100,000 CR_wo-media values. The WTD has been used to generate summary statistics for broad wildlife groups (e.g. amphibian, arthropod, mammal, reptile, shrub, tree etc). The group-specific summary statistics include mean and standard deviation (both arithmetic and geometric) and range. These summarised CR_wo-media values (generally arithmetic or geometric mean) are used in most of the modelling approaches currently implemented for wildlife dose assessment. Beyond the broad organism group summary statistics presented within the WTD, it is possible to generate CR_wo-media summary statistics for some organism sub-categories (e.g. carnivorous, herbivorous and omnivorous birds). However, using a statistical analysis we developed recently for the analysis of summarised datasets, we have shown that there is currently little statistical justification for the use of organism sub-category CR_wo-media values. Large variability is a characteristic of many of the organism-radionuclide datasets within the WTD, even within individual input data sets. Therefore, the statistical validity of defining different CR_wo-media values for these broad wildlife groups may also be questioned. However, no analysis has been undertaken to date to determine the statistical significance of any differences between

  20. A comparison of non-homogeneous Markov regression models with application to Alzheimer’s disease progression

    Science.gov (United States)

    Hubbard, R. A.; Zhou, X.H.

    2011-01-01

    Markov regression models are useful tools for estimating the impact of risk factors on rates of transition between multiple disease states. Alzheimer’s disease (AD) is an example of a multi-state disease process in which great interest lies in identifying risk factors for transition. In this context, non-homogeneous models are required because transition rates change as subjects age. In this report we propose a non-homogeneous Markov regression model that allows for reversible and recurrent disease states, transitions among multiple states between observations, and unequally spaced observation times. We conducted simulation studies to demonstrate performance of estimators for covariate effects from this model and compare performance with alternative models when the underlying non-homogeneous process was correctly specified and under model misspecification. In simulation studies, we found that covariate effects were biased if non-homogeneity of the disease process was not accounted for. However, estimates from non-homogeneous models were robust to misspecification of the form of the non-homogeneity. We used our model to estimate risk factors for transition to mild cognitive impairment (MCI) and AD in a longitudinal study of subjects included in the National Alzheimer’s Coordinating Center’s Uniform Data Set. Using our model, we found that subjects with MCI affecting multiple cognitive domains were significantly less likely to revert to normal cognition. PMID:22419833
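
    One concrete way to let transition rates change with age, in the spirit of the non-homogeneous model described, is a piecewise-constant intensity matrix whose matrix exponentials are chained over short intervals. The states, rate values, and the age-75 break point below are invented for illustration:

        import numpy as np
        from scipy.linalg import expm

        def Q(age):
            # intensity matrix; rates increase after age 75 (hypothetical break)
            a = 0.04 if age < 75 else 0.10        # normal -> MCI
            b = 0.02                              # MCI -> normal (reversibility)
            c = 0.06 if age < 75 else 0.15        # MCI -> AD
            return np.array([[-a,      a,    0.0],
                             [ b, -(b + c),   c ],
                             [0.0,    0.0,   0.0]])  # AD absorbing, for simplicity

        def transition_prob(age0, age1, step=0.25):
            # P(age0, age1) as a product of matrix exponentials over short intervals
            P = np.eye(3)
            t = age0
            while t < age1:
                P = P @ expm(Q(t) * min(step, age1 - t))
                t += step
            return P

        print(transition_prob(70, 80).round(3))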

  1. Assessing the DICE model: uncertainty associated with the emission and retention of greenhouse gases

    International Nuclear Information System (INIS)

    Kaufmann, R.K.

    1997-01-01

    Analysis of the DICE model indicates that it contains unsupported assumptions, simple extrapolations, and misspecifications that cause it to understate the rate at which economic activity emits greenhouse gases and the rate at which the atmosphere retains greenhouse gases. The model assumes a world population that is 2 billion people lower than the 'base case' projected by demographers. The model extrapolates a decline in the quantity of greenhouse gases emitted per unit of economic activity that is possible only if there is a structural break in the economic and engineering factors that have determined this ratio over the last century. The model uses a single equation to simulate the rate at which greenhouse gases accumulate in the atmosphere. The forecast for the airborne fraction generated by this equation contradicts forecasts generated by models that represent the physical and chemical processes which determine the movement of carbon from the atmosphere to the ocean. When these unsupported assumptions, simple extrapolations, and misspecifications are remedied with simple fixes, the economic impact of global climate change increases severalfold. Similarly, these remedies increase the impact of uncertainty on estimates for the economic impact of global climate change. Together, these results indicate that considerable scientific and economic research is needed before the threat of climate change can be dismissed with any degree of certainty. 23 refs., 3 figs
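
    For intuition about the single-equation accumulation being criticized, here is a stylized sketch of that kind of reduced-form carbon stock update; the functional form is generic, and every parameter value below is invented rather than DICE's calibration:

        # Stylized one-equation carbon stock update of the generic form
        #   M(t+1) = M_pre + beta * E(t) + (1 - delta) * (M(t) - M_pre),
        # where beta plays the role of an airborne fraction and delta a removal
        # rate. All values are illustrative guesses, not DICE's calibration.
        M_PRE = 590.0          # pre-industrial atmospheric stock (GtC), illustrative
        BETA, DELTA = 0.64, 0.008

        def step(m, emissions):
            return M_PRE + BETA * emissions + (1.0 - DELTA) * (m - M_PRE)

        m = 750.0              # current stock (GtC), illustrative
        for _ in range(10):
            m = step(m, emissions=8.0)   # constant 8 GtC/yr, illustrative
        print(f"stock after 10 years: {m:.0f} GtC")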

  2. The protection of fundamental rights in the Netherlands and South Africa compared: can the many differences be justified?

    Directory of Open Access Journals (Sweden)

    G van der Schyff

    2008-08-01

    This contribution considers the protection of fundamental rights in the Netherlands and South Africa. Both countries strive to be constitutional democracies that respect basic rights. But both countries go about this aim in very different ways. These different paths to constitutionalism are compared, as well as the reasons for these differences and whether it can be said that these differences are justifiable. This is done by comparing the character of the rights guaranteed in the Dutch and South African legal orders, the sources of these rights and the locus or centre of protection in both systems. The conclusion is reached that no single or perfect route to attaining the desired protection of fundamental rights exists, but that one should always enquire as to the state of individual freedom and the right to make free political choices in measuring the worth of a system's protection of rights.

  3. A Lacanian Reading of the Two Novels The Scarlet Letter And Private Memoirs And Confessions of A Justified Sinner

    Directory of Open Access Journals (Sweden)

    Marjan Yazdanpanahi

    2016-07-01

    This paper discusses two novels, The Private Memoirs and Confessions of a Justified Sinner and The Scarlet Letter, written by James Hogg and Nathaniel Hawthorne respectively, from the perspective of Jacques Lacan's theories: the mirror stage, the-name-of-the-father and desire. The mirror stage refers to historical value and an essential libidinal relationship with the body-image. The-name-of-the-father is defined as the prohibitive role of the father as the one who lays down the incest taboo in the Oedipus complex. Meanwhile, desire is neither the appetite for satisfaction, nor the demand for love, but the difference that results from the subtraction of the first from the second.

  4. Are chest radiographs justified in pre-employment examinations? Presentation of legal position and medical evidence based on 1760 cases

    International Nuclear Information System (INIS)

    Ladd, S.C.; Krause, U.; Ladd, M.E.

    2006-01-01

    The legal and medical basis for chest radiographs as part of pre-employment examinations (PEE) at a University Hospital is evaluated. The radiographs are primarily performed to exclude infectious lung disease. A total of 1760 consecutive chest radiographs performed as a routine part of PEEs were reviewed retrospectively. Pathologic findings were categorized as "nonrelevant" or "relevant." No positive finding with respect to tuberculosis or any other infectious disease was found; 94.8% of the chest radiographs were completely normal. Only five findings were regarded as "relevant" for the individual. No employment-relevant diagnosis occurred. The performance of chest radiography as part of a PEE is most often not justified. The practice is expensive, can violate national and European law, and lacks medical justification. (orig.)

  5. Is a Clean Development Mechanism project economically justified? Case study of an International Carbon Sequestration Project in Iran.

    Science.gov (United States)

    Katircioglu, Salih; Dalir, Sara; Olya, Hossein G

    2016-01-01

    The present study evaluates a carbon sequestration project for three plant species in arid and semiarid regions of Iran. Results show that Haloxylon performed appropriately in the carbon sequestration process during the 6 years of the International Carbon Sequestration Project (ICSP). In addition to a high degree of carbon dioxide sequestration, Haloxylon shows high compatibility with severe environmental conditions and low maintenance costs. Financial and economic analysis demonstrated that the ICSP was justified from an economic perspective. The financial assessment showed that net present value (NPV) (US$1,098,022.70), internal rate of return (IRR) (21.53%), and payback period (6 years) were in an acceptable range. The results of the economic analysis suggested an NPV of US$4,407,805.15 and an IRR of 50.63%. Therefore, results of this study suggest that there are sufficient incentives for investors to participate in this kind of Clean Development Mechanism (CDM) project.
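
    For reference, the quoted financial measures can be reproduced for any cash-flow series with a few lines of Python; the cash flows below are invented placeholders, not the ICSP's actual figures:

        def npv(rate, cashflows):
            """Net present value of cashflows[t] received at the end of year t."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

        def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-8):
            """Internal rate of return by bisection (assumes one sign change)."""
            while hi - lo > tol:
                mid = (lo + hi) / 2.0
                if npv(mid, cashflows) > 0:
                    lo = mid
                else:
                    hi = mid
            return (lo + hi) / 2.0

        def payback_period(cashflows):
            """First year in which the cumulative cash flow turns non-negative."""
            total = 0.0
            for t, cf in enumerate(cashflows):
                total += cf
                if total >= 0:
                    return t
            return None

        # Invented cash flows: an initial outlay followed by annual net benefits.
        flows = [-1_000_000] + [260_000] * 8
        print(f"NPV@10% = {npv(0.10, flows):,.0f}")
        print(f"IRR = {irr(flows):.2%}, payback = {payback_period(flows)} years")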

  6. Is Sport Nationalism Justifiable?

    Directory of Open Access Journals (Sweden)

    José Luis Pérez Triviño

    2012-01-01

    The article aims to clarify the deep relationships established between sport and nationalism by considering, among other factors, the instrumentalisation of sport by political elites, the political apathy of citizens, economic resources for sport, the question of violence, and identitarian matters. In order to determine whether the combination of sport and nationalism is admissible, the paper defines sport nationalism and distinguishes the political use of sport for purposes of domestic and foreign policy. In the first section the analysis focuses on whether a causal link can be established with respect to the contribution to violence. With respect to the use of sport in the internal politics of a state, the paper differentiates between normal political circumstances and political crises in order to properly address the question of whether there are grounds to assert that sport can distract citizens from asserting their genuine interests.

  7. Justified Self-Esteem

    Science.gov (United States)

    Kristjansson, Kristjan

    2007-01-01

    This paper develops a thread of argument from previous contributions to this journal by Richard Smith and Ruth Cigman about the educational salience of self-esteem. It is argued--contra Smith and Cigman--that the social science conception of self-esteem does serve a useful educational function, most importantly in undermining the inflated…

  8. Rethinking Recruitment in Policing in Australia: Can the Continued Use of Masculinised Recruitment Tests and Pass Standards that Limit the Number of Women be Justified?

    Directory of Open Access Journals (Sweden)

    Susan Robinson

    2015-06-01

    Over the past couple of decades, Australian police organisations have sought to increase the numbers of women in sworn policing roles by strictly adhering to equal treatment of men and women in the recruitment process. Unfortunately, this blind adherence to equal treatment in the recruitment process may inadvertently disadvantage and limit women. In particular, the emphasis on masculine attributes in recruitment, as opposed to the ‘soft’ attributes of communication and conflict resolution skills, and the setting of the minimum pass standards according to average male performance, disproportionately disadvantage women and serve to unnecessarily limit the number of women in policing. This paper reviews studies undertaken by physiotherapists and a range of occupational experts to discuss the relevance of physical fitness and agility tests, and the pass standards applied to them, in policing. It is suggested that masculinised recruitment tests that pose an unnecessary barrier to women cannot be justified unless directly linked to the job that is to be undertaken. Utilising a policy development and review model, the paper analyses the problem posed by physical testing that is unadjusted for gender. As a result, it is recommended that police organisations objectively review recruitment processes and requirements to identify and eliminate unnecessary barriers to women’s entry to policing. It is also recommended that, where fitness and agility tests are deemed essential to the job, the pass level is adjusted for gender.

  9. Preventing the ends from justifying the means: withholding results to address publication bias in peer-review.

    Science.gov (United States)

    Button, Katherine S; Bal, Liz; Clark, Anna; Shipley, Tim

    2016-12-01

    The evidence that many of the findings in the published literature may be unreliable is compelling. There is an excess of positive results, often from studies with small sample sizes, or other methodological limitations, and the conspicuous absence of null findings from studies of a similar quality. This distorts the evidence base, leading to false conclusions and undermining scientific progress. Central to this problem is a peer-review system where the decisions of authors, reviewers, and editors are more influenced by impressive results than they are by the validity of the study design. To address this, BMC Psychology is launching a pilot to trial a new 'results-free' peer-review process, whereby editors and reviewers are blinded to the study's results, initially assessing manuscripts on the scientific merits of the rationale and methods alone. The aim is to improve the reliability and quality of published research, by focusing editorial decisions on the rigour of the methods, and preventing impressive ends justifying poor means.

  10. To what extent do English language RCT meta-analysis justify induction of low-risk pregnancy for postdates?

    Science.gov (United States)

    Cohain, J S

    2015-05-01

    Induction for postdates in low-risk pregnancy was adopted with the intent of preventing post-term antepartum stillbirth, the most common cause of perinatal death, based on evidence derived from English language RCT meta-analyses. Systematic English language meta-analyses of RCT studies of induction for postdates in low-risk pregnancy report perinatal mortality rates (PMRs) for low-risk pregnancy ranging from 2.6 to 7.6/1000, based on 2-5 stillbirths among 13-16 perinatal deaths, including diabetic pregnancies as well as other high-risk pregnancies irrelevant to the study question. Baseline PMRs at ≥41 weeks in large international databases for high- and low-risk pregnancies before routine induction (1998-2003) range from 0.9 to 2.4/1000, roughly one-third of the PMRs reported for postdate pregnancies in the expectant management arm of English language RCT meta-analyses. Deaths in the first week far exceed stillbirths in the RCT meta-analyses, the opposite of what is expected. These 2 implausible results bring into question the evidence used to justify induction for postdates ≥41 weeks. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  11. Are Concerns About Irremediableness, Vulnerability, or Competence Sufficient to Justify Excluding All Psychiatric Patients from Medical Aid in Dying?

    Science.gov (United States)

    Rooney, William; Schuklenk, Udo; van de Vathorst, Suzanne

    2017-06-17

    Some jurisdictions that have decriminalized assisted dying (like Canada) exclude psychiatric patients on the grounds that their condition cannot be determined to be irremediable, that they are vulnerable and in need of protection, or that they cannot be determined to be competent. We review each of these claims and find that none have been sufficiently well-supported to justify the differential treatment psychiatric patients experience with respect to assisted dying. We find bans on psychiatric patients' access to this service amount to arbitrary discrimination. Proponents of banning the practice ignore or overlook alternatives to their proposal, like an assisted dying regime with additional safeguards. Some authors have further criticized assisted dying for psychiatric patients by highlighting allegedly problematic practices in those countries which allow it. We address recent evidence from the Netherlands, showing that these problems are either misrepresented or have straightforward solutions. Even if one finds such evidence troubling despite our analysis, other jurisdictions need not adopt every feature of the Dutch system.

  12. Catastrophic Decline of World's Largest Primate: 80% Loss of Grauer's Gorilla (Gorilla beringei graueri) Population Justifies Critically Endangered Status

    Science.gov (United States)

    Nixon, Stuart; Kujirakwinja, Deo K.; Vieilledent, Ghislain; Critchlow, Rob; Williamson, Elizabeth A.; Nishuli, Radar; Kirkby, Andrew E.; Hall, Jefferson S.

    2016-01-01

    Grauer’s gorilla (Gorilla beringei graueri), the World’s largest primate, is confined to eastern Democratic Republic of Congo (DRC) and is threatened by civil war and insecurity. During the war, armed groups in mining camps relied on hunting bushmeat, including gorillas. Insecurity and the presence of several militia groups across Grauer’s gorilla’s range made it very difficult to assess their population size. Here we use a novel method that enables rigorous assessment of local community and ranger-collected data on gorilla occupancy to evaluate the impacts of civil war on Grauer’s gorilla, which prior to the war was estimated to number 16,900 individuals. We show that gorilla numbers in their stronghold of Kahuzi-Biega National Park have declined by 87%. Encounter rate data of gorilla nests at 10 sites across its range indicate declines of 82–100% at six of these sites. Spatial occupancy analysis identifies three key areas as the most critical sites for the remaining populations of this ape and that the range of this taxon is around 19,700 km². We estimate that only 3,800 Grauer’s gorillas remain in the wild, a 77% decline in one generation, justifying its elevation to Critically Endangered status on the IUCN Red List of Threatened Species. PMID:27760201
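
    As a quick arithmetic check on the headline figure, the decline implied by the two population estimates quoted in the abstract can be recomputed directly:

        before, after = 16_900, 3_800   # abstract's pre-war and current estimates
        print(f"decline: {(before - after) / before:.1%}")   # ~77.5%, matching ~77%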

  13. Need humanities be so useless? Justifying the place and role of humanities as a critical resource for performance and practice.

    Science.gov (United States)

    Edgar, A; Pattison, S

    2006-12-01

    Justifying the existence, position, and relevance of academic humanities scholarship may be difficult in the face of chronic practical needs in health care. Such scholarship may seem parasitic on human activity and performance that directly contributes to human wellbeing and health care. Here, a possible and partial justification for the importance of scholarship in the humanities as a critical resource for practice and performance is undertaken by two humanities scholars. Human identity and emotion are reflected and defined by performances, both in the traditional disciplines of the humanities, such as art and literature, and in the sciences and medicine. The critical attitude that such performances might inadvertently undermine is sustained by the humanities. The humanities disciplines ask the question: "What is it to be human?" This questioning can constrain the uncritical emotion and expression (arising, for example, from developments in medicine and science) that might otherwise exclude or corrupt much that is of value in the healthcare sector and other areas of practical performance.

  14. Sentinel lymph node biopsy in patients with a needle core biopsy diagnosis of ductal carcinoma in situ: is it justified?

    LENUS (Irish Health Repository)

    Doyle, B

    2012-02-01

    BACKGROUND: The incidence of ductal carcinoma in situ (DCIS) has increased markedly with the introduction of population-based mammographic screening. DCIS is usually diagnosed non-operatively. Although sentinel lymph node biopsy (SNB) has become the standard of care for patients with invasive breast carcinoma, its use in patients with DCIS is controversial. AIM: To examine the justification for offering SNB at the time of primary surgery to patients with a needle core biopsy (NCB) diagnosis of DCIS. METHODS: A retrospective analysis was performed of 145 patients with an NCB diagnosis of DCIS who had SNB performed at the time of primary surgery. The study focused on rates of SNB positivity and underestimation of invasive carcinoma by NCB, and sought to identify factors that might predict the presence of invasive carcinoma in the excision specimen. RESULTS: 7/145 patients (4.8%) had a positive sentinel lymph node, four macrometastases and three micrometastases. 6/7 patients had invasive carcinoma in the final excision specimen. 55/145 patients (37.9%) with an NCB diagnosis of DCIS had invasive carcinoma in the excision specimen. The median invasive tumour size was 6 mm. A radiological mass and areas of invasion <1 mm, amounting to "at least microinvasion" on NCB were predictive of invasive carcinoma in the excision specimen. CONCLUSIONS: SNB positivity in pure DCIS is rare. In view of the high rate of underestimation of invasive carcinoma in patients with an NCB diagnosis of DCIS in this study, SNB appears justified in this group of patients.

  15. Cardiac retransplantation: is it justified in times of critical donor organ shortage? Long-term single-center experience.

    Science.gov (United States)

    Goerler, Heidi; Simon, Andre; Gohrbandt, Bernhard; Hagl, Christian; Oppelt, Petra; Haverich, Axel; Strueber, Martin

    2008-12-01

    Survival after heart transplantation has improved significantly over the last decades. There is a growing number of patients who require cardiac retransplantation because of chronic allograft dysfunction. With regard to the critical shortage of cardiac allograft donors, the decision to offer repeat heart transplantation must be carefully considered. Since 1983 a total of 807 heart transplantations have been performed at our institution. Among them, 41 patients received cardiac retransplantation, 18 patients because of acute graft failure and 23 because of chronic graft failure. Data were analyzed for demographics, morbidity and risk factors for mortality. The acute and chronic retransplant groups were compared to patients undergoing primary transplantation. The mean interval between primary transplantation and retransplantation was 1.9 days in the acute and 6.7 years in the chronic retransplant group. Mean follow-up was 6.9 years. Baseline characteristics were similar in the primary and retransplant groups. Actuarial survival rates at 1, 3, 5 and 7 years after primary cardiac transplantation compared to retransplantation were 83, 78, 72 and 64% vs 53, 50, 47 and 36%, respectively (p<0.001). Early mortality after acute retransplantation was significantly higher compared to late retransplantation (10/18, 55.6% vs 4/23, 17.4%, p=0.011). Major causes of death were acute and chronic rejection, infection and sepsis. Cardiac retransplantation is associated with lower survival rates compared to primary transplantation. However, results after retransplantation in chronic graft failure are significantly better compared to acute graft failure. Therefore, we consider cardiac retransplantation in chronic graft failure a justified therapeutic option. In contrast, patients with acute graft failure seem to be inappropriate candidates for cardiac retransplantation.

  16. Is referring patients with a positive history of allergic drug reactions or atopy for allergy testing to local anesthetics justified?

    Science.gov (United States)

    Erdeljic, Viktorija; Francetic, Igor; Likic, Robert; Bakran, Ivan; Makar-Ausperger, Ksenija; Simic, Petra

    2009-04-01

    Although no more than 1% of adverse reactions to local anesthetics (LA) are thought to be immunologically mediated, many patients continue to be referred to allergy clinics for allergy workup. We evaluated the impact of a history of drug hypersensitivity or atopy on results of allergy testing to LA, with the aim of determining the appropriateness of allergy testing to LA in such patients. We retrospectively analyzed medical records of 112 consecutive patients referred for allergy testing to LA in a 9-year period (1996-2005). Intradermal tests with diluted (1:10) LA were performed to identify patients at risk for immunoglobulin E (IgE)-mediated hypersensitivity reaction. The odds for being test-positive were calculated with regard to the defined risk factors (atopy, history of adverse reactions to LA or other drugs, underlying autoimmune disease). Eleven of 112 patients (9.8%) tested positive for allergy to LA. Atopy, history of adverse reactions to LA or other drugs and underlying autoimmune disease did not increase the odds for being test-positive. The prevalence of multiple drug hypersensitivity, IgE values and eosinophil count were not significantly higher among the patients who tested positive as compared to the patients who tested negative. According to our data, allergy testing to LA is not justified in patients with atopy or histories of adverse drug reactions other than to LA. Further studies using validated methods of allergy testing to LA coupled with analysis of defined risk factors are needed to definitively establish the indications for referral of patients for allergy testing to LA. Copyright 2009 Prous Science, S.A.U. or its licensors. All rights reserved.

  17. Validation of image quality in full-field digital mammography: is the replacement of wet by dry laser printers justified?

    Science.gov (United States)

    Schueller, Gerd; Kaindl, Elisabeth; Langenberger, Herbert; Stadler, Alfred; Schueller-Weidekamm, Claudia; Semturs, Friedrich; Helbich, Thomas H

    2007-05-01

    Dry laser printers have replaced wet laser printers to produce hard copies of high-resolution digital images, primarily because of environmental concerns. However, no scientific research data have been published that compare the image quality of dry and wet laser printers in full-field digital mammography (FFDM). This study compares the image quality of these two printer types. Objective image quality parameters of both printers were evaluated using a standardized printer test image, i.e., optical density and detectability of specific image elements (lines, curves, and shapes). Furthermore, mammograms of 129 patients with different breast tissue composition patterns were imaged with both printers. A total of 1806 subjective image quality parameters (brightness, contrast, and detail detection of anatomic structures), the detectability of breast lesions, as well as diagnostic performance according to the BI-RADS classification were evaluated. In addition, the presence of film artifacts was investigated. Optical density values were equal for the dry and the wet laser printer. Detection of specific image elements on the printer test image was not different. Ratings of subjective image quality parameters were equal, as were the detectability of breast lesions and the diagnostic performance. Dry laser printer images showed more artifacts (164 versus 27). However, these artifacts did not influence image quality. Based on the evidence of objective and subjective parameters, a dry laser printer equals the image quality of a wet laser printer in FFDM. Therefore, not only for reasons of environmental preference, the replacement of wet laser printers by dry laser printers in FFDM is justified.

  18. Validation of image quality in full-field digital mammography: Is the replacement of wet by dry laser printers justified?

    International Nuclear Information System (INIS)

    Schueller, Gerd; Kaindl, Elisabeth; Langenberger, Herbert; Stadler, Alfred; Schueller-Weidekamm, Claudia; Semturs, Friedrich; Helbich, Thomas H.

    2007-01-01

    Objective: Dry laser printers have replaced wet laser printers to produce hard copies of high-resolution digital images, primarily because of environmental concerns. However, no scientific research data have been published that compare the image quality of dry and wet laser printers in full-field digital mammography (FFDM). This study compares the image quality of these two printer types. Materials and methods: Objective image quality parameters of both printers were evaluated using a standardized printer test image, i.e., optical density and detectability of specific image elements (lines, curves, and shapes). Furthermore, mammograms of 129 patients with different breast tissue composition patterns were imaged with both printers. A total of 1806 subjective image quality parameters (brightness, contrast, and detail detection of anatomic structures), the detectability of breast lesions, as well as diagnostic performance according to the BI-RADS classification were evaluated. In addition, the presence of film artifacts was investigated. Results: Optical density values were equal for the dry and the wet laser printer. Detection of specific image elements on the printer test image was not different. Ratings of subjective image quality parameters were equal, as were the detectability of breast lesions and the diagnostic performance. Dry laser printer images showed more artifacts (164 versus 27). However, these artifacts did not influence image quality. Conclusion: Based on the evidence of objective and subjective parameters, a dry laser printer equals the image quality of a wet laser printer in FFDM. Therefore, not only for reasons of environmental preference, the replacement of wet laser printers by dry laser printers in FFDM is justified

  19. The frequency of Tay-Sachs disease causing mutations in the Brazilian Jewish population justifies a carrier screening program

    Directory of Open Access Journals (Sweden)

    Roberto Rozenberg

    CONTEXT: Tay-Sachs disease is an autosomal recessive disease characterized by progressive neurologic degeneration, fatal in early childhood. In the Ashkenazi Jewish population the disease incidence is about 1 in every 3,500 newborns and the carrier frequency is 1 in every 29 individuals. Carrier screening programs for Tay-Sachs disease have reduced disease incidence by 90% in high-risk populations in several countries. The Brazilian Jewish population is estimated at 90,000 individuals. Currently, there is no screening program for Tay-Sachs disease in this population. OBJECTIVE: To evaluate the importance of a Tay-Sachs disease carrier screening program in the Brazilian Jewish population by determining the frequency of heterozygotes and the acceptance of the program by the community. SETTING: Laboratory of Molecular Genetics - Institute of Biosciences - Universidade de São Paulo. PARTICIPANTS: 581 senior students from selected Jewish high schools. PROCEDURE: Molecular analysis of Tay-Sachs disease causing mutations by PCR amplification of genomic DNA, followed by restriction enzyme digestion. RESULTS: Among 581 students who attended educational classes, 404 (70%) elected to be tested for Tay-Sachs disease mutations. Of these, approximately 65% were of Ashkenazi Jewish origin. Eight carriers were detected, corresponding to a carrier frequency of 1 in every 33 individuals in the Ashkenazi Jewish fraction of the sample. CONCLUSION: The frequency of Tay-Sachs disease carriers among the Ashkenazi Jewish population of Brazil is similar to that of other countries where carrier screening programs have led to a significant decrease in disease incidence. Therefore, it is justifiable to implement a Tay-Sachs disease carrier screening program for the Brazilian Jewish population.
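
    As a rough consistency check on the reported carrier frequency, using only the counts quoted in the abstract (404 tested, about 65% of Ashkenazi origin, 8 carriers):

        tested = 404
        ashkenazi = round(tested * 0.65)      # ~263 students of Ashkenazi origin
        carriers = 8
        print(f"carrier frequency ~ 1 in {ashkenazi / carriers:.0f}")   # ~1 in 33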

  20. Economic modelling of energy services: Rectifying misspecified energy demand functions

    International Nuclear Information System (INIS)

    Hunt, Lester C.; Ryan, David L.

    2015-01-01

    Although it is well known that energy demand is derived, since energy is required not for its own sake but for the energy services it produces – such as heating, lighting, and motive power – energy demand models, both theoretical and empirical, often fail to take account of this feature. In this paper, we highlight the misspecification that results from ignoring this aspect, and its empirical implications – biased estimates of price elasticities and other measures – and provide a relatively simple and empirically practicable way to rectify it, which has a strong theoretical grounding. To do so, we develop an explicit model of consumer behaviour in which utility derives from consumption of energy services rather than from the energy sources that are used to produce them. As we discuss, this approach opens up the possibility of examining many aspects of energy demand in a theoretically sound way that have not previously been considered on a widespread basis, although some existing empirical work could be interpreted as being consistent with this type of specification. While this formulation yields demand equations for energy services rather than for energy or particular energy sources, these are shown to be readily converted, without added complexity, into the standard type of energy demand equation(s) that is (are) typically estimated. The additional terms that the resulting energy demand equations include, compared to those that are typically estimated, highlight the misspecification that is implicit when typical energy demand equations are estimated. A simple solution for dealing with an apparent drawback of this formulation for empirical purposes, namely that information is required on typically unobserved energy efficiency, indicates how energy efficiency can be captured in the model, such as by including exogenous trends and/or including its possible dependence on past energy prices. The approach is illustrated using an empirical example that involves
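
    A minimal sketch of the idea under assumed functional forms: if energy services are s = efficiency × e, the effective service price is p_e / efficiency, and the derived energy demand inherits efficiency terms that a standard energy demand equation omits. The constant-elasticity service demand and all parameter values below are invented for illustration:

        import math

        def energy_demand(p_energy, income, efficiency,
                          alpha=1.0, price_elast=-0.4, income_elast=0.8):
            """Energy demand derived from a constant-elasticity demand for
            energy services s = efficiency * e, priced at p_energy / efficiency.
            All parameter values are illustrative."""
            p_service = p_energy / efficiency
            ln_s = (alpha + price_elast * math.log(p_service)
                    + income_elast * math.log(income))
            return math.exp(ln_s) / efficiency   # energy delivering s units of service

        # Rising efficiency lowers the service price and raises service demand,
        # yet can still lower measured energy demand; the net effect depends on
        # the price elasticity.
        for eff in (1.0, 1.2, 1.5):
            print(eff, round(energy_demand(p_energy=10.0, income=100.0,
                                           efficiency=eff), 3))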

  1. An individual-based probabilistic model for simulating fisheries population dynamics

    Directory of Open Access Journals (Sweden)

    Jie Cao

    2016-12-01

    The purpose of stock assessment is to support managers in making informed decisions about removals from fish populations. Errors in assessment models may have devastating impacts on population fitness and negative impacts on the economy of the resource users. Thus, accurate estimation of population size and growth rates is critical for success. Evaluating and testing the behavior and performance of stock assessment models, and assessing the consequences of model misspecification and the impact of management strategies, requires an operating model that accurately describes the dynamics of the target species and can resolve spatial and seasonal changes. In addition, the most thorough evaluations of assessment models use an operating model that takes a different form than the assessment model. This paper presents an individual-based probabilistic model used to simulate the complex dynamics of populations and their associated fisheries. Various components of population dynamics are expressed as random Bernoulli trials in the model, and detailed life and fishery histories of each individual are tracked over their life span. The simulation model is designed to be flexible so it can be used for different species and fisheries. It can simulate mixing among multiple stocks and link stock-recruit relationships to environmental factors. Furthermore, the model allows for flexibility in sub-models (e.g., growth and recruitment) and model assumptions (e.g., age- or size-dependent selectivity). This model enables the user to conduct various simulation studies, including testing the performance of assessment models under different assumptions, assessing the impacts of model misspecification and evaluating management strategies.
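
    To illustrate the "random Bernoulli trials" device, here is a minimal individual-based sketch, not the paper's model, in which survival, capture and recruitment are independent coin flips per individual per year; all parameters are invented:

        import random

        random.seed(1)

        class Fish:
            def __init__(self, age=0):
                self.age = age
                self.alive = True

        def simulate(pop_size=500, years=20,
                     p_survive=0.8, p_capture=0.1, recruits_per_spawner=0.4):
            """Each demographic event is a Bernoulli trial per individual per year."""
            fish = [Fish(age=random.randint(0, 5)) for _ in range(pop_size)]
            for _ in range(years):
                recruits = []
                for f in fish:
                    if random.random() < p_capture:      # fishery removal
                        f.alive = False
                    elif random.random() > p_survive:    # natural mortality
                        f.alive = False
                    else:
                        f.age += 1
                        if f.age >= 2 and random.random() < recruits_per_spawner:
                            recruits.append(Fish())      # recruitment as a trial
                fish = [f for f in fish if f.alive] + recruits
            return len(fish)

        print("final population:", simulate())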

  2. Information matrix estimation procedures for cognitive diagnostic models.

    Science.gov (United States)

    Liu, Yanlou; Xin, Tao; Andersson, Björn; Tian, Wei

    2018-03-06

    Two new methods to estimate the asymptotic covariance matrix for marginal maximum likelihood estimation of cognitive diagnosis models (CDMs), the inverse of the observed information matrix and the sandwich-type estimator, are introduced. Unlike several previous covariance matrix estimators, the new methods take into account both the item and structural parameters. The relationships between the observed information matrix, the empirical cross-product information matrix, the sandwich-type covariance matrix and the two approaches proposed by de la Torre (2009, J. Educ. Behav. Stat., 34, 115) are discussed. Simulation results show that, for a correctly specified CDM and Q-matrix or with a slightly misspecified probability model, the observed information matrix and the sandwich-type covariance matrix exhibit good performance with respect to providing consistent standard errors of item parameter estimates. However, with substantial model misspecification only the sandwich-type covariance matrix exhibits robust performance. © 2018 The British Psychological Society.
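
    For reference, a generic sketch of the sandwich-type covariance construction A⁻¹BA⁻¹ from per-respondent score contributions; the arrays below are random placeholders rather than CDM output, and this is the standard construction, not the authors' code:

        import numpy as np

        def sandwich_covariance(scores, information):
            """scores: (n, p) per-respondent score vectors at the estimate;
            information: (p, p) observed information (Hessian of the negative
            log-likelihood). Returns A^{-1} B A^{-1}."""
            bread_inv = np.linalg.inv(information)
            meat = scores.T @ scores
            return bread_inv @ meat @ bread_inv

        # Random placeholders standing in for model output.
        rng = np.random.default_rng(0)
        scores = rng.normal(size=(200, 3))
        information = scores.T @ scores   # placeholder; B = A under correct specification
        print(np.sqrt(np.diag(sandwich_covariance(scores, information))))  # standard errors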

  3. Brand Cigarillos: Low Price but High Particulate Matter Levels-Is Their Favorable Taxation in the European Union Justified?

    Science.gov (United States)

    Wasel, Julia; Boll, Michael; Schulze, Michaela; Mueller, Daniel; Bundschuh, Matthias; Groneberg, David A; Gerber, Alexander

    2015-08-06

    taxation of cigarillos is not justifiable.

  4. Simple, efficient estimators of treatment effects in randomized trials using generalized linear models to leverage baseline variables.

    Science.gov (United States)

    Rosenblum, Michael; van der Laan, Mark J

    2010-04-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy-to-compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation.
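
    A hedged sketch of the special case described, on simulated data with statsmodels: the treatment coefficient from a main-terms Poisson working model is read off as an estimate of the marginal log rate ratio, even though the working model is deliberately misspecified:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(42)
        n = 5000
        baseline = rng.normal(size=n)            # pre-randomization covariate
        treat = rng.integers(0, 2, size=n)       # randomized assignment
        # True rates are deliberately NOT log-linear in the covariate, so the
        # main-terms Poisson working model below is misspecified.
        y = rng.poisson(np.exp(0.2 * treat + np.abs(baseline)))

        X = sm.add_constant(np.column_stack([treat, baseline]))
        fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        # The treatment coefficient still targets the marginal log rate ratio (0.2 here).
        print("treatment coefficient:", round(fit.params[1], 3))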

  5. Fitting additive hazards models for case-cohort studies: a multiple imputation approach.

    Science.gov (United States)

    Jung, Jinhyouk; Harel, Ofer; Kang, Sangwook

    2016-07-30

    In this paper, we consider fitting semiparametric additive hazards models for case-cohort studies using a multiple imputation approach. In a case-cohort study, main exposure variables are measured only on some selected subjects, but other covariates are often available for the whole cohort. We consider this as a special case of a missing covariate by design. We propose to employ a popular incomplete data method, multiple imputation, for estimation of the regression parameters in additive hazards models. For imputation models, an imputation modeling procedure based on a rejection sampling is developed. A simple imputation modeling that can naturally be applied to a general missing-at-random situation is also considered and compared with the rejection sampling method via extensive simulation studies. In addition, a misspecification aspect in imputation modeling is investigated. The proposed procedures are illustrated using a cancer data example. Copyright © 2015 John Wiley & Sons, Ltd.
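
    For the multiple-imputation side of the approach, here is a generic sketch of Rubin's combining rules; this is the standard pooling step, not the authors' rejection-sampling imputation model, and the estimates below are invented:

        import numpy as np

        def rubin_combine(estimates, variances):
            """Combine m completed-data point estimates and their variances."""
            m = len(estimates)
            qbar = np.mean(estimates)              # pooled estimate
            w = np.mean(variances)                 # within-imputation variance
            b = np.var(estimates, ddof=1)          # between-imputation variance
            total_var = w + (1.0 + 1.0 / m) * b
            return qbar, np.sqrt(total_var)

        est, se = rubin_combine([0.12, 0.15, 0.11, 0.14, 0.13],
                                [0.004, 0.005, 0.004, 0.005, 0.004])
        print(f"pooled estimate {est:.3f} (SE {se:.3f})")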

  6. Review Of Legal Relevance Program School No Political Party Based On The Proportionality And Evidence And Justifiability Controls Applied By The Brazilian Supreme Court

    OpenAIRE

    Baggenstoss, Grazielly Alessandra

    2016-01-01

    This research examines the legal context of the School No Political Party Program, which aims to introduce legal provisions into the Law of Guidelines and Bases of National Education. The research problem is whether the proposed program has legal relevance to the Brazilian legal system, as well as to the current pedagogical context. Thus, using the deductive method, the question is examined through Robert Alexy's proportionality test and the Evidence and Justifiability Controls applied...

  7. How to define and build an effective cyber threat intelligence capability: how to understand, justify and implement a new approach to security

    CERN Document Server

    Dalziel, Henry; Carnall, James

    2014-01-01

    Intelligence-Led Security: How to Understand, Justify and Implement a New Approach to Security is a concise review of the concept of Intelligence-Led Security. Protecting a business, including its information and intellectual property, physical infrastructure, employees, and reputation, has become increasingly difficult. Online threats come from all sides: internal leaks and external adversaries; domestic hacktivists and overseas cybercrime syndicates; targeted threats and mass attacks. And these threats run the gamut from targeted to indiscriminate to entirely accidental.

  8. How can health care organisations make and justify decisions about risk reduction? Lessons from a cross-industry review and a health care stakeholder consensus development process

    International Nuclear Information System (INIS)

    Sujan, Mark A.; Habli, Ibrahim; Kelly, Tim P.; Gühnemann, Astrid; Pozzi, Simone; Johnson, Christopher W.

    2017-01-01

    Interventions to reduce risk often have an associated cost. In UK industries decisions about risk reduction are made and justified within a shared regulatory framework that requires that risk be reduced as low as reasonably practicable. In health care no such regulatory framework exists, and the practice of making decisions about risk reduction is varied and lacks transparency. Can health care organisations learn from relevant industry experiences about making and justifying risk reduction decisions? This paper presents lessons from a qualitative study undertaken with 21 participants from five industries about how such decisions are made and justified in UK industry. Recommendations were developed based on a consensus development exercise undertaken with 20 health care stakeholders. The paper argues that there is a need in health care to develop a regulatory framework and an agreed process for managing explicitly the trade-off between risk reduction and cost. The framework should include guidance about a health care specific notion of acceptable levels of risk, guidance about standardised risk reduction interventions, it should include regulatory incentives for health care organisations to reduce risk, and it should encourage the adoption of an approach for documenting explicitly an organisation's risk position. - Highlights: • Empirical description of industry perceptions on making risk reduction decisions. • Health care consensus development identified five recommendations. • Risk concept should be better integrated into safety management. • Education and awareness about risk concept are required. • Health systems need to start a dialogue about acceptable levels of risk.

  9. How can interventions for inhabitants be justified after a nuclear accident? An approach based on the radiological protection system of the international commission on radiological protection

    International Nuclear Information System (INIS)

    Takahara, Shogo; Homma, Toshimitsu; Yoneda, Minoru; Shimada, Yoko

    2016-01-01

    Management of radiation-induced risks in areas contaminated by a nuclear accident is characterized by three ethical issues: (1) risk trade-off, (2) paternalistic intervention and (3) individualization of responsibilities. To deal with these issues and to clarify the requirements for justifying interventions aimed at reducing radiation-induced risks, we explored the ethical basis of the radiological protection system of the International Commission on Radiological Protection (ICRP). The ICRP's radiological protection system is established on the basis of three normative ethics, i.e. utilitarianism, deontology and virtue ethics. The three ethical issues can be resolved within the decision-making framework constructed by combining these ethical theories. In addition, interventions for inhabitants can potentially be justified in two ways. Firstly, when the dangers are severe and far-reaching, interventions could be justified with a sufficient explanation of the nature of the harmful effects (or beneficial consequences). Secondly, if the autonomy of the individuals subject to intervention can be promoted, those interventions could be justified. (author)

  10. Specification and testing of Multiplicative Time-Varying GARCH models with applications

    DEFF Research Database (Denmark)

    Amado, Cristina; Teräsvirta, Timo

    2017-01-01

    In this article, we develop a specification technique for building the multiplicative time-varying GARCH models of Amado and Teräsvirta (2008, 2013). The variance is decomposed into an unconditional and a conditional component such that the unconditional variance component is allowed to evolve smoothly over time. This nonstationary component is defined as a linear combination of logistic transition functions with time as the transition variable. The appropriate number of transition functions is determined by a sequence of specification tests. For that purpose, a coherent modelling strategy based on statistical inference is presented. It is heavily dependent on Lagrange multiplier type misspecification tests. The tests are easily implemented as they are entirely based on auxiliary regressions. Finite-sample properties of the strategy and tests are examined by simulation. The modelling strategy…
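
    For concreteness, a sketch of the multiplicative decomposition, with the conditional variance h_t following a GARCH(1,1) and the unconditional component g_t built from a logistic transition function of rescaled time; all parameter values are invented:

        import numpy as np

        def g_component(T, delta0=1.0, delta1=0.8, gamma=25.0, c=0.5):
            """Deterministic unconditional-variance component g_t: a logistic
            transition in rescaled time t/T (one transition for simplicity)."""
            t = np.arange(1, T + 1) / T
            return delta0 + delta1 / (1.0 + np.exp(-gamma * (t - c)))

        def simulate_mtv_garch(T=1000, omega=0.05, alpha=0.08, beta=0.90, seed=3):
            """eps_t = z_t * sqrt(h_t * g_t); h_t is GARCH(1,1) in the
            g-standardized series phi_t = eps_t / sqrt(g_t)."""
            rng = np.random.default_rng(seed)
            g = g_component(T)
            z = rng.standard_normal(T)
            h = np.empty(T)
            eps = np.empty(T)
            h[0] = omega / (1.0 - alpha - beta)
            phi_prev = 0.0
            for t in range(T):
                if t > 0:
                    h[t] = omega + alpha * phi_prev ** 2 + beta * h[t - 1]
                eps[t] = z[t] * np.sqrt(h[t] * g[t])
                phi_prev = eps[t] / np.sqrt(g[t])
            return eps

        print(simulate_mtv_garch()[:5].round(3))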

  11. Adjusting for Confounding in Early Postlaunch Settings: Going Beyond Logistic Regression Models.

    Science.gov (United States)

    Schmidt, Amand F; Klungel, Olaf H; Groenwold, Rolf H H

    2016-01-01

    Postlaunch data on medical treatments can be analyzed to explore adverse events or relative effectiveness in real-life settings. These analyses are often complicated by the number of potential confounders and the possibility of model misspecification. We conducted a simulation study to compare the performance of logistic regression, propensity score, disease risk score, and stabilized inverse probability weighting methods to adjust for confounding. Model misspecification was induced in the independent derivation dataset. We evaluated performance using relative bias and confidence interval coverage of the true effect, among other metrics. At low events per coefficient (1.0 and 0.5), the logistic regression estimates had a large relative bias (greater than -100%). Bias of the disease risk score estimates was at most 13.48% and 18.83%, respectively. For the propensity score model, this was 8.74% and >100%, respectively. At events per coefficient of 1.0 and 0.5, inverse probability weighting frequently failed or reduced to a crude regression, resulting in biases of -8.49% and 24.55%. Coverage of logistic regression estimates became less than the nominal level at events per coefficient ≤5. For the disease risk score, inverse probability weighting, and propensity score, coverage became less than nominal at events per coefficient ≤2.5, ≤1.0, and ≤1.0, respectively. Bias of misspecified disease risk score models was 16.55%. In settings with low events/exposed subjects per coefficient, disease risk score methods can be useful alternatives to logistic regression models, especially when propensity score models cannot be used. Despite the better performance of disease risk score methods than logistic regression and propensity score models in small events-per-coefficient settings, bias and coverage still deviated from nominal values.
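
    A hedged sketch of disease-risk-score adjustment on simulated data (one common variant: fit the outcome model among the unexposed, then adjust the exposure-outcome regression for the predicted baseline risk); this is illustrative, not the study's protocol:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 2000
        x = rng.normal(size=(n, 3))                           # potential confounders
        exposure = rng.binomial(1, 1.0 / (1.0 + np.exp(-x[:, 0])))
        logit = -2.0 + 0.5 * exposure + x @ np.array([0.4, 0.3, -0.2])
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

        # Step 1: disease risk score fitted among the unexposed only.
        unexp = exposure == 0
        drs_fit = sm.Logit(y[unexp], sm.add_constant(x[unexp])).fit(disp=0)
        drs = drs_fit.predict(sm.add_constant(x))             # predicted baseline risk

        # Step 2: outcome regressed on exposure plus the scalar risk score.
        final = sm.Logit(y, sm.add_constant(
            np.column_stack([exposure, drs]))).fit(disp=0)
        print("adjusted log-odds ratio:", round(final.params[1], 3))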

  12. The use of a numerical method to justify the criteria for the maximum settlement of the tank foundation

    Science.gov (United States)

    Tarasenko, Alexander; Chepur, Petr; Gruchenkova, Alesya

    2017-11-01

    The article examines the problem of assessing the permissible values of uneven settlement for a vertical steel tank base and foundation. A numerical experiment was performed using a finite element model of the tank. The model took into account the geometric shape of the structure and its additional stiffening elements that affect the stress-strain state of the tank. An equation was obtained that allowed determining the maximum possible deformation of the bottom outer contour during uneven settlement. Depending on the length of the uneven settlement zone, the values of the permissible settlement of the tank base were determined. The article proposes new values of the maximum permissible tank settlement with additional stiffening elements.

  13. [Hemolytic disease of the newborn has not vanished from Finland--routine protection of RhD negative mothers during pregnancy is justifiable].

    Science.gov (United States)

    Sainio, Susanna; Kuosmanen, Malla

    2012-01-01

    Prophylaxis of RhD negative mothers with anti-D immunoglobulin after childbirth is the most important procedure reducing the immunization of the mother and the risk of severe hemolytic disease of the newborn. In spite of this, anti-D antibodies having relevance to pregnancy are later detected in 1.8% of RhD negative mothers. Half of these cases could be prevented by routine anti-D prophylaxis given to the mothers during weeks 28 to 34 of pregnancy. Convincing evidence of the effectiveness of this measure has accumulated in the last few years, and application of the treatment is justified also in Finland.

  14. On adjustment for auxiliary covariates in additive hazard models for the analysis of randomized experiments

    DEFF Research Database (Denmark)

    Vansteelandt, S.; Martinussen, Torben; Tchetgen, E. J Tchetgen

    2014-01-01

    We consider additive hazard models (Aalen, 1989) for the effect of a randomized treatment on a survival outcome, adjusting for auxiliary baseline covariates. We demonstrate that the Aalen least-squares estimator of the treatment effect parameter is asymptotically unbiased, even when the hazard's dependence on time or on the auxiliary covariates is misspecified, and even away from the null hypothesis of no treatment effect. We furthermore show that adjustment for auxiliary baseline covariates does not change the asymptotic variance of the estimator of the effect of a randomized treatment. We conclude that, in view of its robustness against model misspecification, Aalen least-squares estimation is attractive for evaluating treatment effects on a survival outcome in randomized experiments, and the primary reasons to consider baseline covariate adjustment in such settings could be interest in subgroup…

  15. Are more restrictive food cadmium standards justifiable health safety measures or opportunistic barriers to trade? An answer from economics and public health

    International Nuclear Information System (INIS)

    Figueroa B, Eugenio

    2008-01-01

    In the past, Cd regulations have imposed trade restrictions on foodstuffs from some developing countries seeking to access markets in the developed world, and in recent years there has been a trend towards imposing more rigorous standards. This trend seems to respond more to public- and private-sector strategies in some developed countries to create disguised barriers to trade and to improve market competitiveness for their industries than to scientifically justified health precautions (sanitary and phytosanitary measures) and/or technical barriers to trade acceptable under the Uruguay Round Agreement of the WTO. Applying more rigorous Cd standards in some developed countries will not only increase production costs in developing countries but will also have a large impact on their economies, which are highly dependent on international agricultural markets. In the current literature there are large uncertainties in the cause-effect relationship between current levels of Cd intake and eventual health effects in human beings; even the risk of Cd to kidney function is under considerable debate. Recent work on the importance of the zinc:Cd ratio, rather than Cd levels alone, in determining Cd risk factors, on the one hand, and on the declining trends of Cd levels in foods and soils, on the other, also indicates a lack of scientific evidence justifying more restrictive cadmium standards. This suggests that developing countries should press for changes to, and greater transparency in, the current international structures and procedures for setting sanitary and phytosanitary measures and technical barriers to trade

  16. Attitudes Justifying Domestic Violence Predict Endorsement of Corporal Punishment and Physical and Psychological Aggression towards Children: A Study in 25 Low- and Middle-Income Countries

    Science.gov (United States)

    Lansford, Jennifer E.; Deater-Deckard, Kirby; Bornstein, Marc H.; Putnick, Diane L.; Bradley, Robert H.

    2014-01-01

    Objective The Convention on the Rights of the Child has prompted countries to protect children from abuse and exploitation. Exposure to domestic violence and corporal punishment are risk factors in children’s development. This study investigated how women’s attitudes about domestic violence are related to attitudes about corporal punishment and harsh behaviors toward children, and whether country-wide norms regarding domestic violence and corporal punishment are related to psychological aggression and physical violence toward children. Study design Data were drawn from the Multiple Indicator Cluster Survey, a nationally representative and internationally comparable household survey developed by UNICEF. Measures of domestic violence and discipline were completed by 85,999 female caregivers of children between the ages of 2 and 14 years from families in 25 low- and middle-income countries. Results Mothers who believed that husbands were justified in hitting their wives were more likely to believe that corporal punishment is necessary to rear children. Mothers who believed both that husbands were justified in hitting their wives and that corporal punishment is necessary to rear children were more likely to report that their child had experienced psychological aggression and physical violence. Countrywide norms regarding the acceptability of husbands hitting wives and the advisability of corporal punishment moderated the links between mothers’ attitudes and their behaviors toward children. Conclusions Pediatricians can address parents’ psychological aggression and physical violence toward children by discussing parents’ attitudes and behaviors within a framework that incorporates social norms regarding the acceptability of domestic violence and corporal punishment. PMID:24412139

  17. Developing critical consciousness or justifying the system? A qualitative analysis of attributions for poverty and wealth among low-income racial/ethnic minority and immigrant women.

    Science.gov (United States)

    Godfrey, Erin B; Wolf, Sharon

    2016-01-01

    Economic inequality is a growing concern in the United States and globally. The current study uses qualitative techniques to (a) explore the attributions low-income racial/ethnic minority and immigrant women make for poverty and wealth in the U.S., and (b) clarify important links between attributions, critical consciousness development, and system justification theory. In-depth interview transcripts from 19 low-income immigrant Dominican and Mexican and native African American mothers in a large Northeastern city were analyzed using open coding techniques. Interview topics included perceptions of current economic inequality and mobility and experiences of daily economic hardships. Almost all respondents attributed economic inequality to individual factors (character flaws, lack of hard work). Structural explanations for poverty and wealth were expressed by fewer than half the sample and almost always paired with individual explanations. Moreover, individual attributions included system-justifying beliefs such as the belief in meritocracy and equality of opportunity and structural attributions represented varying levels of critical consciousness. Our analysis sheds new light on how and why individuals simultaneously hold individual and structural attributions and highlights key links between system justification and critical consciousness. It shows that critical consciousness and system justification do not represent opposite stances along a single underlying continuum, but are distinct belief systems and motivations. It also suggests that the motive to justify the system is a key psychological process impeding the development of critical consciousness. Implications for scholarship and intervention are discussed. (c) 2016 APA, all rights reserved.

  18. Are more restrictive food cadmium standards justifiable health safety measures or opportunistic barriers to trade? An answer from economics and public health.

    Science.gov (United States)

    Figueroa B, Eugenio

    2008-01-15

    In the past, Cd regulations have imposed trade restrictions on foodstuffs from some developing countries seeking to access markets in the developed world, and in recent years there has been a trend towards imposing more rigorous standards. This trend seems to respond more to public- and private-sector strategies in some developed countries to create disguised barriers to trade and to improve market competitiveness for their industries than to scientifically justified health precautions (sanitary and phytosanitary measures) and/or technical barriers to trade acceptable under the Uruguay Round Agreement of the WTO. Applying more rigorous Cd standards in some developed countries will not only increase production costs in developing countries but will also have a large impact on their economies, which are highly dependent on international agricultural markets. In the current literature there are large uncertainties in the cause-effect relationship between current levels of Cd intake and eventual health effects in human beings; even the risk of Cd to kidney function is under considerable debate. Recent work on the importance of the zinc:Cd ratio, rather than Cd levels alone, in determining Cd risk factors, on the one hand, and on the declining trends of Cd levels in foods and soils, on the other, also indicates a lack of scientific evidence justifying more restrictive cadmium standards. This suggests that developing countries should press for changes to, and greater transparency in, the current international structures and procedures for setting sanitary and phytosanitary measures and technical barriers to trade.

  19. Developing critical consciousness or justifying the system? A qualitative analysis of attributions for poverty and wealth among low-income racial/ethnic minority and immigrant women

    Science.gov (United States)

    Godfrey, Erin B.; Wolf, Sharon

    2015-01-01

    Objectives Economic inequality is a growing concern in the United States and globally. The current study uses qualitative techniques to (1) explore the attributions low-income racial/ethnic minority and immigrant women make for poverty and wealth in the U.S., and (2) clarify important links between attributions, critical consciousness development and system justification theory. Methods In-depth interview transcripts from 19 low-income immigrant Dominican and Mexican and native African-American mothers in a large Northeastern city were analyzed using open coding techniques. Interview topics included perceptions of current economic inequality and mobility and experiences of daily economic hardships. Results Almost all respondents attributed economic inequality to individual factors (character flaws, lack of hard work). Structural explanations for poverty and wealth were expressed by less than half the sample and almost always paired with individual explanations. Moreover, individual attributions included system-justifying beliefs such as the belief in meritocracy and equality of opportunity and structural attributions represented varying levels of critical consciousness. Conclusions Our analysis sheds new light on how and why individuals simultaneously hold individual and structural attributions and highlights key links between system justification and critical consciousness. It shows that critical consciousness and system justification do not represent opposite stances along a single underlying continuum, but are distinct belief systems and motivations. It also suggests that the motive to justify the system is a key psychological process impeding the development of critical consciousness. Implications for scholarship and intervention are discussed. PMID:25915116

  20. Current Evidence to Justify, and the Methodological Considerations for a Randomised Controlled Trial Testing the Hypothesis that Statins Prevent the Malignant Progression of Barrett's Oesophagus

    Directory of Open Access Journals (Sweden)

    David Thurtle

    2014-12-01

    Barrett’s oesophagus is the predominant risk factor for oesophageal adenocarcinoma, a cancer whose incidence is increasing and which has a poor prognosis. This article reviews the latest experimental and epidemiological evidence justifying the development of a randomised controlled trial investigating the hypothesis that statins prevent the malignant progression of Barrett’s oesophagus, and explores the methodological considerations for such a trial. The experimental evidence suggests anti-carcinogenic properties of statins on oesophageal cancer cell lines, based on the inhibition of the mevalonate pathway and the production of pro-apoptotic proteins. The epidemiological evidence reports inverse associations between statin use and the incidence of oesophageal carcinoma in both general population and Barrett’s oesophagus cohorts. Such a randomised controlled trial would be a large multi-centre trial, probably investigating simvastatin, given the wide clinical experience with this drug, its relatively low side-effect profile and low financial cost. As with any clinical trial, high adherence is important, which could be increased with therapy-, patient-, doctor- and system-focussed interventions. We would suggest there is now sufficient evidence to justify a full clinical trial that attempts to prevent this aggressive cancer in a high-risk population.

  1. Justifiability of amniocentesis on the basis of positive findings of triple test, ultrasound scan and advanced maternal age

    Directory of Open Access Journals (Sweden)

    Dragoslav Bukvic

    2011-05-01

    Objective. To assess the effectiveness of antenatal screening for chromosomal abnormalities based on maternal age (≥35 years), positive ultrasound findings or a positive triple test. Materials and methods. Retrospective six-year study. The pregnant women underwent the established routine clinical and laboratory practice at the Department of Medical Genetics between 1997 and 2003. The women’s case notes were examined to identify indications for karyotyping, gestation period and the outcome of karyotyping and pregnancy. Results. Invasive antenatal tests were performed in 1440 cases: 1168 (81.11%) maternal age ≥35 (a), 72 (5.00%) positive triple test (b), 24 (1.67%) positive ultrasound scanning (c) and 176 (12.2%) other (psychological, personal reasons, etc.) (d). The overall positive predictive value was 1.67% (1.6% (a), 1.4% (b), 12.5% (c), 0.0% (d)). The constructed logistic regression model gave an odds-ratio of 8.647 for the “positive ultrasound result vs. maternal age ≥35” indication, while the odds-ratio for the triple test vs. maternal age ≥35 was 0.854. Conclusions. Amniocentesis and cytogenetic analysis of foetal karyotype should be presented as a diagnostic possibility to all women over 35 years. The application of biochemical markers was far from the expected results. If we compare results for the indication positive ultrasound scanning vs. maternal age, an odds-ratio of ~9 was obtained. These results demonstrate that the likelihood of obtaining positive results (i.e. the presence of chromosome alterations) from an amniocentesis with this indication is almost 9 times higher than from an amniocentesis performed solely for advanced maternal age.

  2. Relative efficiency of joint-model and full-conditional-specification multiple imputation when conditional models are compatible: The general location model.

    Science.gov (United States)

    Seaman, Shaun R; Hughes, Rachael A

    2016-09-05

    Estimating the parameters of a regression model of interest is complicated by missing data on the variables in that model. Multiple imputation is commonly used to handle these missing data. Joint model multiple imputation and full-conditional specification multiple imputation are known to yield imputed data with the same asymptotic distribution when the conditional models of full-conditional specification are compatible with that joint model. We show that this asymptotic equivalence of imputation distributions does not imply that joint model multiple imputation and full-conditional specification multiple imputation will also yield asymptotically equally efficient inference about the parameters of the model of interest, nor that they will be equally robust to misspecification of the joint model. When the conditional models used by full-conditional specification multiple imputation are linear, logistic and multinomial regressions, these are compatible with a restricted general location joint model. We show that multiple imputation using the restricted general location joint model can be substantially more asymptotically efficient than full-conditional specification multiple imputation, but this typically requires very strong associations between variables. When associations are weaker, the efficiency gain is small. Moreover, full-conditional specification multiple imputation is shown to be potentially much more robust than joint model multiple imputation using the restricted general location model to misspecification of that model when there is substantial missingness in the outcome variable. © The Author(s) 2016.

  3. Simulation for Teaching Orthopaedic Residents in a Competency-based Curriculum: Do the Benefits Justify the Increased Costs?

    Science.gov (United States)

    Nousiainen, Markku T; McQueen, Sydney A; Ferguson, Peter; Alman, Benjamin; Kraemer, William; Safir, Oleg; Reznick, Richard; Sonnadara, Ranil

    2016-04-01

    Although simulation-based training is becoming widespread in surgical education and research supports its use, one major limitation is cost. Until now, little has been published on the costs of simulation in residency training. At the University of Toronto, a novel competency-based curriculum in orthopaedic surgery has been implemented for training selected residents, which makes extensive use of simulation. Despite the benefits of this intensive approach to simulation, there is a need to consider its financial implications and demands on faculty time. This study presents a cost and faculty work-hours analysis of implementing simulation as a teaching and evaluation tool in the University of Toronto's novel competency-based curriculum program compared with the historic costs of using simulation in the residency training program. All invoices for simulation training were reviewed to determine the financial costs before and after implementation of the competency-based curriculum. Invoice items included costs for cadavers, artificial models, skills laboratory labor, associated materials, and standardized patients. Costs related to the surgical skills laboratory rental fees and orthopaedic implants were waived as a result of special arrangements with the skills laboratory and implant vendors. Although faculty time was not reimbursed, faculty hours dedicated to simulation were also evaluated. The academic year of 2008 to 2009 was chosen to represent an academic year that preceded the introduction of the competency-based curriculum. During this year, 12 residents used simulation for teaching. The academic year of 2010 to 2011 was chosen to represent an academic year when the competency-based curriculum training program was functioning parallel but separate from the regular stream of training. In this year, six residents used simulation for teaching and assessment. The academic year of 2012 to 2013 was chosen to represent an academic year when simulation was used equally

  4. Is the systematic use of automatic exposure control justified in pediatric abdominal computed tomography?

    Energy Technology Data Exchange (ETDEWEB)

    Brisse, H.; Robilliard, M.; Pierrat, N.; Gaboriaud, G.; Neuenschwander, S.; Rosenwald, J.C.; Aubert, B

    2007-10-15

    The use of automatic exposure control in pediatric abdominal computed tomography induces a possibly unnecessary increase in the dose to pelvic organs, and must therefore be justified by the diagnostic benefit expected from the examination. (N.C.)

  5. For Better or Worse? System-Justifying Beliefs in Sixth Grade Predict Trajectories of Self-Esteem and Behavior Across Early Adolescence.

    Science.gov (United States)

    Godfrey, Erin B; Santos, Carlos E; Burson, Esther

    2017-06-19

    Scholars call for more attention to how marginalization influences the development of low-income and racial/ethnic minority youth and emphasize the importance of youth's subjective perceptions of contexts. This study examines how beliefs about the fairness of the American system (system justification) in sixth grade influence trajectories of self-esteem and behavior among 257 early adolescents (average age 11.4) from a diverse, low-income, middle school in an urban southwestern city. System justification was associated with higher self-esteem, less delinquent behavior, and better classroom behavior in sixth grade but worse trajectories of these outcomes from sixth to eighth grade. These findings provide novel evidence that system-justifying beliefs undermine the well-being of marginalized youth and that early adolescence is a critical developmental period for this process. © 2017 The Authors. Child Development © 2017 Society for Research in Child Development, Inc.

  6. The EU Seal Products Ban – Why Ineffective Animal Welfare Protection Cannot Justify Trade Restrictions under European and International Trade Law

    Directory of Open Access Journals (Sweden)

    Martin Hennig

    2015-03-01

    In this article, the author questions the legitimacy of the general ban on trade in seal products adopted by the European Union. It is submitted that the EU Seal Regime, which permits the marketing of Greenlandic seal products derived from Inuit hunts, but excludes Canadian and Norwegian seal products from the European market, does not ensure a satisfactory degree of animal welfare protection in order to justify the comprehensive trade restriction in place. It is argued that the current ineffective EU ban on seal products, which according to the WTO Appellate Body cannot be reconciled with the objective of protecting animal welfare, has no legal basis in EU Treaties and should be annulled.

  7. Attitudes justifying domestic violence predict endorsement of corporal punishment and physical and psychological aggression towards children: a study in 25 low- and middle-income countries.

    Science.gov (United States)

    Lansford, Jennifer E; Deater-Deckard, Kirby; Bornstein, Marc H; Putnick, Diane L; Bradley, Robert H

    2014-05-01

    The Convention on the Rights of the Child has prompted countries to protect children from abuse and exploitation. Exposure to domestic violence and corporal punishment are risk factors in children's development. This study investigated how women's attitudes about domestic violence are related to attitudes about corporal punishment and harsh behaviors toward children, and whether country-wide norms regarding domestic violence and corporal punishment are related to psychological aggression and physical violence toward children. Data were drawn from the Multiple Indicator Cluster Survey, a nationally representative and internationally comparable household survey developed by the United Nations Children's Fund. Measures of domestic violence and discipline were completed by 85 999 female caregivers of children between the ages of 2 and 14 years from families in 25 low- and middle-income countries. Mothers who believed that husbands were justified in hitting their wives were more likely to believe that corporal punishment is necessary to rear children. Mothers who believed that husbands were justified in hitting their wives and that corporal punishment is necessary to rear children were more likely to report that their child had experienced psychological aggression and physical violence. Countrywide norms regarding the acceptability of husbands hitting wives and advisability of corporal punishment moderated the links between mothers' attitudes and their behaviors toward children. Pediatricians can address parents' psychological aggression and physical violence toward children by discussing parents' attitudes and behaviors within a framework that incorporates social norms regarding the acceptability of domestic violence and corporal punishment. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. The Ends Justify the Memes

    OpenAIRE

    Miller, Ian D.; Cupchik, Gerald C.

    2016-01-01

    This talk presents an update on my research into memes.  It begins with an introduction to memes that is suitable for any audience.  It concludes with a detailed description of human research and simulation results that converge with one another.  I also present a short online study on email forwarding chains.

  9. Can Intimacy Justify Home Education?

    Science.gov (United States)

    Merry, Michael S.; Howell, Charles

    2009-01-01

    Many parents cite intimacy as one of their reasons for deciding to educate at home. It seems intuitively obvious that home education is conducive to intimacy because of the increased time families spend together. Yet what is not clear is whether intimacy can provide justification for one's decision to home educate. To see whether this is so, we…

  10. Can intimacy justify home education?

    NARCIS (Netherlands)

    Merry, M.S.; Howell, C.

    2009-01-01

    Many parents cite intimacy as one of their reasons for deciding to educate at home. It seems intuitively obvious that home education is conducive to intimacy because of the increased time families spend together. Yet what is not clear is whether intimacy can provide justification for one’s decision

  11. Are Vulnerability Disclosure Deadlines Justified?

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Jason L. Wright; Lawrence Wellman

    2011-09-01

    Vulnerability research organizations Rapid7, Google Security team, and Zero Day Initiative recently imposed grace periods for public disclosure of vulnerabilities. The grace periods ranged from 45 to 182 days, after which disclosure might occur with or without an effective mitigation from the affected software vendor. At this time there is indirect evidence that the shorter grace periods of 45 and 60 days may not be practical. However, there is strong evidence that the recently announced Zero Day Initiative grace period of 182 days yields benefit in speeding up the patch creation process, and may be practical for many software products. Unfortunately, there is also evidence that the 182 day grace period results in more vulnerability announcements without an available patch.

  12. Wound healing in cell studies and animal model experiments by low level laser therapy; Were clinical studies justified? A systematic review

    NARCIS (Netherlands)

    Lucas, C.; Criens-Poublon, L. J.; Cockrell, C. T.; de Haan, R. J.

    2002-01-01

    Based on results of cell studies and animal experiments, clinical trials with Low Level Laser Therapy (LLLT) were performed, which finally did not demonstrate a beneficial effect on outcome of wound healing. The aim of this study was to investigate whether the evidence from cell studies and animal

  13. Cholera transmission dynamic models for public health practitioners.

    Science.gov (United States)

    Fung, Isaac Chun-Hai

    2014-02-12

    Great progress has been made in mathematical models of cholera transmission dynamics in recent years. However, little impact, if any, has been made by models upon public health decision-making and the day-to-day routine of epidemiologists. This paper provides a brief introduction to the basics of ordinary differential equation models of cholera transmission dynamics. We discuss a basic model adapted from Codeço (2001), and how it can be modified to incorporate different hypotheses, including the importance of asymptomatic or inapparent infections, and hyperinfectious V. cholerae and human-to-human transmission. We highlight three important challenges of cholera models: (1) model misspecification and parameter uncertainty, (2) modeling the impact of water, sanitation and hygiene interventions and (3) model structure. We use published models, especially those related to the 2010 Haitian outbreak, as examples. We emphasize that the choice of models should be dictated by the research questions in mind. More collaboration is needed between policy-makers, epidemiologists and modelers in public health.
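    A minimal Python sketch of a Codeço-style model may make the structure concrete: susceptibles S and infecteds I are coupled to an environmental V. cholerae concentration B through a saturating force of infection. All parameter values, including the shedding and decay rates, are illustrative assumptions, not calibrated to any outbreak.

      # Codeco (2001)-style SIB model; all parameters are illustrative only
      import numpy as np
      from scipy.integrate import solve_ivp

      def cholera(t, y, beta, kappa, gamma, delta, xi, mu, H):
          S, I, B = y
          lam = beta * B / (kappa + B)      # saturating force of infection
          dS = mu * (H - S) - lam * S       # demography and new infections
          dI = lam * S - (gamma + mu) * I   # recovery and mortality
          dB = xi * I - delta * B           # shedding minus bacterial decay
          return [dS, dI, dB]

      H = 10_000                            # population size (assumed)
      sol = solve_ivp(cholera, (0, 200), [H - 1, 1, 0],
                      args=(1.0, 1e6, 0.2, 0.33, 10.0, 1e-4, H), max_step=1.0)
      print(f"peak prevalence: {sol.y[1].max():.0f} cases")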

  14. Do cost savings from reductions in nosocomial infections justify additional costs of single-bed rooms in intensive care units? A simulation case study.

    Science.gov (United States)

    Sadatsafavi, Hessam; Niknejad, Bahar; Zadeh, Rana; Sadatsafavi, Mohsen

    2016-02-01

    Evidence shows that single-patient rooms can play an important role in preventing cross-transmission and reducing nosocomial infections in intensive care units (ICUs). This case study investigated whether cost savings from reductions in nosocomial infections justify the additional construction and operation costs of single-bed rooms in ICUs. We conducted deterministic and probabilistic return-on-investment analyses of converting the space occupied by open-bay rooms to single-bed rooms in an exemplary ICU. We used the findings of a study of an actual ICU in which the association between the locations of patients in single-bed vs open-bay rooms with infection risk was evaluated. Despite uncertainty in the estimates of costs, infection risks, and length of stay, the cost savings from the reduction of nosocomial infections in single-bed rooms in this case substantially outweighed additional construction and operation expenses. The mean value of internal rate of return over a 5-year analysis period was 56.18% (95% credible interval, 55.34%-57.02%). This case study shows that although single-patient rooms are more costly to build and operate, they can result in substantial savings compared with open-bay rooms by avoiding costs associated with nosocomial infections. Copyright © 2015 Elsevier Inc. All rights reserved.
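    A toy probabilistic return-on-investment calculation in Python illustrates the mechanics (not the study's actual model): uncertain inputs are drawn repeatedly and an internal rate of return is computed per draw over a 5-year horizon. Every number below is invented for illustration; the IRR routine comes from the separate numpy-financial package.

      import numpy as np
      import numpy_financial as npf   # pip install numpy-financial

      rng = np.random.default_rng(0)
      n_draws = 2_000
      build_premium = rng.normal(2.0e6, 2e5, n_draws)        # extra cost of single-bed rooms, $ (assumed)
      infections_avoided = rng.poisson(40, n_draws)          # per year (assumed)
      cost_per_infection = rng.lognormal(np.log(25_000), 0.3, n_draws)

      irrs = np.array([
          npf.irr([-c0] + [n_inf * c_inf] * 5)               # 5-year horizon, as in the study
          for c0, n_inf, c_inf in zip(build_premium, infections_avoided, cost_per_infection)
      ])
      print(f"mean IRR: {irrs.mean():.1%}; 95% interval: "
            f"({np.percentile(irrs, 2.5):.1%}, {np.percentile(irrs, 97.5):.1%})")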

  15. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.
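    DPpackage itself is an R package; as a rough Python analogue of Dirichlet-process-based density estimation, scikit-learn's BayesianGaussianMixture fits a truncated DP mixture of Gaussians. The data below are synthetic and the settings are only indicative.

      import numpy as np
      from sklearn.mixture import BayesianGaussianMixture

      rng = np.random.default_rng(1)
      x = np.concatenate([rng.normal(-2, 0.5, 300),
                          rng.normal(3, 1.0, 200)])[:, None]

      dpmm = BayesianGaussianMixture(
          n_components=20,                                   # truncation level
          weight_concentration_prior_type="dirichlet_process",
          weight_concentration_prior=1.0,                    # DP precision parameter
          random_state=0,
      ).fit(x)
      print("effective components:", (dpmm.weights_ > 0.01).sum())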

  16. Is screening for abnormal ECG patterns justified in long-term follow-up of childhood cancer survivors treated with anthracyclines?

    Science.gov (United States)

    Pourier, Milanthy S; Mavinkurve-Groothuis, Annelies M C; Loonen, Jacqueline; Bökkerink, Jos P M; Roeleveld, Nel; Beer, Gil; Bellersen, Louise; Kapusta, Livia

    2017-03-01

    ECG and echocardiography are noninvasive screening tools to detect subclinical cardiotoxicity in childhood cancer survivors (CCSs). Our aims were as follows: (1) assess the prevalence of abnormal ECG patterns, (2) determine the agreement between abnormal ECG patterns and echocardiographic abnormalities; and (3) determine whether ECG screening for subclinical cardiotoxicity in CCSs is justified. We retrospectively studied ECG and echocardiography in asymptomatic CCSs more than 5 years after anthracycline treatment. Exclusion criteria were abnormal ECG and/or echocardiogram at the start of therapy, incomplete follow-up data, clinical heart failure, cardiac medication, and congenital heart disease. ECG abnormalities were classified using the Minnesota Code. Level of agreement between ECG and echocardiography was calculated with Cohen kappa. We included 340 survivors with a mean follow-up of 14.5 years (range 5-32). ECG was abnormal in 73 survivors (21.5%), with ventricular conduction disorders, sinus bradycardia, and high-amplitude R waves being most common. Prolonged QTc (>450 msec) was found in two survivors, both with a cumulative anthracycline dose of 300 mg/m² or higher. Echocardiography showed abnormalities in 44 survivors (12.9%), mostly mild valvular abnormalities. The level of agreement between ECG and echocardiography was low (kappa 0.09). Male survivors more often had an abnormal ECG (corrected odds ratio: 3.00, 95% confidence interval: 1.68-5.37). Abnormal ECG patterns were present in 21% of asymptomatic long-term CCSs. Lack of agreement between abnormal ECG patterns and echocardiographic abnormalities may suggest that ECG is valuable in long-term follow-up of CCSs. However, it is not clear whether these abnormal ECG patterns will be clinically relevant. © 2016 Wiley Periodicals, Inc.

  17. “This is not a burning issue for me”: How citizens justify their use of wood heaters in a city with a severe air pollution problem

    International Nuclear Information System (INIS)

    Reeve, Ian; Scott, John; Hine, Donald W.; Bhullar, Navjot

    2013-01-01

    Although wood smoke pollution has been linked to health problems, wood burning remains a popular form of domestic heating in many countries across the world. In this paper, we describe the rhetoric of resistance to wood heater regulation amongst citizens in the regional Australian town of Armidale, where wood smoke levels regularly exceed national health advisory limits. We discuss how this is related to particular sources of resistance, such as affective attachment to wood heating and socio-cultural norms. The research draws on six focus groups with participants from households with and without wood heating. With reference to practice theory, we argue that citizen discourses favouring wood burning draw upon a rich suite of justifications and present this activity as a natural and traditional activity promoting comfort and cohesion. Such discourses also emphasise the identity of the town as a rural community and the supposed gemeinschaft qualities of such places. We show that, in this domain of energy policy, it is not enough to present ‘facts’ which have little emotional association or meaning for the populace. Rather, we need to understand how social scripts, often localised, inform identity and practice. - Highlights: ► The negative health effects of wood smoke from wood heaters are known by citizens. ► Continued use of wood heating is justified with a rich suite of rhetorical strategies. ► Some strategies try to negate or diminish the case for phasing out wood heaters. ► Other strategies present wood heating as a natural, traditional and social activity

  18. Surgery on unfavourable persistent N2/N3 non-small-cell lung cancer after trimodal therapy: do the results justify the risk?

    Science.gov (United States)

    Steger, Volker; Walker, Tobias; Mustafi, Migdat; Lehrach, Karoline; Kyriss, Thomas; Veit, Stefanie; Friedel, Godehard; Walles, Thorsten

    2012-01-01

    OBJECTIVES Persistent mediastinal lymph node metastasis after neoadjuvant therapy is a significant negative indicator for survival. Even though there is still no consensus on the matter, some authors advocate a thorough restaging prior to surgery and deny surgery in cases of persistent N2 because of the poor outcome. We analysed our results after trimodal therapy in pN2/N3 stage III non-small-cell lung cancer (NSCLC) and persistent mediastinal lymph node metastasis after neoadjuvant chemoradiotherapy. METHODS We conducted a retrospective cohort analysis of 167 patients who received trimodal therapy for stage III NSCLC. Progression-free interval and survival were calculated. T-stage, N-stage, ypT-stage, ypN2/3-stage and surgical procedure were tested as risk factors. RESULTS Eighty-three patients with potentially resectable initial pN2/3 underwent 44 pneumonectomies and 76% extended resections. Thirty-five patients showed persistent mediastinal lymph node metastasis after trimodal therapy. Treatment-related morbidity after operative therapy was 58%. Hospital mortality was 2.4%. The ypT- and ypN2/N3 stages were significant risk factors and, in the case of persistent mediastinal lymph node metastasis, median progression-free period was 17 months and median survival time was 21 months. CONCLUSIONS Persistent but resectable N2/N3 after chemoradiotherapy in stage III NSCLC is the least favourable subgroup of patients in neoadjuvant approaches. If surgery can be carried out with curative intent and low morbidity, completing trimodal therapy is justified, with an acceptable outcome. PMID:22997251

  19. Modeling of correlated data with informative cluster sizes: An evaluation of joint modeling and within-cluster resampling approaches.

    Science.gov (United States)

    Zhang, Bo; Liu, Wei; Zhang, Zhiwei; Qu, Yanping; Chen, Zhen; Albert, Paul S

    2017-08-01

    Joint modeling and within-cluster resampling are two approaches that are used for analyzing correlated data with informative cluster sizes. Motivated by a developmental toxicity study, we examined the performance and validity of these two approaches in testing covariate effects in generalized linear mixed-effects models. We show that the joint modeling approach is robust to the misspecification of cluster size models in terms of Type I and Type II errors when the corresponding covariates are not included in the random effects structure; otherwise, statistical tests may be affected. We also evaluate the performance of the within-cluster resampling procedure and thoroughly investigate its validity in modeling correlated data with informative cluster sizes. We show that within-cluster resampling is a valid alternative to joint modeling for cluster-specific covariates, but it is invalid for time-dependent covariates. The two methods are applied to a developmental toxicity study that investigated the effect of exposure to diethylene glycol dimethyl ether.
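    The within-cluster resampling idea is easy to sketch in Python: draw one observation per cluster (which breaks the dependence between cluster size and outcome), fit an ordinary logistic regression to each resample, and average the estimates. The data generation and all column names here are hypothetical.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      def wcr_logit(df, cluster, y, xs, n_resamples=500, seed=0):
          rng = np.random.default_rng(seed)
          coefs = []
          for _ in range(n_resamples):
              sub = df.groupby(cluster).sample(n=1, random_state=int(rng.integers(2**32)))
              X = sm.add_constant(sub[xs])
              coefs.append(sm.Logit(sub[y], X).fit(disp=0).params.to_numpy())
          return np.mean(coefs, axis=0)             # WCR point estimate

      # synthetic example with informative cluster sizes
      rng = np.random.default_rng(1)
      rows = []
      for cid in range(200):
          u = rng.normal()                          # cluster effect
          for _ in range(2 + 3 * int(u > 0)):       # size depends on the effect
              x = rng.normal()
              p = 1.0 / (1.0 + np.exp(-(-0.5 + x + u)))
              rows.append((cid, x, rng.binomial(1, p)))
      df = pd.DataFrame(rows, columns=["cid", "x", "y"])
      print(wcr_logit(df, "cid", "y", ["x"]))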

  20. A Smooth Transition Logit Model of the Effects of Deregulation in the Electricity Market

    DEFF Research Database (Denmark)

    Hurn, A.S.; Silvennoinen, Annastiina; Teräsvirta, Timo

    We consider a nonlinear vector model called the logistic vector smooth transition autoregressive model. The bivariate single-transition vector smooth transition regression model of Camacho (2004) is generalised to a multivariate and multitransition one. A modelling strategy consisting of specification, including testing linearity, estimation and evaluation of these models is constructed. Nonlinear least squares estimation of the parameters of the model is discussed. Evaluation by misspecification tests is carried out using tests derived in a companion paper. The use of the modelling strategy is illustrated by two applications. In the first one, the dynamic relationship between the US gasoline price and consumption is studied and possible asymmetries in it considered. The second application consists of modelling two well-known Icelandic riverflow series, previously considered by many hydrologists.
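    The core ingredient of such models is the logistic transition function, which blends two regimes continuously as the transition variable passes a location parameter. A minimal self-exciting example in Python, with made-up coefficients:

      import numpy as np

      def logistic_transition(s, gamma, c):
          """G(s) in (0, 1); approaches a step function as gamma grows."""
          return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

      # two-regime AR(1): y_t = (phi1 + G(y_{t-1}) * phi2) * y_{t-1} + eps_t
      rng = np.random.default_rng(0)
      T, phi1, phi2 = 300, 0.3, 0.5
      y = np.zeros(T)
      for t in range(1, T):
          G = logistic_transition(y[t - 1], gamma=5.0, c=0.0)
          y[t] = (phi1 + G * phi2) * y[t - 1] + rng.normal(0, 0.1)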

  1. Would it be legally justified to impose vaccination in Israel? Examining the issue in light of the 2013 detection of polio in Israeli sewage.

    Science.gov (United States)

    Kamin-Friedman, Shelly

    2017-10-30

    specify a variety of sanctions to accompany the enforcement of mandatory vaccinations which would be formulated from least to most restrictive according to the "intervention ladder" concept. The law should also describe the circumstances which would justify the implementation of each and every sanction as well as the procedural safeguards designed for established decisions and fairness toward the individual(s) whose rights are infringed by the application of these sanctions.

  2. Conditional Monte Carlo randomization tests for regression models.

    Science.gov (United States)

    Parhat, Parwen; Rosenberger, William F; Diao, Guoqing

    2014-08-15

    We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
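    A minimal Python sketch of a design-based Monte Carlo randomization test: the null distribution is built by re-drawing assignment sequences under the randomization procedure actually used (permuted blocks of four here), with a simple difference in means standing in for the model-based statistics (GLM residuals, martingale residuals) discussed in the paper. The data are simulated.

      import numpy as np

      def permuted_blocks(n, rng, block=4):
          seq = []
          for _ in range(-(-n // block)):                 # ceil(n / block) blocks
              b = np.array([0] * (block // 2) + [1] * (block // 2))
              rng.shuffle(b)
              seq.extend(b)
          return np.array(seq[:n])

      def randomization_test(y, assign, n_mc=10_000, seed=0):
          rng = np.random.default_rng(seed)
          stat = lambda a: y[a == 1].mean() - y[a == 0].mean()
          observed = stat(assign)
          null = np.array([stat(permuted_blocks(len(y), rng)) for _ in range(n_mc)])
          return (np.abs(null) >= abs(observed)).mean()   # two-sided Monte Carlo p-value

      rng = np.random.default_rng(1)
      assign = permuted_blocks(48, rng)
      y = 0.4 * assign + rng.normal(0, 1, 48)             # modest treatment effect
      print(f"p = {randomization_test(y, assign):.3f}")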

  3. The estimation of time-varying risks in asset pricing modelling using B-Spline method

    Science.gov (United States)

    Nurjannah; Solimun; Rinaldo, Adji

    2017-12-01

    Asset pricing modelling has been extensively studied in the past few decades to explore the risk-return relationship. The asset pricing literature has typically assumed a static risk-return relationship. However, several studies found anomalies in asset pricing modelling which captured the presence of risk instability. A dynamic model is proposed to offer a better alternative. The main problem highlighted in the dynamic model literature is that the set of conditioning information is unobservable and therefore some assumptions have to be made. Hence, the estimation requires additional assumptions about the dynamics of risk. To overcome this problem, nonparametric estimators can also be used as an alternative for estimating risk. The flexibility of the nonparametric setting avoids the problem of misspecification derived from selecting a functional form. This paper investigates the estimation of time-varying asset pricing models using the B-spline, one of the nonparametric approaches. The advantages of the spline method are its computational speed and simplicity, as well as the clarity of controlling curvature directly. Three popular asset pricing models are investigated, namely the CAPM (Capital Asset Pricing Model), the Fama-French 3-factor model and the Carhart 4-factor model. The results suggest that the estimated risks are time-varying and not stable over time, which confirms the risk instability anomaly. The result is more pronounced in the Carhart 4-factor model.
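    The B-spline device can be sketched in a few lines of Python: expanding the time-varying beta in a spline basis turns the nonparametric problem into ordinary least squares on basis-market interactions. The simulated data, the basis dimension and the CAPM-style setup are all assumptions for illustration.

      import numpy as np
      from patsy import dmatrix

      rng = np.random.default_rng(0)
      T = 500
      t = np.linspace(0, 1, T)
      mkt = rng.normal(0, 1, T)                        # market excess return
      true_beta = 0.8 + 0.6 * np.sin(2 * np.pi * t)    # slowly varying risk
      ret = true_beta * mkt + rng.normal(0, 0.5, T)    # asset excess return

      basis = np.asarray(dmatrix("bs(t, df=8, include_intercept=True) - 1",
                                 {"t": t}))            # T x 8 B-spline basis
      X = basis * mkt[:, None]                         # interact basis with market
      coef, *_ = np.linalg.lstsq(X, ret, rcond=None)
      beta_hat = basis @ coef                          # estimated beta(t)
      print(f"max abs error: {np.max(np.abs(beta_hat - true_beta)):.2f}")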

  4. Analysing model fit of psychometric process models: An overview, a new test and an application to the diffusion model.

    Science.gov (United States)

    Ranger, Jochen; Kuhn, Jörg-Tobias; Szardenings, Carsten

    2017-05-01

    Cognitive psychometric models embed cognitive process models into a latent trait framework in order to allow for individual differences. Due to their close relationship to the response process, the models allow for profound conclusions about the test takers. However, before such a model can be used its fit has to be checked carefully. In this manuscript we give an overview of existing tests of model fit and show their relation to the generalized moment test of Newey (Econometrica, 53, 1985, 1047) and Tauchen (J. Econometrics, 30, 1985, 415). We also present a new test, the Hausman test of misspecification (Hausman, Econometrica, 46, 1978, 1251). The Hausman test consists of a comparison of two estimates of the same item parameters which should be similar if the model holds. The performance of the Hausman test is evaluated in a simulation study. In this study we illustrate its application to two popular models in cognitive psychometrics, the Q-diffusion model and the D-diffusion model (van der Maas, Molenaar, Maris, Kievit, & Borsboom, Psychol. Rev., 118, 2011, 339; Molenaar, Tuerlinckx, & van der Maas, J. Stat. Softw., 66, 2015, 1). We also compare the performance of the test to four alternative tests of model fit, namely the M2 test (Molenaar et al., J. Stat. Softw., 66, 2015, 1), the moment test (Ranger et al., Br. J. Math. Stat. Psychol., 2016) and the test for binned time (Ranger & Kuhn, Psychol. Test. Assess., 56, 2014b, 370). The simulation study indicates that the Hausman test is superior to the latter tests. The test closely adheres to the nominal Type I error rate and has higher power in most simulation conditions. © 2017 The British Psychological Society.
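    The Hausman statistic itself is simple: for two estimates of the same parameter vector, H = d' V^{-1} d, where d is the difference of the estimates and V the difference of their covariance matrices, referred to a chi-square distribution. A generic Python sketch (in the psychometric application the two item-parameter estimates come from the fitted process model):

      import numpy as np
      from scipy import stats

      def hausman(b_robust, b_efficient, V_robust, V_efficient):
          d = b_robust - b_efficient
          V = V_robust - V_efficient        # covariance of the contrast under H0
          # if V is near-singular, np.linalg.pinv(V) @ d is often used instead
          H = d @ np.linalg.solve(V, d)
          return H, stats.chi2.sf(H, df=len(d))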

  5. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend their regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  6. Uncertainties in repository modeling

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1996-01-01

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend their regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  7. The Application of Formative Model Procedures to Assess The Quality of Measurements: The Case of Job Embeddedness’ Dimensions

    Directory of Open Access Journals (Sweden)

    Gugup Kismono

    2016-01-01

    Job embeddedness, as a contemporary model of employee turnover, has attracted the attention of scholars and researchers, and has been regarded as superior to traditional models in explaining voluntary employee turnover. Based on the taxonomy of multidimensional constructs, job embeddedness should be treated as an aggregate model. An aggregate model requires that a construct be formed or caused by (rather than cause) its indicators. Consequently, the job embeddedness construct should be classified as a formative measurement model. Nevertheless, evaluations of job embeddedness measures have so far used reflective model procedures. Because job embeddedness is a formative construct, using reflective model procedures to evaluate its measures risks construct misspecification errors. This study offers an alternative method that is more appropriate for evaluating the quality of job embeddedness measures, in particular its dimensions. The results show that the use of a reflective model is not appropriate, as indicated by the differing results obtained under the reflective and formative models. Keywords: job embeddedness, voluntary employee turnover, formative model, reflective model, misspecification.

  8. How do children justify what they know how to do? The concept of the geometric middle in the Piagetian approach to the formation of reasons

    Directory of Open Access Journals (Sweden)

    IOANNA BERTHOUD-PAPANDROPOULOU

    2008-01-01

    The relation between knowing how to do something and knowing how to justify one's action is explored within the Piagetian constructivist theoretical framework, particularly within the issue of reasons. Reasons are considered a reconstitution of the activity, contributing to its understanding by the subject. Thirty-four children aged three to nine were given a double task: determine the middle of geometric figures and then justify the chosen location. Results show that while the determination is correctly performed at all ages by efficient perceptive evaluation, the justification undergoes a development leading from illustrative, to argumentative, and finally to properly founding reasons from the age of eight years on. The relationship between action and reason is discussed at the cognitive, social and educational levels.

  9. Evaluation of fecal mRNA reproducibility via a marginal transformed mixture modeling approach

    Directory of Open Access Journals (Sweden)

    Davidson Laurie A

    2010-01-01

    Background: Developing and evaluating new technology that enables researchers to recover gene-expression levels of colonic cells from fecal samples could be key to a non-invasive screening tool for early detection of colon cancer. The current study, to the best of our knowledge, is the first to investigate and report the reproducibility of fecal microarray data. Using the intraclass correlation coefficient (ICC) as a measure of reproducibility and the preliminary analysis of fecal and mucosal data, we assessed the reliability of mixture density estimation and the reproducibility of fecal microarray data. Using Monte Carlo-based methods, we explored whether ICC values should be modeled as a beta-mixture or transformed first and fitted with a normal-mixture. We used outcomes from bootstrapped goodness-of-fit tests to determine which approach is less sensitive toward potential violation of distributional assumptions. Results: The graphical examination of both the distributions of ICC and probit-transformed ICC (PT-ICC) clearly shows that there are two components in the distributions. For ICC measurements, which are between 0 and 1, the practice in the literature has been to assume that the data points are from a beta-mixture distribution. Nevertheless, in our study we show that the use of a normal-mixture modeling approach on PT-ICC could provide superior performance. Conclusions: When modeling ICC values of gene expression levels, using a mixture of normals on the probit-transformed (PT) scale is less sensitive toward model mis-specification than using a mixture of betas. We show that a biased conclusion could be made if we follow the traditional approach and model the two sets of ICC values using the mixture of betas directly. The problematic estimation arises from the sensitivity of beta-mixtures toward model mis-specification, particularly when there are observations in the neighborhood of the boundary points, 0 or 1. Since beta-mixture modeling
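    The paper's key move is easy to replicate in Python: probit-transform the ICC values, then fit a two-component normal mixture on the transformed scale rather than a beta mixture on the raw scale. The simulated ICC values below are illustrative only.

      import numpy as np
      from scipy.stats import norm
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      icc = np.concatenate([rng.beta(2, 8, 500),     # low-reproducibility genes
                            rng.beta(8, 2, 300)])    # high-reproducibility genes
      icc = np.clip(icc, 1e-6, 1 - 1e-6)             # keep away from the 0/1 boundary
      pt_icc = norm.ppf(icc)                         # probit transform

      gm = GaussianMixture(n_components=2, random_state=0).fit(pt_icc[:, None])
      print("component means back on the ICC scale:", norm.cdf(gm.means_.ravel()))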

  10. Phylogenetic tree reconstruction accuracy and model fit when proportions of variable sites change across the tree.

    Science.gov (United States)

    Shavit Grievink, Liat; Penny, David; Hendy, Michael D; Holland, Barbara R

    2010-05-01

    Commonly used phylogenetic models assume a homogeneous process through time in all parts of the tree. However, it is known that these models can be too simplistic as they do not account for nonhomogeneous lineage-specific properties. In particular, it is now widely recognized that as constraints on sequences evolve, the proportion and positions of variable sites can vary between lineages causing heterotachy. The extent to which this model misspecification affects tree reconstruction is still unknown. Here, we evaluate the effect of changes in the proportions and positions of variable sites on model fit and tree estimation. We consider 5 current models of nucleotide sequence evolution in a Bayesian Markov chain Monte Carlo framework as well as maximum parsimony (MP). We show that for a tree with 4 lineages where 2 nonsister taxa undergo a change in the proportion of variable sites tree reconstruction under the best-fitting model, which is chosen using a relative test, often results in the wrong tree. In this case, we found that an absolute test of model fit is a better predictor of tree estimation accuracy. We also found further evidence that MP is not immune to heterotachy. In addition, we show that increased sampling of taxa that have undergone a change in proportion and positions of variable sites is critical for accurate tree reconstruction.

  11. Testing Departure from Additivity in Tukey’s Model using Shrinkage: Application to a Longitudinal Setting

    Science.gov (United States)

    Ko, Yi-An; Mukherjee, Bhramar; Smith, Jennifer A.; Park, Sung Kyun; Kardia, Sharon L.R.; Allison, Matthew A.; Vokonas, Pantel S.; Chen, Jinbo; Diez-Roux, Ana V.

    2014-01-01

    While there has been extensive research developing gene-environment interaction (GEI) methods in case-control studies, little attention has been given to sparse and efficient modeling of GEI in longitudinal studies. In a two-way table for GEI with rows and columns as categorical variables, a conventional saturated interaction model involves estimation of a specific parameter for each cell, with constraints ensuring identifiability. The estimates are unbiased but are potentially inefficient because the number of parameters to be estimated can grow quickly with increasing categories of row/column factors. On the other hand, Tukey’s one degree of freedom (df) model for non-additivity treats the interaction term as a scaled product of row and column main effects. Due to the parsimonious form of interaction, the interaction estimate leads to enhanced efficiency and the corresponding test could lead to increased power. Unfortunately, Tukey’s model gives biased estimates and low power if the model is misspecified. When screening multiple GEIs where each genetic and environmental marker may exhibit a distinct interaction pattern, a robust estimator for interaction is important for GEI detection. We propose a shrinkage estimator for interaction effects that combines estimates from both Tukey’s and saturated interaction models and use the corresponding Wald test for testing interaction in a longitudinal setting. The proposed estimator is robust to misspecification of interaction structure. We illustrate the proposed methods using two longitudinal studies — the Normative Aging Study and the Multi-Ethnic Study of Atherosclerosis. PMID:25112650
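    Tukey's one-df estimator is easy to illustrate in Python on a balanced two-way layout: fit the additive model via cell, row and column means, then regress the additive-model residuals on the product of the estimated main effects to obtain the single interaction parameter. (The paper's shrinkage estimator combines this with the saturated-model estimate; this sketch shows only the Tukey component, on simulated data.)

      import numpy as np

      rng = np.random.default_rng(0)
      I, J, n = 4, 5, 20
      a = rng.normal(0, 1, I); a -= a.mean()        # sum-to-zero row effects
      b = rng.normal(0, 1, J); b -= b.mean()        # sum-to-zero column effects
      lam, mu = 0.5, 10.0
      cell = mu + a[:, None] + b[None, :] + lam * np.outer(a, b)
      y = cell[..., None] + rng.normal(0, 1, (I, J, n))

      ybar = y.mean(axis=2)                         # cell means
      mu_hat = ybar.mean()
      a_hat = ybar.mean(axis=1) - mu_hat            # estimated row effects
      b_hat = ybar.mean(axis=0) - mu_hat            # estimated column effects
      resid = ybar - mu_hat - a_hat[:, None] - b_hat[None, :]
      prod = np.outer(a_hat, b_hat)                 # Tukey score
      lam_hat = (prod * resid).sum() / (prod ** 2).sum()
      print(f"lambda-hat = {lam_hat:.2f} (true 0.5)")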

  12. A modified version of the Molly rumen model to quantify methane emissions from sheep.

    Science.gov (United States)

    Vetharaniam, I; Vibart, R E; Hanigan, M D; Janssen, P H; Tavendale, M H; Pacheco, D

    2015-07-01

    We modified the rumen submodel of the Molly dairy cow model to simulate the rumen of a sheep and predict its methane emissions. We introduced a rumen hydrogen (H2) pool as a dynamic variable, which (together with the microbial pool in Molly) was used to predict methane production, to facilitate future consideration of thermodynamic control of methanogenesis. The new model corrected a misspecification of the equation of microbial H2 utilization in Molly95, which could potentially give rise to unrealistic predictions under conditions of low intake rates. The new model included a function to correct biases in the estimation of net H2 production based on the default stoichiometric relationships in Molly95, with this function specified in terms of level of intake. Model parameters for H2 and methane production were fitted to experimental data that included fresh temperate forages offered to sheep at a wide range of intake levels and then tested against independent data. The new model provided reasonable estimates relative to the calibration data set, but a different parameterization was needed to improve its predicted ability relative to the validation data set. Our results indicate that, although feedback inhibition on H2 production and methanogen activity increased with feeding level, other feedback effects that vary with diet composition need to be considered in future work on modeling rumen digestion in Molly.

  13. Semi-supervised anomaly detection - towards model-independent searches of new physics

    International Nuclear Information System (INIS)

    Kuusela, Mikael; Vatanen, Tommi; Malmi, Eric; Aaltonen, Timo; Raiko, Tapani; Nagai, Yoshikazu

    2012-01-01

    Most classification algorithms used in high energy physics fall under the category of supervised machine learning. Such methods require a training set containing both signal and background events and are prone to classification errors should this training data be systematically inaccurate, for example due to the assumed MC model. To complement such model-dependent searches, we propose an algorithm based on semi-supervised anomaly detection techniques, which does not require a MC training sample for the signal data. We first model the background using a multivariate Gaussian mixture model. We then search for deviations from this model by fitting to the observations a mixture of the background model and a number of additional Gaussians. This allows us to perform pattern recognition of any anomalous excess over the background. We show by a comparison to neural network classifiers that such an approach is considerably more robust against misspecification of the signal MC than supervised classification. In cases where there is an unexpected signal, a neural network might fail to correctly identify it, while anomaly detection does not suffer from such a limitation. On the other hand, when there are no systematic errors in the training data, both methods perform comparably.
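    A stripped-down Python sketch of the background-model half of the idea: fit a Gaussian mixture to background-only data, then flag observations to which the background model assigns low likelihood. (The paper goes further and refits a mixture of the background model plus extra Gaussians; simple thresholding is used here for brevity, on synthetic data.)

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      background = rng.normal(0, 1, (5000, 4))             # MC background sample
      signal = rng.normal(3, 0.3, (50, 4))                 # unexpected excess
      data = np.vstack([background[:4000], signal])        # "observed" events

      bg_model = GaussianMixture(n_components=5, random_state=0).fit(background)
      logp = bg_model.score_samples(data)
      cut = np.percentile(bg_model.score_samples(background), 1)
      print("events flagged as anomalous:", int((logp < cut).sum()))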

  14. A matlab framework for estimation of NLME models using stochastic differential equations: applications for estimation of insulin secretion rates.

    Science.gov (United States)

    Mortensen, Stig B; Klim, Søren; Dammann, Bernd; Kristensen, Niels R; Madsen, Henrik; Overgaard, Rune V

    2007-10-01

    The non-linear mixed-effects model based on stochastic differential equations (SDEs) provides an attractive residual error model that is able to handle serially correlated residuals typically arising from structural mis-specification of the true underlying model. The use of SDEs also opens up new tools for model development and easily allows for tracking of unknown inputs and parameters over time. An algorithm for maximum likelihood estimation of the model has earlier been proposed, and the present paper presents the first general implementation of this algorithm. The implementation is done in Matlab and also demonstrates the use of parallel computing for improved estimation times. The use of the implementation is illustrated by two examples of application which focus on the ability of the model to estimate unknown inputs facilitated by the extension to SDEs. The first application is a deconvolution-type estimation of the insulin secretion rate based on a linear two-compartment model for C-peptide measurements. In the second application the model is extended to also give an estimate of the time-varying liver extraction based on both C-peptide and insulin measurements.

  15. Bayesian Sensitivity Analysis of a Nonlinear Dynamic Factor Analysis Model with Nonparametric Prior and Possible Nonignorable Missingness.

    Science.gov (United States)

    Tang, Niansheng; Chow, Sy-Miin; Ibrahim, Joseph G; Zhu, Hongtu

    2017-12-01

    Many psychological concepts are unobserved and usually represented as latent factors apprehended through multiple observed indicators. When multiple-subject multivariate time series data are available, dynamic factor analysis models with random effects offer one way of modeling patterns of within- and between-person variations by combining factor analysis and time series analysis at the factor level. Using the Dirichlet process (DP) as a nonparametric prior for individual-specific time series parameters further allows the distributional forms of these parameters to deviate from commonly imposed (e.g., normal or other symmetric) functional forms, arising as a result of these parameters' restricted ranges. Given the complexity of such models, a thorough sensitivity analysis is critical but computationally prohibitive. We propose a Bayesian local influence method that allows for simultaneous sensitivity analysis of multiple modeling components within a single fitting of the model of choice. Five illustrations and an empirical example are provided to demonstrate the utility of the proposed approach in facilitating the detection of outlying cases and common sources of misspecification in dynamic factor analysis models, as well as identification of modeling components that are sensitive to changes in the DP prior specification.

  16. Approaches for modeling within subject variability in pharmacometric count data analysis: dynamic inter-occasion variability and stochastic differential equations.

    Science.gov (United States)

    Deng, Chenhui; Plan, Elodie L; Karlsson, Mats O

    2016-06-01

    Parameter variation in pharmacometric analysis studies can be characterized as within subject parameter variability (WSV) in pharmacometric models. WSV has previously been successfully modeled using inter-occasion variability (IOV), but also stochastic differential equations (SDEs). In this study, two approaches, dynamic inter-occasion variability (dIOV) and adapted stochastic differential equations, were proposed to investigate WSV in pharmacometric count data analysis. These approaches were applied to published count models for seizure counts and Likert pain scores. Both approaches improved the model fits significantly. In addition, stochastic simulation and estimation were used to explore further the capability of the two approaches to diagnose and improve models where existing WSV is not recognized. The results of simulations confirmed the gain in introducing WSV as dIOV and SDEs when parameters vary randomly over time. Further, the approaches were also informative as diagnostics of model misspecification, when parameters changed systematically over time but this was not recognized in the structural model. The proposed approaches in this study offer strategies to characterize WSV and are not restricted to count data.

  17. Cost Modeling for Space Telescope

    Science.gov (United States)

    Stahl, H. Philip

    2011-01-01

    Parametric cost models are an important tool for planning missions, comparing concepts and justifying technology investments. This paper presents on-going efforts to develop single-variable and multi-variable cost models for the space telescope optical telescope assembly (OTA). These models are based on data collected from historical space telescope missions. Standard statistical methods are used to derive CERs for OTA cost versus aperture diameter and mass. The results are compared with previously published models.
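    A single-variable CER of the usual power-law form, cost = a * D^b, reduces to ordinary least squares on the log-log scale. The data points in this Python sketch are invented for illustration, not the paper's mission data.

      import numpy as np

      diameter = np.array([0.3, 0.85, 1.0, 2.4, 3.5])   # aperture in meters (assumed)
      cost = np.array([20, 150, 220, 1500, 4000])       # $M (assumed)

      X = np.column_stack([np.ones_like(diameter), np.log(diameter)])
      (log_a, b), *_ = np.linalg.lstsq(X, np.log(cost), rcond=None)
      print(f"CER: cost ~ {np.exp(log_a):.0f} * D^{b:.2f}  ($M, D in meters)")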

  18. Evaluation of (241)Am deposited in different parts of the leg bones and skeleton to justify in vivo measurements of the knee for estimating total skeletal activity.

    Science.gov (United States)

    Khalaf, Majid; Brey, Richard R; Derryberry, DeWayne

    2013-01-01

    The percentage of Am deposited in different parts of leg bones relative to the total leg activity was calculated from radiochemical analysis results from six whole body donors participating in the U.S. Transuranium and Uranium Registries (USTUR). In five of these six USTUR cases, the percentage of Am deposited in the knee region as well as in the entire leg was separately calculated relative to total skeletal activity. The purpose of this study is to find a region in the leg that is both suitable for in vivo measurement of Am deposited in the bones and has a good correlation with the total skeletal Am burden. In all analyzed cases, the femur was the bone with the highest percentage of Am deposited in the leg (48.8%). In the five cases that have complete whole skeletal analysis, the percentage of Am activity in the knee relative to entire skeletal activity was 4.8%, and the average value of its coefficient of variation was 10.6%. The percentage of Am in the leg relative to total skeletal activity was 20% with an average coefficient of variation of 13.63%. The Am activity in the knee as well as in the leg was strongly correlated (R = 99.5% and R = 99.1%, respectively) with the amount of Am activity in the entire skeleton using a simple linear relationship. The highest correlation was found between the amount of Am deposited in the knee and the amount of Am deposited in the entire skeleton. This correlation is important because it might enable an accurate assessment of the total skeletal Am burden to be performed from in vivo monitoring of the knee region. In all analyzed cases, an excellent correlation (R = 99.9%) was found between the amount of Am activity in the knee and the amount of Am activity in the entire leg. The results of this study suggest three simple models: two models to predict the total skeletal activity based on either leg or knee activity, and the third model to predict the total leg activity based on knee activity. The results also suggest that the

  19. Relationship between Remittance and Economic Growth in Bangladesh: an Autoregressive Distributed Lag Model (ARDL

    Directory of Open Access Journals (Sweden)

    Shapan Chandra Majumder

    2016-03-01

    This study examines the long-run impact of remittances on economic growth in Bangladesh. Bangladesh, being one of the top remittance-recipient countries in the world, has drawn attention to the remittance-output relationship in recent years. In 2014, remittances contributed 8.2% of the GDP of Bangladesh, while the contribution was 6.7% in 2006. The main objective of this study is to investigate the impact of remittances on economic growth (GDP). We adopted Autoregressive Distributed Lag (ARDL) models, or dynamic linear regressions, which are widely used to examine the relationship between remittances and economic growth in the country. In testing for the unit root properties of the time series data, all variables are found stationary at first difference under the ADF and PP stationarity tests. The study made use of diagnostic tests, such as the residual normality test and the heteroskedasticity and serial autocorrelation tests for misspecification, in order to validate the parameter estimates obtained by the estimated model. The stability of the model is also checked by a CUSUM test. The ARDL model shows that there exists a statistically significant long-run positive relationship between remittances and the growth of gross domestic product in Bangladesh.
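    The workflow maps directly onto statsmodels in Python: ADF unit-root tests on the differenced series, then an ARDL fit. The two simulated series below merely stand in for the study's GDP and remittance data.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.stattools import adfuller
      from statsmodels.tsa.ardl import ARDL

      rng = np.random.default_rng(0)
      n = 40
      remit = pd.Series(np.cumsum(rng.normal(0.5, 1, n)), name="remit")   # I(1) proxy
      gdp = pd.Series(2 + 0.3 * remit + rng.normal(0, 0.5, n), name="gdp")

      for s in (gdp, remit):
          stat, pval, *_ = adfuller(s.diff().dropna())
          print(f"ADF on d({s.name}): p = {pval:.3f}")   # stationary at first difference

      res = ARDL(gdp, lags=1, exog=remit.to_frame(), order=1).fit()
      print(res.params)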

  20. Identification and estimation of nonlinear models using two samples with nonclassical measurement errors

    KAUST Repository

    Carroll, Raymond J.

    2010-05-01

    This paper considers identification and estimation of a general nonlinear Errors-in-Variables (EIV) model using two samples. Both samples consist of a dependent variable, some error-free covariates, and an error-prone covariate, for which the measurement error has unknown distribution and could be arbitrarily correlated with the latent true values; and neither sample contains an accurate measurement of the corresponding true variable. We assume that the regression model of interest - the conditional distribution of the dependent variable given the latent true covariate and the error-free covariates - is the same in both samples, but the distributions of the latent true covariates vary with observed error-free discrete covariates. We first show that the general latent nonlinear model is nonparametrically identified using the two samples when both could have nonclassical errors, without either instrumental variables or independence between the two samples. When the two samples are independent and the nonlinear regression model is parameterized, we propose sieve Quasi Maximum Likelihood Estimation (Q-MLE) for the parameter of interest, and establish its root-n consistency and asymptotic normality under possible misspecification, and its semiparametric efficiency under correct specification, with easily estimated standard errors. A Monte Carlo simulation and a data application are presented to show the power of the approach.

  1. Quantile regression for censored mixed-effects models with applications to HIV studies.

    Science.gov (United States)

    Lachos, Victor H; Chen, Ming-Hui; Abanto-Valle, Carlos A; Azevedo, Caio L N

    HIV RNA viral load measures are often subject to upper and lower detection limits depending on the quantification assay, so the responses are either left or right censored. Linear/nonlinear mixed-effects models, with slight modifications to accommodate censoring, are routinely used to analyze this type of data. Usually, the inference procedures are based on normality (or elliptical distribution) assumptions for the random terms. However, those analyses might not provide robust inference when the distributional assumptions are questionable. In this paper, we discuss fully Bayesian quantile regression inference using Markov chain Monte Carlo (MCMC) methods for longitudinal data models with random effects and censored responses. Compared to the conventional mean regression approach, quantile regression can characterize the entire conditional distribution of the outcome variable, and is more robust to outliers and misspecification of the error distribution. Under the assumption that the error term follows an asymmetric Laplace distribution, we develop a hierarchical Bayesian model and obtain the posterior distribution of the unknown parameters at the pth quantile level, with median regression (p = 0.5) as a special case. The proposed procedures are illustrated with two HIV/AIDS studies on viral loads that were initially analyzed using typical normal (censored) mean regression mixed-effects models, as well as a simulation study.
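
    The asymmetric Laplace connection can be made concrete in a few lines: maximizing its likelihood is equivalent to minimizing the quantile check loss. The toy fit below ignores the random effects and censoring of the full hierarchical model and only illustrates that core loss.

```python
# The asymmetric Laplace link to quantile regression, in miniature:
# maximizing its likelihood == minimizing the check loss
# rho_p(u) = u * (p - 1{u < 0}). Simulated data, no random effects.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 1.0 + 2.0 * x + rng.standard_t(df=3, size=200)  # heavy-tailed errors
X = np.column_stack([np.ones_like(x), x])

def check_loss(beta, p):
    u = y - X @ beta
    return np.sum(u * (p - (u < 0)))

for p in (0.25, 0.5, 0.75):  # p = 0.5 recovers median regression
    fit = minimize(check_loss, x0=np.zeros(2), args=(p,), method="Nelder-Mead")
    print(f"p={p}: intercept={fit.x[0]:.2f}, slope={fit.x[1]:.2f}")
```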

  2. Modeling gene expression measurement error: a quasi-likelihood approach

    Directory of Open Access Journals (Sweden)

    Strimmer Korbinian

    2003-03-01

    Full Text Available Abstract Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both strategies may be suboptimal for gene expression data, as the non-parametric approach ignores known structural information, whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale and on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyzing gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also
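
    A small numerical sketch of the quasi-likelihood idea under an assumed quadratic variance structure Var(y) = s² + (c·μ)², a common two-component form for expression intensities; the parameter values are invented.

```python
# Quasi-likelihood under an assumed quadratic variance structure:
# Q(mu; y) = integral_{y}^{mu} (y - t) / V(t) dt, which behaves like a
# log-likelihood: Q = 0 and maximal at mu = y. Evaluated numerically.
import numpy as np
from scipy.integrate import quad

def V(t, s2=0.05, c=0.1):
    """Postulated variance function Var(y) = s2 + (c*t)**2 (made-up values)."""
    return s2 + (c * t) ** 2

def quasi_loglik(y, mu):
    val, _ = quad(lambda t: (y - t) / V(t), y, mu)
    return val

y_obs = 3.2
for mu in (2.0, 3.0, 3.2, 4.0):
    print(f"mu={mu}: Q={quasi_loglik(y_obs, mu):+.3f}")  # maximized at mu = y_obs
```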

  3. Modeling units of study from a pedagogical perspective: the pedagogical meta-model behind EML

    NARCIS (Netherlands)

    Koper, Rob

    2003-01-01

    This text is a short summary of the work on pedagogical analysis carried out when EML (Educational Modelling Language) was being developed. Because we address pedagogical meta-models, the consequence is that I must justify the underlying pedagogical models it describes. I have included a (far from

  4. Ecologically justified regulatory provisions for riverine hydroelectric power plants and minimum instream flow requirements in diverted streams; Oekologisch begruendete, dynamische Mindestwasserregelungen bei Ausleitungskraftwerken

    Energy Technology Data Exchange (ETDEWEB)

    Jorde, K.

    1997-12-31

    The study was intended to develop a model versatile enough to permit quantification of various water demand scenarios in connection with the operation of riverine hydroelectric power plants. Specific emphasis was placed on defining the minimum instream flow to be maintained in river segments, given its elementary significance to flowing-water biocoenoses. Based on fictitious minimum water requirements, various scenarios were simulated for flow regimes depending on power plant operation, so as to establish a system for comparative analysis and evaluation of the resulting economic effects on power plant efficiency on the one hand, and the ecological effects on the aquatic habitat on the other. The information derived was to serve as a basis for decision-making for regulatory purposes. For this study, the temporal and spatial variability of the flow regime at the river bed in a river segment was examined for the first time. Based on this information, complemented by information obtained from habitat simulations, a method was derived for determining ecological requirements and incorporating them into regulatory water management provisions. The field measurements were carried out with the FST hemisphere, a proven, efficient and reliable method of assessing flow regimes at river beds. Evaluation of the measured instream flow data characterising three morphologically different segments of diverted rivers was done with the CASIMIR computer code. The ASS models derived were used for comparative assessment of existing regulatory provisions and recommended amendments determining the required minimum instream flow in diverted rivers. The requirements were defined on the basis of data obtained for three different years. (orig./CB) [German original:] The aim of the work was to develop a model procedure flexible enough to quantify the differing demands placed on run-of-river hydroelectric plants. In particular, the preservation of a certain flow dynamic, which for

  5. Justifying Nonstandard Exception Requests for Pediatric Liver Transplant Candidates: An Analysis of Narratives Submitted to the United Network for Organ Sharing, 2009-2014.

    Science.gov (United States)

    Perito, E R; Braun, H J; Dodge, J L; Rhee, S; Roberts, J P

    2017-08-01

    Nonstandard exception requests (NSERs), for which transplant centers provide patient-specific narratives to support a higher Model for End-stage Liver Disease/Pediatric End-stage Liver Disease score, are made for >30% of pediatric liver transplant candidates. We describe the justifications used in pediatric NSER narratives from 2009 to 2014 and identify justifications associated with NSER denial, waitlist mortality, and transplant. Using United Network for Organ Sharing data, 1272 NSER narratives from 1138 children with NSERs were coded for analysis. The most common NSER justifications were failure-to-thrive (48%) and risk of death (40%); both were associated with approval. Varices, involvement of another organ, impaired quality of life, and encephalopathy were justifications used more often in denied NSERs. Of the 25 most prevalent justifications, 60% were not associated with approval or denial. Waitlist mortality risk was increased when fluid overload or "posttransplant complication outside standard criteria" were cited, and decreased when liver-related infection was noted. Transplant probability was increased when the narrative mentioned liver-related infections and fluid overload for children. © 2017 The American Society of Transplantation and the American Society of Transplant Surgeons.

  6. Type I error and the power of the s-test: old lessons from a new, analytically justified statistical test for phylogenies.

    Science.gov (United States)

    Antezana, M A; Hudson, R R

    1999-06-01

    We present a new procedure for assessing the statistical significance of the most likely unrooted dichotomous topology inferable from four DNA sequences. The procedure directly calculates a P-value for the support given to this topology by the informative sites congruent with it, assuming the most likely star topology as the null hypothesis. Informative sites are crucial in the determination of the maximum likelihood dichotomous topology and are therefore an obvious target for a statistical test of phylogenies. Our P-value is the probability of producing through parallel substitutions on the branches of the star topology at least as much support as that given to the maximum likelihood dichotomous topology by the aforementioned informative sites, for any of the three possible dichotomous topologies. The degree of statistical significance is simply the complement of this P-value. Ours is therefore an a posteriori testing approach, in which no dichotomous topology is specified in advance. We implement the test for the case in which all sites behave identically and the substitution model has a single parameter. Under these conditions, the P-value can be easily calculated on the basis of the probabilities of change on the branches of the most likely star topology, because under these assumptions, each site can become informative independently of every other site; accordingly, the total number of informative sites of each kind is binomially distributed. We explore the test's type I error by applying it to data produced in star topologies having all branches equally long, or having two short and two long branches, and various degrees of homoplasy. The test is conservative but we demonstrate, by means of a discreteness correction and progressively assumption-free calculations of the P-values, that (1) the conservativeness is mostly due to the discrete nature of informative sites and (2) the P-values calculated empirically are moreover mostly quite accurate in absolute
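
    A simplified numerical sketch of the binomial logic described above: if, under the star null, each site independently becomes informative for a specific topology with probability q, the support count is Binomial(n, q). The value of q and the union bound over the three topologies are illustrative simplifications, not the paper's exact calculation.

```python
# Simplified sketch: count of sites supporting one topology ~ Binomial(n, q)
# under the star null; "any of three topologies" handled by a crude bound.
from scipy.stats import binom

n_sites = 500      # alignment length (hypothetical)
q = 0.01           # P(site supports one given topology via parallel change) - assumed
k_observed = 12    # informative sites congruent with the ML topology

p_one = binom.sf(k_observed - 1, n_sites, q)   # P(X >= k) for one topology
p_any = min(1.0, 3 * p_one)                    # union bound over 3 topologies
print(f"P(one topology) = {p_one:.4g}; P(any of three) <= {p_any:.4g}")
```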

  7. A place-based model of local activity spaces: individual place exposure and characteristics

    Science.gov (United States)

    Hasanzadeh, Kamyar; Laatikainen, Tiina; Kyttä, Marketta

    2018-01-01

    Researchers have long hypothesized relationships between mobility, urban context, and health. Despite ample discussion, the empirical findings corroborating such associations remain marginal in the literature. It is increasingly believed that the weakness of the observed associations can be largely explained by the common misspecification of the geographical context. Researchers from different fields have developed a wide range of methods for estimating the extents of these geographical contexts. In this article, we argue that no single approach has yet sufficiently captured the complexity of human mobility patterns. Subsequently, we discuss that a better understanding of individual activity spaces can be reached through a spatially sensitive estimation of place exposure. Following this discussion, we take an integrative person- and place-based approach to create an individualized residential exposure model (IREM) to estimate the local activity spaces (LAS) of individuals. This model is created using data collected through public participation GIS. Following a brief comparison of IREM with other commonly used LAS models, the article presents an empirical study of aging citizens in the Helsinki area to demonstrate the usability of the proposed framework. In this study, we identify the main dimensions of LASs and examine their associations with the socio-demographic characteristics of individuals and their location in the region. The promising results from the comparisons and the findings from the empirical part suggest both a methodological and a conceptual improvement in capturing the complexity of local activity spaces.

  8. Parametric modeling and optimal experimental designs for estimating isobolograms for drug interactions in toxicology.

    Science.gov (United States)

    Holland-Letz, Tim; Gunkel, Nikolas; Amtmann, Eberhard; Kopp-Schneider, Annette

    2017-11-27

    In toxicology and related areas, interaction effects between two substances are commonly expressed through a combination index (CI) evaluated separately at different effect levels and mixture ratios. Often, these indices are combined into a graphical representation, the isobologram. Instead of estimating the combination indices at the experimental mixture ratios only, we propose a simple parametric model for estimating the underlying interaction function. We integrate this approach into a joint model in which both the parameters of the dose-response functions of the individual substances and the interaction parameters can be estimated simultaneously. As an additional benefit, this concept makes it possible to determine optimal statistical designs for combination studies that optimize the estimation of the interaction function as a whole. From an optimal design perspective, estimating the interaction parameters generally corresponds to a [Formula: see text]-optimality resp. [Formula: see text]-optimality design problem, while estimation of all underlying dose-response parameters corresponds to a D-optimality design problem. We show how optimal designs can be obtained in either case, as well as how combination designs providing reasonable performance with regard to both criteria can be determined by putting a constraint on the efficiency with regard to one of the criteria and optimizing for the other. As all designs require prior information about model parameter values, which may be unreliable in practice, the effect of misspecification is investigated as well.
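
    For concreteness, a sketch of the Loewe combination index that isobolograms are built from, CI = d1/D1 + d2/D2, using invented Hill dose-response parameters; the mixture doses are assumed, not derived from a fitted joint model.

```python
# Loewe combination index: CI = d1/D1 + d2/D2, where (d1, d2) is a mixture
# producing a given effect and D1, D2 are single-agent doses producing the
# same effect. Hill parameters below are invented for illustration.
import numpy as np

def inverse_hill(effect, ec50, slope):
    """Dose giving `effect` in (0,1) under E = d**s / (d**s + ec50**s)."""
    return ec50 * (effect / (1.0 - effect)) ** (1.0 / slope)

effect = 0.5
D1 = inverse_hill(effect, ec50=1.0, slope=1.2)   # substance A alone
D2 = inverse_hill(effect, ec50=4.0, slope=0.8)   # substance B alone
d1, d2 = 0.4, 1.5   # mixture doses assumed to give the same effect
ci = d1 / D1 + d2 / D2
print(f"CI = {ci:.2f}  (<1 synergy, =1 additivity, >1 antagonism)")
```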

  9. Variable Selection with Prior Information for Generalized Linear Models via the Prior LASSO Method

    Science.gov (United States)

    Jiang, Yuan; He, Yunxiao

    2015-01-01

    LASSO is a popular statistical tool often used in conjunction with generalized linear models that can simultaneously select variables and estimate parameters. When there are many variables of interest, as in current biological and biomedical studies, the power of LASSO can be limited. Fortunately, a large amount of biological and biomedical data has already been collected, and it may contain useful information about the importance of certain variables. This paper proposes an extension of LASSO, namely prior LASSO (pLASSO), to incorporate that prior information into penalized generalized linear models. The goal is achieved by adding to the LASSO criterion function an additional measure of the discrepancy between the prior information and the model. For linear regression, the whole solution path of the pLASSO estimator can be found with a procedure similar to Least Angle Regression (LARS). Asymptotic theory and simulation results show that pLASSO provides significant improvement over LASSO when the prior information is relatively accurate. When the prior information is less reliable, pLASSO shows great robustness to the misspecification. We illustrate the application of pLASSO using a real data set from a genome-wide association study. PMID:27217599
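
    A hedged sketch of the pLASSO idea: if the discrepancy with the prior is taken to be quadratic (an assumed form, chosen here because it folds into an ordinary LASSO on augmented data, and simpler than the paper's general criterion), the estimator can be computed with standard tools.

```python
# Sketch: LASSO criterion plus an assumed quadratic prior-discrepancy term
# eta * ||beta - beta_prior||^2, solved as a LASSO on augmented data.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.5] + [0.0] * (p - 2))
y = X @ beta_true + rng.normal(size=n)

beta_prior = np.array([1.8, -1.0] + [0.0] * (p - 2))  # prior guess (assumed accurate)
eta, lam = 1.0, 0.1

# Augmentation: ||y - X b||^2 + eta * ||b_prior - b||^2 as one least-squares term
X_aug = np.vstack([X, np.sqrt(eta) * np.eye(p)])
y_aug = np.concatenate([y, np.sqrt(eta) * beta_prior])

fit = Lasso(alpha=lam, fit_intercept=False).fit(X_aug, y_aug)
print(np.round(fit.coef_, 2))
```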

  10. More Precise Estimation of Lower-Level Interaction Effects in Multilevel Models.

    Science.gov (United States)

    Loeys, Tom; Josephy, Haeike; Dewitte, Marieke

    2018-03-20

    In hierarchical data, the effect of a lower-level predictor on a lower-level outcome may often be confounded by an (un)measured upper-level factor. When such confounding is left unaddressed, the effect of the lower-level predictor is estimated with bias. Separating this effect into a within- and a between-component removes such bias in a linear random intercept model under a specific set of assumptions for the confounder. When the effect of the lower-level predictor is additionally moderated by another lower-level predictor, an interaction between the two lower-level predictors is included in the model. To address unmeasured upper-level confounding, this interaction term ought to be decomposed into a within- and a between-component as well. This can be achieved either by first multiplying the two predictors and then centering the product term, or by first centering the predictors and then multiplying them. We show that while both approaches, on average, yield the same estimates of the interaction effect in linear models, the former decomposition is much more precise and robust against misspecification of the effects of cross-level and upper-level terms, compared to the latter.
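
    The two decompositions can be written down directly. The sketch below contrasts (a) multiply-then-center with (b) center-then-multiply on simulated clustered data; the resulting within-cluster interaction terms are related but not identical, which is the source of the difference studied above.

```python
# Contrast of the two decompositions of a lower-level interaction x*z.
# Data are simulated placeholders (30 clusters of 10 observations).
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "cluster": np.repeat(np.arange(30), 10),
    "x": rng.normal(size=300),
    "z": rng.normal(size=300),
})
df["xz"] = df["x"] * df["z"]
g = df.groupby("cluster")

# (a) multiply first, then split the product into between/within parts
df["xz_between"] = g["xz"].transform("mean")
df["xz_within"] = df["xz"] - df["xz_between"]

# (b) center each predictor within clusters first, then multiply
df["x_w"] = df["x"] - g["x"].transform("mean")
df["z_w"] = df["z"] - g["z"].transform("mean")
df["xw_zw"] = df["x_w"] * df["z_w"]

print(df[["xz_within", "xw_zw"]].corr())  # correlated, but not the same term
```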

  11. Toward a Theory of Sexual Aggression: A Quadripartite Model.

    Science.gov (United States)

    Hall, Gordon C. Nagayama; Hirschman, Richard

    1991-01-01

    Addresses need for unified theoretical model of sexually aggressive behavior against adult females by integrating elements of existing models into quadripartite model which accounts for heterogeneity of sexual aggressors by prominence of potential etiological factors. Model components (physiological sexual arousal, cognitions justifying sexual…

  12. Proceedings of the Workshop on Justifying the Suitability of Nuclear Licensee Organisational Structure, Resources and Competencies - Methods, Approaches and Good Practices

    International Nuclear Information System (INIS)

    2009-01-01

    The nuclear industry is currently facing a range of organisational challenges. The nuclear renaissance is resulting in renewed interest in new reactor build programmes; existing plants are being modernised; ageing plants and an ageing workforce are being replaced. The industry is developing new models of working in a competitive and increasingly global market that has seen increased use of contractors and organisational change taking place at an unparalleled rate. It is clear that the way in which nuclear licensees' organisations are structured and resourced has a potential impact on nuclear safety. For example, nuclear safety may be challenged if organisational structures create uncertainty concerning authority and responsibilities or if nuclear safety functions are not adequately resourced. Inasmuch as this is so, it is reasonable to expect both licensees and regulatory bodies to seek assurance that licensee organisations are suitable to manage nuclear safety and discharge the responsibilities associated with operating as a nuclear licensee. Although licensees should have the authority to organise their plant activities in different ways, they should also be able to demonstrate that they understand the potential impact that these activities may have on plant safety. They should be able to show how their organisations are designed to carry out these activities safely and effectively, and to verify that the nuclear safety functions are being delivered as expected. There is a growing interest from some nuclear regulatory bodies, as well as licensees, in methods and approaches that can be used to ensure that the licensee organisations are well structured and have sufficient resources and competencies to manage safety. To address these and other nuclear plant organisational safety-related issues a NEA/CSNI workshop was held in Uppsala (Sweden) hosted by the Swedish Radiation Safety Authority with support from the European Union's Joint Research Centre

  13. A variational data assimilation system for soil–atmosphere flux estimates for the Community Land Model (CLM3.5)

    Directory of Open Access Journals (Sweden)

    C. M. Hoppe

    2014-05-01

    Full Text Available This paper presents the development and implementation of a spatio-temporal variational data assimilation system (4D-var) for the soil–vegetation–atmosphere transfer model "Community Land Model" (CLM3.5), along with the development of the adjoint code for the core soil–atmosphere transfer scheme of energy and soil moisture. The purpose of this work is to obtain an improved estimation technique for the energy fluxes (sensible and latent heat fluxes) between the soil and the atmosphere. Optimal assessments of these fluxes are available from neither model simulations nor measurements alone, while 4D-var data assimilation has the potential to combine both information sources into a Best Linear Unbiased Estimate (BLUE). The 4D-var method requires the adjoint model of the CLM, which is established in this work. The new data assimilation algorithm is able to assimilate soil temperature and soil moisture measurements for one-dimensional columns of the model grid. Numerical experiments were first used to test the algorithm under idealised conditions. It was found that the analysis delivers improved results whenever there is a dependence between the initial values and the assimilated quantity. Furthermore, soil temperature and soil moisture from in situ field measurements were assimilated. These calculations demonstrate the improved performance of the flux estimates whenever soil property parameters of sufficient quality are available. Misspecifications could also be identified from the performance of the variational scheme.
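
    A minimal 4D-var-flavoured sketch of the cost function being minimized, with a toy two-state "soil" model standing in for the CLM and its adjoint; all matrices and numbers are invented.

```python
# Toy 4D-var: find an initial state x0 minimizing a background term plus
# misfits to observations propagated through a linear stand-in model.
import numpy as np
from scipy.optimize import minimize

M = np.array([[0.9, 0.1], [0.0, 0.95]])   # toy forward model (one time step)
x_b = np.array([280.0, 0.30])             # background: soil temperature, moisture
B_inv = np.diag([1 / 4.0, 1 / 0.01])      # inverse background-error covariance
R_inv = np.diag([1 / 1.0, 1 / 0.004])     # inverse observation-error covariance
obs = [np.array([281.2, 0.28]), np.array([280.9, 0.27])]  # two observation times

def cost(x0):
    j = 0.5 * (x0 - x_b) @ B_inv @ (x0 - x_b)
    x = x0.copy()
    for y in obs:
        x = M @ x                          # propagate to the next observation time
        j += 0.5 * (x - y) @ R_inv @ (x - y)
    return j

x_a = minimize(cost, x_b, method="BFGS").x  # gradient by finite differences here;
print(x_a)                                  # a real 4D-var uses the adjoint model
```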

  14. Modeling of hydrogen interactions with beryllium

    Energy Technology Data Exchange (ETDEWEB)

    Longhurst, G.R. [Lockheed Martin Idaho Technologies Co., Idaho Falls, ID (United States)

    1998-01-01

    In this paper, improved mathematical models are developed for hydrogen interactions with beryllium. These include the saturation effect observed for high-flux implantation of ions from plasmas and the retention of tritium produced by neutronic transmutations in beryllium. Use of the models developed is justified by showing how they replicate experimental data using the TMAP4 tritium transport code. (author)

  15. Justified requirements in private transportation and a recommendation for improving the efficiency of household energy utilisation through the use of small ecologically-friendly or 'ultralight' vehicles for mass private transportation in the 21st century

    International Nuclear Information System (INIS)

    Juravic, T.

    1999-01-01

    Needs and ownership are sociobiologically manifested in the alter-ego of Homo sapiens, where the natural progression of events (a household being the fundamental microlevel) and the social order, i.e. globalisation, are based on ownership and needs as sacred rights; for this reason universal values like energy conservation end up wasted in the mindless worship of consumption. Justified needs are phenomena of a consumerist (egocentric, pragmatic, voluntary) social conscience and instinctive behaviour - an unpredictable cause resulting from freedom being the foundation of the quality of life and of socio-economic and political changes - but they are mutually exclusive with understanding (expressing and gaining deeper and richer knowledge). Built-in limits and/or controls on consumption, which are already used in household appliances with preset processes (goals) for unknown consumers, achieve large energy savings in 'routine' functions and are more effective than attempts to prevent mistakes (lack of user knowledge) through repression. A private vehicle, as a symbol of freedom and quality of life, is a mechanism for achieving 'justified' needs and presents another avenue of household energy utilisation. The consumer's desires regarding private transportation are not sufficiently reconciled with intelligent microprocessors (expert systems), which achieve (the most) optimal behaviour in the process of transportation. This detailed consideration (as part of investigating the technical system) cannot be examined on a strictly logical or scientific basis, as it only proposes a method of co-agreement (not co-responsibility) between manufacturers and consumers and an alternative way of thinking about, and organising, the interaction between vehicles and traffic, in order to form a judgement of genuinely justifiable needs and to achieve a robotic private vehicle, transportation and traffic. The goal of this consideration is to establish the DIVISION of energy with the help of

  16. Multilevel Autoregressive Mediation Models: Specification, Estimation, and Applications.

    Science.gov (United States)

    Zhang, Qian; Wang, Lijuan; Bergeman, C S

    2017-11-27

    In the current study, extending from the cross-lagged panel models (CLPMs) in Cole and Maxwell (2003), we proposed the multilevel autoregressive mediation models (MAMMs) by allowing the coefficients to differ across individuals. In addition, Level-2 covariates can be included to explain the interindividual differences in mediation effects. Given the complexity of the proposed models, Bayesian estimation was used. Both a CLPM and an unconditional MAMM were fitted to daily diary data. The 2 models yielded different statistical conclusions regarding the average mediation effect. A simulation study was conducted to examine the estimation accuracy of Bayesian estimation for MAMMs and the consequences of model misspecifications. Factors considered included the sample size (N), number of time points (T), fixed indirect and direct effect sizes, and Level-2 variances and covariances. Results indicated that the fixed effect estimates for the indirect effect components (a and b) and the fixed effects of Level-2 covariates were accurate when N ≥ 50 and T ≥ 5. Estimates of Level-2 variances and covariances were accurate provided N and T were sufficiently large (e.g., N ≥ 500 and T ≥ 50). Estimates of the average mediation effect were generally accurate when N ≥ 100 and T ≥ 10, or N ≥ 50 and T ≥ 20. Furthermore, we found that when Level-2 variances were zero, MAMMs yielded valid inferences about the fixed effects, whereas when random effects existed, CLPMs had low coverage rates for fixed effects. DIC can be used for model selection. Limitations and future directions were discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Stratified sampling design and loss to follow-up in survival models: evaluation of efficiency and bias

    Directory of Open Access Journals (Sweden)

    Carvalho Marilia S

    2011-06-01

    Full Text Available Abstract Background Longitudinal studies often employ complex sample designs to optimize sample size, over-representing population groups of interest. The effect of the sample design on parameter estimates is quite often ignored, particularly when fitting survival models. Another major problem in long-term cohort studies is the potential bias due to loss to follow-up. Methods In this paper we simulated a dataset of approximately 50,000 individuals as the target population and 15,000 participants to be followed up for 40 years, both based on real cohort studies of cardiovascular diseases. Two sampling strategies - simple random sampling (our gold standard) and sampling stratified by professional group with non-proportional allocation - and two loss-to-follow-up scenarios - non-informative censoring and losses related to professional group - were analyzed. Results Two modeling approaches were evaluated: weighted and non-weighted fits. Our results indicate that, under the correctly specified model, ignoring the sample weights does not affect the results. However, the model ignoring the interaction of sample strata with the variable of interest, and the crude estimates, were highly biased. Conclusions In epidemiological studies misspecification should always be considered, as different sources of variability, related to the individuals and not captured by the covariates, are always present. Therefore, allowance must be made for the possibility of unknown confounders and interactions with the main variable of interest in the data. It is strongly recommended always to correct for sample weights.
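
    A sketch of the weighted versus unweighted comparison, using inverse-probability-of-selection weights in a Cox model via lifelines' weights_col; the data frame and sampling fractions are simulated placeholders, not the cohort described above.

```python
# Weighted vs. unweighted Cox fits under a stratified design (toy data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 2000
strata = rng.integers(0, 2, size=n)                 # professional group
x = rng.normal(size=n) + 0.5 * strata               # exposure, related to strata
time = rng.exponential(scale=np.exp(-0.5 * x))      # hazard increases with x
event = (rng.uniform(size=n) < 0.8).astype(int)     # some censoring

df = pd.DataFrame({"time": time, "event": event, "x": x})
# Assumed non-proportional sampling fractions (0.8 vs 0.2) -> inverse weights
df["w"] = np.where(strata == 1, 1 / 0.8, 1 / 0.2)

unweighted = CoxPHFitter().fit(df[["time", "event", "x"]], "time", "event")
weighted = CoxPHFitter().fit(df, "time", "event", weights_col="w", robust=True)
print(unweighted.params_)
print(weighted.params_)
```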

  18. Evaluating remedial alternatives for an acid mine drainage stream: A model post audit

    Science.gov (United States)

    Runkel, Robert L.; Kimball, Briant A.; Walton-Day, Katherine; Verplanck, Philip L.; Broshears, Robert E.

    2012-01-01

    A post audit for a reactive transport model used to evaluate acid mine drainage treatment systems is presented herein. The post audit is based on a paired synoptic approach in which hydrogeochemical data are collected at low (existing conditions) and elevated (following treatment) pH. Data obtained under existing, low-pH conditions are used for calibration, and the resultant model is used to predict metal concentrations observed following treatment. Predictions for Al, As, Fe, H+, and Pb accurately reproduce the observed reduction in dissolved concentrations afforded by the treatment system, and the information provided in regard to standard attainment is also accurate (predictions correctly indicate attainment or nonattainment of water quality standards for 19 of 25 cases). Errors associated with Cd, Cu, and Zn are attributed to misspecification of sorbent mass (precipitated Fe). In addition to these specific results, the post audit provides insight in regard to calibration and sensitivity analysis that is contrary to conventional wisdom. Steps taken during the calibration process to improve simulations of As sorption were ultimately detrimental to the predictive results, for example, and the sensitivity analysis failed to bracket observed metal concentrations.

  19. Evaluating remedial alternatives for an acid mine drainage stream: a model post audit.

    Science.gov (United States)

    Runkel, Robert L; Kimball, Briant A; Walton-Day, Katherine; Verplanck, Philip L; Broshears, Robert E

    2012-01-03

    A post audit for a reactive transport model used to evaluate acid mine drainage treatment systems is presented herein. The post audit is based on a paired synoptic approach in which hydrogeochemical data are collected at low (existing conditions) and elevated (following treatment) pH. Data obtained under existing, low-pH conditions are used for calibration, and the resultant model is used to predict metal concentrations observed following treatment. Predictions for Al, As, Fe, H(+), and Pb accurately reproduce the observed reduction in dissolved concentrations afforded by the treatment system, and the information provided in regard to standard attainment is also accurate (predictions correctly indicate attainment or nonattainment of water quality standards for 19 of 25 cases). Errors associated with Cd, Cu, and Zn are attributed to misspecification of sorbent mass (precipitated Fe). In addition to these specific results, the post audit provides insight in regard to calibration and sensitivity analysis that is contrary to conventional wisdom. Steps taken during the calibration process to improve simulations of As sorption were ultimately detrimental to the predictive results, for example, and the sensitivity analysis failed to bracket observed metal concentrations.

  20. Three Requirements for Justifying an Educational Neuroscience

    Science.gov (United States)

    Hruby, George G.

    2012-01-01

    Background: Over the past quarter century, efforts to bridge between research in the neurosciences and research, theory, and practice in education have grown from a mere hope to noteworthy scholarly sophistication. Many dedicated educational researchers have developed the secondary expertise in the necessary neurosciences and related fields to…

  1. Self-Esteem: Justifying Its Existence.

    Science.gov (United States)

    Street, Sue; Isaacs, Madelyn

    1998-01-01

    The role of self-esteem as a professional and personality construct has been obscured by its panacea role. Definitions of self-esteem and related terms are distinguished. Self-esteem is discussed as a developmental construct, a personality construct, and as a therapeutic goal. Therapeutic, educational, and counseling implications are discussed.…

  2. Justifying Physical Education Based on Neuroscience Evidence

    Science.gov (United States)

    Berg, Kris

    2010-01-01

    Research has shown that exercise improves cognitive function and psychological traits that influence behavior (e.g., mood, level of motivation). The evidence in the literature also shows that physical education may enhance learning or that academic performance is at least maintained despite a reduction in classroom time in order to increase time…

  3. Are ionic CAT contrast media still justifiable

    International Nuclear Information System (INIS)

    Witt, H.; Trempenau, B.; Dietz, G.

    1984-01-01

    The authors' clinical results revealed no statistically significant differences in tolerance between the two X-ray contrast media 'Ioxitalamat' and 'Ioglicinat'. Side-effects were found in 4.3% of the cases for both contrast media, a rate slightly below that for urography. However, it must not be overlooked that patients exposed to certain risk factors, such as relative contraindications, were as far as possible excluded from the study. (orig./WU)

  4. Are segregated sports classes scientifically justified?

    OpenAIRE

    Lawson, Sian; Hall, Edward

    2014-01-01

    School sports classes are a key part of physical and mental development, yet in many countries these classes are gender segregated. Before institutionalised segregation can be condoned it is important to tackle assumptions and check for an evidence-based rationale. This presentation aims to analyse the key arguments for segregation given in comment-form response to a recent media article discussing mixed school sports (Lawson, 2013). The primary argument given was division for strength

  5. Three requirements for justifying an educational neuroscience.

    Science.gov (United States)

    Hruby, George G

    2012-03-01

    Over the past quarter century, efforts to bridge between research in the neurosciences and research, theory, and practice in education have grown from a mere hope to noteworthy scholarly sophistication. Many dedicated educational researchers have developed the secondary expertise in the necessary neurosciences and related fields to generate both empirical research and theoretical syntheses of noteworthy promise. Nonetheless, thoughtful and critical scholars in education have expressed concern about both the intellectual coherence and ethical dangers of this new area. It is still an open question whether educational neuroscience is for some time yet to remain only a formative study area for adventurous scholars or is already a fully fledged field of educational scholarship. In this paper, I suggest that to be a worthy field of educational research, educational neuroscience will need to address three issues: intellectual coherence, mutually informing and respected scholarly expertise, and an ethical commitment to the moral implications and obligations shared within educational research generally. I shall set forth some examples of lapses in this regard, focusing primarily on work on reading development, as that is my area of expertise, and make recommendations for due diligence. Arguments. First, intellectual coherence requires both precision in definition of technical terms (so that diverse scholars and professionals may communicate findings and insights consistently across fields), and precision in the logical warrants by which educational implications are drawn from empirical data from the neurosciences. Both needs are facilitated by careful attention to categorical boundary and avoidance of category error. Second, educational neuroscientists require focused and broad expertise in both the neurosciences and educational scholarship on teaching and learning in classrooms (and/or ancillary fields). If history is our guide, neuroscience implications for practice will prove unlikely in practice without expertise on practice. Additionally, respect for the expertise of others in this hybrid and necessarily collaborative enterprise is required. Third, educational neuroscience must take seriously the heightened moral and ethical concerns and commitments of educational professionals generally and educational researchers particularly. This means keeping a vigilant eye towards preserving the integrity of empirical and theoretical findings against rhetorical misuse by educational marketers, policy makers, and polemicists targeting the general public. I conclude that educational neuroscience is more than a hybrid patchwork of individual interests constituting a study area, and is perhaps ready to stand as a legitimate field of educational inquiry. It will not be accepted as such, however, nor should it be, unless the need to demonstrate a capacity for consistent intellectual coherence, scholarly expertise, and ethical commitment is met. ©2012 The British Psychological Society.

  6. A heteroscedastic generalized linear model with a non-normal speed factor for responses and response times

    NARCIS (Netherlands)

    Molenaar, D.; Bolsinova, M.

    In generalized linear modelling of responses and response times, the observed response time variables are commonly transformed to make their distribution approximately normal. A normal distribution for the transformed response times is desirable as it justifies the linearity and homoscedasticity

  7. Evidence for Misspecification of a Nationally Used Quality Measure for Substance Use Treatment.

    Science.gov (United States)

    Mattke, Soeren; Predmore, Zachary; Sloss, Elizabeth; Wilks, Asa; Watkins, Katherine E

    2017-09-04

    The National Committee for Quality Assurance's (NCQA) measure "Initiation and Engagement of Alcohol and Other Drug Dependence Treatment" captures the proportion of substance use patients with (1) treatment initiation within 14 days and (2) treatment engagement within 30 days thereafter. The definition of treatment considers only counseling but not medication-assisted treatment (MAT), although MAT is supported by current guidelines. Our research question is whether this omission results in meaningful measurement error. We analyze claims data for members of commercial health plans to investigate whether including MAT would meaningfully change the measure rate and health plan rankings. Including MAT increased both the initiation and engagement rates. The initiation and engagement rates increased 2.4% (38.9-39.8%) and 9.9% (12.9-14%), respectively. These differences imply that 19% of health plans would change their ranking by at least one quintile for the initiation measure and 27% for the engagement measure. The current specifications result in erroneous conclusions about the quality of care provided by different health plans. Our results suggest that aligning the measure specifications with guideline recommendations, as recently proposed by NCQA, would result in more accurate information.

  8. Education and gender bias in the sex ratio at birth: evidence from India.

    Science.gov (United States)

    Echávarri, Rebeca A; Ezcurra, Roberto

    2010-02-01

    This article investigates the possible existence of a nonlinear link between female disadvantage in natality and education. To this end, we devise a theoretical model based on the key role of social interaction in explaining people's acquisition of preferences, which justifies the existence of a nonmonotonic relationship between female disadvantage in natality and education. The empirical validity of the proposed model is examined for the case of India, using district-level data. In this context, our econometric analysis pays particular attention to the role of spatial dependence to avoid any potential problems of misspecification. The results confirm that the relationship between the sex ratio at birth and education in India follows an inverted U-shape. This finding is robust to the inclusion of additional explanatory variables in the analysis, and to the choice of the spatial weight matrix used to quantify the spatial interdependence between the sample districts.

  9. Integrated population modeling of black bears in Minnesota: implications for monitoring and management.

    Directory of Open Access Journals (Sweden)

    John R Fieberg

    Full Text Available BACKGROUND: Wildlife populations are difficult to monitor directly because of costs and logistical challenges associated with collecting informative abundance data from live animals. By contrast, data on harvested individuals (e.g., age and sex) are often readily available. Increasingly, integrated population models are used for natural resource management because they synthesize various relevant data into a single analysis. METHODOLOGY/PRINCIPAL FINDINGS: We investigated the performance of integrated population models applied to black bears (Ursus americanus) in Minnesota, USA. Models were constructed using sex-specific age-at-harvest matrices (1980-2008), data on hunting effort and natural food supplies (which affects hunting success), and statewide mark-recapture estimates of abundance (1991, 1997, 2002). We compared this approach to Downing reconstruction, a commonly used population monitoring method that utilizes only age-at-harvest data. We first conducted a large-scale simulation study, in which our integrated models provided more accurate estimates of population trends than did Downing reconstruction. Estimates of trends were robust to various forms of model misspecification, including incorrectly specified cub and yearling survival parameters, age-related reporting biases in harvest data, and unmodeled temporal variability in survival and harvest rates. When applied to actual data on Minnesota black bears, the model predicted that harvest rates were negatively correlated with food availability and positively correlated with hunting effort, consistent with independent telemetry data. With no direct data on fertility, the model also correctly predicted 2-point cycles in cub production. Model-derived estimates of abundance for the most recent years provided a reasonable match to an empirical population estimate obtained after modeling efforts were completed. CONCLUSIONS/SIGNIFICANCE: Integrated population modeling provided a reasonable

  10. Integrated population modeling of black bears in Minnesota: implications for monitoring and management.

    Science.gov (United States)

    Fieberg, John R; Shertzer, Kyle W; Conn, Paul B; Noyce, Karen V; Garshelis, David L

    2010-08-12

    Wildlife populations are difficult to monitor directly because of costs and logistical challenges associated with collecting informative abundance data from live animals. By contrast, data on harvested individuals (e.g., age and sex) are often readily available. Increasingly, integrated population models are used for natural resource management because they synthesize various relevant data into a single analysis. We investigated the performance of integrated population models applied to black bears (Ursus americanus) in Minnesota, USA. Models were constructed using sex-specific age-at-harvest matrices (1980-2008), data on hunting effort and natural food supplies (which affects hunting success), and statewide mark-recapture estimates of abundance (1991, 1997, 2002). We compared this approach to Downing reconstruction, a commonly used population monitoring method that utilizes only age-at-harvest data. We first conducted a large-scale simulation study, in which our integrated models provided more accurate estimates of population trends than did Downing reconstruction. Estimates of trends were robust to various forms of model misspecification, including incorrectly specified cub and yearling survival parameters, age-related reporting biases in harvest data, and unmodeled temporal variability in survival and harvest rates. When applied to actual data on Minnesota black bears, the model predicted that harvest rates were negatively correlated with food availability and positively correlated with hunting effort, consistent with independent telemetry data. With no direct data on fertility, the model also correctly predicted 2-point cycles in cub production. Model-derived estimates of abundance for the most recent years provided a reasonable match to an empirical population estimate obtained after modeling efforts were completed. Integrated population modeling provided a reasonable framework for synthesizing age-at-harvest data, periodic large-scale abundance estimates, and

  11. Numerical Modeling of Rotary Kiln Productivity Increase

    NARCIS (Netherlands)

    Romero-Valle, M.A.; Pisaroni, M.; Van Puyvelde, D.; Lahaye, D.J.P.; Sadi, R.

    2013-01-01

    Rotary kilns are used in many industrial processes ranging from cement manufacturing to waste incineration. The operating conditions vary widely depending on the process. While there are many models available within the literature and industry, the wide range of operating conditions justifies

  12. Modeling Site Heterogeneity with Posterior Mean Site Frequency Profiles Accelerates Accurate Phylogenomic Estimation.

    Science.gov (United States)

    Wang, Huai-Chun; Minh, Bui Quang; Susko, Edward; Roger, Andrew J

    2018-03-01

    Proteins have distinct structural and functional constraints at different sites that lead to site-specific preferences for particular amino acid residues as the sequences evolve. Heterogeneity in the amino acid substitution process between sites is not modeled by commonly used empirical amino acid exchange matrices. Such model misspecification can lead to artefacts in phylogenetic estimation such as long-branch attraction. Although sophisticated site-heterogeneous mixture models have been developed to address this problem in both Bayesian and maximum likelihood (ML) frameworks, their formidable computational time and memory usage severely limit their use in large phylogenomic analyses. Here we propose a posterior mean site frequency (PMSF) method as a rapid and efficient approximation to full empirical profile mixture models for ML analysis. The PMSF approach assigns a conditional mean amino acid frequency profile to each site, calculated from a mixture model fitted to the data using a preliminary guide tree. These PMSF profiles can then be used for in-depth tree searching in place of the full mixture model. Compared with widely used empirical mixture models with k classes, our implementation of PMSF in IQ-TREE (http://www.iqtree.org) speeds up the computation approximately k/1.5-fold and requires a small fraction of the RAM. Furthermore, this speedup allows, for the first time, full nonparametric bootstrap analyses to be conducted under complex site-heterogeneous models on large concatenated data matrices. Our simulations and empirical data analyses demonstrate that PMSF can effectively ameliorate long-branch attraction artefacts. In some empirical and simulation settings PMSF provided more accurate estimates of phylogenies than the mixture models from which they derive.

  13. “It’s an Issue that Must Be Addressed, Once Infertility Is Declared a Disease” Study of The Discursive Mechanisms Used by Chilean Deputies to Justify their Positions Regarding Assisted Reproduction

    Directory of Open Access Journals (Sweden)

    Yanko Pavicevic Cifuentes

    2015-10-01

    Full Text Available In Chile, the use and development of assisted reproductive technologies (ART) have been on a constant rise, with a total of 1932 cycles registered in 2009 (SOCMER, 2009). However, the legal framework that regulates these technologies is weak, so in practice the clinics that provide them are the ones that supervise their use. This is why we wanted to understand the position of those who have the faculty to legislate on the use of ART in Chile, namely the deputies of the Republic, focusing on how they justify their standpoints. Our investigation was qualitative in nature, because this gives more space for reflexivity and flexibility in the investigation process (Mason, 2002). We collected information using semi-structured interviews conducted with 16 deputies of the two main political coalitions in Chile. In the deputies' discourse, different positions concerning the use of ARTs are manifested: there are those who demand respect for human dignity and nature, and those who foment scientific development; but in general, there is consensus about the necessity to broaden the regulation of ART and incentivize its development, always safeguarding ethics and equity.

  14. Statistical models for brain signals with properties that evolve across trials.

    Science.gov (United States)

    Ombao, Hernando; Fiecas, Mark; Ting, Chee-Ming; Low, Yin Fen

    2017-12-07

    Most neuroscience cognitive experiments involve repeated presentations of various stimuli across several minutes or a few hours. It has been observed that brain responses, even to the same stimulus, evolve over the course of the experiment. These changes in brain activation and connectivity are believed to be associated with learning and/or habituation. In this paper, we present two general approaches to modeling dynamic brain connectivity using electroencephalograms (EEGs) recorded across replicated trials in an experiment. The first approach is the Markovian regime-switching vector autoregressive model (MS-VAR) which treats EEGs as realizations of an underlying brain process that switches between different states both within a trial and across trials in the entire experiment. The second is the slowly evolutionary locally stationary process (SEv-LSP) which characterizes the observed EEGs as a mixture of oscillatory activities at various frequency bands. The SEv-LSP model captures the dynamic nature of the amplitudes of the band-oscillations and cross-correlations between them. The MS-VAR model is able to capture abrupt changes in the dynamics while the SEv-LSP directly gives interpretable results. Moreover, it is nonparametric and hence does not suffer from model misspecification. For both of these models, time-evolving connectivity metrics in the frequency domain are derived from the model parameters for both functional and effective connectivity. We illustrate these two models for estimating cross-trial connectivity in selective attention using EEG data from an oddball paradigm auditory experiment where the goal is to characterize the evolution of brain responses to target stimuli and to standard tones presented randomly throughout the entire experiment. The results suggest dynamic changes in connectivity patterns over trials with inter-subject variability. Copyright © 2017. Published by Elsevier Inc.

  15. Statistical models for brain signals with properties that evolve across trials

    KAUST Repository

    Ombao, Hernando

    2017-12-07

    Most neuroscience cognitive experiments involve repeated presentations of various stimuli across several minutes or a few hours. It has been observed that brain responses, even to the same stimulus, evolve over the course of the experiment. These changes in brain activation and connectivity are believed to be associated with learning and/or habituation. In this paper, we present two general approaches to modeling dynamic brain connectivity using electroencephalograms (EEGs) recorded across replicated trials in an experiment. The first approach is the Markovian regime-switching vector autoregressive model (MS-VAR) which treats EEGs as realizations of an underlying brain process that switches between different states both within a trial and across trials in the entire experiment. The second is the slowly evolutionary locally stationary process (SEv-LSP) which characterizes the observed EEGs as a mixture of oscillatory activities at various frequency bands. The SEv-LSP model captures the dynamic nature of the amplitudes of the band-oscillations and cross-correlations between them. The MS-VAR model is able to capture abrupt changes in the dynamics while the SEv-LSP directly gives interpretable results. Moreover, it is nonparametric and hence does not suffer from model misspecification. For both of these models, time-evolving connectivity metrics in the frequency domain are derived from the model parameters for both functional and effective connectivity. We illustrate these two models for estimating cross-trial connectivity in selective attention using EEG data from an oddball paradigm auditory experiment where the goal is to characterize the evolution of brain responses to target stimuli and to standard tones presented randomly throughout the entire experiment. The results suggest dynamic changes in connectivity patterns over trials with inter-subject variability.

  16. Some Problems in Using Diffusion Models for New Products.

    Science.gov (United States)

    Bernhardt, Irwin; Mackenzie, Kenneth D.

    This paper analyzes some of the problems of using diffusion models to formulate marketing strategies for new products. Though future work in this area appears justified, many unresolved problems limit its application. There is no theory for adoption and diffusion processes; such a theory is outlined in this paper. The present models are too…

  17. ACEt: An R Package for Estimating Dynamic Heritability and Comparing Twin Models.

    Science.gov (United States)

    He, Liang; Pitkäniemi, Janne; Silventoinen, Karri; Sillanpää, Mikko J

    2017-11-01

    Estimating dynamic effects of age on the genetic and environmental variance components in twin studies may contribute to the investigation of gene-environment interactions and may provide more insight into accurate and powerful estimation of heritability. Existing parametric models for estimating dynamic variance components suffer from various drawbacks, such as the limitations of predefined functional forms. We present ACEt, an R package for fast estimation of dynamic variance components and heritability that may change with respect to age or other moderators. Building on twin models using penalized splines, ACEt provides a unified framework to incorporate a class of ACE models, in which each component can be modeled independently and is not limited to a linear or quadratic function. We demonstrate that ACEt is robust against misspecification of the number of spline knots, and offers a refined resolution of the dynamic behavior of the genetic and environmental components and thus a detailed estimation of age-specific heritability. Moreover, we develop resampling methods for testing twin models with different variance functions, including splines, log-linearity and constancy, which can be easily employed to verify various model assumptions. We evaluated the type I error rate and statistical power of the proposed hypothesis testing procedures under various scenarios using simulated datasets. Potential numerical issues and computational cost were also assessed through simulations. We applied the ACEt package to a Finnish twin cohort to investigate the age-specific heritability of body mass index and height. Our results show that the age-specific variance components of these two traits exhibited substantially different patterns despite comparable estimates of heritability. In summary, the ACEt R package offers a useful tool for the exploration of age-dependent heritability and model comparison in twin studies.

  18. HEDR modeling approach

    International Nuclear Information System (INIS)

    Shipler, D.B.; Napier, B.A.

    1992-07-01

    This report details the conceptual approaches to be used in calculating radiation doses to individuals throughout the various periods of operations at the Hanford Site. The report considers the major environmental transport pathways--atmospheric, surface water, and ground water--and projects an appropriate modeling technique for each. The modeling sequence chosen for each pathway depends on the available data on doses, the degree of confidence justified by such existing data, and the level of sophistication deemed appropriate for the particular pathway and time period being considered

  19. Testing for Stock Market Contagion: A Quantile Regression Approach

    NARCIS (Netherlands)

    S.Y. Park (Sung); W. Wang (Wendun); N. Huang (Naijing)

    2015-01-01

    Abstract: Regarding the asymmetric and leptokurtic behavior of financial data, we propose a new contagion test in the quantile regression framework that is robust to model misspecification. Unlike conventional correlation-based tests, the proposed quantile contagion test

  20. A Nonparametric Bayesian Approach to Seismic Hazard Modeling Using the ETAS Framework

    Science.gov (United States)

    Ross, G.

    2015-12-01

    The epidemic-type aftershock sequence (ETAS) model is one of the most popular tools for modeling seismicity and quantifying risk in earthquake-prone regions. Under the ETAS model, the occurrence times of earthquakes are treated as a self-exciting Poisson process in which each earthquake briefly increases the probability of subsequent earthquakes occurring soon afterwards, capturing the fact that large mainshocks tend to produce long sequences of aftershocks. A triggering kernel controls the amount by which the probability increases based on the magnitude of each earthquake, and the rate at which it then decays over time. This triggering kernel is usually chosen heuristically, to match the parametric form of the modified Omori law for aftershock decay. However, recent work has questioned whether this is an appropriate choice. Since the choice of kernel has a large impact on the predictions made by the ETAS model, avoiding misspecification is crucially important. We present a novel nonparametric version of ETAS which avoids making parametric assumptions and instead learns the correct specification from the data itself. Our approach is based on the Dirichlet process, a modern class of Bayesian prior distributions which allows for efficient inference over an infinite-dimensional space of functions. We show how our nonparametric ETAS model can be fit to data, and present results demonstrating that the fit is greatly improved compared to the standard parametric specification. Additionally, we explain how our model can be used to perform probabilistic declustering of earthquake catalogs, to classify earthquakes as being either aftershocks or mainshocks, and to learn the causal relations between pairs of earthquakes.
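
    For reference, a minimal sketch of the standard parametric ETAS conditional intensity that the nonparametric kernel replaces; the parameter values below are illustrative assumptions, not fitted values.

      # Standard parametric ETAS conditional intensity with a modified-Omori kernel:
      # lambda(t) = mu + sum_{t_i < t} K * exp(alpha*(m_i - M0)) * (t - t_i + c)**(-p)
      import numpy as np

      def etas_intensity(t, event_times, event_mags,
                         mu=0.1, K=0.05, alpha=1.0, M0=3.0, c=0.01, p=1.1):
          """Conditional intensity of the ETAS model at time t (illustrative parameters)."""
          past = event_times < t
          dt = t - event_times[past]
          trig = K * np.exp(alpha * (event_mags[past] - M0)) * (dt + c) ** (-p)
          return mu + trig.sum()

      times = np.array([1.0, 2.5, 2.6])   # event times (days)
      mags = np.array([5.0, 4.2, 3.8])    # magnitudes
      print(etas_intensity(3.0, times, mags))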

  1. Justifying knowledge, justifying method, taking action: epistemologies, methodologies, and methods in qualitative research.

    Science.gov (United States)

    Carter, Stacy M; Little, Miles

    2007-12-01

    In this article, the authors clarify a framework for qualitative research, in particular for evaluating its quality, founded on epistemology, methodology, and method. They define these elements and discuss their respective contributions and interrelationships. Epistemology determines and is made visible through method, particularly in the participant-researcher relationship, measures of research quality, and form, voice, and representation in analysis and writing. Epistemology guides methodological choices and is axiological. Methodology shapes and is shaped by research objectives, questions, and study design. Methodologies can prescribe choices of method, resonate with particular academic disciplines, and encourage or discourage the use and/or development of theory. Method is constrained by and makes visible methodological and epistemic choices. If we define good quality qualitative research as research that attends to all three elements and demonstrates internal consistency between them, standardized checklists can be transcended and innovation and diversity in qualitative research practice facilitated.

  2. Robust inference in the negative binomial regression model with an application to falls data.

    Science.gov (United States)

    Aeberhard, William H; Cantoni, Eva; Heritier, Stephane

    2014-12-01

    A popular way to model overdispersed count data, such as the number of falls reported during intervention studies, is by means of the negative binomial (NB) distribution. Classical estimation methods are well known to be sensitive to model misspecification, which in such intervention studies may take the form of patients falling much more often than expected where the NB regression model is used. In this article we extend two approaches for building robust M-estimators of the regression parameters in the class of generalized linear models to the NB distribution. The first approach achieves robustness in the response by applying a bounded function to the Pearson residuals arising in the maximum likelihood estimating equations, while the second approach achieves robustness by bounding the unscaled deviance components. For both approaches, we explore different choices for the bounding functions. Through a unified notation, we show how close these approaches may actually be as long as the bounding functions are chosen and tuned appropriately, and provide the asymptotic distributions of the resulting estimators. Moreover, we introduce a robust weighted maximum likelihood estimator for the overdispersion parameter, specific to the NB distribution. Simulations under various settings show that redescending bounding functions yield estimates with smaller biases under contamination while keeping high efficiency at the assumed model, and this for both approaches. We present an application to a recent randomized controlled trial measuring the effectiveness of an exercise program at reducing the number of falls among people suffering from Parkinson's disease, illustrating the diagnostic use of such robust procedures and the need for them in reliable inference. © 2014, The International Biometric Society.
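
    A sketch of the first approach, in the notation of the robust GLM framework of Cantoni and Ronchetti (2001) that this line of work builds on (the notation here is assumed, not quoted from the paper): the maximum likelihood score equations are replaced by

      \sum_{i=1}^{n} \left[ \psi_c(r_i)\, w(x_i)\, \frac{\mu_i'}{\sqrt{V(\mu_i)}} - a(\beta) \right] = 0, \qquad r_i = \frac{y_i - \mu_i}{\sqrt{V(\mu_i)}},

    where \psi_c is a bounded function such as Huber's, w(x_i) downweights leverage points, and a(\beta) is a correction term ensuring Fisher consistency at the assumed model.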

  3. Validity of tests under covariate-adaptive biased coin randomization and generalized linear models.

    Science.gov (United States)

    Shao, Jun; Yu, Xinxin

    2013-12-01

    Some covariate-adaptive randomization methods have been used in clinical trials for a long time, but little theoretical work had been done about testing hypotheses under covariate-adaptive randomization until Shao et al. (2010), who provided a theory with detailed discussion for responses under linear models. In this article, we establish some asymptotic results for covariate-adaptive biased coin randomization under generalized linear models with possibly unknown link functions. We show that the simple t-test without using any covariate is conservative under covariate-adaptive biased coin randomization in terms of its Type I error rate, and that a valid test using the bootstrap can be constructed. This bootstrap test, utilizing covariates in the randomization scheme, is shown to be asymptotically as efficient as Wald's test correctly using covariates in the analysis. Thus, the efficiency loss due to not using covariates in the analysis can be recovered by utilizing covariates in covariate-adaptive biased coin randomization. Our theory is illustrated with the two most popular types of discrete outcomes, binary responses and event counts under the Poisson model, and with exponentially distributed continuous responses. We also show that an alternative simple test without using any covariate under the Poisson model has an inflated Type I error rate under simple randomization, but is valid under covariate-adaptive biased coin randomization. Effects of model misspecification on the validity of tests are also discussed. Simulation studies of the Type I errors and powers of several tests are presented for both discrete and continuous responses. © 2013, The International Biometric Society.
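
    As a hedged illustration of the design being analyzed (the stratum definition and the bias probability of 2/3 are illustrative assumptions, not taken from the paper): a covariate-adaptive biased coin assigns each new patient so as to reduce treatment imbalance within his or her covariate stratum.

      # Sketch of covariate-adaptive biased coin randomization (Efron-type bias 2/3).
      import random
      from collections import defaultdict

      imbalance = defaultdict(int)  # stratum -> (#treatment - #control)

      def assign(stratum, bias=2/3):
          """Assign 1 (treatment) or 0 (control), biasing toward balance in the stratum."""
          d = imbalance[stratum]
          if d == 0:
              p = 0.5            # balanced: fair coin
          elif d > 0:
              p = 1 - bias       # treatment ahead: favor control
          else:
              p = bias           # control ahead: favor treatment
          arm = 1 if random.random() < p else 0
          imbalance[stratum] += 1 if arm == 1 else -1
          return arm

      random.seed(1)
      for stratum in ["male/old", "male/old", "female/young", "male/old"]:
          print(stratum, assign(stratum))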

  4. The effect of using a robust optimality criterion in model based adaptive optimization.

    Science.gov (United States)

    Strömberg, Eric A; Hooker, Andrew C

    2017-08-01

    Optimizing designs using robust (global) optimality criteria has been shown to be a more flexible approach than using local optimality criteria. Additionally, model based adaptive optimal design (MBAOD) may be less sensitive to misspecification in the prior information available at the design stage. In this work, we investigate the influence of using a local (lnD) or a robust (ELD) optimality criterion for the MBAOD of a simulated dose optimization study, for rich and sparse sampling schedules. A stopping criterion for accurate effect prediction is constructed to determine the endpoint of the MBAOD by minimizing the expected uncertainty in the effect response of the typical individual. Fifty iterations of the MBAOD were run using the MBAOD R-package, with the concentration from a one-compartment first-order absorption pharmacokinetic model driving the population effect response in a sigmoidal EMAX pharmacodynamic model. The initial cohort consisted of eight individuals in two groups, and each additional cohort added two individuals receiving a dose optimized as a discrete covariate. The MBAOD designs using lnD and ELD optimality with misspecified initial model parameters were compared by evaluating the efficiency relative to an lnD-optimal design based on the true parameter values. For the explored example model, the MBAOD using ELD-optimal designs converged more quickly to the theoretically optimal lnD-optimal design based on the true parameters, for both sampling schedules. Thus, using a robust optimality criterion in MBAODs could reduce the number of adaptations required and improve the practicality of adaptive trials using optimal design.
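
    For reference, the two criteria compared, in standard optimal-design notation (assumed here; FIM denotes the Fisher information matrix): the local criterion evaluates a design \xi at a point guess \theta_0, while the robust criterion averages over a prior on \theta,

      \mathrm{lnD}:\ \max_{\xi}\ \ln\left|\mathrm{FIM}(\xi, \theta_0)\right|, \qquad \mathrm{ELD}:\ \max_{\xi}\ \mathbb{E}_{\theta}\left[\ln\left|\mathrm{FIM}(\xi, \theta)\right|\right],

    so the ELD design hedges against the initial parameter guess being wrong, which is the property exploited in the adaptive setting studied here.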

  5. Modelling

    CERN Document Server

    Spädtke, P

    2013-01-01

    Modeling of technical machines has been a standard technique since computers became powerful enough to handle the amount of data relevant to the specific system. Simulation of an existing physical device requires knowledge of all relevant quantities. Electric fields given by the surrounding boundary, as well as magnetic fields caused by coils or permanent magnets, have to be known. Internal sources for both fields are sometimes taken into account, such as space charge forces or the internal magnetic field of a moving bunch of charged particles. The solver routines used are briefly described, and some benchmarking is shown to estimate the computing times necessary for different problems. Different types of charged particle sources will be shown, together with suitable models to describe their physics. Electron guns are covered, as well as different ion sources (volume ion sources, laser ion sources, Penning ion sources, electron resonance ion sources, and H$^-$-sources), together with some remarks on beam transport.

  6. Stochastic Restricted Biased Estimators in misspecified regression model with incomplete prior information

    OpenAIRE

    Kayanan, Manickavasagar; Wijekoon, Pushpakanthie

    2017-01-01

    In this article, the analysis of misspecification was extended to the recently introduced stochastic restricted biased estimators when multicollinearity exists among the explanatory variables. The Stochastic Restricted Ridge Estimator (SRRE), Stochastic Restricted Almost Unbiased Ridge Estimator (SRAURE), Stochastic Restricted Liu Estimator (SRLE), Stochastic Restricted Almost Unbiased Liu Estimator (SRAULE), Stochastic Restricted Principal Component Regression Estimator (SRPCR), Stochastic R...
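
    As background for the estimators listed (standard textbook forms, not taken from this article): in the model y = X\beta + \varepsilon with a stochastic prior restriction r = R\beta + v, v \sim (0, \Omega), the ordinary ridge estimator and the Theil-Goldberger mixed estimator are

      \hat\beta_{\mathrm{ridge}}(k) = (X'X + kI)^{-1} X'y, \qquad \hat\beta_{\mathrm{ME}} = (X'X + R'\Omega^{-1}R)^{-1} (X'y + R'\Omega^{-1}r),

    and the stochastic restricted ridge-type estimators studied here combine these two shrinkage devices.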

  7. Uninformative priors prefer simpler models

    Science.gov (United States)

    Mattingly, Henry; Abbott, Michael; Machta, Benjamin

    The Bayesian framework for model selection requires a prior for the probability of candidate models that is uninformative: it minimally biases predictions with preconceptions. For parameterized models, Jeffreys' uninformative prior, pJ, weights parameter space according to the local density of distinguishable model predictions. While pJ is rigorously justifiable in the limit of infinite data, it is ill-suited to effective theories and sloppy models. In these models, parameters are very poorly constrained by available data, and even the number of parameters is often arbitrary. We use a principled definition of 'uninformative', the mutual information between parameters and their expected data, and study the properties of the prior p* which maximizes it. When data is abundant, p* approaches Jeffreys' prior. With finite data, however, p* is discrete, putting weight on a finite number of atoms in parameter space. In addition, when data is scarce, the prior lies on model boundaries, which in many cases correspond to interpretable models with fewer parameters. As more data becomes available, the prior puts weight on models with more parameters. Thus, p* quantifies the intuition that better data can justify the use of more complex models.
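
    For concreteness (standard definitions consistent with the abstract, not quoted from it): Jeffreys' prior weights parameter space by the Fisher information \mathcal{I}(\theta), while the prior studied here maximizes the mutual information between the parameters \Theta and the expected data X,

      p_J(\theta) \propto \sqrt{\det \mathcal{I}(\theta)}, \qquad p^* = \arg\max_{p}\ I(\Theta; X), \quad I(\Theta; X) = \int p(\theta)\, p(x \mid \theta) \ln \frac{p(x \mid \theta)}{p(x)}\, dx\, d\theta .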

  8. Preliminary Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Prince, F. Andrew; Smart, Christian; Stephens, Kyle; Henrichs, Todd

    2009-01-01

    Parametric cost models are routinely used to plan missions, compare concepts and justify technology investments. However, great care is required. Some space telescope cost models, such as those based only on mass, lack sufficient detail to support such analysis and may lead to inaccurate conclusions. Similarly, using ground-based telescope models, which include the dome cost, will also lead to inaccurate conclusions. This paper reviews current and historical models. Then, based on data from 22 different NASA space telescopes, this paper tests those models and presents preliminary analysis of single- and multi-variable space telescope cost models.

  9. THE FEATURES OF INNOVATIVE ACTIVITY UNDER THE OPEN INNOVATION MODEL

    Directory of Open Access Journals (Sweden)

    Julia P. Kulikova

    2014-01-01

    The article discusses the distinctive characteristics of open and closed models of the functioning of the innovation sphere. The use of an interaction marketing approach to relationship management in the innovation sphere is justified. Two sets of marketing functions, network and process, are proposed for the effective functioning of innovation networks. A matrix scorecard of marketing functions in the innovation network is given.

  10. Islamic vs. conventional banks : Business models, efficiency and stability

    NARCIS (Netherlands)

    Beck, T.H.L.; Demirgüc-Kunt, A.; Merrouche, O.

    2013-01-01

    How different are Islamic banks from conventional banks? Does the recent crisis justify a closer look at the Sharia-compliant business model for banking? When comparing conventional and Islamic banks, controlling for time-variant country-fixed effects, we find few significant differences in business

  11. Inhomogeneous Markov Models for Describing Driving Patterns

    DEFF Research Database (Denmark)

    Iversen, Emil Banning; Møller, Jan K.; Morales, Juan Miguel

    2017-01-01

    Specifically, an inhomogeneous Markov model that captures the diurnal variation in the use of a vehicle is presented. The model is defined by the time-varying probabilities of starting and ending a trip, and is justified due to the uncertainty associated with the use of the vehicle. The model is fitted to data collected from the actual utilization of a vehicle. Inhomogeneous Markov models imply a large number of parameters. The number of parameters in the proposed model is reduced using B-splines.
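
    A minimal sketch of such a model (the state space, hourly resolution, and probability curves are illustrative assumptions): a two-state chain (parked/driving) whose transition probabilities vary with the time of day.

      # Sketch: two-state (0=parked, 1=driving) inhomogeneous Markov chain whose
      # trip-start/trip-end probabilities vary with time of day. Shapes illustrative.
      import numpy as np

      rng = np.random.default_rng(0)
      hours = np.arange(24)
      # Probability of starting a trip in each hour, peaking around 8h and 17h:
      p_start = 0.05 + 0.25 * (np.exp(-0.5 * ((hours - 8) / 1.5) ** 2)
                               + np.exp(-0.5 * ((hours - 17) / 1.5) ** 2))
      p_end = np.full(24, 0.6)   # probability that a trip in progress ends this hour

      state, trajectory = 0, []
      for h in hours:
          if state == 0 and rng.random() < p_start[h]:
              state = 1
          elif state == 1 and rng.random() < p_end[h]:
              state = 0
          trajectory.append(state)
      print(trajectory)

    In practice each hourly probability would be a fitted parameter, which is why the abstract notes that B-splines are used to reduce the parameter count.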

  12. The use of logistic regression in modelling the distributions of bird ...

    African Journals Online (AJOL)

    The method of logistic regression was used to model the observed geographical distribution patterns of bird species in Swaziland in relation to a set of environmental variables. Reporting rates derived from bird atlas data are used as an index of population densities. This is justified in part by the success of the modelling ...
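
    A hedged sketch of the modelling approach described (the data and variable names are invented for illustration): species presence/absence, or a thresholded reporting rate, is regressed on environmental covariates with a logit link.

      # Sketch: logistic regression of species presence on environmental variables.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 300
      X = np.column_stack([rng.uniform(200, 1800, n),    # altitude (m)
                           rng.uniform(400, 1200, n)])   # annual rainfall (mm)
      true_logit = -4.0 + 0.002 * X[:, 0] + 0.002 * X[:, 1]
      present = rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))  # presence/absence

      model = LogisticRegression().fit(X, present)
      print(model.coef_, model.intercept_)
      print(model.predict_proba(X[:3])[:, 1])   # predicted occurrence probabilities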

  14. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify the technique for forming a representation of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends toward strengthening the general-education and worldview functions of computer science define the necessity of additional research of the…

  15. Fitting Multilevel Models with Ordinal Outcomes: Performance of Alternative Specifications and Methods of Estimation

    Science.gov (United States)

    Bauer, Daniel J.; Sterba, Sonya K.

    2011-01-01

    Previous research has compared methods of estimation for fitting multilevel models to binary data, but there are reasons to believe that the results will not always generalize to the ordinal case. This article thus evaluates (a) whether and when fitting multilevel linear models to ordinal outcome data is justified and (b) which estimator to employ…
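
    For reference, the ordinal alternative at issue, in a standard cumulative-logit form with a cluster random effect (notation assumed here, not quoted from the article): rather than treating the ordinal score as continuous, one models

      \Pr(Y_{ij} \le c \mid u_j) = \mathrm{logit}^{-1}(\tau_c - x_{ij}'\beta - u_j), \qquad u_j \sim N(0, \sigma^2_u),

    for categories c = 1, \dots, C-1, with ordered thresholds \tau_1 < \dots < \tau_{C-1}.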

  16. Proposal for product development model focused on ce certification methodology

    Directory of Open Access Journals (Sweden)

    Nathalia Marcia Goulart Pinheiro

    2015-09-01

    This paper presents a critical analysis comparing 21 product development models in order to identify whether these structures meet the demands of Product Certification of the European Community (CE). Furthermore, it presents a product development model comprising the steps in the models analyzed, including improvements in activities for the referred product certification. The proposed improvements are justified by the growing quest for the internationalization of products and processes within companies.

  17. A thermal model of the economy

    Science.gov (United States)

    Arroyo Colon, Luis Balbino

    The motivation for this work came from an interest in Economics (particularly since the 2008 economic downturn) and a desire to use the tools of physics in a field that has not been the subject of great exploration. We propose a model of economics in analogy to thermodynamics and introduce the concept of the Value Multiplier as a fundamental addition to any such model. Firstly, we attempt to make analogies between some economic concepts and fundamental concepts of thermal physics. Then we introduce the value multiplier and justify its existence in our system; the value multiplier allows us to account for some intangible, psychological elements of the value of goods and services. We finally bring all the elements together in a qualitative system. In particular, we attempt to make an analogy with the Keynesian Multiplier that justifies the usefulness of fiscal stimulus in severe economic downturns.
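
    For reference, the textbook Keynesian multiplier to which the value multiplier is compared (a standard result, not taken from the thesis): with marginal propensity to consume c, an initial spending injection \Delta G raises output by

      \Delta Y = \frac{1}{1 - c}\, \Delta G,

    since each round of spending is partially re-spent.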

  18. Discussing the Strehler-Mildvan model of mortality

    Directory of Open Access Journals (Sweden)

    Maxim Finkelstein

    2012-03-01

    BACKGROUND Half a century ago, Strehler and Mildvan (1960) published the seminal paper that, based on some assumptions (postulates), theoretically 'justified' the Gompertz law of mortality. OBJECTIVE We wish to discuss assumptions and limitations of the original Strehler-Mildvan model (as well as of the Strehler-Mildvan correlation) and consider some modifications of and departures from this model. METHODS We use the framework of stochastic point processes for analyzing the original Strehler-Mildvan model. We also suggest the 'lifesaving approach' for describing the departure from rectangularization to shifts in survival curves for human mortality that has been observed in the second half of the previous century. RESULTS We show that the Strehler-Mildvan model can be justified only under the additional assumption that the process of shocks (demands for energy) follows the Poisson pattern. We also suggest a modification that accounts for the oldest-old mortality plateau.
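
    For reference, in the standard formulation (notation assumed here): the Gompertz law and the Strehler-Mildvan correlation are

      \mu(x) = R\, e^{\alpha x}, \qquad \ln R = A - B\,\alpha,

    i.e., the force of mortality grows exponentially with age x, and across populations the log baseline \ln R falls linearly as the slope \alpha rises.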

  19. A mathematical model of star formation in the Galaxy

    Directory of Open Access Journals (Sweden)

    M.A. Sharaf

    2012-06-01

    This paper is generally concerned with star formation in the Galaxy, especially blue stars. Blue stars are the most luminous and massive stars and the largest in radius. A simple mathematical model of the formation of these stars is established and implemented as a computational algorithm. This algorithm enables us to learn more about the formation of such stars. Some real and artificial examples have been used to justify this model.

  20. Multi-Tasking vs. Screening: A Model of Academic Tenure

    OpenAIRE

    Kou, Zonglai; Zhou, Min

    2009-01-01

    The paper develops a model of academic tenure based on multi-tasking and screening. A professor has two tasks, research and teaching. We assume that research performance is easy to measure but teaching performance is immeasurable. Holmstrom and Milgrom's (1991) classical multi-task principal-agent model then implies that the only way for the university to "incentivize" teaching activity is to decrease the incentive power given to research activity. This justifies the low-powered contract ...
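
    A hedged sketch of the logic, in standard multitask notation (mine, not the paper's): the professor allocates efforts (e_r, e_t) to research and teaching, only the research signal x = e_r + \varepsilon is contractible, and pay is linear,

      w = \alpha + \beta x .

    When the two efforts are substitutes in the professor's private cost, raising \beta crowds teaching effort out, so the university optimally chooses a low \beta; tenure can be read as such a low-powered contract.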

  1. Method of modeling the cognitive radio using Opnet Modeler

    OpenAIRE

    Yakovenko, I. V.; Poshtarenko, V. M.; Kostenko, R. V.

    2012-01-01

    This article is a review of the first wireless standard based on cognitive radio networks. The necessity of wireless networks based on cognitive radio technology is discussed. An example of the use of the IEEE 802.22 standard in a WiMAX network was implemented in the Opnet Modeler simulation environment. Charts are presented to check the performance of the HTTP and FTP protocols in the CR network. Simulation results justify the use of the IEEE 802.22 standard in wireless networks.

  2. Spectroscopic factors within an algebraic model and an application to 12C+12C

    International Nuclear Information System (INIS)

    Hess, P. O.; Aguilera, E. F.; Martinez-Quiroz, E.; Algora, A.; Cseh, J.; Draayer, J. P.; Belyaeva, T. L.

    2007-01-01

    A parameterization of the spectroscopic factor is presented which almost matches the ones obtained via the microscopic SU(3) model of the nucleus. A short introduction to the algebraic model (the Semimicroscopic Algebraic Cluster Model, SACM) is given. The parameterization of the spectroscopic factor is explained, justified and compared, for light nuclei, to the microscopic SU(3) model. The applications concern the calculation of the total fusion cross section of 12C+12C.

  3. A random effects meta-analysis model with Box-Cox transformation.

    Science.gov (United States)

    Yamaguchi, Yusuke; Maruo, Kazushi; Partlett, Christopher; Riley, Richard D

    2017-07-19

    In a random effects meta-analysis model, the true treatment effects for each study are routinely assumed to follow a normal distribution. However, normality is a restrictive assumption, and misspecification of the random effects distribution may result in a misleading estimate of the overall mean treatment effect, an inappropriate quantification of heterogeneity across studies, and a wrongly symmetric prediction interval. We focus on problems caused by an inappropriate normality assumption on the random effects distribution, and propose a novel random effects meta-analysis model in which a Box-Cox transformation is applied to the observed treatment effect estimates. The proposed model aims to normalise the overall distribution of observed treatment effect estimates, which is the sum of the within-study sampling distributions and the random effects distribution. When sampling distributions are approximately normal, non-normality in the overall distribution will be mainly due to the random effects distribution, especially when the between-study variation is large relative to the within-study variation. The Box-Cox transformation addresses this flexibly, according to the observed departure from normality. We use a Bayesian approach for estimating parameters in the proposed model, and suggest summarising the meta-analysis results by an overall median, an interquartile range and a prediction interval. The model can be applied to any kind of variable once the treatment effect estimate is defined from the variable. A simulation study suggested that when the overall distribution of treatment effect estimates is skewed, the overall mean and conventional I² from the normal random effects model could be inappropriate summaries, and the proposed model helped reduce this issue. We illustrated the proposed model using two examples, which revealed some important differences in summary results, heterogeneity measures and prediction intervals from the normal random effects model. The
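
    In standard notation (assumed here, not quoted from the paper), the usual normal random effects model and the Box-Cox transformation applied to the observed effect estimates y_i (for positive y) are

      y_i = \theta_i + \varepsilon_i, \quad \varepsilon_i \sim N(0, s_i^2), \qquad \theta_i \sim N(\mu, \tau^2), \qquad g_\lambda(y) = \begin{cases} (y^\lambda - 1)/\lambda, & \lambda \neq 0, \\ \ln y, & \lambda = 0, \end{cases}

    with \lambda estimated from the data so that the transformed estimates g_\lambda(y_i) look approximately normal.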

  4. A catchment-scale irrigation systems model for sugarcane Part 1 ...

    African Journals Online (AJOL)

    In South Africa, the demand for water exceeds available supplies in many catchments. In order to justify existing water requirements and to budget and plan in the context of growing uncertainty regarding water availability, a model to assist in the assessment and management of catchment water supply and demand ...

  5. A singular perturbation theorem for evolution equations and time-scale arguments for structured population models

    NARCIS (Netherlands)

    Greiner, G.; Heesterbeek, J.A.P.; Metz, J.A.J.

    1994-01-01

    In this paper we present a generalization of a finite dimensional singular perturbation theorem to Banach spaces. From this we obtain sufficient conditions under which a faithful simplification by a time-scale argument is justified for age-structured models of slowly growing populations. An explicit
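
    For orientation, the finite-dimensional prototype being generalized (the standard singular perturbation form; notation mine): for the slow-fast system

      \dot u = f(u, v), \qquad \varepsilon\, \dot v = g(u, v),

    the theorem justifies, as \varepsilon \to 0, replacing the fast variable by its quasi-steady state v = h(u) solving g(u, h(u)) = 0, so that the slow dynamics reduce to \dot u = f(u, h(u)).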

  6. Optimising the management of complex dynamic ecosystems. An ecological-economic modelling approach

    NARCIS (Netherlands)

    Hein, L.G.

    2005-01-01

    Keywords: ecological-economic modelling; ecosystem services; resource use; efficient; sustainability; wetlands, rangelands.

  7. A computerized model for integrating the physical environmental factors into metropolitan landscape planning

    Science.gov (United States)

    Julius Gy Fabos; Kimball H. Ferris

    1977-01-01

    This paper justifies and illustrates (in simplified form) a landscape planning approach to the environmental management of the metropolitan landscape. The model utilizes a computerized assessment and mapping system, which exhibits a recent advancement in computer technology that allows for greater accuracy and the weighting of different values when mapping at the...

  8. Development of the human aortic arch system captured in an interactive three-dimensional reference model

    NARCIS (Netherlands)

    Rana, M. Sameer; Sizarov, Aleksander; Christoffels, Vincent M.; Moorman, Antoon F. M.

    2014-01-01

    Variations and mutations in the human genome, such as 22q11.2 microdeletion, can increase the risk for congenital defects, including aortic arch malformations. Animal models are increasingly expanding our molecular and genetic insights into aortic arch development. However, in order to justify

  9. A simple model for atomic layer doped field-effect transistor (ALD-FET) electronic states

    International Nuclear Information System (INIS)

    Mora R, M.E.; Gaggero S, L.M.

    1998-01-01

    We propose a simple potential model based on the Thomas-Fermi approximation to reproduce the main properties of the electronic structure of an atomic layer doped field effect transistor. Preliminary numerical results for a Si-based ALD-FET justify why bound electronic states are not observed in the experiment. (Author)

  10. Development, Implementation, and Evaluation of the Apollo Model of Pediatric Rehabilitation Service Delivery

    Science.gov (United States)

    Camden, Chantal; Swaine, Bonnie; Tetreault, Sylvie; Bergeron, Sophie; Lambert, Carole

    2013-01-01

    This article presents the experience of a rehabilitation program that undertook the challenge to reorganize its services to address accessibility issues and improve service quality. The context in which the reorganization process occurred, along with the relevant literature justifying the need for a new service delivery model, and an historical…

  11. CRITICAL ANALYSIS OF EVALUATION MODEL LOMCE

    Directory of Open Access Journals (Sweden)

    José Luis Bernal Agudo

    2015-06-01

    The evaluation model that the LOMCE projects is rooted in neoliberal beliefs, reflecting a specific way of understanding the world. What matters is not the process but the results, with evaluation at the center of the teaching-learning processes. The model reflects poor planning, since the theory that justifies it is not developed into coherent proposals; there is an excessive concern for excellence, and diversity is left out. A comprehensive way of understanding education should be recovered.

  12. Mathematical Models of Human Hematopoiesis Following Acute Radiation Exposure

    Science.gov (United States)

    2014-05-01

    mediator to the action of known hematopoietic stimulators such as thrombopoietin (TPO) and granulocyte colony-stimulating factor (G-CSF). Thus, in our ... significant stimulatory role of TPO in thrombopoiesis (Kaushansky 2005), it is assumed to represent a large portion of the generic mediator. Thus, known ... biological mechanisms involving TPO are used to justify the effects of the generic mediator in our model. TPO concentration is regulated by platelets

  13. Justifying the design and selection of literacy and thinking tools

    Directory of Open Access Journals (Sweden)

    David Whitehead

    2008-10-01

    Criteria for the design and selection of literacy and thinking tools that allow educators to justify what they do are described within a wider framework of learning theory and research into best practice. Based on a meta-analysis of best practice, results from a three-year project designed to evaluate the effectiveness of a secondary school literacy initiative in New Zealand, together with recent research from cognitive and neuro-psychologists, it is argued that the design and selection of literacy and thinking tools used in elementary schools should be consistent with (i) teaching focused, (ii) learner focused, (iii) thought linked, (iv) neurologically consistent, (v) subject specific, (vi) text linked, (vii) developmentally appropriate, and (viii) assessment linked criteria.

  14. Justifying genetics as a possible legal defence to criminal ...

    African Journals Online (AJOL)

    However, jurisprudence of many criminal cases tends to question whether a person's inherited genes predispose him to violence and further determine his criminal responsibility in law. Under the Nigerian criminal law, the legal test of criminal responsibility is mainly whether the accused person intends the consequence of ...

  15. Justified and Ancient: Pop Music in EFL Classrooms.

    Science.gov (United States)

    Domoney, Liz; Harris, Simon

    1993-01-01

    A teacher training workshop uses linked tasks through which teachers explore the integration of pop music into Mexican secondary school English classes. Rather than being discrete, marginal items, pop music activities are worth linking, elaborating, and treating as more central in a secondary school program. (Contains 10 references.) (Author)

  16. Justifying Non-Violent Civil Disobedience within the Kenyan Context ...

    African Journals Online (AJOL)

    This paper employs the critical and analytical techniques of philosophical reflection to present a moral justification for the use of non-violent civil disobedience by Kenyan citizens in pursuit of their aspirations. It sets out with a brief review of political disobedience in Kenya from the advent of the British invasion and ...

  17. Perinatal health in the Danube region - new birth cohort justified.

    Czech Academy of Sciences Publication Activity Database

    Knudsen, L. E.; Andersen, Z.J.; Šrám, Radim; Braun Kohlová, M.; Gurzau, E.S.; Fucic, A.; Gribaldo, L.; Rössner ml., Pavel; Rössnerová, Andrea; Máca, V.; Zvěřinová, I.; Gajdošová, D.; Moshammer, H.; Rudnai, P.; Ščasný, M.

    2017-01-01

    Roč. 32, 1-2 (2017), s. 9-14 ISSN 2191-0308 Institutional support: RVO:68378041 Keywords : birth cohort * child health * Danube region * environmental exposures Subject RIV: DN - Health Impact of the Environment Quality OBOR OECD: Public and environmental health

  18. A simple framework to justify linear response theory

    International Nuclear Information System (INIS)

    Hairer, Martin; Majda, Andrew J

    2010-01-01

    The use of linear response theory for forced dissipative stochastic dynamical systems through the fluctuation dissipation theorem is an attractive way to study climate change systematically among other applications. Here, a mathematically rigorous justification of linear response theory for forced dissipative stochastic dynamical systems is developed. The main results are formulated in an abstract setting and apply to suitable systems, in finite and infinite dimensions, that are of interest in climate change science and other applications
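
    For context, the statement that linear response theory makes precise (standard form; notation mine): for a small forcing \delta f(s) applied to the system, the change in the expectation of an observable A obeys

      \delta\langle A \rangle(t) = \int_{-\infty}^{t} R(t - s)\, \delta f(s)\, ds + o(\delta f),

    and the fluctuation-dissipation theorem expresses the response function R in terms of correlation functions of the unperturbed dynamics.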

  19. Beyond Baby Doe: Does Infant Transplantation Justify Euthanasia?

    Science.gov (United States)

    Coulter, David L.

    1988-01-01

    The paper examines ethical issues in the transplantation of organs from infants with anencephaly into infants with severe heart and kidney disease. It argues that active euthanasia of infants with anencephaly should be prohibited to safeguard the rights of all persons with severe neurological disabilities. (Author/DB)

  20. Is antenatal screening for rubella and cytomegalovirus justified ...

    African Journals Online (AJOL)

    No congenital rubella infections were detected, while the transplacental transmission rate for CMV was 6,4%. None of the infants followed up was clinically affected at birth or at 6 months. No racial differences in seroprevalences for CMV or rubella immunoglobulin were observed, but immunoglobulin antibody prevalence to ...

  1. 4D ultrasound imaging - ethically justifiable in India?

    Science.gov (United States)

    Indiran, Venkatraman

    2017-01-01

    Four-dimensional (4D) ultrasound (real-time volume sonography), which has been used in the West since the last decade for the determination of gender as well as for bonding and entertainment of the parents, has become widely available in India in this decade. Here, I would like to discuss the ethical issues associated with 4D ultrasonography in India. These are self-referral, the use of the technology for non-medical indications, a higher possibility of the disclosure of the foetus' gender and safety concerns.

  2. Is total glossectomy for advanced caricinoma of the tongue justified ...

    African Journals Online (AJOL)

    Advanced SCC of the tongue is a devastating disease causing severe pain and disorders of speech and swallowing. Total glossectomy (with or without total laryngectomy) and postoperative radiotherapy is a reasonable treatment option, particularly in the developing world setting. It has cure rates superior to primary ...

  3. Is conservative treatment of displaced tibial shaft fractures justified?

    Science.gov (United States)

    Haines, J F; Williams, E A; Hargadon, E J; Davies, D R

    1984-01-01

    All tibial shaft fractures treated at one hospital during a five-year period were studied in a prospective trial. Ninety-one displaced fractures in adults were treated using a conservative policy that included early bone grafting when indicated. Sound bony union was obtained in all cases. Those that healed primarily took on average 16.3 weeks whereas the 24 per cent that required bone grafts took 35.1 weeks. The number of complications, most of which were minor, was considered acceptable. It is concluded that provided early bone grafting is performed when necessary, a basically conservative policy of treatment is satisfactory; bony union of all displaced tibial fractures is achieved in a reasonable period of time.

  4. Justified ethicality: observing desired counterfactuals modifies ethical perceptions and behavior

    NARCIS (Netherlands)

    Shalvi, S.; Dana, J.; Handgraaf, M.J.J.; de Dreu, C.K.W.

    2011-01-01

    Employing a die-under-cup paradigm, we study the extent to which people lie when it is transparently clear they cannot be caught. We asked participants to report the outcome of a private die roll and gain money according to their reports. Results suggest that the degree of lying depends on the

  5. Is selenium supplementation in autoimmune thyroid diseases justified?

    DEFF Research Database (Denmark)

    Winther, Kristian H.; Bonnema, Steen; Hegedüs, Laszlo

    2017-01-01

    PURPOSE OF REVIEW: This review provides an appraisal of recent evidence for or against selenium supplementation in patients with autoimmune thyroid diseases, and discusses possible effect mechanisms. RECENT FINDINGS: Epidemiological data suggest an increased prevalence of autoimmune thyroid diseases under conditions of low dietary selenium intake. Two systematic reviews have evaluated controlled trials among patients with autoimmune thyroiditis and report that selenium supplementation decreases circulating thyroid autoantibodies. The immunomodulatory effects of selenium might involve reducing ... supplementation in the standard treatment of patients with autoimmune thyroiditis or Graves' disease. However, correction of moderate to severe selenium deficiency may offer benefits in preventing, as well as treating, these disorders. Molecular mechanisms have been proposed, but further studies are needed.

  6. Justifying the Ivory Tower: Higher Education and State Economic Growth

    Science.gov (United States)

    Baldwin, J. Norman; McCracken, William A., III

    2013-01-01

    As the U.S. continues to embrace a comprehensive plan for economic recovery, this article investigates the validity of the claim that investing in higher education will help restore state economic growth and prosperity. It presents the findings from a study that indicates that the most consistent predictors of state economic growth related to…

  7. Translating malaria as sumaya: Justified convention or inappropriateness?

    Science.gov (United States)

    Dugas, Marylène; Dubé, Eric; Bibeau, Gilles

    2009-12-01

    In exchanges between health professionals and consultants in the West African context, the word malaria is often replaced by its equivalent in the local dialect. In the Nouna health district of Burkina Faso the term malaria is regularly translated as sumaya. Acknowledging that there may be important epistemological differences between malaria, a term issued from the biomedical epistemology, and sumaya, which is borrowed from traditional medicine epistemology, the possible mismatches between these two terms have been assessed to anticipate problems that may result from their translation by different health stakeholders. By consulting various traditional healers and other members of the communities about the local meaning of the term sumaya, it has been possible to compare the conceptualisation of sumaya to the biomedical conceptualisation of malaria and assess the gap between them. An investigation based on a sample of 13 traditional healers and over 450 individuals from Nouna's health district was conducted to document the meaning of the term sumaya. This paper demonstrates that the generally accepted translation of the word malaria as sumaya is a mistake when one looks at the different systems of belief and representations given to each of these two terms.

  8. Parental Education and Public Reason: Why Comprehensive Enrolment Is Justified

    Science.gov (United States)

    Giesinger, Johannes

    2013-01-01

    Matthew Clayton claims that "comprehensive enrolment"--raising one's children in accordance with one's own conception of the good--is illegitimate. In his argument against comprehensive enrolment, Clayton refers to Rawls's idea of public reason. In a recent response to Clayton, Christina Cameron not only rejects…

  9. Justifying Music Instruction in American Public Schools: A Historical Perspective.

    Science.gov (United States)

    Jorgensen, Estelle R.

    1995-01-01

    Charts the development of music education from early utilitarianism up to its current emphasis on aesthetic value. Recent attempts to pursue music education as an interdisciplinary subject have been limited due to budget cuts. Briefly discusses this financial crisis and suggests some sources of alternative funding. (MJP)

  10. Is post-operative radiation for renal cell carcinoma justified?

    International Nuclear Information System (INIS)

    Aref, Ibrahim; Bociek, R. Gregory; Salhani, Douglas

    1997-01-01

    Purpose: To identify the pattern of failure in patients with resected renal cell carcinoma (RCC). Materials and methods: The records of 116 patients with unilateral, non-hematogenous metastatic RCC who were treated with definitive surgery and referred to the Ottawa Regional Cancer Centre between 1977 and 1988 were reviewed. Distribution by stage included T1 (3 patients), T2 (42 patients) and T3 (71 patients). The median follow-up was 44 months, with a range of 4-267 months. Results: Local regional failure (LRF) developed in 8 patients. Nine patients developed local or regional recurrence plus distant failure. Fifty-eight patients had distant metastases (DM) only. The 7-year actuarial rates for LRF and DM were 12% and 67%, respectively. The overall 7-year actuarial survival rate was 35%, and cause-specific survival was 42%. Conclusions: LRF alone is rare following nephrectomy. DM is the main pattern of failure. These data do not support a role for adjuvant radiation therapy in this disease.

  11. Justifying the Arts: The Value of Illuminating Failures

    Science.gov (United States)

    Forrest, Michelle

    2011-01-01

    This paper revisits how late 20th-century attempts to account for conceptual and other difficult art-work by defining the concept "art" have failed to offer a useful strategy for educators seeking a non-instrumental justification for teaching the arts. It is suggested that this theoretical ground is nonetheless instructive and provides useful…

  12. Super-Obesity in the Elderly: Is Bariatric Surgery Justified?

    Science.gov (United States)

    McGlone, Emma Rose; Bond, Amanda; Reddy, Marcus; Khan, Omar A; Wan, Andrew C

    2015-09-01

    Although the prevalence of obese elderly patients is increasing, the outcomes of bariatric surgery in this potentially high-risk cohort remain poorly understood, especially those relating to quality of life. Furthermore, there are no data on the efficacy of bariatric surgery in the super-obese elderly. We identified 50 consecutive patients undergoing bariatric surgery aged 60 years or over, and compared the outcomes of the super-obese (BMI ≥ 50; n = 26) with those of patients with BMI < 50 (n = 24); quality-of-life outcomes were comparable (Bariatric Analysis and Reporting Outcome System (BAROS) score 3.5 vs. 3.1; p = 0.64).

  13. The Self-Justifying Desire for Happiness | Rodogno | South African ...

    African Journals Online (AJOL)

    In Happiness, Tabensky equates the notion of happiness to Aristotelian eudaimonia. I shall claim that doing so amounts to equating two concepts that moderns cannot conceptually equate, namely, the good for a person and the good person or good life. In §2 I examine the way in which Tabensky deals with this issue and ...

  14. Vitamins for Cardiovascular Diseases: Is the Expense Justified?

    Science.gov (United States)

    Sultan, Sulaiman; Murarka, Shishir; Jahangir, Ahad; Mookadam, Farouk; Tajik, A Jamil; Jahangir, Arshad

    Despite the knowledge that a well-balanced diet provides most of the nutritional requirements, the use of supplemental vitamins is widespread among adults in the United States. Evidence from large randomized controlled trials over the last 2 decades does not support vitamin supplementation for the reduction of cardiovascular risk factors or clinical outcomes. Many of the vitamins used in common practice likely are safe when consumed in small doses, but long-term consumption of megadoses is not only expensive but has the potential to cause adverse effects. Therefore, a need exists to revisit this issue, reminding the public and healthcare providers about the data supporting the use of vitamins for cardiovascular disease, and the potential for harm and the expense associated with their unnecessary use. In this review, we highlight the scientific evidence from randomized controlled studies regarding the efficacy and safety of vitamin supplementation for primary and secondary prevention of cardiovascular diseases and outcomes. We also draw attention to issues related to widespread and indiscriminate use of vitamin supplements and the need to educate the public to curtail unnecessary consumption and expense by limiting their use based on strong scientific evidence.

  15. Legislative prohibitions on wearing a headscarf: are they justified ...

    African Journals Online (AJOL)

    In recent years the headscarf has been described as a symbol of Islam's oppression of women and simultaneously of terrorism. As the debate regarding the acceptability of the headscarf in the modern world continues, an increasing number of states have legislated to ban the wearing of the headscarf. This article critically ...

  16. Corporate governance and banks : How justified is the match?

    NARCIS (Netherlands)

    van der Elst, C.F.

    2015-01-01

    Banks and bank governance are different. We critically assess the arguments used to pervade these divergences in operational activities. We also question if and how, in light of the specificity of banking activities, bank governance translates the operational peculiarities in different governance

  17. Prolonged Mechanical Ventilation (PMV): When is it Justified in ICU?

    Science.gov (United States)

    Trivedi, Trupti H

    2015-10-01

    Over the years, the number of patients requiring prolonged mechanical ventilation (PMV) in the ICU has increased. Trends in the numbers of patients requiring PMV are of interest to health service planners because such patients consume a disproportionate amount of healthcare resources and have high illness costs.1 PMV is defined as the need for invasive mechanical ventilation for 21 consecutive days, for at least 6 hours per day. With improvements in ICU care, more patients survive acute respiratory failure, and with that the number of patients requiring PMV is likely to increase further. In a large multicentric study in the United Kingdom, the incidence of PMV was 4.4 per 100 ICU admissions, and 6.3 per 100 ventilated ICU admissions. These patients also used 29.1% of all general ICU bed days, had a longer hospital stay after ICU discharge than non-PMV patients, and had higher hospital mortality (40.3% vs 33.8%, P = 0.02).2 © Journal of the Association of Physicians of India 2011.

  18. Pre-discharge defibrillation testing: Is it still justified?

    Science.gov (United States)

    Kempa, Maciej; Królak, Tomasz; Drelich, Łukasz; Budrejko, Szymon; Daniłowicz-Szymanowicz, Ludmiła; Lewicka, Ewa; Kozłowski, Dariusz; Raczak, Grzegorz

    2016-01-01

    An implantable cardioverter-defibrillator (ICD) is routinely used to prevent sudden cardiac death. Since the introduction of that device into clinical practice, a defibrillation test (the so-called pre-discharge test [PDT]) has been an inseparable part of the ICD implantation procedure. Recently, the usefulness of PDT has been called into question. The aim of this research was to analyze ICD tests performed within two time periods, 1995-2001 (period I) and 2007-2010 (period II), in order to compare the results of the tests and the solutions to the problems with ICD systems revealed by means of PDT. During period I, 193 tests were performed, among which ICD system malfunction was observed in 16 cases. Those included sensing issues, specifically R-wave undersensing during ventricular fibrillation (VF) (7 patients) and T-wave oversensing (4 patients), as well as high defibrillation threshold (DFT) (2 patients) and ICD-pacemaker interaction (3 patients). During period II, among 561 tests, system malfunction was observed in 15 cases. In 1 patient it was VF undersensing, and in the remaining 14 it was high DFT. All the above problems were solved by means of appropriate ICD reprogramming, repositioning of the endocardial defibrillation lead or implantation of an additional subcutaneous defibrillation lead. Contemporary ICD technical solutions, compared to older systems, in most cases allow sensing problems to be avoided. The key rationale behind ICD testing is the ability to confirm the efficacy of high-voltage therapy. Despite the increasing maximal defibrillation output of devices, and all possible adjustments to the characteristics of the impulse, there is still a group of patients who require additional procedures to ensure appropriate defibrillation efficacy.

  19. British media attacks on homeopathy: are they justified?

    Science.gov (United States)

    Vithoulkas, George

    2008-04-01

    Homeopathy is being attacked by the British media. These attacks draw support from irresponsible and unjustified claims by certain teachers of homeopathy. Such claims include the use of 'dream' and 'imaginative' methods for provings. For prescribing some such teachers attempt to replace the laborious process of matching symptom picture and remedy with spurious theories based on 'signatures', sensations and other methods. Other irresponsible claims have also been made. These "new ideas" risk destroying the principles, theory, and practice of homeopathy.

  20. Is extended biopsy protocol justified in all patients with suspected ...

    African Journals Online (AJOL)

    Objective: To determine the significance of an extended 10-core transrectal biopsy protocol in different categories of patients with suspected prostate cancer using digital guidance. Materials and Methods: We studied 125 men who were being evaluated for prostate cancer. They all had an extended 10-core digitally guided ...

  1. Is the use of nuclear energy ethically justifiable?

    International Nuclear Information System (INIS)

    Feldhaus, S.

    1992-01-01

    If one wants to attain, in a responsible manner, the objective of future energy supply, which consists in meeting to a satisfactory degree the humanly adequate energy demands of a growing population, guided by an estimation of good and evil according to the criteria of social and environmental tolerability, then, based on the current results of reserve and risk comparisons, and taking into account at the same time the economic practicability of the different ways of supplying energy, the most urgent requirement turns out to be the immediate reduction of worldwide CO2 emissions. This requirement, which must be translated into immediate action, takes priority. However, at present a significant CO2 reduction combined with the securing of a sufficient worldwide energy supply can only be achieved by means of various consistent actions, which are presented in this paper in the form of a set. (orig./HSCH) [de]

  2. Philosophical Clarity and Justifying the Scope of Advanced Practice Nursing.

    Science.gov (United States)

    Reed, Pamela G

    2017-01-01

    The United States (US) Department of Veterans Affairs proposed a policy change for nursing practice that would grant full practice authority to advanced practice registered nurses (APRNs) nationwide. In this article, the author briefly explains this proposed policy and explores the relevance and implications of bringing philosophy into policy debates and discussions about the nature and scope of practice.

  3. Justifying Innovative Language Programs in an Environment of ...

    African Journals Online (AJOL)

    In the analysis of the literature that has been written on project management and language issues in development, it attempts to show how the Communication Skills programme could benefit from this knowledge on project management and educational change management in the third millennium. The paper concludes that ...

  4. Polyurethane foam-covered breast implants: a justified choice?

    Science.gov (United States)

    Scarpa, C; Borso, G F; Vindigni, V; Bassetto, F

    2015-01-01

    Although the safety of the polyurethane prosthesis has been the subject of many studies and of professional and public controversy, polyurethane-covered implants are nowadays very popular in plastic surgery for the treatment of capsular contracture. We identified 41 papers (one of which is a communication from the FDA) using search engines such as PubMed, Medline, and eMedicine. Eleven manuscripts have been used for an introduction, and the remaining thirty have been subdivided into three tables whose results have been summarized in three main chapters: (1) capsular formation and contracture, (2) complications, (3) biodegradation and cancer risk. (1) The polyurethanic capsule is a well-defined foreign-body reaction characterized by synovial metaplasia, a thin layer of disarranged collagen fibers and high vascularization. These features make possible a "young" capsule and a low occurrence of capsular contracture even over a long period (10 years); (2) polyurethane implants may be difficult to remove, but there is no evidence that they cause an increase in other complications; (3) there is no evidence of polyurethane-related cancer in long-term studies (after 5 years). Polyurethane foam-covered breast implants remain a valid choice for the treatment of capsular contracture, even if it would be very useful to verify the ease of removal of the prosthesis and to continue investigations on biodegradation products.

  6. Is extended biopsy protocol justified in all patients with suspected ...

    African Journals Online (AJOL)

    2012-01-03

    Jan 3, 2012 ... of both sextant and extended 10-core biopsy protocols at different PSA levels and digital rectal examination (DRE) ... Key words: Biopsy, detection rate, digital rectal examination, extended, prostate cancer, prostate-specific antigen, sextant .... Indications for prostate biopsy were elevated PSA alone in.

  7. Justifying Torture: Explaining Democratic States' Noncompliance with International Humanitarian Law

    Science.gov (United States)

    Kanstroom, Emily

    2007-01-01

    On June 28, 1951, France ratified the 1949 Geneva Conventions, which prohibited the torture of prisoners of war. On August 2, 1955, the United States of America ratified the same document. Between 1954 and 1962, France fought a war against Algeria, which sought its independence from colonial rule. From September 11, 2001 until the present, the…

  8. Justified Illegality?: Controlled clientelism by the Chilean administration

    Directory of Open Access Journals (Sweden)

    Marcelo Moriconi Bezerra

    2011-07-01

    The Chilean civil service is considered one of the most efficient in Latin America. However, different studies describe the informal institutions that operate between the Legislative Power and the bureaucracy to fill positions in the public administration. Although some of these clientelistic practices are against the law, they have been accepted and defended in both the political and scientific spheres. Legality is not considered an important value as long as certain indexes develop positively. In this context, it is important to study how corruption and clientelism have been ignored, or hidden, through political discourses and technical reports on the situation of the bureaucracy. All of this allows a better understanding of why, after 20 years of administrative reforms, damaging practices that negatively affect democracy have not been eradicated.

  9. Is antenatal screening for rubella and cytomegalovirus justified?

    African Journals Online (AJOL)

    Altogether 2 250 asymptomatic pregnant women attending an antenatal clinic were investigated for serological evidence of past exposure to rubella and cytomegalovirus (CMV), as well as for active primary infection or reinfection/reactivation. Only 7 (0.3%) active rubella infections were diagnosed, none of them ...

  10. Octogenarian liver grafts: Is their use for transplant currently justified?

    Science.gov (United States)

    Jiménez-Romero, Carlos; Cambra, Felix; Caso, Oscar; Manrique, Alejandro; Calvo, Jorge; Marcacuzco, Alejandro; Rioja, Paula; Lora, David; Justo, Iago

    2017-05-07

    To analyse the impact of octogenarian donors in liver transplantation. We present a retrospective single-center study, performed between November 1996 and March 2015, that comprises a sample of 153 liver transplants. Recipients were divided into two groups according to liver donor age: recipients of donors ≤ 65 years (group A; n = 102), and recipients of donors ≥ 80 years (group B; n = 51). A comparative analysis between the groups was performed. Quantitative variables were expressed as mean values and SD, and qualitative variables as percentages. Differences between qualitative variables were assessed by the χ2 test. Comparison of quantitative variables was made by t-test. Graft and patient survivals were estimated using the Kaplan-Meier method. One, 3 and 5-year overall patient survival was 87.3%, 84% and 75.2%, respectively, in recipients of younger grafts vs 88.2%, 84.1% and 66.4%, respectively, in recipients of octogenarian grafts (P = 0.748). One, 3 and 5-year overall graft survival was 84.3%, 83.1% and 74.2%, respectively, in recipients of younger grafts vs 84.3%, 79.4% and 64.2%, respectively, in recipients of octogenarian grafts (P = 0.524). After excluding the patients with hepatitis C virus cirrhosis (16 in group A and 10 in group B), the 1, 3 and 5-year patient (P = 0.657) and graft (P = 0.419) survivals were practically the same in both groups. Multivariate Cox regression analysis demonstrated that overall patient survival was adversely affected by cerebrovascular donor death, hepatocarcinoma, and recipient preoperative bilirubin, and overall graft survival was adversely influenced by cerebrovascular donor death and recipient preoperative bilirubin. The standard criteria for utilization of octogenarian liver grafts are: normal gross appearance and consistency, normal or almost normal liver tests, hemodynamic stability with use of 30%.

  11. Justifying scale type for a latent variable: Formative or reflective?

    Science.gov (United States)

    Liu, Hao; Bahron, Arsiah; Bagul, Awangku Hassanal Bahar Pengiran

    2015-12-01

    The study explored the possibility of creating a procedure at the experimental level to double-check whether a manifest variable's scale type is formative or reflective. At present, the criteria for making such a decision depend heavily on researchers' judgment at the conceptual and operational levels. The study created an experimental procedure that appears able to confirm the decisions made by conceptual- and operational-level judgment. The experimental procedure includes the following tests: Variance Inflation Factor (VIF), Tolerance (TOL), Ridge Regression, Cronbach's alpha, Dillon-Goldstein's rho, and the first and second eigenvalues. The procedure considers both the multicollinearity and the consistency of the manifest variables. As a result, the procedure yielded the same judgment as the carefully established decision making at the conceptual and operational levels.
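
    The multicollinearity and consistency checks listed above are straightforward to reproduce. Below is a minimal sketch, assuming a pandas DataFrame `items` holding the manifest-variable scores (the data and variable names are hypothetical, not from the study), that computes VIF/TOL and Cronbach's alpha; high VIF with low alpha would point toward a formative specification, the reverse toward a reflective one.

```python
import numpy as np
import pandas as pd

def vif_and_tol(items: pd.DataFrame) -> pd.DataFrame:
    """Variance inflation factor and tolerance for each manifest variable."""
    out = {}
    for col in items.columns:
        y = items[col].to_numpy(dtype=float)
        X = items.drop(columns=col).to_numpy(dtype=float)
        X = np.column_stack([np.ones(len(X)), X])      # add intercept
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # regress col on the rest
        resid = y - X @ beta
        r2 = 1.0 - resid.var() / y.var()
        out[col] = {"VIF": 1.0 / (1.0 - r2), "TOL": 1.0 - r2}
    return pd.DataFrame(out).T

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal-consistency estimate used to argue for a reflective scale."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)
```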

  12. Creating Fido's twin: can pet cloning be ethically justified?

    Science.gov (United States)

    Fiester, Autumn

    2005-01-01

    Taken at face value, pet cloning may seem at best a frivolous practice, costly both to the cloned pet's health and its owner's pocket. At worst, its critics say, it is misguided and unhealthy--a way of exploiting grief to the detriment of the animal, its owner, and perhaps even animal welfare in general. But if the great pains we are willing to take to clone Fido raise the status of companion animals in the public eye, then the practice might be defensible.

  13. How three Narratives of Modernity justify Economic Inequality

    DEFF Research Database (Denmark)

    Larsen, Christian Albrekt

    2016-01-01

    The acceptance of income differences varies across countries. This article suggests belief in three narratives of modernity to account for this: the "tunnel effect", related to perceptions of generational mobility; the "procedural justice effect", related to the perceived fairness in the process of getting ahead; and the "middle-class effect", related to perceptions of the social structure of society. The importance of the suggested narratives is tested by means of the ISSP 2009 module, which includes 38 countries. The finding is that belief in the three narratives can account for a considerable part of the cross-national variation. Beliefs in procedural justice and the existence of a middle-class society clearly go together with high acceptance of current income differences. The "tunnel effect" is more complex. In general, belief in generational mobility goes together with acceptance...

  14. Is abandoning routine peritoneal cultures during appendectomy justified?

    International Nuclear Information System (INIS)

    Al-Saadi, A.; Al-Wadan, Ali H.; Hamarnah, Samir A.; Amin, H.

    2007-01-01

    The objective was to identify whether there are any advantages to taking a swab from the peritoneal fluid during appendectomy and whether it has any clinical implication for the progress of disease. Records of 160 patients who underwent appendectomy in Saqr Hospital, Rak, United Arab Emirates, from 2003-2005 and had culture and sensitivity from the peritoneal cavity were reviewed retrospectively. The macroscopic picture of the appendix, the microorganisms in peritoneal cultures, the antibiotics given, and the extent to which the results of culture and sensitivity were used were evaluated. Patients with a normal appendix who underwent laparoscopic appendectomy were excluded. Patient age ranged from 4-55 years with a male to female ratio of 4:1; all had prophylactic antibiotics and standard surgical procedures; 60% had a perforated appendix and 13% were gangrenous. The most common organisms cultured were Escherichia coli and Bacteroides; the rate of wound infection was 5%. None of the patients had their course of antibiotics adjusted in response to the result of the swab. Swabs from the peritoneal cavity during appendectomy do not have any clinical advantage, especially with the empiric use of antibiotics and the short hospital stay. (author)

  15. Do PISA data justify PISA-based education policy?

    OpenAIRE

    DE SOUSA LOBO BORGES DE ARAUJO LUISA; SALTELLI ANDREA; SCHNEPF SYLKE

    2015-01-01

    Purpose – Since the publication of its first results in 2000, the Programme for International Student Assessment (PISA) implemented by the OECD has repeatedly been the subject of heated debate. In late 2014 controversy flared up anew, with the most severe critics going so far as to call for a halt to the programme. The purpose of this paper is to discuss the methodological design of PISA and the ideological basis of scientific and policy arguments invoked for and against it. De...

  16. Is Routine Ordering of Both Hemoglobin and Hematocrit Justifiable?

    Science.gov (United States)

    Addison, David J.

    1966-01-01

    In order to assess the value of routine simultaneous hemoglobin and hematocrit determinations, paired determinations in the following groups were studied: (1) 360 consecutive pairs from the hematology laboratory, (2) 95 pairs on general medical patients, (3) 43 pairs on 10 patients with upper gastrointestinal hemorrhage, and (4) 62 pairs on 10 patients with burns. These values were plotted on scatter diagrams. In the 560 pairs only three disparate determinations were found. It is concluded that, in most clinical situations, determination of the hemoglobin or the hematocrit as a screening procedure provides as much useful information as the simultaneous determination of both. PMID:5296947

  17. The Blake-Zisserman model for digital surface models segmentation

    Directory of Open Access Journals (Sweden)

    M. Zanetti

    2013-10-01

    Full Text Available The Blake-Zisserman functional is a second-order variational model for data segmentation. The model is built up of several terms whose nature and interaction make it possible to obtain a smooth approximation of the data that preserves the morphology of constant-gradient areas, which are explicitly detected by partitioning the data with the graphs of two special functions: the edge-detector function, which detects discontinuities of the datum, and the edge/crease-detector function, which also detects discontinuities of the gradient. First, the main features of the model are presented to justify its application to DSMs. It is stressed that the model can yield an almost piecewise-linear approximation of the data. This result is certainly of some interest for the specific application of the model to urban DSMs. Then, an example of its application is presented, and the results are discussed to highlight how the features of the model affect the model outputs. The smooth approximation of the data produced by the model is thought to be a better candidate for further processing. In this sense, the application of the Blake-Zisserman model can be seen as a useful preprocessing step in the chain of DSM processing. Eventually, some perspectives are presented to show promising applications and developments of the Blake-Zisserman model.
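
    For reference, a commonly cited weak form of the second-order functional is sketched below; the weights α, β, μ and the discontinuity sets follow the usual textbook notation and are not taken from this record:

```latex
F(u, K_0, K_1) = \int_{\Omega \setminus (K_0 \cup K_1)} \lvert \nabla^2 u \rvert^2 \, dx
  + \alpha \, \mathcal{H}^{1}(K_0 \cap \Omega)
  + \beta \, \mathcal{H}^{1}\big((K_1 \setminus K_0) \cap \Omega\big)
  + \mu \int_{\Omega} \lvert u - g \rvert^2 \, dx
```

    where g is the datum (here the DSM), K_0 collects jump points (edges), K_1 crease points (gradient discontinuities), and H^1 is the one-dimensional Hausdorff measure; penalizing the Hessian off these sets is what yields the almost piecewise-linear approximation mentioned in the abstract.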

  18. A mathematical model of localized corrosion of steel radioactive waste canisters

    International Nuclear Information System (INIS)

    Sharland, S.M.

    1986-01-01

    A mathematical model of crevice and pitting corrosion which is entirely predictive and self-consistent is described. The model predicts the steady state solution chemistry and electrode kinetics (and hence metal wastage rates) within corrosion cavities as functions of the many parameters on which they depend. By estimating physically realistic input parameters, short-term experimental results are reproduced reasonably accurately. Future work will include a more mechanistically justifiable approach to the selection of these parameters. (author)

  19. A probit-log-skew-normal mixture model for repeated measures data with excess zeros, with application to a cohort study of paediatric respiratory symptoms

    Directory of Open Access Journals (Sweden)

    Johnston Neil W

    2010-06-01

    smaller for asthmatics relative to non-asthmatics throughout the year, whereas there was no difference in the severity of the symptoms between the two groups. Conclusions A positive association was observed between viral infection status and both the probability of experiencing any respiratory symptoms, and their severity during the year. For DAVIS data the random effects probit-log-skew-normal model fits significantly better than the random effects probit-log-normal model, endorsing our parametric choice for the model. The simulation study indicates that our proposed model seems to be robust to misspecification of the distribution of the positive skewed response.
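
    As a rough illustration of the two-part structure described here (a generic hurdle model, not the authors' exact random-effects likelihood), a probit governs whether any symptoms occur and a log-skew-normal governs severity among the positives; a minimal simulation sketch with made-up coefficients:

```python
import numpy as np
from scipy.stats import norm, skewnorm

rng = np.random.default_rng(0)
n = 1000
viral = rng.integers(0, 2, size=n)          # hypothetical viral-infection indicator

# Part 1: probit for P(any respiratory symptoms); coefficients are made up
p_any = norm.cdf(-0.5 + 0.8 * viral)
any_sym = rng.random(n) < p_any

# Part 2: log-skew-normal severity for those with symptoms
log_sev = skewnorm.rvs(a=3.0, loc=0.2 + 0.4 * viral, scale=0.6, random_state=rng)
severity = np.where(any_sym, np.exp(log_sev), 0.0)  # excess zeros arise from part 1
```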

  20. Cost effectiveness of recycling: A systems model

    Energy Technology Data Exchange (ETDEWEB)

    Tonjes, David J., E-mail: david.tonjes@stonybrook.edu [Department of Technology and Society, College of Engineering and Applied Sciences, Stony Brook University, Stony Brook, NY 11794-3560 (United States); Waste Reduction and Management Institute, School of Marine and Atmospheric Sciences, Stony Brook University, Stony Brook, NY 11794-5000 (United States); Center for Bioenergy Research and Development, Advanced Energy Research and Technology Center, Stony Brook University, 1000 Innovation Rd., Stony Brook, NY 11794-6044 (United States); Mallikarjun, Sreekanth, E-mail: sreekanth.mallikarjun@stonybrook.edu [Department of Technology and Society, College of Engineering and Applied Sciences, Stony Brook University, Stony Brook, NY 11794-3560 (United States)

    2013-11-15

    Highlights: • Curbside collection of recyclables reduces overall system costs over a range of conditions. • When avoided costs for recyclables are large, even high collection costs are supported. • When avoided costs for recyclables are not great, there are reduced opportunities for savings. • For common waste compositions, maximizing curbside recyclables collection always saves money. - Abstract: Financial analytical models of waste management systems have often found that recycling costs exceed direct benefits, and in order to economically justify recycling activities, externalities such as household expenses or environmental impacts must be invoked. Certain more empirically based studies have also found that recycling is more expensive than disposal. Other work, both through models and surveys, has found differently. Here we present an empirical systems model, largely drawn from a suburban Long Island municipality. The model accounts for changes in distribution of effort as recycling tonnages displace disposal tonnages, and the seven different cases examined all show that curbside collection programs that manage up to between 31% and 37% of the waste stream should result in overall system savings. These savings accrue partially because of assumed cost differences in tip fees for recyclables and disposed wastes, and also because recycling can result in a more efficient, cost-effective collection program. These results imply that increases in recycling are justifiable due to cost-savings alone, not on more difficult to measure factors that may not impact program budgets.
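
    A stylized version of the trade-off the abstract describes fits in a few lines; the prices below are placeholders, not the study's Long Island figures:

```python
def system_cost(waste_tons, recycle_frac,
                tip_disposal=80.0, tip_recycle=20.0,    # $/ton, hypothetical
                collect_disposal=60.0, collect_recycle=70.0):
    """Total collection + tip cost; savings come from the tip-fee gap."""
    r = waste_tons * recycle_frac
    d = waste_tons * (1 - recycle_frac)
    return (d * (collect_disposal + tip_disposal)
            + r * (collect_recycle + tip_recycle))

base = system_cost(100_000, 0.0)
with_recycling = system_cost(100_000, 0.35)   # near the 31-37% range in the study
print(f"savings: ${base - with_recycling:,.0f}")
```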

  1. Cost effectiveness of recycling: A systems model

    International Nuclear Information System (INIS)

    Tonjes, David J.; Mallikarjun, Sreekanth

    2013-01-01

    Highlights: • Curbside collection of recyclables reduces overall system costs over a range of conditions. • When avoided costs for recyclables are large, even high collection costs are supported. • When avoided costs for recyclables are not great, there are reduced opportunities for savings. • For common waste compositions, maximizing curbside recyclables collection always saves money. - Abstract: Financial analytical models of waste management systems have often found that recycling costs exceed direct benefits, and in order to economically justify recycling activities, externalities such as household expenses or environmental impacts must be invoked. Certain more empirically based studies have also found that recycling is more expensive than disposal. Other work, both through models and surveys, has found differently. Here we present an empirical systems model, largely drawn from a suburban Long Island municipality. The model accounts for changes in distribution of effort as recycling tonnages displace disposal tonnages, and the seven different cases examined all show that curbside collection programs that manage up to between 31% and 37% of the waste stream should result in overall system savings. These savings accrue partially because of assumed cost differences in tip fees for recyclables and disposed wastes, and also because recycling can result in a more efficient, cost-effective collection program. These results imply that increases in recycling are justifiable due to cost-savings alone, not on more difficult to measure factors that may not impact program budgets.

  2. Cost effectiveness of recycling: a systems model.

    Science.gov (United States)

    Tonjes, David J; Mallikarjun, Sreekanth

    2013-11-01

    Financial analytical models of waste management systems have often found that recycling costs exceed direct benefits, and in order to economically justify recycling activities, externalities such as household expenses or environmental impacts must be invoked. Certain more empirically based studies have also found that recycling is more expensive than disposal. Other work, both through models and surveys, has found differently. Here we present an empirical systems model, largely drawn from a suburban Long Island municipality. The model accounts for changes in distribution of effort as recycling tonnages displace disposal tonnages, and the seven different cases examined all show that curbside collection programs that manage up to between 31% and 37% of the waste stream should result in overall system savings. These savings accrue partially because of assumed cost differences in tip fees for recyclables and disposed wastes, and also because recycling can result in a more efficient, cost-effective collection program. These results imply that increases in recycling are justifiable due to cost-savings alone, not on more difficult to measure factors that may not impact program budgets. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Improving practical atmospheric dispersion models

    International Nuclear Information System (INIS)

    Hunt, J.C.R.; Hudson, B.; Thomson, D.J.

    1992-01-01

    The new generation of practical atmospheric dispersion models (for short range, ≤ 30 km) is based on dispersion science and boundary-layer meteorology that have widespread international acceptance. In addition, recent improvements in computing and the widespread availability of small powerful computers make it possible to have new regulatory models which are more complex than the previous generation, which were based on charts and simple formulae. This paper describes the basis of these models and how they have developed. Such models are needed to satisfy the urgent public demand for sound, justifiable and consistent environmental decisions. For example, it is preferable that the same models be used to simulate dispersion in different industries; in many countries at present different models are used for emissions from nuclear and fossil-fuel power stations. The models should not be so simple as to be suspect, but neither should they be too complex for widespread use; for example, at public inquiries in Germany, where simple models are mandatory, it is becoming usual to cite the results from highly complex computational models because the simple models are not credible. This paper is written in a schematic style with an emphasis on tables and diagrams. (au) (22 refs.)

  4. Overdeepening development in a glacial landscape evolution model with quarrying

    DEFF Research Database (Denmark)

    Ugelvig, Sofie Vej; Egholm, D.L.; Iverson, Neal R.

    In glacial landscape evolution models, subglacial erosion rates are often related to basal sliding or ice discharge by a power-law. This relation can be justified when considering bed abrasion, where rock debris transported in the basal ice drives erosion. However, the relation is not well ... introduced a new model for subglacial erosion by quarrying that operates from the theory of adhesive wear. The model is based on the fact that cavities, with a high level of bedrock differential stress, form in the lee of bed obstacles when the sliding velocity is too high to allow for the ice to creep...

  5. Parity-specific and two-sex utility models of reproductive intentions.

    Science.gov (United States)

    Fried, E S; Hofferth, S L; Udry, J R

    1980-02-01

    This paper uses married couples' anticipated consequences of having a (another) child to predict their reproductive intentions. Parity-specific models identify different variables as predictors of reproductive behavior at different parities but do not yield interpretable patterns of difference by parity. Parity-specific models are not significantly stronger predictors of reproductive behavior. Generally, wife-only models are distinctly superior to husband-only models. Two-sex models are usually better predictors than one-sex models but not enough better to justify the additional cost.

  6. Mesoscopic and continuum modelling of angiogenesis

    KAUST Repository

    Spill, F.

    2014-03-11

    Angiogenesis is the formation of new blood vessels from pre-existing ones in response to chemical signals secreted by, for example, a wound or a tumour. In this paper, we propose a mesoscopic lattice-based model of angiogenesis, in which processes that include proliferation and cell movement are considered as stochastic events. By studying the dependence of the model on the lattice spacing and the number of cells involved, we are able to derive the deterministic continuum limit of our equations and compare it to similar existing models of angiogenesis. We further identify conditions under which the use of continuum models is justified, and others for which stochastic or discrete effects dominate. We also compare different stochastic models for the movement of endothelial tip cells which have the same macroscopic, deterministic behaviour, but lead to markedly different behaviour in terms of production of new vessel cells. © 2014 Springer-Verlag Berlin Heidelberg.
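
    The kind of comparison the authors make, a stochastic individual-based process against its deterministic limit, can be illustrated with a much simpler birth-death process (a toy stand-in, not their angiogenesis model): for small populations the trajectories fluctuate strongly around the ODE solution, while for large carrying capacity the continuum description dominates, which is the same qualitative criterion derived in the paper for vessel-cell numbers.

```python
import numpy as np

def gillespie_logistic(n0=10, K=200, b=1.0, t_end=10.0, seed=1):
    """Stochastic logistic birth-death; its large-K limit is dN/dt = b*N*(1 - N/K)."""
    rng = np.random.default_rng(seed)
    t, n = 0.0, n0
    ts, ns = [0.0], [n0]
    while t < t_end and n > 0:
        birth = b * n                       # birth propensity
        death = b * n * n / K               # crowding-induced death propensity
        total = birth + death
        t += rng.exponential(1.0 / total)   # Gillespie waiting time
        n += 1 if rng.random() < birth / total else -1
        ts.append(t)
        ns.append(n)
    return np.array(ts), np.array(ns)
```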

  7. Uncertainty, sensitivity analysis and the role of data based mechanistic modeling in hydrology

    Science.gov (United States)

    Ratto, M.; Young, P. C.; Romanowicz, R.; Pappenberger, F.; Saltelli, A.; Pagano, A.

    2007-05-01

    calibration of mechanistic hydrological models, making their properties more transparent. It also helps to highlight possible mis-specification problems, if these are identified. The results of the exercise show that the two modelling methodologies have good synergy; combining well to produce a complete joint modelling approach that has the kinds of checks-and-balances required in practical data-based modelling of rainfall-flow systems. Such a combined approach also produces models that are suitable for different kinds of application. As such, the DBM model considered in the paper is developed specifically as a vehicle for flow and flood forecasting (although the generality of DBM modelling means that a simulation version of the model could be developed if required); while TOPMODEL, suitably calibrated (and perhaps modified) in the light of the DBM and GSA results, immediately provides a simulation model with a variety of potential applications, in areas such as catchment management and planning.

  8. A Model of the Perception of Facial Expressions of Emotion by Humans: Research Overview and Perspectives.

    Science.gov (United States)

    Martinez, Aleix; Du, Shichuan

    2012-05-01

    In cognitive science and neuroscience, there have been two leading models describing how humans perceive and classify facial expressions of emotion: the continuous and the categorical model. The continuous model defines each facial expression of emotion as a feature vector in a face space. This model explains, for example, how expressions of emotion can be seen at different intensities. In contrast, the categorical model consists of C classifiers, each tuned to a specific emotion category. This model explains, among other findings, why the images in a morphing sequence between a happy and a surprise face are perceived as either happy or surprise but not something in between. While the continuous model has a more difficult time justifying this latter finding, the categorical model is not as good when it comes to explaining how expressions are recognized at different intensities or modes. Most importantly, both models have problems explaining how one can recognize combinations of emotion categories such as happily surprised versus angrily surprised versus surprise. To resolve these issues, in the past several years, we have worked on a revised model that justifies the results reported in the cognitive science and neuroscience literature. This model consists of C distinct continuous spaces. Multiple (compound) emotion categories can be recognized by linearly combining these C face spaces. The dimensions of these spaces are shown to be mostly configural. According to this model, the major task for the classification of facial expressions of emotion is precise, detailed detection of facial landmarks rather than recognition. We provide an overview of the literature justifying the model, show how the resulting model can be employed to build algorithms for the recognition of facial expression of emotion, and propose research directions for machine learning and computer vision researchers to keep pushing the state of the art in these areas. We also discuss how the model can ...

  9. Modeling axisymmetric flow and transport

    Science.gov (United States)

    Langevin, C.D.

    2008-01-01

    Unmodified versions of common computer programs such as MODFLOW, MT3DMS, and SEAWAT that use Cartesian geometry can accurately simulate axially symmetric ground water flow and solute transport. Axisymmetric flow and transport are simulated by adjusting several input parameters to account for the increase in flow area with radial distance from the injection or extraction well. Logarithmic weighting of interblock transmissivity, a standard option in MODFLOW, can be used for axisymmetric models to represent the linear change in hydraulic conductance within a single finite-difference cell. Results from three test problems (ground water extraction, an aquifer push-pull test, and upconing of saline water into an extraction well) show good agreement with analytical solutions or with results from other numerical models designed specifically to simulate the axisymmetric geometry. Axisymmetric models are not commonly used but can offer an efficient alternative to full three-dimensional models, provided the assumption of axial symmetry can be justified. For the upconing problem, the axisymmetric model was more than 1000 times faster than an equivalent three-dimensional model. Computational gains with the axisymmetric models may be useful for quickly determining appropriate levels of grid resolution for three-dimensional models and for estimating aquifer parameters from field tests.
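
    The parameter adjustment the abstract refers to is essentially a per-column scaling of properties by the circumference at each radial distance; a hedged sketch of the idea (the grid and conductivity values are hypothetical, not from the paper):

```python
import numpy as np

# Radial column edges (m) for a Cartesian grid reinterpreted as axisymmetric
r_edges = np.logspace(-1, 3, 41)              # 0.1 m to 1000 m, hypothetical
r_mid = 0.5 * (r_edges[:-1] + r_edges[1:])    # column centers

# Multiply K, storage, and source terms by 2*pi*r so each 1-m-thick Cartesian
# row carries the flow of the full annulus at that radius.
scale = 2.0 * np.pi * r_mid
K_axi = 10.0 * scale                          # K = 10 m/d, hypothetical
storage_axi = 1e-4 * scale                    # specific storage, hypothetical
```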

  10. Modelling the degradation of condition indices

    Energy Technology Data Exchange (ETDEWEB)

    Hoskins, R.P.; Strbac, G. [UMIST, Manchester (United Kingdom). Dept. of Electrical Engineering and Electronics; Brint, A.T. [University of Salford (United Kingdom). Dept. of Computer and Mathematical Sciences

    1999-07-01

    To ensure that the electricity distribution network achieves satisfactory levels of reliability and safety it is necessary to have well-founded and justifiable asset management policies in place. For many items of electricity distribution equipment, failures have been rare and inferences about future lifetimes are difficult to make. The paper argues that in such situations, importance should be given to obtaining condition information to aid asset management. A key requirement in using such data is the capability to model changes in an item's condition and the Markov model is proposed as being particularly suitable. The application of the Markov model to a collection of oil condition data, recorded during maintenance activity, is explained. The impact of such a model in making asset management decisions is then illustrated. (author)
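
    A minimal discrete-time version of the proposed Markov approach: condition states, a transition matrix estimated from inspection data, and projection of the state distribution forward in time (the matrix and trigger below are illustrative only, not the paper's oil-condition data):

```python
import numpy as np

# States: 0 = good, 1 = fair, 2 = poor, 3 = unserviceable (absorbing)
P = np.array([[0.90, 0.08, 0.02, 0.00],
              [0.00, 0.85, 0.12, 0.03],
              [0.00, 0.00, 0.80, 0.20],
              [0.00, 0.00, 0.00, 1.00]])   # hypothetical annual transitions

state = np.array([1.0, 0.0, 0.0, 0.0])     # asset population starts in "good"
for year in range(1, 31):
    state = state @ P                      # propagate condition distribution
    if state[3] > 0.10:                    # e.g. intervene once >10% unserviceable
        print(f"replacement indicated by year {year}")
        break
```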

  11. Innovation Production Models

    Directory of Open Access Journals (Sweden)

    Tamam N. Guseinova

    2016-01-01

    Full Text Available The article is dedicated to the study of models of the production of innovations at the enterprise and state levels. The shift towards a new technology wave induces a change in systems of division of labour as well as the establishment of new forms of cooperation, which are reflected both in the theory and in the practice of innovation policy and management. Within the scope of the research question we have studied different generations of the innovation process, starting with simple linear models - "technology push" and "market pull" - and ending with a complex integrated model of open innovations. There are two organizational models of innovation production at the enterprise level: start-ups in the early stages of their development and ambidextrous organizations. The former are prone to linear models of the innovation process, while the latter create innovations within more sophisticated inclusive processes. Companies that effectuate reciprocal ambidexterity stand out from all the rest, since together with start-ups, research and development centres, elements of innovation infrastructure and other economic agents operating in the same value chain they constitute the core of the most advanced forms of national innovation systems, namely Triple Helix and Quadruple Helix systems. National innovation systems - models of innovation production at the state level - evolve into systems with a more profound division of labour that enable "line production" of innovations. These tendencies are closely related to the advent and development of the concept of serial entrepreneurship, which transforms entrepreneurship into a new type of profession. International experience proves this concept to be efficient in various parts of the world. Nevertheless, the use of the above-mentioned models and concepts in a national innovation system should be justified by the socioeconomic conditions of economic regions, since they determine the efficiency of implementation of certain innovation processes and ...

  12. Mathematical models of gas-dynamic and thermophysical processes in underground coal mining at different stages of mine development

    OpenAIRE

    М. В. Грязев; Н. М. Качурин; С. А. Воробьев

    2017-01-01

    New trends have been traced and the existing ones refined regarding filtration and diffusive motion of gases in coal beds and surrounding rock, spontaneous heating of coal and transport of gas traces by ventilation currents in operating coal mines. Mathematical models of gas-dynamic and thermophysical processes inside underworked territories after mine abandonment have been justified. Mathematical models are given for feasible air feeding of production and development areas, as well as for th...

  13. Modeling of the water gap in BWR fuel elements using SCALE/TRITON; Modellierung des Wasserspalts bei SWR-BE mit SCALE/TRITON

    Energy Technology Data Exchange (ETDEWEB)

    Tittelbach, S.; Chernykh, M. [WTI Wissenschaftlich-Technische Ingenieurberatung GmbH, Juelich (Germany)

    2012-11-01

    The authors show that adequate modeling of the water gap in BWR fuel element models using the code TRITON requires explicit consideration of the Dancoff factors. The analysis of three modeling options reveals that, when the moderating effect of the water-gap coolant for the peripheral fuel elements is taken into account, the resulting deviations of the U-235 and Pu-239 concentrations are significantly reduced. The increased computation times are justified with respect to the burnup credits for criticality safety analyses.

  14. Modelling of nonlinear shoaling based on stochastic evolution equations

    DEFF Research Database (Denmark)

    Kofoed-Hansen, Henrik; Rasmussen, Jørgen Hvenekær

    1998-01-01

    are recast into evolution equations for the complex amplitudes, and serve as the underlying deterministic model. Next, a set of evolution equations for the cumulants is derived. By formally introducing the well-known Gaussian closure hypothesis, nonlinear evolution equations for the power spectrum...... with experimental data in four different cases as well as with the underlying deterministic model. In general, the agreement is found to be acceptable, even far beyond the region where Gaussianity (Gaussian sea state) may be justified. (C) 1998 Elsevier Science B.V....

  15. ANALYSIS MODEL FOR INVENTORY MANAGEMENT

    Directory of Open Access Journals (Sweden)

    CAMELIA BURJA

    2010-01-01

    Full Text Available Inventory represents an essential component of enterprise assets, and economic analysis gives it special importance because accurate inventory management determines the achievement of the business objective and the financial results. Efficient inventory management requires ensuring an optimal inventory level, which guarantees the normal functioning of the activity with minimal inventory expenses and immobilised funds. The paper presents an analysis model for inventory management based on inventory turnover and its correlation with sales volume, illustrated in an appropriate case study. Highlighting the factors that influence efficient inventory management provides the information needed to justify managerial decisions, which will lead to a balanced financial position and to increased company performance.
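
    The turnover-based analysis described reduces to two standard ratios; a small sketch with placeholder figures (not from the study):

```python
def inventory_turnover(cogs, avg_inventory):
    """Rotation speed: how many times inventory is sold and replaced per year."""
    return cogs / avg_inventory

def days_inventory(turnover):
    """Average number of days capital stays immobilised in inventory."""
    return 365.0 / turnover

t = inventory_turnover(cogs=1_200_000, avg_inventory=200_000)  # placeholder values
print(t, days_inventory(t))   # 6 turns/year -> ~61 days of inventory
```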

  16. Modelling Investment Attractiveness of a Public Joint Stock Company as the Basis of Managerial Decision Making

    OpenAIRE

    Brukhovetskaya Natalia E.; Khasanova Olena V.

    2014-01-01

    The article analyses the factors influencing the investment attractiveness of a public joint-stock company, classified by their sphere of origin. It identifies the degree and direction of influence of these factors upon the investment attractiveness of a public joint-stock company; the factors are divided into two groups: those that can be regulated directly by the company and those that cannot. It justifies the necessity of modelling the investment attractiveness of a ...

  17. Efficient Work Team Scheduling: Using Psychological Models of Knowledge Retention to Improve Code Writing Efficiency

    Directory of Open Access Journals (Sweden)

    Michael J. Pelosi

    2014-12-01

    Full Text Available Development teams and programmers must retain critical information about their work during work intervals and gaps in order to improve future performance when work resumes. Despite time lapses, project managers want to maximize coding efficiency and effectiveness. By developing a mathematically justified, practically useful, and computationally tractable quantitative and cognitive model of learning and memory retention, this study establishes calculations designed to maximize scheduling payoff and optimize developer efficiency and effectiveness.
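
    One standard way to quantify the retention being modeled is an exponential (Ebbinghaus-style) forgetting curve, with scheduling payoff computed by discounting velocity after work gaps; the curve parameters and threshold below are assumptions for illustration, not the paper's fitted model:

```python
import math

def retention(days_since_work, stability=14.0):
    """Ebbinghaus-style forgetting: fraction of project context retained."""
    return math.exp(-days_since_work / stability)

def effective_velocity(base_velocity, gap_days):
    """Discount coding velocity by retained context after a work gap."""
    return base_velocity * retention(gap_days)

# Under these assumptions a 3-week gap leaves under a quarter of base velocity:
print(effective_velocity(10.0, gap_days=21))   # ~2.23
```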

  18. Statistical tests for equal predictive ability across multiple forecasting methods

    DEFF Research Database (Denmark)

    Borup, Daniel; Thyrsgaard, Martin

    We develop a multivariate generalization of the Giacomini-White tests for equal conditional predictive ability. The tests are applicable to a mixture of nested and non-nested models, incorporate estimation uncertainty explicitly, and allow for misspecification of the forecasting model as well as ...
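
    The simplest member of this family of tests, the unconditional pairwise Diebold-Mariano comparison that the Giacomini-White framework generalizes, can be sketched as follows (squared-error loss, and the HAC variance correction needed for multi-step forecasts is deliberately omitted):

```python
import numpy as np
from scipy import stats

def dm_test(e1, e2):
    """Diebold-Mariano test of equal predictive ability for two forecast
    error series e1, e2 (numpy arrays), using squared-error loss."""
    d = e1**2 - e2**2                       # loss differential series
    dbar = d.mean()
    se = d.std(ddof=1) / np.sqrt(len(d))    # naive (non-HAC) standard error
    t = dbar / se
    p = 2 * (1 - stats.norm.cdf(abs(t)))    # two-sided asymptotic p-value
    return t, p
```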

  19. On a Quantum Model of Brain Activities

    Science.gov (United States)

    Fichtner, K.-H.; Fichtner, L.; Freudenberg, W.; Ohya, M.

    2010-01-01

    One of the main activities of the brain is the recognition of signals. A first attempt to explain the process of recognition in terms of quantum statistics was given in [6]. Subsequently, details of the mathematical model were presented in a (still incomplete) series of papers (cf. [7, 2, 5, 10]). In the present note we want to give a general view of the principal ideas of this approach. We will introduce the basic spaces and justify the choice of spaces and operations. Further, we bring the model face to face with basic postulates any statistical model of the recognition process should fulfill. These postulates are in accordance with the opinion widely accepted in psychology and neurology.

  20. Proposta de um Modelo Conceitual de Valor de Marca na Nova Lógica de Serviços

    The Proposal of a Conceptual Model of Brand Equity in the New Logic of Services

    Propuesta de un Modelo Conceptual de Valor de la Marca en La Nueva Lógica de Servicios

    OpenAIRE

    GONÇALVES, Livia Castro D'Avila; GARRIDO, Ivan Lapuente; DAMACENA, Cláudio

    2010-01-01

    ABSTRACT The study of brand equity is considered one of the central points of strategic marketing management (WEBSTER JUNIOR, 2005). Much emphasis has been given to studies of this topic in relation to services, since services present characteristics that differentiate them from products (BERRY, 2000). Furthermore, Vargo and Lusch (2004a) propose a new way of studying services, inspired by the shift from a logic in which tangible goods are the central point to a logic in which the intangible aspects ...

  1. Semi-analytical model for a slab one-dimensional photonic crystal

    Science.gov (United States)

    Libman, M.; Kondratyev, N. M.; Gorodetsky, M. L.

    2018-02-01

    In our work we justify the applicability of a dielectric mirror model to the description of a real photonic crystal. We demonstrate that a simple one-dimensional model of a multilayer mirror can be employed for the modeling of a slab waveguide with periodically changing width. It is shown that this width change can be recast as an effective refractive-index modulation. The applicability of the transfer-matrix method to the calculation of reflection properties is demonstrated. Finally, our 1-D model is employed to analyze the reflection properties of a 2-D structure - a slab photonic crystal with a number of elliptic holes.
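
    The transfer-matrix calculation relied on here is compact enough to sketch for normal incidence; the effective indices below are hypothetical stand-ins for the width-modulated waveguide sections, not the paper's values:

```python
import numpy as np

def layer_matrix(n, d, lam):
    """Characteristic 2x2 matrix of one dielectric layer at normal incidence."""
    delta = 2 * np.pi * n * d / lam
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def reflectance(layers, lam, n_in=1.0, n_out=1.5):
    """Stack reflectance; `layers` is a list of (index, thickness) pairs."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ layer_matrix(n, d, lam)
    B = M[0, 0] + M[0, 1] * n_out
    C = M[1, 0] + M[1, 1] * n_out
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# 10-period quarter-wave stack at 1550 nm with hypothetical effective indices
lam = 1.55e-6
stack = [(2.0, lam / (4 * 2.0)), (1.7, lam / (4 * 1.7))] * 10
print(reflectance(stack, lam))
```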

  2. Modeling density-driven flow in porous media principles, numerics, software

    CERN Document Server

    Holzbecher, Ekkehard O

    1998-01-01

    Modeling of flow and transport in groundwater has become an important focus of scientific research in recent years. Most contributions to this subject deal with flow situations where density and viscosity changes in the fluid are neglected. This restriction may not always be justified. The models presented in the book demonstrate impressively that the flow pattern may be completely different when density changes are taken into account. The main applications of the models are: thermal and saline convection, geothermal flow, saltwater intrusion, flow through salt formations, etc. This book not only presents the basic theory; the reader can also test his or her knowledge by applying the included software and can set up his or her own models.

  3. A Numerical-Analytical Approach to Modeling the Axial Rotation of the Earth

    Science.gov (United States)

    Markov, Yu. G.; Perepelkin, V. V.; Rykhlova, L. V.; Filippova, A. S.

    2018-04-01

    A model for the non-uniform axial rotation of the Earth is studied using a celestial-mechanical approach and numerical simulations. The application of an approximate model containing a small number of parameters to predict variations of the axial rotation velocity of the Earth over short time intervals is justified. This approximate model is obtained by averaging variable parameters that are subject to small variations due to non-stationarity of the perturbing factors. The model is verified and compared with predictions over a long time interval published by the International Earth Rotation and Reference Systems Service (IERS).

  4. Basic concepts of kinematic-wave models

    Science.gov (United States)

    Miller, J.E.

    1984-01-01

    The kinematic-wave model is one of a number of approximations of the dynamic-wave model. The dynamic-wave model describes one-dimensional shallow-water waves (unsteady, gradually varied, open-channel flow). This report provides a basic reference on the theory and applications of the kinematic-wave model and describes the limitations of the model in relation to the other approximations of the dynamic-wave model. In the kinematic-wave approximation, a number of the terms in the equation of motion are assumed to be insignificant. The equation of motion is replaced by an equation describing uniform flow. Thus, the kinematic-wave model is described by the continuity equation and a uniform-flow equation such as the well-known Chezy or Manning formulas. Kinematic-wave models are applicable to overland flow where lateral inflow is continuously added and is a large part of the total flow. For channel-routing applications, the kinematic-wave model always predicts a steeper wave with less dispersion and attenuation than actually occurs. The effect of the accumulation of errors in the kinematic-wave model shows that the approximations made in the development of the kinematic-wave equations are not generally justified for most channel-routing applications. Modified flow-routing models can be used which help to stop the accumulation of errors that occur when the kinematic-wave model is applied.
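
    In the usual notation, the approximation pairs the continuity equation with a uniform-flow rating (a standard statement consistent with the report's description):

```latex
\frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = q,
\qquad Q = \alpha A^{m},
```

    where A is the flow area, Q the discharge, q the lateral inflow per unit length, and the rating Q = αA^m follows from a uniform-flow law such as Manning's formula (m ≈ 5/3 for a wide channel).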

  5. The Separate Spheres Model of Gendered Inequality.

    Directory of Open Access Journals (Sweden)

    Andrea L Miller

    Full Text Available Research on role congruity theory and descriptive and prescriptive stereotypes has established that when men and women violate gender stereotypes by crossing spheres, with women pursuing career success and men contributing to domestic labor, they face backlash and economic penalties. Less is known, however, about the types of individuals who are most likely to engage in these forms of discrimination and the types of situations in which this is most likely to occur. We propose that psychological research will benefit from supplementing existing research approaches with an individual differences model of support for separate spheres for men and women. This model allows psychologists to examine individual differences in support for separate spheres as they interact with situational and contextual forces. The separate spheres ideology (SSI has existed as a cultural idea for many years but has not been operationalized or modeled in social psychology. The Separate Spheres Model presents the SSI as a new psychological construct characterized by individual differences and a motivated system-justifying function, operationalizes the ideology with a new scale measure, and models the ideology as a predictor of some important gendered outcomes in society. As a first step toward developing the Separate Spheres Model, we develop a new measure of individuals' endorsement of the SSI and demonstrate its reliability, convergent validity, and incremental predictive validity. We provide support for the novel hypotheses that the SSI predicts attitudes regarding workplace flexibility accommodations, income distribution within families between male and female partners, distribution of labor between work and family spheres, and discriminatory workplace behaviors. Finally, we provide experimental support for the hypothesis that the SSI is a motivated, system-justifying ideology.

  6. The Separate Spheres Model of Gendered Inequality

    Science.gov (United States)

    Miller, Andrea L.; Borgida, Eugene

    2016-01-01

    Research on role congruity theory and descriptive and prescriptive stereotypes has established that when men and women violate gender stereotypes by crossing spheres, with women pursuing career success and men contributing to domestic labor, they face backlash and economic penalties. Less is known, however, about the types of individuals who are most likely to engage in these forms of discrimination and the types of situations in which this is most likely to occur. We propose that psychological research will benefit from supplementing existing research approaches with an individual differences model of support for separate spheres for men and women. This model allows psychologists to examine individual differences in support for separate spheres as they interact with situational and contextual forces. The separate spheres ideology (SSI) has existed as a cultural idea for many years but has not been operationalized or modeled in social psychology. The Separate Spheres Model presents the SSI as a new psychological construct characterized by individual differences and a motivated system-justifying function, operationalizes the ideology with a new scale measure, and models the ideology as a predictor of some important gendered outcomes in society. As a first step toward developing the Separate Spheres Model, we develop a new measure of individuals’ endorsement of the SSI and demonstrate its reliability, convergent validity, and incremental predictive validity. We provide support for the novel hypotheses that the SSI predicts attitudes regarding workplace flexibility accommodations, income distribution within families between male and female partners, distribution of labor between work and family spheres, and discriminatory workplace behaviors. Finally, we provide experimental support for the hypothesis that the SSI is a motivated, system-justifying ideology. PMID:26800454

  7. The Separate Spheres Model of Gendered Inequality.

    Science.gov (United States)

    Miller, Andrea L; Borgida, Eugene

    2016-01-01

    Research on role congruity theory and descriptive and prescriptive stereotypes has established that when men and women violate gender stereotypes by crossing spheres, with women pursuing career success and men contributing to domestic labor, they face backlash and economic penalties. Less is known, however, about the types of individuals who are most likely to engage in these forms of discrimination and the types of situations in which this is most likely to occur. We propose that psychological research will benefit from supplementing existing research approaches with an individual differences model of support for separate spheres for men and women. This model allows psychologists to examine individual differences in support for separate spheres as they interact with situational and contextual forces. The separate spheres ideology (SSI) has existed as a cultural idea for many years but has not been operationalized or modeled in social psychology. The Separate Spheres Model presents the SSI as a new psychological construct characterized by individual differences and a motivated system-justifying function, operationalizes the ideology with a new scale measure, and models the ideology as a predictor of some important gendered outcomes in society. As a first step toward developing the Separate Spheres Model, we develop a new measure of individuals' endorsement of the SSI and demonstrate its reliability, convergent validity, and incremental predictive validity. We provide support for the novel hypotheses that the SSI predicts attitudes regarding workplace flexibility accommodations, income distribution within families between male and female partners, distribution of labor between work and family spheres, and discriminatory workplace behaviors. Finally, we provide experimental support for the hypothesis that the SSI is a motivated, system-justifying ideology.

  8. Deep Space Network Measurement Model Development for Interplanetary Mission

    Directory of Open Access Journals (Sweden)

    Hae-Yeon Kim

    2004-12-01

    Full Text Available The DSN(Deep Space Network measurement model for interplanetary navigations which is essential for precise orbit determination has been developed. The DSN measurement model produces fictitious DSN observables such as range, doppler and angular data, containing the potential observational errors in geometric data obtained from orbit propagator. So the important part of this research is to model observational errors in DSN observation and to characterize the errors. The modeled observational errors include the range delay effect caused by troposphere, ionosphere, antenna offset, and angular refraction effect caused by troposphere. Non-modeled errors are justified %%as the solved-for parameters. as the parameters. All of these results from developed models show about 10% errors compared to the JPL's reference results, that are within acceptable error range.

  9. Modeling Pharmacokinetics and Pharmacodynamics of Glucagon for Simulation of the Glucoregulatory System in Patients with Type 1 Diabetes

    DEFF Research Database (Denmark)

    Wendt, Sabrina Lyngbye

    analogue in healthy dogs. The extended glucoregulatory model translated to the human species and described glucose-insulin-glucagon dynamics in healthy subjects and patients with type 1 diabetes (T1D). The extended glucoregulatory model was successfully validated by leave-one-out cross-validation in seven...... T1D patients which justified its use for simulations. The final model parameters were estimated from three to four datasets from each patient. The validated extended glucoregulatory model was used for in silico studies. The model replicated a clinical study of the effect of glucagon at varying...... dosing regimen for treatment of insulin-induced mild hypoglycemia....

  10. Pengembangan Soal Penalaran Model TIMSS Matematika SMP

    Directory of Open Access Journals (Sweden)

    A. Rizta

    2013-06-01

    Full Text Available This study aimed to develop TIMSS-model reasoning problems for junior secondary school (SMP) mathematics. The subjects were the 27 students of class VIII.7 at SMP Negeri 1 Palembang. The method used was development research. The results show that 22.22% of the students scored above 65% on reasoning, while 77.78% scored below 65%. In more detail, achievement on the reasoning test by domain was: generalize 11.11%, justify 3.7%, integrate 29.63%, analyze 44.45%, and non-routine problems 51.85%. Taking 65% as the minimum criterion of success, the students' reasoning ability is still below the minimum threshold; in other words, their reasoning ability is still low.

  11. Multiscale modelling and analysis of collective decision making in swarm robotics.

    Directory of Open Access Journals (Sweden)

    Matthias Vigelius

    Full Text Available We present a unified approach to describing certain types of collective decision making in swarm robotics that bridges from a microscopic individual-based description to aggregate properties. Our approach encompasses robot swarm experiments, microscopic and probabilistic macroscopic-discrete simulations as well as an analytic mathematical model. Following up on previous work, we identify the symmetry parameter, a measure of the progress of the swarm towards a decision, as a fundamental integrated swarm property and formulate its time evolution as a continuous-time Markov process. Contrary to previous work, which justified this approach only empirically and a posteriori, we justify it from first principles and derive hard limits on the parameter regime in which it is applicable.
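
    A toy continuous-time Markov simulation in the spirit of the symmetry-parameter formulation (with invented quadratic rates, not the authors' model) shows how the swarm's progress toward a decision can be tracked as a single scalar:

```python
import numpy as np

def decision_ctmc(N=50, t_end=100.0, seed=2):
    """Toy majority-amplifying opinion CTMC; records the symmetry parameter."""
    rng = np.random.default_rng(seed)
    a = N // 2                                 # robots currently favouring option A
    t, traj = 0.0, [(0.0, 0.0)]
    while t < t_end and 0 < a < N:
        b = N - a
        rate_up = b * (a / N) ** 2             # quadratic rates amplify the majority
        rate_dn = a * (b / N) ** 2
        total = rate_up + rate_dn
        t += rng.exponential(1.0 / total)
        a += 1 if rng.random() < rate_up / total else -1
        traj.append((t, abs(2 * a - N) / N))   # symmetry parameter in [0, 1]
    return traj
```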

  12. Multiscale Modelling and Analysis of Collective Decision Making in Swarm Robotics

    Science.gov (United States)

    Vigelius, Matthias; Meyer, Bernd; Pascoe, Geoffrey

    2014-01-01

    We present a unified approach to describing certain types of collective decision making in swarm robotics that bridges from a microscopic individual-based description to aggregate properties. Our approach encompasses robot swarm experiments, microscopic and probabilistic macroscopic-discrete simulations as well as an analytic mathematical model. Following up on previous work, we identify the symmetry parameter, a measure of the progress of the swarm towards a decision, as a fundamental integrated swarm property and formulate its time evolution as a continuous-time Markov process. Contrary to previous work, which justified this approach only empirically and a posteriori, we justify it from first principles and derive hard limits on the parameter regime in which it is applicable. PMID:25369026

  13. Multiscale modelling and analysis of collective decision making in swarm robotics.

    Science.gov (United States)

    Vigelius, Matthias; Meyer, Bernd; Pascoe, Geoffrey

    2014-01-01

    We present a unified approach to describing certain types of collective decision making in swarm robotics that bridges from a microscopic individual-based description to aggregate properties. Our approach encompasses robot swarm experiments, microscopic and probabilistic macroscopic-discrete simulations as well as an analytic mathematical model. Following up on previous work, we identify the symmetry parameter, a measure of the progress of the swarm towards a decision, as a fundamental integrated swarm property and formulate its time evolution as a continuous-time Markov process. Contrary to previous work, which justified this approach only empirically and a posteriori, we justify it from first principles and derive hard limits on the parameter regime in which it is applicable.

  14. Comparing fixed effects and covariance structure estimators for panel data

    DEFF Research Database (Denmark)

    Ejrnæs, Mette; Holm, Anders

    2006-01-01

    In this article, the authors compare the traditional econometric fixed effect estimator with the maximum likelihood estimator implied by covariance structure models for panel data. Their findings are that the maximum likelihood estimator is remarkably robust to certain types of misspecifications...

  15. Animal models of pediatric chronic kidney disease. Is adenine intake an appropriate model?

    Directory of Open Access Journals (Sweden)

    Débora Claramunt

    2015-11-01

    Full Text Available Pediatric chronic kidney disease (CKD) has peculiar features. In particular, growth impairment is a major clinical manifestation of CKD that debuts at pediatric age, because it presents in a large proportion of infants and children with CKD and has a profound impact on the self-esteem and social integration of the stunted patients. Several factors associated with CKD may lead to growth retardation by interfering with the normal physiology of the growth plate, the organ where longitudinal growth takes place. The study of the growth plate is hardly possible in humans, which justifies the use of animal models. Young rats made uremic by 5/6 nephrectomy have been widely used as a model to investigate growth retardation in CKD. This article examines the characteristics of this model and analyzes the use of CKD induced by a high-adenine diet as an alternative research protocol.

  16. Kinetic models in spin chemistry. 1. The hyperfine interaction

    DEFF Research Database (Denmark)

    Mojaza, M.; Pedersen, J. B.

    2012-01-01

    Kinetic models for quantum systems are quite popular due to their simplicity, although they are difficult to justify. We show that the transformation from the quantum to the kinetic description can be done exactly for the hyperfine interaction of one nucleus with arbitrary spin; more spins are described with a very good approximation. The crucial points are: to represent the quantum coherent oscillations by first-order rate constants, and to determine the number of kinetic channels corresponding to a given interaction. We consider a radical pair system with spin-selective reactions and calculate the spin...

  17. A comparative study of spherical and flat-Earth geopotential modeling at satellite elevations

    Science.gov (United States)

    Parrott, M. H.; Hinze, W. J.; Braile, L. W.

    1985-01-01

    Flat-Earth and spherical-Earth geopotential modeling of crustal anomaly sources at satellite elevations are compared by computing gravity and scalar magnetic anomalies perpendicular to the strike of variably dimensioned rectangular prisms at altitudes of 150, 300, and 450 km. Results indicate that the error caused by the flat-Earth approximation is less than 10% in most geometric conditions. Generally, errors increase with larger and wider anomaly sources at higher altitudes. For most crustal source modeling applications at conventional satellite altitudes, flat-Earth modeling can be justified and is numerically efficient.

  18. On the Formal Modeling of Games of Language and Adversarial Argumentation : A Logic-Based Artificial Intelligence Approach

    OpenAIRE

    Eriksson Lundström, Jenny S. Z.

    2009-01-01

    Argumentation is a highly dynamical and dialectical process drawing on human cognition. Successful argumentation is ubiquitous to human interaction. Comprehensive formal modeling and analysis of argumentation presupposes a dynamical approach to the following phenomena: the deductive logic notion, the dialectical notion and the cognitive notion of justified belief. For each step of an argumentation these phenomena form networks of rules which determine the propositions to be allowed to make se...

  19. Chain flexibility and nonlinear optical properties in polyenes within a two-state (VB-CT) model

    Science.gov (United States)

    Sugliani, S.; Del Zoppo, M.; Zerbi, G.; Shu, C.-F.

    2001-09-01

    We present a simple two-state model that justifies the dependence of first-order hyperpolarizabilities (β) of push-pull polyenes on conformational disorder. Particular relevance is given to the calculation of the vibrational properties (i.e. force constants, infrared and Raman intensities) which are used for the evaluation of the vibrational contribution to static molecular hyperpolarizabilities. The theoretical predictions are compared with experimental measurements of the quantities of interest on suitable, purposely synthesized molecules.
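
    For context, the textbook two-state (VB-CT) result expresses β through three quantities; a commonly quoted form (a sketch of the standard expression, with notation chosen here rather than taken from the paper) is

        \beta \;\propto\; \frac{3\,\mu_{ge}^{2}\,\Delta\mu_{ge}}{E_{ge}^{2}}

    where \mu_{ge} is the ground-to-excited-state transition dipole moment, \Delta\mu_{ge} the dipole-moment change between the two states, and E_{ge} the transition energy; in such a model, conformational disorder would act through the geometry dependence of these quantities.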

  20. Mechanical Impedance Modeling of Human Arm: A survey

    Science.gov (United States)

    Puzi, A. Ahmad; Sidek, S. N.; Sado, F.

    2017-03-01

    Human arm mechanical impedance plays a vital role in describing the motion ability of the upper limb. One of the impedance parameters is stiffness, defined as the ratio of an applied force to the measured deformation of the muscle. Arm mechanical impedance modeling is useful for developing better controllers for systems that interact with humans, such as an automated robot-assisted platform for rehabilitation training. The aim of the survey is to summarize the existing mechanical impedance models of the human upper limb, so as to justify the need for an improved version of the arm model and thereby facilitate the development of better controllers for such systems of ever-increasing complexity. In particular, the paper addresses the following issues: human motor control and motor learning, constant and variable impedance models, methods for measuring mechanical impedance, and mechanical impedance modeling techniques.
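
    As a minimal illustration of this kind of parameter estimation (a sketch with synthetic data and hypothetical values, not a method from the surveyed papers), a second-order impedance model F = M·a + B·v + K·x can be fit by least squares:

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 2.0, 400)
        x = 0.01 * np.sin(2 * np.pi * 2.0 * t)      # imposed displacement (m)
        v = np.gradient(x, t)                        # velocity (m/s)
        a = np.gradient(v, t)                        # acceleration (m/s^2)

        M_true, B_true, K_true = 1.5, 8.0, 300.0     # hypothetical arm parameters
        F = (M_true * a + B_true * v + K_true * x
             + rng.normal(0.0, 0.05, t.size))        # measured force + noise

        # Stack regressors and solve F = [a v x] @ [M B K] in the LS sense
        A = np.column_stack([a, v, x])
        (M_hat, B_hat, K_hat), *_ = np.linalg.lstsq(A, F, rcond=None)
        print(M_hat, B_hat, K_hat)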

  1. Modeling in biopharmaceutics, pharmacokinetics, and pharmacodynamics homogeneous and heterogeneous approaches

    CERN Document Server

    Macheras, Panos

    2006-01-01

    The state of the art in Biopharmaceutics, Pharmacokinetics, and Pharmacodynamics Modeling is presented in this book. It shows how advanced physical and mathematical methods can expand classical models in order to cover heterogeneous drug-biological processes and therapeutic effects in the body. The book is divided into four parts: the first deals with the fundamental principles of fractals, diffusion and nonlinear dynamics; the second with drug dissolution, release, and absorption; the third with empirical, compartmental, and stochastic pharmacokinetic models; and the fourth mainly with nonclassical aspects of pharmacodynamics. The classical models that have relevance and application to these sciences are also considered throughout. Many examples are used to illustrate the intrinsic complexity of drug administration-related phenomena in humans, justifying the use of advanced modeling methods. This timely and useful book will appeal to graduate students and researchers in pharmacology, pharmaceutical scienc...

  2. Multi-agent Architecture Model for Driving Mobile Manipulator Robots

    Directory of Open Access Journals (Sweden)

    A. Hentout

    2008-09-01

    Full Text Available In this article, we present a generic hierarchical behavior-based architecture model for driving mobile manipulator robots. Two behaviors are high-level; they constitute the Supervisory agent, which manages the global system. Two others are intermediate-level and one is low-level; these constitute the Mobile Robot agent and the Manipulator Robot agent, controlling, respectively, the mobile base and the manipulator arm. The choice of the suggested model is justified by the generic character of the proposed agent model and by the possibility of integrating the whole into a distributed robotic system. The model is formalized in Agent UML from the conceptual level to the implementation level. The interaction between the various agents is modeled using the interaction diagrams of Agent UML (state and protocol diagrams).

  4. Atmospheric disturbance modelling requirements for flying qualities applications

    Science.gov (United States)

    Moorhouse, D. J.

    1978-01-01

    Flying qualities are defined as those airplane characteristics which govern the ease or precision with which the pilot can accomplish the mission. Some atmospheric disturbance modelling requirements for aircraft flying qualities applications are reviewed. It is concluded that some simplifications are justified in identifying the primary influence on aircraft response and pilot control. It is recommended that a universal environmental model be developed, which could form the reference for different applications. This model should include the latest information on winds, turbulence, gusts, visibility, icing and precipitation. A chosen model would be kept by a national agency and updated regularly by feedback from users. A user manual is believed to be an essential part of such a model.

  5. Szekeres models: a covariant approach

    Science.gov (United States)

    Apostolopoulos, Pantelis S.

    2017-05-01

    We exploit the 1+1+2 formalism to covariantly describe the inhomogeneous and anisotropic Szekeres models. It is shown that an average scale length can be defined covariantly which satisfies a 2d equation of motion driven by the effective gravitational mass (EGM) contained in the dust cloud. The contributions to the EGM are encoded in the energy density of the dust fluid and the free gravitational field E_ab. We show that the quasi-symmetric property of the Szekeres models is justified through the existence of 3 independent intrinsic Killing vector fields (IKVFs). In addition, the notions of the apparent and absolute apparent horizons are briefly discussed, and we give an alternative gauge-invariant form to define them in terms of the kinematical variables of the spacelike congruences. We argue that the proposed program can be used to express Sachs' optical equations in covariant form and to analyze the confrontation of a spatially inhomogeneous irrotational overdense fluid model with the observational data.

  6. The nuisance of nuisance regression: spectral misspecification in a common approach to resting-state fMRI preprocessing reintroduces noise and obscures functional connectivity.

    Science.gov (United States)

    Hallquist, Michael N; Hwang, Kai; Luna, Beatriz

    2013-11-15

    Recent resting-state functional connectivity fMRI (RS-fcMRI) research has demonstrated that head motion during fMRI acquisition systematically influences connectivity estimates despite bandpass filtering and nuisance regression, which are intended to reduce such nuisance variability. We provide evidence that the effects of head motion and other nuisance signals are poorly controlled when the fMRI time series are bandpass-filtered but the regressors are unfiltered, resulting in the inadvertent reintroduction of nuisance-related variation into frequencies previously suppressed by the bandpass filter, as well as suboptimal correction for noise signals in the frequencies of interest. This is important because many RS-fcMRI studies, including some focusing on motion-related artifacts, have applied this approach. In two cohorts of individuals (n=117 and 22) who completed resting-state fMRI scans, we found that the bandpass-regress approach consistently overestimated functional connectivity across the brain, typically on the order of r=.10-.35, relative to a simultaneous bandpass filtering and nuisance regression approach. Inflated correlations under the bandpass-regress approach were associated with head motion and cardiac artifacts. Furthermore, distance-related differences in the association of head motion and connectivity estimates were much weaker for the simultaneous filtering approach. We recommend that future RS-fcMRI studies ensure that the frequencies of nuisance regressors and fMRI data match prior to nuisance regression, and we advocate a simultaneous bandpass filtering and nuisance regression strategy that better controls nuisance-related variability. Copyright © 2013 Elsevier Inc. All rights reserved.
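
    The recommendation in the final sentences can be made concrete with a short sketch (synthetic data and hypothetical filter settings, not the study's pipeline): bandpass-filter the time series and the nuisance regressors identically, then regress:

        import numpy as np
        from scipy import signal

        fs = 0.5                      # sampling rate (Hz), i.e. TR = 2 s
        b, a = signal.butter(2, [0.009, 0.08], btype="bandpass", fs=fs)

        rng = np.random.default_rng(1)
        y = rng.standard_normal(240)            # voxel time series
        X = rng.standard_normal((240, 6))       # e.g. six motion regressors

        y_f = signal.filtfilt(b, a, y)
        X_f = signal.filtfilt(b, a, X, axis=0)  # filter regressors identically

        beta, *_ = np.linalg.lstsq(X_f, y_f, rcond=None)
        resid = y_f - X_f @ beta                # nuisance-cleaned series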

  7. The Nuisance of Nuisance Regression: Spectral Misspecification in a Common Approach to Resting-State fMRI Preprocessing Reintroduces Noise and Obscures Functional Connectivity

    OpenAIRE

    Hallquist, Michael N.; Hwang, Kai; Luna, Beatriz

    2013-01-01

    Recent resting-state functional connectivity fMRI (RS-fcMRI) research has demonstrated that head motion during fMRI acquisition systematically influences connectivity estimates despite bandpass filtering and nuisance regression, which are intended to reduce such nuisance variability. We provide evidence that the effects of head motion and other nuisance signals are poorly controlled when the fMRI time series are bandpass-filtered but the regressors are unfiltered, resulting in the inadvertent...

  8. Thruster Modelling for Underwater Vehicle Using System Identification Method

    Directory of Open Access Journals (Sweden)

    Mohd Shahrieel Mohd Aras

    2013-05-01

    Full Text Available This paper describes a study of thruster modelling for a remotely operated underwater vehicle (ROV) by system identification using the Microbox 2000/2000C. The Microbox 2000/2000C is an XPC target machine used to interface an ROV thruster with the MATLAB 2009 software. In this project, a model of the thruster is developed first so that the system identification toolbox in MATLAB can be used. The project also presents a comparison of mathematical and empirical modelling. The experiments were carried out using a mini compressor as a dummy depth pressure applied to a pressure sensor. The thruster submerges until it reaches a set point and then maintains the set-point depth, the depth being based on the pressure sensor measurement. A conventional proportional controller was used in this project, and the results gathered justified its selection.
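
    A minimal sketch of the kind of proportional depth control described here (hypothetical plant and gains, not the identified thruster model):

        import numpy as np

        dt, T = 0.05, 30.0
        Kp = 4.0            # proportional gain
        tau = 2.0           # assumed first-order thruster time constant (s)
        depth, vel = 0.0, 0.0
        setpoint = 1.5      # target depth (m)

        for _ in range(int(T / dt)):
            error = setpoint - depth
            u = Kp * error                 # controller output (thrust command)
            vel += dt * (u - vel) / tau    # crude first-order thruster response
            depth += dt * vel
        print(f"final depth: {depth:.3f} m")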

  9. A Development Model Concept for Idea Management at World Vision Indonesia

    Directory of Open Access Journals (Sweden)

    Hendra Hendra

    2013-12-01

    Full Text Available The purpose of this article is to compose a development model that can be used to manage the process of idea conversion, structure ideas from tacit to explicit, determine the priority scale of an idea, accommodate idea discussion in creative thinking, justify the alignment of ideas with organizational choices, and measure the impact of idea management. The development model is built on supporting concepts and tools: knowledge conversion, the knowledge management cycle and its procedures, QCDSM objectives, an organization alignment model, an importance-urgency priority scale, and six-hats creative thinking. The result of this research is an integrated model, together with designs for an idea identification form and the main functions of an idea management application, ready to be implemented.

  10. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramér-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates.

  11. Animal models of chronic obstructive pulmonary disease.

    Science.gov (United States)

    Pérez-Rial, Sandra; Girón-Martínez, Álvaro; Peces-Barba, Germán

    2015-03-01

    Animal models of disease have always been welcomed by the scientific community because they provide an approach to the investigation of certain aspects of the disease in question. Animal models of COPD cannot reproduce the heterogeneity of the disease and usually only manage to represent it in its milder stages. Moreover, airflow obstruction, the variable that determines patient diagnosis, is not always taken into account in the models. For this reason, models have focused on the development of emphysema, easily detectable by lung morphometry, and have disregarded other components of the disease, such as airway injury or associated vascular changes. Continuous, long-term exposure to cigarette smoke is considered the main risk factor for this disease, justifying the fact that the cigarette smoke exposure model is the most widely used. Some variations on this basic model have been developed, relating to exposure time, the association of other inducers or inhibitors, exacerbations, or the use of transgenic animals to facilitate the identification of pathogenic pathways. Some of the variability and heterogeneity of this disease can thus be reproduced, and models can be designed to resolve researchers' questions on disease identification or treatment responses. Copyright © 2014 SEPAR. Published by Elsevier España. All rights reserved.

  12. Modeling cerebral blood flow during posture change from sitting to standing

    DEFF Research Database (Denmark)

    Ottesen, Johnny T.; Olufsen, M.; Tran, H.T.

    2004-01-01

    Hypertension, decreased cerebral blood flow, and diminished cerebral blood flow velocity regulation are among the first signs indicating the presence of cerebral vascular disease. In this paper, we present a mathematical model that can predict blood flow and pressure during posture...... extremities, the brain, and the heart. We use physiologically based control mechanisms to describe the regulation of cerebral blood flow velocity and arterial pressure in response to orthostatic hypotension resulting from postural change. To justify the fidelity of our mathematical model and control...

  13. Identification of Super Phenix steam generator by a simple polynomial model

    International Nuclear Information System (INIS)

    Rousseau, I.

    1981-01-01

    This note suggests an identification method based on simple polynomial models for the steam generator of the Super-Phenix fast-neutron power plant. This approach is justified in the context of selecting an adaptive control. The identification algorithms presented are applied to multivariable input-output behaviours. The results obtained with the auto-regressive representation and with simple polynomial models are compared, and the effect of perturbations on the output signal is tested, in order to select a good identification algorithm for multivariable adaptive regulation [fr]
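
    For illustration, a simple polynomial (ARX-type) model of the kind referred to here can be identified by least squares; the sketch below uses synthetic data and hypothetical orders, not Super-Phenix data:

        import numpy as np

        # y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + e[k]
        rng = np.random.default_rng(2)
        n = 500
        u = rng.standard_normal(n)                 # input signal
        y = np.zeros(n)
        for k in range(2, n):                      # "true" system, for the demo
            y[k] = (1.2 * y[k-1] - 0.45 * y[k-2] + 0.8 * u[k-1]
                    + 0.05 * rng.standard_normal())

        Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])  # regressor matrix
        theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
        print(theta)   # estimates of (a1, a2, b1)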

  14. Seamless Method- and Model-based Software and Systems Engineering

    Science.gov (United States)

    Broy, Manfred

    Today, engineering software-intensive systems is still more or less a handicraft, or at most at the level of manufacturing. Many steps are done ad hoc and not in a fully systematic way. Applied methods, if any, are neither scientifically justified nor justified by empirical data, and as a result carrying out large software projects is still an adventure. However, there is no reason why the development of software-intensive systems cannot be done in the future with the same precision and scientific rigor as in established engineering disciplines. To do that, however, a number of scientific and engineering challenges have to be mastered. The first aims at a deep understanding of the essentials of carrying out such projects, which includes appropriate models and effective management methods. What is needed is a portfolio of models and methods, coming together with comprehensive tool support, as well as deep insights into the obstacles of developing software-intensive systems, and a portfolio of established and proven techniques and methods with clear profiles and rules that indicate when each method is ready for application. In the following we argue that there is scientific evidence and enough research results so far to be confident that solid engineering of software-intensive systems can be achieved in the future. However, quite a number of scientific research problems still have to be solved.

  15. A mathematical model to determine incorporated quantities of radioactivity from the measured photometric values of tritium-autoradiographs in neuroanatomy

    International Nuclear Information System (INIS)

    Jennissen, J.J.

    1981-01-01

    The mathematical/empirical model developed in this paper helps to determine the incorporated radioactivity from the measured photometric values and the exposure time T. Possible errors of autoradiography due to the exposure time or the preparation are taken into consideration by the empirical model. It is shown that the error of approximately 400% that appears when only the measured photometric values are compared can be corrected. The model is valid for neuroanatomy, as optic nerves, i.e. neuroanatomical material, were used to develop it. Its application to other sections of the central nervous system also seems justified, given the reduction of errors thus achieved. (orig.) [de]

  16. On an elastic dissipation model as continuous approximation for discrete media

    Directory of Open Access Journals (Sweden)

    I. V. Andrianov

    2006-01-01

    Full Text Available Construction of an accurate continuous model for discrete media is an important topic in various fields of science. We deal with a 1D differential-difference equation governing the behavior of an n-mass oscillator with linear relaxation. It is known that a string-type approximation is justified for the low part of the frequency spectrum of a continuous model, but for free and forced vibrations the solutions of the discrete and continuous models can be quite different. The difference operator makes analysis difficult due to its nonlocal form. Approximate equations can be obtained by replacing the difference operator with a local derivative operator. Although applying derivatives of order higher than two improves the continuous model, the higher-order differential equation seriously complicates the solution of the continuous problem. It is known that the accuracy of the approximation can increase dramatically when Padé approximations are used. In this paper, one- and two-point Padé approximations suitable for justifying the choice of structural damping models are used.
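
    As a small illustration of why Padé approximants help (a generic sketch, unrelated to the paper's specific equations): built from the same Taylor coefficients, a rational approximant of log(1+x) remains accurate well outside the series' radius of convergence:

        import numpy as np
        from scipy.interpolate import pade

        coeffs = [0, 1, -1/2, 1/3, -1/4, 1/5]   # Taylor coefficients of log(1+x)
        p, q = pade(coeffs, 2)                  # [3/2] rational approximant

        x = 3.0                                 # outside |x| < 1 convergence
        taylor = sum(c * x**k for k, c in enumerate(coeffs))
        print(np.log(1 + x), taylor, p(x) / q(x))
        # -> 1.386...,  35.85 (divergent series),  1.397 (Padé)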

  17. Hybrid discrete choice models: Gained insights versus increasing effort

    International Nuclear Information System (INIS)

    Mariel, Petr; Meyerhoff, Jürgen

    2016-01-01

    Hybrid choice models expand the standard models in discrete choice modelling by incorporating psychological factors as latent variables. They could therefore provide further insights into choice processes and underlying taste heterogeneity, but the costs of estimating these models often increase significantly. This paper aims at comparing the results from a hybrid choice model and a classical random parameter logit. The point of departure for this analysis is whether researchers and practitioners should add hybrid choice models to the suite of models they routinely estimate. Our comparison reveals, in line with the few prior studies, that hybrid models gain in efficiency from the inclusion of additional information. The choice between the two approaches, however, depends on the objective of the analysis. If disentangling preference heterogeneity is most important, the hybrid model seems preferable. If the focus is on predictive power, a standard random parameter logit model might be the better choice. Finally, we give recommendations for an adequate use of hybrid choice models based on known principles of elementary scientific inference. - Highlights: • The paper compares the performance of a Hybrid Choice Model (HCM) and a classical Random Parameter Logit (RPL) model. • The HCM indeed provides insights regarding preference heterogeneity not gained from the RPL. • The RPL has similar predictive power to the HCM in our data. • The costs of estimating an HCM seem justified when learning more about taste heterogeneity is a major study objective.

  18. Hybrid discrete choice models: Gained insights versus increasing effort

    Energy Technology Data Exchange (ETDEWEB)

    Mariel, Petr, E-mail: petr.mariel@ehu.es [UPV/EHU, Economía Aplicada III, Avda. Lehendakari Aguire, 83, 48015 Bilbao (Spain); Meyerhoff, Jürgen [Institute for Landscape Architecture and Environmental Planning, Technical University of Berlin, D-10623 Berlin, Germany and The Kiel Institute for the World Economy, Duesternbrooker Weg 120, 24105 Kiel (Germany)

    2016-10-15

    Hybrid choice models expand the standard models in discrete choice modelling by incorporating psychological factors as latent variables. They could therefore provide further insights into choice processes and underlying taste heterogeneity, but the costs of estimating these models often increase significantly. This paper aims at comparing the results from a hybrid choice model and a classical random parameter logit. The point of departure for this analysis is whether researchers and practitioners should add hybrid choice models to the suite of models they routinely estimate. Our comparison reveals, in line with the few prior studies, that hybrid models gain in efficiency from the inclusion of additional information. The choice between the two approaches, however, depends on the objective of the analysis. If disentangling preference heterogeneity is most important, the hybrid model seems preferable. If the focus is on predictive power, a standard random parameter logit model might be the better choice. Finally, we give recommendations for an adequate use of hybrid choice models based on known principles of elementary scientific inference. - Highlights: • The paper compares the performance of a Hybrid Choice Model (HCM) and a classical Random Parameter Logit (RPL) model. • The HCM indeed provides insights regarding preference heterogeneity not gained from the RPL. • The RPL has similar predictive power to the HCM in our data. • The costs of estimating an HCM seem justified when learning more about taste heterogeneity is a major study objective.

  19. A hybrid mammalian cell cycle model

    Directory of Open Access Journals (Sweden)

    Vincent Noël

    2013-08-01

    Full Text Available Hybrid modeling provides an effective solution to cope with multiple-time-scale dynamics in systems biology. Among the applications of this method, one of the most important is cell cycle regulation. The machinery of the cell cycle, leading to cell division and proliferation, combines slow growth, spatio-temporal re-organisation of the cell, and rapid changes of regulatory protein concentrations induced by post-translational modifications. The advancement through the cell cycle comprises a well-defined sequence of stages, separated by checkpoint transitions. The combination of continuous and discrete changes justifies hybrid modelling approaches to cell cycle dynamics. We present a piecewise-smooth version of a mammalian cell cycle model, obtained by hybridization from a smooth biochemical model. The approximate hybridization scheme, leading to simplified reaction rates and binary event location functions, is based on learning from a training set of trajectories of the smooth model. We discuss several learning strategies for the parameters of the hybrid model.
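
    The piecewise-smooth idea can be sketched generically (a toy switching system, not the authors' cell cycle model): integrate a smooth ODE and let an event function trigger a discrete mode change:

        import numpy as np
        from scipy.integrate import solve_ivp

        def rhs(t, y, mode):
            k = 1.0 if mode == 0 else 0.1   # production depends on the mode
            return [k - 0.5 * y[0]]

        def checkpoint(t, y, mode):
            return y[0] - 1.5               # switch when the level hits 1.5
        checkpoint.terminal = True
        checkpoint.direction = 1

        t0, y0, mode = 0.0, [0.0], 0
        segments = []
        while t0 < 20.0:
            sol = solve_ivp(rhs, (t0, 20.0), y0, args=(mode,),
                            events=checkpoint)
            segments.append(sol)
            if sol.status != 1:             # no event fired: reached final time
                break
            t0, y0, mode = sol.t[-1], sol.y[:, -1], 1 - mode
        print(f"{len(segments)} smooth segments, final level "
              f"{segments[-1].y[0, -1]:.2f}")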

  20. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    Energy Technology Data Exchange (ETDEWEB)

    St. John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program.

  1. Fractured rock modeling in the National Waste Terminal Storage Program: a review of requirements and status

    International Nuclear Information System (INIS)

    St John, C.; Krug, A.; Key, S.; Monsees, J.

    1983-05-01

    Generalized computer codes capable of forming the basis for numerical models of fractured rock masses are being used within the NWTS program. Little additional development of these codes is considered justifiable, except in the area of representation of discrete fractures. On the other hand, model preparation requires definition of medium-specific constitutive descriptions and site characteristics and is therefore legitimately conducted by each of the media-oriented projects within the National Waste Terminal Storage program. However, it is essential that a uniform approach to the role of numerical modeling be adopted, including agreement upon the contribution of modeling to the design and licensing process and the need for, and means of, model qualification for particular purposes. This report discusses the role of numerical modeling, reviews the capabilities of several computer codes that are being used to support design or performance assessment, and proposes a framework for future numerical modeling activities within the NWTS program

  2. Numerical modelling and analysis of a room temperature magnetic refrigeration system

    DEFF Research Database (Denmark)

    Petersen, Thomas Frank

    This thesis presents a two-dimensional mathematical model of an Active Magnetic Regenerator (AMR) system which is used for magnetic refrigeration at room temperature. The purpose of the model is to simulate a laboratory-scale AMR constructed at Risø National Laboratory. The AMR model geometry...... comprises a regenerator made of parallel plates, which are separated by channels of a heat transfer fluid. The time-dependent model solves the momentum and continuity equations of the flow of the heat transfer fluid and the coupled energy equations of the heat transfer in the regenerator and the fluid...... of the chosen grid and time step. Initial results from the model showed significant temperature differences in both the regenerator and the fluid channel during the AMR cycle. This justifies the use of two-dimensional methods when an AMR with a parallel-plate regenerator is modelled. The model is flexible...

  3. Real Time Updating in Distributed Urban Rainfall Runoff Modelling

    DEFF Research Database (Denmark)

    Borup, Morten; Madsen, Henrik

    are equipped with basins and automated structures that allow for a large degree of control of the systems, but in order to do this optimally it is required to know what is happening throughout the system. For this task models are needed, due to the large scale and complex nature of the systems. The physically...... that are being updated from system measurements was studied. The results showed that the fact alone that it takes time for rainfall data to travel the distance between gauges and catchments has such a big negative effect on the forecast skill of updated models that it can justify the choice of even very...... when it was used to update the water level in multiple upstream basins. This method is, however, not capable of utilising the spatial correlations in the errors to correct larger parts of the models. To accommodate this, a method was developed for correcting the slow-changing inflows to urban drainage...

  4. Model instruments of effective segmentation of the fast food market

    Directory of Open Access Journals (Sweden)

    Mityaeva Tetyana L.

    2013-03-01

    Full Text Available The article presents results of optimisation step-type calculations of economic effectiveness of promotion of fast food with consideration of key parameters of assessment of efficiency of the marketing strategy of segmentation. The article justifies development of a mathematical model on the bases of 3D-presentations and three-dimensional system of management variables. The modern applied mathematical packages allow formation not only of one-dimensional and two-dimensional arrays and analyse links of variables, but also of three-dimensional, besides, the more links and parameters are taken into account, the more adequate and adaptive are results of modelling and, as a result, more informative and strategically valuable. The article shows modelling possibilities that allow taking into account strategies and reactions on formation of the marketing strategy under conditions of entering the fast food market segments.

  5. A Model Based on Cocitation for Web Information Retrieval

    Directory of Open Access Journals (Sweden)

    Yue Xie

    2014-01-01

    Full Text Available Based on the relationship between authority and cocitation in HITS, we propose a new hyperlink weighting scheme that describes the strength of the relevance between any two webpages. We then combine hyperlink weight normalization and random surfing schemes, as used in PageRank, to justify the new model. In the new model based on cocitation (MBCC), pages with stronger relevance are assigned higher values, rather than values depending only on outlinks. This model combines features of both HITS and PageRank. Finally, we present the results of some numerical experiments, showing that the MBCC ranking agrees with the HITS ranking, especially in the top 10. Meanwhile, MBCC keeps the superiority of PageRank, that is, the existence and uniqueness of ranking vectors.
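
    For orientation, the PageRank-style machinery referred to here (link-weight normalization plus damped random surfing) reduces to a damped power iteration; the sketch below uses a hypothetical 4-page link matrix with uniform weights, where MBCC would substitute cocitation-based ones before the same normalization step:

        import numpy as np

        links = np.array([[0, 1, 1, 0],
                          [1, 0, 1, 1],
                          [1, 0, 0, 1],
                          [0, 1, 0, 0]], dtype=float)  # links[i, j]: j -> i
        M = links / links.sum(axis=0)                  # column-normalize weights
        d, n = 0.85, M.shape[0]                        # damping, page count

        r = np.full(n, 1 / n)                          # uniform start vector
        for _ in range(100):
            r = (1 - d) / n + d * M @ r                # damped power iteration
        print(r / r.sum())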

  6. A MANAGEMENT MODEL FOR SUSTAINABLE DEVELOPMENT OF THE TOURIST DESTINATION

    Directory of Open Access Journals (Sweden)

    Krasimir ALEKSANDROV

    2013-01-01

    Full Text Available In recent years, Bulgaria has begun to successfully market one of the few competitive advantages that the country has as a tourist destination: its diverse and authentic nature. It is indisputable that tourism in its diversity is closely linked to the choice of destination. Sustainable destination management is critical for tourism development, particularly through effective spatial planning and land use control and through investment decisions on infrastructure and services. The aim of this paper is to propose a management model for a tourist destination in the context of the ideas and policies of sustainable development. The thesis defended is that a sustainable tourist destination is the result of the proper use of an appropriate governance model. The development and implementation of a specific management model can make the destination suitable for year-round tourism in its different varieties (recreational, sports, etc.), bringing economic, social and environmental benefits to society.

  7. Liquid-drop model applied to heavy ions irradiation

    International Nuclear Information System (INIS)

    De Cicco, Hernan; Alurralde, Martin A.; Saint-Martin, Maria L. G.; Bernaola, Omar A.

    1999-01-01

    The liquid-drop model, previously applied in the study of radiation damage in metals, is used here in an energy range not covered by molecular dynamics in order to understand experimental data on particle tracks in an organic material (Makrofol E), which cannot be accurately described by the existing theoretical methods. The nuclear and electronic energy depositions are considered for each ion, and the evolution of the thermal explosion is evaluated. The experimental observation of particle tracks in a region previously considered 'prohibited' is thereby justified. Although the model has free parameters and some discrepancies with the experimental diametrical values exist, the agreement obtained is highly superior to that of other existing models. (author)

  8. Some variations of the Kristallin-I near-field model

    International Nuclear Information System (INIS)

    Smith, P.A.; Curti, E.

    1995-11-01

    The Kristallin-I project is an integrated analysis of the final disposal of vitrified high-level radioactive waste (HLW) in the crystalline basement of Northern Switzerland. It includes an analysis of the radiological consequences of radionuclide release from a repository. This analysis employs a chain of independent models for the near-field, geosphere and biosphere. In constructing these models, processes are incorporated that are believed to be relevant to repository safety, while other processes are neglected. In the present report, a set of simplified, steady-state models of the near-field is developed to investigate the possible effects of specific processes which are neglected in the time-dependent Kristallin-I near-field model. These processes are neglected either because (i) they are thought unlikely to occur to a significant degree, or because (ii) they are likely to make a positive contribution to the performance of the near-field barrier to radionuclide migration, but are insufficiently understood to justify incorporating them in a safety assessment. The aim of this report is to investigate whether the arguments for neglecting these processes in the Kristallin-I near-field model can be justified. This work addresses the following topics: radionuclide transport at the bentonite-host rock interface; canister settlement; and chemical conditions and radionuclide transport at the glass-bentonite interface. (author) figs., tabs., refs

  9. Linking normative models of natural tasks to descriptive models of neural response.

    Science.gov (United States)

    Jaini, Priyank; Burge, Johannes

    2017-10-01

    Understanding how nervous systems exploit task-relevant properties of sensory stimuli to perform natural tasks is fundamental to the study of perceptual systems. However, there are few formal methods for determining which stimulus properties are most useful for a given natural task. As a consequence, it is difficult to develop principled models for how to compute task-relevant latent variables from natural signals, and it is difficult to evaluate descriptive models fit to neural response. Accuracy maximization analysis (AMA) is a recently developed Bayesian method for finding the optimal task-specific filters (receptive fields). Here, we introduce AMA-Gauss, a new faster form of AMA that incorporates the assumption that the class-conditional filter responses are Gaussian distributed. Then, we use AMA-Gauss to show that its assumptions are justified for two fundamental visual tasks: retinal speed estimation and binocular disparity estimation. Next, we show that AMA-Gauss has striking formal similarities to popular quadratic models of neural response: the energy model and the generalized quadratic model (GQM). Together, these developments deepen our understanding of why the energy model of neural response has proven useful, improve our ability to evaluate results from subunit model fits to neural data, and should help accelerate psychophysics and neuroscience research with natural stimuli.
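
    For reference, the generalized quadratic model mentioned here describes the mean response as a nonlinear function of a quadratic form of the stimulus; a standard way to write it (notation chosen here, not quoted from the paper) is

        r(\mathbf{x}) \;=\; f\!\left(\mathbf{x}^{\top} C\,\mathbf{x} + \mathbf{b}^{\top}\mathbf{x} + a\right)

    where C captures quadratic (energy-model-like) stimulus selectivity, \mathbf{b} the linear selectivity, a an offset, and f a static output nonlinearity.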

  10. Mechanistic movement models to understand epidemic spread.

    Science.gov (United States)

    Fofana, Abdou Moutalab; Hurford, Amy

    2017-05-05

    An overlooked aspect of disease ecology is considering how and why animals come into contact with one another, resulting in disease transmission. Mathematical models of disease spread frequently assume mass-action transmission, justified by stating that susceptible and infectious hosts mix readily, and forgo any detailed description of host movement. Numerous recent studies have recorded, analysed and modelled animal movement. These movement models describe how animals move with respect to resources, conspecifics and previous movement directions, and have been used to understand the conditions for the occurrence and spread of infectious diseases when hosts perform a given type of movement. Here, we summarize the effect of the different types of movement on the threshold conditions for disease spread. We identify gaps in the literature and suggest several promising directions for future research. The mechanistic inclusion of movement in epidemic models may be beneficial for the following two reasons. Firstly, the estimation of the transmission coefficient in an epidemic model becomes possible because animal movement data can be used to estimate the rate of contacts between conspecifics. Secondly, unsuccessful transmission events, where a susceptible host contacts an infectious host but does not become infected, can be quantified. Following an outbreak, this enables disease ecologists to identify 'near misses' and to explore possible alternative epidemic outcomes given shifts in ecological or immunological parameters. This article is part of the themed issue 'Opening the black box: re-examining the ecology and evolution of parasite transmission'. © 2017 The Author(s).

  11. Homogenised constitutive model dedicated to reinforced concrete plates subjected to seismic solicitations

    International Nuclear Information System (INIS)

    Combescure, Christelle

    2013-01-01

    Safety reassessments are periodically performed on the EDF nuclear power plants, and the recent seismic reassessments led to the necessity of taking into account the non-linear behaviour of materials when modeling and simulating industrial structures of these power plants under seismic loadings. A large proportion of these infrastructures consists of reinforced concrete buildings, including reinforced concrete slabs and walls, yet the literature offers few plate models dedicated to seismic applications for this material. The few existing models dedicated to these specific applications either lack energy dissipation in the material behaviour or lack a micromechanical approach that justifies the parameters needed to properly describe the model. In order to provide a constitutive model which better represents reinforced concrete plate behaviour under seismic loadings and whose parameters are easier for the civil engineer to identify, a constitutive model dedicated to reinforced concrete plates under seismic loadings is proposed: the DHRC (Dissipative Homogenised Reinforced Concrete) model. Justified by a periodic homogenisation approach, this model includes two dissipative phenomena: damage of the concrete matrix and internal sliding at the interface between steel rebars and the surrounding concrete. An original coupling term between damage and sliding, resulting from the homogenisation process, induces a better representation of energy dissipation during material degradation. The model parameters are identified from the geometric characteristics of the plate and a restricted number of material characteristics, allowing very simple use of the model. Numerical validations of the DHRC model are presented, showing good agreement with experimental behaviour. A one-dimensional simplification of the DHRC model is proposed, allowing the representation of reinforced concrete bars and simplified models of rods and wire mesh.

  12. Formulation of probabilistic models of protein structure in atomic detail using the reference ratio method

    DEFF Research Database (Denmark)

    Valentin, Jan B.; Andreetta, Christian; Boomsma, Wouter

    2014-01-01

    We propose a method to formulate probabilistic models of protein structure in atomic detail, for a given amino acid sequence, based on Bayesian principles, while retaining a close link to physics. We start from two previously developed probabilistic models of protein structure on a local length...... the parameters of the nonlocal model from the native structure without loss of generality. The local and nonlocal models are combined using the reference ratio method, which is a well-justified probabilistic construction. For evaluation, we use the resulting joint models to predict the structure of four proteins....... The results indicate that the proposed method and the probabilistic models show considerable promise for probabilistic protein structure prediction and related applications. © 2013 Wiley Periodicals, Inc....
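
    For orientation, the reference ratio construction combines a detailed local model f(x) with a coarser model g of a derived feature y = m(x); a commonly quoted form (a sketch with notation chosen here, not copied from the paper) is

        p(x) \;=\; \frac{g\bigl(m(x)\bigr)}{f_{m}\bigl(m(x)\bigr)}\, f(x)

    where f_m denotes the distribution of y = m(x) implied by f; the ratio corrects f so that the combined model reproduces g at the feature level.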

  13. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  14. IMPORTANCE OF DIFFERENT MODELS IN DECISION MAKING, EXPLAINING THE STRATEGIC BEHAVIOR IN ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Cristiano de Oliveira Maciel

    2006-11-01

    Full Text Available This study examines different models of the decision process in analyzing organizational strategy. The article presents strategy according to a cognitive approach. The discussion covers three models of the decision process: the rational actor model, organizational behavior, and the political model. These models, respectively, emphasize improving decision-making results, searching for a good decision within the cognitive limits of the administrator, and extensive negotiation in reaching a decision. According to the emphasis of each model, the possibilities for analyzing strategy are presented. The article also shows that all three modes of analysis need to be taken into account, a point that gains force as analysis and decision making become more complex, especially for the decisions that matter most to organizations.

  15. Selection of hydrologic modeling approaches for climate change assessment: A comparison of model scale and structures

    Science.gov (United States)

    Surfleet, Christopher G.; Tullos, Desirèe; Chang, Heejun; Jung, Il-Won

    2012-09-01

    A wide variety of approaches to hydrologic (rainfall-runoff) modeling of river basins confounds our ability to select, develop, and interpret models, particularly in the evaluation of prediction uncertainty associated with climate change assessment. To inform the model selection process, we characterized and compared three structurally-distinct approaches and spatial scales of parameterization to modeling catchment hydrology: a large-scale approach (using the VIC model; 671,000 km2 area), a basin-scale approach (using the PRMS model; 29,700 km2 area), and a site-specific approach (the GSFLOW model; 4700 km2 area) forced by the same future climate estimates. For each approach, we present measures of fit to historic observations and predictions of future response, as well as estimates of model parameter uncertainty, when available. While the site-specific approach generally had the best fit to historic measurements, the performance of the model approaches varied. The site-specific approach generated the best fit at unregulated sites, the large scale approach performed best just downstream of flood control projects, and model performance varied at the farthest downstream sites where streamflow regulation is mitigated to some extent by unregulated tributaries and water diversions. These results illustrate how selection of a modeling approach and interpretation of climate change projections require (a) appropriate parameterization of the models for climate and hydrologic processes governing runoff generation in the area under study, (b) understanding and justifying the assumptions and limitations of the model, and (c) estimates of uncertainty associated with the modeling approach.

  16. Simplicity versus complexity in modelling groundwater recharge in Chalk catchments

    Directory of Open Access Journals (Sweden)

    R. B. Bradford

    2002-01-01

    Full Text Available Models of varying complexity are available to provide estimates of recharge in headwater Chalk catchments. Some measure of how estimates vary between different models can help guide the choice of model for a particular application. This paper compares recharge estimates derived from four models employing input data at varying spatial resolutions for a Chalk headwater catchment (River Pang, UK) over a four-year period (1992-1995) that includes a range of climatic conditions. One model was validated against river flow data to provide a measure of their relative performance. Each model gave similar total recharge for the crucial winter recharge period when evaporation is low. However, the simple models produced relatively lower estimates of the summer and early autumn recharge due to the way in which processes governing recharge, especially evaporation and infiltration, are represented. The relative uniformity of land use, soil types and rainfall across headwater, drift-free Chalk catchments suggests that complex, distributed models offer limited benefits for recharge estimates at the catchment scale compared to simple models. Nonetheless, distributed models would be justified for studies where the pattern and amount of recharge need to be known in greater detail and to provide more reliable estimates of recharge during years with low rainfall. Keywords: Chalk, modelling, groundwater recharge

  17. Stochastic differential equations in NONMEM: implementation, application, and comparison with ordinary differential equations

    DEFF Research Database (Denmark)

    Tornøe, Christoffer Wenzel; Overgaard, Rune Viig; Agerso, H.

    2005-01-01

    Purpose. The objective of the present analysis was to explore the use of stochastic differential equations (SDEs) in population pharmacokinetic/pharmacodynamic (PK/PD) modeling. Methods. The intra-individual variability in nonlinear mixed-effects models based on SDEs is decomposed into two types of noise: a measurement and a system noise term. The measurement noise represents uncorrelated error due to, for example, assay error, while the system noise accounts for structural misspecifications, approximations of the dynamical model, and true random physiological fluctuations. Since the system noise accounts for model misspecifications, the SDEs provide a diagnostic tool for model appropriateness. The focus of the article is on the implementation of the Extended Kalman Filter (EKF) in NONMEM(R) for parameter estimation in SDE models. Results. Various applications of SDEs in population PK/PD modeling...
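
    A state-space form consistent with this decomposition (notation chosen here for illustration, not quoted from the paper) is

        dx_t = f(x_t, \phi)\,dt + \sigma_w\,dW_t, \qquad y_j = h(x_{t_j}, \phi) + e_j

    where \sigma_w\,dW_t is the system noise driving the state equation and e_j is the measurement noise on the j-th observation; setting \sigma_w = 0 recovers the ordinary differential equation model.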

  18. Compartmental modeling and tracer kinetics

    CERN Document Server

    Anderson, David H

    1983-01-01

    This monograph is concerned with mathematical aspects of compartmental analysis. In particular, linear models are closely analyzed since they are fully justifiable as an investigative tool in tracer experiments. The objective of the monograph is to bring the reader up to date on some of the current mathematical problems of interest in compartmental analysis. This is accomplished by reviewing mathematical developments in the literature, especially over the last 10-15 years, and by presenting some new thoughts and directions for future mathematical research. These notes started as a series of lectures that I gave while visiting with the Division of Applied Mathematics, Brown University, 1979, and have developed into this collection of articles aimed at the reader with a beginning graduate level background in mathematics. The text can be used as a self-paced reading course. With this in mind, exercises have been appropriately placed throughout the notes. As an aid in reading the material, the end of a ...

  19. Investigation of Model Simplification and Its Influence on the Accuracy in FEM Magnetic Calculations of Gearless Drives

    DEFF Research Database (Denmark)

    Andersen, Søren Bøgh; Santos, Ilmar F.; Fuerst, Axel

    2012-01-01

    What has been investigated is how model simplifications influence the accuracy of the calculated forces and torque coming from the drive, where each simplification made is described and justified. To further reduce the evaluation time, it is examined how coarse the mesh can be while still predicting the results with high accuracy. From this investigation, it is shown that there are certain...... interactions. This multiphysics model will later on be used for simulation and parameter optimization of a gearless mill drive with the use of Evolution Strategies, which necessitates the reduction in computation time.

  20. Integrating evolution into ecological modelling: accommodating phenotypic changes in agent based models.

    Directory of Open Access Journals (Sweden)

    Aristides Moustakas

    Full Text Available Evolutionary change is a characteristic of living organisms and forms one of the ways in which species adapt to changed conditions. However, most ecological models do not incorporate this ubiquitous phenomenon. We have developed a model that takes a 'phenotypic gambit' approach and focuses on changes in the frequency of phenotypes (which differ in timing of breeding and fecundity) within a population, using seasonal breeding as an example. Fitness per phenotype, calculated as the individual's contribution to population growth on an annual basis, coincides with the population dynamics per phenotype. Simplified model variants were explored to examine whether the complexity included in the model is justified. Outputs from the spatially implicit model underestimated the number of individuals across all phenotypes. When no phenotype transitions are included (i.e. offspring always inherit their parent's phenotype), numbers of all individuals are always underestimated. We conclude that by using a phenotypic gambit approach evolutionary dynamics can be incorporated into individual based models, and that all that is required is an understanding of the probability of offspring inheriting the parental phenotype.

  1. PDS-Modelling and Regional Bayesian Estimation of Extreme Rainfalls

    DEFF Research Database (Denmark)

    Madsen, Henrik; Rosbjerg, Dan; Harremoës, Poul

    1994-01-01

    Since 1979 a country-wide system of raingauges has been operated in Denmark in order to obtain a better basis for design and analysis of urban drainage systems. As an alternative to the traditional non-parametric approach, the Partial Duration Series method is employed in the modelling of extreme...... in Denmark cannot be justified. In order to obtain an estimation procedure at non-monitored sites and to improve at-site estimates, a regional Bayesian approach is adopted. The empirical regional distributions of the parameters in the Partial Duration Series model are used as prior information...... The application of the Bayesian approach is derived in the case of both exponential and generalized Pareto distributed exceedances. Finally, the aspect of including economic perspectives in the estimation of the design events is briefly discussed.
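
    A minimal sketch of the Partial Duration Series idea (synthetic data, not the Danish raingauge series): fit a generalized Pareto distribution to threshold exceedances and read off a return level:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        rain = rng.gamma(shape=0.8, scale=10.0, size=5000)   # fake intensities
        u = np.quantile(rain, 0.95)                          # threshold
        exceed = rain[rain > u] - u                          # exceedances

        c, loc, scale = stats.genpareto.fit(exceed, floc=0)  # fix location at 0
        # level exceeded by 1 in 100 threshold-exceeding events:
        level = u + stats.genpareto.ppf(1 - 1/100, c, loc=0, scale=scale)
        print(f"threshold {u:.1f}, GPD shape {c:.3f}, 1-in-100 level {level:.1f}")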

  2. Mathematical model of the Danube Delta Hydrographical Network Morphological Dynamics

    Directory of Open Access Journals (Sweden)

    CIOACA Eugenia

    2010-09-01

    Full Text Available This paper presents an innovative technology used to investigate the Danube Delta Biosphere Reserve hydrographical network from the point of view of morphological changes resulting from fluvial processes, erosion and alluvial sedimentation. Field measurements and data processing are performed on water flow, sediment transport and bathymetry. The resulting geospatial databases help in constructing the mathematical/hydraulic model used to simulate the hydro-morphological dynamics. The Delft3D software, a product of DELTARES (Delft Hydraulic Institute, The Netherlands), is used as the workbench. The model results serve as a practical tool for end users to scientifically justify management decisions on hydrographic network rehabilitation/reconstruction aimed at improving the water flow regime.

  3. A dynamic P53-MDM2 model with time delay

    Energy Technology Data Exchange (ETDEWEB)

    Mihalas, Gh.I. [Department of Biophysics and Medical Informatics, University of Medicine and Pharmacy, Piata Eftimie Murgu, nr. 3, 300041 Timisoara (Romania)]. E-mail: mihalas@medinfo.umft.ro; Neamtu, M. [Department of Forecasting, Economic Analysis, Mathematics and Statistics, West University of Timisoara, Str. Pestalozzi, nr. 14A, 300115 Timisoara (Romania)]. E-mail: mihaela.neamtu@fse.uvt.ro; Opris, D. [Department of Applied Mathematics, West University of Timisoara, Bd. V. Parvan, nr. 4, 300223 Timisoara (Romania)]. E-mail: opris@math.uvt.ro; Horhat, R.F. [Department of Biophysics and Medical Informatics, University of Medicine and Pharmacy, Piata Eftimie Murgu, nr. 3, 300041 Timisoara (Romania)]. E-mail: rhorhat@yahoo.com

    2006-11-15

    Specific activator and repressor transcription factors, which bind to specific regulatory DNA sequences, play an important role in gene activity control. Interactions between genes coding for such transcription factors should explain the different stable, or sometimes oscillatory, gene activities characteristic of different tissues. Starting from the P53-MDM2 model described in [Mihalas GI, Simon Z, Balea G, Popa E. Possible oscillatory behaviour in P53-MDM2 interaction computer simulation. J Biol Syst 2000;8(1):21-9] and the process described in [Kohn KW, Pommier Y. Molecular interaction map of P53 and MDM2 logic elements, which control the off-on switch of P53 in response to DNA damage. Biochem Biophys Res Commun 2005;331:816-27], we developed a new model of this interaction. Choosing the delay as a bifurcation parameter, we study the direction and stability of the bifurcating periodic solutions. Some numerical examples are finally given to justify the theoretical results.
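
    As a generic illustration of delay-driven dynamics (a toy negative-feedback pair with hypothetical rates, not the paper's equations), a delayed repression loop can be integrated with a simple Euler scheme and a history buffer; for long enough delays such loops can oscillate:

        import numpy as np

        dt, T, tau = 0.01, 100.0, 3.0
        n, lag = int(T / dt), int(tau / dt)
        x = np.ones(n)   # P53-like species, repressed by y
        y = np.ones(n)   # MDM2-like species, produced from delayed x

        for k in range(n - 1):
            x_del = x[k - lag] if k >= lag else x[0]   # delayed value
            x[k+1] = x[k] + dt * (2.0 / (1.0 + y[k]**2) - 0.3 * x[k])
            y[k+1] = y[k] + dt * (0.8 * x_del - 0.2 * y[k])

        half = x[n // 2:]                              # late-time behaviour
        print(f"x range over last half: {half.min():.2f}-{half.max():.2f}")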

  4. Building a Narrative Based Requirements Engineering Mediation Model

    Science.gov (United States)

    Ma, Nan; Hall, Tracy; Barker, Trevor

    This paper presents a narrative-based Requirements Engineering (RE) mediation model to help RE practitioners effectively identify, define, and resolve conflicts of interest, goals, and requirements. Within the SPI community, there is a common belief that social, human, and organizational issues significantly impact the effectiveness of software process improvement in general and of the requirements engineering process in particular. Conflicts among different stakeholders are an important human and social issue that needs more research attention in the SPI and RE communities. By drawing on the conflict resolution literature and the IS literature, we argue that conflict resolution in RE is a mediated process, in which a requirements engineer can act as a mediator among different stakeholders. To address the socio-psychological aspects of conflict in RE and SPI, Winslade and Monk's (2000) narrative mediation model is introduced, justified, and translated into the context of RE.

  5. A Novel Computer Virus Propagation Model under Security Classification

    Directory of Open Access Journals (Sweden)

    Qingyi Zhu

    2017-01-01

    Full Text Available In reality, some computers have a specific security classification. For the sake of safety and cost, the security level of computers is upgraded as threats in the network increase. Here we assume that there exists a threshold value which determines when countermeasures should be taken to raise the security of the fraction of computers with a low security level. In some specific realistic environments, the propagation network can moreover be regarded as fully interconnected. Inspired by these facts, this paper presents a novel computer virus dynamics model that accounts for the impact of security classification in a fully interconnected network. Using the theory of dynamical stability, the existence of equilibria and the stability conditions are analysed and proved, and the above optimal threshold value is given analytically. Then, some numerical experiments are made to justify the model. Besides, some discussions and antivirus measures are given.
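
    A minimal sketch of a compartmental model in this spirit, with low- and high-security susceptible classes and a threshold-triggered upgrade; the compartments, rates and threshold are illustrative stand-ins, not the equations of the paper:

      import numpy as np
      from scipy.integrate import odeint

      # Sl, Sh: susceptible fractions at low/high security; I: infected
      # fraction. Once I exceeds theta, low-security nodes are upgraded.
      beta_l, beta_h, gamma, upg, theta = 0.8, 0.2, 0.3, 0.5, 0.1

      def rhs(y, t):
          Sl, Sh, I = y
          upgrade = upg * Sl if I > theta else 0.0   # countermeasure switch
          dSl = -beta_l * Sl * I + 0.5 * gamma * I - upgrade
          dSh = -beta_h * Sh * I + 0.5 * gamma * I + upgrade
          dI = (beta_l * Sl + beta_h * Sh) * I - gamma * I
          return [dSl, dSh, dI]

      t = np.linspace(0.0, 100.0, 2001)
      sol = odeint(rhs, [0.70, 0.29, 0.01], t)       # fractions sum to 1
      print("endemic infected fraction:", sol[-1, 2])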

  6. A multi-criteria model for maintenance job scheduling

    Directory of Open Access Journals (Sweden)

    Sunday A. Oke

    2007-12-01

    Full Text Available This paper presents a multi-criteria maintenance job scheduling model, formulated as a weighted multi-criteria integer linear programming scheduling framework. Three criteria with a direct relationship to the primary objectives of a typical production setting were used, namely minimization of equipment idle time, of manpower idle time, and of job lateness, with unit parity. The mathematical model, constrained by available equipment, manpower and job-available time within the planning horizon, was tested on a 10-job, 8-hour time horizon problem with the declared equipment and manpower availability set against the requirements. The results, analysis and illustrations justify the multi-criteria consideration. Maintenance managers are thus equipped with a decision-making tool that guards against errors in the accumulated data which could otherwise lead to wrong decisions. The idea presented is new, since it provides an approach that has not been documented previously in the literature.
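
    A minimal time-indexed sketch of such a weighted scheduling ILP using the open-source PuLP solver; the jobs, due hours, single crew and weight are illustrative (the paper's 10-job model has more criteria and resources):

      from pulp import (LpBinary, LpMinimize, LpProblem, LpVariable,
                        PULP_CBC_CMD, lpSum)

      dur = {"J1": 2, "J2": 3, "J3": 1, "J4": 2}   # processing times (hours)
      due = {"J1": 4, "J2": 8, "J3": 2, "J4": 6}   # due hours
      H, w_late = 8, 1.0                           # horizon, lateness weight

      prob = LpProblem("maintenance_schedule", LpMinimize)
      # x[j][t] = 1 if job j starts at hour t (only starts fitting the horizon).
      x = {j: {t: LpVariable(f"x_{j}_{t}", cat=LpBinary)
               for t in range(H - dur[j] + 1)} for j in dur}
      late = {j: LpVariable(f"late_{j}", lowBound=0) for j in dur}

      for j in dur:                              # each job starts exactly once
          prob += lpSum(x[j].values()) == 1
          prob += late[j] >= lpSum((t + dur[j]) * x[j][t] for t in x[j]) - due[j]
      for t in range(H):                         # one crew: no overlapping jobs
          prob += lpSum(x[j][s] for j in dur
                        for s in x[j] if s <= t < s + dur[j]) <= 1

      prob += w_late * lpSum(late.values())      # weighted objective
      prob.solve(PULP_CBC_CMD(msg=False))
      for j in sorted(dur):
          start = next(t for t in x[j] if x[j][t].value() > 0.5)
          print(j, "starts at hour", start, "lateness", late[j].value())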

  7. Dynamics of a Computer Virus Propagation Model with Delays and Graded Infection Rate

    Directory of Open Access Journals (Sweden)

    Zizhen Zhang

    2017-01-01

    Full Text Available A four-compartment computer virus propagation model with two delays and a graded infection rate is investigated in this paper. The critical values at which a Hopf bifurcation occurs are obtained by analyzing the distribution of eigenvalues of the corresponding characteristic equation. Subsequently, the direction and stability of the Hopf bifurcation when the two delays are not equal are determined by using normal form theory and the center manifold theorem. Finally, some numerical simulations are carried out to justify the obtained theoretical results.

  8. A model for the inverse 1-median problem on trees under uncertain costs

    Directory of Open Access Journals (Sweden)

    Kien Trung Nguyen

    2016-01-01

    Full Text Available We consider the problem of justifying the vertex weights of a tree under uncertain costs so that a prespecified vertex becomes optimal while the total cost is optimal under the uncertainty scenario. We propose a model which delivers information about the optimal cost with respect to each confidence level \(\alpha \in [0,1]\). To this end, we first define an uncertain variable with respect to the minimum cost at each confidence level. If all costs are independently linearly distributed, we present the inverse distribution function of this uncertain variable in \(O(n^{2}\log n)\) time, where \(n\) is the number of vertices in the tree.

  9. Development, modelling and evaluation of a small-scale gas liquefaction plant

    DEFF Research Database (Denmark)

    Nguyen, Tuong-Van; Rothuizen, Erasmus Damgaard; Markussen, Wiebke Brix

    2017-01-01

    plant uses a multi-component refrigerant together with a propane precooling cycle and plate heat exchangers, to achieve a higher performance. This LNG production concept was modelled based on the Danish natural gas composition. Firstly, the total power consumption and heat transfer conductance were...... measurements, for a feed processing rate of 2160 kg/h. The results indicate that the specific power consumption can be reduced to the 1400-1800 kJ/kg range, for an exergetic efficiency of 25-30%. A good agreement between the simulation and experimental results was found, which justifies the use of the property...

  10. Environmental management model for small dairies in the industrial corridor of boyaca (colombia)

    OpenAIRE

    Deháquiz Mejía, Janneth Esperanza; Rodríguez C., Luis Felipe; Bermúdez C., Lilia Teresa

    2012-01-01

    This research addresses the general objective of environmental issues, focusing on the Environmental Management System model that can be applied to small businesses in the dairy industry in the Industrial Corridor of Boyaca, for which the En...

  11. HEALTH CARE MODELS AND SOCIAL CONTROL STRATEGIES

    Directory of Open Access Journals (Sweden)

    Aline Vieira Simões

    2011-06-01

    Full Text Available This study aimed to understand the context of health care models and strategies of social control. It is a bibliographic review of a critical and reflexive nature based on technical texts, scientific publications and official documents related to public health policies, and supports the preparation of candidates for knowledge examinations. Eleven books and five articles were selected. The material was categorized into three approaches: the historical context of public health policies, health care models, and social control strategies. The analysis and discussion of the results supported an understanding of public health policies since the implementation of SUS, which regulates health care; in a country as large as Brazil, however, a single model of health care would not be able to meet the demands of the health services, which justifies the implementation of various proposals. Regarding social control, it was possible to understand its influence on changes in public policy; health councils and conferences were identified as social control strategies involving social actors in a critical and constructive role in the process of changing models of care.

  12. Evidence for the credibility of health economic models for health policy decision-making: a systematic literature review of screening for abdominal aortic aneurysms

    DEFF Research Database (Denmark)

    Søgaard, Rikke; Lindholt, Jes S.

    2012-01-01

    OBJECTIVE: To investigate whether the credibility of health economic models of screening for abdominal aortic aneurysms for health policy decision-making has improved since 2005 when a systematic review by Campbell et al. concluded that reporting standards were poor and there was divergence between...... benefited from general advances in health economic modelling and some improvements in reporting were noted. However, the low level of agreement between studies in model structures and assumptions, and difficulty in justifying these (convergent validity), remain a threat to the credibility of health economic...

  13. Sensitivity of fluvial sediment source apportionment to mixing model assumptions: A Bayesian model comparison.

    Science.gov (United States)

    Cooper, Richard J; Krueger, Tobias; Hiscock, Kevin M; Rawlins, Barry G

    2014-11-01

    Mixing models have become increasingly common tools for apportioning fluvial sediment load to various sediment sources across catchments using a wide variety of Bayesian and frequentist modeling approaches. In this study, we demonstrate how different model setups can impact upon resulting source apportionment estimates in a Bayesian framework via a one-factor-at-a-time (OFAT) sensitivity analysis. We formulate 13 versions of a mixing model, each with different error assumptions and model structural choices, and apply them to sediment geochemistry data from the River Blackwater, Norfolk, UK, to apportion suspended particulate matter (SPM) contributions from three sources (arable topsoils, road verges, and subsurface material) under base flow conditions between August 2012 and August 2013. Whilst all 13 models estimate subsurface sources to be the largest contributor of SPM (median ∼76%), comparison of apportionment estimates reveals varying degrees of sensitivity to changing priors, inclusion of covariance terms, incorporation of time-variant distributions, and methods of proportion characterization. We also demonstrate differences in apportionment results between a full and an empirical Bayesian setup, and between a Bayesian and a frequentist optimization approach. This OFAT sensitivity analysis reveals that mixing model structural choices and error assumptions can significantly impact upon sediment source apportionment results, with estimated median contributions in this study varying by up to 21% between model versions. Users of mixing models are therefore strongly advised to carefully consider and justify their choice of model structure prior to conducting sediment source apportionment investigations. Key points: an OFAT sensitivity analysis of sediment fingerprinting mixing models is conducted; Bayesian models display high sensitivity to error assumptions and structural choices; source apportionment results differ between Bayesian and frequentist approaches.
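
    A minimal sketch of the frequentist optimization variant of such a mixing model: constrained least squares on source tracer signatures, with proportions that are non-negative and sum to one (the tracer values below are synthetic, not the River Blackwater data):

      import numpy as np
      from scipy.optimize import minimize

      # Rows of S: source signatures (arable topsoil, road verge, subsurface);
      # columns: geochemical tracers. 'mix' is the measured SPM signature.
      S = np.array([[12.0, 3.1, 0.8],
                    [ 9.5, 5.2, 1.9],
                    [ 4.0, 1.2, 3.5]])
      mix = np.array([6.1, 2.2, 2.6])

      def loss(p):
          return np.sum((p @ S - mix) ** 2)       # squared tracer misfit

      res = minimize(loss, x0=np.full(3, 1 / 3), method="SLSQP",
                     bounds=[(0.0, 1.0)] * 3,
                     constraints={"type": "eq", "fun": lambda p: p.sum() - 1.0})
      print("proportions (topsoil, verge, subsurface):", res.x.round(3))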

  14. MAGNETICALLY LEVITATED TRAIN'S SUSPENSION MODEL

    Directory of Open Access Journals (Sweden)

    V. A. Polyakov

    2017-10-01

    Full Text Available Purpose. The levitation force (LF) of the magnetically levitated train (MLT) arises from the interaction between the fields of the train's superconducting contours and the track's short-circuited contours, which together constitute the levitation module. The purpose of this study is therefore to obtain a correct description of this interaction. Methodology. At the present stage, the main and most universal tool for the analysis and synthesis of processes and systems is mathematical and, in particular, computer modelling. At the same time, the radical advantages of this tool make the precise choice of a specific research methodology even more important. This is particularly relevant for systems as large and complex as the MLT. For this reason, the work pays special attention to a reasoned choice of the distinctive features of the research paradigm. Findings. The analysis of existing versions of models of LF implementation shows that each of them, along with its advantages, also has significant drawbacks. One of the main results of the study is therefore the construction of a mathematical model of this force's implementation which preserves the advantages of the versions mentioned but is free from their shortcomings. The rationality of applying, for the study of the train's LF, an integrative holistic paradigm that assimilates the advantages of electric circuit theory and magnetic field theory is justified in the work. Originality. The scientific novelty of the research lies in the priority creation of such a paradigm and of the corresponding version of the LF implementation model. Practical value. The main practical significance of the work is the possibility, when its results are used, of significantly increasing the effectiveness of dynamic MLT research while reducing its resource costs.

  15. GIS-Based Fast Moving Landslide Risk Analysis Model Using Qualitative Approach: A Case Study of Balakot, Pakistan

    Directory of Open Access Journals (Sweden)

    Abdul Salam Soomro

    2011-04-01

    Full Text Available The innovation of this research is the development of a new model, called the fast moving landslide risk analysis model, obtained by modifying one of the prominent earlier landslide risk algorithms to focus on fast moving landslide types (such as mudslides, mud flows, block slides, rock falls and topples), based on a qualitative approach using the heuristic method in GIS (Geographical Information Systems). Different event-controlling parameters and criteria were used for the fast moving landslide predictive risk model. The pairwise comparison method was used, in which the parameters of landslide hazard and vulnerability were compared through their assigned weights. The drawback of the approach used was mitigated by checking against the standard value of the consistency ratio, which showed the assigned parameter weights to be reasonable and consistent. The model was validated using inventory data of landslides that had occurred and a correlation coefficient test, which showed a positive relationship between the regions of predicted landslide risk and the locations of actual landslides. The various landslide events of 8 October 2005 were compiled into a landslide inventory through the interpretation of satellite imagery. The validation of the model was further supported by a statistical paired "t" test and by the amount of predicted risk in the different regions. It is believed that this modified model will prove beneficial to decision makers in the future.
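
    A minimal sketch of the pairwise comparison step: derive parameter weights from a Saaty-scale comparison matrix via its principal eigenvector and check them with the consistency ratio; the matrix and the choice of four parameters are illustrative:

      import numpy as np

      # Illustrative pairwise comparisons of landslide controlling parameters
      # (e.g. slope, lithology, land cover, rainfall) on Saaty's 1-9 scale.
      A = np.array([[1.0, 3.0, 5.0, 2.0],
                    [1/3, 1.0, 3.0, 1/2],
                    [1/5, 1/3, 1.0, 1/4],
                    [1/2, 2.0, 4.0, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                          # parameter weights

      n = A.shape[0]
      CI = (eigvals[k].real - n) / (n - 1)  # consistency index
      RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
      CR = CI / RI                          # acceptably consistent if CR < 0.1
      print("weights:", w.round(3), " consistency ratio:", round(CR, 3))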

  16. On the validity of evolutionary models with site-specific parameters.

    Directory of Open Access Journals (Sweden)

    Konrad Scheffler

    Full Text Available Evolutionary models that make use of site-specific parameters have recently been criticized on the grounds that parameter estimates obtained under such models can be unreliable and lack theoretical guarantees of convergence. We present a simulation study providing empirical evidence that a simple version of the models in question does exhibit sensible convergence behavior and that additional taxa, despite not being independent of each other, lead to improved parameter estimates. Although it would be desirable to have theoretical guarantees of this, we argue that such guarantees would not be sufficient to justify the use of these models in practice. Instead, we emphasize the importance of taking the variance of parameter estimates into account rather than blindly trusting point estimates; this is standardly done by using the models to construct statistical hypothesis tests, which are then validated empirically via simulation studies.

  17. Small is beautiful: models of small neuronal networks.

    Science.gov (United States)

    Lamb, Damon G; Calabrese, Ronald L

    2012-08-01

    Modeling has contributed a great deal to our understanding of how individual neurons and neuronal networks function. In this review, we focus on models of the small neuronal networks of invertebrates, especially rhythmically active CPG networks. Models have elucidated many aspects of these networks, from identifying key interacting membrane properties to pointing out gaps in our understanding, for example, missing neurons. Even the complex CPGs of vertebrates, such as those that underlie respiration, have been reduced to small network models to great effect. Modeling of these networks spans from simplified models, which are amenable to mathematical analyses, to very complicated biophysical models. Some researchers have now adopted a population approach, where they generate and analyze many related models that differ in a few to several judiciously chosen free parameters; often these parameters show variability across animals and thus justify the approach. Models of small neuronal networks will continue to expand and refine our understanding of how neuronal networks in all animals program motor output, process sensory information and learn. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Model(ing) Law

    DEFF Research Database (Denmark)

    Carlson, Kerstin

    The International Criminal Tribunal for the former Yugoslavia (ICTY) was the first and most celebrated of a wave of international criminal tribunals (ICTs) built in the 1990s and designed to advance liberalism through international criminal law. Model(ing) Justice examines the case law of the ICTY

  19. Models and role models

    NARCIS (Netherlands)

    ten Cate, J.M.

    2015-01-01

    Developing experimental models to understand dental caries has been the theme in our research group. Our first, the pH-cycling model, was developed to investigate the chemical reactions in enamel or dentine, which lead to dental caries. It aimed to leverage our understanding of the fluoride mode of

  20. Contributions of the Model of Modelling Diagram to the Learning of Ionic Bonding: Analysis of A Case Study

    Science.gov (United States)

    Mendonça, Paula Cristina Cardoso; Justi, Rosária

    2011-08-01

    Current proposals for science education recognise the importance of students' involvement in activities aimed at favouring an understanding of science as a human, dynamic and non-linear construct. Modelling-based teaching is one of the alternatives through which to address such issues. Modelling-based teaching activities for ionic bonding were introduced. This topic was chosen because of both the high incidence of students' alternative conceptions and its abstract nature, which justify the need for understanding complex models. The Model of Modelling diagram was used as a theoretical construct during the development of the teaching activities, which were implemented in a Brazilian medium-level public school class (16-18-year-old students). The data collected were the written material and models produced by the students, the content-knowledge tests, the video recordings of the lessons, and the observations and field notes of both the teacher and the researcher who observed the lessons. The analysis of such data enabled the production of case studies for each of the student groups. In this paper, we analyse one of the case studies, looking for evidence about the way specific elements of the teaching strategy supported students' learning. It supported our belief in the use of the Model of Modelling diagram as a theoretical construct with which to develop and analyse modelling-based teaching activities.

  1. A general phenomenological model for work function

    Science.gov (United States)

    Brodie, I.; Chou, S. H.; Yuan, H.

    2014-07-01

    A general phenomenological model is presented for obtaining the zero Kelvin work function of any crystal facet of metals and semiconductors, both clean and covered with a monolayer of electropositive atoms. It utilizes the known physical structure of the crystal and the Fermi energy of the two-dimensional electron gas assumed to form on the surface. A key parameter is the number of electrons donated to the surface electron gas per surface lattice site or adsorbed atom, which is taken to be an integer. Initially this is found by trial and later justified by examining the state of the valence electrons of the relevant atoms. In the case of adsorbed monolayers of electropositive atoms a satisfactory justification could not always be found, particularly for cesium, but a trial value always predicted work functions close to the experimental values. The model can also predict the variation of work function with temperature for clean crystal facets. The model is applied to various crystal faces of tungsten, aluminium, silver, and select metal oxides, and most demonstrate good fits compared to available experimental values.
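
    The ingredient of the model that can be made concrete here is the Fermi energy of an ideal two-dimensional electron gas, which follows directly from the surface electron density; this sketch evaluates only that textbook relation, with an illustrative lattice spacing and one donated electron per site:

      import math

      hbar = 1.054571817e-34    # J s
      m_e = 9.1093837015e-31    # kg
      eV = 1.602176634e-19      # J

      def fermi_energy_2deg(n_s, m=m_e):
          """E_F of a 2D electron gas: the density of states is m/(pi hbar^2)
          (both spins), so E_F = pi hbar^2 n_s / m for n_s electrons per m^2."""
          return math.pi * hbar ** 2 * n_s / m

      # One donated electron per surface site of ~3 Angstrom spacing:
      n_s = 1.0 / (3.0e-10) ** 2
      print(f"E_F = {fermi_energy_2deg(n_s) / eV:.2f} eV")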

  2. Evaluating Emulation-based Models of Distributed Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Gabert, Kasimir G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Emulytics Initiatives

    2017-08-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The various uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  3. Local yield stress statistics in model amorphous solids

    Science.gov (United States)

    Barbot, Armand; Lerbinger, Matthias; Hernandez-Garcia, Anier; García-García, Reinaldo; Falk, Michael L.; Vandembroucq, Damien; Patinet, Sylvain

    2018-03-01

    We develop and extend a method presented by Patinet, Vandembroucq, and Falk [Phys. Rev. Lett. 117, 045501 (2016), 10.1103/PhysRevLett.117.045501] to compute the local yield stresses at the atomic scale in model two-dimensional Lennard-Jones glasses produced via differing quench protocols. This technique allows us to sample the plastic rearrangements in a nonperturbative manner for different loading directions on a well-controlled length scale. Plastic activity upon shearing correlates strongly with the locations of low yield stresses in the quenched states. This correlation is higher in more structurally relaxed systems. The distribution of local yield stresses is also shown to strongly depend on the quench protocol: the more relaxed the glass, the higher the local plastic thresholds. Analysis of the magnitude of local plastic relaxations reveals that stress drops follow exponential distributions, justifying the hypothesis of an average characteristic amplitude often conjectured in mesoscopic or continuum models. The amplitude of the local plastic rearrangements increases on average with the yield stress, regardless of the system preparation. The local yield stress varies with the shear orientation tested and strongly correlates with the plastic rearrangement locations when the system is sheared correspondingly. It is thus argued that plastic rearrangements are the consequence of shear transformation zones encoded in the glass structure that possess weak slip planes along different orientations. Finally, we justify the length scale employed in this work and extract the yield threshold statistics as a function of the size of the probing zones. This method makes it possible to derive physically grounded models of plasticity for amorphous materials by directly revealing the relevant details of the shear transformation zones that mediate this process.

  4. Sandwich corrected standard errors in family-based genome-wide association studies.

    Science.gov (United States)

    Minică, Camelia C; Dolan, Conor V; Kampert, Maarten M D; Boomsma, Dorret I; Vink, Jacqueline M

    2015-03-01

    Given the availability of genotype and phenotype data collected in family members, the question arises which estimator makes optimal use of such data in genome-wide scans. Using simulations, we compared the Unweighted Least Squares (ULS) and Maximum Likelihood (ML) procedures. The former is implemented in Plink and uses a sandwich correction to correct the standard errors for the model misspecification of ignoring the clustering. The latter is implemented by fast linear mixed procedures and models the familial resemblance explicitly. However, as it commits to a background model limited to additive genetic and unshared environmental effects, it employs a misspecified model for traits with a shared environmental component. We considered the performance of the two procedures in terms of type I and type II error rates, with correct and incorrect model specification in ML. For traits characterized by moderate to large familial resemblance, using an ML procedure with a correctly specified model for the conditional familial covariance matrix should be the strategy of choice. The potential loss in power encountered by the sandwich corrected ULS procedure does not outweigh its computational convenience. Furthermore, the ML procedure was quite robust under model misspecification in the simulated settings and appreciably more powerful than the sandwich corrected ULS procedure. However, to correct for the effects of model misspecification in ML in circumstances other than those considered here, we propose to use a sandwich correction. We show that the sandwich correction can be formulated in terms of the fast ML method.
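
    A minimal sketch of the sandwich idea: OLS point estimates with cluster-robust standard errors, where the "meat" sums score outer products within families so the errors remain valid when residuals are correlated within clusters; the data generation and effect size are synthetic:

      import numpy as np

      def ols_cluster_robust(X, y, cluster_ids):
          """OLS with cluster-robust (sandwich) standard errors:
          cov = bread @ meat @ bread, with bread = (X'X)^-1."""
          bread = np.linalg.inv(X.T @ X)
          beta = bread @ X.T @ y
          u = y - X @ beta
          meat = np.zeros((X.shape[1], X.shape[1]))
          for g in np.unique(cluster_ids):
              score = X[cluster_ids == g].T @ u[cluster_ids == g]
              meat += np.outer(score, score)
          cov = bread @ meat @ bread
          return beta, np.sqrt(np.diag(cov))

      # Synthetic example: 300 families of 3 siblings sharing a SNP genotype
      # and a shared environmental component that the mean model ignores.
      rng = np.random.default_rng(0)
      fam = np.repeat(np.arange(300), 3)
      snp = np.repeat(rng.integers(0, 3, 300).astype(float), 3)
      y = 0.2 * snp + np.repeat(rng.normal(size=300), 3) + rng.normal(size=900)
      X = np.column_stack([np.ones(900), snp])
      beta, se = ols_cluster_robust(X, y, fam)
      print("beta:", beta.round(3), " cluster-robust SE:", se.round(3))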

  5. Topological phases in the non-Hermitian Su-Schrieffer-Heeger model

    Science.gov (United States)

    Lieu, Simon

    2018-01-01

    We address the conditions required for a Z topological classification in the most general form of the non-Hermitian Su-Schrieffer-Heeger (SSH) model. Any chirally symmetric SSH model will possess a "conjugated-pseudo-Hermiticity" which we show is responsible for a quantized "complex" Berry phase. Consequently, we provide an example where the complex Berry phase of a band is used as a quantized invariant to predict the existence of gapless edge modes in a non-Hermitian model. The chirally broken, PT-symmetric model is studied; we suggest an explanation for why the topological invariant is a global property of the Hamiltonian. A geometrical picture is provided by examining eigenvector evolution on the Bloch sphere. We justify our analysis numerically and discuss relevant applications.
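
    A minimal sketch of the quantized invariant in the Hermitian limit of the SSH model: the Zak (Berry) phase of the lower band from a discrete Wilson loop over the Brillouin zone. The "complex" Berry phase of the non-Hermitian case replaces the <u|u> overlaps with left/right eigenvector overlaps; the hoppings below are illustrative:

      import numpy as np

      def zak_phase(t1, t2, n_k=400):
          """Berry (Zak) phase of the lower SSH band via a Wilson loop."""
          ks = np.linspace(-np.pi, np.pi, n_k, endpoint=False)
          us = []
          for k in ks:
              h = np.array([[0.0, t1 + t2 * np.exp(-1j * k)],
                            [t1 + t2 * np.exp(1j * k), 0.0]])
              _, vecs = np.linalg.eigh(h)
              us.append(vecs[:, 0])        # lower-band eigenvector
          us.append(us[0])                 # close the loop across the zone
          prod = 1.0 + 0.0j
          for a, b in zip(us[:-1], us[1:]):
              prod *= np.vdot(a, b)        # overlap <u_k | u_{k+dk}>
          return -np.angle(prod)           # ~0 (trivial) or ~pi (topological)

      print("t1 > t2:", zak_phase(1.5, 1.0))   # ~0: no protected edge modes
      print("t1 < t2:", zak_phase(0.5, 1.0))   # ~pi: gapless edge modes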

  6. Modelling of the behaviour of a UF6 container in a fire

    International Nuclear Information System (INIS)

    Pinton, Eric

    1996-01-01

    This thesis is justified by the safety needs surrounding the storage and transport of UF6 containers. To define their behaviour under fire conditions, a model was developed. Before tackling the numerical modelling, a phenomenological interpretation of experimental results for containers placed inside a furnace (800 °C) for a fixed period was carried out. The treatment of the internal heat transfers was considerably improved by these results. The 2D model that was elaborated takes into account most of the physical phenomena encountered in this type of situation (boiling, evaporation, condensation, radiant heat transfer through an absorbing gas, convection, pressurisation, thermal contact resistance, UF6 expansion, sinking of the solid core into the liquid, elastic and plastic deformation of the steel container). This model was successfully confronted with experiments. (author) [fr

  7. HYPERELASTIC MODELS FOR GRANULAR MATERIALS

    Energy Technology Data Exchange (ETDEWEB)

    Humrickhouse, Paul W; Corradini, Michael L

    2009-01-29

    A continuum framework for modeling of dust mobilization and transport, and the behavior of granular systems in general, has been reviewed, developed and evaluated for reactor design applications. The large quantities of micron-sized particles expected in the international fusion reactor design, ITER, will accumulate into piles and layers on surfaces, which are large relative to the individual particle size; thus, particle-particle, rather than particle-surface, interactions will determine the behavior of the material in bulk, and a continuum approach is necessary and justified in treating the phenomena of interest, e.g., particle resuspension and transport. The various constitutive relations that characterize these solid particle interactions in dense granular flows have been discussed previously, but prior to mobilization their behavior is not even fluid. Even in the absence of adhesive forces between particles, dust or sand piles can exist in static equilibrium under gravity and other forces, e.g., fluid shear. Their behavior is understood to be elastic, though not linear. The recent “granular elasticity” theory proposes a non-linear elastic model based on “Hertz contacts” between particles; the theory identifies the Coulomb yield condition as a requirement for thermodynamic stability, and has successfully reproduced experimental results for stress distributions in sand piles. The granular elasticity theory is developed and implemented in a stand-alone model and then implemented as part of a finite element model, ABAQUS, to determine the stress distributions in dust piles subjected to shear by a fluid flow. We identify yield with the onset of mobilization, and establish, for a given dust pile and flow geometry, the threshold pressure (force) conditions on the surface due to flow required to initiate it. While the granular elasticity theory applies strictly to cohesionless granular materials, attractive forces are clearly important in the interaction of
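
    The “Hertz contacts” underlying the granular elasticity theory have a simple closed form; a sketch of the normal contact force between two elastic spheres, with illustrative dust-grain material constants and overlap:

      import math

      def hertz_force(delta, R_eff, E1, nu1, E2, nu2):
          """Hertz normal force F = (4/3) E* sqrt(R_eff) delta^(3/2); the
          delta^(3/2) power is the source of the non-linear elasticity."""
          E_star = 1.0 / ((1 - nu1 ** 2) / E1 + (1 - nu2 ** 2) / E2)
          return (4.0 / 3.0) * E_star * math.sqrt(R_eff) * delta ** 1.5

      R1 = R2 = 5e-6                          # two 10-micron grains
      R_eff = 1.0 / (1.0 / R1 + 1.0 / R2)     # effective contact radius
      F = hertz_force(1e-9, R_eff, E1=70e9, nu1=0.2, E2=70e9, nu2=0.2)
      print(f"contact force at 1 nm overlap: {F:.3e} N")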

  8. Inhalation Exposure Input Parameters for the Biosphere Model

    International Nuclear Information System (INIS)

    K. Rautenstrauch

    2004-01-01

    This analysis is one of 10 reports that support the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN) biosphere model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents development of input parameters for the biosphere model that are related to atmospheric mass loading and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the total system performance assessment (TSPA) for a Yucca Mountain repository. Inhalation Exposure Input Parameters for the Biosphere Model is one of five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the plan for development of the biosphere abstraction products for TSPA, as identified in the Technical Work Plan for Biosphere Modeling and Expert Support (BSC 2004 [DIRS 169573]). This analysis report defines and justifies values of mass loading for the biosphere model. Mass loading is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Mass loading values are used in the air submodel of ERMYN to calculate concentrations of radionuclides in air inhaled by a receptor and concentrations in air surrounding crops. Concentrations in air to which the receptor is exposed are then used in the inhalation submodel to calculate the dose contribution to the receptor from inhalation of contaminated airborne particles. Concentrations in air surrounding plants are used in the plant submodel to calculate the concentrations of radionuclides in foodstuffs contributed from uptake by foliar interception
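
    The pathway the report describes reduces, in schematic form, to a product of factors: mass loading converts soil activity to an air concentration, which combines with breathing rate, exposure time and a dose coefficient. A sketch with placeholder values (the actual ERMYN submodels and parameter values are more detailed):

      # Schematic inhalation pathway; every number below is an illustrative
      # placeholder, not a value from the analysis report.
      mass_loading = 6.0e-8       # kg resuspended dust per m^3 of air
      soil_activity = 1.0e3       # Bq per kg of soil, one radionuclide
      breathing_rate = 1.0        # m^3 of air per hour
      exposure_time = 3000.0      # hours per year
      dose_coefficient = 5.0e-8   # Sv per Bq inhaled (nuclide-specific)

      air_concentration = mass_loading * soil_activity        # Bq per m^3
      annual_dose = (air_concentration * breathing_rate
                     * exposure_time * dose_coefficient)      # Sv per year
      print(f"annual inhalation dose: {annual_dose:.2e} Sv")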

  9. Inhalation Exposure Input Parameters for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    K. Rautenstrauch

    2004-09-10

    This analysis is one of 10 reports that support the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN) biosphere model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents development of input parameters for the biosphere model that are related to atmospheric mass loading and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the total system performance assessment (TSPA) for a Yucca Mountain repository. Inhalation Exposure Input Parameters for the Biosphere Model is one of five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the plan for development of the biosphere abstraction products for TSPA, as identified in the Technical Work Plan for Biosphere Modeling and Expert Support (BSC 2004 [DIRS 169573]). This analysis report defines and justifies values of mass loading for the biosphere model. Mass loading is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Mass loading values are used in the air submodel of ERMYN to calculate concentrations of radionuclides in air inhaled by a receptor and concentrations in air surrounding crops. Concentrations in air to which the receptor is exposed are then used in the inhalation submodel to calculate the dose contribution to the receptor from inhalation of contaminated airborne particles. Concentrations in air surrounding plants are used in the plant submodel to calculate the concentrations of radionuclides in foodstuffs contributed from uptake by foliar interception.

  10. CEAI: CCM-based email authorship identification model

    Directory of Open Access Journals (Sweden)

    Sarwat Nizamani

    2013-11-01

    Full Text Available In this paper we present a model for email authorship identification (EAI) employing a Cluster-based Classification (CCM) technique. Traditionally, stylometric features have been successfully employed in various authorship analysis tasks; we extend the traditional feature set to include some more interesting and effective features for email authorship identification (e.g., the last punctuation mark used in an email, the tendency of an author to use capitalization at the start of an email, or the punctuation after a greeting or farewell). We also included content features selected by Info Gain feature selection. It is observed that the use of such features in the authorship identification process has a positive impact on the accuracy of the authorship identification task. We performed experiments to justify our arguments and compared the results with other baseline models. Experimental results reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms the state-of-the-art support vector machine (SVM)-based models, as well as the models proposed by Iqbal et al. (2010, 2013) [1,2]. The proposed model attains an accuracy rate of 94% for 10 authors, 89% for 25 authors, and 81% for 50 authors on the Enron dataset, while 89.5% accuracy was achieved on a real email dataset constructed by the authors. The results on the Enron dataset were achieved with quite a large number of authors compared to the models proposed by Iqbal et al. [1,2].
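
    A minimal sketch of the kind of email-specific stylometric features described (last punctuation mark, capitalization at the start, punctuation after a greeting); the exact feature definitions of the paper may differ:

      import string

      def stylometric_features(email_body):
          """Extract a few illustrative email style markers."""
          lines = [l for l in email_body.splitlines() if l.strip()]
          words = email_body.split()
          punct = [c for c in email_body if c in string.punctuation]
          return {
              "last_punctuation": punct[-1] if punct else None,
              "starts_capitalized": email_body.strip()[:1].isupper(),
              "greeting_ends_with_comma":
                  lines[0].rstrip().endswith(",") if lines else False,
              "avg_word_length": sum(map(len, words)) / max(len(words), 1),
          }

      print(stylometric_features("Hi Tom,\nPlease send the Q3 report.\nThanks!"))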

  11. AN ACCURATE MODELING OF DELAY AND SLEW METRICS FOR ON-CHIP VLSI RC INTERCONNECTS FOR RAMP INPUTS USING BURR’S DISTRIBUTION FUNCTION

    Directory of Open Access Journals (Sweden)

    Rajib Kar

    2010-09-01

    Full Text Available This work presents an accurate and efficient model to compute the delay and slew metrics of on-chip interconnects in high-speed CMOS circuits for ramp inputs. Our metric is based on the Burr distribution function, which is used to characterize the normalized homogeneous portion of the step response. We use the PERI (Probability distribution function Extension for Ramp Inputs) technique, which extends delay and slew metrics for step inputs to the more general and realistic non-step inputs. The accuracy of our models is justified by comparison with SPICE simulations.
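
    A minimal sketch of the underlying idea: approximate the normalized step response by a Burr type-XII CDF and read delay and slew off its level crossings. The shape parameters here are illustrative; in the metric itself they would be fitted from circuit moments:

      from scipy.optimize import brentq

      def burr_cdf(t, c, k, scale=1.0):
          """Burr type-XII CDF standing in for the normalized step response."""
          return 1.0 - (1.0 + (t / scale) ** c) ** (-k)

      def crossing(level, c, k, scale=1.0):
          """Time at which the approximated response reaches `level`."""
          return brentq(lambda t: burr_cdf(t, c, k, scale) - level, 1e-12, 1e6)

      c, k = 2.0, 1.5                                   # illustrative shapes
      delay_50 = crossing(0.5, c, k)                    # 50% delay metric
      slew_10_90 = crossing(0.9, c, k) - crossing(0.1, c, k)
      print(f"50% delay: {delay_50:.3f}  10-90% slew: {slew_10_90:.3f}")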

  12. Geometric singularities and spectra of Landau-Ginzburg models

    International Nuclear Information System (INIS)

    Greene, B.R.; Roan, S.S.; Yau, S.T.

    1991-01-01

    Some mathematical and physical aspects of superconformal string compactification in weighted projective space are discussed. In particular, we recast the path integral argument establishing the connection between Landau-Ginzburg conformal theories and Calabi-Yau string compactification in a geometric framework. We then prove that the naive expression for the vanishing of the first Chern class for a complete intersection (adopted from the smooth case) is sufficient to ensure that the resulting variety, which is generically singular, can be resolved to a smooth Calabi-Yau space. This justifies much analysis which has recently been expended on the study of Landau-Ginzburg models. Furthermore, we derive some simple formulae for the determination of the Witten index in these theories which are complementary to those derived using semiclassical reasoning by Vafa. Finally, we also comment on the possible geometrical significance of unorbifolded Landau-Ginzburg theories. (orig.)

  13. Optimization of recurrent neural networks for time series modeling

    DEFF Research Database (Denmark)

    Pedersen, Morten With

    1997-01-01

    The present thesis is about optimization of recurrent neural networks applied to time series modeling. In particular, fully recurrent networks are considered, working from only a single external input, with one layer of nonlinear hidden units and a linear output unit, applied to prediction of discrete time...... series. The overall objectives are to improve training by application of second-order methods and to improve generalization ability by architecture optimization accomplished by pruning. The major topics covered in the thesis are: 1. The problem of training recurrent networks is analyzed from a numerical...... of solution obtained as well as computation time required. 3. A theoretical definition of the generalization error for recurrent networks is provided. This definition justifies a commonly adopted approach for estimating generalization ability. 4. The viability of pruning recurrent networks by the Optimal...

  14. Electromechanical modelling of tapered ionic polymer metal composites transducers

    Directory of Open Access Journals (Sweden)

    Rakesha Chandra Dash

    2016-09-01

    Full Text Available Ionic polymer metal composites (IPMCs) are relatively new smart materials that exhibit a bidirectional electromechanical coupling. IPMCs have a large number of important engineering applications such as micro robotics, biomedical devices and biomimetic robotics. This paper presents a comparison between tapered and uniform cantilevered Nafion-based IPMC transducers. Electromechanical modelling is done for the tapered beam. The thickness can be varied according to the required force and deflection. Numerical results for the force and deflection characteristics of both types of IPMC transducer are obtained. It is shown that the desired force and deflection of tapered IPMCs can be achieved for a given voltage. Different fixed-end (t0) and free-end (t1) thickness values have been taken to justify the results using MATLAB.
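
    A minimal mechanical sketch of why the taper matters: tip deflection of a cantilever whose thickness varies linearly from clamp to tip, computed with the unit-load integral delta = int F (L-x)^2 / (E I(x)) dx. The dimensions and modulus are illustrative IPMC-like values; the paper's electromechanical model, implemented in MATLAB, couples this mechanics to the applied voltage:

      import numpy as np

      def tip_deflection(F, L, b, t0, t1, E, n=2000):
          """Cantilever tip deflection under a tip force F with linearly
          tapered thickness t(x); I(x) = b t(x)^3 / 12."""
          x = np.linspace(0.0, L, n)
          t = t0 + (t1 - t0) * x / L
          I = b * t ** 3 / 12.0
          f = F * (L - x) ** 2 / (E * I)
          return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

      L, b, E, F = 30e-3, 5e-3, 0.2e9, 1e-3     # m, m, Pa, N
      print("uniform:", tip_deflection(F, L, b, 0.4e-3, 0.4e-3, E))
      print("tapered:", tip_deflection(F, L, b, 0.5e-3, 0.2e-3, E))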

  15. Nonlinear Model Predictive Control for Oil Reservoirs Management

    DEFF Research Database (Denmark)

    Capolei, Andrea

    ... The controller consists of (i) a model-based optimizer for maximizing some predicted financial measure of the reservoir (e.g. the net present value); (ii) a parameter and state estimator; (iii) use of the moving horizon principle for data assimilation and implementation of the computed control input. The optimizer uses...... Optimization has been suggested to compensate for inherent geological uncertainties in an oil field. In robust optimization of an oil reservoir, the water injection and production borehole pressures are computed such that the predicted net present value of an ensemble of permeability field realizations...... equivalent strategy is not justified for the particular case studied in this paper. The third contribution of this thesis is a mean-variance method for risk mitigation in production optimization of oil reservoirs. We introduce a return-risk bicriterion objective function for the profit-risk tradeoff...

  16. A model of algorithmic representation of a business process

    Directory of Open Access Journals (Sweden)

    E. I. Koshkarova

    2014-01-01

    Full Text Available This article presents and justifies the possibility of developing a method for the estimation and optimization of an enterprise's business processes; the proposed method is based on the identity of two notions: an algorithm and a business process. The described method relies on extracting a recursive model from the business process, illustrated by the example of one process automated by a BPM system, and on the further estimation and optimization of that process using the estimation and optimization techniques applied to algorithms. The results of this investigation could be used by experts working in the field of reengineering of enterprise business processes, automation of business processes, and development of enterprise information systems.

  17. Mathematical models of gas-dynamic and thermophysical processes in underground coal mining at different stages of mine development

    Directory of Open Access Journals (Sweden)

    М. В. Грязев

    2017-03-01

    Full Text Available New trends have been traced and the existing ones refined regarding filtration and diffusive motion of gases in coal beds and surrounding rock, spontaneous heating of coal and transport of gas traces by ventilation currents in operating coal mines. Mathematical models of gas-dynamic and thermophysical processes inside underworked territories after mine abandonment have been justified. Mathematical models are given for feasible air feeding of production and development areas, as well as for the development of geotechnical solutions to ensure gas-dynamic safety at every stage of coal mine operation. It is demonstrated that the use of high-performance equipment in the production and development areas requires more precise filtration equations used when assessing coal mine methane hazard. A mathematical model of pressure field of non-associated methane in the edge area of the coal seam has been justified. The model is based on one-dimensional hyperbolic equation and takes into consideration final rate of pressure distribution in the seam. Trends in gas exchange between mined-out spaces of high methane- and CO2-concentration mines with the earth surface have been refined in order to ensure environmental safety of underworked territories.

  18. Cognitive modeling

    OpenAIRE

    Zandbelt, Bram

    2017-01-01

    Introductory presentation on cognitive modeling for the course ‘Cognitive control’ of the MSc program Cognitive Neuroscience at Radboud University. It addresses basic questions, such as 'What is a model?', 'Why use models?', and 'How to use models?'

  19. CONCEPTUAL AND METHODOLOGICAL MISTAKES IN PSYCHOLOGY AND HEALTH: A CASE STUDY ON THE USE AND ABUSE OF STRUCTURAL EQUATION MODELLING

    Directory of Open Access Journals (Sweden)

    Julio Alfonso Piña López

    2016-09-01

    Full Text Available In this article, a research paper is analysed that was justified on the basis of the theory of developmental psychopathology, protective factors, self-regulation, resilience, and quality of life among individuals living with type 2 diabetes and hypertension. Structural equation modelling (SEM) was used for the data analysis. Although the authors conclude that the data are adequate to the theory tested, they commit errors of logic, concept, methodology and interpretation which, taken together, demonstrate a flagrant rupture between the theory and the data.

  20. Nongeneric tool support for model-driven product development; Werkzeugunterstuetzung fuer die modellbasierte Produktentwicklung. Maschinenlesbare Spezifikationen selbst erstellen

    Energy Technology Data Exchange (ETDEWEB)

    Bock, C. [Technische Univ. Kaiserslautern (Germany). Lehrstuhl fuer Produktionsautomatisierung; Zuehlke, D. [Technische Univ. Kaiserslautern (Germany). Lehrstuhl fuer Produktionsautomatisierung; Deutsches Forschungszentrum fuer Kuenstliche Intelligenz (DFKI), Kaiserslautern (DE). Zentrum fuer Mensch-Maschine-Interaktion (ZMMI)

    2006-07-15

    A well-defined specification process is a central success factor in human-machine interface development. Consequently, in interdisciplinary development teams, specification documents are an important communication instrument. In order to replace today's typically paper-based specifications and to leverage the benefits of their electronic equivalents, developers demand comprehensive and applicable computer-based tool kits. Manufacturers' increasing awareness of appropriate tool support is causing alternative approaches to tool kit creation to emerge. This article therefore introduces meta-modelling as a promising approach to creating non-generic tool support with justifiable effort. This enables manufacturers to take advantage of electronic specifications in product development processes.

  1. The green electricity market model. Proposal for an optional, cost-neutral direct marketing model for supplying electricity customers

    International Nuclear Information System (INIS)

    Heinemann, Ronald

    2014-01-01

    One of the main goals of the Renewable Energy Law (EEG) is the market integration of renewable energy resources. For this purpose it has introduced compulsory direct marketing on the basis of a moving market premium. At the same time the green electricity privilege, a regulation which made it possible for customers to be supplied with electricity from EEG plants, was abolished without replacement with effect from 1 August 2014. This means that, aside from other direct marketing channels, which will not be economically viable save in a few exceptional cases, it will no longer be possible in future to sell electricity from EEG plants to electricity customers under the designation ''electricity from renewable energy''. The reason for this is that electricity sold under the market premium model can no longer justifiably be said to originate from renewable energy. As a consequence, almost all green electricity products sold in Germany carry a foreign green electricity certificate.

  2. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P.J. [VTT Electronics, Oulu (Finland). Embedded Software

    1997-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. Enhanced access mechanisms into on-line documentation are

  3. Modelling the models

    CERN Multimedia

    Anaïs Schaeffer

    2012-01-01

    By analysing the production of mesons in the forward region of LHC proton-proton collisions, the LHCf collaboration has provided key information needed to calibrate extremely high-energy cosmic ray models.   Average transverse momentum (pT) as a function of rapidity loss ∆y. Black dots represent LHCf data and the red diamonds represent SPS experiment UA7 results. The predictions of hadronic interaction models are shown by open boxes (sibyll 2.1), open circles (qgsjet II-03) and open triangles (epos 1.99). Among these models, epos 1.99 shows the best overall agreement with the LHCf data. LHCf is dedicated to the measurement of neutral particles emitted at extremely small angles in the very forward region of LHC collisions. Two imaging calorimeters – Arm1 and Arm2 – take data 140 m either side of the ATLAS interaction point. “The physics goal of this type of analysis is to provide data for calibrating the hadron interaction models – the well-known &...

  4. On a Mathematical Model of Brain Activities

    International Nuclear Information System (INIS)

    Fichtner, K.-H.; Fichtner, L.; Freudenberg, W.; Ohya, M.

    2007-01-01

    The procedure of recognition can be described as follows: there is a set of complex signals stored in the memory, and choosing one of these signals may be interpreted as generating a hypothesis concerning an 'expected view of the world'. The brain then compares a signal arising from our senses with the signal chosen from the memory, leading to a change of the state of both signals. Furthermore, measurements of that procedure, like EEG or MEG, are based on the fact that recognition of signals causes a certain loss of excited neurons, i.e. the neurons change their state from 'excited' to 'non-excited'. For that reason a statistical model of the recognition process should reflect both the change of the signals and the loss of excited neurons. A first attempt to explain the process of recognition in terms of quantum statistics was given previously. In the present note it is not possible to present this approach in detail. Instead, we roughly sketch a few of the basic ideas and structures of the proposed model of the recognition process. Further, we introduce the basic spaces and justify the choice of the spaces used in this approach. A more elaborate presentation, including all proofs, will be given in a series of forthcoming papers. In this series the procedures of creation of signals from the memory, amplification, accumulation and transformation of input signals, and measurements like EEG and MEG will also be treated in detail

  5. Modelling contractor’s bidding decision

    Directory of Open Access Journals (Sweden)

    Biruk Sławomir

    2017-03-01

    Full Text Available The authors aim to provide a set of tools to facilitate the main stages of the competitive bidding process for construction contractors. These involve (1) deciding whether to bid, (2) calculating the total price, and (3) breaking down the total price into the items of the bill of quantities or the schedule of payments to optimise contractor cash flows. To define the factors that affect the decision to bid, the authors rely upon the literature on the subject and propose that multi-criteria methods be applied to calculate a single measure of contract attractiveness (utility value). An attractive contract implies that the contractor is likely to offer a lower price to increase the chances of winning the competition. The total bid price is thus interpolated between the lowest acceptable and the highest justifiable price based on the contract attractiveness. With the total bid price established, the next step is to split it between the items of the schedule of payments; a linear programming model is proposed for this purpose. The application of the models is illustrated with a numerical example.
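
    A minimal sketch of the pricing step described above: aggregate criterion scores into one attractiveness utility and interpolate the total bid price between the lowest acceptable and highest justifiable levels; the criteria, weights and prices are illustrative:

      def weighted_utility(scores, weights):
          """Aggregate criterion scores (each in [0, 1]) into one utility."""
          return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

      def total_bid_price(p_low, p_high, utility):
          """High attractiveness pulls the price toward the lowest
          acceptable level; low attractiveness toward the highest."""
          return p_high - utility * (p_high - p_low)

      u = weighted_utility(scores=[0.9, 0.4, 0.7],    # e.g. owner reliability,
                           weights=[0.5, 0.2, 0.3])   # competition, risk
      print("utility:", round(u, 3))
      print("bid price:", total_bid_price(p_low=1.00e6, p_high=1.25e6, utility=u))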

  6. The Intercultural Danube - a European Model

    Directory of Open Access Journals (Sweden)

    Gheorghe Lateș

    2014-08-01

    Full Text Available The construction of the EU began by following the logic of economics, which over time has created dysfunctions that seem to deepen and to generate a quasi-general skepticism. This paper analyses the union's construction and reconstruction on different conceptual premises, placing culture at the forefront of the new strategy. A multicultural Europe based on the primordial ethnicity of the state is no longer current; cultural diversity does not lead to unity, but is rather a factor of dissolution. The Danubian model reunites races, languages and religions so diverse that their functional diachrony justifies the idea of a reconstruction based on what once existed without generating tensions or conflicts. In the Danube area, ethnic identity did not turn into ethnicism; viewed synchronically, this constitutes a model for rethinking the union, not through hierarchies and barriers, but through the opportunity for coexistence among the peoples whose history and present connect along the horizontal axis of the river of a united Europe.

  7. Study and discretization of kinetic models and fluid models at low Mach number

    International Nuclear Information System (INIS)

    Dellacherie, Stephane

    2011-01-01

    This thesis summarizes our work between 1995 and 2010. It concerns the analysis and the discretization of Fokker-Planck or semi-classical Boltzmann kinetic models and of Euler or Navier-Stokes fluid models at low Mach number. The studied Fokker-Planck equation models the collisions between ions and electrons in a hot plasma, and is here applied to inertial confinement fusion. The studied semi-classical Boltzmann equations are of two types. The first one models the thermonuclear reaction between a deuterium ion and a tritium ion producing an α particle and a neutron, and is in our case also used to describe inertial confinement fusion. The second one (known as the Wang-Chang and Uhlenbeck equations) models the transitions between quantified electronic energy levels of uranium and iron atoms in the AVLIS isotopic separation process. The basic properties of these two Boltzmann equations are studied, and, for the Wang-Chang and Uhlenbeck equations, a kinetic-fluid coupling algorithm is proposed. This kinetic-fluid coupling algorithm incited us to study the relaxation concept for gas and immiscible fluid mixtures, and to underline connections with classical kinetic theory. Then, a diphasic low Mach number model without acoustic waves is proposed to model the deformation of the interface between two immiscible fluids induced by high heat transfers at low Mach number. In order to increase the accuracy of the results without increasing the computational cost, an AMR algorithm is studied on a simplified interface deformation model. These low Mach number studies also incited us to analyse, on Cartesian meshes, the inaccuracy of Godunov schemes at low Mach number. Finally, the LBM algorithm applied to the heat equation is justified

  8. Modelling Practice

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    This chapter deals with the practicalities of building, testing, deploying and maintaining models. It gives specific advice for each phase of the modelling cycle. To do this, a modelling framework is introduced which covers: problem and model definition; model conceptualization; model data...... requirements; model construction; model solution; model verification; model validation and finally model deployment and maintenance. Within the adopted methodology, each step is discussed through the consideration of key issues and questions relevant to the modelling activity. Practical advice, based on many...... years of experience, is provided to direct the reader in their activities. Traps and pitfalls are discussed and strategies are also given to improve model development towards “fit-for-purpose” models. The emphasis in this chapter is the adoption and exercise of a modelling methodology that has proven very...

  9. Power Quality Analysis Using a Hybrid Model of the Fuzzy Min-Max Neural Network and Clustering Tree.

    Science.gov (United States)

    Seera, Manjeevan; Lim, Chee Peng; Loo, Chu Kiong; Singh, Harapajan

    2016-12-01

    A hybrid intelligent model comprising a modified fuzzy min-max (FMM) clustering neural network and a modified clustering tree (CT) is developed. A review of clustering models with rule extraction capabilities is presented. The hybrid FMM-CT model is explained. We first use several benchmark problems to illustrate the cluster evolution patterns from the proposed modifications in FMM. Then, we employ a case study with real data related to power quality monitoring to assess the usefulness of FMM-CT. The results are compared with those from other clustering models. More importantly, we extract explanatory rules from FMM-CT to justify its predictions. The empirical findings indicate the usefulness of the proposed model in tackling data clustering and power quality monitoring problems under different environments.
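
    As a rough illustration of the fuzzy min-max building block that the hybrid model modifies, the sketch below computes a Simpson-style hyperbox membership with a ramp function. The sensitivity parameter and box corners are arbitrary, and the paper's modified FMM and clustering-tree logic is not reproduced.

```python
import numpy as np

def hyperbox_membership(x, v, w, gamma=4.0):
    """Membership of point x in the hyperbox with min corner v and max corner w.

    Classic ramp form: points inside the box score 1, and membership decays
    with distance outside at a rate set by the sensitivity parameter gamma.
    """
    ramp = lambda d: np.clip(gamma * d, 0.0, 1.0)
    return float(np.mean(1.0 - ramp(x - w) - ramp(v - x)))

x = np.array([0.50, 0.80])
v, w = np.array([0.30, 0.30]), np.array([0.70, 0.70])
print(hyperbox_membership(x, v, w))   # < 1: the second coordinate lies outside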

  10. Linear versus quadratic portfolio optimization model with transaction cost

    Science.gov (United States)

    Razak, Norhidayah Bt Ab; Kamil, Karmila Hanim; Elias, Siti Masitah

    2014-06-01

    Optimization models have become one of the standard decision-making tools in investment. It is therefore always a challenge for investors to select the model that best fulfills their investment goals with respect to risk and return. In this paper we aim to discuss and compare the portfolio allocations and performance generated by quadratic and linear portfolio optimization models, namely the Markowitz and Maximin models respectively. The application of these models has proven significant and popular. However, transaction cost is one of the important aspects to consider in portfolio reallocation, since portfolio return can be significantly reduced once transaction cost is taken into account. Therefore, recognizing the importance of transaction costs when calculating portfolio returns, we use data from Shariah-compliant securities listed on Bursa Malaysia. It is expected that the results will justify the advantage of one model over the other and shed some light on the quest to find the best decision-making tools in investment for individual investors.
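
    To make the comparison concrete, here is a hedged toy sketch, not the paper's formulation or data: a mean-variance (quadratic) solve next to a Maximin (linear) solve, with a flat proportional transaction cost deducted from scenario returns. All figures are synthetic.

```python
import numpy as np
from scipy.optimize import linprog, minimize

rng = np.random.default_rng(0)
R = rng.normal(0.01, 0.02, size=(250, 5))   # synthetic scenario returns, 5 assets
cov = np.cov(R, rowvar=False)
tc = 0.002                                   # proportional transaction cost

# Quadratic (Markowitz-style): minimize portfolio variance on the simplex.
cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
w_mv = minimize(lambda w: w @ cov @ w, np.full(5, 0.2),
                bounds=[(0, 1)] * 5, constraints=cons).x

# Linear (Maximin): maximize the worst-case scenario net return t.
# Variables z = (w_1..w_5, t); each row enforces t <= w . (r_s - tc).
A_ub = np.hstack([-(R - tc), np.ones((250, 1))])
res = linprog(c=[0.0] * 5 + [-1.0], A_ub=A_ub, b_ub=np.zeros(250),
              A_eq=[[1.0] * 5 + [0.0]], b_eq=[1.0],
              bounds=[(0, 1)] * 5 + [(None, None)], method="highs")
w_mm = res.x[:5]
print(np.round(w_mv, 3), np.round(w_mm, 3))
```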

  11. A Sensitivity Analysis of fMRI Balloon Model

    KAUST Repository

    Zayane, Chadia

    2015-04-22

    Functional magnetic resonance imaging (fMRI) allows the mapping of brain activation through measurements of the Blood Oxygenation Level Dependent (BOLD) contrast. The characterization of the pathway from the input stimulus to the output BOLD signal requires the selection of an adequate hemodynamic model and the satisfaction of some specific conditions while conducting the experiment and calibrating the model. This paper focuses on the identifiability of the Balloon hemodynamic model. By identifiability, we mean the ability to estimate the model parameters accurately given the input and the output measurement. Previous studies of the Balloon model have added prior knowledge in some form, either by choosing prior distributions for the parameters, freezing some of them, or looking for the solution as a projection on a natural basis of some vector space. In these studies, the identification was generally assessed using event-related paradigms. This paper justifies the need to add such knowledge and to choose certain paradigms, and completes the few existing identifiability studies through a global sensitivity analysis of the Balloon model in the case of a blocked design experiment.
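
    To make the state equations concrete, here is a hedged sketch of the classic Balloon model (Buxton/Friston form) driven by a blocked-design stimulus and integrated with SciPy. The parameter values are common literature choices, not the paper's, and the sensitivity analysis itself is not reproduced.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Typical literature values: neural efficacy, signal decay, autoregulation,
# transit time, stiffness exponent, resting O2 extraction, resting volume.
eps, tau_s, tau_f, tau_0, alpha, E0, V0 = 0.5, 0.8, 0.4, 1.0, 0.32, 0.4, 0.02

def u(t):                                  # blocked design: 20 s on / 20 s off
    return 1.0 if (t % 40.0) < 20.0 else 0.0

def balloon(t, y):
    s, f, v, q = y                         # signal, flow, volume, deoxyhemoglobin
    ds = eps * u(t) - s / tau_s - (f - 1.0) / tau_f
    df = s
    dv = (f - v ** (1.0 / alpha)) / tau_0
    dq = (f * (1.0 - (1.0 - E0) ** (1.0 / f)) / E0
          - q * v ** (1.0 / alpha - 1.0)) / tau_0
    return [ds, df, dv, dq]

sol = solve_ivp(balloon, (0.0, 120.0), [0.0, 1.0, 1.0, 1.0], max_step=0.1)
s, f, v, q = sol.y
k1, k2, k3 = 7.0 * E0, 2.0, 2.0 * E0 - 0.2   # standard BOLD observation weights
bold = V0 * (k1 * (1.0 - q) + k2 * (1.0 - q / v) + k3 * (1.0 - v))
```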

  12. SLOWKIN: a simplified model for the simulation of reactor transients in SLOWPOKE-2

    International Nuclear Information System (INIS)

    Rozon, D.; Kavih, S.

    1997-01-01

    This paper describes the model used to analyse reactor transients in the SLOWPOKE-2 reactor at Polytechnique. The model is intended to simulate reactor transients induced by control rod displacements during commissioning of the new LEU core to be installed in the SLOWPOKE-2 reactor in 1997, replacing the original HEU core. A simplified treatment is justified since our objective is mainly to provide a physical interpretation for any difference observed in the transient behaviour of the new core, as opposed to the current HEU core. The SLOWKIN model uses point kinetics to predict neutron power as a function of time. The reactor physics codes DRAGON/DONJON were used to provide some insight into the strong neutronic/thermalhydraulic coupling in the reactor and to generate the necessary reactivity coefficients to be used in SLOWKIN. (DM)
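
    SLOWKIN's point-kinetics backbone is easy to illustrate. The sketch below integrates a one-delayed-group version for a small step reactivity insertion; the constants are generic textbook values, not SLOWPOKE-2 data, and the thermal-hydraulic feedback that SLOWKIN couples in is omitted.

```python
from scipy.integrate import solve_ivp

beta, lam, Lambda = 0.0076, 0.08, 1e-4   # delayed fraction, decay const (1/s), generation time (s)
rho = 0.002                               # step insertion, < beta, so delayed-supercritical

def kinetics(t, y):
    n, c = y                              # relative power, precursor concentration
    dn = (rho - beta) / Lambda * n + lam * c
    dc = beta / Lambda * n - lam * c
    return [dn, dc]

y0 = [1.0, beta / (lam * Lambda)]         # precursors in equilibrium with n = 1
sol = solve_ivp(kinetics, (0.0, 30.0), y0, method="LSODA")  # stiff system
print(sol.y[0, -1])                       # relative power after 30 s
```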

  13. More Than Meets the Eye: Toward a Post-Materialist Model of Consciousness.

    Science.gov (United States)

    Brabant, Olivier

    2016-01-01

    Commonly accepted models of human consciousness have substantial shortcomings, in the sense that they cannot account for the entire scope of human experiences. The goal of this article is to describe a model with higher explanatory power, by integrating ideas from psychology and quantum mechanics. In the first part, the need for a paradigm change will be justified by presenting three types of phenomena that challenge the materialistic view of consciousness. The second part is about proposing an alternative view of reality and mind-matter manifestation that is able to accommodate these phenomena. Finally, the ideas from the previous parts will be combined with the psychological concepts developed by Frederic W. H. Myers. The result is a more comprehensive model of human consciousness that offers a novel perspective on altered states of consciousness, genius, and mental health. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Longitudinal beta-binomial modeling using GEE for overdispersed binomial data.

    Science.gov (United States)

    Wu, Hongqian; Zhang, Ying; Long, Jeffrey D

    2017-03-15

    Longitudinal binomial data are frequently generated from multiple questionnaires and assessments in various scientific settings, and such binomial data are often overdispersed. The standard generalized linear mixed effects model may severely underestimate the standard errors of the estimated regression parameters in such cases and hence potentially bias the statistical inference. In this paper, we propose a longitudinal beta-binomial model for overdispersed binomial data and estimate the regression parameters under a probit model using the generalized estimating equation method. A hybrid algorithm combining Fisher scoring and the method of moments is implemented for the computation. Extensive simulation studies are conducted to assess the validity of the proposed method. Finally, the proposed method is applied to analyze functional impairment in subjects who are at risk of Huntington disease from a multisite observational study of prodromal Huntington disease. Copyright © 2016 John Wiley & Sons, Ltd.
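
    The overdispersion the model targets can be made concrete with a quick simulation: beta-binomial counts inflate the binomial variance by the factor 1 + (n - 1)ρ, which a naive binomial model would understate. The snippet below is illustrative, with arbitrary parameters, and does not implement the paper's GEE estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, rho = 20, 0.3, 0.15                        # trials, mean rate, intra-cluster correlation
a = p * (1 - rho) / rho                          # Beta(a, b) matched to (p, rho)
b = (1 - p) * (1 - rho) / rho
y = rng.binomial(n, rng.beta(a, b, size=100_000))  # beta-binomial draws

print(y.var())                                   # empirical variance (~16.2)
print(n * p * (1 - p))                           # binomial variance (4.2)
print(n * p * (1 - p) * (1 + (n - 1) * rho))     # beta-binomial variance (16.17)
```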

  15. The professional medical ethics model of decision making under conditions of clinical uncertainty.

    Science.gov (United States)

    McCullough, Laurence B

    2013-02-01

    The professional medical ethics model of decision making may be applied to decisions clinicians and patients make under the conditions of clinical uncertainty that exist when evidence is low or very low. This model uses the ethical concepts of medicine as a profession, the professional virtues of integrity and candor and the patient's virtue of prudence, the moral management of medical uncertainty, and trial of intervention. These features combine to justifiably constrain clinicians' and patients' autonomy with the goal of preventing nondeliberative decisions of patients and clinicians. To prevent biased recommendations by the clinician that promote such nondeliberative decisions, medically reasonable alternatives supported by low or very low evidence should be offered but not recommended. The professional medical ethics model of decision making aims to improve the quality of decisions by reducing the unacceptable variation that can result from nondeliberative decision making by patients and clinicians when evidence is low or very low.

  16. Application of Activated Sludge Model No. 1 to biological treatment of pure winery effluents: case studies.

    Science.gov (United States)

    Stricker, A E; Racault, Y

    2005-01-01

    The practical applicability of computer simulation of aerobic biological treatment systems for winery effluents was investigated to enhance traditional on-site evaluation of new processes. As no modelling tool exists for pure winery effluent, a model widely used for municipal activated sludge (ASM1) was applied. The calibration and validation steps were performed on extended on-site data. The global soluble COD, DO and OUR were properly reproduced. Possible causes of the remaining discrepancies between measured and simulated data were identified, and directions for improvement were suggested to adapt ASM1 to winery effluents. The calibrated model was then used to simulate scenarios evaluating the plant's behaviour under different operating or design conditions. In combination with on-site observations, it allowed us to establish useful and justified improvement suggestions for aeration tank and aeration device design, as well as for feed, draw and aeration operation.

  17. Sufficient Sample Size and Power in Multilevel Ordinal Logistic Regression Models

    Directory of Open Access Journals (Sweden)

    Sabz Ali

    2016-01-01

    Full Text Available Biomedical researchers frequently deal with ordinal outcome variables in multilevel models where patients are nested within doctors. A multilevel cumulative logit model can justifiably be applied, with the outcome variable representing the mild, severe, and extremely severe intensity of diseases like malaria and typhoid in the form of ordered categories. Under our simulation conditions, the Maximum Likelihood (ML) method is better than the Penalized Quasilikelihood (PQL) method for a three-category ordinal outcome variable. The PQL method, however, performs as well as the ML method when a five-category ordinal outcome variable is used. Further, to achieve power above 0.80, at least 50 groups are required for both the ML and PQL methods of estimation. It may be pointed out that, for the five-category ordinal response variable model, the power of the PQL method is slightly higher than that of the ML method.

  18. Sufficient Sample Size and Power in Multilevel Ordinal Logistic Regression Models.

    Science.gov (United States)

    Ali, Sabz; Ali, Amjad; Khan, Sajjad Ahmad; Hussain, Sundas

    2016-01-01

    Biomedical researchers frequently deal with ordinal outcome variables in multilevel models where patients are nested within doctors. A multilevel cumulative logit model can justifiably be applied, with the outcome variable representing the mild, severe, and extremely severe intensity of diseases like malaria and typhoid in the form of ordered categories. Under our simulation conditions, the Maximum Likelihood (ML) method is better than the Penalized Quasilikelihood (PQL) method for a three-category ordinal outcome variable. The PQL method, however, performs as well as the ML method when a five-category ordinal outcome variable is used. Further, to achieve power above 0.80, at least 50 groups are required for both the ML and PQL methods of estimation. It may be pointed out that, for the five-category ordinal response variable model, the power of the PQL method is slightly higher than that of the ML method.
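
    As a complement to the two records above (which index the same study), here is a hedged sketch of the data-generating process behind a multilevel cumulative logit model: patients nested in doctors, a doctor-level random intercept, and a three-category ordered outcome. All parameter values are invented; fitting by ML or PQL requires specialized software (e.g., R's ordinal::clmm) and is not shown.

```python
import numpy as np

rng = np.random.default_rng(2)
n_doctors, n_patients = 50, 30                    # >= 50 groups, per the paper's power result
u = rng.normal(0.0, 1.0, n_doctors)               # doctor-level random intercepts
x = rng.normal(size=(n_doctors, n_patients))      # a patient-level covariate
beta, cuts = 0.8, np.array([-0.5, 1.0])           # slope and ordered thresholds

eta = beta * x + u[:, None]                       # linear predictor, shape (50, 30)
p_le = 1.0 / (1.0 + np.exp(-(cuts[:, None, None] - eta)))      # P(Y <= k), shape (2, 50, 30)
probs = np.diff(np.concatenate([np.zeros((1,) + eta.shape),
                                p_le,
                                np.ones((1,) + eta.shape)]), axis=0)  # category probabilities

# Draw the ordered category (0 = mild, 1 = severe, 2 = extremely severe).
y = np.array([[rng.choice(3, p=probs[:, i, j]) for j in range(n_patients)]
              for i in range(n_doctors)])
```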

  19. An interpretation of the behavior of EoS/GE models for asymmetric systems

    DEFF Research Database (Denmark)

    Kontogeorgis, Georgios; Panayiotis, Vlamos

    2000-01-01

    or zero pressure or at other conditions (system's pressure, constant volume packing fraction). In a number of publications over the last years, the achievements and the shortcomings of the various EoS/GE models have been presented via phase equilibrium calculations. This short communication provides...... an explanation of several literature EoS/GE models, especially those based on zero-reference pressure (PSRK, MHV1, MHV2), in the prediction of phase equilibria for asymmetric systems as well as an interpretation of the LCVM and kappa-MHV1 models which provide an empirical - yet as shown here theoretically...... justified - solution to these problems. (C) 2000 Elsevier Science Ltd. All rights reserved....

  20. An exactly solvable model for the integrability-chaos transition in rough quantum billiards.

    Science.gov (United States)

    Olshanii, Maxim; Jacobs, Kurt; Rigol, Marcos; Dunjko, Vanja; Kennard, Harry; Yurovsky, Vladimir A

    2012-01-24

    A central question of dynamics, largely open in the quantum case, is to what extent it erases a system's memory of its initial properties. Here we present a simple statistically solvable quantum model describing this memory loss across an integrability-chaos transition under a perturbation obeying no selection rules. From the perspective of quantum localization-delocalization on the lattice of quantum numbers, we are dealing with a situation where every lattice site is coupled to every other site with the same strength, on average. The model also rigorously justifies a similar set of relationships, recently proposed in the context of two short-range-interacting ultracold atoms in a harmonic waveguide. Application of our model to an ensemble of uncorrelated impurities on a rectangular lattice gives good agreement with ab initio numerics.

  1. To Vaccinate or Not to Vaccinate: How Teenagers Justified Their Decision

    Science.gov (United States)

    Lundstrom, Mats; Ekborg, Margareta; Ideland, Malin

    2012-01-01

    This article reports on a study of how teenagers made their decision on whether or not to vaccinate themselves against the new influenza. Its purpose was to identify connections between how teenagers talk about themselves and the decision they made. How do the teenagers construct their identities while talking about a specific socio-scientific…

  2. Tidal power development -- A realistic, justifiable and topical problem of today

    International Nuclear Information System (INIS)

    Bernshtein, L.B.

    1995-01-01

    Modern tidal power plant designs have shown that, with the use of large single-basin schemes, tidal power can be integrated with other forms of power generation. Tidal power is an environmentally benign means of producing electricity, particularly during off-peak demand. A number of tidal power schemes have been evaluated. These include Cumberland (1.4 Gigawatts (GW)) and Cobequid (4.4 GW) in Canada; Severn (8.6 GW), Mersey (0.7 GW), Wyre (0.06 GW) and Conwy (0.03 GW) in Great Britain; Tugur (6.8 GW) in Russia; and Garolim (0.5 GW) in South Korea. These schemes are opening up future prospects for very large scale opportunities which could have global importance, for example the transmission of 24 GW of electricity from tidal power plants in Great Britain to Europe. Another example is the potential transmission of 87 GW from the Penzhinskaya tidal power plant in Russia.

  3. Is risk associated with drinking water in Australia of significant concern to justify mandatory regulation?

    Science.gov (United States)

    McKay, J; Moeller, A

    2001-10-01

    Presently in Australia there are no mandatory drinking water standards. Here we argue that the risk associated with drinking water in Australia is of a magnitude sufficient to warrant mandatory regulation. The catchments that supply the major metropolitan areas of Sydney and Adelaide, and the groundwater for the city of Perth, have been seriously compromised by the encroachment of development and activities. Melbourne has in the past generally relied on a closed catchment reservoir system; however, population growth in the near future will require the full online operation of additional reservoirs, which have multiple land use catchments. In addition to the current landscape circumstances, the management of a water system in itself poses significant issues of risk. Two critical assumptions that are unique to a mass medium substance like water and dramatically alter the appraisal of risk are: (1) very large numbers of people are potentially exposed, and (2) small changes in contaminant levels may have adverse population outcomes. It is also known that water reticulation systems frequently suffer from contamination problems caused solely by the distribution system, and optimal management of these facilities would best be served by statutorily protected transparency and dedicated water quality programs. In 1979, an Australian parliamentary committee stated that an "uncontaminated water supply is a basic requirement for the obtainment of good health"; however, recent surveys of Australian water systems show many are not meeting basic water quality criteria, and many communities are not receiving the regular monitoring or testing required by government-authorized Australian drinking water guidelines. Exacerbating this situation is the lack of reporting and of statutorily endorsed standardized procedures to ensure that information is properly and promptly recorded and that data are centralized for maximum benefit. The evaluation of risk associated with drinking water in Australia is often hampered by inadequate or incomplete data. Lastly, regional and rural water supplies face a vast array of contemporary problems, including the widespread usage of pesticides and agricultural chemicals. In recent years, the Darling River has experienced one of the worst algal blooms on record, and this river system not only supplies a number of regional and rural towns with water, but eventually connects with the River Murray, which supplies the State of South Australia with approximately 50% of its water requirements.

  4. Is it justifiable to assert that clinical lycanthropy may be correlated to porphyria cutanea tarda?

    Directory of Open Access Journals (Sweden)

    Lorenzo Martini

    2017-10-01

    Full Text Available The scope of this study is to test an old theory expressed in 1963, when Illis (Guy's Hospital, London) established a correlation between clinical lycanthropy and congenital porphyria cutanea tarda. We had the fortune to live in a village where, it is said, a lycanthrope also lives, accustomed to hiding at home for the three days around the full moon, when he becomes (and behaves as) a werewolf. Werewolves like to walk around before dawn craving water, and since we love to walk very early in the morning (as the philosopher Immanuel Kant used to do), we had the chance to encounter this mysterious man, who is otherwise a normal man with a regular lifestyle governed by the lunar cycle. He presents a very pale face with scars and blisters, and when asked about these cutaneous manifestations he says that he detests sun and light and that his skin reacts this way. We attempted to treat this individual with a pomade containing rutin, diosmin, Centella asiatica, niacinamide and escin. The results are encouraging as well.

  5. High Incidence of Asymptomatic Syphilis in HIV-Infected MSM Justifies Routine Screening

    NARCIS (Netherlands)

    Branger, Judith; van der Meer, Jan T. M.; van Ketel, Ruud J.; Jurriaans, Suzanne; Prins, Jan M.

    2009-01-01

    Background: Recently, the incidence of syphilis has risen, mainly among men having sex with men (MSM), many of whom are coinfected with HIV. Current guidelines recommend at least yearly syphilis testing in this group. In this study, we assessed the yield of routine syphilis screening in outpatient

  6. Halogenated flame retardants: do the fire safety benefits justify the risks?

    Science.gov (United States)

    Shaw, Susan D; Blum, Arlene; Weber, Roland; Kannan, Kurunthachalam; Rich, David; Lucas, Donald; Koshland, Catherine P; Dobraca, Dina; Hanson, Sarah; Birnbaum, Linda S

    2010-01-01

    Since the 1970s, an increasing number of regulations have expanded the use of brominated and chlorinated flame retardants. Many of these chemicals are now recognized as global contaminants and are associated with adverse health effects in animals and humans, including endocrine and thyroid disruption, immunotoxicity, reproductive toxicity, cancer, and adverse effects on fetal and child development and neurologic function. Some flame retardants such as polybrominated diphenyl ethers (PBDEs) have been banned or voluntarily phased out by manufacturers because of their environmental persistence and toxicity, only to be replaced by other organohalogens of unknown toxicity. Despite restrictions on further production in some countries, consumer products previously treated with banned retardants are still in use and continue to release toxic chemicals into the environment, and the worldwide use of organohalogen retardants continues to increase. This paper examines major uses and known toxic effects of commonly used organohalogen flame retardants, replacements for those that have been phased out, their combustion by-products, and their effectiveness at reducing fire hazard. Policy and other solutions to maintain fire safety while reducing toxicity are suggested. The major conclusions are: (1) Flammability regulations can cause greater adverse environmental and health impacts than fire safety benefits. (2) The current options for end-of-life disposal of products treated with organohalogen retardants are problematic. (3) Life-cycle analyses evaluating benefits and risks should consider the health and environmental effects of the chemicals, as well as their fire safety impacts. (4) Most fire deaths and most fire injuries result from inhaling carbon monoxide, irritant gases, and soot. The incorporation of organohalogens can increase the yield of these toxic by-products during combustion. (5) Fire-safe cigarettes, fire-safe candles, child-resistant lighters, sprinklers, and smoke detectors can prevent fires without the potential adverse effects of flame retardant chemicals. (6) Alternatives to organohalogen flame retardant chemicals include using less flammable materials, design changes, and safer chemicals. To date, many flame retardant chemicals have been produced and used before their health and environmental impacts were evaluated, resulting in high levels of human exposure. As a growing literature continues to find adverse impacts from such chemicals, a more systematic approach to their regulation is needed. Before implementing new flammability standards, decision-makers should evaluate the potential fire safety benefit versus the health and environmental impacts of the chemicals, materials, or technologies likely to be used to meet the standard. Reducing the use of toxic or untested flame retardant chemicals in consumer products can protect human and animal health and the global environment without compromising fire safety.

  7. Is the Inclusion of Animal Source Foods in Fortified Blended Foods Justified?

    Directory of Open Access Journals (Sweden)

    Kristen E. Noriega

    2014-09-01

    Full Text Available Fortified blended foods (FBF) are used for the prevention and treatment of moderate acute malnutrition (MAM) in nutritionally vulnerable individuals, particularly children. A recent review of FBF recommended the addition of animal source food (ASF) in the form of whey protein concentrate (WPC), especially to corn-soy blends. The justification for this recommendation includes the potential of ASF to increase length, weight, muscle mass accretion and recovery from wasting, as well as to improve protein quality and provide essential growth factors. Evidence was collected from the following four types of studies: (1) epidemiological; (2) ASF versus no intervention or a low-calorie control; (3) ASF versus an isocaloric non-ASF; and (4) ASF versus an isocaloric, isonitrogenous non-ASF. Epidemiological studies consistently associated improved growth outcomes with ASF consumption; however, little evidence from isocaloric and isocaloric, isonitrogenous interventions was found to support the inclusion of meat or milk in FBF. Evidence suggests that whey may benefit muscle mass accretion, but not linear growth. Overall, little evidence supports the costly addition of WPC to FBFs. Further randomized isocaloric, isonitrogenous ASF interventions with nutritionally vulnerable children are needed.

  8. Justifying the clinical use of fresh frozen plasma-an audit

    International Nuclear Information System (INIS)

    Sharif, M.M.; Maqbool, S.; Butt, T.K.; Iqbal, S.; Mumtaz, A.

    2007-01-01

    To determine the appropriateness of fresh frozen plasma (FFP) use in various haematological and clinical disorders, with reference to the British Committee for Standards in Haematology (BCSH) guidelines, through an audit. The data were collected from June 2001 to June 2004 from the request forms submitted by clinicians for the transfusion of FFP at the Department of Haematology and Transfusion Medicine, Shalamar Hospital, Lahore. A total of 2075 healthy blood donors donated whole blood for the preparation of fresh frozen plasma (FFP). All blood donors were screened for anti-HCV, HBsAg, VDRL and HIV. The 2075 FFP units were prepared on a high-speed centrifuge and rapidly stored in a freezer at -30°C. A total of 587 patients were transfused 2075 units of FFP for various clinical disorders. The percentages of FFP units transfused appropriately and inappropriately, as defined by the BCSH guidelines, were estimated. Of the appropriately used units, 335 (24.41%) were transfused to patients with bleeding due to disseminated intravascular coagulation (DIC), 306 (22.30%) were used for massive transfusion and surgical bleeding, 236 (17.20%) for bleeding due to chronic liver disease, 202 (14.72%) to control bleeding due to coagulation factor deficiencies, 84 (6.12%) for thrombotic thrombocytopenic purpura (TTP), 75 (5.46%) prior to liver biopsy to correct prolonged prothrombin time (PT), 72 (5.24%) for haemorrhage due to haemolytic disease of the newborn (HDN), and 62 (4.51%) to control bleeding due to warfarin overdosage. Of the inappropriately used units, 425 (60.45%) were used for nutritional support and hypovolaemia replacement, 131 (18.63%) for the reversal of prolonged INR in the absence of bleeding due to warfarin, 92 (13.08%) in the ICU to correct prolonged PT without bleeding due to Vitamin K deficiency, and 55 (7.82%) in chronic liver disease (CLD) to correct prolonged PT and APTT in the absence of bleeding. In summary, 1372 (66.12%) FFP units were appropriately used and 703 (33.88%) were inappropriately used. In conclusion, 33.88% of FFP was inappropriately used, mainly due to lack of awareness of international guidelines and ignorance of the risks. (author)

  9. A Justified Initial Accounting Estimate as an Integral Part of the Enterprise Accounting Policy

    Directory of Open Access Journals (Sweden)

    Marenych Tetyana H

    2016-05-01

    Full Text Available The aim of the article is to justify the need to specify in the order on accounting policies not only the elements of the accounting policy itself but also the initial accounting estimates, which will increase the reliability of financial reporting, and to develop proposals for improving this administrative document of the enterprise. It is noted that in recent years the importance of a high-quality accounting policy has increased significantly, not only for users of financial reports but also for determining the object of levying the profits tax. Significant differences are revealed in how the consequences of changes in the accounting policy and in accounting estimates are reflected in accounting. The information on accounting estimates given in orders on enterprise accounting policies is generalized. It is proposed to provide a separate section in the order presenting the list of accounting estimates adopted, and stating how the company will make changes in the accounting policy and accounting estimates as well as correct errors.

  10. Reginald M.J. Oduor: Justifying Non-violent Civil ...

    African Journals Online (AJOL)

    Jimmy Gitonga

    Africans in the post-First World War depression, high Poll Tax demands and poor working conditions on the settler estates (Kihoro 2005, 25). Thuku forged a working relationship with the EAA, the Young Kavirondo Association (YKA) and the Indian community. There were plans for him to visit India to learn some.

  11. Can patriotism justify killing in defense of one’s country?

    Directory of Open Access Journals (Sweden)

    Pavković Aleksandar

    2007-01-01

    Full Text Available Cosmopolitan liberals would be ready to fight - and to kill and be killed - for the sake of restoring international justice or for the abolition of profoundly unjust political institutions. Patriots are ready to do the same for their own country. Sometimes the cosmopolitan liberals and patriots would fight on the same side, and sometimes on opposite sides, of a conflict. Thus the former would join the latter in the defense of Serbia against Austria-Hungary (in 1914) but would oppose the white Southerner patriots in the American Civil War (in 1861). In this paper I argue that fighting and killing for one’s country is, in both of those cases, different from the defense of one’s own life and the lives of those who cannot defend themselves. Killing for one’s country is killing in order to fulfill a particular political preference. The same is the case with fighting for the abolition of a profoundly unjust political institution. It is not amoral or immoral to refuse to kill for either of these two political preferences, because there is no reason to believe that either political preference trumps our moral constraints against killing.

  12. Routine histopathology of gallbladder after elective cholecystectomy for gallstones: waste of resources or a justified act?

    Science.gov (United States)

    Siddiqui, Faisal G; Memon, Ahmer A; Abro, Arshad H; Sasoli, Nazeer A; Ahmad, Lubna

    2013-07-08

    A selective approach to sending cholecystectomy specimens for histopathology results in missing discrete pathologies such as premalignant benign lesions (e.g., porcelain gallbladder), carcinoma-in-situ, and early carcinomas. To avoid such errors, therefore, every cholecystectomy specimen should be routinely examined histologically. Unfortunately, the practice of discarding gallbladder specimens is standard in most tertiary care hospitals of Pakistan, including the primary investigators' own institution. This study was conducted to assess the feasibility or otherwise of performing histopathology on every gallbladder specimen. This cohort study included 220 patients with gallstones undergoing cholecystectomy. All cases with known secondaries from the gallbladder, local invasion from other viscera, traumatic rupture of the gallbladder, or gross malignancy of the gallbladder found during surgery were excluded from the study. Laparoscopic cholecystectomy was performed in the majority of cases, except where anatomical distortion and dense adhesions prevented laparoscopy. All gallbladder specimens were sent for histopathology, irrespective of their gross appearance. Over a period of two years, 220 patients with symptomatic gallstones were admitted for cholecystectomy. Most of the patients were female (88%). Ninety-two per cent of patients presented with upper abdominal pain of varying duration. All specimens were sent for histopathology. Two hundred and three of the specimens showed evidence of chronic cholecystitis, 7 acute cholecystitis with mucocele, 3 acute cholecystitis with empyema, and one chronic cholecystitis associated with a polyp. Six gallbladders (2.8%) showed adenocarcinoma of varying differentiation along with cholelithiasis. The histopathological spectrum of the gallbladder is extremely variable. Incidental diagnosis of carcinoma of the gallbladder is not rare; if the protocol of routine histopathology of all gallbladder specimens is not followed, subclinical malignancies will fail to be identified, with disastrous results. We strongly recommend routine histopathology of all cholecystectomy specimens.

  13. Is Lower Quality Clinical Care Ethically Justifiable for Patients Residing in Areas with Infrastructure Deficits?

    Science.gov (United States)

    Inhorn, Marcia C; Patrizio, Pasquale

    2018-03-01

    Reproductive health services, including infertility care, are important in countries with infrastructure deficits, such as Lebanon, which now hosts more than one million Syrian refugees. Islamic prohibitions on child adoption and third-party reproductive assistance (donor eggs, sperm, embryos, and surrogacy) mean that most Muslim couples must turn to in vitro fertilization (IVF) and intracytoplasmic sperm injection (ICSI) to overcome their childlessness. Attempts to bring low-cost IVF-ICSI to underserved populations might help infertile couples where no other services are available. However, a low-cost IVF-ICSI protocol for male infertility remains technically challenging and thus may result in two standards of clinical care. Nonetheless, low-cost IVF-ICSI represents a form of reproductive justice in settings with infrastructure deficits and is clearly better than no treatment at all. © 2018 American Medical Association. All Rights Reserved.

  14. is the expulsion of women as foreigners in ezra 9-10 justifiably ...

    African Journals Online (AJOL)

    Yahweh's ordinance (Hoglund 1992:35). In other words, Ezra wanted the golah community to maintain religious purity (Anderson 1966:165). Yet, there is also another feeling that the so-called intermarriage in Ezra. 9 and 10 threatened the economic stability of the Province of Yehud by threatening its land base ...

  15. Female genital mutilation of minors in Italy: is a harmless and symbolic alternative justified?

    Directory of Open Access Journals (Sweden)

    Maria Luisa Di Pietro

    2012-09-01

    Full Text Available

    In 2004, Omar Abdulcadir - a gynecologist at the Centre for the prevention and therapy of female genital mutilation (FGM) at the Careggi Hospital (Florence) - proposed a “harmless and symbolic” alternative to FGM, which consists in the puncture of the clitoris under local anesthesia, in order to allow the outflow of a few drops of blood (1).

    The intention behind the symbolic alternative is to avoid more severe forms of FGM while respecting cultural heritage. The proposal of this alternative procedure, which was supported by the leaders of 10 local African immigrant communities, has encountered ample criticism (1).

    However, the question is: is the puncture of the clitoris prohibited by the Italian Law n. 7/2006? If it is not, could it be considered a method of reducing the health risks caused by the more invasive forms of FGM (2)? Or could it culturally legitimize FGM, making attempts to prevent and eradicate FGM in Italy more difficult?

  16. Who Justifies Questionable Reporting Practices? Answers from a Representative Survey of Journalists in Germany

    Directory of Open Access Journals (Sweden)

    Philip Baugut

    2017-07-01

    Full Text Available Based on a secondary analysis of representative survey data of journalists in Germany (n = 1536), this paper draws attention to two variables that are important when it comes to explaining whether journalists accept questionable reporting practices, such as paying people to obtain information or using confidential government documents without permission. First, perceived role achievement is important, as journalists who do not feel able to achieve an active role tend to accept questionable reporting practices more often. Second, however, this relationship only holds for journalists with a moderate tendency toward the political left. The findings are explained by means of the theory of cognitive dissonance.

  17. Routine follow-up imaging of kidney injuries may not be justified.

    Science.gov (United States)

    Bukur, Marko; Inaba, Kenji; Barmparas, Galinos; Paquet, Christian; Best, Charles; Lam, Lydia; Plurad, David; Demetriades, Demetrios

    2011-05-01

    The purpose of this investigation was to determine the yield of repeat follow-up imaging in patients sustaining renal trauma. The Los Angeles County+University of Southern California Medical Center trauma registry was reviewed to identify all patients with a diagnosis of kidney injury from 2005 to 2008. All final attending radiologist interpretations and the dates of the initial and follow-up computerized tomography (CT) scans were also reviewed. Grades I, II, and III were grouped as low-grade injuries and grades IV and V as high-grade injuries. During the 4-year study period, 120 patients (1.2% of all trauma admissions) had a total of 121 kidney injuries: 85.8% were male, and the mean age ± SD was 31.1 ± 14.5 years. Overall, 22.6% of blunt and 35.6% of penetrating kidney injuries were high grade (IV-V; p=0.148). These high-grade injuries were managed operatively in 35.7% and 76.2% of blunt and penetrating injuries, respectively (p=0.022). Overall, 31.7% underwent at least one follow-up CT: 24.2% of patients with blunt and 39.7% of patients with penetrating kidney injury. None of the patients with a low-grade injury managed nonoperatively developed a complication, independent of the injury mechanism. High-grade blunt and penetrating kidney injuries managed nonoperatively were associated with complication rates of 11.1% and 20.0% identified on follow-up CT, respectively. For patients who underwent surgical interventions for penetrating kidney injuries, the diagnosis of the complication was made at 9.8 ± 7.0 days (range, 1-24 days), with 83.3% diagnosed within 8 days postoperatively. The most frequent complication identified was an abscess in the renal fossa (50.0% of all complications). Other complications included urinoma, ureteral stricture, and pseudoaneurysm. All patients who developed complications were symptomatic, prompting the imaging that led to the diagnosis. All patients who developed a complication after a penetrating injury required intervention for the management of the complication. Selective reimaging of renal injuries based on clinical and laboratory criteria seems to be safe regardless of injury mechanism or management. High-grade penetrating injuries undergoing operative intervention should carry the highest degree of vigilance and the lowest threshold for repeat imaging.

  18. 75 FR 36432 - Termination of Declarations Justifying Emergency Use Authorizations of Certain In Vitro...

    Science.gov (United States)

    2010-06-25

    (Flattened table fragment from the notice, listing the terminated EUAs by manufacturer: Epoch BioSciences / ELITech Molecular Diagnostics, 3879 S. River Road, Bldg. A, St. George, UT 84790; and Focus Diagnostics, Inc., 11331 Valley View Street, Cypress, CA 90630, for the Focus Diagnostics Influenza A H1N1 (2009) Real-Time RT-PCR IVD device and the Focus Diagnostics Simplexa Influenza A H1N1 (2009) device.)

  19. Do the risks of emergent colectomy justify nonoperative management strategies for recurrent diverticulitis?

    Science.gov (United States)

    Novitsky, Yuri W; Sechrist, Cathy; Payton, B Lauren; Kercher, Kent W; Heniford, B Todd

    2009-02-01

    The nonoperative approach to recurrent and even multiple recurrent diverticulitis has recently been advocated. This approach, however, may result in more frequent acute attacks requiring emergent colectomy. Our aim was to compare the colectomy outcomes for diverticulitis in the elective and acute settings. All patients with diverticulitis undergoing elective (EL) and emergent (EM) colectomy selected from the 2001 to 2002 Nationwide Inpatient Sample Database were analyzed and compared. Five thousand ninety-seven (27.1% emergent) colectomy cases were analyzed. EL patients had a significantly reduced length of stay (7.5 vs 13.3 days) and total hospital charges ($25,420 vs $51,170). Postsurgical morbidity and mortality were significantly higher in the EM group (29.0% vs 14.9% and 7.4% vs .8%, respectively). Colostomy was needed in 5.7% of EL and in 48.9% of EM patients (P = .001). Emergent colectomy in the setting of diverticulitis is associated with significantly higher morbidity, longer hospitalization, greater hospital charges, and a 9-fold increase in mortality. Prophylactic resection in the setting of recurrent diverticulitis should continue to be an acceptable and possibly more "conservative" approach.

  20. [Is high-dose epinephrine justified in cardiorespiratory arrest in children?].

    Science.gov (United States)

    Rodríguez Núñez, A; García, C; López-Herce Cid, J

    2005-02-01

    To evaluate the impact on survival of intravenous or intraosseous high-dose epinephrine compared with standard doses in children with cardiorespiratory arrest. We performed a multicenter, prospective study. Cardiopulmonary resuscitation data from 283 children were collected following international guidelines (Utstein style) over 18 months. In a secondary analysis we studied survival in 92 children who were treated with intravenous or intraosseous epinephrine. One or more conventional doses of epinephrine (0.01 mg/kg) were administered in 12 patients, and a first conventional dose followed by one or more high doses (0.1 mg/kg) was administered in 80 patients. The age and weight of children in the conventional-dose group were higher than those in the high-dose group (97.1 ± 70.5 months vs 29.9 ± 36.9 months, p = 0.03, and 24.7 ± 20.8 kg vs 11.9 ± 8.9 kg, p = 0.037, respectively). The number of doses administered in the conventional-dose group was lower than that in the high-dose group (4 ± 4 vs 5.4 ± 3.4, p = 0.01). No significant differences were observed between the two groups in type of arrest, site of arrest, initial electrocardiographic rhythm, response to resuscitation attempts with return of spontaneous circulation, total resuscitation time, neurological status at the end of the episode, or survival to hospital discharge and at 1 year of follow-up. Although the present study has considerable limitations, the results suggest that high doses of epinephrine do not improve survival in cardiorespiratory arrest in children.