WorldWideScience

Sample records for underlying assumptions relating

  1. Contextuality under weak assumptions

    International Nuclear Information System (INIS)

    Simmons, Andrew W; Rudolph, Terry; Wallman, Joel J; Pashayan, Hakop; Bartlett, Stephen D

    2017-01-01

    The presence of contextuality in quantum theory was first highlighted by Bell, Kochen and Specker, who discovered that for quantum systems of three or more dimensions, measurements could not be viewed as deterministically revealing pre-existing properties of the system. More precisely, no model can assign deterministic outcomes to the projectors of a quantum measurement in a way that depends only on the projector and not the context (the full set of projectors) in which it appeared, despite the fact that the Born rule probabilities associated with projectors are independent of the context. A more general, operational definition of contextuality introduced by Spekkens, which we will term ‘probabilistic contextuality’, drops the assumption of determinism and allows for operations other than measurements to be considered contextual. Even two-dimensional quantum mechanics can be shown to be contextual under this generalised notion. Probabilistic noncontextuality represents the postulate that elements of an operational theory that cannot be distinguished from each other based on the statistics of arbitrarily many repeated experiments (they give rise to the same operational probabilities) are ontologically identical. In this paper, we introduce a framework that enables us to distinguish between different noncontextuality assumptions in terms of the relationships between the ontological representations of objects in the theory given a certain relation between their operational representations. This framework can be used to motivate and define a ‘possibilistic’ analogue, encapsulating the idea that elements of an operational theory that cannot be unambiguously distinguished operationally can also not be unambiguously distinguished ontologically. We then prove that possibilistic noncontextuality is equivalent to an alternative notion of noncontextuality proposed by Hardy. Finally, we demonstrate that these weaker noncontextuality assumptions are sufficient to prove

  2. Challenging Assumptions of International Public Relations: When Government Is the Most Important Public.

    Science.gov (United States)

    Taylor, Maureen; Kent, Michael L.

    1999-01-01

    Explores assumptions underlying Malaysia's and the United States' public-relations practice. Finds many assumptions guiding Western theories and practices are not applicable to other countries. Examines the assumption that the practice of public relations targets a variety of key organizational publics. Advances international public-relations…

  3. Estimating Risks and Relative Risks in Case-Base Studies under the Assumptions of Gene-Environment Independence and Hardy-Weinberg Equilibrium

    Science.gov (United States)

    Chui, Tina Tsz-Ting; Lee, Wen-Chung

    2014-01-01

    Many diseases result from the interactions between genes and the environment. An efficient method has been proposed for a case-control study to estimate the genetic and environmental main effects and their interactions, which exploits the assumptions of gene-environment independence and Hardy-Weinberg equilibrium. To estimate the absolute and relative risks, one needs to resort to an alternative design: the case-base study. In this paper, the authors show how to analyze a case-base study under the above dual assumptions. The approach is based on a conditional logistic regression of matched case-counterfactual-control data. It can be easily fitted with readily available statistical packages. When the dual assumptions are met, the method is approximately unbiased and has adequate coverage probabilities for confidence intervals. It also results in smaller variances and shorter confidence intervals as compared with a previous method for a case-base study which imposes neither assumption. PMID:25137392
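
    As a toy illustration of the dual assumptions this record leans on, the sketch below (Python, with a made-up allele frequency and exposure prevalence, not values from the paper) builds the expected genotype-by-environment distribution: Hardy-Weinberg equilibrium fixes the genotype marginal, and gene-environment independence makes the joint table an outer product of marginals.

    ```python
    import numpy as np

    def hwe_genotype_freqs(p):
        """Genotype frequencies (AA, Aa, aa) under Hardy-Weinberg equilibrium."""
        q = 1.0 - p
        return np.array([p**2, 2 * p * q, q**2])

    def joint_geno_env(p, env_prev):
        """Joint genotype-environment probabilities under G-E independence:
        the joint distribution is the outer product of the two marginals."""
        g = hwe_genotype_freqs(p)
        e = np.array([env_prev, 1.0 - env_prev])   # exposed, unexposed
        return np.outer(g, e)                      # rows AA/Aa/aa, cols E+/E-

    # Hypothetical allele frequency and exposure prevalence (illustrative only)
    probs = joint_geno_env(p=0.3, env_prev=0.25)
    print(probs, probs.sum())                      # sums to 1 by construction
    ```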

  4. Estimating risks and relative risks in case-base studies under the assumptions of gene-environment independence and Hardy-Weinberg equilibrium.

    Directory of Open Access Journals (Sweden)

    Tina Tsz-Ting Chui

    Full Text Available Many diseases result from the interactions between genes and the environment. An efficient method has been proposed for a case-control study to estimate the genetic and environmental main effects and their interactions, which exploits the assumptions of gene-environment independence and Hardy-Weinberg equilibrium. To estimate the absolute and relative risks, one needs to resort to an alternative design: the case-base study. In this paper, the authors show how to analyze a case-base study under the above dual assumptions. The approach is based on a conditional logistic regression of matched case-counterfactual-control data. It can be easily fitted with readily available statistical packages. When the dual assumptions are met, the method is approximately unbiased and has adequate coverage probabilities for confidence intervals. It also results in smaller variances and shorter confidence intervals as compared with a previous method for a case-base study which imposes neither assumption.

  5. Bank stress testing under different balance sheet assumptions

    OpenAIRE

    Busch, Ramona; Drescher, Christian; Memmel, Christoph

    2017-01-01

    Using unique supervisory survey data on the impact of a hypothetical interest rate shock on German banks, we analyse price and quantity effects on banks' net interest margin components under different balance sheet assumptions. In the first year, the cross-sectional variation of banks' simulated price effect is nearly eight times as large as that of the simulated quantity effect. After five years, however, the importance of the two effects converges. Large banks adjust their balance sheets mo...

  6. Unrealistic Assumptions in Economics: an Analysis under the Logic of Socioeconomic Processes

    Directory of Open Access Journals (Sweden)

    Leonardo Ivarola

    2014-11-01

    Full Text Available The realism of assumptions is an ongoing debate within the philosophy of economics. One of the most referenced papers in this matter belongs to Milton Friedman. He defends the use of unrealistic assumptions not only on pragmatic grounds, but also because of the intrinsic difficulty of determining the extent of realism. On the other hand, realists have criticized (and still do today) the use of unrealistic assumptions - such as the assumption of rational choice, perfect information, homogeneous goods, etc. However, they did not accompany their statements with a proper epistemological argument that supports their positions. This work aims to show that the realism of (a particular sort of) assumptions is clearly relevant when examining economic models, since the system under study (the real economies) is not compatible with the logic of invariance and of mechanisms, but with the logic of possibility trees. Because of this, models will not function as tools for predicting outcomes, but as representations of alternative scenarios, whose similarity to the real world will be examined in terms of the verisimilitude of a class of model assumptions.

  7. Do unreal assumptions pervert behaviour?

    DEFF Research Database (Denmark)

    Petersen, Verner C.

    The purpose of this paper is to take a critical look at some of the basic assumptions underlying the theories found in economics and discuss their implications for the models and the practices found in the management of business: assumptions relating to the primacy of self-interest; to resourceful, evaluative, maximising models of man; to incentive systems; and to agency theory. The expectation is that the unrealistic assumptions of economics have become taken for granted and tacitly included into theories and models of management, guiding business and management to behave in a fashion that apparently makes these assumptions become "true", thus in fact making theories and models become self-fulfilling prophecies. The major part of the paper then discusses how these assumptions and theories may pervert behaviour... The paper elucidates some...

  8. Are waves of relational assumptions eroding traditional analysis?

    Science.gov (United States)

    Meredith-Owen, William

    2013-11-01

    The author designates as 'traditional' those elements of psychoanalytic presumption and practice that have, in the wake of Fordham's legacy, helped to inform analytical psychology and expand our capacity to integrate the shadow. It is argued that this element of the broad spectrum of Jungian practice is in danger of erosion by the underlying assumptions of the relational approach, which is fast becoming the new establishment. If the maps of the traditional landscape of symbolic reference (primal scene, Oedipus et al.) are disregarded, analysts are left with only their own self-appointed authority with which to orientate themselves. This self-centric epistemological basis of the relationalists leads to a revision of 'analytic attitude' that may be therapeutic but is not essentially analytic. This theme is linked to the perennial challenge of balancing differentiation and merger and traced back, through Chasseguet-Smirgel, to its roots in Genesis. An endeavour is made to illustrate this within the Journal convention of clinically based discussion through a commentary on Colman's (2013) avowedly relational treatment of the case material presented in his recent Journal paper 'Reflections on knowledge and experience' and through an assessment of Jessica Benjamin's (2004) relational critique of Ron Britton's (1989) transference embodied approach. © 2013, The Society of Analytical Psychology.

  9. Being Explicit about Underlying Values, Assumptions and Views when Designing for Children in the IDC Community

    DEFF Research Database (Denmark)

    Skovbjerg, Helle Marie; Bekker, Tilde; Barendregt, Wolmet

    2016-01-01

    In this full-day workshop we want to discuss how the IDC community can make the underlying assumptions, values and views regarding children and childhood that inform design decisions more explicit. What assumptions do IDC designers and researchers make, and how can they be supported in reflecting on them? ... The workshop intends to share different approaches for uncovering and reflecting on values, assumptions and views about children and childhood in design.

  10. Underlying assumptions and core beliefs in anorexia nervosa and dieting.

    Science.gov (United States)

    Cooper, M; Turner, H

    2000-06-01

    To investigate assumptions and beliefs in anorexia nervosa and dieting. The Eating Disorder Belief Questionnaire (EDBQ) was administered to patients with anorexia nervosa, dieters and female controls. The patients scored more highly than the other two groups on assumptions about weight and shape, assumptions about eating and negative self-beliefs. The dieters scored more highly than the female controls on assumptions about weight and shape. The cognitive content of anorexia nervosa (both assumptions and negative self-beliefs) differs from that found in dieting. Assumptions about weight and shape may also distinguish dieters from female controls.

  11. Commentary: Considering Assumptions in Associations Between Music Preferences and Empathy-Related Responding

    Directory of Open Access Journals (Sweden)

    Susan A O'Neill

    2015-09-01

    Full Text Available This commentary considers some of the assumptions underpinning the study by Clark and Giacomantonio (2015). Their exploratory study examined relationships between young people's music preferences and their cognitive and affective empathy-related responses. First, the prescriptive assumption that music preferences can be measured according to how often an individual listens to a particular music genre is considered within axiology or value theory as a multidimensional construct (general, specific, and functional values). This is followed by a consideration of the causal assumption that if we increase young people's empathy through exposure to prosocial song lyrics this will increase their prosocial behavior. It is suggested that the predictive power of musical preferences on empathy-related responding might benefit from a consideration of the larger pattern of psychological and subjective wellbeing within the context of developmental regulation across ontogeny that involves mutually influential individual-context relations.

  12. A framework for the organizational assumptions underlying safety culture

    International Nuclear Information System (INIS)

    Packer, Charles

    2002-01-01

    The safety culture of the nuclear organization can be addressed at the three levels of culture proposed by Edgar Schein. The industry literature provides a great deal of insight at the artefact and espoused value levels, although as yet it remains somewhat disorganized. There is, however, an overall lack of understanding of the assumption level of safety culture. This paper describes a possible framework for conceptualizing the assumption level, suggesting that safety culture is grounded in unconscious beliefs about the nature of the safety problem, its solution and how to organize to achieve the solution. Using this framework, the organization can begin to uncover the assumptions at play in its normal operation, decisions and events and, if necessary, engage in a process to shift them towards assumptions more supportive of a strong safety culture. (author)

  13. The stable model semantics under the any-world assumption

    OpenAIRE

    Straccia, Umberto; Loyer, Yann

    2004-01-01

    The stable model semantics has become a dominating approach to complete the knowledge provided by a logic program by means of the Closed World Assumption (CWA). The CWA asserts that any atom whose truth-value cannot be inferred from the facts and rules is supposed to be false. This assumption is orthogonal to the so-called Open World Assumption (OWA), which asserts that every such atom's truth is supposed to be unknown. The topic of this paper is to be more fine-grained. Indeed, the objec...

  14. Dynamic Group Diffie-Hellman Key Exchange under standard assumptions

    International Nuclear Information System (INIS)

    Bresson, Emmanuel; Chevassut, Olivier; Pointcheval, David

    2002-01-01

    Authenticated Diffie-Hellman key exchange allows two principals communicating over a public network, and each holding public-private keys, to agree on a shared secret value. In this paper we study the natural extension of this cryptographic problem to a group of principals. We begin from existing formal security models and refine them to incorporate major missing details (e.g., strong-corruption and concurrent sessions). Within this model we define the execution of a protocol for authenticated dynamic group Diffie-Hellman and show that it is provably secure under the decisional Diffie-Hellman assumption. Our security result holds in the standard model and thus provides better security guarantees than previously published results in the random oracle model
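
    For orientation, here is a minimal, unauthenticated sketch (in Python) of the group Diffie-Hellman arithmetic the paper builds on: n parties holding secrets x1..xn derive g^(x1·x2·...·xn) mod p. The prime, generator, and message flow are illustrative toys; the protocol analysed in the paper adds the exchange of partial values and the authentication layer proved secure under the decisional Diffie-Hellman assumption.

    ```python
    import secrets

    # Toy parameters: far too small for real security; deployed protocols use
    # a vetted large prime-order group plus authenticated messages.
    p = 0xFFFFFFFFFFFFFFC5          # largest 64-bit prime, 2**64 - 59
    g = 5

    def gdh_shared_key(n_parties):
        xs = [secrets.randbelow(p - 2) + 1 for _ in range(n_parties)]
        value = g
        for x in xs:                # "upflow": each party exponentiates in turn
            value = pow(value, x, p)
        return value, xs            # value == g**(x1*x2*...*xn) mod p

    key, xs = gdh_shared_key(3)
    print(hex(key))
    ```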

  15. Oil production, oil prices, and macroeconomic adjustment under different wage assumptions

    International Nuclear Information System (INIS)

    Harvie, C.; Maleka, P.T.

    1992-01-01

    In a previous paper one of the authors developed a simple model to try to identify the possible macroeconomic adjustment processes arising in an economy experiencing a temporary period of oil production, under alternative wage adjustment assumptions, namely nominal and real wage rigidity. Certain assumptions were made regarding the characteristics of actual production, the permanent revenues generated from that oil production, and the net exports/imports of oil. The role of the price of oil, and possible changes in that price, was essentially ignored. Here we attempt to incorporate the price of oil, as well as changes in that price, in conjunction with the production of oil, the objective being to identify the contribution which the price of oil, and changes in it, make to the adjustment process itself. The emphasis in this paper is not on a mathematical derivation and analysis of the model's dynamics of adjustment or its comparative statics, but rather on the derivation of simulation results from the model, for a specific assumed case, using a numerical algorithm program conducive to the type of theoretical framework utilized here. The results presented suggest that although the adjustment profiles of the macroeconomic variables of interest, for either wage adjustment assumption, remain fundamentally the same, the magnitude of these adjustments is increased. Hence, to derive a more accurate picture of the dimensions of adjustment of these macroeconomic variables, it is essential to include the price of oil as well as changes in that price. (Author)

  16. Investigation of assumptions underlying current safety guidelines on EM-induced nerve stimulation

    Science.gov (United States)

    Neufeld, Esra; Vogiatzis Oikonomidis, Ioannis; Iacono, Maria Ida; Angelone, Leonardo M.; Kainz, Wolfgang; Kuster, Niels

    2016-06-01

    An intricate network of a variety of nerves is embedded within the complex anatomy of the human body. Although nerves are shielded from unwanted excitation, they can still be stimulated by external electromagnetic sources that induce strongly non-uniform field distributions. Current exposure safety standards designed to limit unwanted nerve stimulation are based on a series of explicit and implicit assumptions and simplifications. This paper demonstrates the applicability of functionalized anatomical phantoms with integrated coupled electromagnetic and neuronal dynamics solvers for investigating the impact of magnetic resonance exposure on nerve excitation within the full complexity of the human anatomy. The impact of neuronal dynamics models, temperature and local hot-spots, nerve trajectory and potential smoothing, anatomical inhomogeneity, and pulse duration on nerve stimulation was evaluated. As a result, multiple assumptions underlying current safety standards are questioned. It is demonstrated that coupled EM-neuronal dynamics modeling involving realistic anatomies is valuable to establish conservative safety criteria.

  17. A Proposal for Testing Local Realism Without Using Assumptions Related to Hidden Variable States

    Science.gov (United States)

    Ryff, Luiz Carlos

    1996-01-01

    A feasible experiment is discussed which allows us to prove Bell's theorem for two particles without using an inequality. The experiment could be used to test local realism against quantum mechanics without the introduction of additional assumptions related to hidden-variable states. Only assumptions based on direct experimental observation are needed.

  18. Forecasting Value-at-Risk under Different Distributional Assumptions

    Directory of Open Access Journals (Sweden)

    Manuela Braione

    2016-01-01

    Full Text Available Financial asset returns are known to be conditionally heteroskedastic and generally non-normally distributed, fat-tailed and often skewed. These features must be taken into account to produce accurate forecasts of Value-at-Risk (VaR). We provide a comprehensive look at the problem by considering the impact that different distributional assumptions have on the accuracy of both univariate and multivariate GARCH models in out-of-sample VaR prediction. The set of analyzed distributions comprises the normal, Student, Multivariate Exponential Power and their corresponding skewed counterparts. The accuracy of the VaR forecasts is assessed by implementing standard statistical backtesting procedures used to rank the different specifications. The results show the importance of allowing for heavy tails and skewness in the distributional assumption, with the skew-Student outperforming the others across all tests and confidence levels.
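
    A minimal sketch of the quantity being backtested, assuming a given one-day volatility forecast (the paper obtains it from GARCH models; here it is just a number): the VaR moves with the distributional assumption through the standardized quantile, with the fat-tailed Student distribution giving the larger 1% VaR.

    ```python
    import numpy as np
    from scipy import stats

    def var_forecast(sigma, alpha=0.01, dist="normal", nu=5):
        """One-day VaR at level alpha for a zero-mean return with st. dev. sigma."""
        if dist == "normal":
            q = stats.norm.ppf(alpha)
        elif dist == "student":
            # Scale the t quantile so the innovation has unit variance.
            q = stats.t.ppf(alpha, df=nu) * np.sqrt((nu - 2) / nu)
        else:
            raise ValueError(dist)
        return -sigma * q           # reported as a positive loss

    sigma_hat = 0.012               # hypothetical GARCH volatility forecast
    print(var_forecast(sigma_hat, dist="normal"),
          var_forecast(sigma_hat, dist="student"))
    ```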

  19. Recursive Subspace Identification of AUV Dynamic Model under General Noise Assumption

    Directory of Open Access Journals (Sweden)

    Zheping Yan

    2014-01-01

    Full Text Available A recursive subspace identification algorithm for autonomous underwater vehicles (AUVs) is proposed in this paper. Due to its advantages in handling nonlinearities and couplings, the AUV model investigated here is, for the first time, constructed as a Hammerstein model with nonlinear feedback in the linear part. To better take environment and sensor noises into consideration, the identification problem is treated as an errors-in-variables (EIV) one, which means that the identification procedure operates under a general noise assumption. To make the algorithm recursive, a propagator method (PM) based subspace approach is extended into the EIV framework to form the recursive identification method called the PM-EIV algorithm. With several identification experiments carried out on the AUV simulation platform, the proposed algorithm demonstrates its effectiveness and feasibility.

  20. Shattering world assumptions: A prospective view of the impact of adverse events on world assumptions.

    Science.gov (United States)

    Schuler, Eric R; Boals, Adriel

    2016-05-01

    Shattered Assumptions theory (Janoff-Bulman, 1992) posits that experiencing a traumatic event has the potential to diminish the degree of optimism in the assumptions of the world (assumptive world), which could lead to the development of posttraumatic stress disorder. Prior research assessed the assumptive world with a measure that was recently reported to have poor psychometric properties (Kaler et al., 2008). The current study had 3 aims: (a) to assess the psychometric properties of a recently developed measure of the assumptive world, (b) to retrospectively examine how prior adverse events affected the optimism of the assumptive world, and (c) to measure the impact of an intervening adverse event. An 8-week prospective design with a college sample (N = 882 at Time 1 and N = 511 at Time 2) was used to assess the study objectives. We split adverse events into those that were objectively or subjectively traumatic in nature. The new measure exhibited adequate psychometric properties. The report of a prior objective or subjective trauma at Time 1 was related to a less optimistic assumptive world. Furthermore, participants who experienced an intervening objectively traumatic event evidenced a decrease in optimistic views of the world compared with those who did not experience an intervening adverse event. We found support for Shattered Assumptions theory retrospectively and prospectively using a reliable measure of the assumptive world. We discuss future assessments of the measure of the assumptive world and clinical implications to help rebuild the assumptive world with current therapies. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  1. Wrong assumptions in the financial crisis

    NARCIS (Netherlands)

    Aalbers, M.B.

    2009-01-01

    Purpose - The purpose of this paper is to show how some of the assumptions about the current financial crisis are wrong because they misunderstand what takes place in the mortgage market. Design/methodology/approach - The paper discusses four wrong assumptions: one related to regulation, one to

  2. Sensitivity Analysis Without Assumptions.

    Science.gov (United States)

    Ding, Peng; VanderWeele, Tyler J

    2016-05-01

    Unmeasured confounding may undermine the validity of causal inference with observational studies. Sensitivity analysis provides an attractive way to partially circumvent this issue by assessing the potential influence of unmeasured confounding on causal conclusions. However, previous sensitivity analysis approaches often make strong and untestable assumptions such as having an unmeasured confounder that is binary, or having no interaction between the effects of the exposure and the confounder on the outcome, or having only one unmeasured confounder. Without imposing any assumptions on the unmeasured confounder or confounders, we derive a bounding factor and a sharp inequality such that the sensitivity analysis parameters must satisfy the inequality if an unmeasured confounder is to explain away the observed effect estimate or reduce it to a particular level. Our approach is easy to implement and involves only two sensitivity parameters. Surprisingly, our bounding factor, which makes no simplifying assumptions, is no more conservative than a number of previous sensitivity analysis techniques that do make assumptions. Our new bounding factor implies not only the traditional Cornfield conditions that both the relative risk of the exposure on the confounder and that of the confounder on the outcome must satisfy but also a high threshold that the maximum of these relative risks must satisfy. Furthermore, this new bounding factor can be viewed as a measure of the strength of confounding between the exposure and the outcome induced by a confounder.
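
    The bounding factor described here takes a closed form; a sketch with our own variable names (rr_eu for the exposure-confounder relative risk, rr_ud for the confounder-outcome relative risk) follows. Dividing an observed risk ratio by the factor gives the smallest true effect consistent with confounding of that strength.

    ```python
    def bounding_factor(rr_eu, rr_ud):
        """Ding & VanderWeele bounding factor for an unmeasured confounder."""
        return rr_eu * rr_ud / (rr_eu + rr_ud - 1.0)

    def adjusted_lower_bound(rr_obs, rr_eu, rr_ud):
        """Smallest true risk ratio consistent with confounding of this strength."""
        return rr_obs / bounding_factor(rr_eu, rr_ud)

    # An observed RR of 2.0 against a confounder with both parameters equal to
    # 2: bounding factor 4/3, so the true effect is still at least 1.5.
    print(adjusted_lower_bound(2.0, 2.0, 2.0))
    ```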

  3. Political Assumptions Underlying Pedagogies of National Education: The Case of Student Teachers Teaching 'British Values' in England

    Science.gov (United States)

    Sant, Edda; Hanley, Chris

    2018-01-01

    Teacher education in England now requires that student teachers follow practices that do not undermine "fundamental British values" where these practices are assessed against a set of ethics and behaviour standards. This paper examines the political assumptions underlying pedagogical interpretations about the education of national…

  4. Consequences of Violated Equating Assumptions under the Equivalent Groups Design

    Science.gov (United States)

    Lyren, Per-Erik; Hambleton, Ronald K.

    2011-01-01

    The equal ability distribution assumption associated with the equivalent groups equating design was investigated in the context of a selection test for admission to higher education. The purpose was to assess the consequences for the test-takers in terms of receiving improperly high or low scores compared to their peers, and to find strong…

  5. The relevance of 'theory rich' bridge assumptions

    NARCIS (Netherlands)

    Lindenberg, S

    1996-01-01

    Actor models are increasingly being used as a form of theory building in sociology because they can better represent the causal mechanisms that connect macro variables. However, actor models need additional assumptions, especially so-called bridge assumptions, for filling in the relatively empty

  6. Linear regression and the normality assumption.

    Science.gov (United States)

    Schmidt, Amand F; Finan, Chris

    2017-12-16

    Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings, such transformations are often unnecessary and, worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage, i.e., the number of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10), violations of this normality assumption often do not noticeably impact results. In contrast, assumptions on the parametric model, the absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and, worse, may bias estimates due to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
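
    A small simulation in the spirit of the commentary (our own setup, not the authors' code): with clearly skewed errors and a reasonably large n, the conventional 95% confidence interval for the OLS slope still covers the true value at close to the nominal rate.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def coverage(n=1000, reps=2000, beta=0.5):
        hits = 0
        for _ in range(reps):
            x = rng.normal(size=n)
            eps = rng.exponential(1.0, size=n) - 1.0    # skewed, mean-zero errors
            y = beta * x + eps
            xc = x - x.mean()
            b = (xc @ y) / (xc @ xc)                    # OLS slope
            resid = y - y.mean() - b * xc
            se = np.sqrt(resid @ resid / (n - 2) / (xc @ xc))
            hits += abs(b - beta) <= 1.96 * se
        return hits / reps

    print(coverage())   # close to 0.95 despite clearly non-normal errors
    ```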

  7. Limitations to the Dutch cannabis toleration policy: Assumptions underlying the reclassification of cannabis above 15% THC.

    Science.gov (United States)

    Van Laar, Margriet; Van Der Pol, Peggy; Niesink, Raymond

    2016-08-01

    The Netherlands has seen an increase in Δ9-tetrahydrocannabinol (THC) concentrations from approximately 8% in the 1990s up to 20% in 2004. Increased cannabis potency may lead to higher THC-exposure and cannabis related harm. The Dutch government officially condones the sale of cannabis from so called 'coffee shops', and the Opium Act distinguishes cannabis as a Schedule II drug with 'acceptable risk' from other drugs with 'unacceptable risk' (Schedule I). Even in 1976, however, cannabis potency was taken into account by distinguishing hemp oil as a Schedule I drug. In 2011, an advisory committee recommended tightening up legislation, leading to a 2013 bill proposing the reclassification of high potency cannabis products with a THC content of 15% or more as a Schedule I drug. The purpose of this measure was twofold: to reduce public health risks and to reduce illegal cultivation and export of cannabis by increasing punishment. This paper focuses on the public health aspects and describes the (explicit and implicit) assumptions underlying this '15% THC measure', as well as to what extent these are supported by scientific research. Based on scientific literature and other sources of information, we conclude that the 15% measure can, in theory, provide a slight health benefit for specific groups of cannabis users (i.e., frequent users preferring strong cannabis, purchasing from coffee shops, using 'steady quantities' and not changing their smoking behaviour), but certainly not for all cannabis users. These gains should be weighed against the investment in enforcement and the risk of unintended (adverse) effects. Given the many assumptions and uncertainty about the nature and extent of the expected buying and smoking behaviour changes, the measure is a political choice and based on thin evidence. Copyright © 2016 Springer. Published by Elsevier B.V. All rights reserved.

  8. Application of random survival forests in understanding the determinants of under-five child mortality in Uganda in the presence of covariates that satisfy the proportional and non-proportional hazards assumption.

    Science.gov (United States)

    Nasejje, Justine B; Mwambi, Henry

    2017-09-07

    Uganda, just like any other Sub-Saharan African country, has a high under-five child mortality rate. To inform policy on intervention strategies, sound statistical methods are required to critically identify factors strongly associated with under-five child mortality rates. The Cox proportional hazards model has been a common choice in analysing data to understand factors strongly associated with high child mortality rates, taking age as the time-to-event variable. However, due to its restrictive proportional hazards (PH) assumption, some covariates of interest which do not satisfy the assumption are often excluded in the analysis to avoid mis-specifying the model; otherwise, using covariates that clearly violate the assumption would mean invalid results. Survival trees and random survival forests are increasingly becoming popular in analysing survival data, particularly in the case of large survey data, and could be attractive alternatives to models with the restrictive PH assumption. In this article, we adopt random survival forests, which have never been used in understanding factors affecting under-five child mortality rates in Uganda, using Demographic and Health Survey data. Thus the first part of the analysis is based on the use of the classical Cox PH model and the second part of the analysis is based on the use of random survival forests in the presence of covariates that do not necessarily satisfy the PH assumption. Random survival forests and the Cox proportional hazards model agree that the sex of the household head, sex of the child, and number of births in the past year are strongly associated with under-five child mortality in Uganda, given that all three covariates satisfy the PH assumption. Random survival forests further demonstrated that covariates that were originally excluded from the earlier analysis due to violation of the PH assumption were important in explaining under-five child mortality rates. These covariates include the number of children under the
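
    The screening step implied by this workflow, checking the PH assumption per covariate before choosing between a Cox model and a random survival forest, might be sketched as follows, assuming the lifelines package and a hypothetical toy data frame (the column names are stand-ins, not DHS variables).

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.statistics import proportional_hazard_test

    # Hypothetical toy data: duration in months, event indicator, one covariate.
    df = pd.DataFrame({
        "duration": [5, 12, 30, 44, 60, 8, 19, 51, 27, 36],
        "event":    [1, 1, 0, 1, 0, 1, 0, 1, 1, 0],
        "sex_head": [0, 1, 0, 1, 1, 0, 1, 0, 1, 0],
    })

    cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
    # Scaled-Schoenfeld-residual test: small p-values flag PH violations,
    # pointing to covariates better handled by a random survival forest.
    result = proportional_hazard_test(cph, df, time_transform="rank")
    print(result.summary)
    ```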

  9. Timber value—a matter of choice: a study of how end use assumptions affect timber values.

    Science.gov (United States)

    John H. Beuter

    1971-01-01

    The relationship between estimated timber values and actual timber prices is discussed. Timber values are related to how, where, and when the timber is used. An analysis demonstrates the relative values of a typical Douglas-fir stand under assumptions about timber use.

  10. Economic assumptions for evaluating reactor-related options for managing plutonium

    International Nuclear Information System (INIS)

    Rothwell, G.

    1996-01-01

    This paper discusses the economic assumptions in the U.S. National Academy of Sciences' report, Management and Disposition of Excess Weapons Plutonium: Reactor-Related Options (1995). It reviews the Net Present Value approach for discounting and comparing the costs and benefits of reactor-related options. It argues that because risks associated with the returns to plutonium management are unlikely to be constant over time, it is preferable to use a real risk-free rate to discount cash flows and explicitly describe the probability distributions for costs and benefits, allowing decision makers to determine the risk premium of each option. As a baseline for comparison, it assumes that one economic benefit of changing the current plutonium management system is a reduction in on-going Surveillance and Maintenance (S and M) costs. This reduction in the present value of S and M costs can be compared with the discounted costs of each option. These costs include direct construction costs, indirect costs, operating costs minus revenues, and decontamination and decommissioning expenses. The paper also discusses how to conduct an uncertainty analysis. It finishes by summarizing conclusions and recommendations and discusses how these recommendations might apply to the evaluation of Russian plutonium management options. (author)
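
    A minimal sketch of the recommended comparison, with illustrative figures rather than the report's: discount each option's cash flows at a real risk-free rate and weigh them against the avoided surveillance-and-maintenance stream.

    ```python
    def npv(cashflows, rate):
        """Present value of (year, amount) pairs at a real discount rate."""
        return sum(amount / (1.0 + rate) ** year for year, amount in cashflows)

    risk_free = 0.03                                         # real risk-free rate
    option_costs = [(0, -500.0), (1, -200.0), (2, -200.0)]   # $M, illustrative
    avoided_sm = [(t, 30.0) for t in range(1, 31)]           # $M/yr S&M avoided

    print(npv(option_costs, risk_free) + npv(avoided_sm, risk_free))
    ```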

  11. Capturing Assumptions while Designing a Verification Model for Embedded Systems

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    A formal proof of a system's correctness typically holds under a number of assumptions. Leaving them implicit raises the chance of using the system in a context that violates some assumptions, which in turn may invalidate the correctness proof. The goal of this paper is to show how combining

  12. Educational Technology as a Subversive Activity: Questioning Assumptions Related to Teaching and Leading with Technology

    Science.gov (United States)

    Kruger-Ross, Matthew J.; Holcomb, Lori B.

    2012-01-01

    The use of educational technologies is grounded in the assumptions of teachers, learners, and administrators. Assumptions are choices that structure our understandings and help us make meaning. Current advances in Web 2.0 and social media technologies challenge our assumptions about teaching and learning. The intersection of technology and…

  13. Under What Assumptions Do Site-by-Treatment Instruments Identify Average Causal Effects?

    Science.gov (United States)

    Reardon, Sean F.; Raudenbush, Stephen W.

    2013-01-01

    The increasing availability of data from multi-site randomized trials provides a potential opportunity to use instrumental variables methods to study the effects of multiple hypothesized mediators of the effect of a treatment. We derive nine assumptions needed to identify the effects of multiple mediators when using site-by-treatment interactions…

  14. Incorporating assumption deviation risk in quantitative risk assessments: A semi-quantitative approach

    International Nuclear Information System (INIS)

    Khorsandi, Jahon; Aven, Terje

    2017-01-01

    Quantitative risk assessments (QRAs) of complex engineering systems are based on numerous assumptions and expert judgments, as there is limited information available for supporting the analysis. In addition to sensitivity analyses, the concept of assumption deviation risk has been suggested as a means for explicitly considering the risk related to inaccuracies and deviations in the assumptions, which can significantly impact the results of the QRAs. However, challenges remain for its practical implementation, given the number of assumptions and the magnitude of deviations to be considered. This paper presents an approach for integrating an assumption deviation risk analysis as part of QRAs. The approach begins by identifying the safety objectives that the QRA aims to support, and then identifies critical assumptions with respect to ensuring the objectives are met. Key issues addressed include the deviations required to violate the safety objectives, the uncertainties related to the occurrence of such events, and the strength of knowledge supporting the assessments. Three levels of assumptions are considered, which include assumptions related to the system's structural and operational characteristics, the effectiveness of the established barriers, as well as the consequence analysis process. The approach is illustrated for the case of an offshore installation. - Highlights: • An approach for assessing the risk of deviations in QRA assumptions is presented. • Critical deviations and uncertainties related to their occurrence are addressed. • The analysis promotes critical thinking about the foundation and results of QRAs. • The approach is illustrated for the case of an offshore installation.

  15. Implicit assumptions underlying simple harvest models of marine bird populations can mislead environmental management decisions.

    Science.gov (United States)

    O'Brien, Susan H; Cook, Aonghais S C P; Robinson, Robert A

    2017-10-01

    Assessing the potential impact of additional mortality from anthropogenic causes on animal populations requires detailed demographic information. However, these data are frequently lacking, making simple algorithms, which require little data, appealing. Because of their simplicity, these algorithms often rely on implicit assumptions, some of which may be quite restrictive. Potential Biological Removal (PBR) is a simple harvest model that estimates the number of additional mortalities that a population can theoretically sustain without causing population extinction. However, PBR relies on a number of implicit assumptions, particularly around density dependence and population trajectory, that limit its applicability in many situations. Among several uses, it has been widely employed in Europe in Environmental Impact Assessments (EIA) to examine the acceptability of potential effects of offshore wind farms on marine bird populations. As a case study, we use PBR to estimate the number of additional mortalities that a population with characteristics typical of a seabird population can theoretically sustain. We incorporated this level of additional mortality within Leslie matrix models to test assumptions within the PBR algorithm about density dependence and current population trajectory. Our analyses suggest that the PBR algorithm identifies levels of mortality which cause population declines for most population trajectories and forms of population regulation. Consequently, we recommend that practitioners do not use PBR in an EIA context for offshore wind energy developments. Rather than using simple algorithms that rely on potentially invalid implicit assumptions, we recommend use of Leslie matrix models for assessing the impact of additional mortality on a population, enabling the user to explicitly define assumptions and test their importance. Copyright © 2017 Elsevier Ltd. All rights reserved.
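
    The paper's core move, feeding the PBR removal level into an explicit Leslie matrix and examining the trajectory, can be sketched as below. The PBR formula is the standard Wade (1998) form; the two-stage seabird-like matrix and all parameter values are our own illustrative choices, not the paper's.

    ```python
    import numpy as np

    def pbr(n_min, r_max, f_r=0.5):
        """Potential Biological Removal (Wade 1998): N_min * (R_max / 2) * F_r."""
        return n_min * 0.5 * r_max * f_r

    # Two-stage (juvenile, adult) Leslie matrix with seabird-like rates.
    fecundity, s_juv, s_ad = 0.2, 0.7, 0.9
    L = np.array([[0.0,   fecundity * s_ad],
                  [s_juv, s_ad]])

    n = np.array([500.0, 1000.0])
    removals = pbr(n.sum(), r_max=0.05)      # fixed annual removals
    for _ in range(50):
        n = L @ n
        n -= removals * n / n.sum()          # spread extra mortality over stages
        n = np.clip(n, 0.0, None)
    print(removals, n.sum())  # whether the total declines depends on trajectory
    ```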

  16. The Importance of the Assumption of Uncorrelated Errors in Psychometric Theory

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.; Patelis, Thanos

    2015-01-01

    A critical discussion of the assumption of uncorrelated errors in classical psychometric theory and its applications is provided. It is pointed out that this assumption is essential for a number of fundamental results and underlies the concept of parallel tests, the Spearman-Brown prophecy formula and the correction for attenuation formula, as well as…

  17. School Principals' Assumptions about Human Nature: Implications for Leadership in Turkey

    Science.gov (United States)

    Sabanci, Ali

    2008-01-01

    This article considers principals' assumptions about human nature in Turkey and the relationship between the assumptions held and the leadership style adopted in schools. The findings show that school principals hold Y-type assumptions and prefer a relationship-oriented style in their relations with assistant principals. However, both principals…

  18. Extracurricular Business Planning Competitions: Challenging the Assumptions

    Science.gov (United States)

    Watson, Kayleigh; McGowan, Pauric; Smith, Paul

    2014-01-01

    Business planning competitions [BPCs] are a commonly offered yet under-examined extracurricular activity. Given the extent of sceptical comment about business planning, this paper offers what the authors believe is a much-needed critical discussion of the assumptions that underpin the provision of such competitions. In doing so it is suggested…

  19. MONITORED GEOLOGIC REPOSITORY LIFE CYCLE COST ESTIMATE ASSUMPTIONS DOCUMENT

    International Nuclear Information System (INIS)

    R.E. Sweeney

    2001-01-01

    The purpose of this assumptions document is to provide general scope, strategy, technical basis, schedule and cost assumptions for the Monitored Geologic Repository (MGR) life cycle cost (LCC) estimate and schedule update incorporating information from the Viability Assessment (VA), License Application Design Selection (LADS), 1999 Update to the Total System Life Cycle Cost (TSLCC) estimate and from other related and updated information. This document is intended to generally follow the assumptions outlined in the previous MGR cost estimates and as further prescribed by DOE guidance

  20. Monitored Geologic Repository Life Cycle Cost Estimate Assumptions Document

    International Nuclear Information System (INIS)

    Sweeney, R.

    2000-01-01

    The purpose of this assumptions document is to provide general scope, strategy, technical basis, schedule and cost assumptions for the Monitored Geologic Repository (MGR) life cycle cost estimate and schedule update incorporating information from the Viability Assessment (VA), License Application Design Selection (LADS), 1999 Update to the Total System Life Cycle Cost (TSLCC) estimate and from other related and updated information. This document is intended to generally follow the assumptions outlined in the previous MGR cost estimates and as further prescribed by DOE guidance

  1. Weak convergence of Jacobian determinants under asymmetric assumptions

    Directory of Open Access Journals (Sweden)

    Teresa Alberico

    2012-05-01

    Full Text Available Let $\Omega$ be a bounded, sufficiently smooth open set in $\mathbb{R}^2$ and let $f_k=(u_k,v_k)$ and $f=(u,v)$ be mappings in the Sobolev space $W^{1,2}(\Omega,\mathbb{R}^2)$. We prove that if the sequence of Jacobians $J_{f_k}$ converges to a measure $\mu$ in the sense of measures, and if one allows different assumptions on the two components of $f_k$ and $f$, e.g. $$u_k \rightharpoonup u \ \text{weakly in } W^{1,2}(\Omega), \qquad v_k \rightharpoonup v \ \text{weakly in } W^{1,q}(\Omega)$$ for some $q\in(1,2)$, then $$d\mu = J_f\,dz.$$ Moreover, we show that this result is optimal in the sense that the conclusion fails for $q=1$. On the other hand, we prove that the identity $d\mu = J_f\,dz$ remains valid also in the case $q=1$, but it is then necessary to require that $u_k$ converges weakly to $u$ in a Zygmund-Sobolev space with a slightly higher degree of regularity than $W^{1,2}(\Omega)$, namely $$u_k \rightharpoonup u \ \text{weakly in } W^{1,L^2\log^\alpha L}(\Omega)$$ for some $\alpha>1$.

  2. Adult Learning Assumptions

    Science.gov (United States)

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to examine Knowles' theory of andragogy and his six assumptions of how adults learn while providing evidence to support two of his assumptions based on the theory of andragogy. As no single theory explains how adults learn, it can best be assumed that adults learn through the accumulation of formal and informal…

  3. Relating Climate Change Risks to Water Supply Planning Assumptions: Recent Applications by the U.S. Bureau of Reclamation (Invited)

    Science.gov (United States)

    Brekke, L. D.

    2009-12-01

    Presentation highlights recent methods carried out by Reclamation to incorporate climate change and variability information into water supply assumptions for longer-term planning. Presentation also highlights limitations of these methods and possible method adjustments that might be made to address these limitations. Reclamation was established more than one hundred years ago with a mission centered on the construction of irrigation and hydropower projects in the Western United States. Reclamation’s mission has evolved since its creation to include other activities, such as municipal and industrial water supply projects, ecosystem restoration, and the protection and management of water supplies. Reclamation continues to explore ways to better address mission objectives, often considering proposals to develop new infrastructure and/or modify long-term criteria for operations. Such studies typically feature operations analysis to disclose benefits and effects of a given proposal, which are sensitive to assumptions made about future water supplies, water demands, and operating constraints. Development of these assumptions requires consideration of more fundamental future drivers such as land use, demographics, and climate. On the matter of establishing planning assumptions for water supplies under climate change, Reclamation has applied several methods. This presentation highlights two activities: the first focuses on potential changes in hydroclimate frequencies, and the second focuses on potential changes in hydroclimate period-statistics. The first activity took place in the Colorado River Basin, where there was interest in the interarrival possibilities of drought and surplus events of varying severity, relevant to proposals on new criteria for handling lower basin shortages. The second activity occurred in California’s Central Valley, where stakeholders were interested in how projected climate change possibilities translated into changes in hydrologic and

  4. Impacts of cloud overlap assumptions on radiative budgets and heating fields in convective regions

    Science.gov (United States)

    Wang, XiaoCong; Liu, YiMin; Bao, Qing

    2016-01-01

    Impacts of cloud overlap assumptions on radiative budgets and heating fields are explored with the aid of a cloud-resolving model (CRM), which provided cloud geometry as well as cloud micro and macro properties. Large-scale forcing data to drive the CRM are from the TRMM Kwajalein Experiment and the Global Atmospheric Research Program's Atlantic Tropical Experiment field campaigns, during which abundant convective systems were observed. The investigated overlap assumptions include those that were traditional and widely used in the past and the one recently addressed by Hogan and Illingworth (2000), in which the vertically projected cloud fraction is expressed by a linear combination of maximum and random overlap, with the weighting coefficient depending on the so-called decorrelation length Lcf. Results show that both shortwave and longwave cloud radiative forcings (SWCF/LWCF) are significantly underestimated under maximum (MO) and maximum-random (MRO) overlap assumptions, whereas remarkably overestimated under the random overlap (RO) assumption, in comparison with those using the CRM's inherent cloud geometry. These biases can reach as high as 100 W m-2 for SWCF and 60 W m-2 for LWCF. By its very nature, the general overlap (GenO) assumption exhibits an encouraging performance on both SWCF and LWCF simulations, with the biases reduced almost 3-fold compared with traditional overlap assumptions. The superiority of the GenO assumption is also manifested in the simulation of shortwave and longwave radiative heating fields, which are either significantly overestimated or underestimated under traditional overlap assumptions. The study also points out the deficiency of assuming a constant Lcf in the GenO assumption. Further examinations indicate that the CRM-diagnosed Lcf varies among different cloud types and tends to be stratified in the vertical. The new parameterization that takes into account the variation of Lcf in the vertical well reproduces such a relationship and
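
    For two adjacent layers, the overlap assumptions compared above reduce to simple formulas, sketched below in Python; the blending weight follows the Hogan and Illingworth (2000) decorrelation form alpha = exp(-dz/Lcf), and the numbers are illustrative.

    ```python
    import numpy as np

    def combined_cover(c1, c2, dz, l_cf):
        c_max = max(c1, c2)                           # maximum overlap (MO)
        c_rand = c1 + c2 - c1 * c2                    # random overlap (RO)
        alpha = np.exp(-dz / l_cf)                    # decorrelation weight
        return alpha * c_max + (1 - alpha) * c_rand   # general overlap (GenO)

    # Nearby layers behave like MO, widely separated layers like RO:
    print(combined_cover(0.4, 0.5, dz=200.0, l_cf=2000.0))    # ~0.52
    print(combined_cover(0.4, 0.5, dz=8000.0, l_cf=2000.0))   # ~0.70
    ```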

  5. The Immoral Assumption Effect: Moralization Drives Negative Trait Attributions.

    Science.gov (United States)

    Meindl, Peter; Johnson, Kate M; Graham, Jesse

    2016-04-01

    Jumping to negative conclusions about other people's traits is judged as morally bad by many people. Despite this, across six experiments (total N = 2,151), we find that multiple types of moral evaluations, even evaluations related to open-mindedness, tolerance, and compassion, play a causal role in these potentially pernicious trait assumptions. Our results also indicate that moralization affects negative, but not positive, trait assumptions, and that the effect of morality on negative assumptions cannot be explained merely by people's general (nonmoral) preferences or other factors that distinguish moral and nonmoral traits, such as controllability or desirability. Together, these results suggest that one of the more destructive human tendencies, making negative assumptions about others, can be caused by the better angels of our nature. © 2016 by the Society for Personality and Social Psychology, Inc.

  6. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

    Full Text Available Classical Respondent-Driven Sampling (RDS) estimators are based on a Markov Process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for the bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.
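
    A toy illustration of the counterfactual assumption under test (our own miniature contact network, not the study's data): the same referral random walk run with and without repeat interviews, where any divergence matters only as the sampling fraction grows.

    ```python
    import random

    random.seed(7)
    # 20-node toy contact network; each node refers among three acquaintances.
    neighbors = {i: [(i - 1) % 20, (i + 1) % 20, (i + 5) % 20] for i in range(20)}

    def rds_sample(n, with_replacement):
        seen, current = [], 0
        while len(seen) < n:
            if with_replacement or current not in seen:
                seen.append(current)
            current = random.choice(neighbors[current])
        return seen

    # With replacement, repeat visits crowd out distinct respondents at large
    # sampling fractions (15 of 20 nodes here); without, all 15 are distinct.
    print(len(set(rds_sample(15, True))), len(set(rds_sample(15, False))))
    ```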

  7. Critically Challenging Some Assumptions in HRD

    Science.gov (United States)

    O'Donnell, David; McGuire, David; Cross, Christine

    2006-01-01

    This paper sets out to critically challenge five interrelated assumptions prominent in the (human resource development) HRD literature. These relate to: the exploitation of labour in enhancing shareholder value; the view that employees are co-contributors to and co-recipients of HRD benefits; the distinction between HRD and human resource…

  8. Ontological, Epistemological and Methodological Assumptions: Qualitative versus Quantitative

    Science.gov (United States)

    Ahmed, Abdelhamid

    2008-01-01

    The review to follow is a comparative analysis of two studies conducted in the field of TESOL in Education published in "TESOL QUARTERLY." The aspects to be compared are as follows. First, a brief description of each study will be presented. Second, the ontological, epistemological and methodological assumptions underlying each study…

  9. Tale of Two Courthouses: A Critique of the Underlying Assumptions in Chronic Disease Self-Management for Aboriginal People

    Directory of Open Access Journals (Sweden)

    Isabelle Ellis

    2009-12-01

    Full Text Available This article reviews the assumptions that underpin the commonly implemented Chronic Disease Self-Management models: namely, that there is a clear set of instructions for patients to comply with, which all health care providers agree with; and that the health care provider and the patient agree with the chronic disease self-management plan that was developed as part of a consultation. These assumptions are evaluated for their validity in the remote health care context, particularly for Aboriginal people. The assumptions have been found to lack validity in this context; therefore an alternative model to enhance chronic disease care is proposed.

  10. Quasi-experimental study designs series-paper 7: assessing the assumptions.

    Science.gov (United States)

    Bärnighausen, Till; Oldenburg, Catherine; Tugwell, Peter; Bommer, Christian; Ebert, Cara; Barreto, Mauricio; Djimeu, Eric; Haber, Noah; Waddington, Hugh; Rockers, Peter; Sianesi, Barbara; Bor, Jacob; Fink, Günther; Valentine, Jeffrey; Tanner, Jeffrey; Stanley, Tom; Sierra, Eduardo; Tchetgen, Eric Tchetgen; Atun, Rifat; Vollmer, Sebastian

    2017-09-01

    Quasi-experimental designs are gaining popularity in epidemiology and health systems research, in particular for the evaluation of health care practice, programs, and policy, because they allow strong causal inferences without randomized controlled experiments. We describe the concepts underlying five important quasi-experimental designs: Instrumental Variables, Regression Discontinuity, Interrupted Time Series, Fixed Effects, and Difference-in-Differences designs. We illustrate each of the designs with an example from health research. We then describe the assumptions required for each of the designs to ensure valid causal inference and discuss the tests available to examine the assumptions. Copyright © 2017 Elsevier Inc. All rights reserved.
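
    As a pocket example of one of the five designs, here is a Difference-in-Differences estimate computed from four group-by-period means (illustrative numbers, not from the article), valid under the parallel-trends assumption.

    ```python
    # Four group-by-period means; the estimate is the treated change minus the
    # control change, unbiased if both groups would have trended in parallel.
    means = {("treated", "pre"): 10.0, ("treated", "post"): 14.0,
             ("control", "pre"):  9.0, ("control", "post"): 11.0}

    did = ((means[("treated", "post")] - means[("treated", "pre")])
           - (means[("control", "post")] - means[("control", "pre")]))
    print(did)   # 2.0
    ```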

  11. Estimation of the energy loss at the blades in rowing: common assumptions revisited.

    Science.gov (United States)

    Hofmijster, Mathijs; De Koning, Jos; Van Soest, A J

    2010-08-01

    In rowing, power is inevitably lost as kinetic energy is imparted to the water during push-off with the blades. Power loss is estimated from reconstructed blade kinetics and kinematics. Traditionally, it is assumed that the oar is completely rigid and that force acts strictly perpendicular to the blade. The aim of the present study was to evaluate how reconstructed blade kinematics, kinetics, and average power loss are affected by these assumptions. A calibration experiment with instrumented oars and oarlocks was performed to establish relations between measured signals and oar deformation and blade force. Next, an on-water experiment was performed with a single female world-class rower rowing at constant racing pace in an instrumented scull. Blade kinematics, kinetics, and power loss under different assumptions (rigid versus deformable oars; absence or presence of a blade force component parallel to the oar) were reconstructed. Estimated power losses at the blades are 18% higher when parallel blade force is incorporated. Incorporating oar deformation affects reconstructed blade kinematics and instantaneous power loss, but has no effect on estimation of power losses at the blades. Assumptions on oar deformation and blade force direction have implications for the reconstructed blade kinetics and kinematics. Neglecting parallel blade forces leads to a substantial underestimation of power losses at the blades.
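
    The accounting at stake can be sketched with synthetic signals (ours, not the reconstructed measurements): instantaneous power lost at the blade is blade force times blade slip velocity, and admitting a parallel force component adds a term that the traditional perpendicular-only assumption drops.

    ```python
    import numpy as np

    t = np.linspace(0.0, 0.8, 81)                # one drive phase, s
    dt = t[1] - t[0]
    f_perp = 400.0 * np.sin(np.pi * t / 0.8)     # N, force normal to the blade
    f_par = 0.15 * f_perp                        # N, often-neglected component
    v_perp = 0.6 * np.sin(np.pi * t / 0.8)       # m/s, slip normal to the blade
    v_par = 0.3 * np.ones_like(t)                # m/s, slip along the blade

    p_trad = f_perp * v_perp                     # perpendicular-only assumption
    p_full = p_trad + f_par * v_par              # with the parallel component

    print(p_trad.sum() * dt, p_full.sum() * dt)  # J lost per stroke
    ```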

  12. On testing the missing at random assumption

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    Most approaches to learning from incomplete data are based on the assumption that unobserved values are missing at random (mar). While the mar assumption, as such, is not testable, it can become testable in the context of other distributional assumptions, e.g. the naive Bayes assumption...

  13. Questionable assumptions hampered interpretation of a network meta-analysis of primary care depression treatments.

    Science.gov (United States)

    Linde, Klaus; Rücker, Gerta; Schneider, Antonius; Kriston, Levente

    2016-03-01

    We aimed to evaluate the underlying assumptions of a network meta-analysis investigating which depression treatment works best in primary care and to highlight challenges and pitfalls of interpretation under consideration of these assumptions. We reviewed 100 randomized trials investigating pharmacologic and psychological treatments for primary care patients with depression. Network meta-analysis was carried out within a frequentist framework using response to treatment as the outcome measure. Transitivity was assessed by epidemiologic judgment based on theoretical and empirical investigation of the distribution of trial characteristics across comparisons. Homogeneity and consistency were investigated by decomposing the Q statistic. There were clinically important and statistically significant differences between "pure" drug trials comparing pharmacologic substances with each other or placebo (63 trials) and trials including a psychological treatment arm (37 trials). The overall network meta-analysis produced results closely comparable with separate meta-analyses of the drug trials and the psychological trials. Although the homogeneity and consistency assumptions were mostly met, we considered the transitivity assumption unjustifiable. An exchange of experience between reviewers and, if possible, some guidance on how reviewers addressing important clinical questions can proceed in situations where important assumptions for valid network meta-analysis are not met would be desirable. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Bootstrapping realized volatility and realized beta under a local Gaussianity assumption

    DEFF Research Database (Denmark)

    Hounyo, Ulrich

    The main contribution of this paper is to propose a new bootstrap method for statistics based on high frequency returns. The new method exploits the local Gaussianity and the local constancy of volatility of high frequency returns, two assumptions that can simplify inference in the high frequency...... context, as recently explained by Mykland and Zhang (2009). Our main contributions are as follows. First, we show that the local Gaussian bootstrap is first-order consistent when used to estimate the distributions of realized volatility and realized betas. Second, we show that the local Gaussian bootstrap...... matches accurately the first four cumulants of realized volatility, implying that this method provides third-order refinements. This is in contrast with the wild bootstrap of Gonçalves and Meddahi (2009), which is only second-order correct. Third, we show that the local Gaussian bootstrap is able...
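
    The gist of the idea can be sketched as follows, under invented sample sizes, block length, and volatility path; this is an illustrative reading of the local Gaussianity assumption, not the paper's exact algorithm.

```python
# Minimal local Gaussian bootstrap sketch for realized volatility.
import numpy as np

rng = np.random.default_rng(1)
n, block = 390, 30                                # one "trading day" of returns
sigma = 0.01 * (1 + 0.5 * np.sin(np.linspace(0, np.pi, n)))  # true spot vol
r = sigma * rng.standard_normal(n)                # high-frequency returns
rv_hat = np.sum(r ** 2)                           # realized volatility

boot_rv = []
for _ in range(2000):
    r_star = np.empty(n)
    for start in range(0, n, block):
        blk = r[start:start + block]
        # Local Gaussianity: within a block, returns ~ N(0, local variance)
        sig_local = np.sqrt(np.mean(blk ** 2))
        r_star[start:start + block] = sig_local * rng.standard_normal(blk.size)
    boot_rv.append(np.sum(r_star ** 2))

lo, hi = np.percentile(boot_rv, [2.5, 97.5])
print(f"RV = {rv_hat:.5f}, 95% bootstrap interval = [{lo:.5f}, {hi:.5f}]")
```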

  15. Selecting between-sample RNA-Seq normalization methods from the perspective of their assumptions.

    Science.gov (United States)

    Evans, Ciaran; Hardin, Johanna; Stoebel, Daniel M

    2017-02-27

    RNA-Seq is a widely used method for studying the behavior of genes under different biological conditions. An essential step in an RNA-Seq study is normalization, in which raw data are adjusted to account for factors that prevent direct comparison of expression measures. Errors in normalization can have a significant impact on downstream analysis, such as inflated false positives in differential expression analysis. An underemphasized feature of normalization is the assumptions on which the methods rely and how the validity of these assumptions can have a substantial impact on the performance of the methods. In this article, we explain how assumptions provide the link between raw RNA-Seq read counts and meaningful measures of gene expression. We examine normalization methods from the perspective of their assumptions, as an understanding of methodological assumptions is necessary for choosing methods appropriate for the data at hand. Furthermore, we discuss why normalization methods perform poorly when their assumptions are violated and how this causes problems in subsequent analysis. To analyze a biological experiment, researchers must select a normalization method with assumptions that are met and that produces a meaningful measure of expression for the given experiment. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
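
    As an example of how such assumptions enter in practice, the sketch below implements one common between-sample method, median-of-ratios scaling (as popularized by DESeq). The count matrix is fabricated, and the method's key assumption, that most genes are not differentially expressed, is flagged in the comments.

```python
# Median-of-ratios normalization on a fabricated genes-by-samples count matrix.
import numpy as np

rng = np.random.default_rng(2)
depth = np.array([1.0, 1.5, 0.7, 2.0])           # unequal sequencing depths
counts = rng.poisson(lam=50 * depth, size=(1000, 4))

# Pseudo-reference sample: the per-gene geometric mean across samples.
# Genes with any zero count are dropped, one of the method's assumptions.
nonzero = (counts > 0).all(axis=1)
logc = np.log(counts[nonzero])
log_ref = logc.mean(axis=1)

# Size factor = median ratio of a sample to the reference; this is only a
# sensible depth estimate if most genes are not differentially expressed.
size_factors = np.exp(np.median(logc - log_ref[:, None], axis=0))
normalized = counts / size_factors
print("estimated size factors:", np.round(size_factors, 2))
```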

  16. Multiverse Assumptions and Philosophy

    Directory of Open Access Journals (Sweden)

    James R. Johnson

    2018-02-01

    Full Text Available Multiverses are predictions based on theories. Focusing on each theory’s assumptions is key to evaluating a proposed multiverse. Although accepted theories of particle physics and cosmology contain non-intuitive features, multiverse theories entertain a host of “strange” assumptions classified as metaphysical (outside objective experience, concerned with the fundamental nature of reality, ideas that cannot be proven right or wrong), covering topics such as: infinity, duplicate yous, hypothetical fields, more than three space dimensions, Hilbert space, advanced civilizations, and reality established by mathematical relationships. It is easy to confuse multiverse proposals because many divergent models exist. This overview defines the characteristics of eleven popular multiverse proposals. The characteristics compared are: initial conditions, values of constants, laws of nature, number of space dimensions, number of universes, and fine tuning explanations. Future scientific experiments may validate selected assumptions; but until they do, proposals by philosophers may be as valid as theoretical scientific theories.

  17. The sufficiency assumption of the reasoned approach to action

    Directory of Open Access Journals (Sweden)

    David Trafimow

    2015-12-01

    Full Text Available The reasoned action approach to understanding and predicting behavior includes the sufficiency assumption. Although variables not included in the theory may influence behavior, these variables work through the variables in the theory. Once the reasoned action variables are included in an analysis, the inclusion of other variables will not increase the variance accounted for in behavioral intentions or behavior. Reasoned action researchers are very concerned with testing whether new variables account for variance (or how much variance traditional variables account for), to see whether they are important, in general or with respect to specific behaviors under investigation. But this approach tacitly assumes that accounting for variance is highly relevant to understanding the production of variance, which is what really is at issue. Based on the variance law, I question this assumption.
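
    The kind of analysis the author questions can be made concrete with a hedged sketch: a hierarchical regression asking whether a new variable increases the variance accounted for once the reasoned-action variables are entered. Variable names and data below are invented.

```python
# Hypothetical sufficiency check: does new_var add explained variance once
# the reasoned-action variables are in the model? All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 300
attitude = rng.normal(size=n)
norm = rng.normal(size=n)
new_var = 0.6 * attitude + rng.normal(scale=0.8, size=n)  # works "through" attitude
intention = 0.7 * attitude + 0.3 * norm + rng.normal(size=n)
df = pd.DataFrame(dict(attitude=attitude, norm=norm,
                       new_var=new_var, intention=intention))

reduced = smf.ols("intention ~ attitude + norm", data=df).fit()
full = smf.ols("intention ~ attitude + norm + new_var", data=df).fit()
f_stat, p_value, _ = full.compare_f_test(reduced)   # partial F-test
print(f"R2: {reduced.rsquared:.3f} -> {full.rsquared:.3f}, "
      f"partial F p = {p_value:.3f}")
```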

  18. Allele Age Under Non-Classical Assumptions is Clarified by an Exact Computational Markov Chain Approach.

    Science.gov (United States)

    De Sanctis, Bianca; Krukov, Ivan; de Koning, A P Jason

    2017-09-19

    Determination of the age of an allele based on its population frequency is a well-studied problem in population genetics, for which a variety of approximations have been proposed. We present a new result that, surprisingly, allows the expectation and variance of allele age to be computed exactly (within machine precision) for any finite absorbing Markov chain model in a matter of seconds. This approach makes none of the classical assumptions (e.g., weak selection, reversibility, infinite sites), exploits modern sparse linear algebra techniques, integrates over all sample paths, and is rapidly computable for Wright-Fisher populations up to Ne = 100,000. With this approach, we study the joint effect of recurrent mutation, dominance, and selection, and demonstrate new examples of "selective strolls" where the classical symmetry of allele age with respect to selection is violated by weakly selected alleles that are older than neutral alleles at the same frequency. We also show evidence for a strong age imbalance, where rare deleterious alleles are expected to be substantially older than advantageous alleles observed at the same frequency when population-scaled mutation rates are large. These results highlight the under-appreciated utility of computational methods for the direct analysis of Markov chain models in population genetics.
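
    To give a flavor of the exact computational approach, the toy below builds the transition matrix of a neutral Wright-Fisher model and solves one sparse linear system for expected absorption times. The paper's allele-age expectations and variances are more involved; the population size, model, and settings here are illustrative only.

```python
# Exact computation on an absorbing Markov chain: expected time to loss or
# fixation in a neutral Wright-Fisher model with N = 100 haploids.
import numpy as np
from scipy.stats import binom
from scipy.sparse import identity, csc_matrix
from scipy.sparse.linalg import spsolve

N = 100
i = np.arange(N + 1)
# P[j, k] = P(k copies next generation | j copies now), binomial sampling
P = binom.pmf(i[None, :], N, i[:, None] / N)
Q = csc_matrix(P[1:N, 1:N])                      # transient states 1..N-1

# Expected absorption time t solves (I - Q) t = 1: one sparse solve.
t = spsolve(identity(N - 1, format="csc") - Q, np.ones(N - 1))
print(f"expected generations to loss/fixation from 1 copy: {t[0]:.1f}")
```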

  19. 76 FR 81966 - Agency Information Collection Activities; Proposed Collection; Comments Requested; Assumption of...

    Science.gov (United States)

    2011-12-29

    ... Indian country is subject to State criminal jurisdiction under Public Law 280 (18 U.S.C. 1162(a)) to... Collection; Comments Requested; Assumption of Concurrent Federal Criminal Jurisdiction in Certain Areas of Indian Country ACTION: 60-Day notice of information collection under review. The Department of Justice...

  20. Bioclim Deliverable D6b: application of statistical down-scaling within the BIOCLIM hierarchical strategy: methods, data requirements and underlying assumptions

    International Nuclear Information System (INIS)

    2004-01-01

    Case-study regions were identified, together with the additional issues which arise in applying these techniques to output from the BIOCLIM simulations. This preliminary work is described in this BIOCLIM technical note. It provides an overview of statistical down-scaling methods, together with their underlying assumptions and advantages/disadvantages. Specific issues relating to their application within the BIOCLIM context (i.e., application to the IPSL C M4 D snapshot simulations) are identified, for example, the stationarity issue. The predictor and predictand data sets that would be required to implement these methods within the BIOCLIM hierarchical strategy are also outlined, together with the methodological steps involved. Implementation of these techniques was delayed in order to give priority to the application of the rule-based down-scaling method developed in WP3 to WP2 EMIC output (see Deliverable D8a). This task was not originally planned, but has allowed more comprehensive comparison and evaluation of the BIOCLIM scenarios and down-scaling methods to be undertaken

  1. Behavioural assumptions in labour economics: Analysing social security reforms and labour market transitions

    OpenAIRE

    van Huizen, T.M.

    2012-01-01

    The aim of this dissertation is to test behavioural assumptions in labour economics models and thereby improve our understanding of labour market behaviour. The assumptions under scrutiny in this study are derived from an analysis of recent influential policy proposals: the introduction of savings schemes in the system of social security. A central question is how this reform will affect labour market incentives and behaviour. Part I (Chapter 2 and 3) evaluates savings schemes. Chapter 2 exam...

  2. Monitoring Assumptions in Assume-Guarantee Contracts

    Directory of Open Access Journals (Sweden)

    Oleg Sokolsky

    2016-05-01

    Full Text Available Pre-deployment verification of software components with respect to behavioral specifications in the assume-guarantee form does not, in general, guarantee absence of errors at run time. This is because assumptions about the environment cannot be discharged until the environment is fixed. An intuitive approach is to complement pre-deployment verification of guarantees, up to the assumptions, with post-deployment monitoring of environment behavior to check that the assumptions are satisfied at run time. Such a monitor is typically implemented by instrumenting the application code of the component. An additional challenge for the monitoring step is that environment behaviors are typically obtained through an I/O library, which may alter the component's view of the input format. This transformation requires us to introduce a second pre-deployment verification step to ensure that alarms raised by the monitor would indeed correspond to violations of the environment assumptions. In this paper, we describe an approach for constructing monitors and verifying them against the component assumption. We also discuss limitations of instrumentation-based monitoring and potential ways to overcome them.
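
    In code, the post-deployment half of such a scheme can be as simple as the hypothetical wrapper below, which checks an environment assumption on every value read at run time and raises an alarm on violation. The names and the predicate are invented, and the pre-deployment verification steps the paper describes are out of scope here.

```python
# Sketch of post-deployment assumption monitoring via instrumentation.
from functools import wraps

def monitor_assumption(predicate, alarm):
    """Check an environment assumption on every value read at run time."""
    def decorate(read_input):
        @wraps(read_input)
        def wrapper(*args, **kwargs):
            value = read_input(*args, **kwargs)
            if not predicate(value):
                alarm(value)            # assumption violated at run time
            return value
        return wrapper
    return decorate

@monitor_assumption(lambda v: 0 <= v <= 100, lambda v: print(f"alarm: {v!r}"))
def read_sensor():
    return 250                          # the environment misbehaves

read_sensor()                           # prints: alarm: 250
```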

  3. How Symmetrical Assumptions Advance Strategic Management Research

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Hallberg, Hallberg

    2014-01-01

    We develop the case for symmetrical assumptions in strategic management theory. Assumptional symmetry obtains when assumptions made about certain actors and their interactions in one of the application domains of a theory are also made about this set of actors and their interactions in other...... application domains of the theory. We argue that assumptional symmetry leads to theoretical advancement by promoting the development of theory with greater falsifiability and stronger ontological grounding. Thus, strategic management theory may be advanced by systematically searching for asymmetrical...

  4. Testing legal assumptions regarding the effects of dancer nudity and proximity to patron on erotic expression.

    Science.gov (United States)

    Linz, D; Blumenthal, E; Donnerstein, E; Kunkel, D; Shafer, B J; Lichtenstein, A

    2000-10-01

    A field experiment was conducted in order to test the assumptions by the Supreme Court in Barnes v. Glen Theatre, Inc. (1991) and the Ninth Circuit Court of Appeals in Colacurcio v. City of Kent (1999) that government restrictions on dancer nudity and dancer-patron proximity do not affect the content of messages conveyed by erotic dancers. A field experiment was conducted in which dancer nudity (nude vs. partial clothing) and dancer-patron proximity (4 feet; 6 in.; 6 in. plus touch) were manipulated under controlled conditions in an adult night club. After male patrons viewed the dances, they completed questionnaires assessing affective states and reception of erotic, relational intimacy, and social messages. Contrary to the assumptions of the courts, the results showed that the content of messages conveyed by the dancers was significantly altered by restrictions placed on dancer nudity and dancer-patron proximity. These findings are interpreted in terms of social psychological responses to nudity and communication theories of nonverbal behavior. The legal implications of rejecting the assumptions made by the courts in light of the findings of this study are discussed. Finally, suggestions are made for future research.

  5. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Directory of Open Access Journals (Sweden)

    Anne Hsu

    Full Text Available A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.

  6. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning

    Science.gov (United States)

    2016-01-01

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning. PMID:27310576
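
    The contrast between the two sampling assumptions reduces to a toy Bayesian update, shown below under invented numbers: under strong sampling the likelihood of each sentence is 1/|grammar| (the "size principle"), so the absence of a construction counts as evidence; under weak sampling it does not.

```python
# Toy strong- vs weak-sampling comparison: two grammars are consistent with
# the data, one licensing constructions {A}, the other {A, B}; all observed
# sentences use A. Priors and sizes are invented.
n_obs = 10
prior = {"small (A only)": 0.5, "large (A and B)": 0.5}
sizes = {"small (A only)": 1, "large (A and B)": 2}

for sampling in ("strong", "weak"):
    post = {}
    for h, p in prior.items():
        # Strong sampling: each sentence is drawn from the grammar, so its
        # likelihood is 1/|grammar|; the absence of B counts against the
        # large grammar. Weak sampling: likelihood 1 for any consistent one.
        like = (1 / sizes[h]) ** n_obs if sampling == "strong" else 1.0
        post[h] = p * like
    z = sum(post.values())
    print(sampling, {h: round(v / z, 4) for h, v in post.items()})
```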

  7. Oil price assumptions in macroeconomic forecasts: should we follow future market expectations?

    International Nuclear Information System (INIS)

    Coimbra, C.; Esteves, P.S.

    2004-01-01

    In macroeconomic forecasting, in spite of their important role in price and activity developments, oil prices are usually taken as an exogenous variable, for which assumptions have to be made. This paper evaluates the forecasting performance of futures market prices against the other popular technical procedure, the carry-over assumption. The results suggest that there is almost no difference between opting for futures market prices or using the carry-over assumption for short-term forecasting horizons (up to 12 months), while, for longer-term horizons, they favour the use of futures market prices. However, as futures market prices reflect market expectations for world economic activity, futures oil prices should be adjusted whenever market expectations for world economic growth differ from the values underlying the macroeconomic scenarios, in order to fully ensure the internal consistency of those scenarios. (Author)
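
    A hedged sketch of the evaluation logic, on simulated rather than actual oil-price data: compare the out-of-sample RMSE of futures-based forecasts against the carry-over (random-walk) assumption. Futures are made informative by construction here, so the comparison illustrates the bookkeeping, not the paper's empirical result.

```python
# Simulated comparison of futures-based vs carry-over price forecasts.
import numpy as np

rng = np.random.default_rng(4)
T, h = 200, 12                                   # months of data, horizon
spot = 50 + np.cumsum(rng.normal(0, 1.5, T))     # simulated spot price

# Futures quote at t for delivery at t+h: a noisy expectation of the future
# spot (informative by construction in this toy setup).
futures = spot[h:] + rng.normal(0, 3.0, T - h)

actual = spot[h:]
carry_forecast = spot[:T - h]                    # carry-over: price stays put
rmse = lambda e: np.sqrt(np.mean(e ** 2))
print(f"carry-over RMSE: {rmse(actual - carry_forecast):.2f}")
print(f"futures RMSE:    {rmse(actual - futures):.2f}")
```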

  8. The crux of the method: assumptions in ordinary least squares and logistic regression.

    Science.gov (United States)

    Long, Rebecca G

    2008-10-01

    Logistic regression has increasingly become the tool of choice when analyzing data with a binary dependent variable. While resources relating to the technique are widely available, clear discussions of why logistic regression should be used in place of ordinary least squares regression are difficult to find. The current paper compares and contrasts the assumptions of ordinary least squares with those of logistic regression and explains why logistic regression's looser assumptions make it adept at handling violations of the more important assumptions in ordinary least squares.
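
    The practical difference is easy to exhibit: fit an ordinary least squares (linear probability) model and a logistic regression to the same simulated binary outcome and inspect the fitted values. Data and coefficients below are illustrative.

```python
# OLS vs logistic regression on a binary outcome; data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
x = rng.normal(size=400)
p_true = 1 / (1 + np.exp(-(0.5 + 2.5 * x)))
y = rng.binomial(1, p_true)
X = sm.add_constant(x)

ols = sm.OLS(y, X).fit()
logit = sm.Logit(y, X).fit(disp=0)

# OLS can return fitted "probabilities" outside [0, 1]; the logistic link
# cannot, and it does not assume homoscedastic, normally distributed errors.
print("OLS fitted range:  ", ols.fittedvalues.min(), ols.fittedvalues.max())
print("Logit fitted range:", logit.predict(X).min(), logit.predict(X).max())
```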

  9. Uncertainties in sandy shorelines evolution under the Bruun rule assumption

    Directory of Open Access Journals (Sweden)

    Gonéri Le Cozannet

    2016-04-01

    Full Text Available In the current practice of sandy shoreline change assessments, the local sedimentary budget is evaluated using the sediment balance equation, that is, by summing the contributions of longshore and cross-shore processes. The contribution of future sea-level rise induced by climate change is usually obtained using the Bruun rule, which assumes that the shoreline retreat is equal to the change of sea-level divided by the slope of the upper shoreface. However, it remains unclear whether this approach is appropriate to account for the impacts of future sea-level rise. This is due to the lack of relevant observations to validate the Bruun rule under the expected sea-level rise rates. To address this issue, this article estimates the coastal settings and period of time under which the use of the Bruun rule could be (in)validated, in the case of wave-exposed gently-sloping sandy beaches. Using the sedimentary budgets of Stive (2004) and probabilistic sea-level rise scenarios based on IPCC, we provide shoreline change projections that account for all uncertain hydrosedimentary processes affecting idealized coasts (impacts of sea-level rise, storms and other cross-shore and longshore processes). We evaluate the relative importance of each source of uncertainties in the sediment balance equation using a global sensitivity analysis. For scenarios RCP 6.0 and 8.5 and in the absence of coastal defences, the model predicts a perceivable shift toward generalized beach erosion by the middle of the 21st century. In contrast, the model predictions are unlikely to differ from the current situation in case of scenario RCP 2.6. Finally, the contribution of sea-level rise and climate change scenarios to sandy shoreline change projection uncertainties increases with time during the 21st century. Our results have three primary implications for coastal settings similar to those described in Stive (2004): first, the validation of the Bruun rule will not necessarily be
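
    At its core the rule is one line, retreat R = S / tan(beta), and the hedged sketch below propagates an invented sea-level-rise distribution through it. The full sediment balance in the article adds longshore and other cross-shore terms and a global sensitivity analysis, which are omitted here.

```python
# Probabilistic Bruun-rule sketch; slope and scenario values are invented.
import numpy as np

rng = np.random.default_rng(6)
tan_beta = 0.01                                   # gently sloping sandy beach
S = rng.normal(loc=0.5, scale=0.15, size=10_000)  # sea-level rise by 2100, m
S = np.clip(S, 0.0, None)

retreat = S / tan_beta                            # Bruun rule, metres of retreat
print(f"median retreat: {np.median(retreat):.0f} m, "
      f"90% interval: [{np.percentile(retreat, 5):.0f}, "
      f"{np.percentile(retreat, 95):.0f}] m")
```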

  10. Marking and Moderation in the UK: False Assumptions and Wasted Resources

    Science.gov (United States)

    Bloxham, Sue

    2009-01-01

    This article challenges a number of assumptions underlying marking of student work in British universities. It argues that, in developing rigorous moderation procedures, we have created a huge burden for markers which adds little to accuracy and reliability but creates additional work for staff, constrains assessment choices and slows down…

  11. Comparisons between a new point kernel-based scheme and the infinite plane source assumption method for radiation calculation of deposited airborne radionuclides from nuclear power plants.

    Science.gov (United States)

    Zhang, Xiaole; Efthimiou, George; Wang, Yan; Huang, Meng

    2018-04-01

    Radiation from the deposited radionuclides is indispensable information for environmental impact assessment of nuclear power plants and emergency management during nuclear accidents. Ground shine estimation is related to multiple physical processes, including atmospheric dispersion, deposition, soil and air radiation shielding. It remains unclear whether the normally adopted "infinite plane" source assumption for the ground shine calculation is accurate enough, especially for the area with highly heterogeneous deposition distribution near the release point. In this study, a new ground shine calculation scheme, which accounts for both the spatial deposition distribution and the properties of air and soil layers, is developed based on the point kernel method. Two sets of "detector-centered" grids are proposed and optimized for both the deposition and radiation calculations to better simulate the results measured by the detectors, which will be beneficial for applications such as source term estimation. The evaluation against the available data of Monte Carlo methods in the literature indicates that the errors of the new scheme are within 5% for the key radionuclides in nuclear accidents. The comparisons between the new scheme and the "infinite plane" assumption indicate that the assumption is tenable (relative errors within 20%) for the area located 1 km away from the release source. Within 1 km range, the assumption mainly causes errors for wet deposition and the errors are independent of rain intensities. The results suggest that the new scheme should be adopted if the detectors are within 1 km from the source under the stable atmosphere (classes E and F), or the detectors are within 500 m under slightly unstable (class C) or neutral (class D) atmosphere. Otherwise, the infinite plane assumption is reasonable since the relative errors induced by this assumption are within 20%. The results here are only based on theoretical investigations. They should
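
    A bare-bones version of a point-kernel ground-shine sum is sketched below: each deposited grid cell contributes attenuation-weighted flux proportional to B(mu*r) * exp(-mu*r) / (4*pi*r^2) at the detector. The attenuation coefficient, buildup form, deposition map, and grid are all placeholders rather than values from the paper, and no dose-conversion factors are applied.

```python
# Minimal point-kernel ground-shine sum; all constants are placeholders.
import numpy as np

mu = 0.0093                              # attenuation in air, 1/m (placeholder)
def buildup(mu_r):                       # assumed Berger-type buildup factor
    return 1.0 + 0.9 * mu_r * np.exp(0.02 * mu_r)

# Heterogeneous deposition (Bq/m2) on a 2 km x 2 km grid near the source
cell = 10.0                              # grid spacing, m
x, y = np.meshgrid(np.arange(-1000, 1000, cell), np.arange(-1000, 1000, cell))
deposition = 1e4 * np.exp(-((x - 200.0) ** 2 + y ** 2) / (2 * 150.0 ** 2))

z = 1.0                                  # detector height above ground, m
r = np.sqrt(x ** 2 + y ** 2 + z ** 2)    # cell-to-detector distance

# Point kernel: activity * buildup * exp(-mu r) / (4 pi r^2), summed over cells
kernel = buildup(mu * r) * np.exp(-mu * r) / (4.0 * np.pi * r ** 2)
flux = np.sum(deposition * cell ** 2 * kernel)
print(f"relative ground-shine level at the detector: {flux:.3e}")
```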

  12. PFP issues/assumptions development and management planning guide

    International Nuclear Information System (INIS)

    SINCLAIR, J.C.

    1999-01-01

    The PFP Issues/Assumptions Development and Management Planning Guide presents the strategy and process used for the identification, allocation, and maintenance of an Issues/Assumptions Management List for the Plutonium Finishing Plant (PFP) integrated project baseline. Revisions to this document will include, as attachments, the most recent version of the Issues/Assumptions Management List, both open and current issues/assumptions (Appendix A), and closed or historical issues/assumptions (Appendix B). This document is intended to be a Project-owned management tool. As such, this document will periodically require revisions resulting from improvements of the information, processes, and techniques as now described. Revisions that suggest improved processes will only require PFP management approval

  13. The zero-sum assumption in neutral biodiversity theory

    NARCIS (Netherlands)

    Etienne, R.S.; Alonso, D.; McKane, A.J.

    2007-01-01

    The neutral theory of biodiversity as put forward by Hubbell in his 2001 monograph has received much criticism for its unrealistic simplifying assumptions. These are the assumptions of functional equivalence among different species (neutrality), the assumption of point mutation speciation, and the

  14. World assumptions, posttraumatic stress and quality of life after a natural disaster: A longitudinal study

    Science.gov (United States)

    2012-01-01

    Background Changes in world assumptions are a fundamental concept within theories that explain posttraumatic stress disorder. The objective of the present study was to gain a greater understanding of how changes in world assumptions are related to quality of life and posttraumatic stress symptoms after a natural disaster. Methods A longitudinal study of 574 Norwegian adults who survived the Southeast Asian tsunami in 2004 was undertaken. Multilevel analyses were used to identify which factors at six months post-tsunami predicted quality of life and posttraumatic stress symptoms two years post-tsunami. Results Good quality of life and posttraumatic stress symptoms were negatively related. However, major differences in the predictors of these outcomes were found. Females reported significantly higher quality of life and more posttraumatic stress than men. The association between level of exposure to the tsunami and quality of life seemed to be mediated by posttraumatic stress. Negative perceived changes in the assumption “the world is just” were related to adverse outcome in both quality of life and posttraumatic stress. Positive perceived changes in the assumptions “life is meaningful” and “feeling that I am a valuable human” were associated with higher levels of quality of life but not with posttraumatic stress. Conclusions Quality of life and posttraumatic stress symptoms demonstrate differences in their etiology. World assumptions may be less specifically related to posttraumatic stress than has been postulated in some cognitive theories. PMID:22742447

  15. Assumptions for the Annual Energy Outlook 1992

    International Nuclear Information System (INIS)

    1992-01-01

    This report serves as an auxiliary document to the Energy Information Administration (EIA) publication Annual Energy Outlook 1992 (AEO) (DOE/EIA-0383(92)), released in January 1992. The AEO forecasts were developed for five alternative cases and consist of energy supply, consumption, and price projections by major fuel and end-use sector, which are published at a national level of aggregation. The purpose of this report is to present important quantitative assumptions, including world oil prices and macroeconomic growth, underlying the AEO forecasts. The report has been prepared in response to external requests, as well as analyst requirements for background information on the AEO and studies based on the AEO forecasts

  16. A statistical test of the stability assumption inherent in empirical estimates of economic depreciation.

    Science.gov (United States)

    Shriver, K A

    1986-01-01

    Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that rates of economic depreciation may be reasonably stable over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.

  17. Philosophy of Technology Assumptions in Educational Technology Leadership

    Science.gov (United States)

    Webster, Mark David

    2017-01-01

    A qualitative study using grounded theory methods was conducted to (a) examine what philosophy of technology assumptions are present in the thinking of K-12 technology leaders, (b) investigate how the assumptions may influence technology decision making, and (c) explore whether technological determinist assumptions are present. Subjects involved…

  18. On some unwarranted tacit assumptions in cognitive neuroscience.

    Science.gov (United States)

    Mausfeld, Rainer

    2012-01-01

    The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper brings to mind several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question why such conceptions are nevertheless currently prevailing in many areas of psychology. The paper argues that corresponding conceptions are rooted in four different aspects of our common-sense conception of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input-output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to turn into a largely a-theoretical and data-driven endeavor while at the same time enhancing the prospects for receiving widespread public appreciation of their empirical findings.

  19. Influence of model assumptions about HIV disease progression after initiating or stopping treatment on estimates of infections and deaths averted by scaling up antiretroviral therapy

    Science.gov (United States)

    Sucharitakul, Kanes; Boily, Marie-Claude; Dimitrov, Dobromir

    2018-01-01

    Background: Many mathematical models have investigated the population-level impact of expanding antiretroviral therapy (ART), using different assumptions about HIV disease progression on ART and among ART dropouts. We evaluated the influence of these assumptions on model projections of the number of infections and deaths prevented by expanded ART. Methods: A new dynamic model of HIV transmission among men who have sex with men (MSM) was developed, which incorporated each of four alternative assumptions about disease progression used in previous models: (A) ART slows disease progression; (B) ART halts disease progression; (C) ART reverses disease progression by increasing CD4 count; (D) ART reverses disease progression, but disease progresses rapidly once treatment is stopped. The model was independently calibrated to HIV prevalence and ART coverage data from the United States under each progression assumption in turn. New HIV infections and HIV-related deaths averted over 10 years were compared for fixed ART coverage increases. Results: Little absolute difference in the fraction of new HIV infections averted was found across progression assumptions for fixed increases in ART coverage (varied between 33% and 90%) if ART dropouts reinitiated ART at the same rate as ART-naïve MSM. Larger differences in the predicted fraction of HIV-related deaths averted were observed (up to 15pp). However, differences across assumptions widened if ART dropouts could only reinitiate ART at low CD4 counts. Conclusions: Assumptions about disease progression during ART interruption did not affect the fraction of HIV infections averted with expanded ART, unless ART dropouts only re-initiated ART at low CD4 counts. Different disease progression assumptions had a larger influence on the fraction of HIV-related deaths averted with expanded ART. PMID:29554136

  20. Analysis On Political Speech Of Susilo Bambang Yudhoyono: Common Sense Assumption And Ideology

    Directory of Open Access Journals (Sweden)

    Sayit Abdul Karim

    2015-10-01

    Full Text Available This paper presents an analysis of the political speech of Susilo Bambang Yudhoyono (SBY), the former president of Indonesia, at the Indonesian conference on “Moving towards sustainability: together we must create the future we want”. Ideologies are closely linked to power and language because using language is the commonest form of social behavior, and the form of social behavior where we rely most on ‘common-sense’ assumptions. The objectives of this study are to discuss the common sense assumption and ideology by means of language use in SBY’s political speech, which is mainly grounded in Norman Fairclough’s theory of language and power in critical discourse analysis. There are two main problems of analysis, namely: first, what are the common sense assumption and ideology in Susilo Bambang Yudhoyono’s political speech; and second, how do they relate to each other in the political discourse? The data used in this study was in the form of written text on “moving towards sustainability: together we must create the future we want”. A qualitative descriptive analysis was employed to analyze the common sense assumption and ideology in the written text of Susilo Bambang Yudhoyono’s political speech, which was delivered at the Riocentro Convention Center, Rio de Janeiro, on June 20, 2012. One dimension of ‘common sense’ is the meaning of words. The results showed that the common sense assumption and ideology conveyed through SBY’s specific words or expressions can significantly explain how political discourse is constructed and affected by SBY’s rule and position, life experience, and power relations. He used language as a powerful social tool to present his common sense assumption and ideology to convince his audiences and fellow citizens that the future of sustainability has been an important agenda for all people.

  1. Assumptions and Policy Decisions for Vital Area Identification Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myungsu; Bae, Yeon-Kyoung; Lee, Youngseung [KHNP CRI, Daejeon (Korea, Republic of)]

    2016-10-15

    U.S. Nuclear Regulatory Commission and IAEA guidance indicate that certain assumptions and policy questions should be addressed in a Vital Area Identification (VAI) process. Korea Hydro and Nuclear Power conducted a VAI based on the current Design Basis Threat and engineering judgement to identify APR1400 vital areas. Some of the assumptions were inherited from Probabilistic Safety Assessment (PSA), as the sabotage logic model was based on the PSA logic tree and equipment location data. This paper illustrates some important assumptions and policy decisions for the APR1400 VAI analysis. Assumptions and policy decisions can be overlooked at the beginning stage of a VAI; however, they should be carefully reviewed and discussed among engineers, plant operators, and regulators. Through the APR1400 VAI process, some of the policy concerns and assumptions for analysis were applied based on document research and expert panel discussions. It was also found that there are more assumptions to define in further studies for other types of nuclear power plants. One of the assumptions is mission time, which was inherited from PSA.

  2. Impact of one-layer assumption on diffuse reflectance spectroscopy of skin

    Science.gov (United States)

    Hennessy, Ricky; Markey, Mia K.; Tunnell, James W.

    2015-02-01

    Diffuse reflectance spectroscopy (DRS) can be used to noninvasively measure skin properties. To extract skin properties from DRS spectra, a model is needed that relates the reflectance to the tissue properties. Most models are based on the assumption that skin is homogeneous. In reality, skin is composed of multiple layers, and the homogeneity assumption can lead to errors. In this study, we analyze the errors caused by the homogeneity assumption. This is accomplished by creating realistic skin spectra using a computational model, then extracting properties from those spectra using a one-layer model. The extracted parameters are then compared to the parameters used to create the modeled spectra. We used a wavelength range of 400 to 750 nm and a source detector separation of 250 μm. Our results show that use of a one-layer skin model causes underestimation of hemoglobin concentration [Hb] and melanin concentration [mel]. Additionally, the magnitude of the error is dependent on epidermal thickness. The one-layer assumption also causes [Hb] and [mel] to be correlated. Oxygen saturation is overestimated when it is below 50% and underestimated when it is above 50%. We also found that the vessel radius factor used to account for pigment packaging is correlated with epidermal thickness.

  3. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)

    We propose a class of finite state systems of synchronizing distributed processes, where processes make assumptions at local states about the state of other processes in the system. This constrains the global states of the system to those where assumptions made by a process about another are compatible with the ...

  4. HYPROLOG: A New Logic Programming Language with Assumptions and Abduction

    DEFF Research Database (Denmark)

    Christiansen, Henning; Dahl, Veronica

    2005-01-01

    We present HYPROLOG, a novel integration of Prolog with assumptions and abduction which is implemented in and partly borrows syntax from Constraint Handling Rules (CHR) for integrity constraints. Assumptions are a mechanism inspired by linear logic and taken over from Assumption Grammars. The language shows a novel flexibility in the interaction between the different paradigms, including all additional built-in predicates and constraints solvers that may be available. Assumptions and abduction are especially useful for language processing, and we can show how HYPROLOG works seamlessly together...

  5. Operating Characteristics of Statistical Methods for Detecting Gene-by-Measured Environment Interaction in the Presence of Gene-Environment Correlation under Violations of Distributional Assumptions.

    Science.gov (United States)

    Van Hulle, Carol A; Rathouz, Paul J

    2015-02-01

    Accurately identifying interactions between genetic vulnerabilities and environmental factors is of critical importance for genetic research on health and behavior. In the previous work of Van Hulle et al. (Behavior Genetics, Vol. 43, 2013, pp. 71-84), we explored the operating characteristics for a set of biometric (e.g., twin) models of Rathouz et al. (Behavior Genetics, Vol. 38, 2008, pp. 301-315), for testing gene-by-measured environment interaction (GxM) in the presence of gene-by-measured environment correlation (rGM) where data followed the assumed distributional structure. Here we explore the effects that violating distributional assumptions have on the operating characteristics of these same models even when structural model assumptions are correct. We simulated N = 2,000 replicates of n = 1,000 twin pairs under a number of conditions. Non-normality was imposed on either the putative moderator or on the ultimate outcome by ordinalizing or censoring the data. We examined the empirical Type I error rates and compared Bayesian information criterion (BIC) values. In general, non-normality in the putative moderator had little impact on the Type I error rates or BIC comparisons. In contrast, non-normality in the outcome was often mistaken for or masked GxM, especially when the outcome data were censored.

  6. Stop-loss premiums under dependence

    NARCIS (Netherlands)

    Albers, Willem/Wim

    1999-01-01

    Stop-loss premiums are typically calculated under the assumption that the insured lives in the underlying portfolio are independent. Here we study the effects of small departures from this assumption. Using Edgeworth expansions, it is made transparent which configurations of dependence parameters
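
    A Monte Carlo sketch of the question, not of the paper's Edgeworth machinery: price the stop-loss premium E[(S - d)+] for a portfolio of lives under independence, then under a small equicorrelated dependence introduced through a one-factor Gaussian copula. Portfolio size, mortality, and retention below are invented.

```python
# Stop-loss premium E[(S - d)+] under independence vs small dependence.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
n_lives, n_sim = 1000, 5_000
q, claim, d = 0.05, 1.0, 60.0           # death prob., claim size, retention

def stop_loss_premium(rho):
    # One-factor Gaussian copula: latent_i = sqrt(rho)*Z + sqrt(1-rho)*eps_i
    z = rng.standard_normal((n_sim, 1))
    eps = rng.standard_normal((n_sim, n_lives))
    latent = np.sqrt(rho) * z + np.sqrt(1.0 - rho) * eps
    deaths = latent < norm.ppf(q)       # correlated death indicators
    S = claim * deaths.sum(axis=1)      # aggregate claims per scenario
    return np.mean(np.maximum(S - d, 0.0))

print(f"independent lives:     {stop_loss_premium(0.0):.3f}")
print(f"small dependence (2%): {stop_loss_premium(0.02):.3f}")
```

    Even a dependence parameter of a few percent fattens the tail of S noticeably, which is the kind of departure from independence the paper quantifies analytically.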

  7. On Some Unwarranted Tacit Assumptions in Cognitive Neuroscience†

    Science.gov (United States)

    Mausfeld, Rainer

    2011-01-01

    The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper brings to mind several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question why such conceptions are nevertheless currently prevailing in many areas of psychology. The paper argues that corresponding conceptions are rooted in four different aspects of our common-sense conception of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input–output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to turn into a largely a-theoretical and data-driven endeavor while at the same time enhancing the prospects for receiving widespread public appreciation of their empirical findings. PMID:22435062

  8. Three-class ROC analysis--the equal error utility assumption and the optimality of three-class ROC surface using the ideal observer.

    Science.gov (United States)

    He, Xin; Frey, Eric C

    2006-08-01

    Previously, we have developed a decision model for three-class receiver operating characteristic (ROC) analysis based on decision theory. The proposed decision model maximizes the expected decision utility under the assumption that incorrect decisions have equal utilities under the same hypothesis (equal error utility assumption). This assumption reduced the dimensionality of the "general" three-class ROC analysis and provided a practical figure-of-merit to evaluate the three-class task performance. However, it also limits the generality of the resulting model because the equal error utility assumption will not apply for all clinical three-class decision tasks. The goal of this study was to investigate the optimality of the proposed three-class decision model with respect to several other decision criteria. In particular, besides the maximum expected utility (MEU) criterion used in the previous study, we investigated the maximum-correctness (MC) (or minimum-error), maximum likelihood (ML), and Neyman-Pearson (N-P) criteria. We found that by making assumptions for both MEU and N-P criteria, all decision criteria lead to the previously proposed three-class decision model. As a result, this model maximizes the expected utility under the equal error utility assumption, maximizes the probability of making correct decisions, satisfies the N-P criterion in the sense that it maximizes the sensitivity of one class given the sensitivities of the other two classes, and the resulting ROC surface contains the maximum likelihood decision operating point. While the proposed three-class ROC analysis model is not optimal in the general sense due to the use of the equal error utility assumption, the range of criteria for which it is optimal increases its applicability for evaluating and comparing a range of diagnostic systems.
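
    The decision rule itself is compact, as the sketch below shows with invented utilities: under the equal error utility assumption the two wrong decisions within each true class share one utility value, and the observer decides the class with maximal posterior-weighted expected utility.

```python
# Expected-utility decision rule under the equal error utility assumption.
import numpy as np

# U[d, h]: utility of deciding class d when the truth is h; within each
# column, the two incorrect decisions share one utility value.
U = np.array([[1.0, 0.2, 0.1],
              [0.2, 1.0, 0.1],
              [0.2, 0.2, 1.0]])

posterior = np.array([0.5, 0.3, 0.2])   # P(h | data) from some classifier
expected_utility = U @ posterior
decision = int(np.argmax(expected_utility))
print(f"expected utilities {expected_utility}, decide class {decision}")
```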

  9. Occupancy estimation and the closure assumption

    Science.gov (United States)

    Rota, Christopher T.; Fletcher, Robert J.; Dorazio, Robert M.; Betts, Matthew G.

    2009-01-01

    1. Recent advances in occupancy estimation that adjust for imperfect detection have provided substantial improvements over traditional approaches and are receiving considerable use in applied ecology. To estimate and adjust for detectability, occupancy modelling requires multiple surveys at a site and requires the assumption of 'closure' between surveys, i.e. no changes in occupancy between surveys. Violations of this assumption could bias parameter estimates; however, little work has assessed model sensitivity to violations of this assumption or how commonly such violations occur in nature. 2. We apply a modelling procedure that can test for closure to two avian point-count data sets in Montana and New Hampshire, USA, that exemplify time-scales at which closure is often assumed. These data sets illustrate different sampling designs that allow testing for closure but are currently rarely employed in field investigations. Using a simulation study, we then evaluate the sensitivity of parameter estimates to changes in site occupancy and evaluate a power analysis developed for sampling designs that is aimed at limiting the likelihood of closure. 3. Application of our approach to point-count data indicates that habitats may frequently be open to changes in site occupancy at time-scales typical of many occupancy investigations, with 71% and 100% of species investigated in Montana and New Hampshire respectively, showing violation of closure across time periods of 3 weeks and 8 days respectively. 4. Simulations suggest that models assuming closure are sensitive to changes in occupancy. Power analyses further suggest that the modelling procedure we apply can effectively test for closure. 5. Synthesis and applications. Our demonstration that sites may be open to changes in site occupancy over time-scales typical of many occupancy investigations, combined with the sensitivity of models to violations of the closure assumption, highlights the importance of properly addressing
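
    The role of the closure assumption is visible in the likelihood itself. The sketch below simulates detection histories from a closed model with occupancy psi and detection probability p (invented values) and recovers both by maximum likelihood; if occupancy changed between surveys, the all-zero and detection terms below would be misspecified.

```python
# Closed single-season occupancy model: simulate and fit by ML.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(8)
n_sites, K, psi_true, p_true = 200, 4, 0.6, 0.3
z = rng.binomial(1, psi_true, n_sites)           # true occupancy (closed!)
y = rng.binomial(1, p_true * z[:, None], (n_sites, K))

def nll(theta):
    psi, p = 1 / (1 + np.exp(-theta))            # logit scale for stability
    det = y.sum(axis=1)
    # Sites with detections must be occupied; all-zero histories mix the
    # "occupied but missed every survey" and "unoccupied" cases.
    lik = np.where(det > 0,
                   psi * p ** det * (1 - p) ** (K - det),
                   psi * (1 - p) ** K + (1 - psi))
    return -np.sum(np.log(lik))

fit = minimize(nll, x0=np.zeros(2))
print("estimated psi, p =", 1 / (1 + np.exp(-fit.x)))
```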

  10. Semi-supervised learning via regularized boosting working on multiple semi-supervised assumptions.

    Science.gov (United States)

    Chen, Ke; Wang, Shihai

    2011-01-01

    Semi-supervised learning concerns the problem of learning in the presence of labeled and unlabeled data. Several boosting algorithms have been extended to semi-supervised learning with various strategies. To our knowledge, however, none of them takes all three semi-supervised assumptions, i.e., smoothness, cluster, and manifold assumptions, together into account during boosting learning. In this paper, we propose a novel cost functional consisting of the margin cost on labeled data and the regularization penalty on unlabeled data based on three fundamental semi-supervised assumptions. Thus, minimizing our proposed cost functional with a greedy yet stagewise functional optimization procedure leads to a generic boosting framework for semi-supervised learning. Extensive experiments demonstrate that our algorithm yields favorable results for benchmark and real-world classification tasks in comparison to state-of-the-art semi-supervised learning algorithms, including newly developed boosting algorithms. Finally, we discuss relevant issues and relate our algorithm to the previous work.

  11. Contemporary assumptions on human nature and work and approach to human potential managing

    Directory of Open Access Journals (Sweden)

    Vujić Dobrila

    2006-01-01

    Full Text Available The general problem of this research is to identify whether there is a relationship between assumptions on human nature and work (McGregor, Argyris, Schein, Steers and Porter) and the preference for a general organizational model, as well as for mechanisms of human resource management. This research was carried out in 2005/2006. The sample consisted of 317 subjects (197 managers, 105 highly educated subordinates and 15 entrepreneurs) in 7 big enterprises and a group of small business enterprises differing in terms of the entrepreneur’s structure and the type of activity. The general hypothesis, that assumptions on human nature and work are statistically significantly connected to the preferred approach (model) of work motivation and commitment, has been confirmed. Specific hypotheses have also been confirmed: · The assumptions on a human as a rational economic being correlate significantly with only two mechanisms of the traditional model, the work-method control mechanism and the working-discipline mechanism. · The assumptions on a human as a social being correlate significantly with all employee-engagement mechanisms belonging to the human relations model, except the mechanism of introducing the same type of rewards for all employees independently of working results. · The assumptions on a human as a creative being correlate significantly and positively with the preference for two mechanisms belonging to the human resource model: investing in education and training, and creating conditions for the application of knowledge and skills. Young subjects holding assumptions on a human as a creative being prefer a much broader repertoire of human resource model mechanisms than the remaining categories of subjects in the sample. The connection between assumptions on human nature and preferred models of engaging employees appears especially in the sub-sample of managers, in the category of young subjects.

  12. Effect of grid resolution and subgrid assumptions on the model prediction of a reactive buoyant plume under convective conditions

    International Nuclear Information System (INIS)

    Chock, D.P.; Winkler, S.L.; Pu Sun

    2002-01-01

    We have introduced a new and elaborate approach to understand the impact of grid resolution and subgrid chemistry assumptions on the grid-model prediction of species concentrations for a system with highly non-homogeneous chemistry: a reactive buoyant plume immediately downwind of the stack in a convective boundary layer. The Parcel-Grid approach was used to describe both air parcel turbulent transport and chemistry. This approach allows an identical transport process for all simulations. It also allows a description of subgrid chemistry. The ambient and plume parcel transport follows the description of Luhar and Britter (Atmos. Environ. 23 (1989) 1911, 26A (1992) 1283). The chemistry follows that of the Carbon-Bond mechanism. Three different grid sizes were considered: fine, medium and coarse, together with three different subgrid chemistry assumptions: micro-scale or individual parcel, tagged-parcel (plume and ambient parcels treated separately), and untagged-parcel (plume and ambient parcels treated indiscriminately). Reducing the subgrid information is not necessarily similar to increasing the model grid size. In our example, increasing the grid size leads to a reduction in the suppression of ozone in the presence of a high-NOx stack plume, and a reduction in the effectiveness of the NOx-inhibition effect. On the other hand, reducing the subgrid information (by using the untagged-parcel assumption) leads to an increase in ozone reduction and an enhancement of the NOx-inhibition effect insofar as the ozone extremum is concerned. (author)

  13. 40 CFR 265.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ..., STORAGE, AND DISPOSAL FACILITIES Financial Requirements § 265.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the... 40 Protection of Environment 25 2010-07-01 2010-07-01 false State assumption of responsibility...

  14. 40 CFR 144.66 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... PROGRAMS (CONTINUED) UNDERGROUND INJECTION CONTROL PROGRAM Financial Responsibility: Class I Hazardous Waste Injection Wells § 144.66 State assumption of responsibility. (a) If a State either assumes legal... 40 Protection of Environment 22 2010-07-01 2010-07-01 false State assumption of responsibility...

  15. Load assumption for fatigue design of structures and components counting methods, safety aspects, practical application

    CERN Document Server

    Köhler, Michael; Pötter, Kurt; Zenner, Harald

    2017-01-01

    Understanding the fatigue behaviour of structural components under variable load amplitude is an essential prerequisite for safe and reliable light-weight design. For designing and dimensioning, the expected stress (load) is compared with the capacity to withstand loads (fatigue strength). In this process, the safety necessary for each particular application must be ensured. A prerequisite for ensuring the required fatigue strength is a reliable load assumption. The authors describe the transformation of the stress- and load-time functions which have been measured under operational conditions to spectra or matrices with the application of counting methods. The aspects which must be considered for ensuring a reliable load assumption for designing and dimensioning are discussed in detail. Furthermore, the theoretical background for estimating the fatigue life of structural components is explained, and the procedures are discussed for numerous applications in practice. One of the prime intentions of the authors ...
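
    As a taste of the counting methods the book covers, the sketch below applies one of the simplest, level-crossing counting, to a synthetic load-time signal; practical load assumption work would typically use rainflow counting and measured service loads.

```python
# Level-crossing counting on a synthetic load-time history.
import numpy as np

rng = np.random.default_rng(9)
load = np.cumsum(rng.normal(0, 1, 2000))         # synthetic load-time signal
load -= load.mean()

levels = np.linspace(load.min(), load.max(), 21)
# Count upward crossings of each load level across the whole history
upward_crossings = [
    int(np.sum((load[:-1] < lv) & (load[1:] >= lv))) for lv in levels
]
for lv, c in zip(levels[::5], upward_crossings[::5]):
    print(f"level {lv:8.2f}: {c:4d} upward crossings")
```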

  16. 40 CFR 264.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... FACILITIES Financial Requirements § 264.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure, post-closure care, or... 40 Protection of Environment 25 2010-07-01 2010-07-01 false State assumption of responsibility...

  17. 40 CFR 261.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... Excluded Hazardous Secondary Materials § 261.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure or liability... 40 Protection of Environment 25 2010-07-01 2010-07-01 false State assumption of responsibility...

  18. 40 CFR 267.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... STANDARDIZED PERMIT Financial Requirements § 267.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure care or liability... 40 Protection of Environment 26 2010-07-01 2010-07-01 false State assumption of responsibility...

  19. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    International Nuclear Information System (INIS)

    Shao, Kan; Gift, Jeffrey S.; Setzer, R. Woodrow

    2013-01-01

Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD), which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available, as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and the relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption influences BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within-dose-group variance is small, while the lognormality assumption is the better choice for the relative deviation method when data are more skewed, because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more
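To make the “hybrid” extra-risk calculation concrete, the following minimal Python sketch computes a BMD for a continuous endpoint under a normality assumption; the linear mean model, constant variance, and all numeric values are illustrative assumptions, not the study's models or data.

```python
from scipy.stats import norm
from scipy.optimize import brentq

b0, b1, sigma = 100.0, -2.0, 8.0   # hypothetical linear model: mean response falls with dose
p0 = 0.01                          # assumed background risk defining an "adverse" response
bmr = 0.10                         # benchmark response, expressed as extra risk

cutoff = norm.ppf(p0, loc=b0, scale=sigma)   # responses below this cutoff are adverse

def extra_risk(d):
    """Extra risk of an adverse (low) response at dose d under the hybrid method."""
    p_d = norm.cdf(cutoff, loc=b0 + b1 * d, scale=sigma)
    return (p_d - p0) / (1.0 - p0)

# BMD: the dose at which extra risk reaches the benchmark response
bmd = brentq(lambda d: extra_risk(d) - bmr, 0.0, 50.0)
print(f"hybrid-method BMD = {bmd:.2f}")      # about 4.4 for these illustrative values
```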

  20. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Kan, E-mail: Shao.Kan@epa.gov [ORISE Postdoctoral Fellow, National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Gift, Jeffrey S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Setzer, R. Woodrow [National Center for Computational Toxicology, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)

    2013-11-01

Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD), which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available, as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and the relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption influences BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within-dose-group variance is small, while the lognormality assumption is the better choice for the relative deviation method when data are more skewed, because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more

  1. Formalization and Analysis of Reasoning by Assumption

    OpenAIRE

    Bosse, T.; Jonker, C.M.; Treur, J.

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically analyzed against dynamic properties they fulfill. To this end, for the pattern of reasoning by assumption a variety of dynamic properties have been speci...

  2. Drug policy in sport: hidden assumptions and inherent contradictions.

    Science.gov (United States)

    Smith, Aaron C T; Stewart, Bob

    2008-03-01

This paper considers the assumptions underpinning the current drugs-in-sport policy arrangements. We examine the assumptions and contradictions inherent in the policy approach, paying particular attention to the evidence that supports different policy arrangements. We find that the current anti-doping policy of the World Anti-Doping Agency (WADA) contains inconsistencies and ambiguities. WADA's policy position is predicated upon four fundamental principles: first, the need for sport to set a good example; secondly, the necessity of ensuring a level playing field; thirdly, the responsibility to protect the health of athletes; and fourthly, the importance of preserving the integrity of sport. A review of the evidence, however, suggests that sport is a problematic institution when it comes to setting a good example for the rest of society. Neither is it clear that sport has an inherent or essential integrity that can only be sustained through regulation. Furthermore, it is doubtful that WADA's anti-doping policy is effective in maintaining a level playing field, or is the best means of protecting the health of athletes. The WADA anti-doping policy is based too heavily on principles of minimising drug use, and gives insufficient weight to the minimisation of drug-related harms. As a result, drug-related harms are being poorly managed in sport. We argue that anti-doping policy in sport would benefit from placing greater emphasis on a harm minimisation model.

  3. SOME CONCEPTIONS AND MISCONCEPTIONS ON REALITY AND ASSUMPTIONS IN FINANCIAL ACCOUNTING

    OpenAIRE

    Stanley C. W. Salvary

    2005-01-01

    This paper addresses two problematic issues arising from the importation of terms into financial accounting: (1) the nature of economic reality; and (2) the role of assumptions. These two issues have stirred a lot of controversy relating to financial accounting measurements and affect attestation reports. This paper attempts to provide conceptual clarity on these two issues.

  4. Technical note: Evaluation of the simultaneous measurements of mesospheric OH, HO2, and O3 under a photochemical equilibrium assumption - a statistical approach

    Science.gov (United States)

    Kulikov, Mikhail Y.; Nechaev, Anton A.; Belikovich, Mikhail V.; Ermakova, Tatiana S.; Feigin, Alexander M.

    2018-05-01

This Technical Note presents a statistical approach to evaluating simultaneous measurements of several atmospheric components under the assumption of photochemical equilibrium. We consider simultaneous measurements of OH, HO2, and O3 at mesospheric altitudes as a specific example, with their daytime photochemical equilibrium as the evaluating relationship. A simplified algebraic equation relating local concentrations of these components in the 50-100 km altitude range has been derived. The parameters of the equation are temperature, neutral density, local zenith angle, and the rates of eight reactions. We have performed a one-year simulation of the mesosphere and lower thermosphere using a 3-D chemical-transport model. The simulation shows that the discrepancy between the calculated evolution of the components and the equilibrium value given by the equation does not exceed 3-4 % over the full range of altitudes, independent of season or latitude. We have developed a statistical Bayesian technique for evaluating simultaneous measurements of OH, HO2, and O3 based on the equilibrium equation, taking the measurement error into account. The first results of applying the technique to MLS/Aura (Microwave Limb Sounder) data are presented in this Technical Note. It has been found that the satellite HO2 data regularly place this component's mesospheric maximum at lower altitudes. This has also been confirmed by modelled HO2 distributions and by comparison with offline retrievals of HO2 from daily zonal-mean MLS radiances.

  5. DDH-Like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike

    2012-01-01

We introduce and study a new type of DDH-like assumptions based on groups of prime order q. Whereas standard DDH is based on encoding elements of $\mathbb{F}_{q}$ “in the exponent” of elements in the group, we ask what happens if instead we put in the exponent elements of the extension ring $R_f=\mathbb{F}_{q}[X]/(f)$ ... Naor-Reingold style pseudorandom functions, and auxiliary input secure encryption. This can be seen as an alternative to the known family of k-LIN assumptions....

  6. Heterosexual assumptions in verbal and non-verbal communication in nursing.

    Science.gov (United States)

    Röndahl, Gerd; Innala, Sune; Carlsson, Marianne

    2006-11-01

    This paper reports a study of what lesbian women and gay men had to say, as patients and as partners, about their experiences of nursing in hospital care, and what they regarded as important to communicate about homosexuality and nursing. The social life of heterosexual cultures is based on the assumption that all people are heterosexual, thereby making homosexuality socially invisible. Nurses may assume that all patients and significant others are heterosexual, and these heteronormative assumptions may lead to poor communication that affects nursing quality by leading nurses to ask the wrong questions and make incorrect judgements. A qualitative interview study was carried out in the spring of 2004. Seventeen women and 10 men ranging in age from 23 to 65 years from different parts of Sweden participated. They described 46 experiences as patients and 31 as partners. Heteronormativity was communicated in waiting rooms, in patient documents and when registering for admission, and nursing staff sometimes showed perplexity when an informant deviated from this heteronormative assumption. Informants had often met nursing staff who showed fear of behaving incorrectly, which could lead to a sense of insecurity, thereby impeding further communication. As partners of gay patients, informants felt that they had to deal with heterosexual assumptions more than they did when they were patients, and the consequences were feelings of not being accepted as a 'true' relative, of exclusion and neglect. Almost all participants offered recommendations about how nursing staff could facilitate communication. Heterosexual norms communicated unconsciously by nursing staff contribute to ambivalent attitudes and feelings of insecurity that prevent communication and easily lead to misconceptions. Educational and management interventions, as well as increased communication, could make gay people more visible and thereby encourage openness and awareness by hospital staff of the norms that they

  7. Formalization and analysis of reasoning by assumption.

    Science.gov (United States)

    Bosse, Tibor; Jonker, Catholijn M; Treur, Jan

    2006-01-02

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically analyzed against dynamic properties they fulfill. To this end, for the pattern of reasoning by assumption a variety of dynamic properties have been specified, some of which are considered characteristic for the reasoning pattern, whereas some other properties can be used to discriminate among different approaches to the reasoning. These properties have been automatically checked for the traces acquired in experiments undertaken. The approach turned out to be beneficial from two perspectives. First, checking characteristic properties contributes to the empirical validation of a theory on reasoning by assumption. Second, checking discriminating properties allows the analyst to identify different classes of human reasoners. 2006 Lawrence Erlbaum Associates, Inc.

  8. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  9. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  10. Legal assumptions for private company claim for additional (supplementary payment

    Directory of Open Access Journals (Sweden)

    Šogorov Stevan

    2011-01-01

Full Text Available
The subject of analysis in this article is the legal assumptions that must be met for a private company to be able to claim additional payments. After introductory remarks, the discussion focuses on the existence of provisions regarding additional payments in the formation contract, or in a general resolution of the shareholders' meeting, as the starting point for the company's claim. The second assumption is a concrete resolution of the shareholders' meeting that creates individual obligations to make additional payments. The third assumption is definiteness regarding the sum of the payment and its due date. The sending of the claim by the relevant company body is the fourth legal assumption for realization of the company's right to claim additional payments from a member of the private company.

  11. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests!

Explaining Different Arrival Times

[Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones]

Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics:

Intrinsic delay: The photons may simply have been emitted at two different times by the astrophysical source.

Delay due to Lorentz invariance violation: Perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect.

Special-relativistic delay: Maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong. This, too, would cause photon velocities to be energy-dependent.

Delay due to gravitational potential: Perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, which would also cause different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect.

If we now turn this problem around, then by measuring the arrival-time delay between photons of different energies from various astrophysical sources (the further away, the better), we can provide constraints on these
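As a hedged, order-of-magnitude sketch of the last two mechanisms (leading-order expressions, neglecting cosmological corrections): for two photons of energies $E_1 < E_2$ from a source at distance $D$,

$$\Delta t_{\mathrm{LIV}} \approx \frac{E_2 - E_1}{E_{\mathrm{QG}}}\,\frac{D}{c}, \qquad \Delta t_{m_\gamma} \approx \frac{D}{2c}\,(m_\gamma c^2)^2 \left(\frac{1}{E_1^2} - \frac{1}{E_2^2}\right),$$

where $E_{\mathrm{QG}}$ is the assumed energy scale of linear Lorentz-invariance violation and $m_\gamma$ a hypothetical photon rest mass. Both delays grow with $D$, which is why more distant sources give tighter constraints.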

  12. Challenging the assumptions for thermal sensation scales

    DEFF Research Database (Denmark)

    Schweiker, Marcel; Fuchs, Xaver; Becker, Susanne

    2016-01-01

Scales are widely used to assess the personal experience of thermal conditions in built environments. Most commonly, thermal sensation is assessed, mainly to determine whether a particular thermal condition is comfortable for individuals. A seven-point thermal sensation scale has been used extensively, which is suitable for describing a one-dimensional relationship between physical parameters of indoor environments and subjective thermal sensation. However, human thermal comfort is not merely a physiological but also a psychological phenomenon. Thus, it should be investigated how scales for its assessment could benefit from a multidimensional conceptualization. The common assumptions related to the usage of thermal sensation scales are challenged, empirically supported by two analyses. These analyses show that the relationship between temperature and subjective thermal sensation is non

  13. Idaho National Engineering Laboratory installation roadmap assumptions document

    International Nuclear Information System (INIS)

    1993-05-01

This document is a composite of roadmap assumptions developed for the Idaho National Engineering Laboratory (INEL) by the US Department of Energy Idaho Field Office and subcontractor personnel as a key element in the implementation of the Roadmap Methodology for the INEL Site. The development and identification of these assumptions is an important factor in planning-basis development and establishes the planning baseline for all subsequent roadmap analysis at the INEL

  14. Climate Change: Implications for the Assumptions, Goals and Methods of Urban Environmental Planning

    Directory of Open Access Journals (Sweden)

    Kristina Hill

    2016-12-01

Full Text Available
As a result of increasing awareness of the implications of global climate change, shifts are becoming necessary and apparent in the assumptions, concepts, goals and methods of urban environmental planning. This review will present the argument that these changes represent a genuine paradigm shift in urban environmental planning. Reflection and action to develop this paradigm shift are critical now and in the next decades, because environmental planning for cities will only become more urgent as we enter a new climate period. The concepts, methods and assumptions that urban environmental planners have relied on in previous decades to protect people, ecosystems and physical structures are inadequate if they do not explicitly account for a rapidly changing regional climate context, specifically from a hydrological and ecological perspective. The over-arching concept of spatial suitability that guided planning in most of the 20th century has already given way to concepts that address sustainability, recognizing the importance of temporality. Quite rapidly, the concept of sustainability has been replaced in many planning contexts by the priority of establishing resilience in the face of extreme disturbance events. Now even this concept of resilience is being incorporated into a novel concept of urban planning as a process of adaptation to permanent, incremental environmental changes. This adaptation concept recognizes the necessity for continued resilience to extreme events, while acknowledging that permanent changes are also occurring as a result of trends that have a clear direction over time, such as rising sea levels. Similarly, the methods of urban environmental planning have relied on statistical data about hydrological and ecological systems that will not adequately describe these systems under a new climate regime. These methods are beginning to be replaced by methods that make use of early warning systems for regime shifts, and process

  15. Assumptions of Corporate Social Responsibility as Competitiveness Factor

    Directory of Open Access Journals (Sweden)

    Zaneta Simanaviciene

    2017-09-01

Full Text Available
The purpose of this study was to examine the assumptions of corporate social responsibility (CSR) as a competitiveness factor in an economic downturn. Findings indicate that factors affecting the quality of the microeconomic business environment, i.e., the sophistication of the enterprise's strategy and management processes, the quality of human capital resources, the increase in product/service demand, the development of related and supporting sectors, the efficient use of natural resources, and the competitive capacities of the enterprise, impact competitiveness at the micro level. The outcomes suggest that the implementation of CSR elements, i.e., economic, environmental and social responsibilities, provides good opportunities to increase business competitiveness.

  16. Deep Borehole Field Test Requirements and Controlled Assumptions.

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

This document presents design requirements and controlled assumptions intended for use in the engineering development and testing of: 1) prototype packages for radioactive waste disposal in deep boreholes; 2) a waste package surface handling system; and 3) a subsurface system for emplacing and retrieving packages in deep boreholes. Engineering development and testing is being performed as part of the Deep Borehole Field Test (DBFT; SNL 2014a). This document presents parallel sets of requirements for a waste disposal system and for the DBFT, showing the close relationship. In addition to design, it will also inform planning for drilling, construction, and scientific characterization activities for the DBFT. The information presented here follows typical preparations for engineering design. It includes functional and operating requirements for handling and emplacement/retrieval equipment, waste package design and emplacement requirements, borehole construction requirements, sealing requirements, and performance criteria. Assumptions are included where they could impact engineering design. Design solutions are avoided in the requirements discussion.

  17. Major Assumptions of Mastery Learning.

    Science.gov (United States)

    Anderson, Lorin W.

    Mastery learning can be described as a set of group-based, individualized, teaching and learning strategies based on the premise that virtually all students can and will, in time, learn what the school has to teach. Inherent in this description are assumptions concerning the nature of schools, classroom instruction, and learners. According to the…

  18. Testing surrogacy assumptions: can threatened and endangered plants be grouped by biological similarity and abundances?

    Directory of Open Access Journals (Sweden)

    Judy P Che-Castaldo

Full Text Available
There is renewed interest in implementing surrogate species approaches in conservation planning due to the large number of species in need of management but limited resources and data. One type of surrogate approach involves selection of one or a few species to represent a larger group of species requiring similar management actions, so that protection and persistence of the selected species would result in conservation of the group of species. However, among the criticisms of surrogate approaches is the need to test underlying assumptions, which remain rarely examined. In this study, we tested one of the fundamental assumptions underlying use of surrogate species in recovery planning: that there exist groups of threatened and endangered species that are sufficiently similar to warrant similar management or recovery criteria. Using a comprehensive database of all plant species listed under the U.S. Endangered Species Act and tree-based random forest analysis, we found no evidence of species groups based on a set of distributional and biological traits or by abundances and patterns of decline. Our results suggested that application of surrogate approaches for endangered species recovery would be unjustified. Thus, conservation planning focused on individual species and their patterns of decline will likely be required to recover listed species.

  19. 7 CFR 772.10 - Transfer and assumption-AMP loans.

    Science.gov (United States)

    2010-01-01

... DEPARTMENT OF AGRICULTURE, SPECIAL PROGRAMS, SERVICING MINOR PROGRAM LOANS, § 772.10 Transfer and assumption—AMP loans. (a) Eligibility. The Agency may approve transfers and assumptions of AMP loans when: (1) The... [7 CFR, Agriculture, 2010-01-01 edition]

  20. Are Prescription Opioids Driving the Opioid Crisis? Assumptions vs Facts.

    Science.gov (United States)

    Rose, Mark Edmund

    2018-04-01

    Sharp increases in opioid prescriptions, and associated increases in overdose deaths in the 2000s, evoked widespread calls to change perceptions of opioid analgesics. Medical literature discussions of opioid analgesics began emphasizing patient and public health hazards. Repetitive exposure to this information may influence physician assumptions. While highly consequential to patients with pain whose function and quality of life may benefit from opioid analgesics, current assumptions about prescription opioid analgesics, including their role in the ongoing opioid overdose epidemic, have not been scrutinized. Information was obtained by searching PubMed, governmental agency websites, and conference proceedings. Opioid analgesic prescribing and associated overdose deaths both peaked around 2011 and are in long-term decline; the sharp overdose increase recorded in 2014 was driven by illicit fentanyl and heroin. Nonmethadone prescription opioid analgesic deaths, in the absence of co-ingested benzodiazepines, alcohol, or other central nervous system/respiratory depressants, are infrequent. Within five years of initial prescription opioid misuse, 3.6% initiate heroin use. The United States consumes 80% of the world opioid supply, but opioid access is nonexistent for 80% and severely restricted for 4.1% of the global population. Many current assumptions about opioid analgesics are ill-founded. Illicit fentanyl and heroin, not opioid prescribing, now fuel the current opioid overdose epidemic. National discussion has often neglected the potentially devastating effects of uncontrolled chronic pain. Opioid analgesic prescribing and related overdoses are in decline, at great cost to patients with pain who have benefited or may benefit from, but cannot access, opioid analgesic therapy.

  1. On the validity of Brownian assumptions in the spin van der Waals model

    International Nuclear Information System (INIS)

    Oh, Suhk Kun

    1985-01-01

    A simple Brownian motion theory of the spin van der Waals model, which can be stationary, Markoffian or Gaussian, is studied. By comparing the Brownian motion theory with an exact theory called the generalized Langevin equation theory, the validity of the Brownian assumptions is tested. Thereby, it is shown explicitly how the Markoffian and Gaussian properties are modified in the spin van der Waals model under the influence of quantum fluctuations and long range ordering. (Author)

  2. Semi-Supervised Transductive Hot Spot Predictor Working on Multiple Assumptions

    KAUST Repository

    Wang, Jim Jing-Yan; Almasri, Islam; Shi, Yuexiang; Gao, Xin

    2014-01-01

None of the transductive semi-supervised algorithms takes all three semi-supervised assumptions, i.e., the smoothness, cluster and manifold assumptions, together into account during learning. In this paper, we propose a novel semi-supervised method for hot spot residue

  3. The social contact hypothesis under the assumption of endemic equilibrium: Elucidating the transmission potential of VZV in Europe

    Directory of Open Access Journals (Sweden)

    E. Santermans

    2015-06-01

Full Text Available
The basic reproduction number R0 and the effective reproduction number R are pivotal parameters in infectious disease epidemiology, quantifying the transmission potential of an infection in a population. We estimate both parameters from 13 pre-vaccination serological data sets on varicella zoster virus (VZV) in 12 European countries and from population-based social contact surveys under the commonly made assumptions of endemic and demographic equilibrium. The fit to the serology is evaluated using the inferred effective reproduction number R as a model eligibility criterion combined with AIC as a model selection criterion. For only 2 out of 12 countries, the common choice of a constant proportionality factor is sufficient to provide a good fit to the seroprevalence data. For the other countries, an age-specific proportionality factor provides a better fit, assuming physical contacts lasting longer than 15 min are a good proxy for potential varicella transmission events. In all countries, primary infection with VZV most often occurs in early childhood, but there is substantial variation in transmission potential, with R0 ranging from 2.8 in England and Wales to 7.6 in The Netherlands. Two non-parametric methods, the maximal information coefficient (MIC) and a random forest approach, are used to explain these differences in R0 in terms of relevant country-specific characteristics. Our results suggest an association with three general factors: inequality in wealth, infant vaccination coverage and child care attendance. This illustrates the need to consider fundamental differences between European countries when formulating and parameterizing infectious disease models.
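A minimal numerical sketch of the transmission-potential calculation in Python: under the social contact hypothesis, R0 is the dominant eigenvalue of a next-generation matrix built from contact rates, a proportionality factor q, and the mean infectious period D. The contact matrix, q, and D below are invented for illustration, not the paper's estimates.

```python
import numpy as np

# m[i, j]: assumed daily contacts an infective in age class j has with class i
# (hypothetical two-class example: children, adults)
m = np.array([[18.0,  6.0],
              [ 6.0, 10.0]])
q = 0.05   # assumed proportionality factor (transmission probability per contact)
D = 7.0    # assumed mean infectious period in days

K = q * D * m                             # next-generation matrix
R0 = np.max(np.abs(np.linalg.eigvals(K))) # dominant eigenvalue
print(f"R0 = {R0:.2f}")                   # about 3.2 for these illustrative numbers
```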

  4. Topographic controls on shallow groundwater levels in a steep, prealpine catchment: When are the TWI assumptions valid?

    NARCIS (Netherlands)

    Rinderer, M.; van Meerveld, H.J.; Seibert, J.

    2014-01-01

    Topographic indices like the Topographic Wetness Index (TWI) have been used to predict spatial patterns of average groundwater levels and to model the dynamics of the saturated zone during events (e.g., TOPMODEL). However, the assumptions underlying the use of the TWI in hydrological models, of
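For reference, the Topographic Wetness Index examined in this study is conventionally defined as

$$\mathrm{TWI} = \ln\!\left(\frac{a}{\tan\beta}\right),$$

where $a$ is the specific upslope contributing area per unit contour length and $\beta$ is the local slope angle; its use as a proxy for groundwater levels rests on steady-state, topography-controlled water-table assumptions, which are what this study evaluates.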

  5. An Extension to Deng's Entropy in the Open World Assumption with an Application in Sensor Data Fusion.

    Science.gov (United States)

    Tang, Yongchuan; Zhou, Deyun; Chan, Felix T S

    2018-06-11

Quantification of the degree of uncertainty in the Dempster-Shafer evidence theory (DST) framework with belief entropy is still an open issue, and even a blank field under the open world assumption. Currently, the existing uncertainty measures in the DST framework are limited to the closed world, where the frame of discernment (FOD) is assumed to be complete. To address this issue, this paper focuses on extending a belief entropy to the open world by considering simultaneously the uncertain information represented by the FOD and the nonzero mass function of the empty set. An extension to Deng's entropy in the open world assumption (EDEOW) is proposed as a generalization of Deng's entropy; it degenerates to the Deng entropy in the closed world wherever necessary. In order to test the reasonability and effectiveness of the extended belief entropy, an EDEOW-based information fusion approach is proposed and applied to sensor data fusion under uncertainty. The experimental results verify the usefulness and applicability of the extended measure as well as the modified sensor data fusion method. A few open issues still remain for future work: the necessary properties of a belief entropy in the open world assumption, whether there exists a belief entropy that satisfies all the existing properties, and what the most appropriate fusion frame is for sensor data fusion under uncertainty.
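For context, here is a runnable Python sketch of the closed-world Deng entropy that EDEOW generalizes; the example mass function is invented, and the open-world extension additionally folds the empty-set mass and the FOD into the formula, in the exact form given in the paper (not reproduced here).

```python
from math import log2

def deng_entropy(m):
    # m: dict mapping frozenset focal elements (non-empty) to masses summing to 1;
    # each focal element A contributes -m(A) * log2(m(A) / (2^|A| - 1))
    return -sum(mass * log2(mass / (2 ** len(A) - 1))
                for A, mass in m.items() if mass > 0 and len(A) > 0)

# Example mass function on the frame {a, b, c} (illustrative values)
m = {frozenset({'a'}): 0.4,
     frozenset({'a', 'b'}): 0.3,
     frozenset({'a', 'b', 'c'}): 0.3}
print(f"Deng entropy: {deng_entropy(m):.4f} bits")   # about 2.89 bits
```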

  6. Model specification in oral health-related quality of life research.

    Science.gov (United States)

    Kieffer, Jacobien M; Verrips, Erik; Hoogstraten, Johan

    2009-10-01

The aim of this study was to analyze conventional wisdom regarding the construction and analysis of oral health-related quality of life (OHRQoL) questionnaires and to outline statistical complications. Most methods used for developing and analyzing questionnaires, such as factor analysis and Cronbach's alpha, presume psychological constructs to be latent, implying a reflective measurement model with the underlying assumption of local independence. Local independence implies that the latent variable explains why the observed variables are related. Many OHRQoL questionnaires are analyzed as if they were based on a reflective measurement model; local independence is thus assumed. This assumption requires these questionnaires to consist solely of items that reflect, rather than determine, OHRQoL. The tenability of this assumption is the main topic of the present study. It is argued that OHRQoL questionnaires mix a formative measurement model with a reflective measurement model, thus violating the assumption of local independence. The implications are discussed.

  7. Assessing moderated mediation in linear models requires fewer confounding assumptions than assessing mediation.

    Science.gov (United States)

    Loeys, Tom; Talloen, Wouter; Goubert, Liesbet; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2016-11-01

It is well known from the mediation analysis literature that the identification of direct and indirect effects relies on strong assumptions of no unmeasured confounding. Even in randomized studies the mediator may still be correlated with unobserved prognostic variables that affect the outcome, in which case the mediator's role in the causal process may not be inferred without bias. In the behavioural and social science literature very little attention has been given so far to the causal assumptions required for moderated mediation analysis. In this paper we focus on the index for moderated mediation, which measures by how much the mediated effect is larger or smaller for varying levels of the moderator. We show that in linear models this index can be estimated without bias in the presence of unmeasured common causes of the moderator, mediator and outcome under certain conditions. Importantly, one can thus use the test for moderated mediation to support evidence for mediation under less stringent confounding conditions. We illustrate our findings with data from a randomized experiment assessing the impact of being primed with social deception upon observer responses to others' pain, and from an observational study of individuals who ended a romantic relationship assessing the effect of attachment anxiety during the relationship on mental distress 2 years after the break-up. © 2016 The British Psychological Society.
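For orientation, in the simplest linear formulation with a moderated first stage (a standard textbook setup, sketched here as an assumption rather than taken from the paper), the mediator and outcome models are

$$M = a_0 + a_1 X + a_2 W + a_3 XW + \varepsilon_M, \qquad Y = b_0 + c' X + b_1 M + b_2 W + \varepsilon_Y,$$

so the indirect effect at moderator value $w$ is $(a_1 + a_3 w)\,b_1$, and the index of moderated mediation, its rate of change in $w$, is the product $a_3 b_1$.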

  8. Assumptions for the Annual Energy Outlook 1993

    International Nuclear Information System (INIS)

    1993-01-01

    This report is an auxiliary document to the Annual Energy Outlook 1993 (AEO) (DOE/EIA-0383(93)). It presents a detailed discussion of the assumptions underlying the forecasts in the AEO. The energy modeling system is an economic equilibrium system, with component demand modules representing end-use energy consumption by major end-use sector. Another set of modules represents petroleum, natural gas, coal, and electricity supply patterns and pricing. A separate module generates annual forecasts of important macroeconomic and industrial output variables. Interactions among these components of energy markets generate projections of prices and quantities for which energy supply equals energy demand. This equilibrium modeling system is referred to as the Intermediate Future Forecasting System (IFFS). The supply models in IFFS for oil, coal, natural gas, and electricity determine supply and price for each fuel depending upon consumption levels, while the demand models determine consumption depending upon end-use price. IFFS solves for market equilibrium for each fuel by balancing supply and demand to produce an energy balance in each forecast year
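As a toy illustration of the market-clearing computation described above (the supply and demand curves are invented placeholders, not IFFS components), each fuel market amounts to a root-finding problem in price:

```python
from scipy.optimize import brentq

def demand(p):
    return 120.0 * p ** -0.5   # hypothetical downward-sloping demand curve

def supply(p):
    return 10.0 * p ** 0.8     # hypothetical upward-sloping supply curve

# Equilibrium: the price at which quantity supplied equals quantity demanded
p_eq = brentq(lambda p: supply(p) - demand(p), 0.1, 100.0)
print(f"price {p_eq:.2f}, quantity {demand(p_eq):.1f}")   # about 6.8 and 46
```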

  9. Emerging Assumptions About Organization Design, Knowledge And Action

    Directory of Open Access Journals (Sweden)

    Alan Meyer

    2013-12-01

Full Text Available
Participants in the Organizational Design Community’s 2013 Annual Conference faced the challenge of “making organization design knowledge actionable.” This essay summarizes the opinions and insights participants shared during the conference. I reflect on these ideas, connect them to recent scholarly thinking about organization design, and conclude that seeking to make design knowledge actionable is nudging the community away from an assumption set based upon linearity and equilibrium, and toward a new set of assumptions based on emergence, self-organization, and non-linearity.

  10. Questioning Engelhardt's assumptions in Bioethics and Secular Humanism.

    Science.gov (United States)

    Ahmadi Nasab Emran, Shahram

    2016-06-01

In Bioethics and Secular Humanism: The Search for a Common Morality, Tristram Engelhardt examines various possibilities of finding common ground for moral discourse among people from different traditions and concludes that they are futile. In this paper I argue that many of the assumptions on which Engelhardt bases his conclusion about the impossibility of a content-full secular bioethics are problematic. If one starts with the notion of moral strangers, there is, by definition, no possibility of a content-full moral discourse among them. The inquiry is therefore circular: it begins with a definition of moral strangers which implies that they do not share enough moral background or commitment to an authority to allow them to reach a moral agreement, and concludes that content-full morality is impossible among moral strangers. I argue that treating traditions as solid and immutable structures that insulate people across their boundaries is problematic. Another questionable assumption in Engelhardt's work is the idea that religious and philosophical traditions provide content-full moralities. Since his foundationalist account of moral reasoning and knowledge is the cardinal assumption in Engelhardt's review of the various alternatives for a content-full moral discourse among moral strangers, I analyze it and indicate the possibility of other ways of attaining moral knowledge besides the foundationalist one. I then examine Engelhardt's view concerning the futility of attempts at justifying a content-full secular bioethics, and indicate how these assumptions have shaped Engelhardt's critique of the alternatives for the possibility of content-full secular bioethics.

  11. The Arundel Assumption And Revision Of Some Large-Scale Maps ...

    African Journals Online (AJOL)

    The rather common practice of stating or using the Arundel Assumption without reference to appropriate mapping standards (except mention of its use for graphical plotting) is a major cause of inaccuracies in map revision. This paper describes an investigation to ascertain the applicability of the Assumption to the revision of ...

12. Evaluating methodological assumptions of a catch-curve survival estimation of unmarked precocial shorebird chicks

    Science.gov (United States)

    McGowan, Conor P.; Gardner, Beth

    2013-01-01

Estimating productivity for precocial species can be difficult because young birds leave their nest within hours or days of hatching and detectability thereafter can be very low. Recently, a method using a modified catch-curve to estimate precocial chick daily survival from age-based count data was presented, using Piping Plover (Charadrius melodus) data from the Missouri River. However, many of the assumptions of the catch-curve approach were not fully evaluated for precocial chicks. We developed a simulation model to mimic Piping Plovers, a fairly representative shorebird, and age-based count-data collection. Using the simulated data, we calculated daily survival estimates and compared them with the known daily survival rates from the simulation model. We conducted these comparisons under different sampling scenarios in which the ecological and statistical assumptions had been violated. Overall, the daily survival estimates calculated from the simulated data corresponded well with the true survival rates of the simulation. Violating the accurate-aging and independence assumptions did not result in biased daily survival estimates, whereas unequal detection of younger or older birds and violation of the birth-death equilibrium did result in estimator bias. Ensuring that all ages are equally detectable and timing data collection to approximately meet the birth-death equilibrium are key to the successful use of this method for precocial shorebirds.
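A minimal sketch of the catch-curve idea being evaluated (the counts below are simulated for illustration, not the study's data): under constant daily survival and the birth-death equilibrium assumption, expected age-specific counts decline geometrically with age, so the slope of log(count) on age estimates the log daily survival.

```python
import numpy as np

age = np.arange(20)                                   # chick age in days
counts = np.array([60, 55, 52, 46, 43, 40, 36, 34, 31, 29,
                   27, 24, 23, 21, 19, 18, 16, 15, 14, 13])

# Slope of the log-linear regression of counts on age estimates log(daily survival)
slope, intercept = np.polyfit(age, np.log(counts), 1)
print(f"estimated daily survival: {np.exp(slope):.3f}")   # about 0.92
```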

  13. An Extension to Deng’s Entropy in the Open World Assumption with an Application in Sensor Data Fusion

    Directory of Open Access Journals (Sweden)

    Yongchuan Tang

    2018-06-01

Full Text Available
Quantification of the degree of uncertainty in the Dempster-Shafer evidence theory (DST) framework with belief entropy is still an open issue, and even a blank field under the open world assumption. Currently, the existing uncertainty measures in the DST framework are limited to the closed world, where the frame of discernment (FOD) is assumed to be complete. To address this issue, this paper focuses on extending a belief entropy to the open world by considering simultaneously the uncertain information represented by the FOD and the nonzero mass function of the empty set. An extension to Deng's entropy in the open world assumption (EDEOW) is proposed as a generalization of Deng's entropy; it degenerates to the Deng entropy in the closed world wherever necessary. In order to test the reasonability and effectiveness of the extended belief entropy, an EDEOW-based information fusion approach is proposed and applied to sensor data fusion under uncertainty. The experimental results verify the usefulness and applicability of the extended measure as well as the modified sensor data fusion method. A few open issues still remain for future work: the necessary properties of a belief entropy in the open world assumption, whether there exists a belief entropy that satisfies all the existing properties, and what the most appropriate fusion frame is for sensor data fusion under uncertainty.

  14. Causal Mediation Analysis: Warning! Assumptions Ahead

    Science.gov (United States)

    Keele, Luke

    2015-01-01

    In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…

  15. Investigating Darcy-scale assumptions by means of a multiphysics algorithm

    Science.gov (United States)

    Tomin, Pavel; Lunati, Ivan

    2016-09-01

    Multiphysics (or hybrid) algorithms, which couple Darcy and pore-scale descriptions of flow through porous media in a single numerical framework, are usually employed to decrease the computational cost of full pore-scale simulations or to increase the accuracy of pure Darcy-scale simulations when a simple macroscopic description breaks down. Despite the massive increase in available computational power, the application of these techniques remains limited to core-size problems and upscaling remains crucial for practical large-scale applications. In this context, the Hybrid Multiscale Finite Volume (HMsFV) method, which constructs the macroscopic (Darcy-scale) problem directly by numerical averaging of pore-scale flow, offers not only a flexible framework to efficiently deal with multiphysics problems, but also a tool to investigate the assumptions used to derive macroscopic models and to better understand the relationship between pore-scale quantities and the corresponding macroscale variables. Indeed, by direct comparison of the multiphysics solution with a reference pore-scale simulation, we can assess the validity of the closure assumptions inherent to the multiphysics algorithm and infer the consequences for macroscopic models at the Darcy scale. We show that the definition of the scale ratio based on the geometric properties of the porous medium is well justified only for single-phase flow, whereas in case of unstable multiphase flow the nonlinear interplay between different forces creates complex fluid patterns characterized by new spatial scales, which emerge dynamically and weaken the scale-separation assumption. In general, the multiphysics solution proves very robust even when the characteristic size of the fluid-distribution patterns is comparable with the observation length, provided that all relevant physical processes affecting the fluid distribution are considered. This suggests that macroscopic constitutive relationships (e.g., the relative

  16. Regression assumptions in clinical psychology research practice-a systematic review of common misconceptions.

    Science.gov (United States)

    Ernst, Anja F; Albers, Casper J

    2017-01-01

Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking.
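A small synthetic sketch of the central misconception (data and seed are arbitrary): the normality assumption concerns the regression errors, so the check belongs on the residuals, not on the raw variables.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.exponential(2.0, 200)                  # predictor is heavily skewed...
y = 3.0 + 1.5 * x + rng.normal(0.0, 1.0, 200)  # ...but the errors are normal

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (intercept + slope * x)

_, p_x = stats.shapiro(x)            # typically tiny: x is non-normal (irrelevant)
_, p_res = stats.shapiro(residuals)  # typically large: errors look normal (relevant)
print(f"raw predictor p = {p_x:.3g}; residuals p = {p_res:.3g}")
```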

  17. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    Science.gov (United States)

    Ernst, Anja F.

    2017-01-01

Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking. PMID:28533971

  18. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    Directory of Open Access Journals (Sweden)

    Anja F. Ernst

    2017-05-01

Full Text Available
Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking.

  19. Evolution of Requirements and Assumptions for Future Exploration Missions

    Science.gov (United States)

    Anderson, Molly; Sargusingh, Miriam; Perry, Jay

    2017-01-01

NASA programs are maturing technologies, systems, and architectures to enable future exploration missions. To increase fidelity as technologies mature, developers must make assumptions that represent the requirements of a future program. Multiple efforts have begun to define these requirements, including team-internal assumptions, planning for system integration in early demonstrations, and discussions between international partners planning future collaborations. For many detailed life support system requirements, existing NASA documents set limits of acceptable values, but a future vehicle may be constrained in other ways and select a limited range of conditions. Other requirements are effectively set by interfaces or operations, and may differ for the same technology depending on whether the hardware is a demonstration system on the International Space Station or a critical component of a future vehicle. This paper highlights key assumptions representing potential life support requirements and explains the driving scenarios, constraints, or other issues behind them.

  20. Testing of one-inch UF{sub 6} cylinder valves under simulated fire conditions

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, P.G. [Martin Marietta Energy Systems, Inc., Paducah, KY (United States)

    1991-12-31

Accurate computational models which predict the behavior of UF{sub 6} cylinders exposed to fires are required to validate existing firefighting and emergency response procedures. Since the cylinder valve is a factor in the containment provided by the UF{sub 6} cylinder, assumptions about its behavior under fire conditions have been necessary in the development of such models. Consequently, test data are needed to substantiate these assumptions. Several studies cited in this document provide data related to the behavior of a 1-inch UF{sub 6} cylinder valve in fire situations. To acquire additional data, a series of tests was conducted at the Paducah Gaseous Diffusion Plant (PGDP) under a unique set of test conditions. This document describes this testing and the resulting data.

  1. Stable isotopes and elasmobranchs: tissue types, methods, applications and assumptions.

    Science.gov (United States)

    Hussey, N E; MacNeil, M A; Olin, J A; McMeans, B C; Kinney, M J; Chapman, D D; Fisk, A T

    2012-04-01

    Stable-isotope analysis (SIA) can act as a powerful ecological tracer with which to examine diet, trophic position and movement, as well as more complex questions pertaining to community dynamics and feeding strategies or behaviour among aquatic organisms. With major advances in the understanding of the methodological approaches and assumptions of SIA through dedicated experimental work in the broader literature coupled with the inherent difficulty of studying typically large, highly mobile marine predators, SIA is increasingly being used to investigate the ecology of elasmobranchs (sharks, skates and rays). Here, the current state of SIA in elasmobranchs is reviewed, focusing on available tissues for analysis, methodological issues relating to the effects of lipid extraction and urea, the experimental dynamics of isotopic incorporation, diet-tissue discrimination factors, estimating trophic position, diet and mixing models and individual specialization and niche-width analyses. These areas are discussed in terms of assumptions made when applying SIA to the study of elasmobranch ecology and the requirement that investigators standardize analytical approaches. Recommendations are made for future SIA experimental work that would improve understanding of stable-isotope dynamics and advance their application in the study of sharks, skates and rays. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.
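For example, the trophic-position estimation the review discusses is commonly based on nitrogen isotopes:

$$TP = \lambda + \frac{\delta^{15}\mathrm{N}_{\mathrm{consumer}} - \delta^{15}\mathrm{N}_{\mathrm{base}}}{\Delta^{15}\mathrm{N}},$$

where $\lambda$ is the trophic position of the baseline organism and $\Delta^{15}\mathrm{N}$ is the diet-tissue discrimination factor (a value near 3.4‰ is often assumed); whether such assumed discrimination factors are appropriate for elasmobranch tissues is among the assumptions the review examines.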

  2. Changing Assumptions and Progressive Change in Theories of Strategic Organization

    DEFF Research Database (Denmark)

    Foss, Nicolai J.; Hallberg, Niklas L.

    2017-01-01

A commonly held view is that strategic organization theories progress as a result of a Popperian process of bold conjectures and systematic refutations. However, our field also witnesses vibrant debates or disputes about the specific assumptions that our theories rely on, and although these debates are often decoupled from the results of empirical testing, changes in assumptions seem closely intertwined with theoretical progress. Using the case of the resource-based view, we suggest that progressive change in theories of strategic organization may come about as a result of scholarly debate and dispute over what constitutes proper assumptions—even in the absence of corroborating or falsifying empirical evidence. We also discuss how changing assumptions may drive future progress in the resource-based view.

  3. Clinical review: Moral assumptions and the process of organ donation in the intensive care unit

    OpenAIRE

    Streat, Stephen

    2004-01-01

The objective of the present article is to review moral assumptions underlying organ donation in the intensive care unit. Data sources used include personal experience, and a Medline search and a non-Medline search of relevant English-language literature. The study selection included articles concerning organ donation. All data were extracted and analysed by the author. In terms of data synthesis, a rational, utilitarian moral perspective dominates, and has captured and circumscribed the lan...

  4. A simulation study to compare three self-controlled case series approaches: correction for violation of assumption and evaluation of bias.

    Science.gov (United States)

    Hua, Wei; Sun, Guoying; Dodd, Caitlin N; Romio, Silvana A; Whitaker, Heather J; Izurieta, Hector S; Black, Steven; Sturkenboom, Miriam C J M; Davis, Robert L; Deceuninck, Genevieve; Andrews, N J

    2013-08-01

    The assumption that the occurrence of outcome event must not alter subsequent exposure probability is critical for preserving the validity of the self-controlled case series (SCCS) method. This assumption is violated in scenarios in which the event constitutes a contraindication for exposure. In this simulation study, we compared the performance of the standard SCCS approach and two alternative approaches when the event-independent exposure assumption was violated. Using the 2009 H1N1 and seasonal influenza vaccines and Guillain-Barré syndrome as a model, we simulated a scenario in which an individual may encounter multiple unordered exposures and each exposure may be contraindicated by the occurrence of outcome event. The degree of contraindication was varied at 0%, 50%, and 100%. The first alternative approach used only cases occurring after exposure with follow-up time starting from exposure. The second used a pseudo-likelihood method. When the event-independent exposure assumption was satisfied, the standard SCCS approach produced nearly unbiased relative incidence estimates. When this assumption was partially or completely violated, two alternative SCCS approaches could be used. While the post-exposure cases only approach could handle only one exposure, the pseudo-likelihood approach was able to correct bias for both exposures. Violation of the event-independent exposure assumption leads to an overestimation of relative incidence which could be corrected by alternative SCCS approaches. In multiple exposure situations, the pseudo-likelihood approach is optimal; the post-exposure cases only approach is limited in handling a second exposure and may introduce additional bias, thus should be used with caution. Copyright © 2013 John Wiley & Sons, Ltd.
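For orientation, here is the standard SCCS conditional likelihood that all three approaches build on (sketched in its usual form, not copied from the paper): for case $i$ whose observation period is cut into intervals $k$ of length $e_{ik}$ with exposure covariates $x_{ik}$, conditioning on the case's total number of events gives

$$L_i(\beta) = \prod_{k}\left(\frac{e_{ik}\, e^{x_{ik}^{\top}\beta}}{\sum_{j} e_{ij}\, e^{x_{ij}^{\top}\beta}}\right)^{n_{ik}},$$

with $e^{\beta}$ the relative incidence and $n_{ik}$ the number of events in interval $k$; the pseudo-likelihood alternative modifies this construction so that exposures occurring after the event no longer need to be independent of it.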

  5. A quantitative evaluation of a qualitative risk assessment framework: Examining the assumptions and predictions of the Productivity Susceptibility Analysis (PSA)

    Science.gov (United States)

    2018-01-01

    Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluating the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869

  6. Investigating the Assumptions of Uses and Gratifications Research

    Science.gov (United States)

    Lometti, Guy E.; And Others

    1977-01-01

    Discusses a study designed to determine empirically the gratifications sought from communication channels and to test the assumption that individuals differentiate channels based on gratifications. (MH)

  7. Assessing framing assumptions in quantitative health impact assessments: a housing intervention example.

    Science.gov (United States)

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2013-09-01

    Health impact assessment (HIA) is often used to determine ex ante the health impact of an environmental policy or an environmental intervention. Underpinning any HIA is the framing assumption, which defines the causal pathways mapping environmental exposures to health outcomes. The sensitivity of the HIA to the framing assumptions is often ignored. A novel method based on fuzzy cognitive maps (FCM) is developed to quantify the framing assumptions in the assessment stage of an HIA, and is then applied to a housing intervention (tightening insulation) as a case-study. Framing assumptions of the case-study were identified through a literature search of Ovid Medline (1948-2011). The FCM approach was used to identify the key variables that have the most influence in an HIA. Changes in air-tightness, ventilation, indoor air quality and mould/humidity have been identified as having the most influence on health. The FCM approach is widely applicable and can be used to inform the formulation of the framing assumptions in any quantitative HIA of environmental interventions. We argue that it is necessary to explore and quantify framing assumptions prior to conducting a detailed quantitative HIA during the assessment stage. Copyright © 2013 Elsevier Ltd. All rights reserved.
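
    As a rough illustration of the FCM mechanics described above, the sketch below iterates a small signed-weight map to a fixed point; the concept set, weights, and squashing function are all hypothetical stand-ins, not the paper's actual map:

      import numpy as np

      # Minimal fuzzy cognitive map: concepts are nodes, W[i, j] is the
      # signed influence of concept j on concept i, and states are
      # squashed to (0, 1) with a sigmoid at each step.
      concepts = ["air-tightness", "ventilation", "indoor air quality", "health"]
      W = np.array([
          [ 0.0, 0.0, 0.0, 0.0],   # air-tightness (driven externally)
          [-0.8, 0.0, 0.0, 0.0],   # tighter envelope -> less ventilation
          [ 0.0, 0.6, 0.0, 0.0],   # more ventilation -> better air quality
          [ 0.0, 0.0, 0.7, 0.0],   # better air quality -> better health
      ])

      def step(x, W, lam=1.0):
          return 1.0 / (1.0 + np.exp(-lam * (W @ x + x)))  # sigmoid update

      x = np.array([0.9, 0.5, 0.5, 0.5])   # scenario: high air-tightness
      for _ in range(30):                  # iterate towards a fixed point
          x = step(x, W)
      print(dict(zip(concepts, np.round(x, 3))))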

  8. The Emperor's sham - wrong assumption that sham needling is sham.

    Science.gov (United States)

    Lundeberg, Thomas; Lund, Iréne; Näslund, Jan; Thomas, Moolamanil

    2008-12-01

    During the last five years a large number of randomised controlled clinical trials (RCTs) have been published on the efficacy of acupuncture in different conditions. In most of these studies verum is compared with sham acupuncture. In general, both verum and sham have been found to be effective, often with little reported difference in outcome. This has repeatedly led to the conclusion that acupuncture is no more effective than placebo treatment. However, this conclusion is based on the assumption that sham acupuncture is inert. Since sham acupuncture evidently is merely another form of acupuncture from the physiological perspective, the assumption that sham is sham is incorrect, and conclusions based on this assumption are therefore invalid. Clinical guidelines based on such conclusions may therefore exclude suffering patients from valuable treatments.

  9. On the ontological assumptions of the medical model of psychiatry: philosophical considerations and pragmatic tasks

    Directory of Open Access Journals (Sweden)

    Giordano, James

    2010-01-01

    A common theme in the contemporary medical model of psychiatry is that pathophysiological processes are centrally involved in the explanation, evaluation, and treatment of mental illnesses. Implied in this perspective is that clinical descriptors of these pathophysiological processes are sufficient to distinguish underlying etiologies. Psychiatric classification requires differentiation between what counts as normality (i.e., order), and what counts as abnormality (i.e., disorder). The distinction(s) between normality and pathology entail assumptions that are often deeply presupposed, manifesting themselves in statements about what mental disorders are. In this paper, we explicate that realism, naturalism, reductionism, and essentialism are core ontological assumptions of the medical model of psychiatry. We argue that while naturalism, realism, and reductionism can be reconciled with advances in contemporary neuroscience, essentialism - as defined to date - may be conceptually problematic, and we pose an eidetic construct of bio-psychosocial order and disorder based upon complex systems' dynamics. However we also caution against the overuse of any theory, and claim that practical distinctions are important to the establishment of clinical thresholds. We opine that as we move ahead toward both a new edition of the Diagnostic and Statistical Manual, and a proposed Decade of the Mind, the task at hand is to re-visit nosologic and ontologic assumptions pursuant to a re-formulation of diagnostic criteria and practice. PMID:20109176

  11. The homogeneous marginal utility of income assumption

    NARCIS (Netherlands)

    Demuynck, T.

    2015-01-01

    We develop a test to verify if every agent from a population of heterogeneous consumers has the same marginal utility of income function. This homogeneous marginal utility of income assumption is often (implicitly) used in applied demand studies because it has nice aggregation properties and

  12. The Avalanche Hypothesis and Compression of Morbidity: Testing Assumptions through Cohort-Sequential Analysis.

    Directory of Open Access Journals (Sweden)

    Jordan Silberman

    The compression of morbidity model posits a breakpoint in the adult lifespan that separates an initial period of relative health from a subsequent period of ever increasing morbidity. Researchers often assume that such a breakpoint exists; however, this assumption is hitherto untested. To test the assumption that a breakpoint exists--which we term a morbidity tipping point--separating a period of relative health from a subsequent deterioration in health status. An analogous tipping point for healthcare costs was also investigated. Four years of adults' (N = 55,550) morbidity and costs data were retrospectively analyzed. Data were collected in Pittsburgh, PA between 2006 and 2009; analyses were performed in Rochester, NY and Ann Arbor, MI in 2012 and 2013. Cohort-sequential and hockey stick regression models were used to characterize long-term trajectories and tipping points, respectively, for both morbidity and costs. Morbidity increased exponentially with age (P<.001). A morbidity tipping point was observed at age 45.5 (95% CI, 41.3-49.7). An exponential trajectory was also observed for costs (P<.001), with a costs tipping point occurring at age 39.5 (95% CI, 32.4-46.6). Following their respective tipping points, both morbidity and costs increased substantially (Ps<.001). Findings support the existence of a morbidity tipping point, confirming an important but untested assumption. This tipping point, however, may occur earlier in the lifespan than is widely assumed. An "avalanche of morbidity" occurred after the morbidity tipping point--an ever increasing rate of morbidity progression. For costs, an analogous tipping point and "avalanche" were observed. The time point at which costs began to increase substantially occurred approximately 6 years before health status began to deteriorate.
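
    The hockey stick regression named in the methods can be sketched compactly: fit an intercept plus a hinge term and grid-search the breakpoint. This is an illustrative reconstruction with simulated data, not the study's code:

      import numpy as np

      rng = np.random.default_rng(0)
      age = rng.uniform(20, 80, 500)
      y = 1.0 + 0.08 * np.maximum(age - 45, 0) + rng.normal(0, 0.5, 500)

      def sse_at(c):
          # intercept + hinge: flat mean before c, linear increase after
          X = np.column_stack([np.ones_like(age), np.maximum(age - c, 0)])
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          return np.sum((y - X @ beta) ** 2)

      candidates = np.linspace(25, 75, 201)
      best = min(candidates, key=sse_at)
      print(f"estimated tipping point: age {best:.1f}")  # near the true 45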

  13. Academic Public Relations Curricula: How They Compare with the Bateman-Cutlip Commission Standards.

    Science.gov (United States)

    McCartney, Hunter P.

    To see what effect the 1975 Bateman-Cutlip Commission's recommendations have had on improving public relations education in the United States, 173 questionnaires were sent to colleges or universities with accredited or comprehensive programs in public relations. Responding to five basic assumptions underlying the commission's recommendations,…

  14. Recognising the Effects of Costing Assumptions in Educational Business Simulation Games

    Science.gov (United States)

    Eckardt, Gordon; Selen, Willem; Wynder, Monte

    2015-01-01

    Business simulations are a powerful way to provide experiential learning that is focussed, controlled, and concentrated. Inherent in any simulation, however, are numerous assumptions that determine feedback, and hence the lessons learnt. In this conceptual paper we describe some common cost assumptions that are implicit in simulation design and…

  15. On the derivation of approximations to cellular automata models and the assumption of independence.

    Science.gov (United States)

    Davies, K J; Green, J E F; Bean, N G; Binder, B J; Ross, J V

    2014-07-01

    Cellular automata are discrete agent-based models, generally used in cell-based applications. There is much interest in obtaining continuum models that describe the mean behaviour of the agents in these models. Previously, continuum models have been derived for agents undergoing motility and proliferation processes; however, these models only hold under restricted conditions. In order to narrow down the reason for these restrictions, we explore three possible sources of error in deriving the model. These sources are the choice of limiting arguments, the use of a discrete-time model as opposed to a continuous-time model, and the assumption of independence between the states of sites. We present a rigorous analysis in order to gain a greater understanding of the significance of these three issues. By finding a limiting regime that accurately approximates the conservation equation for the cellular automata, we are able to conclude that the inaccuracy between our approximation and the cellular automata is completely based on the assumption of independence. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Revealing patterns of cultural transmission from frequency data: equilibrium and non-equilibrium assumptions

    Science.gov (United States)

    Crema, Enrico R.; Kandler, Anne; Shennan, Stephen

    2016-12-01

    A long tradition of cultural evolutionary studies has developed a rich repertoire of mathematical models of social learning. Early studies have laid the foundation of more recent endeavours to infer patterns of cultural transmission from observed frequencies of a variety of cultural data, from decorative motifs on potsherds to baby names and musical preferences. While this wide range of applications provides an opportunity for the development of generalisable analytical workflows, archaeological data present new questions and challenges that require further methodological and theoretical discussion. Here we examine the decorative motifs of Neolithic pottery from an archaeological assemblage in Western Germany, and argue that the widely used (and relatively undiscussed) assumption that observed frequencies are the result of a system in equilibrium conditions is unwarranted, and can lead to incorrect conclusions. We analyse our data with a simulation-based inferential framework that can overcome some of the intrinsic limitations in archaeological data, as well as handle both equilibrium conditions and instances where the mode of cultural transmission is time-variant. Results suggest that none of the models examined can produce the observed pattern under equilibrium conditions, and suggest instead temporal shifts in the patterns of cultural transmission.

  17. Implementing multiple intervention strategies in Dutch public health-related policy networks

    NARCIS (Netherlands)

    Harting, Janneke; Peters, Dorothee; Grêaux, Kimberly; van Assema, Patricia; Verweij, Stefan; Stronks, Karien; Klijn, Erik-Hans

    2017-01-01

    Improving public health requires multiple intervention strategies. Implementing such an intervention mix is supposed to require a multisectoral policy network. As evidence to support this assumption is scarce, we examined under which conditions public health-related policy networks were able to

  18. The usefulness of information on HDL-cholesterol: potential pitfalls of conventional assumptions

    Directory of Open Access Journals (Sweden)

    Furberg Curt D

    2001-05-01

    Treatment decisions related to disease prevention are often based on two conventional and related assumptions. First, an intervention-induced change in a surrogate marker (such as high-density lipoprotein [HDL]-cholesterol) in the desired direction translates into health benefits (such as reduction in coronary events). Second, it is unimportant which interventions are used to alter surrogate markers, since an intervention benefit is independent of the means by which it is achieved. The scientific foundation for these assumptions has been questioned. In this commentary, the appropriateness of relying on low levels of HDL-cholesterol for treatment decisions is reviewed. The Veterans Affairs - HDL-Cholesterol Intervention Trial (VA-HIT) investigators recently reported that only 23% of the gemfibrozil-induced relative reduction in risk of coronary events observed in the trial could be explained by changes in HDL-cholesterol between baseline and the 1-year visit. Thus, 77% of the health benefit to the participants was unexplained. Other possible explanations are that gemfibrozil has multiple mechanisms of action, disease manifestations are multifactorial, and laboratory measurements of HDL-cholesterol are imprecise. The wisdom of relying on levels and changes in surrogate markers such as HDL-cholesterol to make decisions about treatment choices should be questioned. It seems better to rely on direct evidence of health benefits and to prescribe specific interventions that have been shown to reduce mortality and morbidity. Since extrapolations based on surrogate markers may not be in patients' best interest, the practice of medicine ought to be evidence-based.

  19. Comparative Interpretation of Classical and Keynesian Fiscal Policies (Assumptions, Principles and Primary Opinions

    Directory of Open Access Journals (Sweden)

    Engin Oner

    2015-06-01

    In the Classical School, founded by Adam Smith, which gives prominence to supply and adopts an approach of unbiased finance, the economy is always in a state of full employment equilibrium. In this system of thought, whose main philosophy is budget balance, which asserts that there is flexibility between prices and wages, and which regards public debt as an extraordinary instrument, the interference of the state with economic and social life is frowned upon. In line with the views of classical thought, the classical fiscal policy is based on three basic assumptions: the "Consumer State Assumption", the assumption that "Public Expenditures are Always Ineffectual", and the assumption concerning the "Impartiality of the Taxes and Expenditure Policies Implemented by the State". On the other hand, the Keynesian School, founded by John Maynard Keynes, gives prominence to demand, adopts the approach of functional finance, and asserts that cases of underemployment equilibrium and over-employment equilibrium exist in the economy as well as the full employment equilibrium, that problems cannot be solved through the invisible hand, that prices and wages are rigid, and that the interference of the state is essential, at which point fiscal policies have to be utilized effectively. Keynesian fiscal policy likewise depends on three primary assumptions: the assumption of the "Filter State", the assumption that "public expenditures are sometimes effective and sometimes ineffective or neutral", and the assumption that "the tax, debt and expenditure policies of the state can never be impartial".

  20. Determining Bounds on Assumption Errors in Operational Analysis

    Directory of Open Access Journals (Sweden)

    Neal M. Bengtson

    2014-01-01

    The technique of operational analysis (OA) is used in the study of systems performance, mainly for estimating mean values of various measures of interest, such as the number of jobs at a device and response times. The basic principles of operational analysis allow errors in assumptions to be quantified over a time period. The assumptions which are used to derive the operational analysis relationships are studied. Using Karush-Kuhn-Tucker (KKT) conditions, bounds on error measures of these OA relationships are found. Examples of these bounds are used for representative performance measures to show limits on the difference between true performance values and those estimated by operational analysis relationships. A technique for finding tolerance limits on the bounds is demonstrated with a simulation example.
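
    The operational analysis relationships in question are the standard operational laws, which follow directly from measured quantities. A small worked Python example (all measurements hypothetical):

      # Operational laws computed from quantities measured over a period T.
      T = 60.0          # observation period (s)
      C = 575           # completions
      B = 42.0          # device busy time (s)
      N = 8.3           # time-averaged number of jobs in system (measured)

      X = C / T         # throughput (jobs/s)
      U = B / T         # utilization; the utilization law gives U = X * S
      S = B / C         # mean service demand per job (s)
      R = N / X         # mean response time via Little's law, N = X * R

      print(f"X={X:.2f} jobs/s, U={U:.2f}, S={S*1000:.1f} ms, R={R:.2f} s")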

  1. CHILDREN'S EDUCATION IN THE REGULAR NATIONAL BASIS: ASSUMPTIONS AND INTERFACES WITH PHYSICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    André da Silva Mello

    2016-09-01

    This paper aims at discussing the organization of Children's Education within the Regular Curricular National Basis (BNCC), focusing on the continuities and advances relative to the preceding documents, and analyzing the presence of Physical Education in Children's Education from the assumptions that guide the Base, in interface with research about pedagogical experiences in this field of knowledge. To do so, it carries out a documental-bibliographic analysis, using as sources the BNCC, the National Curricular Referential for Children's Education, the National Curricular Guidelines for Children's Education, and academic-scientific productions belonging to the Physical Education area that approach Children's Education. In the analysis process, the work establishes categories which allow the interlocution among the different sources used in this study. The data analyzed offer indications that the assumptions present in the BNCC dialogue, though not explicitly, with the movements of the curricular component and with the Physical Education academic-scientific production regarding Children's Education.

  2. Examination of Conservatism in Ground-level Source Release Assumption when Performing Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung-yeop; Lim, Ho-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    One assumption frequently made is that of ground-level source release. The user manual of the consequence analysis software HotSpot states: 'If you cannot estimate or calculate the effective release height, the actual physical release height (height of the stack) or zero for ground-level release should be used. This will usually yield a conservative estimate, (i.e., larger radiation doses for all downwind receptors, etc).' This recommendation is defensible on grounds of conservatism, but a quantitative examination of the effect of this assumption on the results of consequence analysis is necessary. The source terms of the Fukushima Dai-ichi NPP accident have been estimated by several studies using inverse modeling, and one of the biggest sources of the differences between the results of these studies was the different effective source release height assumed by each study. This underlines the importance of quantitatively examining the influence of release height. In this study, a sensitivity analysis of the effective release height of radioactive sources was performed and the influence on the total effective dose was quantitatively examined. A difference of more than 20% persists even at longer distances when the dose computed assuming ground-level release is compared with the results assuming other effective plume heights. This means that the influence of the ground-level source assumption on latent cancer fatality estimates cannot be ignored. In addition, the assumption of ground-level release fundamentally prevents detailed analysis, including diffusion of the plume from the effective plume height to the ground, even though its influence is relatively lower at longer distances. When the influence of surface roughness is additionally considered, the situation could be more serious. The ground-level dose could be highly over-estimated at short downwind distances at NPP sites which have low surface roughness, such as the Barakah site in
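
    The conservatism of the ground-level assumption is visible directly in the Gaussian plume expression for the ground-level centerline concentration, C = Q / (pi * u * sigma_y * sigma_z) * exp(-H^2 / (2 * sigma_z^2)): setting the effective height H to zero maximizes the exponential factor. A sketch with purely illustrative dispersion parameters (not HotSpot's internal values):

      import numpy as np

      Q = 1.0e9   # source strength (Bq/s), hypothetical
      u = 3.0     # wind speed (m/s)

      def conc(sigma_y, sigma_z, H):
          # ground-level centerline concentration with ground reflection
          return Q / (np.pi * u * sigma_y * sigma_z) * np.exp(-H**2 / (2 * sigma_z**2))

      sy, sz = 70.0, 30.0   # crude sigmas at ~1 km, neutral stability (assumed)
      for H in (0.0, 20.0, 50.0):
          print(f"H = {H:4.0f} m: C = {conc(sy, sz, H):.3e} Bq/m^3")
      # H = 0 gives the largest ground-level concentration, hence the
      # conservative dose estimates discussed above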

  3. Central limit theorems under special relativity.

    Science.gov (United States)

    McKeague, Ian W

    2015-04-01

    Several relativistic extensions of the Maxwell-Boltzmann distribution have been proposed, but they do not explain observed lognormal tail-behavior in the flux distribution of various astrophysical sources. Motivated by this question, extensions of classical central limit theorems are developed under the conditions of special relativity. The results are related to CLTs on locally compact Lie groups developed by Wehn, Stroock and Varadhan, but in this special case the asymptotic distribution has an explicit form that is readily seen to exhibit lognormal tail behavior.

  4. Formalization and Analysis of Reasoning by Assumption

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning

  5. Psychopathology, fundamental assumptions and CD-4 T lymphocyte ...

    African Journals Online (AJOL)

    In addition, we explored whether psychopathology and negative fundamental assumptions in ... Method: Self-rating questionnaires to assess depressive symptoms, ... associated with all participants scoring in the positive range of the FA scale.

  6. Direct numerical simulations of temporally developing hydrocarbon shear flames at elevated pressure: effects of the equation of state and the unity Lewis number assumption

    Science.gov (United States)

    Korucu, Ayse; Miller, Richard

    2016-11-01

    Direct numerical simulations (DNS) of temporally developing shear flames are used to investigate both equation of state (EOS) and unity-Lewis (Le) number assumption effects in hydrocarbon flames at elevated pressure. A reduced kerosene/air mechanism including a semi-global soot formation/oxidation model is used to study soot formation/oxidation processes in a temporally developing hydrocarbon shear flame operating at both atmospheric and elevated pressures for the cubic Peng-Robinson real fluid EOS. Results are compared to simulations using the ideal gas law (IGL). The results show that while the unity-Le number assumption with the IGL EOS under-predicts the flame temperature for all pressures, with the real fluid EOS it under-predicts the flame temperature for 1 and 35 atm and over-predicts the rest. The soot mass fraction, Ys, is only under-predicted for the 1 atm flame for both the IGL and real fluid EOS models. While Ys is over-predicted at elevated pressures with the IGL EOS, for the real gas EOS the predictions of Ys are similar to results using a non-unity-Le model derived from non-equilibrium thermodynamics and real diffusivities. Adopting the unity-Le assumption is shown to cause misprediction of Ys, the flame temperature, and the mass fractions of CO, H and OH.

  7. Shattering Man’s Fundamental Assumptions in Don DeLillo’s Falling Man

    Directory of Open Access Journals (Sweden)

    Hazim Adnan Hashim

    2016-09-01

    The present study addresses effects of traumatic events such as the September 11 attacks on victims’ fundamental assumptions. These beliefs or assumptions provide individuals with expectations about the world and their sense of self-worth. Thus, they ground people’s sense of security, stability, and orientation. The September 11 terrorist attacks in the U.S.A. were very tragic for Americans because this fundamentally changed their understandings about many aspects in life. The attacks led many individuals to build new kinds of beliefs and assumptions about themselves and the world. Many writers have written about the human ordeals that followed this incident. Don DeLillo’s Falling Man reflects the traumatic repercussions of this disaster on Americans’ fundamental assumptions. The objective of this study is to examine the novel from the traumatic perspective that has afflicted the victims’ fundamental understandings of the world and the self. Individuals’ fundamental understandings could be changed or modified due to exposure to certain types of events like war, terrorism, political violence or even the sense of alienation. The Assumptive World theory of Ronnie Janoff-Bulman will be used as a framework to study the traumatic experience of the characters in Falling Man. The significance of the study lies in providing a new perception to the field of trauma that can help trauma victims to adopt alternative assumptions or reshape their previous ones to heal from traumatic effects.

  8. The Role of Policy Assumptions in Validating High-stakes Testing Programs.

    Science.gov (United States)

    Kane, Michael

    L. Cronbach has made the point that for validity arguments to be convincing to diverse audiences, they need to be based on assumptions that are credible to these audiences. The interpretations and uses of high stakes test scores rely on a number of policy assumptions about what should be taught in schools, and more specifically, about the content…

  9. Encoding-related brain activity dissociates between the recollective processes underlying successful recall and recognition: a subsequent-memory study.

    Science.gov (United States)

    Sadeh, Talya; Maril, Anat; Goshen-Gottstein, Yonatan

    2012-07-01

    The subsequent-memory (SM) paradigm uncovers brain mechanisms that are associated with mnemonic activity during encoding by measuring participants' neural activity during encoding and classifying the encoding trials according to performance in the subsequent retrieval phase. The majority of these studies have converged on the notion that the mechanism supporting recognition is mediated by familiarity and recollection. The process of recollection is often assumed to be a recall-like process, implying that the active search for the memory trace is similar, if not identical, for recall and recognition. Here we challenge this assumption and hypothesize - based on previous findings obtained in our lab - that the recollective processes underlying recall and recognition might show dissociative patterns of encoding-related brain activity. To this end, our design controlled for familiarity, thereby focusing on contextual, recollective processes. We found evidence for dissociative neurocognitive encoding mechanisms supporting subsequent-recall and subsequent-recognition. Specifically, the contrast of subsequent-recognition versus subsequent-recall revealed activation in the Parahippocampal cortex (PHc) and the posterior hippocampus--regions associated with contextual processing. Implications of our findings and their relation to current cognitive models of recollection are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. How to Handle Assumptions in Synthesis

    Directory of Open Access Journals (Sweden)

    Roderick Bloem

    2014-07-01

    The increased interest in reactive synthesis over the last decade has led to many improved solutions but also to many new questions. In this paper, we discuss the question of how to deal with assumptions on environment behavior. We present four goals that we think should be met and review several different possibilities that have been proposed. We argue that each of them falls short in at least one aspect.

  11. The incompressibility assumption in computational simulations of nasal airflow.

    Science.gov (United States)

    Cal, Ismael R; Cercos-Pita, Jose Luis; Duque, Daniel

    2017-06-01

    Most of the computational works on nasal airflow up to date have assumed incompressibility, given the low Mach number of these flows. However, for high temperature gradients, the incompressibility assumption could lead to a loss of accuracy, due to the temperature dependence of air density and viscosity. In this article we aim to shed some light on the influence of this assumption in a model of calm breathing in an Asian nasal cavity, by solving the fluid flow equations in compressible and incompressible formulation for different ambient air temperatures using the OpenFOAM package. At low flow rates and warm climatological conditions, similar results were obtained from both approaches, showing that density variations need not be taken into account to obtain a good prediction of all flow features, at least for usual breathing conditions. This agrees with most of the simulations previously reported, at least as far as the incompressibility assumption is concerned. However, parameters like nasal resistance and wall shear stress distribution differ for air temperatures below [Formula: see text]C approximately. Therefore, density variations should be considered for simulations at such low temperatures.
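
    The size of the effect can be anticipated from the ideal gas law alone, rho = p / (R * T): the colder the inhaled air, the larger its density contrast with air near body temperature. A quick Python check (wall temperature assumed, numbers illustrative):

      p = 101325.0   # Pa
      R = 287.05     # J/(kg K), specific gas constant of dry air

      def rho(T_celsius):
          return p / (R * (T_celsius + 273.15))

      T_wall = 34.0  # approximate nasal wall temperature (assumed)
      for T_amb in (30.0, 10.0, -10.0, -30.0):
          diff = (rho(T_amb) - rho(T_wall)) / rho(T_wall) * 100
          print(f"ambient {T_amb:6.1f} C: density differs by {diff:5.1f}%")
      # at -30 C the contrast exceeds 25%, straining a constant-density model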

  12. The dispersion relation of a gravitating spiral system

    International Nuclear Information System (INIS)

    Evangelidis, E.

    1977-01-01

    The dispersion relation has been found for a galaxy, without the assumption that the centrifugal force is balanced by the gravitational force. It has been shown that such a system (1) can be gravitationally unstable under appropriate conditions, and (2) that there is no resonance at ω=2Ω (Ω=angular velocity of the Galaxy). (Auth.)

  13. On the Empirical Importance of the Conditional Skewness Assumption in Modelling the Relationship between Risk and Return

    Science.gov (United States)

    Pipień, M.

    2008-09-01

    We present the results of an application of Bayesian inference in testing the relation between risk and return on financial instruments. On the basis of the Intertemporal Capital Asset Pricing Model proposed by Merton, we built a general sampling distribution suitable for analysing this relationship. The most important feature of our assumptions is that the skewness of the conditional distribution of returns is used as an alternative source of the relation between risk and return. This general specification relates to the Skewed Generalized Autoregressive Conditionally Heteroscedastic-in-Mean model. In order to make the conditional distribution of financial returns skewed, we considered a unified approach based on the inverse probability integral transformation. In particular, we applied the hidden truncation mechanism, inverse scale factors, the order statistics concept, Beta and Bernstein distribution transformations, and also a constructive method. Based on the daily excess returns on the Warsaw Stock Exchange Index, we checked the empirical importance of the conditional skewness assumption for the relation between risk and return on the Warsaw Stock Market. We present posterior probabilities of all competing specifications as well as the posterior analysis of the positive sign of the tested relationship.
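
    Of the skewing mechanisms listed, hidden truncation is the easiest to show concretely: a symmetric density f with cdf F becomes g(x) = 2 f(x) F(alpha x), with alpha controlling the asymmetry. A minimal check against SciPy's built-in skew-normal (illustrative only; the paper embeds this construction inside a GARCH-in-Mean likelihood):

      import numpy as np
      from scipy import stats

      alpha = 3.0
      x = np.linspace(-4, 4, 9)

      manual = 2 * stats.norm.pdf(x) * stats.norm.cdf(alpha * x)
      library = stats.skewnorm.pdf(x, a=alpha)  # same construction

      print(np.allclose(manual, library))       # True: identical densities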

  14. False assumptions.

    Science.gov (United States)

    Swaminathan, M

    1997-01-01

    Indian women do not have to be told the benefits of breast feeding or "rescued from the clutches of wicked multinational companies" by international agencies. There is no proof that breast feeding has declined in India; in fact, a 1987 survey revealed that 98% of Indian women breast feed. Efforts to promote breast feeding among the middle classes rely on such initiatives as the "baby friendly" hospital where breast feeding is promoted immediately after birth. This ignores the 76% of Indian women who give birth at home. Blaming this unproved decline in breast feeding on multinational companies distracts attention from more far-reaching and intractable effects of social change. While the Infant Milk Substitutes Act is helpful, it also deflects attention from more pressing issues. Another false assumption is that Indian women are abandoning breast feeding to comply with the demands of employment, but research indicates that most women give up employment for breast feeding, despite the economic cost to their families. Women also seek work in the informal sector to secure the flexibility to meet their child care responsibilities. Instead of being concerned about "teaching" women what they already know about the benefits of breast feeding, efforts should be made to remove the constraints women face as a result of their multiple roles and to empower them with the support of families, governmental policies and legislation, employers, health professionals, and the media.

  15. 7 CFR 1980.476 - Transfer and assumptions.

    Science.gov (United States)

    2010-01-01

    ...-354 449-30 to recover its pro rata share of the actual loss at that time. In completing Form FmHA or... the lender on liquidations and property management. A. The State Director may approve all transfer and... Director will notify the Finance Office of all approved transfer and assumption cases on Form FmHA or its...

  16. Rethinking our assumptions about the evolution of bird song and other sexually dimorphic signals

    Directory of Open Access Journals (Sweden)

    J. Jordan Price

    2015-04-01

    Bird song is often cited as a classic example of a sexually-selected ornament, in part because historically it has been considered a primarily male trait. Recent evidence that females also sing in many songbird species, and that sexual dimorphism in song is often the result of losses in females rather than gains in males, therefore appears to challenge our understanding of the evolution of bird song through sexual selection. Here I propose that these new findings do not necessarily contradict previous research, but rather that they disagree with some of our assumptions about the evolution of sexual dimorphisms in general and female song in particular. These include misconceptions that current patterns of elaboration and diversity in each sex reflect past rates of change and that levels of sexual dimorphism necessarily reflect levels of sexual selection. Using New World blackbirds (Icteridae) as an example, I critically evaluate these past assumptions in light of new phylogenetic evidence. Understanding the mechanisms underlying such sexually dimorphic traits requires a clear understanding of their evolutionary histories. Only then can we begin to ask the right questions.

  17. On the "well-mixed" assumption and numerical 2-D tracing of atmospheric moisture

    Directory of Open Access Journals (Sweden)

    H. F. Goessling

    2013-06-01

    Atmospheric water vapour tracers (WVTs) are an elegant tool to determine source–sink relations of moisture "online" in atmospheric general circulation models (AGCMs). However, it is sometimes desirable to establish such relations "offline" based on already existing atmospheric data (e.g. reanalysis data). One simple and frequently applied offline method is 2-D moisture tracing. It makes use of the "well-mixed" assumption, which allows for treating the vertical dimension integratively. Here we scrutinise the "well-mixed" assumption and 2-D moisture tracing by means of analytical considerations in combination with AGCM-WVT simulations. We find that vertically well-mixed conditions are seldom met. Due to the presence of vertical inhomogeneities, 2-D moisture tracing (i) neglects a significant degree of fast-recycling, and (ii) results in erroneous advection where the direction of the horizontal winds varies vertically. The latter is not so much the case in the extratropics, but in the tropics this can lead to large errors. For example, computed by 2-D moisture tracing, the fraction of precipitation in the western Sahel that originates from beyond the Sahara is ~40%, whereas the fraction that originates from the tropical and Southern Atlantic is only ~4%. According to full (i.e. 3-D) moisture tracing, however, both regions contribute roughly equally, showing that the errors introduced by the 2-D approximation can be substantial.
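
    A two-layer toy column shows the advection problem in miniature: when winds reverse with height, the vertically integrated view assigns all moisture to the net transport direction. Numbers are invented for illustration:

      import numpy as np

      q = np.array([12.0, 3.0])    # moisture per layer (kg/m^2)
      u = np.array([-4.0, 10.0])   # zonal wind per layer (m/s), opposing

      flux_per_layer = q * u                   # [-48., 30.]
      u_column = np.average(u, weights=q)      # moisture-weighted mean wind
      flux_2d = q.sum() * u_column             # single-layer ("well-mixed") view

      print("per-layer fluxes:", flux_per_layer, "-> net", flux_per_layer.sum())
      print("2-D (well-mixed) flux:", flux_2d)
      # the 2-D view moves ALL moisture westward at the mean wind, although
      # the upper-layer moisture is actually being advected eastward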

  18. Estimation of bias with the single-zone assumption in measurement of residential air exchange using the perfluorocarbon tracer gas method.

    Science.gov (United States)

    Van Ryswyk, K; Wallace, L; Fugler, D; MacNeill, M; Héroux, M È; Gibson, M D; Guernsey, J R; Kindzierski, W; Wheeler, A J

    2015-12-01

    Residential air exchange rates (AERs) are vital in understanding the temporal and spatial drivers of indoor air quality (IAQ). Several methods to quantify AERs have been used in IAQ research, often with the assumption that the home is a single, well-mixed air zone. Since 2005, Health Canada has conducted IAQ studies across Canada in which AERs were measured using the perfluorocarbon tracer (PFT) gas method. Emitters and detectors of a single PFT gas were placed on the main floor to estimate a single-zone AER (AER(1z)). In three of these studies, a second set of emitters and detectors were deployed in the basement or second floor in approximately 10% of homes for a two-zone AER estimate (AER(2z)). In total, 287 daily pairs of AER(2z) and AER(1z) estimates were made from 35 homes across three cities. In 87% of the cases, AER(2z) was higher than AER(1z). Overall, the AER(1z) estimates underestimated AER(2z) by approximately 16% (IQR: 5-32%). This underestimate occurred in all cities and seasons and varied in magnitude seasonally, between homes, and daily, indicating that when measuring residential air exchange using a single PFT gas, the assumption of a single well-mixed air zone very likely results in an under prediction of the AER. The results of this study suggest that the long-standing assumption that a home represents a single well-mixed air zone may result in a substantial negative bias in air exchange estimates. Indoor air quality professionals should take this finding into consideration when developing study designs or making decisions related to the recommendation and installation of residential ventilation systems. © 2014 Her Majesty the Queen in Right of Canada. Indoor Air published by John Wiley & Sons Ltd Reproduced with the permission of the Minister of Health Canada.
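
    The single-zone logic is a steady-state mass balance: tracer emission E is balanced by removal, so AER = E / (C * V). The bias arises when one main-floor concentration stands in for a home with distinct zones. A toy version (all numbers hypothetical, and the two-zone line is a crude average, not the study's full multizone treatment):

      E = 2.0        # tracer emission rate (mL/h)
      V = 500.0      # house volume (m^3)

      def aer(C):    # C: tracer concentration (mL/m^3)
          return E / (C * V)

      C_main, C_basement = 0.010, 0.006
      aer_1z = aer(C_main)                     # single-zone estimate
      aer_2z = aer((C_main + C_basement) / 2)  # crude two-zone average

      print(f"AER(1z) = {aer_1z:.2f}/h, AER(2z) = {aer_2z:.2f}/h, "
            f"underestimate = {(1 - aer_1z / aer_2z) * 100:.0f}%")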

  19. Sensitivity of fluvial sediment source apportionment to mixing model assumptions: A Bayesian model comparison.

    Science.gov (United States)

    Cooper, Richard J; Krueger, Tobias; Hiscock, Kevin M; Rawlins, Barry G

    2014-11-01

    Mixing models have become increasingly common tools for apportioning fluvial sediment load to various sediment sources across catchments using a wide variety of Bayesian and frequentist modeling approaches. In this study, we demonstrate how different model setups can impact upon resulting source apportionment estimates in a Bayesian framework via a one-factor-at-a-time (OFAT) sensitivity analysis. We formulate 13 versions of a mixing model, each with different error assumptions and model structural choices, and apply them to sediment geochemistry data from the River Blackwater, Norfolk, UK, to apportion suspended particulate matter (SPM) contributions from three sources (arable topsoils, road verges, and subsurface material) under base flow conditions between August 2012 and August 2013. Whilst all 13 models estimate subsurface sources to be the largest contributor of SPM (median ∼76%), comparison of apportionment estimates reveals varying degrees of sensitivity to changing priors, inclusion of covariance terms, incorporation of time-variant distributions, and methods of proportion characterization. We also demonstrate differences in apportionment results between a full and an empirical Bayesian setup, and between a Bayesian and a frequentist optimization approach. This OFAT sensitivity analysis reveals that mixing model structural choices and error assumptions can significantly impact upon sediment source apportionment results, with estimated median contributions in this study varying by up to 21% between model versions. Users of mixing models are therefore strongly advised to carefully consider and justify their choice of model structure prior to conducting sediment source apportionment investigations. Key points: an OFAT sensitivity analysis of sediment fingerprinting mixing models is conducted; Bayesian models display high sensitivity to error assumptions and structural choices; source apportionment results differ between Bayesian and frequentist approaches.
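
    At its core, any such mixing model expresses the target geochemistry as a convex combination of source signatures; the Bayesian machinery then layers priors and error structure on top. A bare-bones frequentist version for orientation (all signatures invented):

      import numpy as np
      from scipy.optimize import nnls

      sources = ["arable topsoil", "road verge", "subsurface"]
      A = np.array([           # rows: tracers, columns: source signatures
          [12.0,  5.0, 20.0],
          [ 3.0,  9.0,  1.5],
          [40.0, 22.0, 61.0],
      ])
      spm = np.array([17.0, 2.4, 53.5])   # measured target signature

      p, _ = nnls(A, spm)      # non-negative least squares
      p = p / p.sum()          # normalize to source proportions
      for name, frac in zip(sources, p):
          print(f"{name:15s} {frac:6.1%}")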

  20. Interface Input/Output Automata: Splitting Assumptions from Guarantees

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Nyman, Ulrik; Wasowski, Andrzej

    2006-01-01

    's \\IOAs [11], relying on a context dependent notion of refinement based on relativized language inclusion. There are two main contributions of the work. First, we explicitly separate assumptions from guarantees, increasing the modeling power of the specification language and demonstrating an interesting...

  1. Untested assumptions: psychological research and credibility assessment in legal decision-making

    Directory of Open Access Journals (Sweden)

    Jane Herlihy

    2015-05-01

    Background: Trauma survivors often have to negotiate legal systems such as refugee status determination or the criminal justice system. Methods & results: We outline and discuss the contribution which research on trauma and related psychological processes can make to two particular areas of law where complex and difficult legal decisions must be made: in claims for refugee and humanitarian protection, and in reporting and prosecuting sexual assault in the criminal justice system. Conclusion: There is a breadth of psychological knowledge that, if correctly applied, would limit the inappropriate reliance on assumptions and myth in legal decision-making in these settings. Specific recommendations are made for further study.

  2. The ruin probability of a discrete time risk model under constant interest rate with heavy tails

    NARCIS (Netherlands)

    Tang, Q.

    2004-01-01

    This paper investigates the ultimate ruin probability of a discrete time risk model with a positive constant interest rate. Under the assumption that the gross loss of the company within one year is subexponentially distributed, a simple asymptotic relation for the ruin probability is derived and

  3. Posttraumatic Growth and Shattered World Assumptions Among Ex-POWs

    DEFF Research Database (Denmark)

    Lahav, Y.; Bellin, Elisheva S.; Solomon, Z.

    2016-01-01

    Objective: The controversy regarding the nature of posttraumatic growth (PTG) includes two main competing claims: one which argues that PTG reflects authentic positive changes and the other which argues that PTG reflects illusionary defenses. The former also suggests that PTG evolves from shattered world assumptions (WAs) and that the co-occurrence of high PTG and negative WAs among trauma survivors reflects reconstruction of an integrative belief system. The present study aimed to test these claims by investigating, for the first time, the mediating role of dissociation in the relation between PTG and WAs. Method: Former prisoners of war (ex-POWs; n = 158) and comparable controls (n = 106) were assessed 38 years after the Yom Kippur War. Results: Ex-POWs endorsed more negative WAs and higher PTG and dissociation compared to controls. Ex-POWs with posttraumatic stress disorder (PTSD...

  4. Testing the assumption of normality in body sway area calculations during unipedal stance tests with an inertial sensor.

    Science.gov (United States)

    Kyoung Jae Kim; Lucarevic, Jennifer; Bennett, Christopher; Gaunaurd, Ignacio; Gailey, Robert; Agrawal, Vibhor

    2016-08-01

    The quantification of postural sway during the unipedal stance test is one of the essentials of posturography. A shift of center of pressure (CoP) is an indirect measure of postural sway and also a measure of a person's ability to maintain balance. A widely used method in laboratory settings to calculate the sway of body center of mass (CoM) is through an ellipse that encloses 95% of CoP trajectory. The 95% ellipse can be computed under the assumption that the spatial distribution of the CoP points recorded from force platforms is normal. However, to date, this assumption of normality has not been demonstrated for sway measurements recorded from a sacral inertial measurement unit (IMU). This work provides evidence for non-normality of sway trajectories calculated at a sacral IMU with injured subjects as well as healthy subjects.
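
    The normality assumption enters through the standard area formula: area = pi * chi2(0.95, 2) * sqrt(lambda_1 * lambda_2), where the lambdas are the eigenvalues of the 2x2 covariance of the sway excursions. A sketch with simulated (deliberately Gaussian) data; real IMU sway data, per the paper, need not satisfy this:

      import numpy as np
      from scipy.stats import chi2

      rng = np.random.default_rng(1)
      ap = rng.normal(0, 0.8, 1000)                # anterior-posterior (cm)
      ml = 0.4 * ap + rng.normal(0, 0.5, 1000)     # correlated medio-lateral

      cov = np.cov(np.vstack([ap, ml]))
      lam = np.linalg.eigvalsh(cov)                # eigenvalues of covariance
      area = np.pi * chi2.ppf(0.95, df=2) * np.sqrt(lam.prod())
      print(f"95% sway ellipse area: {area:.2f} cm^2")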

  5. Modelling sexual transmission of HIV: testing the assumptions, validating the predictions

    Science.gov (United States)

    Baggaley, Rebecca F.; Fraser, Christophe

    2010-01-01

    Purpose of review To discuss the role of mathematical models of sexual transmission of HIV: the methods used and their impact. Recent findings We use mathematical modelling of “universal test and treat” as a case study to illustrate wider issues relevant to all modelling of sexual HIV transmission. Summary Mathematical models are used extensively in HIV epidemiology to deduce the logical conclusions arising from one or more sets of assumptions. Simple models lead to broad qualitative understanding, while complex models can encode more realistic assumptions and thus be used for predictive or operational purposes. An overreliance on model analysis where assumptions are untested and input parameters cannot be estimated should be avoided. Simple models providing bold assertions have provided compelling arguments in recent public health policy, but may not adequately reflect the uncertainty inherent in the analysis. PMID:20543600

  6. Efficient pseudorandom generators based on the DDH assumption

    NARCIS (Netherlands)

    Rezaeian Farashahi, R.; Schoenmakers, B.; Sidorenko, A.; Okamoto, T.; Wang, X.

    2007-01-01

    A family of pseudorandom generators based on the decisional Diffie-Hellman assumption is proposed. The new construction is a modified and generalized version of the Dual Elliptic Curve generator proposed by Barker and Kelsey. Although the original Dual Elliptic Curve generator is shown to be

  7. Limiting assumptions in molecular modeling: electrostatics.

    Science.gov (United States)

    Marshall, Garland R

    2013-02-01

    Molecular mechanics attempts to represent intermolecular interactions in terms of classical physics. Initial efforts assumed a point charge located at the atom center and coulombic interactions. It has been recognized over multiple decades that simply representing electrostatics with a charge on each atom failed to reproduce the electrostatic potential surrounding a molecule as estimated by quantum mechanics. Molecular orbitals are not spherically symmetrical, an implicit assumption of monopole electrostatics. This perspective reviews recent evidence that requires the use of multipole electrostatics and polarizability in molecular modeling.
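
    A two-point-charge toy makes the monopole limitation concrete: a neutral polar group has zero net charge, so its monopole potential vanishes everywhere, while the true potential decays as a dipole (~1/r^2). Charges and separation below are arbitrary:

      # Axial potential of a +q/-q pair vs. its multipole approximations.
      q, d = 1.0, 0.2          # charge and separation (arbitrary units)
      for r in (1.0, 2.0, 5.0):
          exact = q / (r - d / 2) - q / (r + d / 2)  # two real charges
          dipole = q * d / r**2                      # leading multipole term
          print(f"r={r}: exact={exact:.4f}, dipole={dipole:.4f}, monopole=0")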

  8. Sensitivity of Earthquake Loss Estimates to Source Modeling Assumptions and Uncertainty

    Science.gov (United States)

    Reasenberg, Paul A.; Shostak, Nan; Terwilliger, Sharon

    2006-01-01

    Introduction: This report explores how uncertainty in an earthquake source model may affect estimates of earthquake economic loss. Specifically, it focuses on the earthquake source model for the San Francisco Bay region (SFBR) created by the Working Group on California Earthquake Probabilities. The loss calculations are made using HAZUS-MH, a publicly available computer program developed by the Federal Emergency Management Agency (FEMA) for calculating future losses from earthquakes, floods and hurricanes within the United States. The database built into HAZUS-MH includes a detailed building inventory, population data, data on transportation corridors, bridges, utility lifelines, etc. Earthquake hazard in the loss calculations is based upon expected (median value) ground motion maps called ShakeMaps calculated for the scenario earthquake sources defined in WGCEP. The study considers the effect of relaxing certain assumptions in the WG02 model, and explores the effect of hypothetical reductions in epistemic uncertainty in parts of the model. For example, it addresses questions such as what would happen to the calculated loss distribution if the uncertainty in slip rate in the WG02 model were reduced (say, by obtaining additional geologic data)? What would happen if the geometry or amount of aseismic slip (creep) on the region's faults were better known? And what would be the effect on the calculated loss distribution if the time-dependent earthquake probability were better constrained, either by eliminating certain probability models or by better constraining the inherent randomness in earthquake recurrence? The study does not consider the effect of reducing uncertainty in the hazard introduced through models of attenuation and local site characteristics, although these may have a comparable or greater effect than does source-related uncertainty. Nor does it consider sources of uncertainty in the building inventory, building fragility curves, and other assumptions

  9. Sensitivity of TRIM projections to management, harvest, yield, and stocking adjustment assumptions.

    Science.gov (United States)

    Susan J. Alexander

    1991-01-01

    The Timber Resource Inventory Model (TRIM) was used to make several projections of forest industry timber supply for the Douglas-fir region. The sensitivity of these projections to assumptions about management and yields is discussed. A base run is compared to runs in which yields were altered, stocking adjustment was eliminated, harvest assumptions were changed, and...

  10. Validity of the mockwitness paradigm: testing the assumptions.

    Science.gov (United States)

    McQuiston, Dawn E; Malpass, Roy S

    2002-08-01

    Mockwitness identifications are used to provide a quantitative measure of lineup fairness. Some theoretical and practical assumptions of this paradigm have not been studied in terms of mockwitnesses' decision processes and procedural variation (e.g., instructions, lineup presentation method), and the current experiment was conducted to empirically evaluate these assumptions. Four hundred and eighty mockwitnesses were given physical information about a culprit, received 1 of 4 variations of lineup instructions, and were asked to identify the culprit from either a fair or unfair sequential lineup containing 1 of 2 targets. Lineup bias estimates varied as a result of lineup fairness and the target presented. Mockwitnesses generally reported that the target's physical description was their main source of identifying information. Our findings support the use of mockwitness identifications as a useful technique for sequential lineup evaluation, but only for mockwitnesses who selected only 1 lineup member. Recommendations for the use of this evaluation procedure are discussed.

  11. Pre-equilibrium assumptions and statistical model parameters effects on reaction cross-section calculations

    International Nuclear Information System (INIS)

    Avrigeanu, M.; Avrigeanu, V.

    1992-02-01

    A systematic study of the effects of statistical model parameters and semi-classical pre-equilibrium emission models has been carried out for the (n,p) reactions on the 56Fe and 60Co target nuclei. The results obtained by using various assumptions within a given pre-equilibrium emission model differ among themselves more than the results of different models used under similar conditions. The necessity of using realistic level density formulas is emphasized, especially in connection with pre-equilibrium emission models (i.e. with the exciton state density expression), while a basic support could be found only by replacing the Williams exciton state density formula with a realistic one. (author). 46 refs, 12 figs, 3 tabs

  12. A method of statistical analysis in the field of sports science when assumptions of parametric tests are violated

    Directory of Open Access Journals (Sweden)

    Elżbieta Sandurska

    2016-12-01

    Introduction: Application of statistical software typically does not require extensive statistical knowledge, allowing even complex analyses to be performed easily. Consequently, test selection criteria and important assumptions may be easily overlooked or given insufficient consideration. In such cases, the results may well lead to wrong conclusions. Aim: To discuss issues related to assumption violations in the case of Student's t-test and one-way ANOVA, two parametric tests frequently used in the field of sports science, and to recommend solutions. Description of the state of knowledge: Student's t-test and ANOVA are parametric tests, and therefore some of the assumptions that need to be satisfied include normal distribution of the data and homogeneity of variances in groups. If the assumptions are violated, the original design of the test is impaired, and the test may then be compromised, giving spurious results. A simple method to normalize the data and to stabilize the variance is to use transformations. If such an approach fails, a good alternative to consider is a nonparametric test, such as the Mann-Whitney, Kruskal-Wallis or Wilcoxon signed-rank tests. Summary: Thorough verification of the assumptions of parametric tests allows for the correct selection of statistical tools, which is the basis of well-grounded statistical analysis. With a few simple rules, testing patterns in data characteristic of sports science studies comes down to a straightforward procedure.
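
    A compact version of the recommended workflow, using SciPy's standard tests (thresholds and data are illustrative only):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      group_a = rng.normal(50, 8, 30)        # e.g. jump height (cm), group A
      group_b = rng.lognormal(3.9, 0.2, 30)  # skewed comparison group

      normal = all(stats.shapiro(g).pvalue > 0.05 for g in (group_a, group_b))
      equal_var = stats.levene(group_a, group_b).pvalue > 0.05

      if normal and equal_var:
          p = stats.ttest_ind(group_a, group_b).pvalue
          print(f"Student's t-test: p = {p:.3f}")
      else:
          p = stats.mannwhitneyu(group_a, group_b).pvalue
          print(f"Mann-Whitney U (assumptions violated): p = {p:.3f}")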

  13. Child Development Knowledge and Teacher Preparation: Confronting Assumptions.

    Science.gov (United States)

    Katz, Lilian G.

    This paper questions the widely held assumption that acquiring knowledge of child development is an essential part of teacher preparation and teaching competence, especially among teachers of young children. After discussing the influence of culture, parenting style, and teaching style on developmental expectations and outcomes, the paper asserts…

  14. Is special relativity logically inconsistent

    International Nuclear Information System (INIS)

    Prokhovnik, S.J.

    1980-01-01

    The author gives his view that Special Relativity is logically and mathematically consistent, as well as physically comprehensible if, and only if, it is firmly based on the single assumption of a unique fundamental reference frame for light propagation. The theory and all its results are derivable from this assumption; the Relativity and Light Principles become intelligible consequences of this assumption; the physical significance and source of time dilation and length contraction are made manifest thereby. (Auth.)

  15. Low Streamflow Forecasting using Minimum Relative Entropy

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2013-12-01

    Minimum relative entropy spectral analysis is derived in this study and applied to forecast streamflow time series. The proposed method extends the autocorrelation in such a manner that the relative entropy of the underlying process is minimized, so that the time series can be forecasted. Different prior estimates, such as uniform, exponential, and Gaussian assumptions, are used to estimate the spectral density, depending on the autocorrelation structure. Seasonal and nonseasonal low streamflow series obtained from the Colorado River (Texas) under drought conditions are successfully forecasted using the proposed method. Minimum relative entropy determines the spectral density of low streamflow series with higher resolution than conventional methods. The forecasted streamflow is compared to predictions using Burg's maximum entropy spectral analysis (MESA) and configurational entropy. The advantages and disadvantages of each method in forecasting low streamflow are discussed.
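
    In symbols (our restatement from the standard definition, since the abstract does not reproduce it): given a prior spectral density q(f), the method selects the density p(f) that minimizes the relative (Kullback-Leibler) entropy subject to the observed autocorrelation constraints,

        \[
          \min_{p}\; D(p\,\|\,q) = \int p(f)\,\ln\frac{p(f)}{q(f)}\,df
          \quad \text{s.t.} \quad
          \int p(f)\,e^{\,i 2\pi f n \Delta t}\,df = r(n), \qquad n = 0,\dots,N,
        \]

    where the r(n) are the sample autocorrelations; uniform, exponential, or Gaussian choices of q(f) correspond to the prior assumptions mentioned above.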

  16. Making Predictions about Chemical Reactivity: Assumptions and Heuristics

    Science.gov (United States)

    Maeyer, Jenine; Talanquer, Vicente

    2013-01-01

    Diverse implicit cognitive elements seem to support but also constrain reasoning in different domains. Many of these cognitive constraints can be thought of as either implicit assumptions about the nature of things or reasoning heuristics for decision-making. In this study we applied this framework to investigate college students' understanding of…

  17. Problematic assumptions have slowed down depression research: why symptoms, not syndromes are the way forward

    Directory of Open Access Journals (Sweden)

    Eiko I Fried

    2015-03-01

    Full Text Available Major Depression (MD) is a highly heterogeneous diagnostic category. Diverse symptoms such as sad mood, anhedonia, and fatigue are routinely added to an unweighted sum-score, and cutoffs are used to distinguish between depressed participants and healthy controls. Researchers then investigate outcome variables like MD risk factors, biomarkers, and treatment response in such samples. These practices presuppose that (1) depression is a discrete condition, and that (2) symptoms are interchangeable indicators of this latent disorder. Here I review these two assumptions, elucidate their historical roots, show how deeply engrained they are in psychological and psychiatric research, and document that they contrast with evidence. Depression is not a consistent syndrome with clearly demarcated boundaries, and depression symptoms are not interchangeable indicators of an underlying disorder. Current research practices lump individuals with very different problems into one category, which has contributed to the remarkably slow progress in key research domains such as the development of efficacious antidepressants or the identification of biomarkers for depression. The recently proposed network framework offers an alternative to the problematic assumptions. MD is not understood as a distinct condition, but as a heterogeneous symptom cluster that substantially overlaps with other syndromes such as anxiety disorders. MD is not framed as an underlying disease with a number of equivalent indicators, but as a network of symptoms that have direct causal influence on each other: insomnia can cause fatigue, which then triggers concentration and psychomotor problems. This approach offers new opportunities for constructing an empirically based classification system and has broad implications for future research.
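
    A toy sketch of the practice being criticized (hypothetical symptom names and cutoff, chosen only for illustration): two patients sharing just one symptom receive the same sum-score and the same diagnosis, which is exactly the interchangeability assumption at work.

        # Hypothetical DSM-style symptom indicators (0 = absent, 1 = present).
        symptoms = ["sad_mood", "anhedonia", "fatigue", "insomnia", "concentration",
                    "psychomotor", "appetite", "guilt", "suicidality"]

        patient_1 = {"sad_mood": 1, "anhedonia": 1, "fatigue": 1,
                     "insomnia": 1, "concentration": 1}
        patient_2 = {"psychomotor": 1, "appetite": 1, "guilt": 1,
                     "suicidality": 1, "insomnia": 1}

        CUTOFF = 5  # assumed diagnostic threshold

        def sum_score(patient: dict) -> int:
            # Unweighted sum-score: every symptom counts the same.
            return sum(patient.get(s, 0) for s in symptoms)

        for name, p in [("patient_1", patient_1), ("patient_2", patient_2)]:
            score = sum_score(p)
            print(name, score, "depressed" if score >= CUTOFF else "healthy")
        # Both score 5 and are labeled "depressed", despite sharing only insomnia.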

  18. Dialogic or Dialectic? The Significance of Ontological Assumptions in Research on Educational Dialogue

    Science.gov (United States)

    Wegerif, Rupert

    2008-01-01

    This article explores the relationship between ontological assumptions and studies of educational dialogue through a focus on Bakhtin's "dialogic". The term dialogic is frequently appropriated to a modernist framework of assumptions, in particular the neo-Vygotskian or sociocultural tradition. However, Vygotsky's theory of education is dialectic,…

  19. Supporting calculations and assumptions for use in WESF safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hey, B.E.

    1997-03-07

    This document provides a single location for calculations and assumptions used in support of Waste Encapsulation and Storage Facility (WESF) safety analyses. It also provides the technical details and bases necessary to justify the contained results.

  20. The effective Hamiltonian in curved quantum waveguides under mild regularity assumptions

    Czech Academy of Sciences Publication Activity Database

    Krejčiřík, David; Šediváková, Helena

    2012-01-01

    Roč. 24, č. 7 (2012), 1250018/1-1250018/39 ISSN 0129-055X R&D Projects: GA MŠk LC06002; GA ČR GAP203/11/0701 Institutional support: RVO:61389005 Keywords : quantum waveguides * thin-width limit * effective Hamiltonian * twisting versus bending * norm-resolvent convergence * Dirichlet Laplacian * curved tubes * relatively parallel frame * Steklov approximation Subject RIV: BE - Theoretical Physics Impact factor: 1.092, year: 2012

  1. Evaluating The Markov Assumption For Web Usage Mining

    DEFF Research Database (Denmark)

    Jespersen, S.; Pedersen, Torben Bach; Thorhauge, J.

    2003-01-01

    ...) model [borges99data]. These techniques typically rely on the Markov assumption with history depth n, i.e., it is assumed that the next requested page depends only on the last n pages visited. This is not always valid, i.e., false browsing patterns may be discovered. However, to our...
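
    A minimal sketch of the Markov assumption with history depth n (the function names and toy sessions below are assumptions of this illustration, not the paper's system): the next-page distribution is conditioned only on the last n requests.

        from collections import Counter, defaultdict

        def fit_markov(sessions, n=2):
            """Count next-page frequencies conditioned on the last n pages."""
            model = defaultdict(Counter)
            for session in sessions:
                for i in range(n, len(session)):
                    history = tuple(session[i - n:i])  # only the last n pages matter
                    model[history][session[i]] += 1
            return model

        def predict(model, history, n=2):
            counts = model.get(tuple(history[-n:]))
            return counts.most_common(1)[0][0] if counts else None

        sessions = [["home", "search", "item", "cart"],
                    ["home", "search", "item", "home"],
                    ["blog", "search", "item", "cart"]]
        model = fit_markov(sessions, n=2)
        print(predict(model, ["home", "search", "item"], n=2))  # -> "cart"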

  2. Risk-related behaviour under different ambient scent conditions

    Directory of Open Access Journals (Sweden)

    Alina Gagarina

    2016-09-01

    Full Text Available The article analyses the effect of two ambient scents (peppermint and vanilla) and their intensiveness on risk-related behaviour that is expressed through selected decision-making heuristics. Purpose of the article: The purpose of this article is to identify the relationship of ambient scent type and intensiveness with risk-related behaviour that is expressed through selected decision-making heuristics. Methodology/methods: A 2×2 factorial experiment with a control group was run. Ambient scent type (vanilla vs. peppermint) and intensiveness (8 sprays (1 mg) vs. 16 sprays (2 mg) of scent concentrate in the same room) were manipulated as between-subject variables. Risk aversion, the effect of the anchoring heuristic on bidding, and affect (risk and benefit) heuristics were tracked as dependent variables. Scientific aim: To identify whether ambient scent type and intensiveness have an effect on risk-related behaviour. Findings: Evidence suggests that there are effects of ambient scent on risk-related behaviour, thus filling the missing gap relating the ambient environment to decision-making heuristics when risks are involved. However, not all heuristics were affected by the experimental conditions. Subjects were bidding significantly higher amounts under low anchor conditions when the peppermint scent was around (compared to the vanilla group). Affect risk was perceived as lower in peppermint ambient scent conditions compared to the control group. Intensity of ambient scent also had an influence on affect risk: subjects perceived less risk under high scent intensity conditions. Conclusions: By manipulating ambient scent, marketers may reduce or increase consumers' risk perception and behaviour and, as a consequence, influence their purchase decisions. Marketers could use peppermint scent in high intensiveness in situations where they want consumers to undertake higher risks (expensive purchases, gambling, insurance), since stakes were higher under the peppermint ambient scent condition.

  3. Axion: Mass -- Dark Matter Abundance Relation

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The axion is a hypothetical particle which would explain why QCD is approximately T-conserving, and is also an excellent Cold Dark Matter candidate. It should be possible to make a clean theoretical prediction relating the dark matter density in axions and the axion mass (under reasonable assumptions about inflation). But the axion's early-Universe dynamics, which establish its density as dark matter, are unexpectedly rich in a way which is only starting to yield to quantitative numerical study.

  4. Using different assumptions of aerosol mixing state and chemical composition to predict CCN concentrations based on field measurements in urban Beijing

    Science.gov (United States)

    Ren, Jingye; Zhang, Fang; Wang, Yuying; Collins, Don; Fan, Xinxin; Jin, Xiaoai; Xu, Weiqi; Sun, Yele; Cribb, Maureen; Li, Zhanqing

    2018-05-01

    Understanding the impacts of aerosol chemical composition and mixing state on cloud condensation nuclei (CCN) activity in polluted areas is crucial for accurately predicting CCN number concentrations (N_CCN). In this study, we predict N_CCN under five assumed schemes of aerosol chemical composition and mixing state based on field measurements in Beijing during the winter of 2016. Our results show that the best closure is achieved with the assumption of size-dependent chemical composition, for which sulfate, nitrate, secondary organic aerosols, and aged black carbon are internally mixed with each other but externally mixed with primary organic aerosol and fresh black carbon (external-internal size-resolved, abbreviated as the EI-SR scheme). The resulting ratios of predicted-to-measured N_CCN (R_CCN_p/m) were 0.90–0.98 under both clean and polluted conditions. The assumption of an internal mixture with bulk chemical composition (INT-BK scheme) shows good closure, with R_CCN_p/m of 1.0–1.16 under clean conditions, implying that it is adequate for CCN prediction in continental clean regions. On polluted days, assuming the aerosol is internally mixed with size-dependent chemical composition (INT-SR scheme) achieves better closure than the INT-BK scheme due to the heterogeneity and variation in particle composition at different sizes. The improved closure achieved using the EI-SR and INT-SR assumptions highlights the importance of measuring size-resolved chemical composition for CCN predictions in polluted regions. N_CCN is significantly underestimated (with R_CCN_p/m of 0.66–0.75) when using schemes of external mixtures with bulk (EXT-BK scheme) or size-resolved composition (EXT-SR scheme), implying that primary particles undergo rapid aging and physical mixing processes in urban Beijing. However, our results show that the aerosol mixing state plays a minor role in CCN prediction when κ_org exceeds 0.1.
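
    To see what the mixing-state schemes imply numerically, here is a sketch using the standard κ-Köhler approximation of Petters and Kreidenweis (2007); the κ values, volume fractions, and particle size below are assumptions for illustration, not the paper's measurements or closure code.

        import numpy as np

        # Assumed hygroscopicity parameters (kappa) and volume fractions.
        KAPPA = {"sulfate": 0.61, "nitrate": 0.67, "organics": 0.10, "black_carbon": 0.0}
        VOLUME_FRACTION = {"sulfate": 0.3, "nitrate": 0.2, "organics": 0.4, "black_carbon": 0.1}

        A = 2.1e-9  # Kelvin-term coefficient at ~298 K, in metres

        def critical_supersaturation(kappa: float, d_dry: float) -> float:
            """kappa-Koehler critical supersaturation (fraction) for dry diameter d_dry [m]."""
            if kappa <= 0:
                return np.inf  # non-hygroscopic particles effectively never activate
            return np.sqrt(4.0 * A**3 / (27.0 * kappa * d_dry**3))

        d = 100e-9  # 100 nm dry diameter

        # Internal mixture: one volume-weighted kappa applies to every particle.
        kappa_int = sum(VOLUME_FRACTION[s] * KAPPA[s] for s in KAPPA)
        print(f"internal mixture      Sc = {critical_supersaturation(kappa_int, d):.3%}")

        # External mixture: each component activates according to its own kappa.
        for species, kappa in KAPPA.items():
            print(f"external {species:13s} Sc = {critical_supersaturation(kappa, d):.3%}")

    Because the external-mixture scheme lets the non-hygroscopic components drop out of the CCN budget entirely, it underpredicts N_CCN whenever fresh particles have in fact aged into internal mixtures, which is the behaviour the closure ratios above quantify.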

  5. Detecting and accounting for violations of the constancy assumption in non-inferiority clinical trials.

    Science.gov (United States)

    Koopmeiners, Joseph S; Hobbs, Brian P

    2018-05-01

    Randomized, placebo-controlled clinical trials are the gold standard for evaluating a novel therapeutic agent. In some instances, it may not be considered ethical or desirable to complete a placebo-controlled clinical trial and, instead, the placebo is replaced by an active comparator with the objective of showing either superiority or non-inferiority to the active comparator. In a non-inferiority trial, the experimental treatment is considered non-inferior if it retains a pre-specified proportion of the effect of the active comparator as represented by the non-inferiority margin. A key assumption required for valid inference in the non-inferiority setting is the constancy assumption, which requires that the effect of the active comparator in the non-inferiority trial is consistent with the effect that was observed in previous trials. It has been shown that violations of the constancy assumption can result in a dramatic increase in the rate of incorrectly concluding non-inferiority in the presence of ineffective or even harmful treatment. In this paper, we illustrate how Bayesian hierarchical modeling can be used to facilitate multi-source smoothing of the data from the current trial with the data from historical studies, enabling direct probabilistic evaluation of the constancy assumption. We then show how this result can be used to adapt the non-inferiority margin when the constancy assumption is violated and present simulation results illustrating that our method controls the type-I error rate when the constancy assumption is violated, while retaining the power of the standard approach when the constancy assumption holds. We illustrate our adaptive procedure using a non-inferiority trial of raltegravir, an antiretroviral drug for the treatment of HIV.
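
    To make the constancy check concrete, here is a simplified sketch using a plain z-comparison of pooled historical and current comparator effects (an illustration under normal approximations with hypothetical numbers, not the authors' Bayesian hierarchical procedure):

        import numpy as np
        from scipy import stats

        # Historical estimates of the active comparator's effect vs placebo
        # (log hazard ratios with standard errors; values are hypothetical).
        hist_effects = np.array([-0.35, -0.42, -0.30])
        hist_se = np.array([0.10, 0.12, 0.11])

        # Inverse-variance pooled historical effect.
        w = 1.0 / hist_se**2
        pooled = np.sum(w * hist_effects) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))

        # Comparator effect estimated in the current trial setting (hypothetical).
        current, current_se = -0.15, 0.09

        # z-test for a shift between historical and current comparator effects.
        z = (current - pooled) / np.sqrt(pooled_se**2 + current_se**2)
        p_value = 2 * stats.norm.sf(abs(z))
        print(f"pooled = {pooled:.3f}, z = {z:.2f}, p = {p_value:.3f}")
        if p_value < 0.05:
            print("constancy questionable -> consider adapting the non-inferiority margin")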

  6. Towards New Probabilistic Assumptions in Business Intelligence

    OpenAIRE

    Schumann Andrew; Szelc Andrzej

    2015-01-01

    One of the main assumptions of mathematical tools in science is represented by the idea of measurability and additivity of reality. For discovering the physical universe additive measures such as mass, force, energy, temperature, etc. are used. Economics and conventional business intelligence try to continue this empiricist tradition and in statistical and econometric tools they appeal only to the measurable aspects of reality. However, a lot of important variables of economic systems cannot ...

  7. Assumption- versus data-based approaches to summarizing species' ranges.

    Science.gov (United States)

    Peterson, A Townsend; Navarro-Sigüenza, Adolfo G; Gordillo, Alejandro

    2018-06-01

    For conservation decision making, species' geographic distributions are mapped using various approaches. Some such efforts have downscaled versions of coarse-resolution extent-of-occurrence maps to fine resolutions for conservation planning. We examined the quality of the extent-of-occurrence maps as range summaries and the utility of refining those maps into fine-resolution distributional hypotheses. Extent-of-occurrence maps tend to be overly simple, omit many known and well-documented populations, and likely frequently include many areas not holding populations. Refinement steps involve typological assumptions about habitat preferences and elevational ranges of species, which can introduce substantial error in estimates of species' true areas of distribution. However, no model-evaluation steps are taken to assess the predictive ability of these models, so model inaccuracies are not noticed. Whereas range summaries derived by these methods may be useful in coarse-grained, global-extent studies, their continued use in on-the-ground conservation applications at fine spatial resolutions is not advisable in light of reliance on assumptions, lack of real spatial resolution, and lack of testing. In contrast, data-driven techniques that integrate primary data on biodiversity occurrence with remotely sensed data that summarize environmental dimensions (i.e., ecological niche modeling or species distribution modeling) offer data-driven solutions based on a minimum of assumptions that can be evaluated and validated quantitatively to offer a well-founded, widely accepted method for summarizing species' distributional patterns for conservation applications. © 2016 Society for Conservation Biology.

  8. Dust mobilization by high-speed vapor flow under LOVA

    International Nuclear Information System (INIS)

    Matsuki, K.; Suzuki, S.; Ebara, S.; Yokomine, T.; Shimizu, A.

    2006-01-01

    In the safety analysis of the International Thermonuclear Experimental Reactor (ITER), the ingress of coolant (ICE) event and the loss of vacuum (LOVA) event are considered among the most serious accidents. On the assumption of a LOVA occurring after an ICE, it can be inferred that activated dusts are in a wet condition. The transport behavior of in-vessel activated dusts under wet conditions is not as well understood as the dry case. In this study, we experimentally investigated the entrainment behavior of dust under a LOVA following an ICE. We measured dust entrainment by high-speed humid airflow with phase change. Graphite dusts and glass beads were used as substitutes for the mobile inventory. The relations among the relative humidity, the entrainment of particles in the exhaust gas flow, and the adhesion rate of dust particles on the pipe wall have been clarified, as has the distribution profile of dust deposition on the pipe wall. The entrainment ratio decreased as the relative humidity increased and increased as the initial pressure difference increased.

  9. Halo-Independent Direct Detection Analyses Without Mass Assumptions

    CERN Document Server

    Anderson, Adam J.; Kahn, Yonatan; McCullough, Matthew

    2015-10-06

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the $m_\\chi-\\sigma_n$ plane. Recently methods which are independent of the DM halo velocity distribution have been developed which present results in the $v_{min}-\\tilde{g}$ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from $v_{min}$ to nuclear recoil momentum ($p_R$), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call $\\tilde{h}(p_R)$. The entire family of conventional halo-independent $\\tilde{g}(v_{min})$ plots for all DM masses are directly found from the single $\\tilde{h}(p_R)$ plot through a simple re...
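
    For reference, the change of variables rests on standard elastic-scattering kinematics (our restatement, not quoted from the paper; here m_N is the nucleus mass, E_R the recoil energy, and μ_χN the dark-matter-nucleus reduced mass):

        \[
          p_R = \sqrt{2\, m_N E_R},
          \qquad
          v_{\min} = \frac{\sqrt{m_N E_R / 2}}{\mu_{\chi N}} = \frac{p_R}{2\,\mu_{\chi N}},
        \]

    so the detector-measured recoil momentum is independent of the dark matter mass, while all mass dependence is absorbed into the mapping between v_min and p_R.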

  10. Estimators for longitudinal latent exposure models: examining measurement model assumptions.

    Science.gov (United States)

    Sánchez, Brisa N; Kim, Sehee; Sammel, Mary D

    2017-06-15

    Latent variable (LV) models are increasingly being used in environmental epidemiology as a way to summarize multiple environmental exposures and thus minimize statistical concerns that arise in multiple regression. LV models may be especially useful when multivariate exposures are collected repeatedly over time. LV models can accommodate a variety of assumptions but, at the same time, present the user with many choices for model specification, particularly in the case of exposure data collected repeatedly over time. For instance, the user could assume conditional independence of observed exposure biomarkers given the latent exposure and, in the case of longitudinal latent exposure variables, time invariance of the measurement model. Choosing which assumptions to relax is not always straightforward. We were motivated by a study of prenatal lead exposure and mental development, where assumptions of the measurement model for the time-changing longitudinal exposure have appreciable impact on (maximum-likelihood) inferences about the health effects of lead exposure. Although we were not particularly interested in characterizing the change of the LV itself, imposing a longitudinal LV structure on the repeated multivariate exposure measures could result in high efficiency gains for the exposure-disease association. We examine the biases of maximum likelihood estimators when assumptions about the measurement model for the longitudinal latent exposure variable are violated. We adapt existing instrumental variable estimators to the case of longitudinal exposures and propose them as an alternative to estimate the health effects of a time-changing latent predictor. We show that instrumental variable estimators remain unbiased for a wide range of data generating models and have advantages in terms of mean squared error. Copyright © 2017 John Wiley & Sons, Ltd.
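
    As a sketch of the measurement-model choices at stake (notation assumed here for illustration; the paper's exact specification may differ), a longitudinal latent exposure model might read

        \[
          X_{ijt} = \alpha_{jt} + \lambda_{jt}\, U_{it} + \varepsilon_{ijt},
        \]

    where X_ijt is biomarker j for subject i at time t and U_it is the latent exposure. Conditional independence asserts that the ε_ijt are mutually independent given U_it, and time invariance of the measurement model imposes α_jt = α_j and λ_jt = λ_j for all t; these are precisely the kinds of assumptions whose violation biases the maximum likelihood estimators discussed above.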

  11. Observing gravitational-wave transient GW150914 with minimal assumptions

    NARCIS (Netherlands)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Phythian-Adams, A.T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwa, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. C.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, R.D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, M.J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blackburn, L.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, A.L.S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, J.G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, T.C; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brocki, P.; Brooks, A. F.; Brown, A.D.; Brown, D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderon Bustillo, J.; Callister, T. A.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Diaz, J. Casanueva; Casentini, C.; Caudill, S.; Cavaglia, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Baiardi, L. Cerboni; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chatterji, S.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, D. S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Qian; Chua, S. E.; Chung, E.S.; Ciani, G.; Clara, F.; Clark, J. A.; Clark, M.; Cleva, F.; Coccia, E.; Cohadon, P. -F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, A.C.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J. -P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, A.L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; Debra, D.; Debreczeni, G.; Degallaix, J.; De laurentis, M.; Deleglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.A.; DeRosa, R. T.; Rosa, R.; DeSalvo, R.; Dhurandhar, S.; Diaz, M. C.; Di Fiore, L.; Giovanni, M.G.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H. 
-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, T. M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.M.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. R.; Flaminio, R.; Fletcher, M; Fournier, J. -D.; Franco, S; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritsche, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.P.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; Gonzalez, Idelmis G.; Castro, J. M. Gonzalez; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Lee-Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.M.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; de Haas, R.; Hacker, J. J.; Buffoni-Hall, R.; Hall, E. D.; Hammond, G.L.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, P.J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C. -J.; Haughian, K.; Healy, J.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hinder, I.; Hoak, D.; Hodge, K. A.; Hofman, D.; Hollitt, S. E.; Holt, K.; Holz, D. E.; Hopkins, P.; Hosken, D. J.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Idrisy, A.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J. -M.; Isi, M.; Islas, G.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, D.H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jimenez-Forteza, F.; Johnson, W.; Jones, I.D.; Jones, R.; Jonker, R. J. G.; Ju, L.; Haris, K.; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.H.; Kanner, J. B.; Karki, S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kawazoe, F.; Kefelian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.E.; Key, J. S.; Khalaidovski, A.; Khalili, F. Y.; Khan, I.; Khan., S.; Khan, Z.; Khazanov, E. A.; Kijhunchoo, N.; Kim, C.; Kim, J.; Kim, K.; Kim, Nam-Gyu; Kim, Namjun; Kim, Y.M.; King, E. J.; King, P. J.; Kinsey, M.; Kinzel, D. L.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Kokeyama, K.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krolak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Laguna, P.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Lazzarini, A.; Lazzaro, C.; Leaci, R.; Leavey, S.; Lebigot, E. O.; Lee, C.H.; Lee, K.H.; Lee, M.H.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B. M.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Logue, J.; Lombardi, A. L.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lueck, H.; Lundgren, A. P.; Luo, J.; Lynch, R.; Ma, Y.; MacDonald, T.T.; Machenschalk, B.; MacInnis, M.; Macleod, D. 
M.; Magana-Sandoval, F.; Magee, R. M.; Mageswaran, M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Marka, S.; Marka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R.M.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mende, G.; Mendoza-Gandara, D.; Mercer, R. A.; Merilh, E. L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B.C.; Moore, J.C.; Moraru, D.; Gutierrez Moreno, M.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, S.D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P.G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nedkova, K.; Nelemans, G.; Gutierrez-Neri, M.; Neunzert, A.; Newton-Howes, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J.; Oh, S. H.; Ohme, F.; Oliver, M. B.; Oppermann, P.; Oram, Richard J.; O'Reilly, B.; O'Shaughnessy, R.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Page, J.; Paris, H. R.; Parker, W.S; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Phelps, M.; Piccinni, O. J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poggiani, R.; Popolizio, P.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Premachandra, S. S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prolchorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Puerrer, M.; Qi, H.; Qin, J.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raja, S.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosinska, D.; Rowan, S.; Ruediger, A.; Ruggi, P.; Ryan, K.A.; Sachdev, P.S.; Sadecki, T.; Sadeghian, L.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J; Schmidt, P.; Schnabel, R.B.; Schofield, R. M. S.; Schoenbeck, A.; Schreiber, K.E.C.; Schuette, D.; Schutz, B. 
F.; Scott, J.; Scott, M.S.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Serna, G.; Setyawati, Y.; Sevigny, A.; Shaddock, D. A.; Shah, S.; Shithriar, M. S.; Shaltev, M.; Shao, Z.M.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sigg, D.; Silva, António Dias da; Simakov, D.; Singer, A; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, R. J. E.; Smith, N.D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stone, J.R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S. E.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sutton, P. J.; Swinkels, B. L.; Szczepanczyk, M. J.; Tacca, M.D.; Talukder, D.; Tanner, D. B.; Tapai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, W.R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Tonelli, M.; Torres, C. V.; Torrie, C. I.; Toyra, D.; Travasso, F.; Traylor, G.; Trifiro, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlhruch, H.; Vajente, G.; Valdes, G.; Van Bakel, N.; Van Beuzekom, Martin; Van den Brand, J. F. J.; Van Den Broeck, C.F.F.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasuth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, R. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Vicere, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J. -Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D. V.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, MT; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L. -W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.M.; Wessels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; White, D. J.; Whiting, B. F.; Williams, D.; Williams, D.R.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Wright, J.L.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, H.; Yvert, M.; Zadrozny, A.; Zangrando, L.; Zanolin, M.; Zendri, J. -P.; Zevin, M.; Zhang, F.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.

    2016-01-01

    The gravitational-wave signal GW150914 was first identified on September 14, 2015, by searches for short-duration gravitational-wave transients. These searches identify time-correlated transients in multiple detectors with minimal assumptions about the signal morphology, allowing them to be

  12. 29 CFR 1977.10 - Proceedings under or related to the Act.

    Science.gov (United States)

    2010-07-01

    Labor Regulations Relating to Labor (Continued), Occupational Safety and Health Administration, Occupational Safety and Health Act of 1970, Specific Protections, § 1977.10 Proceedings under or related to the Act. The section addresses proceedings such as promulgation of a standard under section 6(f) of the Act and employee appeal of an Occupational Safety and Health Review Commission order.

  13. Stress–strain relations for hydrogels under multiaxial deformation

    DEFF Research Database (Denmark)

    Drozdov, Aleksey; Christiansen, Jesper de Claville

    2013-01-01

    and solvent-dependent reference configuration. The importance of introduction of a reference configuration evolving under swelling is confirmed by the analysis of experimental data on nanocomposite hydrogels subjected to swelling and drying. Adjustable parameters in the stress–strain relations are found...... by fitting observations on swollen elastomers, chemical gels (linked by covalent bonds and sliding cross-links), and physical gels under uniaxial stretching, equi-biaxial tension, and pure shear. Good agreement is demonstrated between the observations and results of numerical simulation. A pronounced...

  14. The 'revealed preferences' theory: Assumptions and conjectures

    International Nuclear Information System (INIS)

    Green, C.H.

    1983-01-01

    Being a kind of intuitive psychology, 'Revealed-Preferences'-theory-based approaches to determining acceptable risks are a useful method for generating hypotheses. In view of the fact that reliability engineering develops faster than methods for determining reliability aims, the Revealed-Preferences approach is a necessary preliminary aid. Some of the assumptions on which the 'Revealed-Preferences' theory is based are identified and analysed, and afterwards compared with experimentally obtained results. (orig./DG) [de]

  15. Modeling soil CO2 production and transport with dynamic source and diffusion terms: testing the steady-state assumption using DETECT v1.0

    Science.gov (United States)

    Ryan, Edmund M.; Ogle, Kiona; Kropp, Heather; Samuels-Crow, Kimberly E.; Carrillo, Yolima; Pendall, Elise

    2018-05-01

    The flux of CO2 from the soil to the atmosphere (soil respiration, R_soil) is a major component of the global carbon (C) cycle. Methods to measure and model R_soil, or partition it into different components, often rely on the assumption that soil CO2 concentrations and fluxes are in steady state, implying that R_soil is equal to the rate at which CO2 is produced by soil microbial and root respiration. Recent research, however, questions the validity of this assumption. Thus, the aim of this work was two-fold: (1) to describe a non-steady-state (NSS) soil CO2 transport and production model, DETECT, and (2) to use this model to evaluate the environmental conditions under which R_soil and CO2 production are likely in NSS. The backbone of DETECT is a non-homogeneous partial differential equation (PDE) that describes production and transport of soil CO2, which we solve numerically at fine spatial and temporal resolution (e.g., 0.01 m increments down to 1 m, every 6 h). Production of soil CO2 is simulated for every depth and time increment as the sum of root respiration and microbial decomposition of soil organic matter. Both of these factors can be driven by current and antecedent soil water content and temperature, which can also vary by time and depth. We also analytically solved the ordinary differential equation (ODE) corresponding to the steady-state (SS) solution to the PDE model. We applied the DETECT NSS and SS models to the six-month growing season period representative of a native grassland in Wyoming. Simulation experiments were conducted with both model versions to evaluate factors that could affect departure from SS, such as (1) varying soil texture; (2) shifting the timing or frequency of precipitation; and (3) with and without the environmental antecedent drivers. For a coarse-textured soil, R_soil from the SS model closely matched that of the NSS model. However, in a fine-textured (clay) soil, growing-season R_soil was ~3 % higher under the assumption of
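
    To show the structure of such a calculation, here is a minimal non-steady-state sketch (explicit finite differences; the grid follows the abstract, but the diffusivity, boundary conditions, and source profile are illustrative assumptions, not the published DETECT configuration):

        import numpy as np

        # 1-D soil column: 1 m deep at 0.01 m resolution, as in the abstract's grid.
        nz, dz = 100, 0.01
        dt = 20.0                 # time step [s], chosen so the explicit scheme is stable
        D = 2.0e-6                # effective CO2 diffusivity [m2 s-1] (assumed constant)
        z = np.linspace(0.0, 1.0, nz)
        S = 1.0e-6 * np.exp(-z / 0.3)   # assumed production profile, largest near surface

        assert D * dt / dz**2 <= 0.5     # stability criterion for explicit diffusion

        C = np.zeros(nz)                 # CO2 concentration anomaly above atmospheric
        for _ in range(int(6 * 3600 / dt)):      # integrate one 6-hour increment
            lap = np.zeros(nz)
            lap[1:-1] = (C[2:] - 2.0 * C[1:-1] + C[:-2]) / dz**2
            lap[-1] = (C[-2] - C[-1]) / dz**2    # no-flux bottom boundary
            C += dt * (D * lap + S)              # dC/dt = D d2C/dz2 + S(z)
            C[0] = 0.0                           # surface held at atmospheric level

        # Surface efflux (R_soil) from Fick's law; at steady state this would equal
        # the column-integrated production sum(S) * dz, and the gap between the two
        # numbers is exactly the departure from the steady-state assumption.
        R_soil = D * (C[1] - C[0]) / dz
        print(f"R_soil = {R_soil:.3e}, total production = {np.sum(S) * dz:.3e}")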

  16. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    In recent years, increasing interest has been put on IT project portfolio management (IT PPM). Considering IT PPM an interdisciplinary practice, we conduct a concept-based literature review of relevant...

  17. Incorporation of constructivist assumptions into problem-based instruction: a literature review.

    Science.gov (United States)

    Kantar, Lina

    2014-05-01

    The purpose of this literature review was to explore the use of distinct assumptions of constructivism when studying the impact of problem-based learning (PBL) on learners in undergraduate nursing programs. Design: content analysis research technique. The literature review included information retrieved from sources selected via electronic databases, such as EBSCOhost, ProQuest, Sage Publications, SLACK Incorporation, Springhouse Corporation, and Digital Dissertations. The literature review was conducted utilizing key terms and phrases associated with problem-based learning in undergraduate nursing education. Of the 100 reviewed abstracts, only 15 studies met the inclusion criteria for the review. Four constructivist assumptions guided the review process, allowing for analysis and evaluation of the findings, followed by identification of issues and recommendations for the discipline and its research practice in the field of PBL. This literature review provided evidence that the nursing discipline is employing PBL in its programs, yet with limited data supporting conceptions of the constructivist perspective underlying this pedagogical approach. Three major issues were assessed and formed the basis for subsequent recommendations: (a) limited use of a theoretical framework and absence of constructivism in most of the studies, (b) incompatibility between research measures and research outcomes, and (c) brief exposure to PBL during which the change was measured. Educators have made the right choice in employing PBL as a pedagogical practice, yet implementation needs to be based on constructivism if the aim is better preparation of graduates for practice. Undeniably, there is limited convincing evidence regarding the integration of constructivism in nursing education. Research that assesses the impact of PBL on learners' problem-solving and communication skills, self-direction, and motivation is paramount. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Testing the basic assumption of the hydrogeomorphic approach to assessing wetland functions.

    Science.gov (United States)

    Hruby, T

    2001-05-01

    The hydrogeomorphic (HGM) approach for developing "rapid" wetland function assessment methods stipulates that the variables used are to be scaled based on data collected at sites judged to be the best at performing the wetland functions (reference standard sites). A critical step in the process is to choose the least altered wetlands in a hydrogeomorphic subclass to use as a reference standard against which other wetlands are compared. The basic assumption made in this approach is that wetlands judged to have had the least human impact have the highest level of sustainable performance for all functions. The levels at which functions are performed in these least altered wetlands are assumed to be "characteristic" for the subclass and "sustainable." Results from data collected in wetlands in the lowlands of western Washington suggest that the assumption may not be appropriate for this region. Teams developing methods for assessing wetland functions did not find that the least altered wetlands in a subclass had a range of performance levels that could be identified as "characteristic" or "sustainable." Forty-four wetlands in four hydrogeomorphic subclasses (two depressional subclasses and two riverine subclasses) were rated by teams of experts on the severity of their human alterations and on the level of performance of 15 wetland functions. An ordinal scale of 1-5 was used to quantify alterations in water regime, soils, vegetation, buffers, and contributing basin. Performance of functions was judged on an ordinal scale of 1-7. Relatively unaltered wetlands were judged to perform individual functions at levels that spanned all of the seven possible ratings in all four subclasses. The basic assumption of the HGM approach, that the least altered wetlands represent "characteristic" and "sustainable" levels of functioning that are different from those found in altered wetlands, was not confirmed. Although the intent of the HGM approach is to use level of functioning as a

  19. Shattering Man’s Fundamental Assumptions in Don DeLillo’s Falling Man

    OpenAIRE

    Hazim Adnan Hashim; Rosli Bin Talif; Lina Hameed Ali

    2016-01-01

    The present study addresses effects of traumatic events such as the September 11 attacks on victims’ fundamental assumptions. These beliefs or assumptions provide individuals with expectations about the world and their sense of self-worth. Thus, they ground people’s sense of security, stability, and orientation. The September 11 terrorist attacks in the U.S.A. were very tragic for Americans because this fundamentally changed their understandings about many aspects in life. The attacks led man...

  20. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

    A number of waste life cycle assessment (LCA) models have been gradually developed since the early 1990s, in a number of countries, usually independently from each other. Large discrepancies in results have been observed among different waste LCA models, although it has also been shown that results...... from different LCA studies can be consistent. This paper is an attempt to identify, review and analyse methodologies and technical assumptions used in various parts of selected waste LCA models. Several criteria were identified, which could have significant impacts on the results......, such as the functional unit, system boundaries, waste composition and energy modelling. The modelling assumptions of waste management processes, ranging from collection, transportation, intermediate facilities, recycling, thermal treatment, biological treatment, and landfilling, are obviously critical when comparing...

  1. Managerial and Organizational Assumptions in the CMM's

    DEFF Research Database (Denmark)

    Rose, Jeremy; Aaen, Ivan; Nielsen, Peter Axel

    2008-01-01

    Thinking about improving the management of software development in software firms is dominated by one approach: the capability maturity model devised and administered at the Software Engineering Institute at Carnegie Mellon University. Though CMM and its replacement CMMI are widely known and used... thinking about large production and manufacturing organisations (particularly in America) in the late industrial age. Many of the difficulties reported with CMMI can be attributed to basing practice on these assumptions in organisations which have different cultures and management traditions, perhaps...

  2. New Assumptions to Guide SETI Research

    Science.gov (United States)

    Colombano, S. P.

    2018-01-01

    The recent Kepler discoveries of Earth-like planets offer the opportunity to focus our attention on detecting signs of life and technology in specific planetary systems, but I feel we need to become more flexible in our assumptions. The reason is that, while it is still reasonable and conservative to assume that life is most likely to have originated in conditions similar to ours, the vast time differences in potential evolutions render the likelihood of "matching" technologies very slim. In light of these challenges I propose a more "aggressive" approach to future SETI exploration in directions that until now have received little consideration.

  3. Testing the assumptions of the pyrodiversity begets biodiversity hypothesis for termites in semi-arid Australia.

    Science.gov (United States)

    Davis, Hayley; Ritchie, Euan G; Avitabile, Sarah; Doherty, Tim; Nimmo, Dale G

    2018-04-01

    Fire shapes the composition and functioning of ecosystems globally. In many regions, fire is actively managed to create diverse patch mosaics of fire-ages under the assumption that a diversity of post-fire-age classes will provide a greater variety of habitats, thereby enabling species with differing habitat requirements to coexist, and enhancing species diversity (the pyrodiversity begets biodiversity hypothesis). However, studies provide mixed support for this hypothesis. Here, using termite communities in a semi-arid region of southeast Australia, we test four key assumptions of the pyrodiversity begets biodiversity hypothesis: (i) that fire shapes vegetation structure over sufficient time frames to influence species' occurrence, (ii) that animal species are linked to resources that are themselves shaped by fire and that peak at different times since fire, (iii) that species' probability of occurrence or abundance peaks at varying times since fire, and (iv) that providing a diversity of fire-ages increases species diversity at the landscape scale. Termite species and habitat elements were sampled in 100 sites across a range of fire-ages, nested within 20 landscapes chosen to represent a gradient of low to high pyrodiversity. We used regression modelling to explore relationships between termites, habitat and fire. Fire affected two habitat elements (coarse woody debris and the cover of woody vegetation) that were associated with the probability of occurrence of three termite species and overall species richness, thus supporting the first two assumptions of the pyrodiversity hypothesis. However, this did not result in those species or species richness being affected by fire history per se. Consequently, landscapes with a low diversity of fire histories had similar numbers of termite species as landscapes with high pyrodiversity. Our work suggests that encouraging a diversity of fire-ages to enhance termite species richness in this study region is not necessary.

  4. A criterion of orthogonality on the assumption and restrictions in subgrid-scale modelling of turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Fang, L. [LMP, Ecole Centrale de Pékin, Beihang University, Beijing 100191 (China); Co-Innovation Center for Advanced Aero-Engine, Beihang University, Beijing 100191 (China); Sun, X.Y. [LMP, Ecole Centrale de Pékin, Beihang University, Beijing 100191 (China); Liu, Y.W., E-mail: liuyangwei@126.com [National Key Laboratory of Science and Technology on Aero-Engine Aero-Thermodynamics, School of Energy and Power Engineering, Beihang University, Beijing 100191 (China); Co-Innovation Center for Advanced Aero-Engine, Beihang University, Beijing 100191 (China)

    2016-12-09

    In order to shed light on the subgrid-scale (SGS) modelling methodology, we analyze and define the concepts of assumption and restriction in the modelling procedure, then show by a generalized derivation that if there are multiple stationary restrictions in a model, the corresponding assumption function must satisfy a criterion of orthogonality. Numerical tests using the one-dimensional nonlinear advection equation are performed to validate this criterion. This study is expected to inspire future research on generally guiding the SGS modelling methodology. - Highlights: • The concepts of assumption and restriction in the SGS modelling procedure are defined. • A criterion of orthogonality on the assumption and restrictions is derived. • Numerical tests using the one-dimensional nonlinear advection equation are performed to validate this criterion.

  5. Measuring Productivity Change without Neoclassical Assumptions: A Conceptual Analysis

    NARCIS (Netherlands)

    B.M. Balk (Bert)

    2008-01-01

    textabstractThe measurement of productivity change (or difference) is usually based on models that make use of strong assumptions such as competitive behaviour and constant returns to scale. This survey discusses the basics of productivity measurement and shows that one can dispense with most if not

  6. Exploring five common assumptions on Attention Deficit Hyperactivity Disorder

    NARCIS (Netherlands)

    Batstra, Laura; Nieweg, Edo H.; Hadders-Algra, Mijna

    The number of children diagnosed with attention deficit hyperactivity disorder (ADHD) and treated with medication is steadily increasing. The aim of this paper was to critically discuss five debatable assumptions on ADHD that may explain these trends to some extent. These are that ADHD (i) causes

  7. A critical evaluation of the local-equilibrium assumption in modeling NAPL-pool dissolution

    Science.gov (United States)

    Seagren, Eric A.; Rittmann, Bruce E.; Valocchi, Albert J.

    1999-07-01

    An analytical modeling analysis was used to assess when local equilibrium (LE) and nonequilibrium (NE) modeling approaches may be appropriate for describing nonaqueous-phase liquid (NAPL) pool dissolution. NE mass transfer between NAPL pools and groundwater is expected to affect the dissolution flux under conditions corresponding to values of Sh′St (the modified Sherwood number (L_x k_l/D_z) multiplied by the Stanton number (k_l/v_x)) below about 400; near and above Sh′St ≈ 400, the NE and LE solutions converge, and the LE assumption is appropriate. Based on typical groundwater conditions, many cases of interest are expected to fall in this range. The parameter with the greatest impact on Sh′St is k_l. The NAPL pool mass-transfer coefficient correlation of Pfannkuch [Pfannkuch, H.-O., 1984. Determination of the contaminant source strength from mass exchange processes at the petroleum-ground-water interface in shallow aquifer systems. In: Proceedings of the NWWA/API Conference on Petroleum Hydrocarbons and Organic Chemicals in Ground Water—Prevention, Detection, and Restoration, Houston, TX. Natl. Water Well Assoc., Worthington, OH, Nov. 1984, pp. 111-129.] was evaluated using the toluene pool data from Seagren et al. [Seagren, E.A., Rittmann, B.E., Valocchi, A.J., 1998. An experimental investigation of NAPL-pool dissolution enhancement by flushing. J. Contam. Hydrol., accepted.]. Dissolution flux predictions made with k_l calculated using the Pfannkuch correlation were similar to the LE model predictions, and deviated systematically both from predictions made using the average overall k_l = 4.76 m/day estimated by Seagren et al. and from the experimental data for v_x > 18 m/day. The Pfannkuch correlation k_l was too large for v_x > ≈ 10 m/day, possibly because of the relatively low Peclet number data used by Pfannkuch [Pfannkuch, H.-O., 1984. Determination
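
    Spelled out from the definitions given in the abstract (our restatement), the dimensionless group controlling the LE/NE transition is

        \[
          \mathrm{Sh}' = \frac{k_l L_x}{D_z},
          \qquad
          \mathrm{St} = \frac{k_l}{v_x},
          \qquad
          \mathrm{Sh}'\,\mathrm{St} = \frac{k_l^2\, L_x}{D_z\, v_x},
        \]

    which makes explicit why the mass-transfer coefficient k_l, entering as a square, has the greatest impact on Sh′St.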

  8. Influence of simulation assumptions and input parameters on energy balance calculations of residential buildings

    International Nuclear Information System (INIS)

    Dodoo, Ambrose; Tettey, Uniben Yao Ayikoe; Gustavsson, Leif

    2017-01-01

    In this study, we modelled the influence of different simulation assumptions on energy balances of two variants of a residential building, comprising the building in its existing state and with energy-efficient improvements. We explored how selected parameter combinations and variations affect the energy balances of the building configurations. The selected parameters encompass outdoor microclimate, building thermal envelope and household electrical equipment including technical installations. Our modelling takes into account hourly as well as seasonal profiles of different internal heat gains. The results suggest that the impact of parameter interactions on calculated space heating of buildings is somewhat small and relatively more noticeable for an energy-efficient building in contrast to a conventional building. We find that the influence of parameter combinations is more apparent as more individual parameters are varied. The simulations show that a building's calculated space heating demand is significantly influenced by how heat gains from electrical equipment are modelled. For the analyzed building versions, calculated final energy for space heating differs by 9–14 kWh/m² depending on the assumed energy efficiency level for electrical equipment. The influence of electrical equipment on calculated final space heating is proportionally more significant for an energy-efficient building compared to a conventional building. This study shows the influence of different simulation assumptions and parameter combinations when varied simultaneously. - Highlights: • Energy balances are modelled for conventional and efficient variants of a building. • Influence of assumptions and parameter combinations and variations are explored. • Parameter interactions influence is apparent as more single parameters are varied. • Calculated space heating demand is notably affected by how heat gains are modelled.

  9. Prospect relativity: how choice options influence decision under risk.

    Science.gov (United States)

    Stewart, Neil; Chater, Nick; Stott, Henry P; Reimers, Stian

    2003-03-01

    In many theories of decision under risk (e.g., expected utility theory, rank-dependent utility theory, and prospect theory), the utility of a prospect is independent of other options in the choice set. The experiments presented here show a large effect of the available options, suggesting instead that prospects are valued relative to one another. The judged certainty equivalent for a prospect is strongly influenced by the options available. Similarly, the selection of a preferred prospect is strongly influenced by the prospects available. Alternative theories of decision under risk (e.g., the stochastic difference model, multialternative decision field theory, and range frequency theory), where prospects are valued relative to one another, can provide an account of these context effects.

  10. Implicit Assumptions in Special Education Policy: Promoting Full Inclusion for Students with Learning Disabilities

    Science.gov (United States)

    Kirby, Moira

    2017-01-01

    Introduction: Every day, millions of students in the United States receive special education services. Special education is an institution shaped by societal norms. Inherent in these norms are implicit assumptions regarding disability and the nature of special education services. The two dominant implicit assumptions evident in the American…

  11. Close-Form Pricing of Benchmark Equity Default Swaps Under the CEV Assumption

    NARCIS (Netherlands)

    Campi, L.; Sbuelz, A.

    2005-01-01

    Equity Default Swaps are new equity derivatives designed as a product for credit investors. Equipped with a novel pricing result, we provide closed-form values that give an analytic contribution to the viability of cross-asset trading related to credit risk.
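
    An equity default swap typically pays out if the stock falls to a fixed fraction of its initial level (30% is the usual benchmark). Under the CEV assumption, dS = r S dt + sigma S^beta dW with beta < 1, local volatility rises as the stock falls, fattening the left tail. The Monte Carlo sketch below estimates the trigger probability under hypothetical parameters; it is a brute-force illustration, not the closed-form valuation the paper derives:

        import numpy as np

        rng = np.random.default_rng(0)

        def eds_trigger_prob(s0=100.0, barrier=0.30, sigma=1.2, beta=0.7,
                             r=0.03, T=5.0, n_steps=500, n_paths=10000):
            """P(min S_t <= barrier * S_0) under an Euler-discretized CEV diffusion.
            sigma is scaled so local volatility is ~30% at S_0 when beta = 0.7."""
            dt = T / n_steps
            s = np.full(n_paths, s0)
            hit = np.zeros(n_paths, dtype=bool)
            for _ in range(n_steps):
                dw = rng.normal(0.0, np.sqrt(dt), n_paths)
                s = np.maximum(s + r * s * dt + sigma * s**beta * dw, 1e-8)
                hit |= s <= barrier * s0
            return hit.mean()

        print("EDS trigger probability:", eds_trigger_prob())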

  12. Assumptions behind size-based ecosystem models are realistic

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Blanchard, Julia L.; Fulton, Elizabeth A.

    2016-01-01

    A recent publication about balanced harvesting (Froese et al., ICES Journal of Marine Science; doi:10.1093/icesjms/fsv122) contains several erroneous statements about size-spectrum models. We refute the statements by showing that the assumptions pertaining to size-spectrum models discussed by Froese et al. are realistic ... and that there is indeed a constructive role for a wide suite of ecosystem models to evaluate fishing strategies in an ecosystem context...

  13. Does Artificial Neural Network Support Connectivism's Assumptions?

    Science.gov (United States)

    AlDahdouh, Alaa A.

    2017-01-01

    Connectivism was presented as a learning theory for the digital age and connectivists claim that recent developments in Artificial Intelligence (AI) and, more specifically, Artificial Neural Network (ANN) support their assumptions of knowledge connectivity. Yet, very little has been done to investigate this brave allegation. Does the advancement…

  14. Anti-Atheist Bias in the United States: Testing Two Critical Assumptions

    Directory of Open Access Journals (Sweden)

    Lawton K Swan

    2012-02-01

    Decades of opinion polling and empirical investigations have clearly demonstrated a pervasive anti-atheist prejudice in the United States. However, much of this scholarship relies on two critical and largely unaddressed assumptions: (a) that when people report negative attitudes toward atheists, they do so because they are reacting specifically to their lack of belief in God; and (b) that survey questions asking about attitudes toward atheists as a group yield reliable information about biases against individual atheist targets. To test these assumptions, an online survey asked a probability-based random sample of American adults (N = 618) to evaluate a fellow research participant (“Jordan”). Jordan garnered significantly more negative evaluations when identified as an atheist than when described as religious or when religiosity was not mentioned. This effect did not differ as a function of labeling (“atheist” versus “no belief in God”) or the amount of individuating information provided about Jordan. These data suggest that both assumptions are tenable: nonbelief—rather than extraneous connotations of the word “atheist”—seems to underlie the effect, and participants exhibited a marked bias even when confronted with an otherwise attractive individual.

  15. Sensitivity of the OMI ozone profile retrieval (OMO3PR) to a priori assumptions

    NARCIS (Netherlands)

    Mielonen, T.; De Haan, J.F.; Veefkind, J.P.

    2014-01-01

    We have assessed the sensitivity of the operational OMI ozone profile retrieval (OMO3PR) algorithm to a number of a priori assumptions. We studied the effect of stray light correction, surface albedo assumptions and a priori ozone profiles on the retrieved ozone profile. Then, we studied how to

  16. THE COMPLEX OF ASSUMPTION CATHEDRAL OF THE ASTRAKHAN KREMLIN

    Directory of Open Access Journals (Sweden)

    Savenkova Aleksandra Igorevna

    2016-08-01

    This article is devoted to an architectural and historical analysis of the constructions forming the complex of the Assumption Cathedral of the Astrakhan Kremlin, which has not previously been considered as a subject of special research. Based on archival sources, photographic materials, publications and on-site investigations of the monuments, the creation history of the complete architectural complex, sustained in the single style of the Muscovite baroque and unique in its composite construction, is considered, and its interpretation in the all-Russian architectural context is offered. Typological features of the individual constructions come to light. The typology of the Prechistinsky bell tower has an untypical architectural solution: “hexagonal structure on octagonal and quadrangular structures”. The way of connecting the building of the Cathedral and the chambers by a passage was characteristic of monastic construction and was exceedingly rare in kremlins, farmsteads and ensembles of city cathedrals. The composite scheme of the Assumption Cathedral includes the Lobnoye Mesto (“the Place of Execution”) located on an axis from the west; it is connected with the main building by a quarter-turn stair with a landing. The only prototype of the structure is the Lobnoye Mesto on Red Square in Moscow. The article considers the version that the Place of Execution emerged on the basis of an earlier existing construction, a tower called “the Peal”, which is repeatedly mentioned in written sources in connection with S. Razin’s revolt. The metropolitan Sampson, trying to preserve the importance of the Astrakhan metropolitanate, built the Assumption Cathedral and the Place of Execution in direct appeal to a capital prototype, emphasizing continuity and close connection with Moscow.

  17. I Assumed You Knew: Teaching Assumptions as Co-Equal to Observations in Scientific Work

    Science.gov (United States)

    Horodyskyj, L.; Mead, C.; Anbar, A. D.

    2016-12-01

    Introductory science curricula typically begin with a lesson on the "nature of science". Usually this lesson is short, built on the assumption that students have picked up this information elsewhere and only a short review is necessary. However, when asked about the nature of science in our classes, student definitions were often confused, contradictory, or incomplete. A cursory review of how the nature of science is defined in a number of textbooks finds definitions that are similarly inconsistent and excessively loquacious. With such confusion from both the student and teacher perspectives, it is no surprise that students walk away with significant misconceptions about the scientific endeavor, which they carry with them into public life. These misconceptions subsequently result in poor public policy and personal decisions on issues with scientific underpinnings. We will present a new way of teaching the nature of science at the introductory level that better represents what we actually do as scientists. Nature of science lessons often emphasize the importance of observations in scientific work. However, they rarely mention and often hide the importance of assumptions in interpreting those observations. Assumptions are co-equal to observations in building models, which are observation-assumption networks that can be used to make predictions about future observations. The confidence we place in these models depends on whether they are assumption-dominated (hypothesis) or observation-dominated (theory). By presenting and teaching science in this manner, we feel that students will better comprehend the scientific endeavor, since making observations and assumptions and building mental models is a natural human behavior. We will present a model for a science lab activity that can be taught using this approach.

  18. THE REFLECTION ON THE EFFICIENCY OF REALISM ASSUMPTIONS IN THE CONTEMPORARY WORLD THROUGH THE INVASION OF IRAQ IN 2003

    Directory of Open Access Journals (Sweden)

    Leonardo Luiz Silveira da Silva

    2014-11-01

    At the turn of the nineteenth and twentieth centuries, when Rudolf Kjellén coined the term geopolitics, political realism was recognized as the dominant explanatory theory for understanding power relations globally. After the two World Wars, the interdependence of nations seemed a final goal; even during the Cold War, the opposition between ideologies could place political limits on an integration already made possible by advances in transport and communication. In the 1970s, realism found within academia, in the field of International Relations, a rival theory with great explanatory power for the rearrangement of the future order: liberalism, understood as a theory pointing to international institutions and the interdependence of trade as powerful forces explaining power relations globally. Through the 2003 invasion of Iraq, this work points to two different ways of interpreting the same event in the light of Realism and Liberalism, aiming to find the limitations of each in a world still in transition with regard to the consolidation or extinction of their analytical assumptions. To do so, it focuses on the characteristics of the contemporary world that have come to jeopardize some realist assumptions.

  19. Leakage-Resilient Circuits without Computational Assumptions

    DEFF Research Database (Denmark)

    Dziembowski, Stefan; Faust, Sebastian

    2012-01-01

    Physical cryptographic devices inadvertently leak information through numerous side-channels. Such leakage is exploited by so-called side-channel attacks, which often allow for a complete security breach. A recent trend in cryptography is to propose formal models to incorporate leakage into the model and to construct schemes that are provably secure within them. We design a general compiler that transforms any cryptographic scheme, e.g., a block-cipher, into a functionally equivalent scheme which is resilient to any continual leakage, provided that the following three requirements are satisfied... In contrast to previous works, which relied on computational assumptions, our results are purely information-theoretic. In particular, we do not make use of public key encryption, which was required in all previous works...

  20. Hepatitis C bio-behavioural surveys in people who inject drugs-a systematic review of sensitivity to the theoretical assumptions of respondent driven sampling.

    Science.gov (United States)

    Buchanan, Ryan; Khakoo, Salim I; Coad, Jonathan; Grellier, Leonie; Parkes, Julie

    2017-07-11

    New, more effective and better-tolerated therapies for hepatitis C (HCV) have made the elimination of HCV a feasible objective. However, for this to be achieved, it is necessary to have a detailed understanding of HCV epidemiology in people who inject drugs (PWID). Respondent-driven sampling (RDS) can provide prevalence estimates in hidden populations such as PWID. The aims of this systematic review are to identify published studies that use RDS in PWID to measure the prevalence of HCV, and to compare each study against the STROBE-RDS checklist to assess their sensitivity to the theoretical assumptions underlying RDS. Searches were undertaken in accordance with PRISMA systematic review guidelines. Included studies were English-language publications in peer-reviewed journals, which reported the use of RDS to recruit PWID to an HCV bio-behavioural survey. Data were extracted under three headings: (1) survey overview, (2) survey outcomes, and (3) reporting against selected STROBE-RDS criteria. Thirty-one studies met the inclusion criteria. They varied in scale (range 1-15 survey sites) and the sample sizes achieved (range 81-1000 per survey site) but were consistent in describing the use of standard RDS methods, including seeds, coupons and recruitment incentives. Twenty-seven studies (87%) either calculated or reported the intention to calculate population prevalence estimates for HCV and two used RDS data to calculate the total population size of PWID. Detailed operational and analytical procedures and reporting against selected criteria from the STROBE-RDS checklist varied between studies. There were widespread indications that sampling did not meet the assumptions underlying RDS, which led to two studies being unable to report an estimated HCV population prevalence in at least one survey location. RDS can be used to estimate a population prevalence of HCV in PWID and estimate the PWID population size. Accordingly, as a single instrument, it is a useful tool for…

  1. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    Science.gov (United States)

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.
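
    Power estimates of this kind come from repeated simulation: generate survival data with a known violation of proportional hazards, apply the test, and count rejections. A minimal sketch of that loop, assuming the Python lifelines package and its Schoenfeld-residual-based proportional_hazard_test; the simulation design here is illustrative, not the one used in the paper:

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter
        from lifelines.statistics import proportional_hazard_test

        rng = np.random.default_rng(42)

        def simulate(n=400):
            """Binary covariate whose hazard ratio flips from 2.0 to 0.5 at t = 1."""
            x = rng.integers(0, 2, n)
            e = rng.exponential(1.0, n)  # draws of the cumulative hazard
            t = np.where(x == 0, e,      # invert H(t) = 2t for t < 1, 2 + 0.5(t - 1) after
                         np.where(e < 2.0, e / 2.0, 1.0 + (e - 2.0) / 0.5))
            c = rng.uniform(0.5, 3.0, n) # random censoring
            return pd.DataFrame({"T": np.minimum(t, c), "E": (t <= c).astype(int), "x": x})

        def rejects(df, alpha=0.05):
            fit = CoxPHFitter().fit(df, duration_col="T", event_col="E")
            res = proportional_hazard_test(fit, df, time_transform="rank")
            return float(np.asarray(res.p_value).min()) < alpha

        power = np.mean([rejects(simulate()) for _ in range(100)])
        print(f"estimated power: {power:.2f}")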

  2. The microeconomics of mineral extraction under capacity constraints

    Energy Technology Data Exchange (ETDEWEB)

    Cairns, R.D. [McGill University, Montreal, PQ (Canada). Dept. of Economics

    1998-09-01

    The mineral investment decision under certainty is discussed in the context of broad microeconomic features of the industry, the central one being that production is constrained by capacity. The assumptions of the economic literature on natural resources are evaluated in the context of these features and the assumptions that permit the modeling of such facts are examined. Several characteristics of extraction and equilibrium, and some implications of uncertainty, are considered. 38 refs., 1 app.

  3. Origins and Traditions in Comparative Education: Challenging Some Assumptions

    Science.gov (United States)

    Manzon, Maria

    2018-01-01

    This article questions some of our assumptions about the history of comparative education. It explores new scholarship on key actors and ways of knowing in the field. Building on the theory of the social constructedness of the field of comparative education, the paper elucidates how power shapes our scholarly histories and identities.

  4. The metaphysics of D-CTCs: On the underlying assumptions of Deutsch's quantum solution to the paradoxes of time travel

    Science.gov (United States)

    Dunlap, Lucas

    2016-11-01

    I argue that Deutsch's model for the behavior of systems traveling around closed timelike curves (CTCs) relies implicitly on a substantive metaphysical assumption. Deutsch is employing a version of quantum theory with a significantly supplemented ontology of parallel existent worlds, which differ in kind from the many worlds of the Everett interpretation. Standard Everett does not support the existence of multiple identical copies of the world, which the D-CTC model requires. This has been obscured because he often refers to the branching structure of Everett as a "multiverse", and describes quantum interference by reference to parallel interacting definite worlds. But he admits that this is only an approximation to Everett. The D-CTC model, however, relies crucially on the existence of a multiverse of parallel interacting worlds. Since his model is supplemented by structures that go significantly beyond quantum theory, and play an ineliminable role in its predictions and explanations, it does not represent a quantum solution to the paradoxes of time travel.

  5. Evolution of hadron beams under intrabeam scattering

    International Nuclear Information System (INIS)

    Wei, Jie.

    1993-01-01

    Based on assumptions applicable to many circular accelerators, we simplify into analytical form the growth rates of a hadron beam under Coulomb intrabeam scattering (IBS). Because of the dispersion that correlates the horizontal closed orbit to the momentum, the scaling behavior of the growth rates is drastically different at energies low and high compared with the transition energy. At high energies the rates are approximately independent of the energy. Asymptotically, the horizontal and longitudinal beam amplitudes are linearly related by the average dispersion. At low energies, the beam evolves such that the velocity distribution in the rest frame becomes isotropic in all directions

  6. Validity of the isotropic thermal conductivity assumption in supercell lattice dynamics

    Science.gov (United States)

    Ma, Ruiyuan; Lukes, Jennifer R.

    2018-02-01

    Superlattices and nano phononic crystals have attracted significant attention due to their low thermal conductivities and their potential application as thermoelectric materials. A widely used expression to calculate thermal conductivity, presented by Klemens and expressed in terms of the relaxation time by Callaway and Holland, originates from the Boltzmann transport equation. In its most general form, this expression involves a direct summation of the heat current contributions from individual phonons of all wavevectors and polarizations in the first Brillouin zone. In common practice, the expression is simplified by making an isotropic assumption that converts the summation over wavevector to an integral over wavevector magnitude. The isotropic expression has been applied to superlattices and phononic crystals, but its validity for different supercell sizes has not been studied. In this work, the isotropic and direct summation methods are used to calculate the thermal conductivities of bulk Si, and Si/Ge quantum dot superlattices. The results show that the differences between the two methods increase substantially with the supercell size. These differences arise because the vibrational modes neglected in the isotropic assumption provide an increasingly important contribution to the thermal conductivity for larger supercells. To avoid the significant errors that can result from the isotropic assumption, direct summation is recommended for thermal conductivity calculations in superstructures.
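
    The two expressions being compared can be written compactly in the standard Klemens-Callaway notation (heat capacity C, group velocity v, relaxation time tau; this is the textbook form, not copied from the paper):

        % Direct summation over all modes (wavevector k, polarization nu) in the first BZ:
        \kappa_z = \frac{1}{V} \sum_{\mathbf{k},\nu} C(\mathbf{k},\nu)\, v_z(\mathbf{k},\nu)^2\, \tau(\mathbf{k},\nu)

        % Isotropic simplification: v_z^2 -> v^2/3 and the k-sum becomes an integral
        % over the wavevector magnitude only, dropping direction-dependent modes:
        \kappa = \frac{1}{6\pi^2} \sum_{\nu} \int C(k,\nu)\, v(k,\nu)^2\, \tau(k,\nu)\, k^2\, \mathrm{d}k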

  7. Evaluating growth assumptions using diameter or radial increments in natural even-aged longleaf pine

    Science.gov (United States)

    John C. Gilbert; Ralph S. Meldahl; Jyoti N. Rayamajhi; John S. Kush

    2010-01-01

    When using increment cores to predict future growth, one often assumes future growth is identical to past growth for individual trees. Once this assumption is accepted, a decision has to be made about which growth estimate should be used: constant diameter growth or constant basal area growth. Often, the assumption of constant diameter growth is used due to the ease...
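
    The two assumptions diverge because basal area scales with the square of diameter: a constant diameter increment implies an accelerating basal-area increment, while a constant basal-area increment implies a decelerating diameter increment. A small sketch of the difference, with hypothetical core measurements:

        import math

        def project_constant_diameter(d, delta_d, years):
            """Future diameter if every year adds the same diameter increment."""
            return d + delta_d * years

        def project_constant_basal_area(d, delta_d, years):
            """Future diameter if every year adds the basal-area increment implied
            by the first year's diameter growth (BA = pi/4 * d^2)."""
            delta_ba = math.pi / 4 * ((d + delta_d) ** 2 - d ** 2)
            ba = math.pi / 4 * d ** 2 + delta_ba * years
            return math.sqrt(4 * ba / math.pi)

        d0, dd = 30.0, 0.5  # cm and cm/yr from an increment core (hypothetical)
        print(project_constant_diameter(d0, dd, 20))    # 40.0 cm
        print(project_constant_basal_area(d0, dd, 20))  # ~38.8 cm: slower in diameter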

  8. Fair-sampling assumption is not necessary for testing local realism

    International Nuclear Information System (INIS)

    Berry, Dominic W.; Jeong, Hyunseok; Stobinska, Magdalena; Ralph, Timothy C.

    2010-01-01

    Almost all Bell inequality experiments to date have used postselection and therefore relied on the fair sampling assumption for their interpretation. The standard form of the fair sampling assumption is that the loss is independent of the measurement settings, so the ensemble of detected systems provides a fair statistical sample of the total ensemble. This is often assumed to be needed to interpret Bell inequality experiments as ruling out hidden-variable theories. Here we show that it is not necessary; the loss can depend on measurement settings, provided the detection efficiency factorizes as a function of the measurement settings and any hidden variable. This condition implies that Tsirelson's bound must be satisfied for entangled states. On the other hand, we show that it is possible for Tsirelson's bound to be violated while the Clauser-Horne-Shimony-Holt (CHSH)-Bell inequality still holds for unentangled states, and present an experimentally feasible example.
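
    The factorization condition described in the abstract fits in one line. With measurement setting a and hidden variable lambda (notation ours, not the authors'):

        % Detection efficiency may depend on both the setting and the hidden variable,
        % provided it factorizes into setting-only and hidden-variable-only parts:
        \eta(a, \lambda) = \eta_A(a)\, \eta_\Lambda(\lambda)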

  9. Design assumptions and bases for small D-T-fueled Spherical Tokamak (ST) fusion core

    International Nuclear Information System (INIS)

    Peng, Y.K.M.; Galambos, J.D.; Fogarty, P.J.

    1996-01-01

    Recent progress in defining the assumptions and clarifying the bases for a small D-T-fueled ST fusion core are presented. The paper covers several issues in the physics of ST plasmas, the technology of neutral beam injection, the engineering design configuration, and the center leg material under intense neutron irradiation. This progress was driven by the exciting data from pioneering ST experiments, a heightened interest in proof-of-principle experiments at the MA level in plasma current, and the initiation of the first conceptual design study of the small ST fusion core. The needs recently identified for a restructured fusion energy sciences program have provided a timely impetus for examining the subject of this paper. Our results, though preliminary in nature, strengthen the case for the potential realism and attractiveness of the ST approach

  10. Moving from assumption to observation: Implications for energy and emissions impacts of plug-in hybrid electric vehicles

    International Nuclear Information System (INIS)

    Davies, Jamie; Kurani, Kenneth S.

    2013-01-01

    Plug-in hybrid electric vehicles (PHEVs) are currently for sale in most parts of the United States, Canada, Europe and Japan. These vehicles are promoted as providing distinct consumer and public benefits at the expense of grid electricity. However, the specific benefits or impacts of PHEVs ultimately rely on consumers' purchase and vehicle use patterns. While considerable effort has been dedicated to understanding PHEV impacts on a per-mile basis, few studies have assessed the impacts of PHEVs given actual consumer use patterns or operating conditions. Instead, simplifying assumptions have been made about the types of cars individual consumers will choose to purchase and how they will drive and charge them. Here, we highlight some of these consumer purchase and use assumptions, studies which have employed these assumptions, and compare these assumptions to actual consumer data recorded in a PHEV demonstration project. Using simulation and hypothetical scenarios we discuss the implications for PHEV impact analyses and policy if assumptions about key PHEV consumer use variables such as vehicle choice, home charging frequency, distribution of driving distances, and access to workplace charging were to change. -- Highlights: •The specific benefits or impacts of PHEVs ultimately rely on consumers' purchase and vehicle use patterns. •Simplifying, untested assumptions have been made by prior studies about PHEV consumer driving, charging and vehicle purchase behaviors. •Some simplifying assumptions do not match observed data from a PHEV demonstration project. •Changing the assumptions about PHEV consumer driving, charging, and vehicle purchase behaviors affects estimates of PHEV impacts. •Premature simplification may have lasting consequences for standard setting and performance-based incentive programs which rely on these estimates

  11. Data-driven smooth tests of the proportional hazards assumption

    Czech Academy of Sciences Publication Activity Database

    Kraus, David

    2007-01-01

    Roč. 13, č. 1 (2007), s. 1-16 ISSN 1380-7870 R&D Projects: GA AV ČR(CZ) IAA101120604; GA ČR(CZ) GD201/05/H007 Institutional research plan: CEZ:AV0Z10750506 Keywords : Cox model * Neyman's smooth test * proportional hazards assumption * Schwarz's selection rule Subject RIV: BA - General Mathematics Impact factor: 0.491, year: 2007

  12. Robust inference in summary data Mendelian randomization via the zero modal pleiotropy assumption.

    Science.gov (United States)

    Hartwig, Fernando Pires; Davey Smith, George; Bowden, Jack

    2017-12-01

    Mendelian randomization (MR) is being increasingly used to strengthen causal inference in observational studies. Availability of summary data of genetic associations for a variety of phenotypes from large genome-wide association studies (GWAS) allows straightforward application of MR using summary data methods, typically in a two-sample design. In addition to the conventional inverse variance weighting (IVW) method, recently developed summary data MR methods, such as the MR-Egger and weighted median approaches, allow a relaxation of the instrumental variable assumptions. Here, a new method - the mode-based estimate (MBE) - is proposed to obtain a single causal effect estimate from multiple genetic instruments. The MBE is consistent when the largest number of similar (identical in infinite samples) individual-instrument causal effect estimates comes from valid instruments, even if the majority of instruments are invalid. We evaluate the performance of the method in simulations designed to mimic the two-sample summary data setting, and demonstrate its use by investigating the causal effect of plasma lipid fractions and urate levels on coronary heart disease risk. The MBE presented less bias and lower type-I error rates than other methods under the null in many situations. Its power to detect a causal effect was smaller compared with the IVW and weighted median methods, but was larger than that of MR-Egger regression, with sample size requirements typically smaller than those available from GWAS consortia. The MBE relaxes the instrumental variable assumptions, and should be used in combination with other approaches in sensitivity analyses. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association
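
    The core of the MBE is to form one Wald ratio per genetic instrument and take the mode of their smoothed distribution, so a majority of invalid instruments cannot bias the estimate as long as the largest cluster of ratios comes from valid ones (the zero modal pleiotropy assumption). A compact sketch of that idea, simplified relative to the published method (unweighted, fixed kernel bandwidth, no bootstrap inference):

        import numpy as np
        from scipy.stats import gaussian_kde

        def mode_based_estimate(beta_exposure, beta_outcome, grid_size=2000):
            """Mode of the kernel-smoothed instrument-specific causal ratios."""
            ratios = np.asarray(beta_outcome) / np.asarray(beta_exposure)  # Wald ratios
            kde = gaussian_kde(ratios)
            grid = np.linspace(ratios.min(), ratios.max(), grid_size)
            return grid[np.argmax(kde(grid))]

        # toy data: 7 of 10 instruments share the true effect 0.3, 3 are pleiotropic
        bx = np.full(10, 1.0)
        by = np.array([0.31, 0.29, 0.30, 0.32, 0.28, 0.30, 0.29, 1.2, -0.8, 0.9])
        print(mode_based_estimate(bx, by))  # close to 0.3 despite the invalid instruments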

  13. Discriminative Relational Topic Models.

    Science.gov (United States)

    Chen, Ning; Zhu, Jun; Xia, Fei; Zhang, Bo

    2015-05-01

    Relational topic models (RTMs) provide a probabilistic generative process to describe both the link structure and document contents for document networks, and they have shown promise on predicting network structures and discovering latent topic representations. However, existing RTMs have limitations in both the restricted model expressiveness and incapability of dealing with imbalanced network data. To expand the scope and improve the inference accuracy of RTMs, this paper presents three extensions: 1) unlike the common link likelihood with a diagonal weight matrix that allows the-same-topic interactions only, we generalize it to use a full weight matrix that captures all pairwise topic interactions and is applicable to asymmetric networks; 2) instead of doing standard Bayesian inference, we perform regularized Bayesian inference (RegBayes) with a regularization parameter to deal with the imbalanced link structure issue in real networks and improve the discriminative ability of learned latent representations; and 3) instead of doing variational approximation with strict mean-field assumptions, we present collapsed Gibbs sampling algorithms for the generalized relational topic models by exploring data augmentation without making restricting assumptions. Under the generic RegBayes framework, we carefully investigate two popular discriminative loss functions, namely, the logistic log-loss and the max-margin hinge loss. Experimental results on several real network datasets demonstrate the significance of these extensions on improving prediction performance.
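
    Extension 1 amounts to replacing a diagonal weight matrix in the link likelihood with a full one. Writing z-bar_d for the mean topic assignment vector of document d and sigma for the logistic function (notation ours), the two variants are:

        % Diagonal weights: only same-topic co-occurrence can explain a link.
        p(y_{ij} = 1 \mid \bar{z}_i, \bar{z}_j) = \sigma\big(\bar{z}_i^\top \mathrm{diag}(\eta)\, \bar{z}_j + b\big)

        % Full weight matrix: all pairwise topic interactions contribute, and the
        % model applies to asymmetric networks since W need not be symmetric.
        p(y_{ij} = 1 \mid \bar{z}_i, \bar{z}_j) = \sigma\big(\bar{z}_i^\top W \bar{z}_j + b\big)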

  14. Investigating Teachers' and Students' Beliefs and Assumptions about CALL Programme at Caledonian College of Engineering

    Science.gov (United States)

    Ali, Holi Ibrahim Holi

    2012-01-01

    This study is set to investigate students' and teachers' perceptions and assumptions about the newly implemented CALL Programme at the School of Foundation Studies, Caledonian College of Engineering, Oman. Two versions of a questionnaire were administered to 24 teachers and 90 students to collect their beliefs and assumptions about the CALL programme. The…

  15. The Metatheoretical Assumptions of Literacy Engagement: A Preliminary Centennial History

    Science.gov (United States)

    Hruby, George G.; Burns, Leslie D.; Botzakis, Stergios; Groenke, Susan L.; Hall, Leigh A.; Laughter, Judson; Allington, Richard L.

    2016-01-01

    In this review of literacy education research in North America over the past century, the authors examined the historical succession of theoretical frameworks on students' active participation in their own literacy learning, and in particular the metatheoretical assumptions that justify those frameworks. The authors used "motivation" and…

  16. Change in Soil Porosity under Load

    Science.gov (United States)

    Dyba, V. P.; Skibin, E. G.

    2017-11-01

    The theoretical basis of the process of soil compaction under various loading paths is considered in the article, and the theoretical assumptions are compared with the results of tests of clay soil on a stabilometer. A variant of the critical state model of the compacting rigid-plastic medium is also considered, the strength characteristics of which depend on the porosity coefficient. The loading surface is determined from the results of compression and stabilometric tests. To refine the results of this task, it is necessary to carry out stabilometric tests under conditions of simple loading, i.e. where the confining pressure is proportional to the vertical pressure, σ3 = kσ1. Within the study, attempts were made to confirm the model given at the beginning of the article by laboratory tests. Analysis of the results confirmed the stated theoretical assumptions.

  17. Assumptions of the primordial spectrum and cosmological parameter estimation

    International Nuclear Information System (INIS)

    Shafieloo, Arman; Souradeep, Tarun

    2011-01-01

    The observables of the perturbed universe, cosmic microwave background (CMB) anisotropy and large structures depend on a set of cosmological parameters, as well as the assumed nature of primordial perturbations. In particular, the shape of the primordial power spectrum (PPS) is, at best, a well-motivated assumption. It is known that the assumed functional form of the PPS in cosmological parameter estimation can affect the best-fit-parameters and their relative confidence limits. In this paper, we demonstrate that a specific assumed form actually drives the best-fit parameters into distinct basins of likelihood in the space of cosmological parameters where the likelihood resists improvement via modifications to the PPS. The regions where considerably better likelihoods are obtained allowing free-form PPS lie outside these basins. In the absence of a preferred model of inflation, this raises a concern that current cosmological parameter estimates are strongly prejudiced by the assumed form of PPS. Our results strongly motivate approaches toward simultaneous estimation of the cosmological parameters and the shape of the primordial spectrum from upcoming cosmological data. It is equally important for theorists to keep an open mind towards early universe scenarios that produce features in the PPS. (paper)

  18. Testing the rationality assumption using a design difference in the TV game show 'Jeopardy'

    OpenAIRE

    Sjögren Lindquist, Gabriella; Säve-Söderbergh, Jenny

    2006-01-01

    This paper empirically investigates the rationality assumption commonly applied in economic modeling by exploiting a design difference in the game show Jeopardy between the US and Sweden. In particular, we address the assumption of individuals' capabilities to process complex mathematical problems to find optimal strategies. The vital difference is that US contestants are given explicit information before they act, while Swedish contestants individually need to calculate the same information…

  19. Investigating assumptions of crown archetypes for modelling LiDAR returns

    NARCIS (Netherlands)

    Calders, K.; Lewis, P.; Disney, M.; Verbesselt, J.; Herold, M.

    2013-01-01

    LiDAR has the potential to derive canopy structural information such as tree height and leaf area index (LAI), via models of the LiDAR signal. Such models often make assumptions regarding crown shape to simplify parameter retrieval, and crown archetypes are typically assumed to contain a turbid medium…

  20. Basic Assumptions of the New Price System and Supplements to the Tariff System for Electricity Sale

    International Nuclear Information System (INIS)

    Klepo, M.

    1995-01-01

    The article outlines some basic assumptions of the new price system and major elements of the latest proposal for changes and supplements to the Tariff System for Electricity Sale in the Republic of Croatia, including an analysis of those elements which brought about the present unfavourable and non-productive relations within the electric power system. The paper proposes measures and actions which should, by means of the price system and tariff policy, improve the present unfavourable relations and their consequences and achieve a desirable consumption structure and characteristics, resulting in rational management and an effective power supply-economy relationship within the electric power system as a subsystem of the power supply sector. (author). 2 refs., 3 figs., 4 tabs

  1. Bell inequalities under non-ideal conditions

    OpenAIRE

    Especial, João N. C.

    2012-01-01

    Bell inequalities applicable to non-ideal EPRB experiments are critical to the interpretation of experimental Bell tests. In this article it is shown that previous treatments of this subject are incorrect due to an implicit assumption, and new inequalities are derived under general conditions. Published experimental evidence is reinterpreted under these results and found to be entirely compatible with local realism both when experiments involve inefficient detection, if fair-sampling detecti...

  2. Semi-Supervised Transductive Hot Spot Predictor Working on Multiple Assumptions

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-05-23

    Protein-protein interactions are critically dependent on just a few residues (“hot spots”) at the interfaces. Hot spots make a dominant contribution to the binding free energy and if mutated they can disrupt the interaction. As mutagenesis studies require significant experimental efforts, there exists a need for accurate and reliable computational hot spot prediction methods. Compared to the supervised hot spot prediction algorithms, the semi-supervised prediction methods can take into consideration both the labeled and unlabeled residues in the dataset during the prediction procedure. The transductive support vector machine has been utilized for this task and demonstrated a better prediction performance. To the best of our knowledge, however, none of the transductive semi-supervised algorithms takes all three semi-supervised assumptions, i.e., the smoothness, cluster and manifold assumptions, into account together during learning. In this paper, we propose a novel semi-supervised method for hot spot residue prediction, by considering all three semi-supervised assumptions using nonlinear models. Our algorithm, IterPropMCS, works in an iterative manner. In each iteration, the algorithm first propagates the labels of the labeled residues to the unlabeled ones, along the shortest path between them on a graph, assuming that they lie on a nonlinear manifold. Then it selects the most confident residues as the labeled ones for the next iteration, according to the cluster and smoothness criteria, which is implemented by a nonlinear density estimator. Experiments on a benchmark dataset, using protein structure-based features, demonstrate that our approach is effective in predicting hot spots and compares favorably to other available methods. The results also show that our method outperforms the state-of-the-art transductive learning methods.
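
    A stripped-down sketch of the propagate-then-select loop described above, with hypothetical simplifications: plain feature-space distances and a fixed promotion fraction stand in for the paper's shortest-path propagation and nonlinear density estimator:

        import numpy as np
        from scipy.spatial.distance import cdist

        def iter_prop_sketch(X_lab, y_lab, X_unl, n_iter=5, keep=0.2):
            """Iteratively propagate labels from the nearest labeled residues,
            then promote the most confident unlabeled residues (IterPropMCS-style)."""
            X_lab, y_lab, X_unl = map(np.asarray, (X_lab, y_lab, X_unl))
            for _ in range(n_iter):
                if len(X_unl) == 0:
                    break
                d = cdist(X_unl, X_lab)             # distances to labeled residues
                pred = y_lab[d.argmin(axis=1)]      # label of the nearest neighbor
                conf = 1.0 / (1.0 + d.min(axis=1))  # closer -> more confident
                top = np.argsort(-conf)[:max(1, int(keep * len(X_unl)))]
                X_lab = np.vstack([X_lab, X_unl[top]])
                y_lab = np.concatenate([y_lab, pred[top]])
                X_unl = np.delete(X_unl, top, axis=0)
            return X_lab, y_lab

        # toy usage with random 3-D residue features (hypothetical data)
        rng = np.random.default_rng(0)
        Xl, yl = rng.normal(size=(10, 3)), rng.integers(0, 2, 10)
        X_all, y_all = iter_prop_sketch(Xl, yl, rng.normal(size=(40, 3)))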

  3. Automatic ethics: the effects of implicit assumptions and contextual cues on moral behavior.

    Science.gov (United States)

    Reynolds, Scott J; Leavitt, Keith; DeCelles, Katherine A

    2010-07-01

    We empirically examine the reflexive or automatic aspects of moral decision making. To begin, we develop and validate a measure of an individual's implicit assumption regarding the inherent morality of business. Then, using an in-basket exercise, we demonstrate that an implicit assumption that business is inherently moral impacts day-to-day business decisions and interacts with contextual cues to shape moral behavior. Ultimately, we offer evidence supporting a characterization of employees as reflexive interactionists: moral agents whose automatic decision-making processes interact with the environment to shape their moral behavior.

  4. A critical assessment of the ecological assumptions underpinning compensatory mitigation of salmon-derived nutrients

    Science.gov (United States)

    Collins, Scott F.; Marcarelli, Amy M.; Baxter, Colden V.; Wipfli, Mark S.

    2015-01-01

    We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited.

  5. ψ -ontology result without the Cartesian product assumption

    Science.gov (United States)

    Myrvold, Wayne C.

    2018-05-01

    We introduce a weakening of the preparation independence postulate of Pusey et al. [Nat. Phys. 8, 475 (2012), 10.1038/nphys2309] that does not presuppose that the space of ontic states resulting from a product-state preparation can be represented by the Cartesian product of subsystem state spaces. On the basis of this weakened assumption, it is shown that, in any model that reproduces the quantum probabilities, any pair of pure quantum states |ψ⟩, |ϕ⟩ with |⟨ψ|ϕ⟩| ≤ 1/√2 must be ontologically distinct.

  6. Using Contemporary Art to Challenge Cultural Values, Beliefs, and Assumptions

    Science.gov (United States)

    Knight, Wanda B.

    2006-01-01

    Art educators, like many other educators born or socialized within the mainstream culture of a society, seldom have an opportunity to identify, question, and challenge their cultural values, beliefs, assumptions, and perspectives because school culture typically reinforces those they learn at home and in their communities (Bush & Simmons, 1990).…

  7. Assumptions and Challenges of Open Scholarship

    Directory of Open Access Journals (Sweden)

    George Veletsianos

    2012-10-01

    Researchers, educators, policymakers, and other education stakeholders hope and anticipate that openness and open scholarship will generate positive outcomes for education and scholarship. Given the emerging nature of open practices, educators and scholars are finding themselves in a position in which they can shape and/or be shaped by openness. The intention of this paper is (a) to identify the assumptions of the open scholarship movement and (b) to highlight challenges associated with the movement’s aspirations of broadening access to education and knowledge. Through a critique of technology use in education, an understanding of educational technology narratives and their unfulfilled potential, and an appreciation of the negotiated implementation of technology use, we hope that this paper helps spark a conversation for a more critical, equitable, and effective future for education and open scholarship.

  8. Spatial Angular Compounding for Elastography without the Incompressibility Assumption

    OpenAIRE

    Rao, Min; Varghese, Tomy

    2005-01-01

    Spatial-angular compounding is a new technique that enables the reduction of noise artifacts in ultrasound elastography. Previous results using spatial angular compounding, however, were based on the use of the tissue incompressibility assumption. Compounded elastograms were obtained from a spatially-weighted average of local strain estimated from radiofrequency echo signals acquired at different insonification angles. In this paper, we present a new method for reducing the noise artifacts in...

  9. First assumptions and overlooking competing causes of death

    DEFF Research Database (Denmark)

    Leth, Peter Mygind; Andersen, Anh Thao Nguyen

    2014-01-01

    Determining the most probable cause of death is important, and it is sometimes tempting to assume an obvious cause of death, when it readily presents itself, and stop looking for other competing causes of death. The case story presented in the article illustrates this dilemma. The first assumption...... of cause of death, which was based on results from bacteriology tests, proved to be wrong when the results from the forensic toxicology testing became available. This case also illustrates how post mortem computed tomography (PMCT) findings of radio opaque material in the stomach alerted the pathologist...

  10. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality an

  11. Nursing under the influence: a relational ethics perspective.

    Science.gov (United States)

    Kunyk, Diane; Austin, Wendy

    2012-05-01

    When nurses have active and untreated addictions, patient safety may be compromised and nurse-health endangered. Genuine responses are required to fulfil nurses' moral obligations to their patients as well as to their nurse-colleagues. Guided by core elements of relational ethics, the influences of nursing organizational responses along with the practice environment in shaping the situation are contemplated. This approach identifies the importance of consistency with nursing values, acknowledges nurses interdependence, and addresses the role of nursing organization as moral agent. By examining the relational space, the tension between what appears to be opposing moral responsibilities may be healed. Ongoing discourse to identify authentic actions for the professional practice issue of nursing under the influence is called upon.

  12.  Basic assumptions and definitions in the analysis of financial leverage

    Directory of Open Access Journals (Sweden)

    Tomasz Berent

    2015-12-01

    The financial leverage literature has been in a state of terminological chaos for decades, as evidenced, for example, by the Nobel Prize Lecture mistake on the one hand and the global financial crisis on the other. A meaningful analysis of the leverage phenomenon calls for the formulation of a coherent set of assumptions and basic definitions. The objective of the paper is to answer this call. The paper defines leverage as a value-neutral concept useful in explaining the magnification effect exerted by financial activity upon the whole spectrum of financial results. By adopting constructivism as a methodological approach, we are able to introduce various types of leverage, such as capital and income, base and non-base, accounting and market value, for levels and for distances (absolute and relative), costs and simple, etc. The new definitions formulated here are subsequently adopted in the analysis of the content of leverage statements used by the leading finance textbook.
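
    The classic instance of the magnification effect is the degree of financial leverage, which scales a percentage change in operating income into a larger percentage change in earnings per share; in standard textbook notation (not taken from this paper):

        % Degree of financial leverage with fixed interest expense I:
        % it exceeds 1, i.e. magnifies results, whenever I > 0.
        \mathrm{DFL} = \frac{\%\Delta \mathrm{EPS}}{\%\Delta \mathrm{EBIT}} = \frac{\mathrm{EBIT}}{\mathrm{EBIT} - I}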

  13. A Taxonomy of Latent Structure Assumptions for Probability Matrix Decomposition Models.

    Science.gov (United States)

    Meulders, Michel; De Boeck, Paul; Van Mechelen, Iven

    2003-01-01

    Proposed a taxonomy of latent structure assumptions for probability matrix decomposition (PMD) that includes the original PMD model and a three-way extension of the multiple classification latent class model. Simulation study results show the usefulness of the taxonomy. (SLD)

  14. Age-related variance in decisions under ambiguity is explained by changes in reasoning, executive functions, and decision-making under risk.

    Science.gov (United States)

    Schiebener, Johannes; Brand, Matthias

    2017-06-01

    Previous literature has explained older individuals' disadvantageous decision-making under ambiguity in the Iowa Gambling Task (IGT) by reduced emotional warning signals preceding decisions. We argue that age-related reductions in IGT performance may also be explained by reductions in certain cognitive abilities (reasoning, executive functions). In 210 participants (18-86 years), we found that the age-related variance on IGT performance occurred only in the last 60 trials. The effect was mediated by cognitive abilities and their relation with decision-making performance under risk with explicit rules (Game of Dice Task). Thus, reductions in cognitive functions in older age may be associated with both a reduced ability to gain explicit insight into the rules of the ambiguous decision situation and with failure to choose the less risky options consequently after the rules have been understood explicitly. Previous literature may have underestimated the relevance of cognitive functions for age-related decline in decision-making performance under ambiguity.

  15. Basic concepts and assumptions behind the new ICRP recommendations

    International Nuclear Information System (INIS)

    Lindell, B.

    1979-01-01

    A review is given of some of the basic concepts and assumptions behind the current recommendations by the International Commission on Radiological Protection in ICRP Publications 26 and 28, which form the basis for the revision of the Basic Safety Standards jointly undertaken by IAEA, ILO, NEA and WHO. Special attention is given to the assumption of a linear, non-threshold dose-response relationship for stochastic radiation effects such as cancer and hereditary harm. The three basic principles of protection are discussed: justification of practice, optimization of protection and individual risk limitation. In the new ICRP recommendations particular emphasis is given to the principle of keeping all radiation doses as low as is reasonably achievable. A consequence of this is that the ICRP dose limits are now given as boundary conditions for the justification and optimization procedures rather than as values that should be used for purposes of planning and design. The fractional increase in total risk at various ages after continuous exposure near the dose limits is given as an illustration. The need for taking other sources, present and future, into account when applying the dose limits leads to the use of the commitment concept. This is briefly discussed as well as the new quantity, the effective dose equivalent, introduced by ICRP. (author)

  16. Has the "Equal Environments" assumption been tested in twin studies?

    Science.gov (United States)

    Eaves, Lindon; Foley, Debra; Silberg, Judy

    2003-12-01

    A recurring criticism of the twin method for quantifying genetic and environmental components of human differences is the necessity of the so-called "equal environments assumption" (EEA) (i.e., that monozygotic and dizygotic twins experience equally correlated environments). It has been proposed to test the EEA by stratifying twin correlations by indices of the amount of shared environment. However, relevant environments may also be influenced by genetic differences. We present a model for the role of genetic factors in niche selection by twins that may account for variation in indices of the shared twin environment (e.g., contact between members of twin pairs). Simulations reveal that stratification of twin correlations by amount of contact can yield spurious evidence of large shared environmental effects in some strata and even give false indications of genotype x environment interaction. The stratification approach to testing the equal environments assumption may be misleading and the results of such tests may actually be consistent with a simpler theory of the role of genetic factors in niche selection.

  17. Bell violation using entangled photons without the fair-sampling assumption.

    Science.gov (United States)

    Giustina, Marissa; Mech, Alexandra; Ramelow, Sven; Wittmann, Bernhard; Kofler, Johannes; Beyer, Jörn; Lita, Adriana; Calkins, Brice; Gerrits, Thomas; Nam, Sae Woo; Ursin, Rupert; Zeilinger, Anton

    2013-05-09

    The violation of a Bell inequality is an experimental observation that forces the abandonment of a local realistic viewpoint--namely, one in which physical properties are (probabilistically) defined before and independently of measurement, and in which no physical influence can propagate faster than the speed of light. All such experimental violations require additional assumptions depending on their specific construction, making them vulnerable to so-called loopholes. Here we use entangled photons to violate a Bell inequality while closing the fair-sampling loophole, that is, without assuming that the sample of measured photons accurately represents the entire ensemble. To do this, we use the Eberhard form of Bell's inequality, which is not vulnerable to the fair-sampling assumption and which allows a lower collection efficiency than other forms. Technical improvements of the photon source and high-efficiency transition-edge sensors were crucial for achieving a sufficiently high collection efficiency. Our experiment makes the photon the first physical system for which each of the main loopholes has been closed, albeit in different experiments.

  18. Under-reporting of work-related musculoskeletal disorders in the Veterans Administration.

    Science.gov (United States)

    Siddharthan, Kris; Hodgson, Michael; Rosenberg, Deborah; Haiduven, Donna; Nelson, Audrey

    2006-01-01

    Work-related musculoskeletal disorders following patient contact represent a major concern for health care workers. Unfortunately, research and prevention have been hampered by difficulties ascertaining true prevalence rates owing to under-reporting of these injuries. The purpose of this study is to determine the predictors of under-reporting of work-related musculoskeletal injuries and the reasons for it. Multivariate analysis using data obtained in a survey of Veterans Administration employees in the USA was used to determine under-reporting patterns among registered nurses, licensed practical nurses and nursing assistants. Focus groups among health care workers were conducted at one of the largest Veterans Administration hospitals to determine reasons for under-reporting. A significant number of workers reported work-related musculoskeletal pain, which was not reported as an injury but required rescheduling work such as changing shifts and taking sick leave to recuperate. The findings indicate that older health care workers and those with longer service were less likely to report, as were those working evening and night shifts. Hispanic workers and personnel who had repetitive injuries were prone to under-reporting, as were workers in places that lack proper equipment to move and handle patients. Reasons for under-reporting include the time involved, peer pressure not to report and frustration with workers' compensation procedures. This study provides insights into under-reporting of musculoskeletal injuries in a major US government organization. The research indicates that current reporting procedures appear to be overly cumbersome in time and effort. More flexible work assignments are needed to cover staff shortfalls owing to injuries. Health education on the detrimental long-term effects of ergonomic injuries and the need for prompt attention to injuries should prove useful in improving rates of reporting.

  19. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

    Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user-base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and likely more applied in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus using improved data and enhanced assumptions on model outcomes and thus ultimately, study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at the both the field and watershed scales.

  20. Sensitivity of Rooftop PV Projections in the SunShot Vision Study to Market Assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Drury, E.; Denholm, P.; Margolis, R.

    2013-01-01

    The SunShot Vision Study explored the potential growth of solar markets if solar prices decreased by about 75% from 2010 to 2020. The SolarDS model was used to simulate rooftop PV demand for this study, based on several PV market assumptions--future electricity rates, customer access to financing, and others--in addition to the SunShot PV price projections. This paper finds that modeled PV demand is highly sensitive to several non-price market assumptions, particularly PV financing parameters.

  1. Ego depletion and attention regulation under pressure: is a temporary loss of self-control strength indeed related to impaired attention regulation?

    Science.gov (United States)

    Englert, Chris; Zwemmer, Kris; Bertrams, Alex; Oudejans, Raôul R

    2015-04-01

    In the current study we investigated whether ego depletion negatively affects attention regulation under pressure in sports by assessing participants' dart throwing performance and accompanying gaze behavior. According to the strength model of self-control, the most important aspect of self-control is attention regulation. Because higher levels of state anxiety are associated with impaired attention regulation, we chose a mixed design with ego depletion (yes vs. no) as the between-subjects factor and anxiety level (high vs. low) as the within-subjects factor. Participants performed a perceptual-motor task requiring selective attention, namely, dart throwing. In line with our expectations, depleted participants in the high-anxiety condition performed worse and displayed a shorter final fixation on the bull's eye, demonstrating that when one's self-control strength is depleted, attention regulation under pressure cannot be maintained. This is the first study that directly supports the general assumption that ego depletion is a major factor in influencing attention regulation under pressure.

  2. Public-private partnerships to improve primary healthcare surgeries: clarifying assumptions about the role of private provider activities.

    Science.gov (United States)

    Mudyarabikwa, Oliver; Tobi, Patrick; Regmi, Krishna

    2017-07-01

    Aim To examine assumptions about public-private partnership (PPP) activities and their role in improving public procurement of primary healthcare surgeries. PPPs were developed to improve the quality of care and patient satisfaction. However, evidence of their effectiveness in delivering health benefits is limited. A qualitative study design was employed. A total of 25 interviews with public sector staff (n=23) and private sector managers (n=2) were conducted to understand their interpretations of assumptions in the activities of private investors and service contractors participating in Local Improvement Finance Trust (LIFT) partnerships. Realist evaluation principles were applied in the data analysis to interpret the findings. Six thematic areas of assumed health benefits were identified: (i) quality improvement; (ii) improved risk management; (iii) reduced procurement costs; (iv) increased efficiency; (v) community involvement; and (vi) sustainable investment. Primary Care Trusts that chose to procure their surgeries through LIFT were expected to support its implementation by providing an environment conducive for the private participants to achieve these benefits. Private participant activities were found to be based on a range of explicit and tacit assumptions perceived helpful in achieving government objectives for LIFT. The success of PPPs depended upon private participants' (i) capacity to assess how PPP assumptions added value to their activities, (ii) effectiveness in interpreting assumptions in their expected activities, and (iii) preparedness to align their business principles to government objectives for PPPs. They risked missing some of the expected benefits because of some factors constraining realization of the assumptions. The ways in which private participants preferred to carry out their activities also influenced the extent to which expected benefits were achieved. Giving more discretion to public than private participants over critical

  3. On the macro-economic impacts of climate change under informational failures

    OpenAIRE

    CAO, Ruixuan; Gohin, Alexandre

    2012-01-01

    Although the sources, extent and physical impacts of the future climate change are highly uncertain, available dynamic economic assessments implicitly assume that economic agents perfectly know them. Perfect foresight, rational expectations or active learning are standard assumptions underlying simulated results. To the contrary, this paper builds on the assumption that economic agents may suffer for a while from limited knowledge about the average and variability of physical impa...

  4. Tax Treaty Treatment of Dividend Related Payments under Share Loan Agreements

    DEFF Research Database (Denmark)

    Dyppel, Katja Joo

    2014-01-01

    The article analyses some of the qualification and allocation challenges that dividend related payments under share loan agreements give rise to for tax treaty purposes. The analysis is based on constructed scenarios illustrating how inconsistent domestic allocation of the dividend related payments...... loan agreement fulfils the beneficial ownership requirement....

  5. Sensitivity Analysis and Bounding of Causal Effects with Alternative Identifying Assumptions

    Science.gov (United States)

    Jo, Booil; Vinokur, Amiram D.

    2011-01-01

    When identification of causal effects relies on untestable assumptions regarding nonidentified parameters, sensitivity of causal effect estimates is often questioned. For proper interpretation of causal effect estimates in this situation, deriving bounds on causal parameters or exploring the sensitivity of estimates to scientifically plausible…

  6. Super learning to hedge against incorrect inference from arbitrary parametric assumptions in marginal structural modeling.

    Science.gov (United States)

    Neugebauer, Romain; Fireman, Bruce; Roy, Jason A; Raebel, Marsha A; Nichols, Gregory A; O'Connor, Patrick J

    2013-08-01

    Clinical trials are unlikely to ever be launched for many comparative effectiveness research (CER) questions. Inferences from hypothetical randomized trials may however be emulated with marginal structural modeling (MSM) using observational data, but success in adjusting for time-dependent confounding and selection bias typically relies on parametric modeling assumptions. If these assumptions are violated, inferences from MSM may be inaccurate. In this article, we motivate the application of a data-adaptive estimation approach called super learning (SL) to avoid reliance on arbitrary parametric assumptions in CER. Using the electronic health records data from adults with new-onset type 2 diabetes, we implemented MSM with inverse probability weighting (IPW) estimation to evaluate the effect of three oral antidiabetic therapies on the worsening of glomerular filtration rate. Inferences from IPW estimation were noticeably sensitive to the parametric assumptions about the associations between both the exposure and censoring processes and the main suspected source of confounding, that is, time-dependent measurements of hemoglobin A1c. SL was successfully implemented to harness flexible confounding and selection bias adjustment from existing machine learning algorithms. Erroneous IPW inference about clinical effectiveness because of arbitrary and incorrect modeling decisions may be avoided with SL. Copyright © 2013 Elsevier Inc. All rights reserved.
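
    A minimal sketch of the contrast at issue, assuming a point-treatment simplification (the study's setting is longitudinal) and using scikit-learn's cross-validated stacking as a stand-in for a full super learner; the data and model choices below are illustrative, not the paper's:

        # IPW effect estimation with a stacked ("super learner" style) propensity
        # model versus a single main-terms logistic model. Point-treatment
        # simplification of the paper's longitudinal MSM setting; data simulated.
        import numpy as np
        from sklearn.ensemble import (GradientBoostingClassifier,
                                      RandomForestClassifier, StackingClassifier)
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 5000
        x = rng.normal(size=(n, 3))                        # confounders (e.g., A1c history)
        p_a = 1 / (1 + np.exp(-(x[:, 0] + x[:, 1] ** 2)))  # true, nonlinear treatment model
        a = rng.binomial(1, p_a)
        y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * a + x[:, 0]))))

        def ipw_risk_difference(prop_model):
            """Horvitz-Thompson IPW contrast E[Y(1)] - E[Y(0)]."""
            p = prop_model.fit(x, a).predict_proba(x)[:, 1]
            p = np.clip(p, 0.01, 0.99)                     # truncate weights for stability
            return np.mean(a * y / p) - np.mean((1 - a) * y / (1 - p))

        parametric = LogisticRegression()                  # main terms only: misspecified here
        stacked = StackingClassifier(                      # cross-validated stacking
            estimators=[("lr", LogisticRegression()),
                        ("rf", RandomForestClassifier(n_estimators=200)),
                        ("gb", GradientBoostingClassifier())],
            final_estimator=LogisticRegression(), cv=5)

        print("IPW (parametric propensity):", ipw_risk_difference(parametric))
        print("IPW (stacked propensity):   ", ipw_risk_difference(stacked))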

  7. Quantum information versus black hole physics: deep firewalls from narrow assumptions.

    Science.gov (United States)

    Braunstein, Samuel L; Pirandola, Stefano

    2018-07-13

    The prevalent view that evaporating black holes should simply be smaller black holes has been challenged by the firewall paradox. In particular, this paradox suggests that something different occurs once a black hole has evaporated to one-half its original surface area. Here, we derive variations of the firewall paradox by tracking the thermodynamic entropy within a black hole across its entire lifetime and extend it even to anti-de Sitter space-times. Our approach sweeps away many unnecessary assumptions, allowing us to demonstrate a paradox exists even after its initial onset (when conventional assumptions render earlier analyses invalid). The most natural resolution may be to accept firewalls as a real phenomenon. Further, the vast entropy accumulated implies a deep firewall that goes 'all the way down' in contrast with earlier work describing only a structure at the horizon. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).

  8. Quantum information versus black hole physics: deep firewalls from narrow assumptions

    Science.gov (United States)

    Braunstein, Samuel L.; Pirandola, Stefano

    2018-07-01

    The prevalent view that evaporating black holes should simply be smaller black holes has been challenged by the firewall paradox. In particular, this paradox suggests that something different occurs once a black hole has evaporated to one-half its original surface area. Here, we derive variations of the firewall paradox by tracking the thermodynamic entropy within a black hole across its entire lifetime and extend it even to anti-de Sitter space-times. Our approach sweeps away many unnecessary assumptions, allowing us to demonstrate a paradox exists even after its initial onset (when conventional assumptions render earlier analyses invalid). The most natural resolution may be to accept firewalls as a real phenomenon. Further, the vast entropy accumulated implies a deep firewall that goes `all the way down' in contrast with earlier work describing only a structure at the horizon. This article is part of a discussion meeting issue `Foundations of quantum mechanics and their impact on contemporary society'.

  9. Cosmological tests of general relativity

    International Nuclear Information System (INIS)

    Hut, P.

    1977-01-01

    It is stated that the general relativity theory could be tested on a cosmological scale by measuring the Hubble constant and the deceleration parameter, if, in addition, everything could be known about the matter filling the universe. If, on the other hand, nothing could be presupposed about the matter content of the universe, general relativity could not be tested by measuring any number of time derivatives of the scale factor. But upon making the assumption of a universe filled with a non-interacting mixture of non-relativistic matter and radiation, general relativity can in principle be tested by measuring the first five derivatives of the scale factor. Some general relations are here presented using this assumption. (author)
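
    The kinematic quantities involved can be made concrete with the standard Taylor expansion of the scale factor about the present epoch; this is a textbook identity, not reproduced from the paper, and the higher derivatives continue the same pattern:

        % Standard kinematic expansion of the scale factor a(t) about t_0;
        % H_0 and q_0 are the first two of the derivative-based parameters.
        a(t) = a(t_0)\Big[ 1 + H_0 (t - t_0) - \tfrac{1}{2} q_0 H_0^2 (t - t_0)^2
               + \tfrac{1}{6} j_0 H_0^3 (t - t_0)^3 + \dots \Big],
        \qquad
        H_0 = \frac{\dot a}{a}\Big|_{t_0}, \quad
        q_0 = -\frac{\ddot a\, a}{\dot a^2}\Big|_{t_0}, \quad
        j_0 = \frac{\dddot a\, a^2}{\dot a^3}\Big|_{t_0}.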

  10. Assumption-free estimation of heritability from genome-wide identity-by-descent sharing between full siblings.

    Directory of Open Access Journals (Sweden)

    2006-03-01

    Full Text Available The study of continuously varying, quantitative traits is important in evolutionary biology, agriculture, and medicine. Variation in such traits is attributable to many, possibly interacting, genes whose expression may be sensitive to the environment, which makes their dissection into underlying causative factors difficult. An important population parameter for quantitative traits is heritability, the proportion of total variance that is due to genetic factors. Response to artificial and natural selection and the degree of resemblance between relatives are all a function of this parameter. Following the classic paper by R. A. Fisher in 1918, the estimation of additive and dominance genetic variance and heritability in populations is based upon the expected proportion of genes shared between different types of relatives, and explicit, often controversial and untestable models of genetic and non-genetic causes of family resemblance. With genome-wide coverage of genetic markers it is now possible to estimate such parameters solely within families using the actual degree of identity-by-descent sharing between relatives. Using genome scans on 4,401 quasi-independent sib pairs of which 3,375 pairs had phenotypes, we estimated the heritability of height from empirical genome-wide identity-by-descent sharing, which varied from 0.374 to 0.617 (mean 0.498, standard deviation 0.036). The variance in identity-by-descent sharing per chromosome and per genome was consistent with theory. The maximum likelihood estimate of the heritability for height was 0.80 with no evidence for non-genetic causes of sib resemblance, consistent with results from independent twin and family studies but using an entirely separate source of information. Our application shows that it is feasible to estimate genetic variance solely from within-family segregation and provides an independent validation of previously untestable assumptions. Given sufficient data, our new paradigm will
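
    A sketch of the within-family idea in its simpler regression flavour (the study itself fits maximum likelihood variance components); the variable names and all numbers below are illustrative:

        # Within-family heritability sketch: regress sib-pair phenotype
        # cross-products on genome-wide IBD sharing (a Haseman-Elston-style
        # regression; the paper fits ML variance components). Data simulated.
        import numpy as np

        rng = np.random.default_rng(1)
        n_pairs, h2_true = 4000, 0.8
        pihat = np.clip(rng.normal(0.5, 0.04, n_pairs), 0.0, 1.0)  # realized IBD share

        # Standardized sib phenotypes with Corr(y1, y2 | pihat) = pihat * h2
        rho = pihat * h2_true
        y1 = rng.normal(size=n_pairs)
        y2 = rho * y1 + np.sqrt(1 - rho**2) * rng.normal(size=n_pairs)

        # Slope of cross-products on pihat estimates h2; intercept picks up any
        # shared-environment component (zero in this simulation).
        X = np.column_stack([np.ones(n_pairs), pihat])
        (c2_hat, h2_hat), *_ = np.linalg.lstsq(X, y1 * y2, rcond=None)
        print(f"h2 estimate: {h2_hat:.2f} (true {h2_true}), intercept {c2_hat:.2f}")
        # Note: sampling error is large because pihat varies so little between
        # sib pairs, which is exactly why thousands of pairs are needed.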

  11. Plastic limit loads for cylindrical shell intersections under combined loading

    International Nuclear Information System (INIS)

    Skopinsky, V.N.; Berkov, N.A.; Vogov, R.A.

    2015-01-01

    In this research, applied methods of nonlinear analysis and results of determining the plastic limit loads for shell intersection configurations under combined internal pressure, in-plane moment and out-plane moment loadings are presented. The numerical analysis of shell intersections is performed using the finite element method, geometrically nonlinear shell theory in quadratic approximation and plasticity theory. For determining the load parameter of proportional combined loading, the developed maximum criterion of rate of change of relative plastic work is employed. The graphical results for model of cylindrical shell intersection under different two-parameter combined loadings (as generalized plastic limit load curves) and three-parameter combined loading (as generalized plastic limit load surface) are presented on the assumption that the internal pressure, in-plane moment and out-plane moment loads were applied in a proportional manner. - Highlights: • This paper presents nonlinear two-dimensional FE analysis for shell intersections. • Determining the plastic limit loads under combined loading is considered. • Developed maximum criterion of rate of change of relative plastic work is employed. • Plastic deformation mechanism in shell intersections is discussed. • Results for generalized plastic limit load curves of branch intersection are presented

  12. The extended evolutionary synthesis: its structure, assumptions and predictions

    Science.gov (United States)

    Laland, Kevin N.; Uller, Tobias; Feldman, Marcus W.; Sterelny, Kim; Müller, Gerd B.; Moczek, Armin; Jablonka, Eva; Odling-Smee, John

    2015-01-01

    Scientific activities take place within the structured sets of ideas and assumptions that define a field and its practices. The conceptual framework of evolutionary biology emerged with the Modern Synthesis in the early twentieth century and has since expanded into a highly successful research program to explore the processes of diversification and adaptation. Nonetheless, the ability of that framework satisfactorily to accommodate the rapid advances in developmental biology, genomics and ecology has been questioned. We review some of these arguments, focusing on literatures (evo-devo, developmental plasticity, inclusive inheritance and niche construction) whose implications for evolution can be interpreted in two ways—one that preserves the internal structure of contemporary evolutionary theory and one that points towards an alternative conceptual framework. The latter, which we label the ‘extended evolutionary synthesis' (EES), retains the fundaments of evolutionary theory, but differs in its emphasis on the role of constructive processes in development and evolution, and reciprocal portrayals of causation. In the EES, developmental processes, operating through developmental bias, inclusive inheritance and niche construction, share responsibility for the direction and rate of evolution, the origin of character variation and organism–environment complementarity. We spell out the structure, core assumptions and novel predictions of the EES, and show how it can be deployed to stimulate and advance research in those fields that study or use evolutionary biology. PMID:26246559

  13. Transportation radiological risk assessment for the programmatic environmental impact statement: An overview of methodologies, assumptions, and input parameters

    International Nuclear Information System (INIS)

    Monette, F.; Biwer, B.; LePoire, D.; Chen, S.Y.

    1994-01-01

    The U.S. Department of Energy is considering a broad range of alternatives for the future configuration of radioactive waste management at its network of facilities. Because the transportation of radioactive waste is an integral component of the management alternatives being considered, the estimated human health risks associated with both routine and accident transportation conditions must be assessed to allow a complete appraisal of the alternatives. This paper provides an overview of the technical approach being used to assess the radiological risks from the transportation of radioactive wastes. The approach presented employs the RADTRAN 4 computer code to estimate the collective population risk during routine and accident transportation conditions. Supplemental analyses are conducted using the RISKIND computer code to address areas of specific concern to individuals or population subgroups. RISKIND is used for estimating routine doses to maximally exposed individuals and for assessing the consequences of the most severe credible transportation accidents. The transportation risk assessment is designed to ensure -- through uniform and judicious selection of models, data, and assumptions -- that relative comparisons of risk among the various alternatives are meaningful. This is accomplished by uniformly applying common input parameters and assumptions to each waste type for all alternatives. The approach presented can be applied to all radioactive waste types and provides a consistent and comprehensive evaluation of transportation-related risk

  14. A matter of fact? Adolescents' assumptions about crime, laws, and authority and their domain-specific beliefs about punishment.

    Science.gov (United States)

    Oosterhoff, Benjamin; Shook, Natalie J; Metzger, Aaron

    2018-01-01

    This study examined adolescents' beliefs about the amount of punishment individuals should receive for violating different laws and whether these beliefs are connected with their informational assumptions (i.e., perceived facts) about crime, laws, and authority. American adolescents (N = 340; M_age = 16.64, 58.2% female) reported their judgments concerning the appropriate punishment for violating laws regulating domain-specific behaviors and their informational assumptions regarding the prevalence and causes of crime, beliefs that authority is knowledgeable, and the purpose of punishment. Greater internal attributions for crime were associated with stronger punishment judgments for violating laws that regulate moral and conventional issues. Greater beliefs that punishment teaches right from wrong were associated with stronger punishment judgments for violating laws that regulate drug-related prudential issues, and lower punishment judgments for violating laws that regulate personal issues. Greater beliefs that authorities are more knowledgeable than others were associated with stronger punishment judgments for violating laws that regulate personal issues. Copyright © 2017 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  15. Unconditionally Secure and Universally Composable Commitments from Physical Assumptions

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Scafuro, Alessandra

    2013-01-01

    We present a constant-round unconditional black-box compiler that transforms any ideal (i.e., statistically-hiding and statistically-binding) straight-line extractable commitment scheme, into an extractable and equivocal commitment scheme, therefore yielding to UC-security [9]. We exemplify the u...... of unconditional UC-security with (malicious) PUFs and stateless tokens, our compiler can be instantiated with any ideal straight-line extractable commitment scheme, thus allowing the use of various setup assumptions which may better fit the application or the technology available....

  16. The philosophy and assumptions underlying exposure limits for ionising radiation, inorganic lead, asbestos and noise

    International Nuclear Information System (INIS)

    Akber, R.

    1996-01-01

    Full text: A review of the literature relating to exposure to, and exposure limits for, ionising radiation, inorganic lead, asbestos and noise was undertaken. The four hazards were chosen because they were insidious and ubiquitous, were potential hazards in both occupational and environmental settings and had early and late effects depending on dose and dose rate. For all four hazards, the effect of the hazard was enhanced by other exposures such as smoking or organic solvents. In the cases of inorganic lead and noise, there were documented health effects which affected a significant percentage of the exposed populations at or below the [effective] exposure limits. This was not the case for ionising radiation and asbestos. None of the exposure limits considered exposure to multiple mutagens/carcinogens in the calculation of risk. Ionising radiation was the only one of the hazards to have a model of all likely exposures, occupational, environmental and medical, as the basis for the exposure limits. The other three considered occupational exposure in isolation from environmental exposure. Inorganic lead and noise had economic considerations underlying the exposure limits, and the exposure limits for asbestos were based on the current limit of detection. All four hazards had many variables associated with exposure, including idiosyncratic factors, that made modelling the risk very complex. The scientific idea of a time-weighted average based on an eight-hour day and forty-hour week, on which the exposure limits for lead, asbestos and noise were based, was underpinned by neither empirical evidence nor scientific hypothesis. The methodology of the ACGIH in setting limits later brought into law may have been unduly influenced by the industries most closely affected by those limits. Measuring exposure over part of an eight-hour day and extrapolating to model exposure over the longer term is not the most effective way to model exposure. The statistical techniques used

  17. Sensitivity of C-Band Polarimetric Radar-Based Drop Size Distribution Measurements to Maximum Diameter Assumptions

    Science.gov (United States)

    Carey, Lawrence D.; Petersen, Walter A.

    2011-01-01

    The estimation of rain drop size distribution (DSD) parameters from polarimetric radar observations is accomplished by first establishing a relationship between differential reflectivity (Z_dr) and the central tendency of the rain DSD such as the median volume diameter (D0). Since Z_dr does not provide a direct measurement of DSD central tendency, the relationship is typically derived empirically from rain drop and radar scattering models (e.g., D0 = F[Z_dr]). Past studies have explored the general sensitivity of these models to temperature, radar wavelength, the drop shape vs. size relation, and DSD variability. Much progress has been made in recent years in measuring the drop shape and DSD variability using surface-based disdrometers, such as the 2D Video disdrometer (2DVD), and documenting their impact on polarimetric radar techniques. In addition to measuring drop shape, another advantage of the 2DVD over earlier impact type disdrometers is its ability to resolve drop diameters in excess of 5 mm. Despite this improvement, the sampling limitations of a disdrometer, including the 2DVD, make it very difficult to adequately measure the maximum drop diameter (D_max) present in a typical radar resolution volume. As a result, D_max must still be assumed in the drop and radar models from which D0 = F[Z_dr] is derived. Since scattering resonance at C-band wavelengths begins to occur in drop diameters larger than about 5 mm, modeled C-band radar parameters, particularly Z_dr, can be sensitive to D_max assumptions. In past C-band radar studies, a variety of D_max assumptions have been made, including the actual disdrometer estimate of D_max during a typical sampling period (e.g., 1-3 minutes), D_max = C (where C is constant at values from 5 to 8 mm), and D_max = M*D0 (where the constant multiple, M, is fixed at values ranging from 2.5 to 3.5). The overall objective of this NASA Global Precipitation Measurement
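
    The scattering side of this sensitivity requires a T-matrix calculation, which is beyond a short sketch; the following illustrates only the microphysical half of the statement, namely how D0 for a fixed (illustrative, large-drop) gamma DSD shifts under the different D_max conventions listed above:

        # Illustrative microphysical sketch only (no radar scattering): how the
        # median volume diameter D0 of a fixed gamma DSD, N(D) ~ D^mu * exp(-lam*D),
        # shifts under different assumed maximum-diameter truncations D_max.
        import numpy as np

        mu, lam = 3.0, 1.5                      # illustrative shape/slope, D in mm
        D = np.linspace(0.01, 10.0, 4000)
        n_D = D**mu * np.exp(-lam * D)          # intercept N0 cancels out of D0

        def d0(d_max):
            m = D <= d_max
            cdf = np.cumsum(n_D[m] * D[m]**3)   # liquid-water (D^3) weighting
            return np.interp(0.5, cdf / cdf[-1], D[m])

        for d_max in (5.0, 6.0, 8.0, np.inf):   # "D_max = C" style assumptions
            print(f"D_max = {d_max:>4} mm -> D0 = {d0(d_max):.2f} mm")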

  18. Washington International Renewable Energy Conference (WIREC) 2008 Pledges. Methodology and Assumptions Summary

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, Bill [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bilello, Daniel E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cowlin, Shannon C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wise, Alison [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2008-08-01

    This report describes the methodology and assumptions used by NREL in quantifying the potential CO2 reductions resulting from more than 140 governments, international organizations, and private-sector representatives pledging to advance the uptake of renewable energy.

  19. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    Science.gov (United States)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most of the previous studies have considered climate models and scenarios as major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contribution from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) the stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to overall uncertainty in streamflow projections using an analysis of variance (ANOVA) approach. Generally, most of the impact assessment studies are carried out with hydrologic model parameters held unchanged in the future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression based methodology is presented to obtain the hydrologic model parameters under changing land use and climate scenarios in the future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin, under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in UGB under the nonstationary model condition is found to reduce in the future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that the model stationarity assumption and GCMs, along with their interactions with emission scenarios, act as dominant sources of uncertainty. This paper provides a generalized framework for hydrologists to examine the stationarity assumption of models before considering them
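
    A toy two-factor version of the ANOVA segregation (GCM x emission scenario only; the study additionally separates land use, the stationarity assumption and internal variability), with simulated numbers standing in for streamflow projections:

        # Toy ANOVA partition of projection variance across GCMs and emission
        # scenarios (the study also separates land use, model stationarity and
        # internal variability). All numbers simulated.
        import numpy as np

        rng = np.random.default_rng(2)
        gcm = rng.normal(0, 30, 5)                    # 5 GCM effects (m3/s)
        scen = rng.normal(0, 10, 3)                   # 3 emission-scenario effects
        flow = gcm[:, None] + scen[None, :] + rng.normal(0, 5, (5, 3))

        grand = flow.mean()
        ss_gcm = 3 * np.sum((flow.mean(axis=1) - grand) ** 2)
        ss_scen = 5 * np.sum((flow.mean(axis=0) - grand) ** 2)
        ss_tot = np.sum((flow - grand) ** 2)
        ss_rest = ss_tot - ss_gcm - ss_scen           # interaction + residual

        for name, ss in [("GCM", ss_gcm), ("scenario", ss_scen),
                         ("interaction/residual", ss_rest)]:
            print(f"{name:>22}: {100 * ss / ss_tot:5.1f}% of total variance")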

  20. Halo-independent direct detection analyses without mass assumptions

    International Nuclear Information System (INIS)

    Anderson, Adam J.; Fox, Patrick J.; Kahn, Yonatan; McCullough, Matthew

    2015-01-01

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the m_χ−σ_n plane. Recently, methods which are independent of the DM halo velocity distribution have been developed which present results in the v_min−g̃ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from v_min to nuclear recoil momentum (p_R), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call h̃(p_R). The entire family of conventional halo-independent g̃(v_min) plots for all DM masses are directly found from the single h̃(p_R) plot through a simple rescaling of axes. By considering results in h̃(p_R) space, one can determine if two experiments are inconsistent for all masses and all physically possible halos, or for what range of dark matter masses the results are inconsistent for all halos, without the necessity of multiple g̃(v_min) plots for different DM masses. We conduct a sample analysis comparing the CDMS II Si events to the null results from LUX, XENON10, and SuperCDMS using our method and discuss how the results can be strengthened by imposing the physically reasonable requirement of a finite halo escape velocity
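
    For elastic scattering, the kinematics behind the axis rescaling are standard (the definition of h̃ itself is specific to the paper):

        % Elastic-scattering kinematics: for nuclear recoil momentum
        % p_R = \sqrt{2 m_N E_R} and DM-nucleus reduced mass \mu_{\chi N},
        v_{\min} = \frac{p_R}{2\mu_{\chi N}},
        \qquad
        \mu_{\chi N} = \frac{m_\chi m_N}{m_\chi + m_N},

    so for each trial mass m_χ the v_min axis of a conventional g̃(v_min) plot is a linear, mass-dependent rescaling of the single p_R axis of h̃(p_R).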

  1. What's Love Got to Do with It? Rethinking Common Sense Assumptions

    Science.gov (United States)

    Trachman, Matthew; Bluestone, Cheryl

    2005-01-01

    One of the most basic tasks in introductory social science classes is to get students to reexamine their common sense assumptions concerning human behavior. This article introduces a shared assignment developed for a learning community that paired an introductory sociology and psychology class. The assignment challenges students to rethink the…

  2. Who needs the assumption of opportunistic behavior? Transaction cost economics does not!

    DEFF Research Database (Denmark)

    Koch, Carsten Allan

    2000-01-01

    The assumption of opportunistic behavior, familiar from transaction cost economics, has been and remains highly controversial. But opportunistic behavior, albeit undoubtedly an extremely important form of motivation, is not a necessary condition for the contractual problems studied by transaction...

  3. Interpreting the concordance statistic of a logistic regression model: relation to the variance and odds ratio of a continuous explanatory variable.

    Science.gov (United States)

    Austin, Peter C; Steyerberg, Ewout W

    2012-06-20

    When outcomes are binary, the c-statistic (equivalent to the area under the Receiver Operating Characteristic curve) is a standard measure of the predictive accuracy of a logistic regression model. An analytical expression was derived under the assumption that a continuous explanatory variable follows a normal distribution in those with and without the condition. We then conducted an extensive set of Monte Carlo simulations to examine whether the expressions derived under the assumption of binormality allowed for accurate prediction of the empirical c-statistic when the explanatory variable followed a normal distribution in the combined sample of those with and without the condition. We also examined the accuracy of the predicted c-statistic when the explanatory variable followed a gamma, log-normal or uniform distribution in the combined sample of those with and without the condition. Under the assumption of binormality with equality of variances, the c-statistic follows a standard normal cumulative distribution function with dependence on the product of the standard deviation of the normal components (reflecting more heterogeneity) and the log-odds ratio (reflecting larger effects). Under the assumption of binormality with unequal variances, the c-statistic follows a standard normal cumulative distribution function with dependence on the standardized difference of the explanatory variable in those with and without the condition. In our Monte Carlo simulations, we found that these expressions allowed for reasonably accurate prediction of the empirical c-statistic when the distribution of the explanatory variable was normal, gamma, log-normal, and uniform in the entire sample of those with and without the condition. The discriminative ability of a continuous explanatory variable cannot be judged by its odds ratio alone, but always needs to be considered in relation to the heterogeneity of the population.
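
    Under binormality with equal variances, the closed form reduces to c = Φ(σβ/√2), where β is the log-odds ratio per unit of the explanatory variable; a quick simulation check, with illustrative parameter values:

        # Check the binormal closed form c = Phi(sigma * beta / sqrt(2)), where
        # beta is the log-odds ratio per unit of X. Parameters illustrative.
        import numpy as np
        from scipy.stats import norm
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        mu0, mu1, sigma, n = 0.0, 1.0, 1.25, 50000
        x = np.concatenate([rng.normal(mu0, sigma, n), rng.normal(mu1, sigma, n)])
        d = np.repeat([0, 1], n)

        beta = LogisticRegression(C=1e6).fit(x[:, None], d).coef_[0, 0]
        c_pred = norm.cdf(sigma * beta / np.sqrt(2))    # analytic prediction
        c_emp = roc_auc_score(d, x)                     # empirical c-statistic
        print(f"predicted c = {c_pred:.3f}, empirical c = {c_emp:.3f}")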

  4. Projections of temperature-related excess mortality under climate change scenarios.

    Science.gov (United States)

    Gasparrini, Antonio; Guo, Yuming; Sera, Francesco; Vicedo-Cabrera, Ana Maria; Huber, Veronika; Tong, Shilu; de Sousa Zanotti Stagliorio Coelho, Micheline; Nascimento Saldiva, Paulo Hilario; Lavigne, Eric; Matus Correa, Patricia; Valdes Ortega, Nicolas; Kan, Haidong; Osorio, Samuel; Kyselý, Jan; Urban, Aleš; Jaakkola, Jouni J K; Ryti, Niilo R I; Pascal, Mathilde; Goodman, Patrick G; Zeka, Ariana; Michelozzi, Paola; Scortichini, Matteo; Hashizume, Masahiro; Honda, Yasushi; Hurtado-Diaz, Magali; Cesar Cruz, Julio; Seposo, Xerxes; Kim, Ho; Tobias, Aurelio; Iñiguez, Carmen; Forsberg, Bertil; Åström, Daniel Oudin; Ragettli, Martina S; Guo, Yue Leon; Wu, Chang-Fu; Zanobetti, Antonella; Schwartz, Joel; Bell, Michelle L; Dang, Tran Ngoc; Van, Dung Do; Heaviside, Clare; Vardoulakis, Sotiris; Hajat, Shakoor; Haines, Andy; Armstrong, Ben

    2017-12-01

    Climate change can directly affect human health by varying exposure to non-optimal outdoor temperature. However, evidence on this direct impact at a global scale is limited, mainly due to issues in modelling and projecting complex and highly heterogeneous epidemiological relationships across different populations and climates. We collected observed daily time series of mean temperature and mortality counts for all causes or non-external causes only, in periods ranging from Jan 1, 1984, to Dec 31, 2015, from various locations across the globe through the Multi-Country Multi-City Collaborative Research Network. We estimated temperature-mortality relationships through a two-stage time series design. We generated current and future daily mean temperature series under four scenarios of climate change, determined by varying trajectories of greenhouse gas emissions, using five general circulation models. We projected excess mortality for cold and heat and their net change in 1990-2099 under each scenario of climate change, assuming no adaptation or population changes. Our dataset comprised 451 locations in 23 countries across nine regions of the world, including 85 879 895 deaths. Results indicate, on average, a net increase in temperature-related excess mortality under high-emission scenarios, although with important geographical differences. In temperate areas such as northern Europe, east Asia, and Australia, the less intense warming and large decrease in cold-related excess would induce a null or marginally negative net effect, with the net change in 2090-99 compared with 2010-19 ranging from -1·2% (empirical 95% CI -3·6 to 1·4) in Australia to -0·1% (-2·1 to 1·6) in east Asia under the highest emission scenario, although the decreasing trends would reverse during the course of the century. Conversely, warmer regions, such as the central and southern parts of America or Europe, and especially southeast Asia, would experience a sharp surge in heat-related

  5. CRITICAL ASSUMPTIONS IN THE F-TANK FARM CLOSURE OPERATIONAL DOCUMENTATION REGARDING WASTE TANK INTERNAL CONFIGURATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Hommel, S.; Fountain, D.

    2012-03-28

    The intent of this document is to provide clarification of critical assumptions regarding the internal configurations of liquid waste tanks at operational closure, with respect to F-Tank Farm (FTF) closure documentation. For the purposes of this document, FTF closure documentation includes: (1) Performance Assessment for the F-Tank Farm at the Savannah River Site (hereafter referred to as the FTF PA) (SRS-REG-2007-00002), (2) Basis for Section 3116 Determination for Closure of F-Tank Farm at the Savannah River Site (DOE/SRS-WD-2012-001), (3) Tier 1 Closure Plan for the F-Area Waste Tank Systems at the Savannah River Site (SRR-CWDA-2010-00147), (4) F-Tank Farm Tanks 18 and 19 DOE Manual 435.1-1 Tier 2 Closure Plan Savannah River Site (SRR-CWDA-2011-00015), (5) Industrial Wastewater Closure Module for the Liquid Waste Tanks 18 and 19 (SRRCWDA-2010-00003), and (6) Tank 18/Tank 19 Special Analysis for the Performance Assessment for the F-Tank Farm at the Savannah River Site (hereafter referred to as the Tank 18/Tank 19 Special Analysis) (SRR-CWDA-2010-00124). Note that the first three FTF closure documents listed apply to the entire FTF, whereas the last three FTF closure documents listed are specific to Tanks 18 and 19. These two waste tanks are expected to be the first two tanks to be grouted and operationally closed under the current suite of FTF closure documents and many of the assumptions and approaches that apply to these two tanks are also applicable to the other FTF waste tanks and operational closure processes.

  6. Utility of Web search query data in testing theoretical assumptions about mephedrone.

    Science.gov (United States)

    Kapitány-Fövény, Máté; Demetrovics, Zsolt

    2017-05-01

    With growing access to the Internet, people who use drugs and traffickers have started to obtain information about novel psychoactive substances (NPS) via online platforms. This paper aims to analyze whether decreasing Web interest in formerly banned substances (cocaine, heroin, and MDMA) and the legislative status of mephedrone predict Web interest in this NPS. Google Trends was used to measure changes in Web interest in cocaine, heroin, MDMA, and mephedrone. Google search results for mephedrone within the same time frame were analyzed and categorized. Web interest in classic drugs was found to be more persistent. Regarding geographical distribution, the location of Web searches for heroin and cocaine was less centralized. The illicit status of mephedrone was a negative predictor of its Web search query rates. The connection between mephedrone-related Web search rates and the legislative status of this substance was significantly mediated by ecstasy-related Web search queries, the number of documentaries, and forum/blog entries about mephedrone. The results might support the hypothesis that mephedrone's popularity was highly correlated with its legal status and that it functioned as a potential substitute for MDMA. Google Trends was found to be a useful tool for testing theoretical assumptions about NPS. Copyright © 2017 John Wiley & Sons, Ltd.

  7. Estimating the global prevalence of inadequate zinc intake from national food balance sheets: effects of methodological assumptions.

    Directory of Open Access Journals (Sweden)

    K Ryan Wessells

    Full Text Available The prevalence of inadequate zinc intake in a population can be estimated by comparing the zinc content of the food supply with the population's theoretical requirement for zinc. However, assumptions regarding the nutrient composition of foods, zinc requirements, and zinc absorption may affect prevalence estimates. These analyses were conducted to: (1) evaluate the effect of varying methodological assumptions on country-specific estimates of the prevalence of dietary zinc inadequacy and (2) generate a model considered to provide the best estimates. National food balance data were obtained from the Food and Agriculture Organization of the United Nations. Zinc and phytate contents of these foods were estimated from three nutrient composition databases. Zinc absorption was predicted using a mathematical model (Miller equation). Theoretical mean daily per capita physiological and dietary requirements for zinc were calculated using recommendations from the Food and Nutrition Board of the Institute of Medicine and the International Zinc Nutrition Consultative Group. The estimated global prevalence of inadequate zinc intake varied between 12-66%, depending on which methodological assumptions were applied. However, country-specific rank order of the estimated prevalence of inadequate intake was conserved across all models (r = 0.57-0.99, P<0.01). A "best-estimate" model, comprised of zinc and phytate data from a composite nutrient database and IZiNCG physiological requirements for absorbed zinc, estimated the global prevalence of inadequate zinc intake to be 17.3%. Given the multiple sources of uncertainty in this method, caution must be taken in the interpretation of the estimated prevalence figures. However, the results of all models indicate that inadequate zinc intake may be fairly common globally. Inferences regarding the relative likelihood of zinc deficiency as a public health problem in different countries can be drawn based on the country
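
    The final cut-point step of such a pipeline is simple enough to sketch (the Miller absorption equation is omitted here, and every number below is an illustrative placeholder rather than one of the paper's estimates):

        # Prevalence of inadequate intake via the cut-point method: the fraction
        # of a log-normal absorbed-zinc distribution falling below the mean
        # physiological requirement. All numbers are illustrative placeholders.
        import numpy as np
        from scipy.stats import norm

        mean_absorbed = 2.4   # mg/d absorbed zinc implied by the food supply
        cv = 0.25             # assumed inter-individual coefficient of variation
        requirement = 2.0     # mg/d mean physiological requirement

        sigma = np.sqrt(np.log(1 + cv**2))        # log-normal params from mean and CV
        mu = np.log(mean_absorbed) - sigma**2 / 2
        prevalence = norm.cdf((np.log(requirement) - mu) / sigma)
        print(f"estimated prevalence of inadequate intake: {100 * prevalence:.1f}%")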

  8. Life Insurance and Annuity Demand under Hyperbolic Discounting

    Directory of Open Access Journals (Sweden)

    Siqi Tang

    2018-04-01

    Full Text Available In this paper, we analyse and construct a lifetime utility maximisation model with hyperbolic discounting. Within the model, a number of assumptions are made: complete markets, actuarially fair life insurance/annuity is available, and investors have time-dependent preferences. Time-dependent preferences are in contrast to the usual case of constant preferences (exponential discounting). We find: (1) investors (realistically) demand more life insurance after retirement (in contrast to the standard model, which showed strong demand for life annuities), and annuities are rarely purchased; (2) optimal consumption paths exhibit a humped shape (which is usually only found in incomplete markets under the assumptions of the standard model).
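
    The record does not state which discounting specification the paper adopts; two common alternatives to the exponential benchmark are the generalized hyperbolic and the quasi-hyperbolic (beta-delta) forms:

        % Exponential benchmark vs. two common non-exponential discount functions
        % (which parameterization the paper uses is not stated in the record):
        D_{\mathrm{exp}}(t) = e^{-\rho t}, \qquad
        D_{\mathrm{hyp}}(t) = (1 + \alpha t)^{-\gamma/\alpha}, \qquad
        D_{\beta\delta}(t) = \begin{cases} 1, & t = 0, \\ \beta\,\delta^{t}, & t = 1, 2, \dots \end{cases}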

  9. First conclusions about results of GPR investigations in the Church of the Assumption of the Blessed Virgin Mary in Kłodzko, Poland

    Directory of Open Access Journals (Sweden)

    A. Chernov

    2018-03-01

    Full Text Available The article presents results of a ground penetrating radar (GPR investigation carried out in the Church of the Assumption of the Blessed Virgin Mary in Kłodzko, Poland, dating from the 14th to 16th centuries. Due to the 20th century wars, the current state of knowledge about the history of the church is still poor. Under the floor of the Catholic temple, unknown structures might exist. To verify the presence of underground structures such as crypts and tombs, a GPR survey was carried out in chapels and aisles with 500 and 800 MHz GPR shielded antennas. Numerous anomalies were detected. It was concluded that those under the chapels were caused by the presence of crypts beneath the floor.

  10. Questioning the "big assumptions". Part I: addressing personal contradictions that impede professional development.

    Science.gov (United States)

    Bowe, Constance M; Lahey, Lisa; Armstrong, Elizabeth; Kegan, Robert

    2003-08-01

    The ultimate success of recent medical curriculum reforms is, in large part, dependent upon the faculty's ability to adopt and sustain new attitudes and behaviors. However, like many New Year's resolutions, sincere intent to change may be short lived and followed by a discouraging return to old behaviors. Failure to sustain the initial resolve to change can be misinterpreted as a lack of commitment to one's original goals and eventually lead to greater effort expended in rationalizing the status quo rather than changing it. The present article outlines how a transformative process that has proven to be effective in managing personal change, Questioning the Big Assumptions, was successfully used in an international faculty development program for medical educators to enhance individual personal satisfaction and professional effectiveness. This process systematically encouraged participants to explore and proactively address currently operative mechanisms that could stall their attempts to change at the professional level. The applications of the Big Assumptions process in faculty development helped individuals to recognize and subsequently utilize unchallenged and deep rooted personal beliefs to overcome unconscious resistance to change. This approach systematically led participants away from circular griping about what was not right in their current situation to identifying the actions that they needed to take to realize their individual goals. By thoughtful testing of personal Big Assumptions, participants designed behavioral changes that could be broadly supported and, most importantly, sustained.

  11. BAYESIAN ESTIMATION OF THE SHAPE PARAMETER OF THE GENERALISED EXPONENTIAL DISTRIBUTION UNDER DIFFERENT LOSS FUNCTIONS

    Directory of Open Access Journals (Sweden)

    SANKU DEY

    2010-11-01

    Full Text Available The generalized exponential (GE) distribution proposed by Gupta and Kundu (1999) is an important lifetime distribution in survival analysis. In this article, we propose to obtain Bayes estimators and their associated risks based on a class of non-informative priors under the assumption of three loss functions, namely, the quadratic loss function (QLF), the squared log-error loss function (SLELF) and the general entropy loss function (GELF). The motivation is to explore the most appropriate loss function among these three. The performances of the estimators are, therefore, compared on the basis of their risks obtained under QLF, SLELF and GELF separately. The relative efficiency of the estimators is also obtained. Finally, Monte Carlo simulations are performed to compare the performances of the Bayes estimates under different situations.
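
    The Bayes estimators whose risks are compared have standard closed forms as posterior functionals; note that if the QLF here is the weighted form L = ((α̂ − α)/α)², its estimator is E(α⁻¹|x)/E(α⁻²|x) rather than the posterior mean:

        % Standard Bayes estimators of the shape parameter \alpha under the three
        % losses (c is the GELF shape constant; c = -1 recovers the posterior mean):
        \hat{\alpha}_{\mathrm{QLF}} = E(\alpha \mid x), \qquad
        \hat{\alpha}_{\mathrm{SLELF}} = \exp\{E(\ln\alpha \mid x)\}, \qquad
        \hat{\alpha}_{\mathrm{GELF}} = \big[E(\alpha^{-c} \mid x)\big]^{-1/c}.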

  12. Estimation of river and stream temperature trends under haphazard sampling

    Science.gov (United States)

    Gray, Brian R.; Lyubchich, Vyacheslav; Gel, Yulia R.; Rogala, James T.; Robertson, Dale M.; Wei, Xiaoqiao

    2015-01-01

    Long-term temporal trends in water temperature in rivers and streams are typically estimated under the assumption of evenly-spaced space-time measurements. However, sampling times and dates associated with historical water temperature datasets and some sampling designs may be haphazard. As a result, trends in temperature may be confounded with trends in time or space of sampling which, in turn, may yield biased trend estimators and thus unreliable conclusions. We address this concern using multilevel (hierarchical) linear models, where time effects are allowed to vary randomly by day and date effects by year. We evaluate the proposed approach by Monte Carlo simulations with imbalance, sparse data and confounding by trend in time and date of sampling. Simulation results indicate unbiased trend estimators, while results from a case study of temperature data from the Illinois River, USA, conform to river thermal assumptions. We also propose a new nonparametric bootstrap inference on multilevel models that allows for a relatively flexible and distribution-free quantification of uncertainties. The proposed multilevel modeling approach may be elaborated to accommodate nonlinearities within days and years when sampling times or dates typically span temperature extremes.
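
    A stripped-down version of the proposed bootstrap, resampling years (the upper-level units) with replacement and refitting the long-term trend; simulated data and a plain OLS trend stand in for the paper's full multilevel model:

        # Cluster (by-year) bootstrap of a long-term temperature trend under
        # haphazard sampling. Simulated data; plain OLS stands in for the
        # multilevel model with random day and date effects.
        import numpy as np

        rng = np.random.default_rng(4)
        years = np.arange(1990, 2015)
        year_eff = rng.normal(0, 0.6, years.size)            # random year effects
        obs = {y: 20 + 0.03 * (y - years[0]) + year_eff[i]   # true trend 0.03 C/yr
                  + rng.normal(0, 1.5, rng.integers(5, 40))  # haphazard sample sizes
               for i, y in enumerate(years)}

        def trend(sample_years):
            xs = np.concatenate([np.full(obs[y].size, y) for y in sample_years])
            ys = np.concatenate([obs[y] for y in sample_years])
            return np.polyfit(xs, ys, 1)[0]

        boot = [trend(rng.choice(years, years.size, replace=True))
                for _ in range(2000)]
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"trend = {trend(years):.3f} C/yr, 95% CI [{lo:.3f}, {hi:.3f}]")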

  13. X-ray near-field holography. Beyond idealized assumptions of the probe

    Energy Technology Data Exchange (ETDEWEB)

    Hagemann, Johannes

    2017-07-01

    The work at hand considers the imperfect, often neglected, aspects of X-ray near-field phase-contrast propagation imaging, or in short: X-ray near-field holography (NFH). NFH is an X-ray microscopy technique able to yield high-resolution, yet low-dose imaging of a wide range of specimens. Derived from wave optical theory, propagation-based imaging methods rely on assumptions for the illuminating wave field, for example a perfect plane wave, a spherical wave emanating from a point source, or monochromaticity. Violation of the point source assumption implies, for example, a distorted wave front together with a finite degree of coherence, both crucial for NFH. With the advances in X-ray focusing, instrumentation and X-ray wave guiding, NFH has become of high interest, since the barriers for practical implementation have been overcome. The idea of holography originates from electron microscopy, as a way to overcome the lack of high-quality electron lenses. With holography, the need for optics between the specimen and detector is circumvented. The drawback, however, is that the measurement obtained at the detector is not a direct image of the specimen under survey but a 'propagated version' of it, the so-called hologram. The problem with the optics is replaced by another problem, also referred to as the phase problem. The phase problem is caused by the fact that only the intensities of a wave field can be measured but not the phase information. The phase information is crucial for obtaining the image of the specimen and thus needs to be reconstructed. In recent years the methodology, sometimes also mythology, has been developed to reconstruct the specimen from the measured hologram. For a long time, the standard approach to deal with deviations from the ideal assumptions in real-world holography experiments has been to simply ignore them. The prime example for this is the method of the standard flat

  14. X-ray near-field holography. Beyond idealized assumptions of the probe

    International Nuclear Information System (INIS)

    Hagemann, Johannes

    2017-01-01

    The work at hand considers the imperfect, often neglected, aspects of X-ray near-field phase-contrast propagation imaging, or in short: X-ray near-field holography (NFH). NFH is an X-ray microscopy technique able to yield high-resolution, yet low-dose imaging of a wide range of specimens. Derived from wave optical theory, propagation-based imaging methods rely on assumptions for the illuminating wave field, for example a perfect plane wave, a spherical wave emanating from a point source, or monochromaticity. Violation of the point source assumption implies, for example, a distorted wave front together with a finite degree of coherence, both crucial for NFH. With the advances in X-ray focusing, instrumentation and X-ray wave guiding, NFH has become of high interest, since the barriers for practical implementation have been overcome. The idea of holography originates from electron microscopy, as a way to overcome the lack of high-quality electron lenses. With holography, the need for optics between the specimen and detector is circumvented. The drawback, however, is that the measurement obtained at the detector is not a direct image of the specimen under survey but a 'propagated version' of it, the so-called hologram. The problem with the optics is replaced by another problem, also referred to as the phase problem. The phase problem is caused by the fact that only the intensities of a wave field can be measured but not the phase information. The phase information is crucial for obtaining the image of the specimen and thus needs to be reconstructed. In recent years the methodology, sometimes also mythology, has been developed to reconstruct the specimen from the measured hologram. For a long time, the standard approach to deal with deviations from the ideal assumptions in real-world holography experiments has been to simply ignore them. The prime example for this is the method of the standard flat

  15. Globfit: Consistently fitting primitives by discovering global relations

    KAUST Repository

    Li, Yangyan; Wu, Xiaokun; Chrysanthou, Yiorgos; Sharf, Andrei; Cohen-Or, Daniel; Mitra, Niloy J.

    2011-01-01

    Given a noisy and incomplete point set, we introduce a method that simultaneously recovers a set of locally fitted primitives along with their global mutual relations. We operate under the assumption that the data corresponds to a man-made engineering object consisting of basic primitives, possibly repeated and globally aligned under common relations. We introduce an algorithm to directly couple the local and global aspects of the problem. The local fit of the model is determined by how well the inferred model agrees to the observed data, while the global relations are iteratively learned and enforced through a constrained optimization. Starting with a set of initial RANSAC based locally fitted primitives, relations across the primitives such as orientation, placement, and equality are progressively learned and conformed to. In each stage, a set of feasible relations are extracted among the candidate relations, and then aligned to, while best fitting to the input data. The global coupling corrects the primitives obtained in the local RANSAC stage, and brings them to precise global alignment. We test the robustness of our algorithm on a range of synthesized and scanned data, with varying amounts of noise, outliers, and non-uniform sampling, and validate the results against ground truth, where available. © 2011 ACM.

  16. Globfit: Consistently fitting primitives by discovering global relations

    KAUST Repository

    Li, Yangyan

    2011-07-01

    Given a noisy and incomplete point set, we introduce a method that simultaneously recovers a set of locally fitted primitives along with their global mutual relations. We operate under the assumption that the data corresponds to a man-made engineering object consisting of basic primitives, possibly repeated and globally aligned under common relations. We introduce an algorithm to directly couple the local and global aspects of the problem. The local fit of the model is determined by how well the inferred model agrees to the observed data, while the global relations are iteratively learned and enforced through a constrained optimization. Starting with a set of initial RANSAC based locally fitted primitives, relations across the primitives such as orientation, placement, and equality are progressively learned and conformed to. In each stage, a set of feasible relations are extracted among the candidate relations, and then aligned to, while best fitting to the input data. The global coupling corrects the primitives obtained in the local RANSAC stage, and brings them to precise global alignment. We test the robustness of our algorithm on a range of synthesized and scanned data, with varying amounts of noise, outliers, and non-uniform sampling, and validate the results against ground truth, where available. © 2011 ACM.
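
    Not the paper's constrained optimization, but the flavour of one global relation can be sketched: cluster near-parallel plane normals coming out of a hypothetical RANSAC stage and snap each cluster to a common direction:

        # Flavour of one GlobFit-style global relation (not the paper's constrained
        # optimization): cluster near-parallel plane normals from a hypothetical
        # RANSAC stage and snap each cluster to its mean direction.
        import numpy as np

        def snap_parallel(normals, tol_deg=5.0):
            """Greedily cluster directions parallel within tol_deg; snap to means."""
            arr = np.asarray(normals, dtype=float)
            out = arr / np.linalg.norm(arr, axis=1, keepdims=True)
            labels, reps = -np.ones(len(out), dtype=int), []
            for i, n in enumerate(out):
                for k, r in enumerate(reps):
                    if np.degrees(np.arccos(np.clip(abs(n @ r), 0.0, 1.0))) < tol_deg:
                        labels[i] = k
                        break
                else:
                    labels[i] = len(reps)
                    reps.append(n)
            for k, r in enumerate(reps):
                members = out[labels == k]
                members *= np.where(members @ r < 0, -1.0, 1.0)[:, None]  # fold flips
                mean = members.mean(axis=0)
                out[labels == k] = mean / np.linalg.norm(mean)
            return out

        planes = [[0.999, 0.02, 0.0], [1.0, -0.01, 0.015], [0.01, 1.0, 0.04]]
        print(np.round(snap_parallel(planes), 3))   # first two snap to one direction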

  17. Testing the assumptions behind emissions trading in non-market goods: the RECLAIM program in Southern California

    International Nuclear Information System (INIS)

    Lejano, Raul P.; Hirose, Rei

    2005-01-01

    Emissions trading is, essentially, a policy instrument that is designed to simulate a market for an otherwise public good. Conceptually, its justification hinges on a number of key assumptions, namely the negligibility of local impacts, the ability to separate and commodify the good in question, and characteristics of a well-functioning market. The authors examine the performance of RECLAIM, a NO_x emissions trading program in Southern California, USA, and illustrate how to test these assumptions. There is some evidence that the trading of NO_x generates new externalities, such as the possibility that other air pollutants, e.g. volatile organics, are essentially traded along with it. Moreover, the RECLAIM program has recently begun to experience difficulties due to the fact that the market is relatively thin. This analysis provides ways to assess more deeply and reform these trading regimes, including opening up RECLAIM to public review. The case study speaks to a wider arena, as emissions trading is presently being considered in other parts of the world to address issues ranging from acid rain to non-point source pollution to greenhouse gases. The analytic approach, illustrated herein, is a general one that has a wider applicability than the particular case of NO_x trading. It is hoped that this kind of critical inquiry can lead to a more careful deliberation of the merits and challenges of emissions trading

  18. Scenario Analysis In The Calculation Of Investment Efficiency–The Problem Of Formulating Assumptions

    Directory of Open Access Journals (Sweden)

    Dittmann Iwona

    2015-09-01

    Full Text Available This article concerns the problem of formulating assumptions in scenario analysis for investments which consist of renting out an apartment. The article attempts to indicate the foundations for the formulation of assumptions on the basis of observed retrospective regularities. It includes theoretical considerations regarding scenario design, as well as the results of studies on how the quantities that determine, or approximate, the values of the individual explanatory variables for a chosen measure of investment profitability (MIRRFCFE) have formed in the past. The dynamics of, and correlation between, the variables were studied. The research was based on quarterly data from local residential real estate markets in Poland (in the six largest cities) in the years 2006–2014, as well as on data from the financial market.

  19. Common-Sense Chemistry: The Use of Assumptions and Heuristics in Problem Solving

    Science.gov (United States)

    Maeyer, Jenine Rachel

    2013-01-01

    Students experience difficulty learning and understanding chemistry at higher levels, often because of cognitive biases stemming from common sense reasoning constraints. These constraints can be divided into two categories: assumptions (beliefs held about the world around us) and heuristics (the reasoning strategies or rules used to build…

  20. Bootstrap prediction and Bayesian prediction under misspecified models

    OpenAIRE

    Fushiki, Tadayoshi

    2005-01-01

    We consider a statistical prediction problem under misspecified models. In a sense, Bayesian prediction is an optimal prediction method when an assumed model is true. Bootstrap prediction is obtained by applying Breiman's `bagging' method to a plug-in prediction. Bootstrap prediction can be considered to be an approximation to the Bayesian prediction under the assumption that the model is true. However, in applications, there are frequently deviations from the assumed model. In this paper, bo...

  1. [The ratio birth-weight, placental weight and the term of delivery. A contribution to the problem of a relative placental insufficiency in late pregnancy (author's transl)].

    Science.gov (United States)

    Warkentin, B

    1976-12-10

    It is suggested that a relative placental insufficiency in late pregnancy is one of the releasing factors of childbirth. Under this assumption, 1027 deliveries at term (266th-294th day of pregnancy) were examined for the interrelationship between the ratio birth-weight:placental-weight and the duration of pregnancy. The average birth-weight increases slightly but significantly with the duration of pregnancy, as does the average placental-weight. The average ratio birth-weight:placental-weight decreases significantly: the more unfavorable the ratio birth-weight:placental-weight, the shorter the fetus remains in utero. This underlines the assumption of a relative placental insufficiency as one of the releasing factors of childbirth.

  2. IRT models with relaxed assumptions in eRm: A manual-like instruction

    Directory of Open Access Journals (Sweden)

    REINHOLD HATZINGER

    2009-03-01

    Full Text Available Linear logistic models with relaxed assumptions (LLRA) as introduced by Fischer (1974) are a flexible tool for the measurement of change for dichotomous or polytomous responses. As opposed to the Rasch model, assumptions on the dimensionality of items, their mutual dependencies and the distribution of the latent trait in the population of subjects are relaxed. Conditional maximum likelihood estimation allows for inference about treatment, covariate or trend effect parameters without taking the subjects' latent trait values into account. In this paper we show how LLRAs based on the LLTM, LRSM and LPCM can be used to answer various questions about the measurement of change and how they can be fitted in R using the eRm package. A number of small didactic examples are provided that can easily be used as templates for real data sets. All data files used in this paper are available from http://eRm.R-Forge.R-project.org/

  3. Wartime Paris, cirrhosis mortality, and the ceteris paribus assumption.

    Science.gov (United States)

    Fillmore, Kaye Middleton; Roizen, Ron; Farrell, Michael; Kerr, William; Lemmens, Paul

    2002-07-01

    This article critiques the ceteris paribus assumption, which tacitly sustains the epidemiologic literature's inference that the sharp decline in cirrhosis mortality observed in Paris during the Second World War derived from a sharp constriction in wine consumption. Paris's wartime circumstances deviate substantially from the "all else being equal" assumption, and at least three other hypotheses for the cirrhosis decline may be contemplated. Historical and statistical review. Wartime Paris underwent tumultuous changes. Wine consumption did decline, but there was, as well, a myriad of other changes in diet and life experience, many involving new or heightened hardships and nutritional, experiential, institutional, health and mortality risks. Three competing hypotheses are presented: (1) a fraction of the candidates for cirrhosis mortality may have fallen to more sudden forms of death; (2) alcoholics, heavy drinkers and Paris's clochard subpopulation may have been differentially likely to become removed from the city's wartime population, whether by self-initiated departure, arrest and deportation, or death from other causes, even murder; and (3) there was mismeasurement in the cirrhosis mortality decline. The alcohol-cirrhosis connection provided the template for the alcohol research effort (now more than 20 years old) aimed at re-establishing scientific recognition of alcohol's direct problem-generating associations and causal responsibilities. In a time given to reports of weaker associations for the alcohol-cirrhosis connection, the place and importance of the Paris curve in the wider literature on that connection remain. For this reason, the Paris findings should be subjected to as much research scrutiny as they undoubtedly deserve.

  4. Use of 7Be as a sediment tracer: a scope for testing and refining key assumptions related to its adsorption on a catchment scale

    Science.gov (United States)

    Ryken, Nick; Al-Barri, Bashar; Blake, Will; Taylor, Alex; Boeckx, Pascal; Verdoodt, Ann

    2014-05-01

    To date, the use of Beryllium-7 (7Be) as a sediment tracer at the catchment scale is largely understudied, although several studies have applied the ratios 7Be/137Cs or 7Be/210Pbex for sediment source fingerprinting. Several key assumptions, (1) spatially uniform fallout, (2) immediate adsorption upon contact with the soil and (3) irreversible adsorption by the soil, must hold if 7Be is to be used as a sediment tracer. However, recent studies have raised questions about the validity of these assumptions in the changing environments at the catchment scale. In this study, samples of three representative soil types of the Mariaborrebeek catchment, a small watershed located in the Flemish Ardennes in Belgium, were collected to assess the adsorption rate of 7Be on the soil surface in this catchment. In a laboratory experiment, soil samples were equilibrated with a stable Be solution of 1 mg l-1 at a soil:solution ratio of 1:10, and the adsorption of Be was measured at different time intervals. Furthermore, different amendments were applied to assess the impact of soil pH, fertilizer and organic matter on the adsorption of Be. Preliminary results confirm a rapid and almost complete Be adsorption and a negative correlation between pH and Be adsorption. The results of this study may lead to the formulation of interpretation guidelines for the use of 7Be to assess short-term soil redistribution and sediment source fingerprinting at the catchment scale.
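
    The kind of kinetics implied by "rapid and almost complete adsorption" can be summarized with a first-order model. The sketch below fits F(t) = F_inf(1 - exp(-kt)) to invented illustrative data points under the batch conditions described; the numbers are not the study's measurements.

```python
# Illustrative first-order adsorption fit for a batch experiment of the kind
# described (1 mg/l stable Be, soil:solution 1:10). The data points below are
# invented for illustration only; they are not the study's measurements.
import numpy as np
from scipy.optimize import curve_fit

t_hours = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 8.0, 24.0])
frac_ads = np.array([0.55, 0.72, 0.85, 0.93, 0.96, 0.97, 0.98])

def first_order(t, f_inf, k):
    """Fraction adsorbed at time t: plateau f_inf, rate constant k (1/h)."""
    return f_inf * (1.0 - np.exp(-k * t))

(f_inf, k), _ = curve_fit(first_order, t_hours, frac_ads, p0=(1.0, 1.0))
print(f"plateau f_inf = {f_inf:.2f}, rate k = {k:.2f} per hour")
```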

  5. Sensitivity of probabilistic MCO water content estimates to key assumptions

    International Nuclear Information System (INIS)

    DUNCAN, D.R.

    1999-01-01

    Sensitivity of probabilistic multi-canister overpack (MCO) water content estimates to key assumptions is evaluated, with emphasis on the largest contributors other than cladding films: water borne by particulates adhering to damage sites and water borne by canister particulate. Calculations considered different choices for the degree of independence of damage states, different choices of percentile for reference high inputs, three types of input probability density function (pdf): triangular, log-normal, and Weibull, and the number of scrap baskets in an MCO.
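
    The influence of the input pdf choice on such estimates can be illustrated with a small Monte Carlo sketch; all parameter values below are illustrative assumptions, not the report's inputs.

```python
# Hypothetical sketch of the kind of sensitivity test described: propagate a
# water-per-damage-site input through Monte Carlo sampling under three
# alternative pdf shapes and compare a high percentile of total water.
import numpy as np

rng = np.random.default_rng(1)
n, sites = 100_000, 10          # trials; assumed damage sites per MCO

draws = {
    "triangular": rng.triangular(0.0, 0.2, 1.0, (n, sites)),      # min, mode, max (g)
    "log-normal": rng.lognormal(np.log(0.2), 0.8, (n, sites)),    # median 0.2 g
    "weibull":    0.3 * rng.weibull(1.5, (n, sites)),             # scale 0.3, shape 1.5
}

for name, x in draws.items():
    total = x.sum(axis=1)       # water borne by all sites in one MCO
    print(f"{name:11s} 95th percentile of total water: {np.percentile(total, 95):.2f} g")
```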

  6. What Were We Thinking? Five Erroneous Assumptions That Have Fueled Specialized Interventions for Adolescents Who Have Sexually Offended

    Science.gov (United States)

    Worling, James R.

    2013-01-01

    Since the early 1980s, five assumptions have influenced the assessment, treatment, and community supervision of adolescents who have offended sexually. In particular, interventions with this population have been informed by the assumptions that these youth are (i) deviant, (ii) delinquent, (iii) disordered, (iv) deficit-ridden, and (v) deceitful.…

  7. Fair lineups are better than biased lineups and showups, but not because they increase underlying discriminability.

    Science.gov (United States)

    Smith, Andrew M; Wells, Gary L; Lindsay, R C L; Penrod, Steven D

    2017-04-01

    Receiver Operating Characteristic (ROC) analysis has recently come into vogue for assessing the underlying discriminability and the applied utility of lineup procedures. Two primary assumptions underlie recommendations that ROC analysis be used to assess the applied utility of lineup procedures: (a) ROC analysis of lineups measures underlying discriminability, and (b) the procedure that produces superior underlying discriminability produces superior applied utility. These same assumptions underlie a recently derived diagnostic-feature detection theory, a theory of discriminability, intended to explain recent patterns observed in ROC comparisons of lineups. We demonstrate, however, that these assumptions are incorrect when ROC analysis is applied to lineups. We also demonstrate that a structural phenomenon of lineups, differential filler siphoning, and not the psychological phenomenon of diagnostic-feature detection, explains why lineups are superior to showups and why fair lineups are superior to biased lineups. In the process of our proofs, we show that computational simulations have assumed, unrealistically, that all witnesses share exactly the same decision criteria. When criterial variance is included in computational models, differential filler siphoning emerges. The result proves a dissociation between ROC curves and underlying discriminability: higher ROC curves for lineups than for showups, and for fair than for biased lineups, despite no increase in underlying discriminability. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. The effects of behavioral and structural assumptions in artificial stock market

    Science.gov (United States)

    Liu, Xinghua; Gregor, Shirley; Yang, Jianmei

    2008-04-01

    Recent literature has developed the conjecture that important statistical features of stock price series, such as the fat tails phenomenon, may depend mainly on the market microstructure. This conjecture motivated us to investigate the roles of both the market microstructure and agent behavior with respect to high-frequency returns and daily returns. We developed two simple models to investigate this issue. The first one is a stochastic model with a clearing house microstructure and a population of zero-intelligence agents. The second one has more behavioral assumptions based on the Minority Game and also has a clearing house microstructure. With the first model we found that a characteristic of the clearing house microstructure, namely the clearing frequency, can explain the fat tails, excess volatility and autocorrelation phenomena of high-frequency returns. However, this feature does not cause the same phenomena in daily returns, so the stylized facts of daily returns depend mainly on the agents' behavior. With the second model we investigated the effects of behavioral assumptions on daily returns. Our study implies that the aspects responsible for generating the stylized facts of high-frequency returns and of daily returns are different.
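
    The aggregation effect behind the first model's finding can be illustrated with a toy calculation: if per-tick order flow produces heavy-tailed returns, netting flow over longer clearing intervals pushes the cleared returns toward Gaussian, so high-frequency fat tails need not survive into daily returns. The heavy-tailed per-tick flow below is an illustrative stand-in, not the paper's model.

```python
# Toy illustration: heavy-tailed per-tick returns become more Gaussian when a
# clearing house nets order flow over longer intervals. Student-t flow is an
# illustrative stand-in for microstructure-induced heavy tails.
import numpy as np

rng = np.random.default_rng(2)
n_ticks = 1_000_000

def cleared_returns(clearing_freq):
    flow = rng.standard_t(df=5, size=n_ticks)          # per-tick imbalance
    usable = flow[: n_ticks - n_ticks % clearing_freq]
    return usable.reshape(-1, clearing_freq).sum(axis=1)

def excess_kurtosis(r):
    r = r - r.mean()
    return (r**4).mean() / (r**2).mean() ** 2 - 3.0

for freq in (1, 10, 100):
    k = excess_kurtosis(cleared_returns(freq))
    print(f"clearing every {freq:3d} ticks: excess kurtosis {k:+.3f}")
```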

  9. Fast logic?: Examining the time course assumption of dual process theory.

    Science.gov (United States)

    Bago, Bence; De Neys, Wim

    2017-01-01

    Influential dual process models of human thinking posit that reasoners typically produce a fast, intuitive heuristic (i.e., Type 1) response which might subsequently be overridden and corrected by slower, deliberative processing (i.e., Type 2). In this study we directly tested this time course assumption. We used a two-response paradigm in which participants have to give an immediate answer and afterwards are allowed extra time before giving a final response. In four experiments we used a range of procedures (e.g., challenging response deadline, concurrent load) to knock out Type 2 processing and make sure that the initial response was intuitive in nature. Our key finding is that we frequently observe correct, logical responses as the first, immediate response. Response confidence and latency analyses indicate that these initial correct responses are given fast, with high confidence, and in the face of conflicting heuristic responses. The findings suggest that fast and automatic Type 1 processing also cues a correct logical response from the start. We sketch a revised dual process model in which the relative strength of different types of intuitions determines reasoning performance. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Handedness is related to neural mechanisms underlying hemispheric lateralization of face processing

    Science.gov (United States)

    Frässle, Stefan; Krach, Sören; Paulus, Frieder Michel; Jansen, Andreas

    2016-06-01

    While the right-hemispheric lateralization of the face perception network is well established, recent evidence suggests that handedness affects the cerebral lateralization of face processing at the hierarchical level of the fusiform face area (FFA). However, the neural mechanisms underlying differential hemispheric lateralization of face perception in right- and left-handers are largely unknown. Using dynamic causal modeling (DCM) for fMRI, we aimed to unravel the putative processes that mediate handedness-related differences by investigating the effective connectivity in the bilateral core face perception network. Our results reveal an enhanced recruitment of the left FFA in left-handers compared to right-handers, as evidenced by more pronounced face-specific modulatory influences on both intra- and interhemispheric connections. As structural and physiological correlates of handedness-related differences in face processing, right- and left-handers varied with regard to their gray matter volume in the left fusiform gyrus and their pupil responses to face stimuli. Overall, these results describe how handedness is related to the lateralization of the core face perception network, and point to different neural mechanisms underlying face processing in right- and left-handers. In a wider context, this demonstrates the entanglement of structurally and functionally remote brain networks, suggesting a broader underlying process regulating brain lateralization.

  11. The neuromechanism underlying verbal analogical reasoning of metaphorical relations: an event-related potentials study.

    Science.gov (United States)

    Zhao, Ming; Meng, Huishan; Xu, Zhiyuan; Du, Fenglei; Liu, Tao; Li, Yongxin; Chen, Feiyan

    2011-11-24

    Using event-related potentials (ERPs), this study investigated the neuromechanism underlying verbal analogical reasoning with two different metaphorical relations: attributive metaphor and relational metaphor. The analogical reasoning of attributive metaphors (AM-AR) involves a superficial similarity between analogues, while the analogical reasoning of relational metaphors (RM-AR) requires a structural similarity. Subjects were asked to judge whether one word pair was semantically analogous to another word pair. Results showed that the schema induction stage elicited a greater N400 component at the right anterior scalp for the AM-AR and RM-AR tasks, possibly attributable to semantic processing of metaphorical word pairs. The N400 was followed by a widely distributed P300 and a late negative component (LNC1) at the left anterior scalp. The P300 was possibly related to the formation of a relational category, while the LNC1 was possibly related to the maintenance of a reasoning cue in working memory. The analogy mapping stage elicited a broadly distributed N400 and LNC2, which might indicate the presence of semantic retrieval and analogical transfer. In the answer production stage, all conditions elicited the P2 component due to early stimulus encoding, with the largest P2 amplitude in the RM-AR task. The RM-AR elicited a larger LPC than did the AM-AR, even after baseline correction was applied to control for the differential P2 effect. The LPC effect might suggest that relational metaphors involve more integration processing than attributive metaphors. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Testing the simplex assumption underlying the Sport Motivation Scale: a structural equation modeling analysis.

    Science.gov (United States)

    Li, F; Harmer, P

    1996-12-01

    Self-determination theory (Deci & Ryan, 1985) suggests that motivational orientation or regulatory styles with respect to various behaviors can be conceptualized along a continuum ranging from low (amotivation) to high (intrinsic motivation) levels of self-determination. This pattern is manifested in the rank order of correlations among these regulatory styles (i.e., adjacent correlations are expected to be higher than more distant ones) and is known as a simplex structure. Using responses to the Sport Motivation Scale (Pelletier et al., 1995) obtained from a sample of 857 college students (442 men, 415 women), the present study tested the simplex structure underlying the SMS subscales via structural equation modeling. Results confirmed the simplex model structure, indicating that the various motivational constructs are empirically organized from low to high self-determination. The simplex pattern was further found to be invariant across gender. Findings from this study support the construct validity of the SMS and have important implications for studies focusing on the influence of motivational orientation in sport.
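
    The expected rank order can be checked mechanically. The sketch below tests a hypothetical correlation matrix (illustrative values, not the study's estimates) for the simplex property that correlations fall off with distance along the continuum.

```python
# Illustrative check of a simplex pattern: in a correlation matrix over
# regulatory styles ordered from amotivation to intrinsic motivation,
# adjacent correlations should exceed more distant ones. Values are invented.
import numpy as np

R = np.array([
    [1.00, 0.55, 0.35, 0.20],
    [0.55, 1.00, 0.50, 0.30],
    [0.35, 0.50, 1.00, 0.45],
    [0.20, 0.30, 0.45, 1.00],
])

k = R.shape[0]
holds = all(
    R[i, j] > R[i, j + 1]          # moving further from the diagonal...
    for i in range(k)
    for j in range(i + 1, k - 1)   # ...the correlation should drop
)
print("simplex pattern holds:", holds)
```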

  13. What a public relations model regarding radioactive waste implies

    Energy Technology Data Exchange (ETDEWEB)

    Ohnishi, Teruaki [CRC Research Institute, Inc., Tokyo (Japan); Energy Research Center Wakasa Bay, Fukui (Japan)

    1996-12-31

    The behavior of public attitudes toward radioactive waste over time was investigated using a mathematical model developed to estimate the extent of attitude change, based on the assumption that changes in public attitude toward a given subject are caused by the information environment, which is formed mainly by the news media. The study also investigated the extent to which public relations activity can contribute to changing public opinion on radioactive waste, and the assortment and scheduling of the various types of activity that bring about the maximum attitude change under a given budget constraint.

  14. Questioning the "big assumptions". Part II: recognizing organizational contradictions that impede institutional change.

    Science.gov (United States)

    Bowe, Constance M; Lahey, Lisa; Kegan, Robert; Armstrong, Elizabeth

    2003-08-01

    Well-designed medical curriculum reforms can fall short of their primary objectives during implementation when unanticipated or unaddressed organizational resistance surfaces. This typically occurs if the agents for change ignore faculty concerns during the planning stage or when the provision of essential institutional safeguards to support new behaviors is neglected. Disappointing outcomes in curriculum reforms then result in the perpetuation of, or reversion to, the status quo despite the loftiest of goals. Institutional resistance to change, much like that observed during personal development, does not necessarily indicate a communal lack of commitment to the organization's newly stated goals. It may reflect the existence of competing organizational objectives that must be addressed before substantive advances in a new direction can be accomplished. The authors describe how the Big Assumptions process (see previous article) was adapted and applied at the institutional level during a school of medicine's curriculum reform. Reform leaders encouraged faculty participants to articulate their reservations about the changes under consideration, providing insights into the organization's competing commitments. The line of discussion gave faculty an opportunity to appreciate the gridlock that existed until appropriate tests of the school's long-held Big Assumptions could be conducted. The Big Assumptions process proved useful in moving faculty groups to recognize and question the validity of unchallenged institutional beliefs that were likely to undermine efforts toward change. The process also allowed the organization to put essential institutional safeguards in place that ultimately ensured that substantive reforms could be sustained.

  15. Bayou Corne sinkhole : control measurements of State Highway 70 in Assumption Parish, Louisiana.

    Science.gov (United States)

    2014-01-01

    This project measured and assessed the surface stability of the portion of LA Highway 70 that is : potentially vulnerable to the Assumption Parish sinkhole. Using Global Positioning Systems (GPS) : enhanced by a real-time network (RTN) of continuousl...

  16. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    NARCIS (Netherlands)

    Ernst, Anja F.; Albers, Casper J.

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated

  17. Reflections on the assumptions of energy policy. Viewpoint of a sceptical observer

    International Nuclear Information System (INIS)

    Taczanowski, S.; Pohorecki, W.

    2000-01-01

    The Polish energy policy assumptions up to 2020 are critically assessed. The availability of energy sources as well as predicted fuel prices are discussed for the period of interest, taking both fossil fuels and uranium into account. On this basis it is concluded that rejecting the nuclear option in Poland's energy development plans up to 2020 appears to be a serious mistake.

  18. HYDRIDE-RELATED DEGRADATION OF SNF CLADDING UNDER REPOSITORY CONDITIONS

    International Nuclear Information System (INIS)

    McCoy, K.

    2000-01-01

    The purpose and scope of this analysis/model report is to analyze the degradation of commercial spent nuclear fuel (CSNF) cladding under repository conditions by the hydride-related metallurgical processes, such as delayed hydride cracking (DHC), hydride reorientation and hydrogen embrittlement, thereby providing a better understanding of the degradation process and clarifying which aspects of the process are known and which need further evaluation and investigation. The intended use is as an input to a more general analysis of cladding degradation

  19. The underlying emotion and the dream relating dream imagery to the dreamer's underlying emotion can help elucidate the nature of dreaming.

    Science.gov (United States)

    Hartmann, Ernest

    2010-01-01

    There is a widespread consensus that emotion is important in dreams, deriving from both biological and psychological studies. However, the emphasis on examining emotions explicitly mentioned in dreams is misplaced. The dream is basically made of imagery. The focus of our group has been on relating the dream imagery to the dreamer's underlying emotion. What is most important is the underlying emotion--the emotion of the dreamer, not the emotion in the dream. This chapter discusses many studies relating the dream--especially the central image of the dream--to the dreamer's underlying emotion. Focusing on the underlying emotion leads to a coherent and testable view of the nature of dreaming. It also helps to clarify some important puzzling features of the literature on dreams, such as why the clinical literature is different in so many ways from the experimental literature, especially the laboratory-based experimental literature. Based on central image intensity and the associated underlying emotion, we can identify a hierarchy of dreams, from the highest-intensity "big dreams" to the lowest-intensity dreams from laboratory awakenings. Copyright © 2010 Elsevier Inc. All rights reserved.

  20. Land use compounds habitat losses under projected climate change in a threatened California ecosystem.

    Directory of Open Access Journals (Sweden)

    Erin Coulter Riordan

    Full Text Available Given the rapidly growing human population in mediterranean-climate systems, land use may pose a more immediate threat to biodiversity than climate change this century, yet few studies address the relative future impacts of both drivers. We assess spatial and temporal patterns of projected 21st century land use and climate change on California sage scrub (CSS), a plant association of considerable diversity and threatened status in the mediterranean-climate California Floristic Province. Using a species distribution modeling approach combined with spatially-explicit land use projections, we model habitat loss for 20 dominant shrub species under unlimited and no dispersal scenarios at two time intervals (early and late century) in two ecoregions in California (Central Coast and South Coast). Overall, projected climate change impacts were highly variable across CSS species and heavily dependent on dispersal assumptions. Projected anthropogenic land use drove greater relative habitat losses compared to projected climate change in many species. This pattern was only significant under assumptions of unlimited dispersal, however, where considerable climate-driven habitat gains offset some concurrent climate-driven habitat losses. Additionally, some of the habitat gained with projected climate change overlapped with projected land use. Most species showed potential northern habitat expansion and southern habitat contraction due to projected climate change, resulting in sharply contrasting patterns of impact between the Central and South Coast Ecoregions. In the Central Coast, dispersal could play an important role moderating losses from both climate change and land use. In contrast, high geographic overlap in habitat losses driven by projected climate change and projected land use in the South Coast underscores the potential for compounding negative impacts of both drivers. Limiting habitat conversion may be a broadly beneficial strategy under climate change.

  1. Are implicit policy assumptions about climate adaptation trying to push drinking water utilities down an impossible path?

    Science.gov (United States)

    Klasic, M. R.; Ekstrom, J.; Bedsworth, L. W.; Baker, Z.

    2017-12-01

    Extreme events such as wildfires, droughts, and flooding are projected to be more frequent and intense under a changing climate, increasing challenges to water quality management. To protect and improve public health, drinking water utility managers need to understand and plan for climate change and extreme events. This three-year study began with the assumption that improved climate projections were key to advancing climate adaptation at the local level. Through a survey (N = 259) and interviews (N = 61) with California drinking water utility managers during the peak of the state's recent drought, we found that scientific information was not a key barrier hindering adaptation. Instead, we found that managers fell into three distinct mental models based on their interactions with, perceptions of, and attitudes toward scientific information and the future of water in their system. One of the mental models, "modeled futures", is the concept most in line with how climate change scientists talk about the use of information. Drinking water utilities falling into the "modeled futures" category tend to be larger systems that have adequate capacity to both receive and use scientific information. Medium and smaller utilities in California, which more often serve rural low-income communities, tend to fall into the other two mental models, "whose future" and "no future". We show evidence that there is an implicit presumption that all drinking water utility managers should strive to align with "modeled futures" mental models. This presentation questions this assumption, as it leaves behind many utilities that need to adapt to climate change (several thousand in California alone) but may not have the technical, financial, managerial, or other capacity to do so. It is clear that no single solution or pathway to drought resilience exists for water utilities, but we argue that a more explicit understanding and definition of what it means to be a resilient drinking water utility is

  2. Ecological risk of anthropogenic pollutants to reptiles: Evaluating assumptions of sensitivity and exposure

    International Nuclear Information System (INIS)

    Weir, Scott M.; Suski, Jamie G.; Salice, Christopher J.

    2010-01-01

    A large data gap for reptile ecotoxicology still persists; therefore, ecological risk assessments of reptiles usually incorporate the use of surrogate species. This necessitates that (1) the surrogate is at least as sensitive as the target taxon and/or (2) exposures to the surrogate are greater than that of the target taxon. We evaluated these assumptions for the use of birds as surrogates for reptiles. Based on a survey of the literature, birds were more sensitive than reptiles in less than 1/4 of the chemicals investigated. Dietary and dermal exposure modeling indicated that exposure to reptiles was relatively high, particularly when the dermal route was considered. We conclude that caution is warranted in the use of avian receptors as surrogates for reptiles in ecological risk assessment and emphasize the need to better understand the magnitude and mechanism of contaminant exposure in reptiles to improve exposure and risk estimation. - Avian receptors are not universally appropriate surrogates for reptiles in ecological risk assessment.

  3. Ecological risk of anthropogenic pollutants to reptiles: Evaluating assumptions of sensitivity and exposure.

    Science.gov (United States)

    Weir, Scott M; Suski, Jamie G; Salice, Christopher J

    2010-12-01

    A large data gap for reptile ecotoxicology still persists; therefore, ecological risk assessments of reptiles usually incorporate the use of surrogate species. This necessitates that (1) the surrogate is at least as sensitive as the target taxon and/or (2) exposures to the surrogate are greater than that of the target taxon. We evaluated these assumptions for the use of birds as surrogates for reptiles. Based on a survey of the literature, birds were more sensitive than reptiles in less than 1/4 of the chemicals investigated. Dietary and dermal exposure modeling indicated that exposure to reptiles was relatively high, particularly when the dermal route was considered. We conclude that caution is warranted in the use of avian receptors as surrogates for reptiles in ecological risk assessment and emphasize the need to better understand the magnitude and mechanism of contaminant exposure in reptiles to improve exposure and risk estimation. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. Ecological risk of anthropogenic pollutants to reptiles: Evaluating assumptions of sensitivity and exposure

    Energy Technology Data Exchange (ETDEWEB)

    Weir, Scott M., E-mail: scott.weir@ttu.edu [Texas Tech University, Institute of Environmental and Human Health, Department of Environmental Toxicology, Box 41163, Lubbock, TX (United States); Suski, Jamie G., E-mail: jamie.suski@ttu.edu [Texas Tech University, Department of Biological Sciences, Box 43131, Lubbock, TX (United States); Salice, Christopher J., E-mail: chris.salice@ttu.edu [Texas Tech University, Institute of Environmental and Human Health, Department of Environmental Toxicology, Box 41163, Lubbock, TX (United States)

    2010-12-15

    A large data gap for reptile ecotoxicology still persists; therefore, ecological risk assessments of reptiles usually incorporate the use of surrogate species. This necessitates that (1) the surrogate is at least as sensitive as the target taxon and/or (2) exposures to the surrogate are greater than that of the target taxon. We evaluated these assumptions for the use of birds as surrogates for reptiles. Based on a survey of the literature, birds were more sensitive than reptiles in less than 1/4 of the chemicals investigated. Dietary and dermal exposure modeling indicated that exposure to reptiles was relatively high, particularly when the dermal route was considered. We conclude that caution is warranted in the use of avian receptors as surrogates for reptiles in ecological risk assessment and emphasize the need to better understand the magnitude and mechanism of contaminant exposure in reptiles to improve exposure and risk estimation. - Avian receptors are not universally appropriate surrogates for reptiles in ecological risk assessment.

  5. The Number of Candidate Variants in Exome Sequencing for Mendelian Disease under No Genetic Heterogeneity

    Directory of Open Access Journals (Sweden)

    Jo Nishino

    2013-01-01

    Full Text Available There has been recent success in identifying disease-causing variants in Mendelian disorders by exome sequencing followed by simple filtering techniques. Studies generally assume complete or high penetrance. However, there are likely many failed and unpublished studies due in part to incomplete penetrance or phenocopy. In this study, the expected number of candidate single-nucleotide variants (SNVs) in exome data for autosomal dominant or recessive Mendelian disorders was investigated under the assumption of “no genetic heterogeneity.” All variants were assumed to be under the “null model,” and sample allele frequencies were modeled using standard population genetics theory. To investigate the properties of pedigree data, full sibs were considered in addition to unrelated individuals. In both cases, and particularly for full sibs, the number of SNVs remained very high without controls. The high efficacy of controls was also confirmed. When controls were used with a relatively large total sample size (e.g., N = 20 or 50), filtering incorporating incomplete penetrance and phenocopy efficiently reduced the number of candidate SNVs. This suggests that such filtering is useful when the assumption of no genetic heterogeneity is appropriate, and could provide general guidelines for sample size determination.
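
    The contrast between full sibs and unrelated pairs, and the efficacy of a control filter, can be sketched with a toy null simulation; all rates and counts below are illustrative assumptions, not the paper's model.

```python
# Toy null simulation: count variants at which two affected individuals both
# carry an alternate allele (a naive dominant-model filter), for an unrelated
# pair vs. full sibs, with and without a control-based filter. All settings
# are illustrative assumptions, not the paper's model.
import numpy as np

rng = np.random.default_rng(3)
M = 20_000                              # exome variant sites, all "null"
f = rng.beta(0.2, 2.0, M)               # assumed site frequency spectrum

def genotypes(freq):
    """Allele counts 0/1/2 drawn under Hardy-Weinberg equilibrium."""
    return rng.binomial(2, freq)

def sib_pair(freq):
    """Two children of the same simulated parents, Mendelian transmission."""
    mom, dad = genotypes(freq), genotypes(freq)
    child = lambda: rng.binomial(1, mom / 2) + rng.binomial(1, dad / 2)
    return child(), child()

pairs = {"unrelated pair": (genotypes(f), genotypes(f)), "full sibs": sib_pair(f)}

controls = rng.binomial(2, f, (20, M))  # 20 unrelated controls
absent_in_controls = (controls == 0).all(axis=0)

for label, (a, b) in pairs.items():
    shared = (a > 0) & (b > 0)          # both affected carry the variant
    print(f"{label:14s} candidates: {shared.sum():6d}; "
          f"after control filter: {(shared & absent_in_controls).sum():5d}")
```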

  6. Under-recording of work-related injuries and illnesses: An OSHA priority.

    Science.gov (United States)

    Fagan, Kathleen M; Hodgson, Michael J

    2017-02-01

    A 2009 Government Accountability Office (GAO) report, along with numerous published studies, documented that many workplace injuries are not recorded on employers' recordkeeping logs required by the Occupational Safety and Health Administration (OSHA) and consequently are under-reported to the Bureau of Labor Statistics (BLS), resulting in a substantial undercount of occupational injuries in the United States. OSHA conducted a Recordkeeping National Emphasis Program (NEP) from 2009 to 2012 to identify the extent and causes of unrecorded and incorrectly recorded occupational injuries and illnesses. OSHA found recordkeeping violations in close to half of all facilities inspected. Employee interviews identified workers' fear of reprisal and employer disciplinary programs as the most important causes of under-reporting. Subsequent inspections in the poultry industry identified employer medical management policies that fostered both under-reporting and under-recording of workplace injuries and illnesses. OSHA corroborated previous research findings and identified onsite medical units as a potential new cause of both under-reporting and under-recording. Research is needed to better characterize and eliminate obstacles to the compilation of accurate occupational injury and illness data. Occupational health professionals who work with high-hazard industries where low injury rates are being recorded may wish to scrutinize recordkeeping practices carefully. This work suggests that, although many high-risk establishments manage recordkeeping with integrity, the lower the reported injury rate, the greater the likelihood of under-recording and under-reporting of work-related injuries and illnesses. Published by Elsevier Ltd.

  7. Questioning the foundations of physics which of our fundamental assumptions are wrong?

    CERN Document Server

    Foster, Brendan; Merali, Zeeya

    2015-01-01

    The essays in this book look at ways in which the fundaments of physics might need to be changed in order to make progress towards a unified theory. They are based on the prize-winning essays submitted to the FQXi essay competition “Which of Our Basic Physical Assumptions Are Wrong?”, which drew over 270 entries. As Nobel Laureate physicist Philip W. Anderson realized, the key to understanding nature’s reality is not anything “magical”, but the right attitude, “the focus on asking the right questions, the willingness to try (and to discard) unconventional answers, the sensitive ear for phoniness, self-deception, bombast, and conventional but unproven assumptions.” The authors of the eighteen prize-winning essays have, where necessary, adapted their essays for the present volume so as to (a) incorporate the community feedback generated in the online discussion of the essays, (b) add new material that has come to light since their completion and (c) ensure accessibility to a broad audience of re...

  8. Agenda dissonance: immigrant Hispanic women's and providers' assumptions and expectations for menopause healthcare.

    Science.gov (United States)

    Esposito, Noreen

    2005-02-01

    This focus group study examined immigrant Hispanic women's and providers' assumptions about and expectations of healthcare encounters in the context of menopause. Four groups of immigrant women from Central America and one group of healthcare providers were interviewed in Spanish and English, respectively. The women wanted provider-initiated, individualized anticipatory guidance about menopause, acknowledgement of their symptoms, and mainstream medical treatment for disruptive symptoms. Providers believed that menopause was an unimportant health issue for immigrant women and was overshadowed by concerns about high-risk medical problems, such as diabetes, heart disease and HIV prevention. The women expected a healthcare encounter to be patient centered, social, and complete in itself. Providers expected an encounter to be businesslike and one part of multiple visit care. Language and lack of time were barriers cited by all. Dissonance between patient-provider assumptions and expectations around issues of healthcare leads to missed opportunities for care.

  9. Maximum likelihood fitting of FROC curves under an initial-detection-and-candidate-analysis model

    International Nuclear Information System (INIS)

    Edwards, Darrin C.; Kupinski, Matthew A.; Metz, Charles E.; Nishikawa, Robert M.

    2002-01-01

    We have developed a model for FROC curve fitting that relates the observer's FROC performance not to the ROC performance that would be obtained if the observer's responses were scored on a per-image basis, but rather to a hypothesized ROC performance that the observer would obtain in the task of classifying a set of 'candidate detections' as positive or negative. We adopt the assumptions of the Bunch FROC model, namely that the observer's detections are all mutually independent, as well as assumptions qualitatively similar to, but different in nature from, those made by Chakraborty in his AFROC scoring methodology. Under the assumptions of our model, we show that the observer's FROC performance is a linearly scaled version of the candidate-analysis ROC curve, where the scaling factors are given by the FROC operating point coordinates for detecting initial candidates. Further, we show that the likelihood function of the model parameters given observational data takes on a simple form, and we develop a maximum likelihood method for fitting a FROC curve to these data. FROC and AFROC curves are produced for computer vision observer datasets and compared with the results of the AFROC scoring method. Although developed primarily with computer vision schemes in mind, we hope that the methodology presented here will prove worthy of further study in other applications as well.
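
    The stated scaling relation is easy to render numerically. The sketch below uses an assumed binormal candidate-analysis ROC and an assumed candidate-detection operating point; neither parameter set is taken from the paper.

```python
# Minimal numeric sketch of the scaling relation described above: a binormal
# candidate-analysis ROC, linearly scaled by the candidate-detection operating
# point, yields the predicted FROC curve. Parameter values are illustrative.
import numpy as np
from scipy.stats import norm

a, b = 1.5, 0.8          # binormal ROC: TPF = Phi(a + b * Phi^-1(FPF))
lam0 = 2.3               # mean false candidates per image (assumed)
nu0 = 0.9                # fraction of lesions captured as candidates (assumed)

fpf = np.linspace(1e-4, 1 - 1e-4, 200)
tpf = norm.cdf(a + b * norm.ppf(fpf))

nlf = lam0 * fpf         # non-lesion localizations per image
llf = nu0 * tpf          # lesion localization fraction

for i in (0, 99, 199):   # a few points along the predicted FROC curve
    print(f"NLF = {nlf[i]:5.3f}  ->  LLF = {llf[i]:5.3f}")
```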

  10. Temporal Distinctiveness in Task Switching: Assessing the Mixture-Distribution Assumption

    Directory of Open Access Journals (Sweden)

    James A Grange

    2016-02-01

    Full Text Available In task switching, increasing the response-cue interval (RCI) has been shown to reduce the switch cost. This has been attributed to a time-based decay process influencing the activation of memory representations of tasks (task-sets). Recently, an alternative account based on interference rather than decay has been successfully applied to these data (Horoufchin et al., 2011). In this account, variation of the RCI is thought to influence the temporal distinctiveness (TD) of episodic traces in memory, thus affecting their retrieval probability. This can affect performance because retrieval probability influences response time: if retrieval succeeds, responding is fast due to positive priming; if retrieval fails, responding is slow, due to having to perform the task via a slow algorithmic process. This account---and a recent formal model (Grange & Cross, 2015)---makes the strong prediction that all RTs are a mixture of one of two processes: a fast process when retrieval succeeds, and a slow process when retrieval fails. The present paper assesses the evidence for this mixture-distribution assumption in TD data. In a first section, statistical evidence for mixture distributions is found using the fixed-point property test. In a second section, a mathematical process model with mixture distributions at its core is fitted to the response time distribution data. Both approaches provide good evidence in support of the mixture-distribution assumption, and thus support temporal distinctiveness accounts of the data.
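
    The fixed-point property test rests on a simple fact: every mixture p*f_fast + (1-p)*f_slow passes through the point where the two component densities are equal. A numeric illustration with assumed lognormal components (not fitted values) follows.

```python
# Numeric illustration of the fixed-point property: mixture densities with
# different mixing proportions all cross where the component densities are
# equal. The lognormal components are illustrative, not fitted values.
import numpy as np
from scipy.stats import lognorm

fast = lognorm(s=0.3, scale=0.6)    # retrieval succeeds: fast RTs (s)
slow = lognorm(s=0.3, scale=1.2)    # retrieval fails: slow RTs (s)

t = np.linspace(0.1, 3.0, 2000)
f_fast, f_slow = fast.pdf(t), slow.pdf(t)

# Locate the crossing of the two component densities (the fixed point).
cross = np.where(np.diff(np.sign(f_fast - f_slow)) != 0)[0][0]

# Every mixture density takes the same value there, whatever p is.
for p in (0.25, 0.50, 0.75):
    mix = p * f_fast + (1 - p) * f_slow
    print(f"p = {p:.2f}: density at t = {t[cross]:.3f} s is {mix[cross]:.4f}")
```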

  11. 76 FR 63188 - Notification of Employee Rights Under the National Labor Relations Act

    Science.gov (United States)

    2011-10-12

    ... the National Labor Relations Act AGENCY: National Labor Relations Board. ACTION: Final rule; delay of... rule requiring employers, including labor organizations in their capacity as employers, subject to the... under the NLRA. The Board hereby amends that rule to change the effective date from November 14, 2011...

  12. An optical flow algorithm based on gradient constancy assumption for PIV image processing

    International Nuclear Information System (INIS)

    Zhong, Qianglong; Yang, Hua; Yin, Zhouping

    2017-01-01

    Particle image velocimetry (PIV) has matured as a flow measurement technique. It enables the description of the instantaneous velocity field of the flow by analyzing the particle motion obtained from digitally recorded images. The correlation-based PIV evaluation technique is widely used because of its good accuracy and robustness. Although very successful, the correlation PIV technique has some weaknesses which can be avoided by optical flow based PIV algorithms. At present, most of the optical flow methods applied to PIV are based on the brightness constancy assumption. However, some factors of flow imaging technology and the nature of fluids make the brightness constancy assumption less appropriate in real PIV cases. In this paper, an implementation of a 2D optical flow algorithm (GCOF) based on the gradient constancy assumption is introduced. The proposed GCOF assumes that the edges of the illuminated PIV particles are constant during motion. It comprises two terms: a combined local-global gradient data term and a first-order divergence and vorticity smoothing term. The approach can provide accurate dense motion fields. The approach is tested on synthetic images and on two experimental flows. The comparison of GCOF with other optical flow algorithms indicates that the proposed method is more accurate, especially under illumination variations. The comparison of GCOF with the correlation PIV technique shows that the proposed GCOF has advantages in preserving small divergence and vorticity structures of the motion field and yields fewer outliers. As a consequence, GCOF acquires a more accurate and better topological description of the turbulent flow. (paper)
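
    The gradient constancy assumption itself can be demonstrated in a few lines: for a synthetic particle image under a known uniform shift, the data-term energy sum ||grad I2(x + w) - grad I1(x)||^2 is minimized at the true displacement. This is only an illustration of the assumption (a hypothetical setup), not the paper's variational solver.

```python
# Minimal demonstration of the gradient constancy data term on a synthetic
# particle image with a known integer shift (illustration only, not GCOF).
import numpy as np

rng = np.random.default_rng(4)

# Synthetic "particle image": random Gaussian blobs on a 64x64 grid.
x, y = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
img = np.zeros((64, 64))
for cx, cy in rng.uniform(5, 59, (30, 2)):
    img += np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 4.0)

true_shift = (3, 5)
img2 = np.roll(img, true_shift, axis=(0, 1))   # uniform motion of the pattern

gx1, gy1 = np.gradient(img)
gx2, gy2 = np.gradient(img2)

def gc_energy(u, v):
    """Gradient-constancy energy for a constant candidate flow (u, v)."""
    return np.sum((np.roll(gx2, (-u, -v), axis=(0, 1)) - gx1) ** 2 +
                  (np.roll(gy2, (-u, -v), axis=(0, 1)) - gy1) ** 2)

best = min(((u, v) for u in range(8) for v in range(8)), key=lambda w: gc_energy(*w))
print("recovered shift:", best)   # expected: (3, 5)
```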

  13. Large-scale analyses of synonymous substitution rates can be sensitive to assumptions about the process of mutation.

    Science.gov (United States)

    Aris-Brosou, Stéphane; Bielawski, Joseph P

    2006-08-15

    A popular approach to examine the roles of mutation and selection in the evolution of genomes has been to consider the relationship between codon bias and synonymous rates of molecular evolution. A significant relationship between these two quantities is taken to indicate the action of weak selection on substitutions among synonymous codons. The neutral theory predicts that the rate of evolution is inversely related to the level of functional constraint. Therefore, selection against the use of non-preferred codons among those coding for the same amino acid should result in lower rates of synonymous substitution as compared with sites not subject to such selection pressures. However, reliably measuring the extent of such a relationship is problematic, as estimates of synonymous rates are sensitive to our assumptions about the process of molecular evolution. Previous studies showed the importance of accounting for unequal codon frequencies, in particular when synonymous codon usage is highly biased. Yet, unequal codon frequencies can be modeled in different ways, making different assumptions about the mutation process. Here we conduct a simulation study to evaluate two different ways of modeling uneven codon frequencies and show that both model parameterizations can have a dramatic impact on rate estimates and affect biological conclusions about genome evolution. We reanalyze three large data sets to demonstrate the relevance of our results to empirical data analysis.
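
    For concreteness, two standard parameterizations of unequal codon frequencies can be compared directly; the nucleotide frequencies below are illustrative, and the specific parameterizations compared by the authors may differ.

```python
# Two common ways to build unequal codon frequencies from nucleotide
# composition: F1x4 uses one shared nucleotide distribution, while F3x4 uses
# codon-position-specific distributions. Frequencies here are illustrative.
import itertools
import numpy as np

bases = "TCAG"
stops = {"TAA", "TAG", "TGA"}
f_shared = {"T": 0.15, "C": 0.35, "A": 0.20, "G": 0.30}            # F1x4
f_pos = [f_shared,                                                  # F3x4:
         {"T": 0.25, "C": 0.25, "A": 0.25, "G": 0.25},              # one dict
         {"T": 0.10, "C": 0.40, "A": 0.10, "G": 0.40}]              # per position

def codon_freqs(fs):
    raw = {c: np.prod([fs[i][b] for i, b in enumerate(c)])
           for c in map("".join, itertools.product(bases, repeat=3))
           if c not in stops}
    z = sum(raw.values())                  # renormalize over sense codons
    return {c: p / z for c, p in raw.items()}

f1x4 = codon_freqs([f_shared] * 3)
f3x4 = codon_freqs(f_pos)
print(f"CTG under F1x4: {f1x4['CTG']:.4f}  vs  F3x4: {f3x4['CTG']:.4f}")
```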

  14. Vocational Didactics: Core Assumptions and Approaches from Denmark, Germany, Norway, Spain and Sweden

    Science.gov (United States)

    Gessler, Michael; Moreno Herrera, Lázaro

    2015-01-01

    The design of vocational didactics has to meet special requirements. Six core assumptions are identified: outcome orientation, cultural-historical embedding, horizontal structure, vertical structure, temporal structure, and the changing nature of work. Different approaches and discussions from school-based systems (Spain and Sweden) and dual…

  15. Computational studies of global nuclear energy development under the assumption of the world's heterogeneous development

    International Nuclear Information System (INIS)

    Egorov, A.F.; Korobejnikov, V.V.; Poplavskaya, E.V.; Fesenko, G.A.

    2013-01-01

    The authors study a mathematical model of global nuclear energy development until the end of this century. For a comparative scenario analysis of transitions to sustainable nuclear energy systems, models of a heterogeneous world with an allowance for specific national development are investigated.

  16. Cirrus Cloud Optical Thickness and Effective Diameter Retrieved by MODIS: Impacts of Single Habit Assumption, 3-D Radiative Effects, and Cloud Inhomogeneity

    Science.gov (United States)

    Zhou, Yongbo; Sun, Xuejin; Mielonen, Tero; Li, Haoran; Zhang, Riwei; Li, Yan; Zhang, Chuanliang

    2018-01-01

    For inhomogeneous cirrus clouds, the cloud optical thickness (COT) and effective diameter (De) provided by the Moderate Resolution Imaging Spectroradiometer (MODIS) Collection 6 cloud products are associated with errors due to the single habit assumption (SHA), independent pixel assumption (IPA), photon absorption effect (PAE), and plane-parallel assumption (PPA). SHA means that every cirrus cloud is assumed to consist of ice crystals of the same habit. IPA errors are caused by three-dimensional (3D) radiative effects. PPA and PAE errors are caused by cloud inhomogeneity. We proposed a method to single out these different errors. These errors were examined using Spherical Harmonics Discrete Ordinate Method simulations done for the MODIS 0.86 μm and 2.13 μm bands. Four midlatitude and tropical cirrus cases were studied. For the COT retrieval, the impacts of SHA and IPA were especially large for optically thick cirrus cases. SHA errors in COT varied distinctly with scattering angle. For the De retrieval, SHA decreased De under most circumstances. PAE decreased De for optically thick cirrus cases. For the COT and De retrievals, the dominant error source was SHA for overhead sun, whereas for oblique sun it could be any of SHA, IPA, and PAE, varying with cirrus cases and sun-satellite viewing geometries. On the domain average, the SHA errors in COT (De) were within -16.1% to +42.6% (-38.7% to +2.0%), whereas the errors in COT (De) induced by 3D radiative effects and by cloud inhomogeneity were within -5.6% to +19.6% (-2.9% to +8.0%) and -2.6% to 0% (-3.7% to +9.8%), respectively.

  17. Anti-Atheist Bias in the United States: Testing Two Critical Assumptions

    OpenAIRE

    Swan, Lawton K; Heesacker, Martin

    2012-01-01

    Decades of opinion polling and empirical investigations have clearly demonstrated a pervasive anti-atheist prejudice in the United States. However, much of this scholarship relies on two critical and largely unaddressed assumptions: (a) that when people report negative attitudes toward atheists, they do so because they are reacting specifically to their lack of belief in God; and (b) that survey questions asking about attitudes toward atheists as a group yield reliable information about biase...

  18. BPA review of Washington Public Power Supply System, Projects 1 and 3 (WNP 1 and 3), construction schedule and financing assumptions

    International Nuclear Information System (INIS)

    1984-01-01

    This document contains the following appendices: Data Provided by the Supply System Regarding Costs and Schedules; Basic Supply System Data and Assumptions; Detailed Modeling of Net Present Values; Origin and Detailed Description of the System Analysis Model; Decision Analysis Model; Pro Forma Budget Expenditure Levels for Fiscal Years 1984 through 1990; Financial Flexibility Analysis - Discretionary/Nondiscretionary Expenditure Levels; Detailed Analysis of BPA's Debt Structure Under the 13 Pro Forma Budget Scenarios for Fiscal Years 1984 through 1990; Wertheim and Co., Inc., August 30, 1984 Letter; Project Considerations and Licensing/Regulatory Issues, Supply System September 15, 1984 Letter; and Summary of Litigation Affecting WNP 1 and 3, and WNP 4 and 5

  19. Proposed optical test of Bell's inequalities not resting upon the fair sampling assumption

    International Nuclear Information System (INIS)

    Santos, Emilio

    2004-01-01

    Arguments are given against the fair sampling assumption, used to claim an empirical disproof of local realism. New tests are proposed, able to discriminate between quantum mechanics and a restricted, but appealing, family of local hidden-variables models. Such tests require detectors with efficiencies just above 20%

  20. Post Stereotypes: Deconstructing Racial Assumptions and Biases through Visual Culture and Confrontational Pedagogy

    Science.gov (United States)

    Jung, Yuha

    2015-01-01

    The Post Stereotypes project embodies confrontational pedagogy and involves postcard artmaking designed to both solicit expression of and deconstruct students' racial, ethnic, and cultural stereotypes and assumptions. As part of the Cultural Diversity in American Art course, students created postcard art that visually represented their personal…

  1. Benchmarking biological nutrient removal in wastewater treatment plants: influence of mathematical model assumptions

    DEFF Research Database (Denmark)

    Flores-Alsina, Xavier; Gernaey, Krist V.; Jeppsson, Ulf

    2012-01-01

    This paper examines the effect of different model assumptions when describing biological nutrient removal (BNR) by the activated sludge models (ASM) 1, 2d & 3. The performance of a nitrogen removal (WWTP1) and a combined nitrogen and phosphorus removal (WWTP2) benchmark wastewater treatment plant...

  2. Factors underlying male and female use of violent video games

    NARCIS (Netherlands)

    Hartmann, T.; Möller, I.; Krause, C.

    2015-01-01

    Research has consistently shown that males play violent video games more frequently than females, but factors underlying this gender gap have not been examined to date. This approach examines the assumption that males play violent video games more because they anticipate more enjoyment and less

  3. Simulation of growing grains under orientation relation - dependent quadruple point dragging

    International Nuclear Information System (INIS)

    Ito, K

    2015-01-01

    The growth behaviour of a specified grain embedded in matrix grains, for which the migration mobility of the quadruple points depended on the relation between the orientations of the growing and shrinking grains, was studied using a modified Potts Monte Carlo (MC)-type three-dimensional simulation. Large embedded grains continued to grow without being overcome by coarsening matrix grains, whereas small embedded grains disappeared, depending on the relative mobilities of the quadruple points, the composition of the matrix grain texture and the width of the grain size distribution of the matrix grains. These results indicate that orientation relation-dependent quadruple point dragging can affect the recrystallization texture during the grain coarsening stage. (paper)

  4. Review and Discussion on the Key Assumptions and Challenges Surrounding the Use of {sup 7}Be as a Soil and Sediment Tracer

    Energy Technology Data Exchange (ETDEWEB)

    Mabit, L. [Soil and Water Management and Crop Nutrition Laboratory, IAEA (Austria); Taylor, A.; Blake, W. H. [School of Geography, Earth and Environmental Sciences, Plymouth University (United Kingdom); Smith, H. G. [School of Environmental Sciences, University of Liverpool (United Kingdom); Keith-Roach, M. J. [Kemakta Konsult, Stockholm (Sweden)

    2014-01-15

    as a hillslope soil erosion tracer unless suitable studies are undertaken to demonstrate otherwise. Plant interception and potential uptake are likely to contribute to significant heterogeneity. Therefore it is vital that a suitable period of radioactive decay separate pre-harvest inventory and pre-harvest sampling. The second key assumption is rapid sorption of the tracer to soil particles. Rapid sorption of {sup 7}Be to soil particles upon fallout is assumed to produce shallow depth distributions in the soil, and laboratory batch studies of this sorption have been reported. Applications of {sup 7}Be as a tracer have overlooked the potential for high rates of infiltration through preferential flow pathways to increase sorption time, thus influencing depth distributions, which has implications for erosion modelling using current conversion models. Furthermore, there is potential for {sup 7}Be to be transported in the dissolved phase in overland flow, and this remains a key area for research to determine the influence of this process upon redistribution estimates. As a tracer at the catchment scale, {sup 7}Be offers a unique opportunity to provide an indication of recent sedimentation and the transport of surface material, which could make a significant contribution to catchment management schemes. Successful use at this scale does, however, rest upon support for the third assumption of irreversible sorption to soil particles in a range of environments, and such support is currently lacking. Knowledge of {sup 7}Be behaviour with changing physicochemical parameters in fluvial environments is conflicting, and there is evidence to suggest that {sup 7}Be may be mobilised under reducing, saline or low pH conditions. The impact upon sorption behaviour is likely to be highly site specific, and it is a priority that future laboratory studies are coupled with in situ monitoring of parameters to determine the likelihood of increased tracer mobility under representative conditions and timescales.

  5. Causal analysis of ordinal treatments and binary outcomes under truncation by death.

    Science.gov (United States)

    Wang, Linbo; Richardson, Thomas S; Zhou, Xiao-Hua

    2017-06-01

    It is common that in multi-arm randomized trials, the outcome of interest is "truncated by death," meaning that it is only observed or well-defined conditioning on an intermediate outcome. In this case, in addition to pairwise contrasts, the joint inference for all treatment arms is also of interest. Under a monotonicity assumption we present methods for both pairwise and joint causal analyses of ordinal treatments and binary outcomes in presence of truncation by death. We illustrate via examples the appropriateness of our assumptions in different scientific contexts.

  6. Ancestral assumptions and the clinical uncertainty of evolutionary medicine.

    Science.gov (United States)

    Cournoyea, Michael

    2013-01-01

    Evolutionary medicine is an emerging field of medical studies that uses evolutionary theory to explain the ultimate causes of health and disease. Educational tools, online courses, and medical school modules are being developed to help clinicians and students reconceptualize health and illness in light of our evolutionary past. Yet clinical guidelines based on our ancient life histories are epistemically weak, relying on the controversial assumptions of adaptationism and advocating a strictly biophysical account of health. To fulfill the interventionist goals of clinical practice, it seems that proximate explanations are all we need to develop successful diagnostic and therapeutic guidelines. Considering these epistemic concerns, this article argues that the clinical relevance of evolutionary medicine remains uncertain at best.

  7. Polarized BRDF for coatings based on three-component assumption

    Science.gov (United States)

    Liu, Hong; Zhu, Jingping; Wang, Kai; Xu, Rong

    2017-02-01

    A pBRDF (polarized bidirectional reflectance distribution function) model for coatings is given based on a three-component reflection assumption in order to improve the polarized scattering simulation capability for space objects. In this model, the specular reflection is described using microfacet theory, while the multiple reflection and volume scattering components are given separately according to experimental results. The polarization of the specular reflection is derived from Fresnel's law, and both the multiple reflection and volume scattering components are assumed to be depolarized. Simulation and measurement results for two satellite coating samples, SR107 and S781, are given to validate that the pBRDF modeling accuracy can be significantly improved by the three-component model given in this paper.
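
    As a rough illustration of the specular polarization piece of such a model, the sketch below computes Fresnel s- and p-reflectances for a smooth dielectric coating; the refractive index value and the treatment of the two diffuse components as fully depolarized are assumptions of this sketch, not parameters from the paper.

    ```python
    import numpy as np

    def fresnel_rs_rp(n, theta_i):
        """Fresnel intensity reflectances for s- and p-polarization at a
        smooth dielectric interface with relative refractive index n."""
        cos_i = np.cos(theta_i)
        sin_t = np.sin(theta_i) / n            # Snell's law
        cos_t = np.sqrt(1.0 - sin_t**2)
        rs = ((cos_i - n * cos_t) / (cos_i + n * cos_t)) ** 2
        rp = ((n * cos_i - cos_t) / (n * cos_i + cos_t)) ** 2
        return rs, rp

    def degree_of_polarization(n, theta_i):
        """DoP of the specular term; diffuse terms are taken as depolarized."""
        rs, rp = fresnel_rs_rp(n, theta_i)
        return (rs - rp) / (rs + rp)

    # Near Brewster's angle the specular reflection is almost fully polarized:
    theta_b = np.arctan(1.5)
    print(degree_of_polarization(1.5, theta_b))  # ~1.0
    ```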

  8. Technoeconomic assumptions adopted for the development of a long-term electricity supply model for Cyprus.

    Science.gov (United States)

    Taliotis, Constantinos; Taibi, Emanuele; Howells, Mark; Rogner, Holger; Bazilian, Morgan; Welsch, Manuel

    2017-10-01

    The generation mix of Cyprus has been dominated by oil products for decades. In order to conform with European Union and international legislation, a transformation of the supply system is called for. Energy system models can facilitate energy planning into the future, but a large volume of data is required to populate such models. The present data article provides information on key modelling assumptions and input data adopted with the aim of representing the electricity supply system of Cyprus in a separate research article. Data in regards to renewable energy technoeconomic characteristics and investment cost projections, fossil fuel price projections, storage technology characteristics and system operation assumptions are described in this article.

  9. Technoeconomic assumptions adopted for the development of a long-term electricity supply model for Cyprus

    Directory of Open Access Journals (Sweden)

    Constantinos Taliotis

    2017-10-01

    Full Text Available The generation mix of Cyprus has been dominated by oil products for decades. In order to conform with European Union and international legislation, a transformation of the supply system is called for. Energy system models can facilitate energy planning into the future, but a large volume of data is required to populate such models. The present data article provides information on key modelling assumptions and input data adopted with the aim of representing the electricity supply system of Cyprus in a separate research article. Data in regards to renewable energy technoeconomic characteristics and investment cost projections, fossil fuel price projections, storage technology characteristics and system operation assumptions are described in this article.

  10. Investigating Teachers’ and Students’ Beliefs and Assumptions about CALL Programme at Caledonian College of Engineering

    Directory of Open Access Journals (Sweden)

    Holi Ibrahim Holi Ali

    2012-01-01

    Full Text Available This study is set to investigate students’ and teachers’ perceptions and assumptions about the newly implemented CALL Programme at the School of Foundation Studies, Caledonian College of Engineering, Oman. Two versions of a questionnaire were administered to 24 teachers and 90 students to collect their beliefs and assumptions about the CALL programme. The results show that the great majority of the students report that CALL is very interesting, motivating and useful to them and that they learn a lot from it. However, the number of CALL hours should be increased, the lab should be equipped and arranged in a user-friendly way, assessment should be integrated into CALL, and smart boards and blackboards should be incorporated into the programme.

  11. A biomechanical testing system to determine micromotion between hip implant and femur accounting for deformation of the hip implant: Assessment of the influence of rigid body assumptions on micromotions measurements.

    Science.gov (United States)

    Leuridan, Steven; Goossens, Quentin; Roosen, Jorg; Pastrav, Leonard; Denis, Kathleen; Mulier, Michiel; Desmet, Wim; Vander Sloten, Jos

    2017-02-01

    Accurate pre-clinical evaluation of the initial stability of new cementless hip stems using in vitro micromotion measurements is an important step in the design process to assess the new stem's potential. Several measuring systems, linear variable displacement transducer (LVDT)-based and others, require assuming bone or implant to be rigid to obtain micromotion values or to calculate derived quantities such as relative implant tilting. An alternative LVDT-based measuring system not requiring a rigid body assumption was developed in this study. The system combined advantages of local unidirectional and frame-and-bracket micromotion measuring concepts. The influence and possible errors that would be made by adopting a rigid body assumption were quantified. Furthermore, as the system allowed emulating local unidirectional and frame-and-bracket systems, the influence of adopting rigid body assumptions was also analyzed for both concepts. Synthetic and embalmed bone models were tested in combination with primary and revision implants. Single-legged stance phase loading was applied to the implant-bone constructs. Adopting a rigid body assumption resulted in an overestimation of mediolateral micromotion of up to 49.7 μm at more distal measuring locations. Maximal average relative rotational motion was overestimated by 0.12° around the anteroposterior axis. Frontal and sagittal tilting calculations based on a unidirectional measuring concept underestimated the true tilting by an order of magnitude. Non-rigid behavior is a factor that should not be dismissed in micromotion stability evaluations of primary and revision femoral implants. Copyright © 2017 Elsevier Ltd. All rights reserved.
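
    To see why a rigid body assumption can inflate derived tilting, consider the naive calculation that converts two displacement readings into a tilt angle. The numbers below are hypothetical and the bending correction is purely schematic, not the authors' method.

    ```python
    import math

    def tilt_deg(disp_prox_um, disp_dist_um, spacing_mm):
        """Relative tilt (degrees) inferred from two displacement readings
        separated by `spacing_mm`, assuming the body between them is rigid."""
        dz_mm = (disp_prox_um - disp_dist_um) / 1000.0
        return math.degrees(math.atan2(dz_mm, spacing_mm))

    # If the implant itself bends, part of the displacement difference is
    # elastic deformation, not rigid tilting, so the estimate is biased:
    rigid_estimate = tilt_deg(60.0, 10.0, 40.0)           # all motion as tilt
    bending_um = 45.0                                     # hypothetical bend
    corrected = tilt_deg(60.0 - bending_um, 10.0, 40.0)   # deformation removed
    print(rigid_estimate, corrected)
    ```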

  12. Towards New Probabilistic Assumptions in Business Intelligence

    Directory of Open Access Journals (Sweden)

    Schumann Andrew

    2015-01-01

    Full Text Available One of the main assumptions behind the mathematical tools of science is the idea that reality is measurable and additive. For exploring the physical universe, additive measures such as mass, force, energy and temperature are used. Economics and conventional business intelligence try to continue this empiricist tradition, and in statistical and econometric tools they appeal only to the measurable aspects of reality. However, many important variables of economic systems are, in principle, neither observable nor additive. These variables can be called symbolic values or symbolic meanings and studied within symbolic interactionism, the theory developed from the work of George Herbert Mead and Herbert Blumer. In statistical and econometric tools of business intelligence we accept only phenomena with causal connections measured by additive measures. In the paper we show that in the social world we deal with symbolic interactions which can be studied by non-additive labels (symbolic meanings or symbolic values. To accommodate the variety of such phenomena we should drop the additivity of basic labels and construct a new probabilistic method for business intelligence based on non-Archimedean probabilities.

  13. 26 CFR 1.752-6 - Partnership assumption of partner's section 358(h)(3) liability after October 18, 1999, and...

    Science.gov (United States)

    2010-04-01

    ... general. If, in a transaction described in section 721(a), a partnership assumes a liability (defined in...) does not apply to an assumption of a liability (defined in section 358(h)(3)) by a partnership as part... 26 Internal Revenue 8 2010-04-01 2010-04-01 false Partnership assumption of partner's section 358...

  14. Assessing the skill of hydrology models at simulating the water cycle in the HJ Andrews LTER: Assumptions, strengths and weaknesses

    Science.gov (United States)

    Simulated impacts of climate on hydrology can vary greatly as a function of the scale of the input data, model assumptions, and model structure. Four models are commonly used to simulate streamflow in ...

  15. Is a "Complex" Task Really Complex? Validating the Assumption of Cognitive Task Complexity

    Science.gov (United States)

    Sasayama, Shoko

    2016-01-01

    In research on task-based learning and teaching, it has traditionally been assumed that differing degrees of cognitive task complexity can be inferred through task design and/or observations of differing qualities in linguistic production elicited by second language (L2) communication tasks. Without validating this assumption, however, it is…

  16. Tackle-related injury rates and nature of injuries in South African Youth Week tournament rugby union players (under-13 to under-18): an observational cohort study.

    Science.gov (United States)

    Burger, Nicholas; Lambert, Mike I; Viljoen, Wayne; Brown, James C; Readhead, Clint; Hendricks, Sharief

    2014-08-12

    The tackle situation is most often associated with the high injury rates in rugby union. Tackle injury epidemiology in rugby union has previously been focused on senior cohorts but less is known about younger cohorts. The aim of this study was to report on the nature and rates of tackle-related injuries in South African youth rugby union players representing their provinces at national tournaments. Observational cohort study. Four South African Youth Week tournaments (under-13 Craven Week, under-16 Grant Khomo Week, under-18 Academy Week, under-18 Craven Week). Injury data were collected from 3652 youth rugby union players (population at risk) in 2011 and 2012. Tackle-related injury severity ('time-loss' and 'medical attention'), type and location, injury rate per 1000 h (including 95% CIs). Injury rate ratios (IRR) were calculated and modelled using a Poisson regression. A χ2 analysis was used to detect linear trends between injuries and increasing match quarters. The 2012 under-13 Craven Week had a significantly greater 'time-loss' injury rate when compared with the 2012 under-18 Academy Week (IRR=4.43; 95% CI 2.13 to 9.21, p<0.05) and under-18 Craven Week (IRR=3.52; 95% CI 1.54 to 8.00, p<0.05). The Poisson regression also revealed a higher probability of 'overall' ('time-loss' and 'medical attention' combined) and 'time-loss' tackle-related injuries occurring at the under-13 Craven Week. The proportion of 'overall' and 'time-loss' injuries increased significantly with each quarter of the match when all four tournaments were combined (p<0.05). There was a difference in the tackle-related injury rate between the under-13 tournament and the two under-18 tournaments, and the tackle-related injury rate was higher in the final quarter of matches. Ongoing injury surveillance is required to better interpret these findings. Injury prevention strategies targeting the tackle may only be effective once the rate and nature of injuries have been accurately determined.
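
    For readers unfamiliar with how such rate ratios are modelled, the sketch below fits a Poisson regression with a log-exposure offset using statsmodels; the counts, hours and the single under-13 indicator are hypothetical stand-ins, not the study's data.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical per-tournament injury counts and exposure hours:
    counts = np.array([24, 10, 9, 8])                # 'time-loss' injuries
    exposure_h = np.array([900, 1400, 2100, 1900])   # player-match hours
    u13 = np.array([1, 0, 0, 0])                     # under-13 indicator

    X = sm.add_constant(u13)
    model = sm.GLM(counts, X, family=sm.families.Poisson(),
                   offset=np.log(exposure_h)).fit()

    irr = np.exp(model.params[1])        # rate ratio: under-13 vs others
    ci = np.exp(model.conf_int()[1])     # 95% CI on the IRR
    print(irr, ci)
    ```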

  17. 40 CFR 86.1917 - How does in-use testing under this subpart relate to the emission-related warranty in Section 207...

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false How does in-use testing under this...) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES (CONTINUED) Manufacturer-Run In-Use Testing Program for Heavy-Duty Diesel Engines § 86.1917 How does in-use testing under this subpart relate...

  18. Tests of the frozen-flux and tangentially geostrophic assumptions using magnetic satellite data

    DEFF Research Database (Denmark)

    Chulliat, A.; Olsen, Nils; Sabaka, T.

    In 1984, Jean-Louis Le Mouël published a paper suggesting that the flow at the top of the Earth’s core is tangentially geostrophic, i.e., the Lorentz force is much smaller than the Coriolis force in this particular region of the core. This new assumption was subsequently used to discriminate among...

  19. All projects related to | Page 439 | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ICT Access and Usage in Higher Education in South Africa ... The assumption underlying information and communication technology (ICT) in higher education policy ... Launched in 2003, Research ICT Africa (RIA) has successfully conducted ...

  20. Teaching and Learning Science in the 21st Century: Challenging Critical Assumptions in Post-Secondary Science

    Directory of Open Access Journals (Sweden)

    Amanda L. Glaze

    2018-01-01

    Full Text Available It is widely agreed upon that the goal of science education is building a scientifically literate society. Although there are a range of definitions for science literacy, most involve an ability to problem solve, make evidence-based decisions, and evaluate information in a manner that is logical. Unfortunately, science literacy appears to be an area where we struggle across levels of study, including with students who are majoring in the sciences in university settings. One reason for this problem is that we have opted to continue to approach teaching science in a way that fails to consider the critical assumptions that faculties in the sciences bring into the classroom. These assumptions include expectations of what students should know before entering given courses, whose responsibility it is to ensure that students entering courses understand basic scientific concepts, the roles of researchers and teachers, and approaches to teaching at the university level. Acknowledging these assumptions and the potential for action to shift our teaching and thinking about post-secondary education represents a transformative area in science literacy and preparation for the future of science as a field.

  1. Tackle-related injury rates and nature of injuries in South African Youth Week tournament rugby union players (under-13 to under-18): an observational cohort study

    Science.gov (United States)

    Burger, Nicholas; Lambert, Mike I; Viljoen, Wayne; Brown, James C; Readhead, Clint; Hendricks, Sharief

    2014-01-01

    Objectives The tackle situation is most often associated with the high injury rates in rugby union. Tackle injury epidemiology in rugby union has previously been focused on senior cohorts but less is known about younger cohorts. The aim of this study was to report on the nature and rates of tackle-related injuries in South African youth rugby union players representing their provinces at national tournaments. Design Observational cohort study. Setting Four South African Youth Week tournaments (under-13 Craven Week, under-16 Grant Khomo Week, under-18 Academy Week, under-18 Craven Week). Participants Injury data were collected from 3652 youth rugby union players (population at risk) in 2011 and 2012. Outcome measures Tackle-related injury severity (‘time-loss’ and ‘medical attention’), type and location, injury rate per 1000 h (including 95% CIs). Injury rate ratios (IRR) were calculated and modelled using a Poisson regression. A χ2 analysis was used to detect linear trends between injuries and increasing match quarters. Results The 2012 under-13 Craven Week had a significantly greater ‘time-loss’ injury rate when compared with the 2012 under-18 Academy Week (IRR=4.43; 95% CI 2.13 to 9.21, p<0.05) and under-18 Craven Week (IRR=3.52; 95% CI 1.54 to 8.00, p<0.05). The Poisson regression also revealed a higher probability of ‘overall’ (‘time-loss’ and ‘medical attention’ combined) and ‘time-loss’ tackle-related injuries occurring at the under-13 Craven Week. The proportion of ‘overall’ and ‘time-loss’ injuries increased significantly with each quarter of the match when all four tournaments were combined (p<0.05). Conclusions There was a difference in the tackle-related injury rate between the under-13 tournament and the two under-18 tournaments, and the tackle-related injury rate was higher in the final quarter of matches. Ongoing injury surveillance is required to better interpret these findings. Injury prevention strategies

  2. On the Validity of the “Thin” and “Thick” Double-Layer Assumptions When Calculating Streaming Currents in Porous Media

    Directory of Open Access Journals (Sweden)

    Matthew D. Jackson

    2012-01-01

    Full Text Available We find that the thin double layer assumption, in which the thickness of the electrical diffuse layer is assumed small compared to the radius of curvature of a pore or throat, is valid in a capillary tubes model so long as the capillary radius is >200 times the double layer thickness, while the thick double layer assumption, in which the diffuse layer is assumed to extend across the entire pore or throat, is valid so long as the capillary radius is more than 6 times smaller than the double layer thickness. At low surface charge density or high brine concentration (>0.5 M, the validity criteria are less stringent. Our results suggest that the thin double layer assumption is valid in sandstones at low specific surface charge (<10 mC⋅m−2, but may not be valid in sandstones of moderate- to small pore-throat size at higher surface charge if the brine concentration is low (<0.001 M. The thick double layer assumption is likely to be valid in mudstones at low brine concentration (<0.1 M and surface charge (<10 mC⋅m−2, but at higher surface charge, it is likely to be valid only at low brine concentration (<0.003 M. Consequently, neither assumption may be valid in mudstones saturated with natural brines.
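
    The thin/thick criteria above are stated relative to the double layer (Debye) thickness; the sketch below computes it for a 1:1 electrolyte so the criteria can be checked for a given capillary radius. The permittivity and temperature values are standard assumptions.

    ```python
    import math

    def debye_length_nm(ionic_strength_mol_per_l, temp_k=298.15):
        """Debye (diffuse-layer) thickness in nm for a 1:1 electrolyte in water."""
        eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
        eps_r = 78.5              # relative permittivity of water (assumed)
        kb = 1.380649e-23         # Boltzmann constant, J/K
        e = 1.602176634e-19       # elementary charge, C
        na = 6.02214076e23        # Avogadro's number, 1/mol
        ionic_strength = ionic_strength_mol_per_l * 1000.0  # mol/m^3
        lam = math.sqrt(eps0 * eps_r * kb * temp_k /
                        (2.0 * na * e**2 * ionic_strength))
        return lam * 1e9

    # Thin double layer requires capillary radius > 200 * Debye length:
    for conc in (1.0, 0.01, 0.001):
        lam = debye_length_nm(conc)
        print(f"{conc} M: Debye length ~{lam:.1f} nm, "
              f"thin-layer limit ~{200 * lam / 1000:.1f} um capillary radius")
    ```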

  3. Multiplant strategy under core-periphery structure

    OpenAIRE

    Tsubota, Kenmei

    2012-01-01

    A typical implicit assumption in monopolistic competition models for trade and economic geography is that firms can produce and sell only at one place. This paper allows endogenous determination of the number of plants in a new economic geography model and examines the stable outcomes of the organizational choice between single-plant and multi-plant configurations in two regions. We explicitly consider the firms' trade-off between larger economies of scale under single plant configuration and the saving in interr...

  4. Testing Mean Differences among Groups: Multivariate and Repeated Measures Analysis with Minimal Assumptions.

    Science.gov (United States)

    Bathke, Arne C; Friedrich, Sarah; Pauly, Markus; Konietschke, Frank; Staffen, Wolfgang; Strobl, Nicolas; Höller, Yvonne

    2018-03-22

    To date, there is a lack of satisfactory inferential techniques for the analysis of multivariate data in factorial designs, when only minimal assumptions on the data can be made. Presently available methods are limited to very particular study designs or assume either multivariate normality or equal covariance matrices across groups, or they do not allow for an assessment of the interaction effects across within-subjects and between-subjects variables. We propose and methodologically validate a parametric bootstrap approach that does not suffer from any of the above limitations, and thus provides a rather general and comprehensive methodological route to inference for multivariate and repeated measures data. As an example application, we consider data from two different Alzheimer's disease (AD) examination modalities that may be used for precise and early diagnosis, namely, single-photon emission computed tomography (SPECT) and electroencephalogram (EEG). These data violate the assumptions of classical multivariate methods, and indeed classical methods would not have yielded the same conclusions with regard to some of the factors involved.
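
    A minimal sketch of the general idea follows, assuming a simple Wald-type statistic rather than the authors' exact test: each group is resampled from a multivariate normal with its own estimated covariance, so no equal-covariance assumption enters the null distribution.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def wald_stat(groups):
        """Wald-type statistic for equality of multivariate means."""
        grand = np.mean([g.mean(axis=0) for g in groups], axis=0)
        return sum(len(g) * (g.mean(axis=0) - grand) @ (g.mean(axis=0) - grand)
                   for g in groups)

    def parametric_bootstrap_p(groups, n_boot=2000):
        """Parametric bootstrap: resample each group from a multivariate
        normal with its own covariance, centred at zero (the null)."""
        observed = wald_stat(groups)
        covs = [np.cov(g, rowvar=False) for g in groups]
        boot = np.empty(n_boot)
        for b in range(n_boot):
            null_groups = [rng.multivariate_normal(np.zeros(g.shape[1]), c,
                                                   size=len(g))
                           for g, c in zip(groups, covs)]
            boot[b] = wald_stat(null_groups)
        return np.mean(boot >= observed)

    # Hypothetical two-group, three-variable example with unequal covariances:
    g1 = rng.multivariate_normal([0, 0, 0], np.eye(3), size=25)
    g2 = rng.multivariate_normal([0.8, 0, 0], 2 * np.eye(3), size=40)
    print(parametric_bootstrap_p([g1, g2]))
    ```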

  5. Rethinking The Going Concern Assumption As A Pre-Condition For Accounting Measurement

    OpenAIRE

    Saratiel Wedzerai Musvoto; Daan G Gouws

    2011-01-01

    This study compares the principles of the going concern concept against the principles of representational measurement to determine if it is possible to establish foundations of accounting measurement with the going concern concept as a precondition. Representational measurement theory is a theory that establishes measurement in social scientific disciplines such as accounting. The going concern assumption is prescribed as one of the preconditions for measuring the attributes of the elements ...

  6. Goals, dilemmas and assumptions in infant feeding education and support. Applying theory of constraints thinking tools to develop new priorities for action.

    Science.gov (United States)

    Trickey, Heather; Newburn, Mary

    2014-01-01

    Three important infant feeding support problems are addressed: (1) mothers who use formula milk can feel undersupported and judged; (2) mothers can feel underprepared for problems with breastfeeding; and (3) many mothers who might benefit from breastfeeding support do not access help. Theory of constraints (TOC) is used to examine these problems in relation to ante-natal education and post-natal support. TOC suggests that long-standing unresolved problems or 'undesirable effects' in any system (in this case a system to provide education and support) are caused by conflicts, or dilemmas, within the system, which might not be explicitly acknowledged. Potential solutions are missed by failure to question assumptions which, when interrogated, often turn out to be invalid. Three core dilemmas relating to the three problems are identified, articulated and explored using TOC methodology. These are whether to: (1) promote feeding choice or promote breastfeeding; (2) present breastfeeding positively, as straightforward and rewarding, or focus on preparing mothers for problems; and (3) offer support proactively or ensure that mothers themselves initiate requests for support. Assumptions are identified and interrogated, leading to clarified priorities for action relating to each problem. These are: (1) to shift the focus from initial decision-making towards support for mothers throughout their feeding journeys, enabling and protecting decisions to breastfeed as one aspect of ongoing support; (2) to promote the concept of an early-weeks investment and adjustment period during which breastfeeding is established; and (3) to develop more proactive mother-centred models of support for all forms of infant feeding. © 2012 John Wiley & Sons Ltd.

  7. Cognition and relative importance underlying consumer valuation of park-and-ride facilities

    NARCIS (Netherlands)

    Bos, D.M.; Molin, E.J.E.; Timmermans, H.J.P.; vd Heijden, R.E.C.M.

    2003-01-01

    Results are reported of a study designed to identify the cognitive constructs underlying the valuation of park-and-ride (P&R) facilities and to measure the relative importance attached to the attributes of such facilities. Results show that the reliability of public transport is quite important.

  8. Absolute continuity under time shift of trajectories and related stochastic calculus

    CERN Document Server

    Löbus, Jörg-Uwe

    2017-01-01

    The text is concerned with a class of two-sided stochastic processes of the form X=W+A. Here W is a two-sided Brownian motion with random initial data at time zero and A\\equiv A(W) is a function of W. Elements of the related stochastic calculus are introduced. In particular, the calculus is adjusted to the case when A is a jump process. Absolute continuity of (X,P) under time shift of trajectories is investigated. For example under various conditions on the initial density with respect to the Lebesgue measure, m, and on A with A_0=0 we verify \\frac{P(dX_{\\cdot -t})}{P(dX_\\cdot)}=\\frac{m(X_{-t})}{m(X_0)}\\cdot \\prod_i\\left|\

  9. “Marginal land” for energy crops: Exploring definitions and embedded assumptions

    International Nuclear Information System (INIS)

    Shortall, O.K.

    2013-01-01

    The idea of using less productive or “marginal land” for energy crops is promoted as a way to overcome the previous land use controversies faced by biofuels. It is argued that marginal land use would not compete with food production, is widely available and would incur fewer environmental impacts. This term is notoriously vague however, as are the details of how marginal land use for energy crops would work in practice. This paper explores definitions of the term “marginal land” in academic, consultancy, NGO, government and industry documents in the UK. It identifies three separate definitions of the term: land unsuitable for food production; ambiguous lower quality land; and economically marginal land. It probes these definitions further by exploring the technical, normative and political assumptions embedded within them. It finds that the first two definitions are normatively motivated: this land should be used to overcome controversies and the latter definition is predictive: this land is likely to be used. It is important that the different advantages, disadvantages and implications of the definitions are spelled out so definitions are not conflated to create unrealistic expectations about the role of marginal land in overcoming biofuels land use controversies. -- Highlights: •Qualitative methods were used to explore definitions of the term “marginal land”. •Three definitions were identified. •Two definitions focus on overcoming biomass land use controversies. •One definition predicts what land will be used for growing biomass. •Definitions contain problematic assumptions

  10. NEUROBIOLOGICAL AND PSYCHOPATHOLOGICAL MECHANISMS UNDERLYING ADDICTION-LIKE BEHAVIORS: AN OVERVIEW AND THEMATIC SYNTHESIS.

    Directory of Open Access Journals (Sweden)

    Loredana Scala

    2017-08-01

    Full Text Available The term dependency is increasingly used to explain symptoms resulting from the repetition of a behaviour, or of legalized and socially accepted activities, that does not involve substance intake. These activities, although considered normal habits of daily life, can become real addictions that may affect and disrupt socio-relational and working functioning. Growing evidence suggests that behavioral addictions should be considered similar to drug dependence, given their common symptoms, the high frequency of poly-dependence conditions, and the correlation in risk factors (impulsivity, sensation seeking, early exposure, familiarity and protective factors (parental control, adequate metacognitive skills. The aim of this paper is to describe addiction in its general aspects, highlighting the underlying neurobiological and psychopathological mechanisms.

  11. 75 FR 18138 - Health Care Eligibility Under the Secretarial Designee Program and Related Special Authorities

    Science.gov (United States)

    2010-04-09

    ... Component members not in a present duty status. This authority includes payment for health care services in... 0790-AI52] Health Care Eligibility Under the Secretarial Designee Program and Related Special... establish policies and assign responsibilities for health care eligibility under the Secretarial Designee...

  12. 75 FR 72682 - Health Care Eligibility Under the Secretarial Designee Program and Related Special Authorities

    Science.gov (United States)

    2010-11-26

    ... members not in a present duty status. This authority includes payment for health care services in private... 0790-AI52 Health Care Eligibility Under the Secretarial Designee Program and Related Special... assigns responsibilities for health care eligibility under the Secretarial Designee Program. It also...

  13. Bayou Corne sinkhole : control measurements of State Highway 70 in Assumption Parish, Louisiana, tech summary.

    Science.gov (United States)

    2014-01-01

    The sinkhole located in Assumption Parish, Louisiana, threatens the stability of Highway 70, a state-maintained route. In order to mitigate the potential damaging effects of the sinkhole on this infrastructure, the Louisiana Department of Transpo...

  14. Transfer Efficiency of Bacteria and Viruses from Porous and Nonporous Fomites to Fingers under Different Relative Humidity Conditions

    Science.gov (United States)

    Gerba, Charles P.; Tamimi, Akrum H.; Kitajima, Masaaki; Maxwell, Sheri L.; Rose, Joan B.

    2013-01-01

    Fomites can serve as routes of transmission for both enteric and respiratory pathogens. The present study examined the effect of low and high relative humidity on fomite-to-finger transfer efficiency of five model organisms from several common inanimate surfaces (fomites). Nine fomites representing porous and nonporous surfaces of different compositions were studied. Escherichia coli, Staphylococcus aureus, Bacillus thuringiensis, MS2 coliphage, and poliovirus 1 were placed on fomites in 10-μl drops and allowed to dry for 30 min under low (15% to 32%) or high (40% to 65%) relative humidity. Fomite-to-finger transfers were performed using 1.0 kg/cm2 of pressure for 10 s. Transfer efficiencies were greater under high relative humidity for both porous and nonporous surfaces. Most organisms on average had greater transfer efficiencies under high relative humidity than under low relative humidity. Nonporous surfaces had a greater transfer efficiency (up to 57%) than porous surfaces under low relative humidity, as well as under high relative humidity (nonporous, up to 79.5%; porous, <13.4%). Transfer efficiency also varied with fomite material and organism type. The data generated can be used in quantitative microbial risk assessment models to assess the risk of infection from fomite-transmitted human pathogens and the relative levels of exposure to different types of fomites and microorganisms. PMID:23851098
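
    As the authors note, such transfer efficiencies feed quantitative microbial risk assessment (QMRA) models. A minimal sketch of one exposure step is given below; all input values (surface concentration, contact area) are assumed for illustration, not taken from the study.

    ```python
    def fomite_to_finger_dose(surface_conc_per_cm2, transfer_efficiency,
                              finger_area_cm2=2.0):
        """Expected number of organisms transferred to a fingertip in one
        touch: surface concentration x fraction transferred x contact area.
        One step of a QMRA exposure chain; all inputs are illustrative."""
        return surface_conc_per_cm2 * transfer_efficiency * finger_area_cm2

    # High vs low relative humidity, nonporous surface (hypothetical inputs):
    print(fomite_to_finger_dose(100.0, 0.57))    # low RH bound  -> 114 organisms
    print(fomite_to_finger_dose(100.0, 0.795))   # high RH bound -> 159 organisms
    ```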

  15. Linear-No-Threshold Default Assumptions for Noncancer and Nongenotoxic Cancer Risks: A Mathematical and Biological Critique.

    Science.gov (United States)

    Bogen, Kenneth T

    2016-03-01

    To improve U.S. Environmental Protection Agency (EPA) dose-response (DR) assessments for noncarcinogens and for nonlinear mode of action (MOA) carcinogens, the 2009 NRC Science and Decisions Panel recommended that the adjustment-factor approach traditionally applied to these endpoints should be replaced by a new default assumption that both endpoints have linear-no-threshold (LNT) population-wide DR relationships. The panel claimed this new approach is warranted because population DR is LNT when any new dose adds to a background dose that explains background levels of risk, and/or when there is substantial interindividual heterogeneity in susceptibility in the exposed human population. Mathematically, however, the first claim is either false or effectively meaningless and the second claim is false. Any dose- and population-response relationship that is statistically consistent with an LNT relationship may instead be an additive mixture of just two quasi-threshold DR relationships, which jointly exhibit low-dose S-shaped, quasi-threshold nonlinearity just below the lower end of the observed "linear" dose range. In this case, LNT extrapolation would necessarily overestimate increased risk by increasingly large relative magnitudes at diminishing values of above-background dose. The fact that chemically induced apoptotic cell death occurs by unambiguously nonlinear, quasi-threshold DR mechanisms is apparent from recent data concerning this quintessential toxicity endpoint. The 2009 NRC Science and Decisions Panel claims and recommendations that default LNT assumptions be applied to DR assessment for noncarcinogens and nonlinear MOA carcinogens are therefore not justified either mathematically or biologically. © 2015 The Author. Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
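
    The central mathematical claim, that a sum of two quasi-threshold curves can masquerade as linear over an observed dose range while diverging from the line below it, is easy to reproduce numerically. The Hill-type components and all parameter values below are illustrative assumptions, not the paper's fitted curves.

    ```python
    import numpy as np

    def hill(dose, top, ec50, slope):
        """Quasi-threshold (S-shaped) dose-response component."""
        return top * dose**slope / (ec50**slope + dose**slope)

    # Two steep quasi-threshold components with different potencies:
    dose = np.linspace(0.1, 10.0, 50)
    mixture = (hill(dose, top=0.4, ec50=1.0, slope=3)
               + hill(dose, top=0.6, ec50=6.0, slope=3))

    # Over the observed range the mixture can look close to linear ...
    r = np.corrcoef(dose, mixture)[0, 1]
    print(f"linear correlation over observed range: {r:.3f}")

    # ... while extrapolating the fitted line below the range overestimates
    # the true (quasi-threshold) response by orders of magnitude:
    slope_fit, intercept = np.polyfit(dose, mixture, 1)
    low_dose = 0.01
    true_low = hill(low_dose, 0.4, 1.0, 3) + hill(low_dose, 0.6, 6.0, 3)
    print(slope_fit * low_dose + intercept, true_low)
    ```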

  16. Asbestos-related occupational cancers compensated under the Spanish National Insurance System, 1978-2011.

    Science.gov (United States)

    García-Gómez, Montserrat; Menéndez-Navarro, Alfredo; López, Rosario Castañeda

    2015-01-01

    In 1978, asbestos-related occupational cancers were added to the Spanish list of occupational diseases. However, there are no full accounts of compensated cases since their inclusion. To analyze the cases of asbestos-related cancer recognized as occupational in Spain between 1978 and 2011. Cases were obtained from the Spanish Employment Ministry. Specific incidence rates by year, economic activity, and occupation were obtained. We compared mortality rates of mesothelioma and bronchus and lung cancer mortality in Spain and the European Union. Between 1978 and 2011, 164 asbestos-related occupational cancers were recognized in Spain, with a mean annual rate of 0.08 per 10^5 employees (0.13 in males, 0.002 in females). Under-recognition rates were an estimated 93.6% (males) and 99.7% (females) for pleural mesothelioma and 98.8% (males) and 100% (females) for bronchus and lung cancer. In Europe for the year 2000, asbestos-related occupational cancer rates ranged from 0.04 per 10^5 employees in Spain to 7.32 per 10^5 employees in Norway. These findings provide evidence of gross under-recognition of asbestos-related occupational cancers in Spain. Future work should investigate cases treated in the National Healthcare System to better establish the impact of asbestos on health in Spain.

  17. Comparison of risk-dominant scenario assumptions for several TRU waste facilities in the DOE complex

    International Nuclear Information System (INIS)

    Foppe, T.L.; Marx, D.R.

    1999-01-01

    In order to gain a risk management perspective, the DOE Rocky Flats Field Office (RFFO) initiated a survey of other DOE sites regarding risks from potential accidents associated with transuranic (TRU) storage and/or processing facilities. Recently approved authorization basis documents at the Rocky Flats Environmental Technology Site (RFETS) have been based on the DOE Standard 3011 risk assessment methodology, with three qualitative estimates of frequency of occurrence and quantitative estimates of radiological consequences to the collocated worker and the public binned into three severity levels. Risk Class 1 and 2 events after application of controls to prevent or mitigate the accident are designated as risk-dominant scenarios. Accident Evaluation Guidelines for selection of Technical Safety Requirements (TSRs) are based on the frequency and consequence bin assignments to identify controls that can be credited to reduce risk to Risk Class 3 or 4, or that are credited for Risk Class 1 and 2 scenarios that cannot be further reduced. This methodology resulted in several risk-dominant scenarios for either the collocated worker or the public that warranted consideration of whether additional controls should be implemented. RFFO requested the survey because of these high estimates of risks, which are primarily due to design characteristics of RFETS TRU waste facilities (i.e., Butler-type buildings without a ventilation and filtration system, and a relatively short distance to the Site boundary). Accident analysis methodologies and key assumptions are being compared for the DOE sites responding to the survey. This includes type of accidents that are risk dominant (e.g., drum explosion, material handling breach, fires, natural phenomena, external events, etc.), source term evaluation (e.g., radionuclide material-at-risk, chemical and physical form, damage ratio, airborne release fraction, respirable fraction, leakpath factors), dispersion analysis (e.g., meteorological

  18. Understanding the scale of the single ion free energy: A critical test of the tetra-phenyl arsonium and tetra-phenyl borate assumption

    Science.gov (United States)

    Duignan, Timothy T.; Baer, Marcel D.; Mundy, Christopher J.

    2018-06-01

    The tetra-phenyl arsonium and tetra-phenyl borate (TATB) assumption is a commonly used extra-thermodynamic assumption that allows single ion free energies to be split into cationic and anionic contributions. The assumption is that the values for the TATB salt can be divided equally. This is justified by arguing that these large hydrophobic ions will cause a symmetric response in water. Experimental and classical simulation work has raised potential flaws with this assumption, indicating that hydrogen bonding with the phenyl ring may favor the solvation of the TB- anion. Here, we perform ab initio molecular dynamics simulations of these ions in bulk water demonstrating that there are significant structural differences. We quantify our findings by reproducing the experimentally observed vibrational shift for the TB- anion and confirm that this is associated with hydrogen bonding with the phenyl rings. Finally, we demonstrate that this results in a substantial energetic preference of the water to solvate the anion. Our results suggest that the validity of the TATB assumption, which is still widely used today, should be reconsidered experimentally in order to properly reference single ion solvation free energy, enthalpy, and entropy.

  19. General Relativity and Gravitation

    Science.gov (United States)

    Ehlers, J.; Murdin, P.

    2000-11-01

    The General Theory of Relativity (GR), created by Albert Einstein between 1907 and 1915, is a theory both of gravitation and of spacetime structure. It is based on the assumption that matter, via its energy-momentum, interacts with the metric of spacetime, which is considered (in contrast to Newtonian physics and SPECIAL RELATIVITY) as a dynamical field having degrees of freedom of its own (GRAVI...

  20. Making Explicit the Formalism Underlying Evaluation in Music Information Retrieval Research

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2014-01-01

    We make explicit the formalism underlying evaluation in music information retrieval research. We define a "system," what it means to "analyze" one, and make clear the aims, parts, design, execution, interpretation, assumptions and limitations of its "evaluation." We apply this formalism to discuss the MIREX automatic mood classification task.

  1. Providing security assurance in line with national DBT assumptions

    Science.gov (United States)

    Bajramovic, Edita; Gupta, Deeksha

    2017-01-01

    As worldwide energy requirements are increasing simultaneously with climate change and energy security considerations, States are thinking about building nuclear power to fulfill their electricity requirements and decrease their dependence on carbon fuels. New nuclear power plants (NPPs) must have comprehensive cybersecurity measures integrated into their design, structure, and processes. In the absence of effective cybersecurity measures, the impact of nuclear security incidents can be severe. Some of the current nuclear facilities were not specifically designed and constructed to deal with the new threats, including targeted cyberattacks. Thus, newcomer countries must consider the Design Basis Threat (DBT) as one of the security fundamentals during design of physical and cyber protection systems of nuclear facilities. IAEA NSS 10 describes the DBT as "comprehensive description of the motivation, intentions and capabilities of potential adversaries against which protection systems are designed and evaluated". Nowadays, many threat actors, including hacktivists, insider threat, cyber criminals, state and non-state groups (terrorists) pose security risks to nuclear facilities. Threat assumptions are made on a national level. Consequently, threat assessment closely affects the design structures of nuclear facilities. Some of the recent security incidents e.g. Stuxnet worm (Advanced Persistent Threat) and theft of sensitive information in South Korea Nuclear Power Plant (Insider Threat) have shown that these attacks should be considered as the top threat to nuclear facilities. Therefore, the cybersecurity context is essential for secure and safe use of nuclear power. In addition, States should include multiple DBT scenarios in order to protect various target materials, types of facilities, and adversary objectives. Development of a comprehensive DBT is a precondition for the establishment and further improvement of domestic state nuclear-related regulations in the

  2. Projecting future air pollution-related mortality under a changing climate: progress, uncertainties and research needs.

    Science.gov (United States)

    Madaniyazi, Lina; Guo, Yuming; Yu, Weiwei; Tong, Shilu

    2015-02-01

    Climate change may affect mortality associated with air pollutants, especially fine particulate matter (PM2.5) and ozone (O3). Projection studies of this kind involve complicated modelling approaches with uncertainties. We conducted a systematic review of research and methods for projecting future PM2.5- and O3-related mortality to identify the uncertainties and optimal approaches for handling uncertainty. A literature search was conducted in October 2013, using the electronic databases PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search was limited to peer-reviewed journal articles published in English from January 1980 to September 2013. Fifteen studies fulfilled the inclusion criteria. Most studies reported that an increase of climate change-induced PM2.5 and O3 may result in an increase in mortality. However, little research has been conducted in developing countries with high emissions and dense populations. Additionally, health effects induced by PM2.5 may dominate compared to those caused by O3, but projection studies of PM2.5-related mortality are fewer than those of O3-related mortality. There is considerable variation in the approaches of scenario-based projection studies, which makes it difficult to compare results. Multiple scenarios, models and downscaling methods have been used to reduce uncertainties. However, few studies have discussed what the main source of uncertainties is and which uncertainty could be most effectively reduced. Projecting air pollution-related mortality requires a systematic consideration of assumptions and uncertainties, which will significantly aid policymakers in efforts to manage potential impacts of PM2.5 and O3 on mortality in the context of climate change. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
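
    Most such projections rest on a log-linear concentration-response function; a minimal sketch is shown below, with the baseline mortality, coefficient and concentration change all assumed for illustration rather than taken from any reviewed study.

    ```python
    import math

    def excess_mortality(baseline_deaths, beta_per_ugm3, delta_conc_ugm3):
        """Standard log-linear health impact function used in projection
        studies: excess deaths = y0 * (exp(beta * delta_C) - 1), where beta
        is a concentration-response coefficient from epidemiology."""
        return baseline_deaths * (math.exp(beta_per_ugm3 * delta_conc_ugm3) - 1.0)

    # e.g. a 6% mortality increase per 10 ug/m3 PM2.5 gives beta ~ ln(1.06)/10:
    beta = math.log(1.06) / 10.0
    print(excess_mortality(baseline_deaths=50000, beta_per_ugm3=beta,
                           delta_conc_ugm3=2.0))   # ~585 excess deaths
    ```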

  3. Breakdown of Hydrostatic Assumption in Tidal Channel with Scour Holes

    Directory of Open Access Journals (Sweden)

    Chunyan Li

    2016-10-01

    Full Text Available Hydrostatic condition is a common assumption in tidal and subtidal motions in oceans and estuaries. Theories with this assumption have been largely successful. However, there are no definite criteria separating the hydrostatic from the non-hydrostatic regimes in real applications, because real problems often have multiple scales. With increased refinement of high-resolution numerical models encompassing smaller and smaller spatial scales, the need for non-hydrostatic models is increasing. To evaluate the vertical motion over bathymetric changes in tidal channels and assess the validity of the hydrostatic approximation, we conducted observations using a vessel-based acoustic Doppler current profiler (ADCP). Observations were made along a straight channel 18 times over two scour holes, 25 m deep and separated by 330 m, in and out of an otherwise flat 8 m deep tidal pass leading to Lake Pontchartrain, over a time period of 8 hours covering part of the diurnal tidal cycle. Out of the 18 passages over the scour holes, 11 showed strong upwelling and downwelling which resulted in the breakdown of the hydrostatic condition. The maximum observed vertical velocity was ~0.35 m/s, a high value in a tidal channel, and the estimated vertical acceleration reached a high value of 1.76×10^-2 m/s^2. Analysis demonstrated that the barotropic non-hydrostatic acceleration was dominant. The cause of the non-hydrostatic flow was the flow over steep slopes. This demonstrates that in such a system, bathymetric variation can lead to the breakdown of hydrostatic conditions. Models with hydrostatic restrictions will not be able to correctly capture the dynamics in such a system with significant bathymetric variations, particularly during strong tidal currents.
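
    A back-of-envelope scale check makes the conclusion concrete: compare the reported vertical acceleration against a horizontal advective acceleration scale. The horizontal current speed U below is an assumed typical value, not a number from the study.

    ```python
    # Scale analysis for the hydrostatic assumption, using numbers from the
    # abstract plus an assumed horizontal tidal current scale U (hypothetical).
    dw_dt = 1.76e-2     # estimated vertical acceleration, m/s^2 (abstract)
    U = 1.0             # assumed horizontal tidal current scale, m/s
    L = 330.0           # horizontal scale: separation of the scour holes, m

    # Horizontal advective acceleration scale the vertical term is compared to:
    horiz_accel = U**2 / L
    print(f"vertical/horizontal acceleration ratio: {dw_dt / horiz_accel:.1f}")
    # A ratio of order one or more means the neglected vertical acceleration
    # is dynamically significant, i.e. the hydrostatic assumption breaks down.
    ```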

  4. Entanglement dynamics of two interacting qubits under the influence ...

    Indian Academy of Sciences (India)

    2016-06-21

    Jun 21, 2016 ... possibilities for storage and processing of information [1]. In contrast to the ... can be classified into processes with memory (non-Markovian) and ... a general class of initial quantum states under the assumption that the ... R in descending order. The matrix R is defined as R = ρ(σ_1^y ⊗ σ_2^y) ρ*(σ_1^y ⊗ σ_2^y).

  5. Testing a key assumption in animal communication: between-individual variation in female visual systems alters perception of male signals

    Directory of Open Access Journals (Sweden)

    Kelly L. Ronald

    2017-12-01

    Full Text Available Variation in male signal production has been extensively studied because of its relevance to animal communication and sexual selection. Although we now know much about the mechanisms that can lead to variation between males in the properties of their signals, there is still a general assumption that there is little variation in terms of how females process these male signals. Variation between females in signal processing may lead to variation between females in how they rank individual males, meaning that one single signal may not be universally attractive to all females. We tested this assumption in a group of female wild-caught brown-headed cowbirds (Molothrus ater), a species that uses a male visual signal (e.g. a wingspread display) to make its mate-choice decisions. We found that females varied in two key parameters of their visual sensory systems related to chromatic and achromatic vision: cone densities (both total and proportions) and cone oil droplet absorbance. Using visual chromatic and achromatic contrast modeling, we then found that this between-individual variation in visual physiology leads to significant between-individual differences in how females perceive chromatic and achromatic male signals. These differences may lead to variation in female preferences for male visual signals, which would provide a potential mechanism for explaining individual differences in mate-choice behavior.

  6. Testing a key assumption in animal communication: between-individual variation in female visual systems alters perception of male signals.

    Science.gov (United States)

    Ronald, Kelly L; Ensminger, Amanda L; Shawkey, Matthew D; Lucas, Jeffrey R; Fernández-Juricic, Esteban

    2017-12-15

    Variation in male signal production has been extensively studied because of its relevance to animal communication and sexual selection. Although we now know much about the mechanisms that can lead to variation between males in the properties of their signals, there is still a general assumption that there is little variation in terms of how females process these male signals. Variation between females in signal processing may lead to variation between females in how they rank individual males, meaning that one single signal may not be universally attractive to all females. We tested this assumption in a group of female wild-caught brown-headed cowbirds ( Molothrus ater ), a species that uses a male visual signal (e.g. a wingspread display) to make its mate-choice decisions. We found that females varied in two key parameters of their visual sensory systems related to chromatic and achromatic vision: cone densities (both total and proportions) and cone oil droplet absorbance. Using visual chromatic and achromatic contrast modeling, we then found that this between-individual variation in visual physiology leads to significant between-individual differences in how females perceive chromatic and achromatic male signals. These differences may lead to variation in female preferences for male visual signals, which would provide a potential mechanism for explaining individual differences in mate-choice behavior. © 2017. Published by The Company of Biologists Ltd.

  7. Relative fault and efficient negligence: comparative negligence explained

    NARCIS (Netherlands)

    Dari-Mattiacci, G.; Hendriks, E.S.

    2010-01-01

    Comparative negligence poses a persisting puzzle in law & economics. Under standard assumptions, its performance is identical to other negligence rules, while its implementation is slightly more complex. If so, why is it the most common rule? In this paper, we advance a novel argument: comparative

  8. Migration in Asia-Europe Relations

    DEFF Research Database (Denmark)

    Juego, Bonn

    2010-01-01

    There is a remarkable difference between viewing migration as a 'social integration' issue, on the one hand, and viewing it as a 'social relation', on the other. The idea of ‘social integration’ rests on unrealistic assumptions: that migration is a one-way process, that societies and human relations are static, and that migrants are mechanical. Policies founded on unrealistic assumptions are most likely to generate tensions, conflicts, and contradictions. For a migration process to succeed in forging social harmony and development, it is therefore of decisive importance to regard migration as a ‘social relation’. This is simply because successful migration has to be a harmonious synergy between the migrants (and the sending countries they come from) and the receiving society (and its people). As indicated, migrants enter into the receiving society not merely as a passive commodity...

  9. Why we still don't understand the social aspects of wind power: A critique of key assumptions within the literature

    International Nuclear Information System (INIS)

    Aitken, Mhairi

    2010-01-01

    The literature on public attitudes to wind power is underpinned by key assumptions which limit its scope and restrict the findings it can present. Five key assumptions are that: (1) The majority of the public supports wind power. (2) Opposition to wind power is therefore deviant. (3) Opponents are ignorant or misinformed. (4) The reason for understanding opposition is to overcome it. (5) Trust is key. The paper calls for critical reflection on each of these assumptions. It should not be assumed that opposition to wind power is deviant/illegitimate. Opposition cannot be dismissed as ignorant or misinformed instead it must be acknowledged that objectors are often very knowledgeable. Public attitudes and responses to wind power should not be examined in order to mitigate potential future opposition, but rather in order to understand the social context of renewable energy. Trust is identified as a key issue, however greater trust must be placed in members of the public and in their knowledge. In sum, the literature must abandon the assumption that it knows who is 'right' and instead must engage with the possibility that objectors to wind power are not always 'wrong'.

  10. Stress-reducing preventive maintenance model for a unit under stressful environment

    International Nuclear Information System (INIS)

    Park, J.H.; Chang, Woojin; Lie, C.H.

    2012-01-01

    We develop a preventive maintenance (PM) model for a unit operated in a stressful environment. The PM model in this paper consists of a failure rate model and two cost models used to determine the optimal PM schedule, which minimizes the cost rate. The underlying assumptions are that the stressful environment accelerates the failure of the unit and that periodic maintenance reduces the external stress. The failure rate model captures the maintenance effect of PM using improvement and stress factors. The cost models cover two failure-recognition cases: immediate failure recognition and periodic failure detection. The optimal PM schedule is obtained by considering the trade-off between the related costs and the lifetime of the unit in our model setting. The practical usage of the proposed model is demonstrated through a numerical example.
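
    A minimal sketch of the cost-rate trade-off such a model optimizes, assuming a Weibull-type failure intensity and illustrative costs rather than the paper's specific failure rate model with improvement and stress factors:

    ```python
    import numpy as np

    def cost_rate(T, cp=1.0, cf=10.0, beta=2.5, eta=100.0):
        """Long-run cost per unit time for periodic PM at interval T under a
        power-law (Weibull-type) failure intensity, minimal-repair style.

        cp: cost of one preventive maintenance action
        cf: cost of one failure
        Expected failures per cycle ~ (T/eta)**beta.
        All parameter values are illustrative assumptions."""
        expected_failures = (T / eta) ** beta
        return (cp + cf * expected_failures) / T

    # Grid search for the PM interval minimizing the cost rate:
    grid = np.linspace(5.0, 300.0, 1000)
    rates = [cost_rate(T) for T in grid]
    T_opt = grid[int(np.argmin(rates))]
    print(f"optimal PM interval ~{T_opt:.0f} (time units)")  # ~34 here
    ```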

  11. Monitoring long-lasting insecticidal net (LLIN) durability to validate net serviceable life assumptions, in Rwanda

    NARCIS (Netherlands)

    Hakizimana, E.; Cyubahiro, B.; Rukundo, A.; Kabayiza, A.; Mutabazi, A.; Beach, R.; Patel, R.; Tongren, J.E.; Karema, C.

    2014-01-01

    Background To validate assumptions about the length of the distribution–replacement cycle for long-lasting insecticidal nets (LLINs) in Rwanda, the Malaria and other Parasitic Diseases Division, Rwanda Ministry of Health, used World Health Organization methods to independently confirm the three-year

  12. Estimation of kinematic parameters in CALIFA galaxies: no-assumption on internal dynamics

    Science.gov (United States)

    García-Lorenzo, B.; Barrera-Ballesteros, J.; CALIFA Team

    2016-06-01

    We propose a simple approach to homogeneously estimate the kinematic parameters of a broad variety of galaxies (ellipticals, spirals, irregulars or interacting systems). This methodology avoids the use of any kinematic model or any assumption about internal dynamics. This simple but novel approach allows us to determine the frequency of kinematic distortions, the systemic velocity, the kinematic center, and the kinematic position angles, all measured directly from the two-dimensional distributions of radial velocities. We test our analysis tools using the CALIFA Survey.

  13. Harnack inequality for harmonic functions relative to a nonlinear p-homogeneous Riemannian Dirichlet form

    Directory of Open Access Journals (Sweden)

    Marco Biroli

    2007-12-01

    Full Text Available We consider a measure-valued map α(u) defined on D, where D is a subspace of L^p(X,m), with X a locally compact Hausdorff topological space equipped with a distance under which it is a space of homogeneous type. Under assumptions of convexity, Gateaux differentiability and other assumptions on α which generalize the properties of the energy measure of a Dirichlet form, we prove the Hölder continuity of the local solution u of the problem ∫_X µ(u,v)(dx) = 0 for each v belonging to a suitable space of test functions, where µ(u,v) = ⟨α'(u), v⟩.

  14. FIGHTING THE CLASSICAL CRIME-SCENE ASSUMPTIONS. CRITICAL ASPECTS IN ESTABLISHING THE CRIME-SCENE PERIMETER IN COMPUTER-BASED EVIDENCE CASES

    Directory of Open Access Journals (Sweden)

    Cristina DRIGĂ

    2016-05-01

    Full Text Available Physical-world forensic investigation has the luxury of being tied to the sciences governing the investigated space, hence some assumptions can be made with some degree of certainty when investigating a crime. Cyberspace, on the other hand, has a dual nature, comprising both a physical layer susceptible to scientific analysis and a virtual layer governed entirely by the conventions established between the various actors involved at a certain moment in time, which define the actual digital landscape and constitute the layer where the facts relevant from the legal point of view occur. This distinct nature renders unusable many of the assumptions that legal professionals and courts of law are used to operating with. The article intends to identify the most important features of cyberspace having immediate legal consequences, with the purpose of establishing new and safe assumptions from the legal professional's perspective when cross-examining facts that occurred in cyberspace.

  15. Relation of Cloud Occurrence Frequency, Overlap, and Effective Thickness Derived from CALIPSO and CloudSat Merged Cloud Vertical Profiles

    Science.gov (United States)

    Kato, Seiji; Sun-Mack, Sunny; Miller, Walter F.; Rose, Fred G.; Chen, Yan; Minnis, Patrick; Wielicki, Bruce A.

    2009-01-01

    A cloud frequency of occurrence matrix is generated using merged cloud vertical profiles derived from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) and Cloud Profiling Radar (CPR). The matrix contains vertical profiles of cloud occurrence frequency as a function of the uppermost cloud top. It is shown that the cloud fraction and uppermost cloud top vertical profiles can be related by a set of equations when the correlation distance of cloud occurrence, which is interpreted as an effective cloud thickness, is introduced. The underlying assumption in establishing the above relation is that cloud overlap approaches random overlap with increasing distance separating cloud layers and that the probability of deviating from random overlap decreases exponentially with distance. One month of CALIPSO and CloudSat data support these assumptions. However, the correlation distance sometimes becomes large, which might be an indication of precipitation. The cloud correlation distance is equivalent to the de-correlation distance introduced by Hogan and Illingworth [2000] when the cloud fractions of both layers in a two-cloud-layer system are the same.
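
    The exponential-decay overlap assumption stated above can be written as a blend of maximum and random overlap. The following is a minimal sketch of that relation with made-up cloud fractions; the weighting follows the de-correlation form of Hogan and Illingworth [2000] cited in the abstract.

        import numpy as np

        def combined_cover(c1, c2, dz, L):
            """Total cover of two cloud layers separated by dz, blending maximum
            and random overlap with a weight that decays exponentially with
            separation; L is the correlation (de-correlation) distance."""
            alpha = np.exp(-dz / L)
            c_max = max(c1, c2)            # maximum overlap limit
            c_rand = c1 + c2 - c1 * c2     # random overlap limit
            return alpha * c_max + (1.0 - alpha) * c_rand

        print(combined_cover(0.4, 0.3, dz=2.0, L=1.5))  # layers 2 km apart (toy)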

  16. Tests of data quality, scaling assumptions, and reliability of the Danish SF-36

    DEFF Research Database (Denmark)

    Bjorner, J B; Damsgaard, M T; Watt, T

    1998-01-01

    We used general population data (n = 4084) to examine data completeness, response consistency, tests of scaling assumptions, and reliability of the Danish SF-36 Health Survey. We compared traditional multitrait scaling analyses to analyses using polychoric correlations and Spearman correlations...... with chronic diseases excepted). Concerning correlation methods, we found interesting differences indicating advantages of using methods that do not assume a normal distribution of answers as an addition to traditional methods....

  17. Factor structure and concurrent validity of the world assumptions scale.

    Science.gov (United States)

    Elklit, Ask; Shevlin, Mark; Solomon, Zahava; Dekel, Rachel

    2007-06-01

    The factor structure of the World Assumptions Scale (WAS) was assessed by means of confirmatory factor analysis. The sample was comprised of 1,710 participants who had been exposed to trauma that resulted in whiplash. Four alternative models were specified and estimated using LISREL 8.72. A correlated 8-factor solution was the best explanation of the sample data. The estimates of reliability of eight subscales of the WAS ranged from .48 to .82. Scores from five subscales correlated significantly with trauma severity as measured by the Harvard Trauma Questionnaire, although the magnitude of the correlations was low to modest, ranging from .08 to -.43. It is suggested that the WAS has adequate psychometric properties for use in both clinical and research settings.

  18. Population genetics inference for longitudinally-sampled mutants under strong selection.

    Science.gov (United States)

    Lacerda, Miguel; Seoighe, Cathal

    2014-11-01

    Longitudinal allele frequency data are becoming increasingly prevalent. Such samples permit statistical inference of the population genetics parameters that influence the fate of mutant variants. To infer these parameters by maximum likelihood, the mutant frequency is often assumed to evolve according to the Wright-Fisher model. For computational reasons, this discrete model is commonly approximated by a diffusion process that requires the assumption that the forces of natural selection and mutation are weak. This assumption is not always appropriate. For example, mutations that impart drug resistance in pathogens may evolve under strong selective pressure. Here, we present an alternative approximation to the mutant-frequency distribution that does not make any assumptions about the magnitude of selection or mutation and is much more computationally efficient than the standard diffusion approximation. Simulation studies are used to compare the performance of our method to that of the Wright-Fisher and Gaussian diffusion approximations. For large populations, our method is found to provide a much better approximation to the mutant-frequency distribution when selection is strong, while all three methods perform comparably when selection is weak. Importantly, maximum-likelihood estimates of the selection coefficient are severely attenuated when selection is strong under the two diffusion models, but not when our method is used. This is further demonstrated with an application to mutant-frequency data from an experimental study of bacteriophage evolution. We therefore recommend our method for estimating the selection coefficient when the effective population size is too large to utilize the discrete Wright-Fisher model. Copyright © 2014 by the Genetics Society of America.
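
    For readers unfamiliar with the baseline model, the following is a minimal sketch of the discrete Wright-Fisher process with selection that the diffusion methods discussed above approximate; the population size, selection coefficient and initial frequency are arbitrary illustrative values.

        import numpy as np

        rng = np.random.default_rng(0)

        def wright_fisher(N=1000, p0=0.05, s=0.2, generations=50):
            """Mutant-allele frequency trajectory under the discrete Wright-Fisher
            model: a deterministic selection step followed by binomial drift."""
            p, traj = p0, [p0]
            for _ in range(generations):
                p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))  # selection step
                p = rng.binomial(N, p_sel) / N                  # drift step
                traj.append(p)
            return traj

        print(wright_fisher()[-5:])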

  19. Bayou Corne Sinkhole: Control Measurements of State Highway 70 in Assumption Parish, Louisiana : Research Project Capsule

    Science.gov (United States)

    2012-09-01

    The sinkhole located in northern Assumption Parish, Louisiana, threatens the stability of Highway 70, a state-maintained route. In order to monitor and mitigate potential damage effects on this infrastructure, the Louisiana Department of Trans...

  20. 77 FR 74353 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2012-12-14

    ... regulation will be 0.75 percent for the period during which a benefit is in pay status and 4.00 percent... PENSION BENEFIT GUARANTY CORPORATION 29 CFR Part 4022 Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits AGENCY: Pension Benefit Guaranty Corporation...

  1. On the impact of the ideal gas assumption to high-pressure combustion phenomena in engines

    NARCIS (Netherlands)

    Evlampiev, A.V.; Somers, L.M.T.; Baert, R.S.G.; Goey, de L.P.H.

    2008-01-01

    The effect of the ideal gas law assumption on auto-ignition and NOx-formation in a rapid compression machine is studied. For both processes the simulations are compared to a reference simulation using a Redlich-Kwong equation-of-state based on the critical properties of all constituents.

  2. Accumulated dose calculations in Indian PHWRs under DBA

    International Nuclear Information System (INIS)

    Nesaraj, David; Pradhan, A.S.; Bhardwaj, S.A.

    1996-01-01

    Accumulated gamma dose inside the reactor building due to release of fission products from the equilibrium core of an Indian PHWR under accident conditions has been assessed. The assessment has been done for the radiation tolerance limit of the critical equipment inside the reactor building. The basic source data has been generated using the computer code ORIGEN2, written and developed by Oak Ridge National Laboratory, USA (ORNL). This paper discusses the details of the calculations done on the basis of certain assumptions which are mentioned at relevant places. The results indicate the accumulated gamma dose at a few typical locations inside the reactor building under accident conditions. (author). 1 ref., 1 tab., 1 fig

  3. The European Water Framework Directive: How Ecological Assumptions Frame Technical and Social Change

    Directory of Open Access Journals (Sweden)

    Patrick Steyaert

    2007-06-01

    Full Text Available The European Water Framework Directive (WFD) is built upon significant cognitive developments in the field of ecological science but also encourages active involvement of all interested parties in its implementation. The coexistence in the same policy text of both substantive and procedural approaches to policy development stimulated this research, as did our concerns about the implications of substantive ecological visions within the WFD policy for promoting, or not, social learning processes through participatory designs. We have used a qualitative analysis of the WFD text which shows that the ecological dimension of the WFD dedicates its quasi-exclusive attention to a particular current of thought in ecosystems science focusing on ecosystem status and stability and considering human activities as disturbance factors. This particular worldview is juxtaposed within the WFD with a more utilitarian one that gives rise to many policy exemptions without changing the general underlying ecological model. We discuss these policy statements in the light of the tension between substantive and procedural policy developments. We argue that the dominant substantive approach of the WFD, comprising particular ecological assumptions built upon "compositionalism," seems contradictory to its espoused intention of involving the public. We discuss that current of thought in regard to more functionalist thinking and adaptive management, which offer greater opportunities for social learning, i.e., placing a set of interdependent stakeholders in an intersubjective position in which they operate a "social construction" of water problems through the co-production of knowledge.

  4. 3 CFR - Delegation of Certain Functions Under Sections 603-604 and 699 of the Foreign Relations...

    Science.gov (United States)

    2010-01-01

    ...-604 and 699 of the Foreign Relations Authorization Act, Fiscal Year 2003 (Public Law 107-228... Functions Under Sections 603-604 and 699 of the Foreign Relations Authorization Act, Fiscal Year 2003...-604 and 699 of the Foreign Relations Authorization Act, Fiscal Year 2003 (Public Law 107-228). You are...

  5. Material law for concrete under multiaxial stress

    International Nuclear Information System (INIS)

    Geistefeldt, H.

    1977-01-01

    In this paper a general triaxial set of finite strain-stress relations is derived, which can include in a step-by-step way nearly all known factors and curves of material response. The finite constitutive equations representing the behavior of concrete are related to the main strain directions. The elastic part, the functions for uniaxial behavior, those for biaxial response and finally the relation parts, nonzero only in a triaxial stress state, can be set separately by suitable functions which have been adjusted to the material response of actual concrete known from special tests. In nonlinear incremental analysis a potential is usually assumed in incremental material behavior to keep incremental stiffness matrices symmetric. If the proposed generalized set of constitutive equations is restricted to special types of functions, the resulting tangent stiffness is symmetric. Special functions for the various parts are presented, the tangent stiffness of which can easily be derived explicitly by partial differentiation of the related strain-stress relations. Thus the application of the proposed constitutive equations in incremental nonlinear analysis is very effective. The free coefficients of one general set of equations are adjusted step by step to the results of Kupfer's biaxial tests under short-time loading. With a new and very short biaxial failure criterion for concrete, which has been stated and compared with test results, the analytic description of the biaxial behavior of Kupfer's concrete is completed. With some additional assumptions the proposed failure criteria and the strain-stress equations for concrete are extended to the biaxial response of uncracked orthogonally reinforced concrete

  6. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klaauw, B.; Koning, R.H.

    2003-01-01

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  7. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klauw, B.; Koning, R.H.

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  8. Homotopy Method for a General Multiobjective Programming Problem under Generalized Quasinormal Cone Condition

    Directory of Open Access Journals (Sweden)

    X. Zhao

    2012-01-01

    Full Text Available A combined interior point homotopy continuation method is proposed for solving a general multiobjective programming problem. We prove the existence and convergence of a smooth homotopy path from almost any initial interior point to a solution of the KKT system under some basic assumptions.

  9. The Impact of Modeling Assumptions in Galactic Chemical Evolution Models

    Science.gov (United States)

    Côté, Benoit; O'Shea, Brian W.; Ritter, Christian; Herwig, Falk; Venn, Kim A.

    2017-02-01

    We use the OMEGA galactic chemical evolution code to investigate how the assumptions used for the treatment of galactic inflows and outflows impact numerical predictions. The goal is to determine how our capacity to reproduce the chemical evolution trends of a galaxy is affected by the choice of implementation used to include those physical processes. In pursuit of this goal, we experiment with three different prescriptions for galactic inflows and outflows and use OMEGA within a Markov Chain Monte Carlo code to recover the set of input parameters that best reproduces the chemical evolution of nine elements in the dwarf spheroidal galaxy Sculptor. This provides a consistent framework for comparing the best-fit solutions generated by our different models. Despite their different degrees of intended physical realism, we found that all three prescriptions can reproduce in an almost identical way the stellar abundance trends observed in Sculptor. This result supports the similar conclusions originally claimed by Romano & Starkenburg for Sculptor. While the three models have the same capacity to fit the data, the best values recovered for the parameters controlling the number of SNe Ia and the strength of galactic outflows are substantially different and in fact mutually exclusive from one model to another. For the purpose of understanding how a galaxy evolves, we conclude that only reproducing the evolution of a limited number of elements is insufficient and can lead to misleading conclusions. More elements or additional constraints such as the Galaxy’s star-formation efficiency and the gas fraction are needed in order to break the degeneracy between the different modeling assumptions. Our results show that the successes and failures of chemical evolution models are predominantly driven by the input stellar yields, rather than by the complexity of the Galaxy model itself. Simple models such as OMEGA are therefore sufficient to test and validate stellar yields. OMEGA

  10. Matrix Diffusion for Performance Assessment - Experimental Evidence, Modelling Assumptions and Open Issues

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, A

    2004-07-01

    In this report a comprehensive overview of the matrix diffusion of solutes in fractured crystalline rocks is presented. Some examples from observations in crystalline bedrock are used to illustrate that matrix diffusion indeed acts on various length scales. Fickian diffusion is discussed in detail, followed by some considerations on rock porosity. Because the dual-porosity medium model is a very common and versatile method for describing solute transport in fractured porous media, the transport equations and the fundamental assumptions, approximations and simplifications are discussed in detail. There is a variety of geometrical aspects, processes and events which could influence matrix diffusion. The most important of these, e.g., the effect of the flow-wetted fracture surface, channelling and the limited extent of the porous rock available for matrix diffusion, are addressed. In a further section open issues and unresolved problems related to matrix diffusion are mentioned. Since matrix diffusion is one of the key retarding processes in geosphere transport of dissolved radionuclide species, matrix diffusion was consequently taken into account in past performance assessments of radioactive waste repositories in crystalline host rocks. Some issues regarding matrix diffusion are site-specific while others are independent of the specific situation of a planned repository for radioactive wastes. Eight different performance assessments from Finland, Sweden and Switzerland were considered with the aim of finding out how matrix diffusion was addressed, and whether a consistent picture emerges regarding the varying methodology of the different radioactive waste organisations. In the final section of the report some conclusions are drawn and an outlook is given. An extensive bibliography provides the reader with the key papers and reports related to matrix diffusion. (author)

  11. Experimental assessment of unvalidated assumptions in classical plasticity theory.

    Energy Technology Data Exchange (ETDEWEB)

    Brannon, Rebecca Moss (University of Utah, Salt Lake City, UT); Burghardt, Jeffrey A. (University of Utah, Salt Lake City, UT); Bauer, Stephen J.; Bronowski, David R.

    2009-01-01

    This report investigates the validity of several key assumptions in classical plasticity theory regarding material response to changes in the loading direction. Three metals, two rock types, and one ceramic were subjected to non-standard loading directions, and the resulting strain response increments were displayed in Gudehus diagrams to illustrate the approximation error of classical plasticity theories. A rigorous mathematical framework for fitting classical theories to the data, thus quantifying the error, is provided. Further data analysis techniques are presented that allow testing for the effect of changes in loading direction without having to use a new sample and for inferring the yield normal and flow directions without having to measure the yield surface. Though the data are inconclusive, there is indication that classical, incrementally linear, plasticity theory may be inadequate over a certain range of loading directions. This range of loading directions also coincides with loading directions that are known to produce a physically inadmissible instability for any nonassociative plasticity model.

  12. How do people learn from negative evidence? Non-monotonic generalizations and sampling assumptions in inductive reasoning.

    Science.gov (United States)

    Voorspoels, Wouter; Navarro, Daniel J; Perfors, Amy; Ransom, Keith; Storms, Gert

    2015-09-01

    A robust finding in category-based induction tasks is for positive observations to raise the willingness to generalize to other categories while negative observations lower the willingness to generalize. This pattern is referred to as monotonic generalization. Across three experiments we find systematic non-monotonicity effects, in which negative observations raise the willingness to generalize. Experiments 1 and 2 show that this effect emerges in hierarchically structured domains when a negative observation from a different category is added to a positive observation. They also demonstrate that this is related to a specific kind of shift in the reasoner's hypothesis space. Experiment 3 shows that the effect depends on the assumptions that the reasoner makes about how inductive arguments are constructed. Non-monotonic reasoning occurs when people believe the facts were put together by a helpful communicator, but monotonicity is restored when they believe the observations were sampled randomly from the environment. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Acting under interference by other agents with unknown goals

    DEFF Research Database (Denmark)

    Sønderberg-Madsen, Nicolaj; Jensen, Finn V.

    2008-01-01

    We consider the situation where two agents each try to solve their own task in a common environment. A general framework for representing that kind of scenario is presented. The framework is used to model the analysis depth of the opponent agent and to determine an optimal policy under various assumptions on the analysis depth of the opponent. The framework is applied on a strategic game, Ausgetrickst, and experiments are reported.

  14. The assumption of linearity in soil and plant concentration ratios: an experimental evaluation

    International Nuclear Information System (INIS)

    Sheppard, S.C.; Evenden, W.G.

    1988-01-01

    We have evaluated one of the main assumptions in the use of concentration ratios to describe the transfer of elements in the environment. The ratios examined in detail were the 'concentration ratio' (CR) of leaf to soil and the 'partition coefficient' (Kd) of solid- to liquid-phase concentrations in soil. Use of these ratios implies a linear relationship between the concentrations. Soil was experimentally contaminated to evaluate this linearity over more than a 1000-fold range in concentration. A secondary objective was to determine CR and Kd values in a long-term (2 y) outdoor study using a peat soil and blueberries. The elements I, Se, Cs, Pb and U were chosen as environmentally important elements. The results indicated that leaf and leachate concentrations were not consistently linearly related to the total soil concentrations for each of the elements. The modelling difficulties implied by these concentration dependencies can be partially offset by including the strong negative correlation between CR and Kd. The error introduced by using a mean value of the ratios for Se or U resulted in up to a ten-fold increase in variability for CR and a three-fold increase for Kd. (author)
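
    Both ratios are simple quotients, as the toy numbers below illustrate (the values are invented, not taken from the study); the linearity assumption under test amounts to treating these quotients as constants over the whole concentration range.

        # Illustrative only: forming CR and Kd from assumed measurements.
        c_leaf, c_soil = 0.8, 40.0       # element concentration, mg/kg dry weight
        c_solid, c_liquid = 38.0, 2.0    # soil solid (mg/kg) vs. leachate (mg/L)

        CR = c_leaf / c_soil             # leaf-to-soil concentration ratio
        Kd = c_solid / c_liquid          # solid-to-liquid partition coefficient
        print(f"CR = {CR:.3f}, Kd = {Kd:.1f} L/kg")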

  15. Letters: Milk and Mortality : Study used wrong assumption about galactose content of fermented dairy products

    NARCIS (Netherlands)

    Hettinga, K.A.

    2014-01-01

    Michaëlsson and colleagues’ proposed mechanism for the effect of milk intake on the risk of mortality and fractures is based on the assumption that fermented dairy products (which had the opposite effects to those of non-fermented milk) are free of galactose [1]. For most fermented dairy products,

  16. Enhancing the efficiency of constrained dual-hop variable-gain AF relaying under nakagami-m fading

    KAUST Repository

    Zafar, Ammar; Radaydeh, Redha Mahmoud; Chen, Yunfei; Alouini, Mohamed-Slim

    2014-01-01

    -to-end signal-to-noise-ratio (SNR) and the overall power consumed is minimized while maintaining this constraint. This problem is considered under two different assumptions of the available channel state information (CSI) at the relays, namely full CSI

  17. Using Instrument Simulators and a Satellite Database to Evaluate Microphysical Assumptions in High-Resolution Simulations of Hurricane Rita

    Science.gov (United States)

    Hristova-Veleva, S. M.; Chao, Y.; Chau, A. H.; Haddad, Z. S.; Knosp, B.; Lambrigtsen, B.; Li, P.; Martin, J. M.; Poulsen, W. L.; Rodriguez, E.; Stiles, B. W.; Turk, J.; Vu, Q.

    2009-12-01

    Improving forecasting of hurricane intensity remains a significant challenge for the research and operational communities. Many factors determine a tropical cyclone’s intensity. Ultimately, though, intensity is dependent on the magnitude and distribution of the latent heating that accompanies the hydrometeor production during the convective process. Hence, the microphysical processes and their representation in hurricane models are of crucial importance for accurately simulating hurricane intensity and evolution. The accurate modeling of the microphysical processes becomes increasingly important when running high-resolution models that should properly reflect the convective processes in the hurricane eyewall. There are many microphysical parameterizations available today. However, evaluating their performance and selecting the most representative ones remains a challenge. Several field campaigns were focused on collecting in situ microphysical observations to help distinguish between different modeling approaches and improve on the most promising ones. However, these point measurements cannot adequately reflect the space and time correlations characteristic of the convective processes. An alternative approach to evaluating microphysical assumptions is to use multi-parameter remote sensing observations of the 3D storm structure and evolution. In doing so, we could compare modeled to retrieved geophysical parameters. The satellite retrievals, however, carry their own uncertainty. To increase the fidelity of the microphysical evaluation results, we can use instrument simulators to produce satellite observables from the model fields and compare to the observed. This presentation will illustrate how instrument simulators can be used to discriminate between different microphysical assumptions. We will compare and contrast the members of high-resolution ensemble WRF model simulations of Hurricane Rita (2005), each member reflecting different microphysical assumptions

  18. Optimal Investment Under Transaction Costs: A Threshold Rebalanced Portfolio Approach

    Science.gov (United States)

    Tunc, Sait; Donmez, Mehmet Ali; Kozat, Suleyman Serdar

    2013-06-01

    We study optimal investment in a financial market having a finite number of assets from a signal processing perspective. We investigate how an investor should distribute capital over these assets and when he should reallocate the distribution of the funds over these assets to maximize the cumulative wealth over any investment period. In particular, we introduce a portfolio selection algorithm that maximizes the expected cumulative wealth in i.i.d. two-asset discrete-time markets where the market levies proportional transaction costs in buying and selling stocks. We achieve this using "threshold rebalanced portfolios", where trading occurs only if the portfolio breaches certain thresholds. Under the assumption that the relative price sequences have log-normal distribution from the Black-Scholes model, we evaluate the expected wealth under proportional transaction costs and find the threshold rebalanced portfolio that achieves the maximal expected cumulative wealth over any investment period. Our derivations can be readily extended to markets having more than two stocks, where these extensions are pointed out in the paper. As predicted from our derivations, we significantly improve the achieved wealth over portfolio selection algorithms from the literature on historical data sets.
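
    A minimal sketch of the threshold idea follows: trade back to the target allocation only when the stock fraction of wealth leaves a band around it, paying a proportional cost on the traded amount. The band width, cost level and the simulated log-normal price path are all assumptions for illustration and do not reproduce the paper's optimality derivations.

        import numpy as np

        rng = np.random.default_rng(1)

        def threshold_rebalance(prices, target=0.5, eps=0.1, cost=0.01):
            """Final wealth of a two-asset (cash + stock) threshold portfolio
            starting from unit wealth split at the target fraction."""
            cash, stock = 1.0 - target, target / prices[0]
            for p in prices[1:]:
                wealth = cash + stock * p
                frac = stock * p / wealth
                if abs(frac - target) > eps:        # threshold breached: rebalance
                    desired = target * wealth / p   # shares after rebalancing
                    trade_value = abs(desired - stock) * p
                    cash = wealth - desired * p - cost * trade_value
                    stock = desired
            return cash + stock * prices[-1]

        prices = np.exp(np.cumsum(rng.normal(5e-4, 0.02, 1000)))  # toy log-normal path
        print(threshold_rebalance(prices))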

  19. Individual Change and the Timing and Onset of Important Life Events: Methods, Models, and Assumptions

    Science.gov (United States)

    Grimm, Kevin; Marcoulides, Katerina

    2016-01-01

    Researchers are often interested in studying how the timing of a specific event affects concurrent and future development. When faced with such research questions there are multiple statistical models to consider and those models are the focus of this paper as well as their theoretical underpinnings and assumptions regarding the nature of the…

  20. A study of composite beam with shape memory alloy arbitrarily embedded under thermal and mechanical loadings

    International Nuclear Information System (INIS)

    Zhang Yin; Zhao Yapu

    2007-01-01

    The constitutive relations and kinematic assumptions for a composite beam with a shape memory alloy (SMA) layer arbitrarily embedded are discussed, and the results obtained under the different kinematic assumptions are compared. Because the approach of mechanics of materials is used to study the composite beam with the SMA layer embedded, the kinematic assumption is vital. In this paper, we systematically study the influence of the kinematic assumptions on the composite beam deflection and vibration characteristics. Based on the different kinematic assumptions, the equations of equilibrium/motion differ. Here three widely used kinematic assumptions are presented and the equations of equilibrium/motion are derived accordingly. As the three kinematic assumptions change from the simple to the complex one, the governing equations evolve from linear to nonlinear ones. For the nonlinear equations of equilibrium, the numerical solution is obtained by using the Galerkin discretization method and the Newton-Raphson iteration method. An analysis of the numerical difficulty of using the Galerkin method for post-buckling analysis is presented. For the post-buckling analysis, the finite element method is applied to avoid the difficulty due to the singularity that occurs in the Galerkin method. The natural frequencies of the composite beam with the nonlinear governing equation, obtained by directly linearizing the equations and by locally linearizing the equations around each equilibrium, are compared. The influences of the SMA layer thickness and its shift from the neutral axis on the deflection, buckling and post-buckling are also investigated. This paper presents a very general way to treat the thermo-mechanical properties of a composite beam with SMA arbitrarily embedded. The governing equations for each kinematic assumption consist of a third-order and a fourth-order differential equation with a total of seven boundary conditions. Some previous studies on the SMA layer either ignore the thermal constraint
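
    The Galerkin-plus-Newton-Raphson strategy mentioned above reduces the nonlinear equilibrium equations to a nonlinear algebraic system. Below is a generic Newton-Raphson sketch applied to a toy two-equation system standing in for the discretized beam; the residual and Jacobian are invented for illustration and are not the paper's equations.

        import numpy as np

        def newton_raphson(residual, jacobian, x0, tol=1e-10, max_iter=50):
            """Solve residual(x) = 0 by Newton-Raphson iteration."""
            x = np.asarray(x0, dtype=float)
            for _ in range(max_iter):
                r = residual(x)
                if np.linalg.norm(r) < tol:
                    break
                x = x - np.linalg.solve(jacobian(x), r)   # Newton update
            return x

        # Toy nonlinear system with root (1, 0), standing in for the beam equations.
        res = lambda x: np.array([x[0]**3 + x[1] - 1.0, x[1]**3 - x[0] + 1.0])
        jac = lambda x: np.array([[3*x[0]**2, 1.0], [-1.0, 3*x[1]**2]])
        print(newton_raphson(res, jac, [0.5, 0.5]))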

  1. Mutual assumptions and facts about nondisclosure among clinical supervisors and students in group supervision

    DEFF Research Database (Denmark)

    Nielsen, Geir Høstmark; Skjerve, Jan; Jacobsen, Claus Haugaard

    2009-01-01

    In the two preceding papers of this issue of Nordic Psychology the authors report findings from a study of nondisclosure among student therapists and clinical supervisors. The findings were reported separately for each group. In this article, the two sets of findings are held together and compared......, so as to draw a picture of mutual assumptions and facts about nondisclosure among students and supervisors....

  2. Electron emission from molybdenum under ion bombardment

    International Nuclear Information System (INIS)

    Ferron, J.; Alonso, E.V.; Baragiola, R.A.; Oliva-Florio, A.

    1981-01-01

    Measurements are reported of electron emission yields of clean molybdenum surfaces under bombardment with H+, H2+, D+, D2+, He+, N+, N2+, O+, O2+, Ne+, Ar+, Kr+ and Xe+ in the wide energy range 0.7-60.2 keV. The clean surfaces were produced by inert gas sputtering under ultrahigh vacuum. The results are compared with those predicted by a core-level excitation model. The disagreement found when using correct values for the energy levels of Mo is traced to wrong assumptions in the model. A substantially improved agreement with experiment is obtained using a model in which electron emission results from the excitation of valence electrons from the target by the projectiles and fast recoiling target atoms. (author)

  3. Heat-related mortality projections for cardiovascular and respiratory disease under the changing climate in Beijing, China

    Science.gov (United States)

    Li, Tiantian; Ban, Jie; Horton, Radley M.; Bader, Daniel A.; Huang, Ganlin; Sun, Qinghua; Kinney, Patrick L.

    2015-08-01

    Because heat-related health effects tend to become more serious at higher temperatures, there is an urgent need to determine the mortality projection of specific heat-sensitive diseases to provide more detailed information regarding the variation of the sensitivity of such diseases. In this study, the specific mortality of cardiovascular and respiratory disease in Beijing was initially projected under five different global-scale General Circulation Models (GCMs) and two Representative Concentration Pathways scenarios (RCPs) in the 2020s, 2050s, and 2080s compared to the 1980s. Multi-model ensembles indicated cardiovascular mortality could increase by an average percentage of 18.4%, 47.8%, and 69.0% in the 2020s, 2050s, and 2080s under RCP 4.5, respectively, and by 16.6%, 73.8% and 134% in different decades respectively, under RCP 8.5 compared to the baseline range. The same increasing pattern was also observed in respiratory mortality. The heat-related deaths under the RCP 8.5 scenario were found to reach a higher number and to increase more rapidly during the 21st century compared to the RCP 4.5 scenario, especially in the 2050s and the 2080s. The projection results show potential trends in cause-specific mortality in the context of climate change, and provide support for public health interventions tailored to specific climate-related future health risks.

  4. 7 CFR 1.29 - Subpoenas relating to investigations under statutes administered by the Secretary of Agriculture.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 1 2010-01-01 2010-01-01 false Subpoenas relating to investigations under statutes administered by the Secretary of Agriculture. 1.29 Section 1.29 Agriculture Office of the Secretary of Agriculture ADMINISTRATIVE REGULATIONS Departmental Proceedings § 1.29 Subpoenas relating to investigations...

  5. How does the rigid-lid assumption affect LES simulation results at high Reynolds numbers?

    Science.gov (United States)

    Khosronejad, Ali; Farhadzadeh, Ali; SBU Collaboration

    2017-11-01

    This research is motivated by the work of Kara et al., JHE, 2015. They employed LES to model flow around a model of an abutment at a Re number of 27,000. They showed that first-order turbulence characteristics obtained with the rigid-lid (RL) assumption compare fairly well with those of the level-set (LS) method. Concerning the second-order statistics, however, their simulation results showed a significant dependence on the method used to describe the free surface. This finding can have important implications for open channel flow modeling. The Reynolds number for typical open channel flows, however, could be much larger than that of Kara et al.'s test case. Herein, we replicate the reported study by augmenting the geometric and hydraulic scales to reach a Re number one order of magnitude larger (~200,000). The Virtual Flow Simulator (VFS-Geophysics) model in its LES mode is used to simulate the test case using both the RL and LS methods. The computational results are validated using measured flow and free-surface data from our laboratory experiments. Our goal is to investigate the effects of the RL assumption on both first-order and second-order statistics at the high Reynolds numbers that occur in natural waterways. Acknowledgment: Computational resources are provided by the Center of Excellence in Wireless & Information Technology (CEWIT) of Stony Brook University.

  6. Assessing women's sexuality after cancer therapy: checking assumptions with the focus group technique.

    Science.gov (United States)

    Bruner, D W; Boyd, C P

    1999-12-01

    Cancer and cancer therapies impair sexual health in a multitude of ways. The promotion of sexual health is therefore vital for preserving quality of life and is an integral part of total or holistic cancer management. To provide holistic care, nursing requires research that is meaningful to patients as well as the profession in order to develop educational and interventional studies that promote sexual health and coping. To obtain meaningful research data, instruments that are reliable, valid, and pertinent to patients' needs are required. Several sexual functioning instruments were reviewed for this study and found to be lacking in either a conceptual foundation or psychometric validation. Without a defined conceptual framework, the authors of the instruments must have made certain assumptions regarding what women undergoing cancer therapy experience and what they perceive as important. To check these assumptions before assessing women's sexuality after cancer therapies in a larger study, a pilot study was designed, using the focus group technique, to compare what women experience and perceive as important regarding their sexuality with what is assessed in several currently available research instruments. Based on the focus group findings, current sexual functioning questionnaires may be lacking in pertinent areas of concern for women treated for breast or gynecologic malignancies. Better conceptual foundations may help future questionnaire design. Self-regulation theory may provide an acceptable conceptual framework from which to develop a sexual functioning questionnaire.

  7. Projecting Future Heat-Related Mortality under Climate Change Scenarios: A Systematic Review

    Science.gov (United States)

    Barnett, Adrian Gerard; Wang, Xiaoming; Vaneckova, Pavla; FitzGerald, Gerard; Tong, Shilu

    2011-01-01

    Background: Heat-related mortality is a matter of great public health concern, especially in the light of climate change. Although many studies have found associations between high temperatures and mortality, more research is needed to project the future impacts of climate change on heat-related mortality. Objectives: We conducted a systematic review of research and methods for projecting future heat-related mortality under climate change scenarios. Data sources and extraction: A literature search was conducted in August 2010, using the electronic databases PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search was limited to peer-reviewed journal articles published in English from January 1980 through July 2010. Data synthesis: Fourteen studies fulfilled the inclusion criteria. Most projections showed that climate change would result in a substantial increase in heat-related mortality. Projecting heat-related mortality requires understanding historical temperature–mortality relationships and considering the future changes in climate, population, and acclimatization. Further research is needed to provide a stronger theoretical framework for projections, including a better understanding of socioeconomic development, adaptation strategies, land-use patterns, air pollution, and mortality displacement. Conclusions: Scenario-based projection research will meaningfully contribute to assessing and managing the potential impacts of climate change on heat-related mortality. PMID:21816703

  8. Local Relation Map: A Novel Illumination Invariant Face Recognition Approach

    Directory of Open Access Journals (Sweden)

    Lian Zhichao

    2012-10-01

    Full Text Available In this paper, a novel illumination-invariant face recognition approach is proposed. Unlike most existing methods, an additive noise term is considered in the face model under varying illumination, in addition to a multiplicative illumination term. High-frequency coefficients of the Discrete Cosine Transform (DCT) are discarded to eliminate the effect caused by noise. Based on the local characteristics of the human face, a simple but effective illumination-invariant feature, the local relation map, is proposed. Experimental results on the Yale B, Extended Yale B and CMU PIE databases demonstrate that the proposed method outperforms existing methods with a lower computational burden. The results also demonstrate the validity of the proposed face model and the assumption on noise.
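
    The noise-suppression step can be sketched as a low-pass operation in the DCT domain: transform the image, zero everything outside a low-frequency block, and invert. The block size and the random stand-in image are assumptions for illustration; the local relation map feature itself is not reproduced here.

        import numpy as np
        from scipy.fftpack import dct, idct

        def drop_high_freq(img, keep=16):
            """Keep only the lowest keep x keep 2-D DCT coefficients, discarding
            the high-frequency content attributed to noise in the model above."""
            coeffs = dct(dct(img, axis=0, norm='ortho'), axis=1, norm='ortho')
            mask = np.zeros_like(coeffs)
            mask[:keep, :keep] = 1.0
            return idct(idct(coeffs * mask, axis=1, norm='ortho'),
                        axis=0, norm='ortho')

        face = np.random.rand(64, 64)   # stand-in for a grayscale face image
        print(drop_high_freq(face).shape)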

  9. 76 FR 76037 - Office of the Attorney General; Assumption of Concurrent Federal Criminal Jurisdiction in Certain...

    Science.gov (United States)

    2011-12-06

    ... Office of the Attorney General; Assumption of Concurrent Federal Criminal Jurisdiction in Certain Areas of Indian Country AGENCY: Office of the Attorney General, Department of Justice. ACTION: Final rule... concurrent criminal jurisdiction within the tribe's Indian country, and for the Attorney General to decide...

  10. The Impact of Feedback Frequency on Learning and Task Performance: Challenging the "More Is Better" Assumption

    Science.gov (United States)

    Lam, Chak Fu; DeRue, D. Scott; Karam, Elizabeth P.; Hollenbeck, John R.

    2011-01-01

    Previous research on feedback frequency suggests that more frequent feedback improves learning and task performance (Salmoni, Schmidt, & Walter, 1984). Drawing from resource allocation theory (Kanfer & Ackerman, 1989), we challenge the "more is better" assumption and propose that frequent feedback can overwhelm an individual's cognitive resource…

  11. Relation between stability and resilience determines the performance of early warning signals under different environmental drivers.

    Science.gov (United States)

    Dai, Lei; Korolev, Kirill S; Gore, Jeff

    2015-08-11

    Shifting patterns of temporal fluctuations have been found to signal critical transitions in a variety of systems, from ecological communities to human physiology. However, failure of these early warning signals in some systems calls for a better understanding of their limitations. In particular, little is known about the generality of early warning signals in different deteriorating environments. In this study, we characterized how multiple environmental drivers influence the dynamics of laboratory yeast populations, which was previously shown to display alternative stable states [Dai et al., Science, 2012]. We observed that both the coefficient of variation and autocorrelation increased before population collapse in two slowly deteriorating environments, one with a rising death rate and the other one with decreasing nutrient availability. We compared the performance of early warning signals across multiple environments as "indicators for loss of resilience." We find that the varying performance is determined by how a system responds to changes in a specific driver, which can be captured by a relation between stability (recovery rate) and resilience (size of the basin of attraction). Furthermore, we demonstrate that the positive correlation between stability and resilience, as the essential assumption of indicators based on critical slowing down, can break down in this system when multiple environmental drivers are changed simultaneously. Our results suggest that the stability-resilience relation needs to be better understood for the application of early warning signals in different scenarios.
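
    A minimal sketch of the two indicators used above, computed over a rolling window of a population time series; the window length and the toy trajectory are assumptions for illustration.

        import numpy as np

        def rolling_ews(series, window=50):
            """Rolling coefficient of variation and lag-1 autocorrelation,
            the two early warning indicators discussed above."""
            cv, ac1 = [], []
            s = np.asarray(series, dtype=float)
            for i in range(window, len(s)):
                w = s[i - window:i]
                cv.append(w.std() / w.mean())
                ac1.append(np.corrcoef(w[:-1], w[1:])[0, 1])
            return np.array(cv), np.array(ac1)

        rng = np.random.default_rng(2)
        pop = 1e3 - 1.5 * np.arange(500) + rng.normal(0, 20, 500)  # toy decline
        cv, ac1 = rolling_ews(pop)
        print(cv[-1], ac1[-1])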

  12. Assumptions of Customer Knowledge Enablement in the Open Innovation Process

    Directory of Open Access Journals (Sweden)

    Jokubauskienė Raminta

    2017-08-01

    Full Text Available In the scientific literature, open innovation is one of the most effective means to innovate and gain a competitive advantage. In practice, there is a variety of open innovation activities, but customers stand as the cornerstone in this area, since customers' knowledge is one of the most important sources of new knowledge and ideas. When evaluating the context in which open innovation and customer knowledge enablement interact, it is necessary to take into account the importance of customer knowledge management. It is increasingly highlighted that customer knowledge management facilitates the creation of innovations. However, other factors that influence open innovation, and at the same time customer knowledge management, should also be examined. This article presents a theoretical model which reveals the assumptions of the open innovation process and its impact on the firm's performance.

  13. Evaluating the Sensitivity of the Mass-Based Particle Removal Calculations for HVAC Filters in ISO 16890 to Assumptions for Aerosol Distributions

    Directory of Open Access Journals (Sweden)

    Brent Stephens

    2018-02-01

    Full Text Available High efficiency particle air filters are increasingly being recommended for use in heating, ventilating, and air-conditioning (HVAC) systems to improve indoor air quality (IAQ). ISO Standard 16890-2016 provides a methodology for approximating mass-based particle removal efficiencies for PM1, PM2.5, and PM10 using size-resolved removal efficiency measurements for 0.3 µm to 10 µm particles. Two historical volume distribution functions for ambient aerosol distributions are assumed to represent ambient air in urban and rural areas globally. The goals of this work are to: (i) review the ambient aerosol distributions used in ISO 16890, (ii) evaluate the sensitivity of the mass-based removal efficiency calculation procedures described in ISO 16890 to various assumptions that are related to indoor and outdoor aerosol distributions, and (iii) recommend several modifications to the standard that can yield more realistic estimates of mass-based removal efficiencies for HVAC filters, and thus provide a more realistic representation of a greater number of building scenarios. The results demonstrate that knowing the PM mass removal efficiency estimated using ISO 16890 is not sufficient to predict the PM mass removal efficiency in all of the environments in which the filter might be used. The main reason for this insufficiency is that the assumptions for aerosol number and volume distributions can substantially impact the results, albeit with some exceptions.
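
    The calculation whose sensitivity is examined here can be sketched as a volume-weighted average of the size-resolved efficiency. In the sketch below both the filter curve and the lognormal volume distribution are invented; ISO 16890 instead prescribes two specific ambient distributions, which is exactly the assumption this paper probes.

        import numpy as np

        def mass_efficiency(diam, eff, vol_dist, d_max=2.5):
            """Volume(mass)-weighted removal efficiency for particles up to
            d_max (e.g. an ePM2.5-style figure), integrating the size-resolved
            efficiency against an assumed ambient volume distribution."""
            sel = diam <= d_max
            num = np.trapz(eff[sel] * vol_dist[sel], diam[sel])
            den = np.trapz(vol_dist[sel], diam[sel])
            return num / den

        diam = np.linspace(0.3, 10.0, 200)               # micrometres
        eff = 1.0 - np.exp(-0.8 * diam)                  # made-up filter curve
        vol = np.exp(-0.5 * (np.log(diam / 0.4) / 0.8) ** 2)  # toy lognormal mode
        print(f"ePM2.5-style efficiency ~ {100 * mass_efficiency(diam, eff, vol):.1f}%")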

  14. A note on poroacoustic traveling waves under Forchheimer's law

    International Nuclear Information System (INIS)

    Jordan, P.M.

    2013-01-01

    Acoustic traveling waves in a gas that saturates a rigid porous medium are investigated under the assumption that the drag experienced by the gas is modeled by Forchheimer's law. Exact traveling wave solutions (TWSs), as well as approximate and asymptotic expressions, are obtained; decay rates are determined; and acceleration wave results are presented. In addition, special cases are considered, critical values of the wave variable and parameters are derived, and comparisons with predictions based on Darcy's law are performed. It is shown that, with respect to the Darcy case, most of the metrics that characterize such waveforms exhibit an increase in magnitude under Forchheimer's law
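
    For reference, a standard textbook form of Forchheimer's law (the paper's nondimensionalization may differ) augments the linear Darcy drag with a term quadratic in the velocity:

        -\nabla p \;=\; \frac{\mu}{k}\,\mathbf{v} \;+\; \beta\,\rho\,\lvert\mathbf{v}\rvert\,\mathbf{v}

    where \mu is the gas viscosity, k the permeability, \rho the density, and \beta the inertial (Forchheimer) coefficient; setting \beta = 0 recovers the Darcy case used for comparison above.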

  15. Reconstructing genealogies of serial samples under the assumption of a molecular clock using serial-sample UPGMA.

    Science.gov (United States)

    Drummond, A; Rodrigo, A G

    2000-12-01

    Reconstruction of evolutionary relationships from noncontemporaneous molecular samples provides a new challenge for phylogenetic reconstruction methods. With recent biotechnological advances there has been an increase in molecular sequencing throughput, and the potential to obtain serial samples of sequences from populations, including rapidly evolving pathogens, is fast being realized. A new method called the serial-sample unweighted pair grouping method with arithmetic means (sUPGMA) is presented that reconstructs a genealogy or phylogeny of sequences sampled serially in time using a matrix of pairwise distances. The resulting tree depicts the terminal lineages of each sample ending at a different level consistent with the sample's temporal order. Since sUPGMA is a variant of UPGMA, it will perform best when sequences have evolved at a constant rate (i.e., according to a molecular clock). On simulated data, this new method performs better than standard cluster analysis under a variety of longitudinal sampling strategies. Serial-sample UPGMA is particularly useful for analysis of longitudinal samples of viruses and bacteria, as well as ancient DNA samples, with the minimal requirement that samples of sequences be ordered in time.
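
    Since sUPGMA is a variant of UPGMA, its clustering core can be sketched with standard average-linkage tools. The sketch below clusters an invented distance matrix with plain UPGMA; the serial-sample correction, in which each pairwise distance is first adjusted for the estimated rate times the sampling-time difference so that later tips terminate at higher levels, is omitted.

        import numpy as np
        from scipy.cluster.hierarchy import average
        from scipy.spatial.distance import squareform

        # Toy pairwise distance matrix for four sequences (invented values).
        D = np.array([[0.00, 0.10, 0.30, 0.32],
                      [0.10, 0.00, 0.28, 0.30],
                      [0.30, 0.28, 0.00, 0.12],
                      [0.32, 0.30, 0.12, 0.00]])

        tree = average(squareform(D))   # UPGMA = average-linkage clustering
        print(tree)                     # linkage matrix defining the genealogy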

  16. Correlation between Strawberry (Fragaria ananassa Duch. Productivity and Photosynthesis-related Parameters under Various Growth Conditions

    Directory of Open Access Journals (Sweden)

    Hyo Gil Choi

    2016-10-01

    Full Text Available In the present study, we investigated changes in chlorophyll fluorescence, photosynthetic parameters and fruit yields, as well as fruit phytochemical accumulation, of strawberry (Fragaria ananassa Duch.) that had been cultivated in a greenhouse under different combinations of light intensity and temperature. In plants grown under low light (LL), photosystem II chlorophyll fluorescence was found to increase as compared with those grown under high light (HL). When strawberry plants were grown at temperatures higher than 5 °C in addition to LL, they showed decreases in non-photochemical quenching (NPQ), photochemical quenching (qP), as well as chlorophyll fluorescence decrease ratio (RFd) when compared with other combinations of light and temperature. Moreover, fruit yield of strawberry was closely correlated with chlorophyll fluorescence-related parameters such as NPQ, qP, and RFd, but not with the maximum efficiency of PS II (Fv/Fm). Although plant groups grown under different combinations of light and temperature showed almost comparable photosynthesis rates (Pr) when irradiated with low-intensity light, they displayed clear differences when measured at higher irradiances. Plants grown under HL with temperature above 10 °C showed the highest Pr, in contrast to the plants grown under LL with temperature above 5 °C. When the stomatal conductance and the transpiration rate were measured, plants of each treatment showed clear differences even when analyzed at lower irradiances. We also found that fruit production during the winter season was more strongly influenced by growth temperature than by light intensity. We suggest that fruit productivity of strawberry is closely associated with chlorophyll fluorescence and photosynthesis-related parameters during cultivation under different regimes of temperature and light.

  17. Foreign Under-Investment in US Securities and the Role of Relational Capital

    OpenAIRE

    Michael, Bryane

    2015-01-01

    Over 70 academic papers attempt to explain why foreigners invest in US securities. All ignore the vital role of the US broker-dealer. Macroeconomic factors like a trade balance or corporate governance may guide foreign investors toward certain markets. But US broker-dealers provide information to foreign investors and execute the actual trades. We hypothesize that particular foreign investors under-invest in US securities because of a lack of relational capital with US broker-dealers. We find...

  18. Divide and Conquer: A Valid Approach for Risk Assessment and Decision Making under Uncertainty for Groundwater-Related Diseases

    Science.gov (United States)

    Sanchez-Vila, X.; de Barros, F.; Bolster, D.; Nowak, W.

    2010-12-01

    Assessing the potential risk of hydro(geo)logical supply systems to human populations is an interdisciplinary field. It relies on expertise in fields as distant as hydrogeology, medicine, or anthropology, and needs powerful translation concepts to provide decision support and policy making. Reliable health risk estimates need to account for the uncertainties in hydrological, physiological and human behavioral parameters. We propose the use of fault trees to address the task of probabilistic risk analysis (PRA) and to support related management decisions. Fault trees allow decomposing the assessment of health risk into individual manageable modules, thus tackling a complex system by a structural "Divide and Conquer" approach. The complexity within each module can be chosen individually according to data availability, parsimony, relative importance and stage of analysis. The separation into modules allows for a truly inter- and multi-disciplinary approach. This presentation highlights the three novel features of our work: (1) we define failure in terms of risk being above a threshold value, whereas previous studies used auxiliary events such as exceedance of critical concentration levels, (2) we plot an integrated fault tree that handles uncertainty in both hydrological and health components in a unified way, and (3) we introduce a new form of stochastic fault tree that allows us to weaken the assumption of independent subsystems that is required by a classical fault tree approach. We illustrate our concept in a simple groundwater-related setting.
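
    A classical fault tree combines module probabilities through AND/OR gates, as in the minimal sketch below; the module probabilities and the tree layout are invented for illustration, and the stochastic, dependence-aware variant introduced in point (3) is not reproduced.

        from functools import reduce

        def gate_and(probs):   # all independent sub-events must occur
            return reduce(lambda a, b: a * b, probs, 1.0)

        def gate_or(probs):    # at least one independent sub-event occurs
            return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

        # Hypothetical modules: exposure pathway active AND (dose module OR
        # behavioral module) pushing risk above the threshold.
        p_top = gate_and([0.2, gate_or([0.05, 0.1])])
        print(f"P(risk above threshold) = {p_top:.4f}")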

  19. Global testing under sparse alternatives: ANOVA, multiple comparisons and the higher criticism

    OpenAIRE

    Arias-Castro, Ery; Candès, Emmanuel J.; Plan, Yaniv

    2011-01-01

    Testing for the significance of a subset of regression coefficients in a linear model, a staple of statistical analysis, goes back at least to the work of Fisher, who introduced the analysis of variance (ANOVA). We study this problem under the assumption that the coefficient vector is sparse, a common situation in modern high-dimensional settings. Suppose we have $p$ covariates and that under the alternative, the response only depends upon the order of $p^{1-\alpha}$ of those, $0\le\alpha\le1$...
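
    The higher criticism statistic named in the title can be sketched directly from sorted p-values; the following is a minimal illustrative implementation of the Donoho-Jin form on simulated data.

        import numpy as np

        def higher_criticism(pvals, alpha0=0.5):
            """Higher criticism over the smallest alpha0-fraction of sorted
            p-values: the largest standardized gap between the empirical and
            uniform (null) distribution functions."""
            p = np.sort(np.asarray(pvals, dtype=float))
            n = len(p)
            i = np.arange(1, n + 1)
            hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
            return hc[: max(1, int(alpha0 * n))].max()

        rng = np.random.default_rng(3)
        null_p = rng.uniform(size=1000)                        # global null
        sparse_p = np.concatenate([rng.uniform(size=990),
                                   rng.uniform(0, 1e-4, 10)])  # a few strong signals
        print(higher_criticism(null_p), higher_criticism(sparse_p))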

  20. Dominance of gauge artifact in the consistency relation for the primordial bispectrum

    International Nuclear Information System (INIS)

    Tanaka, Takahiro; Urakawa, Yuko

    2011-01-01

    The conventional cosmological perturbation theory has been performed under the assumption that we know the whole spatial region of the universe, with infinite volume. This is, however, not the case in actual observations, because the observable portion of the universe is limited. To give a theoretical prediction for the observable fluctuations, gauge-invariant observables should be composed of the information in our local observable universe with finite volume. From this point of view, we reexamine the primordial non-Gaussianity in single field models, focusing on the bispectrum in the squeezed limit. A conventional prediction states that the bispectrum in this limit is related to the power spectrum through the so-called consistency relation. However, it turns out that, if we adopt a genuinely gauge-invariant variable which is naturally composed purely of the information in our local universe, the leading term for the bispectrum in the squeezed limit predicted by the consistency relation vanishes

  1. Extraction and Analysis of Information Related to Research & Development Declared Under an Additional Protocol

    International Nuclear Information System (INIS)

    Idinger, J.; Labella, R.; Rialhe, A.; Teller, N.

    2015-01-01

    The additional protocol (AP) provides important tools to strengthen and improve the effectiveness and efficiency of the safeguards system. Safeguards are designed to verify that States comply with their international commitments not to use nuclear material or to engage in nuclear-related activities for the purpose of developing nuclear weapons or other nuclear explosive devices. Under an AP based on INFCIRC/540, a State must provide to the IAEA additional information about, and inspector access to, all parts of its nuclear fuel cycle. In addition, the State has to supply information about its nuclear fuel cycle-related research and development (R&D) activities. The majority of States declare their R&D activities under AP Articles 2.a.(i), 2.a.(x), and 2.b.(i) as part of their initial declarations and annual updates under the AP. In order to verify the consistency and completeness of information provided under the AP by States, the Agency has started to analyze declared R&D information by identifying interrelationships between States in different R&D areas relevant to safeguards. The paper outlines the quality of R&D information provided by States to the Agency, describes how the extraction and analysis of relevant declarations are currently carried out at the Agency, and specifies what kinds of difficulties arise during evaluation with respect to cross-linking international projects and finding gaps in reporting. In addition, the paper discusses how the reporting quality of AP information with reference to R&D activities and the assessment process of R&D information could be improved. (author)

  2. Heat-Related Mortality Projections for Cardiovascular and Respiratory Disease Under the Changing Climate in Beijing, China

    Science.gov (United States)

    Li, Tiantian; Ban, Jie; Horton, Radley M.; Bader, Daniel A.; Huang, Ganlin; Sun, Qinghua; Kinney, Patrick L.

    2015-01-01

    Because heat-related health effects tend to become more serious at higher temperatures, there is an urgent need to determine the mortality projection of specific heat-sensitive diseases to provide more detailed information regarding the variation of the sensitivity of such diseases. In this study, the specific mortality of cardiovascular and respiratory disease in Beijing was initially projected under five different global-scale General Circulation Models (GCMs) and two Representative Concentration Pathways scenarios (RCPs) in the 2020s, 2050s, and 2080s compared to the 1980s. Multi-model ensembles indicated cardiovascular mortality could increase by an average percentage of 18.4 percent, 47.8 percent, and 69.0 percent in the 2020s, 2050s, and 2080s under RCP 4.5, respectively, and by 16.6 percent, 73.8 percent and 134 percent in different decades respectively, under RCP 8.5 compared to the baseline range. The same increasing pattern was also observed in respiratory mortality. The heat-related deaths under the RCP 8.5 scenario were found to reach a higher number and to increase more rapidly during the 21st century compared to the RCP4.5 scenario, especially in the 2050s and the 2080s. The projection results show potential trends in cause-specific mortality in the context of climate change, and provide support for public health interventions tailored to specific climate-related future health risks.

  3. Gauge theories under incorporation of a generalized uncertainty principle

    International Nuclear Information System (INIS)

    Kober, Martin

    2010-01-01

    An extension of gauge theories is considered under the assumption of a generalized uncertainty principle, which implies a minimal length scale. A modification of the usual uncertainty principle leads to an extended form of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field, as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, since the gauge principle is a constitutive ingredient of the standard model, and since even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.
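
    For orientation, a commonly used form of the generalized uncertainty principle (a standard choice in the GUP literature; the exact deformation used in the paper may differ) modifies the canonical commutator and yields a minimal length:

        [\hat{x}, \hat{p}] = i\hbar\,(1 + \beta \hat{p}^2)
        \;\;\Rightarrow\;\;
        \Delta x\,\Delta p \ge \frac{\hbar}{2}\left(1 + \beta(\Delta p)^2\right)
        \;\;\Rightarrow\;\;
        \Delta x_{\min} = \hbar\sqrt{\beta}.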

  4. Cloud-turbulence interactions: Sensitivity of a general circulation model to closure assumptions

    International Nuclear Information System (INIS)

    Brinkop, S.; Roeckner, E.

    1993-01-01

    Several approaches to parameterize the turbulent transport of momentum, heat, water vapour and cloud water for use in a general circulation model (GCM) have been tested in one-dimensional and three-dimensional model simulations. The schemes differ with respect to their closure assumptions (conventional eddy diffusivity model versus turbulent kinetic energy closure) and also regarding their treatment of cloud-turbulence interactions. The basic properties of these parameterizations are discussed first in column simulations of a stratocumulus-topped atmospheric boundary layer (ABL) under a strong subsidence inversion during the KONTROL experiment in the North Sea. It is found that the K-models tend to decouple the cloud layer from the adjacent layers because the turbulent activity is calculated from local variables. The higher-order scheme performs better in this respect because internally generated turbulence can be transported up and down through the action of turbulent diffusion. Thus, the TKE-scheme provides not only a better link between the cloud and the sub-cloud layer but also between the cloud and the inversion as a result of cloud-top entrainment. In the stratocumulus case study, where the cloud is confined by a pronounced subsidence inversion, increased entrainment favours cloud dilution through enhanced evaporation of cloud droplets. In the GCM study, however, additional cloud-top entrainment supports cloud formation because indirect cloud generating processes are promoted through efficient ventilation of the ABL, such as the enhanced moisture supply by surface evaporation and the increased depth of the ABL. As a result, tropical convection is more vigorous, the hydrological cycle is intensified, the whole troposphere becomes warmer and moister in general and the cloudiness in the upper part of the ABL is increased. (orig.)
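
    The two closure families contrasted here can be written schematically (standard boundary-layer notation, not the paper's exact formulation). A first-order K-model diagnoses the turbulent flux from local gradients alone, whereas a TKE closure carries a prognostic equation for the turbulent kinetic energy e and ties the exchange coefficient to it:

        \overline{w'\phi'} = -K_\phi\,\frac{\partial\bar{\phi}}{\partial z},
        \qquad K_\phi = c_\phi\, l\, \sqrt{e}, \qquad
        \frac{\partial e}{\partial t} = P_s + P_b
        + \frac{\partial}{\partial z}\!\left(K_e \frac{\partial e}{\partial z}\right) - \varepsilon,

    where P_s and P_b are shear and buoyancy production, the third term is the turbulent transport that lets internally generated turbulence diffuse up and down, and ε is dissipation.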

  5. Black Bodies in Dance Education: Charting a New Pedagogical Paradigm to Eliminate Gendered and Hypersexualized Assumptions

    Science.gov (United States)

    West, C. S'thembile

    2005-01-01

    To resist and transform gendered and hypersexualized assumptions and attitudes that cloud interpretations and devalue readings of black and brown bodies, dance educators can not only facilitate agency for their students, but also help demonstrate an overarching concern for social justice and equality. Dance has the power to transform and redirect…

  6. Confinement and related transport in Extrap geometry

    International Nuclear Information System (INIS)

    Tendler, M.

    1983-01-01

    The properties of the plasma dynamic equilibrium are investigated for the Extrap magnetic confinement geometry. The temperatures achieved so far in high-β pinches are much lower than the predicted values. Here, it is shown that particle containment in Extrap may be improved compared to other pinches due to the electrostatic confinement. An analytic solution for the profiles of the plasma parameters is found under the assumption that the energy is lost primarily in the radial direction by heat conduction and convection. An estimate of the radial particle confinement time is given, showing favourable scaling with plasma density and temperature. The conventional assumption of a uniform current density is shown to be unjustified in the case of an inhomogeneous electron temperature. An analytical expression is found for the pinch radius under different mechanisms of heat transport. (orig.)

  7. Assumption and program of the earlier stage construction of L/ILW disposal site

    International Nuclear Information System (INIS)

    Li Xuequn; Chen Shi; Li Xinbang

    1993-01-01

    The authors analyse the production and treatment of low- and intermediate-level radwastes (L/ILW) in China, and introduce some problems and the current situation in this field. Over the past ten years, preliminary efforts have been made by CNNC (China National Nuclear Corporation) in policy, laws and rules, the development program, the management system, siting, engineering techniques, and safety assessment for radwaste disposal. The investment for the earlier-stage work of L/ILW disposal site construction is estimated, and the program for, and assumptions underlying, construction of the L/ILW disposal site are reviewed

  8. Rational Adaptation under Task and Processing Constraints: Implications for Testing Theories of Cognition and Action

    Science.gov (United States)

    Howes, Andrew; Lewis, Richard L.; Vera, Alonso

    2009-01-01

    The authors assume that individuals adapt rationally to a utility function given constraints imposed by their cognitive architecture and the local task environment. This assumption underlies a new approach to modeling and understanding cognition--cognitively bounded rational analysis--that sharpens the predictive acuity of general, integrated…

  9. Decision making under ambiguity but not under risk is related to problem gambling severity

    NARCIS (Netherlands)

    Brevers, Damien; Cleeremans, Axel; Goudriaan, Anna E.; Bechara, Antoine; Kornreich, Charles; Verbanck, Paul; Noël, Xavier

    2012-01-01

    The aim of the present study was to examine the relationship between problem gambling severity and decision-making situations that vary in two degrees of uncertainty (probability of outcome is known: decision-making under risk; probability of outcome is unknown: decision-making under ambiguity). For

  10. Investigation of China’s national public relations strategy under globalization : the hotspots around the national media

    OpenAIRE

    雷, 紫雯

    2014-01-01

    This study investigates China’s national public relations strategy under globalization by analyzing the national media. In recent years, in order to improve the global public opinion environment and to build national public relations capabilities that match its status as an economic power, China has actively strengthened its national public relations strategies, including making the national “media go out” and building world-class media. By researching the localization of Chinese ...

  11. Knaves, Knights or Networks: Which Assumption of Lecturer and Manager Motivation Should Underlie Further Education Policy?

    Science.gov (United States)

    Boocock, Andrew

    2015-01-01

    Julian Le Grand, a well-known economist, identifies two types of public sector employee: knights (with altruistic motives) and knaves (with self-interested motives). He argues that the quasi-market, predicated on the assumption of knavish behaviour (or agent self-interest), is the most effective way of directing school managers and teachers…

  12. Psychological Factors related with Driving under the Influence of Alcohol and Substance Use

    Directory of Open Access Journals (Sweden)

    Ersin Budak

    2015-09-01

    Full Text Available Driving under the influence of alcohol and substance use is an important traffic problem that has caused many people around the world to lose their lives. Many abilities that are important for driving are adversely affected under the influence of alcohol and other substances, and impaired driving behavior therefore arises in such drivers. The most effective way to prevent this impaired driving behavior is through the restrictions and regulations imposed on drivers in traffic related to alcohol and drug use. Nevertheless, the literature shows that some drivers continue to drive while impaired, a risky traffic behavior with which driver personality features (risk-taking, thrill-seeking, self-control), psychopathological features (substance abuse, personality disorders, mood disorders, attention deficit hyperactivity disorder, post-traumatic stress disorder, anxiety, anger and aggression), and many neuropsychological features are considered to be related. In this article, psychological, psychopathological and neuropsychological studies regarding driving under the influence of alcohol and drugs are reviewed. [Psikiyatride Guncel Yaklasimlar - Current Approaches in Psychiatry 2015; 7(3): 333-347]

  13. An assessment of issues related to determination of time periods required for isolation of high level waste

    International Nuclear Information System (INIS)

    Cohen, J.J.; Daer, G.R.; Smith, C.F.; Vogt, D.K.; Woolfolk, S.W.

    1989-01-01

    A commonly held perception is that disposal of spent nuclear fuel or high-level waste presents a risk of unprecedented duration. The EPA requires that projected releases of radioactivity be limited for 10,000 years after disposal, with the intent that risks from the disposal repository be no greater than those from the uranium ore deposit from which the nuclear fuel was originally extracted. This study reviews issues involved in assessing compliance with the requirement. The determination of compliance is assumption-dependent, primarily due to uncertainties in dosimetric data and in the relative availability of the radioactivity for environmental transport and eventual assimilation by humans. A conclusion of this study is that, in time, a spent fuel disposal repository such as the projected Yucca Mountain Project Facility will become less hazardous than the original ore deposit; only the time it takes to do so is in question. Depending upon the assumptions selected, this time period could range from a few centuries to hundreds of thousands of years, considering only the inherent radiotoxicities. However, if it can be assumed that the spent fuel radioactivity emplaced in a waste repository is less than 1/10 as available for human assimilation as that in a uranium ore deposit, then even under the most pessimistic set of assumptions the EPA criteria can be considered met. 24 refs., 5 figs., 2 tabs
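
    The assumption-dependence of the crossover time can be illustrated with a toy calculation: if the repository's radiotoxicity relative to the ore body decays roughly exponentially, the crossover time scales with the assumed availability factor. All numbers below are illustrative placeholders, not values from the study.

        import math

        # Toy hazard model: h(t) = h0 * exp(-t / tau), relative to the ore deposit.
        # Crossover when availability * h(t) <= 1. h0 and tau are hypothetical.
        h0, tau = 1.0e3, 3.0e4   # initial relative hazard; time scale in years

        def crossover_year(availability):
            return tau * math.log(availability * h0)

        for availability in (1.0, 0.1):
            print(f"availability {availability}: ~{crossover_year(availability):,.0f} y")

    Dropping the availability factor from 1 to 1/10 shortens the toy crossover by tau·ln(10), which is the flavour of the argument made above.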

  14. Child mortality estimation: consistency of under-five mortality rate estimates using full birth histories and summary birth histories.

    Directory of Open Access Journals (Sweden)

    Romesh Silva

    Full Text Available Given the lack of complete vital registration data in most developing countries, for many countries it is not possible to accurately estimate under-five mortality rates from vital registration systems. Heavy reliance is often placed on direct and indirect methods for analyzing data collected from birth histories to estimate under-five mortality rates. Yet few systematic comparisons of these methods have been undertaken. This paper investigates whether analysts should use both direct and indirect estimates from full birth histories, and under what circumstances indirect estimates derived from summary birth histories should be used. Using Demographic and Health Surveys data from West Africa, East Africa, Latin America, and South/Southeast Asia, I quantify the differences between direct and indirect estimates of under-five mortality rates, analyze data quality issues, note the relative effects of these issues, and test whether these issues explain the observed differences. I find that indirect estimates are generally consistent with direct estimates after adjustment for fertility change and birth transference, but do not add substantial insight beyond direct estimates. However, the choice of direct or indirect method was found to be important in terms of both the adjustment for data errors and the assumptions made about fertility. Although adjusted indirect estimates are generally consistent with adjusted direct estimates, some notable inconsistencies were observed for countries that had experienced either a political or economic crisis or a stalled health transition in their recent past. This result suggests that when a population has experienced a smooth mortality decline or only short periods of excess mortality, both adjusted methods perform equally well. However, the observed inconsistencies suggest that the indirect method is particularly prone to bias resulting from violations of its strong assumptions about recent mortality

  15. Changes in children's oral health related quality of life following dental treatment under general anesthesia

    Directory of Open Access Journals (Sweden)

    Seyed Ebrahim Jabarifar

    2009-01-01

    Full Text Available Background: Children's oral health related quality of life (OHRQoL) evaluates the impacts of oral daily activities of children and family on quality of life. Oral health related quality of life as an outcome can be used to evaluate dental health services. This study aimed to assess the extent to which dental treatment under general anesthesia affects the quality of life of children and their families. Methods: One hundred parents of 3-10 year-old children who needed dental treatment under general anesthesia completed a parent-children perception questionnaire (P-CPQ) and family impact scale (FIS) before, and 4 weeks after, dental treatment under general anesthesia. The questionnaire had statements related to oral health, functional limitation, emotional and social well-being, and family issues. Data were analyzed using SPSS version 11.5. Results: The mean scores and standard deviations of oral health quality of life of the children before and after dental treatment were 43.3 ± 7.14 and 39.24 ± 5.47, respectively. The mean scores of FIS before and after dental treatment were 8.00 ± 3.21 and 3.66 ± 2.62, respectively. The effect sizes of the mean differences in P-CPQ and FIS scores were 1.84 ± 1.64 and 1.35 ± 4.34, respectively. Conclusion: Provision of dental treatment under general anesthesia for uncooperative, young children with extensive dental problems had significant effects on the quality of life of both children and their families.

  16. Social power and approach-related neural activity

    NARCIS (Netherlands)

    M.A.S. Boksem (Maarten); R. Smolders (Ruud); D. de Cremer (David)

    2009-01-01

    textabstractIt has been argued that power activates a general tendency to approach whereas powerlessness activates a tendency to inhibit. The assumption is that elevated power involves reward-rich environments, freedom and, as a consequence, triggers an approach-related motivational orientation and

  17. Reaction progress pathways for glass and spent fuel under unsaturated conditions

    International Nuclear Information System (INIS)

    Bates, J.; Finn, P.; Bourcier, W.; Stout, R.

    1994-10-01

    The source term for the release of radionuclides from a nuclear waste repository is the waste form. In order to assess the performance of the repository and the engineered barrier system (EBS) against regulations established by the Nuclear Regulatory Commission and the Environmental Protection Agency, it is necessary (1) to use available data to place bounding limits on release rates from the EBS, and (2) to develop a mechanistic predictive model of the radionuclide release and validate the model against tests done under a variety of different potential reaction conditions. The problem with (1) is that there is little experience to use when evaluating waste form reaction under unsaturated conditions, such that errors in applying expert judgment to the problem may be significant. The second approach, to test and model the waste form reaction, is a more defensible means of providing input to the prediction of radionuclide release. In this approach, information related to the source term has a technical basis and provides a starting point to make reasonable assumptions for long-term behavior. Key aspects of this approach are an understanding of the reaction progress mechanism and the ability to model the tests using a geochemical code such as EQ3/6. Current knowledge of glass, UO2, and spent fuel reactions under different conditions is described below

  18. Speakers' assumptions about the lexical flexibility of idioms.

    Science.gov (United States)

    Gibbs, R W; Nayak, N P; Bolton, J L; Keppel, M E

    1989-01-01

    In three experiments, we examined why some idioms can be lexically altered and still retain their figurative meanings (e.g., John buttoned his lips about Mary can be changed into John fastened his lips about Mary and still mean "John didn't say anything about Mary"), whereas other idioms cannot be lexically altered without losing their figurative meanings (e.g., John kicked the bucket, meaning "John died," loses its idiomatic meaning when changed into John kicked the pail). Our hypothesis was that the lexical flexibility of idioms is determined by speakers' assumptions about the ways in which parts of idioms contribute to their figurative interpretations as a whole. The results of the three experiments indicated that idioms whose individual semantic components contribute to their overall figurative meanings (e.g., go out on a limb) were judged as less disrupted by changes in their lexical items (e.g., go out on a branch) than were nondecomposable idioms (e.g., kick the bucket) when their individual words were altered (e.g., punt the pail). These findings lend support to the idea that both the syntactic productivity and the lexical makeup of idioms are matters of degree, depending on the idioms' compositional properties. This conclusion suggests that idioms do not form a unique class of linguistic items, but share many of the properties of more literal language.

  19. A robust procedure for comparing multiple means under heteroscedasticity in unbalanced designs.

    Directory of Open Access Journals (Sweden)

    Esther Herberich

    2010-03-01

    Full Text Available Investigating differences between means of more than two groups or experimental conditions is a routine research question addressed in biology. In order to assess differences statistically, multiple comparison procedures are applied. The most prominent procedures of this type, the Dunnett and Tukey-Kramer tests, control the probability of reporting at least one false positive result when the data are normally distributed and when the sample sizes and variances do not differ between groups. All three assumptions are unrealistic in biological research and any violation leads to an increased number of reported false positive results. Based on a general statistical framework for simultaneous inference and robust covariance estimators, we propose a new statistical multiple comparison procedure for assessing multiple means. In contrast to the Dunnett or Tukey-Kramer tests, no assumptions regarding the distribution, sample sizes or variance homogeneity are necessary. The performance of the new procedure is assessed by means of its familywise error rate and power under different distributions. The practical merits are demonstrated by a reanalysis of fatty acid phenotypes of the bacterium Bacillus simplex from the "Evolution Canyons" I and II in Israel. The simulation results show that even under severely varying variances, the procedure controls the number of false positive findings very well. Thus, the here presented procedure works well under biologically realistic scenarios of unbalanced group sizes, non-normality and heteroscedasticity.
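
    The procedure builds on simultaneous inference with robust covariance estimators; a rough Python analogue of the same idea, an OLS fit whose inference uses a heteroscedasticity-consistent (HC3) sandwich covariance followed by a multiplicity adjustment, might look as follows. This is a sketch of the general approach on simulated data, not the authors' exact max-t procedure (Holm is used here as a simpler stand-in).

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from statsmodels.stats.multitest import multipletests
        from itertools import combinations

        rng = np.random.default_rng(1)
        # Simulated unbalanced, heteroscedastic data: three groups with unequal
        # sample sizes and standard deviations (placeholder values).
        df = pd.DataFrame({
            "group": np.repeat(["a", "b", "c"], [8, 15, 30]),
            "y": np.concatenate([rng.normal(0.0, 1.0, 8),
                                 rng.normal(0.5, 3.0, 15),
                                 rng.normal(1.0, 0.5, 30)]),
        })

        # OLS whose inference uses an HC3 sandwich covariance: no assumption
        # of variance homogeneity or balanced group sizes.
        fit = smf.ols("y ~ C(group)", data=df).fit(cov_type="HC3")

        # All pairwise contrasts, then a Holm adjustment for multiplicity.
        pvals, labels = [], []
        for g1, g2 in combinations(["a", "b", "c"], 2):
            contrast = (f"C(group)[T.{g2}] = 0" if g1 == "a"
                        else f"C(group)[T.{g2}] - C(group)[T.{g1}] = 0")
            pvals.append(float(fit.t_test(contrast).pvalue))
            labels.append(f"{g2} - {g1}")

        reject, p_adj, _, _ = multipletests(pvals, method="holm")
        for lab, p, r in zip(labels, p_adj, reject):
            print(f"{lab}: adjusted p = {p:.3f}, reject = {r}")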

  20. Learning process mapping heuristics under stochastic sampling overheads

    Science.gov (United States)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.

    1991-01-01

    A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to get the best possible heuristics while trading off the solution quality of the process mapping heuristics against their execution time. The statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.

  1. Fourth-order structural steganalysis and analysis of cover assumptions

    Science.gov (United States)

    Ker, Andrew D.

    2006-02-01

    We extend our previous work on structural steganalysis of LSB replacement in digital images, building detectors which analyse the effect of LSB operations on pixel groups as large as four. Some of the method previously applied to triplets of pixels carries over straightforwardly. However we discover new complexities in the specification of a cover image model, a key component of the detector. There are many reasonable symmetry assumptions which we can make about parity and structure in natural images, only some of which provide detection of steganography, and the challenge is to identify the symmetries a) completely, and b) concisely. We give a list of possible symmetries and then reduce them to a complete, non-redundant, and approximately independent set. Some experimental results suggest that all useful symmetries are thus described. A weighting is proposed and its approximate variance stabilisation verified empirically. Finally, we apply symmetries to create a novel quadruples detector for LSB replacement steganography. Experimental results show some improvement, in most cases, over other detectors. However the gain in performance is moderate compared with the increased complexity in the detection algorithm, and we suggest that, without new insight, further extension of structural steganalysis may provide diminishing returns.
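
    The structural family of detectors rests on the parity asymmetry of LSB replacement: embedding can only map an even value to itself or up by one, and an odd value to itself or down by one, never across. A minimal sketch of that asymmetry (illustrative only; not the quadruples detector itself):

        import random

        def lsb_replace(pixels, bits):
            """Overwrite each pixel's least significant bit with a payload bit.
            Even values stay or rise by 1; odd values stay or fall by 1."""
            return [(p & ~1) | b for p, b in zip(pixels, bits)]

        random.seed(0)
        cover = [random.randrange(256) for _ in range(8)]
        stego = lsb_replace(cover, [random.randrange(2) for _ in range(8)])
        for c, s in zip(cover, stego):
            print(f"{c:3d} -> {s:3d}  ({s - c:+d})")

    It is exactly this even-up/odd-down structure, aggregated over pixel groups, that the symmetry assumptions of the cover model have to capture.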

  2. Increasing weather-related impacts on European population under climate and demographic change

    Science.gov (United States)

    Forzieri, Giovanni; Cescatti, Alessandro; Batista e Silva, Filipe; Kovats, Sari R.; Feyen, Luc

    2017-04-01

    Over the last three decades the overwhelming majority of disasters have been caused by weather-related events. The observed rise in weather-related disaster losses has been largely attributed to increased exposure and, to a lesser degree, to global warming. Recent studies suggest an intensification in the climatology of multiple weather extremes in Europe over the coming decades under climate change, while urbanization continues. In view of these pressures, understanding and quantifying the potential impacts of extreme weather events on future societies is imperative in order to identify where and to what extent their livelihoods will be at risk in the future, and to develop timely and effective adaptation and disaster risk reduction strategies. Here we present a comprehensive assessment of single- and multi-hazard impacts on the European population until the year 2100. For this purpose, we developed a novel methodology that quantifies the human impacts as a multiplicative function of hazard, exposure and population vulnerability. We focus on seven of the most impacting weather-related hazards - including heat and cold waves, wildfires, droughts, river and coastal floods and windstorms - and evaluate their spatial and temporal variations in intensity and frequency under a business-as-usual climate scenario. Long-term demographic dynamics were modelled to assess exposure developments under a corresponding middle-of-the-road scenario. Vulnerability of humans to weather extremes was appraised based on more than 2300 records of weather-related disasters. The integration of these elements provides a range of plausible estimates of extreme weather-related risks for future European generations. Expected impacts on population are quantified in terms of fatalities and number of people exposed. We find a staggering rise in fatalities from extreme weather events, with the projected death toll by the end of the century amounting to more than 50 times the present number of people
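
    Schematically, the impact model multiplies the three components per hazard and aggregates; a toy version with hypothetical inputs (not the study's data):

        # risk = hazard x exposure x vulnerability, summed over hazard types.
        hazards = {                 # (annual event probability, fatality rate)
            "heat wave":   (0.20, 4e-5),
            "river flood": (0.05, 1e-4),
            "windstorm":   (0.10, 2e-5),
        }
        exposed_population = 2_000_000   # hypothetical exposed residents

        fatalities = sum(p * exposed_population * v for p, v in hazards.values())
        print(f"expected fatalities per year: {fatalities:.0f}")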

  3. Some comments on mapping from disease-specific to generic health-related quality-of-life scales.

    Science.gov (United States)

    Palta, Mari

    2013-01-01

    An article by Lu et al. in this issue of Value in Health addresses the mapping of treatment or group differences in disease-specific measures (DSMs) of health-related quality of life onto differences in generic health-related quality-of-life scores, with special emphasis on how the mapping is affected by the reliability of the DSM. In the proposed mapping, a factor analytic model defines a conversion factor between the scores as the ratio of factor loadings. Hence, the mapping applies to convert true underlying scales and has desirable properties facilitating the alignment of instruments and understanding their relationship in a coherent manner. It is important to note, however, that when DSM means or differences in mean DSMs are estimated, their mapping is still of a measurement error-prone predictor, and the correct conversion coefficient is the true mapping multiplied by the reliability of the DSM in the relevant sample. In addition, the proposed strategy for estimating the factor analytic mapping in practice requires assumptions that may not hold. We discuss these assumptions and how they may be the reason we obtain disparate estimates of the mapping factor in an application of the proposed methods to groups of patients. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
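
    The attenuation argument can be made explicit with a one-factor measurement model (notation assumed here, not taken from the article). If both scores load on the same latent trait F,

        D = \lambda_D F + \varepsilon_D, \qquad G = \lambda_G F + \varepsilon_G,

    then the true conversion factor is \lambda_G / \lambda_D, but regressing G on the observed, error-prone D estimates

        \frac{\mathrm{Cov}(G, D)}{\mathrm{Var}(D)} = \frac{\lambda_G}{\lambda_D}\,\rho_D,
        \qquad \rho_D = \frac{\lambda_D^2\,\mathrm{Var}(F)}{\mathrm{Var}(D)},

    i.e. the true mapping multiplied by the reliability of the DSM, as stated above.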

  4. Linearity assumption in soil-to-plant transfer factors of natural uranium and radium in Helianthus annuus L

    International Nuclear Information System (INIS)

    Rodriguez, P. Blanco; Tome, F. Vera; Fernandez, M. Perez; Lozano, J.C.

    2006-01-01

    The linearity assumption of the validation of soil-to-plant transfer factors of natural uranium and 226Ra was tested using Helianthus annuus L. (sunflower) grown in a hydroponic medium. Transfer of natural uranium and 226Ra was tested in both the aerial fraction of plants and in the overall seedlings (roots and shoots). The results show that the linearity assumption can be considered valid in the hydroponic growth of sunflowers for the radionuclides studied. The ability of sunflowers to translocate uranium and 226Ra was also investigated, as well as the feasibility of using sunflower plants to remove uranium and radium from contaminated water, and by extension, their potential for phytoextraction. In this sense, the removal percentages obtained for natural uranium and 226Ra were 24% and 42%, respectively. Practically all the uranium is accumulated in the roots. However, 86% of the 226Ra activity concentration in roots was translocated to the aerial part

  5. Linearity assumption in soil-to-plant transfer factors of natural uranium and radium in Helianthus annuus L

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, P. Blanco [Departamento de Fisica, Facultad de Ciencias, Universidad de Extremadura, 06071 Badajoz (Spain); Tome, F. Vera [Departamento de Fisica, Facultad de Ciencias, Universidad de Extremadura, 06071 Badajoz (Spain)]. E-mail: fvt@unex.es; Fernandez, M. Perez [Area de Ecologia, Departamento de Fisica, Facultad de Ciencias, Universidad de Extremadura, 06071 Badajoz (Spain); Lozano, J.C. [Laboratorio de Radiactividad Ambiental, Facultad de Ciencias, Universidad de Salamanca, 37008 Salamanca (Spain)

    2006-05-15

    The linearity assumption of the validation of soil-to-plant transfer factors of natural uranium and 226Ra was tested using Helianthus annuus L. (sunflower) grown in a hydroponic medium. Transfer of natural uranium and 226Ra was tested in both the aerial fraction of plants and in the overall seedlings (roots and shoots). The results show that the linearity assumption can be considered valid in the hydroponic growth of sunflowers for the radionuclides studied. The ability of sunflowers to translocate uranium and 226Ra was also investigated, as well as the feasibility of using sunflower plants to remove uranium and radium from contaminated water, and by extension, their potential for phytoextraction. In this sense, the removal percentages obtained for natural uranium and 226Ra were 24% and 42%, respectively. Practically all the uranium is accumulated in the roots. However, 86% of the 226Ra activity concentration in roots was translocated to the aerial part.

  6. Social power and approach-related neural activity

    OpenAIRE

    Boksem, Maarten; Smolders, Ruud; Cremer, David

    2009-01-01

    textabstractIt has been argued that power activates a general tendency to approach whereas powerlessness activates a tendency to inhibit. The assumption is that elevated power involves reward-rich environments, freedom and, as a consequence, triggers an approach-related motivational orientation and attention to rewards. In contrast, reduced power is associated with increased threat, punishment and social constraint and thereby activates inhibition-related motivation. Moreover, approach motiva...

  7. Simulating residential demand response: Improving socio-technical assumptions in activity-based models of energy demand

    OpenAIRE

    McKenna, E.; Higginson, S.; Grunewald, P.; Darby, S. J.

    2017-01-01

    Demand response is receiving increasing interest as a new form of flexibility within low-carbon power systems. Energy models are an important tool to assess the potential capability of demand side contributions. This paper critically reviews the assumptions in current models and introduces a new conceptual framework to better facilitate such an assessment. We propose three dimensions along which change could occur, namely technology, activities and service expectations. Using this framework, ...

  8. A framework for ALWR economics

    International Nuclear Information System (INIS)

    Braun, C.

    1993-01-01

    This paper discusses three major factors that determine the relative competitiveness of future advanced light water reactors (ALWRs) in comparison with then-concurrent fossil generation alternatives. These factors include: (a) ALWR capital cost assumptions, (b) fossil fuel prices (particularly the natural gas price), and (c) ownership and financing arrangements, i.e., utility or independent power producer (IPP) plant ownership, construction, and operation. By properly selecting the right menu of assumptions, e.g., a low-capital-cost IPP-owned ALWR versus a utility-owned combined-cycle plant facing a high gas price, one can investigate the competitiveness of future ALWRs under various ownership environments and the sensitivity of ALWR economics to different cost assumptions. Using standardized assumptions for fossil plant financing arrangements under different ownerships, together with plant capital cost and fuel price data, it is possible to create various frameworks into which ALWR cost assumptions can be inserted to investigate relative nuclear/fossil economics
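
    At its simplest, such a framework reduces to comparing levelized costs under each ownership and financing package. The sketch below shows the basic arithmetic with entirely hypothetical cost and financing inputs:

        def lcoe(capital_per_kw, fixed_charge_rate, capacity_factor,
                 fuel_per_mwh, om_per_mwh):
            """Levelized cost of electricity in $/MWh (hypothetical inputs)."""
            mwh_per_kw_year = 8.76 * capacity_factor   # 8760 h / 1000
            return (capital_per_kw * fixed_charge_rate / mwh_per_kw_year
                    + fuel_per_mwh + om_per_mwh)

        # Placeholder menu: low-capital-cost ALWR under IPP financing versus a
        # utility-owned gas combined cycle facing a high gas price.
        alwr = lcoe(1500, 0.12, 0.85, fuel_per_mwh=6, om_per_mwh=10)
        ccgt = lcoe(600, 0.10, 0.80, fuel_per_mwh=35, om_per_mwh=4)
        print(f"ALWR {alwr:.0f} $/MWh vs CCGT {ccgt:.0f} $/MWh")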

  9. Transmission dynamics of Bacillus thuringiensis infecting Plodia interpunctella: a test of the mass action assumption with an insect pathogen.

    Science.gov (United States)

    Knell, R J; Begon, M; Thompson, D J

    1996-01-22

    Central to theoretical studies of host-pathogen population dynamics is a term describing transmission of the pathogen. This usually assumes that transmission is proportional to the density of infectious hosts or particles and of susceptible individuals. We tested this assumption with the bacterial pathogen Bacillus thuringiensis infecting larvae of Plodia interpunctella, the Indian meal moth. Transmission was found to increase in a more than linear way with host density in fourth and fifth instar P. interpunctella, and to decrease with the density of infectious cadavers in the case of fifth instar larvae. Food availability was shown to play an important part in this process. Therefore, on a number of counts, the usual assumption was found not to apply in our experimental system.
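
    In the standard formulation the transmission term is mass action, and departures of the kind reported here are often accommodated with power-law exponents (notation assumed, following common host-pathogen modelling practice):

        \frac{dS}{dt} = -\beta S I
        \qquad\text{versus}\qquad
        \frac{dS}{dt} = -\beta S^{p} I^{q},

    where p > 1 would capture the more-than-linear increase with host density and q < 1 the decline with infectious-cadaver density observed for fifth instars.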

  10. Relativity effects for space-based coherent lidar experiments

    Science.gov (United States)

    Gudimetla, V. S. Rao

    1996-01-01

    An effort was initiated last year in the Astrionics Laboratory at Marshall Space Flight Center to examine and incorporate, if necessary, the effects of relativity in the design of space-based lidar systems. A space-based lidar system, named AEOLUS, is under development at Marshall Space Flight Center and it will be used to accurately measure atmospheric wind profiles. Effects of relativity were also observed in the performance of space-based systems, for example in the case of global positioning systems, and corrections were incorporated into the design of instruments. During the last summer, the effects of special relativity on the design of space-based lidar systems were studied in detail, by analyzing the problem of laser scattering off a fixed target when the source and a co-located receiver are moving on a spacecraft. Since the proposed lidar system uses a coherent detection system, errors even on the order of a few microradians must be corrected to achieve a good signal-to-noise ratio. Previous analysis assumed that the ground is flat and the spacecraft is moving parallel to the ground, and developed analytical expressions for the location, direction and Doppler shift of the returning radiation. Because of the assumptions used in that analysis, only special relativity effects were involved. In this report, that analysis is extended to include general relativity and to calculate its effects on the design.
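
    The order of magnitude of the effect follows from the classical aberration of light: for a transmitter and co-located receiver moving at orbital speed v, the apparent direction of the return shifts by roughly (a back-of-envelope estimate, not a figure from the report)

        \theta \approx \frac{2v}{c}
        \approx \frac{2 \times 7.5\ \mathrm{km/s}}{3 \times 10^{5}\ \mathrm{km/s}}
        = 5 \times 10^{-5}\ \mathrm{rad} = 50\ \mu\mathrm{rad},

    with the factor of 2 accounting for the round trip - well above the few-microradian pointing tolerance quoted above for coherent detection.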

  11. Issues related to a programme of activities under the CDM

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, J.

    2006-05-15

    Emissions of CO2 from the energy and land-use change and forestry sectors are responsible for the majority of emissions in non-Annex I Parties to the UNFCCC. Tackling greenhouse gas (GHG) emissions from these sectors is a key to slowing the growth in GHG emissions in non-Annex I countries. Implementing Clean Development Mechanism (CDM) projects can help achieve this aim, while also assisting non-Annex I countries to move towards sustainable development and Annex I countries to achieve their emission commitments under the Kyoto Protocol. There has been rapid progress in the CDM over the last year - in terms of the number of projects in the pipeline and registered, and in terms of credits issued. However, some important sectors are notable by their small share in the CDM portfolio. Several countries have also called attention to the need to accelerate the process of approving CDM methodologies and projects. In order to improve the effectiveness of the CDM in achieving its dual objectives, the COP/MOP agreed a decision on 'further guidance relating to the clean development mechanism'. This decision lays out guidance on how to improve the operation of the CDM, and includes provisions that allow: (1) Bundling of project activities; and (2) Project activities under a programme of activities, to be registered as a CDM project activity. At present, of the 172 currently registered CDM project activities, 27 involve programmes or bundles. These project activities can include more than one project type, be implemented in several locations, and/or occur in more than one sector. This paper assesses how project activities under a programme of activities under the CDM (referred to here as PCDM) could help to increase the effectiveness of the CDM by encouraging a wide spread of emission mitigation activities. This paper also explores the key issues that may need to be considered for the PCDM concept to be further implemented. The paper concludes that: (1) Key concepts and issues

  12. TESTING THE ASSUMPTIONS AND INTERPRETING THE RESULTS OF THE RASCH MODEL USING LOG-LINEAR PROCEDURES IN SPSS

    NARCIS (Netherlands)

    TENVERGERT, E; GILLESPIE, M; KINGMA, J

    This paper shows how to use the log-linear subroutine of SPSS to fit the Rasch model. It also shows how to fit less restrictive models obtained by relaxing specific assumptions of the Rasch model. Conditional maximum likelihood estimation was achieved by including dummy variables for the total

  13. Factors underlying male and female use of violent video games

    OpenAIRE

    Hartmann, T.; Möller, I.; Krause, C.

    2015-01-01

    Research has consistently shown that males play violent video games more frequently than females, but factors underlying this gender gap have not been examined to date. This approach examines the assumption that males play violent video games more because they anticipate more enjoyment and less guilt from engaging in virtual violence than females. This may be because males are less empathetic, tend to morally justify physical violence more and have a greater need for sensation and aggression ...

  14. Washington International Renewable Energy Conference 2008 Pledges: Methodology and Assumptions Summary

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, B.; Bilello, D. E.; Cowlin, S. C.; Mann, M.; Wise, A.

    2008-08-01

    The 2008 Washington International Renewable Energy Conference (WIREC) was held in Washington, D.C., from March 4-6, 2008, and involved nearly 9,000 people from 125 countries. The event brought together worldwide leaders in renewable energy (RE) from governments, international organizations, nongovernmental organizations, and the private sector to discuss the role that renewables can play in alleviating poverty, growing economies, and passing on a healthy planet to future generations. The conference concluded with more than 140 governments, international organizations, and private-sector representatives pledging to advance the uptake of renewable energy. The U.S. government authorized the National Renewable Energy Laboratory (NREL) to estimate the carbon dioxide (CO2) savings that would result from the pledges made at the 2008 conference. This report describes the methodology and assumptions used by NREL in quantifying the potential CO2 reductions derived from those pledges.
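
    The core arithmetic behind such an estimate is avoided generation multiplied by an emission factor. A toy version with placeholder figures (not NREL's actual factors or the 2008 pledge data):

        # Pledged capacity -> generation -> avoided CO2. All values hypothetical.
        pledges_mw = {"wind": 5000, "solar_pv": 2000, "biomass": 800}
        capacity_factor = {"wind": 0.35, "solar_pv": 0.20, "biomass": 0.70}
        grid_ef = 0.6          # tonnes CO2 per MWh displaced (placeholder)

        avoided = sum(mw * capacity_factor[t] * 8760 * grid_ef
                      for t, mw in pledges_mw.items())
        print(f"avoided CO2: {avoided / 1e6:.1f} Mt/year")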

  15. Understanding how biodiversity unfolds through time under neutral theory.

    Science.gov (United States)

    Missa, Olivier; Dytham, Calvin; Morlon, Hélène

    2016-04-05

    Theoretical predictions for biodiversity patterns are typically derived under the assumption that ecological systems have reached a dynamic equilibrium. Yet, there is increasing evidence that various aspects of ecological systems, including (but not limited to) species richness, are not at equilibrium. Here, we use simulations to analyse how biodiversity patterns unfold through time. In particular, we focus on the relative time required for various biodiversity patterns (macroecological or phylogenetic) to reach equilibrium. We simulate spatially explicit metacommunities according to the Neutral Theory of Biodiversity (NTB) under three modes of speciation, which differ in how evenly a parent species is split between its two daughter species. We find that species richness stabilizes first, followed by species area relationships (SAR) and finally species abundance distributions (SAD). The difference in timing of equilibrium between these different macroecological patterns is the largest when the split of individuals between sibling species at speciation is the most uneven. Phylogenetic patterns of biodiversity take even longer to stabilize (tens to hundreds of times longer than species richness) so that equilibrium predictions from neutral theory for these patterns are unlikely to be relevant. Our results suggest that it may be unwise to assume that biodiversity patterns are at equilibrium and provide a first step in studying how these patterns unfold through time. © 2016 The Author(s).
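
    A minimal zero-sum neutral community can be simulated in a few lines. The sketch below is non-spatial and uses one standard speciation mode (point mutation), unlike the study's spatially explicit metacommunities and its three split modes, but it shows how a pattern such as species richness can be tracked toward equilibrium:

        import random
        from collections import Counter

        def neutral_step(community, nu, next_id):
            """One zero-sum death/replacement event, with speciation at rate nu."""
            i = random.randrange(len(community))        # individual that dies
            if random.random() < nu:
                community[i] = next_id                  # point-mutation speciation
                return next_id + 1
            j = random.randrange(len(community))        # parent of the replacement
            while j == i:
                j = random.randrange(len(community))
            community[i] = community[j]
            return next_id

        random.seed(42)
        J, nu = 2000, 1e-3
        community, next_id = [0] * J, 1                 # start from one species
        for step in range(200 * J):                     # 200 generations
            next_id = neutral_step(community, nu, next_id)
            if step % (40 * J) == 0:
                print(f"gen {step // J:3d}: richness {len(Counter(community))}")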

  16. bcc transition metals under pressure: results from ultrasonic interferometry and diamond-cell experiments

    International Nuclear Information System (INIS)

    Katahara, K.W.; Manghnani, M.H.; Ming, L.C.; Fisher, E.S.

    1976-01-01

    Hydrostatic pressure derivatives of the single-crystal elastic moduli, dC_ij/dP, have been measured ultrasonically for b.c.c. Nb-Mo and Ta-W solid solutions. The composition dependence of various electronic properties of these alloys is known to be reasonably well approximated by a rigid-electron-band filling model where e/a, the electron per atom ratio, is the primary parameter. The results indicate that the elastic moduli and their pressure derivatives may also be calculated in such a model. In particular, the dC_ij/dP show relatively sharp increases at e/a compositions of 5.4 for Nb-Mo and 5.7 for Ta-W. Both compositions correspond to changes in Fermi surface topology, as deduced from existing band calculations and the rigid band assumption. The results are discussed in the light of related electronic properties and possible geophysical applications. A comparison is also made between ultrasonic results and X-ray diffraction data for Nb. Using a diamond-anvil pressure cell, compression of Nb was determined by X-ray diffraction up to 55 kbar in a liquid medium under purely hydrostatic conditions, and up to 175 kbar in a solid medium under nonhydrostatic conditions. The data obtained under hydrostatic conditions agree well with the ultrasonic equation of state and shock wave data, whereas the nonhydrostatic results tend to imply either a higher bulk modulus K_s or a higher (∂K_s/∂P)_T

  17. The Concept of a Neutral Starting Point in Thomistic Metaphysics and its Relationship to the Assumptive Character of Knowledge

    Directory of Open Access Journals (Sweden)

    Piotr Duchliński

    2015-12-01

    Full Text Available This article describes how certain Thomists understand the concept of a neutral (i.e., assumption-free) starting point, and outlines their arguments in favour of it. To be sure, within current epistemology their position is considered outdated and unpopular: apart from Thomists, nobody would now argue that there is a privileged, assumption-free starting point for philosophy. (After all, it is generally thought that any such thing would simply fail to yield determinate results, at least where human cognition or knowledge is concerned.) This article, though, poses, and seeks to answer, the question of whether the Thomistic position is intended merely as a methodological commitment, or also as a stage in the pragmatic construction of a metaphysical theory. In the light of a discussion that confronts this with some points from contemporary philosophy of science concerning the theorizing of experience and the assumptive character of scientific knowledge, the author puts forward the following hypothesis as a point of departure for further inquiry: that philosophy itself be understood as a paradigm, in the sense that it be taken to function as a benchmark - a model responsible for organizing human experience as a whole. The considerations presented are of a theoretical-cognitive character, with no claims made regarding the issue of the correct starting point for philosophy.

  18. Live tissue imaging shows reef corals elevate pH under their calcifying tissue relative to seawater.

    Directory of Open Access Journals (Sweden)

    Alexander Venn

    Full Text Available The threat posed to coral reefs by changes in seawater pH and carbonate chemistry (ocean acidification) raises the need for a better mechanistic understanding of physiological processes linked to coral calcification. Current models of coral calcification argue that corals elevate extracellular pH under their calcifying tissue relative to seawater to promote skeleton formation, but pH measurements taken from the calcifying tissue of living, intact corals have not been achieved to date. We performed live tissue imaging of the reef coral Stylophora pistillata to determine extracellular pH under the calcifying tissue and intracellular pH in calicoblastic cells. We worked with actively calcifying corals under flowing seawater and show that extracellular pH (pHe) under the calicoblastic epithelium is elevated by ∼0.5 and ∼0.2 pH units relative to the surrounding seawater in light and dark conditions, respectively. By contrast, the intracellular pH (pHi) of the calicoblastic epithelium remains stable in the light and dark. Estimates of aragonite saturation states derived from our data indicate the elevation in subcalicoblastic pHe favours calcification and may thus be a critical step in the calcification process. However, the observed close association of the calicoblastic epithelium with the underlying crystals suggests that the calicoblastic cells influence the growth of the coral skeleton by other processes in addition to pHe modification. The procedure used in the current study provides a novel, tangible approach for future investigations into these processes and the impact of environmental change on the cellular mechanisms underpinning coral calcification.

  19. Analysis of the operator action and the single failure criteria in a SGTR sequence using best estimate assumptions with TRACE 5.0

    International Nuclear Information System (INIS)

    Jimenez, G.; Queral, C.; Rebollo-Mena, M.J.; Martínez-Murillo, J.C.; Lopez-Alonso, E.

    2013-01-01

    Highlights: ► Several worldwide SGTR deterministic safety analysis methodologies were analyzed to study the different assumptions made. ► The most relevant assumptions for operator actions and the single failure criterion are compared through TRACE simulations. ► The offsite dose results showed the adequacy of the assumptions included in the analyzed methodologies. ► Two proposals for changes to the SGTR Emergency Operating Procedure are outlined to minimize the dose in case of a SGTR accident. - Abstract: Steam Generator Tube Rupture (SGTR) sequences in Pressurized Water Reactors are known to be among the most demanding transients for the operating crew. SGTR sequences are a special kind of transient, as they could lead to radiological releases without core damage or containment failure, since they can constitute a direct path from the reactor coolant system to the environment. The first methodology used to perform the Deterministic Safety Analysis (DSA) of a SGTR did not credit operator action for the first 30 min of the transient, assuming that the operating crew was able to stop the primary-to-secondary leakage within that period of time. However, the SGTR accidents that have occurred in the USA and elsewhere demonstrated that operators usually take more than 30 min to stop the leakage in actual sequences. Several methodologies were developed to overcome that fact, considering operator actions from the beginning of the transient, as is done in Probabilistic Safety Analysis. This paper presents the results of comparing different assumptions regarding the single failure criterion and the operator actions taken from the most common methodologies included in the different Deterministic Safety Analyses. One single failure criterion that has not been analysed previously in the literature is also proposed and analysed here. The comparison is done with a PWR Westinghouse three-loop model in the TRACE code (Almaraz NPP) with best estimate assumptions but including

  20. A hybrid Dantzig-Wolfe, Benders decomposition and column generation procedure for multiple diet production planning under uncertainties

    Science.gov (United States)

    Udomsungworagul, A.; Charnsethikul, P.

    2018-03-01

    This article introduces a methodology to solve large-scale two-phase linear programs, with a case study of multiple-time-period animal diet problems under uncertainty in both the nutrient content of raw materials and finished-product demand. Assumptions allowing multiple product formulas to be manufactured in the same time period, and allowing raw-material and finished-product inventory to be held, have been added. Dantzig-Wolfe decomposition, Benders decomposition and column generation techniques have been combined and applied to solve the problem. The proposed procedure was programmed using VBA and the Solver tool in Microsoft Excel. A case study was used and tested in terms of efficiency and effectiveness trade-offs.
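
    Column generation on a diet-style LP can be illustrated compactly. The sketch below is a deliberately simplified single-period, deterministic version with made-up nutrient data, not the paper's two-phase stochastic model (which is implemented in VBA and Excel Solver); it alternates between a restricted master problem and a pricing step that adds blends with negative reduced cost:

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical data: raw material costs, nutrient fractions, supplies.
        cost = np.array([200.0, 350.0, 120.0])            # $/tonne of material
        nutrient = np.array([[0.08, 0.40, 0.02],          # protein fraction
                             [0.03, 0.05, 0.01]])         # fat fraction
        supply = np.array([60.0, 25.0, 100.0])            # tonnes available
        products = {"grower":   ([0.16, 0.030], 40.0),    # (nutrient minimums,
                    "finisher": ([0.12, 0.020], 50.0)}    #  demand in tonnes)
        prods = list(products)

        def cheapest_blend(adjusted_cost, minimums):
            """Pricing subproblem: min-cost blend (fractions summing to 1)
            meeting the nutrient minimums, under dual-adjusted costs."""
            res = linprog(adjusted_cost,
                          A_ub=-nutrient, b_ub=-np.asarray(minimums),
                          A_eq=np.ones((1, len(cost))), b_eq=[1.0],
                          bounds=[(0, 1)] * len(cost), method="highs")
            return res.x, res.fun

        # Artificial columns (no material use, huge cost) keep master feasible.
        columns = [(p, np.zeros(len(cost)), 1e5) for p in prods]

        for it in range(30):
            # Restricted master: material capacity rows, then demand rows.
            A_ub = [[col[1][r] for col in columns] for r in range(len(supply))] \
                 + [[-1.0 if col[0] == p else 0.0 for col in columns]
                    for p in prods]
            b_ub = list(supply) + [-products[p][1] for p in prods]
            master = linprog([c for _, _, c in columns], A_ub=A_ub, b_ub=b_ub,
                             bounds=[(0, None)] * len(columns), method="highs")
            mu = master.ineqlin.marginals[:len(supply)]   # material duals, <= 0
            pi = -master.ineqlin.marginals[len(supply):]  # demand duals, >= 0

            # Pricing: add every product column with negative reduced cost.
            priced = [(p, *cheapest_blend(cost - mu, products[p][0]))
                      for p in prods]
            new_cols = [(p, w, float(cost @ w)) for p, w, adj in priced
                        if adj - pi[prods.index(p)] < -1e-7]
            if not new_cols:
                break
            columns.extend(new_cols)

        print(f"optimal cost ${master.fun:,.0f} with {len(columns)} columns, "
              f"{it + 1} master solves")

    The artificial columns are a standard big-cost device to guarantee master feasibility before enough real blends have been generated; the paper's addition of multiple periods, inventory links and uncertainty is what motivates layering Benders decomposition on top of this loop.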