WorldWideScience

Sample records for non-predictive control assumption

  1. Catalyst Deactivation: Control Relevance of Model Assumptions

    Directory of Open Access Journals (Sweden)

    Bernt Lie

    2000-10-01

    Full Text Available Two principles for describing catalyst deactivation are discussed, one based on the deactivation mechanism, the other based on the activity and catalyst age distribution. When the model is based upon activity decay, it is common to use a mean activity developed from the steady-state residence time distribution. We compare control-relevant properties of such an approach with those of a model based upon the deactivation mechanism. Using a continuous stirred tank reactor as an example, we show that the mechanistic approach and the population balance approach lead to identical models. However, common additional assumptions used for activity-based models lead to model properties that may deviate considerably from the correct one.
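
    As a worked illustration of the activity-based approach discussed above (a sketch assuming first-order deactivation in an ideal CSTR; the decay constant $k_d$ and mean residence time $\tau$ are illustrative symbols, not values from the paper): with activity $a(t) = e^{-k_d t}$ and steady-state residence time distribution $E(t) = e^{-t/\tau}/\tau$, the mean activity is

    $$\bar{a} = \int_0^\infty a(t)\,E(t)\,dt = \frac{1}{1 + k_d \tau}.$$

    Collapsing the full catalyst age distribution into this single mean value is precisely the kind of additional assumption whose control-relevant consequences the record above examines.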

  2. Deep Borehole Field Test Requirements and Controlled Assumptions.

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This document presents design requirements and controlled assumptions intended for use in the engineering development and testing of: 1) prototype packages for radioactive waste disposal in deep boreholes; 2) a waste package surface handling system; and 3) a subsurface system for emplacing and retrieving packages in deep boreholes. Engineering development and testing is being performed as part of the Deep Borehole Field Test (DBFT; SNL 2014a). This document presents parallel sets of requirements for a waste disposal system and for the DBFT, showing the close relationship. In addition to design, it will also inform planning for drilling, construction, and scientific characterization activities for the DBFT. The information presented here follows typical preparations for engineering design. It includes functional and operating requirements for handling and emplacement/retrieval equipment, waste package design and emplacement requirements, borehole construction requirements, sealing requirements, and performance criteria. Assumptions are included where they could impact engineering design. Design solutions are avoided in the requirements discussion.

  3. SPC without Control Limits and Normality Assumption: A New Method

    Science.gov (United States)

    Vazquez-Lopez, J. A.; Lopez-Juarez, I.

    Control Charts (CC) are important Statistical Process Control (SPC) tools developed in the 1920s to control and improve the quality of industrial production. The use of CC requires visual inspection and human judgement to diagnose the process quality properly. CC assume a normal distribution in the observed variables in order to establish the control limits. However, this is a requirement difficult to meet in practice, since skewed distributions are commonly observed. In this research, a novel method that requires neither control limits nor data normality is presented. The core of the method is the FuzzyARTMAP (FAM) Artificial Neural Network (ANN), which learns special and non-special patterns of variation and whose internal parameters are determined through experimental design to increase its efficiency. The proposed method was implemented successfully in a manufacturing process, determining the statistical control state and thereby validating our method.
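
    For contrast with the limit-free method proposed above, a minimal sketch of the conventional, normality-based limits that the paper avoids (illustrative Python; the 3-sigma rule gives its nominal 0.27% false-alarm rate only under normality):

    ```python
    import numpy as np

    def shewhart_limits(x):
        """Classical 3-sigma control limits: valid only for ~normal data."""
        return x.mean() - 3 * x.std(ddof=1), x.mean() + 3 * x.std(ddof=1)

    rng = np.random.default_rng(0)
    normal = rng.normal(10.0, 1.0, 100_000)
    skewed = rng.lognormal(0.0, 0.75, 100_000)  # skewed, as often seen in practice

    for name, x in (("normal", normal), ("skewed", skewed)):
        lcl, ucl = shewhart_limits(x)
        rate = np.mean((x < lcl) | (x > ucl))   # false alarms on in-control data
        print(f"{name}: false-alarm rate {rate:.4f} (nominal 0.0027)")
    ```

    On the skewed data the false-alarm rate departs markedly from the nominal value, which is exactly the failure mode motivating a method free of control limits and normality assumptions.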

  4. Controlling in networking organisations – the concept and assumptions

    Directory of Open Access Journals (Sweden)

    Bieńkowska Agnieszka

    2014-05-01

    Full Text Available The article characterizes the essence and features of the networking organisation. In the context of the specific mode of cooperation between organisations that are partners in a network, it points to a strong need for coordinating the activities of the individual entities in order to realize jointly agreed objectives, and proposes controlling as a method supporting the efficient management of a networking organisation. The evolution of the controlling concept is presented, from strategic controlling, through partnership controlling, towards controlling in networking organisations. The notion and tasks of controlling in networking organisations (network controlling) are defined, and an outline of its functional, organisational and instrumental solutions is sketched.

  5. Predicting behaviour from perceived behavioural control: tests of the accuracy assumption of the theory of planned behaviour.

    Science.gov (United States)

    Sheeran, Paschal; Trafimow, David; Armitage, Christopher J

    2003-09-01

    The theory of planned behaviour assumes that the accuracy of perceived behavioural control (PBC) determines the strength of the PBC-behaviour relationship. However, this assumption has never been formally tested. The present research developed and validated a proxy measure of actual control (PMAC) in order to test the assumption. In two studies, participants completed measures of intention and PBC, and subsequently completed measures of behaviour and the PMAC. Validity of the PMAC was established by findings showing (a) that the PMAC moderated the intention-behaviour relation, and (b) that PMAC scores did not reflect attributions for participants' failure to enact their stated intentions. Accuracy was operationalized as the difference between PBC and PMAC scores. Consistent with theoretical expectations, several analyses indicated that greater accuracy of PBC was associated with improved prediction of behaviour by PBC.
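
    A toy illustration of the accuracy assumption (synthetic data and simplified continuous variables, not the study's Likert-scale measures): when PBC closely tracks a proxy for actual control, it predicts behaviour better.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 2000
    actual = rng.normal(0, 1, n)                 # proxy for actual control (PMAC)
    pbc = actual + rng.normal(0, 1, n)           # perceived behavioural control
    behaviour = actual + rng.normal(0, 0.5, n)   # behaviour follows actual control

    accuracy = -np.abs(pbc - actual)             # higher = more accurate PBC
    accurate = accuracy > np.median(accuracy)
    for label, mask in (("accurate PBC", accurate), ("inaccurate PBC", ~accurate)):
        r = np.corrcoef(pbc[mask], behaviour[mask])[0, 1]
        print(f"{label}: r(PBC, behaviour) = {r:.2f}")
    ```

    The PBC-behaviour correlation is visibly stronger in the accurate half, mirroring the finding that greater accuracy of PBC improves prediction of behaviour by PBC.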

  6. Early Validation of Automation Plant Control Software using Simulation Based on Assumption Modeling and Validation Use Cases

    Directory of Open Access Journals (Sweden)

    Veronika Brandstetter

    2015-10-01

    Full Text Available In automation plants, technical processes must be conducted in a way that products, substances, or services are produced reliably, with sufficient quality and with minimal strain on resources. A key driver in conducting these processes is the automation plant's control software, which controls the technical plant components and thereby affects the physical, chemical, and mechanical processes that take place in automation plants. To this end, the control software of an automation plant must adhere to strict process requirements arising from the technical processes and from the physical plant design. Currently, the validation of the control software often starts late in the engineering process, once the automation plant is almost completely constructed. However, as is widely acknowledged, the later the control software of the automation plant is validated, the higher the effort for correcting revealed defects, which can lead to serious budget overruns and project delays. In this article we propose an approach that allows the early validation of automation control software against the technical plant processes and assumptions about the physical plant design by means of simulation. We demonstrate the application of our approach on an actual plant project from the automation industry and present its technical implementation.

  7. Sample Size Calculation: Inaccurate A Priori Assumptions for Nuisance Parameters Can Greatly Affect the Power of a Randomized Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Elsa Tavernier

    Full Text Available We aimed to examine the extent to which inaccurate assumptions for nuisance parameters used to calculate sample size can affect the power of a randomized controlled trial (RCT). In a simulation study, we separately considered an RCT with continuous, dichotomous or time-to-event outcomes, with associated nuisance parameters of standard deviation, success rate in the control group and survival rate in the control group at some time point, respectively. For each type of outcome, we calculated a required sample size N for a hypothesized treatment effect, an assumed nuisance parameter and a nominal power of 80%. We then assumed a nuisance parameter associated with a relative error at the design stage. For each type of outcome, we randomly drew 10,000 relative errors of the associated nuisance parameter (from empirical distributions derived from a previously published review). Then, retro-fitting the sample size formula, we derived, for the pre-calculated sample size N, the real power of the RCT, taking into account the relative error for the nuisance parameter. In total, 23%, 0% and 18% of RCTs with continuous, binary and time-to-event outcomes, respectively, were underpowered (i.e., the real power fell well below the nominal 80%). Even with proper calculation of sample size, a substantial number of trials are underpowered or overpowered because of imprecise knowledge of nuisance parameters. Such findings raise questions about how sample size for RCTs should be determined.
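
    A minimal sketch of the retro-fitting idea for the continuous-outcome case (hypothetical numbers and a z-approximation, not the paper's exact machinery): fix the design from an assumed SD, then recompute power under the true SD.

    ```python
    import numpy as np
    from scipy.stats import norm

    alpha, nominal_power = 0.05, 0.80
    delta, sd_assumed = 5.0, 10.0            # effect and assumed nuisance parameter
    z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(nominal_power)

    # Two-sample size per arm implied by the assumed SD
    n = int(np.ceil(2 * ((z_a + z_b) * sd_assumed / delta) ** 2))

    for sd_true in (8.0, 10.0, 12.0, 15.0):  # relative error in the nuisance parameter
        real_power = norm.cdf(delta * np.sqrt(n / 2) / sd_true - z_a)
        print(f"true SD {sd_true:4.1f}: n/arm = {n}, real power = {real_power:.2f}")
    ```

    With the SD understated by a third (true SD 15 vs. assumed 10), the real power drops from the nominal 80% to roughly 46%, illustrating how sensitive power is to the nuisance-parameter guess.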

  8. Environment Assumptions for Synthesis

    CERN Document Server

    Chatterjee, Krishnendu; Jobstmann, Barbara

    2008-01-01

    The synthesis problem asks to construct a reactive finite-state system from an $\omega$-regular specification. Initial specifications are often unrealizable, which means that there is no system that implements the specification. A common reason for unrealizability is that assumptions on the environment of the system are incomplete. We study the problem of correcting an unrealizable specification $\phi$ by computing an environment assumption $\psi$ such that the new specification $\psi \to \phi$ is realizable. Our aim is to construct an assumption $\psi$ that constrains only the environment and is as weak as possible. We present a two-step algorithm for computing assumptions. The algorithm operates on the game graph that is used to answer the realizability question. First, we compute a safety assumption that removes a minimal set of environment edges from the graph. Second, we compute a liveness assumption that puts fairness conditions on some of the remaining environment edges. We show that the problem of finding...
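
    A minimal sketch of the safety-assumption idea on a toy game graph (hypothetical encoding; the paper computes a minimal set of environment edges on the realizability game, which is harder than this hand-picked illustration):

    ```python
    def safety_winning(edges, owner, safe):
        """States from which the system can keep the play inside `safe` forever."""
        W = set(safe)
        changed = True
        while changed:
            changed = False
            for v in list(W):
                succ = edges[v]
                ok = (any(u in W for u in succ) if owner[v] == "sys"
                      else all(u in W for u in succ))
                if not ok:
                    W.discard(v)
                    changed = True
        return W

    # Toy graph: the environment can force the play into the unsafe state 'bad'.
    owner = {"e0": "env", "s0": "sys", "e1": "env", "bad": "sys"}
    edges = {"e0": ["s0"], "s0": ["e1"], "e1": ["s0", "bad"], "bad": ["bad"]}
    safe = set(owner) - {"bad"}

    print(safety_winning(edges, owner, safe))            # empty set: unrealizable
    edges_assumed = dict(edges, e1=["s0"])               # forbid env edge e1 -> bad
    print(safety_winning(edges_assumed, owner, safe))    # all safe states now win
    ```

    Removing the single environment edge e1 -> bad plays the role of the safety assumption: under that assumption the specification becomes realizable.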

  9. Disastrous assumptions about community disasters

    Energy Technology Data Exchange (ETDEWEB)

    Dynes, R.R. [Univ. of Delaware, Newark, DE (United States). Disaster Research Center

    1995-12-31

    Planning for local community disasters is compounded with erroneous assumptions. Six problematic models are identified: agent facts, big accident, end of the world, media, command and control, administrative. Problematic assumptions in each of them are identified, and a more adequate model centered on problem solving is proposed. That there is a discrepancy between disaster planning efforts and the actual response experience seems rather universal. That discrepancy is symbolized by the graffiti which predictably surfaces on many walls in post-disaster locations: "First the earthquake, then the disaster." That contradiction is seldom reduced as a result of post-disaster critiques, since the most usual conclusion is that the plan was adequate but the "people" did not follow it. Another explanation is provided here: a more plausible explanation for failure is that most planning efforts adopt a number of erroneous assumptions which affect the outcome. Those assumptions are infrequently changed or modified by experience.

  10. The Control-Value Theory of Achievement Emotions: Assumptions, Corollaries, and Implications for Educational Research and Practice

    Science.gov (United States)

    Pekrun, Reinhard

    2006-01-01

    This article describes the control-value theory of achievement emotions and its implications for educational research and practice. The theory provides an integrative framework for analyzing the antecedents and effects of emotions experienced in achievement and academic settings. It is based on the premise that appraisals of control and values are…

  11. Methodology and Assumptions of Contingency Shuttle Crew Support (CSCS) Calculations Using ISS Environmental Control and Life Support Systems

    Science.gov (United States)

    Prokhorov, Kimberlee; Shkedi, Brienne

    2006-01-01

    The current International Space Station (ISS) Environmental Control and Life Support (ECLS) system is designed to support an ISS crew size of three people. The capability to expand that system to support nine crew members during a Contingency Shuttle Crew Support (CSCS) scenario has been evaluated. This paper describes how the ISS ECLS systems may be operated for supporting CSCS, and the durations expected for the oxygen supply and carbon dioxide control subsystems.

  12. Implications of alternative assumptions regarding future air pollution control in scenarios similar to the Representative Concentration Pathways

    NARCIS (Netherlands)

    Chuwah, C.; van Noije, T.; van Vuuren, D.P.; Hazeleger, W.; Strunk, A.; Deetman, S.; Beltran, A.M.; van Vliet, J.

    2013-01-01

    The uncertain, future development of emissions of short-lived trace gases and aerosols forms a key factor for future air quality and climate forcing. The Representative Concentration Pathways (RCPs) only explore part of this range, as they all assume that worldwide ambitious air pollution control policies will be implemented.

  13. Advanced lung ventilation system (ALVS) with linear respiratory mechanics assumption for waveform optimization of dual-controlled ventilation.

    Science.gov (United States)

    Montecchia, F; Guerrisi, M; Canichella, A

    2007-03-01

    The present paper describes the functional features of an advanced lung ventilation system (ALVS) designed for the optimization of conventional dual-controlled ventilation (DCV), i.e. pressure-controlled ventilation with ensured tidal or minute volume. Considering the particular clinical conditions of patients treated with controlled ventilation, the analysis and synthesis of the ALVS control have been performed assuming linear respiratory mechanics. Moreover, new airway pressure waveforms with a more physiological shape can be tested on simulators of the respiratory system in order to evaluate their clinical application. This is obtained through a compensation procedure that makes the desired airway pressure waveform independent of variations in patient airway resistance and lung compliance, along with complete real-time monitoring of the respiratory system parameters that guide the ventilator settings. The experimental results obtained with a lung simulator agree with the theoretical ones and show that ALVS is useful for research aimed at improving both diagnostic evaluation and therapeutic outcome in mechanical ventilation treatments.
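
    The linear respiratory mechanics assumption referenced above is the standard single-compartment equation of motion (a generic textbook form, not necessarily the paper's notation), with airway resistance $R$ and lung compliance $C$ treated as constants:

    $$P_{aw}(t) = R\,\dot{V}(t) + \frac{V(t)}{C} + PEEP$$

    Under this assumption, the pressure waveform required for a desired flow and volume profile can be computed and compensated in real time from estimates of $R$ and $C$, which is what makes the delivered waveform independent of patient-parameter variations.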

  14. Calibration plot for proteomics: A graphical tool to visually check the assumptions underlying FDR control in quantitative experiments.

    Science.gov (United States)

    Giai Gianetto, Quentin; Combes, Florence; Ramus, Claire; Bruley, Christophe; Couté, Yohann; Burger, Thomas

    2016-01-01

    In MS-based quantitative proteomics, FDR control (i.e. limiting the number of proteins that are wrongly claimed as differentially abundant between several conditions) is a major postanalysis step. It is classically achieved thanks to a specific statistical procedure that computes the adjusted p-values of the putative differentially abundant proteins. Unfortunately, such adjustment is conservative only if the p-values are well-calibrated, the false discovery rate being spuriously underestimated otherwise. However, well-calibration is a property that can be violated in some practical cases. To overcome this limitation, we propose a graphical method to straightforwardly and visually assess p-value well-calibration, as well as R code to embed it in any pipeline. All MS data have been deposited in the ProteomeXchange with identifier PXD002370 (http://proteomecentral.proteomexchange.org/dataset/PXD002370).
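
    A minimal sketch of such a calibration check (generic Python rather than the authors' R code): under the null hypothesis, well-calibrated p-values are uniform on [0, 1], so their sorted values should follow the diagonal.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    calibrated = rng.uniform(0, 1, 5000)         # well-calibrated null p-values
    miscalibrated = rng.beta(0.7, 1.0, 5000)     # excess of small p-values

    for label, p in (("calibrated", calibrated), ("miscalibrated", miscalibrated)):
        q = np.sort(p)
        plt.plot(np.linspace(0, 1, q.size), q, label=label)
    plt.plot([0, 1], [0, 1], "k--", lw=1)        # ideal: uniform p-values
    plt.xlabel("uniform quantile")
    plt.ylabel("ordered p-value")
    plt.legend()
    plt.show()
    ```

    Adjusted p-values (e.g. Benjamini-Hochberg) only guarantee the advertised FDR level in the calibrated case; a curve sagging below the diagonal signals anti-conservative p-values and a spuriously underestimated FDR.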

  15. Linking assumptions in amblyopia

    Science.gov (United States)

    LEVI, DENNIS M.

    2017-01-01

    Over the last 35 years or so, there has been substantial progress in revealing and characterizing the many interesting and sometimes mysterious sensory abnormalities that accompany amblyopia. A goal of many of the studies has been to try to make the link between the sensory losses and the underlying neural losses, resulting in several hypotheses about the site, nature, and cause of amblyopia. This article reviews some of these hypotheses, and the assumptions that link the sensory losses to specific physiological alterations in the brain. Despite intensive study, it turns out to be quite difficult to make a simple linking hypothesis, at least at the level of single neurons, and the locus of the sensory loss remains elusive. It is now clear that the simplest notion—that reduced contrast sensitivity of neurons in cortical area V1 explains the reduction in contrast sensitivity—is too simplistic. Considerations of noise, noise correlations, pooling, and the weighting of information also play a critically important role in making perceptual decisions, and our current models of amblyopia do not adequately take these into account. Indeed, although the reduction of contrast sensitivity is generally considered to reflect “early” neural changes, it seems plausible that it reflects changes at many stages of visual processing. PMID:23879956

  16. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests! [Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones] Explaining different arrival times: suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics. Intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source. Delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect. Special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong. This, too, would cause photon velocities to be energy-dependent. Delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect. If we now turn this problem around, then by measuring the arrival time delay between photons of different energies from various astrophysical sources (the further away, the better) we can provide constraints on these...
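
    For reference, the leading-order delay formulas usually quoted in such tests (standard expressions, stated without cosmological-expansion corrections; $D$ is the source distance, $E_{QG}$ the quantum-gravity energy scale, $m_\gamma$ a hypothetical photon rest mass):

    $$\Delta t_{\rm LIV} \approx \frac{E_2 - E_1}{E_{\rm QG}}\,\frac{D}{c}, \qquad \Delta t_{\rm mass} \approx \frac{D}{2c}\,m_\gamma^2 c^4 \left(\frac{1}{E_1^2} - \frac{1}{E_2^2}\right)$$

    Both effects grow with distance, which is why the most distant transients provide the tightest constraints.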

  17. Influence of methylphenidate treatment assumptions on cognitive function in healthy young adults in a double-blind, placebo-controlled trial

    Directory of Open Access Journals (Sweden)

    Mommaerts JL

    2013-08-01

    Full Text Available Background: Increasing numbers of students use stimulants such as methylphenidate (MPH) to improve their study capacity, making them prone to subsequent prolonged drug abuse. This study explored the cognitive effects of MPH in students who either assumed they received MPH or assumed they received a placebo. Methods: In a double-blind, randomized, placebo-controlled trial with a between-subjects design, 21 students were subjected to partial sleep deprivation, receiving no more than 4 hours of sleep the night before they were tested. In the morning, they were given either a placebo or 20 mg of MPH. They then performed free recall verbal tests and Go/No-Go tasks repeatedly, their moods were evaluated using the Profile of Mood States, and their tiredness was assessed using a visual analog scale, with evaluation of vigilance. Results: No significant differences were found between those subjects who received MPH and those who received a placebo. However, significant differences were found between subjects who assumed they had received MPH or had no opinion, and those who assumed they had received a placebo. At three minutes, one hour, and one day after memorizing ten lists of 20 words, those who assumed they had received MPH recalled 54%, 58%, and 54% of the words, respectively, whereas those who assumed they had received placebo only recalled 35%, 37%, and 34%. Conclusion: Healthy, partially sleep-deprived young students who assume they have received 20 mg of MPH experience a substantial placebo effect that improves consolidation of information into long-term memory. This is independent of any pharmacologic effects of MPH, which had no significant effects on verbal memory in this study. This information may be...

  18. Examining Computational Assumptions For Godiva IV

    Energy Technology Data Exchange (ETDEWEB)

    Kirkland, Alexander Matthew [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Jaegers, Peter James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-11

    Over the course of summer 2016, the effects of several computational modeling assumptions with respect to the Godiva IV reactor were examined. The majority of these assumptions pertained to modeling errors in the control rods and burst rod. The Monte Carlo neutron transport code MCNP was used to investigate these modeling changes, primarily by comparing them to the original input deck specifications.

  1. Testing an assumption of the E-Z Reader model of eye-movement control during reading: Using event-related potentials to examine the familiarity check

    NARCIS (Netherlands)

    Reichle, E.D.; Tokowicz, N.; Liu, Y.; Perfetti, C.A.

    2011-01-01

    According to the E-Z Reader model of eye-movement control, the completion of an early stage of lexical processing, the familiarity check, causes the eyes to move forward during reading (Reichle, Pollatsek, Fisher, & Rayner, 1998). Here, we report an event-related potential (ERP) experiment designed to examine the hypothesized familiarity check at the electrophysiological level.

  2. Test of Poisson Failure Assumption.

    Science.gov (United States)

    1982-09-01

    TEST OF POISSON FAILURE ASSUMPTION. Chapter 1. INTRODUCTION. 1.1 Background. In stockage models... precipitates a regular failure pattern; it is also possible that the coding of scheduled vs unscheduled does not reflect what we would expect. Data...

  3. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  4. Testing an assumption of the E-Z Reader model of eye-movement control during reading: using event-related potentials to examine the familiarity check.

    Science.gov (United States)

    Reichle, Erik D; Tokowicz, Natasha; Liu, Ying; Perfetti, Charles A

    2011-07-01

    According to the E-Z Reader model of eye-movement control, the completion of an early stage of lexical processing, the familiarity check, causes the eyes to move forward during reading (Reichle, Pollatsek, Fisher, & Rayner, 1998). Here, we report an event-related potential (ERP) experiment designed to examine the hypothesized familiarity check at the electrophysiological level. The results indicate ERP components modulated by word frequency at the time of the predicted familiarity check. These findings are consistent with the hypothesis that an early stage of lexical processing is linked to the "decisions" about when to move the eyes during reading. Copyright © 2011 Society for Psychophysiological Research.

  5. Modern Cosmology: Assumptions and Limits

    Science.gov (United States)

    Hwang, Jai-Chan

    2012-06-01

    Physical cosmology tries to understand the Universe at large with its origin and evolution. Observational and experimental situations in cosmology do not allow us to proceed purely based on the empirical means. We examine in which sense our cosmological assumptions in fact have shaped our current cosmological worldview with consequent inevitable limits. Cosmology, as other branches of science and knowledge, is a construct of human imagination reflecting the popular belief system of the era. The question at issue deserves further philosophic discussions. In Whitehead's words, ``philosophy, in one of its functions, is the critic of cosmologies.'' (Whitehead 1925).

  6. Modern Cosmology: Assumptions and Limits

    CERN Document Server

    Hwang, Jai-chan

    2012-01-01

    Physical cosmology tries to understand the Universe at large with its origin and evolution. Observational and experimental situations in cosmology do not allow us to proceed purely based on the empirical means. We examine in which sense our cosmological assumptions in fact have shaped our current cosmological worldview with consequent inevitable limits. Cosmology, as other branches of science and knowledge, is a construct of human imagination reflecting the popular belief system of the era. The question at issue deserves further philosophic discussions. In Whitehead's words, "philosophy, in one of its functions, is the critic of cosmologies". (Whitehead 1925)

  7. Challenged assumptions and invisible effects

    DEFF Research Database (Denmark)

    Wimmelmann, Camilla Lawaetz; Vitus, Kathrine; Jervelund, Signe Smith

    2017-01-01

    The study is based on observation of two complete intervention courses and an analysis of the official intervention documents. Findings – This case study exemplifies how the basic normative assumptions behind an immigrant-oriented intervention and the intrinsic power relations therein may be challenged and negotiated by the participants. In particular, the assumed (power) relations inherent in immigrant-oriented educational health interventions, in which immigrants are in a novice position, are challenged, as the immigrants are experienced adults (and parents) in regard to healthcare. The paper proposes that such unexpected conditions for the implementation—different from the assumed conditions—not only challenge the implementation of the intervention but also potentially produce unanticipated yet valuable effects. Research implications – Newly arrived immigrants represent a hugely diverse and heterogeneous group of people with differing values...

  8. Faulty assumptions for repository requirements

    Energy Technology Data Exchange (ETDEWEB)

    Sutcliffe, W G

    1999-06-03

    Long term performance requirements for a geologic repository for spent nuclear fuel and high-level waste are based on assumptions concerning water use and subsequent deaths from cancer due to ingesting water contaminated with radioisotopes ten thousand years in the future. This paper argues that the assumptions underlying these requirements are faulty for a number of reasons. First, in light of the inevitable technological progress, including efficient desalination of water, over the next ten thousand years, it is inconceivable that a future society would drill for water near a repository. Second, even today we would not use water without testing its purity. Third, today many types of cancer are curable, and with the rapid progress in medical technology in general, and the prevention and treatment of cancer in particular, it is improbable that cancer caused by ingesting contaminated water will be a significant killer in the far future. This paper reviews the performance requirements for geological repositories and comments on the difficulties in proving compliance in the face of inherent uncertainties. The already tiny long-term risk posed by a geologic repository is presented and contrasted with contemporary everyday risks. A number of examples of technological progress, including cancer treatments, are advanced. The real and significant costs resulting from the overly conservative requirements are then assessed. Examples are given of how money (and political capital) could be put to much better use to save lives today and in the future. It is concluded that although a repository represents essentially no long-term risk, monitored retrievable dry storage (above or below ground) is the current best alternative for spent fuel and high-level nuclear waste.

  9. On testing the missing at random assumption

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    Most approaches to learning from incomplete data are based on the assumption that unobserved values are missing at random (MAR). While the MAR assumption, as such, is not testable, it can become testable in the context of other distributional assumptions, e.g. the naive Bayes assumption. In this paper we investigate a method for testing the MAR assumption in the presence of other distributional constraints. We present methods to (approximately) compute a test statistic consisting of the ratio of two profile likelihood functions. This requires the optimization of the likelihood under no assumptions on the missingness mechanism, for which we use our recently proposed AI & M algorithm. We present experimental results on synthetic data that show that our approximate test statistic is a good indicator for whether data is MAR relative to the given distributional assumptions.

  10. How Symmetrical Assumptions Advance Strategic Management Research

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Hallberg, Niklas L.

    2014-01-01

    We develop the case for symmetrical assumptions in strategic management theory. Assumptional symmetry obtains when assumptions made about certain actors and their interactions in one of the application domains of a theory are also made about this set of actors and their interactions in other application domains...

  11. Beyond the crystal ball assumption

    DEFF Research Database (Denmark)

    Vaucouleur, Sebastien

    2008-01-01

    We introduce the Eggther framework for customization of evolvable software products in general and ERP systems in particular. Our approach is based on the concept of code query by example. Customization trades control for flexibility; unfortunately, it also makes the customized software product very sensitive to upgrades. We propose a more mitigated solution that does not require accurate anticipation and yet offers some resilience to evolution of the base software product through the use of code quantification. The technology being developed is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the upgrade problem.

  12. The Self in Guidance: Assumptions and Challenges.

    Science.gov (United States)

    Edwards, Richard; Payne, John

    1997-01-01

    Examines the assumptions of "self" made in the professional and managerial discourses of guidance. Suggests that these assumptions obstruct the capacity of guidance workers to explain their own practices. Drawing on contemporary debates over identity, modernity, and postmodernity, argues for a more explicit debate about the self in guidance. (RJM)

  13. Assumptions of Multiple Regression: Correcting Two Misconceptions

    Directory of Open Access Journals (Sweden)

    Matt N. Williams

    2013-09-01

    Full Text Available In 2002, an article entitled "Four assumptions of multiple regression that researchers should always test" by Osborne and Waters was published in PARE. This article has gone on to be viewed more than 275,000 times (as of August 2013), and it is one of the first results displayed in a Google search for "regression assumptions". While Osborne and Waters' efforts in raising awareness of the need to check assumptions when using regression are laudable, we note that the original article contained at least two fairly important misconceptions about the assumptions of multiple regression: firstly, that multiple regression requires the assumption of normally distributed variables; and secondly, that measurement errors necessarily cause underestimation of simple regression coefficients. In this article, we clarify that multiple regression models estimated using ordinary least squares require the assumption of normally distributed errors in order for trustworthy inferences, at least in small samples, but not the assumption of normally distributed response or predictor variables. Secondly, we point out that regression coefficients in simple regression models will be biased (toward zero) estimates of the relationships between variables of interest when measurement error is uncorrelated across those variables, but that when correlated measurement error is present, regression coefficients may be either upwardly or downwardly biased. We conclude with a brief corrected summary of the assumptions of multiple regression when using ordinary least squares.
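
    A minimal simulation of the two corrected points (hypothetical numbers): a skewed predictor is not a violation of OLS assumptions, while uncorrelated measurement error on the predictor biases the simple-regression slope toward zero.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n, beta = 100_000, 2.0
    x = rng.exponential(1.0, n)             # skewed predictor: not a problem for OLS
    y = beta * x + rng.normal(0, 1, n)      # normally distributed *errors*

    def slope(x, y):
        return np.cov(x, y)[0, 1] / np.var(x)

    x_noisy = x + rng.normal(0, 1.0, n)     # uncorrelated measurement error on x
    print("slope, x exact      :", round(slope(x, y), 3))        # ~2.0
    print("slope, x with error :", round(slope(x_noisy, y), 3))  # ~1.0 (attenuated)
    ```

    With error variance equal to the predictor variance the reliability ratio is 0.5, so the estimated slope is attenuated to about half its true value, as the article's corrected summary describes for the uncorrelated-error case.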

  14. Wrong assumptions in the financial crisis

    NARCIS (Netherlands)

    Aalbers, M.B.

    2009-01-01

    Purpose - The purpose of this paper is to show how some of the assumptions about the current financial crisis are wrong because they misunderstand what takes place in the mortgage market. Design/methodology/approach - The paper discusses four wrong assumptions: one related to regulation, one to leverage...

  15. New Cryptosystem Using Multiple Cryptographic Assumptions

    Directory of Open Access Journals (Sweden)

    E. S. Ismail

    2011-01-01

    Full Text Available Problem statement: A cryptosystem is a way for a sender and a receiver to communicate digitally, by which the sender can send the receiver any confidential or private message by first encrypting it using the receiver's public key. Upon receiving the encrypted message, the receiver can confirm the originality of the message's contents using his own secret key. Up to now, most existing cryptosystems have been developed based on a single cryptographic assumption such as factoring, discrete logarithms, quadratic residues or the elliptic curve discrete logarithm. Although these schemes remain secure today, one day in the near future they may be broken if one finds a polynomial algorithm that can efficiently solve the underlying cryptographic assumption. Approach: Motivated by this, we designed a new cryptosystem based on two cryptographic assumptions: quadratic residues and discrete logarithms. We integrated these two assumptions in our encrypting and decrypting equations so that the former depends on one public key whereas the latter depends on one corresponding secret key and two secret numbers. Each of the public and secret keys in our scheme determines the assumptions we use. Results: The newly developed cryptosystem is shown to be secure against three commonly considered algebraic attacks using a heuristic security technique. The efficiency performance of our scheme requires 2Texp + 2Tmul + Thash time complexity for encryption and Texp + 2Tmul + Tsrt time complexity for decryption, and this magnitude of complexity is considered minimal for cryptosystems based on multiple cryptographic assumptions. Conclusion: The new cryptosystem based on multiple cryptographic assumptions offers a greater security level than schemes based on a single cryptographic assumption. The adversary has to solve the two assumptions simultaneously to recover the original message from the received corresponding encrypted message, but this is very unlikely to happen.

  16. Planning the unknown: the simultaneity of predictive and non-predictive entrepreneurial strategies

    NARCIS (Netherlands)

    Kraaijenbrink, Jeroen; Ratinho, Tiago; Groen, Arend J.

    2012-01-01

    Two distinct approaches have emerged to categorize entrepreneurial strategies. While some argue that planning is beneficial for entrepreneurs, a growing body of literature argues that non-predictive strategies can also lead to successful outcomes. The effectuation framework has gained attention and it is...

  17. A Comparison of Closed World Assumptions

    Institute of Scientific and Technical Information of China (English)

    沈一栋

    1992-01-01

    In this paper, we introduce a notion of the family of closed world assumptions and compare several well-known closed world approaches in the family with respect to the extent to which an incomplete database is completed.

  18. Shattering world assumptions: A prospective view of the impact of adverse events on world assumptions.

    Science.gov (United States)

    Schuler, Eric R; Boals, Adriel

    2016-05-01

    Shattered Assumptions theory (Janoff-Bulman, 1992) posits that experiencing a traumatic event has the potential to diminish the degree of optimism in the assumptions of the world (assumptive world), which could lead to the development of posttraumatic stress disorder. Prior research assessed the assumptive world with a measure that was recently reported to have poor psychometric properties (Kaler et al., 2008). The current study had 3 aims: (a) to assess the psychometric properties of a recently developed measure of the assumptive world, (b) to retrospectively examine how prior adverse events affected the optimism of the assumptive world, and (c) to measure the impact of an intervening adverse event. An 8-week prospective design with a college sample (N = 882 at Time 1 and N = 511 at Time 2) was used to assess the study objectives. We split adverse events into those that were objectively or subjectively traumatic in nature. The new measure exhibited adequate psychometric properties. The report of a prior objective or subjective trauma at Time 1 was related to a less optimistic assumptive world. Furthermore, participants who experienced an intervening objectively traumatic event evidenced a decrease in optimistic views of the world compared with those who did not experience an intervening adverse event. We found support for Shattered Assumptions theory retrospectively and prospectively using a reliable measure of the assumptive world. We discuss future assessments of the measure of the assumptive world and clinical implications to help rebuild the assumptive world with current therapies.

  19. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.; Wagner, Sandra A.

    2015-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. With the ability to accurately compare different technologies' performance for the same function, managers will be able to make better decisions regarding technology development.

  1. Managerial and Organizational Assumptions in the CMM's

    DEFF Research Database (Denmark)

    2008-01-01

    The CMMs embody managerial and organisational assumptions that derive from thinking about large production and manufacturing organisations (particularly in America) in the late industrial age. Many of the difficulties reported with CMMI can be attributed to basing practice on these assumptions in organisations which have different cultures and management traditions, perhaps in different countries operating different economic and social models. Characterizing CMMI in this way opens the door to another question: are there other sets of organisational and management assumptions which would be better suited to other types of organisations operating in other cultural contexts?

  2. Causal Mediation Analysis: Warning! Assumptions Ahead

    Science.gov (United States)

    Keele, Luke

    2015-01-01

    In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…

  3. Mexican-American Cultural Assumptions and Implications.

    Science.gov (United States)

    Carranza, E. Lou

    The search for presuppositions of a people's thought is not new. Octavio Paz and Samuel Ramos have both attempted to describe the assumptions underlying the Mexican character. Paz described Mexicans as private, defensive, and stoic, characteristics taken to the extreme in the "pachuco." Ramos, on the other hand, described Mexicans as…

  4. The homogeneous marginal utility of income assumption

    NARCIS (Netherlands)

    Demuynck, T.

    2015-01-01

    We develop a test to verify if every agent from a population of heterogeneous consumers has the same marginal utility of income function. This homogeneous marginal utility of income assumption is often (implicitly) used in applied demand studies because it has nice aggregation properties and facilitates...

  5. Culturally Biased Assumptions in Counseling Psychology

    Science.gov (United States)

    Pedersen, Paul B.

    2003-01-01

    Eight clusters of culturally biased assumptions are identified for further discussion from Leong and Ponterotto's (2003) article. The presence of cultural bias demonstrates that cultural bias is so robust and pervasive that it permeates the profession of counseling psychology, even including those articles that effectively attack cultural bias...

  6. Extracurricular Business Planning Competitions: Challenging the Assumptions

    Science.gov (United States)

    Watson, Kayleigh; McGowan, Pauric; Smith, Paul

    2014-01-01

    Business planning competitions [BPCs] are a commonly offered yet under-examined extracurricular activity. Given the extent of sceptical comment about business planning, this paper offers what the authors believe is a much-needed critical discussion of the assumptions that underpin the provision of such competitions. In doing so it is suggested…

  7. The OPERA hypothesis: assumptions and clarifications.

    Science.gov (United States)

    Patel, Aniruddh D

    2012-04-01

    Recent research suggests that musical training enhances the neural encoding of speech. Why would musical training have this effect? The OPERA hypothesis proposes an answer on the basis of the idea that musical training demands greater precision in certain aspects of auditory processing than does ordinary speech perception. This paper presents two assumptions underlying this idea, as well as two clarifications, and suggests directions for future research.

  8. On distributional assumptions and whitened cosine similarities

    DEFF Research Database (Denmark)

    Loog, Marco

    2008-01-01

    Recently, an interpretation of the whitened cosine similarity measure as a Bayes decision rule was proposed (C. Liu, "The Bayes Decision Rule Induced Similarity Measures," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 29, no. 6, pp. 1086-1090, June 2007). This communication makes the observation that some of the distributional assumptions made to derive this measure are very restrictive and, considered simultaneously, even inconsistent.
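
    For concreteness, a small sketch of the measure under discussion (generic conventions; implementations differ in how the covariance is estimated): vectors are whitened with the inverse square root of the covariance and then compared by ordinary cosine similarity.

    ```python
    import numpy as np

    def whitened_cosine(x, y, cov):
        """Cosine similarity after whitening with cov^(-1/2)."""
        evals, evecs = np.linalg.eigh(cov)
        W = evecs @ np.diag(evals ** -0.5) @ evecs.T
        xw, yw = W @ x, W @ y
        return xw @ yw / (np.linalg.norm(xw) * np.linalg.norm(yw))

    rng = np.random.default_rng(0)
    data = rng.multivariate_normal([0, 0], [[2.0, 0.8], [0.8, 1.0]], size=500)
    cov = np.cov(data.T)
    print(whitened_cosine(data[0], data[1], cov))
    ```

    The distributional assumptions questioned above enter when this whitened dot-product geometry is treated as arising from a Bayes decision rule with specific class-conditional densities.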

  9. How to Handle Assumptions in Synthesis

    Directory of Open Access Journals (Sweden)

    Roderick Bloem

    2014-07-01

    Full Text Available The increased interest in reactive synthesis over the last decade has led to many improved solutions but also to many new questions. In this paper, we discuss the question of how to deal with assumptions on environment behavior. We present four goals that we think should be met and review several different possibilities that have been proposed. We argue that each of them falls short in at least one aspect.

  10. Closed World Assumption for Disjunctive Reasoning

    Institute of Scientific and Technical Information of China (English)

    WANG Kewen; ZHOU Lizhu

    2001-01-01

    In this paper, the relationship between argumentation and closed world reasoning for disjunctive information is studied. In particular, the authors propose a simple and intuitive generalization of the closed world assumption (CWA) for general disjunctive deductive databases (with default negation). This semantics, called DCWA, allows a natural argumentation-based interpretation and can be used to represent reasoning for disjunctive information. We compare DCWA with GCWA and prove that DCWA extends Minker's GCWA to the class of disjunctive databases with default negation. Also, we compare our semantics with some related approaches. In addition, the computational complexity of DCWA is investigated.
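
    A brute-force sketch of the minimal-model reasoning involved (illustrative Python for the negation-free GCWA core; DCWA's treatment of default negation goes beyond this toy): an atom may be assumed false iff it belongs to no minimal model.

    ```python
    from itertools import combinations

    atoms = ["a", "b", "c"]
    clauses = [{"a", "b"}]                  # disjunctive database: a OR b

    def is_model(m):
        return all(clause & m for clause in clauses)

    models = [set(c) for r in range(len(atoms) + 1)
              for c in combinations(atoms, r) if is_model(set(c))]
    minimal = [m for m in models if not any(n < m for n in models)]

    assumed_false = [x for x in atoms if all(x not in m for m in minimal)]
    print("minimal models:", minimal)           # [{'a'}, {'b'}]
    print("assumed false :", assumed_false)     # ['c'], but neither 'a' nor 'b'
    ```

    The naive CWA would inconsistently assume both "not a" and "not b" here; reasoning over minimal models is what the GCWA family of semantics, including DCWA, refines.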

  11. Limiting assumptions in molecular modeling: electrostatics.

    Science.gov (United States)

    Marshall, Garland R

    2013-02-01

    Molecular mechanics attempts to represent intermolecular interactions in terms of classical physics. Initial efforts assumed a point charge located at the atom center and coulombic interactions. It has been recognized over multiple decades that simply representing electrostatics with a charge on each atom fails to reproduce the electrostatic potential surrounding a molecule as estimated by quantum mechanics. Molecular orbitals are not spherically symmetrical, an implicit assumption of monopole electrostatics. This perspective reviews recent evidence that requires use of multipole electrostatics and polarizability in molecular modeling.

  12. 39 Questionable Assumptions in Modern Physics

    Science.gov (United States)

    Volk, Greg

    2009-03-01

    The growing body of anomalies in new energy, low energy nuclear reactions, astrophysics, atomic physics, and entanglement, combined with the failure of the Standard Model and string theory to predict many of the most basic fundamental phenomena, all point to a need for major new paradigms. Not Band-Aids, but revolutionary new ways of conceptualizing physics, in the spirit of Thomas Kuhn's The Structure of Scientific Revolutions. This paper identifies a number of long-held, but unproven assumptions currently being challenged by an increasing number of alternative scientists. Two common themes, both with venerable histories, keep recurring in the many alternative theories being proposed: (1) Mach's Principle, and (2) toroidal, vortex particles. Matter-based Mach's Principle differs from both space-based universal frames and observer-based Einsteinian relativity. Toroidal particles, in addition to explaining electron spin and the fundamental constants, satisfy the basic requirement of Gauss's misunderstood B Law, that motion itself circulates. Though a comprehensive theory is beyond the scope of this paper, it will suggest alternatives to the long list of assumptions in context.

  13. Inference and Assumption in Historical Seismology

    Science.gov (United States)

    Musson, R. M. W.

    The principal aim in studies of historical earthquakes is usually to derive parameters for past earthquakes from macroseismic or other data, and thus to extend parametric earthquake catalogues back in time, often with improved seismic hazard studies as the ultimate goal. In cases of relatively recent historical earthquakes, for example those of the 18th and 19th centuries, it is often the case that there is such an abundance of available macroseismic data that estimating earthquake parameters is relatively straightforward. For earlier historical periods, especially medieval and earlier, and also for areas where settlement or documentation are sparse, the situation is much harder. The seismologist often finds that he has only a few data points (or even one) for an earthquake that nevertheless appears to be regionally significant. In such cases, it is natural that the investigator will attempt to make the most of the available data, expanding it by making working assumptions, and from these deriving conclusions by inference (i.e. the process of proceeding logically from some premise). This can be seen in a number of existing studies; in some cases extremely slight data are so magnified by the use of inference that one must regard the results as tentative in the extreme. Two main types of inference can be distinguished. The first type is inference from documentation, where assumptions are made such as: the absence of a report of the earthquake from this monastic chronicle indicates that at this locality the earthquake was not felt. The second type is inference from seismicity, where one deals with arguments such as: all recent earthquakes felt at town X are events occurring in seismic zone Y; therefore this ancient earthquake, which is only reported at town X, probably also occurred in this zone.

  14. Posttraumatic Growth and Shattered World Assumptions Among Ex-POWs

    DEFF Research Database (Denmark)

    Lahav, Y.; Bellin, Elisheva S.; Solomon, Z.

    2016-01-01

    Objective: It has been suggested that posttraumatic growth (PTG) entails shattered world assumptions (WAs) and that the co-occurrence of high PTG and negative WAs among trauma survivors reflects reconstruction of an integrative belief system. The present study aimed to test these claims by investigating, for the first time, the mediating role of dissociation in the relation between PTG and WAs. Method: Former prisoners of war (ex-POWs; n = 158) and comparable controls (n = 106) were assessed 38 years after the Yom Kippur War. Results: Ex-POWs endorsed more negative WAs and higher PTG and dissociation compared to controls. Ex-POWs with posttraumatic stress disorder (PTSD) endorsed negative WAs and a higher magnitude of PTG and dissociation, compared to both ex-POWs without PTSD and controls. WAs were negatively correlated with dissociation and positively correlated with PTG. PTG was positively correlated with dissociation. Moreover, dissociation fully mediated...

  15. Towards New Probabilistic Assumptions in Business Intelligence

    Directory of Open Access Journals (Sweden)

    Schumann Andrew

    2015-01-01

    Full Text Available One of the main assumptions of mathematical tools in science is represented by the idea of measurability and additivity of reality. For discovering the physical universe additive measures such as mass, force, energy, temperature, etc. are used. Economics and conventional business intelligence try to continue this empiricist tradition and in statistical and econometric tools they appeal only to the measurable aspects of reality. However, a lot of important variables of economic systems cannot be observable and additive in principle. These variables can be called symbolic values or symbolic meanings and studied within symbolic interactionism, the theory developed since George Herbert Mead and Herbert Blumer. In statistical and econometric tools of business intelligence we accept only phenomena with causal connections measured by additive measures. In the paper we show that in the social world we deal with symbolic interactions which can be studied by non-additive labels (symbolic meanings or symbolic values. For accepting the variety of such phenomena we should avoid additivity of basic labels and construct a new probabilistic method in business intelligence based on non-Archimedean probabilities.

  16. Roy's specific life values and the philosophical assumption of humanism.

    Science.gov (United States)

    Hanna, Debra R

    2013-01-01

    Roy's philosophical assumption of humanism, which is shaped by the veritivity assumption, is considered in terms of her specific life values and in contrast to the contemporary view of humanism. Like veritivity, Roy's philosophical assumption of humanism unites a theocentric focus with anthropological values. Roy's perspective enriches the mainly secular, anthropocentric assumption. In this manuscript, the basis for Roy's perspective of humanism will be discussed so that readers will be able to use the Roy adaptation model in an authentic manner.

  17. The zero-sum assumption in neutral biodiversity theory

    NARCIS (Netherlands)

    Etienne, Rampal S.; Alonso, David; McKane, Alan J.

    2007-01-01

    The neutral theory of biodiversity as put forward by Hubbell in his 2001 monograph has received much criticism for its unrealistic simplifying assumptions. These are the assumptions of functional equivalence among different species (neutrality), the assumption of point mutation speciation, and the zero-sum assumption...
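
    The zero-sum assumption itself is easy to state in code (a minimal Hubbell-style sketch with hypothetical parameters): the local community holds exactly J individuals, and every death is immediately balanced by a birth or a speciation event.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    J, nu, steps = 500, 0.01, 200_000    # community size, speciation rate, events
    community = np.zeros(J, dtype=int)   # start with a single species, label 0
    next_label = 1

    for _ in range(steps):
        dead = rng.integers(J)                    # one individual dies...
        if rng.random() < nu:                     # ...replaced by a new species
            community[dead] = next_label
            next_label += 1
        else:                                     # ...or by a local birth
            community[dead] = community[rng.integers(J)]

    print("species richness at fixed J:", np.unique(community).size)
    ```

    The simulation never lets the community size drift from J; that fixed-size constraint is exactly the zero-sum assumption the record above scrutinizes.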

  1. Philosophy of Technology Assumptions in Educational Technology Leadership

    Science.gov (United States)

    Webster, Mark David

    2017-01-01

    A qualitative study using grounded theory methods was conducted to (a) examine what philosophy of technology assumptions are present in the thinking of K-12 technology leaders, (b) investigate how the assumptions may influence technology decision making, and (c) explore whether technological determinist assumptions are present. Subjects involved…

  3. Economic Growth Assumptions in Climate and Energy Policy

    Directory of Open Access Journals (Sweden)

    Nir Y. Krakauer

    2014-03-01

    Full Text Available The assumption that the economic growth seen in recent decades will continue has dominated the discussion of future greenhouse gas emissions and the mitigation of and adaptation to climate change. Given that long-term economic growth is uncertain, the impacts of a wide range of growth trajectories should be considered. In particular, slower economic growth would imply that future generations will be relatively less able to invest in emissions controls or adapt to the detrimental impacts of climate change. Taking into consideration the possibility of economic slowdown therefore heightens the urgency of reducing greenhouse gas emissions now by moving to renewable energy sources, even if this incurs short-term economic cost. I quantify this counterintuitive impact of economic growth assumptions on present-day policy decisions in a simple global economy-climate model (Dynamic Integrated model of Climate and the Economy (DICE. In DICE, slow future growth increases the economically optimal present-day carbon tax rate and the utility of taxing carbon emissions, although the magnitude of the increase is sensitive to model parameters, including the rate of social time preference and the elasticity of the marginal utility of consumption. Future scenario development should specifically include low-growth scenarios, and the possibility of low-growth economic trajectories should be taken into account in climate policy analyses.
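
    As a rough illustration of the mechanism this abstract describes, the sketch below uses the standard Ramsey discounting rule r = rho + eta*g (rho: rate of social time preference, eta: elasticity of the marginal utility of consumption, g: growth rate). The parameter values are illustrative assumptions, not values taken from the DICE runs:

        def ramsey_discount_rate(rho, eta, g):
            """Consumption discount rate under the Ramsey rule r = rho + eta*g."""
            return rho + eta * g

        def present_value(amount, years, rate):
            """Present value of a future amount under constant discounting."""
            return amount / (1.0 + rate) ** years

        rho, eta = 0.015, 1.45            # DICE-like illustrative values
        for g in (0.025, 0.005):          # fast vs. slow long-run growth
            r = ramsey_discount_rate(rho, eta, g)
            pv = present_value(1.0e12, 100, r)   # $1T of damage in 100 years
            print(f"g={g:.3f}  r={r:.4f}  PV of $1T damage: ${pv / 1e9:.1f}B")

    Under these assumed numbers, slower growth raises the present value of end-of-century damages by roughly an order of magnitude, which is the direction of the effect the abstract reports.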

  4. Decision-Theoretic Planning: Structural Assumptions and Computational Leverage

    CERN Document Server

    Boutilier, C; Hanks, S; 10.1613/jair.575

    2011-01-01

    Planning under uncertainty is a central problem in the study of automated sequential decision making, and has been addressed by researchers in many different fields, including AI planning, decision analysis, operations research, control theory and economics. While the assumptions and perspectives adopted in these areas often differ in substantial ways, many planning problems of interest to researchers in these fields can be modeled as Markov decision processes (MDPs) and analyzed using the techniques of decision theory. This paper presents an overview and synthesis of MDP-related methods, showing how they provide a unifying framework for modeling many classes of planning problems studied in AI. It also describes structural properties of MDPs that, when exhibited by particular classes of problems, can be exploited in the construction of optimal or approximately optimal policies or plans. Planning problems commonly possess structure in the reward and value functions used to describe performance criteria, in the...
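
    As a minimal sketch of the decision-theoretic machinery this overview surveys, the following value iteration solves a toy MDP; the two-state problem, its rewards, and the discount factor are invented for illustration:

        # Value iteration on a toy 2-state, 2-action MDP (all numbers assumed).
        P = {  # P[state][action] = [(next_state, probability), ...]
            0: {"stay": [(0, 0.9), (1, 0.1)], "go": [(1, 0.8), (0, 0.2)]},
            1: {"stay": [(1, 1.0)], "go": [(0, 1.0)]},
        }
        R = {0: {"stay": 0.0, "go": 1.0}, 1: {"stay": 2.0, "go": 0.0}}
        gamma = 0.95  # discount factor

        V = {0: 0.0, 1: 0.0}
        for _ in range(1000):  # iterate the Bellman optimality operator
            V = {s: max(R[s][a] + gamma * sum(p * V[t] for t, p in P[s][a])
                        for a in P[s])
                 for s in P}
        print({s: round(v, 2) for s, v in V.items()})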

  5. The Impact of Modeling Assumptions in Galactic Chemical Evolution Models

    CERN Document Server

    Côté, Benoit; Ritter, Christian; Herwig, Falk; Venn, Kim A

    2016-01-01

    We use the OMEGA galactic chemical evolution code to investigate how the assumptions used for the treatment of galactic inflows and outflows impact numerical predictions. The goal is to determine how our capacity to reproduce the chemical evolution trends of a galaxy is affected by the choice of implementation used to include those physical processes. In pursuit of this goal, we experiment with three different prescriptions for galactic inflows and outflows and use OMEGA within a Markov Chain Monte Carlo code to recover the set of input parameters that best reproduces the chemical evolution of nine elements in the dwarf spheroidal galaxy Sculptor. Despite their different degrees of intended physical realism, we found that all three prescriptions can reproduce in an almost identical way the stellar abundance trends observed in Sculptor. While the three models have the same capacity to fit the data, the best values recovered for the parameters controlling the number of Type Ia supernovae and the strength of galactic outflows ...

  6. Correlations and Non-predictability in the Time Evolution of Earthquake Ruptures

    Science.gov (United States)

    Elkhoury, J. E.; Knopoff, L.

    2007-12-01

    The characterization of the time evolution of ruptures is one of the important aspects of the earthquake process. What makes a rupture that starts small become a big one, or end very quickly as a small earthquake, is central to understanding the physics of the time evolution of ruptures. Establishing whether there are any correlations in time between the initiation of the rupture and its ultimate size is a step in the right direction. Here, we analyze three source-time function data sets. The first is produced by the generation of repeated rupture events on a 2D heterogeneous, in-plane, dynamical model, while the second is produced by an age-dependent critical branching model. The third is the source-time function database of Ruff [1]. We formulate the problem in terms of two questions. 1) Are there any correlations between the moment release at the beginning of the rupture and the total moment release during the entire rupture? 2) Can we predict the final size of an earthquake, once it has started and without any a posteriori information, by just knowing the moment release up to a certain time τ? Using the three databases, the answer to the first question is yes, and to the second, no. The longer τ is, the stronger the correlations are between what goes on at the initiation and the final size. But for fixed τ that is not a major fraction of the rupture time, there is no predictability of the rupture size. In particular, if a rupture starts with a very large moment release during time τ, it becomes a large earthquake. On the other hand, large earthquakes might start with very small moment release during τ; the non-predictability is due to the heterogeneities. The randomness in the critical branching model mimics the effect of the heterogeneities in the crust and in the 2D model. [1] Ruff, L. J., http://www.geo.lsa.umich.edu/SeismoObs/STF.html
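
    As a toy experiment in the spirit of the branching-model comparison, the sketch below treats each rupture as a near-critical Galton-Watson branching process and asks how strongly the "moment release" in the first τ generations correlates with the final size. The branching ratio, τ, and offspring law are illustrative assumptions (Python 3.10+ for statistics.correlation):

        # Toy rupture model: near-critical branching process (numbers assumed).
        import random
        import statistics

        random.seed(1)

        def rupture(branching_ratio=0.97, tau=3, max_events=100_000):
            """Return (events within first tau generations, total events)."""
            active, total, early, generation = 1, 1, 1, 0
            while active and total < max_events:
                generation += 1
                # binomial(2, ratio/2) offspring per event: mean = ratio
                offspring = sum(1 for _ in range(2 * active)
                                if random.random() < branching_ratio / 2)
                active = offspring
                total += offspring
                if generation <= tau:
                    early += offspring
            return early, total

        early, total = zip(*(rupture() for _ in range(2000)))
        print("corr(early release, final size) =",
              round(statistics.correlation(early, total), 3))

    The correlation comes out positive but far from 1, mirroring the abstract's "yes" to correlation and "no" to prediction from a short initial window.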

  7. Legal assumptions for private company claim for additional (supplementary) payment

    Directory of Open Access Journals (Sweden)

    Šogorov Stevan

    2011-01-01

    Full Text Available The subject of analysis in this article is the legal assumptions which must be met in order for a private company to claim additional payments. After introductory remarks, the discussion focuses on the existence of provisions regarding additional payment in the formation contract, or in a general resolution of the shareholders' meeting, as the starting point for the company's claim. The second assumption is a concrete resolution of the shareholders' meeting which creates individual obligations for additional payments. The third assumption is defined as distinctness regarding the sum of the payment and its due date. The sending of the claim by the relevant company body is set as the fourth legal assumption for realization of the company's right to claim additional payments from a member of the private company.

  8. Exposing Trust Assumptions in Distributed Policy Enforcement (Briefing Charts)

    Science.gov (United States)

    2016-06-21

    • Coordinated defenses appear to be feasible • Writing policies from scratch is hard – exposing assumptions requires people to think about what assumptions ... critical capabilities as: – adaptation to dynamic service availability – complex situational dynamics (e.g., differentiating between bot-net and ...)

  9. Co-Dependency: An Examination of Underlying Assumptions.

    Science.gov (United States)

    Myer, Rick A.; And Others

    1991-01-01

    Discusses need for careful examination of codependency as diagnostic category. Critically examines assumptions that codependency is disease, addiction, or predetermined by the environment. Discusses implications of assumptions. Offers recommendations for mental health counselors focusing on need for systematic research, redirection of efforts to…

  11. 29 CFR 4231.10 - Actuarial calculations and assumptions.

    Science.gov (United States)

    2010-07-01

    ... MULTIEMPLOYER PLANS § 4231.10 Actuarial calculations and assumptions. (a) Most recent valuation. All calculations required by this part must be based on the most recent actuarial valuation as of the date of... this part must be based on methods and assumptions that are reasonable in the aggregate, based on...

  12. Special Theory of Relativity without special assumptions and tachyonic motion

    Directory of Open Access Journals (Sweden)

    E. Kapuścik

    2010-01-01

    Full Text Available The most general form of transformations of space-time coordinates in Special Theory of Relativity based solely on physical assumptions is described. Only the linearity of space-time transformations and the constancy of the speed of light are used as assumptions. The application to tachyonic motion is indicated.

  13. 40 CFR 761.2 - PCB concentration assumptions for use.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 PCB concentration assumptions for use..., AND USE PROHIBITIONS General § 761.2 PCB concentration assumptions for use. (a)(1) Any person may..., oil-filled cable, and rectifiers whose PCB concentration is not established contain PCBs at < 50 ppm...

  14. MONITORED GEOLOGIC REPOSITORY LIFE CYCLE COST ESTIMATE ASSUMPTIONS DOCUMENT

    Energy Technology Data Exchange (ETDEWEB)

    R.E. Sweeney

    2001-02-08

    The purpose of this assumptions document is to provide general scope, strategy, technical basis, schedule and cost assumptions for the Monitored Geologic Repository (MGR) life cycle cost (LCC) estimate and schedule update, incorporating information from the Viability Assessment (VA), License Application Design Selection (LADS), the 1999 Update to the Total System Life Cycle Cost (TSLCC) estimate, and other related and updated information. This document is intended to generally follow the assumptions outlined in the previous MGR cost estimates and as further prescribed by DOE guidance.

  15. The Impact of Modeling Assumptions in Galactic Chemical Evolution Models

    Science.gov (United States)

    Côté, Benoit; O'Shea, Brian W.; Ritter, Christian; Herwig, Falk; Venn, Kim A.

    2017-02-01

    We use the OMEGA galactic chemical evolution code to investigate how the assumptions used for the treatment of galactic inflows and outflows impact numerical predictions. The goal is to determine how our capacity to reproduce the chemical evolution trends of a galaxy is affected by the choice of implementation used to include those physical processes. In pursuit of this goal, we experiment with three different prescriptions for galactic inflows and outflows and use OMEGA within a Markov Chain Monte Carlo code to recover the set of input parameters that best reproduces the chemical evolution of nine elements in the dwarf spheroidal galaxy Sculptor. This provides a consistent framework for comparing the best-fit solutions generated by our different models. Despite their different degrees of intended physical realism, we found that all three prescriptions can reproduce in an almost identical way the stellar abundance trends observed in Sculptor. This result supports the similar conclusions originally claimed by Romano & Starkenburg for Sculptor. While the three models have the same capacity to fit the data, the best values recovered for the parameters controlling the number of SNe Ia and the strength of galactic outflows are substantially different and in fact mutually exclusive from one model to another. For the purpose of understanding how a galaxy evolves, we conclude that only reproducing the evolution of a limited number of elements is insufficient and can lead to misleading conclusions. More elements or additional constraints such as the Galaxy’s star-formation efficiency and the gas fraction are needed in order to break the degeneracy between the different modeling assumptions. Our results show that the successes and failures of chemical evolution models are predominantly driven by the input stellar yields, rather than by the complexity of the Galaxy model itself. Simple models such as OMEGA are therefore sufficient to test and validate stellar yields.

  16. Supporting calculations and assumptions for use in WESF safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hey, B.E.

    1997-03-07

    This document provides a single location for calculations and assumptions used in support of Waste Encapsulation and Storage Facility (WESF) safety analyses. It also provides the technical details and bases necessary to justify the contained results.

  17. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    DISCOURSES AND THEORETICAL ASSUMPTIONS IN IT PROJECT PORTFOLIO MANAGEMENT: A REVIEW OF THE LITERATURE. In recent years increasing interest has been put on IT project portfolio management (IT PPM). Considering IT PPM an interdisciplinary practice, we conduct a concept-based literature review of relevant articles. We find that the normative discourse dominates the IT PPM literature, and few contributions represent the three remaining discourses, which unjustifiably leaves out issues that research could and most probably should investigate. In order to highlight research potentials, limitations, and underlying assumptions of each discourse, we develop four IT PPM metaphors to articulate and discuss underlying and conflicting assumptions in IT PPM, serving as a basis for adjusting organizations’ IT PPM practices. Keywords: IT project portfolio management or IT PPM, literature review, scientific discourses, underlying assumptions, unintended consequences, epistemological biases, metaphors, information systems.

  18. Different Random Distributions Research on Logistic-Based Sample Assumption

    Directory of Open Access Journals (Sweden)

    Jing Pan

    2014-01-01

    Full Text Available A logistic-based sample assumption is proposed in this paper, and different random distributions are studied through this system. It provides an assumption system for logistic-based samples, including its sample space structure. Moreover, the influence of different random distributions of the inputs has been studied through this logistic-based sample assumption system. In this paper, three different random distributions (normal distribution, uniform distribution, and beta distribution) are used for the test. The experimental simulations illustrate the relationship between inputs and outputs under different random distributions. Numerical analysis then infers that the distribution of the outputs depends on that of the inputs to some extent, and that this assumption system is not an independent-increment process, but is quasi-stationary.
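
    As a minimal sketch of the kind of experiment described above, the code below draws inputs from normal, uniform, and beta distributions, pushes them through a logistic transform, and compares output summaries; the transform and all parameters are assumptions for illustration:

        # Pass inputs from different random distributions through a logistic
        # transform and compare the output distributions (numbers assumed).
        import math
        import random
        import statistics

        random.seed(0)

        def logistic(x, k=4.0):
            return 1.0 / (1.0 + math.exp(-k * x))

        sources = {
            "normal":  lambda: random.gauss(0.0, 1.0),
            "uniform": lambda: random.uniform(-2.0, 2.0),
            "beta":    lambda: 4.0 * random.betavariate(2.0, 5.0) - 2.0,
        }
        for name, draw in sources.items():
            out = [logistic(draw()) for _ in range(10_000)]
            print(f"{name:8s} mean={statistics.mean(out):.3f} "
                  f"stdev={statistics.stdev(out):.3f}")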

  19. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)

    Swarup Mohalik; R Ramanujam

    2002-04-01

    We propose a class of finite state systems of synchronizing distributed processes, where processes make assumptions at local states about the state of other processes in the system. This constrains the global states of the system to those where assumptions made by a process about another are compatible with the commitments offered by the other at that state. We model examples like reliable bit transmission and sequence transmission protocols in this framework and discuss how assumption-commitment structure facilitates compositional design of such protocols. We prove a decomposition theorem which states that every protocol specified globally as a finite state system can be decomposed into such an assumption compatible system. We also present a syntactic characterization of this class using top level parallel composition.

  20. Tails assumptions and posterior concentration rates for mixtures of Gaussians

    OpenAIRE

    Naulet, Zacharie; Rousseau, Judith

    2016-01-01

    Nowadays in density estimation, posterior rates of convergence for location and location-scale mixtures of Gaussians are only known under light-tail assumptions, with better rates achieved by location mixtures. It is conjectured, but not proved, that the situation should be reversed under heavy-tail assumptions. The conjecture is based on the feeling that there is no need to achieve a good order of approximation in regions with few data (say, in the tails), favoring location-scale mixtures ...

  1. US Intervention in Failed States: Bad Assumptions=Poor Outcomes

    Science.gov (United States)

    2002-01-01

    NATIONAL DEFENSE UNIVERSITY, NATIONAL WAR COLLEGE, STRATEGIC LOGIC ESSAY: US Intervention in Failed States: Bad Assumptions = Poor Outcomes (2002). ... the country remains in the grip of poverty, natural disasters, and stagnation. Rwanda, another small African country, is populated principally ...

  2. H-INFINITY-OPTIMIZATION WITHOUT ASSUMPTIONS ON FINITE OR INFINITE ZEROS

    NARCIS (Netherlands)

    SCHERER, C

    1992-01-01

    Explicit algebraic conditions are presented for the suboptimality of some parameter in the H(infinity)-optimization problem by output measurement control. Apart from two strict properness conditions, no artificial assumptions restrict the underlying system. In particular, the plant may have zeros on ...

  3. Contemporary assumptions on human nature and work and approach to human potential managing

    Directory of Open Access Journals (Sweden)

    Vujić Dobrila

    2006-01-01

    Full Text Available The general problem of this research is to identify whether there is a relationship between assumptions on human nature and work (McGregor, Argyris, Schein, Steers and Porter) and the preference for a general organizational model, as well as for mechanisms of human resource management. This research was carried out in 2005/2006. The sample consisted of 317 subjects (197 managers, 105 highly educated subordinates and 15 entrepreneurs) in 7 big enterprises and in a group of small business enterprises differing in terms of the entrepreneur’s structure and type of activity. The general hypothesis, "that assumptions on human nature and work are statistically significantly connected to the preferred approach (models) of work motivation commitment", has been confirmed. The specific hypotheses have also been confirmed: · The assumptions on a human as a rational economic being are statistically significantly correlated with only two mechanisms of traditional models, the mechanism of work method control and the working discipline mechanism. · Statistically significant assumptions on a human as a social being are correlated with all mechanisms of engaging employees which belong to the human relations model, except the mechanism of introducing an adequate type of prize for all employees independently of working results. · The assumptions on a human as a creative being are statistically significantly and positively correlated with the preference for two mechanisms belonging to the human resource model: investing in education and training, and creating conditions for the application of knowledge and skills. The young with assumptions on a human as a creative being prefer a much broader repertoire of mechanisms belonging to the human resources model than the remaining categories of subjects in the sample. The connection between assumptions on human nature and preferred models of engaging appears especially in the sub-sample of managers and in the category of young subjects ...

  4. Changing Assumptions and Progressive Change in Theories of Strategic Organization

    DEFF Research Database (Denmark)

    Foss, Nicolai J.; Hallberg, Niklas L.

    2016-01-01

    A commonly held view is that strategic organization theories progress as a result of a Popperian process of bold conjectures and systematic refutations. However, our field also witnesses vibrant debates or disputes about the specific assumptions that our theories rely on, and although these debates are often decoupled from the results of empirical testing, changes in assumptions seem closely intertwined with theoretical progress. Using the case of the resource-based view, we suggest that progressive change in theories of strategic organization may come about as a result of scholarly debate and dispute over what constitutes proper assumptions—even in the absence of corroborating or falsifying empirical evidence. We also discuss how changing assumptions may drive future progress in the resource-based view.

  5. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

    Full Text Available Classical Respondent-Driven Sampling (RDS estimators are based on a Markov Process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.
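
    As a toy calculation of why the with-replacement assumption is counterfactual and when it starts to matter: under with-replacement draws some respondents would be "interviewed" more than once, and the expected number of such duplicate draws grows faster than linearly with the sampling fraction. The population size is an assumption:

        # Expected duplicate draws under sampling WITH replacement, as a
        # function of the sampling fraction (population size assumed).
        import random

        random.seed(42)

        N = 1000        # population size (assumption)
        trials = 2000
        for fraction in (0.05, 0.10, 0.20, 0.40, 0.60):
            n = int(fraction * N)
            dup = sum(n - len(set(random.choices(range(N), k=n)))
                      for _ in range(trials)) / trials
            print(f"fraction={fraction:.2f}  mean duplicate draws={dup:5.1f}")

    For the small fractions typical of field RDS, duplicates stay modest, which is one intuition for the paper's finding that the assumption contributes little bias below roughly 40%.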

  6. On the Necessary and Sufficient Assumptions for UC Computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Nielsen, Jesper Buus; Orlandi, Claudio

    2010-01-01

    We study the necessary and sufficient assumptions for universally composable (UC) computation, both in terms of setup and computational assumptions. We look at the common reference string model, the uniform random string model and the key-registration authority model (KRA), and provide new results for all of them. Perhaps most interestingly we show that: • For even the minimal meaningful KRA, where we only assume that the secret key is a value which is hard to compute from the public key, one can UC securely compute any poly-time functionality if there exists a passive secure oblivious-transfer protocol for the stand-alone model. Since a KRA where the secret keys can be computed from the public keys is useless, and some setup assumption is needed for UC secure computation, this establishes the best we could hope for the KRA model: any non-trivial KRA is sufficient for UC computation. • We show ...

  7. More Efficient VLR Group Signature Based on DTDH Assumption

    Directory of Open Access Journals (Sweden)

    Lizhen Ma

    2012-10-01

    Full Text Available In VLR (verifier-local revocation) group signatures, only verifiers are involved in the revocation of a member, while signers are not. Thus VLR group signature schemes are suitable for mobile environments. To meet the requirement of speediness, reducing computation costs and shortening signature length are the two goals of current research on VLR group signatures. A new VLR group signature is proposed based on the q-SDH assumption and the DTDH assumption. Compared with the existing VLR group signatures based on the DTDH assumption, the proposed scheme not only has the shortest signature size but also the lowest computation costs, and it is applicable to mobile environments such as IEEE 802.1X.

  8. Evolution of Requirements and Assumptions for Future Exploration Missions

    Science.gov (United States)

    Anderson, Molly; Sargusingh, Miriam; Perry, Jay

    2017-01-01

    NASA programs are maturing technologies, systems, and architectures to enable future exploration missions. To increase fidelity as technologies mature, developers must make assumptions that represent the requirements of a future program. Multiple efforts have begun to define these requirements, including team-internal assumptions, planning system integration for early demonstrations, and discussions between international partners planning future collaborations. For many detailed life support system requirements, existing NASA documents set limits of acceptable values, but a future vehicle may be constrained in other ways and select a limited range of conditions. Other requirements are effectively set by interfaces or operations, and may be different for the same technology depending on whether the hardware is a demonstration system on the International Space Station or a critical component of a future vehicle. This paper highlights key assumptions representing potential life support requirements, with explanations of the driving scenarios, constraints, or other issues behind them.

  9. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    DISCOURSES AND THEORETICAL ASSUMPTIONS IN IT PROJECT PORTFOLIO MANAGEMENT: A REVIEW OF THE LITERATURE. In recent years increasing interest has been put on IT project portfolio management (IT PPM). Considering IT PPM an interdisciplinary practice, we conduct a concept-based literature review of relevant articles and develop four IT PPM metaphors: (1) IT PPM as the top management marketplace, (2) IT PPM as the cause of social dilemmas at the lower organizational levels, (3) IT PPM as polity between different organizational interests, (4) IT PPM as power relations that suppress creativity and diversity. Our metaphors can be used by practitioners to articulate and discuss underlying and conflicting assumptions in IT PPM, serving as a basis for adjusting organizations’ IT PPM practices. Keywords: IT project portfolio management or IT PPM, literature review, scientific discourses, underlying assumptions, unintended consequences, epistemological biases, metaphors, information systems.

  10. Assumptions behind size-based ecosystem models are realistic

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Blanchard, Julia L.; Fulton, Elizabeth A.;

    2016-01-01

    A recent publication about balanced harvesting (Froese et al., ICES Journal of Marine Science; doi:10.1093/icesjms/fsv122) contains several erroneous statements about size-spectrum models. We refute the statements by showing that the assumptions pertaining to size-spectrum models discussed by Froese et al. are realistic and consistent. We further show that the assumption about density-dependence being described by a stock recruitment relationship is responsible for determining whether a peak in the cohort biomass of a population occurs late or early in life. Finally, we argue ...

  11. 41 CFR 60-3.9 - No assumption of validity.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 1 2010-07-01 No assumption of validity. 60-3.9 Section 60-3.9 Public Contracts and Property Management Other Provisions Relating to Public... 3-UNIFORM GUIDELINES ON EMPLOYEE SELECTION PROCEDURES (1978) General Principles § 60-3.9...

  12. Ontological, Epistemological and Methodological Assumptions: Qualitative versus Quantitative

    Science.gov (United States)

    Ahmed, Abdelhamid

    2008-01-01

    The review to follow is a comparative analysis of two studies conducted in the field of TESOL in Education published in "TESOL QUARTERLY." The aspects to be compared are as follows. First, a brief description of each study will be presented. Second, the ontological, epistemological and methodological assumptions underlying each study…

  13. "Touch Me, Like Me": Testing an Encounter Group Assumption

    Science.gov (United States)

    Boderman, Alvin; And Others

    1972-01-01

    An experiment to test an encounter group assumption that touching increases interpersonal attraction was conducted. College women were randomly assigned to a touch or no-touch condition. A comparison of total evaluation scores verified the hypothesis: subjects who touched the accomplice perceived her as a more attractive person than those who did…

  14. The Metatheoretical Assumptions of Literacy Engagement: A Preliminary Centennial History

    Science.gov (United States)

    Hruby, George G.; Burns, Leslie D.; Botzakis, Stergios; Groenke, Susan L.; Hall, Leigh A.; Laughter, Judson; Allington, Richard L.

    2016-01-01

    In this review of literacy education research in North America over the past century, the authors examined the historical succession of theoretical frameworks on students' active participation in their own literacy learning, and in particular the metatheoretical assumptions that justify those frameworks. The authors used "motivation" and…

  15. Woman's Moral Development in Search of Philosophical Assumptions.

    Science.gov (United States)

    Sichel, Betty A.

    1985-01-01

    Examined is Carol Gilligan's thesis that men and women use different moral languages to resolve moral dilemmas, i.e., women speak a language of caring and responsibility, and men speak a language of rights and justice. Her thesis is not grounded with adequate philosophical assumptions. (Author/RM)

  16. Questioning Engelhardt's assumptions in Bioethics and Secular Humanism.

    Science.gov (United States)

    Ahmadi Nasab Emran, Shahram

    2016-06-01

    In Bioethics and Secular Humanism: The Search for a Common Morality, Tristram Engelhardt examines various possibilities of finding common ground for moral discourse among people from different traditions and concludes their futility. In this paper I will argue that many of the assumptions on which Engelhardt bases his conclusion about the impossibility of a content-full secular bioethics are problematic. By starting with the notion of moral strangers, there is no possibility, by definition, for a content-full moral discourse among moral strangers. It means that there is circularity in starting the inquiry with a definition of moral strangers, which implies that they do not share enough moral background or commitment to an authority to allow for reaching a moral agreement, and concluding that content-full morality is impossible among moral strangers. I argue that assuming traditions as solid and immutable structures that insulate people across their boundaries is problematic. Another questionable assumption in Engelhardt's work is the idea that religious and philosophical traditions provide content-full moralities. As the cardinal assumption in Engelhardt's review of the various alternatives for a content-full moral discourse among moral strangers, I analyze his foundationalist account of moral reasoning and knowledge and indicate the possibility of other ways of moral knowledge, besides the foundationalist one. Then, I examine Engelhardt's view concerning the futility of attempts at justifying a content-full secular bioethics, and indicate how the assumptions have shaped Engelhardt's critique of the alternatives for the possibility of content-full secular bioethics.

  17. Using Contemporary Art to Challenge Cultural Values, Beliefs, and Assumptions

    Science.gov (United States)

    Knight, Wanda B.

    2006-01-01

    Art educators, like many other educators born or socialized within the main-stream culture of a society, seldom have an opportunity to identify, question, and challenge their cultural values, beliefs, assumptions, and perspectives because school culture typically reinforces those they learn at home and in their communities (Bush & Simmons, 1990).…

  18. Lightweight Graphical Models for Selectivity Estimation Without Independence Assumptions

    DEFF Research Database (Denmark)

    Tzoumas, Kostas; Deshpande, Amol; Jensen, Christian S.

    2011-01-01

    Estimation errors in today’s optimizers are frequently caused by missed correlations between attributes. We present a selectivity estimation approach that does not make the independence assumptions. By carefully using concepts from the field of graphical models, we are able to factor the joint probability distribution of all ...
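
    As a toy version of the estimation error that motivates this work: on correlated attributes, the independence estimate p(A)·p(B) can be badly off the true joint selectivity. The table contents below are invented:

        # Independence assumption vs. true joint selectivity (data invented).
        rows = ([("honda", "accord")] * 40 + [("honda", "civic")] * 10
                + [("toyota", "camry")] * 50)

        def selectivity(pred):
            return sum(1 for r in rows if pred(r)) / len(rows)

        p_make  = selectivity(lambda r: r[0] == "honda")   # 0.5
        p_model = selectivity(lambda r: r[1] == "accord")  # 0.4
        print("independence estimate: ", p_make * p_model)  # 0.2
        print("true joint selectivity:",
              selectivity(lambda r: r == ("honda", "accord")))  # 0.4

    Here the independence assumption underestimates the joint selectivity by a factor of two; graphical models capture exactly this kind of attribute correlation.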

  19. Making Predictions about Chemical Reactivity: Assumptions and Heuristics

    Science.gov (United States)

    Maeyer, Jenine; Talanquer, Vicente

    2013-01-01

    Diverse implicit cognitive elements seem to support but also constrain reasoning in different domains. Many of these cognitive constraints can be thought of as either implicit assumptions about the nature of things or reasoning heuristics for decision-making. In this study we applied this framework to investigate college students' understanding of…

  20. Unpacking Assumptions in Research Synthesis: A Critical Construct Synthesis Approach

    Science.gov (United States)

    Wolgemuth, Jennifer R.; Hicks, Tyler; Agosto, Vonzell

    2017-01-01

    Research syntheses in education, particularly meta-analyses and best-evidence syntheses, identify evidence-based practices by combining findings across studies whose constructs are similar enough to warrant comparison. Yet constructs come preloaded with social, historical, political, and cultural assumptions that anticipate how research problems…

  1. Challenging Teachers' Pedagogic Practice and Assumptions about Social Media

    Science.gov (United States)

    Cartner, Helen C.; Hallas, Julia L.

    2017-01-01

    This article describes an innovative approach to professional development designed to challenge teachers' pedagogic practice and assumptions about educational technologies such as social media. Developing effective technology-related professional development for teachers can be a challenge for institutions and facilitators who provide this…

  2. Assumptions regarding right censoring in the presence of left truncation.

    Science.gov (United States)

    Qian, Jing; Betensky, Rebecca A

    2014-04-01

    Clinical studies using complex sampling often involve both truncation and censoring, where there are options for the assumptions of independence of censoring and event and for the relationship between censoring and truncation. In this paper, we clarify these choices, show certain equivalences, and provide examples.

  3. DDH-like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike

    2011-01-01

    ... DDH, is easy in bilinear groups. This motivates our suggestion of a different type of assumption, the d-vector DDH problems (VDDH), which are based on f(X) = X^d, but with a twist to avoid the problems with reducible polynomials. We show in the generic group model that VDDH is hard in bilinear groups ...

  4. Quantum cryptography in real-life applications: Assumptions and security

    Science.gov (United States)

    Zhao, Yi

    Quantum cryptography, or quantum key distribution (QKD), provides a means of unconditionally secure communication. The security is in principle based on the fundamental laws of physics. Security proofs show that if quantum cryptography is appropriately implemented, even the most powerful eavesdropper cannot decrypt the message from a cipher. The implementations of quantum crypto-systems in real life may not fully comply with the assumptions made in the security proofs. Such discrepancy between the experiment and the theory can be fatal to the security of a QKD system. In this thesis we address a number of these discrepancies. A perfect single-photon source is often assumed in many security proofs. However, a weak coherent source is widely used in a real-life QKD implementation. Decoy state protocols have been proposed as a novel approach to dramatically improve the performance of a weak coherent source based QKD implementation without jeopardizing its security. Here, we present the first experimental demonstrations of decoy state protocols. Our experimental scheme was later adopted by most decoy state QKD implementations. In the security proof of decoy state protocols as well as many other QKD protocols, it is widely assumed that a sender generates a phase-randomized coherent state. This assumption has been enforced in few implementations. We close this gap in two steps: First, we implement and verify the phase randomization experimentally; second, we prove the security of a QKD implementation without the coherent state assumption. In many security proofs of QKD, it is assumed that all the detectors on the receiver's side have identical detection efficiencies. We show experimentally that this assumption may be violated in a commercial QKD implementation due to an eavesdropper's malicious manipulation. Moreover, we show that the eavesdropper can learn part of the final key shared by the legitimate users as a consequence of this violation of the assumptions.
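
    As a back-of-the-envelope sketch of the weak-coherent-source issue behind the decoy-state work described above: pulse photon numbers are Poisson distributed, so some pulses carry two or more photons and are exposed to photon-number-splitting attacks. The mean photon numbers below are assumed, typical-looking values:

        # Photon-number statistics of weak coherent pulses (mu values assumed).
        import math

        def poisson(n, mu):
            return math.exp(-mu) * mu ** n / math.factorial(n)

        for mu in (0.1, 0.5, 0.8):
            p_nonempty = 1.0 - poisson(0, mu)
            p_multi = p_nonempty - poisson(1, mu)
            print(f"mu={mu:.1f}  P(n>=2)={p_multi:.4f}  "
                  f"P(n>=2 | n>=1)={p_multi / p_nonempty:.4f}")

    The conditional multi-photon fraction grows quickly with mu, which is the trade-off between key rate and security that decoy states help estimate.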

  5. Analysis of one assumption of the Navier-Stokes equations

    CERN Document Server

    Budarin, V A

    2013-01-01

    This article analyses the assumptions regarding the influence of pressure forces made in calculating the motion of a Newtonian fluid. The purpose of the analysis is to determine the reasonableness of the assumptions and their impact on the results of the analytical calculation. The connections between equations, the causes of discrepancies in exact solutions of the Navier-Stokes equations at low Reynolds numbers, and the emergence of unstable solutions in computer programs are also addressed. The need to complement the well-known equations of motion, written in terms of mechanical stress, with additional substantive equations is discussed. It is shown that there are three methods of solving such a problem, and the requirements for the unknown equations are described. Keywords: Navier-Stokes, approximate equation, closing equations, holonomic system.

  6. Testing Modeling Assumptions in the West Africa Ebola Outbreak

    Science.gov (United States)

    Burghardt, Keith; Verzijl, Christopher; Huang, Junming; Ingram, Matthew; Song, Binyang; Hasne, Marie-Pierre

    2016-01-01

    The Ebola virus in West Africa has infected almost 30,000 and killed over 11,000 people. Recent models of Ebola Virus Disease (EVD) have often made assumptions about how the disease spreads, such as uniform transmissibility and homogeneous mixing within a population. In this paper, we test whether these assumptions are necessarily correct, and offer simple solutions that may improve disease model accuracy. First, we use data and models of West African migration to show that EVD does not homogeneously mix, but spreads in a predictable manner. Next, we estimate the initial growth rate of EVD within country administrative divisions and find that it significantly decreases with population density. Finally, we test whether EVD strains have uniform transmissibility through a novel statistical test, and find that certain strains appear more often than expected by chance.
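
    As a sketch of the growth-rate test described above: estimate an initial exponential growth rate per administrative division by regressing log case counts on time, then inspect how it varies with population density. The case counts here are synthetic, with the negative trend built in:

        # Fit initial exponential growth rates and relate them to population
        # density (synthetic data).
        import math
        import random
        import statistics

        random.seed(3)

        def growth_rate(cases):
            """Least-squares slope of log(cases) against time."""
            t = list(range(len(cases)))
            y = [math.log(c) for c in cases]
            tb, yb = statistics.mean(t), statistics.mean(y)
            return (sum((ti - tb) * (yi - yb) for ti, yi in zip(t, y))
                    / sum((ti - tb) ** 2 for ti in t))

        for density in (10, 50, 200, 800):
            r_true = 0.35 - 0.05 * math.log(density) + random.gauss(0, 0.01)
            cases = [5.0 * math.exp(r_true * week) for week in range(6)]
            print(f"density={density:4d}  fitted growth rate="
                  f"{growth_rate(cases):.3f}")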

  7. Evaluating The Markov Assumption For Web Usage Mining

    DEFF Research Database (Denmark)

    Jespersen, S.; Pedersen, Torben Bach; Thorhauge, J.

    2003-01-01

    Web usage mining concerns the discovery of common browsing patterns, i.e., pages requested in sequence, from web logs. To cope with the enormous amounts of data, several aggregated structures based on statistical models of web surfing have appeared, e.g., the Hypertext Probabilistic Grammar (HPG) model. ... To our knowledge there has been no systematic study of the validity of the Markov assumption w.r.t. web usage mining and the resulting quality of the mined browsing patterns. In this paper we systematically investigate the quality of browsing patterns mined from structures based on the Markov assumption. Formal measures of quality, based on the closeness of the mined patterns to the true traversal patterns, are defined and an extensive experimental evaluation is performed, based on two substantial real-world data sets. The results indicate that a large number of rules must be considered to achieve high quality ...
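
    As a miniature of the assumption under evaluation: a history-depth-1 (first-order Markov) model estimates next-page probabilities from pairs of consecutive requests, discarding everything earlier in the session. The sessions below are invented:

        # First-order Markov next-page model from web sessions (data invented).
        from collections import Counter, defaultdict

        sessions = [["home", "news", "sports"],
                    ["home", "news", "weather"],
                    ["search", "news", "sports"]]

        counts = defaultdict(Counter)
        for s in sessions:
            for cur, nxt in zip(s, s[1:]):
                counts[cur][nxt] += 1

        def p_next(page, nxt):
            total = sum(counts[page].values())
            return counts[page][nxt] / total if total else 0.0

        # Depth 1 means P(sports | news) ignores whether the session began
        # at "home" or "search" -- exactly the simplification being tested.
        print(p_next("news", "sports"))  # 2/3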

  8. The sufficiency assumption of the reasoned approach to action

    Directory of Open Access Journals (Sweden)

    David Trafimow

    2015-12-01

    Full Text Available The reasoned action approach to understanding and predicting behavior includes the sufficiency assumption. Although variables not included in the theory may influence behavior, these variables work through the variables in the theory. Once the reasoned action variables are included in an analysis, the inclusion of other variables will not increase the variance accounted for in behavioral intentions or behavior. Reasoned action researchers are very concerned with testing whether new variables account for variance (or how much variance traditional variables account for), to see whether they are important, in general or with respect to the specific behaviors under investigation. But this approach tacitly assumes that accounting for variance is highly relevant to understanding the production of variance, which is what really is at issue. Based on the variance law, I question this assumption.
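
    As a miniature of the variance-accounting practice being questioned: a hierarchical regression on synthetic data in which a "new" variable works entirely through attitude, so it adds essentially no incremental R-squared. All data and coefficients are simulated assumptions:

        # Incremental-variance check of a "new" variable over reasoned action
        # variables (synthetic data).
        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        attitude = rng.normal(size=n)
        norm = rng.normal(size=n)
        new_var = 0.8 * attitude + rng.normal(scale=0.6, size=n)  # via attitude
        intention = (0.7 * attitude + 0.3 * norm
                     + rng.normal(scale=0.5, size=n))

        def r_squared(predictors, y):
            X = np.column_stack([np.ones(len(y)), *predictors])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            return 1.0 - resid.var() / y.var()

        base = r_squared([attitude, norm], intention)
        full = r_squared([attitude, norm, new_var], intention)
        print(f"R2 base={base:.3f}  R2 with new variable={full:.3f}")

    The near-zero gain illustrates the author's point: a variable can matter causally yet account for no additional variance once the theory's variables are included.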

  9. A "unity assumption" does not promote intersensory integration.

    Science.gov (United States)

    Misceo, Giovanni F; Taylor, Nathanael J

    2011-01-01

    An account of intersensory integration is premised on knowing that different sensory inputs arise from the same object. Could, however, the combination of the inputs be impaired although the "unity assumption" holds? Forty observers viewed a square through a minifying (50%) lens while they simultaneously touched the square. Half could see and half could not see their haptic explorations of the square. Both groups, however, had reason to believe that they were touching and viewing the same square. Subsequent matches of the inspected square were mutually biased by touch and vision when the exploratory movements were visible. However, the matches were biased in the direction of the square's haptic size when observers could not see their exploratory movements. This impaired integration without the visible haptic explorations suggests that the unity assumption alone is not enough to promote intersensory integration.

  10. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

    A number of waste life cycle assessment (LCA) models have been gradually developed since the early 1990s, in a number of countries, usually independently from each other. Large discrepancies in results have been observed among different waste LCA models, although it has also been shown that results ... such as the functional unit, system boundaries, waste composition and energy modelling. The modelling assumptions of waste management processes, ranging from collection, transportation, intermediate facilities, recycling, thermal treatment, biological treatment, and landfilling, are obviously critical when comparing waste LCA models. This review infers that some of the differences in waste LCA models are inherent to the time they were developed. It is expected that models developed later benefit from past modelling assumptions, knowledge and issues. Models developed in different countries furthermore rely ...

  12. Assumptions and realities of the NCLEX-RN.

    Science.gov (United States)

    Aucoin, Julia W; Treas, Leslie

    2005-01-01

    Every three years the National Council of State Boards of Nursing conducts a practice analysis to verify the activities that are tested on the licensure exam (NCLEX-RN). Faculty can benefit from information in the practice analysis to ensure that courses and experiences adequately prepare graduates for the NCLEX-RN. This summary of the practice analysis challenges common assumptions and provides recommendations for faculty.

  14. Some Considerations on the Basic Assumptions in Rotordynamics

    Science.gov (United States)

    GENTA, G.; DELPRETE, C.; BRUSA, E.

    1999-10-01

    The dynamic study of rotors is usually performed under a number of assumptions, namely small displacements and rotations, small unbalance and constant angular velocity. The latter assumption can be substituted by a known time history of the spin speed. The present paper develops a general non-linear model which can be used to study the rotordynamic behaviour of both fixed and free rotors without resorting to the mentioned assumptions, and compares the results obtained from a number of non-linear numerical simulations with those computed through the usual linearized approach. It is thus possible to verify that the validity of the rotordynamic models extends to situations in which fairly large unbalances and whirling motions are present and, above all, it is shown that the doubts raised about applying a model based on constant spin speed to the case of free rotors, in which the angular momentum is constant, have no ground. Rotordynamic models can thus be used to study the stability in the small of spinning spacecraft, and the insight obtained from the study of rotors is useful for understanding their attitude dynamics and its interactions with the vibration dynamics.

  15. The Assumption of Implementing the Current Regulating Circuit of the Control Rod Control System of Qinshan Nuclear Power Plant Phase I in Software

    Institute of Scientific and Technical Information of China (English)

    Tan Ping; Huang Cheng

    2016-01-01

    Taking the control rod control system of Qinshan Nuclear Power Plant Phase I as an example, this paper briefly describes the control rod drive mechanism, the principle of the current regulating circuit, and the control scheme; discusses the functions of the set-point module, the current regulating module, and the phase-shift trigger module; and performs a software requirements analysis for them. Functions originally implemented in hardware are re-implemented in software built around a C8051F microcontroller, which improves the integration of the current regulating circuit, saves cabinet space, and conforms to the miniaturized design concept.
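
    As a rough sketch of moving a current-regulation loop into software (in Python rather than C8051F firmware): a discrete PI controller computes a firing/phase-shift command from the measured coil current on each control tick. The gains, limits, and toy plant response are assumptions:

        # Discrete PI current regulator sketch (all gains/limits assumed).
        def make_pi(kp, ki, out_min, out_max):
            integral = 0.0
            def step(setpoint, measured, dt):
                nonlocal integral
                error = setpoint - measured
                integral = max(out_min,
                               min(out_max, integral + ki * error * dt))
                return max(out_min, min(out_max, kp * error + integral))
            return step

        pi = make_pi(kp=0.05, ki=2.0, out_min=0.0, out_max=1.0)
        current = 0.0
        for _ in range(500):                  # simulated 1 ms control ticks
            firing = pi(setpoint=8.0, measured=current, dt=0.001)
            current += (10.0 * firing - current) * 0.05  # toy coil response
        print(f"settled current: {current:.2f} A (setpoint 8.0 A)")

    In firmware the same step function would run in a timer interrupt, with the output driving the phase-shift trigger instead of a toy plant.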

  16. The sexual victimization of men in America: new data challenge old assumptions.

    Science.gov (United States)

    Stemple, Lara; Meyer, Ilan H

    2014-06-01

    We assessed 12-month prevalence and incidence data on sexual victimization in 5 federal surveys that the Bureau of Justice Statistics, the Centers for Disease Control and Prevention, and the Federal Bureau of Investigation conducted independently in 2010 through 2012. We used these data to examine the prevailing assumption that men rarely experience sexual victimization. We concluded that federal surveys detect a high prevalence of sexual victimization among men, in many circumstances similar to the prevalence found among women. We identified factors that perpetuate misperceptions about men's sexual victimization: reliance on traditional gender stereotypes, outdated and inconsistent definitions, and methodological sampling biases that exclude inmates. We recommend changes that move beyond regressive gender assumptions, which can harm both women and men.

  17. Evaluating risk factor assumptions: a simulation-based approach

    Directory of Open Access Journals (Sweden)

    Miglioretti Diana L

    2011-09-01

    Full Text Available Background: Microsimulation models are an important tool for estimating the comparative effectiveness of interventions through prediction of individual-level disease outcomes for a hypothetical population. To estimate the effectiveness of interventions targeted toward high-risk groups, the mechanism by which risk factors influence the natural history of disease must be specified. We propose a method for evaluating these risk factor assumptions as part of model-building. Methods: We used simulation studies to examine the impact of risk factor assumptions on the relative rate (RR) of colorectal cancer (CRC) incidence and mortality for a cohort with a risk factor compared to a cohort without the risk factor, using an extension of the CRC-SPIN model for colorectal cancer. We also compared the impact of changing the age at initiation of screening colonoscopy for different risk mechanisms. Results: Across CRC-specific risk factor mechanisms, the RR of CRC incidence and mortality decreased (towards one) with increasing age. The rate of change in RRs across age groups depended on both the risk factor mechanism and the strength of the risk factor effect. Increased non-CRC mortality attenuated the effect of CRC-specific risk factors on the RR of CRC when both were present. For each risk factor mechanism, earlier initiation of screening resulted in more life years gained, though the magnitude of life years gained varied across risk mechanisms. Conclusions: Simulation studies can provide insight into both the effect of risk factor assumptions on model predictions and the type of data needed to calibrate risk factor models.
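
    As a miniature of the simulation approach: follow cohorts with and without a risk factor through age bands under a chosen risk mechanism and compute the incidence relative rate per band. All rates are invented; note that even under a constant-multiplier mechanism the RR drifts toward one at older ages as susceptible individuals are depleted, the pattern the study reports:

        # Simulate incidence with/without a risk factor; RR by age band.
        # All rates and the constant-RR mechanism are invented.
        import random

        random.seed(7)

        def onsets_by_band(risk_multiplier, n=200_000):
            bands = (("40-59", 0.05), ("60-79", 0.15))  # baseline risks
            counts = {band: 0 for band, _ in bands}
            for _ in range(n):
                for band, base in bands:
                    if random.random() < base * risk_multiplier:
                        counts[band] += 1
                        break   # onset occurred; stop following person
            return counts

        baseline = onsets_by_band(1.0)
        exposed = onsets_by_band(3.0)   # mechanism: constant 3x risk
        for band in ("40-59", "60-79"):
            print(f"age {band}: RR = {exposed[band] / baseline[band]:.2f}")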

  18. THE COMPLEX OF ASSUMPTION CATHEDRAL OF THE ASTRAKHAN KREMLIN

    Directory of Open Access Journals (Sweden)

    Savenkova Aleksandra Igorevna

    2016-08-01

    Full Text Available This article is devoted to an architectural and historical analysis of the constructions forming the complex of the Assumption Cathedral of the Astrakhan Kremlin, which has not previously been considered as a subject of special research. Based on archival sources, photographic materials, publications and on-site investigations of the monuments, the creation history of the complete architectural complex, sustained in the single style of the Muscovite baroque and unique in its composite construction, is considered, and its interpretation in the all-Russian architectural context is offered. Typological features of the individual constructions come to light. The typology of the Prechistinsky bell tower has an untypical architectural solution - "hexagonal structure on octagonal and quadrangular structures". The way of connecting the building of the Cathedral and the chambers by a passage was characteristic of monastic constructions and was exceedingly rare in kremlins, farmsteads and ensembles of city cathedrals. The composite scheme of the Assumption Cathedral includes the Lobnoye Mesto ("the Place of Execution") located on an axis from the west; it is connected with the main building by a quarter-turn with a landing. The only prototype of the structure is the Lobnoye Mesto on Red Square in Moscow. The article considers the version that the Place of Execution emerged on the basis of an earlier existing construction - a tower, "the Peal", which is repeatedly mentioned in written sources in connection with S. Razin's revolt. The metropolitan Sampson, trying to preserve the significance of the Astrakhan metropolitanate, built the Assumption Cathedral and the Place of Execution in direct appeal to a capital prototype, to emphasize the continuity and close connection with Moscow.

  19. Evaluating The Markov Assumption For Web Usage Mining

    DEFF Research Database (Denmark)

    Jespersen, S.; Pedersen, Torben Bach; Thorhauge, J.

    2003-01-01

    Web usage mining concerns the discovery of common browsing patterns, i.e., pages requested in sequence, from web logs. To cope with the enormous amounts of data, several aggregated structures based on statistical models of web surfing have appeared, e.g., the Hypertext Probabilistic Grammar (HPG) model [Borges and Levene, 1999]. These techniques typically rely on the Markov assumption with history depth n, i.e., it is assumed that the next requested page is only dependent on the last n pages visited. This is not always valid, i.e., false browsing patterns may be discovered. However, to our ...

  20. AN EFFICIENT BIT COMMITMENT SCHEME BASED ON FACTORING ASSUMPTION

    Institute of Scientific and Technical Information of China (English)

    Zhong Ming; Yang Yixian

    2001-01-01

    Recently, many bit commitment schemes have been presented. This paper presents a new practical bit commitment scheme based on Schnorr's one-time knowledge proof scheme, where the use of the cut-and-choose method and many random exam candidates in the protocols is replaced by a single challenge number. The proposed bit commitment scheme is therefore more efficient and practical than the previous schemes. In addition, the security of the proposed scheme under the factoring assumption is proved, thus clarifying the cryptographic basis of the proposed scheme.
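
    For orientation, a generic hash-based commit/reveal sketch follows; this is not the paper's factoring-based construction, whose contribution is to replace cut-and-choose and many random exam candidates with a single challenge number:

        # Generic commit/reveal (illustration only; NOT the factoring-based
        # scheme of the paper). Hiding/binding rest on the hash function.
        import hashlib
        import secrets

        def commit(bit):
            nonce = secrets.token_bytes(32)
            c = hashlib.sha256(nonce + bytes([bit])).digest()
            return c, nonce    # publish c now; reveal (nonce, bit) later

        def verify(c, nonce, bit):
            return hashlib.sha256(nonce + bytes([bit])).digest() == c

        c, nonce = commit(1)
        assert verify(c, nonce, 1) and not verify(c, nonce, 0)
        print("commitment opens correctly and binds to the committed bit")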

  1. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality and ...

  2. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    We find and classify a stock of 107 relevant articles across various research disciplines into four scientific discourses: the normative, the interpretive, the critical, and the dialogical, as formulated by Deetz (1996). We find that the normative discourse dominates the IT PPM literature, while few contributions represent the three remaining discourses, which unjustifiably leaves out issues that research could and most probably should investigate. In order to highlight research potentials, limitations, and underlying assumptions of each discourse, we develop four IT PPM metaphors…

  3. Halo-Independent Direct Detection Analyses Without Mass Assumptions

    CERN Document Server

    Anderson, Adam J; Kahn, Yonatan; McCullough, Matthew

    2015-01-01

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the $m_\chi-\sigma_n$ plane. Recently, methods which are independent of the DM halo velocity distribution have been developed which present results in the $v_{min}-\tilde{g}$ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from $v_{min}$ to nuclear recoil momentum ($p_R$), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call $\tilde{h}(p_R)$. The entire family of conventional halo-independent $\tilde{g}(v_{min})$ plots for all DM masses can be read off the single $\tilde{h}(p_R)$ plot through a simple re…
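
    For orientation, the change of variables rests on standard elastic-scattering kinematics; the relations below are a sketch under that assumption, with $\mu_{\chi N}$ the DM-nucleus reduced mass (notation otherwise as in the abstract):

```latex
p_R = \sqrt{2\, m_N E_R}, \qquad
v_{min}(E_R, m_\chi) = \frac{p_R}{2\,\mu_{\chi N}}, \qquad
\mu_{\chi N} = \frac{m_\chi m_N}{m_\chi + m_N}.
```

    Because the dark matter mass enters only through $\mu_{\chi N}$, working at fixed $p_R$ rather than fixed $v_{min}$ removes the need for a fiducial mass choice.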

  4. The contour method cutting assumption: error minimization and correction

    Energy Technology Data Exchange (ETDEWEB)

    Prime, Michael B [Los Alamos National Laboratory; Kastengren, Alan L [ANL

    2010-01-01

    The recently developed contour method can measure a 2-D, cross-sectional residual-stress map. A part is cut in two using a precise and low-stress cutting technique such as electric discharge machining. The contours of the new surfaces created by the cut, which will not be flat if residual stresses are relaxed by the cutting, are then measured and used to calculate the original residual stresses. The precise nature of the assumption about the cut is presented theoretically and evaluated experimentally. Simply assuming a flat cut is overly restrictive and misleading. The critical assumption is that the width of the cut, when measured in the original, undeformed configuration of the body, is constant. Stresses at the cut tip during cutting cause the material to deform, which causes errors. The effect of such cutting errors on the measured stresses is presented and the important parameters are quantified. Experimental procedures for minimizing these errors are presented, along with an iterative finite element procedure to correct for them. The correction procedure is demonstrated on experimental data from a steel beam that was plastically bent to introduce a known profile of residual stresses.

  5. On the role of assumptions in cladistic biogeographical analyses

    Directory of Open Access Journals (Sweden)

    Charles Morphy Dias dos Santos

    2011-01-01

    Full Text Available The biogeographical Assumptions 0, 1, and 2 (respectively A0, A1 and A2 are theoretical terms used to interpret and resolve incongruence in order to find general areagrams. The aim of this paper is to suggest the use of A2 instead of A0 and A1 in solving uncertainties during cladistic biogeographical analyses. In a theoretical example, using Component Analysis and Primary Brooks Parsimony Analysis (primary BPA, A2 allows for the reconstruction of the true sequence of disjunction events within a hypothetical scenario, while A0 adds spurious area relationships. A0, A1 and A2 are interpretations of the relationships between areas, not between taxa. Since area relationships are not equivalent to cladistic relationships, it is inappropriate to use the distributional information of taxa to resolve ambiguous patterns in areagrams, as A0 does. Although ambiguity in areagrams is virtually impossible to explain, A2 is better and more neutral than any other biogeographical assumption.

  6. The extended evolutionary synthesis: its structure, assumptions and predictions

    Science.gov (United States)

    Laland, Kevin N.; Uller, Tobias; Feldman, Marcus W.; Sterelny, Kim; Müller, Gerd B.; Moczek, Armin; Jablonka, Eva; Odling-Smee, John

    2015-01-01

    Scientific activities take place within the structured sets of ideas and assumptions that define a field and its practices. The conceptual framework of evolutionary biology emerged with the Modern Synthesis in the early twentieth century and has since expanded into a highly successful research program to explore the processes of diversification and adaptation. Nonetheless, the ability of that framework satisfactorily to accommodate the rapid advances in developmental biology, genomics and ecology has been questioned. We review some of these arguments, focusing on literatures (evo-devo, developmental plasticity, inclusive inheritance and niche construction) whose implications for evolution can be interpreted in two ways: one that preserves the internal structure of contemporary evolutionary theory and one that points towards an alternative conceptual framework. The latter, which we label the 'extended evolutionary synthesis' (EES), retains the fundaments of evolutionary theory, but differs in its emphasis on the role of constructive processes in development and evolution, and reciprocal portrayals of causation. In the EES, developmental processes, operating through developmental bias, inclusive inheritance and niche construction, share responsibility for the direction and rate of evolution, the origin of character variation and organism–environment complementarity. We spell out the structure, core assumptions and novel predictions of the EES, and show how it can be deployed to stimulate and advance research in those fields that study or use evolutionary biology. PMID:26246559

  7. DDH-like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike;

    2011-01-01

    The DDH assumption can be generalized to d-DDH, and we show in the generic group model that d-DDH is harder than DDH. This means that virtually any application of DDH can now be realized with the same (amortized) efficiency, but under a potentially weaker assumption. On the negative side, we also show that d-DDH, just like DDH, is easy in bilinear groups. This motivates our suggestion of a different type of assumption, the d-vector DDH problems (VDDH), which are based on f(X) = X^d, but with a twist to avoid the problems with reducible polynomials. We show in the generic group model that VDDH is hard in bilinear groups and that in fact the problems become harder with increasing d, and hence form an infinite hierarchy. We show that hardness of VDDH implies CCA-secure encryption, efficient Naor-Reingold style pseudorandom functions, and auxiliary input secure encryption, a strong form of leakage resilience. This can be seen…

  8. Time derivatives of the spectrum: Relaxing the stationarity assumption

    Science.gov (United States)

    Prieto, G. A.; Thomson, D. J.; Vernon, F. L.

    2005-12-01

    Spectrum analysis of seismic waveforms has played a significant role in the understanding of multiple aspects of Earth structure and earthquake source physics. In recent years the multitaper spectrum estimation approach (Thomson, 1982) has been applied to geophysical problems, providing not only reliable estimates of the spectrum but also estimates of spectral uncertainties (Thomson and Chave, 1991). However, these improved spectral estimates were developed under the assumption of local stationarity and provide an incomplete description of the observed process. Owing to the intrinsic attenuation of the Earth, the amplitudes, and thus the frequency contents, change with time as waves pass through a seismic station. There have been substantial improvements in techniques for analyzing non-stationary signals, including wavelet decomposition, the Wigner-Ville spectrum and the dual-frequency spectrum. We apply one of the recently developed techniques, Quadratic Inverse Theory (Thomson, 1990, 1994), combined with the multitaper technique, to examine the time derivatives of the spectrum. If the spectrum is reasonably white in a certain bandwidth, QI theory allows us to estimate the derivatives of the spectrum at each frequency. We test the approach on synthetic signals and apply it to records of small earthquakes at local distances. This is a first attempt to extend classical spectrum analysis beyond the stationarity assumption that is generally made.
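
    A minimal multitaper estimate in the spirit of Thomson (1982) can be sketched with SciPy's DPSS windows. The parameter choices here (time-bandwidth NW = 4, K = 2NW − 1 tapers) are illustrative assumptions, and the quadratic-inverse derivative estimation itself is not reproduced:

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, fs, NW=4.0):
    """Average the eigenspectra of K Slepian-tapered copies of the signal."""
    N = len(x)
    K = int(2 * NW - 1)
    tapers = dpss(N, NW, Kmax=K)                      # (K, N) DPSS tapers
    eigenspectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    return np.fft.rfftfreq(N, 1.0 / fs), eigenspectra.mean(axis=0) / fs

fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
x = np.sin(2 * np.pi * 5.0 * t) * np.exp(-0.2 * t)    # amplitude decays with time
freqs, psd = multitaper_psd(x, fs)                    # a stationary estimate only
```

    The decaying test signal makes the limitation concrete: the estimator returns a single time-averaged spectrum, which is what motivates estimating spectral time derivatives.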

  9. Relaxing the zero-sum assumption in neutral biodiversity theory.

    Science.gov (United States)

    Haegeman, Bart; Etienne, Rampal S

    2008-05-21

    The zero-sum assumption is one of the ingredients of the standard neutral model of biodiversity by Hubbell. It states that the community is saturated all the time, which in this model means that the total number of individuals in the community is constant over time, and therefore introduces a coupling between species abundances. It was shown recently that a neutral model with independent species, and thus without any coupling between species abundances, has the same sampling formula (given a fixed number of individuals in the sample) as the standard model [Etienne, R.S., Alonso, D., McKane, A.J., 2007. The zero-sum assumption in neutral biodiversity theory. J. Theor. Biol. 248, 522-536]. The equilibria of both models are therefore equivalent from a practical point of view. Here we show that this equivalence can be extended to a class of neutral models with density-dependence on the community-level. This result can be interpreted as robustness of the model, i.e. insensitivity of the model to the precise interaction of the species in a neutral community. It can also be interpreted as a lack of resolution, as different mechanisms of interactions between neutral species cannot be distinguished using only a single snapshot of species abundance data.
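
    The zero-sum dynamic can be sketched in a few lines. The following is a toy Moran-type update under standard neutral-model conventions (theta plays the role of the fundamental biodiversity number; names and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

def zero_sum_step(abundances, theta=2.0):
    """One death immediately balanced by one birth, so the community
    size J stays constant: the zero-sum (saturation) assumption."""
    J = abundances.sum()
    dead = rng.choice(len(abundances), p=abundances / J)
    abundances[dead] -= 1
    if rng.random() < theta / (theta + J - 1):        # a new species arrives
        abundances = np.append(abundances, 1)
    else:                                             # a resident reproduces
        parent = rng.choice(len(abundances), p=abundances / (J - 1))
        abundances[parent] += 1
    return abundances[abundances > 0]

community = np.array([50, 30, 20])
for _ in range(1000):
    community = zero_sum_step(community)
```

    The equivalence result quoted above says that dropping this coupling (letting species fluctuate independently) leaves the equilibrium sampling formula unchanged.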

  10. What lies beneath: underlying assumptions in bioimage analysis.

    Science.gov (United States)

    Pridmore, Tony P; French, Andrew P; Pound, Michael P

    2012-12-01

    The need for plant image analysis tools is established and has led to a steadily expanding literature and set of software tools. This is encouraging, but raises a question: how does a plant scientist with no detailed knowledge or experience of image analysis methods choose the right tool(s) for the task at hand, or satisfy themselves that a suggested approach is appropriate? We believe that too great an emphasis is currently being placed on low-level mechanisms and software environments. In this opinion article we propose that a renewed focus on the core theories and algorithms used, and in particular the assumptions upon which they rely, will better equip plant scientists to evaluate the available resources.

  11. Assumptions of Customer Knowledge Enablement in the Open Innovation Process

    Directory of Open Access Journals (Sweden)

    Jokubauskienė Raminta

    2017-08-01

    Full Text Available In the scientific literature, open innovation is one of the most effective means to innovate and gain a competitive advantage. In practice, there is a variety of open innovation activities, but customers nevertheless stand as the cornerstone in this area, since customers' knowledge is one of the most important sources of new knowledge and ideas. When evaluating the context in which open innovation and customer knowledge enablement interact, the importance of customer knowledge management must be taken into account. It is increasingly emphasized that customer knowledge management facilitates the creation of innovations. However, the other factors that influence open innovation, and with it customer knowledge management, also require examination. This article presents a theoretical model that reveals the assumptions of the open innovation process and their impact on firm performance.

  12. Dynamic Group Diffie-Hellman Key Exchange under standard assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Bresson, Emmanuel; Chevassut, Olivier; Pointcheval, David

    2002-02-14

    Authenticated Diffie-Hellman key exchange allows two principals communicating over a public network, and each holding public-private keys, to agree on a shared secret value. In this paper we study the natural extension of this cryptographic problem to a group of principals. We begin from existing formal security models and refine them to incorporate major missing details (e.g., strong-corruption and concurrent sessions). Within this model we define the execution of a protocol for authenticated dynamic group Diffie-Hellman and show that it is provably secure under the decisional Diffie-Hellman assumption. Our security result holds in the standard model and thus provides better security guarantees than previously published results in the random oracle model.
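
    The primitive the group protocol extends is ordinary two-party Diffie-Hellman key agreement, sketched below with deliberately toy parameters (a Mersenne prime and a small generator chosen for illustration only; the paper's group protocol and its authentication layer are not reproduced):

```python
import secrets

p = 2**127 - 1   # toy prime; real deployments use vetted >=2048-bit groups
g = 5            # toy generator, illustrative only

a = secrets.randbelow(p - 2) + 1      # Alice's private exponent
b = secrets.randbelow(p - 2) + 1      # Bob's private exponent
A = pow(g, a, p)                      # values exchanged over the public network
B = pow(g, b, p)

# Each side derives the same shared secret; the decisional Diffie-Hellman
# assumption is what makes this value look random to an eavesdropper.
assert pow(B, a, p) == pow(A, b, p)
```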

  13. Cost and Performance Assumptions for Modeling Electricity Generation Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Tidball, Rick [ICF International, Fairfax, VA (United States); Bluestein, Joel [ICF International, Fairfax, VA (United States); Rodriguez, Nick [ICF International, Fairfax, VA (United States); Knoke, Stu [ICF International, Fairfax, VA (United States)

    2010-11-01

    The goal of this project was to compare and contrast utility scale power plant characteristics used in data sets that support energy market models. Characteristics include both technology cost and technology performance projections to the year 2050. Cost parameters include installed capital costs and operation and maintenance (O&M) costs. Performance parameters include plant size, heat rate, capacity factor or availability factor, and plant lifetime. Conventional, renewable, and emerging electricity generating technologies were considered. Six data sets, each associated with a different model, were selected. Two of the data sets represent modeled results, not direct model inputs. These two data sets include cost and performance improvements that result from increased deployment as well as resulting capacity factors estimated from particular model runs; other data sets represent model input data. For the technologies contained in each data set, the levelized cost of energy (LCOE) was also evaluated, according to published cost, performance, and fuel assumptions.
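
    The LCOE evaluation mentioned here follows the usual discounted-cost-over-discounted-generation form. The sketch below is a generic textbook version with invented example numbers, not the report's parameter sets:

```python
def lcoe(capital, annual_om, annual_fuel, annual_mwh, life_years, rate):
    """Levelized cost of energy in $/MWh: discounted lifetime costs
    divided by discounted lifetime generation."""
    disc = [(1.0 + rate) ** -t for t in range(1, life_years + 1)]
    costs = capital + (annual_om + annual_fuel) * sum(disc)
    energy = annual_mwh * sum(disc)
    return costs / energy

# e.g. a 100 MW plant at a 40% capacity factor, 30-year life, 7% discount rate
print(round(lcoe(150e6, 3e6, 8e6, 100 * 8760 * 0.40, 30, 0.07), 2))  # ~66 $/MWh
```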

  15. Validating modelling assumptions of alpha particles in electrostatic turbulence

    CERN Document Server

    Wilkie, George; Highcock, Edmund; Dorland, William

    2014-01-01

    To rigorously model fast ions in fusion plasmas, a non-Maxwellian equilibrium distribution must be used. In this work, the response of high-energy alpha particles to electrostatic turbulence has been analyzed for several different tokamak parameters. Our results are consistent with known scalings and with experimental evidence that alpha particles are generally well confined, with confinement times on the order of several seconds. It is also confirmed that the effect of alphas on the turbulence is negligible at realistically low concentrations, consistent with linear theory. It is demonstrated that the usual practice of using a high-temperature Maxwellian gives incorrect estimates of the radial alpha-particle flux, and a method of correcting it is provided. Furthermore, we see that the timescales associated with collisions and transport compete at moderate energies, calling into question the assumption, used in the derivation of the slowing-down distribution, that alpha particles remain confined to a flux surface.
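
    The slowing-down distribution questioned in the final sentence has the classic form below (a sketch of the standard result; $v_c$ is the critical velocity set by electron drag, $v_\alpha$ the alpha birth velocity, and $\Theta$ the step function):

```latex
f_{sd}(v) \;\propto\; \frac{\Theta(v_\alpha - v)}{v^{3} + v_c^{3}}.
```

    Its derivation assumes each alpha slows down on the flux surface where it was born, which is precisely the assumption the competing collision and transport timescales undermine.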

  16. Exploring gravitational statistics not based on quantum dynamical assumptions

    CERN Document Server

    Mandrin, P A

    2016-01-01

    Despite considerable progress in several approaches to quantum gravity, there remain uncertainties on the conceptual level. One issue concerns the different roles played by space and time in the canonical quantum formalism. This issue occurs because the Hamilton-Jacobi dynamics is being quantised. The question then arises whether additional physically relevant states could exist which cannot be represented in the canonical form or as a partition function. For this reason, the author has explored a statistical approach (NDA) which is not based on quantum dynamical assumptions and does not require space-time splitting boundary conditions either. For dimension 3+1 and under thermal equilibrium, NDA simplifies to a path integral model. However, the general case of NDA cannot be written as a partition function. As a test of NDA, one recovers general relativity at low curvature and quantum field theory in the flat space-time approximation. Related paper: arxiv:1505.03719.

  17. Uncovering Metaethical Assumptions in Bioethical Discourse across Cultures.

    Science.gov (United States)

    Sullivan, Laura Specker

    2016-03-01

    Much of bioethical discourse now takes place across cultures. This does not mean that cross-cultural understanding has increased. Many cross-cultural bioethical discussions are marked by entrenched disagreement about whether and why local practices are justified. In this paper, I argue that a major reason for these entrenched disagreements is that problematic metaethical commitments are hidden in these cross-cultural discourses. Using the issue of informed consent in East Asia as an example of one such discourse, I analyze two representative positions in the discussion and identify their metaethical commitments. I suggest that the metaethical assumptions of these positions result from their shared method of ethical justification: moral principlism. I then show why moral principlism is problematic in cross-cultural analyses and propose a more useful method for pursuing ethical justification across cultures.

  18. Linear irreversible heat engines based on local equilibrium assumptions

    Science.gov (United States)

    Izumida, Yuki; Okuda, Koji

    2015-08-01

    We formulate an endoreversible finite-time Carnot cycle model based on the assumptions of local equilibrium and constant energy flux, where the efficiency and the power are expressed in terms of the thermodynamic variables of the working substance. By analyzing the entropy production rate caused by the heat transfer in each isothermal process during the cycle, and using the endoreversible condition applied to the linear response regime, we identify the thermodynamic flux and force of the present system and obtain a linear relation that connects them. We calculate the efficiency at maximum power in the linear response regime by using the linear relation, which agrees with the Curzon-Ahlborn (CA) efficiency known as the upper bound in this regime. This reason is also elucidated by rewriting our model into the form of the Onsager relations, where our model turns out to satisfy the tight-coupling condition leading to the CA efficiency.
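
    For reference, the endpoint of the calculation is the standard Curzon-Ahlborn result, which in the linear response (small $\eta_C$) regime reduces to the $\eta_C/2$ value recovered by the authors:

```latex
\eta_{CA} = 1 - \sqrt{\frac{T_c}{T_h}}
          = \frac{\eta_C}{2} + \frac{\eta_C^{2}}{8} + O\!\left(\eta_C^{3}\right),
\qquad \eta_C = 1 - \frac{T_c}{T_h}.
```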

  19. New media in strategy – mapping assumptions in the field

    DEFF Research Database (Denmark)

    Gulbrandsen, Ib Tunby; Plesner, Ursula; Raviola, Elena

    2017-01-01

    There is plenty of empirical evidence for claiming that new media make a difference for how strategy is conceived and executed. Furthermore, there is a rapidly growing body of literature that engages with this theme and offers recommendations regarding appropriate strategic actions in relation to new media. By contrast, relatively little attention has been paid to the assumptions behind strategic thinking in relation to new media. This article reviews the most influential strategy journals, asking how new media are conceptualized. It is shown that strategy scholars have a tendency to place new media in relation either to the outside or to the inside of the organization. After discussing the literature along these dimensions (deterministic/voluntaristic and internal/external), the article argues for a sociomaterial approach to strategy and strategy making and for using the concept of affordances…

  20. Experimental assessment of unvalidated assumptions in classical plasticity theory.

    Energy Technology Data Exchange (ETDEWEB)

    Brannon, Rebecca Moss (University of Utah, Salt Lake City, UT); Burghardt, Jeffrey A. (University of Utah, Salt Lake City, UT); Bauer, Stephen J.; Bronowski, David R.

    2009-01-01

    This report investigates the validity of several key assumptions in classical plasticity theory regarding material response to changes in the loading direction. Three metals, two rock types, and one ceramic were subjected to non-standard loading directions, and the resulting strain response increments were displayed in Gudehus diagrams to illustrate the approximation error of classical plasticity theories. A rigorous mathematical framework for fitting classical theories to the data, thus quantifying the error, is provided. Further data analysis techniques are presented that allow testing for the effect of changes in loading direction without having to use a new sample and for inferring the yield normal and flow directions without having to measure the yield surface. Though the data are inconclusive, there is indication that classical, incrementally linear, plasticity theory may be inadequate over a certain range of loading directions. This range of loading directions also coincides with loading directions that are known to produce a physically inadmissible instability for any nonassociative plasticity model.

  1. Statistical Tests of the PTHA Poisson Assumption for Submarine Landslides

    Science.gov (United States)

    Geist, E. L.; Chaytor, J. D.; Parsons, T.; Ten Brink, U. S.

    2012-12-01

    We demonstrate that a sequence of dated mass transport deposits (MTDs) can provide information to statistically test whether or not submarine landslides associated with these deposits conform to a Poisson model of occurrence. Probabilistic tsunami hazard analysis (PTHA) most often assumes Poissonian occurrence for all sources, with an exponential distribution of return times. Using dates that define the bounds of individual MTDs, we first describe likelihood and Monte Carlo methods of parameter estimation for a suite of candidate occurrence models (Poisson, lognormal, gamma, Brownian Passage Time). In addition to age-dating uncertainty, both methods incorporate uncertainty caused by the open time intervals: i.e., before the first and after the last event to the present. Accounting for these open intervals is critical when there are a small number of observed events. The optimal occurrence model is selected according to both the Akaike Information Criteria (AIC) and Akaike's Bayesian Information Criterion (ABIC). In addition, the likelihood ratio test can be performed on occurrence models from the same family: e.g., the gamma model relative to the exponential model of return time distribution. Parameter estimation, model selection, and hypothesis testing are performed on data from two IODP holes in the northern Gulf of Mexico that penetrated a total of 14 MTDs, some of which are correlated between the two holes. Each of these events has been assigned an age based on microfossil zonations and magnetostratigraphic datums. Results from these sites indicate that the Poisson assumption is likely valid. However, parameter estimation results using the likelihood method for one of the sites suggest that the events may have occurred quasi-periodically. Methods developed in this study provide tools with which one can determine both the rate of occurrence and the statistical validity of the Poisson assumption when submarine landslides are included in PTHA.
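
    The model-selection step can be sketched as follows: fit candidate return-time distributions and compare AIC values. The synthetic data and SciPy fitting below are illustrative assumptions; the study's likelihood treatment of open intervals and age-dating uncertainty is not reproduced:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
times = rng.exponential(scale=5000.0, size=13)   # synthetic inter-event times (yr)

def aic(log_likelihood, n_params):
    return 2 * n_params - 2 * log_likelihood

loc_e, scale_e = stats.expon.fit(times, floc=0)          # Poisson occurrence model
aic_exp = aic(stats.expon.logpdf(times, loc_e, scale_e).sum(), 1)

a, loc_g, scale_g = stats.gamma.fit(times, floc=0)       # shape > 1: quasi-periodic
aic_gam = aic(stats.gamma.logpdf(times, a, loc_g, scale_g).sum(), 2)

print(f"exponential AIC = {aic_exp:.1f}, gamma AIC = {aic_gam:.1f}")
```

    Because exponential return times are a special case of the gamma model (shape = 1), the likelihood ratio test mentioned in the abstract is also available for this pair.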

  2. Expressing Environment Assumptions and Real-time Requirements for a Distributed Embedded System with Shared Variables

    DEFF Research Database (Denmark)

    Tjell, Simon; Fernandes, João Miguel

    2008-01-01

    In a distributed embedded system, it is often necessary to share variables among its computing nodes to allow the distribution of control algorithms. It is therefore necessary to include a component in each node that provides the service of variable sharing. For that type of component, this paper discusses how to create a Colored Petri Nets (CPN) model that formally expresses the following elements in a clearly separated structure: (1) assumptions about the behavior of the environment of the component, (2) real-time requirements for the component, and (3) a possible solution in terms of an algorithm…

  3. Weak convergence of Jacobian determinants under asymmetric assumptions

    Directory of Open Access Journals (Sweden)

    Teresa Alberico

    2012-05-01

    Full Text Available Let $\Omega$ be a sufficiently smooth bounded open set in $\mathbb{R}^2$ and let $f_k=(u_k,v_k)$ and $f=(u,v)$ be mappings in the Sobolev space $W^{1,2}(\Omega,\mathbb{R}^2)$. We prove that if the sequence of Jacobians $J_{f_k}$ converges to a measure $\mu$ in the sense of measures, and if one allows different assumptions on the two components of $f_k$ and $f$, e.g. $u_k \rightharpoonup u$ weakly in $W^{1,2}(\Omega)$ and $v_k \rightharpoonup v$ weakly in $W^{1,q}(\Omega)$ for some $q\in(1,2)$, then $d\mu = J_f\,dz$. Moreover, we show that this result is optimal in the sense that the conclusion fails for $q=1$. On the other hand, we prove that the identity remains valid for $q=1$ if one requires that $u_k$ converges weakly to $u$ in a Zygmund-Sobolev space with a slightly higher degree of regularity than $W^{1,2}(\Omega)$, namely $u_k \rightharpoonup u$ weakly in $W^{1,L^2\log^\alpha L}(\Omega)$ for some $\alpha>1$.

  4. On Some Unwarranted Tacit Assumptions in Cognitive Neuroscience†

    Science.gov (United States)

    Mausfeld, Rainer

    2011-01-01

    The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper advances several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question of why such conceptions nevertheless prevail in many areas of psychology. The paper argues that these conceptions are rooted in four aspects of our common-sense understanding of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input–output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to become a largely a-theoretical and data-driven endeavor, while at the same time enhancing the prospects for widespread public appreciation of their empirical findings. PMID:22435062

  5. In vitro versus in vivo culture sensitivities: an unchecked assumption?

    Directory of Open Access Journals (Sweden)

    Prasad V

    2013-03-01

    Full Text Available No abstract available. Article truncated at 150 words. Case Presentation: A patient presents to urgent care with the symptoms of a urinary tract infection (UTI). The urinalysis is consistent with infection, and the urine culture is sent to the lab. In the interim, a physician prescribes empiric treatment and sends the patient home. Two days later, the culture is positive for E. coli resistant to the drug prescribed (ciprofloxacin, minimum inhibitory concentration (MIC) 64 μg/ml), but attempts to contact the patient by telephone are unsuccessful. The patient returns the call two weeks later to say that the infection resolved without sequelae. Discussion: Many clinicians have had the experience of treatment success in the setting of known antibiotic resistance and, conversely, treatment failure in the setting of known sensitivity. Such anomalies, and the empiric research described here, force us to revisit assumptions about the relationship between in vivo and in vitro drug responses. When it comes to the utility of microbiology…

  6. Finite Element Simulations to Explore Assumptions in Kolsky Bar Experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Crum, Justin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-05

    The chief purpose of this project has been to develop a set of finite element models that explore some of the assumptions in the experimental set-up and data reduction of the Kolsky bar experiment. In brief, the Kolsky bar, sometimes referred to as the split Hopkinson pressure bar, is an experimental apparatus used to study the mechanical properties of materials at high strain rates. Kolsky bars can be constructed to conduct experiments in tension or compression, both of which are studied in this paper. The basic operation of the tension Kolsky bar is as follows: compressed air is inserted into the barrel that contains the striker; the striker accelerates towards the left and strikes the left end of the barrel, producing a tensile stress wave that propagates first through the barrel and then down the incident bar, into the specimen, and finally the transmission bar. In the compression case, the striker instead travels to the right and impacts the incident bar directly. As the stress wave travels through an interface (e.g., the incident bar to specimen connection), a portion of the pulse is transmitted and the rest reflected. The incident pulse, as well as the transmitted and reflected pulses, are picked up by two strain gauges installed on the incident and transmission bars. By interpreting the data acquired by these strain gauges, the stress/strain behavior of the specimen can be determined.
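
    The data reduction those strain-gauge records feed is conventionally the one-wave analysis sketched below ($c_0$ and $E$ are the bar wave speed and modulus, $A_b/A_s$ the bar-to-specimen area ratio, $L_s$ the specimen gauge length; the subscripts $r$ and $t$ denote the reflected and transmitted strain pulses):

```latex
\dot{\varepsilon}_s(t) = -\frac{2 c_0}{L_s}\,\varepsilon_r(t), \qquad
\varepsilon_s(t) = \int_0^t \dot{\varepsilon}_s(\tau)\, d\tau, \qquad
\sigma_s(t) = E\,\frac{A_b}{A_s}\,\varepsilon_t(t).
```

    These relations presuppose one-dimensional wave propagation and specimen stress equilibrium, which are exactly the assumptions the finite element models are built to probe.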

  7. Cleanup of contaminated soil -- Unreal risk assumptions: Contaminant degradation

    Energy Technology Data Exchange (ETDEWEB)

    Schiffman, A. [New Jersey Department of Environmental Protection, Ewing, NJ (United States)

    1995-12-31

    Exposure assessments for the development of risk-based soil cleanup standards or criteria assume that contaminant mass in soil is infinite and conservative (constant concentration). This assumption is unrealistic for most organic chemicals, since contaminant mass is lost from soil and ground water as organic chemicals degrade. Factors to correct for chemical mass lost by degradation are derived from first-order kinetics for 85 organic chemicals commonly listed by USEPA and state agencies. Soil cleanup criteria based on constant concentration are then corrected for contaminant mass lost. For many chemicals, accounting for mass lost yields large correction factors to risk-based soil concentrations. For degradation in ground water and soil, correction factors range from slightly greater than one to several orders of magnitude. The long exposure durations normally used in exposure assessments (25 to 70 years) result in large correction factors to standards even for carcinogenic chemicals with long half-lives. For the ground water pathway, a typical soil criterion for TCE of 1 mg/kg would be corrected to 11 mg/kg. For noncarcinogens, correcting for mass lost means that the risk algorithms used to set soil cleanup requirements are inapplicable for many chemicals, especially for long periods of exposure.
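
    One plausible reading of the first-order correction is sketched below: under exponential decay, the time-averaged concentration over an exposure duration T is lower than the initial concentration, so the allowable criterion scales up by the reciprocal ratio. The function and example values are illustrative assumptions, not the paper's published factors:

```python
import numpy as np

def degradation_correction(half_life_yr, exposure_yr):
    """Factor by which a constant-concentration criterion may be relaxed
    when the chemical actually decays first-order during the exposure."""
    k = np.log(2.0) / half_life_yr          # first-order rate constant
    kT = k * exposure_yr
    return kT / (1.0 - np.exp(-kT))         # C0 / time-averaged concentration

print(round(degradation_correction(5.0, 30.0), 1))   # ~4.2 for a 5-yr half-life
```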

  8. Stream of consciousness: Quantum and biochemical assumptions regarding psychopathology.

    Science.gov (United States)

    Tonello, Lucio; Cocchi, Massimo; Gabrielli, Fabio; Tuszynski, Jack A

    2017-04-01

    The accepted paradigms of mainstream neuropsychiatry appear to be incompletely adequate and in various cases offer equivocal analyses. However, a growing number of new approaches are being proposed that suggest the emergence of paradigm shifts in this area. In particular, quantum theories of mind, brain and consciousness seem to offer a profound change to the current approaches. Unfortunately these quantum paradigms harbor at least two serious problems. First, they are simply models, theories, and assumptions, with no convincing experiments supporting their claims. Second, they deviate from contemporary mainstream views of psychiatric illness and do so in revolutionary ways. We suggest a possible way to integrate experimental neuroscience with quantum models in order to address outstanding issues in psychopathology. A key role is played by the phenomenon called the "stream of consciousness", which can be linked to the so-called "Gamma Synchrony" (GS), which is clearly demonstrated by EEG data. In our novel proposal, a unipolar depressed patient could be seen as a subject with an altered stream of consciousness. In particular, some clues suggest that depression is linked to an "increased power" stream of consciousness. It is additionally suggested that such an approach to depression might be extended to psychopathology in general with potential benefits to diagnostics and therapeutics in neuropsychiatry.

  9. PKreport: report generation for checking population pharmacokinetic model assumptions

    Directory of Open Access Journals (Sweden)

    Li Jun

    2011-05-01

    Full Text Available Abstract Background: Graphics play an important and unique role in population pharmacokinetic (PopPK) model building: exploring hidden structure among data before modeling, evaluating model fit, and validating results after modeling. Results: This paper describes a new R package called PKreport, which generates a collection of plots and statistics for testing model assumptions, visualizing data and diagnosing models. The metric system is utilized as the currency for communicating between data sets and the package to generate special-purpose plots. It provides ways to match output from diverse software such as NONMEM, Monolix, the R nlme package, etc. The package is implemented with an S4 class hierarchy and offers an efficient way to access the output from NONMEM 7. The final reports use the web browser as the user interface for managing and visualizing plots. Conclusions: PKreport provides (1) a flexible and efficient R class to store and retrieve NONMEM 7 output; (2) automated plots for visualizing data and models; (3) automatically generated R scripts used to create the plots; (4) an archive-oriented management tool for storing, retrieving and modifying figures; and (5) high-quality graphs based on the R packages lattice and ggplot2. The general architecture, running environment and statistical methods can readily be extended with the R class hierarchy. PKreport is free to download at http://cran.r-project.org/web/packages/PKreport/index.html.

  10. Testing the habituation assumption underlying models of parasitoid foraging behavior

    Science.gov (United States)

    Abram, Katrina; Colazza, Stefano; Peri, Ezio

    2017-01-01

    Background Habituation, a form of non-associative learning, has several well-defined characteristics that apply to a wide range of physiological and behavioral responses in many organisms. In classic patch time allocation models, habituation is considered to be a major mechanistic component of parasitoid behavioral strategies. However, parasitoid behavioral responses to host cues have not previously been tested for the known, specific characteristics of habituation. Methods In the laboratory, we tested whether the foraging behavior of the egg parasitoid Trissolcus basalis shows specific characteristics of habituation in response to consecutive encounters with patches of host (Nezara viridula) chemical contact cues (footprints), in particular: (i) a training interval-dependent decline in response intensity, and (ii) a training interval-dependent recovery of the response. Results As would be expected of a habituated response, wasps trained at higher frequencies decreased their behavioral response to host footprints more quickly and to a greater degree than those trained at low frequencies, and subsequently showed a more rapid, although partial, recovery of their behavioral response to host footprints. This putative habituation learning could not be blocked by cold anesthesia, ingestion of an ATPase inhibitor, or ingestion of a protein synthesis inhibitor. Discussion Our study provides support for the assumption that diminishing responses of parasitoids to chemical indicators of host presence constitutes habituation as opposed to sensory fatigue, and provides a preliminary basis for exploring the underlying mechanisms. PMID:28321365

  11. Observing gravitational-wave transient GW150914 with minimal assumptions

    Science.gov (United States)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blackburn, L.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chatterji, S.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y.; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Clark, M.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.; DeRosa, R. T.; De Rosa, R.; DeSalvo, R.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. 
C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. P.; Flaminio, R.; Fletcher, M.; Fournier, J.-D.; Franco, S.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Haas, R.; Hacker, J. J.

    2016-06-01

    The gravitational-wave signal GW150914 was first identified on September 14, 2015, by searches for short-duration gravitational-wave transients. These searches identify time-correlated transients in multiple detectors with minimal assumptions about the signal morphology, allowing them to be sensitive to gravitational waves emitted by a wide range of sources including binary black hole mergers. Over the observational period from September 12 to October 20, 2015, these transient searches were sensitive to binary black hole mergers similar to GW150914 to an average distance of ~600 Mpc. In this paper, we describe the analyses that first detected GW150914 as well as the parameter estimation and waveform reconstruction techniques that initially identified GW150914 as the merger of two black holes. We find that the reconstructed waveform is consistent with the signal from a binary black hole merger with a chirp mass of ~30 M⊙ and a total mass before merger of ~70 M⊙ in the detector frame.

  12. Projecting the future of Canada's population: assumptions, implications, and policy

    Directory of Open Access Journals (Sweden)

    Beaujot, Roderic

    2003-01-01

    Full Text Available After considering the assumptions for fertility, mortality and international migration, this paper looks at the implications of the evolving demographics for population growth, labour force, retirement, and population distribution. With the help of policies favouring gender equity and supporting families of various types, fertility in Canada could avoid the particularly low levels seen in some countries and remain closer to 1.6 births per woman. The prognosis in terms of both risk factors and treatment suggests further reductions in mortality, toward a life expectancy of 85. On immigration, there is political interest in levels as high as 270,000 per year, while levels of 150,000 correspond to the long-term post-war average. The future will see slower population growth, driven more by migration than by natural increase. International migration of some 225,000 per year can enable Canada to avoid population decline and sustain the size of the labour force, but all scenarios show much change in the relative size of the retired population compared to the labour force. According to the ratio of persons aged 20-64 to those aged 65 and over, there were seven persons at labour force ages per person at retirement age in 1951, compared to five in 2001 and probably fewer than 2.5 in 2051. Growth due more to migration than to natural increase will accentuate the urbanization trend and the unevenness of the population distribution over space. Past projections have under-projected the mortality improvements and their impact on the relative size of older age groups. Policies regarding fertility, mortality and migration could aim at avoiding population decline and reducing the effect of aging, but an institutional basis for policy that would seek to endogenize population is lacking.

  13. Providing security assurance in line with national DBT assumptions

    Science.gov (United States)

    Bajramovic, Edita; Gupta, Deeksha

    2017-01-01

    As worldwide energy requirements grow, alongside climate change and energy security concerns, States are considering nuclear power to meet their electricity requirements and decrease their dependence on carbon fuels. New nuclear power plants (NPPs) must have comprehensive cybersecurity measures integrated into their design, structure, and processes, because in the absence of effective cybersecurity measures the impact of nuclear security incidents can be severe. Some current nuclear facilities were not specifically designed and constructed to deal with the new threats, including targeted cyberattacks. Thus, newcomer countries must consider the Design Basis Threat (DBT) as one of the security fundamentals when designing physical and cyber protection systems for nuclear facilities. IAEA NSS 10 describes the DBT as a "comprehensive description of the motivation, intentions and capabilities of potential adversaries against which protection systems are designed and evaluated". Nowadays, many threat actors, including hacktivists, insider threats, cyber criminals, and state and non-state groups (terrorists), pose security risks to nuclear facilities. Threat assumptions are made on a national level; consequently, threat assessment closely affects the design structures of nuclear facilities. Recent security incidents, e.g. the Stuxnet worm (an advanced persistent threat) and the theft of sensitive information at a South Korean nuclear power plant (an insider threat), have shown that such attacks should be considered a top threat to nuclear facilities. Therefore, the cybersecurity context is essential for the secure and safe use of nuclear power. In addition, States should include multiple DBT scenarios in order to protect various target materials, types of facilities, and adversary objectives. Development of a comprehensive DBT is a precondition for the establishment and further improvement of domestic state nuclear-related regulations in the

  14. School Principals' Assumptions about Human Nature: Implications for Leadership in Turkey

    Science.gov (United States)

    Sabanci, Ali

    2008-01-01

    This article considers principals' assumptions about human nature in Turkey and the relationship between the assumptions held and the leadership style adopted in schools. The findings show that school principals hold Y-type assumptions and prefer a relationship-oriented style in their relations with assistant principals. However, both principals…

  15. Coefficient Alpha as an Estimate of Test Reliability under Violation of Two Assumptions.

    Science.gov (United States)

    Zimmerman, Donald W.; And Others

    1993-01-01

    Coefficient alpha was examined through computer simulation as an estimate of test reliability under violation of two assumptions. Coefficient alpha underestimated reliability under violation of the assumption of essential tau-equivalence of subtest scores and overestimated it under violation of the assumption of uncorrelated subtest error scores.…
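
    The statistic under study is computed as below; the simulation design itself is not reproduced, and the example score matrix is invented. Note that the formula equals reliability only when the two assumptions named in the abstract hold:

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_subjects, k_items) matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)       # variance of total scores
    return k / (k - 1.0) * (1.0 - item_vars / total_var)

scores = np.array([[3, 4, 3], [2, 2, 3], [5, 4, 4], [1, 2, 1]])
print(round(cronbach_alpha(scores), 3))
```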

  16. Challenging Assumptions of International Public Relations: When Government Is the Most Important Public.

    Science.gov (United States)

    Taylor, Maureen; Kent, Michael L.

    1999-01-01

    Explores assumptions underlying Malaysia's and the United States' public-relations practice. Finds many assumptions guiding Western theories and practices are not applicable to other countries. Examines the assumption that the practice of public relations targets a variety of key organizational publics. Advances international public-relations…

  17. Educational Technology as a Subversive Activity: Questioning Assumptions Related to Teaching and Leading with Technology

    Science.gov (United States)

    Kruger-Ross, Matthew J.; Holcomb, Lori B.

    2012-01-01

    The use of educational technologies is grounded in the assumptions of teachers, learners, and administrators. Assumptions are choices that structure our understandings and help us make meaning. Current advances in Web 2.0 and social media technologies challenge our assumptions about teaching and learning. The intersection of technology and…

  18. Exploring the Influence of Ethnicity, Age, and Trauma on Prisoners' World Assumptions

    Science.gov (United States)

    Gibson, Sandy

    2011-01-01

    In this study, the author explores world assumptions of prisoners, how these assumptions vary by ethnicity and age, and whether trauma history affects world assumptions. A random sample of young and old prisoners, matched for prison location, was drawn from the New Jersey Department of Corrections prison population. Age and ethnicity had…

  19. Simplified subsurface modelling: data assimilation and violated model assumptions

    Science.gov (United States)

    Erdal, Daniel; Lange, Natascha; Neuweiler, Insa

    2017-04-01

    Integrated models are gaining more and more attention in hydrological modelling as they can better represent the interaction between different compartments. Naturally, these models come along with larger numbers of unknowns and requirements on computational resources compared to stand-alone models. If large model domains are to be represented, e.g. on catchment scale, the resolution of the numerical grid needs to be reduced or the model itself needs to be simplified. Both approaches lead to a reduced ability to reproduce the present processes. This lack of model accuracy may be compensated by using data assimilation methods. In these methods observations are used to update the model states, and optionally model parameters as well, in order to reduce the model error induced by the imposed simplifications. What is unclear is whether these methods combined with strongly simplified models result in completely data-driven models or if they can even be used to make adequate predictions of the model state for times when no observations are available. In the current work we consider the combined groundwater and unsaturated zone, which can be modelled in a physically consistent way using 3D-models solving the Richards equation. For use in simple predictions, however, simpler approaches may be considered. The question investigated here is whether a simpler model, in which the groundwater is modelled as a horizontal 2D-model and the unsaturated zones as a few sparse 1D-columns, can be used within an Ensemble Kalman filter to give predictions of groundwater levels and unsaturated fluxes. This is tested under conditions where the feedback between the two model-compartments are large (e.g. shallow groundwater table) and the simplification assumptions are clearly violated. Such a case may be a steep hill-slope or pumping wells, creating lateral fluxes in the unsaturated zone, or strong heterogeneous structures creating unaccounted flows in both the saturated and unsaturated
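
    The assimilation machinery referred to is the stochastic Ensemble Kalman filter analysis step, sketched generically below (the dimensions and the toy observation operator are invented; the coupled 2D/1D subsurface model is not reproduced):

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observations;
    H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) error cov."""
    n_obs, n_ens = len(y), X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)             # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                         # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)      # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
    return X + K @ (Y - H @ X)                        # analysis ensemble

rng = np.random.default_rng(0)
X = rng.normal(10.0, 1.0, size=(3, 50))   # e.g. 3 groundwater heads, 50 members
H = np.array([[1.0, 0.0, 0.0]])           # observe the first head only
Xa = enkf_update(X, np.array([9.2]), H, 0.01 * np.eye(1), rng)
```

    The update corrects the simplified model toward the data; the open question raised above is whether such corrections carry over to forecast periods without observations.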

  20. 'Being Explicit about Underlying Values, Assumptions and Views when Designing for Children in the IDC Community’

    DEFF Research Database (Denmark)

    Skovbjerg, Helle Marie

    2016-01-01

    In this full-day workshop we want to discuss how the IDC community can make more explicit the underlying assumptions, values and views regarding children and childhood that enter into design decisions. What assumptions do IDC designers and researchers make, and how can they be supported in reflecting on those assumptions and their possible influence on design decisions? How can we make the assumptions explicit, discuss them in the IDC community, and use the discussion to develop higher quality design and research? The workshop will support discussion between researchers, designers and practitioners, and intends to share different approaches for uncovering and reflecting on values, assumptions and views about children and childhood in design.

  1. Regression assumptions in clinical psychology research practice-a systematic review of common misconceptions.

    Science.gov (United States)

    Ernst, Anja F; Albers, Casper J

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. They lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. The findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper calls for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking.
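
    The misconception the review quantifies is easy to demonstrate: with a uniform predictor and normal errors, a normality test on the residuals passes while the same test on the raw variable fails, even though the regression is perfectly valid. The toy data and use of SciPy's Shapiro-Wilk test below are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.uniform(0.0, 10.0, 200)              # predictor: deliberately non-normal
y = 1.5 * x + rng.normal(0.0, 2.0, 200)      # the errors, not the variables, are normal

X = np.column_stack([np.ones_like(x), x])    # fit y = b0 + b1*x by ordinary least squares
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta

print(stats.shapiro(residuals).pvalue)   # typically large: residual normality holds
print(stats.shapiro(x).pvalue)           # tiny: but this was never the assumption
```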

  2. Impact of violated high-dose refuge assumptions on evolution of Bt resistance.

    Science.gov (United States)

    Campagne, Pascal; Smouse, Peter E; Pasquet, Rémy; Silvain, Jean-François; Le Ru, Bruno; Van den Berg, Johnnie

    2016-04-01

    Transgenic crops expressing Bacillus thuringiensis (Bt) toxins have been widely and successfully deployed for the control of target pests, while allowing a substantial reduction in insecticide use. The evolution of resistance (a heritable decrease in susceptibility to Bt toxins) can pose a threat to sustained control of target pests, but a high-dose refuge (HDR) management strategy has been key to delaying countervailing evolution of Bt resistance. The HDR strategy relies on the mating frequency between susceptible and resistant individuals, so either partial dominance of resistant alleles or nonrandom mating in the pest population itself could elevate the pace of resistance evolution. Using classic Wright-Fisher genetic models, we investigated the impact of deviations from standard refuge model assumptions on resistance evolution in the pest populations. We show that when Bt selection is strong, even deviations from random mating and/or strictly recessive resistance that are below the threshold of detection can yield dramatic increases in the pace of resistance evolution. Resistance evolution is hastened whenever the order of magnitude of model violations exceeds the initial frequency of resistant alleles. We also show that the existence of a fitness cost for resistant individuals on the refuge crop cannot easily overcome the effect of violated HDR assumptions. We propose a parametrically explicit framework that enables both comparison of various field situations and model inference. Using this model, we propose novel empiric estimators of the pace of resistance evolution (and time to loss of control), whose simple calculation relies on the observed change in resistance allele frequency.
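
    The qualitative effect is reproducible with a deterministic one-locus recursion. The sketch below (parameter values invented, with no fitness cost assumed on the refuge crop) shows how a small departure from strictly recessive resistance accelerates the spread of the R allele:

```python
def resistance_trajectory(p0, s, h, refuge, generations):
    """Deterministic Wright-Fisher-type recursion for the R allele frequency.
    On Bt plants: w_RR = 1, w_RS = 1 - s*(1 - h), w_SS = 1 - s;
    on refuge plants every genotype has fitness 1 (no cost)."""
    mix = lambda w_bt: (1.0 - refuge) * w_bt + refuge * 1.0   # exposure-weighted
    w_rr, w_rs, w_ss = mix(1.0), mix(1.0 - s * (1.0 - h)), mix(1.0 - s)
    p = p0
    for _ in range(generations):
        q = 1.0 - p
        w_bar = p * p * w_rr + 2 * p * q * w_rs + q * q * w_ss
        p = (p * p * w_rr + p * q * w_rs) / w_bar              # random mating assumed
    return p

print(resistance_trajectory(1e-3, s=0.95, h=0.00, refuge=0.2, generations=25))
print(resistance_trajectory(1e-3, s=0.95, h=0.05, refuge=0.2, generations=25))  # far higher
```

    With h = 0 the heterozygotes gain nothing on Bt plants and resistance barely moves; even h = 0.05 lets heterozygotes outcompete susceptibles and the allele climbs by more than an order of magnitude, mirroring the paper's point about sub-detection-threshold violations.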

  3. Estimating risks and relative risks in case-base studies under the assumptions of gene-environment independence and Hardy-Weinberg equilibrium.

    Science.gov (United States)

    Chui, Tina Tsz-Ting; Lee, Wen-Chung

    2014-01-01

    Many diseases result from the interactions between genes and the environment. An efficient method has been proposed for a case-control study to estimate the genetic and environmental main effects and their interactions, which exploits the assumptions of gene-environment independence and Hardy-Weinberg equilibrium. To estimate the absolute and relative risks, one needs to resort to an alternative design: the case-base study. In this paper, the authors show how to analyze a case-base study under the above dual assumptions. The approach is based on a conditional logistic regression of cases matched to counterfactual controls. It can be easily fitted with readily available statistical packages. When the dual assumptions are met, the method is approximately unbiased and has adequate coverage probabilities for confidence intervals. It also results in smaller variances and shorter confidence intervals as compared with a previous method for a case-base study which imposes neither assumption.
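
    As a concrete aside, one of the two assumptions the method exploits can be checked directly; the following hedged Python sketch (genotype counts are made up) runs a chi-square goodness-of-fit test of Hardy-Weinberg equilibrium.

      # Minimal sketch: chi-square check of Hardy-Weinberg equilibrium in a
      # sample, one of the two assumptions the case-base method relies on.
      import numpy as np
      from scipy import stats

      def hwe_chisq(n_AA, n_Aa, n_aa):
          n = n_AA + n_Aa + n_aa
          p = (2 * n_AA + n_Aa) / (2 * n)                 # frequency of allele A
          expected = n * np.array([p**2, 2 * p * (1 - p), (1 - p)**2])
          observed = np.array([n_AA, n_Aa, n_aa])
          chi2 = ((observed - expected) ** 2 / expected).sum()
          return chi2, stats.chi2.sf(chi2, df=1)          # 3 classes - 1 - 1 estimate

      chi2, pval = hwe_chisq(298, 489, 213)               # illustrative counts
      print(f"chi2 = {chi2:.2f}, p = {pval:.3f}")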

  4. Problematic assumptions have slowed down depression research: why symptoms, not syndromes are the way forward

    Directory of Open Access Journals (Sweden)

    Eiko I Fried

    2015-03-01

    Major Depression (MD) is a highly heterogeneous diagnostic category. Diverse symptoms such as sad mood, anhedonia, and fatigue are routinely added to an unweighted sum-score, and cutoffs are used to distinguish between depressed participants and healthy controls. Researchers then investigate outcome variables like MD risk factors, biomarkers, and treatment response in such samples. These practices presuppose that (1) depression is a discrete condition, and that (2) symptoms are interchangeable indicators of this latent disorder. Here I review these two assumptions, elucidate their historical roots, show how deeply engrained they are in psychological and psychiatric research, and document that they contrast with evidence. Depression is not a consistent syndrome with clearly demarcated boundaries, and depression symptoms are not interchangeable indicators of an underlying disorder. Current research practices lump individuals with very different problems into one category, which has contributed to the remarkably slow progress in key research domains such as the development of efficacious antidepressants or the identification of biomarkers for depression. The recently proposed network framework offers an alternative to the problematic assumptions. MD is not understood as a distinct condition, but as a heterogeneous symptom cluster that substantially overlaps with other syndromes such as anxiety disorders. MD is not framed as an underlying disease with a number of equivalent indicators, but as a network of symptoms that have direct causal influence on each other: insomnia can cause fatigue, which then triggers concentration and psychomotor problems. This approach offers new opportunities for constructing an empirically based classification system and has broad implications for future research.

  5. Making Sense out of Sex Stereotypes in Advertising: A Feminist Analysis of Assumptions.

    Science.gov (United States)

    Ferrante, Karlene

    Sexism and racism in advertising have been well documented, but feminist research aimed at social change must go beyond existing content analyses to ask how advertising is created. Analysis of the "mirror assumption" (advertising reflects society) and the "gender assumption" (advertising speaks in a male voice to female…

  6. Comparative Interpretation of Classical and Keynesian Fiscal Policies (Assumptions, Principles and Primary Opinions

    Directory of Open Access Journals (Sweden)

    Engin Oner

    2015-06-01

    In the Classical School, founded by Adam Smith, which gives prominence to supply and adopts an approach of unbiased finance, the economy is always in a state of full-employment equilibrium. In this system of thought, whose main philosophy is budget balance, prices and wages are flexible, public debt is regarded as an extraordinary instrument, and interference of the state in economic and social life is frowned upon. In line with the views of classical thought, classical fiscal policy rests on three basic assumptions: the "Consumer State" assumption, the assumption that "public expenditures are always ineffectual", and the assumption of the "impartiality of the taxes and expenditure policies implemented by the state". The Keynesian School, founded by John Maynard Keynes, on the other hand gives prominence to demand, adopts the approach of functional finance, and asserts that underemployment and over-employment equilibria exist in the economy alongside full-employment equilibrium, that problems cannot be solved through the invisible hand, that prices and wages are rigid, that interference of the state is essential, and that at this point fiscal policies have to be utilized effectively. Keynesian fiscal policy likewise depends on three primary assumptions: the "Filter State" assumption, the assumption that "public expenditures are sometimes effective and sometimes ineffective or neutral", and the assumption that "the tax, debt and expenditure policies of the state can never be impartial".

  7. Implicit Assumptions in Special Education Policy: Promoting Full Inclusion for Students with Learning Disabilities

    Science.gov (United States)

    Kirby, Moira

    2017-01-01

    Introduction: Everyday millions of students in the United States receive special education services. Special education is an institution shaped by societal norms. Inherent in these norms are implicit assumptions regarding disability and the nature of special education services. The two dominant implicit assumptions evident in the American…

  8. Keeping Things Simple: Why the Human Development Index Should Not Diverge from Its Equal Weights Assumption

    Science.gov (United States)

    Stapleton, Lee M.; Garrod, Guy D.

    2007-01-01

    Using a range of statistical criteria rooted in Information Theory we show that there is little justification for relaxing the equal weights assumption underlying the United Nation's Human Development Index (HDI) even if the true HDI diverges significantly from this assumption. Put differently, the additional model complexity that unequal weights…
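
    For concreteness, a minimal sketch of the calculation at issue follows (component values are hypothetical, and the weighted arithmetic mean shown is one common formulation of the index).

      # Minimal sketch: the HDI as a weighted mean of its three component
      # indices. The equal-weights assumption sets w = (1/3, 1/3, 1/3); the
      # paper argues deviating from it buys little. Values are invented.
      import numpy as np

      def hdi(health, education, income, weights=(1/3, 1/3, 1/3)):
          w = np.asarray(weights, dtype=float)
          assert np.isclose(w.sum(), 1.0), "weights must sum to one"
          return float(w @ np.array([health, education, income]))

      components = (0.85, 0.78, 0.70)                 # hypothetical indices
      print("equal weights:  ", round(hdi(*components), 4))
      print("unequal weights:", round(hdi(*components, weights=(0.5, 0.3, 0.2)), 4))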

  9. Being Explicit about Underlying Values, Assumptions and Views when Designing for Children in the IDC Community

    DEFF Research Database (Denmark)

    Skovbjerg, Helle Marie; Bekker, Tilde; Barendregt, Wolmet

    2016-01-01

    …on those assumptions and the possible influences on their design decisions? How can we make the assumptions explicit, discuss them in the IDC community and use the discussion to develop higher quality design and research? The workshop will support discussion between researchers, designers and practitioners…

  10. A Proposal for Testing Local Realism Without Using Assumptions Related to Hidden Variable States

    Science.gov (United States)

    Ryff, Luiz Carlos

    1996-01-01

    A feasible experiment is discussed which allows us to prove Bell's theorem for two particles without using an inequality. The experiment could be used to test local realism against quantum mechanics without the introduction of additional assumptions related to hidden-variable states. Only assumptions based on direct experimental observation are needed.

  11. Making Foundational Assumptions Transparent: Framing the Discussion about Group Communication and Influence

    Science.gov (United States)

    Meyers, Renee A.; Seibold, David R.

    2009-01-01

    In this article, the authors seek to augment Dean Hewes's (1986, 1996) intriguing bracketing and admirable larger effort to "return to basic theorizing in the study of group communication" by making transparent the foundational, and debatable, assumptions that underlie those models. Although these assumptions are addressed indirectly by Hewes, the…

  12. Sensitivity of TRIM projections to management, harvest, yield, and stocking adjustment assumptions.

    Science.gov (United States)

    Susan J. Alexander

    1991-01-01

    The Timber Resource Inventory Model (TRIM) was used to make several projections of forest industry timber supply for the Douglas-fir region. The sensitivity of these projections to assumptions about management and yields is discussed. A base run is compared to runs in which yields were altered, stocking adjustment was eliminated, harvest assumptions were changed, and...

  13. Assessing Key Assumptions of Network Meta-Analysis: A Review of Methods

    Science.gov (United States)

    Donegan, Sarah; Williamson, Paula; D'Alessandro, Umberto; Tudur Smith, Catrin

    2013-01-01

    Background: Homogeneity and consistency assumptions underlie network meta-analysis (NMA). Methods exist to assess the assumptions but they are rarely and poorly applied. We review and illustrate methods to assess homogeneity and consistency. Methods: Eligible articles focussed on indirect comparison or NMA methodology. Articles were sought by…

  14. Teaching Lessons in Exclusion: Researchers' Assumptions and the Ideology of Normality

    Science.gov (United States)

    Benincasa, Luciana

    2012-01-01

    Filling in a research questionnaire means coming into contact with the researchers' assumptions. In this sense filling in a questionnaire may be described as a learning situation. In this paper I carry out discourse analysis of selected questionnaire items from a number of studies, in order to highlight underlying values and assumptions, and their…

  15. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    Science.gov (United States)

    Ernst, Anja F.

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for a heightened awareness of and increased transparency in the reporting of statistical assumption checking. PMID:28533971

  16. Assessing the assumption of symmetric proximity measures in the context of multidimensional scaling.

    Science.gov (United States)

    Kelley, Ken

    2004-01-01

    Applications of multidimensional scaling often make the assumption of symmetry for the population matrix of proximity measures. Although the likelihood of such an assumption holding true varies from one area of research to another, formal assessment of such an assumption has received little attention. The present article develops a nonparametric procedure that can be used in a confirmatory fashion or in an exploratory fashion in order to probabilistically assess the assumption of population symmetry for proximity measures in a multidimensional scaling context. The proposed procedure makes use of the bootstrap technique and alleviates the assumptions of parametric statistical procedures. Computer code for R and S-Plus is included in an appendix in order to carry out the proposed procedures.
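
    The article supplies R and S-Plus code; the following Python sketch is a hedged re-expression of the same idea, assuming a setup in which each of n subjects contributes one k x k proximity matrix, and bootstrapping subjects to obtain a confidence interval for an asymmetry statistic.

      # Hedged sketch: bootstrap assessment of symmetry for proximity
      # matrices. Data and the asymmetry statistic are illustrative choices.
      import numpy as np

      def asymmetry(mat):
          """Size of the skew-symmetric part relative to the symmetric part."""
          sym = (mat + mat.T) / 2.0
          skew = (mat - mat.T) / 2.0
          return np.linalg.norm(skew) / np.linalg.norm(sym)

      def bootstrap_asymmetry(matrices, n_boot=2000, seed=0):
          rng = np.random.default_rng(seed)
          n = len(matrices)
          out = np.empty(n_boot)
          for b in range(n_boot):
              idx = rng.integers(0, n, size=n)          # resample subjects
              out[b] = asymmetry(matrices[idx].mean(axis=0))
          return out

      rng = np.random.default_rng(1)                    # 30 subjects, 5 objects
      base = rng.random((5, 5))                         # mildly asymmetric
      data = np.array([base + 0.05 * rng.random((5, 5)) for _ in range(30)])
      boot = bootstrap_asymmetry(data)
      print("observed:", round(asymmetry(data.mean(axis=0)), 3),
            " 95% CI:", np.round(np.percentile(boot, [2.5, 97.5]), 3))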

  17. Selecting between-sample RNA-Seq normalization methods from the perspective of their assumptions.

    Science.gov (United States)

    Evans, Ciaran; Hardin, Johanna; Stoebel, Daniel M

    2017-02-27

    RNA-Seq is a widely used method for studying the behavior of genes under different biological conditions. An essential step in an RNA-Seq study is normalization, in which raw data are adjusted to account for factors that prevent direct comparison of expression measures. Errors in normalization can have a significant impact on downstream analysis, such as inflated false positives in differential expression analysis. An underemphasized feature of normalization is the assumptions on which the methods rely and how the validity of these assumptions can have a substantial impact on the performance of the methods. In this article, we explain how assumptions provide the link between raw RNA-Seq read counts and meaningful measures of gene expression. We examine normalization methods from the perspective of their assumptions, as an understanding of methodological assumptions is necessary for choosing methods appropriate for the data at hand. Furthermore, we discuss why normalization methods perform poorly when their assumptions are violated and how this causes problems in subsequent analysis. To analyze a biological experiment, researchers must select a normalization method with assumptions that are met and that produces a meaningful measure of expression for the given experiment.
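
    As an illustration of how an assumption enters a normalization method, here is a hedged Python sketch of a median-of-ratios size-factor calculation (in the spirit of DESeq, simplified); the result is meaningful only under the assumption that most genes are not differentially expressed.

      # Hedged sketch of median-of-ratios normalization. Counts are invented;
      # zero counts would need masking, which is omitted for brevity.
      import numpy as np

      def median_of_ratios_size_factors(counts):
          """counts: genes x samples matrix of positive raw read counts."""
          log_counts = np.log(counts.astype(float))
          ref = log_counts.mean(axis=1)                 # per-gene log geometric mean
          med = np.median(log_counts - ref[:, None], axis=0)
          return np.exp(med)                            # one size factor per sample

      counts = np.array([[100, 210,  95],
                         [ 50, 105,  48],
                         [ 20,  44,  19],
                         [800, 900, 820]])              # last row: a "DE" gene
      sf = median_of_ratios_size_factors(counts)
      print("size factors:", np.round(sf, 3))
      print("normalized:", np.round(counts / sf, 1))    # depth effect removed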

  18. An Exploration of Dental Students' Assumptions About Community-Based Clinical Experiences.

    Science.gov (United States)

    Major, Nicole; McQuistan, Michelle R

    2016-03-01

    The aim of this study was to ascertain which assumptions dental students recalled feeling prior to beginning community-based clinical experiences and whether those assumptions were fulfilled or challenged. All fourth-year students at the University of Iowa College of Dentistry & Dental Clinics participate in community-based clinical experiences. At the completion of their rotations, they write a guided reflection paper detailing the assumptions they had prior to beginning their rotations and assessing the accuracy of their assumptions. For this qualitative descriptive study, the 218 papers from three classes (2011-13) were analyzed for common themes. The results showed that the students had a variety of assumptions about their rotations. They were apprehensive about working with challenging patients, performing procedures for which they had minimal experience, and working too slowly. In contrast, they looked forward to improving their clinical and patient management skills and knowledge. Other assumptions involved the site (e.g., the equipment/facility would be outdated; protocols/procedures would be similar to the dental school's). Upon reflection, students reported experiences that both fulfilled and challenged their assumptions. Some continued to feel apprehensive about treating certain patient populations, while others found it easier than anticipated. Students were able to treat multiple patients per day, which led to increased speed and patient management skills. However, some reported challenges with time management. Similarly, students were surprised to discover some clinics were new/updated although some had limited instruments and materials. Based on this study's findings about students' recalled assumptions and reflective experiences, educators should consider assessing and addressing their students' assumptions prior to beginning community-based dental education experiences.

  19. Some Finite Sample Properties and Assumptions of Methods for Determining Treatment Effects

    DEFF Research Database (Denmark)

    Petrovski, Erik

    2016-01-01

    This paper will compare assumptions and properties of select methods for determining treatment effects with Monte Carlo simulation. The comparison will highlight the pros and cons of using one method over another and the assumptions that researchers need to make for the method they choose. Three popular methods for determining treatment effects were chosen: ordinary least squares regression, propensity score matching, and inverse probability weighting. The assumptions and properties tested across these methods are: unconfoundedness, differences in average treatment effects and treatment effects on the treated, and overlap…
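
    A hedged Monte Carlo sketch in this spirit (simulation settings are illustrative, not the paper's design) compares a naive difference in means, OLS adjustment, and inverse probability weighting on a confounded treatment.

      # Illustrative Monte Carlo: with a confounder x, the naive contrast is
      # biased while OLS and IPW recover the true effect tau = 1.
      import numpy as np

      rng = np.random.default_rng(0)

      def one_draw(n=2000, tau=1.0):
          x = rng.normal(size=n)                        # confounder
          p = 1 / (1 + np.exp(-1.5 * x))                # treatment propensity
          d = rng.random(n) < p
          y = tau * d + 2.0 * x + rng.normal(size=n)
          naive = y[d].mean() - y[~d].mean()
          X = np.column_stack([np.ones(n), d, x])       # OLS of y on (1, d, x)
          ols = np.linalg.lstsq(X, y, rcond=None)[0][1]
          # IPW with the true propensity (in practice it must be estimated)
          ipw = np.mean(d * y / p) - np.mean((~d) * y / (1 - p))
          return naive, ols, ipw

      draws = np.array([one_draw() for _ in range(500)])
      for name, col in zip(["naive", "OLS", "IPW"], draws.T):
          print(f"{name:>5}: bias = {col.mean() - 1.0:+.3f}")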

  20. Troubling 'lived experience': a post-structural critique of mental health nursing qualitative research assumptions.

    Science.gov (United States)

    Grant, A

    2014-08-01

    Qualitative studies in mental health nursing research deploying the 'lived experience' construct are often written on the basis of conventional qualitative inquiry assumptions. These include the presentation of the 'authentic voice' of research participants, related to their 'lived experience' and underpinned by a meta-assumption of the 'metaphysics of presence'. This set of assumptions is critiqued on the basis of contemporary post-structural qualitative scholarship. Implications for mental health nursing qualitative research emerging from this critique are described in relation to illustrative published work, and some benefits and challenges for researchers embracing post-structural sensibilities are outlined.

  1. Tests of the frozen-flux and tangentially geostrophic assumptions using magnetic satellite data

    DEFF Research Database (Denmark)

    Chulliat, A.; Olsen, Nils; Sabaka, T.

    …the very large number of flows explaining the observed secular variation under the frozen-flux assumption alone. More recently, it has been shown that the combined frozen-flux and tangentially geostrophic assumptions translate into constraints on the secular variation whose mathematics are now well understood. Using these constraints, we test the combined frozen-flux and tangentially geostrophic assumptions against recent, high-precision magnetic data provided by the Ørsted and CHAMP satellites. The methodology involves building constrained field models using least-squares methods. Two types of models…

  2. Estimating Alarm Thresholds for Process Monitoring Data under Different Assumptions about the Data Generating Mechanism

    Directory of Open Access Journals (Sweden)

    Tom Burr

    2013-01-01

    Process monitoring (PM) for nuclear safeguards sometimes requires estimation of thresholds corresponding to small false alarm rates. Threshold estimation dates to the 1920s with the Shewhart control chart; however, because possible new roles for PM are being evaluated in nuclear safeguards, it is timely to consider modern model selection options in the context of threshold estimation. One of the possible new PM roles involves PM residuals, where a residual is defined as residual = data − prediction. This paper reviews alarm threshold estimation, introduces model selection options, and considers a range of assumptions regarding the data-generating mechanism for PM residuals. Two PM examples from nuclear safeguards are included to motivate the need for alarm threshold estimation. The first example involves mixtures of probability distributions that arise in solution monitoring, which is a common type of PM. The second example involves periodic partial cleanout of in-process inventory, leading to challenging structure in the time series of PM residuals.
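
    To ground the idea, the sketch below (illustrative data and target rate, not from the paper) contrasts a threshold derived under a Gaussian assumption with an empirical quantile when the residuals actually follow a two-component mixture.

      # Hedged sketch: alarm-threshold estimation for PM residuals under two
      # data-generating assumptions. Residuals here are a simulated mixture,
      # as can arise in solution monitoring.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      residuals = np.concatenate([rng.normal(0.0, 1.0, 4750),
                                  rng.normal(0.0, 3.0, 250)])

      alpha = 0.001                                     # desired false alarm rate
      t_gauss = residuals.mean() + stats.norm.ppf(1 - alpha) * residuals.std()
      t_empir = np.quantile(residuals, 1 - alpha)

      for name, t in [("Gaussian assumption", t_gauss),
                      ("empirical quantile", t_empir)]:
          rate = (residuals > t).mean()
          print(f"{name:>20}: threshold = {t:.2f}, realized rate = {rate:.4f}")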

  3. A new scenario framework for climate change research: the concept of shared climate policy assumptions

    NARCIS (Netherlands)

    Kriegler, E.; Edmonds, J.; Hallegatte, S.; Ebi, K.L.; Kram, T.; Riahi, K.; Winkler, J.; van Vuuren, Detlef|info:eu-repo/dai/nl/11522016X

    2014-01-01

    The new scenario framework facilitates the coupling of multiple socioeconomic reference pathways with climate model products using the representative concentration pathways. This will allow for improved assessment of climate impacts, adaptation and mitigation. Assumptions about climate policy play a…

  4. Who needs the assumption of opportunistic behavior? Transaction cost economics does not!

    DEFF Research Database (Denmark)

    Koch, Carsten Allan

    2000-01-01

    The assumption of opportunistic behavior, familiar from transaction cost economics, has been and remains highly controversial. But opportunistic behavior, albeit undoubtedly an extremely important form of motivation, is not a necessary condition for the contractual problems studied by transaction...

  5. Washington International Renewable Energy Conference (WIREC) 2008 Pledges. Methodology and Assumptions Summary

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, Bill [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bilello, Daniel E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cowlin, Shannon C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wise, Alison [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2008-08-01

    This report describes the methodology and assumptions used by NREL in quantifying the potential CO2 reductions resulting from more than 140 governments, international organizations, and private-sector representatives pledging to advance the uptake of renewable energy.

  6. Learning disabilities theory and Soviet psychology: a comparison of basic assumptions.

    Science.gov (United States)

    Coles, G S

    1982-09-01

    Critics both within and outside the Learning Disabilities (LD) field have pointed to the weaknesses of LD theory. Beginning with the premise that a significant problem of LD theory has been its failure to explore fully its fundamental assumptions, this paper examines a number of these assumptions about individual and social development, cognition, and learning. These assumptions are compared with a contrasting body of premises found in Soviet psychology, particularly in the works of Vygotsky, Leontiev, and Luria. An examination of the basic assumptions of LD theory and Soviet psychology shows that a major difference lies in their respective nondialectical and dialectical interpretation of the relationship of social factors and cognition, learning, and neurological development.

  7. A Test of Major Assumptions about Behavior Change: A Comprehensive Look at the Effects of Passive and Active HIV-Prevention Interventions Since the Beginning of the Epidemic

    Science.gov (United States)

    Albarracin, Dolores; Gillette, Jeffrey C.; Earl, Allison N.; Glasman, Laura R.; Durantini, Marta R.; Ho, Moon-Ho

    2005-01-01

    This meta-analysis tested the major theoretical assumptions about behavior change by examining the outcomes and mediating mechanisms of different preventive strategies in a sample of 354 HIV-prevention interventions and 99 control groups, spanning the past 17 years. There were 2 main conclusions from this extensive review. First, the most…

  8. Shattering Man’s Fundamental Assumptions in Don DeLillo’s Falling Man

    Directory of Open Access Journals (Sweden)

    Hazim Adnan Hashim

    2016-09-01

    The present study addresses the effects of traumatic events such as the September 11 attacks on victims' fundamental assumptions. These beliefs or assumptions provide individuals with expectations about the world and their sense of self-worth; they thus ground people's sense of security, stability, and orientation. The September 11 terrorist attacks in the U.S.A. were deeply traumatic for Americans because they fundamentally changed their understanding of many aspects of life. The attacks led many individuals to build new kinds of beliefs and assumptions about themselves and the world. Many writers have written about the human ordeals that followed this incident. Don DeLillo's Falling Man reflects the traumatic repercussions of this disaster on Americans' fundamental assumptions. The objective of this study is to examine the novel from the perspective of the trauma that has afflicted the victims' fundamental understandings of the world and the self. Individuals' fundamental understandings can be changed or modified by exposure to certain types of events such as war, terrorism, political violence, or even a sense of alienation. The Assumptive World theory of Ronnie Janoff-Bulman is used as a framework to study the traumatic experience of the characters in Falling Man. The significance of the study lies in providing a new perspective within the field of trauma that can help trauma victims adopt alternative assumptions, or reshape their previous ones, to heal from traumatic effects.

  9. Unrealistic Assumptions in Economics: an Analysis under the Logic of Socioeconomic Processes

    Directory of Open Access Journals (Sweden)

    Leonardo Ivarola

    2014-11-01

    The realism of assumptions is an ongoing debate within the philosophy of economics. One of the most referenced papers in this matter belongs to Milton Friedman, who defends the use of unrealistic assumptions not only as a pragmatic matter, but also because of the intrinsic difficulty of determining the extent of realism. On the other hand, realists have criticized (and still do today) the use of unrealistic assumptions such as rational choice, perfect information, and homogeneous goods. However, they did not accompany their statements with a proper epistemological argument to support their positions. This work aims to show that the realism of (a particular sort of) assumptions is clearly relevant when examining economic models, since the system under study (real economies) is compatible not with a logic of invariance and of mechanisms, but with a logic of possibility trees. Because of this, models function not as tools for predicting outcomes, but as representations of alternative scenarios, whose similarity to the real world must be examined in terms of the verisimilitude of a class of model assumptions.

  10. Post-traumatic stress and world assumptions: the effects of religious coping.

    Science.gov (United States)

    Zukerman, Gil; Korn, Liat

    2014-12-01

    Religiosity has been shown to moderate the negative effects of traumatic event experiences. The current study was designed to examine the relationship between post-traumatic stress (PTS) following traumatic event exposure; world assumptions, defined as basic cognitive schemas regarding the world and self; and religious coping, conceptualized as drawing on religious beliefs and practices for understanding and dealing with life stressors. This study examined 777 Israeli undergraduate students who completed several questionnaires which sampled individual world assumptions and religious coping in addition to measuring PTS, as manifested by the PTSD checklist. Results indicate that positive religious coping was significantly associated with more positive world assumptions, while negative religious coping was significantly associated with more negative world assumptions. Additionally, negative world assumptions were significantly associated with more avoidance symptoms, while reporting higher rates of traumatic event exposure was significantly associated with more hyper-arousal. These findings suggest that religious-related cognitive schemas directly affect world assumptions by creating protective shields that may prevent the negative effects of confronting an extreme negative experience.

  11. Assessing framing assumptions in quantitative health impact assessments: a housing intervention example.

    Science.gov (United States)

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2013-09-01

    Health impact assessment (HIA) is often used to determine ex ante the health impact of an environmental policy or an environmental intervention. Underpinning any HIA is the framing assumption, which defines the causal pathways mapping environmental exposures to health outcomes. The sensitivity of the HIA to the framing assumptions is often ignored. A novel method based on fuzzy cognitive map (FCM) is developed to quantify the framing assumptions in the assessment stage of a HIA, and is then applied to a housing intervention (tightening insulation) as a case-study. Framing assumptions of the case-study were identified through a literature search of Ovid Medline (1948-2011). The FCM approach was used to identify the key variables that have the most influence in a HIA. Changes in air-tightness, ventilation, indoor air quality and mould/humidity have been identified as having the most influence on health. The FCM approach is widely applicable and can be used to inform the formulation of the framing assumptions in any quantitative HIA of environmental interventions. We argue that it is necessary to explore and quantify framing assumptions prior to conducting a detailed quantitative HIA during the assessment stage.
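
    A hedged sketch of the FCM machinery follows; the concepts mirror those named above, but the edge weights, update rule and values are purely illustrative, not the paper's.

      # Illustrative fuzzy cognitive map: concepts are nodes, signed weighted
      # edges encode assumed causal pathways, and states are iterated through
      # a sigmoid squashing function. W[i, j] is the influence of i on j.
      import numpy as np

      concepts = ["air-tightness", "ventilation", "indoor air quality",
                  "mould/humidity", "health"]
      W = np.array([[ 0.0, -0.7, -0.3,  0.4,  0.0],
                    [ 0.0,  0.0,  0.6, -0.5,  0.0],
                    [ 0.0,  0.0,  0.0,  0.0,  0.7],
                    [ 0.0,  0.0, -0.4,  0.0, -0.6],
                    [ 0.0,  0.0,  0.0,  0.0,  0.0]])

      def run_fcm(state, W, clamp=None, steps=50):
          sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
          state = np.asarray(state, dtype=float).copy()
          for _ in range(steps):
              state = sigmoid(state @ W + state)   # keep memory of the state
              if clamp:                            # hold driver concepts fixed
                  for idx, val in clamp.items():
                      state[idx] = val
          return state

      baseline = run_fcm(np.full(5, 0.5), W)
      tight = run_fcm(np.full(5, 0.5), W, clamp={0: 0.9})  # tighter envelope
      for name, b, t in zip(concepts, baseline, tight):
          print(f"{name:>18}: baseline {b:.2f} -> intervention {t:.2f}")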

  12. Impacts of cloud overlap assumptions on radiative budgets and heating fields in convective regions

    Science.gov (United States)

    Wang, XiaoCong; Liu, YiMin; Bao, Qing

    2016-01-01

    Impacts of cloud overlap assumptions on radiative budgets and heating fields are explored with the aid of a cloud-resolving model (CRM), which provided cloud geometry as well as cloud micro and macro properties. Large-scale forcing data to drive the CRM are from TRMM Kwajalein Experiment and the Global Atmospheric Research Program's Atlantic Tropical Experiment field campaigns during which abundant convective systems were observed. The investigated overlap assumptions include those that were traditional and widely used in the past and the one that was recently addressed by Hogan and Illingworth (2000), in which the vertically projected cloud fraction is expressed by a linear combination of maximum and random overlap, with the weighting coefficient depending on the so-called decorrelation length Lcf. Results show that both shortwave and longwave cloud radiative forcings (SWCF/LWCF) are significantly underestimated under maximum (MO) and maximum-random (MRO) overlap assumptions, whereas remarkably overestimated under the random overlap (RO) assumption in comparison with that using CRM inherent cloud geometry. These biases can reach as high as 100 W m-2 for SWCF and 60 W m-2 for LWCF. By its very nature, the general overlap (GenO) assumption exhibits an encouraging performance on both SWCF and LWCF simulations, with the biases almost reduced by 3-fold compared with traditional overlap assumptions. The superiority of GenO assumption is also manifested in the simulation of shortwave and longwave radiative heating fields, which are either significantly overestimated or underestimated under traditional overlap assumptions. The study also pointed out the deficiency of constant assumption on Lcf in GenO assumption. Further examinations indicate that the CRM diagnostic Lcf varies among different cloud types and tends to be stratified in the vertical. The new parameterization that takes into account variation of Lcf in the vertical well reproduces such a relationship and…
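
    For a two-layer column the overlap assumptions reduce to simple formulas; the sketch below (illustrative cloud fractions and decorrelation length) contrasts maximum, random and the general overlap of Hogan and Illingworth (2000), in which the blending weight decays as exp(-dz/Lcf).

      # Hedged sketch: vertically projected total cloud cover for two layers
      # under different overlap assumptions. Numbers are illustrative.
      import numpy as np

      def total_cover(c1, c2, dz, L_cf=2.0, kind="general"):
          c_max = max(c1, c2)                  # maximum overlap
          c_ran = c1 + c2 - c1 * c2            # random overlap
          if kind == "maximum":
              return c_max
          if kind == "random":
              return c_ran
          alpha = np.exp(-dz / L_cf)           # nearby layers overlap coherently
          return alpha * c_max + (1 - alpha) * c_ran

      c1, c2 = 0.4, 0.5
      for dz in (0.5, 2.0, 8.0):               # layer separation in km
          print(f"dz={dz:>4} km:",
                " max", round(total_cover(c1, c2, dz, kind="maximum"), 3),
                " ran", round(total_cover(c1, c2, dz, kind="random"), 3),
                " gen", round(total_cover(c1, c2, dz), 3))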

  13. World assumptions, posttraumatic stress and quality of life after a natural disaster: A longitudinal study

    Directory of Open Access Journals (Sweden)

    Nygaard Egil

    2012-06-01

    Background: Changes in world assumptions are a fundamental concept within theories that explain posttraumatic stress disorder. The objective of the present study was to gain a greater understanding of how changes in world assumptions are related to quality of life and posttraumatic stress symptoms after a natural disaster. Methods: A longitudinal study of 574 Norwegian adults who survived the Southeast Asian tsunami in 2004 was undertaken. Multilevel analyses were used to identify which factors at six months post-tsunami predicted quality of life and posttraumatic stress symptoms two years post-tsunami. Results: Good quality of life and posttraumatic stress symptoms were negatively related. However, major differences in the predictors of these outcomes were found. Females reported significantly higher quality of life and more posttraumatic stress than men. The association between level of exposure to the tsunami and quality of life seemed to be mediated by posttraumatic stress. Negative perceived changes in the assumption "the world is just" were related to adverse outcomes in both quality of life and posttraumatic stress. Positive perceived changes in the assumptions "life is meaningful" and "feeling that I am a valuable human" were associated with higher levels of quality of life but not with posttraumatic stress. Conclusions: Quality of life and posttraumatic stress symptoms differ in their etiology. World assumptions may be less specifically related to posttraumatic stress than has been postulated in some cognitive theories.

  14. Are nest sites actively chosen? Testing a common assumption for three non-resource limited birds

    Science.gov (United States)

    Goodenough, A. E.; Elliot, S. L.; Hart, A. G.

    2009-09-01

    Many widely-accepted ecological concepts are simplified assumptions about complex situations that remain largely untested. One example is the assumption that nest-building species choose nest sites actively when they are not resource limited. This assumption has seen little direct empirical testing: most studies on nest-site selection simply assume that sites are chosen actively (and seek explanations for such behaviour) without considering that sites may be selected randomly. We used 15 years of data from a nestbox scheme in the UK to test the assumption of active nest-site choice in three cavity-nesting bird species that differ in breeding and migratory strategy: blue tit (Cyanistes caeruleus), great tit (Parus major) and pied flycatcher (Ficedula hypoleuca). Nest-site selection was non-random (implying active nest-site choice) for blue and great tits, but not for pied flycatchers. We also considered the relative importance of year-specific and site-specific factors in determining occupation of nest sites. Site-specific factors were more important than year-specific factors for the tit species, while the reverse was true for pied flycatchers. Our results show that nest-site selection, in birds at least, is not always the result of active choice, such that choice should not be assumed automatically in studies of nesting behaviour. We use this example to highlight the need to test key ecological assumptions empirically, and the importance of doing so across taxa rather than for single "model" species.

  15. Assumptions and moral understanding of the wish to hasten death: a philosophical review of qualitative studies.

    Science.gov (United States)

    Rodríguez-Prat, Andrea; van Leeuwen, Evert

    2017-07-01

    It is not uncommon for patients with advanced disease to express a wish to hasten death (WTHD). Qualitative studies of the WTHD have found that such a wish may have different meanings, none of which can be understood outside of the patient's personal and sociocultural background, or which necessarily imply taking concrete steps to ending one's life. The starting point for the present study was a previous systematic review of qualitative studies of the WTHD in advanced patients. Here we analyse in greater detail the statements made by patients included in that review in order to examine their moral understandings and representations of illness, the dying process and death. We identify and discuss four classes of assumptions: (1) assumptions related to patients' moral understandings in terms of dignity, autonomy and authenticity; (2) assumptions related to social interactions; (3) assumptions related to the value of life; and (4) assumptions related to medicalisation as an overarching context within which the WTHD is expressed. Our analysis shows how a philosophical perspective can add to an understanding of the WTHD by taking into account cultural and anthropological aspects of the phenomenon. We conclude that the knowledge gained through exploring patients' experience and moral understandings in the end-of-life context may serve as the basis for care plans and interventions that can help them experience their final days as a meaningful period of life, restoring some sense of personal dignity in those patients who feel this has been lost.

  16. Modelling sexual transmission of HIV: testing the assumptions, validating the predictions

    Science.gov (United States)

    Baggaley, Rebecca F.; Fraser, Christophe

    2010-01-01

    Purpose of review: To discuss the role of mathematical models of sexual transmission of HIV: the methods used and their impact. Recent findings: We use mathematical modelling of "universal test and treat" as a case study to illustrate wider issues relevant to all modelling of sexual HIV transmission. Summary: Mathematical models are used extensively in HIV epidemiology to deduce the logical conclusions arising from one or more sets of assumptions. Simple models lead to broad qualitative understanding, while complex models can encode more realistic assumptions and thus be used for predictive or operational purposes. An overreliance on model analysis where assumptions are untested and input parameters cannot be estimated should be avoided. Simple models providing bold assertions have provided compelling arguments in recent public health policy, but may not adequately reflect the uncertainty inherent in the analysis. PMID:20543600
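
    As a minimal illustration of the "simple model" end of this spectrum, here is a hedged Python sketch (all parameter values are invented) of a susceptible-infected model in which treatment coverage scales down infectiousness.

      # Crude SI-type sketch with treatment reducing infectiousness,
      # integrated with an explicit Euler step. Illustrative only.
      def simulate(beta=0.5, treat_frac=0.0, eff=0.96, gamma=0.02,
                   years=50, dt=0.1, i0=0.01):
          """beta: transmission rate/yr; treat_frac: fraction on treatment;
          eff: reduction in infectiousness on treatment; gamma: removal/yr."""
          s, i = 1.0 - i0, i0
          for _ in range(int(years / dt)):
              beta_eff = beta * (1 - treat_frac * eff)
              new_inf = beta_eff * s * i * dt
              s -= new_inf
              i += new_inf - gamma * i * dt
          return i

      for f in (0.0, 0.5, 0.9):
          print(f"treatment coverage {f:.0%}: "
                f"prevalence after 50 yr = {simulate(treat_frac=f):.3f}")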

  17. COMPETITION VERSUS COLLUSION: THE PARALLEL BEHAVIOUR IN THE ABSENCE OF THE SYMETRY ASSUMPTION

    Directory of Open Access Journals (Sweden)

    Romano Oana Maria

    2012-07-01

    Cartel detection is usually viewed as a key task of competition authorities. A special case of cartel is parallel behaviour in selling prices. This type of behaviour is difficult to assess, and its analysis does not always yield conclusive results. To evaluate such behaviour, the available data are compared with theoretical values obtained using a competitive or a collusive model. When different competitive or collusive models are considered, economists use the symmetry assumption for costs and quantities produced/sold for simplicity of calculation. This assumption has the disadvantage that the theoretical values obtained may deviate significantly from the actual values observed on the market, which can sometimes lead to ambiguous results. The present paper analyses the parallel behaviour of economic agents in the absence of the symmetry assumption and studies the identification of the model under these conditions.

  18. Assumptions and Axioms: Mathematical Structures to Describe the Physics of Rigid Bodies

    CERN Document Server

    Butler, Philip H; Renaud, Peter F

    2010-01-01

    This paper challenges some of the common assumptions underlying the mathematics used to describe the physical world. We start by reviewing many of the assumptions underlying the concepts of real, physical, rigid bodies and the translational and rotational properties of such rigid bodies. Nearly all elementary and advanced texts make physical assumptions that are subtly different from ours, and as a result we develop a mathematical description that is subtly different from the standard mathematical structure. Using the homogeneity and isotropy of space, we investigate the translational and rotational features of rigid bodies in two and three dimensions. We find that the concept of rigid bodies and the concept of the homogeneity of space are intrinsically linked. The geometric study of rotations of rigid objects leads to a geometric product relationship for lines and vectors. By requiring this product to be both associative and to satisfy Pythagoras' theorem, we obtain a choice of Clifford algebras. We extend o...

  19. Sensitivity of Rooftop PV Projections in the SunShot Vision Study to Market Assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Drury, E.; Denholm, P.; Margolis, R.

    2013-01-01

    The SunShot Vision Study explored the potential growth of solar markets if solar prices decreased by about 75% from 2010 to 2020. The SolarDS model was used to simulate rooftop PV demand for this study, based on several PV market assumptions--future electricity rates, customer access to financing, and others--in addition to the SunShot PV price projections. This paper finds that modeled PV demand is highly sensitive to several non-price market assumptions, particularly PV financing parameters.

  20. Impact of unseen assumptions on communication of atmospheric carbon mitigation options

    Science.gov (United States)

    Elliot, T. R.; Celia, M. A.; Court, B.

    2010-12-01

    With the rapid access and dissemination of information made available through online and digital pathways, there is need for a concurrent openness and transparency in communication of scientific investigation. Even with open communication it is essential that the scientific community continue to provide impartial result-driven information. An unknown factor in climate literacy is the influence of an impartial presentation of scientific investigation that has utilized biased base-assumptions. A formal publication appendix, and additional digital material, provides active investigators a suitable framework and ancillary material to make informed statements weighted by assumptions made in a study. However, informal media and rapid communiqués rarely make such investigatory attempts, often citing headline or key phrasing within a written work. This presentation is focused on Geologic Carbon Sequestration (GCS) as a proxy for the wider field of climate science communication, wherein we primarily investigate recent publications in GCS literature that produce scenario outcomes using apparently biased pro- or con- assumptions. A general review of scenario economics, capture process efficacy and specific examination of sequestration site assumptions and processes, reveals an apparent misrepresentation of what we consider to be a base-case GCS system. The authors demonstrate the influence of the apparent bias in primary assumptions on results from commonly referenced subsurface hydrology models. By use of moderate semi-analytical model simplification and Monte Carlo analysis of outcomes, we can establish the likely reality of any GCS scenario within a pragmatic middle ground. Secondarily, we review the development of publically available web-based computational tools and recent workshops where we presented interactive educational opportunities for public and institutional participants, with the goal of base-assumption awareness playing a central role. Through a series of…

  1. cBrother: relaxing parental tree assumptions for Bayesian recombination detection.

    Science.gov (United States)

    Fang, Fang; Ding, Jing; Minin, Vladimir N; Suchard, Marc A; Dorman, Karin S

    2007-02-15

    Bayesian multiple change-point models accurately detect recombination in molecular sequence data. Previous Java-based implementations assume a fixed topology for the representative parental data. cBrother is a novel C language implementation that capitalizes on reduced computational time to relax the fixed tree assumption. We show that cBrother is 19 times faster than its predecessor and the fixed tree assumption can influence estimates of recombination in a medically-relevant dataset. cBrother can be freely downloaded from http://www.biomath.org/dormanks/ and can be compiled on Linux, Macintosh and Windows operating systems. Online documentation and a tutorial are also available at the site.

  2. Automatic ethics: the effects of implicit assumptions and contextual cues on moral behavior.

    Science.gov (United States)

    Reynolds, Scott J; Leavitt, Keith; DeCelles, Katherine A

    2010-07-01

    We empirically examine the reflexive or automatic aspects of moral decision making. To begin, we develop and validate a measure of an individual's implicit assumption regarding the inherent morality of business. Then, using an in-basket exercise, we demonstrate that an implicit assumption that business is inherently moral impacts day-to-day business decisions and interacts with contextual cues to shape moral behavior. Ultimately, we offer evidence supporting a characterization of employees as reflexive interactionists: moral agents whose automatic decision-making processes interact with the environment to shape their moral behavior.

  3. A criterion of orthogonality on the assumption and restrictions in subgrid-scale modelling of turbulence

    Science.gov (United States)

    Fang, L.; Sun, X. Y.; Liu, Y. W.

    2016-12-01

    In order to shed light on understanding the subgrid-scale (SGS) modelling methodology, we analyze and define the concepts of assumption and restriction in the modelling procedure, then show by a generalized derivation that if there are multiple stationary restrictions in a model, the corresponding assumption function must satisfy a criterion of orthogonality. Numerical tests using the one-dimensional nonlinear advection equation are performed to validate this criterion. This study is expected to inspire future research on generally guiding the SGS modelling methodology.

  4. Technoeconomic assumptions adopted for the development of a long-term electricity supply model for Cyprus.

    Science.gov (United States)

    Taliotis, Constantinos; Taibi, Emanuele; Howells, Mark; Rogner, Holger; Bazilian, Morgan; Welsch, Manuel

    2017-10-01

    The generation mix of Cyprus has been dominated by oil products for decades. In order to conform with European Union and international legislation, a transformation of the supply system is called for. Energy system models can facilitate energy planning into the future, but a large volume of data is required to populate such models. The present data article provides information on key modelling assumptions and input data adopted with the aim of representing the electricity supply system of Cyprus in a separate research article. Data in regards to renewable energy technoeconomic characteristics and investment cost projections, fossil fuel price projections, storage technology characteristics and system operation assumptions are described in this article.

  5. A statistical test of the stability assumption inherent in empirical estimates of economic depreciation.

    Science.gov (United States)

    Shriver, K A

    1986-01-01

    Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that a reasonable stability of the rates of decline of economic depreciation may exist over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.
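
    A hedged sketch of the stability question (simulated prices; the paper's econometric test is more elaborate): fit a constant geometric depreciation rate in two sub-periods and compare the estimates.

      # Illustrative only: log price = age * log(1 - delta) + noise, so the
      # rate delta is recovered from an OLS slope through the origin.
      import numpy as np

      rng = np.random.default_rng(0)
      ages = np.arange(1, 21)
      true_delta = 0.08
      prices = (1 - true_delta) ** ages * np.exp(rng.normal(0, 0.05, ages.size))

      def fit_delta(ages, prices):
          slope = (ages @ np.log(prices)) / (ages @ ages)  # through-origin OLS
          return 1.0 - np.exp(slope)

      first, second = ages <= 10, ages > 10
      print("delta, years 1-10 :", round(fit_delta(ages[first], prices[first]), 4))
      print("delta, years 11-20:", round(fit_delta(ages[second], prices[second]), 4))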

  6. Validity of the Michaelis-Menten equation--steady-state or reactant stationary assumption: that is the question.

    Science.gov (United States)

    Schnell, Santiago

    2014-01-01

    The Michaelis-Menten equation is generally used to estimate the kinetic parameters, V and K_M, when the steady-state assumption is valid. Following a brief overview of the derivation of the Michaelis-Menten equation for the single-enzyme, single-substrate reaction, a critical review of the criteria for validity of the steady-state assumption is presented. The application of the steady-state assumption makes the implicit assumption that there is an initial transient during which the substrate concentration remains approximately constant, equal to the initial substrate concentration, while the enzyme-substrate complex concentration builds up. This implicit assumption is known as the reactant stationary assumption. This review presents evidence showing that the reactant stationary assumption is distinct from and independent of the steady-state assumption. Contrary to the widely believed notion that the Michaelis-Menten equation can always be applied under the steady-state assumption, the reactant stationary assumption is truly the necessary condition for validity of the Michaelis-Menten equation to estimate kinetic parameters. Therefore, the application of the Michaelis-Menten equation only leads to accurate estimation of kinetic parameters when it is used under experimental conditions meeting the reactant stationary assumption. The criterion for validity of the reactant stationary assumption does not require the restrictive condition of choosing a substrate concentration that is much higher than the enzyme concentration in initial rate experiments. © 2013 FEBS.
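
    A minimal sketch of the quantities involved follows; the inequality e0/(K_M + s0) << 1 is the commonly cited form of the reactant stationary condition, so treat the exact criterion as the paper's, and all parameter values below as illustrative.

      # Hedged sketch: Michaelis-Menten initial rate plus a check of the
      # (commonly cited) reactant stationary condition. Values are invented.
      def mm_rate(s, v_max, k_m):
          """Michaelis-Menten initial velocity."""
          return v_max * s / (k_m + s)

      def reactant_stationary_ok(e0, s0, k_m, tol=0.01):
          """True when substrate depletion during the transient is negligible."""
          return e0 / (k_m + s0) < tol

      e0, s0, k_m, k_cat = 0.01, 5.0, 2.0, 10.0
      print("rate:", round(mm_rate(s0, v_max=k_cat * e0, k_m=k_m), 4))
      print("reactant stationary assumption holds:",
            reactant_stationary_ok(e0, s0, k_m))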

  7. Comparing the Performance of Approaches for Testing the Homogeneity of Variance Assumption in One-Factor ANOVA Models

    Science.gov (United States)

    Wang, Yan; Rodríguez de Gil, Patricia; Chen, Yi-Hsin; Kromrey, Jeffrey D.; Kim, Eun Sook; Pham, Thanh; Nguyen, Diep; Romano, Jeanine L.

    2017-01-01

    Various tests to check the homogeneity of variance assumption have been proposed in the literature, yet there is no consensus as to their robustness when the assumption of normality does not hold. This simulation study evaluated the performance of 14 tests for the homogeneity of variance assumption in one-way ANOVA models in terms of Type I error…
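
    Two of the commonly compared checks are available off the shelf; a hedged sketch with simulated groups follows (Bartlett's test is known to be sensitive to non-normality, Levene's is more robust).

      # Minimal sketch: two checks of the homogeneity-of-variance assumption
      # for a one-factor ANOVA. Group data are simulated with unequal spread.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      groups = [rng.normal(0, sd, 30) for sd in (1.0, 1.0, 2.0)]

      print("Bartlett:", stats.bartlett(*groups).pvalue)
      print("Levene  :", stats.levene(*groups, center="median").pvalue)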

  8. Conceptualizing Identity Development: Unmasking the Assumptions within Inventories Measuring Identity Development

    Science.gov (United States)

    Moran, Christy D.

    2009-01-01

    The purpose of this qualitative research was to analyze the dimensions and manifestations of identity development embedded within commonly used instruments measuring student identity development. To this end, a content analysis of ten identity assessment tools was conducted to determine the assumptions about identity development contained therein.

  9. Benchmarking biological nutrient removal in wastewater treatment plants: influence of mathematical model assumptions

    DEFF Research Database (Denmark)

    Flores-Alsina, Xavier; Gernaey, Krist V.; Jeppsson, Ulf

    2012-01-01

    This paper examines the effect of different model assumptions when describing biological nutrient removal (BNR) by the activated sludge models (ASM) 1, 2d & 3. The performance of a nitrogen removal (WWTP1) and a combined nitrogen and phosphorus removal (WWTP2) benchmark wastewater treatment plant...

  10. Investigating assumptions of crown archetypes for modelling LiDAR returns

    NARCIS (Netherlands)

    Calders, K.; Lewis, P.; Disney, M.; Verbesselt, J.; Herold, M.

    2013-01-01

    LiDAR has the potential to derive canopy structural information such as tree height and leaf area index (LAI), via models of the LiDAR signal. Such models often make assumptions regarding crown shape to simplify parameter retrieval and crown archetypes are typically assumed to contain a turbid medium…

  11. Vocational Didactics: Core Assumptions and Approaches from Denmark, Germany, Norway, Spain and Sweden

    Science.gov (United States)

    Gessler, Michael; Moreno Herrera, Lázaro

    2015-01-01

    The design of vocational didactics has to meet special requirements. Six core assumptions are identified: outcome orientation, cultural-historical embedding, horizontal structure, vertical structure, temporal structure, and the changing nature of work. Different approaches and discussions from school-based systems (Spain and Sweden) and dual…

  12. Mutual assumptions and facts about nondisclosure among clinical supervisors and students in group supervision

    DEFF Research Database (Denmark)

    Nielsen, Geir Høstmark; Skjerve, Jan; Jacobsen, Claus Haugaard;

    2009-01-01

    In the two preceding papers of this issue of Nordic Psychology the authors report findings from a study of nondisclosure among student therapists and clinical supervisors. The findings were reported separately for each group. In this article, the two sets of findings are held together and compared, so as to draw a picture of mutual assumptions and facts about nondisclosure among students and supervisors.

  13. Experimental evaluation of the pure configurational stress assumption in the flow dynamics of entangled polymer melts

    DEFF Research Database (Denmark)

    Rasmussen, Henrik K.; Bejenariu, Anca Gabriela; Hassager, Ole

    2010-01-01

    …with the assumption of pure configurational stress was accurately able to predict the startup as well as the reversed flow behavior. This confirms that this commonly used theoretical picture for the flow of polymeric liquids is a correct physical principle to apply. © 2010 The Society of Rheology. [DOI: 10.1122/1.3496378]

  14. Complex Learning Theory--Its Epistemology and Its Assumptions about Learning: Implications for Physical Education

    Science.gov (United States)

    Light, Richard

    2008-01-01

    Davis and Sumara (2003) argue that differences between commonsense assumptions about learning and those upon which constructivism rests present a significant challenge for the fostering of constructivist approaches to teaching in schools. Indeed, as Rink (2001) suggests, initiating any change process for teaching method needs to involve some…

  15. 76 FR 17158 - Assumption Buster Workshop: Distributed Data Schemes Provide Security

    Science.gov (United States)

    2011-03-28

    ... group that coordinates cyber security research activities in support of national security systems, is...: There is a strong and often repeated call for research to provide novel cyber security solutions. The... capable, and that re-examining cyber security solutions in the context of these assumptions will result in...

  16. Kinematic and static assumptions for homogenization in micromechanics of granular materials

    NARCIS (Netherlands)

    Kruyt, N.P.; Rothenburg, L.

    2004-01-01

    A study is made of kinematic and static assumptions for homogenization in micromechanics of granular materials for two cases. The first case considered deals with the elastic behaviour of isotropic, two-dimensional assemblies with bonded contacts. Using a minimum potential energy principle and estim…

  18. Is a "Complex" Task Really Complex? Validating the Assumption of Cognitive Task Complexity

    Science.gov (United States)

    Sasayama, Shoko

    2016-01-01

    In research on task-based learning and teaching, it has traditionally been assumed that differing degrees of cognitive task complexity can be inferred through task design and/or observations of differing qualities in linguistic production elicited by second language (L2) communication tasks. Without validating this assumption, however, it is…

  18. How Do People Learn at the Workplace? Investigating Four Workplace Learning Assumptions

    NARCIS (Netherlands)

    Kooken, Jose; Ley, Tobias; Hoog, de Robert; Duval, Erik; Klamma, Ralf

    2007-01-01

    Any software development project is based on assumptions about the state of the world that probably will hold when it is fielded. Investigating whether they are true can be seen as an important task. This paper describes how an empirical investigation was designed and conducted for the EU funded APO

  19. Under What Assumptions Do Site-by-Treatment Instruments Identify Average Causal Effects?

    Science.gov (United States)

    Reardon, Sean F.; Raudenbush, Stephen W.

    2013-01-01

    The increasing availability of data from multi-site randomized trials provides a potential opportunity to use instrumental variables methods to study the effects of multiple hypothesized mediators of the effect of a treatment. We derive nine assumptions needed to identify the effects of multiple mediators when using site-by-treatment interactions…

  1. Investigating assumptions of crown archetypes for modelling LiDAR returns

    NARCIS (Netherlands)

    Calders, K.; Lewis, P.; Disney, M.; Verbesselt, J.; Herold, M.

    2013-01-01

    LiDAR has the potential to derive canopy structural information such as tree height and leaf area index (LAI), via models of the LiDAR signal. Such models often make assumptions regarding crown shape to simplify parameter retrieval and crown archetypes are typically assumed to contain a turbid mediu

  2. Post Stereotypes: Deconstructing Racial Assumptions and Biases through Visual Culture and Confrontational Pedagogy

    Science.gov (United States)

    Jung, Yuha

    2015-01-01

    The Post Stereotypes project embodies confrontational pedagogy and involves postcard artmaking designed to both solicit expression of and deconstruct students' racial, ethnic, and cultural stereotypes and assumptions. As part of the Cultural Diversity in American Art course, students created postcard art that visually represented their personal…

  3. Credit Transfer amongst Students in Contrasting Disciplines: Examining Assumptions about Wastage, Mobility and Lifelong Learning

    Science.gov (United States)

    Di Paolo, Terry; Pegg, Ann

    2013-01-01

    While arrangements for credit transfer exist across the UK higher education sector, little is known about credit-transfer students or why they re-engage with study. Policy makers have cited credit transfer as a mechanism for reducing wastage and drop-out, but this paper challenges this assumption and instead examines how credit transfer serves…

  4. Net Generation at Social Software: Challenging Assumptions, Clarifying Relationships and Raising Implications for Learning

    Science.gov (United States)

    Valtonen, Teemu; Dillon, Patrick; Hacklin, Stina; Vaisanen, Pertti

    2010-01-01

    This paper takes as its starting point assumptions about use of information and communication technology (ICT) by people born after 1983, the so called net generation. The focus of the paper is on social networking. A questionnaire survey was carried out with 1070 students from schools in Eastern Finland. Data are presented on students' ICT-skills…

  5. An Algorithm for Determining Database Consistency Under the Closed World Assumption

    Institute of Scientific and Technical Information of China (English)

    沈一栋

    1992-01-01

    It is well-known that there are circumstances where applying Reiter's closed world assumption (CWA) will lead to logical inconsistencies. In this paper, a new characterization of CWA consistency is presented and an algorithm is proposed for determining whether a database without function symbols is consistent with the CWA. The algorithm is shown to be efficient.
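
    To make Reiter's consistency criterion concrete, here is a minimal, hypothetical Python sketch (illustrative only; the record does not reproduce the paper's actual algorithm). For a function-free definite database, the derivable atoms are computed by a naive bottom-up fixpoint, and the CWA is consistent exactly when every known positive disjunction contains at least one individually derivable atom:

```python
def least_model(facts, rules):
    # Naive bottom-up fixpoint over ground Horn rules (body_atoms, head_atom);
    # it terminates because the database is function-free and finite.
    model = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in model and all(b in model for b in body):
                model.add(head)
                changed = True
    return model

def cwa_consistent(facts, rules, disjunctions):
    # Reiter's CWA adds the negation of every non-derivable atom. It becomes
    # inconsistent iff some known positive disjunction a1 v ... v an has no
    # individually derivable disjunct (all its negations would then be added).
    model = least_model(facts, rules)
    return all(any(a in model for a in clause) for clause in disjunctions)

facts = {"dept(cs)"}
rules = [(("dept(cs)",), "open(cs)")]  # dept(cs) -> open(cs)

# "teaches(bob) v teaches(sue)" is known, but neither disjunct is derivable,
# so the CWA adds both negations and the theory becomes inconsistent.
print(cwa_consistent(facts, rules, [{"teaches(bob)", "teaches(sue)"}]))  # False
print(cwa_consistent(facts, rules, [{"open(cs)"}]))                      # True
```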

  6. Tests of the frozen-flux and tangentially geostrophic assumptions using magnetic satellite data

    DEFF Research Database (Denmark)

    Chulliat, A.; Olsen, Nils; Sabaka, T.

    In 1984, Jean-Louis Le Mouël published a paper suggesting that the flow at the top of the Earth’s core is tangentially geostrophic, i.e., the Lorentz force is much smaller than the Coriolis force in this particular region of the core. This new assumption was subsequently used to discriminate among...

  7. Exploring the Estimation of Examinee Locations Using Multidimensional Latent Trait Models under Different Distributional Assumptions

    Science.gov (United States)

    Jang, Hyesuk

    2014-01-01

    This study aims to evaluate a multidimensional latent trait model to determine how well the model works in various empirical contexts. Contrary to the assumption of these latent trait models that the traits are normally distributed, situations in which the latent trait is not shaped with a normal distribution may occur (Sass et al., 2008; Woods…

  8. A critical assessment of the ecological assumptions underpinning compensatory mitigation of salmon-derived nutrients

    Science.gov (United States)

    Collins, Scott F.; Marcarelli, Amy M.; Baxter, Colden V.; Wipfli, Mark S.

    2015-01-01

    We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited.

  9. HIERARCHICAL STRUCTURE IN ADL AND IADL - ANALYTICAL ASSUMPTIONS AND APPLICATIONS FOR CLINICIANS AND RESEARCHERS

    NARCIS (Netherlands)

    KEMPEN, GIJM; MYERS, AM; POWELL, LE

    1995-01-01

    The results of a Canadian study have shown that a set of 12 (I)ADL items did not meet the criteria of Guttman's scalogram program, questioning the assumption of hierarchical ordering. In this article, the hierarchical structure of (I)ADL items from the Canadian elderly sample is retested with another...

  10. The Impact of Feedback Frequency on Learning and Task Performance: Challenging the "More Is Better" Assumption

    Science.gov (United States)

    Lam, Chak Fu; DeRue, D. Scott; Karam, Elizabeth P.; Hollenbeck, John R.

    2011-01-01

    Previous research on feedback frequency suggests that more frequent feedback improves learning and task performance (Salmoni, Schmidt, & Walter, 1984). Drawing from resource allocation theory (Kanfer & Ackerman, 1989), we challenge the "more is better" assumption and propose that frequent feedback can overwhelm an individual's cognitive resource…

  11. The Mediating Effect of World Assumptions on the Relationship between Trauma Exposure and Depression

    Science.gov (United States)

    Lilly, Michelle M.; Valdez, Christine E.; Graham-Bermann, Sandra A.

    2011-01-01

    The association between trauma exposure and mental health-related challenges such as depression are well documented in the research literature. The assumptive world theory was used to explore this relationship in 97 female survivors of intimate partner violence (IPV). Participants completed self-report questionnaires that assessed trauma history,…

  13. 76 FR 22925 - Assumption Buster Workshop: Abnormal Behavior Detection Finds Malicious Actors

    Science.gov (United States)

    2011-04-25

    ... Assumption Buster Workshop: Abnormal Behavior Detection Finds Malicious Actors AGENCY: The National... assumptionbusters@nitrd.gov. Travel expenses will be paid at the government rate for selected participants who live... behavioral models to monitor the size and destinations of financial transfers, and/or on-line...

  14. World assumptions, religiosity, and PTSD in survivors of intimate partner violence.

    Science.gov (United States)

    Lilly, Michelle M; Howell, Kathryn H; Graham-Bermann, Sandra

    2015-01-01

    Intimate partner violence (IPV) is among the most frequent types of violence annually affecting women. One frequent outcome of violence exposure is posttraumatic stress disorder (PTSD). The theory of shattered world assumptions represents one possible explanation for adverse mental health outcomes following trauma, contending that trauma disintegrates individuals' core assumptions that the world is safe and meaningful, and that the self is worthy. Research that explores world assumptions in relation to survivors of IPV has remained absent. The present study found that world assumptions significantly mediated the relationship between IPV exposure and PTSD symptoms. Religiosity was also significantly, positively related to PTSD symptoms, but was not significantly related to amount of IPV exposure. Though African American women reported more IPV exposure and greater religiosity than European American women in the sample, there were no interethnic differences in PTSD symptom endorsement. Implications of these findings are discussed.
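
    As a rough illustration of the product-of-coefficients logic behind such a mediation finding, the hedged Python sketch below runs on simulated stand-in data (the variable names and effect sizes are hypothetical, not the study's data). The indirect effect of exposure on symptoms through world assumptions is the product of the exposure-to-mediator path a and the mediator-to-outcome path b:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 97  # matches the study's sample size; the data here are simulated

# Simulated stand-ins: IPV exposure (x), world assumptions (m), PTSD (y).
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # path a
y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # path b and direct effect c'

def ols(response, predictors):
    # Ordinary least squares with an intercept column.
    X = np.column_stack([np.ones(len(response))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, response, rcond=None)
    return beta

a = ols(m, [x])[1]                 # exposure -> mediator
b, c_prime = ols(y, [m, x])[1:3]   # mediator -> outcome, controlling exposure
print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```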

  16. Challenging Assumptions about Values, Interests and Power in Further and Higher Education Partnerships

    Science.gov (United States)

    Elliott, Geoffrey

    2017-01-01

    This article raises questions that challenge assumptions about values, interests and power in further and higher education partnerships. These topics were explored in a series of semi-structured interviews with a sample of principals and senior higher education partnership managers of colleges spread across a single region in England. The data…

  17. Comparison of Three Common Experimental Designs to Improve Statistical Power When Data Violate Parametric Assumptions.

    Science.gov (United States)

    Porter, Andrew C.; McSweeney, Maryellen

    A Monte Carlo technique was used to investigate the small sample goodness of fit and statistical power of several nonparametric tests and their parametric analogues when applied to data which violate parametric assumptions. The motivation was to facilitate choice among three designs, simple random assignment with and without a concomitant variable…

  18. Exploring Epistemologies: Social Work Action as a Reflection of Philosophical Assumptions.

    Science.gov (United States)

    Dean, Ruth G.; Fenby, Barbara L.

    1989-01-01

    Two major philosophical assumptions underlying the literature, practice, and teaching of social work are reviewed: empiricism and existentialism. Two newer theoretical positions, critical theory and deconstruction, are also introduced. The implications for using each position as a context for teaching are considered. (MSE)

  19. Monitoring long-lasting insecticidal net (LLIN) durability to validate net serviceable life assumptions, in Rwanda

    NARCIS (Netherlands)

    Hakizimana, E.; Cyubahiro, B.; Rukundo, A.; Kabayiza, A.; Mutabazi, A.; Beach, R.; Patel, R.; Tongren, J.E.; Karema, C.

    2014-01-01

    Background: To validate assumptions about the length of the distribution–replacement cycle for long-lasting insecticidal nets (LLINs) in Rwanda, the Malaria and other Parasitic Diseases Division, Rwanda Ministry of Health, used World Health Organization methods to independently confirm the three-year...

  20. The National Teacher Corps: A Study of Shifting Goals and Changing Assumptions

    Science.gov (United States)

    Eckert, Sarah Anne

    2011-01-01

    This article investigates the lasting legacy of the National Teacher Corps (NTC), which was created in 1965 by the U.S. federal government with two crucial assumptions: that teaching poor urban children required a very specific skill set and that teacher preparation programs were not providing adequate training in these skills. Analysis reveals…

  1. A Taxonomy of Latent Structure Assumptions for Probability Matrix Decomposition Models.

    Science.gov (United States)

    Meulders, Michel; De Boeck, Paul; Van Mechelen, Iven

    2003-01-01

    Proposed a taxonomy of latent structure assumptions for probability matrix decomposition (PMD) that includes the original PMD model and a three-way extension of the multiple classification latent class model. Simulation study results show the usefulness of the taxonomy. (SLD)

  2. Principle Assumption of Space Object Detection Using Shipborne Great Aperture Photoelectrical Theodolite

    Science.gov (United States)

    Ouyang, Jia; Zhang, Tong-shuang; Wang, Qian-xue

    2016-02-01

    In this paper, the use of space object detection is introduced. By analyzing the current state of research on space object detection using photoelectrical equipment, a shipborne great aperture photoelectrical theodolite is designed. The principle assumption of space object detection using the shipborne great aperture photoelectrical theodolite is put forward.

  4. On assumption in low-altitude investigation of dayside magnetospheric phenomena

    Science.gov (United States)

    Koskinen, H. E. J.

    In the physics of large-scale phenomena in complicated media, such as space plasmas, the chain of reasoning from the fundamental physics to conceptual models is a long and winding road, requiring much physical insight and reliance on various assumptions and approximations. The low-altitude investigation of dayside phenomena provides numerous examples of problems arising from the necessity to make strong assumptions. In this paper we discuss some important assumptions that are either unavoidable or at least widely used. Two examples are the concepts of frozen-in field lines and convection velocity. Instead of asking what violates the frozen-in condition, it is quite legitimate to ask what freezes the plasma and the magnetic field in the first place. Another important complex of problems is the set of limitations introduced by a two-dimensional approach or linearization of equations. Although modern research is moving more and more toward three-dimensional and time-dependent models, limitations in computing power often make a two-dimensional approach tempting. In a similar way, linearization makes equations analytically tractable. Finally, a very central question is the mapping. To a first approximation, the entire dayside magnetopause maps down to the ionosphere through the dayside cusp region. From the mapping viewpoint, the cusp is one of the most difficult regions, and assumptions needed to perform the mapping in practice must be considered with the greatest possible care. We can never avoid assumptions, but we must always make them clear to ourselves and also to the readers of our papers.

  5. Common-sense chemistry: The use of assumptions and heuristics in problem solving

    Science.gov (United States)

    Maeyer, Jenine Rachel

    Students experience difficulty learning and understanding chemistry at higher levels, often because of cognitive biases stemming from common sense reasoning constraints. These constraints can be divided into two categories: assumptions (beliefs held about the world around us) and heuristics (the reasoning strategies or rules used to build predictions and make decisions). A better understanding and characterization of these constraints are of central importance in the development of curriculum and teaching strategies that better support student learning in science. It was the overall goal of this thesis to investigate student reasoning in chemistry, specifically to better understand and characterize the assumptions and heuristics used by undergraduate chemistry students. To achieve this, two mixed-methods studies were conducted, each with quantitative data collected using a questionnaire and qualitative data gathered through semi-structured interviews. The first project investigated the reasoning heuristics used when ranking chemical substances based on the relative value of a physical or chemical property, while the second study characterized the assumptions and heuristics used when making predictions about the relative likelihood of different types of chemical processes. Our results revealed that heuristics for cue selection and decision-making played a significant role in the construction of answers during the interviews. Many study participants relied frequently on one or more of the following heuristics to make their decisions: recognition, representativeness, one-reason decision-making, and arbitrary trend. These heuristics allowed students to generate answers in the absence of requisite knowledge, but often led students astray. When characterizing assumptions, our results indicate that students relied on intuitive, spurious, and valid assumptions about the nature of chemical substances and processes in building their responses. In particular, many...

  6. Analysis of Modeling Assumptions used in Production Cost Models for Renewable Integration Studies

    Energy Technology Data Exchange (ETDEWEB)

    Stoll, Brady [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Townsend, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bloom, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-01-01

    Renewable energy integration studies have been published for many different regions exploring the question of how higher penetration of renewable energy will impact the electric grid. These studies each make assumptions about the systems they are analyzing; however, the effect of many of these assumptions has not yet been examined and published. In this paper we analyze the impact of modeling assumptions in renewable integration studies, including the optimization method used (linear or mixed-integer programming) and the temporal resolution of the dispatch stage (hourly or sub-hourly). We analyze each of these assumptions on a large and a small system and determine the impact of each assumption on key metrics including the total production cost, curtailment of renewables, CO2 emissions, and generator starts and ramps. Additionally, we identified the impact on these metrics if a four-hour-ahead commitment step is included before the dispatch step and the impact of retiring generators to reduce the degree to which the system is overbuilt. We find that the largest effect of these assumptions is at the unit level on starts and ramps, particularly for the temporal resolution, and saw a smaller impact at the aggregate level on system costs and emissions. For each fossil fuel generator type we measured the average capacity started, average run-time per start, and average number of ramps. Linear programming results saw up to a 20% difference in number of starts and average run time of traditional generators, and up to a 4% difference in the number of ramps, when compared to mixed-integer programming. Utilizing hourly dispatch instead of sub-hourly dispatch saw no difference in coal or gas CC units for either start metric, while gas CT units had a 5% increase in the number of starts and 2% increase in the average on-time per start. The number of ramps decreased up to 44%. The smallest effect seen was on the CO2 emissions and total production cost, with a 0.8% and 0...
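
    The LP-versus-MIP distinction the authors examine can be seen in even a one-period toy commitment problem. The hedged sketch below (hypothetical numbers, PuLP with its bundled CBC solver; this is not NREL's production cost model) shows how relaxing the binary commitment variables lets the solver hold a unit "partially on", evading its minimum-generation constraint and lowering the apparent production cost:

```python
import pulp

def toy_commitment_cost(relax_binaries: bool) -> float:
    # One period, two units, 120 MW demand; pmin applies only when committed.
    gens = {"coal": dict(cost=20.0, pmin=40.0, pmax=100.0),
            "gas":  dict(cost=50.0, pmin=30.0, pmax=100.0)}
    prob = pulp.LpProblem("toy_uc", pulp.LpMinimize)
    cat = "Continuous" if relax_binaries else "Binary"
    u = {g: pulp.LpVariable(f"u_{g}", 0, 1, cat) for g in gens}    # commitment
    p = {g: pulp.LpVariable(f"p_{g}", lowBound=0) for g in gens}   # dispatch, MW
    prob += pulp.lpSum(d["cost"] * p[g] for g, d in gens.items())  # cost objective
    for g, d in gens.items():
        prob += p[g] <= d["pmax"] * u[g]   # capacity if committed
        prob += p[g] >= d["pmin"] * u[g]   # minimum generation if committed
    prob += pulp.lpSum(p.values()) == 120  # demand balance
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return pulp.value(prob.objective)

print(toy_commitment_cost(relax_binaries=False))  # MIP: 3300.0 (coal 90 + gas 30)
print(toy_commitment_cost(relax_binaries=True))   # LP:  3000.0 (gas fractionally on)
```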

  7. Innovation or 'Inventions'? The conflict between latent assumptions in marine aquaculture and local fishery.

    Science.gov (United States)

    Martínez-Novo, Rodrigo; Lizcano, Emmánuel; Herrera-Racionero, Paloma; Miret-Pastor, Lluís

    2016-06-01

    Recent European policy highlights the need to promote local fishery and aquaculture by means of innovation and joint participation in fishery management as one of the keys to achieving the sustainability of our seas. However, the implicit assumptions held by the actors in the two main groups involved - innovators (scientists, businessmen and administration managers) and local fishermen - can complicate, perhaps even render impossible, mutual understanding and co-operation. A qualitative analysis of interviews with members of both groups in the Valencian Community (Spain) reveals those latent assumptions and their impact on the respective practices. The analysis shows that the innovation narrative on which one group relies and the inventions narrative used by the other are rooted in two dramatically different, even antagonistic, collective worldviews. Any environmental policy that involves these groups should take these strong discords into account.

  8. IRT models with relaxed assumptions in eRm: A manual-like instruction

    Directory of Open Access Journals (Sweden)

    REINHOLD HATZINGER

    2009-03-01

    Linear logistic models with relaxed assumptions (LLRA), as introduced by Fischer (1974), are a flexible tool for the measurement of change for dichotomous or polytomous responses. As opposed to the Rasch model, assumptions on the dimensionality of items, their mutual dependencies and the distribution of the latent trait in the population of subjects are relaxed. Conditional maximum likelihood estimation allows for inference about treatment, covariate or trend effect parameters without taking the subjects' latent trait values into account. In this paper we will show how LLRAs based on the LLTM, LRSM and LPCM can be used to answer various questions about the measurement of change and how they can be fitted in R using the eRm package. A number of small didactic examples is provided that can easily be used as templates for real data sets. All datafiles used in this paper are available from http://eRm.R-Forge.R-project.org/

  9. NGNP: High Temperature Gas-Cooled Reactor Key Definitions, Plant Capabilities, and Assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Phillip Mills

    2012-02-01

    This document is intended to provide a Next Generation Nuclear Plant (NGNP) Project tool in which to collect and identify key definitions, plant capabilities, and inputs and assumptions to be used in ongoing efforts related to the licensing and deployment of a high temperature gas-cooled reactor (HTGR). These definitions, capabilities, and assumptions are extracted from a number of sources, including NGNP Project documents such as licensing related white papers [References 1-11] and previously issued requirement documents [References 13-15]. Also included is information agreed upon by the NGNP Regulatory Affairs group's Licensing Working Group and Configuration Council. The NGNP Project approach to licensing an HTGR plant via a combined license (COL) is defined within the referenced white papers and reference [12], and is not duplicated here.

  10. THE HISTORY OF BUILDING THE NORTHERN FRATERNAL CELLS OF VIRGIN MARY ASSUMPTION MONASTERY IN TIKHVIN

    Directory of Open Access Journals (Sweden)

    Tatiana Nikolaevna PYATNITSKAYA

    2014-01-01

    The article is focused on the formation of one of the fraternal houses of the Virgin Mary Assumption Monastery in Tikhvin (Leningrad region), the volume-spatial composition of which developed during the second half of the 17th century. It describes the history of the origin of the complex around the Assumption Cathedral of the 16th century and the location of the cell housing in the wooden and stone ensembles. By comparing archival documents with data obtained from field studies, the initial planning and design features of the northern fraternal cells were identified. The research identified the brigades of Tikhvin masons of 1680-1690 who worked on the construction of the building. Fragments of the original architectural decorations and facade colors were found. The research also produced graphic reconstructions, giving an idea not only of the original appearance of the building, but also of the history of its changes.

  11. Error in the description of foot kinematics due to violation of rigid body assumptions.

    Science.gov (United States)

    Nester, C J; Liu, A M; Ward, E; Howard, D; Cocheba, J; Derrick, T

    2010-03-03

    Kinematic data from rigid segment foot models inevitably include errors because the bones within each segment move relative to each other. This study sought to define the error in foot kinematic data due to violation of the rigid segment assumption. The research compared kinematic data from 17 different mid- and forefoot rigid segment models to kinematic data of the individual bones comprising these segments. Kinematic data from a previous dynamic cadaver model study were used to derive individual bone as well as foot segment kinematics. Mean and maximum errors due to violation of the rigid body assumption varied greatly between models. The model with the least error was the combination of navicular and cuboid (mean errors ...) ... kinematics research study being undertaken.
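
    One way to quantify this kind of error, sketched below under purely hypothetical rotations (this is not the study's cadaver data), is to compare each bone's rotation with the single rotation assigned to the rigid segment containing it; the residual rotation angle is the rigid-body error:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Hypothetical bone rotations within one "rigid" midfoot segment.
navicular = R.from_euler("xyz", [12.0, 4.0, 2.0], degrees=True)
cuboid    = R.from_euler("xyz", [7.0, 1.0, 0.5], degrees=True)

# Represent the segment by a chordal-mean rotation of its bones
# (from_quat renormalizes the averaged quaternion).
segment = R.from_quat((navicular.as_quat() + cuboid.as_quat()) / 2.0)

for name, bone in [("navicular", navicular), ("cuboid", cuboid)]:
    residual = bone * segment.inv()            # what the rigid model misses
    err_deg = np.degrees(residual.magnitude()) # residual rotation angle
    print(f"{name}: rigid-segment error = {err_deg:.2f} deg")
```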

  12. Load assumption for fatigue design of structures and components counting methods, safety aspects, practical application

    CERN Document Server

    Köhler, Michael; Pötter, Kurt; Zenner, Harald

    2017-01-01

    Understanding the fatigue behaviour of structural components under variable load amplitude is an essential prerequisite for safe and reliable light-weight design. For designing and dimensioning, the expected stress (load) is compared with the capacity to withstand loads (fatigue strength). In this process, the safety necessary for each particular application must be ensured. A prerequisite for ensuring the required fatigue strength is a reliable load assumption. The authors describe the transformation of the stress- and load-time functions which have been measured under operational conditions to spectra or matrices with the application of counting methods. The aspects which must be considered for ensuring a reliable load assumption for designing and dimensioning are discussed in detail. Furthermore, the theoretical background for estimating the fatigue life of structural components is explained, and the procedures are discussed for numerous applications in practice. One of the prime intentions of the authors ...
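
    As a small illustration of what such counting methods do, the hedged Python sketch below implements simple range counting over the turning points of a load-time history (a simplified stand-in chosen for brevity; it is not the full rainflow method used in fatigue practice):

```python
import numpy as np
from collections import Counter

def turning_points(signal):
    # Reduce a load-time history to its local extrema.
    s = np.asarray(signal, dtype=float)
    keep = [0]
    for i in range(1, len(s) - 1):
        if (s[i] - s[i - 1]) * (s[i + 1] - s[i]) < 0:
            keep.append(i)
    keep.append(len(s) - 1)
    return s[keep]

def simple_range_count(signal, bin_width=1.0):
    # Count half-cycle ranges between successive turning points into bins,
    # producing the kind of load spectrum used for fatigue-life estimation.
    ranges = np.abs(np.diff(turning_points(signal)))
    return Counter(np.floor(ranges / bin_width) * bin_width)

load = [0, 3, -2, 5, -1, 4, -3, 2, 0]   # invented load history
print(simple_range_count(load, bin_width=2.0))
```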

  13. Investigation of assumptions underlying current safety guidelines on EM-induced nerve stimulation

    Science.gov (United States)

    Neufeld, Esra; Vogiatzis Oikonomidis, Ioannis; Iacono, Maria Ida; Angelone, Leonardo M.; Kainz, Wolfgang; Kuster, Niels

    2016-06-01

    An intricate network of a variety of nerves is embedded within the complex anatomy of the human body. Although nerves are shielded from unwanted excitation, they can still be stimulated by external electromagnetic sources that induce strongly non-uniform field distributions. Current exposure safety standards designed to limit unwanted nerve stimulation are based on a series of explicit and implicit assumptions and simplifications. This paper demonstrates the applicability of functionalized anatomical phantoms with integrated coupled electromagnetic and neuronal dynamics solvers for investigating the impact of magnetic resonance exposure on nerve excitation within the full complexity of the human anatomy. The impact of neuronal dynamics models, temperature and local hot-spots, nerve trajectory and potential smoothing, anatomical inhomogeneity, and pulse duration on nerve stimulation was evaluated. As a result, multiple assumptions underlying current safety standards are questioned. It is demonstrated that coupled EM-neuronal dynamics modeling involving realistic anatomies is valuable to establish conservative safety criteria.

  14. Unpacking assumptions about inclusion in community-based health promotion: perspectives of women living in poverty.

    Science.gov (United States)

    Ponic, Pamela; Frisby, Wendy

    2010-11-01

    Community-based health promoters often aim to facilitate "inclusion" when working with marginalized women to address their exclusion and related health issues. Yet the notion of inclusion has not been critically interrogated within this field, resulting in the perpetuation of assumptions that oversimplify it. We provide qualitative evidence on inclusion as a health-promotion strategy from the perspectives of women living in poverty. We collected data with women engaged in a 6-year community-based health promotion and feminist participatory action research project. Participants' experiences illustrated that inclusion was a multidimensional process that involved a dynamic interplay between structural determinants and individual agency. The women named multiple elements of inclusion across psychosocial, relational, organizational, and participatory dimensions. This knowledge interrupts assumptions that inclusion is achievable and desirable for so-called recipients of such initiatives. We thus call for critical consideration of the complexities, limitations, and possibilities of facilitating inclusion as a health-promotion strategy.

  15. Heterosexual assumptions in verbal and non-verbal communication in nursing.

    Science.gov (United States)

    Röndahl, Gerd; Innala, Sune; Carlsson, Marianne

    2006-11-01

    This paper reports a study of what lesbian women and gay men had to say, as patients and as partners, about their experiences of nursing in hospital care, and what they regarded as important to communicate about homosexuality and nursing. The social life of heterosexual cultures is based on the assumption that all people are heterosexual, thereby making homosexuality socially invisible. Nurses may assume that all patients and significant others are heterosexual, and these heteronormative assumptions may lead to poor communication that affects nursing quality by leading nurses to ask the wrong questions and make incorrect judgements. A qualitative interview study was carried out in the spring of 2004. Seventeen women and 10 men ranging in age from 23 to 65 years from different parts of Sweden participated. They described 46 experiences as patients and 31 as partners. Heteronormativity was communicated in waiting rooms, in patient documents and when registering for admission, and nursing staff sometimes showed perplexity when an informant deviated from this heteronormative assumption. Informants had often met nursing staff who showed fear of behaving incorrectly, which could lead to a sense of insecurity, thereby impeding further communication. As partners of gay patients, informants felt that they had to deal with heterosexual assumptions more than they did when they were patients, and the consequences were feelings of not being accepted as a 'true' relative, of exclusion and neglect. Almost all participants offered recommendations about how nursing staff could facilitate communication. Heterosexual norms communicated unconsciously by nursing staff contribute to ambivalent attitudes and feelings of insecurity that prevent communication and easily lead to misconceptions. Educational and management interventions, as well as increased communication, could make gay people more visible and thereby encourage openness and awareness by hospital staff of the norms that they...

  16. Determination of the optimal periodic maintenance policy under imperfect repair assumption

    OpenAIRE

    Maria Luiza Guerra de Toledo

    2014-01-01

    An appropriate maintenance policy is essential to reduce expenses and risks related to repairable systems failures. The usual assumptions of minimal or perfect repair at failures are not suitable for many real systems, requiring the application of Imperfect Repair models. In this work, the classes Arithmetic Reduction of Age and Arithmetic Reduction of Intensity, proposed by Doyen and Gaudoin (2004), are explored. Likelihood functions for such models are derived, and the parameters are es...

  17. RateMyProfessors.com: Testing Assumptions about Student Use and Misuse

    Science.gov (United States)

    Bleske-Rechek, April; Michels, Kelsey

    2010-01-01

    Since its inception in 1999, the RateMyProfessors.com (RMP.com) website has grown in popularity and, with that, notoriety. In this research we tested three assumptions about the website: (1) Students use RMP.com to either rant or rave; (2) Students who post on RMP.com are different from students who do not post; and (3) Students reward easiness by…

  18. Assumptions in quantitative analyses of health risks of overhead power lines

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, A.; Wardekker, J.A.; Van der Sluijs, J.P. [Department of Science, Technology and Society, Copernicus Institute, Utrecht University, Budapestlaan 6, 3584 CD Utrecht (Netherlands)

    2012-02-15

    One of the major issues hampering the formulation of uncontested policy decisions on contemporary risks is the presence of uncertainties in various stages of the policy cycle. In the literature, different approaches are suggested to address the problem of provisional and uncertain evidence. Reflective approaches such as pedigree analysis can be used to explore the quality of evidence when quantification of uncertainties is at stake. One of the issues where the quality of evidence impedes policy making is the case of electromagnetic fields (EMF). In this case, a (statistical) association was suggested between proximity to overhead power lines and an increased risk of childhood leukaemia. However, no biophysical mechanism that could support this association has been found to date. The Dutch government bases its policy concerning overhead power lines on the precautionary principle. For The Netherlands, previous studies have assessed the potential number of extra cases of childhood leukaemia due to the presence of overhead power lines. However, such a quantification of the health risk of EMF entails a (large) number of assumptions, both prior to and in the calculation chain. In this study, these assumptions were prioritized and critically appraised in an expert elicitation workshop, using a pedigree matrix for the characterization of assumptions in assessments. It appeared that the assumptions regarded as important in quantifying the health risks show a high degree of value-ladenness. The results show that, given the present state of knowledge, quantification of the health risks of EMF is premature. We consider the current implementation of the precautionary principle by the Dutch government to be adequate.

  19. Risk Pooling, Commitment and Information: An experimental test of two fundamental assumptions

    OpenAIRE

    Abigail Barr

    2003-01-01

    This paper presents rigorous and direct tests of two assumptions relating to limited commitment and asymmetric information that currently underpin models of risk pooling. A specially designed economic experiment involving 678 subjects across 23 Zimbabwean villages is used to solve the problems of observability and quantification that have frustrated previous attempts to conduct such tests. I find that more extrinsic commitment is associated with more risk pooling, but that more informat...

  20. Logic Assumptions and Risks Framework Applied to Defence Campaign Planning and Evaluation

    Science.gov (United States)

    2013-05-01

    Checkland, P. and J. Poulter (2006). Learning for Action: A Definitive Account of Soft Systems Methodology and its use for Practitioners, Teachers and... Systems Methodology alerts us to as differing ‘world views’. These are contrasted with assumptions about the causal linkages about the implementation... the problem and of the population, and the boundary, or limiting conditions, of the effects of the program – what Checkland and Poulter’s (2006) Soft...

  1. Bootstrapping realized volatility and realized beta under a local Gaussianity assumption

    DEFF Research Database (Denmark)

    Hounyo, Ulrich

    The main contribution of this paper is to propose a new bootstrap method for statistics based on high frequency returns. The new method exploits the local Gaussianity and the local constancy of volatility of high frequency returns, two assumptions that can simplify inference in the high frequency...... simulations and use empirical data to compare the finite sample accuracy of our new bootstrap confidence intervals for integrated volatility and integrated beta with the existing results....
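
    A minimal numpy sketch of the idea, under the stated assumptions of local Gaussianity and locally constant volatility (illustrative only; the block size and percentile interval are hypothetical choices, not the paper's exact scheme), resamples each block of high frequency returns from a normal distribution with the block's own estimated variance:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_rv_interval(returns, block_size=10, n_boot=999):
    # Realized volatility RV = sum of squared intraday returns.
    r = np.asarray(returns, dtype=float)
    n_blocks = r.size // block_size
    blocks = r[: n_blocks * block_size].reshape(n_blocks, block_size)
    local_var = blocks.var(axis=1)        # locally constant volatility
    rv_hat = np.sum(blocks ** 2)
    # Local Gaussianity: redraw each block's returns as N(0, local_var).
    eta = rng.standard_normal((n_boot, n_blocks, block_size))
    r_star = np.sqrt(local_var)[None, :, None] * eta
    rv_star = np.sum(r_star ** 2, axis=(1, 2))
    return rv_hat, np.percentile(rv_star, [2.5, 97.5])

# Demo on simulated returns with slowly varying intraday volatility.
sigma = 0.01 * (1 + 0.5 * np.sin(np.linspace(0, 3, 390)))
rv, ci = bootstrap_rv_interval(rng.standard_normal(390) * sigma)
print(rv, ci)
```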

  2. Camera traps and mark-resight models: The value of ancillary data for evaluating assumptions

    Science.gov (United States)

    Parsons, Arielle W.; Simons, Theodore R.; Pollock, Kenneth H.; Stoskopf, Michael K.; Stocking, Jessica J.; O'Connell, Allan F.

    2015-01-01

    Unbiased estimators of abundance and density are fundamental to the study of animal ecology and critical for making sound management decisions. Capture–recapture models are generally considered the most robust approach for estimating these parameters but rely on a number of assumptions that are often violated but rarely validated. Mark-resight models, a form of capture–recapture, are well suited for use with noninvasive sampling methods and allow for a number of assumptions to be relaxed. We used ancillary data from continuous video and radio telemetry to evaluate the assumptions of mark-resight models for abundance estimation on a barrier island raccoon (Procyon lotor) population using camera traps. Our island study site was geographically closed, allowing us to estimate real survival and in situ recruitment in addition to population size. We found several sources of bias due to heterogeneity of capture probabilities in our study, including camera placement, animal movement, island physiography, and animal behavior. Almost all sources of heterogeneity could be accounted for using the sophisticated mark-resight models developed by McClintock et al. (2009b) and this model generated estimates similar to a spatially explicit mark-resight model previously developed for this population during our study. Spatially explicit capture–recapture models have become an important tool in ecology and confer a number of advantages; however, non-spatial models that account for inherent individual heterogeneity may perform nearly as well, especially where immigration and emigration are limited. Non-spatial models are computationally less demanding, do not make implicit assumptions related to the isotropy of home ranges, and can provide insights with respect to the biological traits of the local population.
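
    For contrast with the sophisticated models discussed above, the simplest mark-resight abundance logic is the Lincoln-Petersen ratio, shown here in Chapman's bias-corrected form as a hedged Python sketch (the raccoon counts are invented for illustration). It assumes marked and unmarked animals are equally sightable, exactly the kind of homogeneity assumption the ancillary data were used to check:

```python
def chapman_abundance(n_marked, n_sighted, n_marked_sighted):
    # Chapman's bias-corrected Lincoln-Petersen estimator:
    #   N_hat = (M + 1)(n + 1) / (m + 1) - 1,
    # valid when marked and unmarked animals are equally likely to be
    # resighted (no capture heterogeneity).
    return (n_marked + 1) * (n_sighted + 1) / (n_marked_sighted + 1) - 1

# Invented example: 30 marked raccoons, 80 camera detections, 20 of them marked.
print(round(chapman_abundance(30, 80, 20), 1))  # ~118.6 animals
```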

  3. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning

    Science.gov (United States)

    2016-01-01

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning. PMID:27310576

  4. Understanding the multiple realities of everyday life: basic assumptions in focus-group methodology.

    Science.gov (United States)

    Ivanoff, Synneve Dahlin; Hultberg, John

    2006-06-01

    In recent years, there has been a notable growth in the use of focus groups within occupational therapy. It is important to understand what kind of knowledge focus-group methodology is meant to acquire. The purpose of this article is to create an understanding of the basic assumptions within focus-group methodology from a theory of science perspective in order to elucidate and encourage reflection on the paradigm. This will be done based on a study of contemporary literature. To further the knowledge of basic assumptions the article will focus on the following themes: the focus-group research arena, the foundation and its core components; subjects, the role of the researcher and the participants; activities, the specific tasks and procedures. Focus-group methodology can be regarded as a specific research method within qualitative methodology with its own form of methodological criteria, as well as its own research procedures. Participants construct a framework to make sense of their experiences, and in interaction with others these experiences will be modified, leading to the construction of new knowledge. The role of the group leader is to facilitate a fruitful environment for the meaning to emerge and to ensure that the understanding of the meaning emerges independently of the interpreter. Focus-group methodology thus shares, in the authors' view, some basic assumptions with social constructivism.

  5. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Science.gov (United States)

    Hsu, Anne; Griffiths, Thomas L

    2016-01-01

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.
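
    The contrast between the two sampling assumptions reduces to a small Bayesian calculation. In this hedged Python sketch (the hypothesis space and data are invented stand-ins), strong sampling applies the size principle, so a construction's absence shifts belief toward the narrower grammar, while weak sampling leaves the prior untouched:

```python
# Toy hypothesis space: which dative constructions are grammatical.
H = {
    "narrow": {"give NP NP", "give NP to NP"},
    "broad":  {"give NP NP", "give NP to NP",
               "donate NP NP", "donate NP to NP"},
}
prior = {"narrow": 0.5, "broad": 0.5}
data = ["give NP NP", "give NP to NP", "give NP NP"]  # "donate NP NP" absent

def posterior(sampling):
    post = {}
    for h, constructions in H.items():
        if any(d not in constructions for d in data):
            likelihood = 0.0                          # datum outside grammar
        elif sampling == "strong":
            # Size principle: each datum drawn uniformly from the grammar.
            likelihood = (1.0 / len(constructions)) ** len(data)
        else:
            likelihood = 1.0                          # weak: membership only
        post[h] = prior[h] * likelihood
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}

print(posterior("strong"))  # narrow favored: absence counts as evidence
print(posterior("weak"))    # prior unchanged: absence is uninformative
```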

  6. Combustion Effects in Laser-oxygen Cutting: Basic Assumptions, Numerical Simulation and High Speed Visualization

    Science.gov (United States)

    Zaitsev, Alexander V.; Ermolaev, Grigory V.

    Laser-oxygen cutting is a technological process that is very complicated to describe theoretically. Iron-oxygen combustion plays a leading role, making the process highly effective and able to cut thicker plates, while at the same time producing special types of striations and other defects on the cut surface. In this paper, results of numerical simulation based on elementary assumptions about iron-oxygen combustion are verified against high-speed visualization of the laser-oxygen cutting process. Based on the assumption that iron oxide loses its protective properties after melting, a simulation of striation formation due to cycles of laser-induced, non-self-sustained combustion is proposed. The assumption that the reaction-limiting factor is oxygen transport from the jet to the cutting front allows the reaction intensity to be calculated by solving the Navier-Stokes and diffusion system in the gas phase. The influence of oxygen purity and pressure is studied theoretically. The results of numerical simulation are examined with high-speed visualization of laser-oxygen cutting of 4-20 mm mild steel plates at cutting conditions close to industrial.

  7. Fluid-Structure Interaction Modeling of Intracranial Aneurysm Hemodynamics: Effects of Different Assumptions

    Science.gov (United States)

    Rajabzadeh Oghaz, Hamidreza; Damiano, Robert; Meng, Hui

    2015-11-01

    Intracranial aneurysms (IAs) are pathological outpouchings of cerebral vessels, the progression of which is mediated by complex interactions between the blood flow and vasculature. Image-based computational fluid dynamics (CFD) has been used for decades to investigate IA hemodynamics. However, the commonly adopted simplifying assumptions in CFD (e.g. rigid wall) compromise the simulation accuracy and mask the complex physics involved in IA progression and eventual rupture. Several groups have considered the wall compliance by using fluid-structure interaction (FSI) modeling. However, FSI simulation is highly sensitive to numerical assumptions (e.g. linear-elastic wall material, Newtonian fluid, initial vessel configuration, and constant pressure outlet), the effects of which are poorly understood. In this study, the sensitivity of FSI simulations in patient-specific IAs is comprehensively investigated using a multi-stage approach with a varying level of complexity. We start with simulations incorporating several common simplifications: rigid wall, Newtonian fluid, and constant pressure at the outlets, and then we stepwise remove these simplifications until reaching the most comprehensive FSI simulations. Hemodynamic parameters such as wall shear stress and oscillatory shear index are assessed and compared at each stage to better understand the sensitivity of FSI simulations of IAs to model assumptions. Supported by the National Institutes of Health (1R01 NS 091075-01).

  8. Efficient Accountable Authority Identity-Based Encryption under Static Complexity Assumptions

    CERN Document Server

    Libert, Benoît

    2008-01-01

    At Crypto'07, Goyal introduced the concept of Accountable Authority Identity-Based Encryption (A-IBE) as a convenient means to reduce the amount of trust in authorities in Identity-Based Encryption (IBE). In this model, if the Private Key Generator (PKG) maliciously re-distributes users' decryption keys, it runs the risk of being caught and prosecuted. Goyal proposed two constructions: a first one based on Gentry's IBE which relies on strong assumptions (such as q-Bilinear Diffie-Hellman Inversion) and a second one resting on the more classical Decision Bilinear Diffie-Hellman (DBDH) assumption but that is too inefficient for practical use. In this work, we propose a new construction that is secure assuming the hardness of the DBDH problem. The efficiency of our scheme is comparable with that of Goyal's main proposal with the advantage of relying on static assumptions (i.e. the strength of which does not depend on the number of queries allowed to the adversary). By limiting the number of adversarial rewinds i...

  9. Bayesian Mass Estimates of the Milky Way II: The dark and light sides of parameter assumptions

    CERN Document Server

    Eadie, Gwendolyn M

    2016-01-01

    We present mass and mass profile estimates for the Milky Way Galaxy using the Bayesian analysis developed by Eadie et al. (2015b) and using globular clusters (GCs) as tracers of the Galactic potential. The dark matter and GCs are assumed to follow different spatial distributions; we assume power-law model profiles and use the model distribution functions described in Evans et al. (1997) and Deason et al. (2011, 2012a). We explore the relationships between assumptions about model parameters and how these assumptions affect mass profile estimates. We also explore how using subsamples of the GC population beyond certain radii affects mass estimates. After exploring the posterior distributions of different parameter assumption scenarios, we conclude that a conservative estimate of the Galaxy's mass within 125 kpc is $5.22\times10^{11} M_{\odot}$, with a $50\%$ probability region of $(4.79, 5.63) \times10^{11} M_{\odot}$. Extrapolating out to the virial radius, we obtain a virial mass for the Milky Way of $6.82\times10^{...

  10. Differentiating Different Modeling Assumptions in Simulations of MagLIF loads on the Z Generator

    Science.gov (United States)

    Jennings, C. A.; Gomez, M. R.; Harding, E. C.; Knapp, P. F.; Ampleford, D. J.; Hansen, S. B.; Weis, M. R.; Glinsky, M. E.; Peterson, K.; Chittenden, J. P.

    2016-10-01

    Metal liners imploded by a fast rising (...) MagLIF experiments have had some success. While experiments are increasingly well diagnosed, many of the measurements (particularly during stagnation) are time integrated, limited in spatial resolution or require additional assumptions to interpret in the context of a structured, rapidly evolving system. As such, in validating MHD calculations, there is the potential for the same observables in the experimental data to be reproduced under different modeling assumptions. Using synthetic diagnostics of the results of different pre-heat, implosion and stagnation simulations run with the Gorgon MHD code, we discuss how the interpretation of typical Z diagnostics relates to more fundamental simulation parameters. We then explore the extent to which different assumptions on instability development, current delivery, high-Z mix into the fuel and initial laser deposition can be differentiated in our existing measurements. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DoE's NNSA under contract DE-AC04-94AL85000.

  11. Effects of various assumptions on the calculated liquid fraction in isentropic saturated equilibrium expansions

    Science.gov (United States)

    Bursik, J. W.; Hall, R. M.

    1980-01-01

    The saturated equilibrium expansion approximation for two phase flow often involves ideal-gas and latent-heat assumptions to simplify the solution procedure. This approach is well documented by Wegener and Mack and works best at low pressures where deviations from ideal-gas behavior are small. A thermodynamic expression for liquid mass fraction that is decoupled from the equations of fluid mechanics is used to compare the effects of the various assumptions on nitrogen-gas saturated equilibrium expansion flow starting at 8.81 atm, 2.99 atm, and 0.45 atm, which are conditions representative of transonic cryogenic wind tunnels. For the highest pressure case, the entire set of ideal-gas and latent-heat assumptions are shown to be in error by 62 percent for the values of heat capacity and latent heat. An approximation of the exact, real-gas expression is also developed using a constant, two phase isentropic expansion coefficient which results in an error of only 2 percent for the high pressure case.
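
    To make the decoupled expression concrete, here is a hedged sketch of its standard form under the stated simplifications (ideal gas with constant c_p and R, constant latent heat L, small liquid fraction g); this is a textbook-style reconstruction from two-phase entropy arguments, not necessarily the exact expression used in the paper:

```latex
% Mixture entropy with a small condensate mass fraction g:
%   s = (1 - g) s_v + g s_l = s_v - g L / T .
% Holding s constant from the saturation-onset state (T_0, p_0, g = 0),
% and using the ideal-gas entropy difference
%   s_v(T, p) - s_v(T_0, p_0) = c_p ln(T / T_0) - R ln(p / p_0),
% the liquid mass fraction along the saturated isentrope is
\[
  g(T, p) = \frac{T}{L}\left[ c_p \ln\frac{T}{T_0} - R \ln\frac{p}{p_0} \right],
\]
% which is positive during expansion because, along the saturation line,
% Clausius-Clapeyron gives d(ln p)/d(ln T) = L/(R T) >> c_p / R.
```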

  12. What is this Substance? What Makes it Different? Mapping Progression in Students' Assumptions about Chemical Identity

    Science.gov (United States)

    Ngai, Courtney; Sevian, Hannah; Talanquer, Vicente

    2014-09-01

    Given the diversity of materials in our surroundings, one should expect scientifically literate citizens to have a basic understanding of the core ideas and practices used to analyze chemical substances. In this article, we use the term 'chemical identity' to encapsulate the assumptions, knowledge, and practices upon which chemical analysis relies. We conceive chemical identity as a core crosscutting disciplinary concept which can bring coherence and relevance to chemistry curricula at all educational levels, primary through tertiary. Although chemical identity is not a concept explicitly addressed by traditional chemistry curricula, its understanding can be expected to evolve as students are asked to recognize different types of substances and explore their properties. The goal of this contribution is to characterize students' assumptions about factors that determine chemical identity and to map how core assumptions change with training in the discipline. Our work is based on the review and critical analysis of existing research findings on students' alternative conceptions in chemistry education, and historical and philosophical analyses of chemistry. From this perspective, our analysis contributes to the growing body of research in the area of learning progressions. In particular, it reveals areas in which our understanding of students' ideas about chemical identity is quite robust, but also highlights the existence of major knowledge gaps that should be filled in to better foster student understanding. We provide suggestions in this area and discuss implications for the teaching of chemistry.

  13. The Universality of Intuition an aposteriori Criticize to an apriori Assumption

    Directory of Open Access Journals (Sweden)

    Roohollah Haghshenas

    2015-03-01

    Intuition has a central role in philosophy: the role of arbitrating between different opinions. When a philosopher shows that "intuition" supports his view, he takes this as a good reason in its favor. In contrast, if we show some contradictions between intuition and a theory or some implications of it, we think a replacement or at least some revisions would be needed. There are some well-known examples of this role for intuition in many fields of philosophy: the transplant case in ethics, the Chinese nation case in philosophy of mind, and the Gettier examples in epistemology. But there is an assumption here: we suppose all people think in the same manner, i.e. we think intuition(s) is universal. Experimental philosophy tries to study this assumption experimentally. This project continues Quine's movement toward the "pursuit of truth" from a naturalistic point of view, making epistemology "a branch of natural science." The work of experimental philosophy shows that in many cases people with different cultural backgrounds react differently to some specific moral or epistemological cases (like the Gettier examples), and thus intuition is not universal. So, many problems that are based on this assumption may be dissolved, may take plural forms for plural cultures, or may be bounded to some specific cultures (western culture in many cases).

  14. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Directory of Open Access Journals (Sweden)

    Anne Hsu

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.

  15. Climate Change: Implications for the Assumptions, Goals and Methods of Urban Environmental Planning

    Directory of Open Access Journals (Sweden)

    Kristina Hill

    2016-12-01

    Full Text Available As a result of increasing awareness of the implications of global climate change, shifts are becoming necessary and apparent in the assumptions, concepts, goals and methods of urban environmental planning. This review will present the argument that these changes represent a genuine paradigm shift in urban environmental planning. Reflection and action to develop this paradigm shift is critical now and in the next decades, because environmental planning for cities will only become more urgent as we enter a new climate period. The concepts, methods and assumptions that urban environmental planners have relied on in previous decades to protect people, ecosystems and physical structures are inadequate if they do not explicitly account for a rapidly changing regional climate context, specifically from a hydrological and ecological perspective. The over-arching concept of spatial suitability that guided planning in most of the 20th century has already given way to concepts that address sustainability, recognizing the importance of temporality. Quite rapidly, the concept of sustainability has been replaced in many planning contexts by the priority of establishing resilience in the face of extreme disturbance events. Now even this concept of resilience is being incorporated into a novel concept of urban planning as a process of adaptation to permanent, incremental environmental changes. This adaptation concept recognizes the necessity for continued resilience to extreme events, while acknowledging that permanent changes are also occurring as a result of trends that have a clear direction over time, such as rising sea levels. Similarly, the methods of urban environmental planning have relied on statistical data about hydrological and ecological systems that will not adequately describe these systems under a new climate regime. These methods are beginning to be replaced by methods that make use of early warning systems for regime shifts, and process

  17. Exact Controllability for a Class of Nonlinear Evolution Control Systems

    Institute of Scientific and Technical Information of China (English)

    Lü Yue; Li Yong

    2015-01-01

    In this paper, we study the exact controllability of nonlinear control systems. Controllability results are established using monotone operator theory. No compactness assumptions are imposed in the main results.

  18. On the underlying assumptions of threshold Boolean networks as a model for genetic regulatory network behavior

    Science.gov (United States)

    Tran, Van; McCall, Matthew N.; McMurray, Helene R.; Almudevar, Anthony

    2013-01-01

    Boolean networks (BoN) are relatively simple and interpretable models of gene regulatory networks. Specifying these models with fewer parameters while retaining their ability to describe complex regulatory relationships is an ongoing methodological challenge. Additionally, extending these models to incorporate variable gene decay rates, asynchronous gene response, and synergistic regulation while maintaining their Markovian nature increases the applicability of these models to genetic regulatory networks (GRN). We explore a previously-proposed class of BoNs characterized by linear threshold functions, which we refer to as threshold Boolean networks (TBN). Compared to traditional BoNs with unconstrained transition functions, these models require far fewer parameters and offer a more direct interpretation. However, the functional form of a TBN does result in a reduction in the regulatory relationships which can be modeled. We show that TBNs can be readily extended to permit self-degradation, with explicitly modeled degradation rates. We note that the introduction of variable degradation compromises the Markovian property fundamental to BoN models but show that a simple state augmentation procedure restores their Markovian nature. Next, we study the effect of assumptions regarding self-degradation on the set of possible steady states. Our findings are captured in two theorems relating self-degradation and regulatory feedback to the steady state behavior of a TBN. Finally, we explore assumptions of synchronous gene response and asynergistic regulation and show that TBNs can be easily extended to relax these assumptions. Applying our methods to the budding yeast cell-cycle network revealed that although the network is complex, its steady state is simplified by the presence of self-degradation and lack of purely positive regulatory cycles. PMID:24376454
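
    As a rough illustration of the model class (a sketch with invented weights and thresholds, not the authors' yeast network), a synchronous threshold Boolean network update can be written in a few lines:

    ```python
    import numpy as np

    # Gene i switches on iff its weighted regulatory input exceeds a threshold.
    W = np.array([[ 0, 1, -1],   # row i: regulatory weights acting on gene i
                  [ 1, 0,  0],
                  [-1, 1,  0]])
    theta = np.array([0, 0, 0])  # per-gene activation thresholds

    def step(x):
        # x is a 0/1 state vector; all genes update simultaneously (synchronous).
        return (W @ x > theta).astype(int)

    x = np.array([1, 0, 0])
    for t in range(6):
        print(t, x)
        x = step(x)
    # A variable-decay extension would let a gene hold its value for several
    # steps before switching off; as the paper notes, augmenting the state
    # with each gene's "age" restores the Markov property.
    ```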

  19. On the underlying assumptions of threshold Boolean networks as a model for genetic regulatory network behavior

    Directory of Open Access Journals (Sweden)

    Van eTran

    2013-12-01

    Full Text Available Boolean networks (BoN) are relatively simple and interpretable models of gene regulatory networks. Specifying these models with fewer parameters while retaining their ability to describe complex regulatory relationships is an ongoing methodological challenge. Additionally, extending these models to incorporate variable gene decay rates, asynchronous gene response, and synergistic regulation while maintaining their Markovian nature increases the applicability of these models to genetic regulatory networks. We explore a previously-proposed class of BoNs characterized by linear threshold functions, which we refer to as threshold Boolean networks (TBN). Compared to traditional BoNs with unconstrained transition functions, these models require far fewer parameters and offer a more direct interpretation. However, the functional form of a TBN does result in a reduction in the regulatory relationships which can be modeled. We show that TBNs can be readily extended to permit self-degradation, with explicitly modeled degradation rates. We note that the introduction of variable degradation compromises the Markovian property fundamental to BoN models but show that a simple state augmentation procedure restores their Markovian nature. Next, we study the effect of assumptions regarding self-degradation on the set of possible steady states. Our findings are captured in two theorems relating self-degradation and regulatory feedback to the steady state behavior of a TBN. Finally, we explore assumptions of synchronous gene response and asynergistic regulation and show that TBNs can be easily extended to relax these assumptions. Applying our methods to the budding yeast cell-cycle network revealed that although the network is complex, its steady state is simplified by the presence of self-degradation and lack of purely positive regulatory cycles.

  20. Old and new ideas for data screening and assumption testing for exploratory and confirmatory factor analysis

    Directory of Open Access Journals (Sweden)

    David B. Flora

    2012-03-01

    Full Text Available We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables.

  1. Old and new ideas for data screening and assumption testing for exploratory and confirmatory factor analysis.

    Science.gov (United States)

    Flora, David B; Labrish, Cathy; Chalmers, R Philip

    2012-01-01

    We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables, and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables.
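
    The ramification for product-moment correlations can be made concrete with a small simulation (our toy example; the latent correlation and cutpoints are invented): Pearson correlations of coarsely categorized items understate the latent correlation, which is what polychoric estimation aims to correct.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    rho = 0.6                                  # latent (true) correlation
    cov = [[1, rho], [rho, 1]]
    z = rng.multivariate_normal([0, 0], cov, size=100_000)

    # Discretize each latent variable into a 4-category "item" via cutpoints.
    cuts = [-1.0, 0.0, 1.0]
    items = np.stack([np.digitize(z[:, j], cuts) for j in range(2)], axis=1)

    r_latent = np.corrcoef(z[:, 0], z[:, 1])[0, 1]
    r_items = np.corrcoef(items[:, 0], items[:, 1])[0, 1]
    print(f"latent r = {r_latent:.3f}, product-moment r on items = {r_items:.3f}")
    # The item-level r is attenuated; a polychoric estimate computed from the
    # item contingency table would instead recover approximately 0.6.
    ```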

  2. Semi-Supervised Transductive Hot Spot Predictor Working on Multiple Assumptions

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-05-23

    Protein-protein interactions are critically dependent on just a few residues (“hot spots”) at the interfaces. Hot spots make a dominant contribution to the binding free energy and if mutated they can disrupt the interaction. As mutagenesis studies require significant experimental efforts, there exists a need for accurate and reliable computational hot spot prediction methods. Compared to the supervised hot spot prediction algorithms, the semi-supervised prediction methods can take into consideration both the labeled and unlabeled residues in the dataset during the prediction procedure. The transductive support vector machine has been utilized for this task and demonstrated a better prediction performance. To the best of our knowledge, however, none of the transductive semi-supervised algorithms takes all three semi-supervised assumptions, i.e., the smoothness, cluster and manifold assumptions, together into account during learning. In this paper, we propose a novel semi-supervised method for hot spot residue prediction, by considering all three semi-supervised assumptions using nonlinear models. Our algorithm, IterPropMCS, works in an iterative manner. In each iteration, the algorithm first propagates the labels of the labeled residues to the unlabeled ones, along the shortest path between them on a graph, assuming that they lie on a nonlinear manifold. Then it selects the most confident residues as the labeled ones for the next iteration, according to the cluster and smoothness criteria, which is implemented by a nonlinear density estimator. Experiments on a benchmark dataset, using protein structure-based features, demonstrate that our approach is effective in predicting hot spots and compares favorably to other available methods. The results also show that our method outperforms the state-of-the-art transductive learning methods.
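
    A highly simplified stand-in for the propagation step (plain graph-based label propagation on invented 2-D data; not the authors' IterPropMCS, which additionally uses shortest-path propagation and cluster/smoothness-based selection) might look like:

    ```python
    import numpy as np

    # Labels of labeled points spread to unlabeled ones over a similarity
    # graph, so nearby points on the (assumed) manifold end up agreeing.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.3, (20, 2)),    # cluster of "hot spots"
                   rng.normal(2, 0.3, (20, 2))])   # cluster of non-hot spots
    y = np.full(40, -1.0)                          # -1 marks "unlabeled"
    y[0], y[20] = 1.0, 0.0                         # one known label per cluster

    # Gaussian affinity graph, row-normalized into transition probabilities.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / 0.5)
    P = W / W.sum(axis=1, keepdims=True)

    f = np.where(y >= 0, y, 0.5)                   # initial scores
    labeled = y >= 0
    for _ in range(200):
        f = P @ f
        f[labeled] = y[labeled]                    # clamp known labels
    print("scores, first cluster:", np.round(f[:5], 2))
    print("scores, second cluster:", np.round(f[-5:], 2))
    ```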

  3. On the relevance of assumptions associated with classical factor analytic approaches.

    Science.gov (United States)

    Kasper, Daniel; Unlü, Ali

    2013-01-01

    A personal trait, for example a person's cognitive ability, represents a theoretical concept postulated to explain behavior. Interesting constructs are latent, that is, they cannot be observed. Latent variable modeling constitutes a methodology to deal with hypothetical constructs. Constructs are modeled as random variables and become components of a statistical model. As random variables, they possess a probability distribution in the population of reference. In applications, this distribution is typically assumed to be the normal distribution. The normality assumption may be reasonable in many cases, but there are situations where it cannot be justified. For example, this is true for criterion-referenced tests or for background characteristics of students in large scale assessment studies. Nevertheless, the normal procedures in combination with the classical factor analytic methods are frequently pursued, although the effects of violating this "implicit" assumption are not clear in general. In a simulation study, we investigate whether classical factor analytic approaches can be instrumental in estimating the factorial structure and properties of the population distribution of a latent personal trait from educational test data, when violations of classical assumptions such as those mentioned above are present. The results indicate that having a latent non-normal distribution clearly affects the estimation of the distribution of the factor scores and properties thereof. Thus, when the population distribution of a personal trait is assumed to be non-symmetric, we recommend avoiding those factor analytic approaches for estimation of a person's factor score, even though the number of extracted factors and the estimated loading matrix may not be strongly affected. An application to the Progress in International Reading Literacy Study (PIRLS) is given. Comments on possible implications for the Programme for International Student Assessment (PISA) complete the presentation.

  4. Evaluating the reliability of equilibrium dissolution assumption from residual gasoline in contact with water saturated sands

    Science.gov (United States)

    Lekmine, Greg; Sookhak Lari, Kaveh; Johnston, Colin D.; Bastow, Trevor P.; Rayner, John L.; Davis, Greg B.

    2017-01-01

    Understanding dissolution dynamics of hazardous compounds from complex gasoline mixtures is key to long-term predictions of groundwater risks. The aim of this study was to investigate whether the local equilibrium assumption for BTEX and TMBs (trimethylbenzenes) dissolution was valid under variable saturation in two-dimensional flow conditions, and to evaluate the impact of local heterogeneities when equilibrium is verified at the scale of investigation. An initial residual gasoline saturation was established over the upper two-thirds of a water-saturated sand pack. A constant horizontal pore velocity was maintained and water samples were recovered across 38 sampling ports over 141 days. Inside the residual NAPL zone, BTEX and TMBs dissolution curves were in agreement with the TMVOC model based on the local equilibrium assumption. Results compared to previous numerical studies suggest the presence of small-scale dissolution fingering created perpendicular to the horizontal dissolution front, mainly triggered by heterogeneities in the medium structure and the local NAPL residual saturation. In the transition zone, TMVOC was able to represent a range of behaviours exhibited by the data, confirming equilibrium or near-equilibrium dissolution at the scale of investigation. The model locally showed discrepancies with the most soluble compounds, i.e. benzene and toluene, due to local heterogeneities, indicating that at smaller scales flow bypassing and channelling may have occurred. Under these conditions mass transfer rates were still high enough to fall under the equilibrium assumption in TMVOC at the scale of investigation. Comparisons with other models involving upscaled mass transfer rates demonstrated that such approximations with TMVOC could lead to overestimation of BTEX dissolution rates and underestimation of the total remediation time.

  5. The Avalanche Hypothesis and Compression of Morbidity: Testing Assumptions through Cohort-Sequential Analysis.

    Directory of Open Access Journals (Sweden)

    Jordan Silberman

    Full Text Available The compression of morbidity model posits a breakpoint in the adult lifespan that separates an initial period of relative health from a subsequent period of ever increasing morbidity. Researchers often assume that such a breakpoint exists; however, this assumption is hitherto untested. Our objective was to test the assumption that a breakpoint exists (which we term a morbidity tipping point) separating a period of relative health from a subsequent deterioration in health status. An analogous tipping point for healthcare costs was also investigated. Four years of adults' (N = 55,550) morbidity and costs data were retrospectively analyzed. Data were collected in Pittsburgh, PA between 2006 and 2009; analyses were performed in Rochester, NY and Ann Arbor, MI in 2012 and 2013. Cohort-sequential and hockey stick regression models were used to characterize long-term trajectories and tipping points, respectively, for both morbidity and costs. Morbidity increased exponentially with age (P < .001). A morbidity tipping point was observed at age 45.5 (95% CI, 41.3-49.7). An exponential trajectory was also observed for costs (P < .001), with a costs tipping point occurring at age 39.5 (95% CI, 32.4-46.6). Following their respective tipping points, both morbidity and costs increased substantially (Ps < .001). Findings support the existence of a morbidity tipping point, confirming an important but untested assumption. This tipping point, however, may occur earlier in the lifespan than is widely assumed. An "avalanche of morbidity" occurred after the morbidity tipping point: an ever increasing rate of morbidity progression. For costs, an analogous tipping point and "avalanche" were observed. The time point at which costs began to increase substantially occurred approximately 6 years before health status began to deteriorate.
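
    The hockey stick (piecewise-linear) regression used to locate tipping points is easy to sketch on synthetic data (all numbers below are invented; this is not the study's dataset):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Flat baseline before the breakpoint, linear increase after it.
    def hockey_stick(age, breakpoint, baseline, slope):
        return baseline + slope * np.maximum(age - breakpoint, 0.0)

    rng = np.random.default_rng(42)
    age = rng.uniform(20, 80, 2000)
    morbidity = hockey_stick(age, 45.0, 1.0, 0.12) + rng.normal(0, 0.3, age.size)

    params, cov = curve_fit(hockey_stick, age, morbidity, p0=(50.0, 1.0, 0.1))
    bp, base, slope = params
    bp_se = np.sqrt(cov[0, 0])
    print(f"estimated tipping point: {bp:.1f} years (SE {bp_se:.2f})")
    ```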

  6. Bayesian Mass Estimates of the Milky Way: The Dark and Light Sides of Parameter Assumptions

    Science.gov (United States)

    Eadie, Gwendolyn M.; Harris, William E.

    2016-10-01

    We present mass and mass profile estimates for the Milky Way (MW) Galaxy using the Bayesian analysis developed by Eadie et al. and using globular clusters (GCs) as tracers of the Galactic potential. The dark matter and GCs are assumed to follow different spatial distributions; we assume power-law model profiles and use the model distribution functions described in Evans et al. and Deason et al. We explore the relationships between assumptions about model parameters and how these assumptions affect mass profile estimates. We also explore how using subsamples of the GC population beyond certain radii affects mass estimates. After exploring the posterior distributions of different parameter assumption scenarios, we conclude that a conservative estimate of the Galaxy's mass within 125 kpc is 5.22 × 10^11 M_⊙, with a 50% probability region of (4.79, 5.63) × 10^11 M_⊙. Extrapolating out to the virial radius, we obtain a virial mass for the MW of 6.82 × 10^11 M_⊙ with a 50% credible region of (6.06, 7.53) × 10^11 M_⊙ (r_vir = 185 ± 7 kpc). If we consider only the GCs beyond 10 kpc, then the virial mass is 9.02 (5.69, 10.86) × 10^11 M_⊙ (r_vir = 198 (+19, −24) kpc). We also arrive at an estimate of the velocity anisotropy parameter β of the GC population, which is β = 0.28 with a 50% credible region of (0.21, 0.35). Interestingly, the mass estimates are sensitive to both the dark matter halo potential and visible matter tracer parameters, but are not very sensitive to the anisotropy parameter.

  7. Investigating Darcy-scale assumptions by means of a multiphysics algorithm

    Science.gov (United States)

    Tomin, Pavel; Lunati, Ivan

    2016-09-01

    Multiphysics (or hybrid) algorithms, which couple Darcy and pore-scale descriptions of flow through porous media in a single numerical framework, are usually employed to decrease the computational cost of full pore-scale simulations or to increase the accuracy of pure Darcy-scale simulations when a simple macroscopic description breaks down. Despite the massive increase in available computational power, the application of these techniques remains limited to core-size problems and upscaling remains crucial for practical large-scale applications. In this context, the Hybrid Multiscale Finite Volume (HMsFV) method, which constructs the macroscopic (Darcy-scale) problem directly by numerical averaging of pore-scale flow, offers not only a flexible framework to efficiently deal with multiphysics problems, but also a tool to investigate the assumptions used to derive macroscopic models and to better understand the relationship between pore-scale quantities and the corresponding macroscale variables. Indeed, by direct comparison of the multiphysics solution with a reference pore-scale simulation, we can assess the validity of the closure assumptions inherent to the multiphysics algorithm and infer the consequences for macroscopic models at the Darcy scale. We show that the definition of the scale ratio based on the geometric properties of the porous medium is well justified only for single-phase flow, whereas in case of unstable multiphase flow the nonlinear interplay between different forces creates complex fluid patterns characterized by new spatial scales, which emerge dynamically and weaken the scale-separation assumption. In general, the multiphysics solution proves very robust even when the characteristic size of the fluid-distribution patterns is comparable with the observation length, provided that all relevant physical processes affecting the fluid distribution are considered. This suggests that macroscopic constitutive relationships (e.g., the relative

  8. Benchmarking biological nutrient removal in wastewater treatment plants: influence of mathematical model assumptions.

    Science.gov (United States)

    Flores-Alsina, Xavier; Gernaey, Krist V; Jeppsson, Ulf

    2012-01-01

    This paper examines the effect of different model assumptions when describing biological nutrient removal (BNR) by the activated sludge models (ASM) 1, 2d & 3. The performance of a nitrogen removal (WWTP1) and a combined nitrogen and phosphorus removal (WWTP2) benchmark wastewater treatment plant was compared for a series of model assumptions. Three different model approaches describing BNR are considered. In the reference case, the original model implementations are used to simulate WWTP1 (ASM1 & 3) and WWTP2 (ASM2d). The second set of models includes a reactive settler, which extends the description of the non-reactive TSS sedimentation and transport in the reference case with the full set of ASM processes. Finally, the third set of models is based on including electron acceptor dependency of biomass decay rates for ASM1 (WWTP1) and ASM2d (WWTP2). The results show that incorporation of a reactive settler: (1) increases the hydrolysis of particulates; (2) increases the overall plant's denitrification efficiency by reducing the S(NOx) concentration at the bottom of the clarifier; (3) increases the oxidation of COD compounds; (4) increases X(OHO) and X(ANO) decay; and, finally, (5) increases the growth of X(PAO) and formation of X(PHA,Stor) for ASM2d, which has a major impact on the whole P removal system. Introduction of electron acceptor dependent decay leads to a substantial increase of the concentration of X(ANO), X(OHO) and X(PAO) in the bottom of the clarifier. The paper ends with a critical discussion of the influence of the different model assumptions, and emphasizes the need for a model user to understand the significant differences in simulation results that are obtained when applying different combinations of 'standard' models.

  9. From the lab to the world: The paradigmatic assumption and the functional cognition of avian foraging

    Institute of Scientific and Technical Information of China (English)

    Danielle SULIKOWSKI; Darren BURKE

    2015-01-01

    Mechanisms of animal learning and memory were traditionally studied without reference to niche-specific functional considerations. More recently, ecological demands have informed such investigations, most notably with respect to foraging in birds. In parallel, behavioural ecologists, primarily concerned with functional optimization, have begun to consider the role of mechanistic factors, including cognition, to explain apparent deviations from optimal predictions. In the present paper we discuss the application of laboratory-based constructs and paradigms of cognition to the real-world challenges faced by avian foragers. We argue that such applications have been handicapped by what we term the 'paradigmatic assumption': the assumption that a given laboratory paradigm maps well enough onto a congruent cognitive mechanism (or cognitive ability) to justify conflation of the two. We present evidence against the paradigmatic assumption and suggest that to achieve a profitable integration between function and mechanism, with respect to animal cognition, a new conceptualization of cognitive mechanisms (functional cognition) is required. This new conceptualization should define cognitive mechanisms based on the informational properties of the animal's environment and the adaptive challenges faced. Cognitive mechanisms must be examined in settings that mimic the important aspects of the natural environment, using customized tasks designed to probe defined aspects of the mechanisms' operation. We suggest that this approach will facilitate investigations of the functional and evolutionary relevance of cognitive mechanisms, as well as the patterns of divergence, convergence and specialization of cognitive mechanisms within and between species [Current Zoology 61 (2): 328-340, 2015].

  10. Assumption-versus data-based approaches to summarizing species' ranges.

    Science.gov (United States)

    Peterson, A Townsend; Navarro-Sigüenza, Adolfo G; Gordillo, Alejandro

    2016-08-04

    For conservation decision making, species' geographic distributions are mapped using various approaches. Some such efforts have downscaled versions of coarse-resolution extent-of-occurrence maps to fine resolutions for conservation planning. We examined the quality of the extent-of-occurrence maps as range summaries and the utility of refining those maps into fine-resolution distributional hypotheses. Extent-of-occurrence maps tend to be overly simple, omit many known and well-documented populations, and likely frequently include many areas not holding populations. Refinement steps involve typological assumptions about habitat preferences and elevational ranges of species, which can introduce substantial error in estimates of species' true areas of distribution. However, no model-evaluation steps are taken to assess the predictive ability of these models, so model inaccuracies are not noticed. Whereas range summaries derived by these methods may be useful in coarse-grained, global-extent studies, their continued use in on-the-ground conservation applications at fine spatial resolutions is not advisable in light of reliance on assumptions, lack of real spatial resolution, and lack of testing. In contrast, data-driven techniques that integrate primary data on biodiversity occurrence with remotely sensed data that summarize environmental dimensions (i.e., ecological niche modeling or species distribution modeling) offer data-driven solutions based on a minimum of assumptions that can be evaluated and validated quantitatively to offer a well-founded, widely accepted method for summarizing species' distributional patterns for conservation applications. © 2016 Society for Conservation Biology.

  11. A computational model to investigate assumptions in the headturn preference procedure

    Directory of Open Access Journals (Sweden)

    Christina eBergmann

    2013-10-01

    Full Text Available In this paper we use a computational model to investigate four assumptions that are tacitly present in interpreting the results of studies on infants' speech processing abilities using the Headturn Preference Procedure (HPP): (1) behavioural differences originate in different processing; (2) processing involves some form of recognition; (3) words are segmented from connected speech; and (4) differences between infants should not affect overall results. In addition, we investigate the impact of two potentially important aspects in the design and execution of the experiments: (a) the specific voices used in the two parts of HPP experiments (familiarisation and test) and (b) the experimenter's criterion for what counts as a sufficient headturn angle. The model is designed to maximise cognitive plausibility. It takes real speech as input, and it contains a module that converts the output of internal speech processing and recognition into headturns that can yield real-time listening preference measurements. Internal processing is based on distributed episodic representations in combination with a matching procedure based on the assumption that complex episodes can be decomposed as positive weighted sums of simpler constituents. Model simulations show that the first assumptions hold under two different definitions of recognition. However, explicit segmentation is not necessary to simulate the behaviours observed in infant studies. Differences in attention span between infants can affect the outcomes of an experiment. The same holds for the experimenter's decision criterion. The speakers used in experiments affect outcomes in complex ways that require further investigation. The paper ends with recommendations for future studies using the HPP.
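
    The decomposition assumption in the matching procedure amounts to a non-negative least squares problem; here is a minimal sketch (our toy vectors, not the model's actual speech representations):

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Decompose a complex episode as a positive weighted sum of simpler
    # stored constituents: solve min ||C w - e|| subject to w >= 0.
    rng = np.random.default_rng(7)
    constituents = rng.random((50, 6))          # 6 stored traces, 50-dim episodes
    true_w = np.array([0.8, 0.0, 0.3, 0.0, 0.0, 0.5])
    episode = constituents @ true_w + rng.normal(0, 0.01, 50)

    w, residual = nnls(constituents, episode)
    print("recovered weights:", np.round(w, 2))   # close to true_w
    print("match quality (residual):", round(residual, 3))
    # A low residual would count as "recognition" of familiarised material.
    ```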

  12. Tests of data quality, scaling assumptions, and reliability of the Danish SF-36

    DEFF Research Database (Denmark)

    Bjorner, J B; Damsgaard, M T; Watt, T;

    1998-01-01

    We used general population data (n = 4084) to examine data completeness, response consistency, tests of scaling assumptions, and reliability of the Danish SF-36 Health Survey. We compared traditional multitrait scaling analyses to analyses using polychoric correlations and Spearman correlations...... discriminant validity, equal item-own scale correlations, and equal variances) were satisfactory in the total sample and in all subgroups. The SF-36 could discriminate between levels of health in all subgroups, but there were skewness, kurtosis, and ceiling effects in many subgroups (elderly people and people...

  13. Science with the Square Kilometer Array: Motivation, Key Science Projects, Standards and Assumptions

    CERN Document Server

    Carilli, C

    2004-01-01

    The Square Kilometer Array (SKA) represents the next major, and natural, step in radio astronomical facilities, providing a two-orders-of-magnitude increase in collecting area over existing telescopes. In a series of meetings, starting in Groningen, the Netherlands (August 2002) and culminating in a `science retreat' in Leiden (November 2003), the SKA International Science Advisory Committee (ISAC) conceived of, and carried out, a complete revision of the SKA science case (to appear in New Astronomy Reviews). This preface includes: (i) general introductory material, (ii) summaries of the key science programs, and (iii) a detailed listing of standards and assumptions used in the revised science case.

  14. Condition for Energy Efficient Watermarking with Random Vector Model without WSS Assumption

    CERN Document Server

    Yan, Bin; Guo, Yinjing

    2009-01-01

    Energy efficient watermarking preserves the watermark energy after a linear attack as much as possible. In this letter we consider non-stationary signal models and derive conditions for energy efficient watermarking under a random vector model without the WSS assumption. We find that the covariance matrix of the energy efficient watermark should be proportional to the host covariance matrix to best resist optimal linear removal attacks. For WSS processes our result reduces to the well-known power spectrum condition. An intuitive geometric interpretation of the results is also discussed, which in turn provides a simpler proof of the main results.

  15. Local conservation scores without a priori assumptions on neutral substitution rates

    Directory of Open Access Journals (Sweden)

    Hagenauer Joachim

    2008-04-01

    Full Text Available Background: Comparative genomics aims to detect signals of evolutionary conservation as an indicator of functional constraint. Surprisingly, results of the ENCODE project revealed that about half of the experimentally verified functional elements found in non-coding DNA were classified as unconstrained by computational predictions. Following this observation, it has been hypothesized that this may be partly explained by biased estimates of neutral evolutionary rates used by existing sequence conservation metrics. All methods we are aware of rely on a comparison with the neutral rate, and conservation is estimated by measuring the deviation of a particular genomic region from this rate. Consequently, it is a reasonable assumption that inaccurate neutral rate estimates may lead to biased conservation and constraint estimates. Results: We propose a conservation signal that is produced by local Maximum Likelihood estimation of evolutionary parameters using an optimized sliding window and present a Kullback-Leibler projection that allows multiple different estimated parameters to be transformed into a conservation measure. This conservation measure does not rely on assumptions about neutral evolutionary substitution rates, and few a priori assumptions on the properties of the conserved regions are imposed. We show the accuracy of our approach (KuLCons) on synthetic data and compare it to the scores generated by state-of-the-art methods (phastCons, GERP, SCONE) in an ENCODE region. We find that KuLCons is most often in agreement with the conservation/constraint signatures detected by GERP and SCONE, while qualitatively very different patterns from phastCons are observed. Unlike standard methods, KuLCons can be extended to more complex evolutionary models, e.g., taking insertion and deletion events into account, and corresponding results show that scores obtained under this model can diverge significantly from scores using the simpler model.
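
    As a rough sketch of the underlying idea (a drastic simplification with invented numbers; KuLCons itself estimates full evolutionary model parameters), one can score a window by the Kullback-Leibler divergence between its local maximum-likelihood estimate and a background estimate, with no neutral-rate model assumed:

    ```python
    import numpy as np

    def kl(p, q):
        # KL divergence between two categorical distributions.
        return float(np.sum(p * np.log(p / q)))

    # Background: fraction of alignment columns showing a mismatch genome-wide.
    background = np.array([0.30, 0.70])          # [mismatch, match]

    def score(window_columns):
        mism = np.mean(window_columns)            # ML estimate in the window
        local = np.clip(np.array([mism, 1 - mism]), 1e-6, None)
        return kl(local, background)

    conserved = np.array([0]*19 + [1])            # 5% mismatches: conserved
    neutralish = np.array([0]*14 + [1]*6)         # 30% mismatches: background-like
    print("conserved window score :", round(score(conserved), 3))
    print("background-like score  :", round(score(neutralish), 3))
    ```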

  16. Untested assumptions: psychological research and credibility assessment in legal decision-making

    Directory of Open Access Journals (Sweden)

    Jane Herlihy

    2015-05-01

    Full Text Available Background: Trauma survivors often have to negotiate legal systems such as refugee status determination or the criminal justice system. Methods & results: We outline and discuss the contribution which research on trauma and related psychological processes can make to two particular areas of law where complex and difficult legal decisions must be made: in claims for refugee and humanitarian protection, and in reporting and prosecuting sexual assault in the criminal justice system. Conclusion: There is a breadth of psychological knowledge that, if correctly applied, would limit the inappropriate reliance on assumptions and myth in legal decision-making in these settings. Specific recommendations are made for further study.

  17. What is a god? Metatheistic assumptions in Old Testament Yahwism(s

    Directory of Open Access Journals (Sweden)

    J W Gericke

    2006-09-01

    Full Text Available In this article, the author provides a prolegomenon to further research attempting to answer a most fundamental and basic question, much more so than what has thus far been the case in the disciplines of Old Testament theology and history of Israelite religion. It concerns the implicit assumptions in the Hebrew Bible's discourse about the fundamental nature of deity. In other words, the question is not, 'What is YHWH like?' but rather, 'What, according to the Old Testament texts, is a god?'

  18. Bases, Assumptions, and Results of the Flowsheet Calculations for the Decision Phase Salt Disposition Alternatives

    Energy Technology Data Exchange (ETDEWEB)

    Dimenna, R.A.; Jacobs, R.A.; Taylor, G.A.; Durate, O.E.; Paul, P.K.; Elder, H.H.; Pike, J.A.; Fowler, J.R.; Rutland, P.L.; Gregory, M.V.; Smith III, F.G.; Hang, T.; Subosits, S.G.; Campbell, S.G.

    2001-03-26

    The High Level Waste (HLW) Salt Disposition Systems Engineering Team was formed on March 13, 1998, and chartered to identify options, evaluate alternatives, and recommend a selected alternative(s) for processing HLW salt to a permitted wasteform. This requirement arises because the existing In-Tank Precipitation process at the Savannah River Site, as currently configured, cannot simultaneously meet the HLW production and Authorization Basis safety requirements. This engineering study was performed in four phases. This document provides the technical bases, assumptions, and results of this engineering study.

  19. Examining Assumptions and Limitations of Research on the Effects of Emerging Technologies for Teaching and Learning in Higher Education

    Science.gov (United States)

    Kirkwood, Adrian; Price, Linda

    2013-01-01

    This paper examines assumptions and beliefs underpinning research into educational technology. It critically reviews some approaches used to investigate the impact of technologies for teaching and learning. It focuses on comparative studies, performance comparisons and attitudinal studies to illustrate how under-examined assumptions lead to…

  20. The Assumption of Proportional Components when Candecomp Is Applied to Symmetric Matrices in the Context of INDSCAL

    Science.gov (United States)

    Dosse, Mohammed Bennani; Berge, Jos M. F.

    2008-01-01

    The use of Candecomp to fit scalar products in the context of INDSCAL is based on the assumption that the symmetry of the data matrices involved causes the component matrices to be equal when Candecomp converges. Ten Berge and Kiers gave examples where this assumption is violated for Gramian data matrices. These examples are believed to be local…

  1. On the Impact of the Dutch Educational Supervision Act : Analyzing Assumptions Concerning the Inspection of Primary Education

    NARCIS (Netherlands)

    Ehren, Melanie C. M.; Leeuw, Frans L.; Scheerens, Jaap

    2001-01-01

    This article uses a policy-scientific approach to reconstruct assumptions underlying the Dutch Educational Supervision Act. We show an example of how to reconstruct and evaluate a program theory that is based on legislation of inspection. The assumptions explain how inspection leads to school improvement.

  3. Washing machines, driers and dishwashers. Background reports. Vol. 1: Basic assumptions and impact analyses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-06-01

    Before analyzing wet appliances to establish a common European Union (EU) basis for defining efficiency in domestic wet appliances, a framework has to be set up. The first part of this Background Report deals with such a framework and with definitions, basic assumptions and test methods. The next sections give a short introduction to the framework of wet appliances and definitions taken from international standards. Chapter 2 elaborates on basic assumptions regarding appliance categories, capacity, energy efficiency and performance. Chapter 3 contains a survey of test methods from international standards and chapter 4 shows the present state of standardization in the International Electrotechnical Commission (IEC) and the Comité Européen de Normalisation Électrotechnique (CENELEC). The next two chapters of the report deal with the user of wet appliances: the consumer. Aspects of daily use, such as ownership level, frequency of use and type of programme used, are analysed in more detail. An important question for this study is whether a `European consumer' exists; section 5.5 deals with this subject. Two elements of the marketing mix, product and price, are considered. Several possible product options are reviewed and attention is paid to the impact of price on consumers' buying decisions. The findings of this report and recommendations are summarized. (au)

  4. Testing surrogacy assumptions: can threatened and endangered plants be grouped by biological similarity and abundances?

    Science.gov (United States)

    Che-Castaldo, Judy P; Neel, Maile C

    2012-01-01

    There is renewed interest in implementing surrogate species approaches in conservation planning due to the large number of species in need of management but limited resources and data. One type of surrogate approach involves selection of one or a few species to represent a larger group of species requiring similar management actions, so that protection and persistence of the selected species would result in conservation of the group of species. However, among the criticisms of surrogate approaches is the need to test underlying assumptions, which remain rarely examined. In this study, we tested one of the fundamental assumptions underlying use of surrogate species in recovery planning: that there exist groups of threatened and endangered species that are sufficiently similar to warrant similar management or recovery criteria. Using a comprehensive database of all plant species listed under the U.S. Endangered Species Act and tree-based random forest analysis, we found no evidence of species groups based on a set of distributional and biological traits or by abundances and patterns of decline. Our results suggested that application of surrogate approaches for endangered species recovery would be unjustified. Thus, conservation planning focused on individual species and their patterns of decline will likely be required to recover listed species.

  5. Maximizing the Delivery of MPR Broadcasting Under Realistic Physical Layer Assumptions

    Institute of Scientific and Technical Information of China (English)

    Francois Ingelrest; David Simplot-Ryl

    2008-01-01

    It is now commonly accepted that the unit disk graph used to model the physical layer in wireless networks does not reflect real radio transmissions, and that a more realistic model should be considered for experimental simulations. Previous work on realistic scenarios has focused on unicast; however, broadcast requirements are fundamentally different and cannot be derived from the unicast case. Therefore, broadcast protocols must be adapted in order to remain efficient under realistic assumptions. In this paper, we study the well-known multipoint relay broadcast protocol (MPR), in which each node has to choose a set of 1-hop neighbors to act as relays in order to cover the whole 2-hop neighborhood. We give experimental results showing that the original strategy used to select these multipoint relays does not suit a realistic model. On the basis of these results, we propose new selection strategies based solely on link quality. One of the key aspects of our solutions is that our strategies do not require any additional hardware and may be implemented at the application layer, which is particularly relevant in the context of ad hoc and sensor networks where energy savings are mandatory. We finally provide new experimental results that demonstrate the superiority of our strategies under realistic physical assumptions.
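
    A simplified sketch of link-quality-aware greedy MPR selection (the topology, probabilities and coverage target below are invented; the paper's actual strategies differ in detail):

    ```python
    # Candidate relays are weighted by link quality (delivery probability)
    # instead of plain 2-hop coverage counts.
    one_hop = {"A": 0.9, "B": 0.6, "C": 0.8}      # P(source -> neighbour)
    two_hop = {"A": {"X": 0.9, "Y": 0.4},         # P(neighbour -> 2-hop node)
               "B": {"Y": 0.9, "Z": 0.9},
               "C": {"Z": 0.5, "X": 0.3}}
    targets = {"X", "Y", "Z"}

    def gain(n, reach):
        # Expected increase in delivery probability if n is added as a relay.
        return sum(max(0.0, one_hop[n] * q - reach[t])
                   for t, q in two_hop[n].items())

    relays, reach = [], {t: 0.0 for t in targets}
    candidates = set(one_hop)
    while candidates and min(reach.values()) < 0.8:   # 0.8 = coverage target
        best = max(candidates, key=lambda n: gain(n, reach))
        candidates.remove(best)
        relays.append(best)
        for t, q in two_hop[best].items():
            reach[t] = max(reach[t], one_hop[best] * q)

    print("selected MPRs:", relays)
    print("per-target delivery:", {t: round(p, 2) for t, p in reach.items()})
    ```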

  6. Questioning the foundations of physics which of our fundamental assumptions are wrong?

    CERN Document Server

    Foster, Brendan; Merali, Zeeya

    2015-01-01

    The essays in this book look at ways in which the foundations of physics might need to be changed in order to make progress towards a unified theory. They are based on the prize-winning essays submitted to the FQXi essay competition “Which of Our Basic Physical Assumptions Are Wrong?”, which drew over 270 entries. As Nobel Laureate physicist Philip W. Anderson realized, the key to understanding nature’s reality is not anything “magical”, but the right attitude, “the focus on asking the right questions, the willingness to try (and to discard) unconventional answers, the sensitive ear for phoniness, self-deception, bombast, and conventional but unproven assumptions.” The authors of the eighteen prize-winning essays have, where necessary, adapted their essays for the present volume so as to (a) incorporate the community feedback generated in the online discussion of the essays, (b) add new material that has come to light since their completion and (c) to ensure accessibility to a broad audience of re...

  7. The ozone depletion potentials of halocarbons: their dependence on calculation assumptions

    Science.gov (United States)

    Karol, Igor L.; Kiselev, Andrey A.

    1994-01-01

    The concept of Ozone Depletion Potential (ODP) is widely used in evaluating the effects on ozone of numerous halocarbons and their replacements, but the methods, assumptions and conditions used in ODP calculations have not been analyzed adequately. In this paper a model study of the effects on ozone of instantaneous releases of various amounts of CH3CCl3 and of CHF2Cl (HCFC-22), under several compositions of the background atmosphere, is presented, aimed at understanding the connections of ODP values with the assumptions used in their calculations. To facilitate the ODP computation in numerous versions over the long time periods after release, these rather short-lived gases and a one-dimensional radiative photochemical model of the global annually averaged atmospheric layer up to 50 km height are used. Varying the released gas global mass from 1 Mt to 1 Gt increases the ODP value, which stabilizes close to the upper bound of this range in the contemporary atmosphere. The same variations are analyzed for conditions of the CFC-free atmosphere of the 1960s and for the anthropogenically loaded atmosphere of the 21st century according to the known IPCC 'business as usual' scenario. Recommendations for proper ways of calculating ODPs are proposed for practically important cases.

  8. Testing assumptions of the enemy release hypothesis: generalist versus specialist enemies of the grass Brachypodium sylvaticum.

    Science.gov (United States)

    Halbritter, Aud H; Carroll, George C; Güsewell, Sabine; Roy, Bitty A

    2012-01-01

    The enemy release hypothesis (ERH) suggests greater success of species in an invaded range due to release from natural enemies. The ERH assumes there will be more specialist enemies in the native range and that generalists will have an equal effect in both ranges. We tested these assumptions with the grass Brachypodium sylvaticum in the native range (Switzerland) and invaded range (Oregon, USA). We assessed all the kinds of damage present (caused by fungi, insects, mollusks and deer) on both leaves and seeds at 10 sites in each range and correlated damage with host fitness. Only two of the 20 fungi found on leaves were specialist pathogens, and these were more frequent in the native range. Conversely there was more insect herbivory on leaves in the invaded range. All fungi and insects found on seeds were generalists. More species of fungi were found on seeds in the native range, and a higher proportion of them were pathogenic than in the invaded range. There were more kinds of enemies in the native range, where the plants had lower fitness, in accordance with the ERH. However, contrary to assumptions of the ERH, generalists appear to be equally or more important than specialists in reducing host fitness.

  9. How do our prior assumptions about basal drag affect ice sheet forecasts?

    Science.gov (United States)

    Arthern, Robert

    2015-04-01

    Forecasts of changes in the large ice sheets of Greenland and Antarctica often begin with an inversion to select initial values for state variables and parameters in the model, such as basal drag and ice viscosity. These inversions can be ill-posed in the sense that many different choices for the parameter values can match the observational data equally well. To recover a mathematically well-posed problem, assumptions must be made that restrict the possible values of the parameters, either by regularisation or by explicit definition of Bayesian priors. Common assumptions are that parameters vary smoothly in space or lie close to some preferred initial guess, but for glaciological inversions it is often unclear how smoothly the parameters should vary, or how reliable the initial guess should be considered. This is especially true of inversions for the basal drag coefficient that can vary enormously from place to place on length scales set by subglacial hydrology, which is itself extremely poorly constrained by direct observations. Here we use a combination of forward modelling, inversion and a theoretical analysis based on transformation group priors to investigate different ways of introducing prior information about parameters, and to consider the consequences for ice sheet forecasts.

  10. Bias in regression coefficient estimates when assumptions for handling missing data are violated: a simulation study

    Directory of Open Access Journals (Sweden)

    Sander MJ van Kuijk

    2016-03-01

    Full Text Available Background: The purpose of this simulation study is to assess the performance of multiple imputation compared to complete case analysis when assumptions of missing data mechanisms are violated. Methods: The authors performed a stochastic simulation study to assess the performance of Complete Case (CC) analysis and Multiple Imputation (MI) under different missing data mechanisms (missing completely at random (MCAR), missing at random (MAR), and missing not at random (MNAR)). The study focused on the point estimation of regression coefficients and standard errors. Results: When data were MAR conditional on Y, CC analysis resulted in biased regression coefficients; they were all underestimated in our scenarios. In these scenarios, analysis after MI gave correct estimates. Yet, in case of MNAR, MI yielded biased regression coefficients, while CC analysis performed well. Conclusion: The authors demonstrated that MI was only superior to CC analysis in case of MCAR or MAR. In some scenarios CC may be superior to MI. Often it is not feasible to identify the reason why data in a given dataset are missing. Therefore, emphasis should be put on reporting the extent of missing values, the method used to address them, and the assumptions that were made about the mechanism that caused missing data.
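
    The MAR-conditional-on-Y result is easy to reproduce in a toy simulation (our own, much simpler than the paper's; the multiple imputation step itself is omitted): deleting X with a probability that depends on Y biases the complete-case slope downward.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 200_000
    x = rng.normal(size=n)
    y = 1.0 + 0.5 * x + rng.normal(scale=1.0, size=n)   # true slope = 0.5

    # Delete x with probability increasing in y (MAR given Y).
    p_miss = 1 / (1 + np.exp(-y))                        # logistic in y
    observed = rng.random(n) > p_miss

    def ols_slope(x, y):
        X = np.column_stack([np.ones_like(x), x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta[1]

    print("full data slope    :", round(ols_slope(x, y), 3))
    print("complete-case slope:", round(ols_slope(x[observed], y[observed]), 3))
    # MI with Y in the imputation model would remove this bias; under MNAR
    # the comparison can reverse, as the paper reports.
    ```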

  11. Moral dilemmas in professions of public trust and the assumptions of ethics of social consequences

    Directory of Open Access Journals (Sweden)

    Dubiel-Zielińska Paulina

    2016-06-01

    Full Text Available The aim of the article is to show the possibility of applying assumptions from ethics of social consequences when making decisions about actions, as well as in situations of moral dilemmas, by persons performing occupations of public trust on a daily basis. The reasoning in the article is analytical and synthetic. The article begins with an explanation of the basic concepts of "profession" and "profession of public trust" and a demonstration of the difference between these terms. This is followed by a general description of professions of public trust. The area and definition of moral dilemmas is emphasized, and representatives of the professions belonging to them are listed. After a brief characterization of the axiological foundations and the main assumptions of ethics of social consequences, actions according to Vasil Gluchman and Włodzimierz Galewicz are discussed, and actions in line with ethics of social consequences are transferred to the practical domain. The article points out that actions in professional life are obligatory, impermissible, permissible, supererogatory or unmarked in the moral dimension. In the final part of the article an afterthought is included on how to solve moral dilemmas from the position of a representative of a profession of public trust. The article concludes with a summary containing the conclusions that stem from ethics of social consequences for professions of public trust, followed by short examples.

  12. Assessing moderated mediation in linear models requires fewer confounding assumptions than assessing mediation.

    Science.gov (United States)

    Loeys, Tom; Talloen, Wouter; Goubert, Liesbet; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2016-11-01

    It is well known from the mediation analysis literature that the identification of direct and indirect effects relies on strong assumptions of no unmeasured confounding. Even in randomized studies the mediator may still be correlated with unobserved prognostic variables that affect the outcome, in which case the mediator's role in the causal process may not be inferred without bias. In the behavioural and social science literature very little attention has been given so far to the causal assumptions required for moderated mediation analysis. In this paper we focus on the index for moderated mediation, which measures by how much the mediated effect is larger or smaller for varying levels of the moderator. We show that in linear models this index can be estimated without bias in the presence of unmeasured common causes of the moderator, mediator and outcome under certain conditions. Importantly, one can thus use the test for moderated mediation to support evidence for mediation under less stringent confounding conditions. We illustrate our findings with data from a randomized experiment assessing the impact of being primed with social deception upon observer responses to others' pain, and from an observational study of individuals who ended a romantic relationship assessing the effect of attachment anxiety during the relationship on mental distress 2 years after the break-up.
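
    In linear models the index of moderated mediation is the product of the X×W coefficient in the mediator model and the M coefficient in the outcome model; here is a toy sketch with invented effect sizes (not the studies' data):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 100_000
    x = rng.binomial(1, 0.5, n).astype(float)    # randomized treatment
    w = rng.normal(size=n)                        # moderator
    m = 0.4 * x + 0.2 * w + 0.3 * x * w + rng.normal(size=n)   # mediator
    y = 0.5 * m + 0.1 * x + rng.normal(size=n)                 # outcome

    def ols(design, response):
        beta, *_ = np.linalg.lstsq(design, response, rcond=None)
        return beta

    ones = np.ones(n)
    a = ols(np.column_stack([ones, x, w, x * w]), m)   # mediator model
    b = ols(np.column_stack([ones, m, x]), y)          # outcome model

    # How much the indirect effect changes per unit of the moderator W.
    index = a[3] * b[1]
    print(f"index of moderated mediation = {index:.3f} (true 0.3 * 0.5 = 0.15)")
    ```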

  13. Temporal Distinctiveness in Task Switching: Assessing the Mixture-Distribution Assumption

    Directory of Open Access Journals (Sweden)

    James A Grange

    2016-02-01

    Full Text Available In task switching, increasing the response-cue interval (RCI) has been shown to reduce the switch cost. This has been attributed to a time-based decay process influencing the activation of memory representations of tasks (task-sets). Recently, an alternative account based on interference rather than decay has been successfully applied to these data (Horoufchin et al., 2011). In this account, variation of the RCI is thought to influence the temporal distinctiveness (TD) of episodic traces in memory, thus affecting their retrieval probability. This can affect performance because retrieval probability influences response time: if retrieval succeeds, responding is fast due to positive priming; if retrieval fails, responding is slow, due to having to perform the task via a slow algorithmic process. This account, and a recent formal model (Grange & Cross, 2015), makes the strong prediction that all RTs are a mixture of two processes: a fast process when retrieval succeeds, and a slow process when retrieval fails. The present paper assesses the evidence for this mixture-distribution assumption in TD data. In a first section, statistical evidence for mixture distributions is found using the fixed-point property test. In a second section, a mathematical process model with mixture distributions at its core is fitted to the response time distribution data. Both approaches provide good evidence in support of the mixture-distribution assumption, and thus support temporal distinctiveness accounts of the data.
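    The fixed-point property invoked here can be illustrated in a few lines: if every condition's RT distribution is a binary mixture of the same two components with different weights, the estimated densities of all conditions cross at (approximately) one common point. The component distributions and mixture weights below are hypothetical:

```python
# Sketch of the fixed-point property of binary mixtures (assumed parameters).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
fast = lambda n: rng.normal(0.45, 0.05, n)        # retrieval succeeds (s)
slow = lambda n: rng.normal(0.80, 0.10, n)        # retrieval fails (s)

def sample(p, n=20000):
    """RTs for a condition with retrieval probability p."""
    k = rng.uniform(size=n) < p
    return np.where(k, fast(n), slow(n))

# Grid restricted to the region between the component modes, to avoid
# spurious near-zero density differences in the far tails.
grid = np.linspace(0.35, 0.95, 400)
dens = [gaussian_kde(sample(p))(grid) for p in (0.3, 0.5, 0.7)]

cross_12 = grid[np.argmin(np.abs(dens[0] - dens[1]))]
cross_13 = grid[np.argmin(np.abs(dens[0] - dens[2]))]
print(f"density crossings: {cross_12:.3f} s and {cross_13:.3f} s (should agree)")
```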

  14. Bothe's 1925 heuristic assumption in the dawn of quantum field theory

    Science.gov (United States)

    Fick, D.

    2013-01-01

    In an unpublished manuscript filed at the Archive of the Max-Planck Society in Berlin, Walther Bothe (1891-1957) put the spontaneous and induced transitions of light quanta on an equal footing with one heuristic assumption, probably as early as 1925. In modern terms, he assumed that the probability for the creation of a light quantum in a phase-space cell already containing s light quanta is proportional to s + 1 and not, as assumed at the time, proportional to s, i.e. proportional to the fraction of the total radiation density belonging to s light quanta. For Bothe, the added +1 somehow replaced the spontaneous decay and allowed him to treat empty phase-space cells in a black body as thermodynamically consistent. We describe in some detail Bothe's route to this heuristic trick. Finally, we discuss why both Bose's and Bothe's heuristic assumptions lead to an identical distribution law for light quanta in a black body and thus to Planck's law and Einstein's fluctuation formula.
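    The content of the s + 1 rule can be verified with a short detailed-balance computation: balancing creation (proportional to s + 1) against absorption (proportional to s) with a Boltzmann factor gives a geometric occupation distribution whose mean is the Bose-Einstein (Planck) result. The energy value below is an arbitrary assumption:

```python
# Detailed-balance check of Bothe's s -> s + 1 rule (assumed photon energy).
import numpy as np

beta_eps = 0.7                      # photon energy in units of kT (assumed)
x = np.exp(-beta_eps)               # P(s+1)/P(s) from detailed balance

# Geometric stationary distribution over occupation numbers s
s = np.arange(200)
P = (1 - x) * x**s
mean_occupation = np.sum(s * P)

print(f"mean from the chain : {mean_occupation:.4f}")
print(f"Bose-Einstein value : {1 / (np.exp(beta_eps) - 1):.4f}")
```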

  15. Assumptions about footprint layer heights influence the quantification of emission sources: a case study for Cyprus

    Science.gov (United States)

    Hüser, Imke; Harder, Hartwig; Heil, Angelika; Kaiser, Johannes W.

    2017-09-01

    Lagrangian particle dispersion models (LPDMs) in backward mode are widely used to quantify the impact of transboundary pollution on downwind sites. Most LPDM applications count particles with a technique that introduces a so-called footprint layer (FL) with constant height, in which passing air tracer particles are assumed to be affected by surface emissions. The mixing layer dynamics are represented by the underlying meteorological model. This particle counting technique implicitly assumes that the atmosphere is well mixed within the FL. We have performed backward trajectory simulations with the FLEXPART model starting at Cyprus to calculate the sensitivity to emissions of upwind pollution sources. The emission sensitivity is used to quantify source contributions at the receptor and to support the interpretation of ground measurements carried out during the CYPHEX campaign in July 2014. Here we analyse the effects of different constant and dynamic FL height assumptions. The results show that calculations with FL heights of 100 and 300 m yield similar but still discernible results. Comparison of calculations with the FL height constant at 300 m and dynamically following the planetary boundary layer (PBL) height exhibits systematic differences, with daytime and night-time sensitivity differences compensating for each other. The differences at daytime, when a well-mixed PBL can be assumed, indicate that residual inaccuracies in the representation of the mixing layer dynamics in the trajectories may introduce errors in the impact assessment on downwind sites. Emissions from vegetation fires are mixed up by pyrogenic convection which is not represented in FLEXPART. Neglecting this convection may lead to severe over- or underestimations of the downwind smoke concentrations. Introducing an extreme fire source from a different year in our study period and using fire-observation-based plume heights as reference, we find an overestimation of more than 60 % by the constant FL height

  16. Tale of Two Courthouses: A Critique of the Underlying Assumptions in Chronic Disease Self-Management for Aboriginal People

    Directory of Open Access Journals (Sweden)

    Isabelle Ellis

    2009-12-01

    Full Text Available This article reviews the assumptions that underpin the commonly implemented Chronic Disease Self-Management models: namely, that there is a clear set of instructions for patients to comply with, that all health care providers agree with; and that the health care provider and the patient agree with the chronic disease self-management plan that was developed as part of a consultation. These assumptions are evaluated for their validity in the remote health care context, particularly for Aboriginal people. The assumptions were found to lack validity in this context; therefore an alternative model to enhance chronic disease care is proposed.

  17. Optical tests of Bell's inequalities not resting upon the absurd fair sampling assumption

    CERN Document Server

    Santos, E

    2004-01-01

    A simple local hidden-variables model is exhibited which reproduces the results of all performed tests of Bell's inequalities involving optical photon pairs. For the old atomic-cascade experiments, like Aspect's, the model agrees with quantum mechanics even for ideal set-ups. For more recent experiments, using parametric down-converted photons, the agreement occurs only for actual experiments, involving low efficiency detectors. Arguments are given against the fair sampling assumption, currently combined with the results of the experiments in order to claim a contradiction with local realism. New tests are proposed which are able to discriminate between quantum mechanics and a restricted, but appealing, family of local hidden-variables models. Such tests require detectors with efficiencies just above 20%.

  18. HARDINESS, WORLD ASSUMPTIONS, MOTIVATION OF ATHLETES OF CONTACT AND NOT CONTACT KINDS OF SPORT

    Directory of Open Access Journals (Sweden)

    Elena Vladimirovna Molchanova

    2017-04-01

    Full Text Available An investigation of the personal psychological characteristics of athletes in a contact sport (freestyle wrestling) and a non-contact sport (archery) was carried out. Pronounced differences in hardiness, world assumptions, and motives for doing sport were found. In particular, archery athletes showed higher hardiness values and viewed the world more positively than the wrestlers, while scoring lower on motives for doing sport such as "success for life quality and skills" and "physical perfection". Better coping under permanently stressful conditions is therefore predicted for athletes in non-contact sports. The obtained results are of practical importance for the counseling work of sport psychologists and, moreover, could serve as a basis for training programs and stress-coping programs.

  19. Ecological risk of anthropogenic pollutants to reptiles: Evaluating assumptions of sensitivity and exposure.

    Science.gov (United States)

    Weir, Scott M; Suski, Jamie G; Salice, Christopher J

    2010-12-01

    A large data gap for reptile ecotoxicology still persists; therefore, ecological risk assessments of reptiles usually incorporate the use of surrogate species. This necessitates that (1) the surrogate is at least as sensitive as the target taxon and/or (2) exposures to the surrogate are greater than those of the target taxon. We evaluated these assumptions for the use of birds as surrogates for reptiles. Based on a survey of the literature, birds were more sensitive than reptiles in less than one-quarter of the chemicals investigated. Dietary and dermal exposure modeling indicated that exposure to reptiles was relatively high, particularly when the dermal route was considered. We conclude that caution is warranted in the use of avian receptors as surrogates for reptiles in ecological risk assessment, and we emphasize the need to better understand the magnitude and mechanism of contaminant exposure in reptiles to improve exposure and risk estimation.

  20. A critical test of the assumption that men prefer conformist women and women prefer nonconformist men.

    Science.gov (United States)

    Hornsey, Matthew J; Wellauer, Richard; McIntyre, Jason C; Barlow, Fiona Kate

    2015-06-01

    Five studies tested the common assumption that women prefer nonconformist men as romantic partners, whereas men prefer conformist women. Studies 1 and 2 showed that both men and women preferred nonconformist romantic partners, but women overestimated the extent to which men prefer conformist partners. In Study 3, participants ostensibly in a small-group interaction showed preferences for nonconformist opposite-sex targets, a pattern that was particularly evident when men evaluated women. Dating success was greater the more nonconformist the sample was (Study 4), and perceptions of nonconformity in an ex-partner were associated with greater love and attraction toward that partner (Study 5). On the minority of occasions in which effects were moderated by gender, it was in the reverse direction to the traditional wisdom: Conformity was more associated with dating success among men. The studies contradict the notion that men disproportionately prefer conformist women. © 2015 by the Society for Personality and Social Psychology, Inc.

  1. The "invention" of lesbian acts in Iran: interpretative moves, hidden assumptions, and emerging categories of sexuality.

    Science.gov (United States)

    Bucar, Elizabeth M; Shirazi, Faegheh

    2012-01-01

    This article describes and explains the current official status of lesbianism in Iran. Our central question is why the installation of an Islamic government in Iran resulted in extreme regulations of sexuality. The authors argue that rather than a clear adoption of "Islamic teaching on lesbianism," the current regime of sexuality was "invented" through a series of interpretative moves, adoption of hidden assumptions, and creation of sexual categories. This article is organized into two sections. The first sets the scene of official sexuality in Iran through a summary of (1) the sections of the Iranian Penal code dealing with same-sex acts and (2) government support for sexual reassignment surgeries. The second section traces the "invention" of a dominant post-revolutionary Iranian view of Islam and sexuality through identifying a number of specific interpretive moves this view builds on.

  2. Impact of velocity distribution assumption on simplified laser speckle imaging equation

    Science.gov (United States)

    Ramirez-San-Juan, Julio C; Ramos-Garcia, Ruben; Guizar-Iturbide, Ileana; Martinez-Niconoff, Gabriel; Choi, Bernard

    2012-01-01

    Since blood flow is tightly coupled to the health status of biological tissue, several instruments have been developed to monitor blood flow and perfusion dynamics. One such instrument is laser speckle imaging. The goal of this study was to evaluate the use of two velocity distribution assumptions (Lorentzian- and Gaussian-based) to calculate speckle flow index (SFI) values. When the normalized autocorrelation functions for the Lorentzian and Gaussian velocity distributions satisfy the same definition of correlation time, the two assumptions predict the same velocity range at low speckle contrast (0 < C < 0.6) but different flow velocity ranges at high contrast. Our derived equations form the basis for simplified calculations of SFI values. PMID:18542407
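    The comparison can be reproduced numerically from the standard relation C^2 = (2/T) ∫_0^T (1 - t/T) |g1(t)|^2 dt, using the field correlation form implied by each velocity distribution. The exposure time and correlation times below are assumed values:

```python
# Speckle contrast for Lorentzian vs. Gaussian velocity assumptions (sketch).
import numpy as np

T = 5e-3                                    # exposure time in seconds (assumed)

def contrast(g1_sq, tau_c, n=4000):
    """Numerically integrate C^2 = (2/T) * int_0^T (1 - t/T) |g1|^2 dt."""
    t = np.linspace(0, T, n)
    integrand = (1 - t / T) * g1_sq(t, tau_c)
    return np.sqrt(2 / T * np.trapz(integrand, t))

lorentz = lambda t, tc: np.exp(-2 * t / tc)         # |g1|^2, Lorentzian
gauss = lambda t, tc: np.exp(-2 * (t / tc) ** 2)    # |g1|^2, Gaussian

for tau_c in (1e-4, 2e-3):                  # short vs. long correlation time
    print(f"tau_c = {tau_c:.0e} s: "
          f"C_Lorentzian = {contrast(lorentz, tau_c):.3f}, "
          f"C_Gaussian = {contrast(gauss, tau_c):.3f}")
```

    Inverting each curve then shows what flow (1/tau_c) each assumption infers from the same measured contrast value.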

  3. Modeling assumptions influence on stress and strain state in 450 t cranes hoisting winch construction

    Directory of Open Access Journals (Sweden)

    Damian GĄSKA

    2011-01-01

    Full Text Available This work investigates the FEM simulation of the stress and strain state of a selected trolley's load-carrying structure with 450 tonnes hoisting capacity [1]. Computational loads were adopted as in standard PN-EN 13001-2. The model of the trolley was built from several parts cooperating with each other (in contact). The influence of model assumptions (simplifications) in selected construction nodes on the value of maximum stress and strain and its area of occurrence was analyzed. The aim of this study was to determine whether simplifications that reduce the time required to prepare the model and perform calculations (e.g., a rigid connection instead of contact) substantially change the characteristics of the model.

  4. Meso-scale modeling: beyond local equilibrium assumption for multiphase flow

    CERN Document Server

    Wang, Wei

    2015-01-01

    This is a summary of the article with the same title, accepted for publication in Advances in Chemical Engineering, 47: 193-277 (2015). Gas-solid fluidization is a typical nonlinear nonequilibrium system with multiscale structure. In particular, the mesoscale structure in terms of bubbles or clusters, which can be characterized by nonequilibrium features such as bimodal velocity distributions, energy non-equipartition, and correlated density fluctuations, is the critical factor. The traditional two-fluid model (TFM) and the relevant closures depend on local equilibrium and homogeneous distribution assumptions, and fail to predict the dynamic, nonequilibrium phenomena in circulating fluidized beds even with fine-grid resolution. In contrast, mesoscale modeling, as exemplified by the energy-minimization multiscale (EMMS) model, is consistent with the nonequilibrium features of multiphase flows. Thus, the structure-dependent multi-fluid model conservation equations with the EMMS-based mesoscale modeling greatly i...

  5. Special relativity as the limit of an Aristotelian universal friction theory under Reye's assumption

    CERN Document Server

    Minguzzi, E

    2014-01-01

    This work explores a classical mechanical theory under two further assumptions: (a) there is a universal dry friction force (Aristotelian mechanics), and (b) the variation of the mass of a body due to wear is proportional to the work done by the friction force on the body (Reye's hypothesis). It is shown that mass depends on velocity as in Special Relativity, and that the velocity is constant for a particular characteristic value. In the limit of vanishing friction the theory satisfies a relativity principle, as bodies do not decelerate and, therefore, the absolute frame becomes unobservable. However, the limit theory is not Newtonian mechanics, with its Galilei group symmetry, but rather Special Relativity. This result suggests regarding Special Relativity as the limit of a theory presenting universal friction and exchange of mass-energy with a reservoir (vacuum). Thus, quite surprisingly, Special Relativity follows from the absolute space (ether) concept and could have been discovered following studies of Ar...

  6. Evaluation of Horizontal Electric Field Under Different Lightning Current Models by Perfect Ground Assumption

    Institute of Scientific and Technical Information of China (English)

    LIANG Jianfeng; LI Yanming

    2012-01-01

    Lightning electromagnetics can affect the reliability of power or communication systems. Therefore, evaluation of the electromagnetic fields generated by a lightning return stroke is indispensable. Arnold Sommerfeld proposed a model to calculate the electromagnetic field, but it involves the time-consuming Sommerfeld integral. A perfectly conducting ground assumption, however, allows fast calculation. This paper therefore reviews the perfect-ground equations for evaluating lightning electromagnetic fields, presents three engineering lightning return stroke models, and calculates the horizontal electric field produced by each of the three models. According to the results, the amplitude of the lightning return stroke has a strong impact on horizontal electric fields, and the steepness of the lightning return stroke also influences them. Moreover, the perfect-ground method is faster than the Sommerfeld integral method.

  7. Premiums for Long-Term Care Insurance Packages: Sensitivity with Respect to Biometric Assumptions

    Directory of Open Access Journals (Sweden)

    Ermanno Pitacco

    2016-02-01

    Full Text Available Long-term care insurance (LTCI covers are rather recent products, in the framework of health insurance. It follows that specific biometric data are scanty; pricing and reserving problems then arise because of difficulties in the choice of appropriate technical bases. Different benefit structures imply different sensitivity degrees with respect to changes in biometric assumptions. Hence, an accurate sensitivity analysis can help in designing LTCI products and, in particular, in comparing stand-alone products to combined products, i.e., packages including LTCI benefits and other lifetime-related benefits. Numerical examples show, in particular, that the stand-alone cover is much riskier than all of the LTCI combined products that we have considered. As a consequence, the LTCI stand-alone cover is a highly “absorbing” product as regards capital requirements for solvency purposes.

  8. Probabilistic tsunami hazard assessment for the Makran region with focus on maximum magnitude assumption

    Science.gov (United States)

    Hoechner, Andreas; Babeyko, Andrey Y.; Zamora, Natalia

    2016-06-01

    Despite having been rather seismically quiescent for the last decades, the Makran subduction zone is capable of hosting destructive earthquakes and tsunami. In particular, the well-known thrust event in 1945 (Balochistan earthquake) led to about 4000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Furthermore, some recent publications discuss rare but significantly larger events at the Makran subduction zone as possible scenarios. We analyze the instrumental and historical seismicity at the subduction plate interface and generate various synthetic earthquake catalogs spanning 300 000 years with varying magnitude-frequency relations. For every event in the catalogs we compute estimated tsunami heights and present the resulting tsunami hazard along the coasts of Pakistan, Iran and Oman in the form of probabilistic tsunami hazard curves. We show how the hazard results depend on variation of the Gutenberg-Richter parameters and especially maximum magnitude assumption.
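    A minimal sketch of the catalog-generation step (all parameter values are illustrative assumptions, not those of the study): magnitudes are drawn from a doubly truncated Gutenberg-Richter law by inverse-CDF sampling, and varying the assumed maximum magnitude changes the rate of the largest, tsunamigenic events:

```python
# Synthetic Gutenberg-Richter catalogs under different Mmax (assumed values).
import numpy as np

rng = np.random.default_rng(42)
b, m_min = 1.0, 6.0                  # b-value and catalog cut-off (assumed)
years, rate = 300_000, 0.05          # span and annual rate of M >= 6 (assumed)

def sample_magnitudes(m_max, n):
    # Inverse CDF of the doubly truncated Gutenberg-Richter distribution
    u = rng.uniform(size=n)
    ratio = 1 - 10 ** (-b * (m_max - m_min))
    return m_min - np.log10(1 - u * ratio) / b

n_events = rng.poisson(rate * years)
for m_max in (8.1, 8.65, 9.2):
    mags = sample_magnitudes(m_max, n_events)
    n_great = np.sum(mags >= 8.0)
    print(f"Mmax = {m_max}: {n_great} events with M >= 8.0 "
          f"(return period ~{years / max(n_great, 1):.0f} yr)")
```

    Feeding each synthetic event through a tsunami-height model and counting exceedances per coastal site would then yield the probabilistic hazard curves described in the abstract.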

  9. Recursive Subspace Identification of AUV Dynamic Model under General Noise Assumption

    Directory of Open Access Journals (Sweden)

    Zheping Yan

    2014-01-01

    Full Text Available A recursive subspace identification algorithm for autonomous underwater vehicles (AUVs) is proposed in this paper. Owing to its advantages in handling nonlinearities and couplings, the AUV model investigated here is, for the first time, constructed as a Hammerstein model with nonlinear feedback in the linear part. To better take environmental and sensor noise into consideration, the identification problem is treated as an errors-in-variables (EIV) one, which means that the identification procedure operates under a general noise assumption. In order to make the algorithm recursive, a propagator method (PM) based subspace approach is extended into the EIV framework to form the recursive identification method called the PM-EIV algorithm. With several identification experiments carried out on an AUV simulation platform, the proposed algorithm demonstrates its effectiveness and feasibility.

  10. Sensitivity to imputation models and assumptions in receiver operating characteristic analysis with incomplete data.

    Science.gov (United States)

    Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M

    Modern statistical methods using incomplete data have been increasingly applied in a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used in evaluating diagnostic tests or biomarkers in medical research, has also become increasingly popular in both its development and application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or of the assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. In particular, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Our results show that, depending on the coherency of the imputation model with the underlying data generation mechanism, MI generally leads to well-calibrated inferences under ignorable missingness mechanisms.

  11. Continuous-discrete model of parasite-host system dynamics: Trigger regime at simplest assumptions

    Directory of Open Access Journals (Sweden)

    L. V. Nedorezov

    2014-09-01

    Full Text Available In this paper a continuous-discrete model of parasite-host system dynamics is analyzed. Within the framework of the model it is assumed that the appearance of individuals of new generations of both populations occurs at fixed time moments tk = hk, t0 = 0, k = 1, 2, ..., h = const > 0; this means that several processes are compressed together: the production of eggs by hosts, the attack of eggs by parasites (with the respective transformation of host eggs into parasite eggs), the stay of hosts and parasites in the egg phase, and the appearance of new individuals. It is also assumed that the death process of individuals is continuous in nature, while both populations develop independently between the fixed time moments. Dynamic regimes of the model are analyzed. In particular, it is shown that, under the simplest assumptions about the birth process in the host population and the number of attacked hosts, a regime with two non-trivial stable attractors in the phase space of the system can be realized.
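    A sketch of such a continuous-discrete scheme is below. All functional forms and parameter values are hypothetical stand-ins (a Ricker-type birth function and a Nicholson-Bailey escape term); whether two distinct non-trivial attractors appear depends on those choices, as the abstract indicates for its own assumptions:

```python
# Continuous decay between pulses, discrete reproduction/attack at t_k = h*k.
import numpy as np

m, n = 0.3, 0.5          # continuous death rates of hosts, parasites (assumed)
h = 1.0                  # interval between generations (assumed)
b, a = 20.0, 1.0         # host fecundity, parasite search efficiency (assumed)

def step(x, y):
    x, y = x * np.exp(-m * h), y * np.exp(-n * h)   # continuous mortality
    esc = np.exp(-a * y)                            # fraction escaping attack
    births = b * x * np.exp(-x)                     # Ricker-type reproduction
    return births * esc, births * (1 - esc)         # new hosts, new parasites

for x0, y0 in [(0.1, 0.1), (3.0, 0.5)]:             # two initial conditions
    x, y = x0, y0
    for _ in range(2000):                           # let transients die out
        x, y = step(x, y)
    print(f"start ({x0}, {y0}) -> long-run state near x = {x:.3f}, y = {y:.3f}")
```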

  12. Washington International Renewable Energy Conference 2008 Pledges: Methodology and Assumptions Summary

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, B.; Bilello, D. E.; Cowlin, S. C.; Mann, M.; Wise, A.

    2008-08-01

    The 2008 Washington International Renewable Energy Conference (WIREC) was held in Washington, D.C., from March 4-6, 2008, and involved nearly 9,000 people from 125 countries. The event brought together worldwide leaders in renewable energy (RE) from governments, international organizations, nongovernmental organizations, and the private sector to discuss the role that renewables can play in alleviating poverty, growing economies, and passing on a healthy planet to future generations. The conference concluded with more than 140 governments, international organizations, and private-sector representatives pledging to advance the uptake of renewable energy. The U.S. government authorized the National Renewable Energy Laboratory (NREL) to estimate the carbon dioxide (CO2) savings that would result from the pledges made at the 2008 conference. This report describes the methodology and assumptions used by NREL in quantifying the potential CO2 reductions derived from those pledges.

  13. To describe or prescribe: assumptions underlying a prescriptive nursing process approach to spiritual care.

    Science.gov (United States)

    Pesut, Barbara; Sawatzky, Rick

    2006-06-01

    Increasing attention is being paid to spirituality in nursing practice. Much of the literature on spiritual care uses the nursing process to describe this aspect of care. However, the use of the nursing process in the area of spirituality may be problematic, depending upon the understandings of the nature and intent of this process. Is it primarily a descriptive process meant to make visible the nursing actions to provide spiritual support, or is it a prescriptive process meant to guide nursing actions for intervening in the spirituality of patients? A prescriptive nursing process approach implies influencing, and in some cases reframing, the spirituality of patients and thereby extends beyond general notions of spiritual support. In this paper we discuss four problematic assumptions that form the basis for a prescriptive approach to spiritual care. We conclude that this approach extends the nursing role beyond appropriate professional boundaries, making it ethically problematic.

  14. The effect of physical assumptions on the calculation of microwave background anisotropies

    CERN Document Server

    Hu, W; Sugiyama, N; White, M; Hu, Wayne; Scott, Douglas; Sugiyama, Naoshi; White, Martin

    1995-01-01

    As the data on CMB anisotropies improve and potential cosmological applications are realized, it will be increasingly important for theoretical calculations to be as accurate as possible. All modern calculations for inflationary-inspired fluctuations involve the numerical solution of coupled Boltzmann equations. There are many assumptions and choices to be made when carrying out such calculations. We go through each in turn, pointing out the best selections to make, and the level of inaccuracy expected through incorrect choice: (1) neglecting the effects of neutrinos or polarization has a 10% effect; (2) varying radiation temperature and He fraction can have smaller, but noticeable effects; (3) numerical issues, such as k-range and smoothing are discussed; (4) short-cut methods, e.g. free-streaming and tilt approximations, are generally inadequate at the few % level; (5) at the 1% level somewhat baroque effects are important, such as He recombination and even minimal reionization; (6) at smaller angular scale...

  15. Changing assumption for the design process – New roles of the active end user

    Directory of Open Access Journals (Sweden)

    Monika Hestad

    2009-12-01

    Full Text Available The aim of this article is to discuss how end user involvement in all stages of a product life cycle changes the assumptions of the design process. The article is based on a literature review and three case studies – Imsdal (Ringnes/Carlsberg), Jordan and Stokke. Several examples of how consumers or users are involved in various stages of the product life cycle are presented. Product development is affected both by end users' activity and by their previous knowledge of the product. Use of the product changes its meaning, and even the disposal of the product affects how it is perceived. The product becomes part of a cultural and historical context that the end user actively shapes.

  16. A rigid thorax assumption affects model loading predictions at the upper but not lower lumbar levels.

    Science.gov (United States)

    Ignasiak, Dominika; Ferguson, Stephen J; Arjmand, Navid

    2016-09-06

    A number of musculoskeletal models of the human spine have been used for predictions of lumbar and muscle forces. However, the predictive power of these models might be limited by a commonly made assumption: the thoracic region is represented as a single lumped rigid body. This study hence aims to investigate the impact of this assumption on the predictions of spinal and muscle forces. A validated thoracolumbar spine model was used with a flexible thorax (T1-T12), a completely rigid one, or a rigid thorax with its posture updated at each analysis step. Simulations of isometric forward flexion up to 80°, with and without a 20 kg hand load, were performed, based on previously measured kinematics. Depending on the simulated task, the rigid model predicted slightly or moderately lower compressive loading than the flexible one. The differences were relatively greater at the upper lumbar levels (average underestimation of 14% at T12L1 for flexion tasks and of 18% for flexion tasks with a hand load) than at the lower levels (3% and 8% at L5S1 for unloaded and loaded tasks, respectively). The rigid model with updated thoracic posture predicted compressive forces similar to those of the rigid model. Predicted muscle forces were, however, very different between the three models. This study indicates that lumbar spine models with a rigid thorax definition can be used for loading investigations at the lowermost spinal levels. For predictions of upper lumbar spine loading, using models with an articulated thorax is advised. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. THE CULTURE OF CHILDREN SOCIALIZATION CENTERS AS THE ASSUMPTION FOR SUCCESSFUL STUDENT RESOCIALIZATION: THEORETICAL INSIGHTS

    Directory of Open Access Journals (Sweden)

    Simona Bieliūnė

    2013-06-01

    Full Text Available The term student is used in this article to mean a member of formal education who lives and is educated in a children socialization center. Purpose – to discuss the culture of children socialization centers as a theoretical assumption for successful student resocialization. Methodology/approach – the concept of the student resocialization process used in the article is based on the idea of social constructivism (Berger, Luckmann, 1966), which holds that students construct their subjective reality through interaction in a social system and a specific environment. The correlation between the culture of children socialization centers and the student resocialization process is based on the principle of an operating system (Targamadzė, 2006), where changes in one structural element of the system affect the other structural elements. Methods: analysis of documents and scientific literature, comparison. Findings: the composition of children socialization center culture is determined by historical context, non-traditional structure and features of operation. According to researchers, communities of children socialization centers have an authoritarianism-oriented attitude that manifests in discipline, hierarchical relationships, punishments, isolation from society, etc. A humanistic attitude is needed for successful student resocialization; this is argued by scientists and established in legislation. During the process of resocialization students adopt the values, norms, behavioral models and roles existing in the culture of children socialization centers. In addition, the culture of these institutions must be harmonized with the culture of the society. As a result, children socialization centers have to analyze their culture thoroughly, recognize its limitations and try to change it; external factors cannot make these transformations. Research limitations/implications – only theoretical assumptions were made, which have to be proved empirically. Practical implications

  18. Limitations to the Dutch cannabis toleration policy: Assumptions underlying the reclassification of cannabis above 15% THC.

    Science.gov (United States)

    Van Laar, Margriet; Van Der Pol, Peggy; Niesink, Raymond

    2016-08-01

    The Netherlands has seen an increase in Δ9-tetrahydrocannabinol (THC) concentrations from approximately 8% in the 1990s up to 20% in 2004. Increased cannabis potency may lead to higher THC exposure and cannabis-related harm. The Dutch government officially condones the sale of cannabis from so-called 'coffee shops', and the Opium Act distinguishes cannabis as a Schedule II drug with 'acceptable risk' from other drugs with 'unacceptable risk' (Schedule I). Even in 1976, however, cannabis potency was taken into account by distinguishing hemp oil as a Schedule I drug. In 2011, an advisory committee recommended tightening up legislation, leading to a 2013 bill proposing the reclassification of high-potency cannabis products with a THC content of 15% or more as Schedule I drugs. The purpose of this measure was twofold: to reduce public health risks and to reduce illegal cultivation and export of cannabis by increasing punishment. This paper focuses on the public health aspects and describes the explicit and implicit assumptions underlying this '15% THC measure', as well as the extent to which they are supported by scientific research. Based on scientific literature and other sources of information, we conclude that the 15% measure can, in theory, provide a slight health benefit for specific groups of cannabis users (i.e., frequent users preferring strong cannabis, purchasing from coffee shops, using 'steady quantities' and not changing their smoking behaviour), but certainly not for all cannabis users. These gains should be weighed against the investment in enforcement and the risk of unintended (adverse) effects. Given the many assumptions and the uncertainty about the nature and extent of the expected buying and smoking behaviour changes, the measure is a political choice based on thin evidence.

  19. CRITICAL ASSUMPTIONS IN THE F-TANK FARM CLOSURE OPERATIONAL DOCUMENTATION REGARDING WASTE TANK INTERNAL CONFIGURATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Hommel, S.; Fountain, D.

    2012-03-28

    The intent of this document is to provide clarification of critical assumptions regarding the internal configurations of liquid waste tanks at operational closure, with respect to F-Tank Farm (FTF) closure documentation. For the purposes of this document, FTF closure documentation includes: (1) Performance Assessment for the F-Tank Farm at the Savannah River Site (hereafter referred to as the FTF PA) (SRS-REG-2007-00002), (2) Basis for Section 3116 Determination for Closure of F-Tank Farm at the Savannah River Site (DOE/SRS-WD-2012-001), (3) Tier 1 Closure Plan for the F-Area Waste Tank Systems at the Savannah River Site (SRR-CWDA-2010-00147), (4) F-Tank Farm Tanks 18 and 19 DOE Manual 435.1-1 Tier 2 Closure Plan Savannah River Site (SRR-CWDA-2011-00015), (5) Industrial Wastewater Closure Module for the Liquid Waste Tanks 18 and 19 (SRRCWDA-2010-00003), and (6) Tank 18/Tank 19 Special Analysis for the Performance Assessment for the F-Tank Farm at the Savannah River Site (hereafter referred to as the Tank 18/Tank 19 Special Analysis) (SRR-CWDA-2010-00124). Note that the first three FTF closure documents listed apply to the entire FTF, whereas the last three FTF closure documents listed are specific to Tanks 18 and 19. These two waste tanks are expected to be the first two tanks to be grouted and operationally closed under the current suite of FTF closure documents and many of the assumptions and approaches that apply to these two tanks are also applicable to the other FTF waste tanks and operational closure processes.

  20. Symbolic regression via genetic programming for data driven derivation of confinement scaling laws without any assumption on their mathematical form

    Science.gov (United States)

    Murari, A.; Peluso, E.; Gelfusa, M.; Lupelli, I.; Lungaroni, M.; Gaudio, P.

    2015-01-01

    Many measurements are required to control thermonuclear plasmas and to fully exploit them scientifically. In recent years JET has shown the potential to generate about 50 GB of data per shot. Such amounts of data require more sophisticated data analysis methodologies to perform correct inference, and various techniques have recently been developed in this respect. The present paper covers a new methodology to extract mathematical models directly from the data without any a priori assumption about their expression. The approach, based on symbolic regression via genetic programming, is exemplified using the data of the International Tokamak Physics Activity database for the energy confinement time. The best obtained scaling laws are not in power law form and suggest revisiting the extrapolation to ITER. Indeed, the best non-power-law scalings predict confinement times in ITER of approximately 2 to 3 s. On the other hand, more comprehensive and better databases are required to fully profit from the power of these new methods and to discriminate between the hundreds of thousands of models that they can generate.
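    The same idea can be sketched with the open-source gplearn package (an assumed stand-in, not the authors' code; the data below are synthetic placeholders for the ITPA confinement database):

```python
# Symbolic regression via genetic programming with gplearn (illustrative).
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0.5, 2.0, size=(500, 3))      # stand-ins for engineering inputs
y = 0.6 * X[:, 0] * X[:, 1] ** 0.8 / (1 + 0.3 * X[:, 2])   # non-power-law truth

est = SymbolicRegressor(population_size=2000, generations=20,
                        function_set=('add', 'sub', 'mul', 'div'),
                        parsimony_coefficient=0.01, random_state=0)
est.fit(X, y)
print(est._program)                           # best symbolic expression found
```

    Because the search is not restricted to power laws, the returned expression can take forms that a log-linear regression would never propose, which is the point the abstract makes for the confinement time scalings.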

  1. Physics Courses X-Rayed - A Comparative Analysis of High School Physics Courses in Terms of Basic Assumptions

    Science.gov (United States)

    Hobbs, E. D.

    1974-01-01

    Reports an attempt to infer from official statements and from course materials some of the assumptions and theoretical positions which underlie four high school physics courses: Nuffield Physics, ECCP's "The Man Made World," Harvard Project Physics, and PSSC Physics. (PEB)

  2. Breeding performance of carbide and nitride fuels in 2000 MWe LMFBRs: a preliminary report. Part I. Assumptions, constraints and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Barthold, W P

    1976-09-01

    The assumptions, constraints, and methodology used in the design and analysis of 2000 MWe LMFBRs using carbide and nitride fuel are presented. The assumptions used in the nuclear, thermal and mechanical designs are discussed together with geometry constraints, operational constraints, performance constraints, constraints coming from fabrication consideration and fuel bundle-duct interaction constraints. The basic calculational flow for system design studies is described, and the computer codes used in the analyses are briefly reviewed.

  3. Effects of Rational-Emotive Hospice Care Therapy on Problematic Assumptions, Death Anxiety, and Psychological Distress in a Sample of Cancer Patients and Their Family Caregivers in Nigeria

    Directory of Open Access Journals (Sweden)

    Kay Chinonyelum Nwamaka Onyechi

    2016-09-01

    Full Text Available This study was a preliminary investigation that aimed to examine the effects of rational emotive hospice care therapy (REHCT) on problematic assumptions, death anxiety, and psychological distress in a sample of cancer patients and their family caregivers in Nigeria. The study adopted a pre-posttest randomized control group design. Participants were community-dwelling cancer patients (n = 32) and their family caregivers (n = 52). The treatment process consisted of 10 weeks of full intervention and 4 weeks of follow-up meetings that marked the end of the intervention. The study used repeated-measures analysis of variance for data analysis. The findings revealed significant effects of the REHCT intervention program on problematic assumptions, death anxiety, and psychological distress reduction among the cancer patients and their family caregivers at the end of the intervention. The improvements were also maintained at follow-up in the treatment group compared with the control group, who received usual care and conventional counseling. The researchers have shown that the REHCT intervention is more effective than a control therapy for cancer patients' care, education, and counseling in the Nigerian context.

  4. Testing assumptions for conservation of migratory shorebirds and coastal managed wetlands

    Science.gov (United States)

    Collazo, Jaime; James Lyons,; Herring, Garth

    2015-01-01

    Managed wetlands provide critical foraging and roosting habitats for shorebirds during migration; therefore, ensuring their availability is a priority action in shorebird conservation plans. Contemporary shorebird conservation plans rely on a number of assumptions about shorebird prey resources and migratory behavior to determine stopover habitat requirements. For example, the US Shorebird Conservation Plan for the Southeast-Caribbean region assumes that average benthic invertebrate biomass in foraging habitats is 2.4 g dry mass m^-2 and that the dominant prey item of shorebirds in the region is Chironomid larvae. For effective conservation and management, it is important to test working assumptions and update predictive models that are used to estimate habitat requirements. We surveyed migratory shorebirds and sampled the benthic invertebrate community in coastal managed wetlands of South Carolina. We sampled invertebrates at three points in time representing early, middle, and late stages of spring migration, and concurrently surveyed shorebird stopover populations at approximately 7-day intervals throughout migration. We used analysis of variance by ranks to test for temporal variation in invertebrate biomass and density, and we used a model-based approach (linear mixed model and Monte Carlo simulation) to estimate mean biomass and density. There was little evidence of temporal variation in biomass or density during the course of spring shorebird migration, suggesting that shorebirds did not deplete invertebrate prey resources at our site. Estimated biomass was 1.47 g dry mass m^-2 (95% credible interval 0.13-3.55), approximately 39% lower than the value used in the regional shorebird conservation plan. An additional 4728 ha (a 63% increase) would be required if habitat objectives were derived from the biomass levels observed in our study. Polychaetes, especially Laeonereis culveri (2569 individuals m^-2), were the most abundant prey in foraging
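    The habitat arithmetic follows from required area scaling inversely with available biomass; the short check below back-calculates the implied baseline area (a hypothetical figure, inferred from the 4728 ha and 63% values in the abstract):

```python
# Inverse scaling of habitat objective with prey biomass (abstract's numbers).
planned_biomass = 2.4      # g dry mass m^-2, regional plan assumption
observed_biomass = 1.47    # g dry mass m^-2, this study
baseline_ha = 7473         # hypothetical baseline, back-calculated from 4728 ha

scale = planned_biomass / observed_biomass
extra_ha = baseline_ha * (scale - 1)
print(f"scale factor {scale:.2f} -> ~{extra_ha:.0f} additional ha "
      f"(+{(scale - 1) * 100:.0f}%)")
```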

  5. 42 CFR 476.74 - General requirements for the assumption of review.

    Science.gov (United States)

    2010-10-01

    ... HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS UTILIZATION AND QUALITY CONTROL REVIEW Review Responsibilities of Utilization and Quality Control Quality Improvement Organizations (QIOs... inspection at its principal business office— (1) A copy of each agreement with Medicare fiscal...

  6. Tank waste remediation system retrieval and disposal mission key enabling assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Baldwin, J.H.

    1998-01-09

    An overall systems approach has been applied to develop action plans to support the retrieval and immobilization waste disposal mission. The review concluded that the systems and infrastructure required to support the mission are known. Required systems are either in place or plans have been developed. An analysis of the programmatic, management and technical activities necessary to declare Readiness to Proceed with execution of the mission demonstrates that the system, people, and hardware will be on line and ready to support the private contractors. The systems approach included defining the retrieval and immobilized waste disposal mission requirements and evaluating the readiness of the TWRS contractor to supply waste feed to the private contractors in June 2002. The Phase 1 feed delivery requirements from the Private Contractor Request for Proposals were reviewed, transfer piping routes were mapped out, existing systems were evaluated, and upgrade requirements were defined. Technical Basis Reviews were completed to define work scope in greater detail, and cost estimates and associated year-by-year financial analyses were completed. Personnel training, qualifications, management systems and procedures were reviewed and shown to be in place and ready to support the Phase 1B mission. Key assumptions and risks that could negatively impact mission success were evaluated, and appropriate mitigation action plans were developed and scheduled.

  7. Estimating ETAS: the effects of truncation, missing data, and model assumptions

    Science.gov (United States)

    Seif, Stefanie; Mignan, Arnaud; Zechar, Jeremy; Werner, Maximilian; Wiemer, Stefan

    2016-04-01

    The Epidemic-Type Aftershock Sequence (ETAS) model is widely used to describe the occurrence of earthquakes in space and time, but there has been little discussion of the limits of, and influences on, its estimation. What has been established is that ETAS parameter estimates are influenced by missing data (e.g., earthquakes are not reliably detected during lively aftershock sequences) and by simplifying assumptions (e.g., that aftershocks are isotropically distributed). In this article, we investigate the effect of truncation: how do parameter estimates depend on the cut-off magnitude, Mcut, above which parameters are estimated? We analyze catalogs from southern California and Italy and find that parameter variations as a function of Mcut are caused by (i) changing sample size (which affects, e.g., Omori's c constant) or (ii) an intrinsic dependence on Mcut (as Mcut increases, absolute productivity and background rate decrease). We also explore the influence of another form of truncation, the finite catalog length, which can bias estimators of the branching ratio. Being also a function of Omori's p-value, the true branching ratio is underestimated by 45% to 5% for p-values from 1.05 upward. The ETAS productivity parameters (α and K0) and Omori's c-value are significantly changed only for low Mcut = 2.5. We further find that conventional estimation errors for these parameters, inferred from simulations that do not account for aftershock incompleteness, are underestimated by, on average, a factor of six.
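    The finite-catalog effect on the branching ratio can be illustrated with a toy branching (Galton-Watson) cascade: offspring born after the end of the observation window are lost, so the naive children-per-parent estimator is biased low. All parameter values are assumed:

```python
# Toy cascade showing finite-window bias in branching ratio estimates.
import numpy as np

rng = np.random.default_rng(3)
n_true, p, c = 0.8, 1.1, 0.01        # branching ratio, Omori p and c (assumed)
T = 365.0                            # catalog length in days (assumed)

def omori_delay(size):
    # Inverse-CDF sample of the Omori kernel f(t) ~ (t + c)^(-p), p > 1
    u = rng.uniform(size=size)
    return c * ((1 - u) ** (1 / (1 - p)) - 1)

children, parents = 0, 0
for _ in range(20000):               # independent background events in [0, T)
    queue = [rng.uniform(0, T)]
    while queue:
        t = queue.pop()
        if t >= T:
            continue                 # event falls outside the catalog: lost
        parents += 1
        kids = t + omori_delay(rng.poisson(n_true))
        children += np.sum(kids < T)
        queue.extend(kids.tolist())

print(f"true n = {n_true}, naive estimate = {children / parents:.3f}")
```

    Lowering the Omori p-value fattens the delay tail and pushes more offspring past the window, deepening the underestimation, consistent with the p-dependence the abstract reports.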

  8. Tight bounds for the Pearle-Braunstein-Caves chained inequality without the fair-coincidence assumption

    Science.gov (United States)

    Jogenfors, Jonathan; Larsson, Jan-Åke

    2017-08-01

    In any Bell test, loopholes can cause issues in the interpretation of the results, since an apparent violation of the inequality may not correspond to a violation of local realism. An important example is the coincidence-time loophole that arises when detector settings might influence the time when detection will occur. This effect can be observed in many experiments where measurement outcomes are to be compared between remote stations, because the interpretation of an ostensible Bell violation strongly depends on the method used to decide coincidence. The coincidence-time loophole has previously been studied for the Clauser-Horne-Shimony-Holt and Clauser-Horne inequalities, but recent experiments have shown the need for a generalization. Here, we study the generalized "chained" inequality by Pearle, Braunstein, and Caves (PBC) with N ≥ 2 settings per observer. This inequality has applications in, for instance, quantum key distribution, where it has been used to reestablish security. In this paper we give the minimum coincidence probability for the PBC inequality for all N ≥ 2 and show that this bound is tight for a violation free of the fair-coincidence assumption. Thus, if an experiment has a coincidence probability exceeding the critical value derived here, the coincidence-time loophole is eliminated.
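    For reference, the local-realist bound of the chained PBC expression with N settings per observer is 2N - 2, while the maximal quantum value is 2N·cos(π/2N); the margin shrinks as N grows, which is why detection and coincidence efficiencies become so critical. A few lines make this concrete:

```python
# Classical bound vs. quantum value of the chained PBC inequality.
import numpy as np

for N in range(2, 9):
    classical = 2 * N - 2                      # local-realist bound
    quantum = 2 * N * np.cos(np.pi / (2 * N))  # maximal quantum value
    print(f"N = {N}: classical {classical}, quantum {quantum:.4f}, "
          f"margin {quantum - classical:.4f}")
```

    N = 2 reproduces the familiar CHSH case (bound 2, quantum value 2√2).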

  9. Assumption tests regarding the ‘narrow’ rectangles dimensions of the open thin wall sections

    Science.gov (United States)

    Oanta, E.; Panait, C.; Sabau, A.; Barhalescu, M.; Dascalescu, A. E.

    2016-08-01

    Computer-based analytic models that use strength of materials theory inherit the accuracy given by the basic simplifying hypotheses. The underlying assumptions were rationally conceived hundreds of years ago, in an age when there were no computing instruments, so minimizing the necessary volume of calculation was an important requirement. An initial study attempted to evaluate how 'thin' the walls of an open section may be for the analytic calculus method to give accurate results. That study compared the calculus of a rectangular section loaded by twisting moments with that of a narrow section under the same load. Because it compared analytic methods only for a simply shaped section, a more thorough study was required. We therefore consider a thin-wall open section loaded by a twisting moment, discretized into 'narrow' rectangles. The ratio of the sides of the 'narrow' rectangles is the variable of the study. We compare the results of the finite element analysis with the results of the analytic method. The conclusions are important for the development of computer-based analytic models that use parametrized sections, for which different sets of calculus relations may be used.

  10. Indoor Slope and Edge Detection by using Two-Dimensional EKF-SLAM with Orthogonal Assumption

    Directory of Open Access Journals (Sweden)

    Jixin Lv

    2015-04-01

    Full Text Available In an indoor environment, slope and edge detection is an important problem in simultaneous localization and mapping (SLAM), and a basic requirement for mobile robot autonomous navigation. Slope detection allows the robot to find areas that are more traversable, while edge detection can prevent the robot from falling. Three-dimensional (3D) solutions usually require a large memory and high computational costs. This study proposes an efficient two-dimensional (2D) solution that combines slope and edge detection with line-segment-based extended Kalman filter SLAM (EKF-SLAM) in a structured indoor area. The robot is designed to use two fixed 2D laser range finders (LRFs) to perform horizontal and vertical scans. Under a local-area orthogonality assumption, the slope and edge are swiftly modelled as line segments from each vertical scan, and then merged into the EKF-SLAM framework. The EKF-SLAM framework features an optional prediction model that can automatically decide whether the application of iterative closest point (ICP) is necessary to compensate for dead reckoning error. The experimental results demonstrate that the proposed algorithm is capable of swiftly building an accurate 2D map that contains the crucial edge and slope information.
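    A stripped-down version of the vertical-scan step (a synthetic profile, not the authors' data or algorithm): fit short line segments to the scan, read the slope angle from each segment, and flag an edge where consecutive segments show a range discontinuity:

```python
# Slope/edge labelling of a synthetic vertical scan profile (illustrative).
import numpy as np

xs = np.linspace(0.5, 4.0, 200)                     # horizontal range (m)
zs = np.where(xs < 2.0, 0.0, (xs - 2.0) * np.tan(np.radians(10)))  # 10 deg ramp
zs = np.where(xs < 3.3, zs, zs - 0.4)               # 0.4 m drop-off (edge)

window = 20
prev_end = 0.0
for i in range(0, len(xs) - window + 1, window):
    x_w, z_w = xs[i:i + window], zs[i:i + window]
    slope, _ = np.polyfit(x_w, z_w, 1)              # local line segment
    angle = np.degrees(np.arctan(slope))
    jump = 0.0 if i == 0 else abs(z_w[0] - prev_end)
    label = "EDGE" if jump > 0.2 else ("SLOPE" if abs(angle) > 5 else "flat")
    print(f"x in [{x_w[0]:.2f}, {x_w[-1]:.2f}] m: angle {angle:5.1f} deg  {label}")
    prev_end = z_w[-1]
```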

  11. Deficient crisis-probing practices and taken-for-granted assumptions in health organisations

    Science.gov (United States)

    Canyon, Deon V.; Adhikari, Ashmita; Cordery, Thomas; Giguère-Simmonds, Philippe; Huang, Jessica; Nguyen, Helen; Watson, Michael; Yang, Daniel

    2011-01-01

    The practice of crisis-probing in proactive organisations involves meticulous and sustained investigation into operational processes and management structures for potential weaknesses and flaws before they become difficult to resolve. In health organisations, crisis probing is a necessary part of preparing to manage emerging health threats. This study examined the degree of pre-emptive probing in health organisations and the type of crisis training provided to determine whether or not they are prepared in this area. This evidence-based study draws on cross-sectional responses provided by executives from chiropractic, physiotherapy, and podiatry practices; dental and medical clinics; pharmacies; aged care facilities; and hospitals. The data show a marked lack of mandatory probing and a generalised failure to reward crisis reporting. Crisis prevention training is poor in all organisations except hospitals and aged care facilities where it occurs at an adequate frequency. However this training focuses primarily on natural disasters, fails to address most other crisis types, is mostly reactive and not designed to probe for and uncover key taken-for-granted assumptions. Crisis-probing in health organisations is inadequate, and improvements in this area may well translate into measurable improvements in preparedness and response outcomes. PMID:24149030

  12. Finding the right fit: A comparison of process assumptions underlying popular drift-diffusion models.

    Science.gov (United States)

    Ashby, Nathaniel J S; Jekel, Marc; Dickert, Stephan; Glöckner, Andreas

    2016-12-01

    Recent research makes increasing use of eye-tracking methodologies to generate and test process models. Overall, such research suggests that attention, generally indexed by fixations (gaze duration), plays a critical role in the construction of preference, although the methods used to support this supposition differ substantially. In two studies we empirically test prototypical versions of prominent processing assumptions against one another and against several base models. We find that general evidence accumulation processes provide a good fit to the data. An accumulation process that assumes leakage and temporal variability in evidence weighting (i.e., a primacy effect) fits the aggregate data well, both in terms of choices and decision times, across varying types of choices (e.g., charitable giving and hedonic consumption) and numbers of options. However, when comparing models at the level of the individual, simpler models capture the choice data better for a majority of participants. The theoretical and practical implications of these findings are discussed. (PsycINFO Database Record
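    A bare-bones version of such an accumulator (all parameters hypothetical): two options race to a threshold while evidence leaks away and early samples are weighted more heavily, producing the primacy effect mentioned above:

```python
# Leaky, primacy-weighted evidence accumulation for a two-option choice.
import numpy as np

rng = np.random.default_rng(7)

def simulate_choice(drift, leak=0.05, decay=0.002, threshold=1.0, dt=1.0):
    """Return (choice index, RT in steps) for one simulated trial."""
    x = np.zeros(2)                            # accumulated evidence
    for t in range(1, 5000):
        w = np.exp(-decay * t)                 # primacy: early samples count more
        inp = drift + rng.normal(0, 0.1, 2)    # noisy momentary evidence
        x += w * inp * dt - leak * x * dt      # leaky integration
        if x.max() >= threshold:
            return int(np.argmax(x)), t
    return int(np.argmax(x)), t                # no boundary hit: forced choice

choices, rts = zip(*(simulate_choice(np.array([0.06, 0.045]))
                     for _ in range(500)))
print(f"P(choose A) = {np.mean(np.array(choices) == 0):.2f}, "
      f"mean RT = {np.mean(rts):.0f} steps")
```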

  13. Cosmological Perturbations and Quasi-Static Assumption in $f(R)$ Theories

    CERN Document Server

    Chiu, Mu-Chen; Shu, Chenggang; Tu, Hong

    2015-01-01

    $f(R)$ gravity is one of the simplest theories of modified gravity to explain the accelerated cosmic expansion. Although it is usually assumed that the quasi-Newtonian approach for cosmic perturbations is good enough to describe the evolution of large scale structure in $f(R)$ models, some studies have suggested that this method is not valid for all $f(R)$ models. Here, we show that in the matter-dominated era, the pressure and shear equations alone, which can be recast into four first-order equations to solve for cosmological perturbations exactly, are sufficient to solve for the Newtonian potential, $\Psi$, and the curvature potential, $\Phi$. Based on these two equations, we are able to clarify how the exact linear perturbations fit into different limits. We find that in the subhorizon limit, the so-called quasi-static assumption plays no role in reducing the exact linear perturbations in any viable $f(R)$ gravity. Our findings also disagree with previous studies where we find little difference between our...

  14. Validity of the assumption of Gaussian turbulence; Gyldighed af antagelsen om Gaussisk turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, M.; Hansen, K.S.; Juul Pedersen, B.

    2000-07-01

    Wind turbines are designed to withstand the impact of turbulent winds, whose fluctuations are usually assumed to follow a Gaussian probability distribution. Based on a large number of measurements from many sites, this seems a reasonable assumption in flat homogeneous terrain, whereas it may fail in complex terrain. At such sites the wind speed often has a skewed distribution with more frequent lulls than gusts. In order to simulate aerodynamic loads, a numerical turbulence simulation method was developed and implemented. This method may simulate multiple time series of variable, not necessarily Gaussian, distribution without distortion of the spectral distribution or spatial coherence. The simulated time series were used as input to the dynamic-response simulation program Vestas Turbine Simulator (VTS). In this way we simulated the dynamic response of systems exposed to turbulence of either Gaussian or extreme, yet realistic, non-Gaussian probability distribution. Certain loads on turbines with active pitch regulation were enhanced by up to 15% compared to pure Gaussian turbulence. It should, however, be noted that the undesired effect depends on the dynamic system, and it might be mitigated by optimisation of the wind turbine regulation system for local turbulence characteristics. (au)
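    One way to build such non-Gaussian input series (a sketch under assumed spectral and distributional choices, not necessarily the report's method): synthesize a Gaussian series with the desired spectrum, then impose a skewed marginal by rank remapping; re-imposing the original Fourier amplitudes and iterating restores the spectrum at the cost of slightly diluting the skew:

```python
# Skewed turbulence-like series with a prescribed spectrum (illustrative).
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(5)
n = 2**14
freqs = np.fft.rfftfreq(n, d=0.1)                 # 10 Hz sampling (assumed)

amp = np.zeros_like(freqs)
amp[1:] = freqs[1:] ** (-5.0 / 6.0)               # Kolmogorov-like slope (assumed)

phases = np.exp(2j * np.pi * rng.uniform(size=freqs.size))
gauss = np.fft.irfft(amp * phases, n)             # Gaussian series, target spectrum
gauss /= gauss.std()

# Rank-remap onto a skewed target marginal (sign of skew chosen illustratively)
target = -rng.gamma(shape=4.0, scale=1.0, size=n)
target = (target - target.mean()) / target.std()
skewed = np.sort(target)[np.argsort(np.argsort(gauss))]

print(f"skewness: Gaussian {skew(gauss):.2f}, remapped {skew(skewed):.2f}")
```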

  15. Naledi: An example of how natural phenomena can inspire metaphysical assumptions

    Directory of Open Access Journals (Sweden)

    Francois Durand

    2017-02-01

    Full Text Available A new fossil site was discovered in the Rising Star Cave in 2013 in the Cradle of Humankind in South Africa. This site, which has yielded 1550 hominin bones so far, is considered to be one of the richest palaeoanthropological sites in the world. The deposition of the fossils in a remote part of the cave system, approximately 100 m from the entrance, has resulted in a great deal of speculation. The relative inaccessibility of the site, the number of fossil bones it contained, and the fact that virtually all these bones were those of a single species of hominin led to the conclusion that the bones were not deposited by natural sedimentary processes, but rather constitute evidence of purposeful disposal or even burial of the dead by hominins. If this assumption is true, it would be the earliest evidence of a metaphysical awareness in humankind. The tenuous evidence on which this hypothesis rests is discussed, and a more plausible alternative explanation, in which water and gravity were responsible for the deposition of the remains, is put forward.

  16. The European Water Framework Directive: How Ecological Assumptions Frame Technical and Social Change

    Directory of Open Access Journals (Sweden)

    Patrick Steyaert

    2007-06-01

    Full Text Available The European Water Framework Directive (WFD) is built upon significant cognitive developments in the field of ecological science but also encourages active involvement of all interested parties in its implementation. The coexistence in the same policy text of both substantive and procedural approaches to policy development stimulated this research, as did our concerns about the implications of substantive ecological visions within the WFD policy for promoting, or not, social learning processes through participatory designs. A qualitative analysis of the WFD text shows that its ecological dimension devotes quasi-exclusive attention to a particular current of thought in ecosystem science, one that focuses on ecosystem status and stability and considers human activities as disturbance factors. This particular worldview is juxtaposed within the WFD with a more utilitarian one that gives rise to many policy exemptions without changing the general underlying ecological model. We discuss these policy statements in the light of the tension between substantive and procedural policy developments. We argue that the dominant substantive approach of the WFD, comprising particular ecological assumptions built upon "compositionalism," seems to contradict its espoused intention of involving the public. We contrast that current of thought with more functionalist thinking and adaptive management, which offer greater opportunities for social learning, i.e., they place a set of interdependent stakeholders in an intersubjective position in which they operate a "social construction" of water problems through the co-production of knowledge.

  17. Utility of Web search query data in testing theoretical assumptions about mephedrone.

    Science.gov (United States)

    Kapitány-Fövény, Máté; Demetrovics, Zsolt

    2017-05-01

    With growing access to the Internet, people who use drugs and traffickers have started to obtain information about novel psychoactive substances (NPS) via online platforms. This paper analyzes whether decreasing Web interest in formerly banned substances (cocaine, heroin, and MDMA) and the legislative status of mephedrone predict Web interest in this NPS. Google Trends was used to measure changes in Web interest in cocaine, heroin, MDMA, and mephedrone. Google search results for mephedrone within the same time frame were analyzed and categorized. Web interest in the classic drugs was found to be more persistent. Regarding geographical distribution, the location of Web searches for heroin and cocaine was less centralized. The illicit status of mephedrone was a negative predictor of its Web search query rates. The connection between mephedrone-related Web search rates and the legislative status of this substance was significantly mediated by ecstasy-related Web search queries, the number of documentaries, and forum/blog entries about mephedrone. The results support the hypothesis that mephedrone's popularity was highly correlated with its legal status and that it functioned as a potential substitute for MDMA. Google Trends was found to be a useful tool for testing theoretical assumptions about NPS. Copyright © 2017 John Wiley & Sons, Ltd.
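
    As a sketch of how such search-interest series can be retrieved and compared, the snippet below uses the third-party pytrends client for Google Trends; the library, keyword list, and timeframe are assumptions for illustration, not the tooling used in the paper:

    ```python
    # pip install pytrends
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US", tz=0)
    terms = ["mephedrone", "mdma", "cocaine", "heroin"]
    pytrends.build_payload(terms, timeframe="2007-01-01 2012-12-31")

    df = pytrends.interest_over_time()   # weekly relative interest, 0-100
    print(df[terms].corr())              # e.g. mephedrone vs. MDMA queries
    ```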

  18. Compound-nuclear reaction cross sections via the Surrogate method: considering the underlying assumptions

    Science.gov (United States)

    Escher, Jutta; Dietrich, Frank

    2006-10-01

    The Surrogate Nuclear Reactions approach makes it possible to determine compound-nuclear reaction cross sections indirectly. The method has been employed to determine (n,f) cross sections for various actinides, including unstable species [1-4]; other, primarily neutron-induced, reactions are also being considered [5,6]. The extraction of the sought-after cross sections typically relies on approximations to the full Surrogate formalism [7]. This presentation will identify and critically examine the most significant assumptions underlying the experimental work carried out so far. Calculations that test the validity of the approximations employed will be presented. [1] J.D. Cramer and H.C. Britt, Nucl. Sci. and Eng. 41, 177 (1970); H.C. Britt and J.B. Wilhelmy, ibid. 72, 222 (1979) [2] M. Petit et al, Nucl. Phys. A735, 345 (2004) [3] C. Plettner et al, Phys. Rev. C 71, 051602 (2005); J. Burke et al, Phys. Rev. C 73, 054604 (2006) [4] W. Younes and H.C. Britt, Phys. Rev. C 67, 024610 (2003); 68, 034610 (2003) [5] L.A. Bernstein et al, AIP Conf. Proc. 769, 890 (2005) [6] J. Escher et al, Nucl. Phys. A758, 43c (2005) [7] J. Escher and F.S. Dietrich, submitted (2006)

  19. Improving thermal ablation delineation with electrode vibration elastography using a bidirectional wave propagation assumption.

    Science.gov (United States)

    DeWall, Ryan J; Varghese, Tomy

    2012-01-01

    Thermal ablation procedures are commonly used to treat hepatic cancers, and accurate ablation representation on shear wave velocity images is crucial to ensure complete treatment of the malignant target. Electrode vibration elastography is a shear wave imaging technique recently developed to monitor thermal ablation extent during treatment procedures. Previous work has shown good lateral boundary delineation of ablated volumes, but axial delineation was more ambiguous, which may have resulted from the assumption of lateral shear wave propagation. In this work, we assume both lateral and axial wave propagation and compare wave velocity images to those assuming only lateral shear wave propagation in finite element simulations, tissue-mimicking phantoms, and bovine liver tissue. Our results show that assuming bidirectional wave propagation minimizes artifacts above and below ablated volumes, yielding a more accurate representation of the ablated region on shear wave velocity images. Area overestimation was reduced from 13.4% to 3.6% in a stiff-inclusion tissue-mimicking phantom and from 9.1% to 0.8% in a radio-frequency ablation in bovine liver tissue. More accurate ablation representation during ablation procedures increases the likelihood of complete treatment of the malignant target, decreasing tumor recurrence. © 2012 IEEE
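
    The directional assumption can be illustrated with a toy arrival-time inversion: a lateral-only reconstruction uses only the lateral component of the arrival-time gradient, while a bidirectional one uses both components. The geometry, speeds, and radial time-of-flight approximation below are hypothetical; the actual electrode vibration elastography processing is more involved:

    ```python
    import numpy as np

    dx = dz = 0.5e-3                                   # grid spacing [m]
    x = np.arange(dx, 40e-3, dx)                       # lateral position
    z = np.arange(-10e-3, 10e-3, dz)                   # axial position
    X, Z = np.meshgrid(x, z)
    c_true = np.where(np.hypot(X - 20e-3, Z) < 5e-3, 4.0, 2.0)  # stiff zone
    t = np.hypot(X, Z) / c_true                        # crude time of flight

    gz, gx = np.gradient(t, dz, dx)                    # axial, lateral dt

    # lateral-only assumption: c = 1/|dt/dx|; overestimates speed wherever
    # the wave actually travels partly axially (above/below the ablation)
    c_lateral = 1.0 / np.maximum(np.abs(gx), 1e-9)
    # bidirectional assumption: c = 1/|grad t| uses both components
    c_bidir = 1.0 / np.maximum(np.hypot(gx, gz), 1e-9)

    row = np.argmin(np.abs(z - 6e-3))                  # line above the zone
    print(c_lateral[row].mean(), c_bidir[row].mean())
    ```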

  20. Distributional assumptions in food and feed commodities- development of fit-for-purpose sampling protocols.

    Science.gov (United States)

    Paoletti, Claudia; Esbensen, Kim H

    2015-01-01

    Material heterogeneity influences the effectiveness of sampling procedures. Most sampling guidelines used for assessment of food and/or feed commodities are based on classical statistical distributions (the normal, binomial, and Poisson) and almost universally rely on the assumption of randomness. However, this is unrealistic. The scientific food and feed community recognizes a strong preponderance of non-random distribution within commodity lots, which would be a more realistic prerequisite for the definition of effective sampling protocols. Nevertheless, these heterogeneity issues are overlooked when the prime focus is placed only on financial, time, equipment, and personnel constraints instead of mandating acquisition of documented representative samples under realistic heterogeneity conditions. This study shows how the principles promulgated in the Theory of Sampling (TOS), practically tested over 60 years, provide an effective framework for dealing with the complete set of adverse aspects of both compositional and distributional heterogeneity (material sampling errors), as well as with the errors incurred by the sampling process itself. The results of an empirical European Union study on genetically modified soybean heterogeneity, the Kernel Lot Distribution Assessment, are summarized, as they have a strong bearing on the issue of proper sampling protocol development. TOS principles apply universally in the food and feed realm and must therefore be considered the only basis for development of valid sampling protocols free from distributional constraints.

  1. 3 DOF Spherical Pendulum Oscillations with a Uniform Slewing Pivot Center and a Small Angle Assumption

    Directory of Open Access Journals (Sweden)

    Alexander V. Perig

    2014-01-01

    Full Text Available The present paper addresses the derivation of a 3 DOF mathematical model of a spherical pendulum attached to a crane boom tip for uniform slewing motion of the crane. The governing nonlinear DAE-based system for crane boom uniform slewing has been proposed, numerically solved, and experimentally verified. The proposed nonlinear and linearized models have been derived with the introduction of Cartesian coordinates. The linearized model with the small angle assumption has an analytical solution. The relative and absolute payload trajectories have been derived. The amplitudes of load oscillations, which depend on computed initial conditions, have been estimated. The dependence of natural frequencies on the transport inertia forces and gravity forces has been computed. The conservative system, which contains first-order time derivatives of the coordinates but no oscillation damping, has been derived. The dynamic analogy between crane boom-driven payload swaying motion and Foucault’s pendulum motion has been grounded and outlined. For a small swaying angle, good agreement between theoretical and averaged experimental results was obtained.
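
    A minimal numerical sketch of the linearized (small-angle) model is shown below: the payload's horizontal restoring acceleration is -(g/L) times its offset from the uniformly slewing boom tip. All parameter values are assumptions for illustration, and the full DAE-based model in the paper contains more physics:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    g, L = 9.81, 5.0        # gravity [m/s^2], cable length [m] (assumed)
    R, w = 10.0, 0.1        # boom radius [m], slewing rate [rad/s] (assumed)

    def rhs(t, s):
        x, vx, y, vy = s
        px, py = R * np.cos(w * t), R * np.sin(w * t)  # boom-tip (pivot) path
        return [vx, -(g / L) * (x - px), vy, -(g / L) * (y - py)]

    s0 = [R, 0.0, 0.0, R * w]   # start at the boom tip, matching its velocity
    sol = solve_ivp(rhs, (0.0, 60.0), s0, max_step=0.01)

    # relative payload trajectory: swaying as seen from the rotating boom tip
    px, py = R * np.cos(w * sol.t), R * np.sin(w * sol.t)
    sway = np.hypot(sol.y[0] - px, sol.y[2] - py)
    print(f"max sway amplitude ~ {sway.max():.3f} m")
    ```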

  2. NONLINEAR MODELS FOR DESCRIPTION OF CACAO FRUIT GROWTH WITH ASSUMPTION VIOLATIONS

    Directory of Open Access Journals (Sweden)

    JOEL AUGUSTO MUNIZ

    2017-01-01

    Full Text Available Cacao (Theobroma cacao L.) is an important fruit crop in the Brazilian economy, mainly cultivated in the south of the State of Bahia. The optimal stage for harvesting is a major factor in fruit quality, and knowledge of the growth curves can help, especially in identifying the ideal maturation stage for harvesting. Nonlinear regression models have been widely used for the description of growth curves. However, several studies on this subject do not consider residual analysis, the possible dependence between longitudinal observations, or heterogeneity of the sample variance, compromising the quality of the modeling. The objective of this work was to compare the fit of nonlinear regression models, considering residual analysis and assumption violations, in the description of cacao (clone Sial-105) fruit growth. The data evaluated were extracted from Brito and Silva (1983), who conducted the experiment at the Cacao Research Center, Ilheus, State of Bahia. The variables fruit length, diameter, and volume as a function of fruit age were studied. The use of weighting and the incorporation of residual dependencies were efficient, making the modeling more consistent and improving the model fit. Adopting a first-order autoregressive structure, when needed, led to a significant reduction in the residual standard deviation, making the estimates more reliable. The Logistic model was the most efficient for describing cacao fruit growth.
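
    For reference, the kind of logistic fit the study compares can be reproduced in a few lines; the data points below are invented placeholders, not the Brito and Silva (1983) measurements, and weighting or AR(1) residual structure (supported, e.g., via the sigma argument or generalized least squares) is omitted:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, a, k, t0):
        """Logistic growth: asymptote a, rate k, inflection point t0."""
        return a / (1.0 + np.exp(-k * (t - t0)))

    # hypothetical fruit length [cm] versus age [days], for illustration only
    t = np.array([15, 30, 45, 60, 75, 90, 105, 120, 135, 150], float)
    y = np.array([2.1, 4.0, 7.5, 12.0, 16.2, 18.8, 20.1, 20.8, 21.0, 21.2])

    (a, k, t0), _ = curve_fit(logistic, t, y, p0=[21.0, 0.05, 60.0])
    print(f"a = {a:.1f} cm, k = {k:.3f}/day, t0 = {t0:.0f} days")
    ```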

  3. Optimization of Non-Profit Projects’ Portfolio: Chosen Aspects and Assumptions

    Directory of Open Access Journals (Sweden)

    Jacek Woźniak

    2014-09-01

    Full Text Available The chosen aspects and assumptions of the author's proposal of an optimization model for a portfolio of non-profit projects are presented. The functional model of the non-profit (third) sector, which is the basis for the further analyses, is also characterized. The article also quantifies the fundamental conditions of portfolio optimization. A utility model for the management system of the non-profit portfolio is developed, specifying the scope of the model and the relationships between four categories of the portfolio's participants/stakeholders: non-profit organizations, donors, co-participants, and customers (recipients of the basic benefits/values associated with the realization of the non-profit projects). The main optimality conditions and the optimization algorithm for the non-profit portfolio are also given. The paper concludes with exemplary analytical matrices used for optimization of non-profit portfolios, based on the evaluation of both the optimization utility conditions and added parameters. Only selected basic aspects of the optimization of the non-profit projects' portfolio are described here. Keywords: Management, Organization, Non-Profit, Project, Portfolio, Optimization, Utility

  4. [Assumption of medical risks and the problem of medical liability in ancient Roman law].

    Science.gov (United States)

    Váradi, Agnes

    2008-11-01

    The claim of an individual to safeguard his health and life, and to assume and obtain compensation for damage from diseases and accidents, had already appeared in the system of ancient Roman law in the form of many singular legal institutions. In the absence of a unified archetype of regulation, we have to analyse damage caused to the health or corporal integrity of different groups of persons: the legal interpretation of the diseases or injuries suffered by slaves, by people under manus or patria potestas, and by free Roman citizens. The fragments from the Digest of Justinian do not only document concrete legal problems; they can also serve as a starting point for further theoretical analyses. For example: if death is the consequence of a medical failure, does the doctor have any kind of liability? Was after-care part of the healing process according to Roman law? In examining these questions, we should not forget the complex liability system of Roman law, the compensation of damages caused in a contractual or delictual context, and the lex Aquilia. Although these conclusions have no direct relation to the present legal regulation of risk assumption, analysing the examples of Roman law can be useful for developing our view of certain theoretical problems, such as the modern concept of liability in medicine.

  5. Density assumptions for converting geodetic glacier volume change to mass change

    Directory of Open Access Journals (Sweden)

    M. Huss

    2013-05-01

    Full Text Available The geodetic method is widely used for assessing changes in the mass balance of mountain glaciers. However, comparison of repeated digital elevation models only provides a glacier volume change that must be converted to a change in mass using a density assumption or model. This study investigates the use of a constant factor for the volume-to-mass conversion based on a firn compaction model applied to simplified glacier geometries with idealized climate forcing, and two glaciers with long-term mass balance series. It is shown that the "density" of geodetic volume change is not a constant factor and is systematically smaller than ice density in most cases. This is explained by the accretion/removal of low-density firn layers, and changes in the firn density profile with positive/negative mass balance. Assuming a value of 850 ± 60 kg m−3 to convert volume change to mass change is appropriate for a wide range of conditions. For short time intervals (≤3 yr), periods with limited volume change, and/or changing mass balance gradients, the conversion factor can however vary from 0–2000 kg m−3 and beyond, which requires caution when interpreting glacier mass changes based on geodetic surveys.

  7. Gaussian versus top-hat profile assumptions in integral plume models

    Science.gov (United States)

    Davidson, G. A.

    Numerous integral models describing the behaviour of buoyant plumes released into stratified crossflows have been presented in the literature. One of the differences between these models is the form assumed for the self-similar profile: some models assume a top-hat form while others assume a Gaussian. The differences between these two approaches are evaluated by (a) comparing the governing equations on which Gaussian and top-hat models are based; (b) comparing some typical plume predictions generated by each type of model over a range of model parameters. It is shown that, while the profile assumption does lead to differences in the equations which govern plume variables, the effect of these differences on actual plume predictions is small over the range of parameters of practical interest. Since the predictions of Gaussian and top-hat models are essentially equivalent, it can be concluded that the additional physical information incorporated into a Gaussian formulation plays only a minor role in mean plume behaviour, and that the top-hat approach, which requires the numerical solution of a simpler set of equations, is adequate for most situations where an integral approach would be used.
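
    The equivalence can be made concrete: matching the volume flux (pi b^2 w_c) and momentum flux (pi b^2 w_c^2 / 2) of a Gaussian profile w(r) = w_c exp(-r^2/b^2) to a top-hat of radius R and uniform velocity W gives W = w_c/2 and R = sqrt(2) b. A small self-checking sketch, with assumed plume numbers:

    ```python
    import numpy as np

    def tophat_equivalent(w_c, b):
        """Top-hat (W, R) carrying the same volume and momentum flux as a
        Gaussian profile with centreline velocity w_c and width b."""
        return w_c / 2.0, np.sqrt(2.0) * b

    w_c, b = 3.0, 10.0      # assumed centreline velocity [m/s] and width [m]
    W, R = tophat_equivalent(w_c, b)
    assert np.isclose(np.pi * b**2 * w_c, np.pi * R**2 * W)            # volume
    assert np.isclose(np.pi * b**2 * w_c**2 / 2, np.pi * R**2 * W**2)  # momentum
    print(f"W = {W} m/s, R = {R:.2f} m")
    ```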

  8. Implications of the homogeneous turbulence assumption on the aero-optic linking equation

    Science.gov (United States)

    Hugo, Ronald J.; Jumper, Eric J.

    1995-09-01

    This paper investigates the validity of applying the simplified (under the assumptions of isotropic and homogeneous turbulence) aero-optic linking equation to a flowfield that is known to consist of anisotropic and nonhomogeneous turbulence. The investigation is performed in the near-nozzle region of a heated two-dimensional jet, and the study makes use of a conditional sampling experiment to acquire a spatio-temporal temperature-field database for the heated jet flowfield. After compensating for the bandwidth limitations of constant-current-wire temperature measurements, the temperature-field database is applied to the computation of optical degradation through both direct methods and indirect methods relying on the aero-optic linking equation. The simplified version of the linking equation was found to provide very good agreement with direct calculations, provided that the length scale of the density fluctuations was interpreted as the integral scale, with the limits of the integration being the first zero crossings of the covariance coefficient function on either side of zero lag.
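
    That integral-scale interpretation is easy to state in code: integrate the covariance (autocorrelation) coefficient out to its first zero crossing; by symmetry, integrating between the crossings on either side of zero lag simply doubles this. A sketch on synthetic data (assumed, for illustration):

    ```python
    import numpy as np

    def integral_length_scale(signal, dx):
        """One-sided integral of the covariance coefficient up to its
        first zero crossing."""
        s = signal - signal.mean()
        n = s.size
        acf = np.correlate(s, s, mode="full")[n - 1:] / (s.var() * n)
        crossings = np.flatnonzero(acf <= 0.0)
        upper = crossings[0] if crossings.size else n
        return acf[:upper].sum() * dx

    rng = np.random.default_rng(0)
    # moving average of white noise -> correlation length ~ window / 2
    sig = np.convolve(rng.standard_normal(20000), np.ones(50) / 50, "same")
    print(f"integral scale ~ {integral_length_scale(sig, dx=1.0):.1f} samples")
    ```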

  9. The necessary distinction between methodology and philosophical assumptions in healthcare research.

    Science.gov (United States)

    Mesel, Terje

    2013-09-01

    Methodological discussions within healthcare research have traditionally described a methodological dichotomy between qualitative and quantitative methods. The aim of this article is to demonstrate that such a dichotomy presents unnecessary obstacles for good research design and is methodologically and philosophically unsustainable. The issue of incommensurability is not a question of method but rather a question of the philosophical premises underpinning a given method. Thus, transparency on the philosophical level is important for validity and consistency as well as for attempts to integrate or establish an interface to other research. I argue that it is necessary to make a distinction between methodology and philosophical assumptions and to ensure consistency in these correlations. Furthermore, I argue that the question of incommensurability is best answered at this basic philosophical level. The complexity of health care calls for methodological pluralism and creativity that utilises the strength of both qualitative and quantitative approaches. Transparency and consistency on the philosophical level can facilitate new mixed methods research designs that may be promising methodological assets for healthcare research. I believe we are ill served by fortified positions that continue to uphold old battle lines. Empirical research begins in the field of practice and requires a certain amount of pragmatism. However, this pragmatism must be philosophically informed.

  10. Controversies in psychotherapy research: epistemic differences in assumptions about human psychology.

    Science.gov (United States)

    Shean, Glenn D

    2013-01-01

    It is the thesis of this paper that differences in philosophical assumptions about the subject matter and treatment methods of psychotherapy have contributed to disagreements about the external validity of empirically supported therapies (ESTs). These differences are evident in the theories that are the basis for both the design and interpretation of recent psychotherapy efficacy studies. The natural science model, as applied to psychotherapy outcome research, transforms the constitutive features of the study subject in a reciprocal manner, so that problems, treatments, and indicators of effectiveness are limited to what can be directly observed. Meaning-based approaches to therapy emphasize processes and changes that do not lend themselves to experimental study. Hermeneutic philosophy provides a supplemental model for establishing validity in those instances where outcome indicators do not lend themselves to direct observation and measurement and require "deep" interpretation. Hermeneutics allows for a broadening of psychological study, making it possible to establish a form of validity that is applicable when constructs do not refer to things that literally "exist" in nature. From a hermeneutic perspective, the changes that occur in meaning-based therapies must be understood and evaluated on the manner in which they are applied to new situations, the logical ordering and harmony of the parts with the theoretical whole, and the capability of convincing experts and patients that the interpretation can stand up against other ways of understanding. Adoption of this approach is often necessary to competently evaluate the effectiveness of meaning-based therapies.

  11. Studies on the effect of flaw detection probability assumptions on risk reduction at inspection

    Energy Technology Data Exchange (ETDEWEB)

    Simola, K.; Cronvall, O.; Maennistoe, I. (VTT Technical Research Centre of Finland (Finland)); Gunnars, J.; Alverlind, L.; Dillstroem, P. (Inspecta Technology, Stockholm (Sweden)); Gandossi, L. (European Commission Joint Research Centre, Brussels (Belgium))

    2009-12-15

    The aim of the project was to study the effect of POD assumptions on failure probability using structural reliability models. The main interest was to investigate whether it is justifiable to use a simplified POD curve, e.g. in risk-informed in-service inspection (RI-ISI) studies. The results of the study indicate that the use of a simplified POD curve could be justifiable in RI-ISI applications. Another aim was to compare various structural reliability calculation approaches for a set of cases. Through benchmarking one can identify differences and similarities between modelling approaches, provide added confidence in the models, and identify development needs. Comparing the leakage probabilities calculated by the different approaches at the end of plant lifetime (60 years) shows that the results are very similar when inspections are not accounted for. However, when inspections are taken into account, the predicted orders of magnitude differ. Further studies would be needed to investigate the reasons for the differences. Development needs and plans for the benchmarked structural reliability models are discussed. (author)
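
    For orientation, the comparison at issue is between a full POD curve and a simplified one. A common parametric choice is a log-normal POD; a sketch contrasting it with a crude step-shaped simplification is below (both curves and their parameters are illustrative assumptions, not the project's actual POD data):

    ```python
    import numpy as np
    from scipy.stats import norm

    def pod_lognormal(a, a50=2.0, sigma=0.6):
        """POD(a) = Phi((ln a - ln a50) / sigma), flaw depth a in mm."""
        return norm.cdf((np.log(a) - np.log(a50)) / sigma)

    def pod_step(a, a_thr=2.0, pod_max=0.9):
        """Simplified curve: constant POD above a detection threshold."""
        return np.where(a >= a_thr, pod_max, 0.0)

    a = np.linspace(0.1, 10.0, 200)          # flaw depth [mm]
    gap = np.abs(pod_lognormal(a) - pod_step(a))
    print(f"largest pointwise difference: {gap.max():.2f}")
    ```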

  12. Ecological risk of anthropogenic pollutants to reptiles: Evaluating assumptions of sensitivity and exposure

    Energy Technology Data Exchange (ETDEWEB)

    Weir, Scott M., E-mail: scott.weir@ttu.ed [Texas Tech University, Institute of Environmental and Human Health, Department of Environmental Toxicology, Box 41163, Lubbock, TX (United States); Suski, Jamie G., E-mail: jamie.suski@ttu.ed [Texas Tech University, Department of Biological Sciences, Box 43131, Lubbock, TX (United States); Salice, Christopher J., E-mail: chris.salice@ttu.ed [Texas Tech University, Institute of Environmental and Human Health, Department of Environmental Toxicology, Box 41163, Lubbock, TX (United States)

    2010-12-15

    A large data gap in reptile ecotoxicology still persists; therefore, ecological risk assessments of reptiles usually incorporate the use of surrogate species. This necessitates that (1) the surrogate is at least as sensitive as the target taxon and/or (2) exposures to the surrogate are greater than those of the target taxon. We evaluated these assumptions for the use of birds as surrogates for reptiles. Based on a survey of the literature, birds were more sensitive than reptiles for less than a quarter of the chemicals investigated. Dietary and dermal exposure modeling indicated that exposure to reptiles was relatively high, particularly when the dermal route was considered. We conclude that caution is warranted in the use of avian receptors as surrogates for reptiles in ecological risk assessment, and we emphasize the need to better understand the magnitude and mechanism of contaminant exposure in reptiles to improve exposure and risk estimation. - Avian receptors are not universally appropriate surrogates for reptiles in ecological risk assessment.

  13. Relaxing the closure assumption in single-season occupancy models: staggered arrival and departure times

    Science.gov (United States)

    Kendall, William L.; Hines, James E.; Nichols, James D.; Grant, Evan H. Campbell

    2013-01-01

    Occupancy statistical models that account for imperfect detection have proved very useful in several areas of ecology, including species distribution and spatial dynamics, disease ecology, and ecological responses to climate change. These models are based on the collection of multiple samples at each of a number of sites within a given season, during which it is assumed the species is either absent or present and available for detection while each sample is taken. However, for some species, individuals are only present or available for detection seasonally. We present a statistical model that relaxes the closure assumption within a season by permitting staggered entry and exit times for the species of interest at each site. Based on simulation, our open model eliminates bias in occupancy estimators and in some cases increases precision. The power to detect the violation of closure is high if detection probability is reasonably high. In addition to providing more robust estimation of occupancy, this model permits comparison of phenology across sites, species, or years, by modeling variation in arrival or departure probabilities. In a comparison of four species of amphibians in Maryland we found that two toad species arrived at breeding sites later in the season than a salamander and frog species, and departed from sites earlier.
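
    The closed (single-season) model that the paper generalizes has a compact likelihood: a site is occupied with probability psi and, if occupied, detected on each of K visits with probability p, so a never-detected site contributes psi(1-p)^K + (1-psi). A sketch with simulated data (parameter values assumed); the staggered entry/exit extension replaces the constant within-season availability with per-visit arrival and departure probabilities:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def negloglik(params, h):
        psi, p = 1.0 / (1.0 + np.exp(-np.asarray(params)))  # logit scale
        d, k = h.sum(axis=1), h.shape[1]
        lik = psi * p**d * (1 - p)**(k - d) + (d == 0) * (1 - psi)
        return -np.log(lik).sum()

    rng = np.random.default_rng(0)
    z = rng.random(200) < 0.6                       # true occupancy, psi=0.6
    h = (rng.random((200, 5)) < 0.4) & z[:, None]   # detections, p=0.4
    fit = minimize(negloglik, [0.0, 0.0], args=(h,), method="Nelder-Mead")
    print(1.0 / (1.0 + np.exp(-fit.x)))             # ~ [0.6, 0.4]
    ```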

  14. Individualism, collectivism and ethnic identity: cultural assumptions in accounting for caregiving behaviour in Britain.

    Science.gov (United States)

    Willis, Rosalind

    2012-09-01

    Britain is experiencing the ageing of a large number of minority ethnic groups for the first time in its history, due to the post-war migration of people from the Caribbean and the Indian subcontinent. Stereotypes about a high level of provision of informal caregiving among minority ethnic groups are common in Britain, as in the US, despite quantitative studies refuting this assumption. This paper reports on a qualitative analysis of in-depth interviews with older people from five different ethnic groups about their conceptualisation of their ethnic identity, and their attributions of motivations of caregiving within their own ethnic group and in other groups. It is argued that ethnic identity becomes salient after migration and becoming a part of an ethnic minority group in the new country. Therefore, White British people who have never migrated do not have a great sense of ethnic identity. Further, a strong sense of ethnic identity is linked with identifying with the collective rather than the individual, which explains why the White British participants gave an individualist account of their motivations for informal care, whereas the minority ethnic participants gave a collectivist account of their motivations of care. Crucially, members of all ethnic groups were providing or receiving informal care, so it was the attribution and not the behaviour which differed.

  16. The current theoretical assumptions of the Bobath concept as determined by the members of BBTA.

    Science.gov (United States)

    Raine, Sue

    2007-01-01

    The Bobath concept is a problem-solving approach to the assessment and treatment of individuals following a lesion of the central nervous system that offers therapists a framework for their clinical practice. The aim of this study was to facilitate a group of experts in determining the current theoretical assumptions underpinning the Bobath concept. A four-round Delphi study was used. The expert sample included all 15 members of the British Bobath Tutors Association. Initial statements were identified from the literature, with respondents generating additional statements. Level of agreement was determined using a five-point Likert scale, and the level of consensus was set at 80%. Eighty-five statements from the literature were rated, along with 115 generated by the group. Ninety-three statements were identified as representing the theoretical underpinning of the Bobath concept. The Bobath experts agreed that therapists need to be aware of the principles of motor learning, such as active participation, opportunities for practice, and meaningful goals. They emphasized that therapy is an interactive process between the individual, the therapist, and the environment, and that it aims to promote efficiency of movement to the individual's maximum potential rather than normal movement. Treatment was identified by the experts as having "change of functional outcome" at its center.

  17. On the Markovian assumption in the excursion set approach: The approximation of Markov Velocities

    CERN Document Server

    Musso, Marcello

    2014-01-01

    The excursion set approach uses the statistics of the density field, smoothed on a wide range of scales, to gain insight into a number of interesting processes in nonlinear structure formation, such as cluster assembly, merging, and clustering. The approach treats the curve traced out by the overdensity fluctuation field, as the smoothing scale is changed, as a random walk. Most implementations of the approach then assume that, at least to a first approximation, the walks have uncorrelated steps, so that the walk heights are a Markov process. This assumption is known to be inaccurate: smoothing filters that are most easily related to the physics of structure formation generically yield walks whose steps are correlated with one another. We develop models in which it is the steps, rather than the walk heights, that are a Markov process. In such models, which we call Markov Velocity processes, each step correlates only with the previous one. We show that TopHat smoothing of a power-law power spectrum with index n = -2...

  18. Limitations of force-free magnetic field extrapolations: revisiting basic assumptions

    CERN Document Server

    Peter, H; Chitta, L P; Cameron, R H

    2015-01-01

    Force-free extrapolations are widely used to study the magnetic field in the solar corona based on surface measurements. The extrapolations assume that the ratio of the internal energy of the plasma to the magnetic energy, the plasma-beta, is negligible. Despite the widespread use of this assumption, observations, models, and theoretical considerations show that beta is of the order of a few percent to more than 10%, and thus not small. We investigate the consequences this has for the reliability of extrapolation results. We use basic concepts, starting with the force and energy balance, to infer relations between plasma-beta and free magnetic energy, to study the direction of currents in the corona with respect to the magnetic field, and to estimate the errors in the free magnetic energy incurred by neglecting effects of the plasma (beta << 1). A comparison with a 3D MHD model supports our basic considerations. If plasma-beta is of the order of the relative free energy (the ratio of the free magnetic energy to the total...

  19. Numerical simulation of flow in mechanical heart valves: grid resolution and the assumption of flow symmetry.

    Science.gov (United States)

    Ge, Liang; Jones, S Casey; Sotiropoulos, Fotis; Healy, Timothy M; Yoganathan, Ajit P

    2003-10-01

    A numerical method is developed for simulating unsteady, 3-D, laminar flow through a bileaflet mechanical heart valve with the leaflets fixed. The method employs a dual-time-stepping artificial-compressibility approach together with overset (Chimera) grids and is second-order accurate in space and time. Calculations are carried out for the full 3-D valve geometry under steady inflow conditions on meshes with a total number of nodes ranging from 4 × 10^5 to 1.6 × 10^6. The computed results show that downstream of the leaflets the flow is dominated by two pairs of counter-rotating vortices, which originate on either side of the central orifice in the aortic sinus and rotate such that the common flow of each pair is directed away from the aortic wall. These vortices intensify with Reynolds number, and at a Reynolds number of approximately 1200 their complex interaction leads to the onset of unsteady flow and the breaking of symmetry with respect to both geometric planes of symmetry. Our results show the highly 3-D structure of the flow; question the validity of computationally expedient assumptions of flow symmetry; and demonstrate the need for highly resolved, fully 3-D simulations if computational fluid dynamics is to accurately predict the flow in prosthetic mechanical heart valves.

  20. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Kan, E-mail: Shao.Kan@epa.gov [ORISE Postdoctoral Fellow, National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Gift, Jeffrey S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Setzer, R. Woodrow [National Center for Computational Toxicology, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD), which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if log-normality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available, as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies, evaluating model fits to various known distributions, to investigate whether the distribution assumption influences BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within-dose-group variance is small, while the log-normality assumption is a better choice for the relative deviation method when data are more skewed, because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more
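
    One concrete piece of the summarized-data workflow is the moment conversion: an arithmetic mean m and standard deviation s determine the log-scale parameters of a log-normal distribution exactly. A short sketch (the input values are assumed):

    ```python
    import numpy as np

    def lognormal_params(m, s):
        """Log-scale (mu, sigma) of a log-normal with arithmetic mean m, SD s."""
        sigma2 = np.log(1.0 + (s / m) ** 2)
        return np.log(m) - 0.5 * sigma2, np.sqrt(sigma2)

    mu, sigma = lognormal_params(m=250.0, s=40.0)        # e.g. body weight [g]
    assert np.isclose(np.exp(mu + sigma**2 / 2), 250.0)  # recovers the mean
    print(mu, sigma)
    ```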

  1. 'Gay boy talk' meets 'girl talk': HIV risk assessment assumptions in young gay men's sexual health communication with best friends.

    Science.gov (United States)

    Mutchler, Matt G; McDavitt, Bryce

    2011-06-01

    Young adults, particularly young gay men (YGM), are vulnerable to human immunodeficiency virus (HIV). Yet, little is known about how YGM discuss sexual health issues with their friends ('gay boy talk'). We conducted semi-structured interviews with YGM and their best friends (11 YGM/YGM dyads and 13 YGM/heterosexual female dyads). In this paper, we examine risk assessment assumptions conveyed within YGM's communication about sexual health with their friends and how, if at all, the sexual scripts guiding these assumptions may differ between YGM and young women. Findings demonstrated that, while these young adults clearly intended to support their friends and promote safer sex, they also conveyed assumptions about HIV risk assessment, especially regarding sexual partner selection, that may actually increase their friends' risk for HIV infection. Since inaccurate HIV risk assessment assumptions were transmitted via sexual health communication between peers, it is suggested that such assumptions may need to be addressed in HIV prevention programs working with YGM and their friends. Further, gender differences were identified within the sexual scripts shared between YGM and their friends, suggesting that such interventions should be tailored to the specific needs of different friendship networks.

  2. Different but equal: the implausible assumption at the heart of neutral theory.

    Science.gov (United States)

    Purves, Drew W; Turnbull, Lindsay A

    2010-11-01

    1. The core assumption of neutral theory is that all individuals in a community have equal fitness regardless of species, and regardless of the species composition of the community. But, real communities consist of species exhibiting large trait differences; hence these differences must be subject to perfect fitness-equalizing trade-offs for neutrality to hold. 2. Here we explain that perfect equalizing trade-offs are extremely unlikely to occur in reality, because equality of fitness among species is destroyed by: (i) any deviation in the functional form of the trade-off away from the one special form that gives equal fitness; (ii) spatial or temporal variation in performance; (iii) random species differences in performance. 3. In the absence of the density-dependent processes stressed by traditional niche-based community ecology, communities featuring small amounts of (i) or (ii) rapidly lose trait variation, becoming dominated by species with similar traits, and exhibit substantially lower species richness compared to the neutral case. Communities featuring random interspecific variation in traits (iii) lose all but a few fortuitous species. 4. Thus neutrality should be viewed, a priori, as a highly improbable explanation for the long-term co-occurrence of measurably different species within ecological communities. In contrast, coexistence via niche structure and density dependence, is robust to species differences in baseline fitness, and so remains plausible. 5. We conclude that: (i) co-occurring species will typically exhibit substantial differences in baseline fitness even when (imperfect) equalizing trade-offs have been taken into account; (ii) therefore, communities must be strongly niche structured, otherwise they would lose both trait variation and species richness; (iii) nonetheless, even in strongly niche-structured communities, it is possible that the abundance of species with similar traits are at least partially free to drift.
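
    Point 3 is easy to reproduce in simulation: give one of two otherwise drifting species a small constant fitness advantage and coexistence collapses to fixation. A minimal Wright-Fisher sketch, with assumed community size and selection coefficient:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def generations_to_fixation(J=1000, s=0.02, x0=0.5):
        """Two-species Wright-Fisher drift with a fitness edge s for species 1."""
        x, t = x0, 0
        while 0.0 < x < 1.0:
            w = x * (1 + s) / (x * (1 + s) + (1 - x))   # selection
            x = rng.binomial(J, w) / J                  # drift (resampling)
            t += 1
        return t, x

    t, x = generations_to_fixation()
    print(f"monodominance after {t} generations (species 1 fixed: {x == 1.0})")
    ```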

  3. Microwave Properties of Ice-Phase Hydrometeors for Radar and Radiometers: Sensitivity to Model Assumptions

    Science.gov (United States)

    Johnson, Benjamin T.; Petty, Grant W.; Skofronick-Jackson, Gail

    2012-01-01

    A simplified framework is presented for assessing the qualitative sensitivities of computed microwave properties, satellite brightness temperatures, and radar reflectivities to assumptions concerning the physical properties of ice-phase hydrometeors. Properties considered included the shape parameter μ of a gamma size distribution and the melted-equivalent mass median diameter D0, the particle density, the dielectric mixing formula, and the choice of complex index of refraction for ice. We examine these properties at selected radiometer frequencies of 18.7, 36.5, 89.0, and 150.0 GHz, and radar frequencies of 2.8, 13.4, 35.6, and 94.0 GHz, consistent with existing and planned remote sensing instruments. Passive and active microwave observables of ice particles are found to be extremely sensitive to the melted-equivalent mass median diameter D0 of the size distribution. Similarly large sensitivities are found for variations in the ice volume fraction whenever the geometric mass median diameter exceeds approximately 1/8th of the wavelength. At 94 GHz the two-way path-integrated attenuation is potentially large for dense compact particles. The distribution parameter μ has a relatively weak effect on any observable: less than 1-2 K in brightness temperature and up to 2.7 dB difference in the effective radar reflectivity. Reversal of the roles of ice and air in the Maxwell Garnett dielectric mixing formula leads to a significant change in both microwave brightness temperature (10 K) and radar reflectivity (2 dB). The choice of Warren (1984) or Warren and Brandt (2008) for the complex index of refraction of ice can produce a 3%-4% change in the brightness temperature depression.
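
    The Maxwell Garnett mixing rule whose role reversal the abstract quantifies is compact enough to state directly; the permittivity values below are rough assumed numbers for ice and air at microwave frequencies:

    ```python
    import numpy as np

    def maxwell_garnett(eps_m, eps_i, f):
        """Effective permittivity of inclusions (eps_i, volume fraction f)
        embedded in a matrix (eps_m)."""
        d = eps_i - eps_m
        return eps_m * (eps_i + 2 * eps_m + 2 * f * d) / (eps_i + 2 * eps_m - f * d)

    eps_ice, eps_air, f_ice = 3.15 + 0.002j, 1.0 + 0.0j, 0.3
    print(maxwell_garnett(eps_air, eps_ice, f_ice))       # ice in air matrix
    print(maxwell_garnett(eps_ice, eps_air, 1 - f_ice))   # roles reversed
    ```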

  4. Evaluating abundance estimate precision and the assumptions of a count-based index for small mammals

    Science.gov (United States)

    Wiewel, A.S.; Adams, A.A.Y.; Rodda, G.H.

    2009-01-01

    Conservation and management of small mammals requires reliable knowledge of population size. We investigated the precision of mark-recapture and removal abundance estimates generated from live-trapping and snap-trapping data collected at sites on Guam (n = 7), Rota (n = 4), Saipan (n = 5), and Tinian (n = 3), in the Mariana Islands. We also evaluated a common index, captures per unit effort (CPUE), as a predictor of abundance. In addition, we evaluated the cost and time associated with implementing live-trapping and snap-trapping and compared species-specific capture rates of selected live- and snap-traps. For all species, mark-recapture estimates were consistently more precise than removal estimates, based on coefficients of variation and 95% confidence intervals. The predictive utility of CPUE was poor but improved with increasing sampling duration. Nonetheless, modeling of sampling data revealed that underlying assumptions critical to application of an index of abundance, such as constant capture probability across space, time, and individuals, were not met. Although snap-trapping was cheaper and faster than live-trapping, the time difference was negligible when site preparation time was considered. Rattus diardii spp. captures were greatest in Haguruma live-traps (Standard Trading Co., Honolulu, HI) and Victor snap-traps (Woodstream Corporation, Lititz, PA), whereas Suncus murinus and Mus musculus captures were greatest in Sherman live-traps (H. B. Sherman Traps, Inc., Tallahassee, FL) and Museum Special snap-traps (Woodstream Corporation). Although snap-trapping and CPUE may have utility after validation against more rigorous methods, validation should occur across the full range of study conditions. Resources required for this level of validation would likely be better allocated towards implementing rigorous and robust methods.

  5. A critical assessment of the equal environment assumption of the twin method for schizophrenia

    Directory of Open Access Journals (Sweden)

    Roar eFosse

    2015-04-01

    Full Text Available The classical twin method (CTM) is central to the view that schizophrenia is ~80% heritable. The CTM rests on the equal environments assumption (EEA) that identical and fraternal twin pairs experience equivalent trait-relevant environmental exposures. The EEA has not been directly tested for schizophrenia with measures of child social adversity, which is particularly etiologically relevant to the disorder. However, if child social adversity is more similar in identical than fraternal pairs in the general twin population, the EEA is unlikely to be valid for schizophrenia, a question which we tested in this study. Using results from prior twin studies, we tested whether intraclass correlations for the following five categories of child social adversity are larger in identical than fraternal twins: bullying, sexual abuse, physical maltreatment, emotional neglect and abuse, and general trauma. Eleven relevant studies that encompassed 9119 twin pairs provided 24 comparisons of intraclass correlations, which we grouped into the five social exposure categories. Fisher's z-test revealed significantly higher correlations in identical than fraternal pairs for each exposure category (z ≥ 3.53, p < .001). The difference remained consistent across gender, study site (country), sample size, whether psychometric instruments were used, whether interviewing was proximate or distant to the exposures, and whether informants were twins or third persons. Combined with other evidence that the differential intraclass correlation for child social adversity cannot be explained by evocative gene-environment covariation, our results indicate that the CTM does not provide any valid indication of genomic effects in schizophrenia.
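
    The Fisher z comparison reported above works by transforming each correlation with arctanh and comparing on the z scale; a sketch with hypothetical inputs (the r and n values below are not the study's numbers):

    ```python
    import numpy as np
    from scipy.stats import norm

    def fisher_z_test(r1, n1, r2, n2):
        """Two-sided test of r1 vs r2 from independent groups of n1, n2 pairs."""
        z1, z2 = np.arctanh(r1), np.arctanh(r2)
        z = (z1 - z2) / np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
        return z, 2.0 * norm.sf(abs(z))

    z, p = fisher_z_test(r1=0.55, n1=900, r2=0.40, n2=1100)  # hypothetical
    print(f"z = {z:.2f}, p = {p:.1e}")
    ```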

  6. Guardians' Assumption of Tort Liability; 论监护人侵权责任的承担

    Institute of Scientific and Technical Information of China (English)

    周欣超

    2011-01-01

    When a person without civil capacity or with limited civil capacity causes damage to others, the resulting liability is assumed by his or her guardian; where the ward possesses property, however, the ward assumes the liability out of that property. Chinese law contains no rules on civil liability capacity and does not distinguish between guardians and wards when it comes to the assumption of liability, and the applicable principle of imputation is disputed in both theory and judicial practice. The doctrine of presumed fault should be applied to guardians' tort liability: on the one hand the ward ought to be protected, while on the other the balance of the victim's interests should also be taken into account. The introduction and popularization of family liability insurance will help to resolve this dispute.

  7. Tank waste remediation system retrieval and disposal mission key enabling assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Baldwin, J.H.

    1998-01-05

    An overall systems approach has been applied to develop action plans to support the retrieval and immobilization waste disposal mission. The review concluded that the systems and infrastructure required to support the mission are known. Required systems are either in place or plans have been developed to ensure they exist when needed. The review showed that since October 1996 a robust systems engineering approach to establishing integrated Technical Baselines, work breakdown structures, tank farm structures and configurations, and work scope and costs has established itself as part of the culture within TWRS. An analysis of the programmatic, management, and technical activities necessary to declare readiness to proceed with execution of the mission demonstrates that the systems, people, and hardware will be on line and ready to support the private contractors. The systems approach included defining the retrieval and immobilized waste disposal mission requirements and evaluating the readiness of the TWRS contractor to supply waste feed to the private contractors in June 2002. The Phase 1 feed delivery requirements from the Private Contractor Request for Proposals were reviewed. Transfer piping routes were mapped out, existing systems were evaluated, and upgrade requirements were defined. Technical Basis Reviews were completed to define work scope in greater detail, and cost estimates and associated year-by-year financial analyses were completed. TWRS personnel training, qualifications, management systems, and procedures were reviewed and shown to be in place and ready to support the Phase 1B mission. Key assumptions and risks that could negatively impact mission success were evaluated, and appropriate mitigating action plans were developed and scheduled.

  8. Influence of road network and population demand assumptions in evacuation modeling for distant tsunamis

    Science.gov (United States)

    Henry, Kevin; Wood, Nathan J.; Frazier, Tim G.

    2017-01-01

    Tsunami evacuation planning in coastal communities is typically focused on local events where at-risk individuals must move on foot in a matter of minutes to safety. Less attention has been placed on distant tsunamis, where evacuations unfold over several hours, are often dominated by vehicle use and are managed by public safety officials. Traditional traffic simulation models focus on estimating clearance times but often overlook the influence of varying population demand, alternative modes, background traffic, shadow evacuation, and traffic management alternatives. These factors are especially important for island communities with limited egress options to safety. We use the coastal community of Balboa Island, California (USA), as a case study to explore the range of potential clearance times prior to wave arrival for a distant tsunami scenario. We use a first-in, first-out queuing simulation environment to estimate variations in clearance times, given varying assumptions of the evacuating population (demand) and the road network over which they evacuate (supply). Results suggest clearance times are less than wave arrival times for a distant tsunami, except when we assume maximum vehicle usage for residents, employees, and tourists for a weekend scenario. A two-lane bridge to the mainland was the primary traffic bottleneck, thereby minimizing the effect of departure times, shadow evacuations, background traffic, boat-based evacuations, and traffic light timing on overall community clearance time. Reducing vehicular demand generally reduced clearance time, whereas improvements to road capacity had mixed results. Finally, failure to recognize non-residential employee and tourist populations in the vehicle demand substantially underestimated clearance time.
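
    The clearance-time logic of a first-in, first-out bottleneck can be sketched very simply: vehicles join a queue at their departure times and exit through a fixed-capacity link (the two-lane bridge) one headway apart. All numbers below (fleet size, departure-time distribution, bridge capacity) are assumptions for illustration, not the Balboa Island inputs:

    ```python
    import numpy as np

    def clearance_time(departures, capacity_veh_per_hr):
        """Hour at which the last vehicle clears a FIFO bottleneck."""
        headway = 1.0 / capacity_veh_per_hr
        t = 0.0
        for d in np.sort(departures):        # serve vehicles in FIFO order
            t = max(t, d) + headway          # wait for the vehicle, then serve
        return t

    rng = np.random.default_rng(0)
    dep = rng.rayleigh(scale=0.8, size=3000)        # departure times [h]
    print(f"clearance ~ {clearance_time(dep, 1500):.1f} h")
    ```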

  9. Matrix Diffusion for Performance Assessment - Experimental Evidence, Modelling Assumptions and Open Issues

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, A

    2004-07-01

    In this report a comprehensive overview of the matrix diffusion of solutes in fractured crystalline rocks is presented. Some examples from observations in crystalline bedrock are used to illustrate that matrix diffusion indeed acts on various length scales. Fickian diffusion is discussed in detail, followed by some considerations on rock porosity. Because the dual-porosity medium model is a very common and versatile method for describing solute transport in fractured porous media, the transport equations and the fundamental assumptions, approximations, and simplifications are discussed in detail. There is a variety of geometrical aspects, processes, and events which could influence matrix diffusion. The most important of these, e.g. the effect of the flow-wetted fracture surface, channelling, and the limited extent of the porous rock available for matrix diffusion, are addressed. In a further section open issues and unresolved problems related to matrix diffusion are mentioned. Since matrix diffusion is one of the key retarding processes in geosphere transport of dissolved radionuclide species, matrix diffusion was consequently taken into account in past performance assessments of radioactive waste repositories in crystalline host rocks. Some issues regarding matrix diffusion are site-specific while others are independent of the specific situation of a planned repository for radioactive wastes. Eight different performance assessments from Finland, Sweden and Switzerland were considered with the aim of finding out how matrix diffusion was addressed, and whether a consistent picture emerges regarding the varying methodologies of the different radioactive waste organisations. In the final section of the report some conclusions are drawn and an outlook is given. An extensive bibliography provides the reader with the key papers and reports related to matrix diffusion. (author)
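
    A back-of-the-envelope consequence of Fickian matrix diffusion is the characteristic penetration depth, x ~ sqrt(D_e t). The effective diffusivity below is an assumed order-of-magnitude value for crystalline rock, not a figure from the report:

    ```python
    import numpy as np

    D_e = 1e-13                       # effective diffusivity [m^2/s] (assumed)
    for years in (10, 1_000, 100_000):
        t = years * 3.15e7            # seconds
        print(f"{years:>7} yr -> penetration ~ {np.sqrt(D_e * t) * 100:.1f} cm")
    ```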

  10. The biosphere at Laxemar. Data, assumptions and models used in the SR-Can assessment

    Energy Technology Data Exchange (ETDEWEB)

    Karlsson, Sara; Kautsky, Ulrik; Loefgren, Anders; Soederbaeck, Bjoern (eds.)

    2006-10-15

    This is essentially a compilation of a variety of reports concerning the site investigations, the research activities and information derived from other sources important for the safety assessment. The main objective is to present the prerequisites, methods and data used in the biosphere modelling for the safety assessment SR-Can at the Laxemar site. A major part of the report focuses on how site-specific data are used, recalculated or modified in order to be applicable in the safety assessment context, and on the methods and sub-models that are the basis for the biosphere modelling. Furthermore, the assumptions made as to the future states of surface ecosystems are mainly presented in this report. A similar report is provided for the Forsmark area. This report summarises the method adopted for safety assessment following a radionuclide release into the biosphere. The approach utilises the information about the site as far as possible and presents a way of calculating risk to humans. A central tool in the work is the description of the topography, for which there is good understanding of the present conditions and the development over time is fairly predictable. The topography affects surface hydrology, sedimentation, the size of drainage areas and the characteristics of ecosystems. Other parameters are human nutritional intake, which is assumed to be constant over time, and primary production (photosynthesis), which is also a fairly constant parameter over time. The Landscape Dose Factor (LDF) approach gives an integrated measure for the site and also resolves the issues relating to the size of the group with the highest exposure. If this approach is to be widely accepted as a method, some improvements and refinements are still necessary: collecting missing site data, reanalysing site data, reviewing radionuclide-specific data, reformulating ecosystem models and evaluating the results with further sensitivity analysis.

  11. The biosphere at Forsmark. Data, assumptions and models used in the SR-Can assessment

    Energy Technology Data Exchange (ETDEWEB)

    Karlsson, Sara; Kautsky, Ulrik; Loefgren, Anders; Soederbaeck, Bjoern (eds.)

    2006-10-15

    This report summarises the method adopted for safety assessment following a radionuclide release into the biosphere. The approach utilises the information about the site as far as possible and presents a way of calculating risk to humans. A central parameter is the topography, for which there is a good understanding of the present conditions and whose development over time is fairly predictable. The topography affects surface hydrology, sedimentation, the size of drainage areas and the characteristics of ecosystems. Other parameters are human nutritional intake, which is assumed to be constant over time, and primary production (photosynthesis), which is also fairly constant over time. The Landscape Dose Factor (LDF) approach gives an integrated measure for the site and also resolves the issues relating to the size of the group with the highest exposure. If this approach is to be widely accepted as a method, some improvements and refinements are still necessary, e.g. collecting missing site data, reanalysing site data, reviewing radionuclide-specific data, reformulating ecosystem models and evaluating the results with further sensitivity analysis. The report presents descriptions and estimates not presented elsewhere, as well as summaries of important steps in the biosphere modelling that are presented in more detail in separate reports. The intention is to give the reader a coherent description of the steps taken to calculate doses to biota and humans, including a description of the data used, the rationale for a number of assumptions made during parameterisation, and of how the landscape context is applied in the modelling, and also to present the models used and the results obtained.

  12. Evaluating assumptions and parameterization underlying process-based ecosystem models: the case of LPJ-GUESS

    Science.gov (United States)

    Pappas, C.; Fatichi, S.; Leuzinger, S.; Burlando, P.

    2012-04-01

    Dynamic vegetation models have been widely used for analyzing ecosystem dynamics and climate feedbacks. Their performance has been tested extensively against observations and in model intercomparison studies. In the present study, the state-of-the-art ecosystem model LPJ-GUESS was evaluated with respect to its structure, hypotheses, and parameterization by performing a global sensitivity analysis (GSA). The study aims at examining potential model limitations, particularly with regard to regional and watershed-scale applications. A detailed GSA based on variance decomposition is presented to investigate the structural assumptions of the model and to highlight the processes and parameters that cause the highest variability in the outputs. First-order and total sensitivity indices were calculated for each parameter using Sobol's methodology. To elucidate the role of climate in model sensitivity, synthetic climate scenarios were generated based on climatic data from Switzerland. The results clearly indicate a very high sensitivity of LPJ-GUESS to photosynthetic parameters. Intrinsic quantum efficiency alone is able to explain about 60% of the variability in vegetation carbon fluxes and pools for most of the investigated climate conditions. Processes related to light were also found to be important, together with parameters affecting plant structure (growth, establishment and mortality). The model shows minor sensitivity to hydrological and soil texture parameters, questioning its skill in representing spatial vegetation heterogeneity at regional or watershed scales. We conclude that the structure of LPJ-GUESS, and possibly that of other structurally similar dynamic vegetation models, may need to be reconsidered. Specifically, the oversensitivity of the photosynthetic component deserves particular attention, as it seems to contradict an increasing number of observations suggesting that photosynthesis may be a consequence rather than the driver of plant growth.
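
    The variance-decomposition machinery described here can be reproduced in a few lines with the SALib package; the sketch below uses a toy analytical model as a stand-in for LPJ-GUESS, and the parameter names and bounds are invented for illustration:

```python
# Sobol sensitivity analysis with SALib (toy model standing in for LPJ-GUESS).
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    # hypothetical parameter names echoing the abstract
    "names": ["quantum_efficiency", "light_extinction", "mortality_rate"],
    "bounds": [[0.02, 0.10], [0.3, 0.7], [0.01, 0.05]],
}

X = saltelli.sample(problem, 1024)          # Sobol' sample matrix

def toy_model(p):
    # stand-in for a vegetation carbon-flux response
    alpha, k, mort = p
    return alpha * (1.0 - np.exp(-k)) / (mort + 0.01)

Y = np.apply_along_axis(toy_model, 1, X)

Si = sobol.analyze(problem, Y)
print("first-order indices:", Si["S1"])
print("total indices:      ", Si["ST"])
```

    Each first-order index S1 estimates the fraction of output variance explained by one parameter alone; the gap between ST and S1 measures interaction effects.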

  13. Using a sharp instrument to parse apart strategy and consistency: an evaluation of PPT and its assumptions.

    Science.gov (United States)

    Trafimow, David; Rice, Stephen

    2011-01-01

    Potential Performance Theory (PPT) is a general theory for parsing observed performance into the underlying strategy and the consistency with which it is used. Although empirical research supports the usefulness of PPT, it is desirable to have more information about the bias and standard errors of PPT findings. It is also beneficial to know the effects of violations of PPT assumptions. The authors present computer simulations that evaluate bias and standard errors at varying levels of strategy, consistency, and number of trials per participant. The simulations show that, when the assumptions are true, there is very little bias and the standard errors are low when there are moderate or large numbers of trials per participant (e.g., N=50 or N=100). But when the independence assumption is violated, PPT provides biased findings, although the bias is quite small unless the violations are large.
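
    The flavour of such simulations can be conveyed with a generic Monte Carlo check of bias and standard error for a proportion estimated from N trials per participant; this is a simplified stand-in under an assumed binomial response model, not the authors' PPT code:

```python
# Generic Monte Carlo estimate of bias and standard error versus trials N.
import numpy as np

rng = np.random.default_rng(0)
true_p = 0.8            # assumed "strategy" success probability
reps = 10_000           # simulated participants per condition

for n_trials in (10, 50, 100):
    estimates = rng.binomial(n_trials, true_p, size=reps) / n_trials
    bias = estimates.mean() - true_p
    se = estimates.std(ddof=1)
    print(f"N={n_trials:4d}  bias={bias:+.4f}  SE={se:.4f}")
```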

  14. Uniqueness Results for Second Order Bellman-Isaacs Equations under Quadratic Growth Assumptions and Applications

    CERN Document Server

    Da Lio, Francesca; 10.1137/S0363012904440897

    2010-01-01

    In this paper, we prove a comparison result between semicontinuous viscosity sub- and supersolutions, growing at most quadratically, of second-order degenerate parabolic Hamilton-Jacobi-Bellman and Isaacs equations. As an application, we characterize the value function of a finite-horizon stochastic control problem with unbounded controls as the unique viscosity solution of the corresponding dynamic programming equation.
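
    For orientation, the class of equations in question can be written schematically as follows (a generic form assumed here for illustration; the paper's precise growth and structure conditions are in the original):

```latex
\partial_t u + H\!\left(x, Du, D^2u\right) = 0, \qquad
H(x,p,X) = \inf_{\alpha}\,\sup_{\beta}\left\{
  -\tfrac{1}{2}\,\mathrm{tr}\!\left(\sigma\sigma^{\top}(x,\alpha,\beta)\,X\right)
  - b(x,\alpha,\beta)\cdot p - \ell(x,\alpha,\beta)\right\}
```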

  15. Problematizing therapeutic assumptions about narratives: a case study of storytelling events in a post-conflict context.

    Science.gov (United States)

    Cole, Courtney E

    2010-12-01

    Narrative approaches to health communication research have often been characterized by assumptions of the therapeutic and ameliorative effect of narratives. In this article, I call these assumptions into question by critically engaging extant research in narrative health communication research in light of testimony by a participant in South Africa's Truth and Reconciliation Commission. Drawing on his personal narrative, numerous retellings of his story in public and academic discourse, and his responses to his story's appropriation, I demonstrate the importance of conducting narrative research and theorizing with an appreciation of its therapeutic potential, as well as its ability to harm.

  16. Developmental assumptions in literary criticism and their implications for conceptions of continuity and change in literary creativity.

    Science.gov (United States)

    Cohen-Shalev, A; Rapoport, T

    1990-03-01

    With 2 critical readings of William Wordsworth's "Ode: Intimations of Immortality From Recollections of Early Childhood," this article presents contrasting assumptions of the literary critic about the development of artistic creativity and relates them to the issue of continuity and change in literary expression over the life span. The unveiled assumptions parallel the "hard" (structuralist) and "soft" (life-span) conceptions of human development prevailing in contemporary psychology. A better understanding of creative development may be reached by superimposing the principles derived from the soft metatheoretical orientation on those of the hard theory.

  17. The Microtremor H/V Spectral Ratio: The Physical Basis of the Diffuse Field Assumption

    Science.gov (United States)

    Sanchez-Sesma, F. J.

    2016-12-01

    The microtremor H/V spectral ratio (MHVSR) is popular for obtaining the dominant frequency of a site. Despite the success of the MHVSR, some controversy arose regarding its physical basis. One approach is the Diffuse Field Assumption (DFA), under which the diffuse features of noise are assumed to come from multiple scattering within the medium. According to theory, the average of the autocorrelation is proportional to the directional energy density (DED) and to the imaginary part of the Green's function for coincident source and receiver. The square of the MHVSR is then a ratio of DEDs which, in a horizontally layered system, equals 2 ImG11/ImG33, where ImG11 and ImG33 are the imaginary parts of the Green's functions for the horizontal and vertical components, respectively. This has physical implications that emerge from the DED-force duality implicit in the DFA. Consider a surface force at a half-space. The radiated energy is carried away by various wave types, and the proportion of each is precisely the fraction of the energy density of a diffuse elastic wave field at the free surface. Thus, some properties of applied forces are also characteristics of DEDs. For example, consider a Poisson solid. For a normal point load, 67 per cent of the energy is carried away by Rayleigh waves. For the tangential case, it is less well known that 77 per cent of the energy goes as shear waves. In a full space, 92 per cent of the energy is emitted as shear waves. The horizontal DED at the half-space surface implies significant emission of down-going shear waves, which explains the curious stair-like resonance spectrum of ImG11. Both ImG11 and ImG33 grow linearly with frequency, and this represents wave emission. For a layered medium, besides wave emission, the ensuing variations correspond to reflected waves. For high frequencies, ImG33 depends on the properties of the top layer. Reflected body waves are very small, and Rayleigh waves behave in the top layer as in a kind of mini half-space. From the MHVSR one can invert the velocity model.
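
    Stated compactly, the relation invoked above reads as follows (E1, E2, E3 are the horizontal and vertical DEDs; the first equality assumes horizontal isotropy, E1 = E2):

```latex
\mathrm{MHVSR}(\omega)
  = \sqrt{\frac{E_1(\omega) + E_2(\omega)}{E_3(\omega)}}
  = \sqrt{\frac{2\,\mathrm{Im}\,G_{11}(\mathbf{x},\mathbf{x};\omega)}
               {\mathrm{Im}\,G_{33}(\mathbf{x},\mathbf{x};\omega)}}
```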

  18. Examining recent expert elicitation, judgment guidelines: Value assumptions and the prospects for rationality

    Energy Technology Data Exchange (ETDEWEB)

    Fleming, P.A. [Creighton Univ., Omaha, NE (United States). Dept. of Philosophy

    1999-12-01

    Any examination of the role of values in decisions on risk must take into consideration the increasing reliance on the expert judgment method. Today, reliance on expert judgment is conspicuously present in the documents and work associated with the site characterization of Yucca Mountain as a host for the United States' first high-level nuclear waste repository. The NRC encourages the use of probabilistic risk assessment's state-of-the-art technology as a complement to deterministic approaches to nuclear regulatory activities. It considers expert judgment to be one of those technologies. At the last International Conference on High-Level Nuclear Waste Development, several presentations reported on the use of expert elicitation sessions held during 1997 at Yucca Mountain. Over a decade ago, few guidelines existed for Department of Energy work in expert judgment. In an analysis of these guidelines, I described the author-advocate's view of the role of values in this method of risk assessment. I suggested that the guidelines assume naive positivism. I noted that the creators of these guidelines also tend toward scientific realism in their apologetic tone that expert judgment falls short of representing the way nature is. I also pointed to a tendency toward what I call a heightened or super-realism. Normal science represents the way the world is, and for expert judgment this is only likely so. The expert judgment method, however, is capable of truly capturing expertise in a representative sense. The purpose of this paper is to examine new guidelines from the Department of Energy and the Nuclear Regulatory Commission, with a view to eliciting the epistemological assumptions about the role of values and the status of objectivity claimed for this method. Do these new guidelines also adopt naive positivism? Does the inability to encounter raw, pure, value-neutral expert judgment reveal itself in these guidelines? Or do these guidelines adopt the belief that values are not…

  19. Experimental verification of the frozen flow atmospheric turbulence assumption with use of astronomical adaptive optics telemetry.

    Science.gov (United States)

    Poyneer, Lisa; van Dam, Marcos; Véran, Jean-Pierre

    2009-04-01

    We use closed-loop deformable mirror telemetry from Altair and Keck adaptive optics (AO) to determine whether atmospheric turbulence follows the frozen flow hypothesis. Using telemetry from AO systems, our algorithms (based on the predictive Fourier control framework) detect frozen flow >94% of the time. Usually one to three layers are detected. Between 20% and 40% of the total controllable phase power is due to frozen flow. Velocity vector RMS variability is less than 0.5 m/s (per axis) on 10-s intervals, indicating that the atmosphere is stable enough for predictive control to measure and adapt to prevailing atmospheric conditions before they change.
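
    A minimal illustration of frozen-flow detection, assuming access to a time series of reconstructed phase screens (not the actual predictive Fourier control pipeline, which works on closed-loop DM telemetry): under frozen flow, consecutive screens are shifted copies of one another, so an FFT-based phase correlation recovers the wind-driven shift.

```python
# Detect a frozen-flow shift between two phase screens via phase correlation.
import numpy as np

def detect_shift(screen_a, screen_b):
    """Return the (dy, dx) pixel shift that maps screen_a onto screen_b."""
    Fa, Fb = np.fft.fft2(screen_a), np.fft.fft2(screen_b)
    cross = Fb * np.conj(Fa)
    cross /= np.abs(cross) + 1e-12          # normalised cross-power spectrum
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    n, m = corr.shape                       # wrap to signed shifts
    return (dy if dy <= n // 2 else dy - n,
            dx if dx <= m // 2 else dx - m)

# Frozen-flow toy data: a screen translated by (3, -2) pixels between frames.
rng = np.random.default_rng(1)
base = rng.standard_normal((64, 64))
shifted = np.roll(base, (3, -2), axis=(0, 1))
print(detect_shift(base, shifted))          # -> (3, -2)

# Dividing the shift by the frame interval and multiplying by the
# subaperture pitch would give the layer velocity vector in m/s.
```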

  20. Nominal model predictive control

    OpenAIRE

    Grüne, Lars

    2013-01-01

    5 p., to appear in Encyclopedia of Systems and Control, Tariq Samad, John Baillieul (eds.). Model Predictive Control is a controller design method which synthesizes a sampled-data feedback controller from the iterative solution of open-loop optimal control problems. We describe the basic functionality of MPC controllers, their properties regarding feasibility, stability and performance, and the assumptions needed in order to rigorously ensure these properties in a nomina...

  1. Nominal Model Predictive Control

    OpenAIRE

    Grüne, Lars

    2014-01-01

    5 p., to appear in Encyclopedia of Systems and Control, Tariq Samad, John Baillieul (eds.). Model Predictive Control is a controller design method which synthesizes a sampled-data feedback controller from the iterative solution of open-loop optimal control problems. We describe the basic functionality of MPC controllers, their properties regarding feasibility, stability and performance, and the assumptions needed in order to rigorously ensure these properties in a nomina...
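
    As a companion to the entry, here is a minimal receding-horizon loop for a linear system with a quadratic cost, written with the cvxpy modelling package; the plant, horizon and weights are invented for illustration, and the sketch omits the feasibility and stability machinery the entry discusses:

```python
# Minimal nominal MPC loop for x+ = A x + B u with |u| <= 1.
import numpy as np
import cvxpy as cp

A = np.array([[1.0, 0.1], [0.0, 1.0]])    # assumed double-integrator-like plant
B = np.array([[0.005], [0.1]])
N = 20                                    # prediction horizon

def mpc_control(x0):
    x = cp.Variable((2, N + 1))
    u = cp.Variable((1, N))
    cost, constr = 0, [x[:, 0] == x0]
    for k in range(N):
        cost += cp.sum_squares(x[:, k]) + 0.1 * cp.sum_squares(u[:, k])
        constr += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                   cp.abs(u[:, k]) <= 1.0]
    cp.Problem(cp.Minimize(cost), constr).solve()
    return u.value[:, 0]                  # apply only the first input

x = np.array([2.0, 0.0])
for t in range(30):                       # closed loop: re-solve every step
    u0 = mpc_control(x)
    x = A @ x + B @ u0
print("final state:", x)
```

    Only the first input of each optimal sequence is applied before the problem is re-solved from the new state, which is the receding-horizon principle the entry describes.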

  2. Why Bother about Writing a Masters Dissertation? Assumptions of Faculty and Masters Students in an Iranian Setting

    Science.gov (United States)

    Hasrati, Mostafa

    2013-01-01

    This article reports the results of a mixed methodology analysis of the assumptions of academic staff and Masters students in an Iranian university regarding various aspects of the assessment of the Masters degree thesis, including the main objective for writing the thesis, the role of the students, supervisors and advisors in writing the…

  3. The water-filled versus air-filled status of vessels cut open in air: the 'Scholander assumption' revisited

    Science.gov (United States)

    M.T. Tyree; H. Cochard; P. Cruziat

    2003-01-01

    When petioles of transpiring leaves are cut in the air, according to the 'Scholander assumption', the vessels cut open should fill with air as the water is drained away by continued transpiration. The distribution of air-filled vessels versus distance from the cut surface should match the distribution of lengths of 'open vessels', i.e. vessels cut...

  4. Schooling Mobile Phones: Assumptions about Proximal Benefits, the Challenges of Shifting Meanings, and the Politics of Teaching

    Science.gov (United States)

    Philip, Thomas M.; Garcia, Antero

    2015-01-01

    Mobile devices are increasingly upheld as powerful tools for learning and school reform. In this article, we prioritize youth voices to critically examine assumptions about student interest in mobile devices that often drive the incorporation of new technologies into schools. By demonstrating how the very meaning of mobile phones shifts as they are…

  5. Scholars and Their Inquiry Paradigms: Exploring a Conceptual Framework for Classifying Inquiry and Inquirers Based upon Paradigmatic Assumptions.

    Science.gov (United States)

    Toma, J. Douglas

    This paper examines whether the social science-based typology of Yvonne Lincoln and Egon Guba (1994), in which social science scholars are divided into positivist, postpositivist, critical, and constructivist paradigms based on ontological, epistemological, and methodological assumptions in the discipline, can be adapted to the academic discipline…

  6. The Change Grid and the Active Client: Challenging the Assumptions of Change Agentry in the Penal Process.

    Science.gov (United States)

    Klofas, John; Duffee, David E.

    1981-01-01

    Reexamines the assumptions of the change grid regarding the channeling of masses of clients into change strategies programs. Penal organizations specifically select and place clients so that programs remain stable, rather than sequence programs to meet the needs of clients. (Author)

  8. The Effects of Violating the Beta-Binomial Assumption on Huynh's Estimates of Decision Consistency for Mastery Tests.

    Science.gov (United States)

    Johnston, Shirley H.; And Others

    A computer simulation was undertaken to determine the effects of using Huynh's single-administration estimates of the decision consistency indices for agreement and coefficient kappa, under conditions that violated the beta-binomial assumption. Included in the investigation were two unimodal score distributions that fit the model and two bimodal…

  9. Testing the Assumption of Measurement Invariance in the SAMHSA Mental Health and Alcohol Abuse Stigma Assessment in Older Adults

    NARCIS (Netherlands)

    King-Kallimanis, B.L.; Oort, F.J.; Lynn, N.; Schonfeld, L.

    2012-01-01

    This study examined the assumption of measurement invariance of the SAMHSA Mental Health and Alcohol Abuse Stigma Assessment. This is necessary to make valid comparisons across time and groups. The data come from the Primary Care Research in Substance Abuse and Mental Health for Elderly trial, a

  10. What Is This Substance? What Makes It Different? Mapping Progression in Students' Assumptions about Chemical Identity

    Science.gov (United States)

    Ngai, Courtney; Sevian, Hannah; Talanquer, Vicente

    2014-01-01

    Given the diversity of materials in our surroundings, one should expect scientifically literate citizens to have a basic understanding of the core ideas and practices used to analyze chemical substances. In this article, we use the term 'chemical identity' to encapsulate the assumptions, knowledge, and practices upon which chemical…

  11. TESTING THE ASSUMPTIONS AND INTERPRETING THE RESULTS OF THE RASCH MODEL USING LOG-LINEAR PROCEDURES IN SPSS

    NARCIS (Netherlands)

    TENVERGERT, E; GILLESPIE, M; KINGMA, J

    This paper shows how to use the log-linear subroutine of SPSS to fit the Rasch model. It also shows how to fit less restrictive models obtained by relaxing specific assumptions of the Rasch model. Conditional maximum likelihood estimation was achieved by including dummy variables for the total

  12. TESTING THE ASSUMPTIONS AND INTERPRETING THE RESULTS OF THE RASCH MODEL USING LOG-LINEAR PROCEDURES IN SPSS

    NARCIS (Netherlands)

    TENVERGERT, E; GILLESPIE, M; KINGMA, J

    1993-01-01

    This paper shows how to use the log-linear subroutine of SPSS to fit the Rasch model. It also shows how to fit less restrictive models obtained by relaxing specific assumptions of the Rasch model. Conditional maximum likelihood estimation was achieved by including dummy variables for the total score

  14. 26 CFR 1.752-6 - Partnership assumption of partner's section 358(h)(3) liability after October 18, 1999, and...

    Science.gov (United States)

    2010-04-01

    ... general. If, in a transaction described in section 721(a), a partnership assumes a liability (defined in... partnership is the fair market value of that interest increased by the partner's share of partnership...) does not apply to an assumption of a liability (defined in section 358(h)(3)) by a partnership as part...

  17. 76 FR 6637 - Assumption Buster Workshop: Defense-in-Depth Is a Smart Investment for Cyber Security

    Science.gov (United States)

    2011-02-07

    ... Assumption Buster Workshop: Defense-in-Depth Is a Smart Investment for Cyber Security AGENCY: The National... interagency working group that coordinates cyber security research activities in support of national security...-Depth strategy for cyber security. The workshop will be held March 22, 2011 in the Washington DC area...

  18. 76 FR 2151 - Assumption Buster Workshop: Defense-in-Depth is a Smart Investment for Cyber Security

    Science.gov (United States)

    2011-01-12

    ... Assumption Buster Workshop: Defense-in-Depth is a Smart Investment for Cyber Security AGENCY: The National...) Committee, an interagency working group that coordinates cyber security research activities in support of... the defense-in-depth strategy for cyber security. The workshop will be held March 22, 2011 in the...

  19. 14 CFR Appendix C to Part 440 - Agreement for Waiver of Claims and Assumption of Responsibility for Permitted Activities

    Science.gov (United States)

    2010-01-01

    ... Assumption of Responsibility for Permitted Activities C Appendix C to Part 440 Aeronautics and Space... FINANCIAL RESPONSIBILITY Pt. 440, App. C Appendix C to Part 440—Agreement for Waiver of Claims and... implement the provisions of section 440.17(c) of the Commercial Space Transportation Licensing...

  20. Academic Achievement and Behavioral Health among Asian American and African American Adolescents: Testing the Model Minority and Inferior Minority Assumptions

    Science.gov (United States)

    Whaley, Arthur L.; Noel, La Tonya

    2013-01-01

    The present study tested the model minority and inferior minority assumptions by examining the relationship between academic performance and measures of behavioral health in a subsample of 3,008 (22%) participants in a nationally representative, multicultural sample of 13,601 students in the 2001 Youth Risk Behavioral Survey, comparing Asian…

  1. The Effect of Multicollinearity and the Violation of the Assumption of Normality on the Testing of Hypotheses in Regression Analysis.

    Science.gov (United States)

    Vasu, Ellen S.; Elmore, Patricia B.

    The effects of the violation of the assumption of normality, coupled with the condition of multicollinearity, on the outcome of testing the hypothesis Beta equals zero in the two-predictor regression equation are investigated. A Monte Carlo approach was utilized in which three different distributions were sampled for two sample sizes over…
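
    The design of such a study can be sketched as follows, assuming correlated predictors and skewed errors; the specific settings (correlation, error distribution, sample size) are invented:

```python
# Monte Carlo: rejection rate of H0: beta1 = 0 under multicollinearity
# and non-normal (exponential) errors in a two-predictor regression.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, reps, rho = 30, 5000, 0.9        # sample size, replications, collinearity
rejections = 0

for _ in range(reps):
    x1 = rng.standard_normal(n)
    x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
    err = rng.exponential(1.0, n) - 1.0        # skewed, mean-zero errors
    y = 1.0 + 0.0 * x1 + 0.5 * x2 + err        # true beta1 = 0
    X = np.column_stack([np.ones(n), x1, x2])
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - 3)
    cov = s2 * np.linalg.inv(X.T @ X)
    t = beta[1] / np.sqrt(cov[1, 1])
    p = 2 * stats.t.sf(abs(t), df=n - 3)
    rejections += p < 0.05

print(f"empirical type-I error: {rejections / reps:.3f}  (nominal 0.05)")
```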

  2. Moving Past Assumptions: Recognizing Parents as Allies in Promoting the Sexual Literacies of Adolescents through a University-Community Collaboration

    Science.gov (United States)

    Horn, Stacey S.; Peter, Christina R.; Tasker, Timothy B.; Sullivan, Shannon L.

    2013-01-01

    This article recounts how a university-community collaborative challenged prevailing assumptions about parents as barriers to the provision of gender and sexuality information to their children, allowing for the recognition of parents as critical stakeholders and partners in sexual literacy work with youth. We provide evidence that parents'…

  3. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Science.gov (United States)

    Shao, Kan; Gift, Jeffrey S; Setzer, R Woodrow

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD), which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose-response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if log-normality is assumed, and only summarized response data (i.e., mean±standard deviation) are available, as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the "hybrid" method and the relative deviation approach, we first evaluate six representative continuous dose-response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption influences BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within-dose-group variance is small, while the log-normality assumption is a better choice for the relative deviation method when data are more skewed, because of its appropriateness in describing the relationship between the mean and the standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates.
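
    The contrast between the two distributional assumptions can be sketched for the relative-deviation definition of the BMD (the dose at which the mean response changes by 10% relative to control); the linear mean model and summary data below are invented for illustration:

```python
# BMD via the relative deviation approach under normal vs log-normal
# assumptions, using an assumed linear mean model m(d) = a + b*d.
import numpy as np

doses = np.array([0.0, 10.0, 30.0, 100.0])
means = np.array([50.0, 48.0, 44.0, 30.0])   # invented summary data

# Normal assumption: fit m(d) = a + b*d on the original scale.
b, a = np.polyfit(doses, means, 1)
bmd_normal = -0.10 * a / b                   # solve a + b*d = 0.9*a

# Log-normal assumption: fit on the log scale, 10% change in the median.
bl, al = np.polyfit(doses, np.log(means), 1)
bmd_lognormal = np.log(0.9) / bl             # solve bl*d = log(0.9)

print(f"BMD (normal):     {bmd_normal:.1f}")
print(f"BMD (log-normal): {bmd_lognormal:.1f}")
```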

  4. Bridging the Gap between Human Resource Development and Adult Education: Part One, Assumptions, Definitions, and Critiques

    Science.gov (United States)

    Hatcher, Tim; Bowles, Tuere

    2013-01-01

    Human resource development (HRD) as a scholarly endeavor and as a practice is often criticized in the adult education (AE) literature and by AE scholars as manipulative and oppressive and, through training and other interventions, controlling workers for strictly economic ends (Baptiste, 2001; Cunningham, 2004; Schied, 2001; Welton, 1995).…

  5. Expressing Environment Assumptions and Real-time Requirements for a Distributed Embedded System with Shared Variables

    DEFF Research Database (Denmark)

    Tjell, Simon; Fernandes, João Miguel

    2008-01-01

    In a distributed embedded system, it is often necessary to share variables among its computing nodes to allow the distribution of control algorithms. It is therefore necessary to include a component in each node that provides the service of variable sharing. For that type of component, this paper...

  7. Analysis of Entropy Generation of Combined Heat and Mass Transfer in Internal and External Flows with the Assumption of Local Thermodynamic Equilibrium

    Institute of Scientific and Technical Information of China (English)

    Shouguang Yao

    1994-01-01

    In this paper, the control volume method is used to establish the general expression for the entropy generation due to combined convective heat and mass transfer in internal and external fluid streams. The expression accounts for irreversibilities due to the presence of heat transfer across a finite temperature difference, mass transfer across a finite difference in the chemical potential of a species, and flow friction. Based on the assumption of local thermodynamic equilibrium, the generalized form of the Gibbs equation is used in this analysis. The results are applied to two fundamental problems of forced convection heat and mass transfer in internal and external flows. After minimizing the entropy generation, useful conclusions are derived that are typical of the second-law viewpoint for the definition of the optimum operating conditions for the specified applications, which is a valuable criterion for the optimum design of heat and fluid flow devices.
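
    The classic local form of such an expression (a generic textbook form assumed here, since the abstract does not reproduce the paper's exact result) combines the three contributions as:

```latex
\dot{S}_{gen}^{\prime\prime\prime}
  = \underbrace{\frac{k}{T^2}\,(\nabla T)^2}_{\text{heat transfer}}
  + \underbrace{\frac{\mu}{T}\,\Phi}_{\text{fluid friction}}
  - \underbrace{\frac{1}{T}\sum_i \mathbf{j}_i\cdot\nabla\mu_i}_{\text{mass transfer}}
  \;\ge\; 0
```

    Here \(\Phi\) is the viscous dissipation function and \(\mathbf{j}_i\), \(\mu_i\) are the diffusive flux and chemical potential of species \(i\); the last term is positive because diffusion runs down the chemical-potential gradient.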

  8. Performance Analysis of Flat Surface Assumption and Residual Motion Errors on Airborne Repeat-pass InSAR

    Directory of Open Access Journals (Sweden)

    Lin Xue

    2013-09-01

    For airborne repeat-pass Interferometric Synthetic Aperture Radar (InSAR), which has a long synthetic aperture and large azimuth-dependent errors, the flat surface assumption used to simplify the time-domain algorithm model and the residual motion errors due to the limited precision of the navigation system affect the imaging result and the interferometric measurement. This paper analyzes the altitude errors introduced by the surface assumption and the residual motion errors due to the precision of the navigation system. We derive the range error model for a single pass and analyze the effects of these errors on plane location, interferometric phase and DEM precision. The accuracy of the theoretical derivation is then verified by simulation and real data. The research provides theoretical bases for the system design and signal processing of airborne repeat-pass InSAR.

  9. General aptitude and the assumption of truth in deductively rational reasoning about probable but false antecedent to consequent relations.

    Science.gov (United States)

    Schroyens, Walter; Fleerackers, Lieve; Maes, Sunile

    2010-12-15

    Two experiments (N1 = 117 and N2 = 245) on reasoning with knowledge-rich conditionals showed a main effect of logical validity, which was due to the negative effect of counter-examples being smaller for valid than for invalid arguments. These findings support the thesis that some people tend to inhibit background knowledge inconsistent with the hypothetical truth of the premises, while others tend to abandon the implicit truth-assumption when they have factual evidence to the contrary. The findings show that adhering to the truth-assumption in the face of conflicting evidence requires an investment of time and effort which people with a higher general aptitude are more likely to make.

  10. General aptitude and the assumption of truth in deductively rational reasoning about probable but false antecedent to consequent relations

    Science.gov (United States)

    Schroyens, Walter; Fleerackers, Lieve; Maes, Sunile

    2010-01-01

    Two experiments (N1 = 117 and N2 = 245) on reasoning with knowledge-rich conditionals showed a main effect of logical validity, which was due to the negative effect of counter-examples being smaller for valid than for invalid arguments. These findings support the thesis that some people tend to inhibit background knowledge inconsistent with the hypothetical truth of the premises, while others tend to abandon the implicit truth-assumption when they have factual evidence to the contrary. The findings show that adhering to the truth-assumption in the face of conflicting evidence requires an investment of time and effort which people with a higher general aptitude are more likely to make. PMID:21228921

  11. Linearity assumption in soil-to-plant transfer factors of natural uranium and radium in Helianthus annuus L

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, P. Blanco [Departamento de Fisica, Facultad de Ciencias, Universidad de Extremadura, 06071 Badajoz (Spain); Tome, F. Vera [Departamento de Fisica, Facultad de Ciencias, Universidad de Extremadura, 06071 Badajoz (Spain)]. E-mail: fvt@unex.es; Fernandez, M. Perez [Area de Ecologia, Departamento de Fisica, Facultad de Ciencias, Universidad de Extremadura, 06071 Badajoz (Spain); Lozano, J.C. [Laboratorio de Radiactividad Ambiental, Facultad de Ciencias, Universidad de Salamanca, 37008 Salamanca (Spain)

    2006-05-15

    The linearity assumption underlying soil-to-plant transfer factors of natural uranium and ²²⁶Ra was tested using Helianthus annuus L. (sunflower) grown in a hydroponic medium. Transfer of natural uranium and ²²⁶Ra was tested in both the aerial fraction of the plants and in the overall seedlings (roots and shoots). The results show that the linearity assumption can be considered valid for the hydroponic growth of sunflowers for the radionuclides studied. The ability of sunflowers to translocate uranium and ²²⁶Ra was also investigated, as well as the feasibility of using sunflower plants to remove uranium and radium from contaminated water and, by extension, their potential for phytoextraction. In this sense, the removal percentages obtained for natural uranium and ²²⁶Ra were 24% and 42%, respectively. Practically all the uranium is accumulated in the roots. However, 86% of the ²²⁶Ra activity concentration in the roots was translocated to the aerial part.

  12. Analysis on the Rationality Assumption of Economics

    Institute of Scientific and Technical Information of China (English)

    李佩; 张宇

    2014-01-01

    Rationality has always been the most basic behavioural assumption upheld in economics research; however, ever since the concept was born, economists have understood and interpreted it in many different ways. The framework of rationality should be defined as: self-interest, optimization and consistency of preferences. Toward the rationality assumption, one should uphold a positivist stance and maintain the assumption; when theory and reality contradict each other, the theoretical model or the environmental assumptions should be extended cautiously and appropriately, the lower bound of such extension being the preservation of the internal consistency of rationality, and the upper bound depending on the trade-off between the generality and the realism of the theory.

  13. Studies on the Effect of Flaw Detection Probability Assumptions on Risk Reduction at Non-Destructive Inspection

    OpenAIRE

    2010-01-01

    The paper summarises the results of a study of the effect of piping inspection reliability assumptions on failure probability using structural reliability models. The main interest was to investigate whether it is justifiable to use a simplified probability of detection (POD) curve. Further, the study compared various structural reliability calculation approaches for a set of cases. The results indicate that the use of a simplified POD could be justifiable in RI-ISI applications.

  14. The Manifestations of Positive Leadership Strategies in the Doctrinal Assumptions of the U.S. Army Leadership Concept

    Directory of Open Access Journals (Sweden)

    Andrzej Lis

    2015-06-01

    The aim of the paper is to identify the manifestations of positive leadership strategies in the doctrinal assumptions of the U.S. Army leadership concept. The components of the U.S. Army leadership requirements model are tested against Cameron's (2012) model of positive leadership strategies, including: building a positive work climate; fostering positive relationships among the members of an organisation; establishing and promoting positive communication; and manifesting the meaningfulness of work.

  15. Study of Effects of Flat Surface Assumption to Synthetic Aperture Radar Time-domain Algorithms Imaging Quality

    Directory of Open Access Journals (Sweden)

    Lin Shi-bin

    2012-08-01

    Time-domain algorithms have great application prospects in Ultra-Wide-Band Synthetic Aperture Radar (UWB SAR) imaging because of advantages such as perfect focusing and perfect motion compensation. The flat surface assumption can be adopted to simplify the imaging geometric model when undulating terrain is imaged using time-domain algorithms. Nevertheless, the flat surface assumption leads to geometric errors, thereby affecting the imaging results. This paper studies the effects of this assumption on time-domain imaging algorithms, and points out that it leads to a position-offset problem in the case of a linear aperture, and even to target defocusing in the case of a non-linear aperture. The expression for the position offset is given, as well as the restrictions on the maximal offset of the non-linear aperture and the maximum elevation of the area required to focus the targets. The conclusions are validated with simulated data processed by one kind of time-domain algorithm, namely the Back Projection (BP) algorithm.
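
    The core of a time-domain (BP) image formation loop is compact enough to sketch; the geometry below assumes an ideal point-target simulation with phase-only range-compressed data, and all parameters are invented. Gridding the image at z = 0 is exactly the flat surface assumption the paper examines: raising the true target above that plane would displace the peak, illustrating the position-offset effect.

```python
# Minimal back-projection (BP) kernel: coherently sum range-compressed
# echoes over the aperture for every image pixel.
import numpy as np

c, fc = 3e8, 1e9                        # speed of light, assumed carrier (Hz)
wavelen = c / fc

# Hypothetical linear aperture (201 pulse positions) and one point target.
apc = np.stack([np.linspace(-50, 50, 201),           # antenna x positions (m)
                np.zeros(201), np.full(201, 500.0)], axis=1)
target = np.array([10.0, 200.0, 0.0])

# Simulated data: one phase history sample per pulse for the ideal target.
R_t = np.linalg.norm(apc - target, axis=1)
data = np.exp(-1j * 4 * np.pi * R_t / wavelen)

# Back-project onto a flat (z = 0) image grid: the flat surface assumption.
xs, ys = np.linspace(0, 20, 81), np.linspace(190, 210, 81)
image = np.zeros((ys.size, xs.size), dtype=complex)
for i, y in enumerate(ys):
    for j, x in enumerate(xs):
        R = np.linalg.norm(apc - np.array([x, y, 0.0]), axis=1)
        image[i, j] = np.sum(data * np.exp(1j * 4 * np.pi * R / wavelen))

peak = np.unravel_index(np.abs(image).argmax(), image.shape)
print("peak at x =", xs[peak[1]], " y =", ys[peak[0]])  # near the true target
```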

  16. Normality assumptions and risk management: an application of the parametric var via goodness-of-fit test

    Directory of Open Access Journals (Sweden)

    Herick Fernando Moralles

    2014-05-01

    Given the weaknesses of the parametric VaR (Value-at-Risk) calculated under normality assumptions, this paper develops a method of parametric VaR calculation considering ten different probability distributions. Specifically, the distribution to be used for the VaR calculation of a specific asset or portfolio is indicated by the Kolmogorov-Smirnov goodness-of-fit test. Additionally, the study compares the applicability of normality assumptions for the VaR calculation of both individual assets and a large portfolio, in the context of market stability. The experiment makes use of a sample of 15 individual assets traded on the Sao Paulo Stock Exchange and the IBOVESPA index, collected from the Economática® database. The goodness-of-fit tests and VaR calculations are performed by a program developed in MATLAB 7.1®. This investigation demonstrates that the assumption of normality yields good risk estimates for large portfolios and individual assets.
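
    The procedure (fit several candidate distributions, keep the one with the smallest Kolmogorov-Smirnov statistic, then read the VaR off its quantile) can be sketched with scipy; the candidate list here is a small assumed subset of the ten distributions the paper considers, and the toy returns are simulated:

```python
# Parametric VaR with the distribution chosen by a KS goodness-of-fit test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
returns = stats.t.rvs(df=4, scale=0.02, size=1000, random_state=rng)  # toy data

candidates = {"norm": stats.norm, "t": stats.t, "laplace": stats.laplace,
              "logistic": stats.logistic}

best_name, best_stat, best_frozen = None, np.inf, None
for name, dist in candidates.items():
    params = dist.fit(returns)
    stat, _ = stats.kstest(returns, name, args=params)
    if stat < best_stat:
        best_name, best_stat, best_frozen = name, stat, dist(*params)

alpha = 0.01                                   # 99% one-day VaR
var_99 = -best_frozen.ppf(alpha)
print(f"best fit: {best_name} (KS={best_stat:.3f}), VaR(99%)={var_99:.4f}")
```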

  17. Multiple Linear Regressions by Maximizing the Likelihood under Assumption of Generalized Gauss-Laplace Distribution of the Error.

    Science.gov (United States)

    Jäntschi, Lorentz; Bálint, Donatella; Bolboacă, Sorana D

    2016-01-01

    Multiple linear regression analysis is widely used to link an outcome with predictors for a better understanding of the behaviour of the outcome of interest. Usually, under the assumption that the errors follow a normal distribution, the coefficients of the model are estimated by minimizing the sum of squared deviations. A new approach based on maximum likelihood estimation is proposed for finding the coefficients of linear models with two predictors without any restrictive assumptions on the distribution of the errors. The algorithm was developed, implemented, and tested as proof-of-concept using fourteen sets of compounds by investigating the link between activity/property (as outcome) and structural feature information incorporated in molecular descriptors (as predictors). The results on real data demonstrated that in all investigated cases the power of the error is significantly different from the conventional value of two when the Gauss-Laplace distribution is used to relax the restrictive assumption of normally distributed errors. Therefore, the Gauss-Laplace distribution of the error could not be rejected, while the hypothesis that the power of the error from the Gauss-Laplace distribution is normally distributed also failed to be rejected.
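
    A sketch of the estimation idea: under a generalized Gauss-Laplace (exponential-power) error density with shape p, the negative log-likelihood of a two-predictor linear model can be minimized numerically over the coefficients, the scale, and p itself; the implementation details below are assumptions for illustration, not the authors' algorithm:

```python
# ML fit of y = b0 + b1*x1 + b2*x2 with generalized Gauss-Laplace errors:
#   f(r) = p / (2*a*Gamma(1/p)) * exp(-(|r|/a)**p);  p = 2 gives the normal.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(4)
n = 200
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.laplace(0.0, 0.5, n)  # heavy-tailed errors

def nll(theta):
    b0, b1, b2, log_a, log_p = theta
    a, p = np.exp(log_a), np.exp(log_p)       # keep scale and power positive
    r = y - (b0 + b1 * x1 + b2 * x2)
    return (n * (np.log(2 * a) + gammaln(1 / p) - np.log(p))
            + np.sum((np.abs(r) / a) ** p))

res = minimize(nll, x0=np.zeros(5), method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-8})
b0, b1, b2, log_a, log_p = res.x
print(f"coefficients: {b0:.2f}, {b1:.2f}, {b2:.2f}; power p = {np.exp(log_p):.2f}")
```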

  18. Adhesion Detection Analysis by Modeling Rail Wheel Set Dynamics under the Assumption of Constant Creep Coefficient

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali Soomro

    2014-12-01

    Control of the adhesion level is necessary to avoid slippage between the rail wheelset and the track, and hence derailment, and to ensure the smooth running of a rail vehicle. In this paper, the dynamics of the wheelset is discussed for velocities acting in three dimensions of the wheelset and rail track, and the creep forces on each wheel in the longitudinal, lateral and spin directions are enumerated and computed for suitable modeling. The results have been simulated with Matlab code to observe the correlation of this phenomenon and to compare creepage and creep forces for detecting the adhesion level. Adhesion is identified by applying Coulomb's law for sliding friction, comparing tangential and normal forces through the coefficient of friction.
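
    A simplified version of the adhesion check can be sketched with a linear Kalker-style creep force saturated by Coulomb's limit mu*N; the coefficients below are invented, not the paper's values:

```python
# Creep force with Coulomb saturation: slip is flagged when the unsaturated
# tangential force demand reaches the friction limit mu * N.
import numpy as np

mu, N = 0.3, 1.0e5           # assumed friction coefficient, wheel load (N)
f11 = 8.0e6                  # assumed linear longitudinal creep coefficient (N)

def tangential_force(creepage):
    demand = f11 * creepage                  # linear (small-creep) law
    limit = mu * N
    return np.clip(demand, -limit, limit), abs(demand) >= limit

for cr in (0.0005, 0.002, 0.008):
    force, slipping = tangential_force(cr)
    state = "SLIP" if slipping else "adhesion"
    print(f"creepage={cr:.4f}  force={force/1e3:7.1f} kN  -> {state}")
```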

  19. World assumptions in psychosis: do paranoid patients believe in a just world?

    Science.gov (United States)

    Valiente, Carmen; Espinosa, Regina; Vázquez, Carmelo; Cantero, Dolores; Fuentenebro, Filiberto

    2010-11-01

    The aim of this study was to examine the contents of the world views held by patients with current persecutory beliefs. We examined whether beliefs in a just world (BJW) were associated with the severity of participants' psychopathology. Our results showed that, compared with a healthy control group, the current persecutory beliefs group had weaker beliefs in a just world related to themselves (BJW-P), but there were no differences between the two groups in their beliefs in general justice in the world (BJW-G). Regression analyses showed that BJW, particularly weaker beliefs in personal justice, was significantly associated with more severe symptoms of depression and paranoia, as well as with lower scores of psychological well-being. Our results support the relevance of the BJW framework in exploring world views in patients with persecutory beliefs. We discuss the implications of these results for the research on and treatment of paranoid ideation.

  20. Improving access to psychological therapies (IAPT) and treatment outcomes: epistemological assumptions and controversies.

    Science.gov (United States)

    Williams, C H J

    2015-06-01

    Cognitive behaviour therapy (CBT) is recommended as a primary treatment choice in England, for anxiety and depression, by the National Institute for Health and Care Excellence (NICE). It has been argued that CBT has enjoyed political and cultural dominance, and this has arguably led to sustained government investment in England for the cognitive and behavioural treatment of mental health problems. The government programme 'Improving Access to Psychological Therapies' (IAPT) aims to improve the availability of CBT. The criticism of the NICE evidence-based guidelines supporting the IAPT programme has been the dominance of the gold-standard randomized controlled trial methodology, with a focus on numerical outcome data rather than on a recovery narrative. RCT-based research is influenced by a philosophical paradigm called positivism. The IAPT culture is arguably influenced by one research paradigm, and such an influence can skew services towards numerical outcome data as the only truth of 'recovery'. An interpretative paradigm could assist in shaping service-based cultures, alter how services are evaluated and improve the richness of CBT research. This paper explores the theory of knowledge (epistemology) that underpins the evidence-based perspective of CBT and how this influences service delivery. The paper argues that the inclusion of service user narrative (qualitative data) can assist the evaluation of CBT from the user's perspective and can illuminate the context in which people live and how they access services. A qualitative perspective is discussed as a research strategy capturing the lived experience of under-represented groups, such as sexual, gender and ethnic minorities. Cognitive behaviour therapy (CBT) has enjoyed political and cultural dominance within mental healthcare, with renewed government investment in England for the 'Improving Access to Psychological Therapies' (IAPT) programme. The criticism of the evidence-based guidelines…

  1. Peer effects in early childhood education: testing the assumptions of special-education inclusion.

    Science.gov (United States)

    Justice, Laura M; Logan, Jessica A R; Lin, Tzu-Jung; Kaderavek, Joan N

    2014-09-01

    There has been a push in recent years for students with disabilities to be educated alongside their typically developing peers, a practice called inclusion. In this study, we sought to determine whether peer effects operate within early-childhood special-education (ECSE) classrooms in which preschoolers with disabilities are educated alongside typical peers. Peer effects specific to language growth were assessed for 670 preschoolers (mean age = 52 months) in 83 ECSE classrooms; 55% of the children had disabilities. We found that the average language skills of classmates, as assessed in the fall of the year, significantly predicted children's language skills in the spring (after controlling for their relative skill level in the fall); in addition, there was a significant interactive effect of disability status (i.e., the presence or absence of a disability) and peers' language skills. Peer effects were the least consequential for children without disabilities whose classmates had relatively strong language skills, and the most consequential for children with disabilities whose classmates had relatively poor language skills. © The Author(s) 2014.

  2. Oceanographic and behavioural assumptions in models of the fate of coral and coral reef fish larvae.

    Science.gov (United States)

    Wolanski, Eric; Kingsford, Michael J

    2014-09-06

    A predictive model of the fate of coral reef fish larvae in a reef system is proposed that combines the oceanographic processes of advection and turbulent diffusion with the biological process of horizontal swimming controlled by olfactory and auditory cues within the timescales of larval development. In the model, auditory cues resulted in swimming towards the reefs when within hearing distance of the reef, whereas olfactory cues resulted in the larvae swimming towards the natal reef in open waters by swimming against the concentration gradients in the smell plume emanating from the natal reef. The model suggested that the self-seeding rate may be quite large, at least 20% for the larvae of rapidly developing reef fish species, which contrasted with a self-seeding rate less than 2% for non-swimming coral larvae. The predicted self-recruitment rate of reefs was sensitive to a number of parameters, such as the time at which the fish larvae reach post-flexion, the pelagic larval duration of the larvae, the horizontal turbulent diffusion coefficient in reefal waters and the horizontal swimming behaviour of the fish larvae in response to auditory and olfactory cues, for which better field data are needed. Thus, the model suggested that high self-seeding rates for reef fish are possible, even in areas where the 'sticky water' effect is minimal and in the absence of long-term trapping in oceanic fronts and/or large-scale oceanic eddies or filaments that are often argued to facilitate the return of the larvae after long periods of drifting at sea. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
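
    The model's three ingredients (advection by the mean current, turbulent diffusion, and cue-triggered swimming) map naturally onto a particle random walk; the sketch below is a heavily simplified 2D illustration with invented parameters, keeping only the auditory cue and omitting the olfactory up-gradient behaviour:

```python
# Random-walk larval transport: advection + turbulent diffusion + swimming.
import numpy as np

rng = np.random.default_rng(6)
dt, n_steps, n_larvae = 60.0, 1500, 500       # s, steps, particles (invented)
U = np.array([0.05, 0.0])                     # mean current (m/s), assumed
Kh = 1.0                                      # horizontal eddy diffusivity (m2/s)
v_swim = 0.1                                  # post-flexion swim speed (m/s)
reef = np.array([3000.0, 0.0])                # natal reef location (m)
hearing_radius = 1000.0                       # assumed auditory range (m)
reef_radius = 250.0                           # assumed settlement radius (m)

pos = np.zeros((n_larvae, 2))                 # release at the origin
settled = np.zeros(n_larvae, dtype=bool)
for _ in range(n_steps):
    active = ~settled
    to_reef = reef - pos[active]
    dist = np.linalg.norm(to_reef, axis=1, keepdims=True)
    # Swim toward the reef only when within hearing range (auditory cue).
    swim = np.where(dist < hearing_radius, v_swim * to_reef / (dist + 1e-9), 0.0)
    step = (U + swim) * dt + np.sqrt(2 * Kh * dt) * rng.standard_normal(to_reef.shape)
    pos[active] += step
    settled[active] |= np.linalg.norm(reef - pos[active], axis=1) < reef_radius

print(f"self-seeding fraction: {settled.mean():.2%}")
```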

  3. On the importance of the heterogeneity assumption in the characterization of reservoir geomechanical properties

    Science.gov (United States)

    Zoccarato, C.; Baù, D.; Bottazzi, F.; Ferronato, M.; Gambolati, G.; Mantica, S.; Teatini, P.

    2016-10-01

    The geomechanical analysis of a highly compartmentalized reservoir is performed to simulate the seafloor subsidence due to gas production. The available observations over the hydrocarbon reservoir consist of bathymetric surveys carried out before and at the end of a 10-yr production life. The main goal is the calibration of the reservoir compressibility cM, that is, the main geomechanical parameter controlling the surface response. Two conceptual models are considered: in one (i) cM varies only with the depth and the vertical effective stress (heterogeneity due to lithostratigraphic variability); in another (ii) cM varies also in the horizontal plane, that is, it is spatially distributed within the reservoir stratigraphic units. The latter hypothesis accounts for a possible partitioning of the reservoir due to the presence of sealing faults and thrusts that suggests the idea of a block heterogeneous system with the number of reservoir blocks equal to the number of uncertain parameters. The method applied here relies on an ensemble-based data assimilation (DA) algorithm (i.e. the ensemble smoother, ES), which incorporates the information from the bathymetric measurements into the geomechanical model response to infer and reduce the uncertainty of the parameter cM. The outcome from conceptual model (i) indicates that DA is effective in reducing the cM uncertainty. However, the maximum settlement still remains underestimated, while the areal extent of the subsidence bowl is overestimated. We demonstrate that the selection of the heterogeneous conceptual model (ii) allows to reproduce much better the observations thus removing a clear bias of the model structure. DA allows significantly reducing the cM uncertainty in the five blocks (out of the seven) characterized by large volume and large pressure decline. Conversely, the assimilation of land displacements only partially constrains the prior cM uncertainty in the reservoir blocks marginally contributing to the
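
    The ES update that underlies such a calibration can be written in a few lines: each parameter ensemble member is shifted through the cross-covariance between parameters and predicted data. Everything below (dimensions, the linear toy forward operator, noise level) is an assumed synthetic setup, not the reservoir model:

```python
# One-shot ensemble smoother (ES) update of an uncertain parameter field.
import numpy as np

rng = np.random.default_rng(5)
n_ens, n_par, n_obs = 100, 7, 30          # e.g. 7 reservoir blocks' cM values

M = rng.lognormal(mean=-2.0, sigma=0.5, size=(n_par, n_ens))  # prior ensemble
G = rng.standard_normal((n_obs, n_par))   # toy linear forward operator
D_pred = G @ M                            # predicted "subsidence" per member

d_obs = G @ rng.lognormal(-2.0, 0.5, n_par)   # synthetic bathymetric data
sigma_d = 0.05
R = sigma_d**2 * np.eye(n_obs)            # observation-error covariance

# Ensemble (cross-)covariances.
Mp = M - M.mean(axis=1, keepdims=True)
Dp = D_pred - D_pred.mean(axis=1, keepdims=True)
C_md = Mp @ Dp.T / (n_ens - 1)
C_dd = Dp @ Dp.T / (n_ens - 1)

# Update each member against a perturbed copy of the observations.
D_obs = d_obs[:, None] + sigma_d * rng.standard_normal((n_obs, n_ens))
K = C_md @ np.linalg.inv(C_dd + R)        # Kalman-type gain
M_post = M + K @ (D_obs - D_pred)

print("prior spread:", M.std(axis=1).round(3))
print("post  spread:", M_post.std(axis=1).round(3))
```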

  4. The early bird gets the shrimp: Confronting assumptions of isotopic equilibrium and homogeneity in a wild bird population

    Science.gov (United States)

    Wunder, Michael B.; Jehl, Joseph R.; Stricker, Craig A.

    2012-01-01

    1. Because stable isotope distributions in organic material vary systematically across energy gradients that exist in ecosystems, community and population structures, and in individual physiological systems, isotope values in animal tissues have helped address a broad range of questions in animal ecology. It follows that every tissue sample provides an isotopic profile that can be used to study dietary or movement histories of individual animals. Interpretations of these profiles depend on the assumption that metabolic pools are isotopically well mixed and in equilibrium with dietary resources prior to tissue synthesis, and they extend to the population level by assuming isotope profiles are identically distributed for animals using the same proximal dietary resource. As these assumptions are never fully met, studying structure in the variance of tissue isotope values from wild populations is informative. 2. We studied variation in δ13C, δ15N, δ2H and δ18O data for feathers from a population of eared grebes (Podiceps nigricollis) that migrate to Great Salt Lake each fall to moult feathers. During this time, they cannot fly and feed almost exclusively on superabundant brine shrimp (Artemia franciscana). The ecological simplicity of this situation minimized the usual spatial and trophic complexities often present in natural studies of feather isotope values. 3. Ranges and variances of isotope values for the feathers were larger than those from previously published studies that report feather isotopic variance, but they were bimodally distributed in all isotope dimensions. Isotope values for proximal dietary resources and local surface water show that some of the feathers we assumed to have been grown locally must have been grown before birds reached isotopic equilibrium with local diet or immediately prior to arrival at Great Salt Lake. 4. Our study provides novel insights about resource use strategies in eared grebes during migration. More generally, it

  5. Error estimations of dry deposition velocities of air pollutants using bulk sea surface temperature under common assumptions

    Science.gov (United States)

    Lan, Yung-Yao; Tsuang, Ben-Jei; Keenlyside, Noel; Wang, Shu-Lun; Arthur Chen, Chen-Tung; Wang, Bin-Jye; Liu, Tsun-Hsien

    2010-07-01

    It is well known that skin sea surface temperature (SSST) is different from bulk sea surface temperature (BSST) by a few tenths of a degree Celsius. However, the extent of the error associated with dry deposition (or uptake) estimation by using BSST is not well known. This study tries to conduct such an evaluation using the on-board observation data over the South China Sea in the summers of 2004 and 2006. It was found that when a warm layer occurred, the deposition velocities using BSST were underestimated within the range of 0.8-4.3%, and the absorbed sea surface heat flux was overestimated by 21 W m⁻². In contrast, under cool-skin-only conditions, the deposition velocities using BSST were overestimated within the range of 0.5-2.0%, varying with pollutants, and the absorbed sea surface heat flux was underestimated, also by 21 W m⁻². Scale analysis shows that for a slightly soluble gas (e.g., NO₂, NO and CO), the error in the solubility estimation using BSST is the major source of the error in dry deposition estimation. For a highly soluble gas (e.g., SO₂), the error in the estimation of turbulent heat fluxes and, consequently, aerodynamic resistance and gas-phase film resistance using BSST is the major source of the total error. In contrast, for a medium-solubility gas (e.g., O₃ and CO₂) both the errors from the estimations of the solubility and the aerodynamic resistance are important. In addition, deposition estimations using various assumptions are discussed. The largest uncertainty is from the parameterizations for chemical enhancement factors. Other important areas of uncertainty include: (1) various parameterizations for gas-transfer velocity; (2) the neutral-atmosphere assumption; (3) using BSST as SST, and (4) the constant-pH-value assumption.

  6. Estimating the global prevalence of inadequate zinc intake from national food balance sheets: effects of methodological assumptions.

    Directory of Open Access Journals (Sweden)

    K Ryan Wessells

    BACKGROUND: The prevalence of inadequate zinc intake in a population can be estimated by comparing the zinc content of the food supply with the population's theoretical requirement for zinc. However, assumptions regarding the nutrient composition of foods, zinc requirements, and zinc absorption may affect prevalence estimates. These analyses were conducted to: (1) evaluate the effect of varying methodological assumptions on country-specific estimates of the prevalence of dietary zinc inadequacy and (2) generate a model considered to provide the best estimates. METHODOLOGY AND PRINCIPAL FINDINGS: National food balance data were obtained from the Food and Agriculture Organization of the United Nations. Zinc and phytate contents of these foods were estimated from three nutrient composition databases. Zinc absorption was predicted using a mathematical model (Miller equation). Theoretical mean daily per capita physiological and dietary requirements for zinc were calculated using recommendations from the Food and Nutrition Board of the Institute of Medicine and the International Zinc Nutrition Consultative Group. The estimated global prevalence of inadequate zinc intake varied between 12-66%, depending on which methodological assumptions were applied. However, the country-specific rank order of the estimated prevalence of inadequate intake was conserved across all models (r = 0.57-0.99, P<0.01). A "best-estimate" model, comprising zinc and phytate data from a composite nutrient database and IZiNCG physiological requirements for absorbed zinc, estimated the global prevalence of inadequate zinc intake to be 17.3%. CONCLUSIONS AND SIGNIFICANCE: Given the multiple sources of uncertainty in this method, caution must be taken in the interpretation of the estimated prevalence figures. However, the results of all models indicate that inadequate zinc intake may be fairly common globally. Inferences regarding the relative likelihood of zinc deficiency as a public…

  7. A priori assumptions about characters as a cause of incongruence between molecular and morphological hypotheses of primate interrelationships.

    Science.gov (United States)

    Tornow, Matthew A; Skelton, Randall R

    2012-01-01

    When molecules and morphology produce incongruent hypotheses of primate interrelationships, the data are typically viewed as incompatible, and molecular hypotheses are often considered to be better indicators of phylogenetic history. However, it has been demonstrated that the choice of which taxa to include in cladistic analysis as well as assumptions about character weighting, character state transformation order, and outgroup choice all influence hypotheses of relationships and may positively influence tree topology, so that relationships between extant taxa are consistent with those found using molecular data. Thus, the source of incongruence between morphological and molecular trees may lie not in the morphological data themselves but in assumptions surrounding the ways characters evolve and their impact on cladistic analysis. In this study, we investigate the role that assumptions about character polarity and transformation order play in creating incongruence between primate phylogenies based on morphological data and those supported by multiple lines of molecular data. By releasing constraints imposed on published morphological analyses of primates from disparate clades and subjecting those data to parsimony analysis, we test the hypothesis that incongruence between morphology and molecules results from inherent flaws in morphological data. To quantify the difference between incongruent trees, we introduce a new method called branch slide distance (BSD). BSD mitigates many of the limitations attributed to other tree comparison methods, thus allowing for a more accurate measure of topological similarity. We find that releasing a priori constraints on character behavior often produces trees that are consistent with molecular trees. Case studies are presented that illustrate how congruence between molecules and unconstrained morphological data may provide insight into issues of polarity, transformation order, homology, and homoplasy.

  8. Contradiction between assumption on superposition of flux-qubit states and the law of angular momentum conservation

    CERN Document Server

    Nikulov, A V

    2009-01-01

    A superconducting loop interrupted by one or three Josephson junctions is considered in many publications as a possible quantum bit, the flux qubit, which could be used for the creation of a quantum computer. But the assumption of superposition of two macroscopically distinct quantum states of a superconducting loop contradicts the fundamental law of angular momentum conservation and the universally recognized quantum formalism. Numerous publications devoted to the flux qubit testify to an inadequate interpretation by many authors of the paradoxical nature of the superposition principle and of the subject of quantum description.

  9. Can Total Quality Management Succeed at Your College--Now? (Does Your College Meet the Essential Prerequisites and Underlying Assumptions of TQM?)

    Science.gov (United States)

    Hammons, James O.

    1994-01-01

    Defines Total Quality Management (TQM) and describes prerequisites for successful implementation, underlying assumptions, and the cultural barriers hindering implementation. Indicates that TQM's long-term benefits outweigh costs at most colleges. Urges practitioners to rate their schools with respect to the prerequisites and assumptions to…

  11. 42 CFR 137.302 - Are Federal funds available to cover start-up costs associated with initial Tribal assumption of...

    Science.gov (United States)

    2010-10-01

    Are Federal funds available to cover start-up costs associated with initial Tribal assumption of environmental responsibilities? 42 CFR 137.302, Public Health (2010-10-01).

  12. On the Empirical Importance of the Conditional Skewness Assumption in Modelling the Relationship between Risk and Return

    Science.gov (United States)

    Pipień, M.

    2008-09-01

    We present the results of an application of Bayesian inference to testing the relation between risk and return on financial instruments. On the basis of the Intertemporal Capital Asset Pricing Model proposed by Merton, we build a general sampling distribution suitable for analysing this relationship. The most important feature of our assumptions is that the skewness of the conditional distribution of returns is used as an alternative source of the relation between risk and return. This general specification corresponds to a Skewed Generalized Autoregressive Conditionally Heteroscedastic-in-Mean model. In order to make the conditional distribution of financial returns skewed, we considered a unified approach based on the inverse probability integral transformation. In particular, we applied the hidden truncation mechanism, inverse scale factors, the order statistics concept, Beta and Bernstein distribution transformations, and also a constructive method. Based on the daily excess returns on the Warsaw Stock Exchange Index, we checked the empirical importance of the conditional skewness assumption for the relation between risk and return on the Warsaw Stock Market. We present posterior probabilities of all competing specifications as well as the posterior analysis of the positive sign of the tested relationship.
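
    Of the skewing mechanisms listed, hidden truncation has a simple constructive form (Azzalini's skew normal). The sketch below, which is illustrative and not the paper's GARCH-in-Mean machinery, draws samples with density 2*phi(z)*Phi(alpha*z) and checks that alpha > 0 produces positive skewness:

        import math, random

        def skew_normal_sample(alpha, n=100000, seed=1):
            """Hidden-truncation construction of the skew normal:
            Z = delta*|X0| + sqrt(1 - delta^2)*X1, with X0, X1 iid N(0,1),
            has density 2*phi(z)*Phi(alpha*z)."""
            rng = random.Random(seed)
            delta = alpha / math.sqrt(1.0 + alpha * alpha)
            return [delta * abs(rng.gauss(0, 1))
                    + math.sqrt(1 - delta * delta) * rng.gauss(0, 1)
                    for _ in range(n)]

        z = skew_normal_sample(alpha=3.0)
        m = sum(z) / len(z)
        var = sum((v - m) ** 2 for v in z) / len(z)
        skew = sum((v - m) ** 3 for v in z) / (len(z) * var ** 1.5)
        print(f"sample mean {m:.3f}, sample skewness {skew:.3f}")  # positive skew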

  13. CCN predictions using simplified assumptions of organic aerosol composition and mixing state: a synthesis from six different locations

    Directory of Open Access Journals (Sweden)

    B. Ervens

    2010-05-01

    Full Text Available An accurate but simple quantification of the fraction of aerosol particles that can act as cloud condensation nuclei (CCN) is needed for implementation in large-scale models. Data on aerosol size distribution, chemical composition, and CCN concentration from six different locations have been analyzed to explore the extent to which simple assumptions of composition and mixing state of the organic fraction can reproduce measured CCN number concentrations.

    Fresher pollution aerosol, as encountered in Riverside, CA, and the ship channel in Houston, TX, cannot be represented without knowledge of more complex (size-resolved) composition. For aerosol that has experienced processing (Mexico City, Holme Moss (UK), Point Reyes (CA), and Chebogue Point (Canada)), CCN can be predicted within a factor of two assuming either externally or internally mixed soluble organics, although these simplified compositions/mixing states might not represent the actual properties of ambient aerosol populations, in agreement with many previous CCN studies in the literature. Under typical conditions, a factor of two uncertainty in CCN concentration due to composition assumptions translates to an uncertainty of ~15% in cloud drop concentration, which might be adequate for large-scale models given the much larger uncertainty in cloudiness.
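
    A standard way to turn such composition and mixing-state assumptions into CCN numbers is kappa-Koehler theory: each dry diameter gets a critical supersaturation and counts as CCN if it activates at the ambient supersaturation. The sketch below uses an invented three-bin size distribution and assumed kappa values purely for illustration:

        import math

        def critical_supersaturation(d_dry_m, kappa, temp_k=298.15):
            """kappa-Koehler critical supersaturation (fractional, 0.003 = 0.3%)."""
            sigma, m_w, rho_w, r_gas = 0.072, 0.018, 997.0, 8.314  # SI units
            a = 4.0 * sigma * m_w / (r_gas * temp_k * rho_w)       # Kelvin term (m)
            return math.exp(math.sqrt(4.0 * a**3 / (27.0 * kappa * d_dry_m**3))) - 1.0

        def ccn_fraction(diams_m, numbers, kappa, s_env=0.003):
            """Fraction of particles activated at environmental supersaturation."""
            act = sum(n for d, n in zip(diams_m, numbers)
                      if critical_supersaturation(d, kappa) <= s_env)
            return act / sum(numbers)

        # Internally mixed aerosol: one effective kappa for the whole population
        diams = [50e-9, 100e-9, 200e-9]      # dry diameters (m), illustrative
        numbers = [800.0, 400.0, 100.0]      # number per bin, illustrative
        for kappa in (0.1, 0.3, 0.6):
            print(f"kappa={kappa}: CCN fraction {ccn_fraction(diams, numbers, kappa):.2f}")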

  14. Fostering assumption-based stress-test thinking in managing groundwater systems: learning to avoid failures due to basic dynamics

    Science.gov (United States)

    Guillaume, Joseph H. A.; El Sawah, Sondoss

    2014-06-01

    Sustainable groundwater resource management can only be achieved if planning processes address the basic dynamics of the groundwater system. Conceptual and distributed groundwater models do not necessarily translate into an understanding of how a plan might operate in reality. Prompted by Australian experiences, 'iterative closed-question modelling' has been used to develop a process of iterative dialogue about management options, objectives and knowledge. Simple hypothetical models of basic system dynamics that satisfy agreed assumptions are used to stress-test the ability of a proposed management plan to achieve desired future conditions. Participants learn from models in which a plan succeeds and fails, updating their assumptions, expectations or plan. Their new understanding is tested against further hypothetical models. The models act as intellectual devices that confront users with new scenarios to discuss. This theoretical approach is illustrated using simple one- and two-cell groundwater models that convey basic notions of capture and the spatial impacts of pumping; a sketch of such a model follows below. Simple extensions can address uncertain climate, managed-aquifer recharge and alternate water sources. Having learnt to address the dynamics captured by these models, participants may be better placed to address local conditions and develop more effective arrangements to achieve management outcomes.
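
    A minimal sketch of such a stress-test model, with all rates, areas and trigger levels invented: heads relax toward a steady state set by recharge, pumping and capture from a connected boundary, and the plan "fails" if the head falls below an agreed trigger.

        def simulate_one_cell(head0, recharge, pumping, storativity, area,
                              conductance, h_boundary, years=50):
            """One-cell balance: S*A*dh/dt = R - Q - C*(h - h_b). The conductance
            term stands in for 'capture' of discharge to a connected stream or
            spring as heads fall; units and magnitudes are illustrative."""
            h, heads = head0, [head0]
            for _ in range(years):
                dh = (recharge - pumping
                      - conductance * (h - h_boundary)) / (storativity * area)
                h += dh
                heads.append(h)
            return heads

        # Stress test: does the proposed pumping rate hold the head above 97 m?
        heads = simulate_one_cell(head0=100.0, recharge=2.0e6, pumping=2.5e6,
                                  storativity=0.1, area=1.0e8, conductance=5.0e4,
                                  h_boundary=95.0, years=100)
        print("final head:", round(heads[-1], 2), "m;",
              "plan fails" if heads[-1] < 97.0 else "plan holds")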

  15. Using Community-Based Participatory Research to Prevent HIV Disparities: Assumptions and Opportunities Identified by The Latino Partnership

    Science.gov (United States)

    Rhodes, Scott D.; Duck, Stacy; Alonzo, Jorge; Daniel-Ulloa, Jason; Aronson, Robert E.

    2013-01-01

    Background HIV disproportionately affects vulnerable populations in the United States (US), including recently arrived immigrant Latinos. However, the current arsenal of effective approaches to increase adherence to risk-reduction strategies and treatment within Latino populations remains insufficient. Methods Our community-based participatory research (CBPR) partnership blends multiple perspectives of community members, organizational representatives, local business leaders, and academic researchers to explore and intervene on HIV risk within Latino populations. We used CBPR to develop, implement, and evaluate two interventions that were found to be efficacious. Results We identified seven assumptions of CBPR as an approach to research, including more authentic study designs, stronger measurement, and improved quality of knowledge gained; increased community capacity to tackle other health disparities; the need to focus on community priorities; increased participation and retention rates; more successful interventions; reduced generalizability; and increased sustainability. Conclusions Despite the advancement of CBPR as an approach to research, key assumptions remain. Further research is needed to compare CBPR to other more traditional approaches to research. Such research would move us from assuming the value of CBPR to identifying its actual value in health disparity reduction. After all, communities carrying disproportionate burden of HIV, including immigrant Latino communities, deserve the best science possible. PMID:23673883

  16. A monotonic method for solving nonlinear optimal control problems

    CERN Document Server

    Salomon, Julien

    2009-01-01

    Initially introduced in the framework of quantum control, the so-called monotonic algorithms have shown excellent numerical results when dealing with various bilinear optimal control problems. This paper aims at presenting a unified formulation of such procedures and the intrinsic assumptions they require. In this framework, we prove the feasibility of the general algorithm. Finally, we explain how these assumptions can be relaxed.

  17. Parametric control charts

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.; Nurdiati, S.

    2002-01-01

    Standard control charts are based on the assumption that the observations are normally distributed. In practice, normality often fails and consequently the false alarm rate is seriously in error. Application of a nonparametric approach is only possible with many Phase I observations. Since nowadays

  18. Parametric control charts

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.; Sri Nurdiati, S.N.

    2004-01-01

    Standard control charts are based on the assumption that the observations are normally distributed. In practice, normality often fails and consequently the false alarm rate is seriously in error. Application of a nonparametric approach is only possible with many Phase I observations. Since nowadays
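
    The cost of a false normality assumption, the failure mode described in both versions of this record, is easy to demonstrate by simulation: the nominal 3-sigma false alarm rate of about 0.27% can be badly wrong for skewed data. The distributions below are chosen purely for illustration.

        import math, random

        def false_alarm_rate(sampler, n_sims=200000, seed=7):
            """Empirical probability that an observation falls outside the
            usual 3-sigma Shewhart limits estimated from a large sample."""
            rng = random.Random(seed)
            xs = [sampler(rng) for _ in range(n_sims)]
            m = sum(xs) / len(xs)
            sd = math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
            lo, hi = m - 3 * sd, m + 3 * sd
            return sum(1 for x in xs if x < lo or x > hi) / len(xs)

        print(f"normal:    {false_alarm_rate(lambda r: r.gauss(0, 1)):.4f}")          # ~0.0027
        print(f"lognormal: {false_alarm_rate(lambda r: r.lognormvariate(0, 1)):.4f}")  # far higher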

  19. Sensitivity of Utility-Scale Solar Deployment Projections in the SunShot Vision Study to Market and Performance Assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Eurek, K.; Denholm, P.; Margolis, R.; Mowers, M.

    2013-04-01

    The SunShot Vision Study explored the potential growth of solar markets if solar prices decreased by about 75% from 2010 to 2020. For the present study, the ReEDS model was used to simulate utility PV and CSP deployment based on several market and performance assumptions - electricity demand, natural gas prices, coal retirements, cost and performance of non-solar renewable technologies, PV resource variability, distributed PV deployment, and solar market supply growth - in addition to the SunShot solar price projections. This study finds that utility-scale solar deployment is highly sensitive to solar prices. Other factors can have significant impacts, particularly electricity demand and natural gas prices.

  20. The effects of material property assumptions on predicted meltpool shape for laser powder bed fusion based additive manufacturing

    Science.gov (United States)

    Teng, Chong; Ashby, Kathryn; Phan, Nam; Pal, Deepankar; Stucker, Brent

    2016-08-01

    The objective of this study was to provide guidance on material specifications for powders used in laser powder bed fusion based additive manufacturing (AM) processes. The methodology was to investigate how different material property assumptions in a simulation affect meltpool prediction and, by corollary, how different material properties affect meltpool formation in AM processes. The sensitivity of meltpool variations to each material property can be used as a guide to help drive future research and to help prioritize material specifications in requirements documents. By identifying which material properties have the greatest effect on outcomes, metrology can be tailored to focus on those properties which matter most, thus reducing costs by eliminating unnecessary testing and property characterizations. Furthermore, this sensitivity study provides insight into which properties require more accurate measurements, thus motivating development of new metrology methods to measure those properties accurately.

  1. SIMPLEST DIFFERENTIAL EQUATION OF STOCK PRICE, ITS SOLUTION AND RELATION TO ASSUMPTION OF BLACK-SCHOLES MODEL

    Institute of Scientific and Technical Information of China (English)

    云天铨; 雷光龙

    2003-01-01

    Two mathematical expressions for stock price can be shown to be completely equivalent under a certain equivalence relation between their coefficients: one, based on a deterministic description, is the solution of the simplest differential equation (S.D.E.), obtained by a method similar to that used in solid mechanics; the other, based on an uncertain (i.e., statistical) description, is the assumption of the Black-Scholes model (A.B-S.M.), in which the density function of the stock price obeys a lognormal distribution. The range of validity of the S.D.E. solution has been shown to be limited to normal conditions of the stock market (no profit or lost-profit news, etc.), so the same range applies to the A.B-S.M. as well.
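
    The lognormal density assumed by the Black-Scholes model is the terminal distribution of geometric Brownian motion, S_t = S_0 exp((mu - sigma^2/2)t + sigma W_t). The sketch below (parameters illustrative) simulates terminal prices and checks the log-price against its theoretical normal mean and variance:

        import math, random

        def simulate_gbm_terminal(s0, mu, sigma, t, n=100000, seed=3):
            """Terminal prices of geometric Brownian motion, whose terminal
            density is the lognormal assumed by the Black-Scholes model."""
            rng = random.Random(seed)
            drift = (mu - 0.5 * sigma**2) * t
            vol = sigma * math.sqrt(t)
            return [s0 * math.exp(drift + vol * rng.gauss(0, 1)) for _ in range(n)]

        prices = simulate_gbm_terminal(s0=100.0, mu=0.08, sigma=0.2, t=1.0)
        logs = [math.log(p) for p in prices]
        m = sum(logs) / len(logs)
        v = sum((x - m) ** 2 for x in logs) / len(logs)
        # ln S_t should be ~ N(ln 100 + 0.06, 0.04)
        print(f"log-price mean {m:.3f} (theory {math.log(100) + 0.06:.3f}), "
              f"var {v:.3f} (theory 0.040)")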

  2. Guideline for Adopting the Local Reaction Assumption for Porous Absorbers in Terms of Random Incidence Absorption Coefficients

    DEFF Research Database (Denmark)

    Jeong, Cheol-Ho

    2011-01-01

    The effects of the flow resistivity and the absorber thickness on the difference between the two surface reaction models are examined and discussed. For a porous absorber backed by a rigid surface, the assumption of local reaction always underestimates the random incidence absorption coefficient, and the local reaction models give errors of less than 10% if the thickness exceeds 120 mm for a flow resistivity of 5000 Nm-4s. As the flow resistivity doubles, a decrease in the required thickness by 25 mm is observed to achieve the same amount of error. For an absorber backed by an air gap, the thickness ratio between the material and the air cavity is important, since the thicker the cavity, the more extendedly reacting the absorber. If the absorber thickness is approximately 40% of the cavity depth, the local reaction models give errors below 10% even for a low flow resistivity case.

  3. Multiquadratic methods, collocation and kriging - comparison with geostatistical model assumptions; Multiquadratische Methode, Kollokation und Kriging - Vergleich unter geostatistischen Modellannahmen

    Energy Technology Data Exchange (ETDEWEB)

    Menz, J. [Technische Univ. Bergakademie Freiberg (Germany). Inst. fuer Markscheidewesen und Geodaesie; Bian Shaofeng [Technical Univ. of Surveying and Mapping, Wuhan (China)

    1998-10-01

    The contribution shows that Hardy's multiquadratic method leads to predictions that are similar in structure to those obtained by collocation. On the basis of geostatistical model assumptions, formulas for calculating the prediction error are derived from the law of error propagation, and the multiquadratic method is compared with collocation using these formulas. The equivalences between collocation and kriging are pointed out, and it is indicated how the predictions can be improved in the Bayesian sense.

  4. Usefulness of an equal-probability assumption for out-of-equilibrium states: A master equation approach

    KAUST Repository

    Nogawa, Tomoaki

    2012-10-18

    We examine the effectiveness of assuming an equal probability for states far from equilibrium. For this aim, we propose a method to construct a master equation for extensive variables describing nonstationary nonequilibrium dynamics. The key point of the method is the assumption that transient states are equivalent to the equilibrium state that has the same extensive variables, i.e., an equal probability holds for microscopic states in nonequilibrium. We demonstrate an application of this method to the critical relaxation of the two-dimensional Potts model by Monte Carlo simulations. While the one-variable description, which is adequate for equilibrium, yields relaxation dynamics that are very fast, the redundant two-variable description well reproduces the true dynamics quantitatively. These results suggest that some class of the nonequilibrium state can be described with a small extension of degrees of freedom, which may lead to an alternative way to understand nonequilibrium phenomena. © 2012 American Physical Society.

  5. How do people learn from negative evidence? Non-monotonic generalizations and sampling assumptions in inductive reasoning.

    Science.gov (United States)

    Voorspoels, Wouter; Navarro, Daniel J; Perfors, Amy; Ransom, Keith; Storms, Gert

    2015-09-01

    A robust finding in category-based induction tasks is for positive observations to raise the willingness to generalize to other categories while negative observations lower the willingness to generalize. This pattern is referred to as monotonic generalization. Across three experiments we find systematic non-monotonicity effects, in which negative observations raise the willingness to generalize. Experiments 1 and 2 show that this effect emerges in hierarchically structured domains when a negative observation from a different category is added to a positive observation. They also demonstrate that this is related to a specific kind of shift in the reasoner's hypothesis space. Experiment 3 shows that the effect depends on the assumptions that the reasoner makes about how inductive arguments are constructed. Non-monotonic reasoning occurs when people believe the facts were put together by a helpful communicator, but monotonicity is restored when they believe the observations were sampled randomly from the environment.

  6. Questioning Assumptions about the Role of Education in American Society: A Review of Schooling in Capitalist America

    Science.gov (United States)

    Rosenberg, Seth

    2004-09-01

    According to many scholars, classrooms in America are overwhelmingly authoritarian and undemocratic. They focus on fragmented knowledge that is disconnected from the students' lives. Proven reforms are resisted at all levels, and systematic progressive change is non-existent nearly a century after the progressive movement. Why is this so? The standard liberal outlook is that the schools are 'broken' and 'neglected', but that they have the potential, with reform, to be a major progressive force in society. This paper questions these assumptions through a review of the seminal educational-economic work by Bowles and Gintis: Schooling in Capitalist America. The major claim of this text is that our educational system's primary role is to mirror, support, stabilize, and reproduce the fundamentally hierarchical and undemocratic social relationships that exist in the majority of American workplaces. The major arguments and evidence of this text are reviewed, and implications for PER will be briefly mentioned.

  7. Uncovering Underlying Assumptions Regarding Education and Technology in Educational Reform Efforts A conversation with Dr. Larry Johnson

    Directory of Open Access Journals (Sweden)

    Gabriela Melano

    2000-11-01

    Full Text Available Educational systems around the world, and specifically in the United States, have long awaited genuine reform. Technology is often perceived as a panacea, if not a crucial instrument, in any educational reform effort. In a conversation with one of his students, Doctor Johnson discusses how the underlying assumptions embedded in our current schooling practices need to be seriously reviewed before any technology strategy is considered. New understandings, as opposed to mere information, are what schools need in order to transform themselves. Finally, Dr. Johnson provides two brief examples, one in the United States and another in México, where hermeneutical approaches have been used in educational reform endeavors.

  8. Herd immunity effect of the HPV vaccination program in Australia under different assumptions regarding natural immunity against re-infection.

    Science.gov (United States)

    Korostil, Igor A; Peters, Gareth W; Law, Matthew G; Regan, David G

    2013-04-01

    Deterministic dynamic compartmental transmission models (DDCTMs) of human papillomavirus (HPV) transmission have been used in a number of studies to estimate the potential impact of HPV vaccination programs. In most cases, the models were built under the assumption that an individual who has cleared HPV infection develops (life-long) natural immunity against re-infection with the same HPV type (known as the SIR scenario). This assumption was also made by two Australian modelling studies evaluating the impact of the National HPV Vaccination Program to assist in the health-economic assessment of male vaccination. An alternative view denying natural immunity after clearance (the SIS scenario) was presented in only one study, although neither scenario has been supported by strong evidence. Some recent findings, however, provide arguments in favour of SIS. We developed HPV transmission models implementing life-long (SIR), limited, and non-existent (SIS) natural immunity. For each model we estimated the herd immunity effect of the ongoing Australian HPV vaccination program and its extension to cover males. Given the Australian setting, we aimed to clarify the extent to which the choice of model structure would influence the estimation of this effect. A statistically robust and efficient calibration methodology was applied to ensure the credibility of our results. We observed that for non-SIR models the herd immunity effect, measured as the relative reduction in HPV prevalence in the unvaccinated population, was much more pronounced than for the SIR model. For example, with vaccine efficacy of 95% for females and 90% for males, the reductions for HPV-16 were 3% in females and 28% in males for the SIR model, and at least 30% (females) and 60% (males) for non-SIR models. The magnitude of these differences implies that evaluations of the impact of vaccination programs using DDCTMs should incorporate several model structures until our understanding of natural immunity is improved.
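
    The structural difference between the scenarios can be seen in a toy model, far simpler than the DDCTMs above and not meant to reproduce their numbers: clearance either retires individuals into an immune pool (SIR) or returns them to the susceptible pool (SIS). All rates below are invented.

        def endemic_prevalence(model, vacc=0.0, beta=1.0, gamma=0.5, mu=1/35,
                               years=400, dt=0.05):
            """Toy per-capita model with demographic turnover at rate mu.
            'SIR': clearance gives life-long natural immunity;
            'SIS': cleared individuals become susceptible again.
            A fraction `vacc` of new entrants is fully protected by vaccination."""
            s, i = 0.99, 0.01
            for _ in range(int(years / dt)):
                inf, clr = beta * s * i, gamma * i
                ds = mu * (1.0 - vacc) - inf - mu * s + (clr if model == "SIS" else 0.0)
                di = inf - clr - mu * i
                s, i = s + dt * ds, i + dt * di
            return max(i, 0.0)

        for model in ("SIR", "SIS"):
            print(model, f"no vaccine: {endemic_prevalence(model):.3f}, "
                         f"30% coverage: {endemic_prevalence(model, vacc=0.3):.3f}")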

  9. Estimates of Late Pleistocene Runoff in Estancia Drainage Basin, Central New Mexico: Climate Assumptions vs. Model Results

    Science.gov (United States)

    Menking, K. M.; Anderson, R. Y.; Syed, K. H.; Shafike, N. G.

    2002-12-01

    The climatic conditions leading to highstands of "pluvial" Lake Estancia in central New Mexico have been a matter of considerable debate, resulting in a wide range of estimates for Pleistocene precipitation and temperature in the southwestern United States. Using a simple hydrologic balance approach, Leopold (1951) calculated that precipitation was 50% greater than modern, based on the assumption that summer temperatures were 9°C colder while winter temperatures were unchanged. In contrast, Galloway (1970) called for temperature decreases of 10-11°C throughout the year and a reduction in mean annual precipitation of 14% to raise Lake Estancia to its highstand. In still another study, Brakenridge suggested that highstands could be achieved with no change in precipitation if monthly temperatures were reduced by 7-8°C. Experiments with three physically based, continuous-time models to simulate surface runoff (USDA Soil and Water Assessment Tool), groundwater flow (MODFLOW with the LAK2 package), and lake evaporation (the lake energy balance model of Hostetler and Bartlein, 1990) indicate that none of these proposed full-glacial climate scenarios could have produced a highstand lake. In particular, previous workers appear to have overestimated the reduction in evaporation rates associated with their proposed temperature changes, suggesting that using empirical relationships between modern air temperature and evaporation to predict late Pleistocene evaporation is problematic. Furthermore, model-determined reductions in lake evaporation are insufficient to allow for lake expansion as suggested by Galloway and Brakenridge. Even under Leopold's assumption that precipitation increased by 50%, modeled runoff appears to be insufficient to raise Lake Estancia more than a few meters above the lake floor.
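
    The hydrologic balance argument at issue reduces, in its simplest closed-basin form, to equating basin runoff with net evaporation from the lake surface. The sketch below (all numbers invented, not the Estancia values) shows how strongly the equilibrium lake area depends on the assumed evaporation rate:

        def equilibrium_lake_area(basin_area_km2, runoff_mm, lake_evap_mm,
                                  lake_precip_mm):
            """Closed-basin steady state: runoff from the tributary area must
            balance net evaporation from the lake,
            runoff * (A_basin - A_lake) = (E_lake - P_lake) * A_lake."""
            net_evap = lake_evap_mm - lake_precip_mm
            if net_evap <= 0:
                raise ValueError("lake would grow without bound")
            return basin_area_km2 * runoff_mm / (runoff_mm + net_evap)

        # Illustrative only: lower evaporation expands the equilibrium lake
        for evap in (1400.0, 900.0, 700.0):
            a = equilibrium_lake_area(5000.0, runoff_mm=20.0,
                                      lake_evap_mm=evap, lake_precip_mm=350.0)
            print(f"E={evap:.0f} mm/yr -> equilibrium lake area ~{a:.0f} km2")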

  10. Right to Development and Right to the City : A Proposal of Human Rights Categories Universal as assumptions Citizenship

    Directory of Open Access Journals (Sweden)

    Alessandra Danielle Carneiro dos Santos Hilário

    2016-05-01

    Full Text Available This article discusses the Right to the City, in a broad conceptual dimension, and its dialectical relationship with the Universal Declaration of Human Rights of 1948 and its categories of universalism and cultural relativism. The Right to the City (RtC) is framed as one of the categories of the Human Right to Development that descend from the Universal Declaration of Human Rights. Linked to this assumption, the discussion of universalist and cultural-relativist theories brings to the fore important questions regarding the condition of the RtC, since in its current design, trampled by the legacy of neoliberalism, this right demonstrates the need for authoritative action by the State, given its nature as a fundamental human right of the third dimension. Through the RtC, economic, social and cultural rights are asserted, requiring positive action by the state to guarantee compliance with this human right. Along these lines, discussions about the concept of law, morality, liberalism, and the effectiveness and universality of human rights theories and cultural relativism are relevant in dialectic with the RtC and its complexity. It starts from the assumption that the Universal Declaration of Human Rights and other declarations of universal descent (despite criticism) make imperative a closer examination of the concept, provision, guarantee and effectiveness of fundamental human rights, which may lead to a mixed application of universalist and relativist theories when analyzed from the perspective of these institutes. The Human Right to Development (RtD) presupposes notions of environmental sustainability and economic democracy, with qualified participation of social subjects (broad citizenship), seen from a continuous and articulated perspective as guiding the development process.

  11. Assumption Centred Modelling of Ecosystem Responses to CO2 at Six US Atmospheric CO2 Enrichment Experiments.

    Science.gov (United States)

    Walker, A. P.; De Kauwe, M. G.; Medlyn, B. E.; Zaehle, S.; Luus, K. A.; Ryan, E.; Xia, J.; Norby, R. J.

    2015-12-01

    Plant photosynthetic rates increase and stomatal apertures decrease in response to elevated atmospheric CO2 (eCO2), increasing both plant carbon (C) availability and water use efficiency. These physiological responses to eCO2 are well characterised and understood; however, the ecological effects of these responses as they cascade through a suite of plant and ecosystem processes are complex and subject to multiple interactions and feedbacks. Therefore, the response of the terrestrial carbon sink to increasing atmospheric CO2 remains the largest uncertainty in global C cycle modelling to date, and is a huge contributor to uncertainty in climate change projections. Phase 2 of the FACE Model-Data Synthesis (FACE-MDS) project synthesises ecosystem observations from five long-term Free-Air CO2 Enrichment (FACE) experiments and one open-top chamber (OTC) experiment to evaluate the assumptions of a suite of terrestrial ecosystem models. The experiments are: the evergreen needleleaf Duke Forest FACE (NC), the deciduous broadleaf Oak Ridge FACE (TN), the Prairie Heating and CO2 Enrichment FACE (WY), the Nevada Desert FACE, and the evergreen scrub oak OTC (FL). An assumption-centred approach is being used to analyse: the interaction between eCO2 and water limitation on plant productivity; the interaction between eCO2 and temperature on plant productivity; whether the increased rates of soil decomposition observed in many eCO2 experiments can account for model deficiencies in N uptake shown during Phase 1 of the FACE-MDS; and the tracing of carbon through the ecosystem to identify the exact cause of changes in ecosystem C storage.

  12. Social aspects of revitalization of rural areas. Implementation of the rural revival programme in lodzkie voivodeship. Assumptions for sociological research

    Directory of Open Access Journals (Sweden)

    Pamela Jeziorska-Biel

    2012-04-01

    Full Text Available Essential elements of the rural renovation programme process are: stimulating the activity of local communities and cooperation for development, while preserving social identity, cultural heritage and the natural environment. The implementation of a rural revival programme in Poland, the Sectoral Operational Programme "The Restructuring and Modernisation of the Food Sector and the Development of Rural Areas in 2004-2006" (action 2.3, "Rural renovation and protection and preservation of cultural heritage"), evokes criticism. A wide discussion is being carried on amongst researchers, politicians, social activists, and local government practitioners. The main question remains: is the rural renovation process in Poland conducted in accordance with the rules followed in European countries, or is it only a new formula of rural modernisation with the use of European funds? The authors join the discussion and, in the second part of the article, present the assumptions of sociological research. The aim of the analysis is to grasp the essence of the revitalization of rural areas located in Łódzkie voivodeship, and to analyse the specificity of rural Revival Programmes. What is the scope and manner of use of local capital? Are the results obtained from implementing a rural revival programme in 2004-2006 within the scope of sustainable development? What activities predominate in the process of project implementation: rural modernisation, revitalization of the rural areas, barrier removal and change in infrastructure, or the creation of social capital and subjectivity of the local community? Has the process of rural renovation in Łódzkie voivodeship got the so-called "social face" and, if so, to what extent? The major assumption is that the rural renovation programme in Łódzkie voivodeship relates more to the material aspects of revitalization than to "spirituality".

  13. Assessing the Sensitivity of a Reservoir Management System Under Plausible Assumptions About Future Climate Over Seasons to Decades

    Science.gov (United States)

    Ward, M. N.; Brown, C. M.; Baroang, K. M.; Kaheil, Y. H.

    2011-12-01

    We illustrate an analysis procedure that explores the robustness and overall productivity of a reservoir management system under plausible assumptions about climate fluctuation and change. Results are presented based on a stylized version of a multi-use reservoir management model adapted from Angat Dam, Philippines. It represents a modest-sized seasonal storage reservoir in a climate with a pronounced dry season. The reservoir management model focuses on October-March, during which climatological inflow declines due to the arrival of the dry season, and reservoir management becomes critical and challenging. Inflow is assumed to be impacted by climate fluctuations representing interannual variation (white noise), decadal to multidecadal variation (MDV, here represented by a stochastic autoregressive process) and global change (GC, here represented by a systematic linear trend in seasonal inflow totals over the simulation period of 2008-2047). Reservoir reliability, and the risk of extreme persistent water shortfall, is assessed under different combinations and magnitudes of GC and MDV. We include an illustration of adaptive management, using seasonal forecasts and updated climate normals. A set of seasonal forecast and observed inflow values is generated for 2008-2047 by randomly rearranging the forecast-observed pairs for 1968-2007. Then, trends are imposed on the observed series, with differing assumptions about the extent to which the seasonal forecasts can be expected to track the trend. We consider the framework presented here well suited to providing insights about managing climate risks in reservoir operations, providing guidance on the expected benefits and risks of different strategies and climate scenarios.
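
    A stylized inflow generator of the kind described, white noise plus an AR(1) multidecadal component plus a linear global-change trend, and a toy storage-release loop might look like the following. Every parameter is an illustrative assumption, not a value from the Angat Dam model.

        import random

        def inflow_series(years=40, mean=100.0, white_sd=15.0, mdv_phi=0.9,
                          mdv_sd=5.0, gc_trend=-0.5, seed=11):
            """Seasonal inflow totals: white noise (interannual variation),
            an AR(1) process (MDV) and a linear GC trend, as in the stylized
            2008-2047 simulation period."""
            rng = random.Random(seed)
            mdv, series = 0.0, []
            for t in range(years):
                mdv = mdv_phi * mdv + rng.gauss(0.0, mdv_sd)
                series.append(mean + gc_trend * t + mdv + rng.gauss(0.0, white_sd))
            return series

        def reliability(inflows, capacity=150.0, demand=90.0, start=75.0):
            """Fraction of seasons in which the full demand can be released."""
            storage, met = start, 0
            for q in inflows:
                storage = min(capacity, storage + q)
                release = min(demand, storage)
                met += release >= demand
                storage -= release
            return met / len(inflows)

        print(f"reliability under MDV+GC inflows: {reliability(inflow_series()):.2f}")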

  14. Assumption Trade-Offs When Choosing Identification Strategies for Pre-Post Treatment Effect Estimation: An Illustration of a Community-Based Intervention in Madagascar.

    Science.gov (United States)

    Weber, Ann M; van der Laan, Mark J; Petersen, Maya L

    2015-03-01

    Failure (or success) in finding a statistically significant effect of a large-scale intervention may be due to choices made in the evaluation. To highlight the potential limitations and pitfalls of some common identification strategies used for estimating causal effects of community-level interventions, we apply a roadmap for causal inference to a pre-post evaluation of a national nutrition program in Madagascar. Selection into the program was non-random and strongly associated with the pre-treatment (lagged) outcome. Using structural causal models (SCM), directed acyclic graphs (DAGs) and simulated data, we illustrate that an estimand with the outcome defined as the post-treatment outcome controls for confounding by the lagged outcome but not by possible unmeasured confounders. Two separate differencing estimands (of the pre- and post-treatment outcome) have the potential to adjust for a certain type of unmeasured confounding, but introduce bias if the additional identification assumptions they rely on are not met. In order to illustrate the practical impact of choice between three common identification strategies and their corresponding estimands, we used observational data from the community nutrition program in Madagascar to estimate each of these three estimands. Specifically, we estimated the average treatment effect of the program on the community mean nutritional status of children 5 years and under and found that the estimate based on the post-treatment estimand was about a quarter of the magnitude of either of the differencing estimands (0.066 SD vs. 0.26-0.27 SD increase in mean weight-for-age z-score). Choice of estimand clearly has important implications for the interpretation of the success of the program to improve nutritional status of young children. A careful appraisal of the assumptions underlying the causal model is imperative before committing to a statistical model and progressing to estimation. However, knowledge about the data
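
    The trade-off described here can be reproduced on simulated data: when enrollment tracks the lagged outcome and a time-invariant unmeasured confounder, a post-only contrast and a pre-post differencing contrast are biased in opposite directions. The data-generating process below is invented for illustration, not taken from the Madagascar evaluation.

        import random

        def simulate(n=200000, effect=0.25, seed=5):
            """Pre/post outcomes where selection into treatment follows a low
            lagged outcome; u is an unmeasured, time-invariant confounder."""
            rng = random.Random(seed)
            data = []
            for _ in range(n):
                u = rng.gauss(0, 1)
                pre = u + rng.gauss(0, 1)
                t = 1 if pre < 0 else 0          # worse-off communities enroll
                post = u + effect * t + rng.gauss(0, 1)
                data.append((t, pre, post))
            return data

        def contrast(data, value):
            groups = {0: [], 1: []}
            for t, pre, post in data:
                groups[t].append(value(pre, post))
            return sum(groups[1]) / len(groups[1]) - sum(groups[0]) / len(groups[0])

        data = simulate()
        post_only = contrast(data, lambda pre, post: post)
        differencing = contrast(data, lambda pre, post: post - pre)
        # Both naive estimands miss the true 0.25, in opposite directions.
        print(f"post-only {post_only:+.2f}, differencing {differencing:+.2f}, truth +0.25")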

  15. Auditory and visual stream segregation in children and adults: an assessment of the amodality assumption of the 'sluggish attentional shifting' theory of dyslexia.

    Science.gov (United States)

    Lallier, Marie; Thierry, Guillaume; Tainturier, Marie-Josèphe; Donnadieu, Sophie; Peyrin, Carole; Billard, Catherine; Valdois, Sylviane

    2009-12-11

    Among the hypotheses relating dyslexia to a temporal processing disorder, Hari and Renvall (Hari, R., Renvall, H., 2001. Impaired processing of rapid stimulus sequences in dyslexia. Trends Cogn. Sci. 5, 525-532) argued that dyslexic individuals show difficulties at an attentional level, through sluggish attentional shifting (SAS), in all sensory modalities. However, the amodality assumption of the SAS theory was never straightforwardly assessed in the same group of dyslexic participants using similar paradigms in both the visual and auditory modalities. Here, the attentional sequential performance of control and dyslexic participants was evaluated using rapid serial presentation paradigms measuring individual stream segregation thresholds in the two modalities. The first experiment, conducted on French dyslexic children with a phonological disorder, revealed an SAS in the auditory modality only, which was strongly related to reading performance. The second experiment, carried out on British dyslexic young adults with a phonological disorder using the same auditory segregation task but a different visual paradigm, revealed an SAS in both the visual and the auditory modalities. In addition, a relationship was found in this group between SAS, poor reading and poor phonological skills. Two further control experiments showed that differences in task design or participants' language between Experiments 1 and 2 could not account for the differences in visual segregation patterns. Overall, our results support the view that the auditory SAS plays a role in developmental dyslexia via its impact on phonological abilities. In addition, a visual temporal disorder in dyslexia might emerge at a later developmental stage, when the visual system normally becomes more expert at rapid temporal processing.

  16. A Study of the Effects of Underlying Assumptions in the Reduction of Multi-Object Photometry of Transiting Exoplanets

    Science.gov (United States)

    Fitzpatrick, M. Ryleigh; Silva Martins-Filho, Walter; Griffith, Caitlin Ann; Pearson, Kyle; Zellem, Robert Thomas; AzGOE

    2016-10-01

    The analysis of ground-based photometric observations of planetary transits must treat the effects of the Earth's atmosphere, which exceed the signal of the extrasolar planet. Generally, this is achieved by dividing the signal of the host star and planet by that of nearby field stars to reveal the lightcurve. The lightcurve is then fit to a model of the planet's orbit and physical characteristics, also taking into account the characteristics of the star. The fits to the in- and out-of-transit data establish the depth of the lightcurve. The question arises: what is the best way to select and treat reference stars so as to characterize and remove the shared atmospheric systematics that plague the transit signal? To explore these questions we examine the effects of several assumptions that underlie the calculation of the lightcurve depth. Our study involves repeated photometric observations of hot Jupiter primary transits in the U and B filters. Data were taken with the University of Arizona's Kuiper 1.55 m telescope/Mont4K CCD. Each exoplanet observed offers a unique field with stars of various brightnesses, spectral types and angular distances from the host star. While these observations are part of a larger study of the Rayleigh scattering signature of hot Jupiter exoplanets, here we study the effects of various choices made during reduction, specifically the treatment of reference stars and atmospheric systematics. We calculate the lightcurve for all permutations of reference stars, considering several out-of-transit assumptions (e.g., linear, quadratic or exponential). We assess the sensitivity of the transit depths based on the spread of values. In addition, we look for characteristics that minimize the scatter in the reduced lightcurve and analyze the effects of the treatment of individual variables on the resultant lightcurve model. Here we present the results of an in-depth statistical analysis that classifies the effect of various parameters and choices involved in

  17. Curriculum Control in Distance Education.

    Science.gov (United States)

    Chesterton, Paul

    1985-01-01

    The nature of distance education is to shift the locus of curriculum control toward the institution and its staff and away from the students. This imposes a responsibility on the institution to examine and evaluate the values and assumptions underlying the decision-making and the implications of the patterns of control that emerge. (Author/MSE)

  18. Convergence analysis of cautious control

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yanxia; GUO Lei

    2006-01-01

    In this paper, we present a theoretical analysis of the stability and convergence of cautious control, which has advantages over traditional certainty-equivalence adaptive control, since it takes the parameter estimation error into account in the design, and is also one-step-ahead optimal in the mean square sense under Gaussian assumptions.

  19. Thai SF-36 health survey: tests of data quality, scaling assumptions, reliability and validity in healthy men and women

    Directory of Open Access Journals (Sweden)

    Sleigh Adrian

    2008-07-01

    Full Text Available Abstract Background Since its translation to Thai in 2000, the SF-36 Health Survey has been used extensively in many different clinical settings in Thailand. Its popularity has increased despite the absence of published evidence that the translated instrument satisfies scoring assumptions, the psychometric properties required for valid interpretation of the SF-36 summated ratings scales. The purpose of this paper was to examine these properties and to report on the reliability and validity of the Thai SF-36 in a non-clinical general population. Methods 1345 distance-education university students who live in all areas of Thailand completed a questionnaire comprising the Thai SF-36 (Version 1). Median age was 31 years. Psychometric tests recommended by the International Quality of Life Assessment Project were used. Results Data quality was satisfactory: the questionnaire completion rate was high (97.5%) and missing data rates were low. … Conclusion The summated ratings method can be used for scoring the Thai SF-36. The instrument was found to be reliable and valid for use in a general non-clinical population. Version 2 of the SF-36 could improve ceiling and floor effects in the role functioning scales. Further work is warranted to refine items that measure the concepts of social functioning, vitality and mental health to improve the reliability and discriminant validity of these scales.
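
    Tests of scaling assumptions for summated ratings scales typically include internal-consistency reliability. As a minimal illustration (toy data, not the Thai sample), the following computes Cronbach's alpha for one multi-item scale:

        def cronbach_alpha(items):
            """items: one list of item scores per respondent.
            alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
            k, n = len(items[0]), len(items)
            def var(xs):
                m = sum(xs) / len(xs)
                return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
            item_vars = [var([resp[j] for resp in items]) for j in range(k)]
            total_var = var([sum(resp) for resp in items])
            return k / (k - 1) * (1 - sum(item_vars) / total_var)

        # Illustrative 4-item scale scored 1-5 for six respondents
        scores = [[4, 4, 5, 4], [2, 3, 2, 2], [5, 5, 4, 5],
                  [3, 3, 3, 4], [1, 2, 1, 2], [4, 5, 4, 4]]
        print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")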

  20. The metaphysics of D-CTCs: On the underlying assumptions of Deutsch's quantum solution to the paradoxes of time travel

    Science.gov (United States)

    Dunlap, Lucas

    2016-11-01

    I argue that Deutsch's model for the behavior of systems traveling around closed timelike curves (CTCs) relies implicitly on a substantive metaphysical assumption. Deutsch is employing a version of quantum theory with a significantly supplemented ontology of parallel existent worlds, which differ in kind from the many worlds of the Everett interpretation. Standard Everett does not support the existence of multiple identical copies of the world, which the D-CTC model requires. This has been obscured because he often refers to the branching structure of Everett as a "multiverse", and describes quantum interference by reference to parallel interacting definite worlds. But he admits that this is only an approximation to Everett. The D-CTC model, however, relies crucially on the existence of a multiverse of parallel interacting worlds. Since his model is supplemented by structures that go significantly beyond quantum theory, and play an ineliminable role in its predictions and explanations, it does not represent a quantum solution to the paradoxes of time travel.

  1. Flanking p10 contribution and sequence bias in matrix based epitope prediction: revisiting the assumption of independent binding pockets

    Directory of Open Access Journals (Sweden)

    Parry Christian S

    2008-10-01

    Full Text Available Abstract Background Eluted natural peptides from major histocompatibility molecules show patterns of conserved residues. Crystallographic structures show that the bound peptide in the class II major histocompatibility complex adopts a near-uniform polyproline II-like conformation. In this way, allele-specific favoured residues are able to anchor into pockets in the binding groove, leaving other peptide side chains exposed for recognition by T cells. The anchor residues form a motif. This sequence pattern can be used to screen large sequences for potential epitopes. Quantitative matrices extend the motif idea to include the contribution of non-anchor peptide residues. This report examines two new matrices that extend the binding register to incorporate the polymorphic p10 pocket of human leukocyte antigen DR1. Their performance is quantified against experimental binding measurements and against the canonical nine-residue register matrix. Results One new matrix shows significant improvement over the base matrix; the other does not. The new matrices differ in the sequence of the peptide library. Conclusion One of the extended quantitative matrices showed significant improvement in prediction over the original nine-residue matrix and over the other extended matrix. Proline in the sequence of the peptide library of the better performing matrix presumably stabilizes the peptide conformation through neighbour interactions. Such interactions may influence epitope prediction in this test of quantitative matrices. This calls into question the assumption of the independent contribution of individual binding pockets.
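
    Quantitative-matrix screening of the kind examined rests on the independent-pockets assumption: a register's score is the sum of per-pocket contributions. The sketch below scans a peptide against a toy ten-pocket matrix extended to p10; all scores are invented, where a real matrix would be fit to binding data.

        AMINO = "ACDEFGHIKLMNPQRSTVWY"
        MATRIX = [{aa: 0.0 for aa in AMINO} for _ in range(10)]  # pockets p1..p10
        MATRIX[0].update({"W": 2.1, "Y": 1.8, "F": 1.6})  # hydrophobic p1 anchor
        MATRIX[3].update({"L": 0.9, "M": 0.7})            # p4 preference
        MATRIX[9].update({"P": 0.8})                      # flanking p10 contribution

        def score_register(peptide, start):
            """Additive score under the independent-pockets assumption:
            total = sum of per-pocket contributions over a 10-residue register."""
            return sum(MATRIX[i][peptide[start + i]] for i in range(10))

        def best_register(peptide):
            return max((score_register(peptide, s), s)
                       for s in range(len(peptide) - 9))

        score, start = best_register("GSWAALAAAAPKTE")
        print(f"best register starts at {start}, score {score:.1f}")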

  2. CCN predictions using simplified assumptions of organic aerosol composition and mixing state: a synthesis from six different locations

    Directory of Open Access Journals (Sweden)

    B. Ervens

    2009-10-01

    Full Text Available An accurate but simple quantification of the fraction of aerosol particles that can act as cloud condensation nuclei (CCN) is needed for implementation in large-scale models. Data on aerosol size distribution, chemical composition, and CCN concentration from six different locations have been analyzed to explore the extent to which simple assumptions of composition and mixing state of the organic fraction can reproduce measured CCN number concentrations.

    Fresher pollution aerosol, as encountered in Riverside, CA, and the ship channel in Houston, TX, cannot be represented without knowledge of more complex (size-resolved) composition. For aerosol that has experienced processing (Mexico City, Holme Moss (UK), Point Reyes (CA), and Chebogue Point (Canada)), CCN can be predicted within a factor of two assuming either externally or internally mixed soluble organics, although these simplified compositions/mixing states might not represent the actual properties of ambient aerosol populations. Under typical conditions, a factor of two uncertainty in CCN concentration translates to an uncertainty of ~15% in cloud drop concentration, which might be adequate for large-scale models given the much larger uncertainty in cloudiness.

  3. A computer code for forward calculation and inversion of the H/V spectral ratio under the diffuse field assumption

    Science.gov (United States)

    García-Jerez, Antonio; Piña-Flores, José; Sánchez-Sesma, Francisco J.; Luzón, Francisco; Perton, Mathieu

    2016-12-01

    During a quarter of a century, the main characteristics of the horizontal-to-vertical spectral ratio of ambient noise (HVSRN) have been extensively used for site effect assessment. In spite of the uncertainties about the optimum theoretical model to describe these observations, over the last decade several schemes for inversion of the full HVSRN curve for near-surface surveying have been developed. In this work, a computer code for forward calculation of H/V spectra based on the diffuse field assumption (DFA) is presented and tested. It takes advantage of the recently stated connection between the HVSRN and the elastodynamic Green's function which arises from ambient noise interferometry theory. The algorithm allows for (1) a natural calculation of the imaginary parts of the Green's functions by using suitable contour integrals in the complex wavenumber plane, and (2) separate calculation of the contributions of Rayleigh, Love, P-SV and SH waves as well. The stability of the algorithm at high frequencies is preserved by means of an adaptation of Wang's orthonormalization method to the calculation of dispersion curves, surface-wave medium responses and contributions of body waves. This code has been combined with a variety of inversion methods to make up a powerful tool for passive seismic surveying.

  4. Rating leniency and halo in multisource feedback ratings: testing cultural assumptions of power distance and individualism-collectivism.

    Science.gov (United States)

    Ng, Kok-Yee; Koh, Christine; Ang, Soon; Kennedy, Jeffrey C; Chan, Kim-Yin

    2011-09-01

    This study extends multisource feedback research by assessing the effects of rater source and raters' cultural value orientations on rating bias (leniency and halo). Using a motivational perspective of performance appraisal, the authors posit that subordinate raters followed by peers will exhibit more rating bias than superiors. More important, given that multisource feedback systems were premised on low power distance and individualistic cultural assumptions, the authors expect raters' power distance and individualism-collectivism orientations to moderate the effects of rater source on rating bias. Hierarchical linear modeling on data collected from 1,447 superiors, peers, and subordinates who provided developmental feedback to 172 military officers show that (a) subordinates exhibit the most rating leniency, followed by peers and superiors; (b) subordinates demonstrate more halo than superiors and peers, whereas superiors and peers do not differ; (c) the effects of power distance on leniency and halo are strongest for subordinates than for peers and superiors; (d) the effects of collectivism on leniency were stronger for subordinates and peers than for superiors; effects on halo were stronger for subordinates than superiors, but these effects did not differ for subordinates and peers. The present findings highlight the role of raters' cultural values in multisource feedback ratings.

  5. A computer code for forward calculation and inversion of the H/V spectral ratio under the diffuse field assumption

    CERN Document Server

    García-Jerez, Antonio; Sánchez-Sesma, Francisco J; Luzón, Francisco; Perton, Mathieu

    2016-01-01

    During a quarter of a century, the main characteristics of the horizontal-to-vertical spectral ratio of ambient noise (HVSRN) have been extensively used for site effect assessment. In spite of the uncertainties about the optimum theoretical model to describe these observations, several schemes for inversion of the full HVSRN curve for near-surface surveying have been developed over the last decade. In this work, a computer code for forward calculation of H/V spectra based on the diffuse field assumption (DFA) is presented and tested. It takes advantage of the recently stated connection between the HVSRN and the elastodynamic Green's function which arises from ambient noise interferometry theory. The algorithm allows for (1) a natural calculation of the imaginary parts of the Green's functions by using suitable contour integrals in the complex wavenumber plane, and (2) separate calculation of the contributions of Rayleigh, Love, P-SV and SH waves as well. The stability of the algorithm at high frequencies is preserv...

  6. The Fitness of Assumptions and an Alternative Model for Funding the Public Sector Pension Scheme: The Case of Rio Grande do Sul

    Directory of Open Access Journals (Sweden)

    Paulo Roberto Caldart

    2014-12-01

    Full Text Available The research presented herein has two objectives. First, this study tests whether actuarial assumptions for public sector pension schemes in Brazil adhere to reality and whether changing these assumptions might affect the results, particularly with respect to life tables and wage growth assumptions. The paper shows that the best-fit life table is AT 2000 for males aggravated by one year, which involves a longer life expectancy than the life table proposed under current legislation (IBGE 2009). The data also show that actual wage growth was 4.59% per year from 2002 to 2012, as opposed to the 1% wage increase proposed by the same legislation. Changing these two assumptions increases the actuarial imbalance for a representative individual by 18.17% after adopting the adjusted life table, or by 98.30% after revising the wage growth assumption. With respect to its second objective, this paper proposes alternative funding mechanisms in which the local pension scheme provides the funded component of the benefit, complemented by the local government in a pay-as-you-go manner. The database utilized was for the state of Rio Grande do Sul in the month of November 2011. The results are thus restricted to Rio Grande do Sul.

  7. Sensitivity of Mission Energy Consumption to Turboelectric Distributed Propulsion Design Assumptions on the N3-X Hybrid Wing Body Aircraft

    Science.gov (United States)

    Felder, James L.; Tong, Michael T.; Chu, Julio

    2012-01-01

    In a previous study by the authors it was shown that the N3-X, a 300-passenger hybrid wing body (HWB) aircraft with a turboelectric distributed propulsion (TeDP) system, was able to meet the NASA Subsonic Fixed Wing (SFW) project goal for N+3 generation aircraft of at least a 60% reduction in total energy consumption as compared to the best-in-class current generation aircraft. This previous study combined technology assumptions that represented the highest anticipated values that could be matured to technology readiness level (TRL) 4-6 by 2030. This paper presents the results of a sensitivity analysis of the total mission energy consumption to reductions in each key technology assumption. Of the parameters examined, the mission total energy consumption was most sensitive to changes in total pressure loss in the propulsor inlet. The baseline inlet internal pressure loss is assumed to be an optimistic 0.5%. An inlet pressure loss of 3% increases the total energy consumption by 9%. However, changes to reduce inlet pressure loss can result in additional distortion to the fan, which can reduce fan efficiency, or vice versa. It is very important that the inlet and fan be analyzed and optimized as a single unit. The turboshaft hot section is assumed to be made of ceramic matrix composite (CMC) with a 3000 °F maximum material temperature. Reducing the maximum material temperature to 2700 °F increases the mission energy consumption by only 1.5%. Thus achieving a 3000 °F temperature in CMCs is important but not central to achieving the energy consumption objective of the N3-X/TeDP. A key parameter in the efficiency of superconducting motors and generators is the size of the superconducting filaments in the stator. The size of the superconducting filaments in the baseline model is assumed to be 10 microns. A 40-micron filament, which represents current technology, results in a 200% increase in AC losses in the motor and generator stators. This analysis shows that for a system with 40

  8. Assessing the validity of station location assumptions made in the calculation of the geomagnetic disturbance index, Dst

    Science.gov (United States)

    Gannon, Jennifer

    2012-01-01

    In this paper, the effects of the assumptions made in the calculation of the Dst index with regard to longitude sampling, hemisphere bias, and latitude correction are explored. The insights gained from this study will allow operational users to better understand the local implications of the Dst index and will lead to future index formulations that are more physically motivated. We recompute the index using 12 longitudinally spaced low-latitude stations, including the traditional four (Honolulu, Kakioka, San Juan, and Hermanus), and compare it to the standard United States Geological Survey definitive Dst. We examine hemisphere balance by comparing stations at equal geomagnetic latitudes in the Northern and Southern hemispheres. We further separate the 12-station time series into two hemispheric indices and find measurable differences in the traditional Dst formulation due to the undersampling of the Southern Hemisphere relative to the Northern Hemisphere. To analyze the effect of latitude correction, we plot the latitudinal variation of a disturbance observed during 2005 using two separate longitudinal observatory chains. We separate these by activity level and find that while the traditional cosine form fits the latitudinal distributions well at low levels of activity, it does not fit the observed variation at higher levels of disturbance. This suggests that the traditional latitude scaling is insufficient during active times. The combined effect of the Northern Hemisphere bias and the inadequate latitude scaling is that the standard correction in the traditional Dst index underestimates the true disturbance by 10–30 nT for storms with a main-phase deviation greater than 150 nT.
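
    For orientation, the sketch below implements the traditional flavor of the calculation the paper revisits: average the horizontal-component disturbances of the four low-latitude stations and scale by the mean cosine of their geomagnetic latitudes. The station latitudes are approximate, the disturbance values are invented, and the definitive USGS processing (quiet-time baselines, secular and Sq corrections) is deliberately omitted.

        import numpy as np

        # Approximate geomagnetic latitudes (deg) of the four traditional stations;
        # the hourly H-component disturbances (nT) below are hypothetical.
        lat_deg = {"Honolulu": 21.0, "Kakioka": 27.0, "San Juan": 28.0, "Hermanus": -34.0}
        delta_h = {"Honolulu": -85.0, "Kakioka": -70.0, "San Juan": -95.0, "Hermanus": -78.0}

        d = np.array([delta_h[s] for s in lat_deg])
        cos_lat = np.cos(np.radians([lat_deg[s] for s in lat_deg]))
        dst = d.mean() / cos_lat.mean()   # traditional cosine latitude correction
        print(f"Dst ~ {dst:.1f} nT")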

  9. Quantifying tap-to-household water quality deterioration in urban communities in Vellore, India: The impact of spatial assumptions.

    Science.gov (United States)

    Alarcon Falconi, Tania M; Kulinkina, Alexandra V; Mohan, Venkata Raghava; Francis, Mark R; Kattula, Deepthi; Sarkar, Rajiv; Ward, Honorine; Kang, Gagandeep; Balraj, Vinohar; Naumova, Elena N

    2017-01-01

    Municipal water sources in India have been found to be highly contaminated, with further water quality deterioration occurring during household storage. Quantifying water quality deterioration requires knowledge of the exact source tap and the length of water storage at the household, which is not usually known. This study presents a methodology to link source and household stored water, and explores the effects of spatial assumptions on the association between tap-to-household water quality deterioration and enteric infections in two semi-urban slums of Vellore, India. To determine a possible water source for each household sample, we paired household and tap samples collected on the same day using three spatial approaches implemented in GIS: minimum Euclidean distance; minimum network distance; and inverse network-distance weighted average. Logistic and Poisson regression models were used to determine associations between water quality deterioration and household-level characteristics, and between diarrheal cases and water quality deterioration. On average, 60% of households had higher fecal coliform concentrations in household samples than at source taps. Only the weighted-average approach detected a higher risk of water quality deterioration for households that do not purify water and that have animals in the home (RR=1.50 [1.03, 2.18], p=0.033), and showed that households with water quality deterioration were more likely to report diarrheal cases (OR=3.08 [1.21, 8.18], p=0.02). Studies assessing contamination between source and household are rare due to methodological challenges and the high costs associated with collecting paired samples. Our study demonstrates that it is possible to derive useful spatial links between samples post hoc, and that the choice of pairing approach affects conclusions about associations between enteric infections and water quality deterioration.
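
    Of the three pairing rules, the minimum Euclidean distance approach is the simplest to state in code. The sketch below assigns each household sample to the nearest same-day tap sample; the coordinates and identifiers are hypothetical, and the network-distance and weighted-average variants would replace the distance function.

        import math

        def nearest_tap(household_xy, taps):
            """Return (tap_id, distance) for the tap closest to a household."""
            hx, hy = household_xy
            tap_id, (tx, ty) = min(taps.items(),
                                   key=lambda kv: math.hypot(kv[1][0] - hx, kv[1][1] - hy))
            return tap_id, math.hypot(tx - hx, ty - hy)

        taps = {"T1": (0.0, 0.0), "T2": (120.0, 40.0), "T3": (60.0, 90.0)}  # meters, invented
        print(nearest_tap((70.0, 75.0), taps))   # -> ('T3', ~18.0)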

  10. Residential appliance data, assumptions and methodology for end-use forecasting with EPRI-REEPS 2.1

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, R.J.; Johnson, F.X.; Brown, R.E.; Hanford, J.W.; Koomey, J.G.

    1994-05-01

    This report details the data, assumptions and methodology for end-use forecasting of appliance energy use in the US residential sector. Our analysis uses the modeling framework provided by the Appliance Model in the Residential End-Use Energy Planning System (REEPS), which was developed by the Electric Power Research Institute. In this modeling framework, appliances include essentially all residential end-uses other than space conditioning end-uses. We have defined a distinct appliance model for each end-use based on a common modeling framework provided in the REEPS software. This report details our development of the following appliance models: refrigerator, freezer, dryer, water heater, clothes washer, dishwasher, lighting, cooking and miscellaneous. Taken together, appliances account for approximately 70% of electricity consumption and 30% of natural gas consumption in the US residential sector. Appliances are thus important to those residential sector policies or programs aimed at improving the efficiency of electricity and natural gas consumption. This report is primarily methodological in nature, taking the reader through the entire process of developing the baseline for residential appliance end-uses. Analysis steps documented in this report include: gathering technology and market data for each appliance end-use and specific technologies within those end-uses, developing cost data for the various technologies, and specifying decision models to forecast future purchase decisions by households. Our implementation of the REEPS 2.1 modeling framework draws on the extensive technology, cost and market data assembled by LBL for the purpose of analyzing federal energy conservation standards. The resulting residential appliance forecasting model offers a flexible and accurate tool for analyzing the effect of policies at the national level.
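
    The accounting identity at the core of such end-use forecasts is: energy = households x saturation x unit energy consumption (UEC). The sketch below shows that bookkeeping with invented inputs; the actual REEPS appliance models layer efficiency choice, stock turnover and usage behavior on top of this.

        # All saturations and UECs are illustrative placeholders, not LBL data.
        appliances = {
            # name: (saturation, UEC in kWh/yr)
            "refrigerator": (1.15, 700.0),   # saturation > 1: some homes own two
            "dishwasher":   (0.55, 300.0),
            "lighting":     (1.00, 1200.0),
        }
        households = 95_000_000   # hypothetical US household count

        for name, (saturation, uec) in appliances.items():
            twh = households * saturation * uec / 1e9   # kWh -> TWh
            print(f"{name:12s} {twh:7.1f} TWh/yr")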

  11. The inversion of spectral ratio H/V in a layered system using the diffuse field assumption (DFA)

    Science.gov (United States)

    Piña-Flores, José; Perton, Mathieu; García-Jerez, Antonio; Carmona, Enrique; Luzón, Francisco; Molina-Villegas, Juan C.; Sánchez-Sesma, Francisco J.

    2017-01-01

    In order to evaluate site effects on seismic ground motion and establish preventive measures to mitigate them, the dynamic characterization of sites is mandatory. Among the various geophysical tools aimed at this end, the horizontal-to-vertical spectral ratio (H/V) is a simple way to assess the dominant frequency of a site from seismic ambient noise. The aim of this communication is to enhance the potential of this measurement with a novel method that allows extracting from the H/V the elastic properties of the subsoil, assumed here to be a multilayer medium. For that purpose, we adopt the diffuse field assumption from both the experimental and the modelling perspectives. At the experimental end, the idea is to define general criteria that make the data processing closely supported by theory. On the modelling front, the challenge is to compute efficiently the imaginary part of Green's function; Cauchy's residue theory in the horizontal-wavenumber complex plane is the selected approach. This method allows identifying the contributions of body and surface waves and computing them separately, which permits exploring the theoretical properties of the H/V under different compositions of the seismic ambient noise. This answers some questions that have arisen historically and gives new insights into the H/V method. The efficient forward calculation is the prime ingredient of an inversion scheme based on both gradient and heuristic searches, and it allows exploring relevant relationships between the H/V curves and the parameters, generating useful criteria to speed up inversion. As in many inverse problems, non-uniqueness issues also emerge here. A joint inversion method that also considers the dispersion curves of surface waves extracted from seismic ambient noise is presented and applied to experimental data. This joint scheme effectively mitigates the non-uniqueness.
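
    For the experimental side of the method, the identity commonly used under the diffuse field assumption is H/V = sqrt((Im G11 + Im G22) / Im G33), with the imaginary parts of the Green's function at the receiver proportional to the directional average spectral powers of the ambient field. The sketch below estimates H/V from synthetic three-component noise under that proportionality; real processing adds window selection and spectral smoothing, and the paper's efficient forward computation of the Green's function terms is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)
        fs, n = 100.0, 2 ** 14
        east, north, vert = (rng.standard_normal(n) for _ in range(3))  # stand-in noise

        def power(x):
            """One-sided power spectrum with a Hann taper."""
            return np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2

        hv = np.sqrt((power(east) + power(north)) / power(vert))
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        print(f"H/V at ~1 Hz: {hv[np.argmin(np.abs(freqs - 1.0))]:.2f}")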

  12. The inversion of spectral ratio H/V in a layered system using the Diffuse Field Assumption (DFA)

    Science.gov (United States)

    Piña-Flores, José; Perton, Mathieu; García-Jerez, Antonio; Carmona, Enrique; Luzón, Francisco; Molina-Villegas, Juan C.; Sánchez-Sesma, Francisco J.

    2016-11-01

    In order to evaluate site effects on seismic ground motion and establish preventive measures to mitigate them, the dynamic characterization of sites is mandatory. Among the various geophysical tools aimed at this end, the horizontal-to-vertical spectral ratio (H/V) is a simple way to assess the dominant frequency of a site from seismic ambient noise. The aim of this communication is to enhance the potential of this measurement with a novel method that allows extracting from the H/V the elastic properties of the subsoil, assumed here to be a multilayer medium. For that purpose, we adopt the Diffuse Field Assumption from both the experimental and the modeling perspectives. At the experimental end, the idea is to define general criteria that make the data processing closely supported by theory. On the modeling front, the challenge is to compute efficiently the imaginary part of Green's function; Cauchy's residue theory in the horizontal-wavenumber complex plane is the selected approach. This method allows identifying the contributions of body and surface waves and computing them separately, which permits exploring the theoretical properties of the H/V under different compositions of the seismic ambient noise. This answers some questions that have arisen historically and gives new insights into the H/V method. The efficient forward calculation is the prime ingredient of an inversion scheme based on both gradient and heuristic searches, and it allows exploring relevant relationships between the H/V curves and the parameters, generating useful criteria to speed up inversion. As in many inverse problems, non-uniqueness issues also emerge here. A joint inversion method that also considers the dispersion curves of surface waves extracted from seismic ambient noise is presented and applied to experimental data. This joint scheme effectively mitigates the non-uniqueness.

  13. Testing Earth System Model Assumptions of Photosynthetic Parameters with in situ Leaf Measurements from a Temperate Zone Forest.

    Science.gov (United States)

    Cheng, S. J.; Thomas, R. Q.; Wilkening, J. V.; Curtis, P.; Sharkey, T. D.; Nadelhoffer, K. J.

    2015-12-01

    Estimates of global land CO2 uptake vary widely across Earth system models. This uncertainty around model estimates of land-atmosphere CO2 fluxes may result from differences in how models parameterize and scale photosynthesis from the leaf to the global level. To test model assumptions about photosynthesis, we derive rates of maximum carboxylation (Vc,max), electron transport (J), and triose phosphate utilization (TPU) from in situ leaf measurements from a forest representative of the Great Lakes region. Leaf-level gas exchange measurements were collected across a temperature range from sun and shade leaves of canopy-dominant tree species typically grouped into the same plant functional type. We evaluate the influence of short-term increases in leaf temperature, nitrogen per leaf area (Narea), species, and leaf light environment on Vc,max, J, and TPU by testing contrasting model equations that isolate the influence of these factors on these rate-limiting steps in leaf photosynthesis. Results indicate that patterns in Vc,max are best explained by a model that includes temperature and Narea. However, J varied with species and leaf light environment in addition to temperature. TPU also varied with leaf light environment and possibly with temperature. These variations in J and TPU with species or between sun and shade leaves suggest that plant traits beyond Narea are needed to explain patterns in J and TPU. This study provides in situ evidence on how Vc,max, J, and TPU vary within a forest canopy and highlights how leaf responses to changes in climate, forest species composition, and canopy structure may alter forest CO2 uptake.
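
    To make the "temperature plus Narea" model form concrete, here is a minimal sketch of one common parameterization of that type: Vc,max linear in leaf nitrogen at 25 C, scaled by an Arrhenius temperature response. The coefficients and activation energy are generic literature-style placeholders, not values fitted in this study.

        import numpy as np

        R = 8.314      # universal gas constant, J mol-1 K-1
        HA = 65_330.0  # activation energy for Vc,max, J mol-1 (placeholder)

        def vcmax(narea, t_leaf_c, a=10.0, b=20.0):
            """Vc,max (umol m-2 s-1) from Narea (g N m-2) and leaf temperature (C)."""
            tk = t_leaf_c + 273.15
            v25 = a + b * narea                                  # value at 25 C
            return v25 * np.exp(HA * (tk - 298.15) / (298.15 * R * tk))

        print(f"Vc,max(Narea=2.0, 25 C) = {vcmax(2.0, 25.0):.1f}")
        print(f"Vc,max(Narea=2.0, 30 C) = {vcmax(2.0, 30.0):.1f}")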

  14. How family caregivers' medical and moral assumptions influence decision making for patients in the vegetative state: a qualitative interview study.

    Science.gov (United States)

    Kuehlmeyer, Katja; Borasio, Gian Domenico; Jox, Ralf J

    2012-06-01

    Decisions on limiting life-sustaining treatment for patients in the vegetative state (VS) are emotionally and morally challenging. In Germany, doctors have to discuss, together with the legal surrogate (often a family member), whether the proposed treatment is in accordance with the patient's will. However, it is unknown whether family members of patients in the VS actually base their decisions on the patient's wishes. To examine the role of advance directives, orally expressed wishes, or the presumed will of patients in a VS for family caregivers' decisions on life-sustaining treatment, a qualitative interview study with 14 next of kin of patients in a VS in a long-term care setting was conducted; 13 participants were the patient's legal surrogates. Interviews were analysed according to qualitative content analysis. The majority of family caregivers said that they were aware of previously expressed wishes of the patient that could be applied to the VS condition, but did not base their decisions primarily on these wishes. They gave three reasons for this: (a) the expectation of clinical improvement, (b) the caregivers' definition of life-sustaining treatments and (c) the moral obligation not to harm the patient. If the patient's wishes were not known or not revealed, the caregivers read a will to live into the patient's survival and non-verbal behaviour. Whether or not prior treatment wishes of patients in a VS are respected depends on their applicability, and also on the medical assumptions and moral attitudes of the surrogates. We recommend repeated communication, support for the caregivers and advance care planning.

  15. FIGHTING THE CLASSICAL CRIME-SCENE ASSUMPTIONS. CRITICAL ASPECTS IN ESTABLISHING THE CRIME-SCENE PERIMETER IN COMPUTER-BASED EVIDENCE CASES

    Directory of Open Access Journals (Sweden)

    Cristina DRIGĂ

    2016-05-01

    Full Text Available Physical-world forensic investigation has the luxury of being tied to the sciences governing the investigated space; hence, some assumptions can be made with a degree of certainty when investigating a crime. Cyberspace, on the other hand, has a dual nature: a physical layer susceptible to scientific analysis, and a virtual layer governed entirely by the conventions established among the various actors involved at a given moment in time, which define the actual digital landscape and constitute the layer where the facts relevant from a legal point of view occur. This distinct nature renders unusable many of the assumptions that legal professionals and courts of law are accustomed to operating with. The article identifies the most important features of cyberspace that have immediate legal consequences, with the aim of establishing new and safe assumptions from the legal professional's perspective when cross-examining facts that occurred in cyberspace.

  16. Critical appraisal of assumptions in chains of model calculations used to project local climate impacts for adaptation decision support—the case of Baakse Beek

    Science.gov (United States)

    van der Sluijs, Jeroen P.; Wardekker, J. Arjan

    2015-04-01

    In order to enable anticipation and proactive adaptation, local decision makers increasingly seek detailed foresight about regional and local impacts of climate change. To this end, the Netherlands Models and Data-Centre implemented a pilot chain of sequentially linked models to project local climate impacts on hydrology, agriculture and nature under different national climate scenarios for a small region in the east of the Netherlands named Baakse Beek. The chain of models linked in this pilot includes a (future) weather generator and models of, respectively, subsurface hydrogeology, groundwater stocks and flows, soil chemistry, vegetation development, crop yield and nature quality. These models typically have mismatching time step sizes and grid cell sizes. Linking them unavoidably involves making model assumptions that can hardly be validated, such as those needed to bridge the mismatches in spatial and temporal scales. Here we present and apply a method for the systematic critical appraisal of model assumptions that seeks to identify and characterize the weakest assumptions in a model chain. The critical appraisal of assumptions presented in this paper has been carried out ex post. For the case of the climate impact model chain for Baakse Beek, the three most problematic assumptions were found to be: land use and land management kept constant over time; model linking of (daily) groundwater model output to the (yearly) vegetation model around the root zone; and aggregation of daily output of the soil hydrology model into yearly input of a so-called 'mineralization reduction factor' (calculated from annual average soil pH and daily soil hydrology) in the soil chemistry model. Overall, the method for critical appraisal of model assumptions presented and tested in this paper yields a rich qualitative insight into model uncertainty and model quality. It promotes reflectivity and learning in the modelling community, and leads to

  17. Assumption-free estimation of heritability from genome-wide identity-by-descent sharing between full siblings.

    Directory of Open Access Journals (Sweden)

    Peter M Visscher

    2006-03-01

    Full Text Available The study of continuously varying, quantitative traits is important in evolutionary biology, agriculture, and medicine. Variation in such traits is attributable to many, possibly interacting, genes whose expression may be sensitive to the environment, which makes their dissection into underlying causative factors difficult. An important population parameter for quantitative traits is heritability, the proportion of total variance that is due to genetic factors. Response to artificial and natural selection and the degree of resemblance between relatives are all functions of this parameter. Following the classic paper by R. A. Fisher in 1918, the estimation of additive and dominance genetic variance and heritability in populations has been based upon the expected proportion of genes shared between different types of relatives, and on explicit, often controversial and untestable models of genetic and non-genetic causes of family resemblance. With genome-wide coverage of genetic markers it is now possible to estimate such parameters solely within families using the actual degree of identity-by-descent sharing between relatives. Using genome scans on 4,401 quasi-independent sib pairs, of which 3,375 pairs had phenotypes, we estimated the heritability of height from empirical genome-wide identity-by-descent sharing, which varied from 0.374 to 0.617 (mean 0.498, standard deviation 0.036). The variance in identity-by-descent sharing per chromosome and per genome was consistent with theory. The maximum likelihood estimate of the heritability for height was 0.80, with no evidence for non-genetic causes of sib resemblance, consistent with results from independent twin and family studies but using an entirely separate source of information. Our application shows that it is feasible to estimate genetic variance solely from within-family segregation and provides an independent validation of previously untestable assumptions. Given sufficient data, our new paradigm will
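
    The underlying idea can be illustrated with a Haseman-Elston-style regression, a simpler cousin of the maximum likelihood estimator the authors used: under an additive model, E[(y1 - y2)^2] = 2*sigma_e^2 + 2*sigma_g^2*(1 - pi), so regressing squared sib-pair phenotype differences on realized IBD sharing pi recovers the genetic variance from within-family information alone. The simulation below uses invented data sized loosely on the study (3,375 phenotyped pairs, IBD standard deviation of about 0.036).

        import numpy as np

        rng = np.random.default_rng(1)
        n_pairs, sg2, se2 = 3375, 0.8, 0.2       # true variances; total variance 1
        ibd = np.clip(rng.normal(0.5, 0.036, n_pairs), 0.0, 1.0)
        sq_diff = 2 * se2 + 2 * sg2 * (1 - ibd) + rng.normal(0.0, 0.5, n_pairs)

        slope, intercept = np.polyfit(ibd, sq_diff, 1)
        sg2_hat = -slope / 2.0                   # slope estimates -2*sigma_g^2
        se2_hat = intercept / 2.0 - sg2_hat      # intercept = 2*(sigma_e^2 + sigma_g^2)
        print(f"h2 estimate: {sg2_hat / (sg2_hat + se2_hat):.2f} (true 0.80)")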

  18. Dynamic Teams and Decentralized Control Problems with Substitutable Actions

    OpenAIRE

    Asghari, Seyed Mohammad; Nayyar, Ashutosh

    2016-01-01

    This paper considers two problems -- a dynamic team problem and a decentralized control problem. The problems we consider do not belong to the known classes of "simpler" dynamic team/decentralized control problems such as partially nested or quadratically invariant problems. However, we show that our problems admit simple solutions under an assumption referred to as the substitutability assumption. Intuitively, substitutability in a team (resp. decentralized control) problem means that the ef...

  19. A Novel Robust Adaptive Fuzzy Controller

    Institute of Scientific and Technical Information of China (English)

    LIU Xiao-hua; WANG Xiu-hong; FEN En-min

    2002-01-01

    For a class of continuous-time nonlinear systems, a novel robust adaptive fuzzy controller is proposed using the Lyapunov method. It is proven that the control algorithm is globally stable and that the output tracking error converges to a neighborhood of zero under the stated assumptions. As a result, the controlled system has strong robustness to disturbances and modeling error.
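
    As context, the sketch below simulates a generic Lyapunov-based adaptive law of the kind such controllers instantiate: u = -k*e - theta_hat . phi(x) with update theta_hat' = gamma*e*phi(x), which makes V = e^2/2 + |theta_err|^2/(2*gamma) non-increasing along trajectories. The plant, gains and Gaussian basis functions are invented stand-ins; in the paper, phi would be fuzzy basis functions and the control law carries additional robustifying terms.

        import numpy as np

        def phi(x):
            """Gaussian basis functions standing in for fuzzy memberships."""
            centers = np.linspace(-2.0, 2.0, 5)
            return np.exp(-(x - centers) ** 2)

        k, gamma, dt = 4.0, 2.0, 0.001
        theta_hat = np.zeros(5)
        x, x_ref = 1.5, 0.0

        for _ in range(5000):                       # 5 seconds of simulation
            e = x - x_ref
            u = -k * e - theta_hat @ phi(x)         # cancel estimated nonlinearity
            theta_hat += gamma * e * phi(x) * dt    # Lyapunov-derived adaptation law
            x += (0.5 * np.sin(x) + u) * dt         # "unknown" plant dynamics
        print(f"tracking error after 5 s: {abs(x - x_ref):.4f}")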