WorldWideScience

Sample records for pseudo-steady state assumption

  1. Pseudo-steady rates of crystal nucleation in suspensions of charged colloidal particles

    CERN Document Server

    Dixit, N M

    2003-01-01

    We develop an analytical model to describe crystal nucleation in suspensions of charged colloidal particles. The particles are assumed to interact with a repulsive hard-core Yukawa potential. The thermodynamic properties of the suspensions are determined by mapping onto an effective hard-sphere system using perturbation theory. Hydrodynamic effects are calculated by approximating particle interactions with the excluded shell potential. The rates of particle aggregation and dissociation from cluster surfaces in supersaturated suspensions are determined by solving the diffusion and Smoluchowski equations, respectively, which allow the calculation of pseudo-steady rates of crystal nucleation. By decoupling thermodynamic and hydrodynamic effects, we find intriguing non-monotonic dependencies of the nucleation rate on the strength and the range of particle repulsions. In particular, we find that the rate at any effective hard-sphere volume fraction can be lower than that of the hard-sphere system at that volume fr...

  2. Bayou Corne sinkhole : control measurements of State Highway 70 in Assumption Parish, Louisiana, tech summary.

    Science.gov (United States)

    2014-01-01

    The sinkhole located in Assumption Parish, Louisiana, threatens the stability of Highway 70, a state-maintained route. In order to mitigate the potential damaging effects of the sinkhole on this infrastructure, the Louisiana Department of Transpo...

  3. Bayou Corne Sinkhole: Control Measurements of State Highway 70 in Assumption Parish, Louisiana : Research Project Capsule

    Science.gov (United States)

    2012-09-01

    The sinkhole located in northern Assumption Parish, Louisiana, threatens the stability of Highway 70, a state-maintained route. In order to monitor and mitigate potential damage effects on this infrastructure, the Louisiana Department of Trans...

  4. Bioaccumulation factors and the steady state assumption for cesium isotopes in aquatic foodwebs near nuclear facilities.

    Science.gov (United States)

    Rowan, D J

    2013-07-01

    Steady state approaches, such as transfer coefficients or bioaccumulation factors, are commonly used to model the bioaccumulation of (137)Cs in aquatic foodwebs from routine operations and releases from nuclear generating stations and other nuclear facilities. Routine releases from nuclear generating stations and facilities, however, often consist of pulses as liquid waste is stored, analyzed to ensure regulatory compliance and then released. The effect of repeated pulse releases on the steady state assumption inherent in the bioaccumulation factor approach has not been evaluated. In this study, I examine the steady state assumption for aquatic biota by analyzing data for two cesium isotopes in the same biota, one isotope in steady state (stable (133)Cs) from geologic sources and the other released in pulses ((137)Cs) from reactor operations. I also compare (137)Cs bioaccumulation factors for similar upstream populations from the same system exposed solely to weapons-test (137)Cs, and assumed to be in steady state. The steady state assumption appears to be valid for small organisms at lower trophic levels (zooplankton, rainbow smelt and 0+ yellow perch) but not for older and larger fish at higher trophic levels (walleye). Attempts to account for previous exposure and retention through a biokinetics approach had a similar effect on steady-state (upstream) and non-steady-state (downstream) populations of walleye, but were ineffective in explaining the more or less constant, roughly 2-fold deviation between fish with steady-state exposures and fish with non-steady-state exposures across all age classes of walleye. These results suggest that for large, piscivorous fish, repeated exposure to short-duration pulse releases leads to much higher (137)Cs BAFs than expected from (133)Cs BAFs for the same fish or (137)Cs BAFs for similar populations in the same system not impacted by reactor releases. These results suggest that the steady state approach should be used with caution in any

  5. Teachers' response to standards-based reform: Probing reform assumptions in Washington State.

    Directory of Open Access Journals (Sweden)

    Hilary Loeb

    2008-04-01

    Because teachers' efforts are central to the success of standards-based reform, it behooves the policy community to look carefully at the beliefs about instruction that are rooted in this reform theory. Building on teacher-centric research on standards-based reform and ideas about teaching practice from research on multicultural education, this paper focuses on the assumptions embedded in Washington state's approach. Survey data from a representative sample of teachers suggest that the state's program of high student learning standards, aligned assessments and an accountability system has shaped teachers' instructional practice and their students' learning in ways that the state's reform theory assumes. However, teachers' concerns about student achievement and instructional supports indicate problems with the inherent logic of the state's reform regarding how well it serves a diverse student population.

  6. Anti-Atheist Bias in the United States: Testing Two Critical Assumptions

    Directory of Open Access Journals (Sweden)

    Lawton K Swan

    2012-02-01

    Decades of opinion polling and empirical investigations have clearly demonstrated a pervasive anti-atheist prejudice in the United States. However, much of this scholarship relies on two critical and largely unaddressed assumptions: (a) that when people report negative attitudes toward atheists, they do so because they are reacting specifically to their lack of belief in God; and (b) that survey questions asking about attitudes toward atheists as a group yield reliable information about biases against individual atheist targets. To test these assumptions, an online survey asked a probability-based random sample of American adults (N = 618) to evaluate a fellow research participant (“Jordan”). Jordan garnered significantly more negative evaluations when identified as an atheist than when described as religious or when religiosity was not mentioned. This effect did not differ as a function of labeling (“atheist” versus “no belief in God”) or the amount of individuating information provided about Jordan. These data suggest that both assumptions are tenable: nonbelief—rather than extraneous connotations of the word “atheist”—seems to underlie the effect, and participants exhibited a marked bias even when confronted with an otherwise attractive individual.

  7. Accelerated Gillespie Algorithm for Gas–Grain Reaction Network Simulations Using Quasi-steady-state Assumption

    Science.gov (United States)

    Chang, Qiang; Lu, Yang; Quan, Donghui

    2017-12-01

    Although the Gillespie algorithm is accurate in simulating gas–grain reaction networks, so far its computational cost is so expensive that it cannot be used to simulate chemical reaction networks that include molecular hydrogen accretion or the chemical evolution of protoplanetary disks. We present an accelerated Gillespie algorithm that is based on a quasi-steady-state assumption with the further approximation that the population distribution of transient species depends only on the accretion and desorption processes. The new algorithm is tested against a few reaction networks that are simulated by the regular Gillespie algorithm. We found that the less likely it is that transient species are formed and destroyed on grain surfaces, the more accurate the new method is. We also apply the new method to simulate reaction networks that include molecular hydrogen accretion. The results show that surface chemical reactions involving molecular hydrogen are not important for the production of surface species under standard physical conditions of dense molecular clouds.
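
    The accelerated method above builds on the standard exact Gillespie algorithm. As a point of reference, the following is a minimal sketch of that exact algorithm applied to a toy accretion/desorption network; the rate constants, the toy network, and all names are illustrative assumptions, not the paper's reaction networks, and the quasi-steady-state acceleration itself is not shown.

```python
import math
import random

def gillespie_ssa(x0, rates, stoich, t_end, seed=1):
    """Minimal exact Gillespie SSA. The accelerated variant described in
    the abstract replaces explicit simulation of fast transient species
    with a quasi-steady-state approximation (not shown here)."""
    rng = random.Random(seed)
    x = list(x0)
    t = 0.0
    while t < t_end:
        props = [r(x) for r in rates]        # reaction propensities
        a0 = sum(props)
        if a0 == 0.0:                        # no reaction can fire
            break
        t += -math.log(rng.random()) / a0    # exponential waiting time
        u, cum, j = rng.random() * a0, 0.0, 0
        for j, p in enumerate(props):        # choose reaction j with
            cum += p                         # probability props[j]/a0
            if u < cum:
                break
        for k, dv in enumerate(stoich[j]):   # apply stoichiometric change
            x[k] += dv
    return x

# Toy surface-chemistry analogue (hypothetical rates): accretion adds a
# surface atom at rate 1.0; each surface atom desorbs at rate 0.5.
rates = [lambda x: 1.0, lambda x: 0.5 * x[0]]
stoich = [(1,), (-1,)]
print(gillespie_ssa((0,), rates, stoich, t_end=100.0))
```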

  8. Pharmacokinetics and central nervous system effects of the novel dopamine D3 receptor antagonist GSK598809 and intravenous alcohol infusion at pseudo-steady state.

    Science.gov (United States)

    te Beek, E T; Zoethout, R W M; Bani, M S G; Andorn, A; Iavarone, L; Klaassen, E S; Fina, P; van Gerven, J M A

    2012-02-01

    GSK598809 is a novel selective dopamine D(3) receptor antagonist, currently in development for the treatment of substance abuse and addiction. In a blinded, randomized, placebo-controlled study, effects of single oral doses of 175 mg GSK598809 were evaluated in healthy volunteers. Pharmacokinetics, central nervous system (CNS) effects and potential for interactions with alcohol were evaluated, using an alcohol infusion paradigm and analysis of eye movements, adaptive tracking, visual analogue scales, body sway, serum prolactin and a verbal visual learning test. Adverse effects of GSK598809 included headache, dizziness and somnolence. Plasma concentration of GSK598809 was maximal 2-3 hours postdose and decreased with a half-life of roughly 20 hours. CNS effects were limited to prolactin elevation and decreased adaptive tracking. Co-administration of GSK598809 and alcohol did not affect alcohol pharmacokinetics, but caused a 9% decrease of C(max) and a 15% increase of AUC of GSK598809. CNS effects of co-administration were mainly additive, except for a small supra-additive increase in saccadic reaction time and decrease in delayed word recall. In conclusion, GSK598809 causes elevation of serum prolactin and a small decrease in adaptive tracking performance. After co-administration with alcohol, effects of GSK598809 are mainly additive and the combination is well tolerated in healthy volunteers.

  9. Placebo- and amitriptyline-controlled evaluation of central nervous system effects of the NK1 receptor antagonist aprepitant and intravenous alcohol infusion at pseudo-steady state.

    Science.gov (United States)

    te Beek, Erik T; Tatosian, Daniel; Majumdar, Anup; Selverian, Diana; Klaassen, Erica S; Petty, Kevin J; Gargano, Cynthia; van Dyck, Kristien; McCrea, Jacqueline; Murphy, Gail; van Gerven, Joop M A

    2013-08-01

    Recent interest in NK1 receptor antagonists has focused on a potential role in the treatment of drug addiction and substance abuse. In the present study, the potential for interactions between the NK1 receptor antagonist aprepitant and alcohol, given as an infusion at a target level of 0.65 g/L, was evaluated. Amitriptyline was included as positive control to provide an impression of the profile of central nervous system (CNS) effects. In a double-blind, randomized, placebo- and amitriptyline-controlled study, the pharmacokinetics and CNS effects of aprepitant and alcohol were investigated in 16 healthy volunteers. Cognitive and psychomotor function tests included the visual verbal learning test (VVLT), Bond and Lader visual analogue scales (VAS), digit symbol substitution test (DSST), visual pattern recognition, binary choice reaction time, critical flicker fusion (CFF), body sway, finger tapping, and adaptive tracking. Alcohol impaired finger tapping and body sway. Amitriptyline impaired DSST performance, VAS alertness, CFF, body sway, finger tapping, and adaptive tracking. No impairments were found after administration of aprepitant. Co-administration of aprepitant with alcohol was generally well tolerated and did not cause significant additive CNS effects, compared with alcohol alone. Therefore, our study found no indications for clinically relevant interactions between aprepitant and alcohol. © The Author(s) 2013.

  10. Analysis of the adsorption process and of desiccant cooling systems: a pseudo-steady-state model for coupled heat and mass transfer. [DESSIM, DESSIM2, DESSIM4]

    Energy Technology Data Exchange (ETDEWEB)

    Barlow, R.S.

    1982-12-01

    A computer model to simulate the adiabatic adsorption/desorption process is documented. Developed to predict the performance of desiccant cooling systems, the model has been validated through comparison with experimental data for single-blow adsorption and desorption. A literature review on adsorption analysis, detailed discussions of the adsorption process, and an initial assessment of the potential for performance improvement through advanced component development are included.

  11. Bayou Corne sinkhole : control measurements of State Highway 70 in Assumption Parish, Louisiana.

    Science.gov (United States)

    2014-01-01

    This project measured and assessed the surface stability of the portion of LA Highway 70 that is potentially vulnerable to the Assumption Parish sinkhole. Using Global Positioning Systems (GPS) enhanced by a real-time network (RTN) of continuousl...

  12. Usefulness of an equal-probability assumption for out-of-equilibrium states: A master equation approach

    KAUST Repository

    Nogawa, Tomoaki

    2012-10-18

    We examine the effectiveness of assuming an equal probability for states far from equilibrium. For this aim, we propose a method to construct a master equation for extensive variables describing nonstationary nonequilibrium dynamics. The key point of the method is the assumption that transient states are equivalent to the equilibrium state that has the same extensive variables, i.e., an equal probability holds for microscopic states in nonequilibrium. We demonstrate an application of this method to the critical relaxation of the two-dimensional Potts model by Monte Carlo simulations. While the one-variable description, which is adequate for equilibrium, yields relaxation dynamics that are very fast, the redundant two-variable description well reproduces the true dynamics quantitatively. These results suggest that some class of the nonequilibrium state can be described with a small extension of degrees of freedom, which may lead to an alternative way to understand nonequilibrium phenomena. © 2012 American Physical Society.

  13. Probabilistic choice models in health-state valuation research: background, theories, assumptions and applications.

    Science.gov (United States)

    Arons, Alexander M M; Krabbe, Paul F M

    2013-02-01

    Interest is rising in measuring subjective health outcomes, such as treatment outcomes that are not directly quantifiable (functional disability, symptoms, complaints, side effects and health-related quality of life). Health economists in particular have applied probabilistic choice models in the area of health evaluation. They increasingly use discrete choice models based on random utility theory to derive values for healthcare goods or services. Recent attempts have been made to use discrete choice models as an alternative method to derive values for health states. In this article, various probabilistic choice models are described according to their underlying theory. A historical overview traces their development and applications in diverse fields. The discussion highlights some theoretical and technical aspects of the choice models and their similarity and dissimilarity. The objective of the article is to elucidate the position of each model and their applications for health-state valuation.
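
    In their simplest form, the random-utility discrete choice models described above reduce to conditional-logit (softmax) choice probabilities, P(i) = exp(v_i) / Σ_j exp(v_j). The sketch below illustrates only that basic form; the utility values are hypothetical, and the article's actual models for health-state valuation are more elaborate.

```python
import math

def logit_probs(v):
    """Conditional-logit choice probabilities from a random-utility model:
    P(i) = exp(v[i]) / sum_j exp(v[j])."""
    m = max(v)                            # subtract max for numerical stability
    e = [math.exp(x - m) for x in v]
    s = sum(e)
    return [x / s for x in e]

# Hypothetical systematic utilities for three health states.
print(logit_probs([1.0, 0.5, -0.2]))
```

Higher-utility states receive proportionally higher choice probabilities, which is what lets observed choices be inverted into health-state values.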

  14. The stochastic quasi-steady-state assumption: Reducing the model but not the noise

    Science.gov (United States)

    Srivastava, Rishi; Haseltine, Eric L.; Mastny, Ethan; Rawlings, James B.

    2011-04-01

    Highly reactive species at small copy numbers play an important role in many biological reaction networks. We have described previously how these species can be removed from reaction networks using stochastic quasi-steady-state singular perturbation analysis (sQSPA). In this paper we apply sQSPA to three published biological models: the pap operon regulation, a biochemical oscillator, and an intracellular viral infection. These examples demonstrate three different potential benefits of sQSPA. First, rare state probabilities can be accurately estimated from simulation. Second, the method typically results in fewer and better scaled parameters that can be more readily estimated from experiments. Finally, the simulation time can be significantly reduced without sacrificing the accuracy of the solution.

  15. CCN predictions using simplified assumptions of organic aerosol composition and mixing state: a synthesis from six different locations

    Directory of Open Access Journals (Sweden)

    B. Ervens

    2010-05-01

    An accurate but simple quantification of the fraction of aerosol particles that can act as cloud condensation nuclei (CCN) is needed for implementation in large-scale models. Data on aerosol size distribution, chemical composition, and CCN concentration from six different locations have been analyzed to explore the extent to which simple assumptions of composition and mixing state of the organic fraction can reproduce measured CCN number concentrations.

    Fresher pollution aerosol as encountered in Riverside, CA, and the ship channel in Houston, TX, cannot be represented without knowledge of more complex (size-resolved) composition. For aerosol that has experienced processing (Mexico City, Holme Moss (UK), Point Reyes (CA), and Chebogue Point (Canada)), CCN can be predicted within a factor of two assuming either externally or internally mixed soluble organics, although these simplified compositions/mixing states might not represent the actual properties of ambient aerosol populations, in agreement with many previous CCN studies in the literature. Under typical conditions, a factor of two uncertainty in CCN concentration due to composition assumptions translates to an uncertainty of ~15% in cloud drop concentration, which might be adequate for large-scale models given the much larger uncertainty in cloudiness.

  16. False assumptions.

    Science.gov (United States)

    Swaminathan, M

    1997-01-01

    Indian women do not have to be told the benefits of breast feeding or "rescued from the clutches of wicked multinational companies" by international agencies. There is no proof that breast feeding has declined in India; in fact, a 1987 survey revealed that 98% of Indian women breast feed. Efforts to promote breast feeding among the middle classes rely on such initiatives as the "baby friendly" hospital where breast feeding is promoted immediately after birth. This ignores the 76% of Indian women who give birth at home. Blaming this unproved decline in breast feeding on multinational companies distracts attention from more far-reaching and intractable effects of social change. While the Infant Milk Substitutes Act is helpful, it also deflects attention from more pressing issues. Another false assumption is that Indian women are abandoning breast feeding to comply with the demands of employment, but research indicates that most women give up employment for breast feeding, despite the economic cost to their families. Women also seek work in the informal sector to secure the flexibility to meet their child care responsibilities. Instead of being concerned about "teaching" women what they already know about the benefits of breast feeding, efforts should be made to remove the constraints women face as a result of their multiple roles and to empower them with the support of families, governmental policies and legislation, employers, health professionals, and the media.

  17. When Is a Diffusion Profile Not a Diffusion Profile? the Importance of Initial State Assumptions in Diffusion Modelling

    Science.gov (United States)

    Morgan, D. J.; Chamberlain, K. J.; Kahl, M.; Potts, N. J.; Pankhurst, M. J.; Wilson, C. J. N.

    2014-12-01

    Over the past 20 years, diffusion chronometers have evolved from a niche tool into one of routine application, with more practitioners, new tools and increasingly large datasets. As we expand the horizons of diffusional geochronometry, it is worth taking stock of developments in methodologies and data acquisition, and taking time to revisit the underpinnings of the technique. Data collected as part of recent projects on Campi Flegrei, the Bishop Tuff and Fimmvörðuháls-Eyjafjallajökull are here used to investigate the initial state assumption, an absolutely vital aspect underpinning most diffusional work and one that is rarely evaluated despite its fundamental importance. To illustrate the nature of the problem we consider two widely-used element-mineral systems for felsic and mafic systems, respectively. First, barium and strontium profiles within sanidine crystals, modelled independently, can give strongly contrasting timescales from the same crystal zone. We can reconcile the datasets only for a situation where the initial boundary within the crystal was not a sharp step function, but relatively fuzzy before diffusion onset. This fuzziness effectively starts both chronometers off with an apparent, and false, pre-existing timescale, impacting the slower-diffusing barium much more strongly than the faster-diffusing strontium, yielding thousands of years of non-existent diffusion history. By combining both elements, a starting width of tens of microns can be shown, shortening the true diffusive timescales from tens of thousands of years to hundreds. Second, in olivine, we encounter different growth-related problems. Here, Fe-Mg interdiffusion occurs at a rate comparable to growth, with the compound nature of zonation making it difficult to extract the diffusion component. This requires a treatment of changing boundary conditions and sequential growth to generate the curvature seen in natural data, in order to recover timescales for anything but the outermost

  18. [The accuracy of rapid equilibrium assumption in steady-state enzyme kinetics is the function of equilibrium segment structure and properties].

    Science.gov (United States)

    Vrzheshch, P V

    2015-01-01

    A quantitative evaluation of the accuracy of the rapid equilibrium assumption in steady-state enzyme kinetics was obtained for an arbitrary mechanism of an enzyme-catalyzed reaction. This evaluation depends only on the structure and properties of the equilibrium segment, not on the structure and properties of the rest (the stationary part) of the kinetic scheme. The smaller the values of the edges leaving the equilibrium segment relative to the values of the edges within the equilibrium segment, the higher the accuracy of the intermediate concentrations and reaction velocity determined under the rapid equilibrium assumption.
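
    The abstract's conclusion can be illustrated numerically on the classic Michaelis-Menten mechanism, where the rapid-equilibrium treatment uses K_s = k_off/k_on while the full steady-state treatment uses K_m = (k_off + k_cat)/k_on: the error of the rapid-equilibrium velocity shrinks as the edge leaving the equilibrium segment (k_cat) becomes small relative to the edge within it (k_off). All rate constants below are hypothetical illustration values, not from the paper.

```python
# E + S <=> ES -> E + P.  Rapid equilibrium: K_s = k_off / k_on;
# steady state: K_m = (k_off + k_cat) / k_on.
def mm_velocity(S, Vmax, K):
    return Vmax * S / (K + S)

k_on, k_off, E_tot, S = 1.0, 10.0, 1.0, 5.0   # hypothetical constants
for k_cat in (0.1, 1.0, 10.0):
    Ks = k_off / k_on
    Km = (k_off + k_cat) / k_on
    v_eq = mm_velocity(S, k_cat * E_tot, Ks)   # rapid-equilibrium velocity
    v_ss = mm_velocity(S, k_cat * E_tot, Km)   # steady-state velocity
    err = abs(v_eq - v_ss) / v_ss
    print(f"k_cat/k_off = {k_cat / k_off:5.2f}  relative error = {err:.4f}")
```

The relative error grows in step with k_cat/k_off, matching the edge-ratio criterion stated in the abstract.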

  19. Monthly values of the standardized precipitation index in the State of São Paulo, Brazil: trends and spectral features under the normality assumption

    Directory of Open Access Journals (Sweden)

    Gabriel Constantino Blain

    2012-01-01

    The aim of this study was to describe monthly series of the Standardized Precipitation Index obtained from four weather stations of the State of São Paulo, Brazil. The analyses were carried out by evaluating the normality assumption of the SPI distributions, the spectral features of these series and the presence of climatic trends in these datasets. It was observed that the Pearson type III distribution was better than the gamma 2-parameter distribution in providing monthly SPI series closer to the normality assumption inherent to the use of this standardized index. The spectral analyses carried out in the time-frequency domain did not allow us to establish a dominant mode in the analyzed series. In general, the Mann-Kendall and the Pettitt tests indicated the presence of no significant trend in the SPI series. However, both trend tests have indicated that the temporal variability of this index, observed at the months of October over the last 60 years, cannot be seen as the result of a purely random process. This last inference is due to the concentration of decreasing trends, with a common beginning (1983/84) in the four locations of the study.
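
    The core of any SPI computation is mapping a precipitation cumulative probability onto standard-normal quantiles; this standardization step is what the normality assumption above refers to. The sketch below substitutes an empirical plotting-position CDF for the fitted gamma/Pearson-III CDF used in the study (a deliberate simplification), and the precipitation values are hypothetical.

```python
from statistics import NormalDist

def empirical_spi(precip):
    """Distribution-free SPI sketch: an empirical plotting-position CDF
    stands in for the fitted gamma/Pearson-III CDF, and each cumulative
    probability is then mapped to a standard-normal quantile."""
    n = len(precip)
    order = sorted(range(n), key=lambda i: precip[i])
    p = [0.0] * n
    for rank, i in enumerate(order, start=1):
        p[i] = (rank - 0.44) / (n + 0.12)   # Gringorten plotting position
    nd = NormalDist()
    return [nd.inv_cdf(q) for q in p]

# Hypothetical monthly precipitation totals (mm): low totals map to
# negative SPI (dry), high totals to positive SPI (wet).
print(empirical_spi([12.0, 85.0, 40.0, 150.0, 60.0]))
```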

  20. Multiverse Assumptions and Philosophy

    OpenAIRE

    James R. Johnson

    2018-01-01

    Multiverses are predictions based on theories. Focusing on each theory’s assumptions is key to evaluating a proposed multiverse. Although accepted theories of particle physics and cosmology contain non-intuitive features, multiverse theories entertain a host of “strange” assumptions classified as metaphysical (outside objective experience, concerned with fundamental nature of reality, ideas that cannot be proven right or wrong) topics such as: infinity, duplicate yous, hypothetical fields, mo...

  1. Multiverse Assumptions and Philosophy

    Directory of Open Access Journals (Sweden)

    James R. Johnson

    2018-02-01

    Multiverses are predictions based on theories. Focusing on each theory’s assumptions is key to evaluating a proposed multiverse. Although accepted theories of particle physics and cosmology contain non-intuitive features, multiverse theories entertain a host of “strange” assumptions classified as metaphysical (outside objective experience, concerned with fundamental nature of reality, ideas that cannot be proven right or wrong) topics such as: infinity, duplicate yous, hypothetical fields, more than three space dimensions, Hilbert space, advanced civilizations, and reality established by mathematical relationships. It is easy to confuse multiverse proposals because many divergent models exist. This overview defines the characteristics of eleven popular multiverse proposals. The characteristics compared are: initial conditions, values of constants, laws of nature, number of space dimensions, number of universes, and fine tuning explanations. Future scientific experiments may validate selected assumptions; but until they do, proposals by philosophers may be as valid as theoretical scientific theories.

  2. Interrogating Values and Assumptions

    OpenAIRE

    Maudlin, Daniel

    2012-01-01

    Architecture and design students are not necessarily or immediately interested in history. It cannot be assumed that the typological or stylistic evolution of form and ornament over time, or lists of acknowledged masters and their works, will be viewed as having any relevance to a contemporary designer. Teaching architectural history within design forces a reappraisal of assumptions and hitherto unquestioned values. The artist-and-object, art-historical tradition excludes all architectures co...

  3. Contextuality under weak assumptions

    Science.gov (United States)

    Simmons, Andrew W.; Wallman, Joel J.; Pashayan, Hakop; Bartlett, Stephen D.; Rudolph, Terry

    2017-03-01

    The presence of contextuality in quantum theory was first highlighted by Bell, Kochen and Specker, who discovered that for quantum systems of three or more dimensions, measurements could not be viewed as deterministically revealing pre-existing properties of the system. More precisely, no model can assign deterministic outcomes to the projectors of a quantum measurement in a way that depends only on the projector and not the context (the full set of projectors) in which it appeared, despite the fact that the Born rule probabilities associated with projectors are independent of the context. A more general, operational definition of contextuality introduced by Spekkens, which we will term ‘probabilistic contextuality’, drops the assumption of determinism and allows for operations other than measurements to be considered contextual. Even two-dimensional quantum mechanics can be shown to be contextual under this generalised notion. Probabilistic noncontextuality represents the postulate that elements of an operational theory that cannot be distinguished from each other based on the statistics of arbitrarily many repeated experiments (they give rise to the same operational probabilities) are ontologically identical. In this paper, we introduce a framework that enables us to distinguish between different noncontextuality assumptions in terms of the relationships between the ontological representations of objects in the theory given a certain relation between their operational representations. This framework can be used to motivate and define a ‘possibilistic’ analogue, encapsulating the idea that elements of an operational theory that cannot be unambiguously distinguished operationally can also not be unambiguously distinguished ontologically. We then prove that possibilistic noncontextuality is equivalent to an alternative notion of noncontextuality proposed by Hardy. Finally, we demonstrate that these weaker noncontextuality assumptions are sufficient to prove

  4. Linking assumptions in amblyopia

    Science.gov (United States)

    LEVI, DENNIS M.

    2017-01-01

    Over the last 35 years or so, there has been substantial progress in revealing and characterizing the many interesting and sometimes mysterious sensory abnormalities that accompany amblyopia. A goal of many of the studies has been to try to make the link between the sensory losses and the underlying neural losses, resulting in several hypotheses about the site, nature, and cause of amblyopia. This article reviews some of these hypotheses, and the assumptions that link the sensory losses to specific physiological alterations in the brain. Despite intensive study, it turns out to be quite difficult to make a simple linking hypothesis, at least at the level of single neurons, and the locus of the sensory loss remains elusive. It is now clear that the simplest notion—that reduced contrast sensitivity of neurons in cortical area V1 explains the reduction in contrast sensitivity—is too simplistic. Considerations of noise, noise correlations, pooling, and the weighting of information also play a critically important role in making perceptual decisions, and our current models of amblyopia do not adequately take these into account. Indeed, although the reduction of contrast sensitivity is generally considered to reflect “early” neural changes, it seems plausible that it reflects changes at many stages of visual processing. PMID:23879956

  5. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests!

    Explaining different arrival times. [Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones] Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics. Intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source. Delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect. Special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong; this, too, would cause photon velocities to be energy-dependent. Delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect. If we now turn this problem around, then by measuring the arrival-time delay between photons of different energies from various astrophysical sources (the further away, the better) we can provide constraints on these

  6. Assumptions and Grand Strategy

    Science.gov (United States)

    2011-01-01

    definition that is applicable today, noting that grand strategy: Encompasses the decisions of a given state about its overall security—the threat...offering instead an explanation that draws upon Western secular and materialist (i.e., socio-economic) understandings of the individual. Yet such

  7. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)

    We propose a class of finite state systems of synchronizing distributed processes, where processes make assumptions at local states about the state of other processes in the system. This constrains the global states of the system to those where assumptions made by a process about another are compatible with the ...

  8. Stochastic chemical kinetics and the total quasi-steady-state assumption: application to the stochastic simulation algorithm and chemical master equation.

    Science.gov (United States)

    Macnamara, Shev; Bersani, Alberto M; Burrage, Kevin; Sidje, Roger B

    2008-09-07

    Recently the application of the quasi-steady-state approximation (QSSA) to the stochastic simulation algorithm (SSA) was suggested for the purpose of speeding up stochastic simulations of chemical systems that involve both relatively fast and slow chemical reactions [Rao and Arkin, J. Chem. Phys. 118, 4999 (2003)] and further work has led to the nested and slow-scale SSA. Improved numerical efficiency is obtained by respecting the vastly different time scales characterizing the system and then by advancing only the slow reactions exactly, based on a suitable approximation to the fast reactions. We considerably extend these works by applying the QSSA to numerical methods for the direct solution of the chemical master equation (CME) and, in particular, to the finite state projection algorithm [Munsky and Khammash, J. Chem. Phys. 124, 044104 (2006)], in conjunction with Krylov methods. In addition, we point out some important connections to the literature on the (deterministic) total QSSA (tQSSA) and place the stochastic analogue of the QSSA within the more general framework of aggregation of Markov processes. We demonstrate the new methods on four examples: Michaelis-Menten enzyme kinetics, double phosphorylation, the Goldbeter-Koshland switch, and the mitogen activated protein kinase cascade. Overall, we report dramatic improvements by applying the tQSSA to the CME solver.
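    As a concrete (and hedged) illustration of the kind of system being accelerated, the following is a minimal direct-method SSA (Gillespie) simulation of the Michaelis-Menten scheme E + S <-> ES -> E + P, one of the four examples named above. The rate constants and molecule counts are illustrative assumptions, not taken from the paper; the point is that every fast binding/unbinding event is simulated explicitly, which is exactly the cost the QSSA/tQSSA-based methods avoid.

```python
import random

# Minimal Gillespie direct-method SSA for Michaelis-Menten kinetics:
#   E + S -> ES   (rate k1)
#   ES -> E + S   (rate km1)
#   ES -> E + P   (rate k2)
# Counts and rate constants are illustrative only.

def ssa_michaelis_menten(e0=10, s0=100, k1=0.01, km1=0.1, k2=0.1,
                         t_end=50.0, seed=1):
    rng = random.Random(seed)
    e, s, es, p = e0, s0, 0, 0
    t = 0.0
    while t < t_end:
        a1 = k1 * e * s   # propensity of binding
        a2 = km1 * es     # propensity of unbinding
        a3 = k2 * es      # propensity of catalysis
        a0 = a1 + a2 + a3
        if a0 == 0:
            break
        t += rng.expovariate(a0)   # exponential waiting time to next event
        r = rng.uniform(0, a0)     # choose which reaction fires
        if r < a1:
            e, s, es = e - 1, s - 1, es + 1
        elif r < a1 + a2:
            e, s, es = e + 1, s + 1, es - 1
        else:
            e, es, p = e + 1, es - 1, p + 1
    return e, s, es, p

print(ssa_michaelis_menten())
```

Note the two conservation laws (total enzyme e + es and total substrate s + es + p) hold exactly at every step, which is a useful sanity check for any SSA implementation.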

  9. Disastrous assumptions about community disasters

    Energy Technology Data Exchange (ETDEWEB)

    Dynes, R.R. [Univ. of Delaware, Newark, DE (United States). Disaster Research Center

    1995-12-31

    Planning for local community disasters is compounded with erroneous assumptions. Six problematic models are identified: agent facts, big accident, end of the world, media, command and control, administrative. Problematic assumptions in each of them are identified. A more adequate model centered on problem solving is identified. That there is a discrepancy between disaster planning efforts and the actual response experience seems rather universal. That discrepancy is symbolized by the graffiti which predictably surfaces on many walls in post-disaster locations: "First the earthquake, then the disaster." That contradiction is seldom reduced as a result of post-disaster critiques, since the most usual conclusion is that the plan was adequate but the "people" did not follow it. Another explanation will be provided here. A more plausible explanation for failure is that most planning efforts adopt a number of erroneous assumptions which affect the outcome. Those assumptions are infrequently changed or modified by experience.

  10. The Axioms and Special Assumptions

    Science.gov (United States)

    Borchers, Hans-Jürgen; Sen, Rathindra Nath

    For ease of reference, the axioms, the nontriviality assumptions (3.1.10), the definition of a D-set and the special assumptions of Chaps. 5 and 6 are collected together in the following. The verbal explanations that follow the formal definitions a)-f) of (4.2.1) have been omitted. The entries below are numbered as they are in the text. Recall that βC is the subset of the cone C which, in a D-set, is seen to coincide with the boundary of C after the topology is introduced (Sects. 3.2 and 3.2.1).

  11. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  12. Do unreal assumptions pervert behaviour?

    DEFF Research Database (Denmark)

    Petersen, Verner C.

    After conducting a series of experiments involving economics students Miller concludes: "The experience of taking a course in microeconomics actually altered students' conceptions of the appropriateness of acting in a self-interested manner, not merely their definition of self-interest." Being taught the assumptions of neoclassical economics one might become inclined to expect others to act in a self-interested way. This may indicate that the canonical assumptions of economics in turn influence the views of its practitioners for instance in business administration. The management practice of Jack Welch may show how this works in practice. He became famous for promoting a system of internal competition, in which employees were divided into a three-category ranking with the top 20% being the stars, and the bottom 10% were weeded out. If such a scheme does not force employees to act in a self…

  13. On testing the missing at random assumption

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    Most approaches to learning from incomplete data are based on the assumption that unobserved values are missing at random (mar). While the mar assumption, as such, is not testable, it can become testable in the context of other distributional assumptions, e.g. the naive Bayes assumption. In this…

  14. Modern Cosmology: Assumptions and Limits

    Science.gov (United States)

    Hwang, Jai-Chan

    2012-06-01

    Physical cosmology tries to understand the Universe at large with its origin and evolution. Observational and experimental situations in cosmology do not allow us to proceed purely based on the empirical means. We examine in which sense our cosmological assumptions in fact have shaped our current cosmological worldview with consequent inevitable limits. Cosmology, as other branches of science and knowledge, is a construct of human imagination reflecting the popular belief system of the era. The question at issue deserves further philosophic discussions. In Whitehead's words, ``philosophy, in one of its functions, is the critic of cosmologies.'' (Whitehead 1925).

  15. Pre-steady-state Kinetics for Hydrolysis of Insoluble Cellulose by Cellobiohydrolase Cel7A

    DEFF Research Database (Denmark)

    Cruys-Bagger, Nicolaj; Olsen, Jens Elmerdahl; Præstgaard, Eigil

    2012-01-01

    …for the exo-acting cellulase Cel7A using amperometric biosensors and an explicit model for processive hydrolysis of cellulose. This analysis allows the identification of a pseudo-steady-state period and quantification of a processivity number as well as rate constants for the formation of a threaded enzyme complex, processive hydrolysis, and dissociation, respectively. These kinetic parameters elucidate limiting factors in the cellulolytic process. We concluded, for example, that Cel7A cleaves about four glycosidic bonds/s during processive hydrolysis. However, the results suggest that stalling…

  16. Underlying assumptions of developmental models

    Science.gov (United States)

    Britten, Roy J.

    1998-01-01

    These 10 obvious propositions make a model of the specification of form, intended to expose underlying assumptions of developmental biology for examination and future experimentation. (I) The control of development is by means of local interactions, rather than global control mechanisms. (II) A macromolecule near a specific site will bind by mass action. (III) Starting with a precursor cell, all cells are assembled automatically by specifically binding macromolecules. (IV) At the surface of cells are specific adhesion sites that determine how all cells bind to each other. (V) An organism will assemble automatically from parts (macromolecules, structures, and cells) specified by nuclear control factors. (VI) The nuclear control factors in each cell are from precursor cells and factors derived by signaling from other cells. (VII) The macromolecules that determine specific binding, cell adhesion, and signaling are controlled by nuclear control factors, and in a grand feedback the cell adhesion and signaling systems determine the nuclear factor patterns. (VIII) The embryonic precursor cells for organs, termed “precursor groups,” are linked by adhesion and signaling relationships. (IX) The precursor groups include precursors for regions of an organ and boundary cells between regions having few cell types, growing without additional specific cell-to-cell relationships. (X) Organs are held together by cell adhesion in functional relationships. Thus the form and function of the organism is specified entirely by local control mechanisms. Without global control systems, information for form is in the genes for structural proteins, adhesion molecules, control factors, signaling molecules, and their control regions. PMID:9689087

  17. Design assumptions for a stand for activation of solid-state targets on the internal beam of the AIC-144 cyclotron; Zalozenia uzytkowe do projektu stanowiska do aktywacji tarcz w stanie stalym na wiazce wewnetrznej cyklotronu AIC-144

    Energy Technology Data Exchange (ETDEWEB)

    Petelenz, B. [The H. Niewodniczanski Inst. of Nuclear Physics, Cracow (Poland)

    1997-09-01

    General assumptions for the design of the target activation stand at the AIC-144 cyclotron are presented. The project foresees production of the isotopes {sup 67}Ga, {sup 111}In, {sup 201}Tl, {sup 139}Ce, {sup 88}Y, {sup 123}I and {sup 211}At using various target backings. Directions concerning target cooling and beam parameters are also given. 25 refs, 1 tab.

  18. Catalyst Deactivation: Control Relevance of Model Assumptions

    Directory of Open Access Journals (Sweden)

    Bernt Lie

    2000-10-01

    Two principles for describing catalyst deactivation are discussed: one based on the deactivation mechanism, the other based on the activity and catalyst age distribution. When the model is based upon activity decay, it is common to use a mean activity developed from the steady-state residence-time distribution. We compare control-relevant properties of such an approach with those of a model based upon the deactivation mechanism. Using a continuous stirred tank reactor as an example, we show that the mechanistic approach and the population balance approach lead to identical models. However, common additional assumptions used for activity-based models lead to model properties that may deviate considerably from the correct ones.
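    A numerical sketch of the "mean activity" construction mentioned above, under two assumptions that are mine for illustration (not from the paper): first-order activity decay a(t) = exp(-kd*t), and the exponential steady-state residence-time distribution of an ideal CSTR, E(t) = exp(-t/tau)/tau. With these assumptions the average has the closed form 1/(1 + kd*tau), which the numerical integral reproduces.

```python
import math

# Average a first-order activity decay over the steady-state residence-time
# distribution of an ideal CSTR: a_bar = integral of a(t) * E(t) dt.
# Kinetics and numbers are illustrative assumptions only.

def mean_activity(kd, tau, dt=1e-3, t_max=200.0):
    """Left-Riemann integration of a_bar = ∫ exp(-kd*t) * exp(-t/tau)/tau dt."""
    a_bar, t = 0.0, 0.0
    while t < t_max:
        a_bar += math.exp(-kd * t) * math.exp(-t / tau) / tau * dt
        t += dt
    return a_bar

kd, tau = 0.5, 2.0
print(mean_activity(kd, tau))  # numerical integral
print(1.0 / (1.0 + kd * tau))  # closed form: 1/(1 + 0.5*2.0) = 0.5
```

The closed form follows because both factors are exponentials, so the integral is (1/tau)/(kd + 1/tau); the numerical value agrees to about three decimal places with this step size.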

  19. The relevance of "theory rich" bridge assumptions

    NARCIS (Netherlands)

    Lindenberg, S

    1996-01-01

    Actor models are increasingly being used as a form of theory building in sociology because they can better represent the causal mechanisms that connect macro variables. However, actor models need additional assumptions, especially so-called bridge assumptions, for filling in the relatively empty…

  20. Relaxing the zero-sum assumption in neutral biodiversity theory

    NARCIS (Netherlands)

    Haegeman, Bart; Etienne, Rampal S.

    2008-01-01

    The zero-sum assumption is one of the ingredients of the standard neutral model of biodiversity by Hubbell. It states that the community is saturated all the time, which in this model means that the total number of individuals in the community is constant over time, and therefore introduces a

  1. How Symmetrical Assumptions Advance Strategic Management Research

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Hallberg, Hallberg

    2014-01-01

    application domains of the theory. We argue that assumptional symmetry leads to theoretical advancement by promoting the development of theory with greater falsifiability and stronger ontological grounding. Thus, strategic management theory may be advanced by systematically searching for asymmetrical...

  2. Comparative Interpretation of Classical and Keynesian Fiscal Policies (Assumptions, Principles and Primary Opinions

    Directory of Open Access Journals (Sweden)

    Engin Oner

    2015-06-01

    Adam Smith being its founder, the Classical School gives prominence to supply, adopts an approach of unbiased finance, and holds that the economy is always in a state of full-employment equilibrium. In this system of thought, whose main philosophy is budget balance, which asserts that there is flexibility between prices and wages, and which regards public debt as an extraordinary instrument, interference of the state with economic and social life is frowned upon. In line with the views of classical thought, classical fiscal policy rests on three basic assumptions: the "Consumer State" assumption, the assumption that "public expenditures are always ineffectual", and the assumption concerning the "impartiality of the taxes and expenditure policies implemented by the state". On the other hand, the Keynesian School, founded by John Maynard Keynes, gives prominence to demand, adopts the approach of functional finance, and asserts that underemployment and over-employment equilibria exist in the economy alongside full-employment equilibrium, that problems cannot be solved through the invisible hand, that prices and wages are rigid, that interference of the state is essential, and that fiscal policies therefore have to be utilized effectively. Keynesian fiscal policy depends on three primary assumptions: the "Filter State" assumption, the assumption that "public expenditures are sometimes effective and sometimes ineffective or neutral", and the assumption that "the tax, debt and expenditure policies of the state can never be impartial".

  3. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  4. Unconditionally Secure and Universally Composable Commitments from Physical Assumptions

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Scafuro, Alessandra

    2013-01-01

    the usefulness of our compiler by providing two (constant-round) instantiations of ideal straight-line extractable commitment based on (malicious) PUFs [36] and stateless tamper-proof hardware tokens [26], therefore achieving the first unconditionally UC-secure commitment with malicious PUFs and stateless tokens, respectively. Our constructions are secure for adversaries creating arbitrarily malicious stateful PUFs/tokens. Previous results with malicious PUFs used either computational assumptions to achieve UC-secure commitments or were unconditionally secure but only in the indistinguishability sense [36]. Similarly, with stateless tokens, UC-secure commitments are known only under computational assumptions [13,24,15], while the (not UC) unconditional commitment scheme of [23] is secure only in a weaker model in which the adversary is not allowed to create stateful tokens. Besides allowing us to prove feasibility…

  5. Questioning Assumptions about Portfolio-Based Assessment.

    Science.gov (United States)

    Hamp-Lyons, Liz; Condon, William

    1993-01-01

    Reviews basic concepts and history of portfolio assessment as a useful means of evaluating student writing. Considers insights gained from the use of portfolio assessment. Questions five assumptions underlying portfolio assessment and suggests ways of working with portfolios that take into account new insights and perspectives. (HB)

  6. 10 CFR 436.14 - Methodological assumptions.

    Science.gov (United States)

    2010-01-01

    Title 10 (Energy), vol. 3 (2010-01-01): Methodological assumptions. Section 436.14, DEPARTMENT OF ENERGY, ENERGY CONSERVATION, FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS, Methodology and… period is as follows: (1) For evaluating and ranking alternative retrofits for an existing Federal…

  7. Evaluating model assumptions in item response theory

    NARCIS (Netherlands)

    Tijmstra, J.

    2013-01-01

    This dissertation deals with the evaluation of model assumptions in the context of item response theory. Item response theory, also known as modern test theory, provides a statistical framework for the measurement of psychological constructs that cannot be observed directly, such as intelligence or

  8. Mexican-American Cultural Assumptions and Implications.

    Science.gov (United States)

    Carranza, E. Lou

    The search for presuppositions of a people's thought is not new. Octavio Paz and Samuel Ramos have both attempted to describe the assumptions underlying the Mexican character. Paz described Mexicans as private, defensive, and stoic, characteristics taken to the extreme in the "pachuco." Ramos, on the other hand, described Mexicans as…

  9. Assumptions of Multiple Regression: Correcting Two Misconceptions

    Science.gov (United States)

    Williams, Matt N.; Gomez Grajales, Carlos Alberto; Kurkiewicz, Dason

    2013-01-01

    In 2002, an article entitled "Four assumptions of multiple regression that researchers should always test" by Osborne and Waters was published in "PARE." This article has gone on to be viewed more than 275,000 times (as of August 2013), and it is one of the first results displayed in a Google search for "regression…

  10. BOLD Noise Assumptions in fMRI

    NARCIS (Netherlands)

    Wink, Alle Meije; Roerdink, Jos B.T.M.

    2006-01-01

    This paper discusses the assumption of Gaussian noise in the blood-oxygenation-level-dependent (BOLD) contrast for functional MRI (fMRI). In principle, magnitudes in MRI images follow a Rice distribution. We start by reviewing differences between Rician and Gaussian noise. An analytic expression is

  11. Extracurricular Business Planning Competitions: Challenging the Assumptions

    Science.gov (United States)

    Watson, Kayleigh; McGowan, Pauric; Smith, Paul

    2014-01-01

    Business planning competitions [BPCs] are a commonly offered yet under-examined extracurricular activity. Given the extent of sceptical comment about business planning, this paper offers what the authors believe is a much-needed critical discussion of the assumptions that underpin the provision of such competitions. In doing so it is suggested…

  12. Artificial Intelligence: Underlying Assumptions and Basic Objectives.

    Science.gov (United States)

    Cercone, Nick; McCalla, Gordon

    1984-01-01

    Presents perspectives on methodological assumptions underlying research efforts in artificial intelligence (AI) and charts activities, motivations, methods, and current status of research in each of the major AI subareas: natural language understanding; computer vision; expert systems; search, problem solving, planning; theorem proving and logic…

  13. Critically Challenging Some Assumptions in HRD

    Science.gov (United States)

    O'Donnell, David; McGuire, David; Cross, Christine

    2006-01-01

    This paper sets out to critically challenge five interrelated assumptions prominent in the (human resource development) HRD literature. These relate to: the exploitation of labour in enhancing shareholder value; the view that employees are co-contributors to and co-recipients of HRD benefits; the distinction between HRD and human resource…

  14. Leakage-resilient cryptography from minimal assumptions

    DEFF Research Database (Denmark)

    Hazay, Carmit; López-Alt, Adriana; Wee, Hoeteck

    2013-01-01

    results under specific assumptions. As a building block of independent interest, we study a notion of weak hash-proof systems in the public-key and symmetric-key settings. While these inherit some of the interesting security properties of standard hash-proof systems, we can instantiate them under general...

  15. 47 CFR 214.3 - Assumptions.

    Science.gov (United States)

    2010-10-01

    Title 47 (Telecommunication), vol. 5 (2010-10-01): Assumptions. Section 214.3, OFFICE OF SCIENCE AND TECHNOLOGY POLICY AND NATIONAL SECURITY COUNCIL, PROCEDURES FOR THE USE AND…, will have authority to make new or revised assignments of radio frequencies in accordance with…

  16. On distributional assumptions and whitened cosine similarities

    DEFF Research Database (Denmark)

    Loog, Marco

    2008-01-01

    Recently, an interpretation of the whitened cosine similarity measure as a Bayes decision rule was proposed (C. Liu, "The Bayes Decision Rule Induced Similarity Measures," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 29, no. 6, pp. 1086-1090, June 2007). This communication makes the observation that some of the distributional assumptions made to derive this measure are very restrictive and, considered simultaneously, even inconsistent.

  17. Critically challenging some assumptions in HRD

    OpenAIRE

    O'Donnell, D.; McGuire, David; Cross, C

    2006-01-01

    This paper sets out to critically challenge five inter-related assumptions prominent in the HRD literature. These relate to: the exploitation of labour in enhancing shareholder value; the view that employees are co-contributors to and co-recipients of HRD benefits; the distinction between HRD and HRM; the relationship between HRD and unitarism; and the relationship between HRD and organisational and learning cultures. From a critical modernist perspective, it…

  18. The Arundel Assumption And Revision Of Some Large-Scale Maps ...

    African Journals Online (AJOL)

    The rather common practice of stating or using the Arundel Assumption without reference to appropriate mapping standards (except mention of its use for graphical plotting) is a major cause of inaccuracies in map revision. This paper describes an investigation to ascertain the applicability of the Assumption to the revision of ...

  19. Implicit Assumptions in Special Education Policy: Promoting Full Inclusion for Students with Learning Disabilities

    Science.gov (United States)

    Kirby, Moira

    2017-01-01

    Introduction: Everyday millions of students in the United States receive special education services. Special education is an institution shaped by societal norms. Inherent in these norms are implicit assumptions regarding disability and the nature of special education services. The two dominant implicit assumptions evident in the American…

  20. Questioning ten common assumptions about peatlands

    Directory of Open Access Journals (Sweden)

    University of Leeds Peat Club

    2017-07-01

    Peatlands have been widely studied in terms of their ecohydrology, carbon dynamics, ecosystem services and palaeoenvironmental archives. However, several assumptions are frequently made about peatlands in the academic literature, practitioner reports and the popular media which are either ambiguous or in some cases incorrect. Here we discuss the following ten common assumptions about peatlands: 1. the northern peatland carbon store will shrink under a warming climate; 2. peatlands are fragile ecosystems; 3. wet peatlands have greater rates of net carbon accumulation; 4. different rules apply to tropical peatlands; 5. peat is a single soil type; 6. peatlands behave like sponges; 7. Sphagnum is the main ‘ecosystem engineer’ in peatlands; 8. a single core provides a representative palaeo-archive from a peatland; 9. water-table reconstructions from peatlands provide direct records of past climate change; and 10. restoration of peatlands results in the re-establishment of their carbon sink function. In each case we consider the evidence supporting the assumption and, where appropriate, identify its shortcomings or ways in which it may be misleading.

  1. Limiting assumptions in molecular modeling: electrostatics.

    Science.gov (United States)

    Marshall, Garland R

    2013-02-01

    Molecular mechanics attempts to represent intermolecular interactions in terms of classical physics. Initial efforts assumed a point charge located at the atom center and coulombic interactions. It has been recognized over multiple decades that simply representing electrostatics with a charge on each atom failed to reproduce the electrostatic potential surrounding a molecule as estimated by quantum mechanics. Molecular orbitals are not spherically symmetrical, an implicit assumption of monopole electrostatics. This perspective reviews recent evidence that requires use of multipole electrostatics and polarizability in molecular modeling.
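    A small sketch of the point made above: a monopole (point-charge) potential is the same in every direction, while already the next term in the multipole expansion (an ideal dipole) is direction-dependent. That angular dependence is the anisotropy that atom-centered point charges cannot reproduce. Gaussian-style units (phi = q/r) and all numerical values here are illustrative assumptions.

```python
import math

# Compare the angular behavior of the first two multipole terms.
# phi_monopole is spherically symmetric; phi_dipole varies with direction.

def phi_monopole(q, r):
    """Potential of a point charge q at distance r (Gaussian units)."""
    return q / r

def phi_dipole(p, r, theta):
    """Potential of an ideal point dipole of moment p: p*cos(theta)/r^2."""
    return p * math.cos(theta) / r**2

r = 3.0
print(phi_monopole(1.0, r))              # identical in every direction
print(phi_dipole(0.5, r, 0.0))           # maximal along the dipole axis
print(phi_dipole(0.5, r, math.pi / 2))   # vanishes perpendicular to the axis
```

Fitting a single charge per atom averages away exactly this cos(theta) structure, which is the limitation motivating multipole electrostatics.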

  2. Explorations in statistics: the assumption of normality.

    Science.gov (United States)

    Curran-Everett, Douglas

    2017-09-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This twelfth installment of Explorations in Statistics explores the assumption of normality, an assumption essential to the meaningful interpretation of a t test. Although the data themselves can be consistent with a normal distribution, they need not be. Instead, it is the theoretical distribution of the sample mean or the theoretical distribution of the difference between sample means that must be roughly normal. The most versatile approach to assess normality is to bootstrap the sample mean, the difference between sample means, or t itself. We can then assess whether the distributions of these bootstrap statistics are consistent with a normal distribution by studying their normal quantile plots. If we suspect that an inference we make from a t test may not be justified, that is, if we suspect that the theoretical distribution of the sample mean or the theoretical distribution of the difference between sample means is not normal, then we can use a permutation method to analyze our data. Copyright © 2017 the American Physiological Society.
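    A minimal sketch of the bootstrap step described above, using illustrative made-up data: resample with replacement many times, collect the bootstrap distribution of the sample mean, and then inspect that distribution (for example with a normal quantile plot) for consistency with normality.

```python
import random
import statistics

# Bootstrap the sample mean: resample the data with replacement and record
# the mean of each resample. The data below are illustrative only.

def bootstrap_means(data, n_boot=2000, seed=42):
    rng = random.Random(seed)
    n = len(data)
    return [statistics.mean(rng.choices(data, k=n)) for _ in range(n_boot)]

data = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.2, 4.4, 5.8, 4.7]
boots = bootstrap_means(data)
print(statistics.mean(boots))   # close to the sample mean of the data
print(statistics.stdev(boots))  # approximates the standard error of the mean
```

The resulting list `boots` is what one would feed into a normal quantile plot; if its quantiles deviate clearly from a straight line, the normality assumption behind the t test is suspect.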

  3. Wetland distribution assumptions: consequences for Methane emissions

    Science.gov (United States)

    Kleinen, Thomas; Brovkin, Victor

    2017-04-01

    Wetlands are the largest natural source of methane to the atmosphere. While process models of wetland methane emissions have advanced considerably in recent years, all of these models critically depend on estimates of the methane-emitting area. These estimates are highly uncertain, however. We investigate several approaches for estimating the wetland area and the consequences these assumptions have for the spatial and temporal distributions of wetland methane emissions. For this investigation we use JSBACH, the land surface component of the Max Planck Institute Earth System Model MPI-ESM, extended with modules for the generation and soil transport of methane. We drive the model with an ensemble of simulations of climate over the historical period from the MPI-ESM CMIP5 archive, as well as observed climate from CRU-NCEP. We impose both static and dynamic wetland maps, as well as modelled wetland distributions, and determine the wetland methane emissions resulting from these estimates. Results are compared to methane fluxes from atmospheric inversions to evaluate the consequences of the assumptions on wetland area.

  4. Challenging the assumptions for thermal sensation scales

    DEFF Research Database (Denmark)

    Schweiker, Marcel; Fuchs, Xaver; Becker, Susanne

    2016-01-01

    extensively, which is suitable for describing a one-dimensional relationship between physical parameters of indoor environments and subjective thermal sensation. However, human thermal comfort is not merely a physiological but also a psychological phenomenon. Thus, it should be investigated how scales for its assessment could benefit from a multidimensional conceptualization. The common assumptions related to the usage of thermal sensation scales are challenged, empirically supported by two analyses. These analyses show that the relationship between temperature and subjective thermal sensation is non-linear and depends on the type of scale used. Moreover, the results signify that most people do not perceive the categories of the thermal sensation scale as equidistant and that the range of sensations regarded as ‘comfortable’ varies largely. Therefore, challenges known from experimental psychology (describing…

  5. Assumptions about Ecological Scale and Nature Knowing Best Hiding in Environmental Decisions

    Directory of Open Access Journals (Sweden)

    R. Bruce Hull

    2002-12-01

    Assumptions about nature are embedded in people's preferences for environmental policy and management. The people we interviewed justified preservationist policies using four assumptions about nature knowing best: nature is balanced, evolution is progressive, technology is suspect, and the Creation is perfect. They justified interventionist policies using three assumptions about nature: it is dynamic, inefficient, and robust. Unstated assumptions about temporal, spatial, and organizational scales further confuse discussions about nature. These findings confirm and extend findings from previous research. Data for our study were derived from interviews with people actively involved in negotiating the fate of forest ecosystems in southwest Virginia: landowners, forest advisors, scientists, state and federal foresters, loggers, and leaders in non-governmental environmental organizations. We argue that differing assumptions about nature constrain people's vision of what environmental conditions can and should exist, thereby constraining the future that can be negotiated. We recommend promoting ecological literacy and a biocultural approach to ecological science.

  6. Transsexual parenthood and new role assumptions.

    Science.gov (United States)

    Faccio, Elena; Bordin, Elena; Cipolletta, Sabrina

    2013-01-01

    This study explores the parental role of transsexuals and compares this to common assumptions about transsexuality and parentage. We conducted semi-structured interviews with 14 male-to-female transsexuals and 14 men, half parents and half non-parents, in order to explore four thematic areas: self-representation of the parental role, the description of the transsexual as a parent, the common representations of transsexuals as a parent, and male and female parental stereotypes. We conducted thematic and lexical analyses of the interviews using Taltac2 software. The results indicate that social representations of transsexuality and parenthood have a strong influence on processes of self-representation. Transsexual parents accurately understood conventional male and female parental prototypes and saw themselves as competent, responsible parents. They constructed their role based on affection toward the child rather than on the complementary role of their wives. In contrast, men's descriptions of transsexual parental roles were simpler and the descriptions of their parental role coincided with their personal experiences. These results suggest that the transsexual journey toward parenthood involves a high degree of re-adjustment, because their parental role does not coincide with a conventional one.

  7. Towards New Probabilistic Assumptions in Business Intelligence

    Directory of Open Access Journals (Sweden)

    Schumann Andrew

    2015-01-01

    Full Text Available One of the main assumptions of mathematical tools in science is the idea that reality is measurable and additive. For discovering the physical universe, additive measures such as mass, force, energy, temperature, etc. are used. Economics and conventional business intelligence try to continue this empiricist tradition, and their statistical and econometric tools appeal only to the measurable aspects of reality. However, many important variables of economic systems cannot be observable and additive in principle. These variables can be called symbolic values or symbolic meanings and studied within symbolic interactionism, the theory developed since George Herbert Mead and Herbert Blumer. In the statistical and econometric tools of business intelligence we accept only phenomena with causal connections measured by additive measures. In this paper we show that in the social world we deal with symbolic interactions, which can be studied by non-additive labels (symbolic meanings or symbolic values). For accepting the variety of such phenomena we should avoid additivity of basic labels and construct a new probabilistic method in business intelligence based on non-Archimedean probabilities.

  8. Challenging assumptions from emotion dysregulation psychological treatments.

    Science.gov (United States)

    Neacsiu, Andrada D; Smith, Megan; Fang, Caitlin M

    2017-09-01

    Contemporary treatments assume that the inability to downregulate negative emotional arousal is a key problem in the development and maintenance of psychopathology, and that a lack of effective regulation efforts and a preference for maladaptive regulation strategies are a primary mechanism. Though ubiquitous, there is limited empirical evidence to support this assumption. The aim of the current study was therefore to examine whether self-reported emotion dysregulation equated to difficulties reducing emotional arousal during a behavioral task and to primary use of maladaptive strategies to manage negative emotions. Forty-four anxious and depressed adults with high emotion dysregulation underwent a negative distress induction using autobiographical memory recall. After induction, participants were instructed to downregulate but were not given specific instructions on which strategies to use. Self-reported emotional arousal was assessed before and after induction and after regulation. Qualitative descriptions of regulation efforts were collected and coded into effective and maladaptive strategies. The task was successful in inducing emotional arousal, and participants were successful in their efforts to downregulate negative emotions. Additionally, effective regulation strategies were used more frequently than maladaptive strategies. Data collected were exclusively self-report and the sample size was small. Adults who report high emotion dysregulation may still have effective emotion regulation strategies in their behavioral repertoire and are more likely to engage in these effective strategies when given an unspecific prompt to regulate negative emotional arousal. Despite reporting problems with emotion regulation, adults with anxiety and depression can successfully downregulate distress when prompted to do so. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Psychopatholgy, fundamental assumptions and CD-4 T lymphocyte ...

    African Journals Online (AJOL)

    e.g. PTSD, depression, substance abuse) was associated with more negative fundamental assumptions. A secondary aim was to investigate whether psychopathology and fundamental assumptions were associated with a lower CD4 count.

  10. Roy's specific life values and the philosophical assumption of humanism.

    Science.gov (United States)

    Hanna, Debra R

    2013-01-01

    Roy's philosophical assumption of humanism, which is shaped by the veritivity assumption, is considered in terms of her specific life values and in contrast to the contemporary view of humanism. Like veritivity, Roy's philosophical assumption of humanism unites a theocentric focus with anthropological values. Roy's perspective enriches the mainly secular, anthropocentric assumption. In this manuscript, the basis for Roy's perspective of humanism will be discussed so that readers will be able to use the Roy adaptation model in an authentic manner.

  11. 46 CFR 67.239 - Requirements for assumptions of mortgages.

    Science.gov (United States)

    2010-10-01

    46 CFR Part 67 (Measurement of Vessels; Documentation of Vessels), Filing and Recording of Instruments-Mortgages, Preferred Mortgages, and Related Instruments, § 67.239 Requirements for assumptions of mortgages. An assumption of...

  12. Philosophy of Technology Assumptions in Educational Technology Leadership

    Science.gov (United States)

    Webster, Mark David

    2017-01-01

    A qualitative study using grounded theory methods was conducted to (a) examine what philosophy of technology assumptions are present in the thinking of K-12 technology leaders, (b) investigate how the assumptions may influence technology decision making, and (c) explore whether technological determinist assumptions are present. Subjects involved…

  13. The zero-sum assumption in neutral biodiversity theory

    NARCIS (Netherlands)

    Etienne, Rampal S.; Alonso, David; McKane, Alan J.

    2007-01-01

    The neutral theory of biodiversity as put forward by Hubbell in his 2001 monograph has received much criticism for its unrealistic simplifying assumptions. These are the assumptions of functional equivalence among different species (neutrality), the assumption of point mutation speciation, and the

  14. Solid-state voltammetry in a three electrode electrochemical cell-on-a-chip with a microlithographically defined microelectrode

    Energy Technology Data Exchange (ETDEWEB)

    Morita, M.; Longmire, M.L.; Murray, R.W.

    1988-12-15

    Microlithographic procedures are employed to fabricate electrochemical microcells with three coplanar gold-film electrodes (working, auxiliary, and reference) resting on a silicon wafer. The working electrode can be an 11 × 516 μm or 11 × 256 μm microband or an 11 × 11 μm microsquare, and the cell solution is a film of an ionically conducting polymer such as lithium triflate dissolved in poly(ethylene oxide). The cyclic voltammetry and chronoamperometry of electroactive species dissolved in these cell media are reported, and the chronoamperometric data are compared to theoretical current-time curves calculated for band microelectrodes. Cyclic voltammetry in an aqueous droplet on the microsquare gives pseudo-steady-state waves, the limiting currents of which quantitatively agree with those for microdisk electrodes of equivalent area. The microlithographically defined microcells can be produced in quantity and can be considered disposable electroanalytical devices, which can be advantageous for routine electroanalytical applications involving electrode-poisoning reaction systems.
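    The microdisk comparison above relies on the standard steady-state limiting-current expression for an inlaid microdisk, I_lim = 4nFDCa. A minimal sketch of that textbook formula follows; the numeric inputs are illustrative assumptions, not values taken from the paper.

```python
# Steady-state limiting current at an inlaid microdisk electrode:
#   I_lim = 4 * n * F * D * C * a
# All numeric inputs below are illustrative assumptions.

F = 96485.0  # Faraday constant, C/mol


def microdisk_limiting_current(n, D, C, a):
    """n: electrons transferred; D: diffusion coefficient (cm^2/s);
    C: bulk concentration (mol/cm^3); a: disk radius (cm)."""
    return 4.0 * n * F * D * C * a


# Example: 1-electron couple, D = 1e-5 cm^2/s, 1 mM (1e-6 mol/cm^3),
# disk radius 5.5 um (5.5e-4 cm) -- a current of roughly 2 nA
i = microdisk_limiting_current(1, 1e-5, 1e-6, 5.5e-4)
```

    A pseudo-steady-state wave whose plateau current matches this value would be consistent with diffusion-limited behavior at a disk of equivalent area.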

  15. Assessing Statistical Model Assumptions under Climate Change

    Science.gov (United States)

    Varotsos, Konstantinos V.; Giannakopoulos, Christos; Tombrou, Maria

    2016-04-01

    Most studies assess climate change impacts on air quality using chemical transport models coupled to climate models in an off-line mode, for various horizontal resolutions and different present and future time slices. A complementary approach is based on present-day empirical relationships between air pollutants and various meteorological variables, which are then extrapolated to the future. However, the extrapolation relies on assumptions such as that these relationships will retain their main characteristics in the future. In this study we focus on the ozone-temperature relationship. It is well known that, among a number of meteorological variables, temperature exhibits the highest correlation with ozone concentrations. This has led, in past years, to the development and application of statistical models with which the potential impact of increasing future temperatures on various ozone statistical targets was examined. To examine whether the ozone-temperature relationship retains its main characteristics under warmer temperatures, we analyze the relationship during the heatwave events of 2003 and 2006 in Europe. More specifically, we use available gridded daily maximum temperatures (E-OBS) and hourly ozone observations from different non-urban stations (EMEP) within the areas that were impacted by the two heatwave events. In addition, we compare the temperature distributions of the two events with temperatures from two future time periods, 2021-2050 and 2071-2100, from a number of regional climate models developed under the framework of the Cordex initiative (http://www.cordex.org) with a horizontal resolution of 12 x 12 km, based on different IPCC RCP emissions scenarios. A statistical analysis is performed on the ozone-temperature relationship for each station and for the two aforementioned years, which are then compared against the ozone-temperature relationships obtained from the rest of the available data series.
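    The extrapolation approach described above amounts to fitting a present-day ozone-temperature relationship and re-applying it under warmer temperatures. A minimal sketch on synthetic data follows; the linear form, slope, and noise level are assumptions for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "present-day" data: daily max temperature (degC) vs ozone.
# The linear relation and noise level are assumed for illustration.
temp = rng.uniform(15, 35, 200)
ozone = 2.0 * temp + 30.0 + rng.normal(0, 5, 200)

# Fit the empirical present-day relationship
slope, intercept = np.polyfit(temp, ozone, 1)

# Extrapolate to a temperature distribution shifted by +3 degC; this
# silently assumes the relationship keeps its present-day form, which
# is exactly the assumption the study sets out to test.
future_temp = temp + 3.0
projected_ozone = slope * future_temp + intercept
```

    Comparing the fitted slope during heatwave years against the slope from the rest of the record is one way to probe whether this assumption holds.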

  16. Providing security assurance in line with national DBT assumptions

    Science.gov (United States)

    Bajramovic, Edita; Gupta, Deeksha

    2017-01-01

    As worldwide energy requirements are increasing simultaneously with climate change and energy security considerations, States are thinking about building nuclear power to fulfill their electricity requirements and decrease their dependence on carbon fuels. New nuclear power plants (NPPs) must have comprehensive cybersecurity measures integrated into their design, structure, and processes. In the absence of effective cybersecurity measures, the impact of nuclear security incidents can be severe. Some of the current nuclear facilities were not specifically designed and constructed to deal with the new threats, including targeted cyberattacks. Thus, newcomer countries must consider the Design Basis Threat (DBT) as one of the security fundamentals during design of physical and cyber protection systems of nuclear facilities. IAEA NSS 10 describes the DBT as "comprehensive description of the motivation, intentions and capabilities of potential adversaries against which protection systems are designed and evaluated". Nowadays, many threat actors, including hacktivists, insider threat, cyber criminals, state and non-state groups (terrorists) pose security risks to nuclear facilities. Threat assumptions are made on a national level. Consequently, threat assessment closely affects the design structures of nuclear facilities. Some of the recent security incidents e.g. Stuxnet worm (Advanced Persistent Threat) and theft of sensitive information in South Korea Nuclear Power Plant (Insider Threat) have shown that these attacks should be considered as the top threat to nuclear facilities. Therefore, the cybersecurity context is essential for secure and safe use of nuclear power. In addition, States should include multiple DBT scenarios in order to protect various target materials, types of facilities, and adversary objectives. Development of a comprehensive DBT is a precondition for the establishment and further improvement of domestic state nuclear-related regulations in the

  17. Assumptions, Trust, and Names in Computer Security Protocols

    Science.gov (United States)

    2011-06-01

    NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis: Assumptions, Trust, and Names in Computer Security Protocols, by Charles Dylan Shearer; Master's thesis, 27-09-2010 to 17-06-2011, dated 21-6-2011. Keywords: computer security, protocol, assumption, belief, trust, naming. Unclassified.

  18. Investigating the Assumptions of Uses and Gratifications Research

    Science.gov (United States)

    Lometti, Guy E.; And Others

    1977-01-01

    Discusses a study designed to determine empirically the gratifications sought from communication channels and to test the assumption that individuals differentiate channels based on gratifications. (MH)

  19. Legal assumptions for a private company claim for additional (supplementary) payment

    Directory of Open Access Journals (Sweden)

    Šogorov Stevan

    2011-01-01

    Full Text Available The subject of analysis in this article is the legal assumptions that must be met before a private company may call for an additional payment. After introductory remarks, the discussion focuses on the existence of provisions regarding additional payment in the formation contract, or in a general resolution of the shareholders' meeting, as the starting point for the company's claim. The second assumption is a concrete resolution of the shareholders' meeting that creates individual obligations for additional payments. The third assumption is defined as definiteness regarding the sum of the payment and its due date. The sending of the claim by the relevant company body is set as the fourth legal assumption for realization of the company's right to claim additional payments from a member of the private company.

  20. Evaluation of Algebraic Reynolds Stress Model Assumptions Using Experimental Data

    Science.gov (United States)

    Jyoti, B.; Ewing, D.; Matovic, D.

    1996-11-01

    The accuracy of Rodi's ASM assumption is examined by evaluating the terms in the Reynolds stress transport equation and their modelled counterparts. The basic model assumption, Dτ_ij/Dt + ∂T_ijl/∂x_l = (τ_ij/k)(Dk/Dt + ∂T_l/∂x_l) (Rodi W., ZAMM, 56, pp. 219-221, 1976), can also be broken into two stronger assumptions: (1) Da_ij/Dt = 0 and (2) ∂T_ijl/∂x_l = (τ_ij/k)(∂T_l/∂x_l) (e.g., Taulbee D. B., Phys. of Fluids, 4(11), pp. 2555-2561, 1992). Fu et al. (Fu S., Huang P. G., Launder B. E. & Leschziner M. A., J. Fluid Eng., 110(2), pp. 216-221, 1988) examined the accuracy of Rodi's assumption using the results of RSM calculations of axisymmetric jets. Since the RSM results did not accurately predict the experimental results either, it may be useful to examine the basic ASM model assumptions using experimental data. The database of Hussein, Capp and George (Hussein H., Capp S. & George W., J.F.M., 258, pp. 31-75, 1994) is sufficiently detailed to evaluate the terms of the Reynolds stress transport equations individually, thus allowing both Rodi's and the stronger assumptions to be tested. For this flow, assumption (1) is well satisfied for all the components (including uv); however, assumption (2) does not seem as well satisfied.

  1. Basic assumptions in statistical analyses of data in biomedical ...

    African Journals Online (AJOL)

    If one or more assumptions are violated, an alternative procedure must be used to obtain valid results. This article aims at highlighting some basic assumptions in statistical analyses of data in biomedical sciences. Keywords: samples, independence, non-parametric, parametric, statistical analyses. Int. J. Biol. Chem. Sci. Vol.

  2. 40 CFR 761.2 - PCB concentration assumptions for use.

    Science.gov (United States)

    2010-07-01

    40 CFR Part 761 (Protection of Environment), Manufacturing, Processing, Distribution in Commerce, and Use Prohibitions, General, § 761.2 PCB concentration assumptions for use. (a)(1) Any person may..., oil-filled cable, and rectifiers whose PCB concentration is not established contain PCBs at < 50 ppm...

  3. 7 CFR 1779.88 - Transfers and assumptions.

    Science.gov (United States)

    2010-01-01

    7 CFR Part 1779 (Agriculture; Water and Waste Disposal Programs Guaranteed Loans), § 1779.88 Transfers and assumptions. ... transfer fees will be a standard fee plus the cost of the appraisal. (2) The lender will collect and submit... be filed, registered, or recorded as appropriate and legally permissible. (4) The assumption will be...

  4. Performance Appraisal Is Based on Five Major Assumptions.

    Science.gov (United States)

    Silver, Harvey A.

    This review of the performance appraisal process discusses the major assumptions on which performance appraisal is based, the general goals of performance appraisal, and the characteristics of effective performance appraisal programs. The author stresses the dependence of the process on the assumption that human behavior can be changed; he…

  5. How do people learn at the workplace? Investigating four workplace learning assumptions

    NARCIS (Netherlands)

    Kooken, J.P.; Duval, Erik; Ley, Tobias; Klamma, Ralf; de Hoog, Robert

    2007-01-01

    Any software development project is based on assumptions about the state of the world that probably will hold when it is fielded. Investigating whether they are true can be seen as an important task. This paper describes how an empirical investigation was designed and conducted for the EU funded

  6. International Work Group in Death, Dying and Bereavement: Assumptions and Principles Underlying Standards for Terminal Care.

    Science.gov (United States)

    Essence: Issues in the Study of Ageing, Dying, and Death, 1979

    1979-01-01

    States the general assumptions and principles underlying standards for terminal care for those who have initiated or are planning programs for the terminally ill. These are urgently needed in view of the rapid development of the hospice movement. Patient-oriented, family-oriented, and staff-oriented standards are also enumerated. (BEF)

  7. Depletion sampling in stream ecosystems: assumptions and techniques

    Science.gov (United States)

    Raleigh, Robert F.; Short, Cathleen

    1981-01-01

    Reliable fish and invertebrate population estimates depend on meeting the assumptions of the methods used for organism capture and data analysis. A review of several population estimation studies has indicated that assumptions of the removal method for population estimation are often violated. This paper outlines (1) procedures to assist in meeting the removal method assumptions, (2) an economical procedure to obtain reliable invertebrate population estimates by the removal method, and (3) a computer program (CAPTURE) designed to test the adequacy of study design and to analyze capture data where variable probability of removal exists.
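    The removal method discussed above has a standard closed-form special case, the two-pass removal (Zippin-type) estimator. The sketch below illustrates that textbook estimator only, not the CAPTURE program itself, and assumes a closed population with equal capture probability on both passes.

```python
# Two-pass removal (depletion) estimator: given catches c1 and c2 on two
# successive removal passes, estimate population size N and capture
# probability p. Assumes a closed population and equal catchability on
# both passes -- exactly the assumptions the paper warns are often violated.

def two_pass_removal(c1, c2):
    """c1, c2: catches on first and second removal passes (requires c1 > c2)."""
    if c1 <= c2:
        raise ValueError("removal assumption violated: catch did not decline")
    n_hat = c1 ** 2 / (c1 - c2)  # estimated population size
    p_hat = (c1 - c2) / c1       # estimated per-pass capture probability
    return n_hat, p_hat


n_hat, p_hat = two_pass_removal(120, 48)  # N_hat = 200, p_hat = 0.6
```

    When the catch fails to decline between passes, the estimator is undefined, which is one concrete way the removal assumptions can fail in practice.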

  8. The Immoral Assumption Effect: Moralization Drives Negative Trait Attributions.

    Science.gov (United States)

    Meindl, Peter; Johnson, Kate M; Graham, Jesse

    2016-04-01

    Jumping to negative conclusions about other people's traits is judged as morally bad by many people. Despite this, across six experiments (total N = 2,151), we find that multiple types of moral evaluations--even evaluations related to open-mindedness, tolerance, and compassion--play a causal role in these potentially pernicious trait assumptions. Our results also indicate that moralization affects negative-but not positive-trait assumptions, and that the effect of morality on negative assumptions cannot be explained merely by people's general (nonmoral) preferences or other factors that distinguish moral and nonmoral traits, such as controllability or desirability. Together, these results suggest that one of the more destructive human tendencies--making negative assumptions about others--can be caused by the better angels of our nature. © 2016 by the Society for Personality and Social Psychology, Inc.

  9. Target similarity effects: support for the parallel distributed processing assumptions.

    Science.gov (United States)

    Humphreys, M S; Tehan, G; O'Shea, A; Bolland, S W

    2000-07-01

    Recent research has begun to provide support for the assumptions that memories are stored as a composite and are accessed in parallel (Tehan & Humphreys, 1998). New predictions derived from these assumptions and from the Chappell and Humphreys (1994) implementation of these assumptions were tested. In three experiments, subjects studied relatively short lists of words. Some of the lists contained two similar targets (thief and theft) or two dissimilar targets (thief and steal) associated with the same cue (robbery). As predicted, target similarity affected performance in cued recall but not free association. Contrary to predictions, two spaced presentations of a target did not improve performance in free association. Two additional experiments confirmed and extended this finding. Several alternative explanations for the target similarity effect, which incorporate assumptions about separate representations and sequential search, are rejected. The importance of the finding that, in at least one implicit memory paradigm, repetition does not improve performance is also discussed.

  10. Supporting calculations and assumptions for use in WESF safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hey, B.E.

    1997-03-07

    This document provides a single location for calculations and assumptions used in support of Waste Encapsulation and Storage Facility (WESF) safety analyses. It also provides the technical details and bases necessary to justify the contained results.

  11. Verification of test battery of motoric assumptions for tennis

    OpenAIRE

    Křelina, Vladimír

    2016-01-01

    This thesis focuses on testing the motoric assumptions of junior-category tennis players in certain sport games. The aim of this thesis is to compare the results of the motoric tests of three tennis players of various performance levels in chosen sport games, and thus define the substantive significance and specificity of each test for tennis. The assumptions in the theoretical part are based on my bachelor's thesis, in which I deal with the characteristics of tennis, the s...

  12. Developing animals flout prominent assumptions of ecological physiology.

    Science.gov (United States)

    Burggren, Warren W

    2005-08-01

    Every field of biology has its assumptions, but when they grow to be dogma, they can become constraining. This essay presents data-based challenges to several prominent assumptions of developmental physiologists. The ubiquity of allometry is such an assumption, yet animal development is characterized by rate changes that are counter to allometric predictions. Physiological complexity is assumed to increase with development, but examples are provided showing that complexity can be greatest at intermediate developmental stages. It is assumed that organs have functional equivalency in embryos and adults, yet embryonic structures can have quite different functions than inferred from adults. Another assumption challenged is the duality of neural control (typically sympathetic and parasympathetic), since one of these two regulatory mechanisms typically considerably precedes in development the appearance of the other. A final assumption challenged is the notion that divergent phylogeny creates divergent physiologies in embryos just as in adults, when in fact early in development disparate vertebrate taxa show great quantitative as well as qualitative similarity. Collectively, the inappropriateness of these prominent assumptions based on adult studies suggests that investigation of embryos, larvae and fetuses be conducted with appreciation for their potentially unique physiologies.

  13. Simplified subsurface modelling: data assimilation and violated model assumptions

    Science.gov (United States)

    Erdal, Daniel; Lange, Natascha; Neuweiler, Insa

    2017-04-01

    Integrated models are gaining more and more attention in hydrological modelling as they can better represent the interaction between different compartments. Naturally, these models come along with larger numbers of unknowns and requirements on computational resources compared to stand-alone models. If large model domains are to be represented, e.g. on catchment scale, the resolution of the numerical grid needs to be reduced or the model itself needs to be simplified. Both approaches lead to a reduced ability to reproduce the present processes. This lack of model accuracy may be compensated by using data assimilation methods. In these methods observations are used to update the model states, and optionally model parameters as well, in order to reduce the model error induced by the imposed simplifications. What is unclear is whether these methods combined with strongly simplified models result in completely data-driven models or if they can even be used to make adequate predictions of the model state for times when no observations are available. In the current work we consider the combined groundwater and unsaturated zone, which can be modelled in a physically consistent way using 3D-models solving the Richards equation. For use in simple predictions, however, simpler approaches may be considered. The question investigated here is whether a simpler model, in which the groundwater is modelled as a horizontal 2D-model and the unsaturated zones as a few sparse 1D-columns, can be used within an Ensemble Kalman filter to give predictions of groundwater levels and unsaturated fluxes. This is tested under conditions where the feedback between the two model-compartments are large (e.g. shallow groundwater table) and the simplification assumptions are clearly violated. Such a case may be a steep hill-slope or pumping wells, creating lateral fluxes in the unsaturated zone, or strong heterogeneous structures creating unaccounted flows in both the saturated and unsaturated
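    The data assimilation step described above can be sketched with a generic stochastic Ensemble Kalman filter analysis update. This is a textbook form of the method; the toy state (three groundwater heads), the observation operator, and the noise levels are assumptions for illustration, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(1)


def enkf_update(ensemble, H, y, obs_std):
    """Stochastic EnKF analysis step.
    ensemble: (n_state, n_ens) forecast states
    H: (n_obs, n_state) linear observation operator
    y: (n_obs,) observation vector
    obs_std: observation error standard deviation
    """
    n_state, n_ens = ensemble.shape
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    P = X @ X.T / (n_ens - 1)                     # sample forecast covariance
    R = obs_std ** 2 * np.eye(len(y))             # observation error covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    # Perturb the observation for each member (stochastic EnKF variant)
    Y = y[:, None] + rng.normal(0.0, obs_std, (len(y), n_ens))
    return ensemble + K @ (Y - H @ ensemble)


# Toy example: 3 state variables (e.g. groundwater head at 3 locations),
# with only the first location observed.
ens = rng.normal(10.0, 1.0, (3, 50))
H = np.array([[1.0, 0.0, 0.0]])
analysis = enkf_update(ens, H, np.array([12.0]), 0.1)
```

    Because the gain uses the ensemble cross-covariances, the unobserved states are updated too; whether such updates remain physically meaningful when the model's simplifying assumptions are violated is precisely the question raised above.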

  14. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    DISCOURSES AND THEORETICAL ASSUMPTIONS IN IT PROJECT PORTFOLIO MANAGEMENT: A REVIEW OF THE LITERATURE These years increasing interest is put on IT project portfolio management (IT PPM). Considering IT PPM an interdisciplinary practice, we conduct a concept-based literature review of relevant......: (1) IT PPM as the top management marketplace, (2) IT PPM as the cause of social dilemmas at the lower organizational levels (3) IT PPM as polity between different organizational interests, (4) IT PPM as power relations that suppress creativity and diversity. Our metaphors can be used by practitioners...... to articulate and discuss underlying and conflicting assumptions in IT PPM, serving as a basis for adjusting organizations’ IT PPM practices. Keywords: IT project portfolio management or IT PPM, literature review, scientific discourses, underlying assumptions, unintended consequences, epistemological biases...

  15. Evolution of Requirements and Assumptions for Future Exploration Missions

    Science.gov (United States)

    Anderson, Molly; Sargusingh, Miriam; Perry, Jay

    2017-01-01

    NASA programs are maturing technologies, systems, and architectures to enable future exploration missions. To increase fidelity as technologies mature, developers must make assumptions that represent the requirements of a future program. Multiple efforts have begun to define these requirements, including team-internal assumptions, planning system integration for early demonstrations, and discussions between international partners planning future collaborations. For many detailed life support system requirements, existing NASA documents set limits of acceptable values, but a future vehicle may be constrained in other ways and select a limited range of conditions. Other requirements are effectively set by interfaces or operations, and may be different for the same technology depending on whether the hardware is a demonstration system on the International Space Station or a critical component of a future vehicle. This paper highlights key assumptions representing potential life support requirements, along with explanations of the driving scenarios, constraints, or other issues behind them.

  17. Universally composable protocols with relaxed set-up assumptions

    DEFF Research Database (Denmark)

    Barak, Boaz; Canetti, Ran; Nielsen, Jesper Buus

    2004-01-01

    ... allow for UC protocols. We answer this question in the affirmative: we propose alternative and relaxed set-up assumptions and show that they suffice for reproducing the general feasibility results for UC protocols in the CRS model. These alternative assumptions have the flavor of a "public-key infrastructure": parties have registered public keys, no single registration authority needs to be fully trusted, and no single piece of information has to be globally trusted and available. In addition, unlike known protocols in the CRS model, the proposed protocols guarantee some basic level of security even...

  18. Changing Assumptions and Progressive Change in Theories of Strategic Organization

    DEFF Research Database (Denmark)

    Foss, Nicolai J.; Hallberg, Niklas L.

    2017-01-01

    A commonly held view is that strategic organization theories progress as a result of a Popperian process of bold conjectures and systematic refutations. However, our field also witnesses vibrant debates or disputes about the specific assumptions that our theories rely on, and although these debates...... are often decoupled from the results of empirical testing, changes in assumptions seem closely intertwined with theoretical progress. Using the case of the resource-based view, we suggest that progressive change in theories of strategic organization may come about as a result of scholarly debate and dispute...

  19. Roy's specific life values and the philosophical assumption of veritivity.

    Science.gov (United States)

    Hanna, Debra R

    2012-07-01

    Roman Catholic beliefs that form the basis for Roy's life values are discussed to help others understand veritivity and the Roy adaptation model more clearly. Veritivity, the main philosophical assumption of the Roy adaptation model, shapes it, and Roy's assumption of humanism in a unique way. Veritivity has a theocentric focus, with anthropological values. Roy views human beings as individuals in community with a loving Creator and with others. Truth, freedom, and moral ends are discussed in terms of veritivity and in terms of contemporary values.

  20. Challenging Teachers' Pedagogic Practice and Assumptions about Social Media

    Science.gov (United States)

    Cartner, Helen C.; Hallas, Julia L.

    2017-01-01

    This article describes an innovative approach to professional development designed to challenge teachers' pedagogic practice and assumptions about educational technologies such as social media. Developing effective technology-related professional development for teachers can be a challenge for institutions and facilitators who provide this…

  1. Extension of the GSMW Formula in Weaker Assumptions

    Directory of Open Access Journals (Sweden)

    Wenfeng Wang

    2014-01-01

    Full Text Available In this note, the generalized Sherman-Morrison-Woodbury (GSMW for short) formula (A+YGZ∗)⊙ = A⊙ − A⊙Y(G⊙ + Z∗A⊙Y)⊙Z∗A⊙ is extended under some assumptions weaker than those used by Duan, 2013.
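When the generalized inverses in the GSMW formula are ordinary matrix inverses, the identity reduces to the classical Sherman-Morrison-Woodbury formula. A quick numerical sanity check of that classical case (an illustrative sketch only, not the weaker-assumption extension proved in the paper; the dimensions and random matrices are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 2
A = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonally shifted, so invertible
Y = rng.standard_normal((n, k))
G = rng.standard_normal((k, k)) + k * np.eye(k)
Z = rng.standard_normal((n, k))

Ainv = np.linalg.inv(A)
Ginv = np.linalg.inv(G)

# Classical SMW: (A + Y G Z*)^{-1} = A^{-1} - A^{-1} Y (G^{-1} + Z* A^{-1} Y)^{-1} Z* A^{-1}
lhs = np.linalg.inv(A + Y @ G @ Z.conj().T)
rhs = Ainv - Ainv @ Y @ np.linalg.inv(Ginv + Z.conj().T @ Ainv @ Y) @ Z.conj().T @ Ainv

print(np.allclose(lhs, rhs))  # the two sides agree to numerical precision
```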

  2. Questioning Engelhardt's assumptions in Bioethics and Secular Humanism.

    Science.gov (United States)

    Ahmadi Nasab Emran, Shahram

    2016-06-01

    In Bioethics and Secular Humanism: The Search for a Common Morality, Tristram Engelhardt examines various possibilities of finding common ground for moral discourse among people from different traditions and concludes their futility. In this paper I will argue that many of the assumptions on which Engelhardt bases his conclusion about the impossibility of a content-full secular bioethics are problematic. By starting with the notion of moral strangers, there is no possibility, by definition, for a content-full moral discourse among moral strangers. It means that there is circularity in starting the inquiry with a definition of moral strangers, which implies that they do not share enough moral background or commitment to an authority to allow for reaching a moral agreement, and concluding that content-full morality is impossible among moral strangers. I argue that assuming traditions as solid and immutable structures that insulate people across their boundaries is problematic. Another questionable assumption in Engelhardt's work is the idea that religious and philosophical traditions provide content-full moralities. As the cardinal assumption in Engelhardt's review of the various alternatives for a content-full moral discourse among moral strangers, I analyze his foundationalist account of moral reasoning and knowledge and indicate the possibility of other ways of moral knowledge, besides the foundationalist one. Then, I examine Engelhardt's view concerning the futility of attempts at justifying a content-full secular bioethics, and indicate how the assumptions have shaped Engelhardt's critique of the alternatives for the possibility of content-full secular bioethics.

  3. DDH-like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike

    2011-01-01

    We introduce and study a new type of DDH-like assumptions based on groups of prime order q. Whereas standard DDH is based on encoding elements of F_{q} ``in the exponent'' of elements in the group, we ask what happens if instead we put in the exponent elements of the extension ring R_f= \\F...

  4. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)


    This kind of distribution through assumption and commitment is not novel. This happens routinely when one develops subsystems without having access to a global view of the system, for example, when different groups develop parts of a large program. For instance, suppose we are designing a receiver that receives a bit ...

  5. Assumptions behind size-based ecosystem models are realistic

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Blanchard, Julia L.; Fulton, Elizabeth A.

    2016-01-01

    A recent publication about balanced harvesting (Froese et al., ICES Journal of Marine Science; doi:10.1093/icesjms/fsv122) contains several erroneous statements about size-spectrum models. We refute the statements by showing that the assumptions pertaining to size-spectrum models discussed...

  6. Does Artificial Neural Network Support Connectivism's Assumptions?

    Science.gov (United States)

    AlDahdouh, Alaa A.

    2017-01-01

    Connectivism was presented as a learning theory for the digital age and connectivists claim that recent developments in Artificial Intelligence (AI) and, more specifically, Artificial Neural Network (ANN) support their assumptions of knowledge connectivity. Yet, very little has been done to investigate this brave allegation. Does the advancement…

  7. Origins and Traditions in Comparative Education: Challenging Some Assumptions

    Science.gov (United States)

    Manzon, Maria

    2018-01-01

    This article questions some of our assumptions about the history of comparative education. It explores new scholarship on key actors and ways of knowing in the field. Building on the theory of the social constructedness of the field of comparative education, the paper elucidates how power shapes our scholarly histories and identities.

  8. Operation Cottage: A Cautionary Tale of Assumption and Perceptual Bias

    Science.gov (United States)

    2015-01-01

    the planning process, but the planning staff must not become so wedded to their assumptions that they reject or overlook information that is not in...Allied decisionmakers misread and misunderstood Japanese intentions on Kiska, facilitating a needless loss of blood and treasure. Epilogue Two tense

  9. Bilingual Learners: How Our Assumptions Limit Their World.

    Science.gov (United States)

    Freeman, David; Freeman, Yvonne

    Five common assumptions are held by teachers about learners: (1) adults should choose what children need to learn; (2) oral language must be mastered before written language can be introduced; (3) real, whole language is too difficult for students learning language; (4) language learning is different in different languages, and simultaneous…

  10. 7 CFR 1980.476 - Transfer and assumptions.

    Science.gov (United States)

    2010-01-01

    ... number of the transferor and transferee. (m) Loan terms cannot be changed by the Assumption agreement... released from personal liability. Any new loan terms cannot exceed those authorized in this subpart. The... loan terms. (2) Certification that the lien position securing the guaranteed loan will be maintained or...

  11. 7 CFR 3575.88 - Transfers and assumptions.

    Science.gov (United States)

    2010-01-01

    ... Agency case number of the transferor and transferee. (5) Loan terms cannot be changed by the Assumption... transferor (including guarantor if it has not been released from personal liability). Any new loan terms... explanation of the reasons for the proposed change in the loan terms, and (ii) Certification that the lien...

  12. Observing gravitational-wave transient GW150914 with minimal assumptions

    NARCIS (Netherlands)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Phythian-Adams, A.T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwa, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. C.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, R.D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, M.J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blackburn, L.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, A.L.S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, J.G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, T.C; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brocki, P.; Brooks, A. F.; Brown, A.D.; Brown, D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderon Bustillo, J.; Callister, T. A.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Diaz, J. 
Casanueva; Casentini, C.; Caudill, S.; Cavaglia, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Baiardi, L. Cerboni; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chatterji, S.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, D. S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Qian; Chua, S. E.; Chung, E.S.; Ciani, G.; Clara, F.; Clark, J. A.; Clark, M.; Cleva, F.; Coccia, E.; Cohadon, P. -F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, A.C.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J. -P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, A.L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; Debra, D.; Debreczeni, G.; Degallaix, J.; De laurentis, M.; Deleglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.A.; DeRosa, R. T.; Rosa, R.; DeSalvo, R.; Dhurandhar, S.; Diaz, M. C.; Di Fiore, L.; Giovanni, M.G.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H. -B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, T. M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.M.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. 
C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. R.; Flaminio, R.; Fletcher, M; Fournier, J. -D.; Franco, S; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritsche, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.P.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; Gonzalez, Idelmis G.; Castro, J. M. Gonzalez; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Lee-Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.M.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; de Haas, R.; Hacker, J. J.; Buffoni-Hall, R.; Hall, E. D.; Hammond, G.L.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, P.J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C. -J.; Haughian, K.; Healy, J.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hinder, I.; Hoak, D.; Hodge, K. A.; Hofman, D.; Hollitt, S. E.; Holt, K.; Holz, D. E.; Hopkins, P.; Hosken, D. J.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Idrisy, A.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J. -M.; Isi, M.; Islas, G.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, D.H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jimenez-Forteza, F.; Johnson, W.; Jones, I.D.; Jones, R.; Jonker, R. J. G.; Ju, L.; Haris, K.; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.H.; Kanner, J. 
B.; Karki, S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kawazoe, F.; Kefelian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.E.; Key, J. S.; Khalaidovski, A.; Khalili, F. Y.; Khan, I.; Khan., S.; Khan, Z.; Khazanov, E. A.; Kijhunchoo, N.; Kim, C.; Kim, J.; Kim, K.; Kim, Nam-Gyu; Kim, Namjun; Kim, Y.M.; King, E. J.; King, P. J.; Kinsey, M.; Kinzel, D. L.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Kokeyama, K.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krolak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Laguna, P.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Lazzarini, A.; Lazzaro, C.; Leaci, R.; Leavey, S.; Lebigot, E. O.; Lee, C.H.; Lee, K.H.; Lee, M.H.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B. M.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Logue, J.; Lombardi, A. L.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lueck, H.; Lundgren, A. P.; Luo, J.; Lynch, R.; Ma, Y.; MacDonald, T.T.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magana-Sandoval, F.; Magee, R. M.; Mageswaran, M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Marka, S.; Marka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R.M.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mende, G.; Mendoza-Gandara, D.; Mercer, R. A.; Merilh, E. 
L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B.C.; Moore, J.C.; Moraru, D.; Gutierrez Moreno, M.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, S.D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P.G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nedkova, K.; Nelemans, G.; Gutierrez-Neri, M.; Neunzert, A.; Newton-Howes, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J.; Oh, S. H.; Ohme, F.; Oliver, M. B.; Oppermann, P.; Oram, Richard J.; O'Reilly, B.; O'Shaughnessy, R.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Page, J.; Paris, H. R.; Parker, W.S; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Phelps, M.; Piccinni, O. J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poggiani, R.; Popolizio, P.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Premachandra, S. S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prolchorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Puerrer, M.; Qi, H.; Qin, J.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. 
S.; Radkins, H.; Raffai, P.; Raja, S.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosinska, D.; Rowan, S.; Ruediger, A.; Ruggi, P.; Ryan, K.A.; Sachdev, P.S.; Sadecki, T.; Sadeghian, L.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J; Schmidt, P.; Schnabel, R.B.; Schofield, R. M. S.; Schoenbeck, A.; Schreiber, K.E.C.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, M.S.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Serna, G.; Setyawati, Y.; Sevigny, A.; Shaddock, D. A.; Shah, S.; Shithriar, M. S.; Shaltev, M.; Shao, Z.M.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sigg, D.; Silva, António Dias da; Simakov, D.; Singer, A; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, R. J. E.; Smith, N.D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stone, J.R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S. E.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sutton, P. J.; Swinkels, B. L.; Szczepanczyk, M. J.; Tacca, M.D.; Talukder, D.; Tanner, D. B.; Tapai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, W.R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Tonelli, M.; Torres, C. 
V.; Torrie, C. I.; Toyra, D.; Travasso, F.; Traylor, G.; Trifiro, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlhruch, H.; Vajente, G.; Valdes, G.; Van Bakel, N.; Van Beuzekom, Martin; Van den Brand, J. F. J.; Van Den Broeck, C.F.F.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasuth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, R. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Vicere, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J. -Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D. V.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, MT; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L. -W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.M.; Wessels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; White, D. J.; Whiting, B. F.; Williams, D.; Williams, D.R.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Wright, J.L.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, H.; Yvert, M.; Zadrozny, A.; Zangrando, L.; Zanolin, M.; Zendri, J. -P.; Zevin, M.; Zhang, F.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.

    2016-01-01

    The gravitational-wave signal GW150914 was first identified on September 14, 2015, by searches for short-duration gravitational-wave transients. These searches identify time-correlated transients in multiple detectors with minimal assumptions about the signal morphology, allowing them to be

  13. Measuring Productivity Change without Neoclassical Assumptions: A Conceptual Analysis

    NARCIS (Netherlands)

    B.M. Balk (Bert)

    2008-01-01

    The measurement of productivity change (or difference) is usually based on models that make use of strong assumptions such as competitive behaviour and constant returns to scale. This survey discusses the basics of productivity measurement and shows that one can dispense with most if not

  14. The Metatheoretical Assumptions of Literacy Engagement: A Preliminary Centennial History

    Science.gov (United States)

    Hruby, George G.; Burns, Leslie D.; Botzakis, Stergios; Groenke, Susan L.; Hall, Leigh A.; Laughter, Judson; Allington, Richard L.

    2016-01-01

    In this review of literacy education research in North America over the past century, the authors examined the historical succession of theoretical frameworks on students' active participation in their own literacy learning, and in particular the metatheoretical assumptions that justify those frameworks. The authors used "motivation" and…

  15. Unpacking Assumptions in Research Synthesis: A Critical Construct Synthesis Approach

    Science.gov (United States)

    Wolgemuth, Jennifer R.; Hicks, Tyler; Agosto, Vonzell

    2017-01-01

    Research syntheses in education, particularly meta-analyses and best-evidence syntheses, identify evidence-based practices by combining findings across studies whose constructs are similar enough to warrant comparison. Yet constructs come preloaded with social, historical, political, and cultural assumptions that anticipate how research problems…

  16. Deep Borehole Field Test Requirements and Controlled Assumptions.

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This document presents design requirements and controlled assumptions intended for use in the engineering development and testing of: 1) prototype packages for radioactive waste disposal in deep boreholes; 2) a waste package surface handling system; and 3) a subsurface system for emplacing and retrieving packages in deep boreholes. Engineering development and testing is being performed as part of the Deep Borehole Field Test (DBFT; SNL 2014a). This document presents parallel sets of requirements for a waste disposal system and for the DBFT, showing the close relationship. In addition to design, it will also inform planning for drilling, construction, and scientific characterization activities for the DBFT. The information presented here follows typical preparations for engineering design. It includes functional and operating requirements for handling and emplacement/retrieval equipment, waste package design and emplacement requirements, borehole construction requirements, sealing requirements, and performance criteria. Assumptions are included where they could impact engineering design. Design solutions are avoided in the requirements discussion. ACKNOWLEDGEMENTS: This set of requirements and assumptions has benefited greatly from reviews by Gordon Appel, Geoff Freeze, Kris Kuhlman, Bob MacKinnon, Steve Pye, David Sassani, Dave Sevougian, and Jiann Su.

  17. Challenging Our Assumptions: Helping a Baby Adjust to Center Care.

    Science.gov (United States)

    Elliot, Enid

    2003-01-01

    Contends that assumptions concerning infants' adjustment to child center care need to be tempered with attention to observation, thought, and commitment to each individual baby. Describes the Options Daycare program for pregnant teens and young mothers. Presents a case study illustrating the need for openness in strategy and planning for…

  18. The sufficiency assumption of the reasoned approach to action

    Directory of Open Access Journals (Sweden)

    David Trafimow

    2015-12-01

    Full Text Available The reasoned action approach to understanding and predicting behavior includes the sufficiency assumption. Although variables not included in the theory may influence behavior, these variables work through the variables in the theory. Once the reasoned action variables are included in an analysis, the inclusion of other variables will not increase the variance accounted for in behavioral intentions or behavior. Reasoned action researchers are very concerned with testing whether new variables account for variance (or how much variance traditional variables account for) to see whether they are important, in general or with respect to specific behaviors under investigation. But this approach tacitly assumes that accounting for variance is highly relevant to understanding the production of variance, which is what really is at issue. Based on the variance law, I question this assumption.
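The sufficiency assumption described above can be illustrated numerically: if a distal variable influences intention only through the theory's variables (here labelled attitude and norm, as in the reasoned action tradition), then adding it to a regression that already contains those variables leaves the variance accounted for essentially unchanged. A sketch on simulated toy data; all coefficients and variable names are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
distal = rng.standard_normal(n)                     # a variable outside the theory
attitude = 0.8 * distal + 0.2 * rng.standard_normal(n)
norm = 0.5 * distal + 0.5 * rng.standard_normal(n)
# Under sufficiency, the distal variable affects intention only via attitude and norm:
intention = 0.6 * attitude + 0.4 * norm + 0.1 * rng.standard_normal(n)

def r_squared(y, predictors):
    """R^2 of an OLS regression of y on an intercept plus the given predictors."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_base = r_squared(intention, [attitude, norm])
r2_full = r_squared(intention, [attitude, norm, distal])
print(r2_full - r2_base)  # near zero: the distal variable adds no explained variance
```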

  19. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

    A number of waste life cycle assessment (LCA) models have been gradually developed since the early 1990s, in a number of countries, usually independently from each other. Large discrepancies in results have been observed among different waste LCA models, although it has also been shown that results from different LCA studies can be consistent. This paper is an attempt to identify, review and analyse methodologies and technical assumptions used in various parts of selected waste LCA models, such as the functional unit, system boundaries, waste composition and energy modelling. The modelling assumptions of waste management processes, ranging from collection, transportation, intermediate facilities, recycling, thermal treatment, biological treatment, and landfilling, are obviously critical when comparing... Several criteria were identified, which could have significant impacts on the results...

  20. Evaluating The Markov Assumption For Web Usage Mining

    DEFF Research Database (Denmark)

    Jespersen, S.; Pedersen, Torben Bach; Thorhauge, J.

    2003-01-01

    Web usage mining concerns the discovery of common browsing patterns, i.e., pages requested in sequence, from web logs. To cope with the enormous amounts of data, several aggregated structures based on statistical models of web surfing have appeared, e.g., the Hypertext Probabilistic Grammar (HPG)... To our knowledge there has been no systematic study of the validity of the Markov assumption w.r.t. web usage mining and the resulting quality of the mined browsing patterns. In this paper we systematically investigate the quality of browsing patterns mined from structures based on the Markov assumption. Formal..., that long rules are generally more distorted than shorter rules and that the model yields knowledge of a higher quality when applied to more random usage patterns. Thus we conclude that Markov-based structures for web usage mining are best suited for tasks demanding less accuracy such as pre...
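The first-order Markov assumption underlying such aggregated structures can be illustrated with a toy click log: the probability the Markov model assigns to a path (the product of estimated transition probabilities) can differ noticeably from the path's empirical frequency, which is the kind of distortion the paper studies for longer rules. A sketch with hypothetical session data:

```python
from collections import Counter

# Hypothetical sessions: each is a sequence of requested pages
sessions = [
    ["home", "a", "b", "buy"],
    ["home", "a", "b", "buy"],
    ["home", "b", "a"],
    ["home", "a", "b"],
    ["home", "b", "a", "buy"],
]

# Estimate first-order transition probabilities from the log
pair_counts = Counter()
state_counts = Counter()
for s in sessions:
    for x, y in zip(s, s[1:]):
        pair_counts[(x, y)] += 1
        state_counts[x] += 1

def markov_prob(path):
    """Probability of a path under the first-order Markov assumption."""
    p = 1.0
    for x, y in zip(path, path[1:]):
        p *= pair_counts[(x, y)] / state_counts[x]
    return p

def empirical_prob(path):
    """Fraction of sessions that actually begin with the given path."""
    hits = sum(1 for s in sessions if s[: len(path)] == path)
    return hits / len(sessions)

path = ["home", "a", "b", "buy"]
print(markov_prob(path), empirical_prob(path))  # 0.225 vs 0.4: the model distorts long paths
```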

  1. Expressing Environment Assumptions and Real-time Requirements for a Distributed Embedded System with Shared Variables

    DEFF Research Database (Denmark)

    Tjell, Simon; Fernandes, João Miguel

    2008-01-01

    In a distributed embedded system, it is often necessary to share variables among its computing nodes to allow the distribution of control algorithms. It is therefore necessary to include a component in each node that provides the service of variable sharing. For that type of component, this paper...... discusses how to create a Colored Petri Nets (CPN) model that formally expresses the following elements in a clearly separated structure: (1) assumptions about the behavior of the environment of the component, (2) real-time requirements for the component, and (3) a possible solution in terms of an algorithm...... for the component. The CPN model can be used to validate the environment assumptions and the requirements. The validation is performed by execution of the model during which traces of events and states are automatically generated and evaluated against the requirements....

  2. On the minimum set of physical assumption leading to the

    Energy Technology Data Exchange (ETDEWEB)

    Bobbio, S.; Marrucci, G. [Naples Univ. `Federico II` (Italy). School of Engineering

    1996-11-01

    The way of obtaining the Schroedinger equation for a quantum particle in an electromagnetic field is revisited, showing that very few physical assumptions are required. In fact, after having introduced the general formalism of non-relativistic quantum mechanics, it is shown that the structure of the Schroedinger equation for a spinless particle is obtained merely by requiring continuity of space and time, and covariance with respect to Galilean transformations. Both the Correspondence and Uncertainty principles then become 'theorems'.

  3. How biological background assumptions influence scientific risk evaluation of stacked genetically modified plants: an analysis of research hypotheses and argumentations.

    Science.gov (United States)

    Rocca, Elena; Andersen, Fredrik

    2017-08-14

    Scientific risk evaluations are constructed by specific evidence, value judgements and biological background assumptions. The latter are the framework-setting suppositions we apply in order to understand some new phenomenon. That background assumptions co-determine choice of methodology, data interpretation, and choice of relevant evidence is an uncontroversial claim in modern basic science. Furthermore, it is commonly accepted that, unless explicated, disagreements in background assumptions can lead to misunderstanding as well as miscommunication. Here, we extend the discussion on background assumptions from basic science to the debate over genetically modified (GM) plants risk assessment. In this realm, while the different political, social and economic values are often mentioned, the identity and role of background assumptions at play are rarely examined. We use an example from the debate over risk assessment of stacked genetically modified plants (GM stacks), obtained by applying conventional breeding techniques to GM plants. There are two main regulatory practices of GM stacks: (i) regulate as conventional hybrids and (ii) regulate as new GM plants. We analyzed eight papers representative of these positions and found that, in all cases, additional premises are needed to reach the stated conclusions. We suggest that these premises play the role of biological background assumptions and argue that the most effective way toward a unified framework for risk analysis and regulation of GM stacks is by explicating and examining the biological background assumptions of each position. Once explicated, it is possible to either evaluate which background assumptions best reflect contemporary biological knowledge, or to apply Douglas' 'inductive risk' argument.

  4. Posttraumatic world assumptions among treatment-seeking refugees.

    Science.gov (United States)

    Ter Heide, F Jackie June; Sleijpen, Marieke; van der Aa, Niels

    2017-01-01

    The clinical relevance of negative changes in cognitions about oneself, others, and the world is reflected in the diagnostic criteria for posttraumatic stress disorder (PTSD) in the DSM-5 and complex posttraumatic stress disorder in the ICD-11. Although such changes in cognition have been posited to be especially relevant for traumatised refugees, few studies have examined this in refugee populations. The present study used a cross-sectional design to compare negative cognitions among 213 adult treatment-seeking refugees with those in previously published samples from the general population, veterans with combat-related PTSD, and whiplash victims. Measures included the World Assumptions Scale (WAS) and the Events and DSM-IV PTSD subscales of the Harvard Trauma Questionnaire (HTQ). Path models examined the relation of the WAS subscales to five demographic and trauma-related variables. Results showed that world assumptions were especially negative with regard to Benevolence of World, Benevolence of People, and Luck subscales, on which refugees scored lower than all reference samples. Differences between the refugee sample and the reference samples were smallest with regard to self-worth and self-controllability. World assumptions were associated with gender and PTSD symptom severity but not with age, length of residence in the Netherlands, and number of traumatic event types. The DSM-5 criterion of negative changes in belief about oneself, others, and the world appears more applicable to refugees than the more narrowly formulated ICD-11 criterion of diminished and defeated sense of self. Prevention and treatment efforts with refugees may need to be especially aimed at preventing a further decline of trust as well as restoration of trust in others and the world.

  5. THE COMPLEX OF ASSUMPTION CATHEDRAL OF THE ASTRAKHAN KREMLIN

    Directory of Open Access Journals (Sweden)

    Savenkova Aleksandra Igorevna

    2016-08-01

    Full Text Available This article is devoted to an architectural and historical analysis of the constructions forming the complex of the Assumption Cathedral of the Astrakhan Kremlin, which has not previously been considered as a subject of special research. Based on archival sources, photographic materials, publications and on-site investigations of the monuments, the article considers the creation history of a complete architectural complex, sustained in the single style of the Muscovite baroque and unique in its composition, and offers an interpretation of it in the all-Russian architectural context. Typological features of the individual constructions are revealed. The Prechistinsky bell tower has an atypical architectural solution - “hexagonal structure on octagonal and quadrangular structures”. The way of connecting the building of the Cathedral and the chambers by a passage was characteristic of monastic construction and was exceedingly rare in kremlins, farmsteads and ensembles of city cathedrals. The composite scheme of the Assumption Cathedral includes the Lobnoye Mesto (“the Place of Execution”) located on an axis from the west; it is connected with the main building by a quarter-turn stair with a landing. The only prototype of the structure is the Lobnoye Mesto on Red Square in Moscow. The article also considers the version that the Place of Execution emerged on the basis of an earlier construction - a tower, “the Peal”, which is repeatedly mentioned in written sources in connection with S. Razin’s revolt. The metropolitan Sampson, trying to preserve the standing of the Astrakhan metropolitanate, built the Assumption Cathedral and the Place of Execution in direct appeal to a capital prototype, so as to emphasize continuity and a close connection with Moscow.

  6. Testing legal assumptions regarding the effects of dancer nudity and proximity to patron on erotic expression.

    Science.gov (United States)

    Linz, D; Blumenthal, E; Donnerstein, E; Kunkel, D; Shafer, B J; Lichtenstein, A

    2000-10-01

    A field experiment was conducted in order to test the assumptions by the Supreme Court in Barnes v. Glen Theatre, Inc. (1991) and the Ninth Circuit Court of Appeals in Colacurcio v. City of Kent (1999) that government restrictions on dancer nudity and dancer-patron proximity do not affect the content of messages conveyed by erotic dancers. A field experiment was conducted in which dancer nudity (nude vs. partial clothing) and dancer-patron proximity (4 feet; 6 in.; 6 in. plus touch) were manipulated under controlled conditions in an adult night club. After male patrons viewed the dances, they completed questionnaires assessing affective states and reception of erotic, relational intimacy, and social messages. Contrary to the assumptions of the courts, the results showed that the content of messages conveyed by the dancers was significantly altered by restrictions placed on dancer nudity and dancer-patron proximity. These findings are interpreted in terms of social psychological responses to nudity and communication theories of nonverbal behavior. The legal implications of rejecting the assumptions made by the courts in light of the findings of this study are discussed. Finally, suggestions are made for future research.

  7. Posttraumatic Growth and Shattered World Assumptions Among Ex-POWs

    DEFF Research Database (Denmark)

    Lahav, Y.; Bellin, Elisheva S.; Solomon, Z.

    2016-01-01

    …world assumptions (WAs) and that the co-occurrence of high PTG and negative WAs among trauma survivors reflects reconstruction of an integrative belief system. The present study aimed to test these claims by investigating, for the first time, the mediating role of dissociation in the relation between PTG and WAs. Method: Former prisoners of war (ex-POWs; n = 158) and comparable controls (n = 106) were assessed 38 years after the Yom Kippur War. Results: Ex-POWs endorsed more negative WAs and higher PTG and dissociation compared to controls. Ex-POWs with posttraumatic stress disorder (PTSD…

  8. Diversion assumptions for high-powered research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Binford, F.T.

    1984-01-01

    This study deals with diversion assumptions for high-powered research reactors -- specifically, MTR fuel; pool- or tank-type research reactors with light-water moderator; and water, beryllium, or graphite reflectors, and which have a power level of 25 MW(t) or more. The objective is to provide assistance to the IAEA in documentation of criteria and inspection observables related to undeclared plutonium production in the reactors described above, including: criteria for undeclared plutonium production, necessary design information for implementation of these criteria, verification guidelines including neutron physics and heat transfer, and safeguards measures to facilitate the detection of undeclared plutonium production at large research reactors.

  9. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality an

  10. First assumptions and overlooking competing causes of death

    DEFF Research Database (Denmark)

    Leth, Peter Mygind; Andersen, Anh Thao Nguyen

    2014-01-01

    Determining the most probable cause of death is important, and it is sometimes tempting to assume an obvious cause of death, when it readily presents itself, and stop looking for other competing causes of death. The case story presented in the article illustrates this dilemma. The first assumption of cause of death, which was based on results from bacteriology tests, proved to be wrong when the results from the forensic toxicology testing became available. This case also illustrates how post mortem computed tomography (PMCT) findings of radio opaque material in the stomach alerted the pathologist…

  11. Exploring power assumptions in the leadership and management debate

    OpenAIRE

    Edwards, G.; Schedlitzki, D.; Turnbull, S.; Gill, R.

    2015-01-01

    Purpose – The purpose of this paper is to take a fresh look at the leadership and management debate through exploring underlying power assumptions in the literature. Design/methodology/approach – The paper is a conceptual discussion that draws on the power-based literature to develop a framework to help conceptually understand leadership in relation to management. Findings – The paper highlights the historically clichéd nature of comments regarding conceptual similarities and di...

  12. Assumptions in quantitative analyses of health risks of overhead power lines

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, A.; Wardekker, J.A.; Van der Sluijs, J.P. [Department of Science, Technology and Society, Copernicus Institute, Utrecht University, Budapestlaan 6, 3584 CD Utrecht (Netherlands)

    2012-02-15

    One of the major issues hampering the formulation of uncontested policy decisions on contemporary risks is the presence of uncertainties at various stages of the policy cycle. In the literature, different approaches are suggested to address the problem of provisional and uncertain evidence. Reflective approaches such as pedigree analysis can be used to explore the quality of evidence when quantification of uncertainties is at stake. One of the issues where the quality of evidence impedes policy making is the case of electromagnetic fields (EMF): a statistical association has been suggested between proximity to overhead power lines and an increased risk of childhood leukaemia. However, no biophysical mechanism that could support this association has been found to date. The Dutch government bases its policy concerning overhead power lines on the precautionary principle. For The Netherlands, previous studies have assessed the potential number of extra cases of childhood leukaemia due to the presence of overhead power lines. However, such a quantification of the health risk of EMF entails a (large) number of assumptions, both prior to and within the calculation chain. In this study, these assumptions were prioritized and critically appraised in an expert elicitation workshop, using a pedigree matrix for the characterization of assumptions in assessments. It appeared that the assumptions regarded as important in quantifying the health risks show high value-ladenness. The results show that, given the present state of knowledge, quantification of the health risks of EMF is premature. We consider the current implementation of the precautionary principle by the Dutch government to be adequate.

  13. The contour method cutting assumption: error minimization and correction

    Energy Technology Data Exchange (ETDEWEB)

    Prime, Michael B [Los Alamos National Laboratory; Kastengren, Alan L [ANL

    2010-01-01

    The recently developed contour method can measure a 2-D, cross-sectional residual-stress map. A part is cut in two using a precise and low-stress cutting technique such as electric discharge machining. The contours of the new surfaces created by the cut, which will not be flat if residual stresses are relaxed by the cutting, are then measured and used to calculate the original residual stresses. The precise nature of the assumption about the cut is presented theoretically and evaluated experimentally. Simply assuming a flat cut is overly restrictive and misleading. The critical assumption is that the width of the cut, when measured in the original, undeformed configuration of the body, is constant. Stresses at the cut tip during cutting cause the material to deform, which causes errors. The effect of such cutting errors on the measured stresses is presented, and the important parameters are quantified. Experimental procedures for minimizing these errors are presented. An iterative finite element procedure to correct for the errors is also presented. The correction procedure is demonstrated on experimental data from a steel beam that was plastically bent to introduce a known profile of residual stresses.

  14. The extended evolutionary synthesis: its structure, assumptions and predictions.

    Science.gov (United States)

    Laland, Kevin N; Uller, Tobias; Feldman, Marcus W; Sterelny, Kim; Müller, Gerd B; Moczek, Armin; Jablonka, Eva; Odling-Smee, John

    2015-08-22

    Scientific activities take place within the structured sets of ideas and assumptions that define a field and its practices. The conceptual framework of evolutionary biology emerged with the Modern Synthesis in the early twentieth century and has since expanded into a highly successful research program to explore the processes of diversification and adaptation. Nonetheless, the ability of that framework satisfactorily to accommodate the rapid advances in developmental biology, genomics and ecology has been questioned. We review some of these arguments, focusing on literatures (evo-devo, developmental plasticity, inclusive inheritance and niche construction) whose implications for evolution can be interpreted in two ways—one that preserves the internal structure of contemporary evolutionary theory and one that points towards an alternative conceptual framework. The latter, which we label the 'extended evolutionary synthesis' (EES), retains the fundaments of evolutionary theory, but differs in its emphasis on the role of constructive processes in development and evolution, and reciprocal portrayals of causation. In the EES, developmental processes, operating through developmental bias, inclusive inheritance and niche construction, share responsibility for the direction and rate of evolution, the origin of character variation and organism-environment complementarity. We spell out the structure, core assumptions and novel predictions of the EES, and show how it can be deployed to stimulate and advance research in those fields that study or use evolutionary biology. © 2015 The Author(s).

  15. The extended evolutionary synthesis: its structure, assumptions and predictions

    Science.gov (United States)

    Laland, Kevin N.; Uller, Tobias; Feldman, Marcus W.; Sterelny, Kim; Müller, Gerd B.; Moczek, Armin; Jablonka, Eva; Odling-Smee, John

    2015-01-01

    Scientific activities take place within the structured sets of ideas and assumptions that define a field and its practices. The conceptual framework of evolutionary biology emerged with the Modern Synthesis in the early twentieth century and has since expanded into a highly successful research program to explore the processes of diversification and adaptation. Nonetheless, the ability of that framework satisfactorily to accommodate the rapid advances in developmental biology, genomics and ecology has been questioned. We review some of these arguments, focusing on literatures (evo-devo, developmental plasticity, inclusive inheritance and niche construction) whose implications for evolution can be interpreted in two ways—one that preserves the internal structure of contemporary evolutionary theory and one that points towards an alternative conceptual framework. The latter, which we label the ‘extended evolutionary synthesis' (EES), retains the fundaments of evolutionary theory, but differs in its emphasis on the role of constructive processes in development and evolution, and reciprocal portrayals of causation. In the EES, developmental processes, operating through developmental bias, inclusive inheritance and niche construction, share responsibility for the direction and rate of evolution, the origin of character variation and organism–environment complementarity. We spell out the structure, core assumptions and novel predictions of the EES, and show how it can be deployed to stimulate and advance research in those fields that study or use evolutionary biology. PMID:26246559

  16. Economic Growth Assumptions in Climate and Energy Policy

    Directory of Open Access Journals (Sweden)

    Nir Y. Krakauer

    2014-03-01

    Full Text Available The assumption that the economic growth seen in recent decades will continue has dominated the discussion of future greenhouse gas emissions and the mitigation of and adaptation to climate change. Given that long-term economic growth is uncertain, the impacts of a wide range of growth trajectories should be considered. In particular, slower economic growth would imply that future generations will be relatively less able to invest in emissions controls or adapt to the detrimental impacts of climate change. Taking into consideration the possibility of economic slowdown therefore heightens the urgency of reducing greenhouse gas emissions now by moving to renewable energy sources, even if this incurs short-term economic cost. I quantify this counterintuitive impact of economic growth assumptions on present-day policy decisions in a simple global economy-climate model (Dynamic Integrated model of Climate and the Economy (DICE. In DICE, slow future growth increases the economically optimal present-day carbon tax rate and the utility of taxing carbon emissions, although the magnitude of the increase is sensitive to model parameters, including the rate of social time preference and the elasticity of the marginal utility of consumption. Future scenario development should specifically include low-growth scenarios, and the possibility of low-growth economic trajectories should be taken into account in climate policy analyses.

  17. Halo-Independent Direct Detection Analyses Without Mass Assumptions

    CERN Document Server

    Anderson, Adam J.; Kahn, Yonatan; McCullough, Matthew

    2015-10-06

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the $m_\\chi-\\sigma_n$ plane. Recently methods which are independent of the DM halo velocity distribution have been developed which present results in the $v_{min}-\\tilde{g}$ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from $v_{min}$ to nuclear recoil momentum ($p_R$), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call $\\tilde{h}(p_R)$. The entire family of conventional halo-independent $\\tilde{g}(v_{min})$ plots for all DM masses are directly found from the single $\\tilde{h}(p_R)$ plot through a simple re...

  18. Uncertainties of simulated aerosol optical properties induced by assumptions on aerosol physical and chemical properties: an AQMEII-2 perspective

    Science.gov (United States)

    The calculation of aerosol optical properties from aerosol mass is a process subject to uncertainty related to necessary assumptions on the treatment of the chemical species mixing state, density, refractive index, and hygroscopic growth. In the framework of the AQMEII-2 model in...

  19. Imagining America: A Study of Assumptions and Expectations among English as a Second Language Students from Japan.

    Science.gov (United States)

    Fels, Michael D.

    A study examined a set of assumptions and expectations held by six male Japanese students (who came to the United States to learn to speak, read, and write English) upon their arrival and 6 months later. These students were considered at-risk because of their marginal university status and their inability to function in an English language…

  20. Experimental assessment of unvalidated assumptions in classical plasticity theory.

    Energy Technology Data Exchange (ETDEWEB)

    Brannon, Rebecca Moss (University of Utah, Salt Lake City, UT); Burghardt, Jeffrey A. (University of Utah, Salt Lake City, UT); Bauer, Stephen J.; Bronowski, David R.

    2009-01-01

    This report investigates the validity of several key assumptions in classical plasticity theory regarding material response to changes in the loading direction. Three metals, two rock types, and one ceramic were subjected to non-standard loading directions, and the resulting strain response increments were displayed in Gudehus diagrams to illustrate the approximation error of classical plasticity theories. A rigorous mathematical framework for fitting classical theories to the data, thus quantifying the error, is provided. Further data analysis techniques are presented that allow testing for the effect of changes in loading direction without having to use a new sample and for inferring the yield normal and flow directions without having to measure the yield surface. Though the data are inconclusive, there is indication that classical, incrementally linear, plasticity theory may be inadequate over a certain range of loading directions. This range of loading directions also coincides with loading directions that are known to produce a physically inadmissible instability for any nonassociative plasticity model.

  1. Assumptions of Customer Knowledge Enablement in the Open Innovation Process

    Directory of Open Access Journals (Sweden)

    Jokubauskienė Raminta

    2017-08-01

    Full Text Available In the scientific literature, open innovation is one of the most effective means to innovate and gain a competitive advantage. In practice, there is a variety of open innovation activities, but customers nevertheless stand as the cornerstone in this area, since customers’ knowledge is one of the most important sources of new knowledge and ideas. When evaluating the context in which open innovation and customer knowledge enablement interact, it is necessary to take into account the importance of customer knowledge management. It is increasingly highlighted that customer knowledge management facilitates the creation of innovations. However, other factors that influence open innovation, and with it customer knowledge management, should also be examined. This article presents a theoretical model that reveals the assumptions of the open innovation process and their impact on the firm’s performance.

  2. Analysis of assumptions of recent tests of local realism

    Science.gov (United States)

    Bednorz, Adam

    2017-04-01

    Local realism in recent experiments is excluded on the condition of freedom or randomness of choice, combined with no signaling between observers, by implementations of simple quantum models. Both no signaling and the underlying quantum model can be directly checked by analysis of experimental data. For particular tests performed on the data, it is shown that two of these experiments give the probability of the data under the no-signaling hypothesis (or, in one of them, choice independence) at the level of 5%, accounting for the look-elsewhere effect, moderately suggesting that no signaling is violated with 95% confidence. On the other hand, the data from the two other experiments violate the assumption of the simple quantum model. Further experiments are necessary to clarify these issues and the freedom and randomness of choice.

  3. Commentary: profiling by appearance and assumption: beyond race and ethnicity.

    Science.gov (United States)

    Sapién, Robert E

    2010-04-01

    In this issue, Acquaviva and Mintz highlight issues regarding racial profiling in medicine and how it is perpetuated through medical education: Physicians are taught to make subjective determinations of race and/or ethnicity in case presentations, and such assumptions may affect patient care. The author of this commentary believes that the discussion should be broadened to include profiling on the basis of general appearance. The author reports personal experiences as someone who has profiled and been profiled by appearance-sometimes by skin color, sometimes by other physical attributes. In the two cases detailed here, patient care could have been affected had the author not become aware of his practices in such situations. The author advocates raising awareness of profiling in the broader sense through training.

  4. Ancestral assumptions and the clinical uncertainty of evolutionary medicine.

    Science.gov (United States)

    Cournoyea, Michael

    2013-01-01

    Evolutionary medicine is an emerging field of medical studies that uses evolutionary theory to explain the ultimate causes of health and disease. Educational tools, online courses, and medical school modules are being developed to help clinicians and students reconceptualize health and illness in light of our evolutionary past. Yet clinical guidelines based on our ancient life histories are epistemically weak, relying on the controversial assumptions of adaptationism and advocating a strictly biophysical account of health. To fulfill the interventionist goals of clinical practice, it seems that proximate explanations are all we need to develop successful diagnostic and therapeutic guidelines. Considering these epistemic concerns, this article argues that the clinical relevance of evolutionary medicine remains uncertain at best.

  5. Cost and Performance Assumptions for Modeling Electricity Generation Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Tidball, R.; Bluestein, J.; Rodriguez, N.; Knoke, S.

    2010-11-01

    The goal of this project was to compare and contrast utility scale power plant characteristics used in data sets that support energy market models. Characteristics include both technology cost and technology performance projections to the year 2050. Cost parameters include installed capital costs and operation and maintenance (O&M) costs. Performance parameters include plant size, heat rate, capacity factor or availability factor, and plant lifetime. Conventional, renewable, and emerging electricity generating technologies were considered. Six data sets, each associated with a different model, were selected. Two of the data sets represent modeled results, not direct model inputs. These two data sets include cost and performance improvements that result from increased deployment as well as resulting capacity factors estimated from particular model runs; other data sets represent model input data. For the technologies contained in each data set, the levelized cost of energy (LCOE) was also evaluated, according to published cost, performance, and fuel assumptions.
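The levelized-cost comparison described above combines the listed parameters in a standard way: annualized capital (via a capital recovery factor) plus fixed O&M, spread over annual generation, plus variable O&M and fuel. The sketch below is a generic illustration of that arithmetic, not the report's actual methodology; the function names and the example plant inputs are assumptions for demonstration only.

```python
def capital_recovery_factor(r, n):
    """Fraction of capital charged each year for discount rate r over n years."""
    return r * (1 + r) ** n / ((1 + r) ** n - 1)

def lcoe_usd_per_mwh(capex_usd_per_kw, fixed_om_usd_per_kw_yr,
                     var_om_usd_per_mwh, fuel_usd_per_mmbtu,
                     heat_rate_btu_per_kwh, capacity_factor,
                     discount_rate, lifetime_yr):
    """Generic levelized cost of energy in $/MWh."""
    crf = capital_recovery_factor(discount_rate, lifetime_yr)
    mwh_per_kw_yr = 8760 * capacity_factor / 1000.0   # annual generation per kW of capacity
    capital = capex_usd_per_kw * crf / mwh_per_kw_yr  # annualized capital, $/MWh
    fixed_om = fixed_om_usd_per_kw_yr / mwh_per_kw_yr
    fuel = fuel_usd_per_mmbtu * heat_rate_btu_per_kwh / 1000.0  # $/MWh of fuel
    return capital + fixed_om + fuel + var_om_usd_per_mwh

# Illustrative (assumed, not from the report) gas-plant inputs:
example = lcoe_usd_per_mwh(capex_usd_per_kw=1000, fixed_om_usd_per_kw_yr=15,
                           var_om_usd_per_mwh=3.5, fuel_usd_per_mmbtu=4.0,
                           heat_rate_btu_per_kwh=7000, capacity_factor=0.85,
                           discount_rate=0.07, lifetime_yr=30)
```

Comparing data sets then reduces to feeding each set's cost and performance projections through the same formula, so that differences in LCOE trace back to differences in the underlying assumptions.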

  6. Elements and elasmobranchs: hypotheses, assumptions and limitations of elemental analysis.

    Science.gov (United States)

    McMillan, M N; Izzo, C; Wade, B; Gillanders, B M

    2017-02-01

    Quantifying the elemental composition of elasmobranch calcified cartilage (hard parts) has the potential to answer a range of ecological and biological questions, at both the individual and population level. Few studies, however, have employed elemental analyses of elasmobranch hard parts. This paper provides an overview of the range of applications of elemental analysis in elasmobranchs, discussing the assumptions and potential limitations in cartilaginous fishes. It also reviews the available information on biotic and abiotic factors influencing patterns of elemental incorporation into hard parts of elasmobranchs and provides some comparative elemental assays and mapping in an attempt to fill knowledge gaps. Directions for future experimental research are highlighted to better understand fundamental elemental dynamics in elasmobranch hard parts. © 2016 The Fisheries Society of the British Isles.

  7. Dynamic Group Diffie-Hellman Key Exchange under standard assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Bresson, Emmanuel; Chevassut, Olivier; Pointcheval, David

    2002-02-14

    Authenticated Diffie-Hellman key exchange allows two principals communicating over a public network, and each holding public-private keys, to agree on a shared secret value. In this paper we study the natural extension of this cryptographic problem to a group of principals. We begin from existing formal security models and refine them to incorporate major missing details (e.g., strong-corruption and concurrent sessions). Within this model we define the execution of a protocol for authenticated dynamic group Diffie-Hellman and show that it is provably secure under the decisional Diffie-Hellman assumption. Our security result holds in the standard model and thus provides better security guarantees than previously published results in the random oracle model.
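The primitive being extended can be illustrated in miniature: each of n principals holds a secret exponent, and all derive the same value g^(x1···xn) mod p. The sketch below is a naive, unauthenticated round-robin illustration with toy parameters (far too small for real use); it is not the paper's provably secure protocol, which adds authenticated flows, strong-corruption handling, and dynamic membership.

```python
import secrets

# Toy public parameters -- illustration only; real systems use
# standardized groups of 2048 bits or more.
P = 2**127 - 1  # a Mersenne prime
G = 5

def group_dh_shared_values(n):
    """Naive n-party Diffie-Hellman: every party ends with g^(x1*...*xn) mod p."""
    keys = [secrets.randbelow(P - 2) + 1 for _ in range(n)]
    shared = []
    for i in range(n):
        # Accumulate every other party's exponent onto the generator...
        val = G
        for j in range(n):
            if j != i:
                val = pow(val, keys[j], P)
        # ...then apply this party's own secret exponent last.
        shared.append(pow(val, keys[i], P))
    return shared

values = group_dh_shared_values(4)
assert all(v == values[0] for v in values)  # all parties agree on the secret
```

Because exponentiation commutes in the group, the order in which the exponents are applied does not matter, which is why all n parties arrive at the same shared value.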

  8. New media in strategy – mapping assumptions in the field

    DEFF Research Database (Denmark)

    Gulbrandsen, Ib Tunby; Plesner, Ursula; Raviola, Elena

    2018-01-01

    There is plenty of empirical evidence for claiming that new media make a difference for how strategy is conceived and executed. Furthermore, there is a rapidly growing body of literature that engages with this theme and offers recommendations regarding the appropriate strategic actions in relation to new media. By contrast, there is relatively little attention to the assumptions behind strategic thinking in relation to new media. This article reviews the most influential strategy journals, asking how new media are conceptualized. It is shown that strategy scholars have a tendency to place themselves in either a deterministic or a voluntaristic camp with regard to technology: strategy is portrayed as either determined by new media or a matter of rationally using them. Additionally, most articles portray the organization as a neatly delineated entity, where new media are relevant either...

  9. Elasticity reconstruction: Beyond the assumption of local homogeneity

    Science.gov (United States)

    Sinkus, Ralph; Daire, Jean-Luc; Van Beers, Bernard E.; Vilgrain, Valerie

    2010-07-01

    Elasticity imaging is a novel domain which is currently gaining significant interest in the medical field. Most inversion techniques are based on the homogeneity assumption, i.e. the local spatial derivatives of the complex-shear modulus are ignored. This analysis presents an analytic approach in order to overcome this limitation, i.e. first order spatial derivatives of the real-part of the complex-shear modulus are taken into account. Resulting distributions in a gauged breast lesion phantom agree very well with the theoretical expectations. An in-vivo example of a cholangiocarcinoma demonstrates that the new approach provides maps of the viscoelastic properties which agree much better with expectations from anatomy.

  10. Breakdown of Hydrostatic Assumption in Tidal Channel with Scour Holes

    Directory of Open Access Journals (Sweden)

    Chunyan Li

    2016-10-01

    Full Text Available The hydrostatic condition is a common assumption for tidal and subtidal motions in oceans and estuaries. Theories based on this assumption have been largely successful. However, there are no definite criteria separating the hydrostatic from the non-hydrostatic regimes in real applications, because real problems often have multiple scales. With the increased refinement of high-resolution numerical models encompassing smaller and smaller spatial scales, the need for non-hydrostatic models is growing. To evaluate the vertical motion over bathymetric changes in tidal channels and assess the validity of the hydrostatic approximation, we conducted observations using a vessel-based acoustic Doppler current profiler (ADCP). Observations were made along a straight channel 18 times over two 25-m-deep scour holes, separated by 330 m, in and out of an otherwise flat 8-m-deep tidal pass leading to Lake Pontchartrain, over a period of 8 hours covering part of the diurnal tidal cycle. Of the 18 passages over the scour holes, 11 showed strong upwelling and downwelling that resulted in the breakdown of the hydrostatic condition. The maximum observed vertical velocity was ~0.35 m/s, a high value for a tidal channel, and the estimated vertical acceleration reached 1.76×10⁻² m/s². Analysis demonstrated that the barotropic non-hydrostatic acceleration was dominant. The cause of the non-hydrostatic flow was the strong tidal flow over the steep slopes of the scour holes. This demonstrates that in such a system, bathymetric variation can lead to the breakdown of hydrostatic conditions. Models with hydrostatic restrictions will not be able to correctly capture the dynamics in a system with significant bathymetric variations, particularly during strong tidal currents.

  11. The medical assumption at the Foundation of Roe v. Wade & its implications for women's health.

    Science.gov (United States)

    Forsythe, Clark

    2014-01-01

    Too little attention has been paid over the past forty years to the complete lack of a factual record in Roe v. Wade and Doe v. Bolton, and to the Court's fundamental assumption that drove the outcome. The decision and opinions were driven by the medical claim that "abortion was safer than childbirth," which was raised for the first time in the briefs in the Supreme Court without any lower court record. This medical premise directly and profoundly shaped virtually every major aspect of Roe and Doe, including the creation of the trimester system and the prohibition of health and safety regulations in the first trimester. Because of this medical assumption, the Justices extended the right to abortion throughout pregnancy. It was key to the Court's historical rationale for a "right" to abortion. Because of this notion, the Justices gave abortion providers complete discretion to manage any issues of health and safety, and they prohibited public health officials from regulating abortion in the first trimester. This medical assumption was the most consequential factual assumption of the abortion decisions of 1973 and it has been assumed to be true in subsequent abortion decisions by the Court. The notion that "abortion is safer than childbirth" has become even less tenable for at least five reasons: (1) the dysfunctional abortion data reporting system in the United States that relies completely on voluntary reporting; (2) the incomparability of the published abortion mortality rate and the published maternal (childbirth) mortality rate; (3) medical data on the increasing rate of maternal mortality in the second trimester; (4) the growing body of international medical studies finding long-term risks to women from abortion; and (5) maternal mortality data from countries with superior abortion recordkeeping collection and reporting systems, which find a higher rate of abortion mortality than childbirth mortality. These concerns and the growth in international medical data…

  12. Stream of consciousness: Quantum and biochemical assumptions regarding psychopathology.

    Science.gov (United States)

    Tonello, Lucio; Cocchi, Massimo; Gabrielli, Fabio; Tuszynski, Jack A

    2017-04-01

    The accepted paradigms of mainstream neuropsychiatry appear to be incompletely adequate and in various cases offer equivocal analyses. However, a growing number of new approaches are being proposed that suggest the emergence of paradigm shifts in this area. In particular, quantum theories of mind, brain and consciousness seem to offer a profound change to the current approaches. Unfortunately these quantum paradigms harbor at least two serious problems. First, they are simply models, theories, and assumptions, with no convincing experiments supporting their claims. Second, they deviate from contemporary mainstream views of psychiatric illness and do so in revolutionary ways. We suggest a possible way to integrate experimental neuroscience with quantum models in order to address outstanding issues in psychopathology. A key role is played by the phenomenon called the "stream of consciousness", which can be linked to the so-called "Gamma Synchrony" (GS), which is clearly demonstrated by EEG data. In our novel proposal, a unipolar depressed patient could be seen as a subject with an altered stream of consciousness. In particular, some clues suggest that depression is linked to an "increased power" stream of consciousness. It is additionally suggested that such an approach to depression might be extended to psychopathology in general with potential benefits to diagnostics and therapeutics in neuropsychiatry. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. On Some Unwarranted Tacit Assumptions in Cognitive Neuroscience

    Science.gov (United States)

    Mausfeld, Rainer

    2011-01-01

    The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper brings to mind several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question why such conceptions are nevertheless currently prevailing in many areas of psychology. The paper argues that corresponding conceptions are rooted in four different aspects of our common-sense conception of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input–output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to turn into a largely a-theoretical and data-driven endeavor while at the same time enhancing the prospects for receiving widespread public appreciation of their empirical findings. PMID:22435062

  14. DDH-Like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike

    2012-01-01

    _f= \\mathbb{F}_{q}[X]/(f)$ where f is a degree-d polynomial. The decision problem that follows naturally reduces to the case where f is irreducible. This variant is called the d-DDH problem, where 1-DDH is standard DDH. We show in the generic group model that d-DDH is harder than DDH for d > 1 and that we...... obtain, in fact, an infinite hierarchy of progressively weaker assumptions whose complexities lie “between” DDH and CDH. This leads to a large number of new schemes because virtually all known DDH-based constructions can very easily be upgraded to be based on d-DDH. We use the same construction......-VDDH), which are based on f(X) = Xd, but with a twist to avoid problems with reducible polynomials. We show in the generic group model that d-VDDH is hard in bilinear groups and that the problems become harder with increasing d. We show that hardness of d-VDDH implies CCA-secure encryption, efficient Naor...

  15. PKreport: report generation for checking population pharmacokinetic model assumptions.

    Science.gov (United States)

    Sun, Xiaoyong; Li, Jun

    2011-05-16

    Graphics play an important and unique role in population pharmacokinetic (PopPK) model building by exploring hidden structure among data before modeling, evaluating model fit, and validating results after modeling. The work described in this paper is about a new R package called PKreport, which is able to generate a collection of plots and statistics for testing model assumptions, visualizing data and diagnosing models. The metric system is utilized as the currency for communicating between data sets and the package to generate special-purpose plots. It provides ways to match output from diverse software such as NONMEM, Monolix, the R nlme package, etc. The package is implemented with an S4 class hierarchy, and offers an efficient way to access the output from NONMEM 7. The final reports take advantage of the web browser as user interface to manage and visualize plots. PKreport provides 1) a flexible and efficient R class to store and retrieve NONMEM 7 output; 2) automated plots for users to visualize data and models; 3) automatically generated R scripts used to create the plots; 4) an archive-oriented management tool for users to store, retrieve, and modify figures; 5) high-quality graphs based on the R packages lattice and ggplot2. The general architecture, running environment, and statistical methods can be readily extended with the R class hierarchy. PKreport is free to download at http://cran.r-project.org/web/packages/PKreport/index.html.

  16. PKreport: report generation for checking population pharmacokinetic model assumptions

    Directory of Open Access Journals (Sweden)

    Li Jun

    2011-05-01

    Abstract Background Graphics play an important and unique role in population pharmacokinetic (PopPK) model building by exploring hidden structure among data before modeling, evaluating model fit, and validating results after modeling. Results The work described in this paper is about a new R package called PKreport, which is able to generate a collection of plots and statistics for testing model assumptions, visualizing data and diagnosing models. The metric system is utilized as the currency for communicating between data sets and the package to generate special-purpose plots. It provides ways to match output from diverse software such as NONMEM, Monolix, the R nlme package, etc. The package is implemented with an S4 class hierarchy, and offers an efficient way to access the output from NONMEM 7. The final reports take advantage of the web browser as user interface to manage and visualize plots. Conclusions PKreport provides 1) a flexible and efficient R class to store and retrieve NONMEM 7 output; 2) automated plots for users to visualize data and models; 3) automatically generated R scripts used to create the plots; 4) an archive-oriented management tool for users to store, retrieve, and modify figures; 5) high-quality graphs based on the R packages lattice and ggplot2. The general architecture, running environment, and statistical methods can be readily extended with the R class hierarchy. PKreport is free to download at http://cran.r-project.org/web/packages/PKreport/index.html.

  17. Weak convergence of Jacobian determinants under asymmetric assumptions

    Directory of Open Access Journals (Sweden)

    Teresa Alberico

    2012-05-01

    Full Text Available Let $\\Om$ be a bounded open set in $\\R^2$ sufficiently smooth and $f_k=(u_k,v_k$ and $f=(u,v$ mappings belong to the Sobolev space $W^{1,2}(\\Om,\\R^2$. We prove that if the sequence of Jacobians $J_{f_k}$ converges to a measure $\\mu$ in sense of measures andif one allows different assumptions on the two components of $f_k$ and $f$, e.g.$$u_k \\rightharpoonup u \\;\\;\\mbox{weakly in} \\;\\; W^{1,2}(\\Om \\qquad \\, v_k \\rightharpoonup v \\;\\;\\mbox{weakly in} \\;\\; W^{1,q}(\\Om$$for some $q\\in(1,2$, then\\begin{equation}\\label{0}d\\mu=J_f\\,dz.\\end{equation}Moreover, we show that this result is optimal in the sense that conclusion fails for $q=1$.On the other hand, we prove that \\eqref{0} remains valid also if one considers the case $q=1$, but it is necessary to require that $u_k$ weakly converges to $u$ in a Zygmund-Sobolev space with a slightly higher degree of regularity than $W^{1,2}(\\Om$ and precisely$$ u_k \\rightharpoonup u \\;\\;\\mbox{weakly in} \\;\\; W^{1,L^2 \\log^\\alpha L}(\\Om$$for some $\\alpha >1$.    

  18. Finite Element Simulations to Explore Assumptions in Kolsky Bar Experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Crum, Justin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-05

    The chief purpose of this project has been to develop a set of finite element models that attempt to explore some of the assumptions in the experimental set-up and data reduction of the Kolsky bar experiment. In brief, the Kolsky bar, sometimes referred to as the split Hopkinson pressure bar, is an experimental apparatus used to study the mechanical properties of materials at high strain rates. Kolsky bars can be constructed to conduct experiments in tension or compression, both of which are studied in this paper. The basic operation of the tension Kolsky bar is as follows: compressed air is inserted into the barrel that contains the striker; the striker accelerates towards the left and strikes the left end of the barrel, producing a tensile stress wave that propagates first through the barrel and then down the incident bar, into the specimen, and finally into the transmission bar. In the compression case, the striker instead travels to the right and impacts the incident bar directly. As the stress wave travels through an interface (e.g., the incident bar to specimen connection), a portion of the pulse is transmitted and the rest reflected. The incident pulse, as well as the transmitted and reflected pulses, are picked up by two strain gauges installed on the incident and transmission bars. By interpreting the data acquired by these strain gauges, the stress/strain behavior of the specimen can be determined.
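The reduction from gauge signals to specimen response can be illustrated with the standard one-wave analysis. The following is a hedged sketch of the textbook formulas (strain rate from the reflected pulse, stress from the transmitted pulse), not necessarily the exact reduction used in these simulations; the bar and specimen values in the usage note are illustrative:

```python
import numpy as np

def reduce_kolsky(t, eps_r, eps_t, E_bar, rho_bar, A_bar, A_spec, L_spec):
    """One-wave Kolsky-bar data reduction (textbook form).

    t       : time samples [s]
    eps_r   : reflected strain pulse on the incident bar
    eps_t   : transmitted strain pulse on the transmission bar
    E_bar   : bar Young's modulus [Pa]; rho_bar : bar density [kg/m^3]
    A_bar, A_spec : bar / specimen cross-sections [m^2]
    L_spec  : specimen gauge length [m]
    """
    c0 = np.sqrt(E_bar / rho_bar)                 # elastic bar wave speed
    strain_rate = -2.0 * c0 * eps_r / L_spec      # specimen strain rate
    dt = np.diff(t)
    strain = np.concatenate(                      # cumulative trapezoid rule
        ([0.0], np.cumsum(0.5 * (strain_rate[1:] + strain_rate[:-1]) * dt)))
    stress = E_bar * (A_bar / A_spec) * eps_t     # force balance at interface
    return strain_rate, strain, stress
```

For a steel bar (E ≈ 200 GPa, ρ ≈ 7850 kg/m³), a constant reflected pulse of −0.001 acting on a 5 mm specimen gives a strain rate of roughly 2 × 10³ s⁻¹, which is the regime these experiments target.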

  19. Semi-Supervised Transductive Hot Spot Predictor Working on Multiple Assumptions

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-05-23

    Protein-protein interactions are critically dependent on just a few residues (“hot spots”) at the interfaces. Hot spots make a dominant contribution to the binding free energy, and if mutated they can disrupt the interaction. As mutagenesis studies require significant experimental effort, there exists a need for accurate and reliable computational hot spot prediction methods. Compared to supervised hot spot prediction algorithms, semi-supervised prediction methods can take into consideration both the labeled and unlabeled residues in the dataset during the prediction procedure. The transductive support vector machine has been utilized for this task and demonstrated better prediction performance. To the best of our knowledge, however, none of the transductive semi-supervised algorithms takes all three semi-supervised assumptions, i.e., the smoothness, cluster, and manifold assumptions, together into account during learning. In this paper, we propose a novel semi-supervised method for hot spot residue prediction that considers all three semi-supervised assumptions using nonlinear models. Our algorithm, IterPropMCS, works in an iterative manner. In each iteration, the algorithm first propagates the labels of the labeled residues to the unlabeled ones, along the shortest path between them on a graph, assuming that they lie on a nonlinear manifold. Then it selects the most confident residues as the labeled ones for the next iteration, according to the cluster and smoothness criteria, which is implemented by a nonlinear density estimator. Experiments on a benchmark dataset, using protein structure-based features, demonstrate that our approach is effective in predicting hot spots and compares favorably to other available methods. The results also show that our method outperforms the state-of-the-art transductive learning methods.
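As context for the smoothness assumption, a minimal generic label-propagation iteration can be sketched as follows. This illustrates only the basic idea of spreading labels over a graph; it is not the authors' IterPropMCS (which propagates along shortest paths and selects confident residues with a nonlinear density estimator), and the graph and labels are hypothetical:

```python
import numpy as np

def propagate_labels(W, y, labeled, n_iter=200):
    """Soft labels by iterative neighborhood averaging on a graph.

    W       : symmetric nonnegative affinity matrix (n x n)
    y       : initial labels (+1 / -1, 0 for unlabeled nodes)
    labeled : boolean mask of labeled nodes (clamped every step)
    """
    P = W / W.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
    f = y.astype(float)
    for _ in range(n_iter):
        f = P @ f                          # smoothness: average over neighbors
        f[labeled] = y[labeled]            # keep the known labels fixed
    return f

# Path graph 0-1-2-3-4 with only the two endpoints labeled.
W = np.zeros((5, 5))
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1.0
y = np.array([1, 0, 0, 0, -1])
labeled = np.array([True, False, False, False, True])
f = propagate_labels(W, y, labeled)
print(f)  # interior scores interpolate between the clamped endpoints
```

At convergence the interior nodes carry the harmonic interpolation of the endpoint labels, which is the sense in which nearby nodes receive similar labels.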

  20. Evaluation of assumptions for estimating chemical light extinction at U.S. national parks.

    Science.gov (United States)

    Lowenthal, Douglas; Zielinska, Barbara; Samburova, Vera; Collins, Don; Taylor, Nathan; Kumar, Naresh

    2015-03-01

    Studies were conducted at Great Smoky Mountains National Park (NP) (GRSM), Tennessee, Mount Rainier NP (MORA), Washington, and Acadia NP (ACAD), Maine, to evaluate assumptions used to estimate aerosol light extinction from chemical composition. The revised IMPROVE equation calculates light scattering from concentrations of PM2.5 sulfates, nitrates, organic carbon mass (OM), and soil. Organics are assumed to be nonhygroscopic. Organic carbon (OC) is converted to OM with a multiplier of 1.8. Experiments were conducted to evaluate assumptions on aerosol hydration state, the OM/OC ratio, OM hygroscopicity, and mass scattering efficiencies. Sulfates were neutralized by ammonium during winter at GRSM (W, winter) and at MORA during summer but were acidic at ACAD and GRSM (S, summer) during summer. Hygroscopic growth was mostly smooth and continuous, rarely exhibiting hysteresis. Deliquescence was not observed except infrequently during winter at GRSM (W). Water-soluble organic carbon (WSOC) was separated from bulk OC with solid-phase absorbents. The average OM/OC ratios were 2.0, 2.7, 2.1, and 2.2 at GRSM (S), GRSM (W), MORA, and ACAD, respectively. Hygroscopic growth factors (GF) at relative humidity (RH) 90% for aerosols generated from WSOC extracts averaged 1.19, 1.06, 1.13, and 1.16 at GRSM (S), GRSM (W), MORA, and ACAD, respectively. Thus, the assumption that OM is not hygroscopic may lead to underestimation of its contribution to light scattering. Studies at IMPROVE sites conducted in U.S. national parks showed that aerosol organics comprise more PM2.5 mass and absorb more water as a function of relative humidity than is currently assumed by the IMPROVE equation for calculating chemical light extinction. Future strategies for reducing regional haze may therefore need to focus more heavily on understanding the origins and control of anthropogenic sources of organic aerosols.
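The gap between the default OC-to-OM multiplier and the measured ratios is simple to quantify. A sketch using the site ratios reported above (only the ratios and the 1.8 default come from the study; the OC concentration is a hypothetical value, and the percentages cancel it out):

```python
# Default IMPROVE OC -> OM multiplier versus the measured OM/OC ratios above.
oc = 1.0                 # ug/m^3, hypothetical organic-carbon concentration
default_ratio = 1.8      # multiplier assumed by the revised IMPROVE equation
measured = {"GRSM (S)": 2.0, "GRSM (W)": 2.7, "MORA": 2.1, "ACAD": 2.2}

for site, ratio in measured.items():
    om_default = default_ratio * oc
    om_measured = ratio * oc
    shortfall = 100.0 * (om_measured - om_default) / om_measured
    print(f"{site}: default OM low by {shortfall:.0f}%")
```

At the winter GRSM ratio of 2.7, the default multiplier misses about a third of the organic mass before any water uptake is even considered, which is the direction of bias the abstract points to.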

  1. Projecting the future of Canada's population: assumptions, implications, and policy

    Directory of Open Access Journals (Sweden)

    Beaujot, Roderic

    2003-01-01

    After considering the assumptions for fertility, mortality and international migration, this paper looks at implications of the evolving demographics for population growth, labour force, retirement, and population distribution. With the help of policies favouring gender equity and supporting families of various types, fertility in Canada could avoid the particularly low levels seen in some countries, and remain at levels closer to 1.6 births per woman. The prognosis in terms of both risk factors and treatment suggests further reductions in mortality toward a life expectancy of 85. On immigration, there are political interests for levels as high as 270,000 per year, while levels of 150,000 correspond to the long-term post-war average. The future will see slower population growth, due more to migration than to natural increase. International migration of some 225,000 per year can enable Canada to avoid population decline and sustain the size of the labour force, but all scenarios show much change in the relative size of the retired population compared to the labour force population. According to the ratio of persons aged 20-64 to those aged 65 and over, there were seven persons at labour force ages per person at retirement age in 1951, compared to five in 2001 and probably less than 2.5 in 2051. Growth that is due to migration more so than natural increase will accentuate the urbanization trend and the unevenness of the population distribution over space. Past projections have under-projected the mortality improvements and their impact on the relative size of the population at older age groups. Policies regarding fertility, mortality and migration could be aimed at avoiding population decline and reducing the effect of aging, but there is a lack of an institutional basis for policy that would seek to endogenize population.

  2. Cosmology without Einstein's assumption that inertial mass produces gravity

    Science.gov (United States)

    Ellis, Homer G.

    2015-06-01

    Giving up Einstein's assumption, implicit in his 1916 field equations, that inertial mass, even in its appearance as energy, is equivalent to active gravitational mass and therefore is a source of gravity allows revising the field equations to a form in which a positive cosmological constant is seen to (mis)represent a uniform negative net mass density of gravitationally attractive and gravitationally repulsive matter. Field equations with both positive and negative active gravitational mass densities of both primordial and continuously created matter, incorporated along with two scalar fields to 'relax the constraints' on the spacetime geometry, yield cosmological solutions that exhibit inflation, deceleration, coasting, acceleration, and a 'big bounce' instead of a 'big bang,' and provide good fits to a Hubble diagram of Type Ia supernovae data. The repulsive matter is identified as the back sides of the 'drainholes' introduced by the author in 1973 as solutions of those same field equations. Drainholes (prototypical examples of 'traversable wormholes') are topological tunnels in space which gravitationally attract on their front, entrance sides, and repel more strongly on their back, exit sides. The front sides serve both as the gravitating cores of the visible, baryonic particles of primordial matter and as the continuously created, invisible particles of the 'dark matter' needed to hold together the large-scale structures seen in the universe; the back sides serve as the misnamed 'dark energy' driving the current acceleration of the expansion of the universe. Formation of cosmic voids, walls, filaments and nodes is attributed to expulsion of drainhole entrances from regions populated by drainhole exits and accumulation of the entrances on boundaries separating those regions.

  3. On the derivation of approximations to cellular automata models and the assumption of independence.

    Science.gov (United States)

    Davies, K J; Green, J E F; Bean, N G; Binder, B J; Ross, J V

    2014-07-01

    Cellular automata are discrete agent-based models, generally used in cell-based applications. There is much interest in obtaining continuum models that describe the mean behaviour of the agents in these models. Previously, continuum models have been derived for agents undergoing motility and proliferation processes; however, these models only hold under restricted conditions. In order to narrow down the reason for these restrictions, we explore three possible sources of error in deriving the model. These sources are the choice of limiting arguments, the use of a discrete-time model as opposed to a continuous-time model, and the assumption of independence between the states of sites. We present a rigorous analysis in order to gain a greater understanding of the significance of these three issues. By finding a limiting regime that accurately approximates the conservation equation for the cellular automata, we are able to conclude that the inaccuracy between our approximation and the cellular automata is completely based on the assumption of independence. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Local conservation scores without a priori assumptions on neutral substitution rates

    Directory of Open Access Journals (Sweden)

    Hagenauer Joachim

    2008-04-01

    Abstract Background Comparative genomics aims to detect signals of evolutionary conservation as an indicator of functional constraint. Surprisingly, results of the ENCODE project revealed that about half of the experimentally verified functional elements found in non-coding DNA were classified as unconstrained by computational predictions. Following this observation, it has been hypothesized that this may be partly explained by biased estimates of neutral evolutionary rates used by existing sequence conservation metrics. All methods we are aware of rely on a comparison with the neutral rate, and conservation is estimated by measuring the deviation of a particular genomic region from this rate. Consequently, it is a reasonable assumption that inaccurate neutral rate estimates may lead to biased conservation and constraint estimates. Results We propose a conservation signal that is produced by local Maximum Likelihood estimation of evolutionary parameters using an optimized sliding window, and present a Kullback-Leibler projection that allows multiple different estimated parameters to be transformed into a conservation measure. This conservation measure does not rely on assumptions about neutral evolutionary substitution rates, and few a priori assumptions on the properties of the conserved regions are imposed. We show the accuracy of our approach (KuLCons) on synthetic data and compare it to the scores generated by state-of-the-art methods (phastCons, GERP, SCONE) in an ENCODE region. We find that KuLCons is most often in agreement with the conservation/constraint signatures detected by GERP and SCONE, while qualitatively very different patterns from phastCons are observed. Opposed to standard methods, KuLCons can be extended to more complex evolutionary models, e.g. taking insertion and deletion events into account, and corresponding results show that scores obtained under this model can diverge significantly from scores using the simpler model.

  5. White Noise Assumptions Revisited : Regression Models and Statistical Designs for Simulation Practice

    OpenAIRE

    Kleijnen, J.P.C.

    2006-01-01

    Classic linear regression models and their concomitant statistical designs assume a univariate response and white noise. By definition, white noise is normally, independently, and identically distributed with zero mean. This survey tries to answer the following questions: (i) How realistic are these classic assumptions in simulation practice? (ii) How can these assumptions be tested? (iii) If assumptions are violated, can the simulation's I/O data be transformed such that the assumptions hold? (i...

  6. School Principals' Assumptions about Human Nature: Implications for Leadership in Turkey

    Science.gov (United States)

    Sabanci, Ali

    2008-01-01

    This article considers principals' assumptions about human nature in Turkey and the relationship between the assumptions held and the leadership style adopted in schools. The findings show that school principals hold Y-type assumptions and prefer a relationship-oriented style in their relations with assistant principals. However, both principals…

  7. Exploring the Influence of Ethnicity, Age, and Trauma on Prisoners' World Assumptions

    Science.gov (United States)

    Gibson, Sandy

    2011-01-01

    In this study, the author explores world assumptions of prisoners, how these assumptions vary by ethnicity and age, and whether trauma history affects world assumptions. A random sample of young and old prisoners, matched for prison location, was drawn from the New Jersey Department of Corrections prison population. Age and ethnicity had…

  8. 7 CFR 1436.16 - Foreclosure, liquidation, assumptions, sales or conveyance, or bankruptcy.

    Science.gov (United States)

    2010-01-01

    ... subsequent borrower's ability to show a satisfactory credit history. An assumption of the loan may be... FARM STORAGE FACILITY LOAN PROGRAM REGULATIONS § 1436.16 Foreclosure, liquidation, assumptions, sales...

  9. Philosophy of Technology Assumptions in Educational Technology Leadership: Questioning Technological Determinism

    Science.gov (United States)

    Webster, Mark David

    2013-01-01

    Scholars have emphasized that decisions about technology can be influenced by philosophy of technology assumptions, and have argued for research that critically questions technological determinist assumptions. Empirical studies of technology management in fields other than K-12 education provided evidence that philosophy of technology assumptions,…

  10. Self-transcendent positive emotions increase spirituality through basic world assumptions.

    Science.gov (United States)

    Van Cappellen, Patty; Saroglou, Vassilis; Iweins, Caroline; Piovesana, Maria; Fredrickson, Barbara L

    2013-01-01

    Spirituality has mostly been studied in psychology as implied in the process of overcoming adversity, being triggered by negative experiences, and providing positive outcomes. By reversing this pathway, we investigated whether spirituality may also be triggered by self-transcendent positive emotions, which are elicited by stimuli appraised as demonstrating higher good and beauty. In two studies, elevation and/or admiration were induced using different methods. These emotions were compared to two control groups, a neutral state and a positive emotion (mirth). Self-transcendent positive emotions increased participants' spirituality (Studies 1 and 2), especially for the non-religious participants (Study 1). Two basic world assumptions, i.e., belief in life as meaningful (Study 1) and in the benevolence of others and the world (Study 2) mediated the effect of these emotions on spirituality. Spirituality should be understood not only as a coping strategy, but also as an upward spiralling pathway to and from self-transcendent positive emotions.

  11. Regression assumptions in clinical psychology research practice-a systematic review of common misconceptions.

    Science.gov (United States)

    Ernst, Anja F; Albers, Casper J

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. They lead to using linear regression when it is inappropriate, and to employing alternative procedures with less statistical power when they are unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper calls for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking.
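The misconception this review flags can be made concrete with a small numerical sketch (synthetic data; any names below are illustrative): a strongly skewed predictor is fully compatible with the regression normality assumption, because that assumption concerns the errors, not the variables themselves.

```python
import numpy as np

def skewness(a):
    """Sample skewness: mean of the cubed standardized values."""
    a = np.asarray(a, dtype=float)
    z = (a - a.mean()) / a.std()
    return float(np.mean(z ** 3))

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=2000)        # heavily skewed predictor
y = 3.0 + 1.5 * x + rng.normal(0.0, 1.0, 2000)   # but the errors ARE normal

slope, intercept = np.polyfit(x, y, 1)           # ordinary least squares fit
residuals = y - (intercept + slope * x)

# x is far from normal (skew near 2); the residuals are not skewed.
print(f"skew(x) = {skewness(x):.2f}, skew(residuals) = {skewness(residuals):.2f}")
```

Testing the raw variables for normality here would wrongly suggest a violated assumption, while a residual check shows the model is fine.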

  12. Discrete-State and Continuous Models of Recognition Memory: Testing Core Properties under Minimal Assumptions

    Science.gov (United States)

    Kellen, David; Klauer, Karl Christoph

    2014-01-01

    A classic discussion in the recognition-memory literature concerns the question of whether recognition judgments are better described by continuous or discrete processes. These two hypotheses are instantiated by the signal detection theory model (SDT) and the 2-high-threshold model, respectively. Their comparison has almost invariably relied on…

  13. 77 FR 64548 - Solicitation of Comments on Request for United States Assumption of Concurrent Federal Criminal...

    Science.gov (United States)

    2012-10-22

    ... authority to establish a complex jurisdictional scheme for the prosecution of crimes committed in Indian... country typically depends on several factors, including the nature of the crime; whether the alleged... violations of the General Crimes Act and the Major Crimes Act within that tribe's Indian country. Department...

  14. 78 FR 16867 - Solicitation of Comments on Request for United States Assumption of Concurrent Federal Criminal...

    Science.gov (United States)

    2013-03-19

    ... with respect to Indian tribes, however, and has exercised this authority to establish a complex jurisdictional scheme for the prosecution of crimes committed in Indian country. (The term ``Indian country'' is..., including the nature of the crime; whether the alleged offender, the victim, or both are Indian; and whether...

  15. 77 FR 64547 - Solicitation of Comments on Request for United States Assumption of Concurrent Federal Criminal...

    Science.gov (United States)

    2012-10-22

    ... a complex jurisdictional scheme for the prosecution of crimes committed in Indian country. (The term... on several factors, including the nature of the crime; whether the alleged offender, the victim, or... General Crimes Act and the Major Crimes Act within that tribe's Indian country. Department of Justice...

  16. 77 FR 24516 - Solicitation of Comments on Request for United States Assumption of Concurrent Federal Criminal...

    Science.gov (United States)

    2012-04-24

    ... exercised this authority to establish a complex jurisdictional scheme for the prosecution of crimes... jurisdiction in Indian country typically depends on several factors, including the nature of the crime; whether... prosecute violations of the General Crimes Act and the Major Crimes Act within that tribe's Indian country...

  17. 77 FR 24222 - Solicitation of Comments on Request for United States Assumption of Concurrent Federal Criminal...

    Science.gov (United States)

    2012-04-23

    ... a complex jurisdictional scheme for the prosecution of crimes committed in Indian country. (The term... on several factors, including the nature of the crime; whether the alleged offender, the victim, or... the General Crimes Act and the Major Crimes Act within that tribe's Indian country. Department of...

  18. 77 FR 24517 - Solicitation of Comments on Request for United States Assumption of Concurrent Federal Criminal...

    Science.gov (United States)

    2012-04-24

    ... respect to Indian tribes, however, and has exercised this authority to establish a complex jurisdictional scheme for the prosecution of crimes committed in Indian country. (The term ``Indian country'' is defined..., including the nature of the crime; whether the alleged offender, the victim, or both are Indian; and whether...

  19. 77 FR 32998 - Solicitation of Comments on Request for United States Assumption of Concurrent Federal Criminal...

    Science.gov (United States)

    2012-06-04

    ... a complex jurisdictional scheme for the prosecution of crimes committed in Indian country. (The term... on several factors, including the nature of the crime; whether the alleged offender, the victim, or... the General Crimes Act and the Major Crimes Act within that tribe's Indian country. Department of...

  20. Probabilistic choice models in health-state valuation research: background, theories, assumptions and applications

    NARCIS (Netherlands)

    Arons, A.M.M.; Krabbe, P.F.M.

    2013-01-01

    Interest is rising in measuring subjective health outcomes, such as treatment outcomes that are not directly quantifiable (functional disability, symptoms, complaints, side effects and health-related quality of life). Health economists in particular have applied probabilistic choice models in the

  1. Probabilistic choice models in health-state valuation research : background, theories, assumptions and applications

    NARCIS (Netherlands)

    Arons, Alexander M M; Krabbe, Paul F M

    Interest is rising in measuring subjective health outcomes, such as treatment outcomes that are not directly quantifiable (functional disability, symptoms, complaints, side effects and health-related quality of life). Health economists in particular have applied probabilistic choice models in the

  2. 7 Mass casualty incidents: a review of triage severity planning assumptions.

    Science.gov (United States)

    Hunt, Paul

    2017-12-01

    Recent events involving a significant number of casualties have emphasised the importance of appropriate preparation for receiving hospitals, especially Emergency Departments, during the initial response phase of a major incident. Development of a mass casualty resilience and response framework in the Northern Trauma Network included a review of existing planning assumptions in order to ensure effective resource allocation, both in local receiving hospitals and system-wide. Existing planning assumptions regarding categorisation by triage level are generally stated as a ratio for P1:P2:P3 of 25%:25%:50% of the total number of injured survivors. This may significantly over- or underestimate the number in each level of severity in the case of a large-scale incident. A pilot literature review was conducted of the available evidence from historical incidents in order to gather data regarding the confirmed number of overall casualties, 'critical' cases, admitted cases, and non-urgent or discharged cases. These data were collated and grouped by mechanism in order to calculate an appropriate severity ratio for each incident type. Twelve articles regarding mass casualty incidents from the last two decades were identified covering three main incident types: (1) Mass transportation crash, (2) Building fire, and (3) Bomb and related terrorist attacks, involving a total of 3615 injured casualties. The overall mortality rate was calculated as 12.3%. Table 1 summarises the available patient casualty data from each of the specific incidents reported and calculated proportions of critical ('P1'), admitted ('P2'), and non-urgent or ambulatory cases ('P3'). Despite the heterogeneity of data and range of incident type there is sufficient evidence to suggest that current planning assumptions are incorrect and a more refined model is required. An important finding is the variation in proportion of critical cases depending upon the mechanism. For example, a greater than expected proportion

  3. Graded Hypercapnia-Calibrated BOLD: Beyond the Iso-metabolic Hypercapnic Assumption

    Directory of Open Access Journals (Sweden)

    Ian D. Driver

    2017-05-01

    Full Text Available Calibrated BOLD is a promising technique that overcomes the sensitivity of conventional fMRI to the cerebrovascular state, measuring either the basal level or the task-induced response of the cerebral metabolic rate of oxygen consumption (CMRO2). The calibrated BOLD method is susceptible to errors in the measurement of the calibration parameter M, the theoretical BOLD signal change that would occur if all deoxygenated hemoglobin were removed. The original and most popular method for measuring M uses hypercapnia (an increase in arterial CO2), making the assumption that it does not affect CMRO2. This assumption has since been challenged and recent studies have used a corrective term, based on literature values of a reduction in basal CMRO2 with hypercapnia. This is not ideal, as this value may vary across subjects and regions of the brain, and will depend on the level of hypercapnia achieved. Here we propose a new approach, using a graded hypercapnia design and the assumption that CMRO2 changes linearly with hypercapnia level, such that we can measure M without assuming prior knowledge of the scale of CMRO2 change. Through use of a graded hypercapnia gas challenge, we are able to remove the bias caused by a reduction in basal CMRO2 during hypercapnia, whilst simultaneously calculating the dose-wise CMRO2 change with hypercapnia. When compared with assuming no change in CMRO2, this approach resulted in significantly lower M-values in both visual and motor cortices, arising from significant dose-dependent hypercapnia reductions in basal CMRO2 of 1.5 ± 0.6%/mmHg (visual) and 1.8 ± 0.7%/mmHg (motor), where mmHg is the unit change in end-tidal CO2 level. Variability in the basal CMRO2 response to hypercapnia, due to experimental differences and inter-subject variability, is accounted for in this approach, unlike previous correction approaches, which use literature values. By incorporating measurement of, and correction for, the reduction in basal CMRO2

  4. Retinal image registration under the assumption of a spherical eye.

    Science.gov (United States)

    Hernandez-Matas, Carlos; Zabulis, Xenophon; Triantafyllou, Areti; Anyfanti, Panagiota; Argyros, Antonis A

    2017-01-01

    We propose a method for registering a pair of retinal images. The proposed approach employs point correspondences and assumes that the human eye has a spherical shape. The image registration problem is formulated as a 3D pose estimation problem, solved by estimating the rigid transformation that relates the views from which the two images were acquired. Given this estimate, each image can be warped upon the other so that pixels with the same coordinates image the same retinal point. Extensive experimental evaluation shows improved accuracy over state of the art methods, as well as robustness to noise and spurious keypoint matches. Experiments also indicate the method's applicability to the comparative analysis of images from different examinations that may exhibit changes and its applicability to diagnostic support. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Are Quantum States Real?

    Science.gov (United States)

    Hardy, Lucien

    2013-01-01

    In this paper we consider theories in which reality is described by some underlying variables, λ. Each value these variables can take represents an ontic state (a particular state of reality). The preparation of a quantum state corresponds to a distribution over the ontic states, λ. If we make three basic assumptions, we can show that the distributions over ontic states corresponding to distinct pure states are nonoverlapping. This means that we can deduce the quantum state from a knowledge of the ontic state. Hence, if these assumptions are correct, we can claim that the quantum state is a real thing (it is written into the underlying variables that describe reality). The key assumption we use in this proof is ontic indifference — that quantum transformations that do not affect a given pure quantum state can be implemented in such a way that they do not affect the ontic states in the support of that state. In fact this assumption is violated in the Spekkens toy model (which captures many aspects of quantum theory and in which different pure states of the model have overlapping distributions over ontic states). This paper proves that ontic indifference must be violated in any model reproducing quantum theory in which the quantum state is not a real thing. The argument presented in this paper is different from that given in a recent paper by Pusey, Barrett and Rudolph. It uses a different key assumption and it pertains to a single copy of the system in question.

  6. When real life wind speed exceeds design wind assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Winther-Jensen, M.; Joergensen, E.R. [Risoe National Lab., Roskilde (Denmark)]

    1999-03-01

    Most modern wind turbines are designed according to a standard or a set of standards to withstand the design loads with a defined survival probability. The loads are mainly given by the wind conditions on the site, which define the 'design wind speeds', normally including extreme wind speeds given as an average and a peak value. The extreme wind speeds are normally (e.g. in the upcoming IEC standard for wind turbine safety) defined as having a 50-year recurrence period. But what happens when the 100- or 10,000-year wind situation hits a wind turbine? Results of wind speeds higher than the extreme design wind speeds acting on wind turbines are presented, based on experiences especially from the State of Gujarat in India. A description of the normal approach to designing wind turbines in accordance with the standards is briefly given in this paper, with special focus on limitations and built-in safety levels. Based on that, other possibilities than simply accepting damage to wind turbines exposed to higher-than-design wind speeds are mentioned and discussed. The presentation does not intend to give the final answer to this problem but is meant as an input to further investigations and discussions. (au)

  7. NONLINEAR MODELS FOR DESCRIPTION OF CACAO FRUIT GROWTH WITH ASSUMPTION VIOLATIONS

    Directory of Open Access Journals (Sweden)

    JOEL AUGUSTO MUNIZ

    2017-01-01

    Full Text Available Cacao (Theobroma cacao L.) is an important fruit in the Brazilian economy, mainly cultivated in the southern part of the State of Bahia. The optimal harvest stage is a major factor in fruit quality, and knowledge of the growth curves can help, especially in identifying the ideal maturation stage for harvesting. Nonlinear regression models have been widely used for description of growth curves. However, several studies in this subject do not consider the residual analysis, the existence of a possible dependence between longitudinal observations, or the sample variance heterogeneity, compromising the modeling quality. The objective of this work was to compare the fit of nonlinear regression models, considering residual analysis and assumption violations, in the description of the cacao (clone Sial-105) fruit growth. The data evaluated were extracted from Brito and Silva (1983), who conducted the experiment in the Cacao Research Center, Ilheus, State of Bahia. The variables fruit length, diameter and volume as a function of fruit age were studied. The use of weighting and incorporation of residual dependencies was efficient, since the modeling became more consistent, improving the model fit. Considering the first-order autoregressive structure, when needed, leads to significant reduction in the residual standard deviation, making the estimates more reliable. The Logistic model was the most efficient for the description of the cacao fruit growth.
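
The fitting strategy the abstract describes, a logistic growth form with weighting for variance heterogeneity, can be sketched as below; the age and length values are illustrative stand-ins, not the Brito and Silva (1983) measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Logistic growth curve: y = a / (1 + exp(-k * (t - t0)))
def logistic(t, a, k, t0):
    return a / (1.0 + np.exp(-k * (t - t0)))

# Hypothetical fruit-length measurements (cm) versus fruit age (days);
# illustrative values only, not the data of Brito and Silva (1983).
age = np.array([15, 30, 45, 60, 75, 90, 105, 120, 135, 150], dtype=float)
length = np.array([2.1, 4.0, 7.5, 12.0, 16.2, 19.0, 20.5, 21.3, 21.6, 21.8])

# Weighted fit: a larger sigma downweights the more variable late-age
# observations, mimicking the variance-heterogeneity correction above.
sigma = np.where(age > 90, 0.8, 0.4)
params, _ = curve_fit(logistic, age, length, p0=[22.0, 0.08, 60.0], sigma=sigma)
a, k, t0 = params
print(f"asymptote = {a:.1f} cm, rate = {k:.3f}/day, inflection = {t0:.1f} days")
```

With real longitudinal data, the first-order autoregressive residual structure mentioned above would additionally be modeled, e.g. via generalized least squares.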

  8. The Geographic Distribution of Ixodes scapularis (Acari: Ixodidae) Revisited: The Importance of Assumptions About Error Balance.

    Science.gov (United States)

    Peterson, A Townsend; Raghavan, Ram K

    2017-07-01

    The black-legged tick, Ixodes scapularis Say, is the primary vector of Borrelia burgdorferi, a spirochete that causes Lyme disease, in eastern North America. Lyme disease risk has generally been considered to be focused in the Northeast and the northern Midwest in the United States, yet the distribution of the vector extends considerably more broadly. A recent analysis of the distribution of the species using ecological niche modeling approaches painted an odd biogeographic picture, in which the species is distributed in a "rimming" distribution across the northern Midwest and Northeast, and along the Atlantic and Gulf coasts of the eastern United States, but not broadly in the interior of eastern North America. Here, we reanalyze the situation for this species, and demonstrate that the distribution estimated in the previous study was a consequence of assumptions about relative weights applied to different error types. A more appropriate error weighting scheme for niche modeling analyses, in which omission error is prioritized over commission error, shows a simpler distribution, in which the species ranges continuously across eastern North America; this distributional pattern is supported by independent occurrence data from the eastern Great Plains, in Kansas. We discuss implications for public health planning and intervention across the region, as well as for developing effective and predictive maps of vector distributions and pathogen transmission risk. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
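
The error-balance point can be made concrete with a toy threshold-selection sketch (the suitability scores and the 5:1 weight below are hypothetical, not from the study): prioritizing omission error over commission error lowers the presence threshold and thus broadens the predicted range:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical model suitability scores for known presence points and for
# background (pseudo-absence) points.
presence_scores = rng.normal(0.7, 0.15, 300).clip(0, 1)
background_scores = rng.normal(0.4, 0.2, 300).clip(0, 1)

def weighted_error(threshold, w_omission, w_commission):
    omission = np.mean(presence_scores < threshold)       # missed presences
    commission = np.mean(background_scores >= threshold)  # over-prediction
    return w_omission * omission + w_commission * commission

thresholds = np.linspace(0.0, 1.0, 201)
# Equal weights versus omission prioritized, as the reanalysis advocates:
t_equal = thresholds[np.argmin([weighted_error(t, 1, 1) for t in thresholds])]
t_omit = thresholds[np.argmin([weighted_error(t, 5, 1) for t in thresholds])]
print(f"equal-weight threshold: {t_equal:.2f}, omission-weighted: {t_omit:.2f}")
```

The omission-weighted threshold sits lower, so more of the landscape is classified as suitable, which is the direction of the broader, continuous distribution the reanalysis recovers.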

  9. Tax subsidies for employer-sponsored health insurance: updated microsimulation estimates and sensitivity to alternative incidence assumptions.

    Science.gov (United States)

    Miller, G Edward; Selden, Thomas M

    2013-04-01

    To estimate 2012 tax expenditures for employer-sponsored insurance (ESI) in the United States and to explore the sensitivity of estimates to assumptions regarding the incidence of employer premium contributions. Nationally representative Medical Expenditure Panel Survey data from the 2005-2007 Household Component (MEPS-HC) and the 2009-2010 Insurance Component (MEPS IC). We use MEPS HC workers to construct synthetic workforces for MEPS IC establishments, applying the workers' marginal tax rates to the establishments' insurance premiums to compute the tax subsidy, in aggregate and by establishment characteristics. Simulation enables us to examine the sensitivity of ESI tax subsidy estimates to a range of scenarios for the within-firm incidence of employer premium contributions when workers have heterogeneous health risks and make heterogeneous plan choices. We simulate the total ESI tax subsidy for all active, civilian U.S. workers to be $257.4 billion in 2012. In the private sector, the subsidy disproportionately flows to workers in large establishments and establishments with predominantly high wage or full-time workforces. The estimates are remarkably robust to alternative incidence assumptions. The aggregate value of the ESI tax subsidy and its distribution across firms can be reliably estimated using simplified incidence assumptions. © Health Research and Educational Trust.

  10. 7 CFR 772.10 - Transfer and assumption-AMP loans.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Transfer and assumption-AMP loans. 772.10 Section 772..., DEPARTMENT OF AGRICULTURE SPECIAL PROGRAMS SERVICING MINOR PROGRAM LOANS § 772.10 Transfer and assumption—AMP loans. (a) Eligibility. The Agency may approve transfers and assumptions of AMP loans when: (1) The...

  11. Assumptions about Ecological Scale and Nature Knowing Best Hiding in Environmental Decisions

    Science.gov (United States)

    R. Bruce Hull; David P. Robertson; David Richert; Erin Seekamp; Gregory J. Buhyoff

    2002-01-01

    Assumptions about nature are embedded in people's preferences for environmental policy and management. The people we interviewed justified preservationist policies using four assumptions about nature knowing best: nature is balanced, evolution is progressive, technology is suspect, and the Creation is perfect. They justified interventionist policies using three...

  12. Behavioural assumptions in labour economics: Analysing social security reforms and labour market transitions

    NARCIS (Netherlands)

    van Huizen, T.M.

    2012-01-01

    The aim of this dissertation is to test behavioural assumptions in labour economics models and thereby improve our understanding of labour market behaviour. The assumptions under scrutiny in this study are derived from an analysis of recent influential policy proposals: the introduction of savings

  13. Dialogic or Dialectic? The Significance of Ontological Assumptions in Research on Educational Dialogue

    Science.gov (United States)

    Wegerif, Rupert

    2008-01-01

    This article explores the relationship between ontological assumptions and studies of educational dialogue through a focus on Bakhtin's "dialogic". The term dialogic is frequently appropriated to a modernist framework of assumptions, in particular the neo-Vygotskian or sociocultural tradition. However, Vygotsky's theory of education is dialectic,…

  14. Making Foundational Assumptions Transparent: Framing the Discussion about Group Communication and Influence

    Science.gov (United States)

    Meyers, Renee A.; Seibold, David R.

    2009-01-01

    In this article, the authors seek to augment Dean Hewes's (1986, 1996) intriguing bracketing and admirable larger effort to "return to basic theorizing in the study of group communication" by making transparent the foundational, and debatable, assumptions that underlie those models. Although these assumptions are addressed indirectly by Hewes, the…

  15. Sensitivity of the OMI ozone profile retrieval (OMO3PR) to a priori assumptions

    NARCIS (Netherlands)

    Mielonen, T.; De Haan, J.F.; Veefkind, J.P.

    2014-01-01

    We have assessed the sensitivity of the operational OMI ozone profile retrieval (OMO3PR) algorithm to a number of a priori assumptions. We studied the effect of stray light correction, surface albedo assumptions and a priori ozone profiles on the retrieved ozone profile. Then, we studied how to

  16. Food-based dietary guidelines : some assumptions tested for the Netherlands

    NARCIS (Netherlands)

    Löwik, M.R.H.; Hulshof, K.F.A.M.; Brussaard, J.H.

    1999-01-01

    Recently, the concept of food-based dietary guidelines has been introduced by WHO and FAO. For this concept, several assumptions were necessary. The validity and potential consequences of some of these assumptions are discussed in this paper on the basis of the Dutch National Food Consumption

  17. 77 FR 41270 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2012-07-13

    ... covered by title IV of the Employee Retirement Income Security Act of 1974. The interest assumptions in... Allocation of Assets in Single-Employer Plans (29 CFR Part 4044) prescribes interest assumptions for valuing benefits under terminating covered single- employer plans for purposes of allocation of assets under ERISA...

  18. 76 FR 2578 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2011-01-14

    ... title IV of the Employee Retirement Income Security Act of 1974. PBGC uses the interest assumptions in... payments interest assumptions for February 2011.\\1\\ \\1\\ Appendix B to PBGC's regulation on Allocation of... under terminating covered single- employer plans for purposes of allocation of assets under ERISA...

  19. 75 FR 63380 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2010-10-15

    ... title IV of the Employee Retirement Income Security Act of 1974. ] PBGC uses the interest assumptions in... payments interest assumptions for November 2010.\\1\\ \\1\\ Appendix B to PBGC's regulation on Allocation of... under terminating covered single- employer plans for purposes of allocation of assets under ERISA...

  20. 78 FR 49682 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2013-08-15

    ... of the Employee Retirement Income Security Act of 1974. The interest assumptions in the regulation... payments interest assumptions for September 2013.\\1\\ \\1\\ Appendix B to PBGC's regulation on Allocation of... under terminating covered single- employer plans for purposes of allocation of assets under ERISA...

  1. Investigating Teachers' and Students' Beliefs and Assumptions about CALL Programme at Caledonian College of Engineering

    Science.gov (United States)

    Ali, Holi Ibrahim Holi

    2012-01-01

    This study is set to investigate students' and teachers' perceptions and assumptions about the newly implemented CALL Programme at the School of Foundation Studies, Caledonian College of Engineering, Oman. Two versions of a questionnaire were administered to 24 teachers and 90 students to collect their beliefs and assumptions about the CALL programme. The…

  2. Sensitivity of TRIM projections to management, harvest, yield, and stocking adjustment assumptions.

    Science.gov (United States)

    Susan J. Alexander

    1991-01-01

    The Timber Resource Inventory Model (TRIM) was used to make several projections of forest industry timber supply for the Douglas-fir region. The sensitivity of these projections to assumptions about management and yields is discussed. A base run is compared to runs in which yields were altered, stocking adjustment was eliminated, harvest assumptions were changed, and...

  3. Making Sense out of Sex Stereotypes in Advertising: A Feminist Analysis of Assumptions.

    Science.gov (United States)

    Ferrante, Karlene

    Sexism and racism in advertising have been well documented, but feminist research aimed at social change must go beyond existing content analyses to ask how advertising is created. Analysis of the "mirror assumption" (advertising reflects society) and the "gender assumption" (advertising speaks in a male voice to female…

  4. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    Science.gov (United States)

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.

  5. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    Science.gov (United States)

    Ernst, Anja F.

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper calls for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking. PMID:28533971
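
A minimal sketch of the misconception the review flags, using simulated data: a Shapiro-Wilk test rejects normality for a skewed predictor, while the residuals, which the normality assumption actually concerns, are well behaved:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# The predictor x is heavily skewed (lognormal), but the regression errors
# are normal -- exactly the case where testing the variables themselves
# gives the wrong impression.
x = rng.lognormal(mean=0.0, sigma=1.0, size=500)
y = 2.0 + 3.0 * x + rng.normal(scale=1.0, size=500)

# Ordinary least squares via least-squares solve of the design matrix
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta

# Shapiro-Wilk on the raw predictor rejects normality decisively...
_, p_x = stats.shapiro(x)
# ...while the residuals, the quantity the assumption concerns, do not.
_, p_resid = stats.shapiro(residuals)
print(f"p(x normal) = {p_x:.2g}, p(residuals normal) = {p_resid:.2g}")
```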

  6. A violation of the conditional independence assumption in the two-high-threshold model of recognition memory.

    Science.gov (United States)

    Chen, Tina; Starns, Jeffrey J; Rotello, Caren M

    2015-07-01

    The 2-high-threshold (2HT) model of recognition memory assumes that test items result in distinct internal states: they are either detected or not, and the probability of responding at a particular confidence level that an item is "old" or "new" depends on the state-response mapping parameters. The mapping parameters are independent of the probability that an item yields a particular state (e.g., both strong and weak items that are detected as old have the same probability of producing a highest-confidence "old" response). We tested this conditional independence assumption by presenting nouns 1, 2, or 4 times. To maximize the strength of some items, "superstrong" items were repeated 4 times and encoded in conjunction with pleasantness, imageability, anagram, and survival processing tasks. The 2HT model failed to simultaneously capture the response rate data for all item classes, demonstrating that the data violated the conditional independence assumption. In contrast, a Gaussian signal detection model, which posits that the level of confidence that an item is "old" or "new" is a function of its continuous strength value, provided a good account of the data. (c) 2015 APA, all rights reserved.
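
A sketch of the competing signal detection account (the d' values and confidence criteria below are hypothetical): because memory strength is continuous, the predicted rate of highest-confidence "old" responses rises smoothly with item strength, rather than being fixed by a state-response mapping as in the 2HT model:

```python
import numpy as np
from scipy.stats import norm

# Equal-variance Gaussian signal detection sketch: confidence criteria
# partition the continuous strength axis into six rating responses.
criteria = np.array([-1.0, -0.3, 0.3, 1.0, 1.7])  # 5 criteria -> 6 ratings

def rating_probs(d_prime):
    """P(rating 1..6, from sure-new to sure-old) at mean strength d_prime."""
    cdf = norm.cdf(criteria, loc=d_prime)   # P(strength below each criterion)
    edges = np.concatenate(([0.0], cdf, [1.0]))
    return np.diff(edges)                   # probability mass in each bin

# Hypothetical strengths for new, weak, strong, and "superstrong" items:
for label, d in [("new", 0.0), ("weak", 0.8), ("strong", 1.5), ("superstrong", 2.5)]:
    print(label, np.round(rating_probs(d), 3))
```

Note how the last column (highest-confidence "old") grows continuously with d', which is the behavior that let the Gaussian model fit the superstrong items where the 2HT mapping parameters could not.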

  7. The privatization of spa companies in Poland - An evaluation of policy assumptions and implementation.

    Science.gov (United States)

    Szromek, Adam R; Romaniuk, Piotr; Hadzik, Andrzej

    2016-04-01

    The aim of this article is to present the course of privatization of spa companies in Poland during the period 2001-2011. We discuss assumptions of the privatization process, as well as actual implementation, having identified the process as chaotic and inconsistent with prior legal provisions. We found that in its applied form the process resulted in limitation of the therapeutic potential of spas, and reduction of the State's ability to implement health policy in a legally determined form. We also found that privatization potentially improved spa infrastructure standards and increases the tourist potential of spa resorts. We recommend that clear eligibility criteria are applied to institutions in the privatization process, as well as the provision of legal guarantees for access to spa services financed from public resources. Such guarantees should be made a public obligation, to ensure the availability of services for insured persons, and there should be an obligation to maintain a specific part of a given institution's potential for the needs of patients funded by public health insurance. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. Transient productivity index for numerical well test simulations

    Energy Technology Data Exchange (ETDEWEB)

    Blanc, G.; Ding, D.Y.; Ene, A. [Institut Francais du Petrole, Pau (France)] [and others]

    1997-08-01

    The most difficult aspect of numerical simulation of well tests is the treatment of the Bottom Hole Flowing (BHF) Pressure. In full field simulations, this pressure is derived from the Well-block Pressure (WBP) using a numerical productivity index which accounts for the grid size and permeability, and for the well completion. This productivity index is calculated assuming a pseudo-steady state flow regime in the vicinity of the well and is therefore constant during the well production period. Such a pseudo-steady state assumption is no longer valid for the early time of a well test simulation as long as the pressure perturbation has not reached several grid-blocks around the well. This paper offers two different solutions to this problem: (1) The first one is based on the derivation of a Numerical Transient Productivity Index (NTPI) to be applied to Cartesian grids; (2) The second one is based on the use of a Corrected Transmissibility and Accumulation Term (CTAT) in the flow equation. The representation of the pressure behavior given by both solutions is far more accurate than the conventional one as shown by several validation examples which are presented in the following pages.
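
The conventional pseudo-steady-state numerical productivity index the paper takes as its starting point is commonly computed from Peaceman's equivalent-radius formula; the sketch below uses that formula with illustrative values (the paper's NTPI and CTAT corrections are not reproduced here):

```python
import math

# Conventional (pseudo-steady-state) well-block productivity index after
# Peaceman, relating the Bottom Hole Flowing (BHF) pressure to the
# Well-block Pressure (WBP) through grid size, permeability, and skin.
def peaceman_pi(k, h, dx, rw, mu, skin=0.0):
    """SI units: k [m^2], h [m], dx [m], rw [m], mu [Pa.s] -> PI [m^3/(s.Pa)]."""
    ro = 0.1987 * dx  # equivalent radius, square isotropic grid block (~0.2*dx)
    return 2.0 * math.pi * k * h / (mu * (math.log(ro / rw) + skin))

# Illustrative numbers: 100 mD, 10 m thickness, 100 m block, 0.1 m wellbore, 1 cP
k = 100e-15                      # 100 mD in m^2 (1 mD ~ 1e-15 m^2, rounded)
pi = peaceman_pi(k=k, h=10.0, dx=100.0, rw=0.1, mu=1e-3)

# BHF pressure from a well-block pressure of 250 bar at 500 m^3/day
q = 500.0 / 86400.0              # m^3/s
p_wf = 250e5 - q / pi            # Pa
print(f"PI = {pi:.3e} m^3/(s.Pa), p_wf = {p_wf / 1e5:.1f} bar")
```

At early test times the pressure perturbation has not reached the block boundary, so this constant index misrepresents the BHF pressure; the NTPI and CTAT approaches in the paper effectively make the index time-dependent.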

  9. An Exploration of Dental Students' Assumptions About Community-Based Clinical Experiences.

    Science.gov (United States)

    Major, Nicole; McQuistan, Michelle R

    2016-03-01

    The aim of this study was to ascertain which assumptions dental students recalled feeling prior to beginning community-based clinical experiences and whether those assumptions were fulfilled or challenged. All fourth-year students at the University of Iowa College of Dentistry & Dental Clinics participate in community-based clinical experiences. At the completion of their rotations, they write a guided reflection paper detailing the assumptions they had prior to beginning their rotations and assessing the accuracy of their assumptions. For this qualitative descriptive study, the 218 papers from three classes (2011-13) were analyzed for common themes. The results showed that the students had a variety of assumptions about their rotations. They were apprehensive about working with challenging patients, performing procedures for which they had minimal experience, and working too slowly. In contrast, they looked forward to improving their clinical and patient management skills and knowledge. Other assumptions involved the site (e.g., the equipment/facility would be outdated; protocols/procedures would be similar to the dental school's). Upon reflection, students reported experiences that both fulfilled and challenged their assumptions. Some continued to feel apprehensive about treating certain patient populations, while others found it easier than anticipated. Students were able to treat multiple patients per day, which led to increased speed and patient management skills. However, some reported challenges with time management. Similarly, students were surprised to discover some clinics were new/updated although some had limited instruments and materials. Based on this study's findings about students' recalled assumptions and reflective experiences, educators should consider assessing and addressing their students' assumptions prior to beginning community-based dental education experiences.

  10. Mathematical assumptions versus biological reality: myths in affected sib pair linkage analysis.

    Science.gov (United States)

    Elston, Robert C; Song, Danhong; Iyengar, Sudha K

    2005-01-01

    Affected sib pair (ASP) analysis has become common ever since it was shown that, under very specific assumptions, ASPs afford a powerful design for linkage analysis. In 2003, Vieland and Huang, on the basis of a "fundamental heterogeneity equation," proved that heterogeneity and epistasis are confounded in ASP linkage analysis. A much more serious limitation of ASP linkage analysis is the implicit assumption that randomly sampled sib pairs share half their alleles identical by descent at any locus, whereas a critical assumption underlying Vieland and Huang's proof is that of joint Hardy-Weinberg equilibrium proportions at two trait loci. These are considered as examples of mathematical assumptions that may not always reflect biological reality. More-robust sib-pair designs and appropriate methods for their analysis have long been available.
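
The "share half their alleles identical by descent" expectation the abstract refers to is the classic result that randomly sampled sib pairs share 0, 1, or 2 alleles IBD at a locus with probabilities 1/4, 1/2, 1/4; a quick Monte Carlo check:

```python
import random

random.seed(1)

def sib_pair_ibd():
    """Number of alleles (0-2) a random sib pair shares IBD at one locus."""
    # Each sib independently inherits one of two parental alleles from
    # each parent; sharing happens per-parent with probability 1/2.
    sib1 = (random.randint(0, 1), random.randint(0, 1))  # (paternal, maternal)
    sib2 = (random.randint(0, 1), random.randint(0, 1))
    return (sib1[0] == sib2[0]) + (sib1[1] == sib2[1])

n = 100_000
counts = [0, 0, 0]
for _ in range(n):
    counts[sib_pair_ibd()] += 1
print([c / n for c in counts])  # close to [0.25, 0.50, 0.25]
```

ASP linkage methods test for departures from this null sharing distribution among affected pairs; the abstract's point is that the null itself need not hold for non-randomly sampled pairs.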

  11. Washington International Renewable Energy Conference (WIREC) 2008 Pledges. Methodology and Assumptions Summary

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, Bill [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bilello, Daniel E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cowlin, Shannon C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wise, Alison [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2008-08-01

    This report describes the methodology and assumptions used by NREL in quantifying the potential CO2 reductions resulting from more than 140 governments, international organizations, and private-sector representatives pledging to advance the uptake of renewable energy.

  12. Assumptions for Including Organic Food in the Gastronomic Offering of Istrian Agritourism

    Directory of Open Access Journals (Sweden)

    Pavlo Ružić

    2009-01-01

    Full Text Available The authors of this research analyze assumptions for including organic food in the gastronomic offering of Istrian agritourism. They assume that a gastronomic offering of Istrian agritourism that includes organic food would be more acceptable and competitive on the tourist market. The authors tested their assumptions using surveys conducted in 2007 and 2008 on tourists in Istria, asking whether they prefer organic food, whether organic food matches modern tourist trends, and whether they are willing to pay more for it.

  13. Mathematical Assumptions versus Biological Reality: Myths in Affected Sib Pair Linkage Analysis

    OpenAIRE

    Elston, Robert C.; Song, Danhong; Iyengar, Sudha K.

    2004-01-01

    Affected sib pair (ASP) analysis has become common ever since it was shown that, under very specific assumptions, ASPs afford a powerful design for linkage analysis. In 2003, Vieland and Huang, on the basis of a “fundamental heterogeneity equation,” proved that heterogeneity and epistasis are confounded in ASP linkage analysis. A much more serious limitation of ASP linkage analysis is the implicit assumption that randomly sampled sib pairs share half their alleles identical by descent at any ...

  14. Post-traumatic stress and world assumptions: the effects of religious coping.

    Science.gov (United States)

    Zukerman, Gil; Korn, Liat

    2014-12-01

    Religiosity has been shown to moderate the negative effects of traumatic event experiences. The current study was designed to examine the relationship between post-traumatic stress (PTS) following traumatic event exposure; world assumptions, defined as basic cognitive schemas regarding the world and the self; and religious coping, conceptualized as drawing on religious beliefs and practices for understanding and dealing with life stressors. This study examined 777 Israeli undergraduate students who completed several questionnaires which sampled individual world assumptions and religious coping in addition to measuring PTS, as manifested by the PTSD Checklist. Results indicate that positive religious coping was significantly associated with more positive world assumptions, while negative religious coping was significantly associated with more negative world assumptions. Additionally, negative world assumptions were significantly associated with more avoidance symptoms, while reporting higher rates of traumatic event exposure was significantly associated with more hyper-arousal. These findings suggest that religious-related cognitive schemas directly affect world assumptions by creating protective shields that may prevent the negative effects of confronting an extreme negative experience.

  15. Shattering Man’s Fundamental Assumptions in Don DeLillo’s Falling Man

    Directory of Open Access Journals (Sweden)

    Hazim Adnan Hashim

    2016-09-01

    Full Text Available The present study addresses the effects of traumatic events, such as the September 11 attacks, on victims' fundamental assumptions. These beliefs or assumptions provide individuals with expectations about the world and their sense of self-worth, and thus ground people's sense of security, stability, and orientation. The September 11 terrorist attacks in the U.S.A. were very tragic for Americans because they fundamentally changed their understandings of many aspects of life. The attacks led many individuals to build new kinds of beliefs and assumptions about themselves and the world. Many writers have written about the human ordeals that followed this incident. Don DeLillo's Falling Man reflects the traumatic repercussions of this disaster on Americans' fundamental assumptions. The objective of this study is to examine the novel from the perspective of the trauma that afflicted the victims' fundamental understandings of the world and the self. Individuals' fundamental understandings can be changed or modified by exposure to certain types of events, such as war, terrorism, political violence, or even the sense of alienation. The Assumptive World theory of Ronnie Janoff-Bulman is used as a framework to study the traumatic experience of the characters in Falling Man. The significance of the study lies in providing a new perception in the field of trauma that can help trauma victims adopt alternative assumptions or reshape their previous ones to heal from traumatic effects.

  16. The contributions of interpersonal trauma exposure and world assumptions to predicting dissociation in undergraduates.

    Science.gov (United States)

    Lilly, Michelle M

    2011-01-01

    This study examines the relationship between world assumptions and trauma history in predicting symptoms of dissociation. It was proposed that cognitions related to the safety and benevolence of the world, as well as self-worth, would be related to the presence of dissociative symptoms, the latter of which were theorized to defend against threats to one's sense of safety, meaningfulness, and self-worth. Undergraduates from a midwestern university completed the Multiscale Dissociation Inventory, World Assumptions Scale, and Traumatic Life Events Questionnaire. Consistent with the hypotheses, world assumptions were related to the extent of trauma exposure and interpersonal trauma exposure in the sample but were not significantly related to non-interpersonal trauma exposure. World assumptions acted as a significant partial mediator of the relationship between trauma exposure and dissociation, and this relationship held when interpersonal trauma exposure specifically was considered. The factor structures of dissociation and world assumptions were also examined using principal component analysis, with the benevolence and self-worth factors of the World Assumptions Scale showing the strongest relationships with trauma exposure and dissociation. Clinical implications are discussed.

  17. Uncovering Underlying Assumptions Regarding Education and Technology in Educational Reform Efforts A conversation with Dr. Larry Johnson

    Directory of Open Access Journals (Sweden)

    Gabriela Melano

    2000-11-01

    Full Text Available Educational systems around the world, and specifically in the United States, have long awaited genuine reform efforts. Technology is often perceived as a panacea, if not as a crucial instrument, in any educational reform effort. In a conversation with one of his students, Doctor Johnson discusses how the underlying assumptions embedded in our current schooling practices need to be seriously reviewed before any technology strategy is considered. New understandings, as opposed to mere information, are what schools need to reach in order to transform themselves. Finally, Dr. Johnson provides two brief examples, one in the United States and another in México, where hermeneutical approaches have been used for educational reform endeavors.

  18. Impacts of cloud overlap assumptions on radiative budgets and heating fields in convective regions

    Science.gov (United States)

    Wang, XiaoCong; Liu, YiMin; Bao, Qing

    2016-01-01

    Impacts of cloud overlap assumptions on radiative budgets and heating fields are explored with the aid of a cloud-resolving model (CRM), which provided cloud geometry as well as cloud micro- and macro-properties. Large-scale forcing data to drive the CRM are from the TRMM Kwajalein Experiment and the Global Atmospheric Research Program's Atlantic Tropical Experiment field campaigns, during which abundant convective systems were observed. The investigated overlap assumptions include those that were traditional and widely used in the past and the one recently addressed by Hogan and Illingworth (2000), in which the vertically projected cloud fraction is expressed by a linear combination of maximum and random overlap, with the weighting coefficient depending on the so-called decorrelation length Lcf. Results show that both shortwave and longwave cloud radiative forcings (SWCF/LWCF) are significantly underestimated under maximum (MO) and maximum-random (MRO) overlap assumptions, and remarkably overestimated under the random overlap (RO) assumption, in comparison with those using the CRM's inherent cloud geometry. These biases can reach as high as 100 W m-2 for SWCF and 60 W m-2 for LWCF. By its very nature, the general overlap (GenO) assumption exhibits an encouraging performance in both SWCF and LWCF simulations, with biases reduced almost 3-fold compared with traditional overlap assumptions. The superiority of the GenO assumption is also manifested in the simulation of shortwave and longwave radiative heating fields, which are either significantly overestimated or underestimated under traditional overlap assumptions. The study also points out the deficiency of assuming a constant Lcf in the GenO assumption. Further examinations indicate that the CRM-diagnosed Lcf varies among different cloud types and tends to be stratified in the vertical. The new parameterization that takes into account variation of Lcf in the vertical well reproduces such a relationship and
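The GenO combination described in the abstract can be written for a two-layer case as C = α·Cmax + (1 − α)·Crand, where Cmax = max(C1, C2), Crand = C1 + C2 − C1·C2, and the weight α decays with layer separation relative to the decorrelation length. The sketch below follows Hogan and Illingworth's formulation with α = exp(−Δz/Lcf); the function name and the exponential form are our assumptions, not details taken from this study:

```python
import math

def combined_cover(c1, c2, dz, lcf):
    """Projected cover of two cloud layers under the general overlap
    (GenO) assumption: a weighted blend of the maximum-overlap and
    random-overlap limits, with weight set by the decorrelation length."""
    alpha = math.exp(-dz / lcf)        # overlap parameter, 1 = maximum, 0 = random
    c_max = max(c1, c2)                # maximum-overlap limit
    c_rand = c1 + c2 - c1 * c2         # random-overlap limit
    return alpha * c_max + (1.0 - alpha) * c_rand
```

As Lcf grows large relative to the layer spacing, the result approaches the maximum-overlap cover; as Lcf shrinks, it approaches the random-overlap cover, consistent with the MO/RO extremes discussed above.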

  19. Faculty assumptions about the student characteristics required for success in introductory college biology

    Science.gov (United States)

    Daempfle, Peter August

    2000-10-01

    The clear and effective matching of post-secondary faculty requirements with incoming student characteristics is strongly related to undergraduate student success (National Research Council, 1996). It is especially a responsibility of secondary school biology teachers to prepare their students for introductory college biology (life science) requirements. If post-secondary faculty requirements for incoming student preparation are to be met, secondary teacher assumptions about what is required must be congruent with post-secondary faculty assumptions. The purpose of this study was to learn how well high school biology teachers' assumptions about the knowledge, abilities, and dispositions necessary for success in introductory college biology courses match the knowledge, abilities, and dispositions that faculty teaching those courses assume are essential to success. The research questions were: (1) What are secondary biology teachers' assumptions about the knowledge, abilities, and dispositions required for success in introductory college biology (life science) courses? (2) What are post-secondary biology teachers' assumptions about the knowledge, abilities, and dispositions required for success in introductory college biology (life science) courses? and (3) Do secondary biology teachers' assumptions about the knowledge, abilities, and dispositions required for success in introductory college biology (life science) courses match those of post-secondary biology teachers? To answer these questions, faculty were interviewed individually and the results of the interviews summarized. Then all faculty participants met in focus groups to discuss the summary data. The study included life science faculty participants from secondary and post-secondary institutions. The results of this study indicated five major findings about faculty assumptions. First, secondary and post-secondary faculty do not have the same assumptions about the

  20. The biosphere at Laxemar. Data, assumptions and models used in the SR-Can assessment

    Energy Technology Data Exchange (ETDEWEB)

    Karlsson, Sara; Kautsky, Ulrik; Loefgren, Anders; Soederbaeck, Bjoern (eds.)

    2006-10-15

    This is essentially a compilation of a variety of reports concerning the site investigations, the research activities and information derived from other sources important for the safety assessment. The main objective is to present the prerequisites, methods and data used in the biosphere modelling for the safety assessment SR-Can at the Laxemar site. A major part of the report focuses on how site-specific data are used, recalculated or modified in order to be applicable in the safety assessment context, and on the methods and sub-models that are the basis for the biosphere modelling. Furthermore, the assumptions made as to the future states of surface ecosystems are mainly presented in this report. A similar report is provided for the Forsmark area. This report summarises the method adopted for safety assessment following a radionuclide release into the biosphere. The approach utilises the information about the site as far as possible and presents a way of calculating risk to humans. A central tool in the work is the description of the topography, where there is good understanding of the present conditions and the development over time is fairly predictable. The topography affects surface hydrology, sedimentation, size of drainage areas and the characteristics of ecosystems. Other parameters are human nutritional intake, which is assumed to be constant over time, and primary production (photosynthesis), which is also a fairly constant parameter over time. The Landscape Dose Factor approach (LDF) gives an integrated measure for the site and also resolves the issues relating to the size of the group with the highest exposure. If this approach is to be widely accepted as a method, some improvements and refinements are still necessary: collecting missing site data, reanalysing site data, reviewing radionuclide-specific data, reformulating ecosystem models and evaluating the results with further sensitivity analysis.

  1. Contemporary assumptions on human nature and work and approach to human potential managing

    Directory of Open Access Journals (Sweden)

    Vujić Dobrila

    2006-01-01

    Full Text Available The general problem of this research is to identify whether there is a relationship between assumptions on human nature and work (McGregor, Argyris, Schein, Steers and Porter) and preference for a general organizational model, as well as for mechanisms of human potential management. This research was carried out in 2005/2006. The sample consisted of 317 subjects (197 managers, 105 highly educated subordinates and 15 entrepreneurs) in 7 big enterprises and in a group of small business enterprises differing in terms of the entrepreneurs' structure and type of activity. The general hypothesis, that assumptions on human nature and work are statistically significantly connected to the preferred approach (model) of work motivation and commitment, has been confirmed. The specific hypotheses have also been confirmed: assumptions on a human as a rational economic being correlate significantly with only two mechanisms of traditional models, the mechanism of work-method control and the working-discipline mechanism; assumptions on a human as a social being correlate significantly with all mechanisms of engaging employees belonging to the human relations model, except the mechanism of introducing an adequate type of prizes for all employees independently of working results; and assumptions on a human as a creative being correlate significantly and positively with preference for two mechanisms belonging to the human resource model, investing in education and training and creating conditions for the application of knowledge and skills. Young subjects with assumptions on a human as a creative being prefer a much broader repertoire of mechanisms belonging to the human resources model than the remaining categories of subjects in the sample. The connection between assumptions on human nature and preferred models of engagement appears especially in the sub-sample of managers, in the category of young subjects

  2. A priori assumptions about characters as a cause of incongruence between molecular and morphological hypotheses of primate interrelationships.

    Science.gov (United States)

    Tornow, Matthew A; Skelton, Randall R

    2012-01-01

    When molecules and morphology produce incongruent hypotheses of primate interrelationships, the data are typically viewed as incompatible, and molecular hypotheses are often considered to be better indicators of phylogenetic history. However, it has been demonstrated that the choice of which taxa to include in cladistic analysis as well as assumptions about character weighting, character state transformation order, and outgroup choice all influence hypotheses of relationships and may positively influence tree topology, so that relationships between extant taxa are consistent with those found using molecular data. Thus, the source of incongruence between morphological and molecular trees may lie not in the morphological data themselves but in assumptions surrounding the ways characters evolve and their impact on cladistic analysis. In this study, we investigate the role that assumptions about character polarity and transformation order play in creating incongruence between primate phylogenies based on morphological data and those supported by multiple lines of molecular data. By releasing constraints imposed on published morphological analyses of primates from disparate clades and subjecting those data to parsimony analysis, we test the hypothesis that incongruence between morphology and molecules results from inherent flaws in morphological data. To quantify the difference between incongruent trees, we introduce a new method called branch slide distance (BSD). BSD mitigates many of the limitations attributed to other tree comparison methods, thus allowing for a more accurate measure of topological similarity. We find that releasing a priori constraints on character behavior often produces trees that are consistent with molecular trees. Case studies are presented that illustrate how congruence between molecules and unconstrained morphological data may provide insight into issues of polarity, transformation order, homology, and homoplasy.

  3. Examining recent expert elicitation, judgment guidelines: Value assumptions and the prospects for rationality

    Energy Technology Data Exchange (ETDEWEB)

    Fleming, P.A. [Creighton Univ., Omaha, NE (United States). Dept. of Philosophy]

    1999-12-01

    Any examination of the role of values in decisions on risk must take into consideration the increasing reliance on the expert judgment method. Today, reliance on expert judgment is conspicuously present in the documents and work associated with characterization of Yucca Mountain as a host site for the United States' first high-level nuclear waste repository. The NRC encourages the use of probabilistic risk assessment's state-of-the-art technology as a complement to deterministic approaches to nuclear regulatory activities, and it considers expert judgment one of those technologies. At the last International Conference on High-Level Nuclear Waste Development, several presentations reported on the use of expert elicitation sessions held during 1997 at Yucca Mountain. Over a decade ago, few guidelines existed for Department of Energy work in expert judgment. In an analysis of these guidelines, I described the author-advocate's view of the role of values in this method of risk assessment. I suggested that the guidelines assume naive positivism. I noted that the creators of these guidelines also tend toward scientific realism in their apologetic tone that expert judgment falls short of representing the way nature is. I also pointed to a tendency toward what I call a heightened or super-realism: normal science represents the way the world is, and for expert judgment this is only likely so; the expert judgment method, however, is capable of truly capturing expertise in a representative sense. The purpose of this paper is to examine new guidelines from the Department of Energy and the Nuclear Regulatory Commission, with a view to eliciting the epistemological assumptions about the role of values and the status of objectivity claimed for this method. Do these new guidelines also adopt naive positivism? Does the inability to encounter raw, pure, value-neutral expert judgment reveal itself in these guidelines? Or do these guidelines adopt the belief that values are not

  4. Questionable assumptions hampered interpretation of a network meta-analysis of primary care depression treatments.

    Science.gov (United States)

    Linde, Klaus; Rücker, Gerta; Schneider, Antonius; Kriston, Levente

    2016-03-01

    We aimed to evaluate the underlying assumptions of a network meta-analysis investigating which depression treatment works best in primary care and to highlight challenges and pitfalls of interpretation under consideration of these assumptions. We reviewed 100 randomized trials investigating pharmacologic and psychological treatments for primary care patients with depression. Network meta-analysis was carried out within a frequentist framework using response to treatment as outcome measure. Transitivity was assessed by epidemiologic judgment based on theoretical and empirical investigation of the distribution of trial characteristics across comparisons. Homogeneity and consistency were investigated by decomposing the Q statistic. There were important clinical and statistically significant differences between "pure" drug trials comparing pharmacologic substances with each other or placebo (63 trials) and trials including a psychological treatment arm (37 trials). Overall network meta-analysis produced results well comparable with separate meta-analyses of drug trials and psychological trials. Although the homogeneity and consistency assumptions were mostly met, we considered the transitivity assumption unjustifiable. An exchange of experience between reviewers and, if possible, some guidance on how reviewers addressing important clinical questions can proceed in situations where important assumptions for valid network meta-analysis are not met would be desirable. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Parameter Estimation in Groundwater: Classical, Bayesian, and Deterministic Assumptions and Their Impact on Management Policies

    Science.gov (United States)

    Loaiciga, Hugo A.; Mariño, Miguel A.

    1987-06-01

    This work deals with a theoretical analysis of parameter uncertainty in groundwater management models. The importance of adopting classical, Bayesian, or deterministic distribution assumptions on parameters is examined from a mathematical standpoint. In the classical case, the parameters (e.g., hydraulic conductivities or storativities) are assumed fixed (i.e., nonrandom) but unknown. The Bayesian assumption considers the parameters as random entities with some probability distribution. The deterministic case, also called certainty equivalence, assumes that the parameters are fixed and known. Previous work on the inverse problem has emphasized the numerical solution for parameter estimates with the subsequent aim to use them in the simulation of field variables. In this paper, the role of parameter uncertainty (measured by their statistical variability) in groundwater management decisions is investigated. It is shown that the classical, Bayesian, and deterministic assumptions lead to analytically different management solutions. Numerically, the difference between such solutions depends upon the covariance of the parameter estimates. The theoretical analyses of this work show the importance of specifying the proper distributional assumption on groundwater parameters, as well as the need for using efficient and statistically consistent methods to solve the inverse problem. The distributional assumptions on groundwater parameters and the covariance of their sample estimators are shown to be the dominant parameter uncertainty factors affecting groundwater management solutions. An example illustrates the conceptual findings of this work.
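One way to see why the deterministic (certainty-equivalence) assumption can differ analytically from the Bayesian one is Jensen's inequality: for a nonlinear cost of a random parameter, averaging the cost over the parameter distribution is not the same as plugging in the mean parameter. The sketch below illustrates this with a hypothetical pumping cost f(K) = 1/K of a uniformly distributed hydraulic conductivity K; the function name, cost form, and distribution are illustrative assumptions of ours, not taken from the paper:

```python
import random
import statistics

def expected_cost_gap(n=200_000, seed=2):
    """Contrast the Bayesian expected cost E[f(K)] with the deterministic
    (certainty-equivalence) cost f(E[K]) for f(K) = 1/K, a convex cost
    in a random hydraulic conductivity K ~ Uniform(0.5, 1.5)."""
    random.seed(seed)
    ks = [random.uniform(0.5, 1.5) for _ in range(n)]   # parameter samples
    bayes_cost = statistics.fmean([1.0 / k for k in ks])  # averages over uncertainty
    determ_cost = 1.0 / statistics.fmean(ks)              # plugs in the mean K
    return bayes_cost, determ_cost
```

Because 1/K is convex, the deterministic assumption understates the expected cost (here f(E[K]) = 1.0 versus E[f(K)] = ln 3 ≈ 1.10), which is the kind of analytical divergence between distributional assumptions that the abstract describes.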

  6. Using community-based participatory research to prevent HIV disparities: assumptions and opportunities identified by the Latino partnership.

    Science.gov (United States)

    Rhodes, Scott D; Duck, Stacy; Alonzo, Jorge; Ulloa, Jason Daniel-; Aronson, Robert E

    2013-06-01

    HIV disproportionately affects vulnerable populations in the United States, including recently arrived immigrant Latinos. However, the current arsenal of effective approaches to increase adherence to risk-reduction strategies and treatment within Latino populations remains insufficient. Our community-based participatory research (CBPR) partnership blends multiple perspectives of community members, organizational representatives, local business leaders, and academic researchers to explore and intervene on HIV risk within Latino populations. We used CBPR to develop, implement, and evaluate 2 interventions that were found to be efficacious. We identified 7 assumptions of CBPR as an approach to research, including more authentic study designs, stronger measurement, and improved quality of knowledge gained; increased community capacity to tackle other health disparities; the need to focus on community priorities; increased participation and retention rates; more successful interventions; reduced generalizability; and increased sustainability. Despite the advancement of CBPR as an approach to research, key assumptions remain. Further research is needed to compare CBPR with other more-traditional approaches to research. Such research would move us from assuming the value of CBPR to identifying its actual value in health disparity reduction. After all, communities carrying a disproportionate burden of HIV, including immigrant Latino communities, deserve the best science possible.

  7. COMPETITION VERSUS COLLUSION: THE PARALLEL BEHAVIOUR IN THE ABSENCE OF THE SYMMETRY ASSUMPTION

    Directory of Open Access Journals (Sweden)

    Romano Oana Maria

    2012-07-01

    Full Text Available Cartel detection is usually viewed as a key task of competition authorities. A special case of cartel is parallel behaviour in selling prices. This type of behaviour is difficult to assess, and its analysis does not always yield conclusive results. For evaluating such behaviour, the available data are compared with theoretical values obtained using a competitive or a collusive model. When different competitive or collusive models are considered, for simplicity of calculation economists use the assumption of symmetry of costs and quantities produced/sold. This assumption has the disadvantage that the theoretical values obtained may deviate significantly from the actual values on the market, which can sometimes lead to ambiguous results. The present paper analyses the parallel behaviour of economic agents in the absence of the symmetry assumption and studies the identification of the model under these conditions.

  8. Culturally responsive suicide prevention in indigenous communities: unexamined assumptions and new possibilities.

    Science.gov (United States)

    Wexler, Lisa M; Gone, Joseph P

    2012-05-01

    Indigenous communities have significantly higher rates of suicide than non-Native communities in North America. Prevention and intervention efforts have failed to redress this disparity. One explanation is that these efforts are culturally incongruent for Native communities. Four prevalent assumptions that underpin professional suicide prevention may conflict with local indigenous understandings about suicide. Our experiences in indigenous communities led us to question assumptions that are routinely endorsed and promoted in suicide prevention programs and interventions. By raising questions about the universal relevance of these assumptions, we hope to stimulate exchange and inquiry into the character of this devastating public health challenge and to aid the development of culturally appropriate interventions in cross-cultural contexts.

  9. Commentary: Considering Assumptions in Associations Between Music Preferences and Empathy-Related Responding

    Directory of Open Access Journals (Sweden)

    Susan A O'Neill

    2015-09-01

    Full Text Available This commentary considers some of the assumptions underpinning the study by Clark and Giacomantonio (2015). Their exploratory study examined relationships between young people's music preferences and their cognitive and affective empathy-related responses. First, the prescriptive assumption that music preferences can be measured by how often an individual listens to a particular music genre is considered within axiology or value theory as a multidimensional construct (general, specific, and functional values). This is followed by a consideration of the causal assumption that increasing young people's empathy through exposure to prosocial song lyrics will increase their prosocial behavior. It is suggested that the predictive power of musical preferences on empathy-related responding might benefit from a consideration of the larger pattern of psychological and subjective wellbeing within the context of developmental regulation across ontogeny that involves mutually influential individual-context relations.

  10. Testing implicit assumptions and explicit recommendations: the effects of probability information on risk perception.

    Science.gov (United States)

    Mevissen, Fraukje E F; Meertens, Ree M; Ruiter, Robert A C; Schaalma, Herman P

    2010-09-01

    When people underestimate a risk, probability information is often communicated on the implicit assumption that these objective facts will raise people's risk estimates. The scientific literature has also suggested that stressing the cumulative aspects of a risk might lead to higher susceptibility perceptions than emphasizing only the single-incident probability. Empirical evidence supporting the effectiveness of these strategies, however, is lacking. In two studies, we examined whether cumulative and single-incident probability information on sexually transmitted infections leads to higher perceived susceptibility to Chlamydia and HIV. Contrary to assumptions and recommendations, results showed that both types of probability information may leave people feeling less susceptible to Chlamydia and with less intention to reduce the risk. For HIV, no effects were found. These results contradict implicit assumptions and explicit recommendations concerning the effects of probability information on risk perceptions.
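The cumulative framing contrasted with the single-incident probability above rests on the standard independence formula: the chance of at least one infection over n exposures is 1 − (1 − p)^n. A minimal sketch (function name ours, for illustration):

```python
def cumulative_risk(p_single, n_exposures):
    """Probability of at least one infection over n independent exposures,
    each with per-exposure transmission probability p_single."""
    return 1.0 - (1.0 - p_single) ** n_exposures
```

For a per-exposure probability of 0.01, 100 exposures yield a cumulative risk of about 0.63, far above the single-incident 0.01, which is the kind of contrast between cumulative and single-incident information the studies examined.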

  11. Potential sensitivity of bias analysis results to incorrect assumptions of nondifferential or differential binary exposures misclassification

    Science.gov (United States)

    Johnson, Candice Y.; Flanders, W. Dana; Strickland, Matthew J.; Honein, Margaret A.; Howards, Penelope P.

    2015-01-01

    Background: Results of bias analyses for exposure misclassification depend on assumptions made during analysis. We describe how adjustment for misclassification is affected by incorrect assumptions about whether sensitivity and specificity are the same (nondifferential) or different (differential) for cases and non-cases. Methods: We adjusted for exposure misclassification using probabilistic bias analysis, under correct and incorrect assumptions about whether exposure misclassification was differential or not. First, we used simulated datasets in which nondifferential and differential misclassification were introduced. Then, we used data on obesity and diabetes from the National Health and Nutrition Examination Survey (NHANES), in which both self-reported (misclassified) and measured (true) obesity were available, using literature estimates of sensitivity and specificity to adjust for bias. The ratio of odds ratios (ROR; observed odds ratio divided by true odds ratio) was used to quantify the magnitude of bias, with ROR=1 signifying no bias. Results: In the simulated datasets, under incorrect assumptions (e.g., assuming nondifferential misclassification when it was truly differential), results were biased, with RORs ranging from 0.18 to 2.46. In NHANES, results adjusted based on incorrect assumptions also produced biased results, with RORs ranging from 1.26 to 1.55; results were more biased when making these adjustments than when using the misclassified exposure values (ROR=0.91). Conclusions: Making an incorrect assumption about nondifferential or differential exposure misclassification in bias analyses can lead to more biased results than if no adjustment is performed. In our analyses, incorporating uncertainty using probabilistic bias analysis was not sufficient to overcome this problem. PMID:25120106
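The kind of adjustment described above can be sketched with the standard back-calculation formulas, in which observed exposure counts are corrected using assumed sensitivity and specificity of exposure classification; letting case and control values differ encodes the differential assumption, while forcing them equal encodes the nondifferential one. Function names are ours, and this deterministic sketch omits the distribution-sampling step of full probabilistic bias analysis:

```python
def adjust_counts(exposed, unexposed, se, sp):
    """Back-calculate true exposed/unexposed counts from observed counts,
    given sensitivity (se) and specificity (sp) of exposure classification."""
    total = exposed + unexposed
    true_exposed = (exposed - total * (1.0 - sp)) / (se + sp - 1.0)
    return true_exposed, total - true_exposed

def adjusted_odds_ratio(case_exp, case_unexp, ctrl_exp, ctrl_unexp,
                        se_case, sp_case, se_ctrl, sp_ctrl):
    """Misclassification-adjusted odds ratio from a 2x2 table; passing
    equal se/sp for cases and controls asserts nondifferential error."""
    a, b = adjust_counts(case_exp, case_unexp, se_case, sp_case)
    c, d = adjust_counts(ctrl_exp, ctrl_unexp, se_ctrl, sp_ctrl)
    return (a * d) / (b * c)
```

If the se/sp values passed in do not match how the data were actually misclassified, the adjusted odds ratio is pulled away from the truth, which is the failure mode the abstract quantifies with the ROR.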

  12. I Assumed You Knew: Teaching Assumptions as Co-Equal to Observations in Scientific Work

    Science.gov (United States)

    Horodyskyj, L.; Mead, C.; Anbar, A. D.

    2016-12-01

Introductory science curricula typically begin with a lesson on the "nature of science". Usually this lesson is short, built on the assumption that students have picked up this information elsewhere and only a short review is necessary. However, when asked about the nature of science in our classes, student definitions were often confused, contradictory, or incomplete. A cursory review of a number of textbooks shows that their definitions of the nature of science are similarly inconsistent and excessively wordy. With such confusion from both the student and teacher perspective, it is no surprise that students walk away with significant misconceptions about the scientific endeavor, which they carry with them into public life. These misconceptions subsequently result in poor public policy and personal decisions on issues with scientific underpinnings. We will present a new way of teaching the nature of science at the introductory level that better represents what we actually do as scientists. Nature of science lessons often emphasize the importance of observations in scientific work. However, they rarely mention and often hide the importance of assumptions in interpreting those observations. Assumptions are co-equal to observations in building models, which are observation-assumption networks that can be used to make predictions about future observations. The confidence we place in these models depends on whether they are assumption-dominated (hypothesis) or observation-dominated (theory). By presenting and teaching science in this manner, we feel that students will better comprehend the scientific endeavor, since making observations and assumptions and building mental models is a natural human behavior. We will present a model for a science lab activity that can be taught using this approach.


  13. Sensitivity of Rooftop PV Projections in the SunShot Vision Study to Market Assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Drury, E.; Denholm, P.; Margolis, R.

    2013-01-01

    The SunShot Vision Study explored the potential growth of solar markets if solar prices decreased by about 75% from 2010 to 2020. The SolarDS model was used to simulate rooftop PV demand for this study, based on several PV market assumptions--future electricity rates, customer access to financing, and others--in addition to the SunShot PV price projections. This paper finds that modeled PV demand is highly sensitive to several non-price market assumptions, particularly PV financing parameters.

  14. A statistical test of the stability assumption inherent in empirical estimates of economic depreciation.

    Science.gov (United States)

    Shriver, K A

    1986-01-01

Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that economic depreciation rates of decline may be reasonably stable over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.
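The stability check at issue can be illustrated with a log-linear depreciation regression on synthetic prices. The 8% rate, noise level, and age-based sample split below are all hypothetical (the study itself examines stability across time periods):

```python
import numpy as np

def decay_rate(age, price):
    """OLS slope of ln(price) on age: the implied geometric depreciation rate."""
    A = np.column_stack([np.ones_like(age), age])
    beta, *_ = np.linalg.lstsq(A, np.log(price), rcond=None)
    return -beta[1]

rng = np.random.default_rng(6)
age = rng.uniform(0, 20, 400)
# Hypothetical used-equipment prices with a constant 8%/yr rate of decline:
price = 100 * np.exp(-0.08 * age) * np.exp(rng.normal(0, 0.05, 400))

# Stability check: split the sample and compare the estimated rates.
early, late = age < 10, age >= 10
r1 = decay_rate(age[early], price[early])
r2 = decay_rate(age[late], price[late])
print(round(r1, 2), round(r2, 2))
```

If the constant-rate assumption holds, the two subsample estimates agree up to sampling noise; a large gap would argue against using a single rate in further analyses.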

  15. Boltzmann's H-theorem and the assumption of molecular chaos

    Energy Technology Data Exchange (ETDEWEB)

    Boozer, A D, E-mail: boozer@unm.edu [Department of Physics, University of New Mexico, Albuquerque, NM 87131 (United States)

    2011-09-15

    We describe a simple dynamical model of a one-dimensional ideal gas and use computer simulations of the model to illustrate two fundamental results of kinetic theory: the Boltzmann transport equation and the Boltzmann H-theorem. Although the model is time-reversal invariant, both results predict that the behaviour of the gas is time-asymmetric. We show that the assumption of molecular chaos is a necessary condition, but not a sufficient condition, for such time-asymmetric results to correctly describe the model, and we use computer simulations to investigate the conditions under which the assumption of molecular chaos holds.
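Boozer's point can be illustrated numerically: at fixed energy, the Maxwellian velocity distribution minimizes Boltzmann's H. The histogram estimator below is a rough sketch on sampled velocities, not the paper's dynamical model:

```python
import numpy as np

def H_functional(velocities, bins=50):
    """Estimate Boltzmann's H = integral of f ln f dv from velocity samples."""
    f, edges = np.histogram(velocities, bins=bins, density=True)
    dv = np.diff(edges)
    mask = f > 0
    return float(np.sum(f[mask] * np.log(f[mask]) * dv[mask]))

rng = np.random.default_rng(0)
# Two velocity ensembles with equal variance (equal kinetic energy):
uniform = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), 100_000)  # var = 1
maxwell = rng.normal(0.0, 1.0, 100_000)                      # var = 1

# The equilibrium (Maxwellian) distribution minimizes H at fixed energy,
# which is why the H-theorem drives H downward toward it.
print(H_functional(maxwell) < H_functional(uniform))   # → True
```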

  16. Tests of the frozen-flux and tangentially geostrophic assumptions using magnetic satellite data

    DEFF Research Database (Denmark)

    Chulliat, A.; Olsen, Nils; Sabaka, T.

In 1984, Jean-Louis Le Mouël published a paper suggesting that the flow at the top of the Earth’s core is tangentially geostrophic, i.e., the Lorentz force is much smaller than the Coriolis force in this particular region of the core. This new assumption was subsequently used to discriminate among… understood. Using these constraints, we test the combined frozen-flux and tangentially geostrophic assumptions against recent, high-precision magnetic data provided by the Ørsted and CHAMP satellites. The methodology involves building constrained field models using least-squares methods. Two types of models…

  17. Right to Development and Right to the City : A Proposal of Human Rights Categories Universal as assumptions Citizenship

    Directory of Open Access Journals (Sweden)

    Alessandra Danielle Carneiro dos Santos Hilário

    2016-05-01

Full Text Available This article discusses the Right to the City, in a broad conceptual dimension, and its dialectical relationship with the Universal Declaration of Human Rights of 1948 and its categories of universalism and cultural relativism. The Right to the City (RtC) is capitulated as one of the categories of the Human Right to Development, descending from the Universal Declaration of Human Rights. Linked to this assumption, the discussion of universalist and cultural relativist theories brings to the fore important questions and considerations about the condition of the RtC, since in its current design, trampled by the legacy of neoliberalism, this right has demonstrated the need for authoritative action by the State, given its nature as a fundamental human right of the third dimension. Through the RtC, economic, social and cultural rights are asserted, requiring positive action by the state to guarantee compliance with this human right. Along these lines, discussions about the concepts of law, morality and liberalism, about the effectiveness and universality of human rights, and about theories of cultural relativism are relevant in dialectic with the RtC and its complexity. It starts from the assumption that the Universal Declaration of Human Rights and other declarations descending from it claim universality (despite criticism); in this field, a closer examination of the concept, provision, guarantee and effectiveness of fundamental human rights is imperative, which may lead to a mixed application of universalist and relativist theories when analyzed from the perspective of these institutes. The Human Right to Development (RtD) presupposes notions of environmental sustainability and economic democracy, with qualified participation of social subjects (broad citizenship), seen in a continuous and articulated perspective as guiding the development process.

  18. Mutual assumptions and facts about nondisclosure among clinical supervisors and students in group supervision

    DEFF Research Database (Denmark)

    Nielsen, Geir Høstmark; Skjerve, Jan; Jacobsen, Claus Haugaard

    2009-01-01

In the two preceding papers of this issue of Nordic Psychology the authors report findings from a study of nondisclosure among student therapists and clinical supervisors. The findings were reported separately for each group. In this article, the two sets of findings are held together and compared, so as to draw a picture of mutual assumptions and facts about nondisclosure among students and supervisors.

  19. Credit Transfer amongst Students in Contrasting Disciplines: Examining Assumptions about Wastage, Mobility and Lifelong Learning

    Science.gov (United States)

    Di Paolo, Terry; Pegg, Ann

    2013-01-01

    While arrangements for credit transfer exist across the UK higher education sector, little is known about credit-transfer students or why they re-engage with study. Policy makers have cited credit transfer as a mechanism for reducing wastage and drop-out, but this paper challenges this assumption and instead examines how credit transfer serves…

  20. What's Love Got to Do with It? Rethinking Common Sense Assumptions

    Science.gov (United States)

    Trachman, Matthew; Bluestone, Cheryl

    2005-01-01

    One of the most basic tasks in introductory social science classes is to get students to reexamine their common sense assumptions concerning human behavior. This article introduces a shared assignment developed for a learning community that paired an introductory sociology and psychology class. The assignment challenges students to rethink the…

  1. The Impact of Feedback Frequency on Learning and Task Performance: Challenging the "More Is Better" Assumption

    Science.gov (United States)

    Lam, Chak Fu; DeRue, D. Scott; Karam, Elizabeth P.; Hollenbeck, John R.

    2011-01-01

    Previous research on feedback frequency suggests that more frequent feedback improves learning and task performance (Salmoni, Schmidt, & Walter, 1984). Drawing from resource allocation theory (Kanfer & Ackerman, 1989), we challenge the "more is better" assumption and propose that frequent feedback can overwhelm an individual's cognitive resource…

  2. The Robustness of Confidence Intervals for Coefficient Alpha under Violation of the Assumption of Essential Parallelism.

    Science.gov (United States)

    Barchard, Kimberly A.; Hakstian, A. Ralph

    1997-01-01

    Two studies, both using Type 12 sampling, are presented in which the effects of violating the assumption of essential parallelism in setting confidence intervals are studied. Results indicate that as long as data manifest properties of essential parallelism, the two methods studied maintain precise Type I error control. (SLD)

  3. Psychometric Properties and Underlying Assumptions of Four Objective Measures of Fear of Success.

    Science.gov (United States)

    Paludi, Michele A.

    1984-01-01

    Reviews underlying reliability, theoretical assumptions, and validity of several "fear of success" tests. Also examines the validity of the fear of success construct itself. Concludes that continued use of the label "fear of success" only reinforces belief in a scientifically unfounded concept of intrapsychic difference between…

  4. Not Quite Normal: Consequences of Violating the Assumption of Normality in Regression Mixture Models

    Science.gov (United States)

    Van Horn, M. Lee; Smith, Jessalyn; Fagan, Abigail A.; Jaki, Thomas; Feaster, Daniel J.; Masyn, Katherine; Hawkins, J. David; Howe, George

    2012-01-01

    Regression mixture models, which have only recently begun to be used in applied research, are a new approach for finding differential effects. This approach comes at the cost of the assumption that error terms are normally distributed within classes. This study uses Monte Carlo simulations to explore the effects of relatively minor violations of…

  5. Post Stereotypes: Deconstructing Racial Assumptions and Biases through Visual Culture and Confrontational Pedagogy

    Science.gov (United States)

    Jung, Yuha

    2015-01-01

    The Post Stereotypes project embodies confrontational pedagogy and involves postcard artmaking designed to both solicit expression of and deconstruct students' racial, ethnic, and cultural stereotypes and assumptions. As part of the Cultural Diversity in American Art course, students created postcard art that visually represented their personal…

  6. Teachers' Assumptions and Beliefs about the Delivery of Services to Exceptional Children.

    Science.gov (United States)

    Wilson, Anne Jordan; Silverman, Harry

    1991-01-01

    Interviews were conducted with 93 principals, classroom teachers, resource teachers, and special education teachers to determine their assumptions and beliefs about "restorative" and "preventive" delivery systems for exceptional children. Preventive belief systems were held more frequently by principals, followed by resource teachers and special…

  7. Measuring oblique incidence sound absorption using a local plane wave assumption

    NARCIS (Netherlands)

    Kuipers, E.R.; Wijnant, Ysbrand H.; de Boer, Andries

    2014-01-01

In this paper a method for the measurement of the oblique incidence sound absorption coefficient is presented. It is based on a local field assumption, in which the acoustic field is locally approximated by one incident and one specularly reflected plane wave. The amplitudes of these waves can be

  8. 7 CFR 765.402 - Transfer of security and loan assumption on same rates and terms.

    Science.gov (United States)

    2010-01-01

An eligible applicant may assume an FLP loan on the same rates and terms as the original… (7 CFR 765.402, Agriculture Regulations of the Department of Agriculture, revised as of 2010-01-01.)

  9. Sensitivity Analysis and Bounding of Causal Effects with Alternative Identifying Assumptions

    Science.gov (United States)

    Jo, Booil; Vinokur, Amiram D.

    2011-01-01

    When identification of causal effects relies on untestable assumptions regarding nonidentified parameters, sensitivity of causal effect estimates is often questioned. For proper interpretation of causal effect estimates in this situation, deriving bounds on causal parameters or exploring the sensitivity of estimates to scientifically plausible…

  10. Common-Sense Chemistry: The Use of Assumptions and Heuristics in Problem Solving

    Science.gov (United States)

    Maeyer, Jenine Rachel

    2013-01-01

    Students experience difficulty learning and understanding chemistry at higher levels, often because of cognitive biases stemming from common sense reasoning constraints. These constraints can be divided into two categories: assumptions (beliefs held about the world around us) and heuristics (the reasoning strategies or rules used to build…

  11. What Mathematics Education Can Learn from Art: The Assumptions, Values, and Vision of Mathematics Education

    Science.gov (United States)

    Dietiker, Leslie

    2015-01-01

    Elliot Eisner proposed that educational challenges can be met by applying an artful lens. This article draws from Eisner's proposal to consider the assumptions, values, and vision of mathematics education by theorizing mathematics curriculum as an art form. By conceptualizing mathematics curriculum (both in written and enacted forms) as stories…

  12. 77 FR 74353 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2012-12-14

    ... Allocation of Assets in Single- Employer Plans for the first quarter of 2013. DATES: Effective January 1... of the Employee Retirement Income Security Act of 1974. The interest assumptions in the regulation... year under its regulation on Allocation of Assets in Single-Employer Plans (29 CFR part 4044) in a...

  13. 76 FR 70639 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Paying Benefits

    Science.gov (United States)

    2011-11-15

    ... single-employer plans covered by title IV of the Employee Retirement Income Security Act of 1974. The... PBGC's regulation on Allocation of Assets in Single-Employer Plans (29 CFR Part 4044) prescribes... allocation of assets under ERISA section 4044. Those assumptions are updated quarterly. The December 2011...

  14. Impact of one-layer assumption on diffuse reflectance spectroscopy of skin.

    Science.gov (United States)

    Hennessy, Ricky; Markey, Mia K; Tunnell, James W

    2015-02-01

Diffuse reflectance spectroscopy (DRS) can be used to noninvasively measure skin properties. To extract skin properties from DRS spectra, a model relating the reflectance to the tissue properties is needed. Most models are based on the assumption that skin is homogeneous. In reality, skin is composed of multiple layers, and the homogeneity assumption can lead to errors. In this study, we analyze the errors caused by the homogeneity assumption. This is accomplished by creating realistic skin spectra using a computational model, then extracting properties from those spectra using a one-layer model. The extracted parameters are then compared to the parameters used to create the modeled spectra. We used a wavelength range of 400 to 750 nm and a source-detector separation of 250 μm. Our results show that use of a one-layer skin model causes underestimation of hemoglobin concentration [Hb] and melanin concentration [mel]. Additionally, the magnitude of the error is dependent on epidermal thickness. The one-layer assumption also causes [Hb] and [mel] to be correlated. Oxygen saturation is overestimated when it is below 50% and underestimated when it is above 50%. We also found that the vessel radius factor used to account for pigment packaging is correlated with epidermal thickness. © The Authors.

  15. Vocational Didactics: Core Assumptions and Approaches from Denmark, Germany, Norway, Spain and Sweden

    Science.gov (United States)

    Gessler, Michael; Moreno Herrera, Lázaro

    2015-01-01

    The design of vocational didactics has to meet special requirements. Six core assumptions are identified: outcome orientation, cultural-historical embedding, horizontal structure, vertical structure, temporal structure, and the changing nature of work. Different approaches and discussions from school-based systems (Spain and Sweden) and dual…

  16. Investigating assumptions of crown archetypes for modelling LiDAR returns

    NARCIS (Netherlands)

    Calders, K.; Lewis, P.; Disney, M.; Verbesselt, J.; Herold, M.

    2013-01-01

    LiDAR has the potential to derive canopy structural information such as tree height and leaf area index (LAI), via models of the LiDAR signal. Such models often make assumptions regarding crown shape to simplify parameter retrieval and crown archetypes are typically assumed to contain a turbid

  17. Assessing assumptions of multivariate linear regression framework implemented for directionality analysis of fMRI.

    Science.gov (United States)

    Dang, Shilpa; Chaudhury, Santanu; Lall, Brejesh; Roy, Prasun Kumar

    2015-08-01

Directionality analysis of time-series recorded from task-activated regions-of-interest (ROIs) during functional Magnetic Resonance Imaging (fMRI) has helped in gaining insights into complex human behavior and human brain functioning. The most widely used standard method for evaluating directionality, Granger causality, employs linear regression modeling of temporal processes. Such a parameter-driven approach rests on various underlying assumptions about the data, and shortcomings arise when misleading conclusions are drawn from data in which those assumptions are violated. In this study, we assess the assumptions of the Multivariate Autoregressive (MAR) framework employed for evaluating directionality among fMRI time-series recorded during a Sensory-Motor (SM) task. The fMRI time-series here is an averaged time-series from a user-defined ROI of multiple voxels. The aim is to establish a step-by-step procedure using statistical methods in conjunction with graphical methods to assess the validity of MAR models, specifically in the context of directionality analysis of fMRI data, which to the best of our knowledge has not been done previously. In our case of the SM task (block design paradigm), the assumptions are violated, indicating the inadequacy of MAR models for finding directional interactions among different task-activated regions of the brain.
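A minimal version of such a diagnostic, on synthetic rather than fMRI data, fits an MAR(1) model by least squares and checks one assumption (serially uncorrelated residuals) while recovering the lagged influence of one "ROI" on another:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500
# Hypothetical two-ROI time-series: ROI 2 is driven by lagged ROI 1.
y1 = rng.normal(size=T)
y2 = np.zeros(T)
for t in range(1, T):
    y2[t] = 0.6 * y1[t - 1] + 0.2 * y2[t - 1] + 0.3 * rng.normal()

Y = np.column_stack([y1, y2])
X, Ynext = Y[:-1], Y[1:]                    # MAR(1): Y_t = Y_{t-1} A + E
A, *_ = np.linalg.lstsq(X, Ynext, rcond=None)
resid = Ynext - X @ A

def lag1_autocorr(e):
    """Lag-1 autocorrelation of residuals: a basic whiteness check."""
    e = e - e.mean()
    return float(e[:-1] @ e[1:] / (e @ e))

# A[0, 1] recovers the 0.6 lagged influence of ROI 1 on ROI 2; residual
# autocorrelations near zero support the MAR(1) adequacy assumption.
print(round(A[0, 1], 2), [round(lag1_autocorr(resid[:, i]), 2) for i in (0, 1)])
```

On real fMRI data the same residuals would also be checked graphically (e.g., against normality), which is the step-by-step procedure the paper sets out.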

  18. 77 FR 324 - Proposed Information Collection (Application for Assumption Approval and/or Release From Personal...

    Science.gov (United States)

    2012-01-04

    ... AFFAIRS Proposed Information Collection (Application for Assumption Approval and/or Release From Personal... to approve a claimant's request to be released from personal liability on a Government home loan... collection of information; (3) ways to enhance the quality, utility, and clarity of the information to be...

  19. Observations of the Dynamic Connectivity of the Non-Wetting Phase During Steady State Flow at the Pore Scale Using 3D X-ray Microtomography

    Science.gov (United States)

    Reynolds, C. A.; Menke, H. P.; Blunt, M. J.; Krevor, S. C.

    2015-12-01

We observe a new type of non-wetting phase flow using time-resolved pore scale imaging. The traditional conceptual model of drainage involves a non-wetting phase invading a porous medium saturated with a wetting phase either as a fixed, connected flow path through the centres of pores or as discrete ganglia which move individually through the pore space, depending on the capillary number. We observe a new type of flow behaviour at low capillary number in which the flow of the non-wetting phase occurs through networks of persistent ganglia that occupy the large pores but continuously rearrange their connectivity (Figure 1). Disconnections and reconnections occur randomly to provide short-lived pseudo-steady state flow paths between pores. This process is distinctly different from the notion of flowing ganglia which coalesce and break up. The size distribution of ganglia is dependent on capillary number. Experiments were performed by co-injecting N2 and 25 wt% KI brine into a Bentheimer sandstone core (4 mm diameter, 35 mm length) at 50°C and 10 MPa. Drainage was performed at three flow rates (0.04, 0.3 and 1 ml/min) at a constant fractional flow of 0.5 and the variation in ganglia populations and connectivity observed. We obtained images of the pore space during steady state flow with a time resolution of 43 s over 1-2 hours. Experiments were performed at the Diamond Light Source synchrotron. Figure 1. The position of N2 in the pore space during steady state flow is summed over 40 time steps. White indicates that N2 occupies the space over >38 time steps and red <5 time steps.

  20. A Discourse on Putnam's Analogical Hypothesis of Mental State and ...

    African Journals Online (AJOL)

…the mind-body problem. Computational functionalism attempted to reduce explanations of mental state to machine state explanation. This reductionism, on the initial assumption that mental state is a functional state of the whole organism, is used to ...

  1. Common-sense chemistry: The use of assumptions and heuristics in problem solving

    Science.gov (United States)

    Maeyer, Jenine Rachel

    Students experience difficulty learning and understanding chemistry at higher levels, often because of cognitive biases stemming from common sense reasoning constraints. These constraints can be divided into two categories: assumptions (beliefs held about the world around us) and heuristics (the reasoning strategies or rules used to build predictions and make decisions). A better understanding and characterization of these constraints are of central importance in the development of curriculum and teaching strategies that better support student learning in science. It was the overall goal of this thesis to investigate student reasoning in chemistry, specifically to better understand and characterize the assumptions and heuristics used by undergraduate chemistry students. To achieve this, two mixed-methods studies were conducted, each with quantitative data collected using a questionnaire and qualitative data gathered through semi-structured interviews. The first project investigated the reasoning heuristics used when ranking chemical substances based on the relative value of a physical or chemical property, while the second study characterized the assumptions and heuristics used when making predictions about the relative likelihood of different types of chemical processes. Our results revealed that heuristics for cue selection and decision-making played a significant role in the construction of answers during the interviews. Many study participants relied frequently on one or more of the following heuristics to make their decisions: recognition, representativeness, one-reason decision-making, and arbitrary trend. These heuristics allowed students to generate answers in the absence of requisite knowledge, but often led students astray. When characterizing assumptions, our results indicate that students relied on intuitive, spurious, and valid assumptions about the nature of chemical substances and processes in building their responses. In particular, many

  2. Analysis of Modeling Assumptions used in Production Cost Models for Renewable Integration Studies

    Energy Technology Data Exchange (ETDEWEB)

    Stoll, Brady [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Townsend, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bloom, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-01-01

Renewable energy integration studies have been published for many different regions, exploring how higher penetration of renewable energy will impact the electric grid. These studies each make assumptions about the systems they are analyzing; however, the effect of many of these assumptions has not yet been examined and published. In this paper we analyze the impact of modeling assumptions in renewable integration studies, including the optimization method used (linear or mixed-integer programming) and the temporal resolution of the dispatch stage (hourly or sub-hourly). We analyze each of these assumptions on a large and a small system and determine the impact of each assumption on key metrics including the total production cost, curtailment of renewables, CO2 emissions, and generator starts and ramps. Additionally, we identified the impact on these metrics of including a four-hour-ahead commitment step before the dispatch step, and of retiring generators to reduce the degree to which the system is overbuilt. We find that the largest effect of these assumptions is at the unit level on starts and ramps, particularly for the temporal resolution, and saw a smaller impact at the aggregate level on system costs and emissions. For each fossil fuel generator type we measured the average capacity started, average run-time per start, and average number of ramps. Linear programming results saw up to a 20% difference in number of starts and average run time of traditional generators, and up to a 4% difference in the number of ramps, when compared to mixed-integer programming. Utilizing hourly dispatch instead of sub-hourly dispatch saw no difference in coal or gas CC units for either start metric, while gas CT units had a 5% increase in the number of starts and 2% increase in the average on-time per start. The number of ramps decreased up to 44%. The smallest effect seen was on the CO2 emissions and total production cost, with a 0.8% and 0
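The LP-versus-MIP commitment assumption can be seen in a deliberately tiny example (two hypothetical generators, one load period; not the production cost model used in the study). Relaxing the binary on/off decision lets the model evade a minimum-generation constraint and understate cost:

```python
# Toy unit commitment: demand must be met exactly in a single period.
DEMAND = 30.0
A = dict(cap=100.0, pmin=50.0, cost=10.0)   # cheap, but 50 MW minimum when on
B = dict(cap=100.0, pmin=0.0, cost=50.0)    # expensive, fully flexible

def dispatch_cost(u_a):
    """Least-cost dispatch for a given commitment state u_a of unit A."""
    lo, hi = u_a * A["pmin"], u_a * A["cap"]
    p_a = max(min(hi, DEMAND), lo)          # prefer the cheap unit
    p_b = DEMAND - p_a
    if p_b < 0 or p_b > B["cap"]:
        return float("inf")                 # this commitment is infeasible
    return p_a * A["cost"] + p_b * B["cost"]

mip_cost = min(dispatch_cost(u) for u in (0, 1))            # integer commitment
lp_cost = min(dispatch_cost(u / 100) for u in range(101))   # relaxed (LP-style)
print(mip_cost, lp_cost)   # → 1500.0 300.0
```

The integer solution must run the expensive flexible unit (cost 1500), while the relaxation commits a fraction of the cheap unit and reports 300; in full-scale models the same mechanism shows up as different start/ramp behavior rather than such an extreme cost gap.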

  3. An examination of the assumptions of specialization, mental disorder, and dangerousness in sex offenders.

    Science.gov (United States)

    Simon, L M

    2000-01-01

    Sex offenders have been singled out for differential treatment by the legal and mental health systems. This article attempts to inform law reform efforts and criminal justice mental health policy by examining the assumptions underlying differential legal and mental health treatment of sex offenders. These assumptions include the theories that sex offenders are mentally disordered and in need of treatment, specialists in sex crimes, and more dangerous than other criminal offenders. Empirical findings demonstrate that sex offenders are not specialists in sex crimes and are not mentally disordered. Examination of past research suggests that sex offenders are not at more risk than other criminal offenders to commit future sex crimes. Implications of research findings for selective prosecution of sex crime cases, mental health policy, sex offender legislation, and predictions of future dangerousness are discussed. Proposals for future research needs and law reform are presented. Copyright 2000 John Wiley & Sons, Ltd.

  4. Random Regression Models Based On The Skew Elliptically Contoured Distribution Assumptions With Applications To Longitudinal Data *

    Science.gov (United States)

    Zheng, Shimin; Rao, Uma; Bartolucci, Alfred A.; Singh, Karan P.

    2011-01-01

Bartolucci et al. (2003) extended the distribution assumption from the normal (Lyles et al., 2000) to the elliptically contoured distribution (ECD) for random regression models used in the analysis of longitudinal data accounting for both undetectable values and informative drop-outs. In this paper, the random regression models are constructed on the multivariate skew ECD. A real data set is used to illustrate that skew ECDs can fit some unimodal continuous data better than Gaussian distributions or more general continuous symmetric distributions when the symmetric distribution assumption is violated. Also, a simulation study is done to illustrate the model fit for a variety of skew ECDs. The software we used is SAS/STAT, V. 9.13. PMID:21637734
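Skewed alternatives of the kind the paper builds on can be sampled directly. Below is a sketch of Azzalini's skew-normal (a special case of the skew-elliptical family) via its standard stochastic representation; the shape parameter and sample size are hypothetical:

```python
import numpy as np

def skew_normal(alpha, size, rng):
    """Sample Azzalini's skew-normal SN(alpha) via its stochastic representation:
    X = delta*|Z0| + sqrt(1 - delta^2)*Z1, with delta = alpha/sqrt(1 + alpha^2)."""
    delta = alpha / np.sqrt(1.0 + alpha**2)
    z0 = np.abs(rng.normal(size=size))
    z1 = rng.normal(size=size)
    return delta * z0 + np.sqrt(1.0 - delta**2) * z1

rng = np.random.default_rng(3)
x = skew_normal(alpha=5.0, size=200_000, rng=rng)

# Sample mean vs. theory (E[X] = delta * sqrt(2/pi)): the positive skew is
# exactly the asymmetry a Gaussian assumption would miss.
delta = 5.0 / np.sqrt(26.0)
print(round(float(x.mean() - delta * np.sqrt(2.0 / np.pi)), 3))
```

Fitting such a skewed density to unimodal but asymmetric residuals is the situation where the paper reports better fit than symmetric alternatives.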

  5. Tests of data quality, scaling assumptions, and reliability of the Danish SF-36

    DEFF Research Database (Denmark)

    Bjorner, J B; Damsgaard, M T; Watt, T

    1998-01-01

We used general population data (n = 4084) to examine data completeness, response consistency, tests of scaling assumptions, and reliability of the Danish SF-36 Health Survey. We compared traditional multitrait scaling analyses to analyses using polychoric correlations and Spearman correlations. Tests of scaling assumptions (discriminant validity, equal item-own scale correlations, and equal variances) were satisfactory in the total sample and in all subgroups. The SF-36 could discriminate between levels of health in all subgroups, but there were skewness, kurtosis, and ceiling effects in many subgroups (elderly people and people with chronic diseases excepted). Concerning correlation methods, we found interesting differences indicating advantages of using methods that do not assume a normal distribution of answers as an addition to traditional methods.

  6. Innovation or 'Inventions'? The conflict between latent assumptions in marine aquaculture and local fishery.

    Science.gov (United States)

    Martínez-Novo, Rodrigo; Lizcano, Emmánuel; Herrera-Racionero, Paloma; Miret-Pastor, Lluís

    2018-02-01

    Recent European policy highlights the need to promote local fishery and aquaculture by means of innovation and joint participation in fishery management as one of the keys to achieve the sustainability of our seas. However, the implicit assumptions held by the actors in the two main groups involved - innovators (scientists, businessmen and administration managers) and local fishermen - can complicate, perhaps even render impossible, mutual understanding and co-operation. A qualitative analysis of interviews with members of both groups in the Valencian Community (Spain) reveals those latent assumptions and their impact on the respective practices. The analysis shows that the innovation narrative in which one group is based and the inventions narrative used by the other one are rooted in two dramatically different, or even antagonistic, collective worldviews. Any environmental policy that implies these groups should take into account these strong discords.

  7. The importance of measuring growth in response to intervention models: Testing a core assumption.

    Science.gov (United States)

    Schatschneider, Christopher; Wagner, Richard K; Crawford, Elizabeth C

    2008-01-01

    A core assumption of response to instruction or intervention (RTI) models is the importance of measuring growth in achievement over time in response to effective instruction or intervention. Many RTI models actively monitor growth for identifying individuals who need different levels of intervention. A large-scale (N=23,438), two-year longitudinal study of first grade children was carried out to compare the predictive validity of measures of achievement status, growth in achievement, and their combination for predicting future reading achievement. The results indicate that under typical conditions, measures of growth do not make a contribution to prediction that is independent of measures of achievement status. These results question the validity of a core assumption of RTI models.
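
The status-versus-growth comparison at the heart of this record can be miniaturized with synthetic data: fit a later outcome on a status measure alone, then on status plus a growth measure, and look at the incremental R². The data-generating model below (growth correlated with status but carrying no unique signal) is an illustrative assumption, not the study's dataset.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
ability = rng.normal(size=n)                              # achievement status
growth = 0.3 * ability + rng.normal(scale=0.5, size=n)    # growth, correlated with status
outcome = ability + rng.normal(scale=0.5, size=n)         # later reading achievement

def r2(predictors, y):
    """R^2 of an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_status = r2([ability], outcome)
r2_both = r2([ability, growth], outcome)
print(f"R^2 status only:   {r2_status:.3f}")
print(f"R^2 status+growth: {r2_both:.3f}")
print(f"increment:         {r2_both - r2_status:.4f}")
```

Under these assumptions the increment is essentially zero, mirroring the paper's finding that growth measures added little prediction beyond achievement status.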

  8. Benchmarking biological nutrient removal in wastewater treatment plants: influence of mathematical model assumptions

    DEFF Research Database (Denmark)

    Flores-Alsina, Xavier; Gernaey, Krist V.; Jeppsson, Ulf

    2012-01-01

    This paper examines the effect of different model assumptions when describing biological nutrient removal (BNR) by the activated sludge models (ASM) 1, 2d & 3. The performance of a nitrogen removal (WWTP1) and a combined nitrogen and phosphorus removal (WWTP2) benchmark wastewater treatment plant was compared for a series of model assumptions. Three different model approaches describing BNR are considered. In the reference case, the original model implementations are used to simulate WWTP1 (ASM1 & 3) and WWTP2 (ASM2d). The second set of models includes a reactive settler, which extends the description of the non-reactive TSS sedimentation and transport in the reference case with the full set of ASM processes. Finally, the third set of models is based on including electron acceptor dependency of biomass decay rates for ASM1 (WWTP1) and ASM2d (WWTP2). The results show that incorporation of a reactive ...

  9. Load assumption for fatigue design of structures and components counting methods, safety aspects, practical application

    CERN Document Server

    Köhler, Michael; Pötter, Kurt; Zenner, Harald

    2017-01-01

    Understanding the fatigue behaviour of structural components under variable load amplitude is an essential prerequisite for safe and reliable light-weight design. For designing and dimensioning, the expected stress (load) is compared with the capacity to withstand loads (fatigue strength). In this process, the safety necessary for each particular application must be ensured. A prerequisite for ensuring the required fatigue strength is a reliable load assumption. The authors describe the transformation of the stress- and load-time functions which have been measured under operational conditions to spectra or matrices with the application of counting methods. The aspects which must be considered for ensuring a reliable load assumption for designing and dimensioning are discussed in detail. Furthermore, the theoretical background for estimating the fatigue life of structural components is explained, and the procedures are discussed for numerous applications in practice. One of the prime intentions of the authors ...
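
The counting methods this book describes reduce a measured load-time history to a spectrum. As a hedged sketch of the simplest step, the code below extracts turning points (peaks and valleys) and counts the ranges between them; real load-assumption practice typically uses rainflow counting, which this does not implement.

```python
def turning_points(series):
    """Collapse a load-time sequence to its local extrema (peaks/valleys)."""
    pts = [series[0]]
    for x in series[1:]:
        if x == pts[-1]:
            continue  # plateau: no new information
        if len(pts) >= 2 and (pts[-1] - pts[-2]) * (x - pts[-1]) > 0:
            pts[-1] = x  # still rising (or falling): extend the extremum
        else:
            pts.append(x)  # direction changed: new turning point
    return pts

def range_counts(series):
    """Ranges between successive turning points (simple range counting)."""
    pts = turning_points(series)
    return [abs(b - a) for a, b in zip(pts, pts[1:])]

load = [0, 1, 2, 1, 3, -1, 2]  # invented load history
print(turning_points(load))
print(range_counts(load))
```

The monotone run 0, 1, 2 collapses to the single peak 2, after which each direction reversal contributes one range to the spectrum.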

  10. Investigating Teachers’ and Students’ Beliefs and Assumptions about CALL Programme at Caledonian College of Engineering

    Directory of Open Access Journals (Sweden)

    Holi Ibrahim Holi Ali

    2012-01-01

    Full Text Available This study investigates students' and teachers' perceptions and assumptions about the newly implemented CALL programme at the School of Foundation Studies, Caledonian College of Engineering, Oman. Two versions of a questionnaire were administered to 24 teachers and 90 students to collect their beliefs and assumptions about the CALL programme. The results show that the great majority of the students report that CALL is very interesting, motivating, and useful to them and that they learn a lot from it. However, the number of CALL hours should be increased, the lab should be equipped and arranged in a user-friendly way, assessment should be integrated into CALL, and smart boards and blackboards should be incorporated into the programme.

  11. Investigation of assumptions underlying current safety guidelines on EM-induced nerve stimulation

    Science.gov (United States)

    Neufeld, Esra; Vogiatzis Oikonomidis, Ioannis; Iacono, Maria Ida; Angelone, Leonardo M.; Kainz, Wolfgang; Kuster, Niels

    2016-06-01

    An intricate network of a variety of nerves is embedded within the complex anatomy of the human body. Although nerves are shielded from unwanted excitation, they can still be stimulated by external electromagnetic sources that induce strongly non-uniform field distributions. Current exposure safety standards designed to limit unwanted nerve stimulation are based on a series of explicit and implicit assumptions and simplifications. This paper demonstrates the applicability of functionalized anatomical phantoms with integrated coupled electromagnetic and neuronal dynamics solvers for investigating the impact of magnetic resonance exposure on nerve excitation within the full complexity of the human anatomy. The impact of neuronal dynamics models, temperature and local hot-spots, nerve trajectory and potential smoothing, anatomical inhomogeneity, and pulse duration on nerve stimulation was evaluated. As a result, multiple assumptions underlying current safety standards are questioned. It is demonstrated that coupled EM-neuronal dynamics modeling involving realistic anatomies is valuable to establish conservative safety criteria.
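
Exposure limits of the kind this paper questions are often grounded in strength-duration behavior of nerve excitation. A toy sketch of the classic Weiss law, threshold(d) = rheobase * (1 + chronaxie / d), shows the pulse-duration dependency mentioned in the abstract; the parameter values are illustrative assumptions, not taken from any safety standard.

```python
def weiss_threshold(duration_ms, rheobase=1.0, chronaxie_ms=0.25):
    """Weiss strength-duration law: stimulation threshold vs pulse duration."""
    return rheobase * (1.0 + chronaxie_ms / duration_ms)

# Shorter pulses need disproportionately stronger stimuli to excite a nerve:
for d in (0.05, 0.25, 1.0, 5.0):
    print(f"{d} ms pulse -> threshold {weiss_threshold(d):.2f} x rheobase")
```

At a pulse duration equal to the chronaxie, the threshold is exactly twice the rheobase, which is the defining property of the chronaxie.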

  12. Technoeconomic assumptions adopted for the development of a long-term electricity supply model for Cyprus

    Directory of Open Access Journals (Sweden)

    Constantinos Taliotis

    2017-10-01

    Full Text Available The generation mix of Cyprus has been dominated by oil products for decades. In order to conform with European Union and international legislation, a transformation of the supply system is called for. Energy system models can facilitate energy planning into the future, but a large volume of data is required to populate such models. The present data article provides information on key modelling assumptions and input data adopted with the aim of representing the electricity supply system of Cyprus in a separate research article. Data regarding renewable energy technoeconomic characteristics and investment cost projections, fossil fuel price projections, storage technology characteristics, and system operation assumptions are described in this article.

  13. Life-closing spirituality and the philosophic assumptions of the Roy adaptation model.

    Science.gov (United States)

    Dobratz, Marjorie C

    2004-10-01

    Secondary analysis of data from a previous study that referenced spirituality was coded, categorized, and grouped into themes. Life-closing spirituality for 44 (45.4%) of 97 total participants was shaped by a core theme of believing that was central to dying persons. Believing was linked to six other themes: comforting, releasing, connecting, giving, reframing, and requesting. These themes supported the philosophic assumptions and principles of humanism and veritivity as defined in the Roy adaptation model.

  14. Who needs the assumption of opportunistic behavior? Transaction cost economics does not!

    DEFF Research Database (Denmark)

    Koch, Carsten Allan

    2000-01-01

    The assumption of opportunistic behavior, familiar from transaction cost economics, has been and remains highly controversial. But opportunistic behavior, albeit undoubtedly an extremely important form of motivation, is not a necessary condition for the contractual problems studied by transaction cost economics to occur. A broader category of motivation, inappropriate motivation, which is such a necessary condition, is presented and discussed, and a number of implications are drawn.

  15. Mercury and Air Toxics Standards Analysis Deconstructed: Changing Assumptions, Changing Results

    OpenAIRE

    Beasley, Blair; Woerman, Matt; Paul, Anthony; Burtraw, Dallas; Palmer, Karen

    2013-01-01

    Several recent studies have used simulation models to quantify the potential effects of recent environmental regulations on power plants, including the Mercury and Air Toxics Standards (MATS), one of the US Environmental Protection Agency’s most expensive regulations. These studies have produced inconsistent results about the effects on the industry, making general conclusions difficult. We attempt to reconcile these differences by representing the variety of assumptions in these studies with...

  16. Camera traps and mark-resight models: The value of ancillary data for evaluating assumptions

    Science.gov (United States)

    Parsons, Arielle W.; Simons, Theodore R.; Pollock, Kenneth H.; Stoskopf, Michael K.; Stocking, Jessica J.; O'Connell, Allan F.

    2015-01-01

    Unbiased estimators of abundance and density are fundamental to the study of animal ecology and critical for making sound management decisions. Capture–recapture models are generally considered the most robust approach for estimating these parameters but rely on a number of assumptions that are often violated but rarely validated. Mark-resight models, a form of capture–recapture, are well suited for use with noninvasive sampling methods and allow for a number of assumptions to be relaxed. We used ancillary data from continuous video and radio telemetry to evaluate the assumptions of mark-resight models for abundance estimation on a barrier island raccoon (Procyon lotor) population using camera traps. Our island study site was geographically closed, allowing us to estimate real survival and in situ recruitment in addition to population size. We found several sources of bias due to heterogeneity of capture probabilities in our study, including camera placement, animal movement, island physiography, and animal behavior. Almost all sources of heterogeneity could be accounted for using the sophisticated mark-resight models developed by McClintock et al. (2009b), and this model generated estimates similar to those from a spatially explicit mark-resight model previously developed for this population during our study. Spatially explicit capture–recapture models have become an important tool in ecology and confer a number of advantages; however, non-spatial models that account for inherent individual heterogeneity may perform nearly as well, especially where immigration and emigration are limited. Non-spatial models are computationally less demanding, do not make implicit assumptions related to the isotropy of home ranges, and can provide insights with respect to the biological traits of the local population.
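
The core mark-resight logic the study builds on can be sketched with the Chapman-modified Lincoln-Petersen estimator, far simpler than the McClintock et al. models used in the paper but resting on the same closure assumption. The raccoon counts below are invented for illustration.

```python
def chapman_estimate(marked, sighted, marked_sighted):
    """Chapman estimator of population size for a closed population.

    marked: number of marked (e.g. radio-collared) animals
    sighted: total resighting detections (e.g. camera-trap records)
    marked_sighted: how many of those detections were of marked animals
    """
    return (marked + 1) * (sighted + 1) / (marked_sighted + 1) - 1

# e.g. 20 collared raccoons, 50 camera detections, 12 of collared animals:
n_hat = chapman_estimate(20, 50, 12)
print(round(n_hat, 1))  # about 81 animals under these invented numbers
```

Violations of the assumptions the paper tests (heterogeneous detection from camera placement, movement, behavior) bias exactly this kind of estimator, which motivates the ancillary video and telemetry data.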

  17. The Universality of Intuition: An A Posteriori Critique of an A Priori Assumption

    Directory of Open Access Journals (Sweden)

    Roohollah Haghshenas

    2015-03-01

    Full Text Available Intuition has a central role in philosophy: the role of arbitrating between different opinions. When a philosopher shows that "intuition" supports his view, he takes this as a good reason for it. In contrast, if we show some contradictions between intuition and a theory or some of its implications, we think a replacement, or at least some revision, would be needed. There are well-known examples of this role for intuition in many fields of philosophy: the transplant case in ethics, the Chinese nation case in philosophy of mind, and the Gettier examples in epistemology. But there is an assumption here: we suppose all people think in the same manner, i.e., we think intuition is universal. Experimental philosophy tries to study this assumption experimentally. This project continues Quine's movement to the "pursuit of truth" from a naturalistic point of view, making epistemology "a branch of natural science." The work of experimental philosophy shows that in many cases people with different cultural backgrounds respond to specific moral or epistemological cases - like the Gettier examples - differently, and thus intuition is not universal. So, many problems that are based on this assumption may be dissolved, have plural forms for plural cultures, or be bound to specific cultures - Western culture in many cases.

  18. Fluid-Structure Interaction Modeling of Intracranial Aneurysm Hemodynamics: Effects of Different Assumptions

    Science.gov (United States)

    Rajabzadeh Oghaz, Hamidreza; Damiano, Robert; Meng, Hui

    2015-11-01

    Intracranial aneurysms (IAs) are pathological outpouchings of cerebral vessels, the progression of which is mediated by complex interactions between the blood flow and vasculature. Image-based computational fluid dynamics (CFD) has been used for decades to investigate IA hemodynamics. However, the commonly adopted simplifying assumptions in CFD (e.g. rigid wall) compromise the simulation accuracy and mask the complex physics involved in IA progression and eventual rupture. Several groups have considered wall compliance by using fluid-structure interaction (FSI) modeling. However, FSI simulation is highly sensitive to numerical assumptions (e.g. linear-elastic wall material, Newtonian fluid, initial vessel configuration, and constant pressure outlet), the effects of which are poorly understood. In this study, the sensitivity of FSI simulations in patient-specific IAs is investigated using a multi-stage approach with a varying level of complexity. We start with simulations incorporating several common simplifications: rigid wall, Newtonian fluid, and constant pressure at the outlets, and then we stepwise remove these simplifications until the most comprehensive FSI simulations. Hemodynamic parameters such as wall shear stress and oscillatory shear index are assessed and compared at each stage to better understand the sensitivity of FSI simulations for IAs to model assumptions. Supported by the National Institutes of Health (1R01 NS 091075-01).

  19. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Science.gov (United States)

    Hsu, Anne; Griffiths, Thomas L

    2016-01-01

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.
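
The strong- versus weak-sampling distinction can be sketched numerically. Under strong sampling, a hypothesis (a set of grammatical sentences) assigns each observed sentence probability 1/|h|, so smaller hypotheses gain likelihood with every observation (the "size principle"); under weak sampling every consistent hypothesis scores the same. The hypothesis sizes below are illustrative assumptions, not the paper's stimuli.

```python
def posterior(hypothesis_sizes, n_observations, strong=True):
    """Posterior over hypotheses that all contain the observed sentences,
    assuming a uniform prior."""
    if strong:
        # Each observation is drawn uniformly from the hypothesis.
        likelihoods = [(1.0 / size) ** n_observations for size in hypothesis_sizes]
    else:
        # Weak sampling: consistency is all that matters.
        likelihoods = [1.0 for _ in hypothesis_sizes]
    total = sum(likelihoods)
    return [lk / total for lk in likelihoods]

# Narrow hypothesis (10 sentences) vs broad one (100 sentences),
# both consistent with everything observed so far:
print(posterior([10, 100], n_observations=1, strong=True))
print(posterior([10, 100], n_observations=5, strong=True))
print(posterior([10, 100], n_observations=5, strong=False))
```

With strong sampling the narrow hypothesis rapidly dominates as observations accumulate, which is exactly how absent constructions become indirect negative evidence; with weak sampling the posterior never moves.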

  20. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning

    Science.gov (United States)

    2016-01-01

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning. PMID:27310576

  1. On the ontological assumptions of the medical model of psychiatry: philosophical considerations and pragmatic tasks

    Directory of Open Access Journals (Sweden)

    Giordano James

    2010-01-01

    Full Text Available Abstract A common theme in the contemporary medical model of psychiatry is that pathophysiological processes are centrally involved in the explanation, evaluation, and treatment of mental illnesses. Implied in this perspective is that clinical descriptors of these pathophysiological processes are sufficient to distinguish underlying etiologies. Psychiatric classification requires differentiation between what counts as normality (i.e., order) and what counts as abnormality (i.e., disorder). The distinction(s) between normality and pathology entail assumptions that are often deeply presupposed, manifesting themselves in statements about what mental disorders are. In this paper, we explicate that realism, naturalism, reductionism, and essentialism are core ontological assumptions of the medical model of psychiatry. We argue that while naturalism, realism, and reductionism can be reconciled with advances in contemporary neuroscience, essentialism - as defined to date - may be conceptually problematic, and we pose an eidetic construct of bio-psychosocial order and disorder based upon complex systems' dynamics. However, we also caution against the overuse of any theory, and claim that practical distinctions are important to the establishment of clinical thresholds. We opine that as we move ahead toward both a new edition of the Diagnostic and Statistical Manual, and a proposed Decade of the Mind, the task at hand is to re-visit nosologic and ontologic assumptions pursuant to a re-formulation of diagnostic criteria and practice.

  2. Evaluating the influence of the 'unity assumption' on the temporal perception of realistic audiovisual stimuli.

    Science.gov (United States)

    Vatakis, Argiro; Spence, Charles

    2008-01-01

    Vatakis, A. and Spence, C. (in press) [Crossmodal binding: Evaluating the 'unity assumption' using audiovisual speech stimuli. Perception & Psychophysics] recently demonstrated that when two briefly presented speech signals (one auditory and the other visual) refer to the same audiovisual speech event, people find it harder to judge their temporal order than when they refer to different speech events. Vatakis and Spence argued that the 'unity assumption' facilitated crossmodal binding on the former (matching) trials by means of a process of temporal ventriloquism. In the present study, we investigated whether the 'unity assumption' would also affect the binding of non-speech stimuli (video clips of object action or musical notes). The auditory and visual stimuli were presented at a range of stimulus onset asynchronies (SOAs) using the method of constant stimuli. Participants made unspeeded temporal order judgments (TOJs) regarding which modality stream had been presented first. The auditory and visual musical and object action stimuli were either matched (e.g., the sight of a note being played on a piano together with the corresponding sound) or else mismatched (e.g., the sight of a note being played on a piano together with the sound of a guitar string being plucked). However, in contrast to the results of Vatakis and Spence's recent speech study, no significant difference in the accuracy of temporal discrimination performance for the matched versus mismatched video clips was observed. Reasons for this discrepancy are discussed.

  3. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Directory of Open Access Journals (Sweden)

    Anne Hsu

    Full Text Available A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.

  4. The simplified reference tissue model: model assumption violations and their impact on binding potential.

    Science.gov (United States)

    Salinas, Cristian A; Searle, Graham E; Gunn, Roger N

    2015-02-01

    Reference tissue models have gained significant traction over the last two decades as the methods of choice for the quantification of brain positron emission tomography data because they balance quantitative accuracy with less invasive procedures. The principal advantage is the elimination of the need to perform arterial cannulation of the subject to measure blood and metabolite concentrations for input function generation. In particular, the simplified reference tissue model (SRTM) has been widely adopted as it uses a simplified model configuration with only three parameters that typically produces good fits to the kinetic data and a stable parameter estimation process. However, the model's simplicity and its ability to generate good fits to the data, even when the model assumptions are not met, can lead to misplaced confidence in binding potential (BPND) estimates. Computer simulations were used to study the bias introduced in BPND estimates as a consequence of violating each of the four core SRTM model assumptions. Violation of each model assumption led to bias in BPND (both over- and underestimation). Careful assessment of the bias in SRTM BPND should be performed for new tracers and applications so that an appropriate decision about its applicability can be made.
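
The three-parameter SRTM the record refers to can be sketched via its standard operational equation, C_T = R1·C_R + (k2 − R1·k2a)·(C_R ⊗ e^(−k2a·t)) with k2a = k2/(1+BPND). The reference input curve and parameter values below are invented for illustration and are not the paper's simulation setup.

```python
import numpy as np

def srtm_curve(t, c_ref, r1, k2, bp):
    """Target-tissue time-activity curve from a reference-tissue curve
    under the SRTM operational equation (discrete convolution)."""
    k2a = k2 / (1.0 + bp)
    dt = t[1] - t[0]
    conv = np.convolve(c_ref, np.exp(-k2a * t))[: t.size] * dt
    return r1 * c_ref + (k2 - r1 * k2a) * conv

t = np.linspace(0, 90, 901)            # minutes
c_ref = t * np.exp(-t / 20.0)          # illustrative reference-region TAC
c_target = srtm_curve(t, c_ref, r1=1.0, k2=0.15, bp=1.5)
print(f"peak target/reference ratio: {c_target.max() / c_ref.max():.2f}")
```

With BPND > 0 the target curve accumulates above the reference curve; fitting the three parameters to simulated curves generated under deliberately violated assumptions is the kind of bias study the abstract describes.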

  5. Using classroom data to teach students about data cleaning and testing assumptions

    Directory of Open Access Journals (Sweden)

    Kevin eCummiskey

    2012-09-01

    Full Text Available This paper discusses the influence that decisions about data cleaning and violations of statistical assumptions can have on drawing valid conclusions to research studies. The datasets provided in this paper were collected as part of a National Science Foundation grant to design online games and associated labs for use in undergraduate and graduate statistics courses that can effectively illustrate issues not always addressed in traditional instruction. Students play the role of a researcher by selecting from a wide variety of independent variables to explain why some students complete games faster than others. Typical project data sets are messy, with many outliers (usually from some students taking much longer than others) and distributions that do not appear normal. Classroom testing of the games over several semesters has produced evidence of their efficacy in statistics education. The projects tend to be engaging for students and they make the impact of data cleaning and violations of model assumptions more relevant. We discuss the use of one of the games and associated guided lab in introducing students to issues prevalent in real data and the challenges involved in data cleaning and dangers when model assumptions are violated.
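
The kind of lab exercise the record describes can be sketched in a few lines: flag outliers with the 1.5×IQR rule and test the normality assumption before and after cleaning. The completion times below are synthetic (most students fast, a few very slow), not the project's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Right-skewed game-completion times (seconds): 95 typical, 5 very slow.
times = np.concatenate([rng.normal(60, 10, 95), rng.normal(240, 30, 5)])

q1, q3 = np.percentile(times, [25, 75])
iqr = q3 - q1
mask = (times >= q1 - 1.5 * iqr) & (times <= q3 + 1.5 * iqr)
cleaned = times[mask]

w_raw, p_raw = stats.shapiro(times)
w_clean, p_clean = stats.shapiro(cleaned)
print(f"removed {np.sum(~mask)} outliers of {times.size}")
print(f"Shapiro-Wilk p, raw: {p_raw:.2e}, cleaned: {p_clean:.2e}")
```

The raw sample decisively fails the normality test because of the slow-student cluster; whether (and how) to remove those points is exactly the judgment call the guided labs ask students to defend.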

  6. Estimation of the energy loss at the blades in rowing: common assumptions revisited.

    Science.gov (United States)

    Hofmijster, Mathijs; De Koning, Jos; Van Soest, A J

    2010-08-01

    In rowing, power is inevitably lost as kinetic energy is imparted to the water during push-off with the blades. Power loss is estimated from reconstructed blade kinetics and kinematics. Traditionally, it is assumed that the oar is completely rigid and that force acts strictly perpendicular to the blade. The aim of the present study was to evaluate how reconstructed blade kinematics, kinetics, and average power loss are affected by these assumptions. A calibration experiment with instrumented oars and oarlocks was performed to establish relations between measured signals and oar deformation and blade force. Next, an on-water experiment was performed with a single female world-class rower rowing at constant racing pace in an instrumented scull. Blade kinematics, kinetics, and power loss under different assumptions (rigid versus deformable oars; absence or presence of a blade force component parallel to the oar) were reconstructed. Estimated power losses at the blades are 18% higher when parallel blade force is incorporated. Incorporating oar deformation affects reconstructed blade kinematics and instantaneous power loss, but has no effect on estimation of power losses at the blades. Assumptions on oar deformation and blade force direction have implications for the reconstructed blade kinetics and kinematics. Neglecting parallel blade forces leads to a substantial underestimation of power losses at the blades.
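
The effect of the parallel-force assumption can be illustrated with a toy instantaneous calculation: power lost to the water is the blade force dotted with the blade's slip velocity, so adding a force component parallel to the oar changes the estimate. All magnitudes below are invented; this is not the paper's reconstruction procedure.

```python
import numpy as np

# One instant during the drive, in a water-fixed frame (illustrative):
v_blade = np.array([0.8, 0.3])      # m/s, slip velocity of the blade
f_perp = np.array([-400.0, 0.0])    # N, force perpendicular to the blade
f_par = np.array([0.0, -60.0])      # N, assumed parallel component

# Power lost = -(force on blade) . (slip velocity), signs chosen so
# losses come out positive.
p_perp_only = -f_perp @ v_blade
p_with_par = -(f_perp + f_par) @ v_blade
print(f"loss, perpendicular force only: {p_perp_only:.0f} W")
print(f"loss, with parallel component:  {p_with_par:.0f} W")
```

Even this crude snapshot shows the direction of the paper's finding: ignoring the parallel component underestimates the instantaneous power lost at the blade.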

  7. Climate Change: Implications for the Assumptions, Goals and Methods of Urban Environmental Planning

    Directory of Open Access Journals (Sweden)

    Kristina Hill

    2016-12-01

    Full Text Available As a result of increasing awareness of the implications of global climate change, shifts are becoming necessary and apparent in the assumptions, concepts, goals and methods of urban environmental planning. This review will present the argument that these changes represent a genuine paradigm shift in urban environmental planning. Reflection and action to develop this paradigm shift is critical now and in the next decades, because environmental planning for cities will only become more urgent as we enter a new climate period. The concepts, methods and assumptions that urban environmental planners have relied on in previous decades to protect people, ecosystems and physical structures are inadequate if they do not explicitly account for a rapidly changing regional climate context, specifically from a hydrological and ecological perspective. The over-arching concept of spatial suitability that guided planning in most of the 20th century has already given way to concepts that address sustainability, recognizing the importance of temporality. Quite rapidly, the concept of sustainability has been replaced in many planning contexts by the priority of establishing resilience in the face of extreme disturbance events. Now even this concept of resilience is being incorporated into a novel concept of urban planning as a process of adaptation to permanent, incremental environmental changes. This adaptation concept recognizes the necessity for continued resilience to extreme events, while acknowledging that permanent changes are also occurring as a result of trends that have a clear direction over time, such as rising sea levels. Similarly, the methods of urban environmental planning have relied on statistical data about hydrological and ecological systems that will not adequately describe these systems under a new climate regime. These methods are beginning to be replaced by methods that make use of early warning systems for regime shifts, and process

  8. On the relevance of assumptions associated with classical factor analytic approaches

    Directory of Open Access Journals (Sweden)

    Daniel eKasper

    2013-03-01

    Full Text Available A personal trait, for example a person's cognitive ability, represents a theoretical concept postulated to explain behavior. Interesting constructs are latent, that is, they cannot be observed. Latent variable modeling constitutes a methodology to deal with hypothetical constructs. Constructs are modeled as random variables and become components of a statistical model. As random variables, they possess a probability distribution in the population of reference. In applications, this distribution is typically assumed to be the normal distribution. The normality assumption may be reasonable in many cases, but there are situations where it cannot be justified. For example, this is true for criterion-referenced tests or for background characteristics of students in large-scale assessment studies. Nevertheless, the normal procedures in combination with the classical factor analytic methods are frequently pursued, even though the effects of violating this "implicit" assumption are not clear in general. In a simulation study, we investigate whether classical factor analytic approaches can be instrumental in estimating the factorial structure and properties of the population distribution of a latent personal trait from educational test data when violations of classical assumptions such as the aforementioned are present. The results indicate that a latent non-normal distribution clearly affects the estimation of the distribution of the factor scores and properties thereof. Thus, when the population distribution of a personal trait is assumed to be non-symmetric, we recommend avoiding those factor analytic approaches for estimating a person's factor score, even though the number of extracted factors and the estimated loading matrix may not be strongly affected. An application to the Progress in International Reading Literacy Study (PIRLS) is given. Comments on possible implications for the Programme for International Student Assessment (PISA) complete the ...
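
The simulation question in this record can be miniaturized: generate items driven by a skewed (non-normal) latent trait, then compare first-factor loadings estimated from the Pearson correlation matrix with those from a rank-based (Spearman) matrix. The one-factor model, the "principal-axis style" loadings, and all parameters below are illustrative assumptions, not the study's design or the PIRLS data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n, k = 5000, 6
trait = rng.gamma(shape=2.0, scale=1.0, size=n)      # skewed latent trait
trait = (trait - trait.mean()) / trait.std()
items = 0.7 * trait[:, None] + rng.normal(scale=0.5, size=(n, k))

def first_factor_loadings(corr):
    """Loadings on the first principal factor of a correlation matrix."""
    vals, vecs = np.linalg.eigh(corr)                # ascending eigenvalues
    return np.abs(vecs[:, -1] * np.sqrt(vals[-1]))   # scale by sqrt(lambda_1)

pearson_load = first_factor_loadings(np.corrcoef(items.T))
spearman_load = first_factor_loadings(stats.spearmanr(items)[0])
print(pearson_load.round(2))
print(spearman_load.round(2))
```

Consistent with the abstract's conclusion, the loading pattern is similar under both matrices even with a skewed trait; it is the estimated factor-score distribution, not shown here, where the non-normality bites.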

  9. Evaluating the reliability of equilibrium dissolution assumption from residual gasoline in contact with water saturated sands

    Science.gov (United States)

    Lekmine, Greg; Sookhak Lari, Kaveh; Johnston, Colin D.; Bastow, Trevor P.; Rayner, John L.; Davis, Greg B.

    2017-01-01

    Understanding the dissolution dynamics of hazardous compounds from complex gasoline mixtures is key to long-term predictions of groundwater risks. The aim of this study was to investigate whether the local equilibrium assumption for the dissolution of BTEX and TMBs (trimethylbenzenes) was valid under variable saturation in two-dimensional flow conditions, and to evaluate the impact of local heterogeneities when equilibrium is verified at the scale of investigation. An initial residual gasoline saturation was established over the upper two-thirds of a water-saturated sand pack. A constant horizontal pore velocity was maintained and water samples were recovered across 38 sampling ports over 141 days. Inside the residual NAPL zone, BTEX and TMB dissolution curves were in agreement with the TMVOC model based on the local equilibrium assumption. Comparison with previous numerical studies suggests the presence of small-scale dissolution fingering perpendicular to the horizontal dissolution front, triggered mainly by heterogeneities in the medium structure and in the local residual NAPL saturation. In the transition zone, TMVOC was able to represent the range of behaviours exhibited by the data, confirming equilibrium or near-equilibrium dissolution at the scale of investigation. The model showed local discrepancies for the most soluble compounds, i.e. benzene and toluene, due to local heterogeneities, indicating that flow bypassing and channelling may have occurred at smaller scales. Even so, mass transfer rates were still high enough to fall under the equilibrium assumption in TMVOC at the scale of investigation. Comparisons with other models involving upscaled mass transfer rates demonstrated that such approximations could lead to overestimating BTEX dissolution rates and underestimating the total remediation time.
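As a rough illustration of the distinction the study probes (this is not the TMVOC model, and all coefficients are assumed values), a first-order mass-transfer description of dissolution along a residual NAPL zone approaches the equilibrium limit, effluent concentration equal to solubility, once the rate coefficient is large relative to the pore velocity:

```python
# Hedged sketch: steady 1-D first-order dissolution along a NAPL zone.
# dC/dx = (k / v) * (C_s - C), C(0) = 0  =>  C(L) / C_s = 1 - exp(-k L / v).
import math

def outlet_fraction(k, length, velocity):
    """C_out / C_s at the downstream edge of the NAPL zone."""
    return 1.0 - math.exp(-k * length / velocity)

# assumed values: 0.5 m NAPL zone, 0.05 m/h pore velocity, varying k (1/h)
for k in (0.01, 0.1, 1.0, 10.0):
    print(f"k = {k:5.2f} 1/h  ->  C_out/C_s = "
          f"{outlet_fraction(k, 0.5, 0.05):.3f}")
```

When the ratio kL/v is large the outlet concentration is indistinguishable from the equilibrium (solubility) value, which is the regime the experiments found to hold at the scale of investigation.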

  11. Benchmarking biological nutrient removal in wastewater treatment plants: influence of mathematical model assumptions.

    Science.gov (United States)

    Flores-Alsina, Xavier; Gernaey, Krist V; Jeppsson, Ulf

    2012-01-01

    This paper examines the effect of different model assumptions when describing biological nutrient removal (BNR) by the activated sludge models (ASM) 1, 2d & 3. The performance of a nitrogen removal (WWTP1) and a combined nitrogen and phosphorus removal (WWTP2) benchmark wastewater treatment plant was compared for a series of model assumptions. Three different model approaches describing BNR are considered. In the reference case, the original model implementations are used to simulate WWTP1 (ASM1 & 3) and WWTP2 (ASM2d). The second set of models includes a reactive settler, which extends the description of the non-reactive TSS sedimentation and transport in the reference case with the full set of ASM processes. Finally, the third set of models is based on including electron acceptor dependency of biomass decay rates for ASM1 (WWTP1) and ASM2d (WWTP2). The results show that incorporation of a reactive settler: (1) increases the hydrolysis of particulates; (2) increases the overall plant's denitrification efficiency by reducing the S(NOx) concentration at the bottom of the clarifier; (3) increases the oxidation of COD compounds; (4) increases X(OHO) and X(ANO) decay; and, finally, (5) increases the growth of X(PAO) and formation of X(PHA,Stor) for ASM2d, which has a major impact on the whole P removal system. Introduction of electron acceptor dependent decay leads to a substantial increase of the concentration of X(ANO), X(OHO) and X(PAO) in the bottom of the clarifier. The paper ends with a critical discussion of the influence of the different model assumptions, and emphasizes the need for a model user to understand the significant differences in simulation results that are obtained when applying different combinations of 'standard' models.

  12. Plant uptake of elements in soil and pore water: field observations versus model assumptions.

    Science.gov (United States)

    Raguž, Veronika; Jarsjö, Jerker; Grolander, Sara; Lindborg, Regina; Avila, Rodolfo

    2013-09-15

    Contaminant concentrations in various edible plant parts transfer hazardous substances from polluted areas to animals and humans. Thus, the accurate prediction of plant uptake of elements is of significant importance. The processes involved contain many interacting factors and are, as such, complex. In contrast, the most common way to currently quantify element transfer from soils into plants is relatively simple, using an empirical soil-to-plant transfer factor (TF). This practice is based on theoretical assumptions that have been previously shown to not generally be valid. Using field data on concentrations of 61 basic elements in spring barley, soil and pore water at four agricultural sites in mid-eastern Sweden, we quantify element-specific TFs. Our aim is to investigate to what extent observed element-specific uptake is consistent with TF model assumptions and to what extent TFs can be used to predict observed differences in concentrations between different plant parts (root, stem and ear). Results show that for most elements, plant-ear concentrations are not linearly related to bulk soil concentrations, which is congruent with previous studies. This behaviour violates a basic TF model assumption of linearity. However, substantially better linear correlations are found when weighted average element concentrations in whole plants are used for TF estimation. The highest number of linearly-behaving elements was found when relating average plant concentrations to soil pore-water concentrations. In contrast to other elements, essential elements (micronutrients and macronutrients) exhibited relatively small differences in concentration between different plant parts. Generally, the TF model was shown to work reasonably well for micronutrients, whereas it did not for macronutrients. The results also suggest that plant uptake of elements from sources other than the soil compartment (e.g. from air) may be non-negligible. Copyright © 2013 Elsevier Ltd. All rights reserved.
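The TF model the authors test can be stated in one line, TF = C_plant / C_soil, and its linearity assumption implies a roughly constant TF across sites for a given element. A minimal sketch with invented concentrations shows how a linearity violation surfaces as a large spread in site-level TFs:

```python
# Hypothetical numbers (not the study's data) illustrating the TF check:
# under the linear TF model, plant / soil concentration ratios should be
# roughly constant across sites; a large max/min spread flags non-linearity.
sites = {
    "site A": {"soil": 10.0, "plant": 2.0},
    "site B": {"soil": 20.0, "plant": 4.1},
    "site C": {"soil": 40.0, "plant": 30.0},   # non-linear uptake at this site
}

tfs = {name: c["plant"] / c["soil"] for name, c in sites.items()}
spread = max(tfs.values()) / min(tfs.values())
print(tfs, f"max/min TF ratio: {spread:.1f}")
```

A spread near 1 would be consistent with the linearity assumption; here the invented site C drives the ratio well above that, the kind of behaviour the paper reports for most elements against bulk soil concentrations.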

  13. Assumption-versus data-based approaches to summarizing species' ranges.

    Science.gov (United States)

    Peterson, A Townsend; Navarro-Sigüenza, Adolfo G; Gordillo, Alejandro

    2016-08-04

    For conservation decision making, species' geographic distributions are mapped using various approaches. Some such efforts have downscaled versions of coarse-resolution extent-of-occurrence maps to fine resolutions for conservation planning. We examined the quality of the extent-of-occurrence maps as range summaries and the utility of refining those maps into fine-resolution distributional hypotheses. Extent-of-occurrence maps tend to be overly simple, omit many known and well-documented populations, and likely frequently include many areas not holding populations. Refinement steps involve typological assumptions about habitat preferences and elevational ranges of species, which can introduce substantial error in estimates of species' true areas of distribution. However, no model-evaluation steps are taken to assess the predictive ability of these models, so model inaccuracies are not noticed. Whereas range summaries derived by these methods may be useful in coarse-grained, global-extent studies, their continued use in on-the-ground conservation applications at fine spatial resolutions is not advisable in light of reliance on assumptions, lack of real spatial resolution, and lack of testing. In contrast, data-driven techniques that integrate primary data on biodiversity occurrence with remotely sensed data that summarize environmental dimensions (i.e., ecological niche modeling or species distribution modeling) offer data-driven solutions based on a minimum of assumptions that can be evaluated and validated quantitatively to offer a well-founded, widely accepted method for summarizing species' distributional patterns for conservation applications. © 2016 Society for Conservation Biology.

  14. Old and New Ideas for Data Screening and Assumption Testing for Exploratory and Confirmatory Factor Analysis

    Science.gov (United States)

    Flora, David B.; LaBrish, Cathy; Chalmers, R. Philip

    2011-01-01

    We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables, and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables. PMID:22403561

  15. The Avalanche Hypothesis and Compression of Morbidity: Testing Assumptions through Cohort-Sequential Analysis.

    Directory of Open Access Journals (Sweden)

    Jordan Silberman

    Full Text Available The compression of morbidity model posits a breakpoint in the adult lifespan that separates an initial period of relative health from a subsequent period of ever increasing morbidity. Researchers often assume that such a breakpoint exists; however, this assumption is hitherto untested. To test the assumption that a breakpoint exists--which we term a morbidity tipping point--separating a period of relative health from a subsequent deterioration in health status. An analogous tipping point for healthcare costs was also investigated. Four years of adults' (N = 55,550) morbidity and costs data were retrospectively analyzed. Data were collected in Pittsburgh, PA between 2006 and 2009; analyses were performed in Rochester, NY and Ann Arbor, MI in 2012 and 2013. Cohort-sequential and hockey stick regression models were used to characterize long-term trajectories and tipping points, respectively, for both morbidity and costs. Morbidity increased exponentially with age (P<.001). A morbidity tipping point was observed at age 45.5 (95% CI, 41.3-49.7). An exponential trajectory was also observed for costs (P<.001), with a costs tipping point occurring at age 39.5 (95% CI, 32.4-46.6). Following their respective tipping points, both morbidity and costs increased substantially (Ps<.001). Findings support the existence of a morbidity tipping point, confirming an important but untested assumption. This tipping point, however, may occur earlier in the lifespan than is widely assumed. An "avalanche of morbidity" occurred after the morbidity tipping point: an ever increasing rate of morbidity progression. For costs, an analogous tipping point and "avalanche" were observed. The time point at which costs began to increase substantially occurred approximately 6 years before health status began to deteriorate.
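Hockey stick regression, as used here to locate tipping points, can be sketched as a hinge model y = a + b * max(0, x - c), with the breakpoint c found by grid search and a, b by least squares. The following is a hypothetical illustration on synthetic data, not the study's code:

```python
# Toy hockey-stick fit: flat baseline that turns upward at an unknown
# breakpoint c. Synthetic "morbidity by age" data with a true break at 45.
import random

random.seed(1)

def fit_hockey_stick(xs, ys, candidates):
    """Return (a, b, c, sse) minimising sum((y - a - b*max(0, x - c))^2)."""
    best = None
    for c in candidates:
        h = [max(0.0, x - c) for x in xs]          # hinge term
        n = len(xs)
        mh, my = sum(h) / n, sum(ys) / n
        sxx = sum((hi - mh) ** 2 for hi in h)
        if sxx == 0:                                # breakpoint past all data
            continue
        b = sum((hi - mh) * (yi - my) for hi, yi in zip(h, ys)) / sxx
        a = my - b * mh
        sse = sum((yi - a - b * hi) ** 2 for hi, yi in zip(h, ys))
        if best is None or sse < best[3]:
            best = (a, b, c, sse)
    return best

# synthetic data: flat until age 45, rising at slope 0.3 afterwards
ages = list(range(20, 81))
morbidity = [1.0 + 0.3 * max(0, age - 45) + random.gauss(0, 0.2) for age in ages]

a, b, c, sse = fit_hockey_stick(ages, morbidity, candidates=range(25, 75))
print(f"estimated tipping point: age {c}")
```

The grid search recovers a breakpoint near the true value of 45; the cohort-sequential component of the study addresses the longitudinal structure and is not sketched here.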

  16. The Avalanche Hypothesis and Compression of Morbidity: Testing Assumptions through Cohort-Sequential Analysis

    Science.gov (United States)

    Silberman, Jordan; Wang, Chun; Mason, Shawn T.; Schwartz, Steven M.; Hall, Matthew; Morrissette, Jason L.; Tu, Xin M.; Greenhut, Janet

    2015-01-01

    Background The compression of morbidity model posits a breakpoint in the adult lifespan that separates an initial period of relative health from a subsequent period of ever increasing morbidity. Researchers often assume that such a breakpoint exists; however, this assumption is hitherto untested. Purpose To test the assumption that a breakpoint exists—which we term a morbidity tipping point—separating a period of relative health from a subsequent deterioration in health status. An analogous tipping point for healthcare costs was also investigated. Methods Four years of adults’ (N = 55,550) morbidity and costs data were retrospectively analyzed. Data were collected in Pittsburgh, PA between 2006 and 2009; analyses were performed in Rochester, NY and Ann Arbor, MI in 2012 and 2013. Cohort-sequential and hockey stick regression models were used to characterize long-term trajectories and tipping points, respectively, for both morbidity and costs. Results Morbidity increased exponentially with age (P<.001). A morbidity tipping point was observed at age 45.5 (95% CI, 41.3-49.7). An exponential trajectory was also observed for costs (P<.001), with a costs tipping point occurring at age 39.5 (95% CI, 32.4-46.6). Following their respective tipping points, both morbidity and costs increased substantially (Ps<.001). Conclusions Findings support the existence of a morbidity tipping point, confirming an important but untested assumption. This tipping point, however, may occur earlier in the lifespan than is widely assumed. An “avalanche of morbidity” occurred after the morbidity tipping point—an ever increasing rate of morbidity progression. For costs, an analogous tipping point and “avalanche” were observed. The time point at which costs began to increase substantially occurred approximately 6 years before health status began to deteriorate. PMID:25962130

  17. Extrapolation of pre-screening trends: Impact of assumptions on overdiagnosis estimates by mammographic screening.

    Science.gov (United States)

    Ripping, T M; Verbeek, A L M; Ten Haaf, K; van Ravesteyn, N T; Broeders, M J M

    2016-06-01

    Overdiagnosis by mammographic screening is defined as the excess in breast cancer incidence in the presence of screening compared to the incidence in the absence of screening. The latter is often estimated by extrapolating the pre-screening incidence trend. The aim of this theoretical study is to investigate the impact of assumptions made in extrapolating the pre-screening incidence trend of invasive breast cancer on the estimated percentage of overdiagnosis. We extracted data on invasive breast cancer incidence and person-years by calendar year (1975-2009) and 5-year age groups (0-85 years) from Dutch databases. Different combinations of assumptions for extrapolating the pre-screening period were investigated, such as variations in the type of regression model, the end of the pre-screening period, the screened age range, the post-screening age range, and adjustment for a trend in women. Overdiagnosis, i.e. the excess cancer incidence in the presence of screening as a proportion of the number of screen-detected and interval cancers, was then estimated. Most overdiagnosis percentages are overestimated because of inadequate adjustment for lead time. The overdiagnosis estimates range between -7.1% and 65.1%, with a median of 33.6%. The choice of pre-screening period has the largest influence on the estimated percentage of overdiagnosis: the median estimate is 17.1% for extrapolations using 1975-1986 as the pre-screening period and 44.7% for extrapolations using 1975-1988 as the pre-screening period. The results of this theoretical study most likely cover the true overdiagnosis estimate, which is unknown, and may not necessarily represent the median overdiagnosis estimate. This study shows that overdiagnosis estimates depend heavily on the assumptions made in extrapolating the incidence in the pre-screening period, especially on the choice of the pre-screening period. These limitations should be acknowledged when adopting this approach to estimate overdiagnosis. Copyright © 2016 Elsevier Ltd. All rights reserved.
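The extrapolation approach under study can be sketched as a log-linear fit to pre-screening incidence projected into the screening period, with the excess over the projection taken as overdiagnosis. All numbers below are invented for illustration; they are not the Dutch data:

```python
# Hedged sketch of pre-screening trend extrapolation. A log-linear (i.e.
# constant-percentage-growth) model is fitted by OLS on log incidence and
# projected forward; the observed-minus-expected excess is then reported.
import math

def loglinear_extrapolate(years, incidence, target_years):
    """OLS fit of log(incidence) on year, then back-transform projections."""
    logs = [math.log(i) for i in incidence]
    n = len(years)
    mx, my = sum(years) / n, sum(logs) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(years, logs))
         / sum((x - mx) ** 2 for x in years))
    a = my - b * mx
    return [math.exp(a + b * t) for t in target_years]

# invented data: 1% annual drift over the 1975-1986 pre-screening window
pre_years = list(range(1975, 1987))
pre_incidence = [100.0 * 1.01 ** (y - 1975) for y in pre_years]
observed_2005 = 160.0                     # invented incidence with screening

expected_2005 = loglinear_extrapolate(pre_years, pre_incidence, [2005])[0]
excess_pct = 100.0 * (observed_2005 - expected_2005) / expected_2005
print(f"expected {expected_2005:.1f}, excess {excess_pct:.1f}%")
```

Shifting the pre-screening window changes the fitted slope and hence the projection, which is exactly why the paper finds the choice of pre-screening period to dominate the overdiagnosis estimate.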

  18. Adapting forest science, practice, and policy to shifting ground: From steady-state assumptions to dynamic change

    Science.gov (United States)

    Daniel B. Botkin

    2014-01-01

    What forestry needs in the Anthropogenic Era is what has been needed for the past 30 years. The proper methods, theory, and goals have been clear and are available; the failure has been, and continues to be, that our laws, policies, and actions are misdirected because we confuse a truly scientific base with nonscientific beliefs. The result is a confusion of folklore...

  19. Bases, Assumptions, and Results of the Flowsheet Calculations for the Decision Phase Salt Disposition Alternatives

    Energy Technology Data Exchange (ETDEWEB)

    Dimenna, R.A.; Jacobs, R.A.; Taylor, G.A.; Durate, O.E.; Paul, P.K.; Elder, H.H.; Pike, J.A.; Fowler, J.R.; Rutland, P.L.; Gregory, M.V.; Smith III, F.G.; Hang, T.; Subosits, S.G.; Campbell, S.G.

    2001-03-26

    The High Level Waste (HLW) Salt Disposition Systems Engineering Team was formed on March 13, 1998, and chartered to identify options, evaluate alternatives, and recommend a selected alternative(s) for processing HLW salt to a permitted wasteform. This requirement arises because the existing In-Tank Precipitation process at the Savannah River Site, as currently configured, cannot simultaneously meet the HLW production and Authorization Basis safety requirements. This engineering study was performed in four phases. This document provides the technical bases, assumptions, and results of this engineering study.

  20. Testing the assumption of linear dependence between the rolling friction torque and normal force

    Directory of Open Access Journals (Sweden)

    Alaci Stelian

    2017-01-01

    Full Text Available Rolling friction is present in all nonconforming bodies in contact. A permanent topic is the characterization of the moment of rolling friction. A number of authors accept the hypothesis of linear dependency between the rolling torque and the normal force while other researchers disagree with this assumption. The present paper proposes a method for testing the hypothesis of linear relationship between rolling moment and normal pressing force. A doubly supported cycloidal pendulum is used in two situations: symmetrically and asymmetrically supported, respectively. Under the hypothesis of a linear relationship, the motions of the pendulum should be identical.

  1. Diversion assumptions for high-powered research reactors. ISPO C-50 Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Binford, F.T.

    1984-01-01

    This study deals with diversion assumptions for high-powered research reactors -- specifically, MTR fuel; pool- or tank-type research reactors with light-water moderator; and water, beryllium, or graphite reflectors, and which have a power level of 25 MW(t) or more. The objective is to provide assistance to the IAEA in documentation of criteria and inspection observables related to undeclared plutonium production in the reactors described above, including: criteria for undeclared plutonium production, necessary design information for implementation of these criteria, verification guidelines including neutron physics and heat transfer, and safeguards measures to facilitate the detection of undeclared plutonium production at large research reactors.

  2. Untested assumptions: psychological research and credibility assessment in legal decision-making

    Directory of Open Access Journals (Sweden)

    Jane Herlihy

    2015-05-01

    Full Text Available Background: Trauma survivors often have to negotiate legal systems such as refugee status determination or the criminal justice system. Methods & results: We outline and discuss the contribution which research on trauma and related psychological processes can make to two particular areas of law where complex and difficult legal decisions must be made: in claims for refugee and humanitarian protection, and in reporting and prosecuting sexual assault in the criminal justice system. Conclusion: There is a breadth of psychological knowledge that, if correctly applied, would limit the inappropriate reliance on assumptions and myth in legal decision-making in these settings. Specific recommendations are made for further study.

  3. What is a god? Metatheistic assumptions in Old Testament Yahwism(s

    Directory of Open Access Journals (Sweden)

    J W Gericke

    2006-09-01

    Full Text Available In this article, the author provides a prolegomena to further research attempting to answer a most fundamental and basic question, much more so than has thus far been the case in the disciplines of Old Testament theology and history of Israelite religion. It concerns the implicit assumptions in the Hebrew Bible's discourse about the fundamental nature of deity. In other words, the question is not, "What is YHWH like?" but rather, "What, according to the Old Testament texts, is a god?"

  4. The reality behind the assumptions: Modelling and simulation support for the SAAF

    CSIR Research Space (South Africa)

    Naidoo, K

    2015-10-01

    Full Text Available The reality behind the assumptions... for the Defence Force. A growing percentage of relevant defence technologies are developed in the commercial domain, resulting in defence forces becoming increasingly reliant on the use of commercial technologies. An agile SANDF will need to exploit...

  6. Testing the Assumptions of Sequential Bifurcation for Factor Screening (revision of CentER DP 2015-034)

    NARCIS (Netherlands)

    Shi, Wen; Kleijnen, J.P.C.

    2017-01-01

    Sequential bifurcation (or SB) is an efficient and effective factor-screening method; i.e., SB quickly identifies the important factors (inputs) in experiments with simulation models that have very many factors—provided the SB assumptions are valid. The specific SB assumptions are: (i) a second-order...
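Under SB's key assumptions (a low-order model with known, non-negative effect signs, so that effects within a group cannot cancel), the bisection logic can be sketched in a few lines. This is a toy illustration of the idea, not the authors' implementation:

```python
# Toy sequential bifurcation on a first-order model y = sum(beta_i * x_i):
# the group effect of factors i..j is obtained by toggling them together;
# zero-effect groups are discarded, non-zero groups are bisected.
def sb_screen(betas):
    """Return indices of the important factors of a linear model."""
    def group_effect(i, j):            # effect of toggling factors i..j high
        return sum(betas[i:j + 1])
    important, stack = [], [(0, len(betas) - 1)]
    while stack:
        i, j = stack.pop()
        if group_effect(i, j) == 0:    # assumption: non-negative, no cancelling
            continue
        if i == j:
            important.append(i)
        else:
            m = (i + j) // 2
            stack.extend([(i, m), (m + 1, j)])
    return sorted(important)

betas = [0, 0, 5, 0, 0, 0, 2, 0]       # factors 2 and 6 matter
print(sb_screen(betas))                # → [2, 6]
```

With few important factors, whole groups are eliminated in one evaluation each, which is the source of SB's efficiency; violating the sign assumption can make effects cancel within a group and factors go undetected, which is what the cited paper's assumption tests probe.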

  7. Assessing the skill of hydrology models at simulating the water cycle in the HJ Andrews LTER: Assumptions, strengths and weaknesses

    Science.gov (United States)

    Simulated impacts of climate on hydrology can vary greatly as a function of the scale of the input data, model assumptions, and model structure. Four models are commonly used to simulate streamflow...

  8. 75 FR 49407 - Benefits Payable in Terminated Single-Employer Plans; Interest Assumptions for Valuing and Paying...

    Science.gov (United States)

    2010-08-13

    ... and paying plan benefits of terminating single-employer plans covered by title IV of the Employee... regulation on Allocation of Assets in Single- Employer Plans (29 CFR part 4044). Assumptions under the asset allocation regulation are updated quarterly; assumptions under the benefit payments regulation are updated...

  9. The Assumption of Proportional Components when Candecomp Is Applied to Symmetric Matrices in the Context of INDSCAL

    Science.gov (United States)

    Dosse, Mohammed Bennani; Berge, Jos M. F.

    2008-01-01

    The use of Candecomp to fit scalar products in the context of INDSCAL is based on the assumption that the symmetry of the data matrices involved causes the component matrices to be equal when Candecomp converges. Ten Berge and Kiers gave examples where this assumption is violated for Gramian data matrices. These examples are believed to be local…

  10. Bias in regression coefficient estimates when assumptions for handling missing data are violated: a simulation study

    Directory of Open Access Journals (Sweden)

    Sander MJ van Kuijk

    2016-03-01

    Full Text Available Background: The purpose of this simulation study is to assess the performance of multiple imputation compared to complete case analysis when assumptions about missing data mechanisms are violated. Methods: The authors performed a stochastic simulation study to assess the performance of Complete Case (CC) analysis and Multiple Imputation (MI) under different missing data mechanisms: missing completely at random (MCAR), missing at random (MAR), and missing not at random (MNAR). The study focused on the point estimation of regression coefficients and standard errors. Results: When data were MAR conditional on Y, CC analysis resulted in biased regression coefficients; they were all underestimated in our scenarios. In these scenarios, analysis after MI gave correct estimates. Yet, in the case of MNAR, MI yielded biased regression coefficients, while CC analysis performed well. Conclusion: The authors demonstrated that MI was only superior to CC analysis in the case of MCAR or MAR. In some scenarios CC may be superior to MI. Often it is not feasible to identify the reason why data in a given dataset are missing. Therefore, emphasis should be put on reporting the extent of missing values, the method used to address them, and the assumptions that were made about the mechanism that caused the missing data.
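The core finding, that complete-case analysis is biased when missingness depends on the outcome, can be reproduced in a few lines. This sketch (our illustration, not the authors' simulation) deletes the covariate whenever the outcome is large, a MAR-on-Y mechanism, and compares slope estimates:

```python
# Demonstration of complete-case bias under MAR-on-Y missingness:
# regress y on x with full data, then after deleting x wherever y > 0,
# and compare the two slope estimates (true slope = 2).
import random

random.seed(7)

def ols_slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

x = [random.gauss(0, 1) for _ in range(20000)]
y = [2.0 * xi + random.gauss(0, 1) for xi in x]

full_slope = ols_slope(x, y)

# MAR conditional on Y: x is recorded only when y is negative
kept = [(xi, yi) for xi, yi in zip(x, y) if yi < 0]
cc_slope = ols_slope([p[0] for p in kept], [p[1] for p in kept])

print(f"full-data slope {full_slope:.2f}, complete-case slope {cc_slope:.2f}")
```

Selecting on the outcome truncates the y distribution and attenuates the complete-case slope well below the true value of 2, matching the underestimation the study reports for CC under MAR conditional on Y.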

  11. Questioning the foundations of physics which of our fundamental assumptions are wrong?

    CERN Document Server

    Foster, Brendan; Merali, Zeeya

    2015-01-01

    The essays in this book look at ways in which the foundations of physics might need to be changed in order to make progress towards a unified theory. They are based on the prize-winning essays submitted to the FQXi essay competition “Which of Our Basic Physical Assumptions Are Wrong?”, which drew over 270 entries. As Nobel Laureate physicist Philip W. Anderson realized, the key to understanding nature’s reality is not anything “magical”, but the right attitude, “the focus on asking the right questions, the willingness to try (and to discard) unconventional answers, the sensitive ear for phoniness, self-deception, bombast, and conventional but unproven assumptions.” The authors of the eighteen prize-winning essays have, where necessary, adapted their essays for the present volume so as to (a) incorporate the community feedback generated in the online discussion of the essays, (b) add new material that has come to light since their completion and (c) ensure accessibility to a broad audience of readers.

  12. On the "well-mixed" assumption and numerical 2-D tracing of atmospheric moisture

    Directory of Open Access Journals (Sweden)

    H. F. Goessling

    2013-06-01

    Full Text Available Atmospheric water vapour tracers (WVTs) are an elegant tool to determine source–sink relations of moisture "online" in atmospheric general circulation models (AGCMs). However, it is sometimes desirable to establish such relations "offline" based on already existing atmospheric data (e.g. reanalysis data). One simple and frequently applied offline method is 2-D moisture tracing. It makes use of the "well-mixed" assumption, which allows for treating the vertical dimension integratively. Here we scrutinise the "well-mixed" assumption and 2-D moisture tracing by means of analytical considerations in combination with AGCM-WVT simulations. We find that vertically well-mixed conditions are seldom met. Due to the presence of vertical inhomogeneities, 2-D moisture tracing (i) neglects a significant degree of fast recycling, and (ii) results in erroneous advection where the direction of the horizontal winds varies vertically. The latter is not so much the case in the extratropics, but in the tropics this can lead to large errors. For example, computed by 2-D moisture tracing, the fraction of precipitation in the western Sahel that originates from beyond the Sahara is ~40%, whereas the fraction that originates from the tropical and Southern Atlantic is only ~4%. According to full (i.e. 3-D) moisture tracing, however, both regions contribute roughly equally, showing that the errors introduced by the 2-D approximation can be substantial.

  13. Through the lens of hetero-normative assumptions: re-thinking attitudes towards gay parenting.

    Science.gov (United States)

    Pennington, Jarred; Knight, Tess

    2011-01-01

    In this study we explored the attitudes and beliefs of nine heterosexual adults towards gay male and female couples parenting children. We conceptualised participants' perceptions as one primary lens through which gay parenting is viewed. Based on the narratives provided, this lens comprised hetero-normative, homophobic or heterosexist assumptions and coloured the way in which participants perceived aspects of the concept of gay couples parenting children. At times, participants attempted to adjust their primary lens and adopt different views that initially suggested ambivalence and sometimes contradictory positions. Despite the range of attitudes and assumptions about same-sex parenting, consensus over the potential negative developmental impact on children raised by same-sex parents remained evident. Evidence suggests that same-sex parenting is already a reality in Westernised nations and has little or no bearing on the sexual orientation of children. However, concern that children be brought up with every opportunity to 'become' heterosexual, whether they are the product of same-sex or opposite-sex parents, remains evident.

  14. Assessing women's sexuality after cancer therapy: checking assumptions with the focus group technique.

    Science.gov (United States)

    Bruner, D W; Boyd, C P

    1999-12-01

    Cancer and cancer therapies impair sexual health in a multitude of ways. The promotion of sexual health is therefore vital for preserving quality of life and is an integral part of total or holistic cancer management. To provide holistic care, nursing requires research that is meaningful to patients as well as to the profession, so that educational and interventional studies to promote sexual health and coping can be developed. To obtain meaningful research data, instruments that are reliable, valid, and pertinent to patients' needs are required. Several sexual functioning instruments were reviewed for this study and found to be lacking in either a conceptual foundation or psychometric validation. Without a defined conceptual framework, the authors of these instruments must have made certain assumptions regarding what women undergoing cancer therapy experience and what they perceive as important. To check these assumptions before assessing women's sexuality after cancer therapies in a larger study, a pilot study was designed to compare, using the focus group technique, what women experience and perceive as important regarding their sexuality with what is assessed in several currently available research instruments. Based on the focus group findings, current sexual functioning questionnaires may be lacking in pertinent areas of concern for women treated for breast or gynecologic malignancies. Better conceptual foundations may help future questionnaire design. Self-regulation theory may provide an acceptable conceptual framework from which to develop a sexual functioning questionnaire.

  15. The effects of behavioral and structural assumptions in artificial stock market

    Science.gov (United States)

    Liu, Xinghua; Gregor, Shirley; Yang, Jianmei

    2008-04-01

    Recent literature has developed the conjecture that important statistical features of stock price series, such as the fat-tails phenomenon, may depend mainly on the market microstructure. This conjecture motivated us to investigate the roles of both the market microstructure and agent behavior with respect to high-frequency returns and daily returns. We developed two simple models to investigate this issue. The first is a stochastic model with a clearing house microstructure and a population of zero-intelligence agents. The second has more behavioral assumptions based on the Minority Game and also has a clearing house microstructure. With the first model we found that a characteristic of the clearing house microstructure, namely the clearing frequency, can explain the fat-tail, excess volatility and autocorrelation phenomena of high-frequency returns. However, this feature does not cause the same phenomena in daily returns, so the stylized facts of daily returns depend mainly on the agents' behavior. With the second model we investigated the effects of behavioral assumptions on daily returns. Our study indicates that the aspects responsible for generating the stylized facts of high-frequency returns and daily returns are different.
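    A clearing-house microstructure populated by zero-intelligence agents, the paper's first ingredient, can be sketched in a few lines. The price-impact rule and all parameters below are illustrative assumptions, not the authors' specification:

    ```python
    import numpy as np

    # Illustrative sketch only (not the paper's model): zero-intelligence agents
    # submit random buy/sell orders, and a clearing house clears the accumulated
    # order imbalance every `clearing_freq` ticks, moving the log-price by the
    # normalized net demand.
    def simulate_clearing_house(n_ticks=20000, n_agents=100, clearing_freq=10, seed=0):
        rng = np.random.default_rng(seed)
        log_price, imbalance, prices = 0.0, 0, []
        for t in range(n_ticks):
            # each agent buys (+1), sells (-1) or abstains (0) at random
            imbalance += rng.choice([-1, 0, 1], size=n_agents).sum()
            if (t + 1) % clearing_freq == 0:
                log_price += imbalance / (n_agents * clearing_freq)
                prices.append(log_price)
                imbalance = 0
        return np.array(prices)

    prices = simulate_clearing_house()
    returns = np.diff(prices)
    print(len(prices), round(returns.std(), 4))
    ```

    Sweeping `clearing_freq` and comparing the distributions of the resulting clearing-period returns is the kind of experiment the abstract describes.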

  16. Retrieval of Polar Stratospheric Cloud Microphysical Properties from Lidar Measurements: Dependence on Particle Shape Assumptions

    Science.gov (United States)

    Reichardt, J.; Reichardt, S.; Yang, P.; McGee, T. J.; Bhartia, P. K. (Technical Monitor)

    2001-01-01

    A retrieval algorithm has been developed for the microphysical analysis of polar stratospheric cloud (PSC) optical data obtained using lidar instrumentation. The parameterization scheme of the PSC microphysical properties allows for coexistence of up to three different particle types with size-dependent shapes. The finite difference time domain (FDTD) method has been used to calculate optical properties of particles with maximum dimensions equal to or less than 2 μm and with shapes that can be considered more representative of PSCs on the scale of individual crystals than the commonly assumed spheroids; specifically, these are irregular and hexagonal crystals. Selection of the optical parameters that are input to the inversion algorithm is based on a potential data set such as that gathered by two of the lidars on board the NASA DC-8 during the Stratospheric Aerosol and Gas Experiment (SAGE) III Ozone Loss and Validation Experiment (SOLVE) campaign in winter 1999/2000: the Airborne Raman Ozone and Temperature Lidar (AROTEL) and the NASA Langley Differential Absorption Lidar (DIAL). The microphysical retrieval algorithm has been applied to study how particle shape assumptions affect the inversion of lidar data measured in lee-wave PSCs. The model simulations show that under the assumption of spheroidal particle shapes, PSC surface and volume density are systematically smaller than the FDTD-based values by approximately 10-30% and approximately 5-23%, respectively.

  17. The ozone depletion potentials of halocarbons: Their dependence on calculation assumptions

    Science.gov (United States)

    Karol, Igor L.; Kiselev, Andrey A.

    1994-01-01

    The concept of Ozone Depletion Potential (ODP) is widely used in the evaluation of numerous halocarbons and of their replacements' effects on ozone, but the methods, assumptions and conditions used in ODP calculations have not been analyzed adequately. In this paper a model study of the effects on ozone of instantaneous releases of various amounts of CH3CCl3 and of CHF2Cl (HCFC-22) for several compositions of the background atmosphere is presented, aimed at understanding the connection of ODP values with the assumptions used in their calculations. To facilitate the ODP computation in numerous versions over the long time periods after release, these rather short-lived gases and a one-dimensional radiative-photochemical model of the global annually averaged atmospheric layer up to 50 km height are used. Varying the released gas global mass from 1 Mt to 1 Gt leads to an increase in ODP value, which stabilizes close to the upper bound of this range in the contemporary atmosphere. The same variations are analyzed for conditions of the CFC-free atmosphere of the 1960s and for the anthropogenically loaded atmosphere of the 21st century according to the known IPCC 'business as usual' scenario. Recommendations for proper ways of calculating ODP are proposed for practically important cases.

  18. Temporal Distinctiveness in Task Switching: Assessing the Mixture-Distribution Assumption

    Directory of Open Access Journals (Sweden)

    James A Grange

    2016-02-01

    Full Text Available In task switching, increasing the response-cue interval (RCI) has been shown to reduce the switch cost. This has been attributed to a time-based decay process influencing the activation of memory representations of tasks (task-sets). Recently, an alternative account based on interference rather than decay has been successfully applied to these data (Horoufchin et al., 2011). In this account, variation of the RCI is thought to influence the temporal distinctiveness (TD) of episodic traces in memory, thus affecting their retrieval probability. This can affect performance because retrieval probability influences response time: if retrieval succeeds, responding is fast due to positive priming; if retrieval fails, responding is slow, due to having to perform the task via a slow algorithmic process. This account, and a recent formal model (Grange & Cross, 2015), makes the strong prediction that all RTs are a mixture of two processes: a fast process when retrieval succeeds, and a slow process when retrieval fails. The present paper assesses the evidence for this mixture-distribution assumption in TD data. In a first section, statistical evidence for mixture distributions is found using the fixed-point property test. In a second section, a mathematical process model with mixture distributions at its core is fitted to the response time distribution data. Both approaches provide good evidence in support of the mixture-distribution assumption, and thus support temporal distinctiveness accounts of the data.
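    The mixture-distribution assumption is easy to state computationally: every RT is drawn either from a fast retrieval-success process or from a slow algorithmic process. A minimal sketch (the component means, SDs, and retrieval probabilities are hypothetical, not fitted values from the model):

    ```python
    import numpy as np

    # Hypothetical two-process RT model: retrieval succeeds with probability
    # p_retrieve (fast primed response), otherwise a slow algorithmic process
    # generates the response. All parameters are illustrative.
    def mixture_rts(n, p_retrieve, rng):
        fast = rng.normal(500.0, 50.0, n)   # ms, retrieval succeeds
        slow = rng.normal(800.0, 80.0, n)   # ms, retrieval fails
        success = rng.random(n) < p_retrieve
        return np.where(success, fast, slow)

    rng = np.random.default_rng(42)
    short_rci = mixture_rts(10000, 0.8, rng)   # high retrieval probability
    long_rci = mixture_rts(10000, 0.3, rng)    # low retrieval probability
    print(short_rci.mean() < long_rci.mean())
    ```

    Shifting only the mixing probability moves the mean RT, which is how a retrieval-probability account can mimic a decay effect without any decay parameter.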

  19. Baseline Projections for Latin America: Base-Year Assumptions, Key Drivers and Greenhouse Emissions

    Energy Technology Data Exchange (ETDEWEB)

    van Ruijven, Bas; Daenzer, Katie; Fisher-Vanden, Karen; Kober, Tom; Paltsev, S.; Beach, Robert H.; Calderon, Silvia; Calvin, Katherine V.; Labriet, Maryse; Kitous, Alban; Lucena, Andre; Van Vuuren, Detlef

    2016-09-01

    This paper provides an overview of the base-year assumptions and core baseline projections for the set of models participating in the LAMP and CLIMACAP projects. We present the range in core baseline projections for Latin America, and identify key differences between model projections including how these projections compare to historic trends. We find relatively large differences across models in base-year assumptions related to population, GDP, energy and CO2 emissions due to the use of different data sources, but we also conclude that this does not influence the range of projections. We find that population and GDP projections across models span a broad range, comparable to the range represented by the set of Shared Socioeconomic Pathways (SSPs). Kaya-factor decomposition indicates that the set of core baseline scenarios mirrors trends experienced over the past decades. Emissions in Latin America are projected to rise as a result of GDP and population growth and a minor shift in the energy mix toward fossil fuels. Most scenarios assume a somewhat higher GDP growth than historically observed and a continued decline of population growth. Minor changes in energy intensity or energy mix are projected over the next few decades.
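    The Kaya-factor decomposition mentioned above factors emissions into population, affluence, energy intensity and carbon intensity. A one-line sketch (the input values are purely illustrative, not the models' projections):

    ```python
    # Kaya identity: CO2 = P * (GDP/P) * (E/GDP) * (CO2/E)
    def kaya_emissions(population, gdp_per_capita, energy_intensity, carbon_intensity):
        return population * gdp_per_capita * energy_intensity * carbon_intensity

    # Illustrative numbers only: 600 M people, 15 k$/person, 5 MJ/$,
    # 60 tCO2/TJ (= 60e-6 tCO2/MJ)  ->  emissions in tCO2
    emissions = kaya_emissions(600e6, 15e3, 5.0, 60e-6)
    print(f"{emissions / 1e9:.2f} GtCO2")   # -> 2.70 GtCO2
    ```

    Comparing scenarios factor by factor in this way is what lets the paper attribute projected emission growth mainly to the GDP and population terms rather than to the intensity terms.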

  20. A computer code for forward calculation and inversion of the H/V spectral ratio under the diffuse field assumption

    Science.gov (United States)

    García-Jerez, Antonio; Piña-Flores, José; Sánchez-Sesma, Francisco J.; Luzón, Francisco; Perton, Mathieu

    2016-12-01

    During a quarter of a century, the main characteristics of the horizontal-to-vertical spectral ratio of ambient noise (HVSRN) have been extensively used for site effect assessment. In spite of the uncertainties about the optimum theoretical model to describe these observations, over the last decade several schemes for inversion of the full HVSRN curve for near-surface surveying have been developed. In this work, a computer code for forward calculation of H/V spectra based on the diffuse field assumption (DFA) is presented and tested. It takes advantage of the recently stated connection between the HVSRN and the elastodynamic Green's function which arises from ambient noise interferometry theory. The algorithm allows for (1) a natural calculation of the imaginary parts of the Green's functions by using suitable contour integrals in the complex wavenumber plane, and (2) separate calculation of the contributions of Rayleigh, Love, P-SV and SH waves as well. The stability of the algorithm at high frequencies is preserved by means of an adaptation of Wang's orthonormalization method to the calculation of dispersion curves, surface-wave medium responses and contributions of body waves. This code has been combined with a variety of inversion methods to make up a powerful tool for passive seismic surveying.
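    For readers unfamiliar with the observable itself, an HVSRN curve is conventionally estimated as the ratio of averaged horizontal to vertical amplitude spectra of three-component noise records. The sketch below shows only that conventional estimate; it is not the DFA forward code described in the paper, and the windowing choices are assumptions:

    ```python
    import numpy as np

    def hvsr(north, east, vert, fs, nperseg=1024):
        # average amplitude spectra over non-overlapping windows
        def mean_amp(x):
            n_win = len(x) // nperseg
            segs = x[: n_win * nperseg].reshape(n_win, nperseg)
            return np.abs(np.fft.rfft(segs, axis=1)).mean(axis=0)

        freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
        # quadratic mean of the two horizontal spectra over the vertical one
        h = np.sqrt((mean_amp(north) ** 2 + mean_amp(east) ** 2) / 2.0)
        return freqs, h / mean_amp(vert)

    # sanity check: horizontals exactly twice the vertical -> ratio of 2
    rng = np.random.default_rng(0)
    v = rng.normal(size=8192)
    freqs, ratio = hvsr(2 * v, 2 * v, v, fs=100.0)
    print(np.allclose(ratio, 2.0))
    ```

    A DFA-based forward code predicts this same curve from a layered model instead of estimating it from data, which is what makes full-curve inversion possible.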

  1. A computer code for forward calculation and inversion of the H/V spectral ratio under the diffuse field assumption

    CERN Document Server

    García-Jerez, Antonio; Sánchez-Sesma, Francisco J; Luzón, Francisco; Perton, Mathieu

    2016-01-01

    During a quarter of a century, the main characteristics of the horizontal-to-vertical spectral ratio of ambient noise (HVSRN) have been extensively used for site effect assessment. In spite of the uncertainties about the optimum theoretical model to describe these observations, several schemes for inversion of the full HVSRN curve for near-surface surveying have been developed over the last decade. In this work, a computer code for forward calculation of H/V spectra based on the diffuse field assumption (DFA) is presented and tested. It takes advantage of the recently stated connection between the HVSRN and the elastodynamic Green's function which arises from ambient noise interferometry theory. The algorithm allows for (1) a natural calculation of the imaginary parts of the Green's functions by using suitable contour integrals in the complex wavenumber plane, and (2) separate calculation of the contributions of Rayleigh, Love, P-SV and SH waves as well. The stability of the algorithm at high frequencies is preserv...

  2. Tale of Two Courthouses: A Critique of the Underlying Assumptions in Chronic Disease Self-Management for Aboriginal People

    Directory of Open Access Journals (Sweden)

    Isabelle Ellis

    2009-12-01

    Full Text Available This article reviews the assumptions that underpin the commonly implemented Chronic Disease Self-Management models: namely, that there is a clear set of instructions for patients to comply with, that all health care providers agree with those instructions, and that the health care provider and the patient agree with the chronic disease self-management plan that was developed as part of a consultation. These assumptions are evaluated for their validity in the remote health care context, particularly for Aboriginal people. They have been found to lack validity in this context; therefore an alternative model to enhance chronic disease care is proposed.

  3. Recursive Subspace Identification of AUV Dynamic Model under General Noise Assumption

    Directory of Open Access Journals (Sweden)

    Zheping Yan

    2014-01-01

    Full Text Available A recursive subspace identification algorithm for autonomous underwater vehicles (AUVs) is proposed in this paper. Owing to its advantages in handling nonlinearities and couplings, the AUV model investigated here is for the first time constructed as a Hammerstein model with nonlinear feedback in the linear part. To better take the environment and sensor noises into consideration, the identification problem is formulated as an errors-in-variables (EIV) problem, which means that the identification procedure operates under a general noise assumption. To make the algorithm recursive, a propagator method (PM) based subspace approach is extended into the EIV framework to form the recursive identification method called the PM-EIV algorithm. With several identification experiments carried out on the AUV simulation platform, the proposed algorithm demonstrates its effectiveness and feasibility.
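    A Hammerstein structure, as used here for the AUV, chains a static input nonlinearity into linear dynamics. A minimal sketch (the cubic nonlinearity and first-order dynamics are illustrative assumptions, not the identified AUV model, and the paper's nonlinear feedback path is omitted):

    ```python
    import numpy as np

    # Minimal Hammerstein sketch: static nonlinearity f(u) followed by
    # first-order linear dynamics y[t] = a*y[t-1] + b*f(u[t]).
    # The specific f, a, b are assumptions for illustration only.
    def hammerstein_step(y_prev, u, a=0.9, b=0.1):
        w = u + 0.5 * u ** 3        # static input nonlinearity (e.g. thruster/drag)
        return a * y_prev + b * w   # first-order linear dynamics

    y, ys = 0.0, []
    for u in np.sin(0.1 * np.arange(200)):   # sinusoidal test input
        y = hammerstein_step(y, u)
        ys.append(y)
    print(round(max(ys), 3))
    ```

    Identification then amounts to recovering both the nonlinearity and the linear dynamics from noisy input/output records, which is what the PM-EIV algorithm does recursively.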

  4. On the synergy of nuclear data for fusion and model assumptions

    Science.gov (United States)

    Avrigeanu, Vlad; Avrigeanu, Marilena

    2017-09-01

    A deuteron breakup (BU) parametrization is employed within the BU analysis of a recently measured reaction-in-flight (RIF) neutron time-of-flight spectrum, while open questions underlined previously for the related fast-neutron induced reactions on Zr isotopes are also addressed in a consistent way, together with the use of a recent optical potential for α-particles to understand the large discrepancy between the measured and calculated cross sections of the 94Zr(n,α)91Sr reaction. Thus the synergy between the above-mentioned three distinct subjects may finally lead to smaller uncertainties of the nuclear data for fusion, while the RIF neutron spectra may also be used to support nuclear model assumptions.

  5. Ecological risk of anthropogenic pollutants to reptiles: Evaluating assumptions of sensitivity and exposure.

    Science.gov (United States)

    Weir, Scott M; Suski, Jamie G; Salice, Christopher J

    2010-12-01

    A large data gap for reptile ecotoxicology still persists; therefore, ecological risk assessments of reptiles usually incorporate the use of surrogate species. This necessitates that (1) the surrogate is at least as sensitive as the target taxon and/or (2) exposures to the surrogate are greater than that of the target taxon. We evaluated these assumptions for the use of birds as surrogates for reptiles. Based on a survey of the literature, birds were more sensitive than reptiles in less than 1/4 of the chemicals investigated. Dietary and dermal exposure modeling indicated that exposure to reptiles was relatively high, particularly when the dermal route was considered. We conclude that caution is warranted in the use of avian receptors as surrogates for reptiles in ecological risk assessment and emphasize the need to better understand the magnitude and mechanism of contaminant exposure in reptiles to improve exposure and risk estimation. Copyright © 2010 Elsevier Ltd. All rights reserved.

  6. Experimental evaluation of the pure configurational stress assumption in the flow dynamics of entangled polymer melts

    DEFF Research Database (Denmark)

    Rasmussen, Henrik K.; Bejenariu, Anca Gabriela; Hassager, Ole

    2010-01-01

    A filament stretching rheometer was used for measuring the startup of uni-axial elongational flow followed by reversed bi-axial flow, both with a constant elongational strain rate. A narrow molecular mass distribution linear polyisoprene with a molecular weight of 483 kg/mole was subjected to the flow in the non-linear flow regime. This has allowed highly elastic measurements within the limit of pure orientational stress, as the time of the flow was considerably smaller than the Rouse time. A Doi-Edwards [J. Chem. Soc., Faraday Trans. 2 74, 1818-1832 (1978)] type of constitutive model with the assumption of pure configurational stress was accurately able to predict the startup as well as the reversed flow behavior. This confirms that this commonly used theoretical picture for the flow of polymeric liquids is a correct physical principle to apply. © 2010 The Society of Rheology. [DOI: 10.1122/1.3496378]

  7. Impact of velocity distribution assumption on simplified laser speckle imaging equation.

    Science.gov (United States)

    Ramirez-San-Juan, Julio C; Ramos-García, Ruben; Guizar-Iturbide, Ileana; Martínez-Niconoff, Gabriel; Choi, Bernard

    2008-03-03

    Since blood flow is tightly coupled to the health status of biological tissue, several instruments have been developed to monitor blood flow and perfusion dynamics. One such instrument is laser speckle imaging. The goal of this study was to evaluate the use of two velocity distribution assumptions (Lorentzian- and Gaussian-based) to calculate speckle flow index (SFI) values. When the normalized autocorrelation functions for the Lorentzian and Gaussian velocity distributions satisfy the same definition of correlation time, the two assumptions predict the same velocity range for low speckle contrast (0 < C < 0.6) but different flow velocity ranges for high contrast. Our derived equations form the basis for simplified calculations of SFI values.
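    The simplified calculation the authors refer to typically proceeds from the speckle contrast K = σ/⟨I⟩. A sketch under the common long-exposure approximation K² ≈ τc/(2T); the exact form depends on the assumed velocity distribution, which is precisely the paper's point:

    ```python
    import numpy as np

    def speckle_contrast(intensity):
        # K = sigma / mean over a local window of the raw speckle image
        return intensity.std() / intensity.mean()

    def speckle_flow_index(intensity, exposure_time):
        # Commonly used simplified relation (assumes exposure T much longer
        # than the correlation time tau_c): K^2 ~ tau_c / (2T),
        # so SFI ∝ 1/tau_c = 1 / (2 T K^2).
        k = speckle_contrast(intensity)
        return 1.0 / (2.0 * exposure_time * k ** 2)

    window = np.array([1.0, 3.0, 1.0, 3.0])   # toy window: mean 2, std 1 -> K = 0.5
    print(speckle_flow_index(window, exposure_time=1.0))   # 1/(2*1*0.25) = 2.0
    ```

    Because SFI scales inversely with K², small contrast differences at high flow translate into large SFI differences, which is where the choice of velocity distribution starts to matter.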

  8. Cultural values embodying universal norms: a critique of a popular assumption about cultures and human rights.

    Science.gov (United States)

    Jing-Bao, Nie

    2005-09-01

    In Western and non-Western societies, it is a widely held belief that the concept of human rights is, by and large, a Western cultural norm, often at odds with non-Western cultures and, therefore, not applicable in non-Western societies. The Universal Draft Declaration on Bioethics and Human Rights reflects this deep-rooted and popular assumption. By using Chinese culture(s) as an illustration, this article points out the problems of this widespread misconception and stereotypical view of cultures and human rights. It highlights the often ignored positive elements in Chinese cultures that promote and embody universal human values such as human dignity and human rights. It concludes, accordingly, with concrete suggestions on how to modify the Declaration.

  9. Space transportation nodes assumptions and requirements: Lunar base systems study task 2.1

    Science.gov (United States)

    Kahn, Taher Ali; Simonds, Charles H.; Stump, William R.

    1988-01-01

    The Space Transportation Nodes Assumptions and Requirements task was performed as part of the Advanced Space Transportation Support Contract, a NASA Johnson Space Center (JSC) study intended to provide planning for a Lunar Base near the year 2000. The original task statement has been revised to satisfy the following queries: (1) What vehicles are to be processed at the transportation node; (2) What is the flow of activities involved in a vehicle passing through the node; and (3) What node support resources are necessary to support a lunar scenario traffic model composed of a mix of vehicles in an active flight schedule. The Lunar Base Systems Study is concentrating on the initial years of the Phase 2 Lunar Base Scenario. The study will develop the first five years of that phase in order to define the transportation and surface systems (including mass, volumes, power requirements, and designs).

  10. HARDINESS, WORLD ASSUMPTIONS, MOTIVATION OF ATHLETES OF CONTACT AND NOT CONTACT KINDS OF SPORT

    Directory of Open Access Journals (Sweden)

    Elena Vladimirovna Molchanova

    2017-04-01

    Full Text Available An investigation of the psychological specificity of athletes in a contact sport (freestyle wrestling) and a non-contact sport (archery) was carried out. Pronounced differences in hardiness, world assumptions and motives for doing sport were obtained. In particular, the archery athletes showed higher hardiness and viewed the world more positively than the wrestlers, while scoring lower on such motives for doing sport as "successful for life quality and skills" and "physical perfection". Thus, better coping with permanently stressful conditions is predicted for athletes in non-contact sports. The obtained results are of practical importance for the counseling work of sport psychologists and, moreover, could serve as a basis for training programs and for programs on overcoming challenge stress.

  11. GLRT-Based Spectrum Sensing with Blindly Learned Feature under Rank-1 Assumption

    CERN Document Server

    Zhang, Peng

    2011-01-01

    Prior knowledge can improve the performance of spectrum sensing. Instead of using universal features as prior knowledge, we propose to blindly learn the localized feature at the secondary user. Motivated by pattern recognition in machine learning, we define the signal feature as the leading eigenvector of the signal's sample covariance matrix. A feature learning algorithm (FLA) for blind feature learning and a feature template matching (FTM) algorithm for spectrum sensing are proposed. Furthermore, we implement the FLA and FTM in hardware. Simulations and hardware experiments show that the signal feature can be learned blindly. In addition, by using the signal feature as prior knowledge, the detection performance can be improved by about 2 dB. Motivated by the experimental results, we derive several GLRT-based spectrum sensing algorithms under the rank-1 assumption, considering signal feature, signal power and noise power as the available parameters. The performance of our proposed algorithms is tested on both synthesized rank-1 sig...
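    The feature definition in the abstract, the leading eigenvector of the sample covariance matrix, can be sketched directly; the cosine-similarity matching rule below is an assumption standing in for the paper's FTM details:

    ```python
    import numpy as np

    def learn_feature(snapshots):
        # snapshots: (n_snapshots, n_dims) array of received-signal vectors;
        # feature = leading eigenvector of the sample covariance matrix
        cov = np.cov(snapshots, rowvar=False)
        vals, vecs = np.linalg.eigh(cov)
        return vecs[:, np.argmax(vals)]

    def feature_match(feature, template):
        # |cosine similarity|: close to 1 when the learned feature aligns
        # with the stored template (the sign of an eigenvector is arbitrary)
        return abs(feature @ template) / (np.linalg.norm(feature) * np.linalg.norm(template))

    rng = np.random.default_rng(1)
    v = rng.normal(size=8)
    v /= np.linalg.norm(v)                                  # true rank-1 signature
    snapshots = np.outer(rng.normal(size=500), v)           # rank-1 signal...
    snapshots += 0.1 * rng.normal(size=snapshots.shape)     # ...plus weak noise
    print(feature_match(learn_feature(snapshots), v) > 0.95)
    ```

    Under the rank-1 assumption the signal covariance is an outer product of this signature with itself, which is what makes the leading eigenvector a sufficient feature for the GLRT variants the paper derives.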

  12. The role of relevance and mutual assumption in the language of contract communication

    Directory of Open Access Journals (Sweden)

    Dave Mansergh

    1996-03-01

    Full Text Available Contract communication problems are common within the landscape industry. The contextual frames of reference and assumptions held by both designer and contractor affect the way information is interpreted. In order to interpret a piece of communication correctly, both parties must learn and understand the meanings and implications of the language used. This requires the formation of mutual understanding between them, whereby quality is more likely to be achieved. Relevance theory offers an explanation as to why contract communication problems occur and a guide for achieving successful contract communication. The importance of good communication within the landscape construction industry cannot be over emphasised... On site problems ... usually occur due to communication failure. (Mayer 1987, p.1)

  13. Changing assumption for the design process – New roles of the active end user

    Directory of Open Access Journals (Sweden)

    Monika Hestad

    2009-12-01

    Full Text Available The aim of this article is to discuss how end user involvement in all stages of a product life cycle changes the assumptions of the design process. The article is based on a literature review and three case studies – Imsdal (Ringnes/Carlsberg), Jordan and Stokke. Several examples of how consumers or users are involved in various stages of the product life cycle are presented. Product development is affected both by end users' activity and by their previous knowledge of the product. Use of the product changes its meaning, and even disposal of the product affects how the product is perceived. The product becomes part of a cultural and historical context which the end user actively shapes.

  14. Washington International Renewable Energy Conference 2008 Pledges: Methodology and Assumptions Summary

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, B.; Bilello, D. E.; Cowlin, S. C.; Mann, M.; Wise, A.

    2008-08-01

    The 2008 Washington International Renewable Energy Conference (WIREC) was held in Washington, D.C., from March 4-6, 2008, and involved nearly 9,000 people from 125 countries. The event brought together worldwide leaders in renewable energy (RE) from governments, international organizations, nongovernmental organizations, and the private sector to discuss the role that renewables can play in alleviating poverty, growing economies, and passing on a healthy planet to future generations. The conference concluded with more than 140 governments, international organizations, and private-sector representatives pledging to advance the uptake of renewable energy. The U.S. government authorized the National Renewable Energy Laboratory (NREL) to estimate the carbon dioxide (CO2) savings that would result from the pledges made at the 2008 conference. This report describes the methodology and assumptions used by NREL in quantifying the potential CO2 reductions derived from those pledges.

  15. Testing assumptions of statistical learning: is it long-term and implicit?

    Science.gov (United States)

    Kim, Robyn; Seitz, Aaron; Feenstra, Heather; Shams, Ladan

    2009-09-18

    Statistical learning has been studied as a mechanism by which people automatically and implicitly learn patterns in the environment. Here, we sought to examine general assumptions about statistical learning, including whether the learning is long-term, and whether it can occur implicitly. We exposed participants to a stream of stimuli, then tested them immediately after, or 24h after, exposure, with separate tests meant to measure implicit and explicit knowledge. To measure implicit learning, we analyzed reaction times during a rapid serial visual presentation detection task; for explicit learning, we used a matching questionnaire. Subjects' reaction time performance indicated that they did implicitly learn the exposed sequences, and furthermore, this learning was unrelated to explicit learning. These learning effects were observed both immediately after exposure and after a 24-h delay. These experiments offer concrete evidence that statistical learning is long-term and that the learning involves implicit learning mechanisms.

  16. CRITICAL ASSUMPTIONS IN THE F-TANK FARM CLOSURE OPERATIONAL DOCUMENTATION REGARDING WASTE TANK INTERNAL CONFIGURATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Hommel, S.; Fountain, D.

    2012-03-28

    The intent of this document is to provide clarification of critical assumptions regarding the internal configurations of liquid waste tanks at operational closure, with respect to F-Tank Farm (FTF) closure documentation. For the purposes of this document, FTF closure documentation includes: (1) Performance Assessment for the F-Tank Farm at the Savannah River Site (hereafter referred to as the FTF PA) (SRS-REG-2007-00002), (2) Basis for Section 3116 Determination for Closure of F-Tank Farm at the Savannah River Site (DOE/SRS-WD-2012-001), (3) Tier 1 Closure Plan for the F-Area Waste Tank Systems at the Savannah River Site (SRR-CWDA-2010-00147), (4) F-Tank Farm Tanks 18 and 19 DOE Manual 435.1-1 Tier 2 Closure Plan Savannah River Site (SRR-CWDA-2011-00015), (5) Industrial Wastewater Closure Module for the Liquid Waste Tanks 18 and 19 (SRR-CWDA-2010-00003), and (6) Tank 18/Tank 19 Special Analysis for the Performance Assessment for the F-Tank Farm at the Savannah River Site (hereafter referred to as the Tank 18/Tank 19 Special Analysis) (SRR-CWDA-2010-00124). Note that the first three FTF closure documents listed apply to the entire FTF, whereas the last three FTF closure documents listed are specific to Tanks 18 and 19. These two waste tanks are expected to be the first two tanks to be grouted and operationally closed under the current suite of FTF closure documents, and many of the assumptions and approaches that apply to these two tanks are also applicable to the other FTF waste tanks and operational closure processes.

  17. A rigid thorax assumption affects model loading predictions at the upper but not lower lumbar levels.

    Science.gov (United States)

    Ignasiak, Dominika; Ferguson, Stephen J; Arjmand, Navid

    2016-09-06

    A number of musculoskeletal models of the human spine have been used for predictions of lumbar and muscle forces. However, the predictive power of these models might be limited by a commonly made assumption: the thoracic region is represented as a single lumped rigid body. This study hence aims to investigate the impact of this assumption on the predictions of spinal and muscle forces. A validated thoracolumbar spine model was used with a flexible thorax (T1-T12), a completely rigid one, or a rigid thorax with its posture updated at each analysis step. Simulations of isometric forward flexion up to 80°, with and without a 20 kg hand load, were performed, based on previously measured kinematics. Depending on the simulated task, the rigid model predicted slightly or moderately lower compressive loading than the flexible one. The differences were relatively greater at the upper lumbar levels (average underestimation of 14% at T12L1 for flexion tasks and of 18% for flexion tasks with hand load) than at the lower levels (3% and 8% at L5S1 for unloaded and loaded tasks, respectively). The rigid model with updated thoracic posture predicted compressive forces similar to those of the rigid model. Predicted muscle forces were, however, very different between the three models. This study indicates that lumbar spine models with a rigid thorax definition can be used for loading investigations at the lowermost spinal levels. For predictions of upper lumbar spine loading, using models with an articulated thorax is advised. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Problematic assumptions have slowed down depression research: why symptoms, not syndromes are the way forward

    Directory of Open Access Journals (Sweden)

    Eiko I Fried

    2015-03-01

    Full Text Available Major Depression (MD) is a highly heterogeneous diagnostic category. Diverse symptoms such as sad mood, anhedonia, and fatigue are routinely added to an unweighted sum-score, and cutoffs are used to distinguish between depressed participants and healthy controls. Researchers then investigate outcome variables like MD risk factors, biomarkers, and treatment response in such samples. These practices presuppose that (1) depression is a discrete condition, and that (2) symptoms are interchangeable indicators of this latent disorder. Here I review these two assumptions, elucidate their historical roots, show how deeply engrained they are in psychological and psychiatric research, and document that they contrast with evidence. Depression is not a consistent syndrome with clearly demarcated boundaries, and depression symptoms are not interchangeable indicators of an underlying disorder. Current research practices lump individuals with very different problems into one category, which has contributed to the remarkably slow progress in key research domains such as the development of efficacious antidepressants or the identification of biomarkers for depression. The recently proposed network framework offers an alternative to these problematic assumptions. MD is not understood as a distinct condition, but as a heterogeneous symptom cluster that substantially overlaps with other syndromes such as anxiety disorders. MD is not framed as an underlying disease with a number of equivalent indicators, but as a network of symptoms that have direct causal influence on each other: insomnia can cause fatigue which then triggers concentration and psychomotor problems. This approach offers new opportunities for constructing an empirically based classification system and has broad implications for future research.

  19. Limitations to the Dutch cannabis toleration policy: Assumptions underlying the reclassification of cannabis above 15% THC.

    Science.gov (United States)

    Van Laar, Margriet; Van Der Pol, Peggy; Niesink, Raymond

    2016-08-01

    The Netherlands has seen an increase in Δ9-tetrahydrocannabinol (THC) concentrations from approximately 8% in the 1990s up to 20% in 2004. Increased cannabis potency may lead to higher THC-exposure and cannabis related harm. The Dutch government officially condones the sale of cannabis from so called 'coffee shops', and the Opium Act distinguishes cannabis as a Schedule II drug with 'acceptable risk' from other drugs with 'unacceptable risk' (Schedule I). Even in 1976, however, cannabis potency was taken into account by distinguishing hemp oil as a Schedule I drug. In 2011, an advisory committee recommended tightening up legislation, leading to a 2013 bill proposing the reclassification of high potency cannabis products with a THC content of 15% or more as a Schedule I drug. The purpose of this measure was twofold: to reduce public health risks and to reduce illegal cultivation and export of cannabis by increasing punishment. This paper focuses on the public health aspects and describes the (explicit and implicit) assumptions underlying this '15% THC measure', as well as to what extent these are supported by scientific research. Based on scientific literature and other sources of information, we conclude that the 15% measure can, in theory, provide a slight health benefit for specific groups of cannabis users (i.e., frequent users preferring strong cannabis, purchasing from coffee shops, using 'steady quantities' and not changing their smoking behaviour), but certainly not for all cannabis users. These gains should be weighed against the investment in enforcement and the risk of unintended (adverse) effects. Given the many assumptions and uncertainty about the nature and extent of the expected buying and smoking behaviour changes, the measure is a political choice and based on thin evidence. Copyright © 2016. Published by Elsevier B.V. All rights reserved.

  20. Scoring of coronary artery calcium scans: history, assumptions, current limitations, and future directions.

    Science.gov (United States)

    Alluri, Krishna; Joshi, Parag H; Henry, Travis S; Blumenthal, Roger S; Nasir, Khurram; Blaha, Michael J

    2015-03-01

    Coronary artery calcium (CAC) scanning is a reliable, noninvasive technique for estimating overall coronary plaque burden and for identifying risk for future cardiac events. Arthur Agatston and Warren Janowitz published the first technique for scoring CAC scans in 1990. Given the lack of available data correlating CAC with burden of coronary atherosclerosis at that time, their scoring algorithm was remarkable, but somewhat arbitrary. Since then, a few other scoring techniques have been proposed for the measurement of CAC including the Volume score and Mass score. Yet despite new data, little in this field has changed in the last 15 years. The main focus of our paper is to review the implications of the current approach to scoring CAC scans in terms of correlation with the central disease: coronary atherosclerosis. We first discuss the methodology of each available scoring system, describing how each of these scores makes important indirect assumptions in the way they account (or do not account) for calcium density, location of calcium, spatial distribution of calcium, and microcalcification/emerging calcium that might limit their predictive power. These assumptions require further study in well-designed, large event-driven studies. In general, all of these scores are adequate and are highly correlated with each other. Despite its age, the Agatston score remains the most extensively studied and widely accepted technique in both the clinical and research settings. After discussing CAC scoring in the era of contrast enhanced coronary CT angiography, we discuss suggested potential modifications to current CAC scanning protocols with respect to tube voltage, tube current, and slice thickness which may further improve the value of CAC scoring. We close with a focused discussion of the most important future directions in the field of CAC scoring. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
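
    The density weighting at the heart of the Agatston method can be sketched in a few lines. The thresholds below follow the published Agatston/Janowitz scheme (lesion area in mm² times a 1-4 weight set by peak attenuation in Hounsfield units); the lesion values themselves are hypothetical:

```python
def agatston_score(lesions):
    """Agatston score sketch: sum over calcified lesions of
    area (mm^2) times a density weight from peak attenuation (HU)."""
    def weight(peak_hu):
        # stepwise density weighting of the original 1990 algorithm
        if peak_hu >= 400: return 4
        if peak_hu >= 300: return 3
        if peak_hu >= 200: return 2
        if peak_hu >= 130: return 1   # 130 HU is the detection threshold
        return 0
    return sum(area * weight(hu) for area, hu in lesions)

# two hypothetical lesions: 10 mm^2 peaking at 450 HU, 4 mm^2 at 210 HU
score = agatston_score([(10.0, 450), (4.0, 210)])  # 10*4 + 4*2 = 48
```

    Because the weight jumps in steps with peak HU, two lesions of equal calcium mass can score differently; this is one concrete form of the indirect assumptions about calcium density discussed above.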

  1. The Assessment of the Intelligence of Latinos in the United States. (La Medicion de la Inteligencia de los Latinos en los Estados Unidos).

    Science.gov (United States)

    Cauce, Ana M.; And Others

    Most of the research on the assessment of the intelligence of Latinos in the United States appears to be based on some possibly erroneous or at least dubious assumptions. Among these are the following: (1) the assumption of bilinguality; (2) the assumption of equal proficiency in the English language; (3) the assumption of the equivalence of…

  2. Improving inference for aerial surveys of bears: The importance of assumptions and the cost of unnecessary complexity

    National Research Council Canada - National Science Library

    Schmidt, Joshua H; Wilson, Tammy L; Thompson, William L; Reynolds, Joel H

    2017-01-01

    ... is critical to appropriate inference. When the underlying assumptions of sampling approaches are violated, both increased bias and reduced precision of the population estimator may result. Bear (Ursus spp...

  3. Evaluating assumptions for least squares analysis using the general linear model: a guide for the pharmaceutical industry statistician.

    Science.gov (United States)

    Darken, Patrick F

    2004-08-01

    A review of graphical and test based methods for evaluating assumptions underlying the use of least squares analysis with the general linear model is presented along with some discussion of robustness. Alternative analyses are described for situations where there is evidence that the assumptions are not reasonable. Evaluation of the assumptions is illustrated through the use of an example from a clinical trial used for US registration purposes. It is recommended that: (1) most assumptions required for the least squares analysis of data using the general linear model can be judged using residuals graphically without the need for formal testing, (2) it is more important to normalize data or to use nonparametric methods when there is heterogeneous variance between treatment groups, and (3) nonparametric analyses can be used to demonstrate robustness of results and that it is best to specify these analyses prior to unblinding.
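
    As an illustration of the recommended graphical approach, the sketch below (our own construction, not the paper's clinical-trial example; group means and the random seed are arbitrary) fits a one-way cell-means model by least squares and compares group-wise residual spreads, which is where heterogeneous variance between treatment groups would show up:

```python
import numpy as np

rng = np.random.default_rng(0)
groups = np.repeat([0, 1, 2], 30)                   # three treatment groups
y = np.array([5.0, 6.0, 8.0])[groups] + rng.standard_normal(90)

X = np.eye(3)[groups]                               # cell-means design matrix
beta, *_ = np.linalg.lstsq(X, y, rcond=None)        # least squares fit
resid = y - X @ beta

# heterogeneous variance appears as very different group-wise spreads;
# here the data are homoscedastic by construction, so the spreads agree
spreads = [resid[groups == g].std() for g in range(3)]
```

    In practice one would plot `resid` against fitted values and a normal quantile plot rather than test formally, in line with the paper's first recommendation.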

  4. Quantum Darwinism Requires an Extra-Theoretical Assumption of Encoding Redundancy

    Science.gov (United States)

    Fields, Chris

    2010-10-01

    Observers restricted to the observation of pointer states of apparatus cannot conclusively demonstrate that the pointer of an apparatus A registers the state of a system of interest S without perturbing S. Observers cannot, therefore, conclusively demonstrate that the states of a system S are redundantly encoded by pointer states of multiple independent apparatus without destroying the redundancy of encoding. The redundancy of encoding required by quantum Darwinism must, therefore, be assumed from outside the quantum-mechanical formalism and without the possibility of experimental demonstration.

  5. Individualism, collectivism and ethnic identity: cultural assumptions in accounting for caregiving behaviour in Britain.

    Science.gov (United States)

    Willis, Rosalind

    2012-09-01

    Britain is experiencing the ageing of a large number of minority ethnic groups for the first time in its history, due to the post-war migration of people from the Caribbean and the Indian subcontinent. Stereotypes about a high level of provision of informal caregiving among minority ethnic groups are common in Britain, as in the US, despite quantitative studies refuting this assumption. This paper reports on a qualitative analysis of in-depth interviews with older people from five different ethnic groups about their conceptualisation of their ethnic identity, and their attributions of motivations of caregiving within their own ethnic group and in other groups. It is argued that ethnic identity becomes salient after migration and becoming a part of an ethnic minority group in the new country. Therefore, White British people who have never migrated do not have a great sense of ethnic identity. Further, a strong sense of ethnic identity is linked with identifying with the collective rather than the individual, which explains why the White British participants gave an individualist account of their motivations for informal care, whereas the minority ethnic participants gave a collectivist account of their motivations of care. Crucially, members of all ethnic groups were providing or receiving informal care, so it was the attribution and not the behaviour which differed.

  6. DIDACTICAL - METHODICAL ASSUMPTIONS AND CONDITIONS FOR SUCCESSFUL SOLUTION OF ECOLOGICAL PROBLEMS AT PRESCHOOL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Zvezdan Arsić

    2013-06-01

    The ecological crisis, arising as a result of mankind's ever-higher aspirations for the production of goods, is growing ever wider. The significance of the problem, and the real threat it poses to the environment, leads us to conclude that today, more than ever before, there is a need to develop environmental awareness and culture from the earliest period of life. This means that contemporary environmental issues should have a significant place and role in the organization of educational work in kindergartens. The need to begin environmental education at the preschool level follows from the psychophysical characteristics of child development, and from the fact that this is the period when the foundations of the future personality are laid. Taking into account the above requirements and findings, we point out the importance of institutional preschool education for the environment, and focus our attention on certain didactic-methodological assumptions and conditions that should be respected to ensure that environmental education in preschool achieves the expected results.

  7. Diversity within African American, female therapists: variability in clients' expectations and assumptions about the therapist.

    Science.gov (United States)

    Kelly, Jennifer F; Greene, Beverly

    2010-06-01

    Despite the presence of some literature that has addressed the characteristics of the African American female therapist, most psychotherapy training proceeds with the assumption that therapists are members of dominant groups, and most of the psychological and psychotherapy literature has been written by therapists and psychologists who come from dominant cultural perspectives. Not as much has been written about psychological paradigms or the process of psychotherapy from the perspective of the therapist who is not a dominant group member. This article explores both the common and divergent experiences that we, the authors, share as African American female therapists and the different reactions we frequently elicit in clients. We also explore how individual differences in our physical appearances, personal backgrounds, and different characteristics of our respective practices elicit distinct responses from clients that we believe are based on differences between us, despite the fact that we are both African American women. We believe that many of the stereotypes that affect perceptions of African American female clients also exist for African American female therapists. We will address how the intersection of gender, race, and sexual orientation of the client highlights the complexity of culturally competent practice. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  8. The current theoretical assumptions of the Bobath concept as determined by the members of BBTA.

    Science.gov (United States)

    Raine, Sue

    2007-01-01

    The Bobath concept is a problem-solving approach to the assessment and treatment of individuals following a lesion of the central nervous system that offers therapists a framework for their clinical practice. The aim of this study was to facilitate a group of experts in determining the current theoretical assumptions underpinning the Bobath concept. A four-round Delphi study was used. The expert sample included all 15 members of the British Bobath Tutors Association. Initial statements were identified from the literature with respondents generating additional statements. Level of agreement was determined by using a five-point Likert scale. Level of consensus was set at 80%. Eighty-five statements were rated from the literature along with 115 generated by the group. Ninety-three statements were identified as representing the theoretical underpinning of the Bobath concept. The Bobath experts agreed that therapists need to be aware of the principles of motor learning such as active participation, opportunities for practice and meaningful goals. They emphasized that therapy is an interactive process between individual, therapist, and the environment and aims to promote efficiency of movement to the individual's maximum potential rather than normal movement. Treatment was identified by the experts as having "change of functional outcome" at its center.

  9. A framework for testing assumptions about foraging scales, body mass, and niche separation using telemetry data.

    Science.gov (United States)

    Cumming, Graeme S; Henry, Dominic A W; Reynolds, Chevonne

    2017-07-01

    Ecological theory predicts that if animals with very similar dietary requirements inhabit the same landscape, then they should avoid niche overlap by either exploiting food resources at different times or foraging at different spatial scales. Similarly, it is often assumed that animals that fall in different body mass modes and share the same body plan will use landscapes at different spatial scales. We developed a new methodological framework for understanding the scaling of foraging (i.e. the range and distribution of scales at which animals use their landscapes) by applying a combination of three well-established methods to satellite telemetry data to quantify foraging patch size distributions: (1) first-passage time analysis; (2) a movement-based kernel density estimator; and (3) statistical comparison of resulting histograms and tests for multimodality. We demonstrate our approach using two sympatric, ecologically similar species of African ducks with quite different body masses: Egyptian Geese (actually a shelduck), and Red-billed Teal. Contrary to theoretical predictions, the two species, which are sympatric throughout the year, foraged at almost identical spatial scales. Our results show how ecologists can use GPS tracking data to explicitly quantify and compare the scales of foraging by different organisms within an animal community. Our analysis demonstrates both a novel approach to foraging data analysis and the need for caution when making assumptions about the relationships among niche separation, diet, and foraging scale.
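
    The first step of the framework, first-passage time analysis, can be sketched minimally. The helper below is our own simplified version (forward-looking only, not the authors' implementation): for each telemetry fix it reports the time until the track first moves farther than a given radius from that fix, so that long first-passage times flag intensive foraging at that spatial scale:

```python
import numpy as np

def first_passage_times(xy, t, radius):
    """First-passage time at each fix: elapsed time until the track
    first exceeds `radius` from that fix (forward direction only)."""
    fpt = np.full(len(xy), np.nan)
    for i in range(len(xy)):
        d = np.hypot(xy[i + 1:, 0] - xy[i, 0], xy[i + 1:, 1] - xy[i, 1])
        crossed = np.nonzero(d > radius)[0]
        if crossed.size:                       # NaN if the track ends first
            fpt[i] = t[i + 1 + crossed[0]] - t[i]
    return fpt

# straight track at 1 unit per time step: every fix exits a radius-5 circle
# as soon as the displacement reaches 6 units, i.e. after 6 time units
t = np.arange(20.0)
xy = np.column_stack([t, np.zeros_like(t)])
fpt = first_passage_times(xy, t, radius=5.0)
```

    Repeating this over a range of radii, and looking for the radius that maximizes the variance of log first-passage time, is the usual way such an analysis identifies characteristic foraging scales.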

  10. Allele Age Under Non-Classical Assumptions is Clarified by an Exact Computational Markov Chain Approach.

    Science.gov (United States)

    De Sanctis, Bianca; Krukov, Ivan; de Koning, A P Jason

    2017-09-19

    Determination of the age of an allele based on its population frequency is a well-studied problem in population genetics, for which a variety of approximations have been proposed. We present a new result that, surprisingly, allows the expectation and variance of allele age to be computed exactly (within machine precision) for any finite absorbing Markov chain model in a matter of seconds. This approach makes none of the classical assumptions (e.g., weak selection, reversibility, infinite sites), exploits modern sparse linear algebra techniques, integrates over all sample paths, and is rapidly computable for Wright-Fisher populations up to Ne = 100,000. With this approach, we study the joint effect of recurrent mutation, dominance, and selection, and demonstrate new examples of "selective strolls" where the classical symmetry of allele age with respect to selection is violated by weakly selected alleles that are older than neutral alleles at the same frequency. We also show evidence for a strong age imbalance, where rare deleterious alleles are expected to be substantially older than advantageous alleles observed at the same frequency when population-scaled mutation rates are large. These results highlight the under-appreciated utility of computational methods for the direct analysis of Markov chain models in population genetics.
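
    The exact-computation idea can be illustrated with a toy absorbing Wright-Fisher chain. This is a dense, small-N sketch of the fundamental-matrix solve, not the authors' sparse large-N implementation, and the genic selection/mutation parameterization below is a simplification chosen for brevity:

```python
import math
import numpy as np

def wright_fisher_Q(N, s=0.0, u=0.0):
    """Transient-state transition matrix of a toy Wright-Fisher chain.
    States are allele counts 1..N-1; counts 0 and N are absorbing."""
    counts = np.arange(1, N)
    p = counts / N
    p = p * (1 + s) / (p * (1 + s) + (1 - p))    # genic selection
    p = p * (1 - u) + (1 - p) * u                # recurrent mutation
    Q = np.zeros((N - 1, N - 1))
    for row, pi in enumerate(p):
        for col in range(N - 1):
            j = col + 1                          # next-generation count
            Q[row, col] = math.comb(N, j) * pi**j * (1 - pi)**(N - j)
    return Q

# Fundamental-matrix identity: the expected generations to absorption from
# each transient state solves (I - Q) t = 1.  The paper's point is that such
# exact linear solves stay tractable with sparse algebra even for huge N.
N = 50
Q = wright_fisher_Q(N, s=0.01)
t_absorb = np.linalg.solve(np.eye(N - 1) - Q, np.ones(N - 1))
```

    Expectations of allele age follow from analogous linear systems over the same transition matrix, which is why no weak-selection or infinite-sites approximation is needed.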

  11. The necessary distinction between methodology and philosophical assumptions in healthcare research.

    Science.gov (United States)

    Mesel, Terje

    2013-09-01

    Methodological discussions within healthcare research have traditionally described a methodological dichotomy between qualitative and quantitative methods. The aim of this article is to demonstrate that such a dichotomy presents unnecessary obstacles for good research design and is methodologically and philosophically unsustainable. The issue of incommensurability is not a question of method but rather a question of the philosophical premises underpinning a given method. Thus, transparency on the philosophical level is important for validity and consistency as well as for attempts to integrate or establish an interface to other research. I argue that it is necessary to make a distinction between methodology and philosophical assumptions and to ensure consistency in these correlations. Furthermore, I argue that the question of incommensurability is best answered at this basic philosophical level. The complexity of health care calls for methodological pluralism and creativity that utilises the strength of both qualitative and quantitative approaches. Transparency and consistency on the philosophical level can facilitate new mixed methods research designs that may be promising methodological assets for healthcare research. I believe we are ill served by fortified positions that continue to uphold old battle lines. Empirical research begins in the field of practice and requires a certain amount of pragmatism. However, this pragmatism must be philosophically informed. © 2012 The Authors. Scandinavian Journal of Caring Sciences © 2012 Nordic College of Caring Science.

  12. Does the Assumption on Innovation Process Play an Important Role for Filtered Historical Simulation Model?

    Directory of Open Access Journals (Sweden)

    Emrah Altun

    2018-01-01

    Most financial institutions compute the Value-at-Risk (VaR) of their trading portfolios using historical simulation-based methods. In this paper, we examine the Filtered Historical Simulation (FHS) model introduced by Barone-Adesi et al. (1999) theoretically and empirically. The main goal of this study is to find an answer to the following question: "Does the assumption on innovation process play an important role for the Filtered Historical Simulation model?". To this end, we investigate the performance of the FHS model with skewed and fat-tailed innovation distributions such as the normal, skew normal, Student's-t, skew-T, generalized error, and skewed generalized error distributions. The performances of the FHS models are evaluated by means of unconditional and conditional likelihood ratio tests and loss functions. Based on the empirical results, we conclude that the FHS models with generalized error and skew-T distributions produce more accurate VaR forecasts.
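
    The FHS recipe can be sketched as follows. For brevity this uses an EWMA volatility filter in place of the GARCH fit of Barone-Adesi et al., and the data are synthetic, so it is an illustrative simplification rather than the paper's model:

```python
import numpy as np

def fhs_var(returns, alpha=0.01, lam=0.94, n_boot=10000, seed=0):
    """Filtered Historical Simulation VaR, simplified:
    1) filter returns by an EWMA volatility estimate,
    2) bootstrap the standardized residuals,
    3) rescale by current volatility and take the alpha-quantile."""
    r = np.asarray(returns, dtype=float)
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()
    for i in range(1, len(r)):
        sigma2[i] = lam * sigma2[i - 1] + (1 - lam) * r[i - 1] ** 2
    z = r / np.sqrt(sigma2)                    # standardized residuals
    rng = np.random.default_rng(seed)
    z_boot = rng.choice(z, size=n_boot, replace=True)
    sigma_now = np.sqrt(lam * sigma2[-1] + (1 - lam) * r[-1] ** 2)
    simulated = sigma_now * z_boot             # filtered scenario returns
    return -np.quantile(simulated, alpha)      # VaR as a positive loss

rng = np.random.default_rng(1)
rets = 0.01 * rng.standard_normal(1000)        # synthetic daily returns
var_99 = fhs_var(rets)
```

    The innovation-distribution question studied in the paper enters at step 1: fitting the filter under different innovation assumptions changes the standardized residuals that get resampled, and hence the VaR forecast.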

  13. 3 DOF Spherical Pendulum Oscillations with a Uniform Slewing Pivot Center and a Small Angle Assumption

    Directory of Open Access Journals (Sweden)

    Alexander V. Perig

    2014-01-01

    The present paper addresses the derivation of a 3 DOF mathematical model of a spherical pendulum attached to a crane boom tip for uniform slewing motion of the crane. The governing nonlinear DAE-based system for crane boom uniform slewing has been proposed, numerically solved, and experimentally verified. The proposed nonlinear and linearized models have been derived with an introduction of Cartesian coordinates. The linearized model with small angle assumption has an analytical solution. The relative and absolute payload trajectories have been derived. The amplitudes of load oscillations, which depend on computed initial conditions, have been estimated. The dependence of natural frequencies on the transport inertia forces and gravity forces has been computed. The conservative system, which contains first time derivatives of coordinates without oscillation damping, has been derived. The dynamic analogy between crane boom-driven payload swaying motion and Foucault's pendulum motion has been grounded and outlined. For a small swaying angle, good agreement between theoretical and averaged experimental results was obtained.
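
    Under the small-angle assumption, the payload behaves like a pivot-driven linear oscillator. The sketch below is a planar simplification of the 3 DOF model, with hypothetical cable length and slewing speed, integrating the linearized equation x'' = -(g/L)(x - x_pivot(t)):

```python
import math

g, L = 9.81, 5.0                  # gravity and hypothetical cable length
omega = math.sqrt(g / L)          # natural frequency of the linearized model

def simulate(x_pivot, t_end=10.0, dt=1e-3):
    """Semi-implicit Euler integration of the pivot-driven payload."""
    x, v, t = x_pivot(0.0), 0.0, 0.0
    while t < t_end:
        a = -(g / L) * (x - x_pivot(t))   # linearized restoring force
        v += a * dt
        x += v * dt                       # semi-implicit step bounds energy
        t += dt
    return x

# uniformly slewing pivot approximated as a tangential ramp at 0.5 m/s;
# analytically x(t) = 0.5*t - (0.5/omega)*sin(omega*t)
x_final = simulate(lambda t: 0.5 * t)
```

    The payload tracks the moving pivot with a superposed oscillation of velocity amplitude equal to the ramp speed, which is the same structure the abstract's analytical small-angle solution exhibits.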

  14. Evaluation of the accuracy of antioxidant competition assays: incorrect assumptions with major impact.

    Science.gov (United States)

    Balk, Jiska M; Bast, Aalt; Haenen, Guido R M M

    2009-07-15

    The activity of antioxidants is frequently determined in competition assays. In these assays an antioxidant (A) and a detector molecule (D) compete for the reactive species (R). The competitive inhibitory effect of A on the reaction of D with R is a measure of the antioxidant activity of A. In determining the activity of A, it is in general incorrectly assumed that the concentrations of A and D remain equal to the initial concentration. However, the principle of the assay is that some A and D is consumed and consequently the concentrations of A and D will decrease during a competition assay, resulting in a deviation in the observed antioxidant activity. Computer modeling was used to obtain a graphical tool to estimate the extent of the deviation caused by the incorrect assumption that the concentrations of A and D do not decrease. Several competition assays were evaluated using this graphical tool, demonstrating that frequently inaccurate antioxidant activities have been reported. In general, differences between antioxidants are underestimated and the activity of all antioxidants shifts toward the antioxidant activity of D. A strategy is provided to improve the accuracy of a competition assay. To obtain accurate results in a competition assay, the reaction rate constant of the detector molecule with the reactive species should be comparable to that of the antioxidant. In addition, the concentration of the reactive species should be as low as possible.
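
    The depletion effect is easy to reproduce numerically. In the hypothetical example below (rate constants and concentrations are illustrative, not from the paper), the constant-concentration assumption predicts that the detector captures a share kD*D0 / (kD*D0 + kA*A0) = 0.25 of the reactive species, while integrating the actual second-order kinetics gives a noticeably larger consumed fraction of D:

```python
def competition(kA, kD, A0, D0, R0, dt=1e-3, t_max=200.0):
    """Euler integration of a competition assay with second-order kinetics:
    R + D -> products (rate constant kD), R + A -> products (rate kA)."""
    A, D, R, t = A0, D0, R0, 0.0
    while t < t_max and R > R0 * 1e-6:
        vD = kD * D * R
        vA = kA * A * R
        D -= vD * dt
        A -= vA * dt
        R -= (vD + vA) * dt
        t += dt
    return (D0 - D) / D0          # fraction of detector molecule consumed

# antioxidant reacts 3x faster than the detector; equal initial amounts
frac = competition(kA=3e4, kD=1e4, A0=1e-5, D0=1e-5, R0=1e-5)
```

    Because the faster-reacting antioxidant depletes first, the detector's share of R grows over the course of the assay, so the naive 0.25 underestimates the observed detector consumption (here roughly 0.32), in line with the deviations the paper quantifies.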

  15. Deficient crisis-probing practices and taken-for-granted assumptions in health organisations

    Science.gov (United States)

    Canyon, Deon V.; Adhikari, Ashmita; Cordery, Thomas; Giguère-Simmonds, Philippe; Huang, Jessica; Nguyen, Helen; Watson, Michael; Yang, Daniel

    2011-01-01

    The practice of crisis-probing in proactive organisations involves meticulous and sustained investigation into operational processes and management structures for potential weaknesses and flaws before they become difficult to resolve. In health organisations, crisis probing is a necessary part of preparing to manage emerging health threats. This study examined the degree of pre-emptive probing in health organisations and the type of crisis training provided to determine whether or not they are prepared in this area. This evidence-based study draws on cross-sectional responses provided by executives from chiropractic, physiotherapy, and podiatry practices; dental and medical clinics; pharmacies; aged care facilities; and hospitals. The data show a marked lack of mandatory probing and a generalised failure to reward crisis reporting. Crisis prevention training is poor in all organisations except hospitals and aged care facilities where it occurs at an adequate frequency. However, this training focuses primarily on natural disasters, fails to address most other crisis types, is mostly reactive and not designed to probe for and uncover key taken-for-granted assumptions. Crisis-probing in health organisations is inadequate, and improvements in this area may well translate into measurable improvements in preparedness and response outcomes. PMID:24149030

  16. The European Water Framework Directive: How Ecological Assumptions Frame Technical and Social Change

    Directory of Open Access Journals (Sweden)

    Patrick Steyaert

    2007-06-01

    The European Water Framework Directive (WFD) is built upon significant cognitive developments in the field of ecological science, but also encourages active involvement of all interested parties in its implementation. The coexistence in the same policy text of both substantive and procedural approaches to policy development stimulated this research, as did our concerns about the implications of substantive ecological visions within the WFD policy for promoting, or not, social learning processes through participatory designs. We have used a qualitative analysis of the WFD text, which shows that the ecological dimension of the WFD dedicates its quasi-exclusive attention to a particular current of thought in ecosystems science, one focusing on ecosystem status and stability and considering human activities as disturbance factors. This particular worldview is juxtaposed within the WFD with a more utilitarian one that gives rise to many policy exemptions without changing the general underlying ecological model. We discuss these policy statements in the light of the tension between substantive and procedural policy developments. We argue that the dominant substantive approach of the WFD, comprising particular ecological assumptions built upon "compositionalism," seems to be contradictory with its espoused intention of involving the public. We discuss that current of thought in regard to more functionalist thinking and adaptive management, which offers greater opportunities for social learning, i.e., placing a set of interdependent stakeholders in an intersubjective position in which they operate a "social construction" of water problems through the co-production of knowledge.

  17. Understanding the influence of gender role identity on the assumption of family caregiving roles by men.

    Science.gov (United States)

    Hirsch, C

    1996-01-01

    Previous explanations of limited participation by males as family caregivers assume that socialization to dominant gender stereotypes is a universal barrier among men. Overlooked are 1) variations in the degree of internalization of gender-typed attitudes that enable intense participation in a wide variety of personal care tasks and the assumption of the chief caregiver role among some men, and 2) social psychological processes used to resolve cognitive dissonance among men considering caregiver activities and/or role enactment. The present examination of these processes is responsive to calls for enhancing our understanding of the personal meaning that caregiving has for men. The introduction of Risman's view that current experience influences socialized predispositions allows the delineation of conditions under which husbands, sons, and other male relatives who have internalized stereotypical self-images of masculinity can also assume caregiver roles in the family. Drawing on interview data from a purposive sample of thirty-two men who were chief caregivers for elderly relatives, case study material is presented to illustrate several pathways by which male respondents gained access to family caregiving roles.

  18. Assumption of linearity in soil and plant concentration ratios: an experimental evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Sheppard, S.C.; Evenden, W.G.

    1988-01-01

    We have evaluated one of the main assumptions in the use of concentration ratios to describe the transfer of elements in the environment. The ratios examined in detail were the 'concentration ratio' (CR) of leaf to soil and the 'partition coefficient' (Kd) of solid- to liquid-phase concentrations in soil. Use of these ratios implies a linear relationship between the concentrations. Soil was experimentally contaminated to evaluate this linearity over more than a 1000-fold range in concentration. A secondary objective was to determine CR and Kd values in a long-term (2 y) outdoor study using a peat soil and blueberries. The elements I, Se, Cs, Pb and U were chosen as environmentally important elements. The results indicated that relationships of leaf and leachate concentrations were not consistently linearly related to the total soil concentrations for each of the elements. The modelling difficulties implied by these concentration dependencies can be partially offset by including the strong negative correlation between CR and Kd. The error introduced by using a mean value of the ratios for Se or U resulted in up to a ten-fold increase in variability for CR and a three-fold increase for Kd.
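
    The nonlinearity issue can be illustrated with synthetic numbers (the power-law response and coefficients below are hypothetical, not the study's data): if leaf concentration follows leaf = a * soil**b with b != 1, the concentration ratio CR = leaf/soil drifts across the concentration range, so a single mean CR misrepresents both ends:

```python
import numpy as np

soil = np.logspace(0, 3, 7)          # >1000-fold soil concentration range
a, b = 2.0, 0.7                      # hypothetical saturating uptake (b < 1)
leaf = a * soil**b
cr = leaf / soil                     # CR = a * soil**(b - 1): not constant

# fitting log(leaf) = log(a) + b*log(soil) recovers the exponent;
# b != 1 is exactly the failure of the linearity assumption
b_hat, log_a_hat = np.polyfit(np.log(soil), np.log(leaf), 1)
```

    With b < 1 the ratio falls roughly eight-fold over this range, which is the kind of concentration dependence the experiment detected for several elements.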

  19. Provably secure identity-based identification and signature schemes from code assumptions

    Science.gov (United States)

    Zhao, Yiming

    2017-01-01

    Code-based cryptography is one of few alternatives supposed to be secure in a post-quantum world. Meanwhile, identity-based identification and signature (IBI/IBS) schemes are two of the most fundamental cryptographic primitives, so several code-based IBI/IBS schemes have been proposed. However, with increasingly deep research on coding theory, the security reduction and efficiency of such schemes have been invalidated and challenged. In this paper, we construct provably secure IBI/IBS schemes from code assumptions against impersonation under active and concurrent attacks through a provably secure code-based signature technique proposed by Preetha, Vasant and Rangan (PVR signature), and a security enhancement Or-proof technique. We also present the parallel-PVR technique to decrease parameter values while maintaining the standard security level. Compared to other code-based IBI/IBS schemes, our schemes achieve not only preferable public parameter size, private key size, communication cost and signature length due to better parameter choices, but also provable security. PMID:28809940

  20. Utility of Web search query data in testing theoretical assumptions about mephedrone.

    Science.gov (United States)

    Kapitány-Fövény, Máté; Demetrovics, Zsolt

    2017-05-01

    With growing access to the Internet, people who use drugs and traffickers started to obtain information about novel psychoactive substances (NPS) via online platforms. This paper aims to analyze whether a decreasing Web interest in formerly banned substances-cocaine, heroin, and MDMA-and the legislative status of mephedrone predict Web interest about this NPS. Google Trends was used to measure changes of Web interest on cocaine, heroin, MDMA, and mephedrone. Google search results for mephedrone within the same time frame were analyzed and categorized. Web interest in classic drugs was found to be more persistent. Regarding geographical distribution, location of Web searches for heroin and cocaine was less centralized. Illicit status of mephedrone was a negative predictor of its Web search query rates. The connection between mephedrone-related Web search rates and legislative status of this substance was significantly mediated by ecstasy-related Web search queries, the number of documentaries, and forum/blog entries about mephedrone. The results might provide support for the hypothesis that mephedrone's popularity was highly correlated with its legal status as well as it functioned as a potential substitute for MDMA. Google Trends was found to be a useful tool for testing theoretical assumptions about NPS. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Revealing patterns of cultural transmission from frequency data: equilibrium and non-equilibrium assumptions

    Science.gov (United States)

    Crema, Enrico R.; Kandler, Anne; Shennan, Stephen

    2016-12-01

    A long tradition of cultural evolutionary studies has developed a rich repertoire of mathematical models of social learning. Early studies have laid the foundation of more recent endeavours to infer patterns of cultural transmission from observed frequencies of a variety of cultural data, from decorative motifs on potsherds to baby names and musical preferences. While this wide range of applications provides an opportunity for the development of generalisable analytical workflows, archaeological data present new questions and challenges that require further methodological and theoretical discussion. Here we examine the decorative motifs of Neolithic pottery from an archaeological assemblage in Western Germany, and argue that the widely used (and relatively undiscussed) assumption that observed frequencies are the result of a system in equilibrium conditions is unwarranted, and can lead to incorrect conclusions. We analyse our data with a simulation-based inferential framework that can overcome some of the intrinsic limitations in archaeological data, as well as handle both equilibrium conditions and instances where the mode of cultural transmission is time-variant. Results suggest that none of the models examined can produce the observed pattern under equilibrium conditions, pointing instead to temporal shifts in the patterns of cultural transmission.

  2. Manufactured Homes Acquisition Program : Heat Loss Assumptions and Calculations, Heat Loss Coefficient Tables.

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Bob; Baylon, David

    1992-05-01

    This manual is intended to assist builders of manufactured homes in assessing the thermal performance of structural components used in the Manufactured Housing Acquisition Program (MAP) sponsored by the Bonneville Power Administration (BPA). U-factors for these components are calculated using the ASHRAE (1989) parallel heat loss method, with adaptations made for the construction practices found in the Pacific Northwest manufactured home industry. This report is divided into two parts. The first part describes the general assumptions and calculation procedures used to develop U-factors and R-values for specific materials used in the construction industry, overall U-factors for component sections, and the impact of complex framing and thermal configurations on various components' heat loss rates. The individual components of manufactured homes are reviewed in terms of overall thermal conductivity. The second part contains tables showing the results of heat loss calculations expressed as U-factors for various configurations of the major building components: floor systems, ceiling systems, wall systems, windows, doors and skylights. These values can be used to establish compliance with the MAP specifications and thermal performance criteria or to compare manufactured homes built to different standards.
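The parallel heat-loss method described above can be sketched in a few lines: the overall U-factor of a component section is the area-weighted sum of the U-factors (1/R) of its parallel heat-flow paths. The framing fraction and R-values below are illustrative placeholders, not MAP specification values.

```python
# Hypothetical sketch of the ASHRAE parallel heat-loss method: each
# parallel path (framing, insulated cavity, ...) contributes its area
# fraction divided by its R-value. Values below are invented examples.

def parallel_u_factor(paths):
    """paths: (area_fraction, R_value) pairs covering the full area."""
    assert abs(sum(f for f, _ in paths) - 1.0) < 1e-9
    return sum(f / r for f, r in paths)

# Example wall section: 15% framing at R-6.9, 85% insulated cavity at
# R-19 (units: h*ft^2*F/Btu).
u_wall = parallel_u_factor([(0.15, 6.9), (0.85, 19.0)])
# u_wall ≈ 0.066 Btu/(h*ft^2*F)
```

The same weighted sum extends to any number of paths (e.g., adding a header or plate path), which is how complex framing configurations raise a component's overall heat loss rate.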

  4. Improving thermal ablation delineation with electrode vibration elastography using a bidirectional wave propagation assumption.

    Science.gov (United States)

    DeWall, Ryan J; Varghese, Tomy

    2012-01-01

    Thermal ablation procedures are commonly used to treat hepatic cancers and accurate ablation representation on shear wave velocity images is crucial to ensure complete treatment of the malignant target. Electrode vibration elastography is a shear wave imaging technique recently developed to monitor thermal ablation extent during treatment procedures. Previous work has shown good lateral boundary delineation of ablated volumes, but axial delineation was more ambiguous, which may have resulted from the assumption of lateral shear wave propagation. In this work, we assume both lateral and axial wave propagation and compare wave velocity images to those assuming only lateral shear wave propagation in finite element simulations, tissue-mimicking phantoms, and bovine liver tissue. Our results show that assuming bidirectional wave propagation minimizes artifacts above and below ablated volumes, yielding a more accurate representation of the ablated region on shear wave velocity images. Area overestimation was reduced from 13.4% to 3.6% in a stiff-inclusion tissue-mimicking phantom and from 9.1% to 0.8% in a radio-frequency ablation in bovine liver tissue. More accurate ablation representation during ablation procedures increases the likelihood of complete treatment of the malignant target, decreasing tumor recurrence. © 2012 IEEE

  5. Was That Assumption Necessary? Reconsidering Boundary Conditions for Analytical Solutions to Estimate Streambed Fluxes

    Science.gov (United States)

    Luce, Charles H.; Tonina, Daniele; Applebee, Ralph; DeWeese, Timothy

    2017-11-01

    Two common refrains about using the one-dimensional advection-diffusion equation to estimate fluid fluxes and thermal conductivity from temperature time series in streambeds are that the solution assumes that (1) the surface boundary condition is a sine wave or nearly so, and (2) there is no gradient in mean temperature with depth. Although the mathematical posing of the problem in the original solution might lead one to believe these constraints exist, the perception that they are a source of error is a fallacy. Here we develop a mathematical proof demonstrating the equivalence of the solution as developed based on an arbitrary (Fourier integral) surface temperature forcing when evaluated at a single given frequency versus that derived considering a single frequency from the beginning. The implication is that any single frequency can be used in the frequency-domain solutions to estimate thermal diffusivity and 1-D fluid flux in streambeds, even if the forcing has multiple frequencies. This means that diurnal variations with asymmetric shapes or gradients in the mean temperature with depth are not actually assumptions, and deviations from them should not cause errors in estimates. Given this clarification, we further explore the potential for using information at multiple frequencies to augment the information derived from time series of temperature.
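The single-frequency argument above can be checked numerically: projecting a multi-frequency, asymmetric forcing onto one chosen frequency recovers that component's amplitude and phase exactly, untouched by the mean level or other harmonics. The series below is synthetic and illustrative, not from the paper.

```python
import cmath
import math

# Synthetic surface "temperature": mean + diurnal component
# (amplitude 3, phase 0.7) + a semidiurnal harmonic that makes the
# daily cycle asymmetric.
n = 240                                    # 10 days of hourly samples
t = [k / 24.0 for k in range(n)]           # time in days
temp = [12.0 + 3.0 * math.cos(2 * math.pi * ti - 0.7)
        + 1.2 * math.cos(4 * math.pi * ti + 0.3) for ti in t]

# Discrete projection onto 1 cycle/day isolates the diurnal component:
c = sum(s * cmath.exp(-2j * math.pi * ti) for s, ti in zip(temp, t)) / n
amp, phase = 2.0 * abs(c), -cmath.phase(c)
# amp ≈ 3.0 and phase ≈ 0.7, unaffected by the mean or the harmonic
```

The orthogonality of the Fourier basis over an integer number of periods is what makes the other components drop out, which is the substance of the equivalence proof described in the abstract.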

  6. Examination of the diurnal assumptions of the test of variables of attention for elementary students.

    Science.gov (United States)

    Hurford, David P; Lasater, Kara A; Erickson, Sara E; Kiesling, Nicole E

    2013-04-01

    To examine the diurnal assumptions of the test of variables of attention (TOVA). The present study assessed 122 elementary students aged 5.5 to 10.0 years who were randomly assigned to one of four different groups based on time of administration (M-M: morning-morning, M-A: morning-afternoon, A-M: afternoon-morning, and A-A: afternoon-afternoon). Morning administration occurred between 8:00 and 10:00 a.m., and afternoon administration occurred between 1:00 and 3:00 p.m. Reliability was consistent across groups, and there were no significant differences between groups. Classification of the students into ADHD or non-ADHD groups was similar across groups, and the children who were identified as ADHD with the Vanderbilt ADHD diagnostic teacher rating scale were consistently classified as ADHD on the TOVA regardless of time of day. The results of the present study indicate that the psychometric values of the TOVA remain intact whether its administration was in the morning or afternoon.

  7. Controversies in psychotherapy research: epistemic differences in assumptions about human psychology.

    Science.gov (United States)

    Shean, Glenn D

    2013-01-01

    It is the thesis of this paper that differences in philosophical assumptions about the subject matter and treatment methods of psychotherapy have contributed to disagreements about the external validity of empirically supported therapies (ESTs). These differences are evident in the theories that are the basis for both the design and interpretation of recent psychotherapy efficacy studies. The natural science model, as applied to psychotherapy outcome research, transforms the constitutive features of the study subject in a reciprocal manner so that problems, treatments, and indicators of effectiveness are limited to what can be directly observed. Meaning-based approaches to therapy emphasize processes and changes that do not lend themselves to experimental study. Hermeneutic philosophy provides a supplemental model to establishing validity in those instances where outcome indicators do not lend themselves to direct observation and measurement and require "deep" interpretation. Hermeneutics allows for a broadening of psychological study that allows one to establish a form of validity that is applicable when constructs do not refer to things that literally "exist" in nature. From a hermeneutic perspective the changes that occur in meaning-based therapies must be understood and evaluated on the manner in which they are applied to new situations, the logical ordering and harmony of the parts with the theoretical whole, and the capability of convincing experts and patients that the interpretation can stand up against other ways of understanding. Adoption of this approach often is necessary to competently evaluate the effectiveness of meaning-based therapies.

  8. Tank waste remediation system retrieval and disposal mission key enabling assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Baldwin, J.H.

    1998-01-09

    An overall systems approach has been applied to develop action plans to support the retrieval and immobilization waste disposal mission. The review concluded that the systems and infrastructure required to support the mission are known. Required systems are either in place or plans have been developed. An analysis of the programmatic, management and technical activities necessary to declare Readiness to Proceed with execution of the mission demonstrates that the systems, people, and hardware will be on line and ready to support the private contractors. The systems approach included defining the retrieval and immobilized waste disposal mission requirements and evaluating the readiness of the TWRS contractor to supply waste feed to the private contractors in June 2002. The Phase 1 feed delivery requirements from the Private Contractor Request for Proposals were reviewed, transfer piping routes were mapped out, existing systems were evaluated, and upgrade requirements were defined. Technical Basis Reviews were completed to define work scope in greater detail, and cost estimates and associated year-by-year financial analyses were completed. Personnel training, qualifications, management systems and procedures were reviewed and shown to be in place and ready to support the Phase 1B mission. Key assumptions and risks that could negatively impact mission success were evaluated and appropriate mitigative action plans were developed and scheduled.

  9. Spatial modelling of assumptions of tourism development using geographic IT

    Directory of Open Access Journals (Sweden)

    Jitka Machalová

    2010-01-01

    Full Text Available The aim of this article is to show the possibilities of spatial modelling and analysis of the assumptions of tourism development in the Czech Republic, with the objective of making decision-making processes in tourism easier and more efficient (for companies and clients as well as destination managements). The development and placement of tourism depend on the factors (conditions) that influence its application in specific areas. These factors are usually divided into three groups: selective, localization and realization. Tourism is inseparably connected with space, i.e. the countryside. The countryside can be modelled and consecutively analysed by means of geographical information technologies. With the help of spatial modelling and subsequent analyses, the localization and realization conditions in the regions of the Czech Republic have been evaluated. The best localization conditions have been found in the Liberecký region. The capital city of Prague has negligible natural conditions; however, its social conditions are on a high level. Next, the spatial analyses have shown that the best realization conditions are provided by the capital city of Prague, followed by the Central-Bohemian, South-Moravian, Moravian-Silesian and Karlovarský regions. The development of a tourism destination depends not only on the localization and realization factors but is also fundamentally affected by the level of local destination management. Spatial modelling can help destination managers in decision-making processes, enabling optimal use of a destination's potential and efficient targeting of marketing activities.

  10. Density assumptions for converting geodetic glacier volume change to mass change

    Directory of Open Access Journals (Sweden)

    M. Huss

    2013-05-01

    Full Text Available The geodetic method is widely used for assessing changes in the mass balance of mountain glaciers. However, comparison of repeated digital elevation models only provides a glacier volume change that must be converted to a change in mass using a density assumption or model. This study investigates the use of a constant factor for the volume-to-mass conversion based on a firn compaction model applied to simplified glacier geometries with idealized climate forcing, and two glaciers with long-term mass balance series. It is shown that the "density" of geodetic volume change is not a constant factor and is systematically smaller than ice density in most cases. This is explained by the accretion/removal of low-density firn layers, and changes in the firn density profile with positive/negative mass balance. Assuming a value of 850 ± 60 kg m−3 to convert volume change to mass change is appropriate for a wide range of conditions. For short time intervals (≤3 yr), periods with limited volume change, and/or changing mass balance gradients, the conversion factor can however vary from 0–2000 kg m−3 and beyond, which requires caution when interpreting glacier mass changes based on geodetic surveys.
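The conversion described above is a one-line calculation once the density assumption is fixed; the sketch below applies the recommended 850 ± 60 kg m−3 factor with linear error propagation. The glacier volume change is an invented example.

```python
import math

# Volume-to-mass conversion for a geodetic glacier survey, using the
# 850 ± 60 kg m^-3 factor recommended in the abstract. The DEM-derived
# volume change below is hypothetical.

def geodetic_mass_change(dv, rho=850.0, rho_sigma=60.0, dv_sigma=0.0):
    """Return (mass change, 1-sigma uncertainty) in kg for dv in m^3."""
    dm = dv * rho
    sigma = math.hypot(dv * rho_sigma, rho * dv_sigma)
    return dm, sigma

dm, sigma = geodetic_mass_change(-2.0e6)   # hypothetical thinning
# dm = -1.7e9 kg, sigma = 1.2e8 kg (~7% from the density assumption alone)
```

The ±60 kg m−3 term alone contributes roughly 7% relative uncertainty, which is why the abstract warns against applying a constant factor over short intervals where the true conversion factor can stray far from 850.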

  11. Ecological risk of anthropogenic pollutants to reptiles: Evaluating assumptions of sensitivity and exposure

    Energy Technology Data Exchange (ETDEWEB)

    Weir, Scott M., E-mail: scott.weir@ttu.ed [Texas Tech University, Institute of Environmental and Human Health, Department of Environmental Toxicology, Box 41163, Lubbock, TX (United States); Suski, Jamie G., E-mail: jamie.suski@ttu.ed [Texas Tech University, Department of Biological Sciences, Box 43131, Lubbock, TX (United States); Salice, Christopher J., E-mail: chris.salice@ttu.ed [Texas Tech University, Institute of Environmental and Human Health, Department of Environmental Toxicology, Box 41163, Lubbock, TX (United States)

    2010-12-15

    A large data gap for reptile ecotoxicology still persists; therefore, ecological risk assessments of reptiles usually incorporate the use of surrogate species. This necessitates that (1) the surrogate is at least as sensitive as the target taxon and/or (2) exposures to the surrogate are greater than that of the target taxon. We evaluated these assumptions for the use of birds as surrogates for reptiles. Based on a survey of the literature, birds were more sensitive than reptiles in less than 1/4 of the chemicals investigated. Dietary and dermal exposure modeling indicated that exposure to reptiles was relatively high, particularly when the dermal route was considered. We conclude that caution is warranted in the use of avian receptors as surrogates for reptiles in ecological risk assessment and emphasize the need to better understand the magnitude and mechanism of contaminant exposure in reptiles to improve exposure and risk estimation. - Avian receptors are not universally appropriate surrogates for reptiles in ecological risk assessment.
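The dietary-plus-dermal comparison above can be illustrated with a generic screening-level dose model (this is a textbook-style sketch, not the authors' model; all parameter values are hypothetical placeholders).

```python
# Generic screening-level exposure sketch: daily dose (mg/kg-bw/day)
# = dietary intake + dermal uptake, per unit body weight. All values
# below are invented for illustration.

def daily_dose(c_food, fir, c_soil, sa, af, absf, bw):
    """c_food: mg/kg in diet; fir: kg/day food intake rate;
    c_soil: mg/kg in soil; sa: exposed skin area, cm^2;
    af: soil adherence, mg/cm^2; absf: dermal absorption fraction;
    bw: body weight, kg."""
    dietary = c_food * fir / bw
    dermal = c_soil * sa * af * absf * 1e-6 / bw  # 1e-6: mg soil -> kg soil
    return dietary + dermal

# Hypothetical small reptile on contaminated soil:
dose = daily_dose(c_food=5.0, fir=0.02, c_soil=50.0,
                  sa=30.0, af=1.0, absf=0.1, bw=0.25)
```

In a model of this shape, the dermal term scales with skin area and soil contact, which is why including the dermal route raised reptile exposure estimates relative to avian surrogates in the study above.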

  12. Estimating Alarm Thresholds for Process Monitoring Data under Different Assumptions about the Data Generating Mechanism

    Directory of Open Access Journals (Sweden)

    Tom Burr

    2013-01-01

    Full Text Available Process monitoring (PM for nuclear safeguards sometimes requires estimation of thresholds corresponding to small false alarm rates. Threshold estimation dates to the 1920s with the Shewhart control chart; however, because possible new roles for PM are being evaluated in nuclear safeguards, it is timely to consider modern model selection options in the context of threshold estimation. One of the possible new PM roles involves PM residuals, where a residual is defined as residual = data − prediction. This paper reviews alarm threshold estimation, introduces model selection options, and considers a range of assumptions regarding the data-generating mechanism for PM residuals. Two PM examples from nuclear safeguards are included to motivate the need for alarm threshold estimation. The first example involves mixtures of probability distributions that arise in solution monitoring, which is a common type of PM. The second example involves periodic partial cleanout of in-process inventory, leading to challenging structure in the time series of PM residuals.
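The threshold-estimation task above can be made concrete with a toy example: given PM residuals (residual = data − prediction) and a target false alarm rate alpha, the threshold can be set parametrically under a Gaussian assumption or nonparametrically from an empirical quantile. The residuals below are synthetic Gaussians; real PM residuals may be mixtures, which is exactly when the two estimates diverge.

```python
import random
import statistics

random.seed(1)
residuals = [random.gauss(0.0, 1.0) for _ in range(10000)]
alpha = 0.001                       # small target false alarm rate

# (a) Parametric threshold under a Gaussian data-generating assumption:
z = statistics.NormalDist().inv_cdf(1 - alpha)          # ~3.09
thr_gauss = statistics.fmean(residuals) + z * statistics.stdev(residuals)

# (b) Nonparametric threshold from the empirical (1 - alpha) quantile:
thr_emp = sorted(residuals)[int((1 - alpha) * len(residuals)) - 1]

fa_rate = sum(r > thr_gauss for r in residuals) / len(residuals)
```

For alpha this small, the empirical quantile rests on only ~10 order statistics out of 10,000 points, which is why model selection for the tail of the residual distribution matters so much in this setting.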

  13. Provably secure identity-based identification and signature schemes from code assumptions.

    Directory of Open Access Journals (Sweden)

    Bo Song

    Full Text Available Code-based cryptography is one of the few alternatives believed to remain secure in a post-quantum world. Meanwhile, identity-based identification and signature (IBI/IBS) schemes are two of the most fundamental cryptographic primitives, and several code-based IBI/IBS schemes have been proposed. However, as research on coding theory has deepened, the security reductions and efficiency of such schemes have been invalidated and challenged. In this paper, we construct IBI/IBS schemes from code assumptions that are provably secure against impersonation under active and concurrent attacks, using a provably secure code-based signature technique proposed by Preetha, Vasant and Rangan (the PVR signature) and a security-enhancing Or-proof technique. We also present the parallel-PVR technique to decrease parameter values while maintaining the standard security level. Compared to other code-based IBI/IBS schemes, our schemes achieve not only preferable public parameter size, private key size, communication cost and signature length due to better parameter choices, but also provable security.

  14. Personal and Communal Assumptions to Determine Pragmatic Meanings of Phatic Functions

    Directory of Open Access Journals (Sweden)

    Kunjana Rahardi

    2016-11-01

    Full Text Available This research was meant to describe the manifestations of the phatic function in the education domain. The phatic function in the communication and interaction happening in the education domain could be accurately identified when the utterances were not separated from their determining pragmatic context. The context must not be limited only to contextual and social or societal perspectives, but must be defined as basic assumptions. The data of this research included various kinds of speech containing phatic functions, gathered naturally in education circles. Two methods of data gathering were employed in this study, namely listening and conversation methods. Recorded data were analyzed through the following steps: (1) data were identified based on the discourse markers found; (2) data were classified based on the phatic perception criteria; (3) data were interpreted based on the referenced theories; (4) data were described in the form of an analysis result description. The research proves that the phatic function in the form of small talk in the education domain cannot be separated from the context surrounding it.

  15. Holistic approach to education and upbringing: Contradictory to the general assumption of life

    Directory of Open Access Journals (Sweden)

    Mihajlović Ljubiša M.

    2014-01-01

    Full Text Available Holistic education is a comprehensive view of education based on the assumption that each individual finds his own identity, meaning and objective in life through the connection with the community, nature and human values such as compassion and peace. Within holistic education the teacher is viewed not as an authority figure who guides and controls, but rather as a 'friend', a facilitator of learning: a guide and a companion in gaining experience. The norm is cooperation rather than competition. However, is this possible in real life? The answer is simple: it is not. The reason lies in the foundation of life itself: a molecule built in such a way that it does not permit such an idealistic approach to life, and therefore, to education. It is the DNA molecule: the molecule of life exhibiting, among other characteristics, the drive to procreate and the tendency toward eternal struggle and competition. This is in stark opposition to the holistic approach to education, which does not recognize competition, struggle, gradation and rivalry. The development of an advanced and socially responsible society demands a partial, measured application of holism. This needs to be reflected in education as well: approved competition, clear and fair gradation, with the best in certain areas becoming the elite, and solutions found for the rest in accordance with their abilities.

  16. Validity of the assumption of Gaussian turbulence; Gyldighed af antagelsen om Gaussisk turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, M.; Hansen, K.S.; Juul Pedersen, B.

    2000-07-01

    Wind turbines are designed to withstand the impact of turbulent winds, whose fluctuations are usually assumed to follow a Gaussian probability distribution. Based on a large number of measurements from many sites, this seems a reasonable assumption in flat homogeneous terrain, whereas it may fail in complex terrain. At such sites the wind speed often has a skewed distribution with more frequent lulls than gusts. In order to simulate aerodynamic loads, a numerical turbulence simulation method was developed and implemented. This method can simulate multiple time series of a variable with a not necessarily Gaussian distribution, without distortion of the spectral distribution or spatial coherence. The simulated time series were used as input to the dynamic-response simulation program Vestas Turbine Simulator (VTS). In this way we simulated the dynamic response of systems exposed to turbulence of either Gaussian or extreme, yet realistic, non-Gaussian probability distribution. Certain loads on turbines with active pitch regulation were enhanced by up to 15% compared to pure Gaussian turbulence. It should, however, be said that the undesired effect depends on the dynamic system, and it might be mitigated by optimising the wind turbine regulation system for local turbulence characteristics. (au)
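One common way to obtain non-Gaussian turbulence with a prescribed spectrum (a "translation process" sketch, not necessarily the report's implementation) is to synthesize a Gaussian series by randomizing spectral phases and then pass it through a monotonic nonlinearity that skews the marginal while preserving the rank ordering. The spectrum and transform below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
freqs = np.fft.rfftfreq(n, d=1.0)
power = 1.0 / (1.0 + (freqs / 0.05) ** (5.0 / 3.0))   # toy spectrum

# Gaussian series with the target spectrum, via random phases:
coeffs = np.sqrt(power) * np.exp(1j * rng.uniform(0, 2 * np.pi, freqs.size))
coeffs[0] = 0.0                                       # zero mean
gauss = np.fft.irfft(coeffs, n)
gauss /= gauss.std()

# Strictly increasing transform => same rank order, skewed marginal
# (stretched lower tail for deep lulls, compressed upper tail):
skewed = -np.expm1(-0.5 * gauss) / 0.5

skew = float(np.mean((skewed - skewed.mean()) ** 3) / skewed.std() ** 3)
```

Because the transform is monotonic, the temporal structure of the Gaussian parent is largely retained while the one-point distribution becomes asymmetric, which is the property needed to feed realistic non-Gaussian turbulence into a load simulator like VTS.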

  17. Public-private partnerships to improve primary healthcare surgeries: clarifying assumptions about the role of private provider activities.

    Science.gov (United States)

    Mudyarabikwa, Oliver; Tobi, Patrick; Regmi, Krishna

    2017-07-01

    Aim: To examine assumptions about public-private partnership (PPP) activities and their role in improving public procurement of primary healthcare surgeries. PPPs were developed to improve the quality of care and patient satisfaction. However, evidence of their effectiveness in delivering health benefits is limited. A qualitative study design was employed. A total of 25 interviews with public sector staff (n=23) and private sector managers (n=2) were conducted to understand their interpretations of assumptions in the activities of private investors and service contractors participating in Local Improvement Finance Trust (LIFT) partnerships. Realist evaluation principles were applied in the data analysis to interpret the findings. Six thematic areas of assumed health benefits were identified: (i) quality improvement; (ii) improved risk management; (iii) reduced procurement costs; (iv) increased efficiency; (v) community involvement; and (vi) sustainable investment. Primary Care Trusts that chose to procure their surgeries through LIFT were expected to support its implementation by providing an environment conducive for the private participants to achieve these benefits. Private participant activities were found to be based on a range of explicit and tacit assumptions perceived helpful in achieving government objectives for LIFT. The success of PPPs depended upon private participants' (i) capacity to assess how PPP assumptions added value to their activities, (ii) effectiveness in interpreting assumptions in their expected activities, and (iii) preparedness to align their business principles to government objectives for PPPs. They risked missing some of the expected benefits because of factors constraining realization of the assumptions. The ways in which private participants preferred to carry out their activities also influenced the extent to which expected benefits were achieved. Giving more discretion to public than private participants over critical

  18. Sensitivity of Hydrological Model Simulations to Underlying Assumptions in a Stochastic Downscaling Method

    Science.gov (United States)

    Sapriza, Gonzalo; Jodar, Jorge; Carrera, Jesús; Gupta, Hoshin V.

    2013-04-01

    Climate Change Impacts Studies (CCIS) for Water Resources Management (WRM) are of crucial importance for the human community and especially for water-scarce Mediterranean-like regions, where the available water is expected to decrease due to climate change. General Circulation Models (GCM) are one of the most valuable tools available to perform CCIS. However, they cannot be directly applied to water resources evaluations due to their coarse spatial resolution and bias in their simulation of certain outputs, especially precipitation. Downscaling methods have been developed to address this problem, by defining statistical relationships between the variables simulated by GCMs and local observations. Once these relationships are defined and tested via post evaluation during a control period, the relationship is used to generate synthetic time series for the future, based on the different future climate scenarios simulated by the GCMs. For CCIS in WRM, synthetic time series of precipitation and temperature are applied as input variables to run hydrological models and obtain future projections of hydrological response. The main drawbacks of this procedure are: (1) we inevitably have to assume time stationarity of the downscaling parameters (which in principle can vary with climate change), and (2) the downscaling parameterizations are another source of model uncertainty that must be quantified and communicated. Here, we evaluate the sensitivity of hydrological model simulations to assumptions underlying a downscaling method based on a Stochastic Rainfall Generating process (SRGP). The method is used to demonstrate that exact daily rainfall sequences are not necessary for climate impacts assessment, and that the "stochastically equivalent" rainfall sequence simulations provided by the model are both sufficient, and provide important added value in terms of realistic assessments of uncertainty. The method also establishes which parameters of the rainfall generating
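A minimal stochastic rainfall generating process of the kind alluded to above can be sketched as a two-state Markov chain for wet/dry occurrence plus a gamma distribution for wet-day depth. This is an illustration of the idea only (the paper's SRGP and its GCM conditioning are more elaborate), and every parameter value below is invented.

```python
import random

random.seed(42)
P_WET_GIVEN_DRY, P_WET_GIVEN_WET = 0.25, 0.60   # occurrence persistence
SHAPE, SCALE = 0.8, 8.0                         # wet-day depth ~ Gamma (mm)

def simulate(n_days):
    """One 'stochastically equivalent' daily rainfall realization."""
    wet, series = False, []
    for _ in range(n_days):
        p = P_WET_GIVEN_WET if wet else P_WET_GIVEN_DRY
        wet = random.random() < p
        series.append(random.gammavariate(SHAPE, SCALE) if wet else 0.0)
    return series

rain = simulate(36500)                  # ~100 years of daily rainfall
wet_days = sum(r > 0 for r in rain)
wet_frac = wet_days / len(rain)         # stationary value: 0.25/0.65 ≈ 0.385
mean_depth = sum(rain) / wet_days       # ≈ SHAPE * SCALE = 6.4 mm
```

Running an ensemble of such realizations through a hydrological model is what yields distributions, rather than single values, of future response, which is the "added value in terms of realistic assessments of uncertainty" claimed above.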

  19. Matrix Diffusion for Performance Assessment - Experimental Evidence, Modelling Assumptions and Open Issues

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, A

    2004-07-01

    In this report a comprehensive overview of the matrix diffusion of solutes in fractured crystalline rocks is presented. Some examples from observations in crystalline bedrock are used to illustrate that matrix diffusion indeed acts on various length scales. Fickian diffusion is discussed in detail, followed by some considerations on rock porosity. Because the dual-porosity medium model is a very common and versatile method for describing solute transport in fractured porous media, the transport equations and the fundamental assumptions, approximations and simplifications are discussed in detail. There is a variety of geometrical aspects, processes and events which could influence matrix diffusion. The most important of these, such as the effect of the flow-wetted fracture surface, channelling, and the limited extent of the porous rock available for matrix diffusion, are addressed. In a further section open issues and unresolved problems related to matrix diffusion are mentioned. Since matrix diffusion is one of the key retarding processes in geosphere transport of dissolved radionuclide species, it was consequently taken into account in past performance assessments of radioactive waste repositories in crystalline host rocks. Some issues regarding matrix diffusion are site-specific while others are independent of the specific situation of a planned repository for radioactive wastes. Eight different performance assessments from Finland, Sweden and Switzerland were considered with the aim of finding out how matrix diffusion was addressed, and whether a consistent picture emerges regarding the varying methodology of the different radioactive waste organisations. In the final section of the report some conclusions are drawn and an outlook is given. An extensive bibliography provides the reader with the key papers and reports related to matrix diffusion. (author)
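The simplest Fickian picture of matrix diffusion can be written down directly: for a semi-infinite, water-saturated matrix whose fracture wall is held at a constant concentration, and with sorption and decay ignored, the relative concentration follows a complementary error function profile. The pore diffusivity and times below are hypothetical round numbers, not site-specific values.

```python
import math

def matrix_profile(x, t, dp):
    """C/C0 at depth x (m) into the matrix after contact time t (s),
    for constant concentration at the fracture wall, no sorption/decay:
    C/C0 = erfc(x / (2*sqrt(dp*t)))."""
    return math.erfc(x / (2.0 * math.sqrt(dp * t)))

DP = 1.0e-11                       # assumed pore diffusivity, m^2/s
YEAR = 3.15e7                      # seconds per year

c_1cm = matrix_profile(0.01, 30 * YEAR, DP)    # 1 cm in, after 30 years
c_10cm = matrix_profile(0.10, 30 * YEAR, DP)   # 10 cm in, after 30 years
```

The penetration depth scales as sqrt(dp*t), reaching centimetres to decimetres over decades in this toy setting, which illustrates why matrix diffusion acts on various length scales and is a key retarding process over performance-assessment timescales.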

  20. Microwave Properties of Ice-Phase Hydrometeors for Radar and Radiometers: Sensitivity to Model Assumptions

    Science.gov (United States)

    Johnson, Benjamin T.; Petty, Grant W.; Skofronick-Jackson, Gail

    2012-01-01

    A simplified framework is presented for assessing the qualitative sensitivities of computed microwave properties, satellite brightness temperatures, and radar reflectivities to assumptions concerning the physical properties of ice-phase hydrometeors. Properties considered included the shape parameter of a gamma size distribution and the melted-equivalent mass median diameter D0, the particle density, dielectric mixing formula, and the choice of complex index of refraction for ice. We examine these properties at selected radiometer frequencies of 18.7, 36.5, 89.0, and 150.0 GHz; and radar frequencies at 2.8, 13.4, 35.6, and 94.0 GHz consistent with existing and planned remote sensing instruments. Passive and active microwave observables of ice particles are found to be extremely sensitive to the melted-equivalent mass median diameter D0 of the size distribution. Similar large sensitivities are found for variations in the ice volume fraction whenever the geometric mass median diameter exceeds approximately 1/8th of the wavelength. At 94 GHz the two-way path-integrated attenuation is potentially large for dense compact particles. The distribution parameter mu has a relatively weak effect on any observable: less than 1-2 K in brightness temperature and up to 2.7 dB difference in the effective radar reflectivity. Reversal of the roles of ice and air in the Maxwell Garnett dielectric mixing formula leads to a significant change in both microwave brightness temperature (10 K) and radar reflectivity (2 dB). The choice of Warren (1984) or Warren and Brandt (2008) for the complex index of refraction of ice can produce a 3%-4% change in the brightness temperature depression.
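The ice/air role-reversal effect mentioned above can be reproduced with the standard Maxwell Garnett mixing rule: treating ice as inclusions in an air matrix, or air as inclusions in an ice matrix, yields different effective permittivities for the same ice volume fraction. The permittivity values below are illustrative real parts only, not the paper's frequency-dependent values.

```python
def maxwell_garnett(eps_incl, eps_matrix, f_incl):
    """Maxwell Garnett effective permittivity for spherical inclusions
    of permittivity eps_incl, volume fraction f_incl, in a host of
    permittivity eps_matrix."""
    num = eps_incl + 2 * eps_matrix + 2 * f_incl * (eps_incl - eps_matrix)
    den = eps_incl + 2 * eps_matrix - f_incl * (eps_incl - eps_matrix)
    return eps_matrix * num / den

eps_ice, eps_air, f_ice = 3.15, 1.0, 0.3   # illustrative values

ice_in_air = maxwell_garnett(eps_ice, eps_air, f_ice)        # ~1.43
air_in_ice = maxwell_garnett(eps_air, eps_ice, 1.0 - f_ice)  # ~1.53
```

The two orderings bracket the true effective permittivity of a real snow/graupel particle; since scattering and absorption depend nonlinearly on permittivity, this ambiguity alone is enough to shift brightness temperatures and reflectivities by the magnitudes quoted in the abstract.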

  1. Interrogating Commonly Applied Initial Condition Assumptions in Geospeedometry using NanoSIMS

    Science.gov (United States)

    Till, C. B.; Boyce, J. W.

    2014-12-01

    The geologically short (days to centuries) timescales associated with thermochemical changes in magma chambers during the prelude to eruption are typically beyond the resolution of long-lived radioisotopic geochronometers but can be resolved by "geospeedometry" that quantifies the relatively rapid diffusional relaxation of compositional zoning in igneous phenocrysts. When combined with absolute dating, geospeedometry can reveal long-term chronologies of compositional and thermal oscillations in magma chambers. The ability to accurately constrain timescales via geospeedometry is limited in part by the spatial resolution of commonly used analytical techniques and the derivative chemical profiles, especially in the case of very short timescales or very slowly diffusing elements, where the chemical profile is often approximated as a step function. We present geospeedometry of chemical profiles collected with the NanoSIMS ion microprobe in multi-collection mode with sub-micron resolution (e.g., 0.3 micron spacing). These data facilitate a comparison of how well the timescales calculated with the different experimentally determined diffusivities, collected along the same profile, agree, and an interrogation of commonly made model assumptions in geospeedometry. For example, electron probe profiles across internal sanidine zone boundaries from a ca. 250 ka rhyolite lava from Yellowstone Caldera reveal a step function in anorthite content at 10 micron spacing and evidence for earlier dissolution prior to new zone growth, yet we find no observable difference in the width of Ba, Sr and Mg diffusion profiles collected via NanoSIMS for the same profile. These observations support the hypothesis that very little to no diffusive relaxation has affected the initial concentration profile that was produced by crystal growth during magma mixing. Our results highlight the need to quantitatively constrain initial conditions when applying geospeedometry to intermediate to silicic

  2. Influence of road network and population demand assumptions in evacuation modeling for distant tsunamis

    Science.gov (United States)

    Henry, Kevin; Wood, Nathan J.; Frazier, Tim G.

    2017-01-01

    Tsunami evacuation planning in coastal communities is typically focused on local events where at-risk individuals must move on foot in a matter of minutes to safety. Less attention has been placed on distant tsunamis, where evacuations unfold over several hours, are often dominated by vehicle use and are managed by public safety officials. Traditional traffic simulation models focus on estimating clearance times but often overlook the influence of varying population demand, alternative modes, background traffic, shadow evacuation, and traffic management alternatives. These factors are especially important for island communities with limited egress options to safety. We use the coastal community of Balboa Island, California (USA), as a case study to explore the range of potential clearance times prior to wave arrival for a distant tsunami scenario. We use a first-in–first-out queuing simulation environment to estimate variations in clearance times, given varying assumptions of the evacuating population (demand) and the road network over which they evacuate (supply). Results suggest clearance times are less than wave arrival times for a distant tsunami, except when we assume maximum vehicle usage for residents, employees, and tourists for a weekend scenario. A two-lane bridge to the mainland was the primary traffic bottleneck, thereby minimizing the effect of departure times, shadow evacuations, background traffic, boat-based evacuations, and traffic light timing on overall community clearance time. Reducing vehicular demand generally reduced clearance time, whereas improvements to road capacity had mixed results. Finally, failure to recognize non-residential employee and tourist populations in the vehicle demand substantially underestimated clearance time.
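The first-in-first-out queuing idea above can be reduced to a toy single-bottleneck model: all evacuating vehicles must cross one egress (e.g. a two-lane bridge) that discharges at a fixed rate. This is purely illustrative; the function name and the numbers are assumptions, not values from the study:

```python
def clearance_time_minutes(vehicles, capacity_per_min):
    """Toy FIFO bottleneck clearance model. With a single dominating
    bottleneck, clearance time reduces to queue length divided by the
    service rate, rounded up - which is why the study found departure
    timing and traffic-light details mattered little."""
    minutes, queue = 0, vehicles
    while queue > 0:
        queue -= min(queue, capacity_per_min)
        minutes += 1
    return minutes

# Hypothetical demand: 3000 vehicles through a bridge passing 25 veh/min.
t_clear = clearance_time_minutes(3000, 25)  # 120 minutes
```

Real evacuation simulators add time-varying departure curves, background traffic, and network routing, but the bottleneck arithmetic dominates when egress options are this limited.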

  3. Assessing pricing assumptions for weather index insurance in a changing climate

    Directory of Open Access Journals (Sweden)

    J.D. Daron

    2014-01-01

    Weather index insurance is being offered to low-income farmers in developing countries as an alternative to traditional multi-peril crop insurance. There is widespread support for index insurance as a means of climate change adaptation, but whether or not these products are themselves resilient to climate change has not been well studied. Given climate variability and climate change, an over-reliance on historical climate observations to guide the design of such products can result in premiums which mislead policyholders and insurers alike about the magnitude of underlying risks. Here, a method to incorporate different sources of climate data into the product design phase is presented. Bayesian Networks are constructed to demonstrate how insurers can assess product viability from a climate perspective, using past observations and simulations of future climate. Sensitivity analyses illustrate the dependence of pricing decisions on both the choice of information and the method for incorporating such data. The methods and their sensitivities are illustrated using a case study analysing the provision of index-based crop insurance in Kolhapur, India. We expose the benefits and limitations of the Bayesian Network approach, weather index insurance as an adaptation measure and climate simulations as a source of quantitative predictive information. Current climate model output is shown to be of limited value and difficult to use by index insurance practitioners. The method presented, however, is shown to be an effective tool for testing pricing assumptions and could feasibly be employed in the future to incorporate multiple sources of climate data.
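For contrast with the Bayesian Network approach, the naive "burn rate" pricing that relies only on historical observations, which the article argues can mislead under climate change, might be sketched as follows. The contract terms, names, and numbers are hypothetical:

```python
def burn_rate_premium(rainfall_mm, trigger_mm, payout_per_mm, max_payout,
                      loading=0.2):
    """Illustrative 'burn rate' pricing for a rainfall-deficit index
    product: the premium is the historical mean payout plus a
    proportional loading. Under a changing climate, this historical mean
    can misstate the true underlying risk."""
    payouts = [min(max(trigger_mm - r, 0.0) * payout_per_mm, max_payout)
               for r in rainfall_mm]
    return (sum(payouts) / len(payouts)) * (1.0 + loading)

# Hypothetical 4-year rainfall record, 90 mm trigger, 1 unit per mm deficit,
# payout capped at 100, 20% loading.
premium = burn_rate_premium([100.0, 80.0, 120.0, 60.0], 90.0, 1.0, 100.0)
```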

  4. A critical assessment of the equal environment assumption of the twin method for schizophrenia

    Directory of Open Access Journals (Sweden)

    Roar eFosse

    2015-04-01

    The classical twin method (CTM) is central to the view that schizophrenia is ~80% heritable. The CTM rests on the equal environments assumption (EEA) that identical and fraternal twin pairs experience equivalent trait-relevant environmental exposures. The EEA has not been directly tested for schizophrenia with measures of child social adversity, which is particularly etiologically relevant to the disorder. However, if child social adversity is more similar in identical than fraternal pairs in the general twin population, the EEA is unlikely to be valid for schizophrenia, a question which we tested in this study. Using results from prior twin studies, we tested whether intraclass correlations for the following five categories of child social adversity are larger in identical than fraternal twins: bullying, sexual abuse, physical maltreatment, emotional neglect and abuse, and general trauma. Eleven relevant studies that encompassed 9119 twin pairs provided 24 comparisons of intraclass correlations, which we grouped into the five social exposure categories. Fisher's z-test revealed significantly higher correlations in identical than fraternal pairs for each exposure category (z ≥ 3.53, p < .001). The difference remained consistent across gender, study site (country), sample size, whether psychometric instruments were used, whether interviewing was proximate or distant to the exposures, and whether informants were twins or third persons. Combined with other evidence that the differential intraclass correlation for child social adversity cannot be explained by evocative gene-environment covariation, our results indicate that the CTM does not provide any valid indication of genomic effects in schizophrenia.
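The comparison of intraclass correlations between identical and fraternal pairs rests on Fisher's r-to-z transformation; a minimal sketch of the test statistic (the sample sizes in the example are hypothetical, not taken from the study):

```python
import math

def fisher_z_test(r1, n1, r2, n2):
    """Compare two independent correlations via Fisher's r-to-z
    transform. Returns the z statistic; |z| >= 1.96 corresponds roughly
    to p < .05 (two-sided) under the normal approximation."""
    z1 = math.atanh(r1)                     # Fisher transform of r1
    z2 = math.atanh(r2)
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    return (z1 - z2) / se

# Hypothetical: r = .6 in 103 identical pairs vs r = .3 in 103 fraternal pairs.
z = fisher_z_test(0.6, 103, 0.3, 103)
```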

  5. Deciphering assumptions about stepped wedge designs: the case of Ebola vaccine research.

    Science.gov (United States)

    Doussau, Adélaïde; Grady, Christine

    2016-12-01

    Ethical concerns about randomising persons to a no-treatment arm in the context of the Ebola epidemic led to consideration of alternative designs. The stepped wedge (SW) design, in which participants or clusters are randomised to receive an intervention at different time points, gained popularity. Common arguments in favour of using this design are (1) that the intervention is likely to do more good than harm, (2) that all participants should receive the experimental intervention at some time point during the study and (3) that the design might be preferable for practical reasons. We examine these assumptions when considering Ebola vaccine research. First, based on the claim that a stepped wedge design is indicated when it is likely that the intervention will do more good than harm, we reviewed published and ongoing SW trials to explore previous use of this design to test experimental drugs or vaccines, and found that the SW design has never been used for trials of experimental drugs or vaccines. Given that Ebola vaccines were all experimental with no prior efficacy data, the use of a stepped wedge design would have been unprecedented. Second, we show that it is rarely true that all participants receive the intervention in SW studies; rather, depending on certain design features, all clusters receive the intervention. Third, we explore whether the SW design is appealing for feasibility reasons and point out that there is significant complexity. In the setting of the Ebola epidemic, spatiotemporal variation may have posed problematic challenges to a stepped wedge design for vaccine research. Finally, we propose a set of points to consider for scientific reviewers and ethics committees regarding proposals for SW designs.

  6. The French bioethics public consultation and the anonymity doctrine: empirical ethics and normative assumptions.

    Science.gov (United States)

    Spranzi, Marta; Brunet, Laurence

    2015-03-01

    The French bioethics laws of 1994 contain the principles of the anonymity and non commodification of all donations of body parts and products including gametes in medically assisted reproduction. The two revisions of the law, in 2004 and 2011 have upheld the rule. In view of the latest revision process, the French government organized a large public consultation in 2009 ("Etats généraux de la bioéthique"). Within the event a "consensus conference" was held in Rennes about different aspects of assisted reproduction (access, anonymity, gratuity and surrogacy). In what follows we shall first describe the anonymity clause for gamete donations in the French law and the debates surrounding it. We shall then analyse the procedure used for the 2009 public consultation and the related consensus conference, as well as its upshot concerning the anonymity doctrine. In this respect we shall compare the citizens' own recommendations on the gamete anonymity issue and its translation in the consultation's final report drafted by a philosopher mandated by the organizing committee. Whereas the final report cited some fundamental ethical arguments as reason for upholding the provisions of the law-most notably the refusal of the 'all biological' approach to reproductive issues-citizens were more careful and tentative in their position although they also concluded that for pragmatic reasons the anonymity rule should continue to hold. We shall argue that the conservative upshot of the public consultation is due to some main underlying presuppositions concerning the citizens' role and expertise as well as to the specific design of the consensus conference. Our conclusion will be that public consultations and consensus conferences can only serve as an empirical support for devising suitable bioethics norms by using second-order normative assumptions.

  7. The biosphere at Forsmark. Data, assumptions and models used in the SR-Can assessment

    Energy Technology Data Exchange (ETDEWEB)

    Karlsson, Sara; Kautsky, Ulrik; Loefgren, Anders; Soederbaeck, Bjoern (eds.)

    2006-10-15

    This report summarises the method adopted for safety assessment following a radionuclide release into the biosphere. The approach utilises the information about the site as far as possible and presents a way of calculating risk to humans. One parameter is topography, where there is good understanding of the present conditions and the development over time is fairly predictable. The topography affects surface hydrology, sedimentation, size of drainage areas and the characteristics of ecosystems. Other parameters are human nutritional intake, which is assumed to be constant over time, and primary production (photosynthesis), which is also fairly constant over time. The Landscape Dose Factor (LDF) approach gives an integrated measure for the site and also resolves the issues relating to the size of the group with the highest exposure. If this approach is to be widely accepted as a method, some improvements and refinements are still necessary, e.g. collecting missing site data, reanalysing site data, reviewing radionuclide-specific data, reformulating ecosystem models and evaluating the results with further sensitivity analysis. The report presents descriptions and estimates not presented elsewhere, as well as summaries of important steps in the biosphere modelling that are presented in more detail in separate reports. The intention is to give the reader a coherent description of the steps taken to calculate doses to biota and humans, including a description of the data used, the rationale for a number of assumptions made during parameterisation, and of how the landscape context is applied in the modelling, and also to present the models used and the results obtained.
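At its simplest, the Landscape Dose Factor idea reduces the dose calculation to a weighted sum over released radionuclides; a schematic sketch (the nuclide releases and LDF values below are invented for illustration and carry no relation to the Forsmark assessment):

```python
def annual_dose(releases_bq_per_yr, ldfs_sv_per_bq):
    """Schematic LDF-style calculation: effective dose to the most
    exposed group = sum over radionuclides of release rate (Bq/yr)
    times that nuclide's landscape dose factor (Sv per Bq released)."""
    return sum(r * f for r, f in zip(releases_bq_per_yr, ldfs_sv_per_bq))

# Hypothetical two-nuclide release:
dose_sv_per_yr = annual_dose([1.0e6, 2.0e5], [3.0e-13, 5.0e-12])
```

The actual assessment layers radionuclide transport, landscape evolution, and ecosystem models behind each LDF; only the final aggregation step is this simple.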

  8. On the accuracy of calculation of the mean residence time of drug in the body and its volumes of distribution based on the assumption of central elimination.

    Science.gov (United States)

    Berezhkovskiy, Leonid M

    2016-01-01

    1. The steady state and terminal volumes of distribution, as well as the mean residence time of drug in the body (Vss, Vβ, and MRT), are common pharmacokinetic parameters calculated using the drug plasma concentration-time profile (Cp(t)) following intravenous (iv bolus or constant-rate infusion) drug administration. 2. These traditional calculations are valid for a linear pharmacokinetic system with central elimination (i.e. elimination rate proportional to drug concentration in plasma). The assumption of central elimination is not valid in general, so the accuracy of the traditional calculation of these parameters is uncertain. 3. The comparison of Vss, Vβ, and MRT calculated by the derived exact equations and by the commonly used ones was made considering a physiological model. It turned out that the difference between the exact and simplified calculations does not exceed 2%. 4. Thus the calculations of Vss, Vβ, and MRT that are based on the assumption of central elimination may be considered quite accurate. Consequently they can be used as a standard for comparisons with kinetic and in silico models.
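The traditional noncompartmental calculation benchmarked here follows CL = dose/AUC, MRT = AUMC/AUC, and Vss = CL·MRT. A minimal trapezoidal sketch (no tail extrapolation; the function name and the mono-exponential test profile are illustrative assumptions):

```python
import math

def nca_iv_bolus(times, conc, dose):
    """Noncompartmental estimates after an iv bolus, using trapezoidal
    AUC (area under Cp(t)) and AUMC (area under t*Cp(t)) over the
    sampled profile only. Returns (CL, MRT, Vss)."""
    auc = aumc = 0.0
    for i in range(len(times) - 1):
        t0, t1 = times[i], times[i + 1]
        c0, c1 = conc[i], conc[i + 1]
        auc += 0.5 * (c0 + c1) * (t1 - t0)
        aumc += 0.5 * (t0 * c0 + t1 * c1) * (t1 - t0)
    cl = dose / auc          # clearance
    mrt = aumc / auc         # mean residence time
    return cl, mrt, cl * mrt # Vss = CL * MRT

# Synthetic one-compartment profile: Cp = 10 * exp(-0.1 t) after dose 100,
# so the true values are CL = 1, MRT = 10, Vss = 10.
times = [i * 0.1 for i in range(1001)]
conc = [10.0 * math.exp(-0.1 * t) for t in times]
cl, mrt, vss = nca_iv_bolus(times, conc, dose=100.0)
```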

  9. Long-term impact of parental divorce on optimism and trust: changes in general assumptions or narrow beliefs?

    Science.gov (United States)

    Franklin, K M; Janoff-Bulman, R; Roberts, J E

    1990-10-01

    Two studies were conducted to examine the long-term impact of parental divorce on beliefs about the self and others. In Study 1, college-aged children of divorce and students from intact families did not differ on 8 basic assumptions or on measures of depression. Those whose parents were divorced, however, were less optimistic about the success of their own future marriages. Assumptions about the benevolence of people best predicted the marital optimism of the parental divorce group, but not of the intact family group. In Study 2, assumptions about the benevolence of people were explored in terms of trust beliefs. College-aged children of divorce and a matched sample from intact homes differed only on marriage-related beliefs, not on generalized trust. Children of divorce reported less trust of a future spouse and were less optimistic about marriage. Exploratory analyses found that continuous conflict in the family of origin adversely affected all levels of trust.

  10. Testing for Sufficient-Cause Gene-Environment Interactions Under the Assumptions of Independence and Hardy-Weinberg Equilibrium.

    Science.gov (United States)

    Lee, Wen-Chung

    2015-07-01

    To detect gene-environment interactions, a logistic regression model is typically fitted to a set of case-control data, and the focus is on testing of the cross-product terms (gene × environment) in the model. A significant result is indicative of a gene-environment interaction under a multiplicative model for disease odds. Based on the sufficient-cause model for rates, in this paper we put forward a general approach to testing for sufficient-cause gene-environment interactions in case-control studies. The proposed tests can be tailored to detect a particular type of sufficient-cause gene-environment interaction with greater sensitivity. These tests include testing for autosomal dominant, autosomal recessive, and gene-dosage interactions. The tests can also detect trend interactions (e.g., a larger gene-environment interaction with a higher level of environmental exposure) and threshold interactions (e.g., gene-environment interaction occurs only when environmental exposure reaches a certain threshold level). Two assumptions are necessary for the validity of the tests: 1) the rare-disease assumption and 2) the no-redundancy assumption. Another 2 assumptions are optional but, if imposed correctly, can boost the statistical powers of the tests: 3) the gene-environment independence assumption and 4) the Hardy-Weinberg equilibrium assumption. SAS code (SAS Institute, Inc., Cary, North Carolina) for implementing the methods is provided.
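The optional Hardy-Weinberg equilibrium assumption pins down the expected genotype frequencies from the risk-allele frequency alone, which is what allows tests to be tailored to dominant or recessive models; a minimal sketch (function names are illustrative, not from the paper's SAS code):

```python
def genotype_freqs(q):
    """Expected genotype frequencies under Hardy-Weinberg equilibrium,
    for risk-allele frequency q: aa = p^2, aA = 2pq, AA = q^2."""
    p = 1.0 - q
    return {"aa": p * p, "aA": 2.0 * p * q, "AA": q * q}

def carrier_prob(q, model):
    """Probability of carrying a 'susceptible' genotype under a given
    inheritance model."""
    f = genotype_freqs(q)
    if model == "dominant":    # one copy of the risk allele suffices
        return f["aA"] + f["AA"]
    if model == "recessive":   # two copies required
        return f["AA"]
    raise ValueError(model)

# Example: risk-allele frequency 0.2.
f = genotype_freqs(0.2)
```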

  11. Applying Human Factors Principles to Mitigate Usability Issues Related to Embedded Assumptions in Health Information Technology Design.

    Science.gov (United States)

    Gibbons, Michael C; Lowry, Svetlana Z; Patterson, Emily S

    2014-12-18

    There is growing recognition that design flaws in health information technology (HIT) lead to increased cognitive work, impact workflows, and produce other undesirable user experiences that contribute to usability issues and, in some cases, patient harm. These usability issues may in turn contribute to HIT utilization disparities and patient safety concerns, particularly among "non-typical" HIT users and their health care providers. Health care disparities are associated with poor health outcomes, premature death, and increased health care costs. HIT has the potential to reduce these disparate outcomes. In the computer science field, it has long been recognized that embedded cultural assumptions can reduce the usability, usefulness, and safety of HIT systems for populations whose characteristics differ from "stereotypical" users. Among these non-typical users, inappropriate embedded design assumptions may contribute to health care disparities. It is unclear how to address potentially inappropriate embedded HIT design assumptions once detected. The objective of this paper is to explain HIT universal design principles derived from the human factors engineering literature that can help to overcome potential usability and/or patient safety issues that are associated with unrecognized, embedded assumptions about cultural groups when designing HIT systems. Existing best practices, guidance, and standards in software usability and accessibility were subjected to a 5-step expert review process to identify and summarize those best practices, guidance, and standards that could help identify and/or address embedded design assumptions in HIT that could negatively impact patient safety, particularly for non-majority HIT user populations. An iterative consensus-based process was then used to derive evidence-based design principles from the data to address potentially inappropriate embedded cultural assumptions. Design principles that may help identify and address embedded HIT

  12. Applying Human Factors Principles to Mitigate Usability Issues Related to Embedded Assumptions in Health Information Technology Design

    Science.gov (United States)

    Lowry, Svetlana Z; Patterson, Emily S

    2014-01-01

    Background There is growing recognition that design flaws in health information technology (HIT) lead to increased cognitive work, impact workflows, and produce other undesirable user experiences that contribute to usability issues and, in some cases, patient harm. These usability issues may in turn contribute to HIT utilization disparities and patient safety concerns, particularly among “non-typical” HIT users and their health care providers. Health care disparities are associated with poor health outcomes, premature death, and increased health care costs. HIT has the potential to reduce these disparate outcomes. In the computer science field, it has long been recognized that embedded cultural assumptions can reduce the usability, usefulness, and safety of HIT systems for populations whose characteristics differ from “stereotypical” users. Among these non-typical users, inappropriate embedded design assumptions may contribute to health care disparities. It is unclear how to address potentially inappropriate embedded HIT design assumptions once detected. Objective The objective of this paper is to explain HIT universal design principles derived from the human factors engineering literature that can help to overcome potential usability and/or patient safety issues that are associated with unrecognized, embedded assumptions about cultural groups when designing HIT systems. Methods Existing best practices, guidance, and standards in software usability and accessibility were subjected to a 5-step expert review process to identify and summarize those best practices, guidance, and standards that could help identify and/or address embedded design assumptions in HIT that could negatively impact patient safety, particularly for non-majority HIT user populations. An iterative consensus-based process was then used to derive evidence-based design principles from the data to address potentially inappropriate embedded cultural assumptions. Results Design principles that

  13. Mathematical formulae to estimate chronic subdural haematoma volume. Flawed assumption regarding ellipsoid morphology.

    Science.gov (United States)

    Manickam, Appukutty; Marshman, Laurence A G; Johnston, Ross; Thomas, Piers A W

    2017-06-01

    Mathematical formulae are commonly used to estimate intra-cranial haematoma volume. Such formulae tacitly assume an ellipsoid geometrical morphology. Recently, the 'XYZ/2' formula has been validated and recommended for chronic subdural haematoma (CSDH) volumetric estimation. We aimed to assess the precision and accuracy of mathematical formulae specifically in estimating CSDH volume, and to determine typical CSDH 3-D morphology. Three extant formulae ('XYZ/2', 'π/6·XYZ' and '2/3S·h') were compared against computer-assisted 3D volumetric analysis as Gold standard in CTs where CSDH sufficiently contrasted with brain. Scatter-plots (n=45) indicated that, in contrast to prior reports, all formulae most commonly over-estimated CSDH volume against 3-D Gold standard ('2/3S·h': 44.4%, 'XYZ/2': 48.84% and 'π/6·XYZ': 55.6%). With all formulae, imprecision increased with increased CSDH volume: in particular, with clinically-relevant CSDH volumes (i.e. >50ml). Deviations >10% of equivalence were observed in 60% of estimates for 2/3S·h, 77.8% for 'XYZ/2' and 84.4% for 'π/6·XYZ'. The maximum error for 'XYZ/2' was 142.3% of a clinically-relevant volume. Three-D simulations revealed that only 4/45 (9%) CSDH remotely conformed to ellipsoid geometrical morphology. Most (41/45, 91%) demonstrated highly irregular morphology neither recognisable as ellipsoid, nor as any other regular/non-regular geometric solid. Mathematical formulae, including 'XYZ/2', most commonly proved inaccurate and imprecise when applied to CSDH. In contrast to prior studies, all most commonly over-estimated CSDH volume. Imprecision increased with CSDH volume, and was maximal with clinically-relevant CSDH volumes. Errors most commonly related to a flawed assumption regarding ellipsoid 3-D CSDH morphology. The validity of mean comparisons, or correlation analyses, used in prior studies is questioned.
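The formulae under comparison are simple products of the three maximal orthogonal diameters. A small sketch of how their percent errors against a measured (e.g. computer-assisted 3D) volume might be computed; the function name and example dimensions are assumptions for illustration:

```python
import math

def estimate_errors(x, y, z, true_volume):
    """Percent error of two common haematoma-volume estimators against
    a measured volume. x, y, z are maximal orthogonal diameters (cm);
    volumes in ml."""
    estimates = {
        "XYZ/2":    x * y * z / 2.0,
        "pi/6*XYZ": math.pi / 6.0 * x * y * z,
    }
    return {name: 100.0 * (v - true_volume) / true_volume
            for name, v in estimates.items()}

# For a *perfect* ellipsoid the true volume is pi/6*XYZ, so that estimator
# is exact and XYZ/2 (0.5 vs pi/6 ~ 0.524) underestimates by about 4.5%.
# The large over-estimates reported above therefore stem from real CSDHs
# not being ellipsoidal, not from the coefficients themselves.
errs = estimate_errors(10.0, 8.0, 3.0, math.pi / 6.0 * 10.0 * 8.0 * 3.0)
```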

  14. The Microtremor H/V Spectral Ratio: The Physical Basis of the Diffuse Field Assumption

    Science.gov (United States)

    Sanchez-Sesma, F. J.

    2016-12-01

    The microtremor H/V spectral ratio (MHVSR) is popular to obtain the dominant frequency at a site. Despite the success of the MHVSR, some controversy has arisen regarding its physical basis. One approach is the Diffuse Field Assumption (DFA), under which the diffuse features of the noise come from multiple scattering within the medium. According to theory, the average of the autocorrelation is proportional to the directional energy density (DED) and to the imaginary part of the Green's function for coincident source and receiver. Then the square of the MHVSR is a ratio of DEDs which, in a horizontally layered system, is 2 × ImG11/ImG33, where ImG11 and ImG33 are the imaginary parts of the Green's functions for the horizontal and vertical components. This has physical implications that emerge from the duality DED-force, implicit in the DFA. Consider a surface force at a half-space. The radiated energy is carried away by various wave types, and the proportions of each one are precisely the fractions of the energy densities of a diffuse elastic wave field at the free surface. Thus, some properties of applied forces are also characteristics of DEDs. For example, consider a Poisson solid. For a normal point load, 67 per cent of the energy is carried away by Rayleigh waves. For the tangential case, it is less well known that 77 per cent of the energy goes as shear waves. In a full space, 92 per cent of the energy is emitted as shear waves. The horizontal DED at the half-space surface implies significant emission of down-going shear waves, which explains the curious stair-like resonance spectrum of ImG11. Both ImG11 and ImG33 grow linearly with frequency, and this represents wave emission. For a layered medium, besides wave emission, the ensuing variations correspond to reflected waves. At high frequencies ImG33 depends on the properties of the top layer. Reflected body waves are very small and Rayleigh waves behave in the top layer as in a kind of mini half-space. From the HVSR one can invert the velocity model.
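Under the DFA, the measured spectral ratio maps directly onto imaginary parts of Green's functions via H/V(f) = sqrt(2·ImG11(f)/ImG33(f)); a one-line sketch of that identity (function name and sample values are illustrative):

```python
import math

def mhvsr_from_green(im_g11, im_g33):
    """Diffuse-field identity: H/V(f) = sqrt(2 * ImG11(f) / ImG33(f)),
    where ImG11 and ImG33 are the imaginary parts of the Green's
    function for collocated horizontal and vertical source/receiver,
    evaluated per frequency."""
    return [math.sqrt(2.0 * g11 / g33) for g11, g33 in zip(im_g11, im_g33)]
```

This is the forward relation used when inverting an observed MHVSR curve for the layered velocity model: candidate models predict ImG11 and ImG33, and the resulting curve is matched to the measurement.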

  15. Is a proposed reaction mechanism free from unnecessary assumptions? Occam's razor applied in a mathematical way to complex first-order reaction systems.

    Science.gov (United States)

    Bergson, Göran; Linderberg, Jan

    2008-05-08

    Following Occam's principle, a proposed reaction mechanism should not contain assumptions about the existence of reactive intermediates and reaction paths that are unnecessary for a full description and interpretation of the available facts. A mechanism refers, in this paper, to a proposed reaction scheme or network that represents the reactions supposed to be going on in a complex reaction system with observable species as well as unobservable reactive intermediates. The scope is limited here to (pseudo) first-order reactions, and the steady-state approximation is invoked in order to relate unknown mechanistic rate constants to experimentally determined ones and, when available, theoretically calculated quantities. When the resulting nonlinear system of equations admits a unique solution within a physically reasonable domain, it is concluded that the reaction mechanism fulfills Occam's principle. Otherwise, there are many or no solutions. No subjective or qualitative arguments enter the procedure and the outcome is not negotiable.
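The (pseudo) first-order setting and the steady-state approximation invoked here can be illustrated on the simplest chain A → B → C, comparing brute-force integration with the steady-state estimate [B] ≈ k1[A]/k2 (the rate constants below are invented for illustration):

```python
def simulate_abc(k1, k2, a0, dt=1e-4, t_end=5.0):
    """Forward-Euler integration of the first-order chain
    A -k1-> B -k2-> C, returning ([A], [B]) at t_end."""
    a, b = a0, 0.0
    for _ in range(int(t_end / dt)):
        # simultaneous update: d[A]/dt = -k1[A], d[B]/dt = k1[A] - k2[B]
        a, b = a + (-k1 * a) * dt, b + (k1 * a - k2 * b) * dt
    return a, b

# With k2 >> k1 the intermediate B is consumed as fast as it forms, so
# setting d[B]/dt = 0 gives the (pseudo) steady-state value [B] = k1[A]/k2.
k1, k2, a0 = 0.1, 10.0, 1.0
a, b = simulate_abc(k1, k2, a0)
b_ssa = k1 * a / k2
```

After an initial induction period, the simulated [B] tracks the steady-state value to within about a factor k2/(k2 - k1), which is the sense in which the approximation removes the intermediate's rate constants from the observable kinetics.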

  16. Plato and the Modern American "Right": Agendas, Assumptions, and the Culture of Fear

    Science.gov (United States)

    Ramsey, Paul

    2009-01-01

    This article presents an interpretation of Plato's "Republic" that has many striking similarities to the social agenda of modern educational conservatives in the United States, which is particularly timely because George W. Bush's administration is, at this writing, coming to an end. Plato's ideal city is best seen as one that promoted an…

  17. Moving beyond Assumptions: The Use of Virtual Reference Data in an Academic Library

    Science.gov (United States)

    Nolen, David S.; Powers, Amanda Clay; Zhang, Li; Xu, Yue; Cannady, Rachel E.; Li, Judy

    2012-01-01

    The Mississippi State University Libraries' Virtual Reference Service collected statistics about virtual reference usage. Analysis of the data collected by an entry survey from chat and e-mail transactions provided librarians with concrete information about what patron groups were the highest and lowest users of virtual reference services. These…

  18. 76 FR 52353 - Assumption Buster Workshop: “Current Implementations of Cloud Computing Indicate a New Approach...

    Science.gov (United States)

    2011-08-22

    ... Assumption Buster Workshop: ``Current Implementations of Cloud Computing Indicate a New Approach to Security...: ``Current implementations of cloud computing indicate a new approach to security'' Implementations of cloud computing have provided new ways of thinking about how to secure data and computation. Cloud is a platform...

  19. Do irregular grids make a difference? Relaxing the spatial regularity assumption, in cellular models of social dynamics

    NARCIS (Netherlands)

    Flache, A; Hegselmann, R

    2001-01-01

    Three decades of CA-modelling in the social sciences have shown that the cellular automata framework is a useful tool to explore the relationship between micro assumptions and macro outcomes in social dynamics. However, virtually all CA-applications in the social sciences rely on a potentially

  20. Do Irregular Grids make a Difference? Relaxing the Spatial Regularity Assumption in Cellular Models of Social Dynamics

    NARCIS (Netherlands)

    Flache, Andreas; Hegselmann, Rainer

    2001-01-01

    Three decades of CA-modelling in the social sciences have shown that the cellular automata framework is a useful tool to explore the relationship between micro assumptions and macro outcomes in social dynamics. However, virtually all CA-applications in the social sciences rely on a potentially

  1. Models of Parent Involvement in the Educational Process of Their Severely Handicapped Child: Past Assumptions and Future Directions.

    Science.gov (United States)

    Halvorsen, Ann Tiedmann

    Models of parent involvement in the education of severely handicapped children are reviewed, and the assertion is made that most parent involvement programs reflect professional priorities rather than parental viewpoints and needs. Assumptions underlying models in which parents are considered teachers are reviewed, and discrepancies between…

  2. The Effect of Multicollinearity and the Violation of the Assumption of Normality on the Testing of Hypotheses in Regression Analysis.

    Science.gov (United States)

    Vasu, Ellen S.; Elmore, Patricia B.

    The effects of the violation of the assumption of normality, coupled with the condition of multicollinearity, upon the outcome of testing the hypothesis Beta equals zero in the two-predictor regression equation are investigated. A Monte Carlo approach was utilized in which three different distributions were sampled for two sample sizes over…
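The record above describes a Monte Carlo design without giving its details. A minimal sketch under stated assumptions (exponential errors, correlation 0.8 between the predictors, n = 30, and a 5% nominal level are illustrative choices of ours, not the study's actual settings):

```python
import numpy as np
from scipy import stats

def type1_error_rate(n=30, reps=1000, rho=0.8, seed=0):
    """Estimate the Type I error rate of the t-test for beta1 = 0 in a
    two-predictor regression with collinear predictors (correlation rho)
    and skewed (exponential) errors. The true coefficients are zero, so
    every rejection at alpha = 0.05 is a Type I error."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(reps):
        x1 = rng.normal(size=n)
        x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)  # collinear
        eps = rng.exponential(size=n) - 1.0   # skewed errors, mean zero
        y = eps                               # beta1 = beta2 = 0 under the null
        X = np.column_stack([np.ones(n), x1, x2])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        s2 = resid @ resid / (n - 3)          # error-variance estimate
        cov = s2 * np.linalg.inv(X.T @ X)     # coefficient covariance
        t = beta[1] / np.sqrt(cov[1, 1])
        p_val = 2 * stats.t.sf(abs(t), df=n - 3)
        rejections += p_val < 0.05
    return rejections / reps
```

If the empirical rejection rate deviates markedly from the nominal 5%, the assumption violations matter for inference.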

  3. Testing the Assumption of Measurement Invariance in the SAMHSA Mental Health and Alcohol Abuse Stigma Assessment in Older Adults

    NARCIS (Netherlands)

    King-Kallimanis, B.L.; Oort, F.J.; Lynn, N.; Schonfeld, L.

    2012-01-01

    This study examined the assumption of measurement invariance of the SAMHSA Mental Health and Alcohol Abuse Stigma Assessment. This is necessary to make valid comparisons across time and groups. The data come from the Primary Care Research in Substance Abuse and Mental Health for Elderly trial, a

  4. A Two-Step First Difference Estimator for a Panel Data Tobit Model under Conditional Mean Independence Assumptions

    NARCIS (Netherlands)

    Kalwij, A.S.

    2004-01-01

    This study develops a two-step estimator for a panel data Tobit model based on taking first-differences of the equation of interest, under conditional mean independence assumptions. The necessary correction terms are non-standard and a substantial part is therefore devoted to the formal derivation of

  5. Schooling Mobile Phones: Assumptions about Proximal Benefits, the Challenges of Shifting Meanings, and the Politics of Teaching

    Science.gov (United States)

    Philip, Thomas M.; Garcia, Antero

    2015-01-01

    Mobile devices are increasingly upheld as powerful tools for learning and school reform. In this article, we prioritize youth voices to critically examine assumptions about student interest in mobile devices that often drive the incorporation of new technologies into schools. By demonstrating how the very meaning of mobile phones shift as they are…

  6. Guideline for Adopting the Local Reaction Assumption for Porous Absorbers in Terms of Random Incidence Absorption Coefficients

    DEFF Research Database (Denmark)

    Jeong, Cheol-Ho

    2011-01-01

    Room surfaces have been extensively modeled as locally reacting in room acoustic predictions although such modeling could yield significant errors under certain conditions. Therefore, this study aims to propose a guideline for adopting the local reaction assumption by comparing predicted random i...

  7. TESTING THE ASSUMPTIONS AND INTERPRETING THE RESULTS OF THE RASCH MODEL USING LOG-LINEAR PROCEDURES IN SPSS

    NARCIS (Netherlands)

    TENVERGERT, E; GILLESPIE, M; KINGMA, J

    This paper shows how to use the log-linear subroutine of SPSS to fit the Rasch model. It also shows how to fit less restrictive models obtained by relaxing specific assumptions of the Rasch model. Conditional maximum likelihood estimation was achieved by including dummy variables for the total

  8. Why Bother about Writing a Masters Dissertation? Assumptions of Faculty and Masters Students in an Iranian Setting

    Science.gov (United States)

    Hasrati, Mostafa

    2013-01-01

    This article reports the results of a mixed methodology analysis of the assumptions of academic staff and Masters students in an Iranian university regarding various aspects of the assessment of the Masters degree thesis, including the main objective for writing the thesis, the role of the students, supervisors and advisors in writing the…

  9. Timber value—a matter of choice: a study of how end use assumptions affect timber values.

    Science.gov (United States)

    John H. Beuter

    1971-01-01

    The relationship between estimated timber values and actual timber prices is discussed. Timber values are related to how, where, and when the timber is used. An analysis demonstrates the relative values of a typical Douglas-fir stand under assumptions about timber use.

  10. Preliminary Review of Models, Assumptions, and Key Data used in Performance Assessments and Composite Analysis at the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Arthur S. Rood; Swen O. Magnuson

    2009-07-01

    This document is in response to a request by Ming Zhu, DOE-EM to provide a preliminary review of existing models and data used in completed or soon to be completed Performance Assessments and Composite Analyses (PA/CA) documents, to identify codes, methodologies, main assumptions, and key data sets used.

  11. Improving inference for aerial surveys of bears: The importance of assumptions and the cost of unnecessary complexity.

    Science.gov (United States)

    Schmidt, Joshua H; Wilson, Tammy L; Thompson, William L; Reynolds, Joel H

    2017-07-01

    Obtaining useful estimates of wildlife abundance or density requires thoughtful attention to potential sources of bias and precision, and it is widely understood that addressing incomplete detection is critical to appropriate inference. When the underlying assumptions of sampling approaches are violated, both increased bias and reduced precision of the population estimator may result. Bear (Ursus spp.) populations can be difficult to sample and are often monitored using mark-recapture distance sampling (MRDS) methods, although obtaining adequate sample sizes can be cost prohibitive. With the goal of improving inference, we examined the underlying methodological assumptions and estimator efficiency of three datasets collected under an MRDS protocol designed specifically for bears. We analyzed these data using MRDS, conventional distance sampling (CDS), and open-distance sampling approaches to evaluate the apparent bias-precision tradeoff relative to the assumptions inherent under each approach. We also evaluated the incorporation of informative priors on detection parameters within a Bayesian context. We found that the CDS estimator had low apparent bias and was more efficient than the more complex MRDS estimator. When combined with informative priors on the detection process, precision was increased by >50% compared to the MRDS approach with little apparent bias. In addition, open-distance sampling models revealed a serious violation of the assumption that all bears were available to be sampled. Inference is directly related to the underlying assumptions of the survey design and the analytical tools employed. We show that for aerial surveys of bears, avoidance of unnecessary model complexity, use of prior information, and the application of open population models can be used to greatly improve estimator performance and simplify field protocols. Although we focused on distance sampling-based aerial surveys for bears, the general concepts we addressed apply to a
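Conventional distance sampling (CDS), found here to be less biased and more efficient than the more complex MRDS estimator, rests on fitting a detection function to observed distances. As a hedged illustration, a minimal half-normal CDS fit (the half-normal form and the truncated-likelihood setup are standard CDS conventions, not details taken from the bear surveys):

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def halfnormal_strip_halfwidth(distances, w):
    """Fit a half-normal detection function g(x) = exp(-x^2 / (2 sigma^2))
    to perpendicular distances truncated at w by maximum likelihood, and
    return (sigma_hat, mu_hat), where mu is the effective strip half-width
    mu = integral_0^w g(x) dx used in the CDS density estimator
    D = n / (2 * L * mu)."""
    d = np.asarray(distances, dtype=float)

    def nll(log_sigma):
        s = np.exp(log_sigma)
        # closed form: integral_0^w exp(-x^2/(2 s^2)) dx
        mu = s * np.sqrt(2 * np.pi) * (norm.cdf(w / s) - 0.5)
        return np.sum(d**2) / (2 * s**2) + d.size * np.log(mu)

    res = minimize_scalar(nll, bounds=(np.log(1e-3 * w), np.log(10 * w)),
                          method="bounded")
    s = np.exp(res.x)
    mu = s * np.sqrt(2 * np.pi) * (norm.cdf(w / s) - 0.5)
    return s, mu
```

The Bayesian use of informative priors described in the record would replace this flat-likelihood fit with a posterior over sigma.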

  12. In-house photocrystallographic studies and theoretical prediction of Zn[4-Cl-PhS]2phenanthroline excitation

    DEFF Research Database (Denmark)

    Schmøkel, Mette Stokkebro; Kaminski, Radoslaw; Benedict, Jason B.

    Depending on the type of compound, the time scale, and the lifetimes of the metastable species in question, photocrystallographic studies can be performed by either in-house pseudo-steady state or synchrotron-based pump-probe experiments using either monochromatic or polychromatic sources [1]. Here we...

  13. Theoretical analysis of hydriding reactions of ZrCo and LaNi5 ...

    African Journals Online (AJOL)

    Zirconium cobalt (ZrCo) and lanthanum penta-nickel (LaNi5) intermetallic alloys were studied using kinetic data and a modified shrinking core model to account for concentration and temperature gradients in the gas film and ash layer under pseudo-steady state ...
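The shrinking core model's conversion-time relation follows directly from the pseudo-steady state assumption. A sketch of the standard Levenspiel form for ash-layer diffusion control (the controlling regime is an assumption of ours, since the record does not say which one the authors used):

```python
def ash_diffusion_time_fraction(X):
    """Shrinking-core model with ash-layer diffusion control, derived
    under the pseudo-steady state assumption (Levenspiel form): the
    fraction t / tau of the complete-conversion time needed to reach
    solid conversion X (0 <= X <= 1)."""
    return 1.0 - 3.0 * (1.0 - X) ** (2.0 / 3.0) + 2.0 * (1.0 - X)
```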

  14. Effect of bioaugmentation and biostimulation on sulfate-reducing column startup captured by functional gene profiling.

    Science.gov (United States)

    Pereyra, Luciana P; Hiibel, Sage R; Perrault, Elizabeth M; Reardon, Kenneth F; Pruden, Amy

    2012-10-01

    Sulfate-reducing permeable reactive zones (SR-PRZs) depend upon a complex microbial community to utilize a lignocellulosic substrate and produce sulfides, which remediate mine drainage by binding heavy metals. To gain insight into the impact of the microbial community composition on the startup time and pseudo-steady-state performance, functional genes corresponding to cellulose-degrading (CD), fermentative, sulfate-reducing, and methanogenic microorganisms were characterized in columns simulating SR-PRZs using quantitative polymerase chain reaction (qPCR) and denaturing gradient gel electrophoresis (DGGE). Duplicate columns were bioaugmented with sulfate-reducing or CD bacteria or biostimulated with ethanol or carboxymethyl cellulose and compared with baseline dairy manure inoculum and uninoculated controls. Sulfate removal began after ~ 15 days for all columns and pseudo-steady state was achieved by Day 30. Despite similar performance, DGGE profiles of 16S rRNA gene and functional genes at pseudo-steady state were distinct among the column treatments, suggesting the potential to control ultimate microbial community composition via bioaugmentation and biostimulation. qPCR revealed enrichment of functional genes in all columns between the initial and pseudo-steady-state time points. This is the first functional gene-based study of CD, fermentative and sulfate-reducing bacteria and methanogenic archaea in a lignocellulose-based environment and provides new qualitative and quantitative insight into startup of a complex microbial system. © 2012 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  15. Molecular Thermodynamics for Cell Biology as Taught with Boxes

    Science.gov (United States)

    Mayorga, Luis S.; Lopez, Maria Jose; Becker, Wayne M.

    2012-01-01

    Thermodynamic principles are basic to an understanding of the complex fluxes of energy and information required to keep cells alive. These microscopic machines are nonequilibrium systems at the micron scale that are maintained in pseudo-steady-state conditions by very sophisticated processes. Therefore, several nonstandard concepts need to be…

  16. Single family heating and cooling requirements: Assumptions, methods, and summary results

    Energy Technology Data Exchange (ETDEWEB)

    Ritschard, R.L.; Hanford, J.W.; Sezgen, A.O. (Lawrence Berkeley Lab., CA (United States))

    1992-03-01

    The research has created a data base of hourly building loads using a state-of-the-art building simulation code (DOE-2.1D) for 8 prototypes, representing pre-1940s to 1990s building practices, in 16 US climates. The report describes the assumed modeling inputs and building operations, defines the building prototypes and selection of base cities, compares the simulation results to both surveyed and measured data sources, and discusses the results. The full data base with hourly space conditioning, water heating, and non-HVAC electricity consumption is available from GRI. In addition, the estimated loads on a per-square-foot basis are included, as well as the peak heating and cooling loads.

  17. Biomedical subjectivities and reproductive assumptions in the CAMELIA clinical trial in Cambodia.

    Science.gov (United States)

    Petitet, Pascale Hancart

    2014-01-01

    The inclusion of women in clinical trials has raised a variety of ethical and practical issues in their implementation. In the recent CAMELIA clinical trial in Cambodia, the inclusion criteria included a negative pregnancy test and signature of the consent form confirming commitment to double contraceptive use as patients were given drugs contra-indicated in case of pregnancy. But despite precautions and the requirement stated in the informed consent form, 19 out of 236 enrolled women became pregnant during the trial. The current paper describes the frictions and subjectivities that emerge as new medical technologies travel to resource-poor settings--and more specifically, how trial researchers, health workers, and research subjects involved in the CAMELIA trial negotiate the injunction to avoid pregnancy while using a teratogenic drug.

  18. Manifestation of Coupled Geometric Complexity in Urban Road Networks under Mono-Centric Assumption

    CERN Document Server

    Peiravian, Farideddin

    2015-01-01

    This article analyzes the complex geometry of urban transportation networks as a gateway to understanding their encompassing urban systems. Using a proposed ring-buffer approach and applying it to 50 urban areas in the United States, we measure road lengths in concentric rings from carefully-selected urban centers and study how the trends evolve as we move away from these centers. Overall, we find that the complexity of urban transportation networks is naturally coupled, consisting of two distinct patterns: (1) a fractal component (i.e., power law) that represent a uniform grid, and (2) a second component that can be exponential, power law, or logarithmic that captures changes in road density. From this second component, we introduce two new indices, density index and decay index, which jointly capture essential characteristics of urban systems and therefore can help us gain new insights into how cities evolve.
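The fractal (power-law) component described here can be recovered from ring-buffer measurements by a log-log fit. A minimal sketch, under the assumption that road length per concentric ring follows L(r) = c * r**alpha:

```python
import numpy as np

def fit_power_law(radii, road_lengths):
    """Fit the fractal component L(r) = c * r**alpha by ordinary least
    squares on log-log axes; returns (c, alpha)."""
    slope, intercept = np.polyfit(np.log(radii), np.log(road_lengths), 1)
    return np.exp(intercept), slope
```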

  19. Transmission dynamics of Bacillus thuringiensis infecting Plodia interpunctella: a test of the mass action assumption with an insect pathogen.

    Science.gov (United States)

    Knell, R J; Begon, M; Thompson, D J

    1996-01-22

    Central to theoretical studies of host-pathogen population dynamics is a term describing transmission of the pathogen. This usually assumes that transmission is proportional to the density of infectious hosts or particles and of susceptible individuals. We tested this assumption with the bacterial pathogen Bacillus thuringiensis infecting larvae of Plodia interpunctella, the Indian meal moth. Transmission was found to increase in a more than linear way with host density in fourth and fifth instar P. interpunctella, and to decrease with the density of infectious cadavers in the case of fifth instar larvae. Food availability was shown to play an important part in this process. Therefore, on a number of counts, the usual assumption was found not to apply in our experimental system.
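The mass-action assumption tested here can be written as dS/dt = -beta * S * P for susceptibles S and infectious-cadaver density P, with the more-than-linear dependence the authors report corresponding to an exponent on density. A sketch (the forward-Euler scheme and constant P are simplifications of ours, not the authors' fitted model):

```python
def cumulative_infections(beta, S0, P0, q=1.0, days=7.0, dt=0.01):
    """Forward-Euler integration of dS/dt = -beta * S * P**q with a
    constant density P of infectious cadavers. q = 1 is the classical
    mass-action assumption; q != 1 mimics a nonlinear dependence on
    density. Returns the cumulative new infections by time `days`."""
    S = float(S0)
    for _ in range(int(round(days / dt))):
        S -= beta * S * P0 ** q * dt   # transmission term
    return S0 - S
```

With q = 1 and constant P this has the closed form S0 * (1 - exp(-beta * P * t)), which the Euler scheme approximates.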

  20. Linearity assumption in soil-to-plant transfer factors of natural uranium and radium in Helianthus annuus L

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, P. Blanco [Departamento de Fisica, Facultad de Ciencias, Universidad de Extremadura, 06071 Badajoz (Spain); Tome, F. Vera [Departamento de Fisica, Facultad de Ciencias, Universidad de Extremadura, 06071 Badajoz (Spain)]. E-mail: fvt@unex.es; Fernandez, M. Perez [Area de Ecologia, Departamento de Fisica, Facultad de Ciencias, Universidad de Extremadura, 06071 Badajoz (Spain); Lozano, J.C. [Laboratorio de Radiactividad Ambiental, Facultad de Ciencias, Universidad de Salamanca, 37008 Salamanca (Spain)

    2006-05-15

    The linearity assumption of the validation of soil-to-plant transfer factors of natural uranium and {sup 226}Ra was tested using Helianthus annuus L. (sunflower) grown in a hydroponic medium. Transfer of natural uranium and {sup 226}Ra was tested in both the aerial fraction of plants and in the overall seedlings (roots and shoots). The results show that the linearity assumption can be considered valid in the hydroponic growth of sunflowers for the radionuclides studied. The ability of sunflowers to translocate uranium and {sup 226}Ra was also investigated, as well as the feasibility of using sunflower plants to remove uranium and radium from contaminated water, and by extension, their potential for phytoextraction. In this sense, the removal percentages obtained for natural uranium and {sup 226}Ra were 24% and 42%, respectively. Practically all the uranium is accumulated in the roots. However, 86% of the {sup 226}Ra activity concentration in roots was translocated to the aerial part.

  1. General aptitude and the assumption of truth in deductively rational reasoning about probable but false antecedent to consequent relations

    Science.gov (United States)

    Schroyens, Walter; Fleerackers, Lieve; Maes, Sunile

    2010-01-01

    Two experiments (N1 = 117 and N2 = 245) on reasoning with knowledge-rich conditionals showed a main effect of logical validity, which was due to the negative effect of counter-examples being smaller for valid than for invalid arguments. These findings support the thesis that some people tend to inhibit background knowledge inconsistent with the hypothetical truth of the premises, while others tend to abandon the implicit truth-assumption when they have factual evidence to the contrary. Findings show that adhering to the truth-assumption in the face of conflicting evidence requires an investment of time and effort which people with a higher general aptitude are more likely to make. PMID:21228921

  2. Social support, world assumptions, and exposure as predictors of anxiety and quality of life following a mass trauma.

    Science.gov (United States)

    Grills-Taquechel, Amie E; Littleton, Heather L; Axsom, Danny

    2011-05-01

    This study examined the influence of a mass trauma (the Virginia Tech campus shootings) on anxiety symptoms and quality of life, as well as the potential vulnerability/protective roles of world assumptions and social support. Pre-trauma adjustment data, collected in the six months prior to the shooting, was examined along with two-month post-shooting data in a sample of 298 female students enrolled at the university at the time of the shootings. Linear regression analyses revealed consistent predictive roles for world assumptions pertaining to control and self-worth as well as family support. In addition, for those more severely exposed to the shooting, greater belief in a lack of control over outcomes appeared to increase vulnerability for post-trauma physiological and emotional anxiety symptoms. Implications of the results for research and intervention following mass trauma are discussed. Copyright © 2010 Elsevier Ltd. All rights reserved.

  3. Ecological equivalence: a realistic assumption for niche theory as a testable alternative to neutral theory.

    Directory of Open Access Journals (Sweden)

    C Patrick Doncaster

    Full Text Available BACKGROUND: Hubbell's 2001 neutral theory unifies biodiversity and biogeography by modelling steady-state distributions of species richness and abundances across spatio-temporal scales. Accurate predictions have issued from its core premise that all species have identical vital rates. Yet no ecologist believes that species are identical in reality. Here I explain this paradox in terms of the ecological equivalence that species must achieve at their coexistence equilibrium, defined by zero net fitness for all regardless of intrinsic differences between them. I show that the distinction of realised from intrinsic vital rates is crucial to evaluating community resilience. PRINCIPAL FINDINGS: An analysis of competitive interactions reveals how zero-sum patterns of abundance emerge for species with contrasting life-history traits as for identical species. I develop a stochastic model to simulate community assembly from a random drift of invasions sustaining the dynamics of recruitment following deaths and extinctions. Species are allocated identical intrinsic vital rates for neutral dynamics, or random intrinsic vital rates and competitive abilities for niche dynamics either on a continuous scale or between dominant-fugitive extremes. Resulting communities have steady-state distributions of the same type for more or less extremely differentiated species as for identical species. All produce negatively skewed log-normal distributions of species abundance, zero-sum relationships of total abundance to area, and Arrhenius relationships of species to area. Intrinsically identical species nevertheless support fewer total individuals, because their densities impact as strongly on each other as on themselves. Truly neutral communities have measurably lower abundance/area and higher species/abundance ratios. CONCLUSIONS: Neutral scenarios can be parameterized as null hypotheses for testing competitive release, which is a sure signal of niche dynamics

  4. Ecological equivalence: a realistic assumption for niche theory as a testable alternative to neutral theory.

    Science.gov (United States)

    Doncaster, C Patrick

    2009-10-14

    Hubbell's 2001 neutral theory unifies biodiversity and biogeography by modelling steady-state distributions of species richness and abundances across spatio-temporal scales. Accurate predictions have issued from its core premise that all species have identical vital rates. Yet no ecologist believes that species are identical in reality. Here I explain this paradox in terms of the ecological equivalence that species must achieve at their coexistence equilibrium, defined by zero net fitness for all regardless of intrinsic differences between them. I show that the distinction of realised from intrinsic vital rates is crucial to evaluating community resilience. An analysis of competitive interactions reveals how zero-sum patterns of abundance emerge for species with contrasting life-history traits as for identical species. I develop a stochastic model to simulate community assembly from a random drift of invasions sustaining the dynamics of recruitment following deaths and extinctions. Species are allocated identical intrinsic vital rates for neutral dynamics, or random intrinsic vital rates and competitive abilities for niche dynamics either on a continuous scale or between dominant-fugitive extremes. Resulting communities have steady-state distributions of the same type for more or less extremely differentiated species as for identical species. All produce negatively skewed log-normal distributions of species abundance, zero-sum relationships of total abundance to area, and Arrhenius relationships of species to area. Intrinsically identical species nevertheless support fewer total individuals, because their densities impact as strongly on each other as on themselves. Truly neutral communities have measurably lower abundance/area and higher species/abundance ratios. Neutral scenarios can be parameterized as null hypotheses for testing competitive release, which is a sure signal of niche dynamics. Ignoring the true strength of interactions between and within species

  5. The Manifestations of Positive Leadership Strategies in the Doctrinal Assumptions of the U.S. Army Leadership Concept

    OpenAIRE

    Andrzej Lis

    2015-01-01

    The aim of the paper is to identify the manifestations of positive leadership strategies in the doctrinal assumptions of the U.S. Army leadership concept. The components of the U.S. Army leadership requirements model are tested against Cameron's (2012) model of positive leadership strategies, including: building a positive work climate; fostering positive relationships among the members of an organisation; establishing and promoting positive communication; and manifesting the meaningfuln...

  6. The Manifestations of Positive Leadership Strategies in the Doctrinal Assumptions of the U.S. Army Leadership Concept

    Directory of Open Access Journals (Sweden)

    Andrzej Lis

    2015-06-01

    Full Text Available The aim of the paper is to identify the manifestations of positive leadership strategies in the doctrinal assumptions of the U.S. Army leadership concept. The components of the U.S. Army leadership requirements model are tested against Cameron's (2012) model of positive leadership strategies, including: building a positive work climate; fostering positive relationships among the members of an organisation; establishing and promoting positive communication; and manifesting the meaningfulness of work.

  7. Testing the Assumption of Measurement Invariance in the SAMHSA Mental Health and Alcohol Abuse Stigma Assessment in Older Adults

    OpenAIRE

    King-Kallimanis, B. L.; Oort, F. J.; Lynn, N.; Schonfeld, L.

    2012-01-01

    This study examined the assumption of measurement invariance of the SAMHSA Mental Health and Alcohol Abuse Stigma Assessment. This is necessary to make valid comparisons across time and groups. The data come from the Primary Care Research in Substance Abuse and Mental Health for Elderly trial, a longitudinal multisite, randomized trial examining two modes of care (Referral and Integrated). A sample of 1,198 adults over the age of 65 who screened positive for depression, anxiety, and/or at-ris...

  8. Why Is a Financial Crisis Important? The Significance of the Relaxation of the Assumption of Perfect Competition

    OpenAIRE

    Yew-Kwang Ng

    2009-01-01

    Under the usual assumption of perfect competition, money is neutral and changes in nominal aggregate demand cannot affect real economic variables. If so, a financial crisis cannot be very important. However, the real world is characterized more by non-perfect competition, under which changes in nominal demand can affect real variables. This paper shows the important differences and explains the crux of these differences from both the demand and cost sides. It also provides a simplified...

  9. Robustness to failure of assumptions of tests for a common slope amongst several allometric lines--a simulation study.

    Science.gov (United States)

    Warton, David I

    2007-04-01

    In allometry, researchers are commonly interested in estimating the slope of the major axis or standardized major axis (methods of bivariate line fitting related to principal components analysis). This study considers the robustness of two tests for a common slope amongst several axes. It is of particular interest to measure the robustness of these tests to slight violations of assumptions that may not be readily detected in sample datasets. Type I error is estimated in simulations of data generated with varying levels of nonnormality, heteroscedasticity and nonlinearity. The assumption failures introduced in simulations were difficult to detect in a moderately sized dataset, with an expert panel able to correctly detect assumption violations only 34-45% of the time. While the common slope tests were robust to nonnormal and heteroscedastic errors from the line, Type I error was inflated if the two variables were related in a slightly nonlinear fashion. Similar results were also observed for the linear regression case. The common slope tests were more liberal when the simulated data had greater nonlinearity, and this effect was more evident when the underlying distribution had longer tails than the normal. This result raises concerns for common slope testing, as slight nonlinearities such as those in the simulations are often undetectable in moderately sized datasets. Consequently, practitioners should take care in checking for nonlinearity and in interpreting the results of a test for common slope. This work has implications for the robustness of inference in linear models in general.
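For context, the standardized major axis slope whose common-slope tests are examined here has a simple closed form in the bivariate case. A sketch of the single-sample estimator only (the common-slope test itself is more involved and is not reproduced):

```python
import numpy as np

def sma_slope(x, y):
    """Standardized major axis (SMA) slope for a single sample:
    the sign of the correlation times sd(y) / sd(x)."""
    r = np.corrcoef(x, y)[0, 1]
    return np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
```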

  10. Variability of hemodynamic parameters using the common viscosity assumption in a computational fluid dynamics analysis of intracranial aneurysms.

    Science.gov (United States)

    Suzuki, Takashi; Takao, Hiroyuki; Suzuki, Takamasa; Suzuki, Tomoaki; Masuda, Shunsuke; Dahmani, Chihebeddine; Watanabe, Mitsuyoshi; Mamori, Hiroya; Ishibashi, Toshihiro; Yamamoto, Hideki; Yamamoto, Makoto; Murayama, Yuichi

    2017-01-01

    In most simulations of intracranial aneurysm hemodynamics, blood is assumed to be a Newtonian fluid. However, it is a non-Newtonian fluid, and its viscosity profile differs among individuals. Therefore, the common viscosity assumption may not be valid for all patients. This study aims to test the suitability of the common viscosity assumption. Blood viscosity datasets were obtained from two healthy volunteers. Three simulations were performed for three different-sized aneurysms, two using measured value-based non-Newtonian models and one using a Newtonian model. The parameters proposed to predict an aneurysmal rupture obtained using the non-Newtonian models were compared with those obtained using the Newtonian model. The largest difference (25%) in the normalized wall shear stress (NWSS) was observed in the smallest aneurysm. Comparing the two non-Newtonian models' ratios of difference from the Newtonian-model NWSS, the ratios themselves differed by 17.3%. Irrespective of aneurysm size, computational fluid dynamics simulations with either the common Newtonian or a common non-Newtonian viscosity assumption could yield hemodynamic parameters such as the NWSS that differ from those of a patient-specific viscosity model.
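A non-Newtonian viscosity profile of the kind contrasted with the Newtonian assumption can be illustrated with a Carreau-Yasuda model; the parameter values below are commonly cited literature constants for blood, not the volunteers' measured profiles from this study:

```python
def carreau_yasuda_viscosity(shear_rate, mu0=0.056, mu_inf=0.00345,
                             lam=3.313, n=0.3568, a=2.0):
    """Carreau-Yasuda viscosity (Pa*s) as a function of shear rate (1/s):
    viscosity falls from mu0 at rest toward mu_inf at high shear,
    capturing the shear-thinning behaviour of blood. Defaults are
    commonly cited literature values, not patient-specific data."""
    return mu_inf + (mu0 - mu_inf) * (1.0 + (lam * shear_rate) ** a) ** ((n - 1.0) / a)
```

A Newtonian simulation instead uses a single constant viscosity (often around 0.0035 Pa*s), which is what this record questions.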

  11. Teaching and Learning Science in the 21st Century: Challenging Critical Assumptions in Post-Secondary Science

    Directory of Open Access Journals (Sweden)

    Amanda L. Glaze

    2018-01-01

    Full Text Available It is widely agreed upon that the goal of science education is building a scientifically literate society. Although there are a range of definitions for science literacy, most involve an ability to problem solve, make evidence-based decisions, and evaluate information in a manner that is logical. Unfortunately, science literacy appears to be an area where we struggle across levels of study, including with students who are majoring in the sciences in university settings. One reason for this problem is that we have opted to continue to approach teaching science in a way that fails to consider the critical assumptions that faculties in the sciences bring into the classroom. These assumptions include expectations of what students should know before entering given courses, whose responsibility it is to ensure that students entering courses understand basic scientific concepts, the roles of researchers and teachers, and approaches to teaching at the university level. Acknowledging these assumptions and the potential for action to shift our teaching and thinking about post-secondary education represents a transformative area in science literacy and preparation for the future of science as a field.

  12. Multiple Linear Regressions by Maximizing the Likelihood under Assumption of Generalized Gauss-Laplace Distribution of the Error.

    Science.gov (United States)

    Jäntschi, Lorentz; Bálint, Donatella; Bolboacă, Sorana D

    2016-01-01

    Multiple linear regression analysis is widely used to link an outcome with predictors for better understanding of the behaviour of the outcome of interest. Usually, under the assumption that the errors follow a normal distribution, the coefficients of the model are estimated by minimizing the sum of squared deviations. A new approach based on maximum likelihood estimation is proposed for finding the coefficients of linear models with two predictors without any restrictive assumptions on the distribution of the errors. The algorithm was developed, implemented, and tested as proof-of-concept using fourteen sets of compounds by investigating the link between activity/property (as outcome) and structural feature information incorporated by molecular descriptors (as predictors). The results on real data demonstrated that in all investigated cases the power of the error is significantly different from the conventional value of two when the Gauss-Laplace distribution was used to relax the restrictive assumption of a normal distribution of the error. Therefore, the Gauss-Laplace distribution of the error could not be rejected, while the hypothesis that the power of the error from the Gauss-Laplace distribution is normally distributed also failed to be rejected.
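The approach described, maximizing the likelihood under a generalized Gauss-Laplace (generalized normal) error distribution with the error power estimated rather than fixed at two, can be sketched as follows. The optimizer choice, starting values, and parameterization are assumptions of ours, not the authors' algorithm:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def fit_gauss_laplace_regression(X, y):
    """Fit y = X @ beta by maximizing a generalized Gauss-Laplace
    (generalized normal) likelihood, estimating the error power p
    jointly with the coefficients: p = 2 recovers ordinary least
    squares, p = 1 least absolute deviations."""
    n, k = X.shape

    def nll(theta):
        beta, log_s, log_p = theta[:k], theta[k], theta[k + 1]
        s, p = np.exp(log_s), np.exp(log_p)
        e = y - X @ beta
        # -log L for density f(e) = p / (2 s Gamma(1/p)) * exp(-(|e|/s)**p)
        return (n * (np.log(2 * s) + gammaln(1 / p) - np.log(p))
                + np.sum((np.abs(e) / s) ** p))

    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)      # OLS starting point
    resid = y - X @ beta0
    theta0 = np.concatenate([beta0, [np.log(resid.std() + 1e-9), np.log(2.0)]])
    res = minimize(nll, theta0, method="Nelder-Mead",
                   options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-8})
    beta, s, p = res.x[:k], np.exp(res.x[k]), np.exp(res.x[k + 1])
    return beta, s, p
```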

  13. Quantum State Tomography via Reduced Density Matrices.

    Science.gov (United States)

    Xin, Tao; Lu, Dawei; Klassen, Joel; Yu, Nengkun; Ji, Zhengfeng; Chen, Jianxin; Ma, Xian; Long, Guilu; Zeng, Bei; Laflamme, Raymond

    2017-01-13

    Quantum state tomography via local measurements is an efficient tool for characterizing quantum states. However, it requires that the original global state be uniquely determined (UD) by its local reduced density matrices (RDMs). In this work, we demonstrate for the first time a class of states that are UD by their RDMs under the assumption that the global state is pure, but fail to be UD in the absence of that assumption. This discovery allows us to classify quantum states according to their UD properties, with the requirement that each class be treated distinctly in the practice of simplifying quantum state tomography. Additionally, we experimentally test the feasibility and stability of performing quantum state tomography via the measurement of local RDMs for each class. These theoretical and experimental results demonstrate the advantages and possible pitfalls of quantum state tomography with local measurements.

  14. Helping Students to Recognize and Evaluate an Assumption in Quantitative Reasoning: A Basic Critical-Thinking Activity with Marbles and Electronic Balance

    OpenAIRE

    Slisko, Josip; Cruz, Adrian Corona

    2013-01-01

    There is general agreement that critical thinking is an important element of 21st century skills. Although critical thinking is a very complex and controversial concept, many would accept that the recognition and evaluation of assumptions is a basic critical-thinking process. When students use a simple mathematical model to reason quantitatively about a situation, they usually do not consider which implicit assumptions they have made and, consequently, they do not evaluate if these assumption...

  15. Incorporating Love- and Rayleigh-wave magnitudes, unequal earthquake and explosion variance assumptions and interstation complexity for improved event screening

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Dale N [Los Alamos National Laboratory; Bonner, Jessie L [WESTON GEOPHYSICAL; Stroujkova, Anastasia [WESTON GEOPHYSICAL; Shumway, Robert [UC/DAVIS; Russell, David R [AFTAC

    2009-01-01

    Our objective is to improve seismic event screening using the properties of surface waves. We are accomplishing this through (1) the development of a Love-wave magnitude formula that is complementary to the Russell (2006) formula for Rayleigh waves and (2) quantifying differences in complexities and magnitude variances for earthquake and explosion-generated surface waves. We have applied the M_s(VMAX) analysis (Bonner et al., 2006) using both Love and Rayleigh waves to events in the Middle East and Korean Peninsula. For the Middle East dataset consisting of approximately 100 events, the Love M_s(VMAX) is greater than the Rayleigh M_s(VMAX) estimated for individual stations for the majority of the events and azimuths, with the exception of the measurements for the smaller events from European stations to the northeast. It is unclear whether these smaller events suffer from magnitude bias for the Love waves or whether the paths, which include the Caspian and Mediterranean, have variable attenuation for Love and Rayleigh waves. For the Korean Peninsula, we have estimated Rayleigh- and Love-wave magnitudes for 31 earthquakes and two nuclear explosions, including the 25 May 2009 event. For 25 of the earthquakes, the network-averaged Love-wave magnitude is larger than the Rayleigh-wave estimate. For the 2009 nuclear explosion, the Love-wave M_s(VMAX) was 3.1 while the Rayleigh-wave magnitude was 3.6. We are also utilizing the potential of observed variances in M_s estimates that differ significantly in earthquake and explosion populations. We have considered two possible methods for incorporating unequal variances into the discrimination problem and compared the performance of various approaches on a population of 73 western United States earthquakes and 131 Nevada Test Site explosions. The approach proposes replacing the M_s component by M_s + a*σ, where σ denotes the interstation standard deviation obtained from the

  16. Handling realistic assumptions in hypothesis testing of 3D co-localization of genomic elements.

    Science.gov (United States)

    Paulsen, Jonas; Lien, Tonje G; Sandve, Geir Kjetil; Holden, Lars; Borgan, Ornulf; Glad, Ingrid K; Hovig, Eivind

    2013-05-01

    The study of chromatin 3D structure has recently gained much focus owing to novel techniques for detecting genome-wide chromatin contacts using next-generation sequencing. A deeper understanding of the architecture of the DNA inside the nucleus is crucial for gaining insight into fundamental processes such as transcriptional regulation, genome dynamics and genome stability. Chromatin conformation capture-based methods, such as Hi-C and ChIA-PET, are now paving the way for routine genome-wide studies of chromatin 3D structure in a range of organisms and tissues. However, appropriate methods for analyzing such data are lacking. Here, we propose a hypothesis test and an enrichment score of 3D co-localization of genomic elements that handles intra- or interchromosomal interactions, both separately and jointly, and that adjusts for biases caused by structural dependencies in the 3D data. We show that maintaining structural properties during resampling is essential to obtain valid estimation of P-values. We apply the method on chromatin states and a set of mutated regions in leukemia cells, and find significant co-localization of these elements, with varying enrichment scores, supporting the role of chromatin 3D structure in shaping the landscape of somatic mutations in cancer.

  17. Maringá and its historical heritage: a case study on the cathedral of the Assumption - doi: 10.4025/actascitechnol.v35i4.11063

    Directory of Open Access Journals (Sweden)

    Leonardo Cassimiro Barbosa

    2013-10-01

    The most tangible register of a civilization’s evolution is the heritage it preserves over the years. It is a vehicle for the transmission of peoples’ memory and culture. Although the city of Maringá in the state of Paraná, Brazil, is just 66 years old, it has several important buildings within its urban context whose preservation is not guaranteed by law. In fact, they are in danger of disappearing amid the city’s fast growth. Current research, surveying the preservation state of historical buildings in the municipality, is based on published studies, research at the City Hall, reports by the Historical Heritage Commission and in loco visits, with special emphasis on the Cathedral of the Assumption, the city’s symbol, whose preservation is still not legally guaranteed. The study traces the history of the Cathedral’s construction and its most relevant architectural features, both exterior and interior, including its furniture, to serve as a basis for inventories for future legal registration and interventions.

  18. Public timber supply, market adjustments, and local economies: economic assumptions of the Northwest Forest Plan.

    Science.gov (United States)

    Power, Thomas Michael

    2006-04-01

    The Northwest Forest Plan in the Pacific Northwest sought to stabilize local economies, including local employment and income, by stabilizing the flow of wood fiber from public forests. This is also a common forest management objective in other regions and countries. Because this economic strategy ignores basic market adjustments, it is likely to fail and to unnecessarily damage forest ecosystems. Application of basic economic principles on how markets operate significantly changes the apparent efficacy of efforts to manage local economies by managing timber supply. The emphasis on timber supply tends to ignore the dominant role that the demand for wood fiber and wood products, rather than wood-fiber supply, plays in determining levels of harvest and production. Contemporary economics indicates that markets tend to operate to offset reductions in wood-fiber supply. This significantly moderates the economic cost of reducing commercial timber harvest in the pursuit of environmental objectives. In addition, contemporary economic analysis indicates that the economic links between natural forests and local communities are much broader than simply the flow of commercially valuable logs to manufacturing facilities. At least in the United States, the flow of environmental services from natural forests has increasingly become an amenity that has drawn people and economic activity to forested areas. Attractive site-specific qualities, including those supported by natural forests, can potentially support local economic development even in the face of reduced timber harvests. These market-related adjustments partially explain the Northwest Forest Plan's overestimation of the expected regional impacts associated with reduced federal timber supply and the ineffectiveness of the plan's efforts to protect communities by stabilizing federal timber supply.

  19. State Complexity of Testing Divisibility

    Directory of Open Access Journals (Sweden)

    Emilie Charlier

    2010-08-01

    Under some mild assumptions, we study the state complexity of the trim minimal automaton accepting the greedy representations of the multiples of m >= 2 for a wide class of linear numeration systems. As an example, the number of states of the trim minimal automaton accepting the greedy representations of the multiples of m in the Fibonacci system is exactly 2m^2.

  20. The Metaphysical Assumptions of the Conception of Truth in Martin Smiglecki’s Logic

    Directory of Open Access Journals (Sweden)

    Tomasz Pawlikowski

    2015-06-01

    The central element of the concept of truth in Smiglecki’s Logica (1618) is his approach to formulating definitions. Where the establishing of the truth is concerned, he always points to compliance at the level of the community (conformitas) in respect of whether the intellectual recognition of a thing or things is in accordance with its intellectual equivalent, or the principles behind the latter, where these are understood as designating the corresponding idea inherent in the intellect of God. This is a form of the classical definition of truth, similar to that used by St. Thomas Aquinas, with a wide scope of applicability: to the field of existence (transcendental truth), to cognition and language (logical truth), and even to moral beliefs (moral rightness). Smiglecki distinguishes three types of truth: truth assigned to being, truth assigned to cognition, and truth assigned to moral convictions. Of these, the first is identified with transcendental truth, while the second is attributed not only to propositions and sentences, but also to concepts. The truth of concepts results from compliance with things by way of representation, while the truth of propositions and sentences issues from a compliance with things involving the implementation of some form of expression or other. Logical truth pertains to propositions rather than concepts. The kind of moral truth he writes about is what we would now be more likely to call “truthfulness”. With the exception of moral truth, which he defined as compliance of a statement with someone’s internal thoughts, Smiglecki considers every kind of truth to be a conditioned state of the object of knowledge. He says (a) that the ultimate object of reference of human cognitive functioning is a real being, absolutely true by virtue of compliance with its internal principles and their idea as present in the intellect of God, and (b) that the compatibility of human cognition with a real being is the ultimate

  1. The early bird gets the shrimp: Confronting assumptions of isotopic equilibrium and homogeneity in a wild bird population

    Science.gov (United States)

    Wunder, Michael B.; Jehl, Joseph R.; Stricker, Craig A.

    2012-01-01

    1. Because stable isotope distributions in organic material vary systematically across energy gradients that exist in ecosystems, community and population structures, and in individual physiological systems, isotope values in animal tissues have helped address a broad range of questions in animal ecology. It follows that every tissue sample provides an isotopic profile that can be used to study dietary or movement histories of individual animals. Interpretations of these profiles depend on the assumption that metabolic pools are isotopically well mixed and in equilibrium with dietary resources prior to tissue synthesis, and they extend to the population level by assuming isotope profiles are identically distributed for animals using the same proximal dietary resource. As these assumptions are never fully met, studying structure in the variance of tissue isotope values from wild populations is informative. 2. We studied variation in δ13C, δ15N, δ2H and δ18O data for feathers from a population of eared grebes (Podiceps nigricollis) that migrate to Great Salt Lake each fall to moult feathers. During this time, they cannot fly and feed almost exclusively on superabundant brine shrimp (Artemia franciscana). The ecological simplicity of this situation minimized the usual spatial and trophic complexities often present in natural studies of feather isotope values. 3. Ranges and variances of isotope values for the feathers were larger than those from previously published studies that report feather isotopic variance, but they were bimodally distributed in all isotope dimensions. Isotope values for proximal dietary resources and local surface water show that some of the feathers we assumed to have been grown locally must have been grown before birds reached isotopic equilibrium with local diet or immediately prior to arrival at Great Salt Lake. 4. Our study provides novel insights about resource use strategies in eared grebes during migration. More generally, it

  2. [Conceptual assumptions to create a system for preparation of healthcare human resources in Ukraine].

    Science.gov (United States)

    Грузєва, Тетяна С; Пельо, Ігор М; Сміянов, Владислав А; Галієнко, Людмила І

    The development of the public health service is determined by the state of health of the population in Ukraine, the existing challenges and threats, the strategic directions of development of the national health system, and international obligations. Staffing the public health service requires the training of a new generation of professionals, which makes the formation of modern curricula and programs a priority. The experience of training public health professionals in more than 30 universities in Europe and worldwide, as well as the requirements of the European programme of core competencies for public health professionals, form the foundation for national training programs and plans adapted to the national context.

  3. R0 for vector-borne diseases: impact of the assumption for the duration of the extrinsic incubation period.

    Science.gov (United States)

    Hartemink, Nienke; Cianci, Daniela; Reiter, Paul

    2015-03-01

    Mathematical modeling and notably the basic reproduction number R0 have become popular tools for the description of vector-borne disease dynamics. We compare two widely used methods to calculate the probability of a vector to survive the extrinsic incubation period. The two methods are based on different assumptions for the duration of the extrinsic incubation period; one method assumes a fixed period and the other method assumes a fixed daily rate of becoming infectious. We conclude that the outcomes differ substantially between the methods when the average life span of the vector is short compared to the extrinsic incubation period.
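    The two assumptions compared in this abstract can be sketched numerically. In the sketch below (symbol names are assumptions, not taken from the paper), mu is the daily vector mortality rate and T the extrinsic incubation period (EIP) in days; the first method assumes an EIP of fixed duration, the second a constant daily rate 1/T of becoming infectious:

```python
import math

def p_fixed_period(mu, T):
    # Method 1: the EIP is a fixed period of T days; the vector must
    # survive all T days at constant daily mortality rate mu.
    return math.exp(-mu * T)

def p_fixed_rate(mu, T):
    # Method 2: the vector becomes infectious at a fixed daily rate 1/T,
    # competing with death at rate mu (exponentially distributed EIP).
    return (1 / T) / (1 / T + mu)

T = 10  # hypothetical EIP of 10 days
for mu in (0.05, 0.2):  # long-lived vs. short-lived vector
    print(f"mu={mu}: fixed period {p_fixed_period(mu, T):.3f}, "
          f"fixed rate {p_fixed_rate(mu, T):.3f}")
```

    With the long-lived vector the two assumptions nearly agree, but when the average life span (1/mu = 5 days) is short compared to the EIP, the fixed-rate method yields a survival probability more than twice that of the fixed-period method, consistent with the paper's conclusion.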

  4. The “Equality Of Resources” As a Theoretical Assumption of Public Policies to Social Inclusion for People with Disabilities

    Directory of Open Access Journals (Sweden)

    João Daniel Daibes Resque

    2016-11-01

    The present article aims, through an analysis of Ronald Dworkin’s book Sovereign Virtue: The Theory and Practice of Equality, to show the theoretical assumptions on which the aforementioned philosopher develops his egalitarian theory of distributive justice, called 'equality of resources', which is based on an equal distribution of the total resources existing in society. It also shows how this egalitarian theory seeks to justify alternative distributions and compensations that may be used as a theoretical foundation for the development of public policies that target social inclusion for people with disabilities.

  5. Understanding "elder abuse and neglect": a critique of assumptions underpinning responses to the mistreatment and neglect of older people.

    Science.gov (United States)

    Harbison, Joan; Coughlan, Stephen; Beaulieu, Marie; Karabanow, Jeff; Vanderplaat, Madine; Wildeman, Sheila; Wexler, Ezra

    2012-04-01

    This article provides an overview of the ways in which the mistreatment and neglect of older people have come to be understood as a social problem, one which is underpinned by a variety of substantive and theoretical assumptions. It connects the process of conceptualizing elder abuse and neglect to political-economic and social evolution. The authors draw on a review of the literature, government sources, interest group websites, and their own research to provide a critical commentary illustrating how these understandings have become manifest in legislation, policies, and programs pertaining to "elder abuse and neglect" in Canada. Suggestions are provided for changes in direction for policies, programs, and research.

  6. Can Total Quality Management Succeed at Your College--Now? (Does Your College Meet the Essential Prerequisites and Underlying Assumptions of TQM?)

    Science.gov (United States)

    Hammons, James O.

    1994-01-01

    Defines Total Quality Management (TQM) and describes prerequisites for successful implementation, underlying assumptions, and the cultural barriers hindering implementation. Indicates that TQM's long-term benefits outweigh costs at most colleges. Urges practitioners to rate their schools with respect to the prerequisites and assumptions to…

  7. What impact do assumptions about missing data have on conclusions? A practical sensitivity analysis for a cancer survival registry

    Directory of Open Access Journals (Sweden)

    M. Smuk

    2017-02-01

    Background: Within epidemiological and clinical research, missing data are a common issue and often overlooked in publications. When the issue of missing observations is addressed, it is usually assumed that the missing data are 'missing at random' (MAR). This assumption should be checked for plausibility; however, it is untestable, so inferences should be assessed for robustness to departures from missing at random. Methods: We highlight the method of pattern-mixture sensitivity analysis after multiple imputation, using colorectal cancer data as an example. We focus on the Dukes’ stage variable, which has the highest proportion of missing observations. First, we find the probability of being in each Dukes’ stage given the MAR imputed dataset. We use these probabilities in a questionnaire to elicit prior beliefs from experts on what they believe the probability would be in the missing data. The questionnaire responses are then used in a Dirichlet draw to create a Bayesian 'missing not at random' (MNAR) prior to impute the missing observations. The model of interest is applied and inferences are compared to those from the MAR imputed data. Results: The inferences were largely insensitive to departures from MAR. Inferences under MNAR suggested a smaller association between Dukes’ stage and death, though the association remained positive and with similarly low p values. Conclusions: We conclude by discussing the positives and negatives of our method and highlight the importance of making people aware of the need to test the MAR assumption.
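    A minimal sketch of the Dirichlet step described in this abstract, with hypothetical stage labels, elicited probabilities, and counts (none taken from the registry):

```python
import numpy as np

rng = np.random.default_rng(42)
stages = ["A", "B", "C", "D"]

# Elicited expert beliefs about the stage distribution among the *missing*
# cases, scaled by a pseudo-count expressing the experts' confidence.
expert_probs = np.array([0.10, 0.25, 0.40, 0.25])
confidence = 50  # effective prior sample size (hypothetical)

# One Dirichlet draw gives one MNAR probability vector for this imputation.
mnar_probs = rng.dirichlet(expert_probs * confidence)

# Impute the missing stages from the MNAR draw rather than the MAR model.
n_missing = 120
imputed = rng.choice(stages, size=n_missing, p=mnar_probs)
counts = {s: int((imputed == s).sum()) for s in stages}
print(counts)
```

    Repeating the draw across multiple imputed datasets propagates the experts' uncertainty about the missing-data mechanism into the final inferences.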

  8. Testing a key assumption in animal communication: between-individual variation in female visual systems alters perception of male signals.

    Science.gov (United States)

    Ronald, Kelly L; Ensminger, Amanda L; Shawkey, Matthew D; Lucas, Jeffrey R; Fernández-Juricic, Esteban

    2017-12-15

    Variation in male signal production has been extensively studied because of its relevance to animal communication and sexual selection. Although we now know much about the mechanisms that can lead to variation between males in the properties of their signals, there is still a general assumption that there is little variation in terms of how females process these male signals. Variation between females in signal processing may lead to variation between females in how they rank individual males, meaning that one single signal may not be universally attractive to all females. We tested this assumption in a group of female wild-caught brown-headed cowbirds (Molothrus ater), a species that uses a male visual signal (e.g. a wingspread display) to make its mate-choice decisions. We found that females varied in two key parameters of their visual sensory systems related to chromatic and achromatic vision: cone densities (both total and proportions) and cone oil droplet absorbance. Using visual chromatic and achromatic contrast modeling, we then found that this between-individual variation in visual physiology leads to significant between-individual differences in how females perceive chromatic and achromatic male signals. These differences may lead to variation in female preferences for male visual signals, which would provide a potential mechanism for explaining individual differences in mate-choice behavior. © 2017. Published by The Company of Biologists Ltd.

  9. When to use the projection assumption and the weak-phase object approximation in phase contrast cryo-EM

    Energy Technology Data Exchange (ETDEWEB)

    Vulović, Miloš [Quantitative Imaging Group, Faculty of Applied Sciences, Delft University of Technology, Lorentzweg 1, 2628 CJ Delft (Netherlands); Electron Microscopy Section, Molecular Cell Biology, Leiden University Medical Center, P.O. Box 9600, 2300 RC Leiden (Netherlands); Voortman, Lenard M.; Vliet, Lucas J. van [Quantitative Imaging Group, Faculty of Applied Sciences, Delft University of Technology, Lorentzweg 1, 2628 CJ Delft (Netherlands); Rieger, Bernd, E-mail: b.rieger@tudelft.nl [Quantitative Imaging Group, Faculty of Applied Sciences, Delft University of Technology, Lorentzweg 1, 2628 CJ Delft (Netherlands)

    2014-01-15

    The projection assumption (PA) and the weak-phase object approximation (WPOA) are commonly used to model image formation in cryo-electron microscopy. For simulating the next step in resolution improvement we show that it is important to revisit these two approximations as well as their limitations. Here we start off by inspecting both approximations separately to derive their respective conditions of applicability. The thick-phase grating approximation (TPGA) imposes less strict conditions on the interaction potential than PA or WPOA and gives comparable exit waves as a multislice calculation. We suggest the ranges of applicability for four models (PA, PA+WPOA, WPOA, and TPGA) given different interaction potentials using exit wave simulations. The conditions of applicability for the models are based on two measures, a worst-case (safest) and an average criterion. This allows us to present a practical guideline for when to use each image formation model depending on the spatial frequency, thickness and strength of the interaction potential of a macromolecular complex. - Highlights: • Show applicability of projection assumption and weak-phase object approximation. • PA and WPOA are satisfied in cryo-EM tomograms at >30 Å resolution. • At ∼5 Å resolution PA and WPOA fail for typical macromolecules. • Thick phase grating approximation is an excellent substitute to multi-slice computation for life sciences.
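    As a toy numerical aside (not taken from the paper), the breakdown of the weak-phase object approximation can be seen directly: it linearizes the exit wave, exp(iφ) ≈ 1 + iφ, which is accurate only while the accumulated phase shift φ stays small.

```python
import cmath

# Error of the weak-phase linearization exp(i*phi) ≈ 1 + i*phi as the
# projected-potential phase shift phi (radians, hypothetical values) grows.
errors = {}
for phi in (0.1, 0.5, 1.0, 2.0):
    exact = cmath.exp(1j * phi)
    wpoa = 1 + 1j * phi
    errors[phi] = abs(exact - wpoa)
    print(f"phi={phi:.1f}  |exp(i*phi) - (1+i*phi)| = {errors[phi]:.3f}")
```

    The leading error grows roughly as φ²/2, which is why the approximation holds for thin, weakly scattering specimens but fails once the specimen is thick or the interaction potential strong.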

  10. A estratégia dos bônus: três pressupostos e uma consequência Bonus strategy: three assumptions and one consequence

    Directory of Open Access Journals (Sweden)

    Marcos Barbosa de Oliveira

    2009-11-01

    The 'bonus strategy' is defined as the practice, on the part of employers, of trying to make employees work more and better by using as an incentive monetary advantages granted in addition to wages, conditional on increased productivity. Two examples of the strategy are mentioned, one referring to the Department of Education of the State of São Paulo, the other to the University of São Paulo. Three assumptions of the strategy are then examined: the conception of work as a burden, the worker imbued with the spirit of capitalism, and monetary reward as the only form of incentive. The aim is to show that none of them has universal validity, and that they are therefore strictly false. Additional evidence for the invalidity of the assumptions is then presented, drawn from the work of teachers who have retired, or who are already entitled to retire but remain active. The last section presents the most harmful consequence of the use of the bonus strategy: the 'idiotization' of society.

  11. The early bird gets the shrimp: confronting assumptions of isotopic equilibrium and homogeneity in a wild bird population.

    Science.gov (United States)

    Wunder, Michael B; Jehl, Joseph R; Stricker, Craig A

    2012-11-01

    1. Because stable isotope distributions in organic material vary systematically across energy gradients that exist in ecosystems, community and population structures, and in individual physiological systems, isotope values in animal tissues have helped address a broad range of questions in animal ecology. It follows that every tissue sample provides an isotopic profile that can be used to study dietary or movement histories of individual animals. Interpretations of these profiles depend on the assumption that metabolic pools are isotopically well mixed and in equilibrium with dietary resources prior to tissue synthesis, and they extend to the population level by assuming isotope profiles are identically distributed for animals using the same proximal dietary resource. As these assumptions are never fully met, studying structure in the variance of tissue isotope values from wild populations is informative. 2. We studied variation in δ13C, δ15N, δ2H and δ18O data for feathers from a population of eared grebes (Podiceps nigricollis) that migrate to Great Salt Lake each fall to moult feathers. During this time, they cannot fly and feed almost exclusively on superabundant brine shrimp (Artemia franciscana). The ecological simplicity of this situation minimized the usual spatial and trophic complexities often present in natural studies of feather isotope values. 3. Ranges and variances of isotope values for the feathers were larger than those from previously published studies that report feather isotopic variance, but they were bimodally distributed in all isotope dimensions. Isotope values for proximal dietary resources and local surface water show that some of the feathers we assumed to have been grown locally must have been grown before birds reached isotopic equilibrium with local diet or immediately prior to arrival at Great Salt Lake. 4. Our study provides novel insights about resource use strategies in eared grebes during migration. More generally

  12. Using Instrument Simulators and a Satellite Database to Evaluate Microphysical Assumptions in High-Resolution Simulations of Hurricane Rita

    Science.gov (United States)

    Hristova-Veleva, S. M.; Chao, Y.; Chau, A. H.; Haddad, Z. S.; Knosp, B.; Lambrigtsen, B.; Li, P.; Martin, J. M.; Poulsen, W. L.; Rodriguez, E.; Stiles, B. W.; Turk, J.; Vu, Q.

    2009-12-01

    Improving forecasting of hurricane intensity remains a significant challenge for the research and operational communities. Many factors determine a tropical cyclone’s intensity. Ultimately, though, intensity is dependent on the magnitude and distribution of the latent heating that accompanies the hydrometeor production during the convective process. Hence, the microphysical processes and their representation in hurricane models are of crucial importance for accurately simulating hurricane intensity and evolution. The accurate modeling of the microphysical processes becomes increasingly important when running high-resolution models that should properly reflect the convective processes in the hurricane eyewall. There are many microphysical parameterizations available today. However, evaluating their performance and selecting the most representative ones remains a challenge. Several field campaigns were focused on collecting in situ microphysical observations to help distinguish between different modeling approaches and improve on the most promising ones. However, these point measurements cannot adequately reflect the space and time correlations characteristic of the convective processes. An alternative approach to evaluating microphysical assumptions is to use multi-parameter remote sensing observations of the 3D storm structure and evolution. In doing so, we could compare modeled to retrieved geophysical parameters. The satellite retrievals, however, carry their own uncertainty. To increase the fidelity of the microphysical evaluation results, we can use instrument simulators to produce satellite observables from the model fields and compare to the observed. This presentation will illustrate how instrument simulators can be used to discriminate between different microphysical assumptions. We will compare and contrast the members of high-resolution ensemble WRF model simulations of Hurricane Rita (2005), each member reflecting different microphysical assumptions

  13. Setting the agenda in emergency medicine in the southern African region: Conference assumptions and recommendations, Emergency Medicine Conference 2014: Gaborone, Botswana

    Directory of Open Access Journals (Sweden)

    Lloyd D. Christopher

    2014-09-01

    The first international emergency medicine (EM) conference in Botswana was held on 15th and 16th May 2014 at the Gaborone International Convention Centre. The support from key stakeholders positioned the conference, from its conception, to deliver expert guidance on emergency medicine relevance, education and systems implementation. The conference theme was aptly: “Setting the Agenda in Emergency Medicine in the Southern African Region.” Over 300 local, regional and international delegates convened to participate in this landmark event. Country representation included Botswana, South Africa, Zambia, Namibia, Zimbabwe, Swaziland, Lesotho, Nigeria and the United States of America. Conference assumptions intersected emergency care, the African burden of injury and illness and the role of the state; the public protection ethic of emergency care; and the developmental, economic and health interest in promoting EM. The recommendations addressed emergency care relevance; health systems research as an imperative for emergency systems development in southern Africa; community agency as a requisite for emergency care resilience; emergency care workers as pivotal to the emergency medical system; and support of EM system implementation. The conference recommendations, by way of setting an agenda, augur well for emergency care development and implementation in the southern African region and are likely to prove useful to the southern African countries seeking to address health service quality, EM advocacy support and implementation guidance. Emergency medicine is the only discipline with ‘universality’ and ‘responsivity’ at the point of need. This implies a widespread potential for facilitating access to health care: a public health goal nuanced by the African development agenda.

  14. Serum neurophysins during passive assumption of the erect posture in men and during pregnancy: effect of syncope.

    Science.gov (United States)

    Legros, J J

    1976-06-01

A slight but reproducible increase in serum total immunoreactive neurophysin levels (IRN) occurred 30 minutes following passive assumption of the erect posture (tilt test) in 12 men (17%, 2P < .02) and in 6 pregnant women (20%, 2P = NS) who showed normal cardiovascular adaptation during the test and did not suffer syncope. The increase was much more marked in 4 men in whom there was a fall in blood pressure and syncope on assuming the upright posture. An increase of twice the basal level was also found in a pregnant woman who experienced syncope, although she was maintained horizontal. Our results show that it is important to ensure the absence of these intercurrent phenomena for the correct interpretation of dynamic clinical tests of neurohypophysial function.

  15. CHANGES IN THE ORGANIZATION AND LOGISTICS OF SUPPLIES IN SUGAR FACTORIES (METHODOLOGICAL ASSUMPTIONS FOR THE STUDY OF EFFECTIVENESS

    Directory of Open Access Journals (Sweden)

    Marcin Polowczyk

    2016-12-01

    Full Text Available The article attempts to determine the methodological assumptions for further research on the effectiveness of changes in the organization and logistics of supplies in sugar factories. For this purpose, data were collected on historical ways of organizing transport of sugar beets to various sugar factories. Preliminary analysis of the data shows that the direction of development of the organization and logistics of beet deliveries is towards a process fully organized and supervised by individual sugar producers. This approach allows for better use of existing resources and the development of new directions of the organization processes. All actions taken by the sugar sector are aimed at introducing optimization, acceleration of movement and use of the latest communication techniques in order to reduce effort and time in the organization of the process.

  16. Sensitivity of Utility-Scale Solar Deployment Projections in the SunShot Vision Study to Market and Performance Assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Eurek, K.; Denholm, P.; Margolis, R.; Mowers, M.

    2013-04-01

The SunShot Vision Study explored the potential growth of solar markets if solar prices decreased by about 75% from 2010 to 2020. The ReEDS model was used to simulate utility PV and CSP deployment for the present study, based on several market and performance assumptions - electricity demand, natural gas prices, coal retirements, cost and performance of non-solar renewable technologies, PV resource variability, distributed PV deployment, and solar market supply growth - in addition to the SunShot solar price projections. This study finds that utility-scale solar deployment is highly sensitive to solar prices. Other factors can also have significant impacts, particularly electricity demand and natural gas prices.

  17. Predicting salt intrusion into freshwater aquifers resulting from CO2 injection – A study on the influence of conservative assumptions

    DEFF Research Database (Denmark)

    Walter, Lena; Binning, Philip John; Class, Holger

    2013-01-01

… A crucial task is to choose an appropriate conceptual model and relevant scenarios. Overly conservative assumptions may lead to estimation of unacceptably high risks, and thus prevent the implementation of a CO2 storage project unnecessarily. On the other hand, risk assessment should not lead to an underestimation of hazards. This study compares two conceptual model approaches for the numerical simulation of brine-migration scenarios through a vertical fault and salt intrusion into a freshwater aquifer. The first approach calculates salt discharge into freshwater using an immiscible two-phase model … -phase model is applied in the CO2 storage reservoir and spatially coupled to a single-phase (water) two-component (water, salt) model, where salt mass fraction is a variable. A Dirichlet–Neumann technique is used for the coupling conditions at the interface of the two models. The results show …

  18. Contributions to the history of psychology: XLVIII. Ancient Greek roots of the assumptions of modern clinical psychology.

    Science.gov (United States)

    Robbins, R A

    1988-06-01

This paper is an account of studies of the linguistic transformation that took place in ancient Greece between the eighth and fourth centuries B.C., searching for factors which contributed to the shift in how humans perceived themselves. The group or force-field consciousness of the men of the Iliad and the linguistic factors which allowed "individuality" to emerge by the time of Plato are explored. The account relates the emergence of the notion of "madness" to the development of the individual, asks whether madness is an artifact of individuality, and explores the relationship of these developments to our present underlying assumption of a duality in human nature composed of the rational and the irrational.

  19. Assumption Centred Modelling of Ecosystem Responses to CO2 at Six US Atmospheric CO2 Enrichment Experiments.

    Science.gov (United States)

    Walker, A. P.; De Kauwe, M. G.; Medlyn, B. E.; Zaehle, S.; Luus, K. A.; Ryan, E.; Xia, J.; Norby, R. J.

    2015-12-01

Plant photosynthetic rates increase and stomatal apertures decrease in response to elevated atmospheric CO[2] (eCO2), increasing both plant carbon (C) availability and water use efficiency. These physiological responses to eCO2 are well characterised and understood; however, the ecological effects of these responses, as they cascade through a suite of plant and ecosystem processes, are complex and subject to multiple interactions and feedbacks. Therefore the response of the terrestrial carbon sink to increasing atmospheric CO[2] remains the largest uncertainty in global C cycle modelling to date, and is a huge contributor to uncertainty in climate change projections. Phase 2 of the FACE Model-Data Synthesis (FACE-MDS) project synthesises ecosystem observations from five long-term Free-Air CO[2] Enrichment (FACE) experiments and one open top chamber (OTC) experiment to evaluate the assumptions of a suite of terrestrial ecosystem models. The experiments are: the evergreen needleleaf Duke Forest FACE (NC), the deciduous broadleaf Oak Ridge FACE (TN), the prairie heating and FACE (WY), the Nevada desert FACE, and the evergreen scrub oak OTC (FL). An assumption-centred approach is being used to analyse: the interaction between eCO2 and water limitation on plant productivity; the interaction between eCO2 and temperature on plant productivity; whether increased rates of soil decomposition observed in many eCO2 experiments can account for model deficiencies in N uptake shown during Phase 1 of the FACE-MDS; and tracing carbon through the ecosystem to identify the exact cause of changes in ecosystem C storage.

  20. Social aspects of revitalization of rural areas. Implementation of the rural revival programme in lodzkie voivodeship. Assumptions for sociological research

    Directory of Open Access Journals (Sweden)

    Pamela Jeziorska-Biel

    2012-04-01

Full Text Available Essential elements of the process of a rural renovation programme are: stimulating the activity of local communities and cooperation for development, while preserving social identity, cultural heritage and the natural environment. Implementing a rural revival programme in Poland – the Sectoral Operational Programme “The Restructuring and Modernisation of the Food Sector and the Development of Rural Areas in 2004-2006” (action 2.3 “Rural renovation and protection and preservation of cultural heritage”) – evokes criticism. A wide discussion is being carried on amongst researchers, politicians, social activists, and local government practitioners. The main question remains: “is the rural renovation process in Poland conducted in accordance with the rules in European countries, or is it only a new formula of rural modernisation with the use of European funds?” The authors join the discussion, and in the second part of the article they present the assumptions of sociological research. The aim of the analysis is to grasp the essence of revitalization of rural areas located in Łódzkie voivodeship, and to analyse the question of the specificity of rural revival programmes. What is the scope and manner of use of local capital? Are the results obtained from implementing a rural revival programme in 2004-2006 within the scope of sustainable development? What activities are predominant in the process of project implementation? Is it rural modernisation, revitalization of the rural areas, barrier removal and change in infrastructure, or creation of social capital and subjectivity of the local community? Has the process of rural renovation in Łódzkie voivodeship got the so-called “social face” and, if so, to what extent? The major assumption is that the rural renovation programme in Łódzkie voivodeship relates more to the material aspects of revitalization than to its “spirituality”.

  1. Sensitivity of C-Band Polarimetric Radar-Based Drop Size Distribution Measurements to Maximum Diameter Assumptions

    Science.gov (United States)

    Carey, Lawrence D.; Petersen, Walter A.

    2011-01-01

    The estimation of rain drop size distribution (DSD) parameters from polarimetric radar observations is accomplished by first establishing a relationship between differential reflectivity (Z(sub dr)) and the central tendency of the rain DSD such as the median volume diameter (D0). Since Z(sub dr) does not provide a direct measurement of DSD central tendency, the relationship is typically derived empirically from rain drop and radar scattering models (e.g., D0 = F[Z (sub dr)] ). Past studies have explored the general sensitivity of these models to temperature, radar wavelength, the drop shape vs. size relation, and DSD variability. Much progress has been made in recent years in measuring the drop shape and DSD variability using surface-based disdrometers, such as the 2D Video disdrometer (2DVD), and documenting their impact on polarimetric radar techniques. In addition to measuring drop shape, another advantage of the 2DVD over earlier impact type disdrometers is its ability to resolve drop diameters in excess of 5 mm. Despite this improvement, the sampling limitations of a disdrometer, including the 2DVD, make it very difficult to adequately measure the maximum drop diameter (D(sub max)) present in a typical radar resolution volume. As a result, D(sub max) must still be assumed in the drop and radar models from which D0 = F[Z(sub dr)] is derived. Since scattering resonance at C-band wavelengths begins to occur in drop diameters larger than about 5 mm, modeled C-band radar parameters, particularly Z(sub dr), can be sensitive to D(sub max) assumptions. In past C-band radar studies, a variety of D(sub max) assumptions have been made, including the actual disdrometer estimate of D(sub max) during a typical sampling period (e.g., 1-3 minutes), D(sub max) = C (where C is constant at values from 5 to 8 mm), and D(sub max) = M*D0 (where the constant multiple, M, is fixed at values ranging from 2.5 to 3.5). 
The overall objective of this NASA Global Precipitation Measurement …

  2. Estimating risks and relative risks in case-base studies under the assumptions of gene-environment independence and Hardy-Weinberg equilibrium.

    Science.gov (United States)

    Chui, Tina Tsz-Ting; Lee, Wen-Chung

    2014-01-01

    Many diseases result from the interactions between genes and the environment. An efficient method has been proposed for a case-control study to estimate the genetic and environmental main effects and their interactions, which exploits the assumptions of gene-environment independence and Hardy-Weinberg equilibrium. To estimate the absolute and relative risks, one needs to resort to an alternative design: the case-base study. In this paper, the authors show how to analyze a case-base study under the above dual assumptions. This approach is based on a conditional logistic regression of case-counterfactual controls matched data. It can be easily fitted with readily available statistical packages. When the dual assumptions are met, the method is approximately unbiased and has adequate coverage probabilities for confidence intervals. It also results in smaller variances and shorter confidence intervals as compared with a previous method for a case-base study which imposes neither assumption.
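The two assumptions can be made concrete with a small numerical sketch. This is illustrative only and not the authors' method; the allele frequency and exposure prevalence below are hypothetical values:

```python
# Hypothetical sketch: genotype probabilities under Hardy-Weinberg equilibrium,
# and a joint gene-environment distribution under the independence assumption.

def hwe_genotype_probs(q):
    """Genotype probabilities for a biallelic locus under Hardy-Weinberg
    equilibrium, given risk-allele frequency q."""
    p = 1.0 - q
    return {"aa": p * p, "Aa": 2.0 * p * q, "AA": q * q}

def joint_probs(q, env_prev):
    """Joint P(genotype, exposed) under gene-environment independence:
    the joint probability is simply the product of the two marginals."""
    return {
        (g, exposed): pg * (env_prev if exposed else 1.0 - env_prev)
        for g, pg in hwe_genotype_probs(q).items()
        for exposed in (True, False)
    }

probs = joint_probs(q=0.3, env_prev=0.4)
# Probabilities over all genotype-exposure cells sum to one.
assert abs(sum(probs.values()) - 1.0) < 1e-12
```

Under these constraints the genotype and exposure distributions are fully determined by two parameters, which is what lets the case-base design estimate main effects and interactions more efficiently.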

  3. MATHEMATICAL FRAMEWORK OF THE WELL PRODUCTIVITY INDEX FOR FAST FORCHHEIMER (NON-DARCY) FLOWS IN POROUS MEDIA

    KAUST Repository

    AULISA, EUGENIO

    2009-08-01

Motivated by the reservoir engineering concept of the well Productivity Index, we introduced and analyzed a functional, denoted as "diffusive capacity", for the solution of the initial-boundary value problem (IBVP) for a linear parabolic equation [21]. This IBVP described laminar (linear) Darcy flow in porous media; the considered boundary conditions corresponded to different regimes of the well production. The diffusive capacities were then computed as steady state invariants of the solutions to the corresponding time-dependent boundary value problem. Here similar features for fast or turbulent nonlinear flows subject to the Forchheimer equations are analyzed. It is shown that under some hydrodynamic and thermodynamic constraints, there exists a so-called pseudo-steady state regime for the Forchheimer flows in porous media. In other words, under some assumptions there exists a steady state invariant over a certain class of solutions to the transient IBVP modeling the Forchheimer flow for slightly compressible fluid. This invariant is the diffusive capacity, which serves as the mathematical representation of the so-called well Productivity Index. The obtained results enable computation of the well Productivity Index by resolving a single steady state boundary value problem for a second-order quasilinear elliptic equation. Analytical and numerical studies highlight some new relations for the well Productivity Index in linear and nonlinear cases. The obtained analytical formulas can potentially be used for the numerical well-block model as an analog of Peaceman's formula. © 2009 World Scientific Publishing Company.
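For orientation, the classical Darcy-flow case that this abstract generalizes can be sketched numerically. The following is an illustrative sketch only, using the standard textbook pseudo-steady-state Productivity Index for radial Darcy flow (not the paper's Forchheimer analysis); all reservoir values are made up:

```python
import math

def pss_productivity_index(k, h, mu, r_e, r_w):
    """Textbook pseudo-steady-state Productivity Index for radial Darcy flow
    of a slightly compressible fluid (consistent SI units):
        J = 2*pi*k*h / (mu * (ln(r_e/r_w) - 3/4))
    k: permeability [m^2], h: pay thickness [m], mu: viscosity [Pa*s],
    r_e: drainage radius [m], r_w: wellbore radius [m]."""
    return 2.0 * math.pi * k * h / (mu * (math.log(r_e / r_w) - 0.75))

# Illustrative (hypothetical) values: ~100 mD permeability, 10 m pay, 1 cP fluid.
J = pss_productivity_index(k=1e-13, h=10.0, mu=1e-3, r_e=300.0, r_w=0.1)
```

The point of the pseudo-steady-state regime is that J is time-invariant, which is exactly the property the diffusive capacity extends to the nonlinear Forchheimer case.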

  4. Sea urchin fertilization assay: an evaluation of assumptions related to sample salinity adjustment and use of natural and synthetic marine waters for testing.

    Science.gov (United States)

    Jonczyk, E; Gilron, G; Zajdlik, B

    2001-04-01

    Most industrial effluents discharged into the marine coastal environment are freshwater in nature and therefore require manipulation prior to testing with marine organisms. The sea urchin fertilization test is a common marine bioassay used for routine environmental monitoring, investigative evaluations, and/or regulatory testing of effluents and sediment pore waters. The existing Canadian and U.S. Environmental Protection Agencies test procedures using sea urchin (and sand dollar) gametes allow for sample salinity adjustment using either brine or dry salts. Moreover, these procedures also allow for the use of either natural or synthetic marine water for culturing/holding test organisms and for full-scale testing. At present, it is unclear to what extent these variables affect test results for whole effluents. The test methods simply state that there are no data available and that the use of artificial dry sea salts should be considered provisional. We conducted a series of concurrent experiments aimed at comparing the two different treatments of sample salinity adjustment and the use of natural versus synthetic seawater in order to test these assumptions and evaluate effects on the estimated end points generated by the sea urchin fertilization sublethal toxicity test. Results from these experiments indicated that there is no significant difference in test end points when dry salts or brine are used for sample salinity adjustment. Similarly, results obtained from parallel (split-sample) industrial effluent tests with natural and artificial seawater suggest that both dilution waters produce similar test results. However, data obtained from concurrent tests with the reference toxicant, copper sulfate, showed higher variability and greater sensitivity when using natural seawater as control/dilution water.

  5. Wind Power accuracy and forecast. D3.1. Assumptions on accuracy of wind power to be considered at short and long term horizons

    Energy Technology Data Exchange (ETDEWEB)

    Morthorst, P.E.; Coulondre, J.M.; Schroeder, S.T.; Meibom, P.

    2010-07-15

The main objective of the Optimate project (An Open Platform to Test Integration in new MArkeT designs of massive intermittent Energy sources dispersed in several regional power markets) is to develop a new tool for testing these new market designs with large-scale introduction of variable renewable energy sources. In Optimate a novel network/system/market modelling approach is being developed, generating an open simulation platform able to exhibit the comparative benefits of several market design options. This report constitutes deliverable 3.1 on the assumptions on accuracy of wind power to be considered at short and long term horizons. The report handles the issues of state-of-the-art prediction, how predictions for wind power enter into the Optimate model, and a simple and a more advanced methodology for generating trajectories of prediction errors to be used in Optimate. The main conclusion is that, from a theoretical viewpoint, the advanced approach is undoubtedly to be preferred to the simple one. However, the advanced approach was developed for the Wilmar model with the purpose of describing the integration of large-scale wind power in Europe. As the main purpose of the Optimate model is not to test the integration of wind power, but to test new market designs assuming strong growth in wind power production, a more simplified approach for describing wind power forecasts should be sufficient. Thus a further development of the simple approach is suggested, possibly including correlations between geographical areas. In this report the general methodologies for generating trajectories for wind power forecasts are outlined; however, the methods are not yet implemented. In the next phase of Optimate, the clusters will be defined and the needed data collected. Following this phase, actual results will be generated to be used in Optimate. (LN)
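One common simple way to generate correlated prediction-error trajectories of the kind discussed here is a first-order autoregressive process. The sketch below is hypothetical, not the Optimate or Wilmar methodology, and the rho and sigma values are made up:

```python
import random

def forecast_error_trajectory(n_steps, rho=0.9, sigma=0.05, seed=1):
    """Illustrative sketch: a trajectory of wind-power forecast errors as an
    AR(1) process, so consecutive errors are serially correlated.
    rho (persistence) and sigma (innovation spread) are hypothetical values."""
    rng = random.Random(seed)
    errors, e = [], 0.0
    for _ in range(n_steps):
        e = rho * e + rng.gauss(0.0, sigma)  # AR(1) update
        errors.append(e)
    return errors

traj = forecast_error_trajectory(24)  # e.g. one day of hourly errors
```

Extending such a sketch to several regions would mean correlating the innovation terms across areas, which is the "correlations between geographical areas" refinement the report suggests.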

  6. Rating leniency and halo in multisource feedback ratings: testing cultural assumptions of power distance and individualism-collectivism.

    Science.gov (United States)

    Ng, Kok-Yee; Koh, Christine; Ang, Soon; Kennedy, Jeffrey C; Chan, Kim-Yin

    2011-09-01

This study extends multisource feedback research by assessing the effects of rater source and raters' cultural value orientations on rating bias (leniency and halo). Using a motivational perspective of performance appraisal, the authors posit that subordinate raters, followed by peers, will exhibit more rating bias than superiors. More important, given that multisource feedback systems were premised on low power distance and individualistic cultural assumptions, the authors expect raters' power distance and individualism-collectivism orientations to moderate the effects of rater source on rating bias. Hierarchical linear modeling on data collected from 1,447 superiors, peers, and subordinates who provided developmental feedback to 172 military officers shows that (a) subordinates exhibit the most rating leniency, followed by peers and superiors; (b) subordinates demonstrate more halo than superiors and peers, whereas superiors and peers do not differ; (c) the effects of power distance on leniency and halo are stronger for subordinates than for peers and superiors; (d) the effects of collectivism on leniency were stronger for subordinates and peers than for superiors; effects on halo were stronger for subordinates than superiors, but did not differ between subordinates and peers. The present findings highlight the role of raters' cultural values in multisource feedback ratings. PsycINFO Database Record (c) 2011 APA, all rights reserved

  7. Early Validation of Automation Plant Control Software using Simulation Based on Assumption Modeling and Validation Use Cases

    Directory of Open Access Journals (Sweden)

    Veronika Brandstetter

    2015-10-01

Full Text Available In automation plants, technical processes must be conducted in a way that products, substances, or services are produced reliably, with sufficient quality and with minimal strain on resources. A key driver in conducting these processes is the automation plant’s control software, which controls the technical plant components and thereby affects the physical, chemical, and mechanical processes that take place in automation plants. To this end, the control software of an automation plant must adhere to strict process requirements arising from the technical processes and from the physical plant design. Currently, the validation of the control software often starts late in the engineering process – once the automation plant is almost completely constructed. However, as widely acknowledged, the later the control software of the automation plant is validated, the higher the effort for correcting revealed defects is, which can lead to serious budget overruns and project delays. In this article we propose an approach that allows the early validation of automation control software against the technical plant processes and assumptions about the physical plant design by means of simulation. We demonstrate the application of our approach on the example of an actual plant project from the automation industry and present its technical implementation.

  8. The metaphysics of D-CTCs: On the underlying assumptions of Deutsch's quantum solution to the paradoxes of time travel

    Science.gov (United States)

    Dunlap, Lucas

    2016-11-01

    I argue that Deutsch's model for the behavior of systems traveling around closed timelike curves (CTCs) relies implicitly on a substantive metaphysical assumption. Deutsch is employing a version of quantum theory with a significantly supplemented ontology of parallel existent worlds, which differ in kind from the many worlds of the Everett interpretation. Standard Everett does not support the existence of multiple identical copies of the world, which the D-CTC model requires. This has been obscured because he often refers to the branching structure of Everett as a "multiverse", and describes quantum interference by reference to parallel interacting definite worlds. But he admits that this is only an approximation to Everett. The D-CTC model, however, relies crucially on the existence of a multiverse of parallel interacting worlds. Since his model is supplemented by structures that go significantly beyond quantum theory, and play an ineliminable role in its predictions and explanations, it does not represent a quantum solution to the paradoxes of time travel.

  9. Thai SF-36 health survey: tests of data quality, scaling assumptions, reliability and validity in healthy men and women

    Directory of Open Access Journals (Sweden)

    Sleigh Adrian

    2008-07-01

Full Text Available Abstract Background Since its translation to Thai in 2000, the SF-36 Health Survey has been used extensively in many different clinical settings in Thailand. Its popularity has increased despite the absence of published evidence that the translated instrument satisfies scoring assumptions, the psychometric properties required for valid interpretation of the SF-36 summated ratings scales. The purpose of this paper was to examine these properties and to report on the reliability and validity of the Thai SF-36 in a non-clinical general population. Methods 1345 distance-education university students who live in all areas of Thailand completed a questionnaire comprising the Thai SF-36 (Version 1). Median age was 31 years. Psychometric tests recommended by the International Quality of Life Assessment Project were used. Results Data quality was satisfactory: the questionnaire completion rate was high (97.5%) and missing data rates were low. … Conclusion The summated ratings method can be used for scoring the Thai SF-36. The instrument was found to be reliable and valid for use in a general non-clinical population. Version 2 of the SF-36 could improve ceiling and floor effects in the role functioning scales. Further work is warranted to refine items that measure the concepts of social functioning, vitality and mental health to improve the reliability and discriminant validity of these scales.

  10. Three-Dimensional Phylogeny Explorer: Distinguishing paralogs, lateral transfer, and violation of "molecular clock" assumption with 3D visualization

    Directory of Open Access Journals (Sweden)

    Lee Christopher

    2007-06-01

Full Text Available Abstract Background Construction and interpretation of phylogenetic trees has been a major research topic for understanding the evolution of genes. Increases in sequence data and complexity are creating a need for more powerful and insightful tree visualization tools. Results We have developed 3D Phylogeny Explorer (3DPE), a novel phylogeny tree viewer that maps trees onto three spatial axes (species on the X-axis; paralogs on Z; evolutionary distance on Y), enabling one to distinguish at a glance evolutionary features such as speciation; gene duplication and paralog evolution; lateral gene transfer; and violation of the "molecular clock" assumption. Users can input any tree on the online 3DPE, then rotate, scroll, rescale, and explore it interactively as "live" 3D views. All objects in 3DPE are clickable to display subtrees, connectivity path highlighting, sequence alignments, gene summary views, etc. To illustrate the value of this visualization approach for microbial genomes, we also generated 3D phylogeny analyses for all clusters from the public COG database. We constructed tree views using well-established methods and graph algorithms. We used Scientific Python to generate VRML2 3D views viewable in any web browser. Conclusion 3DPE provides a novel phylogenetic tree projection method into 3D space and a web-based implementation with live 3D features for reconstruction of phylogenetic trees of the COG database.

  11. Comparative study of aggregations under different dependency assumptions for assessment of undiscovered recoverable oil resources in the world

    Science.gov (United States)

    Crovelli, R.A.

    1985-01-01

The U.S. Geological Survey assessed all significant sedimentary basins in the world for undiscovered conventionally recoverable crude-oil resources. Probabilistic methodology was applied to each basin assessment to produce estimates in the form of probability distributions. Basin probability distributions were computer-aggregated to produce resource estimates for the entire world. The aggregation was approximated by a three-parameter lognormal distribution obtained by combining the first three central moments of the basin distributions. For purposes of experiment and study, world aggregation was conducted under four different sets of assumptions. The four cases are (1) dependent assessments of all basins, (2) dependent assessments within continental areas, but independent assessments among continental areas, (3) dependent assessments within countries, but independent assessments among countries, and (4) independent assessments of all basins. The mean estimate remained the same in all four cases, but the width of the interval estimate formed using the 5th and 95th fractiles decreased with reduced dependency in going from the first to the fourth case. © 1985 Plenum Publishing Corporation.
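The qualitative result can be illustrated with a minimal sketch of the two aggregation extremes. This is not the study's method (which combines three central moments and fits a lognormal); it uses only the first two moments, and the basin numbers are made up:

```python
import math

def aggregate(means, sds, dependent):
    """Sketch of aggregating basin estimates: under perfect dependence the
    standard deviations add, under independence the variances add. The
    aggregate mean is identical in both cases."""
    total_mean = sum(means)
    if dependent:
        total_sd = sum(sds)                            # perfectly correlated basins
    else:
        total_sd = math.sqrt(sum(s * s for s in sds))  # uncorrelated basins
    return total_mean, total_sd

m_dep, s_dep = aggregate([10.0, 20.0, 15.0], [3.0, 5.0, 4.0], dependent=True)
m_ind, s_ind = aggregate([10.0, 20.0, 15.0], [3.0, 5.0, 4.0], dependent=False)
# Same mean either way; the independent case has the smaller spread, hence a
# narrower 5th-95th fractile interval, mirroring the study's finding.
assert m_dep == m_ind and s_ind < s_dep
```

The study's intermediate cases (dependence within continents or countries only) fall between these two extremes, which is why the interval width decreases monotonically from case (1) to case (4).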

  12. Testing the Assumption of Measurement Invariance in the SAMHSA Mental Health and Alcohol Abuse Stigma Assessment in Older Adults.

    Science.gov (United States)

    King-Kallimanis, Bellinda L; Oort, Frans J; Lynn, Nancy; Schonfeld, Lawrence

    2012-12-01

This study examined the assumption of measurement invariance of the SAMHSA Mental Health and Alcohol Abuse Stigma Assessment. This is necessary to make valid comparisons across time and groups. The data come from the Primary Care Research in Substance Abuse and Mental Health for the Elderly trial, a longitudinal, multisite, randomized trial examining two modes of care (Referral and Integrated). A sample of 1,198 adults over the age of 65 who screened positive for depression, anxiety, and/or at-risk drinking was used. Structural equation modeling was used to assess measurement invariance in a two-factor measurement model (Perceived Stigma, Comfort Level). Irrespective of their stigma level, one bias indicated that, with time, respondents find it easier to acknowledge that it is difficult to start treatment if others know they are in treatment. Other biases indicated that sex, mental quality of life, and the subject of stigma had undue influence on respondents' feeling that people would think differently of them if they received treatment and on respondents' comfort in talking to a mental health provider. Still, in the present study, these biases in response behavior had little effect on the evaluation of group differences and changes in stigma. Stigma decreased for patients of both the Referral and Integrated care groups.

  13. Impedance measurement of non-locally reactive samples and the influence of the assumption of local reaction.

    Science.gov (United States)

    Brandão, Eric; Mareze, Paulo; Lenzi, Arcanjo; da Silva, Andrey R

    2013-05-01

    In this paper, the measurement of the absorption coefficient of non-locally reactive sample layers of thickness d1 backed by a rigid wall is investigated. The investigation is carried out with the aid of real and theoretical experiments, which assume a monopole sound source radiating sound above an infinite non-locally reactive layer. A literature search revealed that the number of papers devoted to this matter is rather limited in comparison to those which address the measurement of locally reactive samples. Furthermore, the majority of papers published describe the use of two or more microphones whereas this paper focuses on the measurement with the pressure-particle velocity sensor (PU technique). For these reasons, the assumption that the sample is locally reactive is initially explored, so that the associated measurement errors can be quantified. Measurements in the impedance tube and in a semi-anechoic room are presented to validate the theoretical experiment. For samples with a high non-local reaction behavior, for which the measurement errors tend to be high, two different algorithms are proposed in order to minimize the associated errors.

  14. Testing implicit assumptions regarding the age vs. size dependence of stem biomechanics using Pittocaulon (Senecio) praecox (Asteraceae).

    Science.gov (United States)

    Rosell, Julieta A; Olson, Mark E

    2007-02-01

    Strong covariation between organismal traits is often taken as an indication of a potentially adaptively significant relationship. Because one of the main functions of woody stems is mechanical support, identifying the factors that covary with biomechanics is essential for inference of adaptation. To date in such studies, stem biomechanics is plotted against stem age or size, thus with implicit assumptions regarding the importance of each in determining mechanics. Likewise, comparing ontogenies between individuals is central to the study of ontogenetic evolution (e.g., heterochrony). Both absolute age and size have been used, but the rationale for choosing one over the other has not been examined. Sampling a plant of simple architecture across microsites with differing sizes for the same absolute age, we compared regressions of stem length, mechanics, and tissue areas against age and size. Stem length was predicted by diameter but not by age, and stem biomechanics and tissue areas were better explained by stem length rather than age. We show that the allometric and mechanical properties observed across microsites are uniform despite great plasticity in other features (e.g., size and wood anatomy) and suggest that this uniformity is an example of developmental homeostasis. Finally, we discuss reasons for preferring size over absolute age as a basis for comparing ontogenies between individuals.

  15. A Violation of the Conditional Independence Assumption in the Two-High-Threshold Model of Recognition Memory

    Science.gov (United States)

    Chen, Tina; Starns, Jeffrey J.; Rotello, Caren M.

    2015-01-01

    The 2-high-threshold (2HT) model of recognition memory assumes that test items result in distinct internal states: they are either detected or not, and the probability of responding at a particular confidence level that an item is "old" or "new" depends on the state-response mapping parameters. The mapping parameters are…

  16. Analytic Review of Modeling Studies of ARV Based PrEP Interventions Reveals Strong Influence of Drug-Resistance Assumptions on the Population-Level Effectiveness

    Science.gov (United States)

    Dimitrov, Dobromir; Boily, Marie-Claude; Brown, Elizabeth R.; Hallett, Timothy B.

    2013-01-01

    Background Four clinical trials have shown that oral and topical pre-exposure prophylaxis (PrEP) based on tenofovir may be effective in preventing HIV transmission. The expected reduction in HIV transmission and the projected prevalence of drug resistance due to PrEP use vary significantly across modeling studies as a result of the broad spectrum of assumptions employed. Our goal is to quantify the influence of drug resistance assumptions on the predicted population-level impact of PrEP. Methods All modeling studies which evaluate the impact of oral or topical PrEP are reviewed and key assumptions regarding mechanisms of generation and spread of drug-resistant HIV are identified. A dynamic model of the HIV epidemic is developed to assess and compare the impact of oral PrEP using resistance assumptions extracted from published studies. The benefits and risks associated with ten years of PrEP use are evaluated under identical epidemic, behavioral and intervention conditions in terms of cumulative fractions of new HIV infections prevented, resistance prevalence among those infected with HIV, and fractions of infections in which resistance is transmitted. Results Published models demonstrate enormous variability in resistance-generating assumptions and uncertainty in parameter values. Depending on which resistance parameterization is used, a resistance prevalence between 2% and 44% may be expected if 50% efficacious oral PrEP is used consistently by 50% of the population over ten years. We estimated that resistance may be responsible for up to a 10% reduction or up to a 30% contribution to the fraction of prevented infections predicted in different studies. Conclusions Resistance assumptions used in published studies have a strong influence on the projected impact of PrEP. 
Modelers and virologists should collaborate toward clarifying the set of resistance assumptions biologically relevant to the PrEP products which are already in use or soon to be added to the arsenal.

  17. The geometry of the flux cone of a metabolic network.

    Science.gov (United States)

    Wagner, Clemens; Urbanczik, Robert

    2005-12-01

    The analysis of metabolic networks has become a major topic in biotechnology in recent years. Applications range from the enhanced production of selected outputs to the prediction of genotype-phenotype relationships. The concepts used are based on the assumption of a pseudo steady-state of the network, so that for each metabolite inputs and outputs are balanced. The stoichiometric network analysis expands the steady state into a combination of nonredundant subnetworks with positive coefficients called extremal currents. Based on the unidirectional representation of the system, these subnetworks form a convex cone in flux space. A modification of this approach allowing for reversible reactions led to the definition of elementary modes. Extreme pathways are obtained with the same method but splitting up internal reactions into forward and backward rates. In this study, we explore the relationship between these concepts. Due to the combinatorial explosion of the number of elementary modes in large networks, we promote a further set of metabolic routes, which we call the minimal generating set. It is the smallest subset of elementary modes required to describe all steady states of the system. For large-scale networks, this set is several orders of magnitude smaller than the sets of elementary modes and of extreme pathways.
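
The pseudo steady-state constraint at the heart of these concepts can be illustrated with a toy network; the sketch below (a hypothetical three-metabolite linear pathway, not taken from the paper) extracts the null space of the stoichiometric matrix, the linear space in which the flux cone lives:

```python
import numpy as np

# Toy pathway: -> A -> B -> C ->; rows = internal metabolites, cols = reactions.
S = np.array([
    [1, -1,  0,  0],   # A: produced by v1, consumed by v2
    [0,  1, -1,  0],   # B
    [0,  0,  1, -1],   # C
])

# Pseudo steady state requires S @ v = 0 for every admissible flux vector v;
# adding irreversibility constraints (v >= 0) turns this space into a cone.
_, sing, Vt = np.linalg.svd(S)
rank = int((sing > 1e-10).sum())
kernel = Vt[rank:].T               # columns span the null space of S

print(kernel.shape[1])             # 1: a single through-flux mode v1=v2=v3=v4
print(np.allclose(S @ kernel, 0))  # True: every kernel flux balances A, B, C
```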

  18. Quantifying tap-to-household water quality deterioration in urban communities in Vellore, India: The impact of spatial assumptions.

    Science.gov (United States)

    Alarcon Falconi, Tania M; Kulinkina, Alexandra V; Mohan, Venkata Raghava; Francis, Mark R; Kattula, Deepthi; Sarkar, Rajiv; Ward, Honorine; Kang, Gagandeep; Balraj, Vinohar; Naumova, Elena N

    2017-01-01

    Municipal water sources in India have been found to be highly contaminated, with further water quality deterioration occurring during household storage. Quantifying water quality deterioration requires knowledge about the exact source tap and length of water storage at the household, which is not usually known. This study presents a methodology to link source and household stored water, and explores the effects of spatial assumptions on the association between tap-to-household water quality deterioration and enteric infections in two semi-urban slums of Vellore, India. To determine a possible water source for each household sample, we paired household and tap samples collected on the same day using three spatial approaches implemented in GIS: minimum Euclidean distance; minimum network distance; and inverse network-distance weighted average. Logistic and Poisson regression models were used to determine associations between water quality deterioration and household-level characteristics, and between diarrheal cases and water quality deterioration. On average, 60% of households had higher fecal coliform concentrations in household samples than at source taps. Only the weighted average approach detected a higher risk of water quality deterioration for households that do not purify water and that have animals in the home (RR=1.50 [1.03, 2.18], p=0.033); and showed that households with water quality deterioration were more likely to report diarrheal cases (OR=3.08 [1.21, 8.18], p=0.02). Studies to assess contamination between source and household are rare due to methodological challenges and high costs associated with collecting paired samples. Our study demonstrated it is possible to derive useful spatial links between samples post hoc; and that the pairing approach affects the conclusions related to associations between enteric infections and water quality deterioration. Copyright © 2016 Elsevier GmbH. All rights reserved.
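
The spatial pairing approaches can be sketched as follows; the coordinates and coliform counts are invented for illustration, and the network-distance variant would substitute shortest-path distances along the pipe network for the Euclidean distances used here:

```python
import math

# Hypothetical tap coordinates (metres) and tap water quality; illustrative only.
taps = {"T1": (0.0, 0.0), "T2": (120.0, 40.0), "T3": (60.0, 200.0)}
tap_fc = {"T1": 10.0, "T2": 80.0, "T3": 30.0}   # fecal coliforms, CFU/100 mL
household = (50.0, 30.0)

def euclid(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Approach 1: minimum Euclidean distance -> a single candidate source tap.
nearest = min(taps, key=lambda t: euclid(household, taps[t]))

# Approach 3: inverse-distance weighted average over all candidate taps.
w = {t: 1.0 / euclid(household, taps[t]) for t in taps}
total = sum(w.values())
weighted_fc = sum(tap_fc[t] * w[t] / total for t in taps)

print(nearest)                  # T1 is closest to this household
print(round(weighted_fc, 1))   # blended source-water quality estimate
```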

  19. Residential appliance data, assumptions and methodology for end-use forecasting with EPRI-REEPS 2.1

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, R.J.; Johnson, F.X.; Brown, R.E.; Hanford, J.W.; Koomey, J.G.

    1994-05-01

    This report details the data, assumptions and methodology for end-use forecasting of appliance energy use in the US residential sector. Our analysis uses the modeling framework provided by the Appliance Model in the Residential End-Use Energy Planning System (REEPS), which was developed by the Electric Power Research Institute. In this modeling framework, appliances include essentially all residential end-uses other than space conditioning end-uses. We have defined a distinct appliance model for each end-use based on a common modeling framework provided in the REEPS software. This report details our development of the following appliance models: refrigerator, freezer, dryer, water heater, clothes washer, dishwasher, lighting, cooking and miscellaneous. Taken together, appliances account for approximately 70% of electricity consumption and 30% of natural gas consumption in the US residential sector. Appliances are thus important to those residential sector policies or programs aimed at improving the efficiency of electricity and natural gas consumption. This report is primarily methodological in nature, taking the reader through the entire process of developing the baseline for residential appliance end-uses. Analysis steps documented in this report include: gathering technology and market data for each appliance end-use and specific technologies within those end-uses, developing cost data for the various technologies, and specifying decision models to forecast future purchase decisions by households. Our implementation of the REEPS 2.1 modeling framework draws on the extensive technology, cost and market data assembled by LBL for the purpose of analyzing federal energy conservation standards. The resulting residential appliance forecasting model offers a flexible and accurate tool for analyzing the effect of policies at the national level.

  20. The inversion of spectral ratio H/V in a layered system using the diffuse field assumption (DFA)

    Science.gov (United States)

    Piña-Flores, José; Perton, Mathieu; García-Jerez, Antonio; Carmona, Enrique; Luzón, Francisco; Molina-Villegas, Juan C.; Sánchez-Sesma, Francisco J.

    2017-01-01

    In order to evaluate site effects on seismic ground motion and establish preventive measures to mitigate them, the dynamic characterization of sites is mandatory. Among the various geophysical tools aimed at this end, the horizontal-to-vertical spectral ratio (H/V) is a simple way to assess the dominant frequency of a site from seismic ambient noise. The aim of this communication is to enhance the potential of this measurement with a novel method that extracts from the H/V the elastic properties of the subsoil, modeled here as a multilayer medium. For that purpose, we adopt the diffuse field assumption from both the experimental and the modelling perspectives. At the experimental end, the idea is to define general criteria that make the data processing closely supported by theory. On the modelling front, the challenge is to compute efficiently the imaginary part of the Green's function; Cauchy's residue theorem in the complex horizontal-wavenumber plane is the selected approach. This method allows the contributions of body and surface waves to be identified and computed separately, which permits exploring the theoretical properties of the H/V under different compositions of the seismic ambient noise. This answers some long-standing questions and gives new insights into the H/V method. The efficient forward calculation is the prime ingredient of an inversion scheme based on both gradient and heuristic searches; it also allows exploring relevant relationships between the H/V curves and the model parameters, generating useful criteria to speed up inversion. As in many inverse problems, non-uniqueness issues also emerge here. A joint inversion method that also considers the dispersion curves of surface waves extracted from seismic ambient noise is presented and applied to experimental data; this joint scheme effectively mitigates the non-uniqueness.
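
On the measurement side, the DFA-consistent H/V is the square root of a ratio of directional energy densities; a minimal sketch on synthetic noise (the 2:1 horizontal-to-vertical energy ratio is an arbitrary assumption, and real processing averages many windows):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
# Synthetic "ambient noise": each horizontal component is given twice the
# vertical energy, purely as an illustrative assumption.
h1 = rng.normal(0.0, 1.0, n)
h2 = rng.normal(0.0, 1.0, n)
v = rng.normal(0.0, 1.0 / np.sqrt(2.0), n)

def power(x):
    return np.abs(np.fft.rfft(x)) ** 2

# Under the DFA, H/V = sqrt((E|H1|^2 + E|H2|^2) / E|V|^2), a ratio of
# directional energy densities rather than of raw amplitude spectra.
hv = np.sqrt((power(h1) + power(h2)).mean() / power(v).mean())
print(round(hv, 1))
```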

  1. Solid state physics for metallurgists

    CERN Document Server

    Weiss, Richard J

    2013-01-01

    Metal Physics and Physical Metallurgy, Volume 6: Solid State Physics for Metallurgists provides an introduction to the basic understanding of the properties that make materials useful to mankind. This book discusses the electronic structure of matter, which is the domain of solid state physics. Organized into 12 chapters, this volume begins with an overview of the electronic structure of free atoms and the electronic structure of solids. This text then examines the basis of the Bloch theorem, which is the exact periodicity of the potential. Other chapters consider the fundamental assumption in

  2. FIGHTING THE CLASSICAL CRIME-SCENE ASSUMPTIONS. CRITICAL ASPECTS IN ESTABLISHING THE CRIME-SCENE PERIMETER IN COMPUTER-BASED EVIDENCE CASES

    Directory of Open Access Journals (Sweden)

    Cristina DRIGĂ

    2016-05-01

    Full Text Available Physical-world forensic investigation has the luxury of being tied to the sciences governing the investigated space, hence some assumptions can be made with some degree of certainty when investigating a crime. Cyberspace, on the other hand, has a dual nature, comprising both a physical layer susceptible to scientific analysis and a virtual layer governed entirely by the conventions established between the various actors involved at a certain moment in time, defining the actual digital landscape and being the layer where the facts relevant from the legal point of view occur. This distinct nature renders unusable many of the assumptions that legal professionals and courts of law are accustomed to operating with. The article aims to identify the most important features of cyberspace that have immediate legal consequences, with the purpose of establishing new and safe assumptions from the legal professional's perspective when cross-examining facts that occurred in cyberspace.

  3. Comparisons between a new point kernel-based scheme and the infinite plane source assumption method for radiation calculation of deposited airborne radionuclides from nuclear power plants.

    Science.gov (United States)

    Zhang, Xiaole; Efthimiou, George; Wang, Yan; Huang, Meng

    2018-04-01

    Radiation from the deposited radionuclides is indispensable information for environmental impact assessment of nuclear power plants and emergency management during nuclear accidents. Ground shine estimation is related to multiple physical processes, including atmospheric dispersion, deposition, and soil and air radiation shielding. It remains unclear whether the commonly adopted "infinite plane" source assumption for ground shine calculation is accurate enough, especially for areas with a highly heterogeneous deposition distribution near the release point. In this study, a new ground shine calculation scheme, which accounts for both the spatial deposition distribution and the properties of the air and soil layers, is developed based on the point kernel method. Two sets of "detector-centered" grids are proposed and optimized for both the deposition and radiation calculations to better simulate the results measured by the detectors, which will be beneficial for applications such as source term estimation. The evaluation against the available data of Monte Carlo methods in the literature indicates that the errors of the new scheme are within 5% for the key radionuclides in nuclear accidents. The comparisons between the new scheme and the "infinite plane" assumption indicate that the assumption is tenable (relative errors within 20%) for areas located at least 1 km away from the release source. Within the 1 km range, the assumption mainly causes errors for wet deposition, and the errors are independent of rain intensity. The results suggest that the new scheme should be adopted if the detectors are within 1 km from the source under the stable atmosphere (classes E and F), or the detectors are within 500 m under slightly unstable (class C) or neutral (class D) atmosphere. Otherwise, the infinite plane assumption is reasonable, since the relative errors induced by this assumption are within 20%. The results here are only based on theoretical investigations. They should
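
The point-kernel scheme can be caricatured as a sum of attenuated point-source contributions over deposition cells; the sketch below uses assumed constants and omits soil shielding and build-up factors, so it only illustrates the geometry:

```python
import math

# Illustrative constants (assumed, not the paper's fitted values).
MU_AIR = 0.009     # effective attenuation coefficient of air, 1/m
K = 1.0            # flux-to-dose conversion factor, arbitrary units

def ground_shine(detector, cells):
    """Sum point-kernel contributions from deposited ground activity.

    detector: (x, y, z) in metres, z = detector height above ground.
    cells: iterable of (x, y, activity) ground cells, activity in Bq.
    """
    dose = 0.0
    for cx, cy, activity in cells:
        r = math.sqrt((detector[0] - cx) ** 2
                      + (detector[1] - cy) ** 2
                      + detector[2] ** 2)
        # Each cell is treated as a point source attenuated along the air path.
        dose += K * activity * math.exp(-MU_AIR * r) / (4.0 * math.pi * r ** 2)
    return dose

deposit = [(10.0, 0.0, 1.0e6)]           # 1 MBq hot spot 10 m from the origin
near = ground_shine((0.0, 0.0, 1.0), deposit)
far = ground_shine((100.0, 0.0, 1.0), deposit)
print(near > far)                        # True: ground shine falls off with distance
```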

  4. Assumption-free estimation of heritability from genome-wide identity-by-descent sharing between full siblings.

    Directory of Open Access Journals (Sweden)

    2006-03-01

    Full Text Available The study of continuously varying, quantitative traits is important in evolutionary biology, agriculture, and medicine. Variation in such traits is attributable to many, possibly interacting, genes whose expression may be sensitive to the environment, which makes their dissection into underlying causative factors difficult. An important population parameter for quantitative traits is heritability, the proportion of total variance that is due to genetic factors. Response to artificial and natural selection and the degree of resemblance between relatives are all a function of this parameter. Following the classic paper by R. A. Fisher in 1918, the estimation of additive and dominance genetic variance and heritability in populations is based upon the expected proportion of genes shared between different types of relatives, and explicit, often controversial and untestable models of genetic and non-genetic causes of family resemblance. With genome-wide coverage of genetic markers it is now possible to estimate such parameters solely within families using the actual degree of identity-by-descent sharing between relatives. Using genome scans on 4,401 quasi-independent sib pairs of which 3,375 pairs had phenotypes, we estimated the heritability of height from empirical genome-wide identity-by-descent sharing, which varied from 0.374 to 0.617 (mean 0.498, standard deviation 0.036). The variance in identity-by-descent sharing per chromosome and per genome was consistent with theory. The maximum likelihood estimate of the heritability for height was 0.80 with no evidence for non-genetic causes of sib resemblance, consistent with results from independent twin and family studies but using an entirely separate source of information. Our application shows that it is feasible to estimate genetic variance solely from within-family segregation and provides an independent validation of previously untestable assumptions. 
Given sufficient data, our new paradigm will
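
The within-family idea can be sketched with a Haseman-Elston-style regression, a simpler relative of the maximum-likelihood estimator described in the record; all parameter values below are illustrative assumptions:

```python
import random

random.seed(1)

# Regress squared sib-pair phenotype differences on genome-wide IBD sharing.
h2 = 0.8                # heritability to recover (phenotypic variance = 1)
n_pairs = 200_000       # many pairs needed: sib IBD sharing varies little
data = []
for _ in range(n_pairs):
    pi = min(max(random.gauss(0.5, 0.036), 0.0), 1.0)   # realised IBD fraction
    g = random.gauss(0.0, (h2 * pi) ** 0.5)             # shared genetic value
    y1 = g + random.gauss(0.0, (1.0 - h2 * pi) ** 0.5)
    y2 = g + random.gauss(0.0, (1.0 - h2 * pi) ** 0.5)
    data.append((pi, (y1 - y2) ** 2))

# E[(y1 - y2)^2] = 2 - 2 * h2 * pi, so the regression slope estimates -2 * h2.
mx = sum(p for p, _ in data) / n_pairs
my = sum(d for _, d in data) / n_pairs
slope = (sum((p - mx) * (d - my) for p, d in data)
         / sum((p - mx) ** 2 for p, _ in data))
est = -slope / 2.0
print(round(est, 2))    # close to the simulated h2 of 0.8
```

The tiny spread of IBD sharing between full sibs (sd about 0.036) is exactly why such large samples are needed, as the record emphasises.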

  5. Cultural Nuances, Assumptions, and the Butterfly Effect: Addressing the Unpredictability Caused by Unconscious Values Structures in Cross-Cultural Interactions

    Science.gov (United States)

    Remer, Rory

    2007-01-01

    Cultural values, cross-cultural interaction patterns that are produced by dynamical (chaotic) systems, have a significant impact on interaction, particularly among and between people from different cultures. The butterfly effect, which states that small differences in initial conditions may have severe consequences for patterns in the long run,…

  6. Influence of methylphenidate treatment assumptions on cognitive function in healthy young adults in a double-blind, placebo-controlled trial

    Directory of Open Access Journals (Sweden)

    Mommaerts JL

    2013-08-01

    Full Text Available Jean-Luc Mommaerts,1 Gerlinde Beerens,1 Lieve Van den Block,1 Eric Soetens,2 Sandrina Schol,1 Erwin Van De Vijver,1 Dirk Devroey1 1Department of Family Medicine, 2Department of Cognitive and Biological Psychology, Vrije Universiteit Brussel, Belgium Background: Increasing numbers of students use stimulants such as methylphenidate (MPH to improve their study capacity, making them prone to subsequent prolonged drug abuse. This study explored the cognitive effects of MPH in students who either assumed they received MPH or assumed they received a placebo. Methods: In a double-blind, randomized, placebo-controlled trial with a between-subjects design, 21 students were subjected to partial sleep deprivation, receiving no more than 4 hours sleep the night before they were tested. In the morning, they were given either a placebo or 20 mg of MPH. They then performed free recall verbal tests and Go/No-Go tasks repeatedly, their moods were evaluated using Profile of Mood States and their tiredness was assessed using a visual analog scale, with evaluation of vigilance. Results: No significant differences were found between those subjects who received MPH and those who received a placebo. However, significant differences were found between subjects who assumed they had received MPH or had no opinion, and those who assumed they had received a placebo. At three minutes, one hour, and one day after memorizing ten lists of 20 words, those who assumed they had received MPH recalled 54%, 58%, and 54% of the words, respectively, whereas those who assumed they had received placebo only recalled 35%, 37%, and 34%. Conclusion: Healthy, partially sleep-deprived young students who assume they have received 20 mg of MPH experience a substantial placebo effect that improves consolidation of information into long-term memory. This is independent of any pharmacologic effects of MPH, which had no significant effects on verbal memory in this study. This information may be

  7. The bright lights of city regions - Assumptions, realities and implications of changing population dynamics: Zooming in on the Gauteng city region

    CSIR Research Space (South Africa)

    Pieterse, A

    2014-09-01

    Full Text Available to assumptions of migration and urbanisation. Firstly, even though poverty has been perceived as largely a rural issue, the urbanisation of poverty is in fact occurring at a large scale, and city regions, particularly the Gauteng city region, are dealing...

  8. Dangerous Assumptions and Unspoken Limitations: A Disability Studies in Education Response to Morgan, Farkas, Hillemeier, Mattison, Maczuga, Li, and Cook (2015)

    Science.gov (United States)

    Collins, Kathleen M.; Connor, David; Ferri, Beth; Gallagher, Deborah; Samson, Jennifer F.

    2016-01-01

    In this article, we critically review the work of Morgan et al. (2015) and offer Disability Studies in Education (DSE) as an alternative conceptualization to traditional research within special education. We first unpack many of Morgan et al.'s (2015) assumptions, which are grounded in deficit discourses about children, family structures, economic…

  9. Extended Paper: Reconceptualising Foundational Assumptions of Resilience: A Cross-Cultural, Spatial Systems Domain of Relevance for Agency and Phenomenology in Resilience

    Science.gov (United States)

    Downes, Paul

    2017-01-01

    This article seeks to amplify Bronfenbrenner's (1979) concerns with concentric structured, nested systems and phenomenology, for Ungar's (2012) extension of resilience to systems based on Bronfenbrenner's (1979, 1995) socio-ecological paradigm. Resilience rests on interconnected assumptions regarding space, agency and system blockage, as well as…

  10. A Statement of Assumptions and Principles Concerning Education about Death, Dying, and Bereavement for Professionals in Health Care and Human Services.

    Science.gov (United States)

    Omega: Journal of Death and Dying, 1991

    1991-01-01

    Presents assumptions and principles for education about death, dying, and bereavement developed by the Education Work Group of the International Work Group on Death, Dying, and Bereavement. Intended for professionals in health care and human services concerned with education in this area. (PVV)

  11. "Is It Okay to Eat a Dog in Korea...like China?" Assumptions of National Food-Eating Practices in Intercultural Interaction

    Science.gov (United States)

    Brandt, Adam; Jenks, Christopher

    2011-01-01

    There is a small body of research which shows how intercultural communication is constituted in and through talk-in-interaction, and can be made relevant or irrelevant by interactants on a moment-by-moment basis. Our paper builds on this literature by investigating how cultural assumptions of national food-eating practices are deployed, contested…

  12. Assessing the impact on optimal production capacities in a closed-loop logistics system of the assumption that returns are stochastically independent of sales

    Directory of Open Access Journals (Sweden)

    Ernest Benedito

    2011-10-01

    Full Text Available Purpose: This paper is concerned with a reverse logistics system where returns are stochastically dependent on sales. The aim of the paper is to assess the influence on optimal production capacities when it is assumed that returns are stochastically independent of sales. Design/methodology/approach: This paper presents a model of the system. An approximated model, in which it is assumed that returns are stochastically independent of sales, is formulated to obtain the optimal capacities. The optimal costs of the original and the approximated models are compared in order to assess the influence of the assumption made on returns. Findings: The assumption that returns are stochastically independent of sales is significant in few cases. Research limitations/implications: The impact of the assumption on returns is assessed indirectly, by comparing the optimal costs of the original and approximated models. Practical implications: The problem of calculating the optimal capacities in the original model is hard to solve; however, in the approximated model the problem is tractable. When the impact of the assumption that returns are stochastically independent of sales is not significant, the approximated model can be used to calculate the optimal capacities of the original model. Originality/value: Prior to this paper, few papers have addressed the problem of calculating the optimal capacities of reverse logistics systems. The models found in these papers assumed that returns are stochastically independent of sales.

  13. Seeing a Colleague Encourage a Student to Make an Assumption while Proving: What Teachers Put in Play when Casting an Episode of Instruction

    Science.gov (United States)

    Nachlieli, Talli; Herbst, Patricio

    2009-01-01

    This article reports on an investigation of how teachers of geometry perceived an episode of instruction presented to them as a case of engaging students in proving. Confirming what was hypothesized, participants found it remarkable that a teacher would allow a student to make an assumption while proving. But they perceived this episode in various…

  14. Downscaling of Extreme Precipitation: Proposing a New Statistical Approach and Investigating a Taken-for-Granted Assumption

    Science.gov (United States)

    Elshorbagy, Amin; Alam, Shahabul

    2015-04-01

    A new statistical approach for downscaling annual maximum precipitations (AMPs) is presented. For constructing IDF curves, only AMPs of different durations are needed. Strong correlation between the AMPs at the coarse-grid scale as output from GCMs and AMPs at the local, finer scale is observed in many locations worldwide, even though such a correlation may not exist between the corresponding time series of continuous precipitation records. The use of the GP technique, in particular its genetic symbolic regression variant, for downscaling the annual maximum precipitation is further expanded in two ways. First, the exploration and feature-extraction capabilities of GP are utilized to develop both GCM-variant and GCM-invariant downscaling models/mathematical expressions. Second, the developed models, as well as clustering methods and statistical tests, are used to investigate a fundamental assumption of all statistical downscaling methods: that the downscaling relationship developed for a historical time period (e.g., 1960-1990) remains valid for future periods (e.g., up to year 2100). The proposed approach is applied to the case of constructing IDF curves for the City of Saskatoon, Canada. This study reveals that developing a downscaling relationship that is generic and GCM-invariant might lead to more reliable downscaling of future projections, even though the higher reliability comes at the cost of accuracy.
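
The premise of such a downscaling relationship can be illustrated with the simplest possible special case, an ordinary linear fit between coarse-grid and local AMPs (the values below are fabricated; the study itself uses GP symbolic regression, of which a linear fit is a degenerate instance):

```python
# Fabricated AMP pairs (mm/day): coarse GCM-grid values vs. co-located
# station values, one pair per year, for illustration only.
coarse = [22.0, 35.0, 28.0, 41.0, 30.0, 25.0, 38.0, 33.0]
local = [30.1, 48.0, 37.9, 55.8, 41.2, 33.9, 52.1, 45.0]

# Least-squares fit local = a + b * coarse.
n = len(coarse)
mx = sum(coarse) / n
my = sum(local) / n
b = (sum((x - mx) * (y - my) for x, y in zip(coarse, local))
     / sum((x - mx) ** 2 for x in coarse))
a = my - b * mx

pred = a + b * 45.0    # downscale a projected coarse-grid AMP of 45 mm/day
print(round(pred, 1))
```

Whether such a relation fitted on 1960-1990 data still holds out to 2100 is precisely the taken-for-granted assumption the study interrogates.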

  15. Fundamental study of the mechanism and kinetics of cellulose hydrolysis by acids and enzymes. Final report, June 1, 1978-January 31, 1981

    Energy Technology Data Exchange (ETDEWEB)

    Gong, C.S.; Chang, M.

    1981-02-01

    There are three basic enzymes (endoglucanase (Cx), exoglucanase (C1), and cellobiase) comprising the majority of the extracellular cellulase enzymes produced by the cellulolytic mycelial fungus Trichoderma reesei and other cellulolytic microorganisms. The enzymes exhibit different modes of action with respect to the hydrolysis of cellulose and cellulose-derived oligosaccharides. In combination, these enzymes complement each other to hydrolyze cellulose to its basic constituent, glucose. The kinetics of cellobiase were developed by applying the pseudo-steady state assumption to the hydrolysis of cellobiose to glucose. The results indicated that cellobiase is subject to end-product inhibition by glucose. The kinetic modeling of exoglucanase (C1) with respect to cellodextrins was studied. Both glucose and cellobiose were found to be inhibitors of this enzyme, with cellobiose being a stronger inhibitor than glucose. Similarly, endoglucanase (Cx) is subject to end-product inhibition by glucose. The crystallinity of cellulose affects the rate of hydrolysis by cellulases; hence, the changes in crystallinity of cellulose in relation to chemical pretreatment and enzyme hydrolysis were compared. The study of cellulase biosynthesis led to the conclusion that exo- and endo-glucanases are co-induced, while cellobiase is synthesized independently of the other two enzymes. The multiplicity of cellulase enzymes is the end result of post-translational modification during and/or after the secretion of the enzymes into the growth environment.
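
The pseudo-steady state treatment of cellobiase with glucose end-product inhibition corresponds to a Briggs-Haldane rate law with a competitive inhibition term; a sketch with assumed constants (not the report's fitted values):

```python
# Briggs-Haldane (pseudo-steady-state) rate law for cellobiase with
# competitive end-product inhibition by glucose:
#     v = Vmax * S / (Km * (1 + P / Ki) + S)
# All kinetic constants below are illustrative assumptions.
def cellobiase_rate(cellobiose, glucose, vmax=1.0, km=2.0, ki=0.5):
    """Rate of cellobiose -> 2 glucose (concentrations in mM, rate in mM/min)."""
    return vmax * cellobiose / (km * (1.0 + glucose / ki) + cellobiose)

uninhibited = cellobiase_rate(5.0, 0.0)
inhibited = cellobiase_rate(5.0, 2.0)
print(uninhibited > inhibited)   # True: accumulating glucose slows hydrolysis
```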

  16. How Realistic Are the Scientific Assumptions of the Neuroenhancement Debate? Assessing the Pharmacological Optimism and Neuroenhancement Prevalence Hypotheses

    Science.gov (United States)

    Schleim, Stephan; Quednow, Boris B.

    2018-01-01

    For two decades, neuroenhancement has been a major topic in neuroethics, and it still receives much attention in the scholarly literature as well as in the public media. In contrast to the high hopes at the beginning of the “Decade of the Brain” in the United States and Europe, which we subsume under the “pharmacological optimism hypothesis,” recent evidence from clinical neuroscience suggests that developing drugs that make healthy people smarter is even more difficult than finding new treatments for patients with mental disorders. Cognitive-enhancing drugs have not yet been successfully developed even for patients with impaired intellectual performance, and new drugs that might have a disruptive impact on this field are unlikely to appear in the near future. Additionally, we discuss theoretical, empirical, and historical evidence to assess whether cognitive enhancement of the healthy is common or even epidemic and whether its application will further increase in the near future, as suggested by the “neuroenhancement prevalence hypothesis.” Reports, surveys, and reviews from the 1930s until today indicate that psychopharmacological neuroenhancement is a fact but is less common than often stated, particularly in the public media. Non-medical use of psychostimulants for the purpose of cognitive enhancement has existed for at least 80 years, and it may actually have been more common in the past than it is today. Therefore, we conclude that the pharmacological optimism and neuroenhancement prevalence hypotheses have to be rejected, and we argue that the neuroenhancement debate should take the available evidence more into account. PMID:29403383

  17. How Realistic Are the Scientific Assumptions of the Neuroenhancement Debate? Assessing the Pharmacological Optimism and Neuroenhancement Prevalence Hypotheses

    Directory of Open Access Journals (Sweden)

    Stephan Schleim

    2018-01-01

    Full Text Available For two decades, neuroenhancement has been a major topic in neuroethics, and it still receives much attention in the scholarly literature as well as in the public media. In contrast to the high hopes at the beginning of the “Decade of the Brain” in the United States and Europe, which we subsume under the “pharmacological optimism hypothesis,” recent evidence from clinical neuroscience suggests that developing drugs that make healthy people smarter is even more difficult than finding new treatments for patients with mental disorders. Cognitive-enhancing drugs have not yet been successfully developed even for patients with impaired intellectual performance, and new drugs that might have a disruptive impact on this field are unlikely to appear in the near future. Additionally, we discuss theoretical, empirical, and historical evidence to assess whether cognitive enhancement of the healthy is common or even epidemic and whether its application will further increase in the near future, as suggested by the “neuroenhancement prevalence hypothesis.” Reports, surveys, and reviews from the 1930s until today indicate that psychopharmacological neuroenhancement is a fact but is less common than often stated, particularly in the public media. Non-medical use of psychostimulants for the purpose of cognitive enhancement has existed for at least 80 years, and it may actually have been more common in the past than it is today. Therefore, we conclude that the pharmacological optimism and neuroenhancement prevalence hypotheses have to be rejected, and we argue that the neuroenhancement debate should take the available evidence more into account.

  18. The unity assumption facilitates cross-modal binding of musical, non-speech stimuli: The role of spectral and amplitude envelope cues.

    Science.gov (United States)

    Chuen, Lorraine; Schutz, Michael

    2016-07-01

    An observer's inference that multimodal signals originate from a common underlying source facilitates cross-modal binding. This 'unity assumption' causes asynchronous auditory and visual speech streams to seem simultaneous (Vatakis & Spence, Perception & Psychophysics, 69(5), 744-756, 2007). Subsequent tests with non-speech stimuli such as musical and impact events found no evidence for the unity assumption, suggesting the effect is speech-specific (Vatakis & Spence, Acta Psychologica, 127(1), 12-23, 2008). However, the role of amplitude envelope (the change in a sound's energy over time) was not previously appreciated within this paradigm. Here, we explore whether previous findings suggesting speech-specificity of the unity assumption were confounded by similarities in the amplitude envelopes of the contrasted auditory stimuli. Experiment 1 used natural events with clearly differentiated envelopes: single notes played on either a cello (bowing motion) or a marimba (striking motion). Participants performed an unspeeded temporal order judgment task, viewing audiovisually matched (e.g., marimba audio with marimba video) and mismatched (e.g., cello audio with marimba video) versions of the stimuli at various stimulus onset asynchronies and indicating which modality was presented first. As predicted, participants were less sensitive to temporal order in matched conditions, demonstrating that the unity assumption can facilitate the perception of synchrony outside of speech stimuli. Results from Experiments 2 and 3 revealed that when spectral information was removed from the original auditory stimuli, amplitude envelope alone could not facilitate the influence of audiovisual unity. We propose that both amplitude envelope and spectral acoustic cues affect the percept of audiovisual unity, working in concert to help an observer determine when to integrate across modalities.
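
    Temporal order judgment data of the kind described above are conventionally summarized by fitting a psychometric function to the proportion of "visual first" responses across stimulus onset asynchronies; the fitted midpoint gives the point of subjective simultaneity (PSS) and the slope yields a just-noticeable difference (JND) indexing sensitivity. The following is a minimal, hypothetical sketch of such a fit using a grid search (not the authors' actual analysis; the search ranges and function names are assumptions):

    ```python
    import math

    def cum_gauss(x, mu, sigma):
        """Cumulative Gaussian psychometric function."""
        return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

    def fit_toj(soas, p_visual_first):
        """Least-squares grid search for the PSS (mu) and slope (sigma).

        Returns (pss, jnd); the JND is taken as the distance from the 50%
        to the 75% point of a cumulative Gaussian, i.e. 0.6745 * sigma.
        """
        best = (float("inf"), 0.0, 1.0)
        for mu_tenths in range(-1000, 1001, 5):      # PSS from -100 to +100 ms
            mu = mu_tenths / 10.0
            for sigma in range(5, 301, 5):           # slope from 5 to 300 ms
                err = sum((cum_gauss(x, mu, sigma) - p) ** 2
                          for x, p in zip(soas, p_visual_first))
                if err < best[0]:
                    best = (err, mu, float(sigma))
        _, pss, sigma = best
        return pss, 0.6745 * sigma

    # Hypothetical usage: proportions of "visual first" responses per SOA (ms),
    # generated here from a synthetic observer with PSS = 20 ms, sigma = 60 ms.
    soas = [-200, -120, -60, -20, 20, 60, 120, 200]
    props = [cum_gauss(x, 20.0, 60.0) for x in soas]
    pss, jnd = fit_toj(soas, props)
    ```

    Lower sensitivity to temporal order (as reported for the matched conditions) would appear as a larger JND, i.e. a shallower psychometric slope.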

  19. Review of technical justification of assumptions and methods used by the Environmental Protection Agency for estimating risks avoided by implementing MCLs for radionuclides

    Energy Technology Data Exchange (ETDEWEB)

    Morris, S.C.; Rowe, M.D.; Holtzman, S.; Meinhold, A.F.

    1992-11-01

    The Environmental Protection Agency (EPA) has proposed regulations for allowable levels of radioactive material in drinking water (40 CFR Part 141, 56 FR 33050, July 18, 1991). This review examined the assumptions and methods used by EPA in calculating risks that would be avoided by implementing the proposed Maximum Contaminant Levels for uranium, radium, and radon. Proposed limits on gross alpha and beta-gamma emitters were not included in this review.

  20. Examining assumptions regarding valid electronic monitoring of medication therapy: development of a validation framework and its application on a European sample of kidney transplant patients

    Directory of Open Access Journals (Sweden)

    Steiger Jürg

    2008-02-01

    Full Text Available Abstract Background Electronic monitoring (EM) is used increasingly to measure medication non-adherence. Unbiased EM assessment requires that certain assumptions be fulfilled. The purpose of this study was to determine the assumptions needed for internal and external validity of EM measurement. To test internal validity, we examined whether (1) the EM equipment functioned correctly, (2) all EM bottle openings corresponded to actual drug intake, and (3) EM did not influence a patient's normal adherence behavior. To assess external validity, we examined whether there were indications that using EM affected the sample's representativeness. Methods We used data from the Supporting Medication Adherence in Renal Transplantation (SMART) study, which included 250 adult renal transplant patients whose adherence to immunosuppressive drugs was measured for 3 months with the Medication Event Monitoring System (MEMS). Internal validity was determined by assessing the prevalence of nonfunctioning EM systems and of patient-reported discrepancies between cap openings and actual intakes (using contemporaneous notes and an interview at the end of the study), and by exploring whether adherence was initially uncharacteristically high and decreased over time (an indication of a possible EM intervention effect). Sample representativeness was examined by screening for differences in non-adherence between participants and non-participants or dropouts. Results Our analysis revealed that some assumptions were not fulfilled: (1) one cap malfunctioned (0.4%), (2) self-reported mismatches between bottle openings and actual drug intake occurred in 62% of the patients (n = 155), and (3) adherence decreased over the first 5 weeks of monitoring, indicating that EM had a waning intervention effect. Conclusion The validity assumptions presented in this article should be checked in future studies using EM as a measure of medication non-adherence.
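
    As context for how EM records like the MEMS cap openings above are commonly scored: one standard metric is taking adherence, the percentage of monitored days on which the prescribed number of bottle openings was registered. A hypothetical sketch follows (the twice-daily regimen, function names, and scoring rule are illustrative assumptions, not the SMART study's actual algorithm):

    ```python
    from datetime import datetime

    def taking_adherence(openings, start, n_days, doses_per_day=2):
        """Percentage of monitored days with exactly the prescribed
        number of EM cap openings (a simple taking-adherence score)."""
        counts = {}
        for t in openings:
            day = (t.date() - start.date()).days
            if 0 <= day < n_days:
                counts[day] = counts.get(day, 0) + 1
        correct = sum(1 for d in range(n_days)
                      if counts.get(d, 0) == doses_per_day)
        return 100.0 * correct / n_days

    # Hypothetical usage: three monitored days on a twice-daily regimen.
    start = datetime(2008, 2, 1, 0, 0)
    openings = [
        datetime(2008, 2, 1, 8, 0), datetime(2008, 2, 1, 20, 0),   # day 0: both doses
        datetime(2008, 2, 2, 8, 30),                               # day 1: one missed
        datetime(2008, 2, 3, 8, 5), datetime(2008, 2, 3, 19, 45),  # day 2: both doses
    ]
    score = taking_adherence(openings, start, n_days=3)
    ```

    Note that such a score inherits exactly the validity assumptions the study examines: it treats every opening as an actual intake and assumes the monitoring itself did not alter behavior.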