WorldWideScience

Sample records for basic assumptions underlying

  1. Some Considerations on the Basic Assumptions in Rotordynamics

    Science.gov (United States)

    GENTA, G.; DELPRETE, C.; BRUSA, E.

    1999-10-01

The dynamic study of rotors is usually performed under a number of assumptions, namely small displacements and rotations, small unbalance and constant angular velocity; the last assumption can be replaced by a known time history of the spin speed. The present paper develops a general non-linear model which can be used to study the rotordynamic behaviour of both fixed and free rotors without resorting to the mentioned assumptions, and compares the results obtained from a number of non-linear numerical simulations with those computed through the usual linearized approach. It is thus possible to verify that the validity of rotordynamic models extends to situations in which fairly large unbalances and whirling motions are present and, above all, it is shown that the doubts raised about applying a constant-spin-speed model to the case of free rotors, in which the angular momentum is constant, are unfounded. Rotordynamic models can thus be used to study the stability in the small of spinning spacecraft, and the insight obtained from the study of rotors is useful for understanding their attitude dynamics and its interactions with the vibration dynamics.

  2. Primary prevention in public health: an analysis of basic assumptions.

    Science.gov (United States)

    Ratcliffe, J; Wallack, L

    1985-01-01

    The common definition of primary prevention is straightforward; but how it is transformed into a framework to guide action is based on personal and societal feelings and beliefs about the basis for social organization. This article focuses on the two contending primary prevention strategies of health promotion and health protection. The contention between the two strategies stems from a basic disagreement about disease causality in modern society. Health promotion is based on the "lifestyle" theory of disease causality, which sees individual health status linked ultimately to personal decisions about diet, stress, and drug habits. Primary prevention, from this perspective, entails persuading individuals to forgo their risk-taking, self-destructive behavior. Health protection, on the other hand, is based on the "social-structural" theory of disease causality. This theory sees the health status of populations linked ultimately to the unequal distribution of social resources, industrial pollution, occupational stress, and "anti-health promotion" marketing practices. Primary prevention, from this perspective, requires changing existing social and, particularly, economic policies and structures. In order to provide a basis for choosing between these contending strategies, the demonstrated (i.e., past) impact of each strategy on the health of the public is examined. Two conclusions are drawn. First, the health promotion strategy shows little potential for improving the public health, because it systematically ignores the risk-imposing, other-destructive behavior of influential actors (policy-makers and institutions) in society. And second, effective primary prevention efforts entail an "upstream" approach that results in far-reaching sociopolitical and economic change.

  3. Learning disabilities theory and Soviet psychology: a comparison of basic assumptions.

    Science.gov (United States)

    Coles, G S

    1982-09-01

    Critics both within and outside the Learning Disabilities (LD) field have pointed to the weaknesses of LD theory. Beginning with the premise that a significant problem of LD theory has been its failure to explore fully its fundamental assumptions, this paper examines a number of these assumptions about individual and social development, cognition, and learning. These assumptions are compared with a contrasting body of premises found in Soviet psychology, particularly in the works of Vygotsky, Leontiev, and Luria. An examination of the basic assumptions of LD theory and Soviet psychology shows that a major difference lies in their respective nondialectical and dialectical interpretation of the relationship of social factors and cognition, learning, and neurological development.

  4. Understanding the multiple realities of everyday life: basic assumptions in focus-group methodology.

    Science.gov (United States)

    Ivanoff, Synneve Dahlin; Hultberg, John

    2006-06-01

    In recent years, there has been a notable growth in the use of focus groups within occupational therapy. It is important to understand what kind of knowledge focus-group methodology is meant to acquire. The purpose of this article is to create an understanding of the basic assumptions within focus-group methodology from a theory of science perspective in order to elucidate and encourage reflection on the paradigm. This will be done based on a study of contemporary literature. To further the knowledge of basic assumptions the article will focus on the following themes: the focus-group research arena, the foundation and its core components; subjects, the role of the researcher and the participants; activities, the specific tasks and procedures. Focus-group methodology can be regarded as a specific research method within qualitative methodology with its own form of methodological criteria, as well as its own research procedures. Participants construct a framework to make sense of their experiences, and in interaction with others these experiences will be modified, leading to the construction of new knowledge. The role of the group leader is to facilitate a fruitful environment for the meaning to emerge and to ensure that the understanding of the meaning emerges independently of the interpreter. Focus-group methodology thus shares, in the authors' view, some basic assumptions with social constructivism.

  5. Dynamic Group Diffie-Hellman Key Exchange under standard assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Bresson, Emmanuel; Chevassut, Olivier; Pointcheval, David

    2002-02-14

    Authenticated Diffie-Hellman key exchange allows two principals communicating over a public network, and each holding public-private keys, to agree on a shared secret value. In this paper we study the natural extension of this cryptographic problem to a group of principals. We begin from existing formal security models and refine them to incorporate major missing details (e.g., strong-corruption and concurrent sessions). Within this model we define the execution of a protocol for authenticated dynamic group Diffie-Hellman and show that it is provably secure under the decisional Diffie-Hellman assumption. Our security result holds in the standard model and thus provides better security guarantees than previously published results in the random oracle model.
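
The two-party primitive that the paper extends to dynamic authenticated groups can be sketched in a few lines. The sketch below is a plain, unauthenticated Diffie-Hellman exchange with illustrative toy parameters of our own choosing (a Mersenne prime), not a secure instantiation and not the paper's protocol.

```python
# Toy unauthenticated two-party Diffie-Hellman sketch.
# Parameters are illustrative only: far too weak for real use.
import secrets

P = 2**127 - 1   # a Mersenne prime; fine for a sketch, not for security
G = 3

def keypair():
    x = secrets.randbelow(P - 2) + 1   # private exponent in [1, P-2]
    return x, pow(G, x, P)             # (private, public = G^x mod P)

a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# Each side raises the other's public value to its own private exponent,
# so both arrive at G^(ab) mod P.
shared_a = pow(b_pub, a_priv, P)
shared_b = pow(a_pub, b_priv, P)
assert shared_a == shared_b
```

The group setting studied in the paper additionally handles member joins and leaves and authenticates the exchange; its security rests on the decisional Diffie-Hellman assumption rather than on this bare computation.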

  6. Co-Dependency: An Examination of Underlying Assumptions.

    Science.gov (United States)

    Myer, Rick A.; And Others

    1991-01-01

    Discusses need for careful examination of codependency as diagnostic category. Critically examines assumptions that codependency is disease, addiction, or predetermined by the environment. Discusses implications of assumptions. Offers recommendations for mental health counselors focusing on need for systematic research, redirection of efforts to…

  7. Washing machines, driers and dishwashers. Background reports. Vol. 1: Basic assumptions and impact analyses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-06-01

Before wet appliances can be analyzed to establish a common European Union (EU) basis for defining their efficiency, a framework has to be set up. The first part of this Background Report deals with such a framework and with definitions, basic assumptions and test methods. The next sections give a short introduction to the framework of wet appliances and to definitions taken from international standards. Chapter 2 elaborates on basic assumptions regarding appliance categories, capacity, energy efficiency and performance. Chapter 3 contains a survey of test methods from international standards, and chapter 4 shows the present state of standardization in the International Electrotechnical Commission (IEC) and the Comite Europeen de Normalisation Electrotechnique (CENELEC). The next two chapters of the report deal with the user of wet appliances: the consumer. Aspects of daily use, such as ownership level, frequency of use and type of programme used, are analysed in more detail. An important question for this study is whether a 'European consumer' exists; section 5.5 deals with this subject. Two elements of the marketing mix, product and price, are considered. Several possible product options are reviewed and attention is paid to the impact of price on consumers' buying decisions. The findings of this report and its recommendations are summarized. (au)

  8. Testing the habituation assumption underlying models of parasitoid foraging behavior

    Science.gov (United States)

    Abram, Katrina; Colazza, Stefano; Peri, Ezio

    2017-01-01

    Background Habituation, a form of non-associative learning, has several well-defined characteristics that apply to a wide range of physiological and behavioral responses in many organisms. In classic patch time allocation models, habituation is considered to be a major mechanistic component of parasitoid behavioral strategies. However, parasitoid behavioral responses to host cues have not previously been tested for the known, specific characteristics of habituation. Methods In the laboratory, we tested whether the foraging behavior of the egg parasitoid Trissolcus basalis shows specific characteristics of habituation in response to consecutive encounters with patches of host (Nezara viridula) chemical contact cues (footprints), in particular: (i) a training interval-dependent decline in response intensity, and (ii) a training interval-dependent recovery of the response. Results As would be expected of a habituated response, wasps trained at higher frequencies decreased their behavioral response to host footprints more quickly and to a greater degree than those trained at low frequencies, and subsequently showed a more rapid, although partial, recovery of their behavioral response to host footprints. This putative habituation learning could not be blocked by cold anesthesia, ingestion of an ATPase inhibitor, or ingestion of a protein synthesis inhibitor. Discussion Our study provides support for the assumption that diminishing responses of parasitoids to chemical indicators of host presence constitutes habituation as opposed to sensory fatigue, and provides a preliminary basis for exploring the underlying mechanisms. PMID:28321365

  9. Limitations of force-free magnetic field extrapolations: revisiting basic assumptions

    CERN Document Server

    Peter, H; Chitta, L P; Cameron, R H

    2015-01-01

Force-free extrapolations are widely used to study the magnetic field in the solar corona on the basis of surface measurements. The extrapolations assume that the ratio of the internal energy of the plasma to the magnetic energy, the plasma beta, is negligible. Despite the widespread use of this assumption, observations, models, and theoretical considerations show that beta is of the order of a few percent to more than 10%, and is thus not small. We investigate what consequences this has for the reliability of extrapolation results. We use basic concepts, starting with the force and energy balance, to infer relations between plasma beta and free magnetic energy, to study the direction of currents in the corona with respect to the magnetic field, and to estimate the errors in the free magnetic energy made by neglecting effects of the plasma (beta<<1). A comparison with a 3D MHD model supports our basic considerations. If plasma beta is of the order of the relative free energy (the ratio of the free magnetic energy to the total...
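
As a quick order-of-magnitude check of the quantity at issue, plasma beta is the ratio of thermal to magnetic pressure. The coronal numbers below are illustrative assumptions of ours, not values taken from the paper.

```python
# Plasma beta = thermal pressure / magnetic pressure.
# Input values are assumed, active-region-like numbers for illustration.
import math

MU0 = 4e-7 * math.pi     # vacuum permeability, T*m/A
KB = 1.380649e-23        # Boltzmann constant, J/K

def plasma_beta(n_e, T, B):
    """n_e in m^-3, T in K, B in tesla; electron+proton pressure 2*n*k*T."""
    p_thermal = 2.0 * n_e * KB * T
    p_magnetic = B**2 / (2.0 * MU0)
    return p_thermal / p_magnetic

# Assumed values: n_e ~ 1e16 m^-3, T ~ 2 MK, B ~ 50 G = 5e-3 T
beta = plasma_beta(1e16, 2e6, 5e-3)   # a few percent, consistent with the abstract
```

Even with these rough numbers, beta comes out at the percent level rather than being negligible, which is the point the paper makes against the beta<<1 idealization.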

  10. Coefficient Alpha as an Estimate of Test Reliability under Violation of Two Assumptions.

    Science.gov (United States)

    Zimmerman, Donald W.; And Others

    1993-01-01

    Coefficient alpha was examined through computer simulation as an estimate of test reliability under violation of two assumptions. Coefficient alpha underestimated reliability under violation of the assumption of essential tau-equivalence of subtest scores and overestimated it under violation of the assumption of uncorrelated subtest error scores.…
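
Coefficient alpha can be computed from simulated subtest scores with a few lines of standard-library code. The simulation design below (four parallel items with unit loadings, so the items are essentially tau-equivalent) is our illustrative assumption, not the paper's setup; varying the loadings or correlating the errors would probe the two violations the study examined.

```python
# Cronbach's alpha from simulated item scores (stdlib only).
import random, statistics

def cronbach_alpha(items):
    """items: list of k lists, each holding n examinee scores."""
    k = len(items)
    item_vars = sum(statistics.pvariance(it) for it in items)
    totals = [sum(scores) for scores in zip(*items)]
    total_var = statistics.pvariance(totals)
    return k / (k - 1) * (1 - item_vars / total_var)

random.seed(0)
n = 2000
true_scores = [random.gauss(0, 1) for _ in range(n)]
# Equal loadings and independent unit-variance errors: with 4 items the
# reliability of the sum score is 16 / (16 + 4) = 0.8, and alpha should
# recover roughly that value.
items = [[t + random.gauss(0, 1) for t in true_scores] for _ in range(4)]
alpha = cronbach_alpha(items)
```

Under unequal loadings (violated tau-equivalence) the same estimator drops below the true reliability, and with correlated errors it exceeds it, matching the direction of bias reported in the abstract.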

  11. Weak convergence of Jacobian determinants under asymmetric assumptions

    Directory of Open Access Journals (Sweden)

    Teresa Alberico

    2012-05-01

Full Text Available Let $\Om$ be a sufficiently smooth bounded open set in $\R^2$ and let $f_k=(u_k,v_k)$ and $f=(u,v)$ be mappings belonging to the Sobolev space $W^{1,2}(\Om,\R^2)$. We prove that if the sequence of Jacobians $J_{f_k}$ converges to a measure $\mu$ in the sense of measures, and if one allows different assumptions on the two components of $f_k$ and $f$, e.g. $$u_k \rightharpoonup u \;\;\mbox{weakly in}\;\; W^{1,2}(\Om), \qquad v_k \rightharpoonup v \;\;\mbox{weakly in}\;\; W^{1,q}(\Om)$$ for some $q\in(1,2)$, then \begin{equation}\label{0}d\mu=J_f\,dz.\end{equation} Moreover, we show that this result is optimal in the sense that the conclusion fails for $q=1$. On the other hand, we prove that \eqref{0} remains valid also in the case $q=1$, but it is then necessary to require that $u_k$ converges weakly to $u$ in a Zygmund-Sobolev space with a slightly higher degree of regularity than $W^{1,2}(\Om)$, precisely $$u_k \rightharpoonup u \;\;\mbox{weakly in}\;\; W^{1,L^2 \log^\alpha L}(\Om)$$ for some $\alpha >1$.

  12. Uncertainties in sandy shorelines evolution under the Bruun rule assumption

    Directory of Open Access Journals (Sweden)

    Gonéri eLe Cozannet

    2016-04-01

Full Text Available In the current practice of sandy shoreline change assessments, the local sedimentary budget is evaluated using the sediment balance equation, that is, by summing the contributions of longshore and cross-shore processes. The contribution of future sea-level rise induced by climate change is usually obtained using the Bruun rule, which assumes that the shoreline retreat is equal to the change of sea level divided by the slope of the upper shoreface. However, it remains unclear whether this approach is appropriate to account for the impacts of future sea-level rise, owing to the lack of relevant observations to validate the Bruun rule under the expected sea-level rise rates. To address this issue, this article estimates the coastal settings and periods of time under which the use of the Bruun rule could be (in)validated, in the case of wave-exposed, gently sloping sandy beaches. Using the sedimentary budgets of Stive (2004) and probabilistic sea-level rise scenarios based on IPCC, we provide shoreline change projections that account for all uncertain hydrosedimentary processes affecting idealized coasts (impacts of sea-level rise, storms and other cross-shore and longshore processes). We evaluate the relative importance of each source of uncertainty in the sediment balance equation using a global sensitivity analysis. For scenarios RCP 6.0 and 8.5 and in the absence of coastal defences, the model predicts a perceivable shift toward generalized beach erosion by the middle of the 21st century. In contrast, the model predictions are unlikely to differ from the current situation under scenario RCP 2.6. Finally, the contribution of sea-level rise and climate change scenarios to the uncertainties of sandy shoreline change projections increases with time during the 21st century. Our results have three primary implications for coastal settings similar to those described in Stive (2004): first, the validation of the Bruun rule will not necessarily be
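
The Bruun rule described above reduces to a one-line formula: retreat equals sea-level rise divided by the mean slope of the upper shoreface. The numbers below are illustrative, not values from the article.

```python
# Bruun rule sketch: horizontal shoreline retreat R = S / tan(beta),
# with S the sea-level rise and tan(beta) the upper-shoreface slope.
# Input values are illustrative assumptions.

def bruun_retreat(sea_level_rise_m, upper_shoreface_slope):
    """Horizontal retreat (m) for a rise (m) on a beach of given slope."""
    return sea_level_rise_m / upper_shoreface_slope

# A 0.5 m rise on a gently sloping beach (slope 0.01) gives about 50 m
# of retreat, which is why gently sloping coasts are the sensitive case.
retreat = bruun_retreat(0.5, 0.01)
```

The article's point is that this deterministic ratio is only one term in the full sediment balance, and that its uncertainty must be weighed against storms and longshore transport.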

  13. STRONG APPROXIMATION FOR MOVING AVERAGE PROCESSES UNDER DEPENDENCE ASSUMPTIONS

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

Let {X_t, t ≥ 1} be a moving average process defined by X_t = Σ_{k=0}^∞ a_k ξ_{t-k}, where {a_k, k ≥ 0} is a sequence of real numbers and {ξ_t, -∞ < t < ∞} is a doubly infinite sequence of strictly stationary dependent random variables. Under conditions on {a_k, k ≥ 0} which entail that {X_t, t ≥ 1} is either a long memory process or a linear process, the strong approximation of {X_t, t ≥ 1} by a Gaussian process is studied. Finally, the results are applied to obtain the strong approximation of a long memory process by a fractional Brownian motion and the laws of the iterated logarithm for moving average processes.
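
The process defined above can be simulated by truncating the coefficient sequence. The sketch below uses Gaussian innovations and a slowly decaying weight sequence a_k = (k+1)^(-1.2); both choices are illustrative assumptions of ours, not the paper's conditions.

```python
# Simulate a truncated moving average X_t = sum_k a_k * xi_{t-k} (stdlib only).
import random

def moving_average(n, coeffs, seed=0):
    """Return n samples of X_t for a truncated coefficient list coeffs."""
    rng = random.Random(seed)
    m = len(coeffs)
    # m extra innovations so every X_t has a full window of xi_{t-k} values
    noise = [rng.gauss(0, 1) for _ in range(n + m)]
    return [sum(a * noise[t + m - k] for k, a in enumerate(coeffs))
            for t in range(n)]

coeffs = [(k + 1) ** -1.2 for k in range(200)]  # slowly decaying a_k
x = moving_average(1000, coeffs)
```

With weights decaying this slowly, the partial sums of such a process behave like a long memory process, which is the regime where the paper approximates it by fractional Brownian motion.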

  14. 'Being Explicit about Underlying Values, Assumptions and Views when Designing for Children in the IDC Community’

    DEFF Research Database (Denmark)

    Skovbjerg, Helle Marie

    2016-01-01

In this full-day workshop we want to discuss how the IDC community can make the underlying assumptions, values and views regarding children and childhood that enter into design decisions more explicit. What assumptions do IDC designers and researchers make, and how can they be supported in reflecting on those assumptions and their possible influence on design decisions? How can we make the assumptions explicit, discuss them in the IDC community, and use the discussion to develop higher quality design and research? The workshop will support discussion between researchers, designers and practitioners, and intends to share different approaches for uncovering and reflecting on values, assumptions and views about children and childhood in design.

  15. Randomness Amplification under Minimal Fundamental Assumptions on the Devices

    Science.gov (United States)

    Ramanathan, Ravishankar; Brandão, Fernando G. S. L.; Horodecki, Karol; Horodecki, Michał; Horodecki, Paweł; Wojewódka, Hanna

    2016-12-01

    Recently, the physically realistic protocol amplifying the randomness of Santha-Vazirani sources producing cryptographically secure random bits was proposed; however, for reasons of practical relevance, the crucial question remained open regarding whether this can be accomplished under the minimal conditions necessary for the task. Namely, is it possible to achieve randomness amplification using only two no-signaling components and in a situation where the violation of a Bell inequality only guarantees that some outcomes of the device for specific inputs exhibit randomness? Here, we solve this question and present a device-independent protocol for randomness amplification of Santha-Vazirani sources using a device consisting of two nonsignaling components. We show that the protocol can amplify any such source that is not fully deterministic into a fully random source while tolerating a constant noise rate and prove the composable security of the protocol against general no-signaling adversaries. Our main innovation is the proof that even the partial randomness certified by the two-party Bell test [a single input-output pair (u* , x* ) for which the conditional probability P (x*|u*) is bounded away from 1 for all no-signaling strategies that optimally violate the Bell inequality] can be used for amplification. We introduce the methodology of a partial tomographic procedure on the empirical statistics obtained in the Bell test that ensures that the outputs constitute a linear min-entropy source of randomness. As a technical novelty that may be of independent interest, we prove that the Santha-Vazirani source satisfies an exponential concentration property given by a recently discovered generalized Chernoff bound.

  16. Combustion Effects in Laser-oxygen Cutting: Basic Assumptions, Numerical Simulation and High Speed Visualization

    Science.gov (United States)

    Zaitsev, Alexander V.; Ermolaev, Grigory V.

Laser-oxygen cutting is a technological process that is very difficult to describe theoretically. Iron-oxygen combustion plays a leading role in it, making the process highly effective and able to cut thicker plates while, at the same time, producing characteristic striations and other defects on the cut surface. In this paper, the results of numerical simulations based on elementary assumptions about iron-oxygen combustion are verified against high-speed visualization of the laser-oxygen cutting process. Based on the assumption that iron oxide loses its protective properties after melting, a simulation of striation formation through cycles of laser-induced, non-self-sustained combustion is proposed. The assumption that the limiting factor of the reaction is oxygen transport from the jet to the cutting front allows the reaction intensity to be calculated by solving the Navier-Stokes and diffusion equations in the gas phase. The influence of oxygen purity and pressure is studied theoretically. The results of the numerical simulations are compared with high-speed visualization of laser-oxygen cutting of 4-20 mm mild steel plates at cutting conditions close to industrial ones.

  17. Unrealistic Assumptions in Economics: an Analysis under the Logic of Socioeconomic Processes

    Directory of Open Access Journals (Sweden)

    Leonardo Ivarola

    2014-11-01

Full Text Available The realism of assumptions is an ongoing debate within the philosophy of economics. One of the most referenced papers in this matter belongs to Milton Friedman. He defends the use of unrealistic assumptions, not only on pragmatic grounds, but also because of the intrinsic difficulty of determining the extent of realism. On the other hand, realists have criticized (and still do today) the use of unrealistic assumptions, such as the assumptions of rational choice, perfect information, homogeneous goods, etc. However, they have not accompanied their statements with a proper epistemological argument to support their position. This work aims to show that the realism of (a particular sort of) assumptions is clearly relevant when examining economic models, since the system under study (the real economies) is compatible not with the logic of invariance and of mechanisms, but with the logic of possibility trees. Because of this, models will function not as tools for predicting outcomes, but as representations of alternative scenarios, whose similarity to the real world must be examined in terms of the verisimilitude of a class of model assumptions.

  18. Investigation of assumptions underlying current safety guidelines on EM-induced nerve stimulation

    Science.gov (United States)

    Neufeld, Esra; Vogiatzis Oikonomidis, Ioannis; Iacono, Maria Ida; Angelone, Leonardo M.; Kainz, Wolfgang; Kuster, Niels

    2016-06-01

    An intricate network of a variety of nerves is embedded within the complex anatomy of the human body. Although nerves are shielded from unwanted excitation, they can still be stimulated by external electromagnetic sources that induce strongly non-uniform field distributions. Current exposure safety standards designed to limit unwanted nerve stimulation are based on a series of explicit and implicit assumptions and simplifications. This paper demonstrates the applicability of functionalized anatomical phantoms with integrated coupled electromagnetic and neuronal dynamics solvers for investigating the impact of magnetic resonance exposure on nerve excitation within the full complexity of the human anatomy. The impact of neuronal dynamics models, temperature and local hot-spots, nerve trajectory and potential smoothing, anatomical inhomogeneity, and pulse duration on nerve stimulation was evaluated. As a result, multiple assumptions underlying current safety standards are questioned. It is demonstrated that coupled EM-neuronal dynamics modeling involving realistic anatomies is valuable to establish conservative safety criteria.

  19. Maximizing the Delivery of MPR Broadcasting Under Realistic Physical Layer Assumptions

    Institute of Scientific and Technical Information of China (English)

    Francois Ingelrest; David Simplot-Ryl

    2008-01-01

It is now commonly accepted that the unit disk graph used to model the physical layer in wireless networks does not reflect real radio transmissions, and that a more realistic model should be considered for experimental simulations. Previous work on realistic scenarios has focused on unicast; however, broadcast requirements are fundamentally different and cannot be derived from the unicast case. Therefore, broadcast protocols must be adapted in order to remain efficient under realistic assumptions. In this paper, we study the well-known multipoint relay broadcast protocol (MPR), in which each node has to choose a set of 1-hop neighbors to act as relays in order to cover the whole 2-hop neighborhood. We give experimental results showing that the original strategy used to select these multipoint relays does not suit a realistic model. On the basis of these results, we propose new selection strategies based solely on link quality. One of the key aspects of our solutions is that our strategies do not require any additional hardware and may be implemented at the application layer, which is particularly relevant in the context of ad hoc and sensor networks, where energy savings are mandatory. We finally provide new experimental results that demonstrate the superiority of our strategies under realistic physical assumptions.

  20. Physics Courses X-Rayed - A Comparative Analysis of High School Physics Courses in Terms of Basic Assumptions

    Science.gov (United States)

    Hobbs, E. D.

    1974-01-01

    Reports an attempt to infer from official statements and from course materials some of the assumptions and theoretical positions which underlie four high school physics courses: Nuffield Physics, ECCP's "The Man Made World," Harvard Project Physics, and PSSC Physics. (PEB)

  1. Special relativity as the limit of an Aristotelian universal friction theory under Reye's assumption

    CERN Document Server

    Minguzzi, E

    2014-01-01

This work explores a classical mechanical theory under two further assumptions: (a) there is a universal dry friction force (Aristotelian mechanics), and (b) the variation of the mass of a body due to wear is proportional to the work done by the friction force on the body (Reye's hypothesis). It is shown that mass depends on velocity as in Special Relativity, and that the velocity is constant for a particular characteristic value. In the limit of vanishing friction the theory satisfies a relativity principle, as bodies do not decelerate and the absolute frame therefore becomes unobservable. However, the limit theory is not Newtonian mechanics, with its Galilei group symmetry, but rather Special Relativity. This result suggests regarding Special Relativity as the limit of a theory presenting universal friction and exchange of mass-energy with a reservoir (vacuum). Thus, quite surprisingly, Special Relativity follows from the absolute space (ether) concept and could have been discovered following studies of Ar...

  2. Limitations to the Dutch cannabis toleration policy: Assumptions underlying the reclassification of cannabis above 15% THC.

    Science.gov (United States)

    Van Laar, Margriet; Van Der Pol, Peggy; Niesink, Raymond

    2016-08-01

The Netherlands has seen an increase in Δ9-tetrahydrocannabinol (THC) concentrations from approximately 8% in the 1990s up to 20% in 2004. Increased cannabis potency may lead to higher THC exposure and cannabis-related harm. The Dutch government officially condones the sale of cannabis from so-called 'coffee shops', and the Opium Act distinguishes cannabis as a Schedule II drug with 'acceptable risk' from other drugs with 'unacceptable risk' (Schedule I). Even in 1976, however, cannabis potency was taken into account by distinguishing hemp oil as a Schedule I drug. In 2011, an advisory committee recommended tightening up legislation, leading to a 2013 bill proposing the reclassification of high-potency cannabis products with a THC content of 15% or more as Schedule I drugs. The purpose of this measure was twofold: to reduce public health risks and to reduce illegal cultivation and export of cannabis by increasing punishment. This paper focuses on the public health aspects and describes the (explicit and implicit) assumptions underlying this '15% THC measure', as well as the extent to which these are supported by scientific research. Based on scientific literature and other sources of information, we conclude that the 15% measure can, in theory, provide a slight health benefit for specific groups of cannabis users (i.e., frequent users preferring strong cannabis, purchasing from coffee shops, using 'steady quantities' and not changing their smoking behaviour), but certainly not for all cannabis users. These gains should be weighed against the investment in enforcement and the risk of unintended (adverse) effects. Given the many assumptions and the uncertainty about the nature and extent of the expected buying and smoking behaviour changes, the measure is a political choice based on thin evidence.

  3. Compound-nuclear reaction cross sections via the Surrogate method: considering the underlying assumptions

    Science.gov (United States)

    Escher, Jutta; Dietrich, Frank

    2006-10-01

    The Surrogate Nuclear Reactions approach makes it possible to determine compound-nuclear reaction cross sections indirectly. The method has been employed to determine (n,f) cross sections for various actinides, including unstable species [1-4]; other, primarily neutron- induced, reactions are being considered also [5,6]. The extraction of the sought-after cross sections typically relies on approximations to the full Surrogate formalism [7]. This presentation will identify and critically examine the most significant assumptions underlying the experimental work carried out so far. Calculations that test the validity of the approximations employed will be presented. [1] J.D. Cramer and H.C. Britt, Nucl. Sci. and Eng. 41, 177(1970); H.C. Britt and J.B. Wilhelmy, ibid. 72, 222(1979) [2] M. Petit et al, Nucl. Phys. A735, 345(2004) [3] C. Plettner et al, Phys. Rev. C 71, 051602(2005); J. Burke et al, Phys. Rev. C. 73, 054604(2006) [4] W. Younes and H.C. Britt, Phys. Rev. C 67, 024610(2003); 68, 034610(2003) [5] L.A. Bernstein et al, AIP Conf. Proc. 769, 890(2005) [6] J. Escher et al, Nucl. Phys. A758, 43c(2005) [7] J. Escher and F.S. Dietrich, submitted (2006)

  4. A Memory-Based Model of Posttraumatic Stress Disorder: Evaluating Basic Assumptions Underlying the PTSD Diagnosis

    Science.gov (United States)

    Rubin, David C.; Berntsen, Dorthe; Bohni, Malene Klindt

    2008-01-01

    In the mnemonic model of posttraumatic stress disorder (PTSD), the current memory of a negative event, not the event itself, determines symptoms. The model is an alternative to the current event-based etiology of PTSD represented in the "Diagnostic and Statistical Manual of Mental Disorders" (4th ed., text rev.; American Psychiatric Association,…

  5. Exploring the Estimation of Examinee Locations Using Multidimensional Latent Trait Models under Different Distributional Assumptions

    Science.gov (United States)

    Jang, Hyesuk

    2014-01-01

    This study aims to evaluate a multidimensional latent trait model to determine how well the model works in various empirical contexts. Contrary to the assumption of these latent trait models that the traits are normally distributed, situations in which the latent trait is not shaped with a normal distribution may occur (Sass et al, 2008; Woods…

  6. Under What Assumptions Do Site-by-Treatment Instruments Identify Average Causal Effects?

    Science.gov (United States)

    Reardon, Sean F.; Raudenbush, Stephen W.

    2013-01-01

    The increasing availability of data from multi-site randomized trials provides a potential opportunity to use instrumental variables methods to study the effects of multiple hypothesized mediators of the effect of a treatment. We derive nine assumptions needed to identify the effects of multiple mediators when using site-by-treatment interactions…

  7. An Algorithm for Determining Database Consistency Under the Closed World Assumption

    Institute of Scientific and Technical Information of China (English)

    沈一栋

    1992-01-01

    It is well-known that there are circumstances where applying Reiter's closed world assumption (CWA) will lead to logical inconsistencies. In this paper, a new characterization of the CWA consistency is presented and an algorithm is proposed for determining whether a database without function symbols is consistent with the CWA. The algorithm is shown to be efficient.

  8. Determination of the optimal periodic maintenance policy under imperfect repair assumption

    OpenAIRE

    Maria Luiza Guerra de Toledo

    2014-01-01

    An appropriate maintenance policy is essential to reduce expenses and risks related to repairable systems failures. The usual assumptions of minimal or perfect repair at failures are not suitable for many real systems, requiring the application of Imperfect Repair models. In this work, the classes Arithmetic Reduction of Age and Arithmetic Reduction of Intensity, proposed by Doyen and Gaudoin (2004), are explored. Likelihood functions for such models are derived, and the parameters are es...

  9. Efficient Accountable Authority Identity-Based Encryption under Static Complexity Assumptions

    CERN Document Server

    Libert, Benoît

    2008-01-01

    At Crypto'07, Goyal introduced the concept of Accountable Authority Identity-Based Encryption (A-IBE) as a convenient means to reduce the amount of trust in authorities in Identity-Based Encryption (IBE). In this model, if the Private Key Generator (PKG) maliciously re-distributes users' decryption keys, it runs the risk of being caught and prosecuted. Goyal proposed two constructions: a first one based on Gentry's IBE which relies on strong assumptions (such as q-Bilinear Diffie-Hellman Inversion) and a second one resting on the more classical Decision Bilinear Diffie-Hellman (DBDH) assumption but that is too inefficient for practical use. In this work, we propose a new construction that is secure assuming the hardness of the DBDH problem. The efficiency of our scheme is comparable with that of Goyal's main proposal with the advantage of relying on static assumptions (i.e. the strength of which does not depend on the number of queries allowed to the adversary). By limiting the number of adversarial rewinds i...

  10. On the underlying assumptions of threshold Boolean networks as a model for genetic regulatory network behavior

    Science.gov (United States)

    Tran, Van; McCall, Matthew N.; McMurray, Helene R.; Almudevar, Anthony

    2013-01-01

    Boolean networks (BoN) are relatively simple and interpretable models of gene regulatory networks. Specifying these models with fewer parameters while retaining their ability to describe complex regulatory relationships is an ongoing methodological challenge. Additionally, extending these models to incorporate variable gene decay rates, asynchronous gene response, and synergistic regulation while maintaining their Markovian nature increases the applicability of these models to genetic regulatory networks (GRN). We explore a previously-proposed class of BoNs characterized by linear threshold functions, which we refer to as threshold Boolean networks (TBN). Compared to traditional BoNs with unconstrained transition functions, these models require far fewer parameters and offer a more direct interpretation. However, the functional form of a TBN does result in a reduction in the regulatory relationships which can be modeled. We show that TBNs can be readily extended to permit self-degradation, with explicitly modeled degradation rates. We note that the introduction of variable degradation compromises the Markovian property fundamental to BoN models but show that a simple state augmentation procedure restores their Markovian nature. Next, we study the effect of assumptions regarding self-degradation on the set of possible steady states. Our findings are captured in two theorems relating self-degradation and regulatory feedback to the steady state behavior of a TBN. Finally, we explore assumptions of synchronous gene response and asynergistic regulation and show that TBNs can be easily extended to relax these assumptions. Applying our methods to the budding yeast cell-cycle network revealed that although the network is complex, its steady state is simplified by the presence of self-degradation and lack of purely positive regulatory cycles. PMID:24376454
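As a concrete illustration of the model class discussed above, a synchronous threshold Boolean network update can be sketched in a few lines. The three-gene network, weights, and thresholds below are illustrative assumptions, not the yeast cell-cycle network from the paper, and the sketch omits the self-degradation extension:

```python
import numpy as np

# Illustrative 3-gene regulatory network: W[i, j] is the (assumed) effect
# of gene j on gene i; theta holds the per-gene activation thresholds.
W = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 1, 0]])
theta = np.zeros(3)

def step(x):
    """One synchronous TBN update: x_i <- 1 iff sum_j W[i, j] * x_j > theta_i."""
    return (W @ x > theta).astype(int)

def steady_state(x, max_iter=100):
    """Iterate the update until a fixed point; None if only a cycle is reached."""
    for _ in range(max_iter):
        nxt = step(x)
        if np.array_equal(nxt, x):
            return x
        x = nxt
    return None

print(steady_state(np.array([1, 1, 0])))  # reaches the fixed point [1 1 1]
```

The parameter saving the abstract refers to is visible here: an unconstrained Boolean transition function on n regulators needs up to 2^n truth-table entries per gene, while the threshold form needs only n weights and one threshold.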

  11. On the underlying assumptions of threshold Boolean networks as a model for genetic regulatory network behavior

    Directory of Open Access Journals (Sweden)

    Van eTran

    2013-12-01

    Full Text Available Boolean networks (BoN) are relatively simple and interpretable models of gene regulatory networks. Specifying these models with fewer parameters while retaining their ability to describe complex regulatory relationships is an ongoing methodological challenge. Additionally, extending these models to incorporate variable gene decay rates, asynchronous gene response, and synergistic regulation while maintaining their Markovian nature increases the applicability of these models to genetic regulatory networks. We explore a previously-proposed class of BoNs characterized by linear threshold functions, which we refer to as threshold Boolean networks (TBN). Compared to traditional BoNs with unconstrained transition functions, these models require far fewer parameters and offer a more direct interpretation. However, the functional form of a TBN does result in a reduction in the regulatory relationships which can be modeled. We show that TBNs can be readily extended to permit self-degradation, with explicitly modeled degradation rates. We note that the introduction of variable degradation compromises the Markovian property fundamental to BoN models but show that a simple state augmentation procedure restores their Markovian nature. Next, we study the effect of assumptions regarding self-degradation on the set of possible steady states. Our findings are captured in two theorems relating self-degradation and regulatory feedback to the steady state behavior of a TBN. Finally, we explore assumptions of synchronous gene response and asynergistic regulation and show that TBNs can be easily extended to relax these assumptions. Applying our methods to the budding yeast cell-cycle network revealed that although the network is complex, its steady state is simplified by the presence of self-degradation and lack of purely positive regulatory cycles.

  12. Evaluation of Horizontal Electric Field Under Different Lightning Current Models by Perfect Ground Assumption

    Institute of Scientific and Technical Information of China (English)

    LIANG Jianfeng; LI Yanming

    2012-01-01

    Lightning electromagnetics can affect the reliability of power and communication systems, so evaluation of the electromagnetic fields generated by the lightning return stroke is indispensable. Arnold Sommerfeld proposed a model to calculate the electromagnetic field, but it involves the time-consuming Sommerfeld integral. The perfectly conducting ground assumption, by contrast, allows fast calculation. This paper therefore reviews the perfect-ground equations for evaluating lightning electromagnetic fields, presents three engineering lightning return stroke models, and calculates the horizontal electric field produced by each. According to the results, both the amplitude and the steepness of the lightning return stroke strongly influence the horizontal electric field. Moreover, the perfect-ground method is faster than the Sommerfeld integral method.

  13. To describe or prescribe: assumptions underlying a prescriptive nursing process approach to spiritual care.

    Science.gov (United States)

    Pesut, Barbara; Sawatzky, Rick

    2006-06-01

    Increasing attention is being paid to spirituality in nursing practice. Much of the literature on spiritual care uses the nursing process to describe this aspect of care. However, the use of the nursing process in the area of spirituality may be problematic, depending upon the understandings of the nature and intent of this process. Is it primarily a descriptive process meant to make visible the nursing actions to provide spiritual support, or is it a prescriptive process meant to guide nursing actions for intervening in the spirituality of patients? A prescriptive nursing process approach implies influencing, and in some cases reframing, the spirituality of patients and thereby extends beyond general notions of spiritual support. In this paper we discuss four problematic assumptions that form the basis for a prescriptive approach to spiritual care. We conclude that this approach extends the nursing role beyond appropriate professional boundaries, making it ethically problematic.

  14. Estimating Alarm Thresholds for Process Monitoring Data under Different Assumptions about the Data Generating Mechanism

    Directory of Open Access Journals (Sweden)

    Tom Burr

    2013-01-01

    Full Text Available Process monitoring (PM for nuclear safeguards sometimes requires estimation of thresholds corresponding to small false alarm rates. Threshold estimation dates to the 1920s with the Shewhart control chart; however, because possible new roles for PM are being evaluated in nuclear safeguards, it is timely to consider modern model selection options in the context of threshold estimation. One of the possible new PM roles involves PM residuals, where a residual is defined as residual = data − prediction. This paper reviews alarm threshold estimation, introduces model selection options, and considers a range of assumptions regarding the data-generating mechanism for PM residuals. Two PM examples from nuclear safeguards are included to motivate the need for alarm threshold estimation. The first example involves mixtures of probability distributions that arise in solution monitoring, which is a common type of PM. The second example involves periodic partial cleanout of in-process inventory, leading to challenging structure in the time series of PM residuals.
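The threshold-estimation task described above can be sketched minimally: given a sample of PM residuals (residual = data − prediction), choose the alarm threshold whose exceedance probability matches a small target false alarm rate. The Gaussian-fit versus empirical-quantile contrast below mirrors the model-selection question in the abstract; the synthetic data and all numeric values are assumptions for illustration.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)
residuals = rng.normal(0.0, 1.0, size=10_000)   # stand-in for PM residuals
alpha = 0.001                                   # target false alarm rate

# (a) parametric: fit a normal distribution, take its (1 - alpha) quantile
mu, sigma = residuals.mean(), residuals.std(ddof=1)
t_param = NormalDist(mu, sigma).inv_cdf(1 - alpha)

# (b) nonparametric: empirical (1 - alpha) quantile, no distributional assumption
t_emp = float(np.quantile(residuals, 1 - alpha))

print(t_param, t_emp)   # both should sit near 3.09 for N(0, 1) residuals
```

When the residuals really are Gaussian the two estimates agree; the abstract's point is that for mixtures or structured time series (the solution-monitoring and cleanout examples) the parametric choice matters, and the nonparametric route needs far more data at small alpha.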

  15. Finding the right fit: A comparison of process assumptions underlying popular drift-diffusion models.

    Science.gov (United States)

    Ashby, Nathaniel J S; Jekel, Marc; Dickert, Stephan; Glöckner, Andreas

    2016-12-01

    Recent research makes increasing use of eye-tracking methodologies to generate and test process models. Overall, such research suggests that attention, generally indexed by fixations (gaze duration), plays a critical role in the construction of preference, although the methods used to support this supposition differ substantially. In two studies we empirically test prototypical versions of prominent processing assumptions against one another and against several base models. We find that general evidence accumulation processes provide a good fit to the data. An accumulation process that assumes leakage and temporal variability in evidence weighting (i.e., a primacy effect) fits the aggregate data well, both in terms of choices and decision times, across varying types of choices (e.g., charitable giving and hedonic consumption) and numbers of options. However, when comparing models at the level of the individual, simpler models capture the choice data better for a majority of participants. The theoretical and practical implications of these findings are discussed.

  16. Evaluating assumptions and parameterization underlying process-based ecosystem models: the case of LPJ-GUESS

    Science.gov (United States)

    Pappas, C.; Fatichi, S.; Leuzinger, S.; Burlando, P.

    2012-04-01

    Dynamic vegetation models have been widely used for analyzing ecosystem dynamics and climate feedbacks. Their performance has been tested extensively against observations and by model intercomparison studies. In the present study, the LPJ-GUESS state-of-the-art ecosystem model was evaluated with respect to its structure, hypotheses, and parameterization by performing a global sensitivity analysis (GSA). The study aims at examining potential model limitations, particularly with regard to regional and watershed scale applications. A detailed GSA based on variance decomposition is presented to investigate the structural assumptions of the model and to highlight processes and parameters that cause the highest variability in the outputs. First order and total sensitivity indexes were calculated for each of the parameters using Sobol's methodology. In order to elucidate the role of climate in model sensitivity, synthetic climate scenarios were generated based on climatic data from Switzerland. The results clearly indicate a very high sensitivity of LPJ-GUESS to photosynthetic parameters. Intrinsic quantum efficiency alone is able to explain about 60% of the variability in vegetation carbon fluxes and pools for most of the investigated climate conditions. Processes related to light were also found important, together with parameters affecting plant structure (growth, establishment and mortality). The model shows minor sensitivity to hydrological and soil texture parameters, questioning its skill in representing spatial vegetation heterogeneity at regional or watershed scales. We conclude that the structure of LPJ-GUESS, and possibly of other, structurally similar dynamic vegetation models, may need to be reconsidered. Specifically, the oversensitivity of the photosynthetic component deserves particular attention, as it seems to contradict an increasing number of observations suggesting that photosynthesis may be a consequence rather than the driver of plant growth.

  17. Multiple change-points estimation of moving-average processes under dependence assumptions

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lixin; LI Yunxia

    2004-01-01

    In this paper, some results of convergence for a least-square estimator in the problem of multiple change-points estimation are presented and the moving-average processes of ρ-mixing sequence in the mean shifts are discussed. When the number of change points is known, the consistency of change-points estimator is derived. When the number of changes is unknown, the consistency of the change-points number and the change-points estimator by penalized least-squares method are obtained. The results are also true for φ-mixing, α-mixing, associated and negative associated sequences under suitable conditions.

  18. Change-point Estimation of a Mean Shift in Moving-average Processes Under Dependence Assumptions

    Institute of Scientific and Technical Information of China (English)

    Yun-xia Li

    2006-01-01

    In this paper we discuss the least-square estimator of the unknown change point in a mean shift for moving-average processes of ALNQD sequence. The consistency and the rate of convergence for the estimated change point are established. The asymptotic distribution for the change point estimator is obtained. The results are also true for ρ-mixing, ψ-mixing, α-mixing sequences under suitable conditions. These results extend those of Bai [1], who studied the mean shift point of a linear process of i.i.d. variables; the condition Σ_{j=0}^∞ j|a_j| < ∞ in Bai is weakened to Σ_{j=0}^∞ |a_j| < ∞.
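The least-squares change-point estimator analyzed in the two entries above can be sketched for the simplest case of a single mean shift: scan all splits and keep the one minimizing the within-segment sum of squares. The implementation and data below are illustrative, using i.i.d. Gaussian noise; the papers' results concern dependent (mixing, ALNQD) moving-average errors, which this sketch does not reproduce.

```python
import numpy as np

def change_point(y):
    """Least-squares estimate of a single mean-shift change point:
    the split k minimizing the total within-segment sum of squares."""
    best_k, best_sse = None, np.inf
    for k in range(1, len(y)):              # candidate segments y[:k] | y[k:]
        left, right = y[:k], y[k:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k

rng = np.random.default_rng(1)
y = np.where(np.arange(200) < 120, 0.0, 2.0) + rng.normal(size=200)  # shift at 120
print(change_point(y))  # estimate close to the true change point, 120
```

The consistency results in these entries say, roughly, that under the stated mixing conditions this estimator stays within a bounded distance of the true change point as the sample grows.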

  19. Newton’s problem of minimal resistance under the single-impact assumption

    Science.gov (United States)

    Plakhov, Alexander

    2016-02-01

    A parallel flow of non-interacting point particles is incident on a body at rest. When hitting the body's surface, the particles are reflected elastically. Assuming that each particle hits the body at most once (the single impact condition (SIC)), the force of resistance of the body along the flow direction can be written down in a simple analytical form. The problem of minimal resistance within this model was first considered by Newton (Newton 1687 Philosophiae Naturalis Principia Mathematica) in the class of bodies with a fixed length M along the flow direction and with a fixed maximum orthogonal cross section Ω, under the additional conditions that the body is convex and rotationally symmetric. Here we solve the problem (first stated in Buttazzo et al 1995 Minimum problems over sets of concave functions and related questions Math. Nachr. 173 71-89) for the wider class of bodies satisfying the SIC and with the additional conditions removed. The scheme of solution is inspired by Besicovitch's method of solving the Kakeya problem (Besicovitch 1963 The Kakeya problem Am. Math. Mon. 70 697-706). If Ω is a disc, the decrease of resistance as compared with the original Newton problem is more than twofold; the ratio tends to 2 as M → 0 and to 20.25 as M → ∞. We also prove that the infimum of resistance is 0 for a wider class of bodies with both single and double reflections allowed.
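For context, the quantity being minimized can be stated in its standard form (background material, not quoted from the abstract): for a body given by the graph of a function u on Ω with 0 ≤ u ≤ M, the resistance along the flow direction is proportional to

```latex
F(u) \;=\; \int_{\Omega} \frac{1}{1 + |\nabla u(x)|^{2}} \, dx .
```

Newton's classical solution minimizes F over convex, rotationally symmetric profiles; the paper above removes the convexity and symmetry restrictions, keeping only the single impact condition.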

  20. Error estimations of dry deposition velocities of air pollutants using bulk sea surface temperature under common assumptions

    Science.gov (United States)

    Lan, Yung-Yao; Tsuang, Ben-Jei; Keenlyside, Noel; Wang, Shu-Lun; Arthur Chen, Chen-Tung; Wang, Bin-Jye; Liu, Tsun-Hsien

    2010-07-01

    It is well known that skin sea surface temperature (SSST) is different from bulk sea surface temperature (BSST) by a few tenths of a degree Celsius. However, the extent of the error associated with dry deposition (or uptake) estimation by using BSST is not well known. This study tries to conduct such an evaluation using the on-board observation data over the South China Sea in the summers of 2004 and 2006. It was found that when a warm layer occurred, the deposition velocities using BSST were underestimated within the range of 0.8-4.3%, and the absorbed sea surface heat flux was overestimated by 21 W m⁻². In contrast, under cool skin only conditions, the deposition velocities using BSST were overestimated within the range of 0.5-2.0%, varying with pollutants, and the absorbed sea surface heat flux was underestimated, also by 21 W m⁻². Scale analysis shows that for a slightly soluble gas (e.g., NO₂, NO and CO), the error in the solubility estimation using BSST is the major source of the error in dry deposition estimation. For a highly soluble gas (e.g., SO₂), the error in the estimation of turbulent heat fluxes and, consequently, aerodynamic resistance and gas-phase film resistance using BSST is the major source of the total error. In contrast, for a medium soluble gas (e.g., O₃ and CO₂) both the errors from the estimations of the solubility and aerodynamic resistance are important. In addition, deposition estimations using various assumptions are discussed. The largest uncertainty is from the parameterizations for chemical enhancement factors. Other important areas of uncertainty include: (1) various parameterizations for gas-transfer velocity; (2) the neutral-atmosphere assumption; (3) using BSST as SST; and (4) the constant pH value assumption.

  1. Uncovering Underlying Assumptions Regarding Education and Technology in Educational Reform Efforts A conversation with Dr. Larry Johnson

    Directory of Open Access Journals (Sweden)

    Gabriela Melano

    2000-11-01

    Full Text Available Educational systems around the world, and specifically in the United States, have long awaited genuine reform efforts. Technology is often perceived as a panacea, if not as a crucial instrument, in any educational reform effort. In a conversation with one of his students, Doctor Johnson discusses how the underlying assumptions embedded in our current schooling practices need to be seriously reviewed before any technology strategy is considered. New understandings, as opposed to mere information, are what schools need to reach in order to transform themselves. Finally, Dr. Johnson provides two brief examples, one in the United States and another in México, where hermeneutical approaches have been used for educational reform endeavors.

  2. Can Total Quality Management Succeed at Your College--Now? (Does Your College Meet the Essential Prerequisites and Underlying Assumptions of TQM?)

    Science.gov (United States)

    Hammons, James O.

    1994-01-01

    Defines Total Quality Management (TQM) and describes prerequisites for successful implementation, underlying assumptions, and the cultural barriers hindering implementation. Indicates that TQM's long-term benefits outweigh costs at most colleges. Urges practitioners to rate their schools with respect to the prerequisites and assumptions to…

  3. Assessing the Sensitivity of a Reservoir Management System Under Plausible Assumptions About Future Climate Over Seasons to Decades

    Science.gov (United States)

    Ward, M. N.; Brown, C. M.; Baroang, K. M.; Kaheil, Y. H.

    2011-12-01

    We illustrate an analysis procedure that explores the robustness and overall productivity of a reservoir management system under plausible assumptions about climate fluctuation and change. Results are presented based on a stylized version of a multi-use reservoir management model adapted from Angat Dam, Philippines. It represents a modest-sized seasonal storage reservoir in a climate with a pronounced dry season. The reservoir management model focuses on October-March, during which climatological inflow declines due to the arrival of the dry season, and reservoir management becomes critical and challenging. Inflow is assumed to be impacted by climate fluctuations representing interannual variation (white noise), decadal to multidecadal variation (MDV, here represented by a stochastic autoregressive process) and global change (GC), here represented by a systematic linear trend in seasonal inflow total over the simulation period of 2008-2047. Reservoir reliability, and risk of extreme persistent water shortfall, is assessed under different combinations and magnitudes of GC and MDV. We include an illustration of adaptive management, using seasonal forecasts and updated climate normals. A set of seasonal forecast and observed inflow values are generated for 2008-2047 by randomly rearranging the forecast-observed pairs for 1968-2007. Then, trends are imposed on the observed series, with differing assumptions about the extent to which the seasonal forecasts can be expected to track the trend. We consider the framework presented here well-suited to providing insights about managing climate risks in reservoir operations, providing guidance on expected benefits and risks of different strategies and climate scenarios.
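The inflow scenario generation described above can be sketched as follows. The decomposition (white noise + autoregressive MDV + linear GC trend over 2008-2047) follows the abstract, but every numeric value, the AR(1) choice for MDV, and the demand-based reliability proxy are illustrative assumptions, not calibrated Angat Dam parameters.

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(2008, 2048)
n = len(years)

mean_inflow = 100.0          # climatological seasonal total (arbitrary units)
gc_trend = -0.5              # GC: assumed linear change per year
phi, sigma_mdv = 0.9, 3.0    # MDV: AR(1) persistence and innovation scale
sigma_wn = 8.0               # interannual white noise

mdv = np.zeros(n)            # decadal-to-multidecadal component
for t in range(1, n):
    mdv[t] = phi * mdv[t - 1] + rng.normal(0, sigma_mdv)

inflow = (mean_inflow
          + gc_trend * (years - years[0])   # global-change trend
          + mdv                             # slow variation
          + rng.normal(0, sigma_wn, size=n))  # year-to-year noise

# Crude reliability proxy: fraction of seasons meeting a fixed demand
demand = 85.0
reliability = (inflow >= demand).mean()
print(reliability)
```

Re-running this over a grid of `gc_trend` and `sigma_mdv` values, with and without forecast-informed releases, is the kind of sensitivity sweep the abstract describes.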

  4. Herd immunity effect of the HPV vaccination program in Australia under different assumptions regarding natural immunity against re-infection.

    Science.gov (United States)

    Korostil, Igor A; Peters, Gareth W; Law, Matthew G; Regan, David G

    2013-04-01

    Deterministic dynamic compartmental transmission models (DDCTMs) of human papillomavirus (HPV) transmission have been used in a number of studies to estimate the potential impact of HPV vaccination programs. In most cases, the models were built under the assumption that an individual who cleared HPV infection develops (life-long) natural immunity against re-infection with the same HPV type (known as the SIR scenario). This assumption was also made by two Australian modelling studies evaluating the impact of the National HPV Vaccination Program to assist in the health-economic assessment of male vaccination. An alternative view denying natural immunity after clearance (the SIS scenario) was only presented in one study, although neither scenario has been supported by strong evidence. Some recent findings, however, provide arguments in favour of SIS. We developed HPV transmission models implementing life-long (SIR), limited, and non-existent (SIS) natural immunity. For each model we estimated the herd immunity effect of the ongoing Australian HPV vaccination program and its extension to cover males. Given the Australian setting, we aimed to clarify the extent to which the choice of model structure would influence estimation of this effect. A statistically robust and efficient calibration methodology was applied to ensure the credibility of our results. We observed that for non-SIR models the herd immunity effect, measured in relative reductions in HPV prevalence in the unvaccinated population, was much more pronounced than for the SIR model. For example, with vaccine efficacy of 95% for females and 90% for males, the reductions for HPV-16 were 3% in females and 28% in males for the SIR model, and at least 30% (females) and 60% (males) for non-SIR models. The magnitude of these differences implies that evaluations of the impact of vaccination programs using DDCTMs should incorporate several model structures until our understanding of natural immunity is improved.
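The SIR-versus-SIS distinction at the heart of the study above can be illustrated with a toy transmission model. This is a deliberately minimal forward-Euler sketch with invented rates, not the calibrated HPV DDCTM from the paper; the only difference between the two runs is whether cleared individuals return to the susceptible pool.

```python
beta, gamma = 0.30, 0.10      # illustrative transmission and clearance rates
dt, steps = 0.1, 20_000       # Euler step and horizon (t = 0 .. 2000)

def final_prevalence(waning):
    """Integrate a toy S/I(/R) model; waning=True gives SIS, False gives SIR."""
    s, i, r = 0.99, 0.01, 0.0
    for _ in range(steps):
        new_inf = beta * s * i
        clear = gamma * i
        s += dt * (-new_inf + (clear if waning else 0.0))
        i += dt * (new_inf - clear)
        r += dt * (0.0 if waning else clear)
    return i

prev_sis = final_prevalence(True)    # settles near the endemic level 1 - gamma/beta
prev_sir = final_prevalence(False)   # epidemic burns out; prevalence tends to 0
print(prev_sis, prev_sir)
```

With identical rates, the SIS run sustains a sizeable endemic prevalence that vaccination can further suppress among the unvaccinated, while the SIR run leaves little residual infection, which gives some intuition for why the non-SIR structures in the paper showed stronger herd immunity effects.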

  5. Tale of Two Courthouses: A Critique of the Underlying Assumptions in Chronic Disease Self-Management for Aboriginal People

    Directory of Open Access Journals (Sweden)

    Isabelle Ellis

    2009-12-01

    Full Text Available This article reviews the assumptions that underpin the commonly implemented Chronic Disease Self-Management models: namely, that there is a clear set of instructions for patients to comply with, that all health care providers agree with those instructions, and that the health care provider and the patient agree with the chronic disease self-management plan that was developed as part of a consultation. These assumptions are evaluated for their validity in the remote health care context, particularly for Aboriginal people. They have been found to lack validity in this context; therefore an alternative model to enhance chronic disease care is proposed.

  6. Estimating risks and relative risks in case-base studies under the assumptions of gene-environment independence and Hardy-Weinberg equilibrium.

    Science.gov (United States)

    Chui, Tina Tsz-Ting; Lee, Wen-Chung

    2014-01-01

    Many diseases result from the interactions between genes and the environment. An efficient method has been proposed for a case-control study to estimate the genetic and environmental main effects and their interactions, which exploits the assumptions of gene-environment independence and Hardy-Weinberg equilibrium. To estimate the absolute and relative risks, one needs to resort to an alternative design: the case-base study. In this paper, the authors show how to analyze a case-base study under the above dual assumptions. This approach is based on a conditional logistic regression of case-counterfactual controls matched data. It can be easily fitted with readily available statistical packages. When the dual assumptions are met, the method is approximately unbiased and has adequate coverage probabilities for confidence intervals. It also results in smaller variances and shorter confidence intervals as compared with a previous method for a case-base study which imposes neither assumption.

  7. The Basic Assumptions of the Theory of Generalized Virtual Microeconomics

    Institute of Scientific and Technical Information of China (English)

    李小宁

    2014-01-01

    Based on the nature of “psychological needs”, five basic hypotheses of generalized virtual microeconomics are proposed, and the new paradigm of generalized virtual microeconomics is discussed.

  8. A Study of the Effects of Underlying Assumptions in the Reduction of Multi-Object Photometry of Transiting Exoplanets

    Science.gov (United States)

    Fitzpatrick, M. Ryleigh; Silva Martins-Filho, Walter; Griffith, Caitlin Ann; Pearson, Kyle; Zellem, Robert Thomas; AzGOE

    2016-10-01

    The analysis of ground-based photometric observations of planetary transits must treat the effects of the Earth's atmosphere, which exceed the signal of the extrasolar planet. Generally, this is achieved by dividing the signal of the host star and planet by that of nearby field stars to reveal the lightcurve. The lightcurve is then fit to a model of the planet's orbit and physical characteristics, also taking into account the characteristics of the star. The fit to the in- and out-of-transit data establishes the depth of the lightcurve. The question arises: what is the best way to select and treat reference stars so as to best characterize and remove the shared atmospheric systematics that plague our transit signal? To explore these questions we examine the effects of several assumptions that underlie the calculation of the light curve depth. Our study involves repeated photometric observations of hot Jupiter primary transits in the U and B filters. Data were taken with the University of Arizona's Kuiper 1.55m telescope/Mont4K CCD. Each exoplanet observed offers a unique field with stars of various brightness, spectral types and angular distance from the host star. While these observations are part of a larger study of the Rayleigh scattering signature of hot Jupiter exoplanets, here we study the effects of various choices during reduction, specifically the treatment of reference stars and atmospheric systematics. We calculate the lightcurve for all permutations of reference stars, considering several out-of-transit assumptions (e.g., linear, quadratic or exponential). We assess the sensitivity of the transit depths based on the spread of values. In addition we look for characteristics that minimize the scatter in the reduced lightcurve and analyze the effects of the treatment of individual variables on the resultant lightcurve model. Here we present the results of an in-depth statistical analysis that classifies the effect of various parameters and choices involved in

  9. A computer code for forward calculation and inversion of the H/V spectral ratio under the diffuse field assumption

    Science.gov (United States)

    García-Jerez, Antonio; Piña-Flores, José; Sánchez-Sesma, Francisco J.; Luzón, Francisco; Perton, Mathieu

    2016-12-01

    During a quarter of a century, the main characteristics of the horizontal-to-vertical spectral ratio of ambient noise HVSRN have been extensively used for site effect assessment. In spite of the uncertainties about the optimum theoretical model to describe these observations, over the last decade several schemes for inversion of the full HVSRN curve for near surface surveying have been developed. In this work, a computer code for forward calculation of H/V spectra based on the diffuse field assumption (DFA) is presented and tested. It takes advantage of the recently stated connection between the HVSRN and the elastodynamic Green's function which arises from the ambient noise interferometry theory. The algorithm allows for (1) a natural calculation of the Green's functions imaginary parts by using suitable contour integrals in the complex wavenumber plane, and (2) separate calculation of the contributions of Rayleigh, Love, P-SV and SH waves as well. The stability of the algorithm at high frequencies is preserved by means of an adaptation of Wang's orthonormalization method to the calculation of dispersion curves, surface-wave medium responses and contributions of body waves. This code has been combined with a variety of inversion methods to make up a powerful tool for passive seismic surveying.

  10. The metaphysics of D-CTCs: On the underlying assumptions of Deutsch's quantum solution to the paradoxes of time travel

    Science.gov (United States)

    Dunlap, Lucas

    2016-11-01

    I argue that Deutsch's model for the behavior of systems traveling around closed timelike curves (CTCs) relies implicitly on a substantive metaphysical assumption. Deutsch is employing a version of quantum theory with a significantly supplemented ontology of parallel existent worlds, which differ in kind from the many worlds of the Everett interpretation. Standard Everett does not support the existence of multiple identical copies of the world, which the D-CTC model requires. This has been obscured because he often refers to the branching structure of Everett as a "multiverse", and describes quantum interference by reference to parallel interacting definite worlds. But he admits that this is only an approximation to Everett. The D-CTC model, however, relies crucially on the existence of a multiverse of parallel interacting worlds. Since his model is supplemented by structures that go significantly beyond quantum theory, and play an ineliminable role in its predictions and explanations, it does not represent a quantum solution to the paradoxes of time travel.

  11. A computer code for forward calculation and inversion of the H/V spectral ratio under the diffuse field assumption

    CERN Document Server

    García-Jerez, Antonio; Sánchez-Sesma, Francisco J; Luzón, Francisco; Perton, Mathieu

    2016-01-01

    For a quarter of a century, the main characteristics of the horizontal-to-vertical spectral ratio of ambient noise (HVSRN) have been extensively used for site effect assessment. In spite of the uncertainties about the optimum theoretical model to describe these observations, several schemes for inversion of the full HVSRN curve for near surface surveying have been developed over the last decade. In this work, a computer code for forward calculation of H/V spectra based on the diffuse field assumption (DFA) is presented and tested. It takes advantage of the recently stated connection between the HVSRN and the elastodynamic Green's function which arises from the ambient noise interferometry theory. The algorithm allows for (1) a natural calculation of the imaginary parts of the Green's functions by using suitable contour integrals in the complex wavenumber plane, and (2) separate calculation of the contributions of Rayleigh, Love, P-SV and SH waves as well. The stability of the algorithm at high frequencies is preserv...

  12. Understanding the basic biology underlying the flavor world of children

    Directory of Open Access Journals (Sweden)

    Julie A. MENNELLA, Alison K. VENTURA

    2010-12-01

    Full Text Available Health organizations worldwide recommend that adults and children minimize intakes of excess energy and salty, sweet, and fatty foods (all of which are highly preferred tastes) and eat diets richer in whole grains, low- and non-fat dairy products, legumes, fish, lean meat, fruits, and vegetables (many of which taste bitter). Despite such recommendations and the well-established benefits of these foods to human health, adults are not complying, nor are their children. A primary reason for this difficulty is the remarkably potent rewarding properties of the tastes and flavors of foods high in sweetness, saltiness, and fatness. While we cannot easily change children’s basic ingrained biology of liking sweets and avoiding bitterness, we can modulate their flavor preferences by providing early exposure, starting in utero, to a wide variety of flavors within healthy foods, such as fruits, vegetables, and whole grains. Because the flavors of foods mothers eat during pregnancy and lactation also flavor amniotic fluid and breast milk and become preferred by infants, pregnant and lactating women should widen their food choices to include as many flavorful and healthy foods as possible. These experiences, combined with repeated exposure to nutritious foods and flavor variety during the weaning period and beyond, should maximize the chances that children will select and enjoy a healthier diet [Current Zoology 56 (6): 834–841, 2010].

  13. Gastric sensitivity and reflexes: basic mechanisms underlying clinical problems.

    Science.gov (United States)

    Azpiroz, Fernando; Feinle-Bisset, Christine; Grundy, David; Tack, Jan

    2014-02-01

    Both reflex and sensory mechanisms control the function of the stomach, and disturbances in these mechanisms may explain the pathophysiology of disorders of gastric function. The objective of this report is to perform a literature-based critical analysis of new, relevant or conflicting information on gastric sensitivity and reflexes, with particular emphasis on the comprehensive integration of basic and clinical research data. The stomach exerts both phasic and tonic muscular (contractile and relaxatory) activity. Gastric tone determines the capacity of the stomach and mediates both gastric accommodation to a meal as well as gastric emptying, by partial relaxation or progressive recontraction, respectively. Perception and reflex afferent pathways from the stomach are activated independently by specific stimuli, suggesting that the terminal nerve endings operate as specialized receptors. Particularly, perception appears to be related to stimulation of tension receptors, while the existence of volume receptors in the stomach is uncertain. Reliable techniques have been developed to measure gastric perception and reflexes both in experimental and clinical conditions, and have facilitated the identification of abnormal responses in patients with gastric disorders. Gastroparesis is characterised by impaired gastric tone and contractility, whereas patients with functional dyspepsia have impaired accommodation, associated with antral distention and increased gastric sensitivity. An integrated view of fragmented knowledge allows the design of pathophysiological models in an attempt to explain disorders of gastric function, and may facilitate the development of mechanistically orientated treatments.

  14. The Almost Sure Central Limit Theorems for the Maxima of Sums under Some New Weak Dependence Assumptions

    Institute of Scientific and Technical Information of China (English)

    Marcin DUDZIŃSKI; Przemysław GÓRKA

    2013-01-01

    We prove the almost sure central limit theorems for the maxima of partial sums of r.v.'s under a general condition of dependence due to Doukhan and Louhichi. We will separately consider the centered sequences and the sequences with positive expected values.

  15. Modeling probability and additive summation for detection across multiple mechanisms under the assumptions of signal detection theory.

    Science.gov (United States)

    Kingdom, Frederick A A; Baldwin, Alex S; Schmidtmann, Gunnar

    2015-01-01

    Many studies have investigated how multiple stimuli combine to reach threshold. There are broadly speaking two ways this can occur: additive summation (AS) where inputs from the different stimuli add together in a single mechanism, or probability summation (PS) where different stimuli are detected independently by separate mechanisms. PS is traditionally modeled under high threshold theory (HTT); however, tests have shown that HTT is incorrect and that signal detection theory (SDT) is the better framework for modeling summation. Modeling the equivalent of PS under SDT is, however, relatively complicated, leading many investigators to use Monte Carlo simulations for the predictions. We derive formulas that employ numerical integration to predict the proportion correct for detecting multiple stimuli assuming PS under SDT, for the situations in which stimuli are either equal or unequal in strength. Both formulas are general purpose, calculating performance for forced-choice tasks with M alternatives, n stimuli, in Q monitored mechanisms, each subject to a non-linear transducer with exponent τ. We show how the probability (and additive) summation formulas can be used to simulate psychometric functions, which when fitted with Weibull functions make signature predictions for how thresholds and psychometric function slopes vary as a function of τ, n, and Q. We also show how one can fit the formulas directly to real psychometric functions using data from a binocular summation experiment, and show how one can obtain estimates of τ and test whether binocular summation conforms more to PS or AS. The methods described here can be readily applied using software functions newly added to the Palamedes toolbox.
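The abstract derives numerical-integration formulas for probability summation under SDT; a Monte Carlo cross-check is the alternative it mentions. Below is a minimal sketch for equal-strength stimuli with a MAX decision rule (the function name, the Gaussian equal-variance setup, and the power-law transducer placement are my assumptions, not the paper's exact formulation):

```python
import numpy as np

rng = np.random.default_rng(0)

def pc_prob_summation(dprime, n, q, m=2, tau=1.0, trials=200_000):
    """Monte Carlo proportion correct for probability summation under
    SDT with a MAX rule: q independent mechanisms are monitored per
    interval; in the target interval, n of them carry signal with mean
    dprime**tau (the transducer); the observer picks the interval
    containing the single largest sample."""
    mean = dprime ** tau
    # target interval: n signal mechanisms + (q - n) noise mechanisms
    target = rng.normal(0.0, 1.0, (trials, q))
    target[:, :n] += mean
    target_max = target.max(axis=1)
    # m - 1 nontarget intervals, q noise-only mechanisms each
    nontarget_max = rng.normal(0.0, 1.0, (trials, m - 1, q)).max(axis=(1, 2))
    return np.mean(target_max > nontarget_max)

# Sanity check against the known 2AFC result Pc = Phi(d'/sqrt(2)) ~ 0.86
print(round(pc_prob_summation(dprime=1.5, n=1, q=1, m=2), 3))
```

Increasing `q` with `n` fixed lowers proportion correct (more mechanisms to monitor means more chances for a noise sample to win), which is the signature of PS that the fitted Weibull slopes pick up.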

  16. A Scalable Method for Regioselective 3-Acylation of 2-Substituted Indoles under Basic Conditions

    DEFF Research Database (Denmark)

    Johansson, Karl Henrik; Urruticoechea, Andoni; Larsen, Inna;

    2015-01-01

    Privileged structures such as 2-arylindoles are recurrent molecular scaffolds in bioactive molecules. We here present an operationally simple, high yielding and scalable method for regioselective 3-acylation of 2-substituted indoles under basic conditions using functionalized acid chlorides. The ...

  17. Unders and Overs: Using a Dice Game to Illustrate Basic Probability Concepts

    Science.gov (United States)

    McPherson, Sandra Hanson

    2015-01-01

    In this paper, the dice game "Unders and Overs" is described and presented as an active learning exercise to introduce basic probability concepts. The implementation of the exercise is outlined and the resulting presentations of various probability concepts are described.
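The probabilities behind the game come from enumerating the 36 equally likely rolls of two dice (the betting layout — under 7, exactly 7, over 7 — is the standard one; the variable names are mine):

```python
from itertools import product
from fractions import Fraction

# Enumerate all 36 equally likely outcomes of rolling two fair dice
rolls = [a + b for a, b in product(range(1, 7), repeat=2)]
total = len(rolls)

p_under = Fraction(sum(s < 7 for s in rolls), total)   # sums 2-6
p_seven = Fraction(sum(s == 7 for s in rolls), total)  # sum exactly 7
p_over  = Fraction(sum(s > 7 for s in rolls), total)   # sums 8-12

print(p_under, p_seven, p_over)  # -> 5/12 1/6 5/12
```

The symmetry (15/36 under, 6/36 seven, 15/36 over) is exactly what the classroom discussion of expected payoffs builds on.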

  18. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  19. An Unusual N-Boc Deprotection of Benzamides under Basic Conditions

    Institute of Scientific and Technical Information of China (English)

    YIN Biaolin; ZHANG Yuanxiu

    2009-01-01

    For the first time, an unusual cleavage of the N-tert-butyloxycarbonyl (N-Boc) protection from an N-Boc-protected benzamide under basic conditions in excellent yields is reported. The deprotection involves N-Boc migration from the benzamide to form a 2-O-Boc group, followed by O-Boc deprotection on the phenyl ring.

  20. Environment Assumptions for Synthesis

    CERN Document Server

    Chatterjee, Krishnendu; Jobstmann, Barbara

    2008-01-01

    The synthesis problem asks to construct a reactive finite-state system from an $\omega$-regular specification. Initial specifications are often unrealizable, which means that there is no system that implements the specification. A common reason for unrealizability is that assumptions on the environment of the system are incomplete. We study the problem of correcting an unrealizable specification $\phi$ by computing an environment assumption $\psi$ such that the new specification $\psi \to \phi$ is realizable. Our aim is to construct an assumption $\psi$ that constrains only the environment and is as weak as possible. We present a two-step algorithm for computing assumptions. The algorithm operates on the game graph that is used to answer the realizability question. First, we compute a safety assumption that removes a minimal set of environment edges from the graph. Second, we compute a liveness assumption that puts fairness conditions on some of the remaining environment edges. We show that the problem of findi...
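The first step above can be illustrated on a toy safety game. The sketch below (the graph encoding and the hand-picked edge to delete are illustrative assumptions; it does not implement the paper's minimal-set algorithm) computes the environment's attractor to the unsafe states and turns an unrealizable game into a realizable one by forbidding one environment edge:

```python
def env_attractor(nodes, owner, edges, bad):
    """States from which the environment can force a visit to `bad`.
    owner[v] is 'env' or 'sys'; edges maps v -> set of successors.
    Environment nodes need one successor in the attractor; system
    nodes are captured only if all their successors are."""
    attr = set(bad)
    changed = True
    while changed:
        changed = False
        for v in nodes:
            if v in attr:
                continue
            succs = edges[v]
            if owner[v] == 'env' and succs & attr:
                attr.add(v); changed = True
            elif owner[v] == 'sys' and succs and succs <= attr:
                attr.add(v); changed = True
    return attr

nodes = {'init', 'e1', 's1', 'bad'}
owner = {'init': 'sys', 'e1': 'env', 's1': 'sys', 'bad': 'sys'}
edges = {'init': {'e1'}, 'e1': {'bad', 's1'}, 's1': {'init'}, 'bad': set()}
bad = {'bad'}

assert 'init' in env_attractor(nodes, owner, edges, bad)  # unrealizable

# Safety assumption: the environment never takes the edge e1 -> bad
edges['e1'].discard('bad')
assert 'init' not in env_attractor(nodes, owner, edges, bad)  # realizable
print("assumption: environment never moves e1 -> bad")
```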

  1. Understanding E2 versus SN2 Competition under Acidic and Basic Conditions.

    Science.gov (United States)

    Wolters, Lando P; Ren, Yi; Bickelhaupt, F Matthias

    2014-02-01

    Our purpose is to understand the mechanism through which pH affects the competition between base-induced elimination and substitution. To this end, we have quantum chemically investigated the competition between elimination and substitution pathways in H2O + C2H5OH2(+) and OH(-) + C2H5OH, that is, two related model systems that represent, in a generic manner, the same reaction under acidic and basic conditions, respectively. We find that substitution is favored in the acidic case while elimination prevails under basic conditions. Activation-strain analyses of the reaction profiles reveal that the switch in preferred reactivity from substitution to elimination, if one goes from acidic to basic catalysis, is related to (1) the higher basicity of the deprotonated base, and (2) the change in character of the substrate's LUMO from C(β)-H bonding in C2H5OH2(+) to C(β)-H antibonding in C2H5OH.

  2. Monthly values of the standardized precipitation index in the State of São Paulo, Brazil: trends and spectral features under the normality assumption

    Directory of Open Access Journals (Sweden)

    Gabriel Constantino Blain

    2012-01-01

    Full Text Available The aim of this study was to describe monthly series of the Standardized Precipitation Index obtained from four weather stations of the State of São Paulo, Brazil. The analyses were carried out by evaluating the normality assumption of the SPI distributions, the spectral features of these series and, the presence of climatic trends in these datasets. It was observed that the Pearson type III distribution was better than the gamma 2-parameter distribution in providing monthly SPI series closer to the normality assumption inherent to the use of this standardized index. The spectral analyses carried out in the time-frequency domain did not allow us to establish a dominant mode in the analyzed series. In general, the Mann-Kendall and the Pettitt tests indicated the presence of no significant trend in the SPI series. However, both trend tests have indicated that the temporal variability of this index, observed at the months of October over the last 60 years, cannot be seen as the result of a purely random process. This last inference is due to the concentration of decreasing trends, with a common beginning (1983/84) in the four locations of the study.
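The SPI construction behind this study maps each monthly precipitation total through a fitted cumulative distribution (gamma or Pearson III) and then through the inverse standard normal. As a hedged, self-contained stand-in, the sketch below uses an empirical CDF (rank-based plotting positions) rather than the parametric fits the article compares; names and data are illustrative:

```python
import random
from statistics import NormalDist, mean, stdev

random.seed(42)

def spi_empirical(precip):
    """SPI-style standardization via the empirical CDF: each monthly
    total gets the plotting position rank/(n+1) (Weibull formula),
    which is mapped to a standard normal quantile."""
    n = len(precip)
    order = sorted(range(n), key=lambda i: precip[i])
    nd = NormalDist()
    z = [0.0] * n
    for rank, i in enumerate(order, start=1):
        z[i] = nd.inv_cdf(rank / (n + 1))
    return z

# Synthetic skewed "monthly rainfall" series (gamma-distributed by design)
monthly_rain = [random.gammavariate(2.0, 40.0) for _ in range(600)]
z = spi_empirical(monthly_rain)
print(round(mean(z), 2), round(stdev(z), 2))  # near 0 and 1 by construction
```

The article's point is precisely that the parametric choice (gamma vs. Pearson III) changes how close the resulting SPI series comes to this ideal standard normal behavior.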

  3. The Metaphysics of {D-CTCs}: On the Underlying Assumptions of {Deutsch}'s Quantum Solution to the Paradoxes of Time Travel

    CERN Document Server

    Dunlap, Lucas

    2015-01-01

    I argue that Deutsch's model for the behavior of systems traveling around closed timelike curves (CTCs) relies implicitly on a substantive metaphysical assumption. Deutsch is employing a version of quantum theory with a significantly supplemented ontology of parallel existent worlds, which differ in kind from the many worlds of the Everett interpretation. Standard Everett does not support the existence of multiple identical copies of the world, which the D-CTC model requires. This has been obscured because he often refers to the branching structure of Everett as a "multiverse", and describes quantum interference by reference to parallel interacting definite worlds. But he admits that this is only an approximation to Everett. The D-CTC model, however, relies crucially on the existence of a multiverse of parallel interacting worlds. Since his model is supplemented by structures that go significantly beyond quantum theory, and play an ineliminable role in its predictions and explanations, it does not represent ...

  4. On testing the missing at random assumption

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    Most approaches to learning from incomplete data are based on the assumption that unobserved values are missing at random (mar). While the mar assumption, as such, is not testable, it can become testable in the context of other distributional assumptions, e.g. the naive Bayes assumption. In this paper we investigate a method for testing the mar assumption in the presence of other distributional constraints. We present methods to (approximately) compute a test statistic consisting of the ratio of two profile likelihood functions. This requires the optimization of the likelihood under no assumptions on the missingness mechanism, for which we use our recently proposed AI & M algorithm. We present experimental results on synthetic data that show that our approximate test statistic is a good indicator for whether data is mar relative to the given distributional assumptions.

  5. Water oxidation catalysis with nonheme iron complexes under acidic and basic conditions: homogeneous or heterogeneous?

    Science.gov (United States)

    Hong, Dachao; Mandal, Sukanta; Yamada, Yusuke; Lee, Yong-Min; Nam, Wonwoo; Llobet, Antoni; Fukuzumi, Shunichi

    2013-08-19

    Thermal water oxidation by cerium(IV) ammonium nitrate (CAN) was catalyzed by nonheme iron complexes, such as Fe(BQEN)(OTf)2 (1) and Fe(BQCN)(OTf)2 (2) (BQEN = N,N'-dimethyl-N,N'-bis(8-quinolyl)ethane-1,2-diamine, BQCN = N,N'-dimethyl-N,N'-bis(8-quinolyl)cyclohexanediamine, OTf = CF3SO3(-)) in a nonbuffered aqueous solution; turnover numbers of 80 ± 10 and 20 ± 5 were obtained in the O2 evolution reaction by 1 and 2, respectively. The ligand dissociation of the iron complexes was observed under acidic conditions, and the dissociated ligands were oxidized by CAN to yield CO2. We also observed that 1 was converted to an iron(IV)-oxo complex during the water oxidation in competition with the ligand oxidation. In addition, oxygen exchange between the iron(IV)-oxo complex and H2(18)O was found to occur at a much faster rate than the oxygen evolution. These results indicate that the iron complexes act as the true homogeneous catalyst for water oxidation by CAN at low pHs. In contrast, light-driven water oxidation using [Ru(bpy)3](2+) (bpy = 2,2'-bipyridine) as a photosensitizer and S2O8(2-) as a sacrificial electron acceptor was catalyzed by iron hydroxide nanoparticles derived from the iron complexes under basic conditions as the result of the ligand dissociation. In a buffer solution (initial pH 9.0) formation of the iron hydroxide nanoparticles with a size of around 100 nm at the end of the reaction was monitored by dynamic light scattering (DLS) in situ and characterized by X-ray photoelectron spectra (XPS) and transmission electron microscope (TEM) measurements. We thus conclude that the water oxidation by CAN was catalyzed by short-lived homogeneous iron complexes under acidic conditions, whereas iron hydroxide nanoparticles derived from iron complexes act as a heterogeneous catalyst in the light-driven water oxidation reaction under basic conditions.

  6. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests! Explaining Different Arrival Times. [Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones] Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics. Intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source. Delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect. Special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong. This, too, would cause photon velocities to be energy-dependent. Delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect. If we now turn this problem around, then by measuring the arrival time delay between photons of different energies from various astrophysical sources the further away, the better we can provide constraints on these
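The special-relativistic delay mentioned above follows from v = c·sqrt(1 − (mc²/E)²) for a photon with tiny rest mass m. A hedged back-of-envelope sketch (the distance, photon energies, and mass bound below are illustrative assumptions, not values from the article):

```python
# Extra flight time of a low-energy photon if photons had rest mass m,
# to leading order: v/c = 1 - (m c^2 / E)^2 / 2.
C = 2.998e8                      # speed of light, m/s
GPC = 3.086e25                   # one gigaparsec, m
MC2 = 1e-18                      # assumed photon rest energy bound, eV
D = 1.0 * GPC                    # distance to the transient (assumed)
E_RADIO, E_GAMMA = 4.1e-6, 1e5   # eV: ~1 GHz radio photon vs 100 keV gamma

def extra_flight_time(distance, energy):
    """Delay (s) relative to a truly massless photon, leading order
    in (mc^2/E)^2."""
    return (distance / C) * (MC2 / energy) ** 2 / 2.0

dt = extra_flight_time(D, E_RADIO) - extra_flight_time(D, E_GAMMA)
print(f"radio photon lags by {dt:.1e} s")  # a few nanoseconds
```

Because the effect scales as 1/E², the lowest-energy (radio) photons dominate the budget, which is why such tests lean on broadband observations of very distant transients.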

  7. Basic substances under EC 1107/2009 phytochemical regulation: experience with non-biocide and food products as biorationals

    Directory of Open Access Journals (Sweden)

    Marchand Patrice A.

    2016-07-01

    Full Text Available Basic Substances are a newly effective category of Plant Protection Product under EC Regulation No 1107/2009. The first approved application, of Equisetum arvense L., opened Part C of Implementing Regulation (EU) No 540/2011, which lists the basic substances approved. Although E. arvense was described as a fungicide extract, subsequent applications like chitosan were related to non-biocide molecules. Consequently, plant protection product data were collected from research on alternative or traditional crop protection methods. They are notably issued or derived from foodstuffs (plants, plant by-products, plant derived products, substances and derived substances from animal origin). Applications are currently submitted by our Institute, under evaluation at different stages of the approval process or already approved. Remarkably, this Basic Substance category under pesticide EU Regulation was designed for these non-biocidal plant protection products. In fact, components described as the “active substance” of most of the actual applications are food products like sugars and lecithin. Basic Substance applications for these foodstuffs are therefore a straightforward way of easily gaining approval for them. Here we describe the approval context and detail the agricultural uses of these food products as Biological Control Agents (BCAs) or biorationals for crop protection. From all deposited or approved Basic Substance Applications (BSAs), proof has been provided that non-biocide and food products, via physical barrier or lure effects, may be effective plant protection products with an acceptably low profile of concern for public and agricultural safety.

  8. Mechanistic Insight into the Lability of the Benzyloxycarbonyl (Z) Group inN-Protected Peptides under Mild Basic Conditions

    OpenAIRE

    Tena Solsona, Marta; Angulo Pachón, César Augusto; Escuder, Beatriu; Miravet Celades, Juan Felipe

    2014-01-01

    The unexpected lability of the Z protecting group under mild basic conditions at room temperature is explained by a mechanism based on anchimeric assistance. It is found that the vicinal amide group stabilizes the tetrahedral intermediate formed after the nucleophilic addition of hydroxide to the carbonyl of the Z group. This effect operates in N-protected tripeptides and tetrapeptides, but Z-protected dipeptides are stable under the same conditions due to the blockage of the vicinal amide NH by ...

  9. Degradation of carbofuran and carbofuran-derivatives in presence of humic substances under basic conditions.

    Science.gov (United States)

    Morales, Jorge; Manso, José A; Cid, Antonio; Mejuto, Juan C

    2012-11-01

    The influence of humic aggregates in water solution upon the chemical stability of carbofuran (CF) and the carbofuran derivatives 3-hydroxy-carbofuran (HCF) and 3-keto-carbofuran (KCF) has been investigated in basic media. An inhibition of the basic hydrolysis of 3-hydroxy-carbofuran and 3-keto-carbofuran (≈ 1.7 and ≈ 1.5-fold, respectively) was observed and was rationalized in terms of the micellar pseudophase model. Nevertheless, no significant effect upon carbofuran stability was found in the presence of humic substances. These behaviors have been compared with the corresponding ones in other synthetic colloidal aggregates.

  10. New Cryptosystem Using Multiple Cryptographic Assumptions

    Directory of Open Access Journals (Sweden)

    E. S. Ismail

    2011-01-01

    Full Text Available Problem statement: A cryptosystem is a way for a sender and a receiver to communicate digitally, by which the sender can send the receiver any confidential or private message by first encrypting it using the receiver’s public key. Upon receiving the encrypted message, the receiver can confirm the originality of the message’s contents using his own secret key. Up to now, most of the existing cryptosystems were developed based on a single cryptographic assumption like factoring, discrete logarithms, quadratic residue or elliptic curve discrete logarithm. Although these schemes remain secure today, one day in the near future they may be broken if one finds a polynomial algorithm that can efficiently solve the underlying cryptographic assumption. Approach: By this motivation, we designed a new cryptosystem based on two cryptographic assumptions: quadratic residue and discrete logarithms. We integrated these two assumptions in our encrypting and decrypting equations so that the former depends on one public key whereas the latter depends on one corresponding secret key and two secret numbers. Each of the public and secret keys in our scheme determines the assumptions we use. Results: The newly developed cryptosystem is shown to be secure against the three commonly considered algebraic attacks using a heuristic security technique. The efficiency performance of our scheme requires 2Texp + 2Tmul + Thash time complexity for encryption and Texp + 2Tmul + Tsrt time complexity for decryption, and this magnitude of complexity is considered minimal for cryptosystems based on multiple cryptographic assumptions. Conclusion: The new cryptosystem based on multiple cryptographic assumptions offers a greater security level than schemes based on a single cryptographic assumption. The adversary has to solve the two assumptions simultaneously to recover the original message from the received corresponding encrypted message, but this is very unlikely to happen.
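To make the "two assumptions at once" idea concrete, here is a toy layered construction in the same spirit: an ElGamal layer (discrete logarithm) wrapped in a Rabin layer (quadratic residues), so recovering the message needs both trapdoors. The parameters and the hash tag used to disambiguate Rabin's four roots are illustrative; this sketch is not the paper's scheme and is in no way secure:

```python
import hashlib

# ElGamal over Z_p (discrete-log assumption) -- toy parameters
p, g, x = 467, 2, 127
y = pow(g, x, p)                       # public key

# Rabin modulus (quadratic-residue assumption), p1, q1 = 3 (mod 4)
p1, q1 = 40039, 40031
n = p1 * q1

def encrypt(m, k=777):
    c1 = pow(g, k, p)
    c2 = (m * pow(y, k, p)) % p        # ElGamal layer, c2 < p < n
    tag = hashlib.sha256(str(m).encode()).hexdigest()
    return c1, pow(c2, 2, n), tag      # Rabin layer squares c2 mod n

def decrypt(c1, c3, tag):
    # Rabin trapdoor: the four square roots of c3 modulo n, via CRT
    rp = pow(c3 % p1, (p1 + 1) // 4, p1)
    rq = pow(c3 % q1, (q1 + 1) // 4, q1)
    inv = pow(p1, -1, q1)
    s_inv = pow(pow(c1, x, p), -1, p)  # ElGamal trapdoor
    for sp in (rp, p1 - rp):
        for sq in (rq, q1 - rq):
            r = (sp + p1 * ((sq - sp) * inv % q1)) % n
            m = (r % p) * s_inv % p    # undo the ElGamal mask
            if hashlib.sha256(str(m).encode()).hexdigest() == tag:
                return m
    raise ValueError("no root matched the tag")

print(decrypt(*encrypt(42)))  # -> 42
```

An attacker who can take square roots mod n (breaking quadratic residues) still faces the ElGamal mask, and one who knows x still faces the four-root ambiguity without the factorization of n.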

  11. Faulty assumptions for repository requirements

    Energy Technology Data Exchange (ETDEWEB)

    Sutcliffe, W G

    1999-06-03

    Long term performance requirements for a geologic repository for spent nuclear fuel and high-level waste are based on assumptions concerning water use and subsequent deaths from cancer due to ingesting water contaminated with radioisotopes ten thousand years in the future. This paper argues that the assumptions underlying these requirements are faulty for a number of reasons. First, in light of the inevitable technological progress, including efficient desalination of water, over the next ten thousand years, it is inconceivable that a future society would drill for water near a repository. Second, even today we would not use water without testing its purity. Third, today many types of cancer are curable, and with the rapid progress in medical technology in general, and the prevention and treatment of cancer in particular, it is improbable that cancer caused by ingesting contaminated water will be a significant killer in the far future. This paper reviews the performance requirements for geological repositories and comments on the difficulties in proving compliance in the face of inherent uncertainties. The already tiny long-term risk posed by a geologic repository is presented and contrasted with contemporary everyday risks. A number of examples of technological progress, including cancer treatments, are advanced. The real and significant costs resulting from the overly conservative requirements are then assessed. Examples are given of how money (and political capital) could be put to much better use to save lives today and in the future. It is concluded that although a repository represents essentially no long-term risk, monitored retrievable dry storage (above or below ground) is the current best alternative for spent fuel and high-level nuclear waste.

  12. Basic Psychological Skills Usage and Competitive Anxiety Responses: Perceived Underlying Mechanisms

    Science.gov (United States)

    Wadey, Ross; Hanton, Sheldon

    2008-01-01

    This study examined the relationship between basic psychological skills usage (i.e., goal-setting, imagery, self-talk, and relaxation) and the intensity and directional dimensions of competitive anxiety. Semistructured interviews were used on a sample of 15 elite athletes (M age = 24.3 years, SD = 4.2) from a variety of team and individual sports.…

  13. Measurement of basic thermal-hydraulic characteristics under the test facility and reactor conditions

    Energy Technology Data Exchange (ETDEWEB)

    Eduard A Boltenko; Victor P Sharov [Elektrogorsk Research and Engineering Center, EREC, Bezimyannaja Street, 6, Elektrogorsk, Moscow Region, 142530 (Russian Federation); Dmitriy E Boltenko [State Scientific Center of Russian Federation IPPE, Bondarenko Square, Obhinsk, Kaluga Region, 249020 (Russian Federation)

    2005-07-01

    Full text of publication follows: The nuclear power of Russia is based on reactors of two types: water-water (WWER) and uranium-graphite channel (RBMK). Nuclear power development is possible only if the basic condition is met: the safety level of nuclear power plants (NPPs) must satisfy rigid requirements. This level is demonstrated by calculated proof of NPP safety by means of best-estimate thermal-hydraulic codes verified against experimental data. The data for code verification can be obtained at integral facilities simulating a circulation circuit of an NPP with its basic units, intended for investigation of circuit behaviour in transient and accident conditions. For verification of mathematical models in transient and accident conditions, and for development of physically reasonable methods for definition of the various characteristics of two-phase flow, experimental data on both the integrated characteristics of a flow and the local characteristics and structure of a flow are necessary. For safety assurance of an NPP it is also necessary to monitor and determine the basic thermal-hydraulic characteristics of the reactor facility (RF), such as coolant flow rate, core inlet and outlet water temperature, and heat power. The paper describes EREC's work on completing and adapting certain methods for measurements in dynamic modes under test facility conditions and on developing methods for measuring the basic thermal-hydraulic characteristics of reactor facilities. (authors)

  14. Separation of basic proteins from Leishmania using a combination of Free flow electrophoresis (FFE) and 2D electrophoresis (2-DE) under basic conditions.

    Science.gov (United States)

    Brotherton, Marie-Christine; Racine, Gina; Ouellette, Marc

    2015-01-01

    Basic proteins, an important class of proteins in intracellular organisms such as Leishmania, are usually underrepresented on 2D gels. This chapter describes a method combining basic proteins fractionation using Free flow electrophoresis in isoelectric focusing mode (IEF-FFE) followed by protein separation using two-dimensional gel electrophoresis (2-DE) in basic conditions. The combination of these two techniques represents a great improvement for the visualization of Leishmania proteins with basic pI using 2D gels.

  15. Basic Relations Type of System Dynamics DELAY_N Input-Output Under the Discrete State and Its Application

    Institute of Scientific and Technical Information of China (English)

    Weiyuan Tu

    2004-01-01

    DELAY_N is an important function in system dynamics. Study of the basic relation between its input and output in the discrete state significantly influences and guides the determination of system dynamics model parameters, the debugging of models, and the analysis of results. The article obtains the DELAY_N input-output relational expressions, as well as the expressions for the input's average delay time and the delay variance, and on this foundation proposes a method for determining the delay function type.
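DELAY_N is conventionally an N-stage cascade of first-order exponential delays, whose output reproduces the inflow with average delay D and variance D²/N. A minimal discrete (Euler-integrated) sketch, with implementation details that are illustrative rather than from the article:

```python
def delay_n(inflow, n_stages, delay, dt):
    """Euler-integrated N-stage exponential delay: each stage drains
    at rate level * n_stages / delay into the next; the last stage's
    outflow is the DELAY_N output."""
    levels = [0.0] * n_stages
    out = []
    for x in inflow:
        rates_in = [x] + [lvl * n_stages / delay for lvl in levels[:-1]]
        rates_out = [lvl * n_stages / delay for lvl in levels]
        levels = [lvl + dt * (ri - ro)
                  for lvl, ri, ro in zip(levels, rates_in, rates_out)]
        out.append(rates_out[-1])
    return out

dt, D = 0.01, 2.0
steps = 3000                                # simulate 30 time units
pulse = [1.0 / dt] + [0.0] * (steps - 1)    # unit impulse at t = 0
out = delay_n(pulse, n_stages=3, delay=D, dt=dt)

mass = sum(o * dt for o in out)
mean_delay = sum(o * dt * (i * dt) for i, o in enumerate(out)) / mass
print(round(mass, 2), round(mean_delay, 2))  # mass ~ 1, mean delay ~ D
```

Feeding a unit impulse and checking that the output mass is conserved and centered at t = D is exactly the kind of input-output relation the article's expressions formalize for the discrete case.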

  16. THE TRANSFORMATIONAL PROCESSES INVOLVING MOTOR SKILLS THAT OCCUR UNDER THE INFLUENCE OF BASIC PRELIMINARY TRAINING IN YOUNG HANDBALL PLAYERS

    Directory of Open Access Journals (Sweden)

    Markovic Sasa

    2011-06-01

    Full Text Available The population from which we extracted a sample of 76 subjects consisted of elementary school students in Kursumlija, all male, aged 12-13, who were divided into a sub-sample consisting of 38 young handball players who took part in the training sessions of a school of handball and another sub-sample consisting of 38 non-athletes, who only took part in their regular physical education classes. The aim of the research was to determine the transformation processes involving motor skills, which occur under the influence of basic preliminary training in young handball players. The subject matter of the study was to examine whether a statistically significant increase in the level of motor skills would occur under the influence of physical exercise as part of basic preliminary training in the final as compared to the initial state. Six motor tests which define the dimensions of explosive and repetitive strength were used. The results of the research indicate that significant transformational processes involving the motor skills of young handball players occurred in the final as compared to the initial measuring, under the influence of basic preliminary training.

  17. Effects of turbulence on mixed-phase deep convective clouds under different basic-state winds and aerosol concentrations

    Science.gov (United States)

    Lee, Hyunho; Baik, Jong-Jin; Han, Ji-Young

    2014-12-01

    The effects of turbulence-induced collision enhancement (TICE) on mixed-phase deep convective clouds are numerically investigated using a 2-D cloud model with bin microphysics for uniform and sheared basic-state wind profiles and different aerosol concentrations. Graupel particles account for most of the cloud mass in all simulation cases. In the uniform basic-state wind cases, graupel particles of moderate sizes account for some of the total graupel mass in the cases with TICE, whereas large graupel particles account for almost all of the total graupel mass in the cases without TICE. This is because the growth of ice crystals into small graupel particles is enhanced by TICE. The changes in the size distributions of graupel particles due to TICE result in a decrease in the mass-averaged mean terminal velocity of graupel particles. Therefore, the downward flux of graupel mass, and thus the melting of graupel particles, is reduced by TICE, leading to a decrease in the amount of surface precipitation. Moreover, at low aerosol concentrations, TICE increases the sublimation of ice particles, consequently playing a partial role in reducing the amount of surface precipitation. The effects of TICE are less pronounced in the sheared basic-state wind cases than in the uniform basic-state wind cases because the number of ice crystals is much smaller in the sheared cases. Thus, the size distributions of graupel particles in the cases with and without TICE show little difference.

  18. Extracurricular Business Planning Competitions: Challenging the Assumptions

    Science.gov (United States)

    Watson, Kayleigh; McGowan, Pauric; Smith, Paul

    2014-01-01

    Business planning competitions [BPCs] are a commonly offered yet under-examined extracurricular activity. Given the extent of sceptical comment about business planning, this paper offers what the authors believe is a much-needed critical discussion of the assumptions that underpin the provision of such competitions. In doing so it is suggested…

  19. Mexican-American Cultural Assumptions and Implications.

    Science.gov (United States)

    Carranza, E. Lou

    The search for presuppositions of a people's thought is not new. Octavio Paz and Samuel Ramos have both attempted to describe the assumptions underlying the Mexican character. Paz described Mexicans as private, defensive, and stoic, characteristics taken to the extreme in the "pachuco." Ramos, on the other hand, described Mexicans as…

  20. Basic Facts about Low-Income Children: Children under 3 Years, 2013. Fact Sheet

    Science.gov (United States)

    Jiang, Yang; Ekono, Mercedes; Skinner, Curtis

    2015-01-01

    Children under 18 years represent 23 percent of the population, but they comprise 33 percent of all people in poverty. Among all children, 44 percent live in low-income families and approximately one in every five (22 percent) live in poor families. Our very youngest children--infants and toddlers under age 3 years--appear to be particularly…

  1. Basic Facts about Low-Income Children: Children under 6 Years, 2013. Fact Sheet

    Science.gov (United States)

    Jiang, Yang; Ekono, Mercedes; Skinner, Curtis

    2015-01-01

    Children under 18 years represent 23 percent of the population, but they comprise 33 percent of all people in poverty. Among all children, 44 percent live in low-income families and approximately one in every five (22 percent) live in poor families. Young children under age 6 years appear to be particularly vulnerable, with 48 percent living in…

  2. The cis-trans Isomerisation of Homologous 2-Hydroxycycloalkanecarboxylic Acids under Basic Conditions

    Institute of Scientific and Technical Information of China (English)

    GYARMATI, Cs. Zsuzsanna; PÁLINKÓ, István; BOKROS, Attila; MARTINEK, A. Tamás; BERNÁTH, Gábor

    2006-01-01

    The cis→trans isomerisation of homologous 2-hydroxycycloalkanecarboxylic acids in strongly basic aqueous solution was studied starting from the cis isomers. It was found that the cyclopentane, cyclohexane and cycloheptane homologues afforded synthetically useful amounts of the trans acids and the procedure resulted in relatively small quantities of the corresponding olefinic acids. In contrast, the isomerisation of the cis-2-hydroxycyclooctanecarboxylic acid produced roughly equal amounts of the cis and trans isomers and the 1-cyclooctenecarboxylic acid at equilibrium. Molecular modelling with the PM3 semiempirical method of the reactants, products and the intermediates applying explicit water molecules as reaction medium gave a fair estimate for the rate sequence of the idealised (dehydration-free) isomerisation reactions in aqueous base solution.

  3. Basic Facts about Low-Income Children: Children under 18 Years, 2013. Fact Sheet

    Science.gov (United States)

    Jiang, Yang; Ekono, Mercedes; Skinner, Curtis

    2015-01-01

    Children under 18 years represent 23 percent of the population, but they comprise 33 percent of all people in poverty. Among all children, 44 percent live in low-income families and approximately one in every five (22 percent) live in poor families. Being a child in a low-income or poor family does not happen by chance. Parental education and…

  4. Refined transition-state models for proline-catalyzed asymmetric Michael reactions under basic and base-free conditions.

    Science.gov (United States)

    Sharma, Akhilesh K; Sunoj, Raghavan B

    2012-12-07

    The stereocontrolling transition state (TS) models for C-C bond formation relying on hydrogen bonding have generally been successful in proline-catalyzed aldol, Mannich, α-amination, and α-aminoxylation reactions. However, the suitability of the hydrogen-bonding model in protic and aprotic conditions as well as under basic and base-free conditions has not been well established for Michael reactions. Through a comprehensive density functional theory investigation, we herein analyze different TS models for the stereocontrolling C-C bond formation, both in the presence and absence of a base in an aprotic solvent (THF). A refined stereocontrolling TS for the Michael reaction between cyclohexanone and nitrostyrene is proposed. The new TS devoid of hydrogen bonding between the nitro group of nitrostyrene and carboxylic acid of proline, under base-free conditions, is found to be more preferred over the conventional hydrogen-bonding model besides being able to reproduce the experimentally observed stereochemical outcome. A DBU-bound TS is identified as more suitable for rationalizing the origin of asymmetric induction under basic reaction conditions. In both cases, the most preferred approach of nitrostyrene is identified as occurring from the face anti to the carboxylic acid of proline-enamine. The predicted enantio- and diastereoselectivities are in very good agreement with the experimental observations.

  5. 39 Questionable Assumptions in Modern Physics

    Science.gov (United States)

    Volk, Greg

    2009-03-01

    The growing body of anomalies in new energy, low energy nuclear reactions, astrophysics, atomic physics, and entanglement, combined with the failure of the Standard Model and string theory to predict many of the most basic fundamental phenomena, all point to a need for major new paradigms. Not Band-Aids, but revolutionary new ways of conceptualizing physics, in the spirit of Thomas Kuhn's The Structure of Scientific Revolutions. This paper identifies a number of long-held, but unproven assumptions currently being challenged by an increasing number of alternative scientists. Two common themes, both with venerable histories, keep recurring in the many alternative theories being proposed: (1) Mach's Principle, and (2) toroidal, vortex particles. Matter-based Mach's Principle differs from both space-based universal frames and observer-based Einsteinian relativity. Toroidal particles, in addition to explaining electron spin and the fundamental constants, satisfy the basic requirement of Gauss's misunderstood B Law, that motion itself circulates. Though a comprehensive theory is beyond the scope of this paper, it will suggest alternatives to the long list of assumptions in context.

  6. Mesoporous Structure Control of Silica in Room-Temperature Synthesis under Basic Conditions

    Directory of Open Access Journals (Sweden)

    Jeong Wook Seo

    2015-01-01

    Various types of mesoporous silica, such as continuous cubic-phase MCM-48, hexagonal-phase MCM-41, and layer-phase spherical silica particles, have been synthesized at room temperature using cetyltrimethylammonium bromide as a surfactant, ethanol as a cosurfactant, tetraethyl orthosilicate as a silica precursor, and ammonia as a condensation agent. Special care must be taken both in the filtering of the resultant solid products and in the drying process. In the drying process, further condensation of the silica after filtering was induced. As the surfactant and cosurfactant concentrations in the reaction mixture increased and the NH3 concentration decreased, under given conditions, continuous cubic MCM-48 and layered silica became the dominant phases. A cooperative synthesis mechanism, in which both the surfactant and silica were involved in the formation of mesoporous structures, provided a good explanation of the experimental results.

  7. The OPERA hypothesis: assumptions and clarifications.

    Science.gov (United States)

    Patel, Aniruddh D

    2012-04-01

    Recent research suggests that musical training enhances the neural encoding of speech. Why would musical training have this effect? The OPERA hypothesis proposes an answer on the basis of the idea that musical training demands greater precision in certain aspects of auditory processing than does ordinary speech perception. This paper presents two assumptions underlying this idea, as well as two clarifications, and suggests directions for future research.

  8. Adrenomedullin promotes differentiation of oligodendrocyte precursor cells into myelin-basic-protein expressing oligodendrocytes under pathological conditions in vitro.

    Science.gov (United States)

    Maki, Takakuni; Takahashi, Yoko; Miyamoto, Nobukazu; Liang, Anna C; Ihara, Masafumi; Lo, Eng H; Arai, Ken

    2015-07-01

    Oligodendrocytes, which are the main cell type in cerebral white matter, are generated from their precursor cells (oligodendrocyte precursor cells: OPCs). However, the differentiation from OPCs to oligodendrocytes is disturbed under stressed conditions. Therefore, drugs that can improve oligodendrocyte regeneration may be effective for white matter-related diseases. Here we show that a vasoactive peptide adrenomedullin (AM) promotes the in vitro differentiation of OPCs under pathological conditions. Primary OPCs were prepared from neonatal rat brains, and differentiated into myelin-basic-protein expressing oligodendrocytes over time. This in vitro OPC differentiation was inhibited by prolonged chemical hypoxic stress induced by non-lethal CoCl(2) treatment. However, AM promoted the OPC differentiation under the hypoxic stress conditions, and the AM receptor antagonist AM(22-52) canceled the AM-induced OPC differentiation. In addition, AM treatment increased the phosphorylation level of Akt in OPC cultures, and correspondingly, the PI3K/Akt inhibitor LY294002 blocked the AM-induced OPC differentiation. Taken together, AM treatment rescued OPC maturation under pathological conditions via an AM-receptor-PI3K/Akt pathway. Oligodendrocytes play critical roles in white matter by forming myelin sheath. Therefore, AM signaling may be a promising therapeutic target to boost oligodendrocyte regeneration in CNS disorders.

  9. NUCLEOTIDE DEGRADATION PRODUCTS, TOTAL VOLATILE BASIC NITROGEN, SENSORY AND MICROBIOLOGICAL QUALITY OF NILE PERCH (LATES NILOTICUS) FILLETS UNDER CHILLED STORAGE

    Directory of Open Access Journals (Sweden)

    Andrew Kiri Amegovu

    2012-10-01

    Degradation products of adenosine nucleotide and total volatile basic nitrogen (TVBN) concentration provide a means of ascertaining the freshness of commercial fish products. A complementary sensory analysis has also been adopted by export markets for assessing the quality of fresh fish. Nucleotide breakdown products and TVBN were determined in fresh fillets from beach seined and gill netted Nile perch, a highly commercialized freshwater fish from Lake Victoria (Uganda), under chilled storage. Microbiological and sensory qualities were also evaluated. Total plate and Pseudomonas spp. counts positively correlated with TVBN. Based on the sensory, microbiological and biochemical attributes of the fillets, the shelf-life of gill netted Nile perch was lower (13 days) than that of the beach seined (17 days). Fillets of beach seined Nile perch have a better keeping quality than those of the gill netted.

  10. Modern Cosmology: Assumptions and Limits

    CERN Document Server

    Hwang, Jai-chan

    2012-01-01

    Physical cosmology tries to understand the Universe at large with its origin and evolution. Observational and experimental situations in cosmology do not allow us to proceed purely based on the empirical means. We examine in which sense our cosmological assumptions in fact have shaped our current cosmological worldview with consequent inevitable limits. Cosmology, as other branches of science and knowledge, is a construct of human imagination reflecting the popular belief system of the era. The question at issue deserves further philosophic discussions. In Whitehead's words, "philosophy, in one of its functions, is the critic of cosmologies". (Whitehead 1925)

  11. Modern Cosmology: Assumptions and Limits

    Science.gov (United States)

    Hwang, Jai-Chan

    2012-06-01

    Physical cosmology tries to understand the Universe at large with its origin and evolution. Observational and experimental situations in cosmology do not allow us to proceed purely based on the empirical means. We examine in which sense our cosmological assumptions in fact have shaped our current cosmological worldview with consequent inevitable limits. Cosmology, as other branches of science and knowledge, is a construct of human imagination reflecting the popular belief system of the era. The question at issue deserves further philosophic discussions. In Whitehead's words, ``philosophy, in one of its functions, is the critic of cosmologies.'' (Whitehead 1925).

  12. Basic Soil Productivity of Spring Maize in Black Soil Under Long-Term Fertilization Based on DSSAT Model

    Institute of Scientific and Technical Information of China (English)

    ZHA Yan; WU Xue-ping; HE Xin-hua; ZHANG Hui-min; GONG Fu-fei; CAI Dian-xiong; ZHU Ping; GAO Hong-jun

    2014-01-01

    Increasing basic farmland soil productivity has significance in reducing fertilizer application and maintaining high crop yields. In this study, we defined basic soil productivity (BSP) as the production capacity of a farmland soil, with its own physical and chemical properties, for a specific crop season under the local environment and field management. Based on 22 years (1990-2011) of long-term experimental data on black soil (Typic hapludoll) in Gongzhuling, Jilin Province, Northeast China, the decision support system for agro-technology transfer (DSSAT)-CERES-Maize model was applied to simulate the yield by BSP of spring maize (Zea mays L.), to examine the effects of long-term fertilization on changes in BSP, and to explore the mechanisms of BSP increase. Five treatments were examined: (1) no-fertilization control (control); (2) chemical nitrogen, phosphorus, and potassium (NPK); (3) NPK plus farmyard manure (NPKM); (4) 1.5 times NPKM (1.5NPKM); and (5) NPK plus straw (NPKS). Results showed that after 22 years of fertilization, the yield by BSP of spring maize significantly increased by 78.0, 101.2, and 69.4% under NPKM, 1.5NPKM and NPKS, respectively, compared to the initial value (in 1992), but the changes were not significant under NPK (26.9% increase) and the control (8.9% decrease). The contribution percentage of BSP showed a significant rising trend, in the order 1.5NPKM > NPKM > NPK ≈ NPKS, indicating that organic manure combined with chemical fertilizers (1.5NPKM and NPKM) could more effectively increase BSP than inorganic fertilizer application alone (NPK) in the black soil. This study showed that soil organic matter (SOM) was the key factor among the various fertility factors affecting BSP in the black soil, and that total N, total P and/or available P also played important roles in increasing BSP. Compared with chemical fertilization alone, balanced chemical plus manure or straw fertilization (NPKM or NPKS) not only increased the concentrations of soil nutrients, but also improved the…

  13. Anionic liposome template synthesis of raspberry-like hollow silica particle under ambient conditions with basic catalyst.

    Science.gov (United States)

    Ishii, Haruyuki; Sato, Kumi; Nagao, Daisuke; Konno, Mikio

    2012-04-01

    Hollow silica particles were obtained via vesicle-templated synthesis in water under ambient conditions in the presence of ammonia. Biomimetic vesicles (liposomes) were used, consisting of a zwitterionic phospholipid, 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC), and a tiny amount of charged amphiphiles, hexadecylamine (HDA) or dicetylphosphate (DCP). Aggregation of silica occurred for DPPC or cationic DPPC/HDA liposomes, whereas well-dispersed hollow silica particles could be obtained for anionic DPPC/DCP liposomes. The hollow particles synthesized with the anionic liposome had single-layered and raspberry-like structures. Electrostatic repulsion between anionic vesicles maintained stable dispersion of the as-synthesized particles during the reaction. Formation of the raspberry-like morphology is explained by silica precipitation selectively induced around the liposomes under basic conditions, owing to the affinity of silica precursors for the liposomes. This is the first report of the synthesis of well-dispersed hollow silica particles with a raspberry-like morphology by vesicle templating.

  14. Programa saúde da família no brasil: um enfoque sobre seus pressupostos básicos, operacionalização e vantagens Family health program in brazil: a focus on its basic assumptions, performance and advantages

    Directory of Open Access Journals (Sweden)

    Milena Lopes Santana

    2001-07-01

    Since its conception, there have been many analyses of the Family Health Program (FHP) in Brazil. Although still few in number, members of family health units, municipal health secretaries, mayors, Health Ministry staff, as well as university faculty and renowned researchers in public health and related fields, have engaged in discussing and reflecting on this strategy. It therefore became appropriate to review the literature on the FHP, organized by topics: a historical retrospective of the period that preceded the FHP; its basic assumptions; operational strategies (the family as the focus of care, the principle of health surveillance, the work of the multidisciplinary team); the different implementation models in Brazil; aspects that facilitated this implementation or not; and the advantages and disadvantages of the FHP within the Brazilian health system.

  15. How Symmetrical Assumptions Advance Strategic Management Research

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Hallberg, Niklas L.

    2014-01-01

    We develop the case for symmetrical assumptions in strategic management theory. Assumptional symmetry obtains when assumptions made about certain actors and their interactions in one of the application domains of a theory are also made about this set of actors and their interactions in other application domains…

  16. Basic-functionalized recyclable ionic liquid catalyst: A solvent-free approach for Michael addition of 1,3-dicarbonyl compounds to nitroalkenes under ultrasound irradiation.

    Science.gov (United States)

    Narayanaperumal, Senthil; da Silva, Rodrigo César; Feu, Karla Santos; de la Torre, Alexander Fernández; Corrêa, Arlene G; Paixão, Márcio Weber

    2013-05-01

    A task-specific ionic liquid (TSIL) has been introduced as a recyclable catalyst in Michael addition. A series of nitroalkenes and various C-based nucleophiles were reacted in the presence of 30 mol% of recyclable basic-functionalized ionic liquid. Good to excellent yields were obtained in 30 min under ultrasound irradiation.

  17. The impact of vascular endothelial growth factor and basic fibroblast growth factor on cardiac fibroblasts grown under altered gravity conditions

    DEFF Research Database (Denmark)

    Ulbrich, Claudia; Leder, Annekatrin; Pietsch, Jessica

    2010-01-01

    Myocardium is very sensitive to gravitational changes. During a spaceflight cardiovascular atrophy paired with rhythm problems and orthostatic intolerance can occur. The aim of this study was to investigate the impact of basic fibroblast growth factor (bFGF) and vascular endothelial growth factor...

  18. Different Random Distributions Research on Logistic-Based Sample Assumption

    Directory of Open Access Journals (Sweden)

    Jing Pan

    2014-01-01

    A logistic-based sample assumption is proposed in this paper, and different random distributions are studied through this system. The paper provides an assumption system for logistic-based samples, including its sample space structure. Moreover, the influence of different random distributions of the inputs has been studied through this logistic-based sample assumption system. Three different random distributions (normal, uniform, and beta) are used for the tests. The experimental simulations illustrate the relationship between inputs and outputs under the different random distributions. Numerical analysis then infers that the distribution of the outputs depends to some extent on that of the inputs, and that this assumption system is not an independent-increment process but is quasi-stationary.

  19. The Regressivity of the Property Tax. The Incidence of the Property Tax Under Alternative Assumptions of Incidence in Four States--Connecticut, Minnesota, Missouri and South Dakota. Report No. F76-4.

    Science.gov (United States)

    Odden, Allan; Vincent, Phillip E.

    This booklet summarizes the recent debate over the regressivity or progressivity of the property tax and presents some evidence on how the property tax burden is distributed among income classes. Section 1 discusses property tax incidence under both the conventional and new economic views. Discussed specifically are the conditions under which…

  20. Challenging Assumptions of International Public Relations: When Government Is the Most Important Public.

    Science.gov (United States)

    Taylor, Maureen; Kent, Michael L.

    1999-01-01

    Explores assumptions underlying Malaysia's and the United States' public-relations practice. Finds many assumptions guiding Western theories and practices are not applicable to other countries. Examines the assumption that the practice of public relations targets a variety of key organizational publics. Advances international public-relations…

  1. The Self in Guidance: Assumptions and Challenges.

    Science.gov (United States)

    Edwards, Richard; Payne, John

    1997-01-01

    Examines the assumptions of "self" made in the professional and managerial discourses of guidance. Suggests that these assumptions obstruct the capacity of guidance workers to explain their own practices. Drawing on contemporary debates over identity, modernity, and postmodernity, argues for a more explicit debate about the self in guidance. (RJM)

  2. Assumptions of Multiple Regression: Correcting Two Misconceptions

    Directory of Open Access Journals (Sweden)

    Matt N. Williams

    2013-09-01

    In 2002, an article entitled "Four assumptions of multiple regression that researchers should always test" by Osborne and Waters was published in PARE. This article has gone on to be viewed more than 275,000 times (as of August 2013), and it is one of the first results displayed in a Google search for "regression assumptions". While Osborne and Waters' efforts in raising awareness of the need to check assumptions when using regression are laudable, we note that the original article contained at least two fairly important misconceptions about the assumptions of multiple regression: firstly, that multiple regression requires the assumption of normally distributed variables; and secondly, that measurement errors necessarily cause underestimation of simple regression coefficients. In this article, we clarify that multiple regression models estimated using ordinary least squares require the assumption of normally distributed errors in order for trustworthy inferences, at least in small samples, but not the assumption of normally distributed response or predictor variables. Secondly, we point out that regression coefficients in simple regression models will be biased (toward zero) estimates of the relationships between variables of interest when measurement error is uncorrelated across those variables, but that when correlated measurement error is present, regression coefficients may be either upwardly or downwardly biased. We conclude with a brief corrected summary of the assumptions of multiple regression when using ordinary least squares.
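The attenuation point can be illustrated numerically: with uncorrelated measurement error on the predictor, the OLS slope shrinks toward zero by the reliability ratio var(x)/(var(x) + var(error)). A minimal sketch (all numbers here are illustrative assumptions, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(0.0, 1.0, n)            # true predictor, variance 1
y = 2.0 * x + rng.normal(0.0, 1.0, n)  # true slope = 2

# Uncorrelated measurement error on x, also with variance 1:
x_obs = x + rng.normal(0.0, 1.0, n)

# Simple OLS slope = cov(x_obs, y) / var(x_obs)
slope = np.cov(x_obs, y)[0, 1] / np.var(x_obs, ddof=1)

# Reliability ratio = 1 / (1 + 1) = 0.5, so the expected
# estimated slope is 2 * 0.5 = 1: biased toward zero.
```

Correlated measurement error (e.g., the same error term added to both x and y) can push the coefficient in either direction, which is the article's second clarification.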

  3. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    DISCOURSES AND THEORETICAL ASSUMPTIONS IN IT PROJECT PORTFOLIO MANAGEMENT: A REVIEW OF THE LITERATURE. In recent years, increasing interest has been put on IT project portfolio management (IT PPM). Considering IT PPM an interdisciplinary practice, we conduct a concept-based literature review of relevant ... : (1) IT PPM as the top management marketplace, (2) IT PPM as the cause of social dilemmas at the lower organizational levels, (3) IT PPM as polity between different organizational interests, (4) IT PPM as power relations that suppress creativity and diversity. Our metaphors can be used by practitioners ... to articulate and discuss underlying and conflicting assumptions in IT PPM, serving as a basis for adjusting organizations' IT PPM practices. Keywords: IT project portfolio management or IT PPM, literature review, scientific discourses, underlying assumptions, unintended consequences, epistemological biases.

  4. Ontological, Epistemological and Methodological Assumptions: Qualitative versus Quantitative

    Science.gov (United States)

    Ahmed, Abdelhamid

    2008-01-01

    The review to follow is a comparative analysis of two studies conducted in the field of TESOL in Education published in "TESOL QUARTERLY." The aspects to be compared are as follows. First, a brief description of each study will be presented. Second, the ontological, epistemological and methodological assumptions underlying each study…

  5. A Comparison of Closed World Assumptions

    Institute of Scientific and Technical Information of China (English)

    沈一栋

    1992-01-01

    In this paper, we introduce the notion of a family of closed world assumptions and compare several well-known closed world approaches in the family with respect to the extent to which an incomplete database is completed.

  6. Examining Computational Assumptions For Godiva IV

    Energy Technology Data Exchange (ETDEWEB)

    Kirkland, Alexander Matthew [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Jaegers, Peter James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-11

    Over the course of summer 2016, the effects of several computational modeling assumptions with respect to the Godiva IV reactor were examined. The majority of these assumptions pertained to modeling errors in the control rods and burst rod. The Monte Carlo neutron transport code MCNP was used to investigate these modeling changes, primarily by comparing them against the original input deck specifications.

  7. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

    Classical Respondent-Driven Sampling (RDS) estimators are based on a Markov process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for the bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.
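The with- versus without-replacement comparison can be mimicked in a toy setting (this is not the authors' estimator or their formal proofs; the network, trait assignment, and RDS-II style inverse-degree weighting below are illustrative assumptions):

```python
import random

random.seed(7)

# Toy network: a ring of N nodes plus random chords; a binary trait
# held by exactly half the nodes, assigned at random.
N = 500
adj = {i: {(i - 1) % N, (i + 1) % N} for i in range(N)}
for _ in range(400):
    a, b = random.sample(range(N), 2)
    adj[a].add(b)
    adj[b].add(a)
shuffled = random.sample(range(N), N)
trait = {v: 1 if k < N // 2 else 0 for k, v in enumerate(shuffled)}

def rds_estimate(n, with_replacement):
    """Random-walk sample of size n; inverse-degree (RDS-II style)
    weighting corrects for the walk oversampling high-degree nodes."""
    node = random.randrange(N)
    visited, sample = {node}, [node]
    while len(sample) < n:
        nbrs = list(adj[node])
        if not with_replacement:
            fresh = [v for v in nbrs if v not in visited]
            nbrs = fresh or nbrs        # fall back if all neighbours are used
        node = random.choice(nbrs)
        visited.add(node)
        sample.append(node)
    weights = [1 / len(adj[v]) for v in sample]
    return sum(w * trait[v] for w, v in zip(weights, sample)) / sum(weights)

def mean_over(reps, n, with_replacement):
    return sum(rds_estimate(n, with_replacement) for _ in range(reps)) / reps
```

At a 10% sampling fraction (n = 50 of N = 500), the two designs give essentially the same average prevalence estimate, consistent with the abstract's finding that the with-replacement assumption contributes little bias at sampling fractions under 40%.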

  8. Asthma Basics

    Science.gov (United States)

    Asthma is a common lung condition in kids ...

  9. Volatilization of elemental mercury from fresh blast furnace sludge mixed with basic oxygen furnace sludge under different temperatures.

    Science.gov (United States)

    Földi, Corinna; Dohrmann, Reiner; Mansfeldt, Tim

    2015-11-01

    Blast furnace sludge (BFS) is a waste with elevated mercury (Hg) content due to enrichment during the production process of pig iron. To investigate the volatilization potential of Hg, fresh samples of BFS mixed with basic oxygen furnace sludge (BOFS; a residue of gas purification from steel making, processed simultaneously in the cleaning devices of BFS and hence mixed with BFS) were studied in sealed column experiments at different temperatures (15, 25, and 35 °C) for four weeks (total Hg: 0.178 mg kg⁻¹). The systems were regularly flushed with ambient air (every 24 h for the first 100 h, followed by every 72 h) for 20 min at a flow rate of 0.25 ± 0.03 L min⁻¹, and elemental Hg vapor was trapped on gold-coated sand. Volatilization was 0.276 ± 0.065 ng (median: 0.284 ng) at 15 °C, 5.55 ± 2.83 ng (median: 5.09 ng) at 25 °C, and 2.37 ± 0.514 ng (median: 2.34 ng) at 35 °C. Surprisingly, Hg fluxes were lower at 35 °C than at 25 °C. For all temperature variants, an elevated Hg flux was observed within the first 100 h, followed by a decrease in volatilization thereafter. However, the background level of ambient air was not reached by the end of the experiments, indicating that BFS mixed with BOFS still possessed Hg volatilization potential.

  10. Shattering world assumptions: A prospective view of the impact of adverse events on world assumptions.

    Science.gov (United States)

    Schuler, Eric R; Boals, Adriel

    2016-05-01

    Shattered Assumptions theory (Janoff-Bulman, 1992) posits that experiencing a traumatic event has the potential to diminish the degree of optimism in the assumptions of the world (assumptive world), which could lead to the development of posttraumatic stress disorder. Prior research assessed the assumptive world with a measure that was recently reported to have poor psychometric properties (Kaler et al., 2008). The current study had 3 aims: (a) to assess the psychometric properties of a recently developed measure of the assumptive world, (b) to retrospectively examine how prior adverse events affected the optimism of the assumptive world, and (c) to measure the impact of an intervening adverse event. An 8-week prospective design with a college sample (N = 882 at Time 1 and N = 511 at Time 2) was used to assess the study objectives. We split adverse events into those that were objectively or subjectively traumatic in nature. The new measure exhibited adequate psychometric properties. The report of a prior objective or subjective trauma at Time 1 was related to a less optimistic assumptive world. Furthermore, participants who experienced an intervening objectively traumatic event evidenced a decrease in optimistic views of the world compared with those who did not experience an intervening adverse event. We found support for Shattered Assumptions theory retrospectively and prospectively using a reliable measure of the assumptive world. We discuss future assessments of the measure of the assumptive world and clinical implications to help rebuild the assumptive world with current therapies. (PsycINFO Database Record)

  11. Global scientific research commons under the Nagoya Protocol: Towards a collaborative economy model for the sharing of basic research assets.

    Science.gov (United States)

    Dedeurwaerdere, Tom; Melindi-Ghidi, Paolo; Broggiato, Arianna

    2016-01-01

    This paper aims to get a better understanding of the motivational and transaction cost features of building global scientific research commons, with a view to contributing to the debate on the design of appropriate policy measures under the recently adopted Nagoya Protocol. For this purpose, the paper analyses the results of a world-wide survey of managers and users of microbial culture collections, which focused on the role of social and internalized motivations, organizational networks and external incentives in promoting the public availability of upstream research assets. Overall, the study confirms the hypotheses of the social production model of information and shareable goods, but it also shows the need to complete this model. For the sharing of materials, the underlying collaborative economy in excess capacity plays a key role in addition to the social production, while for data, competitive pressures amongst scientists tend to play a bigger role.

  12. Spoilage of light (PSE-like) and dark turkey meat under aerobic or modified atmosphere package: microbial indicators and their relationship with total volatile basic nitrogen

    OpenAIRE

    Fraqueza, Maria João Ramos; Ferreira, Marilia Catarina; Barreto, António Salvador

    2008-01-01

    Abstract 1. The aim of this work was to evaluate the shelf life of turkey meat from different colour categories (light (PSE-like), intermediate and dark), packaged under aerobic or modified atmosphere (MAP) conditions; also to establish a relationship between microbial quality and total volatile basic nitrogen (TVB-N), evaluating its capacity for shelf life determination. Breasts were selected according to Luminance (L*) and pH24: L ≥ 51 and pH < 5.8 for light colour, 43 < L < 51 f...

  13. Basic design of shield blocks for a spallation neutron source under the high-intensity proton accelerator project

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, Katsuhiko; Maekawa, Fujio; Takada, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-03-01

    Under the JAERI-KEK High-Intensity Proton Accelerator Project (J-PARC), a spallation neutron source driven by a 3 GeV-1 MW proton beam is planned to be constructed as a main part of the Materials and Life Science Facility. Overall dimensions of a biological shield of the neutron source had been determined by evaluation of shielding performance through Monte Carlo calculations. This report describes the results of design studies on an optimum dividing scheme, in terms of cost, treatment and mechanical strength, for the shield blocks of the biological shield. As for mechanical strength, it was studied whether the shield blocks would remain stable, fall over, or slide horizontally in the case of an earthquake of seismic intensity 5.5 (250 Gal), treated as an abnormal load. For the ceiling shield blocks, long blocks supported at both ends, the maximum bending moment and the maximum deflection at their centers were evaluated. (author)
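
The mechanical check described for the ceiling blocks can be sketched with the standard simply-supported-beam formulas for a uniform load. All numerical values below are illustrative assumptions, not figures from the report.

```python
def max_bending_moment(w, L):
    """Midspan bending moment of a simply supported beam: M = w*L^2/8."""
    return w * L ** 2 / 8

def max_deflection(w, L, E, I):
    """Midspan deflection under uniform load: d = 5*w*L^4/(384*E*I)."""
    return 5 * w * L ** 4 / (384 * E * I)

w = 50_000.0   # self-weight per metre of span [N/m] (illustrative)
L = 4.0        # span between supports [m] (illustrative)
E = 30e9       # Young's modulus of heavy concrete [Pa] (illustrative)
I = 0.01       # second moment of area of the block section [m^4] (illustrative)

M = max_bending_moment(w, L)
d = max_deflection(w, L, E, I)
print(f"M_max = {M:.0f} N*m, midspan deflection = {d * 1000:.3f} mm")
```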

  14. Root gravitropism: an experimental tool to investigate basic cellular and molecular processes underlying mechanosensing and signal transmission in plants

    Science.gov (United States)

    Boonsirichai, K.; Guan, C.; Chen, R.; Masson, P. H.

    2002-01-01

    The ability of plant organs to use gravity as a guide for growth, named gravitropism, has been recognized for over two centuries. This growth response to the environment contributes significantly to the upward growth of shoots and the downward growth of roots commonly observed throughout the plant kingdom. Root gravitropism has received a great deal of attention because there is a physical separation between the primary site for gravity sensing, located in the root cap, and the site of differential growth response, located in the elongation zones (EZs). Hence, this system allows identification and characterization of different phases of gravitropism, including gravity perception, signal transduction, signal transmission, and curvature response. Recent studies support some aspects of an old model for gravity sensing, which postulates that root-cap columellar amyloplasts constitute the susceptors for gravity perception. Such studies have also allowed the identification of several molecules that appear to function as second messengers in gravity signal transduction and of potential signal transducers. Auxin has been implicated as a probable component of the signal that carries the gravitropic information between the gravity-sensing cap and the gravity-responding EZs. This has allowed the identification and characterization of important molecular processes underlying auxin transport and response in plants. New molecular models can be elaborated to explain how the gravity signal transduction pathway might regulate the polarity of auxin transport in roots. Further studies are required to test these models, as well as to study the molecular mechanisms underlying a poorly characterized phase of gravitropism that is independent of an auxin gradient.

  15. Basic research on network stability testing under big data

    Institute of Scientific and Technical Information of China (English)

    高加琼

    2015-01-01

    In a big-data environment, information must be extracted quickly from a huge volume and wide variety of data, which places higher demands on network stability. This article presents basic research on testing network stability under big data.

  16. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.; Wagner, Sandra A.

    2015-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. With the ability to accurately compare different technologies' performance for the same function, managers will be able to make better decisions regarding technology development.

  17. Regular Caratheodory-type selectors under no convexity assumptions

    NARCIS (Netherlands)

    Chistyakov, VV

    2005-01-01

    We prove the existence of Caratheodory-type selectors (that is, measurable in the first variable and having certain regularity properties like Lipschitz continuity, absolute continuity or bounded variation in the second variable) for multifunctions mapping the product of a measurable space and an in

  18. Testing the habituation assumption underlying models of parasitoid foraging behavior

    NARCIS (Netherlands)

    Abram, Paul K.; Cusumano, Antonino; Abram, Katrina; Colazza, Stefano; Peri, Ezio

    2017-01-01

    Background. Habituation, a form of non-associative learning, has several well-defined characteristics that apply to a wide range of physiological and behavioral responses in many organisms. In classic patch time allocation models, habituation is considered to be a major mechanistic component of para

  19. Basic electrotechnology

    CERN Document Server

    Ashen, R A

    2013-01-01

    BASIC Electrotechnology discusses the applications of Beginner's All-purpose Symbolic Instruction Code (BASIC) in engineering, particularly in solving electrotechnology-related problems. The book is comprised of six chapters that cover several topics relevant to BASIC and electrotechnology. Chapter 1 provides an introduction to BASIC, and Chapter 2 talks about the use of complex numbers in a.c. circuit analysis. Chapter 3 covers linear circuit analysis with d.c. and sinusoidal a.c. supplies. The book also discusses the elementary magnetic circuit theory. The theory and performance of two windi

  20. ERT basics

    Energy Technology Data Exchange (ETDEWEB)

    Butters, M. [MBC Energy and Environment, Ottawa, ON (Canada)]|[National Round Table on the Environment and the Economy, Ottawa, ON (Canada)

    2002-07-01

    ERT is an economic instrument which helps power companies achieve emission reduction compliance cost-effectively. This paper presents the basics of ERT with reference to trading concepts, types of systems and types of emissions. The paper also describes the state of the Canadian energy market regarding greenhouse gases (GHG), nitrogen oxides, sulphur dioxide and volatile organic compounds. The association between ERT and district energy is also explained. By 2010, the global market for GHG trading is expected to be worth $10 billion to $3 trillion U.S. Canada has committed to reducing its GHG emissions to 6 per cent below 1990 levels by 2012, but currently emits 705 Mt per year. This is expected to increase to 770 Mt by 2010. Therefore, in order to meet its commitment, GHG emissions will have to be reduced by 200 Mt per year. Canada is currently considering ratifying the Kyoto agreement, and a trading system is being developed. Several abatement technologies are currently under consideration for district energy systems, including adding scrubbers, improving efficiency, and fuel switching. The marginal cost of abatement was also discussed. tabs., figs.

  1. The homogeneous marginal utility of income assumption

    NARCIS (Netherlands)

    Demuynck, T.

    2015-01-01

    We develop a test to verify if every agent from a population of heterogeneous consumers has the same marginal utility of income function. This homogeneous marginal utility of income assumption is often (implicitly) used in applied demand studies because it has nice aggregation properties and facilit

  2. Causal Mediation Analysis: Warning! Assumptions Ahead

    Science.gov (United States)

    Keele, Luke

    2015-01-01

    In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…

  3. Culturally Biased Assumptions in Counseling Psychology

    Science.gov (United States)

    Pedersen, Paul B.

    2003-01-01

    Eight clusters of culturally biased assumptions are identified for further discussion from Leong and Ponterotto's (2003) article. The presence of cultural bias demonstrates that cultural bias is so robust and pervasive that it permeates the profession of counseling psychology, even including those articles that effectively attack cultural bias…

  4. Managerial and Organizational Assumptions in the CMM's

    DEFF Research Database (Denmark)

    2008-01-01

    in different countries operating different economic and social models. Characterizing CMMI in this way opens the door to another question: are there other sets of organisational and management assumptions which would be better suited to other types of organisations operating in other cultural contexts?...

  5. Brain Basics

    Medline Plus

    Full Text Available ... Basics will introduce you to some of this science, such as: how the brain develops; how genes and the environment affect the brain; the basic structure of the brain; how different parts of the brain communicate and work with each other; how changes in the brain ...

  6. Spoilage of light (PSE-like) and dark turkey meat under aerobic or modified atmosphere package: microbial indicators and their relationship with total volatile basic nitrogen.

    Science.gov (United States)

    Fraqueza, M J; Ferreira, M C; Barreto, A S

    2008-01-01

    1. The aim of this work was to evaluate the shelf life of turkey meat from different colour categories (Pale, Soft and Exudative (PSE)-like, intermediate and dark), packaged under aerobic or modified atmosphere (MAP) conditions; also to establish a relationship between microbial quality and total volatile basic nitrogen (TVB-N), evaluating its capacity for shelf life determination. 2. Breasts were selected according to luminance (L*) and pH24: L ≥ 51 and pH < 5.8 for light colour and pH > 5.8 for dark colour. Sliced meat was packaged under aerobic or MAP conditions with 50% N2 and 50% CO2, then stored in the dark at 0 ± 1 °C for periods of 12 or 25 d. Meat under aerobic conditions was evaluated for microbiological characteristics and TVB-N on d 0, 5 and 12. This evaluation was extended to include d 19 and 25 when samples were under MAP conditions. 3. The dark meat group after 12 d of storage in aerobiosis presented significantly higher plate counts of aerobic mesophilic and psychrotrophic micro-organisms and higher TVB-N than the other meat colour categories. The shelf life of turkey meat under MAP was one week longer for intermediate and light colour meat (20 d) than for dark meat. TVB-N values of 20 to 30 mg NH3/100 g turkey meat correspond to advanced spoilage stages. We proposed 14 mg NH3/100 g as the limit of freshness acceptability for turkey meat. 4. TVB-N was an indicator of turkey meat microbial spoilage but was not a suitable early predictor of microbial spoilage, particularly for turkey meat stored under MAP conditions, because counts of micro-organisms (Pseudomonas spp. and Enterobacteriaceae) were only moderately correlated with this index, as they were inhibited by the MAP gas mixture and storage temperature used in the present study.
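
The freshness thresholds proposed in points 3 and 4 of this abstract can be encoded as a small helper. The function name and the intermediate "marginal" label between the two stated thresholds are our own illustrative choices, not the authors'.

```python
def classify_tvbn(tvbn_mg_per_100g: float) -> str:
    # 14 mg NH3/100 g: the proposed limit of freshness acceptability.
    if tvbn_mg_per_100g <= 14:
        return "fresh"
    # 20-30 mg NH3/100 g corresponds to advanced spoilage stages.
    if tvbn_mg_per_100g >= 20:
        return "advanced spoilage"
    # Between the two stated thresholds; label is our own.
    return "marginal"

print(classify_tvbn(10), classify_tvbn(25))  # fresh advanced spoilage
```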

  7. Basic Concurrency Theory

    DEFF Research Database (Denmark)

    Løvengreen, Hans Henrik

    2002-01-01

    In this set of notes, we present some of the basic theory underlying the discipline of programming with concurrent processes/threads. The notes are intended to supplement a standard textbook on concurrent programming.

  8. Problems and Guidelines of Strategy Implementation in Basic Educational Institutions under the Supervision of KhonKaen Primary Educational Service Area Office 4

    Directory of Open Access Journals (Sweden)

    Sasiwan Tonkanya

    2016-09-01

    Full Text Available The research aimed to (1) study problems of strategy implementation in basic educational institutions under Khonkaen Primary Educational Service Area Office 4; and (2) propose guidelines for strategy implementation in basic educational institutions under Khonkaen Primary Educational Service Area Office 4. The study was carried out in 2 phases. Phase 1 focused on the study and analysis of the strategy implementation problems, and phase 2 studied the best practice schools. The informants for the interview in phase 1 comprised 6 school administrators and teachers who were involved in strategy implementation from small-sized, medium-sized, and large-sized schools. They were selected by use of the purposive sampling technique. The population in the study of the strategy implementation problems in basic educational institutions in phase 1 consisted of 543 school administrators and teachers who were involved in strategy implementation from 181 schools under Khonkaen Primary Educational Service Area Office 4 in academic year 2014. The study samples were 217 school administrators and teachers who were involved in strategy implementation from small-sized, medium-sized, and large-sized schools under Khonkaen Primary Educational Service Area Office 4, selected by use of the stratified sampling technique. The informants of the phase 2 study were 6 school administrators and teachers who were involved in strategy implementation from small-sized, medium-sized, and large-sized best practice schools, obtained by the purposive sampling technique. The research instruments used for data collection consisted of 2 sets of questionnaires. The Set 1 questionnaire was a 5-point Likert scale on the levels of the problems in implementation, with item discrimination at 0.60 – 1.00 and reliability of the whole questionnaire at 0.9359. The questionnaire contained 3 parts with 65 items. The Set 2 questionnaire comprised 2 parts with 10 items regarding

  9. Basic hydraulics

    CERN Document Server

    Smith, P D

    1982-01-01

    BASIC Hydraulics aims to help students both to become proficient in the BASIC programming language by actually using the language in an important field of engineering and to use computing as a means of mastering the subject of hydraulics. The book begins with a summary of the technique of computing in BASIC together with comments and listing of the main commands and statements. Subsequent chapters introduce the fundamental concepts and appropriate governing equations. Topics covered include principles of fluid mechanics; flow in pipes, pipe networks and open channels; hydraulic machinery;

  10. On distributional assumptions and whitened cosine similarities

    DEFF Research Database (Denmark)

    Loog, Marco

    2008-01-01

    Recently, an interpretation of the whitened cosine similarity measure as a Bayes decision rule was proposed (C. Liu, "The Bayes Decision Rule Induced Similarity Measures," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 29, no. 6, pp. 1086-1090, June 2007). This communication makes the observation that some of the distributional assumptions made to derive this measure are very restrictive and, considered simultaneously, even inconsistent.

  11. How to Handle Assumptions in Synthesis

    Directory of Open Access Journals (Sweden)

    Roderick Bloem

    2014-07-01

    Full Text Available The increased interest in reactive synthesis over the last decade has led to many improved solutions but also to many new questions. In this paper, we discuss the question of how to deal with assumptions on environment behavior. We present four goals that we think should be met and review several different possibilities that have been proposed. We argue that each of them falls short in at least one aspect.

  12. Brain Basics

    Medline Plus

    Full Text Available ... depression. The Growing Brain Inside the Brain: Neurons & Neural Circuits Neurons are the basic working unit of ... but sometimes give rise to disabilities or diseases. neural circuit —A network of neurons and their interconnections. ...

  13. Brain Basics

    Medline Plus

    Full Text Available ... Real Life Brain Basics in Real Life—How Depression affects the Brain Meet Sarah Sarah is a ... blues" from time to time. In contrast, major depression is a serious disorder that lasts for weeks. ...

  14. Schizophrenia Basics

    Science.gov (United States)

    ... I know with schizophrenia? For More Information Share Schizophrenia Basics Download PDF Download ePub Order a free hardcopy What is schizophrenia? Schizophrenia is a serious mental disorder that affects ...

  15. Brain Basics

    Medline Plus

    Full Text Available ... News About Us Home > Health & Education > Educational Resources Brain Basics Introduction The Growing Brain The Working Brain ... to mental disorders, such as depression. The Growing Brain Inside the Brain: Neurons & Neural Circuits Neurons are ...

  16. Brain Basics

    Science.gov (United States)

    ... News About Us Home > Health & Education > Educational Resources Brain Basics Introduction The Growing Brain The Working Brain ... to mental disorders, such as depression. The Growing Brain Inside the Brain: Neurons & Neural Circuits Neurons are ...

  17. Fluoridation Basics

    Science.gov (United States)

    ... Page Basic Information About Fluoride Benefits: Strong Teeth History of Fluoride in Water Cost: Saves Money, Saves Teeth Fluoride in the Water Today The mineral fluoride occurs naturally on earth and is released from rocks into the soil, ...

  18. Basic Finance

    Science.gov (United States)

    Vittek, J. F.

    1972-01-01

    A discussion of the basic measures of corporate financial strength, and the sources of the information is reported. Considered are: balance sheet, income statement, funds and cash flow, and financial ratios.

  19. Brain Basics

    Medline Plus

    Full Text Available ... science, such as: How the brain develops How genes and the environment affect the brain ... Genes and environmental cues both help to direct this ...

  20. Brain Basics

    Medline Plus

    Full Text Available ... in the anatomy, physiology, and chemistry of the nervous system. When the brain cannot effectively coordinate the billions ... the basic working unit of the brain and nervous system. These cells are highly specialized for the function ...

  1. The sufficiency assumption of the reasoned approach to action

    Directory of Open Access Journals (Sweden)

    David Trafimow

    2015-12-01

    Full Text Available The reasoned action approach to understanding and predicting behavior includes the sufficiency assumption. Although variables not included in the theory may influence behavior, these variables work through the variables in the theory. Once the reasoned action variables are included in an analysis, the inclusion of other variables will not increase the variance accounted for in behavioral intentions or behavior. Reasoned action researchers are very concerned with testing whether new variables account for variance (or how much variance traditional variables account for), to see whether they are important, in general or with respect to the specific behaviors under investigation. But this approach tacitly assumes that accounting for variance is highly relevant to understanding the production of variance, which is what really is at issue. Based on the variance law, I question this assumption.
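
The variance-accounted-for test that this abstract questions is, in practice, a hierarchical regression: regress intentions on the reasoned-action variables, add the candidate variable, and compare R². Below is a hedged sketch with simulated data; all variable names, effect sizes, and the scenario (a new variable that works entirely through an existing one) are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
attitude = rng.normal(size=n)          # reasoned-action variable 1
norm = rng.normal(size=n)              # reasoned-action variable 2
# Candidate "new" variable that works entirely through attitude:
new_var = 0.8 * attitude + rng.normal(scale=0.5, size=n)
intention = attitude + 0.5 * norm + rng.normal(scale=0.5, size=n)

def r_squared(predictors, y):
    # OLS fit with intercept; R^2 = 1 - residual variance / total variance.
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_base = r_squared([attitude, norm], intention)
r2_full = r_squared([attitude, norm, new_var], intention)
print(f"R2 base={r2_base:.3f}  with new variable={r2_full:.3f}")
```

Under the sufficiency assumption the increment in R² is essentially zero, which is exactly what this simulation produces; the abstract's point is that a near-zero increment still says little about how the variance is produced.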

  2. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    DISCOURSES AND THEORETICAL ASSUMPTIONS IN IT PROJECT PORTFOLIO MANAGEMENT: A REVIEW OF THE LITERATURE In recent years, increasing interest has been devoted to IT project portfolio management (IT PPM). Considering IT PPM an interdisciplinary practice, we conduct a concept-based literature review of relevant literature; few contributions represent the three remaining discourses, which unjustifiably leaves out issues that research could and most probably should investigate. In order to highlight research potentials, limitations, and underlying assumptions of each discourse, we develop four IT PPM metaphors: (1) IT PPM as the top management marketplace, (2) IT PPM as the cause of social dilemmas at the lower organizational levels, (3) IT PPM as polity between different organizational interests, (4) IT PPM as power relations that suppress creativity and diversity. Our metaphors can be used by practitioners...

  3. Comparative Interpretation of Classical and Keynesian Fiscal Policies (Assumptions, Principles and Primary Opinions

    Directory of Open Access Journals (Sweden)

    Engin Oner

    2015-06-01

    Full Text Available In the Classical School, founded by Adam Smith, which gives prominence to supply and adopts an approach of unbiased finance, the economy is always in a state of full-employment equilibrium. In this system of thought, whose main philosophy is budget balance, which asserts that there is flexibility between prices and wages and regards public debt as an extraordinary instrument, the interference of the state with economic and social life is frowned upon. In line with the views of classical thought, classical fiscal policy is based on three basic assumptions. These are the "Consumer State Assumption", the assumption that "Public Expenditures are Always Ineffectual", and the assumption concerning the "Impartiality of the Taxes and Expenditure Policies Implemented by the State". On the other hand, the Keynesian School, founded by John Maynard Keynes, gives prominence to demand, adopts the approach of functional finance, and asserts that cases of underemployment equilibrium and over-employment equilibrium exist in the economy as well as full-employment equilibrium, that problems cannot be solved through the invisible hand, that prices and wages are sticky, and that the interference of the state is essential; at this point, fiscal policies have to be utilized effectively. Keynesian fiscal policy depends on three primary assumptions. These are the assumption of the "Filter State", the assumption that "public expenditures are sometimes effective and sometimes ineffective or neutral", and the assumption that "the tax, debt and expenditure policies of the state can never be impartial".

  4. Keeping Things Simple: Why the Human Development Index Should Not Diverge from Its Equal Weights Assumption

    Science.gov (United States)

    Stapleton, Lee M.; Garrod, Guy D.

    2007-01-01

    Using a range of statistical criteria rooted in Information Theory we show that there is little justification for relaxing the equal weights assumption underlying the United Nation's Human Development Index (HDI) even if the true HDI diverges significantly from this assumption. Put differently, the additional model complexity that unequal weights…
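
At the time of this 2007 paper, the HDI was the unweighted arithmetic mean of three dimension indices (life expectancy, education, GDP). A minimal sketch contrasting the equal-weights aggregation with an unequal-weights variant; the component values are made up for illustration.

```python
def hdi_equal_weights(life_index, education_index, gdp_index):
    # Equal weights: simple arithmetic mean of the three dimension indices.
    return (life_index + education_index + gdp_index) / 3

def hdi_weighted(life_index, education_index, gdp_index, weights):
    # Unequal-weights variant for comparison (weights are normalized).
    total = sum(weights)
    return (weights[0] * life_index + weights[1] * education_index
            + weights[2] * gdp_index) / total

components = (0.85, 0.78, 0.70)  # hypothetical dimension indices
print(round(hdi_equal_weights(*components), 3))           # 0.777
print(round(hdi_weighted(*components, (2, 1, 1)), 3))     # 0.795
```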

  5. Teaching Lessons in Exclusion: Researchers' Assumptions and the Ideology of Normality

    Science.gov (United States)

    Benincasa, Luciana

    2012-01-01

    Filling in a research questionnaire means coming into contact with the researchers' assumptions. In this sense filling in a questionnaire may be described as a learning situation. In this paper I carry out discourse analysis of selected questionnaire items from a number of studies, in order to highlight underlying values and assumptions, and their…

  6. Limiting assumptions in molecular modeling: electrostatics.

    Science.gov (United States)

    Marshall, Garland R

    2013-02-01

    Molecular mechanics attempts to represent intermolecular interactions in terms of classical physics. Initial efforts assumed a point charge located at the atom center and coulombic interactions. It has been recognized over multiple decades that simply representing electrostatics with a charge on each atom fails to reproduce the electrostatic potential surrounding a molecule as estimated by quantum mechanics. Molecular orbitals are not spherically symmetrical, an implicit assumption of monopole electrostatics. This perspective reviews recent evidence that requires the use of multipole electrostatics and polarizability in molecular modeling.

  7. Closed World Assumption for Disjunctive Reasoning

    Institute of Scientific and Technical Information of China (English)

    WANG Kewen; ZHOU Lizhu

    2001-01-01

    In this paper, the relationship between argumentation and closed world reasoning for disjunctive information is studied. In particular, the authors propose a simple and intuitive generalization of the closed world assumption (CWA) for general disjunctive deductive databases (with default negation). This semantics, called DCWA, allows a natural argumentation-based interpretation and can be used to represent reasoning for disjunctive information. We compare DCWA with GCWA and prove that DCWA extends Minker's GCWA to the class of disjunctive databases with default negation. We also compare our semantics with some related approaches. In addition, the computational complexity of DCWA is investigated.
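
The (non-disjunctive) closed world assumption that DCWA generalizes can be illustrated in a few lines: any ground fact not derivable from the database is taken to be false. The predicate and constants below are invented for the example.

```python
# Facts explicitly in the (toy) deductive database.
facts = {("enrolled", "alice", "cs101"), ("enrolled", "bob", "cs102")}

def holds(fact):
    # A fact holds only if it is derivable; here, derivable = stored.
    return fact in facts

def cwa_negation(fact):
    # Closed world assumption: what is not derivable is assumed false.
    return not holds(fact)

print(cwa_negation(("enrolled", "alice", "cs102")))  # True: assumed false
```

The disjunctive case the paper treats is harder because a database may entail "A or B" without entailing either disjunct, which is why generalizations such as GCWA and DCWA are needed.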

  8. Basic electronics

    CERN Document Server

    Holbrook, Harold D

    1971-01-01

    Basic Electronics is an elementary text designed for basic instruction in electricity and electronics. It gives emphasis on electronic emission and the vacuum tube and shows transistor circuits in parallel with electron tube circuits. This book also demonstrates how the transistor merely replaces the tube, with proper change of circuit constants as required. Many problems are presented at the end of each chapter. This book is comprised of 17 chapters and opens with an overview of electron theory, followed by a discussion on resistance, inductance, and capacitance, along with their effects on t

  9. Consenting to Heteronormativity: Assumptions in Biomedical Research

    NARCIS (Netherlands)

    Cottingham, M.D.; Fisher, J.A.

    2015-01-01

    The process of informed consent is fundamental to basic scientific research with human subjects. As one aspect of the scientific enterprise, clinical drug trials rely on informed consent documents to safeguard the ethical treatment of trial participants. This paper explores the role of heteronormati

  10. Ethanol Basics

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-01-30

    Ethanol is a widely-used, domestically-produced renewable fuel made from corn and other plant materials. More than 96% of gasoline sold in the United States contains ethanol. Learn more about this alternative fuel in the Ethanol Basics Fact Sheet, produced by the U.S. Department of Energy's Clean Cities program.

  11. Body Basics

    Science.gov (United States)

    ... more about how the body works, what basic human anatomy is, and what happens when parts of the body don't function properly. Blood Bones, Muscles, and Joints Brain and Nervous System Digestive System Endocrine System Eyes Female Reproductive System Heart and Circulatory System Immune ...

  12. Brain Basics

    Medline Plus

    Full Text Available ... such as depression. The Growing Brain Inside the Brain: Neurons & Neural Circuits Neurons are the basic working unit ... final destination. Chemical signals from other cells guide neurons in forming various brain structures. Neighboring neurons make connections with each other ...

  13. Insulin Basics

    Science.gov (United States)


  14. AN EFFICIENT BIT COMMITMENT SCHEME BASED ON FACTORING ASSUMPTION

    Institute of Scientific and Technical Information of China (English)

    Zhong Ming; Yang Yixian

    2001-01-01

    Recently, many bit commitment schemes have been presented. This paper presents a new practical bit commitment scheme based on Schnorr's one-time knowledge proof scheme, where the use of the cut-and-choose method and many random exam candidates in the protocols is replaced by a single challenge number. Therefore the proposed bit commitment scheme is more efficient and practical than the previous schemes. In addition, the security of the proposed scheme under the factoring assumption is proved, thus clarifying the cryptographic basis of the proposed scheme.
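To illustrate the commit/reveal interface that such schemes provide (hiding before reveal, binding afterwards), here is a minimal hash-based sketch. This is emphatically not the paper's factoring-based construction, just the generic interface with invented names:

```python
import hashlib
import secrets

# NOT the factoring-based scheme of the paper above -- only a generic
# hash-based commit/reveal sketch to illustrate the interface.
def commit(bit):
    r = secrets.token_bytes(32)                        # blinding randomness (hiding)
    c = hashlib.sha256(r + bytes([bit])).hexdigest()   # the commitment value
    return c, r

def verify(c, r, bit):
    """Reveal phase: recompute the hash and compare (binding)."""
    return c == hashlib.sha256(r + bytes([bit])).hexdigest()

c, r = commit(1)
print(verify(c, r, 1), verify(c, r, 0))  # True False
```

The committer publishes `c`, keeps `(r, bit)` secret, and later opens by revealing both; changing the bit after committing would require a SHA-256 collision.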

  15. Inference and Assumption in Historical Seismology

    Science.gov (United States)

    Musson, R. M. W.

    The principal aim in studies of historical earthquakes is usually to be able to derive parameters for past earthquakes from macroseismic or other data and thus extend back in time parametric earthquake catalogues, often with improved seismic hazard studies as the ultimate goal. In cases of relatively recent historical earthquakes, for example, those of the 18th and 19th centuries, it is often the case that there is such an abundance of available macroseismic data that estimating earthquake parameters is relatively straightforward. For earlier historical periods, especially medieval and earlier, and also for areas where settlement or documentation are sparse, the situation is much harder. The seismologist often finds that he has only a few data points (or even one) for an earthquake that nevertheless appears to be regionally significant. In such cases, it is natural that the investigator will attempt to make the most of the available data, expanding it by making working assumptions, and from these deriving conclusions by inference (i.e. the process of proceeding logically from some premise). This can be seen in a number of existing studies; in some cases extremely slight data are so magnified by the use of inference that one must regard the results as tentative in the extreme. Two main types of inference can be distinguished. The first type is inference from documentation. This is where assumptions are made such as: the absence of a report of the earthquake from this monastic chronicle indicates that at this locality the earthquake was not felt. The second type is inference from seismicity. Here one deals with arguments such as: all recent earthquakes felt at town X are events occurring in seismic zone Y, therefore this ancient earthquake which is only reported at town X probably also occurred in this zone.

  16. Wavelet basics

    CERN Document Server

    Chan, Y T

    1995-01-01

    Since the study of wavelets is a relatively new area, with much of the research coming from mathematicians, most of the literature uses terminology, concepts and proofs that may, at times, be difficult and intimidating for the engineer. Wavelet Basics has therefore been written as an introductory book for scientists and engineers. The mathematical presentation has been kept simple, the concepts being presented in elaborate detail in a terminology that engineers will find familiar. Difficult ideas are illustrated with examples which will also aid in the development of an intuitive insight. Chapter 1 reviews the basics of signal transformation and discusses the concepts of duals and frames. Chapter 2 introduces the wavelet transform, contrasts it with the short-time Fourier transform and clarifies the names of the different types of wavelet transforms. Chapter 3 links multiresolution analysis, orthonormal wavelets and the design of digital filters. Chapter 4 gives a tour d'horizon of topics of current interest: wave...

  17. Validity of the Michaelis-Menten equation--steady-state or reactant stationary assumption: that is the question.

    Science.gov (United States)

    Schnell, Santiago

    2014-01-01

    The Michaelis-Menten equation is generally used to estimate the kinetic parameters, V and K_M, when the steady-state assumption is valid. Following a brief overview of the derivation of the Michaelis-Menten equation for the single-enzyme, single-substrate reaction, a critical review of the criteria for validity of the steady-state assumption is presented. The application of the steady-state assumption makes the implicit assumption that there is an initial transient during which the substrate concentration remains approximately constant, equal to the initial substrate concentration, while the enzyme-substrate complex concentration builds up. This implicit assumption is known as the reactant stationary assumption. This review presents evidence showing that the reactant stationary assumption is distinct from and independent of the steady-state assumption. Contrary to the widely believed notion that the Michaelis-Menten equation can always be applied under the steady-state assumption, the reactant stationary assumption is truly the necessary condition for validity of the Michaelis-Menten equation to estimate kinetic parameters. Therefore, the application of the Michaelis-Menten equation only leads to accurate estimation of kinetic parameters when it is used under experimental conditions meeting the reactant stationary assumption. The criterion for validity of the reactant stationary assumption does not require the restrictive condition of choosing a substrate concentration that is much higher than the enzyme concentration in initial rate experiments.
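As a sketch of the validity check discussed above (our own notation and threshold, not the paper's code): the Michaelis-Menten rate law together with a rough form of the reactant stationary criterion, e0/(K_M + s0) ≪ 1.

```python
# Illustrative sketch: Michaelis-Menten rate and a rough reactant-stationary
# check. Symbol names (v_max, k_m, e0, s0) and the tolerance are assumptions.

def michaelis_menten_rate(s, v_max, k_m):
    """Initial reaction rate v = V * s / (K_M + s)."""
    return v_max * s / (k_m + s)

def reactant_stationary_ok(e0, s0, k_m, tol=0.1):
    """Rough check of the reactant stationary condition e0/(K_M + s0) << 1,
    here taken to mean 'below tol' (an arbitrary illustrative cutoff)."""
    return e0 / (k_m + s0) < tol

print(michaelis_menten_rate(s=2.0, v_max=1.0, k_m=1.0))   # 2/3
print(reactant_stationary_ok(e0=0.01, s0=2.0, k_m=1.0))   # True
```

Note that the criterion involves e0 relative to K_M + s0, not s0 alone, which is why a large excess of substrate over enzyme is sufficient but not necessary.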

  18. Ethnic identity, identity coherence, and psychological functioning: testing basic assumptions of the developmental model.

    Science.gov (United States)

    Syed, Moin; Juang, Linda P

    2014-04-01

    The purpose of the present study was to test three fundamental theoretical propositions from Phinney's (1990) developmental model about the relations among ethnic identity, identity coherence, and psychological functioning: (a) ethnic identity is more strongly related to identity coherence for ethnic minorities than for Whites; (b) ethnic identity is more strongly related to psychological functioning for ethnic minorities than for Whites; and (c) identity coherence mediates the association between ethnic identity and psychological functioning for ethnic minorities, but not for Whites. These hypotheses were tested in three independent samples of ethnically diverse youth. In general, we found weak to moderate support for these three hypotheses, suggesting that the theoretically proposed differences in ethnic identity between ethnic minorities and Whites may not be supported by data. Implications for theory and measurement of ethnic identity are discussed.

  19. Basic electronics

    CERN Document Server

    Tayal, DC

    2010-01-01

    The second edition of this book incorporates the comments and suggestions of my friends and students who have critically studied the first edition. In this edition the changes and additions have been made and subject matter has been rearranged at some places. The purpose of this text is to provide a comprehensive and up-to-date study of the principles of operation of solid state devices, their basic circuits and application of these circuits to various electronic systems, so that it can serve as a standard text not only for universities and colleges but also for technical institutes. This book

  20. Regression Basics

    CERN Document Server

    Kahane, Leo H

    2007-01-01

    Using a friendly, nontechnical approach, the Second Edition of Regression Basics introduces readers to the fundamentals of regression. Accessible to anyone with an introductory statistics background, this book builds from a simple two-variable model to a model of greater complexity. Author Leo H. Kahane weaves four engaging examples throughout the text to illustrate not only the techniques of regression but also how this empirical tool can be applied in creative ways to consider a broad array of topics. New to the Second Edition Offers greater coverage of simple panel-data estimation:

  1. Experimental Investigation on the Basic Law of the Fracture Spatial Morphology for Water Pressure Blasting in a Drillhole Under True Triaxial Stress

    Science.gov (United States)

    Huang, Bingxiang; Li, Pengfeng

    2015-07-01

    The present literature on the morphology of water pressure blasting fractures in drillholes is not sufficient and does not take triaxial confining stress into account. Because the spatial morphology of water pressure blasting fractures in drillholes is not clear, the operations lack an exact basis. Using a large true triaxial water pressure blasting experimental system and an acoustic emission 3-D positioning system, water pressure blasting experiments on cement mortar test blocks (300 mm × 300 mm × 300 mm) were conducted to study the associated basic law of the fracture spatial morphology. The experimental results show that water pressure blasting does not always generate bubble pulsation. After water pressure blasting under true triaxial stress, a crushed compressive zone and a blasting fracture zone are formed from the inside, with the blasting section of the naked drillhole as the center, to the outside. The shape of the outer edges of the two zones is ellipsoidal. The range of the blasting fracture is large in the radial direction of the drillhole, where the surrounding pressure is large, i.e., the range of the blasting fracture in the drillhole radial cross-section is approximately ellipsoidal. The rock near the drillhole wall is affected by a tensile stress wave caused by the test block boundary reflection, resulting in more flake fractures appearing in the fracturing crack surface in the drillhole axial direction and parallel to the boundary surface. The flake fracture is thin, presenting a small-range flake fracture. The spatial morphology of the water pressure blasting fracture in the drillhole along the axial direction is similar to a wide-mouth Chinese bottle: the crack extent is large near the drillhole orifice, gradually narrows inward along the drillhole axial direction, and then increases into an approximate ellipsoid in the internal naked blasting section. Based on the causes of the crack generation, the blasting cracks are divided into three

  2. Anisotropic hydrodynamics -- basic concepts

    CERN Document Server

    Florkowski, Wojciech; Ryblewski, Radoslaw; Strickland, Michael

    2013-01-01

    Due to the rapid longitudinal expansion of the quark-gluon plasma created in relativistic heavy ion collisions, potentially large local rest frame momentum-space anisotropies are generated. The magnitude of these momentum-space anisotropies can be so large as to violate the central assumption of canonical viscous hydrodynamical treatments which linearize around an isotropic background. In order to better describe the early-time dynamics of the quark gluon plasma, one can consider instead expanding around a locally anisotropic background which results in a dynamical framework called anisotropic hydrodynamics. In this proceedings contribution we review the basic concepts of the anisotropic hydrodynamics framework presenting viewpoints from both the phenomenological and microscopic points of view.

  3. Post-traumatic stress and world assumptions: the effects of religious coping.

    Science.gov (United States)

    Zukerman, Gil; Korn, Liat

    2014-12-01

    Religiosity has been shown to moderate the negative effects of traumatic event experiences. The current study was designed to examine the relationship between post-traumatic stress (PTS) following traumatic event exposure; world assumptions defined as basic cognitive schemas regarding the world; and self and religious coping conceptualized as drawing on religious beliefs and practices for understanding and dealing with life stressors. This study examined 777 Israeli undergraduate students who completed several questionnaires which sampled individual world assumptions and religious coping in addition to measuring PTS, as manifested by the PTSD check list. Results indicate that positive religious coping was significantly associated with more positive world assumptions, while negative religious coping was significantly associated with more negative world assumptions. Additionally, negative world assumptions were significantly associated with more avoidance symptoms, while reporting higher rates of traumatic event exposure was significantly associated with more hyper-arousal. These findings suggest that religious-related cognitive schemas directly affect world assumptions by creating protective shields that may prevent the negative effects of confronting an extreme negative experience.

  4. Roy's specific life values and the philosophical assumption of humanism.

    Science.gov (United States)

    Hanna, Debra R

    2013-01-01

    Roy's philosophical assumption of humanism, which is shaped by the veritivity assumption, is considered in terms of her specific life values and in contrast to the contemporary view of humanism. Like veritivity, Roy's philosophical assumption of humanism unites a theocentric focus with anthropological values. Roy's perspective enriches the mainly secular, anthropocentric assumption. In this manuscript, the basis for Roy's perspective of humanism will be discussed so that readers will be able to use the Roy adaptation model in an authentic manner.

  5. Time derivatives of the spectrum: Relaxing the stationarity assumption

    Science.gov (United States)

    Prieto, G. A.; Thomson, D. J.; Vernon, F. L.

    2005-12-01

    Spectrum analysis of seismic waveforms has played a significant role towards the understanding of multiple aspects of Earth structure and earthquake source physics. In recent years the multitaper spectrum estimation approach (Thomson, 1982) has been applied to geophysical problems providing not only reliable estimates of the spectrum, but also estimates of spectral uncertainties (Thomson and Chave, 1991). However, these improved spectral estimates were developed under the assumption of local stationarity and provide an incomplete description of the observed process. It is obvious that due to the intrinsic attenuation of the Earth, the amplitudes, and thus the frequency contents, are changing with time as waves pass through a seismic station. There have been considerable improvements in techniques for analyzing non-stationary signals, including wavelet decomposition, the Wigner-Ville spectrum and the dual-frequency spectrum. We apply one of the recently developed techniques, Quadratic Inverse Theory (Thomson, 1990, 1994), combined with the multitaper technique, to look at the time derivatives of the spectrum. If the spectrum is reasonably white in a certain bandwidth, using QI theory, we can estimate the derivatives of the spectrum at each frequency. We test synthetic signals to corroborate the approach and apply it to the records of small earthquakes at local distances. This is a first attempt to combine classical spectrum analysis with a relaxation of the stationarity assumption that is generally made.
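The multitaper idea mentioned above (averaging several orthogonal-taper eigenspectra to stabilize the estimate) can be sketched in a few lines. This is an illustration only: it uses simple sine tapers as a stand-in for the Slepian tapers of Thomson's method, and all signal parameters are invented.

```python
import numpy as np

def sine_tapers(n, k):
    """Riedel-Sidorenko sine tapers, a simple orthonormal multitaper family
    (used here as a stand-in for Slepian/DPSS tapers)."""
    j = np.arange(1, k + 1)[:, None]          # taper index
    t = np.arange(1, n + 1)[None, :]          # time index
    return np.sqrt(2.0 / (n + 1)) * np.sin(np.pi * j * t / (n + 1))

def multitaper_spectrum(x, k=7):
    """Average of k eigenspectra -> reduced-variance PSD estimate."""
    tapers = sine_tapers(len(x), k)
    eigenspectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    return eigenspectra.mean(axis=0)          # unweighted average over tapers

t = np.arange(1024)
x = np.sin(2 * np.pi * 0.1 * t) + 0.5 * np.random.default_rng(0).standard_normal(1024)
psd = multitaper_spectrum(x)
# The spectral peak should fall near bin 102 (0.1 cycles/sample * 1024 samples).
```

Averaging over orthogonal tapers is what gives the multitaper estimate its uncertainty estimates; the QI extension in the abstract goes further and fits spectral derivatives across the tapers.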

  6. DDH-like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike;

    2011-01-01

    generalized to use instead d-DDH, and we show in the generic group model that d-DDH is harder than DDH. This means that virtually any application of DDH can now be realized with the same (amortized) efficiency, but under a potentially weaker assumption. On the negative side, we also show that d-DDH, just like...... DDH, is easy in bilinear groups. This motivates our suggestion of a different type of assumption, the d-vector DDH problems (VDDH), which are based on f(X)= X^d, but with a twist to avoid the problems with reducible polynomials. We show in the generic group model that VDDH is hard in bilinear groups...... and that in fact the problems become harder with increasing d and hence form an infinite hierarchy. We show that hardness of VDDH implies CCA-secure encryption, efficient Naor-Reingold style pseudorandom functions, and auxiliary input secure encryption, a strong form of leakage resilience. This can be seen...

  7. Philosophy of Technology Assumptions in Educational Technology Leadership

    Science.gov (United States)

    Webster, Mark David

    2017-01-01

    A qualitative study using grounded theory methods was conducted to (a) examine what philosophy of technology assumptions are present in the thinking of K-12 technology leaders, (b) investigate how the assumptions may influence technology decision making, and (c) explore whether technological determinist assumptions are present. Subjects involved…

  8. The zero-sum assumption in neutral biodiversity theory

    NARCIS (Netherlands)

    Etienne, Rampal S.; Alonso, David; McKane, Alan J.

    2007-01-01

    The neutral theory of biodiversity as put forward by Hubbell in his 2001 monograph has received much criticism for its unrealistic simplifying assumptions. These are the assumptions of functional equivalence among different species (neutrality), the assumption of point mutation speciation, and the a

  9. Inflation Basics

    Energy Technology Data Exchange (ETDEWEB)

    Green, Dan [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2014-03-01

    inflation since metrical fluctuations, both scalar and tensor, are also produced in inflationary models. Thus, the time appears to be appropriate for a very basic and simple exposition of the inflationary model written from a particle physics perspective. Only the simplest scalar model will be explored because it is easy to understand and contains all the basic elements of the inflationary model.

  11. Finite Element Simulations to Explore Assumptions in Kolsky Bar Experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Crum, Justin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-05

    The chief purpose of this project has been to develop a set of finite element models that attempt to explore some of the assumptions in the experimental set-up and data reduction of the Kolsky bar experiment. In brief, the Kolsky bar, sometimes referred to as the split Hopkinson pressure bar, is an experimental apparatus used to study the mechanical properties of materials at high strain rates. Kolsky bars can be constructed to conduct experiments in tension or compression, both of which are studied in this paper. The basic operation of the tension Kolsky bar is as follows: compressed air is inserted into the barrel that contains the striker; the striker accelerates towards the left and strikes the left end of the barrel, producing a tensile stress wave that propagates first through the barrel and then down the incident bar, into the specimen, and finally the transmission bar. In the compression case, the striker instead travels to the right and impacts the incident bar directly. As the stress wave travels through an interface (e.g., the incident bar to specimen connection), a portion of the pulse is transmitted and the rest reflected. The incident pulse, as well as the transmitted and reflected pulses, are picked up by two strain gauges installed on the incident and transmission bars as shown. By interpreting the data acquired by these strain gauges, the stress/strain behavior of the specimen can be determined.
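The final data-reduction step described above (recovering specimen stress and strain from the reflected and transmitted gauge signals) is classically done with the one-wave analysis. The sketch below is a hedged illustration of that standard analysis, not the project's own code; all symbol names and numbers are assumptions.

```python
import numpy as np

# One-wave Kolsky-bar data reduction (illustrative; made-up parameters):
#   strain rate  = -2 * c0 * eps_r / L_s   (from the reflected pulse)
#   stress       = E_b * (A_b / A_s) * eps_t (from the transmitted pulse)
def kolsky_reduce(eps_r, eps_t, dt, c0, e_bar, a_bar, a_spec, l_spec):
    """eps_r, eps_t: reflected/transmitted strain-gauge signals (arrays);
    dt: sample interval; c0: bar wave speed; e_bar: bar modulus;
    a_bar, a_spec: bar/specimen cross-sections; l_spec: specimen length."""
    strain_rate = -2.0 * c0 * eps_r / l_spec
    strain = np.cumsum(strain_rate) * dt      # integrate strain rate in time
    stress = e_bar * (a_bar / a_spec) * eps_t
    return strain, stress

# Synthetic constant pulses, purely for illustration.
eps_r = np.full(200, -0.001)                  # reflected pulse
eps_t = np.full(200, 0.0005)                  # transmitted pulse
strain, stress = kolsky_reduce(eps_r, eps_t, dt=1e-7, c0=5000.0,
                               e_bar=200e9, a_bar=2.0, a_spec=1.0, l_spec=0.01)
```

The finite element models in the report are precisely a way of testing when this one-dimensional reduction (uniform specimen stress, non-dispersive bars) breaks down.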

  12. Assumption tests regarding the ‘narrow’ rectangles dimensions of the open thin wall sections

    Science.gov (United States)

    Oanta, E.; Panait, C.; Sabau, A.; Barhalescu, M.; Dascalescu, A. E.

    2016-08-01

    Computer-based analytic models that use strength-of-materials theory inherit the accuracy of its basic simplifying hypotheses. These assumptions were conceived hundreds of years ago, in an age without computing instruments, when minimizing the necessary volume of calculation was an important requirement. An initial study attempted to evaluate how 'thin' the walls of an open section may be while still giving accurate results with the analytic method, comparing the calculus for a rectangular section loaded by a twisting moment with that for a narrow section under the same load. Since that study compared analytic methods only for a simple section shape, a more thorough investigation was required. Here we consider a thin-walled open section loaded by a twisting moment, discretized into 'narrow' rectangles; the ratio of the sides of the 'narrow' rectangles is the variable of the study. We compare the results of finite element analysis with those of the analytic method. The conclusions are important for the development of computer-based analytic models that use parametrized sections, for which different sets of calculus relations may be used.

  13. Exploring gravitational statistics not based on quantum dynamical assumptions

    CERN Document Server

    Mandrin, P A

    2016-01-01

    Despite considerable progress in several approaches to quantum gravity, there remain uncertainties on the conceptual level. One issue concerns the different roles played by space and time in the canonical quantum formalism. This issue occurs because the Hamilton-Jacobi dynamics is being quantised. The question then arises whether additional physically relevant states could exist which cannot be represented in the canonical form or as a partition function. For this reason, the author has explored a statistical approach (NDA) which is not based on quantum dynamical assumptions and does not require space-time splitting boundary conditions either. For dimension 3+1 and under thermal equilibrium, NDA simplifies to a path integral model. However, the general case of NDA cannot be written as a partition function. As a test of NDA, one recovers general relativity at low curvature and quantum field theory in the flat space-time approximation. Related paper: arxiv:1505.03719.

  14. Decision-Theoretic Planning: Structural Assumptions and Computational Leverage

    CERN Document Server

    Boutilier, C; Hanks, S; 10.1613/jair.575

    2011-01-01

    Planning under uncertainty is a central problem in the study of automated sequential decision making, and has been addressed by researchers in many different fields, including AI planning, decision analysis, operations research, control theory and economics. While the assumptions and perspectives adopted in these areas often differ in substantial ways, many planning problems of interest to researchers in these fields can be modeled as Markov decision processes (MDPs) and analyzed using the techniques of decision theory. This paper presents an overview and synthesis of MDP-related methods, showing how they provide a unifying framework for modeling many classes of planning problems studied in AI. It also describes structural properties of MDPs that, when exhibited by particular classes of problems, can be exploited in the construction of optimal or approximately optimal policies or plans. Planning problems commonly possess structure in the reward and value functions used to describe performance criteria, in the...

  15. Selecting between-sample RNA-Seq normalization methods from the perspective of their assumptions.

    Science.gov (United States)

    Evans, Ciaran; Hardin, Johanna; Stoebel, Daniel M

    2017-02-27

    RNA-Seq is a widely used method for studying the behavior of genes under different biological conditions. An essential step in an RNA-Seq study is normalization, in which raw data are adjusted to account for factors that prevent direct comparison of expression measures. Errors in normalization can have a significant impact on downstream analysis, such as inflated false positives in differential expression analysis. An underemphasized feature of normalization is the assumptions on which the methods rely and how the validity of these assumptions can have a substantial impact on the performance of the methods. In this article, we explain how assumptions provide the link between raw RNA-Seq read counts and meaningful measures of gene expression. We examine normalization methods from the perspective of their assumptions, as an understanding of methodological assumptions is necessary for choosing methods appropriate for the data at hand. Furthermore, we discuss why normalization methods perform poorly when their assumptions are violated and how this causes problems in subsequent analysis. To analyze a biological experiment, researchers must select a normalization method with assumptions that are met and that produces a meaningful measure of expression for the given experiment.
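As an illustration of how an assumption underlies a normalization method, the sketch below implements the median-of-ratios idea (DESeq-style); it is not a method from the article itself. Its implicit assumption is that most genes are not differentially expressed between samples, so the median per-sample ratio to a reference reflects sequencing depth alone.

```python
import numpy as np

# Median-of-ratios size factors (illustrative sketch, made-up data).
# Assumption: most genes are NOT differentially expressed, so the median
# ratio of each sample to the per-gene geometric mean captures depth only.
def size_factors(counts):
    """counts: genes x samples matrix of raw read counts."""
    log_counts = np.log(counts)
    log_means = log_counts.mean(axis=1)          # per-gene geometric mean (in log)
    finite = np.isfinite(log_means)              # drop genes with any zero count
    ratios = log_counts[finite] - log_means[finite, None]
    return np.exp(np.median(ratios, axis=0))     # one size factor per sample

counts = np.array([[10, 20], [100, 200], [50, 100]])
print(size_factors(counts))  # sample 2 has twice the depth of sample 1
```

If the mostly-not-differentially-expressed assumption is violated (e.g. global shifts in expression), the median ratio absorbs biology into the size factor, which is exactly the failure mode the article discusses.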

  16. On the Basic Equations of the Magnetostatics

    Directory of Open Access Journals (Sweden)

    A. M. Makarov

    2016-01-01

    Full Text Available The paper studies the physical relationship between the main objects of the magnetic field in a continuous medium with magnetization effects. It considers, in turn, the following hypotheses: the hypothesis that the magnetization vector field of the medium is primary and physically real; a similar hypothesis about the real existence of Ampere currents (molecular currents, magnetization currents); and the hypothesis of a magnetic dipole moment of a volume element of the medium in view of the bulk density of electric currents in that volume. A more rigorous derivation of the basic differential equations of magnetostatics from the Biot-Savart-Laplace equation is proposed. The well-known works justifying the basic equations of magnetostatics use a procedure in which, when proving a local differential relation, a volume integral is transformed into an integral over the surface bounding that volume. A specific closed surface is thereby selected: either a surface in vacuum (beyond the medium volume under consideration) or the surface of the conductor (where the normal component of the currents becomes zero). In this paper the control surface is instead drawn arbitrarily within the volume of the medium under consideration, leading to a mathematically sound result. The paper analyzes the hypotheses listed above. The main feature of the analysis is the consistent use of the concept of the two-sidedness of a surface bounding a medium volume of arbitrary finite dimensions. The analysis reveals the physical adequacy of the considered hypotheses, derives the appropriate differential equations for the basic vector fields of magnetostatics, and obtains a new condition. The resulting condition for the closedness of magnetization currents is recorded in full compliance with the well-known Gauss electrostatic law, which avoids the need for additional, but not always justified, assumptions.

  17. Implementation of Excel Report Functionality under the Visual Basic Environment

    Institute of Scientific and Technical Information of China (English)

    李爱民

    2002-01-01

    This paper introduces a method for implementing report functionality with Excel under the Visual Basic 6.0 environment. A report module developed with this method has a friendly interface and is convenient to operate; the approach also shortens development time and improves software quality.

  18. On the Impact of the Dutch Educational Supervision Act : Analyzing Assumptions Concerning the Inspection of Primary Education

    NARCIS (Netherlands)

    Ehren, Melanie C. M.; Leeuw, Frans L.; Scheerens, Jaap

    2001-01-01

    This article uses a policy-scientific approach to reconstruct assumptions underlying the Dutch Educational Supervision Act. We show an example of how to reconstruct and evaluate a program theory that is based on legislation of inspection. The assumptions explain how inspection leads to school improveme

  19. Examining Assumptions and Limitations of Research on the Effects of Emerging Technologies for Teaching and Learning in Higher Education

    Science.gov (United States)

    Kirkwood, Adrian; Price, Linda

    2013-01-01

    This paper examines assumptions and beliefs underpinning research into educational technology. It critically reviews some approaches used to investigate the impact of technologies for teaching and learning. It focuses on comparative studies, performance comparisons and attitudinal studies to illustrate how under-examined assumptions lead to…

  20. [Intensity of apoptotic processes, aconitate hydratase activity and citrate level in patients with type 2 diabetes mellitus complicated steatohepatitis under application of epifamin at basic therapy].

    Science.gov (United States)

    Popov, S S; Pashkov, A N; Agarkov, A A; Shulgin, K K

    2015-01-01

    DNA fragmentation, caspase-1 and caspase-3 activities, aconitate hydratase (AH) activity, and citrate content have been investigated in the blood of patients with type 2 diabetes mellitus complicated by steatohepatitis. These indicators of the intensity of apoptotic processes and of oxidative stress development were estimated after basic treatment and after a combined therapy including epifamin. Before treatment, DNA fragmentation in blood leukocytes, decreased AH activity, and increased caspase activities in the serum of patients were detected. Treatment with epifamin produced more pronounced shifts of the investigated parameters towards control values as compared to basic therapy, and had a positive effect on the citrate content in the serum of patients. Inclusion of epifamin in the basic therapy was also accompanied by more pronounced changes towards normal values of biochemical parameters such as ALT, AST, β-lipoproteins, cholesterol, and fasting and postprandial glucose levels. These changes may be attributed to epifamin-induced correction of the melatonin level and to the adaptogenic and antioxidant effects of the hormone.

  1. Impacts of cloud overlap assumptions on radiative budgets and heating fields in convective regions

    Science.gov (United States)

    Wang, XiaoCong; Liu, YiMin; Bao, Qing

    2016-01-01

    Impacts of cloud overlap assumptions on radiative budgets and heating fields are explored with the aid of a cloud-resolving model (CRM), which provided cloud geometry as well as cloud micro and macro properties. Large-scale forcing data to drive the CRM are from the TRMM Kwajalein Experiment and the Global Atmospheric Research Program's Atlantic Tropical Experiment field campaigns, during which abundant convective systems were observed. The investigated overlap assumptions include those that were traditional and widely used in the past and the one that was recently addressed by Hogan and Illingworth (2000), in which the vertically projected cloud fraction is expressed by a linear combination of maximum and random overlap, with the weighting coefficient depending on the so-called decorrelation length Lcf. Results show that both shortwave and longwave cloud radiative forcings (SWCF/LWCF) are significantly underestimated under maximum (MO) and maximum-random (MRO) overlap assumptions, whereas remarkably overestimated under the random overlap (RO) assumption in comparison with that using CRM inherent cloud geometry. These biases can reach as high as 100 W m^-2 for SWCF and 60 W m^-2 for LWCF. By its very nature, the general overlap (GenO) assumption exhibits an encouraging performance on both SWCF and LWCF simulations, with the biases reduced almost 3-fold compared with traditional overlap assumptions. The superiority of the GenO assumption is also manifested in the simulation of shortwave and longwave radiative heating fields, which are either significantly overestimated or underestimated under traditional overlap assumptions. The study also points out the deficiency of assuming a constant Lcf in the GenO assumption. Further examinations indicate that the CRM diagnostic Lcf varies among different cloud types and tends to be stratified in the vertical. The new parameterization that takes into account variation of Lcf in the vertical well reproduces such a relationship and
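The GenO weighting discussed above can be written down compactly for two layers. The sketch below is our own illustration of the Hogan and Illingworth (2000) blending rule; the exponential form of the weighting coefficient and all numbers are assumptions for illustration.

```python
import math

# General overlap (GenO) sketch for two cloud layers: projected cover is a
# blend of maximum and random overlap, weighted by alpha = exp(-dz / Lcf).
# All parameter values below are invented for illustration.
def projected_cover(c1, c2, dz, lcf):
    """c1, c2: layer cloud fractions; dz: vertical separation;
    lcf: decorrelation length (same units as dz)."""
    alpha = math.exp(-dz / lcf)          # -> 1 for close layers (maximum overlap)
    c_max = max(c1, c2)                  # maximum-overlap limit
    c_rand = c1 + c2 - c1 * c2           # random-overlap limit
    return alpha * c_max + (1.0 - alpha) * c_rand

# Two half-covered layers: nearby layers give cover near 0.5 (maximum),
# widely separated layers give cover near 0.75 (random).
print(projected_cover(0.5, 0.5, dz=100.0, lcf=2000.0))
print(projected_cover(0.5, 0.5, dz=20000.0, lcf=2000.0))
```

This makes the abstract's point concrete: MO and RO are just the alpha = 1 and alpha = 0 limits, and the paper's refinement is to let Lcf vary with cloud type and height rather than holding it constant.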

  2. Special Theory of Relativity without special assumptions and tachyonic motion

    Directory of Open Access Journals (Sweden)

    E. Kapuścik

    2010-01-01

    Full Text Available The most general form of transformations of space-time coordinates in the Special Theory of Relativity based solely on physical assumptions is described. Only the linearity of space-time transformations and the constancy of the speed of light are used as assumptions. The application to tachyonic motion is indicated.

  3. Exposing Trust Assumptions in Distributed Policy Enforcement (Briefing Charts)

    Science.gov (United States)

    2016-06-21

    Coordinated defenses appear to be feasible • Writing policies from scratch is hard – Exposing assumptions requires people to think about what assumptions... critical capabilities as: – Adaptation to dynamic service availability – Complex situational dynamics (e.g., differentiating between bot-net and

  4. Assumptions and Axioms: Mathematical Structures to Describe the Physics of Rigid Bodies

    CERN Document Server

    Butler, Philip H; Renaud, Peter F

    2010-01-01

    This paper challenges some of the common assumptions underlying the mathematics used to describe the physical world. We start by reviewing many of the assumptions underlying the concepts of real, physical, rigid bodies and the translational and rotational properties of such rigid bodies. Nearly all elementary and advanced texts make physical assumptions that are subtly different from ours, and as a result we develop a mathematical description that is subtly different from the standard mathematical structure. Using the homogeneity and isotropy of space, we investigate the translational and rotational features of rigid bodies in two and three dimensions. We find that the concept of rigid bodies and the concept of the homogeneity of space are intrinsically linked. The geometric study of rotations of rigid objects leads to a geometric product relationship for lines and vectors. By requiring this product to be both associative and to satisfy Pythagoras' theorem, we obtain a choice of Clifford algebras. We extend o...

  5. Ways for basic pension subsidy under the new rural social endowment insurance system

    Institute of Scientific and Technical Information of China (English)

    汪东旭

    2012-01-01

    China provides different amounts of basic pension to different regions under the New Social Endowment Insurance System for Rural Residents: residents of central and western China receive the basic pension in full, while those in the east receive 50% of the standard amount determined by the central government. On the whole, this differentiated subsidy policy is reasonable. However, because subsidy levels distinguish only between the east and the central-west, differences within each region are neglected; the division is too coarse and undermines regional fairness in the allocation of financial aid. Based on a quantitative study of the fiscal capacity of each province (and municipality directly under the central government) to fund the new rural pension scheme, suggestions are put forward to improve the policy of regionally differentiated basic pension subsidies.

  6. What Is This Substance? What Makes It Different? Mapping Progression in Students' Assumptions about Chemical Identity

    Science.gov (United States)

    Ngai, Courtney; Sevian, Hannah; Talanquer, Vicente

    2014-01-01

    Given the diversity of materials in our surroundings, one should expect scientifically literate citizens to have a basic understanding of the core ideas and practices used to analyze chemical substances. In this article, we use the term 'chemical identity' to encapsulate the assumptions, knowledge, and practices upon which chemical…

  7. Local Large-Scale Structure and the Assumption of Homogeneity

    Science.gov (United States)

    Keenan, Ryan C.; Barger, Amy J.; Cowie, Lennox L.

    2016-10-01

    Our recent estimates of galaxy counts and the luminosity density in the near-infrared (Keenan et al. 2010, 2012) indicated that the local universe may be under-dense on radial scales of several hundred megaparsecs. Such a large-scale local under-density could introduce significant biases into the measurement and interpretation of cosmological observables, such as the inferred effects of dark energy on the rate of expansion. In Keenan et al. (2013), we measured the K-band luminosity density as a function of distance from us to test for such a local under-density. We made this measurement over the redshift range 0.01 0.07, we measure an increasing luminosity density that by z ~ 0.1 rises to a value ~ 1.5 times higher than that measured locally. This implies that the stellar mass density follows a similar trend. Assuming that the underlying dark matter distribution is traced by this luminous matter, this suggests that the local mass density may be lower than the global mass density of the universe at an amplitude and on a scale sufficient to introduce significant biases into the measurement of basic cosmological observables. At least one study has shown that an under-density of roughly this amplitude and scale could resolve the apparent tension between direct local measurements of the Hubble constant and those inferred by the Planck team. Other theoretical studies have concluded that such an under-density could account for what looks like an accelerating expansion, even when no dark energy is present.

  8. H-INFINITY-OPTIMIZATION WITHOUT ASSUMPTIONS ON FINITE OR INFINITE ZEROS

    NARCIS (Netherlands)

    SCHERER, C

    1992-01-01

    Explicit algebraic conditions are presented for the suboptimality of some parameter in the H(infinity)-optimization problem by output measurement control. Apart from two strict properness conditions, no artificial assumptions restrict the underlying system. In particular, the plant may have zeros on

  9. On the Necessary and Sufficient Assumptions for UC Computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Nielsen, Jesper Buus; Orlandi, Claudio

    2010-01-01

    We study the necessary and sufficient assumptions for universally composable (UC) computation, both in terms of setup and computational assumptions. We look at the common reference string model, the uniform random string model and the key-registration authority model (KRA), and provide new results......-transfer protocol for the stand-alone model. Since a KRA where the secret keys can be computed from the public keys is useless, and some setup assumption is needed for UC secure computation, this establishes the best we could hope for the KRA model: any non-trivial KRA is sufficient for UC computation. •  We show...

  10. A statistical test of the stability assumption inherent in empirical estimates of economic depreciation.

    Science.gov (United States)

    Shriver, K A

    1986-01-01

    Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that rates of decline in economic depreciation may be reasonably stable over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.

  11. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)

    Swarup Mohalik; R Ramanujam

    2002-04-01

    We propose a class of finite state systems of synchronizing distributed processes, where processes make assumptions at local states about the state of other processes in the system. This constrains the global states of the system to those where assumptions made by a process about another are compatible with the commitments offered by the other at that state. We model examples like reliable bit transmission and sequence transmission protocols in this framework and discuss how assumption-commitment structure facilitates compositional design of such protocols. We prove a decomposition theorem which states that every protocol specified globally as a finite state system can be decomposed into such an assumption compatible system. We also present a syntactic characterization of this class using top level parallel composition.

  12. Supporting calculations and assumptions for use in WESF safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hey, B.E.

    1997-03-07

    This document provides a single location for calculations and assumptions used in support of Waste Encapsulation and Storage Facility (WESF) safety analyses. It also provides the technical details and bases necessary to justify the contained results.

  13. US Intervention in Failed States: Bad Assumptions=Poor Outcomes

    Science.gov (United States)

    2002-01-01

    NATIONAL DEFENSE UNIVERSITY NATIONAL WAR COLLEGE STRATEGIC LOGIC ESSAY: US INTERVENTION IN FAILED STATES: BAD ASSUMPTIONS = POOR OUTCOMES (2002) ...country remains in the grip of poverty, natural disasters, and stagnation. Rwanda: Rwanda, another small African country, is populated principally

  14. Homotopy Method for a General Multiobjective Programming Problem under Generalized Quasinormal Cone Condition

    Directory of Open Access Journals (Sweden)

    X. Zhao

    2012-01-01

    Full Text Available A combined interior point homotopy continuation method is proposed for solving a general multiobjective programming problem. We prove the existence and convergence of a smooth homotopy path from almost any initial interior point to a solution of the KKT system under some basic assumptions.

  15. Stem Cell Basics

    Science.gov (United States)


  16. Basics of SCI Rehabilitation

    Science.gov (United States)


  17. What is a god? Metatheistic assumptions in Old Testament Yahwism(s

    Directory of Open Access Journals (Sweden)

    J W Gericke

    2006-09-01

    Full Text Available In this article, the author provides a prolegomena to further research attempting to answer a most fundamental and basic question, much more so than what has thus far been the case in the disciplines of Old Testament theology and the history of Israelite religion. It concerns the implicit assumptions in the Hebrew Bible's discourse about the fundamental nature of deity. In other words, the question is not 'What is YHWH like?' but rather 'What, according to the Old Testament texts, is a god?'

  18. ANALYSIS OF FINANCIAL DERIVATIVES BY MECHANICAL METHOD (Ⅱ)-BASIC EQUATION OF MARKET PRICE OF OPTION

    Institute of Scientific and Technical Information of China (English)

    云天铨

    2001-01-01

    The basic equation of the market price of an option is formulated by making assumptions based on the characteristics of options, following a method similar to that used to formulate basic equations in solid mechanics: h dv0(t)/dt = m1 v0^(-1)(t) - n1 v0(t) + F, where h, m1, n1, F are constants. The main assumptions are: the ups and downs of the market price v0(t) are determined by the supply and demand of the market; the factors that affect v0(t), such as the strike price, tenor, and volatility, are represented through proportional or inverse-proportional relations; and opposite rules are used for purchasing and selling, respectively. The solutions of the basic equation under various conditions are found and compared with the solution vf(t) of the basic equation of the market price of futures. Furthermore, the one-to-one correspondence between vf(t) and v0(t) is proved by the implicit function theorem, which forms the theoretical basis for studying how vf(t) affects the market price of the option v0(t).
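
    Reading the abstract's equation as a first-order ODE, h dv0/dt = m1/v0 - n1 v0 + F, a short forward-Euler sketch shows the price relaxing to the stationary value given by the positive root of n1 v0^2 - F v0 - m1 = 0. The constant values below are illustrative placeholders, not values from the paper.

```python
import math

# Forward-Euler integration of h * dv0/dt = m1 / v0 - n1 * v0 + F.
# h, m1, n1, F are assumed illustrative constants.
h, m1, n1, F = 1.0, 1.0, 1.0, 1.0

def integrate(v0, dt=0.01, steps=5000):
    """March the price forward in time from an initial value v0."""
    v = v0
    for _ in range(steps):
        v += dt * (m1 / v - n1 * v + F) / h
    return v

# Stationary price: setting dv0/dt = 0 gives n1*v**2 - F*v - m1 = 0,
# whose positive root is the long-run market price.
v_star = (F + math.sqrt(F * F + 4.0 * n1 * m1)) / (2.0 * n1)
v_final = integrate(1.0)
```

    The m1/v0 term pushes the price up when it is low and the -n1 v0 term pulls it down when it is high, so the stationary point is stable and the numerical trajectory converges to v_star from any positive start.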

  19. Projecting the future of Canada's population: assumptions, implications, and policy

    Directory of Open Access Journals (Sweden)

    Beaujot, Roderic

    2003-01-01

    Full Text Available After considering the assumptions for fertility, mortality and international migration, this paper looks at implications of the evolving demographics for population growth, labour force, retirement, and population distribution. With the help of policies favouring gender equity and supporting families of various types, fertility in Canada could avoid the particularly low levels seen in some countries, and remain at levels closer to 1.6 births per woman. The prognosis in terms of both risk factors and treatment suggests further reductions in mortality toward a life expectancy of 85. On immigration, there are political interests for levels as high as 270,000 per year, while levels of 150,000 correspond to the long-term post-war average. The future will see slower population growth, due more to migration than to natural increase. International migration of some 225,000 per year can enable Canada to avoid population decline and sustain the size of the labour force, but all scenarios show much change in the relative size of the retired population compared to the labour force population. According to the ratio of persons aged 20-64 to those aged 65 and over, there were seven persons at labour force ages per person at retirement age in 1951, compared to five in 2001 and probably less than 2.5 in 2051. Growth that is due to migration more than to natural increase will accentuate the urbanization trend and the unevenness of the population distribution over space. Past projections have under-projected the mortality improvements and their impact on the relative size of the population at older age groups. Policies regarding fertility, mortality and migration could be aimed at avoiding population decline and reducing the effect of aging, but there is a lack of an institutional basis for policy that would seek to endogenize population.

  20. Reflections on the Revision of The Specifications of Basic Accounting Work under the Condition of Informationalization

    Institute of Scientific and Technical Information of China (English)

    梁朝仪; 刘伽伽; 白玉

    2014-01-01

    In The Specifications of Basic Accounting Work issued by the Ministry of Finance in 1996, some provisions now lag seriously behind current basic accounting practice under the condition of informationalization. This article analyzes the major changes in basic accounting work in the new era from six aspects: the requirements of intensive financial management; the responsibilities of front-line operating departments; the printing problems arising from the sharp increase in accounting documents and account books; the requirements that one-click accounting statements place on the treatment of related-party transactions; the scope and mode of accounting supervision; and the management of electronic accounting records. Policy suggestions are then made for revising The Specifications of Basic Accounting Work under the new circumstances.

  1. Back to the basics: Identifying and addressing underlying challenges in achieving high quality and relevant health statistics for indigenous populations in Canada.

    Science.gov (United States)

    Smylie, Janet; Firestone, Michelle

    Canada is known internationally for excellence in both the quality and public policy relevance of its health and social statistics. There is a double standard however with respect to the relevance and quality of statistics for Indigenous populations in Canada. Indigenous specific health and social statistics gathering is informed by unique ethical, rights-based, policy and practice imperatives regarding the need for Indigenous participation and leadership in Indigenous data processes throughout the spectrum of indicator development, data collection, management, analysis and use. We demonstrate how current Indigenous data quality challenges including misclassification errors and non-response bias systematically contribute to a significant underestimate of inequities in health determinants, health status, and health care access between Indigenous and non-Indigenous people in Canada. The major quality challenge underlying these errors and biases is the lack of Indigenous specific identifiers that are consistent and relevant in major health and social data sources. The recent removal of an Indigenous identity question from the Canadian census has resulted in further deterioration of an already suboptimal system. A revision of core health data sources to include relevant, consistent, and inclusive Indigenous self-identification is urgently required. These changes need to be carried out in partnership with Indigenous peoples and their representative and governing organizations.

  2. Changing Assumptions and Progressive Change in Theories of Strategic Organization

    DEFF Research Database (Denmark)

    Foss, Nicolai J.; Hallberg, Niklas L.

    2016-01-01

    A commonly held view is that strategic organization theories progress as a result of a Popperian process of bold conjectures and systematic refutations. However, our field also witnesses vibrant debates or disputes about the specific assumptions that our theories rely on, and although these debates are often decoupled from the results of empirical testing, changes in assumptions seem closely intertwined with theoretical progress. Using the case of the resource-based view, we suggest that progressive change in theories of strategic organization may come about as a result of scholarly debate and dispute over what constitutes proper assumptions, even in the absence of corroborating or falsifying empirical evidence. We also discuss how changing assumptions may drive future progress in the resource-based view.

  3. What is this Substance? What Makes it Different? Mapping Progression in Students' Assumptions about Chemical Identity

    Science.gov (United States)

    Ngai, Courtney; Sevian, Hannah; Talanquer, Vicente

    2014-09-01

    Given the diversity of materials in our surroundings, one should expect scientifically literate citizens to have a basic understanding of the core ideas and practices used to analyze chemical substances. In this article, we use the term 'chemical identity' to encapsulate the assumptions, knowledge, and practices upon which chemical analysis relies. We conceive chemical identity as a core crosscutting disciplinary concept which can bring coherence and relevance to chemistry curricula at all educational levels, primary through tertiary. Although chemical identity is not a concept explicitly addressed by traditional chemistry curricula, its understanding can be expected to evolve as students are asked to recognize different types of substances and explore their properties. The goal of this contribution is to characterize students' assumptions about factors that determine chemical identity and to map how core assumptions change with training in the discipline. Our work is based on the review and critical analysis of existing research findings on students' alternative conceptions in chemistry education, and historical and philosophical analyses of chemistry. From this perspective, our analysis contributes to the growing body of research in the area of learning progressions. In particular, it reveals areas in which our understanding of students' ideas about chemical identity is quite robust, but also highlights the existence of major knowledge gaps that should be filled in to better foster student understanding. We provide suggestions in this area and discuss implications for the teaching of chemistry.

  4. Basic principles of concrete structures

    CERN Document Server

    Gu, Xianglin; Zhou, Yong

    2016-01-01

    Based on the latest version of designing codes both for buildings and bridges (GB50010-2010 and JTG D62-2004), this book starts from steel and concrete materials, whose properties are very important to the mechanical behavior of concrete structural members. Step by step, analysis of reinforced and prestressed concrete members under basic loading types (tension, compression, flexure, shearing and torsion) and environmental actions are introduced. The characteristic of the book that distinguishes it from other textbooks on concrete structures is that more emphasis has been laid on the basic theories of reinforced concrete and the application of the basic theories in design of new structures and analysis of existing structures. Examples and problems in each chapter are carefully designed to cover every important knowledge point. As a basic course for undergraduates majoring in civil engineering, this course is different from either the previously learnt mechanics courses or the design courses to be learnt. Compa...

  5. Catalyst in Basic Oleochemicals

    Directory of Open Access Journals (Sweden)

    Eva Suyenty

    2007-10-01

    Full Text Available Currently Indonesia is the world's largest palm oil producer, with production volume reaching 16 million tonnes per annum. The high crude oil and ethylene prices in the last 3-4 years contribute to the healthy demand growth for basic oleochemicals: fatty acids and fatty alcohols. Oleochemicals are starting to replace crude oil derived products in various applications. As widely practiced in the petrochemical industry, catalysts play a very important role in the production of basic oleochemicals. Catalytic reactions abound in the production of oleochemicals: nickel-based catalysts are used in the hydrogenation of unsaturated fatty acids; sodium methylate catalyst in the transesterification of triglycerides; sulfonic-based polystyrene resin catalyst in the esterification of fatty acids; and copper chromite/copper zinc catalyst in the high-pressure hydrogenation of methyl esters or fatty acids to produce fatty alcohols. To maintain long catalyst life, it is crucial to ensure the absence of catalyst poisons and inhibitors in the feed. The preparation of nickel and copper chromite catalysts proceeds by precipitation, filtration, drying, and calcination. Sodium methylate is derived from the direct reaction of sodium metal and methanol under inert gas. The sulfonic-based polystyrene resin is derived from sulfonation of polystyrene crosslinked with divinylbenzene. © 2007 BCREC UNDIP. All rights reserved. [Presented at Symposium and Congress of MKICS 2007, 18-19 April 2007, Semarang, Indonesia] [How to Cite: E. Suyenty, H. Sentosa, M. Agustine, S. Anwar, A. Lie, E. Sutanto. (2007). Catalyst in Basic Oleochemicals. Bulletin of Chemical Reaction Engineering and Catalysis, 2 (2-3): 22-31. doi:10.9767/bcrec.2.2-3.6.22-31] [How to Link/DOI: http://dx.doi.org/10.9767/bcrec.2.2-3.6.22-31 || or local: http://ejournal.undip.ac.id/index.php/bcrec/article/view/6

  6. Assumptions behind size-based ecosystem models are realistic

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Blanchard, Julia L.; Fulton, Elizabeth A.;

    2016-01-01

    A recent publication about balanced harvesting (Froese et al., ICES Journal of Marine Science; doi:10.1093/icesjms/fsv122) contains several erroneous statements about size-spectrum models. We refute the statements by showing that the assumptions pertaining to size-spectrum models discussed by Froese et al. are realistic and consistent. We further show that the assumption about density-dependence being described by a stock recruitment relationship is responsible for determining whether a peak in the cohort biomass of a population occurs late or early in life. Finally, we argue
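
    The timing of a cohort-biomass peak can be illustrated with a toy calculation: survivor numbers decline exponentially with a mortality rate M while individual weight follows a cubed von Bertalanffy length curve, so cohort biomass N(t)*w(t) peaks at the age where relative growth equals mortality. All parameter values are assumed for illustration and are not taken from any particular size-spectrum model.

```python
import math

# Toy cohort: N(t) = N0 * exp(-M t) survivors, individual weight from a
# von Bertalanffy length curve cubed. Illustrative parameters only.
M = 0.4        # instantaneous mortality rate (1/yr), assumed
K = 0.3        # von Bertalanffy growth coefficient (1/yr), assumed
w_inf = 2.0    # asymptotic weight (kg), assumed
N0 = 1000.0    # initial cohort size, assumed

def biomass(t):
    """Total cohort biomass at age t: survivors times individual weight."""
    n = N0 * math.exp(-M * t)
    w = w_inf * (1.0 - math.exp(-K * t)) ** 3
    return n * w

# Locate the biomass peak on a fine age grid (0.01 .. 30 yr).
ages = [i * 0.01 for i in range(1, 3001)]
t_peak = max(ages, key=biomass)
```

    Setting dB/dt = 0 gives exp(-K t) = M / (M + 3K), so with these assumed values the peak falls near age 3.9; raising M (or weakening early-life density-dependence) shifts the peak earlier, which is the qualitative point at issue.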

  7. Basic research projects

    Energy Technology Data Exchange (ETDEWEB)

    1979-04-01

    The research programs under the cognizance of the Office of Energy Research (OER) are directed toward discovery of natural laws and new knowledge, and to improved understanding of the physical and biological sciences as related to the development, use, and control of energy. The ultimate goal is to develop a scientific underlay for the overall DOE effort and the fundamental principles of natural phenomena so that these phenomena may be understood, and new principles, formulated. The DOE-OER outlay activities include three major programs: High Energy Physics, Nuclear Physics, and Basic Energy Sciences. Taken together, these programs represent some 30 percent of the Nation's Federal support of basic research in the energy sciences. The research activities of OER involve more than 6,000 scientists and engineers working in some 17 major Federal Research Centers and at more than 135 different universities and industrial firms throughout the United States. Contract holders in the areas of high-energy physics, nuclear physics, materials sciences, nuclear science, chemical sciences, engineering, mathematics geosciences, advanced energy projects, and biological energy research are listed. Funding trends for recent years are outlined. (RWR)

  8. Woman's Moral Development in Search of Philosophical Assumptions.

    Science.gov (United States)

    Sichel, Betty A.

    1985-01-01

    Examined is Carol Gilligan's thesis that men and women use different moral languages to resolve moral dilemmas, i.e., women speak a language of caring and responsibility, and men speak a language of rights and justice. Her thesis is not grounded with adequate philosophical assumptions. (Author/RM)

  9. Lightweight Graphical Models for Selectivity Estimation Without Independence Assumptions

    DEFF Research Database (Denmark)

    Tzoumas, Kostas; Deshpande, Amol; Jensen, Christian S.

    2011-01-01

    ’s optimizers are frequently caused by missed correlations between attributes. We present a selectivity estimation approach that does not make the independence assumptions. By carefully using concepts from the field of graphical models, we are able to factor the joint probability distribution of all...
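
    The cost of the independence assumption is easy to demonstrate: for correlated attributes, multiplying per-attribute selectivities can badly underestimate the selectivity of a conjunctive predicate. The tiny relation below is made up for illustration; its two columns are strongly correlated.

```python
# A toy relation with correlated attributes a and b: 80 of the 100 rows
# have a == b, so the attribute-value-independence assumption fails.
rows = [(0, 0)] * 40 + [(1, 1)] * 40 + [(0, 1)] * 10 + [(1, 0)] * 10

n = len(rows)
sel_a = sum(1 for a, b in rows if a == 1) / n              # P(a = 1)
sel_b = sum(1 for a, b in rows if b == 1) / n              # P(b = 1)
independent = sel_a * sel_b                                # assumes independence
joint = sum(1 for a, b in rows if a == 1 and b == 1) / n   # true selectivity
```

    Here the independence estimate is 0.25 while the true selectivity of `a = 1 AND b = 1` is 0.4; a graphical model that stores the joint (or conditional) distribution over such correlated attribute pairs recovers the correct answer at modest cost.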

  10. 41 CFR 60-3.9 - No assumption of validity.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 1 2010-07-01 2010-07-01 true No assumption of validity. 60-3.9 Section 60-3.9 Public Contracts and Property Management Other Provisions Relating to Public... 3-UNIFORM GUIDELINES ON EMPLOYEE SELECTION PROCEDURES (1978) General Principles § 60-3.9...

  11. Deep Borehole Field Test Requirements and Controlled Assumptions.

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This document presents design requirements and controlled assumptions intended for use in the engineering development and testing of: 1) prototype packages for radioactive waste disposal in deep boreholes; 2) a waste package surface handling system; and 3) a subsurface system for emplacing and retrieving packages in deep boreholes. Engineering development and testing is being performed as part of the Deep Borehole Field Test (DBFT; SNL 2014a). This document presents parallel sets of requirements for a waste disposal system and for the DBFT, showing the close relationship. In addition to design, it will also inform planning for drilling, construction, and scientific characterization activities for the DBFT. The information presented here follows typical preparations for engineering design. It includes functional and operating requirements for handling and emplacement/retrieval equipment, waste package design and emplacement requirements, borehole construction requirements, sealing requirements, and performance criteria. Assumptions are included where they could impact engineering design. Design solutions are avoided in the requirements discussion. ACKNOWLEDGEMENTS: This set of requirements and assumptions has benefited greatly from reviews by Gordon Appel, Geoff Freeze, Kris Kuhlman, Bob MacKinnon, Steve Pye, David Sassani, Dave Sevougian, and Jiann Su.

  12. Assumptions regarding right censoring in the presence of left truncation.

    Science.gov (United States)

    Qian, Jing; Betensky, Rebecca A

    2014-04-01

    Clinical studies using complex sampling often involve both truncation and censoring, where there are options for the assumptions of independence of censoring and event and for the relationship between censoring and truncation. In this paper, we clarify these choices, show certain equivalences, and provide examples.

  13. Monthly values of the standardized precipitation index in the State of São Paulo, Brazil: trends and spectral features under the normality assumption

    Directory of Open Access Journals (Sweden)

    Gabriel Constantino Blain

    2012-01-01

    Full Text Available The aim of this study was to describe monthly series of the Standardized Precipitation Index (SPI) obtained from four weather stations in the State of São Paulo, Brazil (1951-2010). The analyses evaluated the normality assumption of the SPI distributions, the spectral features of these series, and the presence of climatic trends in these datasets. The Pearson type III distribution was better than the two-parameter gamma distribution at providing monthly SPI series that meet the normality assumption inherent in the use of this standardized index. The spectral analyses carried out in the time-frequency domain did not allow us to establish a dominant mode in the analyzed series. In general, the Mann-Kendall and Pettitt tests indicated no significant trend in the SPI series. However, both trend tests indicated that the temporal variability of this index, observed in the month of October over the last 60 years, cannot be seen as the result of a purely random process. This last inference is due to the concentration of decreasing trends, with a common beginning (1983/84), at the four locations of the study.
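
    The SPI transform maps precipitation totals onto a standard normal variable. A faithful implementation first fits a gamma or Pearson type III distribution to the totals; the sketch below substitutes an empirical (Gringorten plotting-position) CDF for the fitted one, which is enough to illustrate the standardization step. The function names and the sample totals are assumed for illustration.

```python
import statistics

def spi_empirical(series):
    """SPI-like standardization via plotting-position probabilities.

    A full SPI computation fits a gamma (or Pearson III) distribution
    first; here the empirical CDF stands in for it. Negative values mean
    drier than usual, positive values wetter.
    """
    n = len(series)
    order = sorted(range(n), key=lambda i: series[i])
    normal = statistics.NormalDist()
    spi = [0.0] * n
    for rank, i in enumerate(order, start=1):
        p = (rank - 0.44) / (n + 0.12)   # Gringorten plotting position
        spi[i] = normal.inv_cdf(p)       # map probability to normal quantile
    return spi

# Illustrative monthly totals (mm); not data from the São Paulo stations.
monthly_precip = [55, 80, 120, 30, 95, 60, 150, 20, 70, 110, 45, 85]
spi_values = spi_empirical(monthly_precip)
```

    By construction the index is monotone in the totals and roughly symmetric about zero, which is precisely the normality property the study tests for in the fitted versions.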

  14. A Study on "Basic Accounting" Practical Teaching Reform under the Background of Classified Recruitment%大类招生背景下“基础会计”课程实践教学改革研究

    Institute of Scientific and Technical Information of China (English)

    董丽; 王嘉发; 朱兆林

    2012-01-01

Under the background of classified recruitment, before students are streamed into majors, the practical teaching of "Basic Accounting" should be foundational in character: it should aim primarily at building students' perceptual knowledge of accounting, with training in bookkeeping skills as a secondary goal. According to this goal, this paper constructs a new practical teaching mode for "Basic Accounting" under classified recruitment: comprehensive in-school practice as the core, instructional videos as a supplement, off-campus visits as an enrichment, and in-school simulated practical teaching as an extension.

  15. Quantum cryptography in real-life applications: Assumptions and security

    Science.gov (United States)

    Zhao, Yi

    Quantum cryptography, or quantum key distribution (QKD), provides a means of unconditionally secure communication. The security is in principle based on the fundamental laws of physics. Security proofs show that if quantum cryptography is appropriately implemented, even the most powerful eavesdropper cannot decrypt the message from a cipher. The implementations of quantum crypto-systems in real life may not fully comply with the assumptions made in the security proofs. Such discrepancy between the experiment and the theory can be fatal to the security of a QKD system. In this thesis we address a number of these discrepancies. A perfect single-photon source is often assumed in many security proofs. However, a weak coherent source is widely used in a real-life QKD implementation. Decoy state protocols have been proposed as a novel approach to dramatically improve the performance of a weak coherent source based QKD implementation without jeopardizing its security. Here, we present the first experimental demonstrations of decoy state protocols. Our experimental scheme was later adopted by most decoy state QKD implementations. In the security proof of decoy state protocols as well as many other QKD protocols, it is widely assumed that a sender generates a phase-randomized coherent state. This assumption has been enforced in few implementations. We close this gap in two steps: First, we implement and verify the phase randomization experimentally; second, we prove the security of a QKD implementation without the coherent state assumption. In many security proofs of QKD, it is assumed that all the detectors on the receiver's side have identical detection efficiencies. We show experimentally that this assumption may be violated in a commercial QKD implementation due to an eavesdropper's malicious manipulation. Moreover, we show that the eavesdropper can learn part of the final key shared by the legitimate users as a consequence of this violation of the assumptions.

  16. Basic Research Firing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Basic Research Firing Facility is an indoor ballistic test facility that has recently transitioned from a customer-based facility to a dedicated basic research...

  17. Body Basics Library

    Science.gov (United States)

Did you ever wonder how each body system, part, and process works? Use this medical library to find out about basic human anatomy.

  19. Basic Cake Decorating Workbook.

    Science.gov (United States)

    Bogdany, Mel

    Included in this student workbook for basic cake decorating are the following: (1) Drawings of steps in a basic way to ice a layer cake, how to make a paper cone, various sizes of flower nails, various sizes and types of tin pastry tubes, and special rose tubes; (2) recipes for basic decorating icings (buttercream, rose paste, and royal icing);…

  20. A "unity assumption" does not promote intersensory integration.

    Science.gov (United States)

    Misceo, Giovanni F; Taylor, Nathanael J

    2011-01-01

An account of intersensory integration is premised on knowing that different sensory inputs arise from the same object. Could the combination of the inputs nevertheless be impaired even when the "unity assumption" holds? Forty observers viewed a square through a minifying (50%) lens while they simultaneously touched the square. Half could see and half could not see their haptic explorations of the square. Both groups, however, had reason to believe that they were touching and viewing the same square. Subsequent matches of the inspected square were mutually biased by touch and vision when the exploratory movements were visible. However, the matches were biased in the direction of the square's haptic size when observers could not see their exploratory movements. This impaired integration without visible haptic explorations suggests that the unity assumption alone is not enough to promote intersensory integration.

  1. Evaluating The Markov Assumption For Web Usage Mining

    DEFF Research Database (Denmark)

    Jespersen, S.; Pedersen, Torben Bach; Thorhauge, J.

    2003-01-01

Web usage mining concerns the discovery of common browsing patterns, i.e., pages requested in sequence, from web logs. To cope with the enormous amounts of data, several aggregated structures based on statistical models of web surfing have appeared, e.g., the Hypertext Probabilistic Grammar (HPG) ... knowledge there has been no systematic study of the validity of the Markov assumption w.r.t. web usage mining and the resulting quality of the mined browsing patterns. In this paper we systematically investigate the quality of browsing patterns mined from structures based on the Markov assumption. Formal measures of quality, based on the closeness of the mined patterns to the true traversal patterns, are defined and an extensive experimental evaluation is performed, based on two substantial real-world data sets. The results indicate that a large number of rules must be considered to achieve high quality...
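As a hedged illustration of the Markov assumption under study, the toy sketch below builds a first-order transition model from click sessions and scores a browsing pattern with it; the session data and function names are invented for demonstration and do not come from the paper.

```python
from collections import defaultdict

def transition_probs(sessions):
    """First-order Markov model of page requests: P(next | current).
    This embodies the assumption being evaluated: the next page is
    taken to depend only on the current one, not on the full path."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for cur, nxt in zip(session, session[1:]):
            counts[cur][nxt] += 1
    return {cur: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
            for cur, nxts in counts.items()}

def path_probability(model, path):
    """Probability of a full browsing path under the first-order model."""
    p = 1.0
    for cur, nxt in zip(path, path[1:]):
        p *= model.get(cur, {}).get(nxt, 0.0)
    return p

sessions = [["home", "news", "sport"], ["home", "news", "weather"],
            ["home", "shop"], ["news", "sport"]]
model = transition_probs(sessions)
# With this toy log, P(news -> sport) = 2/3 and P(home -> news) = 2/3,
# so the mined pattern home -> news -> sport gets probability 4/9.
```

Comparing such model-derived path probabilities against the true traversal frequencies in the log is, in spirit, the quality measurement the abstract describes.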

  2. Testing Modeling Assumptions in the West Africa Ebola Outbreak

    Science.gov (United States)

    Burghardt, Keith; Verzijl, Christopher; Huang, Junming; Ingram, Matthew; Song, Binyang; Hasne, Marie-Pierre

    2016-01-01

    The Ebola virus in West Africa has infected almost 30,000 and killed over 11,000 people. Recent models of Ebola Virus Disease (EVD) have often made assumptions about how the disease spreads, such as uniform transmissibility and homogeneous mixing within a population. In this paper, we test whether these assumptions are necessarily correct, and offer simple solutions that may improve disease model accuracy. First, we use data and models of West African migration to show that EVD does not homogeneously mix, but spreads in a predictable manner. Next, we estimate the initial growth rate of EVD within country administrative divisions and find that it significantly decreases with population density. Finally, we test whether EVD strains have uniform transmissibility through a novel statistical test, and find that certain strains appear more often than expected by chance. PMID:27721505

  3. Analysis of one assumption of the Navier-Stokes equations

    CERN Document Server

    Budarin, V A

    2013-01-01

This article analyses the assumptions regarding the influence of pressure forces during the calculation of the motion of a Newtonian fluid. The purpose of the analysis is to determine the reasonableness of the assumptions and their impact on the results of the analytical calculation. The connections between equations, the causes of discrepancies in exact solutions of the Navier-Stokes equations at low Reynolds numbers, and the emergence of unstable solutions in computer programs are also addressed. It is shown that the well-known equations of motion in terms of mechanical stress need to be supplemented by additional, substantive closing equations. Three methods of solving such a problem are identified, and the requirements for the unknown equations are described. Keywords: Navier-Stokes, approximate equation, closing equations, holonomic system.

  4. The sufficiency assumption of the reasoned approach to action

    OpenAIRE

    David Trafimow

    2015-01-01

    The reasoned action approach to understanding and predicting behavior includes the sufficiency assumption. Although variables not included in the theory may influence behavior, these variables work through the variables in the theory. Once the reasoned action variables are included in an analysis, the inclusion of other variables will not increase the variance accounted for in behavioral intentions or behavior. Reasoned action researchers are very concerned with testing if new variables accou...

  5. Assumptions and realities of the NCLEX-RN.

    Science.gov (United States)

    Aucoin, Julia W; Treas, Leslie

    2005-01-01

    Every three years the National Council of State Boards of Nursing conducts a practice analysis to verify the activities that are tested on the licensure exam (NCLEX-RN). Faculty can benefit from information in the practice analysis to ensure that courses and experiences adequately prepare graduates for the NCLEX-RN. This summary of the practice analysis challenges common assumptions and provides recommendations for faculty.

  6. DDH-Like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike;

    2012-01-01

    We introduce and study a new type of DDH-like assumptions based on groups of prime order q. Whereas standard DDH is based on encoding elements of $\\mathbb{F}_{q}$ “in the exponent” of elements in the group, we ask what happens if instead we put in the exponent elements of the extension ring $R...... obtain, in fact, an infinite hierarchy of progressively weaker assumptions whose complexities lie “between” DDH and CDH. This leads to a large number of new schemes because virtually all known DDH-based constructions can very easily be upgraded to be based on d-DDH. We use the same construction...... and security proof but get better security and moreover, the amortized complexity (e.g, computation per encrypted bit) is the same as when using DDH. We also show that d-DDH, just like DDH, is easy in bilinear groups. We therefore suggest a different type of assumption, the d-vector DDH problems (d...

  7. Differentiating Different Modeling Assumptions in Simulations of MagLIF loads on the Z Generator

    Science.gov (United States)

    Jennings, C. A.; Gomez, M. R.; Harding, E. C.; Knapp, P. F.; Ampleford, D. J.; Hansen, S. B.; Weis, M. R.; Glinsky, M. E.; Peterson, K.; Chittenden, J. P.

    2016-10-01

MagLIF experiments, in which metal liners are imploded by a fast-rising current, have had some success. While experiments are increasingly well diagnosed, many of the measurements (particularly during stagnation) are time integrated, limited in spatial resolution, or require additional assumptions to interpret in the context of a structured, rapidly evolving system. As such, in validating MHD calculations, there is the potential for the same observables in the experimental data to be reproduced under different modeling assumptions. Using synthetic diagnostics of the results of different pre-heat, implosion and stagnation simulations run with the Gorgon MHD code, we discuss how the interpretation of typical Z diagnostics relates to more fundamental simulation parameters. We then explore the extent to which different assumptions on instability development, current delivery, high-Z mix into the fuel and initial laser deposition can be differentiated in our existing measurements. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DoE's NNSA under contract DE-AC04-94AL85000.

  8. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning

    Science.gov (United States)

    2016-01-01

A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning. PMID:27310576

  9. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Directory of Open Access Journals (Sweden)

    Anne Hsu

Full Text Available A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.
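The strong- versus weak-sampling distinction can be made concrete with a small Bayesian toy model (an illustrative sketch, not the authors' experimental materials): two candidate "grammars", one nested in the other, are scored against input that never contains the broader grammar's extra construction.

```python
def posterior(data, hypotheses, strong=True, prior=None):
    """Posterior over candidate grammars (sets of licensed sentences).
    Strong sampling: each example is drawn uniformly from the grammar,
    so a smaller grammar gains likelihood and absent constructions act
    as indirect negative evidence. Weak sampling: any consistent
    grammar scores 1 per example, so absence carries no weight."""
    prior = prior or {h: 1 / len(hypotheses) for h in hypotheses}
    post = {}
    for name, grammar in hypotheses.items():
        like = 1.0
        for s in data:
            if s not in grammar:
                like = 0.0
                break
            like *= 1 / len(grammar) if strong else 1.0
        post[name] = prior[name] * like
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}

hyps = {"narrow": {"A"}, "broad": {"A", "B"}}
data = ["A"] * 10  # construction B is never observed

strong_post = posterior(data, hyps, strong=True)
weak_post = posterior(data, hyps, strong=False)
# Under strong sampling the narrow grammar dominates (> 0.99);
# under weak sampling both consistent grammars stay at 0.5.
```

The hypothesis names and the uniform-likelihood form are assumptions made for this sketch; real models of verb alternations are richer, but the asymmetry shown here is the one the abstract tests behaviorally.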

  10. Developmental Orientation of Commercial Health Insurance under the Perspective ;of Basic Medical Insurance%基本医疗保险视角下的商业健康保险发展定位

    Institute of Scientific and Technical Information of China (English)

    董曙辉

    2015-01-01

China's commercial health insurance is underdeveloped overall, leaving the public's medical needs excessively dependent on basic medical insurance. The contradiction between the people's growing health care demands and the limitations of basic medical insurance (finite resources, irrational allocation of those resources, and the lack of an administrative management mechanism) has become increasingly prominent. Accelerating the development of commercial health insurance, so that more supply, better allocation and better services can be provided by it, is the only way to lead basic medical insurance out of its current operational predicament and to build a new multi-level medical security system.

  11. Experimental data from irradiation of physical detectors disclose weaknesses in basic assumptions of the δ ray theory of track structure

    DEFF Research Database (Denmark)

    Olsen, K. J.; Hansen, Jørgen-Walther

    1985-01-01

to 20200 MeV·cm²·g⁻¹ using ion beams ranging from protons to sulphur ions. The low-LET reference radiations were beams of fast electrons and of 60Co γ rays. At doses well below saturation the two detectors act upon low-LET radiation in close accordance with the theoretical considerations, but at marginal

  12. Basics of Bayesian Learning - Basically Bayes

    DEFF Research Database (Denmark)

    Larsen, Jan

    Tutorial presented at the IEEE Machine Learning for Signal Processing Workshop 2006, Maynooth, Ireland, September 8, 2006. The tutorial focuses on the basic elements of Bayesian learning and its relation to classical learning paradigms. This includes a critical discussion of the pros and cons...

  13. Basic molecular spectroscopy

    CERN Document Server

    Gorry, PA

    1985-01-01

    BASIC Molecular Spectroscopy discusses the utilization of the Beginner's All-purpose Symbolic Instruction Code (BASIC) programming language in molecular spectroscopy. The book is comprised of five chapters that provide an introduction to molecular spectroscopy through programs written in BASIC. The coverage of the text includes rotational spectra, vibrational spectra, and Raman and electronic spectra. The book will be of great use to students who are currently taking a course in molecular spectroscopy.

  14. Basic digital signal processing

    CERN Document Server

    Lockhart, Gordon B

    1985-01-01

Basic Digital Signal Processing describes the principles of digital signal processing and experiments with BASIC programs involving the fast Fourier transform (FFT). The book reviews the fundamentals of the BASIC program, continuous and discrete time signals including analog signals, Fourier analysis, the discrete Fourier transform, signal energy, and power. The text also explains digital signal processing involving digital filters, linear time-invariant systems, the discrete-time unit impulse, discrete-time convolution, and the alternative structure for second-order infinite impulse response (IIR) sections.

  15. Old and new ideas for data screening and assumption testing for exploratory and confirmatory factor analysis

    Directory of Open Access Journals (Sweden)

    David B. Flora

    2012-03-01

    Full Text Available We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables.

  16. Old and new ideas for data screening and assumption testing for exploratory and confirmatory factor analysis.

    Science.gov (United States)

    Flora, David B; Labrish, Cathy; Chalmers, R Philip

    2012-01-01

    We provide a basic review of the data screening and assumption testing issues relevant to exploratory and confirmatory factor analysis along with practical advice for conducting analyses that are sensitive to these concerns. Historically, factor analysis was developed for explaining the relationships among many continuous test scores, which led to the expression of the common factor model as a multivariate linear regression model with observed, continuous variables serving as dependent variables, and unobserved factors as the independent, explanatory variables. Thus, we begin our paper with a review of the assumptions for the common factor model and data screening issues as they pertain to the factor analysis of continuous observed variables. In particular, we describe how principles from regression diagnostics also apply to factor analysis. Next, because modern applications of factor analysis frequently involve the analysis of the individual items from a single test or questionnaire, an important focus of this paper is the factor analysis of items. Although the traditional linear factor model is well-suited to the analysis of continuously distributed variables, commonly used item types, including Likert-type items, almost always produce dichotomous or ordered categorical variables. We describe how relationships among such items are often not well described by product-moment correlations, which has clear ramifications for the traditional linear factor analysis. An alternative, non-linear factor analysis using polychoric correlations has become more readily available to applied researchers and thus more popular. Consequently, we also review the assumptions and data-screening issues involved in this method. Throughout the paper, we demonstrate these procedures using an historic data set of nine cognitive ability variables.

  17. THE COMPLEX OF ASSUMPTION CATHEDRAL OF THE ASTRAKHAN KREMLIN

    Directory of Open Access Journals (Sweden)

    Savenkova Aleksandra Igorevna

    2016-08-01

Full Text Available This article is devoted to an architectural and historical analysis of the constructions forming the complex of the Assumption Cathedral of the Astrakhan Kremlin, which has not previously been considered as a subject of special research. Based on archival sources, photographic materials, publications and on-site investigations of the monuments, the article considers the creation history of the complete architectural complex, sustained in the single style of the Muscovite baroque and unique in its composite construction. An interpretation in the all-Russian architectural context is offered, and the typological features of the individual constructions are brought to light. The Prechistinsky bell tower has an untypical architectural solution: a "hexagonal structure on octagonal and quadrangular structures". The way of connecting the building of the Cathedral and the chambers by a passage was characteristic of monastic constructions and was exceedingly rare in kremlins, farmsteads and ensembles of city cathedrals. The composite scheme of the Assumption Cathedral includes the Lobnoye Mesto ("the Place of Execution") located on an axis from the West; it is connected with the main building by a quarter-turn with landing. The only prototype of the structure is the Lobnoye Mesto on Red Square in Moscow. The article considers the version that the Place of Execution emerged on the basis of an earlier construction, a tower called "the Peal", which is repeatedly mentioned in written sources in connection with S. Razin's revolt. The metropolitan Sampson, trying to preserve the standing of the Astrakhan metropolitanate, built the Assumption Cathedral and the Place of Execution in direct appeal to a capital prototype, so as to emphasize continuity and close connection with Moscow.

  18. Evaluating risk factor assumptions: a simulation-based approach

    Directory of Open Access Journals (Sweden)

    Miglioretti Diana L

    2011-09-01

Full Text Available Abstract. Background: Microsimulation models are an important tool for estimating the comparative effectiveness of interventions through prediction of individual-level disease outcomes for a hypothetical population. To estimate the effectiveness of interventions targeted toward high risk groups, the mechanism by which risk factors influence the natural history of disease must be specified. We propose a method for evaluating these risk factor assumptions as part of model-building. Methods: We used simulation studies to examine the impact of risk factor assumptions on the relative rate (RR) of colorectal cancer (CRC) incidence and mortality for a cohort with a risk factor compared to a cohort without the risk factor, using an extension of the CRC-SPIN model for colorectal cancer. We also compared the impact of changing the age at initiation of screening colonoscopy for different risk mechanisms. Results: Across CRC-specific risk factor mechanisms, the RR of CRC incidence and mortality decreased (towards one) with increasing age. The rate of change in RRs across age groups depended on both the risk factor mechanism and the strength of the risk factor effect. Increased non-CRC mortality attenuated the effect of CRC-specific risk factors on the RR of CRC when both were present. For each risk factor mechanism, earlier initiation of screening resulted in more life years gained, though the magnitude of life years gained varied across risk mechanisms. Conclusions: Simulation studies can provide insight into both the effect of risk factor assumptions on model predictions and the type of data needed to calibrate risk factor models.

  19. Questioning the foundations of physics which of our fundamental assumptions are wrong?

    CERN Document Server

    Foster, Brendan; Merali, Zeeya

    2015-01-01

The essays in this book look at ways in which the fundaments of physics might need to be changed in order to make progress towards a unified theory. They are based on the prize-winning essays submitted to the FQXi essay competition "Which of Our Basic Physical Assumptions Are Wrong?", which drew over 270 entries. As Nobel Laureate physicist Philip W. Anderson realized, the key to understanding nature's reality is not anything "magical", but the right attitude, "the focus on asking the right questions, the willingness to try (and to discard) unconventional answers, the sensitive ear for phoniness, self-deception, bombast, and conventional but unproven assumptions." The authors of the eighteen prize-winning essays have, where necessary, adapted their essays for the present volume so as to (a) incorporate the community feedback generated in the online discussion of the essays, (b) add new material that has come to light since their completion and (c) ensure accessibility to a broad audience of re...

  20. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality an

  1. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky;

    2010-01-01

A number of waste life cycle assessment (LCA) models have been gradually developed since the early 1990s, in a number of countries, usually independently from each other. Large discrepancies in results have been observed among different waste LCA models, although it has also been shown that results ... , such as the functional unit, system boundaries, waste composition and energy modelling. The modelling assumptions of waste management processes, ranging from collection, transportation, intermediate facilities, recycling, thermal treatment, biological treatment, and landfilling, are obviously critical when comparing ...

  2. 5 CFR 551.401 - Basic principles.

    Science.gov (United States)

    2010-01-01

Title 5, Administrative Personnel. OFFICE OF PERSONNEL MANAGEMENT, CIVIL SERVICE REGULATIONS, PAY ADMINISTRATION UNDER THE FAIR LABOR STANDARDS ACT, Hours of Work, General Provisions. § 551.401 Basic principles. (a) All...

  3. Sensitivity to imputation models and assumptions in receiver operating characteristic analysis with incomplete data.

    Science.gov (United States)

    Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M

Modern statistical methods for incomplete data have been increasingly applied in a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used in evaluating diagnostic tests or biomarkers in medical research, has also become increasingly popular in both its development and application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or of the assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. In particular, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms.
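A minimal sketch of the MI-plus-ROC workflow this abstract studies, under simplifying assumptions (normal-theory regression imputation, MAR missingness driven by a fully observed covariate, and pooling by averaging the per-imputation AUC point estimates); all variable names and parameter values here are invented for illustration.

```python
import numpy as np

def auc(scores, labels):
    """Empirical AUC via the Mann-Whitney U statistic."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

rng = np.random.default_rng(1)
n = 400
labels = rng.integers(0, 2, n)
marker = labels + rng.normal(0, 1, n)       # diseased cases score higher
aux = marker + rng.normal(0, 0.5, n)        # fully observed covariate
missing = rng.random(n) < 0.3 * (aux > aux.mean())  # MAR: depends on aux

# Regression imputation with residual noise ("proper" imputation).
obs = ~missing
slope, intercept = np.polyfit(aux[obs], marker[obs], 1)
resid_sd = np.std(marker[obs] - (slope * aux[obs] + intercept))

m = 20  # number of imputations
aucs = []
for _ in range(m):
    imp = marker.copy()
    imp[missing] = (slope * aux[missing] + intercept
                    + rng.normal(0, resid_sd, missing.sum()))
    aucs.append(auc(imp, labels))
pooled_auc = float(np.mean(aucs))  # pool point estimates across imputations
```

A full analysis would also pool variances via Rubin's rules and, as the paper emphasizes, check how sensitive `pooled_auc` is when the imputation model is deliberately mis-specified.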

  4. Economic Growth Assumptions in Climate and Energy Policy

    Directory of Open Access Journals (Sweden)

    Nir Y. Krakauer

    2014-03-01

Full Text Available The assumption that the economic growth seen in recent decades will continue has dominated the discussion of future greenhouse gas emissions and the mitigation of and adaptation to climate change. Given that long-term economic growth is uncertain, the impacts of a wide range of growth trajectories should be considered. In particular, slower economic growth would imply that future generations will be relatively less able to invest in emissions controls or adapt to the detrimental impacts of climate change. Taking into consideration the possibility of economic slowdown therefore heightens the urgency of reducing greenhouse gas emissions now by moving to renewable energy sources, even if this incurs short-term economic cost. I quantify this counterintuitive impact of economic growth assumptions on present-day policy decisions in a simple global economy-climate model, the Dynamic Integrated model of Climate and the Economy (DICE). In DICE, slow future growth increases the economically optimal present-day carbon tax rate and the utility of taxing carbon emissions, although the magnitude of the increase is sensitive to model parameters, including the rate of social time preference and the elasticity of the marginal utility of consumption. Future scenario development should specifically include low-growth scenarios, and the possibility of low-growth economic trajectories should be taken into account in climate policy analyses.
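One channel through which growth assumptions enter such models is the standard Ramsey discounting rule, r = ρ + ηg, where ρ is the rate of social time preference and η the elasticity of the marginal utility of consumption (the two parameters the abstract names). The sketch below is an illustrative aside with assumed parameter values, not the DICE code itself:

```python
def ramsey_discount_rate(rho, eta, g):
    """Ramsey rule: consumption discount rate r = rho + eta * g.
    Slower per-capita growth g lowers r, raising the present value of
    future climate damages and hence the optimal present-day carbon tax."""
    return rho + eta * g

# Assumed illustrative parameters (not calibrated DICE values):
fast = ramsey_discount_rate(rho=0.015, eta=2.0, g=0.02)   # 2% growth
slow = ramsey_discount_rate(rho=0.015, eta=2.0, g=0.005)  # 0.5% growth
# Halving-plus of the growth rate cuts the discount rate from 5.5% to 2.5%,
# which is the mechanism behind the paper's counterintuitive result.
```

DICE's actual optimization is richer (endogenous growth, damages, abatement costs), but this single formula captures why low-growth trajectories raise the optimal carbon tax today.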

  5. The contour method cutting assumption: error minimization and correction

    Energy Technology Data Exchange (ETDEWEB)

    Prime, Michael B [Los Alamos National Laboratory; Kastengren, Alan L [ANL

    2010-01-01

    The recently developed contour method can measure a 2-D, cross-sectional residual-stress map. A part is cut in two using a precise and low-stress cutting technique such as electric discharge machining. The contours of the new surfaces created by the cut, which will not be flat if residual stresses are relaxed by the cutting, are then measured and used to calculate the original residual stresses. The precise nature of the assumption about the cut is presented theoretically and evaluated experimentally. Simply assuming a flat cut is overly restrictive and misleading. The critical assumption is that the width of the cut, when measured in the original, undeformed configuration of the body, is constant. Stresses at the cut tip during cutting cause the material to deform, which causes errors. The effect of such cutting errors on the measured stresses is presented, and the important parameters are quantified. Experimental procedures for minimizing these errors are presented, along with an iterative finite element procedure to correct for them. The correction procedure is demonstrated on experimental data from a steel beam that was plastically bent to introduce a known profile of residual stresses.

  6. Halo-Independent Direct Detection Analyses Without Mass Assumptions

    CERN Document Server

    Anderson, Adam J; Kahn, Yonatan; McCullough, Matthew

    2015-01-01

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the $m_\chi-\sigma_n$ plane. Recently, methods that are independent of the DM halo velocity distribution have been developed which present results in the $v_{min}-\tilde{g}$ plane, but these in turn require an assumption about the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from $v_{min}$ to nuclear recoil momentum ($p_R$), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call $\tilde{h}(p_R)$. The entire family of conventional halo-independent $\tilde{g}(v_{min})$ plots for all DM masses can be found directly from the single $\tilde{h}(p_R)$ plot through a simple re...
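    The change of variables rests on elastic scattering kinematics: a given recoil momentum $p_R$ maps to $v_{min} = p_R / 2\mu$, with $\mu$ the DM-nucleus reduced mass, so a single curve in $p_R$ unfolds into a $v_{min}$ curve for every candidate mass. A numerical sketch, with target and momentum values chosen for illustration only:

```python
# For elastic scattering, v_min = p_R / (2 * mu), where mu is the
# DM-nucleus reduced mass; one curve in recoil momentum p_R thus yields
# the v_min curve for any dark matter mass m_chi.
def reduced_mass(m_chi, m_n):
    return m_chi * m_n / (m_chi + m_n)

m_n = 122.0   # mass of a xenon nucleus in GeV (illustrative target)
p_r = 0.05    # recoil momentum in GeV (illustrative)

v_min = {}
for m_chi in (10.0, 50.0, 1000.0):  # candidate DM masses in GeV
    v_min[m_chi] = p_r / (2 * reduced_mass(m_chi, m_n))  # in units of c

for m_chi, v in v_min.items():
    print(m_chi, round(v, 5))
```

    Heavier candidates need a lower minimum speed for the same recoil momentum, which is why one $\tilde{h}(p_R)$ curve suffices for the whole mass family.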

  7. Relaxing the zero-sum assumption in neutral biodiversity theory.

    Science.gov (United States)

    Haegeman, Bart; Etienne, Rampal S

    2008-05-21

    The zero-sum assumption is one of the ingredients of the standard neutral model of biodiversity by Hubbell. It states that the community is saturated all the time, which in this model means that the total number of individuals in the community is constant over time, and therefore introduces a coupling between species abundances. It was shown recently that a neutral model with independent species, and thus without any coupling between species abundances, has the same sampling formula (given a fixed number of individuals in the sample) as the standard model [Etienne, R.S., Alonso, D., McKane, A.J., 2007. The zero-sum assumption in neutral biodiversity theory. J. Theor. Biol. 248, 522-536]. The equilibria of both models are therefore equivalent from a practical point of view. Here we show that this equivalence can be extended to a class of neutral models with density-dependence on the community-level. This result can be interpreted as robustness of the model, i.e. insensitivity of the model to the precise interaction of the species in a neutral community. It can also be interpreted as a lack of resolution, as different mechanisms of interactions between neutral species cannot be distinguished using only a single snapshot of species abundance data.

  8. Basic Quality & Training of Foreign Language Teachers under the Student-centered Teaching Principle

    Institute of Scientific and Technical Information of China (English)

    张乐慧

    2015-01-01

    The "student-centered" teaching principle has been actively promoted and implemented in foreign language teaching in recent years, with clearly visible results. According to the principle, students can give full play to their subjective initiative and creativity under the instruction and guidance of teachers. To better combine teaching with learning and embody the student-centered principle, the qualities foreign language teachers should possess, and the ways to cultivate them, need to be further clarified; teachers should receive training in these basic qualities so as to better guide students in their learning.

  9. Basic Research Objectives Reaffirmed

    Institute of Scientific and Technical Information of China (English)

    Guo Haiyan; Zhao Baohua

    2002-01-01

    As a national institution for scientific research and a component of the national innovation system, CAS should and must make key contributions to the great national rejuvenation of the country. Keeping this in mind, CAS has developed four developmental targets for its basic research. This was revealed at a CAS conference on basic research held June 11-12 in Beijing.

  10. Cycles in basic innovations

    NARCIS (Netherlands)

    Groot, de E.A. (Bert); Franses, P.H.P.H.

    2005-01-01

    Basic innovations are often believed to be the drivers of economic growth. It has been widely documented that economic growth follows cyclical patterns of varying length. In this paper we examine if such patterns are also present in basic innovations. For an annual time series of count data covering

  11. Basic Science Training Program.

    Science.gov (United States)

    Brummel, Clete

    These six learning modules were developed for Lake Michigan College's Basic Science Training Program, a workshop to develop good study skills while reviewing basic science. The first module, which was designed to provide students with the necessary skills to study efficiently, covers the following topics: time management; an overview of a study…

  12. Basic principle of superconductivity

    OpenAIRE

    De Cao, Tian

    2007-01-01

    The basic principle of superconductivity is suggested in this paper. Two important but mistaken suggestions have been made about the basic principle: one concerns the relation between superconductivity and Bose-Einstein condensation (BEC), and the other the relation between superconductivity and the pseudogap.

  13. Legacies, Assumptions, and Decisions: The Path to Hiroshima

    Science.gov (United States)

    1997-01-01

    Navy Undersecretary Bard, believing Japan was looking for a way out, urged warning in a June 27 memo. Finally, Szilard tried one last modified petition in mid-July. These ... Company, 1995. Walzer, Michael, Just and Unjust Wars. New York: Basic Books, Inc., 1977. Weinberg, Gerhard L., A World at Arms: A Global History of World War II.

  14. Climate Change: Implications for the Assumptions, Goals and Methods of Urban Environmental Planning

    Directory of Open Access Journals (Sweden)

    Kristina Hill

    2016-12-01

    Full Text Available As a result of increasing awareness of the implications of global climate change, shifts are becoming necessary and apparent in the assumptions, concepts, goals and methods of urban environmental planning. This review will present the argument that these changes represent a genuine paradigm shift in urban environmental planning. Reflection and action to develop this paradigm shift are critical now and in the next decades, because environmental planning for cities will only become more urgent as we enter a new climate period. The concepts, methods and assumptions that urban environmental planners have relied on in previous decades to protect people, ecosystems and physical structures are inadequate if they do not explicitly account for a rapidly changing regional climate context, specifically from a hydrological and ecological perspective. The over-arching concept of spatial suitability that guided planning in most of the 20th century has already given way to concepts that address sustainability, recognizing the importance of temporality. Quite rapidly, the concept of sustainability has been replaced in many planning contexts by the priority of establishing resilience in the face of extreme disturbance events. Now even this concept of resilience is being incorporated into a novel concept of urban planning as a process of adaptation to permanent, incremental environmental changes. This adaptation concept recognizes the necessity for continued resilience to extreme events, while acknowledging that permanent changes are also occurring as a result of trends that have a clear direction over time, such as rising sea levels. Similarly, the methods of urban environmental planning have relied on statistical data about hydrological and ecological systems that will not adequately describe these systems under a new climate regime. These methods are beginning to be replaced by methods that make use of early warning systems for regime shifts, and process

  15. Basic Research Needs for Countering Terrorism

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, W.; Michalske, T.; Trewhella, J.; Makowski, L.; Swanson, B.; Colson, S.; Hazen, T.; Roberto, F.; Franz, D.; Resnick, G.; Jacobson, S.; Valdez, J.; Gourley, P.; Tadros, M.; Sigman, M.; Sailor, M.; Ramsey, M.; Smith, B.; Shea, K.; Hrbek, J.; Rodacy, P.; Tevault, D.; Edelstein, N.; Beitz, J.; Burns, C.; Choppin, G.; Clark, S.; Dietz, M.; Rogers, R.; Traina, S.; Baldwin, D.; Thurnauer, M.; Hall, G.; Newman, L.; Miller, D.; Kung, H.; Parkin, D.; Shuh, D.; Shaw, H.; Terminello, L.; Meisel, D.; Blake, D.; Buchanan, M.; Roberto, J.; Colson, S.; Carling, R.; Samara, G.; Sasaki, D.; Pianetta, P.; Faison, B.; Thomassen, D.; Fryberger, T.; Kiernan, G.; Kreisler, M.; Morgan, L.; Hicks, J.; Dehmer, J.; Kerr, L.; Smith, B.; Mays, J.; Clark, S.

    2002-03-01

    The goals of this report are to identify connections between technology needs for countering terrorism and underlying science issues, and to recommend investment strategies to increase the impact of basic research on efforts to counter terrorism.

  16. Sequence diversity under the multispecies coalescent with Yule process and constant population size.

    Science.gov (United States)

    Heled, Joseph

    2012-03-01

    The study of sequence diversity under phylogenetic models is now classic. Theoretical studies of diversity under the Kingman coalescent appeared shortly after the introduction of the coalescent. In this paper we revisit this topic under the multispecies coalescent, an extension of the single population model to multiple populations. We derive exact formulas for the sequence dissimilarity of two sequences drawn at random under a basic multispecies setup. The multispecies model uses three parameters--the species tree birth rate under the pure birth process (Yule), the species effective population size and the mutation rate. We also discuss the effects of relaxing some of the model assumptions.
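    The single-population ingredient of such formulas can be checked by simulation: under a haploid Wright-Fisher model, a random pair of lineages coalesces after a geometrically distributed number of generations with mean N, so the expected number of pairwise differences is 2·N·μ. A sketch of only this within-population piece of the multispecies model, with illustrative parameter values:

```python
import math
import random

random.seed(1)

N, mu = 1000, 1e-4   # illustrative population size and per-generation mutation rate
reps = 20000

def coalescence_time(n):
    """Generations until two lineages coalesce: Geometric(1/n), mean n."""
    u = random.random()
    return int(math.log(1 - u) / math.log(1 - 1.0 / n)) + 1

def poisson(lam):
    """Knuth's method; adequate for the small means used here."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# Mutations fall on both branches of the pair, total length 2*T generations.
diffs = [poisson(2 * mu * coalescence_time(N)) for _ in range(reps)]
mean_diff = sum(diffs) / reps
print(round(mean_diff, 3))  # analytic expectation: 2 * N * mu = 0.2
```

    The multispecies formulas of the paper add the species-tree (Yule) waiting times on top of this within-population expectation.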

  17. Testing surrogacy assumptions: can threatened and endangered plants be grouped by biological similarity and abundances?

    Science.gov (United States)

    Che-Castaldo, Judy P; Neel, Maile C

    2012-01-01

    There is renewed interest in implementing surrogate species approaches in conservation planning due to the large number of species in need of management but limited resources and data. One type of surrogate approach involves selection of one or a few species to represent a larger group of species requiring similar management actions, so that protection and persistence of the selected species would result in conservation of the group of species. However, among the criticisms of surrogate approaches is the need to test underlying assumptions, which remain rarely examined. In this study, we tested one of the fundamental assumptions underlying use of surrogate species in recovery planning: that there exist groups of threatened and endangered species that are sufficiently similar to warrant similar management or recovery criteria. Using a comprehensive database of all plant species listed under the U.S. Endangered Species Act and tree-based random forest analysis, we found no evidence of species groups based on a set of distributional and biological traits or by abundances and patterns of decline. Our results suggested that application of surrogate approaches for endangered species recovery would be unjustified. Thus, conservation planning focused on individual species and their patterns of decline will likely be required to recover listed species.

  18. Optimization of Non-Profit Projects’ Portfolio: Chosen Aspects and Assumptions

    Directory of Open Access Journals (Sweden)

    Jacek Woźniak

    2014-09-01

    Full Text Available Chosen aspects and assumptions of the author's proposed model for optimizing a portfolio of non-profit projects are presented. The functional model of the non-profit sector (third sector), which is the basis for the further analyses, is also characterized. The article also quantifies the fundamental conditions of portfolio optimization. A utility model for the management system of the non-profit portfolio is developed, within which the scope of the model and the relationships between four categories of the portfolio's participants/stakeholders are specified: non-profit organizations, donors, co-participants and customers (recipients of the basic benefits/values associated with the realization of the non-profit projects). The main optimality conditions and an optimization algorithm for the non-profit portfolio are also given. The paper concludes with exemplary analytical matrices used for optimizing non-profit portfolios, based on the evaluation of both the optimization utility conditions and added parameters. Only basic, chosen aspects of the optimization of the non-profit projects' portfolio are described here. Keywords: Management, Organization, Non-Profit, Project, Portfolio, Optimization, Utility

  19. Cost and Performance Assumptions for Modeling Electricity Generation Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Tidball, Rick [ICF International, Fairfax, VA (United States); Bluestein, Joel [ICF International, Fairfax, VA (United States); Rodriguez, Nick [ICF International, Fairfax, VA (United States); Knoke, Stu [ICF International, Fairfax, VA (United States)

    2010-11-01

    The goal of this project was to compare and contrast utility scale power plant characteristics used in data sets that support energy market models. Characteristics include both technology cost and technology performance projections to the year 2050. Cost parameters include installed capital costs and operation and maintenance (O&M) costs. Performance parameters include plant size, heat rate, capacity factor or availability factor, and plant lifetime. Conventional, renewable, and emerging electricity generating technologies were considered. Six data sets, each associated with a different model, were selected. Two of the data sets represent modeled results, not direct model inputs. These two data sets include cost and performance improvements that result from increased deployment as well as resulting capacity factors estimated from particular model runs; other data sets represent model input data. For the technologies contained in each data set, the levelized cost of energy (LCOE) was also evaluated, according to published cost, performance, and fuel assumptions.
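    As a sketch of how cost and performance parameters of this kind combine into an LCOE, here is a simplified screening calculation. The formula is the standard annualized-cost form; all numeric inputs are illustrative, not values from the compared data sets.

```python
# Simplified levelized cost of energy (LCOE): annualize capital with a
# capital recovery factor and spread fixed costs over generated MWh.
def crf(rate, years):
    """Capital recovery factor for discount rate `rate` over `years`."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe(capex, fom, vom, heat_rate, fuel_price, cf, rate=0.07, life=30):
    """$/MWh; capex in $/kW, fom in $/kW-yr, vom in $/MWh,
    heat_rate in MMBtu/MWh, fuel_price in $/MMBtu, cf = capacity factor."""
    mwh_per_kw = 8760 * cf / 1000  # annual MWh per kW of capacity
    return ((crf(rate, life) * capex + fom) / mwh_per_kw
            + vom + heat_rate * fuel_price)

# Illustrative gas combined cycle vs. wind plant:
gas = lcoe(capex=1000, fom=15, vom=3, heat_rate=7.0, fuel_price=4.0, cf=0.60)
wind = lcoe(capex=1600, fom=40, vom=0, heat_rate=0.0, fuel_price=0.0, cf=0.35)
print(round(gas, 1), round(wind, 1))
```

    The sensitivity of the result to capacity factor and discount rate is exactly why differing assumptions across the six data sets produce different LCOE rankings.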

  20. New media in strategy – mapping assumptions in the field

    DEFF Research Database (Denmark)

    Gulbrandsen, Ib Tunby; Plesner, Ursula; Raviola, Elena

    2017-01-01

    There is plenty of empirical evidence for claiming that new media make a difference for how strategy is conceived and executed. Furthermore, there is a rapidly growing body of literature that engages with this theme and offers recommendations regarding the appropriate strategic actions in relation to new media. By contrast, there is relatively little attention to the assumptions behind strategic thinking in relation to new media. This article reviews the most influential strategy journals, asking how new media are conceptualized. It is shown that strategy scholars have a tendency to place… The article may serve as an inspiration to develop nuanced perspectives on the role of new media in strategy.

  1. Linear irreversible heat engines based on local equilibrium assumptions

    Science.gov (United States)

    Izumida, Yuki; Okuda, Koji

    2015-08-01

    We formulate an endoreversible finite-time Carnot cycle model based on the assumptions of local equilibrium and constant energy flux, where the efficiency and the power are expressed in terms of the thermodynamic variables of the working substance. By analyzing the entropy production rate caused by the heat transfer in each isothermal process during the cycle, and using the endoreversible condition applied to the linear response regime, we identify the thermodynamic flux and force of the present system and obtain a linear relation that connects them. We calculate the efficiency at maximum power in the linear response regime by using the linear relation, which agrees with the Curzon-Ahlborn (CA) efficiency known as the upper bound in this regime. The reason for this is also elucidated by rewriting our model into the form of the Onsager relations, where our model turns out to satisfy the tight-coupling condition leading to the CA efficiency.
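    For reference, the Curzon-Ahlborn efficiency mentioned above, together with its linear response limit of half the Carnot efficiency, can be evaluated directly; the reservoir temperatures are illustrative.

```python
import math

def eta_carnot(t_c, t_h):
    """Carnot efficiency between cold and hot reservoir temperatures."""
    return 1 - t_c / t_h

def eta_ca(t_c, t_h):
    """Curzon-Ahlborn efficiency at maximum power."""
    return 1 - math.sqrt(t_c / t_h)

t_c, t_h = 300.0, 500.0  # illustrative temperatures in kelvin
print(round(eta_carnot(t_c, t_h), 3))  # 0.4
print(round(eta_ca(t_c, t_h), 3))      # 0.225

# Linear response check: for small temperature differences,
# eta_CA approaches eta_Carnot / 2.
ratio = eta_ca(999.9, 1000.0) / eta_carnot(999.9, 1000.0)
print(round(ratio, 3))  # close to 0.5
```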

  2. The Impact of Modeling Assumptions in Galactic Chemical Evolution Models

    CERN Document Server

    Côté, Benoit; Ritter, Christian; Herwig, Falk; Venn, Kim A

    2016-01-01

    We use the OMEGA galactic chemical evolution code to investigate how the assumptions used for the treatment of galactic inflows and outflows impact numerical predictions. The goal is to determine how our capacity to reproduce the chemical evolution trends of a galaxy is affected by the choice of implementation used to include those physical processes. In pursuit of this goal, we experiment with three different prescriptions for galactic inflows and outflows and use OMEGA within a Markov Chain Monte Carlo code to recover the set of input parameters that best reproduces the chemical evolution of nine elements in the dwarf spheroidal galaxy Sculptor. Despite their different degrees of intended physical realism, we found that all three prescriptions can reproduce in an almost identical way the stellar abundance trends observed in Sculptor. While the three models have the same capacity to fit the data, the best values recovered for the parameters controlling the number of Type Ia supernovae and the strength of gal...

  3. Uncovering Metaethical Assumptions in Bioethical Discourse across Cultures.

    Science.gov (United States)

    Sullivan, Laura Specker

    2016-03-01

    Much of bioethical discourse now takes place across cultures. This does not mean that cross-cultural understanding has increased. Many cross-cultural bioethical discussions are marked by entrenched disagreement about whether and why local practices are justified. In this paper, I argue that a major reason for these entrenched disagreements is that problematic metaethical commitments are hidden in these cross-cultural discourses. Using the issue of informed consent in East Asia as an example of one such discourse, I analyze two representative positions in the discussion and identify their metaethical commitments. I suggest that the metaethical assumptions of these positions result from their shared method of ethical justification: moral principlism. I then show why moral principlism is problematic in cross-cultural analyses and propose a more useful method for pursuing ethical justification across cultures.

  4. Validating modelling assumptions of alpha particles in electrostatic turbulence

    CERN Document Server

    Wilkie, George; Highcock, Edmund; Dorland, William

    2014-01-01

    To rigorously model fast ions in fusion plasmas, a non-Maxwellian equilibrium distribution must be used. In this work, the response of high-energy alpha particles to electrostatic turbulence has been analyzed for several different tokamak parameters. Our results are consistent with known scalings and experimental evidence that alpha particles are generally well-confined: on the order of several seconds. It is also confirmed that the effect of alphas on the turbulence is negligible at realistically low concentrations, consistent with linear theory. It is demonstrated that the usual practice of using a high-temperature Maxwellian gives incorrect estimates for the radial alpha particle flux, and a method of correcting it is provided. Furthermore, we see that the timescales associated with collisions and transport compete at moderate energies, calling into question the assumption, used in the derivation of the slowing-down distribution, that alpha particles remain confined to a flux surface.

  5. Experimental assessment of unvalidated assumptions in classical plasticity theory.

    Energy Technology Data Exchange (ETDEWEB)

    Brannon, Rebecca Moss (University of Utah, Salt Lake City, UT); Burghardt, Jeffrey A. (University of Utah, Salt Lake City, UT); Bauer, Stephen J.; Bronowski, David R.

    2009-01-01

    This report investigates the validity of several key assumptions in classical plasticity theory regarding material response to changes in the loading direction. Three metals, two rock types, and one ceramic were subjected to non-standard loading directions, and the resulting strain response increments were displayed in Gudehus diagrams to illustrate the approximation error of classical plasticity theories. A rigorous mathematical framework for fitting classical theories to the data, thus quantifying the error, is provided. Further data analysis techniques are presented that allow testing for the effect of changes in loading direction without having to use a new sample and for inferring the yield normal and flow directions without having to measure the yield surface. Though the data are inconclusive, there is indication that classical, incrementally linear, plasticity theory may be inadequate over a certain range of loading directions. This range of loading directions also coincides with loading directions that are known to produce a physically inadmissible instability for any nonassociative plasticity model.

  6. Basic stress analysis

    CERN Document Server

    Iremonger, M J

    1982-01-01

    BASIC Stress Analysis aims to help students become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis, because writing a program is analogous to teaching: it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c

  7. Quantum electronics basic theory

    CERN Document Server

    Fain, V M; Sanders, J H

    1969-01-01

    Quantum Electronics, Volume 1: Basic Theory is a condensed and generalized account of the extensive research and rapid progress on the subject, translated from the Russian. The volume describes the basic theory of quantum electronics and shows how the concepts and equations used in quantum electronics arise from the basic principles of theoretical physics. The book then briefly discusses the interaction of an electromagnetic field with matter. The text also covers the quantum theory of relaxation processes when a quantum system approaches an equilibrium state, and explai

  8. Video Screen Capture Basics

    Science.gov (United States)

    Dunbar, Laura

    2014-01-01

    This article is an introduction to video screen capture. Basic information of two software programs, QuickTime for Mac and BlueBerry Flashback Express for PC, are also discussed. Practical applications for video screen capture are given.

  9. HIV Treatment: The Basics

    Science.gov (United States)

    HIV Treatment: The Basics (last updated 2/24/2017; last reviewed 2/24/2017). Key Points: Antiretroviral therapy (ART) ... reduces the risk of HIV transmission. How do HIV medicines work? HIV attacks and destroys the infection- ...

  10. Kidney Disease Basics

    Science.gov (United States)

    Your kidneys filter extra ... blood pressure are the most common causes of kidney disease. These conditions can slowly damage the kidneys ...

  11. Health Literacy Basics

    Science.gov (United States)

    ... have the capacity to obtain, process, and understand basic health information and services needed to make appropriate health decisions. 1 Health literacy is dependent on individual and systemic factors: Communication skills of lay persons and professionals Lay and professional ...

  12. Basic Financial Accounting

    DEFF Research Database (Denmark)

    Wiborg, Karsten

    This textbook on Basic Financial Accounting is targeted at students of economics at universities and business colleges taking an introductory course on the external dimension of a company's economic reporting, including bookkeeping, etc. The book includes the following subjects...

  13. Evaluating the reliability of equilibrium dissolution assumption from residual gasoline in contact with water saturated sands

    Science.gov (United States)

    Lekmine, Greg; Sookhak Lari, Kaveh; Johnston, Colin D.; Bastow, Trevor P.; Rayner, John L.; Davis, Greg B.

    2017-01-01

    Understanding the dissolution dynamics of hazardous compounds from complex gasoline mixtures is key to long-term predictions of groundwater risks. The aim of this study was to investigate whether the local equilibrium assumption for dissolution of BTEX and TMBs (trimethylbenzenes) was valid under variable saturation in two-dimensional flow conditions, and to evaluate the impact of local heterogeneities when equilibrium is verified at the scale of investigation. An initial residual gasoline saturation was established over the upper two-thirds of a water-saturated sand pack. A constant horizontal pore velocity was maintained and water samples were recovered across 38 sampling ports over 141 days. Inside the residual NAPL zone, BTEX and TMB dissolution curves were in agreement with the TMVOC model based on the local equilibrium assumption. Results compared to previous numerical studies suggest the presence of small-scale dissolution fingering created perpendicular to the horizontal dissolution front, mainly triggered by heterogeneities in the medium structure and the local NAPL residual saturation. In the transition zone, TMVOC was able to represent a range of behaviours exhibited by the data, confirming equilibrium or near-equilibrium dissolution at the scale of investigation. The model locally showed discrepancies with the most soluble compounds, i.e. benzene and toluene, due to local heterogeneities, indicating that flow bypassing and channelling may have occurred at smaller scales. In these conditions mass transfer rates were still high enough to fall under the equilibrium assumption in TMVOC at the scale of investigation. Comparisons with other models involving upscaled mass transfer rates demonstrated that such approximations with TMVOC could overestimate BTEX dissolution rates and underestimate the total remediation time.
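    The local equilibrium assumption for multi-component NAPL dissolution is commonly expressed through Raoult's law: each compound's effective aqueous concentration is its mole fraction in the gasoline times its pure-compound solubility. A sketch under that assumption, with hypothetical mole fractions and approximate handbook solubilities:

```python
# Raoult's law for NAPL-water partitioning: C_i = x_i * S_i, where x_i is
# the compound's mole fraction in the NAPL and S_i its pure-compound
# aqueous solubility.  Solubilities are approximate literature values in
# mg/L; the gasoline composition below is hypothetical.
solubility = {"benzene": 1780, "toluene": 515, "ethylbenzene": 152,
              "xylenes": 170, "1,2,4-TMB": 57}
mole_fraction = {"benzene": 0.01, "toluene": 0.05, "ethylbenzene": 0.02,
                 "xylenes": 0.06, "1,2,4-TMB": 0.03}

effective = {c: mole_fraction[c] * solubility[c] for c in solubility}
for c, conc in sorted(effective.items(), key=lambda kv: -kv[1]):
    print(f"{c:12s} {conc:6.1f} mg/L")
```

    As the most soluble compounds deplete, their mole fractions (and hence effective concentrations) fall, which is why dissolution curves for benzene and toluene are the first to deviate when equilibrium breaks down locally.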

  14. Discrete Neural Signatures of Basic Emotions.

    Science.gov (United States)

    Saarimäki, Heini; Gotsopoulos, Athanasios; Jääskeläinen, Iiro P; Lampinen, Jouko; Vuilleumier, Patrik; Hari, Riitta; Sams, Mikko; Nummenmaa, Lauri

    2016-06-01

    Categorical models of emotions posit neurally and physiologically distinct human basic emotions. We tested this assumption by using multivariate pattern analysis (MVPA) to classify brain activity patterns of 6 basic emotions (disgust, fear, happiness, sadness, anger, and surprise) in 3 experiments. Emotions were induced with short movies or mental imagery during functional magnetic resonance imaging. MVPA accurately classified emotions induced by both methods, and the classification generalized from one induction condition to another and across individuals. Brain regions contributing most to the classification accuracy included medial and inferior lateral prefrontal cortices, frontal pole, precentral and postcentral gyri, precuneus, and posterior cingulate cortex. Thus, specific neural signatures across these regions hold representations of different emotional states in multimodal fashion, independently of how the emotions are induced. Similarity of subjective experiences between emotions was associated with similarity of neural patterns for the same emotions, suggesting a direct link between activity in these brain regions and the subjective emotional experience.

  15. On Constructing a Basic Framework for University Internal Control under the Centralized Treasury Payment System

    Institute of Scientific and Technical Information of China (English)

    司金山

    2012-01-01

    As central or local budget units, universities have now fully implemented the centralized treasury payment system, and their financial management has changed significantly. How well the centralized treasury payment system works depends largely on the quality of its internal control. This paper analyzes the internal control problems that have appeared in universities since the treasury system was implemented and, applying the COSO framework in combination with the actual situation, constructs a basic framework for the internal control of local universities under the centralized treasury payment system, with the aim of improving the university internal control system and ensuring that centralized treasury payment is implemented effectively in universities.

  16. Adhesion Detection Analysis by Modeling Rail Wheel Set Dynamics under the Assumption of Constant Creep Coefficient

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali Soomro

    2014-12-01

    Full Text Available Control of the adhesion level is necessary to avoid slippage between rail wheelset and track, and thus derailment, for smooth running of a rail vehicle. In this paper the dynamics of the wheelset are discussed for velocities acting in three dimensions of the wheelset and rail track, and the creep forces on each wheel in the longitudinal, lateral and spin directions are enumerated and computed for suitable modeling. The results have been simulated in Matlab to observe the correlation of these quantities and to compare creepage and creep forces for detecting the adhesion level. Adhesion is identified by applying Coulomb's law for sliding friction, comparing tangential and normal forces through the coefficient of friction.
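    The adhesion criterion described, comparing tangential and normal forces through the coefficient of friction, reduces to a simple Coulomb check per wheel. The forces and friction coefficients below are illustrative, not values from the paper.

```python
import math

# Slip detection via Coulomb's law: a wheel is at the adhesion limit when
# the resultant tangential creep force reaches mu * N.
def is_slipping(f_long, f_lat, normal, mu):
    """True if the tangential creep force exceeds the available adhesion."""
    tangential = math.hypot(f_long, f_lat)
    return tangential > mu * normal

normal = 50_000.0        # wheel load in N (illustrative)
mu_dry, mu_wet = 0.35, 0.15

f_long, f_lat = 12_000.0, 4_000.0  # longitudinal and lateral creep forces in N
print(is_slipping(f_long, f_lat, normal, mu_dry))  # False: within dry adhesion
print(is_slipping(f_long, f_lat, normal, mu_wet))  # True: exceeds wet adhesion
```

    In practice the creep forces come from the wheelset dynamics model (e.g. linear creep coefficients times creepages), and the same check flags when the contact patch saturates.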

  17. Manifestation of Coupled Geometric Complexity in Urban Road Networks under Mono-Centric Assumption

    CERN Document Server

    Peiravian, Farideddin

    2015-01-01

This article analyzes the complex geometry of urban transportation networks as a gateway to understanding their encompassing urban systems. Using a proposed ring-buffer approach and applying it to 50 urban areas in the United States, we measure road lengths in concentric rings from carefully-selected urban centers and study how the trends evolve as we move away from these centers. Overall, we find that the complexity of urban transportation networks is naturally coupled, consisting of two distinct patterns: (1) a fractal component (i.e., power law) that represents a uniform grid, and (2) a second component that can be exponential, power law, or logarithmic, which captures changes in road density. From this second component, we introduce two new indices, density index and decay index, which jointly capture essential characteristics of urban systems and therefore can help us gain new insights into how cities evolve.
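The coupled two-component model described above can be sketched with synthetic data, assuming the density-decay component takes the exponential form mentioned in the abstract, i.e. ring road length L(r) = c · r^α · exp(−βr); taking logs makes the fit linear. All data and parameter values here are invented for illustration.

```python
import numpy as np

# Synthetic, noiseless ring road lengths following the assumed coupled form
r = np.arange(1.0, 51.0)                  # ring index (distance from the center)
L = 120.0 * r**1.0 * np.exp(-0.04 * r)    # power-law grid part times exponential decay

# log L = log c + alpha*log r - beta*r : a linear least-squares problem
A = np.column_stack([np.ones_like(r), np.log(r), -r])
coef, *_ = np.linalg.lstsq(A, np.log(L), rcond=None)
c, alpha, beta = np.exp(coef[0]), coef[1], coef[2]   # recovers 120, 1.0, 0.04
```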

  18. The Perceptions of High-Level Officers in Cyprus about Intercultural Education and Their Underlying Assumptions

    Science.gov (United States)

    Hajisoteriou, Christina; Neophytou, Lefkios; Angelides, Panayiotis

    2015-01-01

    Since 2004, the Ministry of Education and Culture in Cyprus has launched an educational reform. The Ministry highlighted Cyprus' participation in the European context and, by extension, the turning-into-multicultural character of the Cypriot society as the most important reasons, which necessitated such a reform. This paper seeks to examine the…

  19. Empirical Tests of the Assumptions Underlying Models for Foreign Exchange Rates.

    Science.gov (United States)

    1984-03-01

Martinengo (1980) extends a model by Dornbusch (1976) in which market equilibrium is formalized in terms of interest rates, level of prices, public... 55-65. Dornbusch, R., "The Theory of Flexible Exchange Rate Regimes and Macroeconomic Policy", Scandinavian Journal of Economics, 78, 1976, pp. 255-

  20. 77 FR 15188 - Agency Information Collection Activity Under OMB Review: Application for Assumption Approval and...

    Science.gov (United States)

    2012-03-14

    ... INFORMATION CONTACT: Denise McLamb, Enterprise Records Service (005R1B), Department of Veterans Affairs, 810 Vermont Avenue NW., Washington, DC 20420, (202) 632-7479, fax (202) 632-7583 or email denise.mclamb@va.gov... of the Secretary. Denise McLamb, Program Analyst, Enterprise Records Service. BILLING CODE 8320-01-P...

  1. Binary Biometrics: An Analytic Framework to Estimate the Bit Error Probability under Gaussian Assumption

    NARCIS (Netherlands)

    Kelkboom, E.J.C.; Molina, G.; Kevenaar, T.A.M.; Veldhuis, R.N.J.; Jonker, W.

    2008-01-01

In recent years the protection of biometric data has gained increased interest from the scientific community. Methods such as the helper data system, fuzzy extractors, fuzzy vault and cancellable biometrics have been proposed for protecting biometric data. Most of these methods use cryptographic primitives…

  2. Complete Moment Convergence of Weighted Sums for Processes under Asymptotically almost Negatively Associated Assumptions

    Indian Academy of Sciences (India)

    Jun An

    2014-05-01

For weighted sums of sequences of asymptotically almost negatively associated (AANA) random variables, we study the complete moment convergence by using Rosenthal-type moment inequalities. Our results extend the corresponding ones for sequences of independent and identically distributed random variables of Chow [4].
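For orientation, the prototype of the Rosenthal-type bounds mentioned above is the classical Rosenthal inequality for independent, mean-zero random variables; the AANA versions used in the paper carry additional constants depending on the association coefficients, but the shape is the same:

```latex
% Classical Rosenthal inequality (independent, mean-zero X_1, ..., X_n; p >= 2):
\mathbb{E}\Bigl|\sum_{i=1}^{n} X_i\Bigr|^{p}
  \le C_p \Biggl\{ \sum_{i=1}^{n} \mathbb{E}\lvert X_i\rvert^{p}
  + \Bigl( \sum_{i=1}^{n} \mathbb{E} X_i^{2} \Bigr)^{p/2} \Biggr\}
```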

  3. Binary Biometrics: An Analytic Framework to Estimate the Performance Curves Under Gaussian Assumption

    NARCIS (Netherlands)

    Kelkboom, Emile J.C.; Garcia Molina, Gary; Breebaart, Jeroen; Veldhuis, Raymond N.J.; Kevenaar, Tom A.M.; Jonker, Willem

    2010-01-01

In recent years, the protection of biometric data has gained increased interest from the scientific community. Methods such as the fuzzy commitment scheme, helper-data system, fuzzy extractors, fuzzy vault, and cancelable biometrics have been proposed for protecting biometric data. Most of these methods…

  4. Uniqueness Results for Second Order Bellman-Isaacs Equations under Quadratic Growth Assumptions and Applications

    CERN Document Server

    Da Lio, Francesca; 10.1137/S0363012904440897

    2010-01-01

In this paper, we prove a comparison result between semicontinuous viscosity sub- and supersolutions, growing at most quadratically, of second-order degenerate parabolic Hamilton-Jacobi-Bellman and Isaacs equations. As an application, we characterize the value function of a finite horizon stochastic control problem with unbounded controls as the unique viscosity solution of the corresponding dynamic programming equation.

  6. Close-Form Pricing of Benchmark Equity Default Swaps Under the CEV Assumption

    NARCIS (Netherlands)

    Campi, L.; Sbuelz, A.

    2005-01-01

Equity Default Swaps are new equity derivatives designed as a product for credit investors. Equipped with a novel pricing result, we provide closed-form values that give an analytic contribution to the viability of cross-asset trading related to credit risk.

  7. An Analysis of the Assumptions Underlying the Taxonomy of Educational Objectives: Cognitive Domain

    Science.gov (United States)

    Stedman, Carlton H.

    1973-01-01

    Presents information gained from a study designed to evaluate the effectiveness of providing students with behavioral objectives, with a secondary purpose being to evaluate achievement results across the first four levels of Bloom's taxonomy of educational objectives. No significant differences were found between knowledge and comprehension or…

  8. Assessing moderated mediation in linear models requires fewer confounding assumptions than assessing mediation.

    Science.gov (United States)

    Loeys, Tom; Talloen, Wouter; Goubert, Liesbet; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2016-11-01

It is well known from the mediation analysis literature that the identification of direct and indirect effects relies on strong assumptions of no unmeasured confounding. Even in randomized studies the mediator may still be correlated with unobserved prognostic variables that affect the outcome, in which case the mediator's role in the causal process may not be inferred without bias. In the behavioural and social science literature very little attention has been given so far to the causal assumptions required for moderated mediation analysis. In this paper we focus on the index for moderated mediation, which measures by how much the mediated effect is larger or smaller for varying levels of the moderator. We show that in linear models this index can be estimated without bias in the presence of unmeasured common causes of the moderator, mediator and outcome under certain conditions. Importantly, one can thus use the test for moderated mediation to support evidence for mediation under less stringent confounding conditions. We illustrate our findings with data from a randomized experiment assessing the impact of being primed with social deception upon observer responses to others' pain, and from an observational study of individuals who ended a romantic relationship assessing the effect of attachment anxiety during the relationship on mental distress 2 years after the break-up.
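To make the linear-model setting concrete, here is a hypothetical simulation (the data and coefficient values are invented, not from the paper) in which the index for moderated mediation is the product a3·b of the treatment-by-moderator coefficient in the mediator model and the mediator coefficient in the outcome model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
X = rng.integers(0, 2, n).astype(float)   # randomised treatment
W = rng.normal(size=n)                    # moderator
# Simulated data: the true index of moderated mediation is a3*b = 0.5*0.8 = 0.4
M = 0.3 + 0.4 * X + 0.2 * W + 0.5 * X * W + rng.normal(size=n)   # mediator
Y = 0.1 + 0.6 * X + 0.8 * M + 0.3 * W + rng.normal(size=n)       # outcome

def ols(y, predictors):
    """Least-squares coefficients, intercept prepended."""
    A = np.column_stack([np.ones(len(y))] + predictors)
    return np.linalg.lstsq(A, y, rcond=None)[0]

a = ols(M, [X, W, X * W])   # mediator model: [a0, a1, a2, a3]
b = ols(Y, [X, M, W])       # outcome model:  [b0, c', b, b2]
index = a[3] * b[2]         # estimated index of moderated mediation (close to 0.4)
```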

  9. The Impact of Modeling Assumptions in Galactic Chemical Evolution Models

    Science.gov (United States)

    Côté, Benoit; O’Shea, Brian W.; Ritter, Christian; Herwig, Falk; Venn, Kim A.

    2017-02-01

We use the OMEGA galactic chemical evolution code to investigate how the assumptions used for the treatment of galactic inflows and outflows impact numerical predictions. The goal is to determine how our capacity to reproduce the chemical evolution trends of a galaxy is affected by the choice of implementation used to include those physical processes. In pursuit of this goal, we experiment with three different prescriptions for galactic inflows and outflows and use OMEGA within a Markov Chain Monte Carlo code to recover the set of input parameters that best reproduces the chemical evolution of nine elements in the dwarf spheroidal galaxy Sculptor. This provides a consistent framework for comparing the best-fit solutions generated by our different models. Despite their different degrees of intended physical realism, we found that all three prescriptions can reproduce in an almost identical way the stellar abundance trends observed in Sculptor. This result supports the similar conclusions originally claimed by Romano & Starkenburg for Sculptor. While the three models have the same capacity to fit the data, the best values recovered for the parameters controlling the number of SNe Ia and the strength of galactic outflows are substantially different, and in fact mutually exclusive from one model to another. For the purpose of understanding how a galaxy evolves, we conclude that reproducing the evolution of only a limited number of elements is insufficient and can lead to misleading conclusions. More elements or additional constraints such as the galaxy's star-formation efficiency and the gas fraction are needed in order to break the degeneracy between the different modeling assumptions. Our results show that the successes and failures of chemical evolution models are predominantly driven by the input stellar yields, rather than by the complexity of the galaxy model itself. Simple models such as OMEGA are therefore sufficient to test and validate stellar yields.

  10. PKreport: report generation for checking population pharmacokinetic model assumptions

    Directory of Open Access Journals (Sweden)

    Li Jun

    2011-05-01

Full Text Available Abstract Background Graphics play an important and unique role in population pharmacokinetic (PopPK) model building by exploring hidden structure among data before modeling, evaluating model fit, and validating results after modeling. Results The work described in this paper is about a new R package called PKreport, which is able to generate a collection of plots and statistics for testing model assumptions, visualizing data and diagnosing models. The metric system is utilized as the currency for communicating between data sets and the package to generate special-purpose plots. It provides ways to match output from diverse software such as NONMEM, Monolix, the R nlme package, etc. The package is implemented with an S4 class hierarchy, and offers an efficient way to access the output from NONMEM 7. The final reports take advantage of the web browser as a user interface to manage and visualize plots. Conclusions PKreport provides (1) a flexible and efficient R class to store and retrieve NONMEM 7 output, (2) automated plots for users to visualize data and models, (3) automatically generated R scripts that are used to create the plots, (4) an archive-oriented management tool for users to store, retrieve and modify figures, and (5) high-quality graphs based on the R packages lattice and ggplot2. The general architecture, running environment and statistical methods can be readily extended with the R class hierarchy. PKreport is free to download at http://cran.r-project.org/web/packages/PKreport/index.html.

  11. Cleanup of contaminated soil -- Unreal risk assumptions: Contaminant degradation

    Energy Technology Data Exchange (ETDEWEB)

    Schiffman, A. [New Jersey Department of Environmental Protection, Ewing, NJ (United States)

    1995-12-31

    Exposure assessments for development of risk-based soil cleanup standards or criteria assume that contaminant mass in soil is infinite and conservative (constant concentration). This assumption is not real for most organic chemicals. Contaminant mass is lost from soil and ground water when organic chemicals degrade. Factors to correct for chemical mass lost by degradation are derived from first-order kinetics for 85 organic chemicals commonly listed by USEPA and state agencies. Soil cleanup criteria, based on constant concentration, are then corrected for contaminant mass lost. For many chemicals, accounting for mass lost yields large correction factors to risk-based soil concentrations. For degradation in ground water and soil, correction factors range from greater than one to several orders of magnitude. The long exposure durations normally used in exposure assessments (25 to 70 years) result in large correction factors to standards even for carcinogenic chemicals with long half-lives. For the ground water pathway, a typical soil criterion for TCE of 1 mg/kg would be corrected to 11 mg/kg. For noncarcinogens, correcting for mass lost means that risk algorithms used to set soil cleanup requirements are inapplicable for many chemicals, especially for long periods of exposure.
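A minimal sketch of the first-order correction described above, under the assumption that the factor is the ratio of the assumed constant concentration to the time-averaged concentration over the exposure duration (the exact regulatory formula may differ):

```python
import math

def correction_factor(half_life_years, exposure_years):
    """Multiplier applied to a constant-concentration soil criterion when
    first-order degradation is credited. Assumed form: the ratio of the
    constant initial concentration C0 to the time-averaged concentration
    C0 * (1 - exp(-k*T)) / (k*T) over the exposure duration T."""
    k = math.log(2) / half_life_years          # first-order rate constant
    kt = k * exposure_years
    return kt / (1.0 - math.exp(-kt))

# Illustrative only: a chemical with a 2-year half-life over a 30-year
# exposure earns roughly a 10-fold correction.
print(round(correction_factor(2.0, 30.0), 1))  # → 10.4
```

As the abstract notes, long exposure durations yield large factors even for long half-lives, since the factor grows roughly linearly in k·T once k·T exceeds a few.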

  12. Observing gravitational-wave transient GW150914 with minimal assumptions

    Science.gov (United States)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blackburn, L.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. 
D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chatterji, S.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y.; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Clark, M.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.; DeRosa, R. T.; De Rosa, R.; DeSalvo, R.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. 
C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. P.; Flaminio, R.; Fletcher, M.; Fournier, J.-D.; Franco, S.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Haas, R.; Hacker, J. J.

    2016-06-01

    The gravitational-wave signal GW150914 was first identified on September 14, 2015, by searches for short-duration gravitational-wave transients. These searches identify time-correlated transients in multiple detectors with minimal assumptions about the signal morphology, allowing them to be sensitive to gravitational waves emitted by a wide range of sources including binary black hole mergers. Over the observational period from September 12 to October 20, 2015, these transient searches were sensitive to binary black hole mergers similar to GW150914 to an average distance of ˜600 Mpc . In this paper, we describe the analyses that first detected GW150914 as well as the parameter estimation and waveform reconstruction techniques that initially identified GW150914 as the merger of two black holes. We find that the reconstructed waveform is consistent with the signal from a binary black hole merger with a chirp mass of ˜30 M⊙ and a total mass before merger of ˜70 M⊙ in the detector frame.

  13. Basic Electromagnetism and Materials

    CERN Document Server

    Moliton, André

    2007-01-01

    Basic Electromagnetism and Materials is the product of many years of teaching basic and applied electromagnetism. This textbook can be used to teach electromagnetism to a wide range of undergraduate science majors in physics, electrical engineering or materials science. However, by making lesser demands on mathematical knowledge than competing texts, and by emphasizing electromagnetic properties of materials and their applications, this textbook is uniquely suited to students of materials science. Many competing texts focus on the study of propagation waves either in the microwave or optical domain, whereas Basic Electromagnetism and Materials covers the entire electromagnetic domain and the physical response of materials to these waves. Professor André Moliton is Director of the Unité de Microélectronique, Optoélectronique et Polymères (Université de Limoges, France), which brings together three groups studying the optoelectronics of molecular and polymer layers, micro-optoelectronic systems for teleco...

  14. Nuclear multifragmentation: Basic concepts

    Indian Academy of Sciences (India)

    G Chaudhuri; S Mallik; S Das Gupta

    2014-05-01

    We present a brief overview of nuclear multifragmentation reaction. Basic formalism of canonical thermodynamical model based on equilibrium statistical mechanics is described. This model is used to calculate basic observables of nuclear multifragmentation like mass distribution, fragment multiplicity, isotopic distribution and isoscaling. Extension of canonical thermodynamical model to a projectile fragmentation model is outlined. Application of the projectile fragmentation model for calculating average number of intermediate mass fragments and the average size of the largest cluster at different bound, differential charge distribution and cross-section of neutron-rich nuclei of different projectile fragmentation reactions at different energies are described. Application of nuclear multifragmentation reaction in basic research as well as in other domains is outlined.

  15. Decontamination: back to basics.

    Science.gov (United States)

    Meredith, Susan J; Sjorgen, Geoff

    2008-07-01

My invitation from this Journal's Editor, Felicia Cox, to provide a paper for this themed issue included the sentence 'I was wondering if you or a colleague would like to contribute a back to basics article on the relevant standards and guidelines for decontamination, including what is compliance?'. The reason this is so interesting to me is that the term 'back to basics' implies reverting to a simpler time in life - when, by just sticking to the rules, life became easier. However, with decontamination this is not actually true.

  16. Comprehensive basic mathematics

    CERN Document Server

    Veena, GR

    2005-01-01

Salient Features As per II PUC Basic Mathematics syllabus of Karnataka. Provides an introduction to various basic mathematical techniques and the situations where these could be usefully employed. The language is simple and the material is self-explanatory with a large number of illustrations. Assists the reader in gaining proficiency to solve a diverse variety of problems. A special capsule containing a gist and list of formulae titled "REMEMBER!". Additional chapterwise arranged question bank and 3 model papers in a separate section, "EXAMINATION CORNER".

  17. Basic set theory

    CERN Document Server

    Levy, Azriel

    2002-01-01

An advanced-level treatment of the basics of set theory, this text offers students a firm foundation, stopping just short of the areas employing model-theoretic methods. Geared toward upper-level undergraduate and graduate students, it consists of two parts: the first covers pure set theory, including the basic notions, order and well-foundedness, cardinal numbers, the ordinals, and the axiom of choice and some of its consequences; the second deals with applications and advanced topics such as point set topology, real spaces, Boolean algebras, and infinite combinatorics and large cardinals. An

  18. Basic research championed

    Science.gov (United States)

    Friebele, Elaine

    In April, the Office of National Science and Technology Policy released its biennial report to Congress. Science and Technology: Shaping the Twenty-First Century addresses the President's policy for maintaining U.S. leadership in science and technology, significant developments, and important national issues in science, and opportunities to use science and technology in federal programs and national goals. The administration strongly supports basic research as a sound investment and an inspiration to society. As corporate laboratories increasingly favor applied R&D projects, the federal government is becoming the dominant sponsor of long-term, basic research.

  19. Basic properties of semiconductors

    CERN Document Server

    Landsberg, PT

    2013-01-01

    Since Volume 1 was published in 1982, the centres of interest in the basic physics of semiconductors have shifted. Volume 1 was called Band Theory and Transport Properties in the first edition, but the subject has broadened to such an extent that Basic Properties is now a more suitable title. Seven chapters have been rewritten by the original authors. However, twelve chapters are essentially new, with the bulk of this work being devoted to important current topics which give this volume an almost encyclopaedic form. The first three chapters discuss various aspects of modern band theory and the

  20. Basic Financial Accounting

    DEFF Research Database (Denmark)

    Wiborg, Karsten

This textbook on Basic Financial Accounting is targeted at students in economics studies at universities and business colleges taking an introductory subject in the external dimension of the company's economic reporting, including bookkeeping, etc. The book includes the following subjects: business entities, the transformation process, types of businesses, stakeholders, legislation, the annual report, the VAT system, double-entry bookkeeping, inventories, and year-end cash flow analysis.

  1. A computational model to investigate assumptions in the headturn preference procedure

    Directory of Open Access Journals (Sweden)

    Christina eBergmann

    2013-10-01

Full Text Available In this paper we use a computational model to investigate four assumptions that are tacitly present in interpreting the results of studies on infants' speech processing abilities using the Headturn Preference Procedure (HPP): (1) behavioural differences originate in different processing; (2) processing involves some form of recognition; (3) words are segmented from connected speech; and (4) differences between infants should not affect overall results. In addition, we investigate the impact of two potentially important aspects of the design and execution of the experiments: (a) the specific voices used in the two parts of HPP experiments (familiarisation and test) and (b) the experimenter's criterion for what is a sufficient headturn angle. The model is designed to maximise cognitive plausibility. It takes real speech as input, and it contains a module that converts the output of internal speech processing and recognition into headturns that can yield real-time listening preference measurements. Internal processing is based on distributed episodic representations in combination with a matching procedure based on the assumption that complex episodes can be decomposed as positive weighted sums of simpler constituents. Model simulations show that the first assumptions hold under two different definitions of recognition. However, explicit segmentation is not necessary to simulate the behaviours observed in infant studies. Differences in attention span between infants can affect the outcomes of an experiment. The same holds for the experimenter's decision criterion. The speakers used in experiments affect outcomes in complex ways that require further investigation. The paper ends with recommendations for future studies using the HPP.
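The decomposition assumption in the matching procedure above — a complex episode approximated as a positive weighted sum of simpler constituents — can be sketched as a non-negative least-squares problem. The representations below are random vectors, purely illustrative, not the model's actual speech features.

```python
import numpy as np
from scipy.optimize import nnls

# Five stored "simple" episodes, each a 30-dimensional representation
rng = np.random.default_rng(42)
constituents = rng.random((30, 5))
true_weights = np.array([0.7, 0.0, 1.3, 0.2, 0.0])   # non-negative by assumption
episode = constituents @ true_weights                # a complex episode built from them

# Recover the positive weights by non-negative least squares
weights, residual = nnls(constituents, episode)
```

In the noiseless case the weights are recovered exactly, with residual near zero; with real speech input the residual would measure how well a stored vocabulary explains a new episode.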

  2. Ethanol Basics (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2015-01-01

    Ethanol is a widely-used, domestically-produced renewable fuel made from corn and other plant materials. More than 96% of gasoline sold in the United States contains ethanol. Learn more about this alternative fuel in the Ethanol Basics Fact Sheet, produced by the U.S. Department of Energy's Clean Cities program.

  3. Korean Basic Course.

    Science.gov (United States)

    Defense Language Inst., Washington, DC.

    These 11 volumes of the Korean Basic Course comprise 112 lesson units designed to train native English language speakers to Level 3 proficiency in comprehension and speaking and Level 2 proficiency in reading and writing Korean. (Level 5 on this scale is native-speaker level.) Intended for classroom use in the Defense Language Institute intensive…

  4. Basic physics for all

    CERN Document Server

    Kumar, B N

    2012-01-01

This is a simple, concise book for both physics and non-physics students, presenting basic facts in straightforward form and conveying fundamental principles and theories of physics. This book will be helpful as a supplement to class teaching and will aid those who have difficulty in mastering concepts and principles.

  5. Vaccine Basics (Smallpox)

    Science.gov (United States)


  6. FULA BASIC COURSE.

    Science.gov (United States)

    SWIFT, LLOYD B.; AND OTHERS

    THIS BEGINNING COURSE IS AN INTRODUCTION TO FULA (KNOWN VARIOUSLY AS FULANI, FUL, PEUL, OR PHEUL), A NIGER-CONGO LANGUAGE SPOKEN THROUGHOUT THE GRASSLAND AREAS OF WEST AFRICA FROM THE ATLANTIC TO CAMEROUN. THE TEXT IS ONE OF A SERIES OF SHORT BASIC COURSES IN SELECTED AFRICAN LANGUAGES BEING PREPARED BY THE FOREIGN SERVICE INSTITUTE. IT IS…

  7. Basic Library List.

    Science.gov (United States)

    Duren, William L., Jr.

    Reported is an initial attempt to define a minimal college mathematics library. Included is a list of some 300 books, from which approximately 170 are to be chosen to form a basic library in undergraduate mathematics. The areas provided for in this list include Algebra, Analysis, Applied Mathematics, Geometry, Topology, Logic, Foundations and Set…

  8. Lippincott Basic Reading Program.

    Science.gov (United States)

    Monterey Peninsula Unified School District, Monterey, CA.

    This program, included in "Effective Reading Programs...," serves 459 students in grades 1-3 at 15 elementary schools. The program employs a diagnostic-prescriptive approach to instruction in a nongraded setting through the use of the Lippincott Basic Reading program. When a child enters the program, he is introduced to a decoding…

  9. Basic Drafting: Book Two.

    Science.gov (United States)

    Davis, Ronald; And Others

    The second of a two-book course in drafting, this manual consists of 12 topics in the following units: sketching techniques, geometric constructions, orthographic views, dimensioning procedures, basic tolerancing, auxiliary views, sectional views, inking tools and techniques, axonometrics, oblique, perspective, and computer-aided drafting.…

  10. Health Insurance Basics

    Science.gov (United States)

    ... members at a lower cost. The four basic types of managed care plans are: HMO (Health Maintenance Organization). When you join an HMO, you choose a ... may have to pay more. EPO (Exclusive Provider Organization). An EPO is like a PPO, only ... Health Plan (CDHP) This type of plan is fairly new. It lets you ...

  11. Basic bioreactor design.

    NARCIS (Netherlands)

    Riet, van 't K.; Tramper, J.

    1991-01-01

    Based on a graduate course in biochemical engineering, provides the basic knowledge needed for the efficient design of bioreactors and the relevant principles and data for practical process engineering, with an emphasis on enzyme reactors and aerated reactors for microorganisms. Includes exercises.

  12. Basic Nuclear Physics.

    Science.gov (United States)

    Bureau of Naval Personnel, Washington, DC.

    Basic concepts of nuclear structures, radiation, nuclear reactions, and health physics are presented in this text, prepared for naval officers. Applications to the area of nuclear power are described in connection with pressurized water reactors, experimental boiling water reactors, homogeneous reactor experiments, and experimental breeder…

  13. Canadian Adult Basic Education.

    Science.gov (United States)

    Brooke, W. Michael, Comp.

    "Trends," a publication of the Canadian Association for Adult Education, is a collection of abstracts on selected subjects affecting adult education; this issue is on adult basic education (ABE). It covers teachers and teacher training, psychological factors relating to the ABE teacher and students, manuals for teachers, instructional…

  14. Basic Microfluidics Theory

    DEFF Research Database (Denmark)

    Svendsen, Winnie Edith

    2015-01-01

    ,000 m−1, which is a huge difference and has a large impact on flow behavior. In this chapter the basic microfluidic theory will be presented, enabling the reader to gain a comprehensive understanding of how liquids behave at the microscale, enough to be able to engage in design of micro systems...
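A hedged illustration of the scale effect discussed in this chapter (the numbers are assumed, not taken from the text): the Reynolds number Re = ρvL/μ falls by several orders of magnitude at the microscale, which is why microfluidic flow stays laminar:

```python
def reynolds(rho, velocity, length, viscosity):
    """Reynolds number: ratio of inertial to viscous forces in a flow."""
    return rho * velocity * length / viscosity

# Water (rho = 1000 kg/m^3, mu = 1e-3 Pa s) in two very different channels.
micro = reynolds(1000.0, 1e-3, 100e-6, 1e-3)  # 100-micron channel at 1 mm/s
macro = reynolds(1000.0, 1.0, 1e-2, 1e-3)     # 1-cm pipe at 1 m/s
print(micro, macro)  # far below vs. far above the ~2300 laminar-turbulent threshold
```

At Re of order 0.1, viscous forces dominate completely: mixing happens only by diffusion, a central design constraint for micro systems.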

  15. Basic Tuberculosis Facts

    Centers for Disease Control (CDC) Podcasts

    2012-03-12

    In this podcast, Dr. Kenneth Castro, Director of the Division of Tuberculosis Elimination, discusses basic TB prevention, testing, and treatment information.  Created: 3/12/2012 by National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention (NCHHSTP).   Date Released: 3/12/2012.

  16. Adopting a Learning Community in a Junior High School under the 12-Year Basic Education System

    Directory of Open Access Journals (Sweden)

    薛雅慈(曉華) Ya-Ci (Hsiao-Hua Selena) Hsueh

    2014-03-01

    of learning through a learning community, and numerous schools have participated in this learning community program. The traditional style of the teacher speaking while students listen is expected to change. In this qualitative study, student experiences and how they changed under the guidance of a learning community were investigated by conducting interviews, and potential problems in the learning method were identified. Five teachers from a junior high school who had adopted the learning community method in their classes participated in this study. The results of positivist analysis indicate that the implementation of a learning community is expected to be a valuable educational method under the 12-Year Basic Education system. Both the researcher and the teachers observed changes in student learning caused by the use of various teaching strategies. Six crucial findings were derived from this research. (1) The methods used by junior high school teachers for promoting collaborative learning in their classes are comprehensive and diversified. (2) Based on the learning community proposed by Professor Manabu Sato, the most widely used method in practice among junior high school teachers is collaborative learning. (3) The collaborative learning technique used by junior high school teachers is typically cooperative learning, which focuses on group discussion and expression rather than on listening, connecting, and referring to the text, as argued by Sato. (4) For junior high school students, the greatest benefit of collaborative learning is the cultivation of motivation and teamwork. (5) Lower-achieving students who were previously unacquainted with their classmates attained achievements through collaborative learning. (6) Overall, the teachers enhanced student learning and changed the learning style of the students in a positive manner.

  17. School Principals' Assumptions about Human Nature: Implications for Leadership in Turkey

    Science.gov (United States)

    Sabanci, Ali

    2008-01-01

    This article considers principals' assumptions about human nature in Turkey and the relationship between the assumptions held and the leadership style adopted in schools. The findings show that school principals hold Y-type assumptions and prefer a relationship-oriented style in their relations with assistant principals. However, both principals…

  18. The necessary distinction between methodology and philosophical assumptions in healthcare research.

    Science.gov (United States)

    Mesel, Terje

    2013-09-01

    Methodological discussions within healthcare research have traditionally described a methodological dichotomy between qualitative and quantitative methods. The aim of this article is to demonstrate that such a dichotomy presents unnecessary obstacles for good research design and is methodologically and philosophically unsustainable. The issue of incommensurability is not a question of method but rather a question of the philosophical premises underpinning a given method. Thus, transparency on the philosophical level is important for validity and consistency as well as for attempts to integrate or establish an interface to other research. I argue that it is necessary to make a distinction between methodology and philosophical assumptions and to ensure consistency in these correlations. Furthermore, I argue that the question of incommensurability is best answered at this basic philosophical level. The complexity of health care calls for methodological pluralism and creativity that utilises the strength of both qualitative and quantitative approaches. Transparency and consistency on the philosophical level can facilitate new mixed methods research designs that may be promising methodological assets for healthcare research. I believe we are ill served by fortified positions that continue to uphold old battle lines. Empirical research begins in the field of practice and requires a certain amount of pragmatism. However, this pragmatism must be philosophically informed.

  19. Indoor Slope and Edge Detection by using Two-Dimensional EKF-SLAM with Orthogonal Assumption

    Directory of Open Access Journals (Sweden)

    Jixin Lv

    2015-04-01

    Full Text Available In an indoor environment, slope and edge detection is an important problem in simultaneous localization and mapping (SLAM), which is a basic requirement for mobile robot autonomous navigation. Slope detection allows the robot to find areas that are more traversable, while edge detection can prevent the robot from falling. Three-dimensional (3D) solutions usually require a large memory and high computational costs. This study proposes an efficient two-dimensional (2D) solution to combine slope and edge detection with a line-segment-based extended Kalman filter SLAM (EKF-SLAM) in a structured indoor area. The robot is designed to use two fixed 2D laser range finders (LRFs) to perform horizontal and vertical scans. With a local-area orthogonal assumption, the slopes and edges are swiftly modelled into line segments from each vertical scan and then merged into the EKF-SLAM framework. The EKF-SLAM framework features an optional prediction model that can automatically decide whether the application of iterative closest point (ICP) is necessary to compensate for the dead-reckoning error. The experimental results demonstrate that the proposed algorithm is capable of swiftly building an accurate 2D map that contains crucial information about edges and slopes.
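The predict/update cycle at the core of an EKF-SLAM framework such as the one above can be sketched in a few lines. This is a generic, hedged illustration with an assumed linear motion model, an identity measurement model, and made-up noise values; it is not the authors' line-segment implementation:

```python
import numpy as np

def ekf_predict(x, P, u, F, Q):
    """Dead-reckoning prediction step: x' = F x + u, covariance grows by Q."""
    x = F @ x + u
    P = F @ P @ F.T + Q
    return x, P

def ekf_update(x, P, z, H, R):
    """Correction step: fuse a measurement z = H x + noise into the estimate."""
    y = z - H @ x                   # innovation
    S = H @ P @ H.T + R             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy run: a 2D robot position with a direct (noisy) position measurement.
x, P = np.zeros(2), np.eye(2)
F, Q = np.eye(2), 0.01 * np.eye(2)
H, R = np.eye(2), 0.1 * np.eye(2)
x, P = ekf_predict(x, P, u=np.array([1.0, 0.0]), F=F, Q=Q)    # move 1 m in x
x, P = ekf_update(x, P, z=np.array([1.05, -0.02]), H=H, R=R)  # sensor correction
```

After the update the covariance trace shrinks, reflecting the information gained from the scan; a full EKF-SLAM additionally stacks the landmark (here, line-segment) parameters into the state vector.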

  20. Basic method of collecting the data of the crowds under evacuation and its application

    Institute of Scientific and Technical Information of China (English)

    葛晓霞

    2013-01-01

    The data-collection method is key to the accuracy of basic data on evacuating crowds. Based on the characteristics of crowd data and the needs of data acquisition, the paper specifies the conditions an observation site should satisfy and how the site should be laid out, explains how to select the best observation angle and period, and identifies the crowd-characteristic indicators that the basic data collected in different time periods should cover. The observation point should be placed high above the observed area, with an open view and adjustable height and breadth, so as to give a vantage perspective; a grid of 60 mm × 60 mm is recommended for dividing the field so that the data are easy to collect, sort out, and analyse promptly. From the video records obtained on site, a method of computing walking distance along each pedestrian's trajectory is proposed, and on this basis the data-processing methods for walking speed, crowd density, and passage coefficient are standardised. The method was applied to measure crowd speeds in the horizontal passages and on the stairs of a subway station; the results agree well with data reported in studies abroad.
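The walking-distance computation described in the abstract (summing straight-line segment lengths along a pedestrian's tracked trajectory, then dividing by the elapsed time) can be sketched as follows; the coordinates and timestamps are made-up samples, not data from the study:

```python
import math

def path_length(points):
    """Total walking distance in metres along a trajectory of (x, y) points."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def walking_speed(points, t_start, t_end):
    """Mean walking speed in m/s over the observation interval."""
    return path_length(points) / (t_end - t_start)

# Assumed sample: grid-tracked positions of one pedestrian over 2 seconds.
traj = [(0.0, 0.0), (0.6, 0.0), (1.2, 0.6), (1.8, 0.6)]  # metres
print(round(walking_speed(traj, 0.0, 2.0), 3), "m/s")    # about 1.02 m/s
```

Crowd density and passage coefficient follow the same pattern: counts per grid cell area, and flow per unit width per unit time, respectively.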

  1. Basic electronic circuits

    CERN Document Server

    Buckley, P M

    1980-01-01

    In the past, the teaching of electricity and electronics has more often than not been carried out from a theoretical and often highly academic standpoint. Fundamentals and basic concepts have often been presented with no indication of their practical applications, and all too frequently they have been illustrated by artificially contrived laboratory experiments bearing little relationship to the outside world. The course comes in the form of fourteen fairly open-ended constructional experiments or projects. Each experiment has associated with it a construction exercise and an explanation. The basic idea behind this dual presentation is that the student can embark on each circuit following only the briefest possible instructions, and that an open-ended approach is thereby not prejudiced by an initial lengthy encounter with the theory behind the project; this being a sure way to dampen enthusiasm at the outset. As the investigation progresses, questions inevitably arise. Descriptions of the phenomena encounte...

  2. Basics of statistical physics

    CERN Document Server

    Müller-Kirsten, Harald J W

    2013-01-01

    Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
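The three ways of counting arrangements named above yield the familiar mean occupation numbers. A small hedged illustration (not from the book; units chosen so that k_B = 1) shows the Fermi-Dirac and Bose-Einstein occupancies collapsing onto the Maxwell-Boltzmann factor once ε − μ is large compared with T:

```python
import math

def occupancy(eps, mu, T, kind):
    """Mean occupation of a single-particle state of energy eps (k_B = 1)."""
    x = (eps - mu) / T
    if kind == "maxwell-boltzmann":
        return math.exp(-x)
    if kind == "fermi-dirac":
        return 1.0 / (math.exp(x) + 1.0)
    if kind == "bose-einstein":
        return 1.0 / (math.exp(x) - 1.0)   # requires eps > mu
    raise ValueError(kind)

# At eps - mu = 10 T the quantum corrections are already negligible.
for kind in ("maxwell-boltzmann", "fermi-dirac", "bose-einstein"):
    print(kind, occupancy(10.0, 0.0, 1.0, kind))
```

The ±1 in the denominators is exactly the indistinguishability difference the book discusses; at the Fermi level (eps = mu) the Fermi-Dirac occupancy is exactly 1/2.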

  3. Basic linear algebra

    CERN Document Server

    Blyth, T S

    2002-01-01

    Basic Linear Algebra is a text for first year students leading from concrete examples to abstract theorems, via tutorial-type exercises. More exercises (of the kind a student may expect in examination papers) are grouped at the end of each section. The book covers the most important basics of any first course on linear algebra, explaining the algebra of matrices with applications to analytic geometry, systems of linear equations, difference equations and complex numbers. Linear equations are treated via Hermite normal forms which provides a successful and concrete explanation of the notion of linear independence. Another important highlight is the connection between linear mappings and matrices leading to the change of basis theorem which opens the door to the notion of similarity. This new and revised edition features additional exercises and coverage of Cramer's rule (omitted from the first edition). However, it is the new, extra chapter on computer assistance that will be of particular interest to readers:...
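Cramer's rule, covered in this revised edition, can be sketched as follows. This is an illustrative implementation for small systems only (numerical practice prefers elimination, since repeated determinants are costly and less stable):

```python
import numpy as np

def cramer(A, b):
    """Solve the square system A x = b via Cramer's rule:
    x_i = det(A_i) / det(A), where A_i is A with column i replaced by b."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    d = np.linalg.det(A)
    if abs(d) < 1e-12:
        raise ValueError("singular matrix: Cramer's rule does not apply")
    x = np.empty(len(b))
    for i in range(len(b)):
        Ai = A.copy()
        Ai[:, i] = b          # replace column i with the right-hand side
        x[i] = np.linalg.det(Ai) / d
    return x

# 2x2 example: x + 2y = 5, 3x + 4y = 11, whose solution is x = 1, y = 2.
print(cramer([[1, 2], [3, 4]], [5, 11]))
```

A zero determinant signals exactly the failure of linear independence that the book develops via Hermite normal forms.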

  4. Basic Semiconductor Physics

    CERN Document Server

    Hamaguchi, Chihiro

    2010-01-01

    This book presents a detailed description of the basic semiconductor physics. The reader is assumed to have a basic command of mathematics and some elementary knowledge of solid state physics. The text covers a wide range of important phenomena in semiconductors, from the simple to the advanced. The reader can understand three different methods of energy band calculations, empirical pseudo-potential, k.p perturbation and tight-binding methods. The effective mass approximation and electron motion in a periodic potential, Boltzmann transport equation and deformation potentials used for full band Monte Carlo simulation are discussed. Experiments and theoretical analysis of cyclotron resonance are discussed in detail because the results are essential to the understanding of semiconductor physics. Optical and transport properties, magneto-transport, two dimensional electron gas transport (HEMT and MOSFET), and quantum transport are reviewed, explaining optical transition, electron phonon interactions, electron mob...

  5. Basics of RF electronics

    CERN Document Server

    Gallo, A

    2011-01-01

    RF electronics deals with the generation, acquisition and manipulation of high-frequency signals. In particle accelerators signals of this kind are abundant, especially in the RF and beam diagnostics systems. In modern machines the complexity of the electronics assemblies dedicated to RF manipulation, beam diagnostics, and feedbacks is continuously increasing, following the demands for improvement of accelerator performance. However, these systems, and in particular their front-ends and back-ends, still rely on well-established basic hardware components and techniques, while down-converted and acquired signals are digitally processed exploiting the rapidly growing computational capability offered by the available technology. This lecture reviews the operational principles of the basic building blocks used for the treatment of high-frequency signals. Devices such as mixers, phase and amplitude detectors, modulators, filters, switches, directional couplers, oscillators, amplifiers, attenuators, and others are d...

  6. Basic plasma physics

    CERN Document Server

    Ghosh, Basudev

    2014-01-01

    Basic Plasma Physics is designed to serve as an introductory compact textbook for advanced undergraduate, postgraduate and research students taking plasma physics as one of their subjects of study for the first time. It covers the current syllabus of plasma physics offered by most universities and technical institutions. The book requires no background in plasma physics, only an elementary knowledge of basic physics and mathematics. Emphasis has been given to the analytical approach. Topics are developed from first principles so that students can learn through self-study. One chapter is devoted to practical aspects of plasma physics. Each chapter contains a good number of solved and unsolved problems and a variety of review questions, mostly taken from recent examination papers. Some classroom experiments described in the book will surely help students as well as instructors.

  7. Emulsion Science Basic Principles

    CERN Document Server

    Leal-Calderon, Fernando; Schmitt, Véronique

    2007-01-01

    Emulsions are generally made out of two immiscible fluids, like oil and water, one being dispersed in the other in the presence of surface-active compounds. They are used as intermediate or end products in a huge range of areas including the food, chemical, cosmetic, pharmaceutical, paint, and coating industries. Beyond this broad domain of technological interest, emulsions raise a variety of fundamental questions at the frontier between physics and chemistry. This book aims to give an overview of the most recent advances in emulsion science. The basic principles, covering aspects of emulsions from their preparation to their destruction, are presented in close relation to both the fundamental physics and the applications of these materials. The book is intended to help scientists and engineers in formulating new materials by giving them the basics of emulsion science.

  8. Analysis on the Rationality Assumption of Economics

    Institute of Scientific and Technical Information of China (English)

    李佩; 张宇

    2014-01-01

    Rationality has always been the most basic behavioral assumption in economics. Since the concept was born, however, economists have understood and interpreted it in many different ways. The framework of rationality should be defined by three elements: self-interest, optimization, and consistency of preferences. Toward the rationality assumption one should take a positivist stance and maintain the assumption; when theory contradicts reality, the theoretical model or the environmental assumptions should be extended cautiously and appropriately. The lower bound of such extension is preserving the internal consistency of rationality; the upper bound depends on the trade-off between the theory's generality and its realism.

  9. Menstrual Cycle: Basic Biology

    OpenAIRE

    2008-01-01

    The basic biology of the menstrual cycle is a complex, coordinated sequence of events involving the hypothalamus, anterior pituitary, ovary, and endometrium. The menstrual cycle with all its complexities can be easily perturbed by environmental factors such as stress, extreme exercise, eating disorders, and obesity. Furthermore, genetic influences such as fragile X premutations (Chapter X), X chromosome abnormalities (Chapter X), and galactose-1-phosphate uridyltransferase (GALT) point mutati...

  10. Risk communication basics

    Energy Technology Data Exchange (ETDEWEB)

    Corrado, P.G. [Lawrence Livermore National Laboratory, CA (United States)

    1995-12-31

    In low-trust, high-concern situations, 50% of your credibility comes from perceived empathy and caring, demonstrated in the first 30 s you come in contact with someone. There is no second chance for a first impression. These and other principles contained in this paper provide you with a basic level of understanding of risk communication. The principles identified are time-tested caveats and will assist you in effectively communicating technical information.

  11. Visual Basic educational programme

    OpenAIRE

    Pranaitis, Arūnas

    2005-01-01

    Visual Basic educational programme. Information Technologies have become such a popular subject that they are applied in all walks of life. However, they are still rarely used in school lessons. The reasons for this include: · an insufficient base of computers, · old software and its disadvantages, · the lack of computerized educational programmes. The aim of the work was to prove that it is relevant to create computerized educat...

  12. Basics of Computer Networking

    CERN Document Server

    Robertazzi, Thomas

    2012-01-01

    Springer Brief Basics of Computer Networking provides a non-mathematical introduction to the world of networks. This book covers both technology for wired and wireless networks. Coverage includes transmission media, local area networks, wide area networks, and network security. Written in a very accessible style for the interested layman by the author of a widely used textbook with many years of experience explaining concepts to the beginner.

  13. Thermodynamics - basic conception

    Energy Technology Data Exchange (ETDEWEB)

    Wee, Eul Bok

    1979-08-15

    This book presents the basic concepts of thermodynamics: the conditions and properties of matter; work and power; thermal efficiency; the principle of the conservation of energy; the relationship between work and heat; enthalpy; Joule's law; perfect (ideal) gases; the second law of thermodynamics, including thermal efficiency and quality factor, the Carnot cycle, and entropy; the compression of gases, as in internal combustion engines; vapour; steam power plants and their structure; internal combustion cycles; refrigeration cycles; fluid flow; combustion; and heat transfer.
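The Carnot cycle listed above sets the upper bound on the thermal efficiency of any heat engine, η = 1 − T_cold/T_hot with absolute temperatures; a minimal sketch with assumed temperatures, not figures from the book:

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum thermal efficiency between two reservoirs, temperatures in kelvin."""
    if not 0 < t_cold_k < t_hot_k:
        raise ValueError("require 0 < T_cold < T_hot in kelvin")
    return 1.0 - t_cold_k / t_hot_k

# Steam-plant-like figures: an 800 K heat source and a 300 K sink.
print(f"{carnot_efficiency(800.0, 300.0):.3f}")  # prints 0.625
```

Real cycles fall well short of this bound; it is the benchmark against which the thermal efficiencies in the book are judged.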

  14. Decision support basics

    CERN Document Server

    Power, Daniel J

    2009-01-01

    This book is targeted to busy managers and MBA students who need to grasp the basics of computerized decision support. Some of the topics covered include: What is a DSS? What do managers need to know about computerized decision support? And how can managers identify opportunities to create innovative DSS? Overall the book addresses 35 fundamental questions that are relevant to understanding computerized decision support.

  15. The basic anaesthesia machine.

    Science.gov (United States)

    Gurudatt, Cl

    2013-09-01

    After WTG Morton's first public demonstration of ether as an anaesthetic agent in 1846, anaesthesiologists for many years did not require a machine to deliver anaesthesia to patients. After the introduction of oxygen and nitrous oxide as compressed gases in cylinders, it became necessary to mount these cylinders on a metal frame, which stimulated many people to attempt to construct an anaesthesia machine. In 1917 HEG Boyle modified Gwathmey's machine, and this became popular as the Boyle anaesthesia machine. Though many changes have been made to the original Boyle machine, the basic structure remains the same; the subsequent changes have mainly been aimed at improving patient safety. Knowing the details of the basic machine will help the trainee understand the later improvements. It is also important for every practising anaesthesiologist to have a thorough knowledge of the basic anaesthesia machine for the safe conduct of anaesthesia.

  16. Basic Radar Altimetry Toolbox & Tutorial

    Science.gov (United States)

    Rosmorduc, Vinca; Benveniste, Jerome; Breebaart, Leo; Bronner, Emilie; Dinardo, Salvatore; Earith, Didier; Lucas, Bruno Manuel; Niejmeier, Sander; Picot, Nicolas

    2010-12-01

    The Basic Radar Altimetry Toolbox is an "all-altimeter" collection of tools, tutorials and documents designed to facilitate the use of radar altimetry data, including the last mission launched, CryoSat. It has been available since April 2007, and has been demonstrated during training courses and scientific meetings. Nearly 1,200 people had downloaded it (as of the end of June 2010), with many "newcomers" to altimetry among them. Users' feedback, developments in altimetry, and practice showed that new interesting features could be added. Some have been added and/or improved in version 2; others are ongoing, and some are under discussion. The Basic Radar Altimetry Toolbox is able: - to read most distributed radar altimetry data, from ERS-1 & 2, Topex/Poseidon, Geosat Follow-on, Jason-1, Envisat, Jason-2, CryoSat and also the future Saral and Sentinel-3 missions, - to perform some processing, data editing and statistics, - and to visualize the results. It can be used at several levels and in several ways: - as a data-reading tool, with APIs for C, Fortran, Matlab and IDL, - as processing/extraction routines, through the on-line command mode, - as both an educational and a quick-look tool, with the graphical user interface. As part of the Toolbox, a Radar Altimetry Tutorial gives general information about altimetry, the technique involved and its applications, as well as an overview of past, present and future missions, including information on how to access data, additional software and documentation. It also presents a series of data use cases, covering all uses of altimetry over ocean, cryosphere and land, and showing the basic methods for some of the most frequent ways of using altimetry data. BRAT is developed under contract with ESA and CNES. It is available at http://www.altimetry.info and http://earth.esa.int/brat/

  17. Use of the mutual exclusivity assumption by young word learners

    DEFF Research Database (Denmark)

    Markman, Ellen M.; Wasow, Judith L.; Hansen, Mikkel

    2003-01-01

    an obvious location to search. On the whole, babies at both ages resisted second labels for objects and, with some qualifications, tended to search for a better referent for the novel label. Thus mutual exclusivity is in place before the onset of the naming explosion. The findings demonstrate that lexical constraints enable babies to learn words even under non-optimal conditions, when speakers are not clear and referents are not visible. The results are discussed in relation to an alternative social-pragmatic account. © 2003 Elsevier (USA). All rights reserved.

  18. HYPROLOG: A New Logic Programming Language with Assumptions and Abduction

    DEFF Research Database (Denmark)

    Christiansen, Henning; Dahl, Veronica

    2005-01-01

    with the grammar notation provided by the underlying Prolog system. An operational semantics is given which complies with the standard declarative semantics for the "pure" sublanguages, while for the full HYPROLOG language it must be taken as definition. The implementation is straightforward and seems to provide, for abduction, the most efficient of known implementations; the price, however, is a limited use of negations. The main difference w.r.t. previous implementations of abduction is that we avoid any level of meta-interpretation by having Prolog execute the deductive steps directly and by treating abducibles (and…

  19. Basic heat transfer

    CERN Document Server

    Bacon, D H

    2013-01-01

    Basic Heat Transfer aims to help readers use a computer to solve heat transfer problems and to promote greater understanding by changing data values and observing the effects, which are necessary in design and optimization calculations.The book is concerned with applications including insulation and heating in buildings and pipes, temperature distributions in solids for steady state and transient conditions, the determination of surface heat transfer coefficients for convection in various situations, radiation heat transfer in grey body problems, the use of finned surfaces, and simple heat exc
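In the book's spirit of computing a result and then changing the data values to observe the effect, here is a minimal sketch of one-dimensional steady conduction through a plane wall (Fourier's law, q = k·A·(T1 − T2)/L); all figures are illustrative assumptions, not values from the book:

```python
def conduction_rate(k, area, t_hot, t_cold, thickness):
    """Steady heat flow in watts through a plane wall (Fourier's law)."""
    return k * area * (t_hot - t_cold) / thickness

# Brick wall: k = 0.7 W/(m K), 10 m^2 area, 20 degC inside, 0 degC outside, 0.2 m thick.
q = conduction_rate(0.7, 10.0, 20.0, 0.0, 0.2)
print(round(q, 1), "W")  # roughly 700 W; halving the thickness doubles the flow
```

Varying the inputs one at a time, as the book suggests, makes the linear dependence on each parameter immediately visible.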

  20. Electrical installation calculations basic

    CERN Document Server

    Kitcher, Christopher

    2013-01-01

    All the essential calculations required for basic electrical installation work. The Electrical Installation Calculations series has proved an invaluable reference for over forty years, for both apprentices and professional electrical installation engineers alike. The book provides a step-by-step guide to the successful application of electrical installation calculations required in day-to-day electrical engineering practice. A step-by-step guide to everyday calculations used on the job. An essential aid to the City & Guilds certificates at Levels 2 and 3.
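A hedged sketch of the kind of day-to-day calculation such a guide covers, using the millivolt-per-ampere-per-metre method familiar from UK installation practice; the cable rating below is an assumed illustrative figure, not a value quoted from the book:

```python
def voltage_drop(mv_per_a_per_m, current_a, length_m):
    """Cable voltage drop in volts: tabulated mV/A/m times current times run length."""
    return mv_per_a_per_m * current_a * length_m / 1000.0

# Assumed example: a cable rated 18 mV/A/m feeding a 20 A load over a 25 m run.
drop = voltage_drop(18.0, 20.0, 25.0)
print(drop, "V")  # 9.0 V
```

The computed drop would then be checked against the permitted percentage of the supply voltage for the circuit in question.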

  1. Back to basics audio

    CERN Document Server

    Nathan, Julian

    1998-01-01

    Back to Basics Audio is a thorough, yet approachable handbook on audio electronics theory and equipment. The first part of the book discusses electrical and audio principles. Those principles form a basis for understanding the operation of equipment and systems, covered in the second section. Finally, the author addresses planning and installation of a home audio system.Julian Nathan joined the audio service and manufacturing industry in 1954 and moved into motion picture engineering and production in 1960. He installed and operated recording theaters in Sydney, Austra

  2. Basic genetics for dermatologists

    Directory of Open Access Journals (Sweden)

    Muthu Sendhil Kumaran

    2013-01-01

    Full Text Available During the past few decades, advances in the field of molecular genetics have enriched our understanding of the pathogenesis of diseases, their identification, and appropriate therapeutic interventions. In the last 20 years, the genetic basis of more than 350 monogenic skin diseases has been elucidated, and the number is still growing. The widespread use of molecular genetics as a diagnostic tool is not yet routine practice, owing to genetic heterogeneity, limited access, and low sensitivity. In this review, we present the very basics of genetics so as to give dermatologists a working understanding of medical genetics.

  3. Machine shop basics

    CERN Document Server

    Miller, Rex

    2004-01-01

    Use the right tool the right wayHere, fully updated to include new machines and electronic/digital controls, is the ultimate guide to basic machine shop equipment and how to use it. Whether you're a professional machinist, an apprentice, a trade student, or a handy homeowner, this fully illustrated volume helps you define tools and use them properly and safely. It's packed with review questions for students, and loaded with answers you need on the job.Mark Richard Miller is a Professor and Chairman of the Industrial Technology Department at Texas A&M University in Kingsville, T

  4. C# Database Basics

    CERN Document Server

    Schmalz, Michael

    2012-01-01

    Working with data and databases in C# certainly can be daunting if you're coming from VB6, VBA, or Access. With this hands-on guide, you'll shorten the learning curve considerably as you master accessing, adding, updating, and deleting data with C#: basic skills you need if you intend to program with this language. No previous knowledge of C# is necessary. By following the examples in this book, you'll learn how to tackle several database tasks in C#, such as working with SQL Server, building data entry forms, and using data in a web service. The book's code samples will help you get started

  5. Menstrual Cycle: Basic Biology

    Science.gov (United States)

    Hawkins, Shannon M.; Matzuk, Martin M.

    2010-01-01

    The basic biology of the menstrual cycle is a complex, coordinated sequence of events involving the hypothalamus, anterior pituitary, ovary, and endometrium. The menstrual cycle with all its complexities can be easily perturbed by environmental factors such as stress, extreme exercise, eating disorders, and obesity. Furthermore, genetic influences such as fragile X premutations (Chapter X), X chromosome abnormalities (Chapter X), and galactose-1-phosphate uridyltransferase (GALT) point mutations (galactosemia) also contribute to perturbations of the menstrual cycle. Although not perfect, mouse models have helped to identify and confirm additional components and pathways in menstrual cycle function and dysfunction in humans. PMID:18574203

  6. Basic structural dynamics

    CERN Document Server

    Anderson, James C

    2012-01-01

    A concise introduction to structural dynamics and earthquake engineering Basic Structural Dynamics serves as a fundamental introduction to the topic of structural dynamics. Covering single and multiple-degree-of-freedom systems while providing an introduction to earthquake engineering, the book keeps the coverage succinct and on topic at a level that is appropriate for undergraduate and graduate students. Through dozens of worked examples based on actual structures, it also introduces readers to MATLAB, a powerful software for solving both simple and complex structural d

  7. Refining the theory of basic individual values.

    Science.gov (United States)

    Schwartz, Shalom H; Cieciuch, Jan; Vecchione, Michele; Davidov, Eldad; Fischer, Ronald; Beierlein, Constanze; Ramos, Alice; Verkasalo, Markku; Lönnqvist, Jan-Erik; Demirutku, Kursad; Dirilen-Gumus, Ozlem; Konty, Mark

    2012-10-01

    We propose a refined theory of basic individual values intended to provide greater heuristic and explanatory power than the original theory of 10 values (Schwartz, 1992). The refined theory more accurately expresses the central assumption of the original theory that research has largely ignored: Values form a circular motivational continuum. The theory defines and orders 19 values on the continuum based on their compatible and conflicting motivations, expression of self-protection versus growth, and personal versus social focus. We assess the theory with a new instrument in 15 samples from 10 countries (N = 6,059). Confirmatory factor and multidimensional scaling analyses support discrimination of the 19 values, confirming the refined theory. Multidimensional scaling analyses largely support the predicted motivational order of the values. Analyses of predictive validity demonstrate that the refined values theory provides greater and more precise insight into the value underpinnings of beliefs. Each value correlates uniquely with external variables.

  8. On basic equation of statistical physics

    Institute of Scientific and Technical Information of China (English)

    邢修三

    1996-01-01

    Considering that thermodynamic irreversibility, the principle of entropy increase and hydrodynamic equations cannot be derived rigorously and in a unified way from the Liouville equations, the anomalous Langevin equation in Liouville space or its equivalent generalized Liouville equation is proposed as a basic equation of statistical physics. This equation reflects the fact that the law of motion of statistical thermodynamics is stochastic, but not deterministic. From that the nonequilibrium entropy, the principle of entropy increase, the theorem of minimum entropy production and the BBGKY diffusion equation hierarchy have been derived. The hydrodynamic equations, such as the generalized Navier-Stokes equation and the mass drift-diffusion equation, etc. have been derived from the BBGKY diffusion equation hierarchy. This equation has the same equilibrium solution as that of the Liouville equation. All these are unified and rigorous without adding any extra assumption. But it is difficult to prove that th

  9. Basic and clinical immunology

    Science.gov (United States)

    Chinen, Javier; Shearer, William T.

    2003-01-01

    Progress in immunology continues to grow exponentially every year. New applications of this knowledge are being developed for a broad range of clinical conditions. Conversely, the study of primary and secondary immunodeficiencies is helping to elucidate the intricate mechanisms of the immune system. We have selected a few of the most significant contributions to the fields of basic and clinical immunology published between October 2001 and October 2002. Our choice of topics in basic immunology included the description of T-bet as a determinant factor for T(H)1 differentiation, the role of the activation-induced cytidine deaminase gene in B-cell development, the characterization of CD4(+)CD25(+) regulatory T cells, and the use of dynamic imaging to study MHC class II transport and T-cell and dendritic cell membrane interactions. Articles related to clinical immunology that were selected for review include the description of immunodeficiency caused by caspase 8 deficiency; a case series report on X-linked agammaglobulinemia; the mechanism of action, efficacy, and complications of intravenous immunoglobulin; mechanisms of autoimmune diseases; and advances in HIV pathogenesis and vaccine development. We also reviewed two articles that explore the possible alterations of the immune system caused by spaceflights, a new field with increasing importance as human space expeditions become a reality in the 21st century.

  10. The Basic Quality and Team Construction of Teaching Secretaries under the New Situation

    Institute of Scientific and Technical Information of China (English)

    邓伟宁

    2013-01-01

    Teaching secretaries play an important role in the teaching management of institutions of higher education. This paper expounds the basic qualities they should possess under the new situation, as well as effective ways to strengthen the team construction of college teaching secretaries.

  11. Immobilization of L-Lysine on Zeolite 4A as an Organic-Inorganic Composite Basic Catalyst for Synthesis of α,β-Unsaturated Carbonyl Compounds under Mild Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Zamani, Farzad; Rezapour, Mehdi; Kianpour, Sahar [Islamic Azad Univ., Isfahan (Iran, Islamic Republic of)

    2013-08-15

    Lysine (Lys) immobilized on zeolite 4A was prepared by a simple adsorption method. The physical and chemical properties of Lys/zeolite 4A were investigated by X-ray diffraction (XRD), FT-IR, Brunauer-Emmett-Teller (BET), scanning electron microscopy (SEM), transmission electron microscopy (TEM) and UV-vis. The obtained organic-inorganic composite was effectively employed as a heterogeneous basic catalyst for the synthesis of α,β-unsaturated carbonyl compounds. No by-product formation, high yields, short reaction times, mild reaction conditions, operational simplicity, and reusability are the salient features of the present catalyst.

  12. CRITICAL ASSUMPTIONS IN THE F-TANK FARM CLOSURE OPERATIONAL DOCUMENTATION REGARDING WASTE TANK INTERNAL CONFIGURATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Hommel, S.; Fountain, D.

    2012-03-28

    The intent of this document is to provide clarification of critical assumptions regarding the internal configurations of liquid waste tanks at operational closure, with respect to F-Tank Farm (FTF) closure documentation. For the purposes of this document, FTF closure documentation includes: (1) Performance Assessment for the F-Tank Farm at the Savannah River Site (hereafter referred to as the FTF PA) (SRS-REG-2007-00002), (2) Basis for Section 3116 Determination for Closure of F-Tank Farm at the Savannah River Site (DOE/SRS-WD-2012-001), (3) Tier 1 Closure Plan for the F-Area Waste Tank Systems at the Savannah River Site (SRR-CWDA-2010-00147), (4) F-Tank Farm Tanks 18 and 19 DOE Manual 435.1-1 Tier 2 Closure Plan Savannah River Site (SRR-CWDA-2011-00015), (5) Industrial Wastewater Closure Module for the Liquid Waste Tanks 18 and 19 (SRRCWDA-2010-00003), and (6) Tank 18/Tank 19 Special Analysis for the Performance Assessment for the F-Tank Farm at the Savannah River Site (hereafter referred to as the Tank 18/Tank 19 Special Analysis) (SRR-CWDA-2010-00124). Note that the first three FTF closure documents listed apply to the entire FTF, whereas the last three FTF closure documents listed are specific to Tanks 18 and 19. These two waste tanks are expected to be the first two tanks to be grouted and operationally closed under the current suite of FTF closure documents and many of the assumptions and approaches that apply to these two tanks are also applicable to the other FTF waste tanks and operational closure processes.

  13. Collections Care: A Basic Reference Shelflist.

    Science.gov (United States)

    de Torres, Amparo R., Ed.

    This is an extensive bibliography of reference sources--i.e., books and articles--that relate to the care and conservation of library, archival, and museum collections. Bibliographies are presented under the following headings: (1) General Information; (2) Basic Collections Care; (3) Architectural Conservation; (4) Collections Management: Law,…

  14. Guideline for Adopting the Local Reaction Assumption for Porous Absorbers in Terms of Random Incidence Absorption Coefficients

    DEFF Research Database (Denmark)

    Jeong, Cheol-Ho

    2011-01-01

    Room surfaces have been extensively modeled as locally reacting in room acoustic predictions although such modeling could yield significant errors under certain conditions. Therefore, this study aims to propose a guideline for adopting the local reaction assumption by comparing predicted random incidence acoustical characteristics of typical building elements made of porous materials assuming extended and local reaction. For each surface reaction, five well-established wave propagation models, the Delany-Bazley, Miki, Beranek, Allard-Champoux, and Biot model, are employed. Effects of the flow resistivity and the absorber thickness on the difference between the two surface reaction models are examined and discussed. For a porous absorber backed by a rigid surface, the assumption of local reaction always underestimates the random incidence absorption coefficient and the local reaction models give...
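
    As a concrete illustration of the kind of prediction such guidelines compare, the empirical Delany-Bazley model gives the surface impedance and absorption of a rigid-backed porous layer from its flow resistivity and thickness. This hedged sketch computes only the normal-incidence absorption coefficient (the paper itself works with random incidence and several further models; the material values here are hypothetical):

```python
import cmath
import math

RHO0, C0 = 1.204, 343.0  # air density (kg/m^3) and speed of sound (m/s)

def delany_bazley_alpha(f, sigma, d):
    """Normal-incidence absorption of a rigid-backed porous layer.

    f: frequency (Hz), sigma: flow resistivity (Pa.s/m^2), d: thickness (m).
    Empirical Delany-Bazley model, e^{j w t} sign convention.
    """
    X = RHO0 * f / sigma
    Zc = RHO0 * C0 * (1 + 0.0571 * X**-0.754 - 1j * 0.087 * X**-0.732)
    k = (2 * math.pi * f / C0) * (1 + 0.0978 * X**-0.700 - 1j * 0.189 * X**-0.595)
    Zs = -1j * Zc / cmath.tan(k * d)          # surface impedance, rigid backing
    R = (Zs - RHO0 * C0) / (Zs + RHO0 * C0)   # pressure reflection coefficient
    return 1 - abs(R)**2

# Hypothetical 50 mm absorber with flow resistivity 10 kPa.s/m^2.
for f in (250, 1000, 4000):
    print(f, round(delany_bazley_alpha(f, 10_000, 0.05), 3))
```

    The thin layer absorbs poorly at low frequencies and well at high frequencies, which is exactly the regime where the local- versus extended-reaction difference studied in the paper becomes relevant.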

  15. Basic real analysis

    CERN Document Server

    Sohrab, Houshang H

    2014-01-01

    This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....

  16. Basics of plasma astrophysics

    CERN Document Server

    Chiuderi, Claudio

    2015-01-01

    This book is an introduction to contemporary plasma physics that discusses the most relevant recent advances in the field and covers a careful choice of applications to various branches of astrophysics and space science. The purpose of the book is to allow the student to master the basic concepts of plasma physics and to bring him or her up to date in a number of relevant areas of current research. Topics covered include orbit theory, kinetic theory, fluid models, magnetohydrodynamics, MHD turbulence, instabilities, discontinuities, and magnetic reconnection. Some prior knowledge of classical physics is required, in particular fluid mechanics, statistical physics, and electrodynamics. The mathematical developments are self-contained and explicitly detailed in the text. A number of exercises are provided at the end of each chapter, together with suggestions and solutions.

  17. Cloud computing basics

    CERN Document Server

    Srinivasan, S

    2014-01-01

    Cloud Computing Basics covers the main aspects of this fast-moving technology so that both practitioners and students will be able to understand cloud computing. The author highlights the key aspects of this technology that a potential user might want to investigate before deciding to adopt this service. This book explains how cloud services can be used to augment existing services such as storage, backup, and recovery. It addresses the details of how cloud security works and what users must be prepared for when they move their data to the cloud. The book also discusses how businesses can prepare for compliance with the law as well as industry standards such as those of the Payment Card Industry.

  18. Atomic Basic Blocks

    Science.gov (United States)

    Scheler, Fabian; Mitzlaff, Martin; Schröder-Preikschat, Wolfgang

    The decision to use a time-triggered or an event-triggered approach for a real-time system is difficult and has far-reaching consequences: far-reaching above all because these two approaches are tied to very different control-flow abstractions, which make a later migration to the other paradigm very hard or even impossible. We therefore propose the use of an intermediate representation that is independent of the control-flow abstraction in use. For this purpose we use Atomic Basic Blocks (ABB), which are based on basic blocks, and on top of them we build a tool, the Real-Time Systems Compiler (RTSC), that supports the migration between time-triggered and event-triggered systems.

  19. On the Basic Professional Abilities University Counselors Should Possess under the New Situation

    Institute of Scientific and Technical Information of China (English)

    富显韬

    2011-01-01

    In ideological and political education work with students, verbal instruction must be well reasoned, teaching by example must be tangible, regular instruction must be methodical, and collective instruction must be effective. Counselors need to spend less time on trivial arguments, avoid empty arguments, refute fallacious arguments, and explain the important principles thoroughly, so as to carry the main theme of ideological and political work. To adapt to social development and to students' ever-changing value demands, university counselors must possess strong basic abilities of their own, continuously improve their work skills, become proficient with new communication tools, and study the new objects of their work. A high level of basic professional ability is the premise and foundation for university counselors to do ideological and political education work among young students well.

  20. Children under 7 in Educational Policies in Brazil: Primary and Basic Education

    Directory of Open Access Journals (Sweden)

    Sonia Kramer

    2006-10-01

    Full Text Available This paper discusses childhood education in the context of Brazilian educational policies. It first situates childhood education in the national political panorama and presents the challenges of this field. The paper then focuses on teacher education, one of the greatest challenges of Brazilian educational policies, and analyses the importance of the changes in the Pedagogy course curriculum. Third, it examines childhood education and basic school (now nine years long) as inseparable levels of the democratization process of Brazilian education and points out the relevance of this articulation for children and for pedagogic practice in nurseries, preschools, and basic schools.

  1. Basics of aerothermodynamics

    CERN Document Server

    Hirschel, Ernst Heinrich

    2015-01-01

    This successful book gives an introduction to the basics of aerothermodynamics, as applied in particular to winged re-entry vehicles and airbreathing hypersonic cruise and acceleration vehicles. The book gives a review of the issues of transport of momentum, energy and mass, real-gas effects, as well as inviscid and viscous flow phenomena. In this second, revised edition the chapters covering the classical topics of aerothermodynamics were more or less left untouched. The access to some single topics of practical interest was improved. Auxiliary chapters were moved into an appendix. The recent successful flights of the X-43A and the X-51A indicate that the dawn of sustained airbreathing hypersonic flight has now arrived. This proves that the original approach of the book, to put emphasis on viscous effects and the aerothermodynamics of radiation-cooled vehicle surfaces, was timely. This second, revised edition accentuates these topics even more. A new, additional chapter treats examples of viscous thermal surface eff...

  2. [Basic research in pulmonology].

    Science.gov (United States)

    Gea, Joaquim

    2008-11-01

    This is a review of the articles dealing with basic science published in recent issues of Archivos de Bronconeumología. Of particular interest with regard to chronic obstructive pulmonary disease were an article on extrapulmonary inflammation and oxidative stress and another on bronchial remodeling. The articles relating to asthma included a review on the use of drugs that block free immunoglobulin-E and an article about the contribution of experimental models to our knowledge of this disease. Two of the most interesting articles on the topic of lung cancer dealt with gene therapy and resistance to chemotherapy. Also notable were 2 studies that investigated ischemia-reperfusion injury. One evaluated tissue resistance to injury while the other analyzed the role played by interleukin-8 in this process. On the topic of pulmonary fibrosis, an article focused on potential biomarkers of progression and prognosis; others dealt with the contribution of experimental models to our understanding of this disorder and the fibrogenic role of transforming growth factor b. In the context of both sleep apnea syndrome and pulmonary infection, studies investigating the role of oxidative stress were published. Finally, 2 studies analyzed the diagnosis and treatment of tuberculosis and other pulmonary infections.

  3. Gastric cancer: basic aspects.

    Science.gov (United States)

    Resende, Carlos; Thiel, Alexandra; Machado, José C; Ristimäki, Ari

    2011-09-01

    Gastric cancer (GC) is a world health burden, ranking as the second cause of cancer death worldwide. Etiologically, GC arises not only from the combined effects of environmental factors and susceptible genetic variants but also from the accumulation of genetic and epigenetic alterations. In the last years, molecular oncobiology studies brought to light a number of genes that are implicated in gastric carcinogenesis. This review is intended to focus on the recently described basic aspects that play key roles in the process of gastric carcinogenesis. Genetic variants of the genes IL-10, IL-17, MUC1, MUC6, DNMT3B, SMAD4, and SERPINE1 have been reported to modify the risk of developing GC. Several genes have been newly associated with gastric carcinogenesis, both through oncogenic activation (GSK3β, CD133, DSC2, P-Cadherin, CDH17, CD168, CD44, metalloproteinases MMP7 and MMP11, and a subset of miRNAs) and through tumor suppressor gene inactivation mechanisms (TFF1, PDX1, BCL2L10, XRCC, psiTPTE-HERV, HAI-2, GRIK2, and RUNX3). It also addressed the role of the inflammatory mediator cyclooxygenase-2 (COX-2) in the process of gastric carcinogenesis and its importance as a potential molecular target for therapy.

  4. Nanodesign: some basic questions

    CERN Document Server

    Schommers, Wolfram

    2013-01-01

    There is no doubt that nanoscience will be the dominant direction for technology in this century, and that this science will influence our lives to a large extent as well as open completely new perspectives on all scientific and technological disciplines. To be able to produce optimal nanosystems with tailor-made properties, it is necessary to analyze and construct such systems in advance by adequate theoretical and computational methods. Since we work in nanoscience and nanotechnology at the ultimate level, we have to apply the basic laws of physics. What methods and tools are relevant here? The book gives an answer to this question. The background of the theoretical methods and tools is critically discussed, and also the world view on which these physical laws are based. Such a debate is not only of academic interest but is of highly general concern, and this is because we constantly move in nanoscience and nanotechnology between two extreme poles, between infinite life and total destruction. On the one ...

  5. Basic operator theory

    CERN Document Server

    Gohberg, Israel

    2001-01-01

    …application of linear operators on a Hilbert space. We begin with a chapter on the geometry of Hilbert space and then proceed to the spectral theory of compact self-adjoint operators; operational calculus is next presented as a natural outgrowth of the spectral theory. The second part of the text concentrates on Banach spaces and linear operators acting on these spaces. It includes, for example, the three basic principles of linear analysis and the Riesz-Fredholm theory of compact operators. Both parts contain plenty of applications. All chapters deal exclusively with linear problems, except for the last chapter, which is an introduction to the theory of nonlinear operators. In addition to the standard topics in functional analysis, we have presented relatively recent results which appear, for example, in Chapter VII. In general, in writing this book, the authors were strongly influenced by recent developments in operator theory which affected the choice of topics, proofs and exercises. One ...

  6. Basic Social Processes

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, PhD, Hon. PhD

    2005-06-01

    Full Text Available The goal of grounded theory is to generate a theory that accounts for a pattern of behavior that is relevant and problematic for those involved. The goal is not voluminous description, nor clever verification. As with all grounded theory, the generation of a basic social process (BSP) theory occurs around a core category. While a core category is always present in a grounded research study, a BSP may not be. BSPs are ideally suited to generation by grounded theory from qualitative research because qualitative research can pick up process through fieldwork that continues over a period of time. BSPs are a delight to discover and formulate since they give so much movement and scope to the analyst’s perception of the data. BSPs such as cultivating, defaulting, centering, highlighting or becoming give the feeling of process, change and movement over time. They also have clear, amazing general implications; so much so, that it is hard to contain them within the confines of a single substantive study. The tendency is to refer to them as a formal theory without the necessary comparative development of formal theory. They are labeled by a “gerund” (“-ing”), which both stimulates their generation and the tendency to over-generalize them.

  7. Basic science of osteoarthritis.

    Science.gov (United States)

    Cucchiarini, Magali; de Girolamo, Laura; Filardo, Giuseppe; Oliveira, J Miguel; Orth, Patrick; Pape, Dietrich; Reboul, Pascal

    2016-12-01

    Osteoarthritis (OA) is a prevalent, disabling disorder of the joints that affects a large population worldwide and for which there is no definitive cure. This review provides critical insights into the basic knowledge on OA that may lead to innovative and efficient new therapeutic regimens. While degradation of the articular cartilage is the hallmark of OA, with altered interactions between chondrocytes and compounds of the extracellular matrix, the subchondral bone has been also described as a key component of the disease, involving specific pathomechanisms controlling its initiation and progression. The identification of such events (and thus of possible targets for therapy) has been made possible by the availability of a number of animal models that aim at reproducing the human pathology, in particular large models of high tibial osteotomy (HTO). From a therapeutic point of view, mesenchymal stem cells (MSCs) represent a promising option for the treatment of OA and may be used concomitantly with functional substitutes integrating scaffolds and drugs/growth factors in tissue engineering setups. Altogether, these advances in the fundamental and experimental knowledge on OA may allow for the generation of improved, adapted therapeutic regimens to treat human OA.

  8. Basic Data on Biogas

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    Renewable gases such as biogas and biomethane are considered key energy carriers as society replaces fossil fuels with renewable alternatives. In Sweden, almost 80 % of the fossil fuels are used in the transport sector. Therefore, the focus in Sweden has been to use the produced biogas in this sector as vehicle gas. Basic Data on Biogas contains an overview of production, utilisation, climate effects etc. of biogas from a Swedish perspective. The purpose is to give an easy overview of the current situation in Sweden for politicians, decision makers and the interested public. 1.4 TWh of biogas is produced annually in Sweden at approximately 230 facilities. The 135 wastewater treatment plants that produce biogas contribute around half of the production. In order to reduce the sludge volume, biogas has been produced at wastewater treatment plants for decades. New biogas plants are mainly co-digestion plants and farm plants. Landfilling of organic waste has been banned since 2005; thus the biogas produced in landfills is decreasing.

  9. 7 CFR 772.10 - Transfer and assumption-AMP loans.

    Science.gov (United States)

    2010-01-01

    7 Agriculture 2010-01-01 Transfer and assumption—AMP loans. Section 772.10, DEPARTMENT OF AGRICULTURE, SPECIAL PROGRAMS, SERVICING MINOR PROGRAM LOANS § 772.10 Transfer and assumption—AMP loans. (a) Eligibility. The Agency may approve transfers and assumptions of AMP loans when: (1)...

  10. A Proposal for Testing Local Realism Without Using Assumptions Related to Hidden Variable States

    Science.gov (United States)

    Ryff, Luiz Carlos

    1996-01-01

    A feasible experiment is discussed which allows us to prove a Bell theorem for two particles without using an inequality. The experiment could be used to test local realism against quantum mechanics without the introduction of additional assumptions related to hidden-variable states. Only assumptions based on direct experimental observation are needed.

  11. Making Sense out of Sex Stereotypes in Advertising: A Feminist Analysis of Assumptions.

    Science.gov (United States)

    Ferrante, Karlene

    Sexism and racism in advertising have been well documented, but feminist research aimed at social change must go beyond existing content analyses to ask how advertising is created. Analysis of the "mirror assumption" (advertising reflects society) and the "gender assumption" (advertising speaks in a male voice to female…

  12. Assessing Key Assumptions of Network Meta-Analysis: A Review of Methods

    Science.gov (United States)

    Donegan, Sarah; Williamson, Paula; D'Alessandro, Umberto; Tudur Smith, Catrin

    2013-01-01

    Background: Homogeneity and consistency assumptions underlie network meta-analysis (NMA). Methods exist to assess the assumptions but they are rarely and poorly applied. We review and illustrate methods to assess homogeneity and consistency. Methods: Eligible articles focussed on indirect comparison or NMA methodology. Articles were sought by…
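
    One widely used consistency check that falls under the methods such a review covers is the Bucher-style comparison of direct and indirect evidence in a single loop of treatments. A minimal sketch with hypothetical effect estimates and variances (not taken from the article):

```python
import math

def bucher_inconsistency(d_ab_direct, var_ab_direct, d_ac, var_ac, d_cb, var_cb):
    """Compare direct A-vs-B evidence with the indirect estimate via C.

    Inputs are treatment-effect estimates (e.g. log odds ratios) and their
    variances; all numbers used below are hypothetical.
    """
    d_ab_indirect = d_ac + d_cb            # A-C plus C-B chains into A-B
    var_ab_indirect = var_ac + var_cb      # variances add for independent evidence
    omega = d_ab_direct - d_ab_indirect    # inconsistency estimate
    z = omega / math.sqrt(var_ab_direct + var_ab_indirect)
    return omega, z

omega, z = bucher_inconsistency(0.5, 0.04, 0.3, 0.02, 0.1, 0.02)
print(round(omega, 3), round(z, 3))  # a z far from 0 flags inconsistency
```

    A z-statistic near zero is compatible with consistency; the homogeneity assumption is assessed separately, within each pairwise comparison.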

  13. Assessing the assumption of symmetric proximity measures in the context of multidimensional scaling.

    Science.gov (United States)

    Kelley, Ken

    2004-01-01

    Applications of multidimensional scaling often make the assumption of symmetry for the population matrix of proximity measures. Although the likelihood of such an assumption holding true varies from one area of research to another, formal assessment of such an assumption has received little attention. The present article develops a nonparametric procedure that can be used in a confirmatory fashion or in an exploratory fashion in order to probabilistically assess the assumption of population symmetry for proximity measures in a multidimensional scaling context. The proposed procedure makes use of the bootstrap technique and alleviates the assumptions of parametric statistical procedures. Computer code for R and S-Plus is included in an appendix in order to carry out the proposed procedures.
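
    The article supplies R and S-Plus code in an appendix; as a rough illustration of the same idea, here is a hedged Python sketch of a resampling-based symmetry check, assuming several replicated proximity matrices and an exchangeability null (the details of the authors' bootstrap differ):

```python
import numpy as np

def asymmetry(mats):
    """Total absolute asymmetry of the mean proximity matrix."""
    m = np.asarray(mats, float).mean(axis=0)
    return np.abs(m - m.T).sum() / 2.0

def symmetry_test(mats, n_resamples=500, seed=0):
    """Randomization test of H0: the population proximity matrix is symmetric.

    Under H0 the (i, j) and (j, i) entries of each replicated matrix are
    exchangeable, so we swap them at random and recompute the statistic.
    """
    rng = np.random.default_rng(seed)
    mats = np.asarray(mats, float)
    obs = asymmetry(mats)
    iu = np.triu_indices(mats.shape[1], k=1)
    il = (iu[1], iu[0])
    exceed = 0
    for _ in range(n_resamples):
        perm = mats.copy()
        for r in range(len(perm)):
            swap = rng.random(len(iu[0])) < 0.5
            upper, lower = perm[r][iu].copy(), perm[r][il].copy()
            perm[r][iu] = np.where(swap, lower, upper)
            perm[r][il] = np.where(swap, upper, lower)
        if asymmetry(perm) >= obs:
            exceed += 1
    return obs, (exceed + 1) / (n_resamples + 1)

# Demo on clearly asymmetric data: ten 4x4 matrices whose upper-triangle
# entries always exceed the mirrored lower-triangle entries by 2.
base = np.zeros((4, 4))
base[np.triu_indices(4, k=1)] = 2.0
mats = np.stack([base] * 10)
stat, p = symmetry_test(mats, n_resamples=200)
print(stat, p)  # large asymmetry statistic, small p: symmetry is rejected
```

    As in the article's procedure, no parametric distribution is assumed; the null distribution of the asymmetry statistic comes entirely from resampling.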

  14. An Embedded Technology Basic Course that Cooperates with Region

    Science.gov (United States)

    Fujisawa, Yoshinori; Nakajima, Takayuki; Nirei, Masami

    Nagano National College of Technology cooperates with regions of Nagano prefecture under a concluded agreement and holds an embedded-technology basic course. This basic course was developed by the authors, and its teaching materials are based on the authors' teaching experience at Nagano National College of Technology. The basic course is intended for engineers who have no experience with embedded technology and are willing to learn its foundations again. The authors therefore constructed the curriculum on the assumption that some of the engineers attending the course do not have sufficient knowledge of the C language either. This paper describes the concept of the course and the details of its teaching materials, and also describes the results of an examination and a questionnaire.

  15. Discussion on the Teaching Design of the Basic Computer Application Course in Universities under the Concept of the Flipped Classroom

    Institute of Scientific and Technical Information of China (English)

    徐正梅; 杨颖; 曹红兵; 张永华

    2016-01-01

    To address the problems in teaching the basic computer application course under the traditional teaching mode, flipped-classroom teaching was tried. Through careful design of pre-class tasks and diverse forms of classroom interaction, students' enthusiasm for learning, hands-on ability, self-directed learning ability, and capacity for collaborative innovation improved markedly, and good teaching results were achieved.

  16. The European Water Framework Directive: How Ecological Assumptions Frame Technical and Social Change

    Directory of Open Access Journals (Sweden)

    Patrick Steyaert

    2007-06-01

    Full Text Available The European Water Framework Directive (WFD) is built upon significant cognitive developments in the field of ecological science but also encourages active involvement of all interested parties in its implementation. The coexistence in the same policy text of both substantive and procedural approaches to policy development stimulated this research, as did our concerns about the implications of substantive ecological visions within the WFD policy for promoting, or not, social learning processes through participatory designs. We have used a qualitative analysis of the WFD text, which shows that the ecological dimension of the WFD dedicates its quasi-exclusive attention to a particular current of thought in ecosystems science, one focusing on ecosystem status and stability and considering human activities as disturbance factors. This particular worldview is juxtaposed within the WFD with a more utilitarian one that gives rise to many policy exemptions without changing the general underlying ecological model. We discuss these policy statements in the light of the tension between substantive and procedural policy developments. We argue that the dominant substantive approach of the WFD, comprising particular ecological assumptions built upon "compositionalism," seems to contradict its espoused intention of involving the public. We discuss that current of thought in regard to more functionalist thinking and adaptive management, which offer greater opportunities for social learning, i.e., placing a set of interdependent stakeholders in an intersubjective position in which they operate a "social construction" of water problems through the co-production of knowledge.

  17. [Assumption of medical risks and the problem of medical liability in ancient Roman law].

    Science.gov (United States)

    Váradi, Agnes

    2008-11-01

    The claim of an individual to assure his health and life, and to assume and be compensated for damage from diseases and accidents, had already appeared in the system of ancient Roman law in the form of many singular legal institutions. In the absence of a unified archetype of regulation, we have to analyse damage caused to the health or corporal integrity of different groups of persons: the legal interpretation of diseases or injuries suffered by slaves, by people under manus or patria potestas, and by free Roman citizens. The fragments from the Digest of Justinian do not only demonstrate concrete legal problems; they can also serve as a starting point for further theoretical analyses. For example: if death is the consequence of a medical failure, does the doctor bear any kind of liability? Was after-care part of the healing process according to Roman law? Examining these questions, we should not forget the complex liability system of Roman law, the compensation of damage caused in a contractual or delictual context, and the lex Aquilia. Although these conclusions have no direct relation to the present legal regulation of risk assumption, analysing the examples of Roman law can be useful for developing our view of a theoretical problem such as the modern liability concept in medicine.

  18. Distributional assumptions in food and feed commodities- development of fit-for-purpose sampling protocols.

    Science.gov (United States)

    Paoletti, Claudia; Esbensen, Kim H

    2015-01-01

    Material heterogeneity influences the effectiveness of sampling procedures. Most sampling guidelines used for assessment of food and/or feed commodities are based on classical statistical distribution requirements (the normal, binomial, and Poisson distributions) and almost universally rely on the assumption of randomness. However, this is unrealistic. The scientific food and feed community recognizes a strong preponderance of non-random distribution within commodity lots, which should be a more realistic prerequisite for the definition of effective sampling protocols. Nevertheless, these heterogeneity issues are overlooked when the prime focus is placed only on financial, time, equipment, and personnel constraints instead of mandating acquisition of documented representative samples under realistic heterogeneity conditions. This study shows how the principles promulgated in the Theory of Sampling (TOS), practically tested over 60 years, provide an effective framework for dealing with the complete set of adverse aspects of both compositional and distributional heterogeneity (material sampling errors), as well as with the errors incurred by the sampling process itself. The results of an empirical European Union study on genetically modified soybean heterogeneity, the Kernel Lot Distribution Assessment, are summarized, as they have a strong bearing on the issue of proper sampling protocol development. TOS principles apply universally in the food and feed realm and must therefore be considered the only basis for development of valid sampling protocols free from distributional constraints.
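    The consequence of wrongly assuming randomness in a segregated lot can be sketched with a toy simulation (our illustration, not part of the study; all numbers are assumed): a single grab sample from a lot in which contaminated kernels are clustered almost always mis-estimates the true concentration, whereas compositing many increments spread across the whole lot recovers it.

    ```python
    # Toy illustration of distributional heterogeneity: a lot of 10,000
    # kernels in which the 5% contaminated kernels are segregated at one
    # end rather than randomly mixed.
    import random

    random.seed(7)

    lot = [1] * 500 + [0] * 9_500      # contamination clustered at the front
    true_conc = sum(lot) / len(lot)    # 0.05

    # Grab sampling: 100 consecutive kernels from one random location.
    grab_estimates = []
    for _ in range(1_000):
        start = random.randrange(len(lot) - 100)
        grab_estimates.append(sum(lot[start:start + 100]) / 100)

    # Composite sampling: 100 single-kernel increments taken systematically
    # across the entire lot.
    step = len(lot) // 100
    composite_estimate = sum(lot[i] for i in range(0, len(lot), step)) / 100

    close_grabs = sum(abs(e - true_conc) <= 0.025 for e in grab_estimates)
    print(f"composite estimate: {composite_estimate:.3f}")
    print(f"grab samples within +/-0.025 of truth: {close_grabs} of 1000")
    ```

    Under the assumed segregation, nearly every grab sample returns either 0% or 100% contamination; only the composite protocol, which spreads increments over the lot in the spirit of TOS, is reliable.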

  19. Implications of the homogeneous turbulence assumption on the aero-optic linking equation

    Science.gov (United States)

    Hugo, Ronald J.; Jumper, Eric J.

    1995-09-01

    This paper investigates the validity of applying the simplified aero-optic linking equation (derived under the assumptions of isotropic and homogeneous turbulence) to a flowfield that is known to consist of anisotropic and nonhomogeneous turbulence. The investigation is performed in the near-nozzle region of a heated two-dimensional jet, and the study makes use of a conditional sampling experiment to acquire a spatio-temporal temperature-field database for the heated jet flowfield. After compensating for the bandwidth limitations of constant-current-wire temperature measurements, the temperature-field database is applied to the computation of optical degradation through both direct methods and indirect methods relying on the aero-optic linking equation. The simplified version of the linking equation was found to provide very good agreement with direct calculations, provided that the length scale of the density fluctuations was interpreted as the integral scale, with the limits of integration being the two first zero crossings of the covariance coefficient function.

  20. Numerical simulation of flow in mechanical heart valves: grid resolution and the assumption of flow symmetry.

    Science.gov (United States)

    Ge, Liang; Jones, S Casey; Sotiropoulos, Fotis; Healy, Timothy M; Yoganathan, Ajit P

    2003-10-01

    A numerical method is developed for simulating unsteady, 3-D, laminar flow through a bileaflet mechanical heart valve with the leaflets fixed. The method employs a dual-time-stepping artificial-compressibility approach together with overset (Chimera) grids and is second-order accurate in space and time. Calculations are carried out for the full 3-D valve geometry under steady inflow conditions on meshes with a total number of nodes ranging from 4 x 10^5 to 1.6 x 10^6. The computed results show that downstream of the leaflets the flow is dominated by two pairs of counter-rotating vortices, which originate on either side of the central orifice in the aortic sinus and rotate such that the common flow of each pair is directed away from the aortic wall. These vortices intensify with Reynolds number, and at a Reynolds number of approximately 1200 their complex interaction leads to the onset of unsteady flow and the breaking of symmetry with respect to both geometric planes of symmetry. Our results show the highly 3-D structure of the flow; question the validity of computationally expedient assumptions of flow symmetry; and demonstrate the need for highly resolved, fully 3-D simulations if computational fluid dynamics is to accurately predict the flow in prosthetic mechanical heart valves.

  1. Research on Basic Education in Rural Areas from the Perspective of the Capability Approach

    Institute of Scientific and Technical Information of China (English)

    胡章平

    2014-01-01

    An analysis based on the Indian economist Amartya Sen's capability theory finds that China's rural basic education currently suffers from a shortage of qualified teachers, inadequate education funding, poor school conditions, and high dropout rates. These problems stem from unbalanced protection of rights, the large development gap between urban and rural areas, the urban-rural dual structure, a low level of public disclosure of education information, and insufficient government attention to rural education. Applying Sen's theory of instrumental freedoms, these problems can be addressed by enhancing farmers' capabilities.

  2. An Exploration of Dental Students' Assumptions About Community-Based Clinical Experiences.

    Science.gov (United States)

    Major, Nicole; McQuistan, Michelle R

    2016-03-01

    The aim of this study was to ascertain which assumptions dental students recalled feeling prior to beginning community-based clinical experiences and whether those assumptions were fulfilled or challenged. All fourth-year students at the University of Iowa College of Dentistry & Dental Clinics participate in community-based clinical experiences. At the completion of their rotations, they write a guided reflection paper detailing the assumptions they had prior to beginning their rotations and assessing the accuracy of their assumptions. For this qualitative descriptive study, the 218 papers from three classes (2011-13) were analyzed for common themes. The results showed that the students had a variety of assumptions about their rotations. They were apprehensive about working with challenging patients, performing procedures for which they had minimal experience, and working too slowly. In contrast, they looked forward to improving their clinical and patient management skills and knowledge. Other assumptions involved the site (e.g., the equipment/facility would be outdated; protocols/procedures would be similar to the dental school's). Upon reflection, students reported experiences that both fulfilled and challenged their assumptions. Some continued to feel apprehensive about treating certain patient populations, while others found it easier than anticipated. Students were able to treat multiple patients per day, which led to increased speed and patient management skills. However, some reported challenges with time management. Similarly, students were surprised to discover some clinics were new/updated although some had limited instruments and materials. Based on this study's findings about students' recalled assumptions and reflective experiences, educators should consider assessing and addressing their students' assumptions prior to beginning community-based dental education experiences.

  3. Visual Basic 2012 programmer's reference

    CERN Document Server

    Stephens, Rod

    2012-01-01

    The comprehensive guide to Visual Basic 2012 Microsoft Visual Basic (VB) is the most popular programming language in the world, with millions of lines of code used in businesses and applications of all types and sizes. In this edition of the bestselling Wrox guide, Visual Basic expert Rod Stephens offers novice and experienced developers a comprehensive tutorial and reference to Visual Basic 2012. This latest edition introduces major changes to the Visual Studio development platform, including support for developing mobile applications that can take advantage of the Windows 8 operating system

  4. Troubling 'lived experience': a post-structural critique of mental health nursing qualitative research assumptions.

    Science.gov (United States)

    Grant, A

    2014-08-01

    Qualitative studies in mental health nursing research deploying the 'lived experience' construct are often written on the basis of conventional qualitative inquiry assumptions. These include the presentation of the 'authentic voice' of research participants, related to their 'lived experience' and underpinned by a meta-assumption of the 'metaphysics of presence'. This set of assumptions is critiqued on the basis of contemporary post-structural qualitative scholarship. Implications for mental health nursing qualitative research emerging from this critique are described in relation to illustrative published work, and some benefits and challenges for researchers embracing post-structural sensibilities are outlined.

  5. Some Finite Sample Properties and Assumptions of Methods for Determining Treatment Effects

    DEFF Research Database (Denmark)

    Petrovski, Erik

    2016-01-01

    This paper compares the assumptions and finite sample properties of select methods for determining treatment effects using Monte Carlo simulation. Three popular methods for determining treatment effects were chosen: ordinary least squares regression, propensity score matching, and inverse probability weighting. The assumptions and properties tested across these methods are: unconfoundedness, differences in average treatment effects and treatment effects on the treated, and overlap. The comparison highlights the pros and cons of using one method over another and the assumptions that researchers need to make for the method they choose.
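    The contrast between a naive comparison and one of the methods named in the record, inverse probability weighting, can be sketched on synthetic data (all numbers here are illustrative assumptions of ours, not from the paper):

    ```python
    # Synthetic illustration: one confounder x raises both the probability
    # of treatment and the outcome, so a naive comparison of treated vs.
    # untreated is biased, while inverse probability weighting (IPW) with
    # the known propensity score is not.
    import random

    random.seed(42)

    n = 20_000
    true_effect = 2.0
    y_obs, t_obs, p_obs = [], [], []

    for _ in range(n):
        x = random.random()                    # confounder, uniform on [0, 1)
        p = 0.2 + 0.6 * x                      # propensity score P(T = 1 | x)
        t = 1 if random.random() < p else 0    # treatment assignment
        y = 1.0 + 3.0 * x + true_effect * t + random.gauss(0.0, 1.0)
        y_obs.append(y); t_obs.append(t); p_obs.append(p)

    # Horvitz-Thompson style IPW estimator of the average treatment effect
    ate_ipw = (
        sum(y * t / p for y, t, p in zip(y_obs, t_obs, p_obs)) / n
        - sum(y * (1 - t) / (1 - p) for y, t, p in zip(y_obs, t_obs, p_obs)) / n
    )

    # Naive difference in means ignores the confounding through x
    n1 = sum(t_obs)
    ate_naive = (
        sum(y for y, t in zip(y_obs, t_obs) if t == 1) / n1
        - sum(y for y, t in zip(y_obs, t_obs) if t == 0) / (n - n1)
    )

    print(f"true effect: {true_effect}, IPW: {ate_ipw:.2f}, naive: {ate_naive:.2f}")
    ```

    Unconfoundedness holds here by construction (the propensity depends only on the observed x), and overlap holds because p stays within [0.2, 0.8]; when either assumption fails, all of the methods studied in the paper can break down.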

  6. Rebuilding the Basic Principles of Personal Information Protection in a Real-Name Cyberspace

    Institute of Scientific and Technical Information of China (English)

    杨晓娇

    2014-01-01

    Traditional principles of personal information protection were based on a right effective against all the world, in which the information subject is certain but the obligation subject is not. The widespread application of information and network technology, and in particular the implementation of the real-name system, has fundamentally changed this: under the contractual relationship of information collection, both the right-holder and the obligation-holder are now certain, and the relationship of rights and obligations has changed in nature, posing new challenges to the principles protecting personal information rights. Having examined personal information rights in a real-name environment, we should rebuild the principles of personal information protection. Specifically, the principles under a real-name system should comprise eight basic principles: personal information self-determination; lawful collection and processing of information; openness of the rules for collection and processing; equal status of the two parties to collection and processing; minimization of the amount of personal information collected; limited collection and use of personal information; secure management of personal information; and the remedial liability of information controllers.

  7. 38 CFR 21.142 - Adult basic education.

    Science.gov (United States)

    2010-07-01

    38 Pensions, Bonuses, and Veterans' Relief; VOCATIONAL REHABILITATION AND EDUCATION; Vocational Rehabilitation and Employment Under 38 U.S.C. Chapter 31; Special Rehabilitation Services; § 21.142 Adult basic education. (a) Definition. The term adult...

  8. China's top 10 events in basic research in 2007

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Under the joint auspices of the Ministry of Science & Technology (MOST) and the China Association for Science & Technology, more than 1,600 Chinese scholars, including CAS and CAE members, chief scientists of the National Basic Research Program (dubbed the 973 Program), and directors of national key labs, voted for China's top 10 events in basic research in 2007.

  9. Private Security Training. Phase 1: Basic. Instructor Edition.

    Science.gov (United States)

    Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.

    This basic module on private security training was designed under the direction of the Oklahoma Council on Law Enforcement Education and Training to teach basic skills necessary for entry-level employment in this field. This module contains six instructional units that cover the following topics: (1) interpreting the Oklahoma Security Guard and…

  10. Basic Soil Physicochemical Properties and Soil Fungal Diversity under Different Forest Types of Urban Forest

    Institute of Scientific and Technical Information of China (English)

    高微微; 康颖; 卢宏; 王秋玉

    2016-01-01

    ...mongolica, Pinus tabulaeformis var. mukdensis, Picea koraiensis, and forest edge grassland as a control, to determine the main soil physicochemical properties, including soil pH, relative water content, and electrical conductivity, and to characterize soil fungal metagenomic diversity. The three basic soil properties varied significantly among the soil samples: pH values of 4.597-7.393, relative water content of 4.11%-10.90%, and soil electrical conductivity of 953.000 to 3 443.333 μs·cm-1. The pH value and soil electrical conductivity were highest in the soil of the Juglans mandshurica plantation and lowest in the Larix gmelinii plantation. There were also large differences in soil fungal metagenomics among the eight soil samples. In total, 362 species, 211 genera, 124 families, 63 orders, 24 classes, and 8 phyla of Eumycota occurred across all soil samples. Clear changes appeared at the phylum and class levels of the Eumycota, including Ascomycota, Basidiomycota, Chytridiomycota, Zygomycota, and Glomeromycota. An ancient mycorrhizal fungus of the Ascomycota, newly discovered in recent years, was found in the forest soil of Pinus tabulaeformis var. mukdensis, while Agaricostilbomycetes fungi of the Pucciniomycotina (Basidiomycota) were detected in the control samples, Glomeromycota fungi in the forest soils of Juglans mandshurica and Fraxinus mandshurica, and Exobasidiomycetes fungi of the Ustilaginomycotina (Basidiomycota) only in the forest soil of Picea koraiensis and the control. At the phylum level, the dominant species were fungi of the Ascomycota in the forest soils of Juglans mandshurica, Fraxinus mandshurica, Larix gmelinii, and Pinus sylvestris var. mongolica, and Basidiomycota fungi in the forest soils of Betula platyphylla, Pinus tabulaeformis var. mukdensis, and Picea koraiensis. At the class level, the dominant species were mainly Agaricomycetes fungi; Sordariomycetes fungi of the Pezizomycotina (Ascomycota) were the dominant species only in the soil sample of Fraxinus mandshurica.

  11. Washington International Renewable Energy Conference (WIREC) 2008 Pledges. Methodology and Assumptions Summary

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, Bill [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bilello, Daniel E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cowlin, Shannon C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wise, Alison [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2008-08-01

    This report describes the methodology and assumptions used by NREL in quantifying the potential CO2 reductions resulting from more than 140 governments, international organizations, and private-sector representatives pledging to advance the uptake of renewable energy.

  12. Who needs the assumption of opportunistic behavior? Transaction cost economics does not!

    DEFF Research Database (Denmark)

    Koch, Carsten Allan

    2000-01-01

    The assumption of opportunistic behavior, familiar from transaction cost economics, has been and remains highly controversial. But opportunistic behavior, albeit undoubtedly an extremely important form of motivation, is not a necessary condition for the contractual problems studied by transaction...

  13. Basic Learning Processes in Childhood.

    Science.gov (United States)

    Reese, Hayne W.

    This book is an introduction to the psychological study of basic learning processes in children. Written for students who are not majors in psychology and who do not have much familiarity with the technical vocabulary of psychology, it has two themes: even the most basic kinds of learning are influenced by cognitive processes or mental activities;…

  14. Basics for Handling Food Safely

    Science.gov (United States)

    ... 888-MPHotline (1-888-674-6854). "Basics for Safe Food Handling": a fact sheet covering safe handling practices (e.g., keeping dishes in bowls of ice) together with a product storage chart of refrigerator (40 °F) and freezer (0 °F) times.

  15. Japanese Basic Course: Exercise Book.

    Science.gov (United States)

    Defense Language Inst., Washington, DC.

    This exercise book, prepared for use after Lesson 121 of the Defense Language Institute Basic Course in Japanese, provides for instruction in the use of Kanji dictionaries, familiarizes students with useful phrases and expressions that are not included in the Basic Course, and allows for greater variety in the classroom. The ten lessons, in the…

  16. Shale gas. Basic information

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-08-15

    The ongoing improvement of production technologies has enabled access to unconventional gas resources present in source rocks. Whether Poland is going to see a gas revolution depends chiefly on the geological conditions. At this point it is difficult to estimate the actual size of Poland's shale gas resources and the commercial viability of shale gas production. First results will be known in the next four or five years, when operators complete the work under exploration and appraisal licences granted to them by the Ministry of the Environment. The Polish government is offering licences on exceptionally favourable terms as an incentive for research on unconventional gas resources. Such an approach is driven by the strategic objective of ending Poland's reliance on foreign sources of natural gas in the future. Shale gas will not change Poland's and the region's energy landscape instantaneously. As in the case of all commodity and energy revolutions, changes occur slowly, but shale gas development offers huge opportunities for a permanent shift in the Polish and European energy sectors. Poland stands a chance of becoming fully independent of natural gas imports, and Polish companies a chance of improving their international standing.

  17. Assessing framing assumptions in quantitative health impact assessments: a housing intervention example.

    Science.gov (United States)

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2013-09-01

    Health impact assessment (HIA) is often used to determine ex ante the health impact of an environmental policy or an environmental intervention. Underpinning any HIA is the framing assumption, which defines the causal pathways mapping environmental exposures to health outcomes. The sensitivity of the HIA to its framing assumptions is often ignored. A novel method based on fuzzy cognitive maps (FCMs) is developed to quantify the framing assumptions in the assessment stage of an HIA and is then applied to a housing intervention (tightening insulation) as a case study. Framing assumptions of the case study were identified through a literature search of Ovid Medline (1948-2011). The FCM approach was used to identify the key variables that have the most influence in an HIA. Changes in air-tightness, ventilation, indoor air quality, and mould/humidity were identified as having the most influence on health. The FCM approach is widely applicable and can be used to inform the formulation of framing assumptions in any quantitative HIA of environmental interventions. We argue that it is necessary to explore and quantify framing assumptions prior to conducting a detailed quantitative HIA during the assessment stage.
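    The mechanics of a fuzzy cognitive map can be sketched as a signed weighted digraph iterated to a fixed point. The concept names below mirror the variables named in the abstract, but every edge weight is an illustrative assumption of ours, not a value from the study:

    ```python
    import math

    # Concepts from the housing case study; all edge weights are hypothetical.
    concepts = ["air_tightness", "ventilation", "indoor_air_quality",
                "mould_humidity", "health"]

    # weights[(i, j)] = assumed fuzzy causal influence of concept i on j
    weights = {
        ("air_tightness", "ventilation"): -0.7,
        ("ventilation", "indoor_air_quality"): 0.8,
        ("ventilation", "mould_humidity"): -0.6,
        ("indoor_air_quality", "health"): 0.9,
        ("mould_humidity", "health"): -0.5,
    }

    def squash(x):
        """Logistic transfer function keeping activations in (0, 1)."""
        return 1.0 / (1.0 + math.exp(-5.0 * x))

    def run_fcm(clamped, steps=30):
        """Iterate x_j <- squash(sum_i w_ij * x_i), holding scenario concepts fixed."""
        state = {c: clamped.get(c, 0.5) for c in concepts}
        for _ in range(steps):
            state = {
                c: clamped[c] if c in clamped else squash(
                    sum(w * state[src] for (src, dst), w in weights.items() if dst == c)
                )
                for c in concepts
            }
        return state

    tight = run_fcm({"air_tightness": 1.0})   # insulation tightened
    loose = run_fcm({"air_tightness": 0.0})   # baseline scenario
    print({c: round(tight[c], 3) for c in concepts})
    ```

    Under these assumed weights, clamping air-tightness high suppresses ventilation, which lowers indoor air quality and raises mould/humidity, so the health concept settles at a lower activation than in the baseline; this mirrors the abstract's finding that those four variables dominate the health outcome.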

  18. The Fitness of Assumptions and an Alternative Model for Funding the Public Sector Pension Scheme: The Case of Rio Grande do Sul

    Directory of Open Access Journals (Sweden)

    Paulo Roberto Caldart

    2014-12-01

    Full Text Available The research presented herein has two objectives. First, this study tests whether actuarial assumptions for public sector pension schemes in Brazil adhere to reality and whether changing these assumptions affects the results, particularly with respect to the life table and wage growth assumptions. The paper shows that the best-fit life table is AT 2000 for males aggregated by one year, which involves a longer life expectancy than the life table proposed under current legislation (IBGE 2009). The data also show that actual wage growth was 4.59% per year from 2002 to 2012, as opposed to the 1% wage increase proposed by the same legislation. Changing these two assumptions increases the actuarial imbalance for a representative individual by 18.17% after adopting the adjusted life table, or by 98.30% after revising the wage growth assumption. With respect to its second objective, this paper proposes alternative funding mechanisms in which the local pension scheme would provide the funded component of the benefit, to be complemented by the local government in a pay-as-you-go manner. The database utilized was for the state of Rio Grande do Sul in November 2011; the results are thus restricted to Rio Grande do Sul.
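    The sensitivity to the wage growth assumption is easy to see with a compound-growth sketch (a back-of-envelope illustration of ours, not the paper's actuarial model; the 30-year career horizon is assumed):

    ```python
    # Compounding the legislated 1% vs. the observed 4.59% annual wage
    # growth over an assumed 30-year contribution career.
    years = 30
    w0 = 1.0  # starting wage, normalized

    assumed = w0 * (1 + 0.01) ** years     # legislated assumption
    observed = w0 * (1 + 0.0459) ** years  # observed 2002-2012 rate, extrapolated

    print(f"final wage at 1.00%/y: {assumed:.2f}")
    print(f"final wage at 4.59%/y: {observed:.2f}")
    print(f"ratio: {observed / assumed:.2f}x")
    ```

    Since pension benefits scale with final wages, an assumption that understates wage growth by this margin understates the liability severalfold, which is consistent with the 98.30% increase in the actuarial imbalance reported above.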

  19. Nonparametric statistical tests for the continuous data: the basic concept and the practical use.

    Science.gov (United States)

    Nahm, Francis Sahngun

    2016-02-01

    Conventional statistical tests are usually called parametric tests. Parametric tests are used more frequently than nonparametric tests in many medical articles, because most medical researchers are familiar with them and statistical software packages strongly support them. Parametric tests require an important assumption, the assumption of normality, which means that the distribution of sample means is normally distributed. However, parametric tests can be misleading when this assumption is not satisfied. In such circumstances, nonparametric tests are the available alternative, because they do not require the normality assumption. Nonparametric tests are statistical methods based on signs and ranks. In this article, we discuss the basic concepts and practical use of nonparametric tests as a guide to their proper use.
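    The idea of a statistic "based on signs and ranks" can be made concrete with a hand-rolled Mann-Whitney U computation (a minimal sketch with made-up data; in practice one would use a statistics package such as scipy.stats):

    ```python
    # Mann-Whitney U computed from ranks only: no normality assumption
    # is needed because the raw values enter only through their ordering.

    def ranks(values):
        """Assign average 1-based ranks, handling ties."""
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0.0] * len(values)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average of ranks i+1 .. j+1
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    def mann_whitney_u(x, y):
        """U statistic for sample x versus sample y."""
        all_ranks = ranks(list(x) + list(y))
        r_x = sum(all_ranks[: len(x)])          # rank sum of the first sample
        return r_x - len(x) * (len(x) + 1) / 2  # U = R_x - n_x(n_x+1)/2

    a = [1.1, 2.3, 2.9, 3.8]
    b = [4.0, 4.4, 5.1, 6.2]
    print(mann_whitney_u(a, b))  # 0.0: every value in a is below every value in b
    ```

    A U of 0 (or of n_x * n_y, here 16, for the reversed comparison) is the most extreme separation the ranks can express, regardless of how non-normal the underlying distributions are.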

  20. Evaluating abundance estimate precision and the assumptions of a count-based index for small mammals

    Science.gov (United States)

    Wiewel, A.S.; Adams, A.A.Y.; Rodda, G.H.

    2009-01-01

    Conservation and management of small mammals requires reliable knowledge of population size. We investigated the precision of mark-recapture and removal abundance estimates generated from live-trapping and snap-trapping data collected at sites on Guam (n = 7), Rota (n = 4), Saipan (n = 5), and Tinian (n = 3), in the Mariana Islands. We also evaluated a common index, captures per unit effort (CPUE), as a predictor of abundance. In addition, we evaluated cost and time associated with implementing live-trapping and snap-trapping and compared species-specific capture rates of selected live- and snap-traps. For all species, mark-recapture estimates were consistently more precise than removal estimates based on coefficients of variation and 95% confidence intervals. The predictive utility of CPUE was poor but improved with increasing sampling duration. Nonetheless, modeling of sampling data revealed that underlying assumptions critical to application of an index of abundance, such as constant capture probability across space, time, and individuals, were not met. Although snap-trapping was cheaper and faster than live-trapping, the time difference was negligible when site preparation time was considered. Rattus diardii spp. captures were greatest in Haguruma live-traps (Standard Trading Co., Honolulu, HI) and Victor snap-traps (Woodstream Corporation, Lititz, PA), whereas Suncus murinus and Mus musculus captures were greatest in Sherman live-traps (H. B. Sherman Traps, Inc., Tallahassee, FL) and Museum Special snap-traps (Woodstream Corporation). Although snap-trapping and CPUE may have utility after validation against more rigorous methods, validation should occur across the full range of study conditions. Resources required for this level of validation would likely be better allocated towards implementing rigorous and robust methods.
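    As a concrete illustration of the index being evaluated, CPUE is simply captures divided by trapping effort. The sketch below is ours, not the authors' code; the half trap-night correction for sprung traps is an assumed convention, and the counts are invented:

    ```python
    # Captures per unit effort (CPUE), expressed per 100 trap-nights,
    # with effort corrected for sprung traps counted as half a trap-night
    # (a common convention, assumed here).

    def cpue(captures, traps, nights, sprung=0, per=100):
        effort = traps * nights - 0.5 * sprung  # corrected trap-nights
        return per * captures / effort

    # 25 captures over 50 traps x 4 nights, with 20 traps found sprung
    print(round(cpue(25, 50, 4, sprung=20), 2))  # 13.16 captures per 100 trap-nights
    ```

    The index is attractive precisely because it is this cheap to compute, but, as the study shows, it assumes constant capture probability across space, time, and individuals, an assumption the field data did not satisfy.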

  1. Basic properties of Fedosov supermanifolds

    CERN Document Server

    Geyer, B

    2004-01-01

    Basic properties of even (odd) supermanifolds endowed with a connection respecting a given symplectic structure are studied. Such supermanifolds can be considered as a generalization of Fedosov manifolds to the supersymmetric case.

  2. Complementary Basic Education in Tanzania

    OpenAIRE

    大津, 和子

    2001-01-01

    This paper discusses current development in the Complementary Basic Education program (COBET), which aims to contribute to the provision of alternative learning opportunities for out-of-school children, particularly girls in a non-formal setting. The Ministry of Education and Culture started the program as part of the Basic Education Master Plan (BEMP) in 1999. Unlike traditional primary schools, the COBET centers have no school fees, no uniforms, no corporal punishment and no child labou...

  3. World assumptions, posttraumatic stress and quality of life after a natural disaster: A longitudinal study

    Directory of Open Access Journals (Sweden)

    Nygaard Egil

    2012-06-01

    Full Text Available Abstract Background Changes in world assumptions are a fundamental concept within theories that explain posttraumatic stress disorder. The objective of the present study was to gain a greater understanding of how changes in world assumptions are related to quality of life and posttraumatic stress symptoms after a natural disaster. Methods A longitudinal study of 574 Norwegian adults who survived the Southeast Asian tsunami in 2004 was undertaken. Multilevel analyses were used to identify which factors at six months post-tsunami predicted quality of life and posttraumatic stress symptoms two years post-tsunami. Results Good quality of life and posttraumatic stress symptoms were negatively related. However, major differences in the predictors of these outcomes were found. Females reported significantly higher quality of life and more posttraumatic stress than men. The association between level of exposure to the tsunami and quality of life seemed to be mediated by posttraumatic stress. Negative perceived changes in the assumption "the world is just" were related to adverse outcomes in both quality of life and posttraumatic stress. Positive perceived changes in the assumptions "life is meaningful" and "feeling that I am a valuable human" were associated with higher levels of quality of life but not with posttraumatic stress. Conclusions Quality of life and posttraumatic stress symptoms demonstrate differences in their etiology. World assumptions may be less specifically related to posttraumatic stress than has been postulated in some cognitive theories.

  4. Are nest sites actively chosen? Testing a common assumption for three non-resource limited birds

    Science.gov (United States)

    Goodenough, A. E.; Elliot, S. L.; Hart, A. G.

    2009-09-01

    Many widely-accepted ecological concepts are simplified assumptions about complex situations that remain largely untested. One example is the assumption that nest-building species choose nest sites actively when they are not resource limited. This assumption has seen little direct empirical testing: most studies on nest-site selection simply assume that sites are chosen actively (and seek explanations for such behaviour) without considering that sites may be selected randomly. We used 15 years of data from a nestbox scheme in the UK to test the assumption of active nest-site choice in three cavity-nesting bird species that differ in breeding and migratory strategy: blue tit ( Cyanistes caeruleus), great tit ( Parus major) and pied flycatcher ( Ficedula hypoleuca). Nest-site selection was non-random (implying active nest-site choice) for blue and great tits, but not for pied flycatchers. We also considered the relative importance of year-specific and site-specific factors in determining occupation of nest sites. Site-specific factors were more important than year-specific factors for the tit species, while the reverse was true for pied flycatchers. Our results show that nest-site selection, in birds at least, is not always the result of active choice, such that choice should not be assumed automatically in studies of nesting behaviour. We use this example to highlight the need to test key ecological assumptions empirically, and the importance of doing so across taxa rather than for single "model" species.

  5. Basic analysis of regularized series and products

    CERN Document Server

    Jorgenson, Jay A

    1993-01-01

    Analytic number theory and part of the spectral theory of operators (differential, pseudo-differential, elliptic, etc.) are being merged under a more general analytic theory of regularized products of certain sequences satisfying a few basic axioms. The most basic examples consist of the sequence of natural numbers, the sequence of zeros with positive imaginary part of the Riemann zeta function, and the sequence of eigenvalues, say, of a positive Laplacian on a compact or certain cases of non-compact manifolds. The resulting theory is applicable to ergodic theory and dynamical systems; to the zeta and L-functions of number theory or representation theory and modular forms; to Selberg-like zeta functions; and to the theory of regularized determinants familiar in physics and other parts of mathematics. Aside from presenting a systematic account of widely scattered results, the theory also provides new results. One part of the theory deals with complex analytic properties, and another part deals with Fourier analys...

  6. Study on the Moral Dilemma of Legal Basic Education in Colleges and Universities of China under the Background of Moral Legalization

    Institute of Scientific and Technical Information of China (English)

    喻靖文

    2016-01-01

    Under the background of comprehensively promoting the rule of law and the legalization of morality, a tendency toward the moralization of courses, objectives, contents and the role of teachers has appeared in legal basic education in colleges and universities in China. As a result, the objective of the legal foundation course is difficult to realize and the legal quality of college students is difficult to improve. Moreover, the goal of comprehensively governing the country by law is difficult to advance. The keys to solving this dilemma are to improve the curriculum objectives of legal basic education in colleges and universities, restore the independent subject status of the legal basic education curriculum, enhance the effectiveness of legal education, and strengthen the legal culture construction of colleges and universities.

  7. COMPETITION VERSUS COLLUSION: THE PARALLEL BEHAVIOUR IN THE ABSENCE OF THE SYMMETRY ASSUMPTION

    Directory of Open Access Journals (Sweden)

    Romano Oana Maria

    2012-07-01

    Cartel detection is usually viewed as a key task of competition authorities. A special case of cartel is parallel behaviour in selling prices. This type of behaviour is difficult to assess, and its analysis does not always yield conclusive results. To evaluate such behaviour, the available data are compared with theoretical values obtained using a competitive or a collusive model. When different competitive or collusive models are considered, economists use the symmetry assumption on costs and quantities produced/sold for simplicity of calculation. This assumption has the disadvantage that the theoretical values obtained may deviate significantly from actual values (the real values on the market), which can sometimes lead to ambiguous results. The present paper analyses the parallel behaviour of economic agents in the absence of the symmetry assumption and studies the identification of the model under these conditions.
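
    To make the role of the symmetry assumption concrete, the sketch below computes Cournot duopoly equilibrium quantities with symmetric and asymmetric marginal costs; the linear demand and the cost numbers are illustrative assumptions, not values from the paper:

```python
# Cournot duopoly with linear inverse demand P = a - b*(q1 + q2)
# and constant marginal costs c1, c2 (possibly asymmetric).
# Solving both first-order conditions gives the standard closed form.
def cournot_quantities(a, b, c1, c2):
    q1 = (a - 2 * c1 + c2) / (3 * b)
    q2 = (a - 2 * c2 + c1) / (3 * b)
    return q1, q2

sym = cournot_quantities(100, 1, 20, 20)   # symmetric benchmark
asym = cournot_quantities(100, 1, 10, 30)  # same average cost, asymmetric
print("symmetric:", sym)
print("asymmetric:", asym)
```

With the same average cost, total output (and hence price) is unchanged here, but the firm-level quantities deviate strongly from the symmetric benchmark, which illustrates why symmetric theoretical values can stray far from observed firm-level market data.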

  8. Critical appraisal of assumptions in chains of model calculations used to project local climate impacts for adaptation decision support—the case of Baakse Beek

    Science.gov (United States)

    van der Sluijs, Jeroen P.; Arjan Wardekker, J.

    2015-04-01

    In order to enable anticipation and proactive adaptation, local decision makers increasingly seek detailed foresight about regional and local impacts of climate change. To this end, the Netherlands Models and Data-Centre implemented a pilot chain of sequentially linked models to project local climate impacts on hydrology, agriculture and nature under different national climate scenarios for a small region in the east of the Netherlands named Baakse Beek. The chain of models sequentially linked in that pilot includes a (future) weather generator and models of respectively subsurface hydrogeology, ground water stocks and flows, soil chemistry, vegetation development, crop yield and nature quality. These models typically have mismatching time step sizes and grid cell sizes. The linking of these models unavoidably involves the making of model assumptions that can hardly be validated, such as those needed to bridge the mismatches in spatial and temporal scales. Here we present and apply a method for the systematic critical appraisal of model assumptions that seeks to identify and characterize the weakest assumptions in a model chain. The critical appraisal of assumptions presented in this paper has been carried out ex-post. For the case of the climate impact model chain for Baakse Beek, the three most problematic assumptions were found to be: land use and land management kept constant over time; model linking of (daily) ground water model output to the (yearly) vegetation model around the root zone; and aggregation of daily output of the soil hydrology model into yearly input of a so-called ‘mineralization reduction factor’ (calculated from annual average soil pH and daily soil hydrology) in the soil chemistry model. Overall, the method for critical appraisal of model assumptions presented and tested in this paper yields a rich qualitative insight in model uncertainty and model quality. It promotes reflectivity and learning in the modelling community, and leads to

  9. CCN predictions using simplified assumptions of organic aerosol composition and mixing state: a synthesis from six different locations

    Directory of Open Access Journals (Sweden)

    B. Ervens

    2010-05-01

    An accurate but simple quantification of the fraction of aerosol particles that can act as cloud condensation nuclei (CCN) is needed for implementation in large-scale models. Data on aerosol size distribution, chemical composition, and CCN concentration from six different locations have been analyzed to explore the extent to which simple assumptions about the composition and mixing state of the organic fraction can reproduce measured CCN number concentrations.

    Fresher pollution aerosol as encountered in Riverside, CA, and the ship channel in Houston, TX, cannot be represented without knowledge of more complex (size-resolved) composition. For aerosol that has experienced processing (Mexico City; Holme Moss, UK; Point Reyes, CA; and Chebogue Point, Canada), CCN can be predicted within a factor of two assuming either externally or internally mixed soluble organics, although these simplified compositions/mixing states might not represent the actual properties of ambient aerosol populations, in agreement with many previous CCN studies in the literature. Under typical conditions, a factor of two uncertainty in CCN concentration due to composition assumptions translates to an uncertainty of ~15% in cloud drop concentration, which might be adequate for large-scale models given the much larger uncertainty in cloudiness.
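
    The ~15% figure follows from the sublinear response of drop number to CCN. Assuming a power-law relation N_d ∝ N_CCN^β with β ≈ 0.2 (an illustrative exponent chosen to match the quoted number, not stated in the abstract), a factor-of-two CCN uncertainty propagates as:

```python
beta = 0.2                       # assumed sensitivity exponent (illustrative)
ccn_factor = 2.0                 # factor-of-two CCN uncertainty
drop_factor = ccn_factor ** beta # 2**0.2 ~ 1.15
print(f"{(drop_factor - 1) * 100:.0f}% uncertainty in drop concentration")
```

That is, a factor of two in CCN maps to roughly a 15% change in drop concentration under this assumed exponent.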

  10. Sensitivity of Rooftop PV Projections in the SunShot Vision Study to Market Assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Drury, E.; Denholm, P.; Margolis, R.

    2013-01-01

    The SunShot Vision Study explored the potential growth of solar markets if solar prices decreased by about 75% from 2010 to 2020. The SolarDS model was used to simulate rooftop PV demand for this study, based on several PV market assumptions--future electricity rates, customer access to financing, and others--in addition to the SunShot PV price projections. This paper finds that modeled PV demand is highly sensitive to several non-price market assumptions, particularly PV financing parameters.

  11. Automatic ethics: the effects of implicit assumptions and contextual cues on moral behavior.

    Science.gov (United States)

    Reynolds, Scott J; Leavitt, Keith; DeCelles, Katherine A

    2010-07-01

    We empirically examine the reflexive or automatic aspects of moral decision making. To begin, we develop and validate a measure of an individual's implicit assumption regarding the inherent morality of business. Then, using an in-basket exercise, we demonstrate that an implicit assumption that business is inherently moral impacts day-to-day business decisions and interacts with contextual cues to shape moral behavior. Ultimately, we offer evidence supporting a characterization of employees as reflexive interactionists: moral agents whose automatic decision-making processes interact with the environment to shape their moral behavior.

  12. A criterion of orthogonality on the assumption and restrictions in subgrid-scale modelling of turbulence

    Science.gov (United States)

    Fang, L.; Sun, X. Y.; Liu, Y. W.

    2016-12-01

    To shed light on subgrid-scale (SGS) modelling methodology, we analyze and define the concepts of assumption and restriction in the modelling procedure, then show by a generalized derivation that if a model is subject to multiple stationary restrictions, the corresponding assumption function must satisfy a criterion of orthogonality. Numerical tests using the one-dimensional nonlinear advection equation are performed to validate this criterion. This study is expected to inspire future research on generally guiding the SGS modelling methodology.
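
    As a rough illustration of the kind of one-dimensional nonlinear advection test mentioned above (the paper's actual setup and SGS terms are not given here), a minimal upwind finite-difference solver for the inviscid Burgers equation u_t + u u_x = 0 might look like:

```python
import numpy as np

# Periodic grid on [0, 2*pi) with a smooth initial condition.
nx, nt = 200, 100
dx = 2 * np.pi / nx
dt = 0.5 * dx            # CFL-stable since |u| <= 1 initially
x = np.arange(nx) * dx
u = np.sin(x)

for _ in range(nt):
    um = np.roll(u, 1)   # left neighbour (periodic)
    up = np.roll(u, -1)  # right neighbour (periodic)
    # First-order upwind derivative, switching on the sign of u
    dudx = np.where(u > 0, (u - um) / dx, (up - u) / dx)
    u = u - dt * u * dudx

print("max |u| after", nt, "steps:", np.abs(u).max())
```

The scheme is monotone at this CFL number, so the solution stays bounded by its initial extrema even after the shock forms; resolved versus filtered runs of such a solver are the natural testbed for comparing SGS assumption functions.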

  13. Impact of unseen assumptions on communication of atmospheric carbon mitigation options

    Science.gov (United States)

    Elliot, T. R.; Celia, M. A.; Court, B.

    2010-12-01

    With the rapid access and dissemination of information made available through online and digital pathways, there is need for a concurrent openness and transparency in communication of scientific investigation. Even with open communication it is essential that the scientific community continue to provide impartial result-driven information. An unknown factor in climate literacy is the influence of an impartial presentation of scientific investigation that has utilized biased base-assumptions. A formal publication appendix, and additional digital material, provides active investigators a suitable framework and ancillary material to make informed statements weighted by assumptions made in a study. However, informal media and rapid communiqués rarely make such investigatory attempts, often citing headline or key phrasing within a written work. This presentation is focused on Geologic Carbon Sequestration (GCS) as a proxy for the wider field of climate science communication, wherein we primarily investigate recent publications in GCS literature that produce scenario outcomes using apparently biased pro- or con- assumptions. A general review of scenario economics, capture process efficacy and specific examination of sequestration site assumptions and processes, reveals an apparent misrepresentation of what we consider to be a base-case GCS system. The authors demonstrate the influence of the apparent bias in primary assumptions on results from commonly referenced subsurface hydrology models. By use of moderate semi-analytical model simplification and Monte Carlo analysis of outcomes, we can establish the likely reality of any GCS scenario within a pragmatic middle ground. Secondarily, we review the development of publicly available web-based computational tools and recent workshops where we presented interactive educational opportunities for public and institutional participants, with the goal of base-assumption awareness playing a central role. Through a series of

  14. Basic State Party Functions and Skills Under CWC

    Science.gov (United States)

    1992-09-01

    Capabilities: inspection support expertise. Cited Treaty Text: Annex to Article IV (Chemical Weapons...). The Coriolis acceleration due to the earth's rotation induces lateral dispersion of the trajectory for a zero-bank-angle flight. If desired, this effect can be ... The cross-product of the vehicle velocity vector and the earth's rotational vector has induced a lateral dispersion of the trajectory at the hypersonic cruise

  15. E-Basics: Online Basic Training in Program Evaluation

    Science.gov (United States)

    Silliman, Ben

    2016-01-01

    E-Basics is an online training in program evaluation concepts and skills designed for youth development professionals, especially those working in nonformal science education. Ten hours of online training in seven modules is designed to prepare participants for mentoring and applied practice, mastery, and/or team leadership in program evaluation.…

  16. Basic Operational Robotics Instructional System

    Science.gov (United States)

    Todd, Brian Keith; Fischer, James; Falgout, Jane; Schweers, John

    2013-01-01

    The Basic Operational Robotics Instructional System (BORIS) is a six-degree-of-freedom rotational robotic manipulator system simulation used for training of fundamental robotics concepts, with in-line shoulder, offset elbow, and offset wrist. BORIS is used to provide generic robotics training to aerospace professionals including flight crews, flight controllers, and robotics instructors. It uses forward kinematic and inverse kinematic algorithms to simulate joint and end-effector motion, combined with a multibody dynamics model, moving-object contact model, and X-Windows based graphical user interfaces, coordinated in the Trick Simulation modeling environment. The motivation for development of BORIS was the need for a generic system for basic robotics training. Before BORIS, introductory robotics training was done with either the SRMS (Shuttle Remote Manipulator System) or SSRMS (Space Station Remote Manipulator System) simulations. The unique construction of each of these systems required some specialized training that distracted students from the ideas and goals of the basic robotics instruction.
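
    The forward/inverse kinematic pairing that BORIS exercises in six degrees of freedom can be illustrated in miniature with a two-link planar arm; the link lengths and joint angles below are made-up toy values, not BORIS parameters:

```python
import math

# Forward kinematics: joint angles -> end-effector position
# for a two-link planar arm with link lengths l1, l2.
def forward(theta1, theta2, l1=1.0, l2=0.8):
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Closed-form inverse kinematics (elbow-down branch).
def inverse(x, y, l1=1.0, l2=0.8):
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp for safety
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2

x, y = forward(0.3, 0.7)
t1, t2 = inverse(x, y)
print(round(t1, 6), round(t2, 6))
```

Round-tripping a joint pose through forward then inverse kinematics recovers the original angles, which is the basic consistency check any FK/IK pair in a trainer like this must pass.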

  17. Basic research for environmental restoration

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-01

    The Department of Energy (DOE) is in the midst of a major environmental restoration effort to reduce the health and environmental risks resulting from past waste management and disposal practices at DOE sites. This report describes research needs in environmental restoration and complements a previously published document, DOE/ER-0419, Evaluation of Mid-to-Long Term Basic Research for Environmental Restoration. Basic research needs have been grouped into five major categories patterned after those identified in DOE/ER-0419: (1) environmental transport and transformations; (2) advanced sampling, characterization, and monitoring methods; (3) new remediation technologies; (4) performance assessment; and (5) health and environmental effects. In addition to basic research, this document deals with education and training needs for environmental restoration. 2 figs., 6 tabs.

  18. Examining recent expert elicitation, judgment guidelines: Value assumptions and the prospects for rationality

    Energy Technology Data Exchange (ETDEWEB)

    Fleming, P.A. [Creighton Univ., Omaha, NE (United States). Dept. of Philosophy

    1999-12-01

    Any examination of the role of values in decisions on risk must take into consideration the increasing reliance on the expert judgment method. Today, reliance on expert judgment is conspicuously present in the documents and work associated with site characterization of Yucca Mountain as a host for the United States' first high-level nuclear waste repository. The NRC encourages the use of probabilistic risk assessment's state-of-the-art technology as a complement to deterministic approaches to nuclear regulatory activities. It considers expert judgment one of those technologies. At the last International Conference on High-Level Nuclear Waste Development, several presentations reported on the use of expert elicitation sessions held during 1997 at Yucca Mountain. Over a decade ago, few guidelines existed for Department of Energy work in expert judgment. In an analysis of these guidelines, I described the author-advocate's view of the role of values in this method of risk assessment. I suggested that the guidelines assume naive positivism. I noted that the creators of these guidelines also tend toward scientific realism in their apologetic tone that expert judgment falls short of representing the way nature is. I also pointed to a tendency toward what I call a heightened or super-realism: normal science represents the way the world is, and for expert judgment this is only likely so. The expert judgment method, however, is capable of truly capturing expertise in a representative sense. The purpose of this paper is to examine new guidelines from the Department of Energy and the Nuclear Regulatory Commission, with a view to eliciting the epistemological assumptions about the role of values and the status of objectivity claimed for this method. Do these new guidelines also adopt naive positivism? Does the inability to encounter raw, pure, value-neutral expert judgment reveal itself in these guidelines? Or do these guidelines adopt the belief that values are not

  19. Tests of the frozen-flux and tangentially geostrophic assumptions using magnetic satellite data

    DEFF Research Database (Denmark)

    Chulliat, A.; Olsen, Nils; Sabaka, T.

    In 1984, Jean-Louis Le Mouël published a paper suggesting that the flow at the top of the Earth’s core is tangentially geostrophic, i.e., the Lorentz force is much smaller than the Coriolis force in this particular region of the core. This new assumption was subsequently used to discriminate among...

  20. HIERARCHICAL STRUCTURE IN ADL AND IADL - ANALYTICAL ASSUMPTIONS AND APPLICATIONS FOR CLINICIAN AND RESEARCHERS

    NARCIS (Netherlands)

    KEMPEN, GIJM; MYERS, AM; POWELL, LE

    1995-01-01

    The results of a Canadian study have shown that a set of 12 (I)ADL items did not meet the criteria of Guttman's scalogram program, questioning the assumption of hierarchical ordering. In this article, the hierarchical structure of (I)ADL items from the Canadian elderly sample is retested with anothe

  1. 76 FR 22925 - Assumption Buster Workshop: Abnormal Behavior Detection Finds Malicious Actors

    Science.gov (United States)

    2011-04-25

    ... Assumption Buster Workshop: Abnormal Behavior Detection Finds Malicious Actors AGENCY: The National... assumptionbusters@nitrd.gov . Travel expenses will be paid at the government rate for selected participants who live... behavioral models to monitor the size and destinations of financial transfers, and/or on-line...

  2. Comparison of Three Common Experimental Designs to Improve Statistical Power When Data Violate Parametric Assumptions.

    Science.gov (United States)

    Porter, Andrew C.; McSweeney, Maryellen

    A Monte Carlo technique was used to investigate the small sample goodness of fit and statistical power of several nonparametric tests and their parametric analogues when applied to data which violate parametric assumptions. The motivation was to facilitate choice among three designs, simple random assignment with and without a concomitant variable…
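
    A miniature version of such a Monte Carlo power comparison, pitting a mean-based statistic against a rank-based (nonparametric) statistic under skewed exponential data, might look like the following; the sample sizes, shift, and replication counts are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def perm_pvalue(a, b, stat, n_perm=200):
    """Two-sided permutation p-value for statistic `stat`."""
    pooled = np.concatenate([a, b])
    obs = abs(stat(a, b))
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if abs(stat(pooled[:len(a)], pooled[len(a):])) >= obs:
            count += 1
    return (count + 1) / (n_perm + 1)

def mean_diff(a, b):                 # parametric-style statistic
    return a.mean() - b.mean()

def rank_diff(a, b):                 # nonparametric analogue (rank means)
    ranks = np.argsort(np.argsort(np.concatenate([a, b])))
    return ranks[:len(a)].mean() - ranks[len(a):].mean()

def power(stat, n=25, shift=0.8, reps=200, alpha=0.05):
    """Fraction of simulated experiments in which the test rejects."""
    hits = sum(
        perm_pvalue(rng.exponential(1, n),
                    rng.exponential(1, n) + shift, stat) < alpha
        for _ in range(reps)
    )
    return hits / reps

print("mean-based power:", power(mean_diff))
print("rank-based power:", power(rank_diff))
```

Repeating this across designs (e.g., with and without a concomitant variable) is exactly the kind of comparison the study describes: estimated rejection rates tell you which test and design keeps power when parametric assumptions fail.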

  3. A Taxonomy of Latent Structure Assumptions for Probability Matrix Decomposition Models.

    Science.gov (United States)

    Meulders, Michel; De Boeck, Paul; Van Mechelen, Iven

    2003-01-01

    Proposed a taxonomy of latent structure assumptions for probability matrix decomposition (PMD) that includes the original PMD model and a three-way extension of the multiple classification latent class model. Simulation study results show the usefulness of the taxonomy. (SLD)

  4. Monitoring long-lasting insecticidal net (LLIN) durability to validate net serviceable life assumptions, in Rwanda

    NARCIS (Netherlands)

    Hakizimana, E.; Cyubahiro, B.; Rukundo, A.; Kabayiza, A.; Mutabazi, A.; Beach, R.; Patel, R.; Tongren, J.E.; Karema, C.

    2014-01-01

    Background To validate assumptions about the length of the distribution–replacement cycle for long-lasting insecticidal nets (LLINs) in Rwanda, the Malaria and other Parasitic Diseases Division, Rwanda Ministry of Health, used World Health Organization methods to independently confirm the three-year

  5. The National Teacher Corps: A Study of Shifting Goals and Changing Assumptions

    Science.gov (United States)

    Eckert, Sarah Anne

    2011-01-01

    This article investigates the lasting legacy of the National Teacher Corps (NTC), which was created in 1965 by the U.S. federal government with two crucial assumptions: that teaching poor urban children required a very specific skill set and that teacher preparation programs were not providing adequate training in these skills. Analysis reveals…

  6. How Do People Learn at the Workplace? Investigating Four Workplace Learning Assumptions

    NARCIS (Netherlands)

    Kooken, Jose; Ley, Tobias; Hoog, de Robert; Duval, Erik; Klamma, Ralf

    2007-01-01

    Any software development project is based on assumptions about the state of the world that probably will hold when it is fielded. Investigating whether they are true can be seen as an important task. This paper describes how an empirical investigation was designed and conducted for the EU funded APO

  7. Challenging Assumptions about Values, Interests and Power in Further and Higher Education Partnerships

    Science.gov (United States)

    Elliott, Geoffrey

    2017-01-01

    This article raises questions that challenge assumptions about values, interests and power in further and higher education partnerships. These topics were explored in a series of semi-structured interviews with a sample of principals and senior higher education partnership managers of colleges spread across a single region in England. The data…

  8. Kinematic and static assumptions for homogenization in micromechanics of granular materials

    NARCIS (Netherlands)

    Kruyt, N.P.; Rothenburg, L.

    2004-01-01

    A study is made of kinematic and static assumptions for homogenization in micromechanics of granular materials for two cases. The first case considered deals with the elastic behaviour of isotropic, two-dimensional assemblies with bonded contacts. Using a minimum potential energy principle and estim

  9. Investigating assumptions of crown archetypes for modelling LiDAR returns

    NARCIS (Netherlands)

    Calders, K.; Lewis, P.; Disney, M.; Verbesselt, J.; Herold, M.

    2013-01-01

    LiDAR has the potential to derive canopy structural information such as tree height and leaf area index (LAI), via models of the LiDAR signal. Such models often make assumptions regarding crown shape to simplify parameter retrieval and crown archetypes are typically assumed to contain a turbid mediu

  10. Post Stereotypes: Deconstructing Racial Assumptions and Biases through Visual Culture and Confrontational Pedagogy

    Science.gov (United States)

    Jung, Yuha

    2015-01-01

    The Post Stereotypes project embodies confrontational pedagogy and involves postcard artmaking designed to both solicit expression of and deconstruct students' racial, ethnic, and cultural stereotypes and assumptions. As part of the Cultural Diversity in American Art course, students created postcard art that visually represented their personal…

  11. Net Generation at Social Software: Challenging Assumptions, Clarifying Relationships and Raising Implications for Learning

    Science.gov (United States)

    Valtonen, Teemu; Dillon, Patrick; Hacklin, Stina; Vaisanen, Pertti

    2010-01-01

    This paper takes as its starting point assumptions about use of information and communication technology (ICT) by people born after 1983, the so called net generation. The focus of the paper is on social networking. A questionnaire survey was carried out with 1070 students from schools in Eastern Finland. Data are presented on students' ICT-skills…

  12. 42 CFR 417.120 - Fiscally sound operation and assumption of financial risk.

    Science.gov (United States)

    2010-10-01

    ... liability claims, fire, theft, fraud, embezzlement, and other casualty risks. (2) Financial plan requirement... financial risk. 417.120 Section 417.120 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT...: Organization and Operation § 417.120 Fiscally sound operation and assumption of financial risk. (a)...

  13. World assumptions, religiosity, and PTSD in survivors of intimate partner violence.

    Science.gov (United States)

    Lilly, Michelle M; Howell, Kathryn H; Graham-Bermann, Sandra

    2015-01-01

    Intimate partner violence (IPV) is among the most frequent types of violence affecting women annually. One frequent outcome of violence exposure is posttraumatic stress disorder (PTSD). The theory of shattered world assumptions represents one possible explanation for adverse mental health outcomes following trauma, contending that trauma disintegrates individuals' core assumptions that the world is safe and meaningful, and that the self is worthy. Research that explores world assumptions in relation to survivors of IPV has remained absent. A more consistent finding in research on IPV suggests that religiosity is strongly associated with survivors' reactions to, and recovery from, IPV. The present study found that world assumptions were a significant mediator of the relationship between IPV exposure and PTSD symptoms. Religiosity was also significantly, positively related to PTSD symptoms, but was not significantly related to amount of IPV exposure. Though African American women reported more IPV exposure and greater religiosity than European American women in the sample, there were no interethnic differences in PTSD symptom endorsement. Implications of these findings are discussed.
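
    The mediation claim (IPV exposure → shattered world assumptions → PTSD) can be sketched with the standard product-of-coefficients approach on simulated data; the variables, effect sizes, and structure below are invented for illustration and are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000

# Simulated data with the hypothesized structure:
# X = IPV exposure, M = (shattered) world assumptions, Y = PTSD symptoms.
x = rng.normal(size=n)
m = 0.6 * x + rng.normal(size=n)           # a-path: X -> M
y = 0.5 * m + 0.1 * x + rng.normal(size=n) # b-path plus direct effect

def slope(pred, out):
    """Simple OLS slope of out on pred."""
    return np.cov(pred, out)[0, 1] / np.var(pred, ddof=1)

a = slope(x, m)                            # X -> M coefficient

# b-path: effect of M on Y controlling for X (two-predictor OLS).
X = np.column_stack([np.ones(n), x, m])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
b = beta[2]

indirect = a * b                           # mediated (indirect) effect
print("indirect effect:", round(indirect, 2))
```

With these invented coefficients the indirect effect should land near 0.6 × 0.5 = 0.3; in practice one would add a bootstrap confidence interval around the product to test the mediation formally.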

  14. The Impact of Feedback Frequency on Learning and Task Performance: Challenging the "More Is Better" Assumption

    Science.gov (United States)

    Lam, Chak Fu; DeRue, D. Scott; Karam, Elizabeth P.; Hollenbeck, John R.

    2011-01-01

    Previous research on feedback frequency suggests that more frequent feedback improves learning and task performance (Salmoni, Schmidt, & Walter, 1984). Drawing from resource allocation theory (Kanfer & Ackerman, 1989), we challenge the "more is better" assumption and propose that frequent feedback can overwhelm an individual's cognitive resource…

  15. A critical assessment of the ecological assumptions underpinning compensatory mitigation of salmon-derived nutrients

    Science.gov (United States)

    Collins, Scott F.; Marcarelli, Amy M.; Baxter, Colden V.; Wipfli, Mark S.

    2015-01-01

    We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited.

  16. Benchmarking biological nutrient removal in wastewater treatment plants: influence of mathematical model assumptions

    DEFF Research Database (Denmark)

    Flores-Alsina, Xavier; Gernaey, Krist V.; Jeppsson, Ulf

    2012-01-01

    This paper examines the effect of different model assumptions when describing biological nutrient removal (BNR) by the activated sludge models (ASM) 1, 2d & 3. The performance of a nitrogen removal (WWTP1) and a combined nitrogen and phosphorus removal (WWTP2) benchmark wastewater treatment plant...

  17. Vocational Didactics: Core Assumptions and Approaches from Denmark, Germany, Norway, Spain and Sweden

    Science.gov (United States)

    Gessler, Michael; Moreno Herrera, Lázaro

    2015-01-01

    The design of vocational didactics has to meet special requirements. Six core assumptions are identified: outcome orientation, cultural-historical embedding, horizontal structure, vertical structure, temporal structure, and the changing nature of work. Different approaches and discussions from school-based systems (Spain and Sweden) and dual…

  18. Complex Learning Theory--Its Epistemology and Its Assumptions about Learning: Implications for Physical Education

    Science.gov (United States)

    Light, Richard

    2008-01-01

    Davis and Sumara (2003) argue that differences between commonsense assumptions about learning and those upon which constructivism rests present a significant challenge for the fostering of constructivist approaches to teaching in schools. Indeed, as Rink (2001) suggests, initiating any change process for teaching method needs to involve some…

  19. Conceptualizing Identity Development: Unmasking the Assumptions within Inventories Measuring Identity Development

    Science.gov (United States)

    Moran, Christy D.

    2009-01-01

    The purpose of this qualitative research was to analyze the dimensions and manifestations of identity development embedded within commonly used instruments measuring student identity development. To this end, a content analysis of ten identity assessment tools was conducted to determine the assumptions about identity development contained therein.…

  20. Experimental evaluation of the pure configurational stress assumption in the flow dynamics of entangled polymer melts

    DEFF Research Database (Denmark)

    Rasmussen, Henrik K.; Bejenariu, Anca Gabriela; Hassager, Ole

    2010-01-01

    with the assumption of pure configurational stress was accurately able to predict the startup as well as the reversed flow behavior. This confirms that this commonly used theoretical picture for the flow of polymeric liquids is a correct physical principle to apply. © 2010 The Society of Rheology. [DOI: 10.1122/1.3496378]...

  1. Mutual assumptions and facts about nondisclosure among clinical supervisors and students in group supervision

    DEFF Research Database (Denmark)

    Nielsen, Geir Høstmark; Skjerve, Jan; Jacobsen, Claus Haugaard;

    2009-01-01

    In the two preceding papers of this issue of Nordic Psychology the authors report findings from a study of nondisclosure among student therapists and clinical supervisors. The findings were reported separately for each group. In this article, the two sets of findings are held together and compared, so as to draw a picture of mutual assumptions and facts about nondisclosure among students and supervisors.

  2. The Mediating Effect of World Assumptions on the Relationship between Trauma Exposure and Depression

    Science.gov (United States)

    Lilly, Michelle M.; Valdez, Christine E.; Graham-Bermann, Sandra A.

    2011-01-01

    The association between trauma exposure and mental health-related challenges such as depression are well documented in the research literature. The assumptive world theory was used to explore this relationship in 97 female survivors of intimate partner violence (IPV). Participants completed self-report questionnaires that assessed trauma history,…

  3. Electronic imaging fundamentals: basic theory.

    Science.gov (United States)

    Vizy, K N

    1983-01-01

    Introduction of the computer into the field of medical imaging, as typified by the extensive use of digital subtraction angiography (DSA), created an important need for a basic understanding of the principles of digital imaging. This paper reviews these fundamental principles, starting with the definition of images and the interaction of these images with television display systems, then continuing with a detailed description of the way in which imaging systems are specified. This work defines the basic terms and concepts that will be used throughout the contents of this issue.

  4. Stereochemistry basic concepts and applications

    CERN Document Server

    Nógrádi, M

    2013-01-01

    Stereochemistry: Basic Concepts and Applications is a three-chapter text that introduces the basic principles and concepts of stereochemistry, as well as its application to organic chemistry. Chapter 1 first describes the stereochemistry of the ground state, specifically the configuration and conformation of organic compounds, as well as the most important methods for its investigation. This chapter also deals with the kinetics of conformational changes and provides an overview of so-called "applied stereochemistry". Chapter 2 focuses on the analysis of the internal motions of

  5. Basic linear partial differential equations

    CERN Document Server

    Treves, Francois

    2006-01-01

    Focusing on the archetypes of linear partial differential equations, this text for upper-level undergraduates and graduate students features most of the basic classical results. The methods, however, are decidedly nontraditional: in practically every instance, they tend toward a high level of abstraction. This approach recalls classical material to contemporary analysts in a language they can understand, as well as exploiting the field's wealth of examples as an introduction to modern theories. The four-part treatment covers the basic examples of linear partial differential equations and their

  6. The chemisorptive bond basic concepts

    CERN Document Server

    Clark, Alfred

    1974-01-01

    The Chemisorptive Bond: Basic Concepts describes the basic concepts of the chemisorptive bond on solid surfaces, from simple analogies with ordinary chemical bonds to quantum-mechanical approaches. This book is composed of 10 chapters and begins with discussions of simple formulas for correlating measurable quantities in chemisorption and catalysis. The succeeding chapters deal with theories based on quantum-mechanical principles that describe the mutual interactions of atoms of the solid and foreign atoms on the surface. The remaining chapters consider the possible arrangements

  7. Estimating the position of illuminants in paintings under weak model assumptions: an application to the works of two Baroque masters

    Science.gov (United States)

    Kale, David; Stork, David G.

    2009-02-01

    The problems of estimating the position of an illuminant and the direction of illumination in realist paintings have been addressed using algorithms from computer vision. These algorithms fall into two general categories: in model-independent methods (cast-shadow analysis, occluding-contour analysis, ...), one does not need to know or assume the three-dimensional shapes of the objects in the scene. In model-dependent methods (shape-from-shading, full computer graphics synthesis, ...), one does need to know or assume the three-dimensional shapes. We explore the intermediate- or weak-model condition, where the three-dimensional object rendered is so simple that one can very confidently assume its three-dimensional shape and, further, that this shape admits an analytic derivation of the appearance model. Specifically, we can assume that floors and walls are flat and that they are horizontal and vertical, respectively. We derived the maximum-likelihood estimator for the two-dimensional spatial location of a point source in an image as a function of the pattern of brightness (or grayscale value) over such a planar surface. We applied our methods to two paintings of the Baroque, paintings for which the question of the illuminant position is of interest to art historians: Georges de la Tour's Christ in the carpenter's studio (1645) and Caravaggio's The calling of St. Matthew (1599-1600). Our analyses show that a single point source (somewhat near to the depicted candle) is a slightly better explanation of the pattern of brightness on the floor in Christ than are two point sources, one in place of each of the figures. The luminance pattern on the rear wall in The calling implies the source is local, a few meters outside the picture frame, not the infinitely distant sun. Both results are consistent with previous rebuttals of the recent art historical claim that these paintings were executed by means of tracing optically projected images. Our method is the first application of such weak-model methods for inferring the location of illuminants in realist paintings and should find use in other questions in the history of art.
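    The estimator described above can be illustrated with a simplified sketch. Assuming a Lambertian horizontal floor and i.i.d. Gaussian noise (so least squares coincides with maximum likelihood), a grid search over candidate source positions recovers a synthetic point source. The geometry, noise level, source height, and grid resolution below are all illustrative assumptions, not the authors' actual implementation:

```python
import numpy as np

def brightness(points, src):
    """Lambertian brightness on a horizontal floor from a point source.

    points: (N, 2) floor coordinates; src: (x, y, h) source position.
    I is proportional to cos(theta) / r^2, with cos(theta) = h / r
    for a flat horizontal floor, i.e. I = h / r^3.
    """
    d = points - src[:2]
    r2 = (d ** 2).sum(axis=1) + src[2] ** 2
    return src[2] / r2 ** 1.5

rng = np.random.default_rng(3)
floor = rng.uniform(-1.0, 1.0, size=(400, 2))   # sampled floor patch
true_src = np.array([0.3, -0.2, 0.8])           # hypothetical source
obs = brightness(floor, true_src) * (1 + 0.02 * rng.standard_normal(400))

# Least-squares grid search over candidate (x, y) at a known height:
# under i.i.d. Gaussian noise this is the maximum-likelihood estimate.
xs = np.linspace(-1, 1, 81)
ys = np.linspace(-1, 1, 81)
best, best_err = None, np.inf
for x in xs:
    for y in ys:
        model = brightness(floor, np.array([x, y, 0.8]))
        # Fit the overall gain analytically (source intensity unknown).
        gain = (model @ obs) / (model @ model)
        err = ((obs - gain * model) ** 2).sum()
        if err < best_err:
            best, best_err = (x, y), err
print(f"estimated source position: ({best[0]:.3f}, {best[1]:.3f})")
```

    With a few percent of noise, the estimate lands within a grid cell or two of the simulated source, which is the essence of the weak-model approach: a confidently assumed surface geometry turns the brightness pattern into a usable likelihood.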

  8. Calibration plot for proteomics: A graphical tool to visually check the assumptions underlying FDR control in quantitative experiments.

    Science.gov (United States)

    Giai Gianetto, Quentin; Combes, Florence; Ramus, Claire; Bruley, Christophe; Couté, Yohann; Burger, Thomas

    2016-01-01

    In MS-based quantitative proteomics, the FDR control (i.e. the limitation of the number of proteins that are wrongly claimed as differentially abundant between several conditions) is a major postanalysis step. It is classically achieved thanks to a specific statistical procedure that computes the adjusted p-values of the putative differentially abundant proteins. Unfortunately, such adjustment is conservative only if the p-values are well-calibrated; the false discovery control being spuriously underestimated otherwise. However, well-calibration is a property that can be violated in some practical cases. To overcome this limitation, we propose a graphical method to straightforwardly and visually assess the p-value well-calibration, as well as the R codes to embed it in any pipeline. All MS data have been deposited in the ProteomeXchange with identifier PXD002370 (http://proteomecentral.proteomexchange.org/dataset/PXD002370).
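    A minimal sketch of such a visual check, under the standard assumption that well-calibrated null p-values are uniform on [0, 1]; the simulated p-value sets and grid size are illustrative, not the paper's data or code:

```python
import numpy as np

def calibration_curve(pvalues, n_grid=100):
    """Empirical CDF of p-values on a uniform grid.

    For well-calibrated p-values (uniform under the null), the curve
    should lie close to the diagonal y = x; plotting it against the
    diagonal gives a calibration plot.
    """
    grid = np.linspace(0, 1, n_grid)
    ecdf = np.array([(pvalues <= t).mean() for t in grid])
    return grid, ecdf

rng = np.random.default_rng(0)
# Well-calibrated case: null p-values uniform on [0, 1].
p_good = rng.uniform(0, 1, 5000)
# Badly calibrated case: p-values skewed toward 0 (anti-conservative).
p_bad = rng.beta(0.5, 1.0, 5000)

grid, ecdf_good = calibration_curve(p_good)
_, ecdf_bad = calibration_curve(p_bad)

# Maximum deviation from the diagonal as a rough numerical summary.
dev_good = np.abs(ecdf_good - grid).max()
dev_bad = np.abs(ecdf_bad - grid).max()
print(f"max deviation, uniform p-values: {dev_good:.3f}")
print(f"max deviation, skewed p-values:  {dev_bad:.3f}")
```

    In the anti-conservative case the curve bows well above the diagonal, which is exactly the situation in which an FDR adjustment computed from those p-values would be spuriously optimistic.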

  9. Retrieval of the particle size distribution function from the data of lidar sensing under the assumption of known refractive index.

    Science.gov (United States)

    Samoilova, S V; Sviridenkov, M A; Penner, I E

    2016-10-01

    This paper presents a method to retrieve the particle size distribution function from the data of vertical lidar sensing. We used 462 data models from the Zvenigorod AERONET site obtained in 2011-2012. For each laser shot, we considered both the fine (particle sizes from 0.05 to 0.6 μm) and coarse (from 0.6 to 10 μm) aerosol fractions, with emphasis on the coarse fraction. Our suggested method is a modification of the Tikhonov method. The Tikhonov method is not optimal for coarse particles because its stabilizer does not and cannot account for the presence of the coarse mode, i.e., the existence of more than one maximum of the size distribution function. The components of the matrix W_u^(-1) located in quadrants II and IV are sensitive to changes in these parameters. Neglecting this fact will again lead to arbitrary estimates of the contribution of the coarse particles, even for exact values on the main diagonal and the two diagonals adjacent to it. Our method allows the coarse fraction up to 2.5 μm to be determined unambiguously. For larger particles (>2.5 μm) we recommend using the available sets of the coefficients, but with the level of values to be determined.
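    The inversion problem and the role of the stabilizer can be sketched on a toy discretized kernel. The smoothing kernel, the bimodal "true" distribution, the noise level, and the regularization weight below are all assumed for illustration; this is classical Tikhonov regularization with a smoothness stabilizer (the approach the abstract says is suboptimal for the coarse mode), not the authors' modified method:

```python
import numpy as np

# Toy forward model: a smoothing kernel K maps a size distribution x
# to measurements b (a stand-in for the lidar kernel; illustrative only).
n = 60
s = np.linspace(0, 1, n)
K = np.exp(-((s[:, None] - s[None, :]) ** 2) / 0.002)
K /= K.sum(axis=1, keepdims=True)

# Bimodal "true" distribution: a fine mode plus a coarse mode.
x_true = np.exp(-((s - 0.25) ** 2) / 0.004) + 0.6 * np.exp(-((s - 0.75) ** 2) / 0.01)

rng = np.random.default_rng(1)
b = K @ x_true + 0.001 * rng.standard_normal(n)

# Tikhonov: minimize ||Kx - b||^2 + lam * ||Lx||^2, with L a
# second-difference smoothness stabilizer (the classical choice).
L = np.diff(np.eye(n), 2, axis=0)
lam = 1e-4
x_hat = np.linalg.solve(K.T @ K + lam * L.T @ L, K.T @ b)

# Unregularized inversion for comparison: noise is wildly amplified.
x_naive = np.linalg.solve(K, b)

err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
err_naive = np.linalg.norm(x_naive - x_true) / np.linalg.norm(x_true)
print(f"relative error, Tikhonov:      {err:.3f}")
print(f"relative error, unregularized: {err_naive:.3g}")
```

    Even on this benign toy problem the unregularized solve is useless, while the regularized solution recovers both modes; the abstract's point is that for real lidar kernels the generic smoothness stabilizer must itself be modified so that the secondary (coarse) maximum is not flattened away.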

  10. SIMPLEST DIFFERENTIAL EQUATION OF STOCK PRICE, ITS SOLUTION AND RELATION TO ASSUMPTION OF BLACK-SCHOLES MODEL

    Institute of Scientific and Technical Information of China (English)

    云天铨; 雷光龙

    2003-01-01

    Two kinds of mathematical expressions for stock price can be shown to be completely the same under a certain equivalence relation of coefficients: one, based on a deterministic description, is the solution of the simplest differential equation (S.D.E.), obtained by a method similar to that used in solid mechanics; the other, based on an uncertain (i.e., statistical) description, is the assumption of the Black-Scholes model (A.B-S.M.), in which the density function of the stock price obeys a logarithmic normal distribution. The range of validity of the solution of the S.D.E. has been shown to be suited only to normal cases (no profit, or lost-profit news, etc.) of the stock market, so the same range is suited to the A.B-S.M. as well.
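    The lognormal assumption mentioned above can be checked numerically. Under geometric Brownian motion, the standard process behind the Black-Scholes model (the drift, volatility, and horizon below are arbitrary illustrative values), the closed-form solution S_T = S0·exp((μ − σ²/2)T + σW_T) makes ln(S_T) normal, which a simple Monte Carlo simulation confirms:

```python
import numpy as np

# Geometric Brownian motion: dS = mu * S dt + sigma * S dW.
# Its solution S_T = S0 * exp((mu - sigma^2/2) * T + sigma * W_T)
# implies ln(S_T) is normal, i.e. S_T is lognormally distributed.
S0, mu, sigma, T = 100.0, 0.08, 0.2, 1.0
rng = np.random.default_rng(42)
W_T = rng.standard_normal(200_000) * np.sqrt(T)
S_T = S0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * W_T)

log_mean = np.log(S_T).mean()
log_std = np.log(S_T).std()
theory_mean = np.log(S0) + (mu - 0.5 * sigma**2) * T
theory_std = sigma * np.sqrt(T)
print(f"mean of ln S_T: {log_mean:.4f} (theory {theory_mean:.4f})")
print(f"std  of ln S_T: {log_std:.4f} (theory {theory_std:.4f})")
```

    The sample moments of the log-price match the normal distribution predicted by the closed-form solution to within Monte Carlo error, which is the sense in which the deterministic solution and the statistical assumption describe the same object.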

  11. Safety Training: Basic Safety and Access Courses

    CERN Multimedia

    Antonella Vignes

    2005-01-01

    Objective The purpose of the basic safety courses is to increase awareness for everyone working on the CERN site (CERN staff, associates, outside companies, students and apprentices) of the various existing on-site hazards, and how to recognize and avoid them. Safety course changes The current organization for basic safety courses is changing. There will be two main modifications: the organization of the courses and the implementation of a specific new training course for the LHC machine during the LHC tests and hardware commissioning phase. Organizational changes This concerns the existing basic safety training, currently called level1, level2 and level3. Under the new procedure, a video will be projected in registration building 55 and will run every day at 14.00 and 15.00 in English. The duration of the video will be 50 minutes. The course contents will be the same as the slides currently used, plus a video showing real situations. With this new organization, attendees will systematically follow the...

  12. Safety Training: basic safety and access courses

    CERN Multimedia

    2005-01-01

    Objective The purpose of the basic safety courses is to increase awareness for everyone working on the CERN site (CERN staff, associates, outside companies, students and apprentices) of the various hazards existing on site, and how to recognise and avoid them. Safety course changes The current organisation of basic safety courses is changing. There will be two main modifications: the organisation of the courses and the implementation of a specific new training course for the LHC machine during the LHC tests and hardware commissioning phase. Organisational changes This concerns the existing basic safety training, currently called level 1, level 2 and level 3. Under the new procedure, a video will be projected in registration building 55 and will run every day at 14.00 and 15.00 in English. The duration of the video will be 50 minutes. The course contents will be the same as the slides currently used, plus a video showing real situations. With this new organization, participants will systematically follow...

  13. Sensitivity analysis of incomplete longitudinal data departing from the missing at random assumption: Methodology and application in a clinical trial with drop-outs.

    Science.gov (United States)

    Moreno-Betancur, M; Chavance, M

    2016-08-01

    Statistical analyses of longitudinal data with drop-outs based on direct likelihood, and using all the available data, provide unbiased and fully efficient estimates under some assumptions about the drop-out mechanism. Unfortunately, these assumptions can never be tested from the data. Thus, sensitivity analyses should be routinely performed to assess the robustness of inferences to departures from these assumptions. However, each specific scientific context requires different considerations when setting up such an analysis, no standard method exists and this is still an active area of research. We propose a flexible procedure to perform sensitivity analyses when dealing with continuous outcomes, which are described by a linear mixed model in an initial likelihood analysis. The methodology relies on the pattern-mixture model factorisation of the full data likelihood and was validated in a simulation study. The approach was prompted by a randomised clinical trial for sleep-maintenance insomnia treatment. This case study illustrated the practical value of our approach and underlined the need for sensitivity analyses when analysing data with drop-outs: some of the conclusions from the initial analysis were shown to be reliable, while others were found to be fragile and strongly dependent on modelling assumptions. R code for implementation is provided.
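    One common way to set up such a sensitivity analysis, sketched here in a deliberately simplified form: a single mean outcome, simulated data, and a delta-adjustment pattern-mixture model in which drop-outs are assumed to differ from completers by a shift delta that is swept over a plausible range. This is a generic illustration of the pattern-mixture idea, not the authors' procedure for linear mixed models:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000
y = rng.normal(0.0, 1.0, n)           # outcome that would be observed
observed = rng.uniform(size=n) < 0.7  # roughly 30% drop-out

# Initial analysis: treat drop-out as ignorable and use the
# observed-data mean (the untestable MAR-type assumption).
mar_estimate = y[observed].mean()

# Sensitivity analysis: under the pattern-mixture factorisation, assume
# drop-outs' mean differs from completers' by delta; the overall mean
# is then mean_obs + delta * Pr(missing). Sweep delta to see how far
# the conclusion moves before it would change.
deltas = np.linspace(-1.0, 1.0, 9)
p_missing = 1 - observed.mean()
adjusted = [mar_estimate + d * p_missing for d in deltas]

for d, est in zip(deltas, adjusted):
    print(f"delta = {d:+.2f} -> adjusted mean = {est:+.3f}")
```

    A conclusion that survives the whole delta range would be called robust in the abstract's terms; one that flips sign for modest delta is exactly the kind of "fragile" result the case study uncovered.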

  14. On assumptions in low-altitude investigation of dayside magnetospheric phenomena

    Science.gov (United States)

    Koskinen, H. E. J.

    In the physics of large-scale phenomena in complicated media, such as space plasmas, the chain of reasoning from the fundamental physics to conceptual models is a long and winding road, requiring much physical insight and reliance on various assumptions and approximations. The low-altitude investigation of dayside phenomena provides numerous examples of problems arising from the necessity to make strong assumptions. In this paper we discuss some important assumptions that are either unavoidable or at least widely used. Two examples are the concepts of frozen-in field lines and convection velocity. Instead of asking what violates the frozen-in condition, it is quite legitimate to ask what freezes the plasma and the magnetic field in the first place. Another important complex of problems are the limitations introduced by a two-dimensional approach or linearization of equations. Although modern research is more and more moving toward three-dimensional and time-dependent models, limitations in computing power often make a two-dimensional approach tempting. In a similar way, linearization makes equations analytically tractable. Finally, a very central question is the mapping. In the first approximation, the entire dayside magnetopause maps down to the ionosphere through the dayside cusp region. From the mapping viewpoint, the cusp is one of the most difficult regions and assumptions needed to perform the mapping in practice must be considered with the greatest possible care. We can never avoid assumptions but we must always make them clear to ourselves and also to the readers of our papers.

  15. Direct numerical simulations of temporally developing hydrocarbon shear flames at elevated pressure: effects of the equation of state and the unity Lewis number assumption

    Science.gov (United States)

    Korucu, Ayse; Miller, Richard

    2016-11-01

    Direct numerical simulations (DNS) of temporally developing shear flames are used to investigate both equation of state (EOS) and unity-Lewis (Le) number assumption effects in hydrocarbon flames at elevated pressure. A reduced Kerosene / Air mechanism including a semi-global soot formation/oxidation model is used to study soot formation/oxidation processes in a temporally developing hydrocarbon shear flame operating at both atmospheric and elevated pressures for the cubic Peng-Robinson real fluid EOS. Results are compared to simulations using the ideal gas law (IGL). The results show that while the unity-Le number assumption with the IGL EOS under-predicts the flame temperature for all pressures, with the real fluid EOS it under-predicts the flame temperature for 1 and 35 atm and over-predicts the rest. The soot mass fraction, Ys, is only under-predicted for the 1 atm flame for both the IGL and real fluid EOS models. While Ys is over-predicted for elevated pressures with the IGL EOS, for the real gas EOS the Ys predictions are similar to results using a non-unity Le model derived from non-equilibrium thermodynamics and real diffusivities. Adopting the unity Le assumption is shown to cause misprediction of Ys, the flame temperature, and the mass fractions of CO, H and OH.
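    The difference between the two equations of state is easy to see in isolation. Below is a sketch comparing Peng-Robinson pressure against the ideal gas law at a dilute and a dense state point; the nitrogen critical constants are approximate handbook values and the state points are arbitrary, chosen only to show the deviation growing with density:

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol K)

def peng_robinson_pressure(T, v, Tc, Pc, omega):
    """Pressure from the cubic Peng-Robinson EOS at molar volume v."""
    a = 0.45724 * R**2 * Tc**2 / Pc
    b = 0.07780 * R * Tc / Pc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1 + kappa * (1 - np.sqrt(T / Tc)))**2
    return R * T / (v - b) - a * alpha / (v**2 + 2 * b * v - b**2)

# Nitrogen critical properties (approximate handbook values).
Tc, Pc, omega = 126.2, 3.396e6, 0.0372
T = 300.0  # K

for v in (1e-3, 1e-4):  # dilute, then dense, molar volume in m^3/mol
    P_ig = R * T / v
    P_pr = peng_robinson_pressure(T, v, Tc, Pc, omega)
    print(f"v = {v:.0e} m^3/mol: ideal {P_ig/1e5:8.1f} bar, "
          f"Peng-Robinson {P_pr/1e5:8.1f} bar "
          f"({100 * (P_pr / P_ig - 1):+.1f}% deviation)")
```

    At the dilute state the two laws nearly agree, while at the dense (elevated-pressure) state they differ by several percent, which is why the choice of EOS matters for the high-pressure flames studied above.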

  16. Predicting Grades in Basic Algebra.

    Science.gov (United States)

    Newman, Elise

    1994-01-01

    Data from (n=470) students at Owens Technical College in Fall 1991 showed that high school GPA was the best predictor of grades in Basic Algebra, followed by high school rank, college GPA, ACT natural sciences, ASSET numerical skills, and ASSET elementary algebra scores. (11 references) (SW)

  17. Dental Health: The Basic Facts

    Science.gov (United States)

    ... difficult to manage. The basic fact is that healthy teeth and gums are essential for: preventing infections, which may cause MS symptoms to increase; ... a person clenches his or her jaws or "grinds" teeth, usually during the night; ... and periodontitis are infections, each of which can be made worse by ...

  18. Basic DTU Wind Energy controller

    DEFF Research Database (Denmark)

    Hansen, Morten Hartvig; Henriksen, Lars Christian

    This report contains a description and documentation, including source code, of the basic DTU Wind Energy controller applicable for pitch-regulated, variable speed wind turbines. The controller features both partial and full load operation capabilities as well as switching mechanisms ensuring...

  19. Unions: Bread, Butter & Basic Skills.

    Science.gov (United States)

    BCEL Newsletter for the Business Community, 1987

    1987-01-01

    Unions are natural providers of basic skills instruction. They are in daily workplace contact with their membership, are trusted to work on members' behalf, and speak the language of the worker. Unions are trying to address the needs of illiterate workers through collective bargaining arrangements in which employers contribute a percentage of…

  20. Basic HIV/AIDS Statistics

    Science.gov (United States)

    ... Statistics Center. How many people are diagnosed with HIV each year in the United States? In 2015, ...

  1. Guarani Basic Course, Part II.

    Science.gov (United States)

    Blair, Robert W.; And Others

    This volume of the basic course in Guarani (the indigenous language of Paraguay) contains the core stage, or class-instructional phase, of the ten units presented in Volume One. These units contain explanations, exercises, dialogues, various types of pattern drills, suggestions for games and communication activities, and various types of…

  2. Guarani Basic Course, Part I.

    Science.gov (United States)

    Blair, Robert W.; And Others

    This is the first in a two-volume basic course in Guarani, the indigenous language of Paraguay. The volume consists of an introduction to the Guarani language, some general principles for adult language-learning, and ten instructional units. Because the goal of the course is to encourage and lead the learner to communicate in Guarani in class and…

  3. The Measurement of Basic Stuff.

    Science.gov (United States)

    Disch, James G., Ed.; And Others

    1983-01-01

    Seven articles contain information about measurement and evaluation in physical education and sport and complement the "Basic Stuff" series. They focus on (1) student self-assessment for exercise physiology; (2) monitoring motor development; (3) biomechanical analysis; and (4) measurements of aesthetic qualities, psychosocial…

  4. Emergency medicine: beyond the basics.

    Science.gov (United States)

    Malamed, S F

    1997-07-01

    Medical emergencies can arise in the dental office. Preparedness for these emergencies is predicated on an ability to rapidly recognize a problem and to effectively institute prompt and proper management. In all emergency situations, management is based on implementation of basic life support, as needed. The author describes the appropriate management of two common emergency situations: allergy and chest pain.

  5. Basic research in kidney cancer

    NARCIS (Netherlands)

    Oosterwijk, E.; Rathmell, W.K.; Junker, K.; Brannon, A.R.; Pouliot, F.; Finley, D.S.; Mulders, P.F.A.; Kirkali, Z.; Uemura, H.; Belldegrun, A.

    2011-01-01

    CONTEXT: Advances in basic research will enhance prognosis, diagnosis, and treatment of renal cancer patients. OBJECTIVE: To discuss advances in our understanding of the molecular basis of renal cancer, targeted therapies, renal cancer and immunity, and genetic factors and renal cell carcinoma (RCC)

  6. Thermionics basic principles of electronics

    CERN Document Server

    Jenkins, J; Ashhurst, W

    2013-01-01

    Basic Principles of Electronics, Volume I: Thermionics serves as a textbook for students in physics. It focuses on thermionic devices. The book covers topics on electron dynamics, electron emission, and the thermionic vacuum diode and triode. Power amplifiers, oscillators, and electronic measuring equipment are studied as well. The text will be of great use to physics and electronics students, and inventors.

  7. Women in Adult Basic Education

    Science.gov (United States)

    Park, Rosemarie J.

    1977-01-01

    A survey of adult basic education (ABE) program directors in five states revealed that most ABE teachers are women and work part-time without benefits while most ABE administrators are men who are employed full-time. Concludes that women employed in ABE are victims of discrimination. (EM)

  8. Welding. Performance Objectives. Basic Course.

    Science.gov (United States)

    Vincent, Kenneth

    Several intermediate performance objectives and corresponding criterion measures are listed for each of eight terminal objectives for a basic welding course. The materials were developed for a 36-week (2 hours daily) course designed to teach the fundamentals of welding shop work, to become familiar with the operation of the welding shop…

  9. Basic Income on the Agenda

    NARCIS (Netherlands)

    Groot, Loek; Veen, van der Robert-Jan

    2000-01-01

    Persisting unemployment, poverty and social exclusion, labour market flexibility, job insecurity and higher wage inequality, changing patterns of work and family life are among the factors that exert pressure on welfare states in Europe. This book explores the potential of an unconditional basic income.

  10. Basic types of plant layout

    OpenAIRE

    Salas Bacalla, Julio; Docente FII-UNMSM

    2014-01-01

    The basic types of plant layout are presented, considering the criteria to be taken into account in each of the formats.

  11. Common-sense chemistry: The use of assumptions and heuristics in problem solving

    Science.gov (United States)

    Maeyer, Jenine Rachel

    Students experience difficulty learning and understanding chemistry at higher levels, often because of cognitive biases stemming from common sense reasoning constraints. These constraints can be divided into two categories: assumptions (beliefs held about the world around us) and heuristics (the reasoning strategies or rules used to build predictions and make decisions). A better understanding and characterization of these constraints are of central importance in the development of curriculum and teaching strategies that better support student learning in science. It was the overall goal of this thesis to investigate student reasoning in chemistry, specifically to better understand and characterize the assumptions and heuristics used by undergraduate chemistry students. To achieve this, two mixed-methods studies were conducted, each with quantitative data collected using a questionnaire and qualitative data gathered through semi-structured interviews. The first project investigated the reasoning heuristics used when ranking chemical substances based on the relative value of a physical or chemical property, while the second study characterized the assumptions and heuristics used when making predictions about the relative likelihood of different types of chemical processes. Our results revealed that heuristics for cue selection and decision-making played a significant role in the construction of answers during the interviews. Many study participants relied frequently on one or more of the following heuristics to make their decisions: recognition, representativeness, one-reason decision-making, and arbitrary trend. These heuristics allowed students to generate answers in the absence of requisite knowledge, but often led students astray. When characterizing assumptions, our results indicate that students relied on intuitive, spurious, and valid assumptions about the nature of chemical substances and processes in building their responses. In particular, many

  12. Analysis of Modeling Assumptions used in Production Cost Models for Renewable Integration Studies

    Energy Technology Data Exchange (ETDEWEB)

    Stoll, Brady [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Townsend, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bloom, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-01-01

    Renewable energy integration studies have been published for many different regions exploring the question of how higher penetration of renewable energy will impact the electric grid. These studies each make assumptions about the systems they are analyzing; however, the effect of many of these assumptions has not yet been examined and published. In this paper we analyze the impact of modeling assumptions in renewable integration studies, including the optimization method used (linear or mixed-integer programming) and the temporal resolution of the dispatch stage (hourly or sub-hourly). We analyze each of these assumptions on a large and a small system and determine the impact of each assumption on key metrics including the total production cost, curtailment of renewables, CO2 emissions, and generator starts and ramps. Additionally, we identified the impact on these metrics if a four-hour ahead commitment step is included before the dispatch step and the impact of retiring generators to reduce the degree to which the system is overbuilt. We find that the largest effect of these assumptions is at the unit level on starts and ramps, particularly for the temporal resolution, and saw a smaller impact at the aggregate level on system costs and emissions. For each fossil fuel generator type we measured the average capacity started, average run-time per start, and average number of ramps. Linear programming results saw up to a 20% difference in number of starts and average run time of traditional generators, and up to a 4% difference in the number of ramps, when compared to mixed-integer programming. Utilizing hourly dispatch instead of sub-hourly dispatch saw no difference in coal or gas CC units for either start metric, while gas CT units had a 5% increase in the number of starts and 2% increase in the average on-time per start. The number of ramps decreased up to 44%. The smallest effect seen was on the CO2 emissions and total production cost, with a 0.8% and 0
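    The LP-versus-MIP distinction discussed above can be shown on a deliberately tiny commitment problem. All numbers below are invented, and brute-force enumeration stands in for the LP and MIP solvers; the point is only that relaxing the binary commitment variable changes the apparent cost and the unit's start behavior:

```python
from itertools import product

# Toy 1-generator, 4-period commitment problem: a cheap unit with a
# minimum stable level and a start-up cost, plus expensive backup power.
demand = [30.0, 80.0, 20.0, 70.0]    # MW per period
pmin, pmax = 40.0, 100.0             # cheap unit's limits when committed
c_cheap, c_backup = 10.0, 50.0       # $/MWh
c_start = 400.0                      # $ per start of the cheap unit

def dispatch_cost(u):
    """Total cost for a commitment profile u (each u[t] in [0, 1]).

    Fractional u scales the unit's limits, mimicking an LP relaxation;
    u restricted to {0, 1} gives the exact mixed-integer problem.
    """
    cost, prev = 0.0, 0.0
    for t, d in enumerate(demand):
        lo, hi = pmin * u[t], pmax * u[t]
        g = min(max(d, lo), hi)                   # cheapest feasible output
        backup = max(d - g, 0.0)                  # remainder from backup
        cost += c_cheap * g + c_backup * backup
        cost += c_start * max(u[t] - prev, 0.0)   # (fractional) start-up
        prev = u[t]
    return cost

# "MIP": exact optimum by enumerating binary commitments.
mip_cost = min(dispatch_cost(u) for u in product([0.0, 1.0], repeat=4))
# "LP": relaxed commitment on a fractional grid (a crude stand-in
# for solving the linear-programming relaxation).
grid = [i / 10 for i in range(11)]
lp_cost = min(dispatch_cost(u) for u in product(grid, repeat=4))

print(f"mixed-integer cost:     {mip_cost:.0f}")
print(f"relaxed (LP-like) cost: {lp_cost:.0f}")
```

    The relaxed problem "commits" the unit fractionally to dodge the minimum stable level and to pay only partial start-up costs, so it reports a lower cost and different start behavior than the integer problem, which is the unit-level effect the study measures at scale.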

  13. Unpacking assumptions about inclusion in community-based health promotion: perspectives of women living in poverty.

    Science.gov (United States)

    Ponic, Pamela; Frisby, Wendy

    2010-11-01

    Community-based health promoters often aim to facilitate "inclusion" when working with marginalized women to address their exclusion and related health issues. Yet the notion of inclusion has not been critically interrogated within this field, resulting in the perpetuation of assumptions that oversimplify it. We provide qualitative evidence on inclusion as a health-promotion strategy from the perspectives of women living in poverty. We collected data with women engaged in a 6-year community-based health promotion and feminist participatory action research project. Participants' experiences illustrated that inclusion was a multidimensional process that involved a dynamic interplay between structural determinants and individual agency. The women named multiple elements of inclusion across psychosocial, relational, organizational, and participatory dimensions. This knowledge interrupts assumptions that inclusion is achievable and desirable for so-called recipients of such initiatives. We thus call for critical consideration of the complexities, limitations, and possibilities of facilitating inclusion as a health-promotion strategy.

  14. THE HISTORY OF BUILDING THE NORTHERN FRATERNAL CELLS OF VIRGIN MARY ASSUMPTION MONASTERY IN TIKHVIN

    Directory of Open Access Journals (Sweden)

    Tatiana Nikolaevna PYATNITSKAYA

    2014-01-01

    Full Text Available The article is focused on the formation of one of the fraternal houses of the Virgin Mary Assumption Monastery in Tikhvin (Leningrad region), the volume-spatial composition of which was developed during the second half of the 17th century. It describes the history of the complex's origin around the Assumption Cathedral of the 16th century and the location of the cell housing in the wooden and stone ensembles. By comparing archival documents with data obtained from field studies, the initial planning and design features of the northern fraternal cells were identified. The research identified the brigades of Tikhvin masons who worked on the construction of the building in 1680-1690. Fragments of the original architectural decorations and facade colors were found. The research also produced graphic reconstructions, giving an idea not only of the original appearance of the building, but also of the history of its changes.

  15. Innovation or 'Inventions'? The conflict between latent assumptions in marine aquaculture and local fishery.

    Science.gov (United States)

    Martínez-Novo, Rodrigo; Lizcano, Emmánuel; Herrera-Racionero, Paloma; Miret-Pastor, Lluís

    2016-06-01

    Recent European policy highlights the need to promote local fishery and aquaculture by means of innovation and joint participation in fishery management as one of the keys to achieve the sustainability of our seas. However, the implicit assumptions held by the actors in the two main groups involved - innovators (scientists, businessmen and administration managers) and local fishermen - can complicate, perhaps even render impossible, mutual understanding and co-operation. A qualitative analysis of interviews with members of both groups in the Valencian Community (Spain) reveals those latent assumptions and their impact on the respective practices. The analysis shows that the innovation narrative in which one group is based and the inventions narrative used by the other one are rooted in two dramatically different, or even antagonistic, collective worldviews. Any environmental policy that implies these groups should take into account these strong discords.

  16. Error in the description of foot kinematics due to violation of rigid body assumptions.

    Science.gov (United States)

    Nester, C J; Liu, A M; Ward, E; Howard, D; Cocheba, J; Derrick, T

    2010-03-03

    Kinematic data from rigid segment foot models inevitably include errors because the bones within each segment move relative to each other. This study sought to define the error in foot kinematic data due to violation of the rigid segment assumption. The research compared kinematic data from 17 different mid- and forefoot rigid segment models to kinematic data for the individual bones comprising these segments. Kinematic data from a previous dynamic cadaver model study were used to derive individual bone as well as foot segment kinematics. Mean and maximum errors due to violation of the rigid body assumption varied greatly between models. The model with the least error was the combination of navicular and cuboid (mean errors …); the appropriate choice of model depends on the kinematics research study being undertaken.

  17. IRT models with relaxed assumptions in eRm: A manual-like instruction

    Directory of Open Access Journals (Sweden)

    REINHOLD HATZINGER

    2009-03-01

    Full Text Available Linear logistic models with relaxed assumptions (LLRA) as introduced by Fischer (1974) are a flexible tool for the measurement of change for dichotomous or polytomous responses. As opposed to the Rasch model, assumptions on dimensionality of items, their mutual dependencies and the distribution of the latent trait in the population of subjects are relaxed. Conditional maximum likelihood estimation allows for inference about treatment, covariate or trend effect parameters without taking the subjects' latent trait values into account. In this paper we will show how LLRAs based on the LLTM, LRSM and LPCM can be used to answer various questions about the measurement of change and how they can be fitted in R using the eRm package. A number of small didactic examples are provided that can easily be used as templates for real data sets. All data files used in this paper are available from http://eRm.R-Forge.R-project.org/

  18. Risk Pooling, Commitment and Information: An experimental test of two fundamental assumptions

    OpenAIRE

    Abigail Barr

    2003-01-01

    This paper presents rigorous and direct tests of two assumptions relating to limited commitment and asymmetric information that currently underpin models of risk pooling. A specially designed economic experiment involving 678 subjects across 23 Zimbabwean villages is used to solve the problems of observability and quantification that have frustrated previous attempts to conduct such tests. I find that more extrinsic commitment is associated with more risk pooling, but that more informat...

  19. Logic Assumptions and Risks Framework Applied to Defence Campaign Planning and Evaluation

    Science.gov (United States)

    2013-05-01

    Checkland, P. and J. Poulter (2006). Learning for Action: A Definitive Account of Soft Systems Methodology and its use for Practitioners, Teachers and... Systems Methodology alerts us to as differing 'world views'. These are contrasted with assumptions about the causal linkages about the implementation... the problem and of the population, and the boundary, or limiting conditions, of the effects of the program - what Checkland and Poulter's (2006) Soft

  20. Assumptions in quantitative analyses of health risks of overhead power lines

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, A.; Wardekker, J.A.; Van der Sluijs, J.P. [Department of Science, Technology and Society, Copernicus Institute, Utrecht University, Budapestlaan 6, 3584 CD Utrecht (Netherlands)

    2012-02-15

    One of the major issues hampering the formulation of uncontested policy decisions on contemporary risks is the presence of uncertainties at various stages of the policy cycle. In the literature, different approaches have been suggested to address the problem of provisional and uncertain evidence. Reflective approaches such as pedigree analysis can be used to explore the quality of evidence when quantification of uncertainties is at stake. One issue where the quality of evidence impedes policy making is the case of electromagnetic fields (EMF): a statistical association has been suggested between an increased risk of childhood leukaemia and residence in the vicinity of overhead power lines. However, a biophysical mechanism that could support this association has not been found to date. The Dutch government bases its policy concerning overhead power lines on the precautionary principle. For the Netherlands, previous studies have assessed the potential number of extra cases of childhood leukaemia due to the presence of overhead power lines. However, such a quantification of the health risk of EMF entails a (large) number of assumptions, both prior to and within the calculation chain. In this study, these assumptions were prioritized and critically appraised in an expert elicitation workshop, using a pedigree matrix for the characterization of assumptions in assessments. It appeared that the assumptions regarded as important in quantifying the health risks show high value-ladenness. The results show that, given the present state of knowledge, quantification of the health risks of EMF is premature. We consider the current implementation of the precautionary principle by the Dutch government to be adequate.

  1. RateMyProfessors.com: Testing Assumptions about Student Use and Misuse

    Science.gov (United States)

    Bleske-Rechek, April; Michels, Kelsey

    2010-01-01

    Since its inception in 1999, the RateMyProfessors.com (RMP.com) website has grown in popularity and, with that, notoriety. In this research we tested three assumptions about the website: (1) Students use RMP.com to either rant or rave; (2) Students who post on RMP.com are different from students who do not post; and (3) Students reward easiness by…

  2. Camera traps and mark-resight models: The value of ancillary data for evaluating assumptions

    Science.gov (United States)

    Parsons, Arielle W.; Simons, Theodore R.; Pollock, Kenneth H.; Stoskopf, Michael K.; Stocking, Jessica J.; O'Connell, Allan F.

    2015-01-01

    Unbiased estimators of abundance and density are fundamental to the study of animal ecology and critical for making sound management decisions. Capture–recapture models are generally considered the most robust approach for estimating these parameters but rely on a number of assumptions that are often violated but rarely validated. Mark-resight models, a form of capture–recapture, are well suited for use with noninvasive sampling methods and allow a number of assumptions to be relaxed. We used ancillary data from continuous video and radio telemetry to evaluate the assumptions of mark-resight models for abundance estimation of a barrier island raccoon (Procyon lotor) population using camera traps. Our island study site was geographically closed, allowing us to estimate real survival and in situ recruitment in addition to population size. We found several sources of bias due to heterogeneity of capture probabilities in our study, including camera placement, animal movement, island physiography, and animal behavior. Almost all sources of heterogeneity could be accounted for using the sophisticated mark-resight models developed by McClintock et al. (2009b), and this model generated estimates similar to those of a spatially explicit mark-resight model previously developed for this population during our study. Spatially explicit capture–recapture models have become an important tool in ecology and confer a number of advantages; however, non-spatial models that account for inherent individual heterogeneity may perform nearly as well, especially where immigration and emigration are limited. Non-spatial models are computationally less demanding, do not make implicit assumptions related to the isotropy of home ranges, and can provide insights with respect to the biological traits of the local population.
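
    The mark-resight idea can be illustrated with the simplest closed-population estimator, Chapman's bias-corrected form of the Lincoln-Petersen estimator. This is a didactic sketch only, not the McClintock et al. (2009b) model used in the study, and the numbers are invented.

    ```python
    def chapman_estimate(n_marked, n_sighted, n_marked_resighted):
        """Chapman's bias-corrected Lincoln-Petersen estimate of population size.

        n_marked: animals marked (e.g. radio-collared) before the survey
        n_sighted: total detections during resighting (e.g. camera-trap photos)
        n_marked_resighted: how many of those detections were marked animals
        """
        return (n_marked + 1) * (n_sighted + 1) / (n_marked_resighted + 1) - 1

    # Hypothetical numbers: 50 marked raccoons, 40 camera detections,
    # 20 of which were of marked animals.
    estimate = chapman_estimate(50, 40, 20)
    ```

    The estimator assumes, among other things, equal detection probability for all individuals; the heterogeneity sources listed in the abstract (camera placement, movement, behavior) are exactly the violations that motivate the more sophisticated models.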

  3. Basic concepts in computational physics

    CERN Document Server

    Stickler, Benjamin A

    2016-01-01

    This new edition is a concise introduction to the basic methods of computational physics. Readers will discover the benefits of numerical methods for solving complex mathematical problems and for the direct simulation of physical processes. The book is divided into two main parts: Deterministic methods and stochastic methods in computational physics. Based on concrete problems, the first part discusses numerical differentiation and integration, as well as the treatment of ordinary differential equations. This is extended by a brief introduction to the numerics of partial differential equations. The second part deals with the generation of random numbers, summarizes the basics of stochastics, and subsequently introduces Monte-Carlo (MC) methods. Specific emphasis is on Markov chain MC algorithms. The final two chapters discuss data analysis and stochastic optimization. All this is again motivated and augmented by applications from physics. In addition, the book offers a number of appendices to provide the read...
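
    The numerical differentiation and integration methods mentioned in the description can be sketched in a few lines; these are the textbook central-difference and composite-trapezoid formulas, not code from the book itself.

    ```python
    import math

    def central_difference(f, x, h=1e-5):
        # O(h^2) approximation of the derivative f'(x)
        return (f(x + h) - f(x - h)) / (2 * h)

    def trapezoid(f, a, b, n=1000):
        # Composite trapezoidal rule for the integral of f over [a, b]
        h = (b - a) / n
        s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
        return s * h

    # Example: d/dx sin(x) at 0 is cos(0) = 1; integral of sin over [0, pi] is 2.
    d = central_difference(math.sin, 0.0)
    area = trapezoid(math.sin, 0.0, math.pi)
    ```

    Both methods converge at second order in the step size, which is easy to verify empirically by halving `h` (or doubling `n`) and watching the error shrink by a factor of four.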

  4. Health insurance basic actuarial models

    CERN Document Server

    Pitacco, Ermanno

    2014-01-01

    Health Insurance aims at filling a gap in the actuarial literature, attempting to resolve the frequent misunderstanding regarding both the purpose and the contents of health insurance products (and ‘protection products’, more generally) on the one hand, and the relevant actuarial structures on the other. In order to cover the basic principles of health insurance techniques, the first few chapters in this book are mainly devoted to the need for health insurance and a description of insurance products in this area (sickness insurance, accident insurance, critical illness covers, income protection, long-term care insurance, health-related benefits as riders to life insurance policies). An introduction to general actuarial and risk-management issues follows. Basic actuarial models are presented for sickness insurance and income protection (i.e. disability annuities). Several numerical examples help the reader understand the main features of pricing and reserving in the health insurance area. A short int...

  5. Magnetic resonance imaging the basics

    CERN Document Server

    Constantinides, Christakis

    2014-01-01

    Magnetic resonance imaging (MRI) is a rapidly developing field in basic applied science and clinical practice. Research efforts in this area have already been recognized with five Nobel prizes awarded to seven Nobel laureates in the past 70 years. Based on courses taught at The Johns Hopkins University, Magnetic Resonance Imaging: The Basics provides a solid introduction to this powerful technology. The book begins with a general description of the phenomenon of magnetic resonance and a brief summary of Fourier transformations in two dimensions. It examines the fundamental principles of physics for nuclear magnetic resonance (NMR) signal formation and image construction and provides a detailed explanation of the mathematical formulation of MRI. Numerous image quantitative indices are discussed, including (among others) signal, noise, signal-to-noise, contrast, and resolution. The second part of the book examines the hardware and electronics of an MRI scanner and the typical measurements and simulations of m...

  6. Nuclear medicine physics the basics

    CERN Document Server

    Chandra, Ramesh

    2012-01-01

    For decades this classic reference has been the book to review to master the complexities of nuclear-medicine physics. Part of the renowned The Basics series of medical physics books, Nuclear Medicine Physics has become an essential resource for radiology residents and practitioners, nuclear cardiologists, medical physicists, and radiologic technologists. This thoroughly revised Seventh Edition retains all the features that have made The Basics series a reliable and trusted partner for board review and reference. This handy manual contains key points at the end of each chapter that help to underscore principal concepts. You'll also find review questions at the end of each chapter—with detailed answers at the end of the book—to help you master the material. This edition includes useful appendices that elaborate on specific topics, such as physical characteristics of radionuclides and CGS and SI Units.

  7. Basics of modern mathematical statistics

    CERN Document Server

    Spokoiny, Vladimir

    2015-01-01

    This textbook provides a unified and self-contained presentation of the main approaches to and ideas of mathematical statistics. It collects the basic mathematical ideas and tools needed as a basis for more serious studies or even independent research in statistics. The majority of existing textbooks in mathematical statistics follow the classical asymptotic framework. Yet, as modern statistics has changed rapidly in recent years, new methods and approaches have appeared. The emphasis is on finite sample behavior, large parameter dimensions, and model misspecifications. The present book provides a fully self-contained introduction to the world of modern mathematical statistics, collecting the basic knowledge, concepts and findings needed for doing further research in the modern theoretical and applied statistics. This textbook is primarily intended for graduate and postdoc students and young researchers who are interested in modern statistical methods.

  8. Positron emission tomography basic sciences

    CERN Document Server

    Townsend, D W; Valk, P E; Maisey, M N

    2003-01-01

    Essential for students, science and medical graduates who want to understand the basic science of Positron Emission Tomography (PET), this book describes the physics, chemistry, technology and overview of the clinical uses behind the science of PET and the imaging techniques it uses. In recent years, PET has moved from high-end research imaging tool used by the highly specialized to an essential component of clinical evaluation in the clinic, especially in cancer management. Previously being the realm of scientists, this book explains PET instrumentation, radiochemistry, PET data acquisition and image formation, integration of structural and functional images, radiation dosimetry and protection, and applications in dedicated areas such as drug development, oncology, and gene expression imaging. The technologist, the science, engineering or chemistry graduate seeking further detailed information about PET, or the medical advanced trainee wishing to gain insight into the basic science of PET will find this book...

  9. Basic statistics in cell biology.

    Science.gov (United States)

    Vaux, David L

    2014-01-01

    The physicist Ernest Rutherford said, "If your experiment needs statistics, you ought to have done a better experiment." Although this aphorism remains true for much of today's research in cell biology, a basic understanding of statistics can be useful to cell biologists to help in monitoring the conduct of their experiments, in interpreting the results, in presenting them in publications, and when critically evaluating research by others. However, training in statistics is often focused on the sophisticated needs of clinical researchers, psychologists, and epidemiologists, whose conclusions depend wholly on statistics, rather than the practical needs of cell biologists, whose experiments often provide evidence that is not statistical in nature. This review describes some of the basic statistical principles that may be of use to experimental biologists, but it does not cover the sophisticated statistics needed for papers that contain evidence of no other kind.
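
    As a minimal illustration of the kind of basic statistic the review has in mind, the following sketch computes the sample standard deviation (the spread of the data) and the standard error of the mean (the precision of the estimated mean), two quantities that are often conflated in figure error bars. The data values are invented.

    ```python
    import math

    # Hypothetical replicate measurements from one experiment.
    data = [1.0, 2.0, 3.0, 4.0, 5.0]
    n = len(data)

    mean = sum(data) / n
    # Sample standard deviation (n - 1 in the denominator): spread of the data.
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    # Standard error of the mean: how precisely the mean is estimated;
    # shrinks with sqrt(n), unlike the standard deviation.
    sem = sd / math.sqrt(n)
    ```

    Reporting SEM makes error bars look smaller, but only SD describes the variability of the underlying measurements.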

  10. Generalized Stieltjes transforms: basic aspects

    CERN Document Server

    Karp, Dmitry

    2011-01-01

    The paper surveys the basic properties of generalized Stieltjes functions including some new ones. We introduce the notion of exact Stieltjes order and give a criterion of exactness, simple sufficient conditions and some prototypical examples. The paper includes an appendix, where we define the left sided Riemann-Liouville and the right sided Kober-Erdelyi fractional integrals of measures supported on half axis and give inversion formulas for them.
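
    For reference, a commonly used definition of the generalized Stieltjes transform of order $\lambda$ of a measure $\mu$ supported on the half axis is the following; the paper's exact normalization may differ.

    ```latex
    S_{\lambda}[\mu](z) = \int_0^{\infty} \frac{d\mu(t)}{(z+t)^{\lambda}},
    \qquad \lambda > 0, \quad z \in \mathbb{C} \setminus (-\infty, 0].
    ```

    The classical Stieltjes transform is the case $\lambda = 1$; the "Stieltjes order" mentioned in the abstract concerns the smallest $\lambda$ for which a given function admits such a representation.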

  11. Basic emotions - self-awareness

    OpenAIRE

    Correia, Ana Almeida; Veiga-Branco, Augusta

    2011-01-01

    We start from basic emotions using Paul Ekman’s model (1999): joy, sadness, anger, surprise, disgust, fear and contempt, to study the concepts of Self-Awareness - Knowing our own emotions - (Goleman, 1995), and Emotional Awareness - Ability to become aware of one's own emotions - (Bisquerra, 2001). Objectives: To understand the levels of Emotional Self-awareness/Emotional awareness of a group of preschool, primary and lower secondary school teachers through the identifi...

  12. Basic statistics for social research

    CERN Document Server

    Hanneman, Robert A; Riddle, Mark D

    2012-01-01

    A core statistics text that emphasizes logical inquiry, not math. Basic Statistics for Social Research teaches core general statistical concepts and methods that all social science majors must master to understand (and do) social research. Its use of mathematics and theory is deliberately limited, as the authors focus on the use of concepts and tools of statistics in the analysis of social science data, rather than on the mathematical and computational aspects. Research questions and applications are taken from a wide variety of subfields in sociology, and each chapter is organized arou...

  13. HMPT: Basic Radioactive Material Transportation

    Energy Technology Data Exchange (ETDEWEB)

    Hypes, Philip A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-29

    Hazardous Materials and Packaging and Transportation (HMPT): Basic Radioactive Material Transportation Live (#30462, suggested one time) and Test (#30463, required initially and every 36 months) address the Department of Transportation’s (DOT’s) function-specific [required for hazardous material (HAZMAT) handlers, packagers, and shippers] training requirements of the HMPT Los Alamos National Laboratory (LANL) Labwide training. This course meets the requirements of 49 CFR 172, Subpart H, Section 172.704(a)(ii), Function-Specific Training.

  14. Basic principles of electronics thermionics

    CERN Document Server

    Jenkins, J

    2013-01-01

    Basic Principles of Electronics, Volume I: Thermionics covers topics related to thermionic devices. The book starts by providing a physical background about electronics, including structure of matter, ionic, chemical and covalent combination, crystalline structure, conductors and insulators, and thermionic emission. The text then discusses electron dynamics; the characteristics and properties of electrons in solids; electron emission; and thermionic emission in a vacuum diode or triode. The development of the vacuum triode; gas-filled valves; and power amplifiers are also considered. The book

  15. Basic Studies in Plasma Physics

    Science.gov (United States)

    2013-01-31

    close to a Maxwellian parametrized by a temperature T and mean velocity u which satisfy certain non-linear equations, which are the macroscopic equations... Simulations with Particle-to-Grid Methods... E. Microscopic-Shock Profiles: Exact Solution of a Non-Equilibrium System... IV. List of Publications... Investigator ABSTRACT An improved understanding of equilibrium and non-equilibrium properties of plasmas is central to many areas of basic science as

  16. RF Basics I and II

    CERN Document Server

    Gerigk, Frank

    2013-01-01

    Maxwell's equations are introduced in their general form, together with a basic set of mathematical operations needed to work with them. After simplifying and adapting the equations for application to radio frequency problems, we derive the most important formulae and characteristic quantities for cavities and waveguides. Several practical examples are given to demonstrate the use of the derived equations and to explain the importance of the most common figures of merit.
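
    The general form of Maxwell's equations referred to in the abstract is, in differential (macroscopic) form:

    ```latex
    \nabla \cdot \mathbf{D} = \rho, \qquad
    \nabla \cdot \mathbf{B} = 0, \qquad
    \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
    \nabla \times \mathbf{H} = \mathbf{J} + \frac{\partial \mathbf{D}}{\partial t}.
    ```

    For RF cavity and waveguide problems these are typically specialized to source-free, time-harmonic fields, from which the wave equation and the characteristic cavity quantities follow.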

  17. Bayesian Mass Estimates of the Milky Way II: The dark and light sides of parameter assumptions

    CERN Document Server

    Eadie, Gwendolyn M

    2016-01-01

    We present mass and mass profile estimates for the Milky Way Galaxy using the Bayesian analysis developed by Eadie et al. (2015b) and using globular clusters (GCs) as tracers of the Galactic potential. The dark matter and GCs are assumed to follow different spatial distributions; we assume power-law model profiles and use the model distribution functions described in Evans et al. (1997) and Deason et al. (2011, 2012a). We explore the relationships between assumptions about model parameters and how these assumptions affect mass profile estimates. We also explore how using subsamples of the GC population beyond certain radii affects mass estimates. After exploring the posterior distributions of different parameter assumption scenarios, we conclude that a conservative estimate of the Galaxy's mass within 125 kpc is $5.22\times10^{11} M_{\odot}$, with a $50\%$ probability region of $(4.79, 5.63)\times10^{11} M_{\odot}$. Extrapolating out to the virial radius, we obtain a virial mass for the Milky Way of $6.82\times10^{...

  18. Fluid-Structure Interaction Modeling of Intracranial Aneurysm Hemodynamics: Effects of Different Assumptions

    Science.gov (United States)

    Rajabzadeh Oghaz, Hamidreza; Damiano, Robert; Meng, Hui

    2015-11-01

    Intracranial aneurysms (IAs) are pathological outpouchings of cerebral vessels whose progression is mediated by complex interactions between the blood flow and the vasculature. Image-based computational fluid dynamics (CFD) has been used for decades to investigate IA hemodynamics. However, the commonly adopted simplifying assumptions in CFD (e.g. a rigid wall) compromise the simulation accuracy and mask the complex physics involved in IA progression and eventual rupture. Several groups have accounted for wall compliance by using fluid-structure interaction (FSI) modeling. However, FSI simulation is highly sensitive to numerical assumptions (e.g. linear-elastic wall material, Newtonian fluid, initial vessel configuration, and constant outlet pressure), the effects of which are poorly understood. In this study, the sensitivity of FSI simulations of patient-specific IAs is comprehensively investigated using a multi-stage approach with a varying level of complexity. We start with simulations incorporating several common simplifications: rigid wall, Newtonian fluid, and constant pressure at the outlets, and then we stepwise remove these simplifications until we reach the most comprehensive FSI simulations. Hemodynamic parameters such as wall shear stress and the oscillatory shear index are assessed and compared at each stage to better understand the sensitivity of FSI simulations of IAs to model assumptions. Supported by the National Institutes of Health (1R01 NS 091075-01).
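
    The hemodynamic indices named in the abstract have standard definitions, given here for reference (the paper's exact conventions may differ), where $\vec{\tau}_w$ is the instantaneous wall shear stress vector and $T$ the cardiac period:

    ```latex
    \mathrm{TAWSS} = \frac{1}{T} \int_0^T \left| \vec{\tau}_w \right| \, dt,
    \qquad
    \mathrm{OSI} = \frac{1}{2} \left( 1 -
      \frac{\left| \int_0^T \vec{\tau}_w \, dt \right|}
           {\int_0^T \left| \vec{\tau}_w \right| \, dt} \right).
    ```

    OSI ranges from 0 (unidirectional shear) to 0.5 (fully oscillatory shear with zero time-averaged direction), which is why it is a natural companion to wall shear stress when comparing rigid-wall and compliant-wall simulations.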

  19. The Universality of Intuition: An A Posteriori Critique of an A Priori Assumption

    Directory of Open Access Journals (Sweden)

    Roohollah Haghshenas

    2015-03-01

    Full Text Available Intuition has a central role in philosophy: the role of arbitrating between different opinions. When a philosopher shows that "intuition" supports his view, he takes this as a good reason in its favor. In contrast, if we show some contradiction between intuition and a theory or some of its implications, we think a replacement, or at least some revision, is needed. There are well-known examples of this role for intuition in many fields of philosophy: the transplant case in ethics, the Chinese nation case in the philosophy of mind, and the Gettier examples in epistemology. But there is an assumption here: we suppose all people think in the same manner, i.e., we think intuition is universal. Experimental philosophy tries to study this assumption experimentally. This project continues Quine's movement toward the "pursuit of truth" from a naturalistic point of view, making epistemology "a branch of natural science." The work of experimental philosophy shows that in many cases people with different cultural backgrounds respond to specific moral or epistemological cases - like the Gettier examples - differently, and thus intuition is not universal. So, many problems that are based on this assumption may be dissolved, may have plural forms for plural cultures, or may be bounded to some specific cultures - Western culture in many cases.

  20. Basic sciences agonize in Turkey!

    Science.gov (United States)

    Akdemir, Fatma; Araz, Asli; Akman, Ferdi; Durak, Rıdvan

    2016-04-01

    In this study, changes from past to present in the departments of physics, chemistry, biology and mathematics, which are considered the basic sciences in Turkey, are shown. The importance of basic science for the country is emphasized, and the status of the country is discussed from a critical perspective. The number of academic staff, the number of students, and the admission quotas opened each year for these four departments at universities were calculated, and the resulting changes were analyzed. The changes in the examined graphs were similar for all four departments; a particularly significant change was observed in the physics department. The lack of jobs for young people who have graduated from the basic sciences is also an issue that must be discussed. There are also qualitative results in this study, which we have discussed alongside the quantitative ones. Psychological problems caused by unemployment have become a disease among young people. This study focused mainly on quantitative results. We have tried to explain the causes of the obtained results and to propose solutions.