WorldWideScience

Sample records for basic assumptions underlying

  1. Contextuality under weak assumptions

    International Nuclear Information System (INIS)

    Simmons, Andrew W; Rudolph, Terry; Wallman, Joel J; Pashayan, Hakop; Bartlett, Stephen D

    2017-01-01

    The presence of contextuality in quantum theory was first highlighted by Bell, Kochen and Specker, who discovered that for quantum systems of three or more dimensions, measurements could not be viewed as deterministically revealing pre-existing properties of the system. More precisely, no model can assign deterministic outcomes to the projectors of a quantum measurement in a way that depends only on the projector and not the context (the full set of projectors) in which it appeared, despite the fact that the Born rule probabilities associated with projectors are independent of the context. A more general, operational definition of contextuality introduced by Spekkens, which we will term ‘probabilistic contextuality’, drops the assumption of determinism and allows for operations other than measurements to be considered contextual. Even two-dimensional quantum mechanics can be shown to be contextual under this generalised notion. Probabilistic noncontextuality represents the postulate that elements of an operational theory that cannot be distinguished from each other based on the statistics of arbitrarily many repeated experiments (they give rise to the same operational probabilities) are ontologically identical. In this paper, we introduce a framework that enables us to distinguish between different noncontextuality assumptions in terms of the relationships between the ontological representations of objects in the theory given a certain relation between their operational representations. This framework can be used to motivate and define a ‘possibilistic’ analogue, encapsulating the idea that elements of an operational theory that cannot be unambiguously distinguished operationally can also not be unambiguously distinguished ontologically. We then prove that possibilistic noncontextuality is equivalent to an alternative notion of noncontextuality proposed by Hardy. Finally, we demonstrate that these weaker noncontextuality assumptions are sufficient to prove

  2. Basic concepts and assumptions behind the new ICRP recommendations

    International Nuclear Information System (INIS)

    Lindell, B.

    1979-01-01

    A review is given of some of the basic concepts and assumptions behind the current recommendations by the International Commission on Radiological Protection in ICRP Publications 26 and 28, which form the basis for the revision of the Basic Safety Standards jointly undertaken by IAEA, ILO, NEA and WHO. Special attention is given to the assumption of a linear, non-threshold dose-response relationship for stochastic radiation effects such as cancer and hereditary harm. The three basic principles of protection are discussed: justification of practice, optimization of protection and individual risk limitation. In the new ICRP recommendations particular emphasis is given to the principle of keeping all radiation doses as low as is reasonably achievable. A consequence of this is that the ICRP dose limits are now given as boundary conditions for the justification and optimization procedures rather than as values that should be used for purposes of planning and design. The fractional increase in total risk at various ages after continuous exposure near the dose limits is given as an illustration. The need for taking other sources, present and future, into account when applying the dose limits leads to the use of the commitment concept. This is briefly discussed as well as the new quantity, the effective dose equivalent, introduced by ICRP. (author)

  3. Primary prevention in public health: an analysis of basic assumptions.

    Science.gov (United States)

    Ratcliffe, J; Wallack, L

    1985-01-01

    The common definition of primary prevention is straightforward; but how it is transformed into a framework to guide action is based on personal and societal feelings and beliefs about the basis for social organization. This article focuses on the two contending primary prevention strategies of health promotion and health protection. The contention between the two strategies stems from a basic disagreement about disease causality in modern society. Health promotion is based on the "lifestyle" theory of disease causality, which sees individual health status linked ultimately to personal decisions about diet, stress, and drug habits. Primary prevention, from this perspective, entails persuading individuals to forgo their risk-taking, self-destructive behavior. Health protection, on the other hand, is based on the "social-structural" theory of disease causality. This theory sees the health status of populations linked ultimately to the unequal distribution of social resources, industrial pollution, occupational stress, and "anti-health promotion" marketing practices. Primary prevention, from this perspective, requires changing existing social and, particularly, economic policies and structures. In order to provide a basis for choosing between these contending strategies, the demonstrated (i.e., past) impact of each strategy on the health of the public is examined. Two conclusions are drawn. First, the health promotion strategy shows little potential for improving the public health, because it systematically ignores the risk-imposing, other-destructive behavior of influential actors (policy-makers and institutions) in society. And second, effective primary prevention efforts entail an "upstream" approach that results in far-reaching sociopolitical and economic change.

  4. Bank stress testing under different balance sheet assumptions

    OpenAIRE

    Busch, Ramona; Drescher, Christian; Memmel, Christoph

    2017-01-01

    Using unique supervisory survey data on the impact of a hypothetical interest rate shock on German banks, we analyse price and quantity effects on banks' net interest margin components under different balance sheet assumptions. In the first year, the cross-sectional variation of banks' simulated price effect is nearly eight times as large as the one of the simulated quantity effect. After five years, however, the importance of both effects converges. Large banks adjust their balance sheets mo...

  5. Underlying assumptions and core beliefs in anorexia nervosa and dieting.

    Science.gov (United States)

    Cooper, M; Turner, H

    2000-06-01

To investigate assumptions and beliefs in anorexia nervosa and dieting, the Eating Disorder Belief Questionnaire (EDBQ) was administered to patients with anorexia nervosa, dieters and female controls. The patients scored more highly than the other two groups on assumptions about weight and shape, assumptions about eating and negative self-beliefs. The dieters scored more highly than the female controls on assumptions about weight and shape. The cognitive content of anorexia nervosa (both assumptions and negative self-beliefs) differs from that found in dieting. Assumptions about weight and shape may also distinguish dieters from female controls.

  6. Dynamic Group Diffie-Hellman Key Exchange under standard assumptions

    International Nuclear Information System (INIS)

    Bresson, Emmanuel; Chevassut, Olivier; Pointcheval, David

    2002-01-01

    Authenticated Diffie-Hellman key exchange allows two principals communicating over a public network, and each holding public-private keys, to agree on a shared secret value. In this paper we study the natural extension of this cryptographic problem to a group of principals. We begin from existing formal security models and refine them to incorporate major missing details (e.g., strong-corruption and concurrent sessions). Within this model we define the execution of a protocol for authenticated dynamic group Diffie-Hellman and show that it is provably secure under the decisional Diffie-Hellman assumption. Our security result holds in the standard model and thus provides better security guarantees than previously published results in the random oracle model
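The primitive underlying the group protocol can be sketched in a few lines. The snippet below is a minimal, unauthenticated two-party Diffie-Hellman exchange with toy parameters; the small prime, generator, and the `dh_keypair` helper are illustrative assumptions, not the paper's authenticated dynamic group protocol (which additionally authenticates the flows and handles membership changes):

```python
import secrets

# Toy parameters, for illustration only: p is a small safe prime
# (p = 2q + 1 with q = 113 prime) and g = 4 lies in the order-q
# subgroup. Real deployments use groups of at least 2048 bits.
p = 227
g = 4

def dh_keypair(p, g):
    """Pick a private exponent x and derive the public value g^x mod p."""
    x = secrets.randbelow(p - 2) + 1
    return x, pow(g, x, p)

a_priv, a_pub = dh_keypair(p, g)   # first principal
b_priv, b_pub = dh_keypair(p, g)   # second principal

# Each party raises the other's public value to its own private
# exponent; both obtain g^(ab) mod p, the shared secret.
assert pow(b_pub, a_priv, p) == pow(a_pub, b_priv, p)
```

Security of the derived key rests on the decisional Diffie-Hellman assumption named in the abstract: g^(ab) should be indistinguishable from a random group element given only g^a and g^b.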

  7. The stable model semantics under the any-world assumption

    OpenAIRE

    Straccia, Umberto; Loyer, Yann

    2004-01-01

The stable model semantics has become a dominant approach to completing the knowledge provided by a logic program by means of the Closed World Assumption (CWA). The CWA asserts that any atom whose truth-value cannot be inferred from the facts and rules is supposed to be false. This assumption is orthogonal to the so-called Open World Assumption (OWA), which asserts that every such atom's truth-value is supposed to be unknown. The topic of this paper is to be more fine-grained. Indeed, the objec...

  8. A framework for the organizational assumptions underlying safety culture

    International Nuclear Information System (INIS)

    Packer, Charles

    2002-01-01

    The safety culture of the nuclear organization can be addressed at the three levels of culture proposed by Edgar Schein. The industry literature provides a great deal of insight at the artefact and espoused value levels, although as yet it remains somewhat disorganized. There is, however, an overall lack of understanding of the assumption level of safety culture. This paper describes a possible framework for conceptualizing the assumption level, suggesting that safety culture is grounded in unconscious beliefs about the nature of the safety problem, its solution and how to organize to achieve the solution. Using this framework, the organization can begin to uncover the assumptions at play in its normal operation, decisions and events and, if necessary, engage in a process to shift them towards assumptions more supportive of a strong safety culture. (author)

  9. Consequences of Violated Equating Assumptions under the Equivalent Groups Design

    Science.gov (United States)

    Lyren, Per-Erik; Hambleton, Ronald K.

    2011-01-01

    The equal ability distribution assumption associated with the equivalent groups equating design was investigated in the context of a selection test for admission to higher education. The purpose was to assess the consequences for the test-takers in terms of receiving improperly high or low scores compared to their peers, and to find strong…

  10. Basic Assumptions of the New Price System and Supplements to the Tariff System for Electricity Sale

    International Nuclear Information System (INIS)

    Klepo, M.

    1995-01-01

The article outlines some basic assumptions of the new price system and major elements of the latest proposition for changes and supplements to the Tariff System for Electricity Sale in the Republic of Croatia, including an analysis of those elements which brought about the present unfavourable and non-productive relations within the electric power system. The paper proposes measures and actions which should, by means of the price system and tariff policy, improve the present unfavourable relations and their consequences, and achieve a desirable consumption structure and characteristics, resulting in rational management and effective power supply-economy relationships within the electric power system as a subsystem of the power supply sector. (author). 2 refs., 3 figs., 4 tabs

  11. Basic assumptions and definitions in the analysis of financial leverage

    Directory of Open Access Journals (Sweden)

    Tomasz Berent

    2015-12-01

The financial leverage literature has been in a state of terminological chaos for decades, as evidenced, for example, by the Nobel Prize Lecture mistake on the one hand and the global financial crisis on the other. A meaningful analysis of the leverage phenomenon calls for the formulation of a coherent set of assumptions and basic definitions. The objective of the paper is to answer this call. The paper defines leverage as a value-neutral concept useful in explaining the magnification effect exerted by financial activity upon the whole spectrum of financial results. By adopting constructivism as a methodological approach, we are able to introduce various types of leverage: capital and income, base and non-base, accounting and market value, for levels and for distances (absolute and relative), costs and simple, etc. The new definitions formulated here are subsequently adopted in the analysis of the content of leverage statements used by the leading finance textbook.

  12. CRITIQUES TOWARDS COSO’S ENTERPRISE RISK MANAGEMENT (ERM) FRAMEWORK IN ITS BASIC ASSUMPTIONS

    OpenAIRE

    Kurniawanti, Ika Atma

    2010-01-01

Most professionals in internal control, risk management and other similar bailiwicks have agreed that Enterprise Risk Management discourses invariably refer to what COSO has produced recently: the framework underlying ERM. But this paper takes a somewhat different stance, viewing several problematic issues as stemming from unclear conceptions of either the basic premise underlying ERM or the nature of some of ERM's components outlined by COSO. This paper notes that, at least, there are three poi...

  13. Testing the basic assumption of the hydrogeomorphic approach to assessing wetland functions.

    Science.gov (United States)

    Hruby, T

    2001-05-01

    The hydrogeomorphic (HGM) approach for developing "rapid" wetland function assessment methods stipulates that the variables used are to be scaled based on data collected at sites judged to be the best at performing the wetland functions (reference standard sites). A critical step in the process is to choose the least altered wetlands in a hydrogeomorphic subclass to use as a reference standard against which other wetlands are compared. The basic assumption made in this approach is that wetlands judged to have had the least human impact have the highest level of sustainable performance for all functions. The levels at which functions are performed in these least altered wetlands are assumed to be "characteristic" for the subclass and "sustainable." Results from data collected in wetlands in the lowlands of western Washington suggest that the assumption may not be appropriate for this region. Teams developing methods for assessing wetland functions did not find that the least altered wetlands in a subclass had a range of performance levels that could be identified as "characteristic" or "sustainable." Forty-four wetlands in four hydrogeomorphic subclasses (two depressional subclasses and two riverine subclasses) were rated by teams of experts on the severity of their human alterations and on the level of performance of 15 wetland functions. An ordinal scale of 1-5 was used to quantify alterations in water regime, soils, vegetation, buffers, and contributing basin. Performance of functions was judged on an ordinal scale of 1-7. Relatively unaltered wetlands were judged to perform individual functions at levels that spanned all of the seven possible ratings in all four subclasses. The basic assumption of the HGM approach, that the least altered wetlands represent "characteristic" and "sustainable" levels of functioning that are different from those found in altered wetlands, was not confirmed. 
Although the intent of the HGM approach is to use level of functioning as a

  14. Forecasting Value-at-Risk under Different Distributional Assumptions

    Directory of Open Access Journals (Sweden)

    Manuela Braione

    2016-01-01

Financial asset returns are known to be conditionally heteroskedastic and generally non-normally distributed: fat-tailed and often skewed. These features must be taken into account to produce accurate forecasts of Value-at-Risk (VaR). We provide a comprehensive look at the problem by considering the impact that different distributional assumptions have on the accuracy of both univariate and multivariate GARCH models in out-of-sample VaR prediction. The set of analyzed distributions comprises the normal, Student-t and multivariate exponential power distributions, together with their corresponding skewed counterparts. The accuracy of the VaR forecasts is assessed by implementing standard statistical backtesting procedures used to rank the different specifications. The results show the importance of allowing for heavy tails and skewness in the distributional assumption, with the skew-Student outperforming the others across all tests and confidence levels.
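The effect the abstract describes can be illustrated with a minimal backtest. The sketch below fits a normal-assumption parametric VaR to a simulated fat-tailed return series and counts breaches; the mixture parameters and the `normal_var` helper are assumptions of this sketch, not the univariate and multivariate GARCH specifications ranked in the paper:

```python
from statistics import NormalDist, mean, stdev
import random

random.seed(0)
# Hypothetical fat-tailed return series: a two-component normal mixture
# (an assumption for this sketch; the paper works with real asset data).
rets = [random.gauss(0, 0.01) if random.random() < 0.95 else random.gauss(0, 0.04)
        for _ in range(1000)]

def normal_var(returns, alpha=0.99):
    """One-day parametric VaR assuming i.i.d. normal returns: the loss
    threshold exceeded with probability 1 - alpha."""
    mu, sigma = mean(returns), stdev(returns)
    return -(mu + sigma * NormalDist().inv_cdf(1 - alpha))

var99 = normal_var(rets)
# Backtest: a correctly specified 99% VaR should be breached on roughly
# 1% of days; a thin-tailed assumption on fat-tailed data breaches more.
violations = sum(r < -var99 for r in rets)
print(f"99% VaR = {var99:.4f}, breaches: {violations} of {len(rets)}")
```

Swapping the normal quantile for a fatter-tailed (e.g. skew-Student) one pulls the breach frequency back toward the nominal 1%, which is the pattern the backtesting procedures in the paper formalize.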

  15. Weak convergence of Jacobian determinants under asymmetric assumptions

    Directory of Open Access Journals (Sweden)

    Teresa Alberico

    2012-05-01

Let $\Omega$ be a sufficiently smooth bounded open set in $\mathbb{R}^2$, and let $f_k=(u_k,v_k)$ and $f=(u,v)$ be mappings in the Sobolev space $W^{1,2}(\Omega,\mathbb{R}^2)$. We prove that if the sequence of Jacobians $J_{f_k}$ converges to a measure $\mu$ in the sense of measures, and if one allows different assumptions on the two components of $f_k$ and $f$, e.g. $$u_k \rightharpoonup u \ \text{weakly in } W^{1,2}(\Omega), \qquad v_k \rightharpoonup v \ \text{weakly in } W^{1,q}(\Omega)$$ for some $q\in(1,2)$, then $$d\mu = J_f\,dz. \qquad (*)$$ Moreover, we show that this result is optimal in the sense that the conclusion fails for $q=1$. On the other hand, we prove that $(*)$ remains valid also in the case $q=1$, but it is then necessary to require that $u_k$ converges weakly to $u$ in a Zygmund-Sobolev space with a slightly higher degree of regularity than $W^{1,2}(\Omega)$, namely $$u_k \rightharpoonup u \ \text{weakly in } W^{1,L^2 \log^\alpha L}(\Omega)$$ for some $\alpha > 1$.

  16. Natural and laboratory OSL growth curve–Verification of the basic assumption of luminescence dating

    International Nuclear Information System (INIS)

    Kijek, N.; Chruścińska, A.

    2016-01-01

The basic assumption of luminescence dating is the equality between the growth curve of OSL generated by natural radiation and the OSL growth curve reconstructed under laboratory conditions. The dose rates that generate the OSL in nature and in laboratory experiments differ by about ten orders of magnitude. Recently, some discrepancies between the natural and laboratory growth curves have been observed. It is important to establish their causes in order to introduce an appropriate correction into the OSL dating protocol, or to find a test that allows samples which should not be used for dating to be eliminated. For this purpose, both growth curves, natural and laboratory, were reconstructed by means of computer simulations of the processes occurring in the sample during its deposition time in the environment, as well as those which occur in the laboratory during the dating procedure. The simulations were carried out for three models, each including one shallow trap, two OSL traps, one disconnected deep trap and one luminescence center. The OSL model for quartz can be more complex than the one used in the presented simulations, but despite this the results show growth-curve discrepancies similar to those observed in experiments. It is clear that the consistency of the growth curves is not a general feature of the OSL processes, but rather the result of an advantageous configuration of trap parameters. The deep disconnected traps play the key role, and their complete filling before the zeroing of the OSL signal is a necessary condition for the growth curves' consistency. - Highlights: • The process of OSL growth curve generation in nature and in the laboratory was simulated. • Discrepancies between the natural and the laboratory growth curves are observed. • Deep disconnected traps play the key role in growth curve inequality. • Empty deep traps before zeroing of OSL cause the inequality of growth curves.

  17. Uncertainties in sandy shorelines evolution under the Bruun rule assumption

    Directory of Open Access Journals (Sweden)

Gonéri Le Cozannet

    2016-04-01

In the current practice of sandy shoreline change assessments, the local sedimentary budget is evaluated using the sediment balance equation, that is, by summing the contributions of longshore and cross-shore processes. The contribution of future sea-level rise induced by climate change is usually obtained using the Bruun rule, which assumes that the shoreline retreat is equal to the change of sea level divided by the slope of the upper shoreface. However, it remains unclear whether this approach is appropriate for accounting for the impacts of future sea-level rise, owing to the lack of relevant observations with which to validate the Bruun rule under the expected sea-level rise rates. To address this issue, this article estimates the coastal settings and periods of time under which the use of the Bruun rule could be (in)validated, in the case of wave-exposed, gently sloping sandy beaches. Using the sedimentary budgets of Stive (2004) and probabilistic sea-level rise scenarios based on IPCC projections, we provide shoreline change projections that account for all uncertain hydrosedimentary processes affecting idealized coasts (impacts of sea-level rise, storms and other cross-shore and longshore processes). We evaluate the relative importance of each source of uncertainty in the sediment balance equation using a global sensitivity analysis. For scenarios RCP 6.0 and 8.5 and in the absence of coastal defences, the model predicts a perceivable shift toward generalized beach erosion by the middle of the 21st century. In contrast, the model predictions are unlikely to differ from the current situation in the case of scenario RCP 2.6. Finally, the contribution of sea-level rise and climate change scenarios to the uncertainties of sandy shoreline change projections increases with time during the 21st century.
Our results have three primary implications for coastal settings similar to those described in Stive (2004): first, the validation of the Bruun rule will not necessarily be
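The Bruun rule described in this record (shoreline retreat equal to the sea-level change divided by the upper-shoreface slope) reduces to a one-line computation; the numbers below are hypothetical, chosen only to show the order of magnitude involved:

```python
def bruun_retreat(sea_level_rise, upper_shoreface_slope):
    """Bruun rule: shoreline retreat R = S / tan(beta), where S is the
    sea-level change (m) and tan(beta) the mean upper-shoreface slope."""
    return sea_level_rise / upper_shoreface_slope

# Hypothetical gently sloping beach: 0.5 m of sea-level rise over a
# shoreface with slope 1:100 translates into 50 m of retreat.
print(bruun_retreat(0.5, 0.01))  # 50.0
```

The roughly two-orders-of-magnitude amplification from vertical sea-level change to horizontal retreat is what makes the uncertainty analysis described in the abstract consequential.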

  18. Being Explicit about Underlying Values, Assumptions and Views when Designing for Children in the IDC Community

    DEFF Research Database (Denmark)

    Skovbjerg, Helle Marie; Bekker, Tilde; Barendregt, Wolmet

    2016-01-01

In this full-day workshop we want to discuss how the IDC community can make the underlying assumptions, values and views regarding children and childhood that inform design decisions more explicit. What assumptions do IDC designers and researchers make, and how can they be supported in reflecting on them? The workshop intends to share different approaches for uncovering and reflecting on values, assumptions and views about children and childhood in design.

  19. Unrealistic Assumptions in Economics: an Analysis under the Logic of Socioeconomic Processes

    Directory of Open Access Journals (Sweden)

    Leonardo Ivarola

    2014-11-01

The realism of assumptions is an ongoing debate within the philosophy of economics. One of the most referenced papers on this matter belongs to Milton Friedman. He defends the use of unrealistic assumptions, not only on pragmatic grounds, but also because of the intrinsic difficulty of determining the extent of realism. On the other hand, realists have criticized (and still do today) the use of unrealistic assumptions, such as the assumptions of rational choice, perfect information, homogeneous goods, etc. However, they have not accompanied their statements with a proper epistemological argument that supports their position. In this work we aim to show that the realism of (a particular sort of) assumptions is clearly relevant when examining economic models, since the system under study (real economies) is not compatible with the logic of invariance and of mechanisms, but with the logic of possibility trees. Because of this, models will not function as tools for predicting outcomes, but as representations of alternative scenarios, whose similarity to the real world must be examined in terms of the verisimilitude of a class of model assumptions.

  20. Investigation of assumptions underlying current safety guidelines on EM-induced nerve stimulation

    Science.gov (United States)

    Neufeld, Esra; Vogiatzis Oikonomidis, Ioannis; Iacono, Maria Ida; Angelone, Leonardo M.; Kainz, Wolfgang; Kuster, Niels

    2016-06-01

    An intricate network of a variety of nerves is embedded within the complex anatomy of the human body. Although nerves are shielded from unwanted excitation, they can still be stimulated by external electromagnetic sources that induce strongly non-uniform field distributions. Current exposure safety standards designed to limit unwanted nerve stimulation are based on a series of explicit and implicit assumptions and simplifications. This paper demonstrates the applicability of functionalized anatomical phantoms with integrated coupled electromagnetic and neuronal dynamics solvers for investigating the impact of magnetic resonance exposure on nerve excitation within the full complexity of the human anatomy. The impact of neuronal dynamics models, temperature and local hot-spots, nerve trajectory and potential smoothing, anatomical inhomogeneity, and pulse duration on nerve stimulation was evaluated. As a result, multiple assumptions underlying current safety standards are questioned. It is demonstrated that coupled EM-neuronal dynamics modeling involving realistic anatomies is valuable to establish conservative safety criteria.

  1. Oil production, oil prices, and macroeconomic adjustment under different wage assumptions

    International Nuclear Information System (INIS)

    Harvie, C.; Maleka, P.T.

    1992-01-01

    In a previous paper one of the authors developed a simple model to try to identify the possible macroeconomic adjustment processes arising in an economy experiencing a temporary period of oil production, under alternative wage adjustment assumptions, namely nominal and real wage rigidity. Certain assumptions were made regarding the characteristics of actual production, the permanent revenues generated from that oil production, and the net exports/imports of oil. The role of the price of oil, and possible changes in that price was essentially ignored. Here we attempt to incorporate the price of oil, as well as changes in that price, in conjunction with the production of oil, the objective being to identify the contribution which the price of oil, and changes in it, make to the adjustment process itself. The emphasis in this paper is not given to a mathematical derivation and analysis of the model's dynamics of adjustment or its comparative statics, but rather to the derivation of simulation results from the model, for a specific assumed case, using a numerical algorithm program, conducive to the type of theoretical framework utilized here. The results presented suggest that although the adjustment profiles of the macroeconomic variables of interest, for either wage adjustment assumption, remain fundamentally the same, the magnitude of these adjustments is increased. Hence to derive a more accurate picture of the dimensions of adjustment of these macroeconomic variables, it is essential to include the price of oil as well as changes in that price. (Author)

  2. Recursive Subspace Identification of AUV Dynamic Model under General Noise Assumption

    Directory of Open Access Journals (Sweden)

    Zheping Yan

    2014-01-01

A recursive subspace identification algorithm for autonomous underwater vehicles (AUVs) is proposed in this paper. Owing to its advantages in handling nonlinearities and couplings, the AUV model investigated here is for the first time constructed as a Hammerstein model with nonlinear feedback in the linear part. To better take environmental and sensor noises into consideration, the identification problem is treated as an errors-in-variables (EIV) one, which means that the identification procedure is carried out under a general noise assumption. In order to make the algorithm recursive, the propagator method (PM) based subspace approach is extended into the EIV framework to form the recursive identification method called the PM-EIV algorithm. With several identification experiments carried out on the AUV simulation platform, the proposed algorithm demonstrates its effectiveness and feasibility.

  3. Helping Students to Recognize and Evaluate an Assumption in Quantitative Reasoning: A Basic Critical-Thinking Activity with Marbles and Electronic Balance

    Science.gov (United States)

    Slisko, Josip; Cruz, Adrian Corona

    2013-01-01

    There is a general agreement that critical thinking is an important element of 21st century skills. Although critical thinking is a very complex and controversial conception, many would accept that recognition and evaluation of assumptions is a basic critical-thinking process. When students use simple mathematical model to reason quantitatively…

  4. Limitations to the Dutch cannabis toleration policy: Assumptions underlying the reclassification of cannabis above 15% THC.

    Science.gov (United States)

    Van Laar, Margriet; Van Der Pol, Peggy; Niesink, Raymond

    2016-08-01

The Netherlands has seen an increase in Δ9-tetrahydrocannabinol (THC) concentrations from approximately 8% in the 1990s up to 20% in 2004. Increased cannabis potency may lead to higher THC exposure and cannabis-related harm. The Dutch government officially condones the sale of cannabis from so-called 'coffee shops', and the Opium Act distinguishes cannabis as a Schedule II drug with 'acceptable risk' from other drugs with 'unacceptable risk' (Schedule I). Even in 1976, however, cannabis potency was taken into account by classifying hemp oil as a Schedule I drug. In 2011, an advisory committee recommended tightening up legislation, leading to a 2013 bill proposing the reclassification of high-potency cannabis products with a THC content of 15% or more as Schedule I drugs. The purpose of this measure was twofold: to reduce public health risks and to reduce illegal cultivation and export of cannabis by increasing punishment. This paper focuses on the public health aspects and describes the (explicit and implicit) assumptions underlying this '15% THC measure', as well as the extent to which they are supported by scientific research. Based on the scientific literature and other sources of information, we conclude that the 15% measure can, in theory, provide a slight health benefit for specific groups of cannabis users (i.e., frequent users preferring strong cannabis, purchasing from coffee shops, using 'steady quantities' and not changing their smoking behaviour), but certainly not for all cannabis users. These gains should be weighed against the investment in enforcement and the risk of unintended (adverse) effects. Given the many assumptions and the uncertainty about the nature and extent of the expected buying and smoking behaviour changes, the measure is a political choice based on thin evidence. Copyright © 2016 Springer. Published by Elsevier B.V. All rights reserved.

  5. Allele Age Under Non-Classical Assumptions is Clarified by an Exact Computational Markov Chain Approach.

    Science.gov (United States)

    De Sanctis, Bianca; Krukov, Ivan; de Koning, A P Jason

    2017-09-19

Determination of the age of an allele based on its population frequency is a well-studied problem in population genetics, for which a variety of approximations have been proposed. We present a new result that, surprisingly, allows the expectation and variance of allele age to be computed exactly (within machine precision) for any finite absorbing Markov chain model in a matter of seconds. This approach makes none of the classical assumptions (e.g., weak selection, reversibility, infinite sites), exploits modern sparse linear algebra techniques, integrates over all sample paths, and is rapidly computable for Wright-Fisher populations up to N_e = 100,000. With this approach, we study the joint effect of recurrent mutation, dominance, and selection, and demonstrate new examples of "selective strolls" where the classical symmetry of allele age with respect to selection is violated by weakly selected alleles that are older than neutral alleles at the same frequency. We also show evidence for a strong age imbalance, where rare deleterious alleles are expected to be substantially older than advantageous alleles observed at the same frequency when population-scaled mutation rates are large. These results highlight the under-appreciated utility of computational methods for the direct analysis of Markov chain models in population genetics.
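The core linear-algebra step behind such exact computations can be sketched on a toy chain. The snippet below solves (I - Q) t = 1 for the expected time to absorption in a small absorbing Markov chain using exact rational arithmetic; the random-walk example and the `expected_absorption_times` helper are illustrative assumptions, standing in for the much larger, sparse Wright-Fisher chain analysed in the paper:

```python
from fractions import Fraction

def expected_absorption_times(Q):
    """Solve (I - Q) t = 1 exactly for an absorbing Markov chain, where
    Q is the transient-to-transient transition block. t[i] is the
    expected number of steps to absorption starting from state i."""
    n = len(Q)
    # Build the augmented matrix [I - Q | 1] with exact rationals.
    A = [[Fraction(int(i == j)) - Fraction(Q[i][j]) for j in range(n)] + [Fraction(1)]
         for i in range(n)]
    # Gauss-Jordan elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(n):
            if r != col and A[r][col]:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[i][n] / A[i][i] for i in range(n)]

# Toy example: symmetric random walk on {0,1,2,3,4}, absorbing at 0 and
# 4; transient states 1..3 move one step left or right with probability
# 1/2 each. For the paper's setting, Q would instead hold the binomial
# Wright-Fisher transition probabilities.
half = Fraction(1, 2)
Q = [[0, half, 0],
     [half, 0, half],
     [0, half, 0]]
print(expected_absorption_times(Q))  # expected durations 3, 4, 3 steps
```

The dense elimination above is only for exposition; at Wright-Fisher scale it would be replaced by the sparse linear-algebra techniques the authors mention.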

  6. A Memory-Based Model of Posttraumatic Stress Disorder: Evaluating Basic Assumptions Underlying the PTSD Diagnosis

    Science.gov (United States)

    Rubin, David C.; Berntsen, Dorthe; Bohni, Malene Klindt

    2008-01-01

    In the mnemonic model of posttraumatic stress disorder (PTSD), the current memory of a negative event, not the event itself, determines symptoms. The model is an alternative to the current event-based etiology of PTSD represented in the "Diagnostic and Statistical Manual of Mental Disorders" (4th ed., text rev.; American Psychiatric Association,…

  7. Do unreal assumptions pervert behaviour?

    DEFF Research Database (Denmark)

    Petersen, Verner C.

    of the basic assumptions underlying the theories found in economics. Assumptions relating to the primacy of self-interest, to resourceful, evaluative, maximising models of man, to incentive systems and to agency theory. The major part of the paper then discusses how these assumptions and theories may pervert......-interested way nothing will. The purpose of this paper is to take a critical look at some of the assumptions and theories found in economics and discuss their implications for the models and the practices found in the management of business. The expectation is that the unrealistic assumptions of economics have...... become taken for granted and tacitly included into theories and models of management. Guiding business and management to behave in a fashion that apparently makes these assumptions become "true". Thus in fact making theories and models become self-fulfilling prophecies. The paper elucidates some...

  8. Under What Assumptions Do Site-by-Treatment Instruments Identify Average Causal Effects?

    Science.gov (United States)

    Reardon, Sean F.; Raudenbush, Stephen W.

    2013-01-01

    The increasing availability of data from multi-site randomized trials provides a potential opportunity to use instrumental variables methods to study the effects of multiple hypothesized mediators of the effect of a treatment. We derive nine assumptions needed to identify the effects of multiple mediators when using site-by-treatment interactions…

  9. Implicit assumptions underlying simple harvest models of marine bird populations can mislead environmental management decisions.

    Science.gov (United States)

    O'Brien, Susan H; Cook, Aonghais S C P; Robinson, Robert A

    2017-10-01

    Assessing the potential impact of additional mortality from anthropogenic causes on animal populations requires detailed demographic information. However, these data are frequently lacking, making simple algorithms, which require little data, appealing. Because of their simplicity, these algorithms often rely on implicit assumptions, some of which may be quite restrictive. Potential Biological Removal (PBR) is a simple harvest model that estimates the number of additional mortalities that a population can theoretically sustain without causing population extinction. However, PBR relies on a number of implicit assumptions, particularly around density dependence and population trajectory that limit its applicability in many situations. Among several uses, it has been widely employed in Europe in Environmental Impact Assessments (EIA), to examine the acceptability of potential effects of offshore wind farms on marine bird populations. As a case study, we use PBR to estimate the number of additional mortalities that a population with characteristics typical of a seabird population can theoretically sustain. We incorporated this level of additional mortality within Leslie matrix models to test assumptions within the PBR algorithm about density dependence and current population trajectory. Our analyses suggest that the PBR algorithm identifies levels of mortality which cause population declines for most population trajectories and forms of population regulation. Consequently, we recommend that practitioners do not use PBR in an EIA context for offshore wind energy developments. Rather than using simple algorithms that rely on potentially invalid implicit assumptions, we recommend use of Leslie matrix models for assessing the impact of additional mortality on a population, enabling the user to explicitly define assumptions and test their importance. Copyright © 2017 Elsevier Ltd. All rights reserved.
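    For concreteness, the PBR algorithm referenced above is conventionally written PBR = N_min × ½R_max × F_r (Wade's 1998 formulation), and the Leslie-matrix alternative amounts to asking whether the dominant eigenvalue of an age-structured projection matrix stays above 1 once the extra mortality is imposed. A minimal sketch with invented seabird-like numbers (none of these values are taken from the paper):

```python
import numpy as np

def potential_biological_removal(n_min, r_max, recovery_factor):
    """Standard PBR formula: N_min * (R_max / 2) * F_r, i.e. the number
    of additional mortalities assumed sustainable per year."""
    return n_min * 0.5 * r_max * recovery_factor

pbr = potential_biological_removal(n_min=10_000, r_max=0.10, recovery_factor=0.5)

# Leslie-matrix check: fecundities on the first row, survival rates on
# the subdiagonal (illustrative values). A dominant eigenvalue below 1
# means the population declines under the assumed vital rates.
L = np.array([[0.0, 0.3, 0.5],
              [0.9, 0.0, 0.0],
              [0.0, 0.9, 0.0]])
growth_rate = max(abs(np.linalg.eigvals(L)))
```

    The paper's analysis effectively compares these two views: a mortality level that PBR labels sustainable can still push the Leslie-matrix growth rate below 1 for many population trajectories.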

  10. The social contact hypothesis under the assumption of endemic equilibrium: Elucidating the transmission potential of VZV in Europe

    Directory of Open Access Journals (Sweden)

    E. Santermans

    2015-06-01

    Full Text Available The basic reproduction number R0 and the effective reproduction number R are pivotal parameters in infectious disease epidemiology, quantifying the transmission potential of an infection in a population. We estimate both parameters from 13 pre-vaccination serological data sets on varicella zoster virus (VZV) in 12 European countries and from population-based social contact surveys under the commonly made assumptions of endemic and demographic equilibrium. The fit to the serology is evaluated using the inferred effective reproduction number R as a model eligibility criterion combined with AIC as a model selection criterion. For only 2 out of 12 countries, the common choice of a constant proportionality factor is sufficient to provide a good fit to the seroprevalence data. For the other countries, an age-specific proportionality factor provides a better fit, assuming physical contacts lasting longer than 15 min are a good proxy for potential varicella transmission events. In all countries, primary infection with VZV most often occurs in early childhood, but there is substantial variation in transmission potential, with R0 ranging from 2.8 in England and Wales to 7.6 in The Netherlands. Two non-parametric methods, the maximal information coefficient (MIC) and a random forest approach, are used to explain these differences in R0 in terms of relevant country-specific characteristics. Our results suggest an association with three general factors: inequality in wealth, infant vaccination coverage and child care attendance. This illustrates the need to consider fundamental differences between European countries when formulating and parameterizing infectious disease models.
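    As context for the R0 values quoted above, the simplest endemic-equilibrium estimator is the textbook approximation R0 ≈ 1 + L/A (life expectancy over mean age at first infection), which assumes homogeneous mixing. This is not the contact-matrix method of the paper; it is a hedged illustration with invented round numbers:

```python
def r0_homogeneous(life_expectancy, mean_age_at_infection):
    """Textbook endemic-equilibrium approximation R0 ~ 1 + L/A.
    Assumes homogeneous mixing and a rectangular age distribution;
    illustrative only, not the contact-survey estimator above."""
    return 1.0 + life_expectancy / mean_age_at_infection

# VZV acquired around age 4 in a population with an 80-year life
# expectancy (invented numbers):
r0 = r0_homogeneous(80.0, 4.0)  # -> 21.0
```

    The gap between this crude value and the paper's contact-based range of 2.8 to 7.6 illustrates why age-specific mixing matters for childhood infections.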

  11. Bootstrapping realized volatility and realized beta under a local Gaussianity assumption

    DEFF Research Database (Denmark)

    Hounyo, Ulrich

    The main contribution of this paper is to propose a new bootstrap method for statistics based on high frequency returns. The new method exploits the local Gaussianity and the local constancy of volatility of high frequency returns, two assumptions that can simplify inference in the high frequency...... context, as recently explained by Mykland and Zhang (2009). Our main contributions are as follows. First, we show that the local Gaussian bootstrap is first-order consistent when used to estimate the distributions of realized volatility and realized betas. Second, we show that the local Gaussian bootstrap...... matches accurately the first four cumulants of realized volatility, implying that this method provides third-order refinements. This is in contrast with the wild bootstrap of Gonçalves and Meddahi (2009), which is only second-order correct. Third, we show that the local Gaussian bootstrap is able...

  12. Testing the simplex assumption underlying the Sport Motivation Scale: a structural equation modeling analysis.

    Science.gov (United States)

    Li, F; Harmer, P

    1996-12-01

    Self-determination theory (Deci & Ryan, 1985) suggests that motivational orientation or regulatory styles with respect to various behaviors can be conceptualized along a continuum ranging from low (amotivation) to high (intrinsic motivation) levels of self-determination. This pattern is manifested in the rank order of correlations among these regulatory styles (i.e., adjacent correlations are expected to be higher than those more distant) and is known as a simplex structure. Using responses from the Sport Motivation Scale (Pelletier et al., 1995) obtained from a sample of 857 college students (442 men, 415 women), the present study tested the simplex structure underlying SMS subscales via structural equation modeling. Results confirmed the simplex model structure, indicating that the various motivational constructs are empirically organized from low to high self-determination. The simplex pattern was further found to be invariant across gender. Findings from this study support the construct validity of the SMS and have important implications for studies focusing on the influence of motivational orientation in sport.
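    The simplex hypothesis is, at bottom, an ordering claim about a correlation matrix: moving away from the diagonal within any row, correlations should not increase. A small hedged sketch of that check (the matrix values are invented for illustration, not the SMS data):

```python
import numpy as np

def satisfies_simplex(corr):
    """Check the simplex pattern: in each row of a correlation matrix whose
    variables are ordered along the continuum, correlations must not
    increase as the distance between subscales grows."""
    k = corr.shape[0]
    for i in range(k):
        # walk outward from the diagonal on each side of row i
        for side in (range(i - 1, -1, -1), range(i + 1, k)):
            vals = [corr[i, j] for j in side]
            if any(near < far for near, far in zip(vals, vals[1:])):
                return False
    return True

# A hypothetical 4-subscale matrix ordered from amotivation to intrinsic
# motivation (illustrative values):
corr = np.array([[1.00, 0.60, 0.40, 0.20],
                 [0.60, 1.00, 0.55, 0.35],
                 [0.40, 0.55, 1.00, 0.50],
                 [0.20, 0.35, 0.50, 1.00]])
ok = satisfies_simplex(corr)  # -> True
```

    The structural equation modeling approach in the paper tests this same ordering formally, with sampling error taken into account, rather than by direct inspection as here.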

  13. The philosophy and assumptions underlying exposure limits for ionising radiation, inorganic lead, asbestos and noise

    International Nuclear Information System (INIS)

    Akber, R.

    1996-01-01

    Full text: A review of the literature relating to exposure to, and exposure limits for, ionising radiation, inorganic lead, asbestos and noise was undertaken. The four hazards were chosen because they were insidious and ubiquitous, were potential hazards in both occupational and environmental settings, and had early and late effects depending on dose and dose rate. For all four hazards, the effect of the hazard was enhanced by other exposures such as smoking or organic solvents. In the cases of inorganic lead and noise, there were documented health effects which affected a significant percentage of the exposed populations at or below the [effective] exposure limits. This was not the case for ionising radiation and asbestos. None of the exposure limits considered exposure to multiple mutagens/carcinogens in the calculation of risk. Ionising radiation was the only one of the hazards to have a model of all likely exposures, occupational, environmental and medical, as the basis for the exposure limits. The other three considered occupational exposure in isolation from environmental exposure. Inorganic lead and noise had economic considerations underlying their exposure limits, and the exposure limits for asbestos were based on the current limit of detection. All four hazards had many variables associated with exposure, including idiosyncratic factors, that made modelling the risk very complex. The scientific idea of a time-weighted average based on an eight-hour day and forty-hour week, on which the exposure limits for lead, asbestos and noise were based, was underpinned by neither empirical evidence nor scientific hypothesis. The methodology of the ACGIH in the setting of limits later brought into law may have been unduly influenced by the industries most closely affected by those limits. Measuring exposure over part of an eight-hour day and extrapolating to model exposure over the longer term is not the most effective way to model exposure.
The statistical techniques used

  14. Tale of Two Courthouses: A Critique of the Underlying Assumptions in Chronic Disease Self-Management for Aboriginal People

    Directory of Open Access Journals (Sweden)

    Isabelle Ellis

    2009-12-01

    Full Text Available This article reviews the assumptions that underpin the commonly implemented Chronic Disease Self-Management models: namely, that there is a clear set of instructions for patients to comply with, that all health care providers agree with those instructions, and that the health care provider and the patient agree with the chronic disease self-management plan that was developed as part of a consultation. These assumptions are evaluated for their validity in the remote health care context, particularly for Aboriginal people. They have been found to lack validity in this context; therefore, an alternative model to enhance chronic disease care is proposed.

  15. False assumptions.

    Science.gov (United States)

    Swaminathan, M

    1997-01-01

    Indian women do not have to be told the benefits of breast feeding or "rescued from the clutches of wicked multinational companies" by international agencies. There is no proof that breast feeding has declined in India; in fact, a 1987 survey revealed that 98% of Indian women breast feed. Efforts to promote breast feeding among the middle classes rely on such initiatives as the "baby friendly" hospital where breast feeding is promoted immediately after birth. This ignores the 76% of Indian women who give birth at home. Blaming this unproved decline in breast feeding on multinational companies distracts attention from more far-reaching and intractable effects of social change. While the Infant Milk Substitutes Act is helpful, it also deflects attention from more pressing issues. Another false assumption is that Indian women are abandoning breast feeding to comply with the demands of employment, but research indicates that most women give up employment for breast feeding, despite the economic cost to their families. Women also seek work in the informal sector to secure the flexibility to meet their child care responsibilities. Instead of being concerned about "teaching" women what they already know about the benefits of breast feeding, efforts should be made to remove the constraints women face as a result of their multiple roles and to empower them with the support of families, governmental policies and legislation, employers, health professionals, and the media.

  16. Estimating risks and relative risks in case-base studies under the assumptions of gene-environment independence and Hardy-Weinberg equilibrium.

    Directory of Open Access Journals (Sweden)

    Tina Tsz-Ting Chui

    Full Text Available Many diseases result from the interactions between genes and the environment. An efficient method has been proposed for a case-control study to estimate the genetic and environmental main effects and their interactions, which exploits the assumptions of gene-environment independence and Hardy-Weinberg equilibrium. To estimate the absolute and relative risks, one needs to resort to an alternative design: the case-base study. In this paper, the authors show how to analyze a case-base study under the above dual assumptions. This approach is based on a conditional logistic regression of case-counterfactual controls matched data. It can be easily fitted with readily available statistical packages. When the dual assumptions are met, the method is approximately unbiased and has adequate coverage probabilities for confidence intervals. It also results in smaller variances and shorter confidence intervals as compared with a previous method for a case-base study which imposes neither assumption.

  17. Estimating Risks and Relative Risks in Case-Base Studies under the Assumptions of Gene-Environment Independence and Hardy-Weinberg Equilibrium

    Science.gov (United States)

    Chui, Tina Tsz-Ting; Lee, Wen-Chung

    2014-01-01

    Many diseases result from the interactions between genes and the environment. An efficient method has been proposed for a case-control study to estimate the genetic and environmental main effects and their interactions, which exploits the assumptions of gene-environment independence and Hardy-Weinberg equilibrium. To estimate the absolute and relative risks, one needs to resort to an alternative design: the case-base study. In this paper, the authors show how to analyze a case-base study under the above dual assumptions. This approach is based on a conditional logistic regression of case-counterfactual controls matched data. It can be easily fitted with readily available statistical packages. When the dual assumptions are met, the method is approximately unbiased and has adequate coverage probabilities for confidence intervals. It also results in smaller variances and shorter confidence intervals as compared with a previous method for a case-base study which imposes neither assumption. PMID:25137392

  18. Effect of grid resolution and subgrid assumptions on the model prediction of a reactive buoyant plume under convective conditions

    International Nuclear Information System (INIS)

    Chock, D.P.; Winkler, S.L.; Pu Sun

    2002-01-01

    We have introduced a new and elaborate approach to understand the impact of grid resolution and subgrid chemistry assumptions on the grid-model prediction of species concentrations for a system with highly non-homogeneous chemistry - a reactive buoyant plume immediately downwind of the stack in a convective boundary layer. The Parcel-Grid approach was used to describe both the air-parcel turbulent transport and the chemistry. This approach allows an identical transport process for all simulations. It also allows a description of subgrid chemistry. The ambient and plume parcel transport follows the description of Luhar and Britter (Atmos. Environ, 23 (1989) 1911, 26A (1992) 1283). The chemistry follows that of the Carbon-Bond mechanism. Three different grid sizes were considered: fine, medium and coarse, together with three different subgrid chemistry assumptions: micro-scale or individual parcel, tagged-parcel (plume and ambient parcels treated separately), and untagged-parcel (plume and ambient parcels treated indiscriminately). Reducing the subgrid information does not necessarily have the same effect as increasing the model grid size. In our example, increasing the grid size leads to a reduction in the suppression of ozone in the presence of a high-NO x stack plume, and a reduction in the effectiveness of the NO x -inhibition effect. On the other hand, reducing the subgrid information (by using the untagged-parcel assumption) leads to an increase in ozone reduction and an enhancement of the NO x -inhibition effect insofar as the ozone extremum is concerned. (author)

  19. Political Assumptions Underlying Pedagogies of National Education: The Case of Student Teachers Teaching 'British Values' in England

    Science.gov (United States)

    Sant, Edda; Hanley, Chris

    2018-01-01

    Teacher education in England now requires that student teachers follow practices that do not undermine "fundamental British values" where these practices are assessed against a set of ethics and behaviour standards. This paper examines the political assumptions underlying pedagogical interpretations about the education of national…

  20. BASIC

    DEFF Research Database (Denmark)

    Hansen, Pelle Guldborg; Schmidt, Karsten

    2017-01-01

    Over the past 10 years we have witnessed the rise of a new evidence-based policy paradigm, Behavioural Public Policy (BPP), which seeks to integrate theoretical and methodological insights from the behavioural sciences into public policy development. Work on BPP has, however, tended to be unsystematic...... BPP. The approach consists partly of the overarching process model BASIC and partly of an embedded framework, ABCD, which is a model for systematic behavioural analysis and for the development, testing and implementation of behaviourally informed solution concepts. Together, the model enables researchers as well as public servants...

  1. Understanding the basic biology underlying the flavor world of children

    Directory of Open Access Journals (Sweden)

    Julie A. MENNELLA, Alison K. VENTURA

    2010-12-01

    Full Text Available Health organizations worldwide recommend that adults and children minimize intakes of excess energy and salty, sweet, and fatty foods (all of which are highly preferred tastes) and eat diets richer in whole grains, low- and non-fat dairy products, legumes, fish, lean meat, fruits, and vegetables (many of which taste bitter). Despite such recommendations and the well-established benefits of these foods to human health, adults are not complying, nor are their children. A primary reason for this difficulty is the remarkably potent rewarding properties of the tastes and flavors of foods high in sweetness, saltiness, and fatness. While we cannot easily change children’s basic ingrained biology of liking sweets and avoiding bitterness, we can modulate their flavor preferences by providing early exposure, starting in utero, to a wide variety of flavors within healthy foods, such as fruits, vegetables, and whole grains. Because the flavors of foods mothers eat during pregnancy and lactation also flavor amniotic fluid and breast milk and become preferred by infants, pregnant and lactating women should widen their food choices to include as many flavorful and healthy foods as possible. These experiences, combined with repeated exposure to nutritious foods and flavor variety during the weaning period and beyond, should maximize the chances that children will select and enjoy a healthier diet [Current Zoology 56 (6): 834–841, 2010].

  2. The metaphysics of D-CTCs: On the underlying assumptions of Deutsch's quantum solution to the paradoxes of time travel

    Science.gov (United States)

    Dunlap, Lucas

    2016-11-01

    I argue that Deutsch's model for the behavior of systems traveling around closed timelike curves (CTCs) relies implicitly on a substantive metaphysical assumption. Deutsch is employing a version of quantum theory with a significantly supplemented ontology of parallel existent worlds, which differ in kind from the many worlds of the Everett interpretation. Standard Everett does not support the existence of multiple identical copies of the world, which the D-CTC model requires. This has been obscured because he often refers to the branching structure of Everett as a "multiverse", and describes quantum interference by reference to parallel interacting definite worlds. But he admits that this is only an approximation to Everett. The D-CTC model, however, relies crucially on the existence of a multiverse of parallel interacting worlds. Since his model is supplemented by structures that go significantly beyond quantum theory, and play an ineliminable role in its predictions and explanations, it does not represent a quantum solution to the paradoxes of time travel.

  3. Computational studies of global nuclear energy development under the assumption of the world's heterogeneous development

    International Nuclear Information System (INIS)

    Egorov, A.F.; Korobejnikov, V.V.; Poplavskaya, E.V.; Fesenko, G.A.

    2013-01-01

    The authors study a mathematical model of global nuclear energy development until the end of this century. For a comparative analysis of scenarios of transition to sustainable nuclear energy systems, models of a heterogeneous world that allow for specific national development paths are investigated.

  4. Basic Density and Strength Properties Variations in Cordia Africana (Lam) Grown Under Agroforestry in Arumeru, Tanzania

    NARCIS (Netherlands)

    Mahonge, C.P.I.

    2007-01-01

    Variations in basic density and strength properties of Cordia africana (Lam) grown under agroforestry in Arumeru district, Arusha, Tanzania were determined. Tree sampling and data collection were based on standard methods (ISO 3129 of 1975). The main results indicated that basic density increased

  5. Reconstructing genealogies of serial samples under the assumption of a molecular clock using serial-sample UPGMA.

    Science.gov (United States)

    Drummond, A; Rodrigo, A G

    2000-12-01

    Reconstruction of evolutionary relationships from noncontemporaneous molecular samples provides a new challenge for phylogenetic reconstruction methods. With recent biotechnological advances there has been an increase in molecular sequencing throughput, and the potential to obtain serial samples of sequences from populations, including rapidly evolving pathogens, is fast being realized. A new method called the serial-sample unweighted pair grouping method with arithmetic means (sUPGMA) is presented that reconstructs a genealogy or phylogeny of sequences sampled serially in time using a matrix of pairwise distances. The resulting tree depicts the terminal lineages of each sample ending at a different level consistent with the sample's temporal order. Since sUPGMA is a variant of UPGMA, it will perform best when sequences have evolved at a constant rate (i.e., according to a molecular clock). On simulated data, this new method performs better than standard cluster analysis under a variety of longitudinal sampling strategies. Serial-sample UPGMA is particularly useful for analysis of longitudinal samples of viruses and bacteria, as well as ancient DNA samples, with the minimal requirement that samples of sequences be ordered in time.
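    sUPGMA modifies ordinary UPGMA, i.e. average-linkage clustering of a pairwise distance matrix, by adjusting distances for the sampling-time offsets before clustering so that tips from different sampling dates end at different levels. The base clustering step can be sketched with SciPy; the serial-sample correction itself is not implemented here, and the distance values are invented:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import squareform

# Pairwise distances among four sequences (illustrative values):
D = np.array([[0.0, 0.2, 0.5, 0.6],
              [0.2, 0.0, 0.5, 0.6],
              [0.5, 0.5, 0.0, 0.3],
              [0.6, 0.6, 0.3, 0.0]])

# UPGMA is average-linkage clustering on the condensed distance matrix;
# each row of `tree` records (cluster a, cluster b, merge distance, size).
tree = linkage(squareform(D), method='average')
```

    Under a molecular clock, the merge heights of the resulting ultrametric tree are proportional to divergence times, which is why the abstract notes that sUPGMA performs best when rates are constant.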

  6. Unders and Overs: Using a Dice Game to Illustrate Basic Probability Concepts

    Science.gov (United States)

    McPherson, Sandra Hanson

    2015-01-01

    In this paper, the dice game "Unders and Overs" is described and presented as an active learning exercise to introduce basic probability concepts. The implementation of the exercise is outlined and the resulting presentation of various probability concepts are described.
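    In Unders and Overs, players bet on whether the sum of two dice will be under 7, exactly 7, or over 7. The probabilities the exercise is built on can be enumerated directly; payout schemes vary by classroom, so none is assumed here:

```python
from itertools import product
from fractions import Fraction

# Enumerate all 36 equally likely outcomes of two fair dice and classify
# the sum as "under" (2-6), "seven", or "over" (8-12).
counts = {"under": 0, "seven": 0, "over": 0}
for a, b in product(range(1, 7), repeat=2):
    s = a + b
    counts["under" if s < 7 else "seven" if s == 7 else "over"] += 1

probs = {k: Fraction(v, 36) for k, v in counts.items()}
# probs == {"under": Fraction(5, 12), "seven": Fraction(1, 6),
#           "over": Fraction(5, 12)}
```

    The asymmetry between the 15 "under"/"over" outcomes and the 6 ways to roll a seven is what makes the game a natural springboard for discussing sample spaces and expected value.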

  7. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  8. Operating Characteristics of Statistical Methods for Detecting Gene-by-Measured Environment Interaction in the Presence of Gene-Environment Correlation under Violations of Distributional Assumptions.

    Science.gov (United States)

    Van Hulle, Carol A; Rathouz, Paul J

    2015-02-01

    Accurately identifying interactions between genetic vulnerabilities and environmental factors is of critical importance for genetic research on health and behavior. In the previous work of Van Hulle et al. (Behavior Genetics, Vol. 43, 2013, pp. 71-84), we explored the operating characteristics for a set of biometric (e.g., twin) models of Rathouz et al. (Behavior Genetics, Vol. 38, 2008, pp. 301-315), for testing gene-by-measured environment interaction (GxM) in the presence of gene-by-measured environment correlation (rGM) where data followed the assumed distributional structure. Here we explore the effects that violating distributional assumptions have on the operating characteristics of these same models even when structural model assumptions are correct. We simulated N = 2,000 replicates of n = 1,000 twin pairs under a number of conditions. Non-normality was imposed on either the putative moderator or on the ultimate outcome by ordinalizing or censoring the data. We examined the empirical Type I error rates and compared Bayesian information criterion (BIC) values. In general, non-normality in the putative moderator had little impact on the Type I error rates or BIC comparisons. In contrast, non-normality in the outcome was often mistaken for or masked GxM, especially when the outcome data were censored.
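    The procedure described above — simulate replicates under the null, distort the outcome's distribution, and tabulate how often the test rejects — is the generic recipe for an empirical Type I error study. A hedged miniature of that recipe follows, using a plain two-sample t-test on censored data rather than the biometric twin models of the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def empirical_type1_error(n_reps=2000, n=200, alpha=0.05, censor_at=1.0):
    """Simulate a null effect, censor the outcome, and count how often a
    two-sample t-test rejects at level alpha."""
    rejections = 0
    for _ in range(n_reps):
        # both groups share the same (censored) null distribution
        x = np.minimum(rng.normal(size=n), censor_at)
        y = np.minimum(rng.normal(size=n), censor_at)
        if stats.ttest_ind(x, y).pvalue < alpha:
            rejections += 1
    return rejections / n_reps

rate = empirical_type1_error()  # should be close to alpha = 0.05
```

    In this simple setting censoring leaves the test approximately level because both groups are distorted identically; the paper's point is that in structural biometric models the same distortion can masquerade as a GxM effect.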

  9. Technical note: Evaluation of the simultaneous measurements of mesospheric OH, HO2, and O3 under a photochemical equilibrium assumption - a statistical approach

    Science.gov (United States)

    Kulikov, Mikhail Y.; Nechaev, Anton A.; Belikovich, Mikhail V.; Ermakova, Tatiana S.; Feigin, Alexander M.

    2018-05-01

    This Technical Note presents a statistical approach to evaluating simultaneous measurements of several atmospheric components under the assumption of photochemical equilibrium. We consider simultaneous measurements of OH, HO2, and O3 at mesospheric altitudes as a specific example, with their daytime photochemical equilibrium as the evaluating relationship. A simplified algebraic equation relating local concentrations of these components in the 50-100 km altitude range has been derived. The parameters of the equation are temperature, neutral density, local zenith angle, and the rates of eight reactions. We have performed a one-year simulation of the mesosphere and lower thermosphere using a 3-D chemical-transport model. The simulation shows that the discrepancy between the calculated evolution of the components and the equilibrium value given by the equation does not exceed 3-4 % across the full range of altitudes, independent of season or latitude. We have developed a statistical Bayesian technique for evaluating simultaneous measurements of OH, HO2, and O3 based on the equilibrium equation, taking the measurement error into account. The first results of applying the technique to MLS/Aura (Microwave Limb Sounder) data are presented in this Technical Note. It has been found that the satellite data regularly show a lower altitude for the mesospheric maximum of HO2. This has also been confirmed by model HO2 distributions and by comparison with offline retrieval of HO2 from daily zonal mean MLS radiances.

  10. Bioclim Deliverable D6b: application of statistical down-scaling within the BIOCLIM hierarchical strategy: methods, data requirements and underlying assumptions

    International Nuclear Information System (INIS)

    2004-01-01

    -study regions were identified, together with the additional issues which arise in applying these techniques to output from the BIOCLIM simulations. This preliminary work is described in this BIOCLIM technical note. It provides an overview of statistical down-scaling methods, together with their underlying assumptions and advantages/disadvantages. Specific issues relating to their application within the BIOCLIM context (i.e., application to the IPSL C M4 D snapshot simulations) are identified, for example, the stationarity issue. The predictor and predictand data sets that would be required to implement these methods within the BIOCLIM hierarchical strategy are also outlined, together with the methodological steps involved. Implementation of these techniques was delayed in order to give priority to the application of the rule-based down-scaling method developed in WP3 to WP2 EMIC output (see Deliverable D8a). This task was not originally planned, but has allowed more comprehensive comparison and evaluation of the BIOCLIM scenarios and down-scaling methods to be undertaken

  11. A Scalable Method for Regioselective 3-Acylation of 2-Substituted Indoles under Basic Conditions

    DEFF Research Database (Denmark)

    Johansson, Karl Henrik; Urruticoechea, Andoni; Larsen, Inna

    2015-01-01

    Privileged structures such as 2-arylindoles are recurrent molecular scaffolds in bioactive molecules. We here present an operationally simple, high-yielding and scalable method for regioselective 3-acylation of 2-substituted indoles under basic conditions using functionalized acid chlorides. ...... The method shows good tolerance to both electron-withdrawing and electron-donating substituents on the indole scaffold and gives ready access to a variety of functionalized 3-acylindole building blocks suited for further derivatization....

  12. Application of random survival forests in understanding the determinants of under-five child mortality in Uganda in the presence of covariates that satisfy the proportional and non-proportional hazards assumption.

    Science.gov (United States)

    Nasejje, Justine B; Mwambi, Henry

    2017-09-07

    Uganda, like many other Sub-Saharan African countries, has a high under-five child mortality rate. To inform policy on intervention strategies, sound statistical methods are required to critically identify factors strongly associated with under-five child mortality rates. The Cox proportional hazards model has been a common choice for analysing such data, taking age as the time-to-event variable. However, due to its restrictive proportional hazards (PH) assumption, covariates of interest which do not satisfy the assumption are often excluded from the analysis to avoid mis-specifying the model, since using covariates that clearly violate the assumption would yield invalid results. Survival trees and random survival forests are becoming increasingly popular for analysing survival data, particularly large survey data, and could be attractive alternatives to models with the restrictive PH assumption. In this article, we adopt random survival forests, which have not previously been used to study the factors affecting under-five child mortality rates in Uganda, using Demographic and Health Survey data. The first part of the analysis is based on the classical Cox PH model and the second part on random survival forests in the presence of covariates that do not necessarily satisfy the PH assumption. Random survival forests and the Cox proportional hazards model agree that the sex of the household head, the sex of the child, and the number of births in the past year are strongly associated with under-five child mortality in Uganda, given that all three covariates satisfy the PH assumption. Random survival forests further demonstrated that covariates originally excluded from the earlier analysis due to violation of the PH assumption were important in explaining under-five child mortality rates.
These covariates include the number of children under the
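Comparisons between a Cox PH model and a random survival forest, as in the study above, are typically scored with Harrell's concordance index. A minimal pure-Python sketch for right-censored data follows; the helper and the toy numbers are illustrative assumptions, not the study's data:

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C-index: the fraction of comparable pairs in which the
    subject who fails earlier has the higher predicted risk.
    events[i] is 1 if the failure time is observed, 0 if censored."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        if not events[i]:
            continue  # a censored subject cannot be the earlier failure
        for j in range(n):
            if times[i] < times[j]:  # pair (i, j) is comparable
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5  # ties count as half-concordant
    return concordant / comparable

# Toy example: risk scores perfectly ordered against survival times.
times = [1.0, 2.0, 3.0, 4.0]
events = [1, 1, 0, 1]          # the third subject is censored
risks = [4.0, 3.0, 2.0, 1.0]
print(concordance_index(times, events, risks))  # 1.0
```

A C-index of 0.5 corresponds to random predictions and 1.0 to perfect ranking; it is the usual common yardstick when one model (here, the random survival forest) uses covariates the other must exclude.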

  13. Adult Learning Assumptions

    Science.gov (United States)

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to examine Knowles' theory of andragogy and his six assumptions of how adults learn while providing evidence to support two of his assumptions based on the theory of andragogy. As no single theory explains how adults learn, it can best be assumed that adults learn through the accumulation of formal and informal…

  14. Basic Substances under EU Pesticide Regulation: An Opportunity for Organic Production?

    Directory of Open Access Journals (Sweden)

    Patrice A. Marchand

    2017-02-01

Some of the active substances allowed in organic production are now approved as basic substances under the EU plant protection products regulation. Previously, all active substances permitted in organic farming were approved as conventional plant protection products. In accordance with the criteria of Article 23 of EU Regulation (EC) No 1107/2009, basic substances are granted without maximum residue limits and have a good prospect of being included in Annex II of the organic farming Regulation (EC) No 889/2008. In fact, most of them are already permitted in organic farming. At this stage, it seems desirable to organize applications in order to avoid duplication and to clarify strategy across Europe. This organization should be planned so as to identify the corresponding knowledge and data from field experiments, and to frame the most crucial issues related to organic production. Work of this nature was initially supported by IFOAM-EU for lecithin, calcium hydroxide and Quassia extract. The Institut Technique de l'Agriculture Biologique (ITAB) was previously engaged in a large-scale approval plan motivated by the continuous demand for the regularization of compounds/substances already in use, and has a mandate for testing and approving new compatible substances. Thus, horsetail extract (Equisetum arvense) was the first approved basic substance, and ITAB has obtained 11 of the 15 basic substances approved at the EU level.

  15. Occupational radiation exposure in international recommendations on radiation protection: Basic standards under review

    International Nuclear Information System (INIS)

    Kraus, W.

    1996-01-01

ICRP Publication 60 contains a number of new recommendations on the radiological protection of occupationally exposed persons. These recommendations have been incorporated to a very large extent in the BSS, the International Basic Safety Standards for Protection against Ionizing Radiation and for the Safety of Radiation Sources, a publication elaborated by the IAEA in cooperation with many other international organisations, and in the Euratom Basic Safety Standards (EUR), to be published soon. However, there are considerable discrepancies between some aspects of the three publications. The ICRP has set up a task group to define four general principles of occupational radiation protection, and a safety guide is in preparation under the responsibility of the IAEA. "StrahlenschutzPraxis" will deal with this subject in greater detail after the publication of these two important international documents. The present article discusses some essential aspects of the recommendations published so far. (orig.) [de

  16. Multiverse Assumptions and Philosophy

    Directory of Open Access Journals (Sweden)

    James R. Johnson

    2018-02-01

Multiverses are predictions based on theories. Focusing on each theory's assumptions is key to evaluating a proposed multiverse. Although accepted theories of particle physics and cosmology contain non-intuitive features, multiverse theories entertain a host of "strange" assumptions classified as metaphysical (outside objective experience, concerned with the fundamental nature of reality, ideas that cannot be proven right or wrong): topics such as infinity, duplicate yous, hypothetical fields, more than three space dimensions, Hilbert space, advanced civilizations, and reality established by mathematical relationships. It is easy to confuse multiverse proposals because many divergent models exist. This overview defines the characteristics of eleven popular multiverse proposals. The characteristics compared are: initial conditions, values of constants, laws of nature, number of space dimensions, number of universes, and fine-tuning explanations. Future scientific experiments may validate selected assumptions; but until they do, proposals by philosophers may be as valid as theoretical scientific theories.

  17. Sensitivity Analysis Without Assumptions.

    Science.gov (United States)

    Ding, Peng; VanderWeele, Tyler J

    2016-05-01

Unmeasured confounding may undermine the validity of causal inference with observational studies. Sensitivity analysis provides an attractive way to partially circumvent this issue by assessing the potential influence of unmeasured confounding on causal conclusions. However, previous sensitivity analysis approaches often make strong and untestable assumptions, such as having an unmeasured confounder that is binary, having no interaction between the effects of the exposure and the confounder on the outcome, or having only one unmeasured confounder. Without imposing any assumptions on the unmeasured confounder or confounders, we derive a bounding factor and a sharp inequality such that the sensitivity analysis parameters must satisfy the inequality if an unmeasured confounder is to explain away the observed effect estimate or reduce it to a particular level. Our approach is easy to implement and involves only two sensitivity parameters. Surprisingly, our bounding factor, which makes no simplifying assumptions, is no more conservative than a number of previous sensitivity analysis techniques that do make assumptions. Our new bounding factor implies not only the traditional Cornfield conditions, which both the relative risk of the exposure on the confounder and that of the confounder on the outcome must satisfy, but also a high threshold that the maximum of these relative risks must satisfy. Furthermore, this new bounding factor can be viewed as a measure of the strength of confounding between the exposure and the outcome induced by a confounder.
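The bounding factor described above has a simple closed form, BF = (RR_EU x RR_UD) / (RR_EU + RR_UD - 1), where RR_EU is the exposure-confounder relative risk and RR_UD the confounder-outcome relative risk. A minimal sketch of how it might be applied, with hypothetical numbers (the function names are illustrative, not from the paper):

```python
import math

def bounding_factor(rr_eu: float, rr_ud: float) -> float:
    """Joint bounding factor for an unmeasured confounder with
    exposure-confounder relative risk rr_eu and confounder-outcome
    relative risk rr_ud (both >= 1)."""
    return (rr_eu * rr_ud) / (rr_eu + rr_ud - 1.0)

def adjusted_lower_bound(observed_rr: float, rr_eu: float, rr_ud: float) -> float:
    """Smallest true relative risk consistent with the observed estimate,
    given confounding no stronger than (rr_eu, rr_ud)."""
    return observed_rr / bounding_factor(rr_eu, rr_ud)

def e_value(observed_rr: float) -> float:
    """Strength of confounding (on both sensitivity parameters at once)
    needed to fully explain away an observed relative risk > 1."""
    return observed_rr + math.sqrt(observed_rr * (observed_rr - 1.0))

# A confounder doubling both risks (RR_EU = RR_UD = 2) can shrink an
# observed relative risk by at most a factor of 4/3:
print(bounding_factor(2.0, 2.0))            # 1.333...
print(adjusted_lower_bound(2.0, 2.0, 2.0))  # 1.5
print(e_value(4.0 / 3.0))                   # 2.0
```

The last line illustrates the inequality in reverse: to explain away a relative risk of 4/3 entirely, both sensitivity parameters would have to reach 2.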

  18. Water oxidation catalysis with nonheme iron complexes under acidic and basic conditions: homogeneous or heterogeneous?

    Science.gov (United States)

    Hong, Dachao; Mandal, Sukanta; Yamada, Yusuke; Lee, Yong-Min; Nam, Wonwoo; Llobet, Antoni; Fukuzumi, Shunichi

    2013-08-19

    Thermal water oxidation by cerium(IV) ammonium nitrate (CAN) was catalyzed by nonheme iron complexes, such as Fe(BQEN)(OTf)2 (1) and Fe(BQCN)(OTf)2 (2) (BQEN = N,N'-dimethyl-N,N'-bis(8-quinolyl)ethane-1,2-diamine, BQCN = N,N'-dimethyl-N,N'-bis(8-quinolyl)cyclohexanediamine, OTf = CF3SO3(-)) in a nonbuffered aqueous solution; turnover numbers of 80 ± 10 and 20 ± 5 were obtained in the O2 evolution reaction by 1 and 2, respectively. The ligand dissociation of the iron complexes was observed under acidic conditions, and the dissociated ligands were oxidized by CAN to yield CO2. We also observed that 1 was converted to an iron(IV)-oxo complex during the water oxidation in competition with the ligand oxidation. In addition, oxygen exchange between the iron(IV)-oxo complex and H2(18)O was found to occur at a much faster rate than the oxygen evolution. These results indicate that the iron complexes act as the true homogeneous catalyst for water oxidation by CAN at low pHs. In contrast, light-driven water oxidation using [Ru(bpy)3](2+) (bpy = 2,2'-bipyridine) as a photosensitizer and S2O8(2-) as a sacrificial electron acceptor was catalyzed by iron hydroxide nanoparticles derived from the iron complexes under basic conditions as the result of the ligand dissociation. In a buffer solution (initial pH 9.0) formation of the iron hydroxide nanoparticles with a size of around 100 nm at the end of the reaction was monitored by dynamic light scattering (DLS) in situ and characterized by X-ray photoelectron spectra (XPS) and transmission electron microscope (TEM) measurements. We thus conclude that the water oxidation by CAN was catalyzed by short-lived homogeneous iron complexes under acidic conditions, whereas iron hydroxide nanoparticles derived from iron complexes act as a heterogeneous catalyst in the light-driven water oxidation reaction under basic conditions.

  19. Basic experimental study on the backfilling material under saline seawater condition

    International Nuclear Information System (INIS)

    Kikuchi, Hirohito; Tanai, Kenji; Sugita, Yutaka

    2003-11-01

In the geological disposal of high-level radioactive waste, closure of the repository involves filling clearances with backfilling material to preserve the barrier performance of the engineered barrier system. The required performances of the backfilling material are clearance filling, low permeability, swelling pressure and stiffness. The expected behaviours of the backfilled tunnel are very complex, including a decrease in the cross-sectional area of the tunnel due to creep displacement and a decrease in bentonite performance due to alteration of the concrete lining. Ideally, assessment of the clearance-filling performance of the backfilled tunnel would consider the coupled behaviours described above. However, there are not enough data to explain the expected behaviours, and the mechanisms of the coupled behaviours are not yet clarified. Therefore, the clearance-filling performance of the backfilling material was addressed first. In this study, clearance filling was tested using a clearance that considers only the decrease in volume of the concrete lining due to alteration of the concrete. A basic examination of the backfilling material was performed, focusing on the feasibility of the backfilling material described in the H12 report and on the bentonite/sand mixture adequate to obtain conservative clearance-filling performance. The results showed that, under test conditions in which 30% of the volume of the concrete lining is lost to alteration and becomes clearance between the backfilling material and the lining, the specification (bentonite/sand mixture) of the backfilling material described in the H12 report almost filled the clearance in distilled water. In saline seawater, however, 50% or more bentonite was required to fill the clearance. Since this examination fixed the clearance, water-stopping performance will be examined in the next phase. Through the saline seawater examination, the basic clearance

  20. Sorption of vanillin on highly basic anion exchanger under static conditions

    Science.gov (United States)

    Sholokhova, A. Yu.; Eliseeva, T. V.; Voronyuk, I. V.

    2017-11-01

The kinetics of the sorption of vanillin by a granulated anion exchanger is studied under static conditions. A comparison of the kinetic curves of the uptake of the hydroxybenzaldehyde by gel and macroporous anion exchangers shows that the macroporous sorbent has better kinetic characteristics. The effect of temperature on the capacity of the anion exchanger and on the time needed to establish sorption equilibrium is found, and the activation energy of vanillin uptake is determined. By studying the effect of experimental factors on the rate of sorption and using a formal kinetics approach, it is established that, in the investigated range of concentrations, the limiting stage of the uptake of vanillin by an anion exchanger with quaternary ammonium functional groups is external diffusion. Vanillin sorption by a highly basic anion exchanger in hydroxyl form is characterized by polymolecular uptake best described by a BET isotherm; the uptake by the chloride form, in contrast, is monomolecular and can be described by a Freundlich isotherm. Structural changes in the anion exchanger that sorbed the hydroxybenzaldehyde are identified via FTIR spectroscopy.

  1. Kinetics study of the hydrochlorothiazide-lactose liquid-state interaction using the conventional isothermal Arrhenius method under basic and neutral conditions

    Directory of Open Access Journals (Sweden)

    Faranak Ghaderi

The Maillard reaction of hydrochlorothiazide (HCTZ) and lactose has previously been demonstrated in pharmaceutical formulations. In this study, the activation energy of the hydrochlorothiazide-lactose interaction in the liquid state was ascertained under basic and neutral conditions. A conventional isothermal High Performance Liquid Chromatography (HPLC) technique was employed to determine the kinetic parameters using the Arrhenius method. The activation energies obtained were 82.43 and 100.28 kJ/mol under basic and neutral conditions, respectively. Consequently, it can be inferred that the Maillard reaction is significantly affected by pH, which can be used as a control factor whenever the reaction potentially occurs.
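The Arrhenius treatment used above fits ln k against 1/T; with rate constants at just two temperatures, the activation energy follows from the two-point form Ea = R ln(k2/k1) / (1/T1 - 1/T2). A sketch with synthetic rates generated from a known Ea (the temperatures and pre-exponential factor are hypothetical, not the paper's data):

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def activation_energy(k1: float, t1: float, k2: float, t2: float) -> float:
    """Two-point Arrhenius estimate from rate constants k1, k2 measured
    at absolute temperatures t1, t2 [K]."""
    return R * math.log(k2 / k1) / (1.0 / t1 - 1.0 / t2)

# Synthetic check: generate rates from a known Ea and recover it.
ea_true = 82_430.0  # J/mol, i.e. the 82.43 kJ/mol reported for basic conditions

def k(temp_k: float, a: float = 1.0e9) -> float:
    """Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
    return a * math.exp(-ea_true / (R * temp_k))

ea_est = activation_energy(k(323.15), 323.15, k(343.15), 343.15)
print(round(ea_est / 1000, 2))  # 82.43 (kJ/mol)
```

With more than two isothermal runs, the same quantity is the slope of an ordinary least-squares fit of ln k versus 1/T, multiplied by -R.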

  2. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests! Explaining Different Arrival Times. [Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones] Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics. Intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source. Delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect. Special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong. This, too, would cause photon velocities to be energy-dependent. Delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect. If we now turn this problem around, then by measuring the arrival-time delay between photons of different energies from various astrophysical sources (the further away, the better) we can provide constraints on these

  3. Basic substances under EC 1107/2009 phytochemical regulation: experience with non-biocide and food products as biorationals

    Directory of Open Access Journals (Sweden)

    Marchand Patrice A.

    2016-07-01

Basic Substances are a newly effective category of Plant Protection Product under EC Regulation No 1107/2009. The first approved application, Equisetum arvense L., opened Part C of Implementing Regulation (EU) No 540/2011, which lists the approved basic substances. Although E. arvense was described as a fungicide extract, subsequent applications such as chitosan concerned non-biocide molecules. Consequently, plant protection data were collected from research on alternative or traditional crop protection methods. They are notably issued or derived from foodstuffs (plants, plant by-products, plant-derived products) and from substances and derived substances of animal origin. Applications submitted by our Institute are currently under evaluation at different stages of the approval process, or already approved. Remarkably, this Basic Substance category under the EU pesticide Regulation turns out to be well suited to these non-biocidal plant protection products. In fact, the components described as the "active substance" in most of the current applications are food products such as sugars and lecithin. Basic Substance applications are therefore a straightforward way of gaining approval for these foodstuffs. Here we describe the approval context and detail the agricultural uses of these food products as Biological Control Agents (BCAs) or biorationals for crop protection. All deposited or approved Basic Substance applications (BSAs) provide evidence that non-biocide and food products, via physical barrier or lure effects, may be effective plant protection products with an acceptably low profile of concern for public and agricultural safety.

  4. The Behaviour of Laboratory Soil Electrical Resistivity Value under Basic Soil Properties Influences

    International Nuclear Information System (INIS)

    Hazreek, Z A M; Aziman, M; Azhar, A T S; Chitral, W D; Fauziah, A; Rosli, S

    2015-01-01

The electrical resistivity method (ERM) is a popular indirect geophysical tool adopted in engineering, environmental and archaeological studies. In the past, electrical resistivity values (ERV) were the subject of long discussion and debate among the parties concerned, such as engineers, geophysicists and geologists, because of the lack of quantitative clarification and evidence. Most results produced in the past were justified in qualitative terms that certain parties find difficult to accept. In order to narrow the knowledge gap between those parties, this study performed a laboratory soil-box resistivity experiment supported by additional basic geotechnical tests, namely the particle size distribution test (d), moisture content test (w), density test (ρ_bulk) and Atterberg limit tests (LL, PL and PI). The tests were performed to establish a series of electrical resistivity values at different water contents for a Clayey SILT and a Silty SAND. It was found that the ERV of the Silty SAND (600 - 7300 Ωm) was higher than that of the Clayey SILT (13 - 7700 Ωm) due to the different basic soil property values obtained from the geotechnical tests. This study successfully demonstrated that the fluctuation of ERV is greatly influenced by variations in the soil physical properties (d, w, ρ_bulk, LL, PL and PI). Hence, ERV interpretation becomes increasingly meaningful, since it can be supported by parameters generated by direct laboratory tests.

  5. Basic characteristic test of buffer/backfill material under Horonobe groundwater condition

    International Nuclear Information System (INIS)

    Kikuchi, Hirohito; Tanai, Kenji

    2005-02-01

In the second progress report (H12) on research and development for the geological disposal of high-level radioactive waste (HLW) in Japan, the Japan Nuclear Cycle Development Institute (JNC) extended the database of basic properties of compacted bentonite, which were mainly obtained using distilled water as the test fluid. This report presents the influence of Horonobe groundwater, a type of saline groundwater, on the basic properties of buffer and backfill material. The groundwater was sampled at GL-300 m or deeper using borehole HDB-6 of the underground laboratory at the Horonobe site. In addition, basic properties were also obtained using distilled water, synthetic seawater and NaCl solution. The experimental results are as follows. 1) The swelling, hydraulic and mechanical characteristics of the buffer and backfill material decrease under the influence of saline water. The relationship between effective clay density and swelling stress is described by σ = exp(2.5786ρ_b³ - 12.238ρ_b² + 21.818ρ_b - 14.035), where σ is the swelling stress [MPa] and ρ_b is the effective clay density [Mg/m³]. The relationship between effective clay density and intrinsic permeability is described by κ = exp(-41.466 + 4.316ρ_b - 4.069ρ_b²), where κ is the intrinsic permeability [m²]. The relationship between effective clay density and unconfined compressive strength is described by qu = 1.4 x 10⁻⁴ exp(5.637ρ_b), where qu is the unconfined compressive strength [MPa]. 2) Saline water does not influence the thermal characteristics of the buffer material. The thermal conductivity and specific heat are derived using the relationships obtained so far. (author)
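The three fitted correlations reported above can be evaluated directly; a small sketch follows (the density value is an arbitrary example within the usual compacted-bentonite range, not a value from the report):

```python
import math

def swelling_stress(rho_b: float) -> float:
    """Swelling stress sigma [MPa] vs effective clay density rho_b [Mg/m^3]."""
    return math.exp(2.5786 * rho_b**3 - 12.238 * rho_b**2 + 21.818 * rho_b - 14.035)

def intrinsic_permeability(rho_b: float) -> float:
    """Intrinsic permeability kappa [m^2] vs effective clay density [Mg/m^3]."""
    return math.exp(-41.466 + 4.316 * rho_b - 4.069 * rho_b**2)

def unconfined_strength(rho_b: float) -> float:
    """Unconfined compressive strength qu [MPa] vs effective clay density."""
    return 1.4e-4 * math.exp(5.637 * rho_b)

rho = 1.6  # Mg/m^3, example effective clay density
print(f"sigma = {swelling_stress(rho):.3f} MPa")
print(f"kappa = {intrinsic_permeability(rho):.3e} m^2")
print(f"qu    = {unconfined_strength(rho):.3f} MPa")
```

As the fits imply, over the densities of practical interest, a denser clay swells more strongly, is less permeable and is stronger, which is the expected qualitative behaviour of compacted bentonite.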

  6. Modeling basic creep in concrete at early-age under compressive and tensile loading

    Energy Technology Data Exchange (ETDEWEB)

    Hilaire, Adrien, E-mail: adrien.hilaire@ens-cachan.fr [ENS Cachan/CNRS UMR8535/UPMC/PRES UniverSud Paris, Cachan (France); Benboudjema, Farid; Darquennes, Aveline; Berthaud, Yves [ENS Cachan/CNRS UMR8535/UPMC/PRES UniverSud Paris, Cachan (France); Nahas, Georges [ENS Cachan/CNRS UMR8535/UPMC/PRES UniverSud Paris, Cachan (France); Institut de radioprotection et de sureté nucléaire, Fontenay-aux-Roses (France)

    2014-04-01

A numerical model has been developed to predict early-age cracking in massive concrete structures, especially concrete nuclear containment vessels. The major phenomena are included: hydration, heat diffusion, autogenous and thermal shrinkage, creep and cracking. Since the structures studied are massive, drying is not taken into account. Such modeling requires the identification of several material parameters. Literature data are used to validate the basic creep model. A massive wall, representative of a concrete nuclear containment, is simulated; the predicted cracking is consistent with observations and is found to be highly sensitive to the creep phenomenon.

  7. Effects of turbulence on mixed-phase deep convective clouds under different basic-state winds and aerosol concentrations

    Science.gov (United States)

    Lee, Hyunho; Baik, Jong-Jin; Han, Ji-Young

    2014-12-01

The effects of turbulence-induced collision enhancement (TICE) on mixed-phase deep convective clouds are numerically investigated using a 2-D cloud model with bin microphysics, for uniform and sheared basic-state wind profiles and different aerosol concentrations. Graupel particles account for most of the cloud mass in all simulation cases. In the uniform basic-state wind cases, graupel particles of moderate size account for some of the total graupel mass in the cases with TICE, whereas large graupel particles account for almost all of the total graupel mass in the cases without TICE. This is because the growth of ice crystals into small graupel particles is enhanced by TICE. The changes in the size distributions of graupel particles due to TICE result in a decrease in the mass-averaged mean terminal velocity of graupel particles. Therefore, the downward flux of graupel mass, and thus the melting of graupel particles, is reduced by TICE, leading to a decrease in the amount of surface precipitation. Moreover, at low aerosol concentration, TICE increases the sublimation of ice particles, which plays a partial role in reducing the amount of surface precipitation. The effects of TICE are less pronounced in the sheared basic-state wind cases than in the uniform cases because the number of ice crystals is much smaller in the sheared cases. Thus, the size distributions of graupel particles in the cases with and without TICE show little difference.

  8. THE TRANSFORMATIONAL PROCESSES INVOLVING MOTOR SKILLS THAT OCCUR UNDER THE INFLUENCE OF BASIC PRELIMINARY TRAINING IN YOUNG HANDBALL PLAYERS

    Directory of Open Access Journals (Sweden)

    Markovic Sasa

    2011-06-01

The population from which we drew a sample of 76 subjects consisted of male elementary school students in Kursumlija, aged 12-13, divided into a sub-sample of 38 young handball players who took part in the training sessions of a handball school and a sub-sample of 38 non-athletes who took part only in their regular physical education classes. The aim of the research was to determine the transformational processes involving motor skills that occur under the influence of basic preliminary training in young handball players. The subject matter of the study was to examine whether a statistically significant increase in the level of motor skills would occur under the influence of physical exercise as part of basic preliminary training, in the final state as compared to the initial state. Six motor tests defining the dimensions of explosive and repetitive strength were used. The results indicate that significant transformational processes involving the motor skills of young handball players occurred in the final measurement as compared to the initial one, under the influence of basic preliminary training.

  9. Basic concepts and assumptions behind the ICRP recommendations

    International Nuclear Information System (INIS)

    Lindell, B.

    1981-03-01

    The paper gives a review of the current radiation protection recommendations by the International Commission on Radiological Protection (ICRP). It discusses concepts like stochastic effects, radiation detriments, collective dose, dose equivalent and dose limits. (G.B.)

  10. The Influence of Basic Physical Properties of Soil on its Electrical Resistivity Value under Loose and Dense Condition

    Science.gov (United States)

    Abidin, M. H. Z.; Ahmad, F.; Wijeyesekera, D. C.; Saad, R.

    2014-04-01

The electrical resistivity technique has become a popular alternative tool in subsurface characterization. In the past, several interpretations of electrical resistivity results could not be delivered with strong justification due to a lack of appreciation of soil mechanics. Traditionally, interpreters come to differing conclusions, commonly from a qualitative point of view, creating uncertainty about the reliability of the results. Most engineers want the techniques applied in their projects to provide clear justification with strong, reliable and meaningful results. In order to reduce this problem, this study presents the influence of the basic physical properties of soil on its electrical resistivity value under loose and dense conditions. Two different conditions of a soil embankment model were tested with the electrical resistivity test and basic geotechnical tests. It was found that the electrical resistivity value (ERV, ρ) was highly influenced by variations in the basic physical properties (BPP) of the soil, with particular reference to moisture content (w), density (ρ_bulk/dry), void ratio (e), porosity (η) and particle grain fraction (d). Strong relationships between ERV and BPP are clearly present, such as ρ ∝ 1/w, ρ ∝ 1/ρ_bulk/dry, ρ ∝ e and ρ ∝ η. This study therefore contributes a means of ERV data interpretation using BPP, in order to reduce the ambiguity of ERV results and interpretations discussed among the persons concerned, such as geophysicists, engineers and geologists, who apply these electrical resistivity techniques in subsurface profile assessment.
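A proportionality such as ρ ∝ 1/w can be checked by fitting a power law in log-log space, where an exact inverse relationship has slope -1. A minimal sketch with synthetic moisture/resistivity pairs (hypothetical numbers, not the paper's measurements):

```python
import math

# Hypothetical (moisture content %, resistivity in ohm*m) pairs that
# follow rho = k / w exactly, i.e. a log-log slope of -1.
data = [(5.0, 2000.0), (10.0, 1000.0), (20.0, 500.0), (40.0, 250.0)]

# Ordinary least-squares slope of log(rho) versus log(w).
xs = [math.log(w) for w, _ in data]
ys = [math.log(r) for _, r in data]
n = len(data)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
print(round(slope, 3))  # -1.0 for an exact inverse relationship
```

On real laboratory data the fitted exponent will deviate from -1; its sign and magnitude quantify how strongly each BPP drives the ERV.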

  11. The influence of basic physical properties of soil on its electrical resistivity value under loose and dense condition

    International Nuclear Information System (INIS)

    Abidin, M H Z; Ahmad, F; Wijeyesekera, D C; Saad, R

    2014-01-01

The electrical resistivity technique has become a popular alternative tool in subsurface characterization. In the past, several interpretations of electrical resistivity results could not be delivered with strong justification due to a lack of appreciation of soil mechanics. Traditionally, interpreters come to differing conclusions, commonly from a qualitative point of view, creating uncertainty about the reliability of the results. Most engineers want the techniques applied in their projects to provide clear justification with strong, reliable and meaningful results. In order to reduce this problem, this study presents the influence of the basic physical properties of soil on its electrical resistivity value under loose and dense conditions. Two different conditions of a soil embankment model were tested with the electrical resistivity test and basic geotechnical tests. It was found that the electrical resistivity value (ERV, ρ) was highly influenced by variations in the basic physical properties (BPP) of the soil, with particular reference to moisture content (w), density (ρ_bulk/dry), void ratio (e), porosity (η) and particle grain fraction (d). Strong relationships between ERV and BPP are clearly present, such as ρ ∝ 1/w, ρ ∝ 1/ρ_bulk/dry, ρ ∝ e and ρ ∝ η. This study therefore contributes a means of ERV data interpretation using BPP, in order to reduce the ambiguity of ERV results and interpretations discussed among the persons concerned, such as geophysicists, engineers and geologists, who apply these electrical resistivity techniques in subsurface profile assessment.

  12. 15 CFR Supplement No. 2 to Part 740 - Items That May Be Donated To Meet Basic Human Needs Under the Humanitarian License Exception

    Science.gov (United States)

    2010-01-01

    ... Basic Human Needs Under the Humanitarian License Exception No. Supplement No. 2 to Part 740 Commerce and... Supplement No. 2 to Part 740—Items That May Be Donated To Meet Basic Human Needs Under the Humanitarian... Medicines and Supplies (c) Clothes and Household Goods Bedding Clothes Cooking Utensils Fabric Personal...

  13. Extracurricular Business Planning Competitions: Challenging the Assumptions

    Science.gov (United States)

    Watson, Kayleigh; McGowan, Pauric; Smith, Paul

    2014-01-01

    Business planning competitions [BPCs] are a commonly offered yet under-examined extracurricular activity. Given the extent of sceptical comment about business planning, this paper offers what the authors believe is a much-needed critical discussion of the assumptions that underpin the provision of such competitions. In doing so it is suggested…

  14. Mesoporous Structure Control of Silica in Room-Temperature Synthesis under Basic Conditions

    Directory of Open Access Journals (Sweden)

    Jeong Wook Seo

    2015-01-01

Various types of mesoporous silica, such as continuous cubic-phase MCM-48, hexagonal-phase MCM-41 and layer-phase spherical silica particles, have been synthesized at room temperature using cetyltrimethylammonium bromide as a surfactant, ethanol as a cosurfactant, tetraethyl orthosilicate as a silica precursor, and ammonia as a condensation agent. Special care must be taken both in filtering the resultant solid products and in the drying process, during which further condensation of the silica is induced. As the surfactant and cosurfactant concentrations in the reaction mixture increased and the NH3 concentration decreased, under the given conditions, continuous cubic MCM-48 and layered silica became the dominant phases. A cooperative synthesis mechanism, in which both the surfactant and the silica are involved in the formation of the mesoporous structures, provided a good explanation of the experimental results.

  15. A Basic Study on the Ejection of ICI Nozzle under Severe Accidents

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jong Rae; Bae, Ji Hoon; Bang, Kwang Hyun [Korea Maritime and Ocean University, Busan (Korea, Republic of); Park, Jong Woong [Dongguk University, Gyeongju (Korea, Republic of)

    2016-05-15

    Nozzle ejection must be prevented because molten core material escaping the vessel would affect the environment. The purpose of this study is to carry out a thermo-mechanical analysis of debris relocation under severe accidents and to predict nozzle ejection, taking into account the contact between the nozzle and the lower head and the support provided by pipe cables. The analysis of severe accident progression showed a melting reaction between the nozzle and the lower head. In this situation, a non-uniform contact region between the nozzle hole of the lower head and the nozzle outer surface can be expected, delaying ejection of the nozzles. After melting, the average remaining length of the nozzle was 120 mm and the maximum vertical displacement of the lower nozzle near the weld was 3.3 mm, so no nozzle ejection would occur in this model, because the cable supports restrain the vertical displacement of the nozzle.

  16. Basic concept on the responses of structural members and structures under impact or impulsive loadings

    International Nuclear Information System (INIS)

    Takeda, J.I.; Tachikawa, H.; Fujimoto, K.

    1982-01-01

    The responses of structural members and structures subjected to impact or impulsive loadings are generated by the interaction between the acting bodies and the structures, and this interaction is affected by many factors, e.g. the relative masses, sizes, and rigidities of the acting bodies and structures, and especially by their relative velocity. The development of these responses is controlled by the constitutive equations and failure criteria of the constituent materials, the load-sharing (cowork) relationships between the constituent materials, and the existing stress waves. Furthermore, the first two are influenced by rate effects, and all of them change widely with the speed of the impact or impulsive loading. This paper deals with the physical meaning of the responses of structures under impact and impulsive loadings. (orig.) [de

  17. The Role of Intrinsic Motivation and the Satisfaction of Basic Psychological Needs Under Conditions of Severe Resource Scarcity.

    Science.gov (United States)

    van Egmond, Marieke Christina; Navarrete Berges, Andrés; Omarshah, Tariq; Benton, Jennifer

    2017-06-01

    An emerging field of research is beginning to examine the ways in which socioeconomic disparities affect emotional, cognitive, and social processes. In this study, we took a two-step approach to examining the role that resource scarcity plays in the predictive power of intrinsic motivation on school attendance, as well as its influence on the precursors of intrinsic motivation: the psychological needs of relatedness, autonomy, and competence. Results revealed that intrinsic motivation predicts school attendance even under conditions of extreme adversity. The satisfaction of the basic needs is more important for participants who are exposed to severe rather than mild levels of deprivation. Our findings illustrate ecological effects on the mechanism underlying goal-directed behavior. They provide evidence in favor of self-determination theory's depiction of humans as active, growth-oriented organisms and for the potential of psychological interventions to reduce poverty.

  18. Highly effective degradation of selected groups of organic compounds by cavitation based AOPs under basic pH conditions.

    Science.gov (United States)

    Gągol, Michał; Przyjazny, Andrzej; Boczkaj, Grzegorz

    2018-07-01

    Cavitation has become one of the most often applied methods in a number of industrial technologies. In the case of oxidation of organic pollutants occurring in the aqueous medium, cavitation forms the basis of numerous advanced oxidation processes (AOPs). This paper presents the results of investigations on the efficiency of oxidation of the following groups of organic compounds: organosulfur, nitro derivatives of benzene, BTEX, and phenol and its derivatives in a basic model effluent using hydrodynamic and acoustic cavitation combined with external oxidants, i.e., hydrogen peroxide, ozone and peroxone. The studies revealed that the combination of cavitation with additional oxidants allows 100% oxidation of the investigated model compounds. However, individual treatments differed with respect to the rate of degradation. Hydrodynamic cavitation aided by peroxone was found to be the most effective treatment (100% oxidation of all the investigated compounds in 60 min). When using hydrodynamic and acoustic cavitation alone, the effectiveness of oxidation was diversified. Under these conditions, nitro derivatives of benzene and phenol and its derivatives were found to be resistant to oxidation. In addition, hydrodynamic cavitation was found to be more effective in degradation of model compounds than acoustic cavitation. The results of investigations presented in this paper compare favorably with the investigations on degradation of organic contaminants using AOPs under conditions of basic pH published thus far. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Basic study on PWR plant behavior under the condition of severe accident (1)

    International Nuclear Information System (INIS)

    Ozaki, Yoshihiko; Ida, Shohma; Nakamura, Shinya

    2015-01-01

    In this paper, we report results obtained with a PWR plant simulator on plant behavior under a severe accident in which a LOCA occurs but the ECCS fails to inject water into the reactor core. Regarding the relationship between the LOCA break area and the time from LOCA initiation until the fuel temperature begins to rise, the time became shorter as the break area increased; however, for break areas of 1000 cm² or larger, the time was almost constant regardless of area. For a small LOCA of 25 cm², comparative experiments on the RCS natural-circulation cooling effect with the SG open or closed showed a larger cooling effect with the SG open; however, the reactor water level dropped greatly and the time until the fuel temperature began to rise was shortened, so the fuel temperature at the time of water injection into the core became higher. For the large LOCA of 1000 cm², on the other hand, the effect was not observed regardless of whether the SG was open. In addition, core damage was not avoided when injection into the core started 30 minutes after the LOCA; however, the hydrogen concentration in the containment building remained below the lower limit for hydrogen detonation, and the pressure in the containment building remained below the design value. That is, although the core was damaged, the integrity of the containment building was shown to be secured. (author)

  20. Basic study on BWR plant behavior under the condition of severe accident

    International Nuclear Information System (INIS)

    Ozaki, Yoshihiko; Jyohko, Shingo; Dohgo, Hirofumi

    2015-01-01

    In this paper, we report results obtained with a BWR plant simulator on plant behavior under a severe accident in which a LOCA occurs but the ECCS fails to inject water into the reactor core. Simulation experiments were carried out for LOCAs occurring in the main steam piping and in the recirculation piping. Regarding the relationship between the LOCA break area and the time from LOCA initiation until the fuel temperature begins to rise, the effect of RCIC operation was extremely large for break areas of up to 100 cm² for both LOCA types. For a main steam system LOCA, the core water level dropped suddenly for a large LOCA of 2000 cm²; however, if water injection into the core was carried out 30 min after the LOCA, the core suffered little damage. In addition, the H₂ concentration in the containment vessel exceeded neither the H₂ explosion limit nor the detonation limit, and the containment pressure stayed around the design value of 3 kg/cm², so the soundness of the containment vessel was confirmed. For a recirculation system LOCA of 2000 cm², on the other hand, the drop of the core water level was extreme in comparison with the main steam system LOCA, and the fuel assemblies were completely exposed from approximately 100 s after the LOCA until injection at 30 min. The fuel temperature at the time of injection had therefore reached approximately 1900°C. As a result, less than about 10% of the fuel cladding was damaged, and the H₂ concentration in the containment vessel was approximately 9%, which did not exceed the H₂ detonation limit of 13% but exceeded the H₂ explosion limit of 4%. However, the containment vessel internal pressure settled around the design pressure value. In sum, some core damage could not be avoided, but the soundness of the containment vessel, which should take the role of 'confine', was found to be secured. (author)

  1. Basic study on BWR plant behavior under the condition of severe accident (2)

    International Nuclear Information System (INIS)

    Ozaki, Yoshihiko; Ueda, Masataka; Sasaki, Hajime

    2016-01-01

    In this paper, we report results obtained with a BWR plant simulator on plant behavior under two types of severe accidents: a LOCA in which the ECCS fails to inject water into the reactor core, and an SBO accompanied by a reclosure failure of the SRV. The simulation experiments were carried out for LOCAs occurring in the main feed-water piping. Regarding the relationship between the LOCA break area and the time from LOCA initiation until the fuel temperature begins to rise, the effect of RCIC operation was extremely large for small and middle break areas. For a main feed-water system LOCA, the core water level dropped suddenly for a large LOCA of 2000 cm²; however, if water injection into the core was carried out 30 min after the LOCA, the core suffered little damage. In addition, the H₂ concentration in the containment vessel exceeded neither the H₂ explosion limit nor the detonation limit, and the containment pressure stayed around the design value of 3 kg/cm², so the soundness of the containment vessel was confirmed. In the accident of SBO with SRV reclosure failure, on the other hand, the accident progressed rapidly compared with the case of a normally operating SRV, because the SRV maintains the reactor pressure by repeatedly opening and closing in response to the internal pressure and thereby prevents the decrease of the water level inside the reactor core. However, if water injection into the core was carried out 30 min after the SBO, the core suffered little damage and the H₂ concentration in the containment vessel did not exceed the H₂ explosion limit. Furthermore, the SRV reclosure-failure accident showed very good correspondence with the simulation results for a main steam piping LOCA with an area of 180 cm², corresponding to the inlet cross-sectional area of the SRV installed on the piping.

  2. Basic visualization experiments on eutectic reaction of boron carbide and stainless steel under sodium-cooled fast reactor conditions

    International Nuclear Information System (INIS)

    Yamano, Hidemasa; Suzuki, Tohru; Kamiyama, Kenji; Kudo, Isamu

    2016-01-01

    This paper describes basic visualization experiments on the eutectic reaction and relocation of boron carbide (B₄C) and stainless steel (SS) under a high-temperature condition exceeding 1500°C, as well as the importance of such behaviors in molten core during a core disruptive accident in a Generation-IV sodium-cooled fast reactor (750 MWe class) designed in Japan. At first, a reactivity history was calculated using an exact perturbation calculation tool taking into account expected behaviors. This calculation indicated the importance of the relocation behavior of the B₄C-SS eutectic because this behavior has a large uncertainty in the reactivity history. To clarify this behavior, basic experiments were carried out by visualizing the reaction of a B₄C pellet contacted with molten SS in a high-temperature heating furnace. The experiments have shown the eutectic reaction visualization as well as freezing and relocation of the B₄C-SS eutectic in the upper part of the solidified test piece due to the density separation. (author)

  3. Major Assumptions of Mastery Learning.

    Science.gov (United States)

    Anderson, Lorin W.

    Mastery learning can be described as a set of group-based, individualized, teaching and learning strategies based on the premise that virtually all students can and will, in time, learn what the school has to teach. Inherent in this description are assumptions concerning the nature of schools, classroom instruction, and learners. According to the…

  4. Private management of public schools of Basic Education: a new market under the auspices of the State

    Directory of Open Access Journals (Sweden)

    Bruno Gawryszewski

    2017-09-01

    Full Text Available This paper presents the private management of public schools of Basic Education as one of the pillars supporting a private-market agenda in contemporary Brazil, based on the belief that it delivers greater efficiency and quality and thereby ensures equal opportunity for all. The theoretical and methodological approach was an analysis of the current Brazilian situation within the structural crisis of capital, followed by an examination of private management in American schools, known as charter schools, and of initiatives for the education networks in Brazil. We conclude that the private management of public schools can be seen as a resource against the cyclical crises of capital, one which has opened up new market opportunities in the education field under the auspices of the Brazilian state.

  5. Capturing Assumptions while Designing a Verification Model for Embedded Systems

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    A formal proof of system correctness typically holds under a number of assumptions. Leaving them implicit raises the chance of using the system in a context that violates some of them, which in turn may invalidate the correctness proof. The goal of this paper is to show how combining

  6. Lutein, zeaxanthin, and meso-zeaxanthin: The basic and clinical science underlying carotenoid-based nutritional interventions against ocular disease.

    Science.gov (United States)

    Bernstein, Paul S; Li, Binxing; Vachali, Preejith P; Gorusupudi, Aruna; Shyam, Rajalekshmy; Henriksen, Bradley S; Nolan, John M

    2016-01-01

    The human macula uniquely concentrates three carotenoids: lutein, zeaxanthin, and meso-zeaxanthin. Lutein and zeaxanthin must be obtained from dietary sources such as green leafy vegetables and orange and yellow fruits and vegetables, while meso-zeaxanthin is rarely found in diet and is believed to be formed at the macula by metabolic transformations of ingested carotenoids. Epidemiological studies and large-scale clinical trials such as AREDS2 have brought attention to the potential ocular health and functional benefits of these three xanthophyll carotenoids consumed through the diet or supplements, but the basic science and clinical research underlying recommendations for nutritional interventions against age-related macular degeneration and other eye diseases are underappreciated by clinicians and vision researchers alike. In this review article, we first examine the chemistry, biochemistry, biophysics, and physiology of these yellow pigments that are specifically concentrated in the macula lutea through the means of high-affinity binding proteins and specialized transport and metabolic proteins where they play important roles as short-wavelength (blue) light-absorbers and localized, efficient antioxidants in a region at high risk for light-induced oxidative stress. Next, we turn to clinical evidence supporting functional benefits of these carotenoids in normal eyes and for their potential protective actions against ocular disease from infancy to old age. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Measuring Productivity Change without Neoclassical Assumptions: A Conceptual Analysis

    NARCIS (Netherlands)

    B.M. Balk (Bert)

    2008-01-01

    textabstractThe measurement of productivity change (or difference) is usually based on models that make use of strong assumptions such as competitive behaviour and constant returns to scale. This survey discusses the basics of productivity measurement and shows that one can dispense with most if not

  8. Assumptions for the Annual Energy Outlook 1992

    International Nuclear Information System (INIS)

    1992-01-01

    This report serves as an auxiliary document to the Energy Information Administration (EIA) publication Annual Energy Outlook 1992 (AEO) (DOE/EIA-0383(92)), released in January 1992. The AEO forecasts were developed for five alternative cases and consist of energy supply, consumption, and price projections by major fuel and end-use sector, which are published at a national level of aggregation. The purpose of this report is to present important quantitative assumptions, including world oil prices and macroeconomic growth, underlying the AEO forecasts. The report has been prepared in response to external requests, as well as analyst requirements for background information on the AEO and studies based on the AEO forecasts

  9. On testing the missing at random assumption

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    Most approaches to learning from incomplete data are based on the assumption that unobserved values are missing at random (mar). While the mar assumption, as such, is not testable, it can become testable in the context of other distributional assumptions, e.g. the naive Bayes assumption...

  10. Development of a participatory Management approach of the Committee for Basic Education School under the Nongbualamphu Primary Educational Service Area Office 2

    Directory of Open Access Journals (Sweden)

    Jirayu Prommajak

    2016-10-01

    Full Text Available This study aimed to: 1 examine the present state and adverse conditions of administration with the participation of the committee for basic education in schools; and 2 develop a participatory management approach for the Committee for Basic Education schools under the Nongbualamphu Primary Educational Service Area Office 2. Data collection was split into two phases. Phase 1: the sample consisted of 128 members of the committee on basic education in schools under the Nongbualamphu Primary Educational Service Area Office 2, selected by stratified random sampling; the instruments used included a set of rating-scale questionnaires. Phase 2: data came from interviews using a structured questionnaire and a focus group discussion. The basic statistics used for analyzing the collected data were percentage, mean and standard deviation. The results of this study were as follows: 1. Regarding the present state of administration with the participation of the basic education commission in schools under the Nongbualamphu Primary Educational Service Area Office 2, overall participation in management was moderate. Considering the individual aspects, participation in academic administration and in budget management was moderate, while participation in personnel management and in general administration was at a high level. 2. Adverse conditions of administration with the participation of the school board for basic education in schools under the Nongbualamphu Primary Educational Service Area Office 2 were overall at a high level; considering the individual aspects, the school board in basic education desires to participate in the management of all aspects. 3. The participatory management approach developed for the Committee for Basic Education schools under the Nongbualamphu Primary Educational Service Area Office 2 is a developmental process based on the PDCA management principles, in 5 steps.
    Step 1: Creating a common understanding Step

  11. Programa saúde da família no brasil: um enfoque sobre seus pressupostos básicos, operacionalização e vantagens Family health program in brazil: a focus on its basic assumptions, performance and advantages

    Directory of Open Access Journals (Sweden)

    Milena Lopes Santana

    2001-07-01

    Full Text Available From its conception to the present, many analyses have addressed the Family Health Program (FHP) in Brazil. Although still few in number, members of family health units, municipal health secretaries, mayors, Ministry of Health staff, as well as university teaching staff and renowned researchers in public health and related fields, have been willing to discuss and reflect on this strategy. It therefore became pertinent to review the literature on the FHP, organized by topic: a historical retrospective of the period that preceded the FHP; its basic assumptions; its operational strategies: the family as the focus of care, the principle of health surveillance, and the work of the multidisciplinary team; the different implementation models in Brazil; the aspects that facilitate or hinder this implementation; and the advantages and disadvantages of the FHP in the Brazilian health system.

  12. The impact of vascular endothelial growth factor and basic fibroblast growth factor on cardiac fibroblasts grown under altered gravity conditions

    DEFF Research Database (Denmark)

    Ulbrich, Claudia; Leder, Annekatrin; Pietsch, Jessica

    2010-01-01

    Myocardium is very sensitive to gravitational changes. During a spaceflight cardiovascular atrophy paired with rhythm problems and orthostatic intolerance can occur. The aim of this study was to investigate the impact of basic fibroblast growth factor (bFGF) and vascular endothelial growth factor...

  13. 76 FR 56767 - Request for Information Regarding State Flexibility To Establish a Basic Health Program Under the...

    Science.gov (United States)

    2011-09-14

    ... Affordable Care Act specifies that a Basic Health Program will establish a competitive process for entering... entities are involved, or will likely be involved in this planning process? 6. What guidance or information would be helpful to States, plans, and other stakeholders as they begin the planning process? What other...

  14. Towards New Probabilistic Assumptions in Business Intelligence

    Directory of Open Access Journals (Sweden)

    Schumann Andrew

    2015-01-01

    Full Text Available One of the main assumptions of mathematical tools in science is the idea of the measurability and additivity of reality. For exploring the physical universe, additive measures such as mass, force, energy, temperature, etc. are used. Economics and conventional business intelligence try to continue this empiricist tradition, and in statistical and econometric tools they appeal only to the measurable aspects of reality. However, many important variables of economic systems cannot, in principle, be observed or measured additively. These variables can be called symbolic values or symbolic meanings and studied within symbolic interactionism, the theory developed by George Herbert Mead and Herbert Blumer. In the statistical and econometric tools of business intelligence, we accept only phenomena with causal connections measured by additive measures. In this paper we show that in the social world we deal with symbolic interactions, which can be studied by non-additive labels (symbolic meanings or symbolic values). To accommodate the variety of such phenomena, we should avoid the additivity of basic labels and construct a new probabilistic method for business intelligence based on non-Archimedean probabilities.

  15. Basic principles

    International Nuclear Information System (INIS)

    Wilson, P.D.

    1996-01-01

    Some basic explanations are given of the principles underlying the nuclear fuel cycle, starting with the physics of atomic and nuclear structure and continuing with nuclear energy and reactors, fuel and waste management and finally a discussion of economics and the future. An important aspect of the fuel cycle concerns the possibility of ''closing the back end'' i.e. reprocessing the waste or unused fuel in order to re-use it in reactors of various kinds. The alternative, the ''oncethrough'' cycle, discards the discharged fuel completely. An interim measure involves the prolonged storage of highly radioactive waste fuel. (UK)

  16. How Symmetrical Assumptions Advance Strategic Management Research

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Hallberg, Hallberg

    2014-01-01

    We develop the case for symmetrical assumptions in strategic management theory. Assumptional symmetry obtains when assumptions made about certain actors and their interactions in one of the application domains of a theory are also made about this set of actors and their interactions in other application domains of the theory. We argue that assumptional symmetry leads to theoretical advancement by promoting the development of theory with greater falsifiability and stronger ontological grounding. Thus, strategic management theory may be advanced by systematically searching for asymmetrical…

  17. Theoretical study on the interactions between chlordecone hydrate and acidic surface groups of activated carbon under basic pH conditions.

    Science.gov (United States)

    Melchor-Rodríguez, Kenia; Gamboa-Carballo, Juan José; Ferino-Pérez, Anthuan; Passé-Coutrin, Nady; Gaspard, Sarra; Jáuregui-Haza, Ulises Javier

    2018-05-01

    A theoretical study of the influence of acidic surface groups (SG) of activated carbon (AC) on chlordecone hydrate (CLDh) adsorption is presented, in order to help understand the adsorption process under basic pH conditions. A seven-ring aromatic system (coronene) with a functional group at the edge was used as a simplified model of AC to evaluate the influence of SG in the course of adsorption from aqueous solution at basic pH conditions. Two SG were modeled in their deprotonated forms, carboxyl and hydroxyl (COO⁻ and O⁻), interacting with CLDh. To model the solvation process, all systems under study were calculated with up to three water molecules. The Multiple Minima Hypersurface (MMH) methodology was employed to study the interactions of CLDh with SG on AC using the PM7 semiempirical Hamiltonian, to explore the potential energy surfaces of the systems and evaluate their thermodynamic association energies. Representative structures obtained from MMH were re-optimized using M06-2X Density Functional Theory, and the Quantum Theory of Atoms in Molecules (QTAIM) was used to characterize the interaction types. As a result, the association of CLDh with acidic SG at basic pH conditions occurs preferentially between the two alcohol groups of CLDh and the COO⁻ and O⁻ groups, and through dispersive interactions of the chlorine atoms of CLDh with the graphitic surface. Moreover, the QTAIM study confirmed covalent interactions between the negatively charged oxygen of the SG and one hydrogen atom of the CLDh alcohol groups (O⁻⋯HO interactions) in the absence of water molecules. It can be concluded that the interactions of CLDh with acidic SG of AC under basic pH conditions confirm the physical mechanism of the adsorption process. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. On the basic research of design analysis and testing based on the failure rate for pipings and equipment under earthquake conditions

    International Nuclear Information System (INIS)

    Shibata, H.

    1980-01-01

    This paper deals with a method for evaluating the failure rate of the pipings and equipment of nuclear power plants under destructive earthquakes, and with a new design concept from this standpoint. These researches are supported by various related studies which the author has carried out since 1966. This paper briefly describes the history of the development, summaries of these studies, and their significance for practice. Surveys of the damage to industrial facilities caused by recent destructive earthquakes form the basic study for this subject, and the continuous observation of the responses of model structures of a plant complex to natural earthquakes is another important basic study for understanding the stochastic nature and significance of response analysis in the anti-earthquake design of nuclear power plants. With exact knowledge of these subjects, the author has been developing a procedure for evaluating the failure rate of pipings and equipment under destructive earthquake conditions, a new design method ('counter-input design'), and others. His effort is now directed towards establishing a practical procedure, the basic researches having been completed. (orig.)

  19. Wrong assumptions in the financial crisis

    NARCIS (Netherlands)

    Aalbers, M.B.

    2009-01-01

    Purpose - The purpose of this paper is to show how some of the assumptions about the current financial crisis are wrong because they misunderstand what takes place in the mortgage market. Design/methodology/approach - The paper discusses four wrong assumptions: one related to regulation, one to

  20. The relevance of ''theory rich'' bridge assumptions

    NARCIS (Netherlands)

    Lindenberg, S

    1996-01-01

    Actor models are increasingly being used as a form of theory building in sociology because they can better represent the causal mechanisms that connect macro variables. However, actor models need additional assumptions, especially so-called bridge assumptions, for filling in the relatively empty

  1. Challenging Assumptions of International Public Relations: When Government Is the Most Important Public.

    Science.gov (United States)

    Taylor, Maureen; Kent, Michael L.

    1999-01-01

    Explores assumptions underlying Malaysia's and the United States' public-relations practice. Finds many assumptions guiding Western theories and practices are not applicable to other countries. Examines the assumption that the practice of public relations targets a variety of key organizational publics. Advances international public-relations…

  2. Global scientific research commons under the Nagoya Protocol: Towards a collaborative economy model for the sharing of basic research assets.

    Science.gov (United States)

    Dedeurwaerdere, Tom; Melindi-Ghidi, Paolo; Broggiato, Arianna

    2016-01-01

    This paper aims to get a better understanding of the motivational and transaction cost features of building global scientific research commons, with a view to contributing to the debate on the design of appropriate policy measures under the recently adopted Nagoya Protocol. For this purpose, the paper analyses the results of a world-wide survey of managers and users of microbial culture collections, which focused on the role of social and internalized motivations, organizational networks and external incentives in promoting the public availability of upstream research assets. Overall, the study confirms the hypotheses of the social production model of information and shareable goods, but it also shows the need to complete this model. For the sharing of materials, the underlying collaborative economy of excess capacity plays a key role in addition to social production, while for data, competitive pressures amongst scientists tend to play a bigger role.

  3. Global scientific research commons under the Nagoya Protocol: Towards a collaborative economy model for the sharing of basic research assets

    OpenAIRE

    Dedeurwaerdere, Tom; Melindi Ghidi, Paolo; Broggiato, Arianna

    2015-01-01

    This paper aims to get a better understanding of the motivational and transaction cost features of building global scientific research commons, with a view to contributing to the debate on the design of appropriate policy measures under the recently adopted Nagoya Protocol. For this purpose, the paper analyses the results of a world-wide survey of managers and users of microbial culture collections, which focused on the role of social and internalized motivations, organizational networks and ...

  4. Assumptions for the Annual Energy Outlook 1993

    International Nuclear Information System (INIS)

    1993-01-01

    This report is an auxiliary document to the Annual Energy Outlook 1993 (AEO) (DOE/EIA-0383(93)). It presents a detailed discussion of the assumptions underlying the forecasts in the AEO. The energy modeling system is an economic equilibrium system, with component demand modules representing end-use energy consumption by major end-use sector. Another set of modules represents petroleum, natural gas, coal, and electricity supply patterns and pricing. A separate module generates annual forecasts of important macroeconomic and industrial output variables. Interactions among these components of energy markets generate projections of prices and quantities for which energy supply equals energy demand. This equilibrium modeling system is referred to as the Intermediate Future Forecasting System (IFFS). The supply models in IFFS for oil, coal, natural gas, and electricity determine supply and price for each fuel depending upon consumption levels, while the demand models determine consumption depending upon end-use price. IFFS solves for market equilibrium for each fuel by balancing supply and demand to produce an energy balance in each forecast year
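The supply-demand balancing described above can be illustrated with a toy single-fuel market. This is a hedged sketch: the functional forms, elasticity values, and the bisection solver are illustrative assumptions of mine, not the actual IFFS modules.

```python
# Toy one-fuel market equilibrium, illustrating the balancing idea in the
# abstract above. Demand falls with price, supply rises with price; we solve
# for the price at which they are equal. All functional forms and parameter
# values are hypothetical -- the real IFFS modules are far more detailed.

def demand(price):
    # Constant-elasticity demand (elasticity -0.5), hypothetical.
    return 100.0 * price ** -0.5

def supply(price):
    # Constant-elasticity supply (elasticity +0.7), hypothetical.
    return 20.0 * price ** 0.7

def equilibrium_price(lo=0.01, hi=100.0, tol=1e-9):
    """Bisection on excess demand (demand - supply), monotone in price."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if demand(mid) > supply(mid):
            lo = mid   # excess demand: the market-clearing price is higher
        else:
            hi = mid   # excess supply: the market-clearing price is lower
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

p = equilibrium_price()
print(f"equilibrium price = {p:.3f}, quantity = {demand(p):.3f}")
```

In IFFS this balancing is carried out per fuel and per forecast year, with the supply and demand modules exchanging prices and quantities until all markets clear.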

  5. String cosmology basic ideas and general results

    CERN Document Server

    Veneziano, Gabriele

    1995-01-01

    After recalling a few basic concepts from cosmology and string theory, I will outline the main ideas/assumptions underlying (our own group's approach to) string cosmology and show how these lead to the definition of a two-parameter family of ``minimal" models. I will then briefly explain how to compute, in terms of those parameters, the spectrum of scalar, tensor and electromagnetic perturbations, and mention their most relevant physical consequences. More details on the latter part of this talk can be found in Maurizio Gasperini's contribution to these proceedings.

  6. Linear regression and the normality assumption.

    Science.gov (United States)

    Schmidt, Amand F; Finan, Chris

    2017-12-16

    Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings such transformations are often unnecessary and, worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage, i.e., the number of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to obtain unbiased estimates of standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10), violations of this normality assumption often do not noticeably impact results. In contrast, assumptions on the parametric model, absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and, worse, may bias estimates due to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
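The coverage simulation described in the abstract can be sketched as follows; the sample size, error distribution, and replication count below are assumptions of mine, chosen only to illustrate the point that non-normal errors barely affect confidence-interval coverage in large samples.

```python
# Hedged sketch of a coverage simulation for the slope of a linear regression
# with strongly skewed (log-normal) errors. Details (n, error law, reps) are
# illustrative assumptions, not the paper's actual simulation design.
import numpy as np

rng = np.random.default_rng(0)
true_slope, n, reps = 2.0, 1000, 2000
covered = 0
for _ in range(reps):
    x = rng.uniform(0.0, 10.0, n)
    errors = rng.lognormal(0.0, 1.0, n) - np.exp(0.5)  # skewed, mean ~0
    y = 1.0 + true_slope * x + errors
    xc = x - x.mean()
    b = (xc @ y) / (xc @ xc)                            # OLS slope
    resid = y - y.mean() - b * xc                       # OLS residuals
    se = np.sqrt(resid @ resid / (n - 2) / (xc @ xc))   # standard error of b
    if abs(b - true_slope) <= 1.96 * se:                # 95% CI covers truth?
        covered += 1
print(f"empirical coverage of the 95% CI: {covered / reps:.3f}")
```

Despite the heavily skewed errors, the empirical coverage stays close to the nominal 0.95, matching the paper's point that in large samples the normality assumption matters far less than, e.g., heteroscedasticity or dependent errors.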

  7. Ontological, Epistemological and Methodological Assumptions: Qualitative versus Quantitative

    Science.gov (United States)

    Ahmed, Abdelhamid

    2008-01-01

    The review to follow is a comparative analysis of two studies conducted in the field of TESOL in Education published in "TESOL QUARTERLY." The aspects to be compared are as follows. First, a brief description of each study will be presented. Second, the ontological, epistemological and methodological assumptions underlying each study…

  8. Genome-wide identification of basic helix-loop-helix and NF-1 motifs underlying GR binding sites in male rat hippocampus

    DEFF Research Database (Denmark)

    Pooley, John R.; Flynn, Ben P.; Grøntved, Lars

    2017-01-01

    Glucocorticoids regulate hippocampal function in part by modulating gene expression through the glucocorticoid receptor (GR). GR binding is highly cell type specific, directed to accessible chromatin regions established during tissue differentiation. Distinct classes of GR binding sites... linked to structural and organizational roles, an absence of major tethering partners for GRs, and little or no evidence for binding at negative glucocorticoid response elements. A basic helix-loop-helix motif closely resembling a NeuroD1 or Olig2 binding site was found underlying a subset of GR binding...

  9. Determining Bounds on Assumption Errors in Operational Analysis

    Directory of Open Access Journals (Sweden)

    Neal M. Bengtson

    2014-01-01

    Full Text Available The technique of operational analysis (OA) is used in the study of systems performance, mainly for estimating mean values of various measures of interest, such as the number of jobs at a device and response times. The basic principles of operational analysis allow errors in assumptions to be quantified over a time period. The assumptions which are used to derive the operational analysis relationships are studied. Using Karush-Kuhn-Tucker (KKT) conditions, bounds on error measures of these OA relationships are found. Examples of these bounds are used for representative performance measures to show limits on the difference between true performance values and those estimated by operational analysis relationships. A technique for finding tolerance limits on the bounds is demonstrated with a simulation example.
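For readers unfamiliar with OA, the basic operational relationships the paper builds on (the utilization law and Little's law) can be checked from a handful of measured quantities; the numbers below are made up for illustration.

```python
# Basic operational-analysis relationships on hypothetical measured data.
# T: observation period (s), C: completions, B: busy time at the device (s),
# W: accumulated job-seconds in the system. All values are illustrative.
T, C, B, W = 60.0, 120, 48.0, 300.0

throughput  = C / T    # X = C/T, jobs per second
utilization = B / T    # U = B/T
service     = B / C    # S = B/C, mean service time per completion

# Utilization law: U = X * S (an operational identity, no stochastic model).
assert abs(utilization - throughput * service) < 1e-12

# Little's law: N = X * R, with N = W/T (mean jobs present) and
# R = W/C (mean response time); again an operational identity.
N, R = W / T, W / C
assert abs(N - throughput * R) < 1e-12

print(f"X={throughput:.2f}/s U={utilization:.2f} S={service:.2f}s "
      f"N={N:.2f} R={R:.2f}s")
```

The paper's contribution lies in bounding the error of such estimates when the assumptions used to derive them (e.g., flow balance) hold only approximately.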

  10. Basic design of shield blocks for a spallation neutron source under the high-intensity proton accelerator project

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, Katsuhiko; Maekawa, Fujio; Takada, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-03-01

    Under the JAERI-KEK High-Intensity Proton Accelerator Project (J-PARC), a spallation neutron source driven by a 3 GeV-1 MW proton beam is planned to be constructed as a main part of the Materials and Life Science Facility. Overall dimensions of the biological shield of the neutron source had been determined by evaluating its shielding performance with Monte Carlo calculations. This report describes the results of design studies on an optimum dividing scheme, in terms of cost and handling, and on the mechanical strength of the shield blocks for the biological shield. As for mechanical strength, it was studied whether the shield blocks would remain stable, fall down, or slide horizontally in the case of an earthquake of seismic intensity 5.5 (250 Gal) taken as an abnormal load. For the ceiling shield blocks, which are supported at both ends of the long blocks, the maximum bending moment and the maximum deflection at their center were evaluated. (author)
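The stability question posed in the report (stand, fall down, or slide under a 250 Gal load) can be sketched as a rigid-block check. The block dimensions and friction coefficient below are my assumptions; the report's actual criteria and load combinations may differ.

```python
# Rigid-block stability sketch under a horizontal acceleration of 250 Gal
# (2.5 m/s^2). Sliding: inertia force vs. friction; overturning: inertia
# moment about the toe vs. restoring moment. Mass cancels in both checks.
# Friction coefficient and block sizes are hypothetical assumptions.
G = 9.81    # gravitational acceleration, m/s^2
A_H = 2.5   # horizontal acceleration, m/s^2 (250 Gal)
MU = 0.5    # assumed block-on-block friction coefficient

def check_block(width, height):
    slides    = A_H > MU * G                       # m*a_h > mu*m*g
    overturns = A_H * height / 2 > G * width / 2   # a_h*(h/2) > g*(b/2)
    return slides, overturns

for w, h in [(1.0, 1.0), (0.5, 2.0), (0.5, 3.0)]:
    s, o = check_block(w, h)
    print(f"{w:.1f} m x {h:.1f} m block: slides={s}, overturns={o}")
```

With these assumed numbers a squat block stays stable while slender blocks overturn before they slide, which is one reason the dividing scheme (block proportions) matters for seismic stability.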

  11. Basic design of shield blocks for a spallation neutron source under the high-intensity proton accelerator project

    CERN Document Server

    Yoshida, K; Takada, H

    2003-01-01

    Under the JAERI-KEK High-Intensity Proton Accelerator Project (J-PARC), a spallation neutron source driven by a 3 GeV-1 MW proton beam is planned to be constructed as a main part of the Materials and Life Science Facility. Overall dimensions of the biological shield of the neutron source had been determined by evaluating its shielding performance with Monte Carlo calculations. This report describes the results of design studies on an optimum dividing scheme, in terms of cost and handling, and on the mechanical strength of the shield blocks for the biological shield. As for mechanical strength, it was studied whether the shield blocks would remain stable, fall down, or slide horizontally in the case of an earthquake of seismic intensity 5.5 (250 Gal) taken as an abnormal load. For the ceiling shield blocks, which are supported at both ends of the long blocks, the maximum bending moment and the maximum deflection at their center were evaluated.

  12. Hygiene Basics

    Science.gov (United States)

    KidsHealth / For Teens: Hygiene Basics. Covers hygiene basics for teens, including oily hair and sweat, and how to deal with greasy …

  13. Formalization and Analysis of Reasoning by Assumption

    OpenAIRE

    Bosse, T.; Jonker, C.M.; Treur, J.

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically analyzed against dynamic properties they fulfill. To this end, for the pattern of reasoning by assumption a variety of dynamic properties have been speci...

  14. The Development of the Academic Administration Model of Basic Primary Educational Institutions under the Office of Sakon Nakhon Educational Service Area Office 3

    Directory of Open Access Journals (Sweden)

    Kamonlrat Kaenchan

    2017-09-01

    Full Text Available The research aimed to (1) examine the current conditions, problems, and academic administration of basic education schools under the Office of Sakon Nakhon Educational Service Area Office 3, and (2) develop an academic administration model for basic educational schools under that office. The study was divided into 2 phases. Phase 1: study the research and literature concerning the framework, current conditions, and problems of academic administration to form the basis for constructing a questionnaire. The questionnaire was then used to collect data from 50 administrators, 83 heads of academic sections, and 198 heads of learning areas; in addition, the administrators of 5 schools that ranked top five in the national education test scores (O-NET) and were certified by the office of educational standard assurance and quality assessment in the third-round inspection were interviewed. Phase 2: construct the model of educational administration of basic education schools under the Office of Sakon Nakhon Educational Service Area Office 3, hold a focus group discussion on the constructed model with 2 educational administrators and 5 school directors, and have the model evaluated by 30 school administrators and teachers. The instruments used to collect data were a questionnaire, interview forms, recording forms, and evaluation forms. The data were analyzed by a computer application. The statistics used to analyze the data were percentage, mean, and standard deviation. The results were as follows: (1) The current conditions of academic administration of basic education schools under the Office of Sakon Nakhon Educational Service Area Office 3 were, overall, at a high level. The highest mean was the development of the learning process.
The problems of academic administration, overall

  15. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

    Full Text Available Classical Respondent-Driven Sampling (RDS) estimators are based on a Markov process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for the bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.
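A much-simplified version of the with/without-replacement question can be checked for a plain sample mean (not an actual RDS estimator; the population and parameters below are made up): even at a 40% sampling fraction, sampling without replacement leaves the point estimate essentially unbiased, consistent with the stability the paper reports.

```python
# Monte Carlo sketch: bias of the sample mean under without-replacement
# sampling at various sampling fractions. This is a simplification of the
# RDS setting -- no network, no degree weighting -- meant only to show that
# the point estimate stays essentially unbiased; what the with-replacement
# assumption misstates is mainly the variance, not the estimate itself.
import random

random.seed(1)
population = [random.lognormvariate(0.0, 1.0) for _ in range(5000)]
true_mean = sum(population) / len(population)

for f in (0.10, 0.20, 0.40):
    n = int(f * len(population))
    reps = 400
    # random.sample draws WITHOUT replacement, like real interviewing.
    estimates = [sum(random.sample(population, n)) / n for _ in range(reps)]
    bias = sum(estimates) / reps - true_mean
    print(f"sampling fraction {f:.0%}: mean bias = {bias:+.4f}")
```

The printed biases hover around zero at every sampling fraction; the paper's formal proofs address the harder RDS case, where network structure and degree weighting complicate the picture.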

  16. Comparison of transports expected under different waste management concepts: determination of basic data for application in risk analyses

    International Nuclear Information System (INIS)

    Alter, U.; Mielke, H.G.; Wehner, G.

    1983-01-01

    According to the Atomic Act, article 9a, paragraph 1, the licensees of nuclear power plants in the Federal Republic of Germany are obliged to provide for the management of radioactive wastes resulting from the operation of these plants. Concerning the provisions to be made for the management of such wastes, two concepts are discussed: nuclear reprocessing and final waste disposal center (Nukleares Entsorgungszentrum, NEZ); and the integrated spent fuel and waste management concept (Integriertes Entsorgungskonzept, IEK). Unlike the NEZ, the IEK-concept may have different sites for the following fuel cycle facilities: intermediate spent fuel storage, reprocessing, waste conditioning and final disposal, and uranium and plutonium fuel element fabrication facilities. The fundamental differences of the pertinent transports are presented. Transport scenarios expected under the two alternatives NEZ and IEK have been elaborated for the purpose of a data collection covering the following aspects: materials to be shipped, number of packages shipped, number of packages shipped per transport, transport by rail or by road, transport routes and distances, and duration of transports

  17. Monitoring Assumptions in Assume-Guarantee Contracts

    Directory of Open Access Journals (Sweden)

    Oleg Sokolsky

    2016-05-01

    Full Text Available Pre-deployment verification of software components with respect to behavioral specifications in the assume-guarantee form does not, in general, guarantee absence of errors at run time. This is because assumptions about the environment cannot be discharged until the environment is fixed. An intuitive approach is to complement pre-deployment verification of guarantees, up to the assumptions, with post-deployment monitoring of environment behavior to check that the assumptions are satisfied at run time. Such a monitor is typically implemented by instrumenting the application code of the component. An additional challenge for the monitoring step is that environment behaviors are typically obtained through an I/O library, which may alter the component's view of the input format. This transformation requires us to introduce a second pre-deployment verification step to ensure that alarms raised by the monitor would indeed correspond to violations of the environment assumptions. In this paper, we describe an approach for constructing monitors and verifying them against the component assumption. We also discuss limitations of instrumentation-based monitoring and potential ways to overcome them.

  18. Pre-equilibrium assumptions and statistical model parameters effects on reaction cross-section calculations

    International Nuclear Information System (INIS)

    Avrigeanu, M.; Avrigeanu, V.

    1992-02-01

    A systematic study of the effects of statistical model parameters and semi-classical pre-equilibrium emission models has been carried out for the (n,p) reactions on the 56Fe and 60Co target nuclei. The results obtained by using various assumptions within a given pre-equilibrium emission model differ among themselves more than the results of different models used under similar conditions. The necessity of using realistic level density formulas is emphasized, especially in connection with pre-equilibrium emission models (i.e. with the exciton state density expression), while basic support could be found only by replacing the Williams exciton state density formula with a realistic one. (author). 46 refs, 12 figs, 3 tabs

  19. Basic electrotechnology

    CERN Document Server

    Ashen, R A

    2013-01-01

    BASIC Electrotechnology discusses the applications of Beginner's All-purpose Symbolic Instruction Code (BASIC) in engineering, particularly in solving electrotechnology-related problems. The book is comprised of six chapters that cover several topics relevant to BASIC and electrotechnology. Chapter 1 provides an introduction to BASIC, and Chapter 2 talks about the use of complex numbers in a.c. circuit analysis. Chapter 3 covers linear circuit analysis with d.c. and sinusoidal a.c. supplies. The book also discusses the elementary magnetic circuit theory. The theory and performance of two windi

  20. Formalization and analysis of reasoning by assumption.

    Science.gov (United States)

    Bosse, Tibor; Jonker, Catholijn M; Treur, Jan

    2006-01-02

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically analyzed against dynamic properties they fulfill. To this end, for the pattern of reasoning by assumption a variety of dynamic properties have been specified, some of which are considered characteristic for the reasoning pattern, whereas some other properties can be used to discriminate among different approaches to the reasoning. These properties have been automatically checked for the traces acquired in experiments undertaken. The approach turned out to be beneficial from two perspectives. First, checking characteristic properties contributes to the empirical validation of a theory on reasoning by assumption. Second, checking discriminating properties allows the analyst to identify different classes of human reasoners. 2006 Lawrence Erlbaum Associates, Inc.

  1. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  2. Problems and Guidelines of Strategy Implementation in Basic Educational Institutions under the Supervision of KhonKaen Primary Educational Service Area Office 4

    Directory of Open Access Journals (Sweden)

    Sasiwan Tonkanya

    2016-09-01

    Full Text Available The research aimed to (1) study problems of strategy implementation in basic educational institutions under Khonkaen Primary Educational Service Area Office 4, and (2) propose guidelines for strategy implementation in basic educational institutions under Khonkaen Primary Educational Service Area Office 4. The study was carried out in 2 phases. Phase 1 focused on the study and analysis of the strategy implementation problems, and phase 2 studied the best-practice schools. The informants for the interviews in phase 1 comprised 6 school administrators and teachers involved in strategy implementation, from small-sized, medium-sized, and large-sized schools, selected by the purposive sampling technique. The population for the study of the strategy implementation problems in basic educational institutions in phase 1 consisted of 543 school administrators and teachers involved in strategy implementation from 181 schools under Khonkaen Primary Educational Service Area Office 4 in academic year 2014. The study samples were 217 school administrators and teachers involved in strategy implementation from small-sized, medium-sized, and large-sized schools under Khonkaen Primary Educational Service Area Office 4, selected by the stratified sampling technique. The informants of the phase 2 study were 6 school administrators and teachers involved in strategy implementation from small-sized, medium-sized, and large-sized best-practice schools, obtained by the purposive sampling technique. The research instruments used for data collection consisted of 2 sets of questionnaires. The Set 1 questionnaire was a 5-point Likert scale on the levels of the problems in implementation, with item discrimination at 0.60-1.00 and reliability of the whole questionnaire at 0.9359. The questionnaire contained 3 parts with 65 items. The Set 2 questionnaire comprised 2 parts with 10 items regarding

  3. [Analysis of nursing-related content portrayed in middle and high school textbooks under the national common basic curriculum in Korea].

    Science.gov (United States)

    Jung, Myun Sook; Choi, Hyeong Wook; Li, Dong Mei

    2010-02-01

    The purpose of this study was to analyze nursing-related content in middle and high school textbooks under the National Common Basic Curriculum in Korea. Nursing-related content from 43 middle school textbooks and 13 high school textbooks was analyzed. There were 28 items of nursing-related content in the selected textbooks. Among them, 13 items were in the 'nursing activity' area, 6 items in the 'nurse as an occupation' area, 2 items in the 'major and career choice' area, 6 items were 'just one word', and 1 item was in 'others'. The main nursing-related content portrayed in the middle and high school textbooks was caring for patients (7 items, accounting for 46.5%) and nurses working in hospitals (6 items, accounting for 21.4%). In terms of the gender perspective, female nurses (15 items, accounting for 53.6%) were most prevalent.

  4. Anesthesia Basics

    Science.gov (United States)

    KidsHealth / For Teens: Anesthesia Basics. Explains what anesthesia is and what to expect when getting an operation. Also available in Spanish: 'Conceptos básicos sobre la anestesia'.

  5. BASIC Programming.

    Science.gov (United States)

    Jennings, Carol Ann

    Designed for use by both secondary- and postsecondary-level business teachers, this curriculum guide consists of 10 units of instructional materials dealing with Beginner's All-purpose Symbolic Instruction Code (BASIC) programming. Topics of the individual lessons are numbering BASIC programs and using the PRINT, END, and REM statements; system…

  6. The homogeneous marginal utility of income assumption

    NARCIS (Netherlands)

    Demuynck, T.

    2015-01-01

    We develop a test to verify if every agent from a population of heterogeneous consumers has the same marginal utility of income function. This homogeneous marginal utility of income assumption is often (implicitly) used in applied demand studies because it has nice aggregation properties and

  7. Causal Mediation Analysis: Warning! Assumptions Ahead

    Science.gov (United States)

    Keele, Luke

    2015-01-01

    In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…

  8. Critically Challenging Some Assumptions in HRD

    Science.gov (United States)

    O'Donnell, David; McGuire, David; Cross, Christine

    2006-01-01

    This paper sets out to critically challenge five interrelated assumptions prominent in the (human resource development) HRD literature. These relate to: the exploitation of labour in enhancing shareholder value; the view that employees are co-contributors to and co-recipients of HRD benefits; the distinction between HRD and human resource…

  9. Formalization and Analysis of Reasoning by Assumption

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning

  10. Shattering world assumptions: A prospective view of the impact of adverse events on world assumptions.

    Science.gov (United States)

    Schuler, Eric R; Boals, Adriel

    2016-05-01

    Shattered Assumptions theory (Janoff-Bulman, 1992) posits that experiencing a traumatic event has the potential to diminish the degree of optimism in the assumptions of the world (assumptive world), which could lead to the development of posttraumatic stress disorder. Prior research assessed the assumptive world with a measure that was recently reported to have poor psychometric properties (Kaler et al., 2008). The current study had 3 aims: (a) to assess the psychometric properties of a recently developed measure of the assumptive world, (b) to retrospectively examine how prior adverse events affected the optimism of the assumptive world, and (c) to measure the impact of an intervening adverse event. An 8-week prospective design with a college sample (N = 882 at Time 1 and N = 511 at Time 2) was used to assess the study objectives. We split adverse events into those that were objectively or subjectively traumatic in nature. The new measure exhibited adequate psychometric properties. The report of a prior objective or subjective trauma at Time 1 was related to a less optimistic assumptive world. Furthermore, participants who experienced an intervening objectively traumatic event evidenced a decrease in optimistic views of the world compared with those who did not experience an intervening adverse event. We found support for Shattered Assumptions theory retrospectively and prospectively using a reliable measure of the assumptive world. We discuss future assessments of the measure of the assumptive world and clinical implications to help rebuild the assumptive world with current therapies. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  11. Basic hydraulics

    CERN Document Server

    Smith, P D

    1982-01-01

    BASIC Hydraulics aims to help students both to become proficient in the BASIC programming language by actually using the language in an important field of engineering and to use computing as a means of mastering the subject of hydraulics. The book begins with a summary of the technique of computing in BASIC together with comments and listing of the main commands and statements. Subsequent chapters introduce the fundamental concepts and appropriate governing equations. Topics covered include principles of fluid mechanics; flow in pipes, pipe networks and open channels; hydraulic machinery;

  12. Basic Finance

    Science.gov (United States)

    Vittek, J. F.

    1972-01-01

    A discussion of the basic measures of corporate financial strength, and the sources of this information, is reported. Considered are: balance sheet, income statement, funds and cash flow, and financial ratios.

  13. Report on results of research. Basic studies on characteristics of coal char gasification under pressure; Sekitan char no kaatsuka ni okeru gas ka tokuseino kiso kenkyu seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-03-01

    This paper explains basic studies on the characteristics of coal char gasification under pressure. Hydro-gasification of coal needs, as a gasifying agent, a large amount of hydrogen, which is effectively produced by the water gasification of the unreacted residual char. In fiscal 1975, gasification was tested on Taiheiyo coal carbonized char by an atmospheric fluidized gasifier of 28 mm bore. In fiscal 1976, the experiment was conducted under pressure after fully improving the auxiliary safety equipment. The char and gas yield increased with higher pressure in pressurized carbonization by an autoclave. In fiscal 1977, clinker formation was successfully prevented by using quartz sand as the fluidizing medium. In fiscal 1978, two-stage continuous gasification was examined. In fiscal 1979, the correlation was determined between operation factors, such as gasification pressure and temperature, and clinker formation/char reactivity. An experiment was conducted on particle pop-out using a pressurized fluidized bed of 100 mm inner diameter, with the pop-out quantity found to be proportional to the 0.38th power of the pressure. A high-pressure fluidized gasifier was built with a char processing capacity of 1 t/day, a pressure of 20 atm, and an inner diameter of 100 mm. In fiscal 1980, this device was operated continuously, elucidating the problems to be solved for practical application. (NEDO)

  14. Towards New Probabilistic Assumptions in Business Intelligence

    OpenAIRE

    Schumann Andrew; Szelc Andrzej

    2015-01-01

    One of the main assumptions of mathematical tools in science is represented by the idea of measurability and additivity of reality. For discovering the physical universe additive measures such as mass, force, energy, temperature, etc. are used. Economics and conventional business intelligence try to continue this empiricist tradition and in statistical and econometric tools they appeal only to the measurable aspects of reality. However, a lot of important variables of economic systems cannot ...

  15. The 'revealed preferences' theory: Assumptions and conjectures

    International Nuclear Information System (INIS)

    Green, C.H.

    1983-01-01

    As a kind of intuitive psychology, approaches based on the 'revealed preferences' theory for determining acceptable risks are a useful method for generating hypotheses. Given that reliability engineering develops faster than methods for determining reliability targets, the revealed-preferences approach is a necessary preliminary aid. Some of the assumptions on which the 'revealed preferences' theory is based are identified and analysed, and then compared with experimentally obtained results. (orig./DG) [de

  16. How to Handle Assumptions in Synthesis

    Directory of Open Access Journals (Sweden)

    Roderick Bloem

    2014-07-01

    The increased interest in reactive synthesis over the last decade has led to many improved solutions but also to many new questions. In this paper, we discuss the question of how to deal with assumptions on environment behavior. We present four goals that we think should be met and review several different possibilities that have been proposed. We argue that each of them falls short in at least one aspect.

  17. Managerial and Organizational Assumptions in the CMM's

    DEFF Research Database (Denmark)

    Rose, Jeremy; Aaen, Ivan; Nielsen, Peter Axel

    2008-01-01

    Thinking about improving the management of software development in software firms is dominated by one approach: the capability maturity model devised and administered at the Software Engineering Institute at Carnegie Mellon University. Though CMM and its replacement CMMI are widely known and used...... thinking about large production and manufacturing organisations (particularly in America) in the late industrial age. Many of the difficulties reported with CMMI can be attributed to basing practice on these assumptions in organisations which have different cultures and management traditions, perhaps...

  18. Occupancy estimation and the closure assumption

    Science.gov (United States)

    Rota, Christopher T.; Fletcher, Robert J.; Dorazio, Robert M.; Betts, Matthew G.

    2009-01-01

    1. Recent advances in occupancy estimation that adjust for imperfect detection have provided substantial improvements over traditional approaches and are receiving considerable use in applied ecology. To estimate and adjust for detectability, occupancy modelling requires multiple surveys at a site and requires the assumption of 'closure' between surveys, i.e. no changes in occupancy between surveys. Violations of this assumption could bias parameter estimates; however, little work has assessed model sensitivity to violations of this assumption or how commonly such violations occur in nature. 2. We apply a modelling procedure that can test for closure to two avian point-count data sets in Montana and New Hampshire, USA, that exemplify time-scales at which closure is often assumed. These data sets illustrate different sampling designs that allow testing for closure but are currently rarely employed in field investigations. Using a simulation study, we then evaluate the sensitivity of parameter estimates to changes in site occupancy and evaluate a power analysis developed for sampling designs that is aimed at limiting the likelihood of closure. 3. Application of our approach to point-count data indicates that habitats may frequently be open to changes in site occupancy at time-scales typical of many occupancy investigations, with 71% and 100% of species investigated in Montana and New Hampshire respectively, showing violation of closure across time periods of 3 weeks and 8 days respectively. 4. Simulations suggest that models assuming closure are sensitive to changes in occupancy. Power analyses further suggest that the modelling procedure we apply can effectively test for closure. 5. Synthesis and applications. Our demonstration that sites may be open to changes in site occupancy over time-scales typical of many occupancy investigations, combined with the sensitivity of models to violations of the closure assumption, highlights the importance of properly addressing
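The sensitivity to closure violations described above can be illustrated with a minimal closed-form sketch; all parameter values are hypothetical, and the "open" model below (occupancy redrawn independently each survey) is an extreme caricature of a closure violation, not the paper's simulation design:

```python
def detect_any_closed(psi, p, K):
    """P(at least one detection in K surveys) under closure:
    the site is occupied (prob psi) and detected at least once."""
    return psi * (1 - (1 - p) ** K)

def detect_any_open(psi, p, K):
    """Same quantity when occupancy is redrawn independently each
    survey -- an extreme violation of the closure assumption."""
    return 1 - (1 - psi * p) ** K

psi, p, K = 0.6, 0.4, 3       # hypothetical occupancy, detection, surveys
d_closed = detect_any_closed(psi, p, K)
d_open = detect_any_open(psi, p, K)

# A naive estimator that wrongly assumes closure inverts the closed-model
# formula; fed data from the open process, it is biased away from psi.
psi_hat = d_open / (1 - (1 - p) ** K)
print(round(d_closed, 4), round(d_open, 4), round(psi_hat, 4))
```

Here the naive estimate exceeds the true occupancy, showing in miniature why violations of closure bias parameter estimates.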

  19. The sufficiency assumption of the reasoned approach to action

    Directory of Open Access Journals (Sweden)

    David Trafimow

    2015-12-01

    The reasoned action approach to understanding and predicting behavior includes the sufficiency assumption. Although variables not included in the theory may influence behavior, these variables work through the variables in the theory. Once the reasoned action variables are included in an analysis, the inclusion of other variables will not increase the variance accounted for in behavioral intentions or behavior. Reasoned action researchers are very concerned with testing whether new variables account for variance (or how much variance the traditional variables account for), to see whether they are important, in general or with respect to the specific behaviors under investigation. But this approach tacitly assumes that accounting for variance is highly relevant to understanding the production of variance, which is what is really at issue. Based on the variance law, I question this assumption.

  20. Comparative Interpretation of Classical and Keynesian Fiscal Policies (Assumptions, Principles and Primary Opinions)

    Directory of Open Access Journals (Sweden)

    Engin Oner

    2015-06-01

    In the Classical School, founded by Adam Smith, which gives prominence to supply and adopts an approach of unbiased finance, the economy is always in a state of full-employment equilibrium. In this system of thought, whose main philosophy is budget balance, which asserts that there is flexibility between prices and wages, and which regards public debt as an extraordinary instrument, interference of the state in economic and social life is frowned upon. In line with these views, classical fiscal policy rests on three basic assumptions: the "Consumer State" assumption, the assumption that "public expenditures are always ineffectual", and the assumption of the "impartiality of the tax and expenditure policies implemented by the state". The Keynesian School founded by John Maynard Keynes, on the other hand, gives prominence to demand, adopts the approach of functional finance, and asserts that underemployment and over-employment equilibria exist in the economy alongside full-employment equilibrium, that problems cannot be solved through the invisible hand, that prices and wages are rigid, and that interference of the state is essential, with fiscal policy having to be utilized effectively at this point. Keynesian fiscal policy likewise depends on three primary assumptions: the "Filter State" assumption, the assumption that "public expenditures are sometimes effective and sometimes ineffective or neutral", and the assumption that "the tax, debt and expenditure policies of the state can never be impartial".

  1. Basic electronics

    CERN Document Server

    Holbrook, Harold D

    1971-01-01

    Basic Electronics is an elementary text designed for basic instruction in electricity and electronics. It places emphasis on electronic emission and the vacuum tube, and shows transistor circuits in parallel with electron-tube circuits. This book also demonstrates how the transistor merely replaces the tube, with proper changes of circuit constants as required. Many problems are presented at the end of each chapter. The book comprises 17 chapters and opens with an overview of electron theory, followed by a discussion on resistance, inductance, and capacitance, along with their effects on t

  2. The Importance of the Assumption of Uncorrelated Errors in Psychometric Theory

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.; Patelis, Thanos

    2015-01-01

    A critical discussion of the assumption of uncorrelated errors in classical psychometric theory and its applications is provided. It is pointed out that this assumption is essential for a number of fundamental results and underlies the concept of parallel tests, the Spearman-Brown prophecy and correction-for-attenuation formulas, as well as…

  3. New Assumptions to Guide SETI Research

    Science.gov (United States)

    Colombano, S. P.

    2018-01-01

    The recent Kepler discoveries of Earth-like planets offer the opportunity to focus our attention on detecting signs of life and technology in specific planetary systems, but I feel we need to become more flexible in our assumptions. The reason is that, while it is still reasonable and conservative to assume that life is most likely to have originated in conditions similar to ours, the vast time differences in potential evolutions render the likelihood of "matching" technologies very slim. In light of these challenges I propose a more "aggressive" approach to future SETI exploration in directions that until now have received little consideration.

  4. Limiting assumptions in molecular modeling: electrostatics.

    Science.gov (United States)

    Marshall, Garland R

    2013-02-01

    Molecular mechanics attempts to represent intermolecular interactions in terms of classical physics. Initial efforts assumed a point charge located at the atom center and Coulombic interactions. It has been recognized over multiple decades that simply representing electrostatics with a charge on each atom fails to reproduce the electrostatic potential surrounding a molecule as estimated by quantum mechanics. Molecular orbitals are not spherically symmetrical, an implicit assumption of monopole electrostatics. This perspective reviews recent evidence that requires the use of multipole electrostatics and polarizability in molecular modeling.
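The limitation of monopole (atom-centered point-charge) electrostatics can be illustrated with a minimal sketch: a neutral site has zero monopole potential everywhere, yet a nonzero dipole moment still produces an anisotropic potential. The charge, moment, and distance values are illustrative, in units chosen so that 1/(4*pi*eps0) = 1; this is not the multipole machinery of any production force field:

```python
import math

def monopole_potential(q, r):
    """Potential of a point charge q at distance r (1/(4*pi*eps0) = 1)."""
    return q / r

def dipole_potential(p_moment, r, theta):
    """Potential of an ideal point dipole: p * cos(theta) / r**2."""
    return p_moment * math.cos(theta) / r ** 2

# A neutral site (q = 0) has zero monopole potential everywhere, but a
# nonzero dipole moment still yields an angle-dependent potential:
q, p_moment, r = 0.0, 0.5, 2.0
print(monopole_potential(q, r))                    # exactly 0
print(dipole_potential(p_moment, r, 0.0))          # along the dipole axis
print(dipole_potential(p_moment, r, math.pi / 2))  # ~0 in the equatorial plane
```

The angular dependence is exactly what a single atom-centered charge cannot represent, which is why multipoles (and polarizability) are needed.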

  5. Basic concepts

    International Nuclear Information System (INIS)

    Dorner, B.

    1999-01-01

    The basic concepts of neutron scattering as a tool for studying the structure and the dynamics of condensed matter. Theoretical aspects are outlined, the two different cases of coherent and incoherent scattering are presented. The issue of resolution, coherence volume and the role of monochromators are also discussed. (K.A.)

  6. Body Basics

    Science.gov (United States)

    ... learn more about how the body works, what basic human anatomy is, and what happens when parts of ...

  7. Basic Thermodynamics

    International Nuclear Information System (INIS)

    Duthil, P

    2014-01-01

    The goal of this paper is to present a general thermodynamic basis that is useable in the context of superconductivity and particle accelerators. The first part recalls the purpose of thermodynamics and summarizes its important concepts. Some applications, from cryogenics to magnetic systems, are covered. In the context of basic thermodynamics, only thermodynamic equilibrium is considered

  8. Basic Thermodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Duthil, P [Orsay, IPN (France)

    2014-07-01

    The goal of this paper is to present a general thermodynamic basis that is useable in the context of superconductivity and particle accelerators. The first part recalls the purpose of thermodynamics and summarizes its important concepts. Some applications, from cryogenics to magnetic systems, are covered. In the context of basic thermodynamics, only thermodynamic equilibrium is considered.

  9. Ethanol Basics

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-01-30

    Ethanol is a widely-used, domestically-produced renewable fuel made from corn and other plant materials. More than 96% of gasoline sold in the United States contains ethanol. Learn more about this alternative fuel in the Ethanol Basics Fact Sheet, produced by the U.S. Department of Energy's Clean Cities program.

  10. Consenting to Heteronormativity: Assumptions in Biomedical Research

    NARCIS (Netherlands)

    Cottingham, M.D.; Fisher, J.A.

    2015-01-01

    The process of informed consent is fundamental to basic scientific research with human subjects. As one aspect of the scientific enterprise, clinical drug trials rely on informed consent documents to safeguard the ethical treatment of trial participants. This paper explores the role of

  11. What's Love Got to Do with It? Rethinking Common Sense Assumptions

    Science.gov (United States)

    Trachman, Matthew; Bluestone, Cheryl

    2005-01-01

    One of the most basic tasks in introductory social science classes is to get students to reexamine their common sense assumptions concerning human behavior. This article introduces a shared assignment developed for a learning community that paired an introductory sociology and psychology class. The assignment challenges students to rethink the…

  12. Wavelet basics

    CERN Document Server

    Chan, Y T

    1995-01-01

    Since the study of wavelets is a relatively new area, with much of the research coming from mathematicians, most of the literature uses terminology, concepts and proofs that may, at times, be difficult and intimidating for the engineer. Wavelet Basics has therefore been written as an introductory book for scientists and engineers. The mathematical presentation has been kept simple, the concepts being presented in elaborate detail in a terminology that engineers will find familiar. Difficult ideas are illustrated with examples which will also aid in the development of an intuitive insight. Chapter 1 reviews the basics of signal transformation and discusses the concepts of duals and frames. Chapter 2 introduces the wavelet transform, contrasts it with the short-time Fourier transform and clarifies the names of the different types of wavelet transforms. Chapter 3 links multiresolution analysis, orthonormal wavelets and the design of digital filters. Chapter 4 gives a tour d'horizon of topics of current interest: wave...
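As the simplest concrete instance of the wavelet transforms the book surveys, a one-level orthonormal Haar transform and its exact inverse can be sketched as follows (a generic textbook example, not code from the book):

```python
import math

def haar_step(signal):
    """One level of the orthonormal Haar wavelet transform:
    pairwise scaled sums (approximation) and differences (detail)."""
    s = 1 / math.sqrt(2)
    approx = [s * (a + b) for a, b in zip(signal[::2], signal[1::2])]
    detail = [s * (a - b) for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    """Invert one Haar step, recovering the original signal exactly."""
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out.extend([s * (a + d), s * (a - d)])
    return out

x = [4.0, 2.0, 5.0, 5.0]
approx, detail = haar_step(x)
recon = haar_inverse(approx, detail)
print([round(v, 6) for v in recon])
```

Perfect reconstruction from the approximation and detail coefficients is the property that multiresolution analysis (Chapter 3) generalizes to longer filters.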

  13. Education: The Basics. The Basics

    Science.gov (United States)

    Wood, Kay

    2011-01-01

    Everyone knows that education is important, we are confronted daily by discussion of it in the media and by politicians, but how much do we really know about education? "Education: The Basics" is a lively and engaging introduction to education as an academic subject, taking into account both theory and practice. Covering the schooling system, the…

  14. Leakage-Resilient Circuits without Computational Assumptions

    DEFF Research Database (Denmark)

    Dziembowski, Stefan; Faust, Sebastian

    2012-01-01

    Physical cryptographic devices inadvertently leak information through numerous side-channels. Such leakage is exploited by so-called side-channel attacks, which often allow for a complete security breach. A recent trend in cryptography is to propose formal models to incorporate leakage into the model and to construct schemes that are provably secure within them. We design a general compiler that transforms any cryptographic scheme, e.g., a block-cipher, into a functionally equivalent scheme which is resilient to any continual leakage provided that the following three requirements are satisfied...... on computational assumptions, our results are purely information-theoretic. In particular, we do not make use of public key encryption, which was required in all previous works...

  15. Assumptions and Challenges of Open Scholarship

    Directory of Open Access Journals (Sweden)

    George Veletsianos

    2012-10-01

    Researchers, educators, policymakers, and other education stakeholders hope and anticipate that openness and open scholarship will generate positive outcomes for education and scholarship. Given the emerging nature of open practices, educators and scholars are finding themselves in a position in which they can shape and/or be shaped by openness. The intention of this paper is (a) to identify the assumptions of the open scholarship movement and (b) to highlight challenges associated with the movement's aspirations of broadening access to education and knowledge. Through a critique of technology use in education, an understanding of educational technology narratives and their unfulfilled potential, and an appreciation of the negotiated implementation of technology use, we hope that this paper helps spark a conversation for a more critical, equitable, and effective future for education and open scholarship.

  16. Challenging the assumptions for thermal sensation scales

    DEFF Research Database (Denmark)

    Schweiker, Marcel; Fuchs, Xaver; Becker, Susanne

    2016-01-01

    Scales are widely used to assess the personal experience of thermal conditions in built environments. Most commonly, thermal sensation is assessed, mainly to determine whether a particular thermal condition is comfortable for individuals. A seven-point thermal sensation scale has been used extensively, which is suitable for describing a one-dimensional relationship between physical parameters of indoor environments and subjective thermal sensation. However, human thermal comfort is not merely a physiological but also a psychological phenomenon. Thus, it should be investigated how scales for its assessment could benefit from a multidimensional conceptualization. The common assumptions related to the usage of thermal sensation scales are challenged, empirically supported by two analyses. These analyses show that the relationship between temperature and subjective thermal sensation is non

  17. Human Praxis: A New Basic Assumption for Art Educators of the Future.

    Science.gov (United States)

    Hodder, Geoffrey S.

    1980-01-01

    After analyzing Vincent Lanier's five characteristic roles of art education, the article briefly explains the pedagogy of Paulo Freire, based on human praxis, and applies it to the existing "oppressive" art education system. The article reduces Lanier's roles to resemble a single Freirean model. (SB)

  18. Basic electronics

    CERN Document Server

    Tayal, DC

    2010-01-01

    The second edition of this book incorporates the comments and suggestions of my friends and students who have critically studied the first edition. In this edition the changes and additions have been made and subject matter has been rearranged at some places. The purpose of this text is to provide a comprehensive and up-to-date study of the principles of operation of solid state devices, their basic circuits and application of these circuits to various electronic systems, so that it can serve as a standard text not only for universities and colleges but also for technical institutes. This book

  19. BPA review of Washington Public Power Supply System, Projects 1 and 3 (WNP 1 and 3), construction schedule and financing assumptions

    International Nuclear Information System (INIS)

    1984-01-01

    This document contains the following appendices: Data provided By Supply System Regarding Costs and Schedules; Basic Supply System Data and Assumptions; Detailed Modeling of Net Present Values; Origin and Detailed Description of the System Analysis Mode; Decision Analysis Model; Pro Forma Budget Expenditure Levels for Fiscal years 1984 through 1990; Financial Flexibility Analysis - Discretionary/Nondiscretionary Expenditure Levels; Detailed Analysis of BPA's Debt Structure Under the 13 Pro Forma Budget Scenarios for Fiscal Years 1984 through 1990; Wertheim and Co., Inc., August 30, 1984 Letter; Project Considerations and Licensing/Regulatory Issues, Supply System September 15, 1984 Letter; and Summary of Litigation Affecting WNP 1 and 3, and WNP 4 and 5

  20. Basic Exchange Rate Theories

    NARCIS (Netherlands)

    J.G.M. van Marrewijk (Charles)

    2005-01-01

    textabstractThis four-chapter overview of basic exchange rate theories discusses (i) the elasticity and absorption approach, (ii) the (long-run) implications of the monetary approach, (iii) the short-run effects of monetary and fiscal policy under various economic conditions, and (iv) the transition

  1. Virginia State Adult Basic Education Administrative Guide for Local Programs and Projects under the Adult Education Act, P.L. 91-230 and Amendments.

    Science.gov (United States)

    Virginia State Dept. of Education, Richmond. Adult Education Service.

    This administrative guide was developed to provide local school divisions and other agencies operating federally funded Adult Basic Education (ABE) programs in Virginia with the purpose, requirements, and procedures for conducting these programs. The guide is divided into eleven sections. The introduction covers the purpose and scope of ABE…

  2. Inflation Basics

    Energy Technology Data Exchange (ETDEWEB)

    Green, Dan [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2014-03-01

    inflation since metrical fluctuations, both scalar and tensor, are also produced in inflationary models. Thus, the time appears to be appropriate for a very basic and simple exposition of the inflationary model written from a particle physics perspective. Only the simplest scalar model will be explored because it is easy to understand and contains all the basic elements of the inflationary model.

  3. Inflation Basics

    International Nuclear Information System (INIS)

    Green, Dan

    2014-01-01

    waves imprinted on the CMB. These would be a "smoking gun" for inflation since metrical fluctuations, both scalar and tensor, are also produced in inflationary models. Thus, the time appears to be appropriate for a very basic and simple exposition of the inflationary model written from a particle physics perspective. Only the simplest scalar model will be explored because it is easy to understand and contains all the basic elements of the inflationary model.

  4. The zero-sum assumption in neutral biodiversity theory

    NARCIS (Netherlands)

    Etienne, R.S.; Alonso, D.; McKane, A.J.

    2007-01-01

    The neutral theory of biodiversity as put forward by Hubbell in his 2001 monograph has received much criticism for its unrealistic simplifying assumptions. These are the assumptions of functional equivalence among different species (neutrality), the assumption of point mutation speciation, and the

  5. Philosophy of Technology Assumptions in Educational Technology Leadership

    Science.gov (United States)

    Webster, Mark David

    2017-01-01

    A qualitative study using grounded theory methods was conducted to (a) examine what philosophy of technology assumptions are present in the thinking of K-12 technology leaders, (b) investigate how the assumptions may influence technology decision making, and (c) explore whether technological determinist assumptions are present. Subjects involved…

  6. Behavioural assumptions in labour economics: Analysing social security reforms and labour market transitions

    OpenAIRE

    van Huizen, T.M.

    2012-01-01

    The aim of this dissertation is to test behavioural assumptions in labour economics models and thereby improve our understanding of labour market behaviour. The assumptions under scrutiny in this study are derived from an analysis of recent influential policy proposals: the introduction of savings schemes in the system of social security. A central question is how this reform will affect labour market incentives and behaviour. Part I (Chapter 2 and 3) evaluates savings schemes. Chapter 2 exam...

  7. A PSA study for the SMART basic design

    International Nuclear Information System (INIS)

    Han, Sang Hoon; Kim, H. C.; Yang, S. H.; Lee, D. J.

    2002-03-01

    SMART (System-Integrated Modular Advanced Reactor) is an advanced integral-type small and medium category nuclear power reactor under development, with a rated thermal power of 330 MW. A Probabilistic Safety Analysis (PSA) for the SMART basic design has been performed to evaluate the safety and optimize the design. Since the basic design is complete but a detailed design is not yet available for SMART, several assumptions about the system design were made before performing the PSA. The scope of the PSA was limited to the Level-1 internal full-power PSA. The Level-2 and Level-3 PSA, the external-events PSA, and the low-power/shutdown PSA will be performed in the final design stage
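A Level-1 PSA of the kind described combines initiating-event frequencies with system failure probabilities; a minimal sketch of cut-set quantification under the rare-event approximation follows. Every number below is an illustrative placeholder, not SMART design data:

```python
import math

# Hypothetical minimal-cut-set quantification for a Level-1 PSA sketch.
# All frequencies and probabilities are illustrative placeholders.
initiating_event_freq = 1e-2   # initiating events per reactor-year (assumed)
cut_sets = [
    [1e-3, 2e-2],              # e.g. pump fails AND valve fails to open
    [5e-4],                    # e.g. a single support-system failure
]

# Rare-event approximation: the conditional core-damage probability is
# the sum over minimal cut sets of the product of basic-event probabilities.
p_top = sum(math.prod(cs) for cs in cut_sets)
cdf = initiating_event_freq * p_top   # core damage frequency per reactor-year
print(f"{cdf:.2e}")
```

Real PSA codes quantify thousands of cut sets per initiating event and sum over event-tree sequences; the arithmetic per sequence is as above.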

  8. HYPROLOG: A New Logic Programming Language with Assumptions and Abduction

    DEFF Research Database (Denmark)

    Christiansen, Henning; Dahl, Veronica

    2005-01-01

    We present HYPROLOG, a novel integration of Prolog with assumptions and abduction which is implemented in and partly borrows syntax from Constraint Handling Rules (CHR) for integrity constraints. Assumptions are a mechanism inspired by linear logic and taken over from Assumption Grammars. The language shows a novel flexibility in the interaction between the different paradigms, including all additional built-in predicates and constraint solvers that may be available. Assumptions and abduction are especially useful for language processing, and we can show how HYPROLOG works seamlessly together...

  9. On the Basic Equations of the Magnetostatics

    Directory of Open Access Journals (Sweden)

    A. M. Makarov

    2016-01-01

    The paper studies the physical relationships between the main objects of the magnetic field in a continuous medium exhibiting magnetization effects. The following hypotheses are considered in turn: the hypothesis that the magnetization vector field of the medium is primary and physically real; a similar hypothesis about the real existence of Ampere (molecular, magnetization) currents; and the hypothesis of a magnetic dipole moment of a volume element of the medium in view of the bulk density of electric currents in that volume. A more rigorous derivation of the basic differential equations of magnetostatics from the Biot-Savart-Laplace law is proposed. The well-known derivations of the basic equations of magnetostatics rely on a procedure in which, to prove a local differential relation, an integral over some volume is transformed into an integral over the surface bounding that volume. The closed surface is then chosen in a specific way: either as a surface in vacuum (outside the volume of the medium under consideration) or as the surface of the conductor (where the normal component of the currents vanishes). In this paper the control surface is drawn arbitrarily within the volume of the medium under consideration, which leads to a mathematically sound result. The hypotheses listed above are analysed, the main tool of the analysis being the concept of a two-sided surface bounding a medium volume of arbitrary finite dimensions. The analysis reveals the physical adequacy of the considered hypotheses, yields the appropriate differential equations for the basic vector fields of magnetostatics, and produces a new condition. The resulting condition for the closedness of magnetization currents is obtained in full agreement with the well-known Gauss law of electrostatics, which avoids the need for additional, and not always justified, assumptions.
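The Biot-Savart-Laplace law from which such derivations start can be written, for a steady volume current density j, as:

```latex
\mathbf{B}(\mathbf{r}) \;=\; \frac{\mu_0}{4\pi}\int_V
\frac{\mathbf{j}(\mathbf{r}')\times(\mathbf{r}-\mathbf{r}')}
     {\lvert\mathbf{r}-\mathbf{r}'\rvert^{3}}\,\mathrm{d}V'
```

Taking the divergence and curl of this expression yields the basic local equations of magnetostatics, div B = 0 and curl B = mu_0 j. This SI form is a standard textbook statement supplied for orientation, not notation taken from the paper itself.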

  10. Arabidopsis basic leucine zipper transcription factors involved in an abscisic acid-dependent signal transduction pathway under drought and high-salinity conditions

    OpenAIRE

    Uno, Yuichi; Furihata, Takashi; Abe, Hiroshi; Yoshida, Riichiro; Shinozaki, Kazuo; Yamaguchi-Shinozaki, Kazuko

    2000-01-01

    The induction of the dehydration-responsive Arabidopsis gene, rd29B, is mediated mainly by abscisic acid (ABA). Promoter analysis of rd29B indicated that two ABA-responsive elements (ABREs) are required for the dehydration-responsive expression of rd29B as cis-acting elements. Three cDNAs encoding basic leucine zipper (bZIP)-type ABRE-binding proteins were isolated by using the yeast one-hybrid system and were designated AREB1, AREB2, and AREB3 (ABA-responsive ...

  11. Selecting between-sample RNA-Seq normalization methods from the perspective of their assumptions.

    Science.gov (United States)

    Evans, Ciaran; Hardin, Johanna; Stoebel, Daniel M

    2017-02-27

    RNA-Seq is a widely used method for studying the behavior of genes under different biological conditions. An essential step in an RNA-Seq study is normalization, in which raw data are adjusted to account for factors that prevent direct comparison of expression measures. Errors in normalization can have a significant impact on downstream analysis, such as inflated false positives in differential expression analysis. An underemphasized feature of normalization is the assumptions on which the methods rely and how the validity of these assumptions can have a substantial impact on the performance of the methods. In this article, we explain how assumptions provide the link between raw RNA-Seq read counts and meaningful measures of gene expression. We examine normalization methods from the perspective of their assumptions, as an understanding of methodological assumptions is necessary for choosing methods appropriate for the data at hand. Furthermore, we discuss why normalization methods perform poorly when their assumptions are violated and how this causes problems in subsequent analysis. To analyze a biological experiment, researchers must select a normalization method with assumptions that are met and that produces a meaningful measure of expression for the given experiment.
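As a concrete example of how a normalization method encodes an assumption, here is a minimal sketch of DESeq-style median-of-ratios size factors, whose key assumption is that most genes are not differentially expressed between samples. The toy counts are invented for illustration; real implementations also handle zero counts and other edge cases:

```python
import math

def median(xs):
    xs = sorted(xs)
    n = len(xs)
    mid = n // 2
    return xs[mid] if n % 2 else 0.5 * (xs[mid - 1] + xs[mid])

def size_factors(counts):
    """DESeq-style median-of-ratios size factors.

    counts: per-gene rows, one raw count per sample. Relies on the
    assumption that most genes are not differentially expressed."""
    # Geometric mean of each gene across samples (pseudo-reference sample).
    ref = [math.exp(sum(math.log(c) for c in row) / len(row)) for row in counts]
    n_samples = len(counts[0])
    return [
        median([counts[g][j] / ref[g] for g in range(len(counts))])
        for j in range(n_samples)
    ]

# Toy data: sample 2 is sequenced twice as deeply as sample 1.
counts = [[10, 20], [20, 40], [30, 60]]
sf = size_factors(counts)
print([round(s, 4) for s in sf])
```

Dividing each sample's counts by its size factor removes the depth difference; if the "most genes unchanged" assumption is violated, the median ratio no longer reflects depth alone and the correction is biased.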

  12. On the scope of the Federal Government to issue orders in plan approval procedures under para. 9b of the Atomic Energy Act as provided by article 85 section 3 of the Basic Law

    International Nuclear Information System (INIS)

    Ossenbuehl, F.

    1991-01-01

    Under Paragraph 9b of the Atomic Energy Act the Lower Saxonian Minister of the Environment has the competence for the plan approval procedure concerning the final disposal site Konrad. The plan approval procedure under atomic energy law is a unitary administrative procedure which, by virtue of its unitary character and without impinging on constitutional law, makes further administrative procedures and administrative decisions superfluous. In conducting the plan approval procedure the Lower Saxonian Minister of the Environment is acting within the framework of Laender administration on behalf of the Federation. To this extent he is subject to the orders of the Federal Minister of the Environment under Article 85 Section 3 of the Basic Law with respect to the shaping of the procedure and to procedural decisions, as well as to pending decisions on the merits. The concentrating effect of the plan approval procedure under atomic energy law also extends to permits under water law. (orig./HSCH) [de

  13. 76 FR 81966 - Agency Information Collection Activities; Proposed Collection; Comments Requested; Assumption of...

    Science.gov (United States)

    2011-12-29

    ... Indian country is subject to State criminal jurisdiction under Public Law 280 (18 U.S.C. 1162(a)) to... Collection; Comments Requested; Assumption of Concurrent Federal Criminal Jurisdiction in Certain Areas of Indian Country ACTION: 60-Day notice of information collection under review. The Department of Justice...

  14. Investigating the Assumptions of Uses and Gratifications Research

    Science.gov (United States)

    Lometti, Guy E.; And Others

    1977-01-01

    Discusses a study designed to determine empirically the gratifications sought from communication channels and to test the assumption that individuals differentiate channels based on gratifications. (MH)

  15. Legal assumptions for private company claim for additional (supplementary) payment

    Directory of Open Access Journals (Sweden)

    Šogorov Stevan

    2011-01-01

The subject matter of analysis in this article is the legal assumptions that must be met for a private company to be entitled to call for additional payments. After introductory remarks, the discussion focuses on the existence of provisions regarding additional payments in the formation contract, or in a general resolution of the shareholders' meeting, as the starting point for the company's claim. The second assumption is a concrete resolution of the shareholders' meeting that creates individual obligations for additional payments. The third assumption is defined as definiteness regarding the sum of the payment and its due date. The sending of the claim by the relevant company body is set as the fourth legal assumption for the realization of the company's right to claim additional payments from a member of the private company.

  16. Impacts of cloud overlap assumptions on radiative budgets and heating fields in convective regions

    Science.gov (United States)

    Wang, XiaoCong; Liu, YiMin; Bao, Qing

    2016-01-01

Impacts of cloud overlap assumptions on radiative budgets and heating fields are explored with the aid of a cloud-resolving model (CRM), which provided cloud geometry as well as cloud micro and macro properties. Large-scale forcing data to drive the CRM are from the TRMM Kwajalein Experiment and the Global Atmospheric Research Program's Atlantic Tropical Experiment field campaigns, during which abundant convective systems were observed. The investigated overlap assumptions include those that were traditional and widely used in the past and the one recently addressed by Hogan and Illingworth (2000), in which the vertically projected cloud fraction is expressed by a linear combination of maximum and random overlap, with the weighting coefficient depending on the so-called decorrelation length Lcf. Results show that both shortwave and longwave cloud radiative forcings (SWCF/LWCF) are significantly underestimated under maximum (MO) and maximum-random (MRO) overlap assumptions, whereas they are remarkably overestimated under the random overlap (RO) assumption, in comparison with results using the CRM's inherent cloud geometry. These biases can reach as high as 100 W m-2 for SWCF and 60 W m-2 for LWCF. By its very nature, the general overlap (GenO) assumption exhibits an encouraging performance on both SWCF and LWCF simulations, with the biases reduced almost 3-fold compared with traditional overlap assumptions. The superiority of the GenO assumption is also manifested in the simulation of shortwave and longwave radiative heating fields, which are either significantly overestimated or underestimated under traditional overlap assumptions. The study also points out the deficiency of assuming a constant Lcf in GenO. Further examinations indicate that the CRM-diagnosed Lcf varies among different cloud types and tends to be stratified in the vertical. The new parameterization that takes into account variation of Lcf in the vertical well reproduces such a relationship and
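The linear blend of maximum and random overlap that defines GenO can be sketched in a few lines. This is a hedged simplification: adjacent layers are paired with weight alpha = exp(-dz/Lcf) and accumulated with the standard clear-sky product formula; a single layer spacing dz and a constant Lcf are assumed, which is precisely the constant-Lcf simplification the study questions.

```python
import math

def total_cloud_cover(fractions, dz, l_cf):
    """Vertically projected cloud cover for a column of layer cloud
    fractions under the general overlap (GenO) assumption: each adjacent
    pair is a weighted blend of maximum and random overlap, with weight
    alpha = exp(-dz / l_cf) (decorrelation length Lcf)."""
    alpha = math.exp(-dz / l_cf)
    clear = 1.0 - fractions[0]
    for lower, upper in zip(fractions, fractions[1:]):
        if lower >= 1.0:
            return 1.0                           # a fully cloudy layer saturates the column
        c_max = max(lower, upper)                # maximum-overlap cover of the pair
        c_rand = lower + upper - lower * upper   # random-overlap cover of the pair
        pair = alpha * c_max + (1.0 - alpha) * c_rand
        clear *= (1.0 - pair) / (1.0 - lower)    # clear-sky product accumulation
    return 1.0 - clear

# Two half-cloudy layers: GenO recovers MO for large Lcf, RO for small Lcf.
c_mo = total_cloud_cover([0.5, 0.5], dz=1.0, l_cf=1e9)   # ~0.5 (maximum overlap)
c_ro = total_cloud_cover([0.5, 0.5], dz=1.0, l_cf=1e-3)  # ~0.75 (random overlap)
```

Between these extremes the projected cover varies smoothly with Lcf, which is why the choice of decorrelation length directly shifts the simulated SWCF/LWCF.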

  17. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)

    We propose a class of finite state systems of synchronizing distributed processes, where processes make assumptions at local states about the state of other processes in the system. This constrains the global states of the system to those where assumptions made by a process about another are compatible with the ...

  18. 40 CFR 264.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... FACILITIES Financial Requirements § 264.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure, post-closure care, or... 40 Protection of Environment 25 2010-07-01 2010-07-01 false State assumption of responsibility...

  19. 40 CFR 261.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... Excluded Hazardous Secondary Materials § 261.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure or liability... 40 Protection of Environment 25 2010-07-01 2010-07-01 false State assumption of responsibility...

  20. 40 CFR 265.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ..., STORAGE, AND DISPOSAL FACILITIES Financial Requirements § 265.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the... 40 Protection of Environment 25 2010-07-01 2010-07-01 false State assumption of responsibility...

  1. 40 CFR 144.66 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... PROGRAMS (CONTINUED) UNDERGROUND INJECTION CONTROL PROGRAM Financial Responsibility: Class I Hazardous Waste Injection Wells § 144.66 State assumption of responsibility. (a) If a State either assumes legal... 40 Protection of Environment 22 2010-07-01 2010-07-01 false State assumption of responsibility...

  2. 40 CFR 267.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... STANDARDIZED PERMIT Financial Requirements § 267.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure care or liability... 40 Protection of Environment 26 2010-07-01 2010-07-01 false State assumption of responsibility...

  3. Impact of solvent conditions on separation and detection of basic drugs by micro liquid chromatography-mass spectrometry under overloading conditions.

    Science.gov (United States)

    Schubert, Birthe; Oberacher, Herbert

    2011-06-03

In this study the impact of solvent conditions on the performance of μLC/MS for the analysis of basic drugs was investigated. Our aim was to find experimental conditions that enable high-performance chromatographic separation, particularly under overloading conditions, paired with a minimal loss of mass spectrometric detection sensitivity. A focus was put on evaluating the usability of different kinds of acidic modifiers (acetic acid (HOAc), formic acid (FA), methanesulfonic acid (CH₃SO₃H), trifluoroacetic acid (TFA), pentafluoropropionic acid (PFPA), and heptafluorobutyric acid (HFBA)). The test mixture consisted of eleven compounds (bunitrolol, caffeine, cocaine, codeine, diazepam, doxepin, haloperidol, 3,4-methylenedioxyamphetamine, morphine, nicotine, and zolpidem). The best chromatographic performance was obtained with the perfluorinated acids. In particular, 0.010-0.050% HFBA (v/v) was found to represent a good compromise in terms of chromatographic performance and mass spectrometric detection sensitivity. Compared to HOAc, on average a 50% reduction of the peak widths was observed. The use of HFBA was particularly advantageous for polar compounds such as nicotine; chromatographic retention of nicotine was observed only with such a hydrophobic ion-pairing reagent. The best mass spectrometric performance was obtained with HOAc and FA. The loss of detection sensitivity induced by HFBA, however, was moderate, ranging from 0 to 40%, which clearly demonstrates that improved chromatographic performance can compensate to a large extent for the negative effect of reduced ionization efficiency on detection sensitivity. Applications of μLC/MS for the qualitative and quantitative analysis of clinical and forensic toxicological samples are presented. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Synthesis of hierarchically porous perovskite-carbon aerogel composite catalysts for the rapid degradation of fuchsin basic under microwave irradiation and an insight into probable catalytic mechanism

    Science.gov (United States)

    Wang, Yin; Wang, Jiayuan; Du, Baobao; Wang, Yun; Xiong, Yang; Yang, Yiqiong; Zhang, Xiaodong

    2018-05-01

3D hierarchically porous perovskites LaFe0.5M0.5O3-CA (M = Mn, Cu) were synthesized by a two-step method using PMMA as template and supporting with carbon aerogel, and were characterized with SEM, TEM, XRD, XPS and FT-IR spectroscopy. The as-prepared composites were used in microwave (MW) catalytic degradation of fuchsin basic (FB) dye wastewater. Batch experiment results showed that the catalytic degradation of FB could be remarkably improved by coating with CA, and LaFe0.5Cu0.5O3-CA exhibited higher catalytic performance than LaFe0.5Mn0.5O3-CA, which had a close connection with the activity of the substitution metal ion in the B site of the catalysts. FB removal fit a pseudo-first-order model, and the degradation rate constant increased with initial pH value and MW power while decreasing with initial FB concentration. All catalysts presented favorable recycling and stability in repeated experiments. Radical scavenger measurements indicated that hydroxyl radicals, rather than surface peroxide and holes, played an important role in the catalytic process, and their quantity determined the degradation of FB. Furthermore, both Cu and Fe species were involved in the formation of active species, which were responsible for the excellent performance of the LaFe0.5Cu0.5O3-CA/MW system. Therefore, LaFe0.5Cu0.5O3-CA/MW appears to be a promising technology for the removal of organic pollutants in wastewater treatment applications.
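The pseudo-first-order kinetics reported for FB removal, C(t) = C0·exp(-k·t), can be fitted with a short sketch (the toy data, units, and function name are illustrative, not taken from the article):

```python
import math

def pseudo_first_order_k(times, concentrations):
    """Rate constant k from the pseudo-first-order model C(t) = C0*exp(-k*t),
    i.e. a least-squares fit of ln(C0/C) = k*t through the origin."""
    c0 = concentrations[0]
    y = [math.log(c0 / c) for c in concentrations]
    return sum(t * yi for t, yi in zip(times, y)) / sum(t * t for t in times)

# Synthetic decay data with k = 0.2 per minute (illustrative units):
times = [0.0, 1.0, 2.0, 3.0]
conc = [100.0 * math.exp(-0.2 * t) for t in times]
k = pseudo_first_order_k(times, conc)
```

A larger fitted k under higher pH or MW power, and a smaller k at higher initial FB concentration, is exactly the trend the batch experiments report.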

  5. Arabidopsis basic leucine zipper transcription factors involved in an abscisic acid-dependent signal transduction pathway under drought and high-salinity conditions.

    Science.gov (United States)

    Uno, Y; Furihata, T; Abe, H; Yoshida, R; Shinozaki, K; Yamaguchi-Shinozaki, K

    2000-10-10

    The induction of the dehydration-responsive Arabidopsis gene, rd29B, is mediated mainly by abscisic acid (ABA). Promoter analysis of rd29B indicated that two ABA-responsive elements (ABREs) are required for the dehydration-responsive expression of rd29B as cis-acting elements. Three cDNAs encoding basic leucine zipper (bZIP)-type ABRE-binding proteins were isolated by using the yeast one-hybrid system and were designated AREB1, AREB2, and AREB3 (ABA-responsive element binding protein). Transcription of the AREB1 and AREB2 genes is up-regulated by drought, NaCl, and ABA treatment in vegetative tissues. In a transient transactivation experiment using Arabidopsis leaf protoplasts, both the AREB1 and AREB2 proteins activated transcription of a reporter gene driven by ABRE. AREB1 and AREB2 required ABA for their activation, because their transactivation activities were repressed in aba2 and abi1 mutants and enhanced in an era1 mutant. Activation of AREBs by ABA was suppressed by protein kinase inhibitors. These results suggest that both AREB1 and AREB2 function as transcriptional activators in the ABA-inducible expression of rd29B, and further that ABA-dependent posttranscriptional activation of AREB1 and AREB2, probably by phosphorylation, is necessary for their maximum activation by ABA. Using cultured Arabidopsis cells, we demonstrated that a specific ABA-activated protein kinase of 42-kDa phosphorylated conserved N-terminal regions in the AREB proteins.

  6. PFP issues/assumptions development and management planning guide

    International Nuclear Information System (INIS)

    SINCLAIR, J.C.

    1999-01-01

The PFP Issues/Assumptions Development and Management Planning Guide presents the strategy and process used for the identification, allocation, and maintenance of an Issues/Assumptions Management List for the Plutonium Finishing Plant (PFP) integrated project baseline. Revisions to this document will include, as attachments, the most recent version of the Issues/Assumptions Management List, both open and current issues/assumptions (Appendix A), and closed or historical issues/assumptions (Appendix B). This document is intended to be a Project-owned management tool. As such, this document will periodically require revisions resulting from improvements of the information, processes, and techniques as now described. Revisions that suggest improved processes will only require PFP management approval.

  7. Questionable assumptions hampered interpretation of a network meta-analysis of primary care depression treatments.

    Science.gov (United States)

    Linde, Klaus; Rücker, Gerta; Schneider, Antonius; Kriston, Levente

    2016-03-01

    We aimed to evaluate the underlying assumptions of a network meta-analysis investigating which depression treatment works best in primary care and to highlight challenges and pitfalls of interpretation under consideration of these assumptions. We reviewed 100 randomized trials investigating pharmacologic and psychological treatments for primary care patients with depression. Network meta-analysis was carried out within a frequentist framework using response to treatment as outcome measure. Transitivity was assessed by epidemiologic judgment based on theoretical and empirical investigation of the distribution of trial characteristics across comparisons. Homogeneity and consistency were investigated by decomposing the Q statistic. There were important clinical and statistically significant differences between "pure" drug trials comparing pharmacologic substances with each other or placebo (63 trials) and trials including a psychological treatment arm (37 trials). Overall network meta-analysis produced results well comparable with separate meta-analyses of drug trials and psychological trials. Although the homogeneity and consistency assumptions were mostly met, we considered the transitivity assumption unjustifiable. An exchange of experience between reviewers and, if possible, some guidance on how reviewers addressing important clinical questions can proceed in situations where important assumptions for valid network meta-analysis are not met would be desirable. Copyright © 2016 Elsevier Inc. All rights reserved.
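The Q statistic the authors decompose is, at its core, Cochran's heterogeneity statistic over inverse-variance-weighted trial estimates. A minimal sketch of the overall statistic follows (the full network decomposition into within- and between-design parts is beyond this snippet, and the inputs are placeholders):

```python
def cochran_q(estimates, variances):
    """Cochran's Q: weighted squared deviations of trial effect estimates
    from the inverse-variance pooled estimate. Q = 0 when trials agree
    exactly; a large Q signals heterogeneity (and, once decomposed by
    design, inconsistency) in the network."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    return sum(wi * (e - pooled) ** 2 for wi, e in zip(w, estimates))

q_conflict = cochran_q([0.0, 1.0], [1.0, 1.0])          # two trials disagree
q_agree = cochran_q([0.5, 0.5, 0.5], [1.0, 2.0, 4.0])   # identical estimates
```

Note that Q can look acceptable even when transitivity fails, which is why the authors judge transitivity epidemiologically rather than statistically.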

  8. On Some Unwarranted Tacit Assumptions in Cognitive Neuroscience†

    Science.gov (United States)

    Mausfeld, Rainer

    2011-01-01

    The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper brings to mind several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question why such conceptions are nevertheless currently prevailing in many areas of psychology. The paper argues that corresponding conceptions are rooted in four different aspects of our common-sense conception of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input–output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to turn into a largely a-theoretical and data-driven endeavor while at the same time enhancing the prospects for receiving widespread public appreciation of their empirical findings. PMID:22435062

  9. On some unwarranted tacit assumptions in cognitive neuroscience.

    Science.gov (United States)

    Mausfeld, Rainer

    2012-01-01

    The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper brings to mind several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question why such conceptions are nevertheless currently prevailing in many areas of psychology. The paper argues that corresponding conceptions are rooted in four different aspects of our common-sense conception of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input-output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to turn into a largely a-theoretical and data-driven endeavor while at the same time enhancing the prospects for receiving widespread public appreciation of their empirical findings.

  10. Are waves of relational assumptions eroding traditional analysis?

    Science.gov (United States)

    Meredith-Owen, William

    2013-11-01

    The author designates as 'traditional' those elements of psychoanalytic presumption and practice that have, in the wake of Fordham's legacy, helped to inform analytical psychology and expand our capacity to integrate the shadow. It is argued that this element of the broad spectrum of Jungian practice is in danger of erosion by the underlying assumptions of the relational approach, which is fast becoming the new establishment. If the maps of the traditional landscape of symbolic reference (primal scene, Oedipus et al.) are disregarded, analysts are left with only their own self-appointed authority with which to orientate themselves. This self-centric epistemological basis of the relationalists leads to a revision of 'analytic attitude' that may be therapeutic but is not essentially analytic. This theme is linked to the perennial challenge of balancing differentiation and merger and traced back, through Chasseguet-Smirgel, to its roots in Genesis. An endeavour is made to illustrate this within the Journal convention of clinically based discussion through a commentary on Colman's (2013) avowedly relational treatment of the case material presented in his recent Journal paper 'Reflections on knowledge and experience' and through an assessment of Jessica Benjamin's (2004) relational critique of Ron Britton's (1989) transference embodied approach. © 2013, The Society of Analytical Psychology.

  11. Assumptions and Policy Decisions for Vital Area Identification Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myungsu; Bae, Yeon-Kyoung; Lee, Youngseung [KHNP CRI, Daejeon (Korea, Republic of)]

    2016-10-15

U.S. Nuclear Regulatory Commission and IAEA guidance indicate that certain assumptions and policy questions should be addressed in a Vital Area Identification (VAI) process. Korea Hydro and Nuclear Power conducted a VAI based on the current Design Basis Threat and engineering judgement to identify APR1400 vital areas. Some of the assumptions were inherited from Probabilistic Safety Assessment (PSA), as the sabotage logic model was based on the PSA logic tree and equipment location data. This paper illustrates some important assumptions and policy decisions for the APR1400 VAI analysis. Assumptions and policy decisions could be overlooked at the beginning stage of VAI; however, they should be carefully reviewed and discussed among engineers, plant operators, and regulators. Through the APR1400 VAI process, some of the policy concerns and assumptions for analysis were applied based on document research and expert panel discussions. It was also found that there are more assumptions to define in further studies for other types of nuclear power plants. One of these assumptions is mission time, which was inherited from PSA.

  12. Marking and Moderation in the UK: False Assumptions and Wasted Resources

    Science.gov (United States)

    Bloxham, Sue

    2009-01-01

    This article challenges a number of assumptions underlying marking of student work in British universities. It argues that, in developing rigorous moderation procedures, we have created a huge burden for markers which adds little to accuracy and reliability but creates additional work for staff, constrains assessment choices and slows down…

  13. Simple and Regioselective Bromination of 5,6-Disubstituted-indan-1-ones with Br2 Under Acidic and Basic Conditions

    Directory of Open Access Journals (Sweden)

    Eunsook Ma

    2007-01-01

Bromination of 5,6-dimethoxyindan-1-one with Br2 in acetic acid at room temperature produced exclusively the corresponding 2,4-dibromo compound in 95% yield. Reaction of 5,6-dimethoxyindan-1-one with Br2 in the presence of KOH, K2CO3 or Cs2CO3 at ~0°C gave the monobrominated product 4-bromo-5,6-dimethoxyindan-3-one in 79%, 81% and 67% yield, respectively. 5,6-Dihydroxyindan-1-one was dibrominated on the aromatic ring affording 4,7-dibromo-5,6-dihydroxyindan-1-one both in acetic acid at room temperature and in the presence of KOH at ~0°C. 5,6-Difluoroindan-1-one and 1-indanone were α-monobrominated in acetic acid and α,α-dibrominated under KOH conditions at room temperature.

  14. MONITORED GEOLOGIC REPOSITORY LIFE CYCLE COST ESTIMATE ASSUMPTIONS DOCUMENT

    International Nuclear Information System (INIS)

    R.E. Sweeney

    2001-01-01

The purpose of this assumptions document is to provide general scope, strategy, technical basis, schedule and cost assumptions for the Monitored Geologic Repository (MGR) life cycle cost (LCC) estimate and schedule update incorporating information from the Viability Assessment (VA), License Application Design Selection (LADS), 1999 Update to the Total System Life Cycle Cost (TSLCC) estimate and from other related and updated information. This document is intended to generally follow the assumptions outlined in the previous MGR cost estimates and as further prescribed by DOE guidance

  15. Monitored Geologic Repository Life Cycle Cost Estimate Assumptions Document

    International Nuclear Information System (INIS)

    Sweeney, R.

    2000-01-01

    The purpose of this assumptions document is to provide general scope, strategy, technical basis, schedule and cost assumptions for the Monitored Geologic Repository (MGR) life cycle cost estimate and schedule update incorporating information from the Viability Assessment (VA), License Application Design Selection (LADS), 1999 Update to the Total System Life Cycle Cost (TSLCC) estimate and from other related and updated information. This document is intended to generally follow the assumptions outlined in the previous MGR cost estimates and as further prescribed by DOE guidance

  16. Homotopy Method for a General Multiobjective Programming Problem under Generalized Quasinormal Cone Condition

    Directory of Open Access Journals (Sweden)

    X. Zhao

    2012-01-01

A combined interior point homotopy continuation method is proposed for solving a general multiobjective programming problem. We prove the existence and convergence of a smooth homotopy path from almost any initial interior point to a solution of the KKT system under some basic assumptions.
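The path-following idea behind this result can be illustrated on a scalar root-finding problem. This is only an assumed, simplified analogue: the paper tracks a homotopy path to a KKT system of a multiobjective program, while the sketch below uses the elementary Newton homotopy H(x, t) = f(x) - (1 - t)·f(x0):

```python
def homotopy_root(f, df, x0, steps=100):
    """Follow the homotopy H(x, t) = f(x) - (1 - t) * f(x0) from t = 0
    (where x0 solves H exactly) to t = 1 (where H reduces to f),
    applying a few Newton corrections per increment to stay on the path."""
    x = x0
    f0 = f(x0)
    for i in range(1, steps + 1):
        t = i / steps
        for _ in range(3):                       # Newton corrector steps
            x -= (f(x) - (1.0 - t) * f0) / df(x)
    return x

# Track a path from the arbitrary start x0 = 3 to the root of f(x) = x**2 - 2:
root = homotopy_root(lambda x: x * x - 2.0, lambda x: 2.0 * x, 3.0)
```

The "almost any initial interior point" claim corresponds to the fact that the start x0 here is arbitrary; the homotopy deforms an easy problem (solved by x0) continuously into the target one.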

  17. Stem Cell Basics

    Science.gov (United States)

Stem Cell Basics I. Introduction: What are stem cells, and ...

  18. Back to the basics: Identifying and addressing underlying challenges in achieving high quality and relevant health statistics for indigenous populations in Canada.

    Science.gov (United States)

    Smylie, Janet; Firestone, Michelle

    Canada is known internationally for excellence in both the quality and public policy relevance of its health and social statistics. There is a double standard however with respect to the relevance and quality of statistics for Indigenous populations in Canada. Indigenous specific health and social statistics gathering is informed by unique ethical, rights-based, policy and practice imperatives regarding the need for Indigenous participation and leadership in Indigenous data processes throughout the spectrum of indicator development, data collection, management, analysis and use. We demonstrate how current Indigenous data quality challenges including misclassification errors and non-response bias systematically contribute to a significant underestimate of inequities in health determinants, health status, and health care access between Indigenous and non-Indigenous people in Canada. The major quality challenge underlying these errors and biases is the lack of Indigenous specific identifiers that are consistent and relevant in major health and social data sources. The recent removal of an Indigenous identity question from the Canadian census has resulted in further deterioration of an already suboptimal system. A revision of core health data sources to include relevant, consistent, and inclusive Indigenous self-identification is urgently required. These changes need to be carried out in partnership with Indigenous peoples and their representative and governing organizations.

  19. A statistical test of the stability assumption inherent in empirical estimates of economic depreciation.

    Science.gov (United States)

    Shriver, K A

    1986-01-01

Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that economic depreciation rates of decline may be reasonably stable over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.
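The constant-rate assumption being tested corresponds to geometric depreciation, V(a) = cost·(1 - δ)^a. A one-line sketch with purely illustrative numbers:

```python
def depreciated_value(cost, delta, age):
    """Asset value under a constant (geometric) rate of economic
    depreciation delta per period: V(age) = cost * (1 - delta)**age.
    The stability assumption is that delta itself does not drift over time."""
    return cost * (1.0 - delta) ** age

# A $1000 machine depreciating 10% per year is worth $810 after 2 years.
v = depreciated_value(1000.0, 0.10, 2)
```

If delta drifted over time, no single exponent would fit observed vintage prices, which is what the paper's statistical test checks.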

  20. Quasi-experimental study designs series-paper 7: assessing the assumptions.

    Science.gov (United States)

    Bärnighausen, Till; Oldenburg, Catherine; Tugwell, Peter; Bommer, Christian; Ebert, Cara; Barreto, Mauricio; Djimeu, Eric; Haber, Noah; Waddington, Hugh; Rockers, Peter; Sianesi, Barbara; Bor, Jacob; Fink, Günther; Valentine, Jeffrey; Tanner, Jeffrey; Stanley, Tom; Sierra, Eduardo; Tchetgen, Eric Tchetgen; Atun, Rifat; Vollmer, Sebastian

    2017-09-01

    Quasi-experimental designs are gaining popularity in epidemiology and health systems research-in particular for the evaluation of health care practice, programs, and policy-because they allow strong causal inferences without randomized controlled experiments. We describe the concepts underlying five important quasi-experimental designs: Instrumental Variables, Regression Discontinuity, Interrupted Time Series, Fixed Effects, and Difference-in-Differences designs. We illustrate each of the designs with an example from health research. We then describe the assumptions required for each of the designs to ensure valid causal inference and discuss the tests available to examine the assumptions. Copyright © 2017 Elsevier Inc. All rights reserved.
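For one of the designs listed, Difference-in-Differences, the point estimate from four group means is simple enough to sketch (the group means below are placeholders; validity rests on the parallel-trends assumption the paper discusses):

```python
def difference_in_differences(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """DiD estimate of a treatment effect from four group means:
    (treated change over time) minus (control change over time).
    The control group's change stands in for the treated group's
    counterfactual trend, which is the parallel-trends assumption."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Outcome rises 10 -> 18 in the treated group but only 11 -> 14 in controls:
did = difference_in_differences(10.0, 18.0, 11.0, 14.0)  # effect estimate: 5.0
```

When pre-treatment trends in the two groups diverge, this subtraction no longer isolates the treatment effect, which is why the assumption must be examined before the design is trusted.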

  1. Basic properties of a stationary accretion disk surrounding a black hole

    International Nuclear Information System (INIS)

    Hoshi, Reiun

    1977-01-01

    The structure of a stationary accretion disk surrounding a black hole is studied by means of newly developed basic equations. The basic equations are derived under the assumption that the vertical distribution of disk matter is given by a polytrope. For a Keplerian accretion disk, basic equations reduce to a differential equation of the first order. We have found that solutions of an optically thick accretion disk converge to a limiting value, irrespective of the outer boundary condition. This gives the happy consequence that the inner structure of an optically thick accretion disk is determined irrespective of the outer boundary condition. On the contrary, an optically thin accretion disk shows bimodal behavior, that is, two physically distinct states exist depending on the outer boundary condition imposed at the outer edge of the accretion disk. (auth.)

  2. Supporting calculations and assumptions for use in WESF safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hey, B.E.

    1997-03-07

    This document provides a single location for calculations and assumptions used in support of Waste Encapsulation and Storage Facility (WESF) safety analyses. It also provides the technical details and bases necessary to justify the contained results.

  3. Psychopathology, fundamental assumptions and CD-4 T lymphocyte ...

    African Journals Online (AJOL)

    In addition, we explored whether psychopathology and negative fundamental assumptions in ... Method: Self-rating questionnaires to assess depressive symptoms, ... associated with all participants scoring in the positive range of the FA scale.

  4. The Immoral Assumption Effect: Moralization Drives Negative Trait Attributions.

    Science.gov (United States)

    Meindl, Peter; Johnson, Kate M; Graham, Jesse

    2016-04-01

Jumping to negative conclusions about other people's traits is judged as morally bad by many people. Despite this, across six experiments (total N = 2,151), we find that multiple types of moral evaluations--even evaluations related to open-mindedness, tolerance, and compassion--play a causal role in these potentially pernicious trait assumptions. Our results also indicate that moralization affects negative--but not positive--trait assumptions, and that the effect of morality on negative assumptions cannot be explained merely by people's general (nonmoral) preferences or other factors that distinguish moral and nonmoral traits, such as controllability or desirability. Together, these results suggest that one of the more destructive human tendencies--making negative assumptions about others--can be caused by the better angels of our nature. © 2016 by the Society for Personality and Social Psychology, Inc.

  5. Idaho National Engineering Laboratory installation roadmap assumptions document

    International Nuclear Information System (INIS)

    1993-05-01

This document is a composite of roadmap assumptions developed for the Idaho National Engineering Laboratory (INEL) by the US Department of Energy Idaho Field Office and subcontractor personnel as a key element in the implementation of the Roadmap Methodology for the INEL Site. The development and identification of these assumptions is an important factor in planning basis development and establishes the planning baseline for all subsequent roadmap analysis at the INEL

  6. Oil price assumptions in macroeconomic forecasts: should we follow future market expectations?

    International Nuclear Information System (INIS)

    Coimbra, C.; Esteves, P.S.

    2004-01-01

    In macroeconomic forecasting, in spite of its important role in price and activity developments, oil prices are usually taken as an exogenous variable, for which assumptions have to be made. This paper evaluates the forecasting performance of futures market prices against the other popular technical procedure, the carry-over assumption. The results suggest that there is almost no difference between opting for futures market prices or using the carry-over assumption for short-term forecasting horizons (up to 12 months), while, for longer-term horizons, they favour the use of futures market prices. However, as futures market prices reflect market expectations for world economic activity, futures oil prices should be adjusted whenever market expectations for world economic growth are different to the values underlying the macroeconomic scenarios, in order to fully ensure the internal consistency of those scenarios. (Author)

  7. Catalyst in Basic Oleochemicals

    Directory of Open Access Journals (Sweden)

    Eva Suyenty

    2007-10-01

    Full Text Available Currently Indonesia is the world's largest palm oil producer, with production volume reaching 16 million tonnes per annum. The high crude oil and ethylene prices in the last 3-4 years have contributed to the healthy demand growth for basic oleochemicals: fatty acids and fatty alcohols. Oleochemicals are starting to replace crude oil derived products in various applications. As widely practiced in the petrochemical industry, catalysts play a very important role in the production of basic oleochemicals. Catalytic reactions abound in the production of oleochemicals: nickel based catalysts are used in the hydrogenation of unsaturated fatty acids; sodium methylate catalyst in the transesterification of triglycerides; sulfonic based polystyrene resin catalyst in esterification of fatty acids; and copper chromite/copper zinc catalyst in the high pressure hydrogenation of methyl esters or fatty acids to produce fatty alcohols. To maintain long catalyst life, it is crucial to ensure the absence of catalyst poisons and inhibitors in the feed. The preparation methods of nickel and copper chromite catalysts are as follows: precipitation, filtration, drying, and calcination. Sodium methylate is derived from direct reaction of sodium metal and methanol under inert gas. The sulfonic based polystyrene resin is derived from sulfonation of polystyrene crosslinked with divinylbenzene. © 2007 BCREC UNDIP. All rights reserved. [Presented at Symposium and Congress of MKICS 2007, 18-19 April 2007, Semarang, Indonesia] [How to Cite: E. Suyenty, H. Sentosa, M. Agustine, S. Anwar, A. Lie, E. Sutanto. (2007). Catalyst in Basic Oleochemicals. Bulletin of Chemical Reaction Engineering and Catalysis, 2 (2-3): 22-31. doi:10.9767/bcrec.2.2-3.6.22-31] [How to Link/DOI: http://dx.doi.org/10.9767/bcrec.2.2-3.6.22-31 || or local: http://ejournal.undip.ac.id/index.php/bcrec/article/view/6

  8. Basic principles of concrete structures

    CERN Document Server

    Gu, Xianglin; Zhou, Yong

    2016-01-01

    Based on the latest version of designing codes both for buildings and bridges (GB50010-2010 and JTG D62-2004), this book starts from steel and concrete materials, whose properties are very important to the mechanical behavior of concrete structural members. Step by step, analysis of reinforced and prestressed concrete members under basic loading types (tension, compression, flexure, shearing and torsion) and environmental actions are introduced. The characteristic of the book that distinguishes it from other textbooks on concrete structures is that more emphasis has been laid on the basic theories of reinforced concrete and the application of the basic theories in design of new structures and analysis of existing structures. Examples and problems in each chapter are carefully designed to cover every important knowledge point. As a basic course for undergraduates majoring in civil engineering, this course is different from either the previously learnt mechanics courses or the design courses to be learnt. Compa...

  9. Changing Assumptions and Progressive Change in Theories of Strategic Organization

    DEFF Research Database (Denmark)

    Foss, Nicolai J.; Hallberg, Niklas L.

    2017-01-01

    are often decoupled from the results of empirical testing, changes in assumptions seem closely intertwined with theoretical progress. Using the case of the resource-based view, we suggest that progressive change in theories of strategic organization may come about as a result of scholarly debate and dispute......A commonly held view is that strategic organization theories progress as a result of a Popperian process of bold conjectures and systematic refutations. However, our field also witnesses vibrant debates or disputes about the specific assumptions that our theories rely on, and although these debates...... over what constitutes proper assumptions—even in the absence of corroborating or falsifying empirical evidence. We also discuss how changing assumptions may drive future progress in the resource-based view....

  10. The Emperor's sham - wrong assumption that sham needling is sham.

    Science.gov (United States)

    Lundeberg, Thomas; Lund, Iréne; Näslund, Jan; Thomas, Moolamanil

    2008-12-01

    During the last five years a large number of randomised controlled clinical trials (RCTs) have been published on the efficacy of acupuncture in different conditions. In most of these studies verum is compared with sham acupuncture. In general both verum and sham have been found to be effective, and often with little reported difference in outcome. This has repeatedly led to the conclusion that acupuncture is no more effective than placebo treatment. However, this conclusion is based on the assumption that sham acupuncture is inert. Since sham acupuncture evidently is merely another form of acupuncture from the physiological perspective, the assumption that sham is sham is incorrect and conclusions based on this assumption are therefore invalid. Clinical guidelines based on such conclusions may therefore exclude suffering patients from valuable treatments.

  11. Evolution of Requirements and Assumptions for Future Exploration Missions

    Science.gov (United States)

    Anderson, Molly; Sargusingh, Miriam; Perry, Jay

    2017-01-01

    NASA programs are maturing technologies, systems, and architectures to enable future exploration missions. To increase fidelity as technologies mature, developers must make assumptions that represent the requirements of a future program. Multiple efforts have begun to define these requirements, including team internal assumptions, planning system integration for early demonstrations, and discussions between international partners planning future collaborations. For many detailed life support system requirements, existing NASA documents set limits of acceptable values, but a future vehicle may be constrained in other ways and select only a limited range of conditions. Other requirements are effectively set by interfaces or operations, and may be different for the same technology depending on whether the hardware is a demonstration system on the International Space Station, or a critical component of a future vehicle. This paper highlights key assumptions representing potential life support requirements and explanations of the driving scenarios, constraints, or other issues that drive them.

  12. Basic research projects

    International Nuclear Information System (INIS)

    1979-04-01

    The research programs under the cognizance of the Office of Energy Research (OER) are directed toward discovery of natural laws and new knowledge, and to improved understanding of the physical and biological sciences as related to the development, use, and control of energy. The ultimate goal is to develop a scientific underlay for the overall DOE effort and the fundamental principles of natural phenomena, so that these phenomena may be understood and new principles formulated. The DOE-OER outlay activities include three major programs: High Energy Physics, Nuclear Physics, and Basic Energy Sciences. Taken together, these programs represent some 30 percent of the Nation's Federal support of basic research in the energy sciences. The research activities of OER involve more than 6,000 scientists and engineers working in some 17 major Federal Research Centers and at more than 135 different universities and industrial firms throughout the United States. Contract holders in the areas of high-energy physics, nuclear physics, materials sciences, nuclear science, chemical sciences, engineering, mathematics, geosciences, advanced energy projects, and biological energy research are listed. Funding trends for recent years are outlined.

  13. Basic scattering theory

    International Nuclear Information System (INIS)

    Queen, N.M.

    1978-01-01

    This series of lectures on basic scattering theory was given as part of a course for postgraduate high energy physicists and was designed to acquaint the student with some of the basic language and formalism used for the phenomenological description of nuclear reactions and decay processes in the study of elementary particle interactions. Well established and model independent aspects of scattering theory, which are the basis of S-matrix theory, are considered. The subject is considered under the following headings: the S-matrix, cross sections and decay rates, phase space, relativistic kinematics, the Mandelstam variables, the flux factor, two-body phase space, Dalitz plots, other kinematic plots, two-particle reactions, unitarity, the partial-wave expansion, resonances (single-channel case), multi-channel resonances, analyticity and crossing, dispersion relations, the one-particle exchange model, the density matrix, mathematical properties of the density matrix, the density matrix in scattering processes, the density matrix in decay processes, and the helicity formalism. Some exercises for the students are included. (U.K.)
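Two of the standard results covered under these headings can be stated compactly; the formulas below are the usual textbook forms of the partial-wave expansion, the partial-wave cross section, and the optical theorem, not reproduced from the lectures themselves:

```latex
% Partial-wave expansion of the scattering amplitude
f(\theta) = \frac{1}{k} \sum_{\ell=0}^{\infty} (2\ell + 1)\,
            e^{i\delta_\ell} \sin\delta_\ell \, P_\ell(\cos\theta)

% Summing the unitarity-bounded partial waves gives the total cross section
\sigma_{\mathrm{tot}} = \frac{4\pi}{k^2} \sum_{\ell=0}^{\infty}
                        (2\ell + 1) \sin^2\delta_\ell

% Optical theorem, a direct consequence of unitarity of the S-matrix
\sigma_{\mathrm{tot}} = \frac{4\pi}{k}\, \operatorname{Im} f(0)
```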

  14. DDH-Like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike

    2012-01-01

    We introduce and study a new type of DDH-like assumptions based on groups of prime order q. Whereas standard DDH is based on encoding elements of $\\mathbb{F}_{q}$ “in the exponent” of elements in the group, we ask what happens if instead we put in the exponent elements of the extension ring $R_f=......-Reingold style pseudorandom functions, and auxiliary input secure encryption. This can be seen as an alternative to the known family of k-LIN assumptions....
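The standard DDH assumption that such constructions generalize can be sketched as follows; the toy parameters (p = 23, q = 11, g = 2) are far too small to be secure and are chosen only so the arithmetic is easy to follow:

```python
import secrets

# Toy prime-order subgroup: g = 2 generates the subgroup of prime order
# q = 11 inside Z_23^* (NOT secure; illustrative parameters only).
p, q, g = 23, 11, 2

def ddh_triple(real: bool):
    """Return (g^a, g^b, g^ab) mod p if real, else (g^a, g^b, g^c)."""
    a, b = secrets.randbelow(q), secrets.randbelow(q)
    z = (a * b) % q if real else secrets.randbelow(q)
    return pow(g, a, p), pow(g, b, p), pow(g, z, p)

# The DDH assumption states that no efficient algorithm can distinguish
# ddh_triple(True) from ddh_triple(False) with non-negligible advantage.
```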

  15. Emerging Assumptions About Organization Design, Knowledge And Action

    Directory of Open Access Journals (Sweden)

    Alan Meyer

    2013-12-01

    Full Text Available Participants in the Organizational Design Community’s 2013 Annual Conference faced the challenge of “making organization design knowledge actionable.”  This essay summarizes the opinions and insights participants shared during the conference.  I reflect on these ideas, connect them to recent scholarly thinking about organization design, and conclude that seeking to make design knowledge actionable is nudging the community away from an assumption set based upon linearity and equilibrium, and toward a new set of assumptions based on emergence, self-organization, and non-linearity.

  16. Chernobyl versus Basic Law

    Energy Technology Data Exchange (ETDEWEB)

    Sauer, G W

    1986-01-01

    The author discusses the terms 'remaining risk to be accepted' and 'remainder of the aggregate risk', and explains the line of action to be adopted in compliance with the Constitution in order to respond to the event at Chernobyl: The Constitution demands maximum acceptable limits to be defined as low as possible. The author discusses the various dose estimations and the contradictions to be observed in this context. He states that the Chernobyl accident has done most harm to our legal system, as the basic right of freedom from injury has been ploughed under with the radioactivity that covered the soil after the Chernobyl accident. But, he says, a positive effect is that the idea of abandoning nuclear power as too dangerous a technology has gained more widespread acceptance. (HSCH).

  17. Chernobyl versus Basic Law?

    International Nuclear Information System (INIS)

    Sauer, G.W.

    1986-01-01

    The author discusses the terms 'remaining risk to be accepted' and 'remainder of the aggregate risk', and explains the line of action to be adopted in compliance with the Constitution in order to respond to the event at Chernobyl: The Constitution demands maximum acceptable limits to be defined as low as possible. The author discusses the various dose estimations and the contradictions to be observed in this context. He states that the Chernobyl accident has done most harm to our legal system, as the basic right of freedom from injury has been ploughed under with the radioactivity that covered the soil after the Chernobyl accident. But, he says, a positive effect is that the idea of abandoning nuclear power as too dangerous a technology has gained more widespread acceptance. (HSCH) [de

  18. Evaluating The Markov Assumption For Web Usage Mining

    DEFF Research Database (Denmark)

    Jespersen, S.; Pedersen, Torben Bach; Thorhauge, J.

    2003-01-01

    ) model~\\cite{borges99data}. These techniques typically rely on the \\textit{Markov assumption with history depth} $n$, i.e., it is assumed that the next requested page is only dependent on the last $n$ pages visited. This is not always valid, i.e. false browsing patterns may be discovered. However, to our...
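The Markov assumption with history depth n can be made concrete with a minimal next-page predictor; the session data and function names below are illustrative, not taken from the paper:

```python
from collections import Counter, defaultdict

def train(sessions, n=1):
    """Count next-page frequencies conditioned on the last n pages visited."""
    model = defaultdict(Counter)
    for pages in sessions:
        for i in range(len(pages) - n):
            model[tuple(pages[i:i + n])][pages[i + n]] += 1
    return model

def predict(model, history, n=1):
    """Most frequent successor of the last n pages (None if unseen)."""
    counts = model.get(tuple(history[-n:]))
    return counts.most_common(1)[0][0] if counts else None

# Illustrative clickstream sessions.
sessions = [["home", "news", "sport"],
            ["home", "news", "weather"],
            ["blog", "news", "sport"]]
model = train(sessions, n=1)
```

With n = 1 the predictor conditions only on the current page; raising n trades data sparsity for longer memory, and the false browsing patterns the abstract warns about arise precisely when the chosen n is smaller than the true dependence in the data.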

  19. Interface Input/Output Automata: Splitting Assumptions from Guarantees

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Nyman, Ulrik; Wasowski, Andrzej

    2006-01-01

    's \\IOAs [11], relying on a context dependent notion of refinement based on relativized language inclusion. There are two main contributions of the work. First, we explicitly separate assumptions from guarantees, increasing the modeling power of the specification language and demonstrating an interesting...

  20. Exploring five common assumptions on Attention Deficit Hyperactivity Disorder

    NARCIS (Netherlands)

    Batstra, Laura; Nieweg, Edo H.; Hadders-Algra, Mijna

    The number of children diagnosed with attention deficit hyperactivity disorder (ADHD) and treated with medication is steadily increasing. The aim of this paper was to critically discuss five debatable assumptions on ADHD that may explain these trends to some extent. These are that ADHD (i) causes

  1. Efficient pseudorandom generators based on the DDH assumption

    NARCIS (Netherlands)

    Rezaeian Farashahi, R.; Schoenmakers, B.; Sidorenko, A.; Okamoto, T.; Wang, X.

    2007-01-01

    A family of pseudorandom generators based on the decisional Diffie-Hellman assumption is proposed. The new construction is a modified and generalized version of the Dual Elliptic Curve generator proposed by Barker and Kelsey. Although the original Dual Elliptic Curve generator is shown to be

  2. Questioning Engelhardt's assumptions in Bioethics and Secular Humanism.

    Science.gov (United States)

    Ahmadi Nasab Emran, Shahram

    2016-06-01

    In Bioethics and Secular Humanism: The Search for a Common Morality, Tristram Engelhardt examines various possibilities of finding common ground for moral discourse among people from different traditions and concludes that they are futile. In this paper I argue that many of the assumptions on which Engelhardt bases his conclusion about the impossibility of a content-full secular bioethics are problematic. By starting with the notion of moral strangers, there is, by definition, no possibility of a content-full moral discourse among moral strangers. There is thus circularity in beginning the inquiry with a definition of moral strangers, which implies that they do not share enough moral background or commitment to an authority to allow for reaching a moral agreement, and then concluding that content-full morality is impossible among moral strangers. I argue that treating traditions as solid and immutable structures that insulate people across their boundaries is problematic. Another questionable assumption in Engelhardt's work is the idea that religious and philosophical traditions provide content-full moralities. I analyze the cardinal assumption in Engelhardt's review of the various alternatives for a content-full moral discourse among moral strangers, namely his foundationalist account of moral reasoning and knowledge, and indicate the possibility of other ways of attaining moral knowledge besides the foundationalist one. Finally, I examine Engelhardt's view concerning the futility of attempts at justifying a content-full secular bioethics, and indicate how these assumptions have shaped his critique of the alternatives for the possibility of a content-full secular bioethics.

  3. Child Development Knowledge and Teacher Preparation: Confronting Assumptions.

    Science.gov (United States)

    Katz, Lilian G.

    This paper questions the widely held assumption that acquiring knowledge of child development is an essential part of teacher preparation and teaching competence, especially among teachers of young children. After discussing the influence of culture, parenting style, and teaching style on developmental expectations and outcomes, the paper asserts…

  4. The Metatheoretical Assumptions of Literacy Engagement: A Preliminary Centennial History

    Science.gov (United States)

    Hruby, George G.; Burns, Leslie D.; Botzakis, Stergios; Groenke, Susan L.; Hall, Leigh A.; Laughter, Judson; Allington, Richard L.

    2016-01-01

    In this review of literacy education research in North America over the past century, the authors examined the historical succession of theoretical frameworks on students' active participation in their own literacy learning, and in particular the metatheoretical assumptions that justify those frameworks. The authors used "motivation" and…

  5. Making Predictions about Chemical Reactivity: Assumptions and Heuristics

    Science.gov (United States)

    Maeyer, Jenine; Talanquer, Vicente

    2013-01-01

    Diverse implicit cognitive elements seem to support but also constrain reasoning in different domains. Many of these cognitive constraints can be thought of as either implicit assumptions about the nature of things or reasoning heuristics for decision-making. In this study we applied this framework to investigate college students' understanding of…

  6. Using Contemporary Art to Challenge Cultural Values, Beliefs, and Assumptions

    Science.gov (United States)

    Knight, Wanda B.

    2006-01-01

    Art educators, like many other educators born or socialized within the main-stream culture of a society, seldom have an opportunity to identify, question, and challenge their cultural values, beliefs, assumptions, and perspectives because school culture typically reinforces those they learn at home and in their communities (Bush & Simmons, 1990).…

  7. Does Artificial Neural Network Support Connectivism's Assumptions?

    Science.gov (United States)

    AlDahdouh, Alaa A.

    2017-01-01

    Connectivism was presented as a learning theory for the digital age and connectivists claim that recent developments in Artificial Intelligence (AI) and, more specifically, Artificial Neural Network (ANN) support their assumptions of knowledge connectivity. Yet, very little has been done to investigate this brave allegation. Does the advancement…

  8. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    This paper reviews the literature on discourses and theoretical assumptions in IT project portfolio management (IT PPM), a topic attracting increasing interest in recent years. Considering IT PPM an interdisciplinary practice, we conduct a concept-based literature review of relevant

  9. 7 CFR 1980.476 - Transfer and assumptions.

    Science.gov (United States)

    2010-01-01

    ...-354 449-30 to recover its pro rata share of the actual loss at that time. In completing Form FmHA or... the lender on liquidations and property management. A. The State Director may approve all transfer and... Director will notify the Finance Office of all approved transfer and assumption cases on Form FmHA or its...

  10. Origins and Traditions in Comparative Education: Challenging Some Assumptions

    Science.gov (United States)

    Manzon, Maria

    2018-01-01

    This article questions some of our assumptions about the history of comparative education. It explores new scholarship on key actors and ways of knowing in the field. Building on the theory of the social constructedness of the field of comparative education, the paper elucidates how power shapes our scholarly histories and identities.

  11. Observing gravitational-wave transient GW150914 with minimal assumptions

    NARCIS (Netherlands)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Phythian-Adams, A.T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwa, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. C.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, R.D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, M.J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blackburn, L.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, A.L.S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, J.G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, T.C; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brocki, P.; Brooks, A. F.; Brown, A.D.; Brown, D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderon Bustillo, J.; Callister, T. A.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Diaz, J. 
Casanueva; Casentini, C.; Caudill, S.; Cavaglia, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Baiardi, L. Cerboni; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chatterji, S.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, D. S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Qian; Chua, S. E.; Chung, E.S.; Ciani, G.; Clara, F.; Clark, J. A.; Clark, M.; Cleva, F.; Coccia, E.; Cohadon, P. -F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, A.C.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J. -P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, A.L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; Debra, D.; Debreczeni, G.; Degallaix, J.; De laurentis, M.; Deleglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.A.; DeRosa, R. T.; Rosa, R.; DeSalvo, R.; Dhurandhar, S.; Diaz, M. C.; Di Fiore, L.; Giovanni, M.G.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H. -B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, T. M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.M.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. 
C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. R.; Flaminio, R.; Fletcher, M; Fournier, J. -D.; Franco, S; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritsche, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.P.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; Gonzalez, Idelmis G.; Castro, J. M. Gonzalez; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Lee-Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.M.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; de Haas, R.; Hacker, J. J.; Buffoni-Hall, R.; Hall, E. D.; Hammond, G.L.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, P.J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C. -J.; Haughian, K.; Healy, J.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hinder, I.; Hoak, D.; Hodge, K. A.; Hofman, D.; Hollitt, S. E.; Holt, K.; Holz, D. E.; Hopkins, P.; Hosken, D. J.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Idrisy, A.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J. -M.; Isi, M.; Islas, G.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, D.H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jimenez-Forteza, F.; Johnson, W.; Jones, I.D.; Jones, R.; Jonker, R. J. G.; Ju, L.; Haris, K.; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.H.; Kanner, J. 
B.; Karki, S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kawazoe, F.; Kefelian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.E.; Key, J. S.; Khalaidovski, A.; Khalili, F. Y.; Khan, I.; Khan., S.; Khan, Z.; Khazanov, E. A.; Kijhunchoo, N.; Kim, C.; Kim, J.; Kim, K.; Kim, Nam-Gyu; Kim, Namjun; Kim, Y.M.; King, E. J.; King, P. J.; Kinsey, M.; Kinzel, D. L.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Kokeyama, K.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krolak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Laguna, P.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Lazzarini, A.; Lazzaro, C.; Leaci, R.; Leavey, S.; Lebigot, E. O.; Lee, C.H.; Lee, K.H.; Lee, M.H.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B. M.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Logue, J.; Lombardi, A. L.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lueck, H.; Lundgren, A. P.; Luo, J.; Lynch, R.; Ma, Y.; MacDonald, T.T.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magana-Sandoval, F.; Magee, R. M.; Mageswaran, M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Marka, S.; Marka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R.M.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mende, G.; Mendoza-Gandara, D.; Mercer, R. A.; Merilh, E. 
L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B.C.; Moore, J.C.; Moraru, D.; Gutierrez Moreno, M.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, S.D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P.G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nedkova, K.; Nelemans, G.; Gutierrez-Neri, M.; Neunzert, A.; Newton-Howes, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J.; Oh, S. H.; Ohme, F.; Oliver, M. B.; Oppermann, P.; Oram, Richard J.; O'Reilly, B.; O'Shaughnessy, R.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Page, J.; Paris, H. R.; Parker, W.S; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Phelps, M.; Piccinni, O. J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poggiani, R.; Popolizio, P.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Premachandra, S. S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prolchorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Puerrer, M.; Qi, H.; Qin, J.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. 
S.; Radkins, H.; Raffai, P.; Raja, S.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosinska, D.; Rowan, S.; Ruediger, A.; Ruggi, P.; Ryan, K.A.; Sachdev, P.S.; Sadecki, T.; Sadeghian, L.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J; Schmidt, P.; Schnabel, R.B.; Schofield, R. M. S.; Schoenbeck, A.; Schreiber, K.E.C.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, M.S.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Serna, G.; Setyawati, Y.; Sevigny, A.; Shaddock, D. A.; Shah, S.; Shithriar, M. S.; Shaltev, M.; Shao, Z.M.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sigg, D.; Silva, António Dias da; Simakov, D.; Singer, A; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, R. J. E.; Smith, N.D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stone, J.R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S. E.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sutton, P. J.; Swinkels, B. L.; Szczepanczyk, M. J.; Tacca, M.D.; Talukder, D.; Tanner, D. B.; Tapai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, W.R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Tonelli, M.; Torres, C. 
V.; Torrie, C. I.; Toyra, D.; Travasso, F.; Traylor, G.; Trifiro, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlhruch, H.; Vajente, G.; Valdes, G.; Van Bakel, N.; Van Beuzekom, Martin; Van den Brand, J. F. J.; Van Den Broeck, C.F.F.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasuth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, R. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Vicere, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J. -Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D. V.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, MT; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L. -W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.M.; Wessels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; White, D. J.; Whiting, B. F.; Williams, D.; Williams, D.R.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Wright, J.L.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, H.; Yvert, M.; Zadrozny, A.; Zangrando, L.; Zanolin, M.; Zendri, J. -P.; Zevin, M.; Zhang, F.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.

    2016-01-01

    The gravitational-wave signal GW150914 was first identified on September 14, 2015, by searches for short-duration gravitational-wave transients. These searches identify time-correlated transients in multiple detectors with minimal assumptions about the signal morphology, allowing them to be

  12. Deep Borehole Field Test Requirements and Controlled Assumptions.

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

This document presents design requirements and controlled assumptions intended for use in the engineering development and testing of: 1) prototype packages for radioactive waste disposal in deep boreholes; 2) a waste package surface handling system; and 3) a subsurface system for emplacing and retrieving packages in deep boreholes. Engineering development and testing is being performed as part of the Deep Borehole Field Test (DBFT; SNL 2014a). This document presents parallel sets of requirements for a waste disposal system and for the DBFT, showing the close relationship. In addition to design, it will also inform planning for drilling, construction, and scientific characterization activities for the DBFT. The information presented here follows typical preparations for engineering design. It includes functional and operating requirements for handling and emplacement/retrieval equipment, waste package design and emplacement requirements, borehole construction requirements, sealing requirements, and performance criteria. Assumptions are included where they could impact engineering design. Design solutions are avoided in the requirements discussion. ACKNOWLEDGEMENTS: This set of requirements and assumptions has benefited greatly from reviews by Gordon Appel, Geoff Freeze, Kris Kuhlman, Bob MacKinnon, Steve Pye, David Sassani, Dave Sevougian, and Jiann Su.

  13. Basic Research Firing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Basic Research Firing Facility is an indoor ballistic test facility that has recently transitioned from a customer-based facility to a dedicated basic research...

  14. Load assumption for fatigue design of structures and components counting methods, safety aspects, practical application

    CERN Document Server

    Köhler, Michael; Pötter, Kurt; Zenner, Harald

    2017-01-01

    Understanding the fatigue behaviour of structural components under variable load amplitude is an essential prerequisite for safe and reliable light-weight design. For designing and dimensioning, the expected stress (load) is compared with the capacity to withstand loads (fatigue strength). In this process, the safety necessary for each particular application must be ensured. A prerequisite for ensuring the required fatigue strength is a reliable load assumption. The authors describe the transformation of the stress- and load-time functions which have been measured under operational conditions to spectra or matrices with the application of counting methods. The aspects which must be considered for ensuring a reliable load assumption for designing and dimensioning are discussed in detail. Furthermore, the theoretical background for estimating the fatigue life of structural components is explained, and the procedures are discussed for numerous applications in practice. One of the prime intentions of the authors ...
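The counting methods the authors describe reduce a measured load-time history to a spectrum of cycles for fatigue-life estimation. As an illustrative sketch (not the book's own code), the four-point rainflow rule, the most widely used such counting method, can be implemented as follows; the sample series is hypothetical:

```python
def turning_points(series):
    """Keep only the local maxima/minima of a load-time history."""
    tp = [series[0]]
    for x in series[1:]:
        if x == tp[-1]:
            continue
        if len(tp) >= 2 and (tp[-2] < tp[-1] < x or tp[-2] > tp[-1] > x):
            tp[-1] = x  # extend a monotonic run instead of adding a point
        else:
            tp.append(x)
    return tp

def rainflow(series):
    """Simplified four-point rainflow counting.
    Returns (range, mean, count) tuples; count is 1.0 for full cycles
    closed by the four-point rule and 0.5 for residual half cycles."""
    stack, cycles = [], []
    for p in turning_points(series):
        stack.append(p)
        while len(stack) >= 4:
            a, b, c, d = stack[-4:]
            if abs(c - b) <= abs(b - a) and abs(c - b) <= abs(d - c):
                # the inner pair b-c forms a closed full cycle
                cycles.append((abs(c - b), (b + c) / 2.0, 1.0))
                del stack[-3:-1]
            else:
                break
    # whatever remains on the stack is counted as half cycles
    for i in range(len(stack) - 1):
        cycles.append((abs(stack[i + 1] - stack[i]),
                       (stack[i] + stack[i + 1]) / 2.0, 0.5))
    return cycles

cycles = rainflow([0, 2, 1, 3, 0])  # hypothetical load history
```

The resulting range/mean pairs are what get accumulated into the spectra and matrices discussed above.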

  15. Some basic thermohydraulic calculation methods for the analysis of pressure transients in a multicompartment total containment enclosing a breached water reactor circuit

    International Nuclear Information System (INIS)

    Porter, W.H.L.

    1976-05-01

    This paper gives an appreciation and commentary of the basic calculation methods under development at AEE Winfrith for the analysis of multicompartment total containments. The assumptions introduced and the effects of their variation are important in establishing a parametric survey of the range of possible conditions which the containment may be required to meet. These aspects of the performance will be discussed as each individual factor in the train of events is examined in turn. (U.K.)

  16. On the validity of Brownian assumptions in the spin van der Waals model

    International Nuclear Information System (INIS)

    Oh, Suhk Kun

    1985-01-01

    A simple Brownian motion theory of the spin van der Waals model, which can be stationary, Markoffian or Gaussian, is studied. By comparing the Brownian motion theory with an exact theory called the generalized Langevin equation theory, the validity of the Brownian assumptions is tested. Thereby, it is shown explicitly how the Markoffian and Gaussian properties are modified in the spin van der Waals model under the influence of quantum fluctuations and long range ordering. (Author)
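The stationary, Markoffian, Gaussian assumptions being tested can be illustrated with the prototypical process that satisfies all three, the Ornstein-Uhlenbeck process (a generic sketch, not the spin van der Waals model itself); its simulated long-run variance should approach D/gamma:

```python
import math
import random

def simulate_ou(gamma=1.0, D=0.5, dt=0.01, steps=200_000, seed=42):
    """Euler-Maruyama integration of dx = -gamma*x dt + sqrt(2D) dW,
    the prototypical stationary, Markoffian, Gaussian process."""
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(steps):
        x += -gamma * x * dt + math.sqrt(2 * D * dt) * rng.gauss(0.0, 1.0)
        samples.append(x)
    return samples

samples = simulate_ou()
stationary = samples[50_000:]  # discard the initial transient
variance = sum(s * s for s in stationary) / len(stationary)
# variance should be close to the theoretical D/gamma = 0.5
```

Quantum fluctuations and long-range ordering, as the abstract notes, are precisely what break this simple picture in the spin van der Waals model.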

  17. Clinical review: Moral assumptions and the process of organ donation in the intensive care unit

    OpenAIRE

    Streat, Stephen

    2004-01-01

    The objective of the present article is to review moral assumptions underlying organ donation in the intensive care unit. Data sources used include personal experience, and a Medline search and a non-Medline search of relevant English-language literature. The study selection included articles concerning organ donation. All data were extracted and analysed by the author. In terms of data synthesis, a rational, utilitarian moral perspective dominates, and has captured and circumscribed, the lan...

  18. Basic Cake Decorating Workbook.

    Science.gov (United States)

    Bogdany, Mel

    Included in this student workbook for basic cake decorating are the following: (1) Drawings of steps in a basic way to ice a layer cake, how to make a paper cone, various sizes of flower nails, various sizes and types of tin pastry tubes, and special rose tubes; (2) recipes for basic decorating icings (buttercream, rose paste, and royal icing);…

  19. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

    A number of waste life cycle assessment (LCA) models have been gradually developed since the early 1990s, in a number of countries, usually independently from each other. Large discrepancies in results have been observed among different waste LCA models, although it has also been shown that results...... from different LCA studies can be consistent. This paper is an attempt to identify, review and analyse methodologies and technical assumptions used in various parts of selected waste LCA models. Several criteria were identified, which could have significant impacts on the results......, such as the functional unit, system boundaries, waste composition and energy modelling. The modelling assumptions of waste management processes, ranging from collection, transportation, intermediate facilities, recycling, thermal treatment, biological treatment, and landfilling, are obviously critical when comparing...

  20. Validity of the mockwitness paradigm: testing the assumptions.

    Science.gov (United States)

    McQuiston, Dawn E; Malpass, Roy S

    2002-08-01

    Mockwitness identifications are used to provide a quantitative measure of lineup fairness. Some theoretical and practical assumptions of this paradigm have not been studied in terms of mockwitnesses' decision processes and procedural variation (e.g., instructions, lineup presentation method), and the current experiment was conducted to empirically evaluate these assumptions. Four hundred and eighty mockwitnesses were given physical information about a culprit, received 1 of 4 variations of lineup instructions, and were asked to identify the culprit from either a fair or unfair sequential lineup containing 1 of 2 targets. Lineup bias estimates varied as a result of lineup fairness and the target presented. Mockwitnesses generally reported that the target's physical description was their main source of identifying information. Our findings support the use of mockwitness identifications as a useful technique for sequential lineup evaluation, but only for mockwitnesses who selected only 1 lineup member. Recommendations for the use of this evaluation procedure are discussed.
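The quantitative measure behind the paradigm can be sketched as follows (a minimal illustration; the function name and data are hypothetical, and real lineup evaluations also use measures such as effective size):

```python
def lineup_bias(choices, suspect_id, lineup_size):
    """Mock-witness lineup bias: the proportion of choosers who pick the
    suspect, minus the chance rate 1/lineup_size. Values near zero
    suggest a fair lineup; large positive values suggest the lineup is
    biased toward the suspect."""
    pickers = [c for c in choices if c is not None]  # None = no selection
    if not pickers:
        return 0.0
    p_suspect = sum(1 for c in pickers if c == suspect_id) / len(pickers)
    return p_suspect - 1.0 / lineup_size

# 10 hypothetical mock witnesses, suspect in position 1 of a 6-person lineup
bias = lineup_bias([1, 1, 1, 2, 3, 1, None, 4, 1, 1],
                   suspect_id=1, lineup_size=6)
```

Here 6 of 9 choosers picked the suspect (0.667) against a chance rate of 1/6, giving a bias of 0.5, a strongly unfair lineup under this measure.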

  1. Sensitivity of probabilistic MCO water content estimates to key assumptions

    International Nuclear Information System (INIS)

    DUNCAN, D.R.

    1999-01-01

Sensitivity of probabilistic multi-canister overpack (MCO) water content estimates to key assumptions is evaluated, with emphasis on the largest non-cladding-film contributors: water borne by particulates adhering to damage sites, and water borne by canister particulate. The calculations considered different choices for the degree of independence of damage states, different choices of percentile for reference high inputs, three types of input probability density functions (pdfs): triangular, log-normal, and Weibull, and the number of scrap baskets in an MCO.
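The kind of sensitivity check described, swapping the input pdf family while holding the central value roughly fixed and watching the tail estimate move, can be sketched with a small Monte Carlo comparison (all parameter values below are illustrative placeholders, not figures from the report):

```python
import math
import random

def percentile(values, q):
    """Empirical q-quantile (0 <= q < 1) of a sample."""
    ordered = sorted(values)
    return ordered[min(int(q * len(ordered)), len(ordered) - 1)]

def tail_estimates(n=100_000, seed=1):
    """95th-percentile estimate under three candidate input pdfs with
    comparable central values (hypothetical parameters throughout)."""
    rng = random.Random(seed)
    samples = {
        "triangular": [rng.triangular(0.0, 2.0, 0.5) for _ in range(n)],
        "log-normal": [rng.lognormvariate(math.log(0.5), 0.6) for _ in range(n)],
        "weibull": [rng.weibullvariate(0.6, 1.5) for _ in range(n)],
    }
    return {name: percentile(xs, 0.95) for name, xs in samples.items()}

estimates = tail_estimates()
# the high-percentile estimate shifts noticeably with the pdf family alone
```

Even with matched central values, the three families give visibly different 95th percentiles, which is why the choice of input pdf is itself a key assumption.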

  2. Spatial Angular Compounding for Elastography without the Incompressibility Assumption

    OpenAIRE

    Rao, Min; Varghese, Tomy

    2005-01-01

    Spatial-angular compounding is a new technique that enables the reduction of noise artifacts in ultrasound elastography. Previous results using spatial angular compounding, however, were based on the use of the tissue incompressibility assumption. Compounded elastograms were obtained from a spatially-weighted average of local strain estimated from radiofrequency echo signals acquired at different insonification angles. In this paper, we present a new method for reducing the noise artifacts in...

  3. Estimators for longitudinal latent exposure models: examining measurement model assumptions.

    Science.gov (United States)

    Sánchez, Brisa N; Kim, Sehee; Sammel, Mary D

    2017-06-15

Latent variable (LV) models are increasingly being used in environmental epidemiology as a way to summarize multiple environmental exposures and thus minimize statistical concerns that arise in multiple regression. LV models may be especially useful when multivariate exposures are collected repeatedly over time. LV models can accommodate a variety of assumptions but, at the same time, present the user with many choices for model specification particularly in the case of exposure data collected repeatedly over time. For instance, the user could assume conditional independence of observed exposure biomarkers given the latent exposure and, in the case of longitudinal latent exposure variables, time invariance of the measurement model. Choosing which assumptions to relax is not always straightforward. We were motivated by a study of prenatal lead exposure and mental development, where assumptions of the measurement model for the time-changing longitudinal exposure have appreciable impact on (maximum-likelihood) inferences about the health effects of lead exposure. Although we were not particularly interested in characterizing the change of the LV itself, imposing a longitudinal LV structure on the repeated multivariate exposure measures could result in high efficiency gains for the exposure-disease association. We examine the biases of maximum likelihood estimators when assumptions about the measurement model for the longitudinal latent exposure variable are violated. We adapt existing instrumental variable estimators to the case of longitudinal exposures and propose them as an alternative to estimate the health effects of a time-changing latent predictor. We show that instrumental variable estimators remain unbiased for a wide range of data generating models and have advantages in terms of mean squared error. Copyright © 2017 John Wiley & Sons, Ltd.
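The core instrumental-variable idea invoked here, using one error-prone measurement of a latent exposure as an instrument for another so that the exposure-outcome slope is not attenuated by measurement error, can be sketched on simulated data (a generic illustration of the IV principle, not the authors' estimator):

```python
import random

def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sum((a - mx) ** 2 for a in x)

def iv_slope(z, x, y):
    """Wald instrumental-variable estimator: cov(z, y) / cov(z, x)."""
    n = len(z)
    mz, mx, my = sum(z) / n, sum(x) / n, sum(y) / n
    czy = sum((a - mz) * (b - my) for a, b in zip(z, y))
    czx = sum((a - mz) * (b - mx) for a, b in zip(z, x))
    return czy / czx

rng = random.Random(0)
beta = 2.0                                   # true effect of the latent exposure
latent = [rng.gauss(0, 1) for _ in range(50_000)]
x = [u + rng.gauss(0, 1) for u in latent]    # biomarker 1 (error-prone)
z = [u + rng.gauss(0, 1) for u in latent]    # biomarker 2, independent error
y = [beta * u + rng.gauss(0, 0.5) for u in latent]

naive = ols_slope(x, y)        # attenuated toward zero (about beta/2 here)
corrected = iv_slope(z, x, y)  # consistent for beta
```

Regressing on the noisy biomarker halves the estimated slope in this setup, while the IV ratio recovers the true coefficient, which is the bias-robustness property the abstract highlights.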

  4. Data-driven smooth tests of the proportional hazards assumption

    Czech Academy of Sciences Publication Activity Database

    Kraus, David

    2007-01-01

    Roč. 13, č. 1 (2007), s. 1-16 ISSN 1380-7870 R&D Projects: GA AV ČR(CZ) IAA101120604; GA ČR(CZ) GD201/05/H007 Institutional research plan: CEZ:AV0Z10750506 Keywords : Cox model * Neyman's smooth test * proportional hazards assumption * Schwarz's selection rule Subject RIV: BA - General Mathematics Impact factor: 0.491, year: 2007

  5. Assumptions behind size-based ecosystem models are realistic

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Blanchard, Julia L.; Fulton, Elizabeth A.

    2016-01-01

    A recent publication about balanced harvesting (Froese et al., ICES Journal of Marine Science; doi:10.1093/icesjms/fsv122) contains several erroneous statements about size-spectrum models. We refute the statements by showing that the assumptions pertaining to size-spectrum models discussed by Fro...... that there is indeed a constructive role for a wide suite of ecosystem models to evaluate fishing strategies in an ecosystem context...

  6. The incompressibility assumption in computational simulations of nasal airflow.

    Science.gov (United States)

    Cal, Ismael R; Cercos-Pita, Jose Luis; Duque, Daniel

    2017-06-01

    Most of the computational works on nasal airflow up to date have assumed incompressibility, given the low Mach number of these flows. However, for high temperature gradients, the incompressibility assumption could lead to a loss of accuracy, due to the temperature dependence of air density and viscosity. In this article we aim to shed some light on the influence of this assumption in a model of calm breathing in an Asian nasal cavity, by solving the fluid flow equations in compressible and incompressible formulation for different ambient air temperatures using the OpenFOAM package. At low flow rates and warm climatological conditions, similar results were obtained from both approaches, showing that density variations need not be taken into account to obtain a good prediction of all flow features, at least for usual breathing conditions. This agrees with most of the simulations previously reported, at least as far as the incompressibility assumption is concerned. However, parameters like nasal resistance and wall shear stress distribution differ for air temperatures below [Formula: see text]C approximately. Therefore, density variations should be considered for simulations at such low temperatures.
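The rule of thumb behind the incompressibility assumption can be made concrete: a flow is usually treated as incompressible when its Mach number stays below about 0.3 (density changes under roughly 5%). A minimal check for nasal airflow follows; the velocity is an illustrative order of magnitude for calm breathing, not a value taken from the article:

```python
import math

GAMMA = 1.4      # ratio of specific heats for air
R_AIR = 287.05   # specific gas constant for dry air, J/(kg*K)

def mach_number(velocity_m_s, temperature_k):
    """Mach number of an airflow; the incompressible treatment is the
    usual rule of thumb for Ma < 0.3."""
    speed_of_sound = math.sqrt(GAMMA * R_AIR * temperature_k)
    return velocity_m_s / speed_of_sound

# peak airway velocities in calm breathing are on the order of a few m/s
ma = mach_number(3.0, 293.15)
```

The Mach number comes out far below 0.3, which is why compressibility effects enter these simulations through temperature-dependent density and viscosity rather than through flow speed.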

  7. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Directory of Open Access Journals (Sweden)

    Anne Hsu

Full Text Available A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.
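The contrast between strong and weak sampling can be shown with a two-grammar toy model exploiting the size principle (hypothetical grammar sizes and equal priors; this is not the authors' experimental material):

```python
def posterior_restrictive(n_observed, n_constructions_broad=4,
                          n_constructions_restrictive=3, strong=True):
    """Posterior probability of the restrictive grammar after observing
    n sentences, none of which uses the extra construction. Under strong
    sampling each grammar spreads probability over its constructions
    (the size principle), so absence is evidence; under weak sampling
    absence is uninformative. Toy grammars, equal priors."""
    if strong:
        like_r = (1.0 / n_constructions_restrictive) ** n_observed
        like_b = (1.0 / n_constructions_broad) ** n_observed
    else:
        like_r = like_b = 1.0  # absence carries no evidence
    return like_r / (like_r + like_b)
```

Under strong sampling the posterior for the restrictive grammar climbs steadily as the unobserved construction keeps failing to appear, which is exactly the indirect negative evidence discussed above; under weak sampling it stays at the prior.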

  8. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning

    Science.gov (United States)

    2016-01-01

A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning. PMID:27310576

  9. Experimental data from irradiation of physical detectors disclose weaknesses in basic assumptions of the δ ray theory of track structure

    DEFF Research Database (Denmark)

    Olsen, K. J.; Hansen, Jørgen-Walther

    1985-01-01

    The applicability of track structure theory has been tested by comparing predictions based on the theory with experimental high-LET dose-response data for an amino acid alanine and a nylon based radiochromic dye film radiation detector. The linear energy transfer LET, has been varied from 28...

  10. Basic digital signal processing

    CERN Document Server

    Lockhart, Gordon B

    1985-01-01

    Basic Digital Signal Processing describes the principles of digital signal processing and experiments with BASIC programs involving the fast Fourier theorem (FFT). The book reviews the fundamentals of the BASIC program, continuous and discrete time signals including analog signals, Fourier analysis, discrete Fourier transform, signal energy, power. The text also explains digital signal processing involving digital filters, linear time-variant systems, discrete time unit impulse, discrete-time convolution, and the alternative structure for second order infinite impulse response (IIR) sections.
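The FFT at the heart of the book's experiments can be sketched in a few lines; here is a Python rather than BASIC version of the radix-2 Cooley-Tukey recursion (an illustration, not one of the book's listings):

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of 2."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])   # DFT of even-indexed samples
    odd = fft(x[1::2])    # DFT of odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle
        out[k + n // 2] = even[k] - twiddle
    return out
```

A constant input concentrates all energy in the DC bin, and a pure alternating-sign-free sine pattern lands in the expected frequency bins, the standard sanity checks for any FFT implementation.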

  11. Hydromechanics - basic properties

    International Nuclear Information System (INIS)

    Lee, Sung Tak; Lee, Je Geun

    1987-03-01

This book covers the basics of hydromechanics: fundamental properties such as concepts and definitions, mass, force and weight, and perfect fluids and perfect gases; hydrostatics, including its basic equations and relative equilibrium; and the kinematics of fluid flow, including methods of describing flow, the momentum equation, the energy equation and applications of the Bernoulli equation, applications of momentum theory, inviscid flow, and flow measurement.

  12. Basic molecular spectroscopy

    CERN Document Server

    Gorry, PA

    1985-01-01

    BASIC Molecular Spectroscopy discusses the utilization of the Beginner's All-purpose Symbolic Instruction Code (BASIC) programming language in molecular spectroscopy. The book is comprised of five chapters that provide an introduction to molecular spectroscopy through programs written in BASIC. The coverage of the text includes rotational spectra, vibrational spectra, and Raman and electronic spectra. The book will be of great use to students who are currently taking a course in molecular spectroscopy.

  13. 5 CFR 551.401 - Basic principles.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Basic principles. 551.401 Section 551.401 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY ADMINISTRATION UNDER THE FAIR LABOR STANDARDS ACT Hours of Work General Provisions § 551.401 Basic principles. (a) All time...

  14. THE COMPLEX OF ASSUMPTION CATHEDRAL OF THE ASTRAKHAN KREMLIN

    Directory of Open Access Journals (Sweden)

    Savenkova Aleksandra Igorevna

    2016-08-01

Full Text Available This article is devoted to an architectural and historical analysis of the constructions forming the complex of the Assumption Cathedral of the Astrakhan Kremlin, which has not previously been the subject of special research. Based on archival sources, photographic materials, publications and on-site investigations of the monuments, the article traces the creation of a complete architectural complex, unique in its composition and sustained in a single style, the Muscovite baroque, and offers an interpretation of it in the all-Russian architectural context. Typological features of the individual constructions are brought to light. The Prechistinsky bell tower has an atypical architectural solution: a hexagonal structure set on octagonal and quadrangular structures. Connecting the Cathedral building and the chambers by a passage was characteristic of monastic construction and exceedingly rare in kremlins, farmsteads and ensembles of city cathedrals. The composite scheme of the Assumption Cathedral includes the Lobnoye Mesto ("Place of Execution") located on an axis to the west and connected with the main building by a quarter-turn stair with a landing. Its only prototype is the Lobnoye Mesto on Red Square in Moscow. The article considers the version that the Place of Execution arose on the basis of an earlier structure, a tower called "the Peal", repeatedly mentioned in written sources in connection with S. Razin's revolt. The metropolitan Sampson, seeking to preserve the standing of the Astrakhan metropolitanate, built the Assumption Cathedral and the Place of Execution in direct appeal to a capital prototype, emphasizing continuity and close connection with Moscow.

  15. Are Prescription Opioids Driving the Opioid Crisis? Assumptions vs Facts.

    Science.gov (United States)

    Rose, Mark Edmund

    2018-04-01

    Sharp increases in opioid prescriptions, and associated increases in overdose deaths in the 2000s, evoked widespread calls to change perceptions of opioid analgesics. Medical literature discussions of opioid analgesics began emphasizing patient and public health hazards. Repetitive exposure to this information may influence physician assumptions. While highly consequential to patients with pain whose function and quality of life may benefit from opioid analgesics, current assumptions about prescription opioid analgesics, including their role in the ongoing opioid overdose epidemic, have not been scrutinized. Information was obtained by searching PubMed, governmental agency websites, and conference proceedings. Opioid analgesic prescribing and associated overdose deaths both peaked around 2011 and are in long-term decline; the sharp overdose increase recorded in 2014 was driven by illicit fentanyl and heroin. Nonmethadone prescription opioid analgesic deaths, in the absence of co-ingested benzodiazepines, alcohol, or other central nervous system/respiratory depressants, are infrequent. Within five years of initial prescription opioid misuse, 3.6% initiate heroin use. The United States consumes 80% of the world opioid supply, but opioid access is nonexistent for 80% and severely restricted for 4.1% of the global population. Many current assumptions about opioid analgesics are ill-founded. Illicit fentanyl and heroin, not opioid prescribing, now fuel the current opioid overdose epidemic. National discussion has often neglected the potentially devastating effects of uncontrolled chronic pain. Opioid analgesic prescribing and related overdoses are in decline, at great cost to patients with pain who have benefited or may benefit from, but cannot access, opioid analgesic therapy.

  16. Questioning the foundations of physics which of our fundamental assumptions are wrong?

    CERN Document Server

    Foster, Brendan; Merali, Zeeya

    2015-01-01

The essays in this book look at ways in which the foundations of physics might need to be changed in order to make progress towards a unified theory. They are based on the prize-winning essays submitted to the FQXi essay competition “Which of Our Basic Physical Assumptions Are Wrong?”, which drew over 270 entries. As Nobel Laureate physicist Philip W. Anderson realized, the key to understanding nature’s reality is not anything “magical”, but the right attitude, “the focus on asking the right questions, the willingness to try (and to discard) unconventional answers, the sensitive ear for phoniness, self-deception, bombast, and conventional but unproven assumptions.” The authors of the eighteen prize-winning essays have, where necessary, adapted their essays for the present volume so as to (a) incorporate the community feedback generated in the online discussion of the essays, (b) add new material that has come to light since their completion and (c) to ensure accessibility to a broad audience of re...

  17. Moral dilemmas in professions of public trust and the assumptions of ethics of social consequences

    Directory of Open Access Journals (Sweden)

    Dubiel-Zielińska Paulina

    2016-06-01

Full Text Available The aim of the article is to show how the assumptions of ethics of social consequences can be applied when making decisions about actions, including in situations of moral dilemma, by persons who perform occupations of public trust on a daily basis. The reasoning in the article is analytical and synthetic. The article begins by explaining the basic concepts of “profession” and “profession of public trust” and the difference between these terms. This is followed by a general description of professions of public trust. The area and definition of moral dilemmas are then set out, and representatives of the professions in which they arise are listed. After a brief characterization of the axiological foundations and main assumptions of ethics of social consequences, actions according to Vasil Gluchman and Włodzimierz Galewicz are discussed, and actions in line with ethics of social consequences are transferred to the practical domain. The article points out that actions in professional life can be obligatory, impermissible, permissible, supererogatory or unmarked in the moral dimension. The final part of the article reflects on how to solve moral dilemmas from the position of a representative of a profession of public trust. The article concludes with a summary of the conclusions that stem from ethics of social consequences for professions of public trust, followed by short examples.

  18. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality an

  19. First assumptions and overlooking competing causes of death

    DEFF Research Database (Denmark)

    Leth, Peter Mygind; Andersen, Anh Thao Nguyen

    2014-01-01

    Determining the most probable cause of death is important, and it is sometimes tempting to assume an obvious cause of death, when it readily presents itself, and stop looking for other competing causes of death. The case story presented in the article illustrates this dilemma. The first assumption...... of cause of death, which was based on results from bacteriology tests, proved to be wrong when the results from the forensic toxicology testing became available. This case also illustrates how post mortem computed tomography (PMCT) findings of radio opaque material in the stomach alerted the pathologist...

  20. Assumptions of Corporate Social Responsibility as Competitiveness Factor

    Directory of Open Access Journals (Sweden)

    Zaneta Simanaviciene

    2017-09-01

Full Text Available The purpose of this study was to examine the assumptions of corporate social responsibility (CSR) as a competitiveness factor in an economic downturn. The findings indicate that competitiveness at the micro-level is shaped by factors affecting the quality of the microeconomic business environment: the sophistication of the enterprise’s strategy and management processes, the quality of human capital resources, growth in product or service demand, the development of related and supporting sectors, the efficient use of natural resources, and the enterprise’s competitive capacities. The outcomes suggest that implementing the elements of CSR, i.e., economic, environmental and social responsibility, offers good opportunities to increase business competitiveness.

  1. ψ -ontology result without the Cartesian product assumption

    Science.gov (United States)

    Myrvold, Wayne C.

    2018-05-01

We introduce a weakening of the preparation independence postulate of Pusey et al. [Nat. Phys. 8, 475 (2012), 10.1038/nphys2309] that does not presuppose that the space of ontic states resulting from a product-state preparation can be represented by the Cartesian product of subsystem state spaces. On the basis of this weakened assumption, it is shown that, in any model that reproduces the quantum probabilities, any pair of pure quantum states |ψ⟩, |ϕ⟩ with |⟨ψ|ϕ⟩| ≤ 1/√2 must be ontologically distinct.
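The overlap condition in the result is straightforward to evaluate for explicit state vectors; a minimal numerical check (the states below are illustrative examples only):

```python
import math

def overlap(psi, phi):
    """|<psi|phi>| for finite-dimensional state vectors given as
    lists of complex amplitudes."""
    inner = sum(a.conjugate() * b for a, b in zip(psi, phi))
    return abs(inner)

s = 1 / math.sqrt(2)
ket0 = [1 + 0j, 0j]          # |0>
ket_plus = [s + 0j, s + 0j]  # |+> = (|0> + |1>)/sqrt(2)
# |<0|+>| = 1/sqrt(2): exactly the boundary of the theorem's condition
```

The pair |0⟩, |+⟩ sits precisely on the 1/√2 boundary, so under the theorem any model reproducing the quantum probabilities must represent them by distinct ontic states.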

  2. Unconditionally Secure and Universally Composable Commitments from Physical Assumptions

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Scafuro, Alessandra

    2013-01-01

    We present a constant-round unconditional black-box compiler that transforms any ideal (i.e., statistically-hiding and statistically-binding) straight-line extractable commitment scheme into an extractable and equivocal commitment scheme, thereby yielding UC-security [9]. We exemplify the u...... of unconditional UC-security with (malicious) PUFs and stateless tokens, our compiler can be instantiated with any ideal straight-line extractable commitment scheme, thus allowing the use of various setup assumptions which may better fit the application or the technology available....

  3. Finding Basic Writing's Place.

    Science.gov (United States)

    Sheridan-Rabideau, Mary P.; Brossell, Gordon

    1995-01-01

    Posits that basic writing serves a vital function by providing writing support for at-risk students and serves the needs of a growing student population that universities accept yet feel needs additional writing instruction. Concludes that the basic writing classroom is the most effective educational support for at-risk students and their writing.…

  4. Biomass Energy Basics | NREL

    Science.gov (United States)

    Biomass Energy Basics We have used biomass energy, or "bioenergy," ever since people started burning wood to cook food or keep warm. Wood is still the largest biomass energy resource today, but other sources of biomass can also be used, and even the fumes from landfills (which are methane, the main component in natural gas) can be used as a biomass energy source.

  5. Wind Energy Basics | NREL

    Science.gov (United States)

    Wind Energy Basics We have been harnessing the wind's energy for hundreds of years, with windmills used for pumping water or grinding grain. Today, the windmill's modern equivalent, a wind turbine, can use the wind's energy to generate electricity. Turbines are mounted on tall towers to capture the most energy: at 100 feet (30 meters) or more aboveground, they can take advantage of the faster and less turbulent wind.

  6. Solar Energy Basics | NREL

    Science.gov (United States)

    Solar Energy Basics Solar is the Latin word for sun, a powerful source of energy that can be used to heat, cool, and light our homes and businesses. A variety of technologies convert sunlight to usable energy for buildings. The most commonly used solar technologies for ...

  7. Learning Visual Basic NET

    CERN Document Server

    Liberty, Jesse

    2009-01-01

    Learning Visual Basic .NET is a complete introduction to VB.NET and object-oriented programming. By using hundreds of examples, this book demonstrates how to develop various kinds of applications--including those that work with databases--and web services. Learning Visual Basic .NET will help you build a solid foundation in .NET.

  8. Health Insurance Basics

    Science.gov (United States)

    Health Insurance Basics (KidsHealth / For Teens). What's ... thought advanced calculus was confusing. What Exactly Is Health Insurance? Health insurance is a plan that people buy ...

  9. Body Basics Library

    Science.gov (United States)

    ... Body Basics articles explain just how each body system, part, and process works. Use this medical library to find out about basic human anatomy, how ... Note: All information on TeensHealth® is for ...

  10. Sugar Cane Genome Numbers Assumption by Ribosomal DNA FISH Techniques

    NARCIS (Netherlands)

    Thumjamras, S.; Jong, de H.; Iamtham, S.; Prammanee, S.

    2013-01-01

    Conventional cytological methods are limited for polyploid plant genome study, especially for sugar cane chromosomes, which show unstable numbers in each cultivar. Molecular cytogenetic techniques such as fluorescent in situ hybridization (FISH) were used in this study. A basic chromosome number of sugar cane ...

  11. Basic Research Needs for Countering Terrorism

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, W.; Michalske, T.; Trewhella, J.; Makowski, L.; Swanson, B.; Colson, S.; Hazen, T.; Roberto, F.; Franz, D.; Resnick, G.; Jacobson, S.; Valdez, J.; Gourley, P.; Tadros, M.; Sigman, M.; Sailor, M.; Ramsey, M.; Smith, B.; Shea, K.; Hrbek, J.; Rodacy, P.; Tevault, D.; Edelstein, N.; Beitz, J.; Burns, C.; Choppin, G.; Clark, S.; Dietz, M.; Rogers, R.; Traina, S.; Baldwin, D.; Thurnauer, M.; Hall, G.; Newman, L.; Miller, D.; Kung, H.; Parkin, D.; Shuh, D.; Shaw, H.; Terminello, L.; Meisel, D.; Blake, D.; Buchanan, M.; Roberto, J.; Colson, S.; Carling, R.; Samara, G.; Sasaki, D.; Pianetta, P.; Faison, B.; Thomassen, D.; Fryberger, T.; Kiernan, G.; Kreisler, M.; Morgan, L.; Hicks, J.; Dehmer, J.; Kerr, L.; Smith, B.; Mays, J.; Clark, S.

    2002-03-01

    The purpose of this report is to identify connections between technology needs for countering terrorism and underlying science issues, and to recommend investment strategies to increase the impact of basic research on efforts to counter terrorism.

  12. Drug policy in sport: hidden assumptions and inherent contradictions.

    Science.gov (United States)

    Smith, Aaron C T; Stewart, Bob

    2008-03-01

    This paper considers the assumptions underpinning the current drugs-in-sport policy arrangements. We examine the assumptions and contradictions inherent in the policy approach, paying particular attention to the evidence that supports different policy arrangements. We find that the current anti-doping policy of the World Anti-Doping Agency (WADA) contains inconsistencies and ambiguities. WADA's policy position is predicated upon four fundamental principles: first, the need for sport to set a good example; secondly, the necessity of ensuring a level playing field; thirdly, the responsibility to protect the health of athletes; and fourthly, the importance of preserving the integrity of sport. A review of the evidence, however, suggests that sport is a problematic institution when it comes to setting a good example for the rest of society. Neither is it clear that sport has an inherent or essential integrity that can only be sustained through regulation. Furthermore, it is doubtful that WADA's anti-doping policy is effective in maintaining a level playing field, or is the best means of protecting the health of athletes. The WADA anti-doping policy is based too heavily on principles of minimising drug use, and gives insufficient weight to the minimisation of drug-related harms. As a result, drug-related harms are being poorly managed in sport. We argue that anti-doping policy in sport would benefit from placing greater emphasis on a harm minimisation model.

  13. The extended evolutionary synthesis: its structure, assumptions and predictions

    Science.gov (United States)

    Laland, Kevin N.; Uller, Tobias; Feldman, Marcus W.; Sterelny, Kim; Müller, Gerd B.; Moczek, Armin; Jablonka, Eva; Odling-Smee, John

    2015-01-01

    Scientific activities take place within the structured sets of ideas and assumptions that define a field and its practices. The conceptual framework of evolutionary biology emerged with the Modern Synthesis in the early twentieth century and has since expanded into a highly successful research program to explore the processes of diversification and adaptation. Nonetheless, the ability of that framework satisfactorily to accommodate the rapid advances in developmental biology, genomics and ecology has been questioned. We review some of these arguments, focusing on literatures (evo-devo, developmental plasticity, inclusive inheritance and niche construction) whose implications for evolution can be interpreted in two ways—one that preserves the internal structure of contemporary evolutionary theory and one that points towards an alternative conceptual framework. The latter, which we label the ‘extended evolutionary synthesis' (EES), retains the fundaments of evolutionary theory, but differs in its emphasis on the role of constructive processes in development and evolution, and reciprocal portrayals of causation. In the EES, developmental processes, operating through developmental bias, inclusive inheritance and niche construction, share responsibility for the direction and rate of evolution, the origin of character variation and organism–environment complementarity. We spell out the structure, core assumptions and novel predictions of the EES, and show how it can be deployed to stimulate and advance research in those fields that study or use evolutionary biology. PMID:26246559

  14. Stable isotopes and elasmobranchs: tissue types, methods, applications and assumptions.

    Science.gov (United States)

    Hussey, N E; MacNeil, M A; Olin, J A; McMeans, B C; Kinney, M J; Chapman, D D; Fisk, A T

    2012-04-01

    Stable-isotope analysis (SIA) can act as a powerful ecological tracer with which to examine diet, trophic position and movement, as well as more complex questions pertaining to community dynamics and feeding strategies or behaviour among aquatic organisms. With major advances in the understanding of the methodological approaches and assumptions of SIA through dedicated experimental work in the broader literature coupled with the inherent difficulty of studying typically large, highly mobile marine predators, SIA is increasingly being used to investigate the ecology of elasmobranchs (sharks, skates and rays). Here, the current state of SIA in elasmobranchs is reviewed, focusing on available tissues for analysis, methodological issues relating to the effects of lipid extraction and urea, the experimental dynamics of isotopic incorporation, diet-tissue discrimination factors, estimating trophic position, diet and mixing models and individual specialization and niche-width analyses. These areas are discussed in terms of assumptions made when applying SIA to the study of elasmobranch ecology and the requirement that investigators standardize analytical approaches. Recommendations are made for future SIA experimental work that would improve understanding of stable-isotope dynamics and advance their application in the study of sharks, skates and rays. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.
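
    As an illustration of one of the assumptions the review discusses, the common additive estimate of trophic position from nitrogen isotopes (in the style of Post 2002) can be sketched as follows. The baseline trophic level and discrimination factor used below are conventional assumed values, not figures from this paper, and the review's point is precisely that diet-tissue discrimination factors for elasmobranchs are themselves uncertain.

    ```python
    # Hedged sketch of the standard additive trophic-position estimate used in
    # stable-isotope ecology. All numbers are illustrative; the per-step
    # discrimination factor (delta_n) is an assumption that should be chosen
    # with care for elasmobranchs.

    def trophic_position(d15n_consumer, d15n_base, lam=2.0, delta_n=3.4):
        """Trophic position from nitrogen stable-isotope ratios (per mil).

        lam     -- trophic position of the baseline organism (2 = primary consumer)
        delta_n -- assumed per-step diet-tissue discrimination factor (per mil)
        """
        return lam + (d15n_consumer - d15n_base) / delta_n

    # A hypothetical shark tissue at 14.8 per mil against a baseline of 8.0:
    print(trophic_position(14.8, 8.0))  # 4.0
    ```

    Shifting `delta_n` by even 1 per mil moves the estimate by a substantial fraction of a trophic level, which is why the review urges standardized, taxon-appropriate values.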

  15. Has the "Equal Environments" assumption been tested in twin studies?

    Science.gov (United States)

    Eaves, Lindon; Foley, Debra; Silberg, Judy

    2003-12-01

    A recurring criticism of the twin method for quantifying genetic and environmental components of human differences is the necessity of the so-called "equal environments assumption" (EEA) (i.e., that monozygotic and dizygotic twins experience equally correlated environments). It has been proposed to test the EEA by stratifying twin correlations by indices of the amount of shared environment. However, relevant environments may also be influenced by genetic differences. We present a model for the role of genetic factors in niche selection by twins that may account for variation in indices of the shared twin environment (e.g., contact between members of twin pairs). Simulations reveal that stratification of twin correlations by amount of contact can yield spurious evidence of large shared environmental effects in some strata and even give false indications of genotype x environment interaction. The stratification approach to testing the equal environments assumption may be misleading and the results of such tests may actually be consistent with a simpler theory of the role of genetic factors in niche selection.

  16. Halo-Independent Direct Detection Analyses Without Mass Assumptions

    CERN Document Server

    Anderson, Adam J.; Kahn, Yonatan; McCullough, Matthew

    2015-10-06

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the $m_\chi-\sigma_n$ plane. Recently methods which are independent of the DM halo velocity distribution have been developed which present results in the $v_{min}-\tilde{g}$ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from $v_{min}$ to nuclear recoil momentum ($p_R$), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call $\tilde{h}(p_R)$. The entire family of conventional halo-independent $\tilde{g}(v_{min})$ plots for all DM masses are directly found from the single $\tilde{h}(p_R)$ plot through a simple re...
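
    The change of variables motivating this approach follows from standard elastic-scattering kinematics; the notation below is assumed for illustration rather than quoted from the paper:

    ```latex
    % Standard elastic nuclear-recoil kinematics: for recoil energy E_R on a
    % nucleus of mass m_N, the minimum DM speed and the recoil momentum are
    v_{\min} = \sqrt{\frac{m_N E_R}{2\mu_{\chi N}^2}}, \qquad
    p_R = \sqrt{2\, m_N E_R}
    \quad\Longrightarrow\quad
    v_{\min} = \frac{p_R}{2\mu_{\chi N}},
    \qquad \mu_{\chi N} = \frac{m_\chi m_N}{m_\chi + m_N}.
    ```

    At fixed $p_R$, the dark matter mass enters only through the reduced mass $\mu_{\chi N}$, which is what allows the mass dependence to be factored out of a single $\tilde{h}(p_R)$ plot.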

  17. Estimation of the energy loss at the blades in rowing: common assumptions revisited.

    Science.gov (United States)

    Hofmijster, Mathijs; De Koning, Jos; Van Soest, A J

    2010-08-01

    In rowing, power is inevitably lost as kinetic energy is imparted to the water during push-off with the blades. Power loss is estimated from reconstructed blade kinetics and kinematics. Traditionally, it is assumed that the oar is completely rigid and that force acts strictly perpendicular to the blade. The aim of the present study was to evaluate how reconstructed blade kinematics, kinetics, and average power loss are affected by these assumptions. A calibration experiment with instrumented oars and oarlocks was performed to establish relations between measured signals and oar deformation and blade force. Next, an on-water experiment was performed with a single female world-class rower rowing at constant racing pace in an instrumented scull. Blade kinematics, kinetics, and power loss under different assumptions (rigid versus deformable oars; absence or presence of a blade force component parallel to the oar) were reconstructed. Estimated power losses at the blades are 18% higher when parallel blade force is incorporated. Incorporating oar deformation affects reconstructed blade kinematics and instantaneous power loss, but has no effect on estimation of power losses at the blades. Assumptions on oar deformation and blade force direction have implications for the reconstructed blade kinetics and kinematics. Neglecting parallel blade forces leads to a substantial underestimation of power losses at the blades.
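
    The quantity at stake can be sketched numerically: power lost at the blade is the dot product of blade force and blade velocity relative to the water. The numbers and the 2-D decomposition below are illustrative assumptions, not data from the study; they show only how including a force component parallel to the oar (traditionally neglected) raises the estimated loss.

    ```python
    # Hedged sketch: instantaneous power imparted to the water at the blade,
    # P = F . v, with and without a force component parallel to the oar.
    # All values are made up for illustration.
    import numpy as np

    def power_loss(force, velocity):
        """Instantaneous power imparted to the water (W): dot(F, v)."""
        return float(np.dot(force, velocity))

    # Axes: x = direction perpendicular to the blade, y = along the oar shaft.
    v_blade     = np.array([0.8, 0.3])     # blade velocity rel. to water (m/s)
    f_perp_only = np.array([400.0, 0.0])   # only the perpendicular blade force (N)
    f_full      = np.array([400.0, 60.0])  # plus a parallel component (N)

    p1 = power_loss(f_perp_only, v_blade)
    p2 = power_loss(f_full, v_blade)
    print(round(p1, 1), round(p2, 1))      # 320.0 338.0
    ```

    Because the parallel force acts along a direction in which the blade also moves, neglecting it systematically underestimates the loss, consistent with the 18% effect the study reports.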

  18. Evaluating methodological assumptions of a catch-curve survival estimation of unmarked precocial shorebird chicks

    Science.gov (United States)

    McGowan, Conor P.; Gardner, Beth

    2013-01-01

    Estimating productivity for precocial species can be difficult because young birds leave their nest within hours or days of hatching and detectability thereafter can be very low. Recently, a method using a modified catch-curve to estimate precocial chick daily survival from age-based count data was presented, using Piping Plover (Charadrius melodus) data from the Missouri River. However, many of the assumptions of the catch-curve approach were not fully evaluated for precocial chicks. We developed a simulation model to mimic Piping Plovers, a fairly representative shorebird, and age-based count-data collection. Using the simulated data, we calculated daily survival estimates and compared them with the known daily survival rates from the simulation model. We conducted these comparisons under different sampling scenarios in which the ecological and statistical assumptions had been violated. Overall, the daily survival estimates calculated from the simulated data corresponded well with the true survival rates of the simulation. Violating the accurate-aging and independence assumptions did not result in biased daily survival estimates, whereas unequal detection of younger or older birds and violating the birth-death equilibrium did result in estimator bias. Ensuring that all ages are equally detectable, and timing data collection to approximately meet the birth-death equilibrium, are key to the successful use of this method for precocial shorebirds.
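
    The catch-curve idea being evaluated can be sketched as follows, under its stated assumptions (constant daily survival, birth-death equilibrium, equal detectability across ages): counts at age a are proportional to phi**a, so regressing log counts on age recovers log(phi). The data below are simulated for illustration, not Piping Plover observations.

    ```python
    # Hedged sketch of a catch-curve survival estimate from age-based counts.
    # Under birth-death equilibrium with constant daily survival phi and equal
    # detectability, expected counts at age a are proportional to phi**a.
    import numpy as np

    rng = np.random.default_rng(7)
    phi_true = 0.95                          # assumed daily chick survival
    ages = np.arange(0, 20)                  # chick age in days
    expected = 500 * phi_true ** ages        # stable age distribution of counts
    counts = rng.poisson(expected)           # simulated age-based count data

    # Catch-curve regression: slope of log(count) against age estimates log(phi).
    mask = counts > 0                        # drop any zero counts before logging
    slope, _ = np.polyfit(ages[mask], np.log(counts[mask]), 1)
    phi_hat = np.exp(slope)
    print(round(phi_hat, 3))                 # close to 0.95
    ```

    Violating the equilibrium or equal-detectability assumptions changes the age distribution of counts directly, which is why those violations bias the estimator while aging error and non-independence do not.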

  19. On the ontological assumptions of the medical model of psychiatry: philosophical considerations and pragmatic tasks

    Directory of Open Access Journals (Sweden)

    Giordano James

    2010-01-01

    Full Text Available Abstract A common theme in the contemporary medical model of psychiatry is that pathophysiological processes are centrally involved in the explanation, evaluation, and treatment of mental illnesses. Implied in this perspective is that clinical descriptors of these pathophysiological processes are sufficient to distinguish underlying etiologies. Psychiatric classification requires differentiation between what counts as normality (i.e.- order), and what counts as abnormality (i.e.- disorder). The distinction(s) between normality and pathology entail assumptions that are often deeply presupposed, manifesting themselves in statements about what mental disorders are. In this paper, we explicate that realism, naturalism, reductionism, and essentialism are core ontological assumptions of the medical model of psychiatry. We argue that while naturalism, realism, and reductionism can be reconciled with advances in contemporary neuroscience, essentialism - as defined to date - may be conceptually problematic, and we pose an eidetic construct of bio-psychosocial order and disorder based upon complex systems' dynamics. However we also caution against the overuse of any theory, and claim that practical distinctions are important to the establishment of clinical thresholds. We opine that as we move ahead toward both a new edition of the Diagnostic and Statistical Manual, and a proposed Decade of the Mind, the task at hand is to re-visit nosologic and ontologic assumptions pursuant to a re-formulation of diagnostic criteria and practice.

  20. Testing legal assumptions regarding the effects of dancer nudity and proximity to patron on erotic expression.

    Science.gov (United States)

    Linz, D; Blumenthal, E; Donnerstein, E; Kunkel, D; Shafer, B J; Lichtenstein, A

    2000-10-01

    A field experiment was conducted in order to test the assumptions by the Supreme Court in Barnes v. Glen Theatre, Inc. (1991) and the Ninth Circuit Court of Appeals in Colacurcio v. City of Kent (1999) that government restrictions on dancer nudity and dancer-patron proximity do not affect the content of messages conveyed by erotic dancers. A field experiment was conducted in which dancer nudity (nude vs. partial clothing) and dancer-patron proximity (4 feet; 6 in.; 6 in. plus touch) were manipulated under controlled conditions in an adult night club. After male patrons viewed the dances, they completed questionnaires assessing affective states and reception of erotic, relational intimacy, and social messages. Contrary to the assumptions of the courts, the results showed that the content of messages conveyed by the dancers was significantly altered by restrictions placed on dancer nudity and dancer-patron proximity. These findings are interpreted in terms of social psychological responses to nudity and communication theories of nonverbal behavior. The legal implications of rejecting the assumptions made by the courts in light of the findings of this study are discussed. Finally, suggestions are made for future research.

  1. On the ontological assumptions of the medical model of psychiatry: philosophical considerations and pragmatic tasks

    Science.gov (United States)

    2010-01-01

    A common theme in the contemporary medical model of psychiatry is that pathophysiological processes are centrally involved in the explanation, evaluation, and treatment of mental illnesses. Implied in this perspective is that clinical descriptors of these pathophysiological processes are sufficient to distinguish underlying etiologies. Psychiatric classification requires differentiation between what counts as normality (i.e.- order), and what counts as abnormality (i.e.- disorder). The distinction(s) between normality and pathology entail assumptions that are often deeply presupposed, manifesting themselves in statements about what mental disorders are. In this paper, we explicate that realism, naturalism, reductionism, and essentialism are core ontological assumptions of the medical model of psychiatry. We argue that while naturalism, realism, and reductionism can be reconciled with advances in contemporary neuroscience, essentialism - as defined to date - may be conceptually problematic, and we pose an eidetic construct of bio-psychosocial order and disorder based upon complex systems' dynamics. However we also caution against the overuse of any theory, and claim that practical distinctions are important to the establishment of clinical thresholds. We opine that as we move ahead toward both a new edition of the Diagnostic and Statistical Manual, and a proposed Decade of the Mind, the task at hand is to re-visit nosologic and ontologic assumptions pursuant to a re-formulation of diagnostic criteria and practice. PMID:20109176

  2. Protection against external impacts and missiles - Load assumption and effects on the plant design of a 1300 MW PWR-Plant

    International Nuclear Information System (INIS)

    Gremm, O.; Orth, K.H.

    1978-01-01

    The load assumptions and effects of the external impacts are given. The fundamental properties of the KWU standard design according to these impacts and the consequences for the engineering safeguards are explained. The protection against external impacts includes the protection against all external missiles. The basic measure of protection against internal missiles is the strict separation of redundancies. (author)

  3. From basic needs to basic rights.

    Science.gov (United States)

    Facio, A

    1995-06-01

    After arriving at an understanding that basic rights refer to all human needs, it is clear that a recognition of the basic needs of female humans must precede the realization of their rights. The old Women in Development (WID) framework only understood women's needs from an androcentric perspective which was limited to practical interests. Instead, women's primary need is to be free from their subordination to men. Such an understanding places all of women's immediate needs in a new light. A human rights approach to development would see women not as beneficiaries but as people entitled to enjoy the benefits of development. Discussion of what equality before the law should mean to women began at the Third World Conference on Women in Nairobi where the issue of violence against women was first linked to development. While debate continues about the distinction between civil and political rights and economic, social, and cultural rights, the realities of women's lives do not permit such a distinction. The concept of the universality of human rights did not become codified until the UN proclaimed the Universal Declaration of Human Rights in 1948. The declaration has been criticized by feminists because the view of human rights it embodies has been too strongly influenced by a liberal Western philosophy which stresses individual rights and because it is ambiguous on the distinction between human rights and the rights of a citizen. The protection of rights afforded by the Declaration, however, should not be viewed as a final achievement but as an ongoing struggle. International conferences have led to an analysis of the human-rights approach to sustainable development which concludes that women continue to face the routine denial of their rights. Each human right must be redefined from the perspective of women's needs, which must also be redefined. Women must forego challenging the concept of the universality of human rights in order to overcome the argument of cultural...

  4. Basic standards for radiation protection

    International Nuclear Information System (INIS)

    Webb, G.A.M.

    1982-01-01

    The basic standards for radiation protection have been based, for many years, on the recommendations of the International Commission on Radiological Protection. The three basic standards recommended by the Commission may be summarized as justification, optimization of protection, and adherence to dose limitations. The applications of these basic principles to different aspects of protection are briefly summarized, and the particular ways in which they have been applied to waste are described in more detail. The application of dose limits, both in the control of occupational exposure and in regulating routine discharges of radioactive effluents, is straightforward in principle, although the measurement and calculational requirements may be substantial. Secondary standards such as derived limits may be extremely useful, and the principles underlying their derivation will be described. Optimization of protection is inherently a more difficult concept to apply, and the various techniques used will be outlined, with particular emphasis on the use of cost-benefit analysis as recommended by the ICRP. A review will be given of the problems involved in extending these basic concepts of the ICRP to probabilistic analyses such as those required for assessing the consequences of accidents or disruptive events in long-term repositories. The particular difficulties posed by the very long timescales involved in the assessment of waste management practices will be discussed in some detail. (orig./RW)

  5. Basic rocks in Finland

    International Nuclear Information System (INIS)

    Piirainen, T.; Gehoer, S.; Iljina, M.; Kaerki, A.; Paakkola, J.; Vuollo, J.

    1992-10-01

    Basic igneous rocks, containing less than 52% SiO₂, constitute an important part of the Finnish Archaean and Proterozoic crust. The Archaean crust contains two units that host the majority of the basic rocks. The Archaean basic rocks are metavolcanics situated in the greenstone belts of eastern Finland, and are divided into two units: the greenstones of the lower unit are tholeiites, komatiites and basaltic komatiites, while the upper unit consists of a bimodal series of volcanics whose basic rocks are Fe-tholeiites, basaltic komatiites and komatiites. The Proterozoic basic rocks are divided into seven groups according to their ages. Proterozoic igneous activity started with voluminous basic magmatism 2.44 Ga ago; during this stage the layered intrusions and related dykes of northern Finland were formed. Basic rocks 2.2 Ga old are situated at the margins of the Karelian formations. Fe-tholeiitic magmatic activity 2.1 Ga in age is widespread in eastern and northern Finland. The basic rocks of the 1.97 Ga age group occur within the Karelian schist belts as obducted ophiolite complexes, but they also occur as tholeiitic diabase dykes cutting the Karelian schists and the Archaean basement. The intrusions and volcanics of the 1.9 Ga basic igneous activity are mostly encountered around the Granitoid Complex of Central Finland. Subjotnian (1.6 Ga) tholeiitic diabases are situated around the rapakivi massifs of southern Finland, and Postjotnian (1.2 Ga) diabases in western Finland, where they form dykes cutting Svecofennian rocks.

  6. Climate Change: Implications for the Assumptions, Goals and Methods of Urban Environmental Planning

    Directory of Open Access Journals (Sweden)

    Kristina Hill

    2016-12-01

    Full Text Available As a result of increasing awareness of the implications of global climate change, shifts are becoming necessary and apparent in the assumptions, concepts, goals and methods of urban environmental planning. This review will present the argument that these changes represent a genuine paradigm shift in urban environmental planning. Reflection and action to develop this paradigm shift is critical now and in the next decades, because environmental planning for cities will only become more urgent as we enter a new climate period. The concepts, methods and assumptions that urban environmental planners have relied on in previous decades to protect people, ecosystems and physical structures are inadequate if they do not explicitly account for a rapidly changing regional climate context, specifically from a hydrological and ecological perspective. The over-arching concept of spatial suitability that guided planning in most of the 20th century has already given way to concepts that address sustainability, recognizing the importance of temporality. Quite rapidly, the concept of sustainability has been replaced in many planning contexts by the priority of establishing resilience in the face of extreme disturbance events. Now even this concept of resilience is being incorporated into a novel concept of urban planning as a process of adaptation to permanent, incremental environmental changes. This adaptation concept recognizes the necessity for continued resilience to extreme events, while acknowledging that permanent changes are also occurring as a result of trends that have a clear direction over time, such as rising sea levels. Similarly, the methods of urban environmental planning have relied on statistical data about hydrological and ecological systems that will not adequately describe these systems under a new climate regime. These methods are beginning to be replaced by methods that make use of early warning systems for regime shifts, and process

  7. Quantum electronics basic theory

    CERN Document Server

    Fain, V M; Sanders, J H

    1969-01-01

    Quantum Electronics, Volume 1: Basic Theory is a condensed and generalized description of the extensive research and rapid progress on the subject. It is translated from the Russian. The volume describes the basic theory of quantum electronics, and shows how the concepts and equations used in quantum electronics arise from the basic principles of theoretical physics. The book then briefly discusses the interaction of an electromagnetic field with matter. The text also covers the quantum theory of relaxation processes, in which a quantum system approaches an equilibrium state, and explains...

  8. Basic stress analysis

    CERN Document Server

    Iremonger, M J

    1982-01-01

    BASIC Stress Analysis aims to help students become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis, because writing a program is analogous to teaching: it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c...

  9. Testing surrogacy assumptions: can threatened and endangered plants be grouped by biological similarity and abundances?

    Directory of Open Access Journals (Sweden)

    Judy P Che-Castaldo

    Full Text Available There is renewed interest in implementing surrogate species approaches in conservation planning due to the large number of species in need of management but limited resources and data. One type of surrogate approach involves selection of one or a few species to represent a larger group of species requiring similar management actions, so that protection and persistence of the selected species would result in conservation of the group of species. However, among the criticisms of surrogate approaches is the need to test underlying assumptions, which remain rarely examined. In this study, we tested one of the fundamental assumptions underlying use of surrogate species in recovery planning: that there exist groups of threatened and endangered species that are sufficiently similar to warrant similar management or recovery criteria. Using a comprehensive database of all plant species listed under the U.S. Endangered Species Act and tree-based random forest analysis, we found no evidence of species groups based on a set of distributional and biological traits or by abundances and patterns of decline. Our results suggested that application of surrogate approaches for endangered species recovery would be unjustified. Thus, conservation planning focused on individual species and their patterns of decline will likely be required to recover listed species.

  10. Basic Financial Accounting

    DEFF Research Database (Denmark)

    Wiborg, Karsten

    This textbook on Basic Financial Accounting is targeted at students of economics at universities and business colleges who are taking an introductory course on the external dimension of the company's economic reporting, including bookkeeping, etc. The book includes the following subjects...

  11. HIV Treatment: The Basics

    Science.gov (United States)

    HIV Treatment: The Basics. Last Reviewed: March 22, 2018. ...

  12. Basics of SCI Rehabilitation

    Medline Plus

    Full Text Available Video topics include: How Peer Counseling Works (Julie Gassaway, MS, RN); Pediatric Spinal Cord Injury 101 (Lawrence Vogel, MD); The Basics of Pediatric SCI Rehabilitation (Sara Klaas, MSW); Transitions for Children ...

  13. Powassan (POW) Virus Basics

    Science.gov (United States)

    What is Powassan virus? Powassan virus is a tickborne flavivirus that is ...

  14. Brain Basics: Understanding Sleep

    Science.gov (United States)

    Brain Basics: Understanding Sleep. Anatomy of Sleep; Sleep Stages. ... can't form or maintain the pathways in your brain that let you learn and create new memories, ...

  20. Physical Activity Basics

    Science.gov (United States)

    Physical Activity Basics: How much physical activity do you need? Regular physical activity helps improve ...

  1. Radionuclide Basics: Iodine

    Science.gov (United States)

    Radionuclide Basics: Iodine. Iodine (chemical symbol I) is a chemical element. Topics: iodine in the environment; iodine sources; iodine and health. All 37 isotopes of iodine ...

  2. EROI of crystalline silicon photovoltaics : Variations under different assumptions regarding manufacturing energy inputs and energy output

    OpenAIRE

    Lundin, Johan

    2013-01-01

    Installed photovoltaic nameplate capacity has been growing rapidly around the world in the last few years. But how much energy is returned to society (i.e. net energy) by this technology, and which factors contribute the most to the amount of energy returned? The objective of this thesis was to examine the importance of certain inputs and outputs along the solar panel production chain and their effect on the energy return on (energy) investment (EROI) for crystalline wafer-based photovoltaics. A...
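
    EROI is, at its core, the ratio of lifetime energy delivered to energy invested; a minimal sketch of the bookkeeping (all numbers below are illustrative assumptions, not values from the thesis):

```python
# Hypothetical EROI sketch: EROI = lifetime energy delivered / energy invested.
# The figures used here are illustrative assumptions only.

def eroi(annual_output_kwh: float, lifetime_years: float,
         manufacturing_input_kwh: float) -> float:
    """Energy return on (energy) investment as a simple ratio."""
    return annual_output_kwh * lifetime_years / manufacturing_input_kwh

# Example: a module producing 1,200 kWh/yr over a 30-year lifetime,
# with 9,000 kWh of embodied manufacturing energy:
print(eroi(1200, 30, 9000))  # -> 4.0
```

    Varying the manufacturing-energy input or the assumed lifetime in such a ratio is exactly the kind of sensitivity the thesis examines.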

  3. Academics, Self-Esteem, and Race: A Look at the Underlying Assumptions of the Disidentification Hypothesis.

    Science.gov (United States)

    Osborne, Jason W.

    1995-01-01

    Tested hypothesis that African American children protect themselves from failure by detaching their self-esteem from academic outcomes. Analyses revealed a pattern of weakening correlations between self-esteem and academic outcomes from 8th to 10th grade for African American students. Correlations for white students remained stable or increased.

  4. SAMPLE STANDARD DEVIATION(s) CHART UNDER THE ASSUMPTION OF MODERATENESS AND ITS PERFORMANCE ANALYSIS

    OpenAIRE

    Kalpesh S. Tailor

    2017-01-01

    The moderate distribution proposed by Naik V.D. and Desai J.M. is a sound alternative to the normal distribution; it has the mean and mean deviation as pivotal parameters and has properties similar to the normal distribution. Mean deviation (δ) is a very good alternative to standard deviation (σ), as mean deviation is considered the most intuitively and rationally defined measure of dispersion. This fact can be very useful in the field of quality control to construct the control limits of the c...
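
    The mean deviation δ used as the pivotal parameter above is straightforward to compute, and for normally distributed data it is proportional to σ (E|X − μ| = σ·√(2/π) ≈ 0.7979σ). A short sketch, with an arbitrary simulated sample for illustration:

```python
import math
import random

def mean_deviation(xs):
    """Mean absolute deviation about the arithmetic mean."""
    m = sum(xs) / len(xs)
    return sum(abs(x - m) for x in xs) / len(xs)

# For normally distributed data the two dispersion measures are proportional:
# E|X - mu| = sigma * sqrt(2/pi) ~= 0.7979 * sigma.
random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(100_000)]
delta = mean_deviation(sample)
print(delta, math.sqrt(2.0 / math.pi))  # both close to 0.7979
```

    Control limits for an s-type chart built from δ rather than σ follow the same logic, with constants appropriate to the moderate distribution.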

  5. Closed-Form Pricing of Benchmark Equity Default Swaps Under the CEV Assumption

    NARCIS (Netherlands)

    Campi, L.; Sbuelz, A.

    2005-01-01

    Equity Default Swaps are new equity derivatives designed as a product for credit investors. Equipped with a novel pricing result, we provide closed-form values that give an analytic contribution to the viability of cross-asset trading related to credit risk.

  6. Discrete-State and Continuous Models of Recognition Memory: Testing Core Properties under Minimal Assumptions

    Science.gov (United States)

    Kellen, David; Klauer, Karl Christoph

    2014-01-01

    A classic discussion in the recognition-memory literature concerns the question of whether recognition judgments are better described by continuous or discrete processes. These two hypotheses are instantiated by the signal detection theory model (SDT) and the 2-high-threshold model, respectively. Their comparison has almost invariably relied on…

  7. The effective Hamiltonian in curved quantum waveguides under mild regularity assumptions

    Czech Academy of Sciences Publication Activity Database

    Krejčiřík, David; Šediváková, Helena

    2012-01-01

    Roč. 24, č. 7 (2012), 1250018/1-1250018/39 ISSN 0129-055X R&D Projects: GA MŠk LC06002; GA ČR GAP203/11/0701 Institutional support: RVO:61389005 Keywords : quantum waveguides * thin-width limit * effective Hamiltonian * twisting versus bending * norm-resolvent convergence * Dirichlet Laplacian * curved tubes * relatively parallel frame * Steklov approximation Subject RIV: BE - Theoretical Physics Impact factor: 1.092, year: 2012

  8. Binary Biometrics: An Analytic Framework to Estimate the Performance Curves Under Gaussian Assumption

    NARCIS (Netherlands)

    Kelkboom, E.J.C.; Garcia Molina, Gary; Breebaart, Jeroen; Veldhuis, Raymond N.J.; Kevenaar, Tom A.M.; Jonker, Willem

    In recent years, the protection of biometric data has gained increased interest from the scientific community. Methods such as the fuzzy commitment scheme, helper-data system, fuzzy extractors, fuzzy vault, and cancelable biometrics have been proposed for protecting biometric data. Most of these

  9. Binary Biometrics: An Analytic Framework to Estimate the Bit Error Probability under Gaussian Assumption

    NARCIS (Netherlands)

    Kelkboom, E.J.C.; Molina, G.; Kevenaar, T.A.M.; Veldhuis, Raymond N.J.; Jonker, Willem

    2008-01-01

    In recent years the protection of biometric data has gained increased interest from the scientific community. Methods such as the helper data system, fuzzy extractors, fuzzy vault and cancellable biometrics have been proposed for protecting biometric data. Most of these methods use cryptographic

  10. Experimental assessment of unvalidated assumptions in classical plasticity theory.

    Energy Technology Data Exchange (ETDEWEB)

    Brannon, Rebecca Moss (University of Utah, Salt Lake City, UT); Burghardt, Jeffrey A. (University of Utah, Salt Lake City, UT); Bauer, Stephen J.; Bronowski, David R.

    2009-01-01

    This report investigates the validity of several key assumptions in classical plasticity theory regarding material response to changes in the loading direction. Three metals, two rock types, and one ceramic were subjected to non-standard loading directions, and the resulting strain response increments were displayed in Gudehus diagrams to illustrate the approximation error of classical plasticity theories. A rigorous mathematical framework for fitting classical theories to the data, thus quantifying the error, is provided. Further data analysis techniques are presented that allow testing for the effect of changes in loading direction without having to use a new sample and for inferring the yield normal and flow directions without having to measure the yield surface. Though the data are inconclusive, there is indication that classical, incrementally linear, plasticity theory may be inadequate over a certain range of loading directions. This range of loading directions also coincides with loading directions that are known to produce a physically inadmissible instability for any nonassociative plasticity model.

  11. Factor structure and concurrent validity of the world assumptions scale.

    Science.gov (United States)

    Elklit, Ask; Shevlin, Mark; Solomon, Zahava; Dekel, Rachel

    2007-06-01

    The factor structure of the World Assumptions Scale (WAS) was assessed by means of confirmatory factor analysis. The sample was comprised of 1,710 participants who had been exposed to trauma that resulted in whiplash. Four alternative models were specified and estimated using LISREL 8.72. A correlated 8-factor solution was the best explanation of the sample data. The estimates of reliability of eight subscales of the WAS ranged from .48 to .82. Scores from five subscales correlated significantly with trauma severity as measured by the Harvard Trauma Questionnaire, although the magnitude of the correlations was low to modest, ranging from .08 to -.43. It is suggested that the WAS has adequate psychometric properties for use in both clinical and research settings.

  12. Posttraumatic Growth and Shattered World Assumptions Among Ex-POWs

    DEFF Research Database (Denmark)

    Lahav, Y.; Bellin, Elisheva S.; Solomon, Z.

    2016-01-01

    Objective: The controversy regarding the nature of posttraumatic growth (PTG) includes two main competing claims: one which argues that PTG reflects authentic positive changes and the other which argues that PTG reflects illusionary defenses. The former also suggests that PTG evolves from shattered world assumptions (WAs) and that the co-occurrence of high PTG and negative WAs among trauma survivors reflects reconstruction of an integrative belief system. The present study aimed to test these claims by investigating, for the first time, the mediating role of dissociation in the relation between PTG and WAs. Method: Former prisoners of war (ex-POWs; n = 158) and comparable controls (n = 106) were assessed 38 years after the Yom Kippur War. Results: Ex-POWs endorsed more negative WAs and higher PTG and dissociation compared to controls. Ex-POWs with posttraumatic stress disorder (PTSD...

  13. Ancestral assumptions and the clinical uncertainty of evolutionary medicine.

    Science.gov (United States)

    Cournoyea, Michael

    2013-01-01

    Evolutionary medicine is an emerging field of medical studies that uses evolutionary theory to explain the ultimate causes of health and disease. Educational tools, online courses, and medical school modules are being developed to help clinicians and students reconceptualize health and illness in light of our evolutionary past. Yet clinical guidelines based on our ancient life histories are epistemically weak, relying on the controversial assumptions of adaptationism and advocating a strictly biophysical account of health. To fulfill the interventionist goals of clinical practice, it seems that proximate explanations are all we need to develop successful diagnostic and therapeutic guidelines. Considering these epistemic concerns, this article argues that the clinical relevance of evolutionary medicine remains uncertain at best.

  14. Assumptions of Customer Knowledge Enablement in the Open Innovation Process

    Directory of Open Access Journals (Sweden)

    Jokubauskienė Raminta

    2017-08-01

    Full Text Available In the scientific literature, open innovation is considered one of the most effective means to innovate and gain a competitive advantage. In practice, there is a variety of open innovation activities, but customers nevertheless stand as the cornerstone in this area, since customers' knowledge is one of the most important sources of new knowledge and ideas. When evaluating the context in which open innovation and customer knowledge enablement interact, it is necessary to take into account the importance of customer knowledge management. It is increasingly highlighted that customer knowledge management facilitates the creation of innovations. However, other factors that influence open innovation, and at the same time customer knowledge management, should also be examined. This article presents a theoretical model which reveals the assumptions of the open innovation process and their impact on the firm's performance.

  15. Polarized BRDF for coatings based on three-component assumption

    Science.gov (United States)

    Liu, Hong; Zhu, Jingping; Wang, Kai; Xu, Rong

    2017-02-01

    A pBRDF (polarized bidirectional reflectance distribution function) model for coatings is given based on a three-component reflection assumption, in order to improve the polarized scattering simulation capability for space objects. In this model, the specular reflection is given based on microfacet theory, while the multiple reflection and volume scattering are given separately according to experimental results. The polarization of specular reflection is derived from Fresnel's law, and both multiple reflection and volume scattering are assumed depolarized. Simulation and measurement results of two satellite coating samples, SR107 and S781, are given to validate that the pBRDF modeling accuracy can be significantly improved by the three-component model given in this paper.

  16. Topographic controls on shallow groundwater levels in a steep, prealpine catchment: When are the TWI assumptions valid?

    NARCIS (Netherlands)

    Rinderer, M.; van Meerveld, H.J.; Seibert, J.

    2014-01-01

    Topographic indices like the Topographic Wetness Index (TWI) have been used to predict spatial patterns of average groundwater levels and to model the dynamics of the saturated zone during events (e.g., TOPMODEL). However, the assumptions underlying the use of the TWI in hydrological models, of
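
    For reference, the TWI referred to above is conventionally computed as ln(a / tan β), where a is the specific upslope contributing area and β the local slope. A minimal sketch (the parameter values are made up for illustration):

```python
import math

def twi(specific_catchment_area: float, slope_deg: float) -> float:
    """Topographic Wetness Index, TWI = ln(a / tan(beta)).

    specific_catchment_area: upslope contributing area per unit contour length,
    slope_deg: local slope angle in degrees.
    """
    beta = math.radians(slope_deg)
    return math.log(specific_catchment_area / math.tan(beta))

# A gently sloping cell with a large contributing area scores high (predicted wet);
# a steep cell with little upslope area scores low (predicted dry):
print(twi(500.0, 2.0))
print(twi(20.0, 35.0))
```

    The paper's question of when the TWI assumptions hold amounts to asking when this static index actually tracks observed groundwater levels in steep terrain.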

  17. Timber value—a matter of choice: a study of how end use assumptions affect timber values.

    Science.gov (United States)

    John H. Beuter

    1971-01-01

    The relationship between estimated timber values and actual timber prices is discussed. Timber values are related to how, where, and when the timber is used. An analysis demonstrates the relative values of a typical Douglas-fir stand under assumptions about timber use.

  18. Halo-independent direct detection analyses without mass assumptions

    International Nuclear Information System (INIS)

    Anderson, Adam J.; Fox, Patrick J.; Kahn, Yonatan; McCullough, Matthew

    2015-01-01

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the m_χ−σ_n plane. Recently, methods which are independent of the DM halo velocity distribution have been developed which present results in the v_min−g-tilde plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from v_min to nuclear recoil momentum (p_R), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call h-tilde(p_R). The entire family of conventional halo-independent g-tilde(v_min) plots for all DM masses is directly found from the single h-tilde(p_R) plot through a simple rescaling of axes. By considering results in h-tilde(p_R) space, one can determine whether two experiments are inconsistent for all masses and all physically possible halos, or for what range of dark matter masses the results are inconsistent for all halos, without the necessity of multiple g-tilde(v_min) plots for different DM masses. We conduct a sample analysis comparing the CDMS II Si events to the null results from LUX, XENON10, and SuperCDMS using our method and discuss how the results can be strengthened by imposing the physically reasonable requirement of a finite halo escape velocity
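
    The rescaling at the heart of this method rests on the elastic-scattering kinematics v_min = p_R / (2μ), with μ the DM–nucleus reduced mass; fixing p_R, each candidate DM mass picks out its own v_min. A hedged sketch (masses in GeV, natural units with c = 1; the numbers are illustrative, not from the paper):

```python
# For elastic scattering, v_min = p_R / (2 * mu), where mu is the DM-nucleus
# reduced mass. A single curve in p_R therefore maps onto g-tilde(v_min) for
# every DM mass by a simple rescaling of the axis.

def reduced_mass(m_chi: float, m_nucleus: float) -> float:
    """Reduced mass of the DM-nucleus system."""
    return m_chi * m_nucleus / (m_chi + m_nucleus)

def v_min_from_p(p_r: float, m_chi: float, m_nucleus: float) -> float:
    """Minimum DM speed (units of c) needed to produce recoil momentum p_r."""
    return p_r / (2.0 * reduced_mass(m_chi, m_nucleus))

# Same recoil momentum, two candidate DM masses, a silicon nucleus (~26.2 GeV):
p_r = 0.05  # GeV
for m_chi in (5.0, 50.0):
    print(m_chi, v_min_from_p(p_r, m_chi, 26.2))
```

    The lighter candidate requires a larger v_min for the same recoil momentum, which is why a single h-tilde(p_R) plot encodes the whole family of g-tilde(v_min) plots.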

  19. Wartime Paris, cirrhosis mortality, and the ceteris paribus assumption.

    Science.gov (United States)

    Fillmore, Kaye Middleton; Roizen, Ron; Farrell, Michael; Kerr, William; Lemmens, Paul

    2002-07-01

    This article critiques the ceteris paribus assumption, which tacitly sustains the epidemiologic literature's inference that the sharp decline in cirrhosis mortality observed in Paris during the Second World War derived from a sharp constriction in wine consumption. Paris's wartime circumstances deviate substantially from the "all else being equal" assumption, and at least three other hypotheses for the cirrhosis decline may be contemplated. Historical and statistical review. Wartime Paris underwent tumultuous changes. Wine consumption did decline, but there were, as well, a myriad of other changes in diet and life experience, many involving new or heightened hardships, nutritional, experiential, institutional, health and mortality risks. Three competing hypotheses are presented: (1) A fraction of the candidates for cirrhosis mortality may have fallen to more sudden forms of death; (2) alcoholics, heavy drinkers and Paris's clochard subpopulation may have been differentially likely to become removed from the city's wartime population, whether by self-initiated departure, arrest and deportation, or death from other causes, even murder; and (3) there was mismeasurement in the cirrhosis mortality decline. The alcohol-cirrhosis connection provided the template for the alcohol research effort (now more than 20 years old) aimed at re-establishing scientific recognition of alcohol's direct alcohol-problems-generating associations and causal responsibilities. In a time given to reports of weaker associations of the alcohol-cirrhosis connection, the place and importance of the Paris curve in the wider literature, as regards that connection, remains. For this reason, the Paris findings should be subjected to as much research scrutiny as they undoubtedly deserve.

  20. Breakdown of Hydrostatic Assumption in Tidal Channel with Scour Holes

    Directory of Open Access Journals (Sweden)

    Chunyan Li

    2016-10-01

    Full Text Available Hydrostatic condition is a common assumption in tidal and subtidal motions in oceans and estuaries. Theories with this assumption have been largely successful. However, there are no definite criteria separating the hydrostatic from the non-hydrostatic regimes in real applications, because real problems often have multiple scales. With increased refinement of high-resolution numerical models encompassing smaller and smaller spatial scales, the need for non-hydrostatic models is increasing. To evaluate the vertical motion over bathymetric changes in tidal channels and assess the validity of the hydrostatic approximation, we conducted observations using a vessel-based acoustic Doppler current profiler (ADCP). Observations were made along a straight channel 18 times over two scour holes of 25 m depth, separated by 330 m, in and out of an otherwise flat 8 m deep tidal pass leading to Lake Pontchartrain, over a time period of 8 hours covering part of the diurnal tidal cycle. Out of the 18 passages over the scour holes, 11 showed strong upwelling and downwelling which resulted in the breakdown of the hydrostatic condition. The maximum observed vertical velocity was ~0.35 m/s, a high value in a tidal channel, and the estimated vertical acceleration reached a high value of 1.76×10^-2 m/s^2. Analysis demonstrated that the barotropic non-hydrostatic acceleration was dominant. The cause of the non-hydrostatic flow was the strong current over the steep slopes of the scour holes. This demonstrates that in such a system the bathymetric variation can lead to the breakdown of hydrostatic conditions. Models with hydrostatic restrictions will not be able to correctly capture the dynamics in such a system with significant bathymetric variations, particularly during strong tidal currents.

  1. Basic Finite Element Method

    International Nuclear Information System (INIS)

    Lee, Byeong Hae

    1992-02-01

    This book describes the basic finite element method, covering program data and the black-box concept, the writing of data, definitions of vectors and matrices, matrix multiplication, matrix addition, and the unit matrix; the concept of the stiffness matrix in terms of spring force and displacement; the governing equation of an elastic body; the finite element method itself; and Fortran programming, including the organization of a computer, the order of programming, data cards and Fortran cards, a finite element program, and its application to an inelastic problem.
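
    The spring analogy for the stiffness matrix mentioned in the description can be sketched in a few lines (an illustrative example, not code from the book, and in Python rather than the book's Fortran):

```python
# Two springs in series, node 0 fixed, axial force F at node 2.
# Each element obeys f = k * u; element matrices assemble into a global K.

def assemble(k1: float, k2: float):
    """Global stiffness matrix for springs k1 (nodes 0-1) and k2 (nodes 1-2)."""
    return [[k1, -k1, 0.0],
            [-k1, k1 + k2, -k2],
            [0.0, -k2, k2]]

k1, k2, F = 100.0, 200.0, 10.0
K = assemble(k1, k2)

# Impose the boundary condition u0 = 0 and solve the reduced 2x2 system
#   [[k1 + k2, -k2], [-k2, k2]] @ [u1, u2] = [0, F],
# which for springs in series gives u1 = F/k1 and u2 = F/k1 + F/k2:
u1 = F / k1
u2 = F / k1 + F / k2
print(u1, u2)
```

    The same assemble-then-apply-boundary-conditions pattern carries over directly to the elastic-body problems treated later in the book.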

  2. Development NGOs: Basic Facts

    OpenAIRE

    Aldashev, Gani; Navarra, Cecilia

    2017-01-01

    This paper systematizes the results of the empirical literature on development non-governmental organizations (NGOs), drawing both from quantitative and qualitative analyses, and constructs a set of basic facts about these organizations. These basic facts concern the size of the development NGO sector and its evolution, the funding of NGOs, the allocation of NGO aid and projects across beneficiary countries, the relationship of NGOs with beneficiaries, and the phenomenon of globalization of d...

  3. Accumulated dose calculations in Indian PHWRs under DBA

    International Nuclear Information System (INIS)

    Nesaraj, David; Pradhan, A.S.; Bhardwaj, S.A.

    1996-01-01

    Accumulated gamma dose inside the reactor building due to release of fission products from the equilibrium core of an Indian PHWR under accident conditions has been assessed. The assessment has been done against the radiation tolerance limit of the critical equipment inside the reactor building. The basic source data have been generated using the computer code ORIGEN2, written and developed by Oak Ridge National Laboratory, USA (ORNL). This paper discusses the details of the calculations, done on the basis of certain assumptions which are mentioned at the relevant places. The results indicate the accumulated gamma dose at a few typical locations inside the reactor building under accident conditions. (author). 1 ref., 1 tab., 1 fig

  4. Towards a test of non-locality without 'supplementary assumptions'

    International Nuclear Information System (INIS)

    Barbieri, M.; De Martini, F.; Di Nepi, G.; Mataloni, P.

    2005-01-01

    We have experimentally tested the non-local properties of the two-photon states generated by a high brilliance source of entanglement which virtually allows the direct measurement of the full set of photon pairs created by the basic QED process implied by the parametric quantum scattering. Standard Bell measurements and Bell's inequality violation test have been realized over the entire cone of emission of the degenerate pairs. By the same source we have verified Hardy's ladder theory up to the 20th step and the contradiction between the standard quantum theory and the local realism has been tested for 41% of entangled pairs

  5. The Impact of Modeling Assumptions in Galactic Chemical Evolution Models

    Science.gov (United States)

    Côté, Benoit; O'Shea, Brian W.; Ritter, Christian; Herwig, Falk; Venn, Kim A.

    2017-02-01

    We use the OMEGA galactic chemical evolution code to investigate how the assumptions used for the treatment of galactic inflows and outflows impact numerical predictions. The goal is to determine how our capacity to reproduce the chemical evolution trends of a galaxy is affected by the choice of implementation used to include those physical processes. In pursuit of this goal, we experiment with three different prescriptions for galactic inflows and outflows and use OMEGA within a Markov Chain Monte Carlo code to recover the set of input parameters that best reproduces the chemical evolution of nine elements in the dwarf spheroidal galaxy Sculptor. This provides a consistent framework for comparing the best-fit solutions generated by our different models. Despite their different degrees of intended physical realism, we found that all three prescriptions can reproduce in an almost identical way the stellar abundance trends observed in Sculptor. This result supports the similar conclusions originally claimed by Romano & Starkenburg for Sculptor. While the three models have the same capacity to fit the data, the best values recovered for the parameters controlling the number of SNe Ia and the strength of galactic outflows, are substantially different and in fact mutually exclusive from one model to another. For the purpose of understanding how a galaxy evolves, we conclude that only reproducing the evolution of a limited number of elements is insufficient and can lead to misleading conclusions. More elements or additional constraints such as the Galaxy’s star-formation efficiency and the gas fraction are needed in order to break the degeneracy between the different modeling assumptions. Our results show that the successes and failures of chemical evolution models are predominantly driven by the input stellar yields, rather than by the complexity of the Galaxy model itself. Simple models such as OMEGA are therefore sufficient to test and validate stellar yields. 

  6. Assessing moderated mediation in linear models requires fewer confounding assumptions than assessing mediation.

    Science.gov (United States)

    Loeys, Tom; Talloen, Wouter; Goubert, Liesbet; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2016-11-01

    It is well known from the mediation analysis literature that the identification of direct and indirect effects relies on strong assumptions of no unmeasured confounding. Even in randomized studies the mediator may still be correlated with unobserved prognostic variables that affect the outcome, in which case the mediator's role in the causal process may not be inferred without bias. In the behavioural and social science literature very little attention has been given so far to the causal assumptions required for moderated mediation analysis. In this paper we focus on the index of moderated mediation, which measures by how much the mediated effect is larger or smaller for varying levels of the moderator. We show that in linear models this index can be estimated without bias in the presence of unmeasured common causes of the moderator, mediator and outcome under certain conditions. Importantly, one can thus use the test for moderated mediation to support evidence for mediation under less stringent confounding conditions. We illustrate our findings with data from a randomized experiment assessing the impact of being primed with social deception upon observer responses to others' pain, and from an observational study of individuals who ended a romantic relationship, assessing the effect of attachment anxiety during the relationship on mental distress 2 years after the break-up. © 2016 The British Psychological Society.
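
    In the standard linear formulation, with mediator model M = a0 + a1·X + a2·W + a3·X·W and outcome model Y = b0 + c·X + b·M, the mediated effect at moderator level W is (a1 + a3·W)·b, so the index of moderated mediation is a3·b. A hedged sketch (all coefficient values are hypothetical, not estimates from the paper's data):

```python
# Mediator model:  M = a0 + a1*X + a2*W + a3*X*W + e_M
# Outcome model:   Y = b0 + c*X  + b*M            + e_Y
# Indirect effect of X through M at moderator level w: (a1 + a3*w) * b.
# The index of moderated mediation, a3*b, is the slope of that effect in w.

def mediated_effect(a1: float, a3: float, b: float, w: float) -> float:
    """Indirect (mediated) effect of X on Y at moderator level w."""
    return (a1 + a3 * w) * b

def index_of_moderated_mediation(a3: float, b: float) -> float:
    """Change in the mediated effect per unit increase of the moderator."""
    return a3 * b

# With hypothetical coefficients a1 = 0.5, a3 = 0.3, b = 0.8:
print(mediated_effect(0.5, 0.3, 0.8, 0.0))   # effect at W = 0
print(mediated_effect(0.5, 0.3, 0.8, 1.0))   # effect at W = 1
print(index_of_moderated_mediation(0.3, 0.8))
```

    The paper's point is that a3·b can be estimated without bias under weaker confounding conditions than a1·b itself, which is what makes the test for moderated mediation informative.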

  7. Basic Electromagnetism and Materials

    CERN Document Server

    Moliton, André

    2007-01-01

    Basic Electromagnetism and Materials is the product of many years of teaching basic and applied electromagnetism. This textbook can be used to teach electromagnetism to a wide range of undergraduate science majors in physics, electrical engineering or materials science. However, by making lesser demands on mathematical knowledge than competing texts, and by emphasizing electromagnetic properties of materials and their applications, this textbook is uniquely suited to students of materials science. Many competing texts focus on the study of propagation waves either in the microwave or optical domain, whereas Basic Electromagnetism and Materials covers the entire electromagnetic domain and the physical response of materials to these waves. Professor André Moliton is Director of the Unité de Microélectronique, Optoélectronique et Polymères (Université de Limoges, France), which brings together three groups studying the optoelectronics of molecular and polymer layers, micro-optoelectronic systems for teleco...

  8. A Study of Junior High School Teachers Initiating New Learning Styles through Learning Communities: Adopting a Learning Community in a Junior High School under the 12-Year Basic Education System

    Directory of Open Access Journals (Sweden)

    Ya-Ci (Hsiao-Hua) Selena Hsueh

    2014-03-01

    of learning through a learning community, and numerous schools have participated in this learning community program. The traditional learning style of teachers speaking while students listen is expected to change. In this qualitative study, student experiences and how they changed under the guidance of a learning community were investigated by conducting interviews, and potential problems in the learning method were identified. Five teachers from a junior high school in which the learning community method was adopted in their classes participated in this study. The results of positivist analysis indicate that the implementation of a learning community is expected to be a valuable educational method under the 12-Year Basic Education system. Both the researcher and the teachers observed changes in student learning caused by the use of various teaching strategies. Six crucial findings were derived from this research. (1) The methods used by junior high school teachers for promoting collaborative learning in their classes are comprehensive and diversified. (2) Based on the learning community proposed by Professor Manabu Sato, the most widely used method in practice among junior high school teachers is collaborative learning. (3) The collaborative learning technique used by junior high school teachers is typically cooperative learning, which focuses on group discussion and expression rather than on listening, connecting, and referring to the text, as argued by Sato. (4) For junior high school students, the greatest benefit produced by collaborative learning is the cultivation of motivation and teamwork. (5) Weaker students who were previously unacquainted with their classmates attained achievements through collaborative learning. (6) Overall, the teachers enhanced student learning and changed the learning style of the students in a positive manner.

  9. Speakers' assumptions about the lexical flexibility of idioms.

    Science.gov (United States)

    Gibbs, R W; Nayak, N P; Bolton, J L; Keppel, M E

    1989-01-01

    In three experiments, we examined why some idioms can be lexically altered and still retain their figurative meanings (e.g., John buttoned his lips about Mary can be changed into John fastened his lips about Mary and still mean "John didn't say anything about Mary"), whereas other idioms cannot be lexically altered without losing their figurative meanings (e.g., John kicked the bucket, meaning "John died," loses its idiomatic meaning when changed into John kicked the pail). Our hypothesis was that the lexical flexibility of idioms is determined by speakers' assumptions about the ways in which parts of idioms contribute to their figurative interpretations as a whole. The results of the three experiments indicated that idioms whose individual semantic components contribute to their overall figurative meanings (e.g., go out on a limb) were judged as less disrupted by changes in their lexical items (e.g., go out on a branch) than were nondecomposable idioms (e.g., kick the bucket) when their individual words were altered (e.g., punt the pail). These findings lend support to the idea that both the syntactic productivity and the lexical makeup of idioms are matters of degree, depending on the idioms' compositional properties. This conclusion suggests that idioms do not form a unique class of linguistic items, but share many of the properties of more literal language.

  10. Stream of consciousness: Quantum and biochemical assumptions regarding psychopathology.

    Science.gov (United States)

    Tonello, Lucio; Cocchi, Massimo; Gabrielli, Fabio; Tuszynski, Jack A

    2017-04-01

    The accepted paradigms of mainstream neuropsychiatry appear to be incompletely adequate and in various cases offer equivocal analyses. However, a growing number of new approaches are being proposed that suggest the emergence of paradigm shifts in this area. In particular, quantum theories of mind, brain and consciousness seem to offer a profound change to the current approaches. Unfortunately these quantum paradigms harbor at least two serious problems. First, they are simply models, theories, and assumptions, with no convincing experiments supporting their claims. Second, they deviate from contemporary mainstream views of psychiatric illness and do so in revolutionary ways. We suggest a possible way to integrate experimental neuroscience with quantum models in order to address outstanding issues in psychopathology. A key role is played by the phenomenon called the "stream of consciousness", which can be linked to the so-called "Gamma Synchrony" (GS), which is clearly demonstrated by EEG data. In our novel proposal, a unipolar depressed patient could be seen as a subject with an altered stream of consciousness. In particular, some clues suggest that depression is linked to an "increased power" stream of consciousness. It is additionally suggested that such an approach to depression might be extended to psychopathology in general with potential benefits to diagnostics and therapeutics in neuropsychiatry.

  11. Assumptions of the primordial spectrum and cosmological parameter estimation

    International Nuclear Information System (INIS)

    Shafieloo, Arman; Souradeep, Tarun

    2011-01-01

    The observables of the perturbed universe, cosmic microwave background (CMB) anisotropy and large-scale structure, depend on a set of cosmological parameters, as well as the assumed nature of primordial perturbations. In particular, the shape of the primordial power spectrum (PPS) is, at best, a well-motivated assumption. It is known that the assumed functional form of the PPS in cosmological parameter estimation can affect the best-fit parameters and their relative confidence limits. In this paper, we demonstrate that a specific assumed form actually drives the best-fit parameters into distinct basins of likelihood in the space of cosmological parameters, where the likelihood resists improvement via modifications to the PPS. The regions where considerably better likelihoods are obtained when a free-form PPS is allowed lie outside these basins. In the absence of a preferred model of inflation, this raises the concern that current cosmological parameter estimates are strongly prejudiced by the assumed form of the PPS. Our results strongly motivate approaches toward simultaneous estimation of the cosmological parameters and the shape of the primordial spectrum from upcoming cosmological data. It is equally important for theorists to keep an open mind towards early universe scenarios that produce features in the PPS. (paper)

  12. Fourth-order structural steganalysis and analysis of cover assumptions

    Science.gov (United States)

    Ker, Andrew D.

    2006-02-01

    We extend our previous work on structural steganalysis of LSB replacement in digital images, building detectors which analyse the effect of LSB operations on pixel groups as large as four. Some of the method previously applied to triplets of pixels carries over straightforwardly. However we discover new complexities in the specification of a cover image model, a key component of the detector. There are many reasonable symmetry assumptions which we can make about parity and structure in natural images, only some of which provide detection of steganography, and the challenge is to identify the symmetries a) completely, and b) concisely. We give a list of possible symmetries and then reduce them to a complete, non-redundant, and approximately independent set. Some experimental results suggest that all useful symmetries are thus described. A weighting is proposed and its approximate variance stabilisation verified empirically. Finally, we apply symmetries to create a novel quadruples detector for LSB replacement steganography. Experimental results show some improvement, in most cases, over other detectors. However the gain in performance is moderate compared with the increased complexity in the detection algorithm, and we suggest that, without new insight, further extension of structural steganalysis may provide diminishing returns.
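The structural asymmetry that such detectors exploit can be seen in a toy sketch (illustrative code only, not the paper's quadruples detector): LSB replacement only ever moves a pixel value within its pair {2k, 2k+1}, so even cover values can only increase by one and odd values can only decrease by one.

```python
import random

def embed_lsb(pixels, bits):
    """LSB replacement: overwrite each pixel's least significant bit."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

random.seed(0)
pixels = [random.randrange(256) for _ in range(10_000)]  # toy "cover image"
bits = [random.randrange(2) for _ in range(10_000)]      # random message
stego = embed_lsb(pixels, bits)

# Each value stays within its pair {2k, 2k+1} ...
same_pair = all(p >> 1 == s >> 1 for p, s in zip(pixels, stego))
# ... so even cover values never decrease and odd ones never increase.
even_up = all(s >= p for p, s in zip(pixels, stego) if p % 2 == 0)
odd_down = all(s <= p for p, s in zip(pixels, stego) if p % 2 == 1)
```

It is this parity structure, aggregated over groups of two, three or four pixels, that the symmetry assumptions of structural steganalysis formalise.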

  13. Basic properties of semiconductors

    CERN Document Server

    Landsberg, PT

    2013-01-01

    Since Volume 1 was published in 1982, the centres of interest in the basic physics of semiconductors have shifted. Volume 1 was called Band Theory and Transport Properties in the first edition, but the subject has broadened to such an extent that Basic Properties is now a more suitable title. Seven chapters have been rewritten by the original authors. However, twelve chapters are essentially new, with the bulk of this work being devoted to important current topics which give this volume an almost encyclopaedic form. The first three chapters discuss various aspects of modern band theory and the

  14. Basic set theory

    CERN Document Server

    Levy, Azriel

    2002-01-01

    An advanced-level treatment of the basics of set theory, this text offers students a firm foundation, stopping just short of the areas employing model-theoretic methods. Geared toward upper-level undergraduate and graduate students, it consists of two parts: the first covers pure set theory, including the basic notions, order and well-foundedness, cardinal numbers, the ordinals, and the axiom of choice and some of its consequences; the second deals with applications and advanced topics such as point set topology, real spaces, Boolean algebras, and infinite combinatorics and large cardinals. An

  15. Comprehensive basic mathematics

    CERN Document Server

    Veena, GR

    2005-01-01

    Salient features: As per the II PUC Basic Mathematics syllabus of Karnataka. Provides an introduction to various basic mathematical techniques and the situations where these could be usefully employed. The language is simple and the material is self-explanatory, with a large number of illustrations. Assists the reader in gaining proficiency to solve a diverse variety of problems. A special capsule containing a gist and list of formulae, titled "REMEMBER!". An additional chapterwise-arranged question bank and 3 model papers in a separate section, "EXAMINATION CORNER".

  16. Ecology and basic laws

    International Nuclear Information System (INIS)

    Mayer-Tasch, P.C.

    1980-01-01

    The author sketches the critical relation between ecology and basic law - critical in more than one sense. He points out the incompatibility of constitutional states and atomic states which is due to constitutional order being jeopardised by nuclear policy. He traces back the continuously rising awareness of pollution and the modern youth movement to their common root i.e. the awakening, the youth movement of the turn of the century. Eventually, he considers an economical, political, and social decentralization as a feasible alternative which would considerably relieve our basic living conditions from the threatening forms of civilization prevailing. (HSCH) [de

  17. Basic reactions induced by radiation

    International Nuclear Information System (INIS)

    Charlesby, A.

    1980-01-01

    This paper summarises some of the basic reactions resulting from exposure to high energy radiation. In the initial stages energy is absorbed, but not necessarily at random, giving radical and ion species which may then react to promote the final chemical change. However, it is possible to intervene at intermediate stages to modify or reduce the radiation effect. Under certain conditions enhanced reactions are also possible. Several expressions are given to calculate radiation yield in terms of energy absorbed. Some analogies between radiation-induced reactions in polymers, and those studied in radiobiology are outlined. (author)

  18. Research into basic rocks types

    International Nuclear Information System (INIS)

    1993-06-01

    Teollisuuden Voima Oy (TVO) has carried out research into basic rock types in Finland. The research programme was implemented in parallel with the preliminary site investigations for radioactive waste disposal in 1991-1993. The programme had two main objectives: first, to study the properties of the basic rock types and compare them with the other rock types under investigation; second, to carry out an inventory of rock formations consisting of basic rock types and potentially suitable for final disposal. Environmental factors important to final disposal were studied for the formations identified. In total, 159 formations exceeding 4 km² in size were identified in the inventory. Of these, 97 were intrusive igneous rock types and 62 originally extrusive volcanic rock types. Deposits of ore minerals, industrial minerals or building stones related to these formations were studied, as were environmental factors such as natural resources, protected areas and potential restrictions on land use.

  19. Achievement report for fiscal 1981 on research under Sunshine Program. Basic research on high-calorie gas production technology; 1981 nendo sunshine keikaku kenkyu seika hokokusho. Kokarori gas seizo gijutsu no kiso kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1982-03-01

    Test and research are conducted for acquiring basic data regarding coal gasification. In the basic research for the development of a high-calorie gas production process, a small moving bed gas furnace is improved, and a high-calorie gas with a heating value of approximately 5,100 kcal/m³ is stably produced at a rate of approximately 20 m³ per day. The carbon conversion rate turns out to be approximately 37% for the gas, approximately 53% for the residual char, and approximately 10% for a mixture of tar, naphthalene, benzene, phenol, etc. Hydrogen is produced making use of the residual char when a small amount of coal is added, and it is deemed that the first experiment for the development of a moving bed hydrogasification process has been successfully completed. In the study of the mechanism of hydrogasification reaction, the result of a preliminary experiment in a fixed bed pressurized gasification furnace is compared with the data from a continuous gasification experiment, and the relationship is determined between the coal feed rate and the reaction rate. Also conducted are a basic study of problems relating to operation, basic research on a reaction mechanism (carbonization by rapid heating), estimation of the equilibrium composition of gas generated by coal gasification, etc. (NEDO)

  20. Precompound Reactions: Basic Concepts

    International Nuclear Information System (INIS)

    Weidenmueller, H. A.

    2008-01-01

    Because of the non-zero nuclear equilibration time, the compound-nucleus scattering model fails when the incident energy exceeds 10 or 20 MeV, and precompound reactions become important. Basic ideas used in the quantum-statistical approaches to these reactions are described

  1. Basic Tuberculosis Facts

    Centers for Disease Control (CDC) Podcasts

    2012-03-12

    In this podcast, Dr. Kenneth Castro, Director of the Division of Tuberculosis Elimination, discusses basic TB prevention, testing, and treatment information.  Created: 3/12/2012 by National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention (NCHHSTP).   Date Released: 3/12/2012.

  2. Basic SPSS tutorial

    NARCIS (Netherlands)

    Grotenhuis, H.F. te; Matthijssen, A.C.B.

    2015-01-01

    This supplementary book for the social, behavioral, and health sciences helps readers with no prior knowledge of IBM® SPSS® Statistics, statistics, or mathematics learn the basics of SPSS. Designed to reduce fear and build confidence, the book guides readers through point-and-click sequences using

  3. Basic Skills Assessment

    Science.gov (United States)

    Yin, Alexander C.; Volkwein, J. Fredericks

    2010-01-01

    After surveying 1,827 students in their final year at eighty randomly selected two-year and four-year public and private institutions, American Institutes for Research (2006) reported that approximately 30 percent of students in two-year institutions and nearly 20 percent of students in four-year institutions have only basic quantitative…

  4. Basic physics for all

    CERN Document Server

    Kumar, B N

    2012-01-01

    This is a simple, concise book for both physics and non-physics students, presenting basic facts in straightforward form and conveying fundamental principles and theories of physics. This book will be helpful as a supplement to class teaching and as an aid to those who have difficulty mastering concepts and principles.

  5. Basic pharmaceutical technology

    OpenAIRE

    Angelovska, Bistra; Drakalska, Elena

    2017-01-01

    The lecture deals with the basics of pharmaceutical technology as an applied discipline of pharmaceutical science, whose main subject of study is the formulation and manufacture of drugs. In a broad sense, pharmaceutical technology is the science of formulation, preparation, stabilization and determination of the quality of medicines prepared in the pharmacy or in the pharmaceutical industry.

  6. Basic radiation oncology

    International Nuclear Information System (INIS)

    Beyzadeoglu, M. M.; Ebruli, C.

    2008-01-01

    Basic Radiation Oncology is an all-in-one book: an up-to-date, bedside-oriented volume integrating radiation physics, radiobiology and clinical radiation oncology. It includes the essentials of all aspects of radiation oncology, with more than 300 practical illustrations in black-and-white and color. The layout and presentation are very practical and enriched with many pearl boxes. Key studies, particularly randomized ones, are included at the end of each clinical chapter. Basic knowledge of high-tech radiation teletherapy units such as tomotherapy, CyberKnife and proton therapy is also given. The first two sections review concepts that are crucial in radiation physics and radiobiology. The remaining 11 chapters describe treatment regimens for the main cancer sites and tumor types. Basic Radiation Oncology will help meet the need for a practical, bedside-oriented oncology book for residents, fellows and clinicians of radiation, medical and surgical oncology, as well as medical students, physicians and medical physicists interested in clinical oncology. The English edition of the book Temel Radyasyon Onkolojisi is being published by Springer Heidelberg this year with updated 2009 AJCC staging as Basic Radiation Oncology.

  7. Bottled Water Basics

    Science.gov (United States)

    Table of contents: Bottled water basics (pg. 2); Advice for people with severely compromised immune systems (sidebar, pg. 2); Know what you're buying (pg. 3); Taste considerations (pg. 4); Bottled water terms (sidebar, pg. 4); Begin by reading the ...

  8. Monte Carlo: Basics

    OpenAIRE

    Murthy, K. P. N.

    2001-01-01

    An introduction to the basics of Monte Carlo is given. The topics covered include sample space, events, probabilities, random variables, mean, variance, covariance, characteristic function, Chebyshev inequality, law of large numbers, central limit theorem (stable distribution, Lévy distribution), random numbers (generation and testing), random sampling techniques (inversion, rejection, sampling from a Gaussian, Metropolis sampling), analogue Monte Carlo and importance sampling (exponential b...
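Two of the topics listed above, analogue Monte Carlo estimation and the law of large numbers, can be sketched in a few lines (a minimal illustration, not taken from the lecture notes): estimate π by sampling the unit square uniformly and counting hits inside the quarter circle, and observe that the estimate tightens as the sample size grows.

```python
import random

def estimate_pi(n, rng):
    """Analogue Monte Carlo: fraction of uniform points in the unit square
    that fall inside the quarter circle, scaled by 4."""
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n

rng = random.Random(42)
coarse = estimate_pi(1_000, rng)      # noisy estimate
fine = estimate_pi(1_000_000, rng)    # law of large numbers: much tighter
```

The standard error of the estimate shrinks as 1/√n, so the million-sample run is roughly 30 times more accurate than the thousand-sample run.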

  9. Ethanol Basics (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2015-01-01

    Ethanol is a widely-used, domestically-produced renewable fuel made from corn and other plant materials. More than 96% of gasoline sold in the United States contains ethanol. Learn more about this alternative fuel in the Ethanol Basics Fact Sheet, produced by the U.S. Department of Energy's Clean Cities program.

  10. Basic Soils. Revision.

    Science.gov (United States)

    Montana State Univ., Bozeman. Dept. of Agricultural and Industrial Education.

    This curriculum guide is designed for use in teaching a course in basic soils that is intended for college freshmen. Addressed in the individual lessons of the unit are the following topics: the way in which soil is formed, the physical properties of soil, the chemical properties of soil, the biotic properties of soil, plant-soil-water…

  11. Sensitivity of fluvial sediment source apportionment to mixing model assumptions: A Bayesian model comparison.

    Science.gov (United States)

    Cooper, Richard J; Krueger, Tobias; Hiscock, Kevin M; Rawlins, Barry G

    2014-11-01

    Mixing models have become increasingly common tools for apportioning fluvial sediment load to various sediment sources across catchments using a wide variety of Bayesian and frequentist modeling approaches. In this study, we demonstrate how different model setups can impact upon resulting source apportionment estimates in a Bayesian framework via a one-factor-at-a-time (OFAT) sensitivity analysis. We formulate 13 versions of a mixing model, each with different error assumptions and model structural choices, and apply them to sediment geochemistry data from the River Blackwater, Norfolk, UK, to apportion suspended particulate matter (SPM) contributions from three sources (arable topsoils, road verges, and subsurface material) under base flow conditions between August 2012 and August 2013. Whilst all 13 models estimate subsurface sources to be the largest contributor of SPM (median ∼76%), comparison of apportionment estimates reveals varying degrees of sensitivity to changing priors, inclusion of covariance terms, incorporation of time-variant distributions, and methods of proportion characterization. We also demonstrate differences in apportionment results between a full and an empirical Bayesian setup, and between a Bayesian and a frequentist optimization approach. This OFAT sensitivity analysis reveals that mixing model structural choices and error assumptions can significantly impact upon sediment source apportionment results, with estimated median contributions in this study varying by up to 21% between model versions. Users of mixing models are therefore strongly advised to carefully consider and justify their choice of model structure prior to conducting sediment source apportionment investigations. Key points: An OFAT sensitivity analysis of sediment fingerprinting mixing models is conducted. Bayesian models display high sensitivity to error assumptions and structural choices. Source apportionment results differ between Bayesian and frequentist approaches.
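The un-mixing problem behind such models can be sketched as follows (a deliberately simplified, non-Bayesian illustration with invented tracer values; the study's 13 Bayesian model versions are far richer): each tracer concentration in the mixture is a convex combination of the source signatures, and the source proportions are found by minimising the misfit over the simplex.

```python
# Simplified sediment un-mixing sketch (illustrative values only).
# Mixture tracer concentrations are a convex combination of source
# signatures: c_mix[t] = sum_i p[i] * c_src[i][t], with p[i] >= 0, sum(p) = 1.

def mixture(p, c_src):
    """Predicted mixture concentration for each tracer."""
    n_tracers = len(c_src[0])
    return [sum(pi * src[t] for pi, src in zip(p, c_src))
            for t in range(n_tracers)]

def sse(p, c_src, c_obs):
    """Sum of squared errors between predicted and observed mixture."""
    return sum((m - o) ** 2 for m, o in zip(mixture(p, c_src), c_obs))

def grid_search(c_src, c_obs, step=0.01):
    """Brute-force minimisation of the misfit over the 2-simplex."""
    best_p, best_err = None, float("inf")
    n = int(round(1.0 / step))
    for i in range(n + 1):
        for j in range(n + 1 - i):
            p = (i * step, j * step, 1.0 - (i + j) * step)
            err = sse(p, c_src, c_obs)
            if err < best_err:
                best_p, best_err = p, err
    return best_p

# Hypothetical geochemical signatures (three sources x two tracers):
c_src = [(10.0, 1.0),   # arable topsoil
         (4.0, 6.0),    # road verge
         (1.0, 2.0)]    # subsurface material
true_p = (0.15, 0.10, 0.75)          # subsurface dominates, as in the study
c_obs = mixture(true_p, c_src)       # synthetic "observed" mixture

p_hat = grid_search(c_src, c_obs)
```

A Bayesian version would place a prior over the simplex and sample the posterior of `p` rather than returning a point estimate, which is exactly where the prior and error-model choices probed by the OFAT analysis enter.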

  12. Testing the assumptions of the pyrodiversity begets biodiversity hypothesis for termites in semi-arid Australia.

    Science.gov (United States)

    Davis, Hayley; Ritchie, Euan G; Avitabile, Sarah; Doherty, Tim; Nimmo, Dale G

    2018-04-01

    Fire shapes the composition and functioning of ecosystems globally. In many regions, fire is actively managed to create diverse patch mosaics of fire-ages under the assumption that a diversity of post-fire age classes will provide a greater variety of habitats, thereby enabling species with differing habitat requirements to coexist and enhancing species diversity (the pyrodiversity begets biodiversity hypothesis). However, studies provide mixed support for this hypothesis. Here, using termite communities in a semi-arid region of southeast Australia, we test four key assumptions of the pyrodiversity begets biodiversity hypothesis: (i) that fire shapes vegetation structure over sufficient time frames to influence species' occurrence, (ii) that animal species are linked to resources that are themselves shaped by fire and that peak at different times since fire, (iii) that species' probability of occurrence or abundance peaks at varying times since fire, and (iv) that providing a diversity of fire-ages increases species diversity at the landscape scale. Termite species and habitat elements were sampled in 100 sites across a range of fire-ages, nested within 20 landscapes chosen to represent a gradient of low to high pyrodiversity. We used regression modelling to explore relationships between termites, habitat and fire. Fire affected two habitat elements (coarse woody debris and the cover of woody vegetation) that were associated with the probability of occurrence of three termite species and overall species richness, thus supporting the first two assumptions of the pyrodiversity hypothesis. However, this did not result in those species or species richness being affected by fire history per se. Consequently, landscapes with a low diversity of fire histories had similar numbers of termite species as landscapes with high pyrodiversity. Our work suggests that encouraging a diversity of fire-ages to enhance termite species richness in this study region is not necessary.

  13. Providing security assurance in line with national DBT assumptions

    Science.gov (United States)

    Bajramovic, Edita; Gupta, Deeksha

    2017-01-01

    As worldwide energy requirements are increasing simultaneously with climate change and energy security considerations, States are thinking about building nuclear power to fulfill their electricity requirements and decrease their dependence on carbon fuels. New nuclear power plants (NPPs) must have comprehensive cybersecurity measures integrated into their design, structure, and processes. In the absence of effective cybersecurity measures, the impact of nuclear security incidents can be severe. Some of the current nuclear facilities were not specifically designed and constructed to deal with the new threats, including targeted cyberattacks. Thus, newcomer countries must consider the Design Basis Threat (DBT) as one of the security fundamentals during design of physical and cyber protection systems of nuclear facilities. IAEA NSS 10 describes the DBT as "comprehensive description of the motivation, intentions and capabilities of potential adversaries against which protection systems are designed and evaluated". Nowadays, many threat actors, including hacktivists, insider threat, cyber criminals, state and non-state groups (terrorists) pose security risks to nuclear facilities. Threat assumptions are made on a national level. Consequently, threat assessment closely affects the design structures of nuclear facilities. Some of the recent security incidents e.g. Stuxnet worm (Advanced Persistent Threat) and theft of sensitive information in South Korea Nuclear Power Plant (Insider Threat) have shown that these attacks should be considered as the top threat to nuclear facilities. Therefore, the cybersecurity context is essential for secure and safe use of nuclear power. In addition, States should include multiple DBT scenarios in order to protect various target materials, types of facilities, and adversary objectives. Development of a comprehensive DBT is a precondition for the establishment and further improvement of domestic state nuclear-related regulations in the

  14. Uniform background assumption produces misleading lung EIT images.

    Science.gov (United States)

    Grychtol, Bartłomiej; Adler, Andy

    2013-06-01

    Electrical impedance tomography (EIT) estimates an image of conductivity change within a body from stimulation and measurement at body surface electrodes. There is significant interest in EIT for imaging the thorax, as a monitoring tool for lung ventilation. To be useful in this application, we require an understanding of if and when EIT images can produce inaccurate images. In this paper, we study the consequences of the homogeneous background assumption, frequently made in linear image reconstruction, which introduces a mismatch between the reference measurement and the linearization point. We show in simulation and experimental data that the resulting images may contain large and clinically significant errors. A 3D finite element model of thorax conductivity is used to simulate EIT measurements for different heart and lung conductivity, size and position, as well as different amounts of gravitational collapse and ventilation-associated conductivity change. Three common linear EIT reconstruction algorithms are studied. We find that the asymmetric position of the heart can cause EIT images of ventilation to show up to 60% undue bias towards the left lung and that the effect is particularly strong for a ventilation distribution typical of mechanically ventilated patients. The conductivity gradient associated with gravitational lung collapse causes conductivity changes in non-dependent lung to be overestimated by up to 100% with respect to the dependent lung. Eliminating the mismatch by using a realistic conductivity distribution in the forward model of the reconstruction algorithm strongly reduces these undesirable effects. We conclude that subject-specific anatomically accurate forward models should be used in lung EIT and extra care is required when analysing EIT images of subjects whose background conductivity distribution in the lungs is known to be heterogeneous or exhibiting large changes.
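The core effect, bias from a mismatched linearisation point, can be reproduced in a deliberately tiny toy model (the two-region "forward model" and its sensitivities below are invented for illustration, not a real EIT operator): two regions change conductivity by the same amount, but reconstructing with a Jacobian evaluated at a homogeneous background makes one region appear to change several times more than the other.

```python
# Toy demonstration of linearisation-point mismatch (illustrative only).

def forward(sigma):
    """Two 'measurements', each mostly sensitive to one of two regions."""
    s1, s2 = sigma
    return (1.0 / s1 + 0.2 / s2, 0.2 / s1 + 1.0 / s2)

def jacobian(sigma):
    """Analytic Jacobian of `forward` at the chosen linearisation point."""
    s1, s2 = sigma
    return ((-1.0 / s1**2, -0.2 / s2**2),
            (-0.2 / s1**2, -1.0 / s2**2))

def solve2(J, dv):
    """Solve the 2x2 linear system J @ dsigma = dv."""
    (a, b), (c, d) = J
    det = a * d - b * c
    return ((d * dv[0] - b * dv[1]) / det,
            (-c * dv[0] + a * dv[1]) / det)

true_bg = (1.0, 2.0)            # heterogeneous true background
true_change = (0.1, 0.1)        # both regions change by the same amount
perturbed = tuple(s + d for s, d in zip(true_bg, true_change))
dv = tuple(vp - v0 for vp, v0 in zip(forward(perturbed), forward(true_bg)))

est_hom = solve2(jacobian((1.0, 1.0)), dv)   # homogeneous assumption
est_true = solve2(jacobian(true_bg), dv)     # matched linearisation point
```

With the homogeneous Jacobian, region 1 appears to change several times more than region 2 even though the true changes are equal; the matched Jacobian recovers near-equal changes, mirroring the paper's conclusion that realistic background models strongly reduce the bias.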

  16. Energetics of basic karate kata.

    Science.gov (United States)

    Bussweiler, Jens; Hartmann, Ulrich

    2012-12-01

    Knowledge about energy requirements during exercise seems necessary to develop training concepts in the combat sport karate. It is a commonly held view that the anaerobic lactic energy metabolism plays a key role, but this assumption has not been confirmed so far. The metabolic cost and fractional energy supply of a basic karate Kata (Heian Nidan, Shotokan style) with a duration of about 30 s were analyzed. Six male karateka [mean ± SD: age 29 ± 8 years; height 177 ± 5 cm; body mass 75 ± 9 kg] with different training experience (advanced athletes, experts, elite athletes) were examined while performing the sport-specific movements once and twice in succession. During Kata performance, oxygen uptake was measured with a portable spirometric device, blood lactate concentrations were examined before and after testing, and fractional energy supply was calculated. The results show that on average 52% of the energy supply for one Heian Nidan came from anaerobic alactic metabolism, 25% from anaerobic lactic metabolism and 23% from aerobic metabolism. For two sequentially executed Heian Nidan, and thus nearly double the duration, the calculated percentages were 33, 25 and 42%. Total energy demand for one Kata and two Kata was approximately 61 and 99 kJ, respectively. Despite measured blood lactate concentrations of up to 8.1 mmol·l⁻¹, which might suggest a dominance of lactic energy supply, a lactic fraction of only 17-31% was found during these relatively short and intense sequences. A heavy reliance on lactic energy metabolism therefore had to be rejected.
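As a quick consistency check of the figures above, the per-pathway energies implied by the reported total and fractions can be computed directly (the split below is reconstructed from the stated numbers, not measured data):

```python
# Reported values for one Heian Nidan: ~61 kJ total, split 52/25/23 %.
total_kj = 61.0
fractions = {"alactic": 0.52, "lactic": 0.25, "aerobic": 0.23}

# Energy per pathway implied by the reported fractions (reconstructed).
per_pathway_kj = {k: f * total_kj for k, f in fractions.items()}
```

This puts the alactic contribution at roughly 32 kJ, about twice the lactic contribution of roughly 15 kJ, consistent with the study's rejection of a lactic-dominated supply.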

  17. 38 CFR 21.142 - Adult basic education.

    Science.gov (United States)

    2010-07-01

    38 Pensions, Bonuses, and Veterans' Relief (2010-07-01): Adult basic education. VOCATIONAL REHABILITATION AND EDUCATION, Vocational Rehabilitation and Employment Under 38 U.S.C. Chapter 31, Special Rehabilitation Services, § 21.142 Adult basic education. (a) Definition. The term adult basic...

  18. Fundamentals of neurogastroenterology: basic science.

    Science.gov (United States)

    Grundy, David; Al-Chaer, Elie D; Aziz, Qasim; Collins, Stephen M; Ke, Meiyun; Taché, Yvette; Wood, Jackie D

    2006-04-01

    The focus of neurogastroenterology in Rome II was the enteric nervous system (ENS). To avoid duplication with Rome II, only advances in ENS neurobiology after Rome II are reviewed together with stronger emphasis on interactions of the brain, spinal cord, and the gut in terms of relevance for abdominal pain and disordered gastrointestinal function. A committee with expertise in selective aspects of neurogastroenterology was invited to evaluate the literature and provide a consensus overview of the Fundamentals of Neurogastroenterology textbook as they relate to functional gastrointestinal disorders (FGIDs). This review is an abbreviated version of a fuller account that appears in the forthcoming book, Rome III. This report reviews current basic science understanding of visceral sensation and its modulation by inflammation and stress and advances in the neurophysiology of the ENS. Many of the concepts are derived from animal studies in which the physiologic mechanisms underlying visceral sensitivity and neural control of motility, secretion, and blood flow are examined. Impact of inflammation and stress in experimental models relative to FGIDs is reviewed as is human brain imaging, which provides a means for translating basic science to understanding FGID symptoms. Investigative evidence and emerging concepts implicate dysfunction in the nervous system as a significant factor underlying patient symptoms in FGIDs. Continued focus on neurogastroenterologic factors that underlie the development of symptoms will lead to mechanistic understanding that is expected to directly benefit the large contingent of patients and care-givers who deal with FGIDs.

  19. Teaching and Learning Science in the 21st Century: Challenging Critical Assumptions in Post-Secondary Science

    Directory of Open Access Journals (Sweden)

    Amanda L. Glaze

    2018-01-01

    Full Text Available It is widely agreed upon that the goal of science education is building a scientifically literate society. Although there are a range of definitions for science literacy, most involve an ability to problem solve, make evidence-based decisions, and evaluate information in a manner that is logical. Unfortunately, science literacy appears to be an area where we struggle across levels of study, including with students who are majoring in the sciences in university settings. One reason for this problem is that we have opted to continue to approach teaching science in a way that fails to consider the critical assumptions that faculties in the sciences bring into the classroom. These assumptions include expectations of what students should know before entering given courses, whose responsibility it is to ensure that students entering courses understand basic scientific concepts, the roles of researchers and teachers, and approaches to teaching at the university level. Acknowledging these assumptions and the potential for action to shift our teaching and thinking about post-secondary education represents a transformative area in science literacy and preparation for the future of science as a field.

  20. Rethinking our assumptions about the evolution of bird song and other sexually dimorphic signals

    Directory of Open Access Journals (Sweden)

    J. Jordan Price

    2015-04-01

    Full Text Available Bird song is often cited as a classic example of a sexually-selected ornament, in part because historically it has been considered a primarily male trait. Recent evidence that females also sing in many songbird species and that sexual dimorphism in song is often the result of losses in females rather than gains in males therefore appears to challenge our understanding of the evolution of bird song through sexual selection. Here I propose that these new findings do not necessarily contradict previous research, but rather they disagree with some of our assumptions about the evolution of sexual dimorphisms in general and female song in particular. These include misconceptions that current patterns of elaboration and diversity in each sex reflect past rates of change and that levels of sexual dimorphism necessarily reflect levels of sexual selection. Using New World blackbirds (Icteridae) as an example, I critically evaluate these past assumptions in light of new phylogenetic evidence. Understanding the mechanisms underlying such sexually dimorphic traits requires a clear understanding of their evolutionary histories. Only then can we begin to ask the right questions.

  1. On the derivation of approximations to cellular automata models and the assumption of independence.

    Science.gov (United States)

    Davies, K J; Green, J E F; Bean, N G; Binder, B J; Ross, J V

    2014-07-01

    Cellular automata are discrete agent-based models, generally used in cell-based applications. There is much interest in obtaining continuum models that describe the mean behaviour of the agents in these models. Previously, continuum models have been derived for agents undergoing motility and proliferation processes; however, these models only hold under restricted conditions. In order to narrow down the reason for these restrictions, we explore three possible sources of error in deriving the model. These sources are the choice of limiting arguments, the use of a discrete-time model as opposed to a continuous-time model, and the assumption of independence between the states of sites. We present a rigorous analysis in order to gain a greater understanding of the significance of these three issues. By finding a limiting regime that accurately approximates the conservation equation for the cellular automata, we are able to conclude that the inaccuracy between our approximation and the cellular automata is completely based on the assumption of independence. Copyright © 2014 Elsevier Inc. All rights reserved.
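    To make the independence issue concrete, here is a minimal sketch (assumed parameters and a proliferation-only update rule, not the model analysed in the paper) comparing a 1-D cellular automaton with its mean-field approximation, which treats the occupancy of each site as independent:

```python
import random

def ca_density(num_sites=200, p=0.2, steps=50, seed=1):
    """1-D proliferation-only cellular automaton: every occupied site
    attempts, with probability p, to place a daughter in one randomly
    chosen neighbour (placement silently fails if it is occupied)."""
    random.seed(seed)
    lattice = [0] * num_sites
    lattice[num_sites // 2] = 1          # single seed agent
    history = [sum(lattice) / num_sites]
    for _ in range(steps):
        for i in [k for k, s in enumerate(lattice) if s]:
            if random.random() < p:
                j = (i + random.choice((-1, 1))) % num_sites
                lattice[j] = 1
        history.append(sum(lattice) / num_sites)
    return history

def mean_field_density(p=0.2, steps=50, c0=1 / 200):
    """Mean-field (independence-assumption) counterpart: treating sites
    as independent yields the logistic update c <- c + p * c * (1 - c)."""
    c, history = c0, [c0]
    for _ in range(steps):
        c += p * c * (1 - c)
        history.append(c)
    return history

ca = ca_density()
mf = mean_field_density()
```

    Because the CA cluster spreads from a single seed, neighbouring sites are strongly correlated, and the logistic mean-field curve overestimates the occupied fraction; this gap is exactly the kind of error the paper attributes to the independence assumption.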

  2. Transportation Emissions: some basics

    DEFF Research Database (Denmark)

    Kontovas, Christos A.; Psaraftis, Harilaos N.

    2016-01-01

    Transportation is the backbone of international trade and a key engine driving globalization. However, there is growing concern that the Earth’s atmospheric composition is being altered by human activities, including transportation, which can lead to climate change. Air pollution from transportation, and especially carbon dioxide emissions, is at the center stage of discussion by the world community through various international treaties, such as the Kyoto Protocol. The transportation sector also emits non-CO2 pollutants that have important effects on air quality, climate, and public health. The main purpose of this chapter is to introduce some basic concepts that are relevant in the quest of green transportation logistics. First, we present the basics of estimating emissions from transportation activities, the current statistics and future trends, as well as the total impact of air emissions...
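    As a sketch of the "basics of estimating emissions" mentioned above: the simplest fuel-based estimate multiplies fuel consumed by a fuel-specific carbon factor. The factor 3.114 t CO2 per tonne of heavy fuel oil follows IMO guidance, but treat the exact value and the usage figures below as illustrative assumptions rather than the chapter's own numbers.

```python
def co2_tonnes(fuel_burned_tonnes, carbon_factor=3.114):
    """Fuel-based estimate: CO2 emitted = fuel consumed x carbon factor.
    The default factor (t CO2 per t heavy fuel oil) is an assumed,
    illustrative value taken from common IMO guidance."""
    return fuel_burned_tonnes * carbon_factor

# e.g. a ship burning 50 t of heavy fuel oil per day:
daily_co2 = co2_tonnes(50)   # about 155.7 t CO2 per day
```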

  3. Basic Emotions: A Reconstruction

    Science.gov (United States)

    Mason, William A.; Capitanio, John P.

    2016-01-01

    Emotionality is a basic feature of behavior. The argument over whether the expression of emotions is based primarily on culture (constructivism, nurture) or biology (natural forms, nature) will never be resolved because both alternatives are untenable. The evidence is overwhelming that at all ages and all levels of organization, the development of emotionality is epigenetic: The organism is an active participant in its own development. To ascribe these effects to “experience” was the best that could be done for many years. With the rapid acceleration of information on how changes in organization are actually brought about, it is a good time to review, update, and revitalize our views of experience in relation to the concept of basic emotion. PMID:27110280

  4. Basic electronic circuits

    CERN Document Server

    Buckley, P M

    1980-01-01

    In the past, the teaching of electricity and electronics has more often than not been carried out from a theoretical and often highly academic standpoint. Fundamentals and basic concepts have often been presented with no indication of their practical applications, and all too frequently they have been illustrated by artificially contrived laboratory experiments bearing little relationship to the outside world. The course comes in the form of fourteen fairly open-ended constructional experiments or projects. Each experiment has associated with it a construction exercise and an explanation. The basic idea behind this dual presentation is that the student can embark on each circuit following only the briefest possible instructions and that an open-ended approach is thereby not prejudiced by an initial lengthy encounter with the theory behind the project; this being a sure way to dampen enthusiasm at the outset. As the investigation progresses, questions inevitably arise. Descriptions of the phenomena encounte...

  5. Basic linear algebra

    CERN Document Server

    Blyth, T S

    2002-01-01

    Basic Linear Algebra is a text for first year students leading from concrete examples to abstract theorems, via tutorial-type exercises. More exercises (of the kind a student may expect in examination papers) are grouped at the end of each section. The book covers the most important basics of any first course on linear algebra, explaining the algebra of matrices with applications to analytic geometry, systems of linear equations, difference equations and complex numbers. Linear equations are treated via Hermite normal forms which provides a successful and concrete explanation of the notion of linear independence. Another important highlight is the connection between linear mappings and matrices leading to the change of basis theorem which opens the door to the notion of similarity. This new and revised edition features additional exercises and coverage of Cramer's rule (omitted from the first edition). However, it is the new, extra chapter on computer assistance that will be of particular interest to readers:...

  6. Basics of statistical physics

    CERN Document Server

    Müller-Kirsten, Harald J W

    2013-01-01

    Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
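    The different ways of counting arrangements in the three statistics can be made concrete with a short sketch (a single energy level with g degenerate states; this is standard textbook counting, assumed here rather than quoted from the book):

```python
from math import comb

def maxwell_boltzmann(n, g):
    """Classical distinguishable particles: each of the n particles can
    occupy any of the g states independently, giving g**n arrangements."""
    return g ** n

def bose_einstein(n, g):
    """Indistinguishable bosons, unlimited occupancy per state: the
    stars-and-bars count C(n + g - 1, n)."""
    return comb(n + g - 1, n)

def fermi_dirac(n, g):
    """Indistinguishable fermions, at most one particle per state:
    C(g, n) ways to choose which states are occupied."""
    return comb(g, n)

# Two particles distributed over three degenerate states:
# MB -> 9, BE -> 6, FD -> 3 arrangements.
```

    The drop from 9 to 6 to 3 arrangements illustrates how indistinguishability, and then the exclusion principle, restrict the counting that underlies each distribution.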

  7. Emulsion Science Basic Principles

    CERN Document Server

    Leal-Calderon, Fernando; Schmitt, Véronique

    2007-01-01

    Emulsions are generally made out of two immiscible fluids like oil and water, one being dispersed in the second in the presence of surface-active compounds. They are used as intermediate or end products in a huge range of areas including the food, chemical, cosmetic, pharmaceutical, paint, and coating industries. Besides the broad domain of technological interest, emulsions are raising a variety of fundamental questions at the frontier between physics and chemistry. This book aims to give an overview of the most recent advances in emulsion science. The basic principles, covering aspects of emulsions from their preparation to their destruction, are presented in close relation to both the fundamental physics and the applications of these materials. The book is intended to help scientists and engineers in formulating new materials by giving them the basics of emulsion science.

  8. Basics of Computer Networking

    CERN Document Server

    Robertazzi, Thomas

    2012-01-01

    The Springer Brief Basics of Computer Networking provides a non-mathematical introduction to the world of networks, covering technology for both wired and wireless networks. Coverage includes transmission media, local area networks, wide area networks, and network security. It is written in a very accessible style for the interested layman by the author of a widely used textbook, with many years of experience explaining concepts to the beginner.

  9. Risk communication basics

    International Nuclear Information System (INIS)

    Corrado, P.G.

    1995-01-01

    In low-trust, high-concern situations, 50% of your credibility comes from perceived empathy and caring, demonstrated in the first 30 s you come in contact with someone. There is no second chance for a first impression. These and other principles contained in this paper provide you with a basic level of understanding of risk communication. The principles identified are time-tested caveats and will assist you in effectively communicating technical information

  10. Risk communication basics

    Energy Technology Data Exchange (ETDEWEB)

    Corrado, P.G. [Lawrence Livermore National Laboratory, CA (United States)

    1995-12-31

    In low-trust, high-concern situations, 50% of your credibility comes from perceived empathy and caring, demonstrated in the first 30 s you come in contact with someone. There is no second chance for a first impression. These and other principles contained in this paper provide you with a basic level of understanding of risk communication. The principles identified are time-tested caveats and will assist you in effectively communicating technical information.

  11. Basic nucleonics. 2. ed.

    International Nuclear Information System (INIS)

    Guzman, M.E.

    1989-01-01

    This book is oriented mainly towards professionals who are not physicists or experts in nuclear sciences, physicians planning to specialize in nuclear medicine or radiotherapy and technicians involved in nuclear applications. The book covers the fundamental concepts of nuclear science and technology in a simple and ordered fashion. Theory is illustrated with appropriate exercises and answers. With 17 chapters plus 3 appendices on mathematics, basic concepts are covered in: nuclear science, radioactivity, radiation and matter, nuclear reactions, X rays, shielding and radioprotection

  12. Basic of Neutron NDA

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Alexis Chanel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-15

    The objectives of this presentation are to introduce the basic physics of neutron production, interactions and detection; identify the processes that generate neutrons; explain the most common neutron mechanisms, spontaneous and induced fission and (a,n) reactions; describe the properties of neutrons from different sources; recognize advantages of neutron measurement techniques; recognize common neutron interactions; explain neutron cross-section measurements; describe the fundamentals of 3He detector function and designs; and differentiate between passive and active assay techniques.

  13. Shoulder arthroscopy: the basics.

    Science.gov (United States)

    Farmer, Kevin W; Wright, Thomas W

    2015-04-01

    Shoulder arthroscopy is a commonly performed and accepted procedure for a wide variety of pathologies. Surgeon experience, patient positioning, knowledge of surgical anatomy, proper portal placement, and proper use of instrumentation can improve technical success and minimize complication risks. This article details the surgical anatomy, indications, patient positioning, portal placement, instrumentation, and complications for basic shoulder arthroscopy. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  14. Basic accelerator optics

    CERN Document Server

    CERN. Geneva. Audiovisual Unit

    1985-01-01

    A complete derivation, from first principles, of the concepts and methods applied in linear accelerator and beamline optics will be presented. Particle motion and beam motion in systems composed of linear magnets, as well as weak and strong focusing and special insertions are treated in mathematically simple terms, and design examples for magnets and systems are given. This series of five lectures is intended to provide all the basic tools required for the design and operation of beam optical systems.

  15. Basic concepts in oceanography

    International Nuclear Information System (INIS)

    Small, L.F.

    1997-01-01

    Basic concepts in oceanography include major wind patterns that drive ocean currents, and the effects that the earth's rotation, positions of land masses, and temperature and salinity have on oceanic circulation and hence global distribution of radioactivity. Special attention is given to coastal and near-coastal processes such as upwelling, tidal effects, and small-scale processes, as radionuclide distributions are currently most associated with coastal regions. (author)

  16. Basic Financial Accounting

    DEFF Research Database (Denmark)

    Wiborg, Karsten

    This textbook on Basic Financial Accounting is targeted at students of economics at universities and business colleges taking an introductory subject on the external dimension of the company's economic reporting, including bookkeeping, etc. The book includes the following subjects: business entities, the transformation process, types of businesses, stakeholders, legislation, the annual report, the VAT system, double-entry bookkeeping, inventories, and year-end cash flow analysis.

  17. School Principals' Assumptions about Human Nature: Implications for Leadership in Turkey

    Science.gov (United States)

    Sabanci, Ali

    2008-01-01

    This article considers principals' assumptions about human nature in Turkey and the relationship between the assumptions held and the leadership style adopted in schools. The findings show that school principals hold Y-type assumptions and prefer a relationship-oriented style in their relations with assistant principals. However, both principals…

  18. Educational Technology as a Subversive Activity: Questioning Assumptions Related to Teaching and Leading with Technology

    Science.gov (United States)

    Kruger-Ross, Matthew J.; Holcomb, Lori B.

    2012-01-01

    The use of educational technologies is grounded in the assumptions of teachers, learners, and administrators. Assumptions are choices that structure our understandings and help us make meaning. Current advances in Web 2.0 and social media technologies challenge our assumptions about teaching and learning. The intersection of technology and…

  19. Personal and Communal Assumptions to Determine Pragmatic Meanings of Phatic Functions

    Directory of Open Access Journals (Sweden)

    Kunjana Rahardi

    2016-11-01

    Full Text Available This research was meant to describe the manifestations of phatic function in the education domain. The phatic function in the communication and interaction happening in the education domain could be accurately identified when the utterances were not separated from their determining pragmatic context. The context must not be limited only to contextual and social or societal perspectives, but must be defined as basic assumptions. The data of this research included various kinds of speech gathered naturally in education circles that contain phatic functions. Two methods of data gathering were employed in this study, namely listening and conversation methods. Recorded data were analyzed through the following steps: (1) data were identified based on the discourse markers found; (2) data were classified based on the phatic perception criteria; (3) data were interpreted based on the referenced theories; (4) data were described in the form of an analysis result description. The research proves that phatic function in the form of small talk in the education domain cannot be separated from the context surrounding it.

  20. Spatial modelling of assumption of tourism development with geographic IT using

    Directory of Open Access Journals (Sweden)

    Jitka Machalová

    2010-01-01

    Full Text Available The aim of this article is to show the possibilities of spatial modelling and analysing of assumptions of tourism development in the Czech Republic with the objective to make decision-making processes in tourism easier and more efficient (for companies, clients as well as destination managements). The development and placement of tourism depend on the factors (conditions) that influence its application in specific areas. These factors are usually divided into three groups: selective, localization and realization. Tourism is inseparably connected with space – countryside. The countryside can be modelled and consecutively analysed by the means of geographical information technologies. With the help of spatial modelling and following analyses the localization and realization conditions in the regions of the Czech Republic have been evaluated. The best localization conditions have been found in the Liberecký region. The capital city of Prague has negligible natural conditions; however, those social ones are on a high level. Next, the spatial analyses have shown that the best realization conditions are provided by the capital city of Prague. Then the Central-Bohemian, South-Moravian, Moravian-Silesian and Karlovarský regions follow. The development of a tourism destination depends not only on the localization and realization factors but is also fundamentally affected by the level of local destination management. Spatial modelling can help destination managers in decision-making processes in order to make optimal use of destination potential and to target their marketing activities efficiently.

  1. Regression assumptions in clinical psychology research practice-a systematic review of common misconceptions.

    Science.gov (United States)

    Ernst, Anja F; Albers, Casper J

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of and increased transparency in the reporting of statistical assumption checking.
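    The distinction the review highlights (normality of the errors, not of the variables) can be illustrated with a small synthetic sketch; the data, sample size, and skewness check below are assumptions for illustration, not taken from the reviewed papers. A skewed predictor makes the outcome variable non-normal even though the regression errors are perfectly normal:

```python
import random
import statistics

def skewness(values):
    """Sample skewness: third central moment over the cubed population sd."""
    m = statistics.fmean(values)
    s = statistics.pstdev(values)
    return sum((v - m) ** 3 for v in values) / (len(values) * s ** 3)

random.seed(0)
n = 2000
x = [random.expovariate(1.0) for _ in range(n)]            # skewed predictor
y = [2.0 + 3.0 * xi + random.gauss(0.0, 0.5) for xi in x]  # normal errors

# Ordinary least squares fit for simple linear regression
mx, my = statistics.fmean(x), statistics.fmean(y)
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
intercept = my - slope * mx
residuals = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
```

    The outcome y inherits the predictor's strong right skew, so a normality test on y alone would wrongly flag a "violation", while the residuals, the quantity the assumption actually concerns, come out close to symmetric.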

  2. Design assumptions and bases for small D-T-fueled Spherical Tokamak (ST) fusion core

    International Nuclear Information System (INIS)

    Peng, Y.K.M.; Galambos, J.D.; Fogarty, P.J.

    1996-01-01

    Recent progress in defining the assumptions and clarifying the bases for a small D-T-fueled ST fusion core are presented. The paper covers several issues in the physics of ST plasmas, the technology of neutral beam injection, the engineering design configuration, and the center leg material under intense neutron irradiation. This progress was driven by the exciting data from pioneering ST experiments, a heightened interest in proof-of-principle experiments at the MA level in plasma current, and the initiation of the first conceptual design study of the small ST fusion core. The needs recently identified for a restructured fusion energy sciences program have provided a timely impetus for examining the subject of this paper. Our results, though preliminary in nature, strengthen the case for the potential realism and attractiveness of the ST approach

  3. Agronomic Use of Basic Slag

    Directory of Open Access Journals (Sweden)

    Fabio Oliveiri de Nobile

    2015-01-01

    Full Text Available Modern civilization, in recent years, has increased the requirement of products derived from iron and steel, stimulating the growth of the national siderurgical sector and, consequently, the generation of the industrial residue called basic slag. In this context, the recycling of residues can contribute to solve problems of the industries that give priority to the excellence of production with quality. On the other hand, there is a sector of primary production in Brazil, agriculture, with a great cultivated area in acid soil with low fertility, these factors being admittedly determinative for vegetal production under tropical conditions. Thus, there is a scenery of two primary sectors of production, although distinct ones, that present interaction potential, for, on one hand, there is availability of a product with similar properties to the liming materials and traditional fertilizers and, on the other hand, a production sector that is highly dependent on these products. And the interaction between these two sectors helps in the preservation of the environment, bringing, thus, a certain sustainability to the production systems of the postmodern civilization that will be the challenge of this new century. Considering the current possibility of recycling these industrial residues in agriculture, three important factors have to be taken into account. The first would be the proper use of the abundant, available and promising industrial residue; the second, a propitious agricultural environment, acid soil of low fertility; and the third, a responsive and economically important crop, sugar cane, considering its vast cultivated area. In national literature, few works have dealt with the use of basic slag and have evaluated the response of crops to its application. Thus, the present work had as its aim to gather information from literature concerning the characterization and production of basic slag in Brazil, as well

  4. Problematic assumptions have slowed down depression research: why symptoms, not syndromes are the way forward

    Directory of Open Access Journals (Sweden)

    Eiko I Fried

    2015-03-01

    Full Text Available Major Depression (MD) is a highly heterogeneous diagnostic category. Diverse symptoms such as sad mood, anhedonia, and fatigue are routinely added to an unweighted sum-score, and cutoffs are used to distinguish between depressed participants and healthy controls. Researchers then investigate outcome variables like MD risk factors, biomarkers, and treatment response in such samples. These practices presuppose that (1) depression is a discrete condition, and that (2) symptoms are interchangeable indicators of this latent disorder. Here I review these two assumptions, elucidate their historical roots, show how deeply engrained they are in psychological and psychiatric research, and document that they contrast with evidence. Depression is not a consistent syndrome with clearly demarcated boundaries, and depression symptoms are not interchangeable indicators of an underlying disorder. Current research practices lump individuals with very different problems into one category, which has contributed to the remarkably slow progress in key research domains such as the development of efficacious antidepressants or the identification of biomarkers for depression. The recently proposed network framework offers an alternative to the problematic assumptions. MD is not understood as a distinct condition, but as a heterogeneous symptom cluster that substantially overlaps with other syndromes such as anxiety disorders. MD is not framed as an underlying disease with a number of equivalent indicators, but as a network of symptoms that have direct causal influence on each other: insomnia can cause fatigue which then triggers concentration and psychomotor problems. This approach offers new opportunities for constructing an empirically based classification system and has broad implications for future research.

  5. Basic physical phenomena, neutron production and scaling of the dense plasma focus

    International Nuclear Information System (INIS)

    Kaeppeler, H.J.

    This paper presents an attempt at establishing a model theory for the dense plasma focus in order to present a consistent interpretation of the basic physical phenomena leading to neutron production from both acceleration and thermal processes. To achieve this, the temporal history of the focus is divided into the compression of the plasma sheath, a quiescent and very dense phase with ensuing expansion, and an unstable phase where the focus plasma is disrupted by instabilities. Finally, the decay of density, velocity and thermal fields is considered. Under the assumption that I0^2/(sigma0*R0^2) = const and t0/Tc = const, scaling laws for plasma focus devices are derived. It is shown that while generally the neutron yield scales with the fourth power of maximum current, neutron production from thermal processes becomes increasingly important for large devices, while in the small devices neutron production from acceleration processes is by far predominant. (orig.) [de]
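    The fourth-power current scaling quoted above can be expressed as a one-line sketch (the reference yield and current values are illustrative placeholders, not measured figures from the paper):

```python
def scaled_neutron_yield(y_ref, i_ref, i_new):
    """Fourth-power scaling Y ~ I_max**4, derived under the stated
    assumptions that I0^2/(sigma0*R0^2) and t0/Tc are held constant."""
    return y_ref * (i_new / i_ref) ** 4

# Doubling the peak current raises the predicted yield by 2**4 = 16x.
```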

  6. The basic approach to age-structured population dynamics models, methods and numerics

    CERN Document Server

    Iannelli, Mimmo

    2017-01-01

    This book provides an introduction to age-structured population modeling which emphasises the connection between mathematical theory and underlying biological assumptions. Through the rigorous development of the linear theory and the nonlinear theory alongside numerics, the authors explore classical equations that describe the dynamics of certain ecological systems. Modeling aspects are discussed to show how relevant problems in the fields of demography, ecology, and epidemiology can be formulated and treated within the theory. In particular, the book presents extensions of age-structured modelling to the spread of diseases and epidemics while also addressing the issue of regularity of solutions, the asymptotic behaviour of solutions, and numerical approximation. With sections on transmission models, non-autonomous models and global dynamics, this book fills a gap in the literature on theoretical population dynamics. The Basic Approach to Age-Structured Population Dynamics will appeal to graduate students an...

  7. Uranium: a basic evaluation

    International Nuclear Information System (INIS)

    Crull, A.W.

    1978-01-01

    All energy sources and technologies, including uranium and the nuclear industry, are needed to provide power. Public misunderstanding of the nature of uranium and how it works as a fuel may jeopardize nuclear energy as a major option. Basic chemical facts about uranium ore and uranium fuel technology are presented. Some of the major policy decisions that must be made include the enrichment, stockpiling, and pricing of uranium. Investigations and lawsuits pertaining to uranium markets are reviewed, and the point is made that oil companies will probably have to divest their non-oil energy activities. Recommendations for nuclear policies that have been made by the General Accounting Office are discussed briefly

  8. C# Database Basics

    CERN Document Server

    Schmalz, Michael

    2012-01-01

    Working with data and databases in C# certainly can be daunting if you're coming from VB6, VBA, or Access. With this hands-on guide, you'll shorten the learning curve considerably as you master accessing, adding, updating, and deleting data with C#: basic skills you need if you intend to program with this language. No previous knowledge of C# is necessary. By following the examples in this book, you'll learn how to tackle several database tasks in C#, such as working with SQL Server, building data entry forms, and using data in a web service. The book's code samples will help you get started

  9. Electrical installation calculations basic

    CERN Document Server

    Kitcher, Christopher

    2013-01-01

    All the essential calculations required for basic electrical installation work. The Electrical Installation Calculations series has proved an invaluable reference for over forty years, for both apprentices and professional electrical installation engineers alike. The book provides a step-by-step guide to the successful application of electrical installation calculations required in day-to-day electrical engineering practice. A step-by-step guide to everyday calculations used on the job. An essential aid to the City & Guilds certificates at Levels 2 and 3Fo

  10. Basic structural dynamics

    CERN Document Server

    Anderson, James C

    2012-01-01

    A concise introduction to structural dynamics and earthquake engineering. Basic Structural Dynamics serves as a fundamental introduction to the topic of structural dynamics. Covering single and multiple-degree-of-freedom systems while providing an introduction to earthquake engineering, the book keeps the coverage succinct and on topic at a level that is appropriate for undergraduate and graduate students. Through dozens of worked examples based on actual structures, it also introduces readers to MATLAB, a powerful software for solving both simple and complex structural d

  11. Basic heat transfer

    CERN Document Server

    Bacon, D H

    2013-01-01

Basic Heat Transfer aims to help readers use a computer to solve heat transfer problems and to promote greater understanding by changing data values and observing the effects, which are necessary in design and optimization calculations. The book is concerned with applications including insulation and heating in buildings and pipes, temperature distributions in solids for steady state and transient conditions, the determination of surface heat transfer coefficients for convection in various situations, radiation heat transfer in grey body problems, the use of finned surfaces, and simple heat exc

  12. Back to basics audio

    CERN Document Server

    Nathan, Julian

    1998-01-01

Back to Basics Audio is a thorough, yet approachable handbook on audio electronics theory and equipment. The first part of the book discusses electrical and audio principles. Those principles form a basis for understanding the operation of equipment and systems, covered in the second section. Finally, the author addresses planning and installation of a home audio system. Julian Nathan joined the audio service and manufacturing industry in 1954 and moved into motion picture engineering and production in 1960. He installed and operated recording theaters in Sydney, Austra

  13. Machine shop basics

    CERN Document Server

    Miller, Rex

    2004-01-01

Use the right tool the right way. Here, fully updated to include new machines and electronic/digital controls, is the ultimate guide to basic machine shop equipment and how to use it. Whether you're a professional machinist, an apprentice, a trade student, or a handy homeowner, this fully illustrated volume helps you define tools and use them properly and safely. It's packed with review questions for students, and loaded with answers you need on the job. Mark Richard Miller is a Professor and Chairman of the Industrial Technology Department at Texas A&M University in Kingsville, T

  14. Basic bladder neurophysiology.

    Science.gov (United States)

    Clemens, J Quentin

    2010-11-01

    Maintenance of normal lower urinary tract function is a complex process that requires coordination between the central nervous system and the autonomic and somatic components of the peripheral nervous system. This article provides an overview of the basic principles that are recognized to regulate normal urine storage and micturition, including bladder biomechanics, relevant neuroanatomy, neural control of lower urinary tract function, and the pharmacologic processes that translate the neural signals into functional results. Finally, the emerging role of the urothelium as a sensory structure is discussed. Copyright © 2010 Elsevier Inc. All rights reserved.

  15. Basic and clinical immunology

    Science.gov (United States)

    Chinen, Javier; Shearer, William T.

    2003-01-01

    Progress in immunology continues to grow exponentially every year. New applications of this knowledge are being developed for a broad range of clinical conditions. Conversely, the study of primary and secondary immunodeficiencies is helping to elucidate the intricate mechanisms of the immune system. We have selected a few of the most significant contributions to the fields of basic and clinical immunology published between October 2001 and October 2002. Our choice of topics in basic immunology included the description of T-bet as a determinant factor for T(H)1 differentiation, the role of the activation-induced cytosine deaminase gene in B-cell development, the characterization of CD4(+)CD25(+) regulatory T cells, and the use of dynamic imaging to study MHC class II transport and T-cell and dendritic cell membrane interactions. Articles related to clinical immunology that were selected for review include the description of immunodeficiency caused by caspase 8 deficiency; a case series report on X-linked agammaglobulinemia; the mechanism of action, efficacy, and complications of intravenous immunoglobulin; mechanisms of autoimmunity diseases; and advances in HIV pathogenesis and vaccine development. We also reviewed two articles that explore the possible alterations of the immune system caused by spaceflights, a new field with increasing importance as human space expeditions become a reality in the 21st century.

  16. Benchmarking Using Basic DBMS Operations

    Science.gov (United States)

    Crolotte, Alain; Ghazal, Ahmad

The TPC-H benchmark proved to be successful in the decision support area. Many commercial database vendors and their related hardware vendors used these benchmarks to show the superiority and competitive edge of their products. However, over time, the TPC-H became less representative of industry trends as vendors kept tuning their databases to this benchmark-specific workload. In this paper, we present XMarq, a simple benchmark framework that can be used to compare various software/hardware combinations. Our benchmark model is currently composed of 25 queries that measure the performance of basic operations such as scans, aggregations, joins and index access. This benchmark model is based on the TPC-H data model due to its maturity and well-understood data generation capability. We also propose metrics to evaluate single-system performance and compare two systems. Finally, we illustrate the effectiveness of this model by showing experimental results comparing two systems under different conditions.

  17. Basics and application of PSpice

    International Nuclear Information System (INIS)

    Choi, Pyeong; Cho, Yong Beom; Mok, Hyeong Su; Baek, Dong CHeol

    2006-03-01

This book comprises nineteen chapters introducing the basics and applications of PSpice. Its contents cover: what PSpice is, an introduction to PSpice, PSpice simulation, DC analysis, parametric analysis, transient analysis, parametric analysis and measurements, Monte Carlo analysis, changing device characteristics, and ABM applications; and, on the circuits side, the elementary laws of circuits, basic RLC circuits, basic diode DC circuits, basic transistor and FET circuits, basic op-amp circuits, basic digital circuits, analog and digital circuit practice, digital circuit application and practice, and ABM circuit application and practice.

  18. ESPlannerBASIC CANADA

    Directory of Open Access Journals (Sweden)

    Laurence Kotlikoff

    2015-02-01

Full Text Available Traditional financial planning is based on a fundamental rule of thumb: Aim to save enough for retirement to replace 80 per cent of your pre-retirement income with income from pensions and assets. Millions of Canadians follow this formula. Yet, there is no guarantee this approach is consistent with a savings plan that will allow them to experience their optimal standard of living — given their income — throughout their working lives. Consumption smoothing happens when a consumer projects her income and her non-discretionary expenses (such as mortgage payments) all the way up until the end of her life, and is able to determine her household discretionary spending power over time, to achieve the smoothest living standard path possible without going into debt. When consumption smoothing is calculated accurately, a person’s lifestyle should be roughly the same whether she is in her 30s with small children, in her 50s with kids in college, or in retirement, with adult children. Consumption smoothing allows that to happen. But while it is conceptually straightforward, consumption smoothing requires the use of advanced numerical techniques. Now, Canadian families have access to a powerful consumption-smoothing tool: ESPlannerBASIC Canada. This free, secure and confidential online tool will allow Canadian families to safely and securely enter their earnings and other financial resources and will calculate for them how much they can spend and how much they should save in order to maintain their lifestyle from now until they die, without going into debt. It will also calculate how much life insurance they should buy, to ensure that household living standards are not affected after a family member dies. Users can easily and instantly run “what-if” scenarios to see how retiring early (or later), changing jobs, adjusting retirement contributions, having children, moving homes, timing RRSP withdrawals, and other financial and lifestyle decisions would

  19. The basic science of the subchondral bone

    NARCIS (Netherlands)

    Madry, Henning; van Dijk, C. Niek; Mueller-Gerbl, Magdalena

    2010-01-01

    In the past decades, considerable efforts have been made to propose experimental and clinical treatments for articular cartilage defects. Yet, the problem of cartilage defects extending deep in the underlying subchondral bone has not received adequate attention. A profound understanding of the basic

  20. Cloud computing basics

    CERN Document Server

    Srinivasan, S

    2014-01-01

    Cloud Computing Basics covers the main aspects of this fast moving technology so that both practitioners and students will be able to understand cloud computing. The author highlights the key aspects of this technology that a potential user might want to investigate before deciding to adopt this service. This book explains how cloud services can be used to augment existing services such as storage, backup and recovery. Addressing the details on how cloud security works and what the users must be prepared for when they move their data to the cloud. Also this book discusses how businesses could prepare for compliance with the laws as well as industry standards such as the Payment Card Industry.

  1. Basic semiconductor physics

    CERN Document Server

    Hamaguchi, Chihiro

    2017-01-01

    This book presents a detailed description of basic semiconductor physics. The text covers a wide range of important phenomena in semiconductors, from the simple to the advanced. Four different methods of energy band calculations in the full band region are explained: local empirical pseudopotential, non-local pseudopotential, KP perturbation and tight-binding methods. The effective mass approximation and electron motion in a periodic potential, Boltzmann transport equation and deformation potentials used for analysis of transport properties are discussed. Further, the book examines experiments and theoretical analyses of cyclotron resonance in detail. Optical and transport properties, magneto-transport, two-dimensional electron gas transport (HEMT and MOSFET) and quantum transport are reviewed, while optical transition, electron-phonon interaction and electron mobility are also addressed. Energy and electronic structure of a quantum dot (artificial atom) are explained with the help of Slater determinants. The...

  2. Basic category theory

    CERN Document Server

    Leinster, Tom

    2014-01-01

    At the heart of this short introduction to category theory is the idea of a universal property, important throughout mathematics. After an introductory chapter giving the basic definitions, separate chapters explain three ways of expressing universal properties: via adjoint functors, representable functors, and limits. A final chapter ties all three together. The book is suitable for use in courses or for independent study. Assuming relatively little mathematical background, it is ideal for beginning graduate students or advanced undergraduates learning category theory for the first time. For each new categorical concept, a generous supply of examples is provided, taken from different parts of mathematics. At points where the leap in abstraction is particularly great (such as the Yoneda lemma), the reader will find careful and extensive explanations. Copious exercises are included.

  3. Energy the basics

    CERN Document Server

    Schobert, Harold

    2013-01-01

People rarely stop to think about where the energy they use to power their everyday lives comes from, and when they do it is often to ask a worried question: is mankind's energy usage killing the planet? How do we deal with nuclear waste? What happens when the oil runs out? Energy: The Basics answers these questions, but it also does much more. In this engaging yet even-handed introduction, readers are introduced to: the concept of 'energy' and what it really means; the ways energy is currently generated and the sources used; new and emerging energy technologies such as solar power and biofuels; and the impacts of energy use on the environment, including climate change. Featuring explanatory diagrams, tables, a glossary and an extensive further reading list, this book is the ideal starting point for anyone interested in the impact and future of the world's energy supply.

  4. Basic ionizing radiation physics

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

To become an expert in this field, a radiographer must first master radiation physics, which is why the second chapter discusses it. The topics that must be covered include atoms and molecules, atomic structure, protons, isotopes, half-life, types of radiation, and basic formulas such as those for shielding, half-life, half-value layer, tenth-value layer and more. All of this must be mastered by radiographers who want to understand the technique in greater detail, because it is a combination of theory and practice: those who fail the theory cannot go further with the technique, and the technique cannot be mastered through theory alone. Theory and practice must go hand in hand.

  5. 15. Basic economic indicators

    International Nuclear Information System (INIS)

    Carless, J.; Dow, B.; Farivari, R.; O'Connor, J.; Fox, T.; Tunstall, D.; Mentzingen, M.

    1992-01-01

    The clear value of economic data and analysis to decisionmakers has motivated them to mandate the creation of extensive global economic data sets. This chapter contains a set of these basic economic data, which provides the context for understanding the causes and the consequences of many of the decisions that affect the world's resources. Many traditional economic indicators fail to account for the depletion or deterioration of natural resources, the long-term consequences of such depletion, the equitable distribution of income within a country, or the sustainability of current economic practices. The type of measurement shown here, however, is still useful in showing the great differences between the wealthiest and the poorest countries. Tables are given on the following: Gross national product and official development assistance 1969-89; External debt indicators 1979-89; Central government expenditures; and World commodity indexes and prices 1975-89

  6. Basic engineering mathematics

    CERN Document Server

    Bird, John

    2014-01-01

    Introductory mathematics written specifically for students new to engineering Now in its sixth edition, Basic Engineering Mathematics is an established textbook that has helped thousands of students to succeed in their exams. John Bird's approach is based on worked examples and interactive problems. This makes it ideal for students from a wide range of academic backgrounds as the student can work through the material at their own pace. Mathematical theories are explained in a straightforward manner, being supported by practical engineering examples and applications in order to ensure that readers can relate theory to practice. The extensive and thorough topic coverage makes this an ideal text for introductory level engineering courses. This title is supported by a companion website with resources for both students and lecturers, including lists of essential formulae, multiple choice tests, full solutions for all 1,600 further questions contained within the practice exercises, and biographical information on t...

  7. Basic real analysis

    CERN Document Server

    Sohrab, Houshang H

    2014-01-01

    This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....

  8. Magnetism basics and applications

    CERN Document Server

    Stefanita, Carmen-Gabriela

    2012-01-01

This textbook is aimed at engineering students who are likely to come across magnetics applications in their professional practice. Whether designing lithography equipment containing ferromagnetic brushes, or detecting defects in aeronautics, some basic knowledge of 21st century magnetism is needed. From the magnetic tape on the pocket credit card to the read head in a personal computer, people run into magnetism in many products. Furthermore, in a variety of disciplines tools of the trade exploit magnetic principles, and many interdisciplinary laboratory research areas cross paths with magnetic phenomena that may seem mysterious to the untrained mind. Therefore, this course offers a broad coverage of magnetism topics encountered more often in this millennium, revealing key concepts on which many practical applications rest. Some traditional subjects in magnetism are discussed in the first half of the book, followed by areas likely to spark the curiosity of those more interested in today’s technological achi...

  9. Atomic Basic Blocks

    Science.gov (United States)

    Scheler, Fabian; Mitzlaff, Martin; Schröder-Preikschat, Wolfgang

The decision to use a time-triggered or an event-triggered approach for a real-time system is difficult and has far-reaching consequences, above all because the two approaches are tied to very different control-flow abstractions, which make a later migration to the other paradigm very difficult or even impossible. We therefore propose an intermediate representation that is independent of the control-flow abstraction in use. For this purpose we employ Atomic Basic Blocks (ABB), which are based on basic blocks, and build on them a tool, the Real-Time Systems Compiler (RTSC), that supports migration between time-triggered and event-triggered systems.

  10. Basic Phage Mathematics.

    Science.gov (United States)

    Abedon, Stephen T; Katsaounis, Tena I

    2018-01-01

    Basic mathematical descriptions are useful in phage ecology, applied phage ecology such as in the course of phage therapy, and also toward keeping track of expected phage-bacterial interactions as seen during laboratory manipulation of phages. The most basic mathematical descriptor of phages is their titer, that is, their concentration within stocks, experimental vessels, or other environments. Various phenomena can serve to modify phage titers, and indeed phage titers can vary as a function of how they are measured. An important aspect of how changes in titers can occur results from phage interactions with bacteria. These changes tend to vary in degree as a function of bacterial densities within environments, and particularly densities of those bacteria that are susceptible to or at least adsorbable by a given phage type. Using simple mathematical models one can describe phage-bacterial interactions that give rise particularly to phage adsorption events. With elaboration one can consider changes in both phage and bacterial densities as a function of both time and these interactions. In addition, phages along with their impact on bacteria can be considered as spatially constrained processes. In this chapter we consider the simpler of these concepts, providing in particular detailed verbal explanations toward facile mathematical insight. The primary goal is to stimulate a more informed use and manipulation of phages and phage populations within the laboratory as well as toward more effective phage application outside of the laboratory, such as during phage therapy. More generally, numerous issues and approaches to the quantification of phages are considered along with the quantification of individual, ecological, and applied properties of phages.
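The adsorption arithmetic described above can be sketched in a few lines. This is a minimal illustration assuming the standard mass-action adsorption model (dP/dt = -k·N·P); the rate constant and bacterial density below are hypothetical order-of-magnitude values, not taken from the chapter:

```python
import math

def free_phage_fraction(k: float, n_bact: float, t: float) -> float:
    """Fraction of phages still free after time t.

    Assumes simple mass-action adsorption, dP/dt = -k * N * P,
    so P(t)/P0 = exp(-k * N * t).
    k      : adsorption rate constant (ml/min)
    n_bact : density of adsorbable bacteria (cells/ml)
    t      : elapsed time (min)
    """
    return math.exp(-k * n_bact * t)

# Illustrative (hypothetical) values:
k = 2.5e-9   # ml/min
N = 1e8      # cells/ml
print(free_phage_fraction(k, N, 10.0))  # ~0.082, i.e. >90% adsorbed in 10 min
```

Under these assumed values, over 90% of free phages are adsorbed within ten minutes, the kind of back-of-the-envelope result the chapter's verbal explanations aim to make routine.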

  11. Basic Energy Sciences at NREL

    International Nuclear Information System (INIS)

    Moon, S.

    2000-01-01

NREL's Center for Basic Sciences performs fundamental research for DOE's Office of Science. Our mission is to provide fundamental knowledge in the basic sciences and engineering that will underpin new and improved renewable energy technologies.

  12. Basics of geomatics

    CERN Document Server

    Gomarasca, Mario A

    2009-01-01

    In a systematic way, this volume addresses the complex topics and techniques covered under Geomatics. Abundantly illustrated, it presents a comprehensive and complete treatment of its subject. A detailed bibliography is included with each chapter.

  13. Incorporation of constructivist assumptions into problem-based instruction: a literature review.

    Science.gov (United States)

    Kantar, Lina

    2014-05-01

The purpose of this literature review was to explore the use of distinct assumptions of constructivism when studying the impact of problem-based learning (PBL) on learners in undergraduate nursing programs. Content analysis research technique. The literature review included information retrieved from sources selected via electronic databases, such as EBSCOhost, ProQuest, Sage Publications, SLACK Incorporation, Springhouse Corporation, and Digital Dissertations. The literature review was conducted utilizing key terms and phrases associated with problem-based learning in undergraduate nursing education. Out of the 100 reviewed abstracts, only 15 studies met the inclusion criteria for the review. Four constructivist assumptions formed the basis of the review process, allowing for analysis and evaluation of the findings, followed by identification of issues and recommendations for the discipline and its research practice in the field of PBL. This literature review provided evidence that the nursing discipline is employing PBL in its programs, yet with limited data supporting conceptions of the constructivist perspective underlying this pedagogical approach. Three major issues were assessed and formed the basis for subsequent recommendations: (a) limited use of a theoretical framework and absence of constructivism in most of the studies, (b) incompatibility between research measures and research outcomes, and (c) brief exposure to PBL during which the change was measured. Educators have made the right choice in employing PBL as a pedagogical practice, yet the need to base implementation on constructivism is mandatory if the aim is a better preparation of graduates for practice. Undeniably there is limited convincing evidence regarding integration of constructivism in nursing education. Research that assesses the impact of PBL on learners' problem-solving and communication skills, self-direction, and motivation is paramount. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. CRITICAL ASSUMPTIONS IN THE F-TANK FARM CLOSURE OPERATIONAL DOCUMENTATION REGARDING WASTE TANK INTERNAL CONFIGURATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Hommel, S.; Fountain, D.

    2012-03-28

    The intent of this document is to provide clarification of critical assumptions regarding the internal configurations of liquid waste tanks at operational closure, with respect to F-Tank Farm (FTF) closure documentation. For the purposes of this document, FTF closure documentation includes: (1) Performance Assessment for the F-Tank Farm at the Savannah River Site (hereafter referred to as the FTF PA) (SRS-REG-2007-00002), (2) Basis for Section 3116 Determination for Closure of F-Tank Farm at the Savannah River Site (DOE/SRS-WD-2012-001), (3) Tier 1 Closure Plan for the F-Area Waste Tank Systems at the Savannah River Site (SRR-CWDA-2010-00147), (4) F-Tank Farm Tanks 18 and 19 DOE Manual 435.1-1 Tier 2 Closure Plan Savannah River Site (SRR-CWDA-2011-00015), (5) Industrial Wastewater Closure Module for the Liquid Waste Tanks 18 and 19 (SRRCWDA-2010-00003), and (6) Tank 18/Tank 19 Special Analysis for the Performance Assessment for the F-Tank Farm at the Savannah River Site (hereafter referred to as the Tank 18/Tank 19 Special Analysis) (SRR-CWDA-2010-00124). Note that the first three FTF closure documents listed apply to the entire FTF, whereas the last three FTF closure documents listed are specific to Tanks 18 and 19. These two waste tanks are expected to be the first two tanks to be grouted and operationally closed under the current suite of FTF closure documents and many of the assumptions and approaches that apply to these two tanks are also applicable to the other FTF waste tanks and operational closure processes.

  15. Robust inference in summary data Mendelian randomization via the zero modal pleiotropy assumption.

    Science.gov (United States)

    Hartwig, Fernando Pires; Davey Smith, George; Bowden, Jack

    2017-12-01

    Mendelian randomization (MR) is being increasingly used to strengthen causal inference in observational studies. Availability of summary data of genetic associations for a variety of phenotypes from large genome-wide association studies (GWAS) allows straightforward application of MR using summary data methods, typically in a two-sample design. In addition to the conventional inverse variance weighting (IVW) method, recently developed summary data MR methods, such as the MR-Egger and weighted median approaches, allow a relaxation of the instrumental variable assumptions. Here, a new method - the mode-based estimate (MBE) - is proposed to obtain a single causal effect estimate from multiple genetic instruments. The MBE is consistent when the largest number of similar (identical in infinite samples) individual-instrument causal effect estimates comes from valid instruments, even if the majority of instruments are invalid. We evaluate the performance of the method in simulations designed to mimic the two-sample summary data setting, and demonstrate its use by investigating the causal effect of plasma lipid fractions and urate levels on coronary heart disease risk. The MBE presented less bias and lower type-I error rates than other methods under the null in many situations. Its power to detect a causal effect was smaller compared with the IVW and weighted median methods, but was larger than that of MR-Egger regression, with sample size requirements typically smaller than those available from GWAS consortia. The MBE relaxes the instrumental variable assumptions, and should be used in combination with other approaches in sensitivity analyses. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association
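The core idea of the MBE, taking the mode of the per-instrument ratio estimates rather than their (weighted) mean or median, can be shown with a simplified sketch. The published method uses a data-driven bandwidth rule and optional inverse-variance weighting; the fixed bandwidth and the numbers here are purely illustrative:

```python
import math

def mode_based_estimate(ratio_estimates, bandwidth=0.1, grid_points=1001):
    """Illustrative mode-based estimate: smooth the per-instrument
    Wald ratios with a Gaussian kernel and return the density argmax."""
    lo, hi = min(ratio_estimates), max(ratio_estimates)
    pad = 3 * bandwidth
    best_x, best_d = lo, -1.0
    for i in range(grid_points):
        x = lo - pad + (hi - lo + 2 * pad) * i / (grid_points - 1)
        # Unnormalized kernel density at x
        d = sum(math.exp(-0.5 * ((x - r) / bandwidth) ** 2)
                for r in ratio_estimates)
        if d > best_d:
            best_x, best_d = x, d
    return best_x

# Majority-invalid scenario: three valid instruments cluster near 0.5,
# four invalid ones are scattered, yet the mode still lands near 0.5.
ratios = [0.49, 0.50, 0.51, 1.2, -0.3, 2.0, 0.9]
print(mode_based_estimate(ratios))  # close to 0.5
```

This mirrors the consistency property stated above: the estimate follows the largest cluster of similar ratio estimates even when most instruments are invalid.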

  16. An Extension to Deng's Entropy in the Open World Assumption with an Application in Sensor Data Fusion.

    Science.gov (United States)

    Tang, Yongchuan; Zhou, Deyun; Chan, Felix T S

    2018-06-11

Quantification of the degree of uncertainty in the Dempster-Shafer evidence theory (DST) framework with belief entropy is still an open issue, even a blank field for the open world assumption. Currently, the existing uncertainty measures in the DST framework are limited to the closed world, where the frame of discernment (FOD) is assumed to be complete. To address this issue, this paper focuses on extending a belief entropy to the open world by considering the uncertain information represented as the FOD and the nonzero mass function of the empty set simultaneously. An extension to Deng’s entropy in the open world assumption (EDEOW) is proposed as a generalization of Deng’s entropy, and it reduces to the Deng entropy in the closed world wherever necessary. In order to test the reasonability and effectiveness of the extended belief entropy, an EDEOW-based information fusion approach is proposed and applied to sensor data fusion under uncertain circumstances. The experimental results verify the usefulness and applicability of the extended measure as well as the modified sensor data fusion method. In addition, a few open issues still exist in the current work: the necessary properties for a belief entropy in the open world assumption, whether there exists a belief entropy that satisfies all the existing properties, and what is the most proper fusion frame for sensor data fusion under uncertainty.
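For reference, the closed-world Deng entropy that the proposed EDEOW generalizes can be computed directly from a mass function. The sketch below implements only the standard closed-world formula; the paper's open-world extension (with nonzero mass on the empty set) is specific to that work and is not reproduced here:

```python
import math

def deng_entropy(masses: dict) -> float:
    """Deng entropy of a mass function in the closed world.

    masses maps each focal element A (a frozenset) to its mass m(A) > 0:
        E_d = -sum over A of m(A) * log2( m(A) / (2**|A| - 1) )
    The (2**|A| - 1) term penalizes mass assigned to larger subsets.
    """
    e = 0.0
    for A, m in masses.items():
        if m > 0:
            e -= m * math.log2(m / (2 ** len(A) - 1))
    return e

# For a Bayesian mass function (singleton focal elements only),
# Deng entropy reduces to Shannon entropy:
m = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.5}
print(deng_entropy(m))  # 1.0 bit
```

Assigning all mass to a two-element set instead gives log2(3) ≈ 1.585 bits, reflecting the extra nonspecificity that belief-entropy measures are designed to capture.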

  17. A quantitative evaluation of a qualitative risk assessment framework: Examining the assumptions and predictions of the Productivity Susceptibility Analysis (PSA)

    Science.gov (United States)

    2018-01-01

    Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluating the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869

  18. An Extension to Deng’s Entropy in the Open World Assumption with an Application in Sensor Data Fusion

    Directory of Open Access Journals (Sweden)

    Yongchuan Tang

    2018-06-01

Full Text Available Quantification of the degree of uncertainty in the Dempster-Shafer evidence theory (DST) framework with belief entropy is still an open issue, even a blank field for the open world assumption. Currently, the existing uncertainty measures in the DST framework are limited to the closed world, where the frame of discernment (FOD) is assumed to be complete. To address this issue, this paper focuses on extending a belief entropy to the open world by considering the uncertain information represented as the FOD and the nonzero mass function of the empty set simultaneously. An extension to Deng’s entropy in the open world assumption (EDEOW) is proposed as a generalization of Deng’s entropy, and it reduces to the Deng entropy in the closed world wherever necessary. In order to test the reasonability and effectiveness of the extended belief entropy, an EDEOW-based information fusion approach is proposed and applied to sensor data fusion under uncertain circumstances. The experimental results verify the usefulness and applicability of the extended measure as well as the modified sensor data fusion method. In addition, a few open issues still exist in the current work: the necessary properties for a belief entropy in the open world assumption, whether there exists a belief entropy that satisfies all the existing properties, and what is the most proper fusion frame for sensor data fusion under uncertainty.

  19. Testing the assumption of normality in body sway area calculations during unipedal stance tests with an inertial sensor.

    Science.gov (United States)

    Kyoung Jae Kim; Lucarevic, Jennifer; Bennett, Christopher; Gaunaurd, Ignacio; Gailey, Robert; Agrawal, Vibhor

    2016-08-01

    The quantification of postural sway during the unipedal stance test is one of the essentials of posturography. A shift of the center of pressure (CoP) is an indirect measure of postural sway and also a measure of a person's ability to maintain balance. A widely used method in laboratory settings to calculate the sway of the body center of mass (CoM) is to fit an ellipse that encloses 95% of the CoP trajectory. The 95% ellipse can be computed under the assumption that the spatial distribution of the CoP points recorded from force platforms is normal. However, to date, this assumption of normality has not been demonstrated for sway measurements recorded from a sacral inertial measurement unit (IMU). This work provides evidence for the non-normality of sway trajectories calculated at a sacral IMU in both injured and healthy subjects.
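Under the bivariate-normality assumption this study questions, the 95% sway ellipse area follows directly from the sample covariance of the CoP trace. A minimal sketch (synthetic data and parameters are assumptions for illustration; 5.991 is the standard 0.95 quantile of a chi-square with 2 degrees of freedom):

```python
import numpy as np

def sway_ellipse_area_95(cop_xy, chi2_95_2df=5.991):
    """Area of the 95% prediction ellipse for 2-D CoP samples.

    Valid only under the bivariate-normality assumption: the ellipse axes
    are the principal axes of the sample covariance, scaled by the 0.95
    chi-square quantile with 2 df, giving area = pi * 5.991 * sqrt(det(cov)).
    """
    cov = np.cov(np.asarray(cop_xy).T)   # 2x2 sample covariance
    return np.pi * chi2_95_2df * np.sqrt(np.linalg.det(cov))

rng = np.random.default_rng(0)
# Synthetic CoP trace: anisotropic Gaussian sway, AP std twice the ML std.
cop = rng.normal(0.0, [1.0, 2.0], size=(5000, 2))
area = sway_ellipse_area_95(cop)  # analytic value: pi * 5.991 * 2 ~ 37.6
```

If the CoP distribution is non-normal, as the paper reports for IMU-derived sway, this formula no longer encloses 95% of the trajectory.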

  20. Incorporating assumption deviation risk in quantitative risk assessments: A semi-quantitative approach

    International Nuclear Information System (INIS)

    Khorsandi, Jahon; Aven, Terje

    2017-01-01

    Quantitative risk assessments (QRAs) of complex engineering systems are based on numerous assumptions and expert judgments, as there is limited information available to support the analysis. In addition to sensitivity analyses, the concept of assumption deviation risk has been suggested as a means of explicitly considering the risk related to inaccuracies and deviations in the assumptions, which can significantly impact the results of QRAs. However, challenges remain for its practical implementation, given the number of assumptions and the magnitude of deviations to be considered. This paper presents an approach for integrating an assumption deviation risk analysis into QRAs. The approach begins by identifying the safety objectives that the QRA is intended to support, and then identifies the assumptions that are critical to ensuring those objectives are met. Key issues addressed include the deviations required to violate the safety objectives, the uncertainties related to the occurrence of such events, and the strength of knowledge supporting the assessments. Three levels of assumptions are considered: assumptions related to the system's structural and operational characteristics, the effectiveness of the established barriers, and the consequence analysis process. The approach is illustrated for the case of an offshore installation. - Highlights: • An approach for assessing the risk of deviations in QRA assumptions is presented. • Critical deviations and uncertainties related to their occurrence are addressed. • The analysis promotes critical thinking about the foundation and results of QRAs. • The approach is illustrated for the case of an offshore installation.

  1. The Atomic energy basic law

    International Nuclear Information System (INIS)

    1979-01-01

    The law aims to secure future energy resources and to advance science and industry for the welfare of mankind and a higher standard of national life by supporting the research, development and utilization of atomic power. Research, development and utilization of atomic power shall be limited to peaceful purposes, with emphasis on safety, and shall be carried out independently under democratic administration. Basic concepts and terms are defined, such as: atomic power; nuclear fuel material; nuclear raw material; reactor; and radiation. The Atomic Energy Commission and the Atomic Energy Safety Commission shall be set up at the Prime Minister's Office to realize the national policy on research, development and utilization of atomic power and to ensure democratic administration of atomic energy. The Atomic Energy Commission shall plan, deliberate and decide on matters concerning research, development and utilization of atomic energy. The Atomic Energy Safety Commission shall plan, deliberate and decide on issues concerning the securing of safety among such matters. The Atomic Energy Research Institute shall be founded under governmental supervision to perform research, experiments and other affairs necessary for the development of atomic energy. The Power Reactor and Nuclear Fuel Development Corporation shall likewise be established to develop the fast breeder reactor, the advanced thermal reactor and nuclear fuel materials. The development of radioactive minerals, the control of nuclear fuel materials and reactors, and measures for patents and inventions concerning atomic energy, etc., are stipulated respectively. (Okada, K.)

  2. [Spirometry - basic examination of the lung function].

    Science.gov (United States)

    Kociánová, Jana

    Spirometry is one of the basic internal examination methods, similar to e.g. blood pressure measurement or ECG recording. It is used to detect or assess the extent of ventilatory disorders. Indications include respiratory symptoms or laboratory anomalies, smoking, inhalation risks and more. Its performance and evaluation should be among the basic skills of pulmonologists, internists, allergologists, pediatricians and sports physicians. The results essentially influence correct diagnosis and the choice of treatment. Therefore spirometry must be performed under standardized conditions and assessed accurately and clearly to enable answering clinical questions. Key words: acceptability - calibration - contraindication - evaluation - indication - parameters - spirometry - standardization.

  3. Basic concepts of epidemiology

    International Nuclear Information System (INIS)

    Savitz, D.A.

    1984-01-01

    Epidemiology can be defined simply as the science of the distribution and determinants of disease in human populations. As a descriptive tool, epidemiology can aid health care service providers, for example, in allocation of resources. In its analytic capacity, the epidemiologic approach can help identify determinants of disease through the study of human populations. Epidemiology is primarily an observational rather than experimental methodology, with corresponding strengths and limitations. Relative to other approaches for assessing disease etiology and impacts of potential health hazards, epidemiology has a rather unique role that is complementary to, but independent of, both basic biologic sciences and clinical medicine. Experimental biologic sciences such as toxicology and physiology provide critical information on biologic mechanisms of disease required for causal inference. Clinical medicine often serves as the warning system that provides etiologic clues to be pursued through systematic investigation. The advantage of the epidemiologic approach is its reliance on human field experience, that is, the real world. While laboratory experimentation is uniquely well suited to defining potential hazards, it can neither determine whether human populations have actually been affected nor quantify that effect. Building all the complexities of human behavior and external factors into a laboratory study or mathematical model is impossible. By studying the world as it exists, epidemiology examines the integrated, summarized product of the myriad factors influencing health

  4. Basic operator theory

    CERN Document Server

    Gohberg, Israel

    2001-01-01

    ... application of linear operators on a Hilbert space. We begin with a chapter on the geometry of Hilbert space and then proceed to the spectral theory of compact self-adjoint operators; operational calculus is next presented as a natural outgrowth of the spectral theory. The second part of the text concentrates on Banach spaces and linear operators acting on these spaces. It includes, for example, the three basic principles of linear analysis and the Riesz-Fredholm theory of compact operators. Both parts contain plenty of applications. All chapters deal exclusively with linear problems, except for the last chapter, which is an introduction to the theory of nonlinear operators. In addition to the standard topics in functional analysis, we have presented relatively recent results which appear, for example, in Chapter VII. In general, in writing this book, the authors were strongly influenced by recent developments in operator theory which affected the choice of topics, proofs and exercises. One ...

  5. Basics of aerothermodynamics

    CERN Document Server

    Hirschel, Ernst Heinrich

    2015-01-01

    This successful book gives an introduction to the basics of aerothermodynamics, as applied in particular to winged re-entry vehicles and airbreathing hypersonic cruise and acceleration vehicles. The book reviews the issues of transport of momentum, energy and mass, real-gas effects, and inviscid and viscous flow phenomena. In this second, revised edition the chapters on the classical topics of aerothermodynamics were more or less left untouched. Access to some individual topics of practical interest was improved. Auxiliary chapters were moved into an appendix. The recent successful flights of the X-43A and the X-51A indicate that the dawn of sustained airbreathing hypersonic flight has now arrived. This proves that the original approach of the book, to put emphasis on viscous effects and the aerothermodynamics of radiation-cooled vehicle surfaces, was timely. This second, revised edition accentuates these topics even more. A new, additional chapter treats examples of viscous thermal surface eff...

  6. Nanodesign: some basic questions

    CERN Document Server

    Schommers, Wolfram

    2013-01-01

    There is no doubt that nanoscience will be the dominant direction for technology in this century, and that this science will influence our lives to a large extent as well as open completely new perspectives on all scientific and technological disciplines. To be able to produce optimal nanosystems with tailor-made properties, it is necessary to analyze and construct such systems in advance by adequate theoretical and computational methods. Since we work in nanoscience and nanotechnology at the ultimate level, we have to apply the basic laws of physics. What methods and tools are relevant here? The book gives an answer to this question. The background of the theoretical methods and tools is critically discussed, and also the world view on which these physical laws are based. Such a debate is not only of academic interest but of highly general concern, and this is because we constantly move in nanoscience and nanotechnology between two extreme poles, between infinite life and total destruction. On the one ...

  7. Basic Data on Biogas

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    Renewable gases such as biogas and biomethane are considered as key energy carrier when the society is replacing fossil fuels with renewable alternatives. In Sweden, almost 80 % of the fossil fuels are used in the transport sector. Therefore, the focus in Sweden has been to use the produced biogas in this sector as vehicle gas. Basic Data on Biogas contains an overview of production, utilisation, climate effects etc. of biogas from a Swedish perspective. The purpose is to give an easy overview of the current situation in Sweden for politicians, decision makers and interested public. 1.4 TWh of biogas is produced annually in Sweden at approximately 230 facilities. The 135 wastewater treatment plants that produce biogas contribute with around half of the production. In order to reduce the sludge volume, biogas has been produced at wastewater treatment plants for decades. New biogas plants are mainly co-digestion plants and farm plants. The land filling of organic waste has been banned since 2005, thus the biogas produced in landfills is decreasing.

  8. Discrete Neural Signatures of Basic Emotions.

    Science.gov (United States)

    Saarimäki, Heini; Gotsopoulos, Athanasios; Jääskeläinen, Iiro P; Lampinen, Jouko; Vuilleumier, Patrik; Hari, Riitta; Sams, Mikko; Nummenmaa, Lauri

    2016-06-01

    Categorical models of emotions posit neurally and physiologically distinct human basic emotions. We tested this assumption by using multivariate pattern analysis (MVPA) to classify brain activity patterns of 6 basic emotions (disgust, fear, happiness, sadness, anger, and surprise) in 3 experiments. Emotions were induced with short movies or mental imagery during functional magnetic resonance imaging. MVPA accurately classified emotions induced by both methods, and the classification generalized from one induction condition to another and across individuals. Brain regions contributing most to the classification accuracy included medial and inferior lateral prefrontal cortices, frontal pole, precentral and postcentral gyri, precuneus, and posterior cingulate cortex. Thus, specific neural signatures across these regions hold representations of different emotional states in multimodal fashion, independently of how the emotions are induced. Similarity of subjective experiences between emotions was associated with similarity of neural patterns for the same emotions, suggesting a direct link between activity in these brain regions and the subjective emotional experience. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. Three-class ROC analysis--the equal error utility assumption and the optimality of three-class ROC surface using the ideal observer.

    Science.gov (United States)

    He, Xin; Frey, Eric C

    2006-08-01

    Previously, we have developed a decision model for three-class receiver operating characteristic (ROC) analysis based on decision theory. The proposed decision model maximizes the expected decision utility under the assumption that incorrect decisions have equal utilities under the same hypothesis (the equal error utility assumption). This assumption reduces the dimensionality of the "general" three-class ROC analysis and provides a practical figure of merit to evaluate three-class task performance. However, it also limits the generality of the resulting model, because the equal error utility assumption will not apply to all clinical three-class decision tasks. The goal of this study was to investigate the optimality of the proposed three-class decision model with respect to several other decision criteria. In particular, besides the maximum expected utility (MEU) criterion used in the previous study, we investigated the maximum-correctness (MC) (or minimum-error), maximum-likelihood (ML), and Neyman-Pearson (N-P) criteria. We found that, by making assumptions for both the MEU and N-P criteria, all decision criteria lead to the previously proposed three-class decision model. As a result, this model maximizes the expected utility under the equal error utility assumption, maximizes the probability of making correct decisions, satisfies the N-P criterion in the sense that it maximizes the sensitivity of one class given the sensitivities of the other two classes, and the resulting ROC surface contains the maximum-likelihood decision operating point. While the proposed three-class ROC analysis model is not optimal in the general sense, due to the use of the equal error utility assumption, the range of criteria for which it is optimal increases its applicability for evaluating and comparing a range of diagnostic systems.

  10. The Basketball warms-ups - theoretical assumptions and practical solutions

    Directory of Open Access Journals (Sweden)

    Sebastian Łubiński

    2017-06-01

    Full Text Available Many authors emphasize the importance of the warm-up. In team games, the warm-up aims to enhance the body's adaptation to physical activity and to shift physiological functions from the resting state to the active state. A warm-up brings many benefits: physiological, psychological, and preventive. From a psychological standpoint, the warm-up is performed to create bodily alertness, activity and readiness, and a willingness to act effectively. It has been found that players who perform a correct warm-up are better mentally prepared than those who do not. After a well-performed warm-up, the athlete is self-confident and has a positive attitude toward the match. The warm-up can also be a way to relieve tension and anxiety and to increase concentration and motivation before the match. It also improves emotional state and reduces fear of failure. It has been verified that a warm-up performed under appropriate conditions improves focus, visual perception, action accuracy, self-confidence, speed and responsiveness, speed of processing and decision making. From the physiological point of view, the warm-up adapts the basketball player's body to effort and is an important factor affecting performance in competition. Data from the literature suggest that the warm-up must be individualized in duration and intensity. There are two types of warm-up: passive and active. A passive warm-up uses hot showers, baths, saunas, steam baths or massage. An active warm-up requires considerable commitment and determination from the athlete during exercises that prepare the body and muscles for effort. The training measures used during this part of the warm-up are general exercises that improve strength, stretch, coordination

  11. Basic Retention Mechanisms

    DEFF Research Database (Denmark)

    Jensen, Bror Skytte; Jensen, H.

    1986-01-01

    The effect of multiple cation competition on the adsorption of Sr onto two synthetic ion-exchange resins, i. e. DOWEX 50W and DOWEX CCR-2, as well as onto the clay mineral, kaolinite has been studied. The results for DOWEX 50W, and under certain experimental conditions also for DOWEX CCR-2 were...

  12. Basic ionizing radiation symbol

    International Nuclear Information System (INIS)

    1987-01-01

    A description is given of the standard symbol for ionizing radiation and of the conditions under which it should not be used. The Arabic equivalent of some English technical terms in this subject is given in one page. 1 ref., 1 fig

  13. Dialogic or Dialectic? The Significance of Ontological Assumptions in Research on Educational Dialogue

    Science.gov (United States)

    Wegerif, Rupert

    2008-01-01

    This article explores the relationship between ontological assumptions and studies of educational dialogue through a focus on Bakhtin's "dialogic". The term dialogic is frequently appropriated to a modernist framework of assumptions, in particular the neo-Vygotskian or sociocultural tradition. However, Vygotsky's theory of education is dialectic,…

  14. 7 CFR 772.10 - Transfer and assumption-AMP loans.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Transfer and assumption-AMP loans. 772.10 Section 772..., DEPARTMENT OF AGRICULTURE SPECIAL PROGRAMS SERVICING MINOR PROGRAM LOANS § 772.10 Transfer and assumption—AMP loans. (a) Eligibility. The Agency may approve transfers and assumptions of AMP loans when: (1) The...

  15. A Proposal for Testing Local Realism Without Using Assumptions Related to Hidden Variable States

    Science.gov (United States)

    Ryff, Luiz Carlos

    1996-01-01

    A feasible experiment is discussed which allows us to prove Bell's theorem for two particles without using an inequality. The experiment could be used to test local realism against quantum mechanics without introducing additional assumptions related to hidden-variable states. Only assumptions based on direct experimental observation are needed.

  16. Evaluating growth assumptions using diameter or radial increments in natural even-aged longleaf pine

    Science.gov (United States)

    John C. Gilbert; Ralph S. Meldahl; Jyoti N. Rayamajhi; John S. Kush

    2010-01-01

    When using increment cores to predict future growth, one often assumes future growth is identical to past growth for individual trees. Once this assumption is accepted, a decision has to be made between which growth estimate should be used, constant diameter growth or constant basal area growth. Often, the assumption of constant diameter growth is used due to the ease...

  17. Sensitivity of TRIM projections to management, harvest, yield, and stocking adjustment assumptions.

    Science.gov (United States)

    Susan J. Alexander

    1991-01-01

    The Timber Resource Inventory Model (TRIM) was used to make several projections of forest industry timber supply for the Douglas-fir region. The sensitivity of these projections to assumptions about management and yields is discussed. A base run is compared to runs in which yields were altered, stocking adjustment was eliminated, harvest assumptions were changed, and...

  18. Recognising the Effects of Costing Assumptions in Educational Business Simulation Games

    Science.gov (United States)

    Eckardt, Gordon; Selen, Willem; Wynder, Monte

    2015-01-01

    Business simulations are a powerful way to provide experiential learning that is focussed, controlled, and concentrated. Inherent in any simulation, however, are numerous assumptions that determine feedback, and hence the lessons learnt. In this conceptual paper we describe some common cost assumptions that are implicit in simulation design and…

  19. Investigating Teachers' and Students' Beliefs and Assumptions about CALL Programme at Caledonian College of Engineering

    Science.gov (United States)

    Ali, Holi Ibrahim Holi

    2012-01-01

    This study is set to investigate students' and teachers' perceptions and assumptions about the newly implemented CALL Programme at the School of Foundation Studies, Caledonian College of Engineering, Oman. Two versions of a questionnaire were administered to 24 teachers and 90 students to collect their beliefs and assumptions about the CALL programme. The…

  20. Semi-Supervised Transductive Hot Spot Predictor Working on Multiple Assumptions

    KAUST Repository

    Wang, Jim Jing-Yan; Almasri, Islam; Shi, Yuexiang; Gao, Xin

    2014-01-01

    of the transductive semi-supervised algorithms takes all three semi-supervised assumptions, i.e., the smoothness, cluster and manifold assumptions, into account during learning. In this paper, we propose a novel semi-supervised method for hot spot residue

  1. Implicit Assumptions in Special Education Policy: Promoting Full Inclusion for Students with Learning Disabilities

    Science.gov (United States)

    Kirby, Moira

    2017-01-01

    Introduction: Everyday millions of students in the United States receive special education services. Special education is an institution shaped by societal norms. Inherent in these norms are implicit assumptions regarding disability and the nature of special education services. The two dominant implicit assumptions evident in the American…

  2. Sensitivity of the OMI ozone profile retrieval (OMO3PR) to a priori assumptions

    NARCIS (Netherlands)

    Mielonen, T.; De Haan, J.F.; Veefkind, J.P.

    2014-01-01

    We have assessed the sensitivity of the operational OMI ozone profile retrieval (OMO3PR) algorithm to a number of a priori assumptions. We studied the effect of stray light correction, surface albedo assumptions and a priori ozone profiles on the retrieved ozone profile. Then, we studied how to

  3. The Arundel Assumption And Revision Of Some Large-Scale Maps ...

    African Journals Online (AJOL)

    The rather common practice of stating or using the Arundel Assumption without reference to appropriate mapping standards (except mention of its use for graphical plotting) is a major cause of inaccuracies in map revision. This paper describes an investigation to ascertain the applicability of the Assumption to the revision of ...

  4. The Role of Policy Assumptions in Validating High-stakes Testing Programs.

    Science.gov (United States)

    Kane, Michael

    L. Cronbach has made the point that for validity arguments to be convincing to diverse audiences, they need to be based on assumptions that are credible to these audiences. The interpretations and uses of high stakes test scores rely on a number of policy assumptions about what should be taught in schools, and more specifically, about the content…

  5. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    Science.gov (United States)

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.

  6. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    Science.gov (United States)

    Ernst, Anja F.

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking. PMID:28533971

  7. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    Directory of Open Access Journals (Sweden)

    Anja F. Ernst

    2017-05-01

    Full Text Available Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking.
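The misconception flagged by this review (normality of the variables versus normality of the errors) can be illustrated with a short simulation; the data-generating process below is an assumption for this sketch, not an example from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# A predictor with a strongly right-skewed distribution...
x = rng.exponential(scale=1.0, size=20000)
# ...and a response generated by a linear model with *normal* errors.
y = 3.0 * x + rng.normal(0.0, 1.0, size=x.size)

# Fit ordinary least squares and inspect the residuals.
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

def skewness(v):
    """Sample skewness: third standardized central moment."""
    v = v - v.mean()
    return (v ** 3).mean() / v.std() ** 3

# The response inherits the predictor's skew, yet the residuals are
# approximately symmetric: OLS assumes normality of the errors only.
skew_y = skewness(y)              # clearly positive (skewed response)
skew_resid = skewness(residuals)  # near zero (normal errors)
```

Testing the marginal distribution of y here would wrongly flag a violation, while the residuals satisfy the actual regression assumption.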

  8. Significant and Basic Innovations in Urban Planning

    Science.gov (United States)

    Kolyasnikov, V. A.

    2017-11-01

    The article considers the development features of innovative urban planning in the USSR and Russia in the XVIII - XX centuries. Innovative urban planning is defined as an activity of creating innovations and implementing them to obtain a socio-economic, political, environmental or other effect. Over the history of urban development, this activity has followed a cyclic wave process with phases of rise and fall. The study of cyclic waves in the development of innovative urban planning uses the concept of selecting basic and epochal innovations. This concept was developed by scientists for the study of cyclic wave processes in economics. Its adaptation to the conditions of innovative urban planning allows one to introduce the concepts of “basic innovation” and “significant innovation” into the theory and practice of settlement formation and settlement systems, as well as to identify opportunities to highlight these innovations in the history of Russian urban planning. From these positions, six innovation waves in urban development over the past 300 years are investigated. The basic innovations observed in the domestic urban area show that urban development is a vital area for ensuring the country’s geopolitical security. Basic innovations are carried forward in time and modernized under new conditions of urban planning development. In this regard, we can predict the development of four basic innovations in post-Soviet Russia.

  9. Environmental protection - can it be regarded as a basic right

    International Nuclear Information System (INIS)

    Soell, H.

    1986-01-01

    The question of the necessity of an 'environmental basic right' is to be seen in connection with the doctrine of the State's duty to protect basic rights. Under present law, this obligation of the State applies only to third-party intervention; it does not take effect where it is a matter of protecting the environment as such. Therefore the introduction of an 'environmental basic right' is necessary. (WG) [de

  10. Action Relations. Basic Design Concepts for Behaviour Modelling and Refinement.

    OpenAIRE

    Quartel, Dick

    1998-01-01

    This thesis presents basic design concepts, design methods and a basic design language for distributed system behaviours. This language is based on two basic concepts: the action concept and the causality relation concept. Our methods focus on behaviour refinement, which consists of replacing an abstract behaviour by a more concrete behaviour, such that the concrete behaviour conforms to the abstract behaviour. An important idea underlying this thesis is that an effective design methodology s...

  11. Was sind, und wie wirken Grundüberzeugungen in unserer Zeit? Über „Paradigmen“ und „Paradigmenveränderungen“ in der heutigen politischen und sozialen Sphäre – und die Folgen. Ein Gespräch mit Roland Benedikter, Stanford Universität. English summary included. What are basic assumptions, and which effects do they have in our time? On “paradigms” and “paradigm change” in the contemporary political and social domain, and the consequences. A conversation with Roland Benedikter, Stanford University.

    Directory of Open Access Journals (Sweden)

    Roland Benedikter

    2011-10-01

    Full Text Available English Summary: This talk clarifies what is meant by the pervasive but seldom-precise use of the term “paradigm change.” While this term is often (unwillingly) misused, particularly by integral and progressive intellectuals and civil society groups, as an instrument for predicting future cultural change, it is argued that it should rather be used as a tool for analyzing the past and present of the basic cultural and scientific convictions that dominate their times. In fact, a “paradigm” is defined as a collective bias (or, to use a more technical explanation, a “knowledge-constituting collective prejudice”) on certain issues. It defines what is meant to be true and what false, and what can be accepted as valid and what not, in a given society at a given time for a given period. A “paradigm” always functions (a) as a “constitutive paradox”, because its claim is to define what is true and what is not, while at the same time it is continuously replaced by new paradigms that coin different definitions, thus contradicting the very essence of “paradigm” as such; and (b) through incubation periods, i.e., phases where different claims on what is valid coexist or even form hybrids. In the end, “paradigms” are irrational and in most cases un- or half-conscious cultural formations; but they seem to exist in every period of cultural development. This talk explains the mechanisms by which dominating cultural biases become “paradigms” in order to rule temporarily over the academic and political correctness of their times; and how, and to what extent, the one-sided “paradigm fetishism” of the epoch of “postmodernity” is currently coming to an end, with new, more integrative and integral blueprints arising that mostly try to balance the prevailing “paradigmatic” nominalism with new, empirical forms of neo-essentialism and neo

  12. Visual Basic 2012 programmer's reference

    CERN Document Server

    Stephens, Rod

    2012-01-01

    The comprehensive guide to Visual Basic 2012 Microsoft Visual Basic (VB) is the most popular programming language in the world, with millions of lines of code used in businesses and applications of all types and sizes. In this edition of the bestselling Wrox guide, Visual Basic expert Rod Stephens offers novice and experienced developers a comprehensive tutorial and reference to Visual Basic 2012. This latest edition introduces major changes to the Visual Studio development platform, including support for developing mobile applications that can take advantage of the Windows 8 operating system

  13. Revealing patterns of cultural transmission from frequency data: equilibrium and non-equilibrium assumptions

    Science.gov (United States)

    Crema, Enrico R.; Kandler, Anne; Shennan, Stephen

    2016-12-01

    A long tradition of cultural evolutionary studies has developed a rich repertoire of mathematical models of social learning. Early studies have laid the foundation of more recent endeavours to infer patterns of cultural transmission from observed frequencies of a variety of cultural data, from decorative motifs on potsherds to baby names and musical preferences. While this wide range of applications provides an opportunity for the development of generalisable analytical workflows, archaeological data present new questions and challenges that require further methodological and theoretical discussion. Here we examine the decorative motifs of Neolithic pottery from an archaeological assemblage in Western Germany, and argue that the widely used (and relatively undiscussed) assumption that observed frequencies are the result of a system in equilibrium conditions is unwarranted, and can lead to incorrect conclusions. We analyse our data with a simulation-based inferential framework that can overcome some of the intrinsic limitations in archaeological data, as well as handle both equilibrium conditions and instances where the mode of cultural transmission is time-variant. Results suggest that none of the models examined can produce the observed pattern under equilibrium conditions, and suggest instead temporal shifts in the patterns of cultural transmission.
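The equilibrium question raised in this abstract can be made concrete with a toy simulation. The sketch below implements unbiased copying with innovation, a standard neutral model in the cultural-transmission literature; all parameter values (population size, innovation rate, number of steps) are illustrative assumptions, not values from the paper. Whether the resulting variant frequencies look "equilibrium-like" depends on how long the process has run, which is exactly the point at issue:

```python
import random

def unbiased_copying(pop_size=200, mu=0.01, steps=100, seed=42):
    """Neutral cultural transmission: each individual either innovates a
    brand-new variant (probability mu) or copies a random member of the
    previous generation. Returns variant frequencies at the final step."""
    rng = random.Random(seed)
    population = [0] * pop_size      # everyone starts with variant 0
    next_variant = 1
    for _ in range(steps):
        new_pop = []
        for _ in range(pop_size):
            if rng.random() < mu:    # innovation event
                new_pop.append(next_variant)
                next_variant += 1
            else:                    # unbiased copying
                new_pop.append(rng.choice(population))
        population = new_pop
    counts = {}
    for v in population:
        counts[v] = counts.get(v, 0) + 1
    return {v: n / pop_size for v, n in counts.items()}

freqs = unbiased_copying()
```

Running the model for few steps versus many steps yields very different frequency spectra from identical transmission rules, which is why inferring the transmission mode from a single frequency snapshot requires the equilibrium assumption the authors question.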

  14. The European Water Framework Directive: How Ecological Assumptions Frame Technical and Social Change

    Directory of Open Access Journals (Sweden)

    Patrick Steyaert

    2007-06-01

    Full Text Available The European Water Framework Directive (WFD) is built upon significant cognitive developments in the field of ecological science but also encourages active involvement of all interested parties in its implementation. The coexistence in the same policy text of both substantive and procedural approaches to policy development stimulated this research, as did our concerns about the implications of substantive ecological visions within the WFD policy for promoting, or not, social learning processes through participatory designs. We have used a qualitative analysis of the WFD text which shows that the ecological dimension of the WFD dedicates its quasi-exclusive attention to a particular current of thought in ecosystems science, one focusing on ecosystem status and stability and considering human activities as disturbance factors. This particular worldview is juxtaposed within the WFD with a more utilitarian one that gives rise to many policy exemptions without changing the general underlying ecological model. We discuss these policy statements in the light of the tension between substantive and procedural policy developments. We argue that the dominant substantive approach of the WFD, comprising particular ecological assumptions built upon "compositionalism," seems to be contradictory with its espoused intention of involving the public. We discuss that current of thought in relation to more functionalist thinking and adaptive management, which offers greater opportunities for social learning, i.e., placing a set of interdependent stakeholders in an intersubjective position in which they operate a "social construction" of water problems through the co-production of knowledge.

  15. 5 CFR 534.603 - Rates of basic pay.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Rates of basic pay. 534.603 Section 534.603 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY UNDER OTHER SYSTEMS Pay for Administrative Appeals Judge Positions § 534.603 Rates of basic pay. (a) The...

  16. Metrology concept design of the GAIA basic angle monitoring system

    NARCIS (Netherlands)

    Veggel, van A.A.; Vink, H.J.P.; Rosielle, P.C.J.N.; Nijmeijer, H.; Wielders, A.A.; Antebi, J.; Lemke, D.

    2004-01-01

    The GAIA satellite, scheduled for launch in 2010, will make a highly accurate map of our Galaxy. It will measure the position of stars with an accuracy of 50 prad using two telescopes, which are positioned under a 'basic' angle of 106° between the lines-of-sight of the telescopes. With a Basic

  17. The crux of the method: assumptions in ordinary least squares and logistic regression.

    Science.gov (United States)

    Long, Rebecca G

    2008-10-01

    Logistic regression has increasingly become the tool of choice when analyzing data with a binary dependent variable. While resources relating to the technique are widely available, clear discussions of why logistic regression should be used in place of ordinary least squares regression are difficult to find. The current paper compares and contrasts the assumptions of ordinary least squares with those of logistic regression and explains why logistic regression's looser assumptions make it adept at handling violations of the more important assumptions in ordinary least squares.
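The contrast this abstract draws can be illustrated numerically. The sketch below (toy data, hypothetical values) fits a binary outcome two ways: ordinary least squares as a linear probability model, and logistic regression by plain gradient ascent on the log-likelihood. It shows one concrete assumption violation: OLS fitted values can leave the [0, 1] interval, while logistic predictions cannot.

```python
import math

# Toy binary data: y tends to 1 for larger x (hypothetical values).
xs = [-4, -3, -2, -1, 0, 1, 2, 3, 4, 5]
ys = [ 0,  0,  0,  0, 0, 1, 1, 1, 1, 1]
n = len(xs)

# Ordinary least squares (linear probability model): closed-form slope/intercept.
mx, my = sum(xs) / n, sum(ys) / n
b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b0 = my - b1 * mx
ols_pred = [b0 + b1 * x for x in xs]          # can fall outside [0, 1]

# Logistic regression fitted by gradient ascent on the log-likelihood.
w0 = w1 = 0.0
for _ in range(200):
    g0 = g1 = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-(w0 + w1 * x)))
        g0 += y - p                            # gradient wrt intercept
        g1 += (y - p) * x                      # gradient wrt slope
    w0 += 0.1 * g0
    w1 += 0.1 * g1
logit_pred = [1.0 / (1.0 + math.exp(-(w0 + w1 * x))) for x in xs]  # always in (0, 1)
```

On these data the OLS line predicts a negative "probability" at x = -4 and a value above 1 at x = 5, whereas every logistic prediction stays strictly between 0 and 1.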

  18. Stop-loss premiums under dependence

    NARCIS (Netherlands)

    Albers, Willem/Wim

    1999-01-01

    Stop-loss premiums are typically calculated under the assumption that the insured lives in the underlying portfolio are independent. Here we study the effects of small departures from this assumption. Using Edgeworth expansions, it is made transparent which configurations of dependence parameters

  19. Basic Energy Sciences at NREL

    Energy Technology Data Exchange (ETDEWEB)

    Moon, S.

    2000-12-04

    NREL's Center for Basic Sciences performs fundamental research for DOE's Office of Science. Our mission is to provide fundamental knowledge in the basic sciences and engineering that will underpin new and improved renewable energy technologies.

  20. BASIC Instructional Program: System Documentation.

    Science.gov (United States)

    Dageforde, Mary L.

    This report documents the BASIC Instructional Program (BIP), a "hands-on laboratory" that teaches elementary programming in the BASIC language, as implemented in the MAINSAIL language, a machine-independent revision of SAIL which should facilitate implementation of BIP on other computing systems. Eight instructional modules which make up…

  1. Solar Photovoltaic Technology Basics | NREL

    Science.gov (United States)

    Solar Photovoltaic Technology Basics. Solar cells, also called photovoltaic cells, were first developed when scientists discovered that silicon (an element found in sand) created an electric charge when exposed to sunlight. Soon solar cells were being used to power space satellites and smaller items like calculators and watches. Photo caption: a large silicon solar

  2. Solar Process Heat Basics | NREL

    Science.gov (United States)

    Solar Process Heat Basics. Commercial and industrial buildings may use the same solar technologies (photovoltaics, passive heating, daylighting, and water heating) that are used for residential buildings. These nonresidential buildings can also use solar energy technologies that would be

  3. Basics of LASIK Eye Surgery

    Science.gov (United States)

    ... The Basics of LASIK Eye Surgery ... Surgical Alternatives to LASIK. For More Information: LASIK Basics. If you wear glasses or contact lenses, ...

  4. Fuel Cell Vehicle Basics | NREL

    Science.gov (United States)

    Fuel Cell Vehicle Basics. Researchers are developing fuel cells that can be ... Photo caption: a silver four-door sedan being driven on a roadway and containing the words "hydrogen fuel cell electric" across the front and rear doors. This prototype hydrogen fuel cell electric vehicle was

  5. Children and Their Basic Needs.

    Science.gov (United States)

    Prince, Debra Lindsey; Howard, Esther M.

    2002-01-01

    Describes obstacles presented by poverty in the fulfillment of the basic needs of children. Individually addresses Maslow's five basic needs with regard to children reared in poverty: (1) physiological needs; (2) safety needs; (3) belonging and love needs; (4) self-esteem needs; and (5) self-actualization needs. (Author/SD)

  6. Shale gas. Basic information

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-08-15

    The ongoing improvement of production technologies has enabled access to unconventional gas resources present in source rocks. Whether Poland is going to see a gas revolution depends chiefly on the geological conditions. At this point it is difficult to estimate the actual size of Poland's shale gas resources or the prospects for commercial shale gas production. First results will be known in the next four or five years, when operators complete the work under exploration and appraisal licences granted to them by the Ministry of the Environment. The Polish government is offering licences on exceptionally favourable terms as an incentive for research on unconventional gas resources. Such an approach is driven by the strategic objective of ending Poland's reliance on foreign sources of natural gas in the future. Shale gas will not change Poland's and the region's energy landscape instantaneously. As in the case of all commodity and energy revolutions, changes occur slowly, but shale gas development offers huge opportunities for a permanent shift in the Polish and European energy sectors. Poland stands a chance of becoming fully independent of natural gas imports, and Polish companies - a chance of improving their international standing.

  7. Intestinal Permeability: The Basics

    Directory of Open Access Journals (Sweden)

    Ingvar Bjarnason

    1995-01-01

    Full Text Available The authors review some of the more fundamental principles underlying the noninvasive assessment of intestinal permeability in humans, the choice of test markers and their analyses, and the practical aspects of test dose composition and how these can be changed to allow the specific assessment of regional permeability changes and other intestinal functions. The implications of increased intestinal permeability in the pathogenesis of human disease are discussed in relation to findings in patients with Crohn’s disease. A common feature of increased intestinal permeability is the development of a low grade enteropathy, and while quantitatively similar changes may be found in Crohn’s disease, these seem to predict relapse of disease. Moreover, factors associated with relapse of Crohn’s disease have in common an action to increase intestinal permeability. While increased intestinal permeability does not seem to be important in the etiology of Crohn’s disease, it may be a central mechanism in the clinical relapse of disease.

  8. A criterion of orthogonality on the assumption and restrictions in subgrid-scale modelling of turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Fang, L. [LMP, Ecole Centrale de Pékin, Beihang University, Beijing 100191 (China); Co-Innovation Center for Advanced Aero-Engine, Beihang University, Beijing 100191 (China); Sun, X.Y. [LMP, Ecole Centrale de Pékin, Beihang University, Beijing 100191 (China); Liu, Y.W., E-mail: liuyangwei@126.com [National Key Laboratory of Science and Technology on Aero-Engine Aero-Thermodynamics, School of Energy and Power Engineering, Beihang University, Beijing 100191 (China); Co-Innovation Center for Advanced Aero-Engine, Beihang University, Beijing 100191 (China)

    2016-12-09

    In order to shed light on understanding the subgrid-scale (SGS) modelling methodology, we analyze and define the concepts of assumption and restriction in the modelling procedure, then show by a generalized derivation that if there are multiple stationary restrictions in a modelling, the corresponding assumption function must satisfy a criterion of orthogonality. Numerical tests using one-dimensional nonlinear advection equation are performed to validate this criterion. This study is expected to inspire future research on generally guiding the SGS modelling methodology. - Highlights: • The concepts of assumption and restriction in the SGS modelling procedure are defined. • A criterion of orthogonality on the assumption and restrictions is derived. • Numerical tests using one-dimensional nonlinear advection equation are performed to validate this criterion.

  9. Who needs the assumption of opportunistic behavior? Transaction cost economics does not!

    DEFF Research Database (Denmark)

    Koch, Carsten Allan

    2000-01-01

    The assumption of opportunistic behavior, familiar from transaction cost economics, has been and remains highly controversial. But opportunistic behavior, albeit undoubtedly an extremely important form of motivation, is not a necessary condition for the contractual problems studied by transaction...

  10. Bayou Corne sinkhole : control measurements of State Highway 70 in Assumption Parish, Louisiana, tech summary.

    Science.gov (United States)

    2014-01-01

    The sinkhole located in Assumption Parish, Louisiana, threatens the stability of Highway 70, a state-maintained route. In order to mitigate the potential damaging effects of the sinkhole on this infrastructure, the Louisiana Department of Transpo...

  11. Bayou Corne sinkhole : control measurements of State Highway 70 in Assumption Parish, Louisiana.

    Science.gov (United States)

    2014-01-01

    This project measured and assessed the surface stability of the portion of LA Highway 70 that is potentially vulnerable to the Assumption Parish sinkhole. Using Global Positioning Systems (GPS) enhanced by a real-time network (RTN) of continuousl...

  12. Bayou Corne Sinkhole: Control Measurements of State Highway 70 in Assumption Parish, Louisiana : Research Project Capsule

    Science.gov (United States)

    2012-09-01

    The sinkhole located in northern Assumption Parish, Louisiana, threatens the stability of Highway 70, a state-maintained route. In order to monitor and mitigate potential damage effects on this infrastructure, the Louisiana Department of Trans...

  13. Washington International Renewable Energy Conference (WIREC) 2008 Pledges. Methodology and Assumptions Summary

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, Bill [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bilello, Daniel E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cowlin, Shannon C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wise, Alison [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2008-08-01

    This report describes the methodology and assumptions used by NREL in quantifying the potential CO2 reductions resulting from more than 140 governments, international organizations, and private-sector representatives pledging to advance the uptake of renewable energy.

  14. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    NARCIS (Netherlands)

    Ernst, Anja F.; Albers, Casper J.

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated

  15. Shattering Man’s Fundamental Assumptions in Don DeLillo’s Falling Man

    OpenAIRE

    Hazim Adnan Hashim; Rosli Bin Talif; Lina Hameed Ali

    2016-01-01

    The present study addresses effects of traumatic events such as the September 11 attacks on victims’ fundamental assumptions. These beliefs or assumptions provide individuals with expectations about the world and their sense of self-worth. Thus, they ground people’s sense of security, stability, and orientation. The September 11 terrorist attacks in the U.S.A. were very tragic for Americans because this fundamentally changed their understandings about many aspects of life. The attacks led man...

  16. Testing the rationality assumption using a design difference in the TV game show 'Jeopardy'

    OpenAIRE

    Sjögren Lindquist, Gabriella; Säve-Söderbergh, Jenny

    2006-01-01

    This paper empirically investigates the rationality assumption commonly applied in economic modeling by exploiting a design difference in the game show Jeopardy between the US and Sweden. In particular we address the assumption of individuals’ capabilities to process complex mathematical problems to find optimal strategies. The vital difference is that US contestants are given explicit information before they act, while Swedish contestants individually need to calculate the same info...

  17. Basic entwinements: unassuming analogue inserts in basic digital modeling (courses)

    DEFF Research Database (Denmark)

    Wiesner, Thomas

    2012-01-01

    Ubiquitous, basic digital modelling tools are currently deployed with relative ease in architecture schools during the course of first year studies. While these first architectural project essays sometimes communicate matter with already quite impressive professional outlooks, a certain disparit...

  18. Interpretation of the results of statistical measurements. [search for basic probability model

    Science.gov (United States)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional, which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters for a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  19. Assessing framing assumptions in quantitative health impact assessments: a housing intervention example.

    Science.gov (United States)

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2013-09-01

    Health impact assessment (HIA) is often used to determine ex ante the health impact of an environmental policy or an environmental intervention. Underpinning any HIA is the framing assumption, which defines the causal pathways mapping environmental exposures to health outcomes. The sensitivity of the HIA to the framing assumptions is often ignored. A novel method based on fuzzy cognitive map (FCM) is developed to quantify the framing assumptions in the assessment stage of a HIA, and is then applied to a housing intervention (tightening insulation) as a case-study. Framing assumptions of the case-study were identified through a literature search of Ovid Medline (1948-2011). The FCM approach was used to identify the key variables that have the most influence in a HIA. Changes in air-tightness, ventilation, indoor air quality and mould/humidity have been identified as having the most influence on health. The FCM approach is widely applicable and can be used to inform the formulation of the framing assumptions in any quantitative HIA of environmental interventions. We argue that it is necessary to explore and quantify framing assumptions prior to conducting a detailed quantitative HIA during the assessment stage. Copyright © 2013 Elsevier Ltd. All rights reserved.
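The fuzzy cognitive map (FCM) machinery described in this abstract can be sketched in a few lines: concepts are nodes with activations in (0, 1), signed edge weights encode assumed causal influence, and the map is iterated to a fixed point. The concept names below follow the housing example (insulation, air-tightness, ventilation, indoor air quality, health), but the edge weights and scenario values are illustrative guesses, not values from the paper:

```python
import math

def fcm_step(state, weights):
    """One synchronous FCM update: each concept's new activation is a
    logistic squashing of the weighted sum of all concepts' activations."""
    n = len(state)
    return [
        1.0 / (1.0 + math.exp(-sum(weights[i][j] * state[i] for i in range(n))))
        for j in range(n)
    ]

# Concepts: 0 insulation, 1 air-tightness, 2 ventilation,
#           3 indoor air quality, 4 health. Weights are hypothetical.
W = [
    [0.0,  0.8,  0.0,  0.0,  0.0],   # more insulation -> more air-tightness
    [0.0,  0.0, -0.7,  0.0,  0.0],   # air-tightness -> less ventilation
    [0.0,  0.0,  0.0,  0.6,  0.0],   # ventilation -> better indoor air quality
    [0.0,  0.0,  0.0,  0.0,  0.9],   # indoor air quality -> health
    [0.0,  0.0,  0.0,  0.0,  0.0],   # health influences nothing here
]

state = [1.0, 0.5, 0.5, 0.5, 0.5]    # "tighten insulation" scenario
for _ in range(50):                   # iterate toward a fixed point
    state = fcm_step(state, W)
```

Comparing fixed points under different edge-weight matrices is one way to probe how sensitive an HIA conclusion is to its framing assumptions, which is the role FCM plays in the paper.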

  20. Shattering Man’s Fundamental Assumptions in Don DeLillo’s Falling Man

    Directory of Open Access Journals (Sweden)

    Hazim Adnan Hashim

    2016-09-01

    Full Text Available The present study addresses effects of traumatic events such as the September 11 attacks on victims’ fundamental assumptions. These beliefs or assumptions provide individuals with expectations about the world and their sense of self-worth. Thus, they ground people’s sense of security, stability, and orientation. The September 11 terrorist attacks in the U.S.A. were very tragic for Americans because this fundamentally changed their understandings about many aspects in life. The attacks led many individuals to build new kind of beliefs and assumptions about themselves and the world. Many writers have written about the human ordeals that followed this incident. Don DeLillo’s Falling Man reflects the traumatic repercussions of this disaster on Americans’ fundamental assumptions. The objective of this study is to examine the novel from the traumatic perspective that has afflicted the victims’ fundamental understandings of the world and the self. Individuals’ fundamental understandings could be changed or modified due to exposure to certain types of events like war, terrorism, political violence or even the sense of alienation. The Assumptive World theory of Ronnie Janoff-Bulman will be used as a framework to study the traumatic experience of the characters in Falling Man. The significance of the study lies in providing a new perception to the field of trauma that can help trauma victims to adopt alternative assumptions or reshape their previous ones to heal from traumatic effects.

  1. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  2. Comparisons between a new point kernel-based scheme and the infinite plane source assumption method for radiation calculation of deposited airborne radionuclides from nuclear power plants.

    Science.gov (United States)

    Zhang, Xiaole; Efthimiou, George; Wang, Yan; Huang, Meng

    2018-04-01

    Radiation from the deposited radionuclides is indispensable information for environmental impact assessment of nuclear power plants and emergency management during nuclear accidents. Ground shine estimation is related to multiple physical processes, including atmospheric dispersion, deposition, and soil and air radiation shielding. It remains unclear whether the normally adopted "infinite plane" source assumption for the ground shine calculation is accurate enough, especially for areas with highly heterogeneous deposition distribution near the release point. In this study, a new ground shine calculation scheme, which accounts for both the spatial deposition distribution and the properties of air and soil layers, is developed based on the point kernel method. Two sets of "detector-centered" grids are proposed and optimized for both the deposition and radiation calculations to better simulate the results measured by the detectors, which will be beneficial for applications such as source term estimation. The evaluation against the available Monte Carlo results in the literature indicates that the errors of the new scheme are within 5% for the key radionuclides in nuclear accidents. The comparisons between the new scheme and the "infinite plane" assumption indicate that the assumption is tenable (relative errors within 20%) for areas located 1 km away from the release source. Within 1 km, the assumption mainly causes errors for wet deposition, and the errors are independent of rain intensities. The results suggest that the new scheme should be adopted if the detectors are within 1 km of the source under a stable atmosphere (classes E and F), or within 500 m under a slightly unstable (class C) or neutral (class D) atmosphere. Otherwise, the infinite plane assumption is reasonable since the relative errors induced by this assumption are within 20%. The results here are only based on theoretical investigations. They should
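The point kernel idea in this abstract can be sketched directly: sum each deposition cell's contribution as activity times an attenuated inverse-square kernel. Everything below (grid layout, attenuation coefficient, activity values) is an illustrative assumption; a real ground-shine code would add dose-conversion factors and soil/air buildup terms, as the paper's scheme does:

```python
import math

def ground_shine_point_kernel(deposition, cell_size, detector_height, mu_air):
    """Dose-rate-proportional sum over a deposition grid using a bare point
    kernel: each cell contributes activity * exp(-mu*r) / (4*pi*r^2),
    where r is the slant distance from the cell to the detector."""
    total = 0.0
    for (ix, iy), activity in deposition.items():
        dx, dy = ix * cell_size, iy * cell_size
        r = math.sqrt(dx * dx + dy * dy + detector_height ** 2)
        total += activity * math.exp(-mu_air * r) / (4.0 * math.pi * r * r)
    return total

# Hypothetical heterogeneous deposit: uniform background on a 21x21 grid of
# 10 m cells, plus a hot spot one cell (10 m) from the detector.
deposit = {(i, j): 1.0 for i in range(-10, 11) for j in range(-10, 11)}
deposit[(1, 0)] += 50.0

near = ground_shine_point_kernel(deposit, 10.0, 1.0, 0.01)
no_spot = {k: 1.0 for k in deposit}           # same grid without the hot spot
background = ground_shine_point_kernel(no_spot, 10.0, 1.0, 0.01)
```

An infinite-plane formula would return the same value for both deposits of equal average density; the point-kernel sum responds to the hot spot, which is why the two approaches diverge close to the release point.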

  3. A critical evaluation of the local-equilibrium assumption in modeling NAPL-pool dissolution

    Science.gov (United States)

    Seagren, Eric A.; Rittmann, Bruce E.; Valocchi, Albert J.

    1999-07-01

    An analytical modeling analysis was used to assess when local equilibrium (LE) and nonequilibrium (NE) modeling approaches may be appropriate for describing nonaqueous-phase liquid (NAPL) pool dissolution. NE mass transfer between NAPL pools and groundwater is expected to affect the dissolution flux at low values of Sh'St (the modified Sherwood number (Lx kl/Dz) multiplied by the Stanton number (kl/vx)); for Sh'St ≈ 400 and above, the NE and LE solutions converge, and the LE assumption is appropriate. Based on typical groundwater conditions, many cases of interest are expected to fall in this range. The parameter with the greatest impact on Sh'St is kl. The NAPL pool mass-transfer coefficient correlation of Pfannkuch [Pfannkuch, H.-O., 1984. Determination of the contaminant source strength from mass exchange processes at the petroleum-ground-water interface in shallow aquifer systems. In: Proceedings of the NWWA/API Conference on Petroleum Hydrocarbons and Organic Chemicals in Ground Water—Prevention, Detection, and Restoration, Houston, TX. Natl. Water Well Assoc., Worthington, OH, Nov. 1984, pp. 111-129.] was evaluated using the toluene pool data from Seagren et al. [Seagren, E.A., Rittmann, B.E., Valocchi, A.J., 1998. An experimental investigation of NAPL-pool dissolution enhancement by flushing. J. Contam. Hydrol., accepted.]. Dissolution flux predictions made with kl calculated using the Pfannkuch correlation were similar to the LE model predictions, and deviated systematically from predictions made using the average overall kl = 4.76 m/day estimated by Seagren et al. and from the experimental data for vx > 18 m/day. The Pfannkuch correlation kl was too large for vx > ≈10 m/day, possibly because of the relatively low Peclet number data used by Pfannkuch [Pfannkuch, H.-O., 1984. Determination
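The dimensionless criterion quoted in this abstract is easy to evaluate numerically. The sketch below plugs in kl = 4.76 m/day and vx = 18 m/day from the abstract; the pool length Lx and vertical dispersion coefficient Dz are illustrative assumptions chosen for the example, not values from the paper:

```python
# Order-of-magnitude check of the LE criterion Sh'St = (Lx*kl/Dz) * (kl/vx).
Lx = 5.0          # NAPL pool length [m] (assumed for illustration)
kl = 4.76         # overall mass-transfer coefficient [m/day] (from the abstract)
Dz = 4.3e-5       # vertical dispersion coefficient [m^2/day] (assumed,
                  # roughly molecular-diffusion scale)
vx = 18.0         # groundwater velocity [m/day] (from the abstract)

Sh_prime = Lx * kl / Dz          # modified Sherwood number
St = kl / vx                     # Stanton number
Sh_St = Sh_prime * St
le_ok = Sh_St >= 400             # LE assumption appropriate per the ~400 threshold
```

With these values Sh'St is several orders of magnitude above 400, consistent with the abstract's remark that many typical groundwater cases fall in the range where the local-equilibrium assumption holds.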

  4. Revisiting the Operating Room Basics

    Directory of Open Access Journals (Sweden)

    Tushar Chakravorty

    2015-12-01

    Full Text Available Young doctors walking into the operating room are eager to develop their skills to become efficient and knowledgeable professionals in future. But precious little is done to actively develop the basic practical skills of the budding doctors. They remain unaware of the layout of the operating room and of OR etiquette, and often lack a sound scientific understanding of the basic operating room protocols and of the importance of their meticulous execution. This article stresses the need to teach the basics of OR protocol and to improve the confidence of young doctors by strengthening their foundation, showing them that attention to the basics of medical care and empathy for the patient can really make a difference to the outcome of a treatment.

  5. New Federalism: Back to Basics.

    Science.gov (United States)

    Durenberger, Dave

    1983-01-01

    The senator explains the basic concepts of New Federalism, including a rethinking of responsibilities and intergovernmental relations and a reconsideration of the role of state and local government. (SK)

  6. Basic statements of relativity theory

    Directory of Open Access Journals (Sweden)

    Wolfgang Muschik

    2010-04-01

    Full Text Available Some basic statements of relativity theory, starting out with geometry and observers up to Einstein's field equations, are collected in systematic order without any proof, to serve as a short survey of tools and results.

  7. Dental Health: The Basic Facts

    Science.gov (United States)

    Dental Health: The Basic Facts (Multiple Sclerosis). People with a chronic disease may neglect their general health and wellness, research shows. Dental care is no exception. A tendency to focus ...

  8. Transforming Defense Basic Research Strategy

    National Research Council Canada - National Science Library

    Fountain, Augustus W

    2004-01-01

    ... technologies for development. With a basic research budget less than half that of the National Science Foundation, and a mere fraction of that of the NIH, the DoD can no longer afford to pursue lofty science education goals...

  9. Transforming Defense Basic Research Strategy

    National Research Council Canada - National Science Library

    Fountain, Augustus W

    2004-01-01

    .... Public funding of basic research for the DoD during the Cold War was successful because it minimized risk by taking maximum advantage of long-term research projects that produced rather mature...

  10. Basic hypergeometry of supersymmetric dualities

    Energy Technology Data Exchange (ETDEWEB)

    Gahramanov, Ilmar, E-mail: ilmar.gahramanov@aei.mpg.de [Max Planck Institute for Gravitational Physics (Albert Einstein Institute), Am Mühlenberg 1, D14476 Potsdam (Germany); Institut für Physik und IRIS Adlershof, Humboldt-Universität zu Berlin, Zum Grossen Windkanal 6, D12489 Berlin (Germany); Institute of Radiation Problems ANAS, B.Vahabzade 9, AZ1143 Baku (Azerbaijan); Department of Mathematics, Khazar University, Mehseti St. 41, AZ1096, Baku (Azerbaijan); Rosengren, Hjalmar, E-mail: hjalmar@chalmers.se [Department of Mathematical Sciences, Chalmers University of Technology and University of Gothenburg, SE-412 96 Göteborg (Sweden)

    2016-12-15

    We introduce several new identities combining basic hypergeometric sums and integrals. Such identities appear in the context of superconformal index computations for three-dimensional supersymmetric dual theories. We give both analytic proofs and physical interpretations of the presented identities.

  11. Basic HIV/AIDS Statistics

    Science.gov (United States)

    ... HIV/AIDS Basic Statistics. HIV and ... HIV. Interested in learning more about CDC's HIV statistics? Terms, Definitions, and Calculations Used in CDC HIV ...

  12. Cloud-turbulence interactions: Sensitivity of a general circulation model to closure assumptions

    International Nuclear Information System (INIS)

    Brinkop, S.; Roeckner, E.

    1993-01-01

    Several approaches to parameterize the turbulent transport of momentum, heat, water vapour and cloud water for use in a general circulation model (GCM) have been tested in one-dimensional and three-dimensional model simulations. The schemes differ with respect to their closure assumptions (conventional eddy diffusivity model versus turbulent kinetic energy closure) and also regarding their treatment of cloud-turbulence interactions. The basic properties of these parameterizations are discussed first in column simulations of a stratocumulus-topped atmospheric boundary layer (ABL) under a strong subsidence inversion during the KONTROL experiment in the North Sea. It is found that the K-models tend to decouple the cloud layer from the adjacent layers because the turbulent activity is calculated from local variables. The higher-order scheme performs better in this respect because internally generated turbulence can be transported up and down through the action of turbulent diffusion. Thus, the TKE-scheme provides not only a better link between the cloud and the sub-cloud layer but also between the cloud and the inversion as a result of cloud-top entrainment. In the stratocumulus case study, where the cloud is confined by a pronounced subsidence inversion, increased entrainment favours cloud dilution through enhanced evaporation of cloud droplets. In the GCM study, however, additional cloud-top entrainment supports cloud formation because indirect cloud generating processes are promoted through efficient ventilation of the ABL, such as the enhanced moisture supply by surface evaporation and the increased depth of the ABL. As a result, tropical convection is more vigorous, the hydrological cycle is intensified, the whole troposphere becomes warmer and moister in general and the cloudiness in the upper part of the ABL is increased. (orig.)
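    The contrast the abstract draws between the two closures can be sketched in a few lines: a local K-model computes the turbulent flux from the local gradient alone, while a TKE closure builds the diffusivity from a prognostic turbulence energy that can itself be diffused vertically. All constants and profiles below are illustrative assumptions, not the actual GCM scheme:

```python
import numpy as np

# Minimal sketch of the two boundary-layer closure ideas from the abstract.
# All constants and profiles are hypothetical, not the GCM's actual scheme.

z = np.linspace(0.0, 1000.0, 101)          # height [m]
q = 8.0e-3 * np.exp(-z / 500.0)            # specific humidity profile [kg/kg]
dqdz = np.gradient(q, z)

# (a) Local eddy-diffusivity (K) closure: the flux depends only on the
#     local gradient, which is why a cloud layer can decouple from the
#     layers below it.
K_local = 10.0                              # fixed eddy diffusivity [m^2/s]
flux_K = -K_local * dqdz

# (b) TKE closure: diffusivity K = c_k * l * sqrt(e) is built from a
#     prognostic turbulent kinetic energy e that is itself transported
#     vertically, linking cloud, sub-cloud layer and inversion.
c_k, mixing_length = 0.1, 100.0
e = 0.5 * np.exp(-((z - 800.0) / 300.0) ** 2)   # TKE generated near cloud top
# one explicit diffusion step transports TKE up and down:
e[1:-1] += 0.2 * (e[2:] - 2.0 * e[1:-1] + e[:-2])
K_tke = c_k * mixing_length * np.sqrt(e)
flux_tke = -K_tke * dqdz

# q decreases with height, so both downgradient fluxes are upward (positive).
assert np.all(flux_K > 0) and np.all(flux_tke > 0)
```

    The point of the sketch is structural: in (a) the diffusivity is a prescribed local quantity, while in (b) it inherits the vertical structure of the transported TKE field.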

  13. X-ray near-field holography. Beyond idealized assumptions of the probe

    International Nuclear Information System (INIS)

    Hagemann, Johannes

    2017-01-01

    The work at hand considers the imperfect, often neglected, aspects of X-ray near-field phase-contrast propagation imaging, or in short: X-ray near-field holography (NFH). NFH is an X-ray microscopy technique able to yield high-resolution yet low-dose imaging of a wide range of specimens. Derived from wave-optical theory, propagation-based imaging methods rely on assumptions about the illuminating wave field, for example that it is a perfect plane wave, a spherical wave emanating from a point source, or monochromatic. Violating the point-source assumption, for example, implies both a distorted wave front and a finite degree of coherence, each crucial for NFH. With the advances in X-ray focusing, instrumentation and X-ray wave guiding, NFH has become of high interest, since the barriers to practical implementation have been overcome. The idea of holography originates from electron microscopy, where it was introduced to overcome the lack of high-quality electron lenses: with holography, the need for optics between the specimen and detector is circumvented. The drawback, however, is that the measurement obtained at the detector is not a direct image of the specimen under survey but a "propagated version" of it, the so-called hologram. The problem with the optics is replaced by another problem, referred to as the phase problem: only the intensities of a wave field can be measured, not the phase information. The phase information is crucial for obtaining the image of the specimen and thus needs to be reconstructed. In recent years the methodology, sometimes also mythology, has been developed to reconstruct the specimen from the measured hologram. For a long time, the standard approach to dealing with deviations from the ideal assumptions in real-world holography experiments has been to simply ignore them. The prime example for this is the method of the standard flat
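    The "propagated version" of the specimen and the loss of phase at the detector can be sketched with a 1-D Fresnel propagation of a pure-phase object; all parameters below (pixel size, wavelength, distance, object shape) are hypothetical assumptions for illustration:

```python
import numpy as np

# Minimal 1-D sketch of the abstract's "propagated version" of a specimen:
# a pure-phase object is Fresnel-propagated to the detector plane, where
# only the intensity (the hologram) is recorded and the phase is lost.
# All parameters are hypothetical.

N, dx = 1024, 50e-9                  # samples, pixel size [m]
wavelength, z = 1e-10, 1e-3          # X-ray wavelength, propagation distance

x = (np.arange(N) - N // 2) * dx
phase = 0.5 * np.exp(-(x / 2e-6) ** 2)        # weak Gaussian phase object
psi0 = np.exp(1j * phase)                     # exit wave, |psi0| = 1 everywhere

# Fresnel propagator applied as a transfer function in Fourier space
f = np.fft.fftfreq(N, dx)
H = np.exp(-1j * np.pi * wavelength * z * f ** 2)
psi_det = np.fft.ifft(np.fft.fft(psi0) * H)

hologram = np.abs(psi_det) ** 2               # detector records intensities only

# In contact (z = 0) the phase object is invisible; after propagation it
# produces measurable contrast -- the essence of propagation-based imaging.
assert np.allclose(np.abs(psi0) ** 2, 1.0)    # no contrast at the exit plane
assert hologram.std() > 1e-4                  # contrast in the hologram
```

    Inverting this forward model from `hologram` back to `phase` is exactly the phase problem the abstract describes, since `psi_det`'s phase is discarded by the detector.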

  14. X-ray near-field holography. Beyond idealized assumptions of the probe

    Energy Technology Data Exchange (ETDEWEB)

    Hagemann, Johannes

    2017-07-01

    The work at hand considers the imperfect, often neglected, aspects of X-ray near-field phase-contrast propagation imaging, or in short: X-ray near-field holography (NFH). NFH is an X-ray microscopy technique able to yield high-resolution yet low-dose imaging of a wide range of specimens. Derived from wave-optical theory, propagation-based imaging methods rely on assumptions about the illuminating wave field, for example that it is a perfect plane wave, a spherical wave emanating from a point source, or monochromatic. Violating the point-source assumption, for example, implies both a distorted wave front and a finite degree of coherence, each crucial for NFH. With the advances in X-ray focusing, instrumentation and X-ray wave guiding, NFH has become of high interest, since the barriers to practical implementation have been overcome. The idea of holography originates from electron microscopy, where it was introduced to overcome the lack of high-quality electron lenses: with holography, the need for optics between the specimen and detector is circumvented. The drawback, however, is that the measurement obtained at the detector is not a direct image of the specimen under survey but a "propagated version" of it, the so-called hologram. The problem with the optics is replaced by another problem, referred to as the phase problem: only the intensities of a wave field can be measured, not the phase information. The phase information is crucial for obtaining the image of the specimen and thus needs to be reconstructed. In recent years the methodology, sometimes also mythology, has been developed to reconstruct the specimen from the measured hologram. For a long time, the standard approach to dealing with deviations from the ideal assumptions in real-world holography experiments has been to simply ignore them. The prime example for this is the method of the standard flat

  15. Basic petroleum research. Final report

    International Nuclear Information System (INIS)

    Roesjoe, Bjarne; Stiksrud, Helge

    2004-01-01

    An overview of projects in the field of basic petroleum research (PetroForsk) is presented. A brief presentation of some of the projects is included, as well as political comments on the value of these projects. The research program Basic Petroleum Research (PetroForsk) was established in 1998 and ended in 2004. The program has been part of the Research Council of Norway's long-term effort in petroleum research (ml)

  16. Contemporary assumptions on human nature and work and approach to human potential managing

    Directory of Open Access Journals (Sweden)

    Vujić Dobrila

    2006-01-01

    A general problem of this research is to identify whether there is a relationship between assumptions on human nature and work (McGregor, Argyris, Schein, Steers and Porter) and the preference for a general organizational model, as well as for mechanisms of human resource management. The research was carried out in 2005/2006 on a sample of 317 subjects (197 managers, 105 highly educated subordinates and 15 entrepreneurs) in 7 big enterprises and in a group of small business enterprises differing in ownership structure and type of activity. The general hypothesis, that assumptions on human nature and work are statistically significantly connected to the preferred approach (model) to work motivation and commitment, has been confirmed. Specific hypotheses have also been confirmed: assumptions on the human as a rational economic being correlate significantly with only two mechanisms of the traditional model, control of work methods and working discipline; assumptions on the human as a social being correlate significantly with all employee-engagement mechanisms of the human relations model, except the introduction of an adequate type of reward for all employees independently of working results; assumptions on the human as a creative being correlate significantly and positively with the preference for two mechanisms of the human resource model, investing in education and training and creating conditions for the application of knowledge and skills. Young subjects holding assumptions on the human as a creative being prefer a much broader repertoire of human resource model mechanisms than the remaining categories of subjects in the sample. The connection between assumptions on human nature and preferred models of engagement appears especially in the sub-sample of managers and in the category of young subjects.

  17. Detecting and accounting for violations of the constancy assumption in non-inferiority clinical trials.

    Science.gov (United States)

    Koopmeiners, Joseph S; Hobbs, Brian P

    2018-05-01

    Randomized, placebo-controlled clinical trials are the gold standard for evaluating a novel therapeutic agent. In some instances, it may not be considered ethical or desirable to complete a placebo-controlled clinical trial and, instead, the placebo is replaced by an active comparator with the objective of showing either superiority or non-inferiority to the active comparator. In a non-inferiority trial, the experimental treatment is considered non-inferior if it retains a pre-specified proportion of the effect of the active comparator as represented by the non-inferiority margin. A key assumption required for valid inference in the non-inferiority setting is the constancy assumption, which requires that the effect of the active comparator in the non-inferiority trial is consistent with the effect that was observed in previous trials. It has been shown that violations of the constancy assumption can result in a dramatic increase in the rate of incorrectly concluding non-inferiority in the presence of ineffective or even harmful treatment. In this paper, we illustrate how Bayesian hierarchical modeling can be used to facilitate multi-source smoothing of the data from the current trial with the data from historical studies, enabling direct probabilistic evaluation of the constancy assumption. We then show how this result can be used to adapt the non-inferiority margin when the constancy assumption is violated and present simulation results illustrating that our method controls the type-I error rate when the constancy assumption is violated, while retaining the power of the standard approach when the constancy assumption holds. We illustrate our adaptive procedure using a non-inferiority trial of raltegravir, an antiretroviral drug for the treatment of HIV.
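    The probabilistic evaluation of the constancy assumption can be sketched with a normal-normal conjugate model, a deliberate simplification of the paper's Bayesian hierarchical approach; all effect sizes, variances and the 70% retention threshold below are hypothetical:

```python
import math

# Hedged sketch of checking the constancy assumption with a normal-normal
# conjugate model -- a simplification of the hierarchical approach in the
# abstract. All numbers below are hypothetical.

def posterior(mean_prior, var_prior, mean_obs, var_obs):
    """Conjugate normal posterior (mean, variance) for a normal mean."""
    precision = 1.0 / var_prior + 1.0 / var_obs
    mean = (mean_prior / var_prior + mean_obs / var_obs) / precision
    return mean, 1.0 / precision

def normal_cdf(x, mean, var):
    """CDF of N(mean, var) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mean) / math.sqrt(2.0 * var)))

historical_effect = 2.0                # active-comparator effect, past trials
prior_var = 0.25                       # uncertainty carried from history
current_obs, current_var = 1.2, 0.16   # control-arm effect in the new trial

post_mean, post_var = posterior(historical_effect, prior_var,
                                current_obs, current_var)

# Posterior probability that the current effect retains less than 70% of
# the historical effect -- evidence against the constancy assumption,
# which could then trigger adaptation of the non-inferiority margin.
p_violation = normal_cdf(0.7 * historical_effect, post_mean, post_var)
print(round(p_violation, 3))
```

    In an adaptive procedure of the kind the abstract describes, a large `p_violation` would argue for shrinking the non-inferiority margin rather than borrowing the historical effect at face value.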

  18. Report on achievements in fiscal 1988 on research and development of photoreactive materials under the next generation basic technology research and development project. Comprehensive surveys and researches on photoreactive materials; 1988 nendo hikari hanno zairyo no kenkyu kaihatsu seika hokokusho. Hikari hanno zairyo sogo chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1991-03-01

    This paper describes the achievements in fiscal 1988 of comprehensive surveys on photoreactive materials. Under the next-generation basic technology research and development program of the Agency of Industrial Science and Technology, development is being promoted on photoreactive materials, namely photochromic materials and photochemical hole burning (PHB) materials (materials in which molecular structures and assembly states can be changed reversibly by the action of light, with expected applications in ultra-high-density recording, high-resolution displays, and optical switches). To promote this development smoothly and efficiently, related technologies inside and outside the country were surveyed and discussed. Comprehensive surveys and research were also carried out on the common basic technologies, drawing on well-informed experts for guidance. In this fiscal year, joint committee meetings were held to which lecturers were invited to discuss problems in the research and development trends. As the domestic survey, the status of research and development on photo-responsive polymeric gels at the University of Tsukuba was surveyed. As the survey on overseas technological trends, information on photoreactive materials, mainly photochemical hole burning materials in the U.S.A., was collected and the research trends were surveyed. (NEDO)

  19. Nonenzymatic glycosylation of bovine myelin basic protein

    International Nuclear Information System (INIS)

    Hitz, J.B.

    1987-01-01

    In the CNS myelin sheath, the nonenzymatic glycosylation reaction (at the early stage of the Amadori product) occurs only with the myelin basic protein and not with the other myelin proteins. This was observed in isolated bovine myelin by in vitro incubation with [14C]-galactose and [14C]-glucose. The respective in vitro incorporation rates for purified bovine myelin basic protein with D-galactose, D-glucose and D-mannose were 7.2, 2.4 and 2.4 mmoles/mole myelin basic protein per day at 37 °C. A more rapid HPLC method was devised and characterized to specifically analyze for the Amadori product. The HPLC method was correlated with the [14C]-sugar incorporation method for myelin basic protein under a set of standard reaction conditions using [14C]-glucose and [14C]-mannose, with HPLC values at 1/6 and 1/5 of the [14C]-sugar incorporation method. A novel myelin basic protein purification step has been developed that yields a relatively proteolysis-free preparation that is easy to work with, being totally soluble at a neutral pH. Nine new spots appear for a trypsinized glycosylated MBP in the paper peptide map, of which eight correspond to positions of the [3H]-labeled Amadori product in affinity-isolated peptides. These studies provide a general characterization of, and a structural basis for, investigations on nonenzymatically glycosylated MBP, as well as identifying MBP as the only nonenzymatically glycosylated protein in the CNS myelin sheath, which may accumulate during aging, diabetes, and demyelinating diseases in general.

  20. Basic properties of fuel determining its behavior under irradiation

    International Nuclear Information System (INIS)

    Konovalov, I.I.

    2000-01-01

    A theoretical model describing the swelling of nuclear fuel at low irradiation temperatures is considered. The critical physical parameters of substances determining the behavior of point defects, gas fission atoms, dislocation density, and the nucleation and growth of gas-filled pores are identified. A correlation between the values of these critical parameters and the physical properties of the substance is offered. Calculations of the swelling of various dense fuels, with reference to operation under research-reactor conditions, are given. (author)