WorldWideScience

Sample records for evaluating basic assumptions

  1. Helping Students to Recognize and Evaluate an Assumption in Quantitative Reasoning: A Basic Critical-Thinking Activity with Marbles and Electronic Balance

    Science.gov (United States)

    Slisko, Josip; Cruz, Adrian Corona

    2013-01-01

There is general agreement that critical thinking is an important element of 21st century skills. Although critical thinking is a very complex and controversial concept, many would accept that the recognition and evaluation of assumptions is a basic critical-thinking process. When students use a simple mathematical model to reason quantitatively…

  2. Basic concepts and assumptions behind the new ICRP recommendations

    International Nuclear Information System (INIS)

    Lindell, B.

    1979-01-01

    A review is given of some of the basic concepts and assumptions behind the current recommendations by the International Commission on Radiological Protection in ICRP Publications 26 and 28, which form the basis for the revision of the Basic Safety Standards jointly undertaken by IAEA, ILO, NEA and WHO. Special attention is given to the assumption of a linear, non-threshold dose-response relationship for stochastic radiation effects such as cancer and hereditary harm. The three basic principles of protection are discussed: justification of practice, optimization of protection and individual risk limitation. In the new ICRP recommendations particular emphasis is given to the principle of keeping all radiation doses as low as is reasonably achievable. A consequence of this is that the ICRP dose limits are now given as boundary conditions for the justification and optimization procedures rather than as values that should be used for purposes of planning and design. The fractional increase in total risk at various ages after continuous exposure near the dose limits is given as an illustration. The need for taking other sources, present and future, into account when applying the dose limits leads to the use of the commitment concept. This is briefly discussed as well as the new quantity, the effective dose equivalent, introduced by ICRP. (author)
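The linear, non-threshold (LNT) dose-response assumption discussed above can be illustrated with a small numeric sketch. The risk coefficient and baseline risk below are hypothetical illustration values, not figures from ICRP 26/28:

```python
# Illustrative sketch of the linear, non-threshold (LNT) dose-response
# assumption: excess risk is taken to be proportional to dose, with no
# threshold. The coefficient and baseline below are hypothetical values
# chosen for illustration only; they are not ICRP figures.

RISK_PER_SV = 0.05       # hypothetical lifetime excess risk per sievert
BASELINE_RISK = 0.25     # hypothetical baseline lifetime cancer risk

def excess_risk(dose_sv: float) -> float:
    """Excess risk under LNT: proportional to dose, zero threshold."""
    return RISK_PER_SV * dose_sv

def fractional_increase(dose_sv: float) -> float:
    """Fractional increase in total risk relative to baseline."""
    return excess_risk(dose_sv) / BASELINE_RISK

for dose in (0.0, 0.01, 0.05):  # doses in Sv
    print(f"dose={dose:5.2f} Sv  excess={excess_risk(dose):.4f}  "
          f"fractional increase={fractional_increase(dose):.2%}")
```

Under LNT the fractional increase scales linearly with dose, which is what makes the "fractional increase in total risk near the dose limits" illustration in the abstract possible.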

  3. Do unreal assumptions pervert behaviour?

    DEFF Research Database (Denmark)

    Petersen, Verner C.

The purpose of this paper is to take a critical look at some of the assumptions and theories found in economics and to discuss their implications for the models and practices found in the management of business. It reviews some of the basic assumptions underlying the theories found in economics: assumptions relating to the primacy of self-interest; to resourceful, evaluative, maximising models of man; to incentive systems; and to agency theory. The major part of the paper then discusses how these assumptions and theories may pervert behaviour… The expectation is that the unrealistic assumptions of economics have become taken for granted and tacitly included in theories and models of management, guiding business and management to behave in a fashion that apparently makes these assumptions become "true", thus in fact making theories and models become self-fulfilling prophecies. The paper elucidates some…

  4. Testing the basic assumption of the hydrogeomorphic approach to assessing wetland functions.

    Science.gov (United States)

    Hruby, T

    2001-05-01

    The hydrogeomorphic (HGM) approach for developing "rapid" wetland function assessment methods stipulates that the variables used are to be scaled based on data collected at sites judged to be the best at performing the wetland functions (reference standard sites). A critical step in the process is to choose the least altered wetlands in a hydrogeomorphic subclass to use as a reference standard against which other wetlands are compared. The basic assumption made in this approach is that wetlands judged to have had the least human impact have the highest level of sustainable performance for all functions. The levels at which functions are performed in these least altered wetlands are assumed to be "characteristic" for the subclass and "sustainable." Results from data collected in wetlands in the lowlands of western Washington suggest that the assumption may not be appropriate for this region. Teams developing methods for assessing wetland functions did not find that the least altered wetlands in a subclass had a range of performance levels that could be identified as "characteristic" or "sustainable." Forty-four wetlands in four hydrogeomorphic subclasses (two depressional subclasses and two riverine subclasses) were rated by teams of experts on the severity of their human alterations and on the level of performance of 15 wetland functions. An ordinal scale of 1-5 was used to quantify alterations in water regime, soils, vegetation, buffers, and contributing basin. Performance of functions was judged on an ordinal scale of 1-7. Relatively unaltered wetlands were judged to perform individual functions at levels that spanned all of the seven possible ratings in all four subclasses. The basic assumption of the HGM approach, that the least altered wetlands represent "characteristic" and "sustainable" levels of functioning that are different from those found in altered wetlands, was not confirmed. 
Although the intent of the HGM approach is to use level of functioning as a…

  5. Basic assumptions and definitions in the analysis of financial leverage

    Directory of Open Access Journals (Sweden)

    Tomasz Berent

    2015-12-01

Full Text Available The financial leverage literature has been in a state of terminological chaos for decades, as evidenced, for example, by the Nobel Prize Lecture mistake on the one hand and the global financial crisis on the other. A meaningful analysis of the leverage phenomenon calls for the formulation of a coherent set of assumptions and basic definitions. The objective of the paper is to answer this call. The paper defines leverage as a value-neutral concept useful in explaining the magnification effect exerted by financial activity upon the whole spectrum of financial results. By adopting constructivism as a methodological approach, we are able to introduce various types of leverage, such as capital and income, base and non-base, accounting and market value, for levels and for distances (absolute and relative), costs and simple, etc. The new definitions formulated here are subsequently adopted in the analysis of the content of leverage statements used by a leading finance textbook.
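The magnification effect that the abstract attributes to financial leverage can be sketched with the standard return-on-equity decomposition. The figures below are made up for illustration and are not taken from the paper:

```python
# Hypothetical sketch of the magnification ("leverage") effect discussed
# above: with debt-to-equity ratio D/E and debt cost r_d, the return on
# equity is ROE = ROA + (ROA - r_d) * D/E, so leverage amplifies results
# in both directions. All figures are illustrative.

def roe(roa: float, r_d: float, d_over_e: float) -> float:
    """Return on equity under the standard leverage decomposition."""
    return roa + (roa - r_d) * d_over_e

# A good year: ROA above the cost of debt, so leverage helps.
print(roe(roa=0.10, r_d=0.06, d_over_e=1.0))   # ~0.14 vs unlevered 0.10
# A bad year: ROA below the cost of debt, so the same leverage hurts.
print(roe(roa=0.02, r_d=0.06, d_over_e=1.0))   # ~-0.02 vs unlevered 0.02
```

The same debt ratio widens outcomes symmetrically around the unlevered return, which is why the abstract insists on treating leverage as a value-neutral magnification concept rather than as inherently good or bad.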

  6. Basic Assumptions of the New Price System and Supplements to the Tariff System for Electricity Sale

    International Nuclear Information System (INIS)

    Klepo, M.

    1995-01-01

The article outlines some basic assumptions of the new price system and major elements of the latest proposition for the changes and supplements to the Tariff System for Electricity Sale in the Republic of Croatia, including an analysis of those elements which brought about the present unfavourable and non-productive relations within the electric power system. The paper proposes measures and actions which should, by means of the price system and tariff policy, improve the present unfavourable relations and their consequences, and achieve a desirable consumption structure and characteristics, resulting in rational management and effective power supply-economy relationships within the electric power system as a subsystem of the power supply sector. (author). 2 refs., 3 figs., 4 tabs

  7. E-Basics: Online Basic Training in Program Evaluation

    Science.gov (United States)

    Silliman, Ben

    2016-01-01

    E-Basics is an online training in program evaluation concepts and skills designed for youth development professionals, especially those working in nonformal science education. Ten hours of online training in seven modules is designed to prepare participants for mentoring and applied practice, mastery, and/or team leadership in program evaluation.…

  8. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…
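The "sampling assumption" at the heart of the abstract can be illustrated with the classic strong-versus-weak sampling contrast in Bayesian generalization. The hypothesis sets and prior below are toy values invented for illustration:

```python
# Toy illustration of the sampling assumption in inductive generalization:
# under *strong* sampling, examples are drawn uniformly from the true
# hypothesis, so smaller hypotheses assign higher likelihood (the "size
# principle"); under *weak* sampling, any consistent hypothesis gets
# likelihood 1 and the data cannot discriminate. Hypotheses are made up.

def posterior(data, hypotheses, priors, strong=True):
    """Normalized posterior over hypotheses given observed examples."""
    post = {}
    for (name, members), prior in zip(hypotheses, priors):
        if all(x in members for x in data):
            lik = (1 / len(members)) ** len(data) if strong else 1.0
        else:
            lik = 0.0  # hypothesis inconsistent with the data
        post[name] = prior * lik
    z = sum(post.values())
    return {k: v / z for k, v in post.items()}

hyps = [("small", {2, 4, 6}), ("large", set(range(1, 11)))]
priors = [0.5, 0.5]
data = [2, 4]  # both consistent with either hypothesis

post_strong = posterior(data, hyps, priors, strong=True)
post_weak = posterior(data, hyps, priors, strong=False)
print(post_strong)  # strongly favors the "small" hypothesis
print(post_weak)    # identical to the prior: 0.5 / 0.5
```

The two assumptions lead to sharply different inductive leaps from the very same data, which is the point the abstract makes about generalization depending on how learners believe the data were generated.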

  9. Measuring Productivity Change without Neoclassical Assumptions: A Conceptual Analysis

    NARCIS (Netherlands)

    B.M. Balk (Bert)

    2008-01-01

The measurement of productivity change (or difference) is usually based on models that make use of strong assumptions such as competitive behaviour and constant returns to scale. This survey discusses the basics of productivity measurement and shows that one can dispense with most if not…

  10. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are used and documented in hundreds of peer-reviewed publications each year, and likely more are applied in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined the impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus improved data and enhanced assumptions, on model outcomes and thus, ultimately, study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.

  11. Multiverse Assumptions and Philosophy

    Directory of Open Access Journals (Sweden)

    James R. Johnson

    2018-02-01

Full Text Available Multiverses are predictions based on theories. Focusing on each theory's assumptions is key to evaluating a proposed multiverse. Although accepted theories of particle physics and cosmology contain non-intuitive features, multiverse theories entertain a host of "strange" assumptions classified as metaphysical (outside objective experience, concerned with the fundamental nature of reality, ideas that cannot be proven right or wrong): topics such as infinity, duplicate yous, hypothetical fields, more than three space dimensions, Hilbert space, advanced civilizations, and reality established by mathematical relationships. It is easy to confuse multiverse proposals because many divergent models exist. This overview defines the characteristics of eleven popular multiverse proposals. The characteristics compared are: initial conditions, values of constants, laws of nature, number of space dimensions, number of universes, and fine-tuning explanations. Future scientific experiments may validate selected assumptions; but until they do, proposals by philosophers may be as valid as theoretical scientific theories.

  12. Linear regression and the normality assumption.

    Science.gov (United States)

    Schmidt, Amand F; Finan, Chris

    2017-12-16

Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings such transformations are often unnecessary and, worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage, i.e., the proportion of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10) violations of this normality assumption often do not noticeably impact results. Contrary to this, assumptions on the parametric model, absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and, worse, may bias estimates due to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
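A minimal simulation in the spirit of the abstract (not the authors' code) shows the point about point estimates: with a large sample, the OLS slope remains essentially unbiased even when the errors are strongly non-normal. The data-generating values below are arbitrary illustration choices:

```python
# Minimal sketch (not the authors' simulation) of the abstract's claim:
# with a large sample, the OLS slope estimate is essentially unbiased
# even under strongly skewed, non-normal errors (here: centered
# exponential errors). All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n, true_slope, true_intercept = 10_000, 2.0, 1.0

x = rng.uniform(0, 10, n)
errors = rng.exponential(scale=1.0, size=n) - 1.0  # skewed, mean zero
y = true_intercept + true_slope * x + errors

slope_hat, intercept_hat = np.polyfit(x, y, 1)
print(f"slope estimate: {slope_hat:.3f} (true {true_slope})")
```

The slope estimate lands very close to the true value despite the skewed errors; what non-normality can distort, per the abstract, is standard errors and coverage in small samples, not the point estimates themselves.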

  13. Determining Bounds on Assumption Errors in Operational Analysis

    Directory of Open Access Journals (Sweden)

    Neal M. Bengtson

    2014-01-01

Full Text Available The technique of operational analysis (OA) is used in the study of systems performance, mainly for estimating mean values of various measures of interest, such as the number of jobs at a device and response times. The basic principles of operational analysis allow errors in assumptions to be quantified over a time period. The assumptions which are used to derive the operational analysis relationships are studied. Using Karush-Kuhn-Tucker (KKT) conditions, bounds on error measures of these OA relationships are found. Examples of these bounds are used for representative performance measures to show limits on the difference between true performance values and those estimated by operational analysis relationships. A technique for finding tolerance limits on the bounds is demonstrated with a simulation example.
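The operational-analysis relationships the abstract refers to are computed from directly observable quantities over an observation period. The measurements below are hypothetical, but the laws themselves are the standard OA identities:

```python
# Sketch of the basic operational-analysis relationships mentioned above,
# computed from directly observable quantities over one observation
# period. The measurements below are hypothetical illustration values.

T = 60.0      # observation period (s)
C = 120       # jobs completed at the device
B = 36.0      # time the device was busy (s)
W = 300.0     # accumulated job-seconds in the system (area under N(t))

X = C / T     # throughput (jobs/s)
U = B / T     # utilization
S = B / C     # mean service time per job
N = W / T     # mean number of jobs in the system
R = W / C     # mean response time

print(f"X={X}, U={U}, S={S}, N={N}, R={R}")
assert abs(U - X * S) < 1e-12   # utilization law: U = X * S
assert abs(N - X * R) < 1e-12   # Little's law:    N = X * R
```

Because these identities hold exactly for measured data, the errors the paper bounds come from the *assumptions* (e.g., job-flow balance, homogeneity) used to turn the identities into predictive relationships, not from the identities themselves.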

  14. Natural and laboratory OSL growth curve–Verification of the basic assumption of luminescence dating

    International Nuclear Information System (INIS)

    Kijek, N.; Chruścińska, A.

    2016-01-01

The basic assumption of luminescence dating is the equality between the growth curve of OSL generated by natural radiation and the OSL growth curve reconstructed in laboratory conditions. The dose rates that generate the OSL in nature and in laboratory experiments differ by about ten orders of magnitude. Recently some discrepancies between the natural and laboratory growth curves have been observed. It is important to establish their reasons in order to introduce an appropriate correction into the OSL dating protocol or to find a test that allows the elimination of samples which should not be used for dating. For this purpose, both growth curves, natural and laboratory, were reconstructed by means of computer simulations of the processes occurring in the sample during its deposition time in the environment as well as those which occur in a laboratory during the dating procedure. The simulations were carried out for three models including one shallow trap, two OSL traps, one disconnected deep trap, and one luminescence center. The OSL model for quartz can be more complex than the one used in the presented simulations, but the results nevertheless show effects of growth-curve discrepancy similar to those observed in experiments. It is clear that the consistency of growth curves is not a general feature of the OSL processes, but rather a result of an advantageous configuration of trap parameters. The deep disconnected traps play the key role, and their complete filling before the zeroing of the OSL signal is a necessary condition for the consistency of the growth curves. - Highlights: • The process of OSL growth curve generation in nature and in the laboratory was simulated. • Discrepancies between the natural and the laboratory growth curves are observed. • Deep disconnected traps play the key role in growth curve inequality. • Empty deep traps before zeroing of OSL cause the inequality of growth curves.

  15. A PSA study for the SMART basic design

    International Nuclear Information System (INIS)

    Han, Sang Hoon; Kim, H. C.; Yang, S. H.; Lee, D. J.

    2002-03-01

SMART (System-Integrated Modular Advanced Reactor) is an advanced integral-type small and medium category nuclear power reactor under development, with a rated thermal power of 330 MW. A Probabilistic Safety Analysis (PSA) for the SMART basic design has been performed to evaluate the safety and optimize the design. Because the basic design is complete but the detailed design is not yet available for the SMART, we made several assumptions about the system design before performing the PSA. The scope of the PSA was limited to the Level-1 internal full-power PSA. The Level-2 and Level-3 PSA, the external PSA, and the low power/shutdown PSA will be performed in the final design stage.

  16. Evaluating The Markov Assumption For Web Usage Mining

    DEFF Research Database (Denmark)

    Jespersen, S.; Pedersen, Torben Bach; Thorhauge, J.

    2003-01-01

    ) model~\\cite{borges99data}. These techniques typically rely on the \\textit{Markov assumption with history depth} $n$, i.e., it is assumed that the next requested page is only dependent on the last $n$ pages visited. This is not always valid, i.e. false browsing patterns may be discovered. However, to our...

  17. The Immoral Assumption Effect: Moralization Drives Negative Trait Attributions.

    Science.gov (United States)

    Meindl, Peter; Johnson, Kate M; Graham, Jesse

    2016-04-01

Jumping to negative conclusions about other people's traits is judged as morally bad by many people. Despite this, across six experiments (total N = 2,151), we find that multiple types of moral evaluations (even evaluations related to open-mindedness, tolerance, and compassion) play a causal role in these potentially pernicious trait assumptions. Our results also indicate that moralization affects negative, but not positive, trait assumptions, and that the effect of morality on negative assumptions cannot be explained merely by people's general (nonmoral) preferences or other factors that distinguish moral and nonmoral traits, such as controllability or desirability. Together, these results suggest that one of the more destructive human tendencies, making negative assumptions about others, can be caused by the better angels of our nature. © 2016 by the Society for Personality and Social Psychology, Inc.

  18. A quantitative evaluation of a qualitative risk assessment framework: Examining the assumptions and predictions of the Productivity Susceptibility Analysis (PSA)

    Science.gov (United States)

    2018-01-01

Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluate the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869

  19. Evaluating methodological assumptions of a catch-curve survival estimation of unmarked precocial shorebird chicks

    Science.gov (United States)

    McGowan, Conor P.; Gardner, Beth

    2013-01-01

Estimating productivity for precocial species can be difficult because young birds leave their nest within hours or days of hatching and detectability thereafter can be very low. Recently, a method using a modified catch-curve to estimate precocial chick daily survival from age-based count data was presented, using Piping Plover (Charadrius melodus) data from the Missouri River. However, many of the assumptions of the catch-curve approach were not fully evaluated for precocial chicks. We developed a simulation model to mimic Piping Plovers, a fairly representative shorebird, and age-based count-data collection. Using the simulated data, we calculated daily survival estimates and compared them with the known daily survival rates from the simulation model. We conducted these comparisons under different sampling scenarios where the ecological and statistical assumptions had been violated. Overall, the daily survival estimates calculated from the simulated data corresponded well with the true survival rates of the simulation. Violating the accurate-aging and independence assumptions did not result in biased daily survival estimates, whereas unequal detection for younger or older birds and violating the birth-death equilibrium did result in estimator bias. Ensuring that all ages are equally detectable and timing data collection to approximately meet the birth-death equilibrium are key to the successful use of this method for precocial shorebirds.
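The catch-curve logic the abstract evaluates can be sketched in a few lines: under constant daily survival and the birth-death-equilibrium assumption, expected counts decline geometrically with age, so log counts are linear in age with slope ln(s). The counts below are idealized (noise-free) illustration values, not the authors' data:

```python
# Sketch of the catch-curve idea described above: under constant daily
# survival s and birth-death equilibrium, expected age-based counts
# decline geometrically with age, so log(count) is linear in age with
# slope ln(s). Counts are idealized (noise-free) for illustration.
import math

s_true = 0.8                                  # true daily survival
ages = list(range(6))                         # chick ages in days
counts = [1000 * s_true ** a for a in ages]   # expected counts per age

# Least-squares slope of log(count) on age recovers ln(s).
ys = [math.log(c) for c in counts]
n = len(ages)
xbar, ybar = sum(ages) / n, sum(ys) / n
num = sum((x - xbar) * (y - ybar) for x, y in zip(ages, ys))
den = sum((x - xbar) ** 2 for x in ages)
s_hat = math.exp(num / den)
print(f"estimated daily survival: {s_hat:.3f}")
```

The estimator's bias findings in the abstract follow directly from this structure: age-dependent detection or departures from birth-death equilibrium tilt the log-count line, biasing the recovered slope and hence the survival estimate.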

  20. What's Love Got to Do with It? Rethinking Common Sense Assumptions

    Science.gov (United States)

    Trachman, Matthew; Bluestone, Cheryl

    2005-01-01

    One of the most basic tasks in introductory social science classes is to get students to reexamine their common sense assumptions concerning human behavior. This article introduces a shared assignment developed for a learning community that paired an introductory sociology and psychology class. The assignment challenges students to rethink the…

  1. Evaluating Basic Technology Instruction in Nigerian Secondary ...

    African Journals Online (AJOL)

It is an important technique which, when appropriately adopted, results in effective teaching and learning of practical subjects. This study focused on the identification of evaluation techniques aimed at improving the teaching of Basic Technology in Edo State. The area of study comprises the eighteen Local Government Areas ...

  2. Teaching and Learning Science in the 21st Century: Challenging Critical Assumptions in Post-Secondary Science

    Directory of Open Access Journals (Sweden)

    Amanda L. Glaze

    2018-01-01

    Full Text Available It is widely agreed upon that the goal of science education is building a scientifically literate society. Although there are a range of definitions for science literacy, most involve an ability to problem solve, make evidence-based decisions, and evaluate information in a manner that is logical. Unfortunately, science literacy appears to be an area where we struggle across levels of study, including with students who are majoring in the sciences in university settings. One reason for this problem is that we have opted to continue to approach teaching science in a way that fails to consider the critical assumptions that faculties in the sciences bring into the classroom. These assumptions include expectations of what students should know before entering given courses, whose responsibility it is to ensure that students entering courses understand basic scientific concepts, the roles of researchers and teachers, and approaches to teaching at the university level. Acknowledging these assumptions and the potential for action to shift our teaching and thinking about post-secondary education represents a transformative area in science literacy and preparation for the future of science as a field.

  3. Occupancy estimation and the closure assumption

    Science.gov (United States)

    Rota, Christopher T.; Fletcher, Robert J.; Dorazio, Robert M.; Betts, Matthew G.

    2009-01-01

    1. Recent advances in occupancy estimation that adjust for imperfect detection have provided substantial improvements over traditional approaches and are receiving considerable use in applied ecology. To estimate and adjust for detectability, occupancy modelling requires multiple surveys at a site and requires the assumption of 'closure' between surveys, i.e. no changes in occupancy between surveys. Violations of this assumption could bias parameter estimates; however, little work has assessed model sensitivity to violations of this assumption or how commonly such violations occur in nature. 2. We apply a modelling procedure that can test for closure to two avian point-count data sets in Montana and New Hampshire, USA, that exemplify time-scales at which closure is often assumed. These data sets illustrate different sampling designs that allow testing for closure but are currently rarely employed in field investigations. Using a simulation study, we then evaluate the sensitivity of parameter estimates to changes in site occupancy and evaluate a power analysis developed for sampling designs that is aimed at limiting the likelihood of closure. 3. Application of our approach to point-count data indicates that habitats may frequently be open to changes in site occupancy at time-scales typical of many occupancy investigations, with 71% and 100% of species investigated in Montana and New Hampshire respectively, showing violation of closure across time periods of 3 weeks and 8 days respectively. 4. Simulations suggest that models assuming closure are sensitive to changes in occupancy. Power analyses further suggest that the modelling procedure we apply can effectively test for closure. 5. Synthesis and applications. 
Our demonstration that sites may be open to changes in site occupancy over time-scales typical of many occupancy investigations, combined with the sensitivity of models to violations of the closure assumption, highlights the importance of properly addressing…
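The arithmetic behind the detection adjustment that the abstract builds on can be sketched directly, assuming closure between the K surveys. The occupancy and detection values below are hypothetical:

```python
# Sketch of why occupancy models need repeat surveys plus the closure
# assumption: with occupancy psi and per-survey detection p, a site that
# is closed between K surveys is detected at least once with probability
# p* = 1 - (1 - p)**K, so the naive "fraction of sites with any
# detection" estimates psi * p*, not psi. Values are hypothetical.

psi, p, K = 0.6, 0.5, 3

p_star = 1 - (1 - p) ** K   # P(detected at least once | occupied, closed)
naive = psi * p_star        # expected value of the naive estimator
adjusted = naive / p_star   # detection-adjusted estimate recovers psi

print(f"p* = {p_star}, naive = {naive}, adjusted = {adjusted}")
```

The adjustment only recovers psi because closure lets the K surveys jointly estimate p; if occupancy changes between surveys, as the abstract reports for many real time-scales, p* no longer has this form and the estimates are biased.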

  4. Evaluative Conditioning 2.0: Direct versus Associative Transfer of Affect to Brands

    NARCIS (Netherlands)

    S.T.L.R. Sweldens (Steven)

    2009-01-01

A basic assumption in advertising is that brands become better liked after being presented in positive contexts. This assumption is warranted because studies on 'evaluative conditioning' have demonstrated that when a brand is repeatedly presented together with positive affective…

  5. Designing an evaluation framework for WFME basic standards for medical education.

    Science.gov (United States)

    Tackett, Sean; Grant, Janet; Mmari, Kristin

    2016-01-01

    To create an evaluation plan for the World Federation for Medical Education (WFME) accreditation standards for basic medical education. We conceptualized the 100 basic standards from "Basic Medical Education: WFME Global Standards for Quality Improvement: The 2012 Revision" as medical education program objectives. Standards were simplified into evaluable items, which were then categorized as inputs, processes, outputs and/or outcomes to generate a logic model and corresponding plan for data collection. WFME standards posed significant challenges to evaluation due to complex wording, inconsistent formatting and lack of existing assessment tools. Our resulting logic model contained 244 items. Standard B 5.1.1 separated into 24 items, the most for any single standard. A large proportion of items (40%) required evaluation of more than one input, process, output and/or outcome. Only one standard (B 3.2.2) was interpreted as requiring evaluation of a program outcome. Current WFME standards are difficult to use for evaluation planning. Our analysis may guide adaptation and revision of standards to make them more evaluable. Our logic model and data collection plan may be useful to medical schools planning an institutional self-review and to accrediting authorities wanting to provide guidance to schools under their purview.

  6. Assumptions used for evaluating the potential radiological consequences of a loss of coolant accident for pressurized water reactors - June 1974

    International Nuclear Information System (INIS)

    Anon.

    1974-01-01

    Section 50.34 of 10 CFR Part 50 requires that each applicant for a construction permit or operating license provide an analysis and evaluation of the design and performance of structures, systems, and components of the facility with the objective of assessing the risk to public health and safety resulting from operation of the facility. The design basis loss of coolant accident is one of the postulated accidents used to evaluate the adequacy of these structures, systems, and components with respect to the public health and safety. This guide gives acceptable assumptions that may be used in evaluating the radiological consequences of this accident for a pressurized water reactor. In some cases, unusual site characteristics, plant design features, or other factors may require different assumptions which will be considered on an individual case basis. The Advisory Committee on Reactor Safeguards has been consulted concerning this guide and has concurred in the regulatory position

  7. Assumptions used for evaluating the potential radiological consequences of a loss of coolant accident for boiling water reactors - June 1974

    International Nuclear Information System (INIS)

    Anon.

    1974-01-01

Section 50.34 of 10 CFR Part 50 requires that each applicant for a construction permit or operating license provide an analysis and evaluation of the design and performance of structures, systems, and components of the facility with the objective of assessing the risk to public health and safety resulting from operation of the facility. The design basis loss of coolant accident is one of the postulated accidents used to evaluate the adequacy of these structures, systems, and components with respect to the public health and safety. This guide gives acceptable assumptions that may be used in evaluating the radiological consequences of this accident for a boiling water reactor. In some cases, unusual site characteristics, plant design features, or other factors may require different assumptions which will be considered on an individual case basis. The Advisory Committee on Reactor Safeguards has been consulted concerning this guide and has concurred in the regulatory position

  8. A basic evaluated neutronic data file for elemental scandium

    International Nuclear Information System (INIS)

    Smith, A.B.; Meadows, J.W.; Howerton, R.J.

    1992-01-01

    This report documents an evaluated neutronic data file for elemental scandium, presented in the ENDF/B-VI format. This file should provide basic nuclear data essential for neutronic calculations involving elemental scandium. No equivalent file was previously available

  9. Comparative Interpretation of Classical and Keynesian Fiscal Policies (Assumptions, Principles and Primary Opinions)

    Directory of Open Access Journals (Sweden)

    Engin Oner

    2015-06-01

    In the Classical School, of which Adam Smith was the founder, which gives prominence to supply and adopts an approach of unbiased finance, the economy is always in a state of full employment equilibrium. In this system of thought, whose main philosophy is budget balance, which asserts that there is flexibility between prices and wages and regards public debt as an extraordinary instrument, interference of the state in economic and social life is frowned upon. In line with the views of classical thought, classical fiscal policy is based on three basic assumptions: the "Consumer State" assumption, the assumption that "public expenditures are always ineffectual", and the assumption concerning the "impartiality of the taxes and expenditure policies implemented by the state". The Keynesian School, founded by John Maynard Keynes, on the other hand gives prominence to demand, adopts the approach of functional finance, and asserts that underemployment and over-employment equilibria exist in the economy alongside the full employment equilibrium, that problems cannot be solved through the invisible hand, that prices and wages are rigid, that the interference of the state is essential, and that at this point fiscal policies have to be utilized effectively. Keynesian fiscal policy depends on three primary assumptions: the assumption of the "Filter State", the assumption that "public expenditures are sometimes effective and sometimes ineffective or neutral", and the assumption that "the tax, debt and expenditure policies of the state can never be impartial".

  10. Causal Mediation Analysis: Warning! Assumptions Ahead

    Science.gov (United States)

    Keele, Luke

    2015-01-01

    In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…

  11. Shattering world assumptions: A prospective view of the impact of adverse events on world assumptions.

    Science.gov (United States)

    Schuler, Eric R; Boals, Adriel

    2016-05-01

    Shattered Assumptions theory (Janoff-Bulman, 1992) posits that experiencing a traumatic event has the potential to diminish the degree of optimism in the assumptions of the world (assumptive world), which could lead to the development of posttraumatic stress disorder. Prior research assessed the assumptive world with a measure that was recently reported to have poor psychometric properties (Kaler et al., 2008). The current study had 3 aims: (a) to assess the psychometric properties of a recently developed measure of the assumptive world, (b) to retrospectively examine how prior adverse events affected the optimism of the assumptive world, and (c) to measure the impact of an intervening adverse event. An 8-week prospective design with a college sample (N = 882 at Time 1 and N = 511 at Time 2) was used to assess the study objectives. We split adverse events into those that were objectively or subjectively traumatic in nature. The new measure exhibited adequate psychometric properties. The report of a prior objective or subjective trauma at Time 1 was related to a less optimistic assumptive world. Furthermore, participants who experienced an intervening objectively traumatic event evidenced a decrease in optimistic views of the world compared with those who did not experience an intervening adverse event. We found support for Shattered Assumptions theory retrospectively and prospectively using a reliable measure of the assumptive world. We discuss future assessments of the measure of the assumptive world and clinical implications to help rebuild the assumptive world with current therapies. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Validity of the mockwitness paradigm: testing the assumptions.

    Science.gov (United States)

    McQuiston, Dawn E; Malpass, Roy S

    2002-08-01

    Mockwitness identifications are used to provide a quantitative measure of lineup fairness. Some theoretical and practical assumptions of this paradigm have not been studied in terms of mockwitnesses' decision processes and procedural variation (e.g., instructions, lineup presentation method), and the current experiment was conducted to empirically evaluate these assumptions. Four hundred and eighty mockwitnesses were given physical information about a culprit, received 1 of 4 variations of lineup instructions, and were asked to identify the culprit from either a fair or unfair sequential lineup containing 1 of 2 targets. Lineup bias estimates varied as a result of lineup fairness and the target presented. Mockwitnesses generally reported that the target's physical description was their main source of identifying information. Our findings support the use of mockwitness identifications as a useful technique for sequential lineup evaluation, but only for mockwitnesses who selected only 1 lineup member. Recommendations for the use of this evaluation procedure are discussed.

  13. Adult Learning Assumptions

    Science.gov (United States)

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to examine Knowles' theory of andragogy and his six assumptions of how adults learn while providing evidence to support two of his assumptions based on the theory of andragogy. As no single theory explains how adults learn, it can best be assumed that adults learn through the accumulation of formal and informal…

  14. Economic assumptions for evaluating reactor-related options for managing plutonium

    International Nuclear Information System (INIS)

    Rothwell, G.

    1996-01-01

    This paper discusses the economic assumptions in the U.S. National Academy of Sciences' report, Management and Disposition of Excess Weapons Plutonium: Reactor-Related Options (1995). It reviews the Net Present Value approach for discounting and comparing the costs and benefits of reactor-related options. It argues that because risks associated with the returns to plutonium management are unlikely to be constant over time, it is preferable to use a real risk-free rate to discount cash flows and explicitly describe the probability distributions for costs and benefits, allowing decision makers to determine the risk premium of each option. As a baseline for comparison, it assumes that one economic benefit of changing the current plutonium management system is a reduction in on-going Surveillance and Maintenance (S and M) costs. This reduction in the present value of S and M costs can be compared with the discounted costs of each option. These costs include direct construction costs, indirect costs, operating costs minus revenues, and decontamination and decommissioning expenses. The paper also discusses how to conduct an uncertainty analysis. It finishes by summarizing conclusions and recommendations and discusses how these recommendations might apply to the evaluation of Russian plutonium management options. (author)
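    The Net Present Value comparison described above can be sketched in a few lines. This is an illustration only: the cash flows and the 3% real risk-free rate below are hypothetical numbers chosen for the example, not figures from the report.

```python
# Hypothetical sketch of the NPV comparison: discount each option's costs at a
# real risk-free rate and compare against the present value of the avoided
# Surveillance and Maintenance (S&M) costs. All figures are illustrative.

def present_value(cash_flows, rate):
    """Discount a list of annual cash flows (year 0 first) to present value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

real_risk_free_rate = 0.03  # illustrative real risk-free discount rate

# Option costs: up-front construction, then operating costs minus revenues ($M/year)
option_costs = [400.0, 50.0, 50.0, 50.0, 50.0]
# S&M costs no longer incurred under the option ($M/year)
avoided_sm_costs = [0.0, 200.0, 200.0, 200.0, 200.0]

npv_cost = present_value(option_costs, real_risk_free_rate)
npv_benefit = present_value(avoided_sm_costs, real_risk_free_rate)
net_benefit = npv_benefit - npv_cost
print(f"NPV of costs: {npv_cost:.1f}  NPV of avoided S&M: {npv_benefit:.1f}  net: {net_benefit:.1f}")
```

    Following the paper's argument, the uncertainty in each option would then be expressed as probability distributions over these cash flows rather than folded into the discount rate.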

  15. Basic conceptions for reactor pressure vessel manipulators and their evaluation

    International Nuclear Information System (INIS)

    Popp, P.

    1987-01-01

    The study deals with application fields and basic design conceptions of manipulators in reactor pressure vessels, as well as their evaluation. It is shown that manipulators supported at the reactor flange have essential advantages.

  16. CRITIQUES TOWARDS COSO’S ENTERPRISE RISK MANAGEMENT (ERM) FRAMEWORK IN ITS BASIC ASSUMPTIONS

    OpenAIRE

    Kurniawanti, Ika Atma

    2010-01-01

    Most professionals in internal control, risk management and other similar bailiwicks have agreed that Enterprise Risk Management discourses invariably refer to what COSO has produced recently: the framework underlying ERM. But this paper takes a somewhat different stance, viewing several problematic issues as stemming from unclear conceptions of either the basic premise underlying ERM or the nature of some of ERM's components outlined by COSO. This paper notes that, at least, there are three poi...

  17. Evaluating growth assumptions using diameter or radial increments in natural even-aged longleaf pine

    Science.gov (United States)

    John C. Gilbert; Ralph S. Meldahl; Jyoti N. Rayamajhi; John S. Kush

    2010-01-01

    When using increment cores to predict future growth, one often assumes future growth is identical to past growth for individual trees. Once this assumption is accepted, a decision has to be made between which growth estimate should be used, constant diameter growth or constant basal area growth. Often, the assumption of constant diameter growth is used due to the ease...

  18. [Basic principles and methodological considerations of health economic evaluations].

    Science.gov (United States)

    Loza, Cesar; Castillo-Portilla, Manuel; Rojas, José Luis; Huayanay, Leandro

    2011-01-01

    Health Economics is an essential instrument for health management, and economic evaluations can be considered as tools assisting the decision-making process for the allocation of resources in health. Currently, economic evaluations are increasingly being used worldwide, thus encouraging evidence-based decision-making and seeking efficient and rational alternatives within the framework of health services activities. In this review, we present an overview and define the basic types of economic evaluations, with emphasis on complete Economic Evaluations (EE). In addition, we review key concepts regarding the perspectives from which EE can be conducted, the types of costs that can be considered, the time horizon, discounting, assessment of uncertainty and decision rules. Finally, we describe concepts about the extrapolation and spread of economic evaluations in health.

  19. Primary prevention in public health: an analysis of basic assumptions.

    Science.gov (United States)

    Ratcliffe, J; Wallack, L

    1985-01-01

    The common definition of primary prevention is straightforward; but how it is transformed into a framework to guide action is based on personal and societal feelings and beliefs about the basis for social organization. This article focuses on the two contending primary prevention strategies of health promotion and health protection. The contention between the two strategies stems from a basic disagreement about disease causality in modern society. Health promotion is based on the "lifestyle" theory of disease causality, which sees individual health status linked ultimately to personal decisions about diet, stress, and drug habits. Primary prevention, from this perspective, entails persuading individuals to forgo their risk-taking, self-destructive behavior. Health protection, on the other hand, is based on the "social-structural" theory of disease causality. This theory sees the health status of populations linked ultimately to the unequal distribution of social resources, industrial pollution, occupational stress, and "anti-health promotion" marketing practices. Primary prevention, from this perspective, requires changing existing social and, particularly, economic policies and structures. In order to provide a basis for choosing between these contending strategies, the demonstrated (i.e., past) impact of each strategy on the health of the public is examined. Two conclusions are drawn. First, the health promotion strategy shows little potential for improving the public health, because it systematically ignores the risk-imposing, other-destructive behavior of influential actors (policy-makers and institutions) in society. And second, effective primary prevention efforts entail an "upstream" approach that results in far-reaching sociopolitical and economic change.

  20. The assumption of linearity in soil and plant concentration ratios: an experimental evaluation

    International Nuclear Information System (INIS)

    Sheppard, S.C.; Evenden, W.G.

    1988-01-01

    We have evaluated one of the main assumptions in the use of concentration ratios to describe the transfer of elements in the environment. The ratios examined in detail were the 'concentration ratio' (CR) of leaf to soil and the 'partition coefficient' (K_d) of solid- to liquid-phase concentrations in soil. Use of these ratios implies a linear relationship between the concentrations. Soil was experimentally contaminated to evaluate this linearity over more than a 1000-fold range in concentration. A secondary objective was to determine CR and K_d values in a long-term (2 y) outdoor study using a peat soil and blueberries. The elements I, Se, Cs, Pb and U were chosen as environmentally important elements. The results indicated that leaf and leachate concentrations were not consistently linearly related to the total soil concentrations for each of the elements. The modelling difficulties implied by these concentration dependencies can be partially offset by including the strong negative correlation between CR and K_d. The error introduced by using a mean value of the ratios for Se or U resulted in up to a ten-fold increase in variability for CR and a three-fold increase for K_d. (author)
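    The linearity assumption tested above can be illustrated with a small sketch (mine, not the paper's analysis): if leaf concentration is proportional to soil concentration, the slope of log(leaf) versus log(soil) is 1 and the concentration ratio CR is constant; a slope below 1 mimics the concentration dependence the authors report. The data below are made up.

```python
import math

def loglog_slope(x, y):
    """Least-squares slope of log(y) on log(x)."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

soil = [1.0, 10.0, 100.0, 1000.0]         # total soil concentration (arbitrary units)
leaf = [c ** 0.8 for c in soil]           # hypothetical sub-linear uptake
cr = [l / s for l, s in zip(leaf, soil)]  # concentration ratio at each level

slope = loglog_slope(soil, leaf)
print(f"log-log slope = {slope:.2f}; CR falls from {cr[0]:.2f} to {cr[-1]:.3f}")
```

    A constant-ratio model fitted to such data would over- or under-predict at the ends of the concentration range, which is the modelling difficulty the abstract describes.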

  1. Ecological risk of anthropogenic pollutants to reptiles: Evaluating assumptions of sensitivity and exposure

    International Nuclear Information System (INIS)

    Weir, Scott M.; Suski, Jamie G.; Salice, Christopher J.

    2010-01-01

    A large data gap for reptile ecotoxicology still persists; therefore, ecological risk assessments of reptiles usually incorporate the use of surrogate species. This necessitates that (1) the surrogate is at least as sensitive as the target taxon and/or (2) exposures to the surrogate are greater than that of the target taxon. We evaluated these assumptions for the use of birds as surrogates for reptiles. Based on a survey of the literature, birds were more sensitive than reptiles in less than 1/4 of the chemicals investigated. Dietary and dermal exposure modeling indicated that exposure to reptiles was relatively high, particularly when the dermal route was considered. We conclude that caution is warranted in the use of avian receptors as surrogates for reptiles in ecological risk assessment and emphasize the need to better understand the magnitude and mechanism of contaminant exposure in reptiles to improve exposure and risk estimation. - Avian receptors are not universally appropriate surrogates for reptiles in ecological risk assessment.

  4. On testing the missing at random assumption

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    Most approaches to learning from incomplete data are based on the assumption that unobserved values are missing at random (mar). While the mar assumption, as such, is not testable, it can become testable in the context of other distributional assumptions, e.g. the naive Bayes assumption...

  5. Quasi-experimental study designs series-paper 7: assessing the assumptions.

    Science.gov (United States)

    Bärnighausen, Till; Oldenburg, Catherine; Tugwell, Peter; Bommer, Christian; Ebert, Cara; Barreto, Mauricio; Djimeu, Eric; Haber, Noah; Waddington, Hugh; Rockers, Peter; Sianesi, Barbara; Bor, Jacob; Fink, Günther; Valentine, Jeffrey; Tanner, Jeffrey; Stanley, Tom; Sierra, Eduardo; Tchetgen, Eric Tchetgen; Atun, Rifat; Vollmer, Sebastian

    2017-09-01

    Quasi-experimental designs are gaining popularity in epidemiology and health systems research-in particular for the evaluation of health care practice, programs, and policy-because they allow strong causal inferences without randomized controlled experiments. We describe the concepts underlying five important quasi-experimental designs: Instrumental Variables, Regression Discontinuity, Interrupted Time Series, Fixed Effects, and Difference-in-Differences designs. We illustrate each of the designs with an example from health research. We then describe the assumptions required for each of the designs to ensure valid causal inference and discuss the tests available to examine the assumptions. Copyright © 2017 Elsevier Inc. All rights reserved.
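    One of the five designs named above, Difference-in-Differences, reduces to a simple contrast of group means: the treated group's before/after change minus the control group's change, which nets out a common time trend under the parallel-trends assumption. A toy calculation (my illustration, with hypothetical numbers, not data from the paper):

```python
# Toy difference-in-differences estimate: subtract the control group's change
# from the treated group's change to remove a shared time trend.

def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """DiD effect: (treated change) - (control change)."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical mean outcomes before and after a policy change
effect = diff_in_diff(treated_pre=50.0, treated_post=62.0,
                      control_pre=48.0, control_post=53.0)
print(f"DiD estimate: {effect:.1f}")
```

    The assumption-checking step the paper emphasizes would here amount to verifying that the treated and control series moved in parallel before the policy took effect.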

  6. GPRA (Government Performance and Results Act) and research evaluation for basic science

    International Nuclear Information System (INIS)

    Takahashi, Shoji

    2002-08-01

    The purpose of the Government Performance and Results Act of 1993 (GPRA) is to require federal agencies to evaluate their program performance, especially from a cost-efficiency perspective, and to report to Congress. GPRA holds agencies accountable for their programs by requiring that they think strategically (in most cases every 5 years) and set, measure and report goals annually. Agencies responsible for advancing basic science, such as the Department of Energy (DOE) and the National Science Foundation (NSF), are not excluded on account of the difficulty of economic evaluation. In Japan, under 'the Rationalization program for the public corporations' of 2001, research and development corporations must make cost-performance evaluations in addition to conventional ones, and so face the same problem the US agencies struggle with. The purpose of this report is to draw lessons on this problem by surveying the GPRA reports of DOE and NSF and analyzing related information. At present, I have to conclude that although everybody accepts the necessity of socio-economic evaluations and investment criteria for basic research, studies and discussions about ways and means are still continuing, even in the US. (author)

  7. Assumption-versus data-based approaches to summarizing species' ranges.

    Science.gov (United States)

    Peterson, A Townsend; Navarro-Sigüenza, Adolfo G; Gordillo, Alejandro

    2018-06-01

    For conservation decision making, species' geographic distributions are mapped using various approaches. Some such efforts have downscaled versions of coarse-resolution extent-of-occurrence maps to fine resolutions for conservation planning. We examined the quality of the extent-of-occurrence maps as range summaries and the utility of refining those maps into fine-resolution distributional hypotheses. Extent-of-occurrence maps tend to be overly simple, omit many known and well-documented populations, and likely frequently include many areas not holding populations. Refinement steps involve typological assumptions about habitat preferences and elevational ranges of species, which can introduce substantial error in estimates of species' true areas of distribution. However, no model-evaluation steps are taken to assess the predictive ability of these models, so model inaccuracies are not noticed. Whereas range summaries derived by these methods may be useful in coarse-grained, global-extent studies, their continued use in on-the-ground conservation applications at fine spatial resolutions is not advisable in light of reliance on assumptions, lack of real spatial resolution, and lack of testing. In contrast, data-driven techniques that integrate primary data on biodiversity occurrence with remotely sensed data that summarize environmental dimensions (i.e., ecological niche modeling or species distribution modeling) offer data-driven solutions based on a minimum of assumptions that can be evaluated and validated quantitatively to offer a well-founded, widely accepted method for summarizing species' distributional patterns for conservation applications. © 2016 Society for Conservation Biology.

  8. Anti-Atheist Bias in the United States: Testing Two Critical Assumptions

    Directory of Open Access Journals (Sweden)

    Lawton K Swan

    2012-02-01

    Decades of opinion polling and empirical investigations have clearly demonstrated a pervasive anti-atheist prejudice in the United States. However, much of this scholarship relies on two critical and largely unaddressed assumptions: (a) that when people report negative attitudes toward atheists, they do so because they are reacting specifically to their lack of belief in God; and (b) that survey questions asking about attitudes toward atheists as a group yield reliable information about biases against individual atheist targets. To test these assumptions, an online survey asked a probability-based random sample of American adults (N = 618) to evaluate a fellow research participant (“Jordan”). Jordan garnered significantly more negative evaluations when identified as an atheist than when described as religious or when religiosity was not mentioned. This effect did not differ as a function of labeling (“atheist” versus “no belief in God”) or the amount of individuating information provided about Jordan. These data suggest that both assumptions are tenable: nonbelief—rather than extraneous connotations of the word “atheist”—seems to underlie the effect, and participants exhibited a marked bias even when confronted with an otherwise attractive individual.

  9. Assumptions behind size-based ecosystem models are realistic

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Blanchard, Julia L.; Fulton, Elizabeth A.

    2016-01-01

    A recent publication about balanced harvesting (Froese et al., ICES Journal of Marine Science; doi:10.1093/icesjms/fsv122) contains several erroneous statements about size-spectrum models. We refute the statements by showing that the assumptions pertaining to size-spectrum models discussed by Froese et al. are realistic. … There is indeed a constructive role for a wide suite of ecosystem models to evaluate fishing strategies in an ecosystem context.

  10. Sensitivity Analysis Without Assumptions.

    Science.gov (United States)

    Ding, Peng; VanderWeele, Tyler J

    2016-05-01

    Unmeasured confounding may undermine the validity of causal inference with observational studies. Sensitivity analysis provides an attractive way to partially circumvent this issue by assessing the potential influence of unmeasured confounding on causal conclusions. However, previous sensitivity analysis approaches often make strong and untestable assumptions such as having an unmeasured confounder that is binary, or having no interaction between the effects of the exposure and the confounder on the outcome, or having only one unmeasured confounder. Without imposing any assumptions on the unmeasured confounder or confounders, we derive a bounding factor and a sharp inequality such that the sensitivity analysis parameters must satisfy the inequality if an unmeasured confounder is to explain away the observed effect estimate or reduce it to a particular level. Our approach is easy to implement and involves only two sensitivity parameters. Surprisingly, our bounding factor, which makes no simplifying assumptions, is no more conservative than a number of previous sensitivity analysis techniques that do make assumptions. Our new bounding factor implies not only the traditional Cornfield conditions that both the relative risk of the exposure on the confounder and that of the confounder on the outcome must satisfy but also a high threshold that the maximum of these relative risks must satisfy. Furthermore, this new bounding factor can be viewed as a measure of the strength of confounding between the exposure and the outcome induced by a confounder.
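    The bounding factor described above is a closed-form expression in the two sensitivity parameters, the confounder's association with the exposure (RR_EU) and with the outcome (RR_UD). A minimal sketch of the calculation follows; the parameter values are illustrative, not taken from the paper.

```python
# Sensitivity-analysis bounding factor of Ding & VanderWeele (2016):
# B = RR_EU * RR_UD / (RR_EU + RR_UD - 1). Dividing an observed risk ratio
# by B gives the smallest true effect consistent with that much confounding.

def bounding_factor(rr_eu, rr_ud):
    """Joint bounding factor for the two sensitivity parameters."""
    return (rr_eu * rr_ud) / (rr_eu + rr_ud - 1.0)

def adjusted_lower_bound(observed_rr, rr_eu, rr_ud):
    """Lower bound on the true risk ratio after adjusting for confounding."""
    return observed_rr / bounding_factor(rr_eu, rr_ud)

observed_rr = 3.0                      # illustrative observed risk ratio
bf = bounding_factor(2.0, 2.0)         # both sensitivity parameters set to 2
lower = adjusted_lower_bound(observed_rr, 2.0, 2.0)
print(f"bounding factor = {bf:.2f}, adjusted RR >= {lower:.2f}")
```

    Because the lower bound here stays well above 1, a confounder of that strength could not fully explain away the observed association, which is the kind of conclusion the method supports.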

  11. Basic life support: evaluation of learning using simulation and immediate feedback devices

    Directory of Open Access Journals (Sweden)

    Lucia Tobase

    2017-10-01

    Objective: to evaluate students’ learning in an online course on basic life support with immediate feedback devices, during a simulation of care during cardiorespiratory arrest. Method: a quasi-experimental study, using a before-and-after design. An online course on basic life support was developed and administered to participants as an educational intervention. Theoretical learning was evaluated by means of a pre- and post-test and, to verify practice, simulation with immediate feedback devices was used. Results: there were 62 participants, 87% female, 90% in the first and second year of college, with a mean age of 21.47 (standard deviation 2.39). With a 95% confidence level, the mean score was 6.4 (standard deviation 1.61) in the pre-test and 9.3 (standard deviation 0.82) in the post-test, p < 0.001; in practice, 9.1 (standard deviation 0.95), a performance equivalent to basic cardiopulmonary resuscitation according to the feedback device; mean duration of the compression cycle of 43.7 (standard deviation 26.86) by second of 20.5 (standard deviation 9.47); number of compressions 167.2 (standard deviation 57.06); depth of compressions of 48.1 millimeters (standard deviation 10.49); ventilation volume of 742.7 (standard deviation 301.12); flow fraction percentage of 40.3 (standard deviation 10.03). Conclusion: the online course contributed to learning of basic life support. In view of the need for technological innovations in teaching and the systematization of cardiopulmonary resuscitation, simulation and feedback devices are resources that favor learning and performance awareness in performing the maneuvers.

  12. Questioning the foundations of physics which of our fundamental assumptions are wrong?

    CERN Document Server

    Foster, Brendan; Merali, Zeeya

    2015-01-01

    The essays in this book look at ways in which the foundations of physics might need to be changed in order to make progress towards a unified theory. They are based on the prize-winning essays submitted to the FQXi essay competition “Which of Our Basic Physical Assumptions Are Wrong?”, which drew over 270 entries. As the Nobel Laureate physicist Philip W. Anderson realized, the key to understanding nature’s reality is not anything “magical”, but the right attitude: “the focus on asking the right questions, the willingness to try (and to discard) unconventional answers, the sensitive ear for phoniness, self-deception, bombast, and conventional but unproven assumptions.” The authors of the eighteen prize-winning essays have, where necessary, adapted their essays for the present volume so as to (a) incorporate the community feedback generated in the online discussion of the essays, (b) add new material that has come to light since their completion, and (c) ensure accessibility to a broad audience of re...

  13. Monitoring Assumptions in Assume-Guarantee Contracts

    Directory of Open Access Journals (Sweden)

    Oleg Sokolsky

    2016-05-01

    Pre-deployment verification of software components with respect to behavioral specifications in the assume-guarantee form does not, in general, guarantee absence of errors at run time. This is because assumptions about the environment cannot be discharged until the environment is fixed. An intuitive approach is to complement pre-deployment verification of guarantees, up to the assumptions, with post-deployment monitoring of environment behavior to check that the assumptions are satisfied at run time. Such a monitor is typically implemented by instrumenting the application code of the component. An additional challenge for the monitoring step is that environment behaviors are typically obtained through an I/O library, which may alter the component's view of the input format. This transformation requires us to introduce a second pre-deployment verification step to ensure that alarms raised by the monitor would indeed correspond to violations of the environment assumptions. In this paper, we describe an approach for constructing monitors and verifying them against the component assumptions. We also discuss the limitations of instrumentation-based monitoring and potential ways to overcome them.

  14. Using Instrument Simulators and a Satellite Database to Evaluate Microphysical Assumptions in High-Resolution Simulations of Hurricane Rita

    Science.gov (United States)

    Hristova-Veleva, S. M.; Chao, Y.; Chau, A. H.; Haddad, Z. S.; Knosp, B.; Lambrigtsen, B.; Li, P.; Martin, J. M.; Poulsen, W. L.; Rodriguez, E.; Stiles, B. W.; Turk, J.; Vu, Q.

    2009-12-01

    Improving forecasting of hurricane intensity remains a significant challenge for the research and operational communities. Many factors determine a tropical cyclone’s intensity. Ultimately, though, intensity is dependent on the magnitude and distribution of the latent heating that accompanies the hydrometeor production during the convective process. Hence, the microphysical processes and their representation in hurricane models are of crucial importance for accurately simulating hurricane intensity and evolution. The accurate modeling of the microphysical processes becomes increasingly important when running high-resolution models that should properly reflect the convective processes in the hurricane eyewall. There are many microphysical parameterizations available today. However, evaluating their performance and selecting the most representative ones remains a challenge. Several field campaigns were focused on collecting in situ microphysical observations to help distinguish between different modeling approaches and improve on the most promising ones. However, these point measurements cannot adequately reflect the space and time correlations characteristic of the convective processes. An alternative approach to evaluating microphysical assumptions is to use multi-parameter remote sensing observations of the 3D storm structure and evolution. In doing so, we could compare modeled to retrieved geophysical parameters. The satellite retrievals, however, carry their own uncertainty. To increase the fidelity of the microphysical evaluation results, we can use instrument simulators to produce satellite observables from the model fields and compare them to the observed ones. This presentation will illustrate how instrument simulators can be used to discriminate between different microphysical assumptions. We will compare and contrast the members of high-resolution ensemble WRF model simulations of Hurricane Rita (2005), each member reflecting different microphysical assumptions.

  15. Detecting and accounting for violations of the constancy assumption in non-inferiority clinical trials.

    Science.gov (United States)

    Koopmeiners, Joseph S; Hobbs, Brian P

    2018-05-01

    Randomized, placebo-controlled clinical trials are the gold standard for evaluating a novel therapeutic agent. In some instances, it may not be considered ethical or desirable to complete a placebo-controlled clinical trial and, instead, the placebo is replaced by an active comparator with the objective of showing either superiority or non-inferiority to the active comparator. In a non-inferiority trial, the experimental treatment is considered non-inferior if it retains a pre-specified proportion of the effect of the active comparator as represented by the non-inferiority margin. A key assumption required for valid inference in the non-inferiority setting is the constancy assumption, which requires that the effect of the active comparator in the non-inferiority trial is consistent with the effect that was observed in previous trials. It has been shown that violations of the constancy assumption can result in a dramatic increase in the rate of incorrectly concluding non-inferiority in the presence of ineffective or even harmful treatment. In this paper, we illustrate how Bayesian hierarchical modeling can be used to facilitate multi-source smoothing of the data from the current trial with the data from historical studies, enabling direct probabilistic evaluation of the constancy assumption. We then show how this result can be used to adapt the non-inferiority margin when the constancy assumption is violated and present simulation results illustrating that our method controls the type-I error rate when the constancy assumption is violated, while retaining the power of the standard approach when the constancy assumption holds. We illustrate our adaptive procedure using a non-inferiority trial of raltegravir, an antiretroviral drug for the treatment of HIV.
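
    The fixed-margin logic described in this abstract can be illustrated in a few lines. This is a minimal sketch, assuming a 50% retention fraction, a normal approximation, and hypothetical effect sizes; it is not the actual raltegravir analysis or the paper's Bayesian hierarchical model.

```python
from statistics import NormalDist

def noninferiority_margin(historical_effect, retention=0.5):
    """Fixed-margin approach: the margin is the share of the active
    comparator's historical effect that may be lost while the new
    treatment still retains the pre-specified proportion."""
    return (1.0 - retention) * historical_effect

def noninferiority_test(effect_difference, std_error, margin, alpha=0.025):
    """One-sided z-test of H0: true difference <= -margin.
    Rejecting H0 supports a claim of non-inferiority."""
    z = (effect_difference + margin) / std_error
    p_value = 1.0 - NormalDist().cdf(z)
    return z, p_value, p_value < alpha

# Hypothetical numbers: historical comparator effect 0.20, observed
# new-vs-comparator difference 0.0 with standard error 0.04.
margin = noninferiority_margin(0.20, retention=0.5)
z, p, noninferior = noninferiority_test(0.0, 0.04, margin)
```

    If the constancy assumption fails, the historical effect feeding `noninferiority_margin` is no longer trustworthy, which is exactly the situation an adaptive margin is designed to handle.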

  16. Evaluation of Achievement of Universal Basic Education (UBE) in Delta State

    Science.gov (United States)

    Osadebe, P. U.

    2014-01-01

    The study evaluated the objectives of the Universal Basic Education (UBE) programme in Delta State. It considered the extent to which each objective was achieved. A research question on the extent to which the UBE objectives were achieved guided the study. Two hypotheses were tested. A sample of 300 students was randomly drawn through the use of…

  17. How Symmetrical Assumptions Advance Strategic Management Research

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Hallberg, Hallberg

    2014-01-01

    We develop the case for symmetrical assumptions in strategic management theory. Assumptional symmetry obtains when assumptions made about certain actors and their interactions in one of the application domains of a theory are also made about this set of actors and their interactions in other application domains of the theory. We argue that assumptional symmetry leads to theoretical advancement by promoting the development of theory with greater falsifiability and stronger ontological grounding. Thus, strategic management theory may be advanced by systematically searching for asymmetrical assumptions.

  18. EVALUATION OF BASIC COURSE WORKSHOP CONDUCTED IN A MEDICAL COLLEGE

    OpenAIRE

    Manasee Panda; Krishna Kar; Kaushik Mishra

    2017-01-01

    BACKGROUND Faculty development is perhaps one of the foremost issues among the factors influencing the quality of medical education. It was planned to evaluate the Basic Course Workshop (BCW) on Medical Education Technologies (MET) conducted in the institution, with the following objectives: 1. To assess the effectiveness of the BCW in MET conducted in the Medical College. 2. To study the changes in teaching practices and assessment methods of faculties after the workshop. MATERIALS ...

  19. An evaluation of the 18- and 12-month basic postgraduate training programmes in Denmark

    DEFF Research Database (Denmark)

    Kjaer, Niels Kristian; Qvesel, Dorte; Kodal, Troels

    2010-01-01

    … equipped and less ready for continued specialisation than doctors of the 18-month programme, and they requested a downward adjustment of the learning objectives associated with the educational positions which follow their basic training. Physicians do not expect the increased focus on learning … and new programmes evaluate their training, and it explores their attitudes towards the new postgraduate training programme. MATERIAL AND METHODS: We developed a questionnaire by which quantitative and qualitative data were collected. The questionnaire was sent to all physicians following basic … and supervision to compensate for the six-month reduction of the training period. Internal medicine should be included in the basic postgraduate training of all physicians. Training in secondary as well as primary health care was requested. CONCLUSION: The young physicians were reluctant towards the new basic …

  20. Towards New Probabilistic Assumptions in Business Intelligence

    Directory of Open Access Journals (Sweden)

    Schumann Andrew

    2015-01-01

    One of the main assumptions of mathematical tools in science is the idea of measurability and additivity of reality. To describe the physical universe, additive measures such as mass, force, energy, and temperature are used. Economics and conventional business intelligence continue this empiricist tradition: their statistical and econometric tools appeal only to the measurable aspects of reality. However, many important variables of economic systems are, in principle, neither observable nor additive. These variables can be called symbolic values or symbolic meanings and are studied within symbolic interactionism, the theory developed by George Herbert Mead and Herbert Blumer. The statistical and econometric tools of business intelligence accept only phenomena with causal connections measured by additive measures. In this paper we show that in the social world we deal with symbolic interactions, which can be studied by non-additive labels (symbolic meanings or symbolic values). To accommodate the variety of such phenomena, we should relax the additivity of basic labels and construct a new probabilistic method for business intelligence based on non-Archimedean probabilities.

  1. A critical evaluation of the local-equilibrium assumption in modeling NAPL-pool dissolution

    Science.gov (United States)

    Seagren, Eric A.; Rittmann, Bruce E.; Valocchi, Albert J.

    1999-07-01

    An analytical modeling analysis was used to assess when local-equilibrium (LE) and nonequilibrium (NE) modeling approaches are appropriate for describing nonaqueous-phase liquid (NAPL) pool dissolution. NE mass transfer between NAPL pools and groundwater is expected to affect the dissolution flux under some conditions; at values of Sh'St (the modified Sherwood number (Lx·kl/Dz) multiplied by the Stanton number (kl/vx)) of roughly 400 and above, the NE and LE solutions converge, and the LE assumption is appropriate. Based on typical groundwater conditions, many cases of interest are expected to fall in this range. The parameter with the greatest impact on Sh'St is kl. The NAPL pool mass-transfer coefficient correlation of Pfannkuch [Pfannkuch, H.-O., 1984. Determination of the contaminant source strength from mass exchange processes at the petroleum-ground-water interface in shallow aquifer systems. In: Proceedings of the NWWA/API Conference on Petroleum Hydrocarbons and Organic Chemicals in Ground Water—Prevention, Detection, and Restoration, Houston, TX. Natl. Water Well Assoc., Worthington, OH, Nov. 1984, pp. 111-129.] was evaluated using the toluene pool data from Seagren et al. [Seagren, E.A., Rittmann, B.E., Valocchi, A.J., 1998. An experimental investigation of NAPL-pool dissolution enhancement by flushing. J. Contam. Hydrol., accepted.]. Dissolution flux predictions made with kl calculated using the Pfannkuch correlation were similar to the LE model predictions, and deviated systematically both from predictions made using the average overall kl = 4.76 m/day estimated by Seagren et al. [1998] and from the experimental data for vx > 18 m/day. The Pfannkuch correlation kl was too large for vx above roughly 10 m/day, possibly because of the relatively low Peclet number data used by Pfannkuch [1984].
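
    The dimensionless criterion quoted in this abstract is easy to evaluate for a given pool. In the sketch below, the kl value of 4.76 m/day comes from the abstract, while the pool length, vertical dispersion coefficient, and velocity are hypothetical illustration values.

```python
def sherwood_stanton_product(L_x, k_l, D_z, v_x):
    """Sh'St: the modified Sherwood number (L_x*k_l/D_z) multiplied
    by the Stanton number (k_l/v_x)."""
    return (L_x * k_l / D_z) * (k_l / v_x)

def local_equilibrium_ok(sh_st, threshold=400.0):
    """Per the abstract, the NE and LE solutions converge for
    Sh'St of roughly 400 and above."""
    return sh_st >= threshold

# Hypothetical pool: length 5 m, k_l = 4.76 m/day (abstract's average),
# D_z = 1e-3 m^2/day, v_x = 18 m/day.
sh_st = sherwood_stanton_product(5.0, 4.76, 1e-3, 18.0)
```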

  2. Basic Test Framework for the Evaluation of Text Line Segmentation and Text Parameter Extraction

    Directory of Open Access Journals (Sweden)

    Darko Brodić

    2010-05-01

    Text line segmentation is an essential stage in off-line optical character recognition (OCR) systems: inaccurately segmented text lines lead to OCR failure. Text line segmentation of handwritten documents is a complex and diverse problem, complicated by the nature of handwriting, and is thus a leading challenge in handwritten document image processing. Because measurement and evaluation of text segmentation algorithm quality are inconsistent, a basic set of measurement methods is required; currently there is no commonly accepted one, and all algorithm evaluation is custom oriented. In this paper, a basic test framework for the evaluation of text feature extraction algorithms is proposed. The framework consists of a few experiments primarily linked to text line segmentation, skew rate and reference text line evaluation. Although the experiments are mutually independent, the results obtained are strongly cross-linked. Its suitability for different types of letters and languages, as well as its adaptability, are its main advantages. Thus, the paper presents an efficient evaluation method for text analysis algorithms.
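
    As a concrete example of the kind of measurement such a framework could standardize, the sketch below scores detected text-line baseline positions against reference baselines within a pixel tolerance. The function and the tolerance value are illustrative assumptions, not part of the proposed framework.

```python
def line_detection_rate(reference_lines, detected_lines, tol=5):
    """Fraction of reference baseline y-positions matched one-to-one
    by a detected baseline within `tol` pixels (greedy matching)."""
    remaining = sorted(detected_lines)
    matched = 0
    for ref in sorted(reference_lines):
        for i, det in enumerate(remaining):
            if abs(ref - det) <= tol:
                matched += 1
                del remaining[i]  # each detection may match only once
                break
    return matched / len(reference_lines)
```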

  3. Basic life support: evaluation of learning using simulation and immediate feedback devices1.

    Science.gov (United States)

    Tobase, Lucia; Peres, Heloisa Helena Ciqueto; Tomazini, Edenir Aparecida Sartorelli; Teodoro, Simone Valentim; Ramos, Meire Bruna; Polastri, Thatiane Facholi

    2017-10-30

    to evaluate students' learning in an online course on basic life support with immediate feedback devices, during a simulation of care during cardiorespiratory arrest. A quasi-experimental study, using a before-and-after design. An online course on basic life support was developed and administered to participants as an educational intervention. Theoretical learning was evaluated by means of a pre- and post-test and, to verify practice, simulation with immediate feedback devices was used. There were 62 participants, 87% female, 90% in the first and second year of college, with a mean age of 21.47 (standard deviation 2.39). With a 95% confidence level, the mean score was 6.4 in the pre-test (standard deviation 1.61) and 9.3 in the post-test (standard deviation 0.82, p < 0.001). In the practical evaluation of basic cardiopulmonary resuscitation, according to the feedback device: mean percentage of correct compressions 43.7 (standard deviation 26.86); mean duration of the compression cycle in seconds 20.5 (standard deviation 9.47); number of compressions 167.2 (standard deviation 57.06); depth of compressions 48.1 millimeters (standard deviation 10.49); ventilation volume 742.7 (standard deviation 301.12); flow fraction percentage 40.3 (standard deviation 10.03). The online course contributed to learning of basic life support. In view of the need for technological innovations in teaching and systematization of cardiopulmonary resuscitation, simulation and feedback devices are resources that favor learning and performance awareness in performing the maneuvers.
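
    The pre/post gain reported in this abstract can also be expressed as a standardized effect size. Cohen's d with a pooled standard deviation, shown below, is an illustrative summary choice, not an analysis reported by the authors.

```python
from math import sqrt

def cohens_d(mean_pre, sd_pre, mean_post, sd_post):
    """Standardized mean difference using a pooled standard deviation."""
    pooled_sd = sqrt((sd_pre**2 + sd_post**2) / 2.0)
    return (mean_post - mean_pre) / pooled_sd

# Scores from the abstract: pre-test 6.4 (SD 1.61), post-test 9.3 (SD 0.82).
d = cohens_d(6.4, 1.61, 9.3, 0.82)
```

    A d above 2 would indicate a very large theoretical-knowledge gain, consistent with the abstract's conclusion.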

  4. Behavioural assumptions in labour economics: Analysing social security reforms and labour market transitions

    OpenAIRE

    van Huizen, T.M.

    2012-01-01

    The aim of this dissertation is to test behavioural assumptions in labour economics models and thereby improve our understanding of labour market behaviour. The assumptions under scrutiny in this study are derived from an analysis of recent influential policy proposals: the introduction of savings schemes in the system of social security. A central question is how this reform will affect labour market incentives and behaviour. Part I (Chapter 2 and 3) evaluates savings schemes. Chapter 2 exam...

  5. Wrong assumptions in the financial crisis

    NARCIS (Netherlands)

    Aalbers, M.B.

    2009-01-01

    Purpose - The purpose of this paper is to show how some of the assumptions about the current financial crisis are wrong because they misunderstand what takes place in the mortgage market. Design/methodology/approach - The paper discusses four wrong assumptions: one related to regulation, one to

  6. PFP issues/assumptions development and management planning guide

    International Nuclear Information System (INIS)

    SINCLAIR, J.C.

    1999-01-01

    The PFP Issues/Assumptions Development and Management Planning Guide presents the strategy and process used for the identification, allocation, and maintenance of an Issues/Assumptions Management List for the Plutonium Finishing Plant (PFP) integrated project baseline. Revisions to this document will include, as attachments, the most recent version of the Issues/Assumptions Management List, both open and current issues/assumptions (Appendix A), and closed or historical issues/assumptions (Appendix B). This document is intended to be a Project-owned management tool. As such, it will periodically require revisions resulting from improvements of the information, processes, and techniques as now described. Revisions that suggest improved processes will only require PFP management approval.

  7. Basic data generation and pressure loss coefficient evaluation for HANARO core thermal-hydraulic analyses

    International Nuclear Information System (INIS)

    Chae, Hee Taek; Lee, Kye Hong

    1999-06-01

    MATRA-h, a HANARO subchannel analysis computer code, is used to evaluate the thermal margin of the HANARO fuel. Its capabilities include assessments of CHF, ONB margin, and fuel temperature. In this report, the basic input data and core design parameters required to perform subchannel analysis with the MATRA-h code are collected. These data include the subchannel geometric data, thermal-hydraulic correlations, empirical constants and material properties. The friction and form loss coefficients of the fuel assemblies were determined from the results of the pressure drop test. In addition, different form loss coefficients at the end plates and spacers were evaluated for the various subchannels. Adequate correlations were applied to evaluate the form loss coefficients for the various subchannels, corrected by measured values so as to give the same pressure drop in each flow channel. The basic input data and design parameters described in this report will be useful for evaluating the thermal margin of the HANARO fuel. (author). 11 refs., 13 tabs., 11 figs
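
    Form loss coefficients of the kind evaluated here are conventionally tied to measured pressure drops through the standard minor-loss relation dP = K·rho·v²/2. The sketch below applies that textbook relation with made-up numbers; it is not based on HANARO test data.

```python
def form_loss_coefficient(delta_p, rho, velocity):
    """Back out K from a measured pressure drop: dP = K * rho * v^2 / 2."""
    return 2.0 * delta_p / (rho * velocity**2)

def pressure_drop(k, rho, velocity):
    """Forward relation: dP = K * rho * v^2 / 2."""
    return k * rho * velocity**2 / 2.0

# Hypothetical: 1000 Pa measured across a spacer at 2 m/s
# in water (density ~1000 kg/m^3).
k_spacer = form_loss_coefficient(1000.0, 1000.0, 2.0)
```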

  8. IAEA consultants' meeting on selection of basic evaluations for the FENDL-2 library. Summary report

    International Nuclear Information System (INIS)

    Pashchenko, A.B.

    1996-09-01

    FENDL-1 is the international reference nuclear data library for fusion design applications, available from the IAEA Nuclear Data Section. FENDL/E is the sublibrary for evaluated neutron reaction data. An updated version, FENDL-2, is being developed. The present report contains the summary of the IAEA Consultants' Meeting on "Selection of Basic Evaluations for the FENDL-2 Library", held at Karlsruhe, Germany, from 24 to 28 June 1996. This meeting was organized by the IAEA Nuclear Data Section (NDS) with the co-operation and assistance of the local organizers at Forschungszentrum Karlsruhe, Germany. Summarized are the conclusions and recommendations for the selection of basic evaluations from candidates submitted by five national projects (JENDL-FF, BROND, EFF, ENDF/B-VI and CENDL) for the FENDL/E-2.0 international reference data library. (author). 1 tab

  9. Oil price assumptions in macroeconomic forecasts: should we follow future market expectations?

    International Nuclear Information System (INIS)

    Coimbra, C.; Esteves, P.S.

    2004-01-01

    In macroeconomic forecasting, oil prices are usually taken as an exogenous variable for which assumptions have to be made, in spite of their important role in price and activity developments. This paper evaluates the forecasting performance of futures market prices against the other popular technical procedure, the carry-over assumption. The results suggest that there is almost no difference between opting for futures market prices and using the carry-over assumption for short-term forecasting horizons (up to 12 months), while for longer horizons they favour the use of futures market prices. However, as futures market prices reflect market expectations for world economic activity, futures oil prices should be adjusted whenever market expectations for world economic growth differ from the values underlying the macroeconomic scenarios, in order to fully ensure the internal consistency of those scenarios. (Author)
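
    The comparison described in this abstract amounts to scoring two forecast paths against the realized price path. The sketch below does this with a root-mean-square error and entirely hypothetical price series.

```python
from math import sqrt

def rmse(forecasts, actuals):
    """Root-mean-square forecast error."""
    errors = [(f - a) ** 2 for f, a in zip(forecasts, actuals)]
    return sqrt(sum(errors) / len(errors))

def carry_over_forecast(last_observed_price, horizon):
    """Carry-over assumption: hold the last observed price constant."""
    return [last_observed_price] * horizon

# Hypothetical: last observed price 30, realized path drifts upward.
actual = [31.0, 32.5, 34.0]
carry = carry_over_forecast(30.0, len(actual))
futures = [31.5, 32.0, 33.0]  # hypothetical futures curve
```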

  10. The zero-sum assumption in neutral biodiversity theory

    NARCIS (Netherlands)

    Etienne, R.S.; Alonso, D.; McKane, A.J.

    2007-01-01

    The neutral theory of biodiversity as put forward by Hubbell in his 2001 monograph has received much criticism for its unrealistic simplifying assumptions. These are the assumptions of functional equivalence among different species (neutrality), the assumption of point mutation speciation, and the

  11. Basic And Alternative Rules In Evaluation Of Tangible And Intangible Assets

    OpenAIRE

    Luminiţa Rus

    2010-01-01

    The purpose of this report is to bring to the forefront the basic and alternative national rules for the evaluation of tangible and intangible assets approved by the Order of the Ministry of Public Finance no. 3055/2009, compared with the International Accounting Standards, and to position this accounting treatment in the context of the International Regulations. It also reviews the fiscal influence of these valuation rules.

  12. BASIC AND ALTERNATIVE RULES IN EVALUATION OF TANGIBLE AND INTANGIBLE ASSETS

    Directory of Open Access Journals (Sweden)

    LUMINIŢA RUS

    2010-01-01

    The purpose of this report is to bring to the forefront the basic and alternative national rules for the evaluation of tangible and intangible assets approved by the Order of the Ministry of Public Finance no. 3055/2009, compared with the International Accounting Standards, and to position this accounting treatment in the context of the International Regulations. It also reviews the fiscal influence of these valuation rules.

  13. A critical assessment of the ecological assumptions underpinning compensatory mitigation of salmon-derived nutrients

    Science.gov (United States)

    Collins, Scott F.; Marcarelli, Amy M.; Baxter, Colden V.; Wipfli, Mark S.

    2015-01-01

    We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited.

  14. Evaluating a hybrid web-based basic genetics course for health professionals.

    Science.gov (United States)

    Wallen, Gwenyth R; Cusack, Georgie; Parada, Suzan; Miller-Davis, Claiborne; Cartledge, Tannia; Yates, Jan

    2011-08-01

    Health professionals, particularly nurses, continue to struggle with the expanding role of genetics information in the care of their patients. This paper describes an evaluation study of the effectiveness of a hybrid basic genetics course for healthcare professionals combining web-based learning with traditional face-to-face instructional techniques. A multidisciplinary group from the National Institutes of Health (NIH) created "Basic Genetics Education for Healthcare Providers" (BGEHCP). This program combined 7 web-based self-education modules with monthly traditional face-to-face lectures by genetics experts. The course was pilot tested by 186 healthcare providers from various disciplines, with 69% (n=129) of the class registrants enrolling in a pre-post evaluation trial. Outcome measures included critical-thinking knowledge items and a Web-based Learning Environment Inventory (WEBLEI). Results indicated a statistically significant increase in knowledge scores, and the WEBLEI results indicated effectiveness particularly in the areas of convenience, access, and course structure and design. Although significant increases in overall knowledge scores were achieved, scores in content areas surrounding genetic risk identification and ethical issues regarding genetic testing reflected continued gaps in knowledge. Web-based genetics education may help overcome genetics knowledge deficits by providing access for health professionals with diverse schedules in a variety of national and international settings. Published by Elsevier Ltd.

  15. Protection against external impacts and missiles - Load assumption and effects on the plant design of a 1300 MW PWR-Plant

    International Nuclear Information System (INIS)

    Gremm, O.; Orth, K.H.

    1978-01-01

    The load assumptions for and the effects of external impacts are given. The fundamental properties of the KWU standard design with respect to these impacts, and the consequences for the engineered safeguards, are explained. Protection against external impacts includes protection against all external missiles. The basic measure of protection against internal missiles is the strict separation of redundancies. (author)

  16. Philosophy of Technology Assumptions in Educational Technology Leadership

    Science.gov (United States)

    Webster, Mark David

    2017-01-01

    A qualitative study using grounded theory methods was conducted to (a) examine what philosophy of technology assumptions are present in the thinking of K-12 technology leaders, (b) investigate how the assumptions may influence technology decision making, and (c) explore whether technological determinist assumptions are present. Subjects involved…

  17. An Evaluation of the Employee Training and Development Process for Nicolet Area Technical College's Basic Education Program.

    Science.gov (United States)

    Karl, Luis C.

    The adult basic education (ABE) program at Nicolet Area Technical College (NATC) evaluated its training and development (T&D) process for new basic education instructors. The study gathered monitoring and screening criteria addressing valuable components for use in an instrument for validating the effectiveness of the ABE program's T&D…

  18. Effect of basic fog of medical x-ray films on image quality and patient dose-method of evaluation and steps to control

    International Nuclear Information System (INIS)

    Bohra, Reena; Nair, C.P.R.; Jayalakshmi, V.; Govindarajan, K.N.; Bhatt, B.C.

    2003-01-01

    Unacceptably high basic fog in medical x-ray films has recently been reported from many hospitals. The paper presents the effect of basic fog on radiographic quality parameters such as sensitivity (speed), contrast and maximum density (Dmax). Several batches of general-purpose medical x-ray films from five different manufacturers were studied to evaluate batch-to-batch variation in basic fog. The increase in basic fog with aging of films was also evaluated. Reasons for the increased basic fog observed in the film processing facilities of a few hospitals were analysed. Factors responsible for the increase in basic fog, and the steps to control it, are discussed

  19. Uranium: a basic evaluation

    International Nuclear Information System (INIS)

    Crull, A.W.

    1978-01-01

    All energy sources and technologies, including uranium and the nuclear industry, are needed to provide power. Public misunderstanding of the nature of uranium and how it works as a fuel may jeopardize nuclear energy as a major option. Basic chemical facts about uranium ore and uranium fuel technology are presented. Some of the major policy decisions that must be made include the enrichment, stockpiling, and pricing of uranium. Investigations and lawsuits pertaining to uranium markets are reviewed, and the point is made that oil companies will probably have to divest their non-oil energy activities. Recommendations for nuclear policies that have been made by the General Accounting Office are discussed briefly

  20. Basic evaluation of typical nanoporous silica nanoparticles in being drug carrier: Structure, wettability and hemolysis.

    Science.gov (United States)

    Li, Jing; Guo, Yingyu

    2017-04-01

    The present work studies the basic capacity of nanoporous silica nanoparticles to serve as drug carriers, covering structure, wettability and hemolysis, so as to provide a crucial evaluation. Typical nanoporous silica nanoparticles were studied: plain nanoporous silica nanoparticles (NSN), amino-modified nanoporous silica nanoparticles (amino-NSN), carboxyl-modified nanoporous silica nanoparticles (carboxyl-NSN) and hierarchical nanoporous silica nanoparticles (hierarchical-NSN). The results showed that their wettability and hemolysis were closely related to structure and surface modification. Wettability became stronger as the amount of OH groups on the NSN surface increased, and both large nanopores and surface modification can reduce it. Furthermore, the NSN series were safe to use when circulated into the blood at low concentration; where high concentrations cannot be avoided during administration, NSN with high porosity or amino modification are the safer choice. It is believed that this basic evaluation of NSN can contribute scientific instruction for designing drug-loaded NSN systems. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Evaluation of Some Approved Basic Science and Technology Textbooks in Use in Junior Secondary Schools in Nigeria

    Science.gov (United States)

    Nwafor, C. E.; Umoke, C. C.

    2016-01-01

    This study was designed to evaluate the content adequacy and readability of approved basic science and technology textbooks in use in junior secondary schools in Nigeria. Eight research questions guided the study. The sample of the study consisted of six (6) approved basic science and technology textbooks, 30 Junior Secondary Schools randomly…

  2. The relevance of ''theory rich'' bridge assumptions

    NARCIS (Netherlands)

    Lindenberg, S

    1996-01-01

    Actor models are increasingly being used as a form of theory building in sociology because they can better represent the causal mechanisms that connect macro variables. However, actor models need additional assumptions, especially so-called bridge assumptions, for filling in the relatively empty

  3. Assumptions and Policy Decisions for Vital Area Identification Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myungsu; Bae, Yeon-Kyoung; Lee, Youngseung [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    U.S. Nuclear Regulatory Commission and IAEA guidance indicate that certain assumptions and policy questions should be addressed in a Vital Area Identification (VAI) process. Korea Hydro and Nuclear Power conducted a VAI based on the current Design Basis Threat and engineering judgement to identify APR1400 vital areas. Some of the assumptions were inherited from the Probabilistic Safety Assessment (PSA), as the sabotage logic model was based on the PSA logic tree and equipment location data. This paper illustrates some important assumptions and policy decisions for the APR1400 VAI analysis. Assumptions and policy decisions can be overlooked at the beginning stage of VAI; however, they should be carefully reviewed and discussed among engineers, plant operators, and regulators. Through the APR1400 VAI process, some of the policy concerns and assumptions for analysis were applied based on document research and expert panel discussions. It was also found that there are more assumptions to define in further studies for other types of nuclear power plants. One such assumption is mission time, which was inherited from the PSA.

  4. Underlying assumptions and core beliefs in anorexia nervosa and dieting.

    Science.gov (United States)

    Cooper, M; Turner, H

    2000-06-01

    To investigate assumptions and beliefs in anorexia nervosa and dieting. The Eating Disorder Belief Questionnaire (EDBQ) was administered to patients with anorexia nervosa, dieters and female controls. The patients scored more highly than the other two groups on assumptions about weight and shape, assumptions about eating and negative self-beliefs. The dieters scored more highly than the female controls on assumptions about weight and shape. The cognitive content of anorexia nervosa (both assumptions and negative self-beliefs) differs from that found in dieting. Assumptions about weight and shape may also distinguish dieters from female controls.

  5. A fuzzy-based reliability approach to evaluate basic events of fault tree analysis for nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry

    2014-01-01

    Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative to the conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have quantitative failure rates or failure probabilities. However, such failure data are often difficult to obtain, due to insufficient data, changing environments or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees for which precise probability distributions of time to failure are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and effectiveness of the proposed approach, the actual basic event failure probabilities collected from the operational experience of the Davis-Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach is a suitable alternative to the conventional probabilistic reliability approach when basic events lack the quantitative historical failure data needed to determine their reliability characteristics. Hence, it overcomes a limitation of conventional fault tree analysis for nuclear power plant probabilistic safety assessment
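
    A minimal sketch of the elicitation step, assuming a hypothetical mapping from qualitative failure-possibility words to triangular fuzzy numbers and a simple centroid defuzzification; the paper's actual membership functions and its conversion from possibility scores to failure probabilities are more involved.

```python
# Hypothetical mapping from expert qualitative judgements to triangular
# fuzzy numbers (a, m, b) on a [0, 1] failure-possibility scale.
FAILURE_POSSIBILITY = {
    "very low":  (0.0, 0.1, 0.2),
    "low":       (0.1, 0.25, 0.4),
    "medium":    (0.3, 0.5, 0.7),
    "high":      (0.6, 0.75, 0.9),
    "very high": (0.8, 0.9, 1.0),
}

def aggregate(judgements):
    """Average the triangular fuzzy numbers elicited from several experts."""
    tfns = [FAILURE_POSSIBILITY[j] for j in judgements]
    n = len(tfns)
    return tuple(sum(t[i] for t in tfns) / n for i in range(3))

def centroid(tfn):
    """Defuzzify a triangular fuzzy number to a crisp possibility score."""
    a, m, b = tfn
    return (a + m + b) / 3.0
```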

  6. Pre-equilibrium assumptions and statistical model parameters effects on reaction cross-section calculations

    International Nuclear Information System (INIS)

    Avrigeanu, M.; Avrigeanu, V.

    1992-02-01

A systematic study of the effects of statistical model parameters and semi-classical pre-equilibrium emission models has been carried out for the (n,p) reactions on the 56 Fe and 60 Co target nuclei. The results obtained by using various assumptions within a given pre-equilibrium emission model differ among themselves more than the results of different models used under similar conditions. The necessity of using realistic level density formulas is emphasized, especially in connection with pre-equilibrium emission models (i.e. with the exciton state density expression), while basic support could be found only by replacing the Williams exciton state density formula with a realistic one. (author). 46 refs, 12 figs, 3 tabs

  7. Spatial modelling of assumption of tourism development with geographic IT using

    Directory of Open Access Journals (Sweden)

    Jitka Machalová

    2010-01-01

Full Text Available The aim of this article is to show the possibilities of spatial modelling and analysis of the assumptions of tourism development in the Czech Republic, with the objective of making decision-making processes in tourism easier and more efficient (for companies and clients as well as destination managements). The development and placement of tourism depend on factors (conditions) that influence its application in specific areas. These factors are usually divided into three groups: selective, localization and realization. Tourism is inseparably connected with space – the countryside. The countryside can be modelled and subsequently analysed by means of geographical information technologies. With the help of spatial modelling and the following analyses, the localization and realization conditions in the regions of the Czech Republic have been evaluated. The best localization conditions have been found in the Liberecký region. The capital city of Prague has negligible natural conditions; however, its social conditions are on a high level. Next, the spatial analyses have shown that the best realization conditions are provided by the capital city of Prague, followed by the Central-Bohemian, South-Moravian, Moravian-Silesian and Karlovarský regions. The development of a tourism destination depends not only on the localization and realization factors but is also fundamentally affected by the level of local destination management. Spatial modelling can help destination managers in decision-making processes to make optimal use of a destination's potential and to target their marketing activities efficiently.

  8. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)

    We propose a class of finite state systems of synchronizing distributed processes, where processes make assumptions at local states about the state of other processes in the system. This constrains the global states of the system to those where assumptions made by a process about another are compatible with the ...

  9. HYPROLOG: A New Logic Programming Language with Assumptions and Abduction

    DEFF Research Database (Denmark)

    Christiansen, Henning; Dahl, Veronica

    2005-01-01

We present HYPROLOG, a novel integration of Prolog with assumptions and abduction, which is implemented in and partly borrows syntax from Constraint Handling Rules (CHR) for integrity constraints. Assumptions are a mechanism inspired by linear logic and taken over from Assumption Grammars. The language shows a novel flexibility in the interaction between the different paradigms, including all additional built-in predicates and constraint solvers that may be available. Assumptions and abduction are especially useful for language processing, and we can show how HYPROLOG works seamlessly together...

  10. Questionable assumptions hampered interpretation of a network meta-analysis of primary care depression treatments.

    Science.gov (United States)

    Linde, Klaus; Rücker, Gerta; Schneider, Antonius; Kriston, Levente

    2016-03-01

We aimed to evaluate the underlying assumptions of a network meta-analysis investigating which depression treatment works best in primary care and to highlight challenges and pitfalls of interpretation under consideration of these assumptions. We reviewed 100 randomized trials investigating pharmacologic and psychological treatments for primary care patients with depression. Network meta-analysis was carried out within a frequentist framework using response to treatment as outcome measure. Transitivity was assessed by epidemiologic judgment based on theoretical and empirical investigation of the distribution of trial characteristics across comparisons. Homogeneity and consistency were investigated by decomposing the Q statistic. There were important clinical and statistically significant differences between "pure" drug trials comparing pharmacologic substances with each other or placebo (63 trials) and trials including a psychological treatment arm (37 trials). Overall network meta-analysis produced results well comparable with separate meta-analyses of drug trials and psychological trials. Although the homogeneity and consistency assumptions were mostly met, we considered the transitivity assumption unjustifiable. An exchange of experience between reviewers and, if possible, some guidance on how reviewers addressing important clinical questions can proceed in situations where important assumptions for valid network meta-analysis are not met would be desirable.

  11. String cosmology basic ideas and general results

    CERN Document Server

    Veneziano, Gabriele

    1995-01-01

After recalling a few basic concepts from cosmology and string theory, I will outline the main ideas/assumptions underlying (our own group's approach to) string cosmology and show how these lead to the definition of a two-parameter family of "minimal" models. I will then briefly explain how to compute, in terms of those parameters, the spectrum of scalar, tensor and electromagnetic perturbations, and mention their most relevant physical consequences. More details on the latter part of this talk can be found in Maurizio Gasperini's contribution to these proceedings.

  12. Basic principles on the safety evaluation of the HTGR hydrogen production system

    International Nuclear Information System (INIS)

    Ohashi, Kazutaka; Nishihara, Tetsuo; Tazawa, Yujiro; Tachibana, Yukio; Kunitomi, Kazuhiko

    2009-03-01

As HTGR hydrogen production systems, such as the HTTR-IS system or GTHTR300C currently being developed by the Japan Atomic Energy Agency, consist of a nuclear reactor and a chemical plant, a combination without precedent in the world, a safety design philosophy and regulatory framework should be newly developed. In this report, phenomena to be considered and events to be postulated in the safety evaluation of HTGR hydrogen production systems were investigated, and basic principles for establishing acceptance criteria for explosion and toxic gas release accidents were provided. Especially for the explosion accident, quantitative criteria for the reactor building are proposed together with related sample calculation results. It is necessary to treat abnormal events occurring in the hydrogen production system as 'external events to the nuclear plant' in order to classify the hydrogen production system as a non-nuclear facility, and a basic policy to meet this requirement was also provided. (author)

  13. A simulation study to compare three self-controlled case series approaches: correction for violation of assumption and evaluation of bias.

    Science.gov (United States)

    Hua, Wei; Sun, Guoying; Dodd, Caitlin N; Romio, Silvana A; Whitaker, Heather J; Izurieta, Hector S; Black, Steven; Sturkenboom, Miriam C J M; Davis, Robert L; Deceuninck, Genevieve; Andrews, N J

    2013-08-01

The assumption that the occurrence of an outcome event must not alter subsequent exposure probability is critical for preserving the validity of the self-controlled case series (SCCS) method. This assumption is violated in scenarios in which the event constitutes a contraindication for exposure. In this simulation study, we compared the performance of the standard SCCS approach and two alternative approaches when the event-independent exposure assumption was violated. Using the 2009 H1N1 and seasonal influenza vaccines and Guillain-Barré syndrome as a model, we simulated a scenario in which an individual may encounter multiple unordered exposures and each exposure may be contraindicated by the occurrence of the outcome event. The degree of contraindication was varied at 0%, 50%, and 100%. The first alternative approach used only cases occurring after exposure, with follow-up time starting from exposure. The second used a pseudo-likelihood method. When the event-independent exposure assumption was satisfied, the standard SCCS approach produced nearly unbiased relative incidence estimates. When this assumption was partially or completely violated, two alternative SCCS approaches could be used. While the post-exposure-cases-only approach could handle only one exposure, the pseudo-likelihood approach was able to correct bias for both exposures. Violation of the event-independent exposure assumption leads to an overestimation of relative incidence, which could be corrected by alternative SCCS approaches. In multiple-exposure situations, the pseudo-likelihood approach is optimal; the post-exposure-cases-only approach is limited in handling a second exposure and may introduce additional bias, and thus should be used with caution.
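The core SCCS idea, that each case acts as its own control and relative incidence is estimated by comparing event rates inside and outside a post-exposure risk window, can be sketched as follows. All window lengths, rates and the single-exposure setup below are invented for illustration; the study's actual models (multiple exposures, pseudo-likelihood correction) are more elaborate.

```python
import random

random.seed(1)

def simulate_case_series(n_people, risk_days, baseline_days, rho, base_rate):
    """Count events inside and outside the post-exposure risk window."""
    events_risk = events_base = 0
    for _ in range(n_people):
        # Daily Bernoulli approximation of a Poisson event process.
        for _ in range(risk_days):
            if random.random() < base_rate * rho:
                events_risk += 1
        for _ in range(baseline_days):
            if random.random() < base_rate:
                events_base += 1
    return events_risk, events_base

risk_days, baseline_days = 42, 323    # e.g. a 6-week risk window in a 1-year period
n_risk, n_base = simulate_case_series(5000, risk_days, baseline_days,
                                      rho=3.0, base_rate=1e-4)
# With a single homogeneous baseline, the relative-incidence MLE
# reduces to a simple rate ratio.
rho_hat = (n_risk / risk_days) / (n_base / baseline_days)
print(round(rho_hat, 2))  # close to the true relative incidence of 3.0
```

Note that here event occurrence does not influence exposure, i.e. the event-independent exposure assumption holds by construction; the bias the study examines arises precisely when that construction is broken.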

  14. Basic properties of a stationary accretion disk surrounding a black hole

    International Nuclear Information System (INIS)

    Hoshi, Reiun

    1977-01-01

    The structure of a stationary accretion disk surrounding a black hole is studied by means of newly developed basic equations. The basic equations are derived under the assumption that the vertical distribution of disk matter is given by a polytrope. For a Keplerian accretion disk, basic equations reduce to a differential equation of the first order. We have found that solutions of an optically thick accretion disk converge to a limiting value, irrespective of the outer boundary condition. This gives the happy consequence that the inner structure of an optically thick accretion disk is determined irrespective of the outer boundary condition. On the contrary, an optically thin accretion disk shows bimodal behavior, that is, two physically distinct states exist depending on the outer boundary condition imposed at the outer edge of the accretion disk. (auth.)

  15. Contextuality under weak assumptions

    International Nuclear Information System (INIS)

    Simmons, Andrew W; Rudolph, Terry; Wallman, Joel J; Pashayan, Hakop; Bartlett, Stephen D

    2017-01-01

    The presence of contextuality in quantum theory was first highlighted by Bell, Kochen and Specker, who discovered that for quantum systems of three or more dimensions, measurements could not be viewed as deterministically revealing pre-existing properties of the system. More precisely, no model can assign deterministic outcomes to the projectors of a quantum measurement in a way that depends only on the projector and not the context (the full set of projectors) in which it appeared, despite the fact that the Born rule probabilities associated with projectors are independent of the context. A more general, operational definition of contextuality introduced by Spekkens, which we will term ‘probabilistic contextuality’, drops the assumption of determinism and allows for operations other than measurements to be considered contextual. Even two-dimensional quantum mechanics can be shown to be contextual under this generalised notion. Probabilistic noncontextuality represents the postulate that elements of an operational theory that cannot be distinguished from each other based on the statistics of arbitrarily many repeated experiments (they give rise to the same operational probabilities) are ontologically identical. In this paper, we introduce a framework that enables us to distinguish between different noncontextuality assumptions in terms of the relationships between the ontological representations of objects in the theory given a certain relation between their operational representations. This framework can be used to motivate and define a ‘possibilistic’ analogue, encapsulating the idea that elements of an operational theory that cannot be unambiguously distinguished operationally can also not be unambiguously distinguished ontologically. We then prove that possibilistic noncontextuality is equivalent to an alternative notion of noncontextuality proposed by Hardy. Finally, we demonstrate that these weaker noncontextuality assumptions are sufficient to prove

  16. Evaluating subway drivers’ exposure to whole body vibration based on Basic and VDV methods (with ISO 2631-1 standard

    Directory of Open Access Journals (Sweden)

    A. Khavanin

    2014-07-01

Conclusion: Comparison of the results obtained from the Basic and VDV methods revealed different amounts of vibration exposure: the VDV method predicts a higher level of risk than the Basic method. The results also show that some of the presented indicators cannot define a safe zone in human vibration evaluations.
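For reference, the two ISO 2631-1 metrics compared in this study are the frequency-weighted r.m.s. acceleration (Basic method) and the vibration dose value, VDV = (integral of a⁴ dt)^(1/4), which weights peaks more heavily. A minimal sketch on a synthetic signal follows; the signal is made up, and a real evaluation would use measured, frequency-weighted acceleration.

```python
import math

def rms(accel):
    """Basic method: r.m.s. of the frequency-weighted acceleration (m/s^2)."""
    return math.sqrt(sum(a * a for a in accel) / len(accel))

def vdv(accel, dt):
    """Vibration dose value: fourth root of the time integral of a^4 (m/s^1.75)."""
    return (sum(a ** 4 for a in accel) * dt) ** 0.25

dt = 0.01                   # 100 Hz sampling, 100 s record
accel = [0.5 * math.sin(2 * math.pi * 5 * n * dt) for n in range(10000)]

print(round(rms(accel), 3))       # 0.354 (= 0.5 / sqrt(2) for a sine)
print(round(vdv(accel, dt), 3))   # 1.237
```

Because VDV uses a fourth power, adding a few large shocks raises VDV far more than it raises the r.m.s. value, which is consistent with VDV flagging higher risk for the same ride.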

  17. Sensitivity of probabilistic MCO water content estimates to key assumptions

    International Nuclear Information System (INIS)

    DUNCAN, D.R.

    1999-01-01

Sensitivity of probabilistic multi-canister overpack (MCO) water content estimates to key assumptions is evaluated, with emphasis on the largest non-cladding film contributors: water borne by particulates adhering to damage sites, and water borne by canister particulate. Calculations considered different choices for the degree of independence of damage states, different choices of percentile for reference high inputs, three types of input probability density functions (pdfs) (triangular, log-normal, and Weibull), and the number of scrap baskets in an MCO
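The kind of sensitivity study this record describes, propagating different input pdf choices through a Monte Carlo model and comparing a high percentile of the output, can be sketched as follows. Every parameter value here is invented; only the structure (triangular vs. log-normal vs. Weibull inputs matched to a common mean) follows the record.

```python
import math
import random

random.seed(42)
N = 20000
MEAN = 0.2  # hypothetical mean water mass per contributor (kg)

def draw(dist):
    """Sample one contributor's water mass from the chosen input pdf."""
    if dist == "triangular":
        return random.triangular(0.0, 2 * MEAN, MEAN)   # mean == MEAN
    if dist == "lognormal":
        sigma = 0.5
        mu = math.log(MEAN) - 0.5 * sigma ** 2          # mean == MEAN
        return random.lognormvariate(mu, sigma)
    k = 1.5                                             # Weibull shape
    lam = MEAN / math.gamma(1 + 1 / k)                  # scale giving mean == MEAN
    return random.weibullvariate(lam, k)

results = {}
for dist in ("triangular", "lognormal", "weibull"):
    # Total water over 5 hypothetical contributors, repeated N times.
    totals = sorted(sum(draw(dist) for _ in range(5)) for _ in range(N))
    results[dist] = totals[int(0.95 * N)]
    print(f"{dist:10s} 95th percentile of total: {results[dist]:.3f} kg")
```

Even with all three inputs matched to the same mean, the 95th percentiles differ, which is exactly why the choice of input pdf is treated as a key assumption.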

  18. Investigation and basic evaluation for ultra-high burnup fuel cladding material

    International Nuclear Information System (INIS)

    Ioka, Ikuo; Nagase, Fumihisa; Futakawa, Masatoshi; Kiuchi, Kiyoshi

    2001-03-01

In ultra-high burnup of the power reactor, it is an essential problem to develop cladding with excellent durability. First, the development history and the approach to safety assessment of Zircaloy for high burnup fuel are summarized in the report. Second, basic evaluation and investigation were carried out on materials with high practicability in order to select candidate materials for ultra-high burnup fuel. In addition, basic research on modification technology for the cladding surface was carried out from the viewpoint of adding a safety margin to the cladding. From the development history of zirconium alloys including Zircaloy, it is hard to estimate the results of in-pile tests from those of conventional corrosion tests (out-pile tests). Therefore, the development of new testing technology that can simulate the actual environment and the elucidation of the corrosion-controlling factors of the cladding are desired. In cases of RIA (Reactivity Initiated Accident) and LOCA (Loss of Coolant Accident), it seems that the loss of ductility in zirconium alloys under heavy irradiation and boiling of high-temperature water restricts the extension of fuel burnup. From a preliminary evaluation of high corrosion-resistance materials (austenitic stainless steel, iron- or nickel-base superalloys, titanium alloy, niobium alloy, vanadium alloy and ferritic stainless steel), stabilized austenitic stainless steels, with a capability of future improvement, and high-purity niobium alloys, with an expectation of good corrosion resistance, were selected as candidate materials for ultra-high burnup cladding. (author)

  19. 40 CFR 265.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

..., STORAGE, AND DISPOSAL FACILITIES Financial Requirements § 265.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the...

  20. 40 CFR 144.66 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

... PROGRAMS (CONTINUED) UNDERGROUND INJECTION CONTROL PROGRAM Financial Responsibility: Class I Hazardous Waste Injection Wells § 144.66 State assumption of responsibility. (a) If a State either assumes legal...

  1. The Analysis of Basic Public Service Supply Regional Equalization in China’s Provinces——Based on the Theil Index Evaluation

    Science.gov (United States)

    Liao, Zangyi

    2017-12-01

Accomplishing regional equalization of basic public service supply among the provinces of China is an important objective for promoting the improvement of people's livelihood. To measure the degree of non-equalization of basic public service supply, this paper takes five aspects as first-level indices: infrastructure construction, basic education services, public employment services, public health services and social security services. Together with 16 second-level indices, these are used to construct a performance evaluation system, and the Theil index is then used to evaluate provincial performance based on panel data from 2000 to 2012.
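The Theil index used in this record is a simple entropy-based inequality measure: it is zero when all regions have identical values and grows with inequality. A minimal sketch with invented region values follows; the paper's actual computation aggregates 16 indices over provincial panel data.

```python
import math

def theil_index(values):
    """Theil T index: 0 for perfect equality, larger for more inequality."""
    n = len(values)
    mean = sum(values) / n
    return sum((x / mean) * math.log(x / mean) for x in values) / n

equal   = [10, 10, 10, 10]      # identical supply across four regions
unequal = [1, 5, 10, 40]        # one region dominates

print(round(theil_index(equal), 4))    # 0.0
print(round(theil_index(unequal), 4))  # 0.5507
```

A practical advantage of the Theil index over, say, the Gini coefficient is that it decomposes additively into within-group and between-group components, which suits provincial/regional comparisons.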

  2. 40 CFR 264.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

... FACILITIES Financial Requirements § 264.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure, post-closure care, or...

  3. 40 CFR 261.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

... Excluded Hazardous Secondary Materials § 261.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure or liability...

  4. 40 CFR 267.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

... STANDARDIZED PERMIT Financial Requirements § 267.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure care or liability...

  5. Derivation of simplified basic equations of gas-liquid two-phase dispersed flow based on two-fluid model

    International Nuclear Information System (INIS)

    Kataoka, Isao; Tomiyama, Akio

    2004-01-01

The simplified and physically reasonable basic equations for gas-liquid dispersed flow were developed based on appropriate assumptions and the treatment of the dispersed phase as isothermal rigid particles. Based on the local instant formulation of mass, momentum and energy conservation of the dispersed flow, time-averaged equations were obtained assuming that physical quantities within the dispersed phase are uniform. These assumptions are approximately valid when the phase change rate and/or chemical reaction rate at the gas-liquid interface are not too large and there is no heat generation within the dispersed phase. Detailed discussions were made of the characteristics of the obtained basic equations and the physical meanings of the terms constituting them. It is shown that, in the derived averaged momentum equation, the terms for pressure gradient and viscous momentum diffusion do not appear, and, in the energy equation, the term for molecular thermal diffusion heat flux does not appear. These characteristics of the derived equations were shown to be very consistent with the physical interpretation of gas-liquid dispersed flow. Furthermore, the obtained basic equations are consistent with experiments on dispersed flow in which most averaged physical quantities are obtained assuming that their distributions are uniform within the dispersed phase. It was investigated whether the obtained basic equations are well-posed or ill-posed as an initial value problem. The eigenvalues of the simplified mass and momentum equations were calculated for the basic equations obtained here and for previous two-fluid basic equations with a one-pressure model. Well-posedness or ill-posedness is judged according to whether the eigenvalues are real or imaginary. The result indicated that the newly developed basic equations always constitute a well-posed initial value problem, while the previous two-fluid basic equations based on the one-pressure model constitute an ill-posed one
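The well-posedness test described in this record reduces, for a simplified 2x2 system, to checking whether the eigenvalues of the characteristic matrix are real (hyperbolic, hence a well-posed initial value problem) or complex (ill-posed). The coefficient matrices below are illustrative stand-ins, not the paper's actual two-fluid coefficients.

```python
def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the characteristic polynomial."""
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    if disc >= 0:
        r = disc ** 0.5
        return ((tr + r) / 2, (tr - r) / 2), True    # real: well-posed
    r = (-disc) ** 0.5
    return (complex(tr / 2, r / 2), complex(tr / 2, -r / 2)), False

# A matrix with real eigenvalues (hyperbolic system) ...
lams, well_posed = eigenvalues_2x2(1.0, 2.0, 3.0, 1.0)
print(well_posed)   # True

# ... and one with purely imaginary eigenvalues (ill-posed initial value problem).
lams, well_posed = eigenvalues_2x2(0.0, -1.0, 1.0, 0.0)
print(well_posed)   # False
```

In an ill-posed system, short-wavelength perturbations grow without bound, so numerical solutions fail to converge under grid refinement; that is the practical significance of the real-versus-imaginary eigenvalue check.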

  6. Science Awareness and Science Literacy through the Basic Physics Course: Physics with a bit of Metaphysics?

    Science.gov (United States)

    Rusli, Aloysius

    2016-08-01

    Until the 1980s, it is well known and practiced in Indonesian Basic Physics courses, to present physics by its effective technicalities: The ideally elastic spring, the pulley and moving blocks, the thermodynamics of ideal engine models, theoretical electrostatics and electrodynamics with model capacitors and inductors, wave behavior and its various superpositions, and hopefully closed with a modern physics description. A different approach was then also experimented with, using the Hobson and Moore texts, stressing the alternative aim of fostering awareness, not just mastery, of science and the scientific method. This is hypothesized to be more in line with the changed attitude of the so-called Millenials cohort who are less attentive if not interested, and are more used to multi-tasking which suits their shorter span of attention. The upside is increased awareness of science and the scientific method. The downside is that they are getting less experience of the scientific method which intensely bases itself on critical observation, analytic thinking to set up conclusions or hypotheses, and checking consistency of the hypotheses with measured data. Another aspect is recognition that the human person encompasses both the reasoning capacity and the mental- spiritual-cultural capacity. This is considered essential, as the world grows even smaller due to increased communication capacity, causing strong interactions, nonlinear effects, and showing that value systems become more challenging and challenged due to physics / science and its cosmology, which is successfully based on the scientific method. So students should be made aware of the common basis of these two capacities: the assumptions, the reasoning capacity and the consistency assumption. This shows that the limits of science are their set of basic quantifiable assumptions, and the limits of the mental-spiritual-cultural aspects of life are their set of basic metaphysical (non-quantifiable) assumptions. The

  7. Science Awareness and Science Literacy through the Basic Physics Course: Physics with a bit of Metaphysics?

    International Nuclear Information System (INIS)

    Rusli, Aloysius

    2016-01-01

    Until the 1980s, it is well known and practiced in Indonesian Basic Physics courses, to present physics by its effective technicalities: The ideally elastic spring, the pulley and moving blocks, the thermodynamics of ideal engine models, theoretical electrostatics and electrodynamics with model capacitors and inductors, wave behavior and its various superpositions, and hopefully closed with a modern physics description. A different approach was then also experimented with, using the Hobson and Moore texts, stressing the alternative aim of fostering awareness, not just mastery, of science and the scientific method. This is hypothesized to be more in line with the changed attitude of the so-called Millenials cohort who are less attentive if not interested, and are more used to multi-tasking which suits their shorter span of attention. The upside is increased awareness of science and the scientific method. The downside is that they are getting less experience of the scientific method which intensely bases itself on critical observation, analytic thinking to set up conclusions or hypotheses, and checking consistency of the hypotheses with measured data. Another aspect is recognition that the human person encompasses both the reasoning capacity and the mental- spiritual-cultural capacity. This is considered essential, as the world grows even smaller due to increased communication capacity, causing strong interactions, nonlinear effects, and showing that value systems become more challenging and challenged due to physics / science and its cosmology, which is successfully based on the scientific method. So students should be made aware of the common basis of these two capacities: the assumptions, the reasoning capacity and the consistency assumption. This shows that the limits of science are their set of basic quantifiable assumptions, and the limits of the mental-spiritual-cultural aspects of life are their set of basic metaphysical (non-quantifiable) assumptions. The

  8. Discrete Neural Signatures of Basic Emotions.

    Science.gov (United States)

    Saarimäki, Heini; Gotsopoulos, Athanasios; Jääskeläinen, Iiro P; Lampinen, Jouko; Vuilleumier, Patrik; Hari, Riitta; Sams, Mikko; Nummenmaa, Lauri

    2016-06-01

Categorical models of emotions posit neurally and physiologically distinct human basic emotions. We tested this assumption by using multivariate pattern analysis (MVPA) to classify brain activity patterns of 6 basic emotions (disgust, fear, happiness, sadness, anger, and surprise) in 3 experiments. Emotions were induced with short movies or mental imagery during functional magnetic resonance imaging. MVPA accurately classified emotions induced by both methods, and the classification generalized from one induction condition to another and across individuals. Brain regions contributing most to the classification accuracy included medial and inferior lateral prefrontal cortices, frontal pole, precentral and postcentral gyri, precuneus, and posterior cingulate cortex. Thus, specific neural signatures across these regions hold representations of different emotional states in multimodal fashion, independently of how the emotions are induced. Similarity of subjective experiences between emotions was associated with similarity of neural patterns for the same emotions, suggesting a direct link between activity in these brain regions and the subjective emotional experience.
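The MVPA logic in this record, training a classifier on multivoxel activity patterns and testing generalization to held-out trials, can be illustrated with a toy nearest-centroid classifier on synthetic data. The data, noise level and classifier choice are all assumptions for illustration; the study used more sophisticated pattern analysis on real fMRI data.

```python
import random

random.seed(0)
EMOTIONS = ["disgust", "fear", "happiness", "sadness", "anger", "surprise"]
N_VOXELS = 50

# Each emotion gets a distinct mean activity pattern; single trials add noise.
prototype = {e: [random.gauss(0, 1) for _ in range(N_VOXELS)] for e in EMOTIONS}

def trial(emotion, noise=0.8):
    """One simulated multivoxel pattern for the given emotion."""
    return [m + random.gauss(0, noise) for m in prototype[emotion]]

def classify(pattern, centroids):
    """Assign the label whose centroid is nearest in squared distance."""
    def dist(label):
        return sum((p - q) ** 2 for p, q in zip(pattern, centroids[label]))
    return min(centroids, key=dist)

# Train centroids on one set of trials, test on an independent set.
train = {e: [trial(e) for _ in range(20)] for e in EMOTIONS}
centroids = {e: [sum(v) / 20 for v in zip(*train[e])] for e in EMOTIONS}

test_trials = [(e, trial(e)) for e in EMOTIONS for _ in range(20)]
accuracy = sum(classify(p, centroids) == e
               for e, p in test_trials) / len(test_trials)
print(accuracy > 1 / 6)  # well above the 1-in-6 chance level
```

The key inferential point mirrors the study's: if classification on held-out trials beats chance, the patterns carry category-specific information, supporting neurally distinct basic emotions.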

  9. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    International Nuclear Information System (INIS)

    Shao, Kan; Gift, Jeffrey S.; Setzer, R. Woodrow

    2013-01-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more
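The normal-versus-log-normal question in this record can be illustrated by fitting both distributions to the same skewed responses and comparing maximized log-likelihoods; the log-normal likelihood is just the normal likelihood of the logged data minus a Jacobian term. The synthetic data below are invented for illustration and are not the study's datasets.

```python
import math
import random

random.seed(7)
# Skewed synthetic "continuous response" data, generated log-normally on purpose.
data = [math.exp(random.gauss(3.0, 0.6)) for _ in range(200)]

def normal_loglik(xs):
    """Maximized normal log-likelihood (MLE mean and variance)."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return -0.5 * n * (math.log(2 * math.pi * var) + 1)

def lognormal_loglik(xs):
    """Maximized log-normal log-likelihood: fit a normal to log(x),
    then subtract the Jacobian term sum(log x)."""
    logs = [math.log(x) for x in xs]
    return normal_loglik(logs) - sum(logs)

# For sufficiently skewed data the log-normal fit wins.
print(lognormal_loglik(data) > normal_loglik(data))
```

Such a likelihood comparison is one simple way to check the distribution assumption before it is propagated into a BMD/BMDL estimate.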

  10. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Kan, E-mail: Shao.Kan@epa.gov [ORISE Postdoctoral Fellow, National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Gift, Jeffrey S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Setzer, R. Woodrow [National Center for Computational Toxicology, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more

  11. Issues in the economic evaluation of influenza vaccination by injection of healthy working adults in the US: a review and decision analysis of ten published studies.

    Science.gov (United States)

    Hogan, Thomas J

    2012-05-01

    The objective was to review recent economic evaluations of influenza vaccination by injection in the US, assess their evidence, and draw conclusions from their collective findings. The literature was searched for economic evaluations of influenza vaccination by injection in healthy working adults in the US published since 1995. Ten evaluations described in nine papers were identified. These were synopsized and their results evaluated, the basic structure of all evaluations was ascertained, and the sensitivity of outcomes to changes in parameter values was explored using a decision model. Areas in which to improve economic evaluations were noted. Eight of nine evaluations with credible economic outcomes were favourable to vaccination, representing a statistically significant result compared with the proportion of 50% that would be expected if vaccination and no vaccination were economically equivalent. Evaluations shared a basic structure, but differed considerably with respect to cost components, assumptions, methods, and parameter estimates. Sensitivity analysis indicated that changes in parameter values within the feasible range, individually or simultaneously, could reverse economic outcomes. Given the stated misgivings, the methods of estimating the influenza reduction ascribed to vaccination must be researched to confirm that they produce accurate and reliable estimates. Research is also needed to improve estimates of the costs per case of influenza illness and the costs of vaccination. Based on their assumptions, the reviewed papers collectively appear to support the economic benefits of influenza vaccination of healthy adults. Yet the underlying assumptions, methods and parameter estimates themselves warrant further research to confirm that they are accurate, reliable and appropriate for economic evaluation purposes.

  12. Formalization and Analysis of Reasoning by Assumption

    OpenAIRE

    Bosse, T.; Jonker, C.M.; Treur, J.

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically analyzed against dynamic properties they fulfill. To this end, for the pattern of reasoning by assumption a variety of dynamic properties have been speci...

  13. Teaching Basic Cooking Skills: Evaluation of the North Carolina Extension "Cook Smart, Eat Smart" Program

    Science.gov (United States)

    Dunn, Carolyn; Jayaratne, K. S. U.; Baughman, Kristen; Levine, Katrina

    2014-01-01

    Cook Smart, Eat Smart (CSES) is a 12-hour cooking school that teaches participants to prepare nutritious, delicious food using simple, healthy preparation techniques, basic ingredients, and minimal equipment. The purpose of this evaluation was to examine the impact of CSES on food preparation and meal consumption behavior. Program outcomes include…

  14. Evaluating the Sensitivity of the Mass-Based Particle Removal Calculations for HVAC Filters in ISO 16890 to Assumptions for Aerosol Distributions

    Directory of Open Access Journals (Sweden)

    Brent Stephens

    2018-02-01

    High-efficiency particle air filters are increasingly being recommended for use in heating, ventilating, and air-conditioning (HVAC) systems to improve indoor air quality (IAQ). ISO Standard 16890-2016 provides a methodology for approximating mass-based particle removal efficiencies for PM1, PM2.5, and PM10 using size-resolved removal efficiency measurements for 0.3 µm to 10 µm particles. Two historical volume distribution functions for ambient aerosol distributions are assumed to represent ambient air in urban and rural areas globally. The goals of this work are to: (i) review the ambient aerosol distributions used in ISO 16890, (ii) evaluate the sensitivity of the mass-based removal efficiency calculation procedures described in ISO 16890 to various assumptions that are related to indoor and outdoor aerosol distributions, and (iii) recommend several modifications to the standard that can yield more realistic estimates of mass-based removal efficiencies for HVAC filters, and thus provide a more realistic representation of a greater number of building scenarios. The results demonstrate that knowing the PM mass removal efficiency estimated using ISO 16890 is not sufficient to predict the PM mass removal efficiency in all of the environments in which the filter might be used. The main reason for this insufficiency is that the assumptions for aerosol number and volume distributions can substantially impact the results, albeit with some exceptions.

  15. DDH-Like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike

    2012-01-01

    We introduce and study a new type of DDH-like assumptions based on groups of prime order q. Whereas standard DDH is based on encoding elements of $\\mathbb{F}_{q}$ “in the exponent” of elements in the group, we ask what happens if instead we put in the exponent elements of the extension ring $R_f=......-Reingold style pseudorandom functions, and auxiliary input secure encryption. This can be seen as an alternative to the known family of k-LIN assumptions....

  16. A framework for the organizational assumptions underlying safety culture

    International Nuclear Information System (INIS)

    Packer, Charles

    2002-01-01

    The safety culture of the nuclear organization can be addressed at the three levels of culture proposed by Edgar Schein. The industry literature provides a great deal of insight at the artefact and espoused value levels, although as yet it remains somewhat disorganized. There is, however, an overall lack of understanding of the assumption level of safety culture. This paper describes a possible framework for conceptualizing the assumption level, suggesting that safety culture is grounded in unconscious beliefs about the nature of the safety problem, its solution and how to organize to achieve the solution. Using this framework, the organization can begin to uncover the assumptions at play in its normal operation, decisions and events and, if necessary, engage in a process to shift them towards assumptions more supportive of a strong safety culture. (author)

  17. Formalization and analysis of reasoning by assumption.

    Science.gov (United States)

    Bosse, Tibor; Jonker, Catholijn M; Treur, Jan

    2006-01-02

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically analyzed against dynamic properties they fulfill. To this end, for the pattern of reasoning by assumption a variety of dynamic properties have been specified, some of which are considered characteristic for the reasoning pattern, whereas some other properties can be used to discriminate among different approaches to the reasoning. These properties have been automatically checked for the traces acquired in experiments undertaken. The approach turned out to be beneficial from two perspectives. First, checking characteristic properties contributes to the empirical validation of a theory on reasoning by assumption. Second, checking discriminating properties allows the analyst to identify different classes of human reasoners. 2006 Lawrence Erlbaum Associates, Inc.

  18. Interpretation of the results of statistical measurements. [search for basic probability model

    Science.gov (United States)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  19. Evaluating the Effects of Basic Skills Mathematics Placement on Academic Outcomes of Community College Students

    Science.gov (United States)

    Melguizo, Tatiana; Bo, Hans; Prather, George; Kim, Bo

    2011-01-01

    The main objective of the authors' proposed study is to evaluate the effectiveness of math placement policies for entering community college students on these students' academic success in math, and their transfer and graduation rates. The main research question that guides the proposed study is: What are the effects of various basic skills…

  20. MONITORED GEOLOGIC REPOSITORY LIFE CYCLE COST ESTIMATE ASSUMPTIONS DOCUMENT

    International Nuclear Information System (INIS)

    R.E. Sweeney

    2001-01-01

    The purpose of this assumptions document is to provide general scope, strategy, technical basis, schedule and cost assumptions for the Monitored Geologic Repository (MGR) life cycle cost (LCC) estimate and schedule update incorporating information from the Viability Assessment (VA) , License Application Design Selection (LADS), 1999 Update to the Total System Life Cycle Cost (TSLCC) estimate and from other related and updated information. This document is intended to generally follow the assumptions outlined in the previous MGR cost estimates and as further prescribed by DOE guidance

  1. The stable model semantics under the any-world assumption

    OpenAIRE

    Straccia, Umberto; Loyer, Yann

    2004-01-01

    The stable model semantics has become a dominating approach to complete the knowledge provided by a logic program by means of the Closed World Assumption (CWA). The CWA asserts that any atom whose truth-value cannot be inferred from the facts and rules is supposed to be false. This assumption is orthogonal to the so-called the Open World Assumption (OWA), which asserts that every such atom's truth is supposed to be unknown. The topic of this paper is to be more fine-grained. Indeed, the objec...

  2. Monitored Geologic Repository Life Cycle Cost Estimate Assumptions Document

    International Nuclear Information System (INIS)

    Sweeney, R.

    2000-01-01

    The purpose of this assumptions document is to provide general scope, strategy, technical basis, schedule and cost assumptions for the Monitored Geologic Repository (MGR) life cycle cost estimate and schedule update incorporating information from the Viability Assessment (VA), License Application Design Selection (LADS), 1999 Update to the Total System Life Cycle Cost (TSLCC) estimate and from other related and updated information. This document is intended to generally follow the assumptions outlined in the previous MGR cost estimates and as further prescribed by DOE guidance

  3. [Implementation of the International Health Regulations in Cuba: evaluation of basic capacities of the health sector in selected provinces].

    Science.gov (United States)

    Gala, Ángela; Toledo, María Eugenia; Arias, Yanisnubia; Díaz González, Manuel; Alvarez Valdez, Angel Manuel; Estévez, Gonzalo; Abreu, Rolando Miyar; Flores, Gustavo Kourí

    2012-09-01

    Obtain baseline information on the status of the basic capacities of the health sector at the local, municipal, and provincial levels in order to facilitate identification of priorities and guide public policies that aim to comply with the requirements and capacities established in Annex 1A of the International Health Regulations 2005 (IHR-2005). A descriptive cross-sectional study was conducted by application of an instrument of evaluation of basic capacities referring to legal and institutional autonomy, the surveillance and research process, and the response to health emergencies in 36 entities involved in international sanitary control at the local, municipal, and provincial levels in the provinces of Havana, Cienfuegos, and Santiago de Cuba. The polyclinics and provincial centers of health and epidemiology in the three provinces had more than 75% of the basic capacities required. Twelve out of 36 units had implemented 50% of the legal and institutional framework. There was variable availability of routine surveillance and research, whereas the entities in Havana had more than 40% of the basic capacities in the area of events response. The provinces evaluated have integrated the basic capacities that will allow implementation of IHR-2005 within the period established by the World Health Organization. It is necessary to develop and establish effective action plans to consolidate surveillance as an essential activity of national and international security in terms of public health.

  4. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  5. Legal assumptions for private company claim for additional (supplementary payment

    Directory of Open Access Journals (Sweden)

    Šogorov Stevan

    2011-01-01

    The subject of analysis in this article is the legal assumptions which must be met in order for a private company to call for additional payments. After introductory remarks, the discussion focuses on the existence of provisions regarding additional payment in the formation contract, or in a general resolution of the shareholders' meeting, as the starting point for the company's claim. The second assumption is a concrete resolution of the shareholders' meeting which creates individual obligations for additional payments. The third assumption is defined as distinctness regarding the sum of the payment and its due date. The sending of the claim by the relevant company body is set as the fourth legal assumption for the realization of the company's right to claim additional payments from a member of the private company.

  6. A Memory-Based Model of Posttraumatic Stress Disorder: Evaluating Basic Assumptions Underlying the PTSD Diagnosis

    Science.gov (United States)

    Rubin, David C.; Berntsen, Dorthe; Bohni, Malene Klindt

    2008-01-01

    In the mnemonic model of posttraumatic stress disorder (PTSD), the current memory of a negative event, not the event itself, determines symptoms. The model is an alternative to the current event-based etiology of PTSD represented in the "Diagnostic and Statistical Manual of Mental Disorders" (4th ed., text rev.; American Psychiatric Association,…

  7. Evaluation of the basic concepts of approaches for the coexistence of nuclear energy and people/local community

    International Nuclear Information System (INIS)

    Kondo, Shunsuke; Kuroki, Shinichi; Nakagiri, Yuko

    2007-01-01

    In November 2007, the Policy Evaluation Committee compiled a report evaluating the basic concepts of approaches to the coexistence of nuclear energy and people/local communities, as specified in the Framework for Nuclear Energy Policy. The report states that the 'concerned administrative bodies are carrying out measures related to the coexistence of nuclear energy and people/local communities in line with these basic concepts' and summarizes fifteen proposals conducive to the betterment and improvement of these measures, classified as: 1) securing transparency and promoting mutual understanding with the public, 2) developing and enriching learning opportunities and public participation, 3) the relationship between the government and local governments, and 4) coexistence with local residents. The Japan Atomic Energy Commission (JAEC) considers this report to be reasonable. This article presents an overview of this activity. (T. Tanaka)

  8. Idaho National Engineering Laboratory installation roadmap assumptions document

    International Nuclear Information System (INIS)

    1993-05-01

    This document is a composite of roadmap assumptions developed for the Idaho National Engineering Laboratory (INEL) by the US Department of Energy Idaho Field Office and subcontractor personnel as a key element in the implementation of the Roadmap Methodology for the INEL Site. The development and identification of these assumptions is an important factor in planning basis development and establishes the planning baseline for all subsequent roadmap analysis at the INEL

  9. Super learning to hedge against incorrect inference from arbitrary parametric assumptions in marginal structural modeling.

    Science.gov (United States)

    Neugebauer, Romain; Fireman, Bruce; Roy, Jason A; Raebel, Marsha A; Nichols, Gregory A; O'Connor, Patrick J

    2013-08-01

    Clinical trials are unlikely to ever be launched for many comparative effectiveness research (CER) questions. Inferences from hypothetical randomized trials may however be emulated with marginal structural modeling (MSM) using observational data, but success in adjusting for time-dependent confounding and selection bias typically relies on parametric modeling assumptions. If these assumptions are violated, inferences from MSM may be inaccurate. In this article, we motivate the application of a data-adaptive estimation approach called super learning (SL) to avoid reliance on arbitrary parametric assumptions in CER. Using the electronic health records data from adults with new-onset type 2 diabetes, we implemented MSM with inverse probability weighting (IPW) estimation to evaluate the effect of three oral antidiabetic therapies on the worsening of glomerular filtration rate. Inferences from IPW estimation were noticeably sensitive to the parametric assumptions about the associations between both the exposure and censoring processes and the main suspected source of confounding, that is, time-dependent measurements of hemoglobin A1c. SL was successfully implemented to harness flexible confounding and selection bias adjustment from existing machine learning algorithms. Erroneous IPW inference about clinical effectiveness because of arbitrary and incorrect modeling decisions may be avoided with SL. Copyright © 2013 Elsevier Inc. All rights reserved.
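    The inverse probability weighting (IPW) step at the core of the estimation strategy described above can be illustrated with a toy sketch. The data and propensity scores below are hypothetical (in practice the propensities would themselves be estimated, e.g. by super learning, rather than known):

```python
# Toy illustration of inverse probability weighting (IPW): estimate the mean
# potential outcome under treatment by weighting treated subjects by the
# inverse of their (here: assumed known) probability of receiving treatment.

def ipw_treated_mean(outcomes, treated, propensity):
    """Stabilized (Hajek-style) IPW estimate of E[Y(1)].

    outcomes:   observed outcomes Y
    treated:    1 if the subject received treatment, else 0
    propensity: P(treatment | covariates), assumed known in this sketch
    """
    weights = [t / p for t, p in zip(treated, propensity)]
    total_w = sum(weights)
    return sum(w * y for w, y in zip(weights, outcomes)) / total_w

# Hypothetical data: treated subjects with low propensity are up-weighted,
# which is what corrects for confounded treatment assignment.
y = [3.0, 1.0, 4.0, 2.0]
a = [1, 0, 1, 0]
p = [0.8, 0.8, 0.2, 0.2]
print(ipw_treated_mean(y, a, p))  # → 3.8
```

    The sensitivity discussed in the abstract enters through the propensity model: if `p` is estimated from a misspecified parametric model, the weights, and hence the inference, can be badly biased, which is the motivation for data-adaptive estimation.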

  10. Deep Borehole Field Test Requirements and Controlled Assumptions.

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This document presents design requirements and controlled assumptions intended for use in the engineering development and testing of: 1) prototype packages for radioactive waste disposal in deep boreholes; 2) a waste package surface handling system; and 3) a subsurface system for emplacing and retrieving packages in deep boreholes. Engineering development and testing is being performed as part of the Deep Borehole Field Test (DBFT; SNL 2014a). This document presents parallel sets of requirements for a waste disposal system and for the DBFT, showing the close relationship. In addition to design, it will also inform planning for drilling, construction, and scientific characterization activities for the DBFT. The information presented here follows typical preparations for engineering design. It includes functional and operating requirements for handling and emplacement/retrieval equipment, waste package design and emplacement requirements, borehole construction requirements, sealing requirements, and performance criteria. Assumptions are included where they could impact engineering design. Design solutions are avoided in the requirements discussion. This set of requirements and assumptions has benefited greatly from reviews by Gordon Appel, Geoff Freeze, Kris Kuhlman, Bob MacKinnon, Steve Pye, David Sassani, Dave Sevougian, and Jiann Su.

  11. School Principals' Assumptions about Human Nature: Implications for Leadership in Turkey

    Science.gov (United States)

    Sabanci, Ali

    2008-01-01

    This article considers principals' assumptions about human nature in Turkey and the relationship between the assumptions held and the leadership style adopted in schools. The findings show that school principals hold Y-type assumptions and prefer a relationship-oriented style in their relations with assistant principals. However, both principals…

  12. A COMPARISON OF BASIC AND EXTENDED MARKOWITZ MODEL ON CROATIAN CAPITAL MARKET

    Directory of Open Access Journals (Sweden)

    Bruna Škarica

    2012-12-01

    Markowitz's mean–variance model for portfolio selection, first introduced in H. M. Markowitz's 1952 article, is one of the best-known models in finance. However, the Markowitz model is based on many assumptions about financial markets and investors which do not coincide with the real world. One of these assumptions is that there are no taxes or transaction costs, when in reality all financial products are subject to both taxes and transaction costs, such as brokerage fees. In this paper, we consider an extension of the standard portfolio problem which includes the transaction costs that arise when constructing an investment portfolio. Finally, we compare both the extension of the Markowitz model, including transaction costs, and the basic model on the example of the Croatian capital market.
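    A minimal two-asset sketch of the comparison the abstract describes. All returns, variances, and cost rates are assumed for illustration and are not the paper's Croatian market data; transaction costs are modeled simply as a proportional deduction from each asset's expected return.

```python
# Illustrative two-asset comparison of the basic Markowitz mean-variance
# model and an extension whose expected returns are reduced by proportional
# transaction costs (a simplification; real cost structures vary).

def best_weight(mu, var, cov, costs=(0.0, 0.0), risk_aversion=3.0, steps=1000):
    """Grid-search the weight w in asset 0 (1 - w goes to asset 1) that
    maximizes mean-variance utility E[r] - 0.5 * A * Var[r], with expected
    returns taken net of proportional transaction costs."""
    net = (mu[0] - costs[0], mu[1] - costs[1])
    best_w, best_u = 0.0, float("-inf")
    for i in range(steps + 1):
        w = i / steps
        exp_ret = w * net[0] + (1 - w) * net[1]
        variance = w ** 2 * var[0] + (1 - w) ** 2 * var[1] + 2 * w * (1 - w) * cov
        u = exp_ret - 0.5 * risk_aversion * variance
        if u > best_u:
            best_w, best_u = w, u
    return best_w

# Assumed inputs: expected returns, variances, and covariance of two assets.
mu, var, cov = (0.10, 0.06), (0.04, 0.01), 0.005
w_basic = best_weight(mu, var, cov)
w_costly = best_weight(mu, var, cov, costs=(0.01, 0.002))
print(w_costly < w_basic)  # unequal costs shift the allocation → True
```

    Because the assumed cost on asset 0 exceeds that on asset 1, the cost-aware optimum tilts toward asset 1, which is the qualitative effect the extended model is meant to capture.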

  13. Major Assumptions of Mastery Learning.

    Science.gov (United States)

    Anderson, Lorin W.

    Mastery learning can be described as a set of group-based, individualized, teaching and learning strategies based on the premise that virtually all students can and will, in time, learn what the school has to teach. Inherent in this description are assumptions concerning the nature of schools, classroom instruction, and learners. According to the…

  14. 7 CFR 772.10 - Transfer and assumption-AMP loans.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Transfer and assumption-AMP loans. 772.10 Section 772..., DEPARTMENT OF AGRICULTURE SPECIAL PROGRAMS SERVICING MINOR PROGRAM LOANS § 772.10 Transfer and assumption—AMP loans. (a) Eligibility. The Agency may approve transfers and assumptions of AMP loans when: (1) The...

  15. Investigation of assumptions underlying current safety guidelines on EM-induced nerve stimulation

    Science.gov (United States)

    Neufeld, Esra; Vogiatzis Oikonomidis, Ioannis; Iacono, Maria Ida; Angelone, Leonardo M.; Kainz, Wolfgang; Kuster, Niels

    2016-06-01

    An intricate network of a variety of nerves is embedded within the complex anatomy of the human body. Although nerves are shielded from unwanted excitation, they can still be stimulated by external electromagnetic sources that induce strongly non-uniform field distributions. Current exposure safety standards designed to limit unwanted nerve stimulation are based on a series of explicit and implicit assumptions and simplifications. This paper demonstrates the applicability of functionalized anatomical phantoms with integrated coupled electromagnetic and neuronal dynamics solvers for investigating the impact of magnetic resonance exposure on nerve excitation within the full complexity of the human anatomy. The impact of neuronal dynamics models, temperature and local hot-spots, nerve trajectory and potential smoothing, anatomical inhomogeneity, and pulse duration on nerve stimulation was evaluated. As a result, multiple assumptions underlying current safety standards are questioned. It is demonstrated that coupled EM-neuronal dynamics modeling involving realistic anatomies is valuable to establish conservative safety criteria.

  16. Personal and Communal Assumptions to Determine Pragmatic Meanings of Phatic Functions

    Directory of Open Access Journals (Sweden)

    Kunjana Rahardi

    2016-11-01

    This research was meant to describe the manifestations of the phatic function in the education domain. The phatic function in the communication and interaction occurring in the education domain can be accurately identified only when utterances are not separated from the pragmatic context that determines them. The context must not be limited to contextual and social or societal perspectives, but must be defined in terms of basic assumptions. The data of this research included various kinds of speech, gathered naturally in education circles, that contain phatic functions. Two methods of data gathering were employed in this study, namely the listening and conversation methods. The recorded data were analyzed through the following steps: (1) data were identified based on the discourse markers found; (2) data were classified based on the phatic perception criteria; (3) data were interpreted based on the referenced theories; (4) data were described in the form of a description of the analysis results. The research proves that the phatic function, in the form of small talk in the education domain, cannot be separated from the context surrounding it.

  17. Quality quantification model of basic raw materials

    Directory of Open Access Journals (Sweden)

    Š. Vilamová

    2016-07-01

    Basic raw materials are among the key inputs in the production of pig iron. The properties of basic raw materials can be evaluated using a variety of criteria, the essential ones being physical and chemical properties. Current competitive pressures, however, force the producers of iron more and more often to include cost and logistic criteria in the decision-making process. In this area they face the problem of how to convert a variety of vastly different parameters into one evaluation indicator in order to compare the available raw materials. This article deals with the analysis of a model created to evaluate basic raw materials, which was designed as part of the research.

  18. Basic concepts and assumptions behind the ICRP recommendations

    International Nuclear Information System (INIS)

    Lindell, B.

    1981-03-01

    The paper gives a review of the current radiation protection recommendations by the International Commission on Radiological Protection (ICRP). It discusses concepts like stochastic effects, radiation detriments, collective dose, dose equivalent and dose limits. (G.B.)

  19. Measurement and Basic Physics Committee of the U.S. Cross-Section Evaluation Working Group annual report 1997

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L. [ed.] [comp.] [Argonne National Lab., IL (United States); McLane, V. [ed.] [comp.] [Brookhaven National Lab., Upton, NY (United States)

    1997-10-01

    The Cross-Section Evaluation Working Group (CSEWG) is a long-standing committee charged with responsibility for organizing and overseeing the US cross-section evaluation effort. Its main product is the official US evaluated nuclear data file, ENDF. In 1992 CSEWG added the Measurements Committee to its list of standing committees and subcommittees. This action was based on a recognition of the importance of experimental data in the evaluation process, as well as the realization that measurement activities in the US were declining at an alarming rate and needed considerable encouragement to avoid the loss of this resource. The mission of the Committee is to maintain contact with experimentalists in the US and to encourage them to contribute to the national nuclear data effort. Improved communication and the facilitation of collaborative activities are among the tools employed in achieving this objective. In 1994 the Committee was given an additional mission, namely, to serve as an interface between the applied interests represented in CSEWG and the basic nuclear science community. Accordingly, its name was changed to the Measurement and Basic Physics Committee. The present annual report is the third such document issued by the Committee. It contains voluntary contributions from several laboratories in the US. Their contributions were submitted to the Chairman for compilation and editing.

  20. Estimation of the energy loss at the blades in rowing: common assumptions revisited.

    Science.gov (United States)

    Hofmijster, Mathijs; De Koning, Jos; Van Soest, A J

    2010-08-01

    In rowing, power is inevitably lost as kinetic energy is imparted to the water during push-off with the blades. Power loss is estimated from reconstructed blade kinetics and kinematics. Traditionally, it is assumed that the oar is completely rigid and that force acts strictly perpendicular to the blade. The aim of the present study was to evaluate how reconstructed blade kinematics, kinetics, and average power loss are affected by these assumptions. A calibration experiment with instrumented oars and oarlocks was performed to establish relations between measured signals and oar deformation and blade force. Next, an on-water experiment was performed with a single female world-class rower rowing at constant racing pace in an instrumented scull. Blade kinematics, kinetics, and power loss under different assumptions (rigid versus deformable oars; absence or presence of a blade force component parallel to the oar) were reconstructed. Estimated power losses at the blades are 18% higher when parallel blade force is incorporated. Incorporating oar deformation affects reconstructed blade kinematics and instantaneous power loss, but has no effect on estimation of power losses at the blades. Assumptions on oar deformation and blade force direction have implications for the reconstructed blade kinetics and kinematics. Neglecting parallel blade forces leads to a substantial underestimation of power losses at the blades.
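    The effect of the two assumptions discussed above can be sketched numerically: power lost at the blade is the product of blade force and blade slip velocity, and neglecting the force component parallel to the oar drops one term. The force and velocity values below are invented for illustration, not the study's measurements.

```python
# Illustrative calculation of instantaneous power loss at a rowing blade.
# The traditional assumption keeps only the force component perpendicular
# to the blade; the fuller model adds a parallel component as well.

def power_loss(f_perp, v_perp, f_par=0.0, v_par=0.0):
    """Instantaneous power imparted to the water (W), as the sum of the
    perpendicular and parallel force-velocity products."""
    return f_perp * v_perp + f_par * v_par

# Traditional assumption: force acts strictly perpendicular to the blade.
p_traditional = power_loss(f_perp=300.0, v_perp=0.5)

# With an assumed parallel force component included (made-up values).
p_full = power_loss(f_perp=300.0, v_perp=0.5, f_par=60.0, v_par=0.9)

print(p_traditional, p_full)  # → 150.0 204.0
```

    With these made-up numbers the parallel term raises the estimated loss, in the same direction as the 18% underestimation reported in the abstract.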

  1. Proposal for basic safety requirements regarding the disposal of high-level radioactive waste

    International Nuclear Information System (INIS)

    1980-04-01

    A working group commissioned to prepare proposals for basic safety requirements for the storage and transport of radioactive waste prepared its report for the Danish Agency of Environmental Protection. The proposals include: radiation protection requirements, requirements concerning the properties of high-level waste units, the geological conditions of the waste disposal location, and the supervision of waste disposal areas. The proposed primary requirements for the safety evaluation of the disposal of high-level waste in deep geological formations are of a general nature, not being tied to specific assumptions regarding the waste itself, the geological and other conditions at the place of disposal, or the technical methods of disposal. It was impossible to test the proposed requirements on a working repository, as no country has, to the knowledge of the working group, actually disposed of high-level radioactive waste or approved plans for such disposal. Methods for evaluating the suitability of geological formations for waste disposal, and background material concerning the preparation of these proposals for basic safety requirements relating to radiation, waste handling and geological conditions, are reviewed. Appended to the report is a description of the phases of the fuel cycle that are related to the storage of spent fuel and the disposal of high-level reprocessing waste in a salt formation. It should be noted that the proposals of the working group are not limited to the disposal of reprocessed fuel, but also include the direct disposal of spent fuel as well as disposal in geological formations other than salt. (EG)

  2. Semi-Supervised Transductive Hot Spot Predictor Working on Multiple Assumptions

    KAUST Repository

    Wang, Jim Jing-Yan; Almasri, Islam; Shi, Yuexiang; Gao, Xin

    2014-01-01

of the transductive semi-supervised algorithms takes all three semi-supervised assumptions, i.e., the smoothness, cluster and manifold assumptions, together into account during learning. In this paper, we propose a novel semi-supervised method for hot spot residue

  3. Three-class ROC analysis--the equal error utility assumption and the optimality of three-class ROC surface using the ideal observer.

    Science.gov (United States)

    He, Xin; Frey, Eric C

    2006-08-01

Previously, we have developed a decision model for three-class receiver operating characteristic (ROC) analysis based on decision theory. The proposed decision model maximizes the expected decision utility under the assumption that incorrect decisions have equal utilities under the same hypothesis (equal error utility assumption). This assumption reduced the dimensionality of the "general" three-class ROC analysis and provided a practical figure-of-merit to evaluate three-class task performance. However, it also limits the generality of the resulting model because the equal error utility assumption will not apply to all clinical three-class decision tasks. The goal of this study was to investigate the optimality of the proposed three-class decision model with respect to several other decision criteria. In particular, besides the maximum expected utility (MEU) criterion used in the previous study, we investigated the maximum-correctness (MC) (or minimum-error), maximum-likelihood (ML), and Neyman-Pearson (N-P) criteria. We found that, by making assumptions for both the MEU and N-P criteria, all decision criteria lead to the previously proposed three-class decision model. As a result, this model maximizes the expected utility under the equal error utility assumption, maximizes the probability of making correct decisions, satisfies the N-P criterion in the sense that it maximizes the sensitivity of one class given the sensitivities of the other two classes, and its ROC surface contains the maximum-likelihood decision operating point. While the proposed three-class ROC analysis model is not optimal in the general sense, due to the use of the equal error utility assumption, the range of criteria for which it is optimal increases its applicability for evaluating and comparing a range of diagnostic systems.
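The maximum-expected-utility rule at the heart of this model is easy to state concretely. The sketch below is a generic illustration, not the authors' implementation, and the 0/1 utilities are assumed for the example: with the equal error utility assumption and correct decisions valued equally, the MEU rule reduces to picking the class with the highest posterior.

```python
def meu_decision(posteriors, utility):
    """Choose the class that maximizes expected decision utility.

    posteriors : dict mapping class -> P(class | data)
    utility    : dict mapping (decision, truth) -> utility of deciding
                 `decision` when `truth` is the actual class
    """
    classes = list(posteriors)

    def expected_utility(decision):
        return sum(posteriors[truth] * utility[(decision, truth)]
                   for truth in classes)

    return max(classes, key=expected_utility)

# Equal error utility assumption: under a given true class, both
# possible incorrect decisions carry the same utility (0 here).
U = {(d, t): (1.0 if d == t else 0.0) for d in range(3) for t in range(3)}
posteriors = {0: 0.5, 1: 0.3, 2: 0.2}
print(meu_decision(posteriors, U))  # 0
```

With unequal utilities for the two kinds of error under a hypothesis, the rule no longer reduces to this simple form, which is exactly the generality the abstract says is given up.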

  4. Public-private partnerships to improve primary healthcare surgeries: clarifying assumptions about the role of private provider activities.

    Science.gov (United States)

    Mudyarabikwa, Oliver; Tobi, Patrick; Regmi, Krishna

    2017-07-01

Aim To examine assumptions about public-private partnership (PPP) activities and their role in improving public procurement of primary healthcare surgeries. PPPs were developed to improve the quality of care and patient satisfaction. However, evidence of their effectiveness in delivering health benefits is limited. A qualitative study design was employed. A total of 25 interviews with public sector staff (n=23) and private sector managers (n=2) were conducted to understand their interpretations of assumptions in the activities of private investors and service contractors participating in Local Improvement Finance Trust (LIFT) partnerships. Realist evaluation principles were applied in the data analysis to interpret the findings. Six thematic areas of assumed health benefits were identified: (i) quality improvement; (ii) improved risk management; (iii) reduced procurement costs; (iv) increased efficiency; (v) community involvement; and (vi) sustainable investment. Primary Care Trusts that chose to procure their surgeries through LIFT were expected to support its implementation by providing an environment conducive to the private participants achieving these benefits. Private participant activities were found to be based on a range of explicit and tacit assumptions perceived as helpful in achieving government objectives for LIFT. The success of PPPs depended upon private participants' (i) capacity to assess how PPP assumptions added value to their activities, (ii) effectiveness in interpreting assumptions in their expected activities, and (iii) preparedness to align their business principles to government objectives for PPPs. They risked missing some of the expected benefits because of factors constraining realization of the assumptions. The ways in which private participants preferred to carry out their activities also influenced the extent to which expected benefits were achieved. Giving more discretion to public than private participants over critical

  5. Dominant region: a basic feature for group motion analysis and its application to teamwork evaluation in soccer games

    Science.gov (United States)

    Taki, Tsuyoshi; Hasegawa, Jun-ichi

    1998-12-01

This paper proposes a basic feature for the quantitative measurement and evaluation of the group behavior of persons. This feature, called the 'dominant region,' is a kind of sphere of influence for each person in the group. The dominant region is defined as the region in which the person can arrive earlier than any other person, and it can be formulated as a Voronoi region modified by replacing the distance function with a time function. This time function is calculated from a computational model of the person's moving ability. As an application of the dominant region, we present a motion analysis system for soccer games. The purpose of this system is to evaluate teamwork quantitatively based on the movement of all the players in the game. Experiments using motion pictures of actual games suggest that the proposed feature is useful for the measurement and evaluation of group behavior in team sports. This basic feature may also be applied to other team ball games, such as American football, basketball, handball and water polo.
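A minimal version of the dominant-region idea can be coded directly. The sketch below uses a much simpler motion model than the paper's (constant top speed, ignoring each player's current velocity and acceleration), so it only illustrates the modified-Voronoi formulation; the names and numbers are invented.

```python
import math

def arrival_time(player, point):
    """Time for a player to reach `point` under a crude constant-speed
    motion model (the paper uses a richer model of moving ability)."""
    (px, py), speed = player["pos"], player["speed"]
    return math.hypot(point[0] - px, point[1] - py) / speed

def dominant_region_owner(players, point):
    """Index of the player who can arrive at `point` earliest; the set
    of points owned by one player is that player's dominant region."""
    return min(range(len(players)),
               key=lambda i: arrival_time(players[i], point))

players = [{"pos": (0.0, 0.0), "speed": 8.0},   # faster player
           {"pos": (10.0, 0.0), "speed": 6.0}]
# The midpoint is equidistant, so with a plain distance function it would
# lie on the Voronoi boundary; the time function hands it to the faster
# player instead.
print(dominant_region_owner(players, (5.0, 0.0)))  # 0
```

Evaluating the owner over a grid of pitch positions would trace out each player's dominant region.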

  6. Moral dilemmas in professions of public trust and the assumptions of ethics of social consequences

    Directory of Open Access Journals (Sweden)

    Dubiel-Zielińska Paulina

    2016-06-01

Full Text Available The aim of the article is to show the possibility of applying assumptions from the ethics of social consequences when making decisions about actions, as well as in situations of moral dilemmas, by persons performing occupations of public trust on a daily basis. The reasoning in the article is analytical and synthetic. The article begins with an explanation of the basic concepts of “profession” and “the profession of public trust” and a demonstration of the difference between these terms. This is followed by a general description of professions of public trust. The area and definition of moral dilemmas is emphasized, and representatives of the professions belonging to them are listed. After a brief characterization of the axiological foundations and main assumptions of the ethics of social consequences, actions according to Vasil Gluchman and Włodzimierz Galewicz are discussed, and actions in line with the ethics of social consequences are transferred to the practical domain. The article points out that actions in professional life are obligatory, impermissible, permissible, supererogatory or unmarked in the moral dimension. The final part of the article reflects on how to solve moral dilemmas from the position of a representative of a profession of public trust. The article concludes with a summary containing the conclusions that stem from the ethics of social consequences for professions of public trust, followed by short examples.

  7. Incorporating assumption deviation risk in quantitative risk assessments: A semi-quantitative approach

    International Nuclear Information System (INIS)

    Khorsandi, Jahon; Aven, Terje

    2017-01-01

Quantitative risk assessments (QRAs) of complex engineering systems are based on numerous assumptions and expert judgments, as there is limited information available for supporting the analysis. In addition to sensitivity analyses, the concept of assumption deviation risk has been suggested as a means of explicitly considering the risk related to inaccuracies and deviations in the assumptions, which can significantly impact the results of QRAs. However, challenges remain for its practical implementation, considering the number of assumptions and the magnitude of deviations to be considered. This paper presents an approach for integrating an assumption deviation risk analysis into QRAs. The approach begins by identifying the safety objectives that the QRA aims to support, and then identifies critical assumptions with respect to ensuring those objectives are met. Key issues addressed include the deviations required to violate the safety objectives, the uncertainties related to the occurrence of such events, and the strength of knowledge supporting the assessments. Three levels of assumptions are considered: assumptions related to the system's structural and operational characteristics, the effectiveness of the established barriers, and the consequence analysis process. The approach is illustrated for the case of an offshore installation. - Highlights: • An approach for assessing the risk of deviations in QRA assumptions is presented. • Critical deviations and uncertainties related to their occurrence are addressed. • The analysis promotes critical thinking about the foundation and results of QRAs. • The approach is illustrated for the case of an offshore installation.
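As a purely hypothetical sketch of how such a semi-quantitative screening might be tabulated (this is not the authors' scheme; the scales, combination rule, and threshold are invented for illustration), each critical assumption could carry ordinal judgements of the deviation needed to violate a safety objective, the uncertainty about such a deviation occurring, and the strength of the supporting knowledge:

```python
def flag_assumptions(assumptions, threshold=6):
    """Return (name, score) pairs for assumptions that warrant follow-up.

    Each assumption carries ordinal judgements on a 1-3 scale:
      deviation   : severity of the deviation needed to violate an objective
      uncertainty : judged uncertainty that such a deviation occurs
      knowledge   : strength of the knowledge behind the assessment
    Weak knowledge raises concern, hence the (4 - knowledge) term.
    The combination rule and threshold are illustrative only.
    """
    flagged = []
    for a in assumptions:
        score = a["deviation"] + a["uncertainty"] + (4 - a["knowledge"])
        if score >= threshold:
            flagged.append((a["name"], score))
    return flagged

assumptions = [
    {"name": "barrier effectiveness", "deviation": 3, "uncertainty": 2, "knowledge": 1},
    {"name": "structural load case",  "deviation": 1, "uncertainty": 1, "knowledge": 3},
]
print(flag_assumptions(assumptions))  # [('barrier effectiveness', 8)]
```

The point of such a table is the one the abstract makes: it forces explicit, reviewable judgements about each assumption rather than leaving them implicit in the QRA.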

  8. Emerging Assumptions About Organization Design, Knowledge And Action

    Directory of Open Access Journals (Sweden)

    Alan Meyer

    2013-12-01

    Full Text Available Participants in the Organizational Design Community’s 2013 Annual Conference faced the challenge of “making organization design knowledge actionable.”  This essay summarizes the opinions and insights participants shared during the conference.  I reflect on these ideas, connect them to recent scholarly thinking about organization design, and conclude that seeking to make design knowledge actionable is nudging the community away from an assumption set based upon linearity and equilibrium, and toward a new set of assumptions based on emergence, self-organization, and non-linearity.

  9. Capturing Assumptions while Designing a Verification Model for Embedded Systems

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    A formal proof of a system correctness typically holds under a number of assumptions. Leaving them implicit raises the chance of using the system in a context that violates some assumptions, which in return may invalidate the correctness proof. The goal of this paper is to show how combining

  10. Questioning Engelhardt's assumptions in Bioethics and Secular Humanism.

    Science.gov (United States)

    Ahmadi Nasab Emran, Shahram

    2016-06-01

In Bioethics and Secular Humanism: The Search for a Common Morality, Tristram Engelhardt examines various possibilities for finding common ground for moral discourse among people from different traditions and concludes that they are futile. In this paper I argue that many of the assumptions on which Engelhardt bases his conclusion about the impossibility of a content-full secular bioethics are problematic. By starting with the notion of moral strangers, there is no possibility, by definition, of a content-full moral discourse among moral strangers. There is thus circularity in starting the inquiry with a definition of moral strangers, which implies that they do not share enough moral background or commitment to an authority to allow them to reach a moral agreement, and then concluding that content-full morality is impossible among moral strangers. I argue that treating traditions as solid and immutable structures that insulate people across their boundaries is problematic. Another questionable assumption in Engelhardt's work is the idea that religious and philosophical traditions provide content-full moralities. As for the cardinal assumption in Engelhardt's review of the various alternatives for a content-full moral discourse among moral strangers, I analyze his foundationalist account of moral reasoning and knowledge and indicate the possibility of other ways of moral knowledge besides the foundationalist one. Finally, I examine Engelhardt's view concerning the futility of attempts at justifying a content-full secular bioethics, and indicate how these assumptions have shaped Engelhardt's critique of the alternatives for the possibility of a content-full secular bioethics.

  11. Science as Knowledge, Practice, and Map Making: The Challenge of Defining Metrics for Evaluating and Improving DOE-Funded Basic Experimental Science

    Energy Technology Data Exchange (ETDEWEB)

    Bodnarczuk, M.

    1993-03-01

    Industrial R&D laboratories have been surprisingly successful in developing performance objectives and metrics that convincingly show that planning, management, and improvement techniques can be value-added to the actual output of R&D organizations. In this paper, I will discuss the more difficult case of developing analogous constructs for DOE-funded non-nuclear, non-weapons basic research, or as I will refer to it - basic experimental science. Unlike most industrial R&D or the bulk of applied science performed at the National Renewable Energy Laboratory (NREL), the purpose of basic experimental science is producing new knowledge (usually published in professional journals) that has no immediate application to the first link (the R) of a planned R&D chain. Consequently, performance objectives and metrics are far more difficult to define. My claim is that if one can successfully define metrics for evaluating and improving DOE-funded basic experimental science (which is the most difficult case), then defining such constructs for DOE-funded applied science should be much less problematic. With the publication of the DOE Standard - Implementation Guide for Quality Assurance Programs for Basic and Applied Research (DOE-ER-STD-6001-92) and the development of a conceptual framework for integrating all the DOE orders, we need to move aggressively toward the threefold next phase: (1) focusing the management elements found in DOE-ER-STD-6001-92 on the main output of national laboratories - the experimental science itself; (2) developing clearer definitions of basic experimental science as practice not just knowledge; and (3) understanding the relationship between the metrics that scientists use for evaluating the performance of DOE-funded basic experimental science, the management elements of DOE-ER-STD-6001-92, and the notion of continuous improvement.

  12. Critically Challenging Some Assumptions in HRD

    Science.gov (United States)

    O'Donnell, David; McGuire, David; Cross, Christine

    2006-01-01

    This paper sets out to critically challenge five interrelated assumptions prominent in the (human resource development) HRD literature. These relate to: the exploitation of labour in enhancing shareholder value; the view that employees are co-contributors to and co-recipients of HRD benefits; the distinction between HRD and human resource…

  13. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

Full Text Available Classical Respondent-Driven Sampling (RDS) estimators are based on a Markov process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for the bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.
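The replacement question can be probed with a toy simulation. The sketch below deliberately ignores the network-referral (Markov chain) structure of real RDS and just compares with- and without-replacement sampling for a simple mean estimate, as rough intuition for why moderate sampling fractions change little; all parameters are invented.

```python
import random
import statistics

def mean_estimate(pop, n, replace, rng):
    """Sample n units with or without replacement; return the sample mean."""
    sample = ([rng.choice(pop) for _ in range(n)] if replace
              else rng.sample(pop, n))
    return statistics.mean(sample)

rng = random.Random(42)
pop = [rng.gauss(0.0, 1.0) for _ in range(1000)]
truth = statistics.mean(pop)

# Average many replications at two sampling fractions; both schemes are
# unbiased here, so the averaged difference from the truth is tiny.
for frac in (0.1, 0.4):
    n = int(frac * len(pop))
    with_r = statistics.mean(mean_estimate(pop, n, True, rng) for _ in range(500))
    without_r = statistics.mean(mean_estimate(pop, n, False, rng) for _ in range(500))
    print(f"fraction {frac}: bias with={with_r - truth:+.4f}, "
          f"without={without_r - truth:+.4f}")
```

The RDS-specific effects the paper actually studies (referral chains, degree-weighted inclusion probabilities) are exactly what this sketch leaves out.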

  14. The Arundel Assumption And Revision Of Some Large-Scale Maps ...

    African Journals Online (AJOL)

    The rather common practice of stating or using the Arundel Assumption without reference to appropriate mapping standards (except mention of its use for graphical plotting) is a major cause of inaccuracies in map revision. This paper describes an investigation to ascertain the applicability of the Assumption to the revision of ...

  15. An Evaluation of Organizational and Experience Factors Affecting the Perceived Transfer of U.S. Air Force Basic Combat Skills Training

    National Research Council Canada - National Science Library

    Crow, Shirley D

    2007-01-01

    .... In this study, basic combat skills training was evaluated using a number of training factors that potentially affect trainees' perception of training transfer, or their ability to apply the skills...

  16. Evaluation of the implementation of a quality system in a basic research laboratory: viability and impacts.

    Science.gov (United States)

    Fraga, Hilda Carolina de Jesus Rios; Fukutani, Kiyoshi Ferreira; Celes, Fabiana Santana; Barral, Aldina Maria Prado; Oliveira, Camila Indiani de

    2012-01-01

To evaluate the process of implementing a quality management system in a basic research laboratory of a public institution, particularly considering the feasibility and impacts of this improvement. This was a prospective and qualitative study. We employed the norm "NIT DICLA 035--Princípios das Boas Práticas de Laboratório (BPL)" and auxiliary documents of the Organisation for Economic Co-operation and Development to complement the planning and implementation of a quality system in a basic research laboratory. In parallel, we used the PDCA tool to define the goals of each phase of the implementation process. This study enabled the laboratory to comply with the NIT DICLA 035 norm and to implement this norm during the execution of a research study. Accordingly, documents were prepared and routines were established, such as the registration of non-conformities, traceability of research data and equipment calibration. The implementation of a quality system in the setting of a laboratory focused on basic research is feasible once certain structural changes are made. Importantly, impacts were noticed during the process, which could be related to several improvements in the laboratory routine.

  17. Measurement and basic physics committee of the U.S. cross-section evaluation working group, annual report 1997

    International Nuclear Information System (INIS)

    Smith, D.L.; McLane, V.

    1998-01-01

The Cross-Section Evaluation Working Group (CSEWG) is a long-standing committee charged with responsibility for organizing and overseeing the US cross-section evaluation effort. Its main product is the official US evaluated nuclear data file, ENDF. The current version of this file is Version VI. All evaluations included in ENDF, as well as periodic modifications and updates to the file, are reviewed and approved by CSEWG and issued by the US Nuclear Data Center, Brookhaven National Laboratory. CSEWG is comprised of volunteers from the US nuclear data community who possess expertise in evaluation methodologies and who collectively have been responsible for producing most of the evaluations included in ENDF. In 1992 CSEWG added the Measurements Committee to its list of standing committees and subcommittees. This action was based on a recognition of the importance of experimental data in the evaluation process as well as the realization that measurement activities in the US were declining at an alarming rate and needed considerable encouragement to avoid the loss of this resource. The mission of the Committee is to maintain contact with experimentalists in the US and to encourage them to contribute to the national nuclear data effort. Improved communication and the facilitation of collaborative activities are among the tools employed in achieving this objective. In 1994 the Committee was given an additional mission, namely, to serve as an interface between the applied interests represented in CSEWG and the basic nuclear science community. Accordingly, its name was changed to the Measurement and Basic Physics Committee. The present annual report is the third such document issued by the Committee. It contains voluntary contributions from several laboratories in the US. Their contributions were submitted to the Chairman for compilation and editing.

  18. MEASUREMENT AND BASIC PHYSICS COMMITTEE OF THE U.S. CROSS-SECTION EVALUATION WORKING GROUP, ANNUAL REPORT 1997

    Energy Technology Data Exchange (ETDEWEB)

    SMITH,D.L.; MCLANE,V.

    1998-10-20

    The Cross-Section Evaluation Working Group (CSEWG) is a long-standing committee charged with responsibility for organizing and overseeing the US cross-section evaluation effort. Its main product is the official US evaluated nuclear data file, ENDF. The current version of this file is Version VI. All evaluations included in ENDF, as well as periodic modifications and updates to the file, are reviewed and approved by CSEWG and issued by the US Nuclear Data Center, Brookhaven National Laboratory. CSEWG is comprised of volunteers from the US nuclear data community who possess expertise in evaluation methodologies and who collectively have been responsible for producing most of the evaluations included in ENDF. In 1992 CSEWG added the Measurements Committee to its list of standing committees and subcommittees. This action was based on a recognition of the importance of experimental data in the evaluation process as well as the realization that measurement activities in the US were declining at an alarming rate and needed considerable encouragement to avoid the loss of this resource. The mission of the Committee is to maintain contact with experimentalists in the US and to encourage them to contribute to the national nuclear data effort. Improved communication and the facilitation of collaborative activities are among the tools employed in achieving this objective. In 1994 the Committee was given an additional mission, namely, to serve as an interface between the applied interests represented in CSEWG and the basic nuclear science community. Accordingly, its name was changed to the Measurement and Basic Physics Committee. The present annual report is the third such document issued by the Committee. It contains voluntary contributions from several laboratories in the US. Their contributions were submitted to the Chairman for compilation and editing.

  19. EVALUATION OF BASIC COURSE WORKSHOP CONDUCTED IN A MEDICAL COLLEGE

    Directory of Open Access Journals (Sweden)

    Manasee Panda

    2017-08-01

Full Text Available BACKGROUND Faculty development is perhaps one of the foremost issues among the factors influencing the quality of medical education. It was planned to evaluate the Basic Course Workshop (BCW) on Medical Education Technologies (MET) conducted in the institution, with the following objectives: 1. To assess the effectiveness of the BCW in MET conducted in the Medical College. 2. To study the changes in the teaching practices and assessment methods of faculties after the workshop. MATERIALS AND METHODS The present evaluation study was conducted at the MCI Regional Training Centre (RTC) in MET at SCB Medical College, Odisha, from February 2012 to December 2012. Kirkpatrick's model, with four levels of program outcomes (reaction, learning, behaviour, and results), was used to evaluate the effectiveness of the workshop. A convenient sampling method was used. All the faculties in the first 4 batches of the workshop were the study participants. Data were collected from the records of the RTC: the filled-in feedback forms, pre- and post-test forms, semi-structured questionnaires completed by the participants, in-depth interviews of facilitators, and focus group discussions with students. Descriptive statistics such as percentages and proportions, and the Chi-square test, were used. RESULTS A total of 67 faculties responded to the questionnaire. There was a gain in knowledge for the majority of faculties in different teaching-learning processes and assessment methods due to the workshop. More than 90% of faculties had the attitude to practice interactive teaching, PBL, and preparing MCQs and structured oral questions. Self-reported change in teaching behaviour and assessment methods was reported by more than 80% of the faculties. The reasons given for non-implementation were lack of support from the institution (64%), from other faculties (34%), and lack of self-motivation (13%). Facilitators were satisfied with the quality of training. But the FGD conducted with the students revealed that they failed to recognize a noticeable change in the teaching and

  20. Regression assumptions in clinical psychology research practice-a systematic review of common misconceptions.

    Science.gov (United States)

    Ernst, Anja F; Albers, Casper J

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongfully held for a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA-recommendations. This paper appeals for a heightened awareness for and increased transparency in the reporting of statistical assumption checking.
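The central misconception, checking normality of the raw variables instead of the errors, can be demonstrated in a few lines. This is a generic illustration, not taken from the paper: a skewed predictor makes the outcome skewed, yet the residuals (what the assumption actually concerns) stay approximately normal.

```python
import random
import statistics

def ols(xs, ys):
    """Least-squares fit of y = a + b*x; returns (a, b)."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def skewness(values):
    """Crude sample skewness (near 0 for symmetric data)."""
    m, s = statistics.mean(values), statistics.pstdev(values)
    return statistics.mean([((v - m) / s) ** 3 for v in values])

rng = random.Random(1)
xs = [rng.expovariate(1.0) for _ in range(2000)]   # skewed predictor
ys = [2.0 * x + rng.gauss(0.0, 1.0) for x in xs]   # hence skewed outcome
a, b = ols(xs, ys)
residuals = [y - (a + b * x) for x, y in zip(xs, ys)]
# The outcome is visibly skewed, but the residuals are not:
print(round(skewness(ys), 2), round(skewness(residuals), 2))
```

Testing the raw `ys` for normality here would wrongly flag a perfectly well-specified regression; the residual check is the relevant one.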

  1. Unrealistic Assumptions in Economics: an Analysis under the Logic of Socioeconomic Processes

    Directory of Open Access Journals (Sweden)

    Leonardo Ivarola

    2014-11-01

Full Text Available The realism of assumptions is an ongoing debate within the philosophy of economics. One of the most referenced papers in this matter belongs to Milton Friedman. He defends the use of unrealistic assumptions, not only as a pragmatic matter, but also because of the intrinsic difficulty of determining the extent of realism. On the other hand, realists have criticized (and still do today) the use of unrealistic assumptions, such as the assumptions of rational choice, perfect information, homogeneous goods, etc. However, they did not accompany their statements with a proper epistemological argument that supports their positions. This work aims to show that the realism of (a particular sort of) assumptions is clearly relevant when examining economic models, since the system under study (the real economies) is compatible not with the logic of invariance and of mechanisms, but with the logic of possibility trees. Because of this, models will not function as tools for predicting outcomes, but as representations of alternative scenarios, whose similarity to the real world must be examined in terms of the verisimilitude of a class of model assumptions.

  2. Rapid Deterioration of Basic Life Support Skills in Dentists With Basic Life Support Healthcare Provider.

    Science.gov (United States)

    Nogami, Kentaro; Taniguchi, Shogo; Ichiyama, Tomoko

    2016-01-01

The aim of this study was to investigate the correlation between basic life support skills in dentists who had completed the American Heart Association's Basic Life Support (BLS) Healthcare Provider qualification and the time since course completion. Thirty-six dentists who had completed the 2005 BLS Healthcare Provider course participated in the study. We asked participants to perform 2 cycles of cardiopulmonary resuscitation on a mannequin and evaluated their basic life support skills. Dentists who had previously completed the BLS Healthcare Provider course displayed prolonged reaction times, and the quality of their basic life support skills had deteriorated rapidly. There were no correlations between basic life support skills and time since course completion. Our results suggest that basic life support skills deteriorate rapidly in dentists who have completed the BLS Healthcare Provider course. Newer guidelines stressing chest compressions over ventilation may help improve performance over time, allowing better cardiopulmonary resuscitation in dental office emergencies. Moreover, it may be effective to provide a more specialized version of the life support course to train dentists, stressing issues that are more likely to occur in the dental office.

  3. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    Science.gov (United States)

    Ernst, Anja F.

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongfully held for a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA-recommendations. This paper appeals for a heightened awareness for and increased transparency in the reporting of statistical assumption checking. PMID:28533971

  4. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    Directory of Open Access Journals (Sweden)

    Anja F. Ernst

    2017-05-01

    Full Text Available Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongfully held for a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA-recommendations. This paper appeals for a heightened awareness for and increased transparency in the reporting of statistical assumption checking.

  5. Challenging Assumptions of International Public Relations: When Government Is the Most Important Public.

    Science.gov (United States)

    Taylor, Maureen; Kent, Michael L.

    1999-01-01

    Explores assumptions underlying Malaysia's and the United States' public-relations practice. Finds many assumptions guiding Western theories and practices are not applicable to other countries. Examines the assumption that the practice of public relations targets a variety of key organizational publics. Advances international public-relations…

  6. Evolution of Requirements and Assumptions for Future Exploration Missions

    Science.gov (United States)

    Anderson, Molly; Sargusingh, Miriam; Perry, Jay

    2017-01-01

NASA programs are maturing technologies, systems, and architectures to enable future exploration missions. To increase fidelity as technologies mature, developers must make assumptions that represent the requirements of a future program. Multiple efforts have begun to define these requirements, including team-internal assumptions, planning for system integration in early demonstrations, and discussions between international partners planning future collaborations. For many detailed life support system requirements, existing NASA documents set limits of acceptable values, but a future vehicle may be constrained in other ways and may select a limited range of conditions. Other requirements are effectively set by interfaces or operations, and may differ for the same technology depending on whether the hardware is a demonstration system on the International Space Station or a critical component of a future vehicle. This paper highlights key assumptions representing potential life support requirements and explains the driving scenarios, constraints, or other issues behind them.

  7. Changing Assumptions and Progressive Change in Theories of Strategic Organization

    DEFF Research Database (Denmark)

    Foss, Nicolai J.; Hallberg, Niklas L.

    2017-01-01

    A commonly held view is that strategic organization theories progress as a result of a Popperian process of bold conjectures and systematic refutations. However, our field also witnesses vibrant debates or disputes about the specific assumptions that our theories rely on, and although these debates are often decoupled from the results of empirical testing, changes in assumptions seem closely intertwined with theoretical progress. Using the case of the resource-based view, we suggest that progressive change in theories of strategic organization may come about as a result of scholarly debate and dispute over what constitutes proper assumptions, even in the absence of corroborating or falsifying empirical evidence. We also discuss how changing assumptions may drive future progress in the resource-based view.

  8. Dosimetric quantities and basic data for the evaluation of generalised derived limits

    International Nuclear Information System (INIS)

    Harrison, N.T.; Simmonds, J.R.

    1980-12-01

    The procedures, dosimetric quantities and basic data to be used for the evaluation of Generalised Derived Limits (GDLs) in environmental materials and of Generalised Derived Limits for discharges to atmosphere are described. The dosimetric considerations and the appropriate intake rates for both children and adults are discussed. In most situations in the nuclear industry and in those institutions, hospitals and laboratories which use relatively small quantities of radioactive material, the Generalised Derived Limits provide convenient reference levels against which the results of environmental monitoring can be compared, and atmospheric discharges can be assessed. They are intended for application when the environmental contamination or discharge to atmosphere is less than about 5% of the Generalised Derived Limit; above this level, it will usually be necessary to undertake a more detailed site-specific assessment. (author)

  9. Investigating the Assumptions of Uses and Gratifications Research

    Science.gov (United States)

    Lometti, Guy E.; And Others

    1977-01-01

    Discusses a study designed to determine empirically the gratifications sought from communication channels and to test the assumption that individuals differentiate channels based on gratifications. (MH)

  10. Assessing framing assumptions in quantitative health impact assessments: a housing intervention example.

    Science.gov (United States)

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2013-09-01

    Health impact assessment (HIA) is often used to determine ex ante the health impact of an environmental policy or an environmental intervention. Underpinning any HIA is the framing assumption, which defines the causal pathways mapping environmental exposures to health outcomes. The sensitivity of the HIA to the framing assumptions is often ignored. A novel method based on fuzzy cognitive maps (FCMs) is developed to quantify the framing assumptions in the assessment stage of an HIA, and is then applied to a housing intervention (tightening insulation) as a case study. Framing assumptions of the case study were identified through a literature search of Ovid Medline (1948-2011). The FCM approach was used to identify the key variables that have the most influence in an HIA. Changes in air-tightness, ventilation, indoor air quality and mould/humidity have been identified as having the most influence on health. The FCM approach is widely applicable and can be used to inform the formulation of the framing assumptions in any quantitative HIA of environmental interventions. We argue that it is necessary to explore and quantify framing assumptions prior to conducting a detailed quantitative HIA during the assessment stage. Copyright © 2013 Elsevier Ltd. All rights reserved.
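
As a rough illustration of the FCM machinery described above, the sketch below iterates the standard FCM update with a sigmoid squashing function. The concept names and causal weights are hypothetical stand-ins loosely inspired by the housing case study, not values from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical concepts and causal weight matrix (row i -> column j):
# 0 insulation, 1 air-tightness, 2 ventilation, 3 mould/humidity, 4 health
concepts = ["insulation", "air-tightness", "ventilation", "mould/humidity", "health"]
W = np.array([
    [0.0,  0.8,  0.0,  0.0,  0.3],   # insulation raises air-tightness, some health benefit
    [0.0,  0.0, -0.7,  0.4,  0.0],   # air-tightness lowers ventilation, raises humidity
    [0.0,  0.0,  0.0, -0.6,  0.4],   # ventilation lowers mould, improves health
    [0.0,  0.0,  0.0,  0.0, -0.8],   # mould/humidity harms health
    [0.0,  0.0,  0.0,  0.0,  0.0],
])

def fcm_run(W, state, steps=50):
    """Iterate the standard FCM update A(t+1) = f(A(t) + W^T A(t))."""
    for _ in range(steps):
        state = sigmoid(state + W.T @ state)
    return state

# Scenario: activate the intervention concept (insulation) and propagate.
initial = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
final = fcm_run(W, initial)
for name, val in zip(concepts, final):
    print(f"{name:>15}: {val:.3f}")
```

Varying the weights and re-running the propagation is one way to probe how sensitive the modelled health outcome is to a given framing assumption.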

  11. Thermal lattice benchmarks for testing basic evaluated data files, developed with MCNP4B

    International Nuclear Information System (INIS)

    Maucec, M.; Glumac, B.

    1996-01-01

    The development of unit cell and full reactor core models of the DIMPLE S01A and TRX-1 and TRX-2 benchmark experiments, using the Monte Carlo computer code MCNP4B, is presented. Nuclear data from the ENDF/B-V and -VI versions of the cross-section library were used in the calculations. In addition, a comparison is presented to results obtained with similar models and cross-section data from the EJ2-MCNPlib library (which is based upon the JEF-2.2 evaluation) developed in IRC Petten, Netherlands. The results of the criticality calculation with the ENDF/B-VI data library, and a comparison to results obtained using the JEF-2.2 evaluation, confirm the MCNP4B full core model of the DIMPLE reactor as a good benchmark for testing basic evaluated data files. On the other hand, the criticality calculation results obtained using the TRX full core models show less agreement with experiment. It is obvious that without additional data about the TRX geometry, our TRX models are not suitable as Monte Carlo benchmarks. (author)

  12. Basic research for environmental restoration

    International Nuclear Information System (INIS)

    1990-12-01

    The Department of Energy (DOE) is in the midst of a major environmental restoration effort to reduce the health and environmental risks resulting from past waste management and disposal practices at DOE sites. This report describes research needs in environmental restoration and complements a previously published document, DOE/ER-0419, Evaluation of Mid-to-Long Term Basic Research for Environmental Restoration. Basic research needs have been grouped into five major categories patterned after those identified in DOE/ER-0419: (1) environmental transport and transformations; (2) advanced sampling, characterization, and monitoring methods; (3) new remediation technologies; (4) performance assessment; and (5) health and environmental effects. In addition to basic research, this document deals with education and training needs for environmental restoration. 2 figs., 6 tabs

  13. Basic research for environmental restoration

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-01

    The Department of Energy (DOE) is in the midst of a major environmental restoration effort to reduce the health and environmental risks resulting from past waste management and disposal practices at DOE sites. This report describes research needs in environmental restoration and complements a previously published document, DOE/ER-0419, Evaluation of Mid-to-Long Term Basic Research for Environmental Restoration. Basic research needs have been grouped into five major categories patterned after those identified in DOE/ER-0419: (1) environmental transport and transformations; (2) advanced sampling, characterization, and monitoring methods; (3) new remediation technologies; (4) performance assessment; and (5) health and environmental effects. In addition to basic research, this document deals with education and training needs for environmental restoration. 2 figs., 6 tabs.

  14. The Emperor's sham - wrong assumption that sham needling is sham.

    Science.gov (United States)

    Lundeberg, Thomas; Lund, Iréne; Näslund, Jan; Thomas, Moolamanil

    2008-12-01

    During the last five years a large number of randomised controlled clinical trials (RCTs) have been published on the efficacy of acupuncture in different conditions. In most of these studies verum is compared with sham acupuncture. In general both verum and sham have been found to be effective, and often with little reported difference in outcome. This has repeatedly led to the conclusion that acupuncture is no more effective than placebo treatment. However, this conclusion is based on the assumption that sham acupuncture is inert. Since sham acupuncture evidently is merely another form of acupuncture from the physiological perspective, the assumption that sham is sham is incorrect and conclusions based on this assumption are therefore invalid. Clinical guidelines based on such conclusions may therefore exclude suffering patients from valuable treatments.

  15. Homogeneous groups of plants, development scenarios, and basic configurations on the cogeneration systems optimization from the alcohol sector

    International Nuclear Information System (INIS)

    Silva Walter, A.C. da; Bajay, S.V.; Carrillo, J.L.L.

    1990-01-01

    The evaluation of introducing or diffusing new technologies at a macroeconomic level using microeconomic information can be carried out through the careful selection of a small number of groups of plants that are homogeneous from the point of view of the main technical parameters being considered. In this paper this concept is applied to the study of cogeneration in sugar and alcohol producing plants. The statistical techniques of cluster analysis, regression and mean-value testing are used. Basic cogeneration plant designs are proposed for alternative development scenarios for this industrial branch. These scenarios are based upon differing assumptions about the expansion of the alcohol market, the use of surplus sugar cane bagasse as a saleable commodity, as a fuel or raw material, and price expectations for the sale of surplus power from the cogeneration plants to the local grid. (author)

  16. Selecting between-sample RNA-Seq normalization methods from the perspective of their assumptions.

    Science.gov (United States)

    Evans, Ciaran; Hardin, Johanna; Stoebel, Daniel M

    2017-02-27

    RNA-Seq is a widely used method for studying the behavior of genes under different biological conditions. An essential step in an RNA-Seq study is normalization, in which raw data are adjusted to account for factors that prevent direct comparison of expression measures. Errors in normalization can have a significant impact on downstream analysis, such as inflated false positives in differential expression analysis. An underemphasized feature of normalization is the assumptions on which the methods rely and how the validity of these assumptions can have a substantial impact on the performance of the methods. In this article, we explain how assumptions provide the link between raw RNA-Seq read counts and meaningful measures of gene expression. We examine normalization methods from the perspective of their assumptions, as an understanding of methodological assumptions is necessary for choosing methods appropriate for the data at hand. Furthermore, we discuss why normalization methods perform poorly when their assumptions are violated and how this causes problems in subsequent analysis. To analyze a biological experiment, researchers must select a normalization method with assumptions that are met and that produces a meaningful measure of expression for the given experiment. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
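
One concrete example of such an assumption is the median-of-ratios normalization popularized by DESeq, which presumes that most genes are not differentially expressed between samples. A minimal sketch (toy counts, not real data; the function name is ours):

```python
import numpy as np

def size_factors(counts):
    """DESeq-style median-of-ratios size factors.

    Key assumption (as the article discusses): most genes are NOT
    differentially expressed, so the median ratio of each sample to a
    per-gene geometric-mean pseudo-reference reflects sequencing depth.
    counts: genes x samples array of raw read counts.
    """
    counts = np.asarray(counts, dtype=float)
    # Restrict to genes with no zero counts so logs are defined.
    nonzero = np.all(counts > 0, axis=1)
    log_geo_mean = np.mean(np.log(counts[nonzero]), axis=1)
    log_ratios = np.log(counts[nonzero]) - log_geo_mean[:, None]
    return np.exp(np.median(log_ratios, axis=0))

# Toy example: sample 2 was sequenced exactly twice as deeply as sample 1.
rng = np.random.default_rng(1)
base = rng.integers(10, 1000, size=(200, 1))
counts = np.hstack([base, 2 * base])
sf = size_factors(counts)
print(sf)  # the two factors differ by a factor of 2, matching the depth difference
```

If the no-differential-expression assumption is badly violated (e.g. global shifts in expression), the median ratio no longer tracks depth and the resulting "normalized" values are biased, which is exactly the failure mode the article warns about.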

  17. [Contract focused short-term group therapy--results of an evaluation].

    Science.gov (United States)

    Hirschberg, Rainer; Meyer, Birgit

    2010-01-01

    A short description outlines the development of contract-focused short-term group therapy (AFoG) for children and adolescents. Subsequently the generic principles of psychotherapy are applied to AFoG in order to underline the basic assumptions of this variation of systemic group therapy. Behavioural changes arising in different contexts (school, family, group therapy) show the need for an appropriate flexibility of group therapy techniques. The evaluation was accomplished using the Child Behaviour Checklist (CBCL 4-18) at the beginning and 3 months after the end of the group therapy. The results show positive effects, which are finally discussed critically.

  18. The homogeneous marginal utility of income assumption

    NARCIS (Netherlands)

    Demuynck, T.

    2015-01-01

    We develop a test to verify if every agent from a population of heterogeneous consumers has the same marginal utility of income function. This homogeneous marginal utility of income assumption is often (implicitly) used in applied demand studies because it has nice aggregation properties and

  19. Recognising the Effects of Costing Assumptions in Educational Business Simulation Games

    Science.gov (United States)

    Eckardt, Gordon; Selen, Willem; Wynder, Monte

    2015-01-01

    Business simulations are a powerful way to provide experiential learning that is focussed, controlled, and concentrated. Inherent in any simulation, however, are numerous assumptions that determine feedback, and hence the lessons learnt. In this conceptual paper we describe some common cost assumptions that are implicit in simulation design and…

  20. National Uranium Resource Evaluation Program. Hydrogeochemical and Stream Sediment Reconnaissance Basic Data Reports Computer Program Requests Manual

    International Nuclear Information System (INIS)

    1980-01-01

    This manual is intended to aid those who are unfamiliar with ordering computer output for verification and preparation of Uranium Resource Evaluation (URE) Project reconnaissance basic data reports. The manual is also intended to help standardize the procedures for preparing the reports. Each section describes a program or group of related programs. The sections are divided into three parts: Purpose, Request Forms, and Requested Information

  1. Basic study of water-cement ratio evaluation for fresh mortar using an ultrasonic measurement technique

    International Nuclear Information System (INIS)

    Hamza Haffies Ismail; Murata, Yorinobu

    2009-01-01

    The objective of this research is the basic study of an ultrasonic evaluation method for determining the water-cement ratio (W/C) of fresh concrete at an early age of hardening. The water-cement ratio is an important parameter for evaluating the strength of concrete in concrete construction. Using an ultrasonic pulse measurement technique, variations of wave velocity and frequency with the age of the concrete during the hardening process could be evaluated. As test samples, fresh mortar with water-cement ratios of 40%, 50% and 60% was poured into a cylindrical plastic mould (φ100 mm x 50 mm). For the ultrasonic pulse wave transmission technique, two wide-band ultrasonic transducers were set on the top and bottom surfaces of the mortar, and measurements were taken at 5-minute intervals from 10 minutes until 60 minutes after pouring water. As a result, it was confirmed that wave velocity and center frequency changed with the age of the mortar depending on the water-cement ratio. (author)
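
The basic quantity in the pulse-transmission technique is the wave velocity obtained from specimen thickness and transit time. A minimal sketch for the 50 mm specimen described above, with hypothetical (illustrative, not measured) transit-time readings:

```python
# Pulse-transmission velocity through the 50 mm mortar specimen.
THICKNESS_M = 0.050   # specimen height along the propagation path (m)

def pulse_velocity(transit_time_us):
    """Wave velocity (m/s) from transit time in microseconds."""
    return THICKNESS_M / (transit_time_us * 1e-6)

# Hypothetical readings as the mortar hardens: transit time falls,
# so the measured velocity rises with specimen age.
for age_min, t_us in [(10, 100.0), (30, 60.0), (60, 40.0)]:
    print(f"age {age_min:2d} min: {pulse_velocity(t_us):7.0f} m/s")
```

Tracking this velocity (together with the spectral content of the received pulse) over the first hour is what allows the hardening curves for different W/C ratios to be distinguished.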

  2. Evaluation of the basic mechanical and thermal properties of deep crystalline rocks

    International Nuclear Information System (INIS)

    Park, Byoung Yoon; Bae, Dae Seok; Kim, Chun Soo; Kim, Kyung Su; Koh, Young Kwon; Jeon, Seok Won

    2001-04-01

    This report provides the mechanical and thermal properties of granitic intact rocks obtained from the Deep Core Drilling Program, which is carried out as part of the assessment of the deep geological environmental condition. These data are the basic material properties of the core samples from the boreholes drilled up to 500 m depth at the Yusung and Kosung sites. These sites were selected based on the results of a preliminary site evaluation study. In this study, the mechanical properties include density, porosity, P-wave velocity, S-wave velocity, uniaxial compressive strength, Young's modulus, Poisson's ratio, tensile strength, and shear strength of fractures, and the thermal properties are thermal conductivity, thermal expansion coefficient, specific heat and so on. These properties were measured through laboratory tests, and the data are compared with existing test results for several domestic rocks.

  3. Evaluation of the basic mechanical and thermal properties of deep crystalline rocks

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung Yoon; Bae, Dae Seok; Kim, Chun Soo; Kim, Kyung Su; Koh, Young Kwon; Jeon, Seok Won

    2001-04-01

    This report provides the mechanical and thermal properties of granitic intact rocks obtained from the Deep Core Drilling Program, which is carried out as part of the assessment of the deep geological environmental condition. These data are the basic material properties of the core samples from the boreholes drilled up to 500 m depth at the Yusung and Kosung sites. These sites were selected based on the results of a preliminary site evaluation study. In this study, the mechanical properties include density, porosity, P-wave velocity, S-wave velocity, uniaxial compressive strength, Young's modulus, Poisson's ratio, tensile strength, and shear strength of fractures, and the thermal properties are thermal conductivity, thermal expansion coefficient, specific heat and so on. These properties were measured through laboratory tests, and the data are compared with existing test results for several domestic rocks.

  4. An optical flow algorithm based on gradient constancy assumption for PIV image processing

    International Nuclear Information System (INIS)

    Zhong, Qianglong; Yang, Hua; Yin, Zhouping

    2017-01-01

    Particle image velocimetry (PIV) has matured as a flow measurement technique. It enables the description of the instantaneous velocity field of the flow by analyzing the particle motion obtained from digitally recorded images. The correlation-based PIV evaluation technique is widely used because of its good accuracy and robustness. Although very successful, the correlation PIV technique has some weaknesses which can be avoided by optical flow based PIV algorithms. At present, most of the optical flow methods applied to PIV are based on the brightness constancy assumption. However, some factors of flow imaging technology and the nature of the fluids make the brightness constancy assumption less appropriate in real PIV cases. In this paper, an implementation of a 2D optical flow algorithm (GCOF) based on the gradient constancy assumption is introduced. The proposed GCOF assumes that the edges of the illuminated PIV particles are constant during motion. It comprises two terms: a combined local-global gradient data term and a first-order divergence and vorticity smoothness term. The approach can provide accurate dense motion fields. It is tested on synthetic images and on two experimental flows. The comparison of GCOF with other optical flow algorithms indicates that the proposed method is more accurate, especially under illumination variation. The comparison of GCOF with the correlation PIV technique shows that the proposed GCOF has advantages in preserving small divergence and vorticity structures of the motion field and produces fewer outliers. As a consequence, the GCOF acquires a more accurate and better topological description of the turbulent flow. (paper)
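
The difference between the two data terms can be sketched in a 1D analogue (illustrative signals, not the paper's formulation): under an additive illumination change, the brightness-constancy residual evaluated at the true displacement stays large, while the gradient-constancy residual essentially vanishes, because differentiation removes the additive offset.

```python
import numpy as np

x = np.linspace(0, 4 * np.pi, 200)
f1 = np.sin(x) + 0.5 * np.sin(3 * x)
shift = 5
f2 = np.roll(f1, shift) + 0.8   # true motion plus an additive illumination change

def residual(a, b, use_gradient):
    """Squared data-term residual at the TRUE displacement, evaluated
    away from the wrap-around boundary (indices 10..-10)."""
    if use_gradient:
        # Gradient constancy: compare finite-difference gradients instead.
        a, b = np.diff(a), np.diff(b)
    aligned = np.roll(b, -shift)
    return float(np.sum((aligned[10:-10] - a[10:-10]) ** 2))

r_brightness = residual(f1, f2, use_gradient=False)
r_gradient = residual(f1, f2, use_gradient=True)
# The brightness term "pays" for the illumination offset even at the
# correct displacement; the gradient term is insensitive to it.
print(r_brightness, r_gradient)
```

In a variational solver, a data term that remains large at the true motion biases the recovered field, which is why a gradient-based term behaves better under illumination variation.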

  5. Triatominae Biochemistry Goes to School: Evaluation of a Novel Tool for Teaching Basic Biochemical Concepts of Chagas Disease Vectors

    Science.gov (United States)

    Cunha, Leonardo Rodrigues; de Oliveria Cudischevitch, Cecília; Carneiro, Alan Brito; Macedo, Gustavo Bartholomeu; Lannes, Denise; da Silva-Neto, Mário Alberto Cardoso

    2014-01-01

    We evaluate a new approach to teaching the basic biochemistry mechanisms that regulate the biology of Triatominae, major vectors of "Trypanosoma cruzi," the causative agent of Chagas disease. We have designed and used a comic book, "Carlos Chagas: 100 years after a hero's discovery" containing scientific information obtained by…

  6. The Importance of the Assumption of Uncorrelated Errors in Psychometric Theory

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.; Patelis, Thanos

    2015-01-01

    A critical discussion of the assumption of uncorrelated errors in classical psychometric theory and its applications is provided. It is pointed out that this assumption is essential for a number of fundamental results and underlies the concept of parallel tests, the Spearman-Brown's prophecy and the correction for attenuation formulas as well as…

  7. Formalization and Analysis of Reasoning by Assumption

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning

  8. Psychopathology, fundamental assumptions and CD-4 T lymphocyte ...

    African Journals Online (AJOL)

    In addition, we explored whether psychopathology and negative fundamental assumptions in ... Method: Self-rating questionnaires to assess depressive symptoms, ... associated with all participants scoring in the positive range of the FA scale.

  9. [Spirometry - basic examination of the lung function].

    Science.gov (United States)

    Kociánová, Jana

    Spirometry is one of the basic internal examination methods, similar to, for example, blood pressure measurement or ECG recording. It is used to detect or assess the extent of ventilatory disorders. Indications include respiratory symptoms or laboratory anomalies, smoking, inhalation risks and more. Its performance and evaluation should be among the basic skills of pulmonologists, internists, allergologists, pediatricians and sports physicians. The results essentially influence correct diagnosis and choice of treatment. Therefore spirometry must be performed under standardized conditions and accurately and clearly assessed to enable answering clinical questions. Key words: acceptability - calibration - contraindication - evaluation - indication - parameters - spirometry - standardization.

  10. A basic review on the inferior alveolar nerve block techniques.

    Science.gov (United States)

    Khalil, Hesham

    2014-01-01

    The inferior alveolar nerve block is the most common injection technique used in dentistry, and many modifications of the conventional nerve block have recently been described in the literature. The selection of the best technique by the dentist or surgeon depends on many factors, including the success rate and the complications related to the selected technique. Dentists should be aware of the available modifications of the inferior alveolar nerve block techniques in order to choose effectively between them. Some operators may encounter difficulty in identifying the anatomical landmarks which are useful in applying the inferior alveolar nerve block and rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure, and the failure rate of the inferior alveolar nerve block has been reported to be 20-25%, which is considered very high. In this basic review, the anatomical details of the inferior alveolar nerve are given together with a description of both its conventional and modified blocking techniques; in addition, an overview of the complications which may result from the application of this important technique is provided.

  11. Experimental evaluation of the pure configurational stress assumption in the flow dynamics of entangled polymer melts

    DEFF Research Database (Denmark)

    Rasmussen, Henrik K.; Bejenariu, Anca Gabriela; Hassager, Ole

    2010-01-01

    …to the flow in the non-linear flow regime. This has allowed highly elastic measurements within the limit of pure orientational stress, as the time of the flow was considerably smaller than the Rouse time. A Doi-Edwards [J. Chem. Soc., Faraday Trans. 2 74, 1818-1832 (1978)] type of constitutive model with the assumption of pure configurational stress was accurately able to predict the startup as well as the reversed flow behavior. This confirms that this commonly used theoretical picture for the flow of polymeric liquids is a correct physical principle to apply. © 2010 The Society of Rheology. [DOI: 10.1122/1.3496378]

  12. Shattering Man’s Fundamental Assumptions in Don DeLillo’s Falling Man

    Directory of Open Access Journals (Sweden)

    Hazim Adnan Hashim

    2016-09-01

    Full Text Available The present study addresses the effects of traumatic events such as the September 11 attacks on victims' fundamental assumptions. These beliefs or assumptions provide individuals with expectations about the world and their sense of self-worth. Thus, they ground people's sense of security, stability, and orientation. The September 11 terrorist attacks in the U.S.A. were very tragic for Americans because they fundamentally changed their understanding of many aspects of life. The attacks led many individuals to build new kinds of beliefs and assumptions about themselves and the world. Many writers have written about the human ordeals that followed this incident. Don DeLillo's Falling Man reflects the traumatic repercussions of this disaster on Americans' fundamental assumptions. The objective of this study is to examine the novel from the perspective of the trauma that has afflicted the victims' fundamental understandings of the world and the self. Individuals' fundamental understandings can be changed or modified by exposure to certain types of events such as war, terrorism, political violence or even a sense of alienation. The Assumptive World theory of Ronnie Janoff-Bulman is used as a framework to study the traumatic experience of the characters in Falling Man. The significance of the study lies in providing a new perception in the field of trauma that can help trauma victims to adopt alternative assumptions or reshape their previous ones to heal from traumatic effects.

  13. Extracurricular Business Planning Competitions: Challenging the Assumptions

    Science.gov (United States)

    Watson, Kayleigh; McGowan, Pauric; Smith, Paul

    2014-01-01

    Business planning competitions [BPCs] are a commonly offered yet under-examined extracurricular activity. Given the extent of sceptical comment about business planning, this paper offers what the authors believe is a much-needed critical discussion of the assumptions that underpin the provision of such competitions. In doing so it is suggested…

  14. The Role of Policy Assumptions in Validating High-stakes Testing Programs.

    Science.gov (United States)

    Kane, Michael

    L. Cronbach has made the point that for validity arguments to be convincing to diverse audiences, they need to be based on assumptions that are credible to these audiences. The interpretations and uses of high stakes test scores rely on a number of policy assumptions about what should be taught in schools, and more specifically, about the content…

  15. BPA review of Washington Public Power Supply System, Projects 1 and 3 (WNP 1 and 3), construction schedule and financing assumptions

    International Nuclear Information System (INIS)

    1984-01-01

    This document contains the following appendices: Data provided By Supply System Regarding Costs and Schedules; Basic Supply System Data and Assumptions; Detailed Modeling of Net Present Values; Origin and Detailed Description of the System Analysis Mode; Decision Analysis Model; Pro Forma Budget Expenditure Levels for Fiscal years 1984 through 1990; Financial Flexibility Analysis - Discretionary/Nondiscretionary Expenditure Levels; Detailed Analysis of BPA's Debt Structure Under the 13 Pro Forma Budget Scenarios for Fiscal Years 1984 through 1990; Wertheim and Co., Inc., August 30, 1984 Letter; Project Considerations and Licensing/Regulatory Issues, Supply System September 15, 1984 Letter; and Summary of Litigation Affecting WNP 1 and 3, and WNP 4 and 5

  16. Ultrasound assisted evaluation of chest pain in the emergency department.

    Science.gov (United States)

    Colony, M Deborah; Edwards, Frank; Kellogg, Dylan

    2018-04-01

    Chest pain is a commonly encountered emergency department complaint, with a broad differential including several life-threatening conditions. Ultrasound-assisted evaluation can potentially be used to rapidly and accurately arrive at the correct diagnosis. We propose an organized, ultrasound-assisted evaluation of the patient with chest pain using a combination of ultrasound, echocardiography and clinical parameters. Basic echo techniques, which can be mastered by residents in a short time, are used together with standardized clinical questions and examination. Information is kept on a checklist. We hypothesize that this will result in a quicker, more accurate evaluation of chest pain in the ED, leading to timely treatment and disposition of the patient, less provider anxiety, a reduction in the number of diagnostic errors, and the removal of false assumptions from the diagnostic process. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. On the ontological assumptions of the medical model of psychiatry: philosophical considerations and pragmatic tasks

    Directory of Open Access Journals (Sweden)

    Giordano James

    2010-01-01

    Full Text Available Abstract A common theme in the contemporary medical model of psychiatry is that pathophysiological processes are centrally involved in the explanation, evaluation, and treatment of mental illnesses. Implied in this perspective is that clinical descriptors of these pathophysiological processes are sufficient to distinguish underlying etiologies. Psychiatric classification requires differentiation between what counts as normality (i.e., order), and what counts as abnormality (i.e., disorder). The distinction(s) between normality and pathology entail assumptions that are often deeply presupposed, manifesting themselves in statements about what mental disorders are. In this paper, we explicate that realism, naturalism, reductionism, and essentialism are core ontological assumptions of the medical model of psychiatry. We argue that while naturalism, realism, and reductionism can be reconciled with advances in contemporary neuroscience, essentialism - as defined to date - may be conceptually problematic, and we pose an eidetic construct of bio-psychosocial order and disorder based upon complex systems' dynamics. However we also caution against the overuse of any theory, and claim that practical distinctions are important to the establishment of clinical thresholds. We opine that as we move ahead toward both a new edition of the Diagnostic and Statistical Manual, and a proposed Decade of the Mind, the task at hand is to re-visit nosologic and ontologic assumptions pursuant to a re-formulation of diagnostic criteria and practice.

  18. On the ontological assumptions of the medical model of psychiatry: philosophical considerations and pragmatic tasks

    Science.gov (United States)

    2010-01-01

    A common theme in the contemporary medical model of psychiatry is that pathophysiological processes are centrally involved in the explanation, evaluation, and treatment of mental illnesses. Implied in this perspective is that clinical descriptors of these pathophysiological processes are sufficient to distinguish underlying etiologies. Psychiatric classification requires differentiation between what counts as normality (i.e.- order), and what counts as abnormality (i.e.- disorder). The distinction(s) between normality and pathology entail assumptions that are often deeply presupposed, manifesting themselves in statements about what mental disorders are. In this paper, we explicate that realism, naturalism, reductionism, and essentialism are core ontological assumptions of the medical model of psychiatry. We argue that while naturalism, realism, and reductionism can be reconciled with advances in contemporary neuroscience, essentialism - as defined to date - may be conceptually problematic, and we pose an eidetic construct of bio-psychosocial order and disorder based upon complex systems' dynamics. However we also caution against the overuse of any theory, and claim that practical distinctions are important to the establishment of clinical thresholds. We opine that as we move ahead toward both a new edition of the Diagnostic and Statistical Manual, and a proposed Decade of the Mind, the task at hand is to re-visit nosologic and ontologic assumptions pursuant to a re-formulation of diagnostic criteria and practice. PMID:20109176

  19. How to Handle Assumptions in Synthesis

    Directory of Open Access Journals (Sweden)

    Roderick Bloem

    2014-07-01

    Full Text Available The increased interest in reactive synthesis over the last decade has led to many improved solutions but also to many new questions. In this paper, we discuss the question of how to deal with assumptions on environment behavior. We present four goals that we think should be met and review several different possibilities that have been proposed. We argue that each of them falls short in at least one aspect.

  20. The incompressibility assumption in computational simulations of nasal airflow.

    Science.gov (United States)

    Cal, Ismael R; Cercos-Pita, Jose Luis; Duque, Daniel

    2017-06-01

    Most computational work on nasal airflow to date has assumed incompressibility, given the low Mach number of these flows. However, for high temperature gradients, the incompressibility assumption could lead to a loss of accuracy, due to the temperature dependence of air density and viscosity. In this article we aim to shed some light on the influence of this assumption in a model of calm breathing in an Asian nasal cavity, by solving the fluid flow equations in compressible and incompressible formulations for different ambient air temperatures using the OpenFOAM package. At low flow rates and warm climatological conditions, similar results were obtained from both approaches, showing that density variations need not be taken into account to obtain a good prediction of all flow features, at least for usual breathing conditions. This agrees with most of the simulations previously reported, at least as far as the incompressibility assumption is concerned. However, parameters like nasal resistance and wall shear stress distribution differ for air temperatures below [Formula: see text]C approximately. Therefore, density variations should be considered for simulations at such low temperatures.
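
    The temperature dependence the abstract points to can be quantified with textbook gas relations. The sketch below is illustrative only (the numbers are not from the paper): it uses the ideal gas law and Sutherland's law to estimate how strongly air density and viscosity differ between cold ambient air and air at body temperature, which is the effect the constant-property (incompressible) formulation neglects.

```python
# Illustrative estimate (not from the paper): variation of air density
# and viscosity with temperature, via the ideal gas law and Sutherland's
# law. Large relative variation across the inhaled-air temperature range
# suggests the constant-property assumption may lose accuracy.

def air_density(T_kelvin, p=101325.0, R=287.05):
    """Ideal-gas density of dry air [kg/m^3] at pressure p."""
    return p / (R * T_kelvin)

def air_viscosity(T_kelvin, mu_ref=1.716e-5, T_ref=273.15, S=110.4):
    """Dynamic viscosity of air [Pa*s] from Sutherland's law."""
    return mu_ref * (T_kelvin / T_ref) ** 1.5 * (T_ref + S) / (T_kelvin + S)

body_T = 273.15 + 34.0   # approximate nasal-cavity wall temperature
for ambient_C in (-10.0, 20.0):
    T_amb = 273.15 + ambient_C
    rho_ratio = air_density(T_amb) / air_density(body_T)
    mu_ratio = air_viscosity(T_amb) / air_viscosity(body_T)
    print(f"ambient {ambient_C:+.0f} C: density x{rho_ratio:.3f}, "
          f"viscosity x{mu_ratio:.3f} relative to body temperature")
```

    At -10 C ambient the density ratio exceeds 1.15, while at 20 C it is only a few percent, consistent with the abstract's finding that the incompressible formulation is adequate for warm conditions but questionable for cold ones.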

  1. Impacts of cloud overlap assumptions on radiative budgets and heating fields in convective regions

    Science.gov (United States)

    Wang, XiaoCong; Liu, YiMin; Bao, Qing

    2016-01-01

    Impacts of cloud overlap assumptions on radiative budgets and heating fields are explored with the aid of a cloud-resolving model (CRM), which provided cloud geometry as well as cloud micro and macro properties. Large-scale forcing data to drive the CRM are from TRMM Kwajalein Experiment and the Global Atmospheric Research Program's Atlantic Tropical Experiment field campaigns during which abundant convective systems were observed. The investigated overlap assumptions include those that were traditional and widely used in the past and the one that was recently addressed by Hogan and Illingworth (2000), in which the vertically projected cloud fraction is expressed by a linear combination of maximum and random overlap, with the weighting coefficient depending on the so-called decorrelation length Lcf. Results show that both shortwave and longwave cloud radiative forcings (SWCF/LWCF) are significantly underestimated under maximum (MO) and maximum-random (MRO) overlap assumptions, whereas remarkably overestimated under the random overlap (RO) assumption in comparison with that using CRM inherent cloud geometry. These biases can reach as high as 100 W m-2 for SWCF and 60 W m-2 for LWCF. By its very nature, the general overlap (GenO) assumption exhibits an encouraging performance on both SWCF and LWCF simulations, with the biases reduced almost 3-fold compared with traditional overlap assumptions. The superiority of the GenO assumption is also manifested in the simulation of shortwave and longwave radiative heating fields, which are either significantly overestimated or underestimated under traditional overlap assumptions. The study also pointed out the deficiency of assuming a constant Lcf in the GenO assumption. Further examinations indicate that the CRM diagnostic Lcf varies among different cloud types and tends to be stratified in the vertical. The new parameterization that takes into account variation of Lcf in the vertical well reproduces such a relationship and
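
    The general overlap combination described above can be sketched for the two-layer case. The following is a minimal illustration, assuming the standard Hogan-and-Illingworth form in which the combined cover is a blend of the maximum-overlap and random-overlap covers weighted by exp(-dz/Lcf); the function name and example numbers are my own, not from the paper.

```python
import math

def combined_cover_geno(c1, c2, dz, L_cf):
    """Total projected cloud cover of two layers under general overlap.

    Linear blend of maximum and random overlap, weighted by
    alpha = exp(-dz / L_cf), where dz is the vertical separation of the
    layers and L_cf the decorrelation length (Hogan & Illingworth, 2000).
    """
    c_max = max(c1, c2)            # maximum-overlap cover
    c_rand = c1 + c2 - c1 * c2    # random-overlap cover
    alpha = math.exp(-dz / L_cf)
    return alpha * c_max + (1.0 - alpha) * c_rand

# Two 50% cloud layers: total cover moves from 0.5 (maximum overlap,
# dz << L_cf) toward 0.75 (random overlap, dz >> L_cf).
for dz in (0.1, 2.0, 10.0):
    cover = combined_cover_geno(0.5, 0.5, dz, L_cf=2.0)
    print(f"dz = {dz:5.1f} km -> total cover {cover:.3f}")
```

    This makes the abstract's bias pattern intuitive: pure MO (alpha = 1) minimizes projected cover and thus underestimates cloud radiative forcing, pure RO (alpha = 0) maximizes it, and GenO interpolates depending on Lcf.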

  2. Forecasting Value-at-Risk under Different Distributional Assumptions

    Directory of Open Access Journals (Sweden)

    Manuela Braione

    2016-01-01

    Full Text Available Financial asset returns are known to be conditionally heteroskedastic and generally non-normally distributed, fat-tailed and often skewed. These features must be taken into account to produce accurate forecasts of Value-at-Risk (VaR. We provide a comprehensive look at the problem by considering the impact that different distributional assumptions have on the accuracy of both univariate and multivariate GARCH models in out-of-sample VaR prediction. The set of analyzed distributions comprises the normal, Student, Multivariate Exponential Power and their corresponding skewed counterparts. The accuracy of the VaR forecasts is assessed by implementing standard statistical backtesting procedures used to rank the different specifications. The results show the importance of allowing for heavy-tails and skewness in the distributional assumption with the skew-Student outperforming the others across all tests and confidence levels.
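
    The kind of comparison the abstract describes can be sketched in a few lines: compute a parametric VaR under a normal versus a Student-t distributional assumption and backtest the violation rate. This is a hedged illustration, not the paper's multivariate GARCH setup; it uses simulated fat-tailed returns and the standard Kupiec proportion-of-failures test as one example of a statistical backtesting procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated fat-tailed daily returns (a stand-in for real asset data).
returns = stats.t.rvs(df=4, scale=0.01, size=2000, random_state=rng)

def var_forecast(returns, level=0.01, dist="normal"):
    """One-day parametric VaR (a loss threshold, reported positive)."""
    mu, sigma = returns.mean(), returns.std(ddof=1)
    if dist == "normal":
        q = stats.norm.ppf(level, loc=mu, scale=sigma)
    else:  # Student-t fitted by maximum likelihood
        df, loc, scale = stats.t.fit(returns)
        q = stats.t.ppf(level, df, loc=loc, scale=scale)
    return -q

def kupiec_pof(returns, var, level=0.01):
    """Kupiec proportion-of-failures test: p-value for correct coverage."""
    n = len(returns)
    x = int((returns < -var).sum())      # number of VaR violations
    phat = x / n
    eps = 1e-12                           # guard against log(0)
    # Likelihood-ratio statistic, chi-squared with 1 dof under H0.
    lr = -2 * (x * np.log(level) + (n - x) * np.log(1 - level)
               - x * np.log(phat + eps) - (n - x) * np.log(1 - phat + eps))
    return x, 1 - stats.chi2.cdf(lr, df=1)

for dist in ("normal", "t"):
    var = var_forecast(returns, dist=dist)
    x, pval = kupiec_pof(returns, var)
    print(f"{dist:>6}: 1% VaR = {var:.4f}, violations = {x}, Kupiec p = {pval:.3f}")
```

    On fat-tailed data the normal assumption tends to understate the 1% VaR and accumulate excess violations, which is the mechanism behind the paper's finding that heavy-tailed (and skewed) specifications backtest better.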

  3. Evaluation of mid-to-long term basic research for environmental restoration

    International Nuclear Information System (INIS)

    1989-09-01

    This document describes a long-term basic research program for the US Department of Energy (DOE) that complements departmental initiatives in waste management and site cleanup. The most important problems faced by DOE are environmental restoration of waste sites and cleanup of inactive facilities. Environmental restoration is defined in this report as characterization, assessment, remediation, and post-closure verification within the waste/environmental system at DOE sites. Remediation of inactive, contaminated waste-disposal sites is the largest and most expensive task facing DOE. Immobilization, isolation, separation, and destruction of waste, either aboveground or in situ, are difficult and costly tasks. Technologies for these tasks are primitive or do not exist. Departmental problems in the long term are being analyzed scientifically and research needs are being identified. When completed, the Office of Energy Research's (OER's) basic research plan will describe potential scientific research needs for universities, national laboratories, and others as a basis for research proposals to DOE. Extensive interaction with the scientific community is planned to further refine and prioritize research needs. Basic research within DOE is directed toward fundamental knowledge leading to the discovery of new scientific or engineering concepts and principles that may or may not have immediate specific technological applications. However, because DOE is a mission-oriented agency, basic research in DOE is strongly influenced by national energy and environmental policy and may be multidisciplinary in nature. Basic research will provide innovative concepts and the fundamental knowledge base that facilitates the development and application of new and emerging technologies. 41 refs., 5 figs., 9 tabs

  4. False assumptions.

    Science.gov (United States)

    Swaminathan, M

    1997-01-01

    Indian women do not have to be told the benefits of breast feeding or "rescued from the clutches of wicked multinational companies" by international agencies. There is no proof that breast feeding has declined in India; in fact, a 1987 survey revealed that 98% of Indian women breast feed. Efforts to promote breast feeding among the middle classes rely on such initiatives as the "baby friendly" hospital where breast feeding is promoted immediately after birth. This ignores the 76% of Indian women who give birth at home. Blaming this unproved decline in breast feeding on multinational companies distracts attention from more far-reaching and intractable effects of social change. While the Infant Milk Substitutes Act is helpful, it also deflects attention from more pressing issues. Another false assumption is that Indian women are abandoning breast feeding to comply with the demands of employment, but research indicates that most women give up employment for breast feeding, despite the economic cost to their families. Women also seek work in the informal sector to secure the flexibility to meet their child care responsibilities. Instead of being concerned about "teaching" women what they already know about the benefits of breast feeding, efforts should be made to remove the constraints women face as a result of their multiple roles and to empower them with the support of families, governmental policies and legislation, employers, health professionals, and the media.

  5. 7 CFR 1980.476 - Transfer and assumptions.

    Science.gov (United States)

    2010-01-01

    ...-354 449-30 to recover its pro rata share of the actual loss at that time. In completing Form FmHA or... the lender on liquidations and property management. A. The State Director may approve all transfer and... Director will notify the Finance Office of all approved transfer and assumption cases on Form FmHA or its...

  6. Why we do what we do: a theoretical evaluation of the integrated practice model for forensic nursing science.

    Science.gov (United States)

    Valentine, Julie L

    2014-01-01

    An evaluation of the Integrated Practice Model for Forensic Nursing Science () is presented utilizing methods outlined by . A brief review of nursing theory basics and evaluation methods by Meleis is provided to enhance understanding of the ensuing theoretical evaluation and critique. The Integrated Practice Model for Forensic Nursing Science, created by forensic nursing pioneer Virginia Lynch, captures the theories, assumptions, concepts, and propositions inherent in forensic nursing practice and science. The historical background of the theory is explored as Lynch's model launched the role development of forensic nursing practice as both a nursing and forensic science specialty. It is derived from a combination of nursing, sociological, and philosophical theories to reflect the grounding of forensic nursing in the nursing, legal, psychological, and scientific communities. As Lynch's model is the first inception of forensic nursing theory, it is representative of a conceptual framework although the title implies a practice theory. The clarity and consistency displayed in the theory's structural components of assumptions, concepts, and propositions are analyzed. The model is described and evaluated. A summary of the strengths and limitations of the model is compiled followed by application to practice, education, and research with suggestions for ongoing theory development.

  7. Assumptions of Customer Knowledge Enablement in the Open Innovation Process

    Directory of Open Access Journals (Sweden)

    Jokubauskienė Raminta

    2017-08-01

    Full Text Available In the scientific literature, open innovation is one of the most effective means to innovate and gain a competitive advantage. In practice, there is a variety of open innovation activities, but customers nevertheless stand as the cornerstone in this area, since customers' knowledge is one of the most important sources of new knowledge and ideas. When evaluating the context in which open innovation and customer knowledge enablement interact, it is necessary to take into account the importance of customer knowledge management. It is increasingly highlighted that customer knowledge management facilitates the creation of innovations. However, other factors that influence open innovation, and at the same time customer knowledge management, should also be examined. This article presents a theoretical model, which reveals the assumptions of the open innovation process and their impact on the firm's performance.

  8. Interface Input/Output Automata: Splitting Assumptions from Guarantees

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Nyman, Ulrik; Wasowski, Andrzej

    2006-01-01

    's \\IOAs [11], relying on a context dependent notion of refinement based on relativized language inclusion. There are two main contributions of the work. First, we explicitly separate assumptions from guarantees, increasing the modeling power of the specification language and demonstrating an interesting...

  9. Technical note: Evaluation of the simultaneous measurements of mesospheric OH, HO2, and O3 under a photochemical equilibrium assumption - a statistical approach

    Science.gov (United States)

    Kulikov, Mikhail Y.; Nechaev, Anton A.; Belikovich, Mikhail V.; Ermakova, Tatiana S.; Feigin, Alexander M.

    2018-05-01

    This Technical Note presents a statistical approach to evaluating simultaneous measurements of several atmospheric components under the assumption of photochemical equilibrium. We consider simultaneous measurements of OH, HO2, and O3 at the altitudes of the mesosphere as a specific example and their daytime photochemical equilibrium as an evaluating relationship. A simplified algebraic equation relating local concentrations of these components in the 50-100 km altitude range has been derived. The parameters of the equation are temperature, neutral density, local zenith angle, and the rates of eight reactions. We have performed a one-year simulation of the mesosphere and lower thermosphere using a 3-D chemical-transport model. The simulation shows that the discrepancy between the calculated evolution of the components and the equilibrium value given by the equation does not exceed 3-4 % in the full range of altitudes independent of season or latitude. We have developed a statistical Bayesian evaluation technique for simultaneous measurements of OH, HO2, and O3 based on the equilibrium equation taking into account the measurement error. The first results of the application of the technique to MLS/Aura data (Microwave Limb Sounder) are presented in this Technical Note. It has been found that the satellite data of the HO2 distribution regularly demonstrate lower altitudes of this component's mesospheric maximum. This has also been confirmed by model HO2 distributions and comparison with offline retrieval of HO2 from the daily zonal mean MLS radiances.

  10. On the Basic Equations of the Magnetostatics

    Directory of Open Access Journals (Sweden)

    A. M. Makarov

    2016-01-01

    Full Text Available The paper studies the physical relationship between the main objects of the magnetic field in a continuous medium with magnetization effects. It considers, in turn, the following hypotheses: that the magnetization vector field of the medium is primary and physically real; a similar hypothesis about the real existence of Ampere (molecular, magnetization) currents; and the hypothesis of a magnetic dipole moment of a volume element of the medium in view of the bulk density of electric currents in that volume. A more rigorous derivation of the basic differential equations of magnetostatics from the Biot-Savart-Laplace equation is proposed. Well-known derivations of the basic equations of magnetostatics use a procedure in which, to prove a local differential relation, a volume integral is transformed into an integral over the surface bounding that volume. A specific choice of closed surface is then made: either a surface in a vacuum (beyond the medium volume under consideration) or the surface of the conductor (where the normal component of the currents vanishes). In this paper the control surface is drawn arbitrarily within the volume of the medium under consideration, leading to a mathematically sound result. The paper analyzes the hypotheses listed above. The main feature of the analysis is the consistent use of the concept of the two-sidedness of the surface bounding a medium volume of arbitrary finite dimensions. The analysis reveals the physical adequacy of the considered hypotheses, yields the appropriate differential equations for the basic vector fields of magnetostatics, and produces a new condition. The resulting closedness condition for magnetization currents is obtained in full compliance with the well-known Gauss law of electrostatics, which avoids the need for additional, not always well-founded assumptions.

  11. Impact of one-layer assumption on diffuse reflectance spectroscopy of skin

    Science.gov (United States)

    Hennessy, Ricky; Markey, Mia K.; Tunnell, James W.

    2015-02-01

    Diffuse reflectance spectroscopy (DRS) can be used to noninvasively measure skin properties. To extract skin properties from DRS spectra, a model is required that relates the reflectance to the tissue properties. Most models are based on the assumption that skin is homogeneous. In reality, skin is composed of multiple layers, and the homogeneity assumption can lead to errors. In this study, we analyze the errors caused by the homogeneity assumption. This is accomplished by creating realistic skin spectra using a computational model, then extracting properties from those spectra using a one-layer model. The extracted parameters are then compared to the parameters used to create the modeled spectra. We used a wavelength range of 400 to 750 nm and a source detector separation of 250 μm. Our results show that use of a one-layer skin model causes underestimation of hemoglobin concentration [Hb] and melanin concentration [mel]. Additionally, the magnitude of the error is dependent on epidermal thickness. The one-layer assumption also causes [Hb] and [mel] to be correlated. Oxygen saturation is overestimated when it is below 50% and underestimated when it is above 50%. We also found that the vessel radius factor used to account for pigment packaging is correlated with epidermal thickness.

  12. Modelling sexual transmission of HIV: testing the assumptions, validating the predictions

    Science.gov (United States)

    Baggaley, Rebecca F.; Fraser, Christophe

    2010-01-01

    Purpose of review To discuss the role of mathematical models of sexual transmission of HIV: the methods used and their impact. Recent findings We use mathematical modelling of “universal test and treat” as a case study to illustrate wider issues relevant to all modelling of sexual HIV transmission. Summary Mathematical models are used extensively in HIV epidemiology to deduce the logical conclusions arising from one or more sets of assumptions. Simple models lead to broad qualitative understanding, while complex models can encode more realistic assumptions and thus be used for predictive or operational purposes. An overreliance on model analysis where assumptions are untested and input parameters cannot be estimated should be avoided. Simple models providing bold assertions have provided compelling arguments in recent public health policy, but may not adequately reflect the uncertainty inherent in the analysis. PMID:20543600

  13. The crux of the method: assumptions in ordinary least squares and logistic regression.

    Science.gov (United States)

    Long, Rebecca G

    2008-10-01

    Logistic regression has increasingly become the tool of choice when analyzing data with a binary dependent variable. While resources relating to the technique are widely available, clear discussions of why logistic regression should be used in place of ordinary least squares regression are difficult to find. The current paper compares and contrasts the assumptions of ordinary least squares with those of logistic regression and explains why logistic regression's looser assumptions make it adept at handling violations of the more important assumptions in ordinary least squares.
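
    The contrast the abstract draws can be made concrete with a small numerical sketch (illustrative data and code, not from the paper): fitting a binary outcome by ordinary least squares (the "linear probability model") produces fitted probabilities outside [0, 1], while logistic regression, whose assumptions match the binary dependent variable, cannot.

```python
import numpy as np

rng = np.random.default_rng(42)

# Binary outcome generated from a logistic model (illustrative data).
n = 500
x = rng.normal(size=n)
p_true = 1 / (1 + np.exp(-(0.5 + 2.0 * x)))
y = rng.binomial(1, p_true)
X = np.column_stack([np.ones(n), x])

# Ordinary least squares: the "linear probability model".
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
fitted_ols = X @ beta_ols

# Logistic regression fitted by Newton-Raphson (IRLS).
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)                       # per-observation weights
    grad = X.T @ (y - p)
    hess = X.T @ (X * W[:, None])
    beta += np.linalg.solve(hess, grad)
fitted_logit = 1 / (1 + np.exp(-X @ beta))

print("OLS fitted values outside [0, 1]:",
      int(((fitted_ols < 0) | (fitted_ols > 1)).sum()))
print("Logistic fitted values outside [0, 1]:",
      int(((fitted_logit < 0) | (fitted_logit > 1)).sum()))
```

    The out-of-range OLS fits are one visible symptom of the violated assumptions (non-normal, heteroskedastic errors with a binary response) that the paper discusses; the logistic fits stay strictly inside (0, 1) by construction.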

  14. Efficient pseudorandom generators based on the DDH assumption

    NARCIS (Netherlands)

    Rezaeian Farashahi, R.; Schoenmakers, B.; Sidorenko, A.; Okamoto, T.; Wang, X.

    2007-01-01

    A family of pseudorandom generators based on the decisional Diffie-Hellman assumption is proposed. The new construction is a modified and generalized version of the Dual Elliptic Curve generator proposed by Barker and Kelsey. Although the original Dual Elliptic Curve generator is shown to be
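
    To give a flavor of what a group-based generator looks like, here is a deliberately toy sketch. It is NOT the construction from this paper (nor the actual Dual Elliptic Curve generator) and its parameters are far too small for any real use; it only mimics the shape of such generators, where a hidden state is advanced with one group element and the output is derived through a second, independent one, so that distinguishing the output from random relates to a Diffie-Hellman-type assumption.

```python
# Toy illustration only -- not cryptographically secure and not the
# paper's construction. State advances by exponentiation with G; output
# is derived via a second element H and truncated to its low bits.

P = 2**61 - 1          # toy prime modulus (a Mersenne prime)
G, H = 3, 7            # two fixed "group elements" (illustrative)

def prg(seed, n_outputs, out_bits=32):
    state = seed % P
    out = []
    for _ in range(n_outputs):
        state = pow(G, state, P)                # advance hidden state
        y = pow(H, state, P)                    # map through second element
        out.append(y & ((1 << out_bits) - 1))   # keep only low out_bits
    return out

print(prg(123456789, 5))
```

    Truncating the output so that the full group element (and hence the state) cannot be reconstructed is the step that the security argument of such generators hinges on.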

  15. [Preparation and evaluation of stationary phase of high performance liquid chromatography for the separation of basic solutes].

    Science.gov (United States)

    Wang, P; Wang, J; Cong, R; Dong, B

    1997-05-01

    A bonded phase for high performance liquid chromatography (HPLC) has been prepared by the new reaction between silica and silicon ether. The ether was synthesized from alkylchlorosilane and pentane-2,4-dione in the presence of imidazole under inert conditions by using anhydrous tetrahydrofuran as solvent. The bonded phase thus obtained was characterized by elemental analysis, diffuse reflectance infrared Fourier transform (DRIFT) spectroscopy and HPLC evaluation. The carbon content was 9.4% and the surface coverage nearly attained 3.0 μmol/m2 without end-capping. The silanol absorption peaks of the product cannot be observed in the DRIFT spectrum, which revealed that the silanization reaction proceeded thoroughly. Basic solutes, such as aniline, o-toluidine, p-toluidine, N,N-dimethylaniline and pyridine, were used as probe solutes to examine their interaction with the residual silanols on the surface of the products. No buffer or salt was used in the mobile phase for these experiments. In comparison with an acidic solute such as phenol, basic aniline eluted in front of phenol, and the ratio of the asymmetry of the aniline peak to that of the phenol peak was 1.1. Furthermore, the relative k' value of p-toluidine to that of o-toluidine was also 1.1. All the results showed that the stationary phase has good quality and reproducibility and can be used for the efficient separation of basic solutes.

  16. Contemporary assumptions on human nature and work and approach to human potential managing

    Directory of Open Access Journals (Sweden)

    Vujić Dobrila

    2006-01-01

    Full Text Available A general problem of this research is to identify whether there is a relationship between assumptions on human nature and work (McGregor, Argyris, Schein, Steers and Porter) and a preference for a general organizational model, as well as for mechanisms of human resource management. The research was carried out in 2005/2006. The sample consisted of 317 subjects (197 managers, 105 highly educated subordinates and 15 entrepreneurs) in 7 big enterprises and a group of small business enterprises differing in terms of entrepreneurial structure and type of activity. A general hypothesis, that assumptions on human nature and work are statistically significantly connected to the preferred approach (model) to work motivation and commitment, has been confirmed. Specific hypotheses have also been confirmed: assumptions on the human as a rational economic being correlate statistically significantly with only two mechanisms of the traditional model, the mechanism of work-method control and the working-discipline mechanism; assumptions on the human as a social being correlate statistically significantly with all mechanisms of engaging employees belonging to the human relations model, except the mechanism of introducing an adequate type of reward for all employees independently of working results; assumptions on the human as a creative being correlate statistically significantly and positively with the preference for two mechanisms belonging to the human resource model, investing in education and training and creating conditions for the application of knowledge and skills. Young subjects with assumptions on the human as a creative being prefer a much broader repertoire of mechanisms belonging to the human resources model than the remaining categories of subjects in the sample. The connection between assumptions on human nature and preferred models of engagement appears especially in the sub-sample of managers and in the category of young subjects

  17. Limiting assumptions in molecular modeling: electrostatics.

    Science.gov (United States)

    Marshall, Garland R

    2013-02-01

    Molecular mechanics attempts to represent intermolecular interactions in terms of classical physics. Initial efforts assumed a point charge located at the atom center and coulombic interactions. It has been recognized over multiple decades that simply representing electrostatics with a charge on each atom fails to reproduce the electrostatic potential surrounding a molecule as estimated by quantum mechanics. Molecular orbitals are not spherically symmetrical, an implicit assumption of monopole electrostatics. This perspective reviews recent evidence that requires the use of multipole electrostatics and polarizability in molecular modeling.
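
    The limitation of monopole electrostatics can be shown with a minimal sketch (illustrative numbers, not from the article): for a neutral two-charge "molecule", the monopole (net-charge) approximation predicts zero potential everywhere, while the exact potential is nonzero and is captured, to leading order, by the dipole term.

```python
import numpy as np

# Exact potential of a neutral two-charge system versus its monopole and
# dipole approximations, in units where 1/(4*pi*eps0) = 1.

q, d = 1.0, 0.2                          # charge magnitude and separation
charges = [(+q, np.array([0.0, 0.0, +d / 2])),
           (-q, np.array([0.0, 0.0, -d / 2]))]

def exact_potential(r):
    return sum(qi / np.linalg.norm(r - ri) for qi, ri in charges)

def monopole_potential(r):
    net = sum(qi for qi, _ in charges)    # zero for a neutral molecule
    return net / np.linalg.norm(r)

def dipole_potential(r):
    p_vec = sum(qi * ri for qi, ri in charges)   # dipole moment, |p| = q*d
    rn = np.linalg.norm(r)
    return p_vec @ r / rn**3

r = np.array([0.0, 0.0, 2.0])             # observation point on the axis
print(f"exact    = {exact_potential(r):+.5f}")
print(f"monopole = {monopole_potential(r):+.5f}")  # exactly zero
print(f"dipole   = {dipole_potential(r):+.5f}")    # close to exact
```

    A single point charge per "atom" at the center would miss this field entirely; adding the dipole (and higher multipole) terms recovers it, which is the quantitative motivation behind the multipole electrostatics the perspective reviews.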

  18. VQABQ: Visual Question Answering by Basic Questions

    KAUST Repository

    Huang, Jia-Hong

    2017-03-19

    Taking an image and a question as input, our method can output the text-based answer to the query question about the given image, the task called Visual Question Answering (VQA). There are two main modules in our algorithm. Given a natural language question about an image, the first module takes the question as input and then outputs the basic questions of the main given question. The second module takes the main question, image and these basic questions as input and then outputs the text-based answer of the main question. We formulate the basic questions generation problem as a LASSO optimization problem, and also propose a criterion for how to exploit these basic questions to help answer the main question. Our method is evaluated on the challenging VQA dataset and yields state-of-the-art accuracy, 60.34% in the open-ended task.
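
    The LASSO formulation mentioned above can be sketched generically. The code below is a hypothetical illustration, not the paper's implementation: it encodes a "main question" vector as a sparse combination of candidate basic-question embeddings by solving the LASSO problem min_x 0.5*||b - Ax||^2 + lam*||x||_1 with ISTA (iterative soft-thresholding); the embeddings are random stand-ins for real question features, and only the sparse-selection mechanism is shown.

```python
import numpy as np

def lasso_ista(A, b, lam=0.1, n_iter=500):
    """Solve min_x 0.5*||b - Ax||^2 + lam*||x||_1 by ISTA."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)              # gradient of the smooth term
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(64, 20))              # 20 candidate basic-question embeddings
b = A[:, 3] * 0.9 + A[:, 7] * 0.6          # "main question" built from two of them
weights = lasso_ista(A, b, lam=1.0)
selected = np.argsort(-np.abs(weights))[:3]
print("top basic questions:", selected, "weights:", np.round(weights[selected], 3))
```

    The L1 penalty drives most weights exactly to zero, so the nonzero entries pick out the few basic questions most relevant to the main one, which is the role such a selection step would play in the pipeline described.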

  19. VQABQ: Visual Question Answering by Basic Questions

    KAUST Repository

    Huang, Jia-Hong; Alfadly, Modar; Ghanem, Bernard

    2017-01-01

    Taking an image and a question as input, our method can output the text-based answer to the query question about the given image, the task called Visual Question Answering (VQA). There are two main modules in our algorithm. Given a natural language question about an image, the first module takes the question as input and then outputs the basic questions of the main given question. The second module takes the main question, image and these basic questions as input and then outputs the text-based answer of the main question. We formulate the basic questions generation problem as a LASSO optimization problem, and also propose a criterion for how to exploit these basic questions to help answer the main question. Our method is evaluated on the challenging VQA dataset and yields state-of-the-art accuracy, 60.34% in the open-ended task.

  20. Tank waste remediation system retrieval and disposal mission key enabling assumptions

    International Nuclear Information System (INIS)

    Baldwin, J.H.

    1998-01-01

    An overall systems approach has been applied to develop action plans to support the retrieval and immobilization waste disposal mission. The review concluded that the systems and infrastructure required to support the mission are known. Required systems are either in place or plans have been developed. An analysis of the programmatic, management and technical activities necessary to declare Readiness to Proceed with execution of the mission demonstrates that the system, people, and hardware will be on line and ready to support the private contractors. The systems approach included defining the retrieval and immobilized waste disposal mission requirements and evaluating the readiness of the TWRS contractor to supply waste feed to the private contractors in June 2002. The Phase 1 feed delivery requirements from the Private Contractor Request for Proposals were reviewed, transfer piping routes were mapped out, existing systems were evaluated, and upgrade requirements were defined. Technical Basis Reviews were completed to define work scope in greater detail, and cost estimates and associated year-by-year financial analyses were completed. Personnel training, qualifications, management systems and procedures were reviewed and shown to be in place and ready to support the Phase 1B mission. Key assumptions and risks that could negatively impact mission success were evaluated, and appropriate mitigative action plans were developed and scheduled

  1. Sensitivity of TRIM projections to management, harvest, yield, and stocking adjustment assumptions.

    Science.gov (United States)

    Susan J. Alexander

    1991-01-01

    The Timber Resource Inventory Model (TRIM) was used to make several projections of forest industry timber supply for the Douglas-fir region. The sensitivity of these projections to assumptions about management and yields is discussed. A base run is compared to runs in which yields were altered, stocking adjustment was eliminated, harvest assumptions were changed, and...

  2. Educational Technology as a Subversive Activity: Questioning Assumptions Related to Teaching and Leading with Technology

    Science.gov (United States)

    Kruger-Ross, Matthew J.; Holcomb, Lori B.

    2012-01-01

    The use of educational technologies is grounded in the assumptions of teachers, learners, and administrators. Assumptions are choices that structure our understandings and help us make meaning. Current advances in Web 2.0 and social media technologies challenge our assumptions about teaching and learning. The intersection of technology and…

  3. Basic tests on integrity evaluation for natural hexafluoride transporting container

    International Nuclear Information System (INIS)

    Gomi, Yoshio; Yamakawa, Hidetsugu; Kato, Osamu; Kobayashi, Seiichi

    1990-01-01

    In this study, the factors needed for the integrity evaluation of the UF6-transporting 48Y cylinder were confirmed by basic tests and preliminary analysis. These factors were the sealing parts and the external surface emissivity, which govern both the behavior under fire-accident conditions and the fire-resistance capability of the cylinder, and the external-pressure resistance capability in a sinking accident. The results obtained were as follows. (1) In confirmation tests of the fire resistance of the cylinder valve and plug, seat leakage of the valve occurred at 150 degrees C, caused by unequal thermal expansion between the valve body and the stem. The tin-lead solder coating the tapered thread of the valve and plug melted at 200 degrees C, and the sealing boundary then broke. (2) The influence of external emissivity on radiative heat transfer was measured with test pieces heated in an electric oven. The paint covering the specimens burned and separated, and the emissivity changed from 0.4 to 0.6, depending on the surrounding temperature. A type 48Y cylinder filled with 12.5 tons of UF6 was assumed, and the measured emissivity was used in the computer-code analysis. Hydraulic breaking did not occur under the fire-accident condition of 800 degrees C for 30 minutes. (3) In the external pressure test, the valve endured the hydrostatic pressure at 3000 meters, which corresponds to about five times the buckling strength of the cylinder body. (author)

  4. Rethinking our assumptions about the evolution of bird song and other sexually dimorphic signals

    Directory of Open Access Journals (Sweden)

    J. Jordan Price

    2015-04-01

    Full Text Available Bird song is often cited as a classic example of a sexually-selected ornament, in part because historically it has been considered a primarily male trait. Recent evidence that females also sing in many songbird species and that sexual dimorphism in song is often the result of losses in females rather than gains in males therefore appears to challenge our understanding of the evolution of bird song through sexual selection. Here I propose that these new findings do not necessarily contradict previous research, but rather they disagree with some of our assumptions about the evolution of sexual dimorphisms in general and female song in particular. These include misconceptions that current patterns of elaboration and diversity in each sex reflect past rates of change and that levels of sexual dimorphism necessarily reflect levels of sexual selection. Using New World blackbirds (Icteridae as an example, I critically evaluate these past assumptions in light of new phylogenetic evidence. Understanding the mechanisms underlying such sexually dimorphic traits requires a clear understanding of their evolutionary histories. Only then can we begin to ask the right questions.

  5. Child Development Knowledge and Teacher Preparation: Confronting Assumptions.

    Science.gov (United States)

    Katz, Lilian G.

    This paper questions the widely held assumption that acquiring knowledge of child development is an essential part of teacher preparation and teaching competence, especially among teachers of young children. After discussing the influence of culture, parenting style, and teaching style on developmental expectations and outcomes, the paper asserts…

  6. Evaluation of the assumption of continuity: outline of a new tool

    OpenAIRE

    Inácio, Helena Coelho; Serrano Moracho, Francisco

    2010-01-01

    The evaluation of going concern is one of the most visible elements of the auditor's report. Auditors are often criticized for their failure to identify going-concern red flags. The auditor's report does not always have the effects we expect, but there is evidence of some effects, and it is an additional element to be considered at the moment of a decision. For these reasons, some statistical models have been developed to help auditors in the evaluation of going-concern...

  7. The sufficiency assumption of the reasoned approach to action

    Directory of Open Access Journals (Sweden)

    David Trafimow

    2015-12-01

    Full Text Available The reasoned action approach to understanding and predicting behavior includes the sufficiency assumption. Although variables not included in the theory may influence behavior, these variables work through the variables in the theory. Once the reasoned action variables are included in an analysis, the inclusion of other variables will not increase the variance accounted for in behavioral intentions or behavior. Reasoned action researchers are very concerned with testing whether new variables account for variance (or how much variance traditional variables account for) to see whether they are important, in general or with respect to specific behaviors under investigation. But this approach tacitly assumes that accounting for variance is highly relevant to understanding the production of variance, which is what is really at issue. Based on the variance law, I question this assumption.

  8. 38 CFR 3.315 - Basic eligibility determinations; dependents, loans, education.

    Science.gov (United States)

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Basic eligibility determinations; dependents, loans, education. 3.315 Section 3.315 Pensions, Bonuses, and Veterans' Relief... Ratings and Evaluations; Service Connection § 3.315 Basic eligibility determinations; dependents, loans...

  9. Ontological, Epistemological and Methodological Assumptions: Qualitative versus Quantitative

    Science.gov (United States)

    Ahmed, Abdelhamid

    2008-01-01

    The review to follow is a comparative analysis of two studies conducted in the field of TESOL in Education published in "TESOL QUARTERLY." The aspects to be compared are as follows. First, a brief description of each study will be presented. Second, the ontological, epistemological and methodological assumptions underlying each study…

  10. Making Predictions about Chemical Reactivity: Assumptions and Heuristics

    Science.gov (United States)

    Maeyer, Jenine; Talanquer, Vicente

    2013-01-01

    Diverse implicit cognitive elements seem to support but also constrain reasoning in different domains. Many of these cognitive constraints can be thought of as either implicit assumptions about the nature of things or reasoning heuristics for decision-making. In this study we applied this framework to investigate college students' understanding of…

  11. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests! Explaining Different Arrival Times. [Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones] Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics. Intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source. Delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect. Special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong. This, too, would cause photon velocities to be energy-dependent. Delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect. If we now turn this problem around, then by measuring the arrival time delay between photons of different energies from various astrophysical sources (the further away, the better) we can provide constraints on these
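
The special-relativistic delay described above can be made concrete with a small numerical sketch. This is my own illustration of the standard kinematics, ignoring cosmological corrections; the distance, photon-mass scale, and photon energies below are invented for illustration and are not taken from the studies:

```python
# Hedged sketch of the special-relativistic delay: if photons had a tiny
# rest mass m, their speed would be energy dependent,
#     v(E) ~ c * (1 - (m c^2)^2 / (2 E^2)),
# so over a distance D the lower-energy photon lags the higher-energy one by
#     dt = (D / c) * ((m c^2)^2 / 2) * (1/E_lo^2 - 1/E_hi^2).
C = 2.998e8        # speed of light, m/s
GPC_M = 3.086e25   # metres per gigaparsec

def mass_delay_s(d_gpc, m_ev, e_lo_ev, e_hi_ev):
    """Extra flight time (s) of the lower-energy photon; mass and energies in eV."""
    d_m = d_gpc * GPC_M
    return (d_m / C) * (m_ev ** 2 / 2.0) * (1.0 / e_lo_ev ** 2 - 1.0 / e_hi_ev ** 2)

# Illustrative inputs only: a 1 Gpc source, a photon-mass scale of 1e-18 eV,
# and two radio photons of 4e-6 eV (~1 GHz) and 4e-5 eV (~10 GHz).
dt = mass_delay_s(1.0, 1e-18, 4e-6, 4e-5)
print(f"mass-induced delay ~ {dt:.2e} s")
```

The delay is minuscule even over gigaparsec distances, which is why sharp, distant transients are needed to constrain such effects.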

  12. Dialogic or Dialectic? The Significance of Ontological Assumptions in Research on Educational Dialogue

    Science.gov (United States)

    Wegerif, Rupert

    2008-01-01

    This article explores the relationship between ontological assumptions and studies of educational dialogue through a focus on Bakhtin's "dialogic". The term dialogic is frequently appropriated to a modernist framework of assumptions, in particular the neo-Vygotskian or sociocultural tradition. However, Vygotsky's theory of education is dialectic,…

  13. Supporting calculations and assumptions for use in WESF safetyanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Hey, B.E.

    1997-03-07

    This document provides a single location for calculations and assumptions used in support of Waste Encapsulation and Storage Facility (WESF) safety analyses. It also provides the technical details and bases necessary to justify the contained results.

  14. [Evaluative study of nursing consultation in the basic networks of Curitiba, Brazil].

    Science.gov (United States)

    da Silva, Sandra Honorato; Cubas, Marcia Regina; Fedalto, Maira Aparecida; da Silva, Sandra Regina; Limas, Thaís Cristina da Costa

    2010-03-01

    The implementation of the electronic health record in the basic networks of Curitiba enabled an advance in the implementation of the nursing consultation and the ICNPCH, whose modeling uses the ICNP axes structure and the ICNPCH list of actions. The objective of this study was to evaluate the nursing consultation from the productivity and assistance coverage perspective. The studied population was obtained from a secondary database of nursing consultations from April to June of 2005. The analysis was performed using the Datawarehouse and OLAP tool. The productivity per professional was found to be 2.5 consultations per day. Professionals use 16% of their daily work time on this activity and up to 27% of their potential per month. The ICNPCH was used in 21% of the consultations. Coverage was 0.08 consultations per inhabitant for 6% of the population. The nursing consultation makes it possible to characterize the nurses' role in health care and a new professional position capable of influencing the construction of public policies.

  15. Evaluating the assessment of essay type questions in the basic ...

    African Journals Online (AJOL)

    Methodology: We examined the merits and demerits of the closed and open systems of assessment of essay type questions and viva voce in professional exams in the Basic Medical Sciences together with the challenges of present day Medical Education. Result: The result showed that the closed system of marking in its ...

  16. Breakdown of Hydrostatic Assumption in Tidal Channel with Scour Holes

    Directory of Open Access Journals (Sweden)

    Chunyan Li

    2016-10-01

    Full Text Available Hydrostatic condition is a common assumption in tidal and subtidal motions in oceans and estuaries, and theories based on this assumption have been largely successful. However, there are no definite criteria separating the hydrostatic from the non-hydrostatic regimes in real applications, because real problems often have multiple scales. With increased refinement of high-resolution numerical models encompassing smaller and smaller spatial scales, the need for non-hydrostatic models is increasing. To evaluate the vertical motion over bathymetric changes in tidal channels and assess the validity of the hydrostatic approximation, we conducted observations using a vessel-based acoustic Doppler current profiler (ADCP). Observations were made along a straight channel 18 times over two scour holes 25 m deep, separated by 330 m, in and out of an otherwise flat 8 m deep tidal pass leading to Lake Pontchartrain, over a time period of 8 hours covering part of the diurnal tidal cycle. In 11 of the 18 passages over the scour holes, strong upwelling and downwelling resulted in the breakdown of the hydrostatic condition. The maximum observed vertical velocity was ~0.35 m/s, a high value for a tidal channel, and the estimated vertical acceleration reached a high value of 1.76×10⁻² m/s². Analysis demonstrated that the barotropic non-hydrostatic acceleration was dominant. The cause of the non-hydrostatic flow was the flow over the steep slopes of the scour holes. This demonstrates that in such a system the bathymetric variation can lead to the breakdown of hydrostatic conditions. Models with hydrostatic restrictions will not be able to correctly capture the dynamics in such a system with significant bathymetric variations, particularly during strong tidal currents.
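
A quick scale analysis using only numbers quoted in the abstract shows why the vertical acceleration matters here. This is my own sketch: the advective scaling Dw/Dt ~ w²/H is a standard rough estimate, and the choice of length scale H is an assumption:

```python
# Crude scale estimate of the vertical acceleration Dw/Dt, the term the
# hydrostatic approximation neglects. Observed numbers from the abstract;
# the scaling Dw/Dt ~ w^2 / H and the choice of H are my assumptions.
w_max = 0.35                 # m/s, maximum observed vertical velocity
depth_change = 25.0 - 8.0    # m, scour-hole depth relative to the flat pass

dwdt_estimate = w_max ** 2 / depth_change    # m/s^2, advective scale estimate
dwdt_reported = 1.76e-2                      # m/s^2, value quoted in the abstract

print(f"scale estimate {dwdt_estimate:.2e} m/s^2 vs reported {dwdt_reported:.2e} m/s^2")
```

The two agree to within a factor of a few, consistent with the claim that the acceleration term is not negligible over the scour holes.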

  17. Pre-training evaluation and feedback improved skills retention of basic life support in medical students.

    Science.gov (United States)

    Li, Qi; Zhou, Rong-hua; Liu, Jin; Lin, Jing; Ma, Er-Li; Liang, Peng; Shi, Ting-wei; Fang, Li-qun; Xiao, Hong

    2013-09-01

    Pre-training evaluation and feedback have been shown to improve medical students' skills acquisition of basic life support (BLS) immediately following training. The impact of such training on BLS skills retention is unknown. This study was conducted to investigate the effects of pre-training evaluation and feedback on BLS skills retention in medical students. Three hundred and thirty 3rd year medical students were randomized to two groups, the control group (C group) and the pre-training evaluation and feedback group (EF group). Each group was subdivided into four subgroups according to the time of the retention-test (at 1, 3, 6, and 12 months following the initial training). After a 45-min BLS lecture, BLS skills were assessed (pre-training evaluation) in both groups before training. Following this, the C group received 45 min of training. In the EF group only, 15 min of group feedback corresponding to students' performance in the pre-training evaluation was given, followed by 30 min of BLS training. BLS skills were assessed immediately after training (post-test) and at follow up (retention-test). No skills difference was observed between the two groups in the pre-training evaluation. Better skills acquisition was observed in the EF group (85.3 ± 7.3 vs. 68.1 ± 12.2 in the C group) at post-test (p<0.001). In all retention-tests, better skills retention was observed in each EF subgroup, compared with its paired C subgroup. Pre-training evaluation and feedback improved skills retention in the EF group for 12 months after the initial training, compared with the control group. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
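
As a worked illustration of the size of the post-test difference, a standardized effect size can be computed from the means and standard deviations quoted above. This is my own calculation, not one reported by the authors, and it assumes equal group sizes so the pooled SD is the root mean square of the two SDs:

```python
# Cohen's d for the post-test scores quoted in the abstract.
# Assumption (mine): equal group sizes, so pooled SD = RMS of the two SDs.
import math

mean_ef, sd_ef = 85.3, 7.3   # evaluation-and-feedback (EF) group
mean_c, sd_c = 68.1, 12.2    # control (C) group

pooled_sd = math.sqrt((sd_ef ** 2 + sd_c ** 2) / 2.0)
cohens_d = (mean_ef - mean_c) / pooled_sd
print(f"Cohen's d ~ {cohens_d:.2f}")
```

By the conventional benchmark (d above 0.8 is "large"), the post-test gap reported here is a substantial effect.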

  18. Hygiene Basics

    Science.gov (United States)


  19. Towards New Probabilistic Assumptions in Business Intelligence

    OpenAIRE

    Schumann Andrew; Szelc Andrzej

    2015-01-01

    One of the main assumptions of mathematical tools in science is represented by the idea of measurability and additivity of reality. For discovering the physical universe additive measures such as mass, force, energy, temperature, etc. are used. Economics and conventional business intelligence try to continue this empiricist tradition and in statistical and econometric tools they appeal only to the measurable aspects of reality. However, a lot of important variables of economic systems cannot ...

  20. Expressing Environment Assumptions and Real-time Requirements for a Distributed Embedded System with Shared Variables

    DEFF Research Database (Denmark)

    Tjell, Simon; Fernandes, João Miguel

    2008-01-01

    In a distributed embedded system, it is often necessary to share variables among its computing nodes to allow the distribution of control algorithms. It is therefore necessary to include a component in each node that provides the service of variable sharing. For that type of component, this paper...... for the component. The CPN model can be used to validate the environment assumptions and the requirements. The validation is performed by execution of the model during which traces of events and states are automatically generated and evaluated against the requirements....

  1. Adopting Basic Principles of the United Nations Academic Impact Initiative (UNAI): Can Cultural Differences Be Predicted from Value Orientations and Globalization?

    Science.gov (United States)

    Nechtelberger, Andrea; Renner, Walter; Nechtelberger, Martin; Supeková, Soňa Chovanová; Hadjimarkou, Maria; Offurum, Chino; Ramalingam, Panchalan; Senft, Birgit; Redfern, Kylie

    2017-01-01

    The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10 item self-report questionnaire measuring personal endorsement of these principles has been tested by self-report questionnaires with university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub-samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub-samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary, should be accompanied by thorough informational campaigns about UN educational goals. PMID:29180977

  4. Interpretation of hydraulic conductivity data and parameter evaluation for groundwater flow models

    International Nuclear Information System (INIS)

    Niemi, A.

    1991-01-01

    The report reviews recent developments in evaluating effective permeabilities for groundwater flow models, starting from methods of well-test interpretation and proceeding to the principles of parameter estimation. Basic concepts of parameter evaluation as well as expressions derived for effective permeabilities in traditional porous media are described. Due to the assumptions made, these often do not apply to fractured media. Specific features of fractured media are discussed, including approaches used to determine the size of a possible REV (representative elementary volume) and questions related to the application of stochastic theories. Due to the difficulties encountered when applying traditional deterministic models to fractured media, stochastic and fracture network approaches have been developed. The application of these techniques is still under development, the main questions to be resolved being related to the scarcity of data
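
As a concrete illustration of what effective-permeability expressions look like in simple cases, the sketch below computes the classical layered-medium bounds. These are textbook results, not the specific expressions derived in the report, and the layer permeabilities are hypothetical:

```python
# Classical bounds on effective permeability for a layered medium:
# flow parallel to layers -> arithmetic mean; flow across layers ->
# harmonic mean; the geometric mean is often used for 2-D heterogeneous
# media. Any effective value lies between the harmonic and arithmetic bounds.
import math

k = [1e-14, 5e-13, 2e-12]   # m^2, hypothetical layer permeabilities

arith = sum(k) / len(k)
harm = len(k) / sum(1.0 / ki for ki in k)
geom = math.exp(sum(math.log(ki) for ki in k) / len(k))

assert harm <= geom <= arith
print(f"harmonic {harm:.2e} <= geometric {geom:.2e} <= arithmetic {arith:.2e}")
```

For fractured media, as the report notes, such porous-medium averages may not apply at all, which motivates the stochastic and fracture-network approaches.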

  5. Thinking science with thinking machines: The multiple realities of basic and applied knowledge in a research border zone.

    Science.gov (United States)

    Hoffman, Steve G

    2015-04-01

    Some scholars dismiss the distinction between basic and applied science as passé, yet substantive assumptions about this boundary remain obdurate in research policy, popular rhetoric, the sociology and philosophy of science, and, indeed, at the level of bench practice. In this article, I draw on a multiple ontology framework to provide a more stable affirmation of a constructivist position in science and technology studies that cannot be reduced to a matter of competing perspectives on a single reality. The analysis is grounded in ethnographic research in the border zone of Artificial Intelligence science. I translate in-situ moments in which members of neighboring but differently situated labs engage in three distinct repertoires that render the reality of basic and applied science: partitioning, flipping, and collapsing. While the essences of scientific objects are nowhere to be found, the boundary between basic and applied is neither illusion nor mere propaganda. Instead, distinctions among scientific knowledge are made real as a matter of course.

  6. A dedicated breast-PET/CT scanner: Evaluation of basic performance characteristics.

    Science.gov (United States)

    Raylman, Raymond R; Van Kampen, Will; Stolin, Alexander V; Gong, Wenbo; Jaliparthi, Gangadhar; Martone, Peter F; Smith, Mark F; Sarment, David; Clinthorne, Neal H; Perna, Mark

    2018-04-01

    Application of advanced imaging techniques, such as PET and x-ray CT, can potentially improve detection of breast cancer. Unfortunately, both modalities have challenges in the detection of some lesions. The combination of the two techniques, however, could potentially lead to an overall improvement in diagnostic breast imaging. The purpose of this investigation is to test the basic performance of a new dedicated breast-PET/CT. The PET component consists of a rotating pair of detectors. Its performance was evaluated using the NEMA NU4-2008 protocols. The CT component utilizes a pulsed x-ray source and flat panel detector mounted on the same gantry as the PET scanner. Its performance was assessed using specialized phantoms. The radiation dose to a breast during CT imaging was explored by the measurement of free-in-air kerma and air kerma measured at the center of a 16 cm-diameter PMMA cylinder. Finally, the combined capabilities of the system were demonstrated by imaging of a micro-hot-rod phantom. Overall, performance of the PET component is comparable to many pre-clinical and other dedicated breast-PET scanners. Its spatial resolution is 2.2 mm, 5 mm from the center of the scanner, using images created with the single-sliced-filtered-backprojection algorithm. Peak NECR is 24.6 kcps; peak sensitivity is 1.36%; the scatter fraction is 27%. Spatial resolution of the CT scanner is 1.1 lp/mm at 10% MTF. The free-in-air kerma is 2.33 mGy, while the PMMA-air kerma is 1.24 mGy. Finally, combined imaging of a micro-hot-rod phantom illustrated the potential utility of the dual-modality images produced by the system. The basic performance characteristics of a new dedicated breast-PET/CT scanner are good, demonstrating that its performance is similar to current dedicated PET and CT scanners. The potential value of this system is the capability to produce combined dual-modality images that could improve detection of breast disease. The next stage in development of this system

  7. Halo-Independent Direct Detection Analyses Without Mass Assumptions

    CERN Document Server

    Anderson, Adam J.; Kahn, Yonatan; McCullough, Matthew

    2015-10-06

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the $m_\chi-\sigma_n$ plane. Recently methods which are independent of the DM halo velocity distribution have been developed which present results in the $v_{min}-\tilde{g}$ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from $v_{min}$ to nuclear recoil momentum ($p_R$), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call $\tilde{h}(p_R)$. The entire family of conventional halo-independent $\tilde{g}(v_{min})$ plots for all DM masses are directly found from the single $\tilde{h}(p_R)$ plot through a simple re...
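
The change of variables described above rests on standard elastic-scattering kinematics: $v_{min} = p_R / (2\mu)$, with $p_R = \sqrt{2 m_N E_R}$ and $\mu$ the DM-nucleus reduced mass. The sketch below is my own illustration of why the dark matter mass enters only through $\mu$; the target mass and recoil energy are illustrative values, not the paper's:

```python
# Standard elastic-scattering kinematics behind the v_min -> p_R change of
# variables. At fixed recoil momentum p_R, all m_chi dependence sits in the
# reduced mass mu, which is why fixing p_R removes the fiducial mass choice.
import math

def v_min_km_s(e_r_kev, m_n_gev, m_chi_gev):
    """Minimum DM speed (km/s) for an elastic recoil; natural-unit bookkeeping."""
    C_KM_S = 2.998e5
    e_r_gev = e_r_kev * 1e-6
    p_r = math.sqrt(2.0 * m_n_gev * e_r_gev)          # recoil momentum, GeV
    mu = m_n_gev * m_chi_gev / (m_n_gev + m_chi_gev)  # reduced mass, GeV
    return C_KM_S * p_r / (2.0 * mu)

# Illustrative only: a 10 keV recoil on a xenon-like target (m_N ~ 122 GeV)
# for two hypothetical DM masses.
for m_chi in (10.0, 100.0):
    print(f"m_chi = {m_chi:5.1f} GeV -> v_min ~ {v_min_km_s(10.0, 122.0, m_chi):.0f} km/s")
```

The same recoil momentum maps to very different $v_{min}$ values for different DM masses, which is exactly the ambiguity the $\tilde{h}(p_R)$ presentation avoids.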

  8. Estimators for longitudinal latent exposure models: examining measurement model assumptions.

    Science.gov (United States)

    Sánchez, Brisa N; Kim, Sehee; Sammel, Mary D

    2017-06-15

    Latent variable (LV) models are increasingly being used in environmental epidemiology as a way to summarize multiple environmental exposures and thus minimize statistical concerns that arise in multiple regression. LV models may be especially useful when multivariate exposures are collected repeatedly over time. LV models can accommodate a variety of assumptions but, at the same time, present the user with many choices for model specification particularly in the case of exposure data collected repeatedly over time. For instance, the user could assume conditional independence of observed exposure biomarkers given the latent exposure and, in the case of longitudinal latent exposure variables, time invariance of the measurement model. Choosing which assumptions to relax is not always straightforward. We were motivated by a study of prenatal lead exposure and mental development, where assumptions of the measurement model for the time-changing longitudinal exposure have appreciable impact on (maximum-likelihood) inferences about the health effects of lead exposure. Although we were not particularly interested in characterizing the change of the LV itself, imposing a longitudinal LV structure on the repeated multivariate exposure measures could result in high efficiency gains for the exposure-disease association. We examine the biases of maximum likelihood estimators when assumptions about the measurement model for the longitudinal latent exposure variable are violated. We adapt existing instrumental variable estimators to the case of longitudinal exposures and propose them as an alternative to estimate the health effects of a time-changing latent predictor. We show that instrumental variable estimators remain unbiased for a wide range of data generating models and have advantages in terms of mean squared error. Copyright © 2017 John Wiley & Sons, Ltd.
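
The intuition behind using an instrumental variable estimator with error-prone exposure measurements can be seen in a small simulation. This is my own toy example, not the authors' estimator: two noisy biomarkers of a latent exposure, a naive regression that attenuates the true effect, and an IV estimate that recovers it:

```python
# Toy illustration (mine, not the paper's method): with two error-prone
# measurements x1, x2 of a latent exposure, regressing the outcome on x1
# alone attenuates the effect, while using x2 as an instrument for x1
# yields a consistent estimate of the true coefficient beta.
import random

random.seed(0)
beta = 2.0
n = 20000
lat = [random.gauss(0, 1) for _ in range(n)]        # latent exposure
x1 = [l + random.gauss(0, 1) for l in lat]          # biomarker 1, noisy
x2 = [l + random.gauss(0, 1) for l in lat]          # biomarker 2, noisy
y = [beta * l + random.gauss(0, 1) for l in lat]    # outcome

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)

naive = cov(y, x1) / cov(x1, x1)   # attenuated toward zero by measurement error
iv = cov(y, x2) / cov(x1, x2)      # consistent for beta
print(f"naive ~ {naive:.2f}, IV ~ {iv:.2f}, true beta = {beta}")
```

The naive slope lands near half the true effect here because the biomarker noise variance equals the latent exposure variance, while the IV estimate sits near beta.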

  9. Heterosexual assumptions in verbal and non-verbal communication in nursing.

    Science.gov (United States)

    Röndahl, Gerd; Innala, Sune; Carlsson, Marianne

    2006-11-01

    This paper reports a study of what lesbian women and gay men had to say, as patients and as partners, about their experiences of nursing in hospital care, and what they regarded as important to communicate about homosexuality and nursing. The social life of heterosexual cultures is based on the assumption that all people are heterosexual, thereby making homosexuality socially invisible. Nurses may assume that all patients and significant others are heterosexual, and these heteronormative assumptions may lead to poor communication that affects nursing quality by leading nurses to ask the wrong questions and make incorrect judgements. A qualitative interview study was carried out in the spring of 2004. Seventeen women and 10 men ranging in age from 23 to 65 years from different parts of Sweden participated. They described 46 experiences as patients and 31 as partners. Heteronormativity was communicated in waiting rooms, in patient documents and when registering for admission, and nursing staff sometimes showed perplexity when an informant deviated from this heteronormative assumption. Informants had often met nursing staff who showed fear of behaving incorrectly, which could lead to a sense of insecurity, thereby impeding further communication. As partners of gay patients, informants felt that they had to deal with heterosexual assumptions more than they did when they were patients, and the consequences were feelings of not being accepted as a 'true' relative, of exclusion and neglect. Almost all participants offered recommendations about how nursing staff could facilitate communication. Heterosexual norms communicated unconsciously by nursing staff contribute to ambivalent attitudes and feelings of insecurity that prevent communication and easily lead to misconceptions. Educational and management interventions, as well as increased communication, could make gay people more visible and thereby encourage openness and awareness by hospital staff of the norms that they

  10. Validation of a novel basic virtual reality simulator, the LAP-X, for training basic laparoscopic skills.

    Science.gov (United States)

    Kawaguchi, Koji; Egi, Hiroyuki; Hattori, Minoru; Sawada, Hiroyuki; Suzuki, Takahisa; Ohdan, Hideki

    2014-10-01

    Virtual reality surgical simulators are becoming popular as a means of providing trainees with an opportunity to practice laparoscopic skills. The Lap-X (Epona Medical, Rotterdam, the Netherlands) is a novel VR simulator for training basic skills in laparoscopic surgery. The objective of this study was to validate the LAP-X laparoscopic virtual reality simulator by assessing its face and construct validity in order to determine whether the simulator is adequate for basic skills training. The face and content validity were evaluated using a structured questionnaire. To assess the construct validity, the participants, nine expert surgeons (median age: 40 (32-45)) (>100 laparoscopic procedures) and 11 novices, performed three basic laparoscopic tasks using the Lap-X. The participants reported a high level of content validity, with no significant differences in ratings between the expert surgeons and the novices (Ps > 0.246). The performance of the expert surgeons on the three tasks was significantly better than that of the novices in all parameters, supporting the LAP-X as a valid training device for basic laparoscopic skills.

  11. [Platforms are needed for innovative basic research in ophthalmology].

    Science.gov (United States)

    Wang, Yi-qiang

    2012-07-01

    Basic research is the cornerstone of technological innovation in all fields, including the medical sciences. Currently, there are shortages of professional scientists as well as technical support teams and facilities in the field of basic ophthalmology and visual science research in China. The evaluation system and personnel policies are not supportive of innovative but high-risk research projects. The reasons and possible solutions are discussed here, with the aim of promoting the building of platforms to host novel and important basic research in eye science in this country.

  12. Observing gravitational-wave transient GW150914 with minimal assumptions

    NARCIS (Netherlands)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Phythian-Adams, A.T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwa, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. C.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, R.D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, M.J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blackburn, L.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, A.L.S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, J.G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, T.C; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brocki, P.; Brooks, A. F.; Brown, A.D.; Brown, D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderon Bustillo, J.; Callister, T. A.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Diaz, J. 
Casanueva; Casentini, C.; Caudill, S.; Cavaglia, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Baiardi, L. Cerboni; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chatterji, S.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, D. S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Qian; Chua, S. E.; Chung, E.S.; Ciani, G.; Clara, F.; Clark, J. A.; Clark, M.; Cleva, F.; Coccia, E.; Cohadon, P. -F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, A.C.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J. -P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, A.L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; Debra, D.; Debreczeni, G.; Degallaix, J.; De laurentis, M.; Deleglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.A.; DeRosa, R. T.; Rosa, R.; DeSalvo, R.; Dhurandhar, S.; Diaz, M. C.; Di Fiore, L.; Giovanni, M.G.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H. -B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, T. M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.M.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. 
C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. R.; Flaminio, R.; Fletcher, M; Fournier, J. -D.; Franco, S; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritsche, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.P.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; Gonzalez, Idelmis G.; Castro, J. M. Gonzalez; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Lee-Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.M.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; de Haas, R.; Hacker, J. J.; Buffoni-Hall, R.; Hall, E. D.; Hammond, G.L.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, P.J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C. -J.; Haughian, K.; Healy, J.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hinder, I.; Hoak, D.; Hodge, K. A.; Hofman, D.; Hollitt, S. E.; Holt, K.; Holz, D. E.; Hopkins, P.; Hosken, D. J.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Idrisy, A.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J. -M.; Isi, M.; Islas, G.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, D.H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jimenez-Forteza, F.; Johnson, W.; Jones, I.D.; Jones, R.; Jonker, R. J. G.; Ju, L.; Haris, K.; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.H.; Kanner, J. 
B.; Karki, S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kawazoe, F.; Kefelian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.E.; Key, J. S.; Khalaidovski, A.; Khalili, F. Y.; Khan, I.; Khan., S.; Khan, Z.; Khazanov, E. A.; Kijhunchoo, N.; Kim, C.; Kim, J.; Kim, K.; Kim, Nam-Gyu; Kim, Namjun; Kim, Y.M.; King, E. J.; King, P. J.; Kinsey, M.; Kinzel, D. L.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Kokeyama, K.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krolak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Laguna, P.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Lazzarini, A.; Lazzaro, C.; Leaci, R.; Leavey, S.; Lebigot, E. O.; Lee, C.H.; Lee, K.H.; Lee, M.H.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B. M.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Logue, J.; Lombardi, A. L.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lueck, H.; Lundgren, A. P.; Luo, J.; Lynch, R.; Ma, Y.; MacDonald, T.T.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magana-Sandoval, F.; Magee, R. M.; Mageswaran, M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Marka, S.; Marka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R.M.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mende, G.; Mendoza-Gandara, D.; Mercer, R. A.; Merilh, E. 
L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B.C.; Moore, J.C.; Moraru, D.; Gutierrez Moreno, M.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, S.D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P.G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nedkova, K.; Nelemans, G.; Gutierrez-Neri, M.; Neunzert, A.; Newton-Howes, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J.; Oh, S. H.; Ohme, F.; Oliver, M. B.; Oppermann, P.; Oram, Richard J.; O'Reilly, B.; O'Shaughnessy, R.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Page, J.; Paris, H. R.; Parker, W.S; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Phelps, M.; Piccinni, O. J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poggiani, R.; Popolizio, P.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Premachandra, S. S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prolchorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Puerrer, M.; Qi, H.; Qin, J.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. 
S.; Radkins, H.; Raffai, P.; Raja, S.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosinska, D.; Rowan, S.; Ruediger, A.; Ruggi, P.; Ryan, K.A.; Sachdev, P.S.; Sadecki, T.; Sadeghian, L.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J; Schmidt, P.; Schnabel, R.B.; Schofield, R. M. S.; Schoenbeck, A.; Schreiber, K.E.C.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, M.S.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Serna, G.; Setyawati, Y.; Sevigny, A.; Shaddock, D. A.; Shah, S.; Shithriar, M. S.; Shaltev, M.; Shao, Z.M.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sigg, D.; Silva, António Dias da; Simakov, D.; Singer, A; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, R. J. E.; Smith, N.D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stone, J.R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S. E.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sutton, P. J.; Swinkels, B. L.; Szczepanczyk, M. J.; Tacca, M.D.; Talukder, D.; Tanner, D. B.; Tapai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, W.R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Tonelli, M.; Torres, C. 
V.; Torrie, C. I.; Toyra, D.; Travasso, F.; Traylor, G.; Trifiro, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlhruch, H.; Vajente, G.; Valdes, G.; Van Bakel, N.; Van Beuzekom, Martin; Van den Brand, J. F. J.; Van Den Broeck, C.F.F.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasuth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, R. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Vicere, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J. -Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D. V.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, MT; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L. -W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.M.; Wessels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; White, D. J.; Whiting, B. F.; Williams, D.; Williams, D.R.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Wright, J.L.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, H.; Yvert, M.; Zadrozny, A.; Zangrando, L.; Zanolin, M.; Zendri, J. -P.; Zevin, M.; Zhang, F.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.

    2016-01-01

    The gravitational-wave signal GW150914 was first identified on September 14, 2015, by searches for short-duration gravitational-wave transients. These searches identify time-correlated transients in multiple detectors with minimal assumptions about the signal morphology, allowing them to be

  13. Basic electrotechnology

    CERN Document Server

    Ashen, R A

    2013-01-01

    BASIC Electrotechnology discusses the applications of Beginner's All-purpose Symbolic Instruction Code (BASIC) in engineering, particularly in solving electrotechnology-related problems. The book is comprised of six chapters that cover several topics relevant to BASIC and electrotechnology. Chapter 1 provides an introduction to BASIC, and Chapter 2 talks about the use of complex numbers in a.c. circuit analysis. Chapter 3 covers linear circuit analysis with d.c. and sinusoidal a.c. supplies. The book also discusses the elementary magnetic circuit theory. The theory and performance of two windi
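    The a.c. circuit analysis with complex numbers that Chapter 2 applies in BASIC can be sketched in a modern language (Python here rather than BASIC; the component values are illustrative, not taken from the book):

```python
import cmath

def series_rlc_impedance(r, l, c, f):
    """Complex impedance Z = R + j(wL - 1/(wC)) of a series RLC branch at frequency f."""
    w = 2 * cmath.pi * f
    return complex(r, w * l - 1 / (w * c))

# Illustrative values: 100 ohm, 0.1 H, 10 uF at 50 Hz mains
z = series_rlc_impedance(100, 0.1, 10e-6, 50)
i = 230 / z                        # phasor current from a 230 V rms supply
magnitude, phase = abs(i), cmath.phase(i)  # rms magnitude and phase angle
```

    At the resonant frequency, where wL equals 1/(wC), the reactive terms cancel and the impedance reduces to the pure resistance R.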

  14. The 'revealed preferences' theory: Assumptions and conjectures

    International Nuclear Information System (INIS)

    Green, C.H.

    1983-01-01

    As a kind of intuitive psychology, approaches based on the 'Revealed Preferences' theory for determining acceptable risks are a useful method for generating hypotheses. In view of the fact that reliability engineering develops faster than methods for determining reliability targets, the Revealed-Preferences approach is a necessary preliminary aid. Some of the assumptions on which the 'Revealed Preferences' theory is based are identified and analysed, and afterwards compared with experimentally obtained results. (orig./DG) [de

  15. High Speed Railway Environment Safety Evaluation Based on Measurement Attribute Recognition Model

    Directory of Open Access Journals (Sweden)

    Qizhou Hu

    2014-01-01

    In order to rationally evaluate the operational safety level of high speed railways, an environmental safety evaluation index system for high speed railways should be established by analysing the impact mechanisms of severe weather such as rain, thunder, lightning, earthquakes, wind, and snow. In addition, attribute recognition is used to determine the similarity between samples and their corresponding attribute classes in multidimensional space, on the basis of a Mahalanobis distance measurement function, which has the advantages of being insensitive to correlation among indices and to their differing scales. On this basis, the environmental safety situation of China's high speed railways is elaborated using the suggested methods. The results of the detailed analysis show that the evaluation basically matches the actual situation and could lay a scientific foundation for high speed railway operational safety.
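    The Mahalanobis distance at the heart of the attribute recognition model can be sketched as follows; the sample and the class statistics are invented for illustration, not taken from the paper:

```python
def mahalanobis2(x, mean, cov):
    """Squared Mahalanobis distance (x-m)^T cov^{-1} (x-m) for 2-D data."""
    dx, dy = x[0] - mean[0], x[1] - mean[1]
    (a, b), (c, d) = cov
    det = a * d - b * c
    # inverse of the 2x2 covariance matrix [[a, b], [c, d]]
    inv = [[d / det, -b / det], [-c / det, a / det]]
    gx = inv[0][0] * dx + inv[0][1] * dy
    gy = inv[1][0] * dx + inv[1][1] * dy
    return dx * gx + dy * gy

# Hypothetical safety-index sample and attribute-class statistics:
# the off-diagonal term models correlated, differently scaled indices
x, mean = [0.8, 0.3], [0.5, 0.5]
cov = [[0.04, 0.01],
       [0.01, 0.09]]
d2 = mahalanobis2(x, mean, cov)
```

    With an identity covariance the measure reduces to the squared Euclidean distance; the covariance term is what gives the method the scale- and correlation-insensitivity the abstract refers to.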

  16. Analysis On Political Speech Of Susilo Bambang Yudhoyono: Common Sense Assumption And Ideology

    Directory of Open Access Journals (Sweden)

    Sayit Abdul Karim

    2015-10-01

    This paper presents an analysis of the political speech of Susilo Bambang Yudhoyono (SBY), the former president of Indonesia, at the Indonesian conference on “Moving towards sustainability: together we must create the future we want”. Ideologies are closely linked to power and language because using language is the commonest form of social behavior, and the form of social behavior where we rely most on ‘common-sense’ assumptions. The objectives of this study are to discuss the common sense assumptions and ideology conveyed by means of language use in SBY’s political speech, which is mainly grounded in Norman Fairclough’s theory of language and power in critical discourse analysis. There are two main problems of analysis: first, what are the common sense assumptions and ideology in Susilo Bambang Yudhoyono’s political speech; and second, how do they relate to each other in the political discourse? The data used in this study were in the form of the written text of “Moving towards sustainability: together we must create the future we want”. A qualitative descriptive analysis was employed to analyze the common sense assumptions and ideology in the written text of Susilo Bambang Yudhoyono’s political speech, which was delivered at the Riocentro Convention Center, Rio de Janeiro, on June 20, 2012. One dimension of ‘common sense’ is the meaning of words. The results showed that the common sense assumptions and ideology conveyed through SBY’s specific words or expressions can significantly explain how political discourse is constructed and affected by SBY’s role and position, life experience, and power relations. He used language as a powerful social tool to present his common sense assumptions and ideology to convince his audiences and fellow citizens that the future of sustainability has been an important agenda for all people.

  17. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    DISCOURSES AND THEORETICAL ASSUMPTIONS IN IT PROJECT PORTFOLIO MANAGEMENT: A REVIEW OF THE LITERATURE. In recent years, increasing interest has been directed at IT project portfolio management (IT PPM). Considering IT PPM an interdisciplinary practice, we conduct a concept-based literature review of relevant...

  18. The social contact hypothesis under the assumption of endemic equilibrium: Elucidating the transmission potential of VZV in Europe

    Directory of Open Access Journals (Sweden)

    E. Santermans

    2015-06-01

    The basic reproduction number R0 and the effective reproduction number R are pivotal parameters in infectious disease epidemiology, quantifying the transmission potential of an infection in a population. We estimate both parameters from 13 pre-vaccination serological data sets on varicella zoster virus (VZV) in 12 European countries and from population-based social contact surveys under the commonly made assumptions of endemic and demographic equilibrium. The fit to the serology is evaluated using the inferred effective reproduction number R as a model eligibility criterion combined with AIC as a model selection criterion. For only 2 out of 12 countries, the common choice of a constant proportionality factor is sufficient to provide a good fit to the seroprevalence data. For the other countries, an age-specific proportionality factor provides a better fit, assuming physical contacts lasting longer than 15 min are a good proxy for potential varicella transmission events. In all countries, primary infection with VZV most often occurs in early childhood, but there is substantial variation in transmission potential, with R0 ranging from 2.8 in England and Wales to 7.6 in The Netherlands. Two non-parametric methods, the maximal information coefficient (MIC) and a random forest approach, are used to explain these differences in R0 in terms of relevant country-specific characteristics. Our results suggest an association with three general factors: inequality in wealth, infant vaccination coverage and child care attendance. This illustrates the need to consider fundamental differences between European countries when formulating and parameterizing infectious disease models.
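    Under the social contact hypothesis the abstract relies on, R0 is proportional to the dominant eigenvalue of the age-structured contact matrix. A minimal sketch of that computation, with an invented two-age-class contact matrix and proportionality factor q (the paper's data are not reproduced here):

```python
def spectral_radius(m, iters=200):
    """Dominant eigenvalue of a non-negative square matrix by power iteration."""
    n = len(m)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(m[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)   # current eigenvalue estimate
        v = [x / lam for x in w]       # renormalize the eigenvector
    return lam

# Hypothetical contact rates (contacts/day) between two age classes,
# and a proportionality factor q linking contacts to transmission
contacts = [[12.0, 3.0],
            [3.0, 8.0]]
q = 0.35
r0 = q * spectral_radius(contacts)
```

    An age-specific proportionality factor, as fitted for most countries in the study, replaces the scalar q with per-age values applied to the rows of the matrix before taking the eigenvalue.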

  19. Compendium of cost-effectiveness evaluations of modifications for dose reduction at nuclear power plants

    International Nuclear Information System (INIS)

    Baum, J.W.; Matthews, G.R.

    1985-12-01

    This report summarizes available information on cost effectiveness of engineering modifications potentially valuable for dose reduction at nuclear power plants. Data were gathered from several US utilities, published literature, equipment and service suppliers, and recent technical meetings. Five simplified econometric models were employed to evaluate data and arrive at a value for cost effectiveness expressed in either (a) dollars/rem, or (b) total dollar savings calculated using a nominal value of $1000/rem. Models employed were: a basic model with no consideration given to the time value of money; two models in which discounting was used to evaluate costs and savings in terms of present values; and two models in which income taxes and revenue requirements were considered. Results from different models varied by as much as a factor of 10, and were generally lowest for the basic model and highest for the before-tax revenue requirements model. Results for 151 evaluations employing different assumptions concerning number of plants per site and outage impacts were tabulated in order of decreasing cost effectiveness. Twenty-five evaluations were identified as exceptionally cost effective since both costs and dose were saved. Forty evaluations indicated highly cost-effective changes based on costs below $1000/rem saved using results of the present-worth model that included discounting of future dose savings
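    The basic (undiscounted) and present-worth models summarized above can be sketched as follows; the modification cost, dose savings and discount rate are illustrative numbers, not figures from the report, and the tax and revenue-requirement models are omitted:

```python
def cost_per_rem_basic(capital_cost, annual_rem_saved, years):
    """Basic model: dollars per rem with no time value of money."""
    return capital_cost / (annual_rem_saved * years)

def cost_per_rem_present_worth(capital_cost, annual_rem_saved, years, rate):
    """Present-worth model: future dose savings discounted at `rate`."""
    pv_rem = sum(annual_rem_saved / (1 + rate) ** t for t in range(1, years + 1))
    return capital_cost / pv_rem

mod_cost = 50_000.0   # hypothetical modification cost, dollars
saved = 5.0           # hypothetical person-rem saved per year
basic = cost_per_rem_basic(mod_cost, saved, 20)
pw = cost_per_rem_present_worth(mod_cost, saved, 20, 0.10)
# Discounting future dose savings shrinks the rem denominator,
# so the present-worth $/rem exceeds the basic figure (pw > basic),
# consistent with the factor-of-10 spread between models noted above.
```

    A modification would then be judged against the nominal $1000/rem value quoted in the report.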

  20. The predictive value of demonstrable stress incontinence during basic office evaluation and urodynamics in women without symptomatic urinary incontinence undergoing vaginal prolapse surgery

    NARCIS (Netherlands)

    van der Ploeg, J. Marinus; Zwolsman, Sandra E.; Posthuma, Selina; Wiarda, Hylco S.; van der Vaart, C. Huub; Roovers, Jan-Paul W. R.

    2017-01-01

    Women with pelvic organ prolapse without symptoms of urinary incontinence (UI) might demonstrate stress urinary incontinence (SUI) with or without prolapse reduction. We aimed to determine the value of demonstrable SUI during basic office evaluation or urodynamics in predicting SUI after vaginal

  1. Shattering Man’s Fundamental Assumptions in Don DeLillo’s Falling Man

    OpenAIRE

    Hazim Adnan Hashim; Rosli Bin Talif; Lina Hameed Ali

    2016-01-01

    The present study addresses effects of traumatic events such as the September 11 attacks on victims’ fundamental assumptions. These beliefs or assumptions provide individuals with expectations about the world and their sense of self-worth. Thus, they ground people’s sense of security, stability, and orientation. The September 11 terrorist attacks in the U.S.A. were very tragic for Americans because this fundamentally changed their understandings about many aspects in life. The attacks led man...

  2. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

    A number of waste life cycle assessment (LCA) models have been gradually developed since the early 1990s, in a number of countries, usually independently from each other. Large discrepancies in results have been observed among different waste LCA models, although it has also been shown that results from different LCA studies can be consistent. This paper is an attempt to identify, review and analyse methodologies and technical assumptions used in various parts of selected waste LCA models. Several criteria were identified, which could have significant impacts on the results, such as the functional unit, system boundaries, waste composition and energy modelling. The modelling assumptions of waste management processes, ranging from collection, transportation, intermediate facilities, recycling, thermal treatment, biological treatment, and landfilling, are obviously critical when comparing...

  3. Managerial and Organizational Assumptions in the CMM's

    DEFF Research Database (Denmark)

    Rose, Jeremy; Aaen, Ivan; Nielsen, Peter Axel

    2008-01-01

    Thinking about improving the management of software development in software firms is dominated by one approach: the capability maturity model devised and administered at the Software Engineering Institute at Carnegie Mellon University. Though CMM and its replacement CMMI are widely known and used ... thinking about large production and manufacturing organisations (particularly in America) in the late industrial age. Many of the difficulties reported with CMMI can be attributed to basing practice on these assumptions in organisations which have different cultures and management traditions, perhaps...

  4. Commentary: Considering Assumptions in Associations Between Music Preferences and Empathy-Related Responding

    Directory of Open Access Journals (Sweden)

    Susan A O'Neill

    2015-09-01

    This commentary considers some of the assumptions underpinning the study by Clark and Giacomantonio (2015). Their exploratory study examined relationships between young people's music preferences and their cognitive and affective empathy-related responses. First, the prescriptive assumption that music preferences can be measured according to how often an individual listens to a particular music genre is considered within axiology or value theory as a multidimensional construct (general, specific, and functional values). This is followed by a consideration of the causal assumption that if we increase young people's empathy through exposure to prosocial song lyrics, this will increase their prosocial behavior. It is suggested that the predictive power of musical preferences on empathy-related responding might benefit from a consideration of the larger pattern of psychological and subjective wellbeing within the context of developmental regulation across ontogeny that involves mutually influential individual-context relations.

  5. New Assumptions to Guide SETI Research

    Science.gov (United States)

    Colombano, S. P.

    2018-01-01

    The recent Kepler discoveries of Earth-like planets offer the opportunity to focus our attention on detecting signs of life and technology in specific planetary systems, but I feel we need to become more flexible in our assumptions. The reason is that, while it is still reasonable and conservative to assume that life is most likely to have originated in conditions similar to ours, the vast time differences in potential evolutions render the likelihood of "matching" technologies very slim. In light of these challenges I propose a more "aggressive" approach to future SETI exploration in directions that until now have received little consideration.

  6. Logic Assumptions and Risks Framework Applied to Defence Campaign Planning and Evaluation

    Science.gov (United States)

    2013-05-01

    based on prescriptive targets of reduction in particular crime statistics in a certain timeframe. Similarly, if overall desired effects are not well...the Evaluation Journal of Australasia, Australasian Evaluation Society. (DSTO-TR-2840) These six campaign functions... Callahan’s article in “Anecdotally” Newsletter January 2013, Anecdote Pty Ltd., a commercial consultancy specialising in narrative technique for business

  7. Design and evaluation of basic standard encryption algorithm modules using nanosized complementary metal oxide semiconductor molecular circuits

    Science.gov (United States)

    Masoumi, Massoud; Raissi, Farshid; Ahmadian, Mahmoud; Keshavarzi, Parviz

    2006-01-01

    We are proposing that the recently proposed semiconductor-nanowire-molecular architecture (CMOL) is an optimum platform on which to realize encryption algorithms. The basic modules for the Advanced Encryption Standard algorithm (Rijndael) have been designed using CMOL architecture. The performance of this design has been evaluated with respect to chip area and speed. It is observed that CMOL provides considerable improvement over implementation with regular CMOS architecture, even with a 20% defect rate. Pseudo-optimum gate placement and routing are provided for the Rijndael building blocks, and the possibility of designing high-speed, attack-tolerant and long-key encryption is discussed.
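    Among the Rijndael building blocks mentioned, the byte-substitution (S-box) module is the most distinctive. As a software reference for what such a module computes (this sketch illustrates the function itself, not the CMOL hardware realization), the AES S-box is the multiplicative inverse in GF(2^8) followed by an affine transform:

```python
def gf_mul(a, b):
    """Multiply two bytes in GF(2^8) modulo the AES polynomial x^8+x^4+x^3+x+1."""
    p = 0
    for _ in range(8):
        if b & 1:
            p ^= a
        carry = a & 0x80
        a = (a << 1) & 0xFF
        if carry:
            a ^= 0x1B       # reduce by the AES polynomial
        b >>= 1
    return p

def gf_inv(a):
    """Multiplicative inverse in GF(2^8); 0 maps to 0 by convention."""
    r = 1
    for _ in range(254):     # a^254 = a^-1 for nonzero a (a^255 = 1)
        r = gf_mul(r, a)
    return r if a else 0

def sbox(x):
    """AES S-box: GF(2^8) inverse followed by the FIPS-197 affine transform."""
    b, c, out = gf_inv(x), 0x63, 0
    for i in range(8):
        bit = ((b >> i) ^ (b >> ((i + 4) % 8)) ^ (b >> ((i + 5) % 8)) ^
               (b >> ((i + 6) % 8)) ^ (b >> ((i + 7) % 8)) ^ (c >> i)) & 1
        out |= bit << i
    return out
```

    A hardware module, whether CMOS or CMOL, typically realizes this mapping as a 256-entry lookup or as combinational GF(2^8) logic rather than by iterative multiplication.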

  8. A criterion of orthogonality on the assumption and restrictions in subgrid-scale modelling of turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Fang, L. [LMP, Ecole Centrale de Pékin, Beihang University, Beijing 100191 (China); Co-Innovation Center for Advanced Aero-Engine, Beihang University, Beijing 100191 (China); Sun, X.Y. [LMP, Ecole Centrale de Pékin, Beihang University, Beijing 100191 (China); Liu, Y.W., E-mail: liuyangwei@126.com [National Key Laboratory of Science and Technology on Aero-Engine Aero-Thermodynamics, School of Energy and Power Engineering, Beihang University, Beijing 100191 (China); Co-Innovation Center for Advanced Aero-Engine, Beihang University, Beijing 100191 (China)

    2016-12-09

    In order to shed light on understanding the subgrid-scale (SGS) modelling methodology, we analyze and define the concepts of assumption and restriction in the modelling procedure, then show by a generalized derivation that if there are multiple stationary restrictions in a modelling, the corresponding assumption function must satisfy a criterion of orthogonality. Numerical tests using the one-dimensional nonlinear advection equation are performed to validate this criterion. This study is expected to inspire future research on generally guiding the SGS modelling methodology. - Highlights: • The concepts of assumption and restriction in the SGS modelling procedure are defined. • A criterion of orthogonality on the assumption and restrictions is derived. • Numerical tests using the one-dimensional nonlinear advection equation are performed to validate this criterion.

  9. Exploring five common assumptions on Attention Deficit Hyperactivity Disorder

    NARCIS (Netherlands)

    Batstra, Laura; Nieweg, Edo H.; Hadders-Algra, Mijna

    The number of children diagnosed with attention deficit hyperactivity disorder (ADHD) and treated with medication is steadily increasing. The aim of this paper was to critically discuss five debatable assumptions on ADHD that may explain these trends to some extent. These are that ADHD (i) causes

  10. The use of the SF-36 questionnaire in adult survivors of childhood cancer: evaluation of data quality, score reliability, and scaling assumptions

    Directory of Open Access Journals (Sweden)

    Winter David L

    2006-10-01

    Background: The SF-36 has been used in a number of previous studies that have investigated the health status of childhood cancer survivors, but it has never been evaluated regarding data quality, scaling assumptions, and reliability in this population. As health status among childhood cancer survivors is being increasingly investigated, it is important that the measurement instruments are reliable, validated and appropriate for use in this population. The aim of this paper was to determine whether the SF-36 questionnaire is a valid and reliable instrument for assessing the self-perceived health status of adult survivors of childhood cancer. Methods: We examined the SF-36 to see how it performed with respect to (1) data completeness, (2) distribution of the scale scores, (3) item-internal consistency, (4) item-discriminant validity, (5) internal consistency, and (6) scaling assumptions. For this investigation we used SF-36 data from a population-based study of 10,189 adult survivors of childhood cancer. Results: Overall, missing values ranged per item from 0.5 to 2.9 percent. Ceiling effects were found to be highest in the role limitation-physical (76.7%) and role limitation-emotional (76.5%) scales. All correlations between items and their hypothesised scales exceeded the suggested standard of 0.40 for satisfactory item-consistency. Across all scales, the Cronbach's alpha coefficient of reliability was found to be higher than the suggested value of 0.70. Consistent across all cancer groups, the physical health related scale scores correlated strongly with the Physical Component Summary (PCS) scale scores and weakly with the Mental Component Summary (MCS) scale scores. Also, the mental health and role limitation-emotional scales correlated strongly with the MCS scale score and weakly with the PCS scale score. Moderate to strong correlations with both summary scores were found for the general health perception, energy/vitality, and social functioning
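    Criterion (5), internal consistency, is conventionally assessed with Cronbach's alpha against the 0.70 threshold the authors cite. A minimal sketch of the computation, with invented item responses rather than the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: items is a list of item-score lists,
    one inner list per item, scored over the same respondents."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance
    k = len(items)
    totals = [sum(col) for col in zip(*items)]   # each respondent's scale total
    item_var_sum = sum(var(it) for it in items)
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Three hypothetical scale items scored by five respondents
scale = [
    [3, 4, 4, 2, 5],
    [3, 5, 4, 2, 4],
    [2, 4, 5, 3, 5],
]
alpha = cronbach_alpha(scale)
```

    Perfectly parallel items give alpha = 1, and uncorrelated items drive it toward 0; values above 0.70 are the conventional threshold for satisfactory scale reliability.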

  11. Report of subcommittee on Promotion of basic technology

    International Nuclear Information System (INIS)

    1988-01-01

    In the long term plan of atomic energy development and utilization decided in June, 1987, the policy of promoting the development of the basic technology that connects basic research to project development was shown, placing emphasis on the creative and innovative aspect of atomic energy. It is necessary to accomplish the international responsibility and to make breakthrough in the present day problems such as the heightening of safety, reliability and economical efficiency imposed on atomic energy by purposefully and efficiently advancing the development of these basic technologies, in this way, to build up atomic energy technological system for the beginning of 21st century. The trend of atomic energy development so far, the change of the situation surrounding atomic energy, the trend of developing atomic energy technology hereafter and the basic technology, the concept of developing material technology, artificial intelligence technology, laser technology and the technology for evaluating and reducing radiation risks, the plan of the development of basic technology for atomic energy and the efficient promotion of its development are discussed. (K.I.)

  12. Implicit Assumptions in Special Education Policy: Promoting Full Inclusion for Students with Learning Disabilities

    Science.gov (United States)

    Kirby, Moira

    2017-01-01

    Introduction: Every day millions of students in the United States receive special education services. Special education is an institution shaped by societal norms. Inherent in these norms are implicit assumptions regarding disability and the nature of special education services. The two dominant implicit assumptions evident in the American…

  13. A basic system architecture for sensor data fusion of environment sensors for driver assistance systems; Eine Basis-Systemarchitektur zur Sensordatenfusion von Umfeldsensoren fuer Fahrerassistenzsysteme

    Energy Technology Data Exchange (ETDEWEB)

    Darms, M.

    2007-07-01

    The design of the system architecture for sensor data fusion at the beginning of the development process has a significant influence on cost. With a view to driver assistance systems, the author investigated general assumptions concerning data association and data filtering for the fusion of environment sensor data which must be considered when designing an architecture, or may be considered for optimisation. The validity of the assumptions is illustrated by simulations of adaptive cruise control and time-to-collision calculations, as well as on the basis of the available literature. A basic system architecture based on these assumptions is presented as a precursor of the final architecture. Its applicability is proved by implementation in the PRORETA project. The author's work provides a validated basis for architects of a series-production system architecture, enabling them to design and implement their ultimate systems. (orig.)

  14. LEVEL OF KNOWLEDGE OF THE BASIC CONCEPTS OF PHYSICAL EVALUATION AMONG PROFESSIONALS IN THE GYMS OF THE CITY OF JOÃO PESSOA - PB

    Directory of Open Access Journals (Sweden)

    Rodrigo Benevides Ceriani

    2005-10-01

    Full Text Available The objective of this study is to verify the level of knowledge of the basic concepts of physical evaluation among the professionals responsible for this practice in gyms. This is a cross-sectional field study of professionals working in the area of physical evaluation, registered with CREF 10 - PB/RN. A questionnaire of open and closed questions was applied to 39 individuals. Percentage frequencies were calculated using an Excel spreadsheet. The results showed that 61.54% charge for the physical evaluation, in 41.66% of cases 15 reais; 69.23% do not include it in the enrolment fee; 84.61% know what a test is; 61.54% what a measurement is; and 53.45% what an evaluation is. Three people without a degree in physical education, or in any other higher-education course, were found working in the area. Conclusions: Professionals without a degree in physical education or any other higher-education course still work directly with physical evaluation inside gyms. Many evaluators lack the basic theoretical knowledge of the concepts of testing, measuring and evaluating. In general, a fee is charged for the physical evaluation, mostly included in the client's enrolment.

  15. Semi-supervised learning via regularized boosting working on multiple semi-supervised assumptions.

    Science.gov (United States)

    Chen, Ke; Wang, Shihai

    2011-01-01

    Semi-supervised learning concerns the problem of learning in the presence of labeled and unlabeled data. Several boosting algorithms have been extended to semi-supervised learning with various strategies. To our knowledge, however, none of them takes all three semi-supervised assumptions, i.e., the smoothness, cluster, and manifold assumptions, together into account during boosting learning. In this paper, we propose a novel cost functional consisting of the margin cost on labeled data and the regularization penalty on unlabeled data based on three fundamental semi-supervised assumptions. Minimizing our proposed cost functional with a greedy yet stagewise functional optimization procedure leads to a generic boosting framework for semi-supervised learning. Extensive experiments demonstrate that our algorithm yields favorable results on benchmark and real-world classification tasks in comparison to state-of-the-art semi-supervised learning algorithms, including newly developed boosting algorithms. Finally, we discuss relevant issues and relate our algorithm to previous work.
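    The smoothness, cluster, and manifold assumptions named in this abstract are easiest to see in a minimal graph-based sketch. The code below is not the authors' regularized boosting framework; it is a generic label-propagation illustration of how unlabeled points inherit labels from nearby labeled points, with hypothetical data and parameters.

```python
import numpy as np

rng = np.random.default_rng(3)

def label_propagation(X, y_partial, labeled, sigma=0.5, iters=300):
    """Graph-based label propagation: unlabeled points take on the labels
    of their neighbours on a Gaussian similarity graph, a direct use of the
    smoothness and cluster assumptions (illustration only)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))          # similarity graph
    np.fill_diagonal(W, 0.0)
    P = W / W.sum(axis=1, keepdims=True)        # row-stochastic transition matrix
    f = np.where(labeled, y_partial, 0.0)
    for _ in range(iters):
        f = P @ f
        f[labeled] = y_partial[labeled]         # clamp the labeled points
    return np.sign(f)

# Two well-separated clusters, one labeled point per cluster.
X = np.vstack([rng.normal([0, 0], 0.4, (30, 2)),
               rng.normal([4, 0], 0.4, (30, 2))])
y_true = np.array([-1.0] * 30 + [1.0] * 30)
labeled = np.zeros(60, dtype=bool); labeled[0] = labeled[30] = True
pred = label_propagation(X, y_true * labeled, labeled)
print((pred == y_true).mean())
```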

  16. Does Artificial Neural Network Support Connectivism's Assumptions?

    Science.gov (United States)

    AlDahdouh, Alaa A.

    2017-01-01

    Connectivism was presented as a learning theory for the digital age and connectivists claim that recent developments in Artificial Intelligence (AI) and, more specifically, Artificial Neural Network (ANN) support their assumptions of knowledge connectivity. Yet, very little has been done to investigate this brave allegation. Does the advancement…

  17. Sensitivity of the OMI ozone profile retrieval (OMO3PR) to a priori assumptions

    NARCIS (Netherlands)

    Mielonen, T.; De Haan, J.F.; Veefkind, J.P.

    2014-01-01

    We have assessed the sensitivity of the operational OMI ozone profile retrieval (OMO3PR) algorithm to a number of a priori assumptions. We studied the effect of stray light correction, surface albedo assumptions and a priori ozone profiles on the retrieved ozone profile. Then, we studied how to

  18. Basic principles of test-negative design in evaluating influenza vaccine effectiveness.

    Science.gov (United States)

    Fukushima, Wakaba; Hirota, Yoshio

    2017-08-24

    Based on the unique characteristics of influenza, the concept of "monitoring" influenza vaccine effectiveness (VE) across seasons using the same observational study design has been developed. In recent years, there has been a growing number of influenza VE reports using the test-negative design, which can minimize both misclassification of disease and confounding by health-care-seeking behavior. Although the test-negative design offers considerable advantages, there are concerns that its widespread use without knowledge of the basic principles of epidemiology could produce invalid findings. In this article, we briefly review the basic concepts of the test-negative design with respect to classic study designs such as cohort studies and case-control studies. We also discuss selection bias, which may be of concern in countries where rapid diagnostic testing is frequently used in routine clinical practice, as in Japan. Copyright © 2017. Published by Elsevier Ltd.

  19. THE COMPLEX OF ASSUMPTION CATHEDRAL OF THE ASTRAKHAN KREMLIN

    Directory of Open Access Journals (Sweden)

    Savenkova Aleksandra Igorevna

    2016-08-01

    Full Text Available This article is devoted to an architectural and historical analysis of the constructions forming the complex of the Assumption Cathedral of the Astrakhan Kremlin, which has not previously been considered as a subject of special research. Based on archival sources, photographic materials, publications and on-site investigations of the monuments, the article considers the creation history of the complete architectural complex, sustained in the single style of the Muscovite baroque and unique in its compositional construction, and offers an interpretation of it in the all-Russian architectural context. Typological features of the individual constructions come to light. The Prechistinsky bell tower has an untypical architectural solution: a hexagonal structure on octagonal and quadrangular structures. The way of connecting the building of the Cathedral and the chambers by a passage was characteristic of monastic constructions and exceedingly rare in kremlins, farmsteads and ensembles of city cathedrals. The compositional scheme of the Assumption Cathedral includes the Lobnoye Mesto ("the Place of Execution") located on an axis from the west, connected with the main building by a quarter-turn stair with a landing. The only prototype of the structure is the Lobnoye Mesto on Red Square in Moscow. The article considers the version that the Place of Execution emerged on the basis of an earlier existing construction, a tower called "the Peal", which is repeatedly mentioned in written sources in connection with S. Razin's revolt. The metropolitan Sampson, trying to preserve the standing of the Astrakhan metropolitanate, built the Assumption Cathedral and the Place of Execution in direct appeal to a capital prototype, to emphasize continuity and close connection with Moscow.

  20. I Assumed You Knew: Teaching Assumptions as Co-Equal to Observations in Scientific Work

    Science.gov (United States)

    Horodyskyj, L.; Mead, C.; Anbar, A. D.

    2016-12-01

    Introductory science curricula typically begin with a lesson on the "nature of science". Usually this lesson is short, built with the assumption that students have picked up this information elsewhere and only a short review is necessary. However, when asked about the nature of science in our classes, student definitions were often confused, contradictory, or incomplete. A cursory review of how the nature of science is defined in a number of textbooks is similarly inconsistent and excessively loquacious. With such confusion both from the student and teacher perspective, it is no surprise that students walk away with significant misconceptions about the scientific endeavor, which they carry with them into public life. These misconceptions subsequently result in poor public policy and personal decisions on issues with scientific underpinnings. We will present a new way of teaching the nature of science at the introductory level that better represents what we actually do as scientists. Nature of science lessons often emphasize the importance of observations in scientific work. However, they rarely mention and often hide the importance of assumptions in interpreting those observations. Assumptions are co-equal to observations in building models, which are observation-assumption networks that can be used to make predictions about future observations. The confidence we place in these models depends on whether they are assumption-dominated (hypothesis) or observation-dominated (theory). By presenting and teaching science in this manner, we feel that students will better comprehend the scientific endeavor, since making observations and assumptions and building mental models is a natural human behavior. We will present a model for a science lab activity that can be taught using this approach.

  1. Causality and headache triggers

    Science.gov (United States)

    Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.

    2013-01-01

    Objective The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background The term “headache trigger” is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining if a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods A synthesis and application of Rubin’s Causal Model is applied to the context of headache causes. From this application the conditions necessary to infer that one event (trigger) causes another (headache) are outlined using basic assumptions and examples from relevant literature. Results Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger’s effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872

  2. 38 CFR 3.314 - Basic pension determinations.

    Science.gov (United States)

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Basic pension determinations. 3.314 Section 3.314 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS ADJUDICATION Pension, Compensation, and Dependency and Indemnity Compensation Ratings and Evaluations; Service...

  3. Comparisons between a new point kernel-based scheme and the infinite plane source assumption method for radiation calculation of deposited airborne radionuclides from nuclear power plants.

    Science.gov (United States)

    Zhang, Xiaole; Efthimiou, George; Wang, Yan; Huang, Meng

    2018-04-01

    Radiation from deposited radionuclides is indispensable information for the environmental impact assessment of nuclear power plants and for emergency management during nuclear accidents. Ground shine estimation involves multiple physical processes, including atmospheric dispersion, deposition, and radiation shielding by soil and air. It remains unclear whether the normally adopted "infinite plane" source assumption for the ground shine calculation is accurate enough, especially for areas with a highly heterogeneous deposition distribution near the release point. In this study, a new ground shine calculation scheme, which accounts for both the spatial deposition distribution and the properties of the air and soil layers, is developed based on the point kernel method. Two sets of "detector-centered" grids are proposed and optimized for both the deposition and radiation calculations to better simulate the results measured by the detectors, which will be beneficial for applications such as source term estimation. Evaluation against the available Monte Carlo data in the literature indicates that the errors of the new scheme are within 5% for the key radionuclides in nuclear accidents. Comparisons between the new scheme and the "infinite plane" assumption indicate that the assumption is tenable (relative errors within 20%) for areas located 1 km away from the release source. Within the 1 km range, the assumption mainly causes errors for wet deposition, and the errors are independent of rain intensity. The results suggest that the new scheme should be adopted if the detectors are within 1 km of the source under a stable atmosphere (classes E and F), or within 500 m under a slightly unstable (class C) or neutral (class D) atmosphere. Otherwise, the infinite plane assumption is reasonable, since the relative errors induced by this assumption are within 20%. The results here are only based on theoretical investigations. They should
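    The point-kernel idea underlying the comparison can be sketched for the simplest case of uncollided flux above a uniformly contaminated disk, with the infinite-plane closed form recovered in the limit of large disk radius. This is an illustrative simplification only: it omits buildup factors, soil shielding, and the detector-centered grids of the paper, and the attenuation coefficient is an assumed placeholder value.

```python
import numpy as np
from scipy.special import exp1

def disk_flux(R, h, mu, n=200_000):
    """Uncollided flux at height h above a uniform disk source of radius R,
    summing the point kernel exp(-mu*r)/(4*pi*r^2) over thin annuli
    (unit areal source strength)."""
    drho = R / n
    rho = (np.arange(n) + 0.5) * drho      # annulus midpoints
    r = np.hypot(rho, h)                   # slant distance to the detector
    return np.sum(np.exp(-mu * r) / (4 * np.pi * r**2) * 2 * np.pi * rho) * drho

h, mu = 1.0, 0.01     # detector at 1 m; assumed air attenuation 0.01 per metre
plane = 0.5 * exp1(mu * h)   # closed-form infinite-plane result
for R in (10.0, 100.0, 1000.0):
    print(f"R = {R:6.0f} m   disk/plane = {disk_flux(R, h, mu) / plane:.3f}")
```

The ratio approaches 1 as the disk grows, which is the sense in which the infinite-plane assumption becomes tenable once the contaminated area is large compared with the detector geometry.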

  4. Leakage-Resilient Circuits without Computational Assumptions

    DEFF Research Database (Denmark)

    Dziembowski, Stefan; Faust, Sebastian

    2012-01-01

    Physical cryptographic devices inadvertently leak information through numerous side-channels. Such leakage is exploited by so-called side-channel attacks, which often allow for a complete security breach. A recent trend in cryptography is to propose formal models to incorporate leakage into the model and to construct schemes that are provably secure within them. We design a general compiler that transforms any cryptographic scheme, e.g., a block-cipher, into a functionally equivalent scheme which is resilient to any continual leakage provided that the following three requirements are satisfied...... In contrast to previous works based on computational assumptions, our results are purely information-theoretic. In particular, we do not make use of public key encryption, which was required in all previous works......

  5. Being Explicit about Underlying Values, Assumptions and Views when Designing for Children in the IDC Community

    DEFF Research Database (Denmark)

    Skovbjerg, Helle Marie; Bekker, Tilde; Barendregt, Wolmet

    2016-01-01

    In this full-day workshop we want to discuss how the IDC community can make underlying assumptions, values and views regarding children and childhood in making design decisions more explicit. What assumptions do IDC designers and researchers make, and how can they be supported in reflecting on them?... The workshop intends to share different approaches for uncovering and reflecting on values, assumptions and views about children and childhood in design.

  6. Incorporation of constructivist assumptions into problem-based instruction: a literature review.

    Science.gov (United States)

    Kantar, Lina

    2014-05-01

    The purpose of this literature review was to explore the use of distinct assumptions of constructivism when studying the impact of problem-based learning (PBL) on learners in undergraduate nursing programs. Content analysis research technique. The literature review included information retrieved from sources selected via electronic databases, such as EBSCOhost, ProQuest, Sage Publications, SLACK Incorporation, Springhouse Corporation, and Digital Dissertations. The literature review was conducted utilizing key terms and phrases associated with problem-based learning in undergraduate nursing education. Out of the 100 reviewed abstracts, only 15 studies met the inclusion criteria for the review. Four constructivist assumptions formed the basis of the review process, allowing for analysis and evaluation of the findings, followed by identification of issues and recommendations for the discipline and its research practice in the field of PBL. This literature review provided evidence that the nursing discipline is employing PBL in its programs, yet with limited data supporting conceptions of the constructivist perspective underlying this pedagogical approach. Three major issues were assessed and formed the basis for subsequent recommendations: (a) limited use of a theoretical framework and absence of constructivism in most of the studies, (b) incompatibility between research measures and research outcomes, and (c) brief exposure to PBL during which the change was measured. Educators have made the right choice in employing PBL as a pedagogical practice, yet the need to base implementation on constructivism is mandatory if the aim is a better preparation of graduates for practice. Undeniably there is limited convincing evidence regarding integration of constructivism in nursing education. Research that assesses the impact of PBL on learners' problem-solving and communication skills, self-direction, and motivation is paramount. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. A Proposal for Testing Local Realism Without Using Assumptions Related to Hidden Variable States

    Science.gov (United States)

    Ryff, Luiz Carlos

    1996-01-01

    A feasible experiment is discussed which allows us to prove Bell's theorem for two particles without using an inequality. The experiment could be used to test local realism against quantum mechanics without the introduction of additional assumptions related to hidden-variable states. Only assumptions based on direct experimental observation are needed.

  8. Basic design of parallel computational program for probabilistic structural analysis

    International Nuclear Information System (INIS)

    Kaji, Yoshiyuki; Arai, Taketoshi; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, for 'development of damage evaluation method of structural brittle materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research) we examine computational method related to super parallel computation system which is coupled with material strength theory based on microscopic fracture mechanics for latent cracks and continuum structural model to develop new structural reliability evaluation methods for ceramic structures. This technical report is the review results regarding probabilistic structural mechanics theory, basic terms of formula and program methods of parallel computation which are related to principal terms in basic design of computational mechanics program. (author)

  9. Basic design of parallel computational program for probabilistic structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kaji, Yoshiyuki; Arai, Taketoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, for 'development of damage evaluation method of structural brittle materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research) we examine computational method related to super parallel computation system which is coupled with material strength theory based on microscopic fracture mechanics for latent cracks and continuum structural model to develop new structural reliability evaluation methods for ceramic structures. This technical report is the review results regarding probabilistic structural mechanics theory, basic terms of formula and program methods of parallel computation which are related to principal terms in basic design of computational mechanics program. (author)

  10. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    Science.gov (United States)

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.
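    The Monte Carlo power estimation described in this abstract can be sketched under strong simplifying assumptions: a binary covariate, piecewise-constant hazards, and a crude early-versus-late comparison of event rates standing in for the model-based and martingale-residual tests the study actually evaluates. All numbers (hazard ratios, sample size, split time) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_group(n, lam_early, lam_late, t0, tau):
    """Event times with a hazard that switches from lam_early to lam_late
    at time t0; administrative censoring at tau."""
    t_e = rng.exponential(1 / lam_early, n)
    t = np.where(t_e < t0, t_e, t0 + rng.exponential(1 / lam_late, n))
    event = t < tau
    return np.minimum(t, tau), event

def period_log_hr(tc, ec, tt, et, lo, hi):
    """Crude log hazard-ratio (events / person-time) within [lo, hi)."""
    out = []
    for t, ev in ((tc, ec), (tt, et)):
        pt = np.clip(np.minimum(t, hi) - lo, 0, None).sum()
        d = np.sum(ev & (t >= lo) & (t < hi))
        out.append((d, pt))
    (d0, p0), (d1, p1) = out
    return np.log((d1 / p1) / (d0 / p0)), 1 / d0 + 1 / d1   # Poisson variance

def ph_violation_power(hr_early, hr_late, n=300, t0=0.5, tau=2.0, sims=400):
    """Fraction of simulations in which the early and late log hazard
    ratios differ significantly (two-sided z-test at alpha = 0.05)."""
    rejected = 0
    for _ in range(sims):
        tc, ec = simulate_group(n, 1.0, 1.0, t0, tau)             # control
        tt, et = simulate_group(n, hr_early, hr_late, t0, tau)    # treated
        b_e, v_e = period_log_hr(tc, ec, tt, et, 0.0, t0)
        b_l, v_l = period_log_hr(tc, ec, tt, et, t0, tau)
        rejected += abs(b_l - b_e) / np.sqrt(v_e + v_l) > 1.96
    return rejected / sims

print(ph_violation_power(2.0, 0.5))   # strong violation: high power
print(ph_violation_power(1.5, 1.5))   # proportional hazards: near alpha
```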

  11. Origins and Traditions in Comparative Education: Challenging Some Assumptions

    Science.gov (United States)

    Manzon, Maria

    2018-01-01

    This article questions some of our assumptions about the history of comparative education. It explores new scholarship on key actors and ways of knowing in the field. Building on the theory of the social constructedness of the field of comparative education, the paper elucidates how power shapes our scholarly histories and identities.

  12. Basic Automotive Mechanics. Florida Vocational Program Guide.

    Science.gov (United States)

    University of South Florida, Tampa. Dept. of Adult and Vocational Education.

    This program guide identifies primary concerns in the organization, operation, and evaluation of a basic automotive mechanics program. It is designed for local school district and community college administrators, instructors, program advisory committees, and regional coordinating councils. The guide begins with the Dictionary of Occupational…

  13. An improved method for basic hydrolysis of isoflavone malonylglucosides and quality evaluation of Chinese soy materials.

    Science.gov (United States)

    Yuan, Dan; Pan, Yingni; Chen, Yan; Uno, Toshio; Zhang, Shaohui; Kano, Yoshihiro

    2008-01-01

    A basic hydrolysis procedure is often included in sample preparation in order to quantify the malonylglucosides or acetylglucosides of soy materials. However, it is preferable not to use NaOH as the hydrolytic reagent, considering the effect of its alkalinity on subsequent HPLC injection and the low acidity of soy isoflavones. This paper presents an improved method for basic hydrolysis using ammonia as the hydrolytic reagent, without an additional neutralization step. Moreover, by means of HPLC and LC-MS methods, a systematic quality evaluation of natural soy materials from Chinese markets was established and discussed, covering soybeans, black soybeans and defatted soy flours, as well as the distribution of isoflavones in the seed coat, hypocotyl and cotyledon. The results indicate that the HPLC profiles of the various isoflavone constituents of Chinese soybeans were similar to those of Japanese ones, and those of Chinese black soybeans were similar to those of American ones. The average content of total soy isoflavones in Chinese soybeans and black soybeans was a little lower than in American and Japanese ones. Additionally, a thorough analysis of Semen Sojae Praeparatum, a Chinese herbal medicine made from fermented black soybeans or soybeans, was carried out for the first time; its characteristic HPLC profile shows a higher content of isoflavone glucosides and aglycones than natural soy materials.

  14. The Perspectives of Students and Teachers in the English Department in the College of Basic Education on the Student Evaluation of Teachers

    Science.gov (United States)

    Taqi, Hanan A.; Al-Nouh, Nowreyah A.; Dashti, Abdulmuhsin A.; Shuqair, Khaled M.

    2014-01-01

    In the context of students' evaluation of teachers in higher education, this paper examines the perspectives of students and faculty members in the English Department in the college of Basic education (CBE) in the State of Kuwait. The study is based on a survey that covered 320 students and 19 members of staff in the English department. The study…

  15. Curriculum Reform and School Performance: An Evaluation of the "New Basics."

    Science.gov (United States)

    Alexander, Karl L.; Pallas, Aaron M.

    This report examines whether a high school curriculum organized around the five "new basics" suggested by the National Commission on Excellence in Education is likely to enhance student achievement. Data from the ETS Growth Study reveals that completion of the core curriculum has sizable effects on senior-year test performance, even when…

  16. Stem Cell Basics

    Science.gov (United States)

    Stem Cell Basics I. Introduction: What are stem cells, and …

  17. Evaluation of a Workplace Basic Skills Program: An Impact Study of AVC Edmonton's 1990 Job Effectiveness Training Program at Stelco Steel. Report Summary.

    Science.gov (United States)

    Barker, Kathryn Chang

    The pilot Job Effectiveness Training (JET) workplace basic skills program, developed by Canada's Alberta Vocational College (AVC), Edmonton, for Stelco Steel during 1989-90, was evaluated in terms of impacts or changes from the perspective of the four major stakeholder groups: the students (12 Stelco employees); the employers (Stelco management);…

  18. Validity of the isotropic thermal conductivity assumption in supercell lattice dynamics

    Science.gov (United States)

    Ma, Ruiyuan; Lukes, Jennifer R.

    2018-02-01

    Superlattices and nano phononic crystals have attracted significant attention due to their low thermal conductivities and their potential application as thermoelectric materials. A widely used expression to calculate thermal conductivity, presented by Klemens and expressed in terms of the relaxation time by Callaway and Holland, originates from the Boltzmann transport equation. In its most general form, this expression involves a direct summation of the heat current contributions from individual phonons of all wavevectors and polarizations in the first Brillouin zone. In common practice, the expression is simplified by making an isotropic assumption that converts the summation over wavevector to an integral over wavevector magnitude. The isotropic expression has been applied to superlattices and phononic crystals, but its validity for different supercell sizes has not been studied. In this work, the isotropic and direct summation methods are used to calculate the thermal conductivities of bulk Si, and Si/Ge quantum dot superlattices. The results show that the differences between the two methods increase substantially with the supercell size. These differences arise because the vibrational modes neglected in the isotropic assumption provide an increasingly important contribution to the thermal conductivity for larger supercells. To avoid the significant errors that can result from the isotropic assumption, direct summation is recommended for thermal conductivity calculations in superstructures.
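    The difference between direct summation and the isotropic simplification can be illustrated with a toy dispersion (hypothetical, not the Si/Ge model of the paper): the direct estimate sums the mode contribution v_x^2 (with unit relaxation time and heat capacity) over a grid filling the cubic Brillouin zone, while the isotropic estimate uses only the [100] branch with a spherical q^2 weight, as the simplified Klemens-Callaway form does.

```python
import numpy as np

def omega(qx, qy, qz):
    """Toy cubic-lattice dispersion (arbitrary units)."""
    return np.sqrt(np.sin(qx/2)**2 + np.sin(qy/2)**2 + np.sin(qz/2)**2)

# --- direct summation over the full cubic Brillouin zone ------------------
n = 40
q = np.linspace(-np.pi, np.pi, n, endpoint=False)
QX, QY, QZ = np.meshgrid(q, q, q, indexing="ij")
h = 1e-4
vx = (omega(QX + h, QY, QZ) - omega(QX - h, QY, QZ)) / (2 * h)  # group velocity
k_direct = np.mean(vx**2)        # tau and heat capacity set to 1

# --- isotropic approximation: radial integral on the [100] branch ---------
qr = np.linspace(1e-3, np.pi, 2000)
v = 0.5 * np.abs(np.cos(qr / 2))  # d|sin(q/2)|/dq along [100]
w = qr**2                          # spherical weight q^2 dq
k_iso = np.sum(w * v**2 / 3) / np.sum(w)

print(k_direct, k_iso, abs(k_direct - k_iso) / k_direct)
```

Even for this single-branch toy model the two estimates disagree, because the spherical average discards the directional modes that the direct summation retains; the abstract reports that this gap grows with supercell size.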

  19. Measurement and Basic Physics Committee of the US cross-section evaluation working group. Annual report 1996

    International Nuclear Information System (INIS)

    Smith, D.L.; McLane, V.

    1996-11-01

    The Cross-Section Evaluation Working Group (CSEWG) is a long-standing committee charged with the responsibility for organizing and overseeing the U.S. cross-section evaluation effort. Its main product is the official U.S. evaluated nuclear data file, ENDF. The current version of this file is Version VI. All evaluations included in ENDF are reviewed and approved by CSEWG and issued by the U.S. Nuclear Data Center, Brookhaven National Laboratory. CSEWG comprises volunteers from the U.S. nuclear data community who possess expertise in evaluation methodologies and who collectively have been responsible for producing most of the evaluations included in ENDF. In 1992 CSEWG added the Measurements Committee to its list of standing committees and subcommittees. This action was based on a recognition of the importance of experimental data in the evaluation process, as well as the realization that measurement activities in the U.S. were declining at an alarming rate and needed all possible encouragement to avoid the loss of this resource. The mission of the Committee is to maintain a network of experimentalists in the U.S. that provides needed encouragement to the national nuclear data measurement effort through improved communication and facilitation of collaborative activities. In 1994, an additional charge was added to the responsibilities of the Committee, namely, to serve as an interface between the more applied interests represented in CSEWG and the basic nuclear science community. This annual report is the second such document issued by the Committee. It contains voluntary contributions from eleven laboratories in the U.S., prepared by members of the Committee and submitted to the Chairman for compilation and editing. It is hoped that the information provided here on the work going on at the reporting laboratories will prove interesting and stimulating to the readers.

  20. METHODICAL MODEL FOR TEACHING BASIC SKI TURN

    Directory of Open Access Journals (Sweden)

    Danijela Kuna

    2013-07-01

    Full Text Available With the aim of forming an expert model of the most important operators for teaching the basic ski turn in ski schools, an experiment was conducted on a sample of 20 ski experts from different countries (Croatia, Bosnia and Herzegovina, and Slovenia). From the group of the most commonly used operators for teaching the basic ski turn, the experts picked the six most important: uphill turn and jumping into snowplough, basic turn with hand sideways, basic turn with clapping, ski poles in front, ski poles on neck, and uphill turn with active ski guiding. Afterwards, ranking and selection of the most efficient operators was carried out. In line with the aim of the research, a chi-square test was used to assess the differences between the frequencies of chosen operators, the differences between the values of the most important operators, and the differences between experts according to their nationality. Statistically significant differences were noticed between the frequencies of chosen operators (χ² = 24.61; p = 0.01), while differences between the values of the most important operators were not evident (χ² = 1.94; p = 0.91). Differences between experts according to their nationality were only noticeable in the expert evaluation of the "ski poles on neck" operator (χ² = 7.83; p = 0.02). The results of the current research provide useful information about the methodological principles of organizing basic ski turn instruction in ski schools.

  1. Basic hydraulics

    CERN Document Server

    Smith, P D

    1982-01-01

    BASIC Hydraulics aims to help students both to become proficient in the BASIC programming language by actually using the language in an important field of engineering and to use computing as a means of mastering the subject of hydraulics. The book begins with a summary of the technique of computing in BASIC together with comments and listing of the main commands and statements. Subsequent chapters introduce the fundamental concepts and appropriate governing equations. Topics covered include principles of fluid mechanics; flow in pipes, pipe networks and open channels; hydraulic machinery;

  2. Fair-sampling assumption is not necessary for testing local realism

    International Nuclear Information System (INIS)

    Berry, Dominic W.; Jeong, Hyunseok; Stobinska, Magdalena; Ralph, Timothy C.

    2010-01-01

    Almost all Bell inequality experiments to date have used postselection and therefore relied on the fair sampling assumption for their interpretation. The standard form of the fair sampling assumption is that the loss is independent of the measurement settings, so the ensemble of detected systems provides a fair statistical sample of the total ensemble. This is often assumed to be needed to interpret Bell inequality experiments as ruling out hidden-variable theories. Here we show that it is not necessary; the loss can depend on measurement settings, provided the detection efficiency factorizes as a function of the measurement settings and any hidden variable. This condition implies that Tsirelson's bound must be satisfied for entangled states. On the other hand, we show that it is possible for Tsirelson's bound to be violated while the Clauser-Horne-Shimony-Holt (CHSH)-Bell inequality still holds for unentangled states, and present an experimentally feasible example.
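
    The Tsirelson bound referenced in the abstract can be checked numerically. The sketch below is not taken from the paper; it uses the standard textbook singlet-state correlation E(a, b) = -cos(a - b) and evaluates the CHSH combination at the angle settings that maximize it:

```python
import numpy as np

def correlation(a, b):
    """Quantum correlation E(a, b) for the singlet state: -cos(a - b)."""
    return -np.cos(a - b)

def chsh(a, a2, b, b2):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')."""
    return (correlation(a, b) - correlation(a, b2)
            + correlation(a2, b) + correlation(a2, b2))

# Angle settings that maximize |S| for the singlet state
S = chsh(0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4)
print(abs(S))  # 2*sqrt(2) ~= 2.828, Tsirelson's bound; local realism obeys |S| <= 2
```

    Any local hidden-variable model satisfies |S| ≤ 2, so the quantum value 2√2 is the violation the CHSH experiments test for.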

  3. Ex-post evaluation. Research independency of the basic science study of JAERI

    International Nuclear Information System (INIS)

    Yanagisawa, Kazuaki; Takahashi, Shoji

    2010-06-01

    Research independency is defined here as the continuity and development of a research field over the course of its history. The authors took three fields as research parameters for the ex-post evaluation, all belonging to the basic science research conducted at the Japan Atomic Energy Research Institute (JAERI). The first parameter was actinides, situated at the center of the research network from the socio-economic viewpoint. The second was positron, situated at the periphery of the research network, and the third was neutron, which faced competition from other research organizations in Japan. All three were supported and promoted financially by JAERI. The target years covered 1978 to 2002, a 25-year period. INIS (International Nuclear Information System), operated by the International Atomic Energy Agency (IAEA), was used as the tool for the present bibliometric study. It was revealed that important factors leading to the sustainable success of research independency were constant efforts to accomplish the mission, the education of successors by passing on explicit and tacit research findings, and the construction of intellectual networks with learned circles and industries that collaborated well with JAERI. These were quantitatively clarified. Conversely, the main factors that impeded the development of research independency were the discontinuance of research caused by the retirement, change of post or change of occupation of core researchers, and their unexpected deaths. Among the three parameters, the authors confirmed time-dependent stages of germination, development and decline of research independency, attributable to the interaction between the sustaining and impeding factors. For this kind of ex-post evaluation, the support of a field research laboratory was indispensable. (author)

  4. Positron emission tomography basic sciences

    CERN Document Server

    Townsend, D W; Valk, P E; Maisey, M N

    2003-01-01

    Essential for students and science and medical graduates who want to understand the basic science of Positron Emission Tomography (PET), this book describes the physics, chemistry and technology behind PET, together with an overview of its clinical uses and imaging techniques. In recent years, PET has moved from being a high-end research imaging tool used by the highly specialized to an essential component of clinical evaluation, especially in cancer management. Previously the realm of scientists, PET is explained here through its instrumentation, radiochemistry, data acquisition and image formation, integration of structural and functional images, radiation dosimetry and protection, and applications in dedicated areas such as drug development, oncology, and gene expression imaging. The technologist, the science, engineering or chemistry graduate seeking further detailed information about PET, or the medical advanced trainee wishing to gain insight into the basic science of PET will find this book...

  5. Anesthesia Basics

    Science.gov (United States)

    KidsHealth / For Teens / Anesthesia Basics (also available in Spanish: "Conceptos básicos sobre la anestesia"). What Is Anesthesia? No doubt about it, getting an operation can ...

  6. Evaluation of machine learning algorithms for prediction of regions of high Reynolds averaged Navier Stokes uncertainty

    Science.gov (United States)

    Ling, J.; Templeton, J.

    2015-08-01

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. Feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.
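
    As a rough illustration of the approach described above — training a classifier to flag points where a RANS assumption breaks down — the following sketch uses synthetic features and labels. The paper's actual flow features and DNS/LES-derived labels are not reproduced here; everything below the comments is a stand-in.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-point flow features (e.g. strain-rate/vorticity invariants);
# the actual feature set used in the paper differs.
X = rng.normal(size=(2000, 5))
# Synthetic stand-in for a DNS/LES-derived label: 1 where a RANS assumption
# (e.g. eddy-viscosity non-negativity) breaks down, 0 elsewhere.
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

    Held-out evaluation on flows unlike the training set is the analogue of the generalization test the paper emphasizes.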

  7. Bank stress testing under different balance sheet assumptions

    OpenAIRE

    Busch, Ramona; Drescher, Christian; Memmel, Christoph

    2017-01-01

    Using unique supervisory survey data on the impact of a hypothetical interest rate shock on German banks, we analyse price and quantity effects on banks' net interest margin components under different balance sheet assumptions. In the first year, the cross-sectional variation of banks' simulated price effect is nearly eight times as large as that of the simulated quantity effect. After five years, however, the importance of both effects converges. Large banks adjust their balance sheets mo...

  8. Are Prescription Opioids Driving the Opioid Crisis? Assumptions vs Facts.

    Science.gov (United States)

    Rose, Mark Edmund

    2018-04-01

    Sharp increases in opioid prescriptions, and associated increases in overdose deaths in the 2000s, evoked widespread calls to change perceptions of opioid analgesics. Medical literature discussions of opioid analgesics began emphasizing patient and public health hazards. Repetitive exposure to this information may influence physician assumptions. While highly consequential to patients with pain whose function and quality of life may benefit from opioid analgesics, current assumptions about prescription opioid analgesics, including their role in the ongoing opioid overdose epidemic, have not been scrutinized. Information was obtained by searching PubMed, governmental agency websites, and conference proceedings. Opioid analgesic prescribing and associated overdose deaths both peaked around 2011 and are in long-term decline; the sharp overdose increase recorded in 2014 was driven by illicit fentanyl and heroin. Nonmethadone prescription opioid analgesic deaths, in the absence of co-ingested benzodiazepines, alcohol, or other central nervous system/respiratory depressants, are infrequent. Within five years of initial prescription opioid misuse, 3.6% initiate heroin use. The United States consumes 80% of the world opioid supply, but opioid access is nonexistent for 80% and severely restricted for 4.1% of the global population. Many current assumptions about opioid analgesics are ill-founded. Illicit fentanyl and heroin, not opioid prescribing, now fuel the current opioid overdose epidemic. National discussion has often neglected the potentially devastating effects of uncontrolled chronic pain. Opioid analgesic prescribing and related overdoses are in decline, at great cost to patients with pain who have benefited or may benefit from, but cannot access, opioid analgesic therapy.

  9. Moving from assumption to observation: Implications for energy and emissions impacts of plug-in hybrid electric vehicles

    International Nuclear Information System (INIS)

    Davies, Jamie; Kurani, Kenneth S.

    2013-01-01

    Plug-in hybrid electric vehicles (PHEVs) are currently for sale in most parts of the United States, Canada, Europe and Japan. These vehicles are promoted as providing distinct consumer and public benefits at the expense of grid electricity. However, the specific benefits or impacts of PHEVs ultimately rely on consumers' purchase and vehicle use patterns. While considerable effort has been dedicated to understanding PHEV impacts on a per-mile basis, few studies have assessed the impacts of PHEVs given actual consumer use patterns or operating conditions. Instead, simplifying assumptions have been made about the types of cars individual consumers will choose to purchase and how they will drive and charge them. Here, we highlight some of these consumer purchase and use assumptions and the studies which have employed them, and compare these assumptions to actual consumer data recorded in a PHEV demonstration project. Using simulation and hypothetical scenarios, we discuss the implications for PHEV impact analyses and policy if assumptions about key PHEV consumer use variables such as vehicle choice, home charging frequency, distribution of driving distances, and access to workplace charging were to change. -- Highlights: •The specific benefits or impacts of PHEVs ultimately rely on consumers' purchase and vehicle use patterns. •Simplifying, untested assumptions have been made by prior studies about PHEV consumer driving, charging and vehicle purchase behaviors. •Some simplifying assumptions do not match observed data from a PHEV demonstration project. •Changing the assumptions about PHEV consumer driving, charging, and vehicle purchase behaviors affects estimates of PHEV impacts. •Premature simplification may have lasting consequences for standard setting and performance-based incentive programs which rely on these estimates.

  10. Using LISREL to Evaluate Measurement Models and Scale Reliability.

    Science.gov (United States)

    Fleishman, John; Benson, Jeri

    1987-01-01

    LISREL program was used to examine measurement model assumptions and to assess reliability of Coopersmith Self-Esteem Inventory for Children, Form B. Data on 722 third-sixth graders from over 70 schools in large urban school district were used. LISREL program assessed (1) nature of basic measurement model for scale, (2) scale invariance across…

  11. Data-driven smooth tests of the proportional hazards assumption

    Czech Academy of Sciences Publication Activity Database

    Kraus, David

    2007-01-01

    Roč. 13, č. 1 (2007), s. 1-16 ISSN 1380-7870 R&D Projects: GA AV ČR(CZ) IAA101120604; GA ČR(CZ) GD201/05/H007 Institutional research plan: CEZ:AV0Z10750506 Keywords : Cox model * Neyman's smooth test * proportional hazards assumption * Schwarz's selection rule Subject RIV: BA - General Mathematics Impact factor: 0.491, year: 2007

  12. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning.

    Directory of Open Access Journals (Sweden)

    Anne Hsu

    Full Text Available A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning.

  13. Sampling Assumptions Affect Use of Indirect Negative Evidence in Language Learning

    Science.gov (United States)

    2016-01-01

    A classic debate in cognitive science revolves around understanding how children learn complex linguistic patterns, such as restrictions on verb alternations and contractions, without negative evidence. Recently, probabilistic models of language learning have been applied to this problem, framing it as a statistical inference from a random sample of sentences. These probabilistic models predict that learners should be sensitive to the way in which sentences are sampled. There are two main types of sampling assumptions that can operate in language learning: strong and weak sampling. Strong sampling, as assumed by probabilistic models, assumes the learning input is drawn from a distribution of grammatical samples from the underlying language and aims to learn this distribution. Thus, under strong sampling, the absence of a sentence construction from the input provides evidence that it has low or zero probability of grammaticality. Weak sampling does not make assumptions about the distribution from which the input is drawn, and thus the absence of a construction from the input is not used as evidence of its ungrammaticality. We demonstrate in a series of artificial language learning experiments that adults can produce behavior consistent with both sets of sampling assumptions, depending on how the learning problem is presented. These results suggest that people use information about the way in which linguistic input is sampled to guide their learning. PMID:27310576

  14. Investigating Teachers' and Students' Beliefs and Assumptions about CALL Programme at Caledonian College of Engineering

    Science.gov (United States)

    Ali, Holi Ibrahim Holi

    2012-01-01

    This study investigates students' and teachers' perceptions and assumptions about the newly implemented CALL programme at the School of Foundation Studies, Caledonian College of Engineering, Oman. Two versions of a questionnaire were administered to 24 teachers and 90 students to collect their beliefs and assumptions about the CALL programme. The…

  15. The Metatheoretical Assumptions of Literacy Engagement: A Preliminary Centennial History

    Science.gov (United States)

    Hruby, George G.; Burns, Leslie D.; Botzakis, Stergios; Groenke, Susan L.; Hall, Leigh A.; Laughter, Judson; Allington, Richard L.

    2016-01-01

    In this review of literacy education research in North America over the past century, the authors examined the historical succession of theoretical frameworks on students' active participation in their own literacy learning, and in particular the metatheoretical assumptions that justify those frameworks. The authors used "motivation" and…

  16. Basic Pharmaceutical Sciences Examination as a Predictor of Student Performance during Clinical Training.

    Science.gov (United States)

    Fassett, William E.; Campbell, William H.

    1984-01-01

    A comparison of Basic Pharmaceutical Sciences Examination (BPSE) results with student performance evaluations in core clerkships, institutional and community externships, didactic and clinical courses, and related basic science coursework revealed the BPSE does not predict student performance during clinical instruction. (MSE)

  17. BASIC Programming.

    Science.gov (United States)

    Jennings, Carol Ann

    Designed for use by both secondary- and postsecondary-level business teachers, this curriculum guide consists of 10 units of instructional materials dealing with Beginner's All-purpose Symbolic Instruction Code (BASIC) programming. Topics of the individual lessons are numbering BASIC programs and using the PRINT, END, and REM statements; system…

  18. Testing the rationality assumption using a design difference in the TV game show 'Jeopardy'

    OpenAIRE

    Sjögren Lindquist, Gabriella; Säve-Söderbergh, Jenny

    2006-01-01

    This paper empirically investigates the rationality assumption commonly applied in economic modeling by exploiting a design difference in the game show Jeopardy between the US and Sweden. In particular, we address the assumption of individuals' capability to process complex mathematical problems in order to find optimal strategies. The vital difference is that US contestants are given explicit information before they act, while Swedish contestants individually need to calculate the same info...

  19. Assumptions for the Annual Energy Outlook 1992

    International Nuclear Information System (INIS)

    1992-01-01

    This report serves as an auxiliary document to the Energy Information Administration (EIA) publication Annual Energy Outlook 1992 (AEO) (DOE/EIA-0383(92)), released in January 1992. The AEO forecasts were developed for five alternative cases and consist of energy supply, consumption, and price projections by major fuel and end-use sector, which are published at a national level of aggregation. The purpose of this report is to present important quantitative assumptions, including world oil prices and macroeconomic growth, underlying the AEO forecasts. The report has been prepared in response to external requests, as well as analyst requirements for background information on the AEO and studies based on the AEO forecasts.

  20. Investigating assumptions of crown archetypes for modelling LiDAR returns

    NARCIS (Netherlands)

    Calders, K.; Lewis, P.; Disney, M.; Verbesselt, J.; Herold, M.

    2013-01-01

    LiDAR has the potential to derive canopy structural information such as tree height and leaf area index (LAI), via models of the LiDAR signal. Such models often make assumptions regarding crown shape to simplify parameter retrieval and crown archetypes are typically assumed to contain a turbid

  1. Incorporation of proficiency criteria for basic laparoscopic skills training: How does it work?

    NARCIS (Netherlands)

    E. Verdaasdonk (Egg); J. Dankelman (Jenny); J.F. Lange (Johan); L.P. Stassen (Laurents)

    2008-01-01

    textabstractBackground: It is desirable that surgical trainees are proficient in basic laparoscopic motor skills (eye-hand coordination). The present study evaluated the use of predefined proficiency criteria on a basic virtual reality (VR) simulator in preparation for a laparoscopic course on

  2. Semi-Supervised Transductive Hot Spot Predictor Working on Multiple Assumptions

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-05-23

    Protein-protein interactions are critically dependent on just a few residues (“hot spots”) at the interfaces. Hot spots make a dominant contribution to the binding free energy, and if mutated they can disrupt the interaction. As mutagenesis studies require significant experimental effort, there exists a need for accurate and reliable computational hot spot prediction methods. Compared to supervised hot spot prediction algorithms, semi-supervised prediction methods can take into consideration both the labeled and unlabeled residues in the dataset during the prediction procedure. The transductive support vector machine has been utilized for this task and demonstrated better prediction performance. To the best of our knowledge, however, none of the transductive semi-supervised algorithms takes all three semi-supervised assumptions, i.e., the smoothness, cluster and manifold assumptions, together into account during learning. In this paper, we propose a novel semi-supervised method for hot spot residue prediction that considers all three semi-supervised assumptions using nonlinear models. Our algorithm, IterPropMCS, works in an iterative manner. In each iteration, the algorithm first propagates the labels of the labeled residues to the unlabeled ones, along the shortest path between them on a graph, assuming that they lie on a nonlinear manifold. Then it selects the most confident residues as the labeled ones for the next iteration, according to the cluster and smoothness criteria, which are implemented by a nonlinear density estimator. Experiments on a benchmark dataset, using protein structure-based features, demonstrate that our approach is effective in predicting hot spots and compares favorably to other available methods. The results also show that our method outperforms the state-of-the-art transductive learning methods.
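
    A minimal sketch of the propagation step described above, under several stated assumptions: toy feature vectors in place of the paper's structure-based features, a k-nearest-neighbour graph in place of its manifold graph, and a single propagation pass (the full IterPropMCS method iterates and adds a density-based confidence selection):

```python
import numpy as np
from scipy.sparse.csgraph import dijkstra
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(1)

# Toy residue feature vectors; residues 0 and 1 play the role of labeled seeds.
X = rng.normal(size=(30, 4))
labels = np.full(30, -1)     # -1 marks unlabeled residues
labels[0], labels[1] = 1, 0  # 1 = hot spot, 0 = non-hot spot

# kNN graph weighted by feature distance, standing in for the paper's
# manifold graph; shortest-path distances approximate geodesics on it.
graph = kneighbors_graph(X, n_neighbors=5, mode="distance")
D = dijkstra(graph, directed=False)

seeds = np.where(labels >= 0)[0]
for i in np.where(labels < 0)[0]:
    # Propagate the label of the seed that is nearest along the graph.
    labels[i] = labels[seeds[np.argmin(D[i, seeds])]]

# The full method would now keep only the most confident new labels
# (via a nonlinear density estimate) and repeat with an enlarged seed set.
print(np.bincount(labels))
```

    The shortest-path step is what encodes the manifold assumption; the confidence filter, omitted here, encodes the cluster and smoothness assumptions.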

  3. Automatic ethics: the effects of implicit assumptions and contextual cues on moral behavior.

    Science.gov (United States)

    Reynolds, Scott J; Leavitt, Keith; DeCelles, Katherine A

    2010-07-01

    We empirically examine the reflexive or automatic aspects of moral decision making. To begin, we develop and validate a measure of an individual's implicit assumption regarding the inherent morality of business. Then, using an in-basket exercise, we demonstrate that an implicit assumption that business is inherently moral impacts day-to-day business decisions and interacts with contextual cues to shape moral behavior. Ultimately, we offer evidence supporting a characterization of employees as reflexive interactionists: moral agents whose automatic decision-making processes interact with the environment to shape their moral behavior.

  4. Evaluation of a School Building in Turkey According to the Basic Sustainable Design Criteria

    Science.gov (United States)

    Arslan, H. D.

    2017-08-01

    In Turkey, as in many other developing countries, the significance of sustainable education buildings has only recently become recognized, and sustainability has not been sufficiently addressed in laws and regulations. In this study, architectural sustainability is first explained in terms of basic design criteria. A selected standard-type primary school project in Turkey is then evaluated against these sustainable design criteria. Type projects for school buildings significantly limit the sustainability performance that can be expected from them. Type projects clearly shorten the planning time, since they involve a design process that is independent of the settlement and are repeated in various places with different characteristics. On the other hand, disadvantages such as overlooking the natural, physical and structural properties of the location mostly restrict the sustainable design of the building. For sustainable buildings, factors such as the environment, land, climate, insolation, orientation, etc. must be taken into consideration at the beginning stage. Implementation of type projects can therefore be deemed inappropriate for sustainability.

  5. ψ -ontology result without the Cartesian product assumption

    Science.gov (United States)

    Myrvold, Wayne C.

    2018-05-01

    We introduce a weakening of the preparation independence postulate of Pusey et al. [Nat. Phys. 8, 475 (2012), 10.1038/nphys2309] that does not presuppose that the space of ontic states resulting from a product-state preparation can be represented by the Cartesian product of subsystem state spaces. On the basis of this weakened assumption, it is shown that, in any model that reproduces the quantum probabilities, any pair of pure quantum states |ψ⟩, |ϕ⟩ with |⟨ψ|ϕ⟩| ≤ 1/√2 must be ontologically distinct.

  6. Some basic thermohydraulic calculation methods for the analysis of pressure transients in a multicompartment total containment enclosing a breached water reactor circuit

    International Nuclear Information System (INIS)

    Porter, W.H.L.

    1976-05-01

    This paper gives an appreciation and commentary of the basic calculation methods under development at AEE Winfrith for the analysis of multicompartment total containments. The assumptions introduced and the effects of their variation are important in establishing a parametric survey of the range of possible conditions which the containment may be required to meet. These aspects of the performance will be discussed as each individual factor in the train of events is examined in turn. (U.K.)

  7. Using Contemporary Art to Challenge Cultural Values, Beliefs, and Assumptions

    Science.gov (United States)

    Knight, Wanda B.

    2006-01-01

    Art educators, like many other educators born or socialized within the main-stream culture of a society, seldom have an opportunity to identify, question, and challenge their cultural values, beliefs, assumptions, and perspectives because school culture typically reinforces those they learn at home and in their communities (Bush & Simmons, 1990).…

  8. Tank waste remediation system retrieval and disposal mission key enabling assumptions

    International Nuclear Information System (INIS)

    Baldwin, J.H.

    1998-01-01

    An overall systems approach has been applied to develop action plans to support the retrieval and immobilized waste disposal mission. The review concluded that the systems and infrastructure required to support the mission are known. Required systems are either in place or plans have been developed to ensure they exist when needed. The review showed that since October 1996 a robust systems engineering approach to establishing integrated Technical Baselines, work breakdown structures, tank farm structures and configurations, and work scope and costs has established itself as part of the culture within TWRS. An analysis of the programmatic, management and technical activities necessary to declare readiness to proceed with execution of the mission demonstrates that the systems, people and hardware will be on line and ready to support the private contractors. The systems approach included defining the retrieval and immobilized waste disposal mission requirements and evaluating the readiness of the TWRS contractor to supply waste feed to the private contractors in June 2002. The Phase 1 feed delivery requirements from the Private Contractor Request for Proposals were reviewed. Transfer piping routes were mapped out, existing systems were evaluated, and upgrade requirements were defined. Technical Basis Reviews were completed to define work scope in greater detail, and cost estimates and associated year-by-year financial analyses were completed. TWRS personnel training, qualifications, management systems and procedures were reviewed and shown to be in place and ready to support the Phase 1B mission. Key assumptions and risks that could negatively impact mission success were evaluated, and appropriate mitigation action plans were developed and scheduled.

  9. Assumptions and Challenges of Open Scholarship

    Directory of Open Access Journals (Sweden)

    George Veletsianos

    2012-10-01

    Full Text Available Researchers, educators, policymakers, and other education stakeholders hope and anticipate that openness and open scholarship will generate positive outcomes for education and scholarship. Given the emerging nature of open practices, educators and scholars are finding themselves in a position in which they can shape and/or be shaped by openness. The intention of this paper is (a) to identify the assumptions of the open scholarship movement and (b) to highlight challenges associated with the movement's aspirations of broadening access to education and knowledge. Through a critique of technology use in education, an understanding of educational technology narratives and their unfulfilled potential, and an appreciation of the negotiated implementation of technology use, we hope that this paper helps spark a conversation for a more critical, equitable, and effective future for education and open scholarship.

  10. Robustness Analysis of Visual QA Models by Basic Questions

    KAUST Repository

    Huang, Jia-Hong

    2017-09-14

    Visual Question Answering (VQA) models should have both high robustness and accuracy. Unfortunately, most current VQA research focuses only on accuracy, because there is a lack of proper methods to measure the robustness of VQA models. There are two main modules in our algorithm. Given a natural language question about an image, the first module takes the question as input and outputs the ranked basic questions, with similarity scores, of the main given question. The second module takes the main question, the image and these basic questions as input and outputs the text-based answer to the main question about the given image. We claim that a robust VQA model is one whose performance is not changed much when related basic questions are also made available to it as input. We formulate the basic question generation problem as a LASSO optimization, and also propose a large-scale Basic Question Dataset (BQD) and Rscore (a novel robustness measure) for analyzing the robustness of VQA models. We hope our BQD will be used as a benchmark to evaluate the robustness of VQA models, so as to help the community build more robust and accurate VQA models.
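
    The LASSO formulation of basic-question ranking described above can be sketched as follows, using hypothetical embeddings (the paper's actual question representations and regularization settings are not given here): express the main question's vector as a sparse non-negative combination of candidate basic-question vectors, and rank candidates by their weights.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Hypothetical 64-d embeddings for 50 candidate basic questions (rows).
basic = rng.normal(size=(50, 64))
# Toy main question built from basics 3 and 17, so those should rank highest.
main = 0.7 * basic[3] + 0.3 * basic[17]

# Solve min_w ||main - basic.T @ w||^2 + alpha * ||w||_1 with w >= 0;
# the sparse weights serve as similarity scores for ranking.
lasso = Lasso(alpha=0.05, positive=True).fit(basic.T, main)
ranking = np.argsort(lasso.coef_)[::-1]
print(ranking[:2])  # indices of the two highest-weighted basic questions
```

    The L1 penalty is what keeps the score vector sparse, so only a handful of basic questions receive nonzero similarity.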

  11. Robustness Analysis of Visual QA Models by Basic Questions

    KAUST Repository

    Huang, Jia-Hong; Alfadly, Modar; Ghanem, Bernard

    2017-01-01

    Visual Question Answering (VQA) models should have both high robustness and accuracy. Unfortunately, most current VQA research focuses only on accuracy, because there is a lack of proper methods to measure the robustness of VQA models. There are two main modules in our algorithm. Given a natural language question about an image, the first module takes the question as input and outputs the ranked basic questions, with similarity scores, of the main given question. The second module takes the main question, the image and these basic questions as input and outputs the text-based answer to the main question about the given image. We claim that a robust VQA model is one whose performance is not changed much when related basic questions are also made available to it as input. We formulate the basic question generation problem as a LASSO optimization, and also propose a large-scale Basic Question Dataset (BQD) and Rscore (a novel robustness measure) for analyzing the robustness of VQA models. We hope our BQD will be used as a benchmark to evaluate the robustness of VQA models, so as to help the community build more robust and accurate VQA models.

  12. World assumptions, posttraumatic stress and quality of life after a natural disaster: A longitudinal study

    Science.gov (United States)

    2012-01-01

    Background Changes in world assumptions are a fundamental concept within theories that explain posttraumatic stress disorder. The objective of the present study was to gain a greater understanding of how changes in world assumptions are related to quality of life and posttraumatic stress symptoms after a natural disaster. Methods A longitudinal study of 574 Norwegian adults who survived the Southeast Asian tsunami in 2004 was undertaken. Multilevel analyses were used to identify which factors at six months post-tsunami predicted quality of life and posttraumatic stress symptoms two years post-tsunami. Results Good quality of life and posttraumatic stress symptoms were negatively related. However, major differences in the predictors of these outcomes were found. Females reported significantly higher quality of life and more posttraumatic stress than men. The association between level of exposure to the tsunami and quality of life seemed to be mediated by posttraumatic stress. Negative perceived changes in the assumption “the world is just” were related to adverse outcome in both quality of life and posttraumatic stress. Positive perceived changes in the assumptions “life is meaningful” and “feeling that I am a valuable human” were associated with higher levels of quality of life but not with posttraumatic stress. Conclusions Quality of life and posttraumatic stress symptoms demonstrate differences in their etiology. World assumptions may be less specifically related to posttraumatic stress than has been postulated in some cognitive theories. PMID:22742447

  13. Oil production, oil prices, and macroeconomic adjustment under different wage assumptions

    International Nuclear Information System (INIS)

    Harvie, C.; Maleka, P.T.

    1992-01-01

    In a previous paper one of the authors developed a simple model to try to identify the possible macroeconomic adjustment processes arising in an economy experiencing a temporary period of oil production, under alternative wage adjustment assumptions, namely nominal and real wage rigidity. Certain assumptions were made regarding the characteristics of actual production, the permanent revenues generated from that oil production, and the net exports/imports of oil. The role of the price of oil, and possible changes in that price was essentially ignored. Here we attempt to incorporate the price of oil, as well as changes in that price, in conjunction with the production of oil, the objective being to identify the contribution which the price of oil, and changes in it, make to the adjustment process itself. The emphasis in this paper is not given to a mathematical derivation and analysis of the model's dynamics of adjustment or its comparative statics, but rather to the derivation of simulation results from the model, for a specific assumed case, using a numerical algorithm program, conducive to the type of theoretical framework utilized here. The results presented suggest that although the adjustment profiles of the macroeconomic variables of interest, for either wage adjustment assumption, remain fundamentally the same, the magnitude of these adjustments is increased. Hence to derive a more accurate picture of the dimensions of adjustment of these macroeconomic variables, it is essential to include the price of oil as well as changes in that price. (Author)

  14. Spatial Angular Compounding for Elastography without the Incompressibility Assumption

    OpenAIRE

    Rao, Min; Varghese, Tomy

    2005-01-01

    Spatial-angular compounding is a new technique that enables the reduction of noise artifacts in ultrasound elastography. Previous results using spatial angular compounding, however, were based on the use of the tissue incompressibility assumption. Compounded elastograms were obtained from a spatially-weighted average of local strain estimated from radiofrequency echo signals acquired at different insonification angles. In this paper, we present a new method for reducing the noise artifacts in...

  15. Hydromechanics - basic properties

    International Nuclear Information System (INIS)

    Lee, Sung Tak; Lee, Je Geun

    1987-03-01

    This book covers the basic properties of hydromechanics: concepts and definitions, mass, force and weight, and perfect fluids and perfect gases; hydrostatics, including its basic equation and the relative equilibrium of fluids; fluid kinematics and methods of describing flow; basic equations of fluid motion, including the momentum equation, the energy equation and applications of the Bernoulli equation; applications of momentum theory; inviscid flow; and fluid measurement.
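As a small worked example of the Bernoulli-equation applications such a text covers, Torricelli's theorem gives the efflux speed from an open tank under a head h; the numbers below are illustrative, not taken from the book.

```python
import math

def efflux_speed(head, g=9.81):
    """Torricelli's theorem, a standard application of the Bernoulli
    (energy) equation: v = sqrt(2 * g * h) for an ideal fluid."""
    return math.sqrt(2.0 * g * head)

v = efflux_speed(head=2.0)   # ≈ 6.26 m/s for a 2 m head
```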

  16. Medical students can learn the basic application, analytic, evaluative, and psychomotor skills of critical care medicine.

    Science.gov (United States)

    Rogers, P L; Jacob, H; Thomas, E A; Harwell, M; Willenkin, R L; Pinsky, M R

    2000-02-01

    To determine whether fourth-year medical students can learn the basic analytic, evaluative, and psychomotor skills needed to initially manage a critically ill patient. Student learning was evaluated using a performance examination, the objective structured clinical examination (OSCE). Students were randomly assigned to one of two clinical scenarios before the elective. After the elective, students completed the other scenario, using a crossover design. Five surgical intensive care units in a tertiary care university teaching hospital. Forty fourth-year medical students enrolled in the critical care medicine (CCM) elective. All students evaluated a live "simulated critically ill" patient, requested physiologic data from a nurse, ordered laboratory tests, received data in real time, and intervened as they deemed appropriate. Student performance of specific behavioral objectives was evaluated at five stations. They were expected to a) assess airway, breathing, and circulation in appropriate sequence; b) prepare a manikin for intubation, obtain an acceptable airway on the manikin, demonstrate bag-mouth ventilation, and perform acceptable laryngoscopy and intubation; c) provide appropriate mechanical ventilator settings; d) manage hypotension; and e) request and interpret pulmonary artery data and initiate appropriate therapy. OSCEs were videotaped and reviewed by two faculty members masked to time of examination. A checklist of key behaviors was used to evaluate performance. The primary outcome measure was the difference in examination score before and after the rotation. Secondary outcomes included the difference in scores at each rotation. The mean preelective score was 57.0%+/-8.3%, compared with 85.9%+/-7.4% after the elective. Fourth-year medical students can learn the basic application, analytic, evaluative, and psychomotor skills necessary to initially manage critically ill patients. After an appropriate 1-month CCM elective, students' thinking and application skills required to initially manage critically ill patients improved markedly, as demonstrated by an OSCE.

  17. Robust inference in summary data Mendelian randomization via the zero modal pleiotropy assumption.

    Science.gov (United States)

    Hartwig, Fernando Pires; Davey Smith, George; Bowden, Jack

    2017-12-01

    Mendelian randomization (MR) is being increasingly used to strengthen causal inference in observational studies. Availability of summary data of genetic associations for a variety of phenotypes from large genome-wide association studies (GWAS) allows straightforward application of MR using summary data methods, typically in a two-sample design. In addition to the conventional inverse variance weighting (IVW) method, recently developed summary data MR methods, such as the MR-Egger and weighted median approaches, allow a relaxation of the instrumental variable assumptions. Here, a new method - the mode-based estimate (MBE) - is proposed to obtain a single causal effect estimate from multiple genetic instruments. The MBE is consistent when the largest number of similar (identical in infinite samples) individual-instrument causal effect estimates comes from valid instruments, even if the majority of instruments are invalid. We evaluate the performance of the method in simulations designed to mimic the two-sample summary data setting, and demonstrate its use by investigating the causal effect of plasma lipid fractions and urate levels on coronary heart disease risk. The MBE presented less bias and lower type-I error rates than other methods under the null in many situations. Its power to detect a causal effect was smaller compared with the IVW and weighted median methods, but was larger than that of MR-Egger regression, with sample size requirements typically smaller than those available from GWAS consortia. The MBE relaxes the instrumental variable assumptions, and should be used in combination with other approaches in sensitivity analyses. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association
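The mode-based idea can be sketched as follows: compute one Wald ratio per genetic instrument and take the mode of their smoothed distribution, so that a minority of invalid instruments does not drag the estimate. The toy summary statistics and the kernel-density implementation below are assumptions for illustration, not the authors' exact estimator (which uses specific bandwidth rules and bootstrapped standard errors).

```python
import numpy as np
from scipy.stats import gaussian_kde

def mode_based_estimate(beta_exposure, beta_outcome):
    """Sketch of a mode-based estimate: one Wald ratio per instrument,
    then the mode of their kernel-smoothed density."""
    ratios = beta_outcome / beta_exposure
    kde = gaussian_kde(ratios)                       # default (Scott) bandwidth
    grid = np.linspace(ratios.min(), ratios.max(), 2001)
    return grid[np.argmax(kde(grid))]

# Toy summary data: 7 valid instruments (true effect 0.5), 3 pleiotropic ones.
rng = np.random.default_rng(1)
bx = rng.uniform(0.2, 0.6, size=10)
by = 0.5 * bx
by[:3] += np.array([0.30, 0.35, 0.40])   # bias the three invalid instruments
est = mode_based_estimate(bx, by)        # stays close to 0.5 despite them
```

An inverse-variance-weighted mean of the same ratios would be pulled upward by the three pleiotropic instruments; the mode is not.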

  18. Developing Competency of Teachers in Basic Education Schools

    Science.gov (United States)

    Yuayai, Rerngrit; Chansirisira, Pacharawit; Numnaphol, Kochaporn

    2015-01-01

    This study aims to develop competency of teachers in basic education schools. The research instruments included the semi-structured in-depth interview form, questionnaire, program developing competency, and evaluation competency form. The statistics used for data analysis were percentage, mean, and standard deviation. The research found that…

  19. First assumptions and overlooking competing causes of death

    DEFF Research Database (Denmark)

    Leth, Peter Mygind; Andersen, Anh Thao Nguyen

    2014-01-01

    Determining the most probable cause of death is important, and it is sometimes tempting to assume an obvious cause of death, when it readily presents itself, and stop looking for other competing causes of death. The case story presented in the article illustrates this dilemma. The first assumption...... of cause of death, which was based on results from bacteriology tests, proved to be wrong when the results from the forensic toxicology testing became available. This case also illustrates how post mortem computed tomography (PMCT) findings of radio opaque material in the stomach alerted the pathologist...

  20. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality an

  1. Endotracheal tube placement by EMT-Basics in a rural EMS system.

    Science.gov (United States)

    Pratt, Jeffrey C; Hirshberg, Alan J

    2005-01-01

    To evaluate the effectiveness of an intubation-training module and special-waiver project in which Emergency Medical Technician (EMT)-Basics were trained to perform endotracheal intubations in a rural community. This was a prospective observational study over a four-year period (July 1998 through May 2002) of all intubation attempts by EMT-Basics in the field. The authors observed intubation data, training methods, and quality-assurance methods of a special-waiver project agreed to by the State Department of Public Health to train and allow EMT-Basics to intubate patients. Data were from documentation unique to the project. Project documentation evaluated the placement and complication(s) of endotracheal tube (ETT) placement after arrival to the emergency department. An intubation attempt was defined as direct laryngoscopy. A successful attempt was defined as an appropriately sized ETT placed and secured in the trachea below the vocal cords and above the carina. Confirmation of placement in the field included accepted clinical methods and the use of qualitative colorimetric end-tidal carbon dioxide detectors. The EMT-Basics were trained using a paramedic curriculum, including operating room intubations on live adult patients. All patients were in either cardiopulmonary or respiratory arrest. Thirty-two intubations were performed by EMT-Basics. Thirty attempts were successful and two were unsuccessful (94%; 95% confidence interval [CI] 80-98%). Unsuccessful ETT placements were managed with accepted basic life support airway standards. There were no unrecognized esophageal ETT placements (0%; 95% CI 0-11%). This study demonstrated that with an intensive training program using selected highly motivated providers and close monitoring, a program of EMT-Basic ETT placement in a rural setting can achieve acceptable success rates in patients in cardiac or respiratory arrest.
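The reported interval (94%; 95% CI 80-98%) for 30 of 32 successful intubations is consistent with a Wilson score interval; the sketch below reproduces it under that assumption (the abstract does not state which CI method was used).

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion - the kind of
    95% CI reported for the 30/32 intubation success rate."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

lo, hi = wilson_ci(30, 32)   # ≈ (0.80, 0.98), matching the reported CI
```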

  2. Basic considerations for the safety analysis report of the Greek Research Reactor-1 (GRR-1)

    International Nuclear Information System (INIS)

    Anoussis, J.N.; Chrysochoides, N.G.; Papastergiou, C.N.

    1980-09-01

    The basic considerations upon which the new revised Safety Analysis Report (SAR) for the GRR-1 will be based are presented. The format and the content the SAR will follow are given. A number of credible and less credible accidents is briefly analysed on the basis of present knowledge and experience for similar reactors, as well as the experience gained in the last 10 years of the GRR-1 operation at 5 MW. The accident caused by partial blockage of the cooling flow is considered to be the Maximum Credible Accident (MCA) for the GRR-1. The MCA is analysed and its radiological impact to the environment is estimated using conservative assumptions. (T.A.)

  3. A Taxonomy of Latent Structure Assumptions for Probability Matrix Decomposition Models.

    Science.gov (United States)

    Meulders, Michel; De Boeck, Paul; Van Mechelen, Iven

    2003-01-01

    Proposed a taxonomy of latent structure assumptions for probability matrix decomposition (PMD) that includes the original PMD model and a three-way extension of the multiple classification latent class model. Simulation study results show the usefulness of the taxonomy. (SLD)

  4. Comparison of risk-dominant scenario assumptions for several TRU waste facilities in the DOE complex

    International Nuclear Information System (INIS)

    Foppe, T.L.; Marx, D.R.

    1999-01-01

    In order to gain a risk management perspective, the DOE Rocky Flats Field Office (RFFO) initiated a survey of other DOE sites regarding risks from potential accidents associated with transuranic (TRU) storage and/or processing facilities. Recently-approved authorization basis documents at the Rocky Flats Environmental Technology Site (RFETS) have been based on the DOE Standard 3011 risk assessment methodology with three qualitative estimates of frequency of occurrence and quantitative estimates of radiological consequences to the collocated worker and the public binned into three severity levels. Risk Class 1 and 2 events after application of controls to prevent or mitigate the accident are designated as risk-dominant scenarios. Accident Evaluation Guidelines for selection of Technical Safety Requirements (TSRs) are based on the frequency and consequence bin assignments to identify controls that can be credited to reduce risk to Risk Class 3 or 4, or that are credited for Risk Class 1 and 2 scenarios that cannot be further reduced. This methodology resulted in several risk-dominant scenarios for either the collocated worker or the public that warranted consideration on whether additional controls should be implemented. RFFO requested the survey because of these high estimates of risks that are primarily due to design characteristics of RFETS TRU waste facilities (i.e., Butler-type buildings without a ventilation and filtration system, and a relatively short distance to the Site boundary). Accident analysis methodologies and key assumptions are being compared for the DOE sites responding to the survey. This includes type of accidents that are risk dominant (e.g., drum explosion, material handling breach, fires, natural phenomena, external events, etc.), source term evaluation (e.g., radionuclide material-at-risk, chemical and physical form, damage ratio, airborne release fraction, respirable fraction, leakpath factors), dispersion analysis (e.g., meteorological
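The source-term quantities the survey compares (material-at-risk, damage ratio, airborne release fraction, respirable fraction, leakpath factor) combine multiplicatively in the standard DOE five-factor formula; the numbers below are placeholders for illustration, not data from any facility.

```python
def source_term(mar, dr, arf, rf, lpf):
    """DOE five-factor source-term formula: ST = MAR * DR * ARF * RF * LPF,
    using the quantities named in the survey. Placeholder values only."""
    return mar * dr * arf * rf * lpf

# e.g. 100 g at risk, 25% damaged, ARF 1e-3, RF 0.5, leakpath factor 0.1:
st = source_term(mar=100.0, dr=0.25, arf=1e-3, rf=0.5, lpf=0.1)  # ≈ 1.25e-3 g
```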

  5. Investigating Darcy-scale assumptions by means of a multiphysics algorithm

    Science.gov (United States)

    Tomin, Pavel; Lunati, Ivan

    2016-09-01

    Multiphysics (or hybrid) algorithms, which couple Darcy and pore-scale descriptions of flow through porous media in a single numerical framework, are usually employed to decrease the computational cost of full pore-scale simulations or to increase the accuracy of pure Darcy-scale simulations when a simple macroscopic description breaks down. Despite the massive increase in available computational power, the application of these techniques remains limited to core-size problems and upscaling remains crucial for practical large-scale applications. In this context, the Hybrid Multiscale Finite Volume (HMsFV) method, which constructs the macroscopic (Darcy-scale) problem directly by numerical averaging of pore-scale flow, offers not only a flexible framework to efficiently deal with multiphysics problems, but also a tool to investigate the assumptions used to derive macroscopic models and to better understand the relationship between pore-scale quantities and the corresponding macroscale variables. Indeed, by direct comparison of the multiphysics solution with a reference pore-scale simulation, we can assess the validity of the closure assumptions inherent to the multiphysics algorithm and infer the consequences for macroscopic models at the Darcy scale. We show that the definition of the scale ratio based on the geometric properties of the porous medium is well justified only for single-phase flow, whereas in case of unstable multiphase flow the nonlinear interplay between different forces creates complex fluid patterns characterized by new spatial scales, which emerge dynamically and weaken the scale-separation assumption. In general, the multiphysics solution proves very robust even when the characteristic size of the fluid-distribution patterns is comparable with the observation length, provided that all relevant physical processes affecting the fluid distribution are considered. This suggests that macroscopic constitutive relationships (e.g., the relative

  6. Consequences of Violated Equating Assumptions under the Equivalent Groups Design

    Science.gov (United States)

    Lyren, Per-Erik; Hambleton, Ronald K.

    2011-01-01

    The equal ability distribution assumption associated with the equivalent groups equating design was investigated in the context of a selection test for admission to higher education. The purpose was to assess the consequences for the test-takers in terms of receiving improperly high or low scores compared to their peers, and to find strong…
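Under the equivalent-groups design, one common approach is linear equating: map form-X scores onto the form-Y scale by matching the two groups' means and standard deviations. The sketch below is illustrative, with invented score samples, and is not the operational procedure used for the admission test in the study.

```python
import statistics

def linear_equate(x_scores, y_scores):
    """Linear equating under the equivalent-groups design: map a form-X
    score onto the form-Y scale by matching means and SDs."""
    mx, my = statistics.mean(x_scores), statistics.mean(y_scores)
    sx, sy = statistics.pstdev(x_scores), statistics.pstdev(y_scores)
    return lambda x: my + (sy / sx) * (x - mx)

# Hypothetical score samples from the two randomly equivalent groups:
eq = linear_equate([10, 12, 14, 16, 18], [20, 23, 26, 29, 32])
# eq(14) = 26: the mean of form X maps onto the mean of form Y.
```

If the equal-ability assumption fails, the mean/SD matching attributes group differences to form difficulty, producing exactly the improperly high or low scores the record investigates.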

  7. Consenting to Heteronormativity: Assumptions in Biomedical Research

    NARCIS (Netherlands)

    Cottingham, M.D.; Fisher, J.A.

    2015-01-01

    The process of informed consent is fundamental to basic scientific research with human subjects. As one aspect of the scientific enterprise, clinical drug trials rely on informed consent documents to safeguard the ethical treatment of trial participants. This paper explores the role of

  8. Basic molecular spectroscopy

    CERN Document Server

    Gorry, PA

    1985-01-01

    BASIC Molecular Spectroscopy discusses the utilization of the Beginner's All-purpose Symbolic Instruction Code (BASIC) programming language in molecular spectroscopy. The book is comprised of five chapters that provide an introduction to molecular spectroscopy through programs written in BASIC. The coverage of the text includes rotational spectra, vibrational spectra, and Raman and electronic spectra. The book will be of great use to students who are currently taking a course in molecular spectroscopy.

  9. Controversies in psychotherapy research: epistemic differences in assumptions about human psychology.

    Science.gov (United States)

    Shean, Glenn D

    2013-01-01

    It is the thesis of this paper that differences in philosophical assumptions about the subject matter and treatment methods of psychotherapy have contributed to disagreements about the external validity of empirically supported therapies (ESTs). These differences are evident in the theories that are the basis for both the design and interpretation of recent psychotherapy efficacy studies. The natural science model, as applied to psychotherapy outcome research, transforms the constitutive features of the study subject in a reciprocal manner so that problems, treatments, and indicators of effectiveness are limited to what can be directly observed. Meaning-based approaches to therapy emphasize processes and changes that do not lend themselves to experimental study. Hermeneutic philosophy provides a supplemental model to establishing validity in those instances where outcome indicators do not lend themselves to direct observation and measurement and require "deep" interpretation. Hermeneutics allows for a broadening of psychological study that allows one to establish a form of validity that is applicable when constructs do not refer to things that literally "exist" in nature. From a hermeneutic perspective the changes that occur in meaning-based therapies must be understood and evaluated on the manner in which they are applied to new situations, the logical ordering and harmony of the parts with the theoretical whole, and the capability of convincing experts and patients that the interpretation can stand up against other ways of understanding. Adoption of this approach often is necessary to competently evaluate the effectiveness of meaning-based therapies.

  10. Has the "Equal Environments" assumption been tested in twin studies?

    Science.gov (United States)

    Eaves, Lindon; Foley, Debra; Silberg, Judy

    2003-12-01

    A recurring criticism of the twin method for quantifying genetic and environmental components of human differences is the necessity of the so-called "equal environments assumption" (EEA) (i.e., that monozygotic and dizygotic twins experience equally correlated environments). It has been proposed to test the EEA by stratifying twin correlations by indices of the amount of shared environment. However, relevant environments may also be influenced by genetic differences. We present a model for the role of genetic factors in niche selection by twins that may account for variation in indices of the shared twin environment (e.g., contact between members of twin pairs). Simulations reveal that stratification of twin correlations by amount of contact can yield spurious evidence of large shared environmental effects in some strata and even give false indications of genotype x environment interaction. The stratification approach to testing the equal environments assumption may be misleading and the results of such tests may actually be consistent with a simpler theory of the role of genetic factors in niche selection.
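The classical twin decomposition that rests on the EEA can be sketched with Falconer's formulas, which turn MZ and DZ correlations into additive-genetic (A), shared-environment (C) and residual (E) variance shares; the correlations below are hypothetical.

```python
def falconer_estimates(r_mz, r_dz):
    """ACE decomposition from twin correlations, valid only under the
    equal-environments assumption that the record questions."""
    a2 = 2.0 * (r_mz - r_dz)      # A: twice the MZ-DZ gap
    c2 = 2.0 * r_dz - r_mz        # C: what twins share beyond genes
    e2 = 1.0 - r_mz               # E: what even MZ twins do not share
    return a2, c2, e2

# Hypothetical correlations:
a2, c2, e2 = falconer_estimates(r_mz=0.70, r_dz=0.40)   # ≈ (0.60, 0.10, 0.30)
```

If genetic niche selection inflates MZ contact and hence r_mz, as the simulations in the record suggest, these formulas misattribute variance between A and C.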

  11. Bell violation using entangled photons without the fair-sampling assumption.

    Science.gov (United States)

    Giustina, Marissa; Mech, Alexandra; Ramelow, Sven; Wittmann, Bernhard; Kofler, Johannes; Beyer, Jörn; Lita, Adriana; Calkins, Brice; Gerrits, Thomas; Nam, Sae Woo; Ursin, Rupert; Zeilinger, Anton

    2013-05-09

    The violation of a Bell inequality is an experimental observation that forces the abandonment of a local realistic viewpoint--namely, one in which physical properties are (probabilistically) defined before and independently of measurement, and in which no physical influence can propagate faster than the speed of light. All such experimental violations require additional assumptions depending on their specific construction, making them vulnerable to so-called loopholes. Here we use entangled photons to violate a Bell inequality while closing the fair-sampling loophole, that is, without assuming that the sample of measured photons accurately represents the entire ensemble. To do this, we use the Eberhard form of Bell's inequality, which is not vulnerable to the fair-sampling assumption and which allows a lower collection efficiency than other forms. Technical improvements of the photon source and high-efficiency transition-edge sensors were crucial for achieving a sufficiently high collection efficiency. Our experiment makes the photon the first physical system for which each of the main loopholes has been closed, albeit in different experiments.
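To illustrate how measured correlations violate a Bell inequality, the sketch below uses the more familiar CHSH form rather than the Eberhard form the experiment relied on: quantum singlet correlations E(x, y) = -cos(x - y) reach 2*sqrt(2), beyond the local-realistic bound of 2.

```python
import math

def chsh(a, ap, b, bp):
    """CHSH combination S for singlet correlations E(x, y) = -cos(x - y).
    Local realism bounds |S| <= 2; quantum mechanics reaches 2*sqrt(2)."""
    E = lambda x, y: -math.cos(x - y)
    return abs(E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp))

# Maximally violating measurement angles:
S = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)   # = 2*sqrt(2) ≈ 2.83
```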

  12. Hygroscopical behaviour of basic electrodes in a tropical humid climate

    International Nuclear Information System (INIS)

    Valencia, E.; Galeano, N.J.

    1993-01-01

    The study of the wetting kinetics of basic electrodes in a tropical humid climate is important, since the water contained in the coating is the main source of the atomic hydrogen absorbed by the fused metal during electric arc welding; it is also the origin of multiple defects in the weld metal. A calculation method is established for evaluating the kinetics of moisture uptake by the coating of basic electrodes exposed to a humid tropical climate. The method is based on Fick's diffusion equation for an appropriate system geometry and boundary conditions, and allows evaluation of the effective diffusion coefficient and of the critical exposure times for different environments, as well as for the packing and storage conditions of the electrodes. (Author)
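Crank's plane-sheet solution of Fick's second law gives the kind of uptake curve such a method would fit; the sketch below is a generic illustration with placeholder diffusivity and thickness, not the paper's values.

```python
import math

def uptake_fraction(D, t, half_thickness, terms=50):
    """M_t / M_inf for a plane sheet (Crank's series solution of Fick's
    second law). D and half_thickness here are placeholder values."""
    l2 = half_thickness ** 2
    s = 0.0
    for n in range(terms):
        k = 2 * n + 1
        s += (8.0 / (k * k * math.pi ** 2)
              * math.exp(-D * k * k * math.pi ** 2 * t / (4.0 * l2)))
    return 1.0 - s

early = uptake_fraction(D=1e-11, t=0.0, half_thickness=1e-3)   # ~0 at t = 0
late = uptake_fraction(D=1e-11, t=5e6, half_thickness=1e-3)    # ~1 at saturation
```

Fitting measured uptake fractions against this curve yields the effective diffusion coefficient D and, by inversion, the critical exposure times.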

  13. Autonomic nervous system response patterns specificity to basic emotions.

    Science.gov (United States)

    Collet, C; Vernet-Maury, E; Delhomme, G; Dittmar, A

    1997-01-12

    The aim of this study was to test the assumption that the autonomic nervous system responses to emotional stimuli are specific. A series of six slides was randomly presented to the subjects while six autonomic nervous system (ANS) parameters were recorded: skin conductance, skin potential, skin resistance, skin blood flow, skin temperature and instantaneous respiratory frequency. Each slide induced a basic emotion: happiness, surprise, anger, fear, sadness and disgust. Results were first considered with reference to electrodermal responses (EDR) and secondly through thermo-vascular and respiratory variations. Classical as well as original indices were used to quantify autonomic responses. The six basic emotions were distinguished by Friedman variance analysis. Thus, ANS values corresponding to each emotion were compared two-by-two. EDR distinguished 13 emotion-pairs out of 15. 10 emotion-pairs were separated by skin resistance as well as skin conductance ohmic perturbation duration indices, whereas conductance amplitude was only capable of distinguishing 7 emotion-pairs. Skin potential responses distinguished surprise and fear from sadness, and fear from disgust, according to the analysis of their elementary patterns in form and sign. Two-by-two comparisons of skin temperature, skin blood flow (estimated by the new non-oscillatory duration index) and instantaneous respiratory frequency enabled the distinction of 14 emotion-pairs out of 15. 9 emotion-pairs were distinguished by the non-oscillatory duration index values. Skin temperature responses differed in sign (positive versus negative) between anger and fear. The instantaneous respiratory frequency perturbation duration index was the only one capable of separating sadness from disgust. From the study of the six ANS parameters, distinct autonomic patterns were identified, each characterizing one of the six basic emotions used as inducing signals.
No index alone, nor group of parameters (EDR and thermovascular
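The analysis pattern, a Friedman test across the six within-subject emotion conditions followed by pairwise comparisons, can be sketched on hypothetical data (the values below are invented; only the design mirrors the study).

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Hypothetical within-subject data: 8 subjects x 6 basic emotions,
# one ANS index per condition (values invented for illustration).
rng = np.random.default_rng(2)
levels = np.array([1.0, 1.4, 2.1, 2.6, 3.2, 3.9])    # emotion-specific levels
data = levels + rng.normal(0.0, 0.1, size=(8, 6))    # plus subject noise
stat, p = friedmanchisquare(*(data[:, j] for j in range(6)))
# A small p supports distinct ANS patterns; two-by-two comparisons follow.
```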

  14. Influence of model assumptions about HIV disease progression after initiating or stopping treatment on estimates of infections and deaths averted by scaling up antiretroviral therapy

    Science.gov (United States)

    Sucharitakul, Kanes; Boily, Marie-Claude; Dimitrov, Dobromir

    2018-01-01

    Background Many mathematical models have investigated the population-level impact of expanding antiretroviral therapy (ART), using different assumptions about HIV disease progression on ART and among ART dropouts. We evaluated the influence of these assumptions on model projections of the number of infections and deaths prevented by expanded ART. Methods A new dynamic model of HIV transmission among men who have sex with men (MSM) was developed, which incorporated each of four alternative assumptions about disease progression used in previous models: (A) ART slows disease progression; (B) ART halts disease progression; (C) ART reverses disease progression by increasing CD4 count; (D) ART reverses disease progression, but disease progresses rapidly once treatment is stopped. The model was independently calibrated to HIV prevalence and ART coverage data from the United States under each progression assumption in turn. New HIV infections and HIV-related deaths averted over 10 years were compared for fixed ART coverage increases. Results Little absolute difference in the fraction of new HIV infections averted was predicted across the progression assumptions for the same increase in ART coverage (varied between 33% and 90%), if ART dropouts reinitiated ART at the same rate as ART-naïve MSM. Larger differences in the predicted fraction of HIV-related deaths averted were observed (up to 15pp). However, the differences between assumptions grew if ART dropouts could only reinitiate ART at low CD4 counts. Conclusions Assumptions about disease progression on ART and after ART interruption did not affect the fraction of HIV infections averted with expanded ART, unless ART dropouts only re-initiated ART at low CD4 counts. Different disease progression assumptions had a larger influence on the fraction of HIV-related deaths averted with expanded ART. PMID:29554136

  15. Langmuir probe-based observables for plasma-turbulence code validation and application to the TORPEX basic plasma physics experiment

    International Nuclear Information System (INIS)

    Ricci, Paolo; Theiler, C.; Fasoli, A.; Furno, I.; Labit, B.; Mueller, S. H.; Podesta, M.; Poli, F. M.

    2009-01-01

    The methodology for plasma-turbulence code validation is discussed, with focus on the quantities to use for the simulation-experiment comparison, i.e., the validation observables, and application to the TORPEX basic plasma physics experiment [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)]. The considered validation observables are deduced from Langmuir probe measurements and are ordered into a primacy hierarchy, according to the number of model assumptions and to the combinations of measurements needed to form each of them. The lowest levels of the primacy hierarchy correspond to observables that require the lowest number of model assumptions and measurement combinations, such as the statistical and spectral properties of the ion saturation current time trace, while at the highest levels, quantities such as particle transport are considered. The comparison of the observables at the lowest levels in the hierarchy is more stringent than at the highest levels. Examples of the use of the proposed observables are applied to a specific TORPEX plasma configuration characterized by interchange-driven turbulence.

  16. Sensitivity of Rooftop PV Projections in the SunShot Vision Study to Market Assumptions

    Energy Technology Data Exchange (ETDEWEB)

    Drury, E.; Denholm, P.; Margolis, R.

    2013-01-01

    The SunShot Vision Study explored the potential growth of solar markets if solar prices decreased by about 75% from 2010 to 2020. The SolarDS model was used to simulate rooftop PV demand for this study, based on several PV market assumptions--future electricity rates, customer access to financing, and others--in addition to the SunShot PV price projections. This paper finds that modeled PV demand is highly sensitive to several non-price market assumptions, particularly PV financing parameters.

  17. A method of statistical analysis in the field of sports science when assumptions of parametric tests are not violated

    OpenAIRE

    Sandurska, Elżbieta; Szulc, Aleksandra

    2016-01-01

    Sandurska Elżbieta, Szulc Aleksandra. A method of statistical analysis in the field of sports science when assumptions of parametric tests are not violated. Journal of Education Health and Sport. 2016;6(13):275-287. eISSN 2391-8306. DOI http://dx.doi.org/10.5281/zenodo.293762 http://ojs.ukw.edu.pl/index.php/johs/article/view/4278 The journal has had 7 points in Ministry of Science and Higher Education parametric evaluation. Part B item 754 (09.12.2016). 754 Journal...

  18. Establishing the minimal number of virtual reality simulator training sessions necessary to develop basic laparoscopic skills competence: evaluation of the learning curve

    Directory of Open Access Journals (Sweden)

    Ricardo Jordao Duarte

    2013-09-01

    Full Text Available Introduction Medical literature is scarce on information to define a basic skills training program for laparoscopic surgery (peg and transferring, cutting, clipping). The aim of this study was to determine the minimal number of simulator sessions of basic laparoscopic tasks necessary to elaborate an optimal virtual reality training curriculum. Materials and Methods Eleven medical students with no previous laparoscopic experience were spontaneously enrolled. They were submitted to simulator training sessions starting at level 1 (Immersion Lap VR, San Jose, CA), including sequentially camera handling, peg and transfer, clipping and cutting. Each student trained twice a week until 10 sessions were completed. The score indexes were registered and analyzed. The total of errors of the evaluation sequences (camera, peg and transfer, clipping and cutting) were computed and thereafter, they were correlated to the total of items evaluated in each step, resulting in a success percent ratio for each student for each set of each completed session. Thereafter, we computed the cumulative success rate in 10 sessions, obtaining an analysis of the learning process. By non-linear regression the learning curve was analyzed. Results By the non-linear regression method the learning curve was analyzed and a r2 = 0.73 (p < 0.001) was obtained, being necessary 4.26 (∼five) sessions to reach the plateau of 80% of the estimated acquired knowledge, being that 100% of the students have reached this level of skills. From the fifth session till the 10th, the gain of knowledge was not significant, although some students reached 96% of the expected improvement. Conclusions This study revealed that after five simulator training sequential sessions the students' learning curve reaches a plateau. The forward sessions in the same difficult level do not promote any improvement in laparoscopic basic surgical skills, and the students should be introduced to a more difficult training

  19. Establishing the minimal number of virtual reality simulator training sessions necessary to develop basic laparoscopic skills competence: evaluation of the learning curve.

    Science.gov (United States)

    Duarte, Ricardo Jordão; Cury, José; Oliveira, Luis Carlos Neves; Srougi, Miguel

    2013-01-01

    Medical literature is scarce on information to define a basic skills training program for laparoscopic surgery (peg and transferring, cutting, clipping). The aim of this study was to determine the minimal number of simulator sessions of basic laparoscopic tasks necessary to elaborate an optimal virtual reality training curriculum. Eleven medical students with no previous laparoscopic experience were spontaneously enrolled. They were submitted to simulator training sessions starting at level 1 (Immersion Lap VR, San Jose, CA), including sequentially camera handling, peg and transfer, clipping and cutting. Each student trained twice a week until 10 sessions were completed. The score indexes were registered and analyzed. The total of errors of the evaluation sequences (camera, peg and transfer, clipping and cutting) were computed and thereafter, they were correlated to the total of items evaluated in each step, resulting in a success percent ratio for each student for each set of each completed session. Thereafter, we computed the cumulative success rate in 10 sessions, obtaining an analysis of the learning process. The learning curve was analyzed by the non-linear regression method, and a r2 = 0.73 (p < 0.001) was obtained, being necessary 4.26 (∼five) sessions to reach the plateau of 80% of the estimated acquired knowledge, being that 100% of the students have reached this level of skills. From the fifth session till the 10th, the gain of knowledge was not significant, although some students reached 96% of the expected improvement. This study revealed that after five simulator training sequential sessions the students' learning curve reaches a plateau. The forward sessions in the same difficult level do not promote any improvement in laparoscopic basic surgical skills, and the students should be introduced to a more difficult training task level.
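
    The plateau arithmetic in this record can be reproduced with a standard exponential learning-curve fit. The session scores below are invented for illustration, and `scipy.optimize.curve_fit` merely stands in for whatever non-linear regression routine the authors used; only the functional form (a rise toward a plateau) is taken from the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical cumulative success rates (%) over 10 simulator sessions;
# illustrative values only, not the study's raw data.
sessions = np.arange(1, 11, dtype=float)
success = np.array([28, 46, 58, 67, 73, 77, 80, 82, 83, 84], dtype=float)

def learning_curve(x, plateau, k):
    """Exponential learning curve rising toward a plateau."""
    return plateau * (1.0 - np.exp(-k * x))

(plateau, k), _ = curve_fit(learning_curve, sessions, success, p0=(85.0, 0.5))

# Session at which 80% of the plateau is reached: solve 1 - exp(-k*x) = 0.8
x80 = -np.log(0.2) / k
```

    With curve parameters in this range the 80%-of-plateau point lands between the fourth and fifth session, matching the kind of "∼five sessions" conclusion reported above.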

  20. Estimating the global prevalence of inadequate zinc intake from national food balance sheets: effects of methodological assumptions.

    Directory of Open Access Journals (Sweden)

    K Ryan Wessells

    Full Text Available The prevalence of inadequate zinc intake in a population can be estimated by comparing the zinc content of the food supply with the population's theoretical requirement for zinc. However, assumptions regarding the nutrient composition of foods, zinc requirements, and zinc absorption may affect prevalence estimates. These analyses were conducted to: (1) evaluate the effect of varying methodological assumptions on country-specific estimates of the prevalence of dietary zinc inadequacy and (2) generate a model considered to provide the best estimates. National food balance data were obtained from the Food and Agriculture Organization of the United Nations. Zinc and phytate contents of these foods were estimated from three nutrient composition databases. Zinc absorption was predicted using a mathematical model (Miller equation). Theoretical mean daily per capita physiological and dietary requirements for zinc were calculated using recommendations from the Food and Nutrition Board of the Institute of Medicine and the International Zinc Nutrition Consultative Group. The estimated global prevalence of inadequate zinc intake varied between 12-66%, depending on which methodological assumptions were applied. However, country-specific rank order of the estimated prevalence of inadequate intake was conserved across all models (r = 0.57-0.99, P<0.01). A "best-estimate" model, comprised of zinc and phytate data from a composite nutrient database and IZiNCG physiological requirements for absorbed zinc, estimated the global prevalence of inadequate zinc intake to be 17.3%. Given the multiple sources of uncertainty in this method, caution must be taken in the interpretation of the estimated prevalence figures. However, the results of all models indicate that inadequate zinc intake may be fairly common globally. Inferences regarding the relative likelihood of zinc deficiency as a public health problem in different countries can be drawn based on the country
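
    The core calculation, comparing an intake distribution against an assumed requirement, can be sketched with the EAR cut-point approach under a normality assumption. All numbers below are illustrative, not values from the IZiNCG or IOM recommendations.

```python
from math import erf, sqrt

# EAR cut-point sketch: prevalence of inadequate intake is the share of the
# population's intake distribution falling below the mean requirement.
# Intakes and requirements (mg/day) are invented for illustration.
def prevalence_inadequate(mean_intake, sd_intake, requirement):
    """Fraction of a normal intake distribution below the requirement."""
    z = (requirement - mean_intake) / sd_intake
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))   # normal CDF evaluated at z

# Same food supply, two assumed requirements (e.g. different absorption models)
low = prevalence_inadequate(mean_intake=10.0, sd_intake=2.5, requirement=7.0)
high = prevalence_inadequate(mean_intake=10.0, sd_intake=2.5, requirement=10.5)
```

    Shifting only the assumed requirement moves the estimated prevalence from roughly 12% to roughly 58%, mirroring the wide 12-66% range the study reports across methodological assumptions.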

  1. Sensitivity Analysis and Bounding of Causal Effects with Alternative Identifying Assumptions

    Science.gov (United States)

    Jo, Booil; Vinokur, Amiram D.

    2011-01-01

    When identification of causal effects relies on untestable assumptions regarding nonidentified parameters, sensitivity of causal effect estimates is often questioned. For proper interpretation of causal effect estimates in this situation, deriving bounds on causal parameters or exploring the sensitivity of estimates to scientifically plausible…

  2. Examination of Conservatism in Ground-level Source Release Assumption when Performing Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung-yeop; Lim, Ho-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    One of the assumptions frequently made is that of ground-level source release. The user manual of the consequence analysis software HotSpot states: 'If you cannot estimate or calculate the effective release height, the actual physical release height (height of the stack) or zero for ground-level release should be used. This will usually yield a conservative estimate, (i.e., larger radiation doses for all downwind receptors, etc).' This recommendation is defensible on grounds of conservatism, but a quantitative examination of the effect of this assumption on the results of consequence analysis is necessary. The source terms of the Fukushima Dai-ichi NPP accident have been estimated by several studies using inverse modeling, and one of the biggest sources of difference between their results was the effective source release height assumed by each study. This supports the importance of quantitatively examining the influence of release height. In this study, a sensitivity analysis of the effective release height of radioactive sources was performed and its influence on the total effective dose was quantitatively examined. A difference of more than 20% persists even at longer distances when the dose obtained under the ground-level release assumption is compared with the results assuming other effective plume heights. This means that the influence of the ground-level source assumption on latent cancer fatality estimates cannot be ignored. In addition, the ground-level release assumption fundamentally precludes detailed analysis of plume diffusion from the effective plume height to the ground, even though its influence is relatively lower at longer distances. When the influence of surface roughness is additionally considered, the situation could be more serious. The ground-level dose could be highly over-estimated at short downwind distances at NPP sites with low surface roughness, such as the Barakah site in
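
    The sensitivity to release height can be illustrated with a textbook Gaussian plume model. The Briggs rural dispersion coefficients for neutral stability and all release parameters below are illustrative assumptions, not values from HotSpot or the study.

```python
import numpy as np

def ground_conc(x, H, Q=1.0, u=3.0):
    """Ground-level centerline concentration from a Gaussian plume with total
    reflection; Briggs rural sigmas for neutral (class D) stability."""
    sigma_y = 0.08 * x / np.sqrt(1.0 + 0.0001 * x)
    sigma_z = 0.06 * x / np.sqrt(1.0 + 0.0015 * x)
    return (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H**2 / (2.0 * sigma_z**2))

x = np.array([500.0, 1000.0, 5000.0, 20000.0])   # downwind distances (m)
ratio = ground_conc(x, H=0.0) / ground_conc(x, H=50.0)
# The ground-level release assumption over-predicts strongly near the source,
# and the excess shrinks (but persists) with downwind distance.
```

    With a 50 m effective height the over-prediction is an order of magnitude at 500 m and a few percent at 20 km, consistent with the abstract's point that the bias is largest at short downwind distances but does not vanish.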

  3. Modeling soil CO2 production and transport with dynamic source and diffusion terms: testing the steady-state assumption using DETECT v1.0

    Science.gov (United States)

    Ryan, Edmund M.; Ogle, Kiona; Kropp, Heather; Samuels-Crow, Kimberly E.; Carrillo, Yolima; Pendall, Elise

    2018-05-01

    The flux of CO2 from the soil to the atmosphere (soil respiration, Rsoil) is a major component of the global carbon (C) cycle. Methods to measure and model Rsoil, or partition it into different components, often rely on the assumption that soil CO2 concentrations and fluxes are in steady state, implying that Rsoil is equal to the rate at which CO2 is produced by soil microbial and root respiration. Recent research, however, questions the validity of this assumption. Thus, the aim of this work was two-fold: (1) to describe a non-steady state (NSS) soil CO2 transport and production model, DETECT, and (2) to use this model to evaluate the environmental conditions under which Rsoil and CO2 production are likely in NSS. The backbone of DETECT is a non-homogeneous, partial differential equation (PDE) that describes production and transport of soil CO2, which we solve numerically at fine spatial and temporal resolution (e.g., 0.01 m increments down to 1 m, every 6 h). Production of soil CO2 is simulated for every depth and time increment as the sum of root respiration and microbial decomposition of soil organic matter. Both of these factors can be driven by current and antecedent soil water content and temperature, which can also vary by time and depth. We also analytically solved the ordinary differential equation (ODE) corresponding to the steady-state (SS) solution to the PDE model. We applied the DETECT NSS and SS models to the six-month growing season period representative of a native grassland in Wyoming. Simulation experiments were conducted with both model versions to evaluate factors that could affect departure from SS, such as (1) varying soil texture; (2) shifting the timing or frequency of precipitation; and (3) with and without the environmental antecedent drivers. For a coarse-textured soil, Rsoil from the SS model closely matched that of the NSS model. However, in a fine-textured (clay) soil, growing season Rsoil was ˜ 3 % higher under the assumption of
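
    The contrast between non-steady-state and steady-state behaviour can be sketched with a minimal explicit finite-difference solver for 1-D diffusion with a production term. This is a toy analogue, not the DETECT model itself; the diffusivity, production rate, and boundary conditions are invented for illustration.

```python
import numpy as np

# Minimal 1-D soil CO2 diffusion-production solver (explicit finite differences).
nz, dz = 100, 0.01            # 1 m profile at 0.01 m increments, as in DETECT
dt = 2.0                      # time step (s); D*dt/dz**2 = 0.2 keeps it stable
D = 1e-5                      # effective diffusivity (m2/s), illustrative
S = np.full(nz, 1e-6)         # CO2 production per layer (mol m-3 s-1), uniform
C = np.zeros(nz)              # CO2 concentration above the atmospheric baseline

def step(C):
    Cn = C.copy()
    # interior nodes: dC/dt = D * d2C/dz2 + S
    Cn[1:-1] += dt * (D * (C[2:] - 2*C[1:-1] + C[:-2]) / dz**2 + S[1:-1])
    Cn[0] = 0.0               # surface held at atmospheric concentration
    Cn[-1] = Cn[-2]           # no-flux lower boundary
    return Cn

for _ in range(100000):
    C = step(C)

surface_flux = D * (C[1] - C[0]) / dz    # efflux to the atmosphere (Rsoil)
production = S.sum() * dz                # total column production
# At steady state the two agree; early in a run, or after a disturbance,
# the surface flux lags production, which is the NSS departure of interest.
```

    After enough time steps the surface flux converges to the depth-integrated production (the steady-state identity Rsoil = production); stopping the loop early shows the transient mismatch that the DETECT study investigates.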

  4. A statistical test of the stability assumption inherent in empirical estimates of economic depreciation.

    Science.gov (United States)

    Shriver, K A

    1986-01-01

    Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that a reasonable stability of economic depreciation rates of decline may exist over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.

  5. Basic rocks in Finland

    International Nuclear Information System (INIS)

    Piirainen, T.; Gehoer, S.; Iljina, M.; Kaerki, A.; Paakkola, J.; Vuollo, J.

    1992-10-01

    Basic igneous rocks, containing less than 52% SiO2, constitute an important part of the Finnish Archaean and Proterozoic crust. The Archaean crust contains two units that host the majority of the basic rocks. The Archaean basic rocks are metavolcanics situated in the greenstone belts of eastern Finland, and are divided into two units. The greenstones of the lower unit are tholeiites, komatiites and basaltic komatiites. The upper unit consists of a bimodal series of volcanics, the basic rocks of which are Fe-tholeiites, basaltic komatiites and komatiites. The Proterozoic basic rocks are divided into seven groups according to their ages. Proterozoic igneous activity started with voluminous basic magmatism 2.44 Ga ago; during this stage the layered intrusions and related dykes of northern Finland were formed. The 2.2 Ga old basic rocks are situated at the margins of the Karelian formations. Fe-tholeiitic magmatic activity of 2.1 Ga age is widespread in eastern and northern Finland. The basic rocks of the 1.97 Ga age group occur within the Karelian schist belts as obducted ophiolite complexes, but also as tholeiitic diabase dykes cutting the Karelian schists and the Archaean basement. The intrusions and volcanics of the 1.9 Ga old basic igneous activity are mostly encountered around the Granitoid Complex of Central Finland. Subjotnian, 1.6 Ga old tholeiitic diabases are situated around the rapakivi massifs of southern Finland, and Postjotnian, 1.2 Ga diabases in western Finland, where they form dykes cutting the Svecofennian rocks

  6. Challenging the assumptions for thermal sensation scales

    DEFF Research Database (Denmark)

    Schweiker, Marcel; Fuchs, Xaver; Becker, Susanne

    2016-01-01

    Scales are widely used to assess the personal experience of thermal conditions in built environments. Most commonly, thermal sensation is assessed, mainly to determine whether a particular thermal condition is comfortable for individuals. A seven-point thermal sensation scale has been used...... extensively, which is suitable for describing a one-dimensional relationship between physical parameters of indoor environments and subjective thermal sensation. However, human thermal comfort is not merely a physiological but also a psychological phenomenon. Thus, it should be investigated how scales for its...... assessment could benefit from a multidimensional conceptualization. The common assumptions related to the usage of thermal sensation scales are challenged, empirically supported by two analyses. These analyses show that the relationship between temperature and subjective thermal sensation is non...

  7. Basics and application of PSpice

    International Nuclear Information System (INIS)

    Choi, Pyeong; Cho, Yong Beom; Mok, Hyeong Su; Baek, Dong CHeol

    2006-03-01

    This book comprises nineteen chapters introducing the basics and applications of PSpice. Its contents are: What is PSpice?, introduction to PSpice, PSpice simulation, DC analysis, parametric analysis, transient analysis, parametric analysis and measurements, Monte Carlo analysis, changing device characteristics, ABM applications, the elementary laws of circuits, R.L.C. basic circuits, diode basic DC circuits, transistor and FET basic circuits, OP-Amp basic circuits, digital basic circuits, analog and digital circuit practice, digital circuit application and practice, and ABM circuit application and practice.

  8. Implicit assumptions underlying simple harvest models of marine bird populations can mislead environmental management decisions.

    Science.gov (United States)

    O'Brien, Susan H; Cook, Aonghais S C P; Robinson, Robert A

    2017-10-01

    Assessing the potential impact of additional mortality from anthropogenic causes on animal populations requires detailed demographic information. However, these data are frequently lacking, making simple algorithms, which require little data, appealing. Because of their simplicity, these algorithms often rely on implicit assumptions, some of which may be quite restrictive. Potential Biological Removal (PBR) is a simple harvest model that estimates the number of additional mortalities that a population can theoretically sustain without causing population extinction. However, PBR relies on a number of implicit assumptions, particularly around density dependence and population trajectory that limit its applicability in many situations. Among several uses, it has been widely employed in Europe in Environmental Impact Assessments (EIA), to examine the acceptability of potential effects of offshore wind farms on marine bird populations. As a case study, we use PBR to estimate the number of additional mortalities that a population with characteristics typical of a seabird population can theoretically sustain. We incorporated this level of additional mortality within Leslie matrix models to test assumptions within the PBR algorithm about density dependence and current population trajectory. Our analyses suggest that the PBR algorithm identifies levels of mortality which cause population declines for most population trajectories and forms of population regulation. Consequently, we recommend that practitioners do not use PBR in an EIA context for offshore wind energy developments. Rather than using simple algorithms that rely on potentially invalid implicit assumptions, we recommend use of Leslie matrix models for assessing the impact of additional mortality on a population, enabling the user to explicitly define assumptions and test their importance. Copyright © 2017 Elsevier Ltd. All rights reserved.
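
    The recommendation to use Leslie matrix models can be illustrated with a toy stage-structured projection. The vital rates below are invented values loosely typical of a long-lived seabird, not parameters from the paper, and density independence is assumed throughout.

```python
import numpy as np

# Leslie matrix sketch for a hypothetical seabird with three stage classes:
# juvenile, immature, adult. All rates are illustrative assumptions.
fecundity = 0.3                      # female chicks per adult female per year
s_juv, s_imm, s_ad = 0.60, 0.85, 0.92   # annual stage survival probabilities

def growth_rate(extra_mortality=0.0):
    """Dominant eigenvalue of the Leslie matrix after applying a uniform
    additional mortality rate (e.g. wind-farm collisions) to every stage."""
    m = 1.0 - extra_mortality
    L = np.array([[0.0,       0.0,       fecundity * m],
                  [s_juv * m, 0.0,       0.0          ],
                  [0.0,       s_imm * m, s_ad * m     ]])
    return max(abs(np.linalg.eigvals(L)))

lam0 = growth_rate(0.0)              # baseline population growth rate
lam_harvest = growth_rate(0.05)      # with 5% additional mortality
```

    Here the baseline population grows (dominant eigenvalue above 1), while 10% additional mortality tips it into decline; making such assumptions explicit in the matrix is exactly the advantage the authors claim over the implicit assumptions baked into PBR.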

  9. Quantum information versus black hole physics: deep firewalls from narrow assumptions.

    Science.gov (United States)

    Braunstein, Samuel L; Pirandola, Stefano

    2018-07-13

    The prevalent view that evaporating black holes should simply be smaller black holes has been challenged by the firewall paradox. In particular, this paradox suggests that something different occurs once a black hole has evaporated to one-half its original surface area. Here, we derive variations of the firewall paradox by tracking the thermodynamic entropy within a black hole across its entire lifetime and extend it even to anti-de Sitter space-times. Our approach sweeps away many unnecessary assumptions, allowing us to demonstrate a paradox exists even after its initial onset (when conventional assumptions render earlier analyses invalid). The most natural resolution may be to accept firewalls as a real phenomenon. Further, the vast entropy accumulated implies a deep firewall that goes 'all the way down' in contrast with earlier work describing only a structure at the horizon.This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).

  10. Quantum information versus black hole physics: deep firewalls from narrow assumptions

    Science.gov (United States)

    Braunstein, Samuel L.; Pirandola, Stefano

    2018-07-01

    The prevalent view that evaporating black holes should simply be smaller black holes has been challenged by the firewall paradox. In particular, this paradox suggests that something different occurs once a black hole has evaporated to one-half its original surface area. Here, we derive variations of the firewall paradox by tracking the thermodynamic entropy within a black hole across its entire lifetime and extend it even to anti-de Sitter space-times. Our approach sweeps away many unnecessary assumptions, allowing us to demonstrate a paradox exists even after its initial onset (when conventional assumptions render earlier analyses invalid). The most natural resolution may be to accept firewalls as a real phenomenon. Further, the vast entropy accumulated implies a deep firewall that goes `all the way down' in contrast with earlier work describing only a structure at the horizon. This article is part of a discussion meeting issue `Foundations of quantum mechanics and their impact on contemporary society'.

  11. An introduction to economic analysis in medicine - the basics of methodology and chosen terms. Examples of results of evaluation in nuclear medicine

    International Nuclear Information System (INIS)

    Brockhuis, B.M.; Lass, P.

    2002-01-01

    This article overviews the basic terms and methodology of economic analysis in health care. The most important forms of economic analysis (cost-effectiveness, cost-utility and cost-minimisation analysis) and the aims of their application are presented. Particular emphasis is put on economic evaluation in nuclear medicine, e.g. FDG-PET v. thoracotomy in lung cancer diagnosis, radioiodine therapy v. antithyroid drugs in hyperthyroidism, and technetium-99m-MIBI breast imaging v. biopsy in nonpalpable breast abnormalities. (author)

  12. Basic stress analysis

    CERN Document Server

    Iremonger, M J

    1982-01-01

    BASIC Stress Analysis aims to help students to become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis, because writing a program is analogous to teaching: it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c

  13. Health Insurance Basics

    Science.gov (United States)

    KidsHealth / For Teens / Health Insurance Basics. What Exactly Is Health Insurance? Health insurance is a plan that people buy ...

  14. The extended evolutionary synthesis: its structure, assumptions and predictions

    Science.gov (United States)

    Laland, Kevin N.; Uller, Tobias; Feldman, Marcus W.; Sterelny, Kim; Müller, Gerd B.; Moczek, Armin; Jablonka, Eva; Odling-Smee, John

    2015-01-01

    Scientific activities take place within the structured sets of ideas and assumptions that define a field and its practices. The conceptual framework of evolutionary biology emerged with the Modern Synthesis in the early twentieth century and has since expanded into a highly successful research program to explore the processes of diversification and adaptation. Nonetheless, the ability of that framework satisfactorily to accommodate the rapid advances in developmental biology, genomics and ecology has been questioned. We review some of these arguments, focusing on literatures (evo-devo, developmental plasticity, inclusive inheritance and niche construction) whose implications for evolution can be interpreted in two ways—one that preserves the internal structure of contemporary evolutionary theory and one that points towards an alternative conceptual framework. The latter, which we label the ‘extended evolutionary synthesis' (EES), retains the fundaments of evolutionary theory, but differs in its emphasis on the role of constructive processes in development and evolution, and reciprocal portrayals of causation. In the EES, developmental processes, operating through developmental bias, inclusive inheritance and niche construction, share responsibility for the direction and rate of evolution, the origin of character variation and organism–environment complementarity. We spell out the structure, core assumptions and novel predictions of the EES, and show how it can be deployed to stimulate and advance research in those fields that study or use evolutionary biology. PMID:26246559

  15. Drug policy in sport: hidden assumptions and inherent contradictions.

    Science.gov (United States)

    Smith, Aaron C T; Stewart, Bob

    2008-03-01

    This paper considers the assumptions underpinning the current drugs-in-sport policy arrangements. We examine the assumptions and contradictions inherent in the policy approach, paying particular attention to the evidence that supports different policy arrangements. We find that the current anti-doping policy of the World Anti-Doping Agency (WADA) contains inconsistencies and ambiguities. WADA's policy position is predicated upon four fundamental principles: first, the need for sport to set a good example; secondly, the necessity of ensuring a level playing field; thirdly, the responsibility to protect the health of athletes; and fourthly, the importance of preserving the integrity of sport. A review of the evidence, however, suggests that sport is a problematic institution when it comes to setting a good example for the rest of society. Neither is it clear that sport has an inherent or essential integrity that can only be sustained through regulation. Furthermore, it is doubtful that WADA's anti-doping policy is effective in maintaining a level playing field, or is the best means of protecting the health of athletes. The WADA anti-doping policy is based too heavily on principles of minimising drug use, and gives insufficient weight to the minimisation of drug-related harms. As a result drug-related harms are being poorly managed in sport. We argue that anti-doping policy in sport would benefit from placing greater emphasis on a harm minimisation model.

  16. Bioaccumulation factors and the steady state assumption for cesium isotopes in aquatic foodwebs near nuclear facilities.

    Science.gov (United States)

    Rowan, D J

    2013-07-01

    Steady state approaches, such as transfer coefficients or bioaccumulation factors, are commonly used to model the bioaccumulation of (137)Cs in aquatic foodwebs from routine operations and releases from nuclear generating stations and other nuclear facilities. Routine releases from nuclear generating stations and facilities, however, often consist of pulses as liquid waste is stored, analyzed to ensure regulatory compliance and then released. The effect of repeated pulse releases on the steady state assumption inherent in the bioaccumulation factor approach has not been evaluated. In this study, I examine the steady state assumption for aquatic biota by analyzing data for two cesium isotopes in the same biota, one isotope in steady state (stable (133)Cs) from geologic sources and the other released in pulses ((137)Cs) from reactor operations. I also compare (137)Cs bioaccumulation factors for similar upstream populations from the same system exposed solely to weapon test (137)Cs, and assumed to be in steady state. The steady state assumption appears to be valid for small organisms at lower trophic levels (zooplankton, rainbow smelt and 0+ yellow perch) but not for older and larger fish at higher trophic levels (walleye). Attempts to account for previous exposure and retention through a biokinetics approach had a similar effect on steady state, upstream and non-steady state, downstream populations of walleye, but were ineffective in explaining the more or less constant deviation between fish with steady state exposures and non-steady state exposures of about 2-fold for all age classes of walleye. These results suggest that for large, piscivorous fish, repeated exposure to short duration, pulse releases leads to much higher (137)Cs BAFs than expected from (133)Cs BAFs for the same fish or (137)Cs BAFs for similar populations in the same system not impacted by reactor releases. These results suggest that the steady state approach should be used with caution in any
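
    A minimal first-order biokinetic sketch helps make the steady-state question concrete. Under linear kinetics with slow elimination, a pulsed exposure with the same time-averaged concentration yields the same long-run mean body burden as a constant exposure, so persistent elevations like the roughly 2-fold effect reported for walleye cannot be explained by this simple picture. All rate constants below are invented for illustration.

```python
import numpy as np

# Body burden B follows dB/dt = k_u * C_w - k_e * B; the steady-state
# bioaccumulation factor is k_u / k_e. Rates are illustrative, not measured.
k_u, k_e = 100.0, 0.01        # uptake (L/kg/day) and elimination (1/day)
dt, days = 0.1, 2000
t = np.arange(0, days, dt)

def burden(conc):
    """Forward-Euler integration of the one-compartment uptake model."""
    B = np.zeros_like(conc)
    for i in range(1, len(conc)):
        B[i] = B[i-1] + dt * (k_u * conc[i-1] - k_e * B[i-1])
    return B

const = np.full_like(t, 1.0)                    # constant exposure
pulsed = np.where((t % 30) < 1.0, 30.0, 0.0)    # monthly 1-day pulses, same mean
B_const = burden(const)
B_pulse = burden(pulsed)
# With a 69-day elimination half-life the pulses average out: the long-run
# mean burden matches the constant-exposure steady state of k_u/k_e.
```

    Because the linear model predicts no systematic BAF inflation from pulsing, the elevated BAFs observed downstream of reactor releases point to effects, such as retention and exposure history in large fish, that the steady-state bioaccumulation-factor approach does not capture.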

  17. Adult Basic Skills Instructor Training and Experiential Learning Theory.

    Science.gov (United States)

    Marlowe, Mike; And Others

    1991-01-01

    Competency-based training workshops based on Kolb's experiential learning theory were held for North Carolina adult basic education teachers; 251 attended 1-day sessions and 91 a week-long summer institute. Topics included interpersonal communication, reading, numeracy, language arts, math, assessment, and program evaluation. (SK)

  18. Robustness Analysis of Visual Question Answering Models by Basic Questions

    KAUST Repository

    Huang, Jia-Hong

    2017-11-01

    Visual Question Answering (VQA) models should have both high robustness and accuracy. Unfortunately, most of the current VQA research focuses only on accuracy, because there is a lack of proper methods to measure the robustness of VQA models. There are two main modules in our algorithm. Given a natural language question about an image, the first module takes the question as input and outputs the ranked basic questions, with similarity scores, of the main given question. The second module takes the main question, the image and these basic questions as input and outputs the text-based answer to the main question about the given image. We claim that a robust VQA model is one whose performance is not changed much when related basic questions are also made available to it as input. We formulate the basic question generation problem as a LASSO optimization, and also propose a large-scale Basic Question Dataset (BQD) and Rscore (a novel robustness measure) for analyzing the robustness of VQA models. We hope our BQD will be used as a benchmark to evaluate the robustness of VQA models, so as to help the community build more robust and accurate VQA models.
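
    The LASSO formulation mentioned in this record can be sketched as follows. The question embeddings here are random stand-ins (not the paper's features), and the solver is a generic iterative soft-thresholding routine rather than the authors' implementation; only the idea of ranking basic questions by sparse-regression coefficients is taken from the abstract.

```python
import numpy as np

# Express a main question's embedding as a sparse combination of
# basic-question embeddings, then rank the basics by coefficient magnitude.
rng = np.random.default_rng(0)
B = rng.normal(size=(64, 20))            # 20 basic-question embeddings (dim 64)
w_true = np.zeros(20); w_true[[3, 7]] = [1.0, 0.5]
q = B @ w_true                           # main question built from basics 3 and 7

def lasso_ista(A, y, lam=0.1, iters=2000):
    """Solve min 0.5*||A w - y||^2 + lam*||w||_1 by iterative soft-thresholding."""
    lr = 1.0 / np.linalg.norm(A, 2) ** 2  # step size from the spectral norm
    w = np.zeros(A.shape[1])
    for _ in range(iters):
        w = w - lr * (A.T @ (A @ w - y))                      # gradient step
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0)  # soft threshold
    return w

w = lasso_ista(B, q)
ranking = np.argsort(-np.abs(w))         # basic questions ranked by relevance
```

    In this noiseless toy setting the two basic questions actually used to compose the main question come out on top of the ranking, which is the behaviour the first module of the algorithm relies on.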

  19. Robustness Analysis of Visual Question Answering Models by Basic Questions

    KAUST Repository

    Huang, Jia-Hong

    2017-01-01

    Visual Question Answering (VQA) models should have both high robustness and accuracy. Unfortunately, most of the current VQA research focuses only on accuracy, because there is a lack of proper methods to measure the robustness of VQA models. There are two main modules in our algorithm. Given a natural language question about an image, the first module takes the question as input and outputs the ranked basic questions, with similarity scores, of the main given question. The second module takes the main question, the image and these basic questions as input and outputs the text-based answer to the main question about the given image. We claim that a robust VQA model is one whose performance is not changed much when related basic questions are also made available to it as input. We formulate the basic question generation problem as a LASSO optimization, and also propose a large-scale Basic Question Dataset (BQD) and Rscore (a novel robustness measure) for analyzing the robustness of VQA models. We hope our BQD will be used as a benchmark to evaluate the robustness of VQA models, so as to help the community build more robust and accurate VQA models.

  20. A brief simulation intervention increasing basic science and clinical knowledge

    Directory of Open Access Journals (Sweden)

    Maria L. Sheakley

    2016-04-01

    Full Text Available Background: The United States Medical Licensing Examination (USMLE) is increasing clinical content on the Step 1 exam; thus, inclusion of clinical applications within the basic science curriculum is crucial. Including simulation activities during the basic science years bridges the knowledge gap between basic science content and clinical application. Purpose: To evaluate the effects of a one-off, 1-hour cardiovascular simulation intervention on a summative assessment after adjusting for relevant demographic and academic predictors. Methods: This was a non-randomized study using historical controls to evaluate curricular change. The control group received lecture (nl = 515) and the intervention group received lecture plus a simulation exercise (nl+s = 1,066). Assessment included summative exam questions (n = 4) that were scored as pass/fail (≥75%). USMLE-style assessment questions were identical for both cohorts. Descriptive statistics for variables are presented and odds of passage calculated using logistic regression. Results: Undergraduate grade point ratio, MCAT-BS, MCAT-PS, age, attendance at an academic review program, and gender were significant predictors of summative exam passage. Students receiving the intervention were significantly more likely to pass the summative exam than students receiving lecture only (P = 0.0003). Discussion: Simulation plus lecture increases short-term understanding as tested by a written exam. A longitudinal study is needed to assess the effect of a brief simulation intervention on long-term retention of clinical concepts in a basic science curriculum.
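
    The odds-of-passage calculation described in the Methods can be sketched as follows. The cohort sizes and pass rates are simulated, not the study's data, and this single-predictor model omits the demographic covariates the authors adjusted for.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Simulated cohort: 1 = lecture + simulation, 0 = lecture only.
n = 1500
intervention = rng.integers(0, 2, size=n)

# Hypothetical pass probabilities favouring the intervention group.
p_pass = np.where(intervention == 1, 0.80, 0.65)
passed = (rng.random(n) < p_pass).astype(int)

# Logistic regression of exam passage on the intervention indicator;
# exponentiating the coefficient gives the odds ratio.
model = LogisticRegression().fit(intervention.reshape(-1, 1), passed)
odds_ratio = float(np.exp(model.coef_[0, 0]))
print(f"odds ratio (intervention vs control): {odds_ratio:.2f}")
```

    With the probabilities above, the true odds ratio is (0.80/0.20)/(0.65/0.35) ≈ 2.2; the estimate fluctuates around that value with sampling noise.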

  1. Improving credibility and transparency of conservation impact evaluations through the partial identification approach.

    Science.gov (United States)

    McConnachie, Matthew M; Romero, Claudia; Ferraro, Paul J; van Wilgen, Brian W

    2016-04-01

    The fundamental challenge of evaluating the impact of conservation interventions is that researchers must estimate the difference between the outcome after an intervention occurred and what the outcome would have been without it (counterfactual). Because the counterfactual is unobservable, researchers must make an untestable assumption that some units (e.g., organisms or sites) that were not exposed to the intervention can be used as a surrogate for the counterfactual (control). The conventional approach is to make a point estimate (i.e., single number along with a confidence interval) of impact, using, for example, regression. Point estimates provide powerful conclusions, but in nonexperimental contexts they depend on strong assumptions about the counterfactual that often lack transparency and credibility. An alternative approach, called partial identification (PI), is to first estimate what the counterfactual bounds would be if the weakest possible assumptions were made. Then, one narrows the bounds by using stronger but credible assumptions based on an understanding of why units were selected for the intervention and how they might respond to it. We applied this approach and compared it with conventional approaches by estimating the impact of a conservation program that removed invasive trees in part of the Cape Floristic Region. Even when we used our largest PI impact estimate, the program's control costs were 1.4 times higher than previously estimated. PI holds promise for applications in conservation science because it encourages researchers to better understand and account for treatment selection biases; can offer insights into the plausibility of conventional point-estimate approaches; could reduce the problem of advocacy in science; might be easier for stakeholders to agree on a bounded estimate than a point estimate where impacts are contentious; and requires only basic arithmetic skills. © 2015 Society for Conservation Biology.
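
    A minimal sketch of the weakest-assumption step of partial identification, using Manski-style worst-case bounds for a binary outcome. The data are simulated, and the paper's application layered stronger, context-specific assumptions on top of this step to narrow the bounds.

```python
import numpy as np

def manski_bounds(y, d):
    """Worst-case (no-assumptions) bounds on the average treatment effect
    for binary outcome y and binary treatment d: the unobserved
    counterfactual outcomes are set to their extreme values 0 and 1."""
    p_d = d.mean()
    ey1 = y[d == 1].mean()   # observed outcome among treated
    ey0 = y[d == 0].mean()   # observed outcome among controls
    lower = ey1 * p_d + 0.0 * (1 - p_d) - (ey0 * (1 - p_d) + 1.0 * p_d)
    upper = ey1 * p_d + 1.0 * (1 - p_d) - (ey0 * (1 - p_d) + 0.0 * p_d)
    return lower, upper

rng = np.random.default_rng(2)
d = rng.integers(0, 2, size=1000)
y = (rng.random(1000) < np.where(d == 1, 0.7, 0.5)).astype(int)
lo, hi = manski_bounds(y, d)
print(f"ATE bounds under the weakest assumptions: [{lo:.2f}, {hi:.2f}]")
```

    For a binary outcome these bounds always have width 1, which is why the approach only becomes informative once credible assumptions about treatment selection are added to narrow them.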

  2. Unconditionally Secure and Universally Composable Commitments from Physical Assumptions

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Scafuro, Alessandra

    2013-01-01

    We present a constant-round unconditional black-box compiler that transforms any ideal (i.e., statistically-hiding and statistically-binding) straight-line extractable commitment scheme into an extractable and equivocal commitment scheme, thereby yielding UC-security [9]. We exemplify the u...... of unconditional UC-security with (malicious) PUFs and stateless tokens, our compiler can be instantiated with any ideal straight-line extractable commitment scheme, thus allowing the use of various setup assumptions which may better fit the application or the technology available.

  3. Washington International Renewable Energy Conference (WIREC) 2008 Pledges. Methodology and Assumptions Summary

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, Bill [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bilello, Daniel E. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cowlin, Shannon C. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wise, Alison [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2008-08-01

    This report describes the methodology and assumptions used by NREL in quantifying the potential CO2 reductions resulting from more than 140 governments, international organizations, and private-sector representatives pledging to advance the uptake of renewable energy.

  4. The basic approach to age-structured population dynamics models, methods and numerics

    CERN Document Server

    Iannelli, Mimmo

    2017-01-01

    This book provides an introduction to age-structured population modeling which emphasises the connection between mathematical theory and underlying biological assumptions. Through the rigorous development of the linear theory and the nonlinear theory alongside numerics, the authors explore classical equations that describe the dynamics of certain ecological systems. Modeling aspects are discussed to show how relevant problems in the fields of demography, ecology, and epidemiology can be formulated and treated within the theory. In particular, the book presents extensions of age-structured modelling to the spread of diseases and epidemics while also addressing the issue of regularity of solutions, the asymptotic behaviour of solutions, and numerical approximation. With sections on transmission models, non-autonomous models and global dynamics, this book fills a gap in the literature on theoretical population dynamics. The Basic Approach to Age-Structured Population Dynamics will appeal to graduate students an...

  5. Evaluation of the Functional Pre-Basic-Training English-as-a-Second- Language Course

    Science.gov (United States)

    1985-02-01

    that reported in TRADOC data for BSEP literacy students. TRADOC data, presented in Table 7-6, indicate that only 47.8% of BSEP literacy students...appropriate. (If you are learning enough information about Basic Training, continue with question 17.) (mark only one answer) ~ too many lessons

  6. Halo-independent direct detection analyses without mass assumptions

    International Nuclear Information System (INIS)

    Anderson, Adam J.; Fox, Patrick J.; Kahn, Yonatan; McCullough, Matthew

    2015-01-01

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the m_χ–σ_n plane. Recently, methods which are independent of the DM halo velocity distribution have been developed which present results in the v_min–g-tilde plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from v_min to nuclear recoil momentum (p_R), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call h-tilde(p_R). The entire family of conventional halo-independent g-tilde(v_min) plots for all DM masses can be found directly from the single h-tilde(p_R) plot through a simple rescaling of axes. By considering results in h-tilde(p_R) space, one can determine if two experiments are inconsistent for all masses and all physically possible halos, or for what range of dark matter masses the results are inconsistent for all halos, without the necessity of multiple g-tilde(v_min) plots for different DM masses. We conduct a sample analysis comparing the CDMS II Si events to the null results from LUX, XENON10, and SuperCDMS using our method and discuss how the results can be strengthened by imposing the physically reasonable requirement of a finite halo escape velocity.
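
    For elastic scattering, the change of variables behind this method follows from standard kinematics; a sketch in assumed notation (not taken verbatim from the paper):

```latex
% Nuclear recoil momentum for recoil energy E_R off a nucleus of mass m_N:
p_R = \sqrt{2 m_N E_R} ,
% and the minimum DM velocity able to produce that recoil (elastic case),
% with reduced mass \mu_{\chi N} = m_\chi m_N / (m_\chi + m_N):
v_{\min} = \sqrt{\frac{m_N E_R}{2 \mu_{\chi N}^2}} = \frac{p_R}{2 \mu_{\chi N}} .
```

    So for each trial mass m_χ, a g-tilde(v_min) plot is recovered from the single h-tilde(p_R) plot by rescaling the momentum axis by 1/(2 μ_χN), which is the "simple rescaling of axes" the abstract refers to.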

  7. Basic infrastructure for a nuclear power project

    International Nuclear Information System (INIS)

    2006-06-01

    There are several stages in the process of introducing nuclear power in a country. These include development of nuclear policies and regulations, feasibility studies, public consultations, technology evaluation, requests for proposals and evaluations, contracts and financing, supply, construction, commissioning, operation and finally decommissioning. This publication addresses the 'basic' infrastructure needs, which are adequate until the issue of the construction license. It is obvious that a fully developed nuclear infrastructure will be required for the further implementation stages of a nuclear power reactor. The officials and experts in each country will undertake the transition from a basic infrastructure to a fully developed infrastructure that covers the stages of construction, commissioning, operation and decommissioning. The publication is directed to provide guidance for assessing the basic infrastructure necessary for: - A host country to consider when engaging in the implementation of nuclear power, and - A supplier country to consider when assessing whether the recipient country is in an acceptable condition to begin the implementation of a nuclear power project. The target users are decision makers, advisers and senior managers in the governmental organizations, utilities, industrial organizations and regulatory bodies in the countries adopting nuclear power programmes or exporting supplies for these programmes. The governmental organizations that may find this publication useful include: Ministries of Economy, Energy, Foreign Affairs, Finance, Mining, Internal Affairs, Academic Institutions, Nuclear Energy Agencies and Environmental Agencies. This publication was produced within the IAEA programme directed to increase the capability of Member States to plan and implement nuclear power programmes and to establish and enhance national nuclear infrastructure. 
This publication should be used in conjunction with the IAEA Safety Standards Series and other

  8. Marking and Moderation in the UK: False Assumptions and Wasted Resources

    Science.gov (United States)

    Bloxham, Sue

    2009-01-01

    This article challenges a number of assumptions underlying marking of student work in British universities. It argues that, in developing rigorous moderation procedures, we have created a huge burden for markers which adds little to accuracy and reliability but creates additional work for staff, constrains assessment choices and slows down…

  9. Assumptions of Corporate Social Responsibility as Competitiveness Factor

    Directory of Open Access Journals (Sweden)

    Zaneta Simanaviciene

    2017-09-01

    Full Text Available The purpose of this study was to examine the assumptions of corporate social responsibility (CSR) as a competitiveness factor in an economic downturn. Findings indicate that factors affecting the quality of the micro-economic business environment, i.e., the sophistication of an enterprise's strategy and management processes, the quality of its human capital resources, the increase of product/service demand, the development of related and supporting sectors and the efficiency of natural resources, together with the competitive capacities of the enterprise, impact competitiveness at the micro-level. The outcomes suggest that the implementation of CSR elements, i.e., economic, environmental and social responsibilities, gives good opportunities to increase business competitiveness.

  10. Basic characterization of normal multifocal electroretinogram

    International Nuclear Information System (INIS)

    Fernandez Cherkasova, Lilia; Rojas Rondon, Irene; Castro Perez, Pedro Daniel; Lopez Felipe, Daniel; Santiesteban Freixas, Rosaralis; Mendoza Santiesteban, Carlos E

    2008-01-01

    A scientific literature review was made on the novel multifocal electroretinogram technique, the cell mechanisms involved, and some of the factors modifying its results, together with the form of presentation. The basic characteristics of this electrophysiological record, obtained from several regions of the retina of normal subjects, are important in order to create a small-scale comparative database for evaluating pathological eye tracings. All this will greatly help in early, less invasive electrodiagnosis of localized retinal lesions. (Author)

  11. Do ambient urban odors evoke basic emotions?

    Directory of Open Access Journals (Sweden)

    Sandra Theresia Weber-Glass

    2014-04-01

    Full Text Available Fragrances, such as plant odors, have been shown to evoke autonomic response patterns associated with Ekman's (Ekman et al., 1983) basic emotions happiness, surprise, anger, fear, sadness and disgust. Inducing positive emotions by odors in highly frequented public spaces could serve to improve the quality of life in urban environments. Thus, the present study evaluated the potency of ambient odors connoted with an urban environment to evoke basic emotions on an autonomic and cognitive response level. Synthetic mixtures representing the odors of disinfectant, candles / bees wax, summer air, burnt smell, vomit and musty smell as well as odorless water as a control were presented five times in random order to 30 healthy, non-smoking human subjects with intact sense of smell. Skin temperature, skin conductance, breathing rate, forearm muscle activity, blink rate and heart rate were recorded simultaneously. Subjects rated the odors in terms of pleasantness, intensity and familiarity and gave verbal labels to each odor as well as cognitive associations with the basic emotions. The results showed that the amplitude of the skin conductance response varied as a function of odor presentation. Burnt smell and vomit elicited significantly higher electrodermal responses than summer air. Also, a negative correlation was revealed between the amplitude of the skin conductance response and hedonic odor valence, indicating that the magnitude of the electrodermal response increased with odor unpleasantness. The analysis of the cognitive associations between odors and basic emotions showed that candles / bees wax and summer air were specifically associated with happiness whereas burnt smell and vomit were uniquely associated with disgust. Our findings suggest that city odors may evoke specific cognitive associations of basic emotions and that autonomic activity elicited by such odors is related to odor hedonics.

  12. Who needs the assumption of opportunistic behavior? Transaction cost economics does not!

    DEFF Research Database (Denmark)

    Koch, Carsten Allan

    2000-01-01

    The assumption of opportunistic behavior, familiar from transaction cost economics, has been and remains highly controversial. But opportunistic behavior, albeit undoubtedly an extremely important form of motivation, is not a necessary condition for the contractual problems studied by transaction...

  13. Basic Cake Decorating Workbook.

    Science.gov (United States)

    Bogdany, Mel

    Included in this student workbook for basic cake decorating are the following: (1) Drawings of steps in a basic way to ice a layer cake, how to make a paper cone, various sizes of flower nails, various sizes and types of tin pastry tubes, and special rose tubes; (2) recipes for basic decorating icings (buttercream, rose paste, and royal icing);…

  14. From basic needs to basic rights.

    Science.gov (United States)

    Facio, A

    1995-06-01

    After arriving at an understanding that basic rights refer to all human needs, it is clear that a recognition of the basic needs of female humans must precede the realization of their rights. The old Women in Development (WID) framework only understood women's needs from an androcentric perspective which was limited to practical interests. Instead, women's primary need is to be free from their subordination to men. Such an understanding places all of women's immediate needs in a new light. A human rights approach to development would see women not as beneficiaries but as people entitled to enjoy the benefits of development. Discussion of what equality before the law should mean to women began at the Third World Conference on Women in Nairobi where the issue of violence against women was first linked to development. While debate continues about the distinction between civil and political rights and economic, social, and cultural rights, the realities of women's lives do not permit such a distinction. The concept of the universality of human rights did not become codified until the UN proclaimed the Universal Declaration of Human Rights in 1948. The declaration has been criticized by feminists because the view of human rights it embodies has been too strongly influenced by a liberal Western philosophy which stresses individual rights and because it is ambiguous on the distinction between human rights and the rights of a citizen. The protection of rights afforded by the Declaration, however, should not be viewed as a final achievement but as an ongoing struggle. International conferences have led to an analysis of the human-rights approach to sustainable development which concludes that women continue to face the routine denial of their rights. Each human right must be redefined from the perspective of women's needs, which must also be redefined. 
Women must forego challenging the concept of the universality of human rights in order to overcome the argument of cultural

  15. Education: The Basics. The Basics

    Science.gov (United States)

    Wood, Kay

    2011-01-01

    Everyone knows that education is important, we are confronted daily by discussion of it in the media and by politicians, but how much do we really know about education? "Education: The Basics" is a lively and engaging introduction to education as an academic subject, taking into account both theory and practice. Covering the schooling system, the…

  16. Body Basics Library

    Science.gov (United States)

    ... Body Basics articles explain just how each body system, part, and process works. Use this medical library to find out about basic human anatomy, how ... Teeth; Skin, Hair, and Nails; Spleen and Lymphatic System ...

  17. Tale of Two Courthouses: A Critique of the Underlying Assumptions in Chronic Disease Self-Management for Aboriginal People

    Directory of Open Access Journals (Sweden)

    Isabelle Ellis

    2009-12-01

    Full Text Available This article reviews the assumptions that underpin the commonly implemented Chronic Disease Self-Management models: namely, that there is a clear set of instructions for patients to comply with, that all health care providers agree with; and that the health care provider and the patient agree with the chronic disease self-management plan that was developed as part of a consultation. These assumptions are evaluated for their validity in the remote health care context, particularly for Aboriginal people. These assumptions have been found to lack validity in this context, therefore an alternative model to enhance chronic disease care is proposed.

  18. Coping with poverty in international assistance policy: an evaluation of spatially integrated investment strategies. [World Bank, USAID, and UN

    Energy Technology Data Exchange (ETDEWEB)

    Rondinelli, D A [Syracuse Univ., NY; Ruddle, K

    1978-04-01

    International assistance agencies have turned increasingly to integrated rural development policies in an attempt to ameliorate the inequitable distribution of economic growth plaguing Third World nations since World War II. This paper reviews the functionally and spatially integrated investment strategies of the World Bank, US Agency for International Development, and the United Nations, outlines their objectives, perceptions of the problem, basic assumptions and programs, and evaluates them in terms of potential difficulties for implementation. Those factors crucial to making integrated development policies operational--knowledge of human ecosystems in rural areas, analytical ability, operational procedures, arrangements for local participation, subsistence systems indicators and administrative capacity of local and national governments--are discussed and assessed.

  19. A biomechanical testing system to determine micromotion between hip implant and femur accounting for deformation of the hip implant: Assessment of the influence of rigid body assumptions on micromotions measurements.

    Science.gov (United States)

    Leuridan, Steven; Goossens, Quentin; Roosen, Jorg; Pastrav, Leonard; Denis, Kathleen; Mulier, Michiel; Desmet, Wim; Vander Sloten, Jos

    2017-02-01

    Accurate pre-clinical evaluation of the initial stability of new cementless hip stems using in vitro micromotion measurements is an important step in the design process to assess a new stem's potential. Several measuring systems, linear variable displacement transducer (LVDT)-based and others, require assuming the bone or implant to be rigid in order to obtain micromotion values or to calculate derived quantities such as relative implant tilting. An alternative LVDT-based measuring system not requiring a rigid body assumption was developed in this study. The system combined the advantages of local unidirectional and frame-and-bracket micromotion measuring concepts. The influence of, and possible errors introduced by, adopting a rigid body assumption were quantified. Furthermore, as the system allowed emulating local unidirectional and frame-and-bracket systems, the influence of adopting rigid body assumptions was also analyzed for both concepts. Synthetic and embalmed bone models were tested in combination with primary and revision implants. Single-legged stance phase loading was applied to the implant–bone constructs. Adopting a rigid body assumption resulted in an overestimation of mediolateral micromotion of up to 49.7 μm at more distal measuring locations. Maximal average relative rotational motion was overestimated by 0.12° around the anteroposterior axis. Frontal and sagittal tilting calculations based on a unidirectional measuring concept underestimated the true tilting by an order of magnitude. Non-rigid behavior is a factor that should not be dismissed in micromotion stability evaluations of primary and revision femoral implants. Copyright © 2017 Elsevier Ltd. All rights reserved.
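
    The tilting calculation referred to above reduces to simple trigonometry once two displacement readings and their spacing are known; a minimal sketch with hypothetical sensor values, which itself relies on exactly the rigid-body assumption between measuring points that the study examines:

```python
import math

def relative_tilt_deg(d_proximal_um, d_distal_um, spacing_mm):
    """Tilt angle implied by two displacement readings (in micrometres)
    taken spacing_mm apart along the implant, assuming rigid-body motion
    between the two measuring points."""
    delta_mm = (d_proximal_um - d_distal_um) / 1000.0
    return math.degrees(math.atan2(delta_mm, spacing_mm))

# Hypothetical readings: +30 um proximally, -10 um distally, 80 mm apart.
tilt = relative_tilt_deg(30.0, -10.0, 80.0)
print(f"relative tilt: {tilt:.4f} deg")
```

    If the implant deforms between the two points, the straight-line interpolation above no longer holds, which is the order-of-magnitude error source the abstract quantifies.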

  20. Examination of the Film "My Father and My Son" According to the Basic Concepts of Multigenerational Family Therapy

    Science.gov (United States)

    Acar, Tulin; Voltan-Acar, Nilufer

    2013-01-01

    The aim of this study was to evaluate the basic concepts of multigenerational Family Therapy and to evaluate the scenes of the film ''My Father and My Son'' according to these concepts. For these purposes firstly basic concepts of Multigenerational Family Therapy such as differentiation of self, triangles/triangulation, nuclear family emotional…

  1. Effects of Game Design Patterns on Basic Life Support Training Content

    Science.gov (United States)

    Kelle, Sebastian; Klemke, Roland; Specht, Marcus

    2013-01-01

    Based on a previous analysis of game design patterns and related effects in an educational scenario, the following paper presents an experimental study. In the study a course for Basic Life Support training has been evaluated and two game design patterns have been applied to the course. The hypotheses evaluated in this paper relate to game design…

  2. Large-scale analyses of synonymous substitution rates can be sensitive to assumptions about the process of mutation.

    Science.gov (United States)

    Aris-Brosou, Stéphane; Bielawski, Joseph P

    2006-08-15

    A popular approach to examine the roles of mutation and selection in the evolution of genomes has been to consider the relationship between codon bias and synonymous rates of molecular evolution. A significant relationship between these two quantities is taken to indicate the action of weak selection on substitutions among synonymous codons. The neutral theory predicts that the rate of evolution is inversely related to the level of functional constraint. Therefore, selection against the use of non-preferred codons among those coding for the same amino acid should result in lower rates of synonymous substitution as compared with sites not subject to such selection pressures. However, reliably measuring the extent of such a relationship is problematic, as estimates of synonymous rates are sensitive to our assumptions about the process of molecular evolution. Previous studies showed the importance of accounting for unequal codon frequencies, in particular when synonymous codon usage is highly biased. Yet, unequal codon frequencies can be modeled in different ways, making different assumptions about the mutation process. Here we conduct a simulation study to evaluate two different ways of modeling uneven codon frequencies and show that both model parameterizations can have a dramatic impact on rate estimates and affect biological conclusions about genome evolution. We reanalyze three large data sets to demonstrate the relevance of our results to empirical data analysis.
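
    The two common parameterizations of unequal codon frequencies alluded to here are often called F1x4 and F3x4; a schematic comparison with illustrative (not empirical) nucleotide frequencies:

```python
from itertools import product

import numpy as np

STOPS = {"TAA", "TAG", "TGA"}
CODONS = [a + b + c for a, b, c in product("TCAG", repeat=3)
          if a + b + c not in STOPS]   # 61 sense codons

def f1x4(pi):
    """F1x4: one set of nucleotide frequencies shared by all three positions."""
    freqs = np.array([pi[x] * pi[y] * pi[z] for x, y, z in CODONS])
    return freqs / freqs.sum()   # renormalize after dropping stop codons

def f3x4(pi_by_pos):
    """F3x4: separate nucleotide frequencies for each codon position."""
    freqs = np.array([pi_by_pos[0][x] * pi_by_pos[1][y] * pi_by_pos[2][z]
                      for x, y, z in CODONS])
    return freqs / freqs.sum()

pi = {"T": 0.15, "C": 0.35, "A": 0.20, "G": 0.30}              # illustrative
pi_by_pos = [pi,
             {"T": 0.25, "C": 0.25, "A": 0.25, "G": 0.25},
             {"T": 0.10, "C": 0.40, "A": 0.10, "G": 0.40}]

print(len(CODONS), f1x4(pi)[:3], f3x4(pi_by_pos)[:3])
```

    Because F3x4 lets composition vary by codon position, it changes the stationary codon frequencies of the substitution model, and with them the estimated synonymous rates, which is the sensitivity the abstract describes.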

  3. Stable isotopes and elasmobranchs: tissue types, methods, applications and assumptions.

    Science.gov (United States)

    Hussey, N E; MacNeil, M A; Olin, J A; McMeans, B C; Kinney, M J; Chapman, D D; Fisk, A T

    2012-04-01

    Stable-isotope analysis (SIA) can act as a powerful ecological tracer with which to examine diet, trophic position and movement, as well as more complex questions pertaining to community dynamics and feeding strategies or behaviour among aquatic organisms. With major advances in the understanding of the methodological approaches and assumptions of SIA through dedicated experimental work in the broader literature coupled with the inherent difficulty of studying typically large, highly mobile marine predators, SIA is increasingly being used to investigate the ecology of elasmobranchs (sharks, skates and rays). Here, the current state of SIA in elasmobranchs is reviewed, focusing on available tissues for analysis, methodological issues relating to the effects of lipid extraction and urea, the experimental dynamics of isotopic incorporation, diet-tissue discrimination factors, estimating trophic position, diet and mixing models and individual specialization and niche-width analyses. These areas are discussed in terms of assumptions made when applying SIA to the study of elasmobranch ecology and the requirement that investigators standardize analytical approaches. Recommendations are made for future SIA experimental work that would improve understanding of stable-isotope dynamics and advance their application in the study of sharks, skates and rays. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  4. Basic electronics

    CERN Document Server

    Holbrook, Harold D

    1971-01-01

    Basic Electronics is an elementary text designed for basic instruction in electricity and electronics. It places emphasis on electronic emission and the vacuum tube, and shows transistor circuits in parallel with electron tube circuits. This book also demonstrates how the transistor merely replaces the tube, with proper change of circuit constants as required. Many problems are presented at the end of each chapter. This book is comprised of 17 chapters and opens with an overview of electron theory, followed by a discussion on resistance, inductance, and capacitance, along with their effects on t

  5. SOME CONCEPTIONS AND MISCONCEPTIONS ON REALITY AND ASSUMPTIONS IN FINANCIAL ACCOUNTING

    OpenAIRE

    Stanley C. W. Salvary

    2005-01-01

    This paper addresses two problematic issues arising from the importation of terms into financial accounting: (1) the nature of economic reality; and (2) the role of assumptions. These two issues have stirred a lot of controversy relating to financial accounting measurements and affect attestation reports. This paper attempts to provide conceptual clarity on these two issues.

  6. Evaluation of socio-economic effects of R and D results at Japan Atomic Energy Research Institute. 2. Socio-economic evaluation of the basic research at JAERI

    International Nuclear Information System (INIS)

    2003-11-01

    The Japan Atomic Energy Research Institute (JAERI), as a core organization devoted to comprehensive nuclear energy research, has steadily promoted various types of research and development (R and D) studies since its establishment in June 1956. Research activities are aimed at performing (1) R and D for nuclear energy, (2) the utilization and application of radiation-based technologies, and (3) the establishment of basic and fundamental research in the nuclear field. Last year, the socio-economic effects of items (1) and (2) were qualitatively and quantitatively evaluated. The quantitative evaluation of item (3) from the viewpoint of a socio-economic effect, however, calls for a different concept and methodology than the previously used cost-benefit approach. Achievements obtained from the activities conducted over the last 10 years implied that basic research funded by the public could contribute to (1) the increase in useful intellectual stocks, (2) the training of highly skilled college graduates, (3) the construction of new scientific facilities and creation of methodologies, (4) the stimulation and promotion of social interrelations by networking, (5) the increase in one's ability to solve scientific problems, and (6) the establishment of venture companies. In this study, we focused on item (4) for the analysis because it is assumed that the external economic effect is linked with the socio-economic effects accompanying network formation. As criteria of socio-economic effects, we assume that the external effect becomes significant in proportion to the width of networking and/or the magnitude of cooperation, measured by the number of co-authored studies between JAERI and other research bodies, namely private and governmental sectors and universities. Taking these criteria into consideration, the following four items were prepared for quantitative study. They are (1) to clarify the basic research fields where JAERI has established a significant effort to

  7. What is the proper evaluation method: Some basic considerations

    International Nuclear Information System (INIS)

    Leeb, Helmut; Schnabel, Georg; Srdinko, Thomas

    2014-01-01

    Recent developments and applications demand an extension of the energy range and the inclusion of reliable uncertainty information in nuclear data libraries. Due to the scarcity of neutron-induced reaction data beyond 20 MeV, the extension of the energy range up to at least 150 MeV is not trivial, because the corresponding nuclear data evaluations depend heavily on nuclear models and proper evaluation methods are still under discussion. Restricting ourselves to evaluation techniques based on Bayesian statistics, we consider the influence of the a priori knowledge on the final result of the evaluation. The study clearly indicates the need to account properly for the deficiencies of the nuclear model. Concerning the covariance matrices, it is argued that they depend not only on the model but also on the method of generation, and an additional consent is required for the comparison of different evaluations of the same data sets. (authors)

  8. Questioning the "big assumptions". Part I: addressing personal contradictions that impede professional development.

    Science.gov (United States)

    Bowe, Constance M; Lahey, Lisa; Armstrong, Elizabeth; Kegan, Robert

    2003-08-01

    The ultimate success of recent medical curriculum reforms is, in large part, dependent upon the faculty's ability to adopt and sustain new attitudes and behaviors. However, like many New Year's resolutions, sincere intent to change may be short-lived and followed by a discouraging return to old behaviors. Failure to sustain the initial resolve to change can be misinterpreted as a lack of commitment to one's original goals and eventually lead to greater effort expended in rationalizing the status quo rather than changing it. The present article outlines how a transformative process that has proven to be effective in managing personal change, Questioning the Big Assumptions, was successfully used in an international faculty development program for medical educators to enhance individual personal satisfaction and professional effectiveness. This process systematically encouraged participants to explore and proactively address currently operative mechanisms that could stall their attempts to change at the professional level. The applications of the Big Assumptions process in faculty development helped individuals to recognize and subsequently utilize unchallenged and deep-rooted personal beliefs to overcome unconscious resistance to change. This approach systematically led participants away from circular griping about what was not right in their current situation to identifying the actions that they needed to take to realize their individual goals. By thoughtful testing of personal Big Assumptions, participants designed behavioral changes that could be broadly supported and, most importantly, sustained.

  9. Basic physical phenomena, neutron production and scaling of the dense plasma focus

    International Nuclear Information System (INIS)

    Kaeppeler, H.J.

    This paper presents an attempt at establishing a model theory for the dense plasma focus in order to present a consistent interpretation of the basic physical phenomena leading to neutron production from both acceleration and thermal processes. To achieve this, the temporal history of the focus is divided into the compression of the plasma sheath, a quiescent and very dense phase with ensuing expansion, and an unstable phase in which the focus plasma is disrupted by instabilities. Finally, the decay of the density, velocity and thermal fields is considered. Under the assumptions that I_0^2/(σ_0 R_0^2) = const and t_0/T_c = const, scaling laws for plasma focus devices are derived. It is shown that while the neutron yield generally scales with the fourth power of the maximum current, neutron production from thermal processes becomes increasingly important for large devices, while in small devices neutron production from acceleration processes is by far predominant. (orig.) [de

  10. Basic Research Firing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Basic Research Firing Facility is an indoor ballistic test facility that has recently transitioned from a customer-based facility to a dedicated basic research...

  11. The value of basic research insights into atrial fibrillation mechanisms as a guide to therapeutic innovation: a critical analysis.

    Science.gov (United States)

    Heijman, Jordi; Algalarrondo, Vincent; Voigt, Niels; Melka, Jonathan; Wehrens, Xander H T; Dobrev, Dobromir; Nattel, Stanley

    2016-04-01

    Atrial fibrillation (AF) is an extremely common clinical problem associated with increased morbidity and mortality. Current antiarrhythmic options include pharmacological, ablation, and surgical therapies, and have significantly improved clinical outcomes. However, their efficacy remains suboptimal, and their use is limited by a variety of potentially serious adverse effects. There is a clear need for improved therapeutic options. Several decades of research have substantially expanded our understanding of the basic mechanisms of AF. Ectopic firing and re-entrant activity have been identified as the predominant mechanisms for arrhythmia initiation and maintenance. However, it has become clear that the clinical factors predisposing to AF and the cellular and molecular mechanisms involved are extremely complex. Moreover, all AF-promoting and maintaining mechanisms are dynamically regulated and subject to remodelling caused by both AF and cardiovascular disease. Accordingly, the initial presentation and clinical progression of AF patients are enormously heterogeneous. An understanding of arrhythmia mechanisms is widely assumed to be the basis of therapeutic innovation, but while this assumption seems self-evident, we are not aware of any papers that have critically examined the practical contributions of basic research into AF mechanisms to arrhythmia management. Here, we review recent insights into the basic mechanisms of AF, critically analyse the role of basic research insights in the development of presently used anti-AF therapeutic options and assess the potential value of contemporary experimental discoveries for future therapeutic innovation. Finally, we highlight some of the important challenges to the translation of basic science findings to clinical application. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.

  12. Climate Change: Implications for the Assumptions, Goals and Methods of Urban Environmental Planning

    Directory of Open Access Journals (Sweden)

    Kristina Hill

    2016-12-01

    Full Text Available As a result of increasing awareness of the implications of global climate change, shifts are becoming necessary and apparent in the assumptions, concepts, goals and methods of urban environmental planning. This review will present the argument that these changes represent a genuine paradigm shift in urban environmental planning. Reflection and action to develop this paradigm shift is critical now and in the next decades, because environmental planning for cities will only become more urgent as we enter a new climate period. The concepts, methods and assumptions that urban environmental planners have relied on in previous decades to protect people, ecosystems and physical structures are inadequate if they do not explicitly account for a rapidly changing regional climate context, specifically from a hydrological and ecological perspective. The over-arching concept of spatial suitability that guided planning in most of the 20th century has already given way to concepts that address sustainability, recognizing the importance of temporality. Quite rapidly, the concept of sustainability has been replaced in many planning contexts by the priority of establishing resilience in the face of extreme disturbance events. Now even this concept of resilience is being incorporated into a novel concept of urban planning as a process of adaptation to permanent, incremental environmental changes. This adaptation concept recognizes the necessity for continued resilience to extreme events, while acknowledging that permanent changes are also occurring as a result of trends that have a clear direction over time, such as rising sea levels. Similarly, the methods of urban environmental planning have relied on statistical data about hydrological and ecological systems that will not adequately describe these systems under a new climate regime. These methods are beginning to be replaced by methods that make use of early warning systems for regime shifts, and process

  13. Human Praxis: A New Basic Assumption for Art Educators of the Future.

    Science.gov (United States)

    Hodder, Geoffrey S.

    1980-01-01

    After analyzing Vincent Lanier's five characteristic roles of art education, the article briefly explains the pedagogy of Paulo Freire, based on human praxis, and applies it to the existing "oppressive" art education system. The article reduces Lanier's roles to resemble a single Freirean model. (SB)

  14. Scenario Analysis In The Calculation Of Investment Efficiency–The Problem Of Formulating Assumptions

    Directory of Open Access Journals (Sweden)

    Dittmann Iwona

    2015-09-01

    Full Text Available This article concerns the problem of formulating assumptions in scenario analysis for investments which consist of renting out an apartment. It attempts to indicate foundations for the formulation of assumptions on the basis of observed retrospective regularities. It includes theoretical considerations regarding scenario design, as well as the results of studies on the past behaviour of the quantities which determined, or could help estimate, the values of the individual explanatory variables for a chosen measure of investment profitability (MIRRFCFE). The dynamics of, and correlation between, the variables were studied. The research was based on quarterly data from local residential real estate markets in Poland (in the six largest cities) in the years 2006–2014, as well as on data from the financial market.

  15. Common-Sense Chemistry: The Use of Assumptions and Heuristics in Problem Solving

    Science.gov (United States)

    Maeyer, Jenine Rachel

    2013-01-01

    Students experience difficulty learning and understanding chemistry at higher levels, often because of cognitive biases stemming from common sense reasoning constraints. These constraints can be divided into two categories: assumptions (beliefs held about the world around us) and heuristics (the reasoning strategies or rules used to build…

  16. Basic and applied aspects of female reproduction in farmed ostriches

    NARCIS (Netherlands)

    Bronneberg, R.G.G.

    2008-01-01

    This thesis investigated basic and applied aspects of female reproduction in farmed ostriches throughout the 48-h egg laying cycle, during the egg production season and during the non-breeding season. The main objectives were: (1) to evaluate the use of transcutaneous ultrasound scanning to visualize

  17. Economic Analysis of Cyber Security

    Science.gov (United States)

    2006-07-01

    calculate the estimates are proprietary; therefore, no one outside the company can evaluate the assumptions and methodologies. Furthermore, Mi2g has a...and Miller proposed using a more mathematical approach. The theories they proposed were based on the assumption that markets operate efficiently, so...Other researchers (Ross, 1978; Ryan, 1982) modified this basic idea and created the capital asset pricing model (CAPM), in which investments are made

  18. IRT models with relaxed assumptions in eRm: A manual-like instruction

    Directory of Open Access Journals (Sweden)

    REINHOLD HATZINGER

    2009-03-01

    Full Text Available Linear logistic models with relaxed assumptions (LLRA), as introduced by Fischer (1974), are a flexible tool for the measurement of change for dichotomous or polytomous responses. As opposed to the Rasch model, assumptions on the dimensionality of items, their mutual dependencies and the distribution of the latent trait in the population of subjects are relaxed. Conditional maximum likelihood estimation allows for inference about treatment, covariate or trend effect parameters without taking the subjects' latent trait values into account. In this paper we show how LLRAs based on the LLTM, LRSM and LPCM can be used to answer various questions about the measurement of change and how they can be fitted in R using the eRm package. A number of small didactic examples are provided that can easily be used as templates for real data sets. All data files used in this paper are available from http://eRm.R-Forge.R-project.org/

  19. Wartime Paris, cirrhosis mortality, and the ceteris paribus assumption.

    Science.gov (United States)

    Fillmore, Kaye Middleton; Roizen, Ron; Farrell, Michael; Kerr, William; Lemmens, Paul

    2002-07-01

    This article critiques the ceteris paribus assumption, which tacitly sustains the epidemiologic literature's inference that the sharp decline in cirrhosis mortality observed in Paris during the Second World War derived from a sharp constriction in wine consumption. Paris's wartime circumstances deviate substantially from the "all else being equal" assumption, and at least three other hypotheses for the cirrhosis decline may be contemplated. Historical and statistical review. Wartime Paris underwent tumultuous changes. Wine consumption did decline, but there were, as well, a myriad of other changes in diet and life experience, many involving new or heightened hardships, nutritional, experiential, institutional, health and mortality risks. Three competing hypotheses are presented: (1) A fraction of the candidates for cirrhosis mortality may have fallen to more sudden forms of death; (2) alcoholics, heavy drinkers and Paris's clochard subpopulation may have been differentially likely to become removed from the city's wartime population, whether by self-initiated departure, arrest and deportation, or death from other causes, even murder; and (3) there was mismeasurement in the cirrhosis mortality decline. The alcohol-cirrhosis connection provided the template for the alcohol research effort (now more than 20 years old) aimed at re-establishing scientific recognition of alcohol's direct alcohol-problems-generating associations and causal responsibilities. In a time given to reports of weaker associations of the alcohol-cirrhosis connection, the place and importance of the Paris curve in the wider literature, as regards that connection, remains. For this reason, the Paris findings should be subjected to as much research scrutiny as they undoubtedly deserve.

  20. Research and development of the industrial basic technologies of the next generation, 'composite materials (fine ceramics)'. Evaluation of the first phase research and development; Jisedai sangyo kiban gijutsu kenkyu kaihatsu 'fine ceramics'. Daiikki kenkyu kaihatsu hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1984-03-30

    The results of the first phase research and development project for developing fine ceramics as basic technologies of the next generation are evaluated. The R and D themes were selected to develop fine ceramics of high strength, corrosion resistance, precision and wear resistance, noting their excellent characteristics. Development of the basic techniques for these materials is of high significance and is highly rated. The efforts in the first-phase R and D project are aimed at the development of silicon nitride and silicon carbide for synthesis of the stock materials; explosive forming/treating of the stock powders; forming, sintering and processing/joining; evaluation of the characteristics; non-destructive testing methods; designs; and evaluation of the parts, among others, as the elementary techniques for production, evaluation and application of the fine ceramic materials. The technical targets for improved functions have been achieved, or bright prospects for achieving them obtained, in the development of the techniques for synthesis of the stock materials, forming/sintering and processing/joining. The silica reduction for stock synthesis, the basic techniques for molding/sintering, and the rheological considerations for the molding/sintering techniques represent techniques of the next generation, because they break through the limitations of the conventional techniques. (NEDO)

  1. [Kraepelin's basic nosologic postulates. An attempt at a critical evaluation of the later works of Kraepelin].

    Science.gov (United States)

    Hoff, P

    1988-01-01

    This study discusses three important papers by Emil Kraepelin, published between 1918 and 1920. Kraepelin supports, in accordance with his teacher Wilhelm Wundt, the view of psychophysical parallelism as a basic principle for dealing with the questions of mental illness. Kraepelin is often called a nosologist, but one must not forget that Kraepelin's nosology was not a static one, nor did he favor any kind of dogmatism. Only when Kraepelin's basic positions are reflected on in a differentiated way can his enormous influence on very different parts of psychiatry as a science be understood.

  2. THE METHOD OF MULTIPLE SPATIAL PLANNING BASIC MAP

    OpenAIRE

    Zhang, C.; Fang, C.

    2018-01-01

    The “Provincial Space Plan Pilot Program” issued in December 2016 called for integrating the existing space management and control information platforms of the various departments and establishing a spatial planning information management platform to integrate basic data, target indicators, spatial coordinates, and technical specifications. The planning and preparation will provide supportive decision support, digital monitoring and evaluation of the implementation of the p...

  3. Graduate Education in Psychology: A Comment on Rogers' Passionate Statement

    Science.gov (United States)

    Brown, Robert C., Jr.; Tedeschi, James T.

    1972-01-01

    The authors hope that this critical evaluation can place Carl Rogers' assumptions into perspective; they propose a compromise program meant to satisfy the basic aims of a humanistic psychology program. For Rogers' rejoinder see AA 512 869. (MB)

  4. Basic radiation oncology

    International Nuclear Information System (INIS)

    Beyzadeoglu, M. M.; Ebruli, C.

    2008-01-01

    Basic Radiation Oncology is an all-in-one book: an up-to-date, bedside-oriented text integrating radiation physics, radiobiology and clinical radiation oncology. It includes the essentials of all aspects of radiation oncology, with more than 300 practical illustrations in black-and-white and color. The layout and presentation are very practical and enriched with many pearl boxes. Key studies, particularly randomized ones, are included at the end of each clinical chapter. Basic knowledge of high-tech radiation teletherapy units such as tomotherapy, cyberknife, and proton therapy is also given. The first two sections review concepts that are crucial in radiation physics and radiobiology. The remaining 11 chapters describe treatment regimens for the main cancer sites and tumor types. Basic Radiation Oncology will greatly help meet the need for a practical, bedside-oriented oncology book for residents, fellows, and clinicians of Radiation, Medical and Surgical Oncology, as well as medical students, physicians and medical physicists interested in Clinical Oncology. The book Temel Radyasyon Onkolojisi is being published in English by Springer Heidelberg this year, with updated 2009 AJCC staging, as Basic Radiation Oncology.

  5. The evaluation of first aid and basic life support training for the first year university students.

    Science.gov (United States)

    Altintaş, Kerim Hakan; Aslan, Dilek; Yildiz, Ali Naci; Subaşi, Nüket; Elçin, Melih; Odabaşi, Orhan; Bilir, Nazmi; Sayek, Iskender

    2005-02-01

    In Turkey, first aiders are few in number, yet they are needed in many settings, such as earthquakes. It was thought that training first year university students in first aid and basic life support (FA-BLS) techniques would serve to increase the number of first aiders. It was also thought that another problem, the lack of first aid trainers, might be addressed by training medical students to perform this function. A project aimed at training first year university students in FA-BLS was conducted at Hacettepe University. In the first phase, medical student first aid trainers (MeSFAT) were trained in FA-BLS training techniques by academic trainers and in the second phase, first year university students were trained in FA-BLS techniques by these peer trainers under the academic trainers' supervision. The purpose of this study was to assess the participants' evaluation of this project and to propose a new program to increase the number of first aiders in the country. In total, 31 medical students were certified as MeSFATs and 12 of these trained 40 first year university students in FA-BLS. Various questionnaires were applied to the participants to determine their evaluation of the training program. Most of the participants and the authors considered the program to be successful and effective. This method may be used to increase the number of first aid trainers and first aiders in the community.

  6. Basic digital signal processing

    CERN Document Server

    Lockhart, Gordon B

    1985-01-01

    Basic Digital Signal Processing describes the principles of digital signal processing and experiments with BASIC programs involving the fast Fourier transform (FFT). The book reviews the fundamentals of the BASIC program; continuous and discrete time signals, including analog signals; Fourier analysis; the discrete Fourier transform; signal energy; and power. The text also explains digital signal processing involving digital filters, linear time-invariant systems, the discrete-time unit impulse, discrete-time convolution, and the alternative structure for second-order infinite impulse response (IIR) sections.

  7. What Were We Thinking? Five Erroneous Assumptions That Have Fueled Specialized Interventions for Adolescents Who Have Sexually Offended

    Science.gov (United States)

    Worling, James R.

    2013-01-01

    Since the early 1980s, five assumptions have influenced the assessment, treatment, and community supervision of adolescents who have offended sexually. In particular, interventions with this population have been informed by the assumptions that these youth are (i) deviant, (ii) delinquent, (iii) disordered, (iv) deficit-ridden, and (v) deceitful.…

  8. Development and Validation of a Clarinet Performance Adjudication Scale

    Science.gov (United States)

    Abeles, Harold F.

    1973-01-01

    A basic assumption of this study is that there are generally agreed upon performance standards as evidenced by the use of adjudicators for evaluations at contests and festivals. An evaluation instrument was developed to enable raters to measure effectively those aspects of performance that have common standards of proficiency. (Author/RK)

  9. Relationship between self-reported body awareness and physiotherapists' evaluation of Basic Body Awareness Therapy in refugees with PTSD

    DEFF Research Database (Denmark)

    Jensen, Jonna Anne

    Background: The number of refugees who are traumatized and diagnosed with post-traumatic stress disorder (PTSD) is increasing in Denmark and Europe. In Denmark, Basic Body Awareness Therapy (B-BAT) is used by physiotherapists in the rehabilitation of traumatized refugees as a body-oriented intervention. A recent pilot study found that B-BAT decreased somatic and mental symptoms of PTSD in a group of refugees with this diagnosis (Stade 2015). Further, Bergström et al. (2014) found that patients with chronic pain and low body awareness had no significant changes in body awareness after treatment with B-BAT, whereas the group with moderate/high body awareness had a significant change one year after treatment. However, whether there exists a relationship between self-reported body awareness and physiotherapists' evaluation of the applicability of B-BAT to PTSD symptoms is not known. Purpose: This study

  10. The effects of behavioral and structural assumptions in artificial stock market

    Science.gov (United States)

    Liu, Xinghua; Gregor, Shirley; Yang, Jianmei

    2008-04-01

    Recent literature has developed the conjecture that important statistical features of stock price series, such as the fat tails phenomenon, may depend mainly on the market microstructure. This conjecture motivated us to investigate the roles of both the market microstructure and agent behavior with respect to high-frequency returns and daily returns. We developed two simple models to investigate this issue. The first one is a stochastic model with a clearing house microstructure and a population of zero-intelligence agents. The second one has more behavioral assumptions based on Minority Game and also has a clearing house microstructure. With the first model we found that a characteristic of the clearing house microstructure, namely the clearing frequency, can explain the fat tail, excess volatility and autocorrelation phenomena of high-frequency returns. However, this feature does not cause the same phenomena in daily returns. So the stylized facts of daily returns depend mainly on the agents’ behavior. With the second model we investigated the effects of behavioral assumptions on daily returns. Our study implies that the aspects responsible for generating the stylized facts of high-frequency returns and daily returns are different.
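    The clearing-house idea can be made concrete in a few lines. The following is an illustrative sketch, not the authors' two models: the agent count, the Gaussian quote rule, and the median-clearing rule are all invented for the example, but it shows how a clearing frequency parameter enters such a simulation:

```python
import math
import random

def simulate(n_agents=100, n_ticks=2000, clearing_interval=1, seed=1):
    """Toy zero-intelligence market with a periodic clearing house.

    Each tick, every agent submits a quote that randomly perturbs the
    last cleared price; every `clearing_interval` ticks the clearing
    house sets the new price to the median of the accumulated quotes
    and empties the book. Returns log-returns between clearings.
    """
    rng = random.Random(seed)
    price, prices, book = 100.0, [100.0], []
    for t in range(1, n_ticks + 1):
        book.extend(price * (1 + rng.gauss(0, 0.01)) for _ in range(n_agents))
        if t % clearing_interval == 0:
            book.sort()
            price = book[len(book) // 2]  # clearing price = median quote
            prices.append(price)
            book = []
    return [math.log(b / a) for a, b in zip(prices, prices[1:])]

# Varying the clearing frequency changes the distribution of the cleared
# returns, which is the microstructure effect the first model examines.
returns = simulate(clearing_interval=5)
```

Comparing the return distributions for different `clearing_interval` values is the kind of experiment that would probe the conjectured link between clearing frequency and fat tails.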

  11. Energetics of basic karate kata.

    Science.gov (United States)

    Bussweiler, Jens; Hartmann, Ulrich

    2012-12-01

    Knowledge about energy requirements during exercises seems necessary to develop training concepts in the combat sport Karate. It is a commonly held view that the anaerobic lactic energy metabolism plays a key role, but this assumption could not be confirmed so far. The metabolic cost and fractional energy supply of a basic Karate Kata (Heian Nidan, Shotokan style) with a duration of about 30 s were analyzed. Six male Karateka [mean ± SD (age 29 ± 8 years; height 177 ± 5 cm, body mass 75 ± 9 kg)] with different training experience (advanced athletes, experts, elite athletes) were examined while performing the sport-specific movements once and twice in succession. During Kata performance oxygen uptake was measured with a portable spirometric device, blood lactate concentrations were examined before and after testing, and fractional energy supply was calculated. The results showed that on average 52% of the energy supply for one Heian Nidan came from anaerobic alactic metabolism, 25% from anaerobic lactic and 23% from aerobic metabolism. For two sequentially executed Heian Nidan, and thus nearly double the duration, the calculated percentages were 33, 25 and 42%. Total energy demand for one Kata and two Kata was approximately 61 and 99 kJ, respectively. Despite measured blood lactate concentrations of up to 8.1 mmol·l⁻¹, which might suggest a dominance of lactic energy supply, a lactic fraction of only 17-31% was found during these relatively short and intense sequences. A heavy reliance on lactic energy metabolism therefore had to be rejected.
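    The fractional-supply figures reported above translate into absolute energy per pathway with straightforward arithmetic. A small sketch using the abstract's numbers (the helper function is invented for illustration, not part of the study's methods):

```python
def energy_split(total_kj, fractions):
    """Split a total energy demand (kJ) across metabolic pathways.
    `fractions` maps pathway name to its fractional share (sums to 1)."""
    assert abs(sum(fractions.values()) - 1.0) < 1e-9
    return {pathway: total_kj * share for pathway, share in fractions.items()}

# One Heian Nidan (~30 s): 52% alactic, 25% lactic, 23% aerobic of ~61 kJ.
one_kata = energy_split(61, {"alactic": 0.52, "lactic": 0.25, "aerobic": 0.23})

# Two consecutive Kata (~60 s): 33% / 25% / 42% of ~99 kJ. The aerobic
# share grows with duration while the lactic share stays near 25%.
two_kata = energy_split(99, {"alactic": 0.33, "lactic": 0.25, "aerobic": 0.42})
```

The near-constant lactic share across both durations is the arithmetic behind the study's rejection of a lactic-dominated supply.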

  12. Production process stability - core assumption of INDUSTRY 4.0 concept

    Science.gov (United States)

    Chromjakova, F.; Bobak, R.; Hrusecka, D.

    2017-06-01

    Today’s industrial enterprises confronting the implementation of the INDUSTRY 4.0 concept face a basic problem: stabilised manufacturing and supporting processes. Through this stabilisation they can achieve positive digital management of processes and a continuous throughput. Structural stability of the horizontal (business) and vertical (digitised) manufacturing processes, supported by the digitalised technologies of the INDUSTRY 4.0 concept, is required. The results presented in this paper are based on research and a survey carried out in several industrial companies. A basic model for structural process stabilisation in a manufacturing environment is described.

  13. Basic Approaches of Complex Interaction DrumTerrain for Vibratory Compaction

    Directory of Open Access Journals (Sweden)

    Gigel Florin Capatana

    2013-09-01

    Full Text Available In this paper the author applies a new method to evaluate and analyze the interaction between roller and terrain. The analysis takes a rheological approach, with a predominantly dynamic behaviour, so as to reveal the compatibility of the working body's performance with the characteristics of the terrain. The basic idea is that the energy transfer between the two components of the system must be maximized. The model must allow permanent and continuous adjustment of the material characteristics so that the technological capability can be evaluated. These objectives are fulfilled by using a complex model with both distributed and concentrated elements, which can have rheology of elastic, dissipative and plastic types. The first conclusions of the presented study point to the idea that harmonizing the basic parameters of the model with the experimental values can lead to structural and functional optimization of the entire technological system.

  14. Basic science right, not basic science lite: medical education at a crossroad.

    Science.gov (United States)

    Fincher, Ruth-Marie E; Wallach, Paul M; Richardson, W Scott

    2009-11-01

    This perspective is a counterpoint to Dr. Brass' article, Basic biomedical sciences and the future of medical education: implications for internal medicine. The authors review the development of the US medical education system as an introduction to a discussion of Dr. Brass' perspectives. The authors agree that sound scientific foundations and skill in critical thinking are important and that effective educational strategies to improve foundational science education should be implemented. Unfortunately, many students do not perceive the relevance of basic science education to clinical practice.The authors cite areas of disagreement. They believe it is unlikely that the importance of basic sciences will be diminished by contemporary directions in medical education and planned modifications of the USMLE. Graduates' diminished interest in internal medicine is unlikely to result from changes in basic science education.Thoughtful changes in education provide the opportunity to improve understanding of fundamental sciences, the process of scientific inquiry, and the translation of that knowledge to clinical practice.

  15. Questioning the "big assumptions". Part II: recognizing organizational contradictions that impede institutional change.

    Science.gov (United States)

    Bowe, Constance M; Lahey, Lisa; Kegan, Robert; Armstrong, Elizabeth

    2003-08-01

    Well-designed medical curriculum reforms can fall short of their primary objectives during implementation when unanticipated or unaddressed organizational resistance surfaces. This typically occurs if the agents for change ignore faculty concerns during the planning stage or when the provision of essential institutional safeguards to support new behaviors is neglected. Disappointing outcomes in curriculum reforms then result in the perpetuation of or reversion to the status quo despite the loftiest of goals. Institutional resistance to change, much like that observed during personal development, does not necessarily indicate a communal lack of commitment to the organization's newly stated goals. It may reflect the existence of competing organizational objectives that must be addressed before substantive advances in a new direction can be accomplished. The authors describe how the Big Assumptions process (see previous article) was adapted and applied at the institutional level during a school of medicine's curriculum reform. Reform leaders encouraged faculty participants to articulate their reservations about the considered changes to provide insights into the organization's competing commitments. The line of discussion gave faculty an opportunity to appreciate the gridlock that existed until appropriate tests of the school's long-held Big Assumptions could be conducted. The Big Assumptions process proved useful in moving faculty groups to recognize and question the validity of unchallenged institutional beliefs that were likely to undermine efforts toward change. The process also allowed the organization to put essential institutional safeguards in place that ultimately ensured that substantive reforms could be sustained.

  16. Bayou Corne sinkhole : control measurements of State Highway 70 in Assumption Parish, Louisiana.

    Science.gov (United States)

    2014-01-01

    This project measured and assessed the surface stability of the portion of LA Highway 70 that is : potentially vulnerable to the Assumption Parish sinkhole. Using Global Positioning Systems (GPS) : enhanced by a real-time network (RTN) of continuousl...

  17. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    NARCIS (Netherlands)

    Ernst, Anja F.; Albers, Casper J.

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated
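The assumptions at issue in such reviews (linearity, homoscedasticity, normality and independence of residuals) can be checked with simple diagnostics before a fitted model is trusted. A minimal stdlib-only sketch on simulated data; the spread-ratio check is an illustrative stand-in, not a procedure taken from the review:

```python
import random
import statistics

random.seed(1)

# Simulated data satisfying the standard linear model: y = 2 + 3x + noise
x = [i / 10 for i in range(100)]
y = [2 + 3 * xi + random.gauss(0, 0.5) for xi in x]

# Ordinary least squares fit (closed form for a single predictor)
n = len(x)
mx, my = statistics.mean(x), statistics.mean(y)
sxx = sum((xi - mx) ** 2 for xi in x)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
slope = sxy / sxx
intercept = my - slope * mx

residuals = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]

# Crude homoscedasticity check: residual spread in the lower vs. upper half of x;
# a large ratio would suggest the constant-variance assumption is violated
lo = statistics.stdev(residuals[: n // 2])
hi = statistics.stdev(residuals[n // 2:])
ratio = max(lo, hi) / min(lo, hi)

print(slope, intercept, ratio)
```

In practice one would also inspect residual plots and formal tests; the point of the sketch is only that assumption checks cost a few lines, not that this ratio is a standard statistic.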

  18. Under What Assumptions Do Site-by-Treatment Instruments Identify Average Causal Effects?

    Science.gov (United States)

    Reardon, Sean F.; Raudenbush, Stephen W.

    2013-01-01

    The increasing availability of data from multi-site randomized trials provides a potential opportunity to use instrumental variables methods to study the effects of multiple hypothesized mediators of the effect of a treatment. We derive nine assumptions needed to identify the effects of multiple mediators when using site-by-treatment interactions…

  19. Reflections on assumptions of energy policy. Viewpoint of a sceptical observer

    International Nuclear Information System (INIS)

    Taczanowski, S.; Pohorecki, W.

    2000-01-01

    Poland's energy policy assumptions up to 2020 have been critically assessed. Energy source availability as well as predicted fuel prices have been discussed for the period of interest. Fossil fuels and uranium have been taken into account. On this basis it has been concluded that rejecting the nuclear option in Poland's plans for energy development up to 2020 seems to be a serious mistake

  20. Transportation radiological risk assessment for the programmatic environmental impact statement: An overview of methodologies, assumptions, and input parameters

    International Nuclear Information System (INIS)

    Monette, F.; Biwer, B.; LePoire, D.; Chen, S.Y.

    1994-01-01

    The U.S. Department of Energy is considering a broad range of alternatives for the future configuration of radioactive waste management at its network of facilities. Because the transportation of radioactive waste is an integral component of the management alternatives being considered, the estimated human health risks associated with both routine and accident transportation conditions must be assessed to allow a complete appraisal of the alternatives. This paper provides an overview of the technical approach being used to assess the radiological risks from the transportation of radioactive wastes. The approach presented employs the RADTRAN 4 computer code to estimate the collective population risk during routine and accident transportation conditions. Supplemental analyses are conducted using the RISKIND computer code to address areas of specific concern to individuals or population subgroups. RISKIND is used for estimating routine doses to maximally exposed individuals and for assessing the consequences of the most severe credible transportation accidents. The transportation risk assessment is designed to ensure -- through uniform and judicious selection of models, data, and assumptions -- that relative comparisons of risk among the various alternatives are meaningful. This is accomplished by uniformly applying common input parameters and assumptions to each waste type for all alternatives. The approach presented can be applied to all radioactive waste types and provides a consistent and comprehensive evaluation of transportation-related risk

  1. Sensitivity of Earthquake Loss Estimates to Source Modeling Assumptions and Uncertainty

    Science.gov (United States)

    Reasenberg, Paul A.; Shostak, Nan; Terwilliger, Sharon

    2006-01-01

    Introduction: This report explores how uncertainty in an earthquake source model may affect estimates of earthquake economic loss. Specifically, it focuses on the earthquake source model for the San Francisco Bay region (SFBR) created by the Working Group on California Earthquake Probabilities (WG02). The loss calculations are made using HAZUS-MH, a publicly available computer program developed by the Federal Emergency Management Agency (FEMA) for calculating future losses from earthquakes, floods and hurricanes within the United States. The database built into HAZUS-MH includes a detailed building inventory, population data, data on transportation corridors, bridges, utility lifelines, etc. Earthquake hazard in the loss calculations is based upon expected (median value) ground motion maps called ShakeMaps calculated for the scenario earthquake sources defined in the WG02 model. The study considers the effect of relaxing certain assumptions in the WG02 model, and explores the effect of hypothetical reductions in epistemic uncertainty in parts of the model. For example, it addresses questions such as: what would happen to the calculated loss distribution if the uncertainty in slip rate in the WG02 model were reduced (say, by obtaining additional geologic data)? What would happen if the geometry or amount of aseismic slip (creep) on the region's faults were better known? And what would be the effect on the calculated loss distribution if the time-dependent earthquake probability were better constrained, either by eliminating certain probability models or by better constraining the inherent randomness in earthquake recurrence? The study does not consider the effect of reducing uncertainty in the hazard introduced through models of attenuation and local site characteristics, although these may have a comparable or greater effect than does source-related uncertainty. Nor does it consider sources of uncertainty in the building inventory, building fragility curves, and other assumptions.

  2. Does Fostering Reasoning Strategies for Relatively Difficult Basic Combinations Promote Transfer by K-3 Students?

    Science.gov (United States)

    Baroody, Arthur J.; Purpura, David J.; Eiland, Michael D.; Reid, Erin E.; Paliwal, Veena

    2016-01-01

    How best to promote fluency with basic sums and differences is still not entirely clear. Some advocate a direct approach--using drill to foster memorization of basic facts by rote. Others recommend an indirect approach that first involves learning reasoning strategies. The purpose of the present study was to evaluate the efficacy of 2…

  3. Visual Basic 2012 programmer's reference

    CERN Document Server

    Stephens, Rod

    2012-01-01

    The comprehensive guide to Visual Basic 2012 Microsoft Visual Basic (VB) is the most popular programming language in the world, with millions of lines of code used in businesses and applications of all types and sizes. In this edition of the bestselling Wrox guide, Visual Basic expert Rod Stephens offers novice and experienced developers a comprehensive tutorial and reference to Visual Basic 2012. This latest edition introduces major changes to the Visual Studio development platform, including support for developing mobile applications that can take advantage of the Windows 8 operating system

  4. Quantum electronics basic theory

    CERN Document Server

    Fain, V M; Sanders, J H

    1969-01-01

    Quantum Electronics, Volume 1: Basic Theory is a condensed and generalized account of the extensive research and rapid progress on the subject, translated from the Russian. The volume describes the basic theory of quantum electronics, and shows how the concepts and equations followed in quantum electronics arise from the basic principles of theoretical physics. The book then briefly discusses the interaction of an electromagnetic field with matter. The text also covers the quantum theory of relaxation processes when a quantum system approaches an equilibrium state, and explains…

  5. Economic evaluation of reprocessing

    International Nuclear Information System (INIS)

    1979-02-01

    This paper presents a progress report of work undertaken relevant to the economic evaluation of reprocessing. It sets out the assumptions to be made for the preparation of the economic ''phase diagram'' - a plot of fast reactor premium against uranium (U3O8) price. The paper discusses the assumptions to be made in respect of present worth methodology, LWR fuel logistics, U3O8 price, enrichment tails, plutonium values and fast reactor premium, and proposes a set of reference costs to be used for the preparation of the phase diagram

  6. Basic evaluation on nuclear characteristics of BWR high burnup MOX fuel and core

    International Nuclear Information System (INIS)

    Nagano, M.; Sakurai, S.; Yamaguchi, H.

    1997-01-01

    MOX fuel will be used in existing commercial BWR cores as a part of reload fuels, with operability, safety and economy equivalent to UO2 fuel in Japan. The design concept should be compatible with the UO2 fuel design. High burnup UO2 fuels are being developed and commercialized step by step. The MOX fuel planned to be introduced in around the year 2000 will use the same hardware as the UO2 8 x 8 array fuel developed for the second step of UO2 high burnup fuel. The target discharge exposure of this MOX fuel is about 33 GWd/t, and the loading fraction of MOX fuel is approximately one-third in an equilibrium core. On the other hand, it has become necessary to minimize the number of MOX fuel assemblies and of plants utilizing MOX fuel, mainly due to fuel economy, handling cost and on-site inspection cost. For these reasons, it was necessary to develop a high burnup MOX fuel containing more Pu and a core with a large amount of MOX fuel. The purpose of this study is to evaluate the basic nuclear fuel and core characteristics of BWR high burnup MOX fuel with a batch average exposure of about 39.5 GWd/t using 9 x 9 array fuel. The loading fraction of MOX fuel in the core is within a range of about 50% to 100%. The influence of Pu isotopic composition fluctuations and Pu-241 decay upon nuclear characteristics is also studied. (author). 3 refs, 5 figs, 3 tabs

  7. Development and Validation of a Project Package for Junior Secondary School Basic Science

    Science.gov (United States)

    Udofia, Nsikak-Abasi

    2014-01-01

    This was a Research and Developmental study designed to develop and validate projects for Junior Secondary School Basic Science instruction and evaluation. The projects were developed using the project blueprint and sent for validation by experts in science education and measurement and evaluation; using a project validation scale. They were to…

  8. Information center as a link between basic and applied research

    International Nuclear Information System (INIS)

    Pearlstein, S.

    1976-01-01

    The National Neutron Cross Section Center (NNCSC) concerns itself with neutron physics information of a basic and applied nature. Computerized bibliographic files of the neutron physics literature, and of experimental and evaluated neutron data, are maintained. The NNCSC coordinates a national effort, the Cross Section Evaluation Working Group (CSEWG), with participants from government, private, and academic institutions, to establish a computerized reference data base, the Evaluated Nuclear Data File (ENDF/B), for national programs. ENDF/B is useful to basic research because it contains recommended values based on the best available measurements and is often used as reference data for normalization and analysis of experiments. For applied use, the reference data are extended through nuclear model calculations or nuclear systematics to include all data of interest, with standardized processing codes facilitating the use of ENDF/B in certain types of computations. Initially, the main application of ENDF/B was power reactor and shield design, and only neutron data were evaluated; but because many applications require both neutron and non-neutron data, ENDF/B has been extended in scope to include radioactive decay data and radiation spectra for the burnup and after-decay heat of fission products, and photon interaction data for gamma-ray transport calculations. Cooperation with other centers takes place both nationally and internationally

  9. NONLINEAR MODELS FOR DESCRIPTION OF CACAO FRUIT GROWTH WITH ASSUMPTION VIOLATIONS

    Directory of Open Access Journals (Sweden)

    JOEL AUGUSTO MUNIZ

    2017-01-01

    Cacao (Theobroma cacao L.) is an important fruit in the Brazilian economy, mainly cultivated in the southern State of Bahia. The optimal stage for harvesting is a major factor for fruit quality, and knowledge of its growth curves can help, especially in identifying the ideal maturation stage for harvesting. Nonlinear regression models have been widely used for the description of growth curves. However, several studies on this subject do not consider the residual analysis, the existence of a possible dependence between longitudinal observations, or the sample variance heterogeneity, compromising the modeling quality. The objective of this work was to compare the fit of nonlinear regression models, considering residual analysis and assumption violations, in the description of cacao (clone Sial-105) fruit growth. The data evaluated were extracted from Brito and Silva (1983), who conducted the experiment in the Cacao Research Center, Ilheus, State of Bahia. The variables fruit length, diameter and volume as a function of fruit age were studied. The use of weighting and the incorporation of residual dependencies were efficient, since the modeling became more consistent, improving the model fit. Considering the first-order autoregressive structure, when needed, leads to a significant reduction in the residual standard deviation, making the estimates more reliable. The Logistic model was the most efficient for the description of cacao fruit growth.
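A logistic growth curve of the kind the abstract refers to can be sketched in a few lines. The data and the naive grid search below are illustrative only (the study fits by iterative least squares and weights for heteroscedastic, autocorrelated residuals, which this sketch does not attempt):

```python
import math

# Logistic growth model: y(t) = A / (1 + exp(-k * (t - t0)))
# A: asymptotic size, k: growth rate, t0: age at the inflection point
def logistic(t, A, k, t0):
    return A / (1 + math.exp(-k * (t - t0)))

# Hypothetical fruit-length data (cm) over age (days); noise-free for the sketch
ages = [20, 40, 60, 80, 100, 120, 140, 160]
obs = [logistic(t, 22.0, 0.06, 80.0) for t in ages]

# Naive grid search minimising the residual sum of squares, standing in
# for the iterative nonlinear least-squares fit used in such studies
best = None
for A in [20.0, 21.0, 22.0, 23.0]:
    for k in [0.04, 0.05, 0.06, 0.07]:
        for t0 in [70.0, 75.0, 80.0, 85.0]:
            rss = sum((y - logistic(t, A, k, t0)) ** 2 for t, y in zip(ages, obs))
            if best is None or rss < best[0]:
                best = (rss, A, k, t0)

print(best)  # recovers (0.0, 22.0, 0.06, 80.0) since the data are noise-free
```

With real, noisy, longitudinal data one would instead use a proper optimizer and then examine the residuals for the variance heterogeneity and autocorrelation the paper emphasizes.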

  10. Formalizing Evaluation in Music Information Retrieval

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2013-01-01

    We develop a formalism to disambiguate the evaluation of music information retrieval systems. We define a ``system,'' what it means to ``analyze'' one, and make clear the aims, parts, design, execution, interpretation, and assumptions of its ``evaluation.'' We apply this formalism to discuss...

  11. Agenda dissonance: immigrant Hispanic women's and providers' assumptions and expectations for menopause healthcare.

    Science.gov (United States)

    Esposito, Noreen

    2005-02-01

    This focus group study examined immigrant Hispanic women's and providers' assumptions about and expectations of healthcare encounters in the context of menopause. Four groups of immigrant women from Central America and one group of healthcare providers were interviewed in Spanish and English, respectively. The women wanted provider-initiated, individualized anticipatory guidance about menopause, acknowledgement of their symptoms, and mainstream medical treatment for disruptive symptoms. Providers believed that menopause was an unimportant health issue for immigrant women and was overshadowed by concerns about high-risk medical problems, such as diabetes, heart disease and HIV prevention. The women expected a healthcare encounter to be patient centered, social, and complete in itself. Providers expected an encounter to be businesslike and one part of multiple visit care. Language and lack of time were barriers cited by all. Dissonance between patient-provider assumptions and expectations around issues of healthcare leads to missed opportunities for care.

  12. Lack of transparency on environmental risks of genetically modified micro-organisms in industrial biotechnology

    NARCIS (Netherlands)

    Tamis, W.L.M.; Dommelen, van A.; Snoo, de G.R.

    2009-01-01

    For the sustainable development of technological innovations the involvement of non-specialist stakeholders is crucial, which requires transparency of the knowledge base of the risks and benefits concerned. This paper evaluates the basic assumptions of the Organisation for Economic Cooperation and

  13. A holistic approach for perfusion assessment in septic shock: Basic foundations and clinical applications

    NARCIS (Netherlands)

    Hernández Poblete, G.W.

    2013-01-01

    A fundamental challenge in septic shock resuscitation is to evaluate tissue perfusion. In this thesis, we review the basic foundations for the development of a comprehensive and holistic model for perfusion assessment in septic shock, and outline its application to evaluate the impact of

  14. Basic visual observation skills training course: Appendix A. Final report

    International Nuclear Information System (INIS)

    Toquam, J.L.; Morris, F.A.; Griggs, J.R.

    1995-06-01

    The purpose of the basic visual observation skills course is to help safeguards inspectors evaluate and improve their skills in making observations during inspections and in evaluating and interpreting this information. The first 12 hours of the course provide training in five skill areas: perception and recognition; attention to detail; memory; mental imaging, mapping, and modeling skills; and judgment and decision making. Following this training is an integrating exercise involving a simulated safeguards inspection. This report contains the course manual and materials

  15. Basic visual observation skills training course: Appendix A. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Toquam, J.L.; Morris, F.A.; Griggs, J.R.

    1995-06-01

    The purpose of the basic visual observation skills course is to help safeguards inspectors evaluate and improve their skills in making observations during inspections and in evaluating and interpreting this information. The first 12 hours of the course provide training in five skill areas: perception and recognition; attention to detail; memory; mental imaging, mapping, and modeling skills; and judgment and decision making. Following this training is an integrating exercise involving a simulated safeguards inspection. This report contains the course manual and materials.

  16. The Concept of Basic Income: Global Experience and Implementation Possibilities in Lithuania

    Directory of Open Access Journals (Sweden)

    Algimantas Laurinavičius

    2016-06-01

    The article gives an overview of universal basic income as one of the instruments of asset-based policy, analyses its theoretical concept and practical examples. The latest trends in Europe, especially in Finland and Switzerland, are reviewed and the possibilities of implementing such an instrument in Lithuania are evaluated. Research methods of scientific literature analysis, comparative and logical analysis of statistical data, and data grouping and presentation were used. The article finds that the concept of basic income is being implemented on a small scale in the US state of Alaska and in a small autonomous territory of China, Macao. Finland and Switzerland are determined to fully implement the concept of basic income by providing monthly benefits to all their citizens. Although Lithuania is categorized as a country with high income inequality and a high level of poverty risk, it is currently not possible to implement the concept of basic income in Lithuania: the state social insurance fund budget would not be able to fund sufficient benefits, and the benefits that could be provided by the budget would not comply with the objectives of the concept of basic income.

  17. Temporal Distinctiveness in Task Switching: Assessing the Mixture-Distribution Assumption

    Directory of Open Access Journals (Sweden)

    James A Grange

    2016-02-01

    In task switching, increasing the response-cue interval (RCI) has been shown to reduce the switch cost. This has been attributed to a time-based decay process influencing the activation of memory representations of tasks (task-sets). Recently, an alternative account based on interference rather than decay has been successfully applied to these data (Horoufchin et al., 2011). In this account, variation of the RCI is thought to influence the temporal distinctiveness (TD) of episodic traces in memory, thus affecting their retrieval probability. This can affect performance because retrieval probability influences response time: if retrieval succeeds, responding is fast due to positive priming; if retrieval fails, responding is slow, due to having to perform the task via a slow algorithmic process. This account---and a recent formal model (Grange & Cross, 2015)---makes the strong prediction that all RTs are a mixture of two processes: a fast process when retrieval succeeds, and a slow process when retrieval fails. The present paper assesses the evidence for this mixture-distribution assumption in TD data. In a first section, statistical evidence for mixture distributions is found using the fixed-point property test. In a second section, a mathematical process model with mixture distributions at its core is fitted to the response time distribution data. Both approaches provide good evidence in support of the mixture-distribution assumption, and thus support temporal distinctiveness accounts of the data.
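The mixture assumption itself is easy to simulate: each response time is drawn either from a fast retrieval-based process or a slow algorithmic process, with the retrieval probability (tied here to temporal distinctiveness) as the mixture weight. All parameter values below are illustrative, not fitted values from the cited studies:

```python
import random
import statistics

random.seed(7)

# Each RT comes from the fast process (retrieval success) with probability
# p_retrieval, otherwise from the slow algorithmic process
def sample_rt(p_retrieval, n=20_000):
    out = []
    for _ in range(n):
        if random.random() < p_retrieval:
            out.append(random.gauss(450, 50))    # fast: primed response, ms
        else:
            out.append(random.gauss(900, 120))   # slow: algorithmic route, ms
    return out

short_rci = sample_rt(p_retrieval=0.8)  # high temporal distinctiveness
long_rci = sample_rt(p_retrieval=0.3)   # low retrieval probability

# Mean RT shifts with the mixture weight even though the two component
# processes themselves are unchanged
print(statistics.mean(short_rci), statistics.mean(long_rci))
```

This is the signature the fixed-point property test looks for: changing the RCI changes only the mixture proportion, so RT densities at different RCIs should all cross at one common point.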

  18. Evaluation of Investment Risks in CBA with Monte Carlo Method

    Directory of Open Access Journals (Sweden)

    Jana Korytárová

    2015-01-01

    Investment decisions are at the core of any development strategy. Economic growth and welfare depend on productive capital, infrastructure, human capital, knowledge, total factor productivity and the quality of institutions. The decision-making process for selecting suitable projects in the public sector is in some aspects more difficult than in the private sector. Evaluating projects on the basis of their financial profitability, where the basic parameter is the value of the potential profit, can be misleading in these cases. One of the basic objectives of the allocation of public resources is respect for the 3E principle (Economy, Effectiveness, Efficiency) over the whole life cycle. The life cycle of an investment project consists of four main phases. The first, pre-investment, phase is very important for the decision on whether to accept or reject a public project for realization. A well-designed feasibility study as well as a cost-benefit analysis (CBA) in this phase are important preconditions for the future success of the project. The future financial and economic cash flows (CF), which represent the fundamental basis for the calculation of economic effectiveness indicators, are formed and modelled in these documents. This paper deals with the possibility of calculating the financial and economic efficiency of public investment projects more accurately by using simulation methods.
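The simulation approach the abstract points to can be sketched as a Monte Carlo distribution of net present value (NPV): uncertain inputs are drawn from assumed distributions and the effectiveness indicator is recomputed on each draw. All figures and distributions below are illustrative assumptions, not values from the paper:

```python
import random
import statistics

random.seed(42)

# NPV of a public project: upfront cost, then a constant annual net benefit
# discounted over the appraisal period
def npv(cost, benefit, years=20, rate=0.05):
    return -cost + sum(benefit / (1 + rate) ** t for t in range(1, years + 1))

runs = 10_000
results = []
for _ in range(runs):
    cost = random.triangular(90, 130, 100)   # investment cost, mEUR (assumed)
    benefit = random.gauss(10, 2)            # annual net benefit, mEUR (assumed)
    results.append(npv(cost, benefit))

mean_npv = statistics.mean(results)
p_negative = sum(r < 0 for r in results) / runs
print(f"mean NPV: {mean_npv:.1f} mEUR, P(NPV < 0): {p_negative:.2f}")
```

The output is a risk profile rather than a single point estimate: the decision-maker sees not only the expected NPV but also the probability that the project destroys value.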

  19. Current Status and Issues in Basic Pharmaceutical Education.

    Science.gov (United States)

    Yasuhara, Tomohisa

    2017-01-01

    Basic research in pharmaceutical sciences has a long and successful history. Researchers in this field have long given prime importance to the knowledge they have gained through their pharmaceutical education. The transition of pharmacy education to a 6-year course term has not only extended its duration but also placed more emphasis on practical clinical education. The School Education Act (in article 87, second paragraph) determines that "the term of the course, whose main purpose is to cultivate practical ability in clinical pharmacy, shall be six years" (excerpt). The 6-year pharmacy education is an exception to the general 4-year university term determined by the School Education Act. Therefore, the purpose of the 6-year course in pharmacy is clearly prescribed. This is true of the basic course in pharmaceutical education as well; hence, the basic course must be oriented toward developing "practical ability in clinical" education, too. The 6-year pharmacy course, starting from practice (Do), has evolved with the development of a syllabus that includes a model core curriculum (Plan). Furthermore, improvement in the course can be seen in the promoted development of faculty (Act). Now, evidence-based education research will be introduced (Check). This is how the Plan-Do-Check-Act cycle in pharmaceutical education is expected to work. Currently, pedagogy research in pharmacy education has just begun, so it is difficult to evaluate at this time whether basic pharmaceutical education does in fact contribute to enhancing the "practical clinical ability" component of pharmaceutical education.

  20. Vocational Didactics: Core Assumptions and Approaches from Denmark, Germany, Norway, Spain and Sweden

    Science.gov (United States)

    Gessler, Michael; Moreno Herrera, Lázaro

    2015-01-01

    The design of vocational didactics has to meet special requirements. Six core assumptions are identified: outcome orientation, cultural-historical embedding, horizontal structure, vertical structure, temporal structure, and the changing nature of work. Different approaches and discussions from school-based systems (Spain and Sweden) and dual…

  1. Teaching, the Legal Education and Carl Rogers Assumptions: A Case Study in a Private University

    Directory of Open Access Journals (Sweden)

    Leonardo José Peixoto Leal

    2015-12-01

    of examination lawyers and tenders, and there exists today a new vision called the "legal education crisis" in Brazil. According to Carl Rogers (1972), the main role of the teacher is not only to teach but to help the student to learn. This idea has been legitimized internationally since the publication of the UNESCO Report (Delors, 1998), which pointed out that "learning to know" constitutes one of the pillars of contemporary education. Rogers (1972), in the 1960s, drew up a list of 10 deeply rooted implicit assumptions among teachers, paradigms that should be addressed by them. The methodology used was bibliographic and documentary research with a qualitative approach, framed as a case study, considering the Master in Law and the experiences of the Monitoring and Group Study Program. It concludes that critical evaluation is important in the formation of the legal profession, because legal education needs to renew itself, moving toward a teaching practice centered on learning.

  2. Anti-Atheist Bias in the United States: Testing Two Critical Assumptions

    OpenAIRE

    Swan, Lawton K; Heesacker, Martin

    2012-01-01

    Decades of opinion polling and empirical investigations have clearly demonstrated a pervasive anti-atheist prejudice in the United States. However, much of this scholarship relies on two critical and largely unaddressed assumptions: (a) that when people report negative attitudes toward atheists, they do so because they are reacting specifically to their lack of belief in God; and (b) that survey questions asking about attitudes toward atheists as a group yield reliable information about biase...

  3. Comprehensive basic mathematics

    CERN Document Server

    Veena, GR

    2005-01-01

    Salient Features: As per the II PUC Basic Mathematics syllabus of Karnataka. Provides an introduction to various basic mathematical techniques and the situations where these could be usefully employed. The language is simple and the material is self-explanatory with a large number of illustrations. Assists the reader in gaining proficiency to solve a diverse variety of problems. A special capsule containing a gist and list of formulae titled ''REMEMBER!''. An additional chapterwise-arranged question bank and 3 model papers in a separate section---''EXAMINATION CORNER''.

  4. Wind Energy Basics | NREL

    Science.gov (United States)

    Wind Energy Basics. We have been harnessing the wind's energy for hundreds of years, from windmills used for pumping water or grinding grain. Today, the windmill's modern equivalent, a wind turbine, can use the wind's energy to generate electricity. Wind turbines are mounted on tall towers to capture the most energy. At 100 feet (30 meters) or more aboveground, they can take advantage of the faster and less turbulent wind.

  5. Biomass Energy Basics | NREL

    Science.gov (United States)

    Biomass Energy Basics. We have used biomass energy, or "bioenergy", for thousands of years, ever since people started burning wood to cook food or keep warm. Wood is still the largest biomass energy resource today, but other sources of biomass can also be used, and even the fumes from landfills (which are methane, the main component in natural gas) can be used as a biomass energy source.

  6. Proposed optical test of Bell's inequalities not resting upon the fair sampling assumption

    International Nuclear Information System (INIS)

    Santos, Emilio

    2004-01-01

    Arguments are given against the fair sampling assumption, used to claim an empirical disproof of local realism. New tests are proposed, able to discriminate between quantum mechanics and a restricted, but appealing, family of local hidden-variables models. Such tests require detectors with efficiencies just above 20%

  7. Post Stereotypes: Deconstructing Racial Assumptions and Biases through Visual Culture and Confrontational Pedagogy

    Science.gov (United States)

    Jung, Yuha

    2015-01-01

    The Post Stereotypes project embodies confrontational pedagogy and involves postcard artmaking designed to both solicit expression of and deconstruct students' racial, ethnic, and cultural stereotypes and assumptions. As part of the Cultural Diversity in American Art course, students created postcard art that visually represented their personal…

  8. Benchmarking biological nutrient removal in wastewater treatment plants: influence of mathematical model assumptions

    DEFF Research Database (Denmark)

    Flores-Alsina, Xavier; Gernaey, Krist V.; Jeppsson, Ulf

    2012-01-01

    This paper examines the effect of different model assumptions when describing biological nutrient removal (BNR) by the activated sludge models (ASM) 1, 2d & 3. The performance of a nitrogen removal (WWTP1) and a combined nitrogen and phosphorus removal (WWTP2) benchmark wastewater treatment plant...

  9. 76 FR 81966 - Agency Information Collection Activities; Proposed Collection; Comments Requested; Assumption of...

    Science.gov (United States)

    2011-12-29

    ... Indian country is subject to State criminal jurisdiction under Public Law 280 (18 U.S.C. 1162(a)) to... Collection; Comments Requested; Assumption of Concurrent Federal Criminal Jurisdiction in Certain Areas of Indian Country ACTION: 60-Day notice of information collection under review. The Department of Justice...

  10. Field Evaluation of the System Identification Approach for Tension Estimation of External Tendons

    Directory of Open Access Journals (Sweden)

    Myung-Hyun Noh

    2015-01-01

    Various types of external tendons are considered to verify the applicability of a tension estimation method based on a finite element model with a system identification technique. The proposed method is applied to estimate the tension of a benchmark numerical example, a model structure, and a field structure. The numerical and experimental results show that existing methods, such as taut string theory and the linear regression method, show large errors in the estimated tension when the condition of the external tendon differs from the basic assumption used in deriving the relationship between tension and natural frequency. The proposed method, however, gives reasonable results for all of the external tendons considered in this study. Furthermore, the proposed method can indirectly evaluate the accuracy of the estimated tension by comparing the measured and calculated natural frequencies. Therefore, the proposed method can be effectively used for field application to various types of external tendons.
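The taut string theory mentioned above is the simple relationship whose limits the paper demonstrates: for an ideal string, the n-th natural frequency satisfies f_n = (n / 2L) * sqrt(T / m), which inverts to T = 4 m L^2 (f_n / n)^2. A sketch with illustrative (not paper-derived) tendon values:

```python
# Taut string theory relates a tendon's natural frequencies to its tension:
#   f_n = (n / (2 * L)) * sqrt(T / m)   =>   T = 4 * m * L**2 * (f_n / n)**2
# where L is the free length (m), m the mass per unit length (kg/m),
# and f_n the n-th measured natural frequency (Hz).
def tension_from_frequency(f_n, n, length, mass_per_length):
    """Estimated tension (N) from the n-th measured natural frequency."""
    return 4.0 * mass_per_length * length**2 * (f_n / n) ** 2

L = 12.0   # free length of the tendon, m (illustrative)
m = 50.0   # mass per unit length, kg/m (illustrative)
f1 = 8.0   # measured fundamental frequency, Hz (illustrative)

T = tension_from_frequency(f1, 1, L, m)
print(f"estimated tension: {T / 1000:.0f} kN")  # prints "estimated tension: 1843 kN"
```

This formula ignores bending stiffness, sag, and non-ideal boundary conditions, which is exactly why it errs for real external tendons and why the paper's model-updating approach is needed.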

  11. CHILDREN'S EDUCATION IN THE REGULAR NATIONAL BASIS: ASSUMPTIONS AND INTERFACES WITH PHYSICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    André da Silva Mello

    2016-09-01

    Full Text Available This paper discusses the organization of Children's Education within the Regular National Curricular Basis (BNCC), focusing on the continuities and advances relative to the preceding documents, and analyzing the presence of Physical Education in Children's Education based on the assumptions that guide the Base, in interface with research on pedagogical experiences in this field of knowledge. To do so, it carries out a documental-bibliographic analysis, using as sources the BNCC, the National Curricular Referential for Children's Education, the National Curricular Guidelines for Children's Education, and academic-scientific productions in the Physical Education area that address Children's Education. In the analysis, the work establishes categories that allow an interlocution among the different sources used in the study. The data analyzed indicate that the assumptions present in the BNCC dialogue, though not explicitly, with the movements of the curricular component and with the academic-scientific production of Physical Education regarding Children's Education.

  12. Basic Finance

    Science.gov (United States)

    Vittek, J. F.

    1972-01-01

    A discussion of the basic measures of corporate financial strength, and the sources of the information is reported. Considered are: balance sheet, income statement, funds and cash flow, and financial ratios.

  13. Solar Energy Basics | NREL

    Science.gov (United States)

    Solar Energy Basics. Solar is the Latin word for sun, a powerful source of energy that can be used to heat, cool, and light our homes and businesses. Solar technologies convert sunlight into usable energy for buildings. The most commonly used solar technologies for

  14. Learning Visual Basic .NET

    CERN Document Server

    Liberty, Jesse

    2009-01-01

    Learning Visual Basic .NET is a complete introduction to VB.NET and object-oriented programming. By using hundreds of examples, this book demonstrates how to develop various kinds of applications--including those that work with databases--and web services. Learning Visual Basic .NET will help you build a solid foundation in .NET.

  15. Optimization of horizontal microcode within and beyond basic blocks: an application of processor scheduling with resources

    Energy Technology Data Exchange (ETDEWEB)

    Fisher, J.A.

    1979-10-01

    Microprogram optimization is the rearrangement of microcode written vertically, with one operation issued per step, into legal horizontal microinstructions, in which several operations are issued each instruction cycle. The rearrangement is done in a way that approximately minimizes the running time of the code. This problem is identified with the problem of processor scheduling with resource constraints. Thus, the problem of optimizing basic blocks of microcode can be seen to be NP-complete; however, approximate methods for basic blocks which have good records in other, similar scheduling environments can be used. In priority list scheduling the tasks are ordered according to some evaluation function, and schedules are then found by repeated scans of the list. Several evaluation functions are shown to perform very well on large samples of various classes of random data-precedence graphs with characteristics similar to those derived from microprograms. A method of spotting resource bottlenecks in the derived data-precedence graph makes it possible to obtain a resource-considerate evaluation function, in which tasks that contribute directly to or precede bottlenecks have their priorities raised. The complexity of the calculations necessary to compute the lower bound was greatly reduced. A method is suggested for optimizing beyond basic blocks: groups of basic blocks are treated as if they were one block, and the information necessary to control the motion of tasks between blocks is encoded as data-precedence constraints on the conditional tasks. Long paths of code with no back branches can thus be optimized by the same methods used for basic blocks. 9 figures, 6 tables.
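    The priority list scheduling idea the abstract describes can be sketched as follows. This is a minimal illustration, not Fisher's algorithm: tasks are unit-length, each needs one named functional unit, and a cycle greedily issues ready tasks in priority order while respecting resource conflicts.

    ```python
    # Minimal priority-list scheduling sketch (illustrative; the task set,
    # resources, and priority values below are hypothetical).

    def list_schedule(tasks, preds, resource, priority):
        """tasks: iterable of ids; preds: {task: set of predecessor ids};
        resource: {task: functional-unit name}; priority: {task: number, higher = sooner}."""
        done, schedule = set(), []
        while len(done) < len(tasks):
            # A task is ready once all of its predecessors have been issued.
            ready = [t for t in tasks if t not in done and preds[t] <= done]
            ready.sort(key=lambda t: -priority[t])
            cycle, used = [], set()
            for t in ready:
                if resource[t] not in used:   # one task per functional unit per cycle
                    cycle.append(t)
                    used.add(resource[t])
            schedule.append(cycle)
            done |= set(cycle)
        return schedule

    preds = {"a": set(), "b": set(), "c": {"a"}, "d": {"a", "b"}}
    resource = {"a": "alu", "b": "mem", "c": "alu", "d": "alu"}
    priority = {"a": 3, "b": 2, "c": 1, "d": 2}
    schedule = list_schedule(["a", "b", "c", "d"], preds, resource, priority)
    ```

    Raising the priority of bottleneck tasks, as the abstract suggests, changes only the `priority` map, not the scheduling loop.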

  16. Basic approach to evaluate methane partial oxidation catalysts

    CSIR Research Space (South Africa)

    Parmaliana, A

    1993-09-01

    Full Text Available -phase reaction does not affect the catalytic pathways. Reasons for controversial results reported previously are discussed. They lie in the lack of an adequate experimental approach and in the generally adopted rule to evaluate the catalytic activity...

  17. Evaluation of innovative stationary phase ligand chemistries and analytical conditions for the analysis of basic drugs by supercritical fluid chromatography.

    Science.gov (United States)

    Desfontaine, Vincent; Veuthey, Jean-Luc; Guillarme, Davy

    2016-03-18

    Similar to reversed phase liquid chromatography, basic compounds can be highly challenging to analyze by supercritical fluid chromatography (SFC), as they tend to exhibit poor peak shape, especially those with high pKa values. In this study, three new stationary phase ligand chemistries available in sub-2 μm particle sizes, namely 2-picolylamine (2-PIC), 1-aminoanthracene (1-AA) and diethylamine (DEA), were tested in SFC conditions for the analysis of basic drugs. Due to the basic properties of these ligands, it is expected that the repulsive forces may improve the peak shape of basic substances, similarly to the widely used 2-ethylpyridine (2-EP) phase. However, among the 38 tested basic drugs, fewer than 10% displayed Gaussian peaks (asymmetry between 0.8 and 1.4) using pure CO2/methanol on these phases. The addition of 10 mM ammonium formate as a mobile phase additive drastically improved peak shapes and increased this proportion to 67% on 2-PIC. Introducing the additive in the injection solvent rather than in the organic modifier gave acceptable results for 2-PIC only, with 31% of Gaussian peaks and an average asymmetry of 1.89 for the 38 selected basic drugs. These columns were also compared to hybrid silica (BEH), DIOL and 2-EP stationary phases, commonly employed in SFC, which exhibit alternative retention and selectivity. In the end, the two most interesting ligands to use as complementary columns were 2-PIC and BEH, as they provided suitable peak shapes for the basic drugs and almost orthogonal selectivities. Copyright © 2016 Elsevier B.V. All rights reserved.
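    The abstract's "Gaussian peak" criterion (asymmetry between 0.8 and 1.4) is a simple numeric screen; a minimal sketch is below. The asymmetry values are made up, and the measurement height at which asymmetry is taken (commonly 10% of peak height) is an assumption, since the abstract does not state it.

    ```python
    # Classify peaks as "Gaussian" using the abstract's asymmetry window
    # and compute the proportion over a set of peaks (hypothetical data).

    def is_gaussian(asymmetry: float) -> bool:
        """A peak counts as Gaussian if its asymmetry factor falls in [0.8, 1.4]."""
        return 0.8 <= asymmetry <= 1.4

    asymmetries = [1.0, 1.2, 1.89, 2.5, 0.9]   # illustrative values only
    gaussian_share = sum(is_gaussian(a) for a in asymmetries) / len(asymmetries)
    ```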

  18. Ancestral assumptions and the clinical uncertainty of evolutionary medicine.

    Science.gov (United States)

    Cournoyea, Michael

    2013-01-01

    Evolutionary medicine is an emerging field of medical studies that uses evolutionary theory to explain the ultimate causes of health and disease. Educational tools, online courses, and medical school modules are being developed to help clinicians and students reconceptualize health and illness in light of our evolutionary past. Yet clinical guidelines based on our ancient life histories are epistemically weak, relying on the controversial assumptions of adaptationism and advocating a strictly biophysical account of health. To fulfill the interventionist goals of clinical practice, it seems that proximate explanations are all we need to develop successful diagnostic and therapeutic guidelines. Considering these epistemic concerns, this article argues that the clinical relevance of evolutionary medicine remains uncertain at best.

  19. Polarized BRDF for coatings based on three-component assumption

    Science.gov (United States)

    Liu, Hong; Zhu, Jingping; Wang, Kai; Xu, Rong

    2017-02-01

    A pBRDF (polarized bidirectional reflectance distribution function) model for coatings is given, based on a three-component reflection assumption, in order to improve the polarized scattering simulation capability for space objects. In this model, the specular reflection is given by microfacet theory, while the multiple reflection and volume scattering are given separately according to experimental results. The polarization of the specular reflection is derived from Fresnel's law, and both the multiple reflection and the volume scattering are assumed to be depolarized. Simulation and measurement results for two satellite coating samples, SR107 and S781, are given to validate that the pBRDF modeling accuracy can be significantly improved by the three-component model presented in this paper.

  20. Dynamic Group Diffie-Hellman Key Exchange under standard assumptions

    International Nuclear Information System (INIS)

    Bresson, Emmanuel; Chevassut, Olivier; Pointcheval, David

    2002-01-01

    Authenticated Diffie-Hellman key exchange allows two principals communicating over a public network, and each holding public-private keys, to agree on a shared secret value. In this paper we study the natural extension of this cryptographic problem to a group of principals. We begin from existing formal security models and refine them to incorporate major missing details (e.g., strong-corruption and concurrent sessions). Within this model we define the execution of a protocol for authenticated dynamic group Diffie-Hellman and show that it is provably secure under the decisional Diffie-Hellman assumption. Our security result holds in the standard model and thus provides better security guarantees than previously published results in the random oracle model
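    The group key the abstract studies generalizes two-party Diffie-Hellman to a product of all members' exponents; a toy sketch of that shared-value computation is below. This is illustrative only, not the paper's authenticated protocol: real group DH uses large safe primes or elliptic curves plus the authentication and corruption handling the security model addresses.

    ```python
    # Toy group Diffie-Hellman shared value over a small prime field.
    # Parameters are hypothetical; 2**31 - 1 is prime but far too small for real use.
    p, g = 2_147_483_647, 5

    def group_dh_key(secrets):
        """Chain the members' secret exponents: the result is g**(x1*x2*...*xn) mod p,
        which every member can reach from the partial values regardless of order."""
        value = g
        for x in secrets:
            value = pow(value, x, p)   # (g**a)**b mod p == g**(a*b) mod p
        return value

    k1 = group_dh_key([11, 23, 37])
    k2 = group_dh_key([37, 11, 23])   # any ordering yields the same key
    ```

    Dynamic membership, in this picture, amounts to recomputing the chain with a member's exponent added or removed; the paper's contribution is proving such a protocol secure under the decisional Diffie-Hellman assumption in the standard model.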

  1. Technoeconomic assumptions adopted for the development of a long-term electricity supply model for Cyprus.

    Science.gov (United States)

    Taliotis, Constantinos; Taibi, Emanuele; Howells, Mark; Rogner, Holger; Bazilian, Morgan; Welsch, Manuel

    2017-10-01

    The generation mix of Cyprus has been dominated by oil products for decades. In order to conform with European Union and international legislation, a transformation of the supply system is called for. Energy system models can facilitate energy planning into the future, but a large volume of data is required to populate such models. The present data article provides information on key modelling assumptions and input data adopted with the aim of representing the electricity supply system of Cyprus in a separate research article. Data in regards to renewable energy technoeconomic characteristics and investment cost projections, fossil fuel price projections, storage technology characteristics and system operation assumptions are described in this article.

  2. Technoeconomic assumptions adopted for the development of a long-term electricity supply model for Cyprus

    Directory of Open Access Journals (Sweden)

    Constantinos Taliotis

    2017-10-01

    Full Text Available The generation mix of Cyprus has been dominated by oil products for decades. In order to conform with European Union and international legislation, a transformation of the supply system is called for. Energy system models can facilitate energy planning into the future, but a large volume of data is required to populate such models. The present data article provides information on key modelling assumptions and input data adopted with the aim of representing the electricity supply system of Cyprus in a separate research article. Data in regards to renewable energy technoeconomic characteristics and investment cost projections, fossil fuel price projections, storage technology characteristics and system operation assumptions are described in this article.

  3. Investigating Teachers’ and Students’ Beliefs and Assumptions about CALL Programme at Caledonian College of Engineering

    Directory of Open Access Journals (Sweden)

    Holi Ibrahim Holi Ali

    2012-01-01

    Full Text Available This study is set to investigate students' and teachers' perceptions and assumptions about the newly implemented CALL programme at the School of Foundation Studies, Caledonian College of Engineering, Oman. Two versions of a questionnaire were administered to 24 teachers and 90 students to collect their beliefs and assumptions about the CALL programme. The results show that the great majority of the students report that CALL is very interesting, motivating and useful to them and that they learn a lot from it. However, the number of CALL hours should be increased, the lab should be equipped and arranged in a user-friendly way, assessment should be integrated into CALL, and smart boards and blackboards should be incorporated into the programme.

  4. On new evolution in development of basic technology of atomic energy

    International Nuclear Information System (INIS)

    1993-01-01

    In 1988, the expert committee on the promotion of basic technology, organized under the Atomic Energy Commission, presented a report that concretely identified the research and development subjects to be promoted in four fields: material technology, artificial intelligence technology, laser technology, and the technology for evaluating and reducing radiation risks in atomic energy. It also set out measures for efficiently promoting this technical development. The research and development achieved steady results following the report: the creation of radiation-resistant materials, the development of knowledge-base systems and robot technology, the development of the laser technology required for atomic energy, and the technology for evaluating and reducing radiation risks have been carried out. As measures for efficiently promoting the technical development, active interchange among researchers has been promoted, creative personnel deliberately fostered, international interchange actively developed, new methods of research evaluation introduced, and the dissemination of research results encouraged. The state of execution and the new measures for developing the basic technology are reported. (K.I.)

  5. Assumptions for the Annual Energy Outlook 1993

    International Nuclear Information System (INIS)

    1993-01-01

    This report is an auxiliary document to the Annual Energy Outlook 1993 (AEO) (DOE/EIA-0383(93)). It presents a detailed discussion of the assumptions underlying the forecasts in the AEO. The energy modeling system is an economic equilibrium system, with component demand modules representing end-use energy consumption by major end-use sector. Another set of modules represents petroleum, natural gas, coal, and electricity supply patterns and pricing. A separate module generates annual forecasts of important macroeconomic and industrial output variables. Interactions among these components of energy markets generate projections of prices and quantities for which energy supply equals energy demand. This equilibrium modeling system is referred to as the Intermediate Future Forecasting System (IFFS). The supply models in IFFS for oil, coal, natural gas, and electricity determine supply and price for each fuel depending upon consumption levels, while the demand models determine consumption depending upon end-use price. IFFS solves for market equilibrium for each fuel by balancing supply and demand to produce an energy balance in each forecast year
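    The market equilibrium IFFS solves for, supply equal to demand fuel by fuel, can be sketched with a toy price search. The linear curves and numbers below are hypothetical, standing in for the far richer supply and demand modules the report describes.

    ```python
    # Toy single-fuel equilibrium: find the price where supply meets demand.

    def demand(price):          # end-use consumption falls as price rises
        return 100.0 - 2.0 * price

    def supply(price):          # production rises with price
        return 10.0 + 1.0 * price

    def equilibrium(lo=0.0, hi=50.0, tol=1e-9):
        """Bisection on excess demand; assumes demand - supply is decreasing in price."""
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if demand(mid) > supply(mid):
                lo = mid        # excess demand: raise the price
            else:
                hi = mid
        return (lo + hi) / 2

    price = equilibrium()       # analytically: 100 - 2p = 10 + p, so p = 30
    ```

    IFFS iterates this kind of balancing across interacting fuel markets and forecast years rather than a single closed-form curve pair.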

  6. 26 CFR 1.752-6 - Partnership assumption of partner's section 358(h)(3) liability after October 18, 1999, and...

    Science.gov (United States)

    2010-04-01

    ... general. If, in a transaction described in section 721(a), a partnership assumes a liability (defined in...) does not apply to an assumption of a liability (defined in section 358(h)(3)) by a partnership as part... 26 Internal Revenue 8 2010-04-01 2010-04-01 false Partnership assumption of partner's section 358...

  7. Providing security assurance in line with national DBT assumptions

    Science.gov (United States)

    Bajramovic, Edita; Gupta, Deeksha

    2017-01-01

    As worldwide energy requirements are increasing simultaneously with climate change and energy security considerations, States are thinking about building nuclear power to fulfill their electricity requirements and decrease their dependence on carbon fuels. New nuclear power plants (NPPs) must have comprehensive cybersecurity measures integrated into their design, structure, and processes. In the absence of effective cybersecurity measures, the impact of nuclear security incidents can be severe. Some of the current nuclear facilities were not specifically designed and constructed to deal with the new threats, including targeted cyberattacks. Thus, newcomer countries must consider the Design Basis Threat (DBT) as one of the security fundamentals during design of physical and cyber protection systems of nuclear facilities. IAEA NSS 10 describes the DBT as "comprehensive description of the motivation, intentions and capabilities of potential adversaries against which protection systems are designed and evaluated". Nowadays, many threat actors, including hacktivists, insider threat, cyber criminals, state and non-state groups (terrorists) pose security risks to nuclear facilities. Threat assumptions are made on a national level. Consequently, threat assessment closely affects the design structures of nuclear facilities. Some of the recent security incidents e.g. Stuxnet worm (Advanced Persistent Threat) and theft of sensitive information in South Korea Nuclear Power Plant (Insider Threat) have shown that these attacks should be considered as the top threat to nuclear facilities. Therefore, the cybersecurity context is essential for secure and safe use of nuclear power. In addition, States should include multiple DBT scenarios in order to protect various target materials, types of facilities, and adversary objectives. Development of a comprehensive DBT is a precondition for the establishment and further improvement of domestic state nuclear-related regulations in the

  8. Finding Basic Writing's Place.

    Science.gov (United States)

    Sheridan-Rabideau, Mary P.; Brossell, Gordon

    1995-01-01

    Posits that basic writing serves a vital function by providing writing support for at-risk students and serves the needs of a growing student population that universities accept yet feel needs additional writing instruction. Concludes that the basic writing classroom is the most effective educational support for at-risk students and their writing.…

  9. An Examination of Income Effect on Consumers' Ethical Evaluation of Counterfeit Drugs Buying Behaviour: A Cross-Sectional Study in Qatar and Sudan.

    Science.gov (United States)

    Alfadl, Abubakr Abdelraouf; Ibrahim, Mohamed Izham Mohamed; Maraghi, Fatima Abdulla; Mohammad, Khadijah Shhab

    2016-09-01

    There are limited studies on consumer behaviour toward counterfeit products and the determining factors that motivate willingness to purchase counterfeit items. This study aimed to fill this literature gap by studying differences in individuals' ethical evaluations of counterfeit drug purchases and whether that ethical evaluation is affected by differences in income. It is hypothesized that individuals with lower/higher income make a more/less permissive evaluation of ethical responsibility regarding counterfeit drug purchase. To test this assumption empirically, a comparison was made between people living in the low-income country Sudan and people living in the high-income country Qatar. The study employed a face-to-face structured interview survey to collect data from 1,170 subjects, and the Sudanese and Qatari samples were compared using an independent t-test at an alpha level of 0.05 in SPSS version 22.0. Sudanese and Qatari individuals differed significantly on all items. Sudanese individuals scored below 3 on all Awareness of Societal Consequences (ASC) items, indicating that they make a more permissive evaluation of ethical responsibility regarding counterfeit drug purchase. Both groups shared a basic positive moral agreement regarding subjective norm, indicating that the influence of income is not evident there. The findings indicate that low-income individuals make a more permissive evaluation of ethical responsibility regarding counterfeit drug purchases when awareness of societal consequences is used as a deterrent tool, while both low- and high-income individuals share a basic positive moral agreement when the subjective-norm dimension is exploited to discourage unethical buying behaviour.
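    The independent t-test used for the group comparison can be sketched by hand; the two groups' scores below are made up for illustration, not the study's data.

    ```python
    # Independent-samples (Student's) t statistic with pooled variance.
    from statistics import mean, variance

    def t_statistic(a, b):
        """Two-sample t with pooled variance; compare |t| to the critical value
        for na + nb - 2 degrees of freedom at the chosen alpha (0.05 in the study)."""
        na, nb = len(a), len(b)
        sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
        return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

    group_a = [2.1, 2.4, 1.9, 2.6]   # hypothetical item scores, group 1
    group_b = [3.0, 3.4, 2.8, 3.1]   # hypothetical item scores, group 2
    t = t_statistic(group_a, group_b)
    ```

    In practice a statistics package (SPSS here) also reports the p-value and checks the equal-variance assumption before pooling.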

  10. Basic visual observation skills training course: Appendix B. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Toquam, J.L.; Morris, F.A.; Griggs, J.R.

    1995-06-01

    The purpose of the basic visual observation skills course is to help safeguards inspectors evaluate and improve their skills in making observations during inspections and in evaluating and interpreting this information. The first 12 hours of the course provide training in five skill areas: perception and recognition; attention to detail; memory; mental imaging, mapping, and modeling skills; and judgment and decision making. Following this training is an integrating exercise involving a simulated safeguards inspection. This report contains the in-class exercises in the five skill areas; pre- and post-course exercises in closure, hidden figures, map memory, and mental rotations; the final examination; a training evaluation form; and the integrating exercise.

  11. Basic visual observation skills training course: Appendix B. Final report

    International Nuclear Information System (INIS)

    Toquam, J.L.; Morris, F.A.; Griggs, J.R.

    1995-06-01

    The purpose of the basic visual observation skills course is to help safeguards inspectors evaluate and improve their skills in making observations during inspections and in evaluating and interpreting this information. The first 12 hours of the course provide training in five skill areas: perception and recognition; attention to detail; memory; mental imaging, mapping, and modeling skills; and judgment and decision making. Following this training is an integrating exercise involving a simulated safeguards inspection. This report contains the in-class exercises in the five skill areas; pre- and post-course exercises in closure, hidden figures, map memory, and mental rotations; the final examination; a training evaluation form; and the integrating exercise

  12. Measuring student teachers' basic psychological needs

    NARCIS (Netherlands)

    dr Bob Koster; Dr. Jos Castelijns; Dr. Marjan Vermeulen; dr.ir. Quinta Kools

    2012-01-01

    In the Self-Determination Theory (SDT) basic psychological needs for relatedness, autonomy and competence are distinguished. Basic psychological need fulfilment is considered to be critical for human development and intrinsic motivation. In the Netherlands, the concept of basic psychological need

  13. Measuring student teachers’ basic psychological needs

    NARCIS (Netherlands)

    Vermeulen, Marjan; Castelijns, Jos; Koster, Bob; Kools, Quinta

    2018-01-01

    In the Self–Determination Theory (SDT) basic psychological needs for relatedness, autonomy and competence are distinguished. Basic psychological need fulfilment is considered to be critical for human development and intrinsic motivation. In the Netherlands, the concept of basic psychological need

  14. Assessing the skill of hydrology models at simulating the water cycle in the HJ Andrews LTER: Assumptions, strengths and weaknesses

    Science.gov (United States)

    Simulated impacts of climate on hydrology can vary greatly as a function of the scale of the input data, model assumptions, and model structure. Four models are commonly used to simulate streamflow in...

  15. Basic set theory

    CERN Document Server

    Levy, Azriel

    2002-01-01

    An advanced-level treatment of the basics of set theory, this text offers students a firm foundation, stopping just short of the areas employing model-theoretic methods. Geared toward upper-level undergraduate and graduate students, it consists of two parts: the first covers pure set theory, including the basic notions, order and well-foundedness, cardinal numbers, the ordinals, and the axiom of choice and some of its consequences; the second deals with applications and advanced topics such as point set topology, real spaces, Boolean algebras, and infinite combinatorics and large cardinals. An

  16. Holistic Approach as Viewed by the Basic School Teachers in Latvia

    Science.gov (United States)

    Badjanova, Jelena; Iliško, Dzintra

    2015-01-01

    The article points to new competencies required from basic school teachers, reinforced by the reform processes in the educational system in Latvia, the quality assurance of educational process, and modernisation and critical re-evaluation of educational materials and standards. The authors view sustainability as an integral part of reform…

  17. A Novel Clinical-Simulated Suture Education for Basic Surgical Skill: Suture on the Biological Tissue Fixed on Standardized Patient Evaluated with Objective Structured Assessment of Technical Skill (OSATS) Tools.

    Science.gov (United States)

    Shen, Zhanlong; Yang, Fan; Gao, Pengji; Zeng, Li; Jiang, Guanchao; Wang, Shan; Ye, Yingjiang; Zhu, Fengxue

    2017-06-21

    Clinical-simulated training has shown benefit in the education of medical students. However, the role of clinical simulation in training basic surgical skills such as suturing remains unclear. Forty-two medical students were asked to perform specific suturing tasks within four minutes at three stations with different settings (Station 1: synthetic suture pad fixed on the bench; Station 2: synthetic suture pad fixed on a standardized patient; Station 3: pig skin fixed on a standardized patient); the OSATS (Objective Structured Assessment of Technical Skill) tool was used to evaluate the students' performance. A questionnaire was distributed to the students following the examination. The mean performance score at Station 3 was significantly lower than at Stations 1 and 2 for general performance, including tissue handling, time, and motion. The suturing techniques of students at Stations 2 and 3 were not as accurate as at Station 1, and inappropriate tension was applied to the knot at Station 2 compared with Stations 1 and 3. On the questionnaire, 93% of students considered clinical-simulated training of basic surgical skills necessary and felt it may increase their confidence in future clinical work as surgeons; 98% of students thought the assessment was more objective when the OSATS tool was used for evaluation. Clinical simulation examination assessed with the OSATS tool might throw novel light on the teaching of basic surgical skills and may be worthy of wider adoption in the surgical education of medical students.

  18. Revisiting the Operating Room Basics

    Directory of Open Access Journals (Sweden)

    Tushar Chakravorty

    2015-12-01

    Full Text Available Young doctors walking into the operating room are eager to develop their skills and become efficient, knowledgeable professionals. But precious little is done to actively develop the basic practical skills of these budding doctors. They remain unaware of the layout of the operating room and of OR etiquette, and often lack a sound scientific understanding of the basic operating room protocols and of the importance of their meticulous execution. This article stresses the need to teach the basics of OR protocol and to improve the confidence of the young doctor by strengthening this foundation, showing that attention to the basics of medical care and empathy for the patient can really make a difference to the outcome of treatment.

  19. MAPPING A BASIC HEALTH UNIT: AN EXPERIENCE REPORT

    Directory of Open Access Journals (Sweden)

    Bárbara Carvalho Malheiros

    2015-01-01

    Full Text Available Background and Objectives: This study is an experience report on the construction of a map of a Basic Health Unit (BHU). The objective was to understand the relevance of mapping a BHU, to acquire more knowledge of the health-disease status of the registered population, and to identify the importance of cartography as a working tool. Case description: After reading selected texts, evaluating information systems and making on-site visits, it was possible to identify the health status of the population of the neighborhoods. The proposed objectives were considered achieved: the mapping brought the assessed population's health-disease situation into a closer-to-reality view, identifying the number of individuals, the diseases, the living situation, and the health care received. Conclusion: The mapping approach is a powerful working tool, allowing the planning of strategic interventions and enabling the development of assistance activities aimed at health promotion and disease prevention. KEYWORDS: Mapping; Basic Health Unit; Health Planning.

  20. A method of statistical analysis in the field of sports science when assumptions of parametric tests are violated

    Directory of Open Access Journals (Sweden)

    Elżbieta Sandurska

    2016-12-01

    Full Text Available Introduction: Statistical software typically does not require extensive statistical knowledge to use, allowing even complex analyses to be performed easily. Consequently, test selection criteria and important assumptions may be overlooked or given insufficient consideration, in which case the results may well lead to wrong conclusions. Aim: To discuss issues related to assumption violations in the case of Student's t-test and one-way ANOVA, two parametric tests frequently used in the field of sports science, and to recommend solutions. Description of the state of knowledge: Student's t-test and ANOVA are parametric tests, so the assumptions that need to be satisfied include normal distribution of the data and homogeneity of variances across groups. If the assumptions are violated, the original design of the test is impaired, and the test may then be compromised, giving spurious results. A simple way to normalize the data and stabilize the variance is to apply a transformation. If that approach fails, a good alternative is a nonparametric test, such as the Mann-Whitney, Kruskal-Wallis, or Wilcoxon signed-rank test. Summary: Thorough verification of the assumptions of parametric tests allows for correct selection of statistical tools, which is the basis of a well-grounded statistical analysis. With a few simple rules, testing patterns in data characteristic of the study of sports science comes down to a straightforward procedure.
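    The variance-stabilizing transformation the abstract recommends can be shown in a few lines: a log transform often brings the group variances of right-skewed data close enough together for the homogeneity assumption to hold. The data below are illustrative, not from the article.

    ```python
    # Demonstrate variance stabilization via a log transform (hypothetical data).
    from math import log
    from statistics import variance

    raw_a = [10, 12, 11, 13, 15]
    raw_b = [100, 140, 90, 180, 120]     # ~10x scale, far larger spread

    ratio_raw = variance(raw_b) / variance(raw_a)

    log_a = [log(x) for x in raw_a]
    log_b = [log(x) for x in raw_b]
    ratio_log = variance(log_b) / variance(log_a)
    # After the transform the variance ratio shrinks toward 1, making the
    # homogeneity-of-variances assumption far more plausible.
    ```

    If no transformation achieves this, the nonparametric alternatives named in the abstract (Mann-Whitney, Kruskal-Wallis, Wilcoxon signed-rank) sidestep the assumption instead.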