WorldWideScience

Sample records for evaluating basic assumptions

  1. Helping Students to Recognize and Evaluate an Assumption in Quantitative Reasoning: A Basic Critical-Thinking Activity with Marbles and Electronic Balance

    Science.gov (United States)

    Slisko, Josip; Cruz, Adrian Corona

    2013-01-01

    There is general agreement that critical thinking is an important element of 21st century skills. Although critical thinking is a very complex and controversial conception, many would accept that the recognition and evaluation of assumptions is a basic critical-thinking process. When students use a simple mathematical model to reason quantitatively…

  2. Basic concepts and assumptions behind the new ICRP recommendations

    International Nuclear Information System (INIS)

    Lindell, B.

    1979-01-01

    A review is given of some of the basic concepts and assumptions behind the current recommendations by the International Commission on Radiological Protection in ICRP Publications 26 and 28, which form the basis for the revision of the Basic Safety Standards jointly undertaken by IAEA, ILO, NEA and WHO. Special attention is given to the assumption of a linear, non-threshold dose-response relationship for stochastic radiation effects such as cancer and hereditary harm. The three basic principles of protection are discussed: justification of practice, optimization of protection and individual risk limitation. In the new ICRP recommendations particular emphasis is given to the principle of keeping all radiation doses as low as is reasonably achievable. A consequence of this is that the ICRP dose limits are now given as boundary conditions for the justification and optimization procedures rather than as values that should be used for purposes of planning and design. The fractional increase in total risk at various ages after continuous exposure near the dose limits is given as an illustration. The need for taking other sources, present and future, into account when applying the dose limits leads to the use of the commitment concept. This is briefly discussed as well as the new quantity, the effective dose equivalent, introduced by ICRP. (author)

  3. Primary prevention in public health: an analysis of basic assumptions.

    Science.gov (United States)

    Ratcliffe, J; Wallack, L

    1985-01-01

    The common definition of primary prevention is straightforward; but how it is transformed into a framework to guide action is based on personal and societal feelings and beliefs about the basis for social organization. This article focuses on the two contending primary prevention strategies of health promotion and health protection. The contention between the two strategies stems from a basic disagreement about disease causality in modern society. Health promotion is based on the "lifestyle" theory of disease causality, which sees individual health status linked ultimately to personal decisions about diet, stress, and drug habits. Primary prevention, from this perspective, entails persuading individuals to forgo their risk-taking, self-destructive behavior. Health protection, on the other hand, is based on the "social-structural" theory of disease causality. This theory sees the health status of populations linked ultimately to the unequal distribution of social resources, industrial pollution, occupational stress, and "anti-health promotion" marketing practices. Primary prevention, from this perspective, requires changing existing social and, particularly, economic policies and structures. In order to provide a basis for choosing between these contending strategies, the demonstrated (i.e., past) impact of each strategy on the health of the public is examined. Two conclusions are drawn. First, the health promotion strategy shows little potential for improving the public health, because it systematically ignores the risk-imposing, other-destructive behavior of influential actors (policy-makers and institutions) in society. And second, effective primary prevention efforts entail an "upstream" approach that results in far-reaching sociopolitical and economic change.

  4. Basic Assumptions of the New Price System and Supplements to the Tariff System for Electricity Sale

    International Nuclear Information System (INIS)

    Klepo, M.

    1995-01-01

    The article outlines some basic assumptions of the new price system and the major elements of the latest proposal for changes and supplements to the Tariff System for Electricity Sale in the Republic of Croatia, including an analysis of those elements which brought about the present unfavourable and non-productive relations within the electric power system. The paper proposes measures and actions which, by means of the price system and tariff policy, should improve the present unfavourable relations and their consequences and achieve a desirable consumption structure and characteristics, resulting in rational management and effective power supply-economy relationships within the electric power system as a subsystem of the power supply sector. (author). 2 refs., 3 figs., 4 tabs

  5.  Basic assumptions and definitions in the analysis of financial leverage

    Directory of Open Access Journals (Sweden)

    Tomasz Berent

    2015-12-01

    The financial leverage literature has been in a state of terminological chaos for decades, as evidenced, for example, by the Nobel Prize Lecture mistake on the one hand and the global financial crisis on the other. A meaningful analysis of the leverage phenomenon calls for the formulation of a coherent set of assumptions and basic definitions. The objective of the paper is to answer this call. The paper defines leverage as a value-neutral concept useful in explaining the magnification effect exerted by financial activity upon the whole spectrum of financial results. By adopting constructivism as a methodological approach, we are able to introduce various types of leverage: capital and income, base and non-base, accounting and market value, for levels and for distances (absolute and relative), costs and simple, etc. The new definitions formulated here are subsequently adopted in the analysis of the content of leverage statements used by a leading finance textbook.
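The magnification effect the abstract refers to can be illustrated with the classic textbook degree-of-financial-leverage measure. This is a standard formula, not one taken from the paper under discussion, and the figures are hypothetical:

```python
def degree_of_financial_leverage(ebit, interest):
    """Classic textbook DFL: the factor by which a percentage change in
    EBIT is magnified into a percentage change in pre-tax earnings,
    because interest is a fixed charge."""
    return ebit / (ebit - interest)

# With EBIT of 100 and fixed interest of 50, DFL = 2: a 10% rise in EBIT
# becomes a 20% rise in pre-tax earnings.
dfl = degree_of_financial_leverage(100.0, 50.0)
base_earnings = 100.0 - 50.0
up_earnings = 110.0 - 50.0
pct_change = (up_earnings - base_earnings) / base_earnings
```

The same mechanism works symmetrically on the downside, which is why the paper insists leverage is value-neutral: it magnifies losses as readily as gains.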

  6. Testing the basic assumption of the hydrogeomorphic approach to assessing wetland functions.

    Science.gov (United States)

    Hruby, T

    2001-05-01

    The hydrogeomorphic (HGM) approach for developing "rapid" wetland function assessment methods stipulates that the variables used are to be scaled based on data collected at sites judged to be the best at performing the wetland functions (reference standard sites). A critical step in the process is to choose the least altered wetlands in a hydrogeomorphic subclass to use as a reference standard against which other wetlands are compared. The basic assumption made in this approach is that wetlands judged to have had the least human impact have the highest level of sustainable performance for all functions. The levels at which functions are performed in these least altered wetlands are assumed to be "characteristic" for the subclass and "sustainable." Results from data collected in wetlands in the lowlands of western Washington suggest that the assumption may not be appropriate for this region. Teams developing methods for assessing wetland functions did not find that the least altered wetlands in a subclass had a range of performance levels that could be identified as "characteristic" or "sustainable." Forty-four wetlands in four hydrogeomorphic subclasses (two depressional subclasses and two riverine subclasses) were rated by teams of experts on the severity of their human alterations and on the level of performance of 15 wetland functions. An ordinal scale of 1-5 was used to quantify alterations in water regime, soils, vegetation, buffers, and contributing basin. Performance of functions was judged on an ordinal scale of 1-7. Relatively unaltered wetlands were judged to perform individual functions at levels that spanned all of the seven possible ratings in all four subclasses. The basic assumption of the HGM approach, that the least altered wetlands represent "characteristic" and "sustainable" levels of functioning that are different from those found in altered wetlands, was not confirmed. 
Although the intent of the HGM approach is to use level of functioning as a
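The study's core check — whether ratings of least-altered sites cluster at a "characteristic" level or span the ordinal scale — can be sketched as follows. The spread rule and the ratings are illustrative assumptions, not the study's actual data or criterion:

```python
def ratings_span_scale(ratings, scale=(1, 7), spread_threshold=0.8):
    """Return True when function ratings of least-altered sites cover more
    than spread_threshold of the possible ordinal range, i.e. when no
    'characteristic' level can be identified. (Illustrative rule, assumed.)"""
    lo, hi = scale
    spread = (max(ratings) - min(ratings)) / (hi - lo)
    return spread >= spread_threshold

# Hypothetical ratings of one function at relatively unaltered wetlands:
clustered = ratings_span_scale([5, 6, 6, 7, 6])   # narrow spread
dispersed = ratings_span_scale([1, 3, 7, 2, 6])   # spans the 1-7 scale
```

In the study's terms, the dispersed case is what the teams actually observed: relatively unaltered wetlands performed individual functions at levels spanning all seven possible ratings.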

  7. Uranium: a basic evaluation

    International Nuclear Information System (INIS)

    Crull, A.W.

    1978-01-01

    All energy sources and technologies, including uranium and the nuclear industry, are needed to provide power. Public misunderstanding of the nature of uranium and how it works as a fuel may jeopardize nuclear energy as a major option. Basic chemical facts about uranium ore and uranium fuel technology are presented. Some of the major policy decisions that must be made include the enrichment, stockpiling, and pricing of uranium. Investigations and lawsuits pertaining to uranium markets are reviewed, and the point is made that oil companies will probably have to divest their non-oil energy activities. Recommendations for nuclear policies that have been made by the General Accounting Office are discussed briefly

  8. Natural and laboratory OSL growth curve–Verification of the basic assumption of luminescence dating

    International Nuclear Information System (INIS)

    Kijek, N.; Chruścińska, A.

    2016-01-01

    The basic assumption of luminescence dating is the equality of the growth curve of OSL generated by natural radiation and the OSL growth curve reconstructed under laboratory conditions. The dose rates that generate the OSL in nature and in laboratory experiments differ by about ten orders of magnitude. Recently some discrepancies between the natural and laboratory growth curves have been observed. It is important to establish their reasons in order to introduce an appropriate correction into the OSL dating protocol, or to find a test that allows samples which should not be used for dating to be eliminated. For this purpose, both growth curves, natural and laboratory, were reconstructed by means of computer simulations of the processes occurring in the sample during its deposition time in the environment, as well as those which occur in a laboratory during the dating procedure. The simulations were carried out for three models, each including one shallow trap, two OSL traps, one disconnected deep trap and one luminescence center. The OSL model for quartz can be more complex than the one used in the presented simulations, but the results nevertheless show growth-curve discrepancies similar to those observed in experiments. It is clear that the consistency of the growth curves is not a general feature of the OSL processes, but rather a result of an advantageous configuration of trap parameters. The deep disconnected traps play the key role, and their complete filling before the zeroing of the OSL signal is a necessary condition for the growth curves' consistency. - Highlights: • Process of OSL growth curve generation in nature and in laboratory was simulated. • Discrepancies between the natural and the laboratory growth curves are observed. • Deep disconnected traps play the key role in growth curve inequality. • Empty deep traps before zeroing of OSL cause the inequality of growth curves.
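The role of the deep disconnected trap can be sketched with a deliberately minimal filling model (a toy sketch, not the paper's three-trap model): charge created per unit dose is shared between the OSL trap and a deep trap in proportion to their remaining free capacity, so the OSL growth curve depends on how full the deep trap was at zeroing:

```python
def osl_growth(doses, n_deep0, N_osl=1.0, N_deep=1.0, step=0.01):
    """Toy trap-filling sketch. Charge created per unit dose is split
    between an OSL trap and a deep disconnected trap in proportion to
    their remaining free capacities. Returns OSL-trap occupancy at each
    requested dose."""
    curve, n_osl, n_deep, done = [], 0.0, n_deep0, 0
    for target in sorted(doses):
        steps = round(target / step)        # doses assumed multiples of step
        while done < steps:
            free_o, free_d = N_osl - n_osl, N_deep - n_deep
            total = free_o + free_d
            if total > 1e-12:
                n_osl += min(step * free_o / total, free_o)
                n_deep += min(step * free_d / total, free_d)
            done += 1
        curve.append(n_osl)
    return curve

doses = [0.5, 1.0, 2.0, 4.0]
natural = osl_growth(doses, n_deep0=1.0)  # deep trap already full at zeroing
lab = osl_growth(doses, n_deep0=0.0)      # deep trap empty: curve differs
```

With the deep trap full, all charge feeds the OSL trap and the growth curve rises steeply; with it empty, the deep trap competes and the curve is shallower — mirroring the paper's conclusion that complete filling of the deep traps before zeroing is necessary for the two curves to coincide.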

  9. Evaluating The Markov Assumption For Web Usage Mining

    DEFF Research Database (Denmark)

    Jespersen, S.; Pedersen, Torben Bach; Thorhauge, J.

    2003-01-01

    ) model [borges99data]. These techniques typically rely on the Markov assumption with history depth n, i.e., it is assumed that the next requested page depends only on the last n pages visited. This is not always valid, i.e., false browsing patterns may be discovered. However, to our…
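The kind of false pattern the abstract warns about is easy to reproduce. The sketch below (hypothetical sessions, history depth n = 1) builds a first-order model that assigns probability 0.5 to a click path no user ever followed:

```python
from collections import defaultdict

def first_order_model(sessions):
    """Depth-1 Markov model of page requests: P(next page | current page)."""
    counts = defaultdict(lambda: defaultdict(int))
    for s in sessions:
        for cur, nxt in zip(s, s[1:]):
            counts[cur][nxt] += 1
    probs = {}
    for page, nxts in counts.items():
        total = sum(nxts.values())
        probs[page] = {q: c / total for q, c in nxts.items()}
    return probs

# A -> B and B -> C each occur, but no session ever follows A -> B -> C.
sessions = [["A", "B", "D"], ["E", "B", "C"]]
P = first_order_model(sessions)

path_prob = P["A"]["B"] * P["B"]["C"]      # the model's probability of A->B->C
path_observed = any(s[i:i + 3] == ["A", "B", "C"]
                    for s in sessions for i in range(len(s) - 2))
```

Because the depth-1 model forgets how a visitor reached B, it happily chains transitions from different sessions into a browsing pattern that was never observed — exactly the validity problem the paper evaluates.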

  10. CRITIQUES TOWARDS COSO’S ENTERPRISE RISK MANAGEMENT (ERM) FRAMEWORK IN ITS BASIC ASSUMPTIONS

    OpenAIRE

    Kurniawanti, Ika Atma

    2010-01-01

    Most professionals in internal control, risk management and other similar bailiwicks have agreed that Enterprise Risk Management discourses invariably refer to what COSO has produced recently: the framework underlying ERM. But this paper takes a somewhat different stance, viewing several problematic issues as stemming from unclear conceptions of either the basic premise underlying ERM or the nature of some of ERM's components as outlined by COSO. This paper notes that, at least, there are three poi...

  11. E-Basics: Online Basic Training in Program Evaluation

    Science.gov (United States)

    Silliman, Ben

    2016-01-01

    E-Basics is an online training in program evaluation concepts and skills designed for youth development professionals, especially those working in nonformal science education. Ten hours of online training in seven modules is designed to prepare participants for mentoring and applied practice, mastery, and/or team leadership in program evaluation.…

  12. Evaluating Basic Technology Instruction in Nigerian Secondary ...

    African Journals Online (AJOL)

    It is an important technique which, when appropriately adopted, results in effective teaching and learning of practical subjects. This study focused on the identification of evaluation techniques aimed at improving the teaching of Basic Technology in Edo State. The area of study comprises the eighteen Local Government Areas ...

  13. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

    Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and are likely applied even more widely in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined the impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus using improved data and enhanced assumptions, on model outcomes and thus, ultimately, study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.

  14. Do unreal assumptions pervert behaviour?

    DEFF Research Database (Denmark)

    Petersen, Verner C.

    of the basic assumptions underlying the theories found in economics. Assumptions relating to the primacy of self-interest, to resourceful, evaluative, maximising models of man, to incentive systems and to agency theory. The major part of the paper then discusses how these assumptions and theories may pervert......-interested way nothing will. The purpose of this paper is to take a critical look at some of the assumptions and theories found in economics and discuss their implications for the models and the practices found in the management of business. The expectation is that the unrealistic assumptions of economics have...... become taken for granted and tacitly included into theories and models of management. Guiding business and management to behave in a fashion that apparently makes these assumptions become "true". Thus in fact making theories and models become self-fulfilling prophecies. The paper elucidates some...

  15. Evaluating methodological assumptions of a catch-curve survival estimation of unmarked precocial shorebird chicks

    Science.gov (United States)

    McGowan, Conor P.; Gardner, Beth

    2013-01-01

    Estimating productivity for precocial species can be difficult because young birds leave their nest within hours or days of hatching and detectability thereafter can be very low. Recently, a method using a modified catch-curve to estimate precocial chick daily survival from age-based count data was presented, using Piping Plover (Charadrius melodus) data from the Missouri River. However, many of the assumptions of the catch-curve approach were not fully evaluated for precocial chicks. We developed a simulation model to mimic Piping Plovers, a fairly representative shorebird, and age-based count-data collection. Using the simulated data, we calculated daily survival estimates and compared them with the known daily survival rates from the simulation model. We conducted these comparisons under different sampling scenarios where the ecological and statistical assumptions had been violated. Overall, the daily survival estimates calculated from the simulated data corresponded well with the true survival rates of the simulation. Violating the accurate-aging and independence assumptions did not result in biased daily survival estimates, whereas unequal detection for younger or older birds and violating the birth-death equilibrium did result in estimator bias. Ensuring that all ages are equally detectable and timing data collection to approximately meet the birth-death equilibrium are key to the successful use of this method for precocial shorebirds.
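The catch-curve logic the study evaluates can be sketched with simulated counts: under a birth-death equilibrium with equal detectability, expected counts decline geometrically with age, so a log-linear regression of count on age recovers daily survival. The numbers below are illustrative, not the study's:

```python
import math

def catch_curve_survival(ages, counts):
    """Catch-curve estimate of daily survival S: if E[count at age a] is
    proportional to S**a, then ln(count) regressed on age has slope ln(S)."""
    ys = [math.log(c) for c in counts]
    n = len(ages)
    abar, ybar = sum(ages) / n, sum(ys) / n
    slope = (sum((a - abar) * (y - ybar) for a, y in zip(ages, ys))
             / sum((a - abar) ** 2 for a in ages))
    return math.exp(slope)

true_S = 0.95
ages = list(range(20))
counts = [1000 * true_S ** a for a in ages]   # idealized equilibrium counts
S_hat = catch_curve_survival(ages, counts)    # recovers true_S exactly here

# Violating equal detectability (older chicks detected at only 90%) biases
# the estimator low, as the simulation study found:
biased = [c * (0.9 if a >= 10 else 1.0) for a, c in zip(ages, counts)]
S_hat_biased = catch_curve_survival(ages, biased)
```

The biased scenario mirrors the paper's finding: age-dependent detection changes the apparent decline with age and so propagates directly into the survival estimate.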

  16. Ecological risk of anthropogenic pollutants to reptiles: Evaluating assumptions of sensitivity and exposure.

    Science.gov (United States)

    Weir, Scott M; Suski, Jamie G; Salice, Christopher J

    2010-12-01

    A large data gap for reptile ecotoxicology still persists; therefore, ecological risk assessments of reptiles usually incorporate the use of surrogate species. This necessitates that (1) the surrogate is at least as sensitive as the target taxon and/or (2) exposures to the surrogate are greater than that of the target taxon. We evaluated these assumptions for the use of birds as surrogates for reptiles. Based on a survey of the literature, birds were more sensitive than reptiles in less than 1/4 of the chemicals investigated. Dietary and dermal exposure modeling indicated that exposure to reptiles was relatively high, particularly when the dermal route was considered. We conclude that caution is warranted in the use of avian receptors as surrogates for reptiles in ecological risk assessment and emphasize the need to better understand the magnitude and mechanism of contaminant exposure in reptiles to improve exposure and risk estimation. Copyright © 2010 Elsevier Ltd. All rights reserved.

  17. A Memory-Based Model of Posttraumatic Stress Disorder: Evaluating Basic Assumptions Underlying the PTSD Diagnosis

    Science.gov (United States)

    Rubin, David C.; Berntsen, Dorthe; Bohni, Malene Klindt

    2008-01-01

    In the mnemonic model of posttraumatic stress disorder (PTSD), the current memory of a negative event, not the event itself, determines symptoms. The model is an alternative to the current event-based etiology of PTSD represented in the "Diagnostic and Statistical Manual of Mental Disorders" (4th ed., text rev.; American Psychiatric Association,…

  18. The assumption of linearity in soil and plant concentration ratios: an experimental evaluation

    International Nuclear Information System (INIS)

    Sheppard, S.C.; Evenden, W.G.

    1988-01-01

    We have evaluated one of the main assumptions in the use of concentration ratios to describe the transfer of elements in the environment. The ratios examined in detail were the 'concentration ratio' (CR) of leaf to soil and the 'partition coefficient' (Kd) of solid- to liquid-phase concentrations in soil. Use of these ratios implies a linear relationship between the concentrations. Soil was experimentally contaminated to evaluate this linearity over more than a 1000-fold range in concentration. A secondary objective was to determine CR and Kd values in a long-term (2 y) outdoor study using a peat soil and blueberries. The elements I, Se, Cs, Pb and U were chosen as environmentally important elements. The results indicated that relationships of leaf and leachate concentrations were not consistently linearly related to the total soil concentrations for each of the elements. The modelling difficulties implied by these concentration dependencies can be partially offset by including the strong negative correlation between CR and Kd. The error introduced by using a mean value of the ratios for Se or U resulted in up to a ten-fold increase in variability for CR and a three-fold increase for Kd. (author)

  19. Evaluating growth assumptions using diameter or radial increments in natural even-aged longleaf pine

    Science.gov (United States)

    John C. Gilbert; Ralph S. Meldahl; Jyoti N. Rayamajhi; John S. Kush

    2010-01-01

    When using increment cores to predict future growth, one often assumes that future growth is identical to past growth for individual trees. Once this assumption is accepted, a decision has to be made as to which growth estimate should be used: constant diameter growth or constant basal area growth. Often, the assumption of constant diameter growth is used due to the ease...
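The two growth estimates diverge even for the same measured increment, as this sketch shows (the tree dimensions are hypothetical):

```python
import math

def project_constant_diameter(d0, dd, years):
    """Future diameter assuming the past annual diameter increment dd persists."""
    return d0 + dd * years

def project_constant_basal_area(d0, dd, years):
    """Future diameter assuming the past annual basal-area increment persists.
    BA = pi * d^2 / 4; the BA increment implied by one year's diameter
    growth dd is held constant."""
    ba0 = math.pi * d0 ** 2 / 4
    dba = math.pi * (d0 + dd) ** 2 / 4 - ba0
    return math.sqrt(4 * (ba0 + dba * years) / math.pi)

d0, dd = 30.0, 0.5   # cm DBH, cm/yr past increment (hypothetical)
d_diam = project_constant_diameter(d0, dd, 10)
d_ba = project_constant_basal_area(d0, dd, 10)
```

Constant basal-area growth yields a smaller projected diameter than constant diameter growth, because adding the same ring area to an ever-larger stem produces ever-thinner rings; which assumption better matches the radial increments of natural even-aged longleaf pine is exactly what the study evaluates.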

  20. Economic assumptions for evaluating reactor-related options for managing plutonium

    International Nuclear Information System (INIS)

    Rothwell, G.

    1996-01-01

    This paper discusses the economic assumptions in the U.S. National Academy of Sciences' report, Management and Disposition of Excess Weapons Plutonium: Reactor-Related Options (1995). It reviews the Net Present Value approach for discounting and comparing the costs and benefits of reactor-related options. It argues that because risks associated with the returns to plutonium management are unlikely to be constant over time, it is preferable to use a real risk-free rate to discount cash flows and explicitly describe the probability distributions for costs and benefits, allowing decision makers to determine the risk premium of each option. As a baseline for comparison, it assumes that one economic benefit of changing the current plutonium management system is a reduction in on-going Surveillance and Maintenance (S and M) costs. This reduction in the present value of S and M costs can be compared with the discounted costs of each option. These costs include direct construction costs, indirect costs, operating costs minus revenues, and decontamination and decommissioning expenses. The paper also discusses how to conduct an uncertainty analysis. It finishes by summarizing conclusions and recommendations and discusses how these recommendations might apply to the evaluation of Russian plutonium management options. (author)
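The Net Present Value comparison the paper describes can be sketched in a few lines. The cash flows and the real risk-free rate below are illustrative assumptions, not figures from the report:

```python
def npv(cash_flows, rate):
    """Net present value of cash_flows[t] received at the end of year t,
    discounted at a real (inflation-adjusted) rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical option: pay 100 now to avoid 12/yr of ongoing surveillance
# and maintenance (S&M) costs for 10 years.
risk_free = 0.03                      # illustrative real risk-free rate
flows = [-100.0] + [12.0] * 10
value = npv(flows, risk_free)         # positive -> option beats status quo
```

Following the paper's recommendation, risk would enter not through a higher discount rate but through explicit probability distributions over `flows`, letting decision makers apply their own risk premium to each option.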

  1. Ecological risk of anthropogenic pollutants to reptiles: Evaluating assumptions of sensitivity and exposure

    International Nuclear Information System (INIS)

    Weir, Scott M.; Suski, Jamie G.; Salice, Christopher J.

    2010-01-01

    A large data gap for reptile ecotoxicology still persists; therefore, ecological risk assessments of reptiles usually incorporate the use of surrogate species. This necessitates that (1) the surrogate is at least as sensitive as the target taxon and/or (2) exposures to the surrogate are greater than that of the target taxon. We evaluated these assumptions for the use of birds as surrogates for reptiles. Based on a survey of the literature, birds were more sensitive than reptiles in less than 1/4 of the chemicals investigated. Dietary and dermal exposure modeling indicated that exposure to reptiles was relatively high, particularly when the dermal route was considered. We conclude that caution is warranted in the use of avian receptors as surrogates for reptiles in ecological risk assessment and emphasize the need to better understand the magnitude and mechanism of contaminant exposure in reptiles to improve exposure and risk estimation. - Avian receptors are not universally appropriate surrogates for reptiles in ecological risk assessment.

  2. Ecological risk of anthropogenic pollutants to reptiles: Evaluating assumptions of sensitivity and exposure

    Energy Technology Data Exchange (ETDEWEB)

    Weir, Scott M., E-mail: scott.weir@ttu.ed [Texas Tech University, Institute of Environmental and Human Health, Department of Environmental Toxicology, Box 41163, Lubbock, TX (United States); Suski, Jamie G., E-mail: jamie.suski@ttu.ed [Texas Tech University, Department of Biological Sciences, Box 43131, Lubbock, TX (United States); Salice, Christopher J., E-mail: chris.salice@ttu.ed [Texas Tech University, Institute of Environmental and Human Health, Department of Environmental Toxicology, Box 41163, Lubbock, TX (United States)

    2010-12-15

    A large data gap for reptile ecotoxicology still persists; therefore, ecological risk assessments of reptiles usually incorporate the use of surrogate species. This necessitates that (1) the surrogate is at least as sensitive as the target taxon and/or (2) exposures to the surrogate are greater than that of the target taxon. We evaluated these assumptions for the use of birds as surrogates for reptiles. Based on a survey of the literature, birds were more sensitive than reptiles in less than 1/4 of the chemicals investigated. Dietary and dermal exposure modeling indicated that exposure to reptiles was relatively high, particularly when the dermal route was considered. We conclude that caution is warranted in the use of avian receptors as surrogates for reptiles in ecological risk assessment and emphasize the need to better understand the magnitude and mechanism of contaminant exposure in reptiles to improve exposure and risk estimation. - Avian receptors are not universally appropriate surrogates for reptiles in ecological risk assessment.

  3. A critical evaluation of the local-equilibrium assumption in modeling NAPL-pool dissolution

    Science.gov (United States)

    Seagren, Eric A.; Rittmann, Bruce E.; Valocchi, Albert J.

    1999-07-01

    An analytical modeling analysis was used to assess when local equilibrium (LE) and nonequilibrium (NE) modeling approaches may be appropriate for describing nonaqueous-phase liquid (NAPL) pool dissolution. NE mass transfer between NAPL pools and groundwater is expected to affect the dissolution flux at small values of Sh'St, the modified Sherwood number (Lx·kl/Dz) multiplied by the Stanton number (kl/vx); for Sh'St ≈ 400 and above, the NE and LE solutions converge, and the LE assumption is appropriate. Based on typical groundwater conditions, many cases of interest are expected to fall in this range. The parameter with the greatest impact on Sh'St is kl. The NAPL pool mass-transfer coefficient correlation of Pfannkuch [Pfannkuch, H.-O., 1984. Determination of the contaminant source strength from mass exchange processes at the petroleum-ground-water interface in shallow aquifer systems. In: Proceedings of the NWWA/API Conference on Petroleum Hydrocarbons and Organic Chemicals in Ground Water—Prevention, Detection, and Restoration, Houston, TX. Natl. Water Well Assoc., Worthington, OH, Nov. 1984, pp. 111-129.] was evaluated using the toluene pool data from Seagren et al. [Seagren, E.A., Rittmann, B.E., Valocchi, A.J., 1998. An experimental investigation of NAPL-pool dissolution enhancement by flushing. J. Contam. Hydrol., accepted.]. Dissolution flux predictions made with kl calculated using the Pfannkuch correlation were similar to the LE model predictions, and deviated systematically from predictions made using the average overall kl = 4.76 m/day estimated by Seagren et al. (1998) and from the experimental data for vx > 18 m/day. The Pfannkuch correlation kl was too large for vx above about 10 m/day, possibly because of the relatively low Peclet number data used by Pfannkuch [Pfannkuch, H.-O., 1984. Determination
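The dimensionless criterion from the abstract is simple to evaluate. The parameter values below are illustrative only (kl = 4.76 m/day and vx = 18 m/day appear in the abstract; the pool length Lx and vertical dispersion coefficient Dz are assumed):

```python
def sherwood_stanton(L_x, k_l, D_z, v_x):
    """Sh'St = (L_x * k_l / D_z) * (k_l / v_x): the modified Sherwood
    number times the Stanton number, the abstract's criterion for when the
    local-equilibrium (LE) description of NAPL-pool dissolution applies."""
    return (L_x * k_l / D_z) * (k_l / v_x)

# Illustrative values (units: m, m/day, m^2/day, m/day).
val = sherwood_stanton(L_x=5.0, k_l=4.76, D_z=1e-4, v_x=18.0)
le_ok = val >= 400    # large Sh'St -> NE and LE solutions converge
```

Because kl enters the product twice, it dominates Sh'St, which is why the abstract singles out the mass-transfer coefficient correlation as the critical input.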

  4. Experimental evaluation of the pure configurational stress assumption in the flow dynamics of entangled polymer melts

    DEFF Research Database (Denmark)

    Rasmussen, Henrik K.; Bejenariu, Anca Gabriela; Hassager, Ole

    2010-01-01

    to the flow in the non-linear flow regime. This has allowed highly elastic measurements within the limit of pure orientational stress, as the time of the flow was considerably smaller than the Rouse time. A Doi-Edwards [J. Chem. Soc., Faraday Trans. 2 74, 1818-1832 (1978)] type of constitutive model...... with the assumption of pure configurational stress was accurately able to predict the startup as well as the reversed flow behavior. This confirms that this commonly used theoretical picture for the flow of polymeric liquids is a correct physical principle to apply. © 2010 The Society of Rheology. [DOI: 10.1122/1.3496378]...

  5. Assumptions used for evaluating the potential radiological consequences of a loss of coolant accident for pressurized water reactors - June 1974

    International Nuclear Information System (INIS)

    Anon.

    1974-01-01

    Section 50.34 of 10 CFR Part 50 requires that each applicant for a construction permit or operating license provide an analysis and evaluation of the design and performance of structures, systems, and components of the facility with the objective of assessing the risk to public health and safety resulting from operation of the facility. The design basis loss of coolant accident is one of the postulated accidents used to evaluate the adequacy of these structures, systems, and components with respect to the public health and safety. This guide gives acceptable assumptions that may be used in evaluating the radiological consequences of this accident for a pressurized water reactor. In some cases, unusual site characteristics, plant design features, or other factors may require different assumptions which will be considered on an individual case basis. The Advisory Committee on Reactor Safeguards has been consulted concerning this guide and has concurred in the regulatory position

  6. Assumptions used for evaluating the potential radiological consequences of a loss of coolant accident for boiling water reactors - June 1974

    International Nuclear Information System (INIS)

    Anon.

    1974-01-01

    Section 50.34 of 10 CFR Part 50 requires that each applicant for a construction permit or operating license provide an analysis and evaluation of the design and performance of structures, systems, and components of the facility with the objective of assessing the risk to public health and safety resulting from operation of the facility. The design basis loss of coolant accident is one of the postulated accidents used to evaluate the adequacy of these structures, systems, and components with respect to the public health and safety. This guide gives acceptable assumptions that may be used in evaluating the radiological consequences of this accident for a boiling water reactor. In some cases, unusual site characteristics, plant design features, or other factors may require different assumptions which will be considered on an individual case basis. The Advisory Committee on Reactor Safeguards has been consulted concerning this guide and has concurred in the regulatory position

  7. A quantitative evaluation of a qualitative risk assessment framework: Examining the assumptions and predictions of the Productivity Susceptibility Analysis (PSA)

    Science.gov (United States)

    2018-01-01

    Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluate the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869

  8. Basic conceptions for reactor pressure vessel manipulators and their evaluation

    International Nuclear Information System (INIS)

    Popp, P.

    1987-01-01

    The study deals with application fields and basic design conceptions of manipulators in reactor pressure vessels, as well as their evaluation. It is shown that manipulators supported at the reactor flange offer significant advantages.

  9. A basic evaluated neutronic data file for elemental scandium

    International Nuclear Information System (INIS)

    Smith, A.B.; Meadows, J.W.; Howerton, R.J.

    1992-01-01

    This report documents an evaluated neutronic data file for elemental scandium, presented in the ENDF/B-VI format. This file should provide basic nuclear data essential for neutronic calculations involving elemental scandium. No equivalent file was previously available

  10. False assumptions.

    Science.gov (United States)

    Swaminathan, M

    1997-01-01

    Indian women do not have to be told the benefits of breast feeding or "rescued from the clutches of wicked multinational companies" by international agencies. There is no proof that breast feeding has declined in India; in fact, a 1987 survey revealed that 98% of Indian women breast feed. Efforts to promote breast feeding among the middle classes rely on such initiatives as the "baby friendly" hospital where breast feeding is promoted immediately after birth. This ignores the 76% of Indian women who give birth at home. Blaming this unproved decline in breast feeding on multinational companies distracts attention from more far-reaching and intractable effects of social change. While the Infant Milk Substitutes Act is helpful, it also deflects attention from more pressing issues. Another false assumption is that Indian women are abandoning breast feeding to comply with the demands of employment, but research indicates that most women give up employment for breast feeding, despite the economic cost to their families. Women also seek work in the informal sector to secure the flexibility to meet their child care responsibilities. Instead of being concerned about "teaching" women what they already know about the benefits of breast feeding, efforts should be made to remove the constraints women face as a result of their multiple roles and to empower them with the support of families, governmental policies and legislation, employers, health professionals, and the media.

  11. Logic Assumptions and Risks Framework Applied to Defence Campaign Planning and Evaluation

    Science.gov (United States)

    2013-05-01

    based on prescriptive targets of reduction in particular crime statistics in a certain timeframe. Similarly, if overall desired effects are not well...the Evaluation Journal of Australasia, Australasian Evaluation Society. (DSTO-TR-2840) These six campaign functions...Callahan’s article in “Anecdotally” Newsletter January 2013, Anecdote Pty Ltd., a commercial consultancy specialising in narrative technique for business

  12. Evaluation of the assumption of continuity: outline of a new tool

    OpenAIRE

    Inácio, Helena Coelho; Serrano Moracho, Francisco

    2010-01-01

    The evaluation of going concern is one of the most visible elements of the auditor's report. The auditor is often criticized for an incapacity to identify going-concern red flags. The auditor's report does not always have the effects we expect, but there is evidence of some effect, and it is an additional element to be considered at the moment of a decision. For these reasons, some statistical models have been developed to help auditors in the evaluation of going-concern...

  13. Spiral CT in kidney: assumption of renal function by objective evaluation of renal cortical enhancement

    International Nuclear Information System (INIS)

    Choi, Bo Yoon; Lee, Jong Seok; Lee, Joon Woo; Myung, Jae Sung; Sim, Jung Suk; Seong, Chang Kyu; Kim, Seung Hyup; Choi, Guk Myeong; Chi, Seong Whi

    2000-01-01

    To correlate the degree of renal cortical enhancement, objectively evaluated by means of spiral CT, with the serum level of creatinine, and to determine the extent to which this degree of enhancement may be used to detect renal parenchymal disease. Eighty patients (M:F = 50:30; mean age, 53 years) with available serum creatinine levels who underwent spiral CT between September and October 1999 were included in this study. In fifty patients the findings suggested hepatic or biliary diseases such as hepatoma, biliary cancer, or stone, while in thirty, renal diseases such as cyst, hematoma, or stone appeared to be present. Spiral CT imaging of the cortical phase was obtained 30-40 seconds after the injection of 120 ml of non-ionic contrast media at a rate of 3 ml/sec. The degree of renal cortical enhancement was calculated by dividing the CT attenuation number of the renal cortex at the level of the renal hilum by the CT attenuation number of the aorta at the same level. The degree of renal cortical enhancement was compared with the serum level of creatinine, and the degree of renal cortical enhancement in renal parenchymal disease was compared with that of the normal group. Among the eighty patients there were five with renal parenchymal disease and 75 with normal renal function. The ratio of the CT attenuation number of the renal cortex to that of the aorta at the level of the renal hilum ranged between 0.49 and 0.99 (mean, 0.79; standard deviation, 0.15), while the serum level of creatinine ranged between 0.6 and 3.2 mg/dl. There was a significant correlation (coefficient, -0.346; p = 0.002) between the ratio of the CT attenuation numbers and the serum level of creatinine. There was a significant difference (p < 0.01) between those with renal parenchymal disease and the normal group. The use of spiral CT to measure the degree of renal cortical enhancement provides not only an effective index for

  14. Using Instrument Simulators and a Satellite Database to Evaluate Microphysical Assumptions in High-Resolution Simulations of Hurricane Rita

    Science.gov (United States)

    Hristova-Veleva, S. M.; Chao, Y.; Chau, A. H.; Haddad, Z. S.; Knosp, B.; Lambrigtsen, B.; Li, P.; Martin, J. M.; Poulsen, W. L.; Rodriguez, E.; Stiles, B. W.; Turk, J.; Vu, Q.

    2009-12-01

    Improving forecasting of hurricane intensity remains a significant challenge for the research and operational communities. Many factors determine a tropical cyclone’s intensity. Ultimately, though, intensity is dependent on the magnitude and distribution of the latent heating that accompanies the hydrometeor production during the convective process. Hence, the microphysical processes and their representation in hurricane models are of crucial importance for accurately simulating hurricane intensity and evolution. The accurate modeling of the microphysical processes becomes increasingly important when running high-resolution models that should properly reflect the convective processes in the hurricane eyewall. There are many microphysical parameterizations available today. However, evaluating their performance and selecting the most representative ones remains a challenge. Several field campaigns were focused on collecting in situ microphysical observations to help distinguish between different modeling approaches and improve on the most promising ones. However, these point measurements cannot adequately reflect the space and time correlations characteristic of the convective processes. An alternative approach to evaluating microphysical assumptions is to use multi-parameter remote sensing observations of the 3D storm structure and evolution. In doing so, we could compare modeled to retrieved geophysical parameters. The satellite retrievals, however, carry their own uncertainty. To increase the fidelity of the microphysical evaluation results, we can use instrument simulators to produce satellite observables from the model fields and compare to the observed. This presentation will illustrate how instrument simulators can be used to discriminate between different microphysical assumptions. We will compare and contrast the members of high-resolution ensemble WRF model simulations of Hurricane Rita (2005), each member reflecting different microphysical assumptions.

  15. BASIC

    DEFF Research Database (Denmark)

    Hansen, Pelle Guldborg; Schmidt, Karsten

    2017-01-01

    Over the past 10 years we have witnessed the emergence of a new evidence-based policy paradigm, Behavioural Public Policy (BPP), which seeks to integrate theoretical and methodological insights from the behavioural sciences into public policy-making. Work with BPP has, however, been characterised by being unsystematic... BPP. The approach consists partly of the overarching process model BASIC and partly of an embedded framework, ABCD, which is a model for systematic behavioural analysis and for the development, testing and implementation of behaviourally informed solution concepts. The combined model makes it possible for researchers as well as public officials...

  16. EVALUATION OF BASIC COURSE WORKSHOP CONDUCTED IN A MEDICAL COLLEGE

    OpenAIRE

    Manasee Panda; Krishna Kar; Kaushik Mishra

    2017-01-01

    BACKGROUND Faculty development is perhaps one of the foremost issues among the factors influencing the quality of medical education. It was planned to evaluate the Basic Course Workshop (BCW) on Medical Education Technologies (MET) conducted in the institution, with the following objectives: 1. To assess the effectiveness of the BCW in MET conducted in the Medical College. 2. To study the changes in teaching practices and assessment methods of faculties after the workshop. MATERIALS ...

  17. [Basic principles and methodological considerations of health economic evaluations].

    Science.gov (United States)

    Loza, Cesar; Castillo-Portilla, Manuel; Rojas, José Luis; Huayanay, Leandro

    2011-01-01

    Health Economics is an essential instrument for health management, and economic evaluations can be considered as tools assisting the decision-making process for the allocation of resources in health. Currently, economic evaluations are increasingly being used worldwide, thus encouraging evidence-based decision-making and seeking efficient and rational alternatives within the framework of health services activities. In this review, we present an overview and define the basic types of economic evaluations, with emphasis on complete Economic Evaluations (EE). In addition, we review key concepts regarding the perspectives from which EE can be conducted, the types of costs that can be considered, the time horizon, discounting, assessment of uncertainty and decision rules. Finally, we describe concepts about the extrapolation and spread of economic evaluations in health.

  18. Multiverse Assumptions and Philosophy

    Directory of Open Access Journals (Sweden)

    James R. Johnson

    2018-02-01

    Multiverses are predictions based on theories. Focusing on each theory’s assumptions is key to evaluating a proposed multiverse. Although accepted theories of particle physics and cosmology contain non-intuitive features, multiverse theories entertain a host of “strange” assumptions classified as metaphysical (outside objective experience, concerned with the fundamental nature of reality, ideas that cannot be proven right or wrong): topics such as infinity, duplicate yous, hypothetical fields, more than three space dimensions, Hilbert space, advanced civilizations, and reality established by mathematical relationships. It is easy to confuse multiverse proposals because many divergent models exist. This overview defines the characteristics of eleven popular multiverse proposals. The characteristics compared are: initial conditions, values of constants, laws of nature, number of space dimensions, number of universes, and fine-tuning explanations. Future scientific experiments may validate selected assumptions; but until they do, proposals by philosophers may be as valid as theoretical scientific theories.

  19. Sampling Assumptions in Inductive Generalization

    Science.gov (United States)

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  20. Basic tests on integrity evaluation for natural hexafluoride transporting container

    International Nuclear Information System (INIS)

    Gomi, Yoshio; Yamakawa, Hidetsugu; Kato, Osamu; Kobayashi, Seiichi

    1990-01-01

    In this study, the factors needed for an integrity evaluation of the UF6-transporting 48Y cylinder were confirmed by basic tests and preliminary analysis. The factors were the sealing parts and the external surface emissivity, which govern both the behaviour under fire-accident conditions and the fire-resistance capability of the cylinder, and the external-pressure resistance capability in a sinking accident. The results obtained were as follows. (1) In confirmation tests of the fire resistance of the cylinder valve and plug, seat leakage of the valve occurred at 150 degrees C, caused by unequal thermal expansion between the valve body and the stem. The tin-lead solder coating the tapered threads of the valve and plug melted at 200 degrees C, and the sealing boundary then broke. (2) The influence of external emissivity on radiation heat transfer was measured with test pieces heated in an electric oven. The paint covering the specimens burned and separated, and the emissivity changed from 0.4 to 0.6, depending on the surrounding temperature. A type 48Y cylinder filled with 12.5 tons of UF6 and the measured emissivity were used in the computer-code analysis. Hydraulic rupture did not occur under the fire-accident condition of 800 degrees C for 30 minutes. (3) In the external pressure test, the valve withstood the hydrostatic pressure at a depth of 3000 meters, which corresponds to about five times the buckling strength of the cylinder body. (author)

  1. EVALUATION OF BASIC COURSE WORKSHOP CONDUCTED IN A MEDICAL COLLEGE

    Directory of Open Access Journals (Sweden)

    Manasee Panda

    2017-08-01

    BACKGROUND Faculty development is perhaps one of the foremost issues among the factors influencing the quality of medical education. It was planned to evaluate the Basic Course Workshop (BCW) on Medical Education Technologies (MET) conducted in the institution, with the following objectives: 1. To assess the effectiveness of the BCW in MET conducted in the Medical College. 2. To study the changes in teaching practices and assessment methods of faculties after the workshop. MATERIALS AND METHODS The present evaluation study was conducted at the RTC (SCB Medical College, Odisha) of the MCI in MET from February 2012 to December 2012. Kirkpatrick’s model, with four levels of programme outcomes (reaction, learning, behaviour, and result), was used to evaluate the effectiveness of the workshop. Convenience sampling was used. All the faculties in the first 4 batches of the workshop were the study participants. Data were collected from the records of the RTC: the filled-in feedback forms, pre- and post-test forms, semi-structured questionnaires completed by the participants, in-depth interviews of facilitators, and focus group discussions with students. Descriptive statistics such as percentages and proportions, and the Chi-square test, were used. RESULTS A total of 67 faculties responded to the questionnaire. There was a gain in knowledge for the majority of faculties in different teaching-learning processes and assessment methods due to the workshop. More than 90% of faculties had the attitude to practise interactive teaching, PBL, and preparing MCQs and structured oral questions. Self-reported change in teaching behaviour and assessment methods was reported by more than 80% of the faculties. Reasons given for non-implementation were lack of support from the institution (64%), lack of support from other faculties (34%), and lack of self-motivation (13%). Facilitators were satisfied with the quality of training. But the FGD conducted with the students revealed that they failed to recognize noticeable change in the teaching and

  2. Evaluating the assessment of essay type questions in the basic ...

    African Journals Online (AJOL)

    Methodology: We examined the merits and demerits of the closed and open systems of assessment of essay type questions and viva voce in professional exams in the Basic Medical Sciences together with the challenges of present day Medical Education. Result: The result showed that the closed system of marking in its ...

  3. Adult Learning Assumptions

    Science.gov (United States)

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to examine Knowles' theory of andragogy and his six assumptions of how adults learn while providing evidence to support two of his assumptions based on the theory of andragogy. As no single theory explains how adults learn, it can best be assumed that adults learn through the accumulation of formal and informal…

  4. What is the proper evaluation method: Some basic considerations

    International Nuclear Information System (INIS)

    Leeb, Helmut; Schnabel, Georg; Srdinko, Thomas

    2014-01-01

    Recent developments and applications demand an extension of the energy range and the inclusion of reliable uncertainty information in nuclear data libraries. Owing to the scarcity of neutron-induced reaction data beyond 20 MeV, the extension of the energy range up to at least 150 MeV is not trivial, because the corresponding nuclear data evaluations depend heavily on nuclear models, and proper evaluation methods are still under discussion. Restricting ourselves to evaluation techniques based on Bayesian statistics, we consider the influence of the a priori knowledge on the final result of the evaluation. The study clearly indicates the need to account properly for the deficiencies of the nuclear model. Concerning the covariance matrices, it is argued that they depend not only on the model but also on the method of generation, and an additional consensus is required for the comparison of different evaluations of the same data sets. (authors)

  5. Basic approach to evaluate methane partial oxidation catalysts

    CSIR Research Space (South Africa)

    Parmaliana, A

    1993-09-01

    …-phase reaction does not affect the catalytic pathways. Reasons for controversial results reported previously are discussed. They lie in the lack of an adequate experimental approach and in the generally adopted rule to evaluate the catalytic activity...

  6. Brand Evaluation - A Basic Feature in Modern Brand Management

    Directory of Open Access Journals (Sweden)

    Cosmin IRIMIEŞ

    2012-10-01

    Defined as the sum of features that make a subject unique, the brand has turned into one of the most important characteristics of the way products, services and institutions conduct their public relations or are presented to the contemporary consumer. Taking into consideration that branding is an extremely flexible process and can be applied to a very wide range of subjects, brand management has become one of the most important instruments of modern marketing and is used in every selling/buying transaction. The purpose of this article is to make a comprehensive analysis of the evaluation methods of brands, to present the situations that usually require a brand evaluation, as well as to see whether Romania has made any progress from this point of view.

  7. Evaluating the Sensitivity of the Mass-Based Particle Removal Calculations for HVAC Filters in ISO 16890 to Assumptions for Aerosol Distributions

    Directory of Open Access Journals (Sweden)

    Brent Stephens

    2018-02-01

    High-efficiency particle air filters are increasingly being recommended for use in heating, ventilating, and air-conditioning (HVAC) systems to improve indoor air quality (IAQ). ISO Standard 16890-2016 provides a methodology for approximating mass-based particle removal efficiencies for PM1, PM2.5, and PM10 using size-resolved removal efficiency measurements for 0.3 µm to 10 µm particles. Two historical volume distribution functions for ambient aerosol distributions are assumed to represent ambient air in urban and rural areas globally. The goals of this work are to: (i) review the ambient aerosol distributions used in ISO 16890, (ii) evaluate the sensitivity of the mass-based removal efficiency calculation procedures described in ISO 16890 to various assumptions related to indoor and outdoor aerosol distributions, and (iii) recommend several modifications to the standard that can yield more realistic estimates of mass-based removal efficiencies for HVAC filters, and thus provide a more realistic representation of a greater number of building scenarios. The results demonstrate that knowing the PM mass removal efficiency estimated using ISO 16890 is not sufficient to predict the PM mass removal efficiency in all of the environments in which the filter might be used. The main reason for this insufficiency is that the assumptions for aerosol number and volume distributions can substantially impact the results, albeit with some exceptions.
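    The mass-based calculation this abstract examines weights size-resolved filter efficiencies by an assumed ambient aerosol volume distribution. The sketch below illustrates the idea only: the lognormal parameters and the size-resolved efficiencies are hypothetical placeholders, not the urban/rural distributions tabulated in ISO 16890.

```python
import math

def lognormal_volume(d, d_mode=0.3, gsd=2.0):
    # Illustrative lognormal volume distribution dV/dlnD (arbitrary units).
    # The mode diameter and geometric standard deviation are assumptions,
    # not the distributions specified in the standard.
    return math.exp(-((math.log(d) - math.log(d_mode)) ** 2)
                    / (2 * math.log(gsd) ** 2))

def mass_efficiency(diams, effs, d_max, dist=lognormal_volume):
    # Volume-weighted (i.e. mass-weighted, assuming constant density)
    # removal efficiency for particles no larger than d_max.
    num = den = 0.0
    for d, e in zip(diams, effs):
        if d <= d_max:
            w = dist(d)
            num += e * w
            den += w
    return num / den

# Hypothetical size-resolved efficiencies for 0.3-10 um channels.
diams = [0.3, 0.5, 1.0, 2.5, 5.0, 10.0]
effs  = [0.35, 0.45, 0.60, 0.80, 0.95, 0.99]
print(round(mass_efficiency(diams, effs, 2.5), 3))  # ePM2.5-style estimate
```

    The same size-resolved efficiencies yield a different mass-based efficiency under a different assumed distribution, which is the sensitivity the study evaluates.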

  8. Sensitivity Analysis Without Assumptions.

    Science.gov (United States)

    Ding, Peng; VanderWeele, Tyler J

    2016-05-01

    Unmeasured confounding may undermine the validity of causal inference with observational studies. Sensitivity analysis provides an attractive way to partially circumvent this issue by assessing the potential influence of unmeasured confounding on causal conclusions. However, previous sensitivity analysis approaches often make strong and untestable assumptions such as having an unmeasured confounder that is binary, or having no interaction between the effects of the exposure and the confounder on the outcome, or having only one unmeasured confounder. Without imposing any assumptions on the unmeasured confounder or confounders, we derive a bounding factor and a sharp inequality such that the sensitivity analysis parameters must satisfy the inequality if an unmeasured confounder is to explain away the observed effect estimate or reduce it to a particular level. Our approach is easy to implement and involves only two sensitivity parameters. Surprisingly, our bounding factor, which makes no simplifying assumptions, is no more conservative than a number of previous sensitivity analysis techniques that do make assumptions. Our new bounding factor implies not only the traditional Cornfield conditions that both the relative risk of the exposure on the confounder and that of the confounder on the outcome must satisfy but also a high threshold that the maximum of these relative risks must satisfy. Furthermore, this new bounding factor can be viewed as a measure of the strength of confounding between the exposure and the outcome induced by a confounder.
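    The two-parameter bound described in this abstract has a simple closed form; a minimal sketch, under the assumption that it matches the published formula (function names are illustrative):

```python
import math

def bounding_factor(rr_eu: float, rr_ud: float) -> float:
    # rr_eu: maximal relative risk relating the exposure to the unmeasured
    # confounder; rr_ud: maximal relative risk relating the confounder to
    # the outcome. Joint bound on how much confounding can inflate the
    # observed relative risk.
    return (rr_eu * rr_ud) / (rr_eu + rr_ud - 1.0)

def e_value(rr_obs: float) -> float:
    # Minimum strength that BOTH sensitivity parameters must jointly reach
    # for an unmeasured confounder to fully explain away rr_obs (> 1).
    return rr_obs + math.sqrt(rr_obs * (rr_obs - 1.0))

rr_obs = 2.0                      # observed exposure-outcome relative risk
b = bounding_factor(3.0, 3.0)     # both sensitivity parameters set to 3
print(round(rr_obs / b, 3))       # -> 1.111, smallest true RR under the bound
print(round(e_value(rr_obs), 3))  # -> 3.414
```

    Dividing the observed relative risk by the bounding factor gives the smallest true relative risk consistent with confounders of the assumed strength; here, confounding weaker than about 3.41 on both parameters cannot reduce an observed relative risk of 2 to the null.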

  9. Technical note: Evaluation of the simultaneous measurements of mesospheric OH, HO2, and O3 under a photochemical equilibrium assumption - a statistical approach

    Science.gov (United States)

    Kulikov, Mikhail Y.; Nechaev, Anton A.; Belikovich, Mikhail V.; Ermakova, Tatiana S.; Feigin, Alexander M.

    2018-05-01

    This Technical Note presents a statistical approach to evaluating simultaneous measurements of several atmospheric components under the assumption of photochemical equilibrium. We consider simultaneous measurements of OH, HO2, and O3 at the altitudes of the mesosphere as a specific example and their daytime photochemical equilibrium as an evaluating relationship. A simplified algebraic equation relating local concentrations of these components in the 50-100 km altitude range has been derived. The parameters of the equation are temperature, neutral density, local zenith angle, and the rates of eight reactions. We have performed a one-year simulation of the mesosphere and lower thermosphere using a 3-D chemical-transport model. The simulation shows that the discrepancy between the calculated evolution of the components and the equilibrium value given by the equation does not exceed 3-4 % in the full range of altitudes independent of season or latitude. We have developed a statistical Bayesian evaluation technique for simultaneous measurements of OH, HO2, and O3 based on the equilibrium equation taking into account the measurement error. The first results of the application of the technique to MLS/Aura data (Microwave Limb Sounder) are presented in this Technical Note. It has been found that the satellite data of the HO2 distribution regularly demonstrate lower altitudes of this component's mesospheric maximum. This has also been confirmed by model HO2 distributions and comparison with offline retrieval of HO2 from the daily zonal means MLS radiance.

  10. Contextuality under weak assumptions

    International Nuclear Information System (INIS)

    Simmons, Andrew W; Rudolph, Terry; Wallman, Joel J; Pashayan, Hakop; Bartlett, Stephen D

    2017-01-01

    The presence of contextuality in quantum theory was first highlighted by Bell, Kochen and Specker, who discovered that for quantum systems of three or more dimensions, measurements could not be viewed as deterministically revealing pre-existing properties of the system. More precisely, no model can assign deterministic outcomes to the projectors of a quantum measurement in a way that depends only on the projector and not the context (the full set of projectors) in which it appeared, despite the fact that the Born rule probabilities associated with projectors are independent of the context. A more general, operational definition of contextuality introduced by Spekkens, which we will term ‘probabilistic contextuality’, drops the assumption of determinism and allows for operations other than measurements to be considered contextual. Even two-dimensional quantum mechanics can be shown to be contextual under this generalised notion. Probabilistic noncontextuality represents the postulate that elements of an operational theory that cannot be distinguished from each other based on the statistics of arbitrarily many repeated experiments (they give rise to the same operational probabilities) are ontologically identical. In this paper, we introduce a framework that enables us to distinguish between different noncontextuality assumptions in terms of the relationships between the ontological representations of objects in the theory given a certain relation between their operational representations. This framework can be used to motivate and define a ‘possibilistic’ analogue, encapsulating the idea that elements of an operational theory that cannot be unambiguously distinguished operationally can also not be unambiguously distinguished ontologically. We then prove that possibilistic noncontextuality is equivalent to an alternative notion of noncontextuality proposed by Hardy. Finally, we demonstrate that these weaker noncontextuality assumptions are sufficient to prove

  11. The use of the SF-36 questionnaire in adult survivors of childhood cancer: evaluation of data quality, score reliability, and scaling assumptions

    Directory of Open Access Journals (Sweden)

    Winter David L

    2006-10-01

    Abstract Background The SF-36 has been used in a number of previous studies that have investigated the health status of childhood cancer survivors, but it has never been evaluated regarding data quality, scaling assumptions, and reliability in this population. As health status among childhood cancer survivors is being increasingly investigated, it is important that the measurement instruments are reliable, validated and appropriate for use in this population. The aim of this paper was to determine whether the SF-36 questionnaire is a valid and reliable instrument for assessing the self-perceived health status of adult survivors of childhood cancer. Methods We examined the SF-36 to see how it performed with respect to (1) data completeness, (2) distribution of the scale scores, (3) item-internal consistency, (4) item-discriminant validity, (5) internal consistency, and (6) scaling assumptions. For this investigation we used SF-36 data from a population-based study of 10,189 adult survivors of childhood cancer. Results Overall, missing values per item ranged from 0.5 to 2.9 percent. Ceiling effects were found to be highest in the role limitation-physical (76.7%) and role limitation-emotional (76.5%) scales. All correlations between items and their hypothesised scales exceeded the suggested standard of 0.40 for satisfactory item-consistency. Across all scales, the Cronbach's alpha coefficient of reliability was found to be higher than the suggested value of 0.70. Consistent across all cancer groups, the physical health related scale scores correlated strongly with the Physical Component Summary (PCS) scale scores and weakly with the Mental Component Summary (MCS) scale scores. Also, the mental health and role limitation-emotional scales correlated strongly with the MCS scale score and weakly with the PCS scale score. Moderate to strong correlations with both summary scores were found for the general health perception, energy/vitality, and social functioning
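    The internal-consistency criterion cited in this abstract (Cronbach's alpha above 0.70) can be computed directly from item scores. A minimal sketch with hypothetical respondent data, not the study's SF-36 data:

```python
def cronbach_alpha(items):
    # items: list of item-score columns, one list per item, equal lengths.
    # alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Hypothetical scores from 4 respondents on a 3-item scale.
items = [[3, 4, 5, 2],
         [2, 4, 5, 3],
         [3, 5, 4, 2]]
print(round(cronbach_alpha(items), 3))  # -> 0.892, above the 0.70 criterion
```

    Items that rise and fall together across respondents push alpha toward 1; uncorrelated items push it toward 0, which is why the 0.70 threshold serves as a reliability check.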

  12. An evaluation of the 18- and 12-month basic postgraduate training programmes in Denmark

    DEFF Research Database (Denmark)

    Kjaer, Niels Kristian; Qvesel, Dorte; Kodal, Troels

    2010-01-01

    equipped and less ready for continued specialisation than doctors of the 18-month programme and they requested a downward adjustment of the learning objectives associated with the educational positions which follow their basic training. Physicians do not expect the increased focus on learning...... and new programmes evaluate their training, and it explores their attitudes towards the new postgraduate training programme. MATERIAL AND METHODS: We developed a questionnaire by which quantitative and qualitative data were collected. The questionnaire was sent to all physicians following basic...... and supervision to compensate for the six-month reduction of the training period. Internal medicine should be included in the basic postgraduate training of all physicians. Training in secondary as well as primary health care was requested. CONCLUSION: The young physicians were reluctant towards the new basic...

  13. Designing an evaluation framework for WFME basic standards for medical education.

    Science.gov (United States)

    Tackett, Sean; Grant, Janet; Mmari, Kristin

    2016-01-01

    To create an evaluation plan for the World Federation for Medical Education (WFME) accreditation standards for basic medical education. We conceptualized the 100 basic standards from "Basic Medical Education: WFME Global Standards for Quality Improvement: The 2012 Revision" as medical education program objectives. Standards were simplified into evaluable items, which were then categorized as inputs, processes, outputs and/or outcomes to generate a logic model and corresponding plan for data collection. WFME standards posed significant challenges to evaluation due to complex wording, inconsistent formatting and lack of existing assessment tools. Our resulting logic model contained 244 items. Standard B 5.1.1 separated into 24 items, the most for any single standard. A large proportion of items (40%) required evaluation of more than one input, process, output and/or outcome. Only one standard (B 3.2.2) was interpreted as requiring evaluation of a program outcome. Current WFME standards are difficult to use for evaluation planning. Our analysis may guide adaptation and revision of standards to make them more evaluable. Our logic model and data collection plan may be useful to medical schools planning an institutional self-review and to accrediting authorities wanting to provide guidance to schools under their purview.
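    The logic-model bookkeeping described above (standards split into evaluable items, each tagged as input, process, output and/or outcome) can be illustrated with a small data structure. The standard numbers below appear in the abstract, but the item wordings and category tags are hypothetical placeholders:

```python
from collections import defaultdict

# Hypothetical evaluable items derived from WFME basic standards. Standard IDs
# B 3.2.2 and B 5.1.1 are cited in the abstract; the wordings/tags are invented.
items = [
    {"standard": "B 2.1.1", "item": "curriculum model is documented",   "tags": {"input"}},
    {"standard": "B 3.2.2", "item": "graduate outcomes are measured",   "tags": {"outcome"}},
    {"standard": "B 5.1.1", "item": "staff recruitment policy exists",  "tags": {"input", "process"}},
    {"standard": "B 5.1.1", "item": "staff/student ratio is monitored", "tags": {"process", "output"}},
]

# Group items by logic-model category to drive a data-collection plan
by_category = defaultdict(list)
for it in items:
    for tag in it["tags"]:
        by_category[tag].append(it["item"])

# Count items needing evaluation in more than one category (40% in the study)
multi = sum(1 for it in items if len(it["tags"]) > 1)
print(sorted(by_category), multi)
```

    Grouping by category rather than by standard is what turns the 244 items into a data-collection plan.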

  14. Evaluating the Effects of Basic Skills Mathematics Placement on Academic Outcomes of Community College Students

    Science.gov (United States)

    Melguizo, Tatiana; Bo, Hans; Prather, George; Kim, Bo

    2011-01-01

    The main objective of the authors' proposed study is to evaluate the effectiveness of math placement policies for entering community college students on these students' academic success in math, and their transfer and graduation rates. The main research question that guides the proposed study is: What are the effects of various basic skills…

  15. Teaching Basic Cooking Skills: Evaluation of the North Carolina Extension "Cook Smart, Eat Smart" Program

    Science.gov (United States)

    Dunn, Carolyn; Jayaratne, K. S. U.; Baughman, Kristen; Levine, Katrina

    2014-01-01

    Cook Smart, Eat Smart (CSES) is a 12-hour cooking school that teaches participants to prepare nutritious, delicious food using simple, healthy preparation techniques, basic ingredients, and minimal equipment. The purpose of this evaluation was to examine the impact of CSES on food preparation and meal consumption behavior. Program outcomes include…

  16. Evaluation of Achievement of Universal Basic Education (UBE) in Delta State

    Science.gov (United States)

    Osadebe, P. U.

    2014-01-01

    The study evaluated the objectives of the Universal Basic Education (UBE) programme in Delta State. It considered the extent to which each objective was achieved. A research question on the extent to which the UBE objectives were achieved guided the study. Two hypotheses were tested. A sample of 300 students was randomly drawn through the use of…

  17. Testing Our Fundamental Assumptions

    Science.gov (United States)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests!
    Explaining Different Arrival Times
    [Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones]
    Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics:
    Intrinsic delay: The photons may simply have been emitted at two different times by the astrophysical source.
    Delay due to Lorentz invariance violation: Perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect.
    Special-relativistic delay: Maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong. This, too, would cause photon velocities to be energy-dependent.
    Delay due to gravitational potential: Perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect.
    If we now turn this problem around, then by measuring the arrival-time delay between photons of different energies from various astrophysical sources (the further away, the better) we can provide constraints on these

  18. A simulation study to compare three self-controlled case series approaches: correction for violation of assumption and evaluation of bias.

    Science.gov (United States)

    Hua, Wei; Sun, Guoying; Dodd, Caitlin N; Romio, Silvana A; Whitaker, Heather J; Izurieta, Hector S; Black, Steven; Sturkenboom, Miriam C J M; Davis, Robert L; Deceuninck, Genevieve; Andrews, N J

    2013-08-01

    The assumption that the occurrence of the outcome event must not alter subsequent exposure probability is critical for preserving the validity of the self-controlled case series (SCCS) method. This assumption is violated in scenarios in which the event constitutes a contraindication for exposure. In this simulation study, we compared the performance of the standard SCCS approach and two alternative approaches when the event-independent exposure assumption was violated. Using the 2009 H1N1 and seasonal influenza vaccines and Guillain-Barré syndrome as a model, we simulated a scenario in which an individual may encounter multiple unordered exposures and each exposure may be contraindicated by the occurrence of the outcome event. The degree of contraindication was varied at 0%, 50%, and 100%. The first alternative approach used only cases occurring after exposure, with follow-up time starting from exposure. The second used a pseudo-likelihood method. When the event-independent exposure assumption was satisfied, the standard SCCS approach produced nearly unbiased relative incidence estimates. When this assumption was partially or completely violated, the two alternative SCCS approaches could be used. While the post-exposure-cases-only approach could handle only one exposure, the pseudo-likelihood approach was able to correct bias for both exposures. Violation of the event-independent exposure assumption leads to an overestimation of the relative incidence, which can be corrected by the alternative SCCS approaches. In multiple-exposure situations, the pseudo-likelihood approach is optimal; the post-exposure-cases-only approach is limited in handling a second exposure and may introduce additional bias, and thus should be used with caution. Copyright © 2013 John Wiley & Sons, Ltd.
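    Under the event-independent exposure assumption, and with a single risk window, the standard SCCS estimate reduces to a within-person comparison of event rates inside versus outside the window. A toy simulation of that unviolated case (all rates, window lengths and cohort size are invented, and no contraindication is modelled) recovers the true relative incidence:

```python
import random

random.seed(42)

obs_days, win_days = 365, 42   # observation period and post-exposure risk window
true_ri = 3.0                  # true relative incidence inside the risk window
base = 0.0005                  # daily event probability outside the window
n = 10_000                     # persons under observation

# Aggregate Bernoulli event draws over all person-days in and out of the window
in_win = sum(1 for _ in range(n * win_days) if random.random() < base * true_ri)
out_win = sum(1 for _ in range(n * (obs_days - win_days)) if random.random() < base)

# Standard SCCS point estimate: ratio of event rates in vs. out of the window
ri_hat = (in_win / win_days) / (out_win / (obs_days - win_days))
print(f"estimated relative incidence = {ri_hat:.2f}")
```

    Making exposure contraindicated after an event (censoring later exposures for cases) is what inflates this ratio and motivates the two alternative approaches the study evaluates.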

  19. IAEA consultants' meeting on selection of basic evaluations for the FENDL-2 library. Summary report

    International Nuclear Information System (INIS)

    Pashchenko, A.B.

    1996-09-01

    FENDL-1 is the international reference nuclear data library for fusion design applications, available from the IAEA Nuclear Data Section. FENDL/E is the sublibrary for evaluated neutron reaction data. An updated version, FENDL-2, is being developed. The present report contains the Summary of the IAEA Consultants' Meeting on ''Selection of Basic Evaluations for the FENDL-2 Library'', held at Karlsruhe, Germany, from 24 to 28 June 1996. This meeting was organized by the IAEA Nuclear Data Section (NDS) with the co-operation and assistance of local organizers of the Forschungszentrum Karlsruhe, Germany. Summarized are the conclusions and recommendations for the selection of basic evaluations from candidates submitted by five national projects (JENDL-FF, BROND, EFF, ENDF/B-VI and CENDL) for FENDL/E-2.0 international reference data library. (author). 1 tab

  20. Basic And Alternative Rules In Evaluation Of Tangible And Intangible Assets

    OpenAIRE

    Luminiţa Rus

    2010-01-01

    The purpose of this report is to bring to the forefront the basic and alternative national rules for the evaluation of tangible and intangible assets approved by Order of the Ministry of Public Finance no. 3055/2009, to compare them with the International Accounting Standards on the matter, and to position this accounting treatment in the context of the International Regulations. It also reviews the fiscal influence of these valuation rules.

  1. BASIC AND ALTERNATIVE RULES IN EVALUATION OF TANGIBLE AND INTANGIBLE ASSETS

    Directory of Open Access Journals (Sweden)

    LUMINIŢA RUS

    2010-01-01

    The purpose of this report is to bring to the forefront the basic and alternative national rules for the evaluation of tangible and intangible assets approved by Order of the Ministry of Public Finance no. 3055/2009, to compare them with the International Accounting Standards on the matter, and to position this accounting treatment in the context of the International Regulations. It also reviews the fiscal influence of these valuation rules.

  2. Linear regression and the normality assumption.

    Science.gov (United States)

    Schmidt, Amand F; Finan, Chris

    2017-12-16

    Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings, such transformations are often unnecessary and, worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage, i.e., the number of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10), violations of this normality assumption often do not noticeably impact results. In contrast, assumptions on the parametric model, absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and, worse, may bias estimates due to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
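    The commentary's central claim, that non-normal errors do not bias the OLS slope in large samples, is easy to check by simulation. A minimal sketch with heavily skewed (centred exponential) errors; the sample size and coefficients here are arbitrary:

```python
import random

random.seed(1)

n, true_slope, true_icpt = 5_000, 0.5, 2.0
xs = [random.uniform(0, 10) for _ in range(n)]
# Right-skewed, mean-zero errors: the normality assumption is clearly violated
ys = [true_icpt + true_slope * x + random.expovariate(1.0) - 1.0 for x in xs]

# Ordinary least squares slope: cov(x, y) / var(x)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
print(f"OLS slope = {slope:.3f} (truth: {true_slope})")
```

    The fitted slope lands close to the true value despite the skewed errors; transforming the outcome to "restore" normality here would change what the slope estimates and bias it.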

  3. Basic Test Framework for the Evaluation of Text Line Segmentation and Text Parameter Extraction

    Directory of Open Access Journals (Sweden)

    Darko Brodić

    2010-05-01

    Text line segmentation is an essential stage in off-line optical character recognition (OCR) systems. It is key because inaccurately segmented text lines will lead to OCR failure. Text line segmentation of handwritten documents is a complex and diverse problem, complicated by the nature of handwriting; hence, it is a leading challenge in handwritten document image processing. Due to inconsistencies in the measurement and evaluation of text segmentation algorithm quality, some basic set of measurement methods is required. Currently, there is no commonly accepted one, and all algorithm evaluation is custom-oriented. In this paper, a basic test framework for the evaluation of text feature extraction algorithms is proposed. This test framework consists of a few experiments primarily linked to text line segmentation, skew rate, and reference text line evaluation. Although the experiments are mutually independent, the results obtained are strongly cross-linked. Its suitability for different types of letters and languages, as well as its adaptability, are its main advantages. Thus, the paper presents an efficient evaluation method for text analysis algorithms.

  4. Basic evaluation of typical nanoporous silica nanoparticles in being drug carrier: Structure, wettability and hemolysis.

    Science.gov (United States)

    Li, Jing; Guo, Yingyu

    2017-04-01

    The present work is devoted to studying the basic capacity of nanoporous silica nanoparticles to act as drug carriers, covering structure, wettability and hemolysis, so as to provide a crucial evaluation. Typical nanoporous silica nanoparticles were studied, consisting of plain nanoporous silica nanoparticles (NSN), amino-modified nanoporous silica nanoparticles (amino-NSN), carboxyl-modified nanoporous silica nanoparticles (carboxyl-NSN) and hierarchical nanoporous silica nanoparticles (hierarchical-NSN). The results showed that their wettability and hemolysis were closely related to structure and surface modification. Basically, wettability became stronger as the amount of OH on the surface of NSN increased. Both large nanopores and surface modification can reduce the wettability of NSN. Furthermore, the NSN series were safe to use when circulating in the blood at low concentration, while if a high concentration cannot be avoided during administration, NSN with high porosity or amino modification are the safer choices. It is believed that this basic evaluation of NSN can contribute scientific instruction for the design of drug-loaded NSN systems. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Basic principles on the safety evaluation of the HTGR hydrogen production system

    International Nuclear Information System (INIS)

    Ohashi, Kazutaka; Nishihara, Tetsuo; Tazawa, Yujiro; Tachibana, Yukio; Kunitomi, Kazuhiko

    2009-03-01

    As the HTGR hydrogen production systems currently being developed by the Japan Atomic Energy Agency, such as the HTTR-IS system and GTHTR300C, consist of a nuclear reactor and a chemical plant, a combination without precedent in the world, a safety design philosophy and regulatory framework must be newly developed. In this report, the phenomena to be considered and the events to be postulated in the safety evaluation of HTGR hydrogen production systems were investigated, and basic principles for establishing acceptance criteria for explosion and toxic gas release accidents were provided. For the explosion accident in particular, quantitative criteria for the reactor building are proposed, with related sample calculation results. It is necessary to treat abnormal events occurring in the hydrogen production system as 'external events to the nuclear plant' in order to classify the hydrogen production system as a 'non-nuclear facility', and a basic policy to meet this requirement was also provided. (author)

  6. Basic life support: evaluation of learning using simulation and immediate feedback devices1.

    Science.gov (United States)

    Tobase, Lucia; Peres, Heloisa Helena Ciqueto; Tomazini, Edenir Aparecida Sartorelli; Teodoro, Simone Valentim; Ramos, Meire Bruna; Polastri, Thatiane Facholi

    2017-10-30

    Objective: to evaluate students' learning in an online course on basic life support with immediate feedback devices, during a simulation of care during cardiorespiratory arrest. Method: a quasi-experimental study, using a before-and-after design. An online course on basic life support was developed and administered to participants, as an educational intervention. Theoretical learning was evaluated by means of a pre- and post-test and, to verify the practice, simulation with immediate feedback devices was used. Results: there were 62 participants, 87% female, 90% in the first and second year of college, with a mean age of 21.47 (standard deviation 2.39). With a 95% confidence level, the mean scores in the pre-test were 6.4 (standard deviation 1.61), and 9.3 in the post-test (standard deviation 0.82, p < 0.001); in practice, 9.1 (standard deviation 0.95), with performance equivalent to basic cardiopulmonary resuscitation according to the feedback device; 43.7 (standard deviation 26.86) mean duration of the compression cycle by second of 20.5 (standard deviation 9.47); number of compressions 167.2 (standard deviation 57.06); depth of compressions of 48.1 millimeters (standard deviation 10.49); volume of ventilation 742.7 (standard deviation 301.12); flow fraction percentage of 40.3 (standard deviation 10.03). Conclusion: the online course contributed to learning of basic life support. In view of the need for technological innovations in teaching and systematization of cardiopulmonary resuscitation, simulation and feedback devices are resources that favor learning and performance awareness in performing the maneuvers.

  7. Basic life support: evaluation of learning using simulation and immediate feedback devices

    Directory of Open Access Journals (Sweden)

    Lucia Tobase

    2017-10-01

    ABSTRACT Objective: to evaluate students’ learning in an online course on basic life support with immediate feedback devices, during a simulation of care during cardiorespiratory arrest. Method: a quasi-experimental study, using a before-and-after design. An online course on basic life support was developed and administered to participants, as an educational intervention. Theoretical learning was evaluated by means of a pre- and post-test and, to verify the practice, simulation with immediate feedback devices was used. Results: there were 62 participants, 87% female, 90% in the first and second year of college, with a mean age of 21.47 (standard deviation 2.39). With a 95% confidence level, the mean scores in the pre-test were 6.4 (standard deviation 1.61), and 9.3 in the post-test (standard deviation 0.82, p < 0.001); in practice, 9.1 (standard deviation 0.95), with performance equivalent to basic cardiopulmonary resuscitation, according to the feedback device; 43.7 (standard deviation 26.86) mean duration of the compression cycle by second of 20.5 (standard deviation 9.47); number of compressions 167.2 (standard deviation 57.06); depth of compressions of 48.1 millimeters (standard deviation 10.49); volume of ventilation 742.7 (standard deviation 301.12); flow fraction percentage of 40.3 (standard deviation 10.03). Conclusion: the online course contributed to learning of basic life support. In view of the need for technological innovations in teaching and systematization of cardiopulmonary resuscitation, simulation and feedback devices are resources that favor learning and performance awareness in performing the maneuvers.

  8. Basic data generation and pressure loss coefficient evaluation for HANARO core thermal-hydraulic analyses

    International Nuclear Information System (INIS)

    Chae, Hee Taek; Lee, Kye Hong

    1999-06-01

    MATRA-h, a HANARO subchannel analysis computer code, is used to evaluate the thermal margin of the HANARO fuel. Its capabilities include assessments of CHF, ONB margin, and fuel temperature. In this report, the basic input data and core design parameters required to perform subchannel analysis with the MATRA-h code are collected. These data include the subchannel geometric data, thermal-hydraulic correlations, empirical constants and material properties. The friction and form loss coefficients of the fuel assemblies were determined based on the results of the pressure drop test. In addition, different form loss coefficients at the end plates and spacers are evaluated for the various subchannels. Adequate correlations are applied to the evaluation of the form loss coefficients for the various subchannels, corrected by measured values so that each flow channel has the same pressure drop. The basic input data and design parameters described in this report will be useful for evaluating the thermal margin of the HANARO fuel. (author). 11 refs., 13 tabs., 11 figs

  9. GPRA (Government Performance and Results Act) and research evaluation for basic science

    International Nuclear Information System (INIS)

    Takahashi, Shoji

    2002-08-01

    The purpose of the Government Performance and Results Act of 1993 (GPRA) is to require federal agencies to evaluate their program performance, especially from a cost-efficiency standpoint, and to report to Congress. GPRA holds agencies accountable for their programs by requiring that they think strategically (in most cases every 5 years) and set, measure and report goals annually. Agencies responsible for enhancing basic science, such as the Department of Energy (DOE) and the National Science Foundation (NSF), are not excluded by reason of the difficulty of economic evaluations. In Japan, based on the 'Rationalization program for the public corporations' of 2001, the research-and-development type corporations must make cost-performance evaluations in addition to the conventional ones; they thus face the same challenge with which the US agencies struggle. The purpose of this report is to draw some hints for this theme by surveying the GPRA reports of DOE and NSF and analyzing related information. At present, I have to conclude that although everybody accepts the necessity of socio-economic evaluations and investment criteria for basic research, studies and discussions about ways and means are still continuing even in the US. (author)

  10. Evaluation of the implementation of a quality system in a basic research laboratory: viability and impacts.

    Science.gov (United States)

    Fraga, Hilda Carolina de Jesus Rios; Fukutani, Kiyoshi Ferreira; Celes, Fabiana Santana; Barral, Aldina Maria Prado; Oliveira, Camila Indiani de

    2012-01-01

    To evaluate the process of implementing a quality management system in a basic research laboratory of a public institution, particularly considering the feasibility and impacts of this improvement. This was a prospective and qualitative study. We employed the norm "NIT DICLA 035 - Princípios das Boas Práticas de Laboratório (BPL)" (Principles of Good Laboratory Practice) and auxiliary documents of the Organisation for Economic Co-operation and Development to complement the planning and implementation of a quality system in a basic research laboratory. In parallel, we used the PDCA tool to define the goals of each phase of the implementation process. This study enabled the laboratory to comply with the NIT DICLA 035 norm and to implement this norm during execution of a research study. Accordingly, documents were prepared and routines were established, such as the registration of non-conformities, traceability of research data and equipment calibration. The implementation of a quality system in the setting of a laboratory focused on basic research is feasible once certain structural changes are made. Importantly, impacts were noticed during the process, which could be related to several improvements in the laboratory routine.

  11. Basic concepts and assumptions behind the ICRP recommendations

    International Nuclear Information System (INIS)

    Lindell, B.

    1981-03-01

    The paper gives a review of the current radiation protection recommendations by the International Commission on Radiological Protection (ICRP). It discusses concepts like stochastic effects, radiation detriments, collective dose, dose equivalent and dose limits. (G.B.)

  12. Evaluating a hybrid web-based basic genetics course for health professionals.

    Science.gov (United States)

    Wallen, Gwenyth R; Cusack, Georgie; Parada, Suzan; Miller-Davis, Claiborne; Cartledge, Tannia; Yates, Jan

    2011-08-01

    Health professionals, particularly nurses, continue to struggle with the expanding role of genetics information in the care of their patients. This paper describes an evaluation study of the effectiveness of a hybrid basic genetics course for healthcare professionals combining web-based learning with traditional face-to-face instructional techniques. A multidisciplinary group from the National Institutes of Health (NIH) created "Basic Genetics Education for Healthcare Providers" (BGEHCP). This program combined 7 web-based self-education modules with monthly traditional face-to-face lectures by genetics experts. The course was pilot tested by 186 healthcare providers from various disciplines, with 69% (n=129) of the class registrants enrolling in a pre-post evaluation trial. Outcome measures included critical-thinking knowledge items and a Web-based Learning Environment Inventory (WEBLEI). Results indicated a significant increase in knowledge scores, and the WEBLEI responses indicated effectiveness particularly in the areas of convenience, access, and the course structure and design. Although significant increases in overall knowledge scores were achieved, scores in content areas surrounding genetic risk identification and ethical issues regarding genetic testing reflected continued gaps in knowledge. Web-based genetics education may help overcome genetics knowledge deficits by providing access for health professionals with diverse schedules in a variety of national and international settings. Published by Elsevier Ltd.

  13. Evaluating subway drivers’ exposure to whole body vibration based on the Basic and VDV methods (with the ISO 2631-1 standard)

    Directory of Open Access Journals (Sweden)

    A. Khavanin

    2014-07-01

    Conclusion: Investigation of the results obtained from the Basic method and the VDV method showed different amounts of vibration exposure, in that VDV predicts a higher level of risk than the Basic method. The results also show that some of the presented indicators cannot define a safe zone in human vibration evaluations.
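    The two ISO 2631-1 metrics compared above differ in how they weight peaks: the Basic method is a root-mean-square average, while the VDV is a fourth-power dose that grows faster with occasional high accelerations, which is consistent with VDV predicting the higher risk. A sketch of both measures (the frequency weighting required by ISO 2631-1 is omitted here for brevity):

```python
import math

def rms(accel):
    """Basic method: r.m.s. of the (weighted) acceleration samples, m/s^2."""
    return math.sqrt(sum(a * a for a in accel) / len(accel))

def vdv(accel, dt):
    """Vibration dose value: fourth root of the time integral of a^4, m/s^1.75."""
    return (sum(a ** 4 for a in accel) * dt) ** 0.25

# A 1 m/s^2 amplitude sinusoid sampled at 1 kHz for 10 s: rms = 1/sqrt(2),
# while vdv keeps growing with exposure duration (here (3/8 * 10)^(1/4)).
dt = 0.001
signal = [math.sin(2 * math.pi * 1.0 * t * dt) for t in range(10_000)]
print(f"rms = {rms(signal):.3f} m/s^2, vdv = {vdv(signal, dt):.3f} m/s^1.75")
```

    Doubling an isolated shock's amplitude multiplies its contribution to the r.m.s. by 4 but to the VDV integrand by 16, which is why VDV flags jolt-rich rides that the Basic method averages away.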

  14. Evaluation of Some Approved Basic Science and Technology Textbooks in Use in Junior Secondary Schools in Nigeria

    Science.gov (United States)

    Nwafor, C. E.; Umoke, C. C.

    2016-01-01

    This study was designed to evaluate the content adequacy and readability of approved basic science and technology textbooks in use in junior secondary schools in Nigeria. Eight research questions guided the study. The sample of the study consisted of six (6) approved basic science and technology textbooks, 30 Junior Secondary Schools randomly…

  15. An Evaluation of the Employee Training and Development Process for Nicolet Area Technical College's Basic Education Program.

    Science.gov (United States)

    Karl, Luis C.

    The adult basic education (ABE) program at Nicolet Area Technical College (NATC) evaluated its training and development (T&D) process for new basic education instructors. The study gathered monitoring and screening criteria that addressed valuable components for use in an instrument for validating effectiveness of the ABE program (T&D)…

  16. Causal Mediation Analysis: Warning! Assumptions Ahead

    Science.gov (United States)

    Keele, Luke

    2015-01-01

    In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…

  17. Pre-training evaluation and feedback improved skills retention of basic life support in medical students.

    Science.gov (United States)

    Li, Qi; Zhou, Rong-hua; Liu, Jin; Lin, Jing; Ma, Er-Li; Liang, Peng; Shi, Ting-wei; Fang, Li-qun; Xiao, Hong

    2013-09-01

    Pre-training evaluation and feedback have been shown to improve medical students' acquisition of basic life support (BLS) skills immediately following training. The impact of such training on BLS skills retention is unknown. This study was conducted to investigate the effects of pre-training evaluation and feedback on BLS skills retention in medical students. Three hundred and thirty 3rd-year medical students were randomized to two groups, the control group (C group) and the pre-training evaluation and feedback group (EF group). Each group was subdivided into four subgroups according to the time of the retention test (at 1, 3, 6, and 12 months following the initial training). After a 45-min BLS lecture, BLS skills were assessed (pre-training evaluation) in both groups before training. Following this, the C group received 45 min of training. In the EF group only, 15 min of group feedback corresponding to the students' performance in the pre-training evaluation was given, followed by 30 min of BLS training. BLS skills were assessed immediately after training (post-test) and at follow-up (retention test). No skills difference was observed between the two groups in the pre-training evaluation. Better skills acquisition was observed in the EF group (85.3 ± 7.3 vs. 68.1 ± 12.2 in the C group) at post-test (p<0.001). In all retention tests, better skills retention was observed in each EF subgroup compared with its paired C subgroup. Pre-training evaluation and feedback improved skills retention in the EF group for 12 months after the initial training, compared with the control group. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. An improved method for basic hydrolysis of isoflavone malonylglucosides and quality evaluation of Chinese soy materials.

    Science.gov (United States)

    Yuan, Dan; Pan, Yingni; Chen, Yan; Uno, Toshio; Zhang, Shaohui; Kano, Yoshihiro

    2008-01-01

    A basic hydrolysis procedure is often included in sample preparation in order to quantify the malonylglucosides or acetylglucosides of soy materials. However, it is preferable not to use NaOH as the hydrolytic reagent, considering the effect of its alkalinity on the subsequent injection into the HPLC and the low acidity of soy isoflavones. This paper presents an improved method for basic hydrolysis using ammonia as the hydrolytic reagent, without the additional neutralization step. Moreover, by means of HPLC and LC-MS methods, a systematic quality evaluation of natural soy materials from Chinese markets was established and discussed, covering soybeans, black soybeans and defatted soy flours, as well as the distribution of isoflavones in the seed coat, hypocotyl and cotyledon. The results indicate that the HPLC profiling patterns of the various isoflavone constituents of Chinese soybeans were similar to those of Japanese ones, and those of Chinese black soybeans were similar to those of American ones. The average content of total soy isoflavones in Chinese soybeans and black soybeans was a little lower than that of the American and Japanese ones. Additionally, a thorough analysis of Semen Sojae Praeparatum, a Chinese herbal medicine made from fermented black soybeans or soybeans, was done for the first time, and the characteristic of its HPLC profiling pattern is a higher content of isoflavone glucosides and aglycones than in natural soy materials.

  19. Evaluation of the basic mechanical and thermal properties of deep crystalline rocks

    International Nuclear Information System (INIS)

    Park, Byoung Yoon; Bae, Dae Seok; Kim, Chun Soo; Kim, Kyung Su; Koh, Young Kwon; Jeon, Seok Won

    2001-04-01

    This report provides the mechanical and thermal properties of granitic intact rocks obtained from the Deep Core Drilling Program, which is carried out as part of the assessment of deep geological environmental conditions. These data are the basic material properties of the core samples from the boreholes drilled up to 500 m depth at the Yusung and Kosung sites. These sites were selected based on the results of a preliminary site evaluation study. In this study, the mechanical properties include density, porosity, P-wave velocity, S-wave velocity, uniaxial compressive strength, Young's modulus, Poisson's ratio, tensile strength, and shear strength of fractures, and the thermal properties are thermal conductivity, thermal expansion coefficient, specific heat and so on. These properties were measured through laboratory tests, and the data are compared with existing test results for several domestic rocks.

  20. Dosimetric quantities and basic data for the evaluation of generalised derived limits

    International Nuclear Information System (INIS)

    Harrison, N.T.; Simmonds, J.R.

    1980-12-01

    The procedures, dosimetric quantities and basic data to be used for the evaluation of Generalised Derived Limits (GDLs) in environmental materials and of Generalised Derived Limits for discharges to atmosphere are described. The dosimetric considerations and the appropriate intake rates for both children and adults are discussed. In most situations in the nuclear industry and in those institutions, hospitals and laboratories which use relatively small quantities of radioactive material, the Generalised Derived Limits provide convenient reference levels against which the results of environmental monitoring can be compared, and atmospheric discharges can be assessed. They are intended for application when the environmental contamination or discharge to atmosphere is less than about 5% of the Generalised Derived Limit; above this level, it will usually be necessary to undertake a more detailed site-specific assessment. (author)
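    The 5% screening rule described above can be expressed as a simple comparison. The sketch below is a hedged illustration, not part of the report; the function name and numeric values are invented.

```python
# Hedged illustration (not from the report) of the screening rule for
# Generalised Derived Limits: up to ~5% of the GDL, the generalised limit
# is a convenient reference level; above it, a more detailed
# site-specific assessment is usually necessary. All values invented.

def screen_against_gdl(measured, gdl, action_fraction=0.05):
    """Compare a measured activity concentration with its GDL.

    measured and gdl must share units (e.g. Bq/kg).
    Returns the measured/GDL ratio and a screening decision.
    """
    ratio = measured / gdl
    if ratio <= action_fraction:
        return ratio, "GDL screening sufficient"
    return ratio, "site-specific assessment recommended"

# Hypothetical sample at 12 Bq/kg against a GDL of 1000 Bq/kg:
ratio, decision = screen_against_gdl(12.0, 1000.0)
print(f"{ratio:.1%}: {decision}")  # 1.2%: GDL screening sufficient
```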

  1. Evaluation of the basic mechanical and thermal properties of deep crystalline rocks

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byoung Yoon; Bae, Dae Seok; Kim, Chun Soo; Kim, Kyung Su; Koh, Young Kwon; Jeon, Seok Won

    2001-04-01

    This report provides the mechanical and thermal properties of intact granitic rocks obtained from the Deep Core Drilling Program, which was carried out as part of the assessment of deep geological environmental conditions. These data are the basic material properties of core samples from boreholes drilled to a depth of 500 m at the Yusung and Kosung sites, which were selected based on the results of a preliminary site evaluation study. The mechanical properties measured include density, porosity, P-wave velocity, S-wave velocity, uniaxial compressive strength, Young's modulus, Poisson's ratio, tensile strength, and the shear strength of fractures; the thermal properties include thermal conductivity, thermal expansion coefficient and specific heat. These properties were measured through laboratory tests, and the data are compared with existing test results for several domestic rocks.

  2. Investigation and basic evaluation for ultra-high burnup fuel cladding material

    International Nuclear Information System (INIS)

    Ioka, Ikuo; Nagase, Fumihisa; Futakawa, Masatoshi; Kiuchi, Kiyoshi

    2001-03-01

    For ultra-high burnup in power reactors, developing cladding with excellent durability is an essential problem. This report first summarizes the development history of Zircaloy and the approach to its safety assessment for high burnup fuel. Second, a basic evaluation and investigation were carried out on materials of high practicability in order to select candidate materials for ultra-high burnup fuel cladding. In addition, basic research on surface modification technology for the cladding was carried out from the viewpoint of adding a safety margin. The development history of zirconium alloys, including Zircaloy, shows that it is hard to estimate the results of in-pile tests from those of conventional corrosion (out-of-pile) tests. Therefore, the development of a new testing technology that can simulate the actual environment, and the elucidation of the corrosion-controlling factors of the cladding, are desired. In cases of RIA (Reactivity Initiated Accident) and LOCA (Loss of Coolant Accident), the loss of ductility of zirconium alloys under heavy irradiation and the boiling of high-temperature water appear to restrict the extension of fuel burnup. From a preliminary evaluation of high corrosion-resistance materials (austenitic stainless steels, iron- and nickel-base superalloys, titanium alloys, niobium alloys, vanadium alloys and ferritic stainless steels), stabilized austenitic stainless steels, with a capability for future improvement, and high-purity niobium alloys, with an expectation of good corrosion resistance, were selected as candidate cladding materials for ultra-high burnup fuel. (author)

  3. Basic study of water-cement ratio evaluation for fresh mortar using an ultrasonic measurement technique

    International Nuclear Information System (INIS)

    Hamza Haffies Ismail; Murata, Yorinobu

    2009-01-01

    The objective of this research is a basic study of an ultrasonic evaluation method for determining the water-cement ratio (W/C) of fresh concrete at an early age of hardening. The water-cement ratio is an important parameter for evaluating the strength of concrete in construction. Using an ultrasonic pulse measurement technique, variations in wave velocity and frequency with the age of the concrete during the hardening process could be evaluated. As test samples, fresh mortar with water-cement ratios of 40%, 50% and 60% was poured into cylindrical plastic moulds (φ100 mm x 50 mm). For the ultrasonic pulse wave transmission technique, two wide-band ultrasonic transducers were set on the top and bottom surfaces of the mortar, and measurements were taken from 10 minutes to 60 minutes after the water was added, at 5-minute intervals. As a result, it was confirmed that the wave velocity and center frequency changed with the age of the mortar, depending on the water-cement ratio. (author)
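    The velocity measurement underlying the transmission technique above reduces to specimen thickness divided by pulse transit time. A minimal sketch follows; the transit times are hypothetical, not the paper's data.

```python
# Sketch of the pulse-velocity calculation behind an ultrasonic
# transmission measurement: velocity = specimen thickness / transit
# time. The transit times below are hypothetical, not measured data.

def pulse_velocity(thickness_m, transit_time_s):
    """Ultrasonic pulse velocity (m/s) through a specimen."""
    return thickness_m / transit_time_s

# 50 mm thick mortar specimen (as in the mould above); velocity rises
# as the mortar hardens and stiffens.
v_early = pulse_velocity(0.050, 60e-6)  # 60 us transit -> ~833 m/s
v_late = pulse_velocity(0.050, 35e-6)   # 35 us transit -> ~1429 m/s
print(f"{v_early:.0f} m/s -> {v_late:.0f} m/s")
```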

  4. Thermal lattice benchmarks for testing basic evaluated data files, developed with MCNP4B

    International Nuclear Information System (INIS)

    Maucec, M.; Glumac, B.

    1996-01-01

    The development of unit-cell and full-core models of the DIMPLE S01A, TRX-1 and TRX-2 benchmark experiments, using the Monte Carlo computer code MCNP4B, is presented. Nuclear data from the ENDF/B-V and ENDF/B-VI cross-section libraries were used in the calculations. In addition, a comparison is made to results obtained with similar models and cross-section data from the EJ2-MCNPlib library (based on the JEF-2.2 evaluation) developed at IRC Petten, Netherlands. The results of the criticality calculation with the ENDF/B-VI data library, and the comparison to results obtained using the JEF-2.2 evaluation, confirm the MCNP4B full-core model of the DIMPLE reactor as a good benchmark for testing basic evaluated data files. On the other hand, the criticality results obtained using the TRX full-core models show less agreement with experiment. It is obvious that without additional data about the TRX geometry, our TRX models are not suitable as Monte Carlo benchmarks. (author)

  5. Medical students can learn the basic application, analytic, evaluative, and psychomotor skills of critical care medicine.

    Science.gov (United States)

    Rogers, P L; Jacob, H; Thomas, E A; Harwell, M; Willenkin, R L; Pinsky, M R

    2000-02-01

    To determine whether fourth-year medical students can learn the basic analytic, evaluative, and psychomotor skills needed to initially manage a critically ill patient, student learning was evaluated using a performance examination, the objective structured clinical examination (OSCE). Students were randomly assigned to one of two clinical scenarios before the elective and completed the other scenario after the elective, in a crossover design. The setting was five surgical intensive care units in a tertiary care university teaching hospital; the subjects were forty fourth-year medical students enrolled in the critical care medicine (CCM) elective. All students evaluated a live "simulated critically ill" patient, requested physiologic data from a nurse, ordered laboratory tests, received data in real time, and intervened as they deemed appropriate. Student performance of specific behavioral objectives was evaluated at five stations. They were expected to a) assess airway, breathing, and circulation in the appropriate sequence; b) prepare a manikin for intubation, obtain an acceptable airway on the manikin, demonstrate bag-mouth ventilation, and perform acceptable laryngoscopy and intubation; c) provide appropriate mechanical ventilator settings; d) manage hypotension; and e) request and interpret pulmonary artery data and initiate appropriate therapy. OSCEs were videotaped and reviewed by two faculty members masked to the time of examination, using a checklist of key behaviors. The primary outcome measure was the difference in examination score before and after the rotation; secondary outcomes included the difference in scores at each rotation. The mean pre-elective score was 57.0% +/- 8.3%, compared with 85.9% +/- 7.4% after the elective; students learned the basic application, analytic, evaluative, and psychomotor skills necessary to initially manage critically ill patients. After an appropriate 1-month CCM elective, students' thinking and application skills required to initially manage critically ill patients improved markedly, as demonstrated by an OSCE

  6. Major Assumptions of Mastery Learning.

    Science.gov (United States)

    Anderson, Lorin W.

    Mastery learning can be described as a set of group-based, individualized, teaching and learning strategies based on the premise that virtually all students can and will, in time, learn what the school has to teach. Inherent in this description are assumptions concerning the nature of schools, classroom instruction, and learners. According to the…

  7. The evaluation of first aid and basic life support training for the first year university students.

    Science.gov (United States)

    Altintaş, Kerim Hakan; Aslan, Dilek; Yildiz, Ali Naci; Subaşi, Nüket; Elçin, Melih; Odabaşi, Orhan; Bilir, Nazmi; Sayek, Iskender

    2005-02-01

    In Turkey, first aiders are few in number, yet they are needed in many settings, such as earthquakes. It was thought that training first-year university students in first aid and basic life support (FA-BLS) techniques would serve to increase the number of first aiders, and that another problem, the lack of first aid trainers, might be addressed by training medical students to perform this function. A project aimed at training first-year university students in FA-BLS was conducted at Hacettepe University. In the first phase, medical student first aid trainers (MeSFATs) were trained in FA-BLS training techniques by academic trainers; in the second phase, first-year university students were trained in FA-BLS techniques by these peer trainers under the academic trainers' supervision. The purpose of this study was to assess the participants' evaluation of this project and to propose a new program to increase the number of first aiders in the country. In total, 31 medical students were certified as MeSFATs, and 12 of these trained 40 first-year university students in FA-BLS. Various questionnaires were administered to the participants to determine their evaluation of the training program. Most of the participants and the authors considered the program to be successful and effective. This method may be used to increase the number of first aid trainers and first aiders in the community.

  8. Basic evaluation on nuclear characteristics of BWR high burnup MOX fuel and core

    International Nuclear Information System (INIS)

    Nagano, M.; Sakurai, S.; Yamaguchi, H.

    1997-01-01

    In Japan, MOX fuel will be used in existing commercial BWR cores as part of the reload fuel, with operability, safety and economy equivalent to UO2 fuel; the design concept should therefore be compatible with UO2 fuel design. High burnup UO2 fuels are being developed and commercialized step by step. The MOX fuel planned to be introduced around the year 2000 will use the same hardware as the UO2 8x8 array fuel developed in the second step of UO2 high burnup fuel development. The target discharge exposure of this MOX fuel is about 33 GWd/t, and the loading fraction of MOX fuel is approximately one-third in an equilibrium core. On the other hand, it has become necessary to minimize the number of MOX fuel assemblies and of plants utilizing MOX fuel, mainly because of fuel economy and the on-site handling and inspection costs. For these reasons, it was necessary to develop a high burnup MOX fuel containing more Pu, and a core with a large fraction of MOX fuel. The purpose of this study is to evaluate the basic nuclear fuel and core characteristics of BWR high burnup MOX fuel with a batch-average exposure of about 39.5 GWd/t using 9x9 array fuel. The loading fraction of MOX fuel in the core ranges from about 50% to 100%. The influence of Pu isotopic composition fluctuations and Pu-241 decay on the nuclear characteristics is also studied. (author). 3 refs, 5 figs, 3 tabs

  9. Evaluation of a School Building in Turkey According to the Basic Sustainable Design Criteria

    Science.gov (United States)

    Arslan, H. D.

    2017-08-01

    In Turkey, as in many other developing countries, the significance of sustainable education buildings has only recently become recognized, and sustainability has not been sufficiently addressed in laws and regulations. In this study, architectural sustainability and its basic design criteria are first explained; a selected standard-type primary school project in Turkey is then evaluated against these criteria. The use of standard-type projects for school buildings significantly limits the sustainability performance that can be expected from them. Standard-type projects do shorten the planning time, since their design process is independent of the site and they are repeated in various places with different characteristics. On the other hand, disadvantages such as overlooking the natural, physical and structural properties of the location severely restrict the sustainable design of the building. For sustainable buildings, factors such as the environment, land, climate, insolation and orientation should be taken into consideration from the beginning of the design stage. Implementation of standard-type projects can therefore be deemed inappropriate for sustainability.

  10. [Evaluative study of nursing consultation in the basic networks of Curitiba, Brazil].

    Science.gov (United States)

    da Silva, Sandra Honorato; Cubas, Marcia Regina; Fedalto, Maira Aparecida; da Silva, Sandra Regina; Limas, Thaís Cristina da Costa

    2010-03-01

    The implementation of the electronic health record in the basic networks of Curitiba enabled an advance in the implementation of the nursing consultation and the ICNPCH, whose modeling uses the ICNP axis structure and the ICNPCH list of actions. The objective of this study was to evaluate the nursing consultation from the perspective of productivity and assistance coverage. The study population was obtained from a secondary database of nursing consultations from April to June 2005, and the analysis was performed using the Datawarehouse and OLAP tools. Productivity was found to be 2.5 consultations per professional per day; professionals use 16% of their daily work time on this activity and up to 27% of their potential per month. The ICNPCH was used in 21% of the consultations. Coverage is 0.08 consultations per inhabitant, reaching 6% of the population. The nursing consultation makes it possible to characterize the nurses' role in health care and a new professional position capable of influencing the construction of public policies.

  11. Ex-post evaluation. Research independency of the basic science study of JAERI

    International Nuclear Information System (INIS)

    Yanagisawa, Kazuaki; Takahashi, Shoji

    2010-06-01

    Research independency is defined here as the continuity and development of a research field over the course of its history. The authors took three research fields as parameters for the ex-post evaluation, all belonging to the basic science studied at the Japan Atomic Energy Research Institute (JAERI). The first parameter was actinides, situated at the center of the research network from a socio-economic viewpoint; the second was positrons, situated at the periphery of the research network; and the third was neutrons, a field in competition with other research organizations in Japan. All three were supported and promoted financially by JAERI. The study covered the 25-year period from 1978 to 2002. INIS (International Nuclear Information System), operated by the International Atomic Energy Agency (IAEA), was used as the tool for the present bibliometric study. It was revealed that the important factors leading to sustained research independency were constant efforts to accomplish the mission, the education of successors to pass on explicit and tacit research findings, and the construction of intellectual networks with learned circles and industries collaborating with JAERI. These factors were quantitatively clarified. Conversely, the main factors that impeded the development of research independency were discontinuance of research caused by retirement, a change of post or occupation, or the unexpected death of core researchers. For the three parameters, the authors confirmed time-dependent stages of germination, development and decline of research independency, attributable to the interaction between the sustaining and impeding factors. For this kind of ex-post evaluation, the support of a field research laboratory was indispensable. (author)

  12. A dedicated breast-PET/CT scanner: Evaluation of basic performance characteristics.

    Science.gov (United States)

    Raylman, Raymond R; Van Kampen, Will; Stolin, Alexander V; Gong, Wenbo; Jaliparthi, Gangadhar; Martone, Peter F; Smith, Mark F; Sarment, David; Clinthorne, Neal H; Perna, Mark

    2018-04-01

    Application of advanced imaging techniques, such as PET and x-ray CT, can potentially improve detection of breast cancer. Unfortunately, both modalities have challenges in the detection of some lesions. The combination of the two techniques, however, could potentially lead to an overall improvement in diagnostic breast imaging. The purpose of this investigation is to test the basic performance of a new dedicated breast-PET/CT scanner. The PET component consists of a rotating pair of detectors; its performance was evaluated using the NEMA NU 4-2008 protocols. The CT component utilizes a pulsed x-ray source and a flat-panel detector mounted on the same gantry as the PET scanner; its performance was assessed using specialized phantoms. The radiation dose to a breast during CT imaging was explored by measurement of the free-in-air kerma and the air kerma at the center of a 16 cm-diameter PMMA cylinder. Finally, the combined capabilities of the system were demonstrated by imaging of a micro-hot-rod phantom. Overall, performance of the PET component is comparable to many preclinical and other dedicated breast-PET scanners. Its spatial resolution is 2.2 mm at 5 mm from the center of the scanner, using images created with the single-slice filtered-backprojection algorithm. Peak NECR is 24.6 kcps; peak sensitivity is 1.36%; the scatter fraction is 27%. Spatial resolution of the CT scanner is 1.1 lp/mm at 10% MTF. The free-in-air kerma is 2.33 mGy, while the PMMA-air kerma is 1.24 mGy. Finally, combined imaging of a micro-hot-rod phantom illustrated the potential utility of the dual-modality images produced by the system. The basic performance characteristics of the new dedicated breast-PET/CT scanner are good, demonstrating performance similar to that of current dedicated PET and CT scanners. The potential value of this system is the capability to produce combined dual-modality images that could improve detection of breast disease.
The next stage in development of this system

  13. Evaluation of a newly developed media-supported 4-step approach for basic life support training

    Directory of Open Access Journals (Sweden)

    Sopka Saša

    2012-07-01

    Full Text Available. Objective: The quality of external chest compressions (ECC) is of primary importance within basic life support (BLS). Recent guidelines delineate the so-called "4-step approach" for teaching practical skills within resuscitation training guided by a certified instructor. The objective of this study was to evaluate whether a "media-supported 4-step approach" for BLS training leads to practical performance equal to that of the standard 4-step approach. Materials and methods: After baseline testing, 220 laypersons were trained either using the widely accepted method for resuscitation training (the 4-step approach) or using a newly created "media-supported 4-step approach", both of equal duration. In the latter approach, steps 1 and 2 were delivered via a standardised self-produced podcast, which included all of the information regarding the BLS algorithm and resuscitation skills. Participants were tested on manikins in the same mock cardiac arrest single-rescuer scenario prior to the intervention, after one week and after six months with respect to ECC performance, and were surveyed about the approach. Results: Participants (age 23 ± 11; 69% female) reached comparable practical ECC performance in both groups, with no statistical difference. Even after six months, no difference was detected in the quality of the initial assessment algorithm or in the delay before initiation of CPR. Overall, at least 99% of the intervention group (n = 99; mean 1.5 ± 0.8 on a 6-point Likert scale: 1 = completely agree, 6 = completely disagree) agreed that the video provided an adequate introduction to BLS skills. Conclusions: The "media-supported 4-step approach" leads to practical ECC performance comparable to standard teaching, even with respect to retention of skills. Therefore, this approach could be useful in special educational settings where, for example, instructors' resources are sparse or large-group sessions

  14. Effect of basic fog of medical x-ray films on image quality and patient dose-method of evaluation and steps to control

    International Nuclear Information System (INIS)

    Bohra, Reena; Nair, C.P.R.; Jayalakshmi, V.; Govindarajan, K.N.; Bhatt, B.C.

    2003-01-01

    Unacceptable basic fog in medical x-ray films has recently been reported by many hospitals. This paper presents the effect of basic fog on radiographic quality parameters of films, such as sensitivity (speed), contrast and maximum density (Dmax). Several batches of general-purpose medical x-ray films from five different manufacturers were studied to evaluate batch-to-batch variation in basic fog, and the increase in basic fog with aging of the films was also evaluated. Reasons for the increased basic fog observed in the film-processing facilities of a few hospitals were analysed. The factors responsible for the increase in basic fog and the steps to control it are discussed.

  15. Triatominae Biochemistry Goes to School: Evaluation of a Novel Tool for Teaching Basic Biochemical Concepts of Chagas Disease Vectors

    Science.gov (United States)

    Cunha, Leonardo Rodrigues; de Oliveria Cudischevitch, Cecília; Carneiro, Alan Brito; Macedo, Gustavo Bartholomeu; Lannes, Denise; da Silva-Neto, Mário Alberto Cardoso

    2014-01-01

    We evaluate a new approach to teaching the basic biochemistry mechanisms that regulate the biology of Triatominae, major vectors of "Trypanosoma cruzi," the causative agent of Chagas disease. We have designed and used a comic book, "Carlos Chagas: 100 years after a hero's discovery" containing scientific information obtained by…

  16. [Implementation of the International Health Regulations in Cuba: evaluation of basic capacities of the health sector in selected provinces].

    Science.gov (United States)

    Gala, Ángela; Toledo, María Eugenia; Arias, Yanisnubia; Díaz González, Manuel; Alvarez Valdez, Angel Manuel; Estévez, Gonzalo; Abreu, Rolando Miyar; Flores, Gustavo Kourí

    2012-09-01

    Obtain baseline information on the status of the basic capacities of the health sector at the local, municipal, and provincial levels in order to facilitate identification of priorities and guide public policies that aim to comply with the requirements and capacities established in Annex 1A of the International Health Regulations 2005 (IHR-2005). A descriptive cross-sectional study was conducted by application of an instrument of evaluation of basic capacities referring to legal and institutional autonomy, the surveillance and research process, and the response to health emergencies in 36 entities involved in international sanitary control at the local, municipal, and provincial levels in the provinces of Havana, Cienfuegos, and Santiago de Cuba. The polyclinics and provincial centers of health and epidemiology in the three provinces had more than 75% of the basic capacities required. Twelve out of 36 units had implemented 50% of the legal and institutional framework. There was variable availability of routine surveillance and research, whereas the entities in Havana had more than 40% of the basic capacities in the area of events response. The provinces evaluated have integrated the basic capacities that will allow implementation of IHR-2005 within the period established by the World Health Organization. It is necessary to develop and establish effective action plans to consolidate surveillance as an essential activity of national and international security in terms of public health.

  17. Measuring Productivity Change without Neoclassical Assumptions: A Conceptual Analysis

    NARCIS (Netherlands)

    B.M. Balk (Bert)

    2008-01-01

    textabstractThe measurement of productivity change (or difference) is usually based on models that make use of strong assumptions such as competitive behaviour and constant returns to scale. This survey discusses the basics of productivity measurement and shows that one can dispense with most if not

  18. On testing the missing at random assumption

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    Most approaches to learning from incomplete data are based on the assumption that unobserved values are missing at random (mar). While the mar assumption, as such, is not testable, it can become testable in the context of other distributional assumptions, e.g. the naive Bayes assumption...

  19. Occupancy estimation and the closure assumption

    Science.gov (United States)

    Rota, Christopher T.; Fletcher, Robert J.; Dorazio, Robert M.; Betts, Matthew G.

    2009-01-01

    1. Recent advances in occupancy estimation that adjust for imperfect detection have provided substantial improvements over traditional approaches and are receiving considerable use in applied ecology. To estimate and adjust for detectability, occupancy modelling requires multiple surveys at a site and requires the assumption of 'closure' between surveys, i.e. no changes in occupancy between surveys. Violations of this assumption could bias parameter estimates; however, little work has assessed model sensitivity to violations of this assumption or how commonly such violations occur in nature. 2. We apply a modelling procedure that can test for closure to two avian point-count data sets in Montana and New Hampshire, USA, that exemplify time-scales at which closure is often assumed. These data sets illustrate different sampling designs that allow testing for closure but are currently rarely employed in field investigations. Using a simulation study, we then evaluate the sensitivity of parameter estimates to changes in site occupancy and evaluate a power analysis developed for sampling designs that is aimed at limiting the likelihood of closure. 3. Application of our approach to point-count data indicates that habitats may frequently be open to changes in site occupancy at time-scales typical of many occupancy investigations, with 71% and 100% of species investigated in Montana and New Hampshire respectively, showing violation of closure across time periods of 3 weeks and 8 days respectively. 4. Simulations suggest that models assuming closure are sensitive to changes in occupancy. Power analyses further suggest that the modelling procedure we apply can effectively test for closure. 5. Synthesis and applications. 
Our demonstration that sites may be open to changes in site occupancy over time-scales typical of many occupancy investigations, combined with the sensitivity of models to violations of the closure assumption, highlights the importance of properly addressing
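    The closure assumption examined in this record belongs to the standard single-season occupancy likelihood. A minimal sketch of that likelihood is shown below; the parameter values are illustrative only and are not the authors' data or code.

```python
# Minimal sketch (illustrative values) of the single-season occupancy
# likelihood that the closure assumption underpins: a site is occupied
# with probability psi, and each survey of an occupied site yields a
# detection with probability p. Closure means the occupancy status must
# not change between the repeated surveys.

def detection_history_prob(history, psi, p):
    """Probability of one site's detection history (tuple of 0/1)."""
    prob_given_occupied = 1.0
    for y in history:
        prob_given_occupied *= p if y else (1.0 - p)
    if any(history):
        # At least one detection: the site must be occupied.
        return psi * prob_given_occupied
    # All zeros: either occupied but never detected, or truly empty.
    return psi * prob_given_occupied + (1.0 - psi)

# Three surveys with history (0, 1, 0), psi = 0.6, p = 0.4:
print(detection_history_prob((0, 1, 0), 0.6, 0.4))  # ~0.0864
```

If occupancy changes between surveys (closure violated), this factorization no longer holds, which is why the estimates become sensitive to such changes.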

  20. [Kraepelin's basic nosologic postulates. An attempt at a critical evaluation of the later works of Kraepelin].

    Science.gov (United States)

    Hoff, P

    1988-01-01

    This study discusses three important papers published by Emil Kraepelin between 1918 and 1920. Kraepelin supported, in accordance with his teacher Wilhelm Wundt, the view of psychophysical parallelism as a basic principle for dealing with questions of mental illness. Kraepelin is often called a nosologist, but one must not forget that his nosology was not a static one, nor did he espouse any kind of dogmatism. Only when Kraepelin's basic positions are considered in this differentiated way can his enormous influence on very different parts of scientific psychiatry be understood.

  1. Evaluation of achiral templates with fluxional Brønsted basic substituents in enantioselective conjugate additions.

    Science.gov (United States)

    Adachi, Shinya; Takeda, Norihiko; Sibi, Mukund P

    2014-12-19

    Enantioselective conjugate addition of malononitrile to pyrazolidinone-derived enoates proceeds in excellent yields and high enantioselectivities. A comparison of fluxional substituents with and without a Brønsted basic site and their impact on selectivity is detailed. Molecular sieves as an additive were found to be essential to achieve high enantioselectivity.

  2. Curriculum Reform and School Performance: An Evaluation of the "New Basics."

    Science.gov (United States)

    Alexander, Karl L.; Pallas, Aaron M.

    This report examines whether a high school curriculum organized around the five "new basics" suggested by the National Commission on Excellence in Education is likely to enhance student achievement. Data from the ETS Growth Study reveals that completion of the core curriculum has sizable effects on senior-year test performance, even when…

  3. Evaluation of the Functional Pre-Basic-Training English-as-a-Second- Language Course

    Science.gov (United States)

    1985-02-01

    that reported in TRADOC data for BSEP literacy students. TRADOC data, presented in Table 7-6, indicate that only 47.8% of BSEP literacy students... appropriate. (If you are learning enough information about Basic Training, continue with question 17.) (mark only one answer) ~ too many lessons

  4. Basic principles of test-negative design in evaluating influenza vaccine effectiveness.

    Science.gov (United States)

    Fukushima, Wakaba; Hirota, Yoshio

    2017-08-24

    Based on the unique characteristics of influenza, the concept of "monitoring" influenza vaccine effectiveness (VE) across the seasons using the same observational study design has been developed. In recent years, there has been a growing number of influenza VE reports using the test-negative design, which can minimize both misclassification of diseases and confounding by health care-seeking behavior. Although the test-negative designs offer considerable advantages, there are some concerns that widespread use of the test-negative design without knowledge of the basic principles of epidemiology could produce invalid findings. In this article, we briefly review the basic concepts of the test-negative design with respect to classic study design such as cohort studies or case-control studies. We also mention selection bias, which may be of concern in some countries where rapid diagnostic testing is frequently used in routine clinical practices, as in Japan. Copyright © 2017. Published by Elsevier Ltd.

  5. Evaluation of mid-to-long term basic research for environmental restoration

    International Nuclear Information System (INIS)

    1989-09-01

    This document describes a long-term basic research program for the US Department of Energy (DOE) that complements departmental initiatives in waste management and site cleanup. The most important problems faced by DOE are environmental restoration of waste sites and cleanup of inactive facilities. Environmental restoration is defined in this report as characterization, assessment, remediation, and post-closure verification within the waste/environmental system at DOE sites. Remediation of inactive, contaminated waste-disposal sites is the largest and most expensive task facing DOE. Immobilization, isolation, separation, and destruction of waste, either aboveground or in situ, are difficult and costly tasks. Technologies for these tasks are primitive or do not exist. Departmental problems in the long term are being analyzed scientifically and research needs are being identified. When completed, the Office of Energy Research's (OER's) basic research plan will describe potential scientific research needs for universities, national laboratories, and others as a basis for research proposals to DOE. Extensive interaction with the scientific community is planned to further refine and prioritize research needs. Basic research within DOE is directed toward fundamental knowledge leading to the discovery of new scientific or engineering concepts and principles that may or may not have immediate specific technological applications. However, because DOE is a mission-oriented agency, basic research in DOE is strongly influenced by national energy and environmental policy and may be multidisciplinary in nature. Basic research will provide innovative concepts and the fundamental knowledge base that facilitates the development and application of new and emerging technologies. 41 refs., 5 figs., 9 tabs

  6. Programa saúde da família no brasil: um enfoque sobre seus pressupostos básicos, operacionalização e vantagens Family health program in brazil: a focus on its basic assumptions, performance and advantages

    Directory of Open Access Journals (Sweden)

    Milena Lopes Santana

    2001-07-01

    Full Text Available From its conception up to the present, many analyses of the Family Health Program (FHP) in Brazil have been published. Although still few in number, members of family health units, municipal health secretaries, mayors, Ministry of Health staff, as well as university teaching staff and renowned researchers in public health and related fields, have engaged in discussing and reflecting on this strategy. It therefore became appropriate to review the literature on the FHP, organized by topic: a historical retrospective of the period that preceded the FHP; its basic assumptions; operational strategies (the family as the focus of care, the principle of health surveillance, and the work of the multidisciplinary team); the different implementation models in Brazil; the factors that facilitated or hindered this implementation; and the advantages and disadvantages of the FHP within the Brazilian health system.

  7. Evaluation of the basic concepts of approaches for the coexistence of nuclear energy and people/local community

    International Nuclear Information System (INIS)

    Kondo, Shunsuke; Kuroki, Shinichi; Nakagiri, Yuko

    2007-01-01

    In November 2007, the Policy Evaluation Committee compiled a report evaluating the basic concepts of approaches to the coexistence of nuclear energy and people/local communities, as specified in the Framework for Nuclear Energy Policy. The report states that the 'concerned administrative bodies are carrying out measures related to the coexistence of nuclear energy and people/local communities in line with these basic concepts' and summarizes fifteen proposals conducive to the betterment and improvement of these measures, classified as 1) securing transparency and promoting mutual understanding with the public, 2) developing and enriching learning opportunities and public participation, 3) the relationship between the government and local governments, and 4) coexistence with local residents. The Japan Atomic Energy Commission (JAEC) considers this report to be reasonable. This article presents an overview of this activity. (T. Tanaka)

  8. National Uranium Resource Evaluation Program. Hydrogeochemical and Stream Sediment Reconnaissance Basic Data Reports Computer Program Requests Manual

    International Nuclear Information System (INIS)

    1980-01-01

    This manual is intended to aid those who are unfamiliar with ordering computer output for verification and preparation of Uranium Resource Evaluation (URE) Project reconnaissance basic data reports. The manual is also intended to help standardize the procedures for preparing the reports. Each section describes a program or group of related programs. The sections are divided into three parts: Purpose, Request Forms, and Requested Information

  9. Basic Principles and Utilization Possibilities of Ultrasonic Phased Array in Material Nondestructive Evaluation

    Directory of Open Access Journals (Sweden)

    Dagmar Faktorova

    2004-01-01

    Full Text Available The paper deals with the basic principles of operation and the utilization possibilities of phased arrays (PA) in materials nondestructive testing (NDT). The first part describes the PA arrangement modes, which enable the ultrasonic beam to be generated, focused and steered. The second part describes the operation of the electromagnetic acoustic transducer PA. The last part describes the utilization of PA in the nondestructive testing of conductive materials and the advantages of PA in the NDT of inhomogeneous materials.

  10. Towards New Probabilistic Assumptions in Business Intelligence

    Directory of Open Access Journals (Sweden)

    Schumann Andrew

    2015-01-01

    Full Text Available One of the main assumptions of mathematical tools in science is the idea of the measurability and additivity of reality. For discovering the physical universe, additive measures such as mass, force, energy and temperature are used. Economics and conventional business intelligence try to continue this empiricist tradition, and in their statistical and econometric tools they appeal only to the measurable aspects of reality. However, many important variables of economic systems cannot be observable and additive in principle. These variables can be called symbolic values or symbolic meanings and are studied within symbolic interactionism, the theory developed from the work of George Herbert Mead and Herbert Blumer. In the statistical and econometric tools of business intelligence we accept only phenomena with causal connections measured by additive measures. In this paper we show that in the social world we deal with symbolic interactions which can be studied by non-additive labels (symbolic meanings or symbolic values). For accepting the variety of such phenomena we should avoid the additivity of basic labels and construct a new probabilistic method for business intelligence based on non-Archimedean probabilities.

  11. An Agent-Based Approach for Evaluating Basic Design Options of Management Accounting Systems

    Directory of Open Access Journals (Sweden)

    Friederike Wall

    2013-12-01

    Full Text Available This paper investigates the effectiveness of reducing errors in management accounting systems with respect to organizational performance. In particular, different basic design options for management accounting systems, concerning how to improve the information base by measurements of actual values, are analyzed in different organizational contexts. The paper applies an agent-based simulation based on the idea of NK fitness landscapes. The results provide broad, but not universal, support for the conventional wisdom that lower inaccuracies of accounting information lead to more effective adaptation processes. Furthermore, the results indicate that the effectiveness of improving the management accounting system subtly interferes with the complexity of the interactions within the organization and the coordination mode applied.
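
    As a hedged illustration of the simulation machinery the paper builds on, the following is a minimal NK fitness landscape with single-bit-flip hill climbing; all parameters are illustrative and the paper's actual model of accounting error is not reproduced here:

```python
import random

def nk_landscape(n, k, seed=0):
    """Minimal NK fitness landscape: each of the N bits contributes a
    fitness component that depends on itself and K neighbouring bits."""
    rng = random.Random(seed)
    tables = [{} for _ in range(n)]

    def fitness(bits):
        total = 0.0
        for i in range(n):
            # Neighbourhood: bit i plus the next K bits (cyclically).
            key = tuple(bits[(i + j) % n] for j in range(k + 1))
            if key not in tables[i]:
                tables[i][key] = rng.random()  # lazily drawn, then fixed
            total += tables[i][key]
        return total / n
    return fitness

f = nk_landscape(n=6, k=2)

# Simple adaptive search: hill climbing by single-bit flips.
state = [0] * 6
improved = True
while improved:
    improved = False
    for i in range(6):
        neighbour = state.copy()
        neighbour[i] ^= 1
        if f(neighbour) > f(state):
            state, improved = neighbour, True
print(state, round(f(state), 3))
```

    Higher K makes the landscape more rugged, which is how this family of models captures the interaction complexity the abstract refers to.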

  12. Basic and advanced echocardiographic evaluation of myocardial dysfunction in sepsis and septic shock.

    Science.gov (United States)

    Vallabhajosyula, S; Pruthi, S; Shah, S; Wiley, B M; Mankad, S V; Jentzer, J C

    2018-01-01

    Sepsis continues to be a leading cause of mortality and morbidity in the intensive care unit. Cardiovascular dysfunction in sepsis is associated with worse short- and long-term outcomes. Sepsis-related myocardial dysfunction is noted in 20%-65% of these patients and manifests as isolated or combined left or right ventricular systolic or diastolic dysfunction. Echocardiography is the most commonly used modality for the diagnosis of sepsis-related myocardial dysfunction. With the increasing use of ultrasonography in the intensive care unit, there is a renewed interest in sepsis-related myocardial dysfunction. This review summarises the current scope of literature focused on sepsis-related myocardial dysfunction and highlights the use of basic and advanced echocardiographic techniques for the diagnosis of sepsis-related myocardial dysfunction and the management of sepsis and septic shock.

  13. The Immoral Assumption Effect: Moralization Drives Negative Trait Attributions.

    Science.gov (United States)

    Meindl, Peter; Johnson, Kate M; Graham, Jesse

    2016-04-01

    Jumping to negative conclusions about other people's traits is judged as morally bad by many people. Despite this, across six experiments (total N = 2,151), we find that multiple types of moral evaluations, even evaluations related to open-mindedness, tolerance, and compassion, play a causal role in these potentially pernicious trait assumptions. Our results also indicate that moralization affects negative, but not positive, trait assumptions, and that the effect of morality on negative assumptions cannot be explained merely by people's general (nonmoral) preferences or other factors that distinguish moral and nonmoral traits, such as controllability or desirability. Together, these results suggest that one of the more destructive human tendencies, making negative assumptions about others, can be caused by the better angels of our nature. © 2016 by the Society for Personality and Social Psychology, Inc.

  14. Measurement and Basic Physics Committee of the U.S. Cross-Section Evaluation Working Group annual report 1997

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L. [ed.] [comp.] [Argonne National Lab., IL (United States); McLane, V. [ed.] [comp.] [Brookhaven National Lab., Upton, NY (United States)

    1997-10-01

    The Cross-Section Evaluation Working Group (CSEWG) is a long-standing committee charged with responsibility for organizing and overseeing the US cross-section evaluation effort. Its main product is the official US evaluated nuclear data file, ENDF. In 1992 CSEWG added the Measurements Committee to its list of standing committees and subcommittees. This action was based on a recognition of the importance of experimental data in the evaluation process as well as the realization that measurement activities in the US were declining at an alarming rate and needed considerable encouragement to avoid the loss of this resource. The mission of the Committee is to maintain contact with experimentalists in the US and to encourage them to contribute to the national nuclear data effort. Improved communication and the facilitation of collaborative activities are among the tools employed in achieving this objective. In 1994 the Committee was given an additional mission, namely, to serve as an interface between the applied interests represented in CSEWG and the basic nuclear science community. Accordingly, its name was changed to the Measurement and Basic Physics Committee. The present annual report is the third such document issued by the Committee. It contains voluntary contributions from several laboratories in the US. Their contributions were submitted to the Chairman for compilation and editing.

  15. Measurement and basic physics committee of the U.S. cross-section evaluation working group, annual report 1997

    International Nuclear Information System (INIS)

    Smith, D.L.; McLane, V.

    1998-01-01

    The Cross-Section Evaluation Working Group (CSEWG) is a long-standing committee charged with responsibility for organizing and overseeing the US cross-section evaluation effort. Its main product is the official US evaluated nuclear data file, ENDF. The current version of this file is Version VI. All evaluations included in ENDF, as well as periodic modifications and updates to the file, are reviewed and approved by CSEWG and issued by the US Nuclear Data Center, Brookhaven National Laboratory. CSEWG is comprised of volunteers from the US nuclear data community who possess expertise in evaluation methodologies and who collectively have been responsible for producing most of the evaluations included in ENDF. In 1992 CSEWG added the Measurements Committee to its list of standing committees and subcommittees. This action was based on a recognition of the importance of experimental data in the evaluation process as well as the realization that measurement activities in the US were declining at an alarming rate and needed considerable encouragement to avoid the loss of this resource. The mission of the Committee is to maintain contact with experimentalists in the US and to encourage them to contribute to the national nuclear data effort. Improved communication and the facilitation of collaborative activities are among the tools employed in achieving this objective. In 1994 the Committee was given an additional mission, namely, to serve as an interface between the applied interests represented in CSEWG and the basic nuclear science community. Accordingly, its name was changed to the Measurement and Basic Physics Committee. The present annual report is the third such document issued by the Committee. It contains voluntary contributions from several laboratories in the US. Their contributions were submitted to the Chairman for compilation and editing

  16. MEASUREMENT AND BASIC PHYSICS COMMITTEE OF THE U.S. CROSS-SECTION EVALUATION WORKING GROUP, ANNUAL REPORT 1997

    Energy Technology Data Exchange (ETDEWEB)

    SMITH,D.L.; MCLANE,V.

    1998-10-20

    The Cross-Section Evaluation Working Group (CSEWG) is a long-standing committee charged with responsibility for organizing and overseeing the US cross-section evaluation effort. Its main product is the official US evaluated nuclear data file, ENDF. The current version of this file is Version VI. All evaluations included in ENDF, as well as periodic modifications and updates to the file, are reviewed and approved by CSEWG and issued by the US Nuclear Data Center, Brookhaven National Laboratory. CSEWG is comprised of volunteers from the US nuclear data community who possess expertise in evaluation methodologies and who collectively have been responsible for producing most of the evaluations included in ENDF. In 1992 CSEWG added the Measurements Committee to its list of standing committees and subcommittees. This action was based on a recognition of the importance of experimental data in the evaluation process as well as the realization that measurement activities in the US were declining at an alarming rate and needed considerable encouragement to avoid the loss of this resource. The mission of the Committee is to maintain contact with experimentalists in the US and to encourage them to contribute to the national nuclear data effort. Improved communication and the facilitation of collaborative activities are among the tools employed in achieving this objective. In 1994 the Committee was given an additional mission, namely, to serve as an interface between the applied interests represented in CSEWG and the basic nuclear science community. Accordingly, its name was changed to the Measurement and Basic Physics Committee. The present annual report is the third such document issued by the Committee. It contains voluntary contributions from several laboratories in the US. Their contributions were submitted to the Chairman for compilation and editing.

  17. Abreu System - a dosimetric system to evaluate basic functioning parameters of roentgenography equipment

    International Nuclear Information System (INIS)

    Feital, J.C.

    1988-01-01

    This work presents a system for evaluating the half-value thickness of an X-ray beam. The system consists of a card with an aluminium filter, thermoluminescent dosemeters of lithium fluoride and radiographic films. (C.G.C.) [pt]
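
    Assuming simple exponential attenuation, the half-value thickness can be estimated from two dose readings; the sketch below uses hypothetical values, not measurements from the system described:

```python
import math

# Hypothetical readings (arbitrary units), assuming exponential attenuation.
dose_open = 100.0      # reading with no added filtration
dose_filtered = 60.0   # reading behind t mm of aluminium
t_mm = 2.0

# Effective linear attenuation coefficient from the two readings.
mu = math.log(dose_open / dose_filtered) / t_mm
# Half-value thickness: the filter thickness that halves the dose.
hvl = math.log(2) / mu
print(f"HVL ≈ {hvl:.2f} mm Al")  # → HVL ≈ 2.71 mm Al
```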

  18. Basic and clinical evaluation of our newly developed radiographic orthopantomography in studying the TMJ

    Energy Technology Data Exchange (ETDEWEB)

    Takagi, Sumio

    1987-03-01

    Temporomandibular arthrosis has been reported in many fields: oral surgery, dental prosthetics, otolaryngology, radiology, psychosomatic medicine, etc. Roentgenographic examination plays an important role in the diagnosis of temporomandibular arthrosis. There have been studies on various types of roentgenography of the temporomandibular joint, but because its anatomical morphology is so complicated, it is difficult to obtain an adequate roentgenograph, and no definite method of roentgenography has been clearly established. We developed a new method of roentgenography that takes advantage of the characteristics of orthopantomography, and analyzed our results statistically. The following results were obtained: 1) As conditions for roentgenography, the optimal tube voltage was 60-75 kVp, but this varied according to sex and age; the optimal tube current was 15 mA. 2) The optimal position of the head in the anteroposterior direction was 10 mm in front of the standard point, and in the vertical direction it was the central part of the film. 3) The optimal position of the head was reached when the OM line was horizontal. 4) On the basis of the data obtained from these basic experiments, images from standard roentgenography were compared statistically with those from the modified Schuller method in patients with temporomandibular arthrosis. There was a significant difference between the images. These results demonstrate that this method of roentgenography may be useful in standardizing X-ray procedures and provides highly reproducible images of temporomandibular arthrosis, suggesting that it may be adequate for more definitive diagnoses.

  19. AN EVALUATION OF THE BASIC CHARACTERISTICS OF A PLASTIC SCINTILLATING FIBRE DETECTOR IN CT RADIATION FIELDS.

    Science.gov (United States)

    Terasaki, Kento; Fujibuchi, Toshioh; Toyoda, Takatoshi; Yoshida, Yutaka; Akasaka, Tsutomu; Nohtomi, Akihiro; Morishita, Junji

    2016-12-01

    The ionisation chamber for computed tomography (CT) is the instrument most commonly used to measure the computed tomography dose index. However, it has been reported that the 10 cm effective detection length of the ionisation chamber is insufficient due to the extent of the dose distribution outside the chamber. The purpose of this study was to estimate the basic characteristics of a plastic scintillating fibre (PSF) detector with a long detection length of 50 cm in CT radiation fields. The authors investigated position dependence using diagnostic X-ray equipment, and energy, dose-rate and slice-thickness dependencies using an X-ray CT system. In the dose-rate dependence study, the PSF detector outputs piled up at a count rate of 10 000 counts ms-1. With calibration, this detector may be useful as a CT dosemeter with a long detection length, except for measurements at high dose rates. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
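
    Pile-up at high count rates is commonly modelled with a dead-time correction. As a hedged illustration (the dead time below is an assumed value, not a measured property of the PSF detector in the study), the non-paralysable model gives:

```python
# Non-paralysable dead-time correction: n = m / (1 - m * tau),
# where m is the measured rate and n the true rate.
measured_rate = 10000.0   # counts per ms, where the study observed pile-up
tau_ms = 5e-5             # assumed dead time per count, in ms (hypothetical)

true_rate = measured_rate / (1.0 - measured_rate * tau_ms)
print(f"corrected rate ≈ {true_rate:.0f} counts/ms")  # → 20000 counts/ms
```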

  20. A basic and clinical evaluation of our newly developed radiographic orthopantomography in studying the TMJ

    International Nuclear Information System (INIS)

    Takagi, Sumio

    1987-01-01

    Temporomandibular arthrosis has been reported in many fields: oral surgery, dental prosthetics, otolaryngology, radiology, psychosomatic medicine, etc. Roentgenographic examination plays an important role in the diagnosis of temporomandibular arthrosis. There have been studies on various types of roentgenography of the temporomandibular joint, but because its anatomical morphology is so complicated, it is difficult to obtain an adequate roentgenograph, and no definite method of roentgenography has been clearly established. We developed a new method of roentgenography that takes advantage of the characteristics of orthopantomography, and analyzed our results statistically. The following results were obtained: 1) As conditions for roentgenography, the optimal tube voltage was 60-75 kVp, but this varied according to sex and age; the optimal tube current was 15 mA. 2) The optimal position of the head in the anteroposterior direction was 10 mm in front of the standard point, and in the vertical direction it was the central part of the film. 3) The optimal position of the head was reached when the OM line was horizontal. 4) On the basis of the data obtained from these basic experiments, images from standard roentgenography were compared statistically with those from the modified Schuller method in patients with temporomandibular arthrosis. There was a significant difference between the images (P < 0.01). These results demonstrate that this method of roentgenography may be useful in standardizing X-ray procedures and provides highly reproducible images of temporomandibular arthrosis, suggesting that it may be adequate for more definitive diagnoses. (author)

  1. Basic design and economical evaluation of Gas Turbine High Temperature Reactor 300 (GTHTR300)

    International Nuclear Information System (INIS)

    Kazuhiko, Kunitomi; Shusaku, Shiozawa; Xing, Yan

    2007-01-01

    The High Temperature Gas-cooled Reactor (HTGR) combined with a direct-cycle gas turbine offers one of the most promising nuclear electricity generation options after 2010. The Japan Atomic Energy Agency has been engaged in the basic design and development of the Gas Turbine High Temperature Reactor 300 (GTHTR300) since 2003. Costs of capital, fuel, and operation and maintenance have been estimated. The capital cost of the GTHTR300 is lower than that of the existing light water reactor (LWR) because the generation efficiency is considerably higher, whereas the construction cost is lower owing to the design simplicity of the gas turbine power conversion unit and the reactor safety system. The fuel cost is shown to equal that of the LWR. The operation and maintenance cost has a slight advantage due to the use of chemically inert helium coolant. In sum, the cost of electricity for the GTHTR300 is estimated to be below 3.3 US cents/kWh (4 yen/kWh), about two-thirds of that of current LWRs in Japan. The results confirm that the net power generation cost of the GTHTR300 is much lower than that of the LWR, indicating that a GTHTR300 plant consisting of small-scale reactor units can be economically competitive with the latest large-scale LWR. (authors)
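
    The cost-of-electricity comparison rests on summing annualized capital, fuel, and operation-and-maintenance components per kWh generated. A back-of-envelope sketch with entirely hypothetical inputs (the paper's 3.3 cents/kWh figure comes from a detailed design study, not from these numbers):

```python
# All inputs below are illustrative assumptions, not values from the paper.
capital_per_kw = 1800.0       # overnight capital cost, $/kW
fixed_charge_rate = 0.08      # annualisation factor, 1/yr
capacity_factor = 0.90
fuel_cents_per_kwh = 0.6
om_cents_per_kwh = 0.5

kwh_per_kw_year = 8760 * capacity_factor
capital_cents = capital_per_kw * fixed_charge_rate / kwh_per_kw_year * 100
total_cents = capital_cents + fuel_cents_per_kwh + om_cents_per_kwh
print(f"cost of electricity ≈ {total_cents:.2f} cents/kWh")  # → 2.93 cents/kWh
```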

  2. [Preparation and evaluation of stationary phase of high performance liquid chromatography for the separation of basic solutes].

    Science.gov (United States)

    Wang, P; Wang, J; Cong, R; Dong, B

    1997-05-01

    A bonded phase for high performance liquid chromatography (HPLC) has been prepared by a new reaction between silica and a silicon ether. The ether was synthesized from alkylchlorosilane and pentane-2,4-dione in the presence of imidazole under inert conditions, using anhydrous tetrahydrofuran as solvent. The bonded phase thus obtained was characterized by elemental analysis, diffuse reflectance infrared Fourier transform (DRIFT) spectroscopy and HPLC evaluation. The carbon content was 9.4% and the surface coverage almost attained 3.0 micromol/m2 without end-capping. The silanol absorption peaks of the product could not be observed in the DRIFT spectrum, which revealed that the silanization reaction proceeded thoroughly. Basic solutes, such as aniline, o-toluidine, p-toluidine, N,N-dimethylaniline and pyridine, were used as probe solutes to examine their interaction with the residual silanols on the surface of the products. No buffer or salt was used in the mobile phase for these experiments. In comparison with an acidic solute such as phenol, basic aniline eluted in front of phenol, and the ratio of the asymmetry of the aniline peak to that of the phenol peak was 1.1. Furthermore, the relative k' value of p-toluidine to that of o-toluidine was also 1.1. All the results showed that the stationary phase has good quality and reproducibility and can be used for the efficient separation of basic solutes.

  3. Dominant region: a basic feature for group motion analysis and its application to teamwork evaluation in soccer games

    Science.gov (United States)

    Taki, Tsuyoshi; Hasegawa, Jun-ichi

    1998-12-01

    This paper proposes a basic feature for the quantitative measurement and evaluation of the group behavior of persons. This feature, called the 'dominant region', is a kind of sphere of influence for each person in the group: the region where that person can arrive earlier than any other person. It can be formulated as a Voronoi region modified by replacing the distance function with a time function, calculated from a computational model of the person's moving ability. As an application of the dominant region, we present a motion analysis system for soccer games. The purpose of this system is to evaluate teamwork quantitatively based on the movement of all the players in the game. Experiments using motion pictures of actual games suggest that the proposed feature is useful for the measurement and evaluation of group behavior in team sports. This basic feature may also be applied to other team ball games, such as American football, basketball, handball and water polo.
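
    Under the simplifying assumption that each player moves at a constant top speed (the paper uses a richer motion model), the dominant region reduces to a nearest-arrival-time partition of the pitch, sketched here with hypothetical player data:

```python
import math

# Hypothetical players: (position in metres, assumed constant top speed m/s).
players = {"A": ((10.0, 20.0), 7.0),
           "B": ((40.0, 25.0), 8.5)}

def arrival_time(pos, target, speed):
    """Time to reach `target` at constant speed (simplified motion model)."""
    return math.dist(pos, target) / speed

def dominant_player(target):
    """Player who can reach `target` before anyone else."""
    return min(players,
               key=lambda p: arrival_time(players[p][0], target, players[p][1]))

# Classify a coarse grid of pitch locations by the dominating player;
# each cell of this partition belongs to exactly one player's dominant region.
grid = {(x, y): dominant_player((float(x), float(y)))
        for x in range(0, 60, 10) for y in range(0, 40, 10)}
print(grid[(0, 0)], grid[(50, 30)])  # → A B
```

    With equal speeds everywhere, this partition is exactly the ordinary Voronoi diagram; the time function is what deforms it into the dominant region.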

  4. Institutional Evaluation in Basic Education Schools: a participation-oriented approach

    Directory of Open Access Journals (Sweden)

    Adilson Dalben

    2010-07-01

    Full Text Available The present work is a synthesis of my Master's Degree dissertation, in which I tried to identify the factors that have influenced the implementation of Participatory Institutional Evaluation in a public primary school on the periphery of Campinas, a Brazilian municipality in the state of Sao Paulo. Based on the concept of negotiated quality, the enactment of the proposed institutional evaluation model required the constitution of an Evaluation Commission with representatives of diverse actors of the school community. The research consisted of a qualitative case study, using data collected from October 2005 to December 2006, when I entered the school environment in order to support the school in developing its evaluation process. Four categories of analysis were constructed to reflect on the school's political-pedagogical project, the educational culture of the school principal, the nuances of participation and the potential of participatory institutional evaluation. The results acknowledge the potential of participatory institutional evaluation as a means for democratic management and for technical and political capacity building at the school level, aimed at overcoming problems faced by the school.

  5. Design and evaluation of basic standard encryption algorithm modules using nanosized complementary metal oxide semiconductor molecular circuits

    Science.gov (United States)

    Masoumi, Massoud; Raissi, Farshid; Ahmadian, Mahmoud; Keshavarzi, Parviz

    2006-01-01

    We are proposing that the recently proposed semiconductor-nanowire-molecular architecture (CMOL) is an optimum platform to realize encryption algorithms. The basic modules for the advanced encryption standard algorithm (Rijndael) have been designed using CMOL architecture. The performance of this design has been evaluated with respect to chip area and speed. It is observed that CMOL provides considerable improvement over implementation with regular CMOS architecture even with a 20% defect rate. Pseudo-optimum gate placement and routing are provided for Rijndael building blocks and the possibility of designing high speed, attack tolerant and long key encryptions are discussed.

  6. Development of Basic Key Technologies for Gen IV SFR Safety Evaluation

    International Nuclear Information System (INIS)

    Jeong, Hae Yong; Kwon, Young Min; Kim, Tae Woon; Park, Soo Yong; Suk, Soo Dong; Lee, Kwi Lim; Lee, Yong Bum; Chang, Won Pyo; Ha, Kwi Seok; Hahn, Sang Hoon

    2010-07-01

    Safety issues and design requirements on control rod worth were identified through the evaluation of safety design characteristics and the preliminary safety evaluation. These results will be taken into account in the conceptual design studies of the demonstration reactor in the next stage. The Level-1 PSA has been performed and a quantitative CDF value was produced for the design selected from the several candidates. The inherent safety characteristics of the selected design were evaluated through the DBE and ATWS analyses. A surrogate material for TRU has been selected which is applicable to the study of liquidus/solidus temperature tests for metallic fuel containing TRU. A methodology for regression analysis with the surrogate material has been developed and valuable data on metal fuel liquidus/solidus temperatures have been measured. A simple mechanistic model describing the bending of subassemblies has been formulated based on foreign test data and existing models. Its applicability has been evaluated for the Phenix design. New criteria for core damage for the SFR PSA were identified. The list of initiating events, the system response event tree, and the core response event tree, which constitute a PSA methodology for an SFR, have been introduced. By developing the SFR PIRT, phenomenological model features that have to be satisfied in a safety code were defined, and the PIRT results were applied to the design of the PDRC test facility. Bases for a safety evaluation methodology for the SFR DBEs have also been prepared. A draft version of the topical report on the code for local fault analysis has been completed. Since 2007, the MARS-LMR code has been developed and assessments for model validation with test data from the EBR-II and Phenix reactors have continued. The code has been applied to the evaluation of the passive safety of a conceptual design of a Gen IV SFR

  7. Development of composite calcium hydroxide sorbent in mechanical operations and evaluation of its basic sorption properties

    Directory of Open Access Journals (Sweden)

    Gara Paweł

    2017-01-01

    Full Text Available This article presents the results of research on the possibility of obtaining a composite calcium hydroxide sorbent, containing additional compounds of Al, Mg and Fe, in a two-step granulation process, together with textural and sorption studies of the product. For this purpose, attempts were undertaken to compact commercial calcium hydroxide powder with six additives in a laboratory roll press. The resulting compacts were crushed and sieved in order to achieve the assumed sieve fraction. Based on the obtained results, basic parameters of the process of forming the composite sorbent have been determined. Both the selected composite sorbent fractions and the additives were subsequently subjected to textural studies (determination of the specific surface area and porosity) and sorption capacity measurements. In addition, for better interpretation of the results, thermogravimetric studies were carried out for both the additives and the composite sorbents, and the grain size distribution of the additives was determined. The results of the physicochemical tests of the obtained composite sorbents were compared with analogous results for a fine-grained hydroxide sorbent without additives and a carbonate sorbent. The presented results showed that in a two-step granulation process it is possible to obtain granular Ca(OH)2 sorbent, as well as composite sorbents possessing better SO2 sorption capacity in comparison to powder Ca(OH)2 and/or to the calcium carbonate sorbent. This can be attributed to the combination of the sorbent's capability for appropriate thermal decomposition and the formation of a group of pores in the range of 0.07-0.3 microns.

  8. Emergency department evaluation of ischemic stroke and TIA: the BASIC Project.

    Science.gov (United States)

    Brown, D L; Lisabeth, L D; Garcia, N M; Smith, M A; Morgenstern, L B

    2004-12-28

    To identify demographic and clinical variables associated with emergency department (ED) practices in a community-based acute stroke study. By both active and passive surveillance, the authors identified cerebrovascular disease cases in Nueces County, TX, as part of the Brain Attack Surveillance in Corpus Christi (BASIC) Project, a population-based stroke surveillance study, between January 1, 2000, and December 31, 2002. With use of multivariable logistic regression, variables independently associated with three separate outcomes were sought: hospital admission, brain imaging in the ED, and neurologist consultation in the ED. Prespecified variables included age, sex, ethnicity, insurance status, NIH Stroke Scale score, type of stroke (ischemic stroke or TIA), vascular risk factors, and symptom presentation variables. The percentage use of recombinant tissue plasminogen activator (rt-PA) was calculated. A total of 941 Mexican Americans (MAs) and 855 non-Hispanic whites (NHWs) were seen for ischemic stroke (66%) or TIA (34%). Only 8% of patients received an in-person neurology consultation in the ED, and 12% did not receive any head imaging. TIA was negatively associated with neurology consultations compared with completed stroke (odds ratio [OR] 0.35 [95% CI 0.21 to 0.57]). TIA (OR 0.14 [0.10 to 0.19]) and sensory symptoms (OR 0.59 [0.44 to 0.81]) were also negatively associated with hospital admission. MAs (OR 0.58 [0.35 to 0.98]) were less likely to have neurology consultations in the ED than NHWs. Only 1.7% of patients were treated with rt-PA. Neurologists are seldom involved with acute cerebrovascular care in the emergency department (ED), especially in patients with TIA. Greater neurologist involvement may improve acute stroke diagnosis and treatment efforts in the ED.
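    The odds ratios above come from multivariable logistic regression; as a hedged illustration of where such figures come from, the sketch below computes a crude (unadjusted) odds ratio and its Woolf 95% CI from a hypothetical 2×2 table. The counts are invented for illustration, not taken from the BASIC data.

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio and Woolf 95% CI from a 2x2 table:
        a = exposed cases, b = exposed non-cases,
        c = unexposed cases, d = unexposed non-cases."""
        or_ = (a * d) / (b * c)
        se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lo = math.exp(math.log(or_) - z * se_log)
        hi = math.exp(math.log(or_) + z * se_log)
        return or_, lo, hi

    # Hypothetical counts: 30 of 100 TIA patients vs 60 of 100 stroke
    # patients received a neurology consultation.
    or_, lo, hi = odds_ratio_ci(30, 70, 60, 40)
    print(round(or_, 2), round(lo, 2), round(hi, 2))
    ```

    An OR below 1 with a CI excluding 1, as in the abstract's TIA result, indicates a significantly lower likelihood of the outcome in the exposed group.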

  9. A methodology which facilitated the evaluation of learning in a mass university course for basic calculus

    Directory of Open Access Journals (Sweden)

    Patricia Villalonga de García

    2005-03-01

    Full Text Available The aim of the present work is to introduce the methodology used to carry out a diagnostic of the system of evaluation of learning for Mathematics I (a first-year subject in the Facultad de Bioquímica, Química y Farmacia of the Universidad Nacional de Tucumán, Argentina). This diagnostic was based on a model of alternative evaluation of learning, designed on the basis of criteria derived from constructivist pedagogical currents and from methodological principles of the qualitative and quantitative paradigms in socio-educational research. The criteria stated in this model led to the formulation of the hypothesis: “the evaluation of learning in the subject is imbued with a reductionist and disintegrated conception of the processes of teaching and learning”. In order to test it, surveys were designed and applied to students in 2001 and 2003 and to teachers in 2001, and a study was carried out on the items of the summative evaluation of the subject, based on the evaluation standards of the National Council of Teachers of Mathematics. The sources of information were chosen attending to the characteristics of the context of work and the limitations which conditioned the investigation. A technique was designed to analyze the open and closed questions of the surveys and to study the exam items; this facilitated the elaboration of the system of categories with which the diagnostic was implemented. The methodological design adopted and the sources used were adequate to reach the aims proposed in the study. Besides, they provided the means to find solid arguments with which to test the working hypothesis.

  10. Technological assumptions for biogas purification.

    Science.gov (United States)

    Makareviciene, Violeta; Sendzikiene, Egle

    2015-01-01

    Biogas can be used in the engines of transport vehicles and blended into natural gas networks, but this requires the removal of carbon dioxide, hydrogen sulphide, and moisture. Biogas purification process flow diagrams have been developed for a process enabling the use of a dolomite suspension, as well as of solutions obtained by filtration of the suspension, to obtain biogas free of hydrogen sulphide and with a carbon dioxide content that does not exceed 2%. The cost of biogas purification was evaluated on the basis of data on biogas production capacity and biogas production cost obtained from local water treatment facilities. It has been found that, with the use of dolomite suspension, the cost of biogas purification is approximately six times lower than in the case of a chemical sorbent such as monoethanolamine. The results showed that travelling costs using biogas purified with dolomite suspension are nearly 1.5 times lower than travelling costs using gasoline and slightly lower than those using mineral diesel fuel.

  11. How Symmetrical Assumptions Advance Strategic Management Research

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Hallberg, Hallberg

    2014-01-01

    We develop the case for symmetrical assumptions in strategic management theory. Assumptional symmetry obtains when assumptions made about certain actors and their interactions in one of the application domains of a theory are also made about this set of actors and their interactions in other...... application domains of the theory. We argue that assumptional symmetry leads to theoretical advancement by promoting the development of theory with greater falsifiability and stronger ontological grounding. Thus, strategic management theory may be advanced by systematically searching for asymmetrical...

  12. Basic Evaluation and the Virtuous Realisation of Values: The Integrative Model of Aristotle

    Directory of Open Access Journals (Sweden)

    Markus Riedenauer

    2016-12-01

    Full Text Available Human affectivity is a research topic situated at the intersection of psychology, philosophical anthropology, theory of action and ethics. This article reconstructs the Aristotelian theory of emotions in the context of his theory of aspiration (orexis) and in terms of their function as primary evaluators of situations, which forms the basis for virtue ethics. The Aristotelian model integrates desire, motivation and morality for a rational being in community. Affects (pathē) reveal the profile of relevance of the world to a person as an indispensable basis for the work of practical reason. They are analysed in the dimensions of their cognitive core and their social, bodily, and motivational aspects. Affectivity constitutes a primary evaluative response to situations and thereby disposes human beings to realise their call to morally good, virtuous and fulfilling action.

  13. Abreu system - A dosimetric system to evaluate basic parameters of photofluorographic X-ray machine

    International Nuclear Information System (INIS)

    Feital, J.C.

    1987-01-01

    In Brazil, photofluorographic X-ray machines are used for tuberculosis mass screening throughout the country. The exact number of these X-ray machines is unknown, but it is estimated to be around 1000 operating units. Twelve million miniature chest radiographs are taken per year. In order to make local inspections speedier, and also aiming at its postal use, a system has been developed which evaluates the entrance exposure of the patient, the X-ray beam half-value layer (leading to the evaluation of the tube's total filtration) and the beam's field size. It consists of a piece of cardboard in which filters, TLDs and X-ray films are inserted. So far the system has been tested on 53 X-ray machines in Rio de Janeiro. The results show that it can be used in a national survey program. (Author) [pt

  14. Intertemporal evaluation criteria for climate change policy: the basic ethical issues

    OpenAIRE

    Buchholz, Wolfgang; Schymura, Michael

    2011-01-01

    The evaluation of long-term effects of climate change in cost-benefit analysis has a long tradition in environmental economics. Since the publication of the Stern Review in 2006, the debate about the "appropriate" discounting of future welfare and utility levels has been revived, and the most renowned scholars of the profession have participated in it. But it seems that some contributions dealing with the Stern Review, and the Review itself, mixed up normative and positive issues to defend their own...

  15. Evaluation of Basic Skills Improvement for Laparoscopy by Training with a Video Game

    Directory of Open Access Journals (Sweden)

    María Fernanda Gómez-Ramírez

    2014-06-01

    Full Text Available Introduction: Due to growing economic and ethical limitations on training surgeons in minimally invasive surgery (MIS), e.g. laparoscopy, this study aims at evaluating the effect of continuous practice of a particular video game on the development of the fundamental and specific skills needed to perform this type of procedure successfully. Materials and methods: To evaluate the effectiveness of video game practice, three essential and common activities (cutting, suturing, and eye-hand coordination) were chosen to be performed in laparoscopic simulators. Eight different indexes or variables of performance were measured across the three activities. Fourteen volunteers without previous experience in surgery were divided into two groups (intervention and control) and their performance was evaluated before and after a one-month standardized training program with the video game Marble Mania®. Results: A general improvement in all the performance variables was observed after one month of training in the intervention group. This improvement was significant with respect to the control group in three of the eight variables: suturing errors (p = 0.003), and the execution time and number of errors in the eye-hand coordination task (p = 0.025 and p = 0.001, respectively).

  16. Abreu system - dosimetric system to evaluate the performance of the basic parameters of photofluorographic equipments

    International Nuclear Information System (INIS)

    Silva Feital, J.C. da.

    1986-01-01

    In Brazil, photofluorographic X-ray machines are used for tuberculosis mass screening throughout the country. The exact number of these X-ray machines is unknown, but it is estimated to be around 1000 operating units. Twelve million miniature chest radiographs are taken per year. In order to make local inspections speedier, and also aiming at its postal use, a system has been developed which evaluates the entrance exposure of the patients, the X-ray beam half-value layer (leading to the evaluation of the tube's total filtration) and the beam's field size. It consists of a piece of cardboard in which filters, TLDs and X-ray films are inserted. So far the system has been tested on 53 X-ray machines. The results show that it can be used in a national survey program. The data collected were used for the calculation of doses, and this showed the influence of field size and tube voltage on the dose to the thyroid, uterus, ovaries, bone marrow and lungs. Furthermore, the results can be used to estimate population doses and risk factors due to photofluorographic examinations. (author)

  17. Basic considerations for the preparation of performance testing materials as related to performance evaluation acceptance criteria

    International Nuclear Information System (INIS)

    McCurdy, D.E.; Morton, J.S.

    2001-01-01

    The preparation of performance testing (PT) materials for environmental and radiobioassay applications involves the use of natural matrix materials containing the analyte of interest, the addition (spiking) of the analyte to a desired matrix (followed by blending for certain matrices), or a combination of the two. The distribution of the sample analyte concentration in a batch of PT samples will reflect the degree of heterogeneity of the analyte in the PT material and/or the reproducibility of the sample preparation process. Commercial and government radioanalytical performance evaluation programs apply a variety of acceptance criteria. The performance criteria should take into consideration many parameters related to the preparation of the PT materials, including the within- and between-sample analyte heterogeneity, the accuracy of the quantification of an analyte in the PT material, and the 'known' value to which a laboratory's result will be compared. How sample preparation parameters affect successful participation in performance evaluation (PE) programs whose acceptance criterion is a percent difference from a 'known' value, or in PE programs using other acceptance criteria, such as the guidance provided in ANSI N42.22 and N13.30, is discussed. (author)

  18. Autogenous shrinkage in high-performance cement paste: An evaluation of basic mechanisms

    DEFF Research Database (Denmark)

    Lura, Pietro; Jensen, Ole Mejlhede; van Breugel, Klaas

    2003-01-01

    In this paper, various mechanisms suggested to cause autogenous shrinkage are presented. The mechanisms are evaluated from the point of view of their soundness and applicability to quantitative modeling of autogenous shrinkage. The capillary tension approach is advantageous, because it has a sound...... mechanical and thermodynamical basis. Furthermore, this mechanism is easily applicable in a numerical model when dealing with a continuously changing microstructure. In order to test the numerical model, autogenous deformation and internal relative humidity (RH) of a Portland cement paste were measured...... on the capillary tension approach. Because a part of the RH drop in the cement paste is due to dissolved salts in the pore solution, a method is suggested to separate this effect from self-desiccation and to calculate the actual stress in the pore fluid associated with menisci formation....

  19. Wrong assumptions in the financial crisis

    NARCIS (Netherlands)

    Aalbers, M.B.

    2009-01-01

    Purpose - The purpose of this paper is to show how some of the assumptions about the current financial crisis are wrong because they misunderstand what takes place in the mortgage market. Design/methodology/approach - The paper discusses four wrong assumptions: one related to regulation, one to

  20. The relevance of ''theory rich'' bridge assumptions

    NARCIS (Netherlands)

    Lindenberg, S

    1996-01-01

    Actor models are increasingly being used as a form of theory building in sociology because they can better represent the causal mechanisms that connect macro variables. However, actor models need additional assumptions, especially so-called bridge assumptions, for filling in the relatively empty

  1. Basic Evaluation of Analytical Performance and Clinical Utility of Immunoradiometric TSH Assay

    International Nuclear Information System (INIS)

    Suhy, Il Kyo; Cho, Bo Youn; Lee, Hong Kyu; Koh, Chang Soon; Min, Hun Ki; Lee, Mun Ho

    1987-01-01

    To assess the analytical performance of the immunoradiometric TSH assay (IRMA TSH), assay precision, determined by intra- and inter-assay variance, and assay accuracy, determined by dilution and recovery studies, were evaluated using two commercial kits (Abbott and Daiichi). The normal ranges of basal serum TSH and of the TRH-stimulated TSH increment were also determined in 234 healthy subjects (110 male, 124 female; age 20-70) and 30 volunteers (10 male, 20 female; age 21-26). In addition, basal TSH levels of 70 patients with untreated hyperthyroidism, 50 with untreated hypothyroidism, and 60 with euthyroidism were measured to assess the clinical utility of IRMA TSH. The detection limit of IRMA TSH was 0.04 mU/l with the Abbott kit and 0.08 mU/l with the Daiichi kit. Using the Abbott kit, intra-assay variances were 2.0, 3.1 and 1.4% at mean TSH concentrations of 2.4, 31.6 and 98.2 mU/l respectively, and inter-assay variances were 2.0 and 3.2% at mean TSH concentrations of 2.3 and 31.3 mU/l. The mean recovery rate was 92.5%, and the dilution study showed a nearly straight line. When the Daiichi kit was used, intra-assay variances were 5.6, 5.2 and 6.2% at mean TSH concentrations of 2.4, 31.6 and 98.2 mU/l respectively, and inter-assay variances were 7.1 and 7.4% at mean TSH concentrations of 2.3 and 31.3 mU/l. The mean recovery rate was 89.9%. The normal ranges of basal TSH and TRH-stimulated peak TSH were 0.38-4.02 mU/l and 2.85-30.8 mU/l respectively (95% confidence interval, Abbott kit). The sensitivity and specificity of basal TSH levels for diagnosing hypothyroidism, as well as the specificity for diagnosing hyperthyroidism, were 100% with both kits. The sensitivity of basal TSH for diagnosing hyperthyroidism was 100% when TSH was measured with the Abbott kit, but 80.9% when measured with the Daiichi kit. These results suggest that IRMA TSH is a very precise and accurate method and might be used as a first-line test in the evaluation of thyroid function.
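    Intra- and inter-assay variances like those above are conventionally reported as the coefficient of variation (%CV = SD/mean × 100) of replicate measurements. A minimal sketch, using invented replicate TSH readings rather than the study's data:

    ```python
    import statistics

    def percent_cv(replicates):
        """Coefficient of variation (%) = sample SD / mean * 100."""
        return statistics.stdev(replicates) / statistics.mean(replicates) * 100

    # Hypothetical replicate readings (mU/l) for one control sample,
    # chosen for illustration only.
    replicates = [2.35, 2.42, 2.38, 2.45, 2.40]
    print(f"mean = {statistics.mean(replicates):.2f} mU/l, "
          f"CV = {percent_cv(replicates):.1f}%")
    ```

    A CV of a few percent at each concentration level, as reported for both kits, is what "precise" means operationally here.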

  2. An Evaluation of Organizational and Experience Factors Affecting the Perceived Transfer of U.S. Air Force Basic Combat Skills Training

    National Research Council Canada - National Science Library

    Crow, Shirley D

    2007-01-01

    .... In this study, basic combat skills training was evaluated using a number of training factors that potentially affect trainees' perception of training transfer, or their ability to apply the skills...

  3. Autogenous shrinkage in high-performance cement paste: An evaluation of basic mechanisms

    International Nuclear Information System (INIS)

    Lura, Pietro; Jensen, Ole Mejlhede; Breugel, Klaas van

    2003-01-01

    In this paper, various mechanisms suggested to cause autogenous shrinkage are presented. The mechanisms are evaluated from the point of view of their soundness and applicability to quantitative modeling of autogenous shrinkage. The capillary tension approach is advantageous, because it has a sound mechanical and thermodynamical basis. Furthermore, this mechanism is easily applicable in a numerical model when dealing with a continuously changing microstructure. In order to test the numerical model, autogenous deformation and internal relative humidity (RH) of a Portland cement paste were measured during the first week of hardening. The isothermal heat evolution was also recorded to monitor the progress of hydration and the elastic modulus in compression was measured. RH change, degree of hydration and elastic modulus were used as input data for the calculation of autogenous deformation based on the capillary tension approach. Because a part of the RH drop in the cement paste is due to dissolved salts in the pore solution, a method is suggested to separate this effect from self-desiccation and to calculate the actual stress in the pore fluid associated with menisci formation

  4. [Evaluation of an education intervention for childhood obesity prevention in basic schools in Chile].

    Science.gov (United States)

    Lobos Fernández, Luz Lorena; Leyton Dinamarca, Bárbara; Kain Bercovich, Juliana; Vio del Río, Fernando

    2013-01-01

    The aim of this study was to evaluate a comprehensive nutrition education and physical activity intervention to prevent childhood obesity in primary school children of low socioeconomic status in Macul county, Chile, with a two-year follow-up (2008 and 2009) of the children. The intervention consisted of nutrition training of teachers in healthy eating and the implementation of educational material based on the Chilean dietary guidelines. In addition, physical education classes were increased to 3-4 hours per week and physical education teachers were recruited for that purpose. Weight, height and the six-minute walk test (6MWT) were measured, and body mass index (BMI), BMI Z score, and the prevalence of normal weight, overweight and obese children were calculated with the WHO 2007 reference. Changes in BMI Z score between the initial and final periods, in 6MWT/height, and in nutrition knowledge (through questionnaires) were measured. There was no significant difference in BMI Z score between the initial and final periods or in the evolution of the nutritional status of the children. Nutrition knowledge improved significantly between the two measurements. There was a significant increase in 6MWT/height (10 meters between baseline and follow-up). Educational interventions tailored to the reality of each community are required to obtain a positive impact on childhood obesity prevention in primary schools. Copyright © AULA MEDICA EDICIONES 2013. Published by AULA MEDICA. All rights reserved.

  5. Determination of optimal angiographic viewing angles: Basic principles and evaluation study

    International Nuclear Information System (INIS)

    Dumay, A.C.M.; Reiber, J.H.C.; Gerbrands, J.J.

    1994-01-01

    Foreshortening of vessel segments in angiographic (biplane) projection images may cause misinterpretation of the extent and degree of coronary artery disease. The views in which the object of interest is visualized with minimum foreshortening are called optimal views. In this paper the authors present a complete approach to obtaining such views with computer-assisted techniques. The object of interest is first visualized in two arbitrary views. Two landmarks of the object are manually defined in the two projection images. With complete information on the projection geometry, the vector representation of the object in three-dimensional space is computed. This vector is normal to a plane, and views along directions lying in that plane are called optimal. The user has one degree of freedom in defining a set of optimal biplane views: the angle between the central beams of the imaging systems can be chosen freely. The computation of the orientation of the object and of the corresponding optimal biplane views has been evaluated with a simple hardware phantom. The mean and standard deviation of the overall errors in the calculation of the optimal angulation angles were 1.8 degrees and 1.3 degrees, respectively, when the user defined a rotation angle
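    The geometric construction described above — a 3D segment direction whose perpendicular plane contains all zero-foreshortening viewing directions, parametrized by one free rotation angle — can be sketched as follows. The function name and parametrization are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    def optimal_view(p1, p2, phi):
        """Given two 3D landmarks of a vessel segment, return a unit
        viewing direction with zero foreshortening. phi (radians) is
        the free rotation angle within the plane normal to the segment."""
        axis = np.asarray(p2, float) - np.asarray(p1, float)
        axis /= np.linalg.norm(axis)
        # Build an orthonormal basis (u, v) of the plane perpendicular
        # to the segment axis.
        helper = np.array([1.0, 0.0, 0.0])
        if abs(np.dot(helper, axis)) > 0.9:
            helper = np.array([0.0, 1.0, 0.0])
        u = np.cross(axis, helper)
        u /= np.linalg.norm(u)
        v = np.cross(axis, u)
        return np.cos(phi) * u + np.sin(phi) * v

    view = optimal_view([0, 0, 0], [1, 2, 2], phi=0.7)
    print(np.dot(view, [1, 2, 2]))  # ~0: the segment is seen at full length
    ```

    Any two such directions (e.g. phi and phi + delta) give an optimal biplane pair with a freely chosen angle between the central beams.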

  6. Measurement and Basic Physics Committee of the US cross-section evaluation working group. Annual report 1996

    International Nuclear Information System (INIS)

    Smith, D.L.; McLane, V.

    1996-11-01

    The Cross-Section Evaluation Working Group (CSEWG) is a long-standing committee charged with the responsibility for organizing and overseeing the U.S. cross-section evaluation effort. Its main product is the official U.S. evaluated nuclear data file, ENDF. The current version of this file is Version VI. All evaluations included in ENDF are reviewed and approved by CSEWG and issued by the U.S. Nuclear Data Center, Brookhaven National Laboratory. CSEWG is comprised of volunteers from the U.S. nuclear data community who possess expertise in evaluation methodologies and who collectively have been responsible for producing most of the evaluations included in ENDF. In 1992 CSEWG added the Measurements Committee to its list of standing committees and subcommittees. This action was based on a recognition of the importance of experimental data in the evaluation process as well as the realization that measurement activities in the U.S. were declining at an alarming rate and needed all possible encouragement to avoid the loss of this resource. The mission of the Committee is to maintain a network of experimentalists in the U.S. that would provide needed encouragement to the national nuclear data measurement effort through improved communication and facilitation of collaborative activities. In 1994, an additional charge was added to the responsibilities of this Committee, namely, to serve as an interface between the more applied interests represented in CSEWG and the basic nuclear science community. This annual report is the second such document issued by the Committee. It contains voluntary contributions from eleven laboratories in the U.S., prepared by members of the Committee and submitted to the Chairman for compilation and editing. It is hoped that the information provided here on the work going on at the reporting laboratories will prove interesting and stimulating to the readers

  7. Relationship between self-reported body awareness and physiotherapists' evaluation of Basic Body Awareness Therapy in refugees with PTSD

    DEFF Research Database (Denmark)

    Jensen, Jonna Anne

    Background: The number of refugees who are traumatized and diagnosed with post-traumatic stress disorder (PTSD) is increasing in Denmark and Europe. In Denmark, Basic Body Awareness Therapy (B-BAT) is used by physiotherapists in the rehabilitation of traumatized refugees as a body oriented...... intervention. A recent pilot study found that B-BAT decreased somatic and mental symptoms of PTSD in a group of refugees with this diagnosis (Stade 2015). Further, Bergström et al. (2014) found that patients with chronic pain and low body awareness had no significant changes in body awareness after treatment...... with BBAT, whereas the group with moderate/high body awareness had a significant change one year after treatment. However, whether there exists a relationship between self-reported body awareness and physiotherapists' evaluation of the applicability of BBAT on PTSD symptoms is not known. Purpose: This study...

  8. Determining Bounds on Assumption Errors in Operational Analysis

    Directory of Open Access Journals (Sweden)

    Neal M. Bengtson

    2014-01-01

    Full Text Available The technique of operational analysis (OA) is used in the study of systems performance, mainly for estimating mean values of various measures of interest, such as the number of jobs at a device and response times. The basic principles of operational analysis allow errors in assumptions to be quantified over a time period. The assumptions which are used to derive the operational analysis relationships are studied. Using Karush-Kuhn-Tucker (KKT) conditions, bounds on error measures of these OA relationships are found. Examples of these bounds are used for representative performance measures to show limits on the difference between true performance values and those estimated by operational analysis relationships. A technique for finding tolerance limits on the bounds is demonstrated with a simulation example.
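    The OA relationships in question are the classical operational laws, which hold exactly for quantities measured over an observation period; the assumptions (e.g. job flow balance) enter when the laws are combined into predictive formulas. A minimal sketch with invented measurements, using the usual OA notation rather than the paper's:

    ```python
    # Hypothetical observation-period measurements for a single device.
    T = 60.0    # observation length (s)
    A = 1180    # arrivals during the period
    C = 1200    # completions during the period
    B = 36.0    # time the device was busy (s)

    X = C / T   # throughput (jobs/s)
    U = B / T   # utilization
    S = B / C   # mean service time per job (s)
    # The utilization law U = X * S holds exactly by construction:
    assert abs(U - X * S) < 1e-9

    # Derived formulas typically assume job flow balance (A == C); the
    # paper's contribution is bounding the error when such assumptions
    # only hold approximately.
    imbalance = abs(A - C) / C
    print(X, U, round(imbalance, 4))
    ```

    Here the flow-balance assumption is off by about 1.7%, which is the kind of quantified assumption error the KKT-based bounds address.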

  9. Hygiene Basics

    Science.gov (United States)

    KidsHealth / For Teens: Hygiene Basics. An overview of hygiene basics for teens, covering oily hair, sweat, and body odor, and how to deal with them.

  10. Formalization and Analysis of Reasoning by Assumption

    OpenAIRE

    Bosse, T.; Jonker, C.M.; Treur, J.

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically analyzed against dynamic properties they fulfill. To this end, for the pattern of reasoning by assumption a variety of dynamic properties have been speci...

  11. Monitoring Assumptions in Assume-Guarantee Contracts

    Directory of Open Access Journals (Sweden)

    Oleg Sokolsky

    2016-05-01

    Full Text Available Pre-deployment verification of software components with respect to behavioral specifications in the assume-guarantee form does not, in general, guarantee absence of errors at run time. This is because assumptions about the environment cannot be discharged until the environment is fixed. An intuitive approach is to complement pre-deployment verification of guarantees, up to the assumptions, with post-deployment monitoring of environment behavior to check that the assumptions are satisfied at run time. Such a monitor is typically implemented by instrumenting the application code of the component. An additional challenge for the monitoring step is that environment behaviors are typically obtained through an I/O library, which may alter the component's view of the input format. This transformation requires us to introduce a second pre-deployment verification step to ensure that alarms raised by the monitor would indeed correspond to violations of the environment assumptions. In this paper, we describe an approach for constructing monitors and verifying them against the component assumption. We also discuss the limitations of instrumentation-based monitoring and potential ways to overcome them.
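    The monitoring idea can be sketched as a wrapper that checks each input against the environment assumption before the component sees it. This is a simplified stand-in for the code instrumentation the paper describes, with all names hypothetical; it also ignores the I/O-library transformation issue the abstract raises.

    ```python
    class AssumptionViolation(Exception):
        """Alarm raised when the environment breaks its assumption."""

    def monitored(assumption):
        """Wrap a component so every input is checked against the
        environment assumption before the component processes it."""
        def wrap(component):
            def run(value):
                if not assumption(value):
                    raise AssumptionViolation(
                        f"input {value!r} violates the environment assumption")
                return component(value)
            return run
        return wrap

    # Component whose guarantee was verified under the assumption that
    # inputs are non-negative numbers.
    @monitored(lambda x: isinstance(x, (int, float)) and x >= 0)
    def sqrt_component(x):
        return x ** 0.5

    print(sqrt_component(9.0))   # guarantee applies: assumption holds
    try:
        sqrt_component(-1)
    except AssumptionViolation as e:
        print("alarm:", e)       # guarantee void: assumption violated
    ```

    The point of the paper's second verification step is precisely to check that such an alarm fires if and only if the real environment assumption is violated.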

  12. Basic electrotechnology

    CERN Document Server

    Ashen, R A

    2013-01-01

    BASIC Electrotechnology discusses the applications of Beginner's All-purpose Symbolic Instruction Code (BASIC) in engineering, particularly in solving electrotechnology-related problems. The book is comprised of six chapters that cover several topics relevant to BASIC and electrotechnology. Chapter 1 provides an introduction to BASIC, and Chapter 2 talks about the use of complex numbers in a.c. circuit analysis. Chapter 3 covers linear circuit analysis with d.c. and sinusoidal a.c. supplies. The book also discusses the elementary magnetic circuit theory. The theory and performance of two windi

  13. Formalization and analysis of reasoning by assumption.

    Science.gov (United States)

    Bosse, Tibor; Jonker, Catholijn M; Treur, Jan

    2006-01-02

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically analyzed against dynamic properties they fulfill. To this end, for the pattern of reasoning by assumption a variety of dynamic properties have been specified, some of which are considered characteristic for the reasoning pattern, whereas some other properties can be used to discriminate among different approaches to the reasoning. These properties have been automatically checked for the traces acquired in experiments undertaken. The approach turned out to be beneficial from two perspectives. First, checking characteristic properties contributes to the empirical validation of a theory on reasoning by assumption. Second, checking discriminating properties allows the analyst to identify different classes of human reasoners. 2006 Lawrence Erlbaum Associates, Inc.

  14. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  15. Evaluation of an Educational Model Based on the Development of Sustainable Competencies in Basic Teacher Training in Spain

    Directory of Open Access Journals (Sweden)

    Pedro Vega-Marcote

    2015-03-01

    Full Text Available The environmental deterioration of the planet, caused by unsustainable development and an unfair model, requires global change on a political, social and environmental level. To boost this change, it is necessary to redirect education and, specifically, environmental education, to educate citizens so that they are capable of making responsible decisions and behaving sustainably. The purpose of this study is to evaluate an educational teacher training model based on the development of sustainable competencies and on the solving of environmental problems. Its final aim is to search for a model that enables students to participate, individually and collectively, in the solution of socio-environmental problems in their surroundings, without losing the global perspective, and that fosters sustainable lifestyles. To do so, a quasi-experimental quantitative study was performed with two pretest-posttest phases to compare the results of an active and participative methodology with a more expository one. The results show significant differences in the knowledge, attitudes and behavioral intentions of the aspiring teachers. Thus, this first analysis shows that the experiential educational model promotes and favors sustainable actions in higher education (the faculty of educational science, responsible for basic teacher training) more efficiently and could be the basis for future proposals in this field.

  16. Triatominae biochemistry goes to school: evaluation of a novel tool for teaching basic biochemical concepts of Chagas disease vectors.

    Science.gov (United States)

    Cunha, Leonardo Rodrigues; Cudischevitch, Cecília de Oliveira; Carneiro, Alan Brito; Macedo, Gustavo Bartholomeu; Lannes, Denise; Silva-Neto, Mário Alberto Cardoso da

    2014-01-01

    We evaluate a new approach to teaching the basic biochemistry mechanisms that regulate the biology of Triatominae, major vectors of Trypanosoma cruzi, the causative agent of Chagas disease. We have designed and used a comic book, "Carlos Chagas: 100 years after a hero's discovery" containing scientific information obtained by seven distinguished contemporary Brazilian researchers working with Triatominaes. Students (22) in the seventh grade of a public elementary school received the comic book. The study was then followed up by the use of Concept Maps elaborated by the students. Six Concept Maps elaborated by the students before the introduction of the comic book received an average score of 7. Scores rose to an average of 45 after the introduction of the comic book. This result suggests that a more attractive content can greatly improve the knowledge and conceptual understanding among students not previously exposed to insect biochemistry. In conclusion, this study illustrates an alternative to current strategies of teaching about the transmission of neglected diseases. It also promotes the diffusion of the scientific knowledge produced by Brazilian researchers that may stimulate students to choose a scientific career. © 2014 The International Union of Biochemistry and Molecular Biology.

  17. Health professionals in the process of vaccination against hepatitis B in two basic units of Belo Horizonte: a qualitative evaluation.

    Science.gov (United States)

    Lages, Annelisa Santos; França, Elisabeth Barboza; Freitas, Maria Imaculada de Fátima

    2013-06-01

    According to the Vaccine Coverage Survey performed in 2007, the immunization coverage against hepatitis B in Belo Horizonte, for infants under one year old, was below the level proposed by the Brazilian National Program of Immunization. This vaccine was used as the basis for evaluating the involvement of health professionals in the vaccination process in two Basic Health Units (UBS, acronym in Portuguese) in the city. This qualitative study uses the notions of Social Representations Theory and the method of Structural Analysis of Narrative to carry out the interviews and data analysis. The results show flaws related to the control and use of the mirror card and the orientation of parents, as well as the monitoring of vaccination coverage (VC) and the use of VC data as input for planning health actions. It was observed that the working process in the UBS is focused on routine tasks, with little creativity on the part of the professionals, whose representations maintain a strong tendency to value activities focused on the health of individuals to the detriment of public health actions. In conclusion, the faults in the vaccination process can be overcome with a greater appreciation of everyday actions, much better use of local information about vaccination, and some necessary adjustments within the UBS to improve public health actions.

  18. Validity of the mockwitness paradigm: testing the assumptions.

    Science.gov (United States)

    McQuiston, Dawn E; Malpass, Roy S

    2002-08-01

    Mockwitness identifications are used to provide a quantitative measure of lineup fairness. Some theoretical and practical assumptions of this paradigm have not been studied in terms of mockwitnesses' decision processes and procedural variation (e.g., instructions, lineup presentation method), and the current experiment was conducted to empirically evaluate these assumptions. Four hundred and eighty mockwitnesses were given physical information about a culprit, received 1 of 4 variations of lineup instructions, and were asked to identify the culprit from either a fair or unfair sequential lineup containing 1 of 2 targets. Lineup bias estimates varied as a result of lineup fairness and the target presented. Mockwitnesses generally reported that the target's physical description was their main source of identifying information. Our findings support the use of mockwitness identifications as a useful technique for sequential lineup evaluation, but only for mockwitnesses who selected only 1 lineup member. Recommendations for the use of this evaluation procedure are discussed.

  19. In vitro evaluation of a basic fibroblast growth factor-containing hydrogel toward vocal fold lamina propria scar treatment.

    Science.gov (United States)

    Erndt-Marino, Josh D; Jimenez-Vergara, Andrea C; Diaz-Rodriguez, Patricia; Kulwatno, Jonathan; Diaz-Quiroz, Juan Felipe; Thibeault, Susan; Hahn, Mariah S

    2018-04-01

    Scarring of the vocal fold lamina propria can lead to debilitating voice disorders that can significantly impair quality of life. The reduced pliability of the scar tissue-which diminishes proper vocal fold vibratory efficiency-results in part from abnormal extracellular matrix (ECM) deposition by vocal fold fibroblasts (VFF) that have taken on a fibrotic phenotype. To address this issue, bioactive materials containing cytokines and/or growth factors may provide a platform to transition fibrotic VFF within the scarred tissue toward an anti-fibrotic phenotype, thereby improving the quality of ECM within the scar tissue. However, for such an approach to be most effective, the acute host response resulting from biomaterial insertion/injection likely also needs to be considered. The goal of the present work was to evaluate the anti-fibrotic and anti-inflammatory capacity of an injectable hydrogel containing tethered basic fibroblast growth factor (bFGF) in the dual context of scar and biomaterial-induced acute inflammation. An in vitro co-culture system was utilized containing both activated, fibrotic VFF and activated, pro-inflammatory macrophages (MΦ) within a 3D poly(ethylene glycol) diacrylate (PEGDA) hydrogel containing tethered bFGF. Following 72 h of culture, alterations in VFF and macrophage phenotype were evaluated relative to mono-culture and co-culture controls. In our co-culture system, bFGF reduced the production of fibrotic markers collagen type I, α smooth muscle actin, and biglycan by activated VFF and promoted wound-healing/anti-inflammatory marker expression in activated MΦ. Cumulatively, these data indicate that bFGF-containing hydrogels warrant further investigation for the treatment of vocal fold lamina propria scar. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 106B: 1258-1267, 2018. © 2017 Wiley Periodicals, Inc.

  20. A fuzzy-based reliability approach to evaluate basic events of fault tree analysis for nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry

    2014-01-01

    Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative to the conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have their quantitative failure rates or failure probabilities. However, it is difficult to obtain those failure data due to insufficient data, changing environments or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees whose precise probability distributions of lifetime to failure are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and the effectiveness of the proposed approach, the actual basic event failure probabilities collected from the operational experiences of the Davis–Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach is a suitable alternative to the conventional probabilistic reliability approach when basic events do not have the corresponding quantitative historical failure data for determining their reliability characteristics. Hence, it overcomes the limitation of the conventional fault tree analysis for nuclear power plant probabilistic safety assessment.
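    The gist of the approach can be sketched as follows; the membership functions and the use of Onisawa's possibility-to-probability conversion are illustrative assumptions, not necessarily the paper's exact parameters:

```python
# Hedged sketch: qualitative failure possibilities are mapped to triangular
# fuzzy numbers, defuzzified to a fuzzy possibility score (FPS), and then
# converted to a failure probability. All numeric values below are assumed.
TERMS = {  # triangular fuzzy numbers (a, b, c) for qualitative terms
    "very low":  (0.0, 0.10, 0.2),
    "low":       (0.1, 0.25, 0.4),
    "medium":    (0.3, 0.50, 0.7),
    "high":      (0.6, 0.75, 0.9),
    "very high": (0.8, 0.90, 1.0),
}

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def failure_probability(fps):
    """Onisawa-style conversion of a fuzzy possibility score to a probability."""
    if fps <= 0:
        return 0.0
    k = 2.301 * ((1.0 - fps) / fps) ** (1.0 / 3.0)
    return 10.0 ** (-k)

fps = centroid(TERMS["low"])     # 0.25
print(failure_probability(fps))  # ≈ 4.8e-4
```

    In a fault tree, the probabilities produced this way would stand in for the missing historical failure rates of the corresponding basic events.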

  1. Anesthesia Basics

    Science.gov (United States)

    KidsHealth / For Teens: Anesthesia Basics, an introductory article answering "What is anesthesia?" for teens facing an operation (also available in Spanish as "Conceptos básicos sobre la anestesia").

  2. BASIC Programming.

    Science.gov (United States)

    Jennings, Carol Ann

    Designed for use by both secondary- and postsecondary-level business teachers, this curriculum guide consists of 10 units of instructional materials dealing with Beginner's All-purpose Symbolic Instruction Code (BASIC) programming. Topics of the individual lessons are numbering BASIC programs and using the PRINT, END, and REM statements; system…

  3. The homogeneous marginal utility of income assumption

    NARCIS (Netherlands)

    Demuynck, T.

    2015-01-01

    We develop a test to verify if every agent from a population of heterogeneous consumers has the same marginal utility of income function. This homogeneous marginal utility of income assumption is often (implicitly) used in applied demand studies because it has nice aggregation properties and

  4. Critically Challenging Some Assumptions in HRD

    Science.gov (United States)

    O'Donnell, David; McGuire, David; Cross, Christine

    2006-01-01

    This paper sets out to critically challenge five interrelated assumptions prominent in the (human resource development) HRD literature. These relate to: the exploitation of labour in enhancing shareholder value; the view that employees are co-contributors to and co-recipients of HRD benefits; the distinction between HRD and human resource…

  5. Formalization and Analysis of Reasoning by Assumption

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning

  6. Extracurricular Business Planning Competitions: Challenging the Assumptions

    Science.gov (United States)

    Watson, Kayleigh; McGowan, Pauric; Smith, Paul

    2014-01-01

    Business planning competitions [BPCs] are a commonly offered yet under-examined extracurricular activity. Given the extent of sceptical comment about business planning, this paper offers what the authors believe is a much-needed critical discussion of the assumptions that underpin the provision of such competitions. In doing so it is suggested…

  7. Shattering world assumptions: A prospective view of the impact of adverse events on world assumptions.

    Science.gov (United States)

    Schuler, Eric R; Boals, Adriel

    2016-05-01

    Shattered Assumptions theory (Janoff-Bulman, 1992) posits that experiencing a traumatic event has the potential to diminish the degree of optimism in the assumptions of the world (assumptive world), which could lead to the development of posttraumatic stress disorder. Prior research assessed the assumptive world with a measure that was recently reported to have poor psychometric properties (Kaler et al., 2008). The current study had 3 aims: (a) to assess the psychometric properties of a recently developed measure of the assumptive world, (b) to retrospectively examine how prior adverse events affected the optimism of the assumptive world, and (c) to measure the impact of an intervening adverse event. An 8-week prospective design with a college sample (N = 882 at Time 1 and N = 511 at Time 2) was used to assess the study objectives. We split adverse events into those that were objectively or subjectively traumatic in nature. The new measure exhibited adequate psychometric properties. The report of a prior objective or subjective trauma at Time 1 was related to a less optimistic assumptive world. Furthermore, participants who experienced an intervening objectively traumatic event evidenced a decrease in optimistic views of the world compared with those who did not experience an intervening adverse event. We found support for Shattered Assumptions theory retrospectively and prospectively using a reliable measure of the assumptive world. We discuss future assessments of the measure of the assumptive world and clinical implications to help rebuild the assumptive world with current therapies. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  8. Basic hydraulics

    CERN Document Server

    Smith, P D

    1982-01-01

    BASIC Hydraulics aims to help students both to become proficient in the BASIC programming language by actually using the language in an important field of engineering and to use computing as a means of mastering the subject of hydraulics. The book begins with a summary of the technique of computing in BASIC together with comments and listing of the main commands and statements. Subsequent chapters introduce the fundamental concepts and appropriate governing equations. Topics covered include principles of fluid mechanics; flow in pipes, pipe networks and open channels; hydraulic machinery;

  9. An Evaluation of the Quality of the Desinfection Process in Inanimated Surfaces of Basic Health Units by Biomarkers Research

    Directory of Open Access Journals (Sweden)

    Ana Paula Bandeira Fucci

    2013-06-01

    Full Text Available Infection Related to Health Care (IRHC) may occur by exogenous transmission via contaminated surfaces. This study aimed to verify the quality of the disinfection of inanimate surfaces in Basic Health Units (UBS) in a northeastern city of São Paulo state through the presence of two biomarkers, Staphylococcus aureus and Escherichia coli. Seven UBS were evaluated at random times and days, covering the following areas: dressing-room doorknobs, drinking fountains and faucets, office desks, and reception counters. Sterile swabs were rubbed over a 20 cm2 surface and transported in Stuart medium to the Clinical Analyses Didactic Laboratory of UNIFEV. The samples were cultured on blood agar and MacConkey agar at 35 ± 1 °C for 24 hours under aerobic and microaerophilic conditions, respectively. Staphylococcus aureus was identified by the production of hemolysin, catalase and coagulase; Escherichia coli was identified using the biochemical tests TSI, citrate, urease, indole, lysine, ornithine and arginine. Of the 105 samples analyzed, 6.66% were positive for Staphylococcus aureus and 2.85% for Escherichia coli. The areas showing the presence of biomarkers were the reception booth, pharmacy booth, dressing-room handles, dressing-room faucet and drinking fountain. These results corroborate other studies showing that inanimate surfaces are important sources of contamination in the healthcare environment, contributing to cross-contamination and, consequently, to an increased risk of infection for patients undergoing procedures in this environment. Within this context, government, by means of public health policies, is responsible for the training of health professionals, contributing to the promotion of public health and the prevention of disease.

  10. The Analysis of Basic Public Service Supply Regional Equalization in China’s Provinces——Based on the Theil Index Evaluation

    Science.gov (United States)

    Liao, Zangyi

    2017-12-01

    Accomplishing the regional equalization of basic public service supply among China's provinces is an important objective for promoting the people's livelihood. In order to measure the degree of non-equalization of basic public service supply, this paper takes infrastructure construction, basic education services, public employment services, public health services and social security services as first-level indices, combines them with 16 second-level indices to construct a performance evaluation system, and then uses the Theil index to evaluate provincial performance on panel data from 2000 to 2012.
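    As an illustration of the evaluation metric, the Theil index can be computed as follows (the province values are made up; the paper aggregates 16 second-level indicators before applying it):

```python
import math

def theil_index(x):
    """Theil T index: 0 means perfect equality; larger means more unequal."""
    n = len(x)
    mean = sum(x) / n
    return sum((xi / mean) * math.log(xi / mean) for xi in x) / n

# Hypothetical per-province values of one basic public service indicator.
print(theil_index([1.0, 1.0, 1.0, 1.0]))           # 0.0 (perfect equality)
print(round(theil_index([1.0, 2.0, 4.0, 8.0]), 3)) # 0.249
```

    A useful property of the Theil index for this kind of study is that it decomposes additively into within-region and between-region inequality.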

  11. Basic Finance

    Science.gov (United States)

    Vittek, J. F.

    1972-01-01

    A discussion of the basic measures of corporate financial strength, and the sources of the information is reported. Considered are: balance sheet, income statement, funds and cash flow, and financial ratios.

  12. Sensitivity of probabilistic MCO water content estimates to key assumptions

    International Nuclear Information System (INIS)

    DUNCAN, D.R.

    1999-01-01

    Sensitivity of probabilistic multi-canister overpack (MCO) water content estimates to key assumptions is evaluated with emphasis on the largest non-cladding film-contributors: water borne by particulates adhering to damage sites, and water borne by canister particulate. Calculations considered different choices of damage-state degree of independence, different choices of percentile for reference high inputs, three types of input probability density functions (pdfs) (triangular, log-normal, and Weibull), and the number of scrap baskets in an MCO.
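    The kind of sensitivity check described can be sketched as follows (all distribution parameters are invented for illustration; the report's actual inputs and percentiles differ):

```python
# Hypothetical sketch: how does the choice of input pdf (triangular vs.
# log-normal vs. Weibull), each covering a similar nominal range, shift a
# high-percentile estimate of a water-content contributor?
import random

random.seed(0)
N = 100_000

def percentile(samples, q):
    """Nearest-rank percentile of a list of samples (0 <= q <= 1)."""
    s = sorted(samples)
    return s[int(q * (len(s) - 1))]

draws = {
    "triangular": [random.triangular(0.0, 2.0, 0.5) for _ in range(N)],
    "log-normal": [random.lognormvariate(-0.7, 0.6) for _ in range(N)],
    "weibull":    [random.weibullvariate(0.7, 1.5) for _ in range(N)],
}
for name, xs in draws.items():
    print(f"{name}: 95th percentile = {percentile(xs, 0.95):.2f}")
```

    Even with matched central tendencies, the heavier-tailed choices push the high percentiles up, which is exactly the sensitivity such a study quantifies.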

  13. Assumptions behind size-based ecosystem models are realistic

    DEFF Research Database (Denmark)

    Andersen, Ken Haste; Blanchard, Julia L.; Fulton, Elizabeth A.

    2016-01-01

    A recent publication about balanced harvesting (Froese et al., ICES Journal of Marine Science; doi:10.1093/icesjms/fsv122) contains several erroneous statements about size-spectrum models. We refute the statements by showing that the assumptions pertaining to size-spectrum models discussed by Fro...... that there is indeed a constructive role for a wide suite of ecosystem models to evaluate fishing strategies in an ecosystem context...

  14. The Perspectives of Students and Teachers in the English Department in the College of Basic Education on the Student Evaluation of Teachers

    Science.gov (United States)

    Taqi, Hanan A.; Al-Nouh, Nowreyah A.; Dashti, Abdulmuhsin A.; Shuqair, Khaled M.

    2014-01-01

    In the context of students' evaluation of teachers in higher education, this paper examines the perspectives of students and faculty members in the English Department in the college of Basic education (CBE) in the State of Kuwait. The study is based on a survey that covered 320 students and 19 members of staff in the English department. The study…

  15. The predictive value of demonstrable stress incontinence during basic office evaluation and urodynamics in women without symptomatic urinary incontinence undergoing vaginal prolapse surgery

    NARCIS (Netherlands)

    van der Ploeg, J. Marinus; Zwolsman, Sandra E.; Posthuma, Selina; Wiarda, Hylco S.; van der Vaart, C. Huub; Roovers, Jan-Paul W. R.

    2017-01-01

    Women with pelvic organ prolapse without symptoms of urinary incontinence (UI) might demonstrate stress urinary incontinence (SUI) with or without prolapse reduction. We aimed to determine the value of demonstrable SUI during basic office evaluation or urodynamics in predicting SUI after vaginal

  16. Evaluation of a Workplace Basic Skills Program: An Impact Study of AVC Edmonton's 1990 Job Effectiveness Training Program at Stelco Steel. Report Summary.

    Science.gov (United States)

    Barker, Kathryn Chang

    The pilot Job Effectiveness Training (JET) workplace basic skills program, developed by Canada's Alberta Vocational College (AVC), Edmonton, for Stelco Steel during 1989-90, was evaluated in terms of impacts or changes from the perspective of the four major stakeholder groups: the students (12 Stelco employees); the employers (Stelco management);…

  17. Science as Knowledge, Practice, and Map Making: The Challenge of Defining Metrics for Evaluating and Improving DOE-Funded Basic Experimental Science

    Energy Technology Data Exchange (ETDEWEB)

    Bodnarczuk, M.

    1993-03-01

    Industrial R&D laboratories have been surprisingly successful in developing performance objectives and metrics that convincingly show that planning, management, and improvement techniques can be value-added to the actual output of R&D organizations. In this paper, I will discuss the more difficult case of developing analogous constructs for DOE-funded non-nuclear, non-weapons basic research, or as I will refer to it - basic experimental science. Unlike most industrial R&D or the bulk of applied science performed at the National Renewable Energy Laboratory (NREL), the purpose of basic experimental science is producing new knowledge (usually published in professional journals) that has no immediate application to the first link (the R) of a planned R&D chain. Consequently, performance objectives and metrics are far more difficult to define. My claim is that if one can successfully define metrics for evaluating and improving DOE-funded basic experimental science (which is the most difficult case), then defining such constructs for DOE-funded applied science should be much less problematic. With the publication of the DOE Standard - Implementation Guide for Quality Assurance Programs for Basic and Applied Research (DOE-ER-STD-6001-92) and the development of a conceptual framework for integrating all the DOE orders, we need to move aggressively toward the threefold next phase: (1) focusing the management elements found in DOE-ER-STD-6001-92 on the main output of national laboratories - the experimental science itself; (2) developing clearer definitions of basic experimental science as practice not just knowledge; and (3) understanding the relationship between the metrics that scientists use for evaluating the performance of DOE-funded basic experimental science, the management elements of DOE-ER-STD-6001-92, and the notion of continuous improvement.

  18. Towards New Probabilistic Assumptions in Business Intelligence

    OpenAIRE

    Schumann Andrew; Szelc Andrzej

    2015-01-01

    One of the main assumptions of mathematical tools in science is represented by the idea of measurability and additivity of reality. For discovering the physical universe additive measures such as mass, force, energy, temperature, etc. are used. Economics and conventional business intelligence try to continue this empiricist tradition and in statistical and econometric tools they appeal only to the measurable aspects of reality. However, a lot of important variables of economic systems cannot ...

  19. The 'revealed preferences' theory: Assumptions and conjectures

    International Nuclear Information System (INIS)

    Green, C.H.

    1983-01-01

    Being a kind of intuitive psychology, approaches based on the 'revealed preferences' theory for determining acceptable risks are a useful method for the generation of hypotheses. In view of the fact that reliability engineering develops faster than methods for the determination of reliability aims, the revealed-preferences approach is a necessary preliminary aid. Some of the assumptions on which the 'revealed preferences' theory is based will be identified and analysed and afterwards compared with experimentally obtained results. (orig./DG) [de

  20. How to Handle Assumptions in Synthesis

    Directory of Open Access Journals (Sweden)

    Roderick Bloem

    2014-07-01

    Full Text Available The increased interest in reactive synthesis over the last decade has led to many improved solutions but also to many new questions. In this paper, we discuss the question of how to deal with assumptions on environment behavior. We present four goals that we think should be met and review several different possibilities that have been proposed. We argue that each of them falls short in at least one aspect.

  1. Managerial and Organizational Assumptions in the CMM's

    DEFF Research Database (Denmark)

    Rose, Jeremy; Aaen, Ivan; Nielsen, Peter Axel

    2008-01-01

    Thinking about improving the management of software development in software firms is dominated by one approach: the capability maturity model devised and administered at the Software Engineering Institute at Carnegie Mellon University. Though CMM, and its replacement CMMI, are widely known and used...... thinking about large production and manufacturing organisations (particularly in America) in the late industrial age. Many of the difficulties reported with CMMI can be attributed to basing practice on these assumptions in organisations which have different cultures and management traditions, perhaps...

  2. Evaluation of innovative stationary phase ligand chemistries and analytical conditions for the analysis of basic drugs by supercritical fluid chromatography.

    Science.gov (United States)

    Desfontaine, Vincent; Veuthey, Jean-Luc; Guillarme, Davy

    2016-03-18

    Similar to reversed phase liquid chromatography, basic compounds can be highly challenging to analyze by supercritical fluid chromatography (SFC), as they tend to exhibit poor peak shape, especially those with high pKa values. In this study, three new stationary phase ligand chemistries available in sub-2 μm particle sizes, namely 2-picolylamine (2-PIC), 1-aminoanthracene (1-AA) and diethylamine (DEA), were tested in SFC conditions for the analysis of basic drugs. Due to the basic properties of these ligands, it is expected that repulsive forces may improve the peak shape of basic substances, similarly to the widely used 2-ethylpyridine (2-EP) phase. However, among the 38 tested basic drugs, fewer than 10% displayed Gaussian peaks (asymmetry between 0.8 and 1.4) using pure CO2/methanol on these phases. The addition of 10 mM ammonium formate as a mobile phase additive drastically improved peak shapes and increased this proportion to 67% on 2-PIC. Introducing the additive in the injection solvent rather than in the organic modifier gave acceptable results for 2-PIC only, with 31% of Gaussian peaks and an average asymmetry of 1.89 for the 38 selected basic drugs. These columns were also compared to hybrid silica (BEH), DIOL and 2-EP stationary phases, commonly employed in SFC. These phases exhibit alternative retention and selectivity. In the end, the two most interesting ligands used as complementary columns were 2-PIC and BEH, as they provided suitable peak shapes for the basic drugs and almost orthogonal selectivities. Copyright © 2016 Elsevier B.V. All rights reserved.
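    The asymmetry criterion quoted above (a peak counted as Gaussian when asymmetry lies between 0.8 and 1.4) is commonly computed from the front and tail half-widths of the peak; a minimal sketch, assuming the 10%-height definition As = b/a (conventions vary, and the paper may use a different one):

```python
# Hypothetical sketch: peak asymmetry from a sampled chromatogram.
# As = b / a, where a and b are the front and tail half-widths measured
# at 10% of peak height.
def asymmetry(times, signal, frac=0.10):
    apex = max(range(len(signal)), key=lambda i: signal[i])
    threshold = frac * signal[apex]
    # walk outward from the apex to the threshold crossings
    left = next(i for i in range(apex, -1, -1) if signal[i] <= threshold)
    right = next(i for i in range(apex, len(signal)) if signal[i] <= threshold)
    a = times[apex] - times[left]
    b = times[right] - times[apex]
    return b / a

# A perfectly symmetric (triangular) peak has asymmetry 1.0.
times = list(range(21))
signal = [10 - abs(10 - i) for i in range(21)]
print(asymmetry(times, signal))  # 1.0
```

    A tailing peak (slow decay after the apex) yields As > 1, which is the typical failure mode for basic drugs on underivatized silica.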

  3. Evaluation of socio-economic effects of R and D results at Japan Atomic Energy Research Institute. 2. Socio-economic evaluation of the basic research at JAERI

    International Nuclear Information System (INIS)

    2003-11-01

    The Japan Atomic Energy Research Institute (JAERI), as a core organization devoted to comprehensive nuclear energy research, has steadily promoted various types of research and development (R and D) studies since its establishment in June 1956. Research activities are aimed at (1) R and D for nuclear energy, (2) the utilization and application of radiation-based technologies, and (3) the establishment of basic and fundamental research in the nuclear field. Last year, the socio-economic effects of items (1) and (2) were qualitatively and quantitatively evaluated. The quantitative evaluation of item (3) from the viewpoint of socio-economic effects, however, calls for a different concept and methodology than the previously used cost-benefit approach. Achievements obtained from the activities conducted over the last 10 years implied that publicly funded basic research could contribute socio-economically to (1) an increase in useful intellectual stock, (2) the training of highly skilled college graduates, (3) the construction of new scientific facilities and the creation of methodologies, (4) the stimulation and promotion of social interrelations through networking, (5) an increase in the ability to solve scientific problems, and (6) the establishment of venture companies. In this study, we focused on item (4) for the analysis because it is assumed that the external economic effect is linked to the socio-economic effects accompanying network formation. As criteria for socio-economic effects, we assume that the external effect becomes significant in proportion to the breadth of networking and/or the magnitude of cooperation, measured by the number of co-authored studies between JAERI and other research bodies, namely the private and governmental sectors and universities. Taking these criteria into consideration, the following four items were prepared for quantitative study: (1) to clarify the basic research fields in which JAERI has established a significant effort to

  4. Comparative Interpretation of Classical and Keynesian Fiscal Policies (Assumptions, Principles and Primary Opinions

    Directory of Open Access Journals (Sweden)

    Engin Oner

    2015-06-01

    Full Text Available In the Classical School, founded by Adam Smith, which gives prominence to supply and adopts an approach of unbiased finance, the economy is always in a state of full employment equilibrium. In this system of thought, whose main philosophy is budget balance, which asserts that there is flexibility between prices and wages and regards public debt as an extraordinary instrument, interference of the state in economic and social life is frowned upon. In line with the views of classical thought, classical fiscal policy is based on three basic assumptions. These are the "Consumer State Assumption", the assumption that "Public Expenditures are Always Ineffectual" and the assumption concerning the "Impartiality of the Taxes and Expenditure Policies Implemented by the State". On the other hand, the Keynesian School, founded by John Maynard Keynes, gives prominence to demand, adopts the approach of functional finance, and asserts that cases of underemployment equilibrium and over-employment equilibrium exist in the economy as well as the full employment equilibrium, that problems cannot be solved through the invisible hand, that prices and wages are rigid, that interference of the state is essential, and that at this point fiscal policies have to be utilized effectively. Keynesian fiscal policy depends on three primary assumptions. These are the assumption of the "Filter State", the assumption that "public expenditures are sometimes effective and sometimes ineffective or neutral" and the assumption that "the tax, debt and expenditure policies of the state can never be impartial".

  5. Basic electronics

    CERN Document Server

    Holbrook, Harold D

    1971-01-01

    Basic Electronics is an elementary text designed for basic instruction in electricity and electronics. It emphasizes electronic emission and the vacuum tube and shows transistor circuits in parallel with electron tube circuits. This book also demonstrates how the transistor merely replaces the tube, with proper change of circuit constants as required. Many problems are presented at the end of each chapter. This book is comprised of 17 chapters and opens with an overview of electron theory, followed by a discussion on resistance, inductance, and capacitance, along with their effects on t

  6. Evaluation of a Numeracy Intervention Program Focusing on Basic Numerical Knowledge and Conceptual Knowledge: A Pilot Study.

    Science.gov (United States)

    Kaufmann, Liane; Handl, Pia; Thony, Brigitte

    2003-01-01

    In this study, six elementary grade children with developmental dyscalculia were trained individually and in small group settings with a one-semester program stressing basic numerical knowledge and conceptual knowledge. All the children showed considerable and partly significant performance increases on all calculation components. Results suggest…

  7. Evaluation of Retention of Knowledge and Skills Imparted to First-Year Medical Students through Basic Life Support Training

    Science.gov (United States)

    Pande, Sushma; Pande, Santosh; Parate, Vrushali; Pande, Sanket; Sukhsohale, Neelam

    2014-01-01

    Poor awareness among medical graduates about basic life support (BLS) is a matter of great concern. The presence of a trained rescuer is the key determinant of ultimate survival from life-threatening emergencies. To achieve this goal, early exposure to such life-saving skills is the right decision to foster these skills for medical students, which…

  8. New Assumptions to Guide SETI Research

    Science.gov (United States)

    Colombano, S. P.

    2018-01-01

    The recent Kepler discoveries of Earth-like planets offer the opportunity to focus our attention on detecting signs of life and technology in specific planetary systems, but I feel we need to become more flexible in our assumptions. The reason is that, while it is still reasonable and conservative to assume that life is most likely to have originated in conditions similar to ours, the vast time differences in potential evolutions render the likelihood of "matching" technologies very slim. In light of these challenges I propose a more "aggressive" approach to future SETI exploration in directions that until now have received little consideration.

  9. Assumptions for the Annual Energy Outlook 1992

    International Nuclear Information System (INIS)

    1992-01-01

    This report serves as an auxiliary document to the Energy Information Administration (EIA) publication Annual Energy Outlook 1992 (AEO) (DOE/EIA-0383(92)), released in January 1992. The AEO forecasts were developed for five alternative cases and consist of energy supply, consumption, and price projections by major fuel and end-use sector, which are published at a national level of aggregation. The purpose of this report is to present important quantitative assumptions, including world oil prices and macroeconomic growth, underlying the AEO forecasts. The report has been prepared in response to external requests, as well as analyst requirements for background information on the AEO and studies based on the AEO forecasts.

  10. Limiting assumptions in molecular modeling: electrostatics.

    Science.gov (United States)

    Marshall, Garland R

    2013-02-01

    Molecular mechanics attempts to represent intermolecular interactions in terms of classical physics. Initial efforts assumed a point charge located at the atom center and coulombic interactions. It has been recognized over multiple decades that simply representing electrostatics with a charge on each atom fails to reproduce the electrostatic potential surrounding a molecule as estimated by quantum mechanics. Molecular orbitals are not spherically symmetrical, an implicit assumption of monopole electrostatics. This perspective reviews recent evidence that requires use of multipole electrostatics and polarizability in molecular modeling.
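
    The limitation of monopole electrostatics described above can be illustrated with a toy calculation (a hypothetical sketch, not from the paper): a neutral pair of opposite charges produces a nonzero potential along its axis, while the monopole approximation, which collapses the pair to its net charge, predicts exactly zero.

```python
import math

# Toy illustration (not from the paper) of the monopole limitation: a
# neutral +q/-q pair produces a nonzero potential along its axis, while
# the monopole approximation, which collapses the pair to its net charge
# at the center, predicts exactly zero. All numbers are illustrative.

K = 8.9875517923e9  # Coulomb constant, N*m^2/C^2

def potential(charges, point):
    """Electrostatic potential at `point` from (q, position) point charges."""
    return sum(K * q / math.dist(point, pos) for q, pos in charges)

q = 1.602e-19  # elementary charge, C
d = 1.0e-10    # 1 angstrom charge separation, m
dipole = [(q, (0.0, 0.0, d / 2)), (-q, (0.0, 0.0, -d / 2))]

obs = (0.0, 0.0, 1.0e-9)  # observation point on the dipole axis, m
v_exact = potential(dipole, obs)                       # nonzero (~0.14 V)
v_monopole = potential([(0.0, (0.0, 0.0, 0.0))], obs)  # net charge 0 -> 0 V
```

    Capturing `v_exact` requires at least a dipole (multipole) term, which is the motivation for the multipole and polarizable models the perspective reviews.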

  11. Basic concepts

    International Nuclear Information System (INIS)

    Dorner, B.

    1999-01-01

    The basic concepts of neutron scattering as a tool for studying the structure and the dynamics of condensed matter are introduced. Theoretical aspects are outlined, and the two different cases of coherent and incoherent scattering are presented. The issues of resolution, coherence volume and the role of monochromators are also discussed. (K.A.)

  12. Body Basics

    Science.gov (United States)

    ... learn more about how the body works, what basic human anatomy is, and what happens when parts of ... consult your doctor. © 1995- The Nemours Foundation. All rights reserved. Images provided by The Nemours Foundation, iStock, Getty Images, Veer, Shutterstock, and Clipart.com.

  13. Basic Thermodynamics

    International Nuclear Information System (INIS)

    Duthil, P

    2014-01-01

    The goal of this paper is to present a general thermodynamic basis that is useable in the context of superconductivity and particle accelerators. The first part recalls the purpose of thermodynamics and summarizes its important concepts. Some applications, from cryogenics to magnetic systems, are covered. In the context of basic thermodynamics, only thermodynamic equilibrium is considered.

  14. Basic Thermodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Duthil, P [Orsay, IPN (France)

    2014-07-01

    The goal of this paper is to present a general thermodynamic basis that is useable in the context of superconductivity and particle accelerators. The first part recalls the purpose of thermodynamics and summarizes its important concepts. Some applications, from cryogenics to magnetic systems, are covered. In the context of basic thermodynamics, only thermodynamic equilibrium is considered.

  15. Ethanol Basics

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-01-30

    Ethanol is a widely-used, domestically-produced renewable fuel made from corn and other plant materials. More than 96% of gasoline sold in the United States contains ethanol. Learn more about this alternative fuel in the Ethanol Basics Fact Sheet, produced by the U.S. Department of Energy's Clean Cities program.

  16. Consenting to Heteronormativity: Assumptions in Biomedical Research

    NARCIS (Netherlands)

    Cottingham, M.D.; Fisher, J.A.

    2015-01-01

    The process of informed consent is fundamental to basic scientific research with human subjects. As one aspect of the scientific enterprise, clinical drug trials rely on informed consent documents to safeguard the ethical treatment of trial participants. This paper explores the role of

  17. Teaching and Learning Science in the 21st Century: Challenging Critical Assumptions in Post-Secondary Science

    Directory of Open Access Journals (Sweden)

    Amanda L. Glaze

    2018-01-01

    Full Text Available It is widely agreed upon that the goal of science education is building a scientifically literate society. Although there are a range of definitions for science literacy, most involve an ability to problem solve, make evidence-based decisions, and evaluate information in a manner that is logical. Unfortunately, science literacy appears to be an area where we struggle across levels of study, including with students who are majoring in the sciences in university settings. One reason for this problem is that we have opted to continue to approach teaching science in a way that fails to consider the critical assumptions that faculties in the sciences bring into the classroom. These assumptions include expectations of what students should know before entering given courses, whose responsibility it is to ensure that students entering courses understand basic scientific concepts, the roles of researchers and teachers, and approaches to teaching at the university level. Acknowledging these assumptions and the potential for action to shift our teaching and thinking about post-secondary education represents a transformative area in science literacy and preparation for the future of science as a field.

  18. What's Love Got to Do with It? Rethinking Common Sense Assumptions

    Science.gov (United States)

    Trachman, Matthew; Bluestone, Cheryl

    2005-01-01

    One of the most basic tasks in introductory social science classes is to get students to reexamine their common sense assumptions concerning human behavior. This article introduces a shared assignment developed for a learning community that paired an introductory sociology and psychology class. The assignment challenges students to rethink the…

  19. Wavelet basics

    CERN Document Server

    Chan, Y T

    1995-01-01

    Since the study of wavelets is a relatively new area, with much of the research coming from mathematicians, most of the literature uses terminology, concepts and proofs that may, at times, be difficult and intimidating for the engineer. Wavelet Basics has therefore been written as an introductory book for scientists and engineers. The mathematical presentation has been kept simple, the concepts being presented in elaborate detail in a terminology that engineers will find familiar. Difficult ideas are illustrated with examples which will also aid in the development of an intuitive insight. Chapter 1 reviews the basics of signal transformation and discusses the concepts of duals and frames. Chapter 2 introduces the wavelet transform, contrasts it with the short-time Fourier transform and clarifies the names of the different types of wavelet transforms. Chapter 3 links multiresolution analysis, orthonormal wavelets and the design of digital filters. Chapter 4 gives a tour d'horizon of topics of current interest: wave...

  20. Education: The Basics. The Basics

    Science.gov (United States)

    Wood, Kay

    2011-01-01

    Everyone knows that education is important, we are confronted daily by discussion of it in the media and by politicians, but how much do we really know about education? "Education: The Basics" is a lively and engaging introduction to education as an academic subject, taking into account both theory and practice. Covering the schooling system, the…

  1. An introduction to economic analysis in medicine - the basics of methodology and chosen terms. Examples of results of evaluation in nuclear medicine

    International Nuclear Information System (INIS)

    Brockhuis, B.M.; Lass, P.

    2002-01-01

    This article overviews the basic terms and methodology of economic analysis in health care. The most important forms of economic analysis: cost-effectiveness, cost-utility and cost-minimisation analysis and aims of their application are presented. Particular emphasis is put on economic evaluation in nuclear medicine, e.g. FDG-PET v. thoracotomy in lung cancer diagnosis, radioiodine therapy v. antithyroid drugs in hyperthyroidism and technetium-99m-MIBI breast imaging v. biopsy in nonpalpable breast abnormalities. (author)

  2. Leakage-Resilient Circuits without Computational Assumptions

    DEFF Research Database (Denmark)

    Dziembowski, Stefan; Faust, Sebastian

    2012-01-01

    Physical cryptographic devices inadvertently leak information through numerous side-channels. Such leakage is exploited by so-called side-channel attacks, which often allow for a complete security breach. A recent trend in cryptography is to propose formal models to incorporate leakage...... on computational assumptions, our results are purely information-theoretic. In particular, we do not make use of public key encryption, which was required in all previous works...... into the model and to construct schemes that are provably secure within them. We design a general compiler that transforms any cryptographic scheme, e.g., a block-cipher, into a functionally equivalent scheme which is resilient to any continual leakage provided that the following three requirements are satisfied...

  3. Assumptions and Challenges of Open Scholarship

    Directory of Open Access Journals (Sweden)

    George Veletsianos

    2012-10-01

    Full Text Available Researchers, educators, policymakers, and other education stakeholders hope and anticipate that openness and open scholarship will generate positive outcomes for education and scholarship. Given the emerging nature of open practices, educators and scholars are finding themselves in a position in which they can shape and/or be shaped by openness. The intention of this paper is (a to identify the assumptions of the open scholarship movement and (b to highlight challenges associated with the movement’s aspirations of broadening access to education and knowledge. Through a critique of technology use in education, an understanding of educational technology narratives and their unfulfilled potential, and an appreciation of the negotiated implementation of technology use, we hope that this paper helps spark a conversation for a more critical, equitable, and effective future for education and open scholarship.

  4. Challenging the assumptions for thermal sensation scales

    DEFF Research Database (Denmark)

    Schweiker, Marcel; Fuchs, Xaver; Becker, Susanne

    2016-01-01

    Scales are widely used to assess the personal experience of thermal conditions in built environments. Most commonly, thermal sensation is assessed, mainly to determine whether a particular thermal condition is comfortable for individuals. A seven-point thermal sensation scale has been used...... extensively, which is suitable for describing a one-dimensional relationship between physical parameters of indoor environments and subjective thermal sensation. However, human thermal comfort is not merely a physiological but also a psychological phenomenon. Thus, it should be investigated how scales for its...... assessment could benefit from a multidimensional conceptualization. The common assumptions related to the usage of thermal sensation scales are challenged, empirically supported by two analyses. These analyses show that the relationship between temperature and subjective thermal sensation is non...

  5. LEVEL OF KNOWLEDGE OF THE BASIC CONCEPTS OF PHYSICAL EVALUATION FOR THE PROFESSIONALS IN THE ACADEMICS OF THE CITY OF JOÃO PESSOA - PB

    Directory of Open Access Journals (Sweden)

    Rodrigo Benevides Ceriani

    2005-10-01

    Full Text Available The objective of this study is to verify the level of knowledge of the basic concepts of physical evaluation among the professionals responsible for this practice in gyms. It is a cross-sectional field study of professionals working in the area of physical evaluation, registered with CREF 10 - PB/RN. A questionnaire with open and closed questions was applied to 39 individuals. Percentage frequencies were computed using an Excel spreadsheet. The results showed that: 61.54% charge for the physical evaluation, in 41.66% of cases 15 reais; 69.23% do not include it in the enrollment fee; 84.61% know what a test is; 61.54% know what measuring is; and 53.45% know what evaluating is. Three people without a degree in physical education, or in any other higher-level course, were found working in the area. Conclusions: professionals without a degree in physical education or any other higher-level course still work directly with physical evaluation inside gyms. Many evaluators lack the basic theoretical knowledge of the concepts of testing, measuring and evaluating. In general, a fee is charged for the physical evaluation, most often included in the client's enrollment.

  6. High intertester reliability of the cumulated ambulation score for the evaluation of basic mobility in patients with hip fracture

    DEFF Research Database (Denmark)

    Kristensen, Morten Tange; Andersen, Lene; Bech-Jensen, Rie

    2009-01-01

    OBJECTIVE: To examine the intertester reliability of the three activities of the Cumulated Ambulation Score (CAS) and the total CAS, and to define limits for the smallest change in basic mobility that indicates a real change in patients with hip fracture. DESIGN: An intertester reliability study....... SETTING: An acute 20-bed orthopaedic hip fracture unit. SUBJECTS: Fifty consecutive patients with a median age of 83 (25-75% quartile, 68-86) years. INTERVENTIONS: The CAS, which describes the patient's independency in three activities - (1) getting in and out of bed, (2) sit to stand from a chair, and (3...

  7. Methodology and assumptions for evaluating heating and cooling energy requirements in new single-family residential buildings: Technical support document for the PEAR (Program for Energy Analysis of Residences) microcomputer program

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Y.J.; Ritschard, R.; Bull, J.; Byrne, S.; Turiel, I.; Wilson, D.; Hsui, C.; Foley, D.

    1987-01-01

    This report provides technical documentation for a software package called PEAR (Program for Energy Analysis of Residences) developed by LBL. PEAR offers an easy-to-use and accurate method of estimating the energy savings associated with various energy conservation measures used in site-built, single-family homes. This program was designed for use by non-technical groups such as home builders, home buyers or others in the buildings industry, and developed as an integral part of a set of voluntary guidelines entitled Affordable Housing Through Energy Conservation: A Guide to Designing and Constructing Energy Efficient Homes. These guidelines provide a method for selecting and evaluating cost-effective energy conservation measures based on the energy savings estimated by PEAR. This work is part of a Department of Energy program aimed at conducting research that will improve the energy efficiency of the nation's stock of conventionally-built and manufactured homes, and presenting the results to the public in a simplified format.

  8. Human Praxis: A New Basic Assumption for Art Educators of the Future.

    Science.gov (United States)

    Hodder, Geoffrey S.

    1980-01-01

    After analyzing Vincent Lanier's five characteristic roles of art education, the article briefly explains the pedagogy of Paulo Freire, based on human praxis, and applies it to the existing "oppressive" art education system. The article reduces Lanier's roles to resemble a single Freirean model. (SB)

  9. Basic principles

    International Nuclear Information System (INIS)

    Wilson, P.D.

    1996-01-01

    Some basic explanations are given of the principles underlying the nuclear fuel cycle, starting with the physics of atomic and nuclear structure and continuing with nuclear energy and reactors, fuel and waste management and finally a discussion of economics and the future. An important aspect of the fuel cycle concerns the possibility of ''closing the back end'' i.e. reprocessing the waste or unused fuel in order to re-use it in reactors of various kinds. The alternative, the ''once-through'' cycle, discards the discharged fuel completely. An interim measure involves the prolonged storage of highly radioactive waste fuel. (UK)

  10. Basic electronics

    CERN Document Server

    Tayal, DC

    2010-01-01

    The second edition of this book incorporates the comments and suggestions of my friends and students who have critically studied the first edition. In this edition the changes and additions have been made and subject matter has been rearranged at some places. The purpose of this text is to provide a comprehensive and up-to-date study of the principles of operation of solid state devices, their basic circuits and application of these circuits to various electronic systems, so that it can serve as a standard text not only for universities and colleges but also for technical institutes. This book

  11. Assumptions for the Annual Energy Outlook 1993

    International Nuclear Information System (INIS)

    1993-01-01

    This report is an auxiliary document to the Annual Energy Outlook 1993 (AEO) (DOE/EIA-0383(93)). It presents a detailed discussion of the assumptions underlying the forecasts in the AEO. The energy modeling system is an economic equilibrium system, with component demand modules representing end-use energy consumption by major end-use sector. Another set of modules represents petroleum, natural gas, coal, and electricity supply patterns and pricing. A separate module generates annual forecasts of important macroeconomic and industrial output variables. Interactions among these components of energy markets generate projections of prices and quantities for which energy supply equals energy demand. This equilibrium modeling system is referred to as the Intermediate Future Forecasting System (IFFS). The supply models in IFFS for oil, coal, natural gas, and electricity determine supply and price for each fuel depending upon consumption levels, while the demand models determine consumption depending upon end-use price. IFFS solves for market equilibrium for each fuel by balancing supply and demand to produce an energy balance in each forecast year
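
    The supply-demand balancing that the abstract attributes to IFFS can be sketched in miniature. The linear supply and demand curves and the price-adjustment rule below are illustrative assumptions of this sketch, not the actual IFFS modules.

```python
# Toy sketch of a market-equilibrium solver: adjust the price until
# supply equals demand. Curves and the adjustment rule are hypothetical.

def demand(p):
    return 100.0 - 2.0 * p  # hypothetical demand curve

def supply(p):
    return 10.0 + 1.0 * p   # hypothetical supply curve

def equilibrium(p=0.0, step=0.1, tol=1e-6, max_iter=100_000):
    """Iterate on price until excess demand vanishes (tatonnement-style)."""
    for _ in range(max_iter):
        gap = demand(p) - supply(p)  # excess demand at current price
        if abs(gap) < tol:
            return p
        p += step * gap              # raise price when demand exceeds supply
    raise RuntimeError("price iteration did not converge")
```

    Here the curves intersect at p = 30, which the iteration recovers; a system like IFFS solves the analogous balance simultaneously for oil, gas, coal and electricity in each forecast year.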

  12. Evaluative Conditioning 2.0: Direct versus Associative Transfer of Affect to Brands

    NARCIS (Netherlands)

    S.T.L.R. Sweldens (Steven)

    2009-01-01

    A basic assumption in advertising is that brands become more well-liked after they are presented in positive contexts. This assumption is warranted because studies on ‘evaluative conditioning’ have demonstrated that when a brand is repeatedly presented together with positive affective

  13. Underlying assumptions and core beliefs in anorexia nervosa and dieting.

    Science.gov (United States)

    Cooper, M; Turner, H

    2000-06-01

    To investigate assumptions and beliefs in anorexia nervosa and dieting. The Eating Disorder Belief Questionnaire (EDBQ) was administered to patients with anorexia nervosa, dieters and female controls. The patients scored more highly than the other two groups on assumptions about weight and shape, assumptions about eating and negative self-beliefs. The dieters scored more highly than the female controls on assumptions about weight and shape. The cognitive content of anorexia nervosa (both assumptions and negative self-beliefs) differs from that found in dieting. Assumptions about weight and shape may also distinguish dieters from female controls.

  14. Evaluation of Pharmacokinetic Assumptions Using a 443 Chemical Library (IVIVE)

    Science.gov (United States)

    With the increasing availability of high-throughput and in vitro data for untested chemicals, there is a need for pharmacokinetic (PK) models for in vitro to in vivo extrapolation (IVIVE). Though some PBPK models have been created for individual compounds us...

  15. Evaluation of Pharmacokinetic Assumptions Using a 443 Chemical Library (SOT)

    Science.gov (United States)

    With the increasing availability of high-throughput and in vitro data for untested chemicals, there is a need for pharmacokinetic (PK) models for in vitro to in vivo extrapolation (IVIVE). Though some PBPK models have been created for individual compounds using in vivo data, we ...

  16. An evaluation of a community-based basic parenting programme: a two-year follow-up.

    Science.gov (United States)

    Roberts, Deborah

    2012-02-01

    Behavioural difficulties in the early years and through primary school age present a challenge to community practitioners; and the long-term costs to society of untreated conduct disorder place a huge financial strain on services, as well as leading to a poor prognosis for the children affected. The aim of this study was to establish the long-term effects for participants attending a 12-week Basic Incredible Years Programme, two years post-completion. Fifty-seven participants were interviewed, representing 63% of the original sample, who parented children aged 1-12 years. Pre- and post-intervention and follow-up measures were the General Health Questionnaire (30) and Eyberg Child Behaviour Checklist. The most common theme reported was that it had helped to change their child's behaviour, and this was demonstrated quantitatively with mean average scores for the Eyberg Child Behaviour Inventory reducing to below clinical cut-off post-group and at two years. This same pattern was seen for participants' mental health, with improvements largely maintained at two years. Of the one-third of the children whose behaviour deteriorated two years after the course, two-thirds of these children had experienced adverse life events or had a secondary diagnosis.

  17. [Empowerment in prevention and health promotion--a critical conceptual evaluation of basic understanding, dimensions and assessment problems].

    Science.gov (United States)

    Kliche, T; Kröger, G

    2008-12-01

    Empowerment is an important concept in health care, but despite its prevalence it seems to be more of a buzz word. Thus, a conceptual review on empowerment in prevention and health promotion was carried out. 62 German and international theoretical contributions, reviews and studies were incorporated, covering the fields of prevention, care and therapy, rehabilitation, health-care research, nursing and work-related stress. The analysis revealed eight main dimensions of empowerment: (1) shared decision-making, (2) self-efficacy, (3) social support and social capital, (4) skills and competences, (5) health care utilisation, (6) goal setting and attainment, (7) reflexive thought and (8) innovation. Their empirical assessment can be carried out on a micro-, meso-, or macro-level. Three distinct basic conceptual notions emerged from the analysis, each applying its own specific research questions and measurement instruments: clinical, organizational-professional and political understanding of "empowerment". Therefore, these three specific conceptual notions should each be developed and tested separately, in particular in reviews, and empirical studies should embrace all eight subdimensions.

  18. Inflation Basics

    Energy Technology Data Exchange (ETDEWEB)

    Green, Dan [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2014-03-01

    inflation since metrical fluctuations, both scalar and tensor, are also produced in inflationary models. Thus, the time appears to be appropriate for a very basic and simple exposition of the inflationary model written from a particle physics perspective. Only the simplest scalar model will be explored because it is easy to understand and contains all the basic elements of the inflationary model.

  19. Inflation Basics

    International Nuclear Information System (INIS)

    Green, Dan

    2014-01-01

    waves imprinted on the CMB. These would be a ''smoking gun'' for inflation since metrical fluctuations, both scalar and tensor, are also produced in inflationary models. Thus, the time appears to be appropriate for a very basic and simple exposition of the inflationary model written from a particle physics perspective. Only the simplest scalar model will be explored because it is easy to understand and contains all the basic elements of the inflationary model.

  20. Development of design window evaluation and display system. 2. Confirmations of the basic performance of genetic algorithm

    International Nuclear Information System (INIS)

    Murakami, Satoshi; Muramatsu, Toshiharu

    2003-05-01

    In feasibility studies on commercialized fast reactors, a large-scale sodium-cooled fast breeder reactor with a trend toward simplified and compact systems is being investigated; however, special attention should be paid in the thermohydraulic design to gas entrainment from free surfaces, flow-induced vibration of in-vessel components, thermal shock to various structures due to high-speed coolant flows, non-symmetrical coolant flows, etc. in the reactor vessel. Since many thermal-hydraulic issues thus relate to each other in complicated ways in reactor design, multiple-criteria decision-making based on an understanding of the relationships among thermal-hydraulic issues is indispensable for designing the reactor efficiently. Genetic Algorithm (GA), which is one of the methods for multiple-criteria decision-making, was applied to typical simple objective optimization problems, and its basic performance was then confirmed. From the analyses, the following results have been obtained. (1) In the unimodal optimization problem, it was confirmed that GA has sufficient searching ability. (2) It was confirmed that GA can also be applied to discrete optimization problems. (3) When applying GA to the combinational optimization problem, searching efficiency is improved more by increasing the number of experimental runs than by increasing the maximum number of generations. (4) When applying GA to the multimodal optimization problem, searching ability is improved by using the two genetic operators (i.e., mutation and the elite strategy) at the same time. (author)
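
    A minimal GA for the unimodal case, restricted to the two operators the abstract highlights (mutation and an elite strategy), might look as follows. The objective function, population size, mutation scale and seed are all illustrative assumptions, not parameters from the study.

```python
import random

# Minimal GA sketch for a unimodal objective, using only mutation and an
# elite strategy. All parameters are illustrative assumptions.

def fitness(x):
    return -(x - 3.0) ** 2  # unimodal objective, maximum at x = 3

def run_ga(pop_size=30, generations=100, mutation_scale=0.5, seed=42):
    rng = random.Random(seed)
    population = [rng.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        elite = ranked[: pop_size // 5]  # elite strategy: keep the best 20%
        children = []
        while len(elite) + len(children) < pop_size:
            parent = rng.choice(elite)
            child = parent + rng.gauss(0.0, mutation_scale)  # mutation
            children.append(min(10.0, max(-10.0, child)))    # clamp to bounds
        population = elite + children
    return max(population, key=fitness)
```

    Because the elite is carried over unchanged, the best solution never degrades, which is the property the abstract's finding (4) relies on when combining the two operators.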

  1. The zero-sum assumption in neutral biodiversity theory

    NARCIS (Netherlands)

    Etienne, R.S.; Alonso, D.; McKane, A.J.

    2007-01-01

    The neutral theory of biodiversity as put forward by Hubbell in his 2001 monograph has received much criticism for its unrealistic simplifying assumptions. These are the assumptions of functional equivalence among different species (neutrality), the assumption of point mutation speciation, and the

  2. Philosophy of Technology Assumptions in Educational Technology Leadership

    Science.gov (United States)

    Webster, Mark David

    2017-01-01

    A qualitative study using grounded theory methods was conducted to (a) examine what philosophy of technology assumptions are present in the thinking of K-12 technology leaders, (b) investigate how the assumptions may influence technology decision making, and (c) explore whether technological determinist assumptions are present. Subjects involved…

  3. Designing and Evaluating a Professional Development Programme for Basic Technology Integration in English as a Foreign Language (EFL) Classrooms

    Science.gov (United States)

    Ansyari, Muhammad Fauzan

    2015-01-01

    This study aims to develop and evaluate a professional development programme for technology integration in an Indonesian university's English language teaching setting. The study explored the characteristics of this programme to English lecturers' technological pedagogical content knowledge (TPCK) development. This design-based research employed…

  4. Visual Product Evaluation: Using the Semantic Differential to Investigate the Influence of Basic Vase Geometry on Users’ Perception

    DEFF Research Database (Denmark)

    Achiche, Sofiane; Maier, Anja; Milanova, Krasimira

    2014-01-01

    Products evoke emotions in people. Emotions can influence purchase decisions and product evaluations. It is widely acknowledged that better product performance and higher user satisfaction can be reached through aesthetic design. However, when designing a new product, most of the attention...

  5. Iowa CASAS Pilot Project Reports: An Initial Evaluation of CASAS Effectiveness in Iowa's Adult Basic Education Programs.

    Science.gov (United States)

    Strom, Mary L.

    In fall 1992, the Iowa Department of Education began pilot tests of the Comprehensive Adult Student Assessment System (CASAS), an assessment system evaluating reading, math, and problem solving in a life skills context for adult remedial programs. This document provides reports from the nine community colleges that served as test sites, describing…

  6. Establishing the minimal number of virtual reality simulator training sessions necessary to develop basic laparoscopic skills competence: evaluation of the learning curve

    Directory of Open Access Journals (Sweden)

    Ricardo Jordao Duarte

    2013-09-01

    Full Text Available Introduction: Medical literature is scarce on information to define a basic skills training program for laparoscopic surgery (peg and transferring, cutting, clipping). The aim of this study was to determine the minimal number of simulator sessions of basic laparoscopic tasks necessary to elaborate an optimal virtual reality training curriculum. Materials and Methods: Eleven medical students with no previous laparoscopic experience were spontaneously enrolled. They were submitted to simulator training sessions starting at level 1 (Immersion Lap VR, San Jose, CA), including sequentially camera handling, peg and transfer, clipping and cutting. Each student trained twice a week until 10 sessions were completed. The score indexes were registered and analyzed. The total errors of the evaluation sequences (camera, peg and transfer, clipping and cutting) were computed and thereafter correlated to the total of items evaluated in each step, resulting in a success-percent ratio for each student for each set of each completed session. Thereafter, we computed the cumulative success rate over the 10 sessions, obtaining an analysis of the learning process. The learning curve was analyzed by non-linear regression. Results: An r2 = 0.73 (p < 0.001) was obtained, with 4.26 (∼five) sessions being necessary to reach the plateau of 80% of the estimated acquired knowledge; 100% of the students reached this level of skills. From the fifth session till the 10th, the gain of knowledge was not significant, although some students reached 96% of the expected improvement. Conclusions: This study revealed that after five sequential simulator training sessions the students' learning curve reaches a plateau. Further sessions at the same difficulty level do not promote any improvement in basic laparoscopic surgical skills, and the students should be introduced to a more difficult level of training tasks.

  7. Establishing the minimal number of virtual reality simulator training sessions necessary to develop basic laparoscopic skills competence: evaluation of the learning curve.

    Science.gov (United States)

    Duarte, Ricardo Jordão; Cury, José; Oliveira, Luis Carlos Neves; Srougi, Miguel

    2013-01-01

    Medical literature is scarce on information to define a basic skills training program for laparoscopic surgery (peg and transferring, cutting, clipping). The aim of this study was to determine the minimal number of simulator sessions of basic laparoscopic tasks necessary to elaborate an optimal virtual reality training curriculum. Eleven medical students with no previous laparoscopic experience were spontaneously enrolled. They were submitted to simulator training sessions starting at level 1 (Immersion Lap VR, San Jose, CA), including sequentially camera handling, peg and transfer, clipping and cutting. Each student trained twice a week until 10 sessions were completed. The score indexes were registered and analyzed. The total errors of the evaluation sequences (camera, peg and transfer, clipping and cutting) were computed and thereafter correlated to the total of items evaluated in each step, resulting in a success-percent ratio for each student for each set of each completed session. Thereafter, we computed the cumulative success rate over the 10 sessions, obtaining an analysis of the learning process. The learning curve was analyzed by non-linear regression; an r2 = 0.73 (p < 0.001) was obtained, with 4.26 (approximately five) sessions being necessary to reach the plateau of 80% of the estimated acquired knowledge, and 100% of the students reached this level of skills. From the fifth session till the 10th, the gain of knowledge was not significant, although some students reached 96% of the expected improvement. This study revealed that after five sequential simulator training sessions the students' learning curve reaches a plateau. Further sessions at the same difficulty level do not promote any improvement in basic laparoscopic surgical skills, and the students should be introduced to a more difficult level of training tasks.
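
The plateau analysis described above can be reproduced with a simple non-linear fit. The abstract does not state which model the authors fitted, so the saturating-exponential form y(x) = 1 - exp(-x/tau) and the per-session success ratios below are assumptions for illustration; only the reported figures (a plateau reached at 80% after roughly 4.26 sessions) guided the choice of numbers.

```python
import math

def model(x, tau):
    """Assumed saturating learning curve: fraction of the plateau reached."""
    return 1.0 - math.exp(-x / tau)

def fit_tau(sessions, scores):
    """Least-squares fit of tau by a coarse grid search (no SciPy needed)."""
    return min((i / 100.0 for i in range(1, 1001)),
               key=lambda tau: sum((model(x, tau) - y) ** 2
                                   for x, y in zip(sessions, scores)))

# Hypothetical per-session success ratios (invented for illustration).
sessions = list(range(1, 11))
scores = [0.31, 0.53, 0.67, 0.77, 0.83, 0.87, 0.91, 0.93, 0.95, 0.96]
tau = fit_tau(sessions, scores)
sessions_to_80 = -tau * math.log(1 - 0.80)  # sessions to 80% of the plateau
print(round(tau, 2), round(sessions_to_80, 2))
```

With data of this shape, the fitted time constant implies roughly four to five sessions to reach 80% of the plateau, matching the order of the reported 4.26 sessions.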

  8. The basic approaches to evaluation of effects of the long-term radiation exposure in a range of 'low' doses

    International Nuclear Information System (INIS)

    Takhauov, R.M.; Karpov, A.B.; Litvyakov, N.V.

    2010-01-01

    Previous research has established the main concepts of the deterministic effects of radiation exposure and defined the main postulates of radiation medicine with respect to average and high levels of exposure. At the same time, research performed to evaluate the stochastic effects caused by 'low' doses of ionizing radiation has failed to give an unambiguous answer either about the dose required to induce an effect or about the spectrum of recorded pathological processes or diseases. In this connection, in our opinion, the most promising approach to the present problem is research in the following directions: 1. Evaluation of the structure and dynamics of mortality and morbidity among radiation-production personnel and residents of nearby territories. 2. Estimation of the risk of developing major diseases under long-term radiation exposure and analysis of the role of the radiation factor in their pathogenesis. 3. Study of genetic disturbances in persons exposed to long-term ionizing radiation in 'low' doses and in their descendants. 4. Study of genetic markers of individual radiosensitivity. 5. Evaluation of disturbances of structural and functional homeostasis induced by radiation exposure in 'low' doses. The object of study is the population of the Seversk closed administrative territorial formation and, above all, the personnel of the Siberian Group of Chemical Enterprises (SGCE), the largest complex of atomic-industry production facilities in the world. The Regional Medico-Dosimetric Register (RMDR), maintained since 2001, is the basis for the epidemiological studies. The register database contains information on about 66,000 SGCE workers (the whole number of workers over all years of the SGCE's existence, from 1950 to the present day), 35,000 of whom are workers of the so-called main production facilities. About 96% of the workers exposed to external radiation have a cumulative dose below 500 mSv. On the basis of the laboratory of genomic medicine, a bank of DNA and biological material has been created and is constantly enriched

  9. Teachers in Action Research: Assumptions and Potentials

    Science.gov (United States)

    Li, Yuen-Ling

    2008-01-01

    Research literature has long indicated that action research may stimulate practitioners themselves to actively evaluate the quality of their practice. This study is designed to report the use of action research for the development of early years professional practice by analyzing the pre-project and the post-project video-filmed teaching events.…

  10. HYPROLOG: A New Logic Programming Language with Assumptions and Abduction

    DEFF Research Database (Denmark)

    Christiansen, Henning; Dahl, Veronica

    2005-01-01

    We present HYPROLOG, a novel integration of Prolog with assumptions and abduction which is implemented in and partly borrows syntax from Constraint Handling Rules (CHR) for integrity constraints. Assumptions are a mechanism inspired by linear logic and taken over from Assumption Grammars. The language shows a novel flexibility in the interaction between the different paradigms, including all additional built-in predicates and constraint solvers that may be available. Assumptions and abduction are especially useful for language processing, and we can show how HYPROLOG works seamlessly together...

  11. Spatial modelling of assumptions of tourism development using geographic IT

    Directory of Open Access Journals (Sweden)

    Jitka Machalová

    2010-01-01

    Full Text Available The aim of this article is to show the possibilities of spatial modelling and analysis of the assumptions of tourism development in the Czech Republic, with the objective of making decision-making processes in tourism easier and more efficient (for companies and clients as well as for destination managements). The development and placement of tourism depend on the factors (conditions) that influence its application in specific areas. These factors are usually divided into three groups: selective, localization and realization. Tourism is inseparably connected with space, i.e. the countryside. The countryside can be modelled and subsequently analysed by means of geographical information technologies. With the help of spatial modelling and the following analyses, the localization and realization conditions in the regions of the Czech Republic have been evaluated. The best localization conditions have been found in the Liberecký region. The capital city of Prague has negligible natural conditions; however, its social conditions are on a high level. Next, the spatial analyses have shown that the best realization conditions are provided by the capital city of Prague, followed by the Central-Bohemian, South-Moravian, Moravian-Silesian and Karlovarský regions. The development of a tourism destination depends not only on the localization and realization factors but is also fundamentally affected by the level of the local destination management. Spatial modelling can help destination managers in decision-making processes in order to make optimal use of the destination potential and to target their marketing activities efficiently.

  12. Behavioural assumptions in labour economics: Analysing social security reforms and labour market transitions

    OpenAIRE

    van Huizen, T.M.

    2012-01-01

    The aim of this dissertation is to test behavioural assumptions in labour economics models and thereby improve our understanding of labour market behaviour. The assumptions under scrutiny in this study are derived from an analysis of recent influential policy proposals: the introduction of savings schemes in the system of social security. A central question is how this reform will affect labour market incentives and behaviour. Part I (Chapter 2 and 3) evaluates savings schemes. Chapter 2 exam...

  13. Assumptions of Customer Knowledge Enablement in the Open Innovation Process

    Directory of Open Access Journals (Sweden)

    Jokubauskienė Raminta

    2017-08-01

    Full Text Available In the scientific literature, open innovation is considered one of the most effective means to innovate and gain a competitive advantage. In practice, there is a variety of open innovation activities, but customers nevertheless stand as the cornerstone in this area, since customers' knowledge is one of the most important sources of new knowledge and ideas. When evaluating the context in which open innovation and customer knowledge enablement interact, it is necessary to take into account the importance of customer knowledge management. It is increasingly highlighted that customer knowledge management facilitates the creation of innovations. However, other factors that influence open innovation, and at the same time customer knowledge management, should also be examined. This article presents a theoretical model which reveals the assumptions of the open innovation process and their impact on the firm's performance.

  14. Evaluation of doctors' performance as facilitators in basic medical science lecture classes in a new Malaysian medical school

    Directory of Open Access Journals (Sweden)

    Ismail S

    2015-03-01

    Full Text Available Salwani Ismail,1 Abdus Salam,2 Ahmed G Alattraqchi,1 Lakshmi Annamalai,1 Annamalai Chockalingam,1 Wan Putri Elena,3 Nor Iza A Rahman,1 Abdullahi Rabiu Abubakar,1 Mainul Haque1 1Faculty of Medicine, Universiti Sultan Zainal Abidin, Kuala Terengganu, Terengganu, Malaysia; 2Department of Medical Education, Universiti Kebangsaan Malaysia Medical Centre, Kuala Lumpur, Malaysia; 3School of Health Sciences, Health Campus, Universiti Sains Malaysia, Kubang Kerian, Kelantan, Malaysia Background: Didactic lecture is the oldest and most commonly used method of teaching. In addition, it is considered one of the most efficient ways to disseminate theories, ideas, and facts. Many critics feel that lectures are an obsolete method to use when students need to perform hands-on activities, which is an everyday need in the study of medicine. This study evaluates students' perceptions regarding lecture quality in a new medical school. Methods: This was a cross-sectional study conducted among the medical students of Universiti Sultan Zainal Abidin. The study population was 468 preclinical medical students from years 1 and 2 of academic year 2012–2013. Data were collected using a validated instrument. There were six different sections of questions using a 5-point Likert scale. The data were then compiled and analyzed, using SPSS version 20. Results: The response rate was 73%. Among 341 respondents, 30% were male and 70% were female. Eighty-five percent of respondents agree or strongly agree that the lectures had met the criteria with regard to organization of lecture materials. Similarly, 97% of students agree or strongly agree that lecturers maintained adequate voices and gestures. Conclusion: Medical students are quite satisfied with the lecture classes and the lectures. However, further research is required to identify student-centered teaching and learning methods to promote active learning. Keywords: lecture, effectiveness, evaluation, undergraduate medical

  15. A PSA study for the SMART basic design

    International Nuclear Information System (INIS)

    Han, Sang Hoon; Kim, H. C.; Yang, S. H.; Lee, D. J.

    2002-03-01

    SMART (System-Integrated Modular Advanced Reactor) is an advanced integral-type small-and-medium-category nuclear power reactor with a rated thermal power of 330 MW that is under development. A Probabilistic Safety Analysis (PSA) for the SMART basic design has been performed to evaluate the safety and optimize the design. Since the basic design is complete but the detailed design of SMART is not yet available, we made several assumptions about the system design before performing the PSA. The scope of the PSA was limited to the Level-1 internal full-power PSA. The Level-2 and Level-3 PSA, the external-events PSA, and the low-power/shutdown PSA will be performed in the final design stage.

  16. Development of an educational simulator system, ECCSIM-Lite, for the acquisition of basic perfusion techniques and evaluation.

    Science.gov (United States)

    Ninomiya, Shinji; Tokumine, Asako; Yasuda, Toru; Tomizawa, Yasuko

    2007-01-01

    A training system with quantitative evaluation of performance for training perfusionists is valuable for preparation for rare but critical situations. A simulator system, ECCSIM-Lite, for extracorporeal circulation (ECC) training of perfusionists was developed. This system consists of a computer system containing a simulation program of the hemodynamic conditions and the training scenario with instructions, a flow sensor unit, a reservoir with a built-in water level sensor, and an ECC circuit with a soft bag representing the human body. This system is relatively simple, easy to handle, compact, and reasonably inexpensive. Quantitative information is recorded, including the changes in arterial flow by the manipulation of a knob, the changes in venous drainage by handling a clamp, and the change in reservoir level; the time courses of the above parameters are presented graphically. To increase the realism of the training, a numerical-hydraulic circulatory model was applied. Following the instruction and explanation of the scenario in the form of audio and video captions, it is possible for a trainee to undertake self-study without an instructor or a computer operator. To validate the system, a training session was given to three beginners using a simple training scenario; it was possible to record the performance of the perfusion sessions quantitatively. In conclusion, the ECCSIM-Lite system is expected to be useful for perfusion training, since quantitative information about the trainee's performance is recorded and it is possible to use the data for assessment and comparison.
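
The reservoir-level signal that the system records follows from a simple mass balance between venous drainage and arterial flow. The sketch below illustrates that relation only; the geometry, flow values, and time step are assumed for illustration and are not taken from the ECCSIM-Lite implementation.

```python
def reservoir_levels(level0, area, inflow, outflow, dt, steps):
    """Mass-balance sketch of the reservoir level in an ECC circuit:
    d(level)/dt = (venous inflow - arterial outflow) / cross-sectional area.
    Units: level in cm, area in cm^2, flows in mL/s (1 mL = 1 cm^3)."""
    level = level0
    levels = [level]
    for _ in range(steps):
        level += (inflow - outflow) / area * dt
        levels.append(level)
    return levels

# Illustrative imbalance: arterial flow 5 L/min (~83.3 mL/s) versus venous
# drainage 4 L/min (~66.7 mL/s); the reservoir drains, which the built-in
# water level sensor would register.
lv = reservoir_levels(level0=10.0, area=200.0, inflow=66.7, outflow=83.3,
                      dt=1.0, steps=60)
print(round(lv[-1], 2))
```

A trainee's knob and clamp manipulations change `outflow` and `inflow` over time; recording those time courses alongside the level is what allows quantitative scoring of a perfusion session.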

  17. Evaluation of doctors' performance as facilitators in basic medical science lecture classes in a new Malaysian medical school.

    Science.gov (United States)

    Ismail, Salwani; Salam, Abdus; Alattraqchi, Ahmed G; Annamalai, Lakshmi; Chockalingam, Annamalai; Elena, Wan Putri; Rahman, Nor Iza A; Abubakar, Abdullahi Rabiu; Haque, Mainul

    2015-01-01

    Didactic lecture is the oldest and most commonly used method of teaching. In addition, it is considered one of the most efficient ways to disseminate theories, ideas, and facts. Many critics feel that lectures are an obsolete method to use when students need to perform hands-on activities, which is an everyday need in the study of medicine. This study evaluates students' perceptions regarding lecture quality in a new medical school. This was a cross-sectional study conducted of the medical students of Universiti Sultan Zainal Abidin. The study population was 468 preclinical medical students from years 1 and 2 of academic year 2012-2013. Data were collected using a validated instrument. There were six different sections of questions using a 5-point Likert scale. The data were then compiled and analyzed, using SPSS version 20. The response rate was 73%. Among 341 respondents, 30% were male and 70% were female. Eighty-five percent of respondents agree or strongly agree that the lectures had met the criteria with regard to organization of lecture materials. Similarly, 97% of students agree or strongly agree that lecturers maintained adequate voices and gestures. Medical students are quite satisfied with the lecture classes and the lectures. However, further research is required to identify student-centered teaching and learning methods to promote active learning.

  18. Hydrogeochemical and stream sediment reconnaissance basic data for Emory Peak NTMS Quadrangle, Texas. Uranium Resource Evaluation Project

    International Nuclear Information System (INIS)

    1978-01-01

    Results of a reconnaissance geochemical survey of the Emory Peak Quadrangle, Texas, are reported. Field and laboratory data are presented for 193 groundwater samples and 491 stream sediment samples. Statistical and areal distributions of uranium and other possible uranium-related variables are displayed. A generalized geologic map of the survey area is provided, and the pertinent geologic factors which may be of significance in evaluating the potential for uranium mineralization are briefly discussed. In groundwater, uranium concentrations above the 85th percentile outline an area in the northwest portion of the quadrangle which is dominated by Tertiary tuffaceous ash beds that disconformably overlie Cretaceous units. The relationship between uranium and related variables indicates this area appears to have the best potential for uranium mineralization within the quadrangle. Stream sediment data indicate four areas that appear to be favorable for potential uranium mineralization: the Upper Green Valley-Paradise Valley region, the Terlingua Creek-Solitario region, an area in the vicinity of Big Bend National Park, and an area east of long. 102°15' W. In the first three of the preceding areas, soluble uranium is associated with Tertiary igneous rocks. In the fourth area, soluble uranium is present in carbonate-dominant Cretaceous strata.

  19. TIES for Dummies 3rd Edition (Technology Identification, Evaluation, and Selection) Basic how to's to implement the TIES method

    Science.gov (United States)

    Kirby, Michelle R.

    2002-01-01

    The TIES method is a forecasting environment whereby the decision-maker has the ability to easily assess and trade off the impact of various technologies without sophisticated and time-consuming mathematical formulations. TIES provides a methodical approach in which technically feasible alternatives can be identified with accuracy and speed to reduce design cycle time and, subsequently, life cycle costs; this is achieved through the use of various probabilistic methods, such as Response Surface Methodology and Monte Carlo Simulations. Furthermore, structured and systematic techniques from other fields are utilized to identify possible concepts and evaluation criteria by which comparisons can be made, employing Morphological Matrices and Multi-Attribute Decision Making techniques. Through the execution of each step, a family of design alternatives for a given set of customer requirements can be identified and assessed subjectively or objectively. This methodology allows more information (knowledge) to be brought into the earlier phases of the design process and has direct implications on the affordability of the system. The increased knowledge allows for optimum allocation of company resources and quantitative justification for program decisions. Finally, the TIES method provides novel results and quantitative justification to facilitate decision making in the early stages of design so as to produce affordable and quality products.
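
The probabilistic core of TIES described above, a response surface evaluated under Monte Carlo sampling of technology impact factors, can be sketched as follows. The quadratic surface, its coefficients, and the uniform "k-factor" ranges are invented for illustration; TIES itself derives the surface from designed experiments.

```python
import random

def response_surface(k1, k2):
    """Hypothetical quadratic response surface for a system metric (e.g.
    normalized cost), as RSM would produce; coefficients are invented."""
    return 1.0 + 0.30 * k1 - 0.20 * k2 + 0.05 * k1 * k2 + 0.10 * k1 ** 2

def monte_carlo(n=20000, seed=1):
    """Propagate assumed uncertainty in the technology 'k-factors' through
    the response surface; report the mean and the 95th percentile."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        k1 = rng.uniform(-1.0, 1.0)   # assumed uniform k-factor ranges
        k2 = rng.uniform(-1.0, 1.0)
        samples.append(response_surface(k1, k2))
    samples.sort()
    mean = sum(samples) / n
    p95 = samples[int(0.95 * n)]
    return mean, p95

mean, p95 = monte_carlo()
print(round(mean, 3), round(p95, 3))
```

Cheap surrogate evaluations like this are what let the decision-maker trade off many technology combinations without rerunning the full design analysis each time.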

  20. Development and evaluation of a hypermedia system that integrates basic concepts of mechanics, biomechanics and human anatomy

    Directory of Open Access Journals (Sweden)

    Flavia Rezende

    2006-08-01

    Full Text Available This work describes the modelling of a hypermedia learning system (called “Biomec”) that integrates physical, biomechanical and anatomical concepts involved in human motion, and a study carried out with undergraduate students who interacted with the system. The instructional design of the “Biomec” hypermedia system was developed on the basis of a theoretical framework which articulates the Cognitive Flexibility Theory and the interdisciplinary approach to knowledge. The system was evaluated based on its use by students of Biomechanics I and Kinesiology in a pre-service teacher training course in Physical Education, aiming to discuss the following questions: (i) what is its impact on the students' attitude towards Physics? (ii) to what extent does the hypertextual approach to the content favor the interdisciplinary conception of human motion? (iii) to what extent do the students' navigation profiles adapt to the conceptual needs of the different disciplines of the course? The students answered instruments that assessed affective and cognitive aspects before and after the interaction with the system, and had their navigation registered and analyzed. The set of data obtained allowed the conclusion that the “Biomec” system is a relevant instructional material, capable of positively influencing the students' attitude towards Physics, of favoring the interdisciplinary approach to human motion, and of serving the students enrolled in Biomechanics I better than those enrolled in Kinesiology.

  1. A Last Glacial Maximum world-ocean simulation at eddy-permitting resolution - Part 1: Experimental design and basic evaluation

    Science.gov (United States)

    Ballarotta, M.; Brodeau, L.; Brandefelt, J.; Lundberg, P.; Döös, K.

    2013-01-01

    Most state-of-the-art climate models include a coarsely resolved oceanic component, which has difficulties in capturing detailed dynamics, and therefore eddy-permitting/eddy-resolving simulations have been developed to reproduce the observed World Ocean. In this study, an eddy-permitting numerical experiment is conducted to simulate the global ocean state for a period of the Last Glacial Maximum (LGM, ~ 26 500 to 19 000 yr ago) and to investigate the improvements gained by taking these higher spatial scales into account. The ocean general circulation model is forced by a 49-yr sample of LGM atmospheric fields constructed from a quasi-equilibrated climate-model simulation. The initial state and the bottom boundary condition conform to the Paleoclimate Modelling Intercomparison Project (PMIP) recommendations. Before evaluating the model's efficiency in representing the paleo-proxy reconstruction of the surface state, the LGM experiment is, in this first part of the investigation, compared with a present-day eddy-permitting hindcast simulation as well as with the available PMIP results. It is shown that the LGM eddy-permitting simulation is consistent with the quasi-equilibrated climate-model simulation, but large discrepancies are found with the PMIP model analyses, probably due to the different equilibration states. The strongest meridional gradients of the sea-surface temperature are located near 40° N and S, due to the particularly large North-Atlantic and Southern-Ocean sea-ice covers. These also modify the locations of the convection sites (where deep water forms), and most of the LGM Conveyor Belt circulation consequently takes place in a thinner layer than today. Despite some discrepancies with other LGM simulations, a glacial state is captured, and the eddy-permitting simulation undertaken here yielded a useful set of data for comparisons with paleo-proxy reconstructions.

  2. Synthetic Vision Systems in GA Cockpit-Evaluation of Basic Maneuvers Performed by Low Time GA Pilots During Transition from VMC to IMC

    Science.gov (United States)

    Takallu, M. A.; Wong, D. T.; Uenking, M. D.

    2002-01-01

    An experimental investigation was conducted to study the effectiveness of modern flight displays in general aviation cockpits for mitigating Low Visibility Loss of Control and Controlled Flight Into Terrain accidents. A total of 18 General Aviation (GA) pilots with a private pilot, single-engine land rating and no additional instrument training beyond private pilot license requirements were recruited to evaluate three different display concepts in a fixed-base flight simulator at the NASA Langley Research Center's General Aviation Work Station. Evaluation pilots were asked to continue flight from Visual Meteorological Conditions (VMC) into Instrument Meteorological Conditions (IMC) while performing a series of 4 basic precision maneuvers. During the experiment, relevant pilot/vehicle performance variables, pilot control inputs and physiological data were recorded. Human factors questionnaires and interviews were administered after each scenario. Qualitative and quantitative data have been analyzed and the results are presented here. Pilot performance deviations from the established target values (errors) were computed and compared with the FAA Practical Test Standards. Results of the quantitative data indicate that evaluation pilots committed substantially fewer errors when using the Synthetic Vision Systems (SVS) displays than when they were using conventional instruments. Results of the qualitative data indicate that evaluation pilots perceived themselves to have a much higher level of situation awareness while using the SVS display concept.

  3. Community Hospital of the Assumption, Thurles, Tipperary.

    LENUS (Irish Health Repository)

    Moorhead, Anne

    2011-03-31

    Abstract Background Health professionals working in primary care and public health have opportunities to address body weight status issues with their patients through face-to-face contact. The objectives of this all-Ireland project are: 1. to assess the attitudes, current practices\\/behaviours and knowledge of key health professional groups on body weight status; 2. to assess the health professional groups\\' ability to identify body weight status in both adults and children. The health professional groups are: (a) community related public health nurses; (b) school public health nurses; (c) GPs and practice nurses (primary care); and (d) occupational health nurses (workplace) from both Northern Ireland and the Republic of Ireland. Methods\\/Design This all-Ireland multi-disciplinary project follows a mixed methods approach using both quantitative and qualitative methodologies, and consists of four components: 1. Literature review - to explore the role of health professionals in managing obesity through spontaneous intervention in a variety of health promotion settings. 2. Telephone interviews and focus groups - to gain an in-depth insight into the views of health professionals in assessing body weight status. 3. Survey (primarily online but also paper-based) - to determine the attitudes, current practices\\/behaviours and knowledge of health professionals in assessing body weight status. 4. Online evaluation study - an online interactive programme will be developed to assess health professionals\\' ability to identify the body weight status of adults and children. Discussion This project will assess and report the attitudes, current practices\\/behaviours and knowledge of key health professional groups within Northern Ireland and the Republic of Ireland on body weight status, and their ability to identify body weight status in both adults and children. The results of this project will generate recommendations for clinical practice in managing obesity, which may

  4. Breakdown of Hydrostatic Assumption in Tidal Channel with Scour Holes

    Directory of Open Access Journals (Sweden)

    Chunyan Li

    2016-10-01

    Full Text Available Hydrostatic condition is a common assumption in tidal and subtidal motions in oceans and estuaries. Theories with this assumption have been largely successful. However, there are no definite criteria separating the hydrostatic from the non-hydrostatic regimes in real applications, because real problems oftentimes have multiple scales. With the increased refinement of high-resolution numerical models encompassing smaller and smaller spatial scales, the need for non-hydrostatic models is increasing. To evaluate the vertical motion over bathymetric changes in tidal channels and assess the validity of the hydrostatic approximation, we conducted observations using a vessel-based acoustic Doppler current profiler (ADCP). Observations were made along a straight channel 18 times over two scour holes 25 m deep, separated by 330 m, in and out of an otherwise flat 8 m deep tidal pass leading to Lake Pontchartrain, over a time period of 8 hours covering part of the diurnal tidal cycle. Out of the 18 passages over the scour holes, 11 showed strong upwelling and downwelling which resulted in the breakdown of the hydrostatic condition. The maximum observed vertical velocity was ~ 0.35 m/s, a high value in a tidal channel, and the estimated vertical acceleration reached a high value of 1.76×10⁻² m/s². Analysis demonstrated that the barotropic non-hydrostatic acceleration was dominant. The cause of the non-hydrostatic flow was the steep slopes of the scour holes. This demonstrates that in such a system the bathymetric variation can lead to the breakdown of hydrostatic conditions. Models with hydrostatic restrictions will not be able to correctly capture the dynamics in such a system with significant bathymetric variations, particularly during strong tidal currents.
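
The abstract's numbers allow a quick scale check of the vertical acceleration that the hydrostatic approximation neglects. The sketch below estimates the advective vertical acceleration w·dw/dz across a scour hole from the reported maximum vertical velocity, taking the depth change of the hole as the vertical length scale; that length scale is an assumption, not a value from the paper.

```python
# Values reported in the abstract.
w_max = 0.35            # m/s, observed maximum vertical velocity
a_reported = 1.76e-2    # m/s^2, estimated vertical acceleration

# Assumed vertical length scale: the depth change between the 25 m deep
# scour holes and the otherwise flat 8 m deep pass.
L = 25.0 - 8.0          # m

# Scale of the advective vertical acceleration w * dw/dz.
a_scale = w_max ** 2 / L
print(round(a_scale, 4))
```

The crude scale estimate lands within the same order of magnitude as the reported 1.76×10⁻² m/s², which is why vertical acceleration cannot be dropped from the momentum balance over such steep bathymetry.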

  5. Investigating the Assumptions of Uses and Gratifications Research

    Science.gov (United States)

    Lometti, Guy E.; And Others

    1977-01-01

    Discusses a study designed to determine empirically the gratifications sought from communication channels and to test the assumption that individuals differentiate channels based on gratifications. (MH)

  6. Legal assumptions for private company claim for additional (supplementary) payment

    Directory of Open Access Journals (Sweden)

    Šogorov Stevan

    2011-01-01

    Full Text Available The subject of analysis in this article is the legal assumptions which must be met in order to enable a private company to call for additional payments. After introductory remarks, the discussion focuses on the existence of provisions regarding additional payments in the formation contract, or in a general resolution of the shareholders' meeting, as the starting point for the company's claim. The second assumption is a concrete resolution of the shareholders' meeting which creates individual obligations for additional payments. The third assumption is defined as definiteness regarding the sum of the payment and its due date. The sending of the claim by the relevant company body is set as the fourth legal assumption for the realization of the company's right to claim additional payments from a member of the private company.

  7. “Five Minutes of Composers”: A Technique for Evaluating Productivity of Verbal Memory in the System of Basic Music Education

    Directory of Open Access Journals (Sweden)

    Sorokov D.G.

    2017-11-01

    Full Text Available The paper addresses the need for developing criteria-based diagnostic tools for quick individual and group evaluation of musical knowledge in children within the system of basic music education. The proposed technique called “Five Minutes of Composers” allows one to evaluate musical knowledge in a single child, in a whole class or in an educational organisation. The paper provides a full description of the technique and the process of its standardisation: stanines and corresponding normative values are assigned to each age group; the differential validity of the technique is statistically proven for the factors "gender", "stage of education", "age", and "total productivity". The outcomes of the conducted study show the following: the average level of productivity is significantly higher in girls; this level is significantly higher in students of 6th and 7th classes as compared to students of 3rd-5th classes; there is a direct correlation between age and productivity of recall; children with high levels of productivity outscore others in the number of recalled names of composers right from the start. The paper concludes with some remarks concerning the possibilities of using this technique for measuring the progress in children’s musical knowledge, for criteria-based comparative analysis of the quality of teaching, and for evaluating the quality of music education in single classes and educational organisations in general.
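    Stanines (standard nines), mentioned above, map percentile ranks to a nine-point scale. The sketch below uses the conventional cumulative cut points (4, 11, 23, 40, 60, 77, 89, 96 percent); these are a general psychometric convention assumed here, not the paper's specific norms.

```python
import bisect

# Conventional stanine boundaries as cumulative percentiles: 4% of scores
# fall in stanine 1, the next 7% in stanine 2, and so on.
CUTS = [4, 11, 23, 40, 60, 77, 89, 96]

def stanine(percentile: float) -> int:
    """Map a percentile rank (0-100) to a stanine score (1-9)."""
    return bisect.bisect_right(CUTS, percentile) + 1

print(stanine(50))  # median performance -> 5
print(stanine(97))  # top of the distribution -> 9
```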

  8. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)

    We propose a class of finite state systems of synchronizing distributed processes, where processes make assumptions at local states about the state of other processes in the system. This constrains the global states of the system to those where assumptions made by a process about another are compatible with the ...

  9. 40 CFR 264.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... FACILITIES Financial Requirements § 264.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure, post-closure care, or...

  10. 40 CFR 261.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... Excluded Hazardous Secondary Materials § 261.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure or liability...

  11. 40 CFR 265.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ..., STORAGE, AND DISPOSAL FACILITIES Financial Requirements § 265.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the...

  12. 40 CFR 144.66 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... PROGRAMS (CONTINUED) UNDERGROUND INJECTION CONTROL PROGRAM Financial Responsibility: Class I Hazardous Waste Injection Wells § 144.66 State assumption of responsibility. (a) If a State either assumes legal...

  13. 40 CFR 267.150 - State assumption of responsibility.

    Science.gov (United States)

    2010-07-01

    ... STANDARDIZED PERMIT Financial Requirements § 267.150 State assumption of responsibility. (a) If a State either assumes legal responsibility for an owner's or operator's compliance with the closure care or liability...

  14. Capturing Assumptions while Designing a Verification Model for Embedded Systems

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    A formal proof of a system correctness typically holds under a number of assumptions. Leaving them implicit raises the chance of using the system in a context that violates some assumptions, which in return may invalidate the correctness proof. The goal of this paper is to show how combining

  15. PFP issues/assumptions development and management planning guide

    International Nuclear Information System (INIS)

    SINCLAIR, J.C.

    1999-01-01

    The PFP Issues/Assumptions Development and Management Planning Guide presents the strategy and process used for the identification, allocation, and maintenance of an Issues/Assumptions Management List for the Plutonium Finishing Plant (PFP) integrated project baseline. Revisions to this document will include, as attachments, the most recent version of the Issues/Assumptions Management List, both open and current issues/assumptions (Appendix A), and closed or historical issues/assumptions (Appendix B). This document is intended to be a Project-owned management tool. As such, this document will periodically require revisions resulting from improvements of the information, processes, and techniques as now described. Revisions that suggest improved processes will only require PFP management approval

  16. Assumptions and Policy Decisions for Vital Area Identification Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myungsu; Bae, Yeon-Kyoung; Lee, Youngseung [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    U.S. Nuclear Regulatory Commission and IAEA guidance indicate that certain assumptions and policy questions should be addressed in a Vital Area Identification (VAI) process. Korea Hydro and Nuclear Power conducted a VAI based on the current Design Basis Threat and engineering judgement to identify APR1400 vital areas. Some of the assumptions were inherited from the Probabilistic Safety Assessment (PSA), as the sabotage logic model was based on the PSA logic tree and equipment location data. This paper illustrates some important assumptions and policy decisions for the APR1400 VAI analysis. Assumptions and policy decisions could be overlooked at the beginning stage of a VAI; however, they should be carefully reviewed and discussed among engineers, plant operators, and regulators. Through the APR1400 VAI process, some of the policy concerns and assumptions for analysis were applied based on document research and expert panel discussions. It was also found that there are more assumptions to define in further studies for other types of nuclear power plants. One of the assumptions is mission time, which was inherited from the PSA.

  17. Detecting and accounting for violations of the constancy assumption in non-inferiority clinical trials.

    Science.gov (United States)

    Koopmeiners, Joseph S; Hobbs, Brian P

    2018-05-01

    Randomized, placebo-controlled clinical trials are the gold standard for evaluating a novel therapeutic agent. In some instances, it may not be considered ethical or desirable to complete a placebo-controlled clinical trial and, instead, the placebo is replaced by an active comparator with the objective of showing either superiority or non-inferiority to the active comparator. In a non-inferiority trial, the experimental treatment is considered non-inferior if it retains a pre-specified proportion of the effect of the active comparator as represented by the non-inferiority margin. A key assumption required for valid inference in the non-inferiority setting is the constancy assumption, which requires that the effect of the active comparator in the non-inferiority trial is consistent with the effect that was observed in previous trials. It has been shown that violations of the constancy assumption can result in a dramatic increase in the rate of incorrectly concluding non-inferiority in the presence of ineffective or even harmful treatment. In this paper, we illustrate how Bayesian hierarchical modeling can be used to facilitate multi-source smoothing of the data from the current trial with the data from historical studies, enabling direct probabilistic evaluation of the constancy assumption. We then show how this result can be used to adapt the non-inferiority margin when the constancy assumption is violated and present simulation results illustrating that our method controls the type-I error rate when the constancy assumption is violated, while retaining the power of the standard approach when the constancy assumption holds. We illustrate our adaptive procedure using a non-inferiority trial of raltegravir, an antiretroviral drug for the treatment of HIV.
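    To make the margin arithmetic concrete, here is a minimal sketch of the fixed-margin (effect-retention) idea with hypothetical numbers; it illustrates why a violated constancy assumption shrinks the defensible margin, and is not the paper's Bayesian hierarchical procedure.

```python
def ni_margin(active_effect: float, retention: float = 0.5) -> float:
    """Non-inferiority margin preserving `retention` of the active
    comparator's effect over placebo (fixed-margin approach)."""
    return (1.0 - retention) * active_effect

historical_effect = 2.0  # comparator effect seen in past trials (hypothetical units)
current_effect = 1.2     # smaller effect in the current setting (constancy violated)

print(ni_margin(historical_effect))  # margin justified by history: 1.0
print(ni_margin(current_effect))     # margin after adapting to current data: 0.6
```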

  18. Rapid and simple detection of foot-and-mouth disease virus: Evaluation of a cartridge-based molecular detection system for use in basic laboratories.

    Science.gov (United States)

    Goller, K V; Dill, V; Madi, M; Martin, P; Van der Stede, Y; Vandenberge, V; Haas, B; Van Borm, S; Koenen, F; Kasanga, C J; Ndusilo, N; Beer, M; Liu, L; Mioulet, V; Armson, B; King, D P; Fowler, V L

    2018-04-01

    Highly contagious transboundary animal diseases such as foot-and-mouth disease (FMD) are major threats to the productivity of farm animals. To limit the impact of outbreaks and to take efficient steps towards a timely control and eradication of the disease, rapid and reliable diagnostic systems are of utmost importance. Confirmatory diagnostic assays are typically performed by experienced operators in specialized laboratories, and access to this capability is often limited in the developing countries with the highest disease burden. Advances in molecular technologies allow implementation of modern and reliable techniques for quick and simple pathogen detection either in basic laboratories or even at the pen-side. Here, we report on a study to evaluate a fully automated cartridge-based real-time RT-PCR diagnostic system (Enigma MiniLab®) for the detection of FMD virus (FMDV). The modular system integrates both nucleic acid extraction and downstream real-time RT-PCR (rRT-PCR). The analytical sensitivity of this assay was determined using serially diluted culture-grown FMDV, and the performance of the assay was evaluated using a selected range of FMDV positive and negative clinical samples of bovine, porcine and ovine origin. The robustness of the assay was evaluated in an international inter-laboratory proficiency test and by deployment into an African laboratory. It was demonstrated that the system is easy to use and can detect FMDV with high sensitivity and specificity, roughly on par with standard laboratory methods. This cartridge-based automated real-time RT-PCR system for the detection of FMDV represents a reliable and easy-to-use diagnostic tool for the early and rapid disease detection of acutely infected animals even in remote areas. This type of system could be easily deployed for routine surveillance within endemic regions such as Africa or could alternatively be used in the developed world. © 2017 The Authors. Transboundary and Emerging Diseases

  19. A randomized control trial to evaluate the importance of pre-training basic laparoscopic psychomotor skills upon the learning curve of laparoscopic intra-corporeal knot tying.

    Science.gov (United States)

    Molinas, Carlos Roger; Binda, Maria Mercedes; Sisa, Cesar Manuel; Campo, Rudi

    2017-01-01

    Training of basic laparoscopic psychomotor skills improves the acquisition of more advanced laparoscopic tasks, such as laparoscopic intra-corporeal knot tying (LICK). This randomized controlled trial was designed to evaluate whether pre-training of basic skills, such as laparoscopic camera navigation (LCN), hand-eye coordination (HEC), and bimanual coordination (BMC), and the combination of the three of them, has any beneficial effect upon the learning curve of LICK. The study was carried out in a private center in Asunción, Paraguay, by 80 medical students without any experience in surgery. Four laparoscopic tasks were performed in the ENCILAP model (LCN, HEC, BMC, and LICK). Participants were allocated to 5 groups (G1-G5). The study was structured in 5 phases. In phase 1, they underwent a baseline test (T1) for all tasks (1 repetition of each task in consecutive order). In phase 2, participants underwent different training programs (30 consecutive repetitions) for basic tasks according to the group they belonged to (G1: none; G2: LCN; G3: HEC; G4: BMC; and G5: LCN, HEC, and BMC). In phase 3, they were tested again (T2) in the same manner as at T1. In phase 4, they underwent a standardized training program for LICK (30 consecutive repetitions). In phase 5, they were tested again (T3) in the same manner as at T1 and T2. At each repetition, scoring was based on the time taken for task completion. The scores were plotted and non-linear regression models were used to fit the learning curves to one- and two-phase exponential decay models for each participant (individual curves) and for each group (group curves). The LICK group learning curves fitted better to the two-phase exponential decay model. From these curves, the starting points (Y0), the point after HEC training/before LICK training (Y1), the Plateau, and the rate constants (K) were calculated. All groups, except for G4, started from a similar point (Y0). At Y1, G5 scored already

  20. MONITORED GEOLOGIC REPOSITORY LIFE CYCLE COST ESTIMATE ASSUMPTIONS DOCUMENT

    International Nuclear Information System (INIS)

    R.E. Sweeney

    2001-01-01

    The purpose of this assumptions document is to provide general scope, strategy, technical basis, schedule and cost assumptions for the Monitored Geologic Repository (MGR) life cycle cost (LCC) estimate and schedule update incorporating information from the Viability Assessment (VA), License Application Design Selection (LADS), 1999 Update to the Total System Life Cycle Cost (TSLCC) estimate and from other related and updated information. This document is intended to generally follow the assumptions outlined in the previous MGR cost estimates and as further prescribed by DOE guidance

  1. Monitored Geologic Repository Life Cycle Cost Estimate Assumptions Document

    International Nuclear Information System (INIS)

    Sweeney, R.

    2000-01-01

    The purpose of this assumptions document is to provide general scope, strategy, technical basis, schedule and cost assumptions for the Monitored Geologic Repository (MGR) life cycle cost estimate and schedule update incorporating information from the Viability Assessment (VA), License Application Design Selection (LADS), 1999 Update to the Total System Life Cycle Cost (TSLCC) estimate and from other related and updated information. This document is intended to generally follow the assumptions outlined in the previous MGR cost estimates and as further prescribed by DOE guidance

  2. The stable model semantics under the any-world assumption

    OpenAIRE

    Straccia, Umberto; Loyer, Yann

    2004-01-01

    The stable model semantics has become a dominating approach to complete the knowledge provided by a logic program by means of the Closed World Assumption (CWA). The CWA asserts that any atom whose truth-value cannot be inferred from the facts and rules is supposed to be false. This assumption is orthogonal to the so-called Open World Assumption (OWA), which asserts that every such atom's truth is supposed to be unknown. The topic of this paper is to be more fine-grained. Indeed, the objec...
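    As a toy illustration (my own, not the paper's), the CWA and OWA can be contrasted on a tiny definite logic program: any atom not derivable from the facts and rules is false under the CWA and unknown under the OWA.

```python
facts = {"p"}
rules = [({"p"}, "q"), ({"r"}, "s")]  # (body, head): head holds if all body atoms hold

def derive(facts, rules):
    """Least fixpoint: everything provable from the facts via the rules."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= known and head not in known:
                known.add(head)
                changed = True
    return known

known = derive(facts, rules)

def cwa(atom):  # Closed World Assumption: not provable => false
    return atom in known

def owa(atom):  # Open World Assumption: not provable => unknown
    return True if atom in known else "unknown"

print(cwa("s"), owa("s"))  # False unknown
```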

  3. National uranium resource evaluation program. Hydrogeochemical and stream sediment reconnaissance basic data for Oklahoma City NTMS Quadrangle, Oklahoma. Uranium resource evaluation project

    International Nuclear Information System (INIS)

    1978-01-01

    Field and laboratory data are presented for 812 groundwater samples and 847 stream sediment samples. Statistical and areal distributions of uranium and other possibly uranium-related variables are displayed. A generalized geologic map of the survey area is provided, and pertinent geologic factors which may be of significance in evaluating the potential for uranium mineralization are briefly discussed. Based on the results from groundwater sampling, the most promising formations for potential uranium mineralization in the quadrangle are the Permian Bison, Purcell-Salt Plains-Kingman, Fairmont, Dog Creek, Chickasha, Duncan, and Cedar Hills Formations. These units are characterized by relatively high average concentrations of uranium, conductivity, arsenic, calcium, lithium, molybdenum, and sulfate. In addition, groundwaters from the Pennsylvanian Oscar Formation are characterized by values above the 85th percentile for uranium, conductivity, the uranium/sulfate ratio, arsenic, and vanadium. Results of stream sediment sampling indicate that the most promising formations for potential uranium mineralization include the same Permian formations as indicated by groundwater sampling (Bison, Purcell-Salt Plains-Kingman, Fairmont, Dog Creek, Chickasha, Duncan, and Cedar Hills Formations) in an area where these formations crop out north of the North Canadian River. Stream sediment samples from this area are characterized by concentrations above the 85th percentile for uranium, thorium, arsenic, lithium, manganese, and vanadium

  4. National Uranium Resource Evaluation Program. Hydrogeochemical and stream sediment reconnaissance basic data for Beeville NTMS Quadrangle, Texas. Uranium resource evaluation project

    Energy Technology Data Exchange (ETDEWEB)

    1979-10-31

    Results of a reconnaissance geochemical survey of the Beeville Quadrangle, Texas are reported. Field and laboratory data are presented for 373 groundwater and 364 stream sediment samples. Statistical and areal distributions of uranium and possible uranium-related variables are displayed. A generalized geologic map of the survey area is provided, and pertinent geologic factors which may be of significance in evaluating the potential for uranium mineralization are briefly discussed. The groundwater data indicate that the northwestern corner of the quadrangle is the most favorable for potential uranium mineralization. Favorability is indicated by high uranium concentrations; high arsenic, molybdenum, and vanadium concentrations; and proximity and similar geologic setting to the mines of the Karnes County mining district. Other areas that appear favorable are an area in Bee and Refugio Counties and the northeastern part of the quadrangle. Both areas have water chemistry similar to the Karnes County area, but the northeastern area does not have high concentrations of pathfinder elements. The stream sediment data indicate that the northeastern corner of the quadrangle is the most favorable for potential mineralization, but agricultural practices and mineralogy of the outcropping Beaumont Formation may indicate a false anomaly. The northwestern corner of the quadrangle is considered favorable because of its proximity to the known uranium deposits, but the data do not seem to support this.

  5. National Uranium Resource Evaluation Program. Hydrogeochemical and stream sediment reconnaissance basic data for Beeville NTMS Quadrangle, Texas. Uranium resource evaluation project

    International Nuclear Information System (INIS)

    1979-01-01

    Results of a reconnaissance geochemical survey of the Beeville Quadrangle, Texas are reported. Field and laboratory data are presented for 373 groundwater and 364 stream sediment samples. Statistical and areal distributions of uranium and possible uranium-related variables are displayed. A generalized geologic map of the survey area is provided, and pertinent geologic factors which may be of significance in evaluating the potential for uranium mineralization are briefly discussed. The groundwater data indicate that the northwestern corner of the quadrangle is the most favorable for potential uranium mineralization. Favorability is indicated by high uranium concentrations; high arsenic, molybdenum, and vanadium concentrations; and proximity and similar geologic setting to the mines of the Karnes County mining district. Other areas that appear favorable are an area in Bee and Refugio Counties and the northeastern part of the quadrangle. Both areas have water chemistry similar to the Karnes County area, but the northeastern area does not have high concentrations of pathfinder elements. The stream sediment data indicate that the northeastern corner of the quadrangle is the most favorable for potential mineralization, but agricultural practices and mineralogy of the outcropping Beaumont Formation may indicate a false anomaly. The northwestern corner of the quadrangle is considered favorable because of its proximity to the known uranium deposits, but the data do not seem to support this

  6. Stem Cell Basics

    Science.gov (United States)

    Stem Cell Basics I. Introduction: What are stem cells, and ...

  7. Supporting calculations and assumptions for use in WESF safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hey, B.E.

    1997-03-07

    This document provides a single location for calculations and assumptions used in support of Waste Encapsulation and Storage Facility (WESF) safety analyses. It also provides the technical details and bases necessary to justify the contained results.

  8. A framework for the organizational assumptions underlying safety culture

    International Nuclear Information System (INIS)

    Packer, Charles

    2002-01-01

    The safety culture of the nuclear organization can be addressed at the three levels of culture proposed by Edgar Schein. The industry literature provides a great deal of insight at the artefact and espoused value levels, although as yet it remains somewhat disorganized. There is, however, an overall lack of understanding of the assumption level of safety culture. This paper describes a possible framework for conceptualizing the assumption level, suggesting that safety culture is grounded in unconscious beliefs about the nature of the safety problem, its solution and how to organize to achieve the solution. Using this framework, the organization can begin to uncover the assumptions at play in its normal operation, decisions and events and, if necessary, engage in a process to shift them towards assumptions more supportive of a strong safety culture. (author)

  9. Psychopathology, fundamental assumptions and CD-4 T lymphocyte ...

    African Journals Online (AJOL)

    In addition, we explored whether psychopathology and negative fundamental assumptions in ... Method: Self-rating questionnaires to assess depressive symptoms, ... associated with all participants scoring in the positive range of the FA scale.

  10. Idaho National Engineering Laboratory installation roadmap assumptions document

    International Nuclear Information System (INIS)

    1993-05-01

    This document is a composite of roadmap assumptions developed for the Idaho National Engineering Laboratory (INEL) by the US Department of Energy Idaho Field Office and subcontractor personnel as a key element in the implementation of the Roadmap Methodology for the INEL Site. The development and identification of these assumptions is an important factor in planning basis development and establishes the planning baseline for all subsequent roadmap analysis at the INEL

  11. [Evaluation of vaginal dysfunction in symptomatic and asymptomatic pregnant women by using the analysis of basic vaginal states (BVS) and its comparison with the conventional microbiological study].

    Science.gov (United States)

    Touzon, María S; Losada, Mirta; Eliseht, Martha Cora; Menghi, Claudia; Gatta, Claudia; Santa Cruz, Gabriela; Malamud de Ruda Vega, Hilda; Vay, Carlos; Tatti, Silvio; Famiglietti, Angela; Perazzi, Beatriz

    2014-01-01

    Infections of the lower genital tract associated to maternal and perinatal complications frequently occur during pregnancy. The aim of this study was to evaluate vaginal dysfunction through the analysis of basic vaginal states (BVS) using the methodology of balance of the vaginal content (BAVACO) and to compare it with the microbiological study of candidiasis, trichomoniasis and bacterial vaginosis (BV). Pregnant patients (1238) were examined from 2010 to 2012. In asymptomatic (A) (n: 1046) and symptomatic pregnant women (S) (n: 192), BVS I was found in 59.5% and 26% of the patients, respectively. BVS II was observed in 19.7% of A and in 17.2% of S. BVS III was only detected in A, in 0.4%. BVS IV was observed in 14.4% of A and in 38% of S. BVS V was detected in 6% of A and in 18.8% of S. Yeasts were associated to BVS I and II in 55.5% and 23.2% of A, respectively; and in 32.4% and 31% of S, respectively. Trichomonas were associated to BVS I in 50% of A, to IV in 44.4% of S and to V in 33.3% of S. BAVACO sensitivity to detect yeasts was 80.4% and 85.5% in A and S, respectively; 40% and 75% in A and S, respectively, to detect trichomonas; and 100% in A and S to detect BV. BAVACO specificity was 100% for all pathogens in A and S. The study of BVS proved useful as a guide to evaluate vaginal dysfunction, regardless of symptomatology. Therefore, this study is recommended as a prenatal control. Copyright © 2014 Asociación Argentina de Microbiología. Published by Elsevier España. All rights reserved.
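    The sensitivity and specificity percentages quoted above follow the usual confusion-matrix definitions; a minimal sketch with hypothetical counts (not the study's raw data):

```python
def sensitivity(tp: int, fn: int) -> float:
    """Proportion of actual positives that the test detects."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Proportion of actual negatives that the test correctly rules out."""
    return tn / (tn + fp)

# Hypothetical counts for a yeast-detection comparison:
print(round(100 * sensitivity(tp=82, fn=20), 1))  # 80.4
print(round(100 * specificity(tn=50, fp=0), 1))   # 100.0
```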

  12. Evaluating the role of acidic, basic, and polar amino acids and dipeptides on a molecular electrocatalyst for H 2 oxidation

    Energy Technology Data Exchange (ETDEWEB)

    Boralugodage, Nilusha Priyadarshani; Arachchige, Rajith Jayasingha; Dutta, Arnab; Buchko, Garry W.; Shaw, Wendy J.

    2017-01-01

    Amino acids and peptides have been shown to have a significant influence on the H2 production and oxidation reactivity of Ni(PR2NR’2)2, where PR2NR’2 = 1,5-diaza-3,7-diphosphacyclooctane, R is either phenyl (Ph) or cyclohexyl (Cy), and R’ is either an amino acid or peptide. Most recently, the Ni(PCy2Naminoacid2)2 complexes (CyAA) have shown enhanced H2 oxidation rates, water solubility, and in the case of arginine (CyArg) and phenylalanine (CyPhe), electrocatalytic reversibility. Both the backbone –COOH and side chain interactions were shown to be critical to catalytic performance. Here we further investigate the roles of the outer coordination sphere by evaluating amino acids with acidic, basic, and hydrophilic side chains, as well as dipeptides which combine multiple successful features from previous complexes. Six new complexes were prepared, three containing single amino acids: aspartic acid (CyAsp), lysine (CyLys), and serine (CySer) and three containing dipeptides: glycine-phenylalanine (Cy(GlyPhe)), phenylalanine-glycine (Cy(PheGly)), and aspartic acid-phenylalanine (Cy(AspPhe)). The resulting catalytic performance demonstrates that complexes need both interactions between side chain and –COOH groups for fast, efficient catalysis. The fastest of all of the catalysts, Cy(AspPhe), had both of these features, while the other dipeptide complexes with an amide replacing the –COOH were both slower; however, the amide group was demonstrated to participate in the proton pathway when side chain interactions are present to position it. Both the hydrophilic and basic side chains, notably lacking in side chain interactions, significantly increased the overpotential, with only modest increases in TOF. Of all of the complexes, only CyAsp was reversible at room temperature, and only in water, the first of these

  13. Effect of assumption violations of the mixed model methodology on animal genetic evaluation

    Directory of Open Access Journals (Sweden)

    R. Fonseca

    2001-02-01

    Full Text Available Simulation studies were conducted to verify the effect of violating two assumptions of the mixed model methodology (genetic variances known without error, and normal distribution of the random errors) on the genetic gains obtained over 10 generations of selection. Other parameters, such as phenotypic value and accuracy, were also evaluated. Initially, a genome consisting of a single quantitative trait governed by 500 loci was simulated. The genome was used to construct a base population in which the quantitative trait had an initial heritability of 0.10. To obtain a relationship structure from the base population, an initial population was generated, from which the selection process started and the errors in the variance components and the distributions of the environmental effects were introduced. For the assumption that the genetic variance was known, error intensities of 0%, -10%, -30%, -50%, 10%, 30% and 50% were used, while for the assumption that the random errors were normally distributed, the normal, exponential, Poisson and uniform distributions were used. In each generation, 20 males and 100 females were selected and mated at random, each male with five females, producing five offspring per mating. This process was repeated 30 times to minimize the effects of genetic drift. For the first assumption, no effect of the error intensities applied to the additive genetic variance component on genetic gain over the 10 generations of selection was observed. The same result was found for the distribution of the random errors, that is, different distributions did not influence the observed genetic gains.
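    The selection scheme in the abstract (truncation selection on phenotype with heritability 0.10) can be sketched as a toy single-generation simulation; the population size, selected fraction, and random seed below are illustrative assumptions, not the study's values.

```python
import random
from statistics import fmean

random.seed(42)
h2 = 0.10                # heritability from the abstract
var_a = h2               # additive genetic variance (phenotypic variance scaled to 1)
var_e = 1.0 - var_a      # environmental variance

def animal():
    a = random.gauss(0.0, var_a ** 0.5)              # true breeding value
    return a, a + random.gauss(0.0, var_e ** 0.5)    # (breeding value, phenotype)

pop = [animal() for _ in range(2000)]
pop.sort(key=lambda ap: ap[1], reverse=True)         # rank on phenotype
sel = pop[:200]                                      # truncation selection, top 10%

S = fmean(p for _, p in sel) - fmean(p for _, p in pop)  # selection differential
R = fmean(a for a, _ in sel) - fmean(a for a, _ in pop)  # realized genetic gain
print(f"S = {S:.2f}, R = {R:.2f}, R/S = {R / S:.2f}")    # R/S should be near h2
```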

  14. Quasi-experimental study designs series-paper 7: assessing the assumptions.

    Science.gov (United States)

    Bärnighausen, Till; Oldenburg, Catherine; Tugwell, Peter; Bommer, Christian; Ebert, Cara; Barreto, Mauricio; Djimeu, Eric; Haber, Noah; Waddington, Hugh; Rockers, Peter; Sianesi, Barbara; Bor, Jacob; Fink, Günther; Valentine, Jeffrey; Tanner, Jeffrey; Stanley, Tom; Sierra, Eduardo; Tchetgen, Eric Tchetgen; Atun, Rifat; Vollmer, Sebastian

    2017-09-01

    Quasi-experimental designs are gaining popularity in epidemiology and health systems research, in particular for the evaluation of health care practice, programs, and policy, because they allow strong causal inferences without randomized controlled experiments. We describe the concepts underlying five important quasi-experimental designs: Instrumental Variables, Regression Discontinuity, Interrupted Time Series, Fixed Effects, and Difference-in-Differences designs. We illustrate each of the designs with an example from health research. We then describe the assumptions required for each of the designs to ensure valid causal inference and discuss the tests available to examine the assumptions. Copyright © 2017 Elsevier Inc. All rights reserved.
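
    As a numerical illustration of one of the five designs, here is a minimal Difference-in-Differences computation under the parallel-trends assumption; the group means are invented for illustration, not taken from the paper.

```python
# Difference-in-differences on made-up group means (illustrative numbers only)
treated_pre, treated_post = 10.0, 14.0
control_pre, control_post = 9.0, 11.0

# Under the parallel-trends assumption, the control group's change estimates
# the counterfactual trend the treated group would have followed.
did = (treated_post - treated_pre) - (control_post - control_pre)
print(did)  # 2.0
```

    The whole causal claim rests on the parallel-trends assumption, which is exactly the kind of design assumption the paper's tests are meant to probe.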

  15. Oil price assumptions in macroeconomic forecasts: should we follow future market expectations?

    International Nuclear Information System (INIS)

    Coimbra, C.; Esteves, P.S.

    2004-01-01

    In macroeconomic forecasting, oil prices are usually taken as an exogenous variable for which assumptions have to be made, in spite of their important role in price and activity developments. This paper evaluates the forecasting performance of futures market prices against the other popular technical procedure, the carry-over assumption. The results suggest that there is almost no difference between opting for futures market prices or using the carry-over assumption for short-term forecasting horizons (up to 12 months), while, for longer-term horizons, they favour the use of futures market prices. However, as futures market prices reflect market expectations for world economic activity, futures oil prices should be adjusted whenever market expectations for world economic growth differ from the values underlying the macroeconomic scenarios, in order to fully ensure the internal consistency of those scenarios. (Author)
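
    The two conditioning conventions compared in the paper can be sketched side by side; the price figures below are invented for illustration, not data from the study.

```python
# Two common ways to condition a macroeconomic forecast on oil prices
# (all numbers invented for illustration):
last_observed = 80.0                # most recent spot price, USD per barrel
futures_curve = [81.0, 82.5, 83.0]  # quoted futures prices for coming horizons

# Carry-over assumption: hold the last observed price flat over the horizon.
carry_over_path = [last_observed] * len(futures_curve)
# Futures-based assumption: follow market expectations embodied in the curve.
futures_path = list(futures_curve)

print(carry_over_path, futures_path)
```

    The paper's finding is that the two paths perform similarly up to 12 months, with the futures-based path preferred further out.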

  16. Questionable assumptions hampered interpretation of a network meta-analysis of primary care depression treatments.

    Science.gov (United States)

    Linde, Klaus; Rücker, Gerta; Schneider, Antonius; Kriston, Levente

    2016-03-01

    We aimed to evaluate the underlying assumptions of a network meta-analysis investigating which depression treatment works best in primary care and to highlight challenges and pitfalls of interpretation under consideration of these assumptions. We reviewed 100 randomized trials investigating pharmacologic and psychological treatments for primary care patients with depression. Network meta-analysis was carried out within a frequentist framework using response to treatment as outcome measure. Transitivity was assessed by epidemiologic judgment based on theoretical and empirical investigation of the distribution of trial characteristics across comparisons. Homogeneity and consistency were investigated by decomposing the Q statistic. There were important clinical and statistically significant differences between "pure" drug trials comparing pharmacologic substances with each other or placebo (63 trials) and trials including a psychological treatment arm (37 trials). Overall network meta-analysis produced results well comparable with separate meta-analyses of drug trials and psychological trials. Although the homogeneity and consistency assumptions were mostly met, we considered the transitivity assumption unjustifiable. An exchange of experience between reviewers and, if possible, some guidance on how reviewers addressing important clinical questions can proceed in situations where important assumptions for valid network meta-analysis are not met would be desirable. Copyright © 2016 Elsevier Inc. All rights reserved.
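
    The homogeneity check the abstract mentions (decomposing the Q statistic) can be illustrated with a minimal inverse-variance computation; the per-trial effects and variances below are invented, not the review's data.

```python
# Cochran's Q for homogeneity across trial effect estimates (invented data)
effects = [0.30, 0.45, 0.25, 0.50]     # per-trial effect estimates
variances = [0.04, 0.05, 0.03, 0.06]   # their sampling variances

weights = [1.0 / v for v in variances]             # inverse-variance weights
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
Q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
df = len(effects) - 1
# Compare Q against a chi-square distribution with df degrees of freedom.
print(round(pooled, 4), round(Q, 4), df)
```

    In a network meta-analysis the same Q is further decomposed into within-design (homogeneity) and between-design (consistency) parts.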

  17. Providing security assurance in line with national DBT assumptions

    Science.gov (United States)

    Bajramovic, Edita; Gupta, Deeksha

    2017-01-01

    As worldwide energy requirements are increasing simultaneously with climate change and energy security considerations, States are thinking about building nuclear power to fulfill their electricity requirements and decrease their dependence on carbon fuels. New nuclear power plants (NPPs) must have comprehensive cybersecurity measures integrated into their design, structure, and processes. In the absence of effective cybersecurity measures, the impact of nuclear security incidents can be severe. Some of the current nuclear facilities were not specifically designed and constructed to deal with the new threats, including targeted cyberattacks. Thus, newcomer countries must consider the Design Basis Threat (DBT) as one of the security fundamentals during design of physical and cyber protection systems of nuclear facilities. IAEA NSS 10 describes the DBT as a "comprehensive description of the motivation, intentions and capabilities of potential adversaries against which protection systems are designed and evaluated". Nowadays, many threat actors, including hacktivists, insider threat, cyber criminals, state and non-state groups (terrorists), pose security risks to nuclear facilities. Threat assumptions are made on a national level. Consequently, threat assessment closely affects the design structures of nuclear facilities. Some of the recent security incidents, e.g. the Stuxnet worm (Advanced Persistent Threat) and the theft of sensitive information at a South Korean nuclear power plant (Insider Threat), have shown that these attacks should be considered as the top threat to nuclear facilities. Therefore, the cybersecurity context is essential for secure and safe use of nuclear power. In addition, States should include multiple DBT scenarios in order to protect various target materials, types of facilities, and adversary objectives. Development of a comprehensive DBT is a precondition for the establishment and further improvement of domestic state nuclear-related regulations in the

  18. Evaluation of dynamics and equilibrium models for the sorption of Basic Violet 3 on activated carbon prepared from Moringa Oleifera fruit shell waste

    Directory of Open Access Journals (Sweden)

    C. Sumithra

    2014-03-01

    Full Text Available The feasibility of activated carbon prepared from Moringa oleifera fruit shell waste to remove Basic Violet 3 from aqueous solution was investigated through batch-mode contact time studies. The surface chemistry of the activated carbon was studied using Boehm titrations, and pH of the point of zero charge (PZC) measurements indicate that the surface oxygenated groups are mainly basic in nature. The surface area of the activated carbon was determined using the BET method. The kinetics of Basic Violet 3 adsorption are observed to be pH dependent. The experimental data can be explained by the pseudo-second-order kinetic model. For Basic Violet 3, the Langmuir model is best suited to simulate the adsorption isotherms.
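
    The Langmuir model named in the abstract is commonly fitted in its linearized form, Ce/qe = Ce/qmax + 1/(KL*qmax); the equilibrium data below are invented for illustration, not the paper's measurements.

```python
def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Linearized Langmuir isotherm: Ce/qe = Ce/qmax + 1/(KL*qmax)
# Invented equilibrium data: Ce (mg/L) and qe (mg/g)
Ce = [5.0, 10.0, 20.0, 40.0]
qe = [12.0, 18.0, 24.0, 28.0]

slope, intercept = linfit(Ce, [c / q for c, q in zip(Ce, qe)])
qmax = 1.0 / slope        # monolayer capacity, mg/g
KL = slope / intercept    # Langmuir constant, L/mg
print(round(qmax, 2), round(KL, 3))
```

    The pseudo-second-order kinetic model mentioned in the abstract has an analogous linear form, t/qt = 1/(k2*qe**2) + t/qe, fit the same way against t.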

  19. Evaluation of an Australian health literacy training program for socially disadvantaged adults attending basic education classes: study protocol for a cluster randomised controlled trial.

    Science.gov (United States)

    McCaffery, Kirsten J; Morony, Suzanne; Muscat, Danielle M; Smith, Sian K; Shepherd, Heather L; Dhillon, Haryana M; Hayen, Andrew; Luxford, Karen; Meshreky, Wedyan; Comings, John; Nutbeam, Don

    2016-05-27

    People with low literacy and low health literacy have poorer health outcomes. Literacy and health literacy are distinct but overlapping constructs that impact wellbeing. Interventions that target both could improve health outcomes. This is a cluster randomised controlled trial with a qualitative component. Participants are 300 adults enrolled in basic language, literacy and numeracy programs at adult education colleges across New South Wales, Australia. Each adult education institute (regional administrative centre) contributes (at least) two classes matched for student demographics, which may be at the same or different campuses. Classes (clusters) are randomly allocated to receive either the health literacy intervention (an 18-week program with health knowledge and skills embedded in language, literacy, and numeracy training (LLN)), or the standard Language Literacy and Numeracy (LLN) program (usual LLN classes, specifically excluding health content). The primary outcome is functional health literacy skills - knowing how to use a thermometer, and read and interpret food and medicine labels. The secondary outcomes are self-reported confidence, more advanced health literacy skills; shared decision making skills, patient activation, health knowledge and self-reported health behaviour. Data is collected at baseline, and immediately and 6 months post intervention. A sample of participating teachers, students, and community health workers will be interviewed in-depth about their experiences with the program to better understand implementation issues and to strengthen the potential for scaling up the program. Outcomes will provide evidence regarding real-world implementation of a health literacy training program with health worker involvement in an Australian adult education setting. The evaluation trial will provide insight into translating and scaling up health literacy education for vulnerable populations with low literacy. Australian New Zealand Clinical Trials

  20. Research and development of the industrial basic technologies of the next generation, 'composite materials (quality evaluation techniques)'. Evaluation of the first phase research and development; Jisedai sangyo kiban gijutsu kenkyu kaihatsu 'fukugo zairyo (hinshitsu hyoka gijutsu)'. Zenki kenkyu kaihatsu hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1985-03-30

    The results of the first phase research and development project for developing composite materials as the basic technologies of the next generation are evaluated, and the directions of the R and D evaluation for the second phase are set up. The efforts in the first-phase R and D project are aimed at development of methods for measuring carbon fiber surface structure elements; confirmation of the relationship between adhesive shear strength and active surface area; development of methods for determining fracture toughness by standard specimens; estimation of allowable void fraction for inter-layer shear strength; defect detection and quality evaluation by electromagnetic, ultrasonic, laser holography and AE methods; development of methods for detecting resin setting reactions during the molding processes; and understanding deteriorated mechanical characteristics of the resin-based composites by environmental factors, among others. The objectives of the first-phase project have been almost achieved. It is decided that the second-phase R and D project is directed to investigations on the relationship between surface properties of the fibers in the composites and fiber/matrix adhesion; researches on mechanical characteristics involved in fracture of the structural elements; evaluation of mechanical properties of the metal-based composites and investigations on detecting their defects; elucidation of the effects of environmental factors on their strength; and development of the techniques for integrating detection of molding-induced cracking and that of setting reactivity, among others. (NEDO)

  1. Changing Assumptions and Progressive Change in Theories of Strategic Organization

    DEFF Research Database (Denmark)

    Foss, Nicolai J.; Hallberg, Niklas L.

    2017-01-01

    A commonly held view is that strategic organization theories progress as a result of a Popperian process of bold conjectures and systematic refutations. However, our field also witnesses vibrant debates or disputes about the specific assumptions that our theories rely on, and although these debates are often decoupled from the results of empirical testing, changes in assumptions seem closely intertwined with theoretical progress. Using the case of the resource-based view, we suggest that progressive change in theories of strategic organization may come about as a result of scholarly debate and dispute over what constitutes proper assumptions, even in the absence of corroborating or falsifying empirical evidence. We also discuss how changing assumptions may drive future progress in the resource-based view.

  2. The Emperor's sham - wrong assumption that sham needling is sham.

    Science.gov (United States)

    Lundeberg, Thomas; Lund, Iréne; Näslund, Jan; Thomas, Moolamanil

    2008-12-01

    During the last five years a large number of randomised controlled clinical trials (RCTs) have been published on the efficacy of acupuncture in different conditions. In most of these studies verum is compared with sham acupuncture. In general both verum and sham have been found to be effective, and often with little reported difference in outcome. This has repeatedly led to the conclusion that acupuncture is no more effective than placebo treatment. However, this conclusion is based on the assumption that sham acupuncture is inert. Since sham acupuncture evidently is merely another form of acupuncture from the physiological perspective, the assumption that sham is sham is incorrect and conclusions based on this assumption are therefore invalid. Clinical guidelines based on such conclusions may therefore exclude suffering patients from valuable treatments.

  3. Evolution of Requirements and Assumptions for Future Exploration Missions

    Science.gov (United States)

    Anderson, Molly; Sargusingh, Miriam; Perry, Jay

    2017-01-01

    NASA programs are maturing technologies, systems, and architectures to enable future exploration missions. To increase fidelity as technologies mature, developers must make assumptions that represent the requirements of a future program. Multiple efforts have begun to define these requirements, including team internal assumptions, planning system integration for early demonstrations, and discussions between international partners planning future collaborations. For many detailed life support system requirements, existing NASA documents set limits of acceptable values, but a future vehicle may be constrained in other ways, and select a limited range of conditions. Other requirements are effectively set by interfaces or operations, and may be different for the same technology depending on whether the hardware is a demonstration system on the International Space Station, or a critical component of a future vehicle. This paper highlights key assumptions representing potential life support requirements and explanations of the driving scenarios, constraints, or other issues that drive them.

  4. Respondent-Driven Sampling – Testing Assumptions: Sampling with Replacement

    Directory of Open Access Journals (Sweden)

    Barash Vladimir D.

    2016-03-01

    Full Text Available Classical Respondent-Driven Sampling (RDS estimators are based on a Markov Process model in which sampling occurs with replacement. Given that respondents generally cannot be interviewed more than once, this assumption is counterfactual. We join recent work by Gile and Handcock in exploring the implications of the sampling-with-replacement assumption for bias of RDS estimators. We differ from previous studies in examining a wider range of sampling fractions and in using not only simulations but also formal proofs. One key finding is that RDS estimates are surprisingly stable even in the presence of substantial sampling fractions. Our analyses show that the sampling-with-replacement assumption is a minor contributor to bias for sampling fractions under 40%, and bias is negligible for the 20% or smaller sampling fractions typical of field applications of RDS.
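
    The with-replacement assumption can be probed with a toy comparison of sample means at a small sampling fraction; the synthetic population and the 5% fraction below are illustrative assumptions, not the paper's simulation design.

```python
import random

# Finite synthetic population whose mean we want to estimate
random.seed(0)
population = [random.gauss(0.0, 1.0) for _ in range(10000)]
true_mean = sum(population) / len(population)

n = 500  # 5% sampling fraction, within the regime the paper calls negligible-bias
with_repl = [random.choice(population) for _ in range(n)]   # with replacement
without_repl = random.sample(population, n)                  # without replacement

est_with = sum(with_repl) / n
est_without = sum(without_repl) / n
# Both estimates land close to the true mean at this sampling fraction
print(round(est_with - true_mean, 3), round(est_without - true_mean, 3))
```

    At sampling fractions approaching 40% and beyond, the paper's point is that the discrepancy between the two sampling models starts to matter for RDS estimators.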

  5. DDH-Like Assumptions Based on Extension Rings

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Kiltz, Eike

    2012-01-01

    We introduce and study a new type of DDH-like assumptions based on groups of prime order q. Whereas standard DDH is based on encoding elements of $\mathbb{F}_{q}$ “in the exponent” of elements in the group, we ask what happens if instead we put in the exponent elements of the extension ring $R_f=\mathbb{F}_{q}[X]/(f)$ ... Naor-Reingold style pseudorandom functions, and auxiliary input secure encryption. This can be seen as an alternative to the known family of k-LIN assumptions.
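
    A toy instance of "encoding in the exponent" for a standard DDH triple can be written in a small prime-order subgroup; the parameters below are deliberately tiny and insecure, purely for illustration, and this is not the paper's extension-ring construction.

```python
import random

# Toy DDH instance in a prime-order subgroup (insecure toy parameters)
p = 467                       # prime modulus
q = 233                       # prime subgroup order, q | p - 1
g = pow(2, (p - 1) // q, p)   # generator of the order-q subgroup

random.seed(42)
a, b, c = (random.randrange(1, q) for _ in range(3))
# DDH asks to distinguish (g^a, g^b, g^ab) from (g^a, g^b, g^c)
ddh_tuple = (pow(g, a, p), pow(g, b, p), pow(g, a * b, p))   # "real" triple
rand_tuple = (pow(g, a, p), pow(g, b, p), pow(g, c, p))      # "random" triple
print(ddh_tuple, rand_tuple)
```

    The paper's question is what happens when the exponents a, b range over an extension ring of $\mathbb{F}_q$ rather than the field itself.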

  6. Emerging Assumptions About Organization Design, Knowledge And Action

    Directory of Open Access Journals (Sweden)

    Alan Meyer

    2013-12-01

    Full Text Available Participants in the Organizational Design Community’s 2013 Annual Conference faced the challenge of “making organization design knowledge actionable.”  This essay summarizes the opinions and insights participants shared during the conference.  I reflect on these ideas, connect them to recent scholarly thinking about organization design, and conclude that seeking to make design knowledge actionable is nudging the community away from an assumption set based upon linearity and equilibrium, and toward a new set of assumptions based on emergence, self-organization, and non-linearity.

  7. A critical assessment of the ecological assumptions underpinning compensatory mitigation of salmon-derived nutrients

    Science.gov (United States)

    Collins, Scott F.; Marcarelli, Amy M.; Baxter, Colden V.; Wipfli, Mark S.

    2015-01-01

    We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited.

  8. Ontological, Epistemological and Methodological Assumptions: Qualitative versus Quantitative

    Science.gov (United States)

    Ahmed, Abdelhamid

    2008-01-01

    The review to follow is a comparative analysis of two studies conducted in the field of TESOL in Education published in "TESOL QUARTERLY." The aspects to be compared are as follows. First, a brief description of each study will be presented. Second, the ontological, epistemological and methodological assumptions underlying each study…

  9. Interface Input/Output Automata: Splitting Assumptions from Guarantees

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Nyman, Ulrik; Wasowski, Andrzej

    2006-01-01

    Interface input/output automata extend IOAs [11], relying on a context-dependent notion of refinement based on relativized language inclusion. There are two main contributions of the work. First, we explicitly separate assumptions from guarantees, increasing the modeling power of the specification language and demonstrating an interesting

  10. Exploring five common assumptions on Attention Deficit Hyperactivity Disorder

    NARCIS (Netherlands)

    Batstra, Laura; Nieweg, Edo H.; Hadders-Algra, Mijna

    The number of children diagnosed with attention deficit hyperactivity disorder (ADHD) and treated with medication is steadily increasing. The aim of this paper was to critically discuss five debatable assumptions on ADHD that may explain these trends to some extent. These are that ADHD (i) causes

  11. Efficient pseudorandom generators based on the DDH assumption

    NARCIS (Netherlands)

    Rezaeian Farashahi, R.; Schoenmakers, B.; Sidorenko, A.; Okamoto, T.; Wang, X.

    2007-01-01

    A family of pseudorandom generators based on the decisional Diffie-Hellman assumption is proposed. The new construction is a modified and generalized version of the Dual Elliptic Curve generator proposed by Barker and Kelsey. Although the original Dual Elliptic Curve generator is shown to be

  12. Questioning Engelhardt's assumptions in Bioethics and Secular Humanism.

    Science.gov (United States)

    Ahmadi Nasab Emran, Shahram

    2016-06-01

    In Bioethics and Secular Humanism: The Search for a Common Morality, Tristram Engelhardt examines various possibilities of finding common ground for moral discourse among people from different traditions and concludes their futility. In this paper I will argue that many of the assumptions on which Engelhardt bases his conclusion about the impossibility of a content-full secular bioethics are problematic. By starting with the notion of moral strangers, there is no possibility, by definition, for a content-full moral discourse among moral strangers. It means that there is circularity in starting the inquiry with a definition of moral strangers, which implies that they do not share enough moral background or commitment to an authority to allow for reaching a moral agreement, and concluding that content-full morality is impossible among moral strangers. I argue that assuming traditions as solid and immutable structures that insulate people across their boundaries is problematic. Another questionable assumption in Engelhardt's work is the idea that religious and philosophical traditions provide content-full moralities. As the cardinal assumption in Engelhardt's review of the various alternatives for a content-full moral discourse among moral strangers, I analyze his foundationalist account of moral reasoning and knowledge and indicate the possibility of other ways of moral knowledge, besides the foundationalist one. Then, I examine Engelhardt's view concerning the futility of attempts at justifying a content-full secular bioethics, and indicate how the assumptions have shaped Engelhardt's critique of the alternatives for the possibility of content-full secular bioethics.

  13. Consequences of Violated Equating Assumptions under the Equivalent Groups Design

    Science.gov (United States)

    Lyren, Per-Erik; Hambleton, Ronald K.

    2011-01-01

    The equal ability distribution assumption associated with the equivalent groups equating design was investigated in the context of a selection test for admission to higher education. The purpose was to assess the consequences for the test-takers in terms of receiving improperly high or low scores compared to their peers, and to find strong…

  14. Child Development Knowledge and Teacher Preparation: Confronting Assumptions.

    Science.gov (United States)

    Katz, Lilian G.

    This paper questions the widely held assumption that acquiring knowledge of child development is an essential part of teacher preparation and teaching competence, especially among teachers of young children. After discussing the influence of culture, parenting style, and teaching style on developmental expectations and outcomes, the paper asserts…

  15. The Metatheoretical Assumptions of Literacy Engagement: A Preliminary Centennial History

    Science.gov (United States)

    Hruby, George G.; Burns, Leslie D.; Botzakis, Stergios; Groenke, Susan L.; Hall, Leigh A.; Laughter, Judson; Allington, Richard L.

    2016-01-01

    In this review of literacy education research in North America over the past century, the authors examined the historical succession of theoretical frameworks on students' active participation in their own literacy learning, and in particular the metatheoretical assumptions that justify those frameworks. The authors used "motivation" and…

  16. Making Predictions about Chemical Reactivity: Assumptions and Heuristics

    Science.gov (United States)

    Maeyer, Jenine; Talanquer, Vicente

    2013-01-01

    Diverse implicit cognitive elements seem to support but also constrain reasoning in different domains. Many of these cognitive constraints can be thought of as either implicit assumptions about the nature of things or reasoning heuristics for decision-making. In this study we applied this framework to investigate college students' understanding of…

  17. Using Contemporary Art to Challenge Cultural Values, Beliefs, and Assumptions

    Science.gov (United States)

    Knight, Wanda B.

    2006-01-01

    Art educators, like many other educators born or socialized within the main-stream culture of a society, seldom have an opportunity to identify, question, and challenge their cultural values, beliefs, assumptions, and perspectives because school culture typically reinforces those they learn at home and in their communities (Bush & Simmons, 1990).…

  18. Does Artificial Neural Network Support Connectivism's Assumptions?

    Science.gov (United States)

    AlDahdouh, Alaa A.

    2017-01-01

    Connectivism was presented as a learning theory for the digital age and connectivists claim that recent developments in Artificial Intelligence (AI) and, more specifically, Artificial Neural Network (ANN) support their assumptions of knowledge connectivity. Yet, very little has been done to investigate this brave allegation. Does the advancement…

  19. Discourses and Theoretical Assumptions in IT Project Portfolio Management

    DEFF Research Database (Denmark)

    Hansen, Lars Kristian; Kræmmergaard, Pernille

    2014-01-01

    Discourses and theoretical assumptions in IT project portfolio management: a review of the literature. In recent years, increasing interest has been devoted to IT project portfolio management (IT PPM). Considering IT PPM an interdisciplinary practice, we conduct a concept-based literature review of relevant

  20. 7 CFR 1980.476 - Transfer and assumptions.

    Science.gov (United States)

    2010-01-01

    ...-354 449-30 to recover its pro rata share of the actual loss at that time. In completing Form FmHA or... the lender on liquidations and property management. A. The State Director may approve all transfer and... Director will notify the Finance Office of all approved transfer and assumption cases on Form FmHA or its...

  1. Origins and Traditions in Comparative Education: Challenging Some Assumptions

    Science.gov (United States)

    Manzon, Maria

    2018-01-01

    This article questions some of our assumptions about the history of comparative education. It explores new scholarship on key actors and ways of knowing in the field. Building on the theory of the social constructedness of the field of comparative education, the paper elucidates how power shapes our scholarly histories and identities.

  2. Observing gravitational-wave transient GW150914 with minimal assumptions

    NARCIS (Netherlands)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Phythian-Adams, A.T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwa, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. C.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, R.D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, M.J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blackburn, L.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, A.L.S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, J.G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, T.C; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brocki, P.; Brooks, A. F.; Brown, A.D.; Brown, D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderon Bustillo, J.; Callister, T. A.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Diaz, J. 
Casanueva; Casentini, C.; Caudill, S.; Cavaglia, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Baiardi, L. Cerboni; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chatterji, S.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, D. S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Qian; Chua, S. E.; Chung, E.S.; Ciani, G.; Clara, F.; Clark, J. A.; Clark, M.; Cleva, F.; Coccia, E.; Cohadon, P. -F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, A.C.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J. -P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, A.L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; Debra, D.; Debreczeni, G.; Degallaix, J.; De laurentis, M.; Deleglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.A.; DeRosa, R. T.; Rosa, R.; DeSalvo, R.; Dhurandhar, S.; Diaz, M. C.; Di Fiore, L.; Giovanni, M.G.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H. -B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, T. M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.M.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. 
C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. R.; Flaminio, R.; Fletcher, M; Fournier, J. -D.; Franco, S; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritsche, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.P.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; Gonzalez, Idelmis G.; Castro, J. M. Gonzalez; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Lee-Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.M.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; de Haas, R.; Hacker, J. J.; Buffoni-Hall, R.; Hall, E. D.; Hammond, G.L.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, P.J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C. -J.; Haughian, K.; Healy, J.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hinder, I.; Hoak, D.; Hodge, K. A.; Hofman, D.; Hollitt, S. E.; Holt, K.; Holz, D. E.; Hopkins, P.; Hosken, D. J.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Idrisy, A.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J. -M.; Isi, M.; Islas, G.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, D.H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jimenez-Forteza, F.; Johnson, W.; Jones, I.D.; Jones, R.; Jonker, R. J. G.; Ju, L.; Haris, K.; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.H.; Kanner, J. 
B.; Karki, S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kawazoe, F.; Kefelian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.E.; Key, J. S.; Khalaidovski, A.; Khalili, F. Y.; Khan, I.; Khan., S.; Khan, Z.; Khazanov, E. A.; Kijhunchoo, N.; Kim, C.; Kim, J.; Kim, K.; Kim, Nam-Gyu; Kim, Namjun; Kim, Y.M.; King, E. J.; King, P. J.; Kinsey, M.; Kinzel, D. L.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Kokeyama, K.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krolak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Laguna, P.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Lazzarini, A.; Lazzaro, C.; Leaci, R.; Leavey, S.; Lebigot, E. O.; Lee, C.H.; Lee, K.H.; Lee, M.H.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B. M.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Logue, J.; Lombardi, A. L.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lueck, H.; Lundgren, A. P.; Luo, J.; Lynch, R.; Ma, Y.; MacDonald, T.T.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magana-Sandoval, F.; Magee, R. M.; Mageswaran, M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Marka, S.; Marka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R.M.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mende, G.; Mendoza-Gandara, D.; Mercer, R. A.; Merilh, E. 
L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B.C.; Moore, J.C.; Moraru, D.; Gutierrez Moreno, M.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, S.D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P.G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nedkova, K.; Nelemans, G.; Gutierrez-Neri, M.; Neunzert, A.; Newton-Howes, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J.; Oh, S. H.; Ohme, F.; Oliver, M. B.; Oppermann, P.; Oram, Richard J.; O'Reilly, B.; O'Shaughnessy, R.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Page, J.; Paris, H. R.; Parker, W.S; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Phelps, M.; Piccinni, O. J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poggiani, R.; Popolizio, P.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Premachandra, S. S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prolchorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Puerrer, M.; Qi, H.; Qin, J.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. 
S.; Radkins, H.; Raffai, P.; Raja, S.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosinska, D.; Rowan, S.; Ruediger, A.; Ruggi, P.; Ryan, K.A.; Sachdev, P.S.; Sadecki, T.; Sadeghian, L.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J; Schmidt, P.; Schnabel, R.B.; Schofield, R. M. S.; Schoenbeck, A.; Schreiber, K.E.C.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, M.S.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Serna, G.; Setyawati, Y.; Sevigny, A.; Shaddock, D. A.; Shah, S.; Shithriar, M. S.; Shaltev, M.; Shao, Z.M.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sigg, D.; Silva, António Dias da; Simakov, D.; Singer, A; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, R. J. E.; Smith, N.D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stone, J.R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S. E.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sutton, P. J.; Swinkels, B. L.; Szczepanczyk, M. J.; Tacca, M.D.; Talukder, D.; Tanner, D. B.; Tapai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, W.R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Tonelli, M.; Torres, C. 
V.; Torrie, C. I.; Toyra, D.; Travasso, F.; Traylor, G.; Trifiro, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlhruch, H.; Vajente, G.; Valdes, G.; Van Bakel, N.; Van Beuzekom, Martin; Van den Brand, J. F. J.; Van Den Broeck, C.F.F.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasuth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, R. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Vicere, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J. -Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D. V.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, MT; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L. -W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.M.; Wessels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; White, D. J.; Whiting, B. F.; Williams, D.; Williams, D.R.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Wright, J.L.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, H.; Yvert, M.; Zadrozny, A.; Zangrando, L.; Zanolin, M.; Zendri, J. -P.; Zevin, M.; Zhang, F.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.

    2016-01-01

    The gravitational-wave signal GW150914 was first identified on September 14, 2015, by searches for short-duration gravitational-wave transients. These searches identify time-correlated transients in multiple detectors with minimal assumptions about the signal morphology, allowing them to be

  3. Deep Borehole Field Test Requirements and Controlled Assumptions.

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

This document presents design requirements and controlled assumptions intended for use in the engineering development and testing of: 1) prototype packages for radioactive waste disposal in deep boreholes; 2) a waste package surface handling system; and 3) a subsurface system for emplacing and retrieving packages in deep boreholes. Engineering development and testing is being performed as part of the Deep Borehole Field Test (DBFT; SNL 2014a). This document presents parallel sets of requirements for a waste disposal system and for the DBFT, showing the close relationship between the two. In addition to guiding design, it will also inform planning for drilling, construction, and scientific characterization activities for the DBFT. The information presented here follows typical preparations for engineering design. It includes functional and operating requirements for handling and emplacement/retrieval equipment, waste package design and emplacement requirements, borehole construction requirements, sealing requirements, and performance criteria. Assumptions are included where they could impact engineering design. Design solutions are avoided in the requirements discussion. Acknowledgements: this set of requirements and assumptions has benefited greatly from reviews by Gordon Appel, Geoff Freeze, Kris Kuhlman, Bob MacKinnon, Steve Pye, David Sassani, Dave Sevougian, and Jiann Su.

  4. Basic Research Firing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Basic Research Firing Facility is an indoor ballistic test facility that has recently transitioned from a customer-based facility to a dedicated basic research...

  5. Basic Cake Decorating Workbook.

    Science.gov (United States)

    Bogdany, Mel

    Included in this student workbook for basic cake decorating are the following: (1) Drawings of steps in a basic way to ice a layer cake, how to make a paper cone, various sizes of flower nails, various sizes and types of tin pastry tubes, and special rose tubes; (2) recipes for basic decorating icings (buttercream, rose paste, and royal icing);…

  6. Evaluation Planning, Evaluation Management, and Utilization of Evaluation Results within Adult Literacy Campaigns, Programs and Projects (with Implications for Adult Basic Education and Nonformal Education Programs in General). A Working Paper.

    Science.gov (United States)

    Bhola, H. S.

    Addressed to professionals involved in program evaluation, this working paper covers various aspects of evaluation planning, including the following: planning as a sociotechnical process, steps in evaluation planning, program planning and implementation versus evaluation planning and implementation, the literacy system and its subsystems, and some…

  7. Super learning to hedge against incorrect inference from arbitrary parametric assumptions in marginal structural modeling.

    Science.gov (United States)

    Neugebauer, Romain; Fireman, Bruce; Roy, Jason A; Raebel, Marsha A; Nichols, Gregory A; O'Connor, Patrick J

    2013-08-01

Clinical trials are unlikely ever to be launched for many comparative effectiveness research (CER) questions. Inferences from hypothetical randomized trials may, however, be emulated with marginal structural modeling (MSM) using observational data, but success in adjusting for time-dependent confounding and selection bias typically relies on parametric modeling assumptions. If these assumptions are violated, inferences from MSM may be inaccurate. In this article, we motivate the application of a data-adaptive estimation approach called super learning (SL) to avoid reliance on arbitrary parametric assumptions in CER. Using electronic health record data from adults with new-onset type 2 diabetes, we implemented MSM with inverse probability weighting (IPW) estimation to evaluate the effect of three oral antidiabetic therapies on the worsening of glomerular filtration rate. Inferences from IPW estimation were noticeably sensitive to the parametric assumptions about the associations between both the exposure and censoring processes and the main suspected source of confounding, that is, time-dependent measurements of hemoglobin A1c. SL was successfully implemented to harness flexible confounding and selection bias adjustment from existing machine learning algorithms. Erroneous IPW inferences about clinical effectiveness arising from arbitrary and incorrect modeling decisions may be avoided with SL. Copyright © 2013 Elsevier Inc. All rights reserved.
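The IPW logic summarized in this abstract can be sketched for a single time point, a deliberate simplification of the longitudinal MSM setting the paper addresses; the confounder, effect sizes, and simulated data below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulated time-fixed confounder (an HbA1c-like covariate), treatment, outcome.
hba1c = rng.normal(7.5, 1.0, n)
ps_true = 1 / (1 + np.exp(-(hba1c - 7.5)))   # treatment depends on the confounder
treat = rng.binomial(1, ps_true)
outcome = 0.5 * treat + 0.8 * hba1c + rng.normal(0, 1, n)

# Naive comparison is confounded: treated subjects tend to have higher HbA1c.
naive = outcome[treat == 1].mean() - outcome[treat == 0].mean()

# Stabilized inverse probability weights from the treatment-assignment model
# (known here; in practice the propensity score must be estimated).
p_marg = treat.mean()
w = np.where(treat == 1, p_marg / ps_true, (1 - p_marg) / (1 - ps_true))

# IPW-weighted difference in means recovers the marginal effect (true value 0.5).
mu1 = np.average(outcome[treat == 1], weights=w[treat == 1])
mu0 = np.average(outcome[treat == 0], weights=w[treat == 0])
print(round(naive, 2), round(mu1 - mu0, 2))
```

In a real analysis the propensity score would itself be estimated, by parametric logistic regression or, as the authors advocate, by super learning; that estimation step is exactly where sensitivity to parametric assumptions enters.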

  8. Research and development of the industrial basic technologies of the next generation, 'composite materials (fine ceramics)'. Evaluation of the first phase research and development; Jisedai sangyo kiban gijutsu kenkyu kaihatsu 'fine ceramics'. Daiikki kenkyu kaihatsu hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1984-03-30

The results of the first phase research and development project for developing fine ceramics as basic technologies of the next generation are evaluated. The R and D themes were selected to develop fine ceramics of high strength, corrosion resistance, precision and wear resistance, given their excellent characteristics. Development of the basic techniques for these materials is of high significance and is highly rated. The first-phase efforts are aimed at developing silicon nitride and silicon carbide for stock-material synthesis; explosive forming/treatment of the stock powders; forming, sintering and processing/joining; evaluation of characteristics; non-destructive testing methods; design; and evaluation of parts, among others, as the elementary techniques for production, evaluation and application of fine ceramic materials. The technical targets for improved functions have been achieved, or good prospects for achieving them have been obtained, in development of the techniques for stock-material synthesis, forming/sintering and processing/joining. The silica reduction for stock synthesis, the basic molding/sintering techniques, and the rheological treatment of molding/sintering represent techniques of the next generation, because they break through the limitations of the conventional techniques. (NEDO)

  9. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

A number of waste life cycle assessment (LCA) models have been gradually developed since the early 1990s, in a number of countries, usually independently from each other. Large discrepancies in results have been observed among different waste LCA models, although it has also been shown that results from different LCA studies can be consistent. This paper is an attempt to identify, review and analyse methodologies and technical assumptions used in various parts of selected waste LCA models. Several criteria were identified which could have significant impacts on the results, such as the functional unit, system boundaries, waste composition and energy modelling. The modelling assumptions of waste management processes, ranging from collection, transportation, intermediate facilities, recycling, thermal treatment, biological treatment, and landfilling, are obviously critical when comparing models.

  10. The sufficiency assumption of the reasoned action approach

    Directory of Open Access Journals (Sweden)

    David Trafimow

    2015-12-01

The reasoned action approach to understanding and predicting behavior includes the sufficiency assumption. Although variables not included in the theory may influence behavior, these variables work through the variables in the theory: once the reasoned action variables are included in an analysis, the inclusion of other variables will not increase the variance accounted for in behavioral intentions or behavior. Reasoned action researchers are very concerned with testing whether new variables account for additional variance (or how much variance the traditional variables account for) to see whether they are important, either in general or with respect to specific behaviors under investigation. But this approach tacitly assumes that accounting for variance is highly relevant to understanding the production of variance, which is what is really at issue. Based on the variance law, I question this assumption.
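The sufficiency assumption is, in effect, a claim about incremental explained variance: once the theory's variables are in the model, a new predictor should add essentially nothing. A minimal simulated illustration, in which all variable names and effect sizes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

attitude = rng.normal(size=n)
norm = rng.normal(size=n)
# Under sufficiency, 'habit' influences behavior only through theory variables:
habit = 0.7 * attitude + rng.normal(scale=0.3, size=n)
behavior = 1.0 * attitude + 0.5 * norm + rng.normal(size=n)

def r2(X, y):
    """R-squared of an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

base = r2(np.column_stack([attitude, norm]), behavior)
full = r2(np.column_stack([attitude, norm, habit]), behavior)
print(round(full - base, 4))  # near zero: habit adds no unique variance
```

Trafimow's point is that even when this increment is near zero, accounting for variance need not tell us much about how the variance is produced.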

  11. Forecasting Value-at-Risk under Different Distributional Assumptions

    Directory of Open Access Journals (Sweden)

    Manuela Braione

    2016-01-01

Financial asset returns are known to be conditionally heteroskedastic and generally non-normally distributed, fat-tailed and often skewed. These features must be taken into account to produce accurate forecasts of Value-at-Risk (VaR). We provide a comprehensive look at the problem by considering the impact that different distributional assumptions have on the accuracy of both univariate and multivariate GARCH models in out-of-sample VaR prediction. The set of analyzed distributions comprises the normal, Student-t, Multivariate Exponential Power and their corresponding skewed counterparts. The accuracy of the VaR forecasts is assessed by implementing standard statistical backtesting procedures used to rank the different specifications. The results show the importance of allowing for heavy tails and skewness in the distributional assumption, with the skew-Student outperforming the others across all tests and confidence levels.
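The effect of the distributional assumption on a VaR figure can be illustrated with a static (non-GARCH) sketch; the return series is simulated and the 99% confidence level is chosen purely for illustration:

```python
import numpy as np
from scipy import stats

# Simulated fat-tailed daily returns (Student-t with 4 degrees of freedom).
rng = np.random.default_rng(2)
returns = stats.t.rvs(df=4, scale=0.01, size=2000, random_state=rng)

alpha = 0.01  # 99% VaR

# Normal assumption: VaR from the fitted mean and standard deviation.
mu, sigma = returns.mean(), returns.std(ddof=1)
var_normal = -(mu + sigma * stats.norm.ppf(alpha))

# Student-t assumption: degrees of freedom, location and scale fitted by MLE.
df_, loc, scale = stats.t.fit(returns)
var_t = -stats.t.ppf(alpha, df_, loc, scale)

# Fat tails make the t-based VaR more conservative than the normal one.
print(var_normal < var_t)
```

In the paper's setting the same quantile calculation is applied to the conditional distribution implied by a fitted GARCH model rather than to an unconditional fit, but the role of the tail assumption is the same.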

  12. Spatial Angular Compounding for Elastography without the Incompressibility Assumption

    OpenAIRE

    Rao, Min; Varghese, Tomy

    2005-01-01

    Spatial-angular compounding is a new technique that enables the reduction of noise artifacts in ultrasound elastography. Previous results using spatial angular compounding, however, were based on the use of the tissue incompressibility assumption. Compounded elastograms were obtained from a spatially-weighted average of local strain estimated from radiofrequency echo signals acquired at different insonification angles. In this paper, we present a new method for reducing the noise artifacts in...

  13. Estimators for longitudinal latent exposure models: examining measurement model assumptions.

    Science.gov (United States)

    Sánchez, Brisa N; Kim, Sehee; Sammel, Mary D

    2017-06-15

Latent variable (LV) models are increasingly being used in environmental epidemiology as a way to summarize multiple environmental exposures and thus minimize statistical concerns that arise in multiple regression. LV models may be especially useful when multivariate exposures are collected repeatedly over time. LV models can accommodate a variety of assumptions but, at the same time, present the user with many choices for model specification particularly in the case of exposure data collected repeatedly over time. For instance, the user could assume conditional independence of observed exposure biomarkers given the latent exposure and, in the case of longitudinal latent exposure variables, time invariance of the measurement model. Choosing which assumptions to relax is not always straightforward. We were motivated by a study of prenatal lead exposure and mental development, where assumptions of the measurement model for the time-changing longitudinal exposure have appreciable impact on (maximum-likelihood) inferences about the health effects of lead exposure. Although we were not particularly interested in characterizing the change of the LV itself, imposing a longitudinal LV structure on the repeated multivariate exposure measures could result in high efficiency gains for the exposure-disease association. We examine the biases of maximum likelihood estimators when assumptions about the measurement model for the longitudinal latent exposure variable are violated. We adapt existing instrumental variable estimators to the case of longitudinal exposures and propose them as an alternative to estimate the health effects of a time-changing latent predictor. We show that instrumental variable estimators remain unbiased for a wide range of data generating models and have advantages in terms of mean squared error. Copyright © 2017 John Wiley & Sons, Ltd.
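The instrumental-variable idea for an error-prone latent exposure can be sketched in its simplest cross-sectional form (the paper's longitudinal adaptation is more involved; all quantities below are simulated):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000

# Latent exposure and two error-prone biomarkers of it.
exposure = rng.normal(size=n)
marker1 = exposure + rng.normal(scale=0.8, size=n)
marker2 = exposure + rng.normal(scale=0.8, size=n)
health = 1.0 * exposure + rng.normal(size=n)   # true effect = 1.0

# Naive regression on one biomarker is attenuated by measurement error.
C1 = np.cov(marker1, health)
naive = C1[0, 1] / C1[0, 0]

# Using the second biomarker as an instrument removes the attenuation,
# because its measurement error is independent of marker1's.
iv = np.cov(marker2, health)[0, 1] / np.cov(marker2, marker1)[0, 1]

# naive is biased toward zero; iv is close to the true effect of 1.0.
print(round(naive, 2), round(iv, 2))
```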

  14. Data-driven smooth tests of the proportional hazards assumption

    Czech Academy of Sciences Publication Activity Database

    Kraus, David

    2007-01-01

    Roč. 13, č. 1 (2007), s. 1-16 ISSN 1380-7870 R&D Projects: GA AV ČR(CZ) IAA101120604; GA ČR(CZ) GD201/05/H007 Institutional research plan: CEZ:AV0Z10750506 Keywords : Cox model * Neyman's smooth test * proportional hazards assumption * Schwarz's selection rule Subject RIV: BA - General Mathematics Impact factor: 0.491, year: 2007

  15. Bank stress testing under different balance sheet assumptions

    OpenAIRE

    Busch, Ramona; Drescher, Christian; Memmel, Christoph

    2017-01-01

    Using unique supervisory survey data on the impact of a hypothetical interest rate shock on German banks, we analyse price and quantity effects on banks' net interest margin components under different balance sheet assumptions. In the first year, the cross-sectional variation of banks' simulated price effect is nearly eight times as large as the one of the simulated quantity effect. After five years, however, the importance of both effects converges. Large banks adjust their balance sheets mo...

  16. The incompressibility assumption in computational simulations of nasal airflow.

    Science.gov (United States)

    Cal, Ismael R; Cercos-Pita, Jose Luis; Duque, Daniel

    2017-06-01

Most computational work on nasal airflow to date has assumed incompressibility, given the low Mach number of these flows. However, for high temperature gradients, the incompressibility assumption could lead to a loss of accuracy, due to the temperature dependence of air density and viscosity. In this article we aim to shed some light on the influence of this assumption in a model of calm breathing in an Asian nasal cavity, by solving the fluid flow equations in compressible and incompressible formulations for different ambient air temperatures using the OpenFOAM package. At low flow rates and warm climatological conditions, similar results were obtained from both approaches, showing that density variations need not be taken into account to obtain a good prediction of all flow features, at least for usual breathing conditions. This agrees with most of the simulations previously reported, at least as far as the incompressibility assumption is concerned. However, parameters like nasal resistance and wall shear stress distribution differ for air temperatures below approximately [Formula: see text] °C. Therefore, density variations should be considered for simulations at such low temperatures.
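The temperature dependence of density that drives the compressibility question can be checked with the ideal-gas law; this is a back-of-the-envelope estimate, not the paper's CFD setup, and the temperatures are illustrative:

```python
# Ideal-gas air density at fixed pressure: rho = P / (R * T).
P = 101325.0      # Pa, standard atmospheric pressure
R = 287.05        # J/(kg K), specific gas constant of dry air

def rho(t_celsius):
    return P / (R * (t_celsius + 273.15))

# Inhaling cold ambient air into a ~34 C nasal cavity changes density
# by well over 10 percent, which an incompressible solver ignores.
cold, body = rho(-10.0), rho(34.0)
pct = (cold - body) / body * 100
print(round(pct, 1))  # prints 16.7
```

Whether such a density change matters for quantities like nasal resistance is exactly what the compressible-versus-incompressible comparison in the paper evaluates.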

  17. Experimental data from irradiation of physical detectors disclose weaknesses in basic assumptions of the δ ray theory of track structure

    DEFF Research Database (Denmark)

    Olsen, K. J.; Hansen, Jørgen-Walther

    1985-01-01

The applicability of track structure theory has been tested by comparing predictions based on the theory with experimental high-LET dose-response data for the amino acid alanine and a nylon-based radiochromic dye film radiation detector. The linear energy transfer, LET, has been varied from 28

  18. Basic research for environmental restoration

    International Nuclear Information System (INIS)

    1990-12-01

    The Department of Energy (DOE) is in the midst of a major environmental restoration effort to reduce the health and environmental risks resulting from past waste management and disposal practices at DOE sites. This report describes research needs in environmental restoration and complements a previously published document, DOE/ER-0419, Evaluation of Mid-to-Long Term Basic Research for Environmental Restoration. Basic research needs have been grouped into five major categories patterned after those identified in DOE/ER-0419: (1) environmental transport and transformations; (2) advanced sampling, characterization, and monitoring methods; (3) new remediation technologies; (4) performance assessment; and (5) health and environmental effects. In addition to basic research, this document deals with education and training needs for environmental restoration. 2 figs., 6 tabs

  19. Basic research for environmental restoration

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-01

    The Department of Energy (DOE) is in the midst of a major environmental restoration effort to reduce the health and environmental risks resulting from past waste management and disposal practices at DOE sites. This report describes research needs in environmental restoration and complements a previously published document, DOE/ER-0419, Evaluation of Mid-to-Long Term Basic Research for Environmental Restoration. Basic research needs have been grouped into five major categories patterned after those identified in DOE/ER-0419: (1) environmental transport and transformations; (2) advanced sampling, characterization, and monitoring methods; (3) new remediation technologies; (4) performance assessment; and (5) health and environmental effects. In addition to basic research, this document deals with education and training needs for environmental restoration. 2 figs., 6 tabs.

  20. Basic digital signal processing

    CERN Document Server

    Lockhart, Gordon B

    1985-01-01

Basic Digital Signal Processing describes the principles of digital signal processing and experiments with BASIC programs involving the fast Fourier transform (FFT). The book reviews the fundamentals of the BASIC language, continuous- and discrete-time signals including analog signals, Fourier analysis, the discrete Fourier transform, and signal energy and power. The text also explains digital signal processing involving digital filters, linear time-invariant systems, the discrete-time unit impulse, discrete-time convolution, and the alternative structure for second-order infinite impulse response (IIR) sections.

  1. Hydromechanics - basic properties

    International Nuclear Information System (INIS)

    Lee, Sung Tak; Lee, Je Geun

    1987-03-01

This book covers the basic properties of hydromechanics: fundamental concepts and definitions, mass, force and weight, and perfect fluids and perfect gases; hydrostatics, including its basic equations and the relative equilibrium of fluids; kinematics and methods of describing flow; the basic relations of fluid motion, including the momentum equation, the energy equation and applications of the Bernoulli equation; applications of momentum theory; inviscid flow; and fluid measurement.

  2. Basic molecular spectroscopy

    CERN Document Server

    Gorry, PA

    1985-01-01

    BASIC Molecular Spectroscopy discusses the utilization of the Beginner's All-purpose Symbolic Instruction Code (BASIC) programming language in molecular spectroscopy. The book is comprised of five chapters that provide an introduction to molecular spectroscopy through programs written in BASIC. The coverage of the text includes rotational spectra, vibrational spectra, and Raman and electronic spectra. The book will be of great use to students who are currently taking a course in molecular spectroscopy.

  3. Anti-Atheist Bias in the United States: Testing Two Critical Assumptions

    Directory of Open Access Journals (Sweden)

    Lawton K Swan

    2012-02-01

Decades of opinion polling and empirical investigation have clearly demonstrated a pervasive anti-atheist prejudice in the United States. However, much of this scholarship relies on two critical and largely unaddressed assumptions: (a) that when people report negative attitudes toward atheists, they do so because they are reacting specifically to their lack of belief in God; and (b) that survey questions asking about attitudes toward atheists as a group yield reliable information about biases against individual atheist targets. To test these assumptions, an online survey asked a probability-based random sample of American adults (N = 618) to evaluate a fellow research participant (“Jordan”). Jordan garnered significantly more negative evaluations when identified as an atheist than when described as religious or when religiosity was not mentioned. This effect did not differ as a function of labeling (“atheist” versus “no belief in God”) or the amount of individuating information provided about Jordan. These data suggest that both assumptions are tenable: nonbelief, rather than extraneous connotations of the word “atheist”, seems to underlie the effect, and participants exhibited a marked bias even when confronted with an otherwise attractive individual.

  4. Research and development of the industrial basic technologies of the next generation, 'composite materials (metal-based)'. Evaluation of the first phase research and development; Jisedai sangyo kiban gijutsu kenkyu kaihatsu 'fukugo zairyo (kinzokukei)'. Daiikki kenkyu kaihatsu hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1983-03-31

The results of the first phase research and development project for developing metal-based composite materials as basic technologies of the next generation are evaluated. For development of highly functional materials, the efforts are directed to understanding the basic characteristics regarding wettability and reactivity of PAN-based carbon fibers and PCS-based silicon carbide fibers with metals, in which aluminum-based matrices are considered; preliminary tests of various compositing methods; and development of wire preforms as intermediate materials. Good prospects have been obtained for resolving the problems involved in each technique. For development of molding/processing techniques, basic research has been conducted on hot pressing/rolling, extrusion/drawing, powder molding and melt molding, using various reinforcing fibers. Good prospects have also been obtained for combinations of adequate molding processes for individual materials. For development of design techniques, basic data regarding the mechanical characteristics of FRMs, both domestic and foreign, are collected and analyzed through trace tests with specimens, to set targets for FRM function improvements from a design viewpoint. (NEDO)

  5. Investigation of assumptions underlying current safety guidelines on EM-induced nerve stimulation

    Science.gov (United States)

    Neufeld, Esra; Vogiatzis Oikonomidis, Ioannis; Iacono, Maria Ida; Angelone, Leonardo M.; Kainz, Wolfgang; Kuster, Niels

    2016-06-01

    An intricate network of a variety of nerves is embedded within the complex anatomy of the human body. Although nerves are shielded from unwanted excitation, they can still be stimulated by external electromagnetic sources that induce strongly non-uniform field distributions. Current exposure safety standards designed to limit unwanted nerve stimulation are based on a series of explicit and implicit assumptions and simplifications. This paper demonstrates the applicability of functionalized anatomical phantoms with integrated coupled electromagnetic and neuronal dynamics solvers for investigating the impact of magnetic resonance exposure on nerve excitation within the full complexity of the human anatomy. The impact of neuronal dynamics models, temperature and local hot-spots, nerve trajectory and potential smoothing, anatomical inhomogeneity, and pulse duration on nerve stimulation was evaluated. As a result, multiple assumptions underlying current safety standards are questioned. It is demonstrated that coupled EM-neuronal dynamics modeling involving realistic anatomies is valuable to establish conservative safety criteria.

  6. Pre-equilibrium assumptions and statistical model parameters effects on reaction cross-section calculations

    International Nuclear Information System (INIS)

    Avrigeanu, M.; Avrigeanu, V.

    1992-02-01

    A systematic study of the effects of statistical model parameters and semi-classical pre-equilibrium emission models has been carried out for the (n,p) reactions on the 56 Fe and 60 Co target nuclei. The results obtained using various assumptions within a given pre-equilibrium emission model differ among themselves more than the results of different models used under similar conditions. The necessity of using realistic level density formulas is emphasized, especially in connection with pre-equilibrium emission models (i.e. with the exciton state density expression); a sound basis can be obtained only by replacing the Williams exciton state density formula with a realistic one. (author). 46 refs, 12 figs, 3 tabs

  7. An Expedient Study on Back-Propagation (BPN) Neural Networks for Modeling Automated Evaluation of the Answers and Progress of Deaf Students Who Possess Basic Knowledge of the English Language and Computer Skills

    Science.gov (United States)

    Vrettaros, John; Vouros, George; Drigas, Athanasios S.

    This article studies the expediency of using neural networks technology and the development of back-propagation network (BPN) models for modeling automated evaluation of the answers and progress of deaf students who possess basic knowledge of the English language and computer skills, within a virtual e-learning environment. The performance of the developed neural models is evaluated using the correlation factor between the networks' response values and the real data, as well as the percentage error between the networks' estimates and the real data, both during the training process and afterwards with unknown data that were not used in training.
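
    The two evaluation metrics named in the abstract, a correlation factor and a percentage error against the real values, can be sketched as follows; `evaluate_model` and its signature are illustrative, not the authors' actual code:

```python
import numpy as np

def evaluate_model(y_true, y_pred):
    """Score a trained network the way the abstract describes:
    correlation between network responses and real values, plus a
    mean percentage error (both names/choices are assumptions)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # Correlation factor between network response and real data.
    r = np.corrcoef(y_true, y_pred)[0, 1]
    # Mean absolute percentage error against the real values.
    mape = 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))
    return r, mape
```

    A correlation factor close to 1 together with a small percentage error would indicate a good fit, both on the training data and on held-out data.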

  8. THE COMPLEX OF ASSUMPTION CATHEDRAL OF THE ASTRAKHAN KREMLIN

    Directory of Open Access Journals (Sweden)

    Savenkova Aleksandra Igorevna

    2016-08-01

    Full Text Available This article presents an architectural and historical analysis of the constructions forming the complex of the Assumption Cathedral of the Astrakhan Kremlin, which has not previously been the subject of special research. Based on archival sources, photographic materials, publications and on-site investigations of the monuments, the article considers the creation history of the complete architectural complex, sustained in the single style of the Muscovite baroque and unique in its compositional construction, and offers its interpretation in the all-Russian architectural context. Typological features of the individual constructions are brought to light. The Prechistinsky bell tower has an untypical architectural solution: a hexagonal structure on octagonal and quadrangular structures. The way of connecting the Cathedral building and the chambers by a passage was characteristic of monastic constructions and exceedingly rare in kremlins, farmsteads and ensembles of city cathedrals. The compositional scheme of the Assumption Cathedral includes the Lobnoye Mesto ("the Place of Execution") located on an axis from the west and connected with the main building by a quarter-turn stair with a landing; its only prototype is the Lobnoye Mesto on Red Square in Moscow. The article also considers the version that the Place of Execution emerged on the basis of an earlier construction, a tower called "the Peal", which is repeatedly mentioned in written sources in connection with S. Razin's revolt. The metropolitan Sampson, trying to preserve the importance of the Astrakhan metropolitanate, built the Assumption Cathedral and the Place of Execution in direct appeal to a capital prototype, so as to emphasize continuity and close connection with Moscow.

  9. Are Prescription Opioids Driving the Opioid Crisis? Assumptions vs Facts.

    Science.gov (United States)

    Rose, Mark Edmund

    2018-04-01

    Sharp increases in opioid prescriptions, and associated increases in overdose deaths in the 2000s, evoked widespread calls to change perceptions of opioid analgesics. Medical literature discussions of opioid analgesics began emphasizing patient and public health hazards. Repetitive exposure to this information may influence physician assumptions. While highly consequential to patients with pain whose function and quality of life may benefit from opioid analgesics, current assumptions about prescription opioid analgesics, including their role in the ongoing opioid overdose epidemic, have not been scrutinized. Information was obtained by searching PubMed, governmental agency websites, and conference proceedings. Opioid analgesic prescribing and associated overdose deaths both peaked around 2011 and are in long-term decline; the sharp overdose increase recorded in 2014 was driven by illicit fentanyl and heroin. Nonmethadone prescription opioid analgesic deaths, in the absence of co-ingested benzodiazepines, alcohol, or other central nervous system/respiratory depressants, are infrequent. Within five years of initial prescription opioid misuse, 3.6% initiate heroin use. The United States consumes 80% of the world opioid supply, but opioid access is nonexistent for 80% and severely restricted for 4.1% of the global population. Many current assumptions about opioid analgesics are ill-founded. Illicit fentanyl and heroin, not opioid prescribing, now fuel the current opioid overdose epidemic. National discussion has often neglected the potentially devastating effects of uncontrolled chronic pain. Opioid analgesic prescribing and related overdoses are in decline, at great cost to patients with pain who have benefited or may benefit from, but cannot access, opioid analgesic therapy.

  10. Questioning the foundations of physics which of our fundamental assumptions are wrong?

    CERN Document Server

    Foster, Brendan; Merali, Zeeya

    2015-01-01

    The essays in this book look at ways in which the foundations of physics might need to be changed in order to make progress towards a unified theory. They are based on the prize-winning essays submitted to the FQXi essay competition “Which of Our Basic Physical Assumptions Are Wrong?”, which drew over 270 entries. As Nobel Laureate physicist Philip W. Anderson realized, the key to understanding nature’s reality is not anything “magical”, but the right attitude, “the focus on asking the right questions, the willingness to try (and to discard) unconventional answers, the sensitive ear for phoniness, self-deception, bombast, and conventional but unproven assumptions.” The authors of the eighteen prize-winning essays have, where necessary, adapted their essays for the present volume so as to (a) incorporate the community feedback generated in the online discussion of the essays, (b) add new material that has come to light since their completion and (c) to ensure accessibility to a broad audience of re...

  11. Moral dilemmas in professions of public trust and the assumptions of ethics of social consequences

    Directory of Open Access Journals (Sweden)

    Dubiel-Zielińska Paulina

    2016-06-01

    Full Text Available The aim of the article is to show the possibility of applying the assumptions of ethics of social consequences when making decisions about actions, including in situations of moral dilemma, by persons performing occupations of public trust on a daily basis. The reasoning in the article is analytical and synthetic. The article begins by explaining the basic concepts of “profession” and “profession of public trust” and demonstrating the difference between these terms. This is followed by a general description of professions of public trust; the area and definition of moral dilemmas is then set out, and representatives of the professions concerned are listed. After a brief characterization of the axiological foundations and the main assumptions of ethics of social consequences, actions according to Vasil Gluchman and Włodzimierz Galewicz are discussed, and actions in line with ethics of social consequences are transferred to the practical domain. The article points out that actions in professional life are obligatory, impermissible, permissible, supererogatory or unmarked in the moral dimension. The final part reflects on how to solve moral dilemmas from the position of a representative of a profession of public trust. The article concludes with a summary of the conclusions that stem from ethics of social consequences for professions of public trust, followed by short examples.

  12. Radiation hormesis and the linear-no-threshold assumption

    CERN Document Server

    Sanders, Charles L

    2009-01-01

    Current radiation protection standards are based upon the application of the linear no-threshold (LNT) assumption, which considers that even very low doses of ionizing radiation can cause cancer. The radiation hormesis hypothesis, by contrast, proposes that low-dose ionizing radiation is beneficial. In this book, the author examines all facets of radiation hormesis in detail, including the history of the concept and mechanisms, and presents comprehensive, up-to-date reviews for major cancer types. It is explained how low-dose radiation can in fact decrease all-cause and all-cancer mortality an

  13. First assumptions and overlooking competing causes of death

    DEFF Research Database (Denmark)

    Leth, Peter Mygind; Andersen, Anh Thao Nguyen

    2014-01-01

    Determining the most probable cause of death is important, and it is sometimes tempting to assume an obvious cause of death, when it readily presents itself, and stop looking for other competing causes of death. The case story presented in the article illustrates this dilemma. The first assumption...... of cause of death, which was based on results from bacteriology tests, proved to be wrong when the results from the forensic toxicology testing became available. This case also illustrates how post mortem computed tomography (PMCT) findings of radio opaque material in the stomach alerted the pathologist...

  14. Assumptions of Corporate Social Responsibility as Competitiveness Factor

    Directory of Open Access Journals (Sweden)

    Zaneta Simanaviciene

    2017-09-01

    Full Text Available The purpose of this study was to examine the assumptions of corporate social responsibility (CSR as competitiveness factor in economic downturn. Findings indicate that factors affecting the quality of the micro-economic business environment, i.e., the sophistication of enterprise’s strategy and management processes, the quality of the human capital resources, the increase of product / service demand, the development of related and supporting sectors and the efficiency of natural resources, and competitive capacities of enterprise impact competitiveness at a micro-level. The outcomes suggest that the implementation of CSR elements, i.e., economic, environmental and social responsibilities, gives good opportunities to increase business competitiveness.

  15. ψ -ontology result without the Cartesian product assumption

    Science.gov (United States)

    Myrvold, Wayne C.

    2018-05-01

    We introduce a weakening of the preparation independence postulate of Pusey et al. [Nat. Phys. 8, 475 (2012), 10.1038/nphys2309] that does not presuppose that the space of ontic states resulting from a product-state preparation can be represented by the Cartesian product of subsystem state spaces. On the basis of this weakened assumption, it is shown that, in any model that reproduces the quantum probabilities, any pair of pure quantum states |ψ⟩, |ϕ⟩ with |⟨ϕ|ψ⟩| ≤ 1/√2 must be ontologically distinct.
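
    As a numeric illustration of the overlap criterion (not taken from the paper), the qubit states |0⟩ and |+⟩ have overlap exactly 1/√2, so they sit precisely at the boundary of the theorem's condition; any pair with smaller overlap is also covered:

```python
import numpy as np

ket_psi = np.array([1, 0], dtype=complex)               # |0>
ket_phi = np.array([1, 1], dtype=complex) / np.sqrt(2)  # |+>

# |<phi|psi>| -- np.vdot conjugates its first argument, giving the
# inner product <phi|psi>.
overlap = abs(np.vdot(ket_phi, ket_psi))
```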

  16. Unconditionally Secure and Universally Composable Commitments from Physical Assumptions

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Scafuro, Alessandra

    2013-01-01

    We present a constant-round unconditional black-box compiler that transforms any ideal (i.e., statistically-hiding and statistically-binding) straight-line extractable commitment scheme into an extractable and equivocal commitment scheme, thereby yielding UC-security [9]. We exemplify the u...... of unconditional UC-security with (malicious) PUFs and stateless tokens, our compiler can be instantiated with any ideal straight-line extractable commitment scheme, thus allowing the use of various setup assumptions which may better fit the application or the technology available....

  17. Finding Basic Writing's Place.

    Science.gov (United States)

    Sheridan-Rabideau, Mary P.; Brossell, Gordon

    1995-01-01

    Posits that basic writing serves a vital function by providing writing support for at-risk students and serves the needs of a growing student population that universities accept yet feel needs additional writing instruction. Concludes that the basic writing classroom is the most effective educational support for at-risk students and their writing.…

  18. Biomass Energy Basics | NREL

    Science.gov (United States)

    Biomass Energy Basics We have used biomass energy, or "bioenergy," to keep warm. Wood is still the largest biomass energy resource today, but other sources of biomass, such as gas from landfills (which is methane, the main component in natural gas), can also be used as a biomass energy source.

  19. Wind Energy Basics | NREL

    Science.gov (United States)

    Wind Energy Basics We have been harnessing the wind's energy for hundreds of years, for tasks such as grinding grain. Today, the windmill's modern equivalent, the wind turbine, can use the wind's energy to generate electricity. At 100 feet (30 meters) or more aboveground, turbines can take advantage of faster winds.

  20. Solar Energy Basics | NREL

    Science.gov (United States)

    Solar Energy Basics Solar is the Latin word for sun, a powerful source of energy that can be used to heat, cool, and light our homes and businesses. Solar technologies convert sunlight to usable energy for buildings.

  1. Learning Visual Basic .NET

    CERN Document Server

    Liberty, Jesse

    2009-01-01

    Learning Visual Basic .NET is a complete introduction to VB.NET and object-oriented programming. By using hundreds of examples, this book demonstrates how to develop various kinds of applications--including those that work with databases--and web services. Learning Visual Basic .NET will help you build a solid foundation in .NET.

  2. Health Insurance Basics

    Science.gov (United States)

    Health Insurance Basics (KidsHealth / For Teens). What Exactly Is Health Insurance? Health insurance is a plan that people buy ...

  3. Body Basics Library

    Science.gov (United States)

    Body Basics articles explain just how each body system, part, and process works. Use this medical library to find out about basic human anatomy, with topics ranging from Teeth and Skin, Hair, and Nails to the Spleen and Lymphatic System. (TeensHealth, from Nemours)

  4. A Novel Clinical-Simulated Suture Education for Basic Surgical Skill: Suture on the Biological Tissue Fixed on Standardized Patient Evaluated with Objective Structured Assessment of Technical Skill (OSATS) Tools.

    Science.gov (United States)

    Shen, Zhanlong; Yang, Fan; Gao, Pengji; Zeng, Li; Jiang, Guanchao; Wang, Shan; Ye, Yingjiang; Zhu, Fengxue

    2017-06-21

    Clinical-simulated training has shown benefit in the education of medical students. However, the role of clinical simulation in basic surgical skill training, such as suturing techniques, remains unclear. Forty-two medical students were asked to perform specific suturing tasks at three stations with different settings within four minutes (Station 1: synthetic suture pad fixed on the bench; Station 2: synthetic suture pad fixed on a standardized patient; Station 3: pig skin fixed on a standardized patient); the OSATS (Objective Structured Assessment of Technical Skill) tool was used to evaluate the performance of the students. A questionnaire was distributed to the students following the examination. The mean performance score at Station 3 was significantly lower than that at Stations 1 and 2 in general performance, including tissue handling, time, and motion. The suturing techniques of students at Stations 2 and 3 were not as accurate as at Station 1. Inappropriate tension was applied to the knot at Station 2 compared with Stations 1 and 3. On the questionnaire, 93% of students considered clinical-simulated training of basic surgical skills necessary and thought it may increase their confidence in future clinical work as surgeons; 98% of students thought the assessment was more objective when the OSATS tool was used for evaluation. Clinical simulation examinations assessed with OSATS might shed new light on the education of basic surgical skills and may be worthy of wider adoption in the surgical education of medical students.

  5. Research and development of basic technologies for the next generation industries, 'bio-elements'. Evaluation on the first term research and development; Jisedai sangyo kiban gijutsu kenkyu kaihtsu 'bio soshi'. Daiikki kenkyu kaihatsu hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1991-03-01

    Research, development and evaluation were performed in order to establish basic technologies for realizing new information processing that incorporates the high-order functions of living organisms, and bio-elements equipped with such high-order functions. In developing technical means to measure directly the information-processing functions of living organisms, detailed analysis of the spatial operation of nerve activities became possible for the first time, using two-dimensional optical measurement of nerve activities. In the field of modeling, a system realizing functions effective in engineering terms was structured by incorporating into the model the functions specific to the nervous systems of vision and motion. In the field of element technologies, large progress was made in basic and evaluation technologies such as the design and fabrication of molecules with different functions added in LB membranes. Membranes having simple but distinct functions, such as light and scent detection, were manufactured, and the possibility of having them manifest functions as elements was verified. Furthermore, possibilities were discovered for manufacturing functional membranes by using the self-organizing capability deriving from the basic properties of polymers, and for realizing functions simulating the plasticity of nerves. These achievements lead to the belief that the targets of the first-term research and development have been almost completely achieved. (NEDO)

  6. Research and development of basic technologies for the next generation industries, 'light-reactive materials'. Evaluation on the second term research and development; Jisedai sangyo kiban gijutsu kenkyu kaihatsu 'hikari hanno zairyo'. Dainiki kenkyu kaihatsu hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1991-03-01

    Research and evaluation were performed with the objective of establishing basic technology for light-reactive materials, which control the structure and aggregation state of molecules through the action of light, and can be used for ultra-high-density recording, high-resolution displays and optical switches. In elucidating the basic characteristics of photochromic materials, a non-destructively readable recording system was proposed and demonstrated, highly durable and highly functional photochromic compounds were developed, and a number of material design guidelines were accumulated for realizing the characteristics required in optical recording. With regard to the development of photochromic materials, the realization of photochromic thin films capable of wavelength-multiplexed recording has become more realistic. For elucidating the basic characteristics of PHB materials, evaluation methods from a number of directions, including the time-domain method of photon echo, were established in addition to the conventional frequency-domain recording characteristics. Regarding the elucidation of the PHB phenomenon, intermediate-zone structure control was demonstrated in diversified material systems including biological substances, and a large number of findings were accumulated. (NEDO)

  8. Sugar Cane Genome Numbers Assumption by Ribosomal DNA FISH Techniques

    NARCIS (Netherlands)

    Thumjamras, S.; Jong, de H.; Iamtham, S.; Prammanee, S.

    2013-01-01

    Conventional cytological methods are of limited use for studying polyploid plant genomes, especially sugar cane chromosomes, which show unstable numbers in each cultivar. Molecular cytogenetic techniques, such as fluorescent in situ hybridization (FISH), were used in this study. A basic chromosome number of sugar cane

  9. Drug policy in sport: hidden assumptions and inherent contradictions.

    Science.gov (United States)

    Smith, Aaron C T; Stewart, Bob

    2008-03-01

    This paper considers the assumptions underpinning the current drugs-in-sport policy arrangements. We examine the assumptions and contradictions inherent in the policy approach, paying particular attention to the evidence that supports different policy arrangements. We find that the current anti-doping policy of the World Anti-Doping Agency (WADA) contains inconsistencies and ambiguities. WADA's policy position is predicated upon four fundamental principles: first, the need for sport to set a good example; secondly, the necessity of ensuring a level playing field; thirdly, the responsibility to protect the health of athletes; and fourthly, the importance of preserving the integrity of sport. A review of the evidence, however, suggests that sport is a problematic institution when it comes to setting a good example for the rest of society. Neither is it clear that sport has an inherent or essential integrity that can only be sustained through regulation. Furthermore, it is doubtful that WADA's anti-doping policy is effective in maintaining a level playing field, or is the best means of protecting the health of athletes. The WADA anti-doping policy is based too heavily on principles of minimising drug use, and gives insufficient weight to the minimisation of drug-related harms. As a result drug-related harms are being poorly managed in sport. We argue that anti-doping policy in sport would benefit from placing greater emphasis on a harm minimisation model.

  10. The extended evolutionary synthesis: its structure, assumptions and predictions

    Science.gov (United States)

    Laland, Kevin N.; Uller, Tobias; Feldman, Marcus W.; Sterelny, Kim; Müller, Gerd B.; Moczek, Armin; Jablonka, Eva; Odling-Smee, John

    2015-01-01

    Scientific activities take place within the structured sets of ideas and assumptions that define a field and its practices. The conceptual framework of evolutionary biology emerged with the Modern Synthesis in the early twentieth century and has since expanded into a highly successful research program to explore the processes of diversification and adaptation. Nonetheless, the ability of that framework satisfactorily to accommodate the rapid advances in developmental biology, genomics and ecology has been questioned. We review some of these arguments, focusing on literatures (evo-devo, developmental plasticity, inclusive inheritance and niche construction) whose implications for evolution can be interpreted in two ways—one that preserves the internal structure of contemporary evolutionary theory and one that points towards an alternative conceptual framework. The latter, which we label the ‘extended evolutionary synthesis' (EES), retains the fundaments of evolutionary theory, but differs in its emphasis on the role of constructive processes in development and evolution, and reciprocal portrayals of causation. In the EES, developmental processes, operating through developmental bias, inclusive inheritance and niche construction, share responsibility for the direction and rate of evolution, the origin of character variation and organism–environment complementarity. We spell out the structure, core assumptions and novel predictions of the EES, and show how it can be deployed to stimulate and advance research in those fields that study or use evolutionary biology. PMID:26246559

  11. Stable isotopes and elasmobranchs: tissue types, methods, applications and assumptions.

    Science.gov (United States)

    Hussey, N E; MacNeil, M A; Olin, J A; McMeans, B C; Kinney, M J; Chapman, D D; Fisk, A T

    2012-04-01

    Stable-isotope analysis (SIA) can act as a powerful ecological tracer with which to examine diet, trophic position and movement, as well as more complex questions pertaining to community dynamics and feeding strategies or behaviour among aquatic organisms. With major advances in the understanding of the methodological approaches and assumptions of SIA through dedicated experimental work in the broader literature coupled with the inherent difficulty of studying typically large, highly mobile marine predators, SIA is increasingly being used to investigate the ecology of elasmobranchs (sharks, skates and rays). Here, the current state of SIA in elasmobranchs is reviewed, focusing on available tissues for analysis, methodological issues relating to the effects of lipid extraction and urea, the experimental dynamics of isotopic incorporation, diet-tissue discrimination factors, estimating trophic position, diet and mixing models and individual specialization and niche-width analyses. These areas are discussed in terms of assumptions made when applying SIA to the study of elasmobranch ecology and the requirement that investigators standardize analytical approaches. Recommendations are made for future SIA experimental work that would improve understanding of stable-isotope dynamics and advance their application in the study of sharks, skates and rays. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  12. Has the "Equal Environments" assumption been tested in twin studies?

    Science.gov (United States)

    Eaves, Lindon; Foley, Debra; Silberg, Judy

    2003-12-01

    A recurring criticism of the twin method for quantifying genetic and environmental components of human differences is the necessity of the so-called "equal environments assumption" (EEA) (i.e., that monozygotic and dizygotic twins experience equally correlated environments). It has been proposed to test the EEA by stratifying twin correlations by indices of the amount of shared environment. However, relevant environments may also be influenced by genetic differences. We present a model for the role of genetic factors in niche selection by twins that may account for variation in indices of the shared twin environment (e.g., contact between members of twin pairs). Simulations reveal that stratification of twin correlations by amount of contact can yield spurious evidence of large shared environmental effects in some strata and even give false indications of genotype x environment interaction. The stratification approach to testing the equal environments assumption may be misleading and the results of such tests may actually be consistent with a simpler theory of the role of genetic factors in niche selection.
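
    The abstract's central point, that an index of the "shared environment" such as twin contact can itself be genetically influenced, can be sketched in a toy simulation (all parameter values and the model structure are illustrative assumptions, not taken from the paper). If a contact index is partly driven by genetic niche selection, it will itself show higher MZ than DZ pair correlations, so stratifying twin correlations by it is confounded:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000  # twin pairs per zygosity

def contact_correlation(shared_frac):
    """Pair correlation of a 'contact' index when contact is partly
    driven by an additive genetic value shared between co-twins.
    shared_frac = 1.0 for MZ twins, 0.5 for DZ twins."""
    a_shared = rng.standard_normal(n)
    # Each twin's genetic value: shared part plus individual part.
    a1 = np.sqrt(shared_frac) * a_shared + np.sqrt(1 - shared_frac) * rng.standard_normal(n)
    a2 = np.sqrt(shared_frac) * a_shared + np.sqrt(1 - shared_frac) * rng.standard_normal(n)
    # Contact index: genetic niche selection plus noise (weights assumed).
    c1 = 0.7 * a1 + 0.3 * rng.standard_normal(n)
    c2 = 0.7 * a2 + 0.3 * rng.standard_normal(n)
    return np.corrcoef(c1, c2)[0, 1]

mz = contact_correlation(1.0)   # MZ pairs correlate strongly on contact
dz = contact_correlation(0.5)   # DZ pairs correlate roughly half as much
```

    Because the contact index behaves like a heritable trait here, strata defined by it differ systematically in genotype, which is how stratified twin correlations can yield the spurious effects the abstract describes.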

  13. Halo-Independent Direct Detection Analyses Without Mass Assumptions

    CERN Document Server

    Anderson, Adam J.; Kahn, Yonatan; McCullough, Matthew

    2015-10-06

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the $m_\chi-\sigma_n$ plane. Recently methods which are independent of the DM halo velocity distribution have been developed which present results in the $v_{min}-\tilde{g}$ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from $v_{min}$ to nuclear recoil momentum ($p_R$), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call $\tilde{h}(p_R)$. The entire family of conventional halo-independent $\tilde{g}(v_{min})$ plots for all DM masses are directly found from the single $\tilde{h}(p_R)$ plot through a simple re...
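
    A minimal sketch of the kinematics behind the change of variables, assuming standard elastic scattering where $v_{min} = p_R / (2\mu)$ with $\mu$ the DM-nucleus reduced mass (target mass and recoil momentum values are illustrative): a single recoil momentum $p_R$ maps to a different $v_{min}$ for every assumed DM mass, which is why working directly in $p_R$ removes the need for a fiducial mass choice.

```python
def reduced_mass(m_chi, m_nucleus):
    # DM-nucleus reduced mass (GeV, natural units).
    return m_chi * m_nucleus / (m_chi + m_nucleus)

def v_min(p_r, m_chi, m_nucleus):
    # Elastic-scattering kinematics: v_min = p_R / (2 mu), in units of c.
    return p_r / (2.0 * reduced_mass(m_chi, m_nucleus))

m_xe = 122.0  # approximate xenon nucleus mass in GeV (assumed target)
p_r = 0.05    # 50 MeV/c nuclear recoil momentum (illustrative)

# The same p_R corresponds to a smaller v_min as the DM mass grows.
for m_chi in (10.0, 100.0, 1000.0):
    print(m_chi, v_min(p_r, m_chi, m_xe) * 3e5, "km/s")
```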

  14. Evaluation guide for the radiological impact study of a basic nuclear installation (BNI) as a support for the authorization application of releases

    International Nuclear Information System (INIS)

    Chartier, Mr.; Despres, A.; Supervil, S.; Conte, D.; Hubert, P.; Oudiz, A.; Champion, D.

    2002-10-01

    At the time of a licence application for effluent releases and water pumping at basic nuclear facilities (BNF), the operator of the installation must in particular provide a study of the radiological impact on the environment and on public health of the radioactive effluent releases from the installation. Such an impact study must meet technical and regulatory specifications. It was for this reason that the French Safety Authority (ASN, then DSIN) and the Directorate-General of Health Services (DGS) requested IRSN (then IPSN), in April 1999, to develop a guide facilitating the review of such studies, both for the services involved in the examination of licence applications and for all parties concerned in this field. The guide takes into account the regulatory context which underlies the development of impact studies (decree no. 95-540 of May 4, 1995, modified by decree no. 2002-460 of April 4, 2002, and the Euratom directive 96/29 of May 13, 1996, known as 'the basic standards directive', together with the texts transposing it into French law). In this context, the guide proposes to assess the radiological impact study of a BNF from three different angles: the description and quantification of the effluents produced, taking account of the generating processes, the different treatment measures and the procedures for optimising the reduction of the effluents produced; the estimate of the dosimetric impact of the planned releases on the population, taking into account the environmental characteristics of the installation; and the definition of the conditions for monitoring the releases and the environment. The guide provides a general methodological framework adaptable to any particular situation met.

  15. Protection against external impacts and missiles - Load assumption and effects on the plant design of a 1300 MW PWR-Plant

    International Nuclear Information System (INIS)

    Gremm, O.; Orth, K.H.

    1978-01-01

    The load assumptions for, and the effects of, external impacts are given. The fundamental features of the KWU standard design with respect to these impacts, and the consequences for the engineered safeguards, are explained. Protection against external impacts includes protection against all external missiles. The basic measure of protection against internal missiles is the strict separation of redundancies. (author)

  16. From basic needs to basic rights.

    Science.gov (United States)

    Facio, A

    1995-06-01

    After arriving at an understanding that basic rights refer to all human needs, it is clear that a recognition of the basic needs of female humans must precede the realization of their rights. The old Women in Development (WID) framework only understood women's needs from an androcentric perspective which was limited to practical interests. Instead, women's primary need is to be free from their subordination to men. Such an understanding places all of women's immediate needs in a new light. A human rights approach to development would see women not as beneficiaries but as people entitled to enjoy the benefits of development. Discussion of what equality before the law should mean to women began at the Third World Conference on Women in Nairobi where the issue of violence against women was first linked to development. While debate continues about the distinction between civil and political rights and economic, social, and cultural rights, the realities of women's lives do not permit such a distinction. The concept of the universality of human rights did not become codified until the UN proclaimed the Universal Declaration of Human Rights in 1948. The declaration has been criticized by feminists because the view of human rights it embodies has been too strongly influenced by a liberal Western philosophy which stresses individual rights and because it is ambiguous on the distinction between human rights and the rights of a citizen. The protection of rights afforded by the Declaration, however, should not be viewed as a final achievement but as an ongoing struggle. International conferences have led to an analysis of the human-rights approach to sustainable development which concludes that women continue to face the routine denial of their rights. Each human right must be redefined from the perspective of women's needs, which must also be redefined. Women must forego challenging the concept of the universality of human rights in order to overcome the argument of cultural

  17. Basic rocks in Finland

    International Nuclear Information System (INIS)

    Piirainen, T.; Gehoer, S.; Iljina, M.; Kaerki, A.; Paakkola, J.; Vuollo, J.

    1992-10-01

    Basic igneous rocks, containing less than 52% SiO2, constitute an important part of the Finnish Archaean and Proterozoic crust. Two units in the Archaean crust contain the majority of the basic rocks. The Archaean basic rocks are metavolcanics situated in the greenstone belts of eastern Finland, and are divided into two units: the greenstones of the lower unit are tholeiites, komatiites and basaltic komatiites, while the upper unit consists of a bimodal series of volcanics whose basic rocks are Fe-tholeiites, basaltic komatiites and komatiites. The Proterozoic basic rocks are divided into seven groups according to their ages. Proterozoic igneous activity started with voluminous basic magmatism 2.44 Ga ago; during this stage the layered intrusions and related dykes of northern Finland were formed. The 2.2 Ga old basic rocks are situated at the margins of the Karelian formations. Fe-tholeiitic magmatic activity of 2.1 Ga age is widespread in eastern and northern Finland. The basic rocks of the 1.97 Ga age group occur within the Karelian schist belts as obducted ophiolite complexes, but also as tholeiitic diabase dykes cutting the Karelian schists and the Archaean basement. The intrusions and volcanics of the 1.9 Ga old basic igneous activity are mostly encountered around the Granitoid Complex of Central Finland. Subjotnian, 1.6 Ga old tholeiitic diabases are situated around the rapakivi massifs of southern Finland, and Postjotnian, 1.2 Ga diabases in western Finland, where they form dykes cutting Svecofennian rocks

  18. Quantum electronics basic theory

    CERN Document Server

    Fain, V M; Sanders, J H

    1969-01-01

    Quantum Electronics, Volume 1: Basic Theory is a condensed and generalized account of the extensive research and rapid progress in this subject, translated from the Russian. The volume describes the basic theory of quantum electronics and shows how the concepts and equations used in quantum electronics arise from the basic principles of theoretical physics. The book then briefly discusses the interaction of an electromagnetic field with matter. The text also covers the quantum theory of relaxation processes as a quantum system approaches an equilibrium state, and explai

  19. Basic stress analysis

    CERN Document Server

    Iremonger, M J

    1982-01-01

    BASIC Stress Analysis aims to help students become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis, because writing a program is analogous to teaching: it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c

  20. Estimation of the energy loss at the blades in rowing: common assumptions revisited.

    Science.gov (United States)

    Hofmijster, Mathijs; De Koning, Jos; Van Soest, A J

    2010-08-01

    In rowing, power is inevitably lost as kinetic energy is imparted to the water during push-off with the blades. Power loss is estimated from reconstructed blade kinetics and kinematics. Traditionally, it is assumed that the oar is completely rigid and that force acts strictly perpendicular to the blade. The aim of the present study was to evaluate how reconstructed blade kinematics, kinetics, and average power loss are affected by these assumptions. A calibration experiment with instrumented oars and oarlocks was performed to establish relations between measured signals and oar deformation and blade force. Next, an on-water experiment was performed with a single female world-class rower rowing at constant racing pace in an instrumented scull. Blade kinematics, kinetics, and power loss under different assumptions (rigid versus deformable oars; absence or presence of a blade force component parallel to the oar) were reconstructed. Estimated power losses at the blades are 18% higher when parallel blade force is incorporated. Incorporating oar deformation affects reconstructed blade kinematics and instantaneous power loss, but has no effect on estimation of power losses at the blades. Assumptions on oar deformation and blade force direction have implications for the reconstructed blade kinetics and kinematics. Neglecting parallel blade forces leads to a substantial underestimation of power losses at the blades.

  1. An optical flow algorithm based on gradient constancy assumption for PIV image processing

    International Nuclear Information System (INIS)

    Zhong, Qianglong; Yang, Hua; Yin, Zhouping

    2017-01-01

    Particle image velocimetry (PIV) has matured as a flow measurement technique. It enables description of the instantaneous velocity field of a flow by analyzing the particle motion recorded in digital images. Correlation-based PIV evaluation is widely used because of its good accuracy and robustness. Although very successful, the correlation technique has weaknesses that optical flow based PIV algorithms can avoid. At present, most optical flow methods applied to PIV are based on the brightness constancy assumption. However, aspects of flow imaging technology and the nature of fluids make the brightness constancy assumption less appropriate in real PIV cases. In this paper, an implementation of a 2D optical flow algorithm (GCOF) based on the gradient constancy assumption is introduced. The proposed GCOF assumes that the edges of the illuminated PIV particles are constant during motion. It comprises two terms: a combined local-global gradient data term and a first-order divergence and vorticity smoothness term. The approach provides accurate, dense motion fields, and is tested on synthetic images and on two experimental flows. Comparison of GCOF with other optical flow algorithms indicates that the proposed method is more accurate, especially under illumination variation. Comparison with the correlation PIV technique shows that GCOF better preserves small divergence and vorticity structures of the motion field and yields fewer outliers. As a consequence, GCOF provides a more accurate and better topological description of the turbulent flow. (paper)
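As a toy illustration of why the gradient constancy assumption described in the abstract is attractive under illumination variation (this is not the authors' GCOF implementation; the frames and the brightness offset are invented), the sketch below compares brightness-constancy and gradient-constancy residuals for a zero-motion frame pair that differs only by a global illumination offset:

```python
# Two frames with no motion, but a global illumination change of +0.2.
# Brightness constancy is violated; gradient constancy still holds.
import numpy as np

rng = np.random.default_rng(1)
frame1 = rng.random((64, 64))
frame2 = frame1 + 0.2                      # illumination change, no motion

# Brightness-constancy residual at the true (zero) displacement: large.
brightness_residual = np.abs(frame2 - frame1).mean()

# Gradient-constancy residual at the same displacement: essentially zero,
# because a constant offset does not change the spatial gradients.
g1y, g1x = np.gradient(frame1)
g2y, g2x = np.gradient(frame2)
gradient_residual = (np.abs(g2x - g1x) + np.abs(g2y - g1y)).mean()
```

Here `brightness_residual` equals the 0.2 offset, while `gradient_residual` is at floating-point noise level, which is the intuition behind basing the data term on image gradients.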

  2. On the ontological assumptions of the medical model of psychiatry: philosophical considerations and pragmatic tasks

    Directory of Open Access Journals (Sweden)

    Giordano James

    2010-01-01

    Full Text Available A common theme in the contemporary medical model of psychiatry is that pathophysiological processes are centrally involved in the explanation, evaluation, and treatment of mental illnesses. Implied in this perspective is that clinical descriptors of these pathophysiological processes are sufficient to distinguish underlying etiologies. Psychiatric classification requires differentiation between what counts as normality (i.e., order), and what counts as abnormality (i.e., disorder). The distinction(s) between normality and pathology entail assumptions that are often deeply presupposed, manifesting themselves in statements about what mental disorders are. In this paper, we explicate that realism, naturalism, reductionism, and essentialism are core ontological assumptions of the medical model of psychiatry. We argue that while naturalism, realism, and reductionism can be reconciled with advances in contemporary neuroscience, essentialism - as defined to date - may be conceptually problematic, and we pose an eidetic construct of bio-psychosocial order and disorder based upon complex systems' dynamics. However, we also caution against the overuse of any theory, and claim that practical distinctions are important to the establishment of clinical thresholds. We opine that as we move ahead toward both a new edition of the Diagnostic and Statistical Manual, and a proposed Decade of the Mind, the task at hand is to re-visit nosologic and ontologic assumptions pursuant to a re-formulation of diagnostic criteria and practice.

  3. On the ontological assumptions of the medical model of psychiatry: philosophical considerations and pragmatic tasks

    Science.gov (United States)

    2010-01-01

    A common theme in the contemporary medical model of psychiatry is that pathophysiological processes are centrally involved in the explanation, evaluation, and treatment of mental illnesses. Implied in this perspective is that clinical descriptors of these pathophysiological processes are sufficient to distinguish underlying etiologies. Psychiatric classification requires differentiation between what counts as normality (i.e., order), and what counts as abnormality (i.e., disorder). The distinction(s) between normality and pathology entail assumptions that are often deeply presupposed, manifesting themselves in statements about what mental disorders are. In this paper, we explicate that realism, naturalism, reductionism, and essentialism are core ontological assumptions of the medical model of psychiatry. We argue that while naturalism, realism, and reductionism can be reconciled with advances in contemporary neuroscience, essentialism - as defined to date - may be conceptually problematic, and we pose an eidetic construct of bio-psychosocial order and disorder based upon complex systems' dynamics. However, we also caution against the overuse of any theory, and claim that practical distinctions are important to the establishment of clinical thresholds. We opine that as we move ahead toward both a new edition of the Diagnostic and Statistical Manual, and a proposed Decade of the Mind, the task at hand is to re-visit nosologic and ontologic assumptions pursuant to a re-formulation of diagnostic criteria and practice. PMID:20109176

  4. String cosmology basic ideas and general results

    CERN Document Server

    Veneziano, Gabriele

    1995-01-01

    After recalling a few basic concepts from cosmology and string theory, I will outline the main ideas/assumptions underlying (our own group's approach to) string cosmology and show how these lead to the definition of a two-parameter family of "minimal" models. I will then briefly explain how to compute, in terms of those parameters, the spectrum of scalar, tensor and electromagnetic perturbations, and mention their most relevant physical consequences. More details on the latter part of this talk can be found in Maurizio Gasperini's contribution to these proceedings.

  5. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    International Nuclear Information System (INIS)

    Shao, Kan; Gift, Jeffrey S.; Setzer, R. Woodrow

    2013-01-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more

  6. Is the assumption of normality or log-normality for continuous response data critical for benchmark dose estimation?

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Kan, E-mail: Shao.Kan@epa.gov [ORISE Postdoctoral Fellow, National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Gift, Jeffrey S. [National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States); Setzer, R. Woodrow [National Center for Computational Toxicology, U.S. Environmental Protection Agency, Research Triangle Park, NC (United States)

    2013-11-01

    Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more
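To make the relative deviation approach mentioned in the abstract concrete, here is a minimal sketch (not the authors' code; the linear dose-response model and the numbers are hypothetical) of how a BMD is obtained when the benchmark response is defined as a fixed relative change from the control mean:

```python
# Illustrative sketch: under the relative deviation approach, the BMD is
# the dose at which the mean response changes by a fixed fraction (the
# BMR, e.g. 10%) of the control mean. A hypothetical linear model
# m(d) = a + b*d is assumed here for simplicity.
def bmd_relative_deviation(a, b, bmr=0.10):
    """Solve |m(BMD) - m(0)| = bmr * m(0) for m(d) = a + b*d."""
    return bmr * a / abs(b)

# Example: control mean body weight 50 g, decreasing by 2 g per unit dose.
bmd = bmd_relative_deviation(a=50.0, b=-2.0)  # ≈ 2.5 dose units
```

In practice the fitted model is usually nonlinear and the BMD is found numerically, but the defining equation is the same.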

  7. Basic Financial Accounting

    DEFF Research Database (Denmark)

    Wiborg, Karsten

    This textbook on Basic Financial Accounting is targeted students in the economics studies at universities and business colleges having an introductory subject in the external dimension of the company's economic reporting, including bookkeeping, etc. The book includes the following subjects...

  8. HIV Treatment: The Basics

    Science.gov (United States)

    HIV Treatment: The Basics. Last Reviewed: March 22, 2018.

  9. Basics of SCI Rehabilitation

    Medline Plus

    Full Text Available ... How Peer Counseling Works Julie Gassaway, MS, RN Pediatric Injuries Pediatric Spinal Cord Injury 101 Lawrence Vogel, MD The Basics of Pediatric SCI Rehabilitation Sara Klaas, MSW Transitions for Children ...

  10. Powassan (POW) Virus Basics

    Science.gov (United States)

    Powassan Virus Disease Basics. What is Powassan virus? Powassan virus is a tickborne flavivirus that is ...

  11. Brain Basics: Understanding Sleep

    Science.gov (United States)

    Brain Basics: Understanding Sleep. Anatomy of Sleep; Sleep Stages. ... form or maintain the pathways in your brain that let you learn and create new memories, ...

  12. Basics of SCI Rehabilitation

    Medline Plus

    Full Text Available ... What is a Spinal Cord Injury? SCI Medical Experts; People Living With SCI; Personal Experiences ...

  13. Basics of SCI Rehabilitation

    Medline Plus

    Full Text Available ... The Basics of Spinal Cord Injury Rehabilitation; Adult Injuries; Spinal Cord Injury 101, David ...

  14. Basics of SCI Rehabilitation

    Medline Plus

    Full Text Available ... RN Pediatric Injuries Pediatric Spinal Cord Injury 101 Lawrence Vogel, MD The Basics of Pediatric SCI Rehabilitation ... Rogers, PT Recreational Therapy after Spinal Cord Injury Jennifer Piatt, PhD Kristine Cichowski, MS Read Bio Founding ...

  15. Basics of SCI Rehabilitation

    Medline Plus

    Full Text Available ... The Basics of Spinal Cord Injury Rehabilitation; Adult Injuries; Spinal Cord Injury 101 ...

  16. Basics of SCI Rehabilitation

    Medline Plus

    Full Text Available ... Spinal Cord Injury 101 Lawrence Vogel, MD The Basics of Pediatric SCI Rehabilitation Sara Klaas, MSW Transitions for Children with Spinal Cord Injury Patricia Mucia, RN Family Life After Pediatric Spinal Injury Dawn Sheaffer, MSW Rehabilitation ...

  17. Physical Activity Basics

    Science.gov (United States)

    Physical Activity Basics. How much physical activity do you need? Regular physical activity helps improve ...

  18. Radionuclide Basics: Iodine

    Science.gov (United States)

    Radionuclide Basics: Iodine. Iodine (chemical symbol I) is a chemical element. ... All 37 isotopes of iodine ...

  19. Experimental assessment of unvalidated assumptions in classical plasticity theory.

    Energy Technology Data Exchange (ETDEWEB)

    Brannon, Rebecca Moss (University of Utah, Salt Lake City, UT); Burghardt, Jeffrey A. (University of Utah, Salt Lake City, UT); Bauer, Stephen J.; Bronowski, David R.

    2009-01-01

    This report investigates the validity of several key assumptions in classical plasticity theory regarding material response to changes in the loading direction. Three metals, two rock types, and one ceramic were subjected to non-standard loading directions, and the resulting strain response increments were displayed in Gudehus diagrams to illustrate the approximation error of classical plasticity theories. A rigorous mathematical framework for fitting classical theories to the data, thus quantifying the error, is provided. Further data analysis techniques are presented that allow testing for the effect of changes in loading direction without having to use a new sample and for inferring the yield normal and flow directions without having to measure the yield surface. Though the data are inconclusive, there is indication that classical, incrementally linear, plasticity theory may be inadequate over a certain range of loading directions. This range of loading directions also coincides with loading directions that are known to produce a physically inadmissible instability for any nonassociative plasticity model.

  20. Factor structure and concurrent validity of the world assumptions scale.

    Science.gov (United States)

    Elklit, Ask; Shevlin, Mark; Solomon, Zahava; Dekel, Rachel

    2007-06-01

    The factor structure of the World Assumptions Scale (WAS) was assessed by means of confirmatory factor analysis. The sample was comprised of 1,710 participants who had been exposed to trauma that resulted in whiplash. Four alternative models were specified and estimated using LISREL 8.72. A correlated 8-factor solution was the best explanation of the sample data. The estimates of reliability of eight subscales of the WAS ranged from .48 to .82. Scores from five subscales correlated significantly with trauma severity as measured by the Harvard Trauma Questionnaire, although the magnitude of the correlations was low to modest, ranging from .08 to -.43. It is suggested that the WAS has adequate psychometric properties for use in both clinical and research settings.
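The subscale reliabilities reported above (.48 to .82) are the kind of statistic usually obtained with Cronbach's alpha. A minimal sketch on simulated (hypothetical) item data, not the WAS whiplash sample:

```python
# Sketch of a typical subscale reliability computation: Cronbach's alpha
# on the item scores of one subscale. Item data are simulated here.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) scores for a single subscale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of sum score
    return k / (k - 1) * (1.0 - sum_item_var / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                     # shared construct
items = latent + rng.normal(scale=1.0, size=(200, 4))  # 4 noisy items
alpha = cronbach_alpha(items)  # around .8 for this signal-to-noise ratio
```

With equal signal and noise variance the expected inter-item correlation is .5, which for four items gives an alpha near .8.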

  1. Posttraumatic Growth and Shattered World Assumptions Among Ex-POWs

    DEFF Research Database (Denmark)

    Lahav, Y.; Bellin, Elisheva S.; Solomon, Z.

    2016-01-01

    Objective: The controversy regarding the nature of posttraumatic growth (PTG) includes two main competing claims: one which argues that PTG reflects authentic positive changes and the other which argues that PTG reflects illusionary defenses. The former also suggests that PTG evolves from shattered world assumptions (WAs) and that the co-occurrence of high PTG and negative WAs among trauma survivors reflects reconstruction of an integrative belief system. The present study aimed to test these claims by investigating, for the first time, the mediating role of dissociation in the relation between PTG and WAs. Method: Former prisoners of war (ex-POWs; n = 158) and comparable controls (n = 106) were assessed 38 years after the Yom Kippur War. Results: Ex-POWs endorsed more negative WAs and higher PTG and dissociation compared to controls. Ex-POWs with posttraumatic stress disorder (PTSD

  2. Ancestral assumptions and the clinical uncertainty of evolutionary medicine.

    Science.gov (United States)

    Cournoyea, Michael

    2013-01-01

    Evolutionary medicine is an emerging field of medical studies that uses evolutionary theory to explain the ultimate causes of health and disease. Educational tools, online courses, and medical school modules are being developed to help clinicians and students reconceptualize health and illness in light of our evolutionary past. Yet clinical guidelines based on our ancient life histories are epistemically weak, relying on the controversial assumptions of adaptationism and advocating a strictly biophysical account of health. To fulfill the interventionist goals of clinical practice, it seems that proximate explanations are all we need to develop successful diagnostic and therapeutic guidelines. Considering these epistemic concerns, this article argues that the clinical relevance of evolutionary medicine remains uncertain at best.

  3. Polarized BRDF for coatings based on three-component assumption

    Science.gov (United States)

    Liu, Hong; Zhu, Jingping; Wang, Kai; Xu, Rong

    2017-02-01

    A pBRDF (polarized bidirectional reflection distribution function) model for coatings is given, based on a three-component reflection assumption, in order to improve polarized scattering simulation capability for space objects. In this model, the specular reflection is given based on microfacet theory, while the multiple reflection and volume scattering are given separately according to experimental results. The polarization of the specular reflection is derived from Fresnel's law, and both multiple reflection and volume scattering are assumed to be depolarized. Simulation and measurement results for two satellite coating samples, SR107 and S781, are given to validate that the pBRDF modeling accuracy can be significantly improved by the three-component model given in this paper.
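To illustrate the three-component logic described in the abstract (only the specular term polarized, the other two depolarized), here is a hedged sketch; the refractive index and the component weights are hypothetical, not measured values for SR107 or S781:

```python
# Sketch of the three-component assumption: only the specular (Fresnel)
# component is polarized; multiple reflection and volume scattering are
# treated as depolarized and therefore dilute the total degree of
# polarization (DoP). Index and weights below are hypothetical.
import math

def fresnel_dop(n, theta_deg):
    """DoP of specular reflection from a smooth dielectric of index n."""
    ti = math.radians(theta_deg)
    tt = math.asin(math.sin(ti) / n)                   # Snell's law
    rs = (math.sin(ti - tt) / math.sin(ti + tt)) ** 2  # s-pol reflectance
    rp = (math.tan(ti - tt) / math.tan(ti + tt)) ** 2  # p-pol reflectance
    return abs(rs - rp) / (rs + rp)

spec, multi, vol = 0.3, 0.5, 0.2      # hypothetical component weights
dop_total = spec * fresnel_dop(1.5, 55.0) / (spec + multi + vol)
# Near Brewster's angle the specular term is almost fully polarized, but
# the depolarized components keep the total DoP near 0.3 here.
```

The dilution of a near-unity specular DoP by the depolarized terms is why the total weight split matters as much as the Fresnel term itself.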

  4. Dynamic Group Diffie-Hellman Key Exchange under standard assumptions

    International Nuclear Information System (INIS)

    Bresson, Emmanuel; Chevassut, Olivier; Pointcheval, David

    2002-01-01

    Authenticated Diffie-Hellman key exchange allows two principals communicating over a public network, and each holding public-private keys, to agree on a shared secret value. In this paper we study the natural extension of this cryptographic problem to a group of principals. We begin from existing formal security models and refine them to incorporate major missing details (e.g., strong-corruption and concurrent sessions). Within this model we define the execution of a protocol for authenticated dynamic group Diffie-Hellman and show that it is provably secure under the decisional Diffie-Hellman assumption. Our security result holds in the standard model and thus provides better security guarantees than previously published results in the random oracle model
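As background to the abstract above, a minimal, unauthenticated sketch of group Diffie-Hellman key agreement (toy parameters; the paper's protocol additionally provides authentication, dynamic group operations, and a proof under the decisional Diffie-Hellman assumption):

```python
# Minimal, unauthenticated sketch of n-party Diffie-Hellman key agreement.
# Toy parameters for illustration only; NOT a vetted cryptographic group.
import secrets

p = 2**127 - 1   # a known Mersenne prime, used here purely as a demo modulus
g = 3

def apply_secrets(base, exponents):
    """Repeated modular exponentiation: base^(prod of exponents) mod p."""
    k = base
    for x in exponents:
        k = pow(k, x, p)
    return k

xs = [secrets.randbelow(p - 2) + 2 for _ in range(4)]  # members' secrets

# Member i receives g raised to all other members' secrets (accumulated
# during the protocol rounds) and finishes with its own secret x_i.
keys = [apply_secrets(g, [x for j, x in enumerate(xs) if j != i] + [xs[i]])
        for i in range(len(xs))]
assert len(set(keys)) == 1  # every member derives the same shared key
```

Because modular exponentiation commutes in the exponent, the order in which the secrets are applied does not matter, which is what lets each member finish the chain locally.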

  5. Halo-independent direct detection analyses without mass assumptions

    International Nuclear Information System (INIS)

    Anderson, Adam J.; Fox, Patrick J.; Kahn, Yonatan; McCullough, Matthew

    2015-01-01

    Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the m χ −σ n plane. Recently, methods independent of the DM halo velocity distribution have been developed which present results in the v min −g-tilde plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from v min to nuclear recoil momentum (p R ), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call h-tilde(p R ). The entire family of conventional halo-independent g-tilde(v min ) plots for all DM masses is directly found from the single h-tilde(p R ) plot through a simple rescaling of axes. By considering results in h-tilde(p R ) space, one can determine if two experiments are inconsistent for all masses and all physically possible halos, or for what range of dark matter masses the results are inconsistent for all halos, without the necessity of multiple g-tilde(v min ) plots for different DM masses. We conduct a sample analysis comparing the CDMS II Si events to the null results from LUX, XENON10, and SuperCDMS using our method and discuss how the results can be strengthened by imposing the physically reasonable requirement of a finite halo escape velocity
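The change of variables described above rests on the elastic-scattering kinematics v_min = p_R / (2 μ), with μ the DM-nucleus reduced mass, so a single curve in p_R rescales onto the v_min axis for any assumed DM mass. A minimal sketch (illustrative numbers in arbitrary consistent units):

```python
# Sketch of the halo-independent change of variables: for elastic
# scattering, v_min = p_R / (2 * mu), with mu the DM-nucleus reduced
# mass, so one curve in recoil momentum p_R maps onto v_min for any
# assumed DM mass by rescaling the axis. Numbers are illustrative,
# in consistent natural units (c = 1).
def reduced_mass(m_chi, m_nucleus):
    return m_chi * m_nucleus / (m_chi + m_nucleus)

def vmin_from_pR(p_R, m_chi, m_nucleus):
    return p_R / (2.0 * reduced_mass(m_chi, m_nucleus))

p_R_grid = [0.5, 1.0, 2.0]
# Lighter DM has a smaller reduced mass, hence larger v_min at the same p_R.
vmin_light = [vmin_from_pR(p, m_chi=5.0, m_nucleus=120.0) for p in p_R_grid]
vmin_heavy = [vmin_from_pR(p, m_chi=50.0, m_nucleus=120.0) for p in p_R_grid]
```

This is the rescaling that turns one h-tilde(p R ) plot into the whole family of g-tilde(v min ) plots.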

  6. Wartime Paris, cirrhosis mortality, and the ceteris paribus assumption.

    Science.gov (United States)

    Fillmore, Kaye Middleton; Roizen, Ron; Farrell, Michael; Kerr, William; Lemmens, Paul

    2002-07-01

    This article critiques the ceteris paribus assumption, which tacitly sustains the epidemiologic literature's inference that the sharp decline in cirrhosis mortality observed in Paris during the Second World War derived from a sharp constriction in wine consumption. Paris's wartime circumstances deviate substantially from the "all else being equal" assumption, and at least three other hypotheses for the cirrhosis decline may be contemplated. Historical and statistical review. Wartime Paris underwent tumultuous changes. Wine consumption did decline, but there were, as well, a myriad of other changes in diet and life experience, many involving new or heightened hardships, nutritional, experiential, institutional, health and mortality risks. Three competing hypotheses are presented: (1) A fraction of the candidates for cirrhosis mortality may have fallen to more sudden forms of death; (2) alcoholics, heavy drinkers and Paris's clochard subpopulation may have been differentially likely to become removed from the city's wartime population, whether by self-initiated departure, arrest and deportation, or death from other causes, even murder; and (3) there was mismeasurement in the cirrhosis mortality decline. The alcohol-cirrhosis connection provided the template for the alcohol research effort (now more than 20 years old) aimed at re-establishing scientific recognition of alcohol's direct alcohol-problems-generating associations and causal responsibilities. In a time given to reports of weaker associations of the alcohol-cirrhosis connection, the place and importance of the Paris curve in the wider literature, as regards that connection, remains. For this reason, the Paris findings should be subjected to as much research scrutiny as they undoubtedly deserve.

  7. Basic Finite Element Method

    International Nuclear Information System (INIS)

    Lee, Byeong Hae

    1992-02-01

This book describes the basic finite element method. It covers the basics of the method and its data; the black box; writing of data; definitions of vectors and matrices; matrix multiplication, matrix addition, and the unit matrix; the concept of the stiffness matrix, which relates spring force and displacement; the governing equation of an elastic body; the finite element method itself; Fortran programming, including computer organization, order of programming, data cards and Fortran cards; and a finite element program with its application to inelastic problems.

  8. Development NGOs: Basic Facts

    OpenAIRE

    Aldashev, Gani; Navarra, Cecilia

    2017-01-01

    This paper systematizes the results of the empirical literature on development non-governmental organizations (NGOs), drawing both from quantitative and qualitative analyses, and constructs a set of basic facts about these organizations. These basic facts concern the size of the development NGO sector and its evolution, the funding of NGOs, the allocation of NGO aid and projects across beneficiary countries, the relationship of NGOs with beneficiaries, and the phenomenon of globalization of d...

  9. An Evaluation of Organizational and Experience Factors Affecting the Perceived Transfer of U.S. Air Force Basic Combat Skills Training

    National Research Council Canada - National Science Library

    Crow, Shirley D

    2007-01-01

    ... they learned in training on the job or in a hostile environment. The analysis used structural equation modeling to evaluate the paths between each of the factors and perceived training transfer...

  10. Development and Pre-Clinical Evaluation of Recombinant Human Myelin Basic Protein Nano Therapeutic Vaccine in Experimental Autoimmune Encephalomyelitis Mice Animal Model

    Science.gov (United States)

    Al-Ghobashy, Medhat A.; Elmeshad, Aliaa N.; Abdelsalam, Rania M.; Nooh, Mohammed M.; Al-Shorbagy, Muhammad; Laible, Götz

    2017-04-01

Recombinant human myelin basic protein (rhMBP) was previously produced in the milk of transgenic cows. Differences in molecular recognition of either hMBP or rhMBP by surface-immobilized anti-hMBP antibodies were demonstrated. This indicated differences in immunological response between rhMBP and hMBP. Here, the activity of free and controlled release rhMBP poly(ɛ-caprolactone) nanoparticles (NPs) as a therapeutic vaccine against multiple sclerosis (MS) was demonstrated in the experimental autoimmune encephalomyelitis (EAE) animal model. Following optimization of the nanoformulation, discrete, spherical, rough-surfaced rhMBP NPs with high entrapment efficiency and a controlled release pattern were obtained. Results indicated that rhMBP was loaded into and electrostatically adsorbed onto the surface of NPs. Subcutaneous administration of free rhMBP or rhMBP NPs before EAE induction reduced the average behavioral score in EAE mice and showed only mild histological alterations and preservation of the myelin sheath, with rhMBP NPs showing increased protection. Moreover, analysis of inflammatory cytokines (IFN-γ and IL-10) in mice brains revealed that pretreatment with free rhMBP or rhMBP NPs significantly protected against induced inflammation. In conclusion: i) rhMBP ameliorated EAE symptoms in the EAE animal model; ii) nanoformulation significantly enhanced the efficacy of rhMBP as a therapeutic vaccine; and iii) clinical investigations are required to demonstrate the activity of rhMBP NPs as a therapeutic vaccine for MS.

  11. Structure and evaluation of antibacterial and antitubercular properties of new basic and heterocyclic 3-formylrifamycin SV derivatives obtained via 'click chemistry' approach.

    Science.gov (United States)

    Pyta, Krystian; Klich, Katarzyna; Domagalska, Joanna; Przybylski, Piotr

    2014-09-12

Thirty-four novel derivatives of 3-formylrifamycin SV were synthesized via reductive alkylation and copper(I)-catalysed azide-alkyne cycloaddition. According to the obtained results, 'click chemistry' can be successfully applied to the modification of structurally complex antibiotics such as rifamycins, with formation of the desired 1,2,3-triazole products. However, when the azide-alkyne cycloaddition on 3-formylrifamycin SV derivatives demanded a higher amount of catalyst, a lower temperature and a longer reaction time because of the high volatility of the substrates, an unexpected intramolecular condensation took place, forming a 3,4-dihydrobenzo[g]quinazoline heterocyclic system. Structures of the new derivatives in solution were determined using one- and two-dimensional NMR methods and FT-IR spectroscopy. Computational DFT and PM6 methods were employed to correlate their conformations and acid-base properties with biological activity and to establish the SAR of the novel compounds. Microbiological, physico-chemical (logP, solubility) and structural studies of the newly synthesised rifamycins indicated that relatively high antibacterial (MIC ~0.01 nmol/mL) and antitubercular (MIC ~0.006 nmol/mL) activities require a rigid and basic substituent at the C(3) arm, containing a protonated nitrogen atom "open" toward intermolecular interactions. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  12. Research and development of the industrial basic technologies of the next generation, 'composite materials (resin-based)'. Evaluation of the second phase research and development; Jisedai sangyo kiban gijutsu kenkyu kaihatsu 'fukugo zairyo (jushikei). Dainiki hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1987-03-31

The results of the second phase research and development project for developing resin-based composite materials as basic technologies of the next generation are evaluated. The second phase (FY 1983 to 1986) project was carried out to achieve objectives set up based on the objectives for the basic designs and the results of the first phase project. As a result, the concrete promotion objectives, including those related to mechanical characteristics, have been almost entirely achieved. For development of the highly functional FRP materials, the R and D efforts were directed at improving their moldability while satisfying the requirements for high functionality, e.g., heat resistance, leading to development of the basic techniques for epoxy resin-based intermediate materials. These results indicate the possibility of commercialization of highly heat-resistant polyimide resin-based and novel resin-based intermediate materials. The novel molding/processing methods taken up in the second phase project, which dispense with the autoclave, are rated as developed far enough to suggest the possibility of eventual commercialization, although the products they give do not always show sufficient mechanical properties. (NEDO)

  13. Research and development of the industrial basic technologies of the next generation, 'composite materials (resin-based)'. Evaluation of the first phase research and development; Jisedai sangyo kiban gijutsu kenkyu kaihatsu 'fukugo zairyo (jushikei)'. Daiikki kenkyu kaihatsu hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1984-03-30

The results of the first phase research and development project for developing resin-based composite materials as basic technologies of the next generation are evaluated. This project is aimed at developing resin-based composite materials as light, high-strength and high-rigidity structural materials. Development of the basic techniques for these materials is of high significance and is highly rated. The first phase R and D efforts were directed at synthesis of new skeleton and terminal compounds and their introduction into the matrices; application and combination of various fiber surface modification techniques; development of the basic techniques for intermediate materials for three-dimensional fabrics, triaxial fabrics and hybrid materials; improvement of the main techniques for a series of molding/processing steps; adoption of new methods; and development of design programs and elementary design techniques. The technical targets of improving the functions of each item have been achieved, or bright prospects for achieving them have been obtained, each with potential for further functional improvement through development of new functions. It is also considered that the parallel, competitive development of 7 routes in research on the matrix resins has brought great benefits. (NEDO)

  14. The social contact hypothesis under the assumption of endemic equilibrium: Elucidating the transmission potential of VZV in Europe

    Directory of Open Access Journals (Sweden)

    E. Santermans

    2015-06-01

The basic reproduction number R0 and the effective reproduction number R are pivotal parameters in infectious disease epidemiology, quantifying the transmission potential of an infection in a population. We estimate both parameters from 13 pre-vaccination serological data sets on varicella zoster virus (VZV) in 12 European countries and from population-based social contact surveys under the commonly made assumptions of endemic and demographic equilibrium. The fit to the serology is evaluated using the inferred effective reproduction number R as a model eligibility criterion combined with AIC as a model selection criterion. For only 2 out of 12 countries, the common choice of a constant proportionality factor is sufficient to provide a good fit to the seroprevalence data. For the other countries, an age-specific proportionality factor provides a better fit, assuming physical contacts lasting longer than 15 min are a good proxy for potential varicella transmission events. In all countries, primary infection with VZV most often occurs in early childhood, but there is substantial variation in transmission potential, with R0 ranging from 2.8 in England and Wales to 7.6 in The Netherlands. Two non-parametric methods, the maximal information coefficient (MIC) and a random forest approach, are used to explain these differences in R0 in terms of relevant country-specific characteristics. Our results suggest an association with three general factors: inequality in wealth, infant vaccination coverage and child care attendance. This illustrates the need to consider fundamental differences between European countries when formulating and parameterizing infectious disease models.
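The social-contact approach described above can be sketched numerically: under the constant proportionality factor assumption, R0 is the dominant eigenvalue of the next-generation matrix built from the contact matrix, and R follows from the susceptible fraction at endemic equilibrium. The contact matrix, proportionality factor `q`, infectious period and susceptible fraction below are invented illustrative values, not the study's survey or serology data.

```python
import numpy as np

# Toy 3-age-group contact matrix (mean daily contacts); values are
# illustrative assumptions, not the contact-survey data used in the study.
contacts = np.array([[12.0, 3.0, 1.0],
                     [ 3.0, 8.0, 2.0],
                     [ 1.0, 2.0, 4.0]])

q = 0.05  # assumed constant proportionality factor (per-contact transmission)
D = 7.0   # assumed mean infectious period (days)

# Under the constant-q assumption the next-generation matrix is q * D * C;
# R0 is its dominant eigenvalue (spectral radius).
ngm = q * D * contacts
R0 = max(abs(np.linalg.eigvals(ngm)))

# At endemic equilibrium, R = R0 * s, where s is the susceptible fraction.
s = 0.35
R_eff = R0 * s

print(f"R0 = {R0:.2f}, R = {R_eff:.2f}")
```

An age-specific proportionality factor, as fitted for most countries in the study, would replace the scalar `q` with a per-age-group vector applied row-wise before taking the spectral radius.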

  15. Expressing Environment Assumptions and Real-time Requirements for a Distributed Embedded System with Shared Variables

    DEFF Research Database (Denmark)

    Tjell, Simon; Fernandes, João Miguel

    2008-01-01

    In a distributed embedded system, it is often necessary to share variables among its computing nodes to allow the distribution of control algorithms. It is therefore necessary to include a component in each node that provides the service of variable sharing. For that type of component, this paper...... for the component. The CPN model can be used to validate the environment assumptions and the requirements. The validation is performed by execution of the model during which traces of events and states are automatically generated and evaluated against the requirements....

  16. Basic evaluation of measurement of the serum level of squamous cell carcinoma-related antigen (SCC) and its value in following irradiated patients with cancer of the uterine cervix

    International Nuclear Information System (INIS)

    Obata, Yasunori; Tadokoro, Masanori; Kazato, Sadayuki

    1987-01-01

The measurement of the serum level of squamous cell carcinoma-related antigen (SCC), purified from a liver metastasis of cancer of the uterine cervix, by an RIA kit is evaluated at a basic level. The results of the sensitivity, recovery, dilution and variance tests are good enough for clinical application. Among gynecological disorders, the positive rate is high (62% [29/47]) in patients with cancer of the uterine cervix, and both the rate and the level are related to clinical staging. The changes in the serum SCC level in irradiated patients with cancer of the uterine cervix were a good reflection of the effectiveness of the treatment. (author)

  17. Fiscal 1999 technological survey report. Part 2. Applied technology for measuring human sense (Human sense measuring manual - basic technology for sense evaluation); Ningen kankaku keisoku manual. 2. Kankaku hyoka kiban gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

A method of measuring and evaluating mental and physical states by means of physiological information, developed by the project, was compiled into a 'guide book', as was a method of evaluating adaptability to the environment or to products; this manual was prepared to improve the adaptability of human beings to products by making wide use of the guide book in industrial manufacturing. Part 2 explains measuring instruments, evaluation devices, simulation systems, methods of data analysis, etc., as 'basic technology for sense evaluation'. Chapter 1 covers new measuring and evaluation devices (a device for measuring physiological signals on the surface of the body, a device for measuring visual characteristics, a measuring device for in vivo substances, a measuring device for thermal response, and a system for evaluating the adaptability of practical forms); chapter 2 covers new simulators (a model of human body temperature with clothes on, a human comfort meter, a perspiring thermal manikin, and an autonomic nerve control model of the cardiovascular/respiratory system); and chapter 3 covers new experimental and analytical methods (a new data analysis method and a subjective evaluation questionnaire for stress assessment). (NEDO)

  18. Chemometric evaluation of the combined effect of temperature, pressure, and co-solvent fractions on the chiral separation of basic pharmaceuticals using actual vs set operational conditions.

    Science.gov (United States)

    Forss, Erik; Haupt, Dan; Stålberg, Olle; Enmark, Martin; Samuelsson, Jörgen; Fornstedt, Torgny

    2017-05-26

The need to determine the actual operational conditions, instead of merely using the set operational conditions, was investigated in packed supercritical fluid chromatography (SFC) by design of experiments (DoE), using an important class of compounds, basic pharmaceuticals, as models. The actual values of temperature, pressure, and methanol level were recorded and calculated from external sensors, while the responses in the DoE were the retention factors and the selectivity. A Kromasil CelluCoat column was used as the stationary phase, carbon dioxide containing varying methanol contents as the mobile phase, and the six racemates of alprenolol, atenolol, metoprolol, propranolol, clenbuterol, and mianserin were selected as model solutes. For the retention modeling, the most important term was the methanol fraction, followed by the temperature and pressure. Significant differences (p < 0.05) between most of the coefficients in the retention models were observed when comparing models from set and actual conditions. The selectivity was much less affected by operational changes and therefore was not severely affected by differences between set and actual conditions. The temperature differences were usually small, at most ±1.4 °C, whereas the pressure differences were larger, typically approximately +10.5 bar. The set and actual fractions of methanol also differed, usually by ±0.4 percentage points. A cautious conclusion is that the primary reason for the discrepancy between the models is a mismatch between the set and actual methanol fractions; this mismatch is more serious in retention models at low methanol fractions. The study demonstrates that the actual conditions should almost always be preferred. Copyright © 2017 Elsevier B.V. All rights reserved.
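Retention modeling of the kind described above, a first-order model in methanol fraction, temperature and pressure fitted to the recorded conditions, can be sketched with ordinary least squares. The design points, coefficient values and noise level below are invented for illustration; they are not the study's data or its DoE software.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "actual" operating conditions: methanol fraction (%), T (degC),
# P (bar). Ranges are illustrative assumptions, not the study's design.
X = np.column_stack([
    rng.uniform(5, 25, 40),     # methanol fraction
    rng.uniform(20, 40, 40),    # temperature
    rng.uniform(110, 160, 40),  # pressure
])

# Assume a log-linear retention model ln k = b0 + b1*MeOH + b2*T + b3*P + noise,
# with methanol the dominant term, matching the ranking reported in the abstract.
true_b = np.array([3.0, -0.12, -0.01, -0.002])
lnk = true_b[0] + X @ true_b[1:] + rng.normal(0, 0.02, 40)

# Ordinary least squares fit of the first-order response-surface model.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, lnk, rcond=None)
# coef[1] (methanol) should dominate coef[2] (T) and coef[3] (P) in magnitude.
```

Fitting the same model twice, once with set and once with actual condition values, and comparing the coefficient estimates would mirror the study's set-versus-actual comparison.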

  19. Positron emission tomography basic sciences

    CERN Document Server

    Townsend, D W; Valk, P E; Maisey, M N

    2003-01-01

Essential for students and science and medical graduates who want to understand the basic science of Positron Emission Tomography (PET), this book describes the physics, chemistry and technology behind PET, with an overview of its clinical uses and the imaging techniques it employs. In recent years, PET has moved from a high-end research imaging tool used by the highly specialized to an essential component of clinical evaluation, especially in cancer management. Formerly the realm of scientists alone, PET is explained here for a wider audience: the book covers PET instrumentation, radiochemistry, PET data acquisition and image formation, integration of structural and functional images, radiation dosimetry and protection, and applications in dedicated areas such as drug development, oncology, and gene expression imaging. The technologist, the science, engineering or chemistry graduate seeking further detailed information about PET, or the medical advanced trainee wishing to gain insight into the basic science of PET will find this book...

  20. Towards a test of non-locality without 'supplementary assumptions'

    International Nuclear Information System (INIS)

    Barbieri, M.; De Martini, F.; Di Nepi, G.; Mataloni, P.

    2005-01-01

We have experimentally tested the non-local properties of the two-photon states generated by a high-brilliance source of entanglement which virtually allows the direct measurement of the full set of photon pairs created by the basic QED process implied by parametric quantum scattering. Standard Bell measurements and a Bell's inequality violation test have been realized over the entire cone of emission of the degenerate pairs. With the same source we have verified Hardy's ladder theory up to the 20th step, and the contradiction between standard quantum theory and local realism has been tested for 41% of entangled pairs.
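For reference, the CHSH form of the Bell inequality test mentioned above can be reproduced numerically for an ideal maximally entangled photon pair; the polarizer angles below are the textbook optimal settings, assumed here rather than taken from the experiment.

```python
import numpy as np

# Quantum correlation for a maximally entangled photon pair measured at
# polarizer angles a and b: E(a, b) = cos(2*(a - b)) for photon polarization.
def E(a, b):
    return np.cos(2.0 * (a - b))

# Standard CHSH angle choices (radians); textbook settings, assumed here.
a, a2 = 0.0, np.pi / 4
b, b2 = np.pi / 8, 3 * np.pi / 8

# CHSH combination: local realism bounds S by 2; quantum mechanics
# reaches 2*sqrt(2) (Tsirelson's bound) at these settings.
S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)  # → 2.828..., i.e. 2*sqrt(2), violating the classical bound of 2
```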

  1. The Impact of Modeling Assumptions in Galactic Chemical Evolution Models

    Science.gov (United States)

    Côté, Benoit; O'Shea, Brian W.; Ritter, Christian; Herwig, Falk; Venn, Kim A.

    2017-02-01

We use the OMEGA galactic chemical evolution code to investigate how the assumptions used for the treatment of galactic inflows and outflows impact numerical predictions. The goal is to determine how our capacity to reproduce the chemical evolution trends of a galaxy is affected by the choice of implementation used to include those physical processes. In pursuit of this goal, we experiment with three different prescriptions for galactic inflows and outflows and use OMEGA within a Markov Chain Monte Carlo code to recover the set of input parameters that best reproduces the chemical evolution of nine elements in the dwarf spheroidal galaxy Sculptor. This provides a consistent framework for comparing the best-fit solutions generated by our different models. Despite their different degrees of intended physical realism, we found that all three prescriptions can reproduce in an almost identical way the stellar abundance trends observed in Sculptor. This result supports the similar conclusions originally claimed by Romano & Starkenburg for Sculptor. While the three models have the same capacity to fit the data, the best values recovered for the parameters controlling the number of SNe Ia and the strength of galactic outflows are substantially different and in fact mutually exclusive from one model to another. For the purpose of understanding how a galaxy evolves, we conclude that only reproducing the evolution of a limited number of elements is insufficient and can lead to misleading conclusions. More elements or additional constraints, such as the galaxy's star-formation efficiency and gas fraction, are needed in order to break the degeneracy between the different modeling assumptions. Our results show that the successes and failures of chemical evolution models are predominantly driven by the input stellar yields, rather than by the complexity of the galaxy model itself. Simple models such as OMEGA are therefore sufficient to test and validate stellar yields.
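The degeneracy between modeling assumptions that the authors report can be illustrated with a toy one-zone model and a minimal Metropolis-Hastings sampler. The model, its two parameters (an SN Ia normalization and an outflow strength) and the synthetic data below are invented for illustration and are not OMEGA or the Sculptor data; the two parameters are built to enter only through a ratio, so the sampler constrains the ratio well while the individual values stay unconstrained.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy one-zone "chemical evolution" track: an abundance-like quantity vs time,
# with two free parameters that only enter through the ratio n_ia / (1 + eta).
t = np.linspace(0.1, 10.0, 25)

def model(n_ia, eta):
    return np.log10(n_ia * (1 - np.exp(-t)) / (1.0 + eta))

true = (2.0, 1.5)  # assumed "truth" used to generate synthetic data
data = model(*true) + rng.normal(0, 0.05, t.size)

def log_like(theta):
    n_ia, eta = theta
    if n_ia <= 0 or eta < 0:
        return -np.inf
    resid = data - model(n_ia, eta)
    return -0.5 * np.sum((resid / 0.05) ** 2)

# Minimal Metropolis-Hastings random walk over the two parameters.
theta = np.array([1.0, 0.5])
lp = log_like(theta)
chain = []
for _ in range(4000):
    prop = theta + rng.normal(0, 0.05, 2)
    lp_prop = log_like(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis acceptance rule
        theta, lp = prop, lp_prop
    chain.append(theta)

best = np.mean(chain[2000:], axis=0)  # posterior mean after burn-in
```

The recovered ratio `best[0] / (1 + best[1])` matches the value used to generate the data even though `best[0]` and `best[1]` individually wander along the degeneracy ridge, which is the same failure mode the abstract describes for the SN Ia and outflow parameters.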

  2. Public-private partnerships to improve primary healthcare surgeries: clarifying assumptions about the role of private provider activities.

    Science.gov (United States)

    Mudyarabikwa, Oliver; Tobi, Patrick; Regmi, Krishna

    2017-07-01

Aim: To examine assumptions about public-private partnership (PPP) activities and their role in improving public procurement of primary healthcare surgeries. PPPs were developed to improve the quality of care and patient satisfaction. However, evidence of their effectiveness in delivering health benefits is limited. A qualitative study design was employed. A total of 25 interviews with public sector staff (n=23) and private sector managers (n=2) were conducted to understand their interpretations of assumptions in the activities of private investors and service contractors participating in Local Improvement Finance Trust (LIFT) partnerships. Realist evaluation principles were applied in the data analysis to interpret the findings. Six thematic areas of assumed health benefits were identified: (i) quality improvement; (ii) improved risk management; (iii) reduced procurement costs; (iv) increased efficiency; (v) community involvement; and (vi) sustainable investment. Primary Care Trusts that chose to procure their surgeries through LIFT were expected to support its implementation by providing an environment conducive for the private participants to achieve these benefits. Private participant activities were found to be based on a range of explicit and tacit assumptions perceived helpful in achieving government objectives for LIFT. The success of PPPs depended upon private participants' (i) capacity to assess how PPP assumptions added value to their activities, (ii) effectiveness in interpreting assumptions in their expected activities, and (iii) preparedness to align their business principles to government objectives for PPPs. They risked missing some of the expected benefits because of some factors constraining realization of the assumptions. The ways in which private participants preferred to carry out their activities also influenced the extent to which expected benefits were achieved. Giving more discretion to public than private participants over critical

  3. Assumption-versus data-based approaches to summarizing species' ranges.

    Science.gov (United States)

    Peterson, A Townsend; Navarro-Sigüenza, Adolfo G; Gordillo, Alejandro

    2018-06-01

    For conservation decision making, species' geographic distributions are mapped using various approaches. Some such efforts have downscaled versions of coarse-resolution extent-of-occurrence maps to fine resolutions for conservation planning. We examined the quality of the extent-of-occurrence maps as range summaries and the utility of refining those maps into fine-resolution distributional hypotheses. Extent-of-occurrence maps tend to be overly simple, omit many known and well-documented populations, and likely frequently include many areas not holding populations. Refinement steps involve typological assumptions about habitat preferences and elevational ranges of species, which can introduce substantial error in estimates of species' true areas of distribution. However, no model-evaluation steps are taken to assess the predictive ability of these models, so model inaccuracies are not noticed. Whereas range summaries derived by these methods may be useful in coarse-grained, global-extent studies, their continued use in on-the-ground conservation applications at fine spatial resolutions is not advisable in light of reliance on assumptions, lack of real spatial resolution, and lack of testing. In contrast, data-driven techniques that integrate primary data on biodiversity occurrence with remotely sensed data that summarize environmental dimensions (i.e., ecological niche modeling or species distribution modeling) offer data-driven solutions based on a minimum of assumptions that can be evaluated and validated quantitatively to offer a well-founded, widely accepted method for summarizing species' distributional patterns for conservation applications. © 2016 Society for Conservation Biology.

  4. Basic Electromagnetism and Materials

    CERN Document Server

    Moliton, André

    2007-01-01

    Basic Electromagnetism and Materials is the product of many years of teaching basic and applied electromagnetism. This textbook can be used to teach electromagnetism to a wide range of undergraduate science majors in physics, electrical engineering or materials science. However, by making lesser demands on mathematical knowledge than competing texts, and by emphasizing electromagnetic properties of materials and their applications, this textbook is uniquely suited to students of materials science. Many competing texts focus on the study of propagation waves either in the microwave or optical domain, whereas Basic Electromagnetism and Materials covers the entire electromagnetic domain and the physical response of materials to these waves. Professor André Moliton is Director of the Unité de Microélectronique, Optoélectronique et Polymères (Université de Limoges, France), which brings together three groups studying the optoelectronics of molecular and polymer layers, micro-optoelectronic systems for teleco...

  5. Design, synthesis and evaluation of an anthraquinone derivative conjugated to myelin basic protein immunodominant (MBP85-99) epitope: Towards selective immunosuppression.

    Science.gov (United States)

    Tapeinou, Anthi; Giannopoulou, Efstathia; Simal, Carmen; Hansen, Bjarke E; Kalofonos, Haralabos; Apostolopoulos, Vasso; Vlamis-Gardikas, Alexios; Tselios, Theodore

    2018-01-01

Anthraquinone-type compounds, especially di-substituted amino alkylamino anthraquinones, have been widely studied as immunosuppressants. The anthraquinone ring is part of mitoxantrone, which has been used for the treatment of multiple sclerosis (MS) and several types of tumors. A desired approach for the treatment of MS would be the immunosuppression and elimination of the specific T cells that are responsible for the induction of the disease. Herein, the development of a peptide compound bearing an anthraquinone derivative with the potential to specifically destroy the encephalitogenic T cells responsible for the onset of MS is described. The compound consists of the myelin basic protein (MBP) 85-99 immunodominant epitope (MBP85-99) coupled to an anthraquinone-type molecule (AQ) via a disulfide (S-S) bond and 6 aminohexanoic acid (Ahx) residues (AQ-S-S-(Ahx)6MBP85-99). AQ-S-S-(Ahx)6MBP85-99 could bind to the HLA II DRB1*-1501 antigen with reasonable affinity (IC50 of 56 nM). The compound was localized to the nucleus of Jurkat cells (an immortalized line of human T lymphocytes) 10 min after its addition to the medium and resulted in lowered Bcl-2 levels (apoptosis). Entrance of the compound was abolished when cells were pre-treated with cisplatin, an inhibitor of thioredoxin reductase. Accordingly, levels of free thiols were elevated in the culture supernatants of Jurkat cells exposed to N-succinimidyl 3-(2-pyridyldithio) propionate coupled to (Ahx)6MBP85-99 via a disulphide (SPDP-S-S-(Ahx)6MBP85-99) but returned to normal after exposure to cisplatin. These results raise the possibility of AQ-S-S-(Ahx)6MBP85-99 being used as an eliminator of encephalitogenic T cells via implication of the thioredoxin system in the generation of the toxic, thiol-containing moiety (AQ-SH). Future experiments would ideally determine whether SPDP-S-S-(Ahx)6MBP85-99 could incorporate into HLA II DRB1*-1501 tetramers and neutralize encephalitogenic T cell lines sensitized to

  6. Luminescent microporous metal–organic framework with functional Lewis basic sites on the pore surface: Quantifiable evaluation of luminescent sensing mechanisms towards Fe{sup 3+}

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Jun-Cheng [Key Laboratory of Synthetic and Natural Functional Molecule Chemistry of the Ministry of Education, Shaanxi Key Laboratory of Physico-Inorganic Chemistry, College of Chemistry & Materials Science, Northwest University, Xi' an 710069 (China); Technology Promotion Center of Nano Composite Material of Biomimetic Sensor and Detecting Technology, Preparation and Application, Anhui Provincial Laboratory West Anhui University, Anhui 237012 (China); Guo, Rui-Li; Zhang, Wen-Yan [Key Laboratory of Synthetic and Natural Functional Molecule Chemistry of the Ministry of Education, Shaanxi Key Laboratory of Physico-Inorganic Chemistry, College of Chemistry & Materials Science, Northwest University, Xi' an 710069 (China); Jiang, Chen [Technology Promotion Center of Nano Composite Material of Biomimetic Sensor and Detecting Technology, Preparation and Application, Anhui Provincial Laboratory West Anhui University, Anhui 237012 (China); Wang, Yao-Yu, E-mail: wyaoyu@nwu.edu.cn [Key Laboratory of Synthetic and Natural Functional Molecule Chemistry of the Ministry of Education, Shaanxi Key Laboratory of Physico-Inorganic Chemistry, College of Chemistry & Materials Science, Northwest University, Xi' an 710069 (China)

    2016-11-15

A systematic study has been conducted on a novel luminescent metal-organic framework, ([Zn(bpyp)(L-OH)]·DMF·2H{sub 2}O){sub n} (1), to explore its sensing mechanisms toward Fe{sup 3+}. Structural analyses show that compound 1 presents pyridine N atoms and -OH groups on the pore surface for specific sensing of metal ions via Lewis acid-base interactions. On this basis, the quenching mechanisms are studied; the process is controlled by multiple mechanisms, in which the dynamic and static contributions are calculated, achieving quantitative evaluation of the quenching process. This work not only achieves quantitative evaluation of the luminescence quenching but also provides insight into the quenching process, and the possible mechanisms explored in this work may inspire future research and design of target luminescent metal-organic frameworks (LMOFs) with specific functions. - Graphical abstract: A systematic study has been conducted on a novel luminescent metal-organic framework to explore its sensing mechanisms toward Fe{sup 3+}. The quenching mechanisms are studied; the process is controlled by multiple mechanisms, in which the dynamic and static contributions are calculated, achieving quantitative evaluation of the quenching process. - Highlights: • A novel porous luminescent MOF containing uncoordinated groups in interlayer channels was successfully synthesized. • Compound 1 exhibits significant luminescent sensitivity to Fe{sup 3+}, which makes it a good candidate luminescent sensor. • The corresponding dynamic and static quenching constants are calculated, achieving quantitative evaluation of the quenching process.
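Separating dynamic and static quenching constants of the kind reported above is commonly done with a combined Stern-Volmer analysis, sketched here on synthetic data; the constants and quencher concentrations below are assumed values for illustration, not the paper's measurements.

```python
import numpy as np

# Combined Stern-Volmer model for simultaneous dynamic and static quenching:
#   I0/I = (1 + K_D*[Q]) * (1 + K_S*[Q])
# Expanding: I0/I - 1 = (K_D + K_S)*[Q] + (K_D*K_S)*[Q]^2, a quadratic in [Q],
# so a polynomial fit yields the sum and product of the two constants.
K_D, K_S = 3.0e3, 1.2e4        # assumed quenching constants (1/M)
Q = np.linspace(0, 5e-4, 12)   # assumed Fe3+ concentrations (M)
ratio = (1 + K_D * Q) * (1 + K_S * Q)  # synthetic I0/I data (noise-free)

# Quadratic fit: coefficients are [K_D*K_S, K_D + K_S, ~0].
c2, c1, c0 = np.polyfit(Q, ratio - 1, 2)

# Recover the individual constants as roots of x**2 - c1*x + c2 = 0.
roots = np.roots([1.0, -c1, c2])
K_small, K_large = sorted(roots.real)
```

With noisy data the same fit separates the two contributions as long as the quadratic term is resolvable, which is the essence of the "dynamic plus static" quantification described in the abstract.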

  7. Speakers' assumptions about the lexical flexibility of idioms.

    Science.gov (United States)

    Gibbs, R W; Nayak, N P; Bolton, J L; Keppel, M E

    1989-01-01

    In three experiments, we examined why some idioms can be lexically altered and still retain their figurative meanings (e.g., John buttoned his lips about Mary can be changed into John fastened his lips about Mary and still mean "John didn't say anything about Mary"), whereas other idioms cannot be lexically altered without losing their figurative meanings (e.g., John kicked the bucket, meaning "John died," loses its idiomatic meaning when changed into John kicked the pail). Our hypothesis was that the lexical flexibility of idioms is determined by speakers' assumptions about the ways in which parts of idioms contribute to their figurative interpretations as a whole. The results of the three experiments indicated that idioms whose individual semantic components contribute to their overall figurative meanings (e.g., go out on a limb) were judged as less disrupted by changes in their lexical items (e.g., go out on a branch) than were nondecomposable idioms (e.g., kick the bucket) when their individual words were altered (e.g., punt the pail). These findings lend support to the idea that both the syntactic productivity and the lexical makeup of idioms are matters of degree, depending on the idioms' compositional properties. This conclusion suggests that idioms do not form a unique class of linguistic items, but share many of the properties of more literal language.

  8. Weak convergence of Jacobian determinants under asymmetric assumptions

    Directory of Open Access Journals (Sweden)

    Teresa Alberico

    2012-05-01

    Full Text Available Let $\\Om$ be a bounded open set in $\\R^2$ sufficiently smooth and $f_k=(u_k,v_k)$ and $f=(u,v)$ mappings belonging to the Sobolev space $W^{1,2}(\\Om,\\R^2)$. We prove that if the sequence of Jacobians $J_{f_k}$ converges to a measure $\\mu$ in the sense of measures and if one allows different assumptions on the two components of $f_k$ and $f$, e.g. $$u_k \\rightharpoonup u \\;\\;\\mbox{weakly in} \\;\\; W^{1,2}(\\Om) \\qquad \\, v_k \\rightharpoonup v \\;\\;\\mbox{weakly in} \\;\\; W^{1,q}(\\Om)$$ for some $q\\in(1,2)$, then \\begin{equation}\\label{0}d\\mu=J_f\\,dz.\\end{equation} Moreover, we show that this result is optimal in the sense that the conclusion fails for $q=1$. On the other hand, we prove that \\eqref{0} remains valid also if one considers the case $q=1$, but it is necessary to require that $u_k$ weakly converges to $u$ in a Zygmund-Sobolev space with a slightly higher degree of regularity than $W^{1,2}(\\Om)$, and precisely $$ u_k \\rightharpoonup u \\;\\;\\mbox{weakly in} \\;\\; W^{1,L^2 \\log^\\alpha L}(\\Om)$$ for some $\\alpha >1$.

  9. Stream of consciousness: Quantum and biochemical assumptions regarding psychopathology.

    Science.gov (United States)

    Tonello, Lucio; Cocchi, Massimo; Gabrielli, Fabio; Tuszynski, Jack A

    2017-04-01

    The accepted paradigms of mainstream neuropsychiatry appear to be incompletely adequate and in various cases offer equivocal analyses. However, a growing number of new approaches are being proposed that suggest the emergence of paradigm shifts in this area. In particular, quantum theories of mind, brain and consciousness seem to offer a profound change to the current approaches. Unfortunately these quantum paradigms harbor at least two serious problems. First, they are simply models, theories, and assumptions, with no convincing experiments supporting their claims. Second, they deviate from contemporary mainstream views of psychiatric illness and do so in revolutionary ways. We suggest a possible way to integrate experimental neuroscience with quantum models in order to address outstanding issues in psychopathology. A key role is played by the phenomenon called the "stream of consciousness", which can be linked to the so-called "Gamma Synchrony" (GS), which is clearly demonstrated by EEG data. In our novel proposal, a unipolar depressed patient could be seen as a subject with an altered stream of consciousness. In particular, some clues suggest that depression is linked to an "increased power" stream of consciousness. It is additionally suggested that such an approach to depression might be extended to psychopathology in general with potential benefits to diagnostics and therapeutics in neuropsychiatry. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Assumptions of the primordial spectrum and cosmological parameter estimation

    International Nuclear Information System (INIS)

    Shafieloo, Arman; Souradeep, Tarun

    2011-01-01

    The observables of the perturbed universe, cosmic microwave background (CMB) anisotropy and large-scale structure, depend on a set of cosmological parameters as well as on the assumed nature of primordial perturbations. In particular, the shape of the primordial power spectrum (PPS) is, at best, a well-motivated assumption. It is known that the assumed functional form of the PPS in cosmological parameter estimation can affect the best-fit parameters and their relative confidence limits. In this paper, we demonstrate that a specific assumed form actually drives the best-fit parameters into distinct basins of likelihood in the space of cosmological parameters where the likelihood resists improvement via modifications to the PPS. The regions where considerably better likelihoods are obtained by allowing a free-form PPS lie outside these basins. In the absence of a preferred model of inflation, this raises a concern that current cosmological parameter estimates are strongly prejudiced by the assumed form of the PPS. Our results strongly motivate approaches toward simultaneous estimation of the cosmological parameters and the shape of the primordial spectrum from upcoming cosmological data. It is equally important for theorists to keep an open mind towards early universe scenarios that produce features in the PPS. (paper)

  11. Fourth-order structural steganalysis and analysis of cover assumptions

    Science.gov (United States)

    Ker, Andrew D.

    2006-02-01

    We extend our previous work on structural steganalysis of LSB replacement in digital images, building detectors which analyse the effect of LSB operations on pixel groups as large as four. Some of the method previously applied to triplets of pixels carries over straightforwardly. However we discover new complexities in the specification of a cover image model, a key component of the detector. There are many reasonable symmetry assumptions which we can make about parity and structure in natural images, only some of which provide detection of steganography, and the challenge is to identify the symmetries a) completely, and b) concisely. We give a list of possible symmetries and then reduce them to a complete, non-redundant, and approximately independent set. Some experimental results suggest that all useful symmetries are thus described. A weighting is proposed and its approximate variance stabilisation verified empirically. Finally, we apply symmetries to create a novel quadruples detector for LSB replacement steganography. Experimental results show some improvement, in most cases, over other detectors. However the gain in performance is moderate compared with the increased complexity in the detection algorithm, and we suggest that, without new insight, further extension of structural steganalysis may provide diminishing returns.
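
    Structural steganalysis exploits the characteristic parity asymmetry of LSB replacement: embedding can only move even pixel values up by one and odd values down by one. The following is a minimal illustrative sketch of that asymmetry (a toy embedder on a random array, not Ker's quadruples detector):

```python
import numpy as np

def embed_lsb(cover: np.ndarray, bits: np.ndarray) -> np.ndarray:
    """Replace the LSB of the first len(bits) pixels with the message bits."""
    stego = cover.ravel().copy()
    n = len(bits)
    stego[:n] = (stego[:n] & 0xFE) | bits   # clear the LSB, then set the message bit
    return stego.reshape(cover.shape)

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)
bits = rng.integers(0, 2, size=32, dtype=np.uint8)
stego = embed_lsb(cover, bits)

diff = stego.astype(int) - cover.astype(int)
changed = diff.ravel()[:32] != 0
even_cover = cover.ravel()[:32] % 2 == 0
# Even pixels can only move up (+1), odd pixels only down (-1) -- the
# structural fingerprint that detectors of LSB replacement rely on:
assert np.array_equal(diff.ravel()[:32][changed] == 1, even_cover[changed])
print("parity asymmetry of LSB replacement verified")
```

    LSB matching (±1 at random) lacks this asymmetry, which is why structural detectors are specific to LSB replacement.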

  12. On Some Unwarranted Tacit Assumptions in Cognitive Neuroscience†

    Science.gov (United States)

    Mausfeld, Rainer

    2011-01-01

    The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper brings to mind several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question why such conceptions are nevertheless currently prevailing in many areas of psychology. The paper argues that corresponding conceptions are rooted in four different aspects of our common-sense conception of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input–output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to turn into a largely a-theoretical and data-driven endeavor while at the same time enhancing the prospects for receiving widespread public appreciation of their empirical findings. PMID:22435062

  13. On some unwarranted tacit assumptions in cognitive neuroscience.

    Science.gov (United States)

    Mausfeld, Rainer

    2012-01-01

    The cognitive neurosciences are based on the idea that the level of neurons or neural networks constitutes a privileged level of analysis for the explanation of mental phenomena. This paper brings to mind several arguments to the effect that this presumption is ill-conceived and unwarranted in light of what is currently understood about the physical principles underlying mental achievements. It then scrutinizes the question why such conceptions are nevertheless currently prevailing in many areas of psychology. The paper argues that corresponding conceptions are rooted in four different aspects of our common-sense conception of mental phenomena and their explanation, which are illegitimately transferred to scientific enquiry. These four aspects pertain to the notion of explanation, to conceptions about which mental phenomena are singled out for enquiry, to an inductivist epistemology, and, in the wake of behavioristic conceptions, to a bias favoring investigations of input-output relations at the expense of enquiries into internal principles. To the extent that the cognitive neurosciences methodologically adhere to these tacit assumptions, they are prone to turn into a largely a-theoretical and data-driven endeavor while at the same time enhancing the prospects for receiving widespread public appreciation of their empirical findings.

  14. Are waves of relational assumptions eroding traditional analysis?

    Science.gov (United States)

    Meredith-Owen, William

    2013-11-01

    The author designates as 'traditional' those elements of psychoanalytic presumption and practice that have, in the wake of Fordham's legacy, helped to inform analytical psychology and expand our capacity to integrate the shadow. It is argued that this element of the broad spectrum of Jungian practice is in danger of erosion by the underlying assumptions of the relational approach, which is fast becoming the new establishment. If the maps of the traditional landscape of symbolic reference (primal scene, Oedipus et al.) are disregarded, analysts are left with only their own self-appointed authority with which to orientate themselves. This self-centric epistemological basis of the relationalists leads to a revision of 'analytic attitude' that may be therapeutic but is not essentially analytic. This theme is linked to the perennial challenge of balancing differentiation and merger and traced back, through Chasseguet-Smirgel, to its roots in Genesis. An endeavour is made to illustrate this within the Journal convention of clinically based discussion through a commentary on Colman's (2013) avowedly relational treatment of the case material presented in his recent Journal paper 'Reflections on knowledge and experience' and through an assessment of Jessica Benjamin's (2004) relational critique of Ron Britton's (1989) transference embodied approach. © 2013, The Society of Analytical Psychology.

  15. Basic properties of semiconductors

    CERN Document Server

    Landsberg, PT

    2013-01-01

    Since Volume 1 was published in 1982, the centres of interest in the basic physics of semiconductors have shifted. Volume 1 was called Band Theory and Transport Properties in the first edition, but the subject has broadened to such an extent that Basic Properties is now a more suitable title. Seven chapters have been rewritten by the original authors. However, twelve chapters are essentially new, with the bulk of this work being devoted to important current topics which give this volume an almost encyclopaedic form. The first three chapters discuss various aspects of modern band theory and the

  16. Basic set theory

    CERN Document Server

    Levy, Azriel

    2002-01-01

    An advanced-level treatment of the basics of set theory, this text offers students a firm foundation, stopping just short of the areas employing model-theoretic methods. Geared toward upper-level undergraduate and graduate students, it consists of two parts: the first covers pure set theory, including the basic notions, order and well-foundedness, cardinal numbers, the ordinals, and the axiom of choice and some of its consequences; the second deals with applications and advanced topics such as point set topology, real spaces, Boolean algebras, and infinite combinatorics and large cardinals. An

  17. Comprehensive basic mathematics

    CERN Document Server

    Veena, GR

    2005-01-01

    Salient Features: As per the II PUC Basic Mathematics syllabus of Karnataka. Provides an introduction to various basic mathematical techniques and the situations where these could be usefully employed. The language is simple and the material is self-explanatory, with a large number of illustrations. Assists the reader in gaining proficiency to solve a diverse variety of problems. A special capsule containing a gist and list of formulae titled ''REMEMBER!''. An additional chapterwise arranged question bank and 3 model papers in a separate section---''EXAMINATION CORNER''.

  18. Ecology and basic laws

    International Nuclear Information System (INIS)

    Mayer-Tasch, P.C.

    1980-01-01

    The author sketches the critical relation between ecology and basic law - critical in more than one sense. He points out the incompatibility of constitutional states and atomic states which is due to constitutional order being jeopardised by nuclear policy. He traces back the continuously rising awareness of pollution and the modern youth movement to their common root i.e. the awakening, the youth movement of the turn of the century. Eventually, he considers an economical, political, and social decentralization as a feasible alternative which would considerably relieve our basic living conditions from the threatening forms of civilization prevailing. (HSCH) [de

  19. Back to the basics: identifying positive youth development as the theoretical framework for a youth drug prevention program in rural Saskatchewan, Canada amidst a program evaluation.

    Science.gov (United States)

    Dell, Colleen Anne; Duncan, Charles Randy; DesRoches, Andrea; Bendig, Melissa; Steeves, Megan; Turner, Holly; Quaife, Terra; McCann, Chuck; Enns, Brett

    2013-10-22

    Despite endorsement by the Saskatchewan government to apply empirically-based approaches to youth drug prevention services in the province, programs are sometimes delivered prior to the establishment of evidence-informed goals and objectives. This paper shares the 'preptory' outcomes of our team's program evaluation of the Prince Albert Parkland Health Region Mental Health and Addiction Services' Outreach Worker Service (OWS) in eight rural, community schools three years following its implementation. Before our independent evaluation team could assess whether expectations of the OWS were being met, we had to assist with establishing its overarching program goals and objectives and 'at-risk' student population, alongside its alliance with an empirically-informed theoretical framework. A mixed-methods approach was applied, beginning with in-depth focus groups with the OWS staff to identify the program's goals and objectives and targeted student population. These were supplemented with OWS and school administrator interviews and focus groups with school staff. Alignment with a theoretical focus was determined though a review of the OWS's work to date and explored in focus groups between our evaluation team and the OWS staff and validated with the school staff and OWS and school administration. With improved understanding of the OWS's goals and objectives, our evaluation team and the OWS staff aligned the program with the Positive Youth Development theoretical evidence-base, emphasizing the program's universality, systems focus, strength base, and promotion of assets. Together we also gained clarity about the OWS's definition of and engagement with its 'at-risk' student population. It is important to draw on expert knowledge to develop youth drug prevention programming, but attention must also be paid to aligning professional health care services with a theoretically informed evidence-base for evaluation purposes. If time does not permit for the establishment of

  20. Precompound Reactions: Basic Concepts

    International Nuclear Information System (INIS)

    Weidenmueller, H. A.

    2008-01-01

    Because of the non-zero nuclear equilibration time, the compound-nucleus scattering model fails when the incident energy exceeds 10 or 20 MeV, and precompound reactions become important. Basic ideas used in the quantum-statistical approaches to these reactions are described.

  1. Basic Tuberculosis Facts

    Centers for Disease Control (CDC) Podcasts

    2012-03-12

    In this podcast, Dr. Kenneth Castro, Director of the Division of Tuberculosis Elimination, discusses basic TB prevention, testing, and treatment information.  Created: 3/12/2012 by National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention (NCHHSTP).   Date Released: 3/12/2012.

  2. Basic Exchange Rate Theories

    NARCIS (Netherlands)

    J.G.M. van Marrewijk (Charles)

    2005-01-01

    This four-chapter overview of basic exchange rate theories discusses (i) the elasticity and absorption approach, (ii) the (long-run) implications of the monetary approach, (iii) the short-run effects of monetary and fiscal policy under various economic conditions, and (iv) the transition

  3. Basic SPSS tutorial

    NARCIS (Netherlands)

    Grotenhuis, H.F. te; Matthijssen, A.C.B.

    2015-01-01

    This supplementary book for the social, behavioral, and health sciences helps readers with no prior knowledge of IBM® SPSS® Statistics, statistics, or mathematics learn the basics of SPSS. Designed to reduce fear and build confidence, the book guides readers through point-and-click sequences using

  4. Basic Skills Assessment

    Science.gov (United States)

    Yin, Alexander C.; Volkwein, J. Fredericks

    2010-01-01

    After surveying 1,827 students in their final year at eighty randomly selected two-year and four-year public and private institutions, American Institutes for Research (2006) reported that approximately 30 percent of students in two-year institutions and nearly 20 percent of students in four-year institutions have only basic quantitative…

  5. Basic physics for all

    CERN Document Server

    Kumar, B N

    2012-01-01

    This is a simple, concise book for both physics and non-physics students, presenting basic facts in a straightforward form and conveying the fundamental principles and theories of physics. The book will be helpful as a supplement to class teaching and as an aid to those who have difficulty in mastering concepts and principles.

  6. Basic pharmaceutical technology

    OpenAIRE

    Angelovska, Bistra; Drakalska, Elena

    2017-01-01

    The lecture deals with the basics of pharmaceutical technology as an applied discipline of pharmaceutical science, whose main subject of study is the formulation and manufacture of drugs. In a broad sense, pharmaceutical technology is the science of formulation, preparation, stabilization and determination of the quality of medicines prepared in the pharmacy or in the pharmaceutical industry.

  7. Basic radiation oncology

    International Nuclear Information System (INIS)

    Beyzadeoglu, M. M.; Ebruli, C.

    2008-01-01

    Basic Radiation Oncology is an all-in-one book. It is an up-to-date, bedside-oriented book integrating radiation physics, radiobiology and clinical radiation oncology. It includes the essentials of all aspects of radiation oncology, with more than 300 practical illustrations in black and white and color. The layout and presentation are very practical and enriched with many pearl boxes. Key studies, particularly randomized ones, are also included at the end of each clinical chapter. Basic knowledge of all high-tech radiation teletherapy units, such as tomotherapy, cyberknife, and proton therapy, is also given. The first 2 sections review concepts that are crucial in radiation physics and radiobiology. The remaining 11 chapters describe treatment regimens for the main cancer sites and tumor types. Basic Radiation Oncology will greatly help meet the need for a practical and bedside-oriented oncology book for residents, fellows, and clinicians of Radiation, Medical and Surgical Oncology, as well as medical students, physicians and medical physicists interested in Clinical Oncology. The English edition of the book Temel Radyasyon Onkolojisi is being published by Springer Heidelberg this year with the updated 2009 AJCC staging as Basic Radiation Oncology

  8. Bottled Water Basics

    Science.gov (United States)

    Table of Contents: Bottled water basics (p. 2); Advice for people with severely compromised immune systems (sidebar, p. 2); Know what you're buying (p. 3); Taste considerations (p. 4); Bottled water terms (sidebar, p. 4); Begin by reading the ...

  9. Monte Carlo: Basics

    OpenAIRE

    Murthy, K. P. N.

    2001-01-01

    An introduction to the basics of Monte Carlo is given. The topics covered include sample space, events, probabilities, random variables, mean, variance, covariance, characteristic function, Chebyshev inequality, law of large numbers, central limit theorem (stable distribution, Levy distribution), random numbers (generation and testing), random sampling techniques (inversion, rejection, sampling from a Gaussian, Metropolis sampling), analogue Monte Carlo and Importance sampling (exponential b...
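
    Among the sampling techniques listed, rejection sampling is easy to illustrate concretely. A minimal sketch (the target density f(x) = 2x on [0, 1] is an arbitrary illustrative choice, not from the record):

```python
import random

# Rejection sampling: draw from a target density f using proposals from a
# simpler density g satisfying f(x) <= c*g(x). Here the target is
# f(x) = 2x on [0, 1], the proposal g is Uniform(0, 1), and c = 2, so the
# acceptance probability is f(x) / (c*g(x)) = x.

def sample_f(rng: random.Random) -> float:
    while True:
        x = rng.random()            # proposal from g = Uniform(0, 1)
        u = rng.random()            # accept with probability x
        if u < x:
            return x

rng = random.Random(42)
samples = [sample_f(rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(f"sample mean ≈ {mean:.3f} (exact mean of f is 2/3)")
```

    On average each accepted draw costs c = 2 proposals, which is why a tight envelope constant matters in practice.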

  10. Ethanol Basics (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2015-01-01

    Ethanol is a widely-used, domestically-produced renewable fuel made from corn and other plant materials. More than 96% of gasoline sold in the United States contains ethanol. Learn more about this alternative fuel in the Ethanol Basics Fact Sheet, produced by the U.S. Department of Energy's Clean Cities program.

  11. Basic Soils. Revision.

    Science.gov (United States)

    Montana State Univ., Bozeman. Dept. of Agricultural and Industrial Education.

    This curriculum guide is designed for use in teaching a course in basic soils that is intended for college freshmen. Addressed in the individual lessons of the unit are the following topics: the way in which soil is formed, the physical properties of soil, the chemical properties of soil, the biotic properties of soil, plant-soil-water…

  12. Uniform background assumption produces misleading lung EIT images.

    Science.gov (United States)

    Grychtol, Bartłomiej; Adler, Andy

    2013-06-01

    Electrical impedance tomography (EIT) estimates an image of conductivity change within a body from stimulation and measurement at body surface electrodes. There is significant interest in EIT for imaging the thorax, as a monitoring tool for lung ventilation. To be useful in this application, we require an understanding of if and when EIT images can produce inaccurate images. In this paper, we study the consequences of the homogeneous background assumption, frequently made in linear image reconstruction, which introduces a mismatch between the reference measurement and the linearization point. We show in simulation and experimental data that the resulting images may contain large and clinically significant errors. A 3D finite element model of thorax conductivity is used to simulate EIT measurements for different heart and lung conductivity, size and position, as well as different amounts of gravitational collapse and ventilation-associated conductivity change. Three common linear EIT reconstruction algorithms are studied. We find that the asymmetric position of the heart can cause EIT images of ventilation to show up to 60% undue bias towards the left lung and that the effect is particularly strong for a ventilation distribution typical of mechanically ventilated patients. The conductivity gradient associated with gravitational lung collapse causes conductivity changes in non-dependent lung to be overestimated by up to 100% with respect to the dependent lung. Eliminating the mismatch by using a realistic conductivity distribution in the forward model of the reconstruction algorithm strongly reduces these undesirable effects. We conclude that subject-specific anatomically accurate forward models should be used in lung EIT and extra care is required when analysing EIT images of subjects whose background conductivity distribution in the lungs is known to be heterogeneous or exhibiting large changes.

  13. Uniform background assumption produces misleading lung EIT images

    International Nuclear Information System (INIS)

    Grychtol, Bartłomiej; Adler, Andy

    2013-01-01

    Electrical impedance tomography (EIT) estimates an image of conductivity change within a body from stimulation and measurement at body surface electrodes. There is significant interest in EIT for imaging the thorax, as a monitoring tool for lung ventilation. To be useful in this application, we require an understanding of if and when EIT images can produce inaccurate images. In this paper, we study the consequences of the homogeneous background assumption, frequently made in linear image reconstruction, which introduces a mismatch between the reference measurement and the linearization point. We show in simulation and experimental data that the resulting images may contain large and clinically significant errors. A 3D finite element model of thorax conductivity is used to simulate EIT measurements for different heart and lung conductivity, size and position, as well as different amounts of gravitational collapse and ventilation-associated conductivity change. Three common linear EIT reconstruction algorithms are studied. We find that the asymmetric position of the heart can cause EIT images of ventilation to show up to 60% undue bias towards the left lung and that the effect is particularly strong for a ventilation distribution typical of mechanically ventilated patients. The conductivity gradient associated with gravitational lung collapse causes conductivity changes in non-dependent lung to be overestimated by up to 100% with respect to the dependent lung. Eliminating the mismatch by using a realistic conductivity distribution in the forward model of the reconstruction algorithm strongly reduces these undesirable effects. We conclude that subject-specific anatomically accurate forward models should be used in lung EIT and extra care is required when analysing EIT images of subjects whose background conductivity distribution in the lungs is known to be heterogeneous or exhibiting large changes. (paper)
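
    The linear image reconstruction discussed in these two records is, in its simplest form, a one-step regularized inverse of a linearized forward model, with the Jacobian computed at an assumed (often homogeneous) background. A toy numerical sketch of the mechanics (random Jacobian and hypothetical dimensions; not one of the three algorithms evaluated in the paper):

```python
import numpy as np

# One-step linearized difference reconstruction:
#   dv = J @ dx   =>   x_hat = (J^T J + lam*I)^-1 J^T dv
# J is the Jacobian of the boundary measurements with respect to conductivity,
# evaluated at the linearization point. The paper's concern is that choosing a
# homogeneous linearization point mismatched to the true background biases x_hat.

rng = np.random.default_rng(1)
n_meas, n_pix = 208, 64          # e.g. a 16-electrode system, coarse pixel grid
J = rng.standard_normal((n_meas, n_pix))

x_true = np.zeros(n_pix)
x_true[10:14] = 1.0              # a small conductivity change ("ventilation")
dv = J @ x_true                  # simulated difference measurement

lam = 1e-2                       # Tikhonov regularization parameter
x_hat = np.linalg.solve(J.T @ J + lam * np.eye(n_pix), J.T @ dv)

print("reconstruction error:", np.linalg.norm(x_hat - x_true))
```

    With a consistent J the error is tiny; the paper's point is that in real EIT the Jacobian of a homogeneous model differs from that of the true heterogeneous thorax, so the same one-step solve can misplace and mis-scale the conductivity change.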

  14. Energetics of basic karate kata.

    Science.gov (United States)

    Bussweiler, Jens; Hartmann, Ulrich

    2012-12-01

    Knowledge about energy requirements during exercise seems necessary to develop training concepts in the combat sport of karate. It is a commonly held view that the anaerobic lactic energy metabolism plays a key role, but this assumption could not be confirmed so far. The metabolic cost and fractional energy supply of a basic karate Kata (Heian Nidan, Shotokan style) with a duration of about 30 s were analyzed. Six male Karateka [mean ± SD (age 29 ± 8 years; height 177 ± 5 cm, body mass 75 ± 9 kg)] with different training experience (advanced athletes, experts, elite athletes) were examined while performing the sport-specific movements once and twice continuously. During Kata performance, oxygen uptake was measured with a portable spirometric device, blood lactate concentrations were examined before and after testing, and the fractional energy supply was calculated. The results showed that on average 52 % of the energy supply for one Heian Nidan came from anaerobic alactic metabolism, 25 % from anaerobic lactic and 23 % from aerobic metabolism. For two sequentially executed Heian Nidan, and thus nearly double the duration, the calculated percentages were 33, 25 and 42 %. The total energy demand for one Kata and two Kata was approximately 61 and 99 kJ, respectively. Despite measured blood lactate concentrations up to 8.1 mmol l(-1), which might suggest a dominance of lactic energy supply, a lactic fraction of only 17-31 % during these relatively short and intense sequences could be found. A heavy use of lactic energy metabolism therefore had to be rejected.
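
    The reported fractional supply for a single Heian Nidan (roughly 52/25/23 % of about 61 kJ) can be turned into absolute per-pathway energies with simple arithmetic. An illustrative reconstruction from the published fractions, not the authors' raw data:

```python
# Fractional energy supply for one Heian Nidan as reported: total ~61 kJ,
# split 52 % anaerobic alactic, 25 % anaerobic lactic, 23 % aerobic.

total_kj = 61.0
shares = {"anaerobic alactic": 0.52, "anaerobic lactic": 0.25, "aerobic": 0.23}

assert abs(sum(shares.values()) - 1.0) < 1e-9   # shares must sum to 100 %

for pathway, frac in shares.items():
    print(f"{pathway:>17s}: {frac * total_kj:5.1f} kJ ({frac:.0%})")
```

    The same arithmetic applied to the two-Kata case (33/25/42 % of ~99 kJ) shows the aerobic contribution roughly tripling in absolute terms as the duration doubles.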

  15. Back to basics: an evaluation of NaOH and alternative rapid DNA extraction protocols for DNA barcoding, genotyping, and disease diagnostics from fungal and oomycete samples.

    Science.gov (United States)

    Osmundson, Todd W; Eyre, Catherine A; Hayden, Katherine M; Dhillon, Jaskirn; Garbelotto, Matteo M

    2013-01-01

    The ubiquity, high diversity and often-cryptic manifestations of fungi and oomycetes frequently necessitate molecular tools for detecting and identifying them in the environment. In applications including DNA barcoding, pathogen detection from plant samples, and genotyping for population genetics and epidemiology, rapid and dependable DNA extraction methods scalable from one to hundreds of samples are desirable. We evaluated several rapid extraction methods (NaOH, Rapid one-step extraction (ROSE), Chelex 100, proteinase K) for their ability to obtain DNA of quantity and quality suitable for the following applications: PCR amplification of the multicopy barcoding locus ITS1/5.8S/ITS2 from various fungal cultures and sporocarps; single-copy microsatellite amplification from cultures of the phytopathogenic oomycete Phytophthora ramorum; probe-based P. ramorum detection from leaves. Several methods were effective for most of the applications, with NaOH extraction favored in terms of success rate, cost, speed and simplicity. Frozen dilutions of ROSE and NaOH extracts maintained PCR viability for over 32 months. DNA from rapid extractions performed poorly compared to CTAB/phenol-chloroform extracts for TaqMan diagnostics from tanoak leaves, suggesting that incomplete removal of PCR inhibitors is an issue for sensitive diagnostic procedures, especially from plants with recalcitrant leaf chemistry. NaOH extracts exhibited lower yield and size than CTAB/phenol-chloroform extracts; however, NaOH extraction facilitated obtaining clean sequence data from sporocarps contaminated by other fungi, perhaps due to dilution resulting from low DNA yield. We conclude that conventional extractions are often unnecessary for routine DNA sequencing or genotyping of fungi and oomycetes, and recommend simpler strategies where source materials and intended applications warrant such use. © 2012 Blackwell Publishing Ltd.

  16. Transportation Emissions: some basics

    DEFF Research Database (Denmark)

    Kontovas, Christos A.; Psaraftis, Harilaos N.

    2016-01-01

    Transportation is the backbone of international trade and a key engine driving globalization. However, there is growing concern that the Earth’s atmospheric composition is being altered by human activities, including transportation, which can lead to climate change. Air pollution from transportation and especially carbon dioxide emissions are at the center stage of discussion by the world community through various international treaties, such as the Kyoto Protocol. The transportation sector also emits non-CO2 pollutants that have important effects on air quality, climate, and public health. The main purpose of this chapter is to introduce some basic concepts that are relevant in the quest of green transportation logistics. First, we present the basics of estimating emissions from transportation activities, the current statistics and future trends, as well as the total impact of air emissions

  17. Basic Emotions: A Reconstruction

    Science.gov (United States)

    Mason, William A.; Capitanio, John P.

    2016-01-01

    Emotionality is a basic feature of behavior. The argument over whether the expression of emotions is based primarily on culture (constructivism, nurture) or biology (natural forms, nature) will never be resolved because both alternatives are untenable. The evidence is overwhelming that at all ages and all levels of organization, the development of emotionality is epigenetic: The organism is an active participant in its own development. To ascribe these effects to “experience” was the best that could be done for many years. With the rapid acceleration of information on how changes in organization are actually brought about, it is a good time to review, update, and revitalize our views of experience in relation to the concept of basic emotion. PMID:27110280

  18. Basic electronic circuits

    CERN Document Server

    Buckley, P M

    1980-01-01

    In the past, the teaching of electricity and electronics has more often than not been carried out from a theoretical and often highly academic standpoint. Fundamentals and basic concepts have often been presented with no indication of their practical applications, and all too frequently they have been illustrated by artificially contrived laboratory experiments bearing little relationship to the outside world. The course comes in the form of fourteen fairly open-ended constructional experiments or projects. Each experiment has associated with it a construction exercise and an explanation. The basic idea behind this dual presentation is that the student can embark on each circuit following only the briefest possible instructions and that an open-ended approach is thereby not prejudiced by an initial lengthy encounter with the theory behind the project; this being a sure way to dampen enthusiasm at the outset. As the investigation progresses, questions inevitably arise. Descriptions of the phenomena encounte...

  19. Basic linear algebra

    CERN Document Server

    Blyth, T S

    2002-01-01

    Basic Linear Algebra is a text for first year students leading from concrete examples to abstract theorems, via tutorial-type exercises. More exercises (of the kind a student may expect in examination papers) are grouped at the end of each section. The book covers the most important basics of any first course on linear algebra, explaining the algebra of matrices with applications to analytic geometry, systems of linear equations, difference equations and complex numbers. Linear equations are treated via Hermite normal forms which provides a successful and concrete explanation of the notion of linear independence. Another important highlight is the connection between linear mappings and matrices leading to the change of basis theorem which opens the door to the notion of similarity. This new and revised edition features additional exercises and coverage of Cramer's rule (omitted from the first edition). However, it is the new, extra chapter on computer assistance that will be of particular interest to readers:...
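    Cramer's rule, the topic added in this revised edition, can be sketched in a few lines of code. This is an illustrative implementation of the general rule rather than anything taken from the book; the function names are my own, and exact rational arithmetic is used so small examples come out clean.

```python
from fractions import Fraction

def det(m):
    """Determinant by cofactor expansion along the first row (fine for small matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def cramer_solve(a, b):
    """Solve a*x = b via Cramer's rule: x_i = det(A_i) / det(A),
    where A_i is A with column i replaced by b."""
    d = det(a)
    if d == 0:
        raise ValueError("singular system: Cramer's rule does not apply")
    n = len(a)
    return [Fraction(det([row[:i] + [b[k]] + row[i + 1:]
                          for k, row in enumerate(a)]), d)
            for i in range(n)]

# 2x + y = 5 and x - y = 1 give x = 2, y = 1.
print(cramer_solve([[2, 1], [1, -1]], [5, 1]))
```

    The cofactor expansion is O(n!) and only sensible for the small examination-sized systems a first course deals with; for anything larger, Gaussian elimination is the practical choice.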

  20. Basics of statistical physics

    CERN Document Server

    Müller-Kirsten, Harald J W

    2013-01-01

    Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
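    The different ways of counting arrangements in the three statistics mentioned above can be made concrete with a small sketch. This is a standard textbook illustration, not taken from this particular book: n particles distributed over g states give g^n arrangements for distinguishable (Maxwell-Boltzmann) particles, C(g+n-1, n) for indistinguishable bosons (Bose-Einstein), and C(g, n) for indistinguishable fermions limited to one per state (Fermi-Dirac).

```python
from math import comb

def maxwell_boltzmann(n, g):
    """Distinguishable particles, no occupancy limit: g**n arrangements."""
    return g ** n

def bose_einstein(n, g):
    """Indistinguishable bosons: stars-and-bars count C(g+n-1, n)."""
    return comb(g + n - 1, n)

def fermi_dirac(n, g):
    """Indistinguishable fermions, at most one per state: C(g, n)."""
    return comb(g, n)

# Two particles in three states: 9 (MB) vs 6 (BE) vs 3 (FD) arrangements.
for name, f in [("MB", maxwell_boltzmann), ("BE", bose_einstein), ("FD", fermi_dirac)]:
    print(name, f(2, 3))
```

    The shrinking counts show indistinguishability and the exclusion principle at work, which is exactly the difference between the three statistics the book develops.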

  1. Emulsion Science Basic Principles

    CERN Document Server

    Leal-Calderon, Fernando; Schmitt, Véronique

    2007-01-01

    Emulsions are generally made out of two immiscible fluids like oil and water, one being dispersed in the second in the presence of surface-active compounds. They are used as intermediate or end products in a huge range of areas including the food, chemical, cosmetic, pharmaceutical, paint, and coating industries. Besides the broad domain of technological interest, emulsions are raising a variety of fundamental questions at the frontier between physics and chemistry. This book aims to give an overview of the most recent advances in emulsion science. The basic principles, covering aspects of emulsions from their preparation to their destruction, are presented in close relation to both the fundamental physics and the applications of these materials. The book is intended to help scientists and engineers in formulating new materials by giving them the basics of emulsion science.

  2. Basics of Computer Networking

    CERN Document Server

    Robertazzi, Thomas

    2012-01-01

    Springer Brief Basics of Computer Networking provides a non-mathematical introduction to the world of networks. This book covers both technology for wired and wireless networks. Coverage includes transmission media, local area networks, wide area networks, and network security. Written in a very accessible style for the interested layman by the author of a widely used textbook with many years of experience explaining concepts to the beginner.

  3. Risk communication basics

    International Nuclear Information System (INIS)

    Corrado, P.G.

    1995-01-01

    In low-trust, high-concern situations, 50% of your credibility comes from perceived empathy and caring, demonstrated in the first 30 s you come in contact with someone. There is no second chance for a first impression. These and other principles contained in this paper provide you with a basic level of understanding of risk communication. The principles identified are time-tested caveats and will assist you in effectively communicating technical information

  4. Risk communication basics

    Energy Technology Data Exchange (ETDEWEB)

    Corrado, P.G. [Lawrence Livermore National Laboratory, CA (United States)

    1995-12-31

    In low-trust, high-concern situations, 50% of your credibility comes from perceived empathy and caring, demonstrated in the first 30 s you come in contact with someone. There is no second chance for a first impression. These and other principles contained in this paper provide you with a basic level of understanding of risk communication. The principles identified are time-tested caveats and will assist you in effectively communicating technical information.

  5. Basic nucleonics. 2. ed.

    International Nuclear Information System (INIS)

    Guzman, M.E.

    1989-01-01

    This book is oriented mainly towards professionals who are not physicists or experts in nuclear sciences, physicians planning to specialize in nuclear medicine or radiotherapy, and technicians involved in nuclear applications. The book covers the fundamental concepts of nuclear science and technology in a simple and ordered fashion. Theory is illustrated with appropriate exercises and answers. Its 17 chapters, plus 3 appendices on mathematics, cover basic concepts in nuclear science, radioactivity, radiation and matter, nuclear reactions, X rays, shielding and radioprotection.

  6. Basics of Neutron NDA

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Alexis Chanel [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-15

    The objectives of this presentation are to introduce the basic physics of neutron production, interactions, and detection; identify the processes that generate neutrons; explain the most common neutron-production mechanisms, spontaneous and induced fission and (a,n) reactions; describe the properties of neutrons from different sources; recognize the advantages of neutron measurement techniques; recognize common neutron interactions; explain neutron cross-section measurements; describe the fundamentals of 3He detector function and design; and differentiate between passive and active assay techniques.

  7. Shoulder arthroscopy: the basics.

    Science.gov (United States)

    Farmer, Kevin W; Wright, Thomas W

    2015-04-01

    Shoulder arthroscopy is a commonly performed and accepted procedure for a wide variety of pathologies. Surgeon experience, patient positioning, knowledge of surgical anatomy, proper portal placement, and proper use of instrumentation can improve technical success and minimize complication risks. This article details the surgical anatomy, indications, patient positioning, portal placement, instrumentation, and complications for basic shoulder arthroscopy. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  8. Basic accelerator optics

    CERN Document Server

    CERN. Geneva. Audiovisual Unit

    1985-01-01

    A complete derivation, from first principles, of the concepts and methods applied in linear accelerator and beamline optics will be presented. Particle motion and beam motion in systems composed of linear magnets, as well as weak and strong focusing and special insertions are treated in mathematically simple terms, and design examples for magnets and systems are given. This series of five lectures is intended to provide all the basic tools required for the design and operation of beam optical systems.
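    The linear optics these lectures derive from first principles is conventionally expressed with 2x2 transfer matrices acting on a ray vector (position, slope). The sketch below is my own standard illustration, not material from the lectures: composing a thin focusing lens with a drift of one focal length brings a parallel ray onto the axis.

```python
# 2x2 transfer matrices act on the ray vector (x, x') = (position, slope).
def drift(length):
    """Field-free drift section of the given length."""
    return [[1.0, length], [0.0, 1.0]]

def thin_lens(f):
    """Thin focusing lens (one plane of a quadrupole) with focal length f."""
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def matmul(a, b):
    """Compose two 2x2 transfer matrices (a applied after b)."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def track(m, ray):
    """Apply a transfer matrix to a ray (x, x')."""
    x, xp = ray
    return (m[0][0] * x + m[0][1] * xp, m[1][0] * x + m[1][1] * xp)

# A parallel ray through a lens of focal length f crosses the axis
# after drifting a further distance f: the classic focusing picture.
f = 2.0
system = matmul(drift(f), thin_lens(f))
x, xp = track(system, (0.01, 0.0))  # enter 1 cm off-axis, parallel
print(x)
```

    The same matrix formalism extends directly to FODO cells and full beamlines by multiplying the element matrices in order, which is the computational core of the design examples such a course works through.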

  9. Basic concepts in oceanography

    International Nuclear Information System (INIS)

    Small, L.F.

    1997-01-01

    Basic concepts in oceanography include major wind patterns that drive ocean currents, and the effects that the earth's rotation, positions of land masses, and temperature and salinity have on oceanic circulation and hence global distribution of radioactivity. Special attention is given to coastal and near-coastal processes such as upwelling, tidal effects, and small-scale processes, as radionuclide distributions are currently most associated with coastal regions. (author)

  10. Basic Financial Accounting

    DEFF Research Database (Denmark)

    Wiborg, Karsten

    This textbook on Basic Financial Accounting is targeted at students of economics at universities and business colleges taking an introductory subject in the external dimension of the company's economic reporting, including bookkeeping, etc. The book includes the following subjects: business entities, the transformation process, types of businesses, stakeholders, legislation, the annual report, the VAT system, double-entry bookkeeping, inventories, and year-end cash flow analysis.
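    Double-entry bookkeeping, one of the subjects listed, rests on a single invariant: every transaction posts equal debits and credits, so the ledger always sums to zero. The following minimal sketch is a hypothetical illustration of that principle; the class and account names are my own, not from the textbook.

```python
from collections import defaultdict

class Ledger:
    """Minimal double-entry ledger: each posting debits one account and
    credits another by the same amount, so total balances always sum to zero."""

    def __init__(self):
        self.balances = defaultdict(float)  # account -> debit-positive balance

    def post(self, debit_account, credit_account, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        self.balances[debit_account] += amount
        self.balances[credit_account] -= amount

    def trial_balance(self):
        """Sum of all balances; zero whenever the books are consistent."""
        return sum(self.balances.values())

ledger = Ledger()
ledger.post("Inventory", "Cash", 500.0)  # buy stock for cash
ledger.post("Cash", "Sales", 800.0)      # cash sale
print(ledger.balances["Cash"], ledger.trial_balance())
```

    Real charts of accounts distinguish asset, liability, income and expense account types with conventional debit/credit signs; the zero-sum trial balance shown here is the check that underlies all of them.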

  11. School Principals' Assumptions about Human Nature: Implications for Leadership in Turkey

    Science.gov (United States)

    Sabanci, Ali

    2008-01-01

    This article considers principals' assumptions about human nature in Turkey and the relationship between the assumptions held and the leadership style adopted in schools. The findings show that school principals hold Y-type assumptions and prefer a relationship-oriented style in their relations with assistant principals. However, both principals…

  12. Challenging Assumptions of International Public Relations: When Government Is the Most Important Public.

    Science.gov (United States)

    Taylor, Maureen; Kent, Michael L.

    1999-01-01

    Explores assumptions underlying Malaysia's and the United States' public-relations practice. Finds many assumptions guiding Western theories and practices are not applicable to other countries. Examines the assumption that the practice of public relations targets a variety of key organizational publics. Advances international public-relations…

  13. Educational Technology as a Subversive Activity: Questioning Assumptions Related to Teaching and Leading with Technology

    Science.gov (United States)

    Kruger-Ross, Matthew J.; Holcomb, Lori B.

    2012-01-01

    The use of educational technologies is grounded in the assumptions of teachers, learners, and administrators. Assumptions are choices that structure our understandings and help us make meaning. Current advances in Web 2.0 and social media technologies challenge our assumptions about teaching and learning. The intersection of technology and…

  14. Research and development of basic technologies for the next generation industries, 'three-dimensional circuit elements'. Evaluation on the research and development; Jisedai sangyo kiban gijutsu kenkyu kaihatsu 'sanjigen kairo soshi'. Kenkyu kaihatsu hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1991-04-01

    Research, development and evaluation were performed with the objective of establishing the basic technology for three-dimensional circuit elements that integrate functions at ultra-high density. For the basic lamination technology, the SOI technology suitable for three-dimensional circuit elements was developed, and it became possible to manufacture high-quality multi-layered crystalline structures by annealing with laser and electron beams. In addition, a lateral solid-phase epitaxial technology was developed, establishing a basis for application to three-dimensional circuit elements. Furthermore, the technology for bonding thin-film circuits together should prove useful for high-density integration in the future. The three-dimensional circuit makes parallel processing in each segment possible, and it was shown that processing can be performed at much higher speed than before. A prototype three-dimensional circuit equipped with functions for parallel processing and judgment processing was actually fabricated. Image pre-processing, which had been impossible in real time with conventional two-dimensional integrated circuits, was achieved at millisecond-order speeds. These achievements lead to the conclusion that the targets of the present research and development have been achieved. (NEDO)

  15. Report on evaluation of research and development of methods for producing basic chemicals from carbon monoxide and other stocks; Issanka tanso nado wo genryo to suru kiso kagakuhin no seizoho no kenkyu kaihatsu ni kansuru hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1987-08-01

    This project was aimed at developing methods for producing basic chemicals from carbon monoxide and other stocks (the so-called C1 chemistry), in order to establish techniques that could promote a stable supply of basic chemicals from new carbon resources in place of oil. It was a 7-year national project beginning in FY 1980, jointly implemented by the government, academic and industrial circles. Described herein is the overall evaluation of the results. There are several carbon resources other than oil, e.g., coal, natural gas, oil shale and tar sand. They occur abundantly, although unevenly, and various countries are developing these resources. They can be advantageously utilized as stocks for chemicals after being converted into synthesis gases. In Japan, they have been efficiently developed cooperatively by national institutes, enterprises, academic circles and the chemical industry, producing world-leading results, e.g., gas separation/purification techniques and new catalysts for new synthesis methods. This project was terminated because the relaxed oil supply/demand situation and lowered crude prices in the middle of the 1980s made stock conversion less urgent. (NEDO)

  16. Report on final evaluation of industrial science and technology research and development system. Comprehensive basic technologies for development of ocean resources. Manganese nodule exploitation system; Kaiyo shigen sogo kiban gijutsu (mangan dankai saiko system). Saishu hyoka hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-07-01

    Described herein are the final evaluation results of the basic research and development of the system for exploiting manganese nodules as an ocean resource. A 9-year project was started in FY 1981 to establish techniques to efficiently and economically exploit, on a commercial basis, the Mn nodules occurring on deep sea bottoms (4,000 to 6,000 m deep), in order to stably supply non-ferrous metallic resources, e.g., Ni, Cu, Co and Mn, which are essential for the economic activities of Japan. Originally, the UN convention on the law of the sea made the development of unique exploitation techniques a prerequisite for obtaining the right to develop Mn nodules. However, the situation surrounding the development of Mn nodules has since changed, diminishing the objectives, significance and urgency of this project. The fourth amendment of the basic plans decided to suspend the comprehensive ocean tests in 1996, and to implement only the ocean/land tests in which some of the individual elementary techniques were combined. Therefore, the technological validation of the overall system could not be carried out sufficiently, and the degree of achievement of the project is low, in view of the insufficient prospects for commercial production. However, this project produced good results in individual elementary techniques, which are significant for resources policy. (NEDO)

  18. Research and development of basic technologies for the next generation industries, 'recombinant DNA utilizing technology'. Evaluation on the research and development; Jisedai sangyo kiban gijutsu kenkyu kaihatsu 'kumikae DNA riyo gijutsu'. Kenkyu kaihatsu hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1991-03-01

    Research, development and evaluation were performed with the objective of establishing the basic technology for utilizing recombinant DNA to create new microorganisms for processes in the chemical industry. The major achievements of the present research and development include establishment of an expression system for P450 genes derived from microsomes and mitochondria, and the world's first simultaneous expression of P450 and its reductase. Furthermore, a fused enzyme genetically combining P450 and the reductase was successfully manufactured ahead of other countries, opening the way to industrializing recombinant enzymes for use in bio-processes in the chemical industry. In creating a high-efficiency secretory recombinant Bacillus subtilis strain, a Bacillus subtilis host whose protease activity was markedly decreased was created. As an achievement of the research on the 'basic recombinant DNA technology', a high-efficiency expression vector for a moderate thermophile was created, and its usefulness was demonstrated. In addition, a host-vector system for an extreme thermophile was developed for the first time in the world. These achievements have opened the way to industrial utilization of thermophilic bacteria. (NEDO)

  19. Rethinking our assumptions about the evolution of bird song and other sexually dimorphic signals

    Directory of Open Access Journals (Sweden)

    J. Jordan Price

    2015-04-01

    Full Text Available Bird song is often cited as a classic example of a sexually-selected ornament, in part because historically it has been considered a primarily male trait. Recent evidence that females also sing in many songbird species and that sexual dimorphism in song is often the result of losses in females rather than gains in males therefore appears to challenge our understanding of the evolution of bird song through sexual selection. Here I propose that these new findings do not necessarily contradict previous research, but rather they disagree with some of our assumptions about the evolution of sexual dimorphisms in general and female song in particular. These include misconceptions that current patterns of elaboration and diversity in each sex reflect past rates of change and that levels of sexual dimorphism necessarily reflect levels of sexual selection. Using New World blackbirds (Icteridae) as an example, I critically evaluate these past assumptions in light of new phylogenetic evidence. Understanding the mechanisms underlying such sexually dimorphic traits requires a clear understanding of their evolutionary histories. Only then can we begin to ask the right questions.

  20. Personal and Communal Assumptions to Determine Pragmatic Meanings of Phatic Functions

    Directory of Open Access Journals (Sweden)

    Kunjana Rahardi

    2016-11-01

    Full Text Available This research was meant to describe the manifestations of phatic function in the education domain. The phatic function in the communication and interaction happening in the education domain could be accurately identified only when the utterances were not separated from their determining pragmatic context. The context must not be limited to contextual and social or societal perspectives, but must be defined as basic assumptions. The data of this research included various kinds of speech gathered naturally in education circles that contain phatic functions. Two methods of data gathering were employed in this study, namely listening and conversation methods. Recorded data were analyzed through the following steps: (1) data were identified based on the discourse markers found; (2) data were classified based on the phatic perception criteria; (3) data were interpreted based on the referenced theories; (4) data were described in the form of an analysis result description. The research proves that the phatic function in the form of small talk in the education domain cannot be separated from the context surrounding it.

  1. “Booster” training: Evaluation of instructor-led bedside cardiopulmonary resuscitation skill training and automated corrective feedback to improve cardiopulmonary resuscitation compliance of Pediatric Basic Life Support providers during simulated cardiac arrest

    Science.gov (United States)

    Sutton, Robert M.; Niles, Dana; Meaney, Peter A.; Aplenc, Richard; French, Benjamin; Abella, Benjamin S.; Lengetti, Evelyn L.; Berg, Robert A.; Helfaer, Mark A.; Nadkarni, Vinay

    2013-01-01

    Objective To investigate the effectiveness of brief bedside “booster” cardiopulmonary resuscitation (CPR) training to improve CPR guideline compliance of hospital-based pediatric providers. Design Prospective, randomized trial. Setting General pediatric wards at Children’s Hospital of Philadelphia. Subjects Sixty-nine Basic Life Support–certified hospital-based providers. Intervention CPR recording/feedback defibrillators were used to evaluate CPR quality during simulated pediatric arrest. After a 60-sec pretraining CPR evaluation, subjects were randomly assigned to one of three instructional/feedback methods to be used during CPR booster training sessions. All sessions (training/CPR manikin practice) were of equal duration (2 mins) and differed only in the method of corrective feedback given to participants during the session. The study arms were as follows: 1) instructor-only training; 2) automated defibrillator feedback only; and 3) instructor training combined with automated feedback. Measurements and Main Results Before instruction, 57% of the care providers performed compressions within guideline rate recommendations (rate >90 min−1 and 38 mm); and 36% met overall CPR compliance (rate and depth within targets). After instruction, guideline compliance improved (instructor-only training: rate 52% to 87% [p .01], and overall CPR compliance, 43% to 78% [p CPR compliance, 35% to 96% [p training combined with automated feedback: rate 48% to 100% [p CPR compliance, 30% to 100% [p CPR instruction, most certified Pediatric Basic Life Support providers did not perform guideline-compliant CPR. After a brief bedside training, CPR quality improved irrespective of training content (instructor vs. automated feedback). Future studies should investigate bedside training to improve CPR quality during actual pediatric cardiac arrests. PMID:20625336

  2. "Booster" training: evaluation of instructor-led bedside cardiopulmonary resuscitation skill training and automated corrective feedback to improve cardiopulmonary resuscitation compliance of Pediatric Basic Life Support providers during simulated cardiac arrest.

    Science.gov (United States)

    Sutton, Robert M; Niles, Dana; Meaney, Peter A; Aplenc, Richard; French, Benjamin; Abella, Benjamin S; Lengetti, Evelyn L; Berg, Robert A; Helfaer, Mark A; Nadkarni, Vinay

    2011-05-01

    To investigate the effectiveness of brief bedside "booster" cardiopulmonary resuscitation (CPR) training to improve CPR guideline compliance of hospital-based pediatric providers. Prospective, randomized trial. General pediatric wards at Children's Hospital of Philadelphia. Sixty-nine Basic Life Support-certified hospital-based providers. CPR recording/feedback defibrillators were used to evaluate CPR quality during simulated pediatric arrest. After a 60-sec pretraining CPR evaluation, subjects were randomly assigned to one of three instructional/feedback methods to be used during CPR booster training sessions. All sessions (training/CPR manikin practice) were of equal duration (2 mins) and differed only in the method of corrective feedback given to participants during the session. The study arms were as follows: 1) instructor-only training; 2) automated defibrillator feedback only; and 3) instructor training combined with automated feedback. Before instruction, 57% of the care providers performed compressions within guideline rate recommendations (rate >90 min(-1) and 38 mm); and 36% met overall CPR compliance (rate and depth within targets). After instruction, guideline compliance improved (instructor-only training: rate 52% to 87% [p .01], and overall CPR compliance, 43% to 78% [p CPR compliance, 35% to 96% [p training combined with automated feedback: rate 48% to 100% [p CPR compliance, 30% to 100% [p CPR instruction, most certified Pediatric Basic Life Support providers did not perform guideline-compliant CPR. After a brief bedside training, CPR quality improved irrespective of training content (instructor vs. automated feedback). Future studies should investigate bedside training to improve CPR quality during actual pediatric cardiac arrests.

  3. Regression assumptions in clinical psychology research practice-a systematic review of common misconceptions.

    Science.gov (United States)

    Ernst, Anja F; Albers, Casper J

    2017-01-01

    Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper calls for heightened awareness of and increased transparency in the reporting of statistical assumption checking.
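    The misconception the review highlights, confusing normality of the variables with normality of the errors, can be demonstrated with a small simulation. This is an illustrative sketch of my own, not material from the paper: the predictor is deliberately non-normal (uniform), yet the fitted model is perfectly valid because the error term, and hence the residuals, is normal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Predictor deliberately NON-normal (uniform); the error term IS normal.
x = rng.uniform(0.0, 10.0, size=500)
y = 2.0 + 3.0 * x + rng.normal(0.0, 1.0, size=500)

# Ordinary least squares fit of y = b0 + b1 * x.
b1, b0 = np.polyfit(x, y, 1)
residuals = y - (b0 + b1 * x)

# The normality assumption concerns these residuals, not x or y:
# a Q-Q plot or a Shapiro-Wilk test would be applied to `residuals`,
# never to the raw variables.
print(round(b1, 1))
```

    Testing the raw x or y for normality here would "fail" despite the regression being entirely appropriate, which is exactly the 4% error pattern the review documents.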

  4. Research and development of the industrial basic techniques of the next generation. Composite materials (Final research and development evaluation / Part 1); Jisedai sangyo kiban gijutsu kenkyu kaihatsu. Fukugo zairyo (Saishu kenkyu kaihatsu hyoka 1)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-05-01

    This R and D project is aimed at development of highly functional materials for aerospace devices, focusing on research and development of resin-based composite materials (FRPs), metal-based composite materials (FRMs), and the evaluation of their properties and design techniques. The basic target properties are a heat resistance temperature of 250 degrees C or higher and a tensile strength of 240 kg/mm{sup 2} or more for the FRPs, and a heat resistance temperature of 450 degrees C or higher and a tensile strength of 150 kgf/mm{sup 2} or more for the FRMs. The program has been implemented over 8 years, covering development of raw materials, molding/processing techniques, quality evaluation and design, through information exchange and discussions among the experts in each area under integrated, close cooperation from raw materials to molding/processing. Most of the data indicate that the target properties and objectives are satisfied or exceeded. It is therefore concluded that this project for composite materials, extending over 8 years in 3 phases, has sufficiently achieved its initial objectives. Unique techniques are incorporated in the raw materials, molding/processing processes, quality evaluation and designs. These efforts have produced FRPs and FRMs of the world's highest quality. (NEDO)

  5. Evaluating Basic Grammar Projects, Using the SAMR Model (La evaluación de proyectos de Gramática Básica según el modelo SAMR)

    Directory of Open Access Journals (Sweden)

    Alejandra Giangiulio Lobo

    2017-11-01

    Full Text Available Abstract The research evaluates the projects assigned in two basic grammar courses of the English teaching majors, at Universidad Nacional in Costa Rica, using the SAMR framework for evaluating learning activities that implemented Information and Communication Technologies. First, the relevance of the use of these projects is presented. Second, the SAMR framework is explained. Third, the six different projects are discussed and evaluated according to the SAMR framework, taking into consideration the students’ perceptions. Recommendations are given regarding the use of technology to learn grammatical structures. Resumen (translated from Spanish): Projects carried out in two basic grammar courses for the English-teaching majors at the Universidad Nacional de Costa Rica are analyzed using the SAMR model for evaluating learning activities that make use of information and communication technologies. First, the relevance of this type of project is addressed; second, the model is described and explained; and third, the projects carried out are analyzed on the basis of the model, taking into account students' perceptions. Recommendations are given on the use of technology for learning grammatical structures.

  6. Catalyst in Basic Oleochemicals

    Directory of Open Access Journals (Sweden)

    Eva Suyenty

    2007-10-01

    Full Text Available Currently Indonesia is the world's largest palm oil producer, with production volume reaching 16 million tonnes per annum. The high crude oil and ethylene prices in the last 3 – 4 years contribute to the healthy demand growth for basic oleochemicals: fatty acids and fatty alcohols. Oleochemicals are starting to replace crude oil derived products in various applications. As widely practiced in the petrochemical industry, catalysts play a very important role in the production of basic oleochemicals. Catalytic reactions abound in the production of oleochemicals: nickel based catalysts are used in the hydrogenation of unsaturated fatty acids; sodium methylate catalyst in the transesterification of triglycerides; sulfonic based polystyrene resin catalyst in the esterification of fatty acids; and copper chromite/copper zinc catalyst in the high pressure hydrogenation of methyl esters or fatty acids to produce fatty alcohols. To maintain long catalyst life, it is crucial to ensure the absence of catalyst poisons and inhibitors in the feed. The preparation methods of nickel and copper chromite catalysts are as follows: precipitation, filtration, drying, and calcination. Sodium methylate is derived from the direct reaction of sodium metal and methanol under inert gas. The sulfonic based polystyrene resin is derived from sulfonation of polystyrene crosslinked with di-vinyl-benzene. © 2007 BCREC UNDIP. All rights reserved. [Presented at Symposium and Congress of MKICS 2007, 18-19 April 2007, Semarang, Indonesia] [How to Cite: E. Suyenty, H. Sentosa, M. Agustine, S. Anwar, A. Lie, E. Sutanto. (2007). Catalyst in Basic Oleochemicals. Bulletin of Chemical Reaction Engineering and Catalysis, 2 (2-3): 22-31. doi:10.9767/bcrec.2.2-3.6.22-31] [How to Link/DOI: http://dx.doi.org/10.9767/bcrec.2.2-3.6.22-31 || or local: http://ejournal.undip.ac.id/index.php/bcrec/article/view/6

  7. Quality quantification model of basic raw materials

    Directory of Open Access Journals (Sweden)

    Š. Vilamová

    2016-07-01

    Full Text Available Basic raw materials belong to the key input sources in the production of pig iron. The properties of basic raw materials can be evaluated using a variety of criteria. The essential ones include the physical and chemical properties. Current competitive pressures, however, force the producers of iron more and more often to include cost and logistic criteria into the decision-making process. In this area, however, they are facing a problem of how to convert a variety of vastly different parameters into one evaluation indicator in order to compare the available raw materials. This article deals with the analysis of a model created to evaluate the basic raw materials, which was designed as part of the research.
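The abstract does not reproduce the model itself. As an illustrative assumption only, one common way to fold vastly different criteria (chemical composition, cost, logistics) into a single evaluation indicator is min-max normalization followed by a weighted sum; the function, weights, and iron-ore numbers below are hypothetical, not the article's model:

```python
def quality_score(values, weights, bounds, lower_is_better):
    """Aggregate heterogeneous raw-material criteria into one 0..1 indicator.

    values, weights, bounds, lower_is_better are parallel lists; bounds holds
    (min, max) tuples used for min-max normalization of each criterion.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    score = 0.0
    for v, w, (lo, hi), flip in zip(values, weights, bounds, lower_is_better):
        norm = (v - lo) / (hi - lo)   # scale the criterion to 0..1
        if flip:                       # e.g. cost: a lower raw value is better
            norm = 1.0 - norm
        score += w * norm
    return score

# Hypothetical ore: Fe content (%), price (USD/t), transport time (days).
ore_a = quality_score([64.0, 95.0, 12.0],
                      [0.5, 0.3, 0.2],
                      [(55.0, 68.0), (80.0, 120.0), (5.0, 30.0)],
                      [False, True, True])
```

With all criteria mapped onto the same 0..1 scale, two candidate raw materials can be compared by a single number regardless of their original units.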

  8. C# Database Basics

    CERN Document Server

    Schmalz, Michael

    2012-01-01

Working with data and databases in C# certainly can be daunting if you're coming from VB6, VBA, or Access. With this hands-on guide, you'll shorten the learning curve considerably as you master accessing, adding, updating, and deleting data with C# - basic skills you need if you intend to program with this language. No previous knowledge of C# is necessary. By following the examples in this book, you'll learn how to tackle several database tasks in C#, such as working with SQL Server, building data entry forms, and using data in a web service. The book's code samples will help you get started

  9. Electrical installation calculations basic

    CERN Document Server

    Kitcher, Christopher

    2013-01-01

All the essential calculations required for basic electrical installation work. The Electrical Installation Calculations series has proved an invaluable reference for over forty years, for both apprentices and professional electrical installation engineers alike. The book provides a step-by-step guide to the successful application of electrical installation calculations required in day-to-day electrical engineering practice. A step-by-step guide to everyday calculations used on the job An essential aid to the City & Guilds certificates at Levels 2 and 3Fo

  10. Basic structural dynamics

    CERN Document Server

    Anderson, James C

    2012-01-01

A concise introduction to structural dynamics and earthquake engineering. Basic Structural Dynamics serves as a fundamental introduction to the topic of structural dynamics. Covering single and multiple-degree-of-freedom systems while providing an introduction to earthquake engineering, the book keeps the coverage succinct and on topic at a level that is appropriate for undergraduate and graduate students. Through dozens of worked examples based on actual structures, it also introduces readers to MATLAB, a powerful software for solving both simple and complex structural d

  11. Basic heat transfer

    CERN Document Server

    Bacon, D H

    2013-01-01

Basic Heat Transfer aims to help readers use a computer to solve heat transfer problems and to promote greater understanding by changing data values and observing the effects, which are necessary in design and optimization calculations. The book is concerned with applications including insulation and heating in buildings and pipes, temperature distributions in solids for steady state and transient conditions, the determination of surface heat transfer coefficients for convection in various situations, radiation heat transfer in grey body problems, the use of finned surfaces, and simple heat exc
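In the spirit of the book's approach of changing data values and observing the effects, a minimal sketch of one of the simplest such calculations is steady-state one-dimensional conduction through a plane wall, Q = kA(T1 - T2)/L. The material and geometry values below are illustrative assumptions, not taken from the book:

```python
def conduction_rate(k, area, t_hot, t_cold, thickness):
    """Steady-state 1-D conduction through a plane wall: Q = k*A*(T1-T2)/L, in watts."""
    return k * area * (t_hot - t_cold) / thickness

# Vary the insulation thickness and observe the effect on heat loss.
k_insulation = 0.04   # W/(m*K), a typical mineral-wool value (illustrative)
for thickness_m in (0.05, 0.10, 0.20):
    q = conduction_rate(k_insulation, area=10.0, t_hot=20.0,
                        t_cold=0.0, thickness=thickness_m)
    print(f"L = {thickness_m:.2f} m -> Q = {q:.0f} W")
```

Doubling the insulation thickness halves the heat loss, exactly the kind of data-value experiment the book encourages.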

  12. Back to basics audio

    CERN Document Server

    Nathan, Julian

    1998-01-01

Back to Basics Audio is a thorough, yet approachable handbook on audio electronics theory and equipment. The first part of the book discusses electrical and audio principles. Those principles form a basis for understanding the operation of equipment and systems, covered in the second section. Finally, the author addresses planning and installation of a home audio system. Julian Nathan joined the audio service and manufacturing industry in 1954 and moved into motion picture engineering and production in 1960. He installed and operated recording theaters in Sydney, Austra

  13. Machine shop basics

    CERN Document Server

    Miller, Rex

    2004-01-01

Use the right tool the right way. Here, fully updated to include new machines and electronic/digital controls, is the ultimate guide to basic machine shop equipment and how to use it. Whether you're a professional machinist, an apprentice, a trade student, or a handy homeowner, this fully illustrated volume helps you define tools and use them properly and safely. It's packed with review questions for students, and loaded with answers you need on the job. Mark Richard Miller is a Professor and Chairman of the Industrial Technology Department at Texas A&M University in Kingsville, T

  14. Basic bladder neurophysiology.

    Science.gov (United States)

    Clemens, J Quentin

    2010-11-01

    Maintenance of normal lower urinary tract function is a complex process that requires coordination between the central nervous system and the autonomic and somatic components of the peripheral nervous system. This article provides an overview of the basic principles that are recognized to regulate normal urine storage and micturition, including bladder biomechanics, relevant neuroanatomy, neural control of lower urinary tract function, and the pharmacologic processes that translate the neural signals into functional results. Finally, the emerging role of the urothelium as a sensory structure is discussed. Copyright © 2010 Elsevier Inc. All rights reserved.

  15. Agronomic Use of Basic Slag

    Directory of Open Access Journals (Sweden)

    Fabio Oliveiri de Nobile

    2015-01-01

Full Text Available In recent years, modern civilization has increased demand for products derived from iron and steel, stimulating growth in the national steelmaking sector and, consequently, the generation of an industrial residue called basic slag. In this context, recycling of residues can help solve waste problems for industries that prioritize production quality. On the other hand, Brazil's primary production sector, agriculture, cultivates a large area on acid soils of low fertility, factors that are admittedly determinant for plant production under tropical conditions. There are thus two distinct primary production sectors with clear potential for interaction: on one side, an available product with properties similar to traditional liming materials and fertilizers; on the other, a production sector that is highly dependent on such products. The interaction between these two sectors also helps preserve the environment, bringing a measure of sustainability to the production systems of postmodern civilization, which will be the challenge of this new century. Considering the current possibility of recycling these industrial residues in agriculture, three important factors have to be taken into account: first, the proper use of an abundant, available and promising industrial residue; second, a propitious agricultural environment of acid, low-fertility soil; and third, a responsive and socio-economically important crop, sugar cane, given its vast cultivated area. In the national literature, few works have dealt with the use of basic slag or evaluated crop response to its application. The present work therefore aimed to gather information from the literature concerning the characterization and production of basic slag in Brazil, as well

  16. Comparison of risk-dominant scenario assumptions for several TRU waste facilities in the DOE complex

    International Nuclear Information System (INIS)

    Foppe, T.L.; Marx, D.R.

    1999-01-01

    In order to gain a risk management perspective, the DOE Rocky Flats Field Office (RFFO) initiated a survey of other DOE sites regarding risks from potential accidents associated with transuranic (TRU) storage and/or processing facilities. Recently-approved authorization basis documents at the Rocky Flats Environmental Technology Site (RFETS) have been based on the DOE Standard 3011 risk assessment methodology with three qualitative estimates of frequency of occurrence and quantitative estimates of radiological consequences to the collocated worker and the public binned into three severity levels. Risk Class 1 and 2 events after application of controls to prevent or mitigate the accident are designated as risk-dominant scenarios. Accident Evaluation Guidelines for selection of Technical Safety Requirements (TSRs) are based on the frequency and consequence bin assignments to identify controls that can be credited to reduce risk to Risk Class 3 or 4, or that are credited for Risk Class 1 and 2 scenarios that cannot be further reduced. This methodology resulted in several risk-dominant scenarios for either the collocated worker or the public that warranted consideration on whether additional controls should be implemented. RFFO requested the survey because of these high estimates of risks that are primarily due to design characteristics of RFETS TRU waste facilities (i.e., Butler-type buildings without a ventilation and filtration system, and a relatively short distance to the Site boundary). Accident analysis methodologies and key assumptions are being compared for the DOE sites responding to the survey. This includes type of accidents that are risk dominant (e.g., drum explosion, material handling breach, fires, natural phenomena, external events, etc.), source term evaluation (e.g., radionuclide material-at-risk, chemical and physical form, damage ratio, airborne release fraction, respirable fraction, leakpath factors), dispersion analysis (e.g., meteorological

  17. Basic research projects

    International Nuclear Information System (INIS)

    1979-04-01

The research programs under the cognizance of the Office of Energy Research (OER) are directed toward discovery of natural laws and new knowledge, and toward improved understanding of the physical and biological sciences as related to the development, use, and control of energy. The ultimate goal is to develop a scientific underlay for the overall DOE effort and the fundamental principles of natural phenomena, so that these phenomena may be understood and new principles formulated. The DOE-OER outlay activities include three major programs: High Energy Physics, Nuclear Physics, and Basic Energy Sciences. Taken together, these programs represent some 30 percent of the Nation's Federal support of basic research in the energy sciences. The research activities of OER involve more than 6,000 scientists and engineers working in some 17 major Federal Research Centers and at more than 135 different universities and industrial firms throughout the United States. Contract holders in the areas of high-energy physics, nuclear physics, materials sciences, nuclear science, chemical sciences, engineering, mathematics, geosciences, advanced energy projects, and biological energy research are listed. Funding trends for recent years are outlined

  18. Basic scattering theory

    International Nuclear Information System (INIS)

    Queen, N.M.

    1978-01-01

This series of lectures on basic scattering theory was given as part of a course for postgraduate high energy physicists and was designed to acquaint the student with some of the basic language and formalism used for the phenomenological description of nuclear reactions and decay processes in the study of elementary particle interactions. Well established and model independent aspects of scattering theory, which are the basis of S-matrix theory, are considered. The subject is considered under the following headings: the S-matrix, cross sections and decay rates, phase space, relativistic kinematics, the Mandelstam variables, the flux factor, two-body phase space, Dalitz plots, other kinematic plots, two-particle reactions, unitarity, the partial-wave expansion, resonances (single-channel case), multi-channel resonances, analyticity and crossing, dispersion relations, the one-particle exchange model, the density matrix, mathematical properties of the density matrix, the density matrix in scattering processes, the density matrix in decay processes, and the helicity formalism. Some exercises for the students are included. (U.K.)
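As a pointer to the kinematic language listed above: for a two-body reaction $1 + 2 \to 3 + 4$ with four-momenta $p_i$ and masses $m_i$, the Mandelstam variables are defined by

```latex
s = (p_1 + p_2)^2, \qquad t = (p_1 - p_3)^2, \qquad u = (p_1 - p_4)^2,
\qquad\text{with}\qquad s + t + u = m_1^2 + m_2^2 + m_3^2 + m_4^2 .
```

Only two of the three variables are independent, which is why two-particle reactions are conventionally described in the $(s, t)$ plane.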

  19. Basic and clinical immunology

    Science.gov (United States)

    Chinen, Javier; Shearer, William T.

    2003-01-01

    Progress in immunology continues to grow exponentially every year. New applications of this knowledge are being developed for a broad range of clinical conditions. Conversely, the study of primary and secondary immunodeficiencies is helping to elucidate the intricate mechanisms of the immune system. We have selected a few of the most significant contributions to the fields of basic and clinical immunology published between October 2001 and October 2002. Our choice of topics in basic immunology included the description of T-bet as a determinant factor for T(H)1 differentiation, the role of the activation-induced cytosine deaminase gene in B-cell development, the characterization of CD4(+)CD25(+) regulatory T cells, and the use of dynamic imaging to study MHC class II transport and T-cell and dendritic cell membrane interactions. Articles related to clinical immunology that were selected for review include the description of immunodeficiency caused by caspase 8 deficiency; a case series report on X-linked agammaglobulinemia; the mechanism of action, efficacy, and complications of intravenous immunoglobulin; mechanisms of autoimmunity diseases; and advances in HIV pathogenesis and vaccine development. We also reviewed two articles that explore the possible alterations of the immune system caused by spaceflights, a new field with increasing importance as human space expeditions become a reality in the 21st century.

  20. VQABQ: Visual Question Answering by Basic Questions

    KAUST Repository

    Huang, Jia-Hong

    2017-03-19

Taking an image and a question as input, our method outputs a text-based answer to the query question about the given image; this task is called Visual Question Answering (VQA). There are two main modules in our algorithm. Given a natural language question about an image, the first module takes the question as input and outputs the basic questions of the main given question. The second module takes the main question, the image, and these basic questions as input and outputs the text-based answer to the main question. We formulate the basic question generation problem as a LASSO optimization problem, and also propose a criterion for how to exploit these basic questions to help answer the main question. Our method is evaluated on the challenging VQA dataset and yields state-of-the-art accuracy, 60.34% in the open-ended task.
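The LASSO formulation above can be sketched as follows, under stated assumptions (the record does not specify the paper's question encoding or solver): each question is represented by a fixed-length embedding vector, the main-question vector q is approximated as a sparse combination of candidate basic-question vectors by minimizing ||Aw - q||^2 + lambda*||w||_1, and the candidates with the largest weights are selected. A minimal ISTA (iterative soft-thresholding) solver in NumPy:

```python
import numpy as np

def soft_threshold(x, t):
    # Elementwise soft-thresholding: the proximal operator of the L1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_ista(A, q, lam=0.1, steps=500):
    """Minimize 0.5*||A w - q||^2 + lam*||w||_1 via ISTA."""
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
    w = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ w - q)
        w = soft_threshold(w - grad / L, lam / L)
    return w

# Toy example: 8-dimensional embeddings, 5 candidate basic questions.
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 5))             # columns = basic-question embeddings
q = 0.9 * A[:, 1] + 0.4 * A[:, 3]       # main question built from two of them
w = lasso_ista(A, q, lam=0.05)
ranked = np.argsort(-np.abs(w))         # candidates ranked by |weight|
```

The sparsity penalty drives most weights to zero, so `ranked` surfaces the few basic questions most relevant to the main question.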

  2. [Spirometry - basic examination of the lung function].

    Science.gov (United States)

    Kociánová, Jana

Spirometry is one of the basic internal examination methods, similar to e.g. blood pressure measurement or ECG recording. It is used to detect or assess the extent of ventilatory disorders. Indications include respiratory symptoms or laboratory anomalies, smoking, inhalation risks and more. Its performance and evaluation should be among the basic skills of pulmonologists, internists, allergologists, pediatricians and sports physicians. The results essentially influence correct diagnosis and choice of treatment. Therefore spirometry must be performed under standardized conditions and accurately and clearly assessed to enable answering clinical questions. Key words: acceptability - calibration - contraindication - evaluation - indication - parameters - spirometry - standardization.

  3. Fundamentals of neurogastroenterology: basic science.

    Science.gov (United States)

    Grundy, David; Al-Chaer, Elie D; Aziz, Qasim; Collins, Stephen M; Ke, Meiyun; Taché, Yvette; Wood, Jackie D

    2006-04-01

    The focus of neurogastroenterology in Rome II was the enteric nervous system (ENS). To avoid duplication with Rome II, only advances in ENS neurobiology after Rome II are reviewed together with stronger emphasis on interactions of the brain, spinal cord, and the gut in terms of relevance for abdominal pain and disordered gastrointestinal function. A committee with expertise in selective aspects of neurogastroenterology was invited to evaluate the literature and provide a consensus overview of the Fundamentals of Neurogastroenterology textbook as they relate to functional gastrointestinal disorders (FGIDs). This review is an abbreviated version of a fuller account that appears in the forthcoming book, Rome III. This report reviews current basic science understanding of visceral sensation and its modulation by inflammation and stress and advances in the neurophysiology of the ENS. Many of the concepts are derived from animal studies in which the physiologic mechanisms underlying visceral sensitivity and neural control of motility, secretion, and blood flow are examined. Impact of inflammation and stress in experimental models relative to FGIDs is reviewed as is human brain imaging, which provides a means for translating basic science to understanding FGID symptoms. Investigative evidence and emerging concepts implicate dysfunction in the nervous system as a significant factor underlying patient symptoms in FGIDs. Continued focus on neurogastroenterologic factors that underlie the development of symptoms will lead to mechanistic understanding that is expected to directly benefit the large contingent of patients and care-givers who deal with FGIDs.

  4. Basics and application of PSpice

    International Nuclear Information System (INIS)

    Choi, Pyeong; Cho, Yong Beom; Mok, Hyeong Su; Baek, Dong CHeol

    2006-03-01

This book comprises nineteen chapters introducing the basics and applications of PSpice. The contents of this book are: what PSpice is, PSpice introduction, PSpice simulation, DC analysis, parametric analysis, transient analysis, parametric analysis and measurements, Monte Carlo analysis, changing of device characteristics, ABM application, the elementary laws of circuits, R.L.C. basic circuits, diode basic DC circuits, transistor and FET basic circuits, OP-Amp basic circuits, digital basic circuits, analog and digital circuit practice, digital circuit application and practice, and ABM circuit application and practice.

  5. ESPlannerBASIC CANADA

    Directory of Open Access Journals (Sweden)

    Laurence Kotlikoff

    2015-02-01

Full Text Available Traditional financial planning is based on a fundamental rule of thumb: aim to save enough for retirement to replace 80 per cent of your pre-retirement income with income from pensions and assets. Millions of Canadians follow this formula. Yet there is no guarantee this approach is consistent with a savings plan that will allow them to experience their optimal standard of living — given their income — throughout their working lives. Consumption smoothing happens when a consumer projects her income and her non-discretionary expenses (such as mortgage payments) all the way up until the end of her life, and is able to determine her household's discretionary spending power over time, to achieve the smoothest living standard path possible without going into debt. When consumption smoothing is calculated accurately, a person's lifestyle should be roughly the same whether she is in her 30s with small children, in her 50s with kids in college, or in retirement with adult children. Consumption smoothing allows that to happen. But while it is conceptually straightforward, consumption smoothing requires the use of advanced numerical techniques. Now Canadian families have access to a powerful consumption-smoothing tool: ESPlannerBASIC Canada. This free, secure and confidential online tool will allow Canadian families to safely and securely enter their earnings and other financial resources, and will calculate for them how much they can spend and how much they should save in order to maintain their lifestyle from now until they die, without going into debt. It will also calculate how much life insurance they should buy, to ensure that household living standards are not affected after a family member dies. Users can easily and instantly run "what-if" scenarios to see how retiring early (or later), changing jobs, adjusting retirement contributions, having children, moving homes, timing RRSP withdrawals, and other financial and lifestyle decisions would
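The no-debt smoothing idea can be sketched numerically. This is not ESPlanner's actual algorithm (which handles taxes, interest, insurance and much more); it is a minimal zero-interest illustration in which each year's discretionary resources are pooled over the longest affordable horizon, so spending is as level as possible while cumulative spending never exceeds cumulative resources:

```python
def smooth_spending(resources):
    """Smoothest nondecreasing spending path that never requires borrowing.

    resources[t] = income minus non-discretionary expenses in year t.
    Greedy 'taut string' pass: each segment spends the minimum prefix average
    of the remaining years, which keeps cumulative spending <= cumulative resources.
    """
    plan, t, n = [], 0, len(resources)
    while t < n:
        best_avg, best_s, total = None, t, 0.0
        for s in range(t, n):
            total += resources[s]
            avg = total / (s - t + 1)
            if best_avg is None or avg < best_avg:
                best_avg, best_s = avg, s
        plan.extend([best_avg] * (best_s - t + 1))  # spend the segment average
        t = best_s + 1
    return plan

# Lean middle year, richer later years: spending is smoothed within what is affordable.
plan = smooth_spending([50.0, 50.0, 20.0, 80.0, 100.0])
```

Here the first three years are pooled into a constant 40 per year (spending 50 early would force borrowing in year three), after which spending rises as resources allow.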

  6. Basic Evaluation of the Newly Developed "Lias Auto P-FDP" Assay and the Influence of Plasmin-α2 Plasmin Inhibitor Complex Values on Discrepancy in the Comparison with "Lias Auto D-Dimer Neo" Assay.

    Science.gov (United States)

    Kumano, Osamu; Ieko, Masahiro; Komiyama, Yutaka; Naito, Sumiyoshi; Yoshida, Mika; Takahashi, Nobuhiko; Ohmura, Kazumasa; Hayasaki, Junki; Hayakawa, Mineji

    2018-04-01

Laboratory determination of fibrin/fibrinogen degradation product (FDP) levels, along with that of the D-dimer, is important for assessing the fibrinolytic situation. Recently, we developed a new FDP reagent, "Lias Auto P-FDP", which can detect various FDP fragments. The purpose of this study was to evaluate the basic performance of the newly developed Lias Auto P-FDP and compare it with the Lias Auto D-Dimer Neo assay. The within-run precision of Lias Auto P-FDP and Lias Auto D-Dimer was determined 20 times in low and high value controls. The between-day precision was evaluated five times a day for five days. The linearity study was performed by diluting high value samples 2 - 10-fold and 2 - 8-fold. The comparative study was performed using 172 patient samples with elevated FDP values. For the discrepancy analysis, the samples were divided into three groups by the discrepancy percentage between the FDP and D-dimer values. The groups were defined as follows: lower discrepancy group, less than -20%; no discrepancy group, -20% to 20%; upper discrepancy group, more than 20%. The coefficient of variation (CV%) for within-run and between-day precision was within 3.8% for both FDP and the D-dimer. The correlation coefficients were more than 0.999 and the linearity was high. In the comparative study, the values of FDP were higher than those of the D-dimer in all samples. The median FDP and D-dimer values of the lower discrepancy, no discrepancy, and upper discrepancy groups were 11.8, 20.3, and 51.4, and 8.0, 11.3, and 13.1, respectively. FDP showed an increasing tendency but the D-dimer showed constant values. Thus, the possible cause of the discrepancy between FDP and D-dimer values was the elevated FDP values. In addition, the values of the plasmin-α2 plasmin inhibitor complex (PIC) in the upper discrepancy group were higher than those of the lower and no discrepancy groups, indicating progression of fibrinolysis. In this study, we evaluated the newly developed Lias Auto P
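The precision and discrepancy statistics described above are simple to compute. One assumption in the sketch below: the abstract does not state the exact discrepancy formula, so (FDP − D-dimer) / D-dimer × 100 is used here as a plausible definition, with the stated ±20% cut-offs:

```python
from statistics import mean, stdev

def cv_percent(values):
    """Coefficient of variation in percent, as used for within-run/between-day precision."""
    return stdev(values) / mean(values) * 100.0

def discrepancy_group(fdp, d_dimer):
    """Classify a sample by discrepancy percentage (assumed: relative to the D-dimer value)."""
    pct = (fdp - d_dimer) / d_dimer * 100.0
    if pct < -20.0:
        return "lower"
    if pct > 20.0:
        return "upper"
    return "none"

# Illustrative numbers only (not the study's raw data).
cv = cv_percent([9.8, 10.0, 10.2])     # a toy control series
group = discrepancy_group(51.4, 13.1)  # strongly elevated FDP relative to D-dimer
```

With the study's median values for the upper discrepancy group (FDP 51.4, D-dimer 13.1) this rule indeed assigns "upper", matching the grouping described in the abstract.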

  7. Research and development of basic technologies for the next generation industries, 'environment resistance strengthened elements'. Evaluation on the second term research and development; Jisedai sangyo kiban gijutsu kenkyu kaihatsu 'Taikankyo kyoka soshi'. Dainiki kenkyu kaihatsu hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1986-03-31

In the research and development of environment resistance strengthened elements, with emphasis placed respectively on radiation resistance, heat resistance, and degree of integration according to specific requirements of the usage environments, the second term developed an integration technology and its evaluation technology based on the achievements of the first term. In developing the heat resistant element technology, the technology to grow β-SiC crystals was extended, by using higher temperatures, to obtaining thin film crystals with high carrier mobility. At the same time, technologies for fabricating multiple transistors on one substrate, such as doping and etching technologies, were developed. Using these technologies, Schottky diodes and p-n junction elements, the basic structures of MESFETs and bipolar transistors, were fabricated. In the evaluation and testing technology, the γ dose measuring method using TLD was improved, the traceability of the γ ray irradiation dose was assured, a simplified irradiation testing method using X-rays was established, and a heat resistance testing technology for electronic parts was established. Furthermore, attempts were made to enhance the radiation resistance of elements such as MOS silicon integrated circuits, bipolar silicon integrated circuits, and compound semiconductor integrated circuits. (NEDO)

  8. Fiscal 1999 research result report. Basic research on the evaluation method of deep water by fine algae; 1999 nendo bisai sorui wo mochiita shinsosui hyokaho ni kansuru kisoteki kenkyu hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

Basic research was made on establishing a bioassay for testing the effect of deep water on surface biota. Mixing of surface water with deep water containing high concentrations of nutrient salts has an immediate effect on fine algae (phytoplankton). In this research, based on the conventional AGP (algae growth potential) method of water quality evaluation using fine algae, the multiplication potential of 13 strains of algae in Kochi's and Toyama's deep water was evaluated using the increase rate of the number of cells. The results showed that (1) deep water has the potential to increase cell concentrations of all the fine algae to several times or over ten times those in surface water, (2) most of both the nitrogen and the phosphorus in deep water is consumed during this process, (3) cell concentrations of both harmful and usable species increase, and (4) although no difference in mean potential is found between Kochi's and Toyama's deep water, the patterns of strains whose multiplication is promoted differ between them. (NEDO)

  9. Research and development of basic technologies for the next generation industries, 'light-reactive materials'. Evaluation on the first term research and development; Jisedai sangyo kiban gijutsu kenkyu kaihatsu 'hikari hanno zairyo'. Daiikki kenkyu kaihatsu hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-03-01

Research, development and evaluation were performed with the objective of establishing the basic technology for light-reactive materials, which control the structure and state of aggregation of molecules through the action of light and can be used for ultra-high density recording, high resolution displays and optical switches. In elucidating the mechanism of the photodegradation reaction of photochromic molecules, it was found that the photodegradation of 6-nitrospirobenzopyran originates from the excited triplet state, which suggests a possibility of preventing it. New derivatives that show photochromism were synthesized, and thin films were produced using the LB process, indicating a possibility of producing photochromic materials as high-multiplex recording materials. With regard to PHB materials, an evaluation technology with spectral resolution of the world's highest level was established and measurements were performed. Hole formation was verified for the first time in the world at temperatures higher than the liquid nitrogen temperature using a PHB material of the ionic porphin/polyvinyl alcohol system, indicating the feasibility of practically usable PHB materials. (NEDO)

  10. Research and development of basic technologies for the next generation industries, 'environment resistance strengthened elements'. Evaluation on the first term research and development; Jisedai sangyo kiban gijutsu kenkyu kaihatsu 'taikankyo kyoka soshi'. Daiikki kenkyu kaihatsu hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1984-03-30

Research, development and evaluation were performed with the objective of developing environment resistance strengthened elements, with emphasis placed respectively on radiation resistance, heat resistance, and degree of integration according to specific requirements of the usage environments. The objective for the first term was to develop the basic technology on element structures required to raise environment resistance, and the methods of testing them. With regard to heat resistant elements, β-SiC single crystal thin films were formed, and a prospect was obtained for using them as elements. For the MOS integrated circuit, bipolar integrated circuit, and GaAs element, the points presenting the largest issues for radiation resistance were identified for each element; accordingly, the temperature for gate oxide film formation was lowered, element structures were improved, and gate lengths were decreased to enhance radiation resistance. For the evaluation test technology, a provisional testing method was prepared for radiation resistance, a prototype in-situ irradiation testing device was fabricated, and so was a prototype high-temperature testing device usable up to 500 degrees C. These achievements lead to the belief that the targets for the first term have been achieved. (NEDO)

  11. Cloud computing basics

    CERN Document Server

    Srinivasan, S

    2014-01-01

Cloud Computing Basics covers the main aspects of this fast moving technology so that both practitioners and students will be able to understand cloud computing. The author highlights the key aspects of this technology that a potential user might want to investigate before deciding to adopt this service. This book explains how cloud services can be used to augment existing services such as storage, backup and recovery. It addresses the details of how cloud security works and what users must be prepared for when they move their data to the cloud. The book also discusses how businesses can prepare for compliance with the relevant laws as well as industry standards such as the Payment Card Industry.

  12. Basic semiconductor physics

    CERN Document Server

    Hamaguchi, Chihiro

    2017-01-01

    This book presents a detailed description of basic semiconductor physics. The text covers a wide range of important phenomena in semiconductors, from the simple to the advanced. Four different methods of energy band calculations in the full band region are explained: local empirical pseudopotential, non-local pseudopotential, KP perturbation and tight-binding methods. The effective mass approximation and electron motion in a periodic potential, Boltzmann transport equation and deformation potentials used for analysis of transport properties are discussed. Further, the book examines experiments and theoretical analyses of cyclotron resonance in detail. Optical and transport properties, magneto-transport, two-dimensional electron gas transport (HEMT and MOSFET) and quantum transport are reviewed, while optical transition, electron-phonon interaction and electron mobility are also addressed. Energy and electronic structure of a quantum dot (artificial atom) are explained with the help of Slater determinants. The...
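The effective mass approximation mentioned above replaces the full band structure near a band edge by a free-electron-like dispersion, with the effective mass $m^*$ absorbing the influence of the periodic potential:

```latex
E(\mathbf{k}) \simeq E_c + \frac{\hbar^2 k^2}{2 m^*},
\qquad \frac{1}{m^*} = \frac{1}{\hbar^2}\,\frac{\partial^2 E}{\partial k^2}\bigg|_{k = k_0}
```

A sharply curved band thus corresponds to a light carrier, and a flat band to a heavy one, which is why the curvature of the calculated energy bands feeds directly into the transport properties the book analyzes.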

  13. Basic category theory

    CERN Document Server

    Leinster, Tom

    2014-01-01

    At the heart of this short introduction to category theory is the idea of a universal property, important throughout mathematics. After an introductory chapter giving the basic definitions, separate chapters explain three ways of expressing universal properties: via adjoint functors, representable functors, and limits. A final chapter ties all three together. The book is suitable for use in courses or for independent study. Assuming relatively little mathematical background, it is ideal for beginning graduate students or advanced undergraduates learning category theory for the first time. For each new categorical concept, a generous supply of examples is provided, taken from different parts of mathematics. At points where the leap in abstraction is particularly great (such as the Yoneda lemma), the reader will find careful and extensive explanations. Copious exercises are included.

  14. Energy the basics

    CERN Document Server

    Schobert, Harold

    2013-01-01

    People rarely stop to think about where the energy they use to power their everyday lives comes from, and when they do, it is often to ask a worried question: is mankind's energy usage killing the planet? How do we deal with nuclear waste? What happens when the oil runs out? Energy: The Basics answers these questions, but it also does much more. In this engaging yet even-handed introduction, readers are introduced to: the concept of 'energy' and what it really means; the ways energy is currently generated and the sources used; new and emerging energy technologies such as solar power and biofuels; and the impacts of energy use on the environment, including climate change. Featuring explanatory diagrams, tables, a glossary and an extensive further reading list, this book is the ideal starting point for anyone interested in the impact and future of the world's energy supply.

  15. Basic ionizing physic radiation

    International Nuclear Information System (INIS)

    Abdul Nassir Ibrahim; Azali Muhammad; Ab. Razak Hamzah; Abd. Aziz Mohamed; Mohamad Pauzi Ismail

    2008-01-01

    To become an expert in this field, a radiographer must first master radiation physics, which is why the second chapter discusses it. The topics that must be covered include atoms and molecules, atomic structure, protons, isotopes, half-life, types of radiation, and basic formulas such as those for shielding, half-life, half-value layer, and tenth-value layer. All of this must be mastered by radiographers who want to understand the technique in detail, because the technique is a combination of theory and practice: those who fail the theory cannot progress further with it, and it cannot be mastered through theory alone. Theory and practice must go hand in hand.
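    The basic formulas the abstract names can be sketched numerically. A minimal Python sketch, assuming ideal exponential decay and narrow-beam attenuation (the function names are illustrative, not taken from the source):

```python
import math

def remaining_activity(a0, t, half_life):
    """Source activity after time t, given the half-life T: A = A0 * 2**(-t/T)."""
    return a0 * 2 ** (-t / half_life)

def transmitted_intensity(i0, thickness, hvl):
    """Radiation intensity behind a shield of given thickness:
    I = I0 * 2**(-x/HVL), where HVL is the half-value layer."""
    return i0 * 2 ** (-thickness / hvl)

def tvl_from_hvl(hvl):
    """Tenth-value layer from the half-value layer: TVL = HVL * log2(10)."""
    return hvl * math.log2(10)

# Example: after two half-lives, a 100 GBq source retains 25 GBq,
# and three half-value layers of shielding transmit one eighth of the beam.
print(remaining_activity(100, 2, 1))      # 25.0
print(transmitted_intensity(100, 3, 1))   # 12.5
```

    The same base-2 form underlies all three quantities, which is why the half-value and tenth-value layers differ only by a constant factor of about 3.32.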

  16. 15. Basic economic indicators

    International Nuclear Information System (INIS)

    Carless, J.; Dow, B.; Farivari, R.; O'Connor, J.; Fox, T.; Tunstall, D.; Mentzingen, M.

    1992-01-01

    The clear value of economic data and analysis to decisionmakers has motivated them to mandate the creation of extensive global economic data sets. This chapter contains a set of these basic economic data, which provides the context for understanding the causes and the consequences of many of the decisions that affect the world's resources. Many traditional economic indicators fail to account for the depletion or deterioration of natural resources, the long-term consequences of such depletion, the equitable distribution of income within a country, or the sustainability of current economic practices. The type of measurement shown here, however, is still useful in showing the great differences between the wealthiest and the poorest countries. Tables are given on the following: Gross national product and official development assistance 1969-89; External debt indicators 1979-89; Central government expenditures; and World commodity indexes and prices 1975-89

  17. Chernobyl versus Basic Law

    Energy Technology Data Exchange (ETDEWEB)

    Sauer, G W

    1986-01-01

    The author discusses the terms 'remaining risk to be accepted' and 'remainder of the aggregate risk', and explains the line of action to be adopted in compliance with the Constitution in order to respond to the event at Chernobyl: the Constitution demands that maximum acceptable limits be defined as low as possible. The author discusses the various dose estimations and the contradictions to be observed in this context. He states that the Chernobyl accident has done the most harm to our legal system, as the basic right of freedom from injury was ploughed under with the radioactivity that covered the soil after the accident. But, he says, a positive effect is that the idea of abandoning nuclear power as too dangerous a technology has gained more widespread acceptance. (HSCH).

  18. Basic engineering mathematics

    CERN Document Server

    Bird, John

    2014-01-01

    Introductory mathematics written specifically for students new to engineering Now in its sixth edition, Basic Engineering Mathematics is an established textbook that has helped thousands of students to succeed in their exams. John Bird's approach is based on worked examples and interactive problems. This makes it ideal for students from a wide range of academic backgrounds, as they can work through the material at their own pace. Mathematical theories are explained in a straightforward manner, supported by practical engineering examples and applications to ensure that readers can relate theory to practice. The extensive and thorough topic coverage makes this an ideal text for introductory-level engineering courses. This title is supported by a companion website with resources for both students and lecturers, including lists of essential formulae, multiple choice tests, full solutions for all 1,600 further questions contained within the practice exercises, and biographical information on t...

  19. Chernobyl versus Basic Law?

    International Nuclear Information System (INIS)

    Sauer, G.W.

    1986-01-01

    The author discusses the terms 'remaining risk to be accepted' and 'remainder of the aggregate risk', and explains the line of action to be adopted in compliance with the Constitution in order to respond to the event at Chernobyl: the Constitution demands that maximum acceptable limits be defined as low as possible. The author discusses the various dose estimations and the contradictions to be observed in this context. He states that the Chernobyl accident has done the most harm to our legal system, as the basic right of freedom from injury was ploughed under with the radioactivity that covered the soil after the accident. But, he says, a positive effect is that the idea of abandoning nuclear power as too dangerous a technology has gained more widespread acceptance. (HSCH) [de

  20. Basic real analysis

    CERN Document Server

    Sohrab, Houshang H

    2014-01-01

    This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of the first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....
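    The Lebesgue version of the Fundamental Theorem of Calculus mentioned above admits a one-line statement; a sketch, assuming only that $f$ is Lebesgue integrable on $[a,b]$:

```latex
% Lebesgue's differentiation theorem applied to an indefinite integral:
% the integral of an integrable f is differentiable almost everywhere,
% and its derivative recovers f.
\[
  \frac{d}{dx}\int_a^x f(t)\,dt = f(x)
  \qquad \text{for almost every } x \in [a,b].
\]
```

    Unlike the classical Riemann statement, no continuity of $f$ is assumed; the conclusion holds only almost everywhere, which is exactly the refinement the Lebesgue theory provides.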