WorldWideScience

Sample records for methodological details ignoring

  1. Methodological Details and Full Bibliography

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset has several components. The first part fully describes our literature review, providing details not included in the text. The second part provides all...

  2. Ignoring detailed fast-changing dynamics of land use overestimates regional terrestrial carbon sequestration

    Directory of Open Access Journals (Sweden)

    S. Q. Zhao

    2009-08-01

    Full Text Available Land use change is critical in determining the distribution, magnitude and mechanisms of terrestrial carbon budgets at the local to global scales. To date, almost all regional to global carbon cycle studies are driven by a static land use map or land use change statistics with decadal time intervals. The biases in quantifying carbon exchange between the terrestrial ecosystems and the atmosphere caused by using such land use change information have not been investigated. Here, we used the General Ensemble biogeochemical Modeling System (GEMS), along with consistent and spatially explicit land use change scenarios with different intervals (1 yr, 5 yrs, 10 yrs and static), to evaluate the impacts of land use change data frequency on estimating regional carbon sequestration in the southeastern United States. Our results indicate that ignoring the detailed fast-changing dynamics of land use can lead to a significant overestimation of carbon uptake by the terrestrial ecosystem. Regional carbon sequestration increased from 0.27 to 0.69, 0.80 and 0.97 Mg C ha−1 yr−1 when the land use change data frequency shifted from a 1-year interval to 5-year and 10-year intervals and to static land use information, respectively. Carbon removal by forest harvesting and the prolonged cumulative impacts of historical land use change on the carbon cycle accounted for the differences in carbon sequestration between the static and dynamic land use change scenarios. The results suggest that it is critical to incorporate the detailed dynamics of land use change into local to global carbon cycle studies. Otherwise, it is impossible to accurately quantify the geographic distributions, magnitudes, and mechanisms of terrestrial carbon sequestration at the local to global scales.
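    A minimal illustration of the effect described above (plain bookkeeping, not the GEMS model; all flux values are hypothetical): collapsing a yearly land use series into a single decadal snapshot hides a harvest year and inflates the apparent mean carbon uptake.

      # Plain bookkeeping of mean annual carbon flux under land-use series
      # sampled at different frequencies; all flux values are hypothetical.
      UPTAKE = {"forest": 1.0, "harvested": -3.0, "regrowth": 0.5}  # Mg C/ha/yr

      def mean_uptake(series):
          """Mean annual net carbon uptake over a land-use time series."""
          return sum(UPTAKE[state] for state in series) / len(series)

      # Yearly series containing a harvest event followed by regrowth years.
      yearly = ["forest"] * 4 + ["harvested"] + ["regrowth"] * 5

      # A decadal "snapshot" only sees the land use at the start of the decade,
      # so the harvest and regrowth years become invisible.
      decadal = [yearly[0]] * len(yearly)

      print("1-yr land use data :", round(mean_uptake(yearly), 2), "Mg C/ha/yr")
      print("10-yr land use data:", round(mean_uptake(decadal), 2), "Mg C/ha/yr")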

  3. Organizational Ignorance

    DEFF Research Database (Denmark)

    Lange, Ann-Christina

    2016-01-01

    This paper provides an analysis of strategic uses of ignorance or not-knowing in one of the most secretive industries within the financial sector. The focus of the paper is on the relation between imitation and ignorance within the organizational structure of high-frequency trading (HFT) firms...... and investigate the kinds of imitations that might be produced from structures of not-knowing (i.e. structures intended to divide, obscure and protect knowledge). This point is illustrated through ethnographic studies and interviews within five HFT firms. The data show how a black-box structure of ignorance...

  4. The Varieties of Ignorance

    DEFF Research Database (Denmark)

    Nottelmann, Nikolaj

    2016-01-01

    This chapter discusses varieties of ignorance divided according to kind (what the subject is ignorant of), degree, and order (e.g. ignorance of ignorance equals second-order ignorance). It provides analyses of notions such as factual ignorance, erotetic ignorance (ignorance of answers to question...

  5. Satellite telemetry of Afrotropical ducks: methodological details and ...

    African Journals Online (AJOL)

    Despite widespread and increasing use of solar-powered satellite transmitters to tag wild birds, there are few published articles that detail how transmitters should be attached to different species, and even fewer assessments of the overall field success of telemetry projects. The scarcity of this information makes it difficult to ...

  6. Ignorability for categorical data

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    We study the problem of ignorability in likelihood-based inference from incomplete categorical data. Two versions of the coarsened at random (CAR) assumption are distinguished, their compatibility with the parameter distinctness assumption is investigated, and several conditions for ignorability...
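    As a rough, hypothetical illustration of the ignorability idea in this record (not Jaeger's own analysis): under coarsening at random with distinct parameters, the coarsening probabilities enter the likelihood only through factors that do not involve the category probabilities, so maximising the "face-value" likelihood gives the same estimate as maximising the full likelihood.

      import numpy as np
      from itertools import product

      # Toy categorical variable X in {0, 1, 2} with cell probabilities theta.
      # Values 1 and 2 are sometimes reported only as the coarse set {1, 2},
      # with the same probability lam for both values (coarsened at random);
      # value 0 is always reported exactly.
      obs = [frozenset(s) for s in ({0}, {1}, {1, 2}, {2}, {1, 2}, {0}, {1, 2})]

      def full_loglik(theta, lam):
          t0, t1, t2 = theta
          ll = 0.0
          for s in obs:
              if s == {0}:
                  ll += np.log(t0)
              elif s == {1}:
                  ll += np.log(t1 * (1 - lam))
              elif s == {2}:
                  ll += np.log(t2 * (1 - lam))
              else:                      # coarse observation {1, 2}
                  ll += np.log((t1 + t2) * lam)
          return ll

      def face_value_loglik(theta):
          """Likelihood that ignores the coarsening mechanism entirely."""
          t0, t1, t2 = theta
          prob = {frozenset({0}): t0, frozenset({1}): t1,
                  frozenset({2}): t2, frozenset({1, 2}): t1 + t2}
          return sum(np.log(prob[s]) for s in obs)

      # Grid search over the simplex: the maximising theta is the same whether
      # or not the CAR coarsening parameter lam is modelled, i.e. it is ignorable.
      grid = [(a, b, 1 - a - b)
              for a, b in product(np.arange(0.05, 0.91, 0.05), repeat=2)
              if a + b <= 0.90]
      print(max(grid, key=lambda th: full_loglik(th, lam=0.4)))
      print(max(grid, key=face_value_loglik))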

  7. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 3. Detailed methodology

    Energy Technology Data Exchange (ETDEWEB)

    1987-04-01

    The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.

  8. Ignore and Conquer.

    Science.gov (United States)

    Conroy, Mary

    1989-01-01

    Discusses how teachers can deal with student misbehavior by ignoring negative behavior that is motivated by a desire for attention. Practical techniques are described for pinpointing attention seekers, enlisting classmates to deal with misbehaving students, ignoring misbehavior, and distinguishing behavior that responds to this technique from…

  9. Strategic Self-Ignorance

    DEFF Research Database (Denmark)

    Thunström, Linda; Nordström, Leif Jonas; Shogren, Jason F.

    We examine strategic self-ignorance—the use of ignorance as an excuse to overindulge in pleasurable activities that may be harmful to one’s future self. Our model shows that guilt aversion provides a behavioral rationale for present-biased agents to avoid information about negative future impacts...... of such activities. We then confront our model with data from an experiment using prepared, restaurant-style meals — a good that is transparent in immediate pleasure (taste) but non-transparent in future harm (calories). Our results support the notion that strategic self-ignorance matters: nearly three of five...... subjects (58 percent) chose to ignore free information on calorie content, leading at-risk subjects to consume significantly more calories. We also find evidence consistent with our model on the determinants of strategic self-ignorance....

  10. Strategic self-ignorance

    DEFF Research Database (Denmark)

    Thunström, Linda; Nordström, Leif Jonas; Shogren, Jason F.

    2016-01-01

    We examine strategic self-ignorance—the use of ignorance as an excuse to over-indulge in pleasurable activities that may be harmful to one’s future self. Our model shows that guilt aversion provides a behavioral rationale for present-biased agents to avoid information about negative future impacts...... of such activities. We then confront our model with data from an experiment using prepared, restaurant-style meals—a good that is transparent in immediate pleasure (taste) but non-transparent in future harm (calories). Our results support the notion that strategic self-ignorance matters: nearly three of five...... subjects (58%) chose to ignore free information on calorie content, leading at-risk subjects to consume significantly more calories. We also find evidence consistent with our model on the determinants of strategic self-ignorance....
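    A toy decision sketch, with entirely hypothetical parameter values and not the authors' formal model, of why a present-biased, guilt-averse agent might decline free calorie information:

      # Toy parameters, all hypothetical.
      TASTE = 10.0                          # immediate pleasure of the meal
      HARM = {"low": 1.0, "high": 8.0}      # future cost by calorie type
      P_HIGH = 0.5                          # prior probability of "high"
      BETA = 0.4                            # present bias discount on harm
      GUILT = 0.6                           # extra weight on *known* harm

      def utility_of_eating(known_harm=None):
          expected_harm = (known_harm if known_harm is not None
                           else P_HIGH * HARM["high"] + (1 - P_HIGH) * HARM["low"])
          guilt = GUILT * known_harm if known_harm is not None else 0.0
          return TASTE - BETA * expected_harm - guilt

      # Staying ignorant: eat under the prior, with no guilt about known harm.
      u_ignorant = utility_of_eating()

      # Reading the free calorie label: learn the type, then eat or skip (0.0).
      u_informed = sum(p * max(utility_of_eating(h), 0.0)
                       for p, h in [(P_HIGH, HARM["high"]), (1 - P_HIGH, HARM["low"])])

      print("expected utility, ignorant:", u_ignorant)   # 8.2 with these numbers
      print("expected utility, informed:", u_informed)   # 5.5 -> ignorance preferred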

  11. Ignorance, information and autonomy

    OpenAIRE

    Harris, J.; Keywood, K.

    2001-01-01

    People have a powerful interest in genetic privacy and its associated claim to ignorance, and some equally powerful desires to be shielded from disturbing information are often voiced. We argue, however, that there is no such thing as a right to remain in ignorance, where a right is understood as an entitlement that trumps competing claims. This does not of course mean that information must always be forced upon unwilling recipients, only that there is no prima facie entitlement to be protect...

  12. Clash of Ignorance

    Directory of Open Access Journals (Sweden)

    Mahmoud Eid

    2012-06-01

    Full Text Available The clash of ignorance thesis presents a critique of the clash of civilizations theory. It challenges the assumptions that civilizations are monolithic entities that do not interact and that the Self and the Other are always opposed to each other. Despite some significantly different values and clashes between Western and Muslim civilizations, they overlap with each other in many ways and have historically demonstrated the capacity for fruitful engagement. The clash of ignorance thesis makes a significant contribution to the understanding of intercultural and international communication as well as to the study of inter-group relations in various other areas of scholarship. It does this by bringing forward for examination the key impediments to mutually beneficial interaction between groups. The thesis directly addresses the particular problem of ignorance that other epistemological approaches have not raised in a substantial manner. Whereas the critique of Orientalism deals with the hegemonic construction of knowledge, the clash of ignorance paradigm broadens the inquiry to include various actors whose respective distortions of knowledge symbiotically promote conflict with each other. It also augments the power-knowledge model to provide conceptual and analytical tools for understanding the exploitation of ignorance for the purposes of enhancing particular groups’ or individuals’ power. Whereas academics, policymakers, think tanks, and religious leaders have referred to the clash of ignorance concept, this essay contributes to its development as a theory that is able to provide a valid basis to explain the empirical evidence drawn from relevant cases.

  13. [Detailed methodological recommendations for the treatment of Clostridium difficile-associated diarrhea with faecal transplantation].

    Science.gov (United States)

    Nagy, Gergely György; Várvölgyi, Csaba; Balogh, Zoltán; Orosi, Piroska; Paragh, György

    2013-01-06

    The incidence of Clostridium difficile associated enteral disease shows a dramatic increase worldwide, with appallingly high treatment costs, mortality figures, recurrence rates and treatment refractoriness. It is not surprising that there is significant interest in the development and introduction of alternative therapeutic strategies. Among these, only stool transplantation (or faecal bacteriotherapy) is gaining international acceptance due to its excellent cure rate (≈92%), low recurrence rate (≈6%), safety and cost-effectiveness. Unfortunately faecal transplantation is not available for most patients, although based on promising international results, its introduction into routine clinical practice is well justified and widely expected. The authors would like to facilitate this process by presenting a detailed faecal transplantation protocol prepared in their institution based on the available literature and clinical rationality. Officially accepted national methodological guidelines will need to be issued in the future, founded on the expert opinion of relevant professional societies and upcoming advances in this field.

  14. Ignorance, information and autonomy.

    Science.gov (United States)

    Harris, J; Keywood, K

    2001-09-01

    People have a powerful interest in genetic privacy and its associated claim to ignorance, and some equally powerful desires to be shielded from disturbing information are often voiced. We argue, however, that there is no such thing as a right to remain in ignorance, where a right is understood as an entitlement that trumps competing claims. This does not of course mean that information must always be forced upon unwilling recipients, only that there is no prima facie entitlement to be protected from true or honest information about oneself. Any claims to be shielded from information about the self must compete on equal terms with claims based in the rights and interests of others. In balancing the weight and importance of rival considerations about giving or withholding information, if rights claims have any place, rights are more likely to be defensible on the side of honest communication of information rather than in defence of ignorance. The right to free speech and the right to decline to accept responsibility to take decisions for others imposed by those others seem to us more plausible candidates for fully fledged rights in this field than any purported right to ignorance. Finally, and most importantly, if the right to autonomy is invoked, a proper understanding of the distinction between claims to liberty and claims to autonomy shows that the principle of autonomy, as it is understood in contemporary social ethics and English law, supports the giving rather than the withholding of information in most circumstances.

  15. The virtues of ignorance.

    Science.gov (United States)

    Son, Lisa K; Kornell, Nate

    2010-02-01

    Although ignorance and uncertainty are usually unwelcome feelings, they have unintuitive advantages for both human and non-human animals, which we review here. We begin with the perils of too much information: expertise and knowledge can come with illusions (and delusions) of knowing. We then describe how withholding information can counteract these perils: providing people with less information enables them to judge more precisely what they know and do not know, which in turn enhances long-term memory. Data are presented from a new experiment that illustrates how knowing what we do not know can result in helpful choices and enhanced learning. We conclude by showing that ignorance can be a virtue, as long as it is recognized and rectified. Copyright 2009 Elsevier B.V. All rights reserved.

  16. Poor methodological detail precludes experimental repeatability and hampers synthesis in ecology.

    Science.gov (United States)

    Haddaway, Neal R; Verhoeven, Jos T A

    2015-10-01

    Despite the scientific method's central tenets of reproducibility (the ability to obtain similar results when repeated) and repeatability (the ability to replicate an experiment based on the methods described), published ecological research continues to fail to provide sufficient methodological detail to allow either repeatability or verification. Recent systematic reviews highlight the problem, with one example demonstrating that an average of 13% of studies per year (±8.0 [SD]) failed to report sample sizes. The problem affects the ability to verify the accuracy of any analysis, to repeat the methods used, and to assimilate the study findings into powerful and useful meta-analyses. The problem is common in a variety of ecological topics examined to date, and despite previous calls for improved reporting and metadata archiving, which could indirectly alleviate the problem, there is no indication of an improvement in reporting standards over time. Here, we call on authors, editors, and peer reviewers to consider repeatability as a top priority when evaluating research manuscripts, bearing in mind that legacy and integration into the evidence base can drastically improve the impact of individual research reports.

  17. UK ignores treaty obligations

    International Nuclear Information System (INIS)

    Roche, P.

    1995-01-01

    A detailed critique is offered of United Kingdom (UK) political policy with respect to the Non-Proliferation Treaty, an interim agreement valid while nuclear disarmament was supposed to occur, by a representative of Greenpeace, the anti-nuclear campaigning group. The author argues that the civil and military nuclear programmes are still firmly linked, and emphasises his opinions by quoting examples of how UK politicians have broken treaty obligations in order to pursue their own political, and in some cases financial, goals. It is argued that the treaty has failed to force nuclear countries to disarm because of its promotion of civil nuclear power programmes. (U.K.)

  18. The logic of strategic ignorance.

    Science.gov (United States)

    McGoey, Linsey

    2012-09-01

    Ignorance and knowledge are often thought of as opposite phenomena. Knowledge is seen as a source of power, and ignorance as a barrier to consolidating authority in political and corporate arenas. This article disputes this, exploring the ways that ignorance serves as a productive asset, helping individuals and institutions to command resources, deny liability in the aftermath of crises, and to assert expertise in the face of unpredictable outcomes. Through a focus on the Food and Drug Administration's licensing of Ketek, an antibiotic drug manufactured by Sanofi-Aventis and linked to liver failure, I suggest that in drug regulation, different actors, from physicians to regulators to manufacturers, often battle over who can attest to the least knowledge of the efficacy and safety of different drugs - a finding that raises new insights about the value of ignorance as an organizational resource. © London School of Economics and Political Science 2012.

  19. Ignoring Ignorance: Notes on Pedagogical Relationships in Citizen Science

    Directory of Open Access Journals (Sweden)

    Michael Scroggins

    2017-04-01

    Full Text Available Theoretically, this article seeks to broaden the conceptualization of ignorance within STS by drawing on a line of theory developed in the philosophy and anthropology of education, arguing that ignorance can be productively conceptualized as a state of possibility and that doing so can enable more democratic forms of citizen science. In contrast to conceptualizations of ignorance as a lack, lag, or manufactured product, ignorance is developed here as both the opening move in scientific inquiry and the common ground over which that inquiry proceeds. Empirically, the argument is developed through an ethnographic description of Scroggins' participation in a failed citizen science project at a DIYbio laboratory. Supporting the empirical case are a review of the STS literature on expertise and a critical examination of the structures of participation within two canonical citizen science projects. Though the work is onerous, close attention to how people transform one another during inquiry can put increasingly democratic forms of citizen science, grounded in the commonness of ignorance, into practice.

  20. Methodology of Detailed Geophysical Examination of the Areas of World Recognized Religious and Cultural Artifacts

    Science.gov (United States)

    Eppelbaum, Lev

    2010-05-01


  1. Fault-ignorant quantum search

    International Nuclear Information System (INIS)

    Vrana, Péter; Reeb, David; Reitzner, Daniel; Wolf, Michael M

    2014-01-01

    We investigate the problem of quantum searching on a noisy quantum computer. Taking a fault-ignorant approach, we analyze quantum algorithms that solve the task for various different noise strengths, which are possibly unknown beforehand. We prove lower bounds on the runtime of such algorithms and thereby find that the quadratic speedup is necessarily lost (in our noise models). However, for low but constant noise levels the algorithms we provide (based on Grover's algorithm) still outperform the best noiseless classical search algorithm. (paper)
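    For readers unfamiliar with the baseline being degraded here, the following minimal sketch (an illustrative toy, not the paper's algorithms or noise models) simulates Grover iterations on a density matrix with a simple depolarizing channel applied after each step, showing how the success probability falls as the noise level rises.

      import numpy as np

      def grover_success_prob(n_items, target, noise, n_iter):
          """Success probability of Grover search with depolarizing noise."""
          N = n_items
          s = np.full((N, 1), 1 / np.sqrt(N))            # uniform superposition
          oracle = np.eye(N)
          oracle[target, target] = -1.0                  # phase-flip the target
          diffusion = 2 * (s @ s.T) - np.eye(N)          # inversion about the mean
          G = diffusion @ oracle
          rho = s @ s.T                                  # density matrix |s><s|
          for _ in range(n_iter):
              rho = G @ rho @ G.T
              rho = (1 - noise) * rho + noise * np.eye(N) / N   # depolarize
          return float(rho[target, target])

      N = 64
      k = int(round(np.pi / 4 * np.sqrt(N)))             # near-optimal iterations
      for p in (0.0, 0.05, 0.2):
          print(f"noise={p:.2f}  P(success)={grover_success_prob(N, 0, p, k):.3f}")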

  2. Traffic forecasts ignoring induced demand

    DEFF Research Database (Denmark)

    Næss, Petter; Nicolaisen, Morten Skou; Strand, Arvid

    2012-01-01

    ...performance of a proposed road project in Copenhagen with and without short-term induced traffic included in the transport model. The available transport model was not able to include long-term induced traffic resulting from changes in land use and in the level of service of public transport. Even though the model calculations included only a part of the induced traffic, the difference in cost-benefit results compared to the model excluding all induced traffic was substantial. The results show lower travel time savings, more adverse environmental impacts and a considerably lower benefit-cost ratio when induced traffic is partly accounted for than when it is ignored. By exaggerating the economic benefits of road capacity increase and underestimating its negative effects, omission of induced traffic can result in over-allocation of public money on road construction and correspondingly less focus on other...
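    A stylised appraisal sketch (hypothetical numbers and a textbook BPR volume-delay function, not the Copenhagen transport model) of the mechanism described above: once induced trips are allowed to erode part of the travel-time saving, the computed benefit-cost ratio drops.

      def travel_time(flow, capacity, t0=20.0):
          """BPR volume-delay function: minutes per trip."""
          return t0 * (1 + 0.15 * (flow / capacity) ** 4)

      BASE_FLOW, OLD_CAP, NEW_CAP = 4000.0, 4000.0, 5000.0   # trips/h, capacities
      VALUE_OF_TIME = 0.25     # money units per minute per trip (hypothetical)
      PROJECT_COST = 1500.0    # annualised cost, same units (hypothetical)

      t_before = travel_time(BASE_FLOW, OLD_CAP)

      # Appraisal with fixed demand: existing trips simply get faster.
      t_fixed = travel_time(BASE_FLOW, NEW_CAP)
      benefit_fixed = BASE_FLOW * (t_before - t_fixed) * VALUE_OF_TIME

      # With induced demand: lower travel time attracts extra trips (one-shot
      # elasticity response, no equilibrium iteration), which congest the widened
      # road and erode the time saving. Only benefits to existing trips are counted.
      elasticity = -0.5
      flow_induced = BASE_FLOW * (t_fixed / t_before) ** elasticity
      t_induced = travel_time(flow_induced, NEW_CAP)
      benefit_induced = BASE_FLOW * (t_before - t_induced) * VALUE_OF_TIME

      print("BCR, induced traffic ignored :", round(benefit_fixed / PROJECT_COST, 2))
      print("BCR, induced traffic included:", round(benefit_induced / PROJECT_COST, 2))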

  3. The Power of Ignorance | Code | Philosophical Papers

    African Journals Online (AJOL)

    Taking my point of entry from George Eliot's reference to 'the power of Ignorance', I analyse some manifestations of that power as she portrays it in the life of a young woman of affluence, in her novel Daniel Deronda. Comparing and contrasting this kind of ignorance with James Mill's avowed ignorance of local tradition and ...

  4. Life-cycle cost as basis to optimize waste collection in space and time: A methodology for obtaining a detailed cost breakdown structure.

    Science.gov (United States)

    Sousa, Vitor; Dias-Ferreira, Celia; Vaz, João M; Meireles, Inês

    2018-05-01

    Extensive research has been carried out on waste collection costs, mainly to differentiate the costs of distinct waste streams and to optimize waste collection services spatially (e.g. routes, number, and location of waste facilities). However, waste collection managers also face the challenge of optimizing assets in time, for instance deciding when to replace and how to maintain, or which technological solution to adopt. These issues require more detailed knowledge about the waste collection services' cost breakdown structure. The present research adjusts the methodology for buildings' life-cycle cost (LCC) analysis, detailed in ISO 15686-5:2008, to waste collection assets. The proposed methodology is then applied to the waste collection assets owned and operated by a real municipality in Portugal (Cascais Ambiente - EMAC). The goal is to highlight the potential of the LCC tool in providing a baseline for time optimization of the waste collection service and assets, namely assisting decisions regarding equipment operation and replacement.
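    A minimal sketch of the kind of cost breakdown structure the record refers to, assuming illustrative figures rather than EMAC's data and following the general spirit of ISO 15686-5: discount each cost category over the asset's service life and report its share of the total.

      RATE = 0.04              # annual discount rate (assumed)
      LIFE = 10                # service life of a collection vehicle, years (assumed)

      costs = {                # cost category -> list of (year, amount) cash flows
          "acquisition": [(0, 180_000)],
          "operation":   [(y, 25_000) for y in range(1, LIFE + 1)],   # fuel, crew
          "maintenance": [(y, 6_000) for y in range(1, LIFE + 1)],
          "end_of_life": [(LIFE, -8_000)],                            # resale value
      }

      def npv(cash_flows, rate=RATE):
          """Net present value of a list of (year, amount) cash flows."""
          return sum(amount / (1 + rate) ** year for year, amount in cash_flows)

      breakdown = {category: npv(flows) for category, flows in costs.items()}
      total = sum(breakdown.values())
      for category, value in breakdown.items():
          print(f"{category:12s} {value:12,.0f}  ({value / total:6.1%})")
      print(f"{'total LCC':12s} {total:12,.0f}")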

  5. Aspiring to Spectral Ignorance in Earth Observation

    Science.gov (United States)

    Oliver, S. A.

    2016-12-01

    Enabling robust, defensible and integrated decision making in the Era of Big Earth Data requires the fusion of data from multiple and diverse sensor platforms and networks. While the application of standardised global grid systems provides a common spatial analytics framework that facilitates the computationally efficient and statistically valid integration and analysis of these various data sources across multiple scales, there remains the challenge of sensor equivalency, particularly when combining data from different earth observation satellite sensors (e.g. combining Landsat and Sentinel-2 observations). To realise the vision of a sensor-ignorant analytics platform for earth observation we require automation of spectral matching across the available sensors. Ultimately, the aim is to remove the requirement for the user to possess any sensor knowledge in order to undertake analysis. This paper introduces the concept of spectral equivalence and proposes a methodology through which equivalent bands may be sourced from a set of potential target sensors through the application of equivalence metrics and thresholds. A number of parameters can be used to determine whether a pair of spectra are equivalent for the purposes of analysis. A baseline set of thresholds for these parameters, and a systematic way of applying them to relate spectral bands among numerous different sensors, is proposed. The base unit for comparison in this work is the relative spectral response. From this input, a determination of what may constitute equivalence can be made by the user, based on their own conceptualisation of equivalence.
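    One possible way to operationalise such an equivalence test is sketched below; the metrics (centre-wavelength shift and overlap of normalised relative spectral responses) and the thresholds are assumptions for illustration, not the parameters proposed in the record.

      import numpy as np

      def centre_wavelength(wl, rsr):
          """Response-weighted centre wavelength (uniform wavelength grid)."""
          return np.sum(wl * rsr) / np.sum(rsr)

      def overlap(wl, rsr_a, rsr_b):
          """Overlap of two RSRs after normalising each to unit area."""
          dwl = wl[1] - wl[0]
          a = rsr_a / (np.sum(rsr_a) * dwl)
          b = rsr_b / (np.sum(rsr_b) * dwl)
          return np.sum(np.minimum(a, b)) * dwl          # 1.0 = identical shape

      def equivalent(wl, rsr_a, rsr_b, max_shift_nm=10.0, min_overlap=0.8):
          shift = abs(centre_wavelength(wl, rsr_a) - centre_wavelength(wl, rsr_b))
          return shift <= max_shift_nm and overlap(wl, rsr_a, rsr_b) >= min_overlap

      # Two synthetic Gaussian-shaped red bands on a common wavelength grid (nm),
      # standing in for the relative spectral responses of two different sensors.
      wl = np.linspace(600.0, 720.0, 241)
      band_a = np.exp(-0.5 * ((wl - 660.0) / 15.0) ** 2)
      band_b = np.exp(-0.5 * ((wl - 665.0) / 18.0) ** 2)
      print("bands treated as equivalent:", equivalent(wl, band_a, band_b))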

  6. From dissecting ignorance to solving algebraic problems

    International Nuclear Information System (INIS)

    Ayyub, Bilal M.

    2004-01-01

    Engineers and scientists are increasingly required to design, test, and validate new complex systems in simulation environments and/or with limited experimental results due to international and/or budgetary restrictions. Dealing with complex systems requires assessing knowledge and information by critically evaluating them in terms of relevance, completeness, non-distortion, coherence, and other key measures. Using the concepts and definitions from evolutionary knowledge and epistemology, ignorance is examined and classified in the paper. Two ignorance states for a knowledge agent are identified: (1) a non-reflective (or blind) state, i.e. the person does not know of self-ignorance, a case of ignorance of ignorance; and (2) a reflective state, i.e. the person knows and recognizes self-ignorance. Ignorance can be viewed as having a hierarchical classification based on its sources and nature, as provided in the paper. The paper also explores limits on knowledge construction, closed and open world assumptions, and fundamentals of evidential reasoning using belief revision and diagnostics within the framework of ignorance analysis for knowledge construction. The paper also examines an algebraic problem set identified by Sandia National Laboratories as a basic building block for uncertainty propagation in computational mechanics. Solution algorithms are provided for the problem set for various assumptions about the state of knowledge about its parameters
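    As a small illustration of interval-style propagation of epistemic uncertainty through an algebraic expression (the expression and bounds below are stand-ins, not the Sandia problem set itself):

      import itertools
      import numpy as np

      def f(a, b):
          """Stand-in algebraic response function."""
          return (a + b) ** a

      A = (0.1, 1.0)           # interval of possible values for a (assumed)
      B = (0.0, 1.0)           # interval of possible values for b (assumed)

      # Brute-force bound: sample densely over both intervals and report the
      # envelope of the response; no probability distribution is implied, which
      # is the point of treating the inputs as pure (epistemic) intervals.
      a_grid = np.linspace(*A, 201)
      b_grid = np.linspace(*B, 201)
      values = [f(a, b) for a, b in itertools.product(a_grid, b_grid)]
      print(f"y lies in [{min(values):.3f}, {max(values):.3f}]")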

  7. On the Rationality of Pluralistic Ignorance

    DEFF Research Database (Denmark)

    Bjerring, Jens Christian Krarup; Hansen, Jens Ulrik; Pedersen, Nikolaj Jang Lee Linding

    2014-01-01

    Pluralistic ignorance is a socio-psychological phenomenon that involves a systematic discrepancy between people’s private beliefs and public behavior in cer- tain social contexts. Recently, pluralistic ignorance has gained increased attention in formal and social epistemology. But to get clear...

  8. From dissecting ignorance to solving algebraic problems

    Energy Technology Data Exchange (ETDEWEB)

    Ayyub, Bilal M

    2004-09-01

    Engineers and scientists are increasingly required to design, test, and validate new complex systems in simulation environments and/or with limited experimental results due to international and/or budgetary restrictions. Dealing with complex systems requires assessing knowledge and information by critically evaluating them in terms of relevance, completeness, non-distortion, coherence, and other key measures. Using the concepts and definitions from evolutionary knowledge and epistemology, ignorance is examined and classified in the paper. Two ignorance states for a knowledge agent are identified: (1) a non-reflective (or blind) state, i.e. the person does not know of self-ignorance, a case of ignorance of ignorance; and (2) a reflective state, i.e. the person knows and recognizes self-ignorance. Ignorance can be viewed as having a hierarchical classification based on its sources and nature, as provided in the paper. The paper also explores limits on knowledge construction, closed and open world assumptions, and fundamentals of evidential reasoning using belief revision and diagnostics within the framework of ignorance analysis for knowledge construction. The paper also examines an algebraic problem set identified by Sandia National Laboratories as a basic building block for uncertainty propagation in computational mechanics. Solution algorithms are provided for the problem set for various assumptions about the state of knowledge about its parameters.

  9. Background report to the OECD Environmental Outlook to 2030. Overviews, details, and methodology of model-based analysis

    International Nuclear Information System (INIS)

    Bakkes, J.A.; Bagnoli, P.; Chateau, J.; Corfee-Morlot, J.; Kim, Y.G.

    2008-01-01

    This background report provides overviews and details of the model-based analyses for the Outlook. The global analyses have been conducted for 24 regions. They cover: climate change; urban air pollution and related health impacts; nutrient loading to the aquatic environment by agriculture and by trends in sanitation and sewerage; terrestrial biodiversity. A baseline scenario has been developed, as well as three policy packages. Most of the model-based analyses for the Environmental Outlook include a retrospect to 1970 and a look forward up to 2050. This enables an assessment of the cost of policy inaction and of the delaying of such action. This background report compares the impacts of the baseline for the various regions of the world. It also assesses the impact of uncertainties in the modelling for the key messages of the Environmental Outlook

  10. Modelling non-ignorable missing data mechanisms with item response theory models

    NARCIS (Netherlands)

    Holman, Rebecca; Glas, Cornelis A.W.

    2005-01-01

    A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled
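    A compact sketch of the general idea, not the authors' exact specification: item responses follow a Rasch model driven by ability, the missingness indicators follow a second Rasch-type model driven by a latent propensity, and a nonzero correlation between the two latent variables makes the missingness non-ignorable.

      import numpy as np

      def sigmoid(x):
          return 1.0 / (1.0 + np.exp(-x))

      def person_loglik(resp, miss, b_item, b_miss, rho, n_nodes=15):
          """Marginal log-likelihood of one person's responses and missingness.
          resp: item scores (ignored where missing); miss: 1 if the item is missing."""
          # Gauss-Hermite nodes/weights for standard-normal latent variables.
          nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
          weights = weights / weights.sum()
          total = 0.0
          for theta, w_t in zip(nodes, weights):          # ability
              for z, w_z in zip(nodes, weights):          # independent part of eta
                  eta = rho * theta + np.sqrt(1 - rho ** 2) * z   # missingness propensity
                  lik = 1.0
                  for j in range(len(b_item)):
                      p_miss = sigmoid(eta - b_miss[j])
                      lik *= p_miss if miss[j] else (1 - p_miss)
                      if not miss[j]:
                          p = sigmoid(theta - b_item[j])  # Rasch response model
                          lik *= p if resp[j] == 1 else (1 - p)
                  total += w_t * w_z * lik
          return np.log(total)

      b_item = [-0.5, 0.0, 0.8]            # item difficulties (hypothetical)
      b_miss = [1.0, 1.0, 1.0]             # "difficulty" of omitting each item
      resp, miss = [1, 0, 0], [0, 1, 0]    # item 2 omitted by this person
      # rho = 0 would make the missingness ignorable for estimating b_item.
      print(person_loglik(resp, miss, b_item, b_miss, rho=0.6))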

  11. Modelling non-ignorable missing-data mechanisms with item response theory models

    NARCIS (Netherlands)

    Holman, Rebecca; Glas, Cees A. W.

    2005-01-01

    A model-based procedure for assessing the extent to which missing data can be ignored and handling non-ignorable missing data is presented. The procedure is based on item response theory modelling. As an example, the approach is worked out in detail in conjunction with item response data modelled

  12. Ignorance-Based Instruction in Higher Education.

    Science.gov (United States)

    Stocking, S. Holly

    1992-01-01

    Describes how three groups of educators (in a medical school, a psychology department, and a journalism school) are helping instructors and students to recognize, manage, and use ignorance to promote learning. (SR)

  13. Is Ignorance of Climate Change Culpable?

    Science.gov (United States)

    Robichaud, Philip

    2017-10-01

    Sometimes ignorance is an excuse. If an agent did not know and could not have known that her action would realize some bad outcome, then it is plausible to maintain that she is not to blame for realizing that outcome, even when the act that leads to this outcome is wrong. This general thought can be brought to bear in the context of climate change insofar as we think (a) that the actions of individual agents play some role in realizing climate harms and (b) that these actions are apt targets for being considered right or wrong. Are agents who are ignorant about climate change and the way their actions contribute to it excused because of their ignorance, or is their ignorance culpable? In this paper I examine these questions from the perspective of recent developments in the theories of responsibility for ignorant action and characterize their verdicts. After developing some objections to existing attempts to explore these questions, I characterize two influential theories of moral responsibility and discuss their implications for three different types of ignorance about climate change. I conclude with some recommendations for how we should react in the face of the theories' conflicting verdicts. The answer to the question posed in the title, then, is: "Well, it's complicated."

  14. Knowledge, responsibility, decision making and ignorance

    DEFF Research Database (Denmark)

    Huniche, Lotte

    2001-01-01

    of and ignoring) seems to be commonly applicable to describing persons living at risk for Huntington's Disease (HD). So what does everyday conduct of life look like from an "ignorance" perspective? And how can we discuss and argue about morality and ethics taking these seemingly diverse ways of living at risk...... into account? Posing this question, I hope to contribute to new reflections on possibilities and constraints in people's lives with HD as well as in research and to open up new ways of discussing "right and wrong"....

  15. DMPD: TLR ignores methylated RNA? [Dynamic Macrophage Pathway CSML Database

    Lifescience Database Archive (English)

    Full Text Available TLR ignores methylated RNA? Ishii KJ, Akira S. Immunity. 2005 Aug;23(2):111-3. PubmedID 16111629.

  16. Should general psychiatry ignore somatization and hypochondriasis?

    Science.gov (United States)

    Creed, Francis

    2006-10-01

    This paper examines the tendency for general psychiatry to ignore somatization and hypochondriasis. These disorders are rarely included in national surveys of mental health and are not usually regarded as a concern of general psychiatrists; yet primary care doctors and other physicians often feel let down by psychiatry's failure to offer help in this area of medical practice. Many psychiatrists are unaware of the suffering, impaired function and high costs that can result from these disorders, because these occur mainly within primary care and secondary medical services. Difficulties in diagnosis and a tendency to regard them as purely secondary phenomena of depression, anxiety and related disorders mean that general psychiatry may continue to ignore somatization and hypochondriasis. If general psychiatry embraced these disorders more fully, however, it might lead to better prevention and treatment of depression as well as helping to prevent the severe disability that may arise in association with these disorders.

  17. Should general psychiatry ignore somatization and hypochondriasis?

    OpenAIRE

    CREED, FRANCIS

    2006-01-01

    This paper examines the tendency for general psychiatry to ignore somatization and hypochondriasis. These disorders are rarely included in national surveys of mental health and are not usually regarded as a concern of general psychiatrists; yet primary care doctors and other physicians often feel let down by psychiatry's failure to offer help in this area of medical practice. Many psychiatrists are unaware of the suffering, impaired function and high costs that can result fr...

  18. Can Strategic Ignorance Explain the Evolution of Love?

    Science.gov (United States)

    Bear, Adam; Rand, David G

    2018-04-24

    People's devotion to, and love for, their romantic partners poses an evolutionary puzzle: Why is it better to stop your search for other partners once you enter a serious relationship when you could continue to search for somebody better? A recent formal model based on "strategic ignorance" suggests that such behavior can be adaptive and favored by natural selection, so long as you can signal your unwillingness to "look" for other potential mates to your current partner. Here, we re-examine this conclusion with a more detailed model designed to capture specific features of romantic relationships. We find, surprisingly, that devotion does not typically evolve in our model: Selection favors agents who choose to "look" while in relationships and who allow their partners to do the same. Non-looking is only expected to evolve if there is an extremely large cost associated with being left by your partner. Our results therefore raise questions about the role of strategic ignorance in explaining the evolution of love. Copyright © 2018 Cognitive Science Society, Inc.

  19. Issues ignored in laboratory quality surveillance

    International Nuclear Information System (INIS)

    Zeng Jing; Li Xingyuan; Zhang Tingsheng

    2008-01-01

    According to the work requirements for laboratory quality surveillance in ISO 17025, this paper analyzed and discussed the issues ignored in laboratory quality surveillance. In order to solve the present problems, it is required to understand the work responsibility in quality surveillance correctly, to establish an effective working routine for quality surveillance, and to conduct the quality surveillance work accordingly. The object of quality surveillance shall be 'the operator' who is directly engaged in examination/calibration in the laboratory, especially personnel in training (who are engaged in examination/calibration). Quality supervisors shall be fully authorized, so that they can correctly understand the work responsibility in quality surveillance and have the rights for 'full supervision'. The laboratory shall also arrange necessary training for the quality supervisors, so that they can obtain sufficient guidance in time and have the required qualifications or occupational prerequisites. (authors)

  20. Ignorance of electrosurgery among obstetricians and gynaecologists.

    Science.gov (United States)

    Mayooran, Zorana; Pearce, Scott; Tsaltas, Jim; Rombauts, Luk; Brown, T Ian H; Lawrence, Anthony S; Fraser, Kym; Healy, David L

    2004-12-01

    The purpose of this study was to assess the level of skill of laparoscopic surgeons in electrosurgery. Subjects were asked to complete a practical diathermy station and a written test of electrosurgical knowledge. Tests were held in teaching and non-teaching hospitals. Twenty specialists in obstetrics and gynaecology were randomly selected and tested on the Monash University gynaecological laparoscopic pelvi-trainer. Twelve candidates were consultants with 9-28 years of practice in operative laparoscopy, and 8 were registrars with up to six years of practice in operative laparoscopy. Seven consultants and one registrar were from rural Australia, and three consultants were from New Zealand. Candidates were marked with checklist criteria resulting in a pass/fail score, as well as a weighted scoring system. We retested 11 candidates one year later with the same stations. There was no improvement in electrosurgery skill after one year of obstetric and gynaecological practice. No candidate successfully completed the written electrosurgery station in the initial test. A slight improvement in the pass rate to 18% was observed in the second test. The pass rate of the diathermy station dropped from 50% to 36% in the second test. The study found ignorance of electrosurgery/diathermy among gynaecological surgeons. One year later, skills were no better.

  1. Author Details

    African Journals Online (AJOL)

    Kaggwa, JK. Vol 9, No 3 (2012) - Articles Transgender in Africa: Invisible, inaccessible, or ignored? Abstract PDF. ISSN: 1813-4424.

  2. Beyond duplicity and ignorance in global fisheries

    Directory of Open Access Journals (Sweden)

    Daniel Pauly

    2009-06-01

    Full Text Available The three decades following World War II were a period of rapidly increasing fishing effort and landings, but also of spectacular collapses, particularly in small pelagic fish stocks. This is also the period in which a toxic triad of catch underreporting, ignoring scientific advice and blaming the environment emerged as the standard response to ongoing fisheries collapses, which became increasingly more frequent, finally engulfing major North Atlantic fisheries. The response to the depletion of traditional fishing grounds was an expansion of North Atlantic (and generally of northern hemisphere) fisheries in three dimensions: southward, into deeper waters and into new taxa, i.e. catching and marketing species of fish and invertebrates previously spurned, and usually lower in the food web. This expansion provided many opportunities for mischief, as illustrated by the European Union's negotiated 'agreements' for access to the fish resources of Northwest Africa, China's agreement-free exploitation of the same, and Japan blaming the resulting resource declines on the whales. Also, this expansion provided new opportunities for mislabelling seafood unfamiliar to North Americans and Europeans, and misleading consumers, thus reducing the impact of seafood guides and similar efforts toward sustainability. With fisheries catches declining, aquaculture—despite all public relations efforts—not being able to pick up the slack, and rapidly increasing fuel prices, structural changes are to be expected in both the fishing industry and the scientific disciplines that study it and influence its governance. Notably, fisheries biology, now predominantly concerned with the welfare of the fishing industry, will have to be converted into fisheries conservation science, whose goal will be to resolve the toxic triad alluded to above, and thus maintain the marine biodiversity and ecosystems that provide existential services to fisheries. Similarly, fisheries

  3. Learning to ignore: acquisition of sustained attentional suppression.

    Science.gov (United States)

    Dixon, Matthew L; Ruppel, Justin; Pratt, Jay; De Rosa, Eve

    2009-04-01

    We examined whether the selection mechanisms committed to the suppression of ignored stimuli can be modified by experience to produce a sustained, rather than transient, change in behavior. Subjects repeatedly ignored the shape of stimuli, while attending to their color. On subsequent attention to shape, there was a robust and sustained decrement in performance that was selective to when shape was ignored across multiple-color-target contexts, relative to a single-color-target context. Thus, amount of time ignored was not sufficient to induce a sustained performance decrement. Moreover, in this group, individual differences in initial color target selection were associated with the subsequent performance decrement when attending to previously ignored stimuli. Accompanying this sustained decrement in performance was a transfer in the locus of suppression from an exemplar (e.g., a circle) to a feature (i.e., shape) level of representation. These data suggest that learning can influence attentional selection by sustained attentional suppression of ignored stimuli.

  4. On strategic ignorance of environmental harm and social norms

    DEFF Research Database (Denmark)

    Thunström, Linda; van 't Veld, Klaas; Shogren, Jason

    , and that they use ignorance as an excuse to engage in less pro-environmental behavior. It also predicts that the cost of ignorance increases if people can learn about the social norm from the information. We test the model predictions empirically with an experiment that involves an imaginary long-distance flight...... and an option to buy offsets for the flight’s carbon footprint. More than half (53 percent) of the subjects choose to ignore information on the carbon footprint alone before deciding their offset purchase, but ignorance significantly decreases (to 29 percent) when the information additionally reveals the social...

  5. Malassezia-Can it be ignored?

    Directory of Open Access Journals (Sweden)

    Ambujavalli Balakrishnan Thayikkannu

    2015-01-01

    Full Text Available The genus Malassezia comprises 14 species of "yeast-like fungi," 13 of which are lipophilic and 1 is nonlipophilic. They are known commensals and, in predisposed individuals, they commonly cause a spectrum of chronic recurrent infections. They also rarely cause serious illnesses like catheter-related bloodstream infections, CAPD-associated peritonitis, etc. Though these fungi have been known to man for over 150 years, their fastidious nature and cumbersome culture and speciation techniques have restricted research. Since the last taxonomic revision, seven new species have been added to this genus. Their ability to evade the host immune system and their virulence have increased the spectrum of the diseases caused by them. These agents have recently been implicated as causal agents in common diseases like atopic dermatitis. Though culture-based research is difficult, new molecular analysis techniques and facilities have increased research in this field, such that we can devote more attention to this genus to study in detail their characteristics and their growing implications in the clinical scenario.

  6. Author Details

    African Journals Online (AJOL)


  7. Author Details

    African Journals Online (AJOL)

    ... An algorithm to retrieve Land Surface Temperature using Landsat-8 Dataset Abstract PDF. ISSN: 2225-8531.

  8. The cost of ignoring acute cholecystectomy.

    Science.gov (United States)

    Garner, J P; Sood, S K; Robinson, J; Barber, W; Ravi, K

    2009-01-01

    Biliary symptoms whilst awaiting elective cholecystectomy are common, resulting in hospital admission, further investigation and increased hospital costs. Immediate cholecystectomy during the first admission is safe and effective, even when performed laparoscopically, but acute laparoscopic cholecystectomy has only recently become increasingly commonplace in the UK. This study was designed to quantify this problem in our hospital and its cost implications. The case notes of all patients undergoing laparoscopic cholecystectomy in our hospital between January 2004 and June 2005 were examined for details of hospital admissions with biliary symptoms or complications whilst waiting for elective cholecystectomy. Additional bed occupancy and radiological investigations were recorded and these costs to the trust calculated. We compared the potential tariff income to the hospital trust for the actual management of these patients and if a policy of acute laparoscopic cholecystectomy on first admission were in place. In the 18-month study period, 259 patients (202 females) underwent laparoscopic cholecystectomy. Of these, 147 presented as out-patients and only 11% required hospital admission because of biliary symptoms whilst waiting for elective surgery. There were 112 patients who initially presented acutely and were managed conservatively. Twenty-four patients were re-admitted 37 times, which utilised 231 hospital bed-days and repeat investigations costing over 40,000 pounds. There would have been a marginal increase in tariff income if a policy of acute laparoscopic cholecystectomy had been in place. Adoption of a policy of acute laparoscopic cholecystectomy on the index admission would result in substantial cost savings to the trust, reduce elective cholecystectomy waiting times and increase tariff income.

  9. Author Details

    African Journals Online (AJOL)

    Details PDF · Vol 22, No 2 (1999) - Articles Vegetation under different tree species in Acacia woodland in the Rift Valley of Ethiopia Details PDF · Vol 22, No 2 (1999) - Articles Preliminary evaluation of Phytomyza orobanchia (Diptera: Agromyzidae) as a controller of Orobanche spp in Ethiopia Details PDF. ISSN: 2520–7997.

  10. Should we ignore U-235 series contribution to dose?

    International Nuclear Information System (INIS)

    Beaugelin-Seiller, Karine; Goulet, Richard; Mihok, Steve; Beresford, Nicholas A.

    2016-01-01

    Environmental Risk Assessment (ERA) methodology for radioactive substances is an important regulatory tool for assessing the safety of licensed nuclear facilities for wildlife, and the environment as a whole. ERAs are therefore expected to be both fit for purpose and conservative. When uranium isotopes are assessed, there are many radioactive decay products which could be considered. However, risk assessors usually assume 235U and its daughters contribute negligibly to radiological dose. The validity of this assumption has not been tested: what might the 235U family contribution be and how does the estimate depend on the assumptions applied? In this paper we address this question by considering aquatic wildlife in Canadian lakes exposed to historic uranium mining practices. A full theoretical approach was used, in parallel to a more realistic assessment based on measurements of several elements of the U decay chains. The 235U family contribution varied between about 4% and 75% of the total dose rate depending on the assumptions of the equilibrium state of the decay chains. Hence, ignoring the 235U series will not result in conservative dose assessments for wildlife. These arguments provide a strong case for more in situ measurements of the important members of the 235U chain and for its consideration in dose assessments. - Highlights: • Realistic ecological risk assessment infers a complete inventory of radionuclides. • U-235 family may not be minor when assessing total dose rates experienced by biota. • There is a need to investigate the real state of equilibrium decay of U chains. • There is a need to improve the capacity to measure all elements of the U decay chains.
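    A back-of-envelope sketch of the bookkeeping behind the question in this record; all activity concentrations and dose coefficients below are hypothetical placeholders, not measured values or published coefficients.

      # Activity concentrations (Bq/kg) and dose coefficients are placeholders.
      activity = {
          "U-238 series": {"U-238": 50.0, "U-234": 50.0, "Th-230": 40.0,
                           "Ra-226": 30.0, "Pb-210": 25.0, "Po-210": 25.0},
          "U-235 series": {"U-235": 2.3, "Pa-231": 2.0, "Ac-227": 1.8},
      }
      dose_coeff = {              # (uGy/h) per (Bq/kg), hypothetical values
          "U-238": 1.0e-4, "U-234": 1.1e-4, "Th-230": 1.0e-4, "Ra-226": 1.2e-4,
          "Pb-210": 0.3e-4, "Po-210": 1.9e-4,
          "U-235": 1.0e-4, "Pa-231": 1.2e-4, "Ac-227": 1.6e-4,
      }

      dose = {chain: sum(a * dose_coeff[nuclide] for nuclide, a in members.items())
              for chain, members in activity.items()}
      total = sum(dose.values())
      for chain, d in dose.items():
          print(f"{chain}: {d:.2e} uGy/h  ({d / total:.1%} of the total)")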

  11. On strategic ignorance of environmental harm and social norms

    DEFF Research Database (Denmark)

    Thunström, Linda; van’t Veld, Klaas; Shogren, Jason. F.

    2014-01-01

    Are people strategically ignorant of the negative externalities their activities cause the environment? Herein we examine if people avoid costless information on those externalities and use ignorance as an excuse to reduce pro-environmental behavior. We develop a theoretical framework in which people feel internal pressure (“guilt”) from causing harm to the environment (e.g., emitting carbon dioxide) as well as external pressure to conform to the social norm for pro-environmental behavior (e.g., offsetting carbon emissions). Our model predicts that people may benefit from avoiding information...... decreases (to 29 percent) when the information additionally reveals the share of air travelers who buy carbon offsets. We find evidence that some people use ignorance as an excuse to reduce pro-environmental behavior—ignorance significantly decreases the probability of buying carbon offsets.

  12. Author Details

    African Journals Online (AJOL)

    Petrology of the Cenomanian Upper Member of the Mamfe Embayment, southwestern Cameroon Details · Vol 38, No 1 (2002) - Articles Sequence stratigraphy of Iso field, western onshore Niger Delta, Nigeria Details · Vol 39, No 2 (2003) - Articles Preliminary studies on the lithostratigraphy and depositional environment of ...

  13. Author Details

    African Journals Online (AJOL)

    An Overview of Africa's Marine Resources: Their Utilization and Sustainable Management Details · Vol 12, No 3 (2000) - Articles EDITORIAL Ganoderma Lucidum - Paramount among Medicinal Mushrooms. Details · Vol 15, No 3 (2003) - Articles Editorial: Africa's Mushrooms: A neglected bioresource whose time has come

  14. Author Details

    African Journals Online (AJOL)

    ... Abstract PDF · Vol 3, No 6 (2011) - Articles Mixed convection flow and heat transfer in a vertical wavy channel containing porous and fluid layer with traveling thermal waves. Abstract PDF · Vol 3, No 8 ...

  15. Author Details

    African Journals Online (AJOL)

    ... Singh, J. Vol 3, No 2 (2011) - Articles Plane waves in a rotating generalized thermo-elastic solid with voids. Abstract PDF. ISSN: 2141-2839.

  16. Author Details

    African Journals Online (AJOL)

    ... Vol 12 (2008) - Articles On the wave equations of shallow water with rough bottom topography. Abstract · Vol 14 (2009) - Articles Energy generation in a plant due to variable sunlight intensity

  17. Author Details

    African Journals Online (AJOL)

    ... Iliopsoas haematoma in a rugby player. Abstract PDF · Vol 29, No 1 (2017) - Articles The use of negative pressure wave treatment in athlete recovery. Abstract PDF. ISSN: 2078-516X.

  18. Author Details

    African Journals Online (AJOL)

    ... Ismail, A. Vol 9, No 3S (2017): Special Issue - Articles Investigate of wave absorption performance for oil palm frond and empty fruit bunch at 5.8 GHz. Abstract PDF · Vol 9, No 3S (2017): Special Issue ...

  19. Author Details

    African Journals Online (AJOL)

    ... Isa, M.F.M.. Vol 9, No 3S (2017): Special Issue - Articles Experimental and numerical investigation on blast wave propagation in soil structure. Abstract PDF · Vol 9, No 3S (2017): Special Issue - ...

  20. Author Details

    African Journals Online (AJOL)

    ... No 3S (2017): Special Issue - Articles Experimental and numerical investigation on blast wave propagation in soil structure. Abstract PDF · Vol 9, No 3S (2017): Special Issue - Articles Simulation on ...

  1. Author Details

    African Journals Online (AJOL)

    ... Duwa, S S. Vol 8 (2004) - Articles Lower hybrid waves instability in a velocity–sheared inhomogenous charged dust beam. Abstract · Vol 9 (2005) - Articles The slide away theory of lower hybrid bursts

  2. Author Details

    African Journals Online (AJOL)

    ... Vol 45 (2016) - Articles From vectors to waves and streams: An alternative approach to semantic maps1. Abstract PDF · Vol 48 (2017) - Articles Introduction: 'n Klein ietsie for Johan Oosthuizen

  3. Author Details

    African Journals Online (AJOL)

    ... to blast loadings. Abstract PDF · Vol 9, No 3S (2017): Special Issue - Articles Experimental and numerical investigation on blast wave propagation in soil structure. Abstract PDF. ISSN: 1112-9867.

  4. Author Details

    African Journals Online (AJOL)

    ... The use of negative pressure wave treatment in athlete recovery. Abstract PDF · Vol 29, No 1 (2017) - Articles The prevalence, risk factors predicting injury and the severity of injuries sustained during ...

  5. Author Details

    African Journals Online (AJOL)

    ... Vol 29, No 1 (2017) - Articles The use of negative pressure wave treatment in athlete recovery. Abstract PDF · Vol 29, No 1 (2017) - Articles The prevalence, risk factors predicting injury and the ...

  6. Author Details

    African Journals Online (AJOL)

    ... (2013) - Articles Technical Note: Development of a Photobioreactor for Microalgae Culture ... Design, Construction and Evaluation of Motorized Okra Slicer Abstract PDF ...

  7. Author Details

    African Journals Online (AJOL)

    ... No 1 (2014) - Articles Knowledge and Attitudes towards Basic Cardiopulmonary Resuscitation (CPR) among Community Nurses in Remo Area of Ogun State, Nigeria

  8. Author Details

    African Journals Online (AJOL)

    ... Optical bus of centralized relay protection and automation system of medium voltage switchgear for data collection and transmission. Abstract PDF. ISSN: 1112- ...

  9. Author Details

    African Journals Online (AJOL)

    ... The prevention of mother-to-child HIV transmission programme and infant feeding practices ... Evaluation of a diagnostic algorithm for smear-negative pulmonary tuberculosis in ...

  10. Author Details

    African Journals Online (AJOL)

    ... Design and Implementation of an M/M/1 Queuing Model Algorithm and its Applicability in ... Vehicle Identification Technology to Intercept Small Arms and Ammunition on Nigeria Roads

  11. Author Details

    African Journals Online (AJOL)

    Mahapatra, S. Vol 2, No 5 (2010) - Articles Modeling, simulation and parametric optimization of wire EDM process using response surface methodology coupled with grey-Taguchi technique. Abstract PDF. ISSN: 2141-2839.

  12. Author Details

    African Journals Online (AJOL)

    Benmeziane, S. Vol 11, No 2 (2001) - Articles A Java-Web-Based-Learning Methodology, Case Study : Waterborne diseases. Abstract. ISSN: 1111-0015.

  13. Author Details

    African Journals Online (AJOL)

    Waste Management Policy Implementation in South Africa: An Emerging Stakeholder Participation Paradox Abstract PDF · Vol 20 (2003) - Articles Book Review: Understanding Environmental Policy Processes: Cases from Africa Abstract PDF · Vol 23 (2006) - Articles Actor/Actant-Network Theory as Emerging Methodology ...

  14. Author Details

    African Journals Online (AJOL)

    Abstract PDF · Vol 9, No 1 (2004) - Articles Developing a methodology for sustainable production of improved animal breeds. Abstract PDF · Vol 9, ... Abstract PDF · Vol 9, No 1 (2004) - Articles Assessment of the potential productivity of pigs in the Teso and Lango farming systems, Uganda: A case study. Abstract PDF · Vol 9, ...

  15. Author Details

    African Journals Online (AJOL)

    Ross, E. Vol 104, No 1 (2014) - Articles Parents' perceptions of HIV counselling and testing in schools: Ethical, legal and social implications. Abstract PDF · Vol 104, No 5 (2014) - Articles Parents' perceptions of HIV counselling and testing in schools: Study methodology deeply flawed. Abstract PDF. ISSN: 0256-95749.

  16. Author Details

    African Journals Online (AJOL)

    Chukwuokolo, J. Chidozie. Vol 6, No 2 (2017) - Articles Methodological anarchism or pluralism? An afro-constructivist perspective on Paul Feyerabend's critique of science. Abstract. ISSN: 2408-5987.

  17. Author Details

    African Journals Online (AJOL)

    Nyarko, KB. Vol 2, No 2 (2004) - Articles A methodology for assessing conditions of water assets in small towns in Ghana Abstract

  18. Author Details

    African Journals Online (AJOL)

    Njubi, Francis. Vol 15, No 1 (2001): Media Freedom and Human Rights - Articles New Media, Old Struggles: Pan Africanism, Anti-racism and Information Technology Details. ISSN: 0256-004.

  19. Author Details

    African Journals Online (AJOL)

    Radwan M.D, Mona Ahmed. Vol 12, No 1 (2000) - Articles RELAPSING REMITTING MULTIPLE SCLEROSIS: CT AND MRI IMAGING VS CLINICAL FINDINGS IN THE DIAGNOSIS AND DETERMINATION OF DISEASE ACTIVITY. Details. ISSN: 1110-5607.

  20. Author Details

    African Journals Online (AJOL)

    Comarof, Jean. Vol 1999, No 3-4 (1999) - Articles Alien-Nation: Zombies, Immigrants and Millennial Capitalism Details. ISSN: 0850-8712.

  1. Author Details

    African Journals Online (AJOL)

    NENTY, N. JOHNSON. Vol 7, No 3 (2001) - Articles Common errors and performance of students in junior secondary mathematics certificate examinations in Cross River State, Nigeria Details PDF. ISSN: 1118-0579.

  2. Author Details

    African Journals Online (AJOL)

    A Preliminary Investigation of Relative Frequency of Undiagnosed and Previously Diagnosed Hypertension Before First Stroke in a Lagos Hospital Abstract · Vol 9, No 4 (1999) - Articles Localised tetanus in Lagos, Nigeria Details · Vol 9, No 4 (1999) - Articles Stroke with localised infarction of Wernicke's Area misdiagnosed ...

  3. Author Details

    African Journals Online (AJOL)

    SAMA, G. Vol 2 (2002): Supplement - Articles A Longitudinal Study of the Role of T Cell subset, Th1/Th2 cytokines and antiplasmodial antibodies in uncomplicated Malaria in a Village Population Chronically Exposed to Plasmodium falciparum Malaria. Details PDF

  4. Author Details

    African Journals Online (AJOL)

    QUAKYI, A.I.. Vol 2 (2002): Supplement - Articles A Longitudinal Study of the Role of T Cell subset, Th1/Th2 cytokines and antiplasmodial antibodies in uncomplicated Malaria in a Village Population Chronically Exposed to Plasmodium falciparum Malaria. Details PDF

  5. Author Details

    African Journals Online (AJOL)

    KOUONTCHOU, Samuel. Vol 2 (2002): Supplement - Articles Prevalence of Multiple Concomitant Intestinal Parasitic Infections in Simbok a Malaria Endemic Village in Cameroon. Details PDF · Vol 2 (2002): Supplement - Articles A Longitudinal Study of the Role of T Cell subset, Th1/Th2 cytokines and antiplasmodial ...

  6. Author Details

    African Journals Online (AJOL)

    ALAKE, J. Vol 2 (2002): Supplement - Articles A Longitudinal Study of the Role of T Cell subset, Th1/Th2 cytokines and antiplasmodial antibodies in uncomplicated Malaria in a Village Population Chronically Exposed to Plasmodium falciparum Malaria. Details PDF

  7. Author Details

    African Journals Online (AJOL)

    Rakotonirina, Alice. Vol 2, No 2 (2002) - Articles Effect of the decoction of rhizomes of Cyperus articulatus on bicuculline-, N-methyl-D-aspartate- and strychnine-induced behavioural excitation and convulsions in mice. Details PDF

  8. Author Details

    African Journals Online (AJOL)

    Love, Alison. Vol 29, No 2 (2002) - Articles Policy-makers, the Press and Politics: Reporting a Public Policy Document Details. ISSN: 0379-0622.

  9. Author Details

    African Journals Online (AJOL)

    Focho, DA. Vol 2, No 1 (2002) - Articles Observations on the Meiotic Process in the African Pest Grasshopper Taphronota thaelephora Stal. (Orthoptera : Pyrgomorphidae) Details PDF

  10. Author Details

    African Journals Online (AJOL)

    Idowu, OO. Vol 8, No 1 (2003) - Articles Evaluation of Different Substrates and Combinations on the Growth of Pleurotus pulmonarius (Fries) Quelet (Sajor-caju) Details. ISSN: 1118-2733.

  11. Author Details

    African Journals Online (AJOL)

    Ligthelm, A.A.. Vol 5, No 2 (2001) - Articles Community attitudes towards Casinos and the estimated magnitude of problem gambling The Mpumalanga case. Details PDF. ISSN: 1027-4332.

  12. Author Details

    African Journals Online (AJOL)

    Kioni, P N. Vol 9, No 1 (2007) - Articles Detailed structure of pipe flow with water hammer oscillations. Abstract. ISSN: 1561-7645.

  13. Author Details

    African Journals Online (AJOL)

    Development of a trap to contaminate variegated grasshoppers (Zonocerus variegatus L.) (Orthoptera: Pyrgomorphidae) with Metarrhyzium flavo-viride Gams & Rozsypal in the field. Details · Vol 40, No 1 (2007) - Articles Yam pests in the Ashanti and Brong Ahafo regions of Ghana: A study of farmers' indigenous technical ...

  14. Author Details

    African Journals Online (AJOL)

    Brown, Duncan. Vol 16, No 2 (2002): Continental Africans & the Question of Identity - Articles Environment and Identity: Douglas Livingstone's A Littoral Zone Details. ISSN: 0256-004.

  15. Author Details

    African Journals Online (AJOL)

    Aderinokun, GA. Vol 9, No 1 (1999) - Articles Relative Influence Of Sociodemographic Variables On Oral Health And Habits Of Some Nigerian School Children Abstract · Vol 9, No 4 (1999) - Articles Oral health services in Nigeria Details. ISSN: 0189-2657.

  16. Author Details

    African Journals Online (AJOL)

    EKPA, O. D.. Vol 7, No 2 (2001) - Articles Varietal differences and polymorphism in palm oil: a case study of palm oils blended with coconut oil. Details PDF. ISSN: 1118-0579.

  17. Author Details

    African Journals Online (AJOL)

    SONUGA, F A. Vol 6, No 1 (2000) - Articles Geophysical investigation of Karkarku earthdam embankment. Details. ISSN: 1118-0579.

  18. Author Details

    African Journals Online (AJOL)

    Geotechnical properties of lateritic soil developed over quartz schist in Ishara area, south western Nigeria Details · Vol 44, No 1 (2008) - Articles Comparative study of the influence of cement and lime stabilization on geotechnical properties of lateritic soil derived from pegmatite in Ago-Iwoye area, southwestern Nigeria

  19. Author Details

    African Journals Online (AJOL)

    McCarthy, Greg. Vol 15, No 1 (2001): Media Freedom and Human Rights - Articles Caught between Empires: Ambivalence in Australian Films Details. ISSN: 0256-004.

  20. Author Details

    African Journals Online (AJOL)

    Legwaila, GM. Vol 12 (2003) - Articles Review of sweet sorghum: a potential cash and forage crop in Botswana Details. ISSN: 1021-0873.

  1. Author Details

    African Journals Online (AJOL)

    Admasu, Assefa. Vol 22, No 2 (1999) - Articles Preliminary evaluation of Phytomyza orobanchia (Diptera: Agromyzidae) as a controller of Orobanche spp in Ethiopia Details PDF. ISSN: 2520–7997.

  2. Author Details

    African Journals Online (AJOL)

    ... Okeke, EO. Vol 10 (2006) - Articles Analysis of Stokes waves theory as a diffusion problem. Abstract · Vol 11 (2007) - Articles On the impact of wave-current on Stokes waves. Abstract. ISSN: 1116-4336.

  3. Author Details

    African Journals Online (AJOL)

    ... Obtaining the green's function for electromagnetic waves propagating in layered in-homogeneous thin film media of spherical particles on a substrate. Abstract · Vol 20, No 2 (2008) - Articles solution growth and ...

  4. Author Details

    African Journals Online (AJOL)

    ... Abstract · Vol 17 (2010) - Articles Investigating The Travelling Wave Solution For an SIR Endemic Disease Model With No Disease Related Death (When The Spatial Spread Of The Susceptible Is Not Negligible). Abstract.

  5. Author Details

    African Journals Online (AJOL)

    ... Vol 8 (2004) - Articles Further on stokes expansions for the finite amplitude water waves. Abstract · Vol 11 (2007) - Articles On the effects of wave steepness on higher order Stokes waves. Abstract. ISSN: 1116-4336.

  6. Author Details

    African Journals Online (AJOL)

    Akum, ZE. Vol 1, No 3 (2001) - Articles Basic home range characteristics for the conservation of the African grey parrot in the Korup national park, Cameroon Details PDF

  7. Author Details

    African Journals Online (AJOL)

    Bobcokono, Irene Yatabene. Vol 1, No 1 (2001) - Articles Utilisation du papier filtre dans la gestion de programme de lute contre le SIDA au Cameroun Details PDF

  8. Author Details

    African Journals Online (AJOL)

    Lema, VM. Vol 80, No 9 (2003): - Articles Fournier's gangrene complicating vasectomy. Details PDF · Vol 86, No 6 (2009) - Articles Therapeutic misconception and clinical trials in sub-saharan Africa: A review. Abstract PDF · Vol 86, No 11 (2009) - Articles HIV/AIDS and pregnancy-related deaths in Blantyre, Malawi

  9. Author Details

    African Journals Online (AJOL)

    Green, J.M.. Vol 10, No 1 (2001) - Articles Information from Radio Telemetry on movements and exploitation of naturalized Rainbow trout, Oncorhynchus mykiss (Walbaum), in Kenya cold water streams. Details. ISSN: 0002-0036.

  10. Author Details

    African Journals Online (AJOL)

    Erasmus, GJ. Vol 1, No 1 (2001) - Articles Genetic parameter estimates for growth traits in purebred Gudali and two-breed synthetic Wakwa beef cattle in a tropical environment. Details PDF

  11. Author Details

    African Journals Online (AJOL)

    Odigie, IP. Vol 10, No 4 (2000) - Articles High dose vitamin E administration attenuates hypertension in 2-Kidney 1 Clip Goldblatt hypertensive rats. Details. ISSN: 0189-2657.

  12. Author Details

    African Journals Online (AJOL)

    Motabagani, MA. Vol 80, No 9 (2003): - Articles Anomalies of the renal, phrenic and suprarenal arteries: Case Report Details PDF · Vol 81, No 3 (2004): - Articles Morphological study of the uncommon rectus sterni muscle in German cadavers. Abstract PDF. ISSN: 0012-835X.

  13. Author Details

    African Journals Online (AJOL)

    Ibeabuchi, NM. Vol 10, No 3 (2000) - Articles Comparison of the effects of Methylsalicylate Cream with cryotherapy on delayed onset muscle soreness. Details · Vol 22, No 2 (2012) - Articles X-ray Pelvimetry And Labour Outcome In Term Pregnancy In A Rural Nigerian Population Abstract. ISSN: 0189-2657.

  14. Author Details

    African Journals Online (AJOL)

    Warnorff, DK. Vol 13, No 4 (2001) - Articles Development of a scoring system for the diagnosis of tuberculous lymphadenitis. Details PDF. ISSN: 1995-7262.

  15. Willful Ignorance and the Death Knell of Critical Thought

    Science.gov (United States)

    Rubin, Daniel Ian

    2018-01-01

    Independent, critical thought has never been more important in the United States. In the Age of Trump, political officials spout falsehoods called "alternative facts" as if they were on equal footing with researchable, scientific data. At the same time, an unquestioning populace engages in acts of "willful ignorance" on a daily…

  16. Tunnel Vision: New England Higher Education Ignores Demographic Peril

    Science.gov (United States)

    Hodgkinson, Harold L.

    2004-01-01

    This author states that American higher education ignores about 90 percent of the environment in which it operates. Colleges change admissions requirements without even informing high schools in their service areas. Community college graduates are denied access to four-year programs because of policy changes made only after it was too late for the…

  17. Academic detailing.

    Science.gov (United States)

    Shankar, P R; Jha, N; Piryani, R M; Bajracharya, O; Shrestha, R; Thapa, H S

    2010-01-01

    There are a number of sources available to prescribers to stay up to date about medicines. Prescribers in rural areas of developing countries, however, may not be able to access some of them. Interventions to improve prescribing can be educational, managerial or regulatory, or use a mix of strategies. Detailing by the pharmaceutical industry is widespread. Academic detailing (AD) has classically been seen as a form of continuing medical education in which a trained health professional such as a physician or pharmacist visits physicians in their offices to provide evidence-based information. Face-to-face sessions, preferably on an individual basis, clear educational and behavioural objectives, establishing credibility with respect to objectivity, stimulating physician interaction, use of concise graphic educational materials, highlighting key messages and, when possible, providing positive reinforcement of improved practices in follow-up visits can increase the success of AD initiatives. AD is common in developed countries and certain examples have been cited in this review. In developing countries the authors have come across reports of AD in Pakistan, Sudan, Argentina and Uruguay, Bihar state in India, Zambia, Cuba, Indonesia and Mexico. AD has had a consistent, small but potentially significant impact on prescribing practices. AD has far fewer resources at its command than the efforts of the industry. Steps have to be taken to formally start AD in Nepal, and there may be specific hindering factors similar to those in other developing nations.

  18. Professional orientation and pluralistic ignorance among jail correctional officers.

    Science.gov (United States)

    Cook, Carrie L; Lane, Jodi

    2014-06-01

    Research about the attitudes and beliefs of correctional officers has historically been conducted in prison facilities while ignoring jail settings. This study contributes to our understanding of correctional officers by examining the perceptions of those who work in jails, specifically measuring professional orientations about counseling roles, punitiveness, corruption of authority by inmates, and social distance from inmates. The study also examines whether officers are accurate in estimating these same perceptions of their peers, a line of inquiry that has been relatively ignored. Findings indicate that the sample was concerned about various aspects of their job and the management of inmates. Specifically, officers were uncertain about adopting counseling roles, were somewhat punitive, and were concerned both with maintaining social distance from inmates and with an inmate's ability to corrupt their authority. Officers also misperceived the professional orientation of their fellow officers and assumed their peer group to be less progressive than they actually were.

  19. 'More is less'. The tax effects of ignoring flow externalities

    International Nuclear Information System (INIS)

    Sandal, Leif K.; Steinshamn, Stein Ivar; Grafton, R. Quentin

    2003-01-01

    Using a model of non-linear, non-monotone decay of the stock pollutant, and starting from the same initial conditions, the paper shows that an optimal tax that corrects for both stock and flow externalities may result in a lower tax, fewer cumulative emissions (less decay in emissions) and higher output at the steady state than a corrective tax that ignores the flow externality. This 'more is less' result emphasizes that setting a corrective tax that ignores the flow externality, or imposing a corrective tax at too low a level where there exists only a stock externality, may affect both transitory and steady-state output, tax payments and cumulative emissions. The result has important policy implications for decision makers setting optimal corrective taxes and targeted emission limits whenever stock externalities exist
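
    As a rough illustration of the kind of dynamics at stake, the sketch below integrates a stock pollutant with a non-linear, non-monotone decay term and compares the long-run stock implied by two constant emission levels, standing in for the lower- and higher-emission outcomes the abstract contrasts. The decay function, parameters and emission levels are hypothetical and are not the model of Sandal, Steinshamn and Grafton.

      import numpy as np

      def decay(S):
          # Hypothetical natural decay of the stock: roughly linear for small S,
          # peaking and then falling as S grows (non-linear, non-monotone).
          return 0.3 * S * np.exp(-S / 50.0)

      def steady_stock(E, S0=10.0, dt=0.1, T=400.0):
          """Euler integration of dS/dt = E - decay(S) for a constant emission flow E."""
          S = S0
          for _ in range(int(T / dt)):
              S += dt * (E - decay(S))
          return S

      # Lower emissions (tax prices both the flow and the stock externality)
      # versus higher emissions (tax ignores the flow externality).
      for E, label in [(2.0, "stock + flow tax"), (3.0, "stock-only tax")]:
          print(f"{label}: long-run stock ~ {steady_stock(E):.1f}")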

  20. Egoism, ignorance and choice : on society's lethal infection

    OpenAIRE

    Camilleri, Jonathan

    2015-01-01

    The ability to choose and our innate selfish, or rather, self-preservative urges are a recipe for disaster. Combining this with man's ignorance by definition and especially his general refusal to accept it, inevitably leads to Man's demise as a species. It is our false notion of freedom which contributes directly to our collective death, and therefore, man's trying to escape death is, in the largest of ways, counterproductive.

  1. The importance of ignoring: Alpha oscillations protect selectivity

    OpenAIRE

    Payne, Lisa; Sekuler, Robert

    2014-01-01

    Selective attention is often thought to entail an enhancement of some task-relevant stimulus or attribute. We discuss the perspective that ignoring irrelevant, distracting information plays a complementary role in information processing. Cortical oscillations within the alpha (8–14 Hz) frequency band have emerged as a marker of sensory suppression. This suppression is linked to selective attention for visual, auditory, somatic, and verbal stimuli. Inhibiting processing of irrelevant input mak...

  2. Maggots in the Brain: Sequelae of Ignored Scalp Wound.

    Science.gov (United States)

    Aggarwal, Ashish; Maskara, Prasant

    2018-01-01

    A 26-year-old male had suffered a burn injury to his scalp in childhood and ignored it. He presented with a complaint of something crawling on his head. Inspection of his scalp revealed multiple maggots on the brain surface with erosion of overlying bone and scalp. He was successfully managed by surgical debridement and regular dressing. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. On uncertainty in information and ignorance in knowledge

    Science.gov (United States)

    Ayyub, Bilal M.

    2010-05-01

    This paper provides an overview of working definitions of knowledge, ignorance, information and uncertainty, and summarises a formalised philosophical and mathematical framework for their analyses. It provides a comparative examination of the generalised information theory and the generalised theory of uncertainty. It summarises the foundational bases for assessing the reliability of knowledge constructed as a collective set of justified true beliefs. It discusses system complexity for ancestor simulation potentials. It offers value-driven communication means of knowledge and contrarian knowledge using memes and memetics.
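
    As a pointer to what such frameworks quantify, the sketch below computes two classical uncertainty measures that generalised information theory builds on: Shannon entropy (conflict among likelihoods) and the Hartley measure (nonspecificity). It is a textbook illustration, not a summary of Ayyub's formalism.

      import numpy as np

      def shannon_entropy(p):
          """Shannon entropy (bits) of a probability distribution: uncertainty
          expressed as conflict among graded likelihoods."""
          p = np.asarray(p, dtype=float)
          p = p[p > 0]
          return float(-(p * np.log2(p)).sum())

      def hartley_measure(n_alternatives):
          """Hartley measure (bits): uncertainty as pure nonspecificity, depending
          only on how many alternatives remain possible."""
          return float(np.log2(n_alternatives))

      print(shannon_entropy([0.7, 0.2, 0.1]))  # graded beliefs over three outcomes
      print(hartley_measure(3))                # only "one of three" is known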

  4. Exploitation of commercial remote sensing images: reality ignored?

    Science.gov (United States)

    Allen, Paul C.

    1999-12-01

    The remote sensing market is on the verge of being awash in commercial high-resolution images. Market estimates are based on the growing numbers of planned commercial remote sensing electro-optical, radar, and hyperspectral satellites and aircraft. EarthWatch, Space Imaging, SPOT, and RDL among others are all working towards launch and service of one to five meter panchromatic or radar-imaging satellites. Additionally, new advances in digital air surveillance and reconnaissance systems, both manned and unmanned, are also expected to expand the geospatial customer base. Regardless of platform, image type, or location, each system promises images with some combination of increased resolution, greater spectral coverage, reduced turn-around time (request-to-delivery), and/or reduced image cost. For the most part, however, market estimates for these new sources focus on the raw digital images (from collection to the ground station) while ignoring the requirements for a processing and exploitation infrastructure comprised of exploitation tools, exploitation training, library systems, and image management systems. From this it would appear the commercial imaging community has failed to learn the hard lessons of national government experience, choosing instead to ignore reality and replicate the bias of collection over processing and exploitation. While this trend may not impact the small-quantity users that exist today, it will certainly adversely affect the mid- to large-sized users of the future.

  5. The Marley hypothesis: denial of racism reflects ignorance of history.

    Science.gov (United States)

    Nelson, Jessica C; Adams, Glenn; Salter, Phia S

    2013-02-01

    This study used a signal detection paradigm to explore the Marley hypothesis--that group differences in perception of racism reflect dominant-group denial of and ignorance about the extent of past racism. White American students from a midwestern university and Black American students from two historically Black universities completed surveys about their historical knowledge and perception of racism. Relative to Black participants, White participants perceived less racism in both isolated incidents and systemic manifestations of racism. They also performed worse on a measure of historical knowledge (i.e., they did not discriminate historical fact from fiction), and this group difference in historical knowledge mediated the differences in perception of racism. Racial identity relevance moderated group differences in perception of systemic manifestations of racism (but not isolated incidents), such that group differences were stronger among participants who scored higher on a measure of racial identity relevance. The results help illuminate the importance of epistemologies of ignorance: cultural-psychological tools that afford denial of and inaction about injustice.
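
    For readers unfamiliar with the paradigm, the sketch below shows the standard sensitivity index (d') that signal detection analyses of this kind rest on: the ability to discriminate actual items ("signal", e.g. historical fact) from foils ("noise", e.g. fiction). The counts are invented for illustration and are not the study's data.

      from scipy.stats import norm

      def d_prime(hits, misses, false_alarms, correct_rejections):
          """Sensitivity index d' for a yes/no detection task, with a simple
          correction (add 0.5 per cell) so rates of 0 or 1 stay finite."""
          hit_rate = (hits + 0.5) / (hits + misses + 1.0)
          fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
          return norm.ppf(hit_rate) - norm.ppf(fa_rate)

      # Hypothetical participant facing 25 real items and 25 fabricated foils.
      print(round(d_prime(hits=18, misses=7, false_alarms=4, correct_rejections=21), 2))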

  6. The end of ignorance multiplying our human potential

    CERN Document Server

    Mighton, John

    2008-01-01

    A revolutionary call for a new understanding of how people learn. The End of Ignorance conceives of a world in which no child is left behind – a world based on the assumption that each child has the potential to be successful in every subject. John Mighton argues that by recognizing the barriers that we have experienced in our own educational development, by identifying the moment that we became disenchanted with a certain subject and forever closed ourselves off to it, we will be able to keep these same barriers from standing in the way of our children. A passionate examination of our present education system, The End of Ignorance shows how we all can work together to reinvent the way that we are taught. John Mighton, the author of The Myth of Ability, is the founder of JUMP Math, a system of learning based on the fostering of emergent intelligence. The program has proved so successful that an entire class of Grade 3 students, including so-called slow learners, scored over 90% on a Grade 6 math test. A ...

  7. Lessons in Equality: From Ignorant Schoolmaster to Chinese Aesthetics

    Directory of Open Access Journals (Sweden)

    Ernest Ženko

    2017-09-01

    Full Text Available The postponement of equality is not only a recurring topic in Jacques Rancière’s writings, but also the most defining feature of modern Chinese aesthetics. Particularly in the period after 1980’s, when the country opened its doors to Western ideas, Chinese aesthetics extensively played a subordinate role in an imbalanced knowledge transfer, in which structural inequality was only reinforced. Aesthetics in China plays an important role and is expected not only to interpret literature and art, but also to help building a harmonious society within globalized world. This is the reason why some commentators – Wang Jianjiang being one of them – point out that it is of utmost importance to eliminate this imbalance and develop proper Chinese aesthetics. Since the key issue in this development is the problem of inequality, an approach developed by Jacques Rancière, “the philosopher of equality”, is proposed. Even though Rancière wrote extensively about literature, art and aesthetics, in order to confront the problem of Chinese aesthetics, it seems that a different approach, found in his repertoire, could prove to be more fruitful. In 1987, he published a book titled The Ignorant Schoolmaster, which contributed to his ongoing philosophical emancipatory project, and focused on inequality and its conditions in the realm of education. The Ignorant Schoolmaster, nonetheless, stretches far beyond the walls of classroom or even educational system, and brings to the fore political implications that cluster around the fundamental core of Rancière's political philosophy: the definition of politics as the verification of the presupposition of the equality of intelligence. Equality cannot be postponed as a goal to be only attained in the future and, therefore, has to be considered as a premise of egalitarian politics that needs to operate as a presupposition.   Article received: May 21, 2017; Article accepted: May 28, 2017; Published online

  8. Mes chers collègues, les moines, ou le partage de l’ignorance

    Directory of Open Access Journals (Sweden)

    Laurence Caillet

    2009-03-01

    Full Text Available Mes chers collègues, les moines, ou le partage de l’ignorance. Aucun statut ne m’a autant étonnée que celui de collègue qui me fut conféré par les moines du Grand monastère de l’Est, à Nara. Après avoir testé mes connaissances en matière de rituel, ces moines fort savants manifestèrent en effet, avec ostentation, leur ignorance. Pointant pour moi des détails liturgiques qu’ils tenaient pour incompréhensibles, ils prirent un plaisir évident à bavarder histoire et théologie, comme si je pouvais apporter quoi que ce soit. Cette mise en scène du caractère incompréhensible du rituel soulignait le caractère ineffable de cérémonies jadis accomplies au ciel par des entités supérieures. Je fournissais prétexte à décrire la vanité de l’érudition face à l’accomplissement des mystères et aussi l’importance de cette érudition pour renouer avec un sens originel irrémédiablement inconnaissable.My dear colleagues the monks, or the sharing of ignorance. No status has ever surprised me as much as that of “colleague” conferred on me by the monks of the Great Eastern Monastery of Nara. After testing my knowledge of ritual, these very learned monks made great show of their ignorance. Drawing my attention to liturgical details that they held to be incomprehensible, they took obvious pleasure in chatting about history and theology, as if I were capable of making the slightest contribution. This staging of the impenetrable nature of the ritual highlighted the ineffable character of the ceremonies performed in heaven long ago by superior beings. I provided a convenient pretext for describing the vanity of erudition in the face of the accomplishment of the mysteries, and also the importance of this erudition for renewing an original, irreparably unknowable meaning.

  9. The importance of ignoring: Alpha oscillations protect selectivity.

    Science.gov (United States)

    Payne, Lisa; Sekuler, Robert

    2014-06-01

    Selective attention is often thought to entail an enhancement of some task-relevant stimulus or attribute. We discuss the perspective that ignoring irrelevant, distracting information plays a complementary role in information processing. Cortical oscillations within the alpha (8-14 Hz) frequency band have emerged as a marker of sensory suppression. This suppression is linked to selective attention for visual, auditory, somatic, and verbal stimuli. Inhibiting processing of irrelevant input makes responses more accurate and timely. It also helps protect material held in short-term memory against disruption. Furthermore, this selective process keeps irrelevant information from distorting the fidelity of memories. Memory is only as good as the perceptual representations on which it is based, and on whose maintenance it depends. Modulation of alpha oscillations can be exploited as an active, purposeful mechanism to help people pay attention and remember the things that matter.
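
    As a concrete illustration of the frequency band in question, the sketch below estimates alpha-band (8-14 Hz) power from a signal with a zero-phase band-pass filter. It is a generic textbook computation on synthetic data, not the authors' analysis pipeline.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def alpha_band_power(x, fs, low=8.0, high=14.0, order=4):
          """Mean power of x within the alpha band, via a Butterworth band-pass
          filter applied forwards and backwards (zero phase)."""
          b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
          return float(np.mean(filtfilt(b, a, x) ** 2))

      # Synthetic example: a 10 Hz oscillation buried in broadband noise.
      fs = 250
      t = np.arange(0, 4, 1 / fs)
      x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
      print(alpha_band_power(x, fs))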

  10. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  11. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  12. What Is Hospitality in the Academy? Epistemic Ignorance and the (Im)possible Gift

    Science.gov (United States)

    Kuokkanen, Rauna

    2008-01-01

    The academy is considered by many as the major Western institution of knowledge. This article, however, argues that the academy is characterized by prevalent "epistemic ignorance"--a concept informed by Spivak's discussion of "sanctioned ignorance." Epistemic ignorance refers to academic practices and discourses that enable the continued exclusion…

  13. Experiences of Being Ignored by Peers during Late Adolescence: Linkages to Psychological Maladjustment

    Science.gov (United States)

    Bowker, Julie C.; Adams, Ryan E.; Fredstrom, Bridget K.; Gilman, Rich

    2014-01-01

    In this study on being ignored by peers, 934 twelfth-grade students reported on their experiences of being ignored, victimized, and socially withdrawn, and completed measures of friendship and psychological adjustment (depression, self-esteem, and global satisfaction). Peer nominations of being ignored, victimized, and accepted by peers were also…

  14. Hooking up: Gender Differences, Evolution, and Pluralistic Ignorance

    Directory of Open Access Journals (Sweden)

    Chris Reiber

    2010-07-01

    Full Text Available “Hooking-up” – engaging in no-strings-attached sexual behaviors with uncommitted partners - has become a norm on college campuses, and raises the potential for disease, unintended pregnancy, and physical and psychological trauma. The primacy of sex in the evolutionary process suggests that predictions derived from evolutionary theory may be a useful first step toward understanding these contemporary behaviors. This study assessed the hook-up behaviors and attitudes of 507 college students. As predicted by behavioral-evolutionary theory: men were more comfortable than women with all types of sexual behaviors; women correctly attributed higher comfort levels to men, but overestimated men's actual comfort levels; and men correctly attributed lower comfort levels to women, but still overestimated women's actual comfort levels. Both genders attributed higher comfort levels to same-gendered others, reinforcing a pluralistic ignorance effect that might contribute to the high frequency of hook-up behaviors in spite of the low comfort levels reported and suggesting that hooking up may be a modern form of intrasexual competition between females for potential mates.

  15. Sarajevo: Politics and Cultures of Remembrance and Ignorance

    Directory of Open Access Journals (Sweden)

    Adla Isanović

    2017-10-01

    Full Text Available This text critically reflects on cultural events organized to mark the 100th anniversary of the start of the First World War in Sarajevo and Bosnia & Herzegovina. It elaborates on disputes which showed that culture is in the centre of identity politics and struggles (which can also take a fascist nationalist form, accept the colonizer's perspective, etc.), on how commemorations 'swallowed' the past and present, but primarily contextualizes, historicizes and politicizes Sarajevo 2014 and its politics of visibility. This case is approached as an example and symptomatic of the effects of the current state of capitalism, coloniality, racialization and subjugation, as central to Europe today. Article received: June 2, 2017; Article accepted: June 8, 2017; Published online: October 15, 2017; Original scholarly paper How to cite this article: Isanović, Adla. "Sarajevo: Politics and Cultures of Remembrance and Ignorance." AM Journal of Art and Media Studies 14 (2017): 133-144. doi: 10.25038/am.v0i14.199

  16. Technology trends in econometric energy models: Ignorance or information?

    International Nuclear Information System (INIS)

    Boyd, G.; Kokkelenberg, E.; State Univ., of New York, Binghamton, NY; Ross, M.; Michigan Univ., Ann Arbor, MI

    1991-01-01

    Simple time trend variables in factor demand models can be statistically powerful variables, but may tell the researcher very little. Even more complex specifications of technical change, e.g. factor-biased, are still the econometrician's 'measure of ignorance' about the shifts that occur in the underlying production process. Furthermore, in periods of rapid technology change the parameters based on time trends may be too large for long run forecasting. When there is clearly identifiable engineering information about new technology adoption that changes the factor input mix, data on the technology adoption may be included in the traditional factor demand model to economically model specific factor-biased technical change and econometrically test its contribution. The adoption of thermomechanical pulping (TMP) and electric arc furnaces (EAF) represents two electricity-intensive technology trends in the Paper and Steel industries, respectively. This paper presents the results of including these variables in a traditional econometric factor demand model, which is based on the Generalized Leontief. The coefficients obtained for this 'engineering based' technical change compare quite favorably to engineering estimates of the impact of TMP and EAF on electricity intensities, improve the estimates of the other price coefficients, and yield a more believable long run electricity forecast. 6 refs., 1 fig
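
    The sketch below conveys the idea in a deliberately simplified, single-equation form: electricity intensity is regressed once on a bare time trend and once on an adoption share such as EAF penetration. The data are synthetic and the specification is only a stand-in for the Generalized Leontief system used in the paper.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      years = np.arange(1970, 1991)
      # Synthetic industry series: EAF adoption rises over time, prices drift,
      # and electricity intensity responds to both.
      eaf_share = np.clip(0.15 + 0.02 * (years - 1970) + rng.normal(0, 0.02, years.size), 0, 1)
      rel_price = 1.0 + rng.normal(0, 0.03, years.size).cumsum()
      elec_intensity = 5.0 - 1.2 * rel_price + 3.0 * eaf_share + rng.normal(0, 0.1, years.size)

      def estimate(shifter, name):
          X = sm.add_constant(pd.DataFrame({"rel_price": rel_price, name: shifter}))
          return sm.OLS(elec_intensity, X).fit()

      trend_model = estimate(years - years[0], "time_trend")  # technical change as a bare trend
      eaf_model = estimate(eaf_share, "eaf_share")            # "engineering based" technical change
      print(trend_model.params)
      print(eaf_model.params)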

  17. Main: Clone Detail [KOME

    Lifescience Database Archive (English)

    Full Text Available Clone Detail. Mapping Pseudomolecule data detail. Detail information: Mapping to the T...IGR japonica Pseudomolecules. kome_mapping_pseudomolecule_data_detail.zip kome_mapping_pseudomolecule_data_detail ...

  18. Devil's in the (diffuse) detail

    International Nuclear Information System (INIS)

    Welberry, R.

    2006-07-01

    X-ray crystallography is an important workhorse in the world of solid-state chemistry. However, while it's a powerful tool in determining the average structure in a crystal lattice, conventional crystallography is very limited when it comes to understanding nano-scale disorder within that crystal structure. And when it comes to understanding the properties of many important materials, the devil is in the detail. X-ray diffraction is still one of the keys to understanding this finer scale structure but using it requires a capacity to read between the lines - to understand the diffuse diffraction that most crystallography ignores. Scientists at the Research School of Chemistry are leading the world in this field. Their work on modelling nano-scaled disorder using diffuse diffraction is opening up new possibilities in understanding and modifying many of our most important materials

  19. Introduction to LCA Methodology

    DEFF Research Database (Denmark)

    Hauschild, Michael Z.

    2018-01-01

    In order to offer the reader an overview of the LCA methodology, in preparation for the more detailed description of its different phases, a brief introduction is given to the methodological framework according to the ISO 14040 standard and the main elements of each of its phases. Emphasis

  20. That Escalated Quickly—Planning to Ignore RPE Can Backfire

    Directory of Open Access Journals (Sweden)

    Maik Bieleke

    2017-09-01

    Full Text Available Ratings of perceived exertion (RPE) are routinely assessed in exercise science and RPE is substantially associated with physiological criterion measures. According to the psychobiological model of endurance, RPE is a central limiting factor in performance. While RPE is known to be affected by psychological manipulations, it remains to be examined whether RPE can be self-regulated during static muscular endurance exercises to enhance performance. In this experiment, we investigate the effectiveness of the widely used and recommended self-regulation strategy of if-then planning (i.e., implementation intentions) in down-regulating RPE and improving performance in a static muscular endurance task. 62 female students (age: M = 23.7 years, SD = 4.0) were randomly assigned to an implementation intention or a control condition and performed a static muscular endurance task. They held two intertwined rings as long as possible while avoiding contacts between the rings. In the implementation intention condition, participants had an if-then plan: “If the task becomes too strenuous for me, then I ignore the strain and tell myself: Keep going!” Every 25 ± 10 s participants reported their RPE along with their perceived pain. Endurance performance was measured as time to failure, along with contact errors as a measure of performance quality. No differences emerged between implementation intention and control participants regarding time to failure and performance quality. However, mixed-effects model analyses revealed a significant Time-to-Failure × Condition interaction for RPE. Compared to the control condition, participants in the implementation intention condition reported substantially greater increases in RPE during the second half of the task and reached higher total values of RPE before task termination. A similar but weaker pattern emerged for perceived pain. Our results demonstrate that RPE during an endurance task can be self-regulated with if

  1. Desiccator Volume: A Vital Yet Ignored Parameter in CaCO3 Crystallization by the Ammonium Carbonate Diffusion Method

    Directory of Open Access Journals (Sweden)

    Joe Harris

    2017-07-01

    Full Text Available Employing the widely used ammonium carbonate diffusion method, we demonstrate that altering an extrinsic parameter—desiccator size—which is rarely detailed in publications, can alter the route of crystallization. Hexagonally packed assemblies of spherical magnesium-calcium carbonate particles or spherulitic aragonitic particles can be selectively prepared from the same initial reaction solution by simply changing the internal volume of the desiccator, thereby changing the rate of carbonate addition and consequently precursor formation. This demonstrates that it is not merely the quantity of an additive which can control particle morphogenesis and phase selectivity, but control of other often ignored parameters are vital to ensure adequate reproducibility.

  2. The Mathematical Miseducation of America's Youth: Ignoring Research and Scientific Study in Education.

    Science.gov (United States)

    Battista, Michael T.

    1999-01-01

    Because traditional instruction ignores students' personal construction of mathematical meaning, mathematical thought development is not properly nurtured. Several issues must be addressed, including adults' ignorance of math- and student-learning processes, identification of math-education research specialists, the myth of coverage, testing…

  3. Persistence of Memory for Ignored Lists of Digits: Areas of Developmental Constancy and Change.

    Science.gov (United States)

    Cowan, Nelson; Nugent, Lara D.; Elliott, Emily M.; Saults, J. Scott

    2000-01-01

    Examined persistence of sensory memory by studying developmental differences in recall of attended and ignored lists of digits for second-graders, fifth-graders, and adults. Found developmental increase in the persistence of memory only for the final item in an ignored list, which is the item for which sensory memory is thought to be the most…

  4. Is There Such a Thing as 'White Ignorance' in British Education?

    Science.gov (United States)

    Bain, Zara

    2018-01-01

    I argue that political philosopher Charles W. Mills' twin concepts of 'the epistemology of ignorance' and 'white ignorance' are useful tools for thinking through racial injustice in the British education system. While anti-racist work in British education has a long history, racism persists in British primary, secondary and tertiary education. For…

  5. The concept of ignorance in a risk assessment and risk management context

    International Nuclear Information System (INIS)

    Aven, T.; Steen, R.

    2010-01-01

    There are many definitions of ignorance in the context of risk assessment and risk management. Most refer to situations in which there is a lack of knowledge, a poor basis for probability assignments, and possible outcomes that are not (fully) known. The purpose of this paper is to discuss the ignorance concept in this setting. Based on a set of risk and uncertainty features, we establish conceptual structures characterising the level of ignorance. These features include the definition of chances (relative frequency-interpreted probabilities) and the existence of scientific uncertainties. Based on these structures, we suggest a definition of ignorance linked to scientific uncertainties, i.e. the lack of understanding of how consequences of the activity are influenced by the underlying factors. In this way, ignorance can be viewed as a condition for applying the precautionary principle. The discussion is also linked to the use and boundaries of risk assessments in the case of large uncertainties, and the methods for classifying risk and uncertainty problems.

  6. Wooden houses in detail. Holzhaeuser im Detail

    Energy Technology Data Exchange (ETDEWEB)

    Ruske, W. (ed.)

    1986-01-01

    Under the serial title 'Planning and construction of wooden houses', WEKA will publish a number of books of which this is the first. Details of design and construction are presented, e.g.: Details of modern one-family houses; Fundamentals of design and hints for planning of wooden houses and compact wooden structures; Constructional ecology, wood protection, thermal insulation, sound insulation; Modular systems for domestic buildings; The 'bookshelf-type' house at the Berlin International Construction Exhibition (IBA); Experience with do-it-yourself systems. With 439 figs.

  7. Ignoring the Obvious: Combined Arms and Fire and Maneuver Tactics Prior to World War I

    National Research Council Canada - National Science Library

    Bruno, Thomas

    2002-01-01

    The armies that entered WWI ignored many pre-war lessons. Though WWI armies later developed revolutionary tactical-level advances, scholars claim that this tactical evolution followed an earlier period...

  8. Ignoring imperfect detection in biological surveys is dangerous: a response to 'fitting and interpreting occupancy models'.

    Directory of Open Access Journals (Sweden)

    Gurutzeta Guillera-Arroita

    Full Text Available In a recent paper, Welsh, Lindenmayer and Donnelly (WLD) question the usefulness of models that estimate species occupancy while accounting for detectability. WLD claim that these models are difficult to fit and argue that disregarding detectability can be better than trying to adjust for it. We think that this conclusion and subsequent recommendations are not well founded and may negatively impact the quality of statistical inference in ecology and related management decisions. Here we respond to WLD's claims, evaluating in detail their arguments, using simulations and/or theory to support our points. In particular, WLD argue that both disregarding and accounting for imperfect detection lead to the same estimator performance regardless of sample size when detectability is a function of abundance. We show that this, the key result of their paper, only holds for cases of extreme heterogeneity like the single scenario they considered. Our results illustrate the dangers of disregarding imperfect detection. When ignored, occupancy and detection are confounded: the same naïve occupancy estimates can be obtained for very different true levels of occupancy so the size of the bias is unknowable. Hierarchical occupancy models separate occupancy and detection, and imprecise estimates simply indicate that more data are required for robust inference about the system in question. As for any statistical method, when underlying assumptions of simple hierarchical models are violated, their reliability is reduced. Resorting to the naïve occupancy estimator in those instances where hierarchical occupancy models do not perform well does not provide a satisfactory solution. The aim should instead be to achieve better estimation, by minimizing the effect of these issues during design, data collection and analysis, ensuring that the right amount of data is collected and model assumptions are met, considering model extensions where appropriate.
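
    A minimal simulation makes the confounding concrete: the naïve estimator recovers roughly the product of occupancy and the probability of at least one detection, so very different true occupancy levels can produce the same naïve estimate. The occupancy and detection values below are hypothetical and the sketch is not the authors' analysis.

      import numpy as np

      rng = np.random.default_rng(1)

      def naive_occupancy(psi, p, n_sites=5000, n_visits=3):
          """Simulate repeat surveys and return the naive occupancy estimate
          (share of sites with at least one detection), ignoring detectability.
          psi = true occupancy probability, p = per-visit detection probability."""
          occupied = rng.random(n_sites) < psi
          detections = rng.random((n_sites, n_visits)) < p
          return float((detections & occupied[:, None]).any(axis=1).mean())

      # Two very different true states yield nearly the same naive estimate.
      print(naive_occupancy(psi=0.5, p=0.90))  # moderate occupancy, easy detection
      print(naive_occupancy(psi=0.9, p=0.24))  # high occupancy, poor detection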

  9. ECSIN's methodological approach for hazard evaluation of engineered nanomaterials

    Science.gov (United States)

    Bregoli, Lisa; Benetti, Federico; Venturini, Marco; Sabbioni, Enrico

    2013-04-01

    The increasing production volumes and commercialization of engineered nanomaterials (ENM), together with data on their higher biological reactivity when compared to bulk counterpart and ability to cross biological barriers, have caused concerns about their potential impacts on the health and safety of both humans and the environment. A multidisciplinary component of the scientific community has been called to evaluate the real risks associated with the use of products containing ENM, and is today in the process of developing specific definitions and testing strategies for nanomaterials. At ECSIN we are developing an integrated multidisciplinary methodological approach for the evaluation of the biological effects of ENM on the environment and human health. While our testing strategy agrees with the most widely advanced line of work at the European level, the choice of methods and optimization of protocols is made with an extended treatment of details. Our attention to the methodological and technical details is based on the acknowledgment that the innovative characteristics of matter at the nano-size range may influence the existing testing methods in a partially unpredictable manner, an aspect which is frequently recognized at the discussion level but oftentimes disregarded at the laboratory bench level. This work outlines the most important steps of our testing approach. In particular, each step will be briefly discussed in terms of potential technical and methodological pitfalls that we have encountered, and which are often ignored in nanotoxicology research. The final aim is to draw attention to the need of preliminary studies in developing reliable tests, a crucial aspect to confirm the suitability of the chosen analytical and toxicological methods to be used for the specific tested nanoparticle, and to express the idea that in nanotoxicology, "devil is in the detail".

  10. ECSIN's methodological approach for hazard evaluation of engineered nanomaterials

    International Nuclear Information System (INIS)

    Bregoli, Lisa; Benetti, Federico; Venturini, Marco; Sabbioni, Enrico

    2013-01-01

    The increasing production volumes and commercialization of engineered nanomaterials (ENM), together with data on their higher biological reactivity when compared to their bulk counterparts and ability to cross biological barriers, have caused concerns about their potential impacts on the health and safety of both humans and the environment. A multidisciplinary component of the scientific community has been called to evaluate the real risks associated with the use of products containing ENM, and is today in the process of developing specific definitions and testing strategies for nanomaterials. At ECSIN we are developing an integrated multidisciplinary methodological approach for the evaluation of the biological effects of ENM on the environment and human health. While our testing strategy agrees with the most widely advanced line of work at the European level, the choice of methods and optimization of protocols are made with an extended treatment of details. Our attention to the methodological and technical details is based on the acknowledgment that the innovative characteristics of matter at the nano-size range may influence the existing testing methods in a partially unpredictable manner, an aspect which is frequently recognized at the discussion level but oftentimes disregarded at the laboratory bench level. This work outlines the most important steps of our testing approach. In particular, each step will be briefly discussed in terms of potential technical and methodological pitfalls that we have encountered, and which are often ignored in nanotoxicology research. The final aim is to draw attention to the need for preliminary studies in developing reliable tests, a crucial aspect to confirm the suitability of the chosen analytical and toxicological methods to be used for the specific tested nanoparticle, and to express the idea that in nanotoxicology, 'the devil is in the detail'.

  11. Passionate ignorance

    DEFF Research Database (Denmark)

    Hyldgaard, Kirsten

    2006-01-01

    Psychoanalysis has nothing to say about education. Psychoanalysis has something to say about pedagogy; psychoanalysis has pedagogical-philosophical implications. Pedagogy, in distinction to education, addresses the question of the subject. This implies that pedagogical theory is and cannot be a s...

  12. Fatal ignorance.

    Science.gov (United States)

    1996-01-01

    The Rajiv Gandhi Foundation (RGF), together with the AIMS-affiliated NGO AIDS Cell, Delhi, held a workshop as part of an effort to raise a 90-doctor RGF AIDS workforce which will work together with nongovernmental organizations on AIDS prevention, control, and management. 25 general practitioners registered with the Indian Medical Council, who have practiced medicine in Delhi for the past 10-20 years, responded to a pre-program questionnaire on HIV-related knowledge and attitudes. 6 out of the 25 physicians did not know what the acronym AIDS stands for, extremely low awareness of the clinical aspects of the disease was revealed, 9 believed in the conspiracy theory of HIV development and accidental release by the US Central Intelligence Agency, 8 believed that AIDS is a problem of only the promiscuous, 18 did not know that the mode of HIV transmission is similar to that of the hepatitis B virus, 12 were unaware that HIV-infected people will test HIV-seronegative during the first three months after initial infection and that they will develop symptoms of full-blown AIDS only after 10 years, 10 did not know the name of even one drug used to treat the disease, 3 believed aspirin to be an effective drug against AIDS, many believed fantastic theories about the modes of HIV transmission, and many were acutely homophobic. Efforts were made to clear misconceptions about HIV during the workshop. It is hoped that participating doctors' attitudes about AIDS and the high-risk groups affected by it were also improved.

  13. Nanotoxicology materials, methodologies, and assessments

    CERN Document Server

    Durán, Nelson; Alves, Oswaldo L; Zucolotto, Valtencir

    2014-01-01

    This book begins with a detailed introduction to engineered nanostructures, followed by a section on methodologies used in research on cytotoxicity and genotoxicity, and concluding with evidence for the cyto- and genotoxicity of specific nanoparticles.

  14. Mid-adolescent neurocognitive development of ignoring and attending emotional stimuli

    Directory of Open Access Journals (Sweden)

    Nora C. Vetter

    2015-08-01

    Full Text Available Appropriate reactions toward emotional stimuli depend on the distribution of prefrontal attentional resources. In mid-adolescence, prefrontal top-down control systems are less engaged, while subcortical bottom-up emotional systems are more engaged. We used functional magnetic resonance imaging to follow the neural development of attentional distribution, i.e. attending versus ignoring emotional stimuli, in adolescence. 144 healthy adolescents were studied longitudinally at age 14 and 16 while performing a perceptual discrimination task. Participants viewed two pairs of stimuli – one emotional, one abstract – and reported on one pair whether the items were the same or different, while ignoring the other pair. Hence, two experimental conditions were created: “attending emotion/ignoring abstract” and “ignoring emotion/attending abstract”. Emotional valence varied between negative, positive, and neutral. Across conditions, reaction times and error rates decreased and activation in the anterior cingulate and inferior frontal gyrus increased from age 14 to 16. In contrast, subcortical regions showed no developmental effect. Activation of the anterior insula increased across ages for attending positive and ignoring negative emotions. Results suggest an ongoing development of prefrontal top-down resources elicited by emotional attention from age 14 to 16 while activity of subcortical regions representing bottom-up processing remains stable.

  15. Investigating Deviance Distraction and the Impact of the Modality of the To-Be-Ignored Stimuli.

    Science.gov (United States)

    Marsja, Erik; Neely, Gregory; Ljungberg, Jessica K

    2018-03-01

    It has been suggested that deviance distraction is caused by unexpected sensory events in the to-be-ignored stimuli violating the cognitive system's predictions of incoming stimuli. The majority of research has used methods where the to-be-ignored expected (standards) and the unexpected (deviants) stimuli are presented within the same modality. Less is known about the behavioral impact of deviance distraction when the to-be-ignored stimuli are presented in different modalities (e.g., standards and deviants presented in different modalities). In three experiments using cross-modal oddball tasks with mixed-modality to-be-ignored stimuli, we examined the distractive role of unexpected auditory deviants presented in a continuous stream of expected standard vibrations. The results showed that deviance distraction seems to be dependent upon the to-be-ignored stimuli being presented within the same modality, and that the simple omission of something expected, in this case a standard vibration, may be enough to capture attention and distract performance.

  16. Non-ignorable missingness item response theory models for choice effects in examinee-selected items.

    Science.gov (United States)

    Liu, Chen-Wei; Wang, Wen-Chung

    2017-11-01

    Examinee-selected item (ESI) design, in which examinees are required to respond to a fixed number of items in a given set, always yields incomplete data (i.e., when only the selected items are answered, data are missing for the others) that are likely non-ignorable in likelihood inference. Standard item response theory (IRT) models become infeasible when ESI data are missing not at random (MNAR). To solve this problem, the authors propose a two-dimensional IRT model that posits one unidimensional IRT model for observed data and another for nominal selection patterns. The two latent variables are assumed to follow a bivariate normal distribution. In this study, the mirt freeware package was adopted to estimate parameters. The authors conduct an experiment to demonstrate that ESI data are often non-ignorable and to determine how to apply the new model to the data collected. Two follow-up simulation studies are conducted to assess the parameter recovery of the new model and the consequences for parameter estimation of ignoring MNAR data. The results of the two simulation studies indicate good parameter recovery of the new model and poor parameter recovery when non-ignorable missing data were mistakenly treated as ignorable. © 2017 The British Psychological Society.
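
    A hedged toy simulation of the missingness mechanism at issue in this record (this is not the authors' two-dimensional IRT model; the item parameters and the noisy "anticipation" rule are invented for illustration): when examinees tend to submit the item they expect to answer correctly, missingness depends on the unobserved responses themselves, and the observed proportion correct is inflated relative to the complete data. This is the kind of bias a joint model for responses and selection patterns is designed to remove.

```python
# Illustrative sketch: examinee-selected items produce data that are missing not at random.
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
theta = rng.normal(size=n)                          # ability
b = np.array([-0.5, 0.5])                           # item difficulties (Rasch-type)
p = 1 / (1 + np.exp(-(theta[:, None] - b)))         # P(correct) per examinee and item
responses = (rng.random((n, 2)) < p).astype(int)    # complete (partly unobserved) data

# Selection rule: examinees lean toward the item they (noisily) anticipate getting right,
# so selection depends on the would-be responses themselves (MNAR).
anticipation = responses + 0.5 * rng.normal(size=(n, 2))
chosen = anticipation.argmax(axis=1)

for j in range(2):
    complete = responses[:, j].mean()
    observed = responses[chosen == j, j].mean()
    print(f"item {j}: proportion correct {complete:.2f} (complete data) "
          f"vs {observed:.2f} (selected responses only)")
```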

  17. Ignorance Is Bliss, But for Whom? The Persistent Effect of Good Will on Cooperation

    Directory of Open Access Journals (Sweden)

    Mike Farjam

    2016-10-01

    Full Text Available Who benefits from the ignorance of others? We address this question from the point of view of a policy maker who can induce some ignorance into a system of agents competing for resources. Evolutionary game theory shows that when unconditional cooperators or ignorant agents compete with defectors in two-strategy settings, unconditional cooperators get exploited and are rendered extinct. In contrast, conditional cooperators, by utilizing some kind of reciprocity, are able to survive and sustain cooperation when competing with defectors. We study how cooperation thrives in a three-strategy setting where there are unconditional cooperators, conditional cooperators and defectors. By means of simulation on various kinds of graphs, we show that conditional cooperators benefit from the existence of unconditional cooperators in the majority of cases. However, in worlds that make cooperation hard to evolve, defectors benefit.

  18. Ignoring correlation in uncertainty and sensitivity analysis in life cycle assessment: what is the risk?

    Energy Technology Data Exchange (ETDEWEB)

    Groen, E.A., E-mail: Evelyne.Groen@gmail.com [Wageningen University, P.O. Box 338, Wageningen 6700 AH (Netherlands); Heijungs, R. [Vrije Universiteit Amsterdam, De Boelelaan 1105, Amsterdam 1081 HV (Netherlands); Leiden University, Einsteinweg 2, Leiden 2333 CC (Netherlands)

    2017-01-15

    Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict if including correlations among input parameters in uncertainty propagation will increase or decrease output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
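
    A minimal numerical sketch of the point made in this record (the model, sensitivity coefficients and input uncertainties are artificial and are not the paper's electricity case study): the analytical first-order propagation and the sampling approach are shown side by side, and the output variance changes visibly when the correlation between the two inputs is ignored.

```python
# Illustrative sketch: output variance with and without the input correlation.
import numpy as np

a = np.array([2.0, -1.5])                 # sensitivity coefficients (assumed)
sigma = np.array([0.10, 0.20])            # standard deviations of the two inputs
rho = 0.8                                 # correlation between the inputs

def analytic_var(r):
    cov = np.diag(sigma**2)
    cov[0, 1] = cov[1, 0] = r * sigma[0] * sigma[1]
    return a @ cov @ a                    # Var(y) = a' Sigma a for y = a1*x1 + a2*x2

print("analytic Var(y), correlation ignored :", round(analytic_var(0.0), 4))
print("analytic Var(y), correlation included:", round(analytic_var(rho), 4))

# Sampling approach: draw correlated inputs and propagate them through the model.
rng = np.random.default_rng(7)
cov = np.array([[sigma[0]**2, rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1]**2]])
x = rng.multivariate_normal([1.0, 1.0], cov, size=200_000)
print("sampled  Var(y), correlation included:", round((x @ a).var(), 4))
```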

  19. Roles of dark energy perturbations in dynamical dark energy models: can we ignore them?

    Science.gov (United States)

    Park, Chan-Gyung; Hwang, Jai-chan; Lee, Jae-heon; Noh, Hyerim

    2009-10-09

    We show the importance of properly including the perturbations of the dark energy component in the dynamical dark energy models based on a scalar field and modified gravity theories in order to meet with present and future observational precisions. Based on a simple scaling scalar field dark energy model, we show that observationally distinguishable substantial differences appear by ignoring the dark energy perturbation. By ignoring it the perturbed system of equations becomes inconsistent and deviations in (gauge-invariant) power spectra depend on the gauge choice.

  20. Detailed Soils 24K

    Data.gov (United States)

    Kansas Data Access and Support Center — This data set is a digital soil survey and is the most detailed level of soil geographic data developed by the National Cooperative Soil Survey. The information was...

  1. Forgotten and Ignored: Special Education in First Nations Schools in Canada

    Science.gov (United States)

    Phillips, Ron

    2010-01-01

    Usually, reviews of special education in Canada describe the special education programs, services, policies, and legislation that are provided by the provinces and territories. The reviews consistently ignore the special education programs, services, policies, and legislation that are provided by the federal government of Canada. The federal government…

  2. The Capital Costs Conundrum: Why Are Capital Costs Ignored and What Are the Consequences?

    Science.gov (United States)

    Winston, Gordon C.

    1993-01-01

    Colleges and universities historically have ignored the capital costs associated with institutional administration in their estimates of overall and per-student costs. This neglect leads to distortion of data, misunderstandings, and uninformed decision making. The real costs should be recognized in institutional accounting. (MSE)

  3. Monitoring your friends, not your foes: strategic ignorance and the delegation of real authority

    NARCIS (Netherlands)

    Dominguez-Martinez, S.; Sloof, R.; von Siemens, F.

    2010-01-01

    In this laboratory experiment we study the use of strategic ignorance to delegate real authority within a firm. A worker can gather information on investment projects, while a manager makes the implementation decision. The manager can monitor the worker. This allows her to better exploit the

  4. Monitored by your friends, not your foes: Strategic ignorance and the delegation of real authority

    NARCIS (Netherlands)

    Dominguez-Martinez, S.; Sloof, R.; von Siemens, F.

    2012-01-01

    In this laboratory experiment we study the use of strategic ignorance to delegate real authority within a firm. A worker can gather information on investment projects, while a manager makes the implementation decision. The manager can monitor the worker. This allows her to exploit any information

  5. Mathematical Practice as Sculpture of Utopia: Models, Ignorance, and the Emancipated Spectator

    Science.gov (United States)

    Appelbaum, Peter

    2012-01-01

    This article uses Ranciere's notion of the ignorant schoolmaster and McElheny's differentiation of artist's models from those of the architect and scientist to propose the reconceptualization of mathematics education as the support of emancipated spectators and sculptors of utopia.

  6. The effects of systemic crises when investors can be crisis ignorant

    NARCIS (Netherlands)

    H.J.W.G. Kole (Erik); C.G. Koedijk (Kees); M.J.C.M. Verbeek (Marno)

    2004-01-01

    textabstractSystemic crises can largely affect asset allocations due to the rapid deterioration of the risk-return trade-off. We investigate the effects of systemic crises, interpreted as global simultaneous shocks to financial markets, by introducing an investor adopting a crisis ignorant or crisis

  7. Geographies of knowing, geographies of ignorance: jumping scale in Southeast Asia

    NARCIS (Netherlands)

    van Schendel, W.

    2002-01-01

    'Area studies' use a geographical metaphor to visualise and naturalise particular social spaces as well as a particular scale of analysis. They produce specific geographies of knowing but also create geographies of ignorance. Taking Southeast Asia as an example, in this paper I explore how areas are

  8. The Trust Game Behind the Veil of Ignorance : A Note on Gender Differences

    NARCIS (Netherlands)

    Vyrastekova, J.; Onderstal, A.M.

    2005-01-01

    We analyse gender differences in the trust game in a "behind the veil of ignorance" design. This method yields strategies that are consistent with actions observed in the classical trust game experiments. We observe that, on average, men and women do not differ in "trust", and that women are slightly

  9. The trust game behind the veil of ignorance: A note on gender differences

    NARCIS (Netherlands)

    Vyrastekova, J.; Onderstal, S.

    2008-01-01

    We analyze gender differences in the trust game in a "behind the veil of ignorance" design. This method yields strategies that are consistent with actions observed in the classical trust game experiments. We observe that, on average, men and women do not differ in "trust", and that women are

  10. The Ignorant Facilitator: Education, Politics and Theatre in Co-Communities

    Science.gov (United States)

    Lev-Aladgem, Shulamith

    2015-01-01

    This article discusses the book "The Ignorant Schoolmaster: Five Lessons in Intellectual Emancipation" by the French philosopher, Jacques Rancière. Its intention is to study the potential contribution of this text to the discourse of applied theatre (theatre in co-communities) in general, and the role of the facilitator in particular. It…

  11. Ignoring Memory Hints: The Stubborn Influence of Environmental Cues on Recognition Memory

    Science.gov (United States)

    Selmeczy, Diana; Dobbins, Ian G.

    2017-01-01

    Recognition judgments can benefit from the use of environmental cues that signal the general likelihood of encountering familiar versus unfamiliar stimuli. While incorporating such cues is often adaptive, there are circumstances (e.g., eyewitness testimony) in which observers should fully ignore environmental cues in order to preserve memory…

  12. Uncertain Climate Forecasts From Multimodel Ensembles: When to Use Them and When to Ignore Them

    OpenAIRE

    Jewson, Stephen; Rowlands, Dan

    2010-01-01

    Uncertainty around multimodel ensemble forecasts of changes in future climate reduces the accuracy of those forecasts. For very uncertain forecasts this effect may mean that the forecasts should not be used. We investigate the use of the well-known Bayesian Information Criterion (BIC) to make the decision as to whether a forecast should be used or ignored.
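
    A hedged sketch of the decision rule suggested by this record (the synthetic data, the climatology-versus-trend comparison and the Gaussian likelihood are illustrative assumptions, not the authors' exact setup): BIC = k ln n - 2 ln L is computed for a model that ignores the forecast and for one that includes the forecast-implied trend, and the forecast is used only if its model attains the lower BIC.

```python
# Illustrative sketch: using BIC to decide whether to use or ignore a forecast trend.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(30)
obs = 0.02 * years + rng.normal(scale=0.3, size=years.size)   # synthetic observations

def gaussian_bic(residuals, k_extra):
    n = residuals.size
    sigma2 = residuals.var()                   # maximum-likelihood noise variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = k_extra + 1                            # +1 for the variance parameter
    return k * np.log(n) - 2 * loglik

bic_climatology = gaussian_bic(obs - obs.mean(), k_extra=1)            # mean only
slope, intercept = np.polyfit(years, obs, 1)
bic_trend = gaussian_bic(obs - (slope * years + intercept), k_extra=2) # mean + trend

decision = "use the trend forecast" if bic_trend < bic_climatology else "ignore it"
print(f"BIC climatology {bic_climatology:.1f}, BIC trend {bic_trend:.1f} -> {decision}")
```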

  13. Inattentional blindness for ignored words: comparison of explicit and implicit memory tasks.

    Science.gov (United States)

    Butler, Beverly C; Klein, Raymond

    2009-09-01

    Inattentional blindness is described as the failure to perceive a supra-threshold stimulus when attention is directed away from that stimulus. Based on performance on an explicit recognition memory test and concurrent functional imaging data Rees, Russell, Frith, and Driver [Rees, G., Russell, C., Frith, C. D., & Driver, J. (1999). Inattentional blindness versus inattentional amnesia for fixated but ignored words. Science, 286, 2504-2507] reported inattentional blindness for word stimuli that were fixated but ignored. The present study examined both explicit and implicit memory for fixated but ignored words using a selective-attention task in which overlapping picture/word stimuli were presented at fixation. No explicit awareness of the unattended words was apparent on a recognition memory test. Analysis of an implicit memory task, however, indicated that unattended words were perceived at a perceptual level. Thus, the selective-attention task did not result in perfect filtering as suggested by Rees et al. While there was no evidence of conscious perception, subjects were not blind to the implicit perceptual properties of fixated but ignored words.

  14. Methodology of site studies

    International Nuclear Information System (INIS)

    Caries, J.C.; Hugon, J.; Grauby, A.

    1980-01-01

    This methodology consists of an essentially dynamic, predictive and follow-up analysis of the impact of discharges on all the environmental compartments, whether natural or not, that play a part in the protection of man and his environment. It applies at two levels, namely the choice of a site, or the detailed study of the site selected. Two examples of its application are developed: at the site-selection level in the case of marine sites, and at the detailed-study level of the chosen site in the case of a riverside site [fr

  15. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principle and the practical difficulties of the methodologies of uncertainty propagation calculation, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: variability (uncertainty due to heterogeneity) and lack of knowledge (uncertainty due to ignorance). It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example which he generalised, treating the variability uncertainty with probability theory and the lack-of-knowledge uncertainty with fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiably and illegitimately precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability and lack of knowledge increased as the problem became more and more complex in terms of the number of parameters or time steps, and that it was necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory
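
    A minimal sketch of the distinction drawn in this record (the toy model and all numbers are invented): a parameter affected by variability is propagated with probability theory (Monte Carlo sampling), while a parameter affected by lack of knowledge is propagated as an interval, i.e. a single alpha-cut of a fuzzy number, so the result is a band of possible values rather than one precise distribution.

```python
# Illustrative sketch: mixed probabilistic / interval (fuzzy) uncertainty propagation.
import numpy as np

rng = np.random.default_rng(3)

def model(x, a):
    return a * x**2                     # toy model y = a * x^2

x_samples = rng.normal(loc=1.0, scale=0.1, size=100_000)   # variability on x (sampled)
a_interval = (0.8, 1.2)                                     # lack of knowledge about a

# Because y is monotone in a here, the envelope is obtained at the interval endpoints.
y_low = model(x_samples, a_interval[0])
y_high = model(x_samples, a_interval[1])

p95 = (np.percentile(y_low, 95), np.percentile(y_high, 95))
print(f"95th percentile of y lies in [{p95[0]:.3f}, {p95[1]:.3f}]")
```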

  16. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines some of the main characteristics of these paradigms and methodologies such as where they may be used for best results. Due to their popularity, power flow based MW-mile and short run marginal cost pricing methodologies will be covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies for pricing transmission services in Brazil. (author) 25 refs., 2 tabs.
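
    A hedged sketch of the power-flow-based MW-mile idea mentioned in this record (the network, flows and costs are invented, and real implementations differ in how counter-flows and unused capacity are handled): each transaction is charged a share of each line's cost in proportion to the MW flow it causes on that line multiplied by the line length.

```python
# Illustrative sketch: MW-mile allocation of transmission costs to two transactions.
line_length_km = {"L1": 100.0, "L2": 250.0, "L3": 60.0}
annual_line_cost = {"L1": 1.0e6, "L2": 2.5e6, "L3": 0.6e6}    # currency units per year

# MW flows attributed to each transaction on each line (e.g. from a DC power flow).
flows = {
    "T1": {"L1": 30.0, "L2": 10.0, "L3": 0.0},
    "T2": {"L1": 10.0, "L2": 40.0, "L3": 20.0},
}

def mw_mile_charges(flows):
    usage = {t: {l: abs(f) * line_length_km[l] for l, f in per_line.items()}
             for t, per_line in flows.items()}
    charges = {t: 0.0 for t in flows}
    for line in line_length_km:
        total_usage = sum(usage[t][line] for t in flows)
        for t in flows:
            share = usage[t][line] / total_usage if total_usage else 0.0
            charges[t] += share * annual_line_cost[line]
    return charges

print(mw_mile_charges(flows))    # cost share per transaction
```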

  17. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  18. Kinetic energy budget details

    Indian Academy of Sciences (India)

    Abstract. This paper presents the detailed turbulent kinetic energy budget and higher order statistics of flow behind a surface-mounted rib with and without superimposed acoustic excitation. Pattern recognition technique is used to determine the large-scale structure magnitude. It is observed that most of the turbulence ...

  19. Three Latin Phonological Details

    DEFF Research Database (Denmark)

    Olsen, Birgit Anette

    2006-01-01

    The present paper deals with three minor details of Latin phonology: 1) the development of the initial sequence *u̯l̥-, where it is suggested that an apparent vacillation between ul- and vol-/vul- represents sandhi variants going back to the proto-language, 2) the adjectives amārus 'bitter' and ...

  20. Double jeopardy, the equal value of lives and the veil of ignorance: a rejoinder to Harris.

    Science.gov (United States)

    McKie, J; Kuhse, H; Richardson, J; Singer, P

    1996-08-01

    Harris levels two main criticisms against our original defence of QALYs (Quality Adjusted Life Years). First, he rejects the assumption implicit in the QALY approach that not all lives are of equal value. Second, he rejects our appeal to Rawls's veil of ignorance test in support of the QALY method. In the present article we defend QALYs against Harris's criticisms. We argue that some of the conclusions Harris draws from our view that resources should be allocated on the basis of potential improvements in quality of life and quantity of life are erroneous, and that others lack the moral implications Harris claims for them. On the other hand, we defend our claim that a rational egoist, behind a veil of ignorance, could consistently choose to allocate life-saving resources in accordance with the QALY method, despite Harris's claim that a rational egoist would allocate randomly if there is no better than a 50% chance of being the recipient.

  1. Crimes committed by indigenous people in ignorance of the law

    Directory of Open Access Journals (Sweden)

    Diego Fernando Chimbo Villacorte

    2017-07-01

    Full Text Available This analysis focuses specifically on cases in which an indigenous person commits a crime in ignorance of the law, not only because he is entirely unaware of the unlawfulness of his conduct, but also because he believes he is acting in strict accordance with his beliefs and ancestral customs, which in some cases conflict with positive law. It likewise addresses the impossibility of imposing a penalty (when the offence is committed outside the community) or an indigenous purification (when the act disturbs social peace within the indigenous community), but it mainly focuses on the impossibility of imposing a security measure when the crime has been committed outside the community, because the offender is then deemed non-imputable and returned to his community, generating a discriminatory treatment that prevents the self-determination of the culturally different.

  2. The Insider Threat to Cybersecurity: How Group Process and Ignorance Affect Analyst Accuracy and Promptitude

    Science.gov (United States)

    2017-09-01


  3. Geographies of knowing, geographies of ignorance: jumping scale in Southeast Asia

    OpenAIRE

    van Schendel, W.

    2002-01-01

    'Area studies' use a geographical metaphor to visualise and naturalise particular social spaces as well as a particular scale of analysis. They produce specific geographies of knowing but also create geographies of ignorance. Taking Southeast Asia as an example, in this paper I explore how areas are imagined and how area knowledge is structured to construct area 'heartlands' as well as area 'borderlands'. This is illustrated by considering a large region of Asia (here named Zomia) that did ...

  4. Early humans' egalitarian politics: runaway synergistic competition under an adapted veil of ignorance.

    Science.gov (United States)

    Harvey, Marc

    2014-09-01

    This paper proposes a model of human uniqueness based on an unusual distinction between two contrasted kinds of political competition and political status: (1) antagonistic competition, in quest of dominance (antagonistic status), a zero-sum, self-limiting game whose stake--who takes what, when, how--summarizes a classical definition of politics (Lasswell 1936), and (2) synergistic competition, in quest of merit (synergistic status), a positive-sum, self-reinforcing game whose stake becomes "who brings what to a team's common good." In this view, Rawls's (1971) famous virtual "veil of ignorance" mainly conceals politics' antagonistic stakes so as to devise the principles of a just, egalitarian society, yet without providing any means to enforce these ideals (Sen 2009). Instead, this paper proposes that human uniqueness flourished under a real "adapted veil of ignorance" concealing the steady inflation of synergistic politics which resulted from early humans' sturdy egalitarianism. This proposition divides into four parts: (1) early humans first stumbled on a purely cultural means to enforce a unique kind of within-team antagonistic equality--dyadic balanced deterrence thanks to handheld weapons (Chapais 2008); (2) this cultural innovation is thus closely tied to humans' darkest side, but it also launched the cumulative evolution of humans' brightest qualities--egalitarian team synergy and solidarity, together with the associated synergistic intelligence, culture, and communications; (3) runaway synergistic competition for differential merit among antagonistically equal obligate teammates is the single politically selective mechanism behind the cumulative evolution of all these brighter qualities, but numerous factors to be clarified here conceal this mighty evolutionary driver; (4) this veil of ignorance persists today, which explains why humans' unique prosocial capacities are still not clearly understood by science. The purpose of this paper is to start lifting

  5. On the perpetuation of ignorance: system dependence, system justification, and the motivated avoidance of sociopolitical information.

    Science.gov (United States)

    Shepherd, Steven; Kay, Aaron C

    2012-02-01

    How do people cope when they feel uninformed or unable to understand important social issues, such as the environment, energy concerns, or the economy? Do they seek out information, or do they simply ignore the threatening issue at hand? One would intuitively expect that a lack of knowledge would motivate an increased, unbiased search for information, thereby facilitating participation and engagement in these issues-especially when they are consequential, pressing, and self-relevant. However, there appears to be a discrepancy between the importance/self-relevance of social issues and people's willingness to engage with and learn about them. Leveraging the literature on system justification theory (Jost & Banaji, 1994), the authors hypothesized that, rather than motivating an increased search for information, a lack of knowledge about a specific sociopolitical issue will (a) foster feelings of dependence on the government, which will (b) increase system justification and government trust, which will (c) increase desires to avoid learning about the relevant issue when information is negative or when information valence is unknown. In other words, the authors suggest that ignorance-as a function of the system justifying tendencies it may activate-may, ironically, breed more ignorance. In the contexts of energy, environmental, and economic issues, the authors present 5 studies that (a) provide evidence for this specific psychological chain (i.e., ignorance about an issue → dependence → government trust → avoidance of information about that issue); (b) shed light on the role of threat and motivation in driving the second and third links in this chain; and (c) illustrate the unfortunate consequences of this process for individual action in those contexts that may need it most.

  6. Ignoring versus updating in working memory reveal differential roles of attention and feature binding

    OpenAIRE

    Fallon, SJ; Mattiesing, RM; Dolfen, N; Manohar, SGM; Husain, M

    2017-01-01

    Ignoring distracting information and updating current contents are essential components of working memory (WM). Yet, although both require controlling irrelevant information, it is unclear whether they have the same effects on recall and produce the same level of misbinding errors (incorrectly joining the features of different memoranda). Moreover, the likelihood of misbinding may be affected by the feature similarity between the items already encoded into memory and the information that has ...

  7. Methodological themes and variations

    International Nuclear Information System (INIS)

    Tetlock, P.E.

    1989-01-01

    This paper reports on the tangible progress that has been made in clarifying the underlying processes that affect both the likelihood of war in general and of nuclear war in particular. It also illustrates how difficult it is to make progress in this area. Nonetheless, what has been achieved should not be minimized. We have learned a good deal on both the theoretical and the methodological fronts and, perhaps, most important, we have learned a good deal about the limits of our knowledge. Knowledge of our ignorance---especially in a policy domain where confident, even glib, causal assertions are so common---can be a major contribution in itself. The most important service the behavioral and social sciences can currently provide to the policy making community may well be to make thoughtful skepticism respectable: to sensitize those who make key decisions to the uncertainty surrounding our understanding of international conflict and to the numerous qualifications that now need to be attached to simple causal theories concerning the origins of war

  8. Methodology for evaluating port vulnerability to nuclear smuggling

    International Nuclear Information System (INIS)

    Ek, D.; Gronager, J.R.; Blankenship, J.A.; Martin, D.

    2001-01-01

    perimeter gates ignores several plausible adversary scenarios, and that these scenarios can be addressed through a combination of procedures, minor physical security upgrades and relocation of portals. The presentation will discuss the adversary profiles used for the analysis, will detail the analysis methodology and discuss the results of the analysis. (author)

  9. Ignoring alarming news brings indifference: Learning about the world and the self.

    Science.gov (United States)

    Paluck, Elizabeth Levy; Shafir, Eldar; Wu, Sherry Jueyu

    2017-10-01

    The broadcast of media reports about moral crises such as famine can subtly depress rather than activate moral concern. Whereas much research has examined the effects of media reports that people attend to, social psychological analysis suggests that what goes unattended can also have an impact. We test the idea that when vivid news accounts of human suffering are broadcast in the background but ignored, people infer from their choice to ignore these accounts that they care less about the issue, compared to those who pay attention and even to those who were not exposed. Consistent with research on self-perception and attribution, three experiments demonstrate that participants who were nudged to distract themselves in front of a television news program about famine in Niger (Study 1), or to skip an online promotional video for the Niger famine program (Study 2), or who chose to ignore the famine in Niger television program in more naturalistic settings (Study 3) all assigned lower importance to poverty and to hunger reduction compared to participants who watched with no distraction or opportunity to skip the program, or to those who did not watch at all. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Detailed Debunking of Denial

    Science.gov (United States)

    Enting, I. G.; Abraham, J. P.

    2012-12-01

    The disinformation campaign against climate science has been compared to a guerrilla war whose tactics undermine the traditional checks and balances of science. One comprehensive approach has been to produce archives of generic responses such as the websites of RealClimate and SkepticalScience. We review our experiences with an alternative approach of detailed responses to a small number of high profile cases. Our particular examples were Professor Ian Plimer and Christopher Monckton, the Third Viscount Monckton of Brenchley, each of whom has been taken seriously by political leaders in our respective countries. We relate our experiences to comparable examples such as John Mashey's analysis of the Wegman report and the formal complaints about Lomborg's "Skeptical Environmentalist" and Durkin's "Great Global Warming Swindle". Our two responses took contrasting forms: an on-line video of a lecture vs an evolving compendium of misrepresentations. Additionally, our approaches differed in emphasis. The analysis of Monckton concentrated on the misrepresentation of the science, while the analysis of Plimer concentrated on departures from accepted scientific practice: fabrication of data, misrepresentation of cited sources and unattributed use of the work of others. Benefits of an evolving compendium were the ability to incorporate contributions from members of the public who had identified additional errors and the scope for addressing new aspects as they came to public attention. 'Detailed debunking' gives non-specialists a reference point for distinguishing non-science when engaging in public debate.

  11. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in the estimation of the dose in organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (numbers 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for absorbed dose calculations in different tissues is explained
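
    For each target organ, the MIRD schema reduces to summing, over all source organs, the cumulated (time-integrated) activity in the source multiplied by the corresponding S value (mean absorbed dose per unit cumulated activity). A small numerical sketch (the organ list, activities and S values below are invented for illustration, not tabulated MIRD data):

```python
# Illustrative sketch of the basic MIRD dose equation:
# D(target) = sum over sources of A_tilde(source) * S(target <- source).
cumulated_activity_MBq_s = {"liver": 5.0e5, "kidneys": 1.2e5}    # A_tilde per source organ

S = {                                        # mGy per (MBq*s), keyed by (target, source)
    ("liver", "liver"): 4.0e-6,
    ("liver", "kidneys"): 3.0e-7,
    ("kidneys", "liver"): 3.0e-7,
    ("kidneys", "kidneys"): 9.0e-6,
}

def absorbed_dose(target):
    return sum(a_tilde * S[(target, source)]
               for source, a_tilde in cumulated_activity_MBq_s.items())

for organ in ("liver", "kidneys"):
    print(f"absorbed dose to {organ}: {absorbed_dose(organ):.3f} mGy")
```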

  12. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  13. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs

  14. THULE: A detailed description

    International Nuclear Information System (INIS)

    Terry, M.J.

    1964-07-01

    This report describes the THULE scheme of lattice physics calculation which has been developed in FORTRAN for the IBM 7090. This scheme predicts the neutron flux over energy and space, for many groups and regions, together with reactivity and reaction rate edits for both a single lattice cell and a reactor core. This report describes in detail the input requirements for the THULE programme which forms the main part of the scheme. Brief descriptions of the 7090 programmes TED 6 and NOAH are included as appendices. TED 6 will produce the THULE edits from a WDSN output tape and NOAH is a version of the METHUSELAH programme which contains many of the THULE edits and will also produce input cards for THULE. (author)

  15. Early auditory change detection implicitly facilitated by ignored concurrent visual change during a Braille reading task.

    Science.gov (United States)

    Aoyama, Atsushi; Haruyama, Tomohiro; Kuriki, Shinya

    2013-09-01

    Unconscious monitoring of multimodal stimulus changes enables humans to effectively sense the external environment. Such automatic change detection is thought to be reflected in auditory and visual mismatch negativity (MMN) and mismatch negativity fields (MMFs). These are event-related potentials and magnetic fields, respectively, evoked by deviant stimuli within a sequence of standard stimuli, and both are typically studied during irrelevant visual tasks that cause the stimuli to be ignored. Due to the sensitivity of MMN/MMF to potential effects of explicit attention to vision, however, it is unclear whether multisensory co-occurring changes can purely facilitate early sensory change detection reciprocally across modalities. We adopted a tactile task involving the reading of Braille patterns as a neutral ignore condition, while measuring magnetoencephalographic responses to concurrent audiovisual stimuli that were infrequently deviated either in auditory, visual, or audiovisual dimensions; 1000-Hz standard tones were switched to 1050-Hz deviant tones and/or two-by-two standard check patterns displayed on both sides of visual fields were switched to deviant reversed patterns. The check patterns were set to be faint enough so that the reversals could be easily ignored even during Braille reading. While visual MMFs were virtually undetectable even for visual and audiovisual deviants, significant auditory MMFs were observed for auditory and audiovisual deviants, originating from bilateral supratemporal auditory areas. Notably, auditory MMFs were significantly enhanced for audiovisual deviants from about 100 ms post-stimulus, as compared with the summation responses for auditory and visual deviants or for each of the unisensory deviants recorded in separate sessions. Evidenced by high tactile task performance with unawareness of visual changes, we conclude that Braille reading can successfully suppress explicit attention and that simultaneous multisensory changes can

  16. Growth Modeling with Non-Ignorable Dropout: Alternative Analyses of the STAR*D Antidepressant Trial

    Science.gov (United States)

    Muthén, Bengt; Asparouhov, Tihomir; Hunter, Aimee; Leuchter, Andrew

    2011-01-01

    This paper uses a general latent variable framework to study a series of models for non-ignorable missingness due to dropout. Non-ignorable missing data modeling acknowledges that missingness may depend on not only covariates and observed outcomes at previous time points as with the standard missing at random (MAR) assumption, but also on latent variables such as values that would have been observed (missing outcomes), developmental trends (growth factors), and qualitatively different types of development (latent trajectory classes). These alternative predictors of missing data can be explored in a general latent variable framework using the Mplus program. A flexible new model uses an extended pattern-mixture approach where missingness is a function of latent dropout classes in combination with growth mixture modeling using latent trajectory classes. A new selection model allows not only an influence of the outcomes on missingness, but allows this influence to vary across latent trajectory classes. Recommendations are given for choosing models. The missing data models are applied to longitudinal data from STAR*D, the largest antidepressant clinical trial in the U.S. to date. Despite the importance of this trial, STAR*D growth model analyses using non-ignorable missing data techniques have not been explored until now. The STAR*D data are shown to feature distinct trajectory classes, including a low class corresponding to substantial improvement in depression, a minority class with a U-shaped curve corresponding to transient improvement, and a high class corresponding to no improvement. The analyses provide a new way to assess drug efficiency in the presence of dropout. PMID:21381817
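
    A hedged toy simulation of why such dropout is non-ignorable (this is not the STAR*D analysis itself; the class proportions, slopes and dropout hazards are invented): when dropout depends on a latent trajectory class, the observed-case mean at later visits is pulled toward the improving class and looks more optimistic than the complete-data mean.

```python
# Illustrative sketch: class-dependent dropout biases the observed mean trajectory.
import numpy as np

rng = np.random.default_rng(2024)
n, T = 5_000, 6
improver = rng.random(n) < 0.6                         # latent trajectory class
slope = np.where(improver, -2.0, -0.2)                 # symptom change per visit
t = np.arange(T)
y = 20 + slope[:, None] * t + rng.normal(scale=2.0, size=(n, T))

hazard = np.where(improver, 0.05, 0.25)                # dropout risk depends on class
dropped = rng.random((n, T)) < hazard[:, None]
observed = np.cumsum(dropped, axis=1) == 0             # once dropped, stays missing

full_mean = y.mean(axis=0)
observed_mean = np.array([y[observed[:, j], j].mean() for j in range(T)])
print("complete-data mean per visit:", np.round(full_mean, 1))
print("observed-case mean per visit:", np.round(observed_mean, 1))
```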

  17. Burden of Circulatory System Diseases and Ignored Barriers of Knowledge Translation

    Directory of Open Access Journals (Sweden)

    Hamed-Basir Ghafouri

    2012-10-01

    Full Text Available Circulatory system diseases rank third highest in disability-adjusted life years among Iranians, and ischemic cardiac diseases are the main cause of this burden. Despite the available evidence on risk factors of the disease, no effective intervention has been implemented to control and prevent it. This paper non-systematically reviews the available literature on the problem, solutions, and barriers to the implementation of knowledge translation in Iran. It seems that there are ignored factors, such as cultural and motivational issues, in knowledge translation interventions, but there is hope in the implementation of projects already started and in the preparation of students as the next generation of knowledge transferors.

  18. Crowdsourcing detailed flood data

    Science.gov (United States)

    Walliman, Nicholas; Ogden, Ray; Amouzad*, Shahrzhad

    2015-04-01

    Over the last decade the average annual loss across the European Union due to flooding has been 4.5bn Euros, but increasingly intense rainfall, as well as population growth, urbanisation and the rising costs of asset replacements, may see this rise to 23bn Euros a year by 2050. Equally disturbing are the profound social costs to individuals, families and communities which in addition to loss of lives include: loss of livelihoods, decreased purchasing and production power, relocation and migration, adverse psychosocial effects, and hindrance of economic growth and development. Flood prediction, management and defence strategies rely on the availability of accurate information and flood modelling. Whilst automated data gathering (by measurement and satellite) of the extent of flooding is already advanced, it is least reliable in urban and physically complex geographies where often the need for precise estimation is most acute. Crowdsourced data of actual flood events is a potentially critical component of this, allowing improved accuracy in such situations and identifying the effects of local landscape and topography, where the height of a simple kerb, or a discontinuity in a boundary wall, can have profound importance. Mobile 'App' based data acquisition using crowdsourcing in critical areas can combine camera records with GPS positional data and time, as well as descriptive data relating to the event. This will automatically produce a dataset, managed in ArcView GIS, with the potential for follow up calls to get more information through structured scripts for each strand. Through this, local residents can provide highly detailed information that can be reflected in sophisticated flood protection models and be core to framing urban resilience strategies and optimising the effectiveness of investment. This paper will describe this pioneering approach that will develop flood event data in support of systems that will advance existing approaches such as those developed in the UK
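
    As a sketch of the kind of record such an app could submit (the field names and values are assumptions, not the project's actual schema), combining a photo reference with GPS position, a timestamp and structured descriptive data ready for loading into a GIS layer:

```python
# Illustrative sketch of a crowdsourced flood report payload.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class FloodReport:
    photo_file: str           # path or URL of the camera record
    latitude: float           # GPS position of the observation
    longitude: float
    observed_at: str          # ISO 8601 timestamp
    water_depth_cm: float     # estimated depth at the reported point
    description: str          # free-text notes from the resident

report = FloodReport(
    photo_file="IMG_0042.jpg",
    latitude=51.752, longitude=-1.258,
    observed_at=datetime.now(timezone.utc).isoformat(),
    water_depth_cm=35.0,
    description="Water over the kerb; garden wall acting as a barrier",
)
print(json.dumps(asdict(report), indent=2))    # payload for the GIS back end
```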

  19. Detailed IR aperture measurements

    CERN Document Server

    Bruce, Roderik; Garcia Morales, Hector; Giovannozzi, Massimo; Hermes, Pascal Dominik; Mirarchi, Daniele; Quaranta, Elena; Redaelli, Stefano; Rossi, Carlo; Skowronski, Piotr Krzysztof; Wretborn, Sven Joel; CERN. Geneva. ATS Department

    2016-01-01

    MD 1673 was carried out on October 5 2016, in order to investigate in more detail the available aperture in the LHC high-luminosity insertions at 6.5 TeV and β∗=40 cm. Previous aperture measurements in 2016 during commissioning had shown that the available aperture is at the edge of protection, and that the aperture bottleneck at β∗=40 cm in certain cases is found in the separation plane instead of in the crossing plane. Furthermore, the bottlenecks were consistently found close to the upstream end of Q3 on the side of the incoming beam, and not in Q2 on the outgoing beam as expected from calculations. Therefore, this MD aimed at measuring IR1 and IR5 separately (at 6.5 TeV and β∗=40 cm, for 185 µrad half crossing angle), to further localize the bottlenecks longitudinally using newly installed BLMs, investigate the difference in aperture between Q2 and Q3, and to see if any aperture can be gained using special orbit bumps.

  20. Allocating health care: cost-utility analysis, informed democratic decision making, or the veil of ignorance?

    Science.gov (United States)

    Goold, S D

    1996-01-01

    Assuming that rationing health care is unavoidable, and that it requires moral reasoning, how should we allocate limited health care resources? This question is difficult because our pluralistic, liberal society has no consensus on a conception of distributive justice. In this article I focus on an alternative: Who shall decide how to ration health care, and how shall this be done to respect autonomy, pluralism, liberalism, and fairness? I explore three processes for making rationing decisions: cost-utility analysis, informed democratic decision making, and applications of the veil of ignorance. I evaluate these processes as examples of procedural justice, assuming that there is no outcome considered the most just. I use consent as a criterion to judge competing processes so that rationing decisions are, to some extent, self-imposed. I also examine the processes' feasibility in our current health care system. Cost-utility analysis does not meet criteria for actual or presumed consent, even if costs and health-related utility could be measured perfectly. Existing structures of government cannot creditably assimilate the information required for sound rationing decisions, and grassroots efforts are not representative. Applications of the veil of ignorance are more useful for identifying principles relevant to health care rationing than for making concrete rationing decisions. I outline a process of decision making, specifically for health care, that relies on substantive, selected representation, respects pluralism, liberalism, and deliberative democracy, and could be implemented at the community or organizational level.

  1. Excitatory and inhibitory priming by attended and ignored non-recycled words with monolinguals and bilinguals.

    Science.gov (United States)

    Neumann, Ewald; Nkrumah, Ivy K; Chen, Zhe

    2018-03-03

    Experiments examining identity priming from attended and ignored novel words (words that are used only once except when repetition is required due to experimental manipulation) in a lexical decision task are reported. Experiment 1 tested English monolinguals whereas Experiment 2 tested Twi (a native language of Ghana, Africa)-English bilinguals. Participants were presented with sequential pairs of stimuli composed of a prime followed by a probe, with each containing two items. The participants were required to name the target word in the prime display, and to make a lexical decision to the target item in the probe display. On attended repetition (AR) trials the probe target item was identical to the target word on the preceding attentional display. On ignored repetition (IR) trials the probe target item was the same as the distractor word in the preceding attentional display. The experiments produced facilitated (positive) priming in the AR trials and delayed (negative) priming in the IR trials. Significantly, the positive and negative priming effects also replicated across both monolingual and bilingual groups of participants, despite the fact that the bilinguals were responding to the task in their non-dominant language.

  2. Illiteracy, Ignorance, and Willingness to Quit Smoking among Villagers in India

    Science.gov (United States)

    Gorty, Prasad V. S. N. R.; Allam, Apparao

    1992-01-01

    During the field work to control oral cancer, difficulty in communication was encountered with illiterates. A study to define the role of illiteracy, ignorance and willingness to quit smoking among the villagers was undertaken in a rural area surrounding Doddipatla Village, A.P., India. Out of a total population of 3,550, 272 (7.7%) persons, mostly in the age range of 21–50 years, attended a cancer detection camp. There were 173 (63.6%) females and 99 (36.4%) males, among whom 66 (M53 + F13) were smokers; 36.4% of males and 63% of females were illiterate. Among the illiterates, it was observed that smoking rate was high (56%) and 47.7% were ignorant of health effects of smoking. The attitude of illiterate smokers was encouraging, as 83.6% were willing to quit smoking. Further research is necessary to design health education material for 413.5 million illiterates living in India (1991 Indian Census). A community health worker, trained in the use of mass media coupled with a person‐to‐person approach, may help the smoker to quit smoking. PMID:1506267

  3. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of the specified analysis methodologies for the performance-related design basis events (PRDBE). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology which systematizes the methods and procedures to analyze the PRDBEs is as follows. Based on the operation mode suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable range of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation mode, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of the PRDBEs chosen based on each operation mode, the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for the SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis

  4. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  6. Methodology for the conceptual design of solar kitchens

    International Nuclear Information System (INIS)

    Macia G, A F; Estrada V, D A; Chejne J, F; Velasquez, H I; Rengifo, R

    2005-01-01

    A detailed description of the methodology for the conceptual design of solar kitchens is presented, which enables their detailed design. The methodology is based on three main phases that were identified naturally and quite intuitively, given the characteristics and conditions of the project: a conceptual phase, a detail phase and an execution phase

  7. E-detailing: information technology applied to pharmaceutical detailing.

    Science.gov (United States)

    Montoya, Isaac D

    2008-11-01

    E-detailing can be best described as the use of information technology in the field of pharmaceutical detailing. It is becoming highly popular among pharmaceutical companies because it maximizes the time of the sales force, cuts down the cost of detailing and increases physician prescribing. Thus, the application of information technology is proving to be beneficial to both physicians and pharmaceutical companies. When e-detailing was introduced in 1996, it was limited to the US; however, numerous other countries soon adopted this novel approach to detailing and now it is popular in many developed nations. The objective of this paper is to demonstrate the rapid growth of e-detailing in the field of pharmaceutical marketing. A review of the e-detailing literature was conducted in addition to personal conversations with physicians. E-detailing has the potential to reduce marketing costs, increase accessibility to physicians and offer many of the advantages of face-to-face detailing. E-detailing is gaining acceptance among physicians because they can access the information on a pharmaceutical product at their own time and convenience. However, the drug safety aspect of e-detailing has not been examined, and e-detailing remains a supplement to traditional detailing rather than a replacement for it.

  8. Spin masters how the media ignored the real news and helped reelect Barack Obama

    CERN Document Server

    Freddoso, David

    2013-01-01

    The biggest story of the election was how the media ignored the biggest story of the election. Amid all the breathless coverage of a non-existent War on Women, there was little or no coverage of Obama's war on the economy: how, for instance, part-time work is replacing full-time work; how low-wage jobs are replacing high-wage ones; how for Americans between the ages of 25 and 54 there are fewer jobs today than there were when the recession officially ended in 2009, and fewer, in fact, than at any time since mid-1997. The downsizing of the American economy wasn't the only story...

  9. On Moderator Detection in Anchoring Research: Implications of Ignoring Estimate Direction

    Directory of Open Access Journals (Sweden)

    Nathan N. Cheek

    2018-05-01

    Anchoring, whereby judgments assimilate to previously considered standards, is one of the most reliable effects in psychology. In the last decade, researchers have become increasingly interested in identifying moderators of anchoring effects. We argue that a drawback of traditional moderator analyses in the standard anchoring paradigm is that they ignore estimate direction—whether participants’ estimates are higher or lower than the anchor value. We suggest that failing to consider estimate direction can sometimes obscure moderation in anchoring tasks, and discuss three potential analytic solutions that take estimate direction into account. Understanding moderators of anchoring effects is essential for a basic understanding of anchoring and for applied research on reducing the influence of anchoring in real-world judgments. Considering estimate direction reduces the risk of failing to detect moderation.

  10. Effects of ignoring baseline on modeling transitions from intact cognition to dementia.

    Science.gov (United States)

    Yu, Lei; Tyas, Suzanne L; Snowdon, David A; Kryscio, Richard J

    2009-07-01

    This paper evaluates the effect of ignoring baseline when modeling transitions from intact cognition to dementia with mild cognitive impairment (MCI) and global impairment (GI) as intervening cognitive states. Transitions among states are modeled by a discrete-time Markov chain having three transient (intact cognition, MCI, and GI) and two competing absorbing states (death and dementia). Transition probabilities depend on two covariates, age and the presence/absence of an apolipoprotein E-epsilon4 allele, through a multinomial logistic model with shared random effects. Results are illustrated with an application to the Nun Study, a cohort of 678 participants 75+ years of age at baseline and followed longitudinally with up to ten cognitive assessments per nun.
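
    To make the modelling approach above concrete, the following is a minimal Python sketch of multinomial-logit (softmax) transition probabilities out of the "intact cognition" state, depending on age and APOE-e4 status. The coefficient values, the assumption that all four destination states are reachable in one step, and the omission of the shared random effects are all simplifications for illustration; this is not the fitted Nun Study model.

        import numpy as np

        # Hypothetical multinomial-logit coefficients for one-step transitions out of
        # the "intact cognition" state; columns: intercept, age (years), APOE-e4 (0/1).
        # "Remain intact" is the reference category. Values are invented, not the
        # fitted Nun Study estimates, and shared random effects are omitted.
        COEF = {
            "MCI":      np.array([-6.0, 0.05, 0.40]),
            "GI":       np.array([-8.0, 0.06, 0.30]),
            "dementia": np.array([-9.0, 0.07, 0.60]),
            "death":    np.array([-7.5, 0.06, 0.10]),
        }

        def transition_probs(age, apoe4):
            """One-step transition probabilities from 'intact' at a given assessment."""
            x = np.array([1.0, age, float(apoe4)])
            scores = {state: np.exp(beta @ x) for state, beta in COEF.items()}
            denom = 1.0 + sum(scores.values())   # the 1.0 is the reference ("intact")
            probs = {state: s / denom for state, s in scores.items()}
            probs["intact"] = 1.0 / denom
            return probs

        for age in (75, 85, 95):
            print(age, {k: round(v, 3) for k, v in transition_probs(age, apoe4=1).items()})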

  11. The wisdom of ignorant crowds: Predicting sport outcomes by mere recognition

    Directory of Open Access Journals (Sweden)

    Stefan M. Herzog

    2011-02-01

    The collective recognition heuristic is a simple forecasting strategy that bets on the fact that people's recognition knowledge of names is a proxy for their competitiveness: in sports, it predicts that the better-known team or player wins a game. We present two studies on the predictive power of recognition in forecasting soccer games (World Cup 2006 and UEFA Euro 2008) and analyze previously published results. The performance of the collective recognition heuristic is compared to two benchmarks: predictions based on official rankings and aggregated betting odds. Across three soccer and two tennis tournaments, the predictions based on recognition performed similarly to those based on rankings; when compared with betting odds, the heuristic fared reasonably well. Forecasts based on rankings - but not on betting odds - were improved by incorporating collective recognition information. We discuss the use of recognition for forecasting in sports and conclude that aggregating across individual ignorance spawns collective wisdom.
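
    A tiny Python sketch of the collective recognition heuristic described above: the team recognized by more respondents is predicted to win. The respondents' recognition sets and team names are invented for illustration; real applications would aggregate recognition surveys.

        # Collective recognition heuristic: predict that the name recognized by more
        # respondents wins. The respondents' recognition sets are invented examples.
        respondents = [
            {"Brazil", "Germany", "Italy"},
            {"Brazil", "Germany"},
            {"Germany"},
            {"Brazil", "Ghana"},
        ]

        def recognition_rate(team):
            return sum(team in known for known in respondents) / len(respondents)

        def predict_winner(team_a, team_b):
            ra, rb = recognition_rate(team_a), recognition_rate(team_b)
            if ra == rb:
                return None          # the heuristic makes no prediction (guess instead)
            return team_a if ra > rb else team_b

        print(predict_winner("Brazil", "Ghana"))    # -> 'Brazil' (recognized by 3/4 vs 1/4)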

  12. Zooplankton Methodology, Collection & identification - A field manual

    Digital Repository Service at National Institute of Oceanography (India)

    Goswami, S.C.

    and productivity would largely depend upon the use of correct methodology which involves collection of samples, fixation, preservation, analysis and computation of data. The detailed procedures on all these aspects are given in this manual....

  13. Food Quality Certificates and Research on Effect of Food Quality Certificates to Determinate Ignored Level of Buying Behavioral: A Case Study in Hitit University Feas Business Department

    Directory of Open Access Journals (Sweden)

    Hulya CAGIRAN KENDIRLI

    2014-12-01

    According to the results of the research, there is no relationship between students' demographic characteristics and their ignoring of food and quality legislation, but there is a relationship between gender and ignoring of food and quality legislation.

  14. Influences on physicians' adoption of electronic detailing (e-detailing).

    Science.gov (United States)

    Alkhateeb, Fadi M; Doucette, William R

    2009-01-01

    E-detailing means using digital technology: internet, video conferencing and interactive voice response. There are two types of e-detailing: interactive (virtual) and video. Currently, little is known about what factors influence physicians' adoption of e-detailing. The objectives of this study were to test a model of physicians' adoption of e-detailing and to describe physicians using e-detailing. A mail survey was sent to a random sample of 2000 physicians practicing in Iowa. Binomial logistic regression was used to test the model of influences on physician adoption of e-detailing. On the basis of Rogers' model of adoption, the independent variables included relative advantage, compatibility, complexity, peer influence, attitudes, years in practice, presence of restrictive access to traditional detailing, type of specialty, academic affiliation, type of practice setting and control variables. A total of 671 responses were received giving a response rate of 34.7%. A total of 141 physicians (21.0%) reported using of e-detailing. The overall adoption model for using either type of e-detailing was found to be significant. Relative advantage, peer influence, attitudes, type of specialty, presence of restrictive access and years of practice had significant influences on physician adoption of e-detailing. The model of adoption of innovation is useful to explain physicians' adoption of e-detailing.
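
    As a rough illustration of the kind of binomial logistic regression reported above, the sketch below simulates a survey data set whose variable names mirror the constructs in the abstract (relative advantage, peer influence, attitude, restricted access, years in practice) and fits an adoption model with statsmodels. The data and generating coefficients are invented; only the general modelling approach follows the abstract.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 300

        # Simulated survey; the adoption-generating coefficients below are invented.
        df = pd.DataFrame({
            "relative_advantage": rng.integers(1, 6, n),
            "peer_influence":     rng.integers(1, 6, n),
            "attitude":           rng.integers(1, 6, n),
            "restricted_access":  rng.integers(0, 2, n),
            "years_in_practice":  rng.integers(1, 36, n),
        })
        logit_p = (-4.0 + 0.5 * df.relative_advantage + 0.4 * df.peer_influence
                   + 0.3 * df.attitude + 0.8 * df.restricted_access
                   - 0.03 * df.years_in_practice)
        df["uses_edetailing"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

        # Binomial logistic regression of adoption on the hypothesized influences.
        model = smf.logit(
            "uses_edetailing ~ relative_advantage + peer_influence + attitude"
            " + restricted_access + years_in_practice",
            data=df,
        ).fit(disp=0)
        print(model.params.round(2))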

  15. Constructivism: a naturalistic methodology for nursing inquiry.

    Science.gov (United States)

    Appleton, J V; King, L

    1997-12-01

    This article will explore the philosophical underpinnings of the constructivist research paradigm. Despite its increasing popularity in evaluative health research studies there is limited recognition of constructivism in popular research texts. Lincoln and Guba's original approach to constructivist methodology is outlined and a detailed framework for nursing research is offered. Fundamental issues and concerns surrounding this methodology are debated and differences between method and methodology are highlighted.

  16. THE FUTURE OF LANGUAGE TEACHING METHODOLOGY

    OpenAIRE

    Ted Rodgers

    1998-01-01

    Abstract: This paper reviews the current state of ELT methodology, particularly in respect to a number of current views suggesting that the profession is now in a "post-methods" era in which previous attention to Methods (Total Physical Response, Silent Way, Natural Approach, etc.) has given way to a more generic approach to ELT methodology. Ten potential future courses of ELT methodology are outlined and three of these are considered in some detail. Particular consideration is given as to how...

  17. On Detailing in Contemporary Architecture

    DEFF Research Database (Denmark)

    Kristensen, Claus; Kirkegaard, Poul Henning

    2010-01-01

    Details in architecture have a significant influence on how architecture is experienced. One can touch the materials and analyse the detailing - thus details give valuable information about the architectural scheme as a whole. The absence of perceptual stimulation like details and materiality / tactility can blur the meaning of the architecture and turn it into an empty statement. The present paper will outline detailing in contemporary architecture and discuss the issue with respect to architectural quality. Architectural cases considered as sublime pieces of architecture will be presented...

  18. Econometric Analysis of 2003 Data on the Post-Service Earnings of Military Retirees: Methodology Report

    National Research Council Canada - National Science Library

    Mackin, Patrick C; Darling, Kimberly L

    2004-01-01

    ...). This report details how the estimation datasets were constructed from these two data sources and describes the econometric methodology in detail, including the definition of alternative models...

  19. Acoustic emission methodology and application

    CERN Document Server

    Nazarchuk, Zinoviy; Serhiyenko, Oleh

    2017-01-01

    This monograph analyses in detail the physical aspects of elastic wave radiation during the deformation or fracture of materials. It presents the methodological bases for the practical use of acoustic emission devices and describes the results of theoretical and experimental research on evaluating the crack growth resistance of materials and on selecting useful AE signals. The efficiency of this methodology is shown through the diagnostics of industrial objects of various purposes. The authors obtained the results of their experimental research with the help of new methods and facilities.

  20. Details

    Indian Academy of Sciences (India)

    teju

    2018-05-04

    May 4, 2018 ... ... selected candidate is required to work with Accounts Officer and assist in ... in website of Public Financial Management System etc., and carry out .... Duties also include coordination and liaison with Chief Editors and other ...

  1. Details

    Indian Academy of Sciences (India)

    Admin

    The Indian Academy of Sciences (IASc), an institution under the Department of Science & Technology, Government of India, publishes scholarly journals, thematic books and other publications. The Academy currently publishes 10 journals in various disciplines in science.

  2. Details

    Indian Academy of Sciences (India)

    The incumbent should have passed a Diploma in Secretarial Practice or a Bachelor of Commerce with at least 50% marks, and should be proficient in typing, shorthand and MS Office. Age: not more than 25 years as on 1 April 2017. Preference will be given to male candidates. Experience: 2 years experience in the administrative ...

  3. Phonological processing of ignored distractor pictures, an fMRI investigation.

    Science.gov (United States)

    Bles, Mart; Jansma, Bernadette M

    2008-02-11

    Neuroimaging studies of attention often focus on interactions between stimulus representations and top-down selection mechanisms in visual cortex. Less is known about the neural representation of distractor stimuli beyond visual areas, and the interactions between stimuli in linguistic processing areas. In the present study, participants viewed simultaneously presented line drawings at peripheral locations, while in the MRI scanner. The names of the objects depicted in these pictures were either phonologically related (i.e. shared the same consonant-vowel onset construction), or unrelated. Attention was directed either at the linguistic properties of one of these pictures, or at the fixation point (i.e. away from the pictures). Phonological representations of unattended pictures could be detected in the posterior superior temporal gyrus, the inferior frontal gyrus, and the insula. Under some circumstances, the name of ignored distractor pictures is retrieved by linguistic areas. This implies that selective attention to a specific location does not completely filter out the representations of distractor stimuli at early perceptual stages.

  4. Tobacco Usage in Uttarakhand: A Dangerous Combination of High Prevalence, Widespread Ignorance, and Resistance to Quitting

    Directory of Open Access Journals (Sweden)

    Nathan John Grills

    2015-01-01

    Background. Nearly one-third of adults in India use tobacco, resulting in 1.2 million deaths. However, little is known about knowledge, attitudes, and practices (KAP) related to smoking in the impoverished state of Uttarakhand. Methods. A cross-sectional epidemiological prevalence survey was undertaken. Multistage cluster sampling selected 20 villages and 50 households to survey, from which 1853 people were interviewed. Tobacco prevalence and KAP were analyzed by income level, occupation, age, and sex. 95% confidence intervals were calculated using standard formulas and incorporating assumptions in relation to the clustering effect. Results. The overall prevalence of tobacco usage, defined using WHO criteria, was 38.9%. 93% of smokers and 86% of tobacco chewers were male. Prevalence of tobacco use, controlling for other factors, was associated with lower education, older age, and male sex. 97.6% of users and 98.1% of nonusers wanted less tobacco. Except for lung cancer (89% awareness), awareness of diseases caused by tobacco usage was low (cardiac: 67%; infertility: 32.5%; stroke: 40.5%). Conclusion. A dangerous combination of high tobacco usage prevalence, ignorance about its dangers, and few quit attempts being made suggests the need to develop effective and evidence based interventions to prevent a health and development disaster in Uttarakhand.
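
    A short sketch of how a cluster-adjusted 95% confidence interval for a prevalence such as the 38.9% above can be computed by inflating the usual binomial variance with a design effect. The design effect value used here is a placeholder assumption, not a figure from the study.

        import math

        def prevalence_ci(p_hat, n, design_effect=1.0, z=1.96):
            """Wald 95% CI for a proportion, with the variance inflated by a design
            effect to allow for multistage cluster sampling."""
            se = math.sqrt(p_hat * (1 - p_hat) / n) * math.sqrt(design_effect)
            return p_hat - z * se, p_hat + z * se

        # Overall tobacco-use prevalence reported above: 38.9% of 1853 respondents.
        # The design effect of 2.0 is a placeholder, not the study's actual value.
        low, high = prevalence_ci(0.389, 1853, design_effect=2.0)
        print(f"38.9% (95% CI {low:.1%} to {high:.1%})")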

  5. Reassessing insurers' access to genetic information: genetic privacy, ignorance, and injustice.

    Science.gov (United States)

    Feiring, Eli

    2009-06-01

    Many countries have imposed strict regulations on the genetic information to which insurers have access. Commentators have warned against the emerging body of legislation for different reasons. This paper demonstrates that, when confronted with the argument that genetic information should be available to insurers for health insurance underwriting purposes, one should avoid appeals to rights of genetic privacy and genetic ignorance. The principle of equality of opportunity may nevertheless warrant restrictions. A choice-based account of this principle implies that it is unfair to hold people responsible for the consequences of the genetic lottery, since we have no choice in selecting our genotype or the expression of it. However appealing, this view does not take us all the way to an adequate justification of inaccessibility of genetic information. A contractarian account, suggesting that health is a condition of opportunity and that healthcare is an essential good, seems more promising. I conclude that if or when predictive medical tests (such as genetic tests) are developed with significant actuarial value, individuals have less reason to accept as fair institutions that limit access to healthcare on the grounds of risk status. Given the assumption that a division of risk pools in accordance with a rough estimate of people's level of (genetic) risk will occur, fairness and justice favour universal health insurance based on solidarity.

  7. IGNORING CHILDREN'S BEDTIME CRYING: THE POWER OF WESTERN-ORIENTED BELIEFS.

    Science.gov (United States)

    Maute, Monique; Perren, Sonja

    2018-03-01

    Ignoring children's bedtime crying (ICBC) is an issue that polarizes parents as well as pediatricians. While most studies have focused on the effectiveness of sleep interventions, no study has yet questioned which parents use ICBC. Parents often find children's sleep difficulties to be very challenging, but factors such as the influence of Western approaches to infant care, stress, and sensitivity have not been analyzed in terms of ICBC. A sample of 586 parents completed a questionnaire to investigate the relationships between parental factors and the method of ICBC. Data were analyzed using structural equation modeling. Latent variables were used to measure parental stress (Parental Stress Scale; J.O. Berry & W.H. Jones, 1995), sensitivity (Situation-Reaction-Questionnaire; Y. Hänggi, K. Schweinberger, N. Gugger, & M. Perrez, 2010), Western-oriented parental beliefs (Rigidity), and children's temperament (Parenting Stress Index; H. Tröster & R.R. Abidin). ICBC was used by 32.6% (n = 191) of parents in this study. Parents' Western-oriented beliefs predicted ICBC. Attitudes such as feeding a child on a time schedule and not carrying it out to prevent dependence were associated with letting the child cry to fall asleep. Low-sensitivity parents as well as parents of children with a difficult temperament used ICBC more frequently. Path analysis shows that parental stress did not predict ICBC. The results suggest that ICBC has become part of Western childrearing tradition. © 2018 Michigan Association for Infant Mental Health.

  8. Behavioural responses to human-induced change: Why fishing should not be ignored.

    Science.gov (United States)

    Diaz Pauli, Beatriz; Sih, Andrew

    2017-03-01

    Change in behaviour is usually the first response to human-induced environmental change and key for determining whether a species adapts to environmental change or becomes maladapted. Thus, understanding the behavioural response to human-induced changes is crucial in the interplay between ecology, evolution, conservation and management. Yet the behavioural response to fishing activities has been largely ignored. We review studies contrasting how fish behaviour affects catch by passive (e.g., long lines, angling) versus active gears (e.g., trawls, seines). We show that fishing not only targets certain behaviours, but it leads to a multitrait response including behavioural, physiological and life-history traits with population, community and ecosystem consequences. Fisheries-driven change (plastic or evolutionary) of fish behaviour and its correlated traits could impact fish populations well beyond their survival per se, affecting predation risk, foraging behaviour, dispersal, parental care, etc., and hence numerous ecological issues including population dynamics and trophic cascades. In particular, we discuss implications of behavioural responses to fishing for fisheries management and population resilience. More research on these topics, however, is needed to draw general conclusions, and we suggest fruitful directions for future studies.

  9. Experimental amplification of an entangled photon: what if the detection loophole is ignored?

    International Nuclear Information System (INIS)

    Pomarico, Enrico; Sanguinetti, Bruno; Sekatski, Pavel; Zbinden, Hugo; Gisin, Nicolas

    2011-01-01

    The experimental verification of quantum features, such as entanglement, at large scales is extremely challenging because of environment-induced decoherence. Indeed, measurement techniques for demonstrating the quantumness of multiparticle systems in the presence of losses are difficult to define, and if they are not sufficiently accurate they can provide wrong conclusions. We present a Bell test where one photon of an entangled pair is amplified and then detected by threshold detectors, whose signals undergo postselection. The amplification is performed by a classical machine, which produces a fully separable micro-macro state. However, by adopting such a technique one can surprisingly observe a violation of the Clauser-Horne-Shimony-Holt inequality. This is due to the fact that ignoring the detection loophole opened by the postselection and the system losses can lead to misinterpretations, such as claiming micro-macro entanglement in a setup where evidently it is not present. By using threshold detectors and postselection, one can only infer the entanglement of the initial pair of photons, and so micro-micro entanglement, as is further confirmed by the violation of a nonseparability criterion for bipartite systems. How to detect photonic micro-macro entanglement in the presence of losses with the currently available technology remains an open question.
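
    For readers unfamiliar with the Clauser-Horne-Shimony-Holt (CHSH) quantity mentioned above, the sketch below evaluates S = E(a,b) - E(a,b') + E(a',b) + E(a',b') for ideal polarization-entangled photons at textbook analyser settings, giving |S| of about 2.83 > 2. The angles and correlations are generic illustrations, not the settings or data of the experiment described in the abstract.

        import math

        def e_corr(theta_a, theta_b):
            """Ideal polarization correlation for a maximally entangled photon pair."""
            return math.cos(2.0 * math.radians(theta_a - theta_b))

        def chsh(e_ab, e_ab2, e_a2b, e_a2b2):
            """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')."""
            return e_ab - e_ab2 + e_a2b + e_a2b2

        # Textbook analyser angles (degrees) that maximize the quantum value; these are
        # generic illustrations, not the settings of the experiment described above.
        a, a2, b, b2 = 0.0, 45.0, 22.5, 67.5
        s = chsh(e_corr(a, b), e_corr(a, b2), e_corr(a2, b), e_corr(a2, b2))
        print(round(s, 3))   # 2.828 > 2: exceeds the local-realistic (CHSH) bound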

  10. Commentary: Ignorance as Bias: Radiolab, Yellow Rain, and “The Fact of the Matter”

    Directory of Open Access Journals (Sweden)

    Paul Hillmer

    2017-12-01

    In 2012 the National Public Radio show "Radiolab" released a podcast (later broadcast on air) essentially asserting that Hmong victims of a suspected chemical agent known as "yellow rain" were ignorant of their surroundings and the facts, and were merely victims of exposure, dysentery, tainted water, and other natural causes. Relying heavily on the work of Dr. Matthew Meselson, Dr. Thomas Seeley, and former CIA officer Merle Pribbenow, Radiolab asserted that Hmong victims mistook bee droppings, defecated en masse by flying Asian honey bees, as "yellow rain." They brought their foregone conclusions to an interview with Eng Yang, a self-described yellow rain survivor, and his niece, memoirist Kao Kalia Yang, who served as translator. The interview went horribly wrong when their dogged belief in the "bee dung hypothesis" was met with stiff and ultimately impassioned opposition. Radiolab's confirmation bias led them to dismiss contradictory scientific evidence and mislead their audience. While the authors remain agnostic about the potential use of yellow rain in Southeast Asia, they believe the evidence shows that further study is needed before a final conclusion can be reached.

  11. RHIC Data Correlation Methodology

    International Nuclear Information System (INIS)

    Michnoff, R.; D'Ottavio, T.; Hoff, L.; MacKay, W.; Satogata, T.

    1999-01-01

    A requirement for RHIC data plotting software and physics analysis is the correlation of data from all accelerator data gathering systems. Data correlation provides the capability for a user to request a plot of multiple data channels vs. time, and to make meaningful time-correlated data comparisons. The task of data correlation for RHIC requires careful consideration because data acquisition triggers are generated from various asynchronous sources including events from the RHIC Event Link, events from the two Beam Sync Links, and other unrelated clocks. In order to correlate data from asynchronous acquisition systems a common time reference is required. The RHIC data correlation methodology will allow all RHIC data to be converted to a common wall clock time, while still preserving native acquisition trigger information. A data correlation task force team, composed of the authors of this paper, has been formed to develop data correlation design details and provide guidelines for software developers. The overall data correlation methodology will be presented in this paper
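
    A schematic sketch of the core idea described above: samples triggered by asynchronous sources are mapped onto a common wall-clock timebase (while keeping their native trigger counts) and then paired within a tolerance. The trigger-source names, epochs, periods and tolerance are invented placeholders, not the actual RHIC implementation.

        from dataclasses import dataclass

        @dataclass
        class Sample:
            trigger_source: str   # e.g. "event_link" or "beam_sync" (illustrative names)
            trigger_count: int    # native acquisition trigger information, preserved as-is
            value: float

        # Hypothetical calibration: each asynchronous trigger source gets an epoch
        # (seconds, wall clock) and a tick period; a real system would measure these.
        EPOCH = {"event_link": 1000.0000, "beam_sync": 1000.0002}
        PERIOD = {"event_link": 0.001, "beam_sync": 0.00025}

        def to_wall_clock(s: Sample) -> float:
            """Map a native trigger count onto the common wall-clock timebase."""
            return EPOCH[s.trigger_source] + s.trigger_count * PERIOD[s.trigger_source]

        def correlate(channel_a, channel_b, tolerance=0.0005):
            """Pair samples whose wall-clock times agree to within a tolerance (seconds)."""
            return [
                (to_wall_clock(sa), sa.value, sb.value)
                for sa in channel_a
                for sb in channel_b
                if abs(to_wall_clock(sa) - to_wall_clock(sb)) <= tolerance
            ]

        a = [Sample("event_link", i, float(i)) for i in range(5)]
        b = [Sample("beam_sync", 4 * i, 10.0 + i) for i in range(5)]
        print(correlate(a, b))    # five time-matched (wall_clock, value_a, value_b) triples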

  12. Insights into PRA methodologies

    International Nuclear Information System (INIS)

    Gallagher, D.; Lofgren, E.; Atefi, B.; Liner, R.; Blond, R.; Amico, P.

    1984-08-01

    Probabilistic Risk Assessments (PRAs) for six nuclear power plants were examined to gain insight into how the choice of analytical methods can affect the results of PRAs. The PRA scope considered was limited to internally initiated accident sequences through core melt. For twenty methodological topic areas, a baseline or minimal methodology was specified. The choice of methods for each topic in the six PRAs was characterized in terms of the incremental level of effort above the baseline. A higher level of effort generally reflects a higher level of detail or a higher degree of sophistication in the analytical approach to a particular topic area. The impact on results was measured in terms of how additional effort beyond the baseline level changed the relative importance and ordering of dominant accident sequences compared to what would have been observed had methods corresponding to the baseline level of effort been employed. This measure of impact is a more useful indicator of how methods affect perceptions of plant vulnerabilities than changes in core melt frequency would be. However, the change in core melt frequency was used as a secondary measure of impact for nine topics where availability of information permitted. Results are presented primarily in the form of effort-impact matrices for each of the twenty topic areas. A suggested effort-impact profile for future PRAs is presented

  13. The detail is dead - long live the detail!

    DEFF Research Database (Denmark)

    Larsen, Steen Nepper; Dalgaard, Kim; Kerstens, Vencent

    2018-01-01

    ... architecture when we look into architectural history. Two classic examples are Adolf Loos, who provoked already in 1908 with his statement "Ornament and Crime", which contested the unconscious decorations of contemporary architects, and, similarly, referring to the little need for superfluous detailing, "Less ... not change the fact that it is more important than ever to bring this 'small' architectural world to attention. Today, the construction industry is dictated by an economic management that does not leave much room for thorough studies of architectural details or visionary experiments. Today's more efficient ..._Delft about the Symposium "The Detail is Dead - Long Live the Detail". For this occasion a number of leading Danish and Northern European architects, researchers and companies were invited to discuss and suggest their 'architectural detail' and the challenges they face in today's construction. This book...

  14. Detail in architecture: Between arts & crafts

    Science.gov (United States)

    Dulencin, Juraj

    2016-06-01

    Architectural detail represents an important part of architecture. Not only can it be used as an identifier of a specific building but at the same time enhances the experience of the realized project. Within it lie the signs of a great architect and clues to understanding his or her way of thinking. It is therefore the central topic of a seminar offered to architecture students at the Brno University of Technology. During the course of the semester-long class the students acquaint themselves with atypical architectural details of domestic and international architects by learning to read them, understand them and subsequently draw them by creating architectural blueprints. In other words, by general analysis of a detail the students learn theoretical thinking of its architect who, depending on the nature of the design, had to incorporate a variety of techniques and crafts. Students apply this analytical part to their own architectural detail design. The methodology of the seminar consists of experiential learning by project management and is complemented by a series of lectures discussing a diversity of details as well as materials and technologies required to implement it. The architectural detail design is also part of students' bachelors thesis, therefore, the realistic nature of their blueprints can be verified in the production process of its physical counterpart. Based on their own documentation the students choose the most suitable manufacturing process whether it is supplied by a specific technology or a craftsman. Students actively participate in the production and correct their design proposals in real scale with the actual material. A student, as a future architect, stands somewhere between a client and an artisan, materializes his or her idea and adjusts the manufacturing process so that the final detail fulfills aesthetic consistency and is in harmony with its initial concept. One of the very important aspects of the design is its economic cost, an

  15. A development methodology for scientific software

    International Nuclear Information System (INIS)

    Cort, G.; Barrus, D.M.; Goldstone, J.A.; Miller, L.; Nelson, R.O.; Poore, R.V.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed

  16. A simple approach to ignoring irrelevant variables by population decoding based on multisensory neurons

    Science.gov (United States)

    Kim, HyungGoo R.; Pitkow, Xaq; Angelaki, Dora E.

    2016-01-01

    Sensory input reflects events that occur in the environment, but multiple events may be confounded in sensory signals. For example, under many natural viewing conditions, retinal image motion reflects some combination of self-motion and movement of objects in the world. To estimate one stimulus event and ignore others, the brain can perform marginalization operations, but the neural bases of these operations are poorly understood. Using computational modeling, we examine how multisensory signals may be processed to estimate the direction of self-motion (i.e., heading) and to marginalize out effects of object motion. Multisensory neurons represent heading based on both visual and vestibular inputs and come in two basic types: “congruent” and “opposite” cells. Congruent cells have matched heading tuning for visual and vestibular cues and have been linked to perceptual benefits of cue integration during heading discrimination. Opposite cells have mismatched visual and vestibular heading preferences and are ill-suited for cue integration. We show that decoding a mixed population of congruent and opposite cells substantially reduces errors in heading estimation caused by object motion. In addition, we present a general formulation of an optimal linear decoding scheme that approximates marginalization and can be implemented biologically by simple reinforcement learning mechanisms. We also show that neural response correlations induced by task-irrelevant variables may greatly exceed intrinsic noise correlations. Overall, our findings suggest a general computational strategy by which neurons with mismatched tuning for two different sensory cues may be decoded to perform marginalization operations that dissociate possible causes of sensory inputs. PMID:27334948
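
    The toy simulation below illustrates the decoding idea described above under strong simplifying assumptions (cosine tuning, additive cue combination, least-squares readout; none of this is the paper's actual model). A linear decoder trained on the full mixed population of congruent and opposite cells recovers heading far more accurately in the presence of an object-motion perturbation of the visual cue than a decoder restricted to congruent cells.

        import numpy as np

        rng = np.random.default_rng(0)
        n_cells = 60
        pref = rng.uniform(-np.pi, np.pi, n_cells)       # visual preferred headings
        is_congruent = rng.random(n_cells) < 0.5         # congruent vs opposite cells

        def population_response(heading, object_offset):
            """Toy tuning: object motion shifts only the visual cue; opposite cells have
            vestibular tuning 180 degrees away from their visual tuning."""
            visual = np.cos((heading + object_offset) - pref)
            vestibular = np.where(is_congruent,
                                  np.cos(heading - pref),
                                  np.cos(heading - (pref + np.pi)))
            return visual + vestibular + 0.05 * rng.standard_normal(n_cells)

        def simulate(n_trials):
            headings = rng.uniform(-np.pi, np.pi, n_trials)
            offsets = rng.uniform(-0.5, 0.5, n_trials)   # object-motion perturbation (rad)
            R = np.array([population_response(h, o) for h, o in zip(headings, offsets)])
            Y = np.column_stack([np.cos(headings), np.sin(headings)])
            return R, Y, headings

        R_train, Y_train, _ = simulate(2000)
        R_test, _, h_test = simulate(500)

        def heading_rmse(cell_mask):
            """Fit a linear readout of (cos, sin) of heading; report test RMSE in degrees."""
            W, *_ = np.linalg.lstsq(R_train[:, cell_mask], Y_train, rcond=None)
            est = R_test[:, cell_mask] @ W
            err = np.angle(np.exp(1j * (np.arctan2(est[:, 1], est[:, 0]) - h_test)))
            return np.degrees(np.sqrt(np.mean(err ** 2)))

        print("congruent cells only :", round(heading_rmse(is_congruent), 1), "deg RMSE")
        print("congruent + opposite :", round(heading_rmse(np.ones(n_cells, bool)), 1), "deg RMSE")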

  17. Ignorance is no excuse for directors minimizing information asymmetry affecting boards

    Directory of Open Access Journals (Sweden)

    Eythor Ivar Jonsson

    2006-11-01

    This paper looks at information asymmetry at the board level and how lack of information has played a part in undermining the power of the board of directors. Information is power, and at board level, information is essential to keep the board knowledgeable about the failures and successes of the organization that it is supposed to govern. Although lack of information has become a popular excuse for boards, the mantra could – and should – be changing to, "Ignorance is no excuse" (Mueller, 1993). This paper explores some of the information system solutions that have the aim of resolving some of the problems of information asymmetry. Furthermore, three case studies are used to explore the problem of asymmetric information at board level and how the boards are trying to solve the problem. The focus of the discussion is to (a) describe how directors experience the information asymmetry and whether they find it troublesome, (b) examine how important information is for the control and strategy roles of the board, and (c) find out how boards can minimize the problem of asymmetric information. The research is conducted through semi-structured interviews with directors, managers and accountants. This paper offers an interesting exploration into information, or the lack of information, at board level. It describes both from a theoretical and practical viewpoint the problem of information asymmetry at board level and how companies are trying to solve this problem. It is an issue that has only been lightly touched upon in the corporate governance literature but is likely to attract more attention and research in the future.

  18. On the practice of ignoring center-patient interactions in evaluating hospital performance.

    Science.gov (United States)

    Varewyck, Machteld; Vansteelandt, Stijn; Eriksson, Marie; Goetghebeur, Els

    2016-01-30

    We evaluate the performance of medical centers based on a continuous or binary patient outcome (e.g., 30-day mortality). Common practice adjusts for differences in patient mix through outcome regression models, which include patient-specific baseline covariates (e.g., age and disease stage) besides center effects. Because a large number of centers may need to be evaluated, the typical model postulates that the effect of a center on outcome is constant over patient characteristics. This may be violated, for example, when some centers are specialized in children or geriatric patients. Including interactions between certain patient characteristics and the many fixed center effects in the model increases the risk for overfitting, however, and could imply a loss of power for detecting centers with deviating mortality. Therefore, we assess how the common practice of ignoring such interactions impacts the bias and precision of directly and indirectly standardized risks. The reassuring conclusion is that the common practice of working with the main effects of a center has minor impact on hospital evaluation, unless some centers actually perform substantially better on a specific group of patients and there is strong confounding through the corresponding patient characteristic. The bias is then driven by an interplay of the relative center size, the overlap between covariate distributions, and the magnitude of the interaction effect. Interestingly, the bias on indirectly standardized risks is smaller than on directly standardized risks. We illustrate our findings by simulation and in an analysis of 30-day mortality on Riksstroke. © 2015 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
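
    A small worked sketch of directly versus indirectly standardized risks for a single hypothetical center with two patient strata; all counts are invented for illustration, and the indirect figure is simply a standardized mortality ratio rescaled by the crude reference risk.

        # Direct vs indirect standardization for one hypothetical center, with two
        # patient strata (e.g. younger / older). All counts are invented.
        population = {   # reference population pooled across centers
            "younger": {"n": 8000, "deaths": 240},   # 3% reference mortality
            "older":   {"n": 2000, "deaths": 200},   # 10% reference mortality
        }
        center = {       # the center being evaluated
            "younger": {"n": 100, "deaths": 4},      # 4% observed
            "older":   {"n": 300, "deaths": 27},     # 9% observed
        }

        # Directly standardized risk: apply the center's stratum-specific rates to the
        # reference population's case mix.
        ref_total = sum(population[s]["n"] for s in population)
        direct = sum(
            (center[s]["deaths"] / center[s]["n"]) * population[s]["n"] for s in population
        ) / ref_total

        # Indirectly standardized risk: observed deaths over deaths expected if the
        # reference rates applied to the center's own case mix (an SMR), rescaled by
        # the crude reference risk.
        expected = sum(
            (population[s]["deaths"] / population[s]["n"]) * center[s]["n"] for s in center
        )
        observed = sum(center[s]["deaths"] for s in center)
        crude_reference = sum(p["deaths"] for p in population.values()) / ref_total
        indirect = (observed / expected) * crude_reference

        print(f"directly standardized risk:   {direct:.1%}")    # 5.0%
        print(f"indirectly standardized risk: {indirect:.1%}")  # about 4.1%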

  19. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  20. Sophisticated Approval Voting, Ignorance Priors, and Plurality Heuristics: A Behavioral Social Choice Analysis in a Thurstonian Framework

    Science.gov (United States)

    Regenwetter, Michel; Ho, Moon-Ho R.; Tsetlin, Ilia

    2007-01-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two…

  1. Learning to Ignore: A Modeling Study of a Decremental Cholinergic Pathway and Its Influence on Attention and Learning

    Science.gov (United States)

    Oros, Nicolas; Chiba, Andrea A.; Nitz, Douglas A.; Krichmar, Jeffrey L.

    2014-01-01

    Learning to ignore irrelevant stimuli is essential to achieving efficient and fluid attention, and serves as the complement to increasing attention to relevant stimuli. The different cholinergic (ACh) subsystems within the basal forebrain regulate attention in distinct but complementary ways. ACh projections from the substantia innominata/nucleus…

  2. Settlers Unsettled: Using Field Schools and Digital Stories to Transform Geographies of Ignorance about Indigenous Peoples in Canada

    Science.gov (United States)

    Castleden, Heather; Daley, Kiley; Sloan Morgan, Vanessa; Sylvestre, Paul

    2013-01-01

    Geography is a product of colonial processes, and in Canada, the exclusion from educational curricula of Indigenous worldviews and their lived realities has produced "geographies of ignorance". Transformative learning is an approach geographers can use to initiate changes in non-Indigenous student attitudes about Indigenous…

  3. The Ignorant Environmental Education Teacher: Students Get Empowered and Teach Philosophy of Nature Inspired by Ancient Greek Philosophy

    Science.gov (United States)

    Tsevreni, Irida

    2018-01-01

    This paper presents an attempt to apply Jacques Rancière's emancipatory pedagogy of "the ignorant schoolmaster" to environmental education, which emphasises environmental ethics. The paper tells the story of a philosophy of nature project in the framework of an environmental adult education course at a Second Chance School in Greece,…

  4. Ignorance, Vulnerability and the Occurrence of "Radical Surprises": Theoretical Reflections and Empirical Findings

    Science.gov (United States)

    Kuhlicke, C.

    2009-04-01

    By definition natural disasters always contain a moment of surprise. Their occurrence is mostly unforeseen and unexpected. They hit people unprepared, overwhelm them and expose their helplessness. Yet, there is surprisingly little known on the reasons for their being surprised. Aren't natural disasters expectable and foreseeable after all? Aren't the return rates of most hazards well known and shouldn't people be better prepared? The central question of this presentation is hence: Why do natural disasters so often radically surprise people at all (and how can we explain this being surprised)? In the first part of the presentation, it is argued that most approaches to vulnerability are not able to grasp this moment of surprise. On the contrary, they have their strength in unravelling the expectable: A person who is marginalized or even oppressed in everyday life is also vulnerable during times of crisis and stress, at least this is the central assumption of most vulnerability studies. In the second part, an understanding of vulnerability is developed, which allows taking into account such radical surprises. First, two forms of the unknown are differentiated: An area of the unknown an actor is more or less aware of (ignorance), and an area, which is not even known to be not known (nescience). The discovery of the latter is mostly associated with a "radical surprise", since it is per definition impossible to prepare for it. Second, a definition of vulnerability is proposed, which allows capturing the dynamics of surprises: People are vulnerable when they discover their nescience exceeding by definition previously established routines, stocks of knowledge and resources—in a general sense their capacities—to deal with their physical and/or social environment. This definition explicitly takes the view of different actors serious and departs from their being surprised. In the third part findings of a case study are presented, the 2002 flood in Germany. It is shown

  6. When Ignorance is Bliss - Information Asymmetries Enhance Prosocial Behavior in Dictator Games

    OpenAIRE

    Evguenia Winschel; Philipp Zahn

    2014-01-01

    In most laboratory experiments concerning prosocial behavior subjects are fully informed how their decision influences the payoff of other players. Outside the laboratory, however, individuals typically have to decide without such detailed knowledge. To assess the effect of information asymmetries on prosocial behavior, we conduct a laboratory experiment with a simple non-strategic interaction. A dictator has only limited knowledge about the benefits his prosocial action generates for a recip...

  7. A Review of Citation Analysis Methodologies for Collection Management

    Science.gov (United States)

    Hoffmann, Kristin; Doucette, Lise

    2012-01-01

    While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…

  8. Do Not ignore pulmonary hypertension any longer. It’s time to deal with it!

    Directory of Open Access Journals (Sweden)

    Ahmad Mirdamadi

    2011-08-01

    ... thromboembolic attacks in check. Then came a time of revolution in pulmonary hypertension management: with the emergence of advanced PH treatment, medicine became able to deal seriously with PH. This new strategy was shown to be able to prevent mortality in PH patients. Prostacyclin showed that it is possible to enhance PH patients' chance of survival. Phosphodiesterase inhibitor drugs, which had long been used for treating impotence, were demonstrated to be effective for reducing pulmonary pressure. Eventually, endothelin receptors were targeted; with the advent of endothelin receptor blockers such as bosentan, physicians' chances of helping PH patients were further improved. Today, with advanced PH treatment, PH is no longer regarded as it was before, nor the science of medicine as a failed discipline. It is important not to forget PH in patients, especially seriously ill patients or those intractable to traditional treatment, in surgery, obstetric, pediatric, internal medicine, ICU or CCU wards of hospitals. With timely diagnosis, it will be possible to control PH patients in an effective way and to enhance their chance of survival. So it is time now to pay more attention to PH: do not ignore it any longer, and it is time to deal with it.

  9. Beneficiation and agglomeration of manganese ore fines (an area so important and yet so ignored)

    Science.gov (United States)

    Sane, R.

    2018-01-01

    Unpredictable changes in demand, and prices swinging from very attractive to depressing levels, have thrown all manganese ore mines out of their normal operating gear. Supply has to be time-bound, of dependable quality and continuous. With the setting-up of numerous small units alongside existing ferro-alloy units, ore supply has become an extremely sensitive issue. Because of the unpredictable swings in the price of Mn ore lumps, furnace operators found it economic and convenient to use fines, even at great risk to furnace equipment and operating persons, and the risks and damages were therefore conveniently and comfortably ignored. Approximate beneficiation cost (operating) for ferruginous ore by the roast-reduction followed by magnetic-separation route: water Rs. 20, power Rs. 490, coal fines Rs. 675, OH Rs. 250, totalling Rs. 1435/T (figures are based on actual data from investigations on Orissa and Karnataka sector ores). Feed grade: Mn 28 to 32%, Fe 14 to 25%. Concentrate (beneficiated ore fines): Mn 45 to 48%, Fe 6 to 8%; recovery 35%. Price of 28-30% Mn ore fines = Rs. 2400/T; cost of concentrated fines (45-48% Mn grade) = Rs. 8300/T; price of 47-48% Mn lumpy ore = Rs. 11000/T. Approximate sintering cost (operating): Rs. 1195/T of sinter. Therefore the cost of sinter produced from beneficiated concentrate is 9130 + 1195 = Rs. 10325/T. The difference in cost between 48% Mn ore lumps and 48% Mn sintered concentrate is 11000 - 10325 = Rs. 675/T. The main purpose of this paper is to show that the establishment of a beneficiation unit and a sintering unit is economically feasible. There are many misconceptions still prevailing about the use of Mn ore sinters. A few of the main ones are: (1) sinters bring no benefit, technical or economical; (2) sinters are very friable and disintegrate easily into high fines during handling/transportation; (3) fines below 100 mesh cannot be sintered; (4) silica increases to a high level during sintering, resulting in high slag volume and thereby higher power consumption. All are false
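
    The cost arithmetic quoted above can be reproduced as follows; the Rs. 9130/T concentrate-per-tonne-of-sinter figure is taken as given from the abstract (its derivation is not stated there), and the final line shows one plausible, assumed reading of how the Rs. 8300/T concentrate cost could arise from the fines price, recovery and beneficiation cost.

        # Cost figures quoted in the abstract (rupees per tonne).
        SINTERING_COST = 1195                   # operating cost per tonne of sinter
        CONCENTRATE_COST_PER_T_SINTER = 9130    # as quoted; derivation not stated above
        LUMPY_ORE_PRICE = 11000                 # 47-48% Mn lumpy ore

        sinter_cost = CONCENTRATE_COST_PER_T_SINTER + SINTERING_COST
        saving = LUMPY_ORE_PRICE - sinter_cost
        print("cost of sinter from beneficiated fines: Rs", sinter_cost, "/T")   # 10325
        print("saving versus 48% Mn lumpy ore:         Rs", saving, "/T")        # 675

        # One plausible (assumed) reading of the Rs 8300/T concentrate cost: fines price
        # scaled up by the 35% recovery, plus the Rs 1435/T beneficiation operating cost.
        FINES_PRICE, RECOVERY, BENEFICIATION_COST = 2400, 0.35, 1435
        print("implied concentrate cost: Rs",
              round(FINES_PRICE / RECOVERY + BENEFICIATION_COST), "/T")          # ~8292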

  10. Intranasal oxytocin impedes the ability to ignore task-irrelevant facial expressions of sadness in students with depressive symptoms.

    Science.gov (United States)

    Ellenbogen, Mark A; Linnen, Anne-Marie; Cardoso, Christopher; Joober, Ridha

    2013-03-01

    The administration of oxytocin promotes prosocial behavior in humans. The mechanism by which this occurs is unknown, but it likely involves changes in social information processing. In a randomized placebo-controlled study, we examined the influence of intranasal oxytocin and placebo on the interference control component of inhibition (i.e. ability to ignore task-irrelevant information) in 102 participants using a negative affective priming task with sad, angry, and happy faces. In this task, participants are instructed to respond to a facial expression of emotion while simultaneously ignoring another emotional face. On the subsequent trial, the previously-ignored emotional valence may become the emotional valence of the target face. Inhibition is operationalized as the differential delay between responding to a previously-ignored emotional valence and responding to an emotional valence unrelated to the previous one. Although no main effect of drug administration on inhibition was observed, a drug × depressive symptom interaction (β = -0.25; t = -2.6, p < 0.05) predicted the inhibition of sad faces. Relative to placebo, participants with high depression scores who were administered oxytocin were unable to inhibit the processing of sad faces. There was no relationship between drug administration and inhibition among those with low depression scores. These findings are consistent with increasing evidence that oxytocin alters social information processing in ways that have both positive and negative social outcomes. Because elevated depression scores are associated with an increased risk for major depressive disorder, difficulties inhibiting mood-congruent stimuli following oxytocin administration may be associated with risk for depression. Copyright © 2012 Elsevier Ltd. All rights reserved.
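
    A minimal sketch of how the inhibition (negative affective priming) index described above could be computed from trial-level reaction times: trials on which the target's emotion was ignored on the previous trial are compared with control trials. The field names and reaction times are invented for illustration, not the study's data or exact scoring rules.

        from statistics import mean

        # Each trial records a reaction time (ms), the target's emotion, and the emotion
        # ignored on the previous trial. Field names and values are invented.
        trials = [
            {"rt": 640, "target": "sad", "previously_ignored": "sad"},    # ignored repetition
            {"rt": 655, "target": "sad", "previously_ignored": "sad"},
            {"rt": 610, "target": "sad", "previously_ignored": "happy"},  # control
            {"rt": 605, "target": "sad", "previously_ignored": "angry"},
        ]

        def negative_priming(trials, emotion):
            """Inhibition index: mean RT when the target's emotion was ignored on the
            previous trial minus mean RT on control trials (positive = intact inhibition)."""
            repeat = [t["rt"] for t in trials
                      if t["target"] == emotion and t["previously_ignored"] == emotion]
            control = [t["rt"] for t in trials
                       if t["target"] == emotion and t["previously_ignored"] != emotion]
            return mean(repeat) - mean(control)

        print(negative_priming(trials, "sad"), "ms")   # 40.0 ms slowing for ignored-sad repeats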

  11. Cross-modal selective attention: on the difficulty of ignoring sounds at the locus of visual attention.

    Science.gov (United States)

    Spence, C; Ranson, J; Driver, J

    2000-02-01

    In three experiments, we investigated whether the ease with which distracting sounds can be ignored depends on their distance from fixation and from attended visual events. In the first experiment, participants shadowed an auditory stream of words presented behind their heads, while simultaneously fixating visual lip-read information consistent with the relevant auditory stream, or meaningless "chewing" lip movements. An irrelevant auditory stream of words, which participants had to ignore, was presented either from the same side as the fixated visual stream or from the opposite side. Selective shadowing was less accurate in the former condition, implying that distracting sounds are harder to ignore when fixated. Furthermore, the impairment when fixating toward distractor sounds was greater when speaking lips were fixated than when chewing lips were fixated, suggesting that people find it particularly difficult to ignore sounds at locations that are actively attended for visual lipreading rather than merely passively fixated. Experiments 2 and 3 tested whether these results are specific to cross-modal links in speech perception by replacing the visual lip movements with a rapidly changing stream of meaningless visual shapes. The auditory task was again shadowing, but the active visual task was now monitoring for a specific visual shape at one location. A decrement in shadowing was again observed when participants passively fixated toward the irrelevant auditory stream. This decrement was larger when participants performed a difficult active visual task there versus fixating, but not for a less demanding visual task versus fixation. The implications for cross-modal links in spatial attention are discussed.

  12. Research on injury compensation and health outcomes: ignoring the problem of reverse causality led to a biased conclusion.

    Science.gov (United States)

    Spearing, Natalie M; Connelly, Luke B; Nghiem, Hong S; Pobereskin, Louis

    2012-11-01

    This study highlights the serious consequences of ignoring reverse causality bias in studies on compensation-related factors and health outcomes and demonstrates a technique for resolving this problem of observational data. Data from an English longitudinal study on factors, including claims for compensation, associated with recovery from neck pain (whiplash) after rear-end collisions are used to demonstrate the potential for reverse causality bias. Although it is commonly believed that claiming compensation leads to worse recovery, it is also possible that poor recovery may lead to compensation claims--a point that is seldom considered and never addressed empirically. This pedagogical study compares the association between compensation claiming and recovery when reverse causality bias is ignored and when it is addressed, controlling for the same observable factors. When reverse causality is ignored, claimants appear to have a worse recovery than nonclaimants; however, when reverse causality bias is addressed, claiming compensation appears to have a beneficial effect on recovery, ceteris paribus. To avert biased policy and judicial decisions that might inadvertently disadvantage people with compensable injuries, there is an urgent need for researchers to address reverse causality bias in studies on compensation-related factors and health. Copyright © 2012 Elsevier Inc. All rights reserved.
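
    The stylized simulation below illustrates the reverse-causality problem discussed above: a latent injury severity drives both poor recovery and the decision to claim, so a naive comparison makes claiming look harmful even though claiming has no causal effect at all in the simulation. All parameters are invented; this is not the authors' model or data.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 20_000

        # Latent injury severity drives BOTH recovery and the decision to claim;
        # claiming itself has zero causal effect on recovery in this simulation.
        severity = rng.normal(size=n)
        recovered = rng.random(n) < 1.0 / (1.0 + np.exp(-(1.0 - 1.5 * severity)))
        claimed = rng.random(n) < 1.0 / (1.0 + np.exp(-(-1.0 + 1.5 * severity)))

        naive_diff = recovered[claimed].mean() - recovered[~claimed].mean()
        print(f"recovery among claimants:     {recovered[claimed].mean():.1%}")
        print(f"recovery among non-claimants: {recovered[~claimed].mean():.1%}")
        print(f"naive difference: {naive_diff:+.1%} (true causal effect of claiming: zero)")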

  13. DAGAL: Detailed Anatomy of Galaxies

    Science.gov (United States)

    Knapen, Johan H.

    2017-03-01

    The current IAU Symposium is closely connected to the EU-funded network DAGAL (Detailed Anatomy of Galaxies), with the final annual network meeting of DAGAL being at the core of this international symposium. In this short paper, we give an overview of DAGAL, its training activities, and some of the scientific advances that have been made under its umbrella.

  14. Unattended Monitoring System Design Methodology

    International Nuclear Information System (INIS)

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-01-01

    A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy-based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analysis then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to those critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real world situations

  15. A methodology for string resolution

    International Nuclear Information System (INIS)

    Karonis, N.T.

    1992-11-01

    In this paper we present a methodology, not a tool. We present this methodology with the intent that it be adopted, on a case by case basis, by each of the existing tools in EPICS. In presenting this methodology, we describe each of its two components in detail and conclude with an example depicting how the methodology can be used across a pair of tools. The task of any control system is to provide access to the various components of the machine being controlled, for example, the Advanced Photon Source (APS). By access, we mean the ability to monitor the machine's status (reading) as well as the ability to explicitly change its status (writing). The Experimental Physics and Industrial Control System (EPICS) is a set of tools, designed to act in concert, that allows one to construct a control system. EPICS provides the ability to construct a control system that allows reading and writing access to the machine. It does this through the notion of databases. Each of the components of the APS that is accessed by the control system is represented in EPICS by a set of named database records. Once this abstraction is made, from physical device to named database records, the process of monitoring and changing the state of that device becomes the simple process of reading and writing information from and to its associated named records
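    The named-record abstraction described above (monitoring a device by reading its records, controlling it by writing to them) can be sketched with the pyepics Python bindings, a later wrapper around EPICS Channel Access and not one of the tools the 1992 report discusses. The record names below are hypothetical placeholders, and running the snippet requires a reachable IOC that actually serves them.

        # Reading and writing named database records -- a minimal sketch using pyepics.
        # The PV names are hypothetical; an IOC serving them must be reachable.
        import epics

        READBACK_PV = "APS:PS01:CurrentReadback"   # hypothetical record name
        SETPOINT_PV = "APS:PS01:CurrentSetpoint"   # hypothetical record name

        # Monitor the machine's status by reading a named record.
        print("present readback:", epics.caget(READBACK_PV))

        # Change the machine's status by writing to a named record.
        epics.caput(SETPOINT_PV, 12.5, wait=True)  # block until the write completes
        print("new readback:", epics.caget(READBACK_PV))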

  16. Brightness Variations of Sun-like Stars: The Mystery Deepens - Astronomers facing Socratic "ignorance"

    Science.gov (United States)

    2009-12-01

    ], achieving an impressive collection of the properties of these variable stars. Outstanding sets of data like the one collected by Nicholls and her colleagues often offer guidance on how to solve a cosmic puzzle by narrowing down the plethora of possible explanations proposed by the theoreticians. In this case, however, the observations are incompatible with all the previously conceived models and re-open an issue that has been thoroughly debated. Thanks to this study, astronomers are now aware of their own "ignorance" - a genuine driver of the knowledge-seeking process, as the ancient Greek philosopher Socrates is said to have taught. "The newly gathered data show that pulsations are an extremely unlikely explanation for the additional variation," says team leader Peter Wood. "Another possible mechanism for producing luminosity variations in a star is to have the star itself move in a binary system. However, our observations are strongly incompatible with this hypothesis too." The team found from further analysis that whatever the cause of these unexplained variations is, it also causes the giant stars to eject mass either in clumps or as an expanding disc. "A Sherlock Holmes is needed to solve this very frustrating mystery," concludes Nicholls. Notes [1] Precise brightness measurements were made by the MACHO and OGLE collaborations, running on telescopes in Australia and Chile, respectively. The OGLE observations were made at the same time as the VLT observations. More information This research was presented in two papers: one appeared in the November issue of the Monthly Notices of the Royal Astronomical Society ("Long Secondary Periods in Variable Red Giants", by C. P. Nicholls et al.), and the other has just been published in the Astrophysical Journal ("Evidence for mass ejection associated with long secondary periods in red giants", by P. R. Wood and C. P. Nicholls). The team is composed of Christine P. Nicholls and Peter R. Wood (Research School of Astronomy and

  17. Covariance Evaluation Methodology for Neutron Cross Sections

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M.; Arcilla, R.; Mattoon, C.M.; Mughabghab, S.F.; Oblozinsky, P.; Pigni, M.; Pritychenko, B.; Sonzogni, A.A.

    2008-09-01

    We present the NNDC-BNL methodology for estimating neutron cross section covariances in the thermal, resolved resonance, unresolved resonance and fast neutron regions. The three key elements of the methodology are the Atlas of Neutron Resonances, the nuclear reaction code EMPIRE, and a Bayesian code implementing the Kalman filter concept. The covariance data processing, visualization and distribution capabilities are integral components of the NNDC methodology. We illustrate its application on examples including a relatively detailed evaluation of covariances for two individual nuclei and massive production of simple covariance estimates for 307 materials. Certain peculiarities regarding the evaluation of covariances for resolved resonances and the consistency between resonance parameter uncertainties and thermal cross section uncertainties are also discussed.
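    The Kalman filter step at the core of such a Bayesian evaluation can be sketched generically as follows; the parameters, sensitivities and uncertainties are hypothetical numbers, and the snippet only illustrates the update formulas, not the EMPIRE/Kalman code used by the NNDC.

        # Generic Kalman (Bayesian) update of model-parameter covariances from a
        # cross-section measurement; all numbers are hypothetical.
        import numpy as np

        p0 = np.array([1.00, 0.95])            # prior model parameters
        P0 = np.diag([0.10**2, 0.15**2])       # prior parameter covariance

        # Sensitivities of the calculated cross sections to the parameters, d(sigma)/d(p).
        S = np.array([[2.0, 0.5],
                      [0.8, 1.5],
                      [0.3, 2.2]])

        sigma_exp = np.array([2.9, 2.4, 2.5])          # measured cross sections
        V = np.diag([0.2**2, 0.2**2, 0.3**2])          # measurement covariance

        sigma_calc = S @ p0                            # prediction at the prior parameters

        # Kalman gain, posterior parameters, and reduced posterior covariance.
        K = P0 @ S.T @ np.linalg.inv(S @ P0 @ S.T + V)
        p1 = p0 + K @ (sigma_exp - sigma_calc)
        P1 = (np.eye(len(p0)) - K @ S) @ P0

        print("posterior parameters:", p1)
        print("posterior parameter covariance:\n", P1)
        print("implied cross-section covariance:\n", S @ P1 @ S.T)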

  18. Neural Networks Methodology and Applications

    CERN Document Server

    Dreyfus, Gérard

    2005-01-01

    Neural networks represent a powerful data processing technique that has reached maturity and broad application. When clearly understood and appropriately used, they are a mandatory component in the toolbox of any engineer who wants to make the best use of the available data, in order to build models, make predictions, mine data, recognize shapes or signals, etc. Ranging from theoretical foundations to real-life applications, this book is intended to provide engineers and researchers with clear methodologies for taking advantage of neural networks in industrial, financial or banking applications, many instances of which are presented in the book. For the benefit of readers wishing to gain deeper knowledge of the topics, the book features appendices that provide theoretical details for greater insight, and algorithmic details for efficient programming and implementation. The chapters have been written by experts and seamlessly edited to present a coherent and comprehensive, yet not redundant, practically-oriented...

  19. Country report: a methodology

    International Nuclear Information System (INIS)

    Colin, A.

    2013-01-01

    This paper describes a methodology that could be applied to establish a country report. In the framework of nuclear non-proliferation appraisal and IAEA safeguards implementation, it is important to be able to assess the potential existence of undeclared nuclear materials and activities, as well as undeclared facilities, in the country under review. In our view, a country report should aim at providing detailed information on nuclear-related activities for each country examined taken 'as a whole', such as nuclear development, scientific and technical capabilities, etc. In order to study a specific country, we need to know whether there is already an operating civil nuclear programme. If there is, we have to check carefully whether nuclear material could be diverted, whether declared facilities are misused, or whether undeclared facilities are operated and undeclared activities conducted with the aim of manufacturing a nuclear weapon. If there is not, we should pay attention to the development of a civil nuclear project. A country report is based on a wide span of information (most of the time coming from open sources, but sometimes also from confidential or private ones). Therefore, it is important to carefully check the nature and the credibility (reliability?) of these sources through cross-check examination. Finally, it is necessary to merge information from different sources and apply an expertise filter. We have at our disposal many capable tools to help us assess, understand and evaluate the situation (cartography, imagery, bibliometry, etc.). These tools allow us to offer the best conclusions possible. The paper is followed by the slides of the presentation. (author)

  20. The effects of methylphenidate on prepulse inhibition during attended and ignored prestimuli among boys with attention-deficit hyperactivity disorder.

    Science.gov (United States)

    Hawk, Larry W; Yartz, Andrew R; Pelham, William E; Lock, Thomas M

    2003-01-01

    The present study investigated attentional modification of prepulse inhibition of startle among boys with and without attention-deficit hyperactivity disorder (ADHD). Two hypotheses were tested: (1) whether ADHD is associated with diminished prepulse inhibition during attended prestimuli, but not ignored prestimuli, and (2) whether methylphenidate selectively increases prepulse inhibition to attended prestimuli among boys with ADHD. Participants were 17 boys with ADHD and 14 controls. Participants completed a tone discrimination task in each of two sessions separated by 1 week. ADHD boys were administered methylphenidate (0.3 mg/kg) in one session and placebo in the other session in a randomized, double-blind fashion. During each series of 72 tones (75 dB; half 1200-Hz, half 400-Hz), participants were paid to attend to one pitch and ignore the other. Bilateral eyeblink electromyogram startle responses were recorded in response to acoustic probes (50-ms, 102-dB white noise) presented following the onset of two-thirds of tones, and during one-third of intertrial intervals. Relative to controls, boys with ADHD exhibited diminished prepulse inhibition 120 ms after onset of attended but not ignored prestimuli following placebo administration. Methylphenidate selectively increased prepulse inhibition to attended prestimuli at 120 ms among boys with ADHD to a level comparable to that of controls, who did not receive methylphenidate. These data are consistent with the hypothesis that ADHD involves diminished selective attention and suggest that methylphenidate ameliorates the symptoms of ADHD, at least in part, by altering an early attentional mechanism.
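    In paradigms of this kind, prepulse inhibition is conventionally quantified as the percentage reduction in startle magnitude on prepulse trials relative to probe-alone trials. The snippet below shows that standard formula with hypothetical eyeblink EMG magnitudes; it is not necessarily the exact scoring used in the study above.

        # Conventional percent prepulse inhibition (PPI) score; magnitudes are hypothetical.
        def percent_ppi(startle_prepulse: float, startle_alone: float) -> float:
            """100 * (1 - prepulse-trial startle / probe-alone startle)."""
            return 100.0 * (1.0 - startle_prepulse / startle_alone)

        # Hypothetical eyeblink EMG magnitudes (arbitrary units):
        print(percent_ppi(startle_prepulse=42.0, startle_alone=70.0))   # 40.0 -> 40% inhibition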

  1. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Horton, D.G.

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  2. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    D.G. Horton

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  3. Hybrid probabilistic and possibilistic safety assessment. Methodology and application

    International Nuclear Information System (INIS)

    Kato, Kazuyuki; Amano, Osamu; Ueda, Hiroyoshi; Ikeda, Takao; Yoshida, Hideji; Takase, Hiroyasu

    2002-01-01

    This paper presents a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques, respectively. The methodology has been applied to the safety assessment of geological disposal of high-level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews with the experts, while variability was formulated by means of probability density functions (pdfs) based on available data sets. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on expert opinion and in providing information on the dependence of assessment results on the level of conservatism. In addition, it was shown that sensitivity analysis can identify key parameters contributing to uncertainties associated with results of the overall assessment. The information mentioned above can be utilized to support decision-making and to guide the process of disposal system development and optimization of protection against potential exposure. (author)
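    The hybrid propagation idea can be sketched with a toy model in which a parameter subject to variability is sampled from a pdf, while a parameter subject to ignorance is represented as a triangular fuzzy number whose alpha-cuts give intervals of model outputs. The functions, distributions and numbers below are assumptions for illustration only, not the paper's assessment model.

        # Schematic hybrid Monte Carlo / fuzzy (alpha-cut) propagation.
        # Toy model: dose = release_rate / dilution, where release_rate carries
        # variability (lognormal pdf) and dilution carries ignorance (triangular
        # fuzzy number). Illustrative only; not the paper's assessment model.
        import numpy as np

        rng = np.random.default_rng(1)
        release_rate = rng.lognormal(mean=0.0, sigma=0.5, size=5_000)   # variability

        lo, mode, hi = 5.0, 10.0, 20.0       # triangular fuzzy dilution factor

        def alpha_cut(alpha):
            """Interval of the triangular fuzzy number at membership level alpha."""
            return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

        for alpha in (0.0, 0.5, 1.0):
            d_lo, d_hi = alpha_cut(alpha)
            # Smallest dilution gives the largest dose and vice versa, for every sample.
            dose_hi = release_rate / d_lo
            dose_lo = release_rate / d_hi
            print(f"alpha={alpha:.1f}: 95th-percentile dose in "
                  f"[{np.percentile(dose_lo, 95):.3f}, {np.percentile(dose_hi, 95):.3f}]")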

  4. Detailed clinical models: a review.

    Science.gov (United States)

    Goossen, William; Goossen-Baremans, Anneke; van der Zel, Michael

    2010-12-01

    Due to the increasing use of electronic patient records and other health care information technology, we see an increase in requests to utilize these data. A high level of standardization is required when gathering these data in the clinical context in order to use them for analyses. Detailed Clinical Models (DCM) have been created toward this purpose and several initiatives have been implemented in various parts of the world to create standardized models. This paper presents a review of DCM. Two types of analyses are presented: one comparing DCM against health care information architectures, and a second, bottom-up approach from concept analysis to representation. In addition, core parts of the draft ISO standard 13972 on DCM are used, such as clinician involvement, data element specification, modeling, meta information, and repository and governance. Six initiatives were selected: Intermountain Healthcare, 13606/OpenEHR Archetypes, Clinical Templates, Clinical Contents Models, Health Level 7 templates, and Dutch Detailed Clinical Models. Each model selected was reviewed for its overall development, involvement of clinicians, use of data types, code bindings, expression of semantics, modeling, meta information, and use of repository and governance. Using both a top-down and a bottom-up approach to comparison reveals many commonalities and differences between initiatives. Important differences include the use of or lack of a reference model and the expressiveness of models. Applying clinical data element standards facilitates the use of conceptual DCM models in different technical representations.

  5. Application of a very detailed soil survey method in viticultural zoning in Catalonia, Spain

    Directory of Open Access Journals (Sweden)

    Josep Miquel Ubalde

    2009-06-01

    Significance and impact of study: This study showed how very detailed soil maps, which can be difficult to interpret and put into practice, can be valorised as viticultural zoning maps by means of a simple methodology.

  6. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each of the methodologies is explained in this report, along with examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely Influence diagrams and the RES methodology. In conclusion, a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  7. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  8. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation has required ozone-depleting chemicals (Class 1 & 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a guideline to help direct the research for replacement technology. The approach for prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used in order to determine numerical values which would correspond to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research for chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
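    At its core, a QFD-style prioritization of this kind reduces to a weighted scoring matrix: concern weights multiplied by per-process scores and summed into an urgency rating. The concerns, weights, processes and scores below are hypothetical and are meant only to illustrate that arithmetic, not to reproduce the workbook's matrices.

        # QFD-style prioritization: weighted sum of concern scores per process.
        # Concerns, weights, processes and scores are hypothetical.
        import numpy as np

        concerns = ["ozone depletion", "worker safety", "replacement cost", "schedule risk"]
        weights = np.array([9, 7, 4, 3], dtype=float)      # relative importance of each concern

        processes = ["vapor degreasing (CFC-113)", "hand-wipe cleaning", "foam blowing"]
        # Rows are processes, columns are concerns; a higher score means greater concern.
        scores = np.array([
            [9, 5, 6, 4],
            [3, 4, 2, 2],
            [7, 3, 5, 6],
        ], dtype=float)

        urgency = scores @ weights                          # replacement-urgency rating
        for name, rating in sorted(zip(processes, urgency), key=lambda t: -t[1]):
            print(f"{name:28s} urgency = {rating:.0f}")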

  9. Frequent methodological errors in clinical research.

    Science.gov (United States)

    Silva Aycaguer, L C

    2018-03-07

    Several errors that are frequently present in clinical research are listed, discussed and illustrated. A distinction is made between what can be considered an "error" arising from ignorance or neglect and what stems from a lack of integrity on the part of researchers, although it is recognized and documented that it is not easy to establish when we are dealing with one case and when with the other. The work does not intend to make an exhaustive inventory of such problems, but focuses on those that, while frequent, are usually less evident or less emphasized in the various lists of this type of problem that have been published. We have chosen to develop in detail the examples that illustrate the problems identified, instead of providing a list of errors accompanied by a superficial description of their characteristics. Copyright © 2018 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.

  10. Global sea-level rise is recognised, but flooding from anthropogenic land subsidence is ignored around northern Manila Bay, Philippines.

    Science.gov (United States)

    Rodolfo, Kelvin S; Siringan, Fernando P

    2006-03-01

    Land subsidence resulting from excessive extraction of groundwater is particularly acute in East Asian countries. Some Philippine government sectors have begun to recognise that the sea-level rise of one to three millimetres per year due to global warming is a cause of worsening floods around Manila Bay, but are oblivious to, or ignore, the principal reason: excessive groundwater extraction is lowering the land surface by several centimetres to more than a decimetre per year. Such ignorance allows the government to treat flooding as a lesser problem that can be mitigated through large infrastructural projects that are both ineffective and vulnerable to corruption. Money would be better spent on preventing the subsidence by reducing groundwater pumping and moderating population growth and land use, but these approaches are politically and psychologically unacceptable. Even if groundwater use is greatly reduced and enlightened land-use practices are initiated, natural deltaic subsidence and global sea-level rise will continue to aggravate flooding, although at substantially lower rates.

  11. Managing uncertainty, ambiguity and ignorance in impact assessment by embedding evolutionary resilience, participatory modelling and adaptive management.

    Science.gov (United States)

    Bond, Alan; Morrison-Saunders, Angus; Gunn, Jill A E; Pope, Jenny; Retief, Francois

    2015-03-15

    In the context of continuing uncertainty, ambiguity and ignorance in impact assessment (IA) prediction, the case is made that existing IA processes are based on false 'normal' assumptions that science can solve problems and transfer knowledge into policy. Instead, a 'post-normal science' approach is needed that acknowledges the limits of current levels of scientific understanding. We argue that this can be achieved through embedding evolutionary resilience into IA; using participatory workshops; and emphasising adaptive management. The goal is an IA process capable of informing policy choices in the face of uncertain influences acting on socio-ecological systems. We propose a specific set of process steps to operationalise this post-normal science approach which draws on work undertaken by the Resilience Alliance. This process differs significantly from current models of IA, as it has a far greater focus on avoidance of, or adaptation to (through incorporating adaptive management subsequent to decisions), unwanted future scenarios rather than a focus on the identification of the implications of a single preferred vision. Implementing such a process would represent a culture change in IA practice as a lack of knowledge is assumed and explicit, and forms the basis of future planning activity, rather than being ignored. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. From Ignoring to Leading Changes – What Role do Universities Play in Developing Countries? (Case of Croatia

    Directory of Open Access Journals (Sweden)

    Slavica Singer

    2010-12-01

    Full Text Available Using the model of the entrepreneurial university, the paper presents major blockages (the university's own institutional rigidity, fragmented organization, lack of mutual trust between the business sector and universities, no real benchmarks, a legal framework not supportive of opening the university to new initiatives) in Triple Helix interactions in Croatia. Comparing the identified blockages with the expectations (a multidimensional campus, cooperation with the business sector and other stakeholders in designing new educational and research programs) expressed by HEIs in developed countries around the world (2008 EIU survey) indicates new challenges for universities in developing countries. With a Triple Helix approach, not confined within national borders but seen as an international networking opportunity, these challenges can be seen as opportunities; otherwise they are threats. On the scale of ignoring, observing, participating in and leading positive changes in their surroundings, used to measure the vitality of Triple Helix interactions, Croatian universities are located between the ignoring and observing positions. To move them towards a leading position, coordinated and consistent policies are needed in order to focus on eliminating the identified blockages. Universities should take the lead in this process; otherwise they are losing credibility as desired partners in developing space for Triple Helix interactions.

  13. Using Q Methodology to Investigate Undergraduate Students' Attitudes toward the Geosciences

    Science.gov (United States)

    Young, Julia M.; Shepardson, Daniel P.

    2018-01-01

    Undergraduate students have different attitudes toward the geosciences, but few studies have investigated these attitudes using Q methodology. Q methodology allows the researcher to identify more detailed reasons for students' attitudes toward geology than Likert methodology. Thus this study used Q methodology to investigate the attitudes that 15…

  14. Quantum mechanics and faster-than-light communication: methodological considerations

    International Nuclear Information System (INIS)

    Ghirardi, G.C.; Weber, T.

    1983-06-01

    A detailed quantum mechanical analysis of a recent proposal of faster than light communication through wave packet reduction is performed. The discussion allows us to focus on some methodological problems about critical investigations in physical theories. (author)

  15. Challenges and Opportunities for Harmonizing Research Methodology

    DEFF Research Database (Denmark)

    van Hees, V. T.; Thaler-Kall, K.; Wolf, K. H.

    2016-01-01

    Objectives: Raw accelerometry is increasingly being used in physical activity research, but diversity in sensor design, attachment and signal processing challenges the comparability of research results. Therefore, efforts are needed to harmonize the methodology. In this article we reflect on how...... increased methodological harmonization may be achieved. Methods: The authors of this work convened for a two-day workshop (March 2014) themed on methodological harmonization of raw accelerometry. The discussions at the workshop were used as a basis for this review. Results: Key stakeholders were identified...... as manufacturers, method developers, method users (application), publishers, and funders. To facilitate methodological harmonization in raw accelerometry the following action points were proposed: i) Manufacturers are encouraged to provide a detailed specification of their sensors, ii) Each fundamental step...

  16. Methodologies, languages and tools

    International Nuclear Information System (INIS)

    Amako, Katsuya

    1994-01-01

    This is a summary of the "Methodologies, Languages and Tools" session in the CHEP'94 conference. All the contributions to methodologies and languages are relevant to the object-oriented approach. Other topics presented are related to various software tools in the down-sized computing environment

  17. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  19. VEM: Virtual Enterprise Methodology

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a virtual enterprise methodology (VEM) that outlines activities to consider when setting up and managing virtual enterprises (VEs). As a methodology the VEM helps companies to ask the right questions when preparing for and setting up an enterprise network, which works...

  20. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies could work for structural, procedural, or object-oriented applications, but fail to capture…

  1. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  2. Conceptual and methodological concerns in the theory of perceptual load.

    Science.gov (United States)

    Benoni, Hanna; Tsal, Yehoshua

    2013-01-01

    The present paper provides a short critical review of the theory of perceptual load. It closely examines the basic tenets and assumptions of the theory and identifies major conceptual and methodological problems that have been largely ignored in the literature. The discussion focuses on problems in the definition of the concept of perceptual load, on the circularity in the characterization and manipulation of perceptual load and the confusion between the concept of perceptual load and its operationalization. The paper also selectively reviews evidence supporting the theory as well as inconsistent evidence which proposed alternative dominant factors influencing the efficacy of attentional selection.

  3. Conceptual and Methodological Concerns in the Theory of Perceptual Load

    Directory of Open Access Journals (Sweden)

    Hanna eBenoni

    2013-08-01

    Full Text Available The present paper provides a short critical review of the theory of perceptual load. It closely examines the basic tenets and assumptions of the theory and identifies major conceptual and methodological problems that have been largely ignored in the literature. The discussion focuses on problems in the definition of the concept of perceptual load, on the circularity in the characterization and manipulation of perceptual load and the confusion between the concept of perceptual load and its operationalization. The paper also selectively reviews evidence supporting the theory as well as inconsistent evidence which proposed alternative dominant factors influencing the efficacy of attentional selection.

  4. Methodology and integrated studies for the strategic regional uranium prospects

    International Nuclear Information System (INIS)

    Andrade Tolentino, J. de.

    1984-01-01

    An integrated methodology, alternative to the traditional one used for uranium prospecting, is presented. For this purpose, a detailed review of some methods for finding uranium was made. Then a geological unit located in Minas Gerais, Brazil, was chosen due to its geographical proximity to the Lagoa Real deposit in the Brazilian state of Bahia. The methodology applied to this unit gave preliminary indications of areas to be prospected in detail. 36 refs, 44 figs, 20 tabs

  5. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330MWt SMART (System-integrated Modular Advanced ReacTor) which has been developed by Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, further validated final safety analysis methodology will be developed. Current licensing safety analysis methodology of the Westinghouse and KSNPP PWRs operating and under development in Korea as well as the Russian licensing safety analysis methodology for the integral reactors have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or KNGR (Korean Next Generation Reactor) under construction in Korea. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break and the small break loss of coolant accident. SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. Validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended for the nuclear regulatory authority to establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  6. I Want to but I Won't: Pluralistic Ignorance Inhibits Intentions to Take Paternity Leave in Japan

    Directory of Open Access Journals (Sweden)

    Takeru Miyajima

    2017-09-01

    Full Text Available The number of male employees who take paternity leave in Japan has been low in past decades. However, the majority of male employees actually wish to take paternity leave if they were to have a child. Previous studies have demonstrated that the organizational climate in workplaces is the major determinant of male employees' use of family-friendly policies, because males are often stigmatized and fear receiving negative evaluation from others. While such normative pressure might be derived from prevailing social practices relevant to people's expectations of social roles (e.g., “Men make houses, women make homes”), these social practices are often perpetuated even after the majority of group members have ceased to support them. The perpetuation of this unpopular norm could be caused by the social psychological phenomenon of pluralistic ignorance. While researchers have explored people's beliefs about gender roles from various perspectives, a profound understanding of these beliefs regarding gender role norms, and of the accuracy of beliefs about others, remains to be attained. The current research examined the association between pluralistic ignorance and the perpetually low rates of taking paternity leave in Japan. Specifically, Study 1 (n = 299) examined Japanese male employees' (ages ranging from the 20s to the 40s) attitudes toward paternity leave and their estimates of the attitudes of other men of the same age, as well as behavioral intentions (i.e., desire and willingness) to take paternity leave if they had a child in the future. The results demonstrated that male employees overestimated other men's negative attitudes toward paternity leave. Moreover, those who had positive attitudes toward taking leave and attributed negative attitudes to others were less willing to take paternity leave than were those who had positive attitudes and believed others shared those attitudes, although there was no significant difference between their desires to take paternity

  7. I Want to but I Won't: Pluralistic Ignorance Inhibits Intentions to Take Paternity Leave in Japan.

    Science.gov (United States)

    Miyajima, Takeru; Yamaguchi, Hiroyuki

    2017-01-01

    The number of male employees who take paternity leave in Japan has been low in past decades. However, the majority of male employees actually wish to take paternity leave if they were to have a child. Previous studies have demonstrated that the organizational climate in workplaces is the major determinant of male employees' use of family-friendly policies, because males are often stigmatized and fear receiving negative evaluation from others. While such normative pressure might be derived from prevailing social practices relevant to people's expectations of social roles (e.g., "Men make houses, women make homes"), these social practices are often perpetuated even after the majority of group members have ceased to support them. The perpetuation of this unpopular norm could be caused by the social psychological phenomenon of pluralistic ignorance. While researchers have explored people's beliefs about gender roles from various perspectives, a profound understanding of these beliefs regarding gender role norms, and of the accuracy of beliefs about others, remains to be attained. The current research examined the association between pluralistic ignorance and the perpetually low rates of taking paternity leave in Japan. Specifically, Study 1 (n = 299) examined Japanese male employees' (ages ranging from the 20s to the 40s) attitudes toward paternity leave and their estimates of the attitudes of other men of the same age, as well as behavioral intentions (i.e., desire and willingness) to take paternity leave if they had a child in the future. The results demonstrated that male employees overestimated other men's negative attitudes toward paternity leave. Moreover, those who had positive attitudes toward taking leave and attributed negative attitudes to others were less willing to take paternity leave than were those who had positive attitudes and believed others shared those attitudes, although there was no significant difference between their desires to take paternity leave. Study 2 ( n

  8. In the Casino of Life: Betting on Risks and Ignoring the Consequences of Climate Change and Hazards

    Science.gov (United States)

    Brosnan, D. M.

    2016-12-01

    Even when faced with strong scientific evidence, decision-makers cite uncertainty and delay action. Scientists, confident in the quality of their science and acknowledging that the uncertainty, while present, is low by scientific standards, become more frustrated as their information is ignored. Decreasing scientific uncertainty, a hallmark of long-term studies such as the IPCC reports, does little to motivate decision-makers. Imperviousness to scientific data is prevalent across all scales. Municipalities prefer to spend millions of dollars on engineered solutions to climate change and hazards, even if science shows that they perform less well than nature-based ones and cost much more. California is known to be at risk from tsunamis generated by earthquakes off Alaska. A study using a magnitude 9.1 earthquake, similar to a 1965 event, calculated the immediate economic price tag in infrastructure loss and business interruption at $9.5 billion. The exposure of Los Angeles/Long Beach port trade to damage and downtime exceeds $1.2 billion; business interruption would triple the figure. Yet despite several excellent scientific studies, the State is ill prepared; investments in infrastructure, commerce and conservation risk being literally washed away. Globally there is a 5-10% probability of an extreme geohazard, e.g., a Tambora-like eruption, occurring in this century. With a "value of statistical life" of $2.2 million and a population of 7 billion, the risk for fatalities alone is $1.1-7 billion per year. But there is little interest in investing the $0.5-3.5 billion per year in volcano monitoring necessary to reduce fatalities and lower the risks of global conflict, starvation, and societal destruction. More science and less uncertainty is clearly not the driver of action. But is speaking with certainty really the answer? Decision makers and scientists are in the same casino of life but rarely play at the same tables. Decision makers bet differently to scientists. To motivate action we need to be cognizant of

  9. Analysis of Detailed Energy Audits and Energy Use Measures of University Buildings

    Directory of Open Access Journals (Sweden)

    Kęstutis Valančius

    2011-12-01

    Full Text Available The paper explains the results of a detailed energy audit of the buildings of Vilnius Gediminas Technical University. The energy audits were performed with reference to an international scientific project. The article presents the methodology and results of detailed measurements of energy balance characteristics. Article in Lithuanian

  10. Perceptions of a fluid consensus: uniqueness bias, false consensus, false polarization, and pluralistic ignorance in a water conservation crisis.

    Science.gov (United States)

    Monin, Benoît; Norton, Michael I

    2003-05-01

    A 5-day field study (N = 415) during and right after a shower ban demonstrated multifaceted social projection and the tendency to draw personality inferences from simple behavior in a time of drastic consensus change. Bathers thought showering was more prevalent than did non-bathers (false consensus) and respondents consistently underestimated the prevalence of the desirable and common behavior--be it not showering during the shower ban or showering after the ban (uniqueness bias). Participants thought that bathers and non-bathers during the ban differed greatly in their general concern for the community, but self-reports demonstrated that this gap was illusory (false polarization). Finally, bathers thought other bathers cared less than they did, whereas non-bathers thought other non-bathers cared more than they did (pluralistic ignorance). The study captures the many biases at work in social perception in a time of social change.

  11. Honing in on the Social Difficulties Associated With Sluggish Cognitive Tempo in Children: Withdrawal, Peer Ignoring, and Low Engagement.

    Science.gov (United States)

    Becker, Stephen P; Garner, Annie A; Tamm, Leanne; Antonini, Tanya N; Epstein, Jeffery N

    2017-03-13

    Sluggish cognitive tempo (SCT) symptoms are associated with social difficulties in children, though findings are mixed and many studies have used global measures of social impairment. The present study tested the hypothesis that SCT would be uniquely associated with aspects of social functioning characterized by withdrawal and isolation, whereas attention deficit/hyperactivity disorder (ADHD) and oppositional defiant disorder (ODD) symptoms would be uniquely associated with aspects of social functioning characterized by inappropriate responding in social situations and active peer exclusion. Participants were 158 children (70% boys) between 7-12 years of age being evaluated for possible ADHD. Both parents and teachers completed measures of SCT, ADHD, ODD, and internalizing (anxiety/depression) symptoms. Parents also completed ratings of social engagement and self-control. Teachers also completed measures assessing asociality and exclusion, as well as peer ignoring and dislike. In regression analyses controlling for demographic characteristics and other psychopathology symptoms, parent-reported SCT symptoms were significantly associated with lower social engagement (e.g., starting conversations, joining activities). Teacher-reported SCT symptoms were significantly associated with greater asociality/withdrawal and ratings of more frequent ignoring by peers, as well as greater exclusion. ODD symptoms and ADHD hyperactive-impulsive symptoms were more consistently associated with other aspects of social behavior, including peer exclusion, being disliked by peers, and poorer self-control during social situations. Findings provide the clearest evidence to date that the social difficulties associated with SCT are primarily due to withdrawal, isolation, and low initiative in social situations. Social skills training interventions may be effective for children displaying elevated SCT symptomatology.

  12. Simplified methodology for Angra 1 containment analysis

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1991-08-01

    A simplified methodology of analysis was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The obtained data were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained with this new methodology, such as the small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  13. Detailed Astrometric Analysis of Pluto

    Science.gov (United States)

    ROSSI, GUSTAVO B.; Vieira-Martins, R.; Camargo, J. I.; Assafin, M.

    2013-05-01

    Pluto is the main representative of the trans-Neptunian objects (TNOs), presenting some peculiarities such as an atmosphere and a satellite system with 5 known moons: Charon, discovered in 1978, Nix and Hydra, in 2006, P4 in 2011 and P5 in 2012. Until the arrival of the New Horizons spacecraft at this system (July 2015), stellar occultations are the most efficient method, from the ground, to determine physical and dynamical properties of this system. In 2010, a drift in declination (about 20 mas/year) relative to the ephemerides became evident. This fact motivated us to redo the reductions and analysis of a large set of our observations at OPD/LNA, covering a total of 15 years. The ephemeris and occultation results were then compared with the astrometric and photometric reductions of CCD images of Pluto (around 6500 images). Two corrections were used to refine the data set: differential chromatic refraction and the photocenter. The first is due to the mean color of the background stars being redder than the color of Pluto, resulting in a slightly different path of light through the atmosphere (which may cause a difference in position of 0.1”). It became more evident because Pluto is crossing the region of the galactic plane. The photocenter correction is based on two overlapping Gaussian curves, with different heights and non-coincident centers, corresponding to Pluto and Charon (since they have less than 1” of angular separation). The objective is to separate these two Gaussian curves from the observed one and find the correct position of Pluto. The method is strongly dependent on the height of each of the Gaussian curves, related to the respective albedos of Pluto and Charon. A detailed analysis of the astrometric results, as well as a comparison with occultation results, was made. Since Pluto has an orbital period of 248.9 years and our interval of observation is about 15 years, we have around 12% of its observed orbit and also, our
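    The photocenter correction described above amounts to decomposing a blended image profile into two Gaussians of different heights and offset centres. The one-dimensional sketch below illustrates the idea with scipy.optimize.curve_fit on synthetic data; the flux ratio, separation and noise level are invented numbers, not values from the OPD/LNA images.

        # One-dimensional sketch of the photocentre correction: fit a blend of two
        # Gaussians (Pluto + Charon) and recover their individual centres.
        # Synthetic toy data, not the actual OPD/LNA observations.
        import numpy as np
        from scipy.optimize import curve_fit

        def two_gaussians(x, a1, mu1, a2, mu2, sigma):
            return (a1 * np.exp(-0.5 * ((x - mu1) / sigma) ** 2) +
                    a2 * np.exp(-0.5 * ((x - mu2) / sigma) ** 2))

        rng = np.random.default_rng(2)
        x = np.linspace(-3.0, 3.0, 200)                    # arcsec along the separation axis
        true = dict(a1=1.0, mu1=0.0, a2=0.18, mu2=0.8, sigma=0.6)  # height ratio stands in for albedo/size
        blend = two_gaussians(x, **true) + rng.normal(0.0, 0.01, x.size)

        # The single photocentre of the blend is pulled away from Pluto towards Charon:
        blend_centroid = np.sum(x * blend) / np.sum(blend)

        popt, _ = curve_fit(two_gaussians, x, blend, p0=[1.0, 0.1, 0.2, 0.6, 0.5])
        a1, mu1, a2, mu2, sigma = popt
        print(f"blend centroid      : {blend_centroid:+.3f} arcsec")
        print(f"fitted Pluto centre : {mu1:+.3f} arcsec (height {a1:.2f})")
        print(f"fitted Charon centre: {mu2:+.3f} arcsec (height {a2:.2f})")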

  14. Ignorance is not bliss

    DEFF Research Database (Denmark)

    Jervelund, Signe Smith; Maltesen, Thomas; Wimmelmann, Camilla Lawaetz

    2017-01-01

    AIMS: Suboptimal healthcare utilisation and lower satisfaction with the patient-doctor encounter among immigrants have been documented. Immigrants' lack of familiarity with the healthcare system has been proposed as an explanation for this. This study investigated whether a systematic delivery

  15. Ignoring the Market.

    Science.gov (United States)

    Chubb, John E.

    2003-01-01

    Argues that market-driven education (charter schools, vouchers) is the most effective, albeit overlooked, reform strategy since publication of "A Nation at Risk." Describes corresponding growth of for-profit school management. Offers several recommendations to improve effectiveness of market-based reforms, such as states' continuing…

  16. Agreeing in Ignorance

    DEFF Research Database (Denmark)

    Ploug, Thomas; Holm, Søren

    2014-01-01

    Many ICT services require that users explicitly consent to conditions of use and policies for the protection of personal information. This consent may become 'routinised'. We define the concept of routinisation and investigate to what extent routinisation occurs as well as the factors influencing...... routinisation in a survey study of internet use. We show that routinisation is common and that it is influenced by factors including gender, age, educational level and average daily internet use. We further explore the reasons users provide for not reading conditions and policies and show that they can...

  17. The challenges of ignorance

    CSIR Research Space (South Africa)

    Barnard, E

    2009-11-01

    Full Text Available The authors have previously argued that the infamous "No Free Lunch" theorem for supervised learning is a paradoxical result of a misleading choice of prior probabilities. Here, they provide more analysis of the dangers of uniform densities...

  18. The Cost of Ignorance

    DEFF Research Database (Denmark)

    Persson, Karl Gunnar; Sharp, Paul Richard

    This paper argues that imperfectly informed consumers use simple signals to identify the characteristics of wine. The geographical denomination and vintage of a wine as well as the characteristics of a particular wine will be considered here. However, the specific characteristics of a wine...... are difficult to ascertain ex ante given the enormous product variety. The reputation of a denomination will thus be an important guide for consumers when assessing individual wines. Denomination reputation is a function of average quality as revealed by the past performance of producers. The impact of past...... performance increases over time, since producers consider improved average quality to be an important factor in enhancing the price, but this necessitates monitoring of members in the denomination. The market and pricing of Tuscan red wines provide a natural experiment because there are a number...

  19. The Limits of Ignorance

    DEFF Research Database (Denmark)

    Højbjerg, Erik

    institutions alike. The logic seems to be that financially capable individuals will enjoy social and political inclusion as well as an ability to exercise a stronger influence in markets. The paper specifically contributes to our understanding of the governmentalization of the present by addressing how...... and political goals? The research question will be discussed in the context of financial literacy educational initiatives. In the aftermath of the 2008 global financial crisis, increasing the financial literacy of ordinary citizen-consumers has taken a prominent position among regulators and financial...... - at least in part - the corporate spread of financial literacy educational initiatives can be observed as a particular form of power at-a-distance. The focus is on the role of private enterprise in governmentalizing the 'business of life' by establishing and mobilizing specific conceptual forms around...

  20. The Limits of Ignorance

    DEFF Research Database (Denmark)

    Højbjerg, Erik

    2015-01-01

    , increasing the financial literacy of ordinary citizen-consumers has taken a prominent position among regulators and financial institutions alike. The logic seems to be that financially capable individuals will enjoy social and political inclusion as well as an ability to exercise a stronger influence....... The focus is on the role of private enterprise in governmentalizing the business of life by establishing and mobilizing specific conceptual forms around which the life skills of the entrepreneurial self involves a responsibilization of the individual citizen-consumer....

  1. Pollination syndromes ignored

    DEFF Research Database (Denmark)

    Maruyama, P. K.; Oliveira, G. M.; Ferreira, Célia Maria Dias

    2013-01-01

    Generalization prevails in flower-animal interactions, and although animal visitors are not equally effective pollinators, most interactions likely represent an important energy intake for the animal visitor. Hummingbirds are nectar-feeding specialists, and many tropical plants are specialized...... to increase the overall nectar availability. We showed that mean nectar offer, at the transect scale, was the only parameter related to hummingbird visitation frequency, more so than nectar offer at single flowers and at the plant scale, or pollination syndrome. Centrality indices, calculated using...... energy provided by non-ornithophilous plants may facilitate reproduction of truly ornithophilous flowers by attracting and maintaining hummingbirds in the area. This may promote asymmetric hummingbird-plant associations, i.e., pollination depends on floral traits adapted to hummingbird morphology...

  2. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users are able to easily learn and model with it. It enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with the integrated treatment of mechanical failures and human errors. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic developments in digital hardware, software, information technology, and data analysis. More specifically, the computing environment has been greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method that can take advantage of the aforementioned technological developments is dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are considerable, it seems less interesting from an industrial and regulatory viewpoint. The authors expect this work can contribute to a better understanding of dynamic PSA in terms of algorithm, practice, and applicability. In the paper, an overview of dynamic PSA is given. Most of the methodologies share similar concepts. Among them, DDET seems to be a backbone for most of the methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  3. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    Event tree (ET)/fault tree (FT) analysis is a core methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn it and build models with it, and it supports better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations involving both mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging thanks to the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so risk analysis can now be conducted with the large amount of data actually available. One method that can take advantage of these technologies is dynamic PSA, in which conventional ET/FT models can include time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are considerable, dynamic PSA has attracted less interest from an industrial and regulatory viewpoint. The authors expect this survey to contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. In the paper, an overview of dynamic PSA is given; most of the methodologies share similar concepts. Among them, the discrete dynamic event tree (DDET) appears to be the backbone of most methodologies, since it can be applied to large problems. The common characteristics shared by DDET-based approaches are as follows: combining deterministic and stochastic approaches; improving the identification of PSA success criteria; helping to limit the detrimental effects of sequence binning (normally adopted in PSA); helping to avoid defining non-optimal success criteria that may distort the risk; and providing a framework for comprehensively considering

  4. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching...... the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers.......ABSTRACT Design Methodology shall be seen as our understanding of how to design; it is an early (emerging late 60ies) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...

  5. GPS system simulation methodology

    Science.gov (United States)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  6. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  7. Nonlinear Image Denoising Methodologies

    National Research Council Canada - National Science Library

    Yufang, Bao

    2002-01-01

    In this thesis, we propose a theoretical as well as practical framework to combine geometric prior information with a statistical/probabilistic methodology in the investigation of a denoising problem...

  8. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    "Now viewed as its own scientific discipline, clinical trial methodology encompasses the methods required for the protection of participants in a clinical trial and the methods necessary to provide...

  9. Detailed Characterization of Nearshore Processes During NCEX

    Science.gov (United States)

    Holland, K.; Kaihatu, J. M.; Plant, N.

    2004-12-01

    Recent technology advances have allowed the coupling of remote sensing methods with advanced wave and circulation models to yield detailed characterizations of nearshore processes. This methodology was demonstrated as part of the Nearshore Canyon EXperiment (NCEX) in La Jolla, CA during Fall 2003. An array of high-resolution, color digital cameras was installed to monitor an alongshore distance of nearly 2 km out to depths of 25 m. This digital imagery was analyzed over the three-month period through an automated process to produce hourly estimates of wave period, wave direction, breaker height, shoreline position, sandbar location, and bathymetry at numerous locations during daylight hours. Interesting wave propagation patterns in the vicinity of the canyons were observed. In addition, directional wave spectra and swash / surf flow velocities were estimated using more computationally intensive methods. These measurements were used to provide forcing and boundary conditions for the Delft3D wave and circulation model, giving additional estimates of nearshore processes such as dissipation and rip currents. An optimal approach for coupling these remotely sensed observations to the numerical model was selected to yield accurate, but also timely characterizations. This involved assimilation of directional spectral estimates near the offshore boundary to mimic forcing conditions achieved under traditional approaches involving nested domains. Measurements of breaker heights and flow speeds were also used to adaptively tune model parameters to provide enhanced accuracy. Comparisons of model predictions and video observations show significant correlation. As compared to nesting within larger-scale and coarser resolution models, the advantage of providing boundary condition data using remote sensing is much improved resolution and fidelity. For example, rip current development was both modeled and observed. These results indicate that this approach to data-model coupling

  10. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Full Text Available Modern challenges of the theory and methodology of accounting are realized through the formation and implementation of new concepts, the purpose of which is to meet users' needs for standard and unique information. The development of a methodology for sustainability accounting is a key aspect of the management of an economic entity. The purpose of the article is to form the methodological bases of accounting for sustainable development and determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for accounting for sustainable development is offered in the article. The complex of methods and principles of sustainable development accounting for systematized and non-standard provisions has been systematized. The new system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of determining its purpose, objective, subject, object, methods, functions and key aspects.

  11. Application of Agent Methodology in Healthcare Information Systems

    Directory of Open Access Journals (Sweden)

    Reem Abdalla

    2017-02-01

    Full Text Available This paper presents a case study to describe the features and phases of two agent methodologies: Gaia, a methodology for agent-oriented analysis and design, and Tropos, a detailed agent-oriented software engineering methodology. The aim is to explore each methodology's ability to present solutions for small problems, and to discover whether the methodology is in fact understandable and usable. In addition, we collected and noted the advantages and weaknesses of these methodologies during the analysis of each methodology and of the relationships among their models. The Guardian Angel: Patient-Centered Health Information System (GA: PCHIS), a personal system that helps track, manage, and interpret the subject's health history and gives advice to both patient and provider, is used as the case study throughout the paper.

  12. Proposal of methodology of tsunami accident sequence analysis induced by earthquake using DQFM methodology

    International Nuclear Information System (INIS)

    Muta, Hitoshi; Muramatsu, Ken

    2017-01-01

    Since the Fukushima-Daiichi nuclear power station accident, the Japanese regulatory body has improved and upgraded the regulation of nuclear power plants, and continuous effort is required to enhance risk management in the mid- to long term. Earthquakes and tsunamis are considered as the most important risks, and the establishment of probabilistic risk assessment (PRA) methodologies for these events is a major issue of current PRA. The Nuclear Regulation Authority (NRA) addressed the PRA methodology for tsunamis induced by earthquakes, which is one of the methodologies that should be enhanced step by step for the improvement and maturity of PRA techniques. The AESJ standard for the procedure of seismic PRA for nuclear power plants in 2015 provides the basic concept of the methodology; however, details of the application to the actual plant PRA model have not been sufficiently provided. This study proposes a detailed PRA methodology for tsunamis induced by earthquakes using the DQFM methodology, which contributes to improving the safety of nuclear power plants. Furthermore, this study also states the issues which need more research. (author)
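
    As a rough illustration of the direct-quantification idea behind DQFM (a sketch, not the authors' plant model), the example below estimates a fault-tree top-event probability by Monte Carlo, with component failures driven by a common tsunami load so that their dependence is carried automatically. All component names, capacities and hazard parameters are hypothetical.

    ```python
    # Illustrative DQFM-style sketch (hypothetical numbers): quantify a fault
    # tree directly by Monte Carlo, where components share a common tsunami
    # load and fail when that load exceeds a sampled lognormal capacity.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000

    # Hypothetical lognormal capacities (median inundation height in metres,
    # log-standard deviation) for three mitigation components A, B and C.
    capacities = {"A": (6.0, 0.30), "B": (5.0, 0.40), "C": (7.0, 0.35)}

    # Common tsunami hazard: lognormal inundation height at the site (hypothetical).
    load = rng.lognormal(mean=np.log(4.0), sigma=0.5, size=N)

    # A component fails when the common load exceeds its sampled capacity.
    fails = {
        name: load > rng.lognormal(np.log(med), beta, size=N)
        for name, (med, beta) in capacities.items()
    }

    # Simple fault tree: TOP = A AND (B OR C). Because every sample sees the
    # same load, the dependence between component failures is captured, which
    # is the point of quantifying the tree by simulation rather than by
    # combining independent probabilities.
    top = fails["A"] & (fails["B"] | fails["C"])
    print("Estimated top-event probability:", top.mean())
    ```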

  13. Fungi spores dimension matters in health effects: a methodology for more detail fungi exposure assessment

    OpenAIRE

    Viegas, Carla; Faria, Tiago; Sabino, Raquel; Viegas, Susana

    2016-01-01

    Health effects resulting from dust inhalation in occupational environments may be more strongly associated with specific microbial components, such as fungi, than to the particles. The aim of the present study is to characterize the occupational exposure to the fungal burden in four different occupational settings (two feed industries, one poultry and one waste sorting industry), presenting results from two air sampling methods – the impinger collector and the use of filters. In addition, ...

  14. Development and testing of incident detection algorithms. Vol. 2, research methodology and detailed results.

    Science.gov (United States)

    1976-04-01

    The development and testing of incident detection algorithms was based on Los Angeles and Minneapolis freeway surveillance data. Algorithms considered were based on times series and pattern recognition techniques. Attention was given to the effects o...

  15. Applicability of the Directed Graph Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Huszti, Jozsef [Institute of Isotope of the Hungarian Academy of Sciences, Budapest (Hungary); Nemeth, Andras [ESRI Hungary, Budapest (Hungary); Vincze, Arpad [Hungarian Atomic Energy Authority, Budapest (Hungary)

    2012-06-15

    Possible methods to construct, visualize and analyse the 'map' of the State's nuclear infrastructure based on different directed graph approaches are proposed. The transportation and the flow network models are described in detail. The use of the possible evaluation methodologies and the use of available software tools to construct and maintain the nuclear 'map' using pre-defined standard building blocks (nuclear facilities) are introduced and discussed.
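
    A minimal sketch of the directed-graph 'map' idea (facility names and edges are hypothetical, and networkx is used only as one convenient tool): pre-defined building blocks become nodes, material flows become directed edges, and the resulting network can be queried for reachability and paths.

    ```python
    # Hypothetical directed-graph "map" of nuclear infrastructure, in the
    # spirit of the transportation/flow-network models mentioned above.
    import networkx as nx

    G = nx.DiGraph()
    # Edges follow the direction of material flow between building blocks.
    G.add_edges_from([
        ("mine", "conversion"),
        ("conversion", "enrichment"),
        ("enrichment", "fuel_fabrication"),
        ("fuel_fabrication", "reactor"),
        ("reactor", "spent_fuel_storage"),
    ])

    # Which facilities can material from the conversion plant reach?
    print(nx.descendants(G, "conversion"))

    # One possible path from the mine to the reactor.
    print(nx.shortest_path(G, "mine", "reactor"))
    ```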

  16. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
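
    The sketch below illustrates, with purely hypothetical numbers, the kind of combination the methodology describes: weighting building-specific fallout protection factors by the fraction of the population in each location to obtain a population-weighted dose-reduction estimate. It is not the LLNL implementation.

    ```python
    # Illustrative sketch (hypothetical numbers): combine building protection
    # factors with the population distribution across locations.
    # Protection factor PF = (unsheltered dose) / (sheltered dose).

    building_mix = {
        # location: (fraction of population, fallout protection factor)
        "wood-frame house":        (0.45, 3.0),
        "apartment, upper floors": (0.25, 10.0),
        "basement":                (0.20, 40.0),
        "outdoors":                (0.10, 1.0),
    }

    # Averaging 1/PF over the population gives the weighted dose transmission.
    mean_transmission = sum(frac / pf for frac, pf in building_mix.values())
    effective_pf = 1.0 / mean_transmission

    print(f"Population-weighted dose transmission: {mean_transmission:.2f}")
    print(f"Effective regional protection factor: {effective_pf:.1f}")
    ```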

  17. Researching Lean: Methodological implications of loose definitions

    DEFF Research Database (Denmark)

    Brännmark, Mikael; Langstrand, Jostein; Johansson, Stina

    2012-01-01

    practices seem to overlap with other popular management concepts, such as High Performance Work Systems, World Class Manufacturing and Total Quality Management. This confusion, combined with different methodological and theoretical traditions, has led to much debate and contradictory conclusions regarding...... Lean. The purpose of the paper is to illustrate some key methodological issues that need to be considered in future Lean research to allow increased understanding of Lean effects for different stakeholders, primarily meaning the customer, employer and employees. Design/methodology/approach – The paper...... on the case studies, we suggest that future investigations describe the Lean interventions in more detail. General descriptions or analogies, e.g. ‘learning organizations’, presumably increase the present confusion regarding Lean impact on different stakeholders. The case studies also illustrate...

  18. Template Assembly for Detailed Urban Reconstruction

    KAUST Repository

    Nan, Liangliang; Wonka, Peter; Ghanem, Bernard; Jiang, Caigui

    2015-01-01

    Structure from Motion and Multi View Stereo, and we model a set of 3D templates of facade details. Next, we optimize the initial coarse model to enforce consistency between geometry and appearance (texture images). Then, building details are reconstructed

  19. Monte Carlo methods beyond detailed balance

    NARCIS (Netherlands)

    Schram, Raoul D.; Barkema, Gerard T.

    2015-01-01

    Monte Carlo algorithms are nearly always based on the concept of detailed balance and ergodicity. In this paper we focus on algorithms that do not satisfy detailed balance. We introduce a general method for designing non-detailed balance algorithms, starting from a conventional algorithm satisfying
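
    As background for readers unfamiliar with the terminology (standard definitions, not a summary of this paper): detailed balance is a sufficient, but not necessary, condition for a Markov chain Monte Carlo algorithm to leave the target distribution invariant; only the weaker global balance condition is required.

    ```latex
    % Detailed balance between target distribution P and transition
    % probabilities W:
    \[
      P(x)\, W(x \to y) \;=\; P(y)\, W(y \to x) \qquad \text{for all } x, y .
    \]
    % Non-detailed-balance algorithms only need global balance (stationarity),
    \[
      \sum_{x} P(x)\, W(x \to y) \;=\; P(y) \qquad \text{for all } y ,
    \]
    % with \sum_{y} W(x \to y) = 1, together with ergodicity.
    ```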

  20. The policy trail methodology

    DEFF Research Database (Denmark)

    Holford, John; Larson, Anne; Melo, Susana

    of ‘policy trail’, arguing that it can overcome ‘methodological nationalism’ and link structure and agency in research on the ‘European educational space’. The ‘trail’ metaphor, she suggests, captures the intentionality and the erratic character of policy. The trail connects sites and brings about change......, but – although policy may be intended to be linear, with specific outcomes – policy often has to bend, and sometimes meets insurmountable obstacles. This symposium outlines and develops the methodology, but also reports on research undertaken within a major FP7 project (LLLIght’in’Europe, 2012-15) which made use......In recent years, the “policy trail” has been proposed as a methodology appropriate to the shifting and fluid governance of lifelong learning in the late modern world (Holford et al. 2013, Holford et al. 2013, Cort 2014). The contemporary environment is marked by multi-level governance (global...

  1. Sophisticated approval voting, ignorance priors, and plurality heuristics: a behavioral social choice analysis in a Thurstonian framework.

    Science.gov (United States)

    Regenwetter, Michel; Ho, Moon-Ho R; Tsetlin, Ilia

    2007-10-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two types of plurality heuristics to model approval voting behavior. When using a sincere plurality heuristic, voters simplify their decision process by voting for their single favorite candidate. When using a strategic plurality heuristic, voters strategically focus their attention on the 2 front-runners and vote for their preferred candidate among these 2. Using a hierarchy of Thurstonian random utility models, the authors implemented these different decision rules and tested them statistically on 7 real world approval voting elections. They cross-validated their key findings via a psychological Internet experiment. Although a substantial number of voters used the plurality heuristic in the real elections, they did so sincerely, not strategically. Moreover, even though Thurstonian models do not force such agreement, the results show, in contrast to common wisdom about social choice rules, that the sincere social orders by Condorcet, Borda, plurality, and approval voting are identical in all 7 elections and in the Internet experiment. PsycINFO Database Record (c) 2007 APA, all rights reserved.
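
    The small simulation below is a sketch under assumptions of my own (candidate means, the noise level and the exact formalization of the heuristics are illustrative, not the authors' fitted model): Thurstonian utilities are normal draws around candidate means, the ignorance-prior rule approves every candidate whose realized utility exceeds the voter's own mean utility, and the sincere plurality heuristic casts a single vote for the favourite.

    ```python
    # Thurstonian random-utility sketch (hypothetical parameters) contrasting
    # an ignorance-prior approval rule with a sincere plurality heuristic.
    import numpy as np

    rng = np.random.default_rng(1)
    candidates = ["A", "B", "C", "D"]
    mean_utility = np.array([0.8, 0.5, 0.2, 0.0])   # population-level preferences
    n_voters, sigma = 5_000, 1.0

    # Thurstonian utilities: candidate mean plus independent normal noise.
    U = mean_utility + rng.normal(0.0, sigma, size=(n_voters, len(candidates)))

    # Ignorance-prior approval: approve candidates above the voter's mean utility.
    approvals = (U > U.mean(axis=1, keepdims=True)).sum(axis=0)

    # Sincere plurality heuristic: one vote for the single favourite candidate.
    plurality = np.bincount(U.argmax(axis=1), minlength=len(candidates))

    for i, name in enumerate(candidates):
        print(f"{name}: approvals={approvals[i]}, plurality votes={plurality[i]}")
    ```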

  2. Ignoring the irrelevant: auditory tolerance of audible but innocuous sounds in the bat-detecting ears of moths

    Science.gov (United States)

    Fullard, James H.; Ratcliffe, John M.; Jacobs, David S.

    2008-03-01

    Noctuid moths listen for the echolocation calls of hunting bats and respond to these predator cues with evasive flight. The African bollworm moth, Helicoverpa armigera, feeds at flowers near intensely singing cicadas, Platypleura capensis, yet does not avoid them. We determined that the moth can hear the cicada by observing that both of its auditory receptors (A1 and A2 cells) respond to the cicada’s song. The firing response of the A1 cell rapidly adapts to the song and develops spike periods in less than a second that are in excess of those reported to elicit avoidance flight to bats in earlier studies. The possibility also exists that for at least part of the day, sensory input in the form of olfaction or vision overrides the moth’s auditory responses. While auditory tolerance appears to allow H. armigera to exploit a food resource in close proximity to acoustic interference, it may render their hearing defence ineffective and make them vulnerable to predation by bats during the evening when cicadas continue to sing. Our study describes the first field observation of an eared insect ignoring audible but innocuous sounds.

  3. Changing methodologies in TESOL

    CERN Document Server

    Spiro, Jane

    2013-01-01

    Covering core topics from vocabulary and grammar to teaching, writing speaking and listening, this textbook shows you how to link research to practice in TESOL methodology. It emphasises how current understandings have impacted on the language classroom worldwide and investigates the meaning of 'methods' and 'methodology' and the importance of these for the teacher: as well as the underlying assumptions and beliefs teachers bring to bear in their practice. By introducing you to language teaching approaches, you will explore the way these are influenced by developments in our understanding of l

  4. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

    on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable to gain a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss......Nursing research is often concerned with lived experiences in human life using phenomenological and hermeneutic approaches. These empirical studies may use different creative expressions and art-forms to describe and enhance an embodied and personalised understanding of lived experiences. Drawing...... may support a respectful renewal of phenomenological research traditions in nursing research....

  5. Computer Network Operations Methodology

    Science.gov (United States)

    2004-03-01

    means of their computer information systems. Disrupt - This type of attack focuses on disrupting as “attackers might surreptitiously reprogram enemy...by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective...between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that

  6. SCI Hazard Report Methodology

    Science.gov (United States)

    Mitchell, Michael S.

    2010-01-01

    This slide presentation reviews the methodology in creating a Source Control Item (SCI) Hazard Report (HR). The SCI HR provides a system safety risk assessment for the following Ares I Upper Stage Production Contract (USPC) components (1) Pyro Separation Systems (2) Main Propulsion System (3) Reaction and Roll Control Systems (4) Thrust Vector Control System and (5) Ullage Settling Motor System components.

  7. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal oriented decomposition of the plant purpose into the means of achieving the purpose. This approach leads to nodes with simple functions from which the selection of process and deviation variables follow directly...

  8. Complicating Methodological Transparency

    Science.gov (United States)

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  9. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    We survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  10. NUSAM Methodology for Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Snell, Mark K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-07-01

    This document provides a methodology for the performance-based assessment of security systems designed for the protection of nuclear and radiological materials and the processes that produce and/or involve them. It is intended for use with both relatively simple installations and with highly regulated complex sites with demanding security requirements.

  11. MIRD methodology. Part 1

    International Nuclear Information System (INIS)

    Rojo, Ana M.

    2004-01-01

    This lecture develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this first part, the basic concepts and the main equations are presented. The ICRP Dosimetric System is also explained. (author)
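
    For reference, the central MIRD relation that such a lecture builds on (standard formulation, quoted here as background rather than from the lecture itself):

    ```latex
    % Mean absorbed dose to a target region r_T: sum over source regions r_S of
    % the cumulated (time-integrated) activity times the corresponding S value,
    \[
      D(r_T) \;=\; \sum_{r_S} \tilde{A}(r_S)\, S(r_T \leftarrow r_S),
      \qquad
      \tilde{A}(r_S) \;=\; \int_0^{\infty} A(r_S, t)\, \mathrm{d}t ,
    \]
    % where S(r_T \leftarrow r_S) is the absorbed dose in r_T per unit
    % cumulated activity in r_S.
    ```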

  12. Response Surface Methodology

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.

    2014-01-01

    Abstract: This chapter first summarizes Response Surface Methodology (RSM), which started with Box and Wilson’s article in 1951 on RSM for real, non-simulated systems. RSM is a stepwise heuristic that uses first-order polynomials to approximate the response surface locally. An estimated polynomial
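
    A minimal sketch of the local first-order step that RSM is built on, with hypothetical design points and responses: fit a first-order polynomial by least squares to a small factorial design and read off the estimated direction of steepest ascent.

    ```python
    # Minimal RSM sketch (hypothetical data): fit y ~ b0 + b1*x1 + b2*x2 to a
    # 2^2 factorial design in coded units, then take the steepest-ascent
    # direction from the fitted gradient (b1, b2).
    import numpy as np

    X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)  # design
    y = np.array([54.3, 60.1, 64.8, 70.4])                           # responses

    A = np.column_stack([np.ones(len(X)), X])        # columns [1, x1, x2]
    b0, b1, b2 = np.linalg.lstsq(A, y, rcond=None)[0]

    grad = np.array([b1, b2])
    direction = grad / np.linalg.norm(grad)          # unit steepest-ascent step

    print(f"fitted model: y = {b0:.1f} + {b1:.2f}*x1 + {b2:.2f}*x2")
    print("steepest-ascent direction (coded units):", np.round(direction, 2))
    ```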

  13. MIRD methodology. Part 2

    International Nuclear Information System (INIS)

    Gomez Parada, Ines

    2004-01-01

    This paper develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this second part, different methods for the calculation of the accumulated activity are presented, together with the effective half life definition. Different forms of Retention Activity curves are also shown. (author)
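
    As background to the quantities mentioned (standard relations, not the lecture's own derivation), the effective half-life and, for a mono-exponential retention curve with initial activity A_0, the cumulated activity are:

    ```latex
    \[
      \frac{1}{T_{\mathrm{eff}}} \;=\; \frac{1}{T_{\mathrm{phys}}} + \frac{1}{T_{\mathrm{biol}}},
      \qquad
      \tilde{A} \;=\; \int_0^{\infty} A_0\, e^{-(\ln 2 / T_{\mathrm{eff}})\, t}\, \mathrm{d}t
                \;=\; \frac{A_0\, T_{\mathrm{eff}}}{\ln 2}
                \;\approx\; 1.44\, A_0\, T_{\mathrm{eff}} .
    \]
    ```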

  14. Creative Methodologies to Enhance Communication

    Science.gov (United States)

    Kennedy, Lucille; Brewer, Gayle

    2016-01-01

    The experiences and opinions of people with learning disabilities are often ignored or devalued. Oral and life history projects allow individuals to communicate their own opinions and experiences. This process can lead to more meaningful interactions between those with learning disabilities and support workers. Whilst the interview techniques…

  15. A Proven Methodology for Developing Secure Software and Applying It to Ground Systems

    Science.gov (United States)

    Bailey, Brandon

    2016-01-01

    Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?

  16. Cesare Lombroso: Methodological ambiguities and brilliant intuitions.

    Science.gov (United States)

    Gatti, Uberto; Verde, Alfredo

    2012-01-01

    This paper on Cesare Lombroso aims to assess his contribution to the criminological sciences. Although much praised worldwide, Lombroso was also the target of scathing criticism and unmitigated condemnation. Examination of Lombroso's method of data collection and analysis reveals his weakness. Indeed, his approach was extremely naive, simplistic and uncritical, aimed at irrefutably demonstrating the hypotheses that he championed, without exercising the methodological caution that was already beginning to characterize scientific research in his day. However, we must acknowledge that his biological theories of crime are undergoing new developments as a result of the recent success of biological psychiatry. On the other hand we should recognize that his work was not limited to his biological central theory; rather, it covered a range of cues and concepts, for the most part ignored, that demonstrate his interest in the economic, cultural and social factors that impact on crime. For these reasons, Lombroso appears to have anticipated many modern conceptions regarding delinquent behavior and criminal justice, such as those of restorative justice, the so-called "situational" theories of criminal behavior and white collar crime. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. CIAU methodology and BEPU applications

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.

    2009-01-01

    Best-Estimate calculation results from complex thermal-hydraulic system codes (like Relap5, Cathare, Athlet, Trace, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore the use of best-estimate codes within the reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and the deficiencies of those codes. Uncertainties may have different origins, ranging from the approximation of the models, to the approximation of the numerical solution, and to the lack of precision of the values adopted for boundary and initial conditions. The amount of uncertainty that affects a calculation may strongly depend upon the codes and the modeling techniques (i.e. the code's users). A consistent and robust uncertainty methodology must be developed taking into consideration all the above aspects. The CIAU (Code with the capability of Internal Assessment of Uncertainty) and the UMAE (Uncertainty Methodology based on Accuracy Evaluation) methods have been developed by the University of Pisa (UNIPI) in the framework of long-lasting research activities started in the 1980s and involving several researchers. CIAU is extensively discussed in the available technical literature, Refs. [1, 2, 3, 4, 5, 6, 7], and tens of additional relevant papers, which provide comprehensive details about the method, can be found in the bibliography lists of the above references. Therefore, the present paper supplies only 'spot-information' about CIAU and focuses mostly on applications to some cases of industrial interest. In particular, the application of CIAU to the OECD BEMUSE (Best Estimate Methods Uncertainty and Sensitivity Evaluation, [8, 9]) project is discussed, together with a critical comparison with other uncertainty methods (in relation to items like: sources of uncertainties, selection of the input parameters and quantification of

  18. Clinical professional governance for detailed clinical models.

    Science.gov (United States)

    Goossen, William; Goossen-Baremans, Anneke

    2013-01-01

    This chapter describes the need for Detailed Clinical Models for contemporary Electronic Health Systems, data exchange and data reuse. It starts with an explanation of the components related to Detailed Clinical Models, with a brief summary of knowledge representation, including terminologies representing clinically relevant "things" in the real world, and information models that abstract these in order to let computers process data about these things. Next, Detailed Clinical Models are defined and their purpose is described. This builds on existing developments around the world and culminates in current work to create a technical specification at the level of the International Standards Organization. The core components of properly expressed Detailed Clinical Models are illustrated, including clinical knowledge and context, data element specification, code bindings to terminologies, and meta-information about authors and versioning, among others. Detailed Clinical Models to date are heavily based on user requirements and specify the conceptual and logical levels of modelling. This is not precise enough for specific implementations, which require an additional step. However, this allows Detailed Clinical Models to serve as specifications for many different kinds of implementations. Examples of Detailed Clinical Models are presented both in text and in Unified Modelling Language. Detailed Clinical Models can be positioned in health information architectures, where they serve at the most detailed granular level. The chapter ends with examples of projects that create and deploy Detailed Clinical Models. All have in common that they can often reuse materials from earlier projects, and that strict governance of these models is essential to use them safely in health care information and communication technology. Clinical validation is one point of such governance, and model testing another. The Plan Do Check Act cycle can be applied for governance of Detailed Clinical Models

  19. Social control of the quality of public services: Theory, methodology and results of empirical research

    Directory of Open Access Journals (Sweden)

    Evgeny A. Kapoguzov

    2017-06-01

    Full Text Available The article addresses the theoretical and methodological aspects of social control in relation to the possibility of its implementation in the production of public services. The interdisciplinary nature of the discourse on social control is presented, the evolution of ideas about it within social science concepts is traced, and its relationship with related categories, in particular "public control" and "civil control", is revealed. The evolution of the category "institutionalization" is also traced, and the lack of unambiguity in its interpretation is shown. The normative value of the institutionalization of social practices in institutional design is presented, in particular with regard to improving the provision of public services. The barriers to the institutionalization of social control over the quality of public services (resource, informational, institutional) are characterized. The results of a mass survey of consumers of public services conducted in December 2016 in the Multifunctional Center (MFC) of the city of Omsk are presented. Unlike other surveys and publications that only assess the level of customer satisfaction and do not give a detailed explanation of consumers' attitudes to the ongoing institutional changes, this paper analyses consumer attitudes and beliefs both toward meaningful attributes of the quality of public services and toward various institutional alternatives for influencing the quality of public services. According to the results of the mass survey, a low readiness for social action was established, due to high transaction costs, rational ignorance and the free-rider problem. The possibility of institutionalizing the practice of social action, and of consumers creating a specialized organization for the protection of consumer rights in the production of public services, is discussed.

  20. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation, to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, core tenets described through a wide range of settings.

  1. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    As part of learning at the Nordic Workshop of Evidence-based Medicine, we have read with interest the practice guidelines for central venous access, published in your Journal in 2012.1 We appraised the quality of this guideline using the checklist developed by The Evidence-Based Medicine Working ...... are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest....... Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations...

  2. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Asja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  3. Steganography: LSB Methodology

    Science.gov (United States)

    2012-08-02

    Progress report, "Steganography: LSB Methodology". The report cites the J. Fridrich, M. Goljan and R. Du paper titled "Reliable detection of LSB steganography in grayscale and color images" (in J. Dittmann, K. Nahrstedt, and P. Wohlmacher, editors, Proceedings of the ACM, Special...). From the abstract: in computer science, steganography is the science
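
    As a minimal illustration of the LSB idea the report deals with (a generic sketch, not the report's implementation; real tools add headers, encryption and capacity handling), message bits can be written into the least significant bit of each pixel byte and read back:

    ```python
    # Generic LSB embedding/extraction sketch on a synthetic grayscale image.
    import numpy as np

    def embed_lsb(pixels: np.ndarray, message: bytes) -> np.ndarray:
        bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
        if bits.size > pixels.size:
            raise ValueError("cover image too small for message")
        stego = pixels.flatten().copy()
        stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits  # overwrite LSBs
        return stego.reshape(pixels.shape)

    def extract_lsb(pixels: np.ndarray, n_bytes: int) -> bytes:
        bits = (pixels.flatten()[:n_bytes * 8] & 1).astype(np.uint8)
        return np.packbits(bits).tobytes()

    cover = np.random.default_rng(2).integers(0, 256, size=(64, 64), dtype=np.uint8)
    stego = embed_lsb(cover, b"hidden")
    print(extract_lsb(stego, 6))                                       # b'hidden'
    print(int(np.max(np.abs(cover.astype(int) - stego.astype(int)))))  # at most 1
    ```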

  4. Soil Radiological Characterisation Methodology

    International Nuclear Information System (INIS)

    Attiogbe, Julien; Aubonnet, Emilie; De Maquille, Laurence; De Moura, Patrick; Desnoyers, Yvon; Dubot, Didier; Feret, Bruno; Fichet, Pascal; Granier, Guy; Iooss, Bertrand; Nokhamzon, Jean-Guy; Ollivier Dehaye, Catherine; Pillette-Cousin, Lucien; Savary, Alain

    2014-12-01

    This report presents the general methodology and best practice approaches which combine proven existing techniques for sampling and characterisation to assess the contamination of soils prior to remediation. It is based on feedback from projects conducted by the main French nuclear stakeholders involved in the field of remediation and dismantling (EDF, CEA, AREVA and IRSN). The application of this methodology will enable project managers to obtain the elements necessary for drawing up the files associated with remediation operations, as required by the regulatory authorities. It is applicable to each of the steps necessary for the piloting of remediation work-sites, depending on the objectives targeted (release into the public domain, re-use, etc.). The main part describes the applied statistical methodology, with exploratory analysis and variogram data, and the identification of singular points and their location. The results obtained permit a mapping that identifies the contaminated surface and subsurface areas. It covers radiological site characterisation from the initial investigations, based on historical and functional analysis, through to checking that the remediation objectives have been met. This is followed by an example application drawn from the feedback of the remediation of a contaminated site at the Fontenay-aux-Roses facility. It is supplemented by a glossary of the main terms used in the field, drawn from different publications and international standards. This technical report supports the ISO standard ISO/TC 85/SC 5 N 18557 'Sampling and characterisation principles for soils, buildings and infrastructures contaminated by radionuclides for remediation purposes'. (authors) [fr
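
    A small sketch of the exploratory variogram step mentioned above, on hypothetical measurements: the empirical semivariogram averages half the squared differences between pairs of sampling points, binned by separation distance.

    ```python
    # Empirical semivariogram sketch (hypothetical sampling data):
    # gamma(h) = mean of 0.5*(z_i - z_j)^2 over pairs in each distance bin.
    import numpy as np

    rng = np.random.default_rng(3)
    xy = rng.uniform(0, 100, size=(200, 2))           # sampling locations (m)
    z = 50 + 0.2 * xy[:, 0] + rng.normal(0, 5, 200)   # measured activity (e.g. Bq/g)

    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)  # separations
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2                     # semivariances
    iu = np.triu_indices(len(z), k=1)                             # each pair once

    bins = np.arange(0, 60, 10)
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (d[iu] >= lo) & (d[iu] < hi)
        if mask.any():
            print(f"lag {lo:2.0f}-{hi:2.0f} m: gamma = {sq[iu][mask].mean():6.1f} (n={mask.sum()})")
    ```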

  5. Waste Package Component Design Methodology Report

    International Nuclear Information System (INIS)

    D.C. Mecham

    2004-01-01

    This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding a variety of technical details provided in the main body of the report. The purpose of this report is to document and ensure appropriate design methods are used in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of necessary design inputs, justification of design assumptions, and use of appropriate analysis methods, and computational tools. This design work is subject to ''Quality Assurance Requirements and Description''. The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others are interested to various levels of detail in the design methods and therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other less detailed parts are likely to undergo further refinement. The design methodology is intended to provide designs that satisfy the safety and operational

  6. Waste Package Component Design Methodology Report

    Energy Technology Data Exchange (ETDEWEB)

    D.C. Mecham

    2004-07-12

    This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding a variety of technical details provided in the main body of the report. The purpose of this report is to document and ensure appropriate design methods are used in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of necessary design inputs, justification of design assumptions, and use of appropriate analysis methods, and computational tools. This design work is subject to ''Quality Assurance Requirements and Description''. The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others are interested to various levels of detail in the design methods and therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other less detailed parts are likely to undergo further refinement. The design methodology is intended to provide designs that satisfy the safety

  7. Visual Memory : The Price of Encoding Details

    NARCIS (Netherlands)

    Nieuwenstein, Mark; Kromm, Maria

    2017-01-01

    Studies on visual long-term memory have shown that we have a tremendous capacity for remembering pictures of objects, even at a highly detailed level. What remains unclear, however, is whether encoding objects at such a detailed level comes at any cost. In the current study, we examined how the

  8. Understanding brains: details, intuition, and big data.

    Science.gov (United States)

    Marder, Eve

    2015-05-01

    Understanding how the brain works requires a delicate balance between the appreciation of the importance of a multitude of biological details and the ability to see beyond those details to general principles. As technological innovations vastly increase the amount of data we collect, the importance of intuition into how to analyze and treat these data may, paradoxically, become more important.

  9. Understanding Brains: Details, Intuition, and Big Data

    OpenAIRE

    Marder, Eve

    2015-01-01

    Understanding how the brain works requires a delicate balance between the appreciation of the importance of a multitude of biological details and the ability to see beyond those details to general principles. As technological innovations vastly increase the amount of data we collect, the importance of intuition into how to analyze and treat these data may, paradoxically, become more important.

  10. Understanding brains: details, intuition, and big data.

    Directory of Open Access Journals (Sweden)

    Eve Marder

    2015-05-01

    Full Text Available Understanding how the brain works requires a delicate balance between the appreciation of the importance of a multitude of biological details and the ability to see beyond those details to general principles. As technological innovations vastly increase the amount of data we collect, the importance of intuition into how to analyze and treat these data may, paradoxically, become more important.

  11. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    Boesmans, B.; Auglaire, M.; Snoeck, J.

    1997-01-01

    The author addresses the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information

  12. Methodology of site protection studies

    International Nuclear Information System (INIS)

    Farges, L.

    1980-01-01

    Preliminary studies preceding the building of a nuclear facility aim at assessing the choice of a site and establishing operating and control procedures. These studies are of two types. Studies on the impact of the environment on the nuclear facility to be constructed form one type, and studies on the impact of nuclear facilities on the environment form the second type. A methodology giving a framework for studies of the second type is presented. These studies are undertaken to choose suitable sites for nuclear facilities. After a preliminary selection of a site based on a first estimate, a detailed site study is undertaken. The procedure for this consists of five successive phases, namely, (1) an inquiry assessing the initial state of the site, (2) an initial synthesis of accumulated information for assessing the health and safety consequences of releases, (3) laboratory and field studies simulating the movement of waste products for a quantitative assessment of effects, (4) a final synthesis laying down the release limits and radiological control methods, and (5) conclusions based on comparing the data of the final synthesis to the limits prescribed by regulations. These five phases are outlined. The role of periodic reassessments after the facility has been in operation for some time is explained. (M.G.B.)

  13. Methodology of a systematic review.

    Science.gov (United States)

    Linares-Espinós, E; Hernández, V; Domínguez-Escrig, J L; Fernández-Pello, S; Hevia, V; Mayor, J; Padilla-Fernández, B; Ribal, M J

    2018-05-03

    The objective of evidence-based medicine is to employ the best scientific information available to apply to clinical practice. Understanding and interpreting the scientific evidence involves understanding the available levels of evidence, where systematic reviews and meta-analyses of clinical trials are at the top of the levels-of-evidence pyramid. The review process should be well developed and planned to reduce biases and eliminate irrelevant and low-quality studies. The steps for implementing a systematic review include (i) correctly formulating the clinical question to answer (PICO), (ii) developing a protocol (inclusion and exclusion criteria), (iii) performing a detailed and broad literature search and (iv) screening the abstracts of the studies identified in the search and subsequently of the selected complete texts (PRISMA). Once the studies have been selected, we need to (v) extract the necessary data into a form designed in the protocol to summarise the included studies, (vi) assess the biases of each study, identifying the quality of the available evidence, and (vii) develop tables and text that synthesise the evidence. A systematic review involves a critical and reproducible summary of the results of the available publications on a particular topic or clinical question. To improve scientific writing, the methodology for implementing a systematic review is shown in a structured manner. Copyright © 2018 AEU. Published by Elsevier España, S.L.U. All rights reserved.

  14. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  15. Case Study Research Methodology

    Directory of Open Access Journals (Sweden)

    Mark Widdowson

    2011-01-01

    Full Text Available Commenting on the lack of case studies published in modern psychotherapy publications, the author reviews the strengths of case study methodology and responds to common criticisms, before providing a summary of types of case studies including clinical, experimental and naturalistic. Suggestions are included for developing systematic case studies and brief descriptions are given of a range of research resources relating to outcome and process measures. Examples of a pragmatic case study design and a hermeneutic single-case efficacy design are given and the paper concludes with some ethical considerations and an exhortation to the TA community to engage more widely in case study research.

  16. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
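
    A standard example of the kind of determination the book covers (textbook formula, stated here as background): the sample size needed to estimate a normal mean with known standard deviation to a given margin of error.

    ```latex
    % Sample size for estimating a normal mean with known \sigma to within a
    % margin of error E at confidence level 1-\alpha:
    \[
      n \;\ge\; \left( \frac{z_{1-\alpha/2}\, \sigma}{E} \right)^{\!2},
    \]
    % e.g. \sigma = 10, E = 2 and 95\% confidence (z_{0.975} \approx 1.96) give
    % n \ge (1.96 \times 10 / 2)^2 \approx 96.04, i.e. n = 97.
    ```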

  17. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    The reports comprising this volume concern the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in granites in Cornwall, with particular reference to the effect of structures imposed by discontinuities on the engineering behaviour of rock masses. The topics covered are: in-situ stress measurements using (a) the hydraulic fracturing method, or (b) the US Bureau of Mines deformation probe; scanline discontinuity survey - coding form and instructions, and data; applicability of geostatistical estimation methods to scalar rock properties; comments on in-situ stress at the Carwynnen test mine and the state of stress in the British Isles. (U.K.)

  18. Microphysics evolution and methodology

    International Nuclear Information System (INIS)

    Dionisio, J.S.

    1990-01-01

    A few general features of the evolution of microphysics and their relationship with microphysics methodology are briefly surveyed. Several pluri-disciplinary and interdisciplinary aspects of microphysics research are also discussed in the present scientific context. The need for an equilibrium between individual tendencies and the collective constraints required by team work, already formulated thirty years ago by Frederic Joliot, is particularly stressed in the present conjuncture of Nuclear Research, which favours very large team projects and discourages individual initiatives. The increasing importance of the science of science (due to its multiple social, economic and ecological aspects) and the stronger competition between national and international tendencies in scientific (and technical) cooperation are also discussed. (author)

  19. MIRD methodology; Metodologia MIRD

    Energy Technology Data Exchange (ETDEWEB)

    Rojo, Ana M [Autoridad Regulatoria Nuclear, Buenos Aires (Argentina); Gomez Parada, Ines [Sociedad Argentina de Radioproteccion, Buenos Aires (Argentina)

    2004-07-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in estimating the dose to organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (Nos. 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for absorbed dose calculations in different tissues is explained.

  20. Beam optimization: improving methodology

    International Nuclear Information System (INIS)

    Quinteiro, Guillermo F.

    2004-01-01

    Different optimization techniques commonly used in biology and food technology allow a systematic and complete analysis of response functions. In spite of the great interest of medical and nuclear physics in the problem of optimizing mixed beams, little attention has been given to sophisticated mathematical tools. Indeed, many techniques are perfectly suited to the typical problem of beam optimization. This article is intended as a guide to the use of two methods, namely Response Surface Methodology and Simplex, that are expected to speed up the optimization process and, at the same time, give more insight into the relationships among the dependent variables controlling the response
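
    A minimal example of the Simplex (Nelder-Mead) method mentioned above, applied to a toy two-beam weighting problem; the response matrix and target values are hypothetical, and SciPy's optimizer is used only as one convenient implementation.

    ```python
    # Toy beam-weight optimization with the Nelder-Mead simplex method:
    # choose two beam weights so the combined response matches a target.
    import numpy as np
    from scipy.optimize import minimize

    target = np.array([60.0, 40.0])            # desired response at two points
    response = np.array([[1.0, 0.4],           # response at point i per unit
                         [0.3, 1.1]])          # weight of beam j (hypothetical)

    def objective(weights: np.ndarray) -> float:
        # Sum of squared deviations from the prescribed response.
        return float(np.sum((response @ weights - target) ** 2))

    result = minimize(objective, x0=[1.0, 1.0], method="Nelder-Mead")
    print("optimal beam weights:", np.round(result.x, 2))
    print("residual:", round(result.fun, 6))
    ```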

  1. Literacy research methodologies

    CERN Document Server

    Duke, Nell K

    2012-01-01

    The definitive reference on literacy research methods, this book serves as a key resource for researchers and as a text in graduate-level courses. Distinguished scholars clearly describe established and emerging methodologies, discuss the types of questions and claims for which each is best suited, identify standards of quality, and present exemplary studies that illustrate the approaches at their best. The book demonstrates how each mode of inquiry can yield unique insights into literacy learning and teaching and how the methods can work together to move the field forward.   New to This Editi

  2. Alternative pricing methodologies

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    With the increased interest in competitive market forces and growing recognition of the deficiencies in current practices, FERC and others are exploring alternatives to embedded cost pricing. A number of these alternatives are discussed in this chapter. Marketplace pricing, discussed briefly here, is the subject of the next chapter. Obviously, the pricing formula may combine several of these methodologies. One utility of which the authors are aware is seeking a price equal to the sum of embedded costs, opportunity costs, line losses, value of service, FERC's percentage adder formula and a contract service charge

  3. Methodology to remediate a mixed waste site

    Energy Technology Data Exchange (ETDEWEB)

    Berry, J.B.

    1994-08-01

    In response to the need for a comprehensive and consistent approach to the complex issue of mixed waste management, a generalized methodology for remediation of a mixed waste site has been developed. The methodology is based on requirements set forth in the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and the Resource Conservation and Recovery Act (RCRA) and incorporates "lessons learned" from process design, remediation methodologies, and remediation projects. The methodology is applied to the treatment of 32,000 drums of mixed waste sludge at the Oak Ridge K-25 Site. Process technology options are developed and evaluated, first with regard to meeting system requirements and then with regard to CERCLA performance criteria. The following process technology options are investigated: (1) no action, (2) separation of hazardous and radioactive species, (3) dewatering, (4) drying, and (5) solidification/stabilization. The first two options were eliminated from detailed consideration because they did not meet the system requirements. A quantitative evaluation clearly showed that, based on system constraints and project objectives, either dewatering or drying the mixed waste sludge was superior to the solidification/stabilization process option. The ultimate choice between the drying and the dewatering options will be made on the basis of a technical evaluation of the relative merits of proposals submitted by potential subcontractors.
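
    The kind of quantitative screening described above can be illustrated, in a very simplified and purely hypothetical form, as a weighted scoring of process options against evaluation criteria; the weights, scores and criterion names below are invented and do not reproduce the report's actual evaluation.

```python
# Hypothetical screening matrix: options scored 1-5 against weighted criteria.
criteria_weights = {"effectiveness": 0.4, "implementability": 0.3, "cost": 0.3}

option_scores = {
    "dewatering":    {"effectiveness": 4, "implementability": 4, "cost": 4},
    "drying":        {"effectiveness": 5, "implementability": 3, "cost": 3},
    "stabilization": {"effectiveness": 3, "implementability": 3, "cost": 2},
}

def weighted_score(scores, weights):
    """Weighted sum of criterion scores for one option."""
    return sum(scores[c] * w for c, w in weights.items())

# Rank the options from highest to lowest weighted score.
for name, scores in sorted(option_scores.items(),
                           key=lambda kv: -weighted_score(kv[1], criteria_weights)):
    print(f"{name}: {weighted_score(scores, criteria_weights):.2f}")
```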

  4. Methodology to remediate a mixed waste site

    International Nuclear Information System (INIS)

    Berry, J.B.

    1994-08-01

    In response to the need for a comprehensive and consistent approach to the complex issue of mixed waste management, a generalized methodology for remediation of a mixed waste site has been developed. The methodology is based on requirements set forth in the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and the Resource Conservation and Recovery Act (RCRA) and incorporates ''lessons learned'' from process design, remediation methodologies, and remediation projects. The methodology is applied to the treatment of 32,000 drums of mixed waste sludge at the Oak Ridge K-25 Site. Process technology options are developed and evaluated, first with regard to meeting system requirements and then with regard to CERCLA performance criteria. The following process technology options are investigated: (1) no action, (2) separation of hazardous and radioactive species, (3) dewatering, (4) drying, and (5) solidification/stabilization. The first two options were eliminated from detailed consideration because they did not meet the system requirements. A quantitative evaluation clearly showed that, based on system constraints and project objectives, either dewatering or drying the mixed waste sludge was superior to the solidification/stabilization process option. The ultimate choice between the drying and the dewatering options will be made on the basis of a technical evaluation of the relative merits of proposals submitted by potential subcontractors

  5. Comparative Studies: historical, epistemological and methodological notes

    Directory of Open Access Journals (Sweden)

    Juan Ignacio Piovani

    2017-09-01

    Full Text Available In this article some historical, epistemological and methodological issues related to comparative studies in the social sciences are addressed, with specific reference to the field of education. The starting point is a discussion of the meaning of comparison, its logical structure and its presence in science and in everyday life. There follows a presentation and critical appraisal of the perspectives that regard comparison as a scientific method. It is argued that, even if this restrictive meaning of comparison as a method is rejected, there is some consensus on the specificity of comparative studies within the social sciences. In relation to them, the article addresses in more detail those studies that can be defined as trans-contextual (cross-national and cross-cultural), with emphasis on the main methodological and technical challenges they face. The socio-historical comparative perspective, which has gained importance in recent years in the field of education, is also discussed.

  6. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  7. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  8. Intelligent systems engineering methodology

    Science.gov (United States)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  9. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation)
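
    Since the factors entering the RH equation are only characterised here as a collection of key hazard-relevant quantities, the sketch below illustrates only the general idea of expressing a hazard as a product of dimensionless factors and reporting it as a ratio to a fixed baseline; the factor names and values are hypothetical and are not the RH equation itself.

```python
def relative_hazard(activity_factors, baseline_factors):
    """Purely illustrative: treat the hazard of a risk management activity as a
    product of dimensionless factors and report it as a ratio to the same
    product evaluated for a fixed baseline."""
    def product(factors):
        result = 1.0
        for value in factors.values():
            result *= value
        return result
    return product(activity_factors) / product(baseline_factors)

baseline = {"source_term": 1.0, "release_potential": 1.0, "exposure_potential": 1.0}
activity = {"source_term": 0.4, "release_potential": 0.8, "exposure_potential": 1.0}
print(relative_hazard(activity, baseline))  # RH < 1: the activity reduces the hazard
```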

  10. Scrum methodology in banking environment

    OpenAIRE

    Strihová, Barbora

    2015-01-01

    Bachelor thesis "Scrum methodology in banking environment" is focused on one of agile methodologies called Scrum and description of the methodology used in banking environment. Its main goal is to introduce the Scrum methodology and outline a real project placed in a bank focused on software development through a case study, address problems of the project, propose solutions of the addressed problems and identify anomalies of Scrum in software development constrained by the banking environmen...

  11. Experimental Economics: Some Methodological Notes

    OpenAIRE

    Fiore, Annamaria

    2009-01-01

    The aim of this work is presenting in a self-contained paper some methodological aspects as they are received in the current experimental literature. The purpose has been to make a critical review of some very influential papers dealing with methodological issues. In other words, the idea is to have a single paper where people first approaching experimental economics can find summarised (some) of the most important methodological issues. In particular, the focus is on some methodological prac...

  12. Using proliferation assessment methodologies for Safeguards-by-Design

    International Nuclear Information System (INIS)

    Van der Meer, K.; Rossa, R.; Turcanu, C.; Borella, A.

    2013-01-01

    MYRRHA, an accelerator driven system (ADS), is designed as a proton accelerator coupled to a liquid Pb-Bi spallation target, surrounded by a Pb-Bi cooled sub-critical neutron multiplying medium in a pool type configuration. An assessment based on three methodologies was made of the proliferation risks of the MYRRHA ADS in comparison with the BR2 MTR, an existing research reactor at the Belgian Nuclear Research Centre SCK-CEN. The methodologies used were the TOPS (Technical Opportunities to Increase the Proliferation Resistance of Nuclear Power Systems), PR-PP and INPRO methodologies. The various features of the methodologies are described and the results of the assessments are given and discussed. It is concluded that it would be useful to define one single methodology with two options, to perform a quick and a more detailed assessment respectively. The paper is followed by the slides of the presentation

  13. Post Entitlement Management Information - Detail Database

    Data.gov (United States)

    Social Security Administration — Contains data that supports the detailed and aggregate receipt, pending and clearance data, as well as other strategic and tactical MI for many Title II and Title...

  14. Detailed Safety Review of Anthrax Vaccine Adsorbed

    National Research Council Canada - National Science Library

    2001-01-01

    To date, 18 human studies have assessed the safety of anthrax vaccination. These studies, some stretching back almost 50 years, reported adverse events after vaccination in varying degrees of detail...

  15. Cleaner combustion developing detailed chemical kinetic models

    CERN Document Server

    Battin-Leclerc, Frédérique; Simmie, John M

    2013-01-01

    This book describes the reactive chemistry of minor pollutants within extensively validated detailed mechanisms for traditional fuels, and also for innovative surrogates, describing the complex chemistry of new, environmentally important bio-fuels.

  16. Template Assembly for Detailed Urban Reconstruction

    KAUST Repository

    Nan, Liangliang

    2015-05-04

    We propose a new framework to reconstruct building details by automatically assembling 3D templates on coarse textured building models. In a preprocessing step, we generate an initial coarse model to approximate a point cloud computed using Structure from Motion and Multi View Stereo, and we model a set of 3D templates of facade details. Next, we optimize the initial coarse model to enforce consistency between geometry and appearance (texture images). Then, building details are reconstructed by assembling templates on the textured faces of the coarse model. The 3D templates are automatically chosen and located by our optimization-based template assembly algorithm that balances image matching and structural regularity. In the results, we demonstrate how our framework can enrich the details of coarse models using various data sets.

  17. Factors influencing detail detectability in radiologic imaging

    International Nuclear Information System (INIS)

    Gurvich, A.M.

    1985-01-01

    The detectability of various details is estimated quantitatively from the essential technical parameters of the imaging system and additional influencing factors, including viewing of the image. The analysis covers the formation of the input radiation distribution (contrast formation, influence of kVp). Noise, image contrast (gamma), the modulation transfer function and the contrast threshold of the observer influence details of different sizes to different degrees. This facilitates further optimization of imaging systems and their adaptation to specific imaging tasks

  18. An Innovative Synthesis Methodology for Process Intensification

    DEFF Research Database (Denmark)

    Lutze, Philip

    to improve a process. However, to date only a limited number have achieved implementation in industry, such as reactive distillation, dividing wall columns and reverse flow reactors. A reason for this is that the identification of the best PI option is neither simple nor systematic. That is to decide where......‐based solution approach. Starting from an analysis of existing processes, the methodology generates a set of PI process options. Subsequently, the initial search space is reduced through an ordered sequence of steps. As the search space decreases, more process details are added, increasing the complexity...

  19. Development of a flight software testing methodology

    Science.gov (United States)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
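
    A hedged sketch of what assertion-based (watchdog-style) checking of flight-control parameters might look like is given below; the parameter names, limits and the toy control law are invented for illustration and are not taken from the study.

```python
# Hypothetical watchdog-style assertions for a flight-control update step.
# Each assertion checks a program parameter whose violation also indirectly
# flags errors in the quantities used to compute it (collateral testing).

def check_control_state(altitude_m, pitch_deg, elevator_cmd_deg, dt_s):
    assert 0.0 < dt_s < 0.1, "time step out of range"
    assert -90.0 <= pitch_deg <= 90.0, "pitch angle out of physical range"
    assert -30.0 <= elevator_cmd_deg <= 30.0, "elevator command exceeds actuator limits"
    assert altitude_m > -500.0, "altitude below plausible minimum"

def update_elevator(pitch_deg, pitch_rate_dps):
    # Toy proportional-derivative law, for illustration only.
    cmd = -0.5 * pitch_deg - 0.1 * pitch_rate_dps
    return max(-30.0, min(30.0, cmd))

cmd = update_elevator(pitch_deg=5.0, pitch_rate_dps=1.0)
check_control_state(altitude_m=1200.0, pitch_deg=5.0, elevator_cmd_deg=cmd, dt_s=0.02)
print("all assertions passed for this step")
```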

  20. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    A final report summarizing the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in Cornwall. The geological setting of the test site in the Cornubian granite batholith is described. The effect of structure imposed by discontinuities on the engineering behaviour of rock masses is discussed and the scanline survey method of obtaining data on discontinuities in the rock mass is described. The applicability of some methods of statistical analysis for discontinuity data is reviewed. The requirement for remote geophysical methods of characterizing the mass is discussed and experiments using seismic and ultrasonic velocity measurements are reported. Methods of determining the in-situ stresses are described and the final results of a programme of in-situ stress measurements using the overcoring and hydrofracture methods are reported. (author)

  1. UNCOMMON SENSORY METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladimír Vietoris

    2015-02-01

    Full Text Available Sensory science is a young but rapidly developing field of the food industry. Currently, great emphasis is placed on the development of rapid data-collection techniques, the difference between consumers and trained panels is becoming blurred, and the role of sensory methodologists is to prepare evaluation procedures by which a lay panel (consumers) can achieve results identical to those of a trained panel. There are several conventional methods of sensory evaluation of food (ISO standards), but more and more sensory laboratories are developing methodologies that are less strict in the selection of evaluators, whose mechanism is easily understandable and whose results are easily interpretable. This paper deals with the mapping of marginal methods used in sensory evaluation of food (new types of profiles, CATA, TDS, napping).

  2. Safety class methodology

    International Nuclear Information System (INIS)

    Donner, E.B.; Low, J.M.; Lux, C.R.

    1992-01-01

    DOE Order 6430.1A, General Design Criteria (GDC), requires that DOE facilities be evaluated with respect to ''safety class items''. Although the GDC defines safety class items, it does not provide a methodology for selecting them. The methodology described in this paper was developed to assure that Safety Class Items at the Savannah River Site (SRS) are selected in a consistent and technically defensible manner. Safety class items are those in the highest of four categories determined to be of special importance to nuclear safety and merit appropriately higher-quality design, fabrication, and industrial test standards and codes. The identification of safety class items (SCIs) is approached using a cascading strategy that begins at the ''safety function'' level (i.e., a cooling function, ventilation function, etc.) and proceeds down to the system, component, or structure level. Thus, the items that are required to support a safety function are SCIs. The basic steps in this procedure apply to the determination of SCIs both for new project activities and for operating facilities. The GDC lists six characteristics of SCIs to be considered as a starting point for safety item classification. They are as follows: 1. Those items whose failure would produce exposure consequences that would exceed the guidelines in Section 1300-1.4, ''Guidance on Limiting Exposure of the Public,'' at the site boundary or nearest point of public access. 2. Those items required to maintain operating parameters within the safety limits specified in the Operational Safety Requirements during normal operations and anticipated operational occurrences. 3. Those items required for nuclear criticality safety. 4. Those items required to monitor the release of radioactive material to the environment during and after a Design Basis Accident. 5. Those items required to achieve and maintain the facility in a safe shutdown condition. 6. Those items that control Safety Class Items listed above

  3. T cell ignorance is bliss: T cells are not tolerized by Langerhans cells presenting human papillomavirus antigens in the absence of costimulation

    Directory of Open Access Journals (Sweden)

    Andrew W. Woodham

    2016-12-01

    Full Text Available Human papillomavirus type 16 (HPV16 infections are intra-epithelial, and thus, HPV16 is known to interact with Langerhans cells (LCs, the resident epithelial antigen-presenting cells (APCs. The current paradigm for APC-mediated induction of T cell anergy is through delivery of T cell receptor signals via peptides on MHC molecules (signal 1, but without costimulation (signal 2. We previously demonstrated that LCs exposed to HPV16 in vitro present HPV antigens to T cells without costimulation, but it remained uncertain if such T cells would remain ignorant, become anergic, or in the case of CD4+ T cells, differentiate into Tregs. Here we demonstrate that Tregs were not induced by LCs presenting only signal 1, and through a series of in vitro immunizations show that CD8+ T cells receiving signal 1+2 from LCs weeks after consistently receiving signal 1 are capable of robust effector functions. Importantly, this indicates that T cells are not tolerized but instead remain ignorant to HPV, and are activated given the proper signals. Keywords: T cell anergy, T cell ignorance, Immune tolerance, Human papillomavirus, HPV16, Langerhans cells

  4. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  5. Magnetic resonance imaging methodology

    International Nuclear Information System (INIS)

    Moser, Ewald; Stadlbauer, Andreas; Windischberger, Christian; Quick, Harald H.; Ladd, Mark E.

    2009-01-01

    Magnetic resonance (MR) methods are non-invasive techniques to provide detailed, multi-parametric information on human anatomy, function and metabolism. Sensitivity, specificity, spatial and temporal resolution may, however, vary depending on hardware (e.g., field strength, gradient strength and speed) and software (optimised measurement protocols and parameters for the various techniques). Furthermore, multi-modality imaging may enhance specificity to better characterise complex disease patterns. Positron emission tomography (PET) is an interesting, largely complementary modality, which might be combined with MR. Despite obvious advantages, combining these rather different physical methods may also pose challenging problems. At this early stage, it seems that PET quality may be preserved in the magnetic field and, if an adequate detector material is used for the PET, MR sensitivity should not be significantly degraded. Again, this may vary for the different MR techniques, whereby functional and metabolic MR is more susceptible than standard anatomical imaging. Here we provide a short introduction to MR basics and MR techniques, also discussing advantages, artefacts and problems when MR hardware and PET detectors are combined. In addition to references for more detailed descriptions of MR fundamentals and applications, we provide an early outlook on this novel and exciting multi-modality approach to PET/MR. (orig.)

  6. A detailed and verified wind resource atlas for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Mortensen, N G; Landberg, L; Rathmann, O; Nielsen, M N [Risoe National Lab., Roskilde (Denmark); Nielsen, P [Energy and Environmental Data, Aalberg (Denmark)

    1999-03-01

    A detailed and reliable wind resource atlas covering the entire land area of Denmark has been established. Key words of the methodology are wind atlas analysis, interpolation of wind atlas data sets, automated generation of digital terrain descriptions and modelling of local wind climates. The atlas contains wind speed and direction distributions, as well as mean energy densities of the wind, for 12 sectors and four heights above ground level: 25, 45, 70 and 100 m. The spatial resolution is 200 meters in the horizontal. The atlas has been verified by comparison with actual wind turbine power productions from over 1200 turbines. More than 80% of these turbines were predicted to within 10%. The atlas will become available on CD-ROM and on the Internet. (au)
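
    One quantity reported in such an atlas, the mean energy density of the wind, follows directly from the Weibull parameters of the speed distribution via the standard relation 0.5*rho*c^3*Gamma(1+3/k); the short sketch below applies that relation to illustrative parameter values that are not taken from the Danish atlas.

```python
import math

def mean_wind_power_density(weibull_scale_c, weibull_shape_k, air_density=1.225):
    """Mean power density (W/m^2) of the wind for a Weibull speed distribution:
    0.5 * rho * E[v^3] = 0.5 * rho * c^3 * Gamma(1 + 3/k)."""
    return 0.5 * air_density * weibull_scale_c ** 3 * math.gamma(1.0 + 3.0 / weibull_shape_k)

# Illustrative parameters for one sector and height (hypothetical, not atlas data):
print(f"{mean_wind_power_density(weibull_scale_c=7.5, weibull_shape_k=2.1):.0f} W/m^2")
```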

  7. Internal fire analysis screening methodology for the Salem Nuclear Generating Station

    International Nuclear Information System (INIS)

    Eide, S.; Bertucio, R.; Quilici, M.; Bearden, R.

    1989-01-01

    This paper reports on an internal fire analysis screening methodology that has been utilized for the Salem Nuclear Generating Station (SNGS) Probabilistic Risk Assessment (PRA). The methodology was first developed and applied in the Brunswick Steam Electric Plant (BSEP) PRA. The SNGS application includes several improvements and extensions to the original methodology. The SNGS approach differs significantly from traditional fire analysis methodologies by providing a much more detailed treatment of transient combustibles. This level of detail results in a model which is more usable for assisting in the management of fire risk at the plant

  8. Making detailed predictions makes (some) predictions worse

    Science.gov (United States)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  9. Air barrier details: How effective are they

    Energy Technology Data Exchange (ETDEWEB)

    A project was initiated to measure the air leakage through three typical details in wood frame walls: the header joist, electric outlets, and window openings. Three construction methods were tested: the poly approach, where a sealed internal polyethylene sheet and caulking provide the air barrier; an external air barrier approach using a continuous vapor permeable membrane sandwiched between two layers of external wall sheathing; and the airtight drywall approach (ADA), where the interior gypsum board finish along with framing and gaskets are the air barrier. Twelve sample panels using each of the three details were built using each of the construction approaches. A traditional wood-frame wall construction detail, with no effort made to create a continuous air barrier, was also built and tested for comparison. The samples were put in a test chamber so that air pressures could create infiltration or exfiltration through the panel under loads similar to those due to wind action. Measurements were made at several stages during construction of each sample to see the effect of different components on the air leakage. Overall, all but the traditional samples and the ADA electrical outlet panel exceeded the current tightness standards for glass and aluminum curtain walls. All three approaches could meet the airtightness standards of the R-2000 program. The total air leakage calculated for each approach is under 20% of that in traditional construction. Of the details tested, window detailing offers the greatest potential for increasing overall airtightness compared to traditional methods. 1 ref., 2 figs., 1 tab.

  10. La sociologie peut-elle ignorer la phylogenèse de l'esprit  ?

    Directory of Open Access Journals (Sweden)

    Joëlle Proust

    2012-01-01

    This article examines Albert Ogien's and Louis Quéré's arguments against social naturalism, that is, the metatheoretical project of integrating knowledge about the social, drawn from evolutionary biology and the cognitive sciences, into research carried out in the social sciences. Against Albert Ogien's arguments for the de facto and de jure irreducibility of the social to the cognitive, it is objected that research in present-day cognitive science that is relevant to the de facto claim fails to be taken into account, while de jure irreducibility introduces a dualism in the social sciences that is difficult to justify. The distinction between the epistemic and the cognitive realms is further presented as the ground of a de jure irreducibility; Albert Ogien, however, fails to conclusively establish that social coordination is a necessary precondition of sensitivity to epistemic norms. Louis Quéré, for his part, objects that cognitive science makes an ambiguous use of the concept of concept; a "rich" concept, which cognitive science tends to ignore, involving the understanding of truth, correction, etc., is of crucial relevance to sociology. It is responded that a meager concept of concept (unaccompanied by an analysis of what is epistemically distinctive of concepthood) is not only applied to characterize non-propositional thinking in animals; meager concepts are also part of humans' associative and evaluative repertoire, concerning, inter alia, their own capacities and the trustworthiness of their partners.

  11. Cost estimating for CERCLA remedial alternatives a unit cost methodology

    International Nuclear Information System (INIS)

    Brettin, R.W.; Carr, D.J.; Janke, R.J.

    1995-06-01

    The United States Environmental Protection Agency (EPA) Guidance for Conducting Remedial Investigations and Feasibility Studies Under CERCLA, Interim Final, dated October 1988 (EPA 1988) requires a detailed analysis be conducted of the most promising remedial alternatives against several evaluation criteria, including cost. To complete the detailed analysis, order-of-magnitude cost estimates (having an accuracy of +50 percent to -30 percent) must be developed for each remedial alternative. This paper presents a methodology for developing cost estimates of remedial alternatives comprised of various technology and process options with a wide range of estimated contaminated media quantities. In addition, the cost estimating methodology provides flexibility for incorporating revisions to remedial alternatives and achieves the desired range of accuracy. It is important to note that the cost estimating methodology presented here was developed as a concurrent path to the development of contaminated media quantity estimates. This methodology can be initiated before contaminated media quantities are estimated. As a result, this methodology is useful in developing cost estimates for use in screening and evaluating remedial technologies and process options. However, remedial alternative cost estimates cannot be prepared without the contaminated media quantity estimates. In the conduct of the feasibility study for Operable Unit 5 at the Fernald Environmental Management Project (FEMP), fourteen remedial alternatives were retained for detailed analysis. Each remedial alternative was composed of combinations of remedial technologies and processes which were earlier determined to be best suited for addressing the media-specific contaminants found at the FEMP site, and achieving desired remedial action objectives
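
    The unit-cost idea can be illustrated with a minimal, hypothetical roll-up in which each technology or process option contributes unit cost times contaminated media quantity, plus percentage allowances; the option names, unit costs, quantities and allowance fractions below are invented and are not the FEMP estimates.

```python
# Minimal sketch of a unit-cost roll-up for a remedial alternative: total cost is
# the sum over technology/process options of (unit cost) x (media quantity),
# scaled by allowances for engineering and contingency. All numbers hypothetical.

def alternative_cost(components, engineering_frac=0.15, contingency_frac=0.25):
    direct = sum(unit_cost * quantity for unit_cost, quantity in components.values())
    return direct * (1.0 + engineering_frac + contingency_frac)

components = {
    "excavation ($/m3, m3)":      (45.0, 120_000),
    "soil washing ($/m3, m3)":    (180.0, 40_000),
    "off-site disposal ($/t, t)": (350.0, 15_000),
}
print(f"${alternative_cost(components):,.0f}")
```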

  12. Fatigue-Prone Details in Steel Bridges

    Directory of Open Access Journals (Sweden)

    Mohsen Heshmati

    2012-11-01

    Full Text Available This paper reviews the results of a comprehensive investigation including more than 100 fatigue damage cases, reported for steel and composite bridges. The damage cases are categorized according to types of detail. The mechanisms behind fatigue damage in each category are identified and studied. It was found that more than 90% of all reported damage cases are of deformation-induced type and generated by some kind of unintentional or otherwise overlooked interaction between different load-carrying members or systems in the bridge. Poor detailing, with unstiffened gaps and abrupt changes in stiffness at the connections between different members were also found to contribute to fatigue cracking in many details.

  13. Contribution to a Theory of Detailed Design

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik

    1999-01-01

    It has been recognised that the literature does not actually propose a theory of detailed design. In this paper a theory contribution is proposed, linking part design to organ design and allowing a type of functional reasoning. The proposed theory satisfies our need for explaining the nature of a part structure, for supporting the synthesis of part structures, i.e. detailed design, and for digital modelling of part structures. The aim of this paper is to contribute to a design theory valid for detailed design. The proposal is based upon the theory's ability to explain the nature of machine parts and assemblies, to support the synthesis of parts and to allow the modelling, especially digital modelling, of a part structure. The contribution is based upon the Theory of Technical Systems, Hubka, and the Domain Theory, Andreasen. This paper is based on a paper presented at ICED 99, Mortensen, but focus...

  14. Detailed balance and reciprocity in solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Kirchartz, Thomas; Rau, Uwe [IEF5-Photovoltaik, Forschungszentrum Juelich, 52425 Juelich (Germany)

    2008-12-15

    The limiting efficiency of photovoltaic devices follows from the detailed balance of absorption and emission of a diode according to the Shockley-Queisser theory. However, the principle of detailed balance has more implications for the understanding of photovoltaic devices than only defining the efficiency limit. We show how reciprocity relations between carrier collection and dark carrier injection, between electroluminescence emission and photovoltaic quantum efficiency and between open circuit voltage and light emitting diode quantum efficiency all follow from the principle of detailed balance. We also discuss the validity range of the Shockley-Queisser limit and the reciprocity relations. Discussing the validity of the reciprocity relations helps to deepen the understanding of photovoltaic devices and allows us to identify interrelationships between the superposition principle, the diode ideality and the reciprocity relations. (copyright 2008 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)
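
    One of the reciprocity relations discussed here, between electroluminescent emission and the photovoltaic quantum efficiency, is commonly written as follows (the notation below is a generic rendering, not a quotation from the paper):

```latex
% Opto-electronic reciprocity between electroluminescent emission and the
% photovoltaic external quantum efficiency (generic notation):
\begin{equation}
  \phi_{\mathrm{em}}(E) \;=\; Q_{\mathrm{PV}}(E)\,\phi_{\mathrm{bb}}(E)
  \left[\exp\!\left(\frac{qV}{k_{\mathrm{B}}T}\right)-1\right],
\end{equation}
% where \phi_{em} is the emitted photon flux, Q_{PV} the photovoltaic external
% quantum efficiency, \phi_{bb} the black-body photon flux at cell temperature T,
% and V the internal junction voltage. A corresponding relation,
% V_{oc} = V_{oc}^{\mathrm{rad}} + (k_{\mathrm{B}}T/q)\,\ln Q_{\mathrm{LED}},
% links the open-circuit voltage to the LED quantum efficiency Q_{LED} <= 1.
```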

  15. Memory for details with self-referencing.

    Science.gov (United States)

    Serbun, Sarah J; Shih, Joanne Y; Gutchess, Angela H

    2011-11-01

    Self-referencing benefits item memory, but little is known about the ways in which referencing the self affects memory for details. Experiment 1 assessed whether the effects of self-referencing operate only at the item, or general, level or whether they also enhance memory for specific visual details of objects. Participants incidentally encoded objects by making judgements in reference to the self, a close other (one's mother), or a familiar other (Bill Clinton). Results indicate that referencing the self or a close other enhances both specific and general memory. Experiments 2 and 3 assessed verbal memory for source in a task that relied on distinguishing between different mental operations (internal sources). The results indicate that self-referencing disproportionately enhances source memory, relative to conditions referencing other people, semantic, or perceptual information. We conclude that self-referencing not only enhances specific memory for both visual and verbal information, but can also disproportionately improve memory for specific internal source details.

  16. Capturing Individual Uptake: Toward a Disruptive Research Methodology

    Science.gov (United States)

    Bastian, Heather

    2015-01-01

    This article presents and illustrates a qualitative research methodology for studies of uptake. It does so by articulating a theoretical framework for qualitative investigations of uptake and detailing a research study designed to invoke and capture students' uptakes in a first-year writing classroom. The research design sought to make uptake…

  17. Fire safety analysis: methodology

    International Nuclear Information System (INIS)

    Kazarians, M.

    1998-01-01

    From a review of the fires that have occurred in nuclear power plants and the results of fire risk studies completed over the last 17 years, we can conclude that internal fires in nuclear power plants can be an important contributor to plant risk. Methods and data are available to quantify the fire risk. These methods and data have been subjected to a series of reviews and detailed scrutiny and have been applied to a large number of plants. There is no doubt that we do not know everything about fire and its impact on a nuclear power plant. However, this lack of knowledge or uncertainty can be quantified and can be used in the decision-making process. In other words, the methods entail uncertainties and limitations that are not insurmountable, and there is little or no basis for the results of a fire risk analysis failing to support a decision process

  18. Local address and emergency contact details

    CERN Multimedia

    2013-01-01

    The HR Department would like to remind members of the personnel that they are responsible for ensuring that their personal data concerning local address and preferred emergency contact details remains valid and up-to-date.   Both are easily accessible via the links below: Local address: https://edh.cern.ch/Document/Personnel/LocalAddressChange   Emergency contacts: https://edh.cern.ch/Document/Personnel/EC   Please take a few minutes to check your details and modify if necessary. Thank you in advance. HR Department Head Office

  19. Severn Barrage project. Detailed report - V. 5

    Energy Technology Data Exchange (ETDEWEB)

    1989-01-01

    Prior to the present programme of work, the effects which a tidal power barrage would have on the region, during both construction and operation, had not been studied in detail. This volume of the Detailed Report therefore represents a significant extension of work into these aspects of the Severn Barrage Project. In the Regional Study, a number of benefits have been identified, some of which may represent net benefits nationally. The economic assessment of both regional and national benefits and costs is presented. The second part of this volume reports on the work done on the Legal Background for the Project. (author).

  20. Methodological Problems of Nanotechnoscience

    Science.gov (United States)

    Gorokhov, V. G.

    Recently, we have reported on the definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline; it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, has the function of a methodological orientation for the choice of theoretical means and methods toward a solution of scientific and engineering problems. This allows one to change from one explanation and scientific world view to another without any problems. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to Systems Engineering, understood as the analysis and design of large-scale, complex man/machine systems, but applied to micro- and nanosystems. Nano systems engineering, like macro systems engineering, includes not only systems design but also complex research. The design orientation influences the change of priorities in complex research and the relation to knowledge: not only “knowledge about something”, but also knowledge as a means of activity. From the beginning, the control and restructuring of matter at the nanoscale is a necessary element of nanoscience.

  1. Engineering radioecology: Methodological considerations

    International Nuclear Information System (INIS)

    Nechaev, A.F.; Projaev, V.V.; Sobolev, I.A.; Dmitriev, S.A.

    1995-01-01

    The term ''radioecology'' has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise, generally acknowledged structure, a unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in the biospheric effects of ionizing radiation and by some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to translate the problem of environmental restoration from the scientific sphere into particularly practical terms. Already the first steps clearly showed the imperfection of existing technologies, managerial and regulatory schemes; the lack of qualified specialists, relevant methods and techniques; uncertainties in the methodology of decision-making, etc. Thus, building up (or perhaps structuring) a special scientific and technological basis, which the authors call ''engineering radioecology'', seems to be an important task. In this paper they endeavour to substantiate this thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology.

  2. Structural concepts and details for seismic design

    International Nuclear Information System (INIS)

    Johnson, M.W.; Smietana, E.A.; Murray, R.C.

    1991-01-01

    As a part of the DOE Natural Phenomena Hazards Program, a new manual has been developed, entitled UCRL-CR-106554, "Structural Concepts and Details for Seismic Design." This manual describes and illustrates good practice for seismic-resistant design.

  3. 16 CFR 1750.5 - Detailed requirements.

    Science.gov (United States)

    2010-01-01

    16 CFR 1750.5 — Detailed requirements. Title 16, Commercial Practices; Consumer Product Safety Commission, Refrigerator Safety Act Regulations, Standard for... The excerpted requirement concerns a force directed perpendicularly to the plane of the door and applied anywhere along the latch edge of the inside...

  4. New details emerge from the Einstein files

    CERN Multimedia

    Overbye, D

    2002-01-01

    For many years the FBI spied on Einstein. New details of this surveilance are emerging in "The Einstein File: J. Edgar Hoover's Secret War Against the World's Most Famous Scientist," by Fred Jerome, who sued the government with the help of the Public Citizen Litigation Group to obtain a less censored version of the file (1 page).

  5. Detailed numerical simulations of laser cooling processes

    Science.gov (United States)

    Ramirez-Serrano, J.; Kohel, J.; Thompson, R.; Yu, N.

    2001-01-01

    We developed a detailed semiclassical numerical code of the forces applied on atoms in optical and magnetic fields to increase the understanding of the different roles that light, atomic collisions, background pressure, and number of particles play in experiments with laser cooled and trapped atoms.
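
    As a hedged illustration of the kind of semiclassical force such a code evaluates, the sketch below implements the standard two-level scattering force in a one-dimensional optical molasses; the transition wavelength, linewidth and saturation parameter are generic illustrative values, not those of the authors' simulations.

```python
import numpy as np

# Semiclassical radiation-pressure force on a two-level atom in 1D optical
# molasses (standard textbook expression, not the authors' full code).
hbar = 1.054571817e-34

def scattering_force(v, k, gamma, s0, detuning):
    """Net force from two counter-propagating beams, each with on-resonance
    saturation parameter s0 and detuning `detuning`, including the Doppler
    shift -k*v / +k*v seen by an atom moving with velocity v."""
    def one_beam(delta_eff):
        return hbar * k * (gamma / 2.0) * s0 / (1.0 + s0 + (2.0 * delta_eff / gamma) ** 2)
    return one_beam(detuning - k * v) - one_beam(detuning + k * v)

k = 2 * np.pi / 780e-9          # wavenumber for a 780 nm transition (illustrative)
gamma = 2 * np.pi * 6e6         # natural linewidth in rad/s (illustrative)
force = scattering_force(v=1.0, k=k, gamma=gamma, s0=1.0, detuning=-gamma / 2)
print(force)                    # negative: restoring force for a red-detuned beam pair
```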

  6. Dosimetric methodology of the ICRP

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    1994-01-01

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples
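
    The tissue-weighting step of the ICRP scheme can be sketched as a weighted sum of equivalent doses, E = sum over tissues of w_T * H_T; the snippet below uses only a small illustrative subset of tissue weighting factors and invented equivalent doses, so it is a schematic of the formula rather than a complete ICRP calculation.

```python
# Schematic of the tissue-weighting step: effective dose E = sum_T w_T * H_T.
# The weighting factors shown are a small illustrative subset (not a complete
# ICRP set); the equivalent doses are invented numbers in mSv.

tissue_weighting = {"lung": 0.12, "stomach": 0.12, "liver": 0.04, "thyroid": 0.04}
equivalent_dose_mSv = {"lung": 2.0, "stomach": 0.5, "liver": 0.3, "thyroid": 5.0}

effective_dose = sum(tissue_weighting[t] * equivalent_dose_mSv[t] for t in tissue_weighting)
print(f"partial effective dose: {effective_dose:.3f} mSv")
```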

  7. Analytical methodology for nuclear safeguards

    International Nuclear Information System (INIS)

    Ramakumar, K.L.

    2011-01-01

    This paper attempts to briefly describe the analytical methodologies available and also highlight some of the challenges, expectations from nuclear material accounting and control (NUMAC) point of view

  8. Microbiological Methodology in Astrobiology

    Science.gov (United States)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered by future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. The Antarctic glacier and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of microorganisms (if any) in studied samples and to standardize the protocols used, in order to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind the scenario that living microorganisms have not been preserved and have undergone mineralization. Under laboratory conditions, the processes that accompany the fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble, in their morphological properties, those found in natural Earth habitats. Given the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  9. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    Science.gov (United States)

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated, is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…

  10. Generalized detailed balance theory of solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Kirchartz, Thomas

    2009-12-12

    The principle of detailed balance is the requirement that every microscopic process in a system must be in equilibrium with its inverse process, when the system itself is in thermodynamic equilibrium. This detailed balance principle has been of special importance for photovoltaics, since it allows the calculation of the limiting efficiency of a given solar cell by defining the only fundamental loss process as the radiative recombination of electron/hole pairs followed by the emission of a photon. In equilibrium, i.e. in the dark and without applied voltage, the absorbed and emitted photon flux must be equal due to detailed balance. This equality determines the radiative recombination from absorption and vice versa. While the classical theory of photovoltaic efficiency limits by Shockley and Queisser considers only one detailed balance pair, namely photogeneration and radiative recombination, the present work extends the detailed balance principle to any given process in the solar cell. Applying the detailed balance principle to the whole device leads to two major results, namely (i) a model that is compatible with the Shockley-Queisser efficiency limit for efficient particle transport, while still being able to describe non-ideal and non-linear solar cells, and (ii) an analytical relation between electroluminescent emission and photovoltaic action of a diode that is applied to a variety of different solar cells. This thesis presents several variations of a detailed balance model that are applicable to different types of solar cells. Any typical inorganic solar cell is a mainly bipolar device, meaning that the current is carried by electrons and holes. The detailed balance model for pn-type and pin-type bipolar solar cells is therefore the most basic incorporation of a detailed balance model. The only addition compared to the classical diode theory or compared to standard one-dimensional device simulators is the incorporation of photon recycling, making the model
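
    A rough numerical rendering of the classical detailed-balance (Shockley-Queisser) calculation for a step-function absorber is sketched below; it approximates the sun as a diluted 5778 K black body instead of the tabulated AM1.5 spectrum, so the numbers are indicative only.

```python
import numpy as np

# Detailed-balance (Shockley-Queisser) limit for a step-function absorber:
# absorptance 1 above the gap, 0 below; radiative recombination only.
q = 1.602176634e-19   # C
k = 1.380649e-23      # J/K
h = 6.62607015e-34    # J s
c = 2.99792458e8      # m/s

def bb_photon_flux_above(E_gap_eV, T, dilution):
    """Hemispherical black-body photon flux (m^-2 s^-1) above the band gap."""
    E = np.linspace(E_gap_eV, 10.0, 20000) * q
    dE = E[1] - E[0]
    spectral = (2.0 * np.pi / (h**3 * c**2)) * E**2 / (np.exp(E / (k * T)) - 1.0)
    return dilution * np.sum(spectral) * dE

def sq_efficiency(E_gap_eV, T_sun=5778.0, T_cell=300.0):
    f_sun = 2.165e-5                       # solid-angle dilution of the solar disc
    sigma = 5.670374419e-8                 # Stefan-Boltzmann constant, W m^-2 K^-4
    J_sc = q * bb_photon_flux_above(E_gap_eV, T_sun, f_sun)   # A/m^2
    J_0 = q * bb_photon_flux_above(E_gap_eV, T_cell, 1.0)     # A/m^2
    V = np.linspace(0.0, E_gap_eV, 2000)                      # volts
    P_out = V * (J_sc - J_0 * (np.exp(q * V / (k * T_cell)) - 1.0))
    return P_out.max() / (f_sun * sigma * T_sun**4)

print(f"{sq_efficiency(1.34):.1%}")        # on the order of 30% near the optimal gap
```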

  11. Systems selection methodology for civil nuclear power applications

    International Nuclear Information System (INIS)

    Scarborough, J.

    1988-01-01

    A methodology for evaluation and selection of a preferred Advanced Small or Medium Power Reactor (SMPR) for commercial electric power generation is discussed, and an illustrative example is presented with five US Advanced SMPR power plants. The evaluation procedure was developed from a methodology for ranking small, advanced nuclear power plant designs under development by the US Department of Energy (DOE) and Department of Defense (DOD). The methodology involves establishing numerical probability distributions for each of fifteen evaluation criteria for each Advanced SMPR plant. A resultant single probability distribution with its associated numerical mean value is then developed for each Advanced SMPR plant by Monte Carlo sampling techniques in order that each plant may be ranked with an associated statement of certainty. The selection methodology is intended as a screening procedure for commercial offerings to preclude detailed technical and commercial assessments from being conducted for those offerings which do not meet the initial screening criteria

  12. Systems selection methodology for civil nuclear power applications

    International Nuclear Information System (INIS)

    Scarborough, J.C.

    1987-01-01

    A methodology for evaluation and selection of a preferred Advanced Small or Medium Power Reactor (SMPR) for commercial electric power generation is discussed, and an illustrative example is presented with five U.S. Advanced SMPR power plants. The evaluation procedure was developed from a methodology for ranking small, advanced nuclear power plant designs under development by the U.S. Department of Energy (DOE) and Department of Defense (DOD). The methodology involves establishing numerical probability distributions for each of fifteen evaluation criteria for each Advanced SMPR plant. A resultant single probability distribution with its associated numerical mean value is then developed for each Advanced SMPR plant by Monte Carlo sampling techniques in order that each plant may be ranked with an associated statement of certainty. The selection methodology is intended as a screening procedure for commercial offerings to preclude detailed technical and commercial assessments from being conducted for those offerings which do not meet the initial screening criteria. (author)

  13. Spent fuel management fee methodology and computer code user's manual

    International Nuclear Information System (INIS)

    Engel, R.L.; White, M.K.

    1982-01-01

    The methodology and computer model described here were developed to analyze the cash flows for the federal government taking title to and managing spent nuclear fuel. The methodology has been used by the US Department of Energy (DOE) to estimate the spent fuel disposal fee that will provide full cost recovery. Although the methodology was designed to analyze interim storage followed by spent fuel disposal, it could be used to calculate a fee for reprocessing spent fuel and disposing of the waste. The methodology consists of two phases. The first phase estimates government expenditures for spent fuel management. The second phase determines the fees that will result in revenues such that the government attains full cost recovery assuming various revenue collection philosophies. These two phases are discussed in detail in subsequent sections of this report. Each of the two phases constitute a computer module, called SPADE (SPent fuel Analysis and Disposal Economics) and FEAN (FEe ANalysis), respectively
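
    The full-cost-recovery idea behind the fee can be illustrated with a minimal present-value balance: choose a constant fee per kilogram such that discounted fee revenues equal discounted expenditures. The cash flows, discount rate and fee basis below are invented and are far simpler than the SPADE/FEAN analysis.

```python
# Minimal sketch of full cost recovery: find the constant fee per kg of spent
# fuel for which the present value of revenues equals the present value of
# projected expenditures. All numbers are hypothetical.

def full_cost_recovery_fee(expenditures, fuel_receipts_kg, discount_rate):
    """expenditures[t] in $, fuel_receipts_kg[t] in kg, for years t = 0, 1, 2, ..."""
    pv = lambda series: sum(x / (1.0 + discount_rate) ** t for t, x in enumerate(series))
    return pv(expenditures) / pv(fuel_receipts_kg)

costs    = [0.8e9, 1.2e9, 1.5e9, 2.0e9, 2.0e9]   # $ per year (hypothetical)
receipts = [2.0e6, 2.2e6, 2.4e6, 2.6e6, 2.8e6]   # kg per year (hypothetical)
print(f"${full_cost_recovery_fee(costs, receipts, 0.03):,.0f} per kg")
```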

  14. Bolivia-Brazil gas line route detailed

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that the state oil companies of Brazil and Bolivia have signed an agreement outlining the route for a 2,270 km pipeline system to deliver natural gas from Bolivian fields to Southeast Brazil. The two sides currently are negotiating details about construction costs as well as contract volumes and prices. Capacity is projected at 283-565 MMcfd. No official details are available, but Roberto Y. Hukai, a director of the Sao Paulo engineering company Jaako Poyry/Technoplan, estimates the transportation cost of the Bolivian gas at 90 cents/MMBTU. That would be competitive with the price of gas delivered to the Sao Paulo gas utility Comgas, he said. Brazil's Petroleos Brasileiro SA estimates construction of the pipeline on the Brazilian side alone will cost $1.2-1.4 billion. Bolivia's Yacimientos Petroliferos Fiscales Bolivianos (YPFB) is negotiating with private domestic and foreign investors for construction of the Bolivian portion of the project

  15. Detailed Electrochemical Characterisation of Large SOFC Stacks

    DEFF Research Database (Denmark)

    Mosbæk, Rasmus Rode; Hjelm, Johan; Barfod, R.

    2012-01-01

    application of advanced methods for detailed electrochemical characterisation during operation. An operating stack is subject to steep compositional gradients in the gaseous reactant streams, and significant temperature gradients across each cell and across the stack, which makes it a complex system...... Fuel Cell A/S was characterised in detail using electrochemical impedance spectroscopy. An investigation of the optimal geometrical placement of the current probes and voltage probes was carried out in order to minimise measurement errors caused by stray impedances. Unwanted stray impedances...... are particularly problematic at high frequencies. Stray impedances may be caused by mutual inductance and stray capacitance in the geometrical set-up and do not describe the fuel cell. Three different stack geometries were investigated by electrochemical impedance spectroscopy. Impedance measurements were carried...
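
    The stray-impedance issue mentioned above can be illustrated with a minimal equivalent-circuit sketch (not from the paper): a series resistance plus a parallel RC arc representing the cell, compared with the same model plus a small series inductance representing cabling and stack geometry; all component values are hypothetical.

```python
import numpy as np

# Illustration of high-frequency distortion by stray inductance in impedance data.
def z_cell(freq_hz, r_serial=0.05, r_pol=0.20, c_dl=0.5):
    """Series resistance plus one parallel RC arc (ohms), a toy cell model."""
    omega = 2 * np.pi * freq_hz
    return r_serial + r_pol / (1.0 + 1j * omega * r_pol * c_dl)

def z_measured(freq_hz, l_stray=50e-9, **kwargs):
    """Same model with a small series inductance from cabling/stack geometry."""
    omega = 2 * np.pi * freq_hz
    return z_cell(freq_hz, **kwargs) + 1j * omega * l_stray

for f in np.logspace(0, 5, 7):   # 1 Hz .. 100 kHz
    print(f"{f:9.1f} Hz  cell {z_cell(f):.4f}  measured {z_measured(f):.4f}")
```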

  16. Detailed sectional anatomy of the spine

    International Nuclear Information System (INIS)

    Rauschning, W.

    1985-01-01

    Morphologic studies on the human spine constitute a special challenge because of the spine's complex topographic anatomy and the intimate relationship between the supporting skeleton and the contiguous soft tissues (muscles, discs, joint capsules) as well as the neurovascular contents of the spinal canal and intervertebral foramina. The improving resolution and multiplanar image reformatting capabilities of modern CT scanners call for accurate anatomic reference material. Such anatomic images should be available without distortion, in natural colors, and in considerable detail. The images should present the anatomy in the correct axial, sagittal, and coronal planes and should also be sufficiently closely spaced so as to follow the thin cuts of modern CT scanners. This chapter details one of several recent attempts to correlate gross anatomy with the images depicted by high-resolution CT. The methods of specimen preparation, sectioning, and photographing have been documented elsewhere

  17. A methodology for social experimentation

    DEFF Research Database (Denmark)

    Ravn, Ib

    A methodology is outlined whereby one may improve the performance of a social system to the satisfaction of its stakeholders, that is, facilitate desirable social and organizational transformations.

  18. Workshops as a Research Methodology

    Science.gov (United States)

    Ørngreen, Rikke; Levinsen, Karin

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on…

  19. Methodological Pluralism and Narrative Inquiry

    Science.gov (United States)

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  20. Building ASIPS the Mescal methodology

    CERN Document Server

    Gries, Matthias

    2006-01-01

    A number of system designers use ASIPs rather than ASICs to implement their system solutions. This book gives a comprehensive methodology for the design of these application-specific instruction processors (ASIPs). It includes demonstrations of applications of the methodologies using the Tipi research framework.

  1. A methodology for software documentation

    OpenAIRE

    Torres Júnior, Roberto Dias; Ahlert, Hubert

    2000-01-01

    With the growing complexity of window-based software and the use of object-oriented techniques, the development of software is getting more complex than ever. Based on that, this article intends to present a methodology for software documentation and to analyze our experience and how this methodology can aid software maintenance

  2. Detailed observations of NGC 4151 with IUE

    International Nuclear Information System (INIS)

    Bromage, G.E.; Boksenberg, A.; Clavel, J.

    1984-12-01

    A detailed analysis is presented of the ultraviolet (λλ 1150-3200 Å) absorption spectrum of the NGC 4151 Seyfert nucleus. The IUE data base consisted of high dispersion (Δλ ≈ 0.2 Å) spectra at 5 epochs, and 137 low dispersion (Δλ ≈ 4-8 Å) spectra at 31 epochs from 1978 February to 1980 May, together with further low dispersion data in 1980-81 with NGC 4151 in a very faint quiescent state. (author)

  3. Detailed Sensory Memory, Sloppy Working Memory

    OpenAIRE

    Sligte, Ilja G.; Vandenbroucke, Annelinde R. E.; Scholte, H. Steven; Lamme, Victor A. F.

    2010-01-01

    Visual short-term memory (VSTM) enables us to actively maintain information in mind for a brief period of time after stimulus disappearance. According to recent studies, VSTM consists of three stages - iconic memory, fragile VSTM, and visual working memory - with increasingly stricter capacity limits and progressively longer lifetimes. Still, the resolution (or amount of visual detail) of each VSTM stage has remained unexplored and we test this in the present study. We presented people with a...

  4. Reserving by detailed conditioning on individual claim

    Science.gov (United States)

    Kartikasari, Mujiati Dwi; Effendie, Adhitya Ronnie; Wilandari, Yuciana

    2017-03-01

    The estimation of claim reserves is an important activity in insurance companies to fulfill their liabilities. Recently, reserving methods for individual claims have attracted a lot of interest in actuarial science, as they overcome some deficiencies of aggregated claim methods. This paper explores the Reserving by Detailed Conditioning (RDC) method, which uses all available claim information, for reserving with individual claims of liability insurance from an Indonesian general insurance company. Furthermore, we compare it to the Chain Ladder and Bornhuetter-Ferguson methods.
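
    For orientation, the following is a minimal sketch of the Chain Ladder benchmark mentioned in the comparison, applied to a small hypothetical cumulative run-off triangle (rows are accident years, columns are development years). It is not an implementation of the RDC method, and the triangle values are invented for illustration.

        # Minimal Chain Ladder sketch on an invented cumulative run-off triangle.
        import numpy as np

        tri = np.array([
            [1000.0, 1500.0, 1700.0],
            [1100.0, 1650.0, np.nan],
            [1200.0, np.nan, np.nan],
        ])

        n_dev = tri.shape[1]
        factors = []
        for j in range(n_dev - 1):
            # Volume-weighted development factor from column j to j + 1.
            mask = ~np.isnan(tri[:, j + 1])
            factors.append(tri[mask, j + 1].sum() / tri[mask, j].sum())

        reserves = []
        for i in range(tri.shape[0]):
            # Project each accident year to ultimate; reserve = ultimate - latest observed.
            latest = int(np.flatnonzero(~np.isnan(tri[i])).max())
            ultimate = tri[i, latest]
            for j in range(latest, n_dev - 1):
                ultimate *= factors[j]
            reserves.append(ultimate - tri[i, latest])

        print("Development factors:", [round(f, 3) for f in factors])
        print("Reserves by accident year:", [round(r, 1) for r in reserves])
        print("Total reserve:", round(sum(reserves), 1))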

  5. Detailed design of product oriented manufacturing systems

    OpenAIRE

    Silva, Sílvio Carmo; Alves, Anabela Carvalho

    2006-01-01

    This paper presents a procedure for the detailed design and redesign of manufacturing systems within a framework of constantly fitting the production system configuration to the varying production needs of products. Such an approach achieves the design of Product Oriented Manufacturing Systems – POMS. This approach is in opposition to fitting, beforehand, a production system to all products within a company, in which case it is usual to adopt a Function Oriented Manufactur...

  6. An investigation of constructions of justice and injustice in chronic pain: a Q-methodology approach.

    Science.gov (United States)

    McParland, Joanna; Hezseltine, Louisa; Serpell, Michael; Eccleston, Christopher; Stenner, Paul

    2011-09-01

    This study used Q-methodology to explore justice-related accounts of chronic pain. Eighty participants completed the Q-sorting procedure (33 chronic pain sufferers and 47 non-pain sufferers). Analysis revealed five main factors. Three factors assign blame: to society for poor medical and interpersonal treatment; to the chronic pain sufferer for indulging in self-pity; and to unempathic healthcare workers for ignoring patients. A fourth factor acknowledges the unfairness of pain and encourages self-reliance. The fifth factor rejects injustice in the chronic pain discourse. Overall, there is a shared view that chronic pain brings unfair treatment, disrespect and a de-legitimization of pain. Future research ideas are suggested.

  7. Revisiting the Seductive Details Effect in Multimedia Learning: Context-Dependency of Seductive Details

    Science.gov (United States)

    Ozdemir, Devrim; Doolittle, Peter

    2015-01-01

    The purpose of this study was to investigate the effects of context-dependency of seductive details on recall and transfer in multimedia learning environments. Seductive details were interesting yet irrelevant sentences in the instructional text. Two experiments were conducted. The purpose of Experiment 1 was to identify context-dependent and…

  8. AEGIS methodology and a perspective from AEGIS methodology demonstrations

    International Nuclear Information System (INIS)

    Dove, F.H.

    1981-03-01

    Objectives of AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) are to develop the capabilities needed to assess the post-closure safety of waste isolation in geologic formations; demonstrate these capabilities on reference sites; apply the assessment methodology to assist the NWTS program in site selection, waste package and repository design; and perform repository site analyses for the licensing needs of NWTS. This paper summarizes the AEGIS methodology, the experience gained from methodology demonstrations, and provides an overview of the following areas: estimation of the response of a repository to perturbing geologic and hydrologic events; estimation of the transport of radionuclides from a repository to man; and assessment of uncertainties

  9. Scrum Methodology in Higher Education: Innovation in Teaching, Learning and Assessment

    Science.gov (United States)

    Jurado-Navas, Antonio; Munoz-Luna, Rosa

    2017-01-01

    The present paper aims to detail the experience developed in a classroom of English Studies at the Spanish University of Málaga, where an alternative project-based learning methodology has been implemented. Such methodology is inspired by the scrum sessions widespread in technology companies, where staff members work in teams and are assigned…

  10. A Preliminary Methodology, and a Cautionary Tale, for Determining How Students Seek Research Help Online

    Science.gov (United States)

    Pellegrino, Catherine

    2014-01-01

    This article reports on a pilot study to examine undergraduate students' help-seeking behavior when undertaking library research in online courses. A novel methodology incorporating elements of ethnographic research resulted in a small, but rich and detailed, collection of qualitative data. The data suggest that the methodology has promise for…

  11. Draft report: a selection methodology for LWR safety R and D programs and proposals

    Energy Technology Data Exchange (ETDEWEB)

    Husseiny, A. A.; Ritzman, R. L.

    1980-03-01

    The results of work done to develop a methodology for selecting LWR safety R and D programs and proposals is described. A critical survey of relevant decision analysis methods is provided including the specifics of multiattribute utility theory. This latter method forms the basis of the developed selection methodology. Details of the methodology and its use are provided along with a sample illustration of its application.

  12. Draft report: a selection methodology for LWR safety R and D programs and proposals

    International Nuclear Information System (INIS)

    Husseiny, A.A.; Ritzman, R.L.

    1980-03-01

    The results of work done to develop a methodology for selecting LWR safety R and D programs and proposals is described. A critical survey of relevant decision analysis methods is provided including the specifics of multiattribute utility theory. This latter method forms the basis of the developed selection methodology. Details of the methodology and its use are provided along with a sample illustration of its application

  13. Boosting flood warning schemes with fast emulator of detailed hydrodynamic models

    Science.gov (United States)

    Bellos, V.; Carbajal, J. P.; Leitao, J. P.

    2017-12-01

    Floods are among the most destructive catastrophic events and their frequency has increased over the last decades. To reduce flood impacts and risks, flood warning schemes are installed in flood-prone areas. Frequently, these schemes are based on numerical models which quickly provide predictions of water levels and other relevant observables. However, the high complexity of flood wave propagation in the real world and the need for accurate predictions in urban environments or in floodplains hinder the use of detailed simulators. This sets up the difficulty: we need fast predictions that meet the accuracy requirements. Most physics-based detailed simulators, although accurate, will not fulfill the speed demand even if High Performance Computing techniques are used (the required simulation time is on the order of minutes to hours). As a consequence, most flood warning schemes are based on coarse ad-hoc approximations that cannot take advantage of a detailed hydrodynamic simulation. In this work, we present a methodology for developing a flood warning scheme using a Gaussian Process based emulator of a detailed hydrodynamic model. The methodology consists of two main stages: 1) an offline stage to build the emulator; 2) an online stage using the emulator to predict and generate warnings. The offline stage consists of the following steps: a) definition of the critical sites of the area under study, and specification of the observables to predict at those sites, e.g. water depth, flow velocity, etc.; b) generation of a detailed simulation dataset to train the emulator; c) calibration of the required parameters (if measurements are available). The online stage is carried out using the emulator to predict the relevant observables quickly, while the detailed simulator is used in parallel to verify key predictions of the emulator. The speed gain given by the emulator also allows uncertainty in predictions to be quantified using ensemble methods. The above methodology is applied in real
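
    The following is a minimal sketch of the emulation idea under simplifying assumptions: a Gaussian Process is trained on a handful of detailed-simulator runs and then used for fast predictions with an uncertainty estimate. The stand-in simulator function, its single input variable and all numbers are hypothetical and do not come from the study.

        # Gaussian Process emulator sketch; the "simulator" is a cheap stand-in for
        # the detailed hydrodynamic model used in the paper's workflow.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def detailed_simulator(rainfall_intensity):
            # Placeholder for an expensive hydrodynamic run: peak water depth [m].
            return 0.05 * rainfall_intensity ** 1.3

        # Offline stage: a small training set of simulator runs.
        X_train = np.linspace(1.0, 50.0, 8).reshape(-1, 1)   # rainfall intensity [mm/h]
        y_train = detailed_simulator(X_train).ravel()

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0), normalize_y=True)
        gp.fit(X_train, y_train)

        # Online stage: near-instant predictions with uncertainty for warning thresholds.
        X_new = np.array([[12.0], [35.0]])
        depth, depth_std = gp.predict(X_new, return_std=True)
        for x, d, s in zip(X_new.ravel(), depth, depth_std):
            print(f"rainfall {x:5.1f} mm/h -> predicted peak depth {d:.2f} m +/- {s:.2f} m")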

  14. Detailed modeling of mountain wave PSCs

    Directory of Open Access Journals (Sweden)

    S. Fueglistaler

    2003-01-01

    Full Text Available Polar stratospheric clouds (PSCs) play a key role in polar ozone depletion. In the Arctic, PSCs can occur on the mesoscale due to orographically induced gravity waves. Here we present a detailed study of a mountain wave PSC event on 25-27 January 2000 over Scandinavia. The mountain wave PSCs were intensively observed by in-situ and remote-sensing techniques during the second phase of the SOLVE/THESEO-2000 Arctic campaign. We use these excellent data of PSC observations on 3 successive days to analyze the PSCs and to perform a detailed comparison with modeled clouds. We simulated the 3-dimensional PSC structure on all 3 days with a mesoscale numerical weather prediction (NWP) model and a microphysical box model (using the best available nucleation rates for ice and nitric acid trihydrate particles). We show that the combined mesoscale/microphysical model is capable of reproducing the PSC measurements within the uncertainty of data interpretation with respect to spatial dimensions, temporal development and microphysical properties, without manipulating temperatures or using other tuning parameters. In contrast, microphysical modeling based upon coarser-scale global NWP data, e.g. current ECMWF analysis data, cannot reproduce the observations, in particular the occurrence of ice and nitric acid trihydrate clouds. Combined mesoscale/microphysical modeling may be used for detailed a posteriori PSC analysis and for future Arctic campaign flight and mission planning. The fact that remote sensing alone cannot further constrain model results, due to uncertainties in the interpretation of measurements, underlines the need for synchronous in-situ PSC observations in campaigns.

  15. IPUMS: Detailed global data on population characteristics

    Science.gov (United States)

    Kugler, T.

    2017-12-01

    Many new and exciting sources of data on human population distributions based on remote sensing, mobile technology, and other mechanisms are becoming available. These new data sources often provide fine-scale spatial and/or temporal resolution. However, they typically focus on the location of population, with little or no information on population characteristics. The large and growing collection of data available through the IPUMS family of products complements datasets that provide spatial and temporal detail but little attribute detail, by providing the full depth of characteristics covered by population censuses, including demographic, household structure, economic, employment, education, and housing characteristics. IPUMS International provides census microdata for 85 countries. Microdata provide the responses to every census question for each individual in a sample of households. Microdata identify the sub-national geographic unit in which a household is located, but for confidentiality reasons, identified units must include a minimum population, typically 20,000 people. Small-area aggregate data often describe much smaller geographic units, enabling study of detailed spatial patterns of population characteristics. However, the structure of aggregate data tables is highly heterogeneous across countries, census years, and even topics within a given census, making these data difficult to work with in any systematic way. A recently funded project will assemble small-area aggregate population and agricultural census data published by national statistical offices. Through preliminary work collecting and cataloging over 10,000 tables, we have identified a small number of structural families that can be used to organize the many different structures. These structural families will form the basis for software tools to document and standardize the tables for ingest into a common database. Both the microdata and aggregate data are made available through IPUMS Terra

  16. Human Factors Considerations in New Nuclear Power Plants: Detailed Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    OHara,J.; Higgins, J.; Brown, W.; Fink, R.

    2008-02-14

    This Nuclear Regulatory Commission (NRC) sponsored study has identified human-performance issues in new and advanced nuclear power plants. To identify the issues, current industry developments and trends were evaluated in the areas of reactor technology, instrumentation and control technology, human-system integration technology, and human factors engineering (HFE) methods and tools. The issues were organized into seven high-level HFE topic areas: Role of Personnel and Automation, Staffing and Training, Normal Operations Management, Disturbance and Emergency Management, Maintenance and Change Management, Plant Design and Construction, and HFE Methods and Tools. The issues were then prioritized into four categories using a 'Phenomena Identification and Ranking Table' methodology based on evaluations provided by 14 independent subject matter experts. The subject matter experts were knowledgeable in a variety of disciplines. Vendors, utilities, research organizations and regulators all participated. Twenty issues were categorized into the top priority category. This Brookhaven National Laboratory (BNL) technical report provides the detailed methodology, issue analysis, and results. A summary of the results of this study can be found in NUREG/CR-6947. The research performed for this project has identified a large number of human-performance issues for new control stations and new nuclear power plant designs. The information gathered in this project can serve as input to the development of a long-term strategy and plan for addressing human performance in these areas through regulatory research. Addressing human-performance issues will provide the technical basis from which regulatory review guidance can be developed to meet these challenges. The availability of this review guidance will help set clear expectations for how the NRC staff will evaluate new designs, reduce regulatory uncertainty, and provide a well-defined path to new nuclear power plant

  17. Academic detailing to teach aging and geriatrics.

    Science.gov (United States)

    Duckett, Ashley; Cuoco, Theresa; Pride, Pamela; Wiley, Kathy; Iverson, Patty J; Marsden, Justin; Moran, William; Caton, Cathryn

    2015-01-01

    Geriatric education is a required component of internal medicine training. Work hour rules and hectic schedules have challenged residency training programs to develop and utilize innovative teaching methods. In this study, the authors examined the use of academic detailing as a teaching intervention in their residents' clinic and on the general medicine inpatient wards to improve clinical knowledge and skills in geriatric care. The authors found that this teaching method enables efficient, directed education without disrupting patient care. We were able to show improvements in medical knowledge as well as self-efficacy across multiple geriatric topics.

  18. A detailed phylogeny for the Methanomicrobiales

    Science.gov (United States)

    Rouviere, P.; Mandelco, L.; Winker, S.; Woese, C. R.

    1992-01-01

    The small subunit rRNA sequence of twenty archaea, members of the Methanomicrobiales, permits a detailed phylogenetic tree to be inferred for the group. The tree confirms earlier studies, based on far fewer sequences, in showing the group to be divided into two major clusters, temporarily designated the "methanosarcina" group and the "methanogenium" group. The tree also defines phylogenetic relationships within these two groups, which in some cases do not agree with the phylogenetic relationships implied by current taxonomic names--a problem most acute for the genus Methanogenium and its relatives. The present phylogenetic characterization provides the basis for a consistent taxonomic restructuring of this major methanogenic taxon.

  19. The Soleil detailed pre-project report

    International Nuclear Information System (INIS)

    1999-01-01

    The aim of the joint CNRS/CEA Soleil project was to develop a facility equipped with several synchrotron radiation sources and their associated experimental devices in order to answer the estimated research needs in this domain for the 20 to 30 forthcoming years. This document is the detailed pre-project. It describes the studies carried out and relative to the infrastructures and buildings, to the accelerators and light sources (storage ring, injector, radiation production), to the program of experiments, to the computer science aspects, and to the administrative and organisational aspects. (J.S.)

  20. Moving the gender agenda or stirring chicken’s entrails?: where next for feminist methodologies in accounting?

    OpenAIRE

    Haynes, Kathryn

    2007-01-01

    Purpose – The paper critiques recent research on gender and accounting to explore how feminist methodology can move on and radicalise the gender agenda in the accounting context. Design/methodology/approach – After examining current research on gender and accounting, the paper explores the nature of feminist methodology and its relation to epistemology. It explores three inter-related tenets of feminist methodology in detail: Power and Politics, Subjectivity and Reflexivity. Findings – The pa...

  1. Data on the descriptive overview and the quality assessment details of 12 qualitative research papers.

    Science.gov (United States)

    Barnabishvili, Maia; Ulrichs, Timo; Waldherr, Ruth

    2016-09-01

    This data article presents the supplementary material for the review paper "Role of acceptability barriers in delayed diagnosis of Tuberculosis: Literature review from high burden countries" (Barnabishvili et al., in press) [1]. A general overview of the 12 qualitative papers, including details about authors, years of publication, data source locations, study objectives, overview of methods, study population characteristics, as well as details of the interventions and the outcome parameters of the papers, is summarized in the first two tables included in the article. The quality assessment process of the methodological strength of the 12 papers and the results of the critical appraisal are further described and summarized in the second part of the article.

  2. ANL calculational methodologies for determining spent nuclear fuel source term

    International Nuclear Information System (INIS)

    McKnight, R. D.

    2000-01-01

    Over the last decade Argonne National Laboratory has developed reactor depletion methods and models to determine radionuclide inventories of irradiated EBR-II fuels. Predicted masses based on these calculational methodologies have been validated using available data from destructive measurements--first from measurements of lead EBR-II experimental test assemblies and later using data obtained from processing irradiated EBR-II fuel assemblies in the Fuel Conditioning Facility. Details of these generic methodologies are described herein. Validation results demonstrate these methods meet the FCF operations and material control and accountancy requirements

  3. Evaluation of safeguards procedures: a summary of a methodology

    International Nuclear Information System (INIS)

    Salisbury, J.D.; Savage, J.W.

    1979-01-01

    A methodology for the evaluation of safeguards procedures is described. As presently conceptualized, the methodology will consist of the following steps: (1) expansion of the general protection requirements that are contained in the NRC regulations into more detailed but still generic requirements for use at the working level; (2) development of techniques and formats for using the working-level requirements in an evaluation; (3) development of a technique for converting specific facility protection procedures into a format that will allow comparison with the working-level requirements; (4) development of an evaluation technique for comparing the facility protection procedures to determine if they meet the protection requirements

  4. Simplified methodology for analysis of Angra-1 containment

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1988-01-01

    A simplified methodology of analysis was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The obtained data were compared with the Angra 1 Final Safety Analysis Report, and with those calculated by a Detailed Model. The results obtained by this new methodology, such as the small computational time of simulation, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author) [pt

  5. Detailed Chemical Kinetic Modeling of Hydrazine Decomposition

    Science.gov (United States)

    Meagher, Nancy E.; Bates, Kami R.

    2000-01-01

    The purpose of this research project is to develop and validate a detailed chemical kinetic mechanism for gas-phase hydrazine decomposition. Hydrazine is used extensively in aerospace propulsion, and although liquid hydrazine is not considered detonable, many fuel handling systems create multiphase mixtures of fuels and fuel vapors during their operation. Therefore, a thorough knowledge of the decomposition chemistry of hydrazine under a variety of conditions can be of value in assessing potential operational hazards in hydrazine fuel systems. To gain such knowledge, a reasonable starting point is the development and validation of a detailed chemical kinetic mechanism for gas-phase hydrazine decomposition. A reasonably complete mechanism was published in 1996; however, many of the elementary steps included had outdated rate expressions, and a thorough investigation of the behavior of the mechanism under a variety of conditions was not presented. The current work has included substantial revision of the previously published mechanism, along with a more extensive examination of the decomposition behavior of hydrazine. An attempt to validate the mechanism against the limited experimental data available has been made and was moderately successful. Further computational and experimental research into the chemistry of this fuel needs to be completed.
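
    As a toy illustration of how a gas-phase decomposition mechanism is integrated numerically, the sketch below solves species rate equations with Arrhenius rate constants. The two-channel scheme and all rate parameters are invented for illustration and are not the validated hydrazine mechanism discussed in the record.

        # Toy kinetic integration sketch; the mechanism and rates are illustrative only.
        import numpy as np
        from scipy.integrate import solve_ivp

        R = 8.314  # gas constant, J/(mol K)

        def arrhenius(A, Ea, T):
            # First-order rate constant [1/s] from pre-exponential factor and activation energy.
            return A * np.exp(-Ea / (R * T))

        def rhs(t, y, T):
            # Species order: N2H4, NH3, N2, H2 (arbitrary concentration units).
            n2h4, nh3, n2, h2 = y
            k1 = arrhenius(1e12, 2.0e5, T)   # channel 1: N2H4 -> N2 + 2 H2 (illustrative)
            k2 = arrhenius(1e11, 1.8e5, T)   # channel 2: 3 N2H4 -> 4 NH3 + N2 (illustrative)
            r1 = k1 * n2h4
            r2 = k2 * n2h4
            return [-(r1 + r2), (4.0 / 3.0) * r2, r1 + r2 / 3.0, 2.0 * r1]

        sol = solve_ivp(rhs, (0.0, 1e-3), [1.0, 0.0, 0.0, 0.0],
                        args=(1500.0,), method="LSODA")
        print("Final composition [N2H4, NH3, N2, H2]:", np.round(sol.y[:, -1], 4))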

  6. Detailed balance of the Feynman micromotor

    Science.gov (United States)

    Abbott, Derek; Davis, Bruce R.; Parrondo, Juan M. R.

    1999-09-01

    One existing implication of micromotors is that they can be powered by rectifying non-equilibrium thermal fluctuations or mechanical vibrations via the so-called Feynman micromotor. An example of mechanical rectification is found in the batteryless wristwatch. The original concept was described as early as 1912 by Smoluchowski and was later revisited in 1963 by Feynman, in the context of rectifying thermal fluctuations to obtain useful motion. It has been shown that, although rectification is impossible at equilibrium, it is possible for the Feynman micromotor to perform work under non-equilibrium conditions. These concepts can now be realized by MEMS technology and may have exciting implications in biomedicine - where the Feynman micromotor can be used to power a smart pill, for example. Previously, Feynman's analysis of the motor's efficiency was shown to be flawed by Parrondo and Espanol. We now show there are further problems in Feynman's treatment of detailed balance. In order to design and understand this device correctly, the equations of detailed balance must be found. Feynman's approach was to use probabilities based on energies, and we show that this is problematic. In this paper, we demonstrate corrected equations using level-crossing probabilities instead. A potential application of the Feynman micromotor is a batteryless nanopump that consists of a small MEMS chip that adheres to the skin of a patient and dispenses nanoliter quantities of medication. Either mechanical or thermal rectification via a Feynman micromotor, as the power source, is open for possible investigation.

  7. Detailed radon emanation mapping in Northern Latium

    International Nuclear Information System (INIS)

    Aumento, F.

    1993-01-01

    Detailed radon surveys over 5,000 km2 of Northern Latium, covering the northern part of the volcanic province of Central Italy, commenced in the mid eighties as part of a geothermal exploration programme; the surveys have subsequently been continued and amplified with environmental protection in mind. The area is now covered by ground emission maps, radon levels in water supplies, emissions from the different lithologies and concentrations in houses. The high uraniferous content of the volcanics, the porous nature of the ubiquitous pyroclastics, and active geothermal systems in the area combine to convey to ground level high concentrations of radon. The emissions show strong lateral variations which are geologically and tectonically controlled, such that only detailed surveys reveal the extent and locations of anomalous radon emanations. Unfortunately, towns long ago often developed in strategic locations. For Northern Latium this means on volcanic highs formed by faulted tuff blocks, two geological features associated with particularly high radon emissions. As a result, in contrast to the low average indoor radon concentrations for the greater part of Italy, in some of these towns the average values exceed 450 Bq/m3. (author). 1 fig

  8. Calibrating Detailed Chemical Analysis of M dwarfs

    Science.gov (United States)

    Veyette, Mark; Muirhead, Philip Steven; Mann, Andrew; Brewer, John; Allard, France; Homeier, Derek

    2018-01-01

    The ability to perform detailed chemical analysis of Sun-like F-, G-, and K-type stars is a powerful tool with many applications, including studying the chemical evolution of the Galaxy, assessing membership in stellar kinematic groups, and constraining planet formation theories. Unfortunately, complications in modeling cooler stellar atmospheres have hindered similar analysis of M-dwarf stars. Large surveys of FGK abundances play an important role in developing methods to measure the compositions of M dwarfs by providing benchmark FGK stars that have widely-separated M dwarf companions. These systems allow us to empirically calibrate metallicity-sensitive features in M dwarf spectra. However, current methods to measure metallicity in M dwarfs from moderate-resolution spectra are limited to measuring overall metallicity and largely rely on astrophysical abundance correlations in stellar populations. In this talk, I will discuss how large, homogeneous catalogs of precise FGK abundances are crucial to advancing chemical analysis of M dwarfs beyond overall metallicity to direct measurements of individual elemental abundances. I will present a new method to analyze high-resolution, NIR spectra of M dwarfs that employs an empirical calibration of synthetic M dwarf spectra to infer effective temperature, Fe abundance, and Ti abundance. This work is a step toward detailed chemical analysis of M dwarfs at a similar precision achieved for FGK stars.

  9. Detailed Design Documentation, without the Pain

    Science.gov (United States)

    Ramsay, C. D.; Parkes, S.

    2004-06-01

    Producing detailed forms of design documentation, such as pseudocode and structured flowcharts, to describe the procedures of a software system: (1) allows software developers to model and discuss their understanding of a problem and the design of a solution free from the syntax of a programming language, (2) facilitates deeper involvement of non-technical stakeholders, such as the customer or project managers, whose influence ensures the quality, correctness and timeliness of the resulting system, (3) forms comprehensive documentation of the system for its future maintenance, reuse and/or redeployment. However, such forms of documentation require effort to create and maintain. This paper describes a software tool which is currently being developed within the Space Systems Research Group at the University of Dundee which aims to improve the utility of, and the incentive for, creating detailed design documentation for the procedures of a software system. The rationale for creating such a tool is briefly discussed, followed by a description of the tool itself, a summary of its perceived benefits, and plans for future work.

  10. Methodology for fire PSA during design process

    International Nuclear Information System (INIS)

    Kollasko, Heiko; Blombach, Joerg

    2009-01-01

    Fire PSA is an essential part of a full scope level 1 PSA. Cable fires play an important role in fire PSA. Usually, cable routing is therefore modeled in detail. During the design of new nuclear power plants the information on cable routing is not yet available. However, for the use of probabilistic safety insights during the design and for licensing purposes a fire PSA may be requested. Therefore a methodology has been developed which makes use of the strictly divisional separation of redundancies in the design of modern nuclear power plants: cable routing is not needed within one division but replaced by the conservative assumption that all equipment fails due to a fire in the concerned division; critical fire areas are defined where components belonging to different divisions may be affected by a fire. For the determination of fire frequencies a component based approach is proposed. The resulting core damage frequencies due to fire are conservative. (orig.)

  11. Cost analysis methodology of spent fuel storage

    International Nuclear Information System (INIS)

    1994-01-01

    The report deals with the cost analysis of interim spent fuel storage; however, it is not intended either to give a detailed cost analysis or to compare the costs of the different options. This report provides a methodology for calculating the costs of different options for interim storage of the spent fuel produced in the reactor cores. Different technical features and storage options (dry and wet, away from reactor and at reactor) are considered and the factors affecting all options defined. The major cost categories are analysed. Then the net present value of each option is calculated and the levelized cost determined. Finally, a sensitivity analysis is conducted taking into account the uncertainty in the different cost estimates. Examples of current storage practices in some countries are included in the Appendices, with description of the most relevant technical and economic aspects. 16 figs, 14 tabs
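
    A minimal sketch of the cost methodology, using hypothetical figures: annual expenditures for each storage option are discounted to a net present value, a levelized cost per kilogram of spent fuel is derived, and a simple sensitivity to the discount rate is shown. The options, costs and throughputs below are placeholders, not values from the report.

        # Hypothetical NPV / levelized-cost comparison of interim storage options.
        def npv(cash_flows, rate):
            # Discount annual cash flows (year 0, 1, 2, ...) to present value.
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

        options = {
            # option: (annual cost [$/yr] over 20 years, annual throughput [kg/yr])
            "dry storage, away from reactor": ([30e6] * 20, [1500.0] * 20),
            "wet storage, at reactor":        ([22e6] * 20, [1500.0] * 20),
        }

        for rate in (0.03, 0.05, 0.08):   # sensitivity to the discount rate
            print(f"discount rate {rate:.0%}")
            for name, (costs, throughput) in options.items():
                levelized = npv(costs, rate) / npv(throughput, rate)
                print(f"  {name}: NPV = ${npv(costs, rate) / 1e6:,.0f}M, "
                      f"levelized cost = ${levelized:,.0f}/kg")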

  12. Methodological practicalities in analytical generalization

    DEFF Research Database (Denmark)

    Halkier, Bente

    2011-01-01

    generalization. Theoretically, the argumentation in the article is based on practice theory. The main part of the article describes three different examples of ways of generalizing on the basis of the same qualitative data material. There is a particular focus on describing the methodological strategies......In this article, I argue that the existing literature on qualitative methodologies tend to discuss analytical generalization at a relatively abstract and general theoretical level. It is, however, not particularly straightforward to “translate” such abstract epistemological principles into more...... operative methodological strategies for producing analytical generalizations in research practices. Thus, the aim of the article is to contribute to the discussions among qualitatively working researchers about generalizing by way of exemplifying some of the methodological practicalities in analytical...

  13. Reflective Methodology: The Beginning Teacher

    Science.gov (United States)

    Templeton, Ronald K.; Siefert, Thomas E.

    1970-01-01

    Offers a variety of specific techniques which will help the beginning teacher to implement reflective methodology and create an inquiry-centered classroom atmosphere, at the same time meeting the many more pressing demands of first-year teaching. (JES)

  14. Methodologies used in Project Management

    OpenAIRE

    UNGUREANU, Adrian; UNGUREANU, Anca

    2014-01-01

    Undoubtedly, a methodology properly defined and strictly followed for project management provides a firm guarantee that the work will be done on time, on budget and according to specifications. A project management methodology, in simple terms, is a "must-have" to avoid failure and reduce risks, because it is one of the critical success factors, alongside the basic skills of the management team. It is the simple way to guide the team through the design and execution phases, processes and tasks throughout...

  15. Methodology for ranking restoration options

    DEFF Research Database (Denmark)

    Jensen, Per Hedemann

    1999-01-01

    techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: - characterisation of relevant contaminated sites - identification and characterisation of relevant restoration...... techniques - assessment of the radiological impact - development and application of a selection methodology for restoration options - formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated

  16. Coupled Dictionary Learning for the Detail-Enhanced Synthesis of 3-D Facial Expressions.

    Science.gov (United States)

    Liang, Haoran; Liang, Ronghua; Song, Mingli; He, Xiaofei

    2016-04-01

    The desire to reconstruct 3-D face models with expressions from 2-D face images fosters increasing interest in addressing the problem of face modeling. This task is important and challenging in the field of computer animation. Facial contours and wrinkles are essential to generate a face with a certain expression; however, these details are generally ignored or are not seriously considered in previous studies on face model reconstruction. Thus, we employ coupled radial basis function networks to derive an intermediate 3-D face model from a single 2-D face image. To optimize the 3-D face model further through landmarks, a coupled dictionary that is related to 3-D face models and their corresponding 3-D landmarks is learned from the given training set through local coordinate coding. Another coupled dictionary is then constructed to bridge the 2-D and 3-D landmarks for the transfer of vertices on the face model. As a result, the final 3-D face can be generated with the appropriate expression. In the testing phase, the 2-D input faces are converted into 3-D models that display different expressions. Experimental results indicate that the proposed approach to facial expression synthesis can obtain model details more effectively than previous methods can.

  17. Investigating surety methodologies for cognitive systems.

    Energy Technology Data Exchange (ETDEWEB)

    Caudell, Thomas P. (University of New Mexico, Albuquerque, NM); Peercy, David Eugene; Mills, Kristy (University of New Mexico, Albuquerque, NM); Caldera, Eva (University of New Mexico, Albuquerque, NM)

    2006-11-01

    Advances in cognitive science provide a foundation for new tools that promise to advance human capabilities with significant positive impacts. As with any new technology breakthrough, associated technical and non-technical risks are involved. Sandia has mitigated both technical and non-technical risks by applying advanced surety methodologies in such areas as nuclear weapons, nuclear reactor safety, nuclear materials transport, and energy systems. In order to apply surety to the development of cognitive systems, we must understand the concepts and principles that characterize the certainty of a system's operation as well as the risk areas of cognitive sciences. This SAND report documents a preliminary spectrum of risks involved with cognitive sciences, and identifies some surety methodologies that can be applied to potentially mitigate such risks. Some potential areas for further study are recommended. In particular, a recommendation is made to develop a cognitive systems epistemology framework for more detailed study of these risk areas and applications of surety methods and techniques.

  18. Understanding ensemble protein folding at atomic detail

    International Nuclear Information System (INIS)

    Wallin, Stefan; Shakhnovich, Eugene I

    2008-01-01

    Although far from routine, simulating the folding of specific short protein chains on the computer, at a detailed atomic level, is starting to become a reality. This remarkable progress, which has been made over the last decade or so, allows a fundamental aspect of the protein folding process to be addressed, namely its statistical nature. In order to make quantitative comparisons with experimental kinetic data a complete ensemble view of folding must be achieved, with key observables averaged over the large number of microscopically different folding trajectories available to a protein chain. Here we review recent advances in atomic-level protein folding simulations and the new insight provided by them into the protein folding process. An important element in understanding ensemble folding kinetics are methods for analyzing many separate folding trajectories, and we discuss techniques developed to condense the large amount of information contained in an ensemble of trajectories into a manageable picture of the folding process. (topical review)

  19. “Influence Method”. Detailed mathematical description

    International Nuclear Information System (INIS)

    Rios, I.J.; Mayer, R.E.

    2015-01-01

    A new method for the absolute determination of nuclear particle flux in the absence of known detector efficiency, the “Influence Method”, was recently published (I.J. Rios and R.E. Mayer, Nuclear Instruments & Methods in Physics Research A 775 (2015) 99–104). The method defines an estimator for the population and another estimator for the efficiency. In this article we present a detailed mathematical description which yields the conditions for its application, the probability distributions of the estimators and their characteristic parameters. An analysis of the different cases leads to expressions of the estimators and their uncertainties. - Highlights: • “Influence Method”, a new method for absolute particle flux determination. • Absolute counting method when detector efficiencies are not known. • Absolute detector efficiency determination

  20. Radioactive contamination mapping system detailed design report

    International Nuclear Information System (INIS)

    Bauer, R.G.; O'Callaghan, P.B.

    1996-08-01

    The Hanford Site's 100 Area production reactors released radioactively and chemically contaminated liquids into the soil column. The primary source of the contaminated liquids was reactor coolant and various waste waters released from planned liquid discharges, as well as pipelines, pipe junctions, and retention basins leaking into the disposal sites. Site remediation involves excavating the contaminated soils using conventional earthmoving techniques and equipment, treating as appropriate, transporting the soils, and disposing the soils at ERDF. To support remediation excavation, disposal, and documentation requirements, an automated radiological monitoring system was deemed necessary. The RCMS (Radioactive Contamination Mapping System) was designed to fulfill this need. This Detailed Design Report provides design information for the RCMS in accordance with Bechtel Hanford, Inc. Engineering Design Project Instructions

  1. Generation and memory for contextual detail.

    Science.gov (United States)

    Mulligan, Neil W

    2004-07-01

    Generation enhances item memory but may not enhance other aspects of memory. In 12 experiments, the author investigated the effect of generation on context memory, motivated in part by the hypothesis that generation produces a trade-off in encoding item and contextual information. Participants generated some study words (e.g., hot-c__) and read others (e.g., hot-cold). Generation consistently enhanced item memory but did not enhance context memory. More specifically, generation disrupted context memory for the color of the target word but did not affect context memory for location, background color, and cue-word color. The specificity of the negative generation effect in context memory argues against a general item-context trade-off. A processing account of generation meets greater success. In addition, the results provide no evidence that generation enhances recollection of contextual details. Copyright 2004 APA, all rights reserved

  2. Detailed α -decay study of 180Tl

    Science.gov (United States)

    Andel, B.; Andreyev, A. N.; Antalic, S.; Barzakh, A.; Bree, N.; Cocolios, T. E.; Comas, V. F.; Diriken, J.; Elseviers, J.; Fedorov, D. V.; Fedosseev, V. N.; Franchoo, S.; Ghys, L.; Heredia, J. A.; Huyse, M.; Ivanov, O.; Köster, U.; Liberati, V.; Marsh, B. A.; Nishio, K.; Page, R. D.; Patronis, N.; Seliverstov, M. D.; Tsekhanovich, I.; Van den Bergh, P.; Van De Walle, J.; Van Duppen, P.; Venhart, M.; Vermote, S.; Veselský, M.; Wagemans, C.

    2017-11-01

    A detailed α-decay spectroscopy study of 180Tl has been performed at ISOLDE (CERN). Z-selective ionization by the Resonance Ionization Laser Ion Source (RILIS) coupled to mass separation provided a high-purity beam of 180Tl. Fine-structure α decays to excited levels in the daughter 176Au were identified and an α-decay scheme of 180Tl was constructed based on an analysis of α-γ and α-γ-γ coincidences. Multipolarities of several γ-ray transitions deexciting levels in 176Au were determined. Based on the analysis of reduced α-decay widths, it was found that all α decays are hindered, which signifies a change of configuration between the parent and all daughter states.

  3. Comparative analysis as a basic research orientation: Key methodological problems

    Directory of Open Access Journals (Sweden)

    N P Narbut

    2015-12-01

    Full Text Available To date, the Sociological Laboratory of the Peoples’ Friendship University of Russia has accumulated vast experience in the field of cross-cultural studies, reflected in publications based on the results of mass surveys conducted in Moscow, Maikop, Beijing, Guangzhou, Prague, Belgrade, and Pristina. However, these publications mainly focus on comparisons of the empirical data rather than on methodological and technical issues; that is why the aim of this article is to identify key problems of comparative analysis in cross-cultural studies that become evident only if you conduct an empirical research yourself - from the first step of setting the problem and approving it by all the sides (countries) involved to the last step of interpreting and comparing the data obtained. The authors are sure that no sociologist would ever doubt the necessity and importance of comparative analysis in the broadest sense of the word, but at the same time very few are ready to discuss its key methodological challenges and prefer to ignore them completely. We summarize the problems of comparative analysis in sociology as follows: (1) applying research techniques to the sample in another country - both in translating and adapting them to different social realities and worldviews (in particular, the problematic status of standardization and the qualitative approach); (2) choosing "right" respondents to question and relevant cases (cultures) to study; (3) designing the research scheme, i.e. justifying the sequence of steps (what should go first - methodology or techniques); (4) accepting the procedures that are correct within one country for cross-cultural work (whether or not that is an appropriate choice).

  4. Intellectual potential of population: theoretical and methodological framework for research

    Directory of Open Access Journals (Sweden)

    Galina Valentinovna Leonidova

    2014-03-01

    Full Text Available The article considers the theoretical and methodological framework for research into the population’s intellectual potential. The presented materials show that this category is the subject of interdisciplinary studies, including philosophy, psychology, sociology, pedagogics and economics. One of the important conclusions drawn from the analysis of the essence of intellectual potential is that the actual level of intelligence is the result of its development. It means that certain efforts on the part of such social institutions as the family, education and government promote not only the formation of smart people, but also the implementation of their potential intellectual capabilities in production, the creation of cultural values, the management of society, education, etc. When this approach is used, the intellect ceases to be just a research object of related disciplines; it acquires a social dimension and becomes a socio-economic category. The basic theories, concepts and approaches used in its study are analyzed. The theory of human capital is given the most thorough consideration because, according to this theory, the income of a person is earned by knowledge, abilities and skills, i.e. the essence of the intellectual properties of an individual. The article provides the author’s definition of the intellectual potential of the population, which brings to the fore the following elements necessary for understanding this category: its relation to socioeconomic development; the factors in the formation of the characteristic, including the need for training (reproduction of intelligent people; the psychological aspect (abilities; and the carriers of intellectual potential, which are not ignored because the potential is an attribute of the population. The article identifies methodological approaches to the estimation of the population’s intellectual potential, describes the applied procedures and research methods. The authors propose methodological

  5. Neuroinflammation: the devil is in the details.

    Science.gov (United States)

    DiSabato, Damon J; Quan, Ning; Godbout, Jonathan P

    2016-10-01

    There is significant interest in understanding inflammatory responses within the brain and spinal cord. Inflammatory responses that are centralized within the brain and spinal cord are generally referred to as 'neuroinflammatory'. Aspects of neuroinflammation vary within the context of disease, injury, infection, or stress. The context, course, and duration of these inflammatory responses are all critical aspects in the understanding of these processes and their corresponding physiological, biochemical, and behavioral consequences. Microglia, innate immune cells of the CNS, play key roles in mediating these neuroinflammatory responses. Because the connotation of neuroinflammation is inherently negative and maladaptive, the majority of research focus is on the pathological aspects of neuroinflammation. There are, however, several degrees of neuroinflammatory responses, some of which are positive. In many circumstances including CNS injury, there is a balance of inflammatory and intrinsic repair processes that influences functional recovery. In addition, there are several other examples where communication between the brain and immune system involves neuroinflammatory processes that are beneficial and adaptive. The purpose of this review is to distinguish different variations of neuroinflammation in a context-specific manner and detail both positive and negative aspects of neuroinflammatory processes. In this review, we will use brain and spinal cord injury, stress, aging, and other inflammatory events to illustrate the potential harm and benefits inherent to neuroinflammation. Context, course, and duration of the inflammation are highly important to the interpretation of these events, and we aim to provide insight into this by detailing several commonly studied insults. This article is part of the 60th anniversary supplemental issue. © 2016 International Society for Neurochemistry.

  6. Emerging Concepts and Methodologies in Cancer Biomarker Discovery.

    Science.gov (United States)

    Lu, Meixia; Zhang, Jinxiang; Zhang, Lanjing

    2017-01-01

    Cancer biomarker discovery is a critical part of cancer prevention and treatment. Despite the decades of effort, only a small number of cancer biomarkers have been identified for and validated in clinical settings. Conceptual and methodological breakthroughs may help accelerate the discovery of additional cancer biomarkers, particularly their use for diagnostics. In this review, we have attempted to review the emerging concepts in cancer biomarker discovery, including real-world evidence, open access data, and data paucity in rare or uncommon cancers. We have also summarized the recent methodological progress in cancer biomarker discovery, such as high-throughput sequencing, liquid biopsy, big data, artificial intelligence (AI), and deep learning and neural networks. Much attention has been given to the methodological details and comparison of the methodologies. Notably, these concepts and methodologies interact with each other and will likely lead to synergistic effects when carefully combined. Newer, more innovative concepts and methodologies are emerging as the current emerging ones became mainstream and widely applied to the field. Some future challenges are also discussed. This review contributes to the development of future theoretical frameworks and technologies in cancer biomarker discovery and will contribute to the discovery of more useful cancer biomarkers.

  7. Detailed free span assessment for Mexilhao flow lines

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Antonio; Franco, Luciano; Eigbe, Uwa; BomfimSilva, Carlos [INTECSEA, Houston, TX (United States); Escudero, Carlos [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    design life. This paper presents the FE methodology and associated tools to perform the detailed free span assessment of the Mexilhao flow lines, considering the design information, the post-lay survey data and the as-built reports after span correction in order to accurately account for the multi-spans and multimode effects in the span assessment procedure. (author)

  8. Urban scale air quality modelling using detailed traffic emissions estimates

    Science.gov (United States)

    Borrego, C.; Amorim, J. H.; Tchepel, O.; Dias, D.; Rafael, S.; Sá, E.; Pimentel, C.; Fontes, T.; Fernandes, P.; Pereira, S. R.; Bandeira, J. M.; Coelho, M. C.

    2016-04-01

    The atmospheric dispersion of NOx and PM10 was simulated with a second generation Gaussian model over a medium-size south-European city. Microscopic traffic models calibrated with GPS data were used to derive typical driving cycles for each road link, while instantaneous emissions were estimated applying a combined Vehicle Specific Power/Co-operative Programme for Monitoring and Evaluation of the Long-range Transmission of Air Pollutants in Europe (VSP/EMEP) methodology. Site-specific background concentrations were estimated using time series analysis and a low-pass filter applied to local observations. Air quality modelling results are compared against measurements at two locations for a 1 week period. 78% of the results are within a factor of two of the observations for 1-h average concentrations, increasing to 94% for daily averages. Correlation significantly improves when background is added, with an average of 0.89 for the 24 h record. The results highlight the potential of detailed traffic and instantaneous exhaust emissions estimates, together with filtered urban background, to provide accurate input data to Gaussian models applied at the urban scale.
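
    To illustrate the kind of calculation a Gaussian dispersion model performs, the sketch below evaluates a textbook Gaussian plume formula with ground reflection for a single source. The dispersion-coefficient fits, emission rate and receptor geometry are hypothetical and are not the second-generation model or the data used in the study.

        # Textbook Gaussian plume sketch; coefficients and inputs are illustrative only.
        import numpy as np

        def plume_concentration(q, u, x, y, z, h):
            """Gaussian plume concentration with ground reflection [g/m^3].

            q: emission rate [g/s], u: wind speed [m/s], h: effective source height [m],
            x: downwind distance [m], y: crosswind offset [m], z: receptor height [m].
            """
            # Simple power-law dispersion coefficients (illustrative neutral-stability fit).
            sigma_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)
            sigma_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)
            return (q / (2 * np.pi * u * sigma_y * sigma_z)
                    * np.exp(-0.5 * (y / sigma_y) ** 2)
                    * (np.exp(-0.5 * ((z - h) / sigma_z) ** 2)
                       + np.exp(-0.5 * ((z + h) / sigma_z) ** 2)))

        # NOx concentration 200 m downwind of a source emitting 0.5 g/s in a 3 m/s wind.
        c = plume_concentration(q=0.5, u=3.0, x=200.0, y=0.0, z=1.5, h=1.0)
        print(f"Concentration at receptor: {c * 1e6:.0f} ug/m3")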

  9. Performance evaluation of contrast-detail in full field digital mammography systems using ideal (Hotelling) observer vs. conventional automated analysis of CDMAM images for quality control of contrast-detail characteristics.

    Science.gov (United States)

    Delakis, Ioannis; Wise, Robert; Morris, Lauren; Kulama, Eugenia

    2015-11-01

    The purpose of this work was to evaluate the contrast-detail performance of full field digital mammography (FFDM) systems using ideal (Hotelling) observer Signal-to-Noise Ratio (SNR) methodology and to ascertain whether it can be considered an alternative to the conventional, automated analysis of CDMAM phantom images. Five FFDM units currently used in the national breast screening programme were evaluated, which differed with respect to age, detector, Automatic Exposure Control (AEC) and target/filter combination. Contrast-detail performance was analysed using CDMAM and ideal observer SNR methodology. The ideal observer SNR was calculated for input signal originating from gold discs of varying thicknesses and diameters, and then used to estimate the threshold gold thickness for each diameter as per CDMAM analysis. The variability of both methods and the dependence of CDMAM analysis on phantom manufacturing discrepancies were also investigated. Results from both CDMAM and ideal observer methodologies were informative differentiators of FFDM systems' contrast-detail performance, displaying comparable patterns with respect to the FFDM systems' type and age. CDMAM results suggested higher threshold gold thickness values than the ideal observer methodology, especially for small-diameter details, which can be attributed to the behaviour of the CDMAM phantom used in this study. In addition, ideal observer methodology results showed lower variability than CDMAM results. The ideal observer SNR methodology can provide a useful metric of the FFDM systems' contrast-detail characteristics and could be considered a surrogate for conventional, automated analysis of CDMAM images. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
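
    In its general form the ideal (Hotelling) observer SNR for a known difference signal s in noise with covariance K is SNR^2 = s^T K^-1 s. The snippet below evaluates this under deliberately simplified assumptions (a small region of interest, a disc signal, white noise plus a weak correlated term); it is a sketch of the general formula, not the authors' implementation.

        # Minimal sketch of an ideal (Hotelling) observer SNR for a known disc
        # signal in Gaussian noise: SNR^2 = s^T K^{-1} s. Parameters are hypothetical.
        import numpy as np

        n = 16                                   # small ROI, n x n pixels
        yy, xx = np.mgrid[0:n, 0:n]
        disc = ((xx - n / 2)**2 + (yy - n / 2)**2 <= 3.0**2).astype(float)
        s = (0.05 * disc).ravel()                # expected difference signal (5% contrast)

        # Covariance: white noise plus a weak long-range correlation (illustrative only)
        sigma2, rho = 0.01, 0.002
        K = sigma2 * np.eye(n * n) + rho * np.ones((n * n, n * n))

        snr = np.sqrt(s @ np.linalg.solve(K, s))
        print(f"Hotelling SNR = {snr:.2f}")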

  10. The bases for the differences in the training methodology for male and female athletes

    Directory of Open Access Journals (Sweden)

    Владимир Платонов

    2017-09-01

    Full Text Available This analytical review article presents data reflecting the need for significant differentiation of the methodology of sports training for male and female athletes, a need that is unfortunately ignored in sports practice and not adequately reflected in the vast majority of publications in the field of the theory and methodology of sports training. This differentiation concerns the following main components: physique, strength qualities and flexibility; the energy systems; the peculiarities of the psyche and behavioral reactions; the menstrual cycle; the female athlete triad; hyperandrogenism; pregnancy and parturition; and the age dependence of sports performance. Clearly insufficient consideration of the peculiarities of the female body not only prevents the natural talent of athletes from being fully used to achieve the highest attainable sports performance, but is also highly likely to disturb normal age-related development and produce serious health problems in female athletes.

  11. Detailed Facility Report Data Dictionary | ECHO | US EPA

    Science.gov (United States)

    The Detailed Facility Report Data Dictionary provides users with a list of the variables and definitions that have been incorporated into the Detailed Facility Report. The Detailed Facility Report provides a concise enforcement and compliance history for a facility.

  12. The devil is in the detail: children's recollection of details about their prior experiences.

    Science.gov (United States)

    Strange, Deryn; Hayne, Harlene

    2013-01-01

    Adults sometimes report highly specific details of childhood events, including the weather, what they or others were wearing, as well as information about what they or others said or were thinking at the time. When these details are reported in the course of research they shape our theories of memory development; when they are reported in a criminal trial they influence jurors' evaluation of guilt or innocence. The key question is whether these details were encoded at the time the event took place or have been added after the fact. We addressed this question prospectively by examining the memory accounts of children. In Experiment 1 we coded the reports of 5- to 6-year-olds and 9- to 10-year-olds who had experienced a unique event. We found that spontaneous mentions of these specific details were exceedingly rare. In Experiment 2 we questioned additional children about a similar event using specific questions to extract those details. We found that 9- to 10-year-olds were able to accurately answer, while 5- to 6-year-olds had considerable difficulty. Moreover, when the younger children did respond they provided generic, forensically inadequate, information. These data have important implications for the courtroom and for current theories of memory development and childhood amnesia.

  13. A philosophical analysis of the general methodology of qualitative research: a critical rationalist perspective.

    Science.gov (United States)

    Rudnick, Abraham

    2014-09-01

    Philosophical discussion of the general methodology of qualitative research, such as that used in some health research, has been inductivist or relativist to date, ignoring critical rationalism as a philosophical approach with which to discuss the general methodology of qualitative research. This paper presents a discussion of the general methodology of qualitative research from a critical rationalist perspective (inspired by Popper), using as an example mental health research. The widespread endorsement of induction in qualitative research is positivist and is suspect, if not false, particularly in relation to the context of justification (or rather theory testing) as compared to the context of discovery (or rather theory generation). Relativism is riddled with philosophical weaknesses and hence it is suspect if not false too. Theory testing is compatible with qualitative research, contrary to much writing about and in qualitative research, as theory testing involves learning from trial and error, which is part of qualitative research, and which may be the form of learning most conducive to generalization. Generalization involves comparison, which is a fundamental methodological requirement of any type of research (qualitative or other); hence the traditional grounding of quantitative and experimental research in generalization. Comparison--rather than generalization--is necessary for, and hence compatible with, qualitative research; hence, the common opposition to generalization in qualitative research is misdirected, disregarding whether this opposition's claims are true or false. In conclusion, qualitative research, similar to quantitative and experimental research, assumes comparison as a general methodological requirement, which is necessary for health research.

  14. Environmental impact assessment for energy pathways: an integrated methodology

    International Nuclear Information System (INIS)

    Sommereux-Blanc, Isabelle

    2010-01-01

    This document presents the synthesis of my research work contributing to the development of an integrated methodology of environmental impact assessment for energy pathways. In the context of world globalization, environmental impact assessment issues are closely linked to the following questions: Which environmental impacts? For which demand? At which location? At which temporal scale? My work is built upon the definition of a conceptual framework able to handle these issues and upon its progressive implementation. The integration of spatial and temporal issues within the methodology is a key element. The fundamental cornerstones of this framework are presented along the DPSIR concept (Driving forces, Pressures, State, Impacts, Responses). They cover a comprehensive analysis of the limits and relevance of life cycle analysis and the development of a geo-spatialized environmental performance approach for an electrical production pathway. Perspectives linked with the development of this integrated methodology are detailed for energy pathways. (author)

  15. Methodology for flood risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Casada, M.L.; Fussell, J.B.

    1984-01-01

    The methodology for flood risk analysis described here addresses the effects of a flood on nuclear power plant safety systems. Combining the results of this method with the probability of a flood allows the effects of flooding to be included in a probabilistic risk assessment. The five-step methodology includes accident sequence screening to focus the detailed analysis efforts on the accident sequences that are significantly affected by a flood event. The quantitative results include the flood's contribution to system failure probability, accident sequence occurrence frequency and consequence category occurrence frequency. The analysis can be added to existing risk assessments without a significant loss in efficiency. The results of two example applications show the usefulness of the methodology. Both examples rely on the Reactor Safety Study for the required risk assessment inputs and present changes in the Reactor Safety Study results as a function of flood probability
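
    As a schematic illustration of how a flood contribution enters sequence quantification (flood frequency multiplied by conditional system failure probabilities, added to the non-flood contribution), consider the sketch below; every number and event name is hypothetical, and the arithmetic is a generic illustration rather than a calculation from the paper.

        # Schematic sketch: adding a flood contribution to an accident-sequence
        # frequency. All frequencies and probabilities are hypothetical.
        flood_frequency = 1e-4          # floods per reactor-year (assumed)
        p_seq_no_flood = 3e-6           # baseline sequence frequency per year (assumed)

        # Conditional failure probabilities of the credited systems given the flood
        p_sys_a_given_flood = 0.1       # assumed
        p_sys_b_given_flood = 0.05      # assumed

        # Flood-induced contribution: flood occurs AND both systems fail given flood
        flood_contribution = flood_frequency * p_sys_a_given_flood * p_sys_b_given_flood

        total = p_seq_no_flood + flood_contribution
        print(f"flood contribution: {flood_contribution:.2e} /yr")
        print(f"sequence frequency with flood: {total:.2e} /yr "
              f"({100 * flood_contribution / total:.1f}% from flooding)")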

  16. Structured Intuition: A Methodology to Analyse Entity Authentication

    DEFF Research Database (Denmark)

    Ahmed, Naveed

    and the level of abstraction used in the analysis. Thus, the goal of developing a high level methodology that can be used with different notions of security, authentication, and abstraction is worth considering. In this thesis, we propose a new methodology, called the structured intuition (SI), which addresses...... in our methodology, which is called canonicity, which is a weaker form of message authenticity. As compared to many contemporary analysis techniques, an SI based analysis provides detailed results regarding the design rationales and entity authentication goals of a protocol....... consequences for the security of the system, e.g., private information of legitimate parties may be leaked or the security policy of a trusted system may be violated. At a corporate level, such a failure of authentication may result in loss of proprietary technology or customers' credit card information...

  17. Physical protection evaluation methodology program development and application

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Janghoon; Yoo, Hosik [Korea Institute of Nuclear Non-proliferation and Control, Daejeon (Korea, Republic of)

    2015-10-15

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept at the design stage. The methodology can be used to assess weak points and improve performance not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not a trivial task, since there are many interconnected factors affecting overall performance. Several international projects have therefore been organized to develop a systematic physical protection evaluation methodology. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodologies are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and has a strong point in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, COMPRE program development and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, vital area analysis and realistic threat scenario assessment are required. Like

  18. Physical protection evaluation methodology program development and application

    International Nuclear Information System (INIS)

    Seo, Janghoon; Yoo, Hosik

    2015-01-01

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept at the design stage. The methodology can be used to assess weak points and improve performance not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not a trivial task, since there are many interconnected factors affecting overall performance. Several international projects have therefore been organized to develop a systematic physical protection evaluation methodology. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodologies are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and has a strong point in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, COMPRE program development and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, vital area analysis and realistic threat scenario assessment are required. Like

  19. Has Political Science Ignored Religion?

    Science.gov (United States)

    Kettell, Steven

    2012-01-01

    A common complaint from political scientists involved in the study of religion is that religious issues have been largely overlooked by political science. Through a content analysis of leading political science and sociology journals from 2000 to 2010, this article considers the extent of this claim. The results show that political science…

  20. Credentialism in Our Ignorant Society.

    Science.gov (United States)

    Marien, Michael

    All societies have procedures for selecting who will occupy important positions. The use of credentials characterizes our system of social selection, and our worship of them has created the following problems: an artificial demand for education, artificial restraints to learning, the overlooking of obsolescence, generational inversion (wherein the…

  1. Pancreatic Stones: Treat or Ignore?

    Directory of Open Access Journals (Sweden)

    DA Howell

    1999-01-01

    Full Text Available Painful, chronic pancreatitis is of complex etiology, but increasing clinical experience suggests that removal of pancreatic duct stones in many cases significantly improves patients' symptoms. The development and refinement of therapeutic endoscopic retrograde cholangiopancreatography have permitted improved access to the pancreatic duct, which makes the development of new techniques of stone fragmentation and fragment removal a much more successful nonsurgical intervention. A major step forward has been the understanding of the safety and efficacy of pancreatic sphincterotomy, which is necessary for the removal of these difficult stones. The recognition that extracorporeal shock wave lithotripsy can be delivered safely with good efficacy has revolutionized the nonsurgical management of pancreatic duct stones. Nevertheless, advanced and sophisticated therapeutic endoscopy is necessary to achieve clearance of the duct, which can generally be accomplished in the majority of selected patients. State-of-the-art treatments are described, and some new approaches using pancreatoscopy and electrohydraulic lithotripsy are discussed. Newly recognized long term complications are reviewed. Finally, it must be recognized that chronic pancreatitis is an ongoing disease that does not have a simple treatment or cure, and frequently represents a process of remissions and relapses requiring interventions and problem solving.

  2. Hypoxic radiosensitization: adored and ignored

    DEFF Research Database (Denmark)

    Overgaard, Jens

    2007-01-01

    Since observations from the beginning of the last century, it has become well established that solid tumors may contain oxygen-deficient hypoxic areas and that cells in such areas may cause tumors to become radioresistant. This resistance can be eliminated or modified by normobaric or hyperbaric oxygen or by the use of nitroimidazoles as hypoxic radiation sensitizers. More recently, attention has been given to hypoxic cytotoxins, a group of drugs that selectively or preferentially destroy cells in a hypoxic environment. Identifying hypoxic cells in human tumors has improved with the help of new...

  3. Optoelectronic pH Meter: Further Details

    Science.gov (United States)

    Jeevarajan, Antony S.; Anderson, Mejody M.; Macatangay, Ariel V.

    2009-01-01

    A collection of documents provides further detailed information about an optoelectronic instrument that measures the pH of an aqueous cell-culture medium to within 0.1 unit in the range from 6.5 to 7.5. The instrument at an earlier stage of development was reported in Optoelectronic Instrument Monitors pH in a Culture Medium (MSC-23107), NASA Tech Briefs, Vol. 28, No. 9 (September 2004), page 4a. To recapitulate: The instrument includes a quartz cuvette through which the medium flows as it is circulated through a bioreactor. The medium contains some phenol red, which is an organic pH-indicator dye. The cuvette sits between a light source and a photodetector. [The light source in the earlier version comprised red (625 nm) and green (558 nm) light-emitting diodes (LEDs); the light source in the present version comprises a single green- (560 nm)-or-red (623 nm) LED.] The red and green are repeatedly flashed in alternation. The responses of the photodiode to the green and red are processed electronically to obtain the ratio between the amounts of green and red light transmitted through the medium. The optical absorbance of the phenol red in the green light varies as a known function of pH. Hence, the pH of the medium can be calculated from the aforesaid ratio.
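
    As a rough sketch of the underlying calculation (not the instrument's actual calibration), the green/red intensity ratio can be converted to an absorbance and then to pH through a Henderson-Hasselbalch style relation for phenol red; the pKa, endpoint absorbances and readings below are assumed illustrative values.

        # Illustrative sketch: estimate pH from the green/red transmission ratio
        # using a Henderson-Hasselbalch style calibration of phenol red.
        # pKa and endpoint absorbances are assumed values, not the instrument's.
        import math

        def ph_from_ratio(i_green_over_i_red, ratio_blank,
                          a_acid=0.05, a_base=0.60, pka=7.6):
            """Estimate pH from the transmitted green/red intensity ratio.

            ratio_blank: the same ratio measured through a dye-free reference,
            so that absorbance A = -log10(ratio / ratio_blank).
            a_acid, a_base: calibration absorbances of the fully protonated and
            fully deprotonated dye at the green wavelength (assumed).
            """
            a = -math.log10(i_green_over_i_red / ratio_blank)
            a = min(max(a, a_acid + 1e-6), a_base - 1e-6)   # keep inside calibration range
            return pka + math.log10((a - a_acid) / (a_base - a))

        print(round(ph_from_ratio(0.55, 1.0), 2))   # hypothetical reading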

  4. Basic and detail engineering development of PTAMB

    International Nuclear Information System (INIS)

    Beuter, Oscar; Reibel, Jose A.; Mirad, Andres E.; Furriel, Miguel; Diaz, L.

    2009-01-01

    The purpose of the future Treatment and Conditioning of Medium and Low Activity Solid and Liquid Waste Plant (PTAMB) of the National Atomic Energy Commission (CNEA) will be to treat and condition medium- and low-activity solid and liquid waste and to verify the quality of the conditioned waste generated at the Ezeiza Atomic Center (CAE), the Constituyentes Atomic Center (CAC) and by other national producers outside CNEA. The PTAMB is a Class I Radioactive Installation (according to Basic Standard AR 10.1.1, Rev. 3 RNA, paragraphs 17 and 22), also called a Relevant Installation. The aim of this document is to list the steps carried out by the Projects Department of the National Program of Radioactive Waste Management (PNGRR) to arrive at the detailed engineering of the plant. The project is at the Public Tender stage and construction is expected to begin in March 2010. Once built, the plant will process the radioactive waste covered by the conceptual engineering, offering more precise control of this waste and ensuring its compatibility with the new final disposal systems to be built. (author)

  5. WA uranium find under detailed study

    International Nuclear Information System (INIS)

    1987-01-01

    Results of detailed geological surveys of CRA's Kintyre prospect in Western Australia have confirmed the presence of uranium in significant quantities with a number of features that make it promising for mining. The deposit is set in the remote Rudall River area, about 1,200 kilometres northeast of Perth. So far, probable ore reserves of 15,000 tonnes of U3O8 and possible reserves of 15,000 tonnes have been identified and announced. Grades vary widely within a range of 1.5 to 4 kg per tonne. The bulk of the ore body lies within 160 metres of the surface, which means it could be mined by open cut methods. The uranium mineralisation has been encountered in bands of pitchblende occurring as veins within the host rock. Current indications are that conventional acid/leach solvent extraction processes can be used to extract the uranium. The Kintyre deposit lies about 700 metres inside the northern boundary of Western Australia's Rudall River National Park. Exploration by CRA at the southern end of the park, in the vicinity of Mt. Cotton, has been halted temporarily. While the Kintyre geological results to date are most encouraging, studies are now being carried out to determine the commercial potential of the deposit

  6. Work life after psychosis: A detailed examination.

    Science.gov (United States)

    Turner, Niall; O'Mahony, Paul; Hill, Michelle; Fanning, Felicity; Larkin, Conall; Waddington, John; O'Callaghan, Eadbhard; Clarke, Mary

    2015-01-01

    Conducting research on the work outcomes of first episode psychosis (FEP) samples may extend our understanding of the factors associated with the work outcome of people with schizophrenia and other psychotic illnesses. The aim was to conduct a detailed study of the work outcome of an FEP sample. Members of an FEP cohort, who had completed a 12-year clinical outcome assessment, were invited to participate in an adjunctive work outcome study. Engagement in paid and non-paid work was first established and the relationship with potentially influential baseline characteristics investigated. Subsequently, the influence of work outcome on participants' quality of life, mental health, recovery, and social inclusion was examined. Among the 38 participants the mean percentage of time spent in work was 62%, of which 50% was in paid work and 12% in non-paid work. Being employed at inception was the only independent predictor of the duration of the follow-up period spent in work. Relationships between work outcome and all measures of wellbeing were found. The paid and non-paid work attained by people affected by a psychotic illness played an important role in the extent of their wellbeing, recovery, and social inclusion.

  7. Polonium-210 in Euphauslids: A Detailed Study

    Energy Technology Data Exchange (ETDEWEB)

    Heyraud, M.; Fowler, S. W.; Beasley, T. M.; Cherry, R. D.

    1976-07-01

    A detailed study of {sup 210}Po, the predominant alpha-emitting nuclide found in most marine organisms, has been undertaken in a particular zooplanktonic species, the euphausiid Meganyctiphanes norvegica. The purpose was to obtain information concerning the origin, the localization and the flux of the nuclide in and through this organism. Measurements of {sup 210}Po were made in euphausiids of different sizes, in dissected organs and tissues, and in excretion products. The results show higher concentrations in the smaller specimens; this fact cannot be explained on the basis of surface adsorption, but is probably related to the ingestion of food. Dissection results show that the distribution of {sup 210}Po in euphausiids is not homogeneous, but that the majority is concentrated in the internal organs, the alimentary tract and the hepatopancreas in particular. The natural radiation dose received by these organs is in consequence much higher than that received by the whole animal. Use of a dynamic model allowed the flux of {sup 210}Po through M. norvegica to be calculated. The calculations confirm that food is the principal source of {sup 210}Po for this species, and clearly show that fecal pellets constitute the major elimination route. Extrapolation of the data to zooplankton in general leads to the conclusion that zooplankton metabolic activity plays an important role in transporting {sup 210}Po from the surface layers of the ocean to depth. (author)
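
    Generically, a dynamic model of this kind is a compartmental uptake-and-loss balance; the sketch below shows a simple one-compartment version with entirely hypothetical parameter values, and it is not the authors' model.

        # Generic one-compartment biokinetic sketch for a radionuclide in a
        # zooplankter: dC/dt = AE*IR*Cf - (ke + lam)*C.
        # Parameter values are hypothetical, not those derived in the study.
        import math

        AE = 0.3          # assimilation efficiency from food (assumed)
        IR = 0.5          # ingestion rate, g food per g animal per day (assumed)
        Cf = 10.0         # 210Po concentration in food, Bq/g (assumed)
        ke = 0.05         # biological elimination rate, 1/day (assumed)
        lam = math.log(2) / 138.4   # 210Po physical decay constant, 1/day

        k_loss = ke + lam
        C_ss = AE * IR * Cf / k_loss          # steady-state concentration, Bq/g
        flux_out = k_loss * C_ss              # equals uptake flux at steady state

        print(f"steady-state concentration: {C_ss:.1f} Bq/g")
        print(f"throughput flux: {flux_out:.2f} Bq/g/day")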

  8. Polonium-210 in Euphauslids: A Detailed Study

    International Nuclear Information System (INIS)

    Heyraud, M.; Fowler, S.W.; Beasley, T.M.; Cherry, R.D.

    1976-01-01

    A detailed study of 210 Po, the predominant alpha-emitting nuclide found in most marine organisms, has been undertaken in a particular zooplanktonic species, the euphausiid Meganyctiphanes norvegica. The purpose was to obtain information concerning the origin, the localization and the flux of the nuclide in and through this organism. Measurements of 210 Po were made in euphausiids of different sizes, in dissected organs and tissues, and in excretion products. The results show higher concentrations in the smaller specimens; this fact cannot be explained on the basis of surface adsorption, but is probably related to the ingestion of food. Dissection results show that the distribution of 210 Po in euphausiids is not homogeneous, but that the majority is concentrated in the internal organs, the alimentary tract and the hepatopancreas in particular. The natural radiation dose received by these organs is in consequence much higher than that received by the whole animal. Use of a dynamic model allowed the flux of 210 Po through M. norvegica to be calculated. The calculations confirm that food is the principal source of 210 Po for this species, and clearly show that fecal pellets constitute the major elimination route. Extrapolation of the data to zooplankton in general leads to the conclusion that zooplankton metabolic activity plays an important role in transporting 210 Po from the surface layers of the ocean to depth. (author)

  9. Polonium-210 in euphausiids: a detailed study

    Energy Technology Data Exchange (ETDEWEB)

    Heyraud, M; Fowler, S W; Beasley, T M; Cherry, R D

    1976-02-13

    A detailed study of /sup 210/Po, the predominant alpha-emitting nuclide found in most marine organisms, has been undertaken in a particular zooplanktonic species, the euphausiid Meganyctiphanes norvegica. Information was obtained concerning the origin, the localization and the flux of the nuclide in and through the organism. Measurements of /sup 210/Po were made in euphausiids of different sizes, in dissected organs and tissues, and in excretion products. The results show higher concentrations in the smaller specimens; this fact cannot be explained on the basis of surface adsorption, but is probably related to the ingestion of food. Dissection results show that the distribution of /sup 210/Po in euphausiids is not homogeneous, but that the majority is concentrated in the internal organs, the alimentary tract and the hepatopancreas in particular. The natural radiation dose received by these organs is in consequence much higher than that received by the whole animal. Use of a dynamic model allowed the flux of /sup 210/Po through M. norvegica to be calculated. The calculations confirm that food is the principal source of /sup 210/Po for this species, and clearly show that fecal pellets constitute the major elimination route. Extrapolation of the data to zooplankton in general leads to the conclusion that zooplankton metabolic activity plays an important role in transporting /sup 210/Po from the surface layers of the ocean to depth. (auth)

  10. Detailed inelastic analysis of an LMFBR pipeline

    International Nuclear Information System (INIS)

    Hibbitt, H.D.; Leung, E.K.; Ohalla, A.K.

    1982-01-01

    The paper describes detailed inelastic analyses of a large diameter, thin walled pipeline configuration typical of liquid metal cooled reactor primary piping, subject to thermal shock, with intermediate periods of creep hold time. Three such analyses are compared. Two of these analyses are performed with recently developed elements based on a combination of Fourier and polynomial interpolation to describe the deformation of the pipe. One of these two analyses includes continuous deformation of the pipe wall between each elbow and the adjacent straight pipe segments, while the other neglects such ''end effects'' on the elbow deformation. The third analysis is based on a modified axi-symmetric shell element for modeling the elbows (neglecting end effects). The results thus provide an assessment of the relative cost and importance of including consideration of end effects in modeling a realistic piping system, as well as providing a similar comparison between the two basic deforming section pipe models (Fourier/polynomial versus modified axi-symmetric shells)
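
    The Fourier interpolation referred to describes the deformation of the pipe cross-section as a truncated Fourier series around the circumference; the sketch below simply evaluates such a series (ovalization plus higher harmonics) with arbitrary illustrative coefficients, and is not output from the analyses described.

        # Sketch: radial displacement of a pipe wall described by a truncated
        # Fourier series around the circumference,
        # w(theta) = sum_n a_n*cos(n*theta) + b_n*sin(n*theta).
        # Coefficients are arbitrary illustrations, not results of the analyses.
        import numpy as np

        a = {2: 1.5e-3, 3: 0.4e-3}   # cosine amplitudes [m]; n=2 is ovalization
        b = {2: 0.2e-3}              # sine amplitudes [m]

        theta = np.linspace(0.0, 2.0 * np.pi, 73)
        w = sum(amp * np.cos(n * theta) for n, amp in a.items()) \
          + sum(amp * np.sin(n * theta) for n, amp in b.items())

        print(f"peak ovalization-type displacement: {w.max() * 1e3:.2f} mm")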

  11. Active solar distillation - A detailed review

    Energy Technology Data Exchange (ETDEWEB)

    Sampathkumar, K.; Pitchandi, P. [Department of Mechanical Engineering, Tamilnadu College of Engineering, Coimbatore 641659, Tamilnadu (India); Arjunan, T.V. [Department of Automobile Engineering, PSG College of Technology, Coimbatore 641004, Tamilnadu (India); Senthilkumar, P. [Department of Mechanical Engineering, KSR College of Engineering, Tiruchengode 637215, Tamilnadu (India)

    2010-08-15

    All over the world, people's access to potable water is narrowing day by day. Most human diseases are due to polluted or non-purified water resources. Even today, underdeveloped and developing countries face huge water scarcity because of unplanned mechanisms and pollution created by manmade activities. Water purification without affecting the ecosystem is the need of the hour. In this context, many conventional and non-conventional techniques have been developed for the purification of saline water. Among these, solar distillation proves to be an economical and eco-friendly technique, particularly in rural areas. Many active distillation systems have been developed to overcome the problem of lower distillate output in passive solar stills. This article provides a detailed review of different studies on active solar distillation systems over the years. Thermal modelling was done for various types of active single slope solar distillation systems. This review also throws light on the scope for further research and recommendations in active solar distillation systems. (author)
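
    As an illustration of the kind of thermal modelling common in this literature, the sketch below estimates hourly distillate yield from water and glass temperatures using Dunkle-type correlations; the correlations are a standard choice rather than necessarily the ones used in the reviewed studies, and the operating temperatures are assumed values.

        # Illustrative hourly-yield estimate for a solar still using Dunkle-type
        # correlations. Operating temperatures below are assumed, not reviewed data.
        import math

        def p_sat(t_c):
            """Saturation vapour pressure [Pa] (Dunkle-type fit), t_c in deg C."""
            return math.exp(25.317 - 5144.0 / (t_c + 273.0))

        def hourly_yield(t_water, t_glass, area=1.0, h_fg=2.26e6):
            pw, pg = p_sat(t_water), p_sat(t_glass)
            h_cw = 0.884 * ((t_water - t_glass)
                            + (pw - pg) * (t_water + 273.0) / (268.9e3 - pw)) ** (1.0 / 3.0)
            h_ew = 16.273e-3 * h_cw * (pw - pg) / (t_water - t_glass)   # evaporative HTC
            return h_ew * (t_water - t_glass) * area * 3600.0 / h_fg     # kg per hour

        print(f"{hourly_yield(60.0, 40.0):.2f} kg/m2 per hour (assumed 60/40 deg C)")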

  12. Detailed sensory memory, sloppy working memory.

    Science.gov (United States)

    Sligte, Ilja G; Vandenbroucke, Annelinde R E; Scholte, H Steven; Lamme, Victor A F

    2010-01-01

    Visual short-term memory (VSTM) enables us to actively maintain information in mind for a brief period of time after stimulus disappearance. According to recent studies, VSTM consists of three stages - iconic memory, fragile VSTM, and visual working memory - with increasingly stricter capacity limits and progressively longer lifetimes. Still, the resolution (or amount of visual detail) of each VSTM stage has remained unexplored and we test this in the present study. We presented people with a change detection task that measures the capacity of all three forms of VSTM, and we added an identification display after each change trial that required people to identify the "pre-change" object. Accurate change detection plus pre-change identification requires subjects to have a high-resolution representation of the "pre-change" object, whereas change detection or identification only can be based on the hunch that something has changed, without exactly knowing what was presented before. We observed that people maintained 6.1 objects in iconic memory, 4.6 objects in fragile VSTM, and 2.1 objects in visual working memory. Moreover, when people detected the change, they could also identify the pre-change object on 88% of the iconic memory trials, on 71% of the fragile VSTM trials and merely on 53% of the visual working memory trials. This suggests that people maintain many high-resolution representations in iconic memory and fragile VSTM, but only one high-resolution object representation in visual working memory.
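
    Capacity estimates of this kind are commonly derived from change-detection performance with Cowan's K formula, K = set size x (hit rate - false alarm rate); the sketch below applies that standard estimator with made-up hit and false-alarm rates chosen only to roughly echo the reported capacities, and it is assumed rather than stated that the study used a K-style estimator.

        # Sketch of Cowan's K capacity estimate from change-detection data:
        # K = N * (hit_rate - false_alarm_rate). Rates below are illustrative
        # placeholders, not the study's data.
        def cowans_k(set_size, hit_rate, false_alarm_rate):
            return set_size * (hit_rate - false_alarm_rate)

        conditions = {                     # hypothetical rates per VSTM stage
            "iconic memory":         (8, 0.85, 0.10),
            "fragile VSTM":          (8, 0.70, 0.12),
            "visual working memory": (8, 0.40, 0.14),
        }
        for name, (n, hit, fa) in conditions.items():
            print(f"{name:22s} K = {cowans_k(n, hit, fa):.1f} objects")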

  13. Guidance on the Technology Performance Level (TPL) Assessment Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Jochem [National Renewable Energy Lab. (NREL), Golden, CO (United States); Roberts, Jesse D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Babarit, Aurelien [Ecole Centrale de Nantes (France). Lab. of Research in Hydrodynamics, Energetics and Atmospheric Environment (LHEEA); Costello, Ronan [Wave Venture, Penstraze (United Kingdom); Bull, Diana L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neilson, Kim [Ramboll, Copenhagen (Denmark); Bittencourt, Claudio [DNV GL, London (United Kingdom); Kennedy, Ben [Wave Venture, Penstraze (United Kingdom); Malins, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dykes, Katherine [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    This document presents the revised Technology Performance Level (TPL) assessment methodology. There are three parts to this revised methodology: 1) the Stakeholder Needs and Assessment Guidance (this document), 2) the Technical Submission form, and 3) the TPL scoring spreadsheet. The TPL assessment is designed to give a technology-neutral or agnostic assessment of any wave energy converter technology. The focus of the TPL is on the performance of the technology in meeting the customer's needs. The original TPL is described in [1, 2], and those references also detail the critical differences in the nature of the TPL when compared to the more widely used technology readiness level (TRL). (Wave energy TRL is described in [3].) The revised TPL is particularly intended to be useful to investors and also to assist technology developers in conducting comprehensive assessments in a way that is meaningful and attractive to investors. The revised TPL assessment methodology has been derived through a structured Systems Engineering approach. This was a formal process that involved analyzing customer and stakeholder needs through the discipline of Systems Engineering. The results of the process confirmed the high level of completeness of the original methodology presented in [1] (as used in the Wave Energy Prize judging) and now add a significantly increased level of detail in the assessment and an improved, more investment-focused structure. The revised TPL also incorporates the feedback of the Wave Energy Prize judges.

  14. Regional issue identification and assessment: study methodology. First annual report

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    The overall assessment methodologies and models utilized for the first project under the Regional Issue Identification and Assessment (RIIA) program are described. Detailed descriptions are given of the methodologies used by lead laboratories for the quantification of the impacts of an energy scenario on one or more media (e.g., air, water, land, human and ecology), and by all laboratories to assess the regional impacts on all media. The research and assessments reflected in this document were performed by the following national laboratories: Argonne National Laboratory; Brookhaven National Laboratory; Lawrence Berkeley Laboratory; Los Alamos Scientific Laboratory; Oak Ridge National Laboratory; and Pacific Northwest Laboratory. This report contains five chapters. Chapter 1 briefly describes the overall study methodology and introduces the technical participants. Chapter 2 is a summary of the energy policy scenario selected for the RIIA I study and Chapter 3 describes how this scenario was translated into a county-level siting pattern of energy development. The fourth chapter is a detailed description of the individual methodologies used to quantify the environmental and socioeconomic impacts of the scenario while Chapter 5 describes how these impacts were translated into comprehensive regional assessments for each Federal Region.

  15. Managerial Methodology in Public Institutions

    Directory of Open Access Journals (Sweden)

    Ion VERBONCU

    2010-10-01

    Full Text Available One of the most important ways of making public institutions more efficient is the application of managerial methodology, embodied in the promotion of management tools and modern, sophisticated methodologies, as well as in the design/redesign and maintenance of the management process and its components. Their implementation bears the imprint of the structural and functional particularities of public institutions, decentralized and devolved, and, of course, of the expertise of these organizations' managers. Managerial methodology is addressed through three important instruments: diagnosis, management by objectives and the scoreboard. Its presence in the performance management process should be mandatory, given its favorable influence on managerial and economic performance and on the degree to which managers approach their work scientifically.

  16. Blanket safety by GEMSAFE methodology

    International Nuclear Information System (INIS)

    Sawada, Tetsuo; Saito, Masaki

    2001-01-01

    General Methodology of Safety Analysis and Evaluation for Fusion Energy Systems (GEMSAFE) has been applied to a number of fusion system designs, such as the R-tokamak, the Fusion Experimental Reactor (FER), and the International Thermonuclear Experimental Reactor (ITER) designs in both the Conceptual Design Activities (CDA) and Engineering Design Activities (EDA) stages. Though the major objective of GEMSAFE is to reasonably select design basis events (DBEs), it is also useful for elucidating related safety functions as well as the requirements to ensure safety. In this paper, we apply the methodology to fusion systems with future tritium breeding blankets and clarify which points of the system should be of concern from a safety standpoint. In this context, we have obtained five DBEs that are related to the blanket system. We have also clarified the safety functions required to prevent accident propagation initiated by those blanket-specific DBEs. The outline of the methodology is also reviewed. (author)

  17. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  18. Creating and evaluating a new clicker methodology

    Science.gov (United States)

    Li, Pengfei

    "Clickers", an in-class polling system, has been used by many instructors to add active learning and formative assessment to previously passive traditional lectures. While considerable research has been conducted on clicker increasing student interaction in class, less research has been reported on the effectiveness of using clicker to help students understand concepts. This thesis reported a systemic project by the OSU Physics Education group to develop and test a new clicker methodology. Clickers question sequences based on a constructivist model of learning were used to improve classroom dynamics and student learning. They also helped students and lecturers understand in real time whether a concept had been assimilated or more effort was required. Chapter 1 provided an introduction to the clicker project. Chapter 2 summarized widely-accepted teaching principles that have arisen from a long history of research and practice in psychology, cognitive science and physics education. The OSU clicker methodology described in this thesis originated partly from our years of teaching experience, but mostly was based on these teaching principles. Chapter 3 provided an overview of the history of clicker technology and different types of clickers. Also, OSU's use of clickers was summarized together with a list of common problems and corresponding solutions. These technical details may be useful for those who want to use clickers. Chapter 4 discussed examples of the type and use of question sequences based on the new clicker methodology. In several years of research, we developed a base of clicker materials for calculus-based introductory physics courses at OSU. As discussed in chapter 5, a year-long controlled quantitative study was conducted to determine whether using clickers helps students learn, how using clickers helps students learn and whether students perceive that clicker has a positive effect on their own learning process. The strategy for this test was based on

  19. Methodology, theoretical framework and scholarly significance: An ...

    African Journals Online (AJOL)

    Methodology, theoretical framework and scholarly significance: An overview ... Keywords: Legal Research, Methodology, Theory, Pedagogy, Legal Training, Scholarship ...

  20. Methodological Guidelines for Advertising Research

    DEFF Research Database (Denmark)

    Rossiter, John R.; Percy, Larry

    2017-01-01

    In this article, highly experienced advertising academics and advertising research consultants John R. Rossiter and Larry Percy present and discuss what they believe to be the seven most important methodological guidelines that need to be implemented to improve the practice of advertising research....... Their focus is on methodology, defined as first choosing a suitable theoretical framework to guide the research study and then identifying the advertising responses that need to be studied. Measurement of those responses is covered elsewhere in this special issue in the article by Bergkvist and Langner. Most...